WorldWideScience

Sample records for database access technologies

  1. Training Database Technology in DBMS MS Access

    Directory of Open Access Journals (Sweden)

    Nataliya Evgenievna Surkova

    2015-05-01

    Full Text Available The article describes methodological issues in teaching relational database technology and relational database management systems, with Microsoft Access as the introductory DBMS. The methodology develops general cultural competencies, such as command of the main methods, ways and means of producing, storing and processing information, and computer skills as a means of managing information. It also forms professional competencies, such as the ability to collect, analyze and process the data needed to solve professional tasks, and the ability to apply modern hardware and information technology to analytical and research tasks.

  2. Training Database Technology in DBMS MS Access

    OpenAIRE

    Nataliya Evgenievna Surkova

    2015-01-01

    The article describes methodological issues in teaching relational database technology and relational database management systems, with Microsoft Access as the introductory DBMS. The methodology develops general cultural competencies, such as command of the main methods, ways and means of producing, storing and processing information, and computer skills as a means of managing information. It also forms professional competencies, such as the ability to coll...

  3. High-Performance Secure Database Access Technologies for HEP Grids

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist’s computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications.” There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the

  4. High-Performance Secure Database Access Technologies for HEP Grids

    International Nuclear Information System (INIS)

    Vranicar, Matthew; Weicher, John

    2006-01-01

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that 'Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications'. There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the secure

  5. Database Access through Java Technologies

    Directory of Open Access Journals (Sweden)

    Nicolae MERCIOIU

    2010-09-01

    Full Text Available As a high-level development environment, the Java technologies offer support for developing distributed, platform-independent applications, providing a robust set of methods to access databases and to create software components on the server side as well as on the client side. Analyzing the evolution of Java data-access tools, we notice that they evolved from simple methods permitting queries, insertion, update and deletion of data to advanced implementations such as distributed transactions, cursors and batch files. The client-server architecture allows, through JDBC (Java Database Connectivity), the execution of SQL (Structured Query Language) instructions and the manipulation of the results in an independent and consistent manner. The JDBC API (Application Programming Interface) creates the level of abstraction needed to allow SQL queries to be issued against any DBMS (Database Management System). The paper describes the native driver and the ODBC (Open Database Connectivity)-JDBC bridge, together with the classes and interfaces of the JDBC API. The four steps needed to build a JDBC-driven application are presented briefly, with emphasis on how each step is accomplished and the expected results. In each step the characteristics of the database systems and the way the JDBC programming interface adapts to each one are evaluated. The data types provided by the SQL2 and SQL3 standards are analyzed by comparison with the Java data types, emphasizing the discrepancies between them, as well as the methods of the ResultSet object that allow conversion between different types of data. Next, starting from the role of metadata and studying the Java programming interfaces that allow querying of result sets, we describe the advanced features of data mining with JDBC. As an alternative to result sets, the RowSets add new functionalities that
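
    As a rough illustration of the JDBC steps summarized above (obtain a connection, create a statement, execute the query, walk the ResultSet), a minimal Java sketch follows; the JDBC URL, credentials, table and column names are invented for the example and are not taken from the article.

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.PreparedStatement;
      import java.sql.ResultSet;

      public class JdbcExample {
          public static void main(String[] args) throws Exception {
              // Hypothetical connection details; with JDBC 4+ the driver is loaded automatically.
              String url = "jdbc:postgresql://localhost:5432/demo";
              try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
                  // Parameterised SQL is sent through the driver to whatever DBMS backs the URL.
                  String sql = "SELECT id, name FROM customers WHERE country = ?";
                  try (PreparedStatement stmt = conn.prepareStatement(sql)) {
                      stmt.setString(1, "RO");
                      try (ResultSet rs = stmt.executeQuery()) {
                          // ResultSet getters perform the SQL-to-Java type conversions discussed above.
                          while (rs.next()) {
                              System.out.println(rs.getInt("id") + " " + rs.getString("name"));
                          }
                      }
                  }
              }
          }
      }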

  6. Advanced technologies for scalable ATLAS conditions database access on the grid

    International Nuclear Information System (INIS)

    Basset, R; Canali, L; Girone, M; Hawkings, R; Valassi, A; Viegas, F; Dimitrov, G; Nevski, P; Vaniachine, A; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic work-flow, ATLAS database scalability tests provided feedback for Conditions Db software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing characterized by peak loads, which can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This has been achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that server overload conditions can be safely avoided in a production environment. Our analysis of server performance under stress tests indicates that Conditions Db data access is limited by the disk I/O throughput. An unacceptable side-effect of the disk I/O saturation is a degradation of the WLCG 3D Services that update Conditions Db data at all ten ATLAS Tier-1 sites using the technology of Oracle Streams. To avoid such bottlenecks we prototyped and tested a novel approach for database peak load avoidance in Grid computing. Our approach is based upon the proven idea of pilot job submission on the Grid: instead of the actual query, an ATLAS utility library sends to the database server a pilot query first.
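
    The peak-load avoidance idea in the last sentence can be sketched as follows; this is an illustrative Java fragment, not the ATLAS utility library, and the probe query, latency threshold and back-off policy are invented for the example.

      import java.sql.Connection;
      import java.sql.ResultSet;
      import java.sql.Statement;

      public class PilotQuerySketch {
          // Hypothetical latency threshold above which the server is considered overloaded.
          private static final long MAX_PILOT_MILLIS = 500;

          static ResultSet runWithPilot(Connection conn, String realQuery) throws Exception {
              while (true) {
                  long start = System.currentTimeMillis();
                  // A cheap pilot query (Oracle-style here) probes the server before the
                  // expensive conditions query is submitted.
                  try (Statement probe = conn.createStatement();
                       ResultSet rs = probe.executeQuery("SELECT 1 FROM DUAL")) {
                      rs.next();
                  }
                  long elapsed = System.currentTimeMillis() - start;
                  if (elapsed <= MAX_PILOT_MILLIS) {
                      return conn.createStatement().executeQuery(realQuery);
                  }
                  Thread.sleep(10 * elapsed);  // back off while the server is at peak load
              }
          }
      }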

  7. Reactome graph database: Efficient access to complex pathway data

    Science.gov (United States)

    Korninger, Florian; Viteri, Guilherme; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D’Eustachio, Peter

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types. PMID:29377902
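
    A short sketch of the kind of Cypher access the abstract refers to, using the Neo4j Java driver (4.x API assumed); the bolt URI, credentials, and the Pathway/displayName/speciesName names follow common Reactome graph-model conventions but are assumptions here and should be checked against the actual schema.

      import org.neo4j.driver.AuthTokens;
      import org.neo4j.driver.Driver;
      import org.neo4j.driver.GraphDatabase;
      import org.neo4j.driver.Record;
      import org.neo4j.driver.Result;
      import org.neo4j.driver.Session;

      public class ReactomeCypherExample {
          public static void main(String[] args) {
              try (Driver driver = GraphDatabase.driver("bolt://localhost:7687",
                                                        AuthTokens.basic("neo4j", "password"));
                   Session session = driver.session()) {
                  // A graph traversal expressed directly in Cypher, instead of multi-table SQL joins.
                  Result result = session.run(
                      "MATCH (p:Pathway {speciesName: 'Homo sapiens'}) "
                    + "RETURN p.displayName AS name LIMIT 5");
                  while (result.hasNext()) {
                      Record record = result.next();
                      System.out.println(record.get("name").asString());
                  }
              }
          }
      }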

  8. Reactome graph database: Efficient access to complex pathway data.

    Directory of Open Access Journals (Sweden)

    Antonio Fabregat

    2018-01-01

    Full Text Available Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.

  9. Reactome graph database: Efficient access to complex pathway data.

    Science.gov (United States)

    Fabregat, Antonio; Korninger, Florian; Viteri, Guilherme; Sidiropoulos, Konstantinos; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D'Eustachio, Peter; Hermjakob, Henning

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.

  10. Advanced technologies for scalable ATLAS conditions database access on the grid

    CERN Document Server

    Basset, R; Dimitrov, G; Girone, M; Hawkings, R; Nevski, P; Valassi, A; Vaniachine, A; Viegas, F; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic work-flow, ATLAS database scalability tests provided feedback for Conditions Db software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing characterized by peak loads, which can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This has been achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that server overload conditions can be safely avoided in a production environment. Our analysi...

  11. Database application research in real-time data access of accelerator control system

    International Nuclear Information System (INIS)

    Chen Guanghua; Chen Jianfeng; Wan Tianmin

    2012-01-01

    The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system that involves access to many types and large amounts of real-time data during operation. Database systems have wide application prospects in large-scale accelerator control systems, and replacing the various dedicated data structures with a mature, standardized database system is the future development direction of accelerator control. Based on database interface technology, real-time data access testing, and system optimization research, this article discusses the feasibility of applying a database system in accelerators and lays the foundation for wide-scale application of database systems in the SSRF accelerator control system. (authors)

  12. Scaling up ATLAS Database Release Technology for the LHC Long Run

    International Nuclear Information System (INIS)

    Borodin, M; Nevski, P; Vaniachine, A

    2011-01-01

    To overcome scalability limitations in database access on the Grid, ATLAS introduced the Database Release technology replicating databases in files. For years, Database Release technology assured scalable database access for Monte Carlo production on the Grid. Since the previous CHEP, Database Release technology was used successfully in ATLAS data reprocessing on the Grid. A frozen Conditions DB snapshot guarantees reproducibility and transactional consistency, isolating Grid data processing tasks from continuous conditions updates at the 'live' Oracle server. Database Release technology fully satisfies the requirements of ATLAS data reprocessing and Monte Carlo production. We parallelized the Database Release build workflow to avoid a linear dependency of the build time on the length of the LHC data-taking period. In recent data reprocessing campaigns the build time was reduced by an order of magnitude thanks to a proven master-worker architecture used in Google MapReduce. We describe further Database Release optimizations scaling up the technology for the LHC long run.
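
    The master-worker parallelisation mentioned above can be illustrated with a small Java sketch; the slice names and the buildSlice placeholder are invented and stand in for the real per-run extraction work, so this is not the ATLAS build code.

      import java.util.ArrayList;
      import java.util.List;
      import java.util.concurrent.Callable;
      import java.util.concurrent.ExecutorService;
      import java.util.concurrent.Executors;
      import java.util.concurrent.Future;

      public class ParallelReleaseBuildSketch {
          public static void main(String[] args) throws Exception {
              // Independent slices of the data-taking period, built in parallel by a worker pool.
              List<String> slices = List.of("runs-0001-0100", "runs-0101-0200", "runs-0201-0300");
              ExecutorService workers = Executors.newFixedThreadPool(4);
              List<Future<String>> results = new ArrayList<>();
              for (String slice : slices) {
                  Callable<String> task = () -> buildSlice(slice);   // master creates one work item per slice
                  results.add(workers.submit(task));                  // workers run in parallel
              }
              for (Future<String> f : results) {
                  System.out.println("built " + f.get());             // master collects the results
              }
              workers.shutdown();
          }

          static String buildSlice(String slice) {
              // Placeholder for extracting the conditions payload of one slice into release files.
              return slice;
          }
      }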

  13. Intelligent Access to Sequence and Structure Databases (IASSD) - an interface for accessing information from major web databases.

    Science.gov (United States)

    Ganguli, Sayak; Gupta, Manoj Kumar; Basu, Protip; Banik, Rahul; Singh, Pankaj Kumar; Vishal, Vineet; Bera, Abhisek Ranjan; Chakraborty, Hirak Jyoti; Das, Sasti Gopal

    2014-01-01

    With the advent of the age of big data and advances in high-throughput technology, accessing data has become one of the most important steps in the entire knowledge discovery process. Most users are not able to decipher the query result that is obtained when non-specific keywords or a combination of keywords are used. Intelligent access to sequence and structure databases (IASSD) is a desktop application for the Windows operating system. It is written in Java and utilizes the Web Services Description Language (WSDL) files and Jar files of the E-utilities of various databases such as the National Center for Biotechnology Information (NCBI) and the Protein Data Bank (PDB). Apart from that, IASSD allows the user to view protein structures using a Jmol application which supports conditional editing. The Jar file is freely available through e-mail from the corresponding author.
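
    IASSD itself is distributed by e-mail, but the NCBI E-utilities it wraps can also be called directly over HTTP; a minimal, JDK-only Java sketch follows (the query term and retmax value are arbitrary examples).

      import java.net.URI;
      import java.net.http.HttpClient;
      import java.net.http.HttpRequest;
      import java.net.http.HttpResponse;

      public class EUtilitiesExample {
          public static void main(String[] args) throws Exception {
              // ESearch call against the NCBI protein database; java.net.http requires Java 11+.
              String url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
                         + "?db=protein&term=insulin+AND+human&retmax=5";
              HttpClient client = HttpClient.newHttpClient();
              HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
              HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
              System.out.println(response.body()); // XML listing the matching protein record IDs
          }
      }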

  14. NCBI2RDF: Enabling Full RDF-Based Access to NCBI Databases

    Directory of Open Access Journals (Sweden)

    Alberto Anguita

    2013-01-01

    Full Text Available RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting to users the query results in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort to biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments.
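
    For illustration, a SPARQL query can be submitted to such a virtual endpoint with any HTTP client that follows the SPARQL 1.1 protocol; in the Java sketch below the endpoint URL is a hypothetical placeholder and the query is a generic pattern, so both must be replaced with the real NCBI2RDF endpoint and a meaningful query.

      import java.net.URI;
      import java.net.http.HttpClient;
      import java.net.http.HttpRequest;
      import java.net.http.HttpResponse;

      public class SparqlExample {
          public static void main(String[] args) throws Exception {
              String endpoint = "http://example.org/ncbi2rdf/sparql";       // hypothetical endpoint
              String query = "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10"; // generic test query
              // Standard SPARQL 1.1 protocol: POST the query and ask for JSON results.
              HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint))
                      .header("Content-Type", "application/sparql-query")
                      .header("Accept", "application/sparql-results+json")
                      .POST(HttpRequest.BodyPublishers.ofString(query))
                      .build();
              HttpResponse<String> response = HttpClient.newHttpClient()
                      .send(request, HttpResponse.BodyHandlers.ofString());
              System.out.println(response.body()); // SPARQL results in JSON format
          }
      }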

  15. NCBI2RDF: Enabling Full RDF-Based Access to NCBI Databases

    Science.gov (United States)

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting to users the query results in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort to biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments. PMID:23984425

  16. NCBI2RDF: enabling full RDF-based access to NCBI databases.

    Science.gov (United States)

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting to users the query results in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort to biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments.

  17. Fermilab Security Site Access Request Database

    Science.gov (United States)

    Use of the online version of the Fermilab Security Site Access Request Database requires logging in to the ESH&Q Web Site. The listing is generated from the ESH&Q Section's Oracle database (last generated on May 27, 2018, 05:48 AM).

  18. PathwayAccess: CellDesigner plugins for pathway databases.

    Science.gov (United States)

    Van Hemert, John L; Dickerson, Julie A

    2010-09-15

    CellDesigner provides a user-friendly interface for graphical biochemical pathway description. Many pathway databases are not directly exportable to CellDesigner models. PathwayAccess is an extensible suite of CellDesigner plugins, which connect CellDesigner directly to pathway databases using respective Java application programming interfaces. The process is streamlined for creating new PathwayAccess plugins for specific pathway databases. Three PathwayAccess plugins, MetNetAccess, BioCycAccess and ReactomeAccess, directly connect CellDesigner to the pathway databases MetNetDB, BioCyc and Reactome. PathwayAccess plugins enable CellDesigner users to expose pathway data to analytical CellDesigner functions, curate their pathway databases and visually integrate pathway data from different databases using standard Systems Biology Markup Language and Systems Biology Graphical Notation. Implemented in Java, PathwayAccess plugins run with CellDesigner version 4.0.1 and were tested on Ubuntu Linux, Windows XP and 7, and MacOSX. Source code, binaries, documentation and video walkthroughs are freely available at http://vrac.iastate.edu/~jlv.

  19. Information literacy skills and accessibility of databases among ...

    African Journals Online (AJOL)

    Previous research on the accessibility of databases by undergraduate students of Umaru Musa Yar'adua University (UMYU) has found a low level of accessibility of library databases. This paper investigates further factors in the accessibility of databases among Undergraduate Students of Umaru ...

  20. Assessment of access to bibliographic databases and telemetry databases in Astronomy: A groundswell for development.

    Science.gov (United States)

    Diaz-Merced, Wanda Liz; Casado, Johanna; Garcia, Beatriz; Aarnio, Alicia; Knierman, Karen; Monkiewicz, Jacqueline; Alicia Aarnio.

    2018-01-01

    Big Data" is a subject that has taken special relevance today, particularly in Astrophysics, where continuous advances in technology are leading to ever larger data sets. A multimodal approach in perception of astronomical data data (achieved through sonification used for the processing of data) increases the detection of signals in very low signal-to-noise ratio limits and is of special importance to achieve greater inclusion in the field of Astronomy. In the last ten years, different software tools have been developed that perform the sonification of astronomical data from tables or databases, among them the best known and in multiplatform development are Sonification Sandbox, MathTrack, and xSonify.In order to determine the accessibility of software we propose to start carrying out a conformity analysis of ISO (International Standard Organization) 9241-171171: 2008. This standard establishes the general guidelines that must be taken into account for accessibility in software design, and it is applied to software used in work, public places, and at home. To analyze the accessibility of web databases, we take into account the "Web Content Content Accessibility Guidelines (WCAG) 2.0", accepted and published by ISO in the ISO / IEC 40500: 2012 standard.In this poster, we present a User Centered Design (UCD), Human Computer Interaction (HCI), and User Experience (UX) framework to address a non-segregational provision of access to bibliographic databases and telemetry databases in Astronomy. Our framework is based on an ISO evaluation on a selection of data bases such as ADS, Simbad and SDSS. The WCAG 2.0 and ISO 9241-171171: 2008 should not be taken as absolute accessibility standards: these guidelines are very general, are not absolute, and do not address particularities. They are not to be taken as a substitute for UCD, HCI, UX design and evaluation. Based on our results, this research presents the framework for a focus group and qualitative data analysis aimed to

  1. Accessing and using chemical databases

    DEFF Research Database (Denmark)

    Nikolov, Nikolai Georgiev; Pavlov, Todor; Niemelä, Jay Russell

    2013-01-01

    Computer-based representation of chemicals makes it possible to organize data in chemical databases: collections of chemical structures and associated properties. Databases are widely used wherever efficient processing of chemical information is needed, including search, storage, retrieval......, and dissemination. Structure and functionality of chemical databases are considered. The typical kinds of information found in a chemical database are considered: identification, structural, and associated data. Functionality of chemical databases is presented, with examples of search and access types. More details...... are included about the OASIS database and platform and the Danish (Q)SAR Database online. Various types of chemical database resources are discussed, together with a list of examples....

  2. Database theory and SQL practice using Access

    International Nuclear Information System (INIS)

    Kim, Gyeong Min; Lee, Myeong Jin

    2001-01-01

    This book introduces database theory and SQL practice using Access. It comprises seven chapters covering: understanding databases, with basic concepts and the DBMS; understanding relational databases, with examples; building database tables and entering data using Access 2000; an introduction to Structured Query Language; managing data and building complex queries with SQL; advanced SQL commands, including the concepts of joins and virtual tables; and the design of a database for an online bookstore in six steps, with the building of an application covering its functions, structure, components, operating principles, and a review of the program source for the application menu.

  3. Information persistence using XML database technology

    Science.gov (United States)

    Clark, Thomas A.; Lipa, Brian E. G.; Macera, Anthony R.; Staskevich, Gennady R.

    2005-05-01

    The Joint Battlespace Infosphere (JBI) Information Management (IM) services provide information exchange and persistence capabilities that support tailored, dynamic, and timely access to required information, enabling near real-time planning, control, and execution for DoD decision making. JBI IM services will be built on a substrate of network centric core enterprise services and when transitioned, will establish an interoperable information space that aggregates, integrates, fuses, and intelligently disseminates relevant information to support effective warfighter business processes. This virtual information space provides individual users with information tailored to their specific functional responsibilities and provides a highly tailored repository of, or access to, information that is designed to support a specific Community of Interest (COI), geographic area or mission. Critical to effective operation of JBI IM services is the implementation of repositories, where data, represented as information, is represented and persisted for quick and easy retrieval. This paper will address information representation, persistence and retrieval using existing database technologies to manage structured data in Extensible Markup Language (XML) format as well as unstructured data in an IM services-oriented environment. Three basic categories of database technologies will be compared and contrasted: Relational, XML-Enabled, and Native XML. These technologies have diverse properties such as maturity, performance, query language specifications, indexing, and retrieval methods. We will describe our application of these evolving technologies within the context of a JBI Reference Implementation (RI) by providing some hopefully insightful anecdotes and lessons learned along the way. This paper will also outline future directions, promising technologies and emerging COTS products that can offer more powerful information management representations, better persistence mechanisms and
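
    As a small, generic illustration of querying structured XML of the kind such repositories persist, the JDK's built-in XPath support can be used; the document and element names below are invented and do not come from the JBI work.

      import java.io.ByteArrayInputStream;
      import java.nio.charset.StandardCharsets;
      import javax.xml.parsers.DocumentBuilderFactory;
      import javax.xml.xpath.XPath;
      import javax.xml.xpath.XPathConstants;
      import javax.xml.xpath.XPathFactory;
      import org.w3c.dom.Document;
      import org.w3c.dom.NodeList;

      public class XmlQueryExample {
          public static void main(String[] args) throws Exception {
              // A tiny stand-in for a persisted XML record set.
              String xml = "<reports><report id='1' area='COI-A'/><report id='2' area='COI-B'/></reports>";
              Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                      .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
              XPath xpath = XPathFactory.newInstance().newXPath();
              // Select the ids of reports tagged for one community of interest.
              NodeList hits = (NodeList) xpath.evaluate(
                      "/reports/report[@area='COI-A']/@id", doc, XPathConstants.NODESET);
              for (int i = 0; i < hits.getLength(); i++) {
                  System.out.println("matching report id: " + hits.item(i).getNodeValue());
              }
          }
      }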

  4. Correlates of Access to Business Research Databases

    Science.gov (United States)

    Gottfried, John C.

    2010-01-01

    This study examines potential correlates of business research database access through academic libraries serving top business programs in the United States. Results indicate that greater access to research databases is related to enrollment in graduate business programs, but not to overall enrollment or status as a public or private institution.…

  5. Tri-party agreement databases, access mechanism and procedures. Revision 2

    International Nuclear Information System (INIS)

    Brulotte, P.J.

    1996-01-01

    This document contains the information required for the Washington State Department of Ecology (Ecology) and the U.S. Environmental Protection Agency (EPA) to access databases related to the Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement). It identifies the procedure required to obtain access to the Hanford Site computer networks and the Tri-Party Agreement related databases. It addresses security requirements, access methods, database availability dates, database access procedures, and the minimum computer hardware and software configurations required to operate within the Hanford Site networks. This document supersedes any previous agreements, including the Administrative Agreement to Provide Computer Access to U.S. Environmental Protection Agency (EPA) and the Administrative Agreement to Provide Computer Access to Washington State Department of Ecology (Ecology), agreements that were signed by the U.S. Department of Energy (DOE), Richland Operations Office (RL) in June 1990. Access approval to EPA and Ecology is extended by RL to include all Tri-Party Agreement relevant databases named in this document via the documented access method and date. Access to databases and systems not listed in this document will be granted as determined necessary and negotiated among Ecology, EPA, and RL through the Tri-Party Agreement Project Managers. The Tri-Party Agreement Project Managers are the primary points of contact for all activities to be carried out under the Tri-Party Agreement Action Plan. Access to the Tri-Party Agreement related databases and systems does not provide or imply any ownership on behalf of Ecology or EPA, whether public or private, of either the database or the system. Access to identified systems and databases does not include access to network/system administrative control information, network maps, etc.

  6. Nuclear Criticality Technology and Safety Project parameter study database

    International Nuclear Information System (INIS)

    Toffer, H.; Erickson, D.G.; Samuel, T.J.; Pearson, J.S.

    1993-03-01

    A computerized, knowledge-screened, comprehensive database of the nuclear criticality safety documentation has been assembled as part of the Nuclear Criticality Technology and Safety (NCTS) Project. The database is focused on nuclear criticality parameter studies. The database has been computerized using dBASE III Plus and can be used on a personal computer or a workstation. More than 1300 documents have been reviewed by nuclear criticality specialists over the last 5 years to produce over 800 database entries. Nuclear criticality specialists will be able to access the database and retrieve information about topical parameter studies, authors, and chronology. The database places the accumulated knowledge in the nuclear criticality area over the last 50 years at the fingertips of a criticality analyst

  7. Access To The PMM's Pixel Database

    Science.gov (United States)

    Monet, D.; Levine, S.

    1999-12-01

    The U.S. Naval Observatory Flagstaff Station is in the process of enabling access to the Precision Measuring Machine (PMM) program's pixel database. The initial release will include the pixels from the PMM's scans of the Palomar Observatory Sky Survey I (POSS-I) -O and -E surveys, the Whiteoak Extension, the European Southern Observatory-R survey, the Science and Engineering Research Council-J, -EJ, and -ER surveys, and the Anglo-Australian Observatory-R survey. (The SERC-ER and AAO-R surveys are currently incomplete.) As time allows, access to the POSS-II -J, -F, and -N surveys, the Palomar Infrared Milky Way Atlas, the Yale/San Juan Southern Proper Motion survey, and plates rejected by various surveys will be added. (POSS-II -J and -F are complete, but -N was never finished.) Eventually, some 10 Tbytes of pixel data will be available. Due to funding and technology limitations, the initial interface will have only limited functionality, and access time will be slow since the archive is stored on Digital Linear Tape (DLT). Usage of the pixel data will be restricted to non-commercial, scientific applications, and agreements on copyright issues have yet to be finalized. The poster presentation will give the URL.

  8. Large scale access tests and online interfaces to ATLAS conditions databases

    International Nuclear Information System (INIS)

    Amorim, A; Lopes, L; Pereira, P; Simoes, J; Soloviev, I; Burckhart, D; Schmitt, J V D; Caprini, M; Kolos, S

    2008-01-01

    The access of the ATLAS Trigger and Data Acquisition (TDAQ) system to the ATLAS Conditions Databases sets strong reliability and performance requirements on the database storage and access infrastructures. Several applications were developed to support the integration of Conditions database access with the online services in TDAQ, including the interface to the Information Services (IS) and to the TDAQ Configuration Databases. The information storage requirements were the motivation for the ONline ASynchronous Interface to COOL (ONASIC) from the Information Service (IS) to LCG/COOL databases. ONASIC avoids the possible backpressure from Online Database servers by managing a local cache. In parallel, OKS2COOL was developed to store Configuration Databases into an Offline Database with history record. The DBStressor application was developed to test and stress the access to the Conditions database using the LCG/COOL interface while operating in an integrated way as a TDAQ application. The performance scaling of simultaneous Conditions database read accesses was studied in the context of the ATLAS High Level Trigger large computing farms. A large set of tests were performed involving up to 1000 computing nodes that simultaneously accessed the LCG central database server infrastructure at CERN.

  9. The AAS Working Group on Accessibility and Disability (WGAD) Year 1 Highlights and Database Access

    Science.gov (United States)

    Knierman, Karen A.; Diaz Merced, Wanda; Aarnio, Alicia; Garcia, Beatriz; Monkiewicz, Jacqueline A.; Murphy, Nicholas Arnold

    2017-06-01

    The AAS Working Group on Accessibility and Disability (WGAD) was formed in January of 2016 with the express purpose of seeking equity of opportunity and building inclusive practices for disabled astronomers at all educational and career stages. In this presentation, we will provide a summary of current activities, focusing on developing best practices for accessibility with respect to astronomical databases, publications, and meetings. Due to the reliance of space sciences on databases, it is important to have user centered design systems for data retrieval. The cognitive overload that may be experienced by users of current databases may be mitigated by use of multi-modal interfaces such as xSonify. Such interfaces would be in parallel or outside the original database and would not require additional software efforts from the original database. WGAD is partnering with the IAU Commission C1 WG Astronomy for Equity and Inclusion to develop such accessibility tools for databases and methods for user testing. To collect data on astronomical conference and meeting accessibility considerations, WGAD solicited feedback from January AAS attendees via a web form. These data, together with upcoming input from the community and analysis of accessibility documents of similar conferences, will be used to create a meeting accessibility document. Additionally, we will update the progress of journal access guidelines and our social media presence via Twitter. We recommend that astronomical journals form committees to evaluate the accessibility of their publications by performing user-centered usability studies.

  10. Accessing the SEED genome databases via Web services API: tools for programmers.

    Science.gov (United States)

    Disz, Terry; Akhter, Sajia; Cuevas, Daniel; Olson, Robert; Overbeek, Ross; Vonstein, Veronika; Stevens, Rick; Edwards, Robert A

    2010-06-14

    The SEED integrates many publicly available genome sequences into a single resource. The database contains accurate and up-to-date annotations based on the subsystems concept that leverages clustering between genomes and other clues to accurately and efficiently annotate microbial genomes. The backend is used as the foundation for many genome annotation tools, such as the Rapid Annotation using Subsystems Technology (RAST) server for whole genome annotation, the metagenomics RAST server for random community genome annotations, and the annotation clearinghouse for exchanging annotations from different resources. In addition to a web user interface, the SEED also provides a Web services-based API for programmatic access to the data in the SEED, allowing the development of third-party tools and mash-ups. The currently exposed Web services encompass over forty different methods for accessing data related to microbial genome annotations. The Web services provide comprehensive access to the database back end, allowing any programmer access to the most consistent and accurate genome annotations available. The Web services are deployed using a platform-independent service-oriented approach that allows the user to choose the most suitable programming platform for their application. Example code demonstrates that Web services can be used to access the SEED using common bioinformatics programming languages such as Perl, Python, and Java. We present a novel approach to access the SEED database. Using Web services, a robust API for access to genomics data is provided, without requiring large volume downloads all at once. The API ensures timely access to the most current datasets available, including the new genomes as soon as they come online.
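
    In the spirit of the programmatic access described above, a Web-service call reduces to an ordinary HTTP request; in the Java sketch below the host, script name, function name and genome identifier are hypothetical placeholders, and the real method names and endpoints should be taken from the SEED servers documentation.

      import java.net.URI;
      import java.net.http.HttpClient;
      import java.net.http.HttpRequest;
      import java.net.http.HttpResponse;

      public class SeedApiSketch {
          public static void main(String[] args) throws Exception {
              // Hypothetical endpoint, method name and genome id, shown only to illustrate the call shape.
              String url = "https://servers.example.org/sapling/server.cgi"
                         + "?function=genomes_to_subsystems&genome=83333.1";
              HttpResponse<String> response = HttpClient.newHttpClient().send(
                      HttpRequest.newBuilder(URI.create(url)).GET().build(),
                      HttpResponse.BodyHandlers.ofString());
              System.out.println(response.body()); // annotation data, e.g. subsystems for the genome
          }
      }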

  11. Access database application in medical treatment management platform

    International Nuclear Information System (INIS)

    Wu Qingming

    2014-01-01

    For timely, accurate and flexible access to medical expenses data, we applied the Microsoft Access 2003 database management software and established a management platform for medical expenses. Through this management platform, overall hospital medical expenses can be controlled and real-time monitoring of medical expenses achieved. Using the Access database management platform for medical expenses not only changes the management model, but also promotes a sound management system for medical expenses. (authors)

  12. Extending Database Integration Technology

    National Research Council Canada - National Science Library

    Buneman, Peter

    1999-01-01

    Formal approaches to the semantics of databases and database languages can have immediate and practical consequences in extending database integration technologies to include a vastly greater range...

  13. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    CERN Document Server

    Valassi, A; Kalkhof, A; Salnikov, A; Wache, M

    2011-01-01

    The CORAL software is widely used at CERN for accessing the data stored by the LHC experiments using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several backends and deployment models, including local access to SQLite files, direct client access to Oracle and MySQL servers, and read-only access to Oracle through the FroNTier web server and cache. Two new components have recently been added to CORAL to implement a model involving a middle tier "CORAL server" deployed close to the database and a tree of "CORAL server proxy" instances, with data caching and multiplexing functionalities, deployed close to the client. The new components are meant to provide advantages for read-only and read-write data access, in both offline and online use cases, in the areas of scalability and performance (multiplexing for several incoming connections, optional data caching) and security (authentication via proxy certificates). A first implementation of the two new c...
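
    The caching and multiplexing role of the proxy tier can be pictured with a very small, illustrative Java sketch (not CORAL code): identical read-only queries from many clients are answered from a shared cache, and only the first one reaches the database server.

      import java.util.Map;
      import java.util.concurrent.ConcurrentHashMap;
      import java.util.function.Function;

      public class CachingProxySketch {
          private final Map<String, String> cache = new ConcurrentHashMap<>();
          private final Function<String, String> backend; // forwards a query to the real server

          public CachingProxySketch(Function<String, String> backend) {
              this.backend = backend;
          }

          public String query(String sql) {
              // computeIfAbsent multiplexes repeated identical client requests into one backend call.
              return cache.computeIfAbsent(sql, backend);
          }
      }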

  14. Database design for Physical Access Control System for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Sathishkumar, T., E-mail: satishkumart@igcar.gov.in; Rao, G. Prabhakara, E-mail: prg@igcar.gov.in; Arumugam, P., E-mail: aarmu@igcar.gov.in

    2016-08-15

    Highlights: • The database design needs to be optimized and highly efficient for real-time operation. • It requires a many-to-many mapping between the Employee table and the Doors table. • This mapping typically contains thousands of records and redundant data. • The proposed novel database design reduces the redundancy and provides abstraction. • This design is incorporated into the access control system developed in-house. - Abstract: An RFID (Radio Frequency IDentification) cum biometric-based two-level Access Control System (ACS) was designed and developed for providing access to vital areas of nuclear facilities. The system has both hardware [access controller] and software [server application, database and web client software] components. The proposed database design enables grouping of the employees based on the hierarchy of the organization and grouping of the doors based on Access Zones (AZ). This design also illustrates the mapping between the Employee Groups (EG) and the AZ. By following this approach in database design, a higher-level view can be presented to the system administrator, abstracting the inner details of the individual entities and doors. This paper describes the novel approach carried out in designing the database of the ACS.
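
    The grouping idea above can be sketched with plain SQL DDL issued through JDBC; the table and column names are invented for illustration, and the in-memory H2 database is assumed only so the statements can be executed as written.

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.Statement;

      public class AccessControlSchemaSketch {
          public static void main(String[] args) throws Exception {
              // Requires the H2 jar on the classpath; any SQL database would accept similar DDL.
              try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:acs");
                   Statement st = conn.createStatement()) {
                  st.execute("CREATE TABLE employee_group (eg_id INT PRIMARY KEY, name VARCHAR(64))");
                  st.execute("CREATE TABLE access_zone (az_id INT PRIMARY KEY, name VARCHAR(64))");
                  st.execute("CREATE TABLE employee (emp_id INT PRIMARY KEY, name VARCHAR(64), "
                           + "eg_id INT REFERENCES employee_group(eg_id))");
                  st.execute("CREATE TABLE door (door_id INT PRIMARY KEY, location VARCHAR(64), "
                           + "az_id INT REFERENCES access_zone(az_id))");
                  // Permissions are kept at group/zone level, avoiding a redundant
                  // row-per-employee-per-door mapping table.
                  st.execute("CREATE TABLE eg_az_permission ("
                           + "eg_id INT REFERENCES employee_group(eg_id), "
                           + "az_id INT REFERENCES access_zone(az_id), "
                           + "PRIMARY KEY (eg_id, az_id))");
                  System.out.println("schema created");
              }
          }
      }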

  15. Database design for Physical Access Control System for nuclear facilities

    International Nuclear Information System (INIS)

    Sathishkumar, T.; Rao, G. Prabhakara; Arumugam, P.

    2016-01-01

    Highlights: • The database design needs to be optimized and highly efficient for real-time operation. • It requires a many-to-many mapping between the Employee table and the Doors table. • This mapping typically contains thousands of records and redundant data. • The proposed novel database design reduces the redundancy and provides abstraction. • This design is incorporated into the access control system developed in-house. - Abstract: An RFID (Radio Frequency IDentification) cum biometric-based two-level Access Control System (ACS) was designed and developed for providing access to vital areas of nuclear facilities. The system has both hardware [access controller] and software [server application, database and web client software] components. The proposed database design enables grouping of the employees based on the hierarchy of the organization and grouping of the doors based on Access Zones (AZ). This design also illustrates the mapping between the Employee Groups (EG) and the AZ. By following this approach in database design, a higher-level view can be presented to the system administrator, abstracting the inner details of the individual entities and doors. This paper describes the novel approach carried out in designing the database of the ACS.

  16. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    Directory of Open Access Journals (Sweden)

    Surasak Saokaew

    Full Text Available Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is going to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Databases' characteristics, e.g. name of database, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for their potential HTA use in terms of the safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Forty databases (20 from Thailand and 20 from Japan) were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases could potentially be used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available from public sources. Our findings have shown that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA across the Asia-Pacific region is needed.

  17. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    Science.gov (United States)

    Saokaew, Surasak; Sugimoto, Takashi; Kamae, Isao; Pratoomsoot, Chayanin; Chaiyakunapruk, Nathorn

    2015-01-01

    Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is going to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Databases' characteristics, e.g. name of database, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for their potential HTA use in terms of the safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Forty databases (20 from Thailand and 20 from Japan) were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases could potentially be used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available from public sources. Our findings have shown that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA across the Asia-Pacific region is needed.

  18. Free access to INIS database provides a gateway to nuclear energy research results

    International Nuclear Information System (INIS)

    Tolonen, E.; Malmgren, M.

    2009-01-01

    Free access to the INIS database was opened to all Internet users around the world in May 2009. The article reviews the history of INIS (the International Nuclear Information System), the data acquisition process, the database content and search possibilities. INIS is focused on the worldwide literature on the peaceful uses of nuclear energy, and the database is produced in close collaboration with the IEA/ETDE World Energy Base (ETDEWEB), a database focusing on all aspects of energy. The Nuclear Science Abstracts database (NSA), which is a comprehensive collection of international nuclear science and technology literature for the period 1948 through 1976, is also briefly discussed in the article. In Finland, the recently formed Aalto University is responsible for collecting and disseminating information (literature) and for the preparation of input to the INIS and IEA/ETDE Databases on the national level.

  19. Accessing Electronic Databases for Curriculum Delivery in Schools ...

    African Journals Online (AJOL)

    This paper discussed the role of electronic databases in education with emphasis on the means of accessing the electronic databases. The paper further highlighted the various types and categories of electronic databases which the schools can explore in the process of teaching and learning as well as the techniques of ...

  20. A review of accessibility of administrative healthcare databases in the Asia-Pacific region.

    Science.gov (United States)

    Milea, Dominique; Azmi, Soraya; Reginald, Praveen; Verpillat, Patrice; Francois, Clement

    2015-01-01

    We describe and compare the availability and accessibility of administrative healthcare databases (AHDB) in several Asia-Pacific countries: Australia, Japan, South Korea, Taiwan, Singapore, China, Thailand, and Malaysia. The study included hospital records, reimbursement databases, prescription databases, and data linkages. Databases were first identified through PubMed, Google Scholar, and the ISPOR database register. Database custodians were contacted. Six criteria were used to assess the databases and provided the basis for a tool to categorise databases into seven levels ranging from least accessible (Level 1) to most accessible (Level 7). We also categorised overall data accessibility for each country as high, medium, or low based on accessibility of databases as well as the number of academic articles published using the databases. Fifty-four administrative databases were identified. Only a limited number of databases allowed access to raw data and were at Level 7 [Medical Data Vision EBM Provider, Japan Medical Data Centre (JMDC) Claims database and Nihon-Chouzai Pharmacy Claims database in Japan, and Medicare, Pharmaceutical Benefits Scheme (PBS), Centre for Health Record Linkage (CHeReL), HealthLinQ, Victorian Data Linkages (VDL), SA-NT DataLink in Australia]. At Levels 3-6 were several databases from Japan [Hamamatsu Medical University Database, Medi-Trend, Nihon University School of Medicine Clinical Data Warehouse (NUSM)], Australia [Western Australia Data Linkage (WADL)], Taiwan [National Health Insurance Research Database (NHIRD)], South Korea [Health Insurance Review and Assessment Service (HIRA)], and Malaysia [United Nations University (UNU)-Casemix]. Countries were categorised as having a high level of data accessibility (Australia, Taiwan, and Japan), medium level of accessibility (South Korea), or a low level of accessibility (Thailand, China, Malaysia, and Singapore). In some countries, data may be available but accessibility was restricted

  1. A review of accessibility of administrative healthcare databases in the Asia-Pacific region

    Science.gov (United States)

    Milea, Dominique; Azmi, Soraya; Reginald, Praveen; Verpillat, Patrice; Francois, Clement

    2015-01-01

    Objective We describe and compare the availability and accessibility of administrative healthcare databases (AHDB) in several Asia-Pacific countries: Australia, Japan, South Korea, Taiwan, Singapore, China, Thailand, and Malaysia. Methods The study included hospital records, reimbursement databases, prescription databases, and data linkages. Databases were first identified through PubMed, Google Scholar, and the ISPOR database register. Database custodians were contacted. Six criteria were used to assess the databases and provided the basis for a tool to categorise databases into seven levels ranging from least accessible (Level 1) to most accessible (Level 7). We also categorised overall data accessibility for each country as high, medium, or low based on accessibility of databases as well as the number of academic articles published using the databases. Results Fifty-four administrative databases were identified. Only a limited number of databases allowed access to raw data and were at Level 7 [Medical Data Vision EBM Provider, Japan Medical Data Centre (JMDC) Claims database and Nihon-Chouzai Pharmacy Claims database in Japan, and Medicare, Pharmaceutical Benefits Scheme (PBS), Centre for Health Record Linkage (CHeReL), HealthLinQ, Victorian Data Linkages (VDL), SA-NT DataLink in Australia]. At Levels 3–6 were several databases from Japan [Hamamatsu Medical University Database, Medi-Trend, Nihon University School of Medicine Clinical Data Warehouse (NUSM)], Australia [Western Australia Data Linkage (WADL)], Taiwan [National Health Insurance Research Database (NHIRD)], South Korea [Health Insurance Review and Assessment Service (HIRA)], and Malaysia [United Nations University (UNU)-Casemix]. Countries were categorised as having a high level of data accessibility (Australia, Taiwan, and Japan), medium level of accessibility (South Korea), or a low level of accessibility (Thailand, China, Malaysia, and Singapore). In some countries, data may be available but

  2. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. Discussion focuses on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and physical database.

  3. Distributed Database Access in the LHC Computing Grid with CORAL

    CERN Document Server

    Molnár, Z; Düllmann, D; Giacomo, G; Kalkhof, A; Valassi, A; CERN. Geneva. IT Department

    2009-01-01

    The CORAL package is the LCG Persistency Framework foundation for accessing relational databases. From the start CORAL has been designed to facilitate the deployment of the LHC experiment database applications in a distributed computing environment. In particular we cover: improvements to database service scalability through client connection management; platform-independent, multi-tier scalable database access through connection multiplexing and caching; and a secure authentication and authorisation scheme integrated with existing grid services. We will summarize the deployment experience from several experiment productions using the distributed database infrastructure, which is now available in LCG. Finally, we present perspectives for future developments in this area.

  4. XML technology planning database : lessons learned

    Science.gov (United States)

    Some, Raphael R.; Neff, Jon M.

    2005-01-01

    A hierarchical Extensible Markup Language (XML) database called XCALIBR (XML Capability Analysis LIBRary) has been developed by the New Millennium Program to assist in technology return on investment (ROI) analysis and technology portfolio optimization. The database contains mission requirements and technology capabilities, which are related by use of an XML dictionary. The XML dictionary codifies a standardized taxonomy for space missions, systems, subsystems and technologies. In addition to being used for ROI analysis, the database is being examined for use in project planning, tracking and documentation. During the past year, the database has moved from development into alpha testing. This paper describes the lessons learned during construction and testing of the prototype database and the motivation for moving from an XML taxonomy to a standard XML-based ontology.

  5. Using XML technology for the ontology-based semantic integration of life science databases.

    Science.gov (United States)

    Philippi, Stephan; Köhler, Jacob

    2004-06-01

    Several hundred internet-accessible life science databases with constantly growing contents and varying areas of specialization are publicly available. Database integration, consequently, is a fundamental prerequisite for being able to answer complex biological questions. Due to the presence of syntactic, schematic, and semantic heterogeneities, large-scale database integration at present takes considerable effort. As there is growing acceptance of extensible markup language (XML) as a means for data exchange in the life sciences, this article focuses on the impact of XML technology on database integration in this area. In detail, a general architecture for ontology-driven data integration based on XML technology is introduced, which overcomes some of the traditional problems in this area. As a proof of concept, a prototypical implementation of this architecture based on a native XML database and an expert system shell is described for the realization of a real-world integration scenario.
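
    As a rough illustration of the ontology-driven mapping idea (not the authors' architecture, which relies on a native XML database and an expert system shell), the following Python sketch normalises two toy XML records whose tags differ but denote the same concepts; the synonym map and all tag names are invented stand-ins for a real ontology.

        import xml.etree.ElementTree as ET

        # Minimal synonym map standing in for a real ontology; all names
        # here are invented for illustration only.
        ontology = {"genename": "gene", "locus": "gene",
                    "organism": "species", "taxon": "species"}

        source_a = ET.fromstring(
            "<entry><genename>TP53</genename><organism>Homo sapiens</organism></entry>")
        source_b = ET.fromstring(
            "<rec><locus>TP53</locus><taxon>Homo sapiens</taxon></rec>")

        def normalise(record):
            """Map each element's tag onto the shared ontology concept."""
            return {ontology.get(child.tag, child.tag): child.text for child in record}

        print(normalise(source_a))  # {'gene': 'TP53', 'species': 'Homo sapiens'}
        print(normalise(source_b))  # same keys, so the two records can now be merged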

  6. Nuclear technology databases and information network systems

    International Nuclear Information System (INIS)

    Iwata, Shuichi; Kikuchi, Yasuyuki; Minakuchi, Satoshi

    1993-01-01

    This paper describes databases related to nuclear (science) technology, and information networks. The following contents are collected in this paper: the database developed by JAERI, ENERGY NET, ATOM NET, the NUCLEN nuclear information database, INIS, the NUclear Code Information Service (NUCLIS), the Social Application of Nuclear Technology Accumulation project (SANTA), the Nuclear Information Database/Communication System (NICS), the reactor materials database, the radiation effects database, the NucNet European nuclear information database, and the reactor dismantling database. (J.P.N.)

  7. "Mr. Database" : Jim Gray and the History of Database Technologies.

    Science.gov (United States)

    Hanwahr, Nils C

    2017-12-01

    Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the developments of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of Transaction Processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e. g. leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.

  8. Freely Accessible Chemical Database Resources of Compounds for in Silico Drug Discovery.

    Science.gov (United States)

    Yang, JingFang; Wang, Di; Jia, Chenyang; Wang, Mengyao; Hao, GeFei; Yang, GuangFu

    2018-05-07

    In silico drug discovery has proved to be a solidly established key component in early drug discovery. However, this task is hampered by limitations in the quantity and quality of compound databases available for screening. To overcome these obstacles, freely accessible database resources of compounds have proliferated in recent years. Nevertheless, choosing appropriate tools to work with these freely accessible databases is crucial. To the best of our knowledge, this is the first systematic review on this issue. The advantages and drawbacks of the chemical databases were analyzed and summarized, based on the six categories of freely accessible chemical databases collected from the literature for this review. Suggestions are provided on how and under which conditions the use of these databases is reasonable. Tools and procedures for building 3D-structure chemical libraries are also introduced. In this review, we describe the freely accessible chemical database resources for in silico drug discovery. In particular, the chemical information available for building chemical databases is an attractive resource for drug design and helps to alleviate experimental pressure.

  9. Internet-accessible radiographic database of Vietnam War casualties for medical student education.

    Science.gov (United States)

    Critchley, Eric P; Smirniotopoulos, James G

    2003-04-01

    The purpose of this study was to determine the feasibility of archiving radiographic images from Vietnam era conflict casualties into a personal computer-based electronic database of text and images and displaying the data using an Internet-accessible database for preservation and educational purposes. Thirty-two patient cases were selected at random from a pool of 1,000 autopsy reports in which radiographs were available. A total of 74 radiographs from these cases were digitized using a commercial image scanner and then uploaded into an Internet accessible database. The quality of the digitized images was assessed by administering an image-based test to a group of 12 medical students. No statistically significant (p > 0.05) differences were found between test scores when using the original radiographs versus using the digitized radiographs on the Internet-accessible database. An Internet-accessible database is capable of effectively archiving Vietnam era casualty radiographs for educational purposes.

  10. Multimedia database retrieval technology and applications

    CERN Document Server

    Muneesawang, Paisarn; Guan, Ling

    2014-01-01

    This book explores multimedia applications that emerged from computer vision and machine learning technologies. These state-of-the-art applications include MPEG-7, interactive multimedia retrieval, multimodal fusion, annotation, and database re-ranking. The application-oriented approach maximizes reader understanding of this complex field. Established researchers explain the latest developments in multimedia database technology and offer a glimpse of future technologies. The authors emphasize the crucial role of innovation, inspiring users to develop new applications in multimedia technologies

  11. DOE technology information management system database study report

    Energy Technology Data Exchange (ETDEWEB)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  12. Technology solutions to support supervisory activities and also to provide information access to the society

    Science.gov (United States)

    Paladini, D.; Mello, A. B.

    2016-07-01

    Inmetro's data about the conformity of certified products, processes and services are usually held in fragmented databases that are difficult to access for several reasons, for instance the lack of computational solutions that would allow this kind of access for its users. A discussion of some technological solutions to support supervisory activities by the appropriate regulatory bodies, and also to provide information access to society in general, is presented herein, along with a theoretical explanation of the pros and cons of such technologies, leading to the conclusion that a mobile platform seems to be the best tool for Inmetro's requirements.

  13. Mandatory and Location-Aware Access Control for Relational Databases

    Science.gov (United States)

    Decker, Michael

    Access control is concerned with determining which operations a particular user is allowed to perform on a particular electronic resource. For example, an access control decision could say that user Alice is allowed to perform the operation read (but not write) on the resource research report. With conventional access control this decision is based on the user's identity, whereas the basic idea of Location-Aware Access Control (LAAC) is to also evaluate a user's current location when deciding whether a particular request should be granted or denied. LAAC is an interesting approach for mobile information systems because these systems are exposed to specific security threats like the loss of a device. Some data models for LAAC can be found in the literature, but almost all of them are based on RBAC and none of them is designed especially for Database Management Systems (DBMS). In this paper we therefore propose a LAAC approach for DBMS and describe a prototypical implementation of that approach based on database triggers.

  14. The INIS database on another efficient site... and on free access

    International Nuclear Information System (INIS)

    Libmann, F.

    2009-01-01

    This article presents the INIS database, its history, document type content, and availability. It stresses the recent opening of the database to free access, the functionality of the search interface, and the quality of the work and the professionalism of the database producers. (J.S.)

  15. USING THE INTERNATIONAL SCIENTOMETRIC DATABASES OF OPEN ACCESS IN SCIENTIFIC RESEARCH

    Directory of Open Access Journals (Sweden)

    O. Galchevska

    2015-05-01

    Full Text Available The article considers the problem of using international scientometric databases in research activities as web-oriented resources and services that serve as means of publishing and disseminating research results. Selection criteria for open-access scientometric platforms for conducting scientific research are emphasized (coverage of Ukrainian scientific periodicals and publications, data accuracy, general characteristics of the international scientometric database, and technical and functional characteristics and their indexes). A review of the most popular open-access scientometric databases, Google Scholar, Russian Scientific Citation Index (RSCI), Scholarometer, Index Copernicus (IC) and Microsoft Academic Search, is made. The advantages of using the international scientometric database Google Scholar in conducting scientific research are determined, as are prospects for research into separating out the cloud-based information and analytical services of the system.

  16. Relational Database Technology: An Overview.

    Science.gov (United States)

    Melander, Nicole

    1987-01-01

    Describes the development of relational database technology as it applies to educational settings. Discusses some of the new tools and models being implemented in an effort to provide educators with technologically advanced ways of answering questions about education programs and data. (TW)

  17. Evolution of Database Replication Technologies for WLCG

    CERN Document Server

    Baranowski, Zbigniew; Blaszczyk, Marcin; Dimitrov, Gancho; Canali, Luca

    2015-01-01

    In this article we summarize several years of experience with database replication technologies used at WLCG, and we provide a short review of the available Oracle technologies and their key characteristics. One of the notable changes and improvements in this area in the recent past has been the introduction of Oracle GoldenGate as a replacement for Oracle Streams. We report in this article on the preparation and later upgrades for remote replication done in collaboration with ATLAS and Tier 1 database administrators, including the experience from running Oracle GoldenGate in production. Moreover, we report on another key technology in this area: Oracle Active Data Guard, which has been adopted in several of the mission-critical use cases for database replication between online and offline databases for the LHC experiments.

  18. HCUP State Emergency Department Databases (SEDD) - Restricted Access File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The State Emergency Department Databases (SEDD) contain the universe of emergency department visits in participating States. Restricted access data files are...

  19. libChEBI: an API for accessing the ChEBI database.

    Science.gov (United States)

    Swainston, Neil; Hastings, Janna; Dekker, Adriano; Muthukrishnan, Venkatesh; May, John; Steinbeck, Christoph; Mendes, Pedro

    2016-01-01

    ChEBI is a database and ontology of chemical entities of biological interest. It is widely used as a source of identifiers to facilitate unambiguous reference to chemical entities within biological models, databases, ontologies and literature. ChEBI contains a wealth of chemical data, covering over 46,500 distinct chemical entities, and related data such as chemical formula, charge, molecular mass, structure, synonyms and links to external databases. Furthermore, ChEBI is an ontology, and thus provides meaningful links between chemical entities. Unlike many other resources, ChEBI is fully human-curated, providing a reliable, non-redundant collection of chemical entities and related data. While ChEBI is supported by a web service for programmatic access and a number of download files, it does not have an API library to facilitate the use of ChEBI and its data in cheminformatics software. To provide this missing functionality, libChEBI, a comprehensive API library for accessing ChEBI data, is introduced. libChEBI is available in Java, Python and MATLAB versions from http://github.com/libChEBI, and provides full programmatic access to all data held within the ChEBI database through a simple and documented API. libChEBI is reliant upon the (automated) download and regular update of flat files that are held locally. As such, libChEBI can be embedded in both on- and off-line software applications. libChEBI allows better support of ChEBI and its data in the development of new cheminformatics software. Covering three key programming languages, it allows for the entirety of the ChEBI database to be accessed easily and quickly through a simple API. All code is open access and freely available.
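
    As a minimal sketch of the kind of programmatic access described above, assuming the Python distribution installs as libchebipy and exposes a ChebiEntity class with simple getters (the exact package and method names here are assumptions, not taken from the paper):

        from libchebipy import ChebiEntity

        # CHEBI:15377 is the ChEBI identifier for water; getter names are assumed.
        water = ChebiEntity("CHEBI:15377")
        print(water.get_name())     # expected: 'water'
        print(water.get_formula())  # expected: 'H2O'
        print(water.get_charge())   # expected: 0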

  20. Access to digital library databases in higher education: design problems and infrastructural gaps.

    Science.gov (United States)

    Oswal, Sushil K

    2014-01-01

    After defining accessibility and usability, the author offers a broad survey of the research studies on digital content databases which have thus far primarily depended on data drawn from studies conducted by sighted researchers with non-disabled users employing screen readers and low vision devices. This article aims at producing a detailed description of the difficulties confronted by blind screen reader users with online library databases which now hold most of the academic, peer-reviewed journal and periodical content essential for research and teaching in higher education. The approach taken here is borrowed from descriptive ethnography which allows the author to create a complete picture of the accessibility and usability problems faced by an experienced academic user of digital library databases and screen readers. The author provides a detailed analysis of the different aspects of accessibility issues in digital databases under several headers with a special focus on full-text PDF files. The author emphasizes that long-term studies with actual, blind screen reader users employing both qualitative and computerized research tools can yield meaningful data for the designers and developers to improve these databases to a level that they begin to provide an equal access to the blind.

  1. Full-Text Linking: Affiliated versus Nonaffiliated Access in a Free Database.

    Science.gov (United States)

    Grogg, Jill E.; Andreadis, Debra K.; Kirk, Rachel A.

    2002-01-01

    Presents a comparison of access to full-text articles from a free bibliographic database (PubSCIENCE) for affiliated and unaffiliated users. Found that affiliated users had access to more full-text articles than unaffiliated users had, and that both types of users could increase their level of access through additional searching and greater…

  2. WEB-BASED DATABASE ON RENEWAL TECHNOLOGIES ...

    Science.gov (United States)

    As U.S. utilities continue to shore up their aging infrastructure, renewal needs now represent over 43% of annual expenditures compared to new construction for drinking water distribution and wastewater collection systems (Underground Construction [UC], 2016). An increased understanding of renewal options will ultimately assist drinking water utilities in reducing water loss and help wastewater utilities to address infiltration and inflow issues in a cost-effective manner. It will also help to extend the service lives of both drinking water and wastewater mains. This research effort involved collecting case studies on the use of various trenchless pipeline renewal methods and providing the information in an online searchable database. The overall objective was to further support technology transfer and information sharing regarding emerging and innovative renewal technologies for water and wastewater mains. The result of this research is a Web-based, searchable database that utility personnel can use to obtain technology performance and cost data, as well as case study references. The renewal case studies include: technologies used; the conditions under which the technology was implemented; costs; lessons learned; and utility contact information. The online database also features a data mining tool for automated review of the technologies selected and cost data. Based on a review of the case study results and industry data, several findings are presented on tren

  3. Optimization and Accessibility of the Qweak Database

    Science.gov (United States)

    Urban, Erik; Spayde, Damon

    2010-11-01

    The Qweak experiment is a multi-institutional collaborative effort at Thomas Jefferson National Accelerator Facility designed to accurately determine the weak nuclear charge of a proton through measurements of the parity violating asymmetries of electron-proton elastic scattering that result from pulses of electrons with opposite helicities. Through the study of these scattering asymmetries, the Qweak experiment hopes to constrain extensions of the Standard Model or find indications of new physics. Since precision is critical to the success of the Qweak experiment, the collaboration will be taking data for thousands of hours. The Qweak database is responsible for storing the non-binary, processed data of this experiment in a meaningful and organized manner for use at a later date. The goal of this undertaking is not only to create a database which can input and output data quickly, but to create one which can easily be accessed by those who have minimal knowledge of the database language. Through tests on the system, retrieval and insert times have been optimized and, in addition, the implementation of summary tables and additional programs should make the majority of commonly sought results readily available to database novices.

  4. Database mirroring in fault-tolerant continuous technological process control

    Directory of Open Access Journals (Sweden)

    R. Danel

    2015-10-01

    Full Text Available This paper describes implementations of mirroring technology in selected database systems: Microsoft SQL Server, MySQL and Caché. By simulating critical failures, the systems' behavior and their resilience against failure were tested. The aim was to determine whether database mirroring is suitable for use in continuous metallurgical processes to ensure a fault-tolerant solution at affordable cost. Present-day database systems are characterized by high robustness and are resistant to sudden system failure. Database mirroring technologies are reliable, and even low-budget projects can be provided with a decent fault-tolerant solution. The database system technologies available for low-budget projects are, however, not suitable for use in real-time systems.
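
    A minimal application-side sketch of the failure scenario described above, assuming a MySQL primary/mirror pair reachable at hypothetical hostnames with placeholder credentials; it is not taken from the paper's test setup.

        import pymysql

        # Hypothetical hosts and credentials; the mirror is tried whenever the
        # primary cannot be reached within the connection timeout.
        SERVERS = [
            dict(host="db-primary.example.local", user="app",
                 password="secret", database="process_data"),
            dict(host="db-mirror.example.local", user="app",
                 password="secret", database="process_data"),
        ]

        def query_with_failover(sql, params=()):
            last_error = None
            for server in SERVERS:
                try:
                    connection = pymysql.connect(connect_timeout=2, **server)
                    try:
                        with connection.cursor() as cursor:
                            cursor.execute(sql, params)
                            return cursor.fetchall()
                    finally:
                        connection.close()
                except pymysql.err.OperationalError as error:
                    last_error = error  # server unreachable; fall through to the next one
            raise last_error

        rows = query_with_failover("SELECT sensor_id, value FROM measurements LIMIT 5")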

  5. The TJ-II Relational Database Access Library: A User's Guide

    International Nuclear Information System (INIS)

    Sanchez, E.; Portas, A. B.; Vega, J.

    2003-01-01

    A relational database has been developed to store data representing physical values from TJ-II discharges. This new database complements the existing TJ-II raw data database. The database resides on a host computer running the Windows 2000 Server operating system and is managed by SQL Server. A function library has been developed that permits remote access to these data from user programs running on computers connected to the TJ-II local area networks via remote procedure call. In this document a general description of the database and its organization is provided. Also given are a detailed description of the functions included in the library and examples of how to use these functions in computer programs written in the FORTRAN and C languages. (Author) 8 refs

  6. Database of Information technology resources

    OpenAIRE

    Barzda, Erlandas

    2005-01-01

    The subject of this master's work is an internet information resource database. The work also addresses the problems of old information systems which do not meet contemporary requirements. The aim is to create an internet information system, based on object-oriented technologies and tailored to computer users' needs. The internet information database system helps computer administrators to get all the needed information about computer network elements and to easily register all changes in...

  7. A database in ACCESS for assessing vaccine serious adverse events

    Directory of Open Access Journals (Sweden)

    Thomas RE

    2015-04-01

    Full Text Available Roger E Thomas,1 Dave Jackson2,3 1Department of Family Medicine, G012 Health Sciences Centre, University of Calgary Medical School, Calgary, AB, Canada; 2Independent Research Consultant, Calgary, AB, Canada; 3Database Consultant, University of Calgary, Calgary, AB, Canada Purpose: To provide a free, flexible database for use by any researcher for assessing reports of adverse events after vaccination. Results: A database was developed in Microsoft ACCESS to assess reports of serious adverse events after yellow fever vaccination using Brighton Collaboration criteria. The database is partly automated (if data panels contain identical data fields, the data are automatically entered into those fields as well). The purpose is to provide the database free of charge for developers to add additional panels to assess other vaccines. Keywords: serious adverse events after vaccination, database, process to assess vaccine-associated events

  8. NoSQL technologies for the CMS Conditions Database

    Science.gov (United States)

    Sipos, Roland

    2015-12-01

    With the restart of the LHC in 2015, the growth of the CMS Conditions dataset will continue, and the need for consistent and highly available access to the Conditions is therefore a strong motivation to revisit different aspects of the current data storage solutions. We present a study of alternative data storage backends for the Conditions Databases, evaluating some of the most popular NoSQL databases to support a key-value representation of the CMS Conditions. The definition of the database infrastructure is based on the need to store the conditions as BLOBs. Because of this, each condition can reach a size that may require special treatment (splitting) in these NoSQL databases. As big binary objects may be problematic in several database systems, and also to give an accurate baseline, a testing framework extension was implemented to measure the characteristics of the handling of arbitrary binary data in these databases. Based on the evaluation, prototypes of a document store, a column-oriented store, and a plain key-value store are deployed. An adaptation layer to access the backends in the CMS Offline software was developed to provide transparent support for these NoSQL databases in the CMS context. Additional data modelling approaches and considerations in the software layer, deployment and automatization of the databases are also covered in the research. In this paper we present the results of the evaluation as well as a performance comparison of the prototypes studied.
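
    The splitting of large condition BLOBs mentioned above can be pictured with the following sketch, where a plain Python dict stands in for the key-value backend and the chunk size and key names are arbitrary choices for illustration:

        # A large binary condition payload is stored as fixed-size chunks under
        # derived keys, plus a small metadata entry recording the chunk count.
        CHUNK_SIZE = 1024 * 1024  # 1 MiB per chunk (illustrative value)

        def put_blob(store, key, payload):
            chunks = [payload[i:i + CHUNK_SIZE] for i in range(0, len(payload), CHUNK_SIZE)]
            store[f"{key}:meta"] = len(chunks)              # remember how many pieces exist
            for index, chunk in enumerate(chunks):
                store[f"{key}:{index}"] = chunk

        def get_blob(store, key):
            count = store[f"{key}:meta"]
            return b"".join(store[f"{key}:{index}"] for index in range(count))

        store = {}
        put_blob(store, "ecal_pedestals_v3", b"\x00" * (5 * CHUNK_SIZE + 17))
        assert len(get_blob(store, "ecal_pedestals_v3")) == 5 * CHUNK_SIZE + 17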

  9. Applications of GIS and database technologies to manage a Karst Feature Database

    Science.gov (United States)

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.

  10. Evolution of grid-wide access to database resident information in ATLAS using Frontier

    CERN Document Server

    Barberis, D; The ATLAS collaboration; de Stefano, J; Dewhurst, A L; Dykstra, D; Front, D

    2012-01-01

    The ATLAS experiment deployed Frontier technology world-wide during the initial year of LHC collision data taking to enable user analysis jobs running on the World-wide LHC Computing Grid to access database resident data. Since that time, the deployment model has evolved to optimize resources, improve performance, and streamline maintenance of Frontier and related infrastructure. In this presentation we focus on the specific changes in the deployment and improvements undertaken, such as the optimization of cache and launchpad location, the use of RPMs for more uniform deployment of underlying Frontier related components, improvements in monitoring, optimization of fail-over, and an increasing use of a centrally managed database containing site-specific information (for configuration of services and monitoring). In addition, analysis of Frontier logs has allowed us a deeper understanding of problematic queries and of use cases. Use of the system has grown beyond just user analysis and subsyste...

  11. [Project evidência [evidence]: research and education about accessing scientific databases in Azores].

    Science.gov (United States)

    Soares, Hélia; Pereira, Sandra M; Neves, Ajuda; Gomes, Amy; Teixeira, Bruno; Oliveira, Carolina; Sousa, Fábio; Tavares, Márcio; Tavares, Patrícia; Dutra, Raquel; Pereira, Hélder Rocha

    2013-04-01

    Project Evidência [Evidence] intends to promote the use of scientific databases among nurses. This study aims to design educational interventions that facilitate nurses' access to these databases, to determine nurses' habits regarding the use of scientific databases, and to determine the impact that educational interventions on scientific databases have on Azorean nurses who volunteered for this project. An intervention project was conducted, and a quantitative descriptive survey was designed to evaluate the impact two and five months after the educational intervention. This impact was investigated considering certain aspects, namely the nurses' knowledge, habits and reasons for using scientific databases. A total of 192 nurses participated in this study, and the primary results indicate that the educational intervention had a positive impact, based not only on the increased frequency of use of platforms or databases of scientific information (DSIs) but also on the competence and self-awareness regarding their use and consideration of the reasons for accessing this information.

  12. Database architecture optimized for the new bottleneck: Memory access

    NARCIS (Netherlands)

    P.A. Boncz (Peter); S. Manegold (Stefan); M.L. Kersten (Martin)

    1999-01-01

    textabstractIn the past decade, advances in speed of commodity CPUs have far out-paced advances in memory latency. Main-memory access is therefore increasingly a performance bottleneck for many computer applications, including database systems. In this article, we use a simple scan test to show the
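
    The "simple scan test" can be roughly re-created in Python/NumPy as below; interpreter and NumPy overheads blur the effect compared to the original C experiments, so this is only an illustration of the idea that widely strided scans waste memory bandwidth and expose memory latency:

        import time
        import numpy as np

        data = np.ones(16_000_000, dtype=np.int64)  # roughly 128 MB working set

        for stride in (1, 2, 4, 8, 16):
            view = data[::stride][: len(data) // 16]  # same element count per run
            start = time.perf_counter()
            view.sum()
            elapsed = time.perf_counter() - start
            print(f"stride {stride:2d}: {elapsed * 1e9 / len(view):.2f} ns per element")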

  13. Optimizing Database Architecture for the New Bottleneck: Memory Access

    NARCIS (Netherlands)

    S. Manegold (Stefan); P.A. Boncz (Peter); M.L. Kersten (Martin)

    2000-01-01

    textabstractIn the past decade, advances in speed of commodity CPUs have far out-paced advances in memory latency. Main-memory access is therefore increasingly a performance bottleneck for many computer applications, including database systems. In this article, we use a simple scan test to show the

  14. Solar Sail Propulsion Technology Readiness Level Database

    Science.gov (United States)

    Adams, Charles L.

    2004-01-01

    The NASA In-Space Propulsion Technology (ISPT) Projects Office has been sponsoring 2 solar sail system design and development hardware demonstration activities over the past 20 months. Able Engineering Company (AEC) of Goleta, CA is leading one team and L Garde, Inc. of Tustin, CA is leading the other team. Component, subsystem and system fabrication and testing has been completed successfully. The goal of these activities is to advance the technology readiness level (TRL) of solar sail propulsion from 3 towards 6 by 2006. These activities will culminate in the deployment and testing of 20-meter solar sail system ground demonstration hardware in the 30 meter diameter thermal-vacuum chamber at NASA Glenn Plum Brook in 2005. This paper will describe the features of a computer database system that documents the results of the solar sail development activities to-date. Illustrations of the hardware components and systems, test results, analytical models, relevant space environment definition and current TRL assessment, as stored and manipulated within the database are presented. This database could serve as a central repository for all data related to the advancement of solar sail technology sponsored by the ISPT, providing an up-to-date assessment of the TRL of this technology. Current plans are to eventually make the database available to the Solar Sail community through the Space Transportation Information Network (STIN).

  15. A Model-driven Role-based Access Control for SQL Databases

    Directory of Open Access Journals (Sweden)

    Raimundas Matulevičius

    2015-07-01

    Full Text Available Nowadays security has become an important aspect in information systems engineering. A mainstream method for information system security is Role-based Access Control (RBAC), which restricts system access to authorised users. While the benefits of RBAC are widely acknowledged, the implementation and administration of RBAC policies remains a human-intensive activity, typically postponed until the implementation and maintenance phases of system development. This deferred security engineering approach makes it difficult for security requirements to be accurately captured and for the system's implementation to be kept aligned with these requirements as the system evolves. In this paper we propose a model-driven approach to manage SQL database access under the RBAC paradigm. The starting point of the approach is an RBAC model captured in SecureUML. This model is automatically translated to Oracle Database views and instead-of triggers code, which implements the security constraints. The approach has been fully instrumented as a prototype and its effectiveness has been validated by means of a case study.
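
    The shape of the generated artefacts described above (a view plus an instead-of trigger enforcing a role constraint) can be illustrated with the following self-contained sketch; it substitutes SQLite for Oracle so it runs without a server and keeps the current role in a one-row table instead of an Oracle application context, so it mirrors the pattern rather than the authors' actual generated code.

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
            CREATE TABLE patient(id INTEGER PRIMARY KEY, name TEXT, diagnosis TEXT);
            CREATE TABLE app_session(role TEXT);      -- stand-in for a session context
            INSERT INTO app_session VALUES ('NURSE');
            INSERT INTO patient VALUES (1, 'Alice', 'flu');

            CREATE VIEW patient_v AS SELECT id, name, diagnosis FROM patient;

            -- Generated-style constraint: only the DOCTOR role may change a diagnosis.
            CREATE TRIGGER patient_v_update INSTEAD OF UPDATE ON patient_v
            BEGIN
                SELECT RAISE(ABORT, 'update not permitted for this role')
                WHERE (SELECT role FROM app_session) <> 'DOCTOR';
                UPDATE patient SET diagnosis = NEW.diagnosis WHERE id = OLD.id;
            END;
        """)

        try:
            db.execute("UPDATE patient_v SET diagnosis = 'cold' WHERE id = 1")
        except sqlite3.DatabaseError as error:
            print("rejected:", error)   # the NURSE role may not update the diagnosis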

  16. Applying artificial intelligence to astronomical databases - a surveyof applicable technology.

    Science.gov (United States)

    Rosenthal, D. A.

    This paper surveys several emerging technologies which are relevant to astronomical database issues such as interface technology, internal database representation, and intelligent data reduction aids. Among the technologies discussed are natural language understanding, frame and object representations, planning, pattern analysis, machine learning and the nascent study of simulated neural nets. These techniques will become increasingly important for astronomical research, and in particular, for applications with large databases.

  17. Pan European Phenological database (PEP725): a single point of access for European data

    Science.gov (United States)

    Templ, Barbara; Koch, Elisabeth; Bolmgren, Kjell; Ungersböck, Markus; Paul, Anita; Scheifinger, Helfried; Rutishauser, This; Busto, Montserrat; Chmielewski, Frank-M.; Hájková, Lenka; Hodzić, Sabina; Kaspar, Frank; Pietragalla, Barbara; Romero-Fresneda, Ramiro; Tolvanen, Anne; Vučetič, Višnja; Zimmermann, Kirsten; Zust, Ana

    2018-02-01

    The Pan European Phenology (PEP) project is a European infrastructure to promote and facilitate phenological research, education, and environmental monitoring. The main objective is to maintain and develop a Pan European Phenological database (PEP725) with an open, unrestricted data access for science and education. PEP725 is the successor of the database developed through the COST action 725 "Establishing a European phenological data platform for climatological applications" working as a single access point for European-wide plant phenological data. So far, 32 European meteorological services and project partners from across Europe have joined and supplied data collected by volunteers from 1868 to the present for the PEP725 database. Most of the partners actively provide data on a regular basis. The database presently holds almost 12 million records, about 46 growing stages and 265 plant species (including cultivars), and can be accessed via http://www.pep725.eu/. Users of the PEP725 database have studied a diversity of topics ranging from climate change impact, plant physiological question, phenological modeling, and remote sensing of vegetation to ecosystem productivity.

  18. Pan European Phenological database (PEP725): a single point of access for European data

    Science.gov (United States)

    Templ, Barbara; Koch, Elisabeth; Bolmgren, Kjell; Ungersböck, Markus; Paul, Anita; Scheifinger, Helfried; Rutishauser, This; Busto, Montserrat; Chmielewski, Frank-M.; Hájková, Lenka; Hodzić, Sabina; Kaspar, Frank; Pietragalla, Barbara; Romero-Fresneda, Ramiro; Tolvanen, Anne; Vučetič, Višnja; Zimmermann, Kirsten; Zust, Ana

    2018-06-01

    The Pan European Phenology (PEP) project is a European infrastructure to promote and facilitate phenological research, education, and environmental monitoring. The main objective is to maintain and develop a Pan European Phenological database (PEP725) with an open, unrestricted data access for science and education. PEP725 is the successor of the database developed through the COST action 725 "Establishing a European phenological data platform for climatological applications" working as a single access point for European-wide plant phenological data. So far, 32 European meteorological services and project partners from across Europe have joined and supplied data collected by volunteers from 1868 to the present for the PEP725 database. Most of the partners actively provide data on a regular basis. The database presently holds almost 12 million records, about 46 growing stages and 265 plant species (including cultivars), and can be accessed via http://www.pep725.eu/ . Users of the PEP725 database have studied a diversity of topics ranging from climate change impact, plant physiological question, phenological modeling, and remote sensing of vegetation to ecosystem productivity.

  19. Distributed Access View Integrated Database (DAVID) system

    Science.gov (United States)

    Jacobs, Barry E.

    1991-01-01

    The Distributed Access View Integrated Database (DAVID) System, which was adopted by the Astrophysics Division for their Astrophysics Data System, is a solution to the system heterogeneity problem. The heterogeneous components of the Astrophysics problem is outlined. The Library and Library Consortium levels of the DAVID approach are described. The 'books' and 'kits' level is discussed. The Universal Object Typer Management System level is described. The relation of the DAVID project with the Small Business Innovative Research (SBIR) program is explained.

  20. Coordinating Mobile Databases: A System Demonstration

    OpenAIRE

    Zaihrayeu, Ilya; Giunchiglia, Fausto

    2004-01-01

    In this paper we present the Peer Database Management System (PDBMS). This system runs on top of a standard database management system and allows it to connect its database with other (peer) databases on the network. A particularity of our solution is that PDBMS allows conventional database technology to be effectively operational in mobile settings. We think of database mobility as a database network, where databases appear and disappear spontaneously and their network access point...

  1. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-06-17

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and
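
    A hedged example of the kind of retrieval these tools automate: pulling one day of stream discharge for a single gauge from the public USGS NWIS web service. The endpoint, parameters, and JSON layout are assumptions based on the service's public documentation, and none of the tools named above (HDAPS, IWRMS, D4EM EDDT, FRAMES) are used here.

        import requests

        # Site 01646500 (Potomac River) and parameter 00060 (discharge) are example values.
        response = requests.get(
            "https://waterservices.usgs.gov/nwis/iv/",
            params={"format": "json", "sites": "01646500",
                    "parameterCd": "00060", "period": "P1D"},
            timeout=30,
        )
        response.raise_for_status()
        series = response.json()["value"]["timeSeries"][0]["values"][0]["value"]
        for point in series[:5]:
            print(point["dateTime"], point["value"], "cfs")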

  2. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    International Nuclear Information System (INIS)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-01-01

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and

  3. Fusion research and technology records in INIS database

    International Nuclear Information System (INIS)

    Hillebrand, C.D.

    1998-01-01

    This article is a summary of a survey study "A survey on publications in Fusion Research and Technology. Science and Technology Indicators in Fusion R and T" by the same author on Fusion R and T records in the International Nuclear Information System (INIS) bibliographic database. In that study, for the first time, all scientometric and bibliometric information contained in a bibliographic database, using INIS records, is analyzed and quantified, specific to a selected field of science and technology. A variety of new science and technology indicators which can be used for evaluating research and development activities is also presented in that study.

  4. Functionally Graded Materials Database

    Science.gov (United States)

    Kisara, Katsuto; Konno, Tomomi; Niino, Masayuki

    2008-02-01

    The Functionally Graded Materials Database (hereinafter referred to as the FGMs Database) was opened to society via the Internet in October 2002, and since then it has been managed by the Japan Aerospace Exploration Agency (JAXA). As of October 2006, the database includes 1,703 research information entries, together with data on 2,429 researchers, 509 institutions, and so on. Reading materials such as “Applicability of FGMs Technology to Space Plane” and “FGMs Application to Space Solar Power System (SSPS)” were prepared in FY 2004 and 2005, respectively. The English version of “FGMs Application to Space Solar Power System (SSPS)” is now under preparation. This paper explains the FGMs Database, describing the research information data, the sitemap and how to use it. Based on access analysis, user access results and users' interests are also discussed.

  5. K-12 Technology Accessibility: The Message from State Governments

    Science.gov (United States)

    Shaheen, Natalie L.; Lazar, Jonathan

    2018-01-01

    This study examined state education technology plans and technology accessibility statutes to attempt to answer the question--is K-12 instructional technology accessibility discussed in state-level technology accessibility statutes and education technology plans across the 50 United States? When a K-12 school district is planning the construction…

  6. The research of network database security technology based on web service

    Science.gov (United States)

    Meng, Fanxing; Wen, Xiumei; Gao, Liting; Pang, Hui; Wang, Qinglin

    2013-03-01

    Database technology is one of the most widely applied computer technologies, and its security is becoming more and more important. This paper introduces database security and network database security levels, studies the security technology of the network database, analyzes in particular a sub-key encryption algorithm, and applies this algorithm successfully to a campus one-card system. The realization process of the encryption algorithm is discussed; this method is widely used as a reference in many fields, particularly in management information system security and e-commerce.
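
    The paper does not spell out the details of the sub-key algorithm, so the sketch below only illustrates the general idea of deriving independent per-field sub-keys from a single master key before encrypting column values; it uses the third-party cryptography package, and the field names and values are invented.

        import base64, os
        from cryptography.fernet import Fernet
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.kdf.hkdf import HKDF

        master_key = os.urandom(32)   # kept by the application, never stored in the database

        def sub_key(field_name: str) -> Fernet:
            """Derive a per-field sub-key from the master key via HKDF."""
            derived = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                           info=field_name.encode()).derive(master_key)
            return Fernet(base64.urlsafe_b64encode(derived))

        record = {"card_id": "2013-0142", "balance": "87.50"}   # illustrative columns
        encrypted = {name: sub_key(name).encrypt(value.encode())
                     for name, value in record.items()}
        print(sub_key("balance").decrypt(encrypted["balance"]).decode())  # -> 87.50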

  7. The bovine QTL viewer: a web accessible database of bovine Quantitative Trait Loci

    Directory of Open Access Journals (Sweden)

    Xavier Suresh R

    2006-06-01

    Full Text Available Abstract Background Many important agricultural traits such as weight gain, milk fat content and intramuscular fat (marbling) in cattle are quantitative traits. Most of the information on these traits has not previously been integrated into a genomic context. Without such integration, application of these data to agricultural enterprises will remain slow and inefficient. Our goal was to populate a genomic database with data mined from the bovine quantitative trait literature and to make these data available in a genomic context to researchers via a user-friendly query interface. Description The QTL (Quantitative Trait Locus) data and related information for bovine QTL are gathered from published work and from existing databases. An integrated database schema was designed and the database (MySQL) populated with the gathered data. The bovine QTL Viewer was developed for the integration of QTL data available for cattle. The tool consists of an integrated database of bovine QTL and the QTL viewer to display QTL and their chromosomal position. Conclusion We present a web-accessible, integrated database of bovine (dairy and beef cattle) QTL for use by animal geneticists. The viewer and database are of general applicability to any livestock species for which there are public QTL data. The viewer can be accessed at http://bovineqtl.tamu.edu.

  8. An Open Access Database of Genome-wide Association Results

    Directory of Open Access Journals (Sweden)

    Johnson Andrew D

    2009-01-01

    Full Text Available Abstract Background The number of genome-wide association studies (GWAS) is growing rapidly, leading to the discovery and replication of many new disease loci. Combining results from multiple GWAS datasets may potentially strengthen previous conclusions and suggest new disease loci, pathways or pleiotropic genes. However, no database or centralized resource currently exists that contains anywhere near the full scope of GWAS results. Methods We collected available results from 118 GWAS articles into a database of 56,411 significant SNP-phenotype associations and accompanying information, making this database freely available here. In doing so, we met and describe here a number of challenges to creating an open access database of GWAS results. Through preliminary analyses and characterization of available GWAS, we demonstrate the potential to gain new insights by querying a database across GWAS. Results Using a genomic bin-based density analysis to search for highly associated regions of the genome, positive control loci (e.g., MHC loci) were detected with high sensitivity. Likewise, an analysis of highly repeated SNPs across GWAS identified replicated loci (e.g., APOE, LPL). At the same time we identified novel, highly suggestive loci for a variety of traits that did not meet genome-wide significance thresholds in prior analyses, in some cases with strong support from the primary medical genetics literature (SLC16A7, CSMD1, OAS1), suggesting these genes merit further study. Additional adjustment for linkage disequilibrium within most regions with a high density of GWAS associations did not materially alter our findings. Having a centralized database with standardized gene annotation also allowed us to examine the representation of functional gene categories (gene ontologies) containing one or more associations among top GWAS results. Genes relating to cell adhesion functions were highly over-represented among significant associations (p < 10^-14), a finding
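
    The bin-based density analysis mentioned above can be sketched as follows; the toy records and the 1 Mb bin width are illustrative stand-ins for the 56,411 curated associations, not data from the database itself.

        from collections import Counter

        # Toy (chromosome, base-pair position, phenotype) records for illustration.
        associations = [
            ("6", 31_350_000, "autoimmune"), ("6", 31_420_000, "autoimmune"),
            ("6", 31_900_000, "infection"),  ("19", 45_410_000, "lipids"),
            ("19", 45_415_000, "Alzheimer"), ("8", 19_820_000, "triglycerides"),
        ]
        BIN_WIDTH = 1_000_000  # 1 Mb bins (an assumption for this sketch)

        # Count associations per fixed-width genomic bin and report the densest bins.
        bins = Counter((chrom, pos // BIN_WIDTH) for chrom, pos, _ in associations)
        for (chrom, index), count in bins.most_common(3):
            start = index * BIN_WIDTH
            print(f"chr{chrom}:{start:,}-{start + BIN_WIDTH:,}  {count} associations")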

  9. Knowledge base technology for CT-DIMS: Report 1. [CT-DIMS (Cutting Tool - Database and Information Management System)

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, E.E.

    1993-05-01

    This report discusses progress on the Cutting Tool-Database and Information Management System (CT-DIMS) project being conducted by the University of Illinois Urbana-Champaign (UIUC) under contract to the Department of Energy. This project was initiated in October 1991 by UIUC. The Knowledge-Based Engineering Systems Research Laboratory (KBESRL) at UIUC is developing knowledge base technology and prototype software for the presentation and manipulation of the cutting tool databases at Allied-Signal Inc., Kansas City Division (KCD). The graphical tool selection capability being developed for CT-DIMS in the Intelligent Design Environment for Engineering Automation (IDEEA) will provide a concurrent environment for simultaneous access to tool databases, tool standard libraries, and cutting tool knowledge.

  10. Marginalized Student Access to Technology Education

    Science.gov (United States)

    Kurtcu, Wanda M.

    The purpose of this paper is to investigate how a teacher can disrupt an established curriculum that continues the cycle of inequity in access to science, technology, engineering, and math (STEM) curriculum by students in alternative education. For this paper, I will focus on the technology components of the STEM curriculum. Technology in the United States, if not the world economy, is developing at a rapid pace. Many areas of day-to-day living, from applying for a job to checking one's bank account online, involve a component of science and technology. The 'gap' in technology education between the 'haves and have-nots' is delineated along socio-economic lines. Marginalized students in alternative education programs use such technology for little more than remedial programs and credit recovery. This inequity widens further in alternative education programs and affects the achievement of marginalized students, who end up in credit recovery or alternative education classes instead of participating in technology classes. For the purposes of this paper, I focus on how I can decrease the inequity of student access to 21st-century technology education in an alternative education program by addressing the program's established curriculum and modifying the structural barriers to marginalized students' access to a technology-focused curriculum.

  11. Evaluation of an Online Instructional Database Accessed by QR Codes to Support Biochemistry Practical Laboratory Classes

    Science.gov (United States)

    Yip, Tor; Melling, Louise; Shaw, Kirsty J.

    2016-01-01

    An online instructional database containing information on commonly used pieces of laboratory equipment was created. In order to make the database highly accessible and to promote its use, QR codes were utilized. The instructional materials were available anytime and accessed using QR codes located on the equipment itself and within undergraduate…

  12. Investigation on construction of the database system for research and development of the global environment industry technology; Chikyu kankyo sangyo gijutsu kenkyu kaihatsuyo database system no kochiku ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-01

    This paper studies a concrete plan to introduce a new database system at the Research Institute of Innovative Technology for the Earth (RITE), which is needed to promote industrial technology development contributing to the solution of global environmental problems. The specifications for system introduction cover vendor selection, the operation system, a detailed introduction schedule, etc. The RITE in-house database has problems with its operation system and maintenance cost, and its construction cost tends to be high relative to its utilization; its introduction is therefore studied further. The in-house database provides only information owned by the organization, while information from outside the organization is provided by external databases. Information is registered and selected by the registrant. The access network will initially be a personal computer network and is planned to transition to the Internet in the future. For practical construction of the system, it is necessary to clarify users' detailed needs for the system design and to coordinate functions between hardware systems. 32 figs., 9 tabs.

  13. Open-access databases as unprecedented resources and drivers of cultural change in fisheries science

    Energy Technology Data Exchange (ETDEWEB)

    McManamay, Ryan A [ORNL; Utz, Ryan [National Ecological Observatory Network

    2014-01-01

    Open-access databases with utility in fisheries science have grown exponentially in quantity and scope over the past decade, with profound impacts on our discipline. The management, distillation, and sharing of an exponentially growing stream of open-access data represent several fundamental challenges in fisheries science. Many of the currently available open-access resources may not be universally known among fisheries scientists. We therefore introduce many national- and global-scale open-access databases with applications in fisheries science and provide an example of how they can be harnessed to perform valuable analyses without additional field efforts. We also discuss how the development, maintenance, and utilization of open-access data are likely to pose technical, financial, and educational challenges to fisheries scientists. These cultural implications, which will coincide with the rapidly increasing availability of free data, should compel the American Fisheries Society to actively address these problems now to help ease the forthcoming cultural transition.

  14. Development of Information Technology of Object-relational Databases Design

    Directory of Open Access Journals (Sweden)

    Valentyn A. Filatov

    2012-12-01

    Full Text Available The article is concerned with the development of an information technology for object-relational database design and with the study of object features of the infological and logical database schemas, their entities and connections.

  15. Art : accessible, renewable technology

    International Nuclear Information System (INIS)

    Middleton, C.D.

    2004-01-01

    This paper focuses on the role of non-governmental organization (NGO) citizen groups in Ontario in the use and production of electricity. NGOs have the potential to act both directly on their own accord, and indirectly by pressuring government and others. Current demand for electricity is divided between industrial, commercial and residential users. Citizens have an important role to play in reducing energy demand. On the supply side, there is a revival of interest in renewable energy based on wind, photovoltaic and local-hydro technologies as a result of the escalating environmental and economic costs of coal and nuclear generation. However, citizen groups have greater interest and enthusiasm than technical expertise, creating a mismatch between technological solutions and human need or use of them. This paper discusses how this mismatch applies to renewable-energy technologies, many of which are not especially user-friendly, or accessible. While alternative technologies are increasingly welcomed by government, industry is developing a large and growing array of technological devices. In between this is the citizen, who, despite keen interest, can be overwhelmed by the complexity of the situation. This paper links the theoretical perspective to the real world with a discussion of the dynamics between people and renewable energy in citizen groups and makes particular reference to one group, Citizens for Renewable Energy, that has been making renewable energy technology more accessible to its members for over a decade

  16. NoSQL technologies for the CMS Conditions Database

    CERN Document Server

    Sipos, Roland

    2015-01-01

    With the restart of the LHC in 2015, the growth of the CMS Conditions dataset will continue; the need for consistent and highly available access to the Conditions is therefore a strong motivation to revisit different aspects of the current data storage solutions. We present a study of alternative data storage backends for the Conditions Databases, evaluating some of the most popular NoSQL databases to support a key-value representation of the CMS Conditions. An important detail about the Conditions is that the payloads are stored as BLOBs, and they can reach sizes that may require special treatment (splitting) in these NoSQL databases. As big binary objects may be a bottleneck in several database systems, and also to give an accurate baseline, a testing framework extension was implemented to measure how these databases handle arbitrary binary data. Based on the evaluation, prototypes of a document store, a column-oriented store, and a plain key-value store are deployed. An adaption l...
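
    A minimal sketch of the payload "splitting" idea discussed above: a large conditions BLOB is cut into fixed-size chunks so that each stored value stays below a key-value store's practical size limit. The chunk size and key layout are assumptions for illustration, and a plain Python dict stands in for the NoSQL backend.

```python
import hashlib

CHUNK_SIZE = 512 * 1024  # 512 KiB per chunk (an assumed limit, not CMS policy)

def split_payload(payload: bytes, chunk_size: int = CHUNK_SIZE):
    """Yield (key, value) pairs: a small manifest for reassembly plus the chunks."""
    digest = hashlib.sha1(payload).hexdigest()
    chunks = [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]
    yield f"payload:{digest}:manifest", {"sha1": digest, "n_chunks": len(chunks), "size": len(payload)}
    for seq, chunk in enumerate(chunks):
        yield f"payload:{digest}:{seq}", chunk

# A plain dict stands in for the key-value store in this sketch.
store = {}
payload = b"\x00" * 1_300_000  # a fake 1.3 MB conditions BLOB
for key, value in split_payload(payload):
    store[key] = value

manifest = store[f"payload:{hashlib.sha1(payload).hexdigest()}:manifest"]
print(manifest)  # e.g. {'sha1': ..., 'n_chunks': 3, 'size': 1300000}
```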

  17. Accessing the public MIMIC-II intensive care relational database for clinical research.

    Science.gov (United States)

    Scott, Daniel J; Lee, Joon; Silva, Ikaro; Park, Shinhyuk; Moody, George B; Celi, Leo A; Mark, Roger G

    2013-01-10

    The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge "Predicting mortality of ICU Patients". QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database.
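
    The abstract mentions simple example SQL queries against the MIMIC-II relational schema. The sketch below only illustrates that style of cohort query; the table and column names are placeholders rather than the exact MIMIC-II schema, and an in-memory SQLite database stands in for a real installation.

```python
import sqlite3

# Illustrative stand-in tables; real MIMIC-II table names may differ.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE icustay (icustay_id INTEGER, subject_id INTEGER, los_hours REAL);
CREATE TABLE labevents (icustay_id INTEGER, item TEXT, value REAL);
INSERT INTO icustay VALUES (101, 1, 96.0), (102, 2, 30.0);
INSERT INTO labevents VALUES (101, 'creatinine', 2.4), (102, 'creatinine', 0.9);
""")

# Example: ICU stays longer than two days with an elevated creatinine value,
# the sort of cohort selection done in acute kidney injury studies.
query = """
SELECT i.icustay_id, i.los_hours, l.value AS creatinine
FROM icustay i
JOIN labevents l ON l.icustay_id = i.icustay_id
WHERE l.item = 'creatinine' AND l.value > 1.5 AND i.los_hours > 48
"""
for row in conn.execute(query):
    print(row)
```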

  18. For 481 biomedical open access journals, articles are not searchable in the Directory of Open Access Journals nor in conventional biomedical databases.

    Science.gov (United States)

    Liljekvist, Mads Svane; Andresen, Kristoffer; Pommergaard, Hans-Christian; Rosenberg, Jacob

    2015-01-01

    Background. Open access (OA) journals allow access to research papers free of charge to the reader. Traditionally, biomedical researchers use databases like MEDLINE and EMBASE to discover new advances. However, biomedical OA journals might not fulfill such databases' criteria, hindering dissemination. The Directory of Open Access Journals (DOAJ) is a database exclusively listing OA journals. The aim of this study was to investigate DOAJ's coverage of biomedical OA journals compared with the conventional biomedical databases. Methods. Information on all journals listed in four conventional biomedical databases (MEDLINE, PubMed Central, EMBASE and SCOPUS) and DOAJ was gathered. Journals were included if they were (1) actively publishing, (2) full OA, (3) prospectively indexed in one or more databases, and (4) of biomedical subject. Impact factor and journal language were also collected. DOAJ was compared with conventional databases regarding the proportion of journals covered, along with their impact factor and publishing language. The proportion of journals with articles indexed by DOAJ was determined. Results. In total, 3,236 biomedical OA journals were included in the study. Of the included journals, 86.7% were listed in DOAJ. Combined, the conventional biomedical databases listed 75.0% of the journals; 18.7% in MEDLINE; 36.5% in PubMed Central; 51.5% in SCOPUS and 50.6% in EMBASE. Of the journals in DOAJ, 88.7% published in English and 20.6% had received an impact factor for 2012, compared with 93.5% and 26.0%, respectively, for journals in the conventional biomedical databases. A subset of 51.1% and 48.5% of the journals in DOAJ had articles indexed from 2012 and 2013, respectively. Of journals exclusively listed in DOAJ, one journal had received an impact factor for 2012, and 59.6% of the journals had no content from 2013 indexed in DOAJ. Conclusions. DOAJ is the most complete registry of biomedical OA journals compared with five conventional biomedical databases

  19. ASAView: Database and tool for solvent accessibility representation in proteins

    Directory of Open Access Journals (Sweden)

    Fawareh Hamed

    2004-05-01

    Full Text Available Abstract Background Accessible surface area (ASA), or solvent accessibility, of amino acids in a protein has important implications. Knowledge of surface residues helps in locating potential candidates of active sites. Therefore, a method to quickly see the surface residues in a two-dimensional model would help to immediately understand the population of amino acid residues on the surface and in the inner core of the proteins. Results ASAView is an algorithm, an application and a database of schematic representations of solvent accessibility of amino acid residues within proteins. A characteristic two-dimensional spiral plot of solvent accessibility provides a convenient graphical view of residues in terms of their exposed surface areas. In addition, sequential plots in the form of bar charts are also provided. Online plots of the proteins included in the entire Protein Data Bank (PDB) are provided for the entire protein as well as their chains separately. Conclusions These graphical plots of solvent accessibility are likely to provide a quick view of the overall topological distribution of residues in proteins. Chain-wise computation of solvent accessibility is also provided.
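
    A two-dimensional spiral plot of per-residue accessibility, as described above, can be sketched by placing residues along an Archimedean spiral in sequence order and then sizing or colouring each point by its ASA. The residues and ASA values below are invented, and the layout function is an assumption about how such a plot might be computed, not ASAView's actual algorithm.

```python
import math

# Made-up (residue, ASA in A^2) pairs for a short chain.
residue_asa = [("MET", 120.5), ("LYS", 95.2), ("LEU", 10.1), ("VAL", 3.4),
               ("ASP", 88.0), ("GLY", 45.6)]

def spiral_layout(n_points, turns=3.0):
    """Return (x, y) positions along an Archimedean spiral, in sequence order."""
    coords = []
    total_angle = 2 * math.pi * turns
    for i in range(n_points):
        theta = total_angle * i / max(n_points - 1, 1)
        r = theta / total_angle          # radius grows linearly with angle
        coords.append((r * math.cos(theta), r * math.sin(theta)))
    return coords

for (name, asa), (x, y) in zip(residue_asa, spiral_layout(len(residue_asa))):
    print(f"{name}: x={x:+.2f}, y={y:+.2f}, ASA={asa:.1f} A^2")
```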

  20. Ginseng Genome Database: an open-access platform for genomics of Panax ginseng.

    Science.gov (United States)

    Jayakodi, Murukarthick; Choi, Beom-Soon; Lee, Sang-Choon; Kim, Nam-Hoon; Park, Jee Young; Jang, Woojong; Lakshmanan, Meiyappan; Mohan, Shobhana V G; Lee, Dong-Yup; Yang, Tae-Jin

    2018-04-12

    Ginseng (Panax ginseng C.A. Meyer) is a perennial herbaceous plant that has been used in traditional oriental medicine for thousands of years. Ginsenosides, which have significant pharmacological effects on human health, are the foremost bioactive constituents in this plant. Given the importance of this plant to humans, an integrated omics resource is indispensable to facilitate genomic research, molecular breeding and pharmacological study of this herb. The first draft genome sequences of P. ginseng cultivar "Chunpoong" were reported recently. Here, using the draft genome, transcriptome, and functional annotation datasets of P. ginseng, we have constructed the Ginseng Genome Database (http://ginsengdb.snu.ac.kr/), the first open-access platform to provide comprehensive genomic resources of P. ginseng. The current version of this database provides the most up-to-date draft genome sequence (of approximately 3000 Mbp of scaffold sequences) along with the structural and functional annotations for 59,352 genes and digital expression of genes based on transcriptome data from different tissues, growth stages and treatments. In addition, tools for visualization and the genomic data from various analyses are provided. All data in the database were manually curated and integrated within a user-friendly query page. This database provides valuable resources for a range of research fields related to P. ginseng and other species belonging to the Apiales order as well as for plant research communities in general. The Ginseng Genome Database can be accessed at http://ginsengdb.snu.ac.kr/.

  1. Microsoft Access Small Business Solutions State-of-the-Art Database Models for Sales, Marketing, Customer Management, and More Key Business Activities

    CERN Document Server

    Hennig, Teresa; Linson, Larry; Purvis, Leigh; Spaulding, Brent

    2010-01-01

    Database models developed by a team of leading Microsoft Access MVPs that provide ready-to-use solutions for sales, marketing, customer management and other key business activities for most small businesses. As the most popular relational database in the world, Microsoft Access is widely used by small business owners. This book responds to the growing need for resources that help business managers and end users design and build effective Access database solutions for specific business functions. Coverage includes: Elements of a Microsoft Access Database; Relational Data Model; Dealing with C

  2. A method to implement fine-grained access control for personal health records through standard relational database queries.

    Science.gov (United States)

    Sujansky, Walter V; Faus, Sam A; Stone, Ethan; Brennan, Patricia Flatley

    2010-10-01

    Online personal health records (PHRs) enable patients to access, manage, and share some of their own health information electronically. This capability creates the need for precise access-control mechanisms that restrict the sharing of data to that intended by the patient. The authors describe the design and implementation of an access-control mechanism for PHR repositories that is modeled on the eXtensible Access Control Markup Language (XACML) standard, but intended to reduce the cognitive and computational complexity of XACML. The authors implemented the mechanism entirely in a relational database system using ANSI-standard SQL statements. Based on a set of access-control rules encoded as relational table rows, the mechanism determines via a single SQL query whether a user who accesses patient data from a specific application is authorized to perform a requested operation on a specified data object. Testing of this query on a moderately large database has demonstrated execution times consistently below 100 ms. The authors include the details of the implementation, including algorithms, examples, and a test database, as Supplementary materials. Copyright © 2010 Elsevier Inc. All rights reserved.
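
    A minimal sketch of the idea described above, with access-control rules stored as ordinary table rows and a single SQL query deciding each request. The schema and rule fields are illustrative, not the authors' actual design.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE access_rule (
    grantee    TEXT,   -- user or role the rule applies to
    app        TEXT,   -- application through which data is accessed
    object     TEXT,   -- data object, e.g. 'medications'
    operation  TEXT    -- 'read' or 'write'
);
INSERT INTO access_rule VALUES ('dr_smith', 'portal', 'medications', 'read');
""")

def is_authorized(user, app, obj, op):
    """Decide a request with one query over the rule table."""
    row = conn.execute(
        """SELECT COUNT(*) FROM access_rule
           WHERE grantee = ? AND app = ? AND object = ? AND operation = ?""",
        (user, app, obj, op),
    ).fetchone()
    return row[0] > 0

print(is_authorized('dr_smith', 'portal', 'medications', 'read'))   # True
print(is_authorized('dr_smith', 'portal', 'medications', 'write'))  # False
```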

  3. Freeing up access to CERN technology

    CERN Multimedia

    Joannah Caborn Wengler

    2012-01-01

    In line with CERN’s principle of maximising the dissemination of knowledge to society, the Knowledge Transfer (KT) Group has launched a new collaborative initiative to share the products of CERN’s scientific and technological labours: Easy Access IP, where IP stands for intellectual property.   CERN has a whole portfolio of dissemination channels designed and implemented by the KT Group, with Easy Access IP being the latest addition. “Inspired by the UK’s Easy Access Innovation initiative, our scheme involves making some of CERN's technologies available royalty-free and through a more agile licensing process,” explains Giovanni Anelli, head of the Group. “This approach seems to be an appropriate model for CERN, where the ultimate goal of technology transfer is not to generate income but to transfer knowledge to external partners.” The new scheme, as the name suggests, is designed to make it easier for industry and othe...

  4. Understanding the patient perspective on research access to national health records databases for conduct of randomized registry trials.

    Science.gov (United States)

    Avram, Robert; Marquis-Gravel, Guillaume; Simard, François; Pacheco, Christine; Couture, Étienne; Tremblay-Gravel, Maxime; Desplantie, Olivier; Malhamé, Isabelle; Bibas, Lior; Mansour, Samer; Parent, Marie-Claude; Farand, Paul; Harvey, Luc; Lessard, Marie-Gabrielle; Ly, Hung; Liu, Geoffrey; Hay, Annette E; Marc Jolicoeur, E

    2018-07-01

    Use of health administrative databases is proposed for screening and monitoring of participants in randomized registry trials. However, access to these databases raises privacy concerns. We assessed patients' preferences regarding use of personal information to link their research records with national health databases, as part of a hypothetical randomized registry trial. Cardiology patients were invited to complete an anonymous self-reported survey that ascertained preferences related to the concept of accessing government health databases for research, the type of personal identifiers to be shared and the type of follow-up preferred as participants in a hypothetical trial. A total of 590 responders completed the survey (90% response rate), the majority of whom were Caucasian (90.4%) and male (70.0%), with a median age of 65 years (interquartile range, 8). The majority of responders (80.3%) would grant researchers access to health administrative databases for screening and follow-up. To this end, responders endorsed the recording of their personal identifiers by researchers for future record linkage, including their name (90%) and health insurance number (83.9%), but fewer responders agreed with the recording of their social security number (61.4%, pgranting researchers access to the administrative databases (OR: 1.69, 95% confidence interval: 1.03-2.90; p=0.04). The majority of Cardiology patients surveyed were supportive of the use of their personal identifiers to access administrative health databases and conduct long-term monitoring in the context of a randomized registry trial. Copyright © 2018 Elsevier Ireland Ltd. All rights reserved.

  5. Microcomputer Database Management Systems that Interface with Online Public Access Catalogs.

    Science.gov (United States)

    Rice, James

    1988-01-01

    Describes a study that assessed the availability and use of microcomputer database management interfaces to online public access catalogs. The software capabilities needed to effect such an interface are identified, and available software packages are evaluated by these criteria. A directory of software vendors is provided. (4 notes with…

  6. JASPAR 2010: the greatly expanded open-access database of transcription factor binding profiles

    DEFF Research Database (Denmark)

    Portales-Casamar, Elodie; Thongjuea, Supat; Kwon, Andrew T

    2009-01-01

    JASPAR (http://jaspar.genereg.net) is the leading open-access database of matrix profiles describing the DNA-binding patterns of transcription factors (TFs) and other proteins interacting with DNA in a sequence-specific manner. Its fourth major release is the largest expansion of the core database...... to an active research community. As binding models are refined by newer data, the JASPAR database now uses versioning of matrices: in this release, 12% of the older models were updated to improved versions. Classification of TF families has been improved by adopting a new DNA-binding domain nomenclature...

  7. Toward an open-access global database for mapping, control, and surveillance of neglected tropical diseases.

    Directory of Open Access Journals (Sweden)

    Eveline Hürlimann

    2011-12-01

    Full Text Available BACKGROUND: After many years of general neglect, interest has grown and efforts came under way for the mapping, control, surveillance, and eventual elimination of neglected tropical diseases (NTDs). Disease risk estimates are a key feature to target control interventions, and serve as a benchmark for monitoring and evaluation. What is currently missing is a georeferenced global database for NTDs providing open access to the available survey data that is constantly updated and can be utilized by researchers and disease control managers to support other relevant stakeholders. We describe the steps taken toward the development of such a database that can be employed for spatial disease risk modeling and control of NTDs. METHODOLOGY: With an emphasis on schistosomiasis in Africa, we systematically searched the literature (peer-reviewed journals and 'grey literature'), contacted Ministries of Health and research institutions in schistosomiasis-endemic countries for location-specific prevalence data and survey details (e.g., study population, year of survey and diagnostic techniques). The data were extracted, georeferenced, and stored in a MySQL database with a web interface allowing free database access and data management. PRINCIPAL FINDINGS: At the beginning of 2011, our database contained more than 12,000 georeferenced schistosomiasis survey locations from 35 African countries available under http://www.gntd.org. Currently, the database is expanded to a global repository, including a host of other NTDs, e.g. soil-transmitted helminthiasis and leishmaniasis. CONCLUSIONS: An open-access, spatially explicit NTD database offers unique opportunities for disease risk modeling, targeting control interventions, disease monitoring, and surveillance. Moreover, it allows for detailed geostatistical analyses of disease distribution in space and time. With an initial focus on schistosomiasis in Africa, we demonstrate the proof-of-concept that the establishment
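
    As a rough illustration of the kind of spatial query such a georeferenced survey database supports, the sketch below selects prevalence surveys inside a latitude/longitude bounding box. The schema and example record are invented, and SQLite stands in for the project's MySQL backend.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE survey (
    survey_id   INTEGER PRIMARY KEY,
    disease     TEXT,
    country     TEXT,
    latitude    REAL,
    longitude   REAL,
    n_examined  INTEGER,
    n_positive  INTEGER,
    year        INTEGER
);
INSERT INTO survey VALUES (1, 'schistosomiasis', 'Cote d''Ivoire', 6.82, -5.28, 240, 61, 2002);
""")

# Bounding-box query: surveys for one disease within a rectangle of coordinates.
rows = conn.execute(
    """SELECT survey_id, country, year,
              ROUND(100.0 * n_positive / n_examined, 1) AS prevalence_pct
       FROM survey
       WHERE disease = ?
         AND latitude BETWEEN ? AND ? AND longitude BETWEEN ? AND ?""",
    ("schistosomiasis", 4.0, 11.0, -9.0, -2.0),
).fetchall()
print(rows)
```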

  8. Toward an Open-Access Global Database for Mapping, Control, and Surveillance of Neglected Tropical Diseases

    Science.gov (United States)

    Hürlimann, Eveline; Schur, Nadine; Boutsika, Konstantina; Stensgaard, Anna-Sofie; Laserna de Himpsl, Maiti; Ziegelbauer, Kathrin; Laizer, Nassor; Camenzind, Lukas; Di Pasquale, Aurelio; Ekpo, Uwem F.; Simoonga, Christopher; Mushinge, Gabriel; Saarnak, Christopher F. L.; Utzinger, Jürg; Kristensen, Thomas K.; Vounatsou, Penelope

    2011-01-01

    Background After many years of general neglect, interest has grown and efforts came under way for the mapping, control, surveillance, and eventual elimination of neglected tropical diseases (NTDs). Disease risk estimates are a key feature to target control interventions, and serve as a benchmark for monitoring and evaluation. What is currently missing is a georeferenced global database for NTDs providing open-access to the available survey data that is constantly updated and can be utilized by researchers and disease control managers to support other relevant stakeholders. We describe the steps taken toward the development of such a database that can be employed for spatial disease risk modeling and control of NTDs. Methodology With an emphasis on schistosomiasis in Africa, we systematically searched the literature (peer-reviewed journals and ‘grey literature’), contacted Ministries of Health and research institutions in schistosomiasis-endemic countries for location-specific prevalence data and survey details (e.g., study population, year of survey and diagnostic techniques). The data were extracted, georeferenced, and stored in a MySQL database with a web interface allowing free database access and data management. Principal Findings At the beginning of 2011, our database contained more than 12,000 georeferenced schistosomiasis survey locations from 35 African countries available under http://www.gntd.org. Currently, the database is expanded to a global repository, including a host of other NTDs, e.g. soil-transmitted helminthiasis and leishmaniasis. Conclusions An open-access, spatially explicit NTD database offers unique opportunities for disease risk modeling, targeting control interventions, disease monitoring, and surveillance. Moreover, it allows for detailed geostatistical analyses of disease distribution in space and time. With an initial focus on schistosomiasis in Africa, we demonstrate the proof-of-concept that the establishment and running of a

  9. Colorado Late Cenozoic Fault and Fold Database and Internet Map Server: User-friendly technology for complex information

    Science.gov (United States)

    Morgan, K.S.; Pattyn, G.J.; Morgan, M.L.

    2005-01-01

    Internet mapping applications for geologic data allow simultaneous data delivery and collection, enabling quick data modification while efficiently supplying the end user with information. Utilizing Web-based technologies, the Colorado Geological Survey's Colorado Late Cenozoic Fault and Fold Database was transformed from a monothematic, nonspatial Microsoft Access database into a complex information set incorporating multiple data sources. The resulting user-friendly format supports easy analysis and browsing. The core of the application is the Microsoft Access database, which contains information compiled from available literature about faults and folds that are known or suspected to have moved during the late Cenozoic. The database contains nonspatial fields such as structure type, age, and rate of movement. Geographic locations of the fault and fold traces were compiled from previous studies at 1:250,000 scale to form a spatial database containing information such as length and strike. Integration of the two databases allowed both spatial and nonspatial information to be presented on the Internet as a single dataset (http://geosurvey.state.co.us/pubs/ceno/). The user-friendly interface enables users to view and query the data in an integrated manner, thus providing multiple ways to locate desired information. Retaining the digital data format also allows continuous data updating and quick delivery of newly acquired information. This dataset is a valuable resource to anyone interested in earthquake hazards and the activity of faults and folds in Colorado. Additional geologic hazard layers and imagery may aid in decision support and hazard evaluation. The up-to-date and customizable maps are invaluable tools for researchers or the public.

  10. Relational Database Extension Oriented, Self-adaptive Imagery Pyramid Model

    Directory of Open Access Journals (Sweden)

    HU Zhenghua

    2015-06-01

    Full Text Available With the development of remote sensing technology, and especially the improvement of sensor resolution, the amount of image data is increasing. This places higher demands on managing huge amounts of data efficiently and intelligently, and how to access massive remote sensing data efficiently and intelligently has become an increasingly popular topic. In this paper, considering the current state of spatial data management systems, we propose a self-adaptive strategy for image blocking and a method for LoD (level of detail) model construction that adapts to the combination of database storage, network transmission, and client hardware. Confirmed by experiments, this imagery management mechanism achieves intelligent and efficient storage and access under a variety of database, network, and client conditions. This study provides a feasible approach for efficient image data management, contributing to efficient access to and management of remote sensing image data based on database technology in a networked C/S environment.
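
    The pyramid bookkeeping implied by such a model can be sketched as follows: given a source image size and a tile size, compute the pyramid levels and their tile counts, then pick the level whose resolution roughly matches a client viewport. The tile size, viewport figures, and level-selection rule are assumptions for illustration, not the paper's method.

```python
import math

TILE = 256  # pixels per tile edge (assumed)

def pyramid_levels(width, height, tile=TILE):
    """Return (level, width, height, tiles_x, tiles_y) tuples; level 0 is full resolution."""
    levels, level = [], 0
    while True:
        w, h = max(1, width >> level), max(1, height >> level)
        levels.append((level, w, h, math.ceil(w / tile), math.ceil(h / tile)))
        if w <= tile and h <= tile:
            break
        level += 1
    return levels

def choose_level(levels, viewport_px, extent_px):
    """Pick the coarsest level still giving roughly one image pixel per screen pixel."""
    target_scale = extent_px / viewport_px          # source pixels per screen pixel
    best = max(0, min(len(levels) - 1, int(math.floor(math.log2(max(target_scale, 1))))))
    return levels[best]

levels = pyramid_levels(40_000, 30_000)
print(levels[0], levels[-1])
print(choose_level(levels, viewport_px=1024, extent_px=40_000))
```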

  11. Marginalized Student Access to Technology Education

    Science.gov (United States)

    Kurtcu, Wanda M.

    2017-01-01

    The purpose of this paper is to investigate how a teacher can disrupt an established curriculum that continues the cycle of inequity of access to science, technology, engineering, and math (STEM) curriculum by students in alternative education. For this paper, I will focus on the technology components of the STEM curriculum. Technology in the…

  12. Research on the establishment of the database system for R and D on the innovative technology for the earth; Chikyu kankyo sangyo gijutsu kenkyu kaihatsuyo database system ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-03-01

    For the purpose of building a database system of technical information on earth environmental issues, the 'database system for R and D of the earth environmental industrial technology' was operationally evaluated, and a study was made on opening it to users and constructing a prototype database. As pointed out in the operational evaluation, in its present state the utilization frequency is not rising because of a lack of UNIX experience, the absence of system managers, and a shortage of usable listed articles, so updating of the database does not progress as it should. Therefore, a study was made on introducing tools usable by the initiators and on opening the information access terminal to researchers at headquarters via the Internet. So that earth environment-related researchers can easily obtain information, a prototype database was built to support research exchange. The tasks to be addressed in selecting research fields and compiling common thesauri in Japanese, Western, and other languages were clarified. 28 figs., 16 tabs.

  13. Analysis of technologies databases use in physical education and sport

    Directory of Open Access Journals (Sweden)

    Usychenko V.V.

    2010-03-01

    Full Text Available Scientific, methodological, and specialized literature is analyzed and systematized. Questions concerning the use of database technology in the system of athlete preparation are raised. The need for technologies that rapidly process large arrays of sporting information is shown. Experience with computer-aided technologies for recording and analyzing the results of testing training-process parameters is collected. The influence of these technologies on training and competition activity is considered. The «Athlete» database is presented; it contains anthropometric and myometric indexes of highly qualified bodybuilding athletes.

  14. Requirements for the next generation of nuclear databases and services

    Energy Technology Data Exchange (ETDEWEB)

    Pronyaev, Vladimir; Zerkin, Viktor; Muir, Douglas [International Atomic Energy Agency, Nuclear Data Section, Vienna (Austria); Winchell, David; Arcilla, Ramon [Brookhaven National Laboratory, National Nuclear Data Center, Upton, NY (United States)

    2002-08-01

    The use of relational database technology and general requirements for the next generation of nuclear databases and services are discussed. These requirements take into account an increased number of co-operating data centres working on diverse hardware and software platforms and users with different data-access capabilities. It is argued that the introduction of programming standards will allow the development of nuclear databases and data retrieval tools in a heterogeneous hardware and software environment. The functionality of this approach was tested with full-scale nuclear databases installed on different platforms having different operating and database management systems. User access through local network, internet, or CD-ROM has been investigated. (author)

  15. TRANSNET -- access to radioactive and hazardous materials transportation codes and databases

    International Nuclear Information System (INIS)

    Cashwell, J.W.

    1992-01-01

    TRANSNET has been developed and maintained by Sandia National Laboratories under the sponsorship of the United States Department of Energy (DOE) Office of Environmental Restoration and Waste Management to permit outside access to computerized routing, risk and systems analysis models, and associated databases. The goal of the TRANSNET system is to enable transfer of transportation analytical methods and data to qualified users by permitting direct, timely access to the up-to-date versions of the codes and data. The TRANSNET facility comprises a dedicated computer with telephone ports on which these codes and databases are adapted, modified, and maintained. To permit the widest spectrum of outside users, TRANSNET is designed to minimize hardware and documentation requirements. The user is thus required to have an IBM-compatible personal computer, Hayes-compatible modem with communications software, and a telephone. Maintenance and operation of the TRANSNET facility are underwritten by the program sponsor(s) as are updates to the respective models and data, thus the only charges to the user of the system are telephone hookup charges. TRANSNET provides access to the most recent versions of the models and data developed by or for Sandia National Laboratories. Code modifications that have been made since the last published documentation are noted to the user on the introductory screens. User friendly interfaces have been developed for each of the codes and databases on TRANSNET. In addition, users are provided with default input data sets for typical problems which can either be used directly or edited. Direct transfers of analytical or data files between codes are provided to permit the user to perform complex analyses with a minimum of input. Recent developments to the TRANSNET system include use of the system to directly pass data files between both national and international users as well as development and integration of graphical depiction techniques

  16. Jelly Views : Extending Relational Database Systems Toward Deductive Database Systems

    Directory of Open Access Journals (Sweden)

    Igor Wojnicki

    2004-01-01

    Full Text Available This paper regards the Jelly View technology, which provides a new, practical methodology for knowledge decomposition, storage, and retrieval within Relational Database Management Systems (RDBMS). Intensional Knowledge clauses (rules) are decomposed and stored in the RDBMS, forming reusable components. The results of the rule-based processing are visible as regular views, accessible through SQL. From the end-user point of view the processing capability becomes unlimited (arbitrarily complex queries can be constructed using Intensional Knowledge), while the outermost queries are expressed with standard SQL. The RDBMS functionality is thus extended toward that of deductive databases.
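
    To illustrate the end-user experience described above, with deduced facts visible as a regular SQL view, the sketch below encodes a simple intensional rule, "an ancestor is a parent, or a parent of an ancestor", as a view built on a recursive common table expression. This is a generic stand-in for the idea, not the Jelly View decomposition itself.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE parent (parent TEXT, child TEXT);
INSERT INTO parent VALUES ('ann', 'bob'), ('bob', 'cid'), ('cid', 'dee');

-- The rule is stored once as a view; end users just SELECT from it.
CREATE VIEW ancestor AS
WITH RECURSIVE anc(ancestor, descendant) AS (
    SELECT parent, child FROM parent
    UNION
    SELECT a.ancestor, p.child FROM anc a JOIN parent p ON p.parent = a.descendant
)
SELECT ancestor, descendant FROM anc;
""")

rows = conn.execute("SELECT descendant FROM ancestor WHERE ancestor = 'ann'").fetchall()
print(sorted(rows))  # [('bob',), ('cid',), ('dee',)]
```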

  17. Implementation of dragon-I database system based on B/S model

    International Nuclear Information System (INIS)

    Jiang Wei; Lai Qinggui; Chen Nan; Gao Feng

    2010-01-01

    The 'Dragon-I' database system uses a B/S (browser/server) architecture. The dynamic web software is built with ASP.NET and is divided into three main tiers: a user interface tier, a business logic tier, and an access tier. Accelerator status data and the data generated during experiments are managed with the SQL Server DBMS, and the database is accessed through ADO.NET. The facility status, control parameters, and test waveforms can be queried by experiment number and experiment time. The requirements for storage, management, browsing, querying, and offline analysis are all met by this B/S-based database system. (authors)

  18. Accessible Electronic and Information Technology

    Science.gov (United States)

    This Policy establishes EPA's responsibilities and procedures for making its Electronic and Information Technology (EIT) products accessible to all people, including people with disabilities, in accordance with Section 508 of the Rehabilitation Act.

  19. Integrated Identity and Access Management System for Tertiary ...

    African Journals Online (AJOL)

    Nigerian Journal of Technology ... identity management and access control and the unavailability of actionable information on pattern of ... This Tertiary Identity and Access Management System (T-IAMS) is a fingerprint biometric database that ...

  20. XML databases and the semantic web

    CERN Document Server

    Thuraisingham, Bhavani

    2002-01-01

    Efficient access to data, sharing data, extracting information from data, and making use of the information have become urgent needs for today's corporations. With so much data on the Web, managing it with conventional tools is becoming almost impossible. New tools and techniques are necessary to provide interoperability as well as warehousing between multiple data sources and systems, and to extract information from the databases. XML Databases and the Semantic Web focuses on critical and new Web technologies needed for organizations to carry out transactions on the Web, to understand how to use the Web effectively, and to exchange complex documents on the Web. This reference for database administrators, database designers, and Web designers working in tandem with database technologists covers three emerging technologies of significant impact for electronic business: Extensible Markup Language (XML), semi-structured databases, and the semantic Web. The first two parts of the book explore these emerging techn...

  1. Beginning C# 2008 databases from novice to professional

    CERN Document Server

    Fahad Gilani, Syed; Reid, Jon; Raghuram, Ranga; Huddleston, James; Hammer Pedersen, Jacob

    2008-01-01

    This book is for every C# programmer. It assumes no prior database experience and teaches through hands-on examples how to create and use relational databases with the standard database language SQL and how to access them with C#.Assuming only basic knowledge of C# 3.0, Beginning C# 3.0 Databases teaches all the fundamentals of database technology and database programming readers need to quickly become highly proficient database users and application developers. A comprehensive tutorial on both SQL Server 2005 and ADO.NET 3.0, this book explains and demonstrates how to create database objects

  2. Creation of the NaSCoRD Database

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jankovsky, Zachary Kyle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stuart, William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    This report was written as part of a United States Department of Energy (DOE), Office of Nuclear Energy, Advanced Reactor Technologies program-funded project to re-create the capabilities of the legacy Centralized Reliability Database Organization (CREDO) database. The CREDO database provided a record of component design and performance documentation across various systems that used sodium as a working fluid. Regaining this capability will allow the DOE complex and the domestic sodium reactor industry to better understand how previous systems were designed and built, for use in improving the design and operations of future loops. The contents of this report include: an overview of the current state of domestic sodium reliability databases; a summary of the ongoing effort to improve, understand, and process the CREDO information; a summary of the initial efforts to develop a unified sodium reliability database called the Sodium System Component Reliability Database (NaSCoRD); and an explanation of both how potential users can access the domestic sodium reliability databases and what type of information can be obtained from them.

  3. Opportunities for Engaging Low-Income, Vulnerable Populations in Health Care: A Systematic Review of Homeless Persons’ Access to and Use of Information Technologies

    Science.gov (United States)

    Li, Alice E.; Hogan, Timothy P.

    2013-01-01

    We systematically reviewed the health and social science literature on access to and use of information technologies by homeless persons by searching 5 bibliographic databases. Articles were included if they were in English, represented original research, appeared in peer-reviewed publications, and addressed our research questions. Sixteen articles met our inclusion criteria. We found that mobile phone ownership ranged from 44% to 62%; computer ownership, from 24% to 40%; computer access and use, from 47% to 55%; and Internet use, from 19% to 84%. Homeless persons used technologies for a range of purposes, some of which were health related. Many homeless persons had access to information technologies, suggesting possible health benefits to developing programs that link homeless persons to health care through mobile phones and the Internet. PMID:24148036

  4. View discovery in OLAP databases through statistical combinatorial optimization

    Energy Technology Data Exchange (ETDEWEB)

    Hengartner, Nick W [Los Alamos National Laboratory; Burke, John [PNNL; Critchlow, Terence [PNNL; Joslyn, Cliff [PNNL; Hogan, Emilie [PNNL

    2009-01-01

    OnLine Analytical Processing (OLAP) is a relational database technology providing users with rapid access to summary, aggregated views of a single large database, and is widely recognized for knowledge representation and discovery in high-dimensional relational databases. OLAP technologies provide intuitive and graphical access to the massively complex set of possible summary views available in large relational (SQL) structured data repositories. The capability of OLAP database software systems to handle data complexity comes at a high price for analysts, presenting them with a combinatorially vast space of views of a relational database. We respond to the need to deploy technologies sufficient to allow users to guide themselves to areas of local structure by casting the space of 'views' of an OLAP database as a combinatorial object of all projections and subsets, and 'view discovery' as a search process over that lattice. We equip the view lattice with statistical, information-theoretic measures sufficient to support a combinatorial optimization process. We outline 'hop-chaining' as a particular view discovery algorithm over this object, wherein users are guided across a permutation of the dimensions by searching for successive two-dimensional views, pushing seen dimensions into an increasingly large background filter in a 'spiraling' search process. We illustrate this work in the context of data cubes recording summary statistics for radiation portal monitors at US ports.
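
    The view-scoring step can be sketched by ranking candidate two-dimensional views with an information measure, for example the mutual information between each pair of dimensions. The toy records, dimension names, and the choice of plain mutual information are assumptions for illustration; the hop-chaining background filter is not shown.

```python
import itertools
import math
from collections import Counter

# Toy cube records; the dimension names are invented for the example.
records = [
    {"port": "A", "vehicle": "truck", "alarm": "yes"},
    {"port": "A", "vehicle": "car",   "alarm": "no"},
    {"port": "B", "vehicle": "truck", "alarm": "yes"},
    {"port": "B", "vehicle": "car",   "alarm": "no"},
    {"port": "B", "vehicle": "truck", "alarm": "no"},
]
dimensions = ["port", "vehicle", "alarm"]

def mutual_information(rows, dim_x, dim_y):
    """Plug-in estimate of I(X;Y) in bits over the given records."""
    n = len(rows)
    pxy = Counter((r[dim_x], r[dim_y]) for r in rows)
    px = Counter(r[dim_x] for r in rows)
    py = Counter(r[dim_y] for r in rows)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

# Score every candidate 2D view and rank them, highest information first.
scores = {(x, y): mutual_information(records, x, y)
          for x, y in itertools.combinations(dimensions, 2)}
for view, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(view, round(score, 3))
```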

  5. Advanced information technology: Building stronger databases

    Energy Technology Data Exchange (ETDEWEB)

    Price, D. [Lawrence Livermore National Lab., CA (United States)

    1994-12-01

    This paper discusses the attributes of the Advanced Information Technology (AIT) tool set, a database application builder designed at the Lawrence Livermore National Laboratory. AIT consists of a C library and several utilities that provide referential integrity across a database, interactive menu and field level help, and a code generator for building tightly controlled data entry support. AIT also provides for dynamic menu trees, report generation support, and creation of user groups. Composition of the library and utilities is discussed, along with relative strengths and weaknesses. In addition, an instantiation of the AIT tool set is presented using a specific application. Conclusions about the future and value of the tool set are then drawn based on the use of the tool set with that specific application.

  6. Access to augmentative and alternative communication: new technologies and clinical decision-making.

    Science.gov (United States)

    Fager, Susan; Bardach, Lisa; Russell, Susanne; Higginbotham, Jeff

    2012-01-01

    Children with severe physical impairments require a variety of access options to augmentative and alternative communication (AAC) and computer technology. Access technologies have continued to develop, allowing children with severe motor control impairments greater independence and access to communication. This article will highlight new advances in access technology, including eye and head tracking, scanning, and access to mainstream technology, as well as discuss future advances. Considerations for clinical decision-making and implementation of these technologies will be presented along with case illustrations.

  7. Evolution of Database Replication Technologies for WLCG

    OpenAIRE

    Baranowski, Zbigniew; Pardavila, Lorena Lobato; Blaszczyk, Marcin; Dimitrov, Gancho; Canali, Luca

    2015-01-01

    In this article we summarize several years of experience with database replication technologies used at WLCG, and we provide a short review of the available Oracle technologies and their key characteristics. One of the notable changes and improvements in this area in the recent past has been the introduction of Oracle GoldenGate as a replacement for Oracle Streams. We report in this article on the preparation and later upgrades for remote replication done in collaboration with ATLAS and Tier 1 databas...

  8. RFID Based Security Access Control System with GSM Technology

    OpenAIRE

    Peter Adole; Joseph M. Môm; Gabriel A. Igwue

    2016-01-01

    The security challenges being encountered in many places today require electronic means of controlling access to secured premises in addition to the available security personnel. Various technologies were used in different forms to solve these challenges. The Radio Frequency Identification (RFID) Based Access Control Security system with GSM technology presented in this work helps to prevent unauthorized access to controlled environments (secured premises). This is achieved mainly...

  9. Inference Attacks and Control on Database Structures

    Directory of Open Access Journals (Sweden)

    Muhamed Turkanovic

    2015-02-01

    Full Text Available Today's databases store information with sensitivity levels that range from public to highly sensitive; ensuring confidentiality can therefore be highly important, but it also requires costly control. This paper focuses on the inference problem for different database structures. It presents possible threats to privacy related to inference, and control methods for mitigating these threats. The paper shows that using only access control, without any inference control, is inadequate, since such models are unable to protect against indirect data access. Furthermore, it covers new inference problems which arise from new technologies like XML, semantics, etc.
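
    A toy illustration of the indirect-access problem mentioned above: direct reads of one person's salary are forbidden, yet two individually permitted aggregate queries differ by exactly that value, so access control alone does not prevent the inference. The data and schema are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staff (name TEXT, dept TEXT, salary INTEGER);
INSERT INTO staff VALUES ('alice','lab',5200),('bob','lab',4800),('carol','lab',5100);
""")

# Both aggregates look harmless and would pass a row-level access check.
total_all = conn.execute("SELECT SUM(salary) FROM staff").fetchone()[0]
total_without_carol = conn.execute(
    "SELECT SUM(salary) FROM staff WHERE name <> 'carol'").fetchone()[0]

# Their difference reveals the protected individual value.
print("Inferred salary of carol:", total_all - total_without_carol)  # 5100
```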

  10. Technological Innovation and Cooperation for Foreign Information Access Program

    Science.gov (United States)

    Office of Postsecondary Education, US Department of Education, 2012

    2012-01-01

    The Technological Innovation and Cooperation for Foreign Information Access (TICFIA) Program supports projects focused on developing innovative technologies for accessing, collecting, organizing, preserving, and disseminating information from foreign sources to address the U.S.' teaching and research needs in international education and foreign…

  11. The Ruby UCSC API: accessing the UCSC genome database using Ruby.

    Science.gov (United States)

    Mishima, Hiroyuki; Aerts, Jan; Katayama, Toshiaki; Bonnal, Raoul J P; Yoshiura, Koh-ichiro

    2012-09-21

    The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index, if available, when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will facilitate biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help is provided via the website at http://rubyucscapi.userecho.com/.
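
    The bin index referred to above follows the standard UCSC binning scheme, a fixed hierarchy of 128 kb, 1 Mb, 8 Mb, 64 Mb and 512 Mb bins. The sketch below (in Python rather than Ruby, with a helper name of our own) computes the smallest bin that fully contains an interval, which is what lets an interval query touch only a handful of bins instead of scanning a whole annotation table.

```python
# Standard UCSC bin offsets, finest (128 kb) level first.
BIN_OFFSETS = [512 + 64 + 8 + 1, 64 + 8 + 1, 8 + 1, 1, 0]
BIN_FIRST_SHIFT = 17   # 2**17 = 128 kb bins at the finest level
BIN_NEXT_SHIFT = 3     # each coarser level is 8x wider

def bin_from_range(start, end):
    """Return the bin holding [start, end) in 0-based, half-open coordinates."""
    start_bin = start >> BIN_FIRST_SHIFT
    end_bin = (end - 1) >> BIN_FIRST_SHIFT
    for offset in BIN_OFFSETS:
        if start_bin == end_bin:
            return offset + start_bin
        start_bin >>= BIN_NEXT_SHIFT
        end_bin >>= BIN_NEXT_SHIFT
    raise ValueError("interval out of range for the standard binning scheme")

print(bin_from_range(10_000, 20_000))     # small interval -> a fine (128 kb) bin
print(bin_from_range(10_000, 5_000_000))  # large interval -> a coarser bin
```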

  12. The Ruby UCSC API: accessing the UCSC genome database using Ruby

    Science.gov (United States)

    2012-01-01

    Background The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. Results The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index—if available—when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Conclusions Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will facilitate biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help is provided via the website at http://rubyucscapi.userecho.com/. PMID:22994508

  13. The Ruby UCSC API: accessing the UCSC genome database using Ruby

    Directory of Open Access Journals (Sweden)

    Mishima Hiroyuki

    2012-09-01

    Full Text Available Abstract Background The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. Results The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index, if available, when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Conclusions Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will facilitate biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help is provided via the website at http://rubyucscapi.userecho.com/.

  14. 39 CFR 255.4 - Accessibility to electronic and information technology.

    Science.gov (United States)

    2010-07-01

    ... AND INFORMATION TECHNOLOGY § 255.4 Accessibility to electronic and information technology. (a) In... burden, that the electronic and information technology the agency procures allows— (1) Individuals with... 39 Postal Service 1 2010-07-01 2010-07-01 false Accessibility to electronic and information...

  15. Physical Access Control Database -

    Data.gov (United States)

    Department of Transportation — This data set contains the personnel access card data (photo, name, activation/expiration dates, card number, and access level) as well as data about turnstiles and...

  16. Study on Mandatory Access Control in a Secure Database Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper proposes a security policy model for mandatory access control in a class B1 database management system whose labeling granularity is the tuple. The relation-hierarchical data model is extended to a multilevel relation-hierarchical data model. Based on this model, the concept of upper-lower layer relational integrity is presented after the covert channels caused by database integrity are analyzed and eliminated. Two SQL statements are extended to process polyinstantiation in the multilevel secure environment. The system is based on the multilevel relation-hierarchical data model and is capable of integratively storing and manipulating multilevel complex objects (e.g., multilevel spatial data) and multilevel conventional data (e.g., integers, real numbers and character strings).
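
    A minimal sketch of tuple-level labeling and the "no read up" filtering it implies: every tuple carries a security label, and reads issued by a session are rewritten so that only tuples at or below the session's clearance are returned. The schema, labels, and numeric encoding are illustrative assumptions, not the paper's model.

```python
import sqlite3

LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2}

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE mission (name TEXT, region TEXT, label INTEGER);
INSERT INTO mission VALUES
  ('survey',   'north', 0),
  ('resupply', 'east',  1),
  ('recon',    'west',  2);
""")

def select_missions(session_level):
    """Rewrite the read so the session sees only tuples it is cleared for."""
    clearance = LEVELS[session_level]
    return conn.execute(
        "SELECT name, region FROM mission WHERE label <= ?", (clearance,)
    ).fetchall()

print(select_missions("CONFIDENTIAL"))  # survey and resupply only
print(select_missions("SECRET"))        # all three tuples
```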

  17. Design of Nutrition Catering System for Athletes Based on Access Database

    OpenAIRE

    Hongjiang Wu; Haiyan Zhao; Xugang Liu; Mingshun Xing

    2015-01-01

    In order to monitor and adjust athletes' dietary nutrition scientifically, ActiveX Data Objects (ADO) and Structured Query Language (SQL) were used to develop the program in the Visual Basic 6.0 development environment with an Access database. The consulting system on food nutrition and diet was developed by combining the two languages and organizing the latest nutrition information. Nutrition balance of physiological characteristics, assessment for nutrition intake, inquiring n...
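
    The system above was built with VB6, ADO and an Access file. As a hedged sketch of the same idea in Python, the snippet below issues an SQL query against an Access database through the standard Access ODBC driver; the file path, table, and column names are hypothetical placeholders, not the system's actual schema.

        # Hedged sketch: SQL query against a Microsoft Access file via pyodbc
        # (Windows, with the Access ODBC driver installed). Path, table, and
        # column names are hypothetical placeholders, not the system's schema.
        import pyodbc

        conn_str = (
            r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
            r"DBQ=C:\data\nutrition.accdb;"
        )
        with pyodbc.connect(conn_str) as conn:
            cur = conn.cursor()
            cur.execute(
                "SELECT FoodName, Protein, Fat, Carbohydrate "
                "FROM FoodNutrition WHERE FoodName = ?",
                ("oatmeal",),
            )
            for row in cur.fetchall():
                print(row.FoodName, row.Protein, row.Fat, row.Carbohydrate)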

  18. Technology for People, Not Disabilities: Ensuring Access and Inclusion

    Science.gov (United States)

    Foley, Alan; Ferri, Beth A.

    2012-01-01

    The potential of technology to connect people and provide access to education, commerce, employment and entertainment has never been greater or more rapidly changing. Communication technologies and new media promise to "revolutionize our lives" by breaking down barriers and expanding access for disabled people. Yet, it is also true that technology…

  19. SciELO, Scientific Electronic Library Online, a Database of Open Access Journals

    Science.gov (United States)

    Meneghini, Rogerio

    2013-01-01

    This essay discusses SciELO, a scientific journal database operating in 14 countries. It covers over 1000 journals providing open access to full text and table sets of scientometrics data. In Brazil it is responsible for a collection of nearly 300 journals, selected over 15 years as the best Brazilian periodicals in natural and social sciences.…

  20. From Punched Cards to "Big Data": A Social History of Database Populism

    Directory of Open Access Journals (Sweden)

    Kevin Driscoll

    2012-08-01

    Full Text Available Since the diffusion of the punched card tabulator following the 1890 U.S. Census, mass-scale information processing has been alternately a site of opportunity, ambivalence and fear in the American imagination. While large bureaucracies have tended to deploy database technology toward purposes of surveillance and control, the rise of personal computing made databases accessible to individuals and small businesses for the first time. Today, the massive collection of trace communication data by public and private institutions has renewed popular anxiety about the role of the database in society. This essay traces the social history of database technology across three periods that represent significant changes in the accessibility and infrastructure of information processing systems. Although many proposed uses of "big data" seem to threaten individual privacy, a largely-forgotten database populism from the 1970s and 1980s suggests that a reclamation of small-scale data processing might lead to sharper popular critique in the future.

  1. DbAccess: Interactive Statistics and Graphics for Plasma Physics Databases

    International Nuclear Information System (INIS)

    Davis, W.; Mastrovito, D.

    2003-01-01

    DbAccess is an X-windows application, written in IDL®, meeting many specialized statistical and graphical needs of NSTX [National Spherical Torus Experiment] plasma physicists, such as regression statistics and the analysis of variance. Flexible "views" and "joins," which include options for complex SQL expressions, facilitate mixing data from different database tables. General Atomics Plot Objects add extensive graphical and interactive capabilities. An example is included for plasma confinement-time scaling analysis using a multiple linear regression least-squares power fit.
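
    As a worked illustration of the confinement-time scaling analysis mentioned above, a least-squares power-law fit can be obtained by multiple linear regression in log space. The Python sketch below uses synthetic data and made-up variable names; it shows the technique, not DbAccess itself.

        # Sketch of a least-squares power-law fit, tau = C * Ip^a * BT^b * n^c,
        # via multiple linear regression in log space. Synthetic data only.
        import numpy as np

        rng = np.random.default_rng(0)
        Ip, BT, n = rng.uniform(0.5, 2.0, (3, 200))      # predictor variables
        tau = 0.05 * Ip**0.9 * BT**0.2 * n**0.4 * rng.lognormal(0.0, 0.05, 200)

        X = np.column_stack([np.ones_like(Ip), np.log(Ip), np.log(BT), np.log(n)])
        coef, *_ = np.linalg.lstsq(X, np.log(tau), rcond=None)
        print("C =", np.exp(coef[0]), "exponents =", coef[1:])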

  2. Nuclear Criticality Information System. Database examples

    Energy Technology Data Exchange (ETDEWEB)

    Foret, C.A.

    1984-06-01

    The purpose of this publication is to provide our users with a guide to using the Nuclear Criticality Information System (NCIS). It is comprised of an introduction, an information and resources section, a how-to-use section, and several useful appendices. The main objective of this report is to present a clear picture of the NCIS project and its available resources as well as assisting our users in accessing the database and using the TIS computer to process data. The introduction gives a brief description of the NCIS project, the Technology Information System (TIS), online user information, future plans and lists individuals to contact for additional information about the NCIS project. The information and resources section outlines the NCIS database and describes the resources that are available. The how-to-use section illustrates access to the NCIS database as well as searching datafiles for general or specific data. It also shows how to access and read the NCIS news section as well as connecting to other information centers through the TIS computer.

  3. Nuclear Criticality Information System. Database examples

    International Nuclear Information System (INIS)

    Foret, C.A.

    1984-06-01

    The purpose of this publication is to provide our users with a guide to using the Nuclear Criticality Information System (NCIS). It is comprised of an introduction, an information and resources section, a how-to-use section, and several useful appendices. The main objective of this report is to present a clear picture of the NCIS project and its available resources as well as assisting our users in accessing the database and using the TIS computer to process data. The introduction gives a brief description of the NCIS project, the Technology Information System (TIS), online user information, future plans and lists individuals to contact for additional information about the NCIS project. The information and resources section outlines the NCIS database and describes the resources that are available. The how-to-use section illustrates access to the NCIS database as well as searching datafiles for general or specific data. It also shows how to access and read the NCIS news section as well as connecting to other information centers through the TIS computer

  4. 76 FR 77738 - Telecommunications Act Accessibility Guidelines; Electronic and Information Technology...

    Science.gov (United States)

    2011-12-14

    ... Telecommunications Act Accessibility Guidelines and its Electronic and Information Technology Accessibility Standards... electronic and information technology covered by Section 508 of the Rehabilitation Act Amendments of 1998. 76.... 2011-07] RIN 3014-AA37 Telecommunications Act Accessibility Guidelines; Electronic and Information...

  5. Direct access to INIS

    International Nuclear Information System (INIS)

    Zheludev, I.S.; Romanenko, A.G.

    1981-01-01

    Librarians, researchers, and information specialists throughout the world now have the opportunity for direct access to coverage of almost 95% of the world's literature dealing with the peaceful uses of atomic energy and nuclear science. This opportunity has been provided by the International Nuclear Information System (INIS) of the IAEA. INIS, with the voluntary collaboration of more than 60 of the Agency's Member States, maintains a comprehensive, computer-resident data-base, containing the bibliographic details plus informative abstracts of the bulk of the world's literature on nuclear science and technology. Since this data-base is growing at a rate of 75,000 items per year, and already contains more than 500,000 items, it is obviously important to be able to search this collection conveniently and efficiently. The usefulness of this ability is enhanced when other data-bases on related subjects are made available on an information network. During the early 1970s, on-line interrogation of large bibliographic data-bases became the accepted method for searching this type of information resource. Direct interaction between the searcher and the data-base provides quick feed-back resulting in improved literature listings for launching research and development projects. On-line access enables organizations which cannot store a large data-base on their own computer to expand the information resources at their command. Because of these advantages, INIS undertook to extend to interested Member States on-line access to its data-base in Vienna

  6. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    Science.gov (United States)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground-rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Disciplined specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycles studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  7. Health technology management: a database analysis as support of technology managers in hospitals.

    Science.gov (United States)

    Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi

    2011-01-01

    Technology management in healthcare must continually respond and adapt itself to new improvements in medical equipment. Multidisciplinary approaches which consider the interaction of different technologies, their use and user skills, are necessary in order to improve safety and quality. An easy and sustainable methodology is vital to Clinical Engineering (CE) services in healthcare organizations in order to define criteria regarding technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services exclusively referring to the maintenance database from the CE department at the Careggi Hospital in Florence, Italy.

  8. The Neotoma Paleoecology Database

    Science.gov (United States)

    Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.

    2015-12-01

    The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and for developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software-interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community
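
    As a hedged sketch of the web-service access described above, the snippet below queries the public Neotoma API over HTTP from Python; the endpoint path and parameter names are assumptions about the current public interface and may not match the services discussed in the abstract.

        # Hedged sketch: calling the Neotoma web services over HTTP. Endpoint and
        # parameter names are assumptions about the public API
        # (https://api.neotomadb.org) and may differ from the services above.
        import requests

        resp = requests.get(
            "https://api.neotomadb.org/v2.0/data/sites",
            params={"sitename": "%Lake%", "limit": 5},
            timeout=30,
        )
        resp.raise_for_status()
        for site in resp.json().get("data", []):
            print(site.get("siteid"), site.get("sitename"))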

  9. Validity of administrative database code algorithms to identify vascular access placement, surgical revisions, and secondary patency.

    Science.gov (United States)

    Al-Jaishi, Ahmed A; Moist, Louise M; Oliver, Matthew J; Nash, Danielle M; Fleet, Jamie L; Garg, Amit X; Lok, Charmaine E

    2018-03-01

    We assessed the validity of physician billing codes and hospital admission using International Classification of Diseases 10th revision codes to identify vascular access placement, secondary patency, and surgical revisions in administrative data. We included adults (≥18 years) with a vascular access placed between 1 April 2004 and 31 March 2013 at the University Health Network, Toronto. Our reference standard was a prospective vascular access database (VASPRO) that contains information on vascular access type and dates of placement, dates for failure, and any revisions. We used VASPRO to assess the validity of different administrative coding algorithms by calculating the sensitivity, specificity, and positive predictive values of vascular access events. The sensitivity (95% confidence interval) of the best performing algorithm to identify arteriovenous access placement was 86% (83%, 89%) and specificity was 92% (89%, 93%). The corresponding numbers to identify catheter insertion were 84% (82%, 86%) and 84% (80%, 87%), respectively. The sensitivity of the best performing coding algorithm to identify arteriovenous access surgical revisions was 81% (67%, 90%) and specificity was 89% (87%, 90%). The algorithm capturing arteriovenous access placement and catheter insertion had a positive predictive value greater than 90% and arteriovenous access surgical revisions had a positive predictive value of 20%. The duration of arteriovenous access secondary patency was on average 578 (553, 603) days in VASPRO and 555 (530, 580) days in administrative databases. Administrative data algorithms have fair to good operating characteristics to identify vascular access placement and arteriovenous access secondary patency. Low positive predictive values for surgical revisions algorithm suggest that administrative data should only be used to rule out the occurrence of an event.
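
    For readers less familiar with the validation metrics reported above, the short worked example below computes sensitivity, specificity, and positive predictive value from a 2x2 table; the counts are illustrative only and are not taken from the study.

        # Worked example of the validation metrics used above (sensitivity,
        # specificity, positive predictive value) from a 2x2 confusion table.
        # The counts are illustrative only, not the study's data.
        def validation_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
            return {
                "sensitivity": tp / (tp + fn),  # flagged events among true events
                "specificity": tn / (tn + fp),  # unflagged among true non-events
                "ppv": tp / (tp + fp),          # true events among flagged cases
            }

        print(validation_metrics(tp=860, fp=75, fn=140, tn=925))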

  10. Distributed Pseudo-Random Number Generation and Its Application to Cloud Database

    OpenAIRE

    Chen, Jiageng; Miyaji, Atsuko; Su, Chunhua

    2014-01-01

    Cloud databases are now a rapidly growing trend in the cloud computing market. They enable clients to run their computations on outsourced databases or to access distributed database services in the cloud. At the same time, security and privacy concerns are a major challenge for cloud databases to continue growing. To enhance the security and privacy of cloud database technology, pseudo-random number generation (PRNG) plays an important role in data encryption and privacy-pr...

  11. A Database for Reviewing and Selecting Radioactive Waste Treatment Technologies and Vendors

    International Nuclear Information System (INIS)

    P. C. Marushia; W. E. Schwinkendorf

    1999-01-01

    Several attempts have been made in past years to collate and present waste management technologies and solutions to waste generators. These efforts have been manifested as reports, buyers' guides, and databases. While this information is helpful at the time it is assembled, the principal weakness is maintaining the timeliness and accuracy of the information over time. In many cases, updates have to be published or developed as soon as the product is disseminated. The recently developed National Low-Level Waste Management Program's Technologies Database is a vendor-updated Internet-based database designed to overcome this problem. The National Low-Level Waste Management Program's Technologies Database contains information about waste types, treatment technologies, and vendor information. Information is presented about waste types, typical treatments, and the vendors who provide those treatment methods. The vendors who provide services update their own contact information, their treatment processes, and the types of wastes for which their treatment process is applicable. This information is queryable by a generator of low-level or mixed low-level radioactive waste who is seeking information on waste treatment methods and the vendors who provide them. Timeliness of the information in the database is assured using time clocks and automated messaging to remind featured vendors to keep their information current. Failure to keep the entries current results in a vendor being warned and then ultimately dropped from the database. This assures that the user is dealing with the most current information available and the vendors who are active in reaching and serving their market.

  12. INIST: databases reorientation

    International Nuclear Information System (INIS)

    Bidet, J.C.

    1995-01-01

    INIST is a CNRS (Centre National de la Recherche Scientifique) laboratory devoted to the treatment of scientific and technical information and to the management of this information compiled in a database. A reorientation of the database content was proposed in 1994 to increase the transfer of research towards enterprises and services, to develop more automated access to the information, and to create a quality assurance plan. The catalog of publications comprises 5800 periodical titles (1300 for fundamental research and 4500 for applied research). A science and technology multi-thematic database will be created in 1995 for the retrieval of applied and technical information. "Grey literature" (reports, theses, proceedings...) and human and social sciences data will be added to the base by using information selected from the existing GRISELI and Francis databases. Strong modifications are also planned in the thematic coverage of Earth sciences and will considerably reduce the geological information content. (J.S.). 1 tab

  13. Multilevel security for relational databases

    CERN Document Server

    Faragallah, Osama S; El-Samie, Fathi E Abd

    2014-01-01

    Concepts of Database Security: Database Concepts; Relational Database Security Concepts; Access Control in Relational Databases (Discretionary Access Control, Mandatory Access Control, Role-Based Access Control); Work Objectives; Book Organization. Basic Concept of Multilevel Database Security: Introduction; Multilevel Database Relations; Polyinstantiation (Invisible Polyinstantiation, Visible Polyinstantiation, Types of Polyinstantiation); Architectural Consideration

  14. Analysis of technologies databases use in physical education and sport

    OpenAIRE

    Usychenko V.V.; Byshevets N.G.

    2010-01-01

    An analysis and systematization of the scientific, methodical, and specialized literature was conducted. Questions concerning the use of database technology in the system of athlete preparation are raised. The necessity of applying technologies for the rapid processing of large arrays of sports information is shown. Data were collected on the use of computer-aided technologies for recording and analyzing the results of testing training-process parameters. The question of the influence of technologies is ...

  15. Technology Adoption: Influence of Availability and Accessibility

    Science.gov (United States)

    McConnell, William Stewart

    2009-01-01

    Farmers are small business leaders using available technology to remain competitive. The availability of technology is dependent on the suppliers' use of the marketing mix 4Ps theory--product, price, placement, and promotion. The purpose of this study was to determine how the relation between availability and accessibility influences the adoption…

  16. Database Systems - Present and Future

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available Database systems nowadays have an increasingly important role in the knowledge-based society, in which computers have penetrated all fields of activity and the Internet tends to develop worldwide. In the current informatics context, the development of applications with databases is the work of specialists. Using databases, accessing a database from various applications, and related concepts have become accessible to all categories of IT users. This paper aims to summarize the curricular area regarding the fundamental database systems issues, which are necessary in order to train specialists in economic informatics higher education. Database systems integrate and interface with several informatics technologies and are therefore more difficult to understand and use. Thus, students should already know a set of minimum, mandatory concepts and their practical implementation: computer systems, programming techniques, programming languages, data structures. The article also presents the actual trends in the evolution of database systems, in the context of economic informatics.

  17. Instructional Technology for Rural Schools: Access and Acquisition

    Science.gov (United States)

    Sundeen, Todd H.; Sundeen, Darrelanne M.

    2013-01-01

    Integrating instructional technology into all classrooms has the potential to transform modern education and student learning. However, access to technology is not equally available to all districts or schools. Decreased funding and budgetary restraints have had a direct impact on technology acquisition in many rural school districts. One of the…

  18. Allelic database and accession divergence of a Brazilian mango collection based on microsatellite markers.

    Science.gov (United States)

    Dos Santos Ribeiro, I C N; Lima Neto, F P; Santos, C A F

    2012-12-19

    Allelic patterns and genetic distances were examined in a collection of 103 foreign and Brazilian mango (Mangifera indica) accessions in order to develop a reference database to support cultivar protection and breeding programs. An UPGMA dendrogram was generated using Jaccard's coefficients from a distance matrix based on 50 alleles of 12 microsatellite loci. The base pair number was estimated by the method of inverse mobility. The cophenetic correlation was 0.8. The accessions had a coefficient of similarity from 30 to 100%, which reflects high genetic variability. Three groups were observed in the UPGMA dendrogram; the first group was formed predominantly by foreign accessions, the second group was formed by Brazilian accessions, and the Dashehari accession was isolated from the others. The 50 microsatellite alleles did not separate all 103 accessions, indicating that there are duplicates in this mango collection. These 12 microsatellites need to be validated in order to establish a reliable set to identify mango cultivars.
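
    The analysis pipeline named above (Jaccard distances over allele presence/absence, then a UPGMA dendrogram with a cophenetic correlation check) can be sketched in a few lines of Python with SciPy; the 0/1 matrix below is synthetic, not the mango collection data.

        # Sketch of the pipeline described above: Jaccard distances over binary
        # allele presence/absence profiles, UPGMA (average-linkage) clustering,
        # and the cophenetic correlation. Synthetic data, not the mango collection.
        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.cluster.hierarchy import linkage, cophenet, dendrogram

        rng = np.random.default_rng(1)
        alleles = rng.integers(0, 2, size=(10, 50))   # 10 accessions x 50 alleles

        dist = pdist(alleles, metric="jaccard")       # condensed distance matrix
        tree = linkage(dist, method="average")        # UPGMA
        coph_corr, _ = cophenet(tree, dist)
        print("cophenetic correlation:", round(float(coph_corr), 2))
        dendrogram(tree, no_plot=True)                # plot with matplotlib if desired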

  19. Personalized medicine and access to genetic technologies.

    Science.gov (United States)

    den Exter, André

    2010-01-01

    Personalized medicine started after the Human Genome Project and is a relatively new concept that will dramatically change clinical practice. It offers clear clinical advantages by applying genetic diagnostic tests and then treating the patient with targeted medicines based on his or her genetic make-up. Its potential seems promising but there are quite a few legal concerns. One of these concerns deals with the right to health care and access to genetic technologies. In this paper, the author explains the meaning of such a right to health care under international human rights law, its relevance for making genetic services eligible for public funding, how to cope with quality concerns of commercial testing, and finally, the patentability controversy and clinical access to genetic information. Apart from more traditional human rights concerns (consent, privacy, confidentiality) and genetics, States should be aware of the meaning of the equal access concept under international law and its consequences when introducing new technologies such as genetic testing and services.

  20. Exploiting relational database technology in a GIS

    Science.gov (United States)

    Batty, Peter

    1992-05-01

    All systems for managing data face common problems such as backup, recovery, auditing, security, data integrity, and concurrent update. Other challenges include the ability to share data easily between applications and to distribute data across several computers, while continuing to manage the problems already mentioned. Geographic information systems are no exception, and need to tackle all these issues. Standard relational database-management systems (RDBMSs) provide many features to help solve the issues mentioned so far. This paper describes how the IBM geoManager product approaches these issues by storing all its geographic data in a standard RDBMS in order to take advantage of such features. Areas in which standard RDBMS functions need to be extended are highlighted, and the way in which geoManager does this is explained. The performance implications of storing all data in the relational database are discussed. An important distinction is made between the storage and management of geographic data and the manipulation and analysis of geographic data, which needs to be made when considering the applicability of relational database technology to GIS.

  1. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, the installation of the designed database into the database system Oracle Database 10g Express Edition, and a demonstration of administration tasks in this database system. The design was verified by means of a developed access application.

  2. Wireless access to a pharmaceutical database: A demonstrator for data driven Wireless Application Prorocol (WAP) applications in medical information processing

    DEFF Research Database (Denmark)

    Hansen, Michael Schacht; Dørup, Jens

    2001-01-01

    … script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. RESULTS: A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol... service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. CONCLUSIONS: We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further...

  3. Multi-Dimensional Bitmap Indices for Optimising Data Access within Object Oriented Databases at CERN

    CERN Document Server

    Stockinger, K

    2001-01-01

    Efficient query processing in high-dimensional search spaces is an important requirement for many analysis tools. In the literature on index data structures one can find a wide range of methods for optimising database access. In particular, bitmap indices have recently gained substantial popularity in data warehouse applications with large amounts of read-mostly data. Bitmap indices are implemented in various commercial database products and are used for querying typical business applications. However, scientific data that is mostly characterised by non-discrete attribute values cannot be queried efficiently by the techniques currently supported. In this thesis we propose a novel access method based on bitmap indices that efficiently handles multi-dimensional queries against typical scientific data. The algorithm is called GenericRangeEval and is an extension of a bitmap index for discrete attribute values. By means of a cost model we study the performance of queries with various selectivities against uniform...
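
    As a toy illustration of the binned bitmap-index idea the thesis builds on, the Python sketch below answers a range query by OR-ing the bitmaps of the bins that overlap the query interval and then applying an exact check for the partially covered edge bins. It is a simplification, not the GenericRangeEval algorithm itself.

        # Toy sketch of a binned bitmap index answering a range query by OR-ing
        # per-bin bitmaps; a simplification of the binned approach, not the
        # GenericRangeEval code described in the thesis.
        import numpy as np

        values = np.random.default_rng(2).uniform(0.0, 100.0, 100_000)
        edges = np.linspace(0.0, 100.0, 101)           # 100 equal-width bins
        bin_of = np.digitize(values, edges) - 1
        bitmaps = [bin_of == b for b in range(100)]    # one boolean "bitmap" per bin

        def range_query(lo: float, hi: float) -> np.ndarray:
            """Rows with lo <= value < hi: OR covering bin bitmaps, then exact check."""
            b_lo, b_hi = np.digitize([lo, hi], edges) - 1
            hits = np.zeros(values.size, dtype=bool)
            for b in range(b_lo, min(b_hi, 99) + 1):
                hits |= bitmaps[b]                     # OR the per-bin bitmaps
            return hits & (values >= lo) & (values < hi)

        print(range_query(12.5, 37.5).sum())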

  4. Hyperdatabase: A schema for browsing multiple databases

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, M A [Dalhousie Univ., Halifax (Canada). Computer Science Div.; Watters, C R [Waterloo Univ., Waterloo (Canada). Computer Science Dept.

    1990-05-01

    In order to ensure effective information retrieval, a user may need to search multiple databases on multiple systems. Although front-end systems have been developed to assist the user in accessing different systems, they access one retrieval system at a time and the search has to be repeated for each required database on each retrieval system. More importantly, the user interacts with the results as independent sessions. This paper models multiple bibliographic databases distributed over one or more retrieval systems as a hyperdatabase, i.e., a single virtual database. The hyperdatabase is viewed as a hypergraph in which each node represents a bibliographic item and the links among nodes represent relations among the items. In response to a query, bibliographic items are extracted from the hyperdatabase and linked together to form a transient hypergraph. This hypergraph is transient in the sense that it is "created" in response to a query and only "exists" for the duration of the query session. A hypertext interface permits the user to browse the transient hypergraph in a nonlinear manner. The technology to implement a system based on this model is available now, consisting of powerful workstations, distributed processing, high-speed communications, and CD-ROMs. As the technology advances and costs decrease, such systems should be generally available. (author). 13 refs, 5 figs.

  5. Hyperdatabase: A schema for browsing multiple databases

    International Nuclear Information System (INIS)

    Shepherd, M.A.; Watters, C.R.

    1990-05-01

    In order to ensure effective information retrieval, a user may need to search multiple databases on multiple systems. Although front-end systems have been developed to assist the user in accessing different systems, they access one retrieval system at a time and the search has to be repeated for each required database on each retrieval system. More importantly, the user interacts with the results as independent sessions. This paper models multiple bibliographic databases distributed over one or more retrieval systems as a hyperdatabase, i.e., a single virtual database. The hyperdatabase is viewed as a hypergraph in which each node represents a bibliographic item and the links among nodes represent relations among the items. In response to a query, bibliographic items are extracted from the hyperdatabase and linked together to form a transient hypergraph. This hypergraph is transient in the sense that it is "created" in response to a query and only "exists" for the duration of the query session. A hypertext interface permits the user to browse the transient hypergraph in a nonlinear manner. The technology to implement a system based on this model is available now, consisting of powerful workstations, distributed processing, high-speed communications, and CD-ROMs. As the technology advances and costs decrease, such systems should be generally available. (author). 13 refs, 5 figs.
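
    A minimal sketch of the transient-hypergraph idea, given in Python with invented record structures and data: items retrieved from two bibliographic sources are linked in memory by a shared attribute (here, common authors) for the duration of one query session.

        # Minimal sketch of the "transient hypergraph" idea: records retrieved from
        # several bibliographic sources are linked in memory by a shared attribute
        # (here, authors) for the duration of one query session. Data are invented.
        from collections import defaultdict
        from itertools import combinations

        results_db_a = [{"id": "A1", "title": "Bitmap indices", "authors": {"Lee"}},
                        {"id": "A2", "title": "Hypertext browsing", "authors": {"Kim", "Lee"}}]
        results_db_b = [{"id": "B7", "title": "Federated retrieval", "authors": {"Kim"}}]

        items = {r["id"]: r for r in results_db_a + results_db_b}
        by_author = defaultdict(set)
        for r in items.values():
            for a in r["authors"]:
                by_author[a].add(r["id"])

        edges = {frozenset(pair)
                 for ids in by_author.values() if len(ids) > 1
                 for pair in combinations(sorted(ids), 2)}
        print(sorted(tuple(sorted(e)) for e in edges))   # links to browse non-linearly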

  6. Application of Optical Disc Databases and Related Technology to Public Access Settings

    Science.gov (United States)

    1992-03-01

    …librarians during one-on-one instruction, and the ability of users to browse the database. Correlation of the James A. Haley Veterans Hospital study findings… library to another, librarians must collect and study data about the information-gathering characteristics of their own users (Harter and Jackson 1988)…

  7. MetaboLights: An Open-Access Database Repository for Metabolomics Data.

    Science.gov (United States)

    Kale, Namrata S; Haug, Kenneth; Conesa, Pablo; Jayseelan, Kalaivani; Moreno, Pablo; Rocca-Serra, Philippe; Nainala, Venkata Chandrasekhar; Spicer, Rachel A; Williams, Mark; Li, Xuefei; Salek, Reza M; Griffin, Julian L; Steinbeck, Christoph

    2016-03-24

    MetaboLights is the first general purpose, open-access database repository for cross-platform and cross-species metabolomics research at the European Bioinformatics Institute (EMBL-EBI). Based upon the open-source ISA framework, MetaboLights provides Metabolomics Standard Initiative (MSI) compliant metadata and raw experimental data associated with metabolomics experiments. Users can upload their study datasets into the MetaboLights Repository. These studies are then automatically assigned a stable and unique identifier (e.g., MTBLS1) that can be used for publication reference. The MetaboLights Reference Layer associates metabolites with metabolomics studies in the archive and is extensively annotated with data fields such as structural and chemical information, NMR and MS spectra, target species, metabolic pathways, and reactions. The database is manually curated with no specific release schedules. MetaboLights is also recommended by journals for metabolomics data deposition. This unit provides a guide to using MetaboLights, downloading experimental data, and depositing metabolomics datasets using user-friendly submission tools. Copyright © 2016 John Wiley & Sons, Inc.
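
    As a hedged sketch of programmatic retrieval from the repository described above, the snippet below fetches one study's metadata over HTTP; the URL pattern and response fields are assumptions about the EMBL-EBI REST interface, not the submission tools mentioned in the abstract.

        # Hedged sketch: fetching study metadata from the MetaboLights web services.
        # The URL pattern and response fields are assumptions about the EMBL-EBI
        # REST interface and may differ from the tools described above.
        import requests

        study_id = "MTBLS1"   # stable study identifier, as described in the abstract
        resp = requests.get(
            f"https://www.ebi.ac.uk/metabolights/ws/studies/{study_id}",
            headers={"Accept": "application/json"},
            timeout=30,
        )
        resp.raise_for_status()
        print(resp.json().get("title", "<no title field returned>"))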

  8. Seamless access to OER with mobile technologies

    NARCIS (Netherlands)

    Tabuenca, Bernardo

    2014-01-01

    This presentation provides insight into how ubiquitous technology can support lifelong learners by facilitating access across contexts. The 3LHub tool is presented as a suitable tool to scaffold personal learning ecologies.

  9. Planned and ongoing projects (pop) database: development and results.

    Science.gov (United States)

    Wild, Claudia; Erdös, Judit; Warmuth, Marisa; Hinterreiter, Gerda; Krämer, Peter; Chalon, Patrice

    2014-11-01

    The aim of this study was to present the development, structure and results of a database on planned and ongoing health technology assessment (HTA) projects (POP Database) in Europe. The POP Database (POP DB) was set up in an iterative process from a basic Excel sheet to a multifunctional electronic online database. The functionalities, such as the search terminology, the procedures to fill and update the database, the access rules to enter the database, as well as the maintenance roles, were defined in a multistep participatory feedback loop with EUnetHTA Partners. The POP Database has become an online database that hosts not only the titles and MeSH categorizations, but also some basic information on status and contact details about the listed projects of EUnetHTA Partners. Currently, it stores more than 1,200 planned, ongoing or recently published projects of forty-three EUnetHTA Partners from twenty-four countries. Because the POP Database aims to facilitate collaboration, it also provides a matching system to assist in identifying similar projects. Overall, more than 10 percent of the projects in the database are identical both in terms of pathology (indication or disease) and technology (drug, medical device, intervention). In addition, approximately 30 percent of the projects are similar, meaning that they have at least some overlap in content. Although the POP DB is successful concerning regular updates of most national HTA agencies within EUnetHTA, little is known about its actual effects on collaborations in Europe. Moreover, many non-nationally nominated HTA producing agencies neither have access to the POP DB nor can share their projects.

  10. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    Science.gov (United States)

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  11. Database Constraints Applied to Metabolic Pathway Reconstruction Tools

    Directory of Open Access Journals (Sweden)

    Jordi Vilaplana

    2014-01-01

    Full Text Available Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented by a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives an acceptable performance for low or medium size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.

  12. Database constraints applied to metabolic pathway reconstruction tools.

    Science.gov (United States)

    Vilaplana, Jordi; Solsona, Francesc; Teixido, Ivan; Usié, Anabel; Karathia, Hiren; Alves, Rui; Mateo, Jordi

    2014-01-01

    Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented by a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives an acceptable performance for low or medium size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.
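
    The tuning step described in both entries above amounts to adjusting the MySQL server's configurable variables. The hedged sketch below inspects and changes a few commonly tuned server settings from Python; the variable names are standard MySQL settings, but the values and connection details are illustrative and are not the configuration reported by the authors.

        # Hedged sketch: inspecting and adjusting commonly tuned MySQL server
        # variables from Python. Variable names are standard MySQL/InnoDB settings;
        # the values and credentials are illustrative, not the paper's configuration.
        import pymysql

        conn = pymysql.connect(host="localhost", user="root", password="...", database="mysql")
        with conn.cursor() as cur:
            for var in ("innodb_buffer_pool_size", "max_connections", "tmp_table_size"):
                cur.execute("SHOW VARIABLES LIKE %s", (var,))
                print(cur.fetchone())
            # Global adjustments require sufficient privileges.
            cur.execute("SET GLOBAL max_connections = 500")
        conn.close()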

  13. E-MSD: the European Bioinformatics Institute Macromolecular Structure Database.

    Science.gov (United States)

    Boutselakis, H; Dimitropoulos, D; Fillon, J; Golovin, A; Henrick, K; Hussain, A; Ionides, J; John, M; Keller, P A; Krissinel, E; McNeil, P; Naim, A; Newman, R; Oldfield, T; Pineda, J; Rachedi, A; Copeland, J; Sitnov, A; Sobhany, S; Suarez-Uruena, A; Swaminathan, J; Tagari, M; Tate, J; Tromm, S; Velankar, S; Vranken, W

    2003-01-01

    The E-MSD macromolecular structure relational database (http://www.ebi.ac.uk/msd) is designed to be a single access point for protein and nucleic acid structures and related information. The database is derived from Protein Data Bank (PDB) entries. Relational database technologies are used in a comprehensive cleaning procedure to ensure data uniformity across the whole archive. The search database contains an extensive set of derived properties, goodness-of-fit indicators, and links to other EBI databases including InterPro, GO, and SWISS-PROT, together with links to SCOP, CATH, PFAM and PROSITE. A generic search interface is available, coupled with a fast secondary structure domain search tool.

  14. JASPAR 2010: the greatly expanded open-access database of transcription factor binding profiles

    Science.gov (United States)

    Portales-Casamar, Elodie; Thongjuea, Supat; Kwon, Andrew T.; Arenillas, David; Zhao, Xiaobei; Valen, Eivind; Yusuf, Dimas; Lenhard, Boris; Wasserman, Wyeth W.; Sandelin, Albin

    2010-01-01

    JASPAR (http://jaspar.genereg.net) is the leading open-access database of matrix profiles describing the DNA-binding patterns of transcription factors (TFs) and other proteins interacting with DNA in a sequence-specific manner. Its fourth major release is the largest expansion of the core database to date: the database now holds 457 non-redundant, curated profiles. The new entries include the first batch of profiles derived from ChIP-seq and ChIP-chip whole-genome binding experiments, and 177 yeast TF binding profiles. The introduction of a yeast division brings the convenience of JASPAR to an active research community. As binding models are refined by newer data, the JASPAR database now uses versioning of matrices: in this release, 12% of the older models were updated to improved versions. Classification of TF families has been improved by adopting a new DNA-binding domain nomenclature. A curated catalog of mammalian TFs is provided, extending the use of the JASPAR profiles to additional TFs belonging to the same structural family. The changes in the database set the system ready for more rapid acquisition of new high-throughput data sources. Additionally, three new special collections provide matrix profile data produced by recent alternative high-throughput approaches. PMID:19906716

  15. Survey of Canadian Myotonic Dystrophy Patients' Access to Computer Technology.

    Science.gov (United States)

    Climans, Seth A; Piechowicz, Christine; Koopman, Wilma J; Venance, Shannon L

    2017-09-01

    Myotonic dystrophy type 1 is an autosomal dominant condition affecting distal hand strength, energy, and cognition. Increasingly, patients and families are seeking information online. An online neuromuscular patient portal under development can help patients access resources and interact with each other regardless of location. It is unknown how individuals living with myotonic dystrophy interact with technology and whether barriers to access exist. We aimed to characterize technology use among participants with myotonic dystrophy and to determine whether there is interest in a patient portal. Surveys were mailed to 156 participants with myotonic dystrophy type 1 registered with the Canadian Neuromuscular Disease Registry. Seventy-five participants (60% female) responded; almost half were younger than 46 years. Most (84%) used the internet; almost half of the responders (47%) used social media. The complexity and cost of technology were commonly cited reasons not to use technology. The majority of responders (76%) were interested in a myotonic dystrophy patient portal. Patients in a Canada-wide registry of myotonic dystrophy have access to and use technology such as computers and mobile phones. These patients expressed interest in a portal that would provide them with an opportunity to network with others with myotonic dystrophy and to access information about the disease.

  16. Scientific information: technology, will and access

    Directory of Open Access Journals (Sweden)

    Ana M. B. Pavani

    2007-12-01

    Full Text Available This article addresses the access to information from a point of view that relates the evolution of technology to the methods of treating information and to the desire for knowledge. The first part introduces some important events in ancient times, the end of the middle ages/beginning of the modern age and the XIX century. Then, it presents an overview of the current situation of traditional libraries and compares some characteristics with the corresponding ones for digital libraries. It ends by mentioning the international efforts towards open archives and open access to information; it shows examples of positive results.

  17. Experience with a run file archive using database technology

    International Nuclear Information System (INIS)

    Nixdorf, U.

    1993-12-01

    High Energy Physics experiments are known for their production of large amounts of data. Even small projects may have to manage several Giga Byte of event information. One possible solution for the management of this data is to use today's technology to archive the raw data files in tertiary storage and build on-line catalogs which reference interesting data. This approach has been taken by the Gammas, Electrons and Muons (GEM) Collaboration for their evaluation of muon chamber technologies at the Superconducting Super Collider Laboratory (SSCL). Several technologies were installed and tested during a 6 month period. Events produced were first recorded in the UNIX filesystem of the data acquisition system and then migrated to the Physics Detector Simulation Facility (PDSF) for long term storage. The software system makes use of a commercial relational database management system (SYBASE) and the Data Management System (DMS), a tape archival system developed at the SSCL. The components are distributed among several machines inside and outside PDSF. A Motif-based graphical user interface (GUI) enables physicists to retrieve interesting runs from the archive using the on-line database catalog

  18. Database use and technology in Japan: JTEC panel report. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Wiederhold, G.; Beech, D.; Bourne, C.; Farmer, N.; Jajodia, Sushil; Kahaner, D.; Minoura, Toshi; Smith, D.; Smith, J.M.

    1992-04-01

    This report presents the findings of a group of database experts, sponsored by the Japanese Technology Evaluation Center (JTEC), based on an intensive study trip to Japan during March 1991. Academic, industrial, and governmental sites were visited. The primary findings are that Japan is supporting its academic research establishment poorly, that industry is making progress in key areas, and that both academic and industrial researchers are well aware of current domestic and foreign technology. Information sharing between industry and academia is effectively supported by governmental sponsorship of joint planning and review activities, and enhances technology transfer. In two key areas, multimedia and object-oriented databases, the authors can expect to see future export of Japanese database products, typically integrated into larger systems. Support for academic research is relatively modest. Nevertheless, the senior faculty are well-known and respected, and communicate frequently and in depth with each other, with government agencies, and with industry. In 1988 there were a total of 1,717 Ph.D.'s in engineering and 881 in science. It appears that only about 30 of these were academic Ph.D.'s in the basic computer sciences.

  19. Fine-grained policy control in U.S. Army Research Laboratory (ARL) multimodal signatures database

    Science.gov (United States)

    Bennett, Kelly; Grueneberg, Keith; Wood, David; Calo, Seraphin

    2014-06-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) consists of a number of colocated relational databases representing a collection of data from various sensors. Role-based access to this data is granted to external organizations such as DoD contractors and other government agencies through a client Web portal. In the current MMSDB system, access control is only at the database and firewall level. In order to offer finer-grained security, changes to existing user profile schemas and authentication mechanisms are usually needed. In this paper, we describe a software middleware architecture and implementation that allows fine-grained access control to the MMSDB at a dataset, table, and row level. Result sets from MMSDB queries issued in the client portal are filtered with the use of a policy enforcement proxy, with minimal changes to the existing client software and database. Before resulting data is returned to the client, policies are evaluated to determine if the user or role is authorized to access the data. Policies can be authored to filter data at the row, table or column level of a result set. The system uses various technologies developed in the International Technology Alliance in Network and Information Science (ITA) for policy-controlled information sharing and dissemination. Use of the Policy Management Library provides a mechanism for the management and evaluation of policies to support finer-grained access to the data in the MMSDB system. The GaianDB is a policy-enabled, federated database that acts as a proxy between the client application and the MMSDB system.
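
    A minimal Python sketch of the row- and column-level filtering performed by the policy enforcement proxy described above is given below; the policy structure and role names are invented for illustration and do not reflect the ITA Policy Management Library's actual interface.

        # Minimal sketch of row- and column-level filtering of a query result set
        # by a policy enforcement proxy, as described above. The policy structure
        # and role names are invented, not the Policy Management Library's interface.
        ROLE_POLICIES = {
            "contractor": {
                "allowed_columns": {"signature_id", "sensor_type", "timestamp"},
                "row_predicate": lambda row: row["classification"] == "unclassified",
            },
        }

        def enforce(role: str, rows: list) -> list:
            policy = ROLE_POLICIES[role]
            return [
                {k: v for k, v in row.items() if k in policy["allowed_columns"]}
                for row in rows
                if policy["row_predicate"](row)
            ]

        result_set = [
            {"signature_id": 1, "sensor_type": "acoustic", "timestamp": "2014-06-01",
             "classification": "unclassified"},
            {"signature_id": 2, "sensor_type": "seismic", "timestamp": "2014-06-02",
             "classification": "restricted"},
        ]
        print(enforce("contractor", result_set))   # first row only, without 'classification'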

  20. Review on management of horticultural plant germplasm resources and construction of related database

    Directory of Open Access Journals (Sweden)

    Pan Jingxian

    2017-02-01

    Full Text Available The advances in databases on horticultural germplasm resources in China and abroad were briefly reviewed, and the key technologies were discussed in detail, especially the descriptors used for data collection on germplasm resources. The prospects and challenges of such databases were also discussed. It was evident that there is an urgent need to develop databases of horticultural germplasm resources, with increasing diversity of germplasm and more user-friendly and systematic access to the databases.

  1. Programming and Technology for Accessibility in Geoscience

    Science.gov (United States)

    Sevre, E.; Lee, S.

    2013-12-01

    Many people, students and professors alike, shy away from learning to program because it is often believed to be something scary or unattainable. However, integration of programming into geoscience education can be a valuable tool for increasing the accessibility of content for all who are interested. It is my goal to dispel these myths and convince people that: 1) Students with disabilities can use programming to increase their role in the classroom, 2) Everyone can learn to write programs to simplify daily tasks, 3) With a deep understanding of the task, anyone can write a program to do a complex task, 4) Technology can be combined with programming to create an inclusive environment for all students of geoscience, and 5) More advanced knowledge of programming and technology can lead geoscientists to create software to serve as assistive technology in the classroom. It is my goal to share my experiences using technology to enhance the classroom experience as a way of addressing the aforementioned issues. Through my experience, I have found that programming skills can be included and learned by all to enhance the content of courses without detracting from curriculum. I hope that, through this knowledge, geoscience courses can become more accessible for people with disabilities by including programming and technology to the benefit of all involved.

  2. Development of SRS.php, a Simple Object Access Protocol-based library for data acquisition from integrated biological databases.

    Science.gov (United States)

    Barbosa-Silva, A; Pafilis, E; Ortega, J M; Schneider, R

    2007-12-11

    Data integration has become an important task for biological database providers. The current model for data exchange among different sources simplifies the manner in which distinct information is accessed by users. The evolution of data representation from HTML to XML enabled programs, instead of humans, to interact with biological databases. We present here SRS.php, a PHP library that can interact with the data integration system Sequence Retrieval System (SRS). The library has been written using SOAP definitions and permits programmatic communication with SRS through web services. The interactions are made possible by invoking the methods described in the WSDL and exchanging XML messages. The functions currently available in the library have been built to access specific data stored in any of 90 different databases (such as UNIPROT, KEGG and GO) using the same query syntax. The inclusion of the described functions in the source of scripts written in PHP enables them to act as web service clients to the SRS server. The functions permit one to query the whole content of any SRS database, to list specific records in these databases, to get specific fields from the records, and to link any record between any pair of linked databases. The case study presented exemplifies the use of the library to retrieve information about entries in a Plant Defense Mechanisms database. The Plant Defense Mechanisms database is currently being developed, and the proposed use of the SRS.php library is to enable data acquisition for the warehousing tasks related to its setup and maintenance.
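
    SRS.php itself is written in PHP; as a hedged illustration of the same pattern in Python, the sketch below builds a SOAP client from a WSDL description whose advertised operations become callable methods. The WSDL URL and operation name are hypothetical placeholders, not the actual SRS service.

        # Hedged sketch of the same pattern in Python: a SOAP client built from a
        # WSDL description, as SRS.php does for the SRS server. The WSDL URL and
        # operation name are hypothetical placeholders, not the actual SRS service.
        from zeep import Client

        client = Client("https://srs.example.org/soap?wsdl")   # hypothetical WSDL location
        # Operations advertised in the WSDL become callable methods on client.service.
        hits = client.service.getEntries(database="UNIPROT", query="defensin",
                                         fields="ID,Description")
        for hit in hits:
            print(hit)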

  3. Multilingual access to full text databases

    International Nuclear Information System (INIS)

    Fluhr, C.; Radwan, K.

    1990-05-01

    Many full-text databases are available in only one language or, moreover, they may contain documents in different languages. Even if the user is able to understand the language of the documents in the database, it may be easier for him to express his need in his own language. In the case of databases containing documents in different languages, it is simpler to formulate the query in one language only and to retrieve documents in different languages. This paper presents the developments and the first experiments on multilingual search, applied to the French-English pair, for text data in the nuclear field, based on the SPIRIT system. After recalling the general problems of searching full-text databases with queries formulated in natural language, we present the methods used to reformulate the queries and show how they can be expanded for multilingual search. The first results on data in the nuclear field are presented (AFCEN norms and INIS abstracts). 4 refs

  4. Language and technology literacy barriers to accessing government services

    CSIR Research Space (South Africa)

    Barnard, E

    2003-01-01

    Full Text Available A number of field experiments are done to gain an improved understanding of the extent to which citizens’ exposure to technology and home language affect their ability to access electronic services. These experiments will influence technology development...

  5. Application of an access technology delivery protocol to two children with cerebral palsy.

    Science.gov (United States)

    Mumford, Leslie; Chau, Tom

    2015-07-14

    This study further delineates the merits and limitations of the Access Technology Delivery Protocol (ATDP) through its application to two children with severe disabilities. We conducted mixed methods case studies to demonstrate the ATDP with two children with no reliable means of access to an external device. Evaluations of response efficiency, satisfaction, goal attainment, technology use and participation were made after 8 and 16 weeks of training with custom access technologies. After 16 weeks, one child's switch offered improved response efficiency, high teacher satisfaction and increased participation. The other child's switch resulted in improved satisfaction and switch effectiveness but lower overall efficiency. The latter child was no longer using his switch by the end of the study. These contrasting findings indicate that changes to any contextual factors that may impact the user's switch performance should mandate a reassessment of the access pathway. Secondly, it is important to ensure that individuals who will be responsible for switch training be identified at the outset and engaged throughout the ATDP. Finally, the ATDP should continue to be tested with individuals with severe disabilities to build an evidence base for the delivery of response efficient access solutions. Implications for Rehabilitation A data-driven, comprehensive access technology delivery protocol for children with complex communication needs could help to mitigate technology abandonment. Successful adoption of an access technology requires personalized design, training of the technology user, the teaching staff, the caregivers and other communication partners, and integration with functional activities.

  6. PDTD: a web-accessible protein database for drug target identification

    Directory of Open Access Journals (Sweden)

    Gao Zhenting

    2008-02-01

    Full Text Available Abstract Background Target identification is important for modern drug discovery. With the advances in the development of molecular docking, potential binding proteins may be discovered by docking a small molecule to a repository of proteins with three-dimensional (3D) structures. To complete this task, a reverse docking program and a drug target database with 3D structures are necessary. To this end, we have developed a web server tool, TarFisDock (Target Fishing Docking, http://www.dddc.ac.cn/tarfisdock), which has been used widely by others. Recently, we have constructed a protein target database, Potential Drug Target Database (PDTD), and have integrated PDTD with TarFisDock. This combination aims to assist target identification and validation. Description PDTD is a web-accessible protein database for in silico target identification. It currently contains >1100 protein entries with 3D structures presented in the Protein Data Bank. The data are extracted from the literature and several online databases such as TTD, DrugBank and Thomson Pharma. The database covers diverse information on >830 known or potential drug targets, including protein and active site structures in both PDB and mol2 formats, related diseases, biological functions as well as associated regulating (signaling) pathways. Each target is categorized by both nosology and biochemical function. PDTD supports keyword search functions, such as by PDB ID, target name, and disease name. Data sets generated by PDTD can be viewed with the plug-in of molecular visualization tools and can also be downloaded freely. Remarkably, PDTD is specially designed for target identification. In conjunction with TarFisDock, PDTD can be used to identify binding proteins for small molecules. The results can be downloaded in the form of a mol2 file with the binding pose of the probe compound and a list of potential binding targets according to their ranking scores. Conclusion PDTD serves as a comprehensive and

  7. The plant phenological online database (PPODB): an online database for long-term phenological data

    Science.gov (United States)

    Dierenbach, Jonas; Badeck, Franz-W.; Schaber, Jörg

    2013-09-01

    We present an online database that provides unrestricted and free access to over 16 million plant phenological observations from over 8,000 stations in Central Europe between the years 1880 and 2009. Unique features are (1) a flexible and unrestricted access to a full-fledged database, allowing for a wide range of individual queries and data retrieval, (2) historical data for Germany before 1951 ranging back to 1880, and (3) more than 480 curated long-term time series covering more than 100 years for individual phenological phases and plants combined over Natural Regions in Germany. Time series for single stations or Natural Regions can be accessed through a user-friendly graphical geo-referenced interface. The joint databases made available with the plant phenological database PPODB render accessible an important data source for further analyses of long-term changes in phenology. The database can be accessed via www.ppodb.de .

  8. Epistemonikos: a free, relational, collaborative, multilingual database of health evidence.

    Science.gov (United States)

    Rada, Gabriel; Pérez, Daniel; Capurro, Daniel

    2013-01-01

    Epistemonikos (www.epistemonikos.org) is a free, multilingual database of the best available health evidence. This paper describes the design, development and implementation of the Epistemonikos project. Using several web technologies to store systematic reviews, their included articles, overviews of reviews and structured summaries, Epistemonikos is able to provide a simple and powerful search tool to access health evidence for sound decision making. Currently, Epistemonikos stores more than 115,000 unique documents and more than 100,000 relationships between documents. In addition, since its database is translated into 9 different languages, Epistemonikos ensures that non-English speaking decision-makers can access the best available evidence without language barriers.

  9. Access 2013 bible

    CERN Document Server

    Alexander, Michael

    2013-01-01

    A comprehensive reference to the updated and new features of Access 2013 As the world's most popular database management tool, Access enables you to organize, present, analyze, and share data as well as build powerful database solutions. However, databases can be complex. That's why you need the expert guidance in this comprehensive reference. Access 2013 Bible helps you gain a solid understanding of database purpose, construction, and application so that whether you're new to Access or looking to upgrade to the 2013 version, this well-rounded resource provides you with a th

  10. Online Databases for Health Professionals

    OpenAIRE

    Marshall, Joanne Gard

    1987-01-01

    Recent trends in the marketing of electronic information technology have increased interest among health professionals in obtaining direct access to online biomedical databases such as Medline. During 1985, the Canadian Medical Association (CMA) and Telecom Canada conducted an eight-month trial of the use made of online information retrieval systems by 23 practising physicians and one pharmacist. The results of this project demonstrated both the value and the limitations of these systems in p...

  11. DOT Online Database

    Science.gov (United States)

    Web portal providing access to DOT document databases (full-text and web search), including Advisory Circulars and data collection and distribution policies. The document database website is provided by MicroSearch.

  12. Radiation technology enabled market access to Indian mango

    International Nuclear Information System (INIS)

    Sharma, Arun

    2009-01-01

    International trade in agricultural produce is subject to quarantine barriers imposed by importing countries to limit the entry of exotic pests and pathogens. Radiation technology provides an effective alternative to fumigants which are being gradually phased out. The technology has enabled market access to Indian mangoes in the US market after a gap of 18 years. The technology provides opportunity for export of other fruits and vegetables as well to countries like US, Australia and New Zealand. (author)

  13. Access to information technology and willingness to receive text ...

    African Journals Online (AJOL)

    Over the past decade, new technologies and methods of communication have ... To determine access to information technology and willingness to receive short message service (SMS) text message reminders for childhood immunisation .... Table 1 shows the attitude of the mothers towards reminders for immunisations.

  14. Harmful algal bloom historical database from Coastal waters of Florida from 01 November 1995 to 09 September 1996 (NODC Accession 0019216)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — In the later part of 1999, a relational Microsoft Access database was created to accommodate a wide range of data on the phytoplankton Karenia brevis. This database,...

  15. Factors associated with access of rural women to technology in ...

    African Journals Online (AJOL)

    A descriptive, correlational study was conducted to: describe the channels through which rural women obtain information regarding technology, and factors promoting access of women to technology; determine the contribution of technology to socio-economic development; and describe the relationships among factors ...

  16. MetIDB: A Publicly Accessible Database of Predicted and Experimental 1H NMR Spectra of Flavonoids

    NARCIS (Netherlands)

    Mihaleva, V.V.; Beek, te T.A.; Zimmeren, van F.; Moco, S.I.A.; Laatikainen, R.; Niemitz, M.; Korhonen, S.P.; Driel, van M.A.; Vervoort, J.

    2013-01-01

    Identification of natural compounds, especially secondary metabolites, has been hampered by the lack of easy to use and accessible reference databases. Nuclear magnetic resonance (NMR) spectroscopy is the most selective technique for identification of unknown metabolites. High quality 1H NMR (proton

  17. Secure, web-accessible call rosters for academic radiology departments.

    Science.gov (United States)

    Nguyen, A V; Tellis, W M; Avrin, D E

    2000-05-01

    Traditionally, radiology department call rosters have been posted via paper and bulletin boards. Frequently, changes to these lists are made by multiple people independently, but often not synchronized, resulting in confusion among the house staff and technical staff as to who is on call and when. In addition, multiple and disparate copies exist in different sections of the department, and changes made would not be propagated to all the schedules. To eliminate such difficulties, a paperless call scheduling application was developed. Our call scheduling program allowed Java-enabled web access to a database by designated personnel from each radiology section who have privileges to make the necessary changes. Once a person made a change, everyone accessing the database would see the modification. This eliminates the chaos resulting from people swapping shifts at the last minute and not having the time to record or broadcast the change. Furthermore, all changes to the database were logged. Users are given a log-in name and password and can only edit their section; however, all personnel have access to all sections' schedules. Our applet was written in Java 2 using the latest technology in database access. We access our Interbase database through the DataExpress and DB Swing (Borland, Scotts Valley, CA) components. The result is secure access to the call rosters via the web. There are many advantages to the web-enabled access, mainly the ability for people to make changes and have the changes recorded and propagated in a single virtual location and available to all who need to know.
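
    The core pattern described here, a roster edit and its audit-log entry written together so that every change is recorded and visible from one shared database, can be sketched in a few lines. The sketch below illustrates that pattern in Python with SQLite; it is not the authors' Java applet or their Interbase/DataExpress implementation, and the table and column names are invented.

```python
# Illustrative sketch of the logged-update pattern: a roster change and its
# audit-log entry are written in a single transaction so every edit is recorded
# and propagated from one shared database. Table and column names are assumptions.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("call_roster.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS roster (section TEXT, shift_date TEXT, person TEXT,
                                   PRIMARY KEY (section, shift_date));
CREATE TABLE IF NOT EXISTS change_log (changed_at TEXT, changed_by TEXT,
                                       section TEXT, shift_date TEXT,
                                       old_person TEXT, new_person TEXT);
""")

def update_shift(section, shift_date, new_person, changed_by):
    """Replace the person on call and log the change atomically."""
    with conn:  # one transaction: either both writes happen or neither does
        row = conn.execute("SELECT person FROM roster WHERE section=? AND shift_date=?",
                           (section, shift_date)).fetchone()
        old_person = row[0] if row else None
        conn.execute("INSERT OR REPLACE INTO roster VALUES (?,?,?)",
                     (section, shift_date, new_person))
        conn.execute("INSERT INTO change_log VALUES (?,?,?,?,?,?)",
                     (datetime.now(timezone.utc).isoformat(), changed_by,
                      section, shift_date, old_person, new_person))

update_shift("neuroradiology", "2000-05-01", "Dr. Smith", "chief_resident")
```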

  18. MammoGrid: a mammography database

    CERN Multimedia

    2002-01-01

    What would be the advantages if physicians around the world could gain access to a unique mammography database? The answer may come from MammoGrid, a three-year project under the Fifth Framework Programme of the EC. Led by CERN, MammoGrid involves the UK (the Universities of Oxford, Cambridge and the West of England, Bristol, plus the company Mirada Solutions of Oxford), and Italy (the Universities of Pisa and Sassari and the Hospitals in Udine and Torino). The aim of the project is, in light of emerging GRID technology, to develop a Europe-wide database of mammograms. The database will be used to investigate a set of important healthcare applications as well as the potential of the GRID to enable healthcare professionals throughout the EU to work together effectively. The contributions of the partners include building the GRID-database infrastructure, developing image processing and Computer Aided Detection techniques, and making the clinical evaluation. The first project meeting took place at CERN in Sept...

  19. The European Fusion Material properties database

    Energy Technology Data Exchange (ETDEWEB)

    Karditsas, P.J. [UKAEA Fusion, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)]. E-mail: panos.karditsas@ukaea.org.uk; Lloyd, G. [Tessella Support Services plc, 3 Vineyard Chambers, Abingdon OX14 3PX (United Kingdom); Walters, M. [Tessella Support Services plc, 3 Vineyard Chambers, Abingdon OX14 3PX (United Kingdom); Peacock, A. [EFDA Close Support Unit, Garching D-85748 (Germany)

    2006-02-15

    Materials research represents a significant part of the European and world effort on fusion research. A European Fusion Materials web-based relational database is being developed to collect, expand and preserve for the future the data produced in support of the NET, DEMO and ITER. The database allows understanding of material properties and their critical parameters for fusion environments. The system uses J2EE technologies and the PostgreSQL relational database, and flexibility ensures that new methods to automate material design for specific applications can be easily implemented. It runs on a web server and allows users access via the Internet using their preferred web browser. The database allows users to store, browse and search raw tests, material properties and qualified data, and electronic reports. For data security, users are issued with individual accounts, and the origin of all requests is checked against a list of trusted sites. Different user accounts have access to different datasets to ensure the data is not shared unintentionally. The system allows several levels of data checking/cleaning and validation. Data insertion is either online or through downloaded templates, and validation is through different expert groups, which can apply different criteria to the data.

  20. Modern Hardware Technologies and Software Techniques for On-Line Database Storage and Access.

    Science.gov (United States)

    1985-12-01

    ... of the information in a message narrative. This method employs artificial intelligence techniques to extract information. In simplest terms, an ... distribution (tape replacement) systems, database distribution, on-line mass storage, videogame ROM (juke-box) ... media cost ... great intelligence for the analyst would be required. If, on the other hand, a sentence analysis scheme simple enough for the low-level

  1. Open-access MIMIC-II database for intensive care research.

    Science.gov (United States)

    Lee, Joon; Scott, Daniel J; Villarroel, Mauricio; Clifford, Gari D; Saeed, Mohammed; Mark, Roger G

    2011-01-01

    The critical state of intensive care unit (ICU) patients demands close monitoring, and as a result a large volume of multi-parameter data is collected continuously. This represents a unique opportunity for researchers interested in clinical data mining. We sought to foster a more transparent and efficient intensive care research community by building a publicly available ICU database, namely Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II). The data harnessed in MIMIC-II were collected from the ICUs of Beth Israel Deaconess Medical Center from 2001 to 2008 and represent 26,870 adult hospital admissions (version 2.6). MIMIC-II consists of two major components: clinical data and physiological waveforms. The clinical data, which include patient demographics, intravenous medication drip rates, and laboratory test results, were organized into a relational database. The physiological waveforms, including 125 Hz signals recorded at bedside and corresponding vital signs, were stored in an open-source format. MIMIC-II data were also deidentified in order to remove protected health information. Any interested researcher can gain access to MIMIC-II free of charge after signing a data use agreement and completing human subjects training. MIMIC-II can support a wide variety of research studies, ranging from the development of clinical decision support algorithms to retrospective clinical studies. We anticipate that MIMIC-II will be an invaluable resource for intensive care research by stimulating fair comparisons among different studies.

  2. Handling of network and database instabilities in CORAL

    International Nuclear Information System (INIS)

    Trentadue, R; Valassi, A; Kalkhof, A

    2012-01-01

    The CORAL software is widely used by the LHC experiments for storing and accessing data using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several back-ends and deployment models, direct client access to Oracle servers being one of the most important use cases. Since 2010, several problems have been reported by the LHC experiments in their use of Oracle through CORAL, involving application errors, hangs or crashes after the network or the database servers became temporarily unavailable. CORAL already provided some level of handling of these instabilities, which are due to external causes and cannot be avoided, but this proved to be insufficient in some cases and to be itself the cause of other problems, such as the hangs and crashes mentioned before, in other cases. As a consequence, a major redesign of the CORAL plugins has been implemented, with the aim of making the software more robust against these database and network glitches. The new implementation ensures that CORAL automatically reconnects to Oracle databases in a transparent way whenever possible and gently terminates the application when this is not possible. Internally, this is done by resetting all relevant parameters of the underlying back-end technology (OCI, the Oracle Call Interface). This presentation reports on the status of this work at the time of the CHEP2012 conference, covering the design and implementation of these new features and the outlook for future developments in this area.
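
    The reconnection behaviour described above can be illustrated generically: catch the error raised when the server or network drops, reset the connection state, retry, and terminate gently if retries are exhausted. The sketch below shows that pattern in Python with SQLite standing in for the Oracle client; it is an illustration of the idea, not CORAL's C++/OCI implementation.

```python
# Generic sketch of the reconnection pattern: when a query fails because the
# network or server became temporarily unavailable, the connection is reset and
# the operation retried transparently; if reconnection keeps failing, the
# application is terminated gently instead of hanging.
import sys
import time
import sqlite3  # stand-in for an Oracle client library

class ReconnectingSession:
    def __init__(self, dsn, max_retries=3, backoff_s=2.0):
        self.dsn, self.max_retries, self.backoff_s = dsn, max_retries, backoff_s
        self.conn = sqlite3.connect(dsn)

    def _reconnect(self):
        try:
            self.conn.close()
        except Exception:
            pass
        self.conn = sqlite3.connect(self.dsn)  # reset all connection state

    def execute(self, sql, params=()):
        for attempt in range(self.max_retries + 1):
            try:
                return self.conn.execute(sql, params).fetchall()
            except sqlite3.OperationalError:
                if attempt == self.max_retries:
                    # gentle termination instead of a hang or crash
                    sys.exit("database unavailable, giving up after retries")
                time.sleep(self.backoff_s * (attempt + 1))
                self._reconnect()
```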

  3. User Management with LDAP(Light weight Directory Access Protocolfor access to technology and Information Services in Companies

    Directory of Open Access Journals (Sweden)

    José Teodoro Mejía Viteri

    2016-08-01

    Full Text Available This research analyses the management of information services and users with LDAP (Lightweight Directory Access Protocol) and its interaction with a company's other technology services, allowing them to be accessed with a single user name and password. A literature review was used to collect information on the LDAP service and its ability to make a user directory interact with open source technology services, as well as with Windows Server and the Active Directory service that companies use for ease of management and access to resources on Windows clients. The aim is to provide an alternative for implementing each of the services required by public and private companies with freely usable tools, where management and administration can be carried out by integrating or synchronizing those services with the LDAP directory.
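
    A minimal sketch of the single-sign-on lookup described above, using the Python ldap3 library: the application binds to the directory with the user's single set of credentials and then reads the attributes other services would share. The host name, base DN and attribute names are placeholders.

```python
# Minimal sketch, using the ldap3 library, of authenticating a user and looking up
# the directory entry that other services would share; host, base DN and attribute
# names are placeholders for this illustration.
from ldap3 import Server, Connection, ALL

server = Server("ldap://ldap.example.org", get_info=ALL)

# Bind (authenticate) as the user: one user name and password for all services.
conn = Connection(server,
                  user="uid=jdoe,ou=people,dc=example,dc=org",
                  password="secret",
                  auto_bind=True)

# Look up the attributes that Windows clients or other applications would consume.
conn.search(search_base="ou=people,dc=example,dc=org",
            search_filter="(uid=jdoe)",
            attributes=["cn", "mail", "memberOf"])
for entry in conn.entries:
    print(entry.entry_dn, entry.cn, entry.mail)
conn.unbind()
```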

  4. Relational databases for conditions data and event selection in ATLAS

    International Nuclear Information System (INIS)

    Viegas, F; Hawkings, R; Dimitrov, G

    2008-01-01

    The ATLAS experiment at LHC will make extensive use of relational databases in both online and offline contexts, running to O(TBytes) per year. Two of the most challenging applications in terms of data volume and access patterns are conditions data, making use of the LHC conditions database, COOL, and the TAG database, that stores summary event quantities allowing a rapid selection of interesting events. Both of these databases are being replicated to regional computing centres using Oracle Streams technology, in collaboration with the LCG 3D project. Database optimisation, performance tests and first user experience with these applications will be described, together with plans for first LHC data-taking and future prospects

  5. Relational databases for conditions data and event selection in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Viegas, F; Hawkings, R; Dimitrov, G [CERN, CH-1211 Geneve 23 (Switzerland)

    2008-07-15

    The ATLAS experiment at LHC will make extensive use of relational databases in both online and offline contexts, running to O(TBytes) per year. Two of the most challenging applications in terms of data volume and access patterns are conditions data, making use of the LHC conditions database, COOL, and the TAG database, that stores summary event quantities allowing a rapid selection of interesting events. Both of these databases are being replicated to regional computing centres using Oracle Streams technology, in collaboration with the LCG 3D project. Database optimisation, performance tests and first user experience with these applications will be described, together with plans for first LHC data-taking and future prospects.

  6. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    Science.gov (United States)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object- oriented semantic association method to model information found in different databases into an integrated conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without attacking the autonomy of the underlying databases.

  7. Download - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Download page for the Trypanosomes Database. First of all, please read the license of this database. Data ... (1.4 KB). Simple search and download, or download via FTP; the FTP server is sometimes jammed, and if it is, access [here].

  8. Scale out databases for CERN use cases

    International Nuclear Information System (INIS)

    Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Garcia, Daniel Lanza; Surdy, Kacper

    2015-01-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database. (paper)
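
    A hedged sketch of the kind of query performance test described above, using the impyla client to run an aggregation on an Impala engine; the host, table and column names are assumptions for illustration, not the actual CERN setup.

```python
# Sketch of a timed aggregation query against a scale-out SQL-on-Hadoop engine,
# using the impyla client for Impala; host, table and column names are placeholders.
import time
from impala.dbapi import connect

conn = connect(host="impala.example.org", port=21050)
cur = conn.cursor()

query = """
SELECT variable_name, COUNT(*) AS n, AVG(value) AS mean_value
FROM accelerator_log
WHERE ts BETWEEN '2015-01-01' AND '2015-02-01'
GROUP BY variable_name
"""
start = time.time()
cur.execute(query)
rows = cur.fetchall()
print(f"{len(rows)} groups in {time.time() - start:.1f} s")
```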

  9. LINQ The Future of Data Access in C# 30

    CERN Document Server

    Hummel, Joe

    2006-01-01

    Language Integrated Query (LINQ) is Microsoft's new technology for powerful, general purpose data access. This technology provides a fully-integrated query language, available in both C# 3.0 and VB 9.0, for high-level data access against objects, relational databases, and XML documents. In this Short Cut you'll learn about LINQ and the proposed C# 3.0 extensions that support it. You'll also see how you can use LINQ and C# to accomplish a variety of tasks, from querying objects to accessing relational data and XML. Best of all, you'll be able to test the examples and run your own code using t

  10. Technology in the Public Library: Results from the 1992 PLDS Survey of Technology.

    Science.gov (United States)

    Fidler, Linda M.; Johnson, Debra Wilcox

    1994-01-01

    Discusses and compares the incorporation of technology by larger public libraries in Canada and the United States. Technology mentioned includes online public access catalogs; remote and local online database searching; microcomputers and software for public use; and fax, voice mail, and Telecommunication Devices for the Deaf and Teletype writer…

  11. Ethernet access network based on free-space optic deployment technology

    Science.gov (United States)

    Gebhart, Michael; Leitgeb, Erich; Birnbacher, Ulla; Schrotter, Peter

    2004-06-01

    The satisfaction of all communication needs from single households and business companies over a single access infrastructure is probably the most challenging topic in communications technology today. But even though the so-called "Last Mile Access Bottleneck" has been well known for more than ten years and many distribution technologies have been tried out, the optimal solution has not yet been found and paying commercial access networks offering all service classes are still rare today. Conventional services like telephone, radio and TV, as well as new and emerging services like email, web browsing, online gaming, video conferences, business data transfer or external data storage, can all be transmitted over the well known and cost effective Ethernet networking protocol standard. Key requirements for the deployment technology, driven by the different services, are high data rates to the single customer, security, moderate deployment costs and good scalability to the number and density of users, quick and flexible deployment without legal impediments, and high availability, referring to the properties of optical and wireless communication. We demonstrate all elements of an Ethernet Access Network based on Free Space Optic distribution technology. The main physical parts are the Central Office, the Distribution Network and the Customer Equipment. Transmission of different services, as well as configuration, service upgrades and remote control of the network, are handled by networking features over one FSO connection. All parts of the network are proven, commercially available technology. The setup is flexible and can be adapted to any more specific need if required.

  12. Stay in the Box! Embedded Assistive Technology Improves Access for Students with Disabilities

    Directory of Open Access Journals (Sweden)

    Katherine Koch

    2017-11-01

    Full Text Available Assistive technology is not only a required component of a student’s IEP; it can be an effective way to help students with (and without) disabilities access their education and to provide them with required instructional accommodations. Teachers, however, are often not adequately prepared in their pre-service course work and ongoing professional development to address the technology needs of their special education students and have not had the opportunity to access technology due to limited availability and cost. While assistive technology can be purchased to augment an existing computer, it is often unnecessary to do that. Both Microsoft and Apple operating systems in “off-the-shelf” computers and handheld devices have embedded assistive technology that is easy to access and easy to use. This embedded technology can help teachers become familiar with technology and assist students with sensory, physical, learning, and attention disabilities, and it might have practical applications with Universal Design for Learning. This paper provides a discussion on how embedded technology can support students with disabilities in the school setting and provides examples for access and use.

  13. SWS: accessing SRS sites contents through Web Services.

    Science.gov (United States)

    Romano, Paolo; Marra, Domenico

    2008-03-26

    Web Services and Workflow Management Systems can support the creation and deployment of network systems able to automate data analysis and retrieval processes in biomedical research. Web Services have been implemented at bioinformatics centres and workflow systems have been proposed for biological data analysis. New databanks are often developed by taking these technologies into account, but many existing databases do not allow programmatic access. Only a fraction of available databanks can thus be queried through programmatic interfaces. SRS is a well-known indexing and search engine for biomedical databanks offering public access to many databanks and analysis tools. Unfortunately, these data are not easily and efficiently accessible through Web Services. We have developed 'SRS by WS' (SWS), a tool that makes information available in SRS sites accessible through Web Services. Information on known sites is maintained in a database, srsdb. SWS consists of a suite of WS that can query both srsdb, for information on sites and databases, and SRS sites. SWS returns results in a text-only format and can be accessed through a WSDL compliant client. SWS enables interoperability between workflow systems and SRS implementations, by also managing access to alternative sites, in order to cope with network and maintenance problems, and selecting the most up-to-date among available systems. The development and implementation of Web Services, allowing programmatic access to an exhaustive set of biomedical databases, can significantly improve the automation of in silico analysis. SWS supports this activity by making biological databanks that are managed in public SRS sites available through a programmatic interface.

  14. Impacts of extension access and cooperative membership on technology adoption and household welfare.

    Science.gov (United States)

    Wossen, Tesfamicheal; Abdoulaye, Tahirou; Alene, Arega; Haile, Mekbib G; Feleke, Shiferaw; Olanrewaju, Adetunji; Manyong, Victor

    2017-08-01

    This paper examines the impacts of access to extension services and cooperative membership on technology adoption, asset ownership and poverty using household-level data from rural Nigeria. Using different matching techniques and endogenous switching regression approach, we find that both extension access and cooperative membership have a positive and statistically significant effect on technology adoption and household welfare. Moreover, we find that both extension access and cooperative membership have heterogeneous impacts. In particular, we find evidence of a positive selection as the average treatment effects of extension access and cooperative membership are higher for farmers with the highest propensity to access extension and cooperative services. The impact of extension services on poverty reduction and of cooperatives on technology adoption is significantly stronger for smallholders with access to formal credit than for those without access. This implies that expanding rural financial markets can maximize the potential positive impacts of extension and cooperative services on farmers' productivity and welfare.

  15. Access control based on attribute certificates for medical intranet applications.

    Science.gov (United States)

    Mavridis, I; Georgiadis, C; Pangalos, G; Khair, M

    2001-01-01

    Clinical information systems frequently use intranet and Internet technologies. However these technologies have emphasized sharing and not security, despite the sensitive and private nature of much health information. Digital certificates (electronic documents which recognize an entity or its attributes) can be used to control access in clinical intranet applications. To outline the need for access control in distributed clinical database systems, to describe the use of digital certificates and security policies, and to propose the architecture for a system using digital certificates, cryptography and security policy to control access to clinical intranet applications. We have previously developed a security policy, DIMEDAC (Distributed Medical Database Access Control), which is compatible with emerging public key and privilege management infrastructure. In our implementation approach we propose the use of digital certificates, to be used in conjunction with DIMEDAC. Our proposed access control system consists of two phases: the ways users gain their security credentials; and how these credentials are used to access medical data. Three types of digital certificates are used: identity certificates for authentication; attribute certificates for authorization; and access-rule certificates for propagation of access control policy. Once a user is identified and authenticated, subsequent access decisions are based on a combination of identity and attribute certificates, with access-rule certificates providing the policy framework. Access control in clinical intranet applications can be successfully and securely managed through the use of digital certificates and the DIMEDAC security policy.

  16. Web geoprocessing services on GML with a fast XML database ...

    African Journals Online (AJOL)

    Nowadays there exist quite a lot of Spatial Database Infrastructures (SDI) that facilitate the Geographic Information Systems (GIS) user community in getting access to distributed spatial data through web technology. However, sometimes the users first have to process available spatial data to obtain the needed information.

  17. New perspectives in toxicological information management, and the role of ISSTOX databases in assessing chemical mutagenicity and carcinogenicity.

    Science.gov (United States)

    Benigni, Romualdo; Battistelli, Chiara Laura; Bossa, Cecilia; Tcheremenskaia, Olga; Crettaz, Pierre

    2013-07-01

    Currently, the public has access to a variety of databases containing mutagenicity and carcinogenicity data. These resources are crucial for the toxicologists and regulators involved in the risk assessment of chemicals, which necessitates access to all the relevant literature, and the capability to search across toxicity databases using both biological and chemical criteria. Towards the larger goal of screening chemicals for a wide range of toxicity end points of potential interest, publicly available resources across a large spectrum of biological and chemical data space must be effectively harnessed with current and evolving information technologies (i.e. systematised, integrated and mined), if long-term screening and prediction objectives are to be achieved. A key to rapid progress in the field of chemical toxicity databases is that of combining information technology with the chemical structure as identifier of the molecules. This permits an enormous range of operations (e.g. retrieving chemicals or chemical classes, describing the content of databases, finding similar chemicals, crossing biological and chemical interrogations, etc.) that other more classical databases cannot allow. This article describes the progress in the technology of toxicity databases, including the concepts of Chemical Relational Database and Toxicological Standardized Controlled Vocabularies (Ontology). Then it describes the ISSTOX cluster of toxicological databases at the Istituto Superiore di Sanitá. It consists of freely available databases characterised by the use of modern information technologies and by curation of the quality of the biological data. Finally, this article provides examples of analyses and results made possible by ISSTOX.

  18. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    International Nuclear Information System (INIS)

    Valassi, A; Kalkhof, A; Bartoldus, R; Salnikov, A; Wache, M

    2011-01-01

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier 'CORAL server' deployed close to the database and a tree of 'CORAL server proxies', providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.

  19. ACCESSING FEDERAL DATA BASES FOR CONTAMINATED SITE CLEAN-UP TECHNOLOGIES

    Science.gov (United States)

    The Federal Remediation Technologies Roundtable (Roundtable) developed this publication to provide information on accessing Federal data bases that contain data on innovative remediation technologies. The Roundtable includes representatives from the Department of Defense (DoD), En...

  20. Unified Access Architecture for Large-Scale Scientific Datasets

    Science.gov (United States)

    Karna, Risav

    2014-05-01

    Data-intensive sciences have to deploy diverse large scale database technologies for data analytics, as scientists are now dealing with much larger data volumes than ever before. While array databases have bridged many gaps between the needs of data-intensive research fields and DBMS technologies (Zhang 2011), invocation of the other big data tools accompanying these databases is still manual and separate from the database management interface. We identify this as an architectural challenge that will increasingly complicate the user's work flow owing to the growing number of useful but isolated and niche database tools. Such use of data analysis tools in effect leaves the burden on the user's end to synchronize the results from other data manipulation and analysis tools with the database management system. To this end, we propose a unified access interface for using big data tools within a large scale scientific array database, using the database queries themselves to embed foreign routines belonging to the big data tools. Such an invocation of foreign data manipulation routines inside a query to a database can be made possible through a user-defined function (UDF). UDFs that allow such levels of freedom as to call modules from another language and interface back and forth between the query body and the side-loaded functions would be needed for this purpose. For the purpose of this research we attempt the coupling of four widely used tools, Hadoop (hadoop1), Matlab (matlab1), R (r1) and ScaLAPACK (scalapack1), with the UDF feature of rasdaman (Baumann 98), an array-based data manager, for investigating this concept. The native array data model used by an array-based data manager provides compact data storage and high performance operations on ordered data such as spatial data, temporal data, and matrix-based data for linear algebra operations (scidbusr1). Performance issues arising due to the coupling of tools with different paradigms, niche functionalities, separate processes and output
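
    The central mechanism discussed here, registering a routine written in a host language with the database engine so it can be invoked from inside a query, can be illustrated generically. The sketch below uses SQLite's Python UDF interface purely as a stand-in for the concept; it is not rasdaman's UDF feature or the coupling to Hadoop, Matlab, R or ScaLAPACK.

```python
# Generic illustration of the user-defined-function (UDF) idea: a routine written
# in the host language is registered with the database engine and then invoked
# from inside a query, so external analysis logic runs "within" the query itself.
import math
import sqlite3

def log_flux(value):
    """Side-loaded analysis routine callable from SQL."""
    return math.log10(value) if value and value > 0 else None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pixels (ra REAL, dec REAL, flux REAL)")
conn.executemany("INSERT INTO pixels VALUES (?,?,?)",
                 [(10.1, -5.2, 120.0), (10.2, -5.3, 0.0), (10.3, -5.1, 340.0)])

# Register the foreign routine under a SQL-visible name, then call it in a query.
conn.create_function("log_flux", 1, log_flux)
for row in conn.execute("SELECT ra, dec, log_flux(flux) FROM pixels"):
    print(row)
```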

  1. Access to the Arts through Assistive Technology.

    Science.gov (United States)

    Frame, Charles

    Personnel in the rehabilitation field have come to recognize the possibilities and implications of computers as assistive technology for disabled persons. This manual provides information on how to adapt the Unicorn Board, Touch Talker/Light Talker overlays, the Adaptive Firmware Card setup disk, and Trace-Transparent Access Module (T-TAM) to…

  2. Techno-economic evaluation of broadband access technologies

    DEFF Research Database (Denmark)

    Sigurdsson, Halldór Matthias; Skouby, Knud Erik

    2005-01-01

    Broadband for all is an essential element in the EU policy concerning the future of ICT-based society. The overall purpose of this paper is to present a model for evaluation of different broadband access technologies and to present some preliminary results based on the model that has been carried...

  3. 8th Cambridge Workshop on Universal Access and Assistive Technology

    CERN Document Server

    Lazar, Jonathan; Heylighen, Ann; Dong, Hua

    2016-01-01

    This book presents the proceedings of the 8th Cambridge Workshop on Universal Access and Assistive Technology (CWUAAT '14), incorporating the 11th Cambridge Workshop on Rehabilitation Robotics, held in Cambridge, England in March 2016. It presents novel and state-of-the-art research from an international group of leaders in the fields of universal access and assistive technology. It explores various issues including the reconciliation of usability, accessibility and inclusive design, the design of inclusive assistive and rehabilitation systems, measuring product demand and human capabilities, data mining and visualizing inclusion, legislation in inclusive design, and situational inclusive interfaces (automotive and aerospace). This book provides an invaluable resource to researchers, postgraduates, design practitioners, therapists and clinical practitioners, as well as design teachers.

  4. Constructing an XML database of linguistics data

    Directory of Open Access Journals (Sweden)

    J H Kroeze

    2010-04-01

    Full Text Available A language-oriented, multi-dimensional database of the linguistic characteristics of the Hebrew text of the Old Testament can enable researchers to do ad hoc queries. XML is a suitable technology to transform free text into a database. A clause’s word order can be kept intact while other features such as syntactic and semantic functions can be marked as elements or attributes. The elements or attributes from the XML “database” can be accessed and processed by a 4th generation programming language, such as Visual Basic. XML is explored as an option to build an exploitable database of linguistic data by representing inherently multi-dimensional data, including syntactic and semantic analyses of free text.
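
    The approach can be sketched briefly: the clause stays in document order while syntactic and semantic functions are attached as attributes, and a host language then runs ad hoc queries over them. The example below uses Python's ElementTree rather than Visual Basic, and the element and attribute names are invented for illustration.

```python
# Small sketch of the idea above: a clause kept in its original word order, with
# syntactic and semantic functions stored as XML attributes, then queried from a
# host language. Element and attribute names are invented for illustration.
import xml.etree.ElementTree as ET

clause_xml = """
<clause id="Gen1:1a">
  <word order="1" syntax="predicate" semantics="action">bara</word>
  <word order="2" syntax="subject"   semantics="agent">elohim</word>
  <word order="3" syntax="object"    semantics="patient">et hashamayim</word>
</clause>
"""

clause = ET.fromstring(clause_xml)

# Ad hoc query: list every word functioning as a subject, keeping word order intact.
for word in clause.findall("word"):
    if word.get("syntax") == "subject":
        print(word.get("order"), word.text)
```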

  5. Meta-Language Support for Type-Safe Access to External Resources

    NARCIS (Netherlands)

    M.A. Hills (Mark); P. Klint (Paul); J.J. Vinju (Jurgen); K. Czarnecki; G. Hedin

    2012-01-01

    textabstractMeta-programming applications often require access to heterogeneous sources of information, often from different technological spaces (grammars, models, ontologies, databases), that have specialized ways of defining their respective data schemas. Without direct language support,

  6. SierraDNA – Demonstrating the Usefulness of Direct ILS Database Access

    Directory of Open Access Journals (Sweden)

    James Padgett

    2015-10-01

    Full Text Available Innovative Interfaces’ Sierra™ Integrated Library System (ILS) brings with it a Database Navigator Application (SierraDNA); in layman's terms, SierraDNA gives Sierra sites read access to their ILS database. Unlike the closed use cases produced by vendor supplied APIs, which restrict libraries to limited development opportunities, SierraDNA enables sites to publish their own APIs and scripts based upon custom SQL code to meet their own needs and those of their users and processes. In this article we give examples showing how SierraDNA can be utilized to improve library services. We highlight three example use cases which have benefited our users, enhanced online security and improved our back office processes. In the first use case we employ user access data from our electronic resources proxy server (WAM) to detect hacked user accounts. Three scripts are used in conjunction to flag user accounts which are being hijacked to systematically steal content from our electronic resource provider’s websites. In the second we utilize the reading histories of our users to augment our search experience with an Amazon style “People who borrowed this book also borrowed…these books” feature. Two scripts are used together to determine which other items were borrowed by borrowers of the item currently of interest. And lastly, we use item holds data to improve our acquisitions workflow through an automated demand based ordering process. Our explanation and SQL code should be of direct use for adoption or as examples for other Sierra customers willing to exploit their ILS data in similar ways, but the principles may also be useful to non-Sierra sites that wish to enhance security, improve user services or improve back office processes.
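
    The second use case can be sketched as a single self-join over a reading-history table: borrowers of the seed title are matched to everything else they borrowed, and the co-borrowed titles are ranked by how many distinct borrowers they share. The table and column names below are assumptions for illustration, not Sierra's actual schema, and SQLite stands in for the ILS database.

```python
# Hedged sketch of the "people who borrowed this also borrowed..." idea: a direct
# SQL self-join over reading-history rows. Table and column names (reading_history,
# patron_id, bib_id) are assumptions, not Sierra's actual schema.
import sqlite3

conn = sqlite3.connect("ils.db")

ALSO_BORROWED_SQL = """
SELECT other.bib_id, COUNT(DISTINCT other.patron_id) AS borrowers
FROM reading_history AS seed
JOIN reading_history AS other
  ON other.patron_id = seed.patron_id
 AND other.bib_id   <> seed.bib_id
WHERE seed.bib_id = ?
GROUP BY other.bib_id
ORDER BY borrowers DESC
LIMIT 10
"""

def also_borrowed(bib_id):
    """Return the ten titles most often borrowed by borrowers of bib_id."""
    return conn.execute(ALSO_BORROWED_SQL, (bib_id,)).fetchall()
```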

  7. An access technology delivery protocol for children with severe and multiple disabilities: a case demonstration.

    Science.gov (United States)

    Mumford, Leslie; Lam, Rachel; Wright, Virginia; Chau, Tom

    2014-08-01

    This study applied response efficiency theory to create the Access Technology Delivery Protocol (ATDP), a child and family-centred collaborative approach to the implementation of access technologies. We conducted a descriptive, mixed methods case study to demonstrate the ATDP method with a 12-year-old boy with no reliable means of access to an external device. Evaluations of response efficiency, satisfaction, goal attainment, technology use and participation were made after 8 and 16 weeks of training with a custom smile-based access technology. At the 16 week mark, the new access technology offered better response quality; teacher satisfaction was high; average technology usage was 3-4 times per week for up to 1 h each time; switch sensitivity and specificity reached 78% and 64%, respectively, and participation scores increased by 38%. This case supports further development and testing of the ATDP with additional children with multiple or severe disabilities.

  8. The Computer Catalog: A Democratic or Authoritarian Technology?

    Science.gov (United States)

    Adams, Judith A.

    1988-01-01

    Discussion of consequences of library automation argues that technology should be used to augment access to information. Online public access catalogs are considered in this context, along with several related issues such as system incompatibility, invasion of privacy, barriers to database access and manipulation, and user fees, which contribute…

  9. A parallel model for SQL astronomical databases based on solid state storage. Application to the Gaia Archive PostgreSQL database

    Science.gov (United States)

    González-Núñez, J.; Gutiérrez-Sánchez, R.; Salgado, J.; Segovia, J. C.; Merín, B.; Aguado-Agelet, F.

    2017-07-01

    Query planning and optimisation algorithms in most popular relational databases were developed at a time when hard disk drives were the only storage technology available. The advent of devices with higher parallel random access capacity, such as solid state disks, opens up the way for intra-machine parallel computing over large datasets. We describe a two-phase parallel model for the implementation of heavy analytical processes in single-instance PostgreSQL astronomical databases. This model is particularised to address two frequent astronomical problems, density maps and crossmatch computation with Quad Tree Cube (Q3C) indexes. They are implemented as part of the relational database infrastructure for the Gaia Archive and performance is assessed. An improvement by a factor of 28.40 in comparison to sequential execution is observed in the reference implementation for a histogram computation. Speedup ratios of 3.7 and 4.0 are attained for the reference positional crossmatches considered. We observe large performance enhancements over sequential execution for both CPU and disk access intensive computations, suggesting these methods might be useful with the growing data volumes in Astronomy.
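
    A rough sketch of the two-phase idea for the density-map case: the sky is split into declination stripes, each worker computes a partial histogram through its own PostgreSQL connection (phase one), and the partial results are merged in the client (phase two). The connection string, table and column names are assumptions, and this simplified layout is not the paper's reference implementation.

```python
# Two-phase parallel sketch: workers scan disjoint declination stripes in parallel
# (phase 1) and their partial density maps are merged afterwards (phase 2).
# Connection string, table and column names are placeholders for illustration.
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

import psycopg2

DSN = "dbname=gaia user=archive"          # placeholder connection string
STRIPE_SQL = """
SELECT floor(ra)::int AS ra_bin, floor(dec)::int AS dec_bin, COUNT(*)
FROM gaia_source
WHERE dec >= %s AND dec < %s
GROUP BY 1, 2
"""

def partial_density(dec_range):
    """Phase 1: one worker scans one declination stripe."""
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.execute(STRIPE_SQL, dec_range)
        return Counter({(ra, dec): n for ra, dec, n in cur.fetchall()})

if __name__ == "__main__":
    stripes = [(d, d + 10) for d in range(-90, 90, 10)]
    with ProcessPoolExecutor(max_workers=8) as pool:
        partials = pool.map(partial_density, stripes)
    density_map = sum(partials, Counter())   # phase 2: merge partial maps
    print(len(density_map), "occupied 1-degree cells")
```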

  10. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  11. Access and Use of E-Resources by the Lecturers and Students of ...

    African Journals Online (AJOL)

    This study examines access to and use of various e-resource databases at Modibbo Adama University of Technology, Yola. The study specifically highlights the preferences and importance of online resources among the lecturers and students of the School of Management and Information Technology of the university.

  12. Scale out databases for CERN use cases

    CERN Document Server

    Baranowski, Zbigniew; Canali, Luca; Garcia, Daniel Lanza; Surdy, Kacper

    2015-01-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log dat...

  13. Sources of inter-firm heterogeneity in accessing knowledge-creation benefits within technology clusters

    NARCIS (Netherlands)

    Arikan, A.; Knoben, J.

    2014-01-01

    We build on recent literature to highlight the distinction between knowledge-diffusion and knowledge-creation benefits of technology clustering and argue that firms located in technology clusters will have differential access to the latter. To explain the antecedents of such differential access, we

  14. Information and communication technology resources access and ...

    African Journals Online (AJOL)

    The ability to undertake effective legal research is one of the skills required of a lawyer but ... The use of information and communication technology by Nigerian lawyers deals with ...

  15. Broadband Optical Access Technologies to Converge towards a Broadband Society in Europe

    Science.gov (United States)

    Coudreuse, Jean-Pierre; Pautonnier, Sophie; Lavillonnière, Eric; Didierjean, Sylvain; Hilt, Benoît; Kida, Toshimichi; Oshima, Kazuyoshi

    This paper provides insights into the status of the broadband optical access market and technologies in Europe and into the expected trends for next generation optical access networks. The final target for most operators, cities or any other player is of course FTTH (Fibre To The Home) deployment, although we can expect intermediate steps with copper or wireless technologies. Of the two candidate architectures for FTTH, PON (Passive Optical Network) is by far the most attractive and cost effective solution. We also demonstrate that an Ethernet based optical access network is well suited to all-IP networks without any impact on the level of quality of service. Finally, we provide feedback from an FTTH pilot network in Colmar (France) based on Gigabit Ethernet PON technology. The interest of this pilot lies in the level of functionality required for broadband optical access networks but also in the development of new home network configurations.

  16. VATE: VAlidation of high TEchnology based on large database analysis by learning machine

    NARCIS (Netherlands)

    Meldolesi, E; Van Soest, J; Alitto, A R; Autorino, R; Dinapoli, N; Dekker, A; Gambacorta, M A; Gatta, R; Tagliaferri, L; Damiani, A; Valentini, V

    2014-01-01

    The interaction between the implementation of new technologies and different outcomes can allow a broad range of research to be expanded. The purpose of this paper is to introduce the VAlidation of high TEchnology based on large database analysis by learning machine (VATE) project that aims to combine

  17. Managing Database Services: An Approach Based in Information Technology Services Availabilty and Continuity Management

    Directory of Open Access Journals (Sweden)

    Leonardo Bastos Pontes

    2017-01-01

    Full Text Available This paper is set in the context of information technology services management, with some ideas from information technology governance, and proposes a hybrid model to manage database services, based on the principles of information technology services management, in a supplementary health operator. The approach draws on fundamental elements of service management guides such as CMMI for Services, COBIT, ISO 20000, ITIL and MPS.BR for Services, and studies Availability Management and Continuity Management jointly, as most of these guides also do. The work is important because it keeps data flowing well in the database and improves the agility of the systems used in the clinics accredited by the health plan.

  18. 10 CFR 603.875 - Foreign access to technology and U.S. competitiveness provisions.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy, vol. 4, 2010-01-01: Foreign access to technology and U.S. competitiveness provisions. (a) Consistent with the objective of enhancing national security and United States competitiveness by increasing the public's reliance on the...

  19. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    The KALIMER Database is an advanced database for integrated management of Liquid Metal Reactor Design Technology Development using web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results of phase II of Liquid Metal Reactor Design Technology Development under the mid-term and long-term nuclear R and D programme. IOC is a linkage control system between sub-projects to share and integrate the research results for KALIMER. The 3D CAD Database provides a schematic design overview of KALIMER. The Team Cooperation System informs team members of research cooperation activities and meetings. Finally, KALIMER Reserved Documents manages collected data and other documents accumulated since project accomplishment. This report describes the hardware and software features and the database design methodology for KALIMER

  20. Incident and Trafficking Database: New Systems for Reporting and Accessing State Information

    International Nuclear Information System (INIS)

    Dimitrovski, D.; Kittley, S.

    2015-01-01

    The IAEA's Incident and Trafficking Database (ITDB) is the Agency's authoritative source for information on incidents in which nuclear and other radioactive material is out of national regulatory control. It was established in 1995 and, as of June 2014, 126 States participate in the ITDB programme. Currently, the database contains over 2500 confirmed incidents, of which 21% involve nuclear material, 62% radioactive sources and 17% radioactively contaminated material. In recent years, the system for States to report incidents to the ITDB has been evolving, moving from fax-based reporting to secure email and most recently to secure on-line reporting. A Beta version of the on-line system was rolled out this June, offering a simple, yet secure, communication channel for Member States to provide information. In addition, the system serves as a central hub for information related to official communication of the IAEA with Member States, so that communication traditionally shared by e-mail does not get lost when ITDB counterparts change. In addition, the new reporting system incorporates optional features that allow multiple Member State users to collaboratively contribute toward an INF. States are also being given secure on-line access to a streamlined version of the ITDB. This improves States' capabilities to retrieve and analyze information for their own purposes. In addition, on-line access to ITDB statistical information on incidents is available to States through an ITDB Dashboard. The dashboard contains aggregate information on the number and types of incidents and the material involved, as well as some other statistics related to the ITDB that are typically provided in the ITDB Quarterly reports. (author)

  1. Proposal for a High Energy Nuclear Database

    International Nuclear Information System (INIS)

    Brown, David A.; Vogt, Ramona

    2005-01-01

    We propose to develop a high-energy heavy-ion experimental database and make it accessible to the scientific community through an on-line interface. This database will be searchable and cross-indexed with relevant publications, including published detector descriptions. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This database should eventually contain all published data from Bevalac and AGS to RHIC to CERN-LHC energies, proton-proton to nucleus-nucleus collisions as well as other relevant systems, and all measured observables. Such a database would have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models to a broad range of old and new experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion and target and source development for upcoming facilities such as the Next Linear Collider. To enhance the utility of this database, we propose periodically performing evaluations of the data and summarizing the results in topical reviews

  2. Atomic Spectra Database (ASD)

    Science.gov (United States)

    SRD 78 NIST Atomic Spectra Database (ASD) (Web, free access)   This database provides access and search capability for NIST critically evaluated data on atomic energy levels, wavelengths, and transition probabilities that are reasonably up-to-date. The NIST Atomic Spectroscopy Data Center has carried out these critical compilations.

  3. Private and Efficient Query Processing on Outsourced Genomic Databases.

    Science.gov (United States)

    Ghasemi, Reza; Al Aziz, Md Momin; Mohammed, Noman; Dehkordi, Massoud Hadian; Jiang, Xiaoqian

    2017-09-01

Applications of genomic studies are spreading rapidly in many domains of science and technology, such as healthcare, biomedical research, direct-to-consumer services, and legal and forensic applications. However, there are a number of obstacles that make it hard to access and process a big genomic database for these applications. First, sequencing a genome is a time-consuming and expensive process. Second, processing genomic sequences requires large-scale computation and storage systems. Third, genomic databases are often owned by different organizations and thus not available for public use. The cloud computing paradigm can be leveraged to facilitate the creation and sharing of big genomic databases for these applications. Genomic data owners can outsource their databases to a centralized cloud server to ease access to them. However, data owners are reluctant to adopt this model, as it requires outsourcing the data to an untrusted cloud service provider that may cause data breaches. In this paper, we propose a privacy-preserving model for outsourcing genomic data to a cloud. The proposed model enables query processing while providing privacy protection of genomic databases. Privacy of the individuals is guaranteed by permuting the database and adding fake genomic records to it. These techniques allow the cloud to evaluate count and top-k queries securely and efficiently. Experimental results demonstrate that a count and a top-k query over 40 Single Nucleotide Polymorphisms (SNPs) in a database of 20 000 records take around 100 and 150 s, respectively.
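
    The bookkeeping behind the count queries described above can be illustrated with a minimal Python sketch. This is not the authors' protocol (which adds cryptographic protection on the server side); it only shows the idea of permuting the records, mixing in fake ones, and correcting the result with the owner's secret list of fake identifiers. All record and SNP names below are hypothetical.

```python
import random

def outsource(records, num_fakes=5, seed=1234):
    """Permute the real records and mix in fake ones before upload."""
    rng = random.Random(seed)
    outsourced = [dict(r) for r in records]
    fake_ids = set()
    for i in range(num_fakes):
        fake = {"id": "fake-%d" % i, "rs123": rng.choice(["AA", "AG", "GG"])}
        fake_ids.add(fake["id"])
        outsourced.append(fake)
    rng.shuffle(outsourced)  # the permutation hides the original ordering
    return outsourced, fake_ids

def cloud_count(outsourced, snp, genotype):
    """Untrusted server: return ids of matching rows (real and fake alike)."""
    return [r["id"] for r in outsourced if r.get(snp) == genotype]

def owner_correct(matching_ids, fake_ids):
    """Data owner removes the contribution of the fake records."""
    return sum(1 for i in matching_ids if i not in fake_ids)

real = [{"id": "p%d" % i, "rs123": g} for i, g in enumerate(["AA", "AG", "AA", "GG"])]
db, fakes = outsource(real)
print(owner_correct(cloud_count(db, "rs123", "AA"), fakes))  # -> 2
```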

  4. A high-energy nuclear database proposal

    International Nuclear Information System (INIS)

    Brown, D.A.; Vogt, R.; UC Davis, CA

    2006-01-01

    We propose to develop a high-energy heavy-ion experimental database and make it accessible to the scientific community through an on-line interface. This database will be searchable and cross-indexed with relevant publications, including published detector descriptions. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This database should eventually contain all published data from the Bevalac, AGS and SPS to RHIC and LHC energies, proton-proton to nucleus-nucleus collisions as well as other relevant systems, and all measured observables. Such a database would have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models to a broad range of old and new experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion and target and source development for upcoming facilities such as the Next Linear Collider. To enhance the utility of this database, we propose periodically performing evaluations of the data and summarizing the results in topical reviews. (author)

  5. The potential use of mobile technology: enhancing accessibility and ...

    African Journals Online (AJOL)

    The potential use of mobile technology: enhancing accessibility and communication in a blended ... South African Journal of Education ... Recommendations, limitations of the present study, and suggestions for future research were made.

  6. Security and health research databases: the stakeholders and questions to be addressed.

    Science.gov (United States)

    Stewart, Sara

    2006-01-01

    Health research database security issues abound. Issues include subject confidentiality, data ownership, data integrity and data accessibility. There are also various stakeholders in database security. Each of these stakeholders has a different set of concerns and responsibilities when dealing with security issues. There is an obvious need for training in security issues, so that these issues may be addressed and health research will move on without added obstacles based on misunderstanding security methods and technologies.

  7. SciELO, Scientific Electronic Library Online, a Database of Open Access Journals

    Directory of Open Access Journals (Sweden)

    Rogerio Meneghini

    2013-09-01

Full Text Available This essay discusses SciELO, a scientific journal database operating in 14 countries. It covers over 1000 journals, providing open access to full texts and tables of scientometric data. In Brazil it is responsible for a collection of nearly 300 journals, selected over 15 years as the best Brazilian periodicals in the natural and social sciences. Nonetheless, they are still national journals in the sense that over 80% of their articles are published by Brazilian scientists. Important initiatives focused on professionalization and internationalization are considered to bring these journals to a higher level of quality and visibility. DOI: 10.18870/hlrc.v3i3.153

  8. Construction of a bibliographic information database and development of retrieval system for research reports in nuclear science and technology (II)

    International Nuclear Information System (INIS)

    Han, Duk Haeng; Kim, Tae Whan; Choi, Kwang; Yoo, An Na; Keum, Jong Yong; Kim, In Kwon

    1996-05-01

The major goal of this project is to construct a bibliographic information database in nuclear engineering and to develop a prototype retrieval system. To give easy access to microfiche research reports, this project has accomplished the construction of a microfiche research reports database and the development of a retrieval system. The results of the project are as follows: 1. The microfiche research reports database was constructed by downloading from DOE Energy, NTIS and INIS. 2. The retrieval system was developed in host and web versions using access points such as title, abstract, keyword and report number. 6 tabs., 8 figs., 11 refs. (Author)

  9. Construction of a bibliographic information database and development of retrieval system for research reports in nuclear science and technology (II)

    Energy Technology Data Exchange (ETDEWEB)

    Han, Duk Haeng; Kim, Tae Whan; Choi, Kwang; Yoo, An Na; Keum, Jong Yong; Kim, In Kwon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-05-01

The major goal of this project is to construct a bibliographic information database in nuclear engineering and to develop a prototype retrieval system. To give easy access to microfiche research reports, this project has accomplished the construction of a microfiche research reports database and the development of a retrieval system. The results of the project are as follows: 1. The microfiche research reports database was constructed by downloading from DOE Energy, NTIS and INIS. 2. The retrieval system was developed in host and web versions using access points such as title, abstract, keyword and report number. 6 tabs., 8 figs., 11 refs. (Author)

  10. Quality, language, subdiscipline and promotion were associated with article accesses on Physiotherapy Evidence Database (PEDro).

    Science.gov (United States)

    Yamato, Tiê P; Arora, Mohit; Stevens, Matthew L; Elkins, Mark R; Moseley, Anne M

    2018-03-01

To quantify the relationship between the number of times articles are accessed on the Physiotherapy Evidence Database (PEDro) and the article characteristics. A secondary aim was to examine the relationship between accesses and the number of citations of articles. The study was conducted to derive prediction models for the number of accesses of articles indexed on PEDro from factors that may influence an article's accesses. All articles available on PEDro from August 2014 to January 2015 were included. We extracted variables relating to the algorithm used to present PEDro search results (research design, year of publication, PEDro score, source of systematic review (Cochrane or non-Cochrane)) plus language, subdiscipline of physiotherapy, and whether articles were promoted to PEDro users. Three predictive models were examined using multiple regression analysis. Citation counts and journal impact factors were downloaded. There were 29,313 articles indexed in this period. We identified seven factors that predicted the number of accesses. More accesses were noted for factors related to the algorithm used to present PEDro search results (synthesis research (i.e., guidelines and reviews), recent articles, Cochrane reviews, and higher PEDro score) plus publication in English and being promoted to PEDro users. The musculoskeletal, neurology, orthopaedics, sports, and paediatrics subdisciplines were associated with more accesses. We also found that there was no association between number of accesses and citations. The number of times an article is accessed on PEDro is partly predicted by how condensed and high-quality the evidence it contains is. Copyright © 2017 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
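
    As a rough illustration of the multiple-regression approach reported above (not the study's actual model or data), one of the prediction models could be sketched as follows; the data frame, variable names and values are invented.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented toy data: one row per indexed article, with its access count and
# a few of the candidate predictors named in the abstract.
articles = pd.DataFrame({
    "accesses":    [120, 45, 300, 80, 15, 210, 95, 60],
    "pedro_score": [8, 5, 9, 6, 4, 7, 6, 5],
    "is_review":   [1, 0, 1, 0, 0, 1, 0, 0],
    "in_english":  [1, 1, 1, 0, 1, 1, 1, 0],
    "promoted":    [1, 0, 1, 0, 0, 1, 0, 0],
})

# Ordinary least squares with several predictors, analogous in spirit to the
# multiple regression analysis described in the study.
model = smf.ols("accesses ~ pedro_score + is_review + in_english + promoted",
                data=articles).fit()
print(model.summary())
```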

  11. Optical fiber cabling technologies for flexible access network

    Science.gov (United States)

    Tanji, Hisashi

    2008-07-01

Fiber-to-the-home (FTTH) outside plant infrastructure should be so designed and constructed as to flexibly deal with increasing subscribers and the system evolution to be expected in the future, taking minimization of total cost (CAPEX and OPEX) into consideration. With this in mind, fiber access architectures are reviewed and key technologies on optical fiber and cable for supporting a flexible access network are presented. Low loss over a wide wavelength range (low water peak) and bend-insensitive single mode fiber is a future-proof solution. Enhanced separable ribbon facilitates mid-span access to individual fibers in an installed cable, improving fiber utilization efficiency and flexibility of distribution design. It also contributes to an excellent low-PMD characteristic, which could be required for video RF overlay systems or a high-capacity, long-reach metro-access convergence network in the future. Bend-insensitive fiber based cabling techniques, including field-installable connectors, greatly improve fiber/cable handling in installation and maintenance work.

  12. Respiratory cancer database: An open access database of respiratory cancer gene and miRNA.

    Science.gov (United States)

    Choubey, Jyotsna; Choudhari, Jyoti Kant; Patel, Ashish; Verma, Mukesh Kumar

    2017-01-01

Respiratory cancer database (RespCanDB) is a genomic and proteomic database of cancers of the respiratory organs. It also includes information on medicinal plants used for the treatment of various respiratory cancers, with the structures of their active constituents, as well as pharmacological and chemical information on drugs associated with various respiratory cancers. Data in RespCanDB have been manually collected from published research articles and from other databases. Data have been integrated using MySQL, a relational database management system. MySQL manages all data in the back-end and provides commands to retrieve and store the data in the database. The web interface of the database has been built in ASP. RespCanDB is expected to contribute to the scientific community's understanding of respiratory cancer biology as well as to the development of new ways of diagnosing and treating respiratory cancer. Currently, the database contains the oncogenomic information of lung cancer, laryngeal cancer, and nasopharyngeal cancer. Data for other cancers, such as oral and tracheal cancers, will be added in the near future. The URL of RespCanDB is http://ridb.subdic-bioinformatics-nitrr.in/.

  13. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

KALIMER database is an advanced database for integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results produced during all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R and D programme. IOC is a linkage control system between sub-projects used to share and integrate the research results for KALIMER. The 3D CAD database is a schematic overview of the KALIMER design structure. The reserved documents database was developed to manage various documents and reports accumulated since project accomplishment.

  14. KALIMER database development

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

KALIMER database is an advanced database for integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results produced during all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R and D programme. IOC is a linkage control system between sub-projects used to share and integrate the research results for KALIMER. The 3D CAD database is a schematic overview of the KALIMER design structure. The reserved documents database was developed to manage various documents and reports accumulated since project accomplishment

  15. Access 2010 for dummies

    CERN Document Server

    Ulrich Fuller, Laurie

    2010-01-01

    A friendly, step-by-step guide to the Microsoft Office database application Access may be the least understood and most challenging application in the Microsoft Office suite. This guide is designed to help anyone who lacks experience in creating and managing a database learn to use Access 2010 quickly and easily. In the classic For Dummies tradition, the book provides an education in Access, the interface, and the architecture of a database. It explains the process of building a database, linking information, sharing data, generating reports, and much more.As the Micr

  16. Children's Culture Database (CCD)

    DEFF Research Database (Denmark)

    Wanting, Birgit

A Dialogue-inspired database with documentation, network (individual and institutional profiles) and current news; paper presented at the research seminar: Electronic access to fiction, Copenhagen, November 11-13, 1996.

  17. Waste management and technologies analytical database project for Los Alamos National Laboratory/Department of Energy. Final report, June 7, 1993--June 15, 1994

    International Nuclear Information System (INIS)

    1995-01-01

The Waste Management and Technologies Analytical Database System (WMTADS), supported by the Department of Energy's (DOE) Office of Environmental Management (EM), Office of Technology Development (EM-50), was developed and based at the Los Alamos National Laboratory (LANL), Los Alamos, New Mexico, to collect, identify, organize, track, update, and maintain information related to existing/available/developing and planned technologies to characterize, treat, and handle mixed, hazardous and radioactive waste for storage and disposal, in support of EM strategies and goals and of focus area projects. WMTADS was developed as a centralized source of on-line information regarding technologies for environmental management processes. It can be accessed with a computer, modem, phone line, and communications software through a Local Area Network (LAN) or through the server's connectivity to the Internet, the world's largest computer network; using the file transfer protocol (FTP), files can also be transferred from the server to the user's computer, and the system can be reached through the World Wide Web (WWW) using Mosaic

  18. Characterizing Journal Access at a Canadian University Using the Journal Citation Reports Database

    Directory of Open Access Journals (Sweden)

    Alan Gale

    2011-07-01

Full Text Available This article outlines a simple approach to characterizing the level of access to the scholarly journal literature in the physical sciences and engineering offered by a research library, particularly within the Canadian university system. The method utilizes the “Journal Citation Reports” (JCR) database to produce lists of journals, ranked based on total citations, in the subject areas of interest. Details of the approach are illustrated using data from the University of Guelph. The examples cover chemistry, physics, mathematics and statistics, as well as engineering. In assessing the level of access, both the Library’s current journal subscriptions and backfiles are considered. To gain greater perspective, data from both 2003 and 2008 are analyzed. In addition, the number of document delivery requests received from University of Guelph Library users in recent years is also reviewed. The approach taken in characterizing access to the journal literature is found to be simple and easy to implement, but time-consuming. The University of Guelph Library is shown to provide excellent access to the current journal literature in the subject areas examined. Access to the historical literature in those areas is also strong. In making these assessments, a broad and comprehensive array of journals is considered in each case. Document delivery traffic (i.e. Guelph requests) is found to have decreased markedly in recent years. This is attributed, at least in part, to improving access to the scholarly literature. For the University of Guelph, collection assessment is an ongoing process that must balance the needs of a diverse group of users. The results of analyses of the kind discussed in this article can be of practical significance and value to that process.
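
    The ranking-and-coverage calculation outlined above can be sketched in a few lines; the journal titles, citation counts and holdings below are invented, standing in for an exported JCR category list and a library's subscription list.

```python
import pandas as pd

# Invented stand-in for a JCR subject-category export and the library's holdings.
jcr = pd.DataFrame({
    "journal":         ["J. Chem. A", "J. Chem. B", "J. Chem. C", "J. Chem. D"],
    "total_citations": [120000, 45000, 30000, 9000],
})
subscriptions = {"J. Chem. A", "J. Chem. C"}

# Rank the category's journals by total citations, then flag which are held.
ranked = jcr.sort_values("total_citations", ascending=False).reset_index(drop=True)
ranked["held"] = ranked["journal"].isin(subscriptions)

# Share of the most-cited titles the library provides access to.
coverage = ranked.head(3)["held"].mean()
print(ranked)
print("Coverage of the top-ranked journals: {:.0%}".format(coverage))
```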

  19. The Structural Ceramics Database: Technical Foundations

    Science.gov (United States)

    Munro, R. G.; Hwang, F. Y.; Hubbard, C. R.

    1989-01-01

    The development of a computerized database on advanced structural ceramics can play a critical role in fostering the widespread use of ceramics in industry and in advanced technologies. A computerized database may be the most effective means of accelerating technology development by enabling new materials to be incorporated into designs far more rapidly than would have been possible with traditional information transfer processes. Faster, more efficient access to critical data is the basis for creating this technological advantage. Further, a computerized database provides the means for a more consistent treatment of data, greater quality control and product reliability, and improved continuity of research and development programs. A preliminary system has been completed as phase one of an ongoing program to establish the Structural Ceramics Database system. The system is designed to be used on personal computers. Developed in a modular design, the preliminary system is focused on the thermal properties of monolithic ceramics. The initial modules consist of materials specification, thermal expansion, thermal conductivity, thermal diffusivity, specific heat, thermal shock resistance, and a bibliography of data references. Query and output programs also have been developed for use with these modules. The latter program elements, along with the database modules, will be subjected to several stages of testing and refinement in the second phase of this effort. The goal of the refinement process will be the establishment of this system as a user-friendly prototype. Three primary considerations provide the guidelines to the system’s development: (1) The user’s needs; (2) The nature of materials properties; and (3) The requirements of the programming language. The present report discusses the manner and rationale by which each of these considerations leads to specific features in the design of the system. PMID:28053397

  20. The PMDB Protein Model Database

    Science.gov (United States)

    Castrignanò, Tiziana; De Meo, Paolo D'Onorio; Cozzetto, Domenico; Talamo, Ivano Giuseppe; Tramontano, Anna

    2006-01-01

The Protein Model Database (PMDB) is a public resource aimed at storing manually built 3D models of proteins. The database is designed to provide access to models published in the scientific literature, together with validating experimental data. It is a relational database and it currently contains >74 000 models for ∼240 proteins. The system is accessible online and allows predictors to submit models along with related supporting evidence, and users to download them through a simple and intuitive interface. Users can navigate in the database and retrieve models referring to the same target protein or to different regions of the same protein. Each model is assigned a unique identifier that allows interested users to directly access the data. PMID:16381873

  1. Evolving provider payment models and patient access to innovative medical technology.

    Science.gov (United States)

    Long, Genia; Mortimer, Richard; Sanzenbacher, Geoffrey

    2014-12-01

    Abstract Objective: To investigate the evolving use and expected impact of pay-for-performance (P4P) and risk-based provider reimbursement on patient access to innovative medical technology. Structured interviews with leading private payers representing over 110 million commercially-insured lives exploring current and planned use of P4P provider payment models, evidence requirements for technology assessment and new technology coverage, and the evolving relationship between the two topics. Respondents reported rapid increases in the use of P4P and risk-sharing programs, with roughly half of commercial lives affected 3 years ago, just under two-thirds today, and an expected three-quarters in 3 years. All reported well-established systems for evaluating new technology coverage. Five of nine reported becoming more selective in the past 3 years in approving new technologies; four anticipated that in the next 3 years there will be a higher evidence requirement for new technology access. Similarly, four expected it will become more difficult for clinically appropriate but costly technologies to gain coverage. All reported planning to rely more on these types of provider payment incentives to control costs, but didn't see them as a substitute for payer technology reviews and coverage limitations; they each have a role to play. Interviews limited to nine leading payers with models in place; self-reported data. Likely implications include a more uncertain payment environment for providers, and indirectly for innovative medical technology and future investment, greater reliance on quality and financial metrics, and increased evidence requirements for favorable coverage and utilization decisions. Increasing provider financial risk may challenge the traditional technology adoption paradigm, where payers assumed a 'gatekeeping' role and providers a countervailing patient advocacy role with regard to access to new technology. Increased provider financial risk may result in an

  2. Advancements in web-database applications for rabies surveillance

    Directory of Open Access Journals (Sweden)

    Bélanger Denise

    2011-08-01

Full Text Available Abstract Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. Results RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements that RageDB brings to rabies surveillance databases include (1) automatic integration of multi-agency data and diagnostic results on a daily basis; (2) a web-based data editing interface that enables authorized users to add, edit and extract data; and (3) an interactive dashboard to help visualize data simply and efficiently, in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies suspect animals. We also discuss how sightings data can indicate public perception of the risk of raccoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. Conclusions RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from

  3. Records Management Database

    Data.gov (United States)

US Agency for International Development — The Records Management Database is a tool created in Microsoft Access specifically for USAID use. It contains metadata in order to access and retrieve the information...

  4. The potential use of mobile technology: enhancing accessibility and communication in a blended learning course

    Directory of Open Access Journals (Sweden)

    Tabisa Mayisela

    2013-01-01

Full Text Available Mobile technology is increasingly being used to support blended learning beyond computer centres. It has been considered as a potential solution to the problem of a shortage of computers for accessing online learning materials (courseware in a blended learning course. The purpose of the study was to establish how the use of mobile technology could enhance accessibility and communication in a blended learning course. Data were solicited from a purposive convenience sample of 36 students engaged in the blended learning course. The case study utilized a mixed-methods approach. An unstructured interview was conducted with the course lecturer and these data informed the design of the students' semi-structured questionnaire. It was found that students with access to mobile technology had an increased opportunity to access the courseware of the blended learning course. Mobile technology further enhanced student-to-student and student-to-lecturer communication by means of social networks. The study concludes that mobile technology has the potential to increase accessibility and communication in a blended learning course. Recommendations, limitations of the present study, and suggestions for future research were made.

  5. JASPAR, the open access database of transcription factor-binding profiles: new content and tools in the 2008 update

    DEFF Research Database (Denmark)

    Bryne, J.C.; Valen, E.; Tang, M.H.E.

    2008-01-01

JASPAR is a popular open-access database for matrix models describing DNA-binding preferences for transcription factors and other DNA patterns. With its third major release, JASPAR has been expanded and equipped with additional functions aimed at both casual and power users. The heart of the JASPAR database, the JASPAR CORE sub-database, has increased by 12% in size, and three new specialized sub-databases have been added. New functions include clustering of matrix models by similarity, generation of random matrices by sampling from selected sets of existing models and a language-independent Web Service...

  6. The OAuth 2.0 Web Authorization Protocol for the Internet Addiction Bioinformatics (IABio) Database.

    Science.gov (United States)

    Choi, Jeongseok; Kim, Jaekwon; Lee, Dong Kyun; Jang, Kwang Soo; Kim, Dai-Jin; Choi, In Young

    2016-03-01

    Internet addiction (IA) has become a widespread and problematic phenomenon as smart devices pervade society. Moreover, internet gaming disorder leads to increases in social expenditures for both individuals and nations alike. Although the prevention and treatment of IA are getting more important, the diagnosis of IA remains problematic. Understanding the neurobiological mechanism of behavioral addictions is essential for the development of specific and effective treatments. Although there are many databases related to other addictions, a database for IA has not been developed yet. In addition, bioinformatics databases, especially genetic databases, require a high level of security and should be designed based on medical information standards. In this respect, our study proposes the OAuth standard protocol for database access authorization. The proposed IA Bioinformatics (IABio) database system is based on internet user authentication, which is a guideline for medical information standards, and uses OAuth 2.0 for access control technology. This study designed and developed the system requirements and configuration. The OAuth 2.0 protocol is expected to establish the security of personal medical information and be applied to genomic research on IA.
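
    The abstract does not give implementation details, but the general OAuth 2.0 flow it refers to can be sketched as follows; the token endpoint, API URL, grant type and credentials below are all hypothetical and stand in for whatever the IABio deployment actually uses.

```python
import requests

# Hypothetical endpoints and credentials; not taken from the IABio system.
TOKEN_URL = "https://iabio.example.org/oauth/token"
API_URL = "https://iabio.example.org/api/records"

def get_access_token(client_id, client_secret, username, password):
    """Exchange user credentials for a bearer token (resource-owner password grant)."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "password",
        "username": username,
        "password": password,
        "client_id": client_id,
        "client_secret": client_secret,
    })
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_records(token, subject_id):
    """Call the protected database API, presenting the token on every request."""
    resp = requests.get(API_URL, params={"subject": subject_id},
                        headers={"Authorization": "Bearer " + token})
    resp.raise_for_status()
    return resp.json()
```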

  7. VariVis: a visualisation toolkit for variation databases

    Directory of Open Access Journals (Sweden)

    Smith Timothy D

    2008-04-01

    Full Text Available Abstract Background With the completion of the Human Genome Project and recent advancements in mutation detection technologies, the volume of data available on genetic variations has risen considerably. These data are stored in online variation databases and provide important clues to the cause of diseases and potential side effects or resistance to drugs. However, the data presentation techniques employed by most of these databases make them difficult to use and understand. Results Here we present a visualisation toolkit that can be employed by online variation databases to generate graphical models of gene sequence with corresponding variations and their consequences. The VariVis software package can run on any web server capable of executing Perl CGI scripts and can interface with numerous Database Management Systems and "flat-file" data files. VariVis produces two easily understandable graphical depictions of any gene sequence and matches these with variant data. While developed with the goal of improving the utility of human variation databases, the VariVis package can be used in any variation database to enhance utilisation of, and access to, critical information.

  8. Security Research on Engineering Database System

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

The engine engineering database system is a CAD-oriented applied database management system that has the capability of managing distributed data. The paper discusses the security issues of the engine engineering database management system (EDBMS). Through studying and analyzing database security, a series of security rules is drawn up that reaches the B1-level security standard, which includes discretionary access control (DAC), mandatory access control (MAC) and audit. The EDBMS implements functions of DAC, ...
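
    As a minimal illustration of the two access-control notions mentioned in the abstract (discretionary versus mandatory control), not of the EDBMS implementation itself, the following sketch contrasts a per-object ACL check with a label-dominance check; all names and levels are invented.

```python
# Ordered security labels used by the mandatory (label-based) check.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2}

def dac_allows(acl, user, action):
    """Discretionary access control: the object's owner-maintained ACL decides."""
    return action in acl.get(user, set())

def mac_allows_read(subject_level, object_level):
    """Mandatory access control (simple read-down rule): the subject's clearance
    must dominate the object's classification, regardless of any ACL entry."""
    return LEVELS[subject_level] >= LEVELS[object_level]

acl = {"alice": {"read", "write"}, "bob": {"read"}}
print(dac_allows(acl, "bob", "write"))             # False: not granted in the ACL
print(mac_allows_read("confidential", "secret"))   # False: clearance too low
```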

  9. The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity

    Science.gov (United States)

    Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo

    2015-05-01

    The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department that has occurred during the last 5 years, resulted in a reliable, high performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based upon an integration between the IBM GPFS parallel filesystem and the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDaV endpoints. Besides these services, an Oracle database facility is in production characterized by an effective level of parallelism, redundancy and availability. This facility is running databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, like Real Application Cluster (RAC), Automatic Storage Manager (ASM) and Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state-of-the-art of the INFN-CNAF Tier1 Storage department infrastructures and software services, and to give a brief outlook to forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.

  10. An approach for access differentiation design in medical distributed applications built on databases.

    Science.gov (United States)

    Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K

    1999-01-01

    A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application posits new essential problems for software. Particularly protection tools, which are sufficient separately, become deficient during the integration due to specific additional links and relationships not considered formerly. E.g., it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools, if the object is stored as a record in DB tables. The solution of the problem should be found only within the more general application framework. Appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications which use databases. The appropriate formal model as well as tools for its mapping to a DMBS are suggested. Remote users connected via global networks are considered too.

  11. Exploring Accessibility Scenarios for 2020 in Relation with Future ICT Trends on Assistive Technology and Accessibility

    Directory of Open Access Journals (Sweden)

    Adamantios Koumpis

    2012-01-01

Full Text Available In this paper we present a set of 5 future scenarios that were developed within the eAccessibility2020 study. The study aims to explore and analyse the relationships between the emerging ICT landscape, in its societal and economic context, and the development and provision of assistive technologies (AT) and e-Accessibility, within a perspective of 10 years. The scenarios were developed after an initial trend analysis that the study team conducted based on data gathering, following a methodology which defined a set of guides for scenario development and a set of visions for the future of eAccessibility.

  12. IMPACT OF HEALTH TECHNOLOGY ASSESSMENT IN LITIGATION CONCERNING ACCESS TO HIGH-COST DRUGS.

    Science.gov (United States)

    Aleman, Alicia; Perez Galan, Ana

    2017-01-01

The impact of health technology assessment (HTA) on the judicialization of the right to health has not been deeply studied in Latin American countries. The purpose of this study is to review the process of judicialization of access to high-cost drugs in Uruguay and assess the impact HTAs have had on this process. The methodology used for this study included a comprehensive literature search in electronic databases, local journals and internal documents developed in the Ministry of Health, as well as interviews with key informants. Judicialization of access to high-cost drugs has been increasing since 2010. The strategy of the Ministry of Health of Uruguay to decrease this problem included the organization of roundtables with judges and other stakeholders on the basis of HTA, the training of defense lawyers in the use and interpretation of HTA, and the participation of a professional who develops HTAs in the preparation of the defense arguments. A year after the implementation of this strategy, 25 percent of writs of protection were won by the Ministry of Health. Even though the strategy implemented was effective in reducing lost litigation, it was not effective in reducing the growing number of writs of protection. It is essential to address this problem in a broad debate and to promote understanding between the parties.

  13. Distance Education Technologies in Asia

    International Development Research Centre (IDRC) Digital Library (Canada)

    17 schools ... Mobile Technology in Non-formal Distance Education 192 ..... in the design and application of e-learning strategies, the need to standardise and ...... library providing access to over 20,000 journals and thesis databases, and 6,000 ...

  14. Proposal for a high-energy nuclear database

    International Nuclear Information System (INIS)

    Brown, D.A.; Vogt, R.

    2006-01-01

    We propose to develop a high-energy heavy-ion experimental database and make it accessible to the scientific community through an on-line interface. This database will be searchable and cross-indexed with relevant publications, including published detector descriptions. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This database should eventually contain all published data from Bevalac, AGS and SPS to RHIC and LHC energies, proton-proton to nucleus-nucleus collisions as well as other relevant systems, and all measured observables. Such a database would have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models to a broad range of old and new experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion and target and source development for upcoming facilities such as the Next Linear Collider. To enhance the utility of this database, we propose periodically performing evaluations of the data and summarizing the results in topical reviews. (author)

  15. Proposal for a High Energy Nuclear Database

    International Nuclear Information System (INIS)

    Brown, D A; Vogt, R

    2005-01-01

    The authors propose to develop a high-energy heavy-ion experimental database and make it accessible to the scientific community through an on-line interface. This database will be searchable and cross-indexed with relevant publications, including published detector descriptions. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This database should eventually contain all published data from Bevalac, AGS and SPS to RHIC and CERN-LHC energies, proton-proton to nucleus-nucleus collisions as well as other relevant systems, and all measured observables. Such a database would have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models to a broad range of old and new experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion and target and source development for upcoming facilities such as the Next Linear Collider. To enhance the utility of this database, they propose periodically performing evaluations of the data and summarizing the results in topical reviews

  16. Electronic Document Delivery: Converging Standards and Technologies. UDT Series on Data Communication Technologies and Standards for Libraries, Report #2.

    Science.gov (United States)

    Cleveland, Gary

    The development of information technologies such as public access catalogs and online databases has greatly enhanced access to information. The lack of automation in the area of document delivery, however, has created a large disparity between the speed with which citations are found and the provision of primary documents. This imbalance can…

  17. The International Nucleotide Sequence Database Collaboration.

    Science.gov (United States)

    Cochrane, Guy; Karsch-Mizrachi, Ilene; Nakamura, Yasukazu

    2011-01-01

    Under the International Nucleotide Sequence Database Collaboration (INSDC; http://www.insdc.org), globally comprehensive public domain nucleotide sequence is captured, preserved and presented. The partners of this long-standing collaboration work closely together to provide data formats and conventions that enable consistent data submission to their databases and support regular data exchange around the globe. Clearly defined policy and governance in relation to free access to data and relationships with journal publishers have positioned INSDC databases as a key provider of the scientific record and a core foundation for the global bioinformatics data infrastructure. While growth in sequence data volumes comes no longer as a surprise to INSDC partners, the uptake of next-generation sequencing technology by mainstream science that we have witnessed in recent years brings a step-change to growth, necessarily making a clear mark on INSDC strategy. In this article, we introduce the INSDC, outline data growth patterns and comment on the challenges of increased growth.

  18. The Influence of Information Technology Access on Agricultural Research in Nigeria.

    Science.gov (United States)

    Jimba, Samuel Wodi; Atinmo, Morayo Ibironke

    2000-01-01

    Examines the relationship between accessibility to information technology and research publications among users of agricultural libraries in Nigeria. Discusses results of a questionnaire that investigated the use of electronic information resources and considers the effects of information technology and globalization on the economies of developing…

  19. National Carbon Sequestration Database and Geographic Information System (NatCarb)

    Energy Technology Data Exchange (ETDEWEB)

    Kenneth Nelson; Timothy Carr

    2009-03-31

    This annual and final report describes the results of the multi-year project entitled 'NATional CARBon Sequestration Database and Geographic Information System (NatCarb)' (http://www.natcarb.org). The original project assembled a consortium of five states (Indiana, Illinois, Kansas, Kentucky and Ohio) in the midcontinent of the United States (MIDCARB) to construct an online distributed Relational Database Management System (RDBMS) and Geographic Information System (GIS) covering aspects of carbon dioxide (CO{sub 2}) geologic sequestration. The NatCarb system built on the technology developed in the initial MIDCARB effort. The NatCarb project linked the GIS information of the Regional Carbon Sequestration Partnerships (RCSPs) into a coordinated regional database system consisting of datasets useful to industry, regulators and the public. The project includes access to national databases and GIS layers maintained by the NatCarb group (e.g., brine geochemistry) and publicly accessible servers (e.g., USGS, and Geography Network) into a single system where data are maintained and enhanced at the local level, but are accessed and assembled through a single Web portal to facilitate query, assembly, analysis and display. This project improves the flow of data across servers and increases the amount and quality of available digital data. The purpose of NatCarb is to provide a national view of the carbon capture and storage potential in the U.S. and Canada. The digital spatial database allows users to estimate the amount of CO{sub 2} emitted by sources (such as power plants, refineries and other fossil-fuel-consuming industries) in relation to geologic formations that can provide safe, secure storage sites over long periods of time. The NatCarb project worked to provide all stakeholders with improved online tools for the display and analysis of CO{sub 2} carbon capture and storage data through a single website portal (http://www.natcarb.org/). While the external

  20. Access to information technology and willingness to receive text ...

    African Journals Online (AJOL)

    Background. Effective communication is imperative for the delivery and receipt of adequate health care services. Aim. To determine access to information technology and willingness to receive short message service (SMS) text message reminders for childhood immunisation services among mothers in Lagos, Nigeria.

  1. jSPyDB, an open source database-independent tool for data management

    CERN Document Server

    Pierro, Giuseppe Antonio

    2010-01-01

Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only to a specific kind of database, and they are platform-dependent and very CPU- and memory-consuming. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and to configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. ...
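
    The abstract mentions SQLAlchemy as the access layer that keeps the tool database-independent; the sketch below shows that idea in isolation (it is not jSPyDB's actual code), with the backend chosen purely by the connection URL.

```python
from sqlalchemy import create_engine, inspect, text

def list_tables(db_url):
    """Return table names; the same call works for SQLite, MySQL, PostgreSQL,
    Oracle, etc., because the backend is selected by the URL alone."""
    engine = create_engine(db_url)
    return inspect(engine).get_table_names()

def run_query(db_url, sql, params=None):
    """Execute a query and return all rows, independent of the backend."""
    engine = create_engine(db_url)
    with engine.connect() as conn:
        return conn.execute(text(sql), params or {}).fetchall()

# Example against an in-memory SQLite database; any supported URL would do.
print(run_query("sqlite:///:memory:", "SELECT :x + 1 AS answer", {"x": 41}))
```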

  2. Using Technology to Improve Access to Mental Health Services.

    Science.gov (United States)

    Cortelyou-Ward, Kendall; Rotarius, Timothy; Honrado, Jed C

    Mental ill-health is a public health threat that is prevalent throughout the United States. Tens of millions of Americans have been diagnosed along the continuum of mental ill-health, and many more millions of family members and friends are indirectly affected by the pervasiveness of mental ill-health. Issues such as access and the societal stigma related to mental health issues serve as deterrents to patients receiving their necessary care. However, technological advances have shown the potential to increase access to mental health services for many patients.

  3. Wearables for all : development of guidelines to stimulate accessible wearable technology design

    NARCIS (Netherlands)

    Wentzel, Jobke; Velleman, Eric; van der Geest, Thea

    2016-01-01

    In this paper, we present the rationale and approach for establishing guidelines for the development of accessible wearables. Wearable technology is increasingly integrated in our everyday lives. Therefore, ensuring accessibility is pivotal to prevent a digital divide between persons who have and

  4. Geologic Field Database

    Directory of Open Access Journals (Sweden)

    Katarina Hribernik

    2002-12-01

Full Text Available The purpose of the paper is to present the field data relational database, which was compiled from data gathered during thirty years of fieldwork on the Basic Geologic Map of Slovenia at the scale of 1:100.000. The database was created using MS Access software. The MS Access environment ensures its stability and effective operation despite changing, searching, and updating the data. It also enables faster, easier and user-friendly access to the field data. Last but not least, in the long term, with the data transferred into the GIS environment, it will provide the basis for a sound geologic information system that will satisfy a broad spectrum of geologists’ needs.

  5. Access 2010 Programmer's Reference

    CERN Document Server

    Hennig, Teresa; Griffith, Geoffrey L

    2010-01-01

A comprehensive guide to programming for Access 2010 and 2007. Millions of people use the Access database applications, and hundreds of thousands of developers work with Access daily. Access 2010 brings better integration with SQL Server and enhanced XML support; this Wrox guide shows developers how to take advantage of these and other improvements. With in-depth coverage of VBA, macros, and other programming methods for building Access applications, this book also provides real-world code examples to demonstrate each topic. Access is the leading database that is used worldwide; while VBA rem

  6. An Oracle(c) database for the AMS experiment

    International Nuclear Information System (INIS)

    Boschini, M.; Gervasi, M.; Grandi, D.; Rancoita, P.G.; Trombetta, L.; Usoskin, I.G.

    1999-01-01

We present the hardware and software technologies implemented for the AMS Milano Data Center. The goal of the AMS Milano Data Center is to provide users with the data collected during the STS-91 Space Shuttle flight and to provide a user interface to manage the data properly. Data are stored in a database that provides high-level query and retrieval features, the storage support being a magneto-optical juke-box. We describe the use of proprietary software (Oracle(c)) as well as custom-written software to enhance access performance. In particular we underscore the use of the Oracle Call Interfaces as a powerful tool to interface the database and the operating system in a natural way

  7. Improving Information Access through Technology: A Plan for Louisiana's Public Libraries.

    Science.gov (United States)

    Jaques, Thomas F.

    Strengthening technology in Louisiana's public libraries will support equitable and convenient access to electronic information resources for all citizens at library sites, in homes, and in business. The plan presented in this document is intended to enhance and expand technology in the state's public libraries. After discussion of the crucial…

  8. SSC lattice database and graphical interface

    International Nuclear Information System (INIS)

    Trahern, C.G.; Zhou, J.

    1991-11-01

When completed, the Superconducting Super Collider will be the world's largest accelerator complex. In order to build this system on schedule, the use of database technologies will be essential. In this paper we discuss one of the database efforts underway at the SSC, the lattice database. The SSC lattice database provides a centralized source for the design of each major component of the accelerator complex. This includes the two collider rings, the High Energy Booster, Medium Energy Booster, Low Energy Booster, and the LINAC as well as transfer and test beam lines. These designs have been created using a menagerie of programs such as SYNCH, DIMAD, MAD, TRANSPORT, MAGIC, TRACE3D and TEAPOT. However, once a design has been completed, it is entered into a uniform database schema in the database system. In this paper we discuss the reasons for creating the lattice database and its implementation via the commercial database system SYBASE. Each lattice in the lattice database is composed of a set of tables whose data structure can describe any of the SSC accelerator lattices. In order to allow the user community access to the databases, a programmatic interface known as dbsf (for database to several formats) has been written. Dbsf creates ASCII input files appropriate to the above-mentioned accelerator design programs. In addition, it has a binary dataset output using the Self Describing Standard data discipline provided with the Integrated Scientific Tool Kit software tools. Finally we discuss the graphical interfaces to the lattice database. The primary interface, known as OZ, is a simulation environment as well as a database browser

  9. UNIDIRECTIONAL REPLICATION ON HETEROGENEOUS DATABASES

    OpenAIRE

    Hendro Nindito; Evaristus Didik Madyatmadja; Albert Verasius Dian Sano

    2013-01-01

The use of diverse database technologies in enterprises today cannot be avoided. Thus, technology is needed to generate information in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we replicate from a Windows-based MS SQL Server database to a Linux-based Oracle database as the target. The research method used is prototyping, where development can be done quickly and testing of working models of the...
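
    A naive, illustrative pass of such unidirectional replication, from SQL Server to Oracle, might look like the sketch below; the connection strings, table and column names are hypothetical, and a production setup would rely on a dedicated replication tool rather than hand-written copying.

```python
import pyodbc      # source: Windows-based MS SQL Server
import oracledb    # target: Linux-based Oracle

# Hypothetical connection details, table and column names.
SRC = ("DRIVER={ODBC Driver 17 for SQL Server};"
       "SERVER=winhost;DATABASE=sales;UID=repl;PWD=secret")
DST = {"user": "repl", "password": "secret", "dsn": "linuxhost/ORCLPDB1"}

def replicate_new_rows(last_id):
    """Copy rows with id greater than last_id from the source to the target."""
    with pyodbc.connect(SRC) as src, oracledb.connect(**DST) as dst:
        rows = src.cursor().execute(
            "SELECT id, customer, amount FROM orders WHERE id > ?", last_id
        ).fetchall()
        if rows:
            dst.cursor().executemany(
                "INSERT INTO orders (id, customer, amount) VALUES (:1, :2, :3)",
                [tuple(r) for r in rows],
            )
            dst.commit()
        return max((r[0] for r in rows), default=last_id)  # new replication watermark
```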

  10. Database Description - JSNP | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

Full Text Available General information of database: Database name: JSNP. Creator affiliation: Japan Science and Technology Agency. Organism: Homo sapiens (Taxonomy ID: 9606). Database description: a database of about 197,000 polymorphisms in the Japanese population. Database maintenance site: Institute of Medical Science. User registration: not available.

  11. Image storage, cataloguing and retrieval using a personal computer database software application

    International Nuclear Information System (INIS)

    Lewis, G.; Howman-Giles, R.

    1999-01-01

Full text: Interesting images and cases are collected and collated by most nuclear medicine practitioners throughout the world. Changing imaging technology has altered the way in which images may be presented and are reported, with less reliance on 'hard copy' for both reporting and archiving purposes. Digital image generation and storage is rapidly replacing film in both radiological and nuclear medicine practice. An interesting-case filing system based on a personal computer database application is described and demonstrated. The digital image storage format allows instant access to both case information (e.g. history and examination, scan report or teaching point) and the relevant images. The database design allows rapid selection of cases and images appropriate to a particular diagnosis, scan type, age or other search criteria. Correlative X-ray, CT, MRI and ultrasound images can also be stored and accessed. The application is in use at The New Children's Hospital as an aid to postgraduate medical education, with new cases being regularly added to the database

  12. Database Description - PSCDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: PSCDB. Creator: Takayuki Amemiya, National Institute of Advanced Industrial Science and Technology (AIST). Database classification: Structure Databases - Protein structure. User registration: not available.

  13. Proposal for Implementing Multi-User Database (MUD) Technology in an Academic Library.

    Science.gov (United States)

    Filby, A. M. Iliana

    1996-01-01

    Explores the use of MOO (multi-user object oriented) virtual environments in academic libraries to enhance reference services. Highlights include the development of multi-user database (MUD) technology from gaming to non-recreational settings; programming issues; collaborative MOOs; MOOs as distinguished from other types of virtual reality; audio…

  14. INTEGRATIVE METHOD OF TEACHING INFORMATION MODELING IN PRACTICAL HEALTH SERVICE BASED ON MICROSOFT ACCESS QUERIES

    Directory of Open Access Journals (Sweden)

    Svetlana A. Firsova

    2016-06-01

    Full Text Available Introduction: this article explores the pedagogical technology employed to teach medical students the foundations of work with MICROSOFT ACCESS databases. The technology is based on an integrative approach to information modeling in public health practice, drawing upon basic didactic concepts that pertain to the objects and tools of databases created in MICROSOFT ACCESS. The article examines successive steps in teaching the topic “Queries in MICROSOFT ACCESS”, from simple queries to complex ones. The main attention is paid to such components of the methodological system as the principles and teaching methods, classified according to the degree of learners’ active cognitive activity. Of particular interest is the diagram relating learning principles, teaching methods and specific types of queries. Materials and Methods: the authors used comparative analysis of literature, syllabi and curricula in medical informatics taught at leading medical universities in Russia. Results: an original technique for teaching query construction with MICROSOFT ACCESS databases is presented for the analysis of information models in practical health care. Discussion and Conclusions: it is argued that the proposed pedagogical technology will significantly improve the effectiveness of teaching the course “Medical Informatics”, which includes the development and application of models to simulate the operation of certain facilities and services of the health system and which, in turn, increases the level of information culture of practitioners.
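For readers who want to reproduce the simple-to-complex query progression programmatically, the sketch below runs one single-criterion query and one join-plus-grouping query against an Access file via ODBC. The .accdb path and the Patients/Visits tables are hypothetical, and the example assumes the Microsoft Access ODBC driver is installed (typically on Windows) for pyodbc to use.

```python
# A sketch of the "simple query to complex query" progression against a
# MICROSOFT ACCESS file, run from Python via ODBC. The .accdb path and the
# Patients/Visits tables are hypothetical; the Access ODBC driver must be
# installed for pyodbc to connect.
import pyodbc

conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\teaching\clinic.accdb"
)
cur = conn.cursor()

# Simple selection query with a single criterion.
cur.execute("SELECT PatientID, FullName FROM Patients WHERE City = ?", "Saransk")
print(cur.fetchall())

# More complex query: join plus grouping, the kind of query the course builds up to.
cur.execute(
    "SELECT p.FullName, COUNT(v.VisitID) AS VisitCount "
    "FROM Patients AS p INNER JOIN Visits AS v ON p.PatientID = v.PatientID "
    "GROUP BY p.FullName HAVING COUNT(v.VisitID) > 3"
)
print(cur.fetchall())
conn.close()
```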

  15. Specialist Bibliographic Databases

    OpenAIRE

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A.; Trukhachev, Vladimir I.; Kostyukova, Elena I.; Gerasimov, Alexey N.; Kitas, George D.

    2016-01-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and d...

  16. Enabling Dedicated, Affordable Space Access Through Aggressive Technology Maturation

    Science.gov (United States)

    Jones, Jonathan E.; Kibbey, Timothy P.; Cobb, C. Brent; Harris, Lawanna L.

    2014-01-01

    A launch vehicle at the scale and price point which allows developers to take reasonable risks with high payoff propulsion and avionics hardware solutions does not exist today. Establishing this service provides a ride through the proverbial technology "valley of death" that lies between demonstration in laboratory and flight environments. NASA's NanoLaunch effort will provide the framework to mature both earth-to-orbit and on-orbit propulsion and avionics technologies while also providing affordable, dedicated access to low earth orbit for cubesat class payloads.

  17. Accessing NASA Technology with the World Wide Web

    Science.gov (United States)

    Nelson, Michael L.; Bianco, David J.

    1995-01-01

    NASA Langley Research Center (LaRC) began using the World Wide Web (WWW) in the summer of 1993, becoming the first NASA installation to provide a Center-wide home page. This coincided with a reorganization of LaRC to provide a more concentrated focus on technology transfer to both aerospace and non-aerospace industry. Use of WWW and NCSA Mosaic not only provides automated information dissemination, but also allows for the implementation, evolution and integration of many technology transfer and technology awareness applications. This paper describes several of these innovative applications, including the on-line presentation of the entire Technology OPportunities Showcase (TOPS), an industrial partnering showcase that exists on the Web long after the actual 3-day event ended. The NASA Technical Report Server (NTRS) provides uniform access to many logically similar, yet physically distributed NASA report servers. WWW is also the foundation of the Langley Software Server (LSS), an experimental software distribution system which will distribute LaRC-developed software. In addition to the more formal technology distribution projects, WWW has been successful in connecting people with technologies and people with other people.

  18. Acoustic Metadata Management and Transparent Access to Networked Oceanographic Data Sets

    Science.gov (United States)

    2015-09-30

    Transparent Access to Networked Oceanographic Data Sets. Marie A. Roch, Dept. of Computer Science, San Diego State University, 5500 Campanile Drive, San Diego. The report covers, among other items, specific technologies for processing Excel spreadsheets and Access databases; the architecture is based on a client-server model.

  19. The OAuth 2.0 Web Authorization Protocol for the Internet Addiction Bioinformatics (IABio Database

    Directory of Open Access Journals (Sweden)

    Jeongseok Choi

    2016-03-01

    Full Text Available Internet addiction (IA) has become a widespread and problematic phenomenon as smart devices pervade society. Moreover, internet gaming disorder leads to increases in social expenditures for both individuals and nations alike. Although the prevention and treatment of IA are becoming more important, the diagnosis of IA remains problematic. Understanding the neurobiological mechanism of behavioral addictions is essential for the development of specific and effective treatments. Although there are many databases related to other addictions, a database for IA has not been developed yet. In addition, bioinformatics databases, especially genetic databases, require a high level of security and should be designed based on medical information standards. In this respect, our study proposes the OAuth standard protocol for database access authorization. The proposed IA Bioinformatics (IABio) database system is based on internet user authentication, in line with guidelines for medical information standards, and uses OAuth 2.0 as its access control technology. This study designed and developed the system requirements and configuration. The OAuth 2.0 protocol is expected to establish the security of personal medical information and be applied to genomic research on IA.
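A minimal sketch of the OAuth 2.0 authorization-code exchange that such a system relies on is shown below: the authorization code is traded for an access token, which is then presented as a Bearer token when querying the protected database API. The endpoint URLs, client credentials and resource path are placeholders, not the actual IABio service.

```python
# A minimal sketch of the OAuth 2.0 authorization-code grant used to gate
# database access. The endpoint URLs, client credentials and resource path are
# placeholders, not the real IABio service.
import requests

TOKEN_URL = "https://auth.example.org/oauth/token"        # hypothetical
RESOURCE_URL = "https://iabio.example.org/api/genotypes"  # hypothetical

def fetch_protected_resource(auth_code: str, redirect_uri: str) -> dict:
    # Step 1: exchange the authorization code for an access token.
    token_resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "authorization_code",
            "code": auth_code,
            "redirect_uri": redirect_uri,
            "client_id": "example-client-id",
            "client_secret": "example-client-secret",
        },
        timeout=10,
    )
    token_resp.raise_for_status()
    access_token = token_resp.json()["access_token"]

    # Step 2: call the protected database API with the Bearer token.
    resp = requests.get(
        RESOURCE_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```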

  20. Construction of a nasopharyngeal carcinoma 2D/MS repository with Open Source XML database--Xindice.

    Science.gov (United States)

    Li, Feng; Li, Maoyu; Xiao, Zhiqiang; Zhang, Pengfei; Li, Jianling; Chen, Zhuchu

    2006-01-11

    Many proteomics initiatives require integration of all information, with uniform criteria, from collection of samples and data display to publication of experimental results. The integration and exchange of these data of different formats and structures poses a great challenge. XML technology shows promise in handling this task due to its simplicity and flexibility. Nasopharyngeal carcinoma (NPC) is one of the most common cancers in southern China and Southeast Asia, and it shows marked geographic and racial differences in incidence. Although there are some cancer proteome databases now, there is still no NPC proteome database. The raw NPC proteome experiment data were captured into one XML document with the Human Proteome Markup Language (HUP-ML) editor and imported into the native XML database Xindice. The 2D/MS repository of the NPC proteome was constructed with Apache, PHP and Xindice to provide access to the database via the Internet. On our website, two methods, keyword query and click query, are provided to access the entries of the NPC proteome database. Our 2D/MS repository can be used to share the raw NPC proteomics data that are generated from gel-based proteomics experiments. The database, as well as the PHP source code for constructing users' own proteome repository, can be accessed at http://www.xyproteomics.org/.
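To illustrate the XML-document approach in miniature, the sketch below builds a small experiment record and answers a keyword-style query with XPath. The element names are illustrative only, not the actual HUP-ML schema, and Python's xml.etree stands in for the Xindice/PHP stack used by the repository.

```python
# A sketch of the XML-document approach: one experiment captured as XML and
# queried with XPath. Element names here are illustrative only, not the actual
# HUP-ML schema, and xml.etree stands in for the Xindice/PHP stack.
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    """
    <experiment sample="NPC" gel="2D-01">
      <spot id="S103">
        <protein accession="P04792" name="HSPB1"/>
        <mass_spec peptides="12" coverage="0.41"/>
      </spot>
      <spot id="S210">
        <protein accession="P06733" name="ENO1"/>
        <mass_spec peptides="9" coverage="0.35"/>
      </spot>
    </experiment>
    """
)

# Keyword-style query: find the spot that identified a given protein.
for spot in doc.findall(".//spot"):
    protein = spot.find("protein")
    if protein.get("name") == "ENO1":
        print(spot.get("id"), protein.get("accession"))
```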

  1. Access 2013 for dummies

    CERN Document Server

    Ulrich Fuller, Laurie

    2013-01-01

    The easy guide to Microsoft Access returns with updates on the latest version! Microsoft Access allows you to store, organize, view, analyze, and share data; the new Access 2013 release enables you to build even more powerful, custom database solutions that integrate with the web and enterprise data sources. Access 2013 For Dummies covers all the new features of the latest version of Access and serves as an ideal reference, combining the latest Access features with the basics of building usable databases. You'll learn how to create an app from the Welcome screen, get support

  2. Pro Access 2010 Development

    CERN Document Server

    Collins, Mark

    2011-01-01

    Pro Access 2010 Development is a fundamental resource for developing business applications that take advantage of the features of Access 2010 and the many sources of data available to your business. In this book, you'll learn how to build database applications, create Web-based databases, develop macros and Visual Basic for Applications (VBA) tools for Access applications, integrate Access with SharePoint and other business systems, and much more. Using a practical, hands-on approach, this book will take you through all the facets of developing Access-based solutions, such as data modeling, co

  3. Electricity access for geographically disadvantaged rural communities--technology and policy insights

    International Nuclear Information System (INIS)

    Chaurey, Akanksha; Ranganathan, Malini; Mohanty, Parimita

    2004-01-01

    The purpose of this paper is to weigh the issues and options for increasing electricity access in remote and geographically challenged villages in interior Rajasthan, the desert state in Western India where power sector reforms are currently underway. By first providing an overview of reforms and various electrification policy initiatives in India, the paper then analyzes the specific problems, as studied at the grass-roots level, with respect to rural electricity access and the use of off-grid renewables. Finally, it discusses interventions that could facilitate access to electricity by suggesting a sequential distributed generation (DG)-based approach, wherein consecutive DG schemes, incorporating the requisite technological, financial, and institutional arrangements, are designed depending on the developmental requirements of the community. In essence, this approach fits under the broader need to understand how the three 'Rs', namely rural electrification (the process), power sector reforms (the catalyst), and the use of renewable energy technologies (the means), could potentially converge to meet the needs of India's rural poor.

  4. Electricity access for geographically disadvantaged rural communities--technology and policy insights

    Energy Technology Data Exchange (ETDEWEB)

    Chaurey, Akanksha E-mail: akanksha@teri.res.in; Ranganathan, Malini E-mail: malinir@teri.res.in; Mohanty, Parimita

    2004-10-01

    The purpose of this paper is to weigh the issues and options for increasing electricity access in remote and geographically challenged villages in interior Rajasthan, the desert state in Western India where power sector reforms are currently underway. By first providing an overview of reforms and various electrification policy initiatives in India, the paper then analyzes the specific problems, as studied at the grass-roots level, with respect to rural electricity access and the use of off-grid renewables. Finally, it discusses interventions that could facilitate access to electricity by suggesting a sequential distributed generation (DG)-based approach, wherein consecutive DG schemes, incorporating the requisite technological, financial, and institutional arrangements, are designed depending on the developmental requirements of the community. In essence, this approach fits under the broader need to understand how the three 'Rs', namely rural electrification (the process), power sector reforms (the catalyst), and the use of renewable energy technologies (the means), could potentially converge to meet the needs of India's rural poor.

  5. On-line atomic data access

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, D.R. [Oak Ridge National Lab., TN (United States); Nash, J.K. [Lawrence Livermore National Lab., CA (United States)

    1996-04-01

    The need for atomic data is one which continues to expand in a wide variety of applications including fusion energy, astrophysics, laser-produced plasma research, and plasma processing. Modern computer database and communications technology enables these data to be placed on-line and obtained by users of the Internet. Presented here is a summary of the observations and conclusions regarding such on-line atomic data access derived from a forum held at the Tenth APS Topical Conference on Atomic Processes in Plasmas.

  6. DATABASE REPLICATION IN HETEROGENOUS PLATFORM

    OpenAIRE

    Hendro Nindito; Evaristus Didik Madyatmadja; Albert Verasius Dian Sano

    2014-01-01

    The application of diverse database technologies in enterprises today is increasingly common practice. To provide high availability and survivability of real-time information, a database replication technology that has the capability to replicate databases across heterogeneous platforms is required. The purpose of this research is to find a technology with such capability. In this research, the data source is stored in an MS SQL Server database running on Windows. The data will be replicated to MyS...

  7. [Improving global access to new vaccines: intellectual property, technology transfer, and regulatory pathways].

    Science.gov (United States)

    Crager, Sara Eve

    2015-01-01

    The 2012 World Health Assembly Global Vaccine Action Plan called for global access to new vaccines within 5 years of licensure. Current approaches have proven insufficient to achieve sustainable vaccine pricing within such a timeline. Paralleling the successful strategy of generic competition to bring down drug prices, a clear consensus is emerging that market entry of multiple suppliers is a critical factor in expeditiously bringing down prices of new vaccines. In this context, key target objectives for improving access to new vaccines include overcoming intellectual property obstacles, streamlining regulatory pathways for biosimilar vaccines, and reducing market entry timelines for developing-country vaccine manufacturers by transfer of technology and know-how. I propose an intellectual property, technology, and know-how bank as a new approach to facilitate widespread access to new vaccines in low- and middle-income countries by efficient transfer of patented vaccine technologies to multiple developing-country vaccine manufacturers.

  8. Improving global access to new vaccines: intellectual property, technology transfer, and regulatory pathways.

    Science.gov (United States)

    Crager, Sara Eve

    2014-11-01

    The 2012 World Health Assembly Global Vaccine Action Plan called for global access to new vaccines within 5 years of licensure. Current approaches have proven insufficient to achieve sustainable vaccine pricing within such a timeline. Paralleling the successful strategy of generic competition to bring down drug prices, a clear consensus is emerging that market entry of multiple suppliers is a critical factor in expeditiously bringing down prices of new vaccines. In this context, key target objectives for improving access to new vaccines include overcoming intellectual property obstacles, streamlining regulatory pathways for biosimilar vaccines, and reducing market entry timelines for developing-country vaccine manufacturers by transfer of technology and know-how. I propose an intellectual property, technology, and know-how bank as a new approach to facilitate widespread access to new vaccines in low- and middle-income countries by efficient transfer of patented vaccine technologies to multiple developing-country vaccine manufacturers.

  9. Radiation immune RAM semiconductor technology for the 80's. [Random Access Memory

    Science.gov (United States)

    Hanna, W. A.; Panagos, P.

    1983-01-01

    This paper presents current and short term future characteristics of RAM semiconductor technologies which were obtained by literature survey and discussions with cognizant Government and industry personnel. In particular, total ionizing dose tolerance and high energy particle susceptibility of the technologies are addressed. Technologies judged compatible with spacecraft applications are ranked to determine the best current and future technology for fast access (less than 60 ns), radiation tolerant RAM.

  10. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  11. GenomeRNAi: a database for cell-based RNAi phenotypes.

    Science.gov (United States)

    Horn, Thomas; Arziman, Zeynep; Berger, Juerg; Boutros, Michael

    2007-01-01

    RNA interference (RNAi) has emerged as a powerful tool to generate loss-of-function phenotypes in a variety of organisms. Combined with the sequence information of almost completely annotated genomes, RNAi technologies have opened new avenues to conduct systematic genetic screens for every annotated gene in the genome. As increasingly large datasets of RNAi-induced phenotypes become available, an important challenge remains the systematic integration and annotation of functional information. Genome-wide RNAi screens have been performed both in Caenorhabditis elegans and Drosophila for a variety of phenotypes, and several RNAi libraries have become available to assess phenotypes for almost every gene in the genome. These screens were performed using different types of assays, from visible phenotypes to focused transcriptional readouts, and provide a rich data source for functional annotation across different species. The GenomeRNAi database provides access to published RNAi phenotypes obtained from cell-based screens and maps them to their genomic locus, including possible non-specific regions. The database also gives access to sequence information of RNAi probes used in various screens. It can be searched by phenotype, by gene, by RNAi probe or by sequence and is accessible at http://rnai.dkfz.de.

  12. Development of a functional, internet-accessible department of surgery outcomes database.

    Science.gov (United States)

    Newcomb, William L; Lincourt, Amy E; Gersin, Keith; Kercher, Kent; Iannitti, David; Kuwada, Tim; Lyons, Cynthia; Sing, Ronald F; Hadzikadic, Mirsad; Heniford, B Todd; Rucho, Susan

    2008-06-01

    The need for surgical outcomes data is increasing due to pressure from insurance companies, patients, and the need for surgeons to keep their own "report card". Current data management systems are limited by an inability to stratify outcomes based on patients, surgeons, and differences in surgical technique. Surgeons, along with research and informatics personnel from an academic, hospital-based Department of Surgery and a state university's Department of Information Technology, formed a partnership to develop a dynamic, internet-based, clinical data warehouse. A five-component model was used: data dictionary development, web application creation, participating center education and management, statistics applications, and data interpretation. A data dictionary was developed from a list of data elements to address needs of research, quality assurance, industry, and centers of excellence. A user-friendly web interface was developed with menu-driven check boxes, multiple electronic data entry points, direct downloads from hospital billing information, and web-based patient portals. Data were collected on a Health Insurance Portability and Accountability Act-compliant server with a secure firewall. Protected health information was de-identified. Data management strategies included automated auditing, on-site training, a trouble-shooting hotline, and Institutional Review Board oversight. Real-time, daily, monthly, and quarterly data reports were generated. Fifty-eight publications and 109 abstracts have been generated from the database during its development and implementation. Seven national academic departments now use the database to track patient outcomes. The development of a robust surgical outcomes database requires a combination of clinical, informatics, and research expertise. Benefits of surgeon involvement in outcomes research include: tracking individual performance, patient safety, surgical research, legal defense, and the ability to provide accurate information

  13. Brasilia’s Database Administrators

    Directory of Open Access Journals (Sweden)

    Jane Adriana

    2016-06-01

    Full Text Available Database administration has gained an essential role in the management of new database technologies. Different data models are being created to support enormous data volumes, beyond the traditional relational database. These new models are called NoSQL (Not only SQL) databases. The adoption of best practices and procedures has become essential for the operation of database management systems. Thus, this paper investigates some of the techniques and tools used by database administrators. The study highlights features and particularities of databases within the area of Brasilia, the capital of Brazil. The results point to which new technologies regarding database management are currently the most relevant, as well as the central issues in this area.

  14. Software listing: CHEMTOX database

    International Nuclear Information System (INIS)

    Moskowitz, P.D.

    1993-01-01

    Initially launched in 1983, the CHEMTOX Database was among the first microcomputer databases containing hazardous chemical information. The database is used in many industries and government agencies in more than 17 countries. Updated quarterly, the CHEMTOX Database provides detailed environmental and safety information on 7500-plus hazardous substances covered by dozens of regulatory and advisory sources. This brief listing describes the method of accessing the data and provides ordering information for those wishing to obtain the CHEMTOX Database.

  15. Feasibility of Smartphone Based Photogrammetric Point Clouds for the Generation of Accessibility Maps

    Science.gov (United States)

    Angelats, E.; Parés, M. E.; Kumar, P.

    2018-05-01

    Accessible cities with accessible services are a long-standing claim of people with reduced mobility. But this demand is still far from becoming a reality, as a lot of work remains to be done. The first step towards accessible cities is to know the real situation of a city and its pavement infrastructure. Detailed maps or databases on street slopes, access to sidewalks, mobility in public parks and gardens, etc. are required. In this paper, we propose to use smartphone-based photogrammetric point clouds as a starting point to create accessibility maps or databases. This paper analyses the performance of these point clouds and the complexity of the image acquisition procedure required to obtain them. The paper proves, through two test cases, that smartphone technology is an economical and feasible solution to obtain the required information, which is quite often sought by city planners to generate accessibility maps. The proposed approach paves the way to generating, in the near term, accessibility maps through the use of point clouds derived from crowdsourced smartphone imagery.

  16. Assessment of Low-Income Adults' Access to Technology: Implications for Nutrition Education

    Science.gov (United States)

    Neuenschwander, Lauren M.; Abbott, Angela; Mobley, Amy R.

    2012-01-01

    Objective: The main objective of this study was to investigate access and use of technologies such as the Internet among Indiana's low-income population. The secondary objective was to determine whether access and use of computers significantly differed by age, race, and/or education level. Methods: Data were collected from low-income adult…

  17. Cell Centred Database (CCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Cell Centered Database (CCDB) is a web accessible database for high resolution 2D, 3D and 4D data from light and electron microscopy, including correlated imaging.

  18. A Discourse On Broadband Technologies And Curriculum Access In Elective Home Learning

    Directory of Open Access Journals (Sweden)

    Andrew MCAVOY

    2014-12-01

    Full Text Available The extent to which broadband technologies are being considered when accessing the curriculum is increasingly evident in traditional learning environments such as schools and colleges. This article explores the impact that these technologies are having on the home schooling community by offering enhanced access and opportunities. It suggests that they have generated improved choices and greater freedoms for learning communities. They have shone a light on the curriculum and removed it from the shadows. The curriculum is no longer the preserve of the educational establishment. The secret garden has been breached by technologies such as broadband, and the democratisation of the curriculum is progressively evident as more diverse learning communities are given increased access and control over the curriculum. The author asks how this is being reflected in policy and translated into practice by the home schooling community, whilst acknowledging the contemporary nature of broadband technologies and how they are influencing the decision-making process of potential home schoolers. Looking to the future, the author suggests that the political agenda is not providing clear direction and that this is being determined by social reform outside the political sphere and largely driven by the consumer, in this case the learner. The relatively current nature of this debate is in itself justification for further research if we are to develop a clearer understanding of how new technologies such as broadband are influencing policy and practice in the home schooling community.

  19. The ethics of attaching research conditions to access to new health technologies.

    Science.gov (United States)

    Holland, Stephen; Hope, Tony

    2012-06-01

    Decisions on which new health technologies to provide are controversial because of the scarcity of healthcare resources, the competing demands of payers, providers and patients, and the uncertainty of the evidence base. Given this, additional information about new health technologies is often considered valuable. One response is to make access to a new health technology conditional on further research. Access can be restricted to patients who participate in a research study, such as a randomised controlled trial; alternatively, a new treatment can be made generally available, but only on condition that further evidence is collected (e.g. on long-term outcomes and adverse events, in patient registries). The National Institute for Health and Clinical Excellence (NICE), which provides guidance on which new health technologies to make available under the UK's NHS, has, for example, made some research-conditional recommendations, and the current interest in such options suggests that they are likely to become more prevalent in the future. This paper identifies and discusses the main ethical issues created by this distinctive range of recommendations. We argue that decisions to put research conditions on access to new technologies are compatible with widely accepted values, principles and practices relevant to resource allocation. However, there are important features of these distinctive judgements that must be taken into account by resource allocation decision-making bodies and research ethics committees, and that require new sorts of empirical data.

  20. Student Access to Information Technology and Perceptions of Future Opportunities in Two Small Labrador Communities

    Directory of Open Access Journals (Sweden)

    Della Healey

    2002-02-01

    Full Text Available The potential of information technology is increasingly being recognized for the access it provides to educational and vocational opportunities. In Canada, many small schools in rural communities have taken advantage of information technologies to help overcome geographic isolation for students. This article is about students in two small and geographically isolated Labrador communities. Twenty senior students were found to have varying degrees of access to information technologies. Differences were found in their perceptions of the benefits of information technology for their educational and vocational futures.

  1. The challenge of making nuclear technologies acceptable, accessible and affordable

    International Nuclear Information System (INIS)

    Ramamurthy, V.S.

    2009-01-01

    Full text: It is more than five decades since the first successful demonstration of nuclear power for commercial electricity production. The same decades have also seen the successful demonstration of several other applications of nuclear technologies that can contribute directly to human development, for example in the food and agriculture, human and animal health, environment and water sectors. In spite of several successful demonstrations and applications in these fields, it is somewhat strange that their full potential is yet to be realized. More importantly, their availability to populations across the world is highly skewed. Three barriers have been identified for the widespread use of nuclear technologies for development: Acceptability, Accessibility and Affordability. It is an unfortunate twist of fate that the first public demonstration of nuclear technology was its destructive power. The demonization of anything nuclear that followed was further compounded by the discussions on the unresolved questions of tackling long-lived radioactive wastes, our inability to arrive at a global consensus on nuclear disarmament, and issues of nuclear proliferation. These have certainly had a negative impact on the public acceptance of nuclear technologies across the board. While the recent concerns about global climate change following the emission of carbon dioxide from excessive hydrocarbon burning to meet our increasing energy needs have revived interest in nuclear energy, a lot needs to be done to de-demonize nuclear technologies in the public mind, leading to increased acceptance of nuclear technologies for development. Lack of resources, infrastructure and trained manpower also has a negative impact on the accessibility and affordability of nuclear technologies for development. It is argued that only education holds the key to this. The role of international partnerships is also highlighted in realizing the full potential of nuclear technologies for

  2. High Energy Nuclear Database: A Testbed for Nuclear Data Information Technology

    International Nuclear Information System (INIS)

    Brown, D A; Vogt, R; Beck, B; Pruet, J

    2007-01-01

    We describe the development of an on-line high-energy heavy-ion experimental database. When completed, the database will be searchable and cross-indexed with relevant publications, including published detector descriptions. While this effort is relatively new, it will eventually contain all published data from older heavy-ion programs as well as published data from current and future facilities. These data include all measured observables in proton-proton, proton-nucleus and nucleus-nucleus collisions. Once in general use, this database will have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models for a broad range of experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion, target and source development for upcoming facilities such as the International Linear Collider and homeland security. This database is part of a larger proposal that includes the production of periodic data evaluations and topical reviews. These reviews would provide an alternative and impartial mechanism to resolve discrepancies between published data from rival experiments and between theory and experiment. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This project serves as a testbed for the further development of an object-oriented nuclear data format and database system. By using "off-the-shelf" software tools and techniques, the system is simple, robust, and extensible. Eventually we envision a "Grand Unified Nuclear Format" encapsulating data types used in the ENSDF, ENDF/B, EXFOR, NSR and other formats, including processed data formats

  3. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  4. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working, as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even for geographically distributed clients, if data copies are located close to them. Despite its advantages, replication is not a straightforward technique to apply, and

  5. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    Science.gov (United States)

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  6. Replikasi Unidirectional pada Heterogen Database [Unidirectional Replication in Heterogeneous Databases]

    Directory of Open Access Journals (Sweden)

    Hendro Nindito

    2013-12-01

    Full Text Available The use of diverse database technologies in enterprises today cannot be avoided. Thus, technology is needed to generate information in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we use a Windows-based MS SQL Server database as the source and a Linux-based Oracle database as the goal. The research method used is prototyping, in which development can be done quickly and testing of working models of the interaction process is done repeatedly. This research finds that database replication technology using Oracle GoldenGate can be applied in heterogeneous environments in real time as well.
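The study itself uses Oracle GoldenGate, which replicates from the source database's transaction logs. As a much-simplified illustration of the unidirectional idea only, the sketch below polls a SQL Server source for rows changed since the last synchronisation and merges them into an Oracle target; connection strings, table and column names are assumptions.

```python
# A much-simplified illustration of unidirectional replication between
# heterogeneous databases: read rows changed since the last sync from a
# SQL Server source and upsert them into an Oracle target. This is not Oracle
# GoldenGate (which works from the transaction logs); connection strings,
# table and column names are assumptions.
import pyodbc
import oracledb

def sync_orders(last_sync_ts) -> int:
    src = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=mssql-host;"
        "DATABASE=sales;UID=repl;PWD=secret"
    )
    dst = oracledb.connect(user="repl", password="secret", dsn="oracle-host/salespdb")

    rows = src.cursor().execute(
        "SELECT order_id, customer_id, amount, updated_at "
        "FROM dbo.orders WHERE updated_at > ?", last_sync_ts
    ).fetchall()

    cur = dst.cursor()
    for order_id, customer_id, amount, updated_at in rows:
        # Upsert each changed row into the Oracle target.
        cur.execute(
            """MERGE INTO orders t
               USING (SELECT :order_id AS order_id FROM dual) s
               ON (t.order_id = s.order_id)
               WHEN MATCHED THEN UPDATE
                    SET customer_id = :customer_id, amount = :amount, updated_at = :updated_at
               WHEN NOT MATCHED THEN INSERT (order_id, customer_id, amount, updated_at)
                    VALUES (:order_id, :customer_id, :amount, :updated_at)""",
            {"order_id": order_id, "customer_id": customer_id,
             "amount": amount, "updated_at": updated_at},
        )
    dst.commit()
    src.close()
    dst.close()
    return len(rows)
```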

  7. Utilizing Multimedia Database Access: Teaching Strategies Using the iPad in the Dance Classroom

    Science.gov (United States)

    Ostashewski, Nathaniel; Reid, Doug; Ostashewski, Marcia

    2016-01-01

    This article presents action research that identified iPad tablet technology-supported teaching strategies in a dance classroom context. Dance classrooms use instructor-accessed music as a regular element of lessons, but video is both challenging and time-consuming to produce or display. The results of this study highlight how the Apple iPad…

  8. X-ray Photoelectron Spectroscopy Database (Version 4.1)

    Science.gov (United States)

    SRD 20 X-ray Photoelectron Spectroscopy Database (Version 4.1) (Web, free access)   The NIST XPS Database gives access to energies of many photoelectron and Auger-electron spectral lines. The database contains over 22,000 line positions, chemical shifts, doublet splittings, and energy separations of photoelectron and Auger-electron lines.

  9. Student Access to and Skills in Using Technology in an Open and Distance Learning Context

    Directory of Open Access Journals (Sweden)

    Hanlie Liebenberg

    2012-10-01

    Full Text Available Amidst the different challenges facing higher education, and particularly distance education (DE) and open distance learning (ODL), access to information and communication technology (ICT) and students’ abilities to use ICTs are highly contested issues in the South African higher education landscape. While there are various opinions about the scope and definition of the digital divide, increasing empirical evidence questions the uncritical use of the notion of the digital divide in South African and international higher education discourses. In the context of the University of South Africa (Unisa) as a mega ODL institution, students’ access to technology and their functional competence are some of the critical issues to consider as Unisa prepares our graduates for an increasingly digital and networked world. This paper discusses a descriptive study that investigated students’ access to technology and their capabilities in using technology, within the broader discourse of the “digital divide.” Results support literature that challenges a simplistic understanding of the notion of the “digital divide” and reveal that the nature of access is varied.

  10. Serial killer: il database mondiale

    Directory of Open Access Journals (Sweden)

    Gaetano Parente

    2016-07-01

    Full Text Available The complex and multifaceted study of serial killers is made partly difficult by the current level of progress that has led these deviant individuals to evolve in terms of shrewdness (concerning staging and mobility). Despite the important work of some scholars who have proposed significant theories, it remains particularly common, when it comes to serial murders, for links among homicides committed by the same person in different parts of the world to go unnoticed. It is therefore crucial to develop a worldwide database that allows all police forces to access information collected at the crime scenes of murders that are particularly absurd and committed without any apparent reason. It will then be up to the profiler, using ad hoc and technologically advanced tools, to collect this information at the crime scene, and the information would be made available to all police forces thanks to the worldwide database.

  11. Construction of a nasopharyngeal carcinoma 2D/MS repository with Open Source XML Database – Xindice

    Directory of Open Access Journals (Sweden)

    Li Jianling

    2006-01-01

    Full Text Available Abstract Background Many proteomics initiatives require integration of all information, with uniform criteria, from collection of samples and data display to publication of experimental results. The integration and exchange of these data of different formats and structures poses a great challenge. XML technology shows promise in handling this task due to its simplicity and flexibility. Nasopharyngeal carcinoma (NPC) is one of the most common cancers in southern China and Southeast Asia, and it shows marked geographic and racial differences in incidence. Although there are some cancer proteome databases now, there is still no NPC proteome database. Results The raw NPC proteome experiment data were captured into one XML document with the Human Proteome Markup Language (HUP-ML) editor and imported into the native XML database Xindice. The 2D/MS repository of the NPC proteome was constructed with Apache, PHP and Xindice to provide access to the database via the Internet. On our website, two methods, keyword query and click query, are provided to access the entries of the NPC proteome database. Conclusion Our 2D/MS repository can be used to share the raw NPC proteomics data that are generated from gel-based proteomics experiments. The database, as well as the PHP source code for constructing users' own proteome repository, can be accessed at http://www.xyproteomics.org/.

  12. The AMMA database

    Science.gov (United States)

    Boichard, Jean-Luc; Brissebrat, Guillaume; Cloche, Sophie; Eymard, Laurence; Fleury, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim

    2010-05-01

    The AMMA project includes aircraft, ground-based and ocean measurements, an intensive use of satellite data and diverse modelling studies. Therefore, the AMMA database aims at storing a great amount and a large variety of data, and at providing the data as rapidly and safely as possible to the AMMA research community. In order to stimulate the exchange of information and collaboration between researchers from different disciplines or using different tools, the database provides a detailed description of the products and uses standardized formats. The AMMA database contains: - AMMA field campaign datasets; - historical data in West Africa from 1850 (operational networks and previous scientific programs); - satellite products from past and future satellites, (re-)mapped on a regular latitude/longitude grid and stored in NetCDF format (CF Convention); - model outputs from atmosphere or ocean operational (re-)analysis and forecasts, and from research simulations. The outputs are processed as the satellite products are. Before accessing the data, any user has to sign the AMMA data and publication policy. This charter only covers the use of data in the framework of scientific objectives and categorically excludes the redistribution of data to third parties and the usage for commercial applications. Some collaboration between data producers and users, and the mention of the AMMA project in any publication, are also required. The AMMA database and the associated on-line tools have been fully developed and are managed by two teams in France (IPSL Database Centre, Paris and OMP, Toulouse). Users can access data of both data centres using a unique web portal. This website is composed of different modules: - Registration: forms to register, read and sign the data use charter when a user visits for the first time - Data access interface: a user-friendly tool for building a data extraction request by selecting various criteria like location, time, parameters... The request can
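Once a product has been retrieved from the portal, reading one of the regular-grid NetCDF (CF convention) files is straightforward with the netCDF4 package, as in the sketch below; the file name and the variable names (lat, lon, precip) are assumptions about a typical product rather than an actual AMMA file.

```python
# A sketch of reading one regular-grid NetCDF (CF convention) product of the
# kind served by the AMMA database. The file name and the variable names
# (lat, lon, precip) are assumptions about a typical product.
from netCDF4 import Dataset

with Dataset("amma_satellite_precip_2006.nc") as ds:
    lats = ds.variables["lat"][:]
    lons = ds.variables["lon"][:]
    precip = ds.variables["precip"][:]          # dimensions: (time, lat, lon)
    print(ds.variables["precip"].units)         # CF metadata, e.g. "mm/day"
    # Mean over time for a single grid cell (first latitude/longitude index).
    print(float(precip[:, 0, 0].mean()))
```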

  13. Database Quality and Access Issues Relevant to Research Using Anesthesia Information Management System Data.

    Science.gov (United States)

    Epstein, Richard H; Dexter, Franklin

    2018-07-01

    For this special article, we reviewed the computer code used to extract the data, and the text of all 47 studies published between January 2006 and August 2017 using anesthesia information management system (AIMS) data from Thomas Jefferson University Hospital (TJUH). Data from this institution were used in the largest number (P = .0007) of papers describing the use of AIMS published in this time frame. The AIMS was replaced in April 2017, making this a finite sample. The objective of the current article was to identify factors that made TJUH successful in publishing anesthesia informatics studies. We examined the structured query language used for each study to determine the extent to which databases outside of the AIMS were used. We examined data quality from the perspectives of completeness, correctness, concordance, plausibility, and currency. Our results were that most studies could not have been completed without external database sources (36/47, 76.6%; P = .0003 compared with 50%). The operating room management system was linked to the AIMS and was used significantly more frequently (26/36, 72%) than other external sources. Access to these external data sources was provided, allowing exploration of data quality. The TJUH AIMS used high-resolution timestamps (to the nearest 3 milliseconds) and created audit tables to track changes to clinical documentation. Automatic data were recorded at 1-minute intervals and were not editable; data cleaning occurred during analysis. Few paired events with an expected order were out of sequence. Although most data elements were of high quality, there were notable exceptions, such as frequent missing values for estimated blood loss, height, and weight. Some values were duplicated with different units, and others were stored in varying locations. Our conclusions are that linking the TJUH AIMS to the operating room management system was a critical step in enabling publication of multiple studies using AIMS data. Access to this and
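As a small illustration of one concordance check of the kind described (paired events recorded out of their expected order), the pandas sketch below flags cases whose end timestamp precedes the start timestamp. The column names and sample values are invented for illustration and do not reflect the TJUH schema.

```python
# A sketch of one concordance check described above: counting paired events
# (for example, anesthesia start / anesthesia end) recorded out of their
# expected order. The DataFrame columns are assumptions, not the TJUH schema.
import pandas as pd

events = pd.DataFrame(
    {
        "case_id": [1, 2, 3],
        "anesthesia_start": pd.to_datetime(
            ["2016-01-04 07:32:00", "2016-01-04 08:05:00", "2016-01-04 09:10:00"]
        ),
        "anesthesia_end": pd.to_datetime(
            ["2016-01-04 09:15:00", "2016-01-04 07:55:00", "2016-01-04 11:02:00"]
        ),
    }
)

out_of_sequence = events[events["anesthesia_end"] <= events["anesthesia_start"]]
print(f"{len(out_of_sequence)} of {len(events)} cases have end before start")
print(out_of_sequence[["case_id"]])
```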

  14. Database system for management of health physics and industrial hygiene records

    International Nuclear Information System (INIS)

    Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring; J. L.

    1999-01-01

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection

  15. Sequential data access with Oracle and Hadoop: a performance comparison

    International Nuclear Information System (INIS)

    Baranowski, Zbigniew; Canali, Luca; Grancher, Eric

    2014-01-01

    The Hadoop framework has proven to be an effective and popular approach for dealing with 'Big Data' and, thanks to its scaling ability and optimised storage access, Hadoop Distributed File System-based projects such as MapReduce or HBase are seen as candidates to replace traditional relational database management systems whenever scalable speed of data processing is a priority. But do these projects deliver in practice? Does migrating to Hadoop's 'shared nothing' architecture really improve data access throughput? And, if so, at what cost? The authors answer these questions, addressing cost/performance as well as raw performance, based on a performance comparison between an Oracle-based relational database and Hadoop's distributed solutions such as MapReduce or HBase for sequential data access. A key feature of our approach is the use of an unbiased data model, as certain data models can significantly favour one of the technologies tested.
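The measurement idea can be illustrated with a few lines of Python: time a full sequential scan through a DB-API cursor and report throughput in rows per second. This is only a sketch of the methodology; sqlite3 stands in for the Oracle and Hadoop setups actually benchmarked, and the database file and table name are assumptions.

```python
# A sketch of the measurement idea only: time a full sequential scan through a
# DB-API cursor and report throughput in rows per second. sqlite3 stands in
# for the Oracle and Hadoop setups benchmarked in the paper; the table name is
# an assumption.
import sqlite3
import time

def scan_throughput(conn, table: str, batch: int = 10_000) -> float:
    cur = conn.cursor()
    cur.execute(f"SELECT * FROM {table}")
    rows = 0
    start = time.perf_counter()
    while True:
        chunk = cur.fetchmany(batch)
        if not chunk:
            break
        rows += len(chunk)
    elapsed = time.perf_counter() - start
    return rows / elapsed if elapsed > 0 else float("inf")

if __name__ == "__main__":
    conn = sqlite3.connect("events.db")
    print(f"{scan_throughput(conn, 'events'):.0f} rows/s")
```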

  16. HIV Structural Database

    Science.gov (United States)

    SRD 102 HIV Structural Database (Web, free access)   The HIV Protease Structural Database is an archive of experimentally determined 3-D structures of Human Immunodeficiency Virus 1 (HIV-1), Human Immunodeficiency Virus 2 (HIV-2) and Simian Immunodeficiency Virus (SIV) Proteases and their complexes with inhibitors or products of substrate cleavage.

  17. The Effect of Relational Database Technology on Administrative Computing at Carnegie Mellon University.

    Science.gov (United States)

    Golden, Cynthia; Eisenberger, Dorit

    1990-01-01

    Carnegie Mellon University's decision to standardize its administrative system development efforts on relational database technology and structured query language is discussed and its impact is examined in one of its larger, more widely used applications, the university information system. Advantages, new responsibilities, and challenges of the…

  18. Language and Text-to-Speech Technologies for Highly Accessible Language & Culture Learning

    Directory of Open Access Journals (Sweden)

    Anouk Gelan

    2011-06-01

    Full Text Available This contribution presents the results of the “Speech technology integrated learning modules for Intercultural Dialogue” project. The project objective was to increase the availability and quality of e-learning opportunities for less widely used and less taught European languages, using a user-friendly and highly accessible learning environment. The integration of new Text-to-Speech developments into web-based authoring software for tutorial CALL had a double goal: on the one hand, increasing the accessibility of e-learning packages, also for learners having difficulty reading (e.g. dyslexic learners) or preferring auditory learning; on the other hand, exploiting some of the didactic possibilities of this technology.

  19. Validating an infrared thermal switch as a novel access technology

    Directory of Open Access Journals (Sweden)

    Memarian Negar

    2010-08-01

    Full Text Available Abstract Background Recently, a novel single-switch access technology based on infrared thermography was proposed. The technology exploits the temperature differences between the inside and surrounding areas of the mouth as a switch trigger, thereby allowing voluntary switch activation upon mouth opening. However, for this technology to be clinically viable, it must be validated against a gold-standard switch, such as a chin switch, that taps into the same voluntary motion. Methods In this study, we report an experiment designed to gauge the concurrent validity of the infrared thermal switch. Ten able-bodied adults participated in a series of 3 test sessions where they simultaneously used both an infrared thermal and a conventional chin switch to perform multiple trials of a number identification task with visual, auditory and audiovisual stimuli. Participants also provided qualitative feedback about switch use. User performance with the two switches was quantified using an efficiency measure based on mutual information. Results User performance (p = 0.16) and response time (p = 0.25) with the infrared thermal switch were comparable to those of the gold standard. Users reported a preference for the infrared thermal switch given its non-contact nature and robustness to changes in user posture. Conclusions Thermal infrared access technology appears to be a valid single-switch alternative for individuals with disabilities who retain voluntary mouth opening and closing.

  20. Electricity access for geographically disadvantaged rural communities - technology and policy insights

    Energy Technology Data Exchange (ETDEWEB)

    Chaurey, A.; Malini Ranganathan [The Energy and Resources Institute, New Delhi (India). India Habitat Centre; Parimita Mohanty [Jadavpur University, Kolkota (India). School of Energy Studies

    2004-10-01

    The purpose of this paper is to weigh the issues and options for increasing electricity access in remote and geographically challenged villages in interior Rajasthan, the desert state in Western India where power sector reforms are currently underway. By first providing an overview of reforms and various electrification policy initiatives in India, the paper then analyzes the specific problems, as studied at the grass-roots level, with respect to rural electricity access and the use of off-grid renewables. Finally, it discusses interventions that could facilitate access to electricity by suggesting a sequential distributed generation (DG)-based approach, wherein consecutive DG schemes, incorporating the requisite technological, financial, and institutional arrangements, are designed depending on the developmental requirements of the community. In essence, this approach fits under the broader need to understand how the three "Rs", namely rural electrification (the process), power sector reforms (the catalyst), and the use of renewable energy technologies (the means), could potentially converge to meet the needs of India's rural poor. (author)

  1. JICST Factual Database: JICST Chemical Substance Safety Regulation Database

    Science.gov (United States)

    Abe, Atsushi; Sohma, Tohru

    The JICST Chemical Substance Safety Regulation Database is based on the Database of Safety Laws for Chemical Compounds constructed by the Japan Chemical Industry Ecology-Toxicology & Information Center (JETOC), sponsored by the Science and Technology Agency, in 1987. JICST modified the JETOC database system, added data and started the online service through JOIS-F (JICST Online Information Service-Factual database) in January 1990. The JICST database comprises eighty-three laws and fourteen hundred compounds. The authors outline the database, data items, files and search commands. An example of an online session is presented.

  2. DATABASES DEVELOPED IN INDIA FOR BIOLOGICAL SCIENCES

    Directory of Open Access Journals (Sweden)

    Gitanjali Yadav

    2017-09-01

    Full Text Available The complexity of biological systems requires the use of a variety of experimental methods of ever increasing sophistication to probe various cellular processes at molecular and atomic resolution. The availability of technologies for determining nucleic acid sequences of genes and atomic resolution structures of biomolecules prompted the development of major biological databases like GenBank and PDB almost four decades ago. India was one of the few countries to realize early the utility of such databases for progress in modern biology/biotechnology. The Department of Biotechnology (DBT), India, established the Biotechnology Information System (BTIS) network in the late eighties. Starting with the genome sequencing revolution at the turn of the century, the application of high-throughput sequencing technologies in biology and medicine for the analysis of genomes, transcriptomes, epigenomes and microbiomes has generated massive volumes of sequence data. The BTIS network has not only provided state-of-the-art computational infrastructure to research institutes and universities for utilizing various biological databases developed abroad in their research, it has also actively promoted research and development (R&D) projects in bioinformatics to develop a variety of biological databases in diverse areas. It is encouraging to note that a large number of biological databases and data-driven software tools developed in India have been published in leading peer-reviewed international journals like Nucleic Acids Research, Bioinformatics, Database, BMC, PLoS and NPG series publications. Some of these databases are not only unique, they are also highly accessed, as reflected in the number of citations. Apart from databases developed by individual research groups, BTIS has initiated consortium projects to develop major India-centric databases on Mycobacterium tuberculosis, Rice and Mango, which can potentially have practical applications in health and agriculture. Many of these biological

  3. 48 CFR 3004.470 - Security requirements for access to unclassified facilities, Information Technology resources...

    Science.gov (United States)

    2010-10-01

    ... access to unclassified facilities, Information Technology resources, and sensitive information. 3004.470... Technology resources, and sensitive information. ... ACQUISITION REGULATION (HSAR) GENERAL ADMINISTRATIVE MATTERS Safeguarding Classified and Sensitive Information...

  4. JASPAR 2014: an extensively expanded and updated open-access database of transcription factor binding profiles.

    Science.gov (United States)

    Mathelier, Anthony; Zhao, Xiaobei; Zhang, Allen W; Parcy, François; Worsley-Hunt, Rebecca; Arenillas, David J; Buchman, Sorana; Chen, Chih-yu; Chou, Alice; Ienasescu, Hans; Lim, Jonathan; Shyr, Casper; Tan, Ge; Zhou, Michelle; Lenhard, Boris; Sandelin, Albin; Wasserman, Wyeth W

    2014-01-01

    JASPAR (http://jaspar.genereg.net) is the largest open-access database of matrix-based nucleotide profiles describing the binding preference of transcription factors from multiple species. The fifth major release greatly expands the heart of JASPAR-the JASPAR CORE subcollection, which contains curated, non-redundant profiles-with 135 new curated profiles (74 in vertebrates, 8 in Drosophila melanogaster, 10 in Caenorhabditis elegans and 43 in Arabidopsis thaliana; a 30% increase in total) and 43 older updated profiles (36 in vertebrates, 3 in D. melanogaster and 4 in A. thaliana; a 9% update in total). The new and updated profiles are mainly derived from published chromatin immunoprecipitation-seq experimental datasets. In addition, the web interface has been enhanced with advanced capabilities in browsing, searching and subsetting. Finally, the new JASPAR release is accompanied by a new BioPython package, a new R tool package and a new R/Bioconductor data package to facilitate access for both manual and automated methods.
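
    The BioPython package mentioned above suggests one route to programmatic access; the following is a minimal sketch (not taken from the JASPAR release itself) of reading JASPAR-formatted profiles with Biopython's Bio.motifs module. The file name is a placeholder for any JASPAR-format PFM file downloaded from the site, and the matrix_id and name attributes are assumed to be populated as in the Biopython JASPAR parser.

```python
# Minimal sketch (not from the JASPAR release): parse a JASPAR-format PFM file
# with Biopython. "pfm_vertebrates.txt" is a placeholder for any file downloaded
# from the JASPAR site in the multi-profile JASPAR format.
from Bio import motifs

with open("pfm_vertebrates.txt") as handle:
    jaspar_motifs = motifs.parse(handle, "jaspar")

for m in list(jaspar_motifs)[:3]:
    print(m.matrix_id, m.name)                    # accession and TF name (assumed attributes)
    pwm = m.counts.normalize(pseudocounts=0.5)    # counts -> position weight matrix
    print(pwm.consensus)                          # consensus sequence of the profile
```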

  5. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

    Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on the examination of the accident databases by personal contact with the federal staff responsible for administration of the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and who to contact were prime questions to each of the database program managers. Additionally, how the agency uses the accident data was of major interest

  6. An Ontology as a Tool for Representing Fuzzy Data in Relational Databases

    Directory of Open Access Journals (Sweden)

    Carmen Martinez-Cruz

    2012-11-01

    Full Text Available Several applications to represent classical or fuzzy data in databases have been developed in the last two decades. However, these representations present some limitations, especially related to system portability and complexity. Ontologies provide a mechanism to represent data in an implementation-independent and web-accessible way. To take advantage of this, in this paper an ontology that represents the fuzzy relational database model has been redefined to let users or applications communicate with fuzzy data stored in fuzzy databases. The communication channel established between the ontology and any Relational Database Management System (RDBMS) is analysed in depth throughout the text to justify some of the advantages of the system: expressiveness, portability and platform heterogeneity. Moreover, some tools have been developed to define and manage fuzzy and classical data in relational databases using this ontology. An application that performs fuzzy queries using the same technology is also included in this proposal, together with some examples using real databases.

  7. NBIC: Search Ballast Report Database

    Science.gov (United States)

    NBIC has developed an online database that can be queried through its website. Data are accessible for all coastal areas, including the Great Lakes, and have been incorporated into the NBIC database as of August 2004; information on data availability is provided on the site.

  8. DoSSiER: Database of Scientific Simulation and Experimental Results

    CERN Document Server

    Wenzel, Hans; Genser, Krzysztof; Elvira, Daniel; Pokorski, Witold; Carminati, Federico; Konstantinov, Dmitri; Ribon, Alberto; Folger, Gunter; Dotti, Andrea

    2017-01-01

    The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.

  9. Exploring Teacher Pedagogy, Stages of Concern and Accessibility as Determinants of Technology Adoption

    Science.gov (United States)

    Burke, Paul F.; Schuck, Sandy; Aubusson, Peter; Kearney, Matthew; Frischknecht, Bart

    2018-01-01

    This research examines how the pedagogical orientations of teachers affect technology adoption in the classroom. At the same time, the authors account for the stage of concern that teachers are experiencing regarding the use of the technology, their access to the technology and the level of schooling at which they teach. The authors' investigation…

  10. C# Database Basics

    CERN Document Server

    Schmalz, Michael

    2012-01-01

    Working with data and databases in C# certainly can be daunting if you're coming from VB6, VBA, or Access. With this hands-on guide, you'll shorten the learning curve considerably as you master accessing, adding, updating, and deleting data with C#-basic skills you need if you intend to program with this language. No previous knowledge of C# is necessary. By following the examples in this book, you'll learn how to tackle several database tasks in C#, such as working with SQL Server, building data entry forms, and using data in a web service. The book's code samples will help you get started

  11. Aviation Safety Issues Database

    Science.gov (United States)

    Morello, Samuel A.; Ricks, Wendell R.

    2009-01-01

    The aviation safety issues database was instrumental in the refinement and substantiation of the National Aviation Safety Strategic Plan (NASSP). The issues database is a comprehensive set of issues from an extremely broad base of aviation functions, personnel, and vehicle categories, both nationally and internationally. Several aviation safety stakeholders such as the Commercial Aviation Safety Team (CAST) have already used the database. This broader interest was the genesis to making the database publicly accessible and writing this report.

  12. Database Description - FANTOM5 | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    General information: database name FANTOM5; covered organisms include Rattus norvegicus (Taxonomy ID: 10116) and Macaca mulatta (Taxonomy ID: 9544); database maintenance site: RIKEN Center for Life Science Technologies; Web services: not available; need for user registration: not available.

  13. Let's talk about it: dialogues with multimedia databases Database support for human activity

    NARCIS (Netherlands)

    de Vries, A.P.; van der Veer, Gerrit C.; Blanken, Henk

    We describe two scenarios of user tasks in which access to multimedia data plays a significant role. Because current multimedia databases cannot support these tasks, we introduce three new requirements on multimedia databases: multimedia objects should be active objects, querying is an interaction

  14. Managed access technology to combat contraband cell phones in prison: Findings from a process evaluation.

    Science.gov (United States)

    Grommon, Eric

    2018-02-01

    Cell phones in correctional facilities have emerged as one of the most pervasive forms of modern contraband. This issue has been identified as a top priority for many correctional administrators in the United States. Managed access, a technology that utilizes cellular signals to capture transmissions from contraband phones, has received notable attention as a promising tool to combat this problem. However, this technology has received little evaluative attention. The present study offers a foundational process evaluation and draws upon output measures and stakeholder interviews to identify salient operational challenges and subsequent lessons learned about implementing and maintaining a managed access system. Findings suggest that while managed access captures large volumes of contraband cellular transmissions, the technology requires significant implementation planning, personnel support, and complex partnerships with commercial cellular carriers. Lessons learned provide guidance for practitioners to navigate these challenges and for scholars to improve future evaluations of managed access. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. SWS: accessing SRS sites contents through Web Services

    OpenAIRE

    Romano, Paolo; Marra, Domenico

    2008-01-01

    Background Web Services and Workflow Management Systems can support creation and deployment of network systems, able to automate data analysis and retrieval processes in biomedical research. Web Services have been implemented at bioinformatics centres and workflow systems have been proposed for biological data analysis. New databanks are often developed by taking into account these technologies, but many existing databases do not allow programmatic access. Only a fraction of available datab...

  16. A dedicated database system for handling multi-level data in systems biology.

    Science.gov (United States)

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging. To overcome this, we designed and developed a dedicated database system that can serve and solve the vital issues in data management and thereby facilitate data integration, modeling and analysis in systems biology within a single database. In addition, a yeast data repository was implemented as an integrated database environment which is operated by the database system. Two applications were implemented to demonstrate extensibility and utilization of the system. Both illustrate how the user can access the database via the web query function and implemented scripts. These scripts are specific for two sample cases: 1) Detecting the pheromone pathway in protein interaction networks; and 2) Finding metabolic reactions regulated by Snf1 kinase. In this study we present the design of a database system which offers an extensible environment to efficiently capture the majority of biological entities and relations encountered in systems biology. Critical functions and control processes were designed and implemented to ensure consistent, efficient, secure and reliable transactions. The two sample cases on the yeast integrated data clearly demonstrate the value of a single database environment for systems biology research.

  17. AtomDB: Expanding an Accessible and Accurate Atomic Database for X-ray Astronomy

    Science.gov (United States)

    Smith, Randall

    Since its inception in 2001, the AtomDB has become the standard repository of accurate and accessible atomic data for the X-ray astrophysics community, including laboratory astrophysicists, observers, and modelers. Modern calculations of collisional excitation rates now exist - and are in AtomDB - for all abundant ions in a hot plasma. AtomDB has expanded beyond providing just a collisional model, and now also contains photoionization data from XSTAR as well as a charge exchange model, amongst others. However, building and maintaining an accurate and complete database that can fully exploit the diagnostic potential of high-resolution X-ray spectra requires further work. The Hitomi results, sadly limited as they were, demonstrated the urgent need for the best possible wavelength and rate data, not merely for the strongest lines but for the diagnostic features that may have 1% or less of the flux of the strong lines. In particular, incorporation of weak but powerfully diagnostic satellite lines will be crucial to understanding the spectra expected from upcoming deep observations with Chandra and XMM-Newton, as well as the XARM and Athena satellites. Beyond incorporating this new data, a number of groups, both experimental and theoretical, have begun to produce data with errors and/or sensitivity estimates. We plan to use this to create statistically meaningful spectral errors on collisional plasmas, providing practical uncertainties together with model spectra. We propose to continue to (1) engage the X-ray astrophysics community regarding their issues and needs, notably by a critical comparison with other related databases and tools, (2) enhance AtomDB to incorporate a large number of satellite lines as well as updated wavelengths with error estimates, (3) continue to update the AtomDB with the latest calculations and laboratory measurements, in particular velocity-dependent charge exchange rates, and (4) enhance existing tools, and create new ones as needed to

  18. BioServices: a common Python package to access biological Web Services programmatically.

    Science.gov (United States)

    Cokelaer, Thomas; Pultz, Dennis; Harder, Lea M; Serra-Musach, Jordi; Saez-Rodriguez, Julio

    2013-12-15

    Web interfaces provide access to numerous biological databases. Many can be accessed programmatically thanks to Web Services. Building applications that combine several of them would benefit from a single framework. BioServices is a comprehensive Python framework that provides programmatic access to major bioinformatics Web Services (e.g. KEGG, UniProt, BioModels, ChEMBLdb). Wrapping additional Web Services based on either Representational State Transfer or Simple Object Access Protocol/Web Services Description Language technologies is eased by the use of object-oriented programming. BioServices releases and documentation are available at http://pypi.python.org/pypi/bioservices under a GPL-v3 license.
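
    As an illustration of the programmatic access described above, the short sketch below uses the BioServices KEGG wrapper; it assumes the package is installed (pip install bioservices) and that the find() and get() methods mirror the KEGG REST operations as in the BioServices documentation. The gene identifiers are arbitrary examples.

```python
# Sketch assuming the BioServices package is installed (pip install bioservices);
# method names follow its KEGG wrapper documentation, identifiers are examples.
from bioservices import KEGG

kegg = KEGG()
hits = kegg.find("hsa", "zap70")      # keyword search of the human KEGG gene set
print(hits)
entry = kegg.get("hsa:7535")          # fetch the full flat-file record for one gene
print(entry.splitlines()[0])
```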

  19. Design of remote weather monitor system based on embedded web database

    International Nuclear Information System (INIS)

    Gao Jiugang; Zhuang Along

    2010-01-01

    The remote weather monitoring system is designed around the embedded Web database technology and the S3C2410 microprocessor as its core. The monitoring system can simultaneously monitor multi-channel sensor signals and give a dynamic Web page display of various types of meteorological information on a remote computer. The paper gives a detailed introduction to the construction and application of the Web database under embedded Linux. Test results show that the client accesses the Web page via GPRS or the Internet, acquires data, and displays the values of various types of meteorological information in an intuitive graphical way. (authors)
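
    The following is an illustrative sketch only, not the authors' S3C2410/embedded-Linux implementation: it stores multi-channel sensor readings in an embedded SQLite database and serves a simple dynamic page over HTTP so a remote client can view the latest values, which is the general pattern the abstract describes. All file names, ports and readings are invented.

```python
# Illustrative only (not the paper's S3C2410/embedded-Linux system): log sensor
# readings into an embedded SQLite database and serve the latest values as a
# simple dynamic page over HTTP.
import sqlite3
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

conn = sqlite3.connect("weather.db")
conn.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, channel TEXT, value REAL)")
conn.commit()

def record(channel, value):
    """Store one sensor reading with a timestamp."""
    conn.execute("INSERT INTO readings VALUES (?, ?, ?)", (time.time(), channel, value))
    conn.commit()

class Monitor(BaseHTTPRequestHandler):
    def do_GET(self):
        # Latest value per channel (SQLite returns the row holding MAX(ts)).
        rows = conn.execute(
            "SELECT channel, value, MAX(ts) FROM readings GROUP BY channel").fetchall()
        body = "<br>".join(f"{ch}: {val}" for ch, val, _ in rows).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    record("temperature", 21.5)       # example readings
    record("humidity", 48.0)
    HTTPServer(("", 8080), Monitor).serve_forever()
```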

  20. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  1. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  2. Scopus database: a review.

    Science.gov (United States)

    Burnham, Judy F

    2006-03-08

    The Scopus database provides access to STM journal articles and the references included in those articles, allowing the searcher to search both forward and backward in time. The database can be used for collection development as well as for research. This review provides information on the key points of the database and compares it to Web of Science. Neither database is all-inclusive; rather, they complement each other. If a library can only afford one, the choice must be based on institutional needs.

  3. Videophone Technology and Students with Deaf-Blindness: A Method for Increasing Access and Communication

    Science.gov (United States)

    Emerson, Judith; Bishop, John

    2012-01-01

    Introduction: Seeing the Possibilities with Videophone Technology began as a research project funded by the National Center for Technology Innovation. The project implemented a face-to-face social networking program for students with deaf-blindness to investigate the potential for increasing access and communication using videophone technology.…

  4. World Database of Happiness

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    1995-01-01

    textabstractABSTRACT The World Database of Happiness is an ongoing register of research on subjective appreciation of life. Its purpose is to make the wealth of scattered findings accessible, and to create a basis for further meta-analytic studies. The database involves four sections:
    1.

  5. Towards P2P XML Database Technology

    NARCIS (Netherlands)

    Y. Zhang (Ying)

    2007-01-01

    textabstractTo ease the development of data-intensive P2P applications, we envision a P2P XML Database Management System (P2P XDBMS) that acts as a database middle-ware, providing a uniform database abstraction on top of a dynamic set of distributed data sources. In this PhD work, we research which

  6. Evaluation of Cognitively Accessible Software to Increase Independent Access to Cellphone Technology for People with Intellectual Disability

    Science.gov (United States)

    Stock, S. E.; Davies, D. K.; Wehmeyer, M. L.; Palmer, S. B.

    2008-01-01

    Background: There are over two billion telephones in use worldwide. Yet, for millions of Americans with intellectual disabilities (ID), access to the benefits of cellphone technology is limited because of deficits in literacy, numerical comprehension, the proliferation of features and shrinking size of cellphone hardware and user interfaces.…

  7. Literacy disparities in patient access and health-related use of Internet and mobile technologies.

    Science.gov (United States)

    Bailey, Stacy C; O'Conor, Rachel; Bojarski, Elizabeth A; Mullen, Rebecca; Patzer, Rachel E; Vicencio, Daniel; Jacobson, Kara L; Parker, Ruth M; Wolf, Michael S

    2015-12-01

    Age- and race-related disparities in technology use have been well documented, but less is known about how health literacy influences technology access and use. The aim was to assess the association between patients' literacy skills and mobile phone ownership, use of text messaging, Internet access, and use of the Internet for health-related purposes. A secondary analysis utilized data from 1077 primary care patients enrolled in two multisite studies from 2011-2013. Patients were administered an in-person, structured interview. Patients with adequate health literacy were more likely than patients with marginal or low literacy to own a mobile phone or smartphone (96.8 vs. 95.2 vs. 90.1%, respectively), to have Internet access from their home (92.1 vs. 74.7 vs. 44.9%), and to use the Internet for email (93.0 vs. 75.7 vs. 38.5%). Literacy disparities in technology access and use are widespread, with lower-literate patients being less likely to own smartphones or to access and use the Internet, particularly for health reasons. Future interventions should consider these disparities and ensure that health promotion activities do not further exacerbate them. © 2014 John Wiley & Sons Ltd.

  8. A practical approach for inexpensive searches of radiology report databases.

    Science.gov (United States)

    Desjardins, Benoit; Hamilton, R Curtis

    2007-06-01

    We present a method to perform full-text searches of radiology reports for the large number of departments that do not have this ability as part of their radiology or hospital information system. A tool written in Microsoft Access (front end) has been designed to search a server (back end) containing an indexed weekly backup copy of the full relational database extracted from a radiology information system (RIS). This front-end/back-end approach has been implemented in a large academic radiology department and is used for teaching, research and administrative purposes. The weekly second backup of the 80 GB, 4-million-record RIS database takes 2 hours. Further indexing of the exported radiology reports takes 6 hours. Individual searches typically take less than 1 minute on the indexed database and 30-60 minutes on the nonindexed database. Guidelines to properly address privacy and institutional review board issues are closely followed by all users. This method has the potential to improve teaching, research, and administrative programs within radiology departments that cannot afford more expensive technology.
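
    By way of analogy, and not as the authors' Access/SQL tool, the sketch below indexes a copy of report text with SQLite's FTS5 extension (assuming the local SQLite build includes FTS5) so that free-text searches run against an indexed local copy rather than the live RIS database; the table layout and sample reports are invented for illustration.

```python
# Analogous sketch (not the authors' Access/SQL tool): index a local copy of
# report text with SQLite's FTS5 extension (requires an FTS5-enabled SQLite build)
# so free-text searches run against the copy, not the live RIS.
import sqlite3

conn = sqlite3.connect("reports_copy.db")
conn.execute("CREATE VIRTUAL TABLE IF NOT EXISTS reports USING fts5(accession, report_text)")
conn.executemany(
    "INSERT INTO reports VALUES (?, ?)",
    [("ACC001", "No evidence of pulmonary embolism."),
     ("ACC002", "Findings consistent with acute appendicitis.")],
)
conn.commit()

# Full-text query against the indexed copy.
for accession, text in conn.execute(
        "SELECT accession, report_text FROM reports WHERE reports MATCH ?", ("embolism",)):
    print(accession, text)
```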

  9. Recon2Neo4j: applying graph database technologies for managing comprehensive genome-scale networks.

    Science.gov (United States)

    Balaur, Irina; Mazein, Alexander; Saqi, Mansoor; Lysenko, Artem; Rawlings, Christopher J; Auffray, Charles

    2017-04-01

    The goal of this work is to offer a computational framework for exploring data from the Recon2 human metabolic reconstruction model. Advanced user access features have been developed using the Neo4j graph database technology and this paper describes key features such as efficient management of the network data, examples of the network querying for addressing particular tasks, and how query results are converted back to the Systems Biology Markup Language (SBML) standard format. The Neo4j-based metabolic framework facilitates exploration of highly connected and comprehensive human metabolic data and identification of metabolic subnetworks of interest. A Java-based parser component has been developed to convert query results (available in the JSON format) into SBML and SIF formats in order to facilitate further results exploration, enhancement or network sharing. The Neo4j-based metabolic framework is freely available from: https://diseaseknowledgebase.etriks.org/metabolic/browser/ . The java code files developed for this work are available from the following url: https://github.com/ibalaur/MetabolicFramework . ibalaur@eisbm.org. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
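
    A hedged sketch of how such a Neo4j-hosted network might be queried from Python with the official neo4j driver is shown below; the connection details and the node and relationship labels (Metabolite, Reaction, PARTICIPATES_IN) are illustrative assumptions, not the published Recon2Neo4j schema.

```python
# Hedged sketch: connection details and the labels Metabolite/Reaction and the
# PARTICIPATES_IN relationship are illustrative assumptions, not the published
# Recon2Neo4j schema.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

query = """
MATCH (m:Metabolite {name: $name})-[:PARTICIPATES_IN]->(r:Reaction)
RETURN r.id AS reaction, r.name AS reaction_name
"""

with driver.session() as session:
    for rec in session.run(query, name="atp"):
        print(rec["reaction"], rec["reaction_name"])

driver.close()
```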

  10. Brain Tumor Database, a free relational database for collection and analysis of brain tumor patient information.

    Science.gov (United States)

    Bergamino, Maurizio; Hamilton, David J; Castelletti, Lara; Barletta, Laura; Castellan, Lucio

    2015-03-01

    In this study, we describe the development and utilization of a relational database designed to manage the clinical and radiological data of patients with brain tumors. The Brain Tumor Database was implemented using MySQL v.5.0, while the graphical user interface was created using PHP and HTML, thus making it easily accessible through a web browser. This web-based approach allows for multiple institutions to potentially access the database. The BT Database can record brain tumor patient information (e.g. clinical features, anatomical attributes, and radiological characteristics) and be used for clinical and research purposes. Analytic tools to automatically generate statistics and different plots are provided. The BT Database is a free and powerful user-friendly tool with a wide range of possible clinical and research applications in neurology and neurosurgery. The BT Database graphical user interface source code and manual are freely available at http://tumorsdatabase.altervista.org. © The Author(s) 2013.

  11. Database in Artificial Intelligence.

    Science.gov (United States)

    Wilkinson, Julia

    1986-01-01

    Describes a specialist bibliographic database of literature in the field of artificial intelligence created by the Turing Institute (Glasgow, Scotland) using the BRS/Search information retrieval software. The subscription method for end-users--i.e., annual fee entitles user to unlimited access to database, document provision, and printed awareness…

  12. Database Optimizing Services

    Directory of Open Access Journals (Sweden)

    Adrian GHENCEA

    2010-12-01

    Full Text Available Almost every organization has a database at its centre. The database provides support for conducting different activities, whether production, sales and marketing or internal operations. Every day, a database is accessed for help in strategic decisions. Satisfying such needs therefore requires high-quality security and availability. Those needs can be met using a DBMS (Database Management System), which is, in fact, the software for a database. Technically speaking, it is software which uses a standard method of cataloguing, recovery, and running different data queries. A DBMS manages the input data, organizes it, and provides ways of modifying or extracting the data for its users or other programs. Managing the database is an operation that requires periodic updates, optimization and monitoring.

  13. WAIS Searching of the Current Contents Database

    Science.gov (United States)

    Banholzer, P.; Grabenstein, M. E.

    The Homer E. Newell Memorial Library of NASA's Goddard Space Flight Center is developing capabilities to permit Goddard personnel to access electronic resources of the Library via the Internet. The Library's support services contractor, Maxima Corporation, and their subcontractor, SANAD Support Technologies have recently developed a World Wide Web Home Page (http://www-library.gsfc.nasa.gov) to provide the primary means of access. The first searchable database to be made available through the HomePage to Goddard employees is Current Contents, from the Institute for Scientific Information (ISI). The initial implementation includes coverage of articles from the last few months of 1992 to present. These records are augmented with abstracts and references, and often are more robust than equivalent records in bibliographic databases that currently serve the astronomical community. Maxima/SANAD selected Wais Incorporated's WAIS product with which to build the interface to Current Contents. This system allows access from Macintosh, IBM PC, and Unix hosts, which is an important feature for Goddard's multiplatform environment. The forms interface is structured to allow both fielded (author, article title, journal name, id number, keyword, subject term, and citation) and unfielded WAIS searches. The system allows a user to: Retrieve individual journal article records. Retrieve Table of Contents of specific issues of journals. Connect to articles with similar subject terms or keywords. Connect to other issues of the same journal in the same year. Browse journal issues from an alphabetical list of indexed journal names.

  14. Developing hybrid near-space technologies for affordable access to suborbital space

    Science.gov (United States)

    Badders, Brian David

    High power rockets and high altitude balloons are two near-space technologies that could be combined in order to provide access to the mesosphere and, eventually, suborbital space. This "rockoon" technology has been used by several large budget space programs before being abandoned in favor of even more expensive, albeit more accurate, ground launch systems. With the increased development of nano-satellites and atmospheric sensors, combined with rising interest in global atmospheric data, there is increasing demand for affordable access to extreme altitudes that does not necessarily require the precision of ground launches. Development of hybrid near-space technologies for access to over 200k ft. on a small budget brings many challenges within engineering, systems integration, cost analysis, market analysis, and business planning. This research includes the design and simulation testing of all the systems needed for a safe and reusable launch system, the cost analysis for initial production, the development of a business plan, and the development of a marketing plan. This project has both engineering and scientific significance in that it can prove the space readiness of new technologies, raise their technology readiness levels (TRLs), expedite the development process, and also provide new data to the scientific community. It also has the ability to stimulate university involvement in the aerospace industry and help to inspire the next generation of workers in the space sector. Previous development of high altitude balloon/high power rocket hybrid systems has been undertaken by government funded military programs or large aerospace corporations with varying degrees of success. However, there has yet to be a successful flight with this type of system which provides access to the upper mesosphere in a university setting. This project will aim to design and analyze a viable system while testing the engineering process under challenging budgetary constraints. The

  15. High energy nuclear database: a test-bed for nuclear data information technology

    International Nuclear Information System (INIS)

    Brown, D.A.; Vogt, R.; Beck, B.; Pruet, J.; Vogt, R.

    2008-01-01

    We describe the development of an on-line high-energy heavy-ion experimental database. When completed, the database will be searchable and cross-indexed with relevant publications, including published detector descriptions. While this effort is relatively new, it will eventually contain all published data from older heavy-ion programs as well as published data from current and future facilities. These data include all measured observables in proton-proton, proton-nucleus and nucleus-nucleus collisions. Once in general use, this database will have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models for a broad range of experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion, target and source development for upcoming facilities such as the International Linear Collider and homeland security. This database is part of a larger proposal that includes the production of periodic data evaluations and topical reviews. These reviews would provide an alternative and impartial mechanism to resolve discrepancies between published data from rival experiments and between theory and experiment. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This project serves as a test-bed for the further development of an object-oriented nuclear data format and database system. By using 'off-the-shelf' software tools and techniques, the system is simple, robust, and extensible. Eventually we envision a 'Grand Unified Nuclear Format' encapsulating data types used in the ENSDF, Endf/B, EXFOR, NSR and other formats, including processed data formats. (authors)

  16. CMS conditions data access using FroNTier

    International Nuclear Information System (INIS)

    Blumenfeld, Barry; Johns Hopkins U.; Dykstra, David; Lueking, Lee; Wicklund, Eric; Fermilab

    2007-01-01

    The CMS experiment at the LHC has established an infrastructure using the FroNTier framework to deliver conditions (i.e. calibration, alignment, etc.) data to processing clients worldwide. FroNTier is a simple web service approach providing client HTTP access to a central database service. The system for CMS has been developed to work with POOL which provides object relational mapping between the C++ clients and various database technologies. Because of the read only nature of the data, Squid proxy caching servers are maintained near clients and these caches provide high performance data access. Several features have been developed to make the system meet the needs of CMS including careful attention to cache coherency with the central database, and low latency loading required for the operation of the online High Level Trigger. The ease of deployment, stability of operation, and high performance make the FroNTier approach well suited to the GRID environment being used for CMS offline, as well as for the online environment used by the CMS High Level Trigger (HLT). The use of standard software, such as Squid and various monitoring tools, make the system reliable, highly configurable and easily maintained. We describe the architecture, software, deployment, performance, monitoring and overall operational experience for the system
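
    The sketch below illustrates the general client pattern the abstract describes, a plain HTTP request routed through a nearby Squid cache, using the Python requests library; the FroNTier server URL, proxy address and encoded query are placeholders, not a real CMS conditions request.

```python
# Illustrative only: the FroNTier server URL, Squid proxy address and encoded
# query are placeholders, not a real CMS conditions request. The point is the
# pattern: a plain HTTP GET routed through a nearby caching proxy.
import requests

proxies = {"http": "http://localhost:3128"}                    # assumed local Squid cache
frontier_url = "http://frontier.example.org:8000/Frontier"     # placeholder server

resp = requests.get(frontier_url,
                    params={"p1": "encoded-conditions-query"}, # placeholder payload
                    proxies=proxies, timeout=30)
resp.raise_for_status()
print(resp.headers.get("X-Cache", "no cache header"), len(resp.content))
```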

  17. CMS conditions data access using FroNTier

    International Nuclear Information System (INIS)

    Blumenfeld, B; Dykstra, D; Lueking, L; Wicklund, E

    2008-01-01

    The CMS experiment at the LHC has established an infrastructure using the FroNTier framework to deliver conditions (i.e. calibration, alignment, etc.) data to processing clients worldwide. FroNTier is a simple web service approach providing client HTTP access to a central database service. The system for CMS has been developed to work with POOL which provides object relational mapping between the C++ clients and various database technologies. Because of the read only nature of the data, Squid proxy caching servers are maintained near clients and these caches provide high performance data access. Several features have been developed to make the system meet the needs of CMS including careful attention to cache coherency with the central database, and low latency loading required for the operation of the online High Level Trigger. The ease of deployment, stability of operation, and high performance make the FroNTier approach well suited to the GRID environment being used for CMS offline, as well as for the online environment used by the CMS High Level Trigger. The use of standard software, such as Squid and various monitoring tools, makes the system reliable, highly configurable and easily maintained. We describe the architecture, software, deployment, performance, monitoring and overall operational experience for the system

  18. Structure health monitoring system using internet and database technologies

    International Nuclear Information System (INIS)

    Kwon, Il Bum; Kim, Chi Yeop; Choi, Man Yong; Lee, Seung Seok

    2003-01-01

    A structural health monitoring system should be developed on the basis of internet and database technology in order to manage large structures efficiently. The system is operated over the internet, connected to the structure site. The monitoring system has several functions: self-monitoring, self-diagnosis, self-control, etc. Self-monitoring is the function of sensor fault detection: if some sensors are not working normally, the system can detect the faulty sensors. The self-diagnosis function repairs the abnormal condition of sensors, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify the need for replacement of sensors. For further study, a real application test will be performed to check for remaining inconveniences.
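
    As an illustration of the self-monitoring (sensor fault detection) function described above, and not the authors' system, the sketch below flags sensors whose latest reading in a monitoring database is stale or out of range; the table layout and thresholds are invented for illustration.

```python
# Illustrative sketch of the self-monitoring idea (not the authors' system):
# flag sensors whose latest reading in the database is stale or out of range.
import sqlite3
import time

STALE_AFTER = 600                 # seconds without data before a sensor is suspect
VALID_RANGE = (-50.0, 150.0)      # plausible reading bounds (example values)

conn = sqlite3.connect("shm.db")
conn.execute("CREATE TABLE IF NOT EXISTS readings (sensor TEXT, ts REAL, value REAL)")

def faulty_sensors(now=None):
    """Return sensors whose latest reading is stale or outside the valid range."""
    now = now or time.time()
    faults = []
    rows = conn.execute(
        "SELECT sensor, MAX(ts), value FROM readings GROUP BY sensor").fetchall()
    for sensor, last_ts, value in rows:
        if now - last_ts > STALE_AFTER or not (VALID_RANGE[0] <= value <= VALID_RANGE[1]):
            faults.append(sensor)
    return faults

print(faulty_sensors())
```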

  19. Structural health monitoring system using internet and database technologies

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chi Yeop; Choi, Man Yong; Kwon, Il Bum; Lee, Seung Seok [Nonstructive Measurment Lab., KRISS, Daejeon (Korea, Republic of)

    2003-07-01

    A structural health monitoring system should be developed on the basis of internet and database technology in order to manage large structures efficiently. The system is operated over the internet, connected to the structure site. The monitoring system has several functions: self-monitoring, self-diagnosis, self-control, etc. Self-monitoring is the function of sensor fault detection: if some sensors are not working normally, the system can detect the faulty sensors. The self-diagnosis function repairs the abnormal condition of sensors, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify the need for replacement of sensors. For further study, a real application test will be performed to check for remaining inconveniences.

  20. Structure health monitoring system using internet and database technologies

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Il Bum; Kim, Chi Yeop; Choi, Man Yong; Lee, Seung Seok [Smart Measurment Group. Korea Resarch Institute of Standards and Science, Saejeon (Korea, Republic of)

    2003-05-15

    A structural health monitoring system should be developed on the basis of internet and database technology in order to manage large structures efficiently. The system is operated over the internet, connected to the structure site. The monitoring system has several functions: self-monitoring, self-diagnosis, self-control, etc. Self-monitoring is the function of sensor fault detection: if some sensors are not working normally, the system can detect the faulty sensors. The self-diagnosis function repairs the abnormal condition of sensors, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify the need for replacement of sensors. For further study, a real application test will be performed to check for remaining inconveniences.

  1. Structural health monitoring system using internet and database technologies

    International Nuclear Information System (INIS)

    Kim, Chi Yeop; Choi, Man Yong; Kwon, Il Bum; Lee, Seung Seok

    2003-01-01

    A structural health monitoring system should be developed on the basis of internet and database technology in order to manage large structures efficiently. The system is operated over the internet, connected to the structure site. The monitoring system has several functions: self-monitoring, self-diagnosis, self-control, etc. Self-monitoring is the function of sensor fault detection: if some sensors are not working normally, the system can detect the faulty sensors. The self-diagnosis function repairs the abnormal condition of sensors, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify the need for replacement of sensors. For further study, a real application test will be performed to check for remaining inconveniences.

  2. RTDB: A memory resident real-time object database

    International Nuclear Information System (INIS)

    Nogiec, Jerzy M.; Desavouret, Eugene

    2003-01-01

    RTDB is a fast, memory-resident object database with built-in support for distribution. It constitutes an attractive alternative for architecting real-time solutions with multiple, possibly distributed, processes or agents sharing data. RTDB offers both direct and navigational access to stored objects, with local and remote random access by object identifiers, and immediate direct access via object indices. The database supports transparent access to objects stored in multiple collaborating dispersed databases and includes a built-in cache mechanism that allows for keeping local copies of remote objects, with specifiable invalidation deadlines. Additional features of RTDB include a trigger mechanism on objects that allows for issuing events or activating handlers when objects are accessed or modified and a very fast, attribute based search/query mechanism. The overall architecture and application of RTDB in a control and monitoring system is presented
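
    The following generic Python illustration (not the RTDB code) mimics two of the features listed above, an attribute-based index for fast queries and a trigger hook fired when objects are modified; the class and method names are invented for this sketch.

```python
# Generic illustration (not the RTDB code): an in-memory object store with an
# attribute index for fast queries and a trigger hook fired on modification.
from collections import defaultdict

class ObjectStore:
    def __init__(self):
        self._objects = {}                 # object id -> attribute dict
        self._index = defaultdict(set)     # (attribute, value) -> set of object ids
        self._triggers = []                # callbacks fired whenever an object is put

    def on_change(self, callback):
        self._triggers.append(callback)

    def put(self, oid, **attrs):
        self._objects[oid] = attrs
        for key, value in attrs.items():
            self._index[(key, value)].add(oid)
        for cb in self._triggers:
            cb(oid, attrs)

    def get(self, oid):
        return self._objects.get(oid)

    def query(self, **attrs):
        sets = [self._index[(k, v)] for k, v in attrs.items()]
        return set.intersection(*sets) if sets else set()

store = ObjectStore()
store.on_change(lambda oid, attrs: print("changed:", oid))
store.put("magnet1", type="dipole", status="on")
print(store.query(type="dipole"))
```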

  3. Construction of an ortholog database using the semantic web technology for integrative analysis of genomic data.

    Science.gov (United States)

    Chiba, Hirokazu; Nishide, Hiroyo; Uchiyama, Ikuo

    2015-01-01

    Recently, various types of biological data, including genomic sequences, have been rapidly accumulating. To discover biological knowledge from such growing heterogeneous data, a flexible framework for data integration is necessary. Ortholog information is a central resource for interlinking corresponding genes among different organisms, and the Semantic Web provides a key technology for the flexible integration of heterogeneous data. We have constructed an ortholog database using the Semantic Web technology, aiming at the integration of numerous genomic data and various types of biological information. To formalize the structure of the ortholog information in the Semantic Web, we have constructed the Ortholog Ontology (OrthO). While the OrthO is a compact ontology for general use, it is designed to be extended to the description of database-specific concepts. On the basis of OrthO, we described the ortholog information from our Microbial Genome Database for Comparative Analysis (MBGD) in the form of Resource Description Framework (RDF) and made it available through the SPARQL endpoint, which accepts arbitrary queries specified by users. In this framework based on the OrthO, the biological data of different organisms can be integrated using the ortholog information as a hub. Besides, the ortholog information from different data sources can be compared with each other using the OrthO as a shared ontology. Here we show some examples demonstrating that the ortholog information described in RDF can be used to link various biological data such as taxonomy information and Gene Ontology. Thus, the ortholog database using the Semantic Web technology can contribute to biological knowledge discovery through integrative data analysis.
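
    A short sketch of querying a SPARQL endpoint such as the one described above, using the SPARQLWrapper package, is given below; the endpoint URL and the property names in the query are placeholders, not the actual MBGD/OrthO vocabulary.

```python
# The endpoint URL and the orth: property names are placeholders, not the actual
# MBGD/OrthO vocabulary; the SPARQLWrapper calls are standard.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://example.org/sparql")   # placeholder endpoint
endpoint.setQuery("""
PREFIX orth: <http://example.org/ortho#>
SELECT ?group ?gene WHERE {
    ?group orth:hasMember ?gene .
}
LIMIT 10
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["group"]["value"], row["gene"]["value"])
```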

  4. Software Engineering Laboratory (SEL) database organization and user's guide

    Science.gov (United States)

    So, Maria; Heller, Gerard; Steinberg, Sandra; Spiegel, Douglas

    1989-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base tables is described. In addition, techniques for accessing the database, through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL), are discussed.

  5. Development of an instrument to measure Faculty's information and communication technology access (FICTA).

    Science.gov (United States)

    Soomro, Kamal Ahmed; Kale, Ugur; Curtis, Reagan; Akcaoglu, Mete; Bernstein, Malayna

    2018-01-01

    The phenomenon of "digital divide" is complex and multidimensional, extending beyond issues of physical access. The purpose of this study was to develop a scale to measure a range of factors related to digital divide among higher education faculty and to evaluate its reliability and validity. Faculty's Information and Communication Technology Access (FICTA) scale was tested and validated with 322 faculty teaching in public and private sector universities. Principal components analysis with varimax rotation confirmed an 8-factor solution corresponding to various dimensions of ICT access. The 57-item FICTA scale demonstrated good psychometric properties and offers researchers a tool to examine faculty's access to ICT at four levels - motivational, physical, skills, and usage access.

  6. The data and system Nikkei Telecom "Industry/Technology Information Service"

    Science.gov (United States)

    Kurata, Shizuya; Sueyoshi, Yukio

    Nihon Keizai Shimbun started supplying the "Industry/Technology Information Service" in July 1989 as part of the Nikkei Telecom package, an online information service using personal computers as its terminals. Previously, Nikkei's database service mainly covered such areas as the economy, corporations and markets. The new "Industry/Technology Information Service" (whose main data cover industry-by-industry, semi-macro information) is attracting a good deal of attention as it is the first to supply a science and technology related database, an area not covered before. Moreover, it is attracting technical attention because it provides gateway access to JOIS, the first-class science and technology file in Japan. This report briefly introduces the data and system of the "Industry/Technology Information Service".

  7. The potential use of mobile technology: enhancing accessibility and communication in a blended learning course

    OpenAIRE

    Mayisela, Tabisa

    2013-01-01

    Mobile technology is increasingly being used to support blended learning beyond computer centres. It has been considered as a potential solution to the problem of a shortage of computers for accessing online learning materials (courseware) in a blended learning course. The purpose of the study was to establish how the use of mobile technology could enhance accessibility and communication in a blended learning course. Data were solicited from a purposive convenience sample of 36 students engage...

  8. Supply Chain Initiatives Database

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-11-01

    The Supply Chain Initiatives Database (SCID) presents innovative approaches to engaging industrial suppliers in efforts to save energy, increase productivity and improve environmental performance. This comprehensive and freely-accessible database was developed by the Institute for Industrial Productivity (IIP). IIP acknowledges Ecofys for their valuable contributions. The database contains case studies searchable according to the types of activities buyers are undertaking to motivate suppliers, target sector, organization leading the initiative, and program or partnership linkages.

  9. The IRPVM-DB database

    International Nuclear Information System (INIS)

    Davies, L.M.; Gillemot, F.; Yanko, L.; Lyssakov, V.

    1997-01-01

    The IRPVM-DB (International Reactor Pressure Vessel Material Database), initiated by the IAEA IWG LMNPP, is going to collect the available surveillance and research data world-wide on RPV material ageing. This paper presents the purpose of the database; it summarizes the type and the relationship of data included; it gives information about the data access and protection; and finally, it summarizes the state of the art of the database. (author). 1 ref., 2 figs

  10. The IRPVM-DB database

    Energy Technology Data Exchange (ETDEWEB)

    Davies, L M [Davies Consultants, Oxford (United Kingdom); Gillemot, F [Atomic Energy Research Inst., Budapest (Hungary); Yanko, L [Minatom (Russian Federation); Lyssakov, V [International Atomic Energy Agency, Vienna (Austria)

    1997-09-01

    The IRPVM-DB (International Reactor Pressure Vessel Material Database), initiated by the IAEA IWG LMNPP, is going to collect the available surveillance and research data world-wide on RPV material ageing. This paper presents the purpose of the database; it summarizes the type and the relationship of data included; it gives information about the data access and protection; and finally, it summarizes the state of the art of the database. (author). 1 ref., 2 figs.

  11. Programming database tools for the casual user

    International Nuclear Information System (INIS)

    Katz, R.A; Griffiths, C.

    1990-01-01

    The AGS Distributed Control System (AGSDCS) uses a relational database management system (INTERBASE) for the storage of all data associated with the control of the particle accelerator complex. This includes the static data which describes the component devices of the complex, as well as data for application program startup and data records that are used in analysis. Due to licensing restrictions, it was necessary to develop tools that allow programs requiring access to a database to be unconcerned with whether or not they are running on a licensed node. An in-house database server program was written, using Apollo mailbox communication protocols, allowing application programs to access the Interbase database via calls to this server. Initially, the tools used by the server to actually access the database were written using the GDML C host language interface. Through the evolutionary learning process these tools have been converted to Dynamic SQL. Additionally, these tools have been extracted from the exclusive province of the database server and placed in their own library. This enables application programs to use these same tools on a licensed node without using the database server and without having to modify the application code. The syntax of the C calls remains the same

  12. Evaluation of Public E-Services and Information Technology Accessibility in Different Social Groups

    Directory of Open Access Journals (Sweden)

    Ramutė Naujikienė

    2012-12-01

    Full Text Available The purpose of this study is to develop an approach based on the social quality evaluation square model for evaluation of information technology usage in different social groups. A componential view of the accessibility of e-services, including IT, provides the possibility to research the influence of different life conditions on the usage of public e-services. The task of this empirical study is directed towards revealing the differences of e-inclusion and e-services accessibility for social groups of citizens of Lithuania, and to compare this accessibility data with other EU countries. Design/methodology/approach—the approach is based on the square model of social quality evaluation of information technology usage in different social groups. The social division square model includes an assessment of quality according to the evaluation of socioeconomic security, social inclusion, social cohesion, and empowerment. Empowerment can be defined as consisting of individual or collective decisions to act on one’s own life. Findings—the results are demonstrated by the accessibility of public e-services data, which are evaluated by the quality of social group development according to IT applications. The hypothesis was confirmed that e-government activities can be realized by properly selecting and installing technologies, and using technology facilities. E-services influence the capabilities of state officials to apply modern technology and increase the availability of e-services for social groups. Results consist of individual or collective decisions to act on one’s own life, the implementation of effective information technologies in e-government activities, and the use of e-services. An important indicator is the implementation of e-services in the activity of citizens. It is submitted as the index of e-participation in dealing with the activities of citizens and the possibilities of authorities directly related with providing services

  13. Evaluation of Public E-Services and Information Technology Accessibility in Different Social Groups

    Directory of Open Access Journals (Sweden)

    Ramutė Naujikienė

    2013-02-01

    Full Text Available The purpose of this study is to develop an approach based on the social quality evaluation square model for evaluation of information technology usage in different social groups. A componential view of the accessibility of e-services, including IT, provides the possibility to research the influence of different life conditions on the usage of public e-services. The task of this empirical study is directed towards revealing the differences of e-inclusion and e-services accessibility for social groups of citizens of Lithuania, and to compare this accessibility data with other EU countries. Design/methodology/approach—the approach is based on the square model of social quality evaluation of information technology usage in different social groups. The social division square model includes an assessment of quality according to the evaluation of socioeconomic security, social inclusion, social cohesion, and empowerment. Empowerment can be defined as consisting of individual or collective decisions to act on one’s own life. Findings—the results are demonstrated by the accessibility of public e-services data, which are evaluated by the quality of social group development according to IT applications. The hypothesis was confirmed that e-government activities can be realized by properly selecting and installing technologies, and using technology facilities. E-services influence the capabilities of state officials to apply modern technology and increase the availability of e-services for social groups. Results consist of individual or collective decisions to act on one’s own life, the implementation of effective information technologies in e-government activities, and the use of e-services. An important indicator is the implementation of e-services in the activity of citizens. It is submitted as the index of e-participation in dealing with the activities of citizens and the possibilities of authorities directly related with providing

  14. Enhancing Ear and Hearing Health Access for Children With Technology and Connectivity.

    Science.gov (United States)

    Swanepoel, De Wet

    2017-10-12

    Technology and connectivity advances are demonstrating increasing potential to improve access to service delivery for persons with hearing loss. This article demonstrates use cases from community-based hearing screening and automated diagnosis of ear disease. This brief report reviews recent evidence for school- and home-based hearing testing in underserved communities using smartphone technologies paired with calibrated headphones. Another area of potential impact facilitated by technology and connectivity is the use of feature-extraction algorithms to facilitate automated diagnosis of the most common ear conditions from video-otoscopic images. Smartphone hearing screening using calibrated headphones demonstrated equivalent sensitivity and specificity for school-based hearing screening. Automating test sequences with a forced-choice response paradigm allowed persons with minimal training to offer screening in underserved communities. The automated image analysis and diagnosis system for ear disease demonstrated an overall accuracy of 80.6%, which matches or exceeds accuracy rates previously reported for general practitioners and pediatricians. The emergence of these tools that capitalize on technology and connectivity advances enables affordable and accessible models of service delivery for community-based ear and hearing care.

  15. Response from youths and teachers with regard to the encyclopedic database on nuclear power, ATOMICA

    International Nuclear Information System (INIS)

    Ishikawa, Isamu; Eto, Motokuni

    2005-01-01

    An encyclopedic database on nuclear power and its related fields, commonly named ATOMICA, was first established and released on a PC communications basis in 1995. The database began operating on the internet in October 1996. ATOMICA now contains more than 2,300 encyclopedic entries, which include more than 9,300 tables and figures in total. The fields the database covers are categorized into 18 areas: energy and environment, nuclear power generation, advanced reactors, fuel cycle, back-end technology, safety research, basic and advanced research, radiation application, radiation influence and protection, governmental policy, regulation and rules, statistics of operation of nuclear facilities, international cooperation, etc. The number of accesses to the database has increased steadily, reaching more than one and a half million a year in 2003. The area of radiation influence and protection is accessed most frequently, followed by energy and environment, nuclear power generation, the nuclear fuel cycle and the status of foreign countries. People with different backgrounds are believed to access the database. This paper first gives an overview of the activities on the establishment and maintenance of ATOMICA, including its historical background. Then accesses to the database are analyzed in terms of their number and fields. The PA Center has also received questions and comments on the database from various people. The features of these questions and comments are summarized with special reference to those from youths and teachers as well as from the general public, in order to elucidate the requirements on the database from the viewpoint of education. In general, youths and teachers are interested in the characteristics of radiation in nature and from nuclear facilities. They are concerned about the effect of radiation on human bodies. Besides radiation, the areas they pay keen attention to are the mechanism of fission and the principle of nuclear

  16. A dedicated database system for handling multi-level data in systems biology

    OpenAIRE

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Background Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging...

  17. OLIO+: an osteopathic medicine database.

    Science.gov (United States)

    Woods, S E

    1991-01-01

    OLIO+ is a bibliographic database designed to meet the information needs of the osteopathic medical community. Produced by the American Osteopathic Association (AOA), OLIO+ is devoted exclusively to the osteopathic literature. The database is available only by subscription through AOA and may be accessed from any data terminal with modem or IBM-compatible personal computer with telecommunications software that can emulate VT100 or VT220. Apple access is also available, but some assistance from OLIO+ support staff may be necessary to modify the Apple keyboard.

  18. Database modeling and design logical design

    CERN Document Server

    Teorey, Toby J; Nadeau, Tom; Jagadish, HV

    2011-01-01

    Database systems and database design technology have undergone significant evolution in recent years. The relational data model and relational database systems dominate business applications; in turn, they are extended by other technologies like data warehousing, OLAP, and data mining. How do you model and design your database application in consideration of new technology or new business needs? In the extensively revised fifth edition, you'll get clear explanations, lots of terrific examples and an illustrative case, and the really practical advice you have come to count on--with design rules

  19. Database modeling and design logical design

    CERN Document Server

    Teorey, Toby J; Nadeau, Tom; Jagadish, HV

    2005-01-01

    Database systems and database design technology have undergone significant evolution in recent years. The relational data model and relational database systems dominate business applications; in turn, they are extended by other technologies like data warehousing, OLAP, and data mining. How do you model and design your database application in consideration of new technology or new business needs? In the extensively revised fourth edition, you'll get clear explanations, lots of terrific examples and an illustrative case, and the really practical advice you have come to count on--with design rules

  20. Testing the Digital Divide: Does Access to High-Quality Use of Technology in Schools Affect Student Achievement?

    Science.gov (United States)

    Talley, Gregory Keith

    2012-01-01

    This study investigates the relationship between access, use of technology and student achievement in public middle schools in Maryland. The objective of this study was to determine whether a digital divide (differences in access and utilization of technology based on student characteristics of race, socioeconomic status, and gender) exists among…

  1. Race and time from diagnosis to radical prostatectomy: does equal access mean equal timely access to the operating room?--Results from the SEARCH database.

    Science.gov (United States)

    Bañez, Lionel L; Terris, Martha K; Aronson, William J; Presti, Joseph C; Kane, Christopher J; Amling, Christopher L; Freedland, Stephen J

    2009-04-01

    African American men with prostate cancer are at higher risk for cancer-specific death than Caucasian men. We determine whether significant delays in management contribute to this disparity. We hypothesize that in an equal-access health care system, time interval from diagnosis to treatment would not differ by race. We identified 1,532 African American and Caucasian men who underwent radical prostatectomy (RP) from 1988 to 2007 at one of four Veterans Affairs Medical Centers that comprise the Shared Equal-Access Regional Cancer Hospital (SEARCH) database with known biopsy date. We compared time from biopsy to RP between racial groups using linear regression adjusting for demographic and clinical variables. We analyzed risk of potential clinically relevant delays by determining odds of delays >90 and >180 days. Median time interval from diagnosis to RP was 76 and 68 days for African Americans and Caucasian men, respectively (P = 0.004). After controlling for demographic and clinical variables, race was not associated with the time interval between diagnosis and RP (P = 0.09). Furthermore, race was not associated with increased risk of delays >90 (P = 0.45) or >180 days (P = 0.31). In a cohort of men undergoing RP in an equal-access setting, there was no significant difference between racial groups with regard to time interval from diagnosis to RP. Thus, equal access includes equal timely access to the operating room. Given our previous finding of poorer outcomes among African Americans, treatment delays do not seem to explain these observations. Our findings need to be confirmed in patients electing other treatment modalities and in other practice settings.

  2. Letter to the editor: Good relationships are pivotal in nuclear databases

    International Nuclear Information System (INIS)

    Heger, A.S.; Koen, B.V.

    1990-01-01

    No matter which way you turn in the technical world today, you will find the apostles of the Informational Revolution aggressively championing information technology as the key to effective operation and competitiveness. Information, they insist, is a strategic asset. To some extent they may seem to be correct. In the nuclear power industry, for instance, the increased interest in improved plant safety and performance and mandated probabilistic risk assessments have demonstrated the need for quality data in the form of information. The Holy Grail of the information technology advocates is the seamless integration of hardware, software, and telecommunications technology into networks where engineers and risk analysts can get whatever information they need whenever they need it. Experience has shown that this approach can be very confusing. This web of technological wonders has led to the inundation of end users to the point that they refuse to access the system. Instead of asking, What are the data that matter and how do we most effectively manage them, those who design and manage these database systems must start asking, What are the relationships that matter and how can the technology most effectively support them? It is time for the nuclear industry to recognize where the real potential and value of its databases lie

  3. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Directory of Open Access Journals (Sweden)

    Xiaohuan Yang

    2009-02-01

    Full Text Available The spatial distribution of population is closely related to land use and land cover (LULC patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B data integrated with a Pattern Decomposition Method (PDM and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM. The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable.
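
    The record above describes redistributing census population onto 1 km grid cells according to land use. As a purely illustrative sketch of that idea (not the actual PSM or its coefficients), the following Python snippet spreads a county total over a toy LULC grid; the land-use codes and weights are hypothetical.

```python
import numpy as np

# Hypothetical land-use weights: how strongly each LULC class attracts population.
LULC_WEIGHTS = {1: 1.0,   # urban/built-up
                2: 0.3,   # cropland
                3: 0.05,  # forest/grassland
                4: 0.0}   # water

def spatialize_population(lulc_grid: np.ndarray, county_population: float) -> np.ndarray:
    """Redistribute a county's census population onto grid cells in
    proportion to land-use weights (a toy stand-in for the PSM)."""
    weights = np.vectorize(LULC_WEIGHTS.get)(lulc_grid).astype(float)
    total = weights.sum()
    if total == 0:
        return np.zeros_like(weights)
    return county_population * weights / total

if __name__ == "__main__":
    lulc = np.array([[1, 1, 2],
                     [2, 3, 3],
                     [3, 4, 4]])            # 3x3 toy grid of LULC codes
    grid_pop = spatialize_population(lulc, 10000)
    print(grid_pop.round(1))                # per-cell population estimates
    print(grid_pop.sum())                   # preserves the county total
```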

  4. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Science.gov (United States)

    Yang, Xiaohuan; Huang, Yaohuan; Dong, Pinliang; Jiang, Dong; Liu, Honghui

    2009-01-01

    The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable. PMID:22399959

  5. A service-oriented data access control model

    Science.gov (United States)

    Meng, Wei; Li, Fengmin; Pan, Juchen; Song, Song; Bian, Jiali

    2017-01-01

    The development of mobile computing, cloud computing and distributed computing meets growing individual service needs. Faced with complex application systems, ensuring real-time, dynamic, and fine-grained data access control is an urgent problem. By analyzing common data access control models, and building on the mandatory access control model, the paper proposes a service-oriented access control model. By regarding system services as subjects and database data as objects, the model defines access levels and access identification for subject and object, and ensures that system services access databases securely.
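
    A minimal Python sketch of this subject/object idea, assuming simple numeric levels in the spirit of mandatory access control; the service names, levels and rules below are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical access levels: higher numbers mean more sensitive/privileged.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "secret": 3}

@dataclass
class Subject:          # a system service acting as the subject
    name: str
    level: int          # clearance level of the service
    identity: str       # access identification (e.g. a service token)

@dataclass
class DataObject:       # a database table or record acting as the object
    name: str
    level: int          # classification level of the data

def can_read(service: Subject, data: DataObject) -> bool:
    """MAC-style check: a service may read data only if its clearance
    dominates the data's classification."""
    return service.level >= data.level

def can_write(service: Subject, data: DataObject) -> bool:
    """No write-down: a service may write only at or above its own level."""
    return data.level >= service.level

if __name__ == "__main__":
    billing = Subject("billing-service", LEVELS["internal"], "token-123")
    salaries = DataObject("salaries", LEVELS["confidential"])
    print(can_read(billing, salaries))   # False: clearance too low
    print(can_write(billing, salaries))  # True: writing up is allowed
```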

  6. Survey on utilization of database for research and development of global environmental industry technology; Chikyu kankyo sangyo gijutsu kenkyu kaihatsu no tame no database nado no riyo ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-01

    To optimize networks and database systems for promoting the development of industry technology contributing to the solution of the global environmental problem, studies are made on reusable information resources and their utilization methods. Reusable information resources include external databases and network systems for researchers' information exchange and for computer use. The external databases include commercial databases and academic databases. As commercial databases, 6 agents and 13 service systems are selected. As academic databases, there are NACSIS-IR and databases connected to the INTERNET in the U.S. These are used in connection with the UNIX academic research network called INTERNET. For connection with the INTERNET, a commercial UNIX network service called IIJ, which starts service in April 1993, can be used. However, a personal computer communication network is used for the time being. 6 figs., 4 tabs.

  7. Guide on Project Web Access of SFR R and D and Technology Monitoring System

    International Nuclear Information System (INIS)

    Lee, Dong Uk; Won, Byung Chool; Lee, Yong Bum; Kim, Young In; Hahn, Do Hee

    2008-09-01

    The SFR R and D and technology monitoring system, based on MS enterprise project management, is developed for the systematic and effective management of the 'Development of Basic Key Technologies for Gen IV SFR' project, which was performed under the Mid- and Long-term Nuclear R and D Program sponsored by the Ministry of Education, Science and Technology. This system is a tool for project management based on web access; therefore this manual is a detailed guide for Project Web Access (PWA). Section 1 provides a common guide to using system functions such as Project Server 2007 client connection settings, additional Outlook function settings, etc. Section 2 provides the guide for the system administrator. Sections 3 and 4 provide the guide for project management

  8. Guide on Project Web Access of SFR R and D and Technology Monitoring System

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Uk; Won, Byung Chool; Lee, Yong Bum; Kim, Young In; Hahn, Do Hee

    2008-09-15

    The SFR R and D and technology monitoring system, based on MS enterprise project management, is developed for the systematic and effective management of the 'Development of Basic Key Technologies for Gen IV SFR' project, which was performed under the Mid- and Long-term Nuclear R and D Program sponsored by the Ministry of Education, Science and Technology. This system is a tool for project management based on web access; therefore this manual is a detailed guide for Project Web Access (PWA). Section 1 provides a common guide to using system functions such as Project Server 2007 client connection settings, additional Outlook function settings, etc. Section 2 provides the guide for the system administrator. Sections 3 and 4 provide the guide for project management.

  9. LHCb distributed conditions database

    International Nuclear Information System (INIS)

    Clemencic, M

    2008-01-01

    The LHCb Conditions Database project provides the necessary tools to handle non-event time-varying data. The main users of conditions are reconstruction and analysis processes, which are running on the Grid. To allow efficient access to the data, we need to use a synchronized replica of the content of the database located at the same site as the event data file, i.e. the LHCb Tier1. The replica to be accessed is selected from information stored on LFC (LCG File Catalog) and managed with the interface provided by the LCG developed library CORAL. The plan to limit the submission of jobs to those sites where the required conditions are available will also be presented. LHCb applications are using the Conditions Database framework on a production basis since March 2007. We have been able to collect statistics on the performance and effectiveness of both the LCG library COOL (the library providing conditions handling functionalities) and the distribution framework itself. Stress tests on the CNAF hosted replica of the Conditions Database have been performed and the results will be summarized here

  10. [The Development and Application of the Orthopaedics Implants Failure Database Software Based on WEB].

    Science.gov (United States)

    Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao

    2015-09-01

    This article describes the development of new web-based failure database software for orthopaedic implants. The software follows the B/S (browser/server) model: ASP dynamic web technology is used as the main development language to achieve data interactivity, and Microsoft Access is used to create the database; these mature technologies make the software easy to extend and upgrade. The article presents the design and development ideas behind the software, its working process and functions, and its relevant technical features. With this software, many different types of implant failure events can be stored and the failure data can be statistically analyzed; at the macroscopic level, it can be used to evaluate the reliability of orthopaedic implants and operations and ultimately guide doctors in improving the level of clinical treatment.
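
    The software described above is ASP-based; purely to illustrate programmatic access to a Microsoft Access database file, here is a hedged Python sketch using pyodbc, assuming the Access ODBC driver is installed on the host. The file path, table and column names are hypothetical, not taken from the article.

```python
import pyodbc

# Hypothetical connection to a failure database stored in an .accdb file;
# requires the Microsoft Access ODBC driver to be installed.
conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\data\implant_failures.accdb;"
)

cursor = conn.cursor()
# Hypothetical table/columns: count failure events per implant type.
cursor.execute(
    "SELECT implant_type, COUNT(*) AS n_failures "
    "FROM failure_events GROUP BY implant_type"
)
for implant_type, n_failures in cursor.fetchall():
    print(implant_type, n_failures)
conn.close()
```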

  11. Uranium Location Database

    Data.gov (United States)

    U.S. Environmental Protection Agency — A GIS compiled locational database in Microsoft Access of ~15,000 mines with uranium occurrence or production, primarily in the western United States. The metadata...

  12. Uses of internet technology in clinical practice

    International Nuclear Information System (INIS)

    Mansoor, I.

    2001-01-01

    The practice of medicine has extended itself to vast areas and requires active clinicians to systematize and organize their workload through the use of the most up-to-date digital and computer communication technologies. Computerization and worldwide accessibility of information has especially provided great assistance in this regard. The explosive growth of medical information increases the need for the use of these new methods of organizing and accessing data. This article briefly summarizes a few of the vital tools that internet technology has provided clinical practice, with the aid of basic concepts of internet, database systems, hospital systems and data security and reliability. (author)

  13. Access to DNA and protein databases on the Internet.

    Science.gov (United States)

    Harper, R

    1994-02-01

    During the past year, the number of biological databases that can be queried via Internet has dramatically increased. This increase has resulted from the introduction of networking tools, such as Gopher and WAIS, that make it easy for research workers to index databases and make them available for on-line browsing. Biocomputing in the nineties will see the advent of more client/server options for the solution of problems in bioinformatics.

  14. Improving collaboration between primary care research networks using Access Grid technology

    Directory of Open Access Journals (Sweden)

    Zsolt Nagykaldi

    2008-05-01

    Full Text Available Access Grid (AG) is an Internet2-driven, high-performance audio-visual conferencing technology used worldwide by academic and government organisations to enhance communication, human interaction and group collaboration. AG technology is particularly promising for improving academic multi-centre research collaborations. This manuscript describes how AG technology was utilised by the electronic Primary Care Research Network (ePCRN), part of the National Institutes of Health (NIH) Roadmap initiative, to improve primary care research and collaboration among practice-based research networks (PBRNs) in the USA. It discusses the design, installation and use of AG implementations, potential future applications, barriers to adoption, and suggested solutions.

  15. The Evolution of Teachers' Instructional Beliefs and Practices in High-Access-to-Technology Classrooms.

    Science.gov (United States)

    Dwyer, David C.; And Others

    Beginning in 1985, Apple Computer, Inc., and several school districts began a collaboration to examine the impact of computer saturation on instruction and learning in K-12 classrooms. The initial guiding question was simply put: What happens when teachers and students have constant access to technology? To provide "constant access,"…

  16. Perspectives in understanding open access to research data - infrastructure and technology challenges

    Science.gov (United States)

    Bigagli, Lorenzo; Sondervan, Jeroen

    2014-05-01

    The Policy RECommendations for Open Access to Research Data in Europe (RECODE) project, started in February 2013 with a duration of two years, has the objective of identifying a series of targeted and over-arching policy recommendations for Open Access to European research data, based on existing good practice and addressing such hindering factors as stakeholder fragmentation, technical and infrastructural issues, ethical and legal issues, and financial and institutional policies. In this work we focus on the technical and infrastructural aspect, where by "infrastructure" we mean the technological assets (hardware and software), the human resources, and all the policies, processes, procedures and training for managing and supporting its continuous operation and evolution. The context targeted by RECODE includes heterogeneous networks, initiatives, projects and communities that are fragmented by discipline, geography, stakeholder category (publishers, academics, repositories, etc.) and other boundaries. Many of these organizations are already addressing key technical and infrastructural barriers to Open Access to research data. Such barriers may include: lack of automatic mechanisms for policy enforcement, lack of metadata and data models supporting open access, obsolescence of infrastructures, scarce awareness of new technological solutions, and lack of training and/or expertise on IT and semantics aspects. However, these organizations often work in isolation, or with limited contact with one another. RECODE has addressed these challenges, and the possible solutions to mitigate them, by engaging all the identified stakeholders in a number of ways, including an online questionnaire, case-study interviews, a literature review, and a workshop. The conclusions have been validated by the RECODE Advisory Board and

  17. ARTI Refrigerant Database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M. [Calm (James M.), Great Falls, VA (United States)

    1994-05-27

    The Refrigerant Database consolidates and facilitates access to information to assist industry in developing equipment using alternative refrigerants. The underlying purpose is to accelerate phase out of chemical compounds of environmental concern.

  18. WMC Database Evaluation. Case Study Report

    Energy Technology Data Exchange (ETDEWEB)

    Palounek, Andrea P. T [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-29

    The WMC Database is ultimately envisioned to hold a collection of experimental data, design information, and information from computational models. This project was a first attempt at using the Database to access experimental data and extract information from it. This evaluation shows that the Database concept is sound and robust, and that the Database, once fully populated, should remain eminently usable for future researchers.

  19. ECG-ViEW II, a freely accessible electrocardiogram database

    Science.gov (United States)

    Park, Man Young; Lee, Sukhoon; Jeon, Min Seok; Yoon, Dukyong; Park, Rae Woong

    2017-01-01

    The Electrocardiogram Vigilance with Electronic data Warehouse II (ECG-ViEW II) is a large, single-center database comprising numeric parameter data of the surface electrocardiograms of all patients who underwent testing from 1 June 1994 to 31 July 2013. The electrocardiographic data include the test date, clinical department, RR interval, PR interval, QRS duration, QT interval, QTc interval, P axis, QRS axis, and T axis. These data are connected with patient age, sex, ethnicity, comorbidities, age-adjusted Charlson comorbidity index, prescribed drugs, and electrolyte levels. This longitudinal observational database contains 979,273 electrocardiograms from 461,178 patients over a 19-year study period. This database can provide an opportunity to study electrocardiographic changes caused by medications, disease, or other demographic variables. ECG-ViEW II is freely available at http://www.ecgview.org. PMID:28437484

  20. Website Quality Testing from the Perspectives of Accessibility, Experience, Marketing and Technology

    Directory of Open Access Journals (Sweden)

    Diyurman Gea

    2014-06-01

    Full Text Available The purpose of this study was to determine the quality of websites managed by individuals, companies and governments. The test results would be useful for managers in paying closer attention to website quality from several perspectives: accessibility, experience, marketing and technology. The research used a sample of 350 websites, and the data were divided into seven categories, namely: websites managed by SMEs (small and medium enterprises), university websites, government websites, e-commerce sites, news websites, industrial company websites, and non-profit organization websites. We used Nibler as the testing tool to facilitate the assessment process. Data were analyzed using WEKA and presented in the form of a decision tree. The results showed that the tested websites had an average value of 4.66 or worse (scale 1-10). The conclusion is that website managers should make improvements to data and applications, in particular from the perspectives of technology, accessibility and experience.

  1. Basic survey for promoting energy efficiency in developing countries. Database development project directory of energy conservation technology in Japan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-02-01

    In order to promote energy conservation in developing countries, the gist of Japanese energy saving technologies was edited into a database. The Asian region is expected to see remarkable economic development and increased energy consumption, including consumption of fossil fuels. Therefore, this project of structuring a database is of urgent importance for the Asian countries. Broad new discussions were held to revise the 1995 edition. The committee was composed of members from high energy-consuming areas such as the iron and steel, paper and pulp, chemical, oil refining, cement, electric power, machinery, electric device, and industrial machinery industries. Technical literature and reports were consulted, and opinions were heard from specialists and committee members representing the respective areas. In order to reflect the current status and particular conditions in specific industrial areas, additions were made with the assistance and guidance of the specialists. The energy saving technologies recorded in the database may be called small to medium scale technologies, with the target placed on saving energy by 10% or more. Small-scale energy saving technologies were omitted. Flow charts for manufacturing processes were also added. (NEDO)

  2. Accessible Transportation Technologies Research Initiative (ATTRI) : User Needs Assessment: Stakeholder Engagement Report.

    Science.gov (United States)

    2016-05-01

    The Accessible Transportation Technologies Research Initiative (ATTRI) is a joint U.S. Department of Transportation (U.S. DOT) initiative that is co-led by the Federal Highway Administration (FHWA) and the Federal Transit Administration (FTA). ATTRI ...

  3. A study on the unstructured music database—Taking the Bo people’s music and its music iconography database as an example

    Directory of Open Access Journals (Sweden)

    Liu Yutong

    2015-01-01

    Full Text Available An unstructured music iconography data system constructed with key technologies such as Dublin Core, Lucene and an MVC framework is studied in this paper. Results indicate that the traditional directory tree and existing indexing and searching tools are severely insufficient for organizing and managing massive unstructured data. Relevant documents can be searched effectively and rapidly through the index established by BeFS. Key technologies such as Dublin Core, Lucene and an MVC framework can be applied to the construction of an enormous unstructured database of music and image resources. Testing of the database system can be divided into two parts: functional testing and performance testing. The test results of the Bo people’s music and image database system, obtained through the tested design scheme, indicate that the performance of the system is relatively high and able to satisfy concurrent access to massive data with an excellent user experience.
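
    As a toy illustration of the indexing idea (Dublin Core metadata made searchable through an index), the following pure-Python sketch builds a tiny inverted index over a couple of hypothetical records; a real system would use Lucene at scale, as the record above describes.

```python
from collections import defaultdict

# Records described with (a subset of) Dublin Core elements -- hypothetical data.
records = [
    {"identifier": "bo-001", "title": "Bo funeral drum piece",
     "creator": "Field recording team", "subject": "Bo people; percussion"},
    {"identifier": "bo-002", "title": "Cliff painting of dancers",
     "creator": "Iconography survey", "subject": "Bo people; iconography"},
]

def build_index(recs):
    """Build a tiny inverted index over Dublin Core fields,
    standing in for what Lucene does at scale."""
    index = defaultdict(set)
    for rec in recs:
        for field, value in rec.items():
            for token in value.lower().replace(";", " ").split():
                index[token].add(rec["identifier"])
    return index

index = build_index(records)
print(sorted(index["iconography"]))   # ['bo-002']
print(sorted(index["bo"]))            # ['bo-001', 'bo-002']
```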

  4. jSPyDB, an open source database-independent tool for data management

    Science.gov (United States)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open source, they provide interfaces only to a specific kind of database, and they are platform-dependent and very CPU- and memory-intensive. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler for different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create customized views for better data visualization. In this way, the performance of database servers is optimized by avoiding short connections and concurrent sessions. In addition, security is enforced since users are not given the possibility to execute arbitrary SQL statements directly.
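
    A minimal sketch of the database-independent access pattern mentioned above, using SQLAlchemy and JSON export; this is not jSPyDB code, and the table and connection URL are made up for illustration.

```python
import json
from sqlalchemy import create_engine, text

# Hypothetical connection URL: the same code works for SQLite, MySQL,
# PostgreSQL, Oracle, etc. simply by changing the URL, which is the kind
# of database independence SQLAlchemy provides.
engine = create_engine("sqlite:///example.db")

with engine.connect() as conn:
    conn.execute(text("CREATE TABLE IF NOT EXISTS runs (id INTEGER, tag TEXT)"))
    conn.execute(text("INSERT INTO runs VALUES (1, 'calibration'), (2, 'physics')"))
    rows = conn.execute(text("SELECT id, tag FROM runs")).mappings().all()

# Export the result set as JSON, one of the formats mentioned above.
print(json.dumps([dict(r) for r in rows], indent=2))
```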

  5. jSPyDB, an open source database-independent tool for data management

    International Nuclear Information System (INIS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-01-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open source, they provide interfaces only to a specific kind of database, and they are platform-dependent and very CPU- and memory-intensive. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler for different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create customized views for better data visualization. In this way, the performance of database servers is optimized by avoiding short connections and concurrent sessions. In addition, security is enforced since users are not given the possibility to execute arbitrary SQL statements directly.

  6. Integrated data acquisition, storage, retrieval and processing using the COMPASS DataBase (CDB)

    Energy Technology Data Exchange (ETDEWEB)

    Urban, J., E-mail: urban@ipp.cas.cz [Institute of Plasma Physics AS CR, v.v.i., Za Slovankou 3, 182 00 Praha 8 (Czech Republic); Pipek, J.; Hron, M. [Institute of Plasma Physics AS CR, v.v.i., Za Slovankou 3, 182 00 Praha 8 (Czech Republic); Janky, F.; Papřok, R.; Peterka, M. [Institute of Plasma Physics AS CR, v.v.i., Za Slovankou 3, 182 00 Praha 8 (Czech Republic); Department of Surface and Plasma Science, Faculty of Mathematics and Physics, Charles University in Prague, V Holešovičkách 2, 180 00 Praha 8 (Czech Republic); Duarte, A.S. [Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade Técnica de Lisboa, 1049-001 Lisboa (Portugal)

    2014-05-15

    Highlights: • CDB is used as a new data storage solution for the COMPASS tokamak. • The software is light weight, open, fast and easily extensible and scalable. • CDB seamlessly integrates with any data acquisition system. • Rich metadata are stored for physics signals. • Data can be processed automatically, based on dependence rules. - Abstract: We present a complex data handling system for the COMPASS tokamak, operated by IPP ASCR Prague, Czech Republic [1]. The system, called CDB (COMPASS DataBase), integrates different data sources as an assortment of data acquisition hardware and software from different vendors is used. Based on widely available open source technologies wherever possible, CDB is vendor and platform independent and it can be easily scaled and distributed. The data is directly stored and retrieved using a standard NAS (Network Attached Storage), hence independent of the particular technology; the description of the data (the metadata) is recorded in a relational database. Database structure is general and enables the inclusion of multi-dimensional data signals in multiple revisions (no data is overwritten). This design is inherently distributed as the work is off-loaded to the clients. Both NAS and database can be implemented and optimized for fast local access as well as secure remote access. CDB is implemented in Python language; bindings for Java, C/C++, IDL and Matlab are provided. Independent data acquisitions systems as well as nodes managed by FireSignal [2] are all integrated using CDB. An automated data post-processing server is a part of CDB. Based on dependency rules, the server executes, in parallel if possible, prescribed post-processing tasks.
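
    The following is a minimal Python sketch of the general pattern described above (bulk data on shared storage, metadata and revisions in a relational database); it is not the real CDB API, and the table layout, paths and signal names are hypothetical.

```python
import sqlite3
import time
import numpy as np

# Bulk signal data goes to files on NAS-like storage, while a relational
# database keeps the metadata needed to find and describe each revision.
db = sqlite3.connect("metadata.db")
db.execute("""CREATE TABLE IF NOT EXISTS signals (
    name TEXT, shot INTEGER, revision INTEGER,
    file_path TEXT, units TEXT, created REAL)""")

def store_signal(name, shot, data, units, nas_root="/nas/compass"):
    """Record the metadata for a signal revision; the actual file write is
    elided here because the NAS path is purely illustrative."""
    revision = 1 + db.execute(
        "SELECT COUNT(*) FROM signals WHERE name=? AND shot=?",
        (name, shot)).fetchone()[0]           # new revision, nothing overwritten
    path = f"{nas_root}/{shot}/{name}.rev{revision}.npy"
    # np.save(path, data)                     # would write the bulk data to storage
    db.execute("INSERT INTO signals VALUES (?,?,?,?,?,?)",
               (name, shot, revision, path, units, time.time()))
    db.commit()
    return revision

store_signal("plasma_current", 4073, np.zeros(1000), "A")
print(db.execute("SELECT name, shot, revision, file_path FROM signals").fetchall())
```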

  7. NLTE4 Plasma Population Kinetics Database

    Science.gov (United States)

    SRD 159 NLTE4 Plasma Population Kinetics Database (Web database for purchase)   This database contains benchmark results for simulation of plasma population kinetics and emission spectra. The data were contributed by the participants of the 4th Non-LTE Code Comparison Workshop who have unrestricted access to the database. The only limitation for other users is in hidden labeling of the output results. Guest users can proceed to the database entry page without entering userid and password.

  8. Impact of Access to Online Databases on Document Delivery Services within Iranian Academic Libraries

    Directory of Open Access Journals (Sweden)

    Zohreh Zahedi

    2007-04-01

    Full Text Available The present study investigates the impact of access to online databases on document delivery services in Iranian academic libraries, within the framework of factors such as the number of orders lodged over the years studied and their trends, and the expenditures made by each university, especially those universities and groups that had the highest number of orders. This investigation was carried out through a survey, by calling on the library document supply units in universities, and through in-person interviews with the librarians in charge. The study sample was confined to the universities of Shiraz, Tehran and Tarbiyat Modaress along with their faculties. Findings indicate that the rate of document requests in various universities depends on the target audience, capabilities, students’ familiarity, and the mode of document delivery services.

  9. A Brief Survey of Media Access Control, Data Link Layer, and Protocol Technologies for Lunar Surface Communications

    Science.gov (United States)

    Wallett, Thomas M.

    2009-01-01

    This paper surveys and describes some of the existing media access control and data link layer technologies for possible application in lunar surface communications and the advanced wideband Direct Sequence Code Division Multiple Access (DSCDMA) conceptual systems utilizing phased-array technology that will evolve in the next decade. Time Division Multiple Access (TDMA) and Code Division Multiple Access (CDMA) are standard Media Access Control (MAC) techniques that can be incorporated into lunar surface communications architectures. Another novel hybrid technique that is recently being developed for use with smart antenna technology combines the advantages of CDMA with those of TDMA. The relatively new and sundry wireless LAN data link layer protocols that are continually under development offer distinct advantages for lunar surface applications over the legacy protocols which are not wireless. Also, several communication transport and routing protocols can be chosen with characteristics commensurate with smart antenna systems to provide spacecraft communications for links exhibiting high capacity on the surface of the Moon. The proper choices depend on the specific communication requirements.
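
    As a toy illustration of the TDMA idea mentioned in this survey, the sketch below assigns each node a fixed slot in a repeating frame so transmissions cannot collide; the node names, slot duration and frame layout are made up.

```python
# Toy TDMA schedule: each node owns one slot per frame, so only one node
# transmits at any time; parameters below are illustrative only.
NODES = ["lander", "rover-1", "rover-2", "habitat"]
SLOTS_PER_FRAME = len(NODES)
SLOT_MS = 10  # slot duration in milliseconds

def slot_owner(t_ms: float) -> str:
    """Return which node may transmit at absolute time t_ms."""
    slot = int(t_ms // SLOT_MS) % SLOTS_PER_FRAME
    return NODES[slot]

for t in range(0, 80, 10):
    print(f"t={t:3d} ms -> {slot_owner(t)} transmits")
```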

  10. HCUP State Inpatient Databases (SID) - Restricted Access File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The State Inpatient Databases (SID) contain the universe of hospital inpatient discharge abstracts in States participating in HCUP that release their data through...

  11. Protocol of a controlled before-after evaluation of a national health information technology-based program to improve healthcare coordination and access to information.

    Science.gov (United States)

    Saillour-Glénisson, Florence; Duhamel, Sylvie; Fourneyron, Emmanuelle; Huiart, Laetitia; Joseph, Jean Philippe; Langlois, Emmanuel; Pincemail, Stephane; Ramel, Viviane; Renaud, Thomas; Roberts, Tamara; Sibé, Matthieu; Thiessard, Frantz; Wittwer, Jerome; Salmi, Louis Rachid

    2017-04-21

    Improvement of coordination of all health and social care actors in the patient pathways is an important issue in many countries. Health Information (HI) technology has been considered as a potentially effective answer to this issue. The French Health Ministry first funded the development of five TSN ("Territoire de Soins Numérique"/Digital health territories) projects, aiming at improving healthcare coordination and access to information for healthcare providers, patients and the population, and at improving healthcare professionals work organization. The French Health Ministry then launched a call for grant to fund one research project consisting in evaluating the TSN projects implementation and impact and in developing a model for HI technology evaluation. EvaTSN is mainly based on a controlled before-after study design. Data collection covers three periods: before TSN program implementation, during early TSN program implementation and at late TSN program implementation, in the five TSN projects' territories and in five comparison territories. Three populations will be considered: "TSN-targeted people" (healthcare system users and people having characteristics targeted by the TSN projects), "TSN patient users" (people included in TSN experimentations or using particular services) and "TSN professional users" (healthcare professionals involved in TSN projects). Several samples will be made in each population depending on the objective, axis and stage of the study. Four types of data sources are considered: 1) extractions from the French National Heath Insurance Database (SNIIRAM) and the French Autonomy Personalized Allowance database, 2) Ad hoc surveys collecting information on knowledge of TSN projects, TSN program use, ease of use, satisfaction and understanding, TSN pathway experience and appropriateness of hospital admissions, 3) qualitative analyses using semi-directive interviews and focus groups and document analyses and 4) extractions of TSN

  12. Self-Access Centers: Maximizing Learners’ Access to Center Resources

    Directory of Open Access Journals (Sweden)

    Mark W. Tanner

    2010-09-01

    Full Text Available Originally published in TESL-EJ March 2009, Volume 12, Number 4 (http://tesl-ej.org/ej48/a2.html). Reprinted with permission from the authors. Although some students have discovered how to use self-access centers effectively, the majority appear to be unaware of available resources. A website and database of materials were created to help students locate materials and use the Self-Access Study Center (SASC) at Brigham Young University’s English Language Center (ELC) more effectively. Students took two surveys regarding their use of the SASC. The first survey was given before the website and database were made available. A second survey was administered 12 weeks after students had been introduced to the resource. An analysis of the data shows that students tend to use SASC resources more autonomously as a result of having a web-based database. The survey results suggest that SAC managers can encourage more autonomous use of center materials by providing a website and database to help students find appropriate materials to use to learn English.

  13. Sustainable Development and Airport Surface Access: The Role of Technological Innovation and Behavioral Change

    Directory of Open Access Journals (Sweden)

    Bilal Qazi

    2013-04-01

    Full Text Available Sustainable development reflects an underlying tension to achieve economic growth whilst addressing environmental challenges, and this is particularly the case for the aviation sector. Although much of the aviation-related focus has fallen on reducing aircraft emissions, airports have also been under increasing pressure to support the vision of a low carbon energy future. One of the main sources of airport-related emissions is passenger journeys to and from airports (the surface access component of air travel), which is the focus of this paper. Two aspects of the relationship between sustainable development and airport surface access are considered. Firstly, there is an evaluation of three technological innovation options that will enable sustainable transport solutions for surface access journeys: telepresence systems to reduce drop-off/pick-up trips, techniques to improve public transport and options to encourage the sharing of rides. Secondly, the role of behavioral change for surface access journeys is evaluated from a theoretical perspective, using empirical data from Manchester airport. Finally, the contribution of technology and behavioral intervention measures to improvements in sustainable development is discussed.

  14. International Nuclear Safety Center (INSC) database

    International Nuclear Information System (INIS)

    Sofu, T.; Ley, H.; Turski, R.B.

    1997-01-01

    As an integral part of DOE's International Nuclear Safety Center (INSC) at Argonne National Laboratory, the INSC Database has been established to provide an interactively accessible information resource for the world's nuclear facilities and to promote free and open exchange of nuclear safety information among nations. The INSC Database is a comprehensive resource database aimed at a scope and level of detail suitable for safety analysis and risk evaluation for the world's nuclear power plants and facilities. It also provides an electronic forum for international collaborative safety research for the Department of Energy and its international partners. The database is intended to provide plant design information, material properties, computational tools, and results of safety analysis. Initial emphasis in data gathering is given to Soviet-designed reactors in Russia, the former Soviet Union, and Eastern Europe. The implementation is performed under the Oracle database management system, and the World Wide Web is used to serve as the access path for remote users. An interface between the Oracle database and the Web server is established through a custom designed Web-Oracle gateway which is used mainly to perform queries on the stored data in the database tables

  15. Open access for ALICE analysis based on virtualization technology

    International Nuclear Information System (INIS)

    Buncic, P; Gheata, M; Schutz, Y

    2015-01-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system. (paper)

  16. Database on wind characteristics. Contents of database bank

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, K.S.

    2001-01-01

    for the available data in the established database bank, and part three is the Users Manual describing the various ways to access and analyse the data. The present report constitutes the second part of the Annex XVII reporting. Basically, the database bank contains three categories of data, i.e. i) high sampled wind field time series; ii) high sampled wind turbine structural response time series; and iii) wind resource data. The main emphasis, however, is on category i). The available data within each of the three categories are described in detail. The description embraces site characteristics, terrain type...

  17. Urate levels predict survival in amyotrophic lateral sclerosis: Analysis of the expanded Pooled Resource Open-Access ALS clinical trials database.

    Science.gov (United States)

    Paganoni, Sabrina; Nicholson, Katharine; Chan, James; Shui, Amy; Schoenfeld, David; Sherman, Alexander; Berry, James; Cudkowicz, Merit; Atassi, Nazem

    2018-03-01

    Urate has been identified as a predictor of amyotrophic lateral sclerosis (ALS) survival in some but not all studies. Here we leverage the recent expansion of the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database to study the association between urate levels and ALS survival. Pooled data of 1,736 ALS participants from the PRO-ACT database were analyzed. Cox proportional hazards regression models were used to evaluate associations between urate levels at trial entry and survival. After adjustment for potential confounders (i.e., creatinine and body mass index), there was an 11% reduction in risk of reaching a survival endpoint during the study with each 1-mg/dL increase in uric acid levels (adjusted hazard ratio 0.89, 95% confidence interval 0.82-0.97, P ALS and confirms the utility of the PRO-ACT database as a powerful resource for ALS epidemiological research. Muscle Nerve 57: 430-434, 2018. © 2017 Wiley Periodicals, Inc.
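
    The analysis described above uses Cox proportional hazards regression. As a hedged illustration of that method (not the PRO-ACT data or the study's actual model), the sketch below fits a Cox model with the lifelines package on a small, clearly synthetic dataset; the column names and values are made up.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic, illustrative mini-cohort: survival time, event flag, urate (mg/dL)
# and two confounders, loosely mirroring the adjustments described above.
df = pd.DataFrame({
    "survival_months": [12, 30, 24, 48, 18, 36, 60, 27, 22, 40],
    "died":            [1,  1,  0,  0,  1,  1,  0,  1,  1,  0],
    "urate":           [3.8, 5.2, 4.1, 6.0, 3.5, 5.8, 6.5, 4.4, 5.0, 4.7],
    "creatinine":      [0.7, 0.9, 0.8, 1.0, 0.6, 0.9, 1.1, 0.8, 0.7, 0.9],
    "bmi":             [22, 26, 24, 28, 21, 27, 29, 23, 25, 26],
})

# A small ridge penalty stabilizes the fit on such a tiny toy dataset.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="survival_months", event_col="died")
cph.print_summary()   # hazard ratios per unit increase, adjusted for covariates
```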

  18. ARACHNID: A prototype object-oriented database tool for distributed systems

    Science.gov (United States)

    Younger, Herbert; Oreilly, John; Frogner, Bjorn

    1994-01-01

    This paper discusses the results of a Phase 2 SBIR project sponsored by NASA and performed by MIMD Systems, Inc. A major objective of this project was to develop specific concepts for improved performance in accessing large databases. An object-oriented and distributed approach was used for the general design, while a geographical decomposition was used as a specific solution. The resulting software framework is called ARACHNID. The Faint Source Catalog developed by NASA was the initial database testbed. This is a database of many gigabytes, where an order of magnitude improvement in query speed is being sought. This database contains faint infrared point sources obtained from telescope measurements of the sky. A geographical decomposition of this database is an attractive approach to dividing it into pieces. Each piece can then be searched on individual processors, with only a weak data linkage between the processors being required. As a further demonstration of the concepts implemented in ARACHNID, a tourist information system is discussed. This version of ARACHNID is the commercial result of the project. It is a distributed, networked database application where speed, maintenance, and reliability are important considerations. This paper focuses on the design concepts and technologies that form the basis for ARACHNID.
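
    As a toy illustration of the geographical decomposition idea described above (not ARACHNID itself), the following Python sketch partitions a small synthetic point-source catalog into sky tiles and answers a cone search by scanning only the tiles that can overlap the query; the tile size and flat-sky distance are simplifying assumptions.

```python
import math
from collections import defaultdict

# Toy geographical decomposition: the catalog is split into sky tiles so a
# query only has to scan the tiles that can contain matching sources.
TILE_DEG = 10.0

def tile_of(ra_deg: float, dec_deg: float) -> tuple:
    return (int(ra_deg // TILE_DEG), int((dec_deg + 90) // TILE_DEG))

def build_partitions(sources):
    parts = defaultdict(list)
    for src in sources:                      # src = (id, ra, dec, flux)
        parts[tile_of(src[1], src[2])].append(src)
    return parts

def cone_search(parts, ra, dec, radius_deg):
    """Scan only the tiles that can overlap the search cone."""
    hits = []
    span = int(radius_deg // TILE_DEG) + 1
    t0 = tile_of(ra, dec)
    for dx in range(-span, span + 1):
        for dy in range(-span, span + 1):
            for sid, sra, sdec, flux in parts.get((t0[0] + dx, t0[1] + dy), []):
                if math.hypot(sra - ra, sdec - dec) <= radius_deg:  # flat-sky approx.
                    hits.append(sid)
    return hits

catalog = [("IR-1", 120.2, -5.1, 0.3), ("IR-2", 121.0, -4.8, 1.2), ("IR-3", 300.0, 40.0, 0.8)]
parts = build_partitions(catalog)
print(cone_search(parts, 120.5, -5.0, 1.0))   # ['IR-1', 'IR-2']
```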

  19. Global Ocean Currents Database (GOCD) (NCEI Accession 0093183)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Global Ocean Currents Database (GOCD) is a collection of quality controlled ocean current measurements such as observed current direction and speed obtained from...

  20. Calculation of Investments for the Distribution of GPON Technology in the village of Bishtazhin through database

    Directory of Open Access Journals (Sweden)

    MSc. Jusuf Qarkaxhija

    2013-12-01

    Full Text Available According to daily reports, the income from internet services is getting lower each year. Landline phone services are running at a loss, whereas mobile phone services are becoming increasingly mainstream, and the only bright spot keeping cable operators (ISPs) in positive balance is the income from broadband services (fast internet, IPTV). Broadband technology is a term that covers multiple methods of distributing information over the internet at high speed. Some of the broadband technologies are: optical fiber, coaxial cable, DSL, wireless, mobile broadband, and satellite connections. The ultimate goal of any broadband service provider is being able to provide voice, data and video through a single network, called triple play service. Internet distribution remains an important issue in Kosovo, particularly in rural zones. Considering the immense development of the technologies and the different alternatives available, the goal of this paper is to emphasize the necessity of forecasting such an investment and to share experience in this respect. Because this investment involves many factors related to population, geography and several technologies, and because these factors change continuously, the best approach is to store all the data in a database and to use this database to derive different results. This database lets us replace the previous manual calculations with an automatic calculation procedure. This way of working improves the workflow, providing all the tools needed to make the right decision about an Internet investment while considering all aspects of that investment.

  1. Smart Location Database - Service

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Smart Location Database (SLD) summarizes over 80 demographic, built environment, transit service, and destination accessibility attributes for every census block...

  2. Smart Location Database - Download

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Smart Location Database (SLD) summarizes over 80 demographic, built environment, transit service, and destination accessibility attributes for every census block...

  3. Oceans of Data: In what ways can learning research inform the development of electronic interfaces and tools for use by students accessing large scientific databases?

    Science.gov (United States)

    Krumhansl, R. A.; Foster, J.; Peach, C. L.; Busey, A.; Baker, I.

    2012-12-01

    The practice of science and engineering is being revolutionized by the development of cyberinfrastructure for accessing near real-time and archived observatory data. Large cyberinfrastructure projects have the potential to transform the way science is taught in high school classrooms, making enormous quantities of scientific data available, giving students opportunities to analyze and draw conclusions from many kinds of complex data, and providing students with experiences using state-of-the-art resources and techniques for scientific investigations. However, online interfaces to scientific data are built by scientists for scientists, and their design can significantly impede broad use by novices. Knowledge relevant to the design of student interfaces to complex scientific databases is broadly dispersed among disciplines ranging from cognitive science to computer science and cartography and is not easily accessible to designers of educational interfaces. To inform efforts at bridging scientific cyberinfrastructure to the high school classroom, Education Development Center, Inc. and the Scripps Institution of Oceanography conducted an NSF-funded 2-year interdisciplinary review of literature and expert opinion pertinent to making interfaces to large scientific databases accessible to and usable by precollege learners and their teachers. Project findings are grounded in the fundamentals of Cognitive Load Theory, Visual Perception, Schemata formation and Universal Design for Learning. The Knowledge Status Report (KSR) presents cross-cutting and visualization-specific guidelines that highlight how interface design features can address/ ameliorate challenges novice high school students face as they navigate complex databases to find data, and construct and look for patterns in maps, graphs, animations and other data visualizations. The guidelines present ways to make scientific databases more broadly accessible by: 1) adjusting the cognitive load imposed by the user

  4. NNDC database migration project

    Energy Technology Data Exchange (ETDEWEB)

    Burrows, Thomas W; Dunford, Charles L [U.S. Department of Energy, Brookhaven Science Associates (United States)

    2004-03-01

    NNDC Database Migration was necessary to replace obsolete hardware and software, to be compatible with the industry standard in relational databases (mature software, a large base of supporting software for administration and dissemination, and replication and synchronization tools) and to improve user access in terms of interface and speed. The Relational Database Management System (RDBMS) consists of a Sybase Adaptive Server Enterprise (ASE), the Structured Query Language (SQL), which makes it relatively easy to move between different RDB systems (e.g., MySQL, MS SQL-Server, or MS Access), and administrative tools written in Java. Linux or UNIX platforms can be used. The existing ENSDF datasets are often very large and will need to be reworked, and both the CRP (adopted) and CRP (Budapest) datasets give elemental cross sections (not relative Iγ) in the RI field (so it is not immediately obvious which of the old values has been changed). But primary and secondary intensities are now available on the same scale, and the intensity normalization has been done for us. We will gain access to a large volume of data from Budapest, and some of those gamma-ray intensity and energy data will be superior to what we already have.

  5. Investigation of an artificial intelligence technology--Model trees. Novel applications for an immediate release tablet formulation database.

    Science.gov (United States)

    Shao, Q; Rowe, R C; York, P

    2007-06-01

    This study has investigated an artificial intelligence technology - model trees - as a modelling tool applied to an immediate release tablet formulation database. The modelling performance was compared with artificial neural networks, which are well established and widely applied in the pharmaceutical product formulation field. The predictability of the generated models was validated on unseen data and judged by the correlation coefficient R². Output from the model tree analyses produced multivariate linear equations which predicted tablet tensile strength, disintegration time, and drug dissolution profiles of similar quality to neural network models. However, additional and valuable knowledge hidden in the formulation database was extracted from these equations. It is concluded that, as a transparent technology, model trees are useful tools for formulators.
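
    A model tree partitions the input space and fits a linear model in each partition, which is where the multivariate linear equations mentioned above come from. The sketch below is a generic illustration of that idea on synthetic data (not the authors' implementation): a shallow scikit-learn decision tree defines the partition, and ordinary least squares supplies one equation per leaf.

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        # Synthetic "formulation" data: two ingredient fractions -> tablet tensile strength.
        X = rng.uniform(0, 1, size=(200, 2))
        y = np.where(X[:, 0] < 0.5, 2.0 * X[:, 1] + 1.0, -1.5 * X[:, 1] + 4.0) + rng.normal(0, 0.05, 200)

        # Step 1: a shallow tree defines the partition of the formulation space.
        tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=20).fit(X, y)
        leaf_of = tree.apply(X)

        # Step 2: fit one linear equation per leaf (the "model" in model tree).
        leaf_models = {}
        for leaf in np.unique(leaf_of):
            mask = leaf_of == leaf
            m = LinearRegression().fit(X[mask], y[mask])
            leaf_models[leaf] = m
            print(f"leaf {leaf}: y = {m.intercept_:.2f} + {m.coef_[0]:.2f}*x1 + {m.coef_[1]:.2f}*x2")

        # Prediction routes a sample to its leaf, then applies that leaf's equation.
        def predict(x):
            leaf = tree.apply(x.reshape(1, -1))[0]
            return leaf_models[leaf].predict(x.reshape(1, -1))[0]

        print(predict(np.array([0.2, 0.7])))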

  6. Enhanced DIII-D Data Management Through a Relational Database

    Science.gov (United States)

    Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.

    2000-10-01

    A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Metadata about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. The database may be accessed through programming languages such as C, Java, and IDL, or through ODBC-compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
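
    A hedged sketch of the kind of cross-shot query described above, using pyodbc from Python; the data source name, table, and column names are invented for illustration and are not the actual DIII-D schema.

        import pyodbc

        # Hypothetical ODBC data source and summary table of per-shot physics quantities.
        conn = pyodbc.connect("DSN=d3d_summary;UID=reader;PWD=secret")
        cursor = conn.cursor()

        # A cross-shot query: discharges in a shot range above a stored-energy threshold.
        cursor.execute(
            """
            SELECT shot, t_start, stored_energy_kj
            FROM shot_summary
            WHERE shot BETWEEN ? AND ? AND stored_energy_kj > ?
            ORDER BY stored_energy_kj DESC
            """,
            (100000, 105000, 800.0),
        )
        for shot, t_start, w in cursor.fetchall():
            print(shot, t_start, w)
        conn.close()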

  7. IAEA nuclear databases for applications

    International Nuclear Information System (INIS)

    Schwerer, Otto

    2003-01-01

    The Nuclear Data Section (NDS) of the International Atomic Energy Agency (IAEA) provides nuclear data services to scientists on a worldwide scale with particular emphasis on developing countries. More than 100 data libraries are made available cost-free by Internet, CD-ROM and other media. These databases are used for practically all areas of nuclear applications as well as basic research. An overview is given of the most important nuclear reaction and nuclear structure databases, such as EXFOR, CINDA, ENDF, NSR, ENSDF, NUDAT, and of selected special purpose libraries such as FENDL, RIPL, RNAL, the IAEA Photonuclear Data Library, and the IAEA charged-particle cross section database for medical radioisotope production. The NDS also coordinates two international nuclear data centre networks and is involved in data development activities (to create new or improve existing data libraries when the available data are inadequate) and in technology transfer to developing countries, e.g. through the installation and support of the mirror web site of the IAEA Nuclear Data Services at IPEN (operational since March 2000) and by organizing nuclear-data related workshops. By encouraging their participation in IAEA Co-ordinated Research Projects and also by compiling their experimental results in databases such as EXFOR, the NDS helps to make developing countries' contributions to nuclear science visible and conveniently available. The web address of the IAEA Nuclear Data Services is http://www.nds.iaea.org and the NDS mirror service at IPEN (Brasil) can be accessed at http://www.nds.ipen.br/ (author)

  8. 76 FR 10360 - Access to Confidential Business Information by Guident Technologies Inc. and Its Identified...

    Science.gov (United States)

    2011-02-24

    ... Business Information by Guident Technologies Inc. and Its Identified Subcontractors AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: EPA has authorized its contractor, Guident Technologies... information may be claimed or determined to be Confidential Business Information (CBI). DATES: Access to the...

  9. The ChArMEx database

    Science.gov (United States)

    Ferré, Hélène; Belmahfoud, Nizar; Boichard, Jean-Luc; Brissebrat, Guillaume; Cloché, Sophie; Descloitres, Jacques; Fleury, Laurence; Focsa, Loredana; Henriot, Nicolas; Mière, Arnaud; Ramage, Karim; Vermeulen, Anne; Boulanger, Damien

    2015-04-01

    The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The project includes long term monitoring of environmental parameters, intensive field campaigns, use of satellite data and modelling studies. Therefore ChArMEx scientists produce and need to access a wide diversity of data. In this context, the objective of the database task is to organize data management, the distribution system and services, such as facilitating the exchange of information and stimulating the collaboration between researchers within the ChArMEx community, and beyond. The database relies on a strong collaboration between the ICARE, IPSL and OMP data centers and has been set up in the framework of the Mediterranean Integrated Studies at Regional And Local Scales (MISTRALS) program data portal. ChArMEx data, either produced or used by the project, are documented and accessible through the database website: http://mistrals.sedoo.fr/ChArMEx. The website offers the usual but user-friendly functionalities: data catalog, user registration procedure, search tool to select and access data... The metadata (data description) are standardized, and comply with international standards (ISO 19115-19139; INSPIRE European Directive; Global Change Master Directory Thesaurus). A Digital Object Identifier (DOI) assignment procedure makes it possible to automatically register the datasets, in order to make them easier to access, cite, reuse and verify. At present, the ChArMEx database contains about 120 datasets, including more than 80 in situ datasets (2012, 2013 and 2014 summer campaigns, background monitoring station of Ersa...), 25 model output sets (dust model intercomparison, MEDCORDEX scenarios...), a high resolution emission inventory over the Mediterranean... Many in situ datasets

  10. INIS as an information and knowledge resource for advanced nuclear technology studies

    International Nuclear Information System (INIS)

    Rashkova, N.

    2009-01-01

    INIS is, at present, the leading reference database for scientific literature on the peaceful uses of nuclear science and technology. It is operated by the IAEA's INIS and NKM Section, which coordinates nuclear information management activities: it maintains the INIS database and access to it, prepares the IAEA input, controls the input from the Member States, and assists them in establishing information centres and exchanging nuclear information. The INIS database covers all types of literature published worldwide: journal articles, conference proceedings, legal documents, scientific and technical documents, etc. The non-conventional collection (NCL) contains more than 200 000 full text documents. INIS provides registered users free access to the bibliographic database and to other information resources, such as the database on CD-ROM, topical CDs, the full text database, links to other databases, web pages, authority lists, etc. The INIS database offers structured and easily searchable information. The information is grouped in subject categories under the INIS-ETDE subject classification scheme, which is periodically updated to reflect new developments in each area. The main INIS areas are: nuclear physics, reactor physics and engineering, material science, nuclear fuels, plasma physics and fusion technology, particle accelerators, radiation protection, and nuclear medicine. The database grows rapidly and currently contains over 3 million records. Material science, reactor design and engineering, new reactor technologies and plasma applications are among the most rapidly growing subject categories. The INIS database offers different search options, depending on the needs of the user: simple search; advanced search in abstract, author, place and date of publication, source, document type, subject, descriptors, etc.; search by descriptors and predefined queries; and combining queries with logical operators. Export from the database is available in html, text, formatted text; XML

  11. The use of modern information technology in research on transport accessibility

    Directory of Open Access Journals (Sweden)

    Bartosz BARTOSIEWICZ

    2015-09-01

    Full Text Available Transport accessibility can be analyzed using a number of different methods. The problem with each of them is the difficulty of obtaining data to measure this phenomenon. The focus of this article and its main goal are to present methods and tools for gathering data on road traffic; thanks to modern information technology, it is possible to collect real data without the need for large-scale and highly capital-intensive measurements. The applications of modern information technology (IT) presented in the article, such as computer programs and services like Google Maps Traffic Overlay and TomTom Live Traffic, enable research to be conducted on a scale that has thus far been unattainable, and allow information to be collected on such criteria as traffic volume, flow, average traffic speed, and actual journey time. Such innovative means of gathering data on automobile traffic open up new perspectives for assessing transport accessibility in terms of automobile traffic by providing high-quality data that meet the requirements for use in primary research.

  12. Quantum search of a real unstructured database

    Science.gov (United States)

    Broda, Bogusław

    2016-02-01

    A simple circuit implementation of the oracle for Grover's quantum search of a real unstructured classical database is proposed. The oracle contains a kind of quantumly accessible classical memory, which stores the database.
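
    To make the search concrete, here is a small classical simulation (not the paper's circuit oracle) of Grover iterations over an unstructured list of N records, written with NumPy; it shows the marked item's probability being amplified in roughly pi/4*sqrt(N) steps.

        import numpy as np

        N = 16          # size of the unstructured database
        marked = 11     # index of the record we are searching for

        # Start in the uniform superposition over all N indices.
        amp = np.full(N, 1 / np.sqrt(N))

        iterations = int(round(np.pi / 4 * np.sqrt(N)))
        for _ in range(iterations):
            amp[marked] *= -1                 # oracle: flip the phase of the marked item
            amp = 2 * amp.mean() - amp        # diffusion: inversion about the mean

        print("success probability:", amp[marked] ** 2)   # close to 1 after the loop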

  13. Tourism through Travel Club: A Database Project

    Science.gov (United States)

    Pratt, Renée M. E.; Smatt, Cindi T.; Wynn, Donald E.

    2017-01-01

    This applied database exercise utilizes a scenario-based case study to teach the basics of Microsoft Access and database management in introductory information systems and database courses. The case includes background information on a start-up business (i.e., Carol's Travel Club), a description of functional business requirements,…

  14. 47 CFR 69.120 - Line information database.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Line information database. 69.120 Section 69...) ACCESS CHARGES Computation of Charges § 69.120 Line information database. (a) A charge that is expressed... from a local exchange carrier database to recover the costs of: (1) The transmission facilities between...

  15. The design and implementation of access control management system in IHEP network

    International Nuclear Information System (INIS)

    Wang Yanming; An Dehai; Qi Fazhi

    2010-01-01

    In the campus network environment of the Institute of High Energy Physics, where the numbers of network devices and computers are large, ensuring the access validity of network devices and users' computers and effectively controlling exceptional network communication are the technological means of keeping the network running normally. The access control system of the campus network of the Institute of High Energy Physics uses a MySQL database in the back end, and the front-end interface is developed with CGI, PHP and HTML. The system provides user information management, user computer access control, suppression of exceptional network communication, and an alarm function, increasing the effectiveness of network management and ensuring that the campus network runs safely and reliably. (authors)
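
    A minimal sketch of the kind of access-validity check such a system performs, using Python's standard sqlite3 in place of the MySQL back end; the table and column names are hypothetical.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE host (mac TEXT PRIMARY KEY, owner TEXT, allowed INTEGER);
        """)
        con.executemany("INSERT INTO host VALUES (?, ?, ?)",
                        [("aa:bb:cc:00:11:22", "alice", 1),
                         ("aa:bb:cc:33:44:55", "bob", 0)])

        def is_allowed(mac):
            """Return True if the MAC address is registered and permitted on the network."""
            row = con.execute("SELECT allowed FROM host WHERE mac = ?", (mac,)).fetchone()
            return bool(row and row[0])

        print(is_allowed("aa:bb:cc:00:11:22"))   # True  -> grant access
        print(is_allowed("de:ad:be:ef:00:00"))   # False -> block and raise an alarm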

  16. Fire test database

    International Nuclear Information System (INIS)

    Lee, J.A.

    1989-01-01

    This paper describes a project recently completed for EPRI by Impell. The purpose of the project was to develop a reference database of fire tests performed on non-typical fire rated assemblies. The database is designed for use by utility fire protection engineers to locate test reports for power plant fire rated assemblies. As utilities prepare to respond to Information Notice 88-04, the database will identify utilities, vendors or manufacturers who have specific fire test data. The database contains fire test report summaries for 729 tested configurations. For each summary, a contact is identified from whom a copy of the complete fire test report can be obtained. Five types of configurations are included: doors, dampers, seals, wraps and walls. The database is computerized. One version for IBM; one for Mac. Each database is accessed through user-friendly software which allows adding, deleting, browsing, etc. through the database. There are five major database files. One each for the five types of tested configurations. The contents of each provides significant information regarding the test method and the physical attributes of the tested configuration. 3 figs

  17. Database Description - fRNAdb | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Affiliation: National Institute of Advanced Industrial Science and Technology (AIST) Journal Search: Creato...D89-92 External Links: Original website information Database maintenance site National Institute of Industrial Science and Technology

  18. Disruptive technological advances in vascular access for dialysis: an overview.

    Science.gov (United States)

    Yeo, Wee-Song; Ng, Qin Xiang

    2017-11-29

    End-stage kidney disease (ESKD), one of the most prevalent diseases in the world and with increasing incidence, is associated with significant morbidity and mortality. Current available modes of renal replacement therapy (RRT) include dialysis and renal transplantation. Though renal transplantation is the preferred and ideal mode of RRT, this modality may not be available to all patients with ESKD. Moreover, renal transplant recipients are constantly at risk of complications associated with immunosuppression and immunosuppressant use, and posttransplant lymphoproliferative disorder. Dialysis may be the only available modality in certain patients. However, dialysis has its limitations, which include issues associated with lack of vascular access, risks of infections and vascular thrombosis, decreased quality of life, and absence of biosynthetic functions of the kidney. In particular, the creation and maintenance of hemodialysis vascular access in children poses a unique set of challenges to the pediatric nephrologist owing to the smaller vessel diameters and vascular hyperreactivity compared with adult patients. Vascular access issues continue to be one of the major limiting factors prohibiting the delivery of adequate dialysis in ESKD patients and is the Achilles' heel of hemodialysis. This review aims to provide a critical overview of disruptive technological advances and innovations for vascular access. Novel strategies in preventing neointimal hyperplasia, novel bioengineered products, grafts and devices for vascular access will be discussed. The potential impact of these solutions on improving the morbidity encountered by dialysis patients will also be examined.

  19. Current status and perceived needs of information technology in Critical Access Hospitals: a survey study

    Directory of Open Access Journals (Sweden)

    George Demiris

    2007-01-01

    Full Text Available The US Congress established the designation of Critical Access Hospitals in 1997, recognising rural hospitals as vital links to health for rural and underserved populations. The intent of the reimbursement system is to improve financial performance, thereby reducing hospital closures. Informatics applications are thought to be tools that can enable the sustainability of such facilities. The aim of this study is to identify the current use of information and communication technology in Critical Access Hospitals, and to assess their readiness and receptiveness for the use of new software and hardware applications and their perceived information technology (IT needs. A survey was mailed to the administrators of all Critical Access Hospitals in one US state (Missouri and a reminder was mailed a few weeks later. Twenty-seven out of 33 surveys were filled out and returned (response rate 82%. While most respondents (66.7% stated that their employees have been somewhat comfortable in using new technology, almost 15% stated that their employees have been somewhat uncomfortable. Similarly, almost 12% of the respondents stated that they themselves felt somewhat uncomfortable introducing new technology. While all facilities have computers, only half of them have a specific IT plan. Findings indicate that Critical Access Hospitals are often struggling with lack of resources and specific applications that address their needs. However, it is widely recognised that IT plays an essential role in the sustainability of their organisations. The study demonstrates that IT applications have to be customised to address the needs and infrastructure of the rural settings in order to be accepted and properly utilised.

  20. Special Issue Journal of Healthcare Engineering Accessibility, Inclusion and Rehabilitation Using Information Technologies

    DEFF Research Database (Denmark)

    Brooks, Anthony Lewis; González-Cid, Yolanda

    2018-01-01

    Accessibility, Inclusion and Rehabilitation Using Information Technologies. Social exclusion occurs when individuals or even entire communities of people are blocked from rights, opportunities and resources, preventing them from full participation in the activities of the society in which they live. Information technologies can help in enabling people with functional limitation to perform tasks that they were formerly unable to accomplish, in inclusion for people with different abilities and preferences, and in rehabilitation. Potential topics include but are not limited to the following: ● Design, evaluation and use of IT to benefit people with disabilities (sensory, motor and cognitive impairments / multiple disabilities) and elderly people ● Information technologies for accessibility, to enable people with functional limitation to perform tasks that they were formerly unable to accomplish ● IT for the inclusion of people with different abilities and preferences...

  1. Second-Tier Database for Ecosystem Focus, 2000-2001 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Van Holmes, Chris; Muongchanh, Christine; Anderson, James J. (University of Washington, School of Aquatic and Fishery Sciences, Seattle, WA)

    2001-11-01

    The Second-Tier Database for Ecosystem Focus (Contract 00004124) provides direct and timely public access to Columbia Basin environmental, operational, fishery and riverine data resources for federal, state, public and private entities. The Second-Tier Database known as Data Access in Realtime (DART) does not duplicate services provided by other government entities in the region. Rather, it integrates public data for effective access, consideration and application.

  2. Software Engineering Laboratory (SEL) database organization and user's guide, revision 2

    Science.gov (United States)

    Morusiewicz, Linda; Bristow, John

    1992-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base table is described. In addition, techniques for accessing the database through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL) are discussed.

  3. Combination of a geolocation database access with infrastructure sensing in TV bands

    OpenAIRE

    Dionísio, Rogério; Ribeiro, Jorge; Marques, Paulo; Rodriguez, Jonathan

    2014-01-01

    This paper describes the implementation and the technical specifications of a geolocation database assisted by a spectrum-monitoring outdoor network. The geolocation database is populated according to Electronic Communications Committee (ECC) report 186 methodology. The application programming interface (API) between the sensor network and the geolocation database implements an effective and secure connection to successfully gather sensing data and sends it to the geolocation database for ...
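
    A hedged sketch of the sensor-network-to-database exchange described above; the REST endpoint, field names, and token are invented for illustration and are not the actual API from the paper.

        import requests

        # Hypothetical REST endpoint of the geolocation database.
        ENDPOINT = "https://example.org/geodb/api/v1/sensing"

        report = {
            "sensor_id": "outdoor-07",
            "latitude": 40.0,
            "longitude": -7.5,
            "channel": 42,                # TV white-space channel measured
            "power_dbm": -97.3,           # sensed power level
            "timestamp": "2014-06-01T12:00:00Z",
        }

        # Authenticated POST of one sensing record over a secure connection.
        resp = requests.post(ENDPOINT, json=report,
                             headers={"Authorization": "Bearer <token>"}, timeout=10)
        resp.raise_for_status()
        print(resp.status_code)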

  4. The EDEN-IW ontology model for sharing knowledge and water quality data between heterogenous databases

    DEFF Research Database (Denmark)

    Stjernholm, M.; Poslad, S.; Zuo, L.

    2004-01-01

    The Environmental Data Exchange Network for Inland Water (EDEN-IW) project's main aim is to develop a system for making disparate and heterogeneous databases of Inland Water quality more accessible to users. The core technology is based upon a combination of: an ontological model to represent a Semantic Web based data model for IW; software agents as an infrastructure to share and reason about the IW semantic data model; and XML to make the information accessible to Web portals and mainstream Web services. This presentation focuses on the Semantic Web or Ontological model. Currently, we have...

  5. The ATLAS conditions database architecture for the Muon spectrometer

    International Nuclear Information System (INIS)

    Verducci, Monica

    2010-01-01

    The Muon System, facing the challenging requirements of conditions data storage, has extensively started to use the conditions database project 'COOL' as the basis for all its conditions data storage, both at CERN and throughout the worldwide collaboration, as decided by the ATLAS Collaboration. The management of the Muon COOL conditions database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored. The Muon conditions database is responsible for almost all of the 'non event' data and detector quality flags storage needed for debugging of the detector operations and for performing reconstruction and analysis. The COOL database allows database applications to be written independently of the underlying database technology and ensures long term compatibility with the entire ATLAS Software. COOL implements an interval of validity database, i.e. objects stored or referenced in COOL have an associated start and end time between which they are valid; the data are stored in folders, which are themselves arranged in a hierarchical structure of folder sets. The structure is simple and mainly optimized to store and retrieve the object(s) associated with a particular time. In this work, an overview of the entire Muon conditions database architecture is given, including the different sources of the data and the storage model used. In addition, the software interfaces used to access the conditions data are described, with emphasis on the Offline Reconstruction framework ATHENA and the services developed to provide the conditions data to the reconstruction.

  6. The ATLAS conditions database architecture for the Muon spectrometer

    Science.gov (United States)

    Verducci, Monica; ATLAS Muon Collaboration

    2010-04-01

    The Muon System, facing the challenging requirements of conditions data storage, has extensively started to use the conditions database project 'COOL' as the basis for all its conditions data storage, both at CERN and throughout the worldwide collaboration, as decided by the ATLAS Collaboration. The management of the Muon COOL conditions database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored. The Muon conditions database is responsible for almost all of the 'non event' data and detector quality flags storage needed for debugging of the detector operations and for performing reconstruction and analysis. The COOL database allows database applications to be written independently of the underlying database technology and ensures long term compatibility with the entire ATLAS Software. COOL implements an interval of validity database, i.e. objects stored or referenced in COOL have an associated start and end time between which they are valid; the data are stored in folders, which are themselves arranged in a hierarchical structure of folder sets. The structure is simple and mainly optimized to store and retrieve the object(s) associated with a particular time. In this work, an overview of the entire Muon conditions database architecture is given, including the different sources of the data and the storage model used. In addition, the software interfaces used to access the conditions data are described, with emphasis on the Offline Reconstruction framework ATHENA and the services developed to provide the conditions data to the reconstruction.
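
    The interval-of-validity idea described in the two records above can be sketched in a few lines of Python: each stored object carries a start and end time, and retrieval returns the object valid at a given time. This is a generic illustration, not the COOL API.

        from bisect import bisect_right

        class IOVFolder:
            """Toy interval-of-validity store: objects valid over [start, end) time ranges."""
            def __init__(self):
                self._starts, self._entries = [], []   # kept sorted by start time

            def store(self, start, end, payload):
                i = bisect_right(self._starts, start)
                self._starts.insert(i, start)
                self._entries.insert(i, (start, end, payload))

            def retrieve(self, t):
                i = bisect_right(self._starts, t) - 1
                if i >= 0:
                    start, end, payload = self._entries[i]
                    if start <= t < end:
                        return payload
                return None

        calib = IOVFolder()
        calib.store(0, 1000, {"pedestal": 3.1})
        calib.store(1000, 2000, {"pedestal": 3.4})
        print(calib.retrieve(1500))   # -> {'pedestal': 3.4}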

  7. GiSAO.db: a database for ageing research

    Directory of Open Access Journals (Sweden)

    Grillari Johannes

    2011-05-01

    Full Text Available Abstract Background Age-related gene expression patterns of Homo sapiens as well as of model organisms such as Mus musculus, Saccharomyces cerevisiae, Caenorhabditis elegans and Drosophila melanogaster are a basis for understanding the genetic mechanisms of ageing. For an effective analysis and interpretation of expression profiles it is necessary to store and manage huge amounts of data in an organized way, so that these data can be accessed and processed easily. Description GiSAO.db (Genes involved in senescence, apoptosis and oxidative stress) database is a web-based database system for storing and retrieving ageing-related experimental data. Expression data of genes and miRNAs, annotation data like gene identifiers and GO terms, ortholog data and data of follow-up experiments are stored in the database. A user-friendly web application provides access to the stored data. KEGG pathways were incorporated and links to external databases augment the information in GiSAO.db. Search functions facilitate retrieval of data, which can also be exported for further processing. Conclusions We have developed a centralized database that is very well suited for the management of data for ageing research. The database can be accessed at https://gisao.genome.tugraz.at and all the stored data can be viewed with a guest account.

  8. Open access for ALICE analysis based on virtualization technology

    CERN Document Server

    Buncic, P; Schutz, Y

    2015-01-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modi...

  9. Wireless Access

    Indian Academy of Sciences (India)

    Wireless connect to the Base station. Easy and Convenient access. Costlier as compared to the wired technology. Reliability challenges. We see it as a complementary technology to the DSL.

  10. NLM Emergency Access Initiative: FAQs

    Science.gov (United States)

    What is the Emergency Access Initiative? The Emergency Access Initiative (EAI) is a collaborative partnership between NLM and participating publishers to

  11. National Radiobiology Archives Distributed Access user's manual

    International Nuclear Information System (INIS)

    Watson, C.; Smith, S.; Prather, J.

    1991-11-01

    This User's Manual describes installation and use of the National Radiobiology Archives (NRA) Distributed Access package. The package consists of a distributed subset of information representative of the NRA databases and database access software which provide an introduction to the scope and style of the NRA Information Systems

  12. The TJ-II Relational Database Access Library: A User's Guide; Libreria de Acceso a la Base de Datos Relacional de TJ-II: Guia del Usuario

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, E.; Portas, A. B.; Vega, J.

    2003-07-01

    A relational database has been developed to store data representing physical values from TJ-II discharges. This new database complements the existing TJ-II raw data database. The database resides in a host computer running the Windows 2000 Server operating system and is managed by SQL Server. A function library has been developed that permits remote access to these data from user programs running on computers connected to the TJ-II local area networks, via remote procedure call. In this document a general description of the database and its organization is provided. Also given are a detailed description of the functions included in the library and examples of how to use these functions in computer programs written in the FORTRAN and C languages. (Author) 8 refs.

  13. XCOM: Photon Cross Sections Database

    Science.gov (United States)

    SRD 8 XCOM: Photon Cross Sections Database (Web, free access)   A web database is provided which can be used to calculate photon cross sections for scattering, photoelectric absorption and pair production, as well as total attenuation coefficients, for any element, compound or mixture (Z <= 100) at energies from 1 keV to 100 GeV.

  14. Timeliness and Predictability in Real-Time Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H

    1998-01-01

    The confluence of computers, communications, and databases is quickly creating a globally distributed database where many applications require real time access to both temporally accurate and multimedia data...

  15. RODOS database adapter

    International Nuclear Information System (INIS)

    Xie Gang

    1995-11-01

    Integrated data management is an essential aspect of many automated information systems such as RODOS, a real-time on-line decision support system for nuclear emergency management. In particular, the application software must provide access management to different commercial database systems. This report presents the tools necessary for adapting embedded SQL applications to both HP-ALLBASE/SQL and CA-Ingres/SQL databases. The design of the database adapter and the concept of the RODOS embedded SQL syntax are discussed by considering some of the most important features of SQL functions and by identifying significant differences between SQL implementations. Finally, the software developed and the administrator's and installation guides are described. (orig.) [de]

  16. Database Description - NBDC NikkajiRDF | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available or Name: Japan Science and Technology Agency (JST) Creator Affiliation: Contact a...e information Database maintenance site Japan Science and Technology Agency (JST) URL of the original websit

  17. Kansas Cartographic Database (KCD)

    Data.gov (United States)

    Kansas Data Access and Support Center — The Kansas Cartographic Database (KCD) is an exact digital representation of selected features from the USGS 7.5 minute topographic map series. Features that are...

  18. Specialist Bibliographic Databases.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A; Trukhachev, Vladimir I; Kostyukova, Elena I; Gerasimov, Alexey N; Kitas, George D

    2016-05-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarizing with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is mandatory for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find particularly useful source selection criteria and apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls.

  19. Specialist Bibliographic Databases

    Science.gov (United States)

    2016-01-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarizing with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is mandatory for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find particularly useful source selection criteria and apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls. PMID:27134485

  20. The design of distributed database system for HIRFL

    International Nuclear Information System (INIS)

    Wang Hong; Huang Xinmin

    2004-01-01

    This paper is focused on a kind of distributed database system used in the HIRFL distributed control system. The database of this distributed database system is built with SQL Server 2000, and its application system adopts the Client/Server model. Visual C++ is used to develop the applications, and the applications use ODBC to access the database. (authors)

  1. Development of Human Face Literature Database Using Text Mining Approach: Phase I.

    Science.gov (United States)

    Kaur, Paramjit; Krishan, Kewal; Sharma, Suresh K

    2018-06-01

    The face is an important part of the human body by which an individual communicates in society. Its importance is highlighted by the fact that a person deprived of a face cannot sustain themselves in the living world. The number of experiments being performed and the number of research papers being published in the domain of the human face have surged in the past few decades. Several scientific disciplines conduct research on the human face, including Medical Science, Anthropology, Information Technology (Biometrics, Robotics, Artificial Intelligence, etc.), Psychology, Forensic Science and Neuroscience. This signals the need to collect and manage the data concerning the human face so that free public access to it can be provided to the scientific community. This can be attained by developing databases and tools on the human face using a bioinformatics approach. The current research emphasizes creating a database of the literature on the human face. The database can be accessed on the basis of specific keywords, journal name, date of publication, author's name, etc. The collected research papers are stored in the database. Hence, the database will be beneficial to the research community, as comprehensive information dedicated to the human face can be found in one place. Information related to facial morphologic features, facial disorders, facial asymmetry, facial abnormalities, and many other parameters can be extracted from this database. The front end has been developed using Hyper Text Mark-up Language and Cascading Style Sheets. The back end has been developed using the hypertext preprocessor (PHP). JavaScript has been used as the scripting language. MySQL (Structured Query Language) is used for database development, as it is the most widely used Relational Database Management System. The XAMPP (X (cross platform), Apache, MySQL, PHP, Perl) open source web application software has been used as the server. The database is still under the
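
    A minimal relational sketch of the literature database described above, using Python's sqlite3 in place of the MySQL/PHP stack; the schema and the sample keyword search are illustrative assumptions, not the project's actual tables.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE paper (id INTEGER PRIMARY KEY, title TEXT, journal TEXT,
                                pub_date TEXT, authors TEXT, abstract TEXT);
            CREATE TABLE keyword (paper_id INTEGER, term TEXT);
        """)
        con.execute("INSERT INTO paper VALUES (1, 'Facial asymmetry in identification', "
                    "'J Forensic Sci', '2017-03-01', 'Doe J; Roe A', 'Abstract text...')")
        con.execute("INSERT INTO keyword VALUES (1, 'facial asymmetry')")

        # Retrieve papers by keyword and date of publication, as the abstract describes.
        rows = con.execute("""
            SELECT p.title, p.journal, p.pub_date
            FROM paper p JOIN keyword k ON k.paper_id = p.id
            WHERE k.term = ? AND p.pub_date >= ?
        """, ("facial asymmetry", "2015-01-01")).fetchall()
        print(rows)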

  2. Database reliability engineering designing and operating resilient database systems

    CERN Document Server

    Campbell, Laine

    2018-01-01

    The infrastructure-as-code revolution in IT is also affecting database administration. With this practical book, developers, system administrators, and junior to mid-level DBAs will learn how the modern practice of site reliability engineering applies to the craft of database architecture and operations. Authors Laine Campbell and Charity Majors provide a framework for professionals looking to join the ranks of today’s database reliability engineers (DBRE). You’ll begin by exploring core operational concepts that DBREs need to master. Then you’ll examine a wide range of database persistence options, including how to implement key technologies to provide resilient, scalable, and performant data storage and retrieval. With a firm foundation in database reliability engineering, you’ll be ready to dive into the architecture and operations of any modern database. This book covers: Service-level requirements and risk management Building and evolving an architecture for operational visibility ...

  3. Databases as policy instruments. About extending networks as evidence-based policy

    Directory of Open Access Journals (Sweden)

    Stoevelaar Herman

    2007-12-01

    Full Text Available Abstract Background This article seeks to identify the role of databases in health policy. Access to information and communication technologies has changed traditional relationships between the state and professionals, creating new systems of surveillance and control. As a result, databases may have a profound effect on controlling clinical practice. Methods We conducted three case studies to reconstruct the development and use of databases as policy instruments. Each database was intended to be employed to control the use of one particular pharmaceutical in the Netherlands (growth hormone, antiretroviral drugs for HIV and Taxol, respectively). We studied the archives of the Dutch Health Insurance Board, conducted in-depth interviews with key informants and organized two focus groups, all focused on the use of databases both in policy circles and in clinical practice. Results Our results demonstrate that policy makers hardly used the databases, neither for cost control nor for quality assurance. Further analysis revealed that these databases facilitated self-regulation and quality assurance by (national) bodies of professionals, resulting in restrictive prescription behavior amongst physicians. Conclusion The databases fulfill control functions that were formerly located within the policy realm. The databases facilitate collaboration between policy makers and physicians, since they enable quality assurance by professionals. Delegating regulatory authority downwards into a network of physicians who control the use of pharmaceuticals seems to be a good alternative to centralized control on the basis of monitoring data.

  4. Improved Information Retrieval Performance on SQL Database Using Data Adapter

    Science.gov (United States)

    Husni, M.; Djanali, S.; Ciptaningtyas, H. T.; Wicaksana, I. G. N. A.

    2018-02-01

    NoSQL databases, short for Not Only SQL, are increasingly being used as the number of big data applications increases. Most systems still use relational databases (RDBs), but as data volumes grow each year, systems increasingly handle big data with NoSQL databases to analyze and access data more quickly. NoSQL emerged as a result of the exponential growth of the internet and the development of web applications. The query syntax of a NoSQL database differs from that of an SQL database and therefore requires code changes in the application. A data adapter allows applications to keep their SQL query syntax unchanged. Data adapters provide methods that can synchronize SQL databases with NoSQL databases. In addition, the data adapter provides an interface which applications can access to run SQL queries. Hence, this research applied a data adapter system to synchronize data between a MySQL database and Apache HBase using a direct access query approach, where the system allows the application to accept queries while the synchronization process is in progress. The tests performed with the data adapter show that it can synchronize between the SQL database, MySQL, and the NoSQL database, Apache HBase. The system consumes memory in the range of 40% to 60%, with processor usage moving from 10% to 90%. In addition, the NoSQL database performed better than the SQL database.
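
    A hedged sketch of the direct access query idea: the adapter keeps answering key lookups while a background job is still copying rows from the SQL store to the NoSQL store, by serving each request from whichever store already holds the row. The class and method names are illustrative, not the authors' code.

        class DataAdapter:
            """Toy adapter that lets applications keep issuing lookups while rows
            are being migrated from a SQL store to a NoSQL store."""

            def __init__(self, sql_rows):
                self.sql = dict(sql_rows)   # stand-in for the MySQL table
                self.nosql = {}             # stand-in for the HBase table

            def sync_step(self):
                """Copy one not-yet-migrated row (called repeatedly by a background job)."""
                for key, value in self.sql.items():
                    if key not in self.nosql:
                        self.nosql[key] = value
                        return key
                return None

            def get(self, key):
                """Direct access query: prefer the NoSQL copy, fall back to SQL."""
                return self.nosql.get(key, self.sql.get(key))

        adapter = DataAdapter({"row1": "a", "row2": "b"})
        print(adapter.get("row2"))   # served from SQL before migration
        adapter.sync_step()
        print(adapter.get("row1"))   # served from NoSQL once migrated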

  5. Improving patient access to novel medical technologies in Europe.

    LENUS (Irish Health Repository)

    Kearney, Peter

    2012-02-03

    The European Society of Cardiology (ESC) organized a one-day workshop with clinicians, health economic experts, and health technology appraisal experts to discuss the equity of patient access to novel medical technologies in Europe. Two index technologies were considered: implantable cardioverter defibrillators (ICDs) and drug-eluting stents (DES). The use of ICDs ranges from 35 implants/million population in Portugal to 166 implants/million population in Germany, whereas the use of DES (as a percentage of total stents) is lowest in Germany at 14% and highest in Portugal at 65%. These differences can in part be explained by a lack of structured implementation of guidelines, by the direct cost in relation to the overall healthcare budget, and by differences in the procedures and models applied by Health Technology Assessment (HTA) agencies in Europe. The workshop participants concluded that physicians need to be involved in a more structured way in HTA and need to become better acquainted with its methods and terminology. Clinical guidelines should be systematically translated, explained, disseminated, updated, and adopted by cardiologists in Europe. Clinically appropriate, consistent and transparent health economic models need to be developed, and high-quality international outcome and cost data should be used. A process for funding a technology should be developed after a positive recommendation from HTA agencies. Both the ESC and the national cardiac societies should build up health economic expertise and engage more actively in discussions with stakeholders involved in the provision of healthcare.

  6. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. The data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 9 figs

  7. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 10 figs

  8. DOE Order 5480.28 Hanford facilities database

    Energy Technology Data Exchange (ETDEWEB)

    Hayenga, J.L., Westinghouse Hanford

    1996-09-01

    This document describes the development of a database of DOE and/or leased Hanford Site facilities. The completed database will consist of structure/facility parameters essential to the prioritization of these structures for natural phenomena hazard vulnerability in compliance with DOE Order 5480.28, 'Natural Phenomena Hazards Mitigation'. The prioritization process will be based upon the structure/facility vulnerability to natural phenomena hazards. The Access-based database, 'Hanford Facilities Site Database', is generated from current Hanford Site information and databases.

  9. Accessing and operating agricultural machinery: Advancements in assistive technology for users with impaired mobility.

    Science.gov (United States)

    Ehlers, Shawn G; Field, William E

    2018-02-14

    This research focused on the advancements made in enabling agricultural workers with impaired mobility to access and operate off-road agricultural machinery. Although not a new concept, technological advancements in remote-controlled lifts, electronic actuators, electric over hydraulic controllers, and various modes of hand controls have advanced significantly, allowing operators with limited mobility to resume a high level of productivity in agricultural-related enterprises. In the United States, approximately 1.7% of the population is living with some form of paralysis or significant mobility impairment. When paired with the 2012 USDA Agriculture Census of 3.2 million farmers, it can be extrapolated that these technologies could impact 54,000 agricultural workers who have encountered disabling injuries or disease, which inhibit their ability to access and operate tractors, combines, and other self-propelled agricultural machines. Advancements in agricultural-specific technologies can allow for many of these individuals to regain the ability to effectively operate machinery once more.

  10. Some Considerations about Modern Database Machines

    Directory of Open Access Journals (Sweden)

    Manole VELICANU

    2010-01-01

    Full Text Available Optimizing the two computing resources of any computing system - time and space - has always been one of the priority objectives of any database. A current and effective solution in this respect is the database machine. Optimizing computer applications by means of database machines has been a steady preoccupation of researchers since the late seventies. Several information technologies have revolutionized the present information framework. Out of these, those which have brought a major contribution to the optimization of databases are: efficient handling of large volumes of data (Data Warehouse, Data Mining, OLAP – On Line Analytical Processing), the improvement of DBMS – Database Management Systems – facilities through the integration of new technologies, and the dramatic increase in computing power together with its efficient use (computer networks, massive parallel computing, Grid Computing and so on). All these information technologies, and others, have favored the resumption of research on database machines and the obtaining, in the last few years, of some very good practical results as far as the optimization of computing resources is concerned.

  11. Ceramics Technology Project database: September 1991 summary report

    Energy Technology Data Exchange (ETDEWEB)

    Keyes, B.L.P.

    1992-06-01

    The piston ring-cylinder liner area of the internal combustion engine must withstand very high temperature gradients, highly corrosive environments, and constant friction. Improving engine efficiency requires ring and cylinder liner materials that can survive this abusive environment and lubricants that resist decomposition at elevated temperatures. Wear and friction tests have been done on many material combinations, in environments similar to actual use, to find the right materials for the situation. This report covers tribology information produced from 1986 through July 1991 by Battelle Columbus Laboratories, Caterpillar Inc., and Cummins Engine Company, Inc. for the Ceramic Technology Project (CTP). All data in this report were taken from the project's semiannual and bimonthly progress reports and cover base materials, coatings, and lubricants. The data, including test rig descriptions and material characterizations, are stored in the CTP database and are available to all project participants on request. The objective of this report is to make the test results from these studies available, not to draw conclusions from these data.

  12. IAEA/NDS requirements related to database software

    International Nuclear Information System (INIS)

    Pronyaev, V.; Zerkin, V.

    2001-01-01

    Full text: The Nuclear Data Section of the IAEA disseminates data to the NDS users through the Internet or on CD-ROMs and diskettes. The OSU Web server on DEC Alpha with OpenVMS and Oracle/DEC DBMS provides, via CGI scripts and FORTRAN retrieval programs, access to the main nuclear databases supported by the networks of Nuclear Reactions Data Centres and Nuclear Structure and Decay Data Centres (CINDA, EXFOR, ENDF, NSR, ENSDF). For Web access to data from other libraries and files, hyperlinks to the files stored in ASCII text or other formats are used. Databases on CD-ROM are usually provided with some retrieval system. They are distributed in run-time mode and comply with all license requirements for the software used in their development. Although major development work is now done on PCs with MS Windows and Linux, the NDS may not at present, due to some institutional conditions, use these platforms for organization of the Web access to the data. Starting at the end of 1999, the NDS, in co-operation with other data centers, began to work out a strategy for migrating the main network nuclear databases onto platforms other than DEC Alpha/OpenVMS/DBMS. Because the different co-operating centers have their own preferences for hardware and software, the requirement to provide maximum platform independence for nuclear databases is the most important and desirable feature. This requirement determined some standards for nuclear database software development. Taking into account the present state and future development, these standards can be formulated as follows: 1. All numerical data (experimental, evaluated, recommended values and their uncertainties) prepared for inclusion in the IAEA/NDS nuclear databases should be submitted in the form of ASCII text files and will be kept at NDS as master files. 2. Databases with complex structure should be submitted in the form of files with standard SQL statements describing all their components. All extensions of standard SQL
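
    Point 2 above (submitting a database's structure as a file of standard SQL statements) can be illustrated with a short, platform-neutral example; the table below is hypothetical and not an actual NDS library.

        import sqlite3

        # A submission file containing only standard SQL, so any RDBMS can load it.
        SUBMISSION_SQL = """
        CREATE TABLE cross_section (
            target      TEXT NOT NULL,
            reaction    TEXT NOT NULL,
            energy_mev  REAL NOT NULL,
            value_barn  REAL NOT NULL,
            uncertainty REAL
        );
        INSERT INTO cross_section VALUES ('Fe-56', '(n,p)', 14.1, 0.112, 0.005);
        """

        con = sqlite3.connect(":memory:")
        con.executescript(SUBMISSION_SQL)   # the same statements would load into any SQL engine
        print(con.execute("SELECT * FROM cross_section").fetchall())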

  13. Rhinoplasty perioperative database using a personal digital assistant.

    Science.gov (United States)

    Kotler, Howard S

    2004-01-01

    To construct a reliable, accurate, and easy-to-use handheld computer database that facilitates the point-of-care acquisition of perioperative text and image data specific to rhinoplasty. A user-modified database (Pendragon Forms [v.3.2]; Pendragon Software Corporation, Libertyville, Ill) and graphic image program (Tealpaint [v.4.87]; Tealpaint Software, San Rafael, Calif) were used to capture text and image data, respectively, on a Palm OS (v.4.11) handheld operating with 8 megabytes of memory. The handheld and desktop databases were maintained secure using PDASecure (v.2.0) and GoldSecure (v.3.0) (Trust Digital LLC, Fairfax, Va). The handheld data were then uploaded to a desktop database of either FileMaker Pro 5.0 (v.1) (FileMaker Inc, Santa Clara, Calif) or Microsoft Access 2000 (Microsoft Corp, Redmond, Wash). Patient data were collected from 15 patients undergoing rhinoplasty in a private practice outpatient ambulatory setting. Data integrity was assessed after 6 months' disk and hard drive storage. The handheld database was able to facilitate data collection and accurately record, transfer, and reliably maintain perioperative rhinoplasty data. Query capability allowed rapid search using a multitude of keyword search terms specific to the operative maneuvers performed in rhinoplasty. Handheld computer technology provides a method of reliably recording and storing perioperative rhinoplasty information. The handheld computer facilitates the reliable and accurate storage and query of perioperative data, assisting the retrospective review of one's own results and enhancement of surgical skills.

  14. Medicaid CHIP ESPC Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Environmental Scanning and Program Characteristic (ESPC) Database is in a Microsoft (MS) Access format and contains Medicaid and CHIP data for the 50 states and...

  15. The applied technologies to access clean water for remote communities

    Science.gov (United States)

    Rabindra, I. B.

    2018-01-01

    A great deal of research has been done to help remote communities access clean water, yet very little of it is utilized and implemented by those communities. Various reasons can be put forward for this, chief among them that the application of the research results is judged impractical. The aim of this paper is to seek a practical approach: how to establish design criteria that can be applied more easily, at the proper locations, with simple construction, and that effectively produce the intended volume and quality of clean water. The method used in this paper is an assessment of the clean-water treatment/filtration technology models produced by a variety of previous research, in order to establish a model of appropriate technology for remote communities. Various research results were collected from a literature study, while the opportunities for and threats to their application were identified using a SWOT analysis. The discussion in this article looks for alternative models of clean-water filtration technology from previous research results, to be selected as appropriate technology that is easily applied and brings many benefits to remote communities. The conclusions resulting from the discussion are expected to be used as the basic design criteria for clean-water filtration technologies that can be accepted and applied effectively by remote communities.

  16. THE EXTRAGALACTIC DISTANCE DATABASE

    International Nuclear Information System (INIS)

    Tully, R. Brent; Courtois, Helene M.; Jacobs, Bradley A.; Rizzi, Luca; Shaya, Edward J.; Makarov, Dmitry I.

    2009-01-01

    A database developed to promote access to information related to galaxy distances can be accessed on the Web at http://edd.ifa.hawaii.edu. The database has three functional components. First, tables from many literature sources have been gathered and enhanced with links through a distinct galaxy naming convention. Second, comparisons of results, both at the level of parameters and of techniques, have begun and are continuing, leading to increasing homogeneity and consistency of distance measurements. Third, new material is presented arising from ongoing observational programs at the University of Hawaii 2.2 m telescope, at radio telescopes at Green Bank, Arecibo, and Parkes, and with the Hubble Space Telescope. This new observational material is made available in tandem with related material drawn from archives and passed through common analysis pipelines.

  17. 經由校園網路存取圖書館光碟資料庫之研究 Studies on Multiuser Access Library CD-ROM Database via Campus Network

    Directory of Open Access Journals (Sweden)

    Ruey-shun Chen

    1992-06-01

    Full Text Available Library CD-ROM, with its enormous storage and retrieval capabilities and reasonable price, has been gradually replacing some of its printed counterparts. But one of the greatest limitations of a stand-alone CD-ROM workstation is that only one user can access the CD-ROM database at a time. This paper proposes a new method to solve this problem. With this method, personal computers can access the library CD-ROM database via the standard Ethernet network, the high-speed fiber network FDDI, and the standard TCP/IP protocol, yielding a practical campus-wide CD-ROM network system. Its advantages are that it reduces redundant CD-ROM purchase fees, reduces damage from discs being handed in and out, and allows multiple users to access the same CD-ROM disc simultaneously.
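
    A minimal sketch of the multiuser idea, not the paper's actual software: a small TCP/IP service lets several networked PCs query a shared CD-ROM index concurrently. The mount path, port number, and record format are hypothetical.

        # Sketch only: concurrent keyword queries against a shared CD-ROM index over TCP/IP.
        import socketserver

        CDROM_INDEX = "/mnt/cdrom/index.txt"   # hypothetical mounted CD-ROM index

        class QueryHandler(socketserver.StreamRequestHandler):
            def handle(self):
                # Each client sends one keyword per line and receives matching records.
                keyword = self.rfile.readline().decode().strip().lower()
                with open(CDROM_INDEX, encoding="utf-8") as f:
                    hits = [line for line in f if keyword in line.lower()]
                self.wfile.write("".join(hits).encode("utf-8"))

        if __name__ == "__main__":
            # A threading server allows many users to search the same disc simultaneously.
            with socketserver.ThreadingTCPServer(("0.0.0.0", 9000), QueryHandler) as server:
                server.serve_forever()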

  18. DataBase on Demand

    International Nuclear Information System (INIS)

    Aparicio, R Gaspar; Gomez, D; Wojcik, D; Coz, I Coterillo

    2012-01-01

    At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment in which to develop and run database services outside of the centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions that have traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, at present the open community version of MySQL and a single-instance Oracle database server. This article describes a technological approach to this challenge, the service level agreement (SLA) that the project provides, and an evolution of possible scenarios.
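
    From the application side, a user-managed instance of this kind is reached with an ordinary client driver rather than any DBoD-specific API. The sketch below assumes a hypothetical hostname, port, and set of credentials; it simply verifies connectivity to a MySQL instance.

        # Sketch only: connect an application to a user-managed MySQL instance.
        import mysql.connector  # pip install mysql-connector-python

        conn = mysql.connector.connect(
            host="dbod-myapp.cern.ch",   # hypothetical instance hostname
            port=5500,                   # hypothetical per-instance port
            user="app_user",
            password="app_password",
            database="myapp",
        )
        cur = conn.cursor()
        cur.execute("SELECT VERSION()")
        print("Connected to MySQL server version:", cur.fetchone()[0])
        cur.close()
        conn.close()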

  19. Assessing the engagement, learning, and overall experience of students operating an atomic absorption spectrophotometer with remote access technology.

    Science.gov (United States)

    Erasmus, Daniel J; Brewer, Sharon E; Cinel, Bruno

    2015-01-01

    The use of internet-based technologies in the teaching of laboratory courses has emerged as a promising educational tool. This study evaluated the effectiveness of using remote access technology to operate an atomic absorption spectrophotometer to analyze the iron content of a crude myoglobin extract. Sixty-two students were surveyed on their level of engagement, learning, and overall experience. Feedback from the students suggests that the use of remote access technology is effective in teaching students the principles of chemical analysis by atomic absorption spectroscopy. © 2014 The International Union of Biochemistry and Molecular Biology.

  20. Waste Tank Vapor Project: Tank vapor database development

    International Nuclear Information System (INIS)

    Seesing, P.R.; Birn, M.B.; Manke, K.L.

    1994-09-01

    The objective of the Tank Vapor Database (TVD) Development task in FY 1994 was to create a database to store, retrieve, and analyze data collected from the vapor phase of Hanford waste tanks. The data needed to be accessible over the Hanford Local Area Network to users at both Westinghouse Hanford Company (WHC) and Pacific Northwest Laboratory (PNL). The data were restricted to results published in cleared reports from the laboratories analyzing vapor samples. Emphasis was placed on ease of access and flexibility of data formatting and reporting mechanisms. Because of time and budget constraints, a Rapid Application Development strategy was adopted by the database development team. An extensive data modeling exercise was conducted to determine the scope of information contained in the database. A SUN SPARCstation 1000 was procured as the database file server. A multi-user relational database management system, Sybase®, was chosen to provide the basic data storage and retrieval capabilities. Two packages were chosen for the user interface to the database: DataPrism® and Business Objects™. A prototype database was constructed to provide the Waste Tank Vapor Project's Toxicology task with summarized and detailed information presented at Vapor Conference 4 by WHC, PNL, Oak Ridge National Laboratory, and Oregon Graduate Institute. The prototype was used to develop a list of reported compounds and the range of values for compounds reported by the analytical laboratories using different sample containers and analysis methodologies. The prototype allowed a panel of toxicology experts to identify carcinogens and compounds whose concentrations were within reach of regulatory limits. The database and user documentation were made available for general access in September 1994.
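
    The kind of summary the prototype produced, ranges of reported concentrations for each compound broken down by sample container and analysis methodology, maps naturally onto a grouped relational query. The sketch below uses a hypothetical schema and SQLite in place of the project's Sybase server.

        # Sketch only: concentration ranges per compound, container, and methodology.
        import sqlite3

        conn = sqlite3.connect("tank_vapor.db")
        query = """
            SELECT compound,
                   container,
                   methodology,
                   MIN(concentration) AS min_conc,
                   MAX(concentration) AS max_conc,
                   COUNT(*)           AS n_results
            FROM vapor_results
            GROUP BY compound, container, methodology
            ORDER BY compound
        """
        for row in conn.execute(query):
            print(row)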