WorldWideScience

Sample records for geo database accession

  1. Geoscientific (GEO) database of the Andra Meuse / Haute-Marne research center

    International Nuclear Information System (INIS)

    Tabani, P.; Hemet, P.; Hermand, G.; Delay, J.; Auriere, C.

    2010-01-01

    Document available in extended abstract form only. The GEO database (geo-scientific database of the Meuse/Haute-Marne Center) is a tool developed by Andra to group, in a secure computerized form, all data related to in situ and laboratory measurements made on solid and fluid samples. This database has three main functions: - Acquisition and management of data and computer files related to geological, geomechanical, hydrogeological and geochemical measurements on solid and fluid samples and to in situ measurements (logging, on-sample measurements, geological logs, etc.). - Consultation by staff on Andra's intranet network, for selective viewing of data linked to a borehole and/or a sample and for computations and graphs on sets of laboratory measurements related to a sample. - Physical management of fluid and solid samples stored in a 'core library', in order to locate a sample, follow its movement out of the 'core library' to an organization, and carry out regular inventories. The GEO database is a relational Oracle database installed on a data server, which stores the information and manages users' transactions. Users can consult, download and exploit data from any computer connected to the Andra network or the Internet; access rights are managed through a login/password. Four geo-scientific applications are linked to the GEO database, among them: - The Geosciences portal: a web intranet application accessible from the Andra network. It requires no particular installation on the client side and is reached through the Internet browser. A SQL Server Express database manages the users and access rights of the application. This application is used for the acquisition of hydrogeological and geochemical data collected in the field and on fluid samples, as well as data related to scientific work carried out at surface level or in drifts
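    As a rough illustration of the relational organization described above, the following is a minimal sketch; the table and column names, the borehole name, and the measurement values are hypothetical, not Andra's actual schema.

```python
import sqlite3

# Hypothetical, simplified schema for boreholes, samples and measurements.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE borehole (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE sample (id INTEGER PRIMARY KEY,
                     borehole_id INTEGER REFERENCES borehole(id),
                     depth_m REAL, kind TEXT);  -- 'solid' or 'fluid'
CREATE TABLE measurement (id INTEGER PRIMARY KEY,
                          sample_id INTEGER REFERENCES sample(id),
                          domain TEXT, parameter TEXT, value REAL);
""")
con.execute("INSERT INTO borehole VALUES (1, 'BH-001')")
con.execute("INSERT INTO sample VALUES (1, 1, 452.3, 'solid')")
con.execute("INSERT INTO measurement VALUES (1, 1, 'geochemistry', 'CaCO3_wt_pct', 23.4)")

# Selective viewing: all measurements linked to one borehole.
rows = con.execute("""
    SELECT b.name, s.depth_m, m.domain, m.parameter, m.value
    FROM measurement m
    JOIN sample s   ON m.sample_id = s.id
    JOIN borehole b ON s.borehole_id = b.id
    WHERE b.name = ?
""", ("BH-001",)).fetchall()
print(rows)
```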

  2. Geo-scientific database for research and development purposes

    International Nuclear Information System (INIS)

    Tabani, P.; Mangeot, A.; Crabol, V.; Delage, P.; Dewonck, S.; Auriere, C.

    2012-01-01

    The database, the fruit of continuous computer development over the past ten years, can store several hundred million data items. The GEO database (geo-scientific database) is a tool developed by Andra since 1992 to group, in a secure computerized form, all data related to in situ and laboratory measurements made on solid and fluid samples, as well as observations related to the environment. This database has three main functions: - Acquisition and management of data and computer files related to geological, geomechanical, hydrogeological and geochemical measurements on solid and fluid samples and to in situ measurements (logging, on-sample measurements, geological logs, etc.), as well as observations on fauna and flora. - Consultation by staff on Andra's intranet network, for selective viewing of data linked to a borehole, a sample or a watch point and for computations and graphs on sets of laboratory measurements related to a sample. - Physical management of fluid and solid samples stored in a 'core library', in order to locate a sample, follow its movement out of the 'core library' to an organization, and carry out regular inventories. Three geo-scientific software applications are linked to the GEO database: - The Geosciences portal: a web intranet application accessible from the Andra network, used for the acquisition of hydrogeological and geochemical data collected in the field and on fluid samples, observations related to environmental monitoring, and data related to scientific work carried out at surface level or in drifts. - GESTECH: an application used to integrate geomechanical and geological data collected on solid samples into the GEO database. - INTEGRAT: an application that automatically integrates data files into the GEO database. For the sake of traceability and efficiency, references of the fluid and solid samples, of the containers (crates, cells, etc.) and storage zones of the 'core library

  3. Discovery of accessible locations using region-based geo-social data

    KAUST Repository

    Wang, Yan; Li, Jianmin; Zhong, Ying; Zhu, Shunzhi; Guo, Danhuai; Shang, Shuo

    2018-01-01

    Geo-social data plays a significant role in location discovery and recommendation. In this light, we propose and study a novel problem of discovering accessible locations in spatial networks using region-based geo-social data. Given a set Q of query

  4. GeoInt: the first macroseismic intensity database for the Republic of Georgia

    Science.gov (United States)

    Varazanashvili, O.; Tsereteli, N.; Bonali, F. L.; Arabidze, V.; Russo, E.; Pasquaré Mariotto, F.; Gogoladze, Z.; Tibaldi, A.; Kvavadze, N.; Oppizzi, P.

    2018-05-01

    Our work presents the new macroseismic intensity database for the Republic of Georgia, hereby named GeoInt, which includes earthquakes from the historical era (from 1250 B.C. onwards) to the instrumental era. The database is composed of 111 selected earthquakes and 3944 related intensity data points (IDPs) for 1509 different localities, reported in the Medvedev-Sponheuer-Karnik (MSK) scale. The earthquakes span the surface-wave magnitude (Ms) range 3.3-7 and the depth range 2-36 km. The entire set of IDPs is characterized by intensities ranging from 2-3 to 9-10 and covers an area spanning from 39.508° N to 45.043° N in the N-S direction and from 37.324° E to 48.500° E in the E-W direction, with some of the IDPs located outside the Georgian border, in the (i) Republic of Armenia, (ii) Russian Federation, (iii) Republic of Turkey, and (iv) Republic of Azerbaijan. We have revised each IDP and have reevaluated and homogenized the intensity values to the MSK scale. Of the 3944 IDPs, 348 belong to the historical era (pre-1900) and 3596 to the instrumental era (post-1900). Among the 3596 instrumental-era IDPs, 105 are brand new (3%), the intensity values of 804 have been reevaluated (22%), and for the remaining 2687 (75%) the intensities confirm previous interpretations. We introduce this database, for which we report all 111 earthquakes with available macroseismic information, as a key input for further improvements in macroseismic-intensity-based seismic hazard modeling and seismic risk calculation for this region. The GeoInt database is also accessible online at http://www.enguriproject.unimib.it and will be kept updated in the future.
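    The counts quoted in the abstract are internally consistent, as a quick arithmetic check shows:

```python
# Consistency check of the IDP counts quoted in the GeoInt abstract.
total_idps = 3944
historical, instrumental = 348, 3596
assert historical + instrumental == total_idps

new, reevaluated, confirmed = 105, 804, 2687
assert new + reevaluated + confirmed == instrumental

# Rounded percentages match the quoted 3%, 22% and 75%.
shares = [round(100 * n / instrumental) for n in (new, reevaluated, confirmed)]
print(shares)
```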

  5. The Geochemical Databases GEOROC and GeoReM - What's New?

    Science.gov (United States)

    Sarbas, B.; Jochum, K. P.; Nohl, U.; Weis, U.

    2017-12-01

    The geochemical databases GEOROC (http://georoc.mpch-mainz.gwdg.de) and GeoReM (http://georem.mpch-mainz.gwdg.de) are maintained by the Max Planck Institute for Chemistry in Mainz, Germany. Both online databases have become essential tools for geoscientists from different research areas. They are regularly upgraded with new tools and with new data from recent publications in a wide range of international journals. GEOROC is a collection of published analyses of volcanic rocks and mantle xenoliths; recently, data for plutonic rocks have been added. The analyses include major and trace element concentrations, radiogenic and non-radiogenic isotope ratios, and analytical ages for whole rocks, glasses, minerals and inclusions. Samples come from eleven geological settings and span the whole geological age scale from the Archean to Recent. Metadata include, among others, geographic location, rock class and rock type, geological age, degree of alteration, analytical method, laboratory, and reference. The GEOROC web page allows selection of samples by geological setting, geography, chemical criteria, rock or sample name, and bibliographic criteria. In addition, it provides a large number of precompiled files for individual locations, minerals and rock classes. GeoReM is a database collecting information about reference materials of geological and environmental interest, such as rock powders, synthetic and natural glasses, and mineral, isotopic, biological, river water and seawater reference materials. It contains published data and compilation values (major and trace element concentrations and mass fractions, radiogenic and stable isotope ratios). Metadata comprise, among others, uncertainty, analytical method and laboratory. Reference materials are important for calibration, method validation, quality control and the establishment of metrological traceability. GeoReM offers six different search strategies: samples or materials (published values), samples (GeoReM preferred

  6. GeoPro: Technology to Enable Scientific Modeling

    International Nuclear Information System (INIS)

    C. Juan

    2004-01-01

    Development of the ground-water flow model for the Death Valley Regional Groundwater Flow System (DVRFS) required integration of numerous supporting hydrogeologic investigations. The results from recharge, discharge, hydraulic properties, water level, pumping, model boundaries, and geologic studies were integrated to develop the required conceptual and 3-D framework models, and the flow model itself. To support the complex modeling process and the needs of the multidisciplinary DVRFS team, a hardware and software system called GeoPro (Geoscience Knowledge Integration Protocol) was developed. A primary function of GeoPro is to manage the large volume of disparate data compiled for the 100,000-square-kilometer area of southern Nevada and California. The data are primarily from previous investigations and regional flow models developed for the Nevada Test Site and Yucca Mountain projects. GeoPro utilizes relational database technology (Microsoft SQL Server™) to store and manage these tabular point data, groundwater flow model ASCII data, 3-D hydrogeologic framework data, 2-D and 2.5-D GIS data, and text documents. Data management consists of versioning, tracking, and reporting data changes as multiple users access the centralized database. GeoPro also supports the modeling process by automating the routine data transformations required to integrate project software. This automation is also crucial to streamlining pre- and post-processing of model data during model calibration. Another function of GeoPro is to facilitate the dissemination and use of the model data and results through web-based documents by linking and allowing access to the underlying database and analysis tools. The intent is to convey to end-users the complex flow model product in a manner that is simple, flexible, and relevant to their needs. GeoPro is evolving from a prototype system to a production-level product. Currently the DVRFS pre- and post-processing modeling tools are being re

  7. Ten Years Experience In Geo-Databases For Linear Facilities Risk Assessment (Lfra)

    Science.gov (United States)

    Oboni, F.

    2003-04-01

    Keywords: geo-environmental, database, ISO 14000, management, decision-making, risk, pipelines, roads, railroads, loss control, SAR, hazard identification. ABSTRACT: During the past decades, characterized by the development of the Risk Management (RM) culture, a variety of different RM models have been proposed by governmental agencies in various parts of the world. The most structured models appear to have originated in the field of environmental RM. These models are briefly reviewed in the first section of the paper, with attention to the difference between hazard management and risk management and to the need for databases that allow retrieval of specific information and effective updating. The core of the paper reviews a number of different RM approaches, based on extensions of geo-databases, developed specifically for linear facilities (LF) in transportation corridors since the early 1990s in Switzerland, Italy, Canada, the US and South America. The applications are compared in terms of methodology, capabilities and the resources necessary for their implementation. The paper then considers the level of detail that applications and related data have to attain. Common pitfalls of decision making based on hazards rather than on risks are discussed. The last sections describe the next generation of linear-facility RA applications, including examples of results and a discussion of future methodological research. It is shown that geo-databases should be linked to loss control and accident reports in order to maximize their benefits. The links between RA and ISO 14000 (environmental management code) are explicitly considered.

  8. Development of a geo-information system for the evaluation of active faults

    Energy Technology Data Exchange (ETDEWEB)

    Park, Dong Won; Pi, Nuen; Ger, Tien; Choi, Jun Hyoung [Paichai Univ., Daejeon (Korea, Republic of)]

    2003-02-15

    In this work, we exploit a hypermedia web database of a geo-information system in order to access and share multimedia data such as text, graphics, animation, audio and video. It is gaining interest as a tool for effectively accessing the large amount of information that is available.

  9. Development of a geo-information system for the evaluation of active faults

    International Nuclear Information System (INIS)

    Park, Dong Won; Pi, Nuen; Ger, Tien; Choi, Jun Hyoung

    2003-02-01

    In this work, we exploit a hypermedia web database of a geo-information system in order to access and share multimedia data such as text, graphics, animation, audio and video. It is gaining interest as a tool for effectively accessing the large amount of information that is available.

  10. The plant phenological online database (PPODB): an online database for long-term phenological data

    Science.gov (United States)

    Dierenbach, Jonas; Badeck, Franz-W.; Schaber, Jörg

    2013-09-01

    We present an online database that provides unrestricted and free access to over 16 million plant phenological observations from over 8,000 stations in Central Europe between the years 1880 and 2009. Unique features are (1) flexible and unrestricted access to a full-fledged database, allowing a wide range of individual queries and data retrieval, (2) historical data for Germany before 1951, reaching back to 1880, and (3) more than 480 curated long-term time series covering more than 100 years for individual phenological phases and plants, combined over Natural Regions in Germany. Time series for single stations or Natural Regions can be accessed through a user-friendly, geo-referenced graphical interface. The joint databases made available with PPODB render accessible an important data source for further analyses of long-term changes in phenology. The database can be accessed via www.ppodb.de.

  11. PaleoGeo: a Web based GIS database for paleoenvironmental studies

    Science.gov (United States)

    Song, Wonsuh; Kondo, Yasuhisa; Oguchi, Takashi

    2014-05-01

    Paleoenvironmental studies cover various fields such as paleohydrology, geomorphology, paleooceanology, paleobiology, paleoclimatology, and chronology. It is difficult for an individual researcher to collect and compile the enormous amounts of data generated in these fields. We have been compiling such data and presenting them using a web-based geographical information system (Web-GIS) called PaleoGeo for the multidisciplinary project 'Replacement of Neanderthals by Modern Humans'. The aim of the project is to reconstruct the distribution of Neanderthals and modern humans in time and space in relation to past climate change. We have been collecting information from almost three thousand articles in 13 journals on paleoenvironmental research (i.e., Boreas, Catena, Climatic Change, Earth Surface Processes and Landforms, Geomorphology, Journal of Quaternary Science, Palaeogeography, Palaeoclimatology, Palaeoecology, Quaternary International, Quaternary Research, Quaternary Science Reviews, The Holocene, and The Journal of Geology). The topics of the articles were classified into six themes (paleohydrology, earth surface processes and materials, paleooceanology, paleobiology, paleoclimatology, and chronology) and 19 subthemes (hydrology, flood, fluvial, glacier, fluvial/glacier, sedimentology, soil, slope process, periglacial, peat land, eolian, sea-level, biology, vegetation, zoology, vegetation/zoology, archaeology, climate, atmosphere, and chronology). The collected data consist of the journal name, information about each paper (authors, title, volume, year, and page numbers), site location (country name, longitude, and latitude), theme, subtheme, keywords, DOI (Digital Object Identifier), and period (era). Location data are indispensable for paleoenvironmental studies, and PaleoGeo shows the information on a map, which is an advantage of this database system. However, the number of paleoenvironmental studies is growing rapidly, and we have to cover them effectively as

  12. A linked GeoData map for enabling information access

    Science.gov (United States)

    Powell, Logan J.; Varanka, Dalia E.

    2018-01-10

    attributed Resource Description Framework (RDF) serializations of linked data for mapping. The proof-of-concept focuses on accessing triple data from visual elements of a geographic map as the interface to the MKB. The map interface is embedded with other essential functions such as SPARQL Protocol and RDF Query Language (SPARQL) query endpoint services and the reasoning capabilities of Apache Marmotta (Apache Software Foundation, 2017). An RDF database of the Geographic Names Information System (GNIS), which contains official names of domestic features in the United States, was linked to a county data layer from The National Map of the U.S. Geological Survey. The county data are part of a broader Government Units theme offered to the public as Esri shapefiles. The shapefile used to draw the map itself was converted to the geographic-oriented JavaScript Object Notation (JSON) format, GeoJSON, and linked through various properties with a linked-geodata version of the GNIS database called “GNIS–LD” (Butler and others, 2016; B. Regalia and others, University of California-Santa Barbara, written commun., 2017). The GNIS–LD files originated in Terse RDF Triple Language (Turtle) format but were converted to a JSON format specialized in linked data, “JSON–LD” (Beckett and Berners-Lee, 2011; Sporny and others, 2014). The GNIS–LD database is composed of roughly three predominant triple data graphs: Features, Names, and History. The graphs include a set of namespace prefixes used by each of the attributes. Predefining the prefixes made the conversion to the JSON–LD format simple to complete, because Turtle and JSON–LD are variant specifications of the basic RDF concept. To convert a shapefile into GeoJSON format, capturing the geospatial coordinate geometry objects, an online converter, Mapshaper, was used (Bloch, 2013). To convert the Turtle files, a custom converter written in Java reconstructs the files by parsing each grouping of attributes belonging to one subject
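    The grouping of Turtle triples into JSON-LD node objects can be sketched as follows; this uses plain dictionaries rather than a real RDF parser, and the prefix table and sample triples are illustrative, not taken from the actual GNIS–LD files.

```python
import json

# Illustrative namespace prefixes (the real GNIS-LD prefixes differ).
context = {
    "gnisf": "http://example.org/gnis/feature/",      # hypothetical namespace
    "geo": "http://www.opengis.net/ont/geosparql#",
    "rdfs": "http://www.w3.org/2000/01/rdf-schema#",
}

# (subject, predicate, object) triples as a parser would emit them.
triples = [
    ("gnisf:1802710", "rdfs:label", "Washington"),
    ("gnisf:1802710", "geo:hasGeometry", "gnisf:1802710-geom"),
]

def to_jsonld(triples, context):
    """Group triples by subject into JSON-LD node objects under @graph."""
    nodes = {}
    for s, p, o in triples:
        node = nodes.setdefault(s, {"@id": s})
        node.setdefault(p, []).append(o)
    return {"@context": context, "@graph": list(nodes.values())}

doc = to_jsonld(triples, context)
print(json.dumps(doc, indent=2))
```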

  13. Fermilab Security Site Access Request Database

    Science.gov (United States)

    Use of the online version of the Fermilab Security Site Access Request Database requires logging in to the ESH&Q Web Site. Note: the page is generated from the ESH&Q Section's Oracle database (last generated May 27, 2018).

  14. Dynamic taxonomies applied to a web-based relational database for geo-hydrological risk mitigation

    Science.gov (United States)

    Sacco, G. M.; Nigrelli, G.; Bosio, A.; Chiarle, M.; Luino, F.

    2012-02-01

    In its 40 years of activity, the Research Institute for Geo-hydrological Protection of the Italian National Research Council has amassed a vast and varied collection of historical documentation on landslides, muddy-debris flows, and floods in northern Italy from 1600 to the present. Since 2008, the archive resources have been maintained through a relational database management system. The database is used for routine study and research purposes as well as for providing support during geo-hydrological emergencies, when data need to be quickly and accurately retrieved. Retrieval speed and accuracy are the main objectives of an implementation based on a dynamic taxonomies model. Dynamic taxonomies are a general knowledge management model for configuring complex, heterogeneous information bases that support exploratory searching. At each stage of the process, the user can explore or browse the database in a guided yet unconstrained way by selecting the alternatives suggested for further refining the search. Dynamic taxonomies have been successfully applied to such diverse and apparently unrelated domains as e-commerce and medical diagnosis. Here, we describe the application of dynamic taxonomies to our database and compare it to traditional relational database query methods. The dynamic taxonomy interface, essentially a point-and-click interface, is considerably faster and less error-prone than traditional form-based query interfaces that require the user to remember and type in the "right" search keywords. Finally, dynamic taxonomy users have confirmed that one of the principal benefits of this approach is the confidence of having considered all the relevant information. Dynamic taxonomies and relational databases work in synergy to provide fast and precise searching: one of the most important factors in timely response to emergencies.
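    The refine-and-count loop of a dynamic taxonomy can be sketched as follows; the event records, facet names and values here are invented for illustration and are not taken from the Institute's archive.

```python
from collections import Counter

# Toy archive of geo-hydrological event records with three facets.
events = [
    {"type": "flood",       "region": "Piedmont", "century": "19th"},
    {"type": "landslide",   "region": "Piedmont", "century": "20th"},
    {"type": "flood",       "region": "Liguria",  "century": "20th"},
    {"type": "debris flow", "region": "Piedmont", "century": "20th"},
]

def refine(items, **selected):
    """Keep only items matching every selected facet value."""
    return [e for e in items if all(e[k] == v for k, v in selected.items())]

def facet_counts(items, facet):
    """Counts shown next to each value, guiding the next refinement."""
    return Counter(e[facet] for e in items)

# Point-and-click exploration: narrow by region, see suggested alternatives,
# then narrow again by century.
step1 = refine(events, region="Piedmont")
print(facet_counts(step1, "type"))
step2 = refine(step1, century="20th")
print([e["type"] for e in step2])
```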

  15. FID GEO: Digital transformation and Open Access in Germany's geoscience research community

    Science.gov (United States)

    Hübner, Andreas; Martinson, Guntars; Bertelmann, Roland; Elger, Kirsten; Pfurr, Norbert; Schüler, Mechthild

    2017-04-01

    The 'Specialized Information Service for Solid Earth Sciences' (FID GEO) supports Germany's geoscience research community in 1) electronic publishing of i) institutional and "grey" literature not released in publishing houses and ii) pre- and postprints of research articles 2) digitising geoscience literature and maps and 3) addressing the publication of research data associated with peer-reviewed research articles (data supplements). Established in 2016, FID GEO is funded by the German Research Foundation (DFG) and is run by the Göttingen State and University Library (SUB Göttingen) and the GFZ German Research Centre for Geosciences. Here we present recent success stories and lessons learned. With regard to digitisation, FID GEO received a request from one of the most prestigious geoscience societies in Germany to digitise back-issues of its journals that are so far only available in print. Aims are to ensure long-term availability in Open Access and high visibility by DOI-referenced electronic publication via the FID GEO repository. While digitisation will be financed by FID GEO funds, major challenges are to identify the copyright holders (journals date back to 1924) and negotiate digitisation and publication rights. With respect to research data publishing, we present how we target scientists to integrate the publication of research data into their workflows and institutions to promote the topic. For the latter, we successfully take advantage of existing networks as entry points to the community, like the research network Geo.X in the Berlin-Brandenburg area, individual learned societies as well as their overarching structures DV Geo and GeoUnion. FID GEO promotes the Statement of Commitment of the Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) as well as the FAIR Data Principles in presentations to the above-mentioned groups and institutions. Our aim is to eventually transfer the positive feedback from the geoscience community into

  16. PathwayAccess: CellDesigner plugins for pathway databases.

    Science.gov (United States)

    Van Hemert, John L; Dickerson, Julie A

    2010-09-15

    CellDesigner provides a user-friendly interface for graphical biochemical pathway description. Many pathway databases are not directly exportable to CellDesigner models. PathwayAccess is an extensible suite of CellDesigner plugins, which connect CellDesigner directly to pathway databases using respective Java application programming interfaces. The process is streamlined for creating new PathwayAccess plugins for specific pathway databases. Three PathwayAccess plugins, MetNetAccess, BioCycAccess and ReactomeAccess, directly connect CellDesigner to the pathway databases MetNetDB, BioCyc and Reactome. PathwayAccess plugins enable CellDesigner users to expose pathway data to analytical CellDesigner functions, curate their pathway databases and visually integrate pathway data from different databases using standard Systems Biology Markup Language and Systems Biology Graphical Notation. Implemented in Java, PathwayAccess plugins run with CellDesigner version 4.0.1 and were tested on Ubuntu Linux, Windows XP and 7, and MacOSX. Source code, binaries, documentation and video walkthroughs are freely available at http://vrac.iastate.edu/~jlv.

  17. GeoNetwork powered GI-cat: a geoportal hybrid solution

    Science.gov (United States)

    Baldini, Alessio; Boldrini, Enrico; Santoro, Mattia; Mazzetti, Paolo

    2010-05-01

    according to the interface protocols exposed by GI-cat, into the multiple query dialects spoken by the resource service providers. Currently, a number of well-accepted catalog and inventory services are supported, including several OGC Web Services, THREDDS Data Server, SeaDataNet Common Data Index, GBIF and OpenSearch engines. A GeoNetwork-powered GI-cat has been developed in order to exploit the best of the two frameworks. The new system uses a modified version of the GeoNetwork web interface, adding the capability of querying a specified GI-cat catalog and not only the GeoNetwork internal database. The resulting system is a geoportal in which GI-cat plays the role of the search engine. The system distributes the query over the different types of data sources linked to a GI-cat, and the metadata results are then visualized through the GeoNetwork web interface. This configuration was tested in the framework of GIIDA, a project of the Italian National Research Council (CNR) focused on data accessibility and interoperability. A second advantage of this solution is obtained by setting up a GeoNetwork catalog among the accessors of the GI-cat instance. Such a configuration allows GI-cat, in turn, to run the query against the internal GeoNetwork database. Both the harvesting and metadata-editor functionalities of GeoNetwork and the distributed search functionality of GI-cat are thus available in a consistent way through the same web interface.

  18. Information literacy skills and accessibility of databases among ...

    African Journals Online (AJOL)

    Previous research on the accessibility of databases by undergraduate students of Umaru Musa Yar'adua University (UMYU) found a low level of accessibility of library databases. This paper investigates further factors in the accessibility of databases among undergraduate students of Umaru ...

  19. Discovery of accessible locations using region-based geo-social data

    KAUST Repository

    Wang, Yan

    2018-03-17

    Geo-social data plays a significant role in location discovery and recommendation. In this light, we propose and study a novel problem of discovering accessible locations in spatial networks using region-based geo-social data. Given a set Q of query regions, the top-k accessible location discovery query (k ALDQ) finds k locations that have the highest spatial-density correlations to Q. Both the spatial distances between locations and regions and the POI (point of interest) density within the regions are taken into account. We believe that this type of k ALDQ query can bring significant benefit to many applications such as travel planning, facility allocation, and urban planning. Three challenges exist in k ALDQ: (1) how to model the spatial-density correlation practically, (2) how to prune the search space effectively, and (3) how to schedule the searches from multiple query regions. To tackle the challenges and process k ALDQ effectively and efficiently, we first define a series of spatial and density metrics to model the spatial-density correlation. Then we propose a novel three-phase solution with a pair of upper and lower bounds of the spatial-density correlation and a heuristic scheduling strategy to schedule multiple query regions. Finally, we conduct extensive experiments on real and synthetic spatial data to demonstrate the performance of the developed solutions.
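    A toy sketch of the k ALDQ ranking idea follows; the scoring function is a deliberately simplified stand-in for the paper's spatial-density correlation metrics and pruning bounds, and all names, coordinates and densities are invented.

```python
import heapq
import math

# Candidate locations (coordinates) and query regions (centre, POI density).
locations = {"A": (0.0, 0.0), "B": (5.0, 5.0), "C": (1.0, 1.0)}
regions = [((0.0, 1.0), 12), ((2.0, 0.0), 8)]

def correlation(loc):
    """Simplified spatial-density correlation: nearer and denser regions
    contribute more; averaged over all query regions."""
    x, y = locations[loc]
    score = 0.0
    for (cx, cy), density in regions:
        dist = math.hypot(x - cx, y - cy)
        score += density / (1.0 + dist)
    return score / len(regions)

def top_k_accessible(k):
    """Return the k locations with the highest correlation to the regions."""
    return heapq.nlargest(k, locations, key=correlation)

print(top_k_accessible(2))
```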

  20. Accessing and using chemical databases

    DEFF Research Database (Denmark)

    Nikolov, Nikolai Georgiev; Pavlov, Todor; Niemelä, Jay Russell

    2013-01-01

    Computer-based representation of chemicals makes it possible to organize data in chemical databases: collections of chemical structures and associated properties. Databases are widely used wherever efficient processing of chemical information is needed, including search, storage, retrieval, and dissemination. The structure and functionality of chemical databases are considered, along with the typical kinds of information found in a chemical database: identification, structural, and associated data. Functionality is presented with examples of search and access types. More details are included about the OASIS database and platform and the Danish (Q)SAR Database online. Various types of chemical database resources are discussed, together with a list of examples.

  1. Database theory and SQL practice using Access

    International Nuclear Information System (INIS)

    Kim, Gyeong Min; Lee, Myeong Jin

    2001-01-01

    This book introduces database theory and SQL practice using Access. It comprises seven chapters, covering: an understanding of databases, with basic concepts and DBMSs; relational databases, with examples; building database tables and entering data using Access 2000; Structured Query Language, with an introduction to managing and building complex queries in SQL; advanced SQL commands, with the concepts of joins and virtual tables; database design for an online bookstore, in six steps; and building an application, covering its functions, structure and components, the principles of its operation, and the programming source for the application menu.

  2. Design, Implementation and Applications of 3d Web-Services in DB4GEO

    Science.gov (United States)

    Breunig, M.; Kuper, P. V.; Dittrich, A.; Wild, P.; Butwilowski, E.; Al-Doori, M.

    2013-09-01

    The object-oriented database architecture DB4GeO was originally designed to support sub-surface applications in the geo-sciences. This is reflected in DB4GeO's geometric data model as well as in its import and export functions. Initially, these functions were designed for communication with 3D geological modeling and visualization tools such as GOCAD or MeshLab. However, it soon became clear that DB4GeO was suitable for a much wider range of applications. Therefore it is natural to move away from a standalone solution and to open access to DB4GeO data through standardized OGC web services. Though REST and OGC services seem incompatible at first sight, the implementation in DB4GeO shows that an OGC-based implementation of web services may reuse parts of the DB4GeO REST implementation. Starting from initial solutions in the history of DB4GeO, this paper introduces the design, adaptation (i.e. model transformation), and first implementation steps of OGC Web Feature Service (WFS) and Web Processing Service (WPS) interfaces to DB4GeO data and operations. Among its capabilities, DB4GeO can provide data in different formats, such as GML, GOCAD, or DB3D XML, through a WFS, and can run operations such as a 3D-to-2D service or mesh simplification (Progressive Meshes) through a WPS. We then demonstrate an Android-based mobile 3D augmented reality viewer for DB4GeO that uses the Web Feature Service to visualize 3D geo-database query results. Finally, we explore future research work considering DB4GeO in the framework of the research group "Computer-Aided Collaborative Subway Track Planning in Multi-Scale 3D City and Building Models".
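    A client request against such a WFS could be assembled as follows; the endpoint URL and feature type name are placeholders, and only the standard WFS 2.0 key-value parameters are assumed, not DB4GeO's actual service address.

```python
from urllib.parse import urlencode

# Hypothetical service endpoint and feature type for illustration only.
endpoint = "https://example.org/db4geo/wfs"
params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "db4geo:GeologicalSurface",   # placeholder feature type
    "outputFormat": "application/gml+xml; version=3.2",
    "count": 10,                               # limit the number of features
}
url = endpoint + "?" + urlencode(params)
print(url)
```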

  3. Correlates of Access to Business Research Databases

    Science.gov (United States)

    Gottfried, John C.

    2010-01-01

    This study examines potential correlates of business research database access through academic libraries serving top business programs in the United States. Results indicate that greater access to research databases is related to enrollment in graduate business programs, but not to overall enrollment or status as a public or private institution.…

  4. Tri-party agreement databases, access mechanism and procedures. Revision 2

    International Nuclear Information System (INIS)

    Brulotte, P.J.

    1996-01-01

    This document contains the information required for the Washington State Department of Ecology (Ecology) and the U.S. Environmental Protection Agency (EPA) to access databases related to the Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement). It identifies the procedure required to obtain access to the Hanford Site computer networks and the Tri-Party Agreement related databases. It addresses security requirements, access methods, database availability dates, database access procedures, and the minimum computer hardware and software configurations required to operate within the Hanford Site networks. This document supersedes any previous agreements, including the Administrative Agreement to Provide Computer Access to the U.S. Environmental Protection Agency (EPA) and the Administrative Agreement to Provide Computer Access to the Washington State Department of Ecology (Ecology), agreements that were signed by the U.S. Department of Energy (DOE), Richland Operations Office (RL) in June 1990. Access approval is extended by RL to EPA and Ecology to include all Tri-Party Agreement relevant databases named in this document via the documented access method and date. Access to databases and systems not listed in this document will be granted as determined necessary and negotiated among Ecology, EPA, and RL through the Tri-Party Agreement Project Managers. The Tri-Party Agreement Project Managers are the primary points of contact for all activities to be carried out under the Tri-Party Agreement Action Plan. Access to the Tri-Party Agreement related databases and systems does not provide or imply, on behalf of Ecology or EPA, any ownership, whether public or private, of either the databases or the systems. Access to identified systems and databases does not include access to network/system administrative control information, network maps, etc.

  5. Reactome graph database: Efficient access to complex pathway data

    Science.gov (United States)

    Korninger, Florian; Viteri, Guilherme; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D’Eustachio, Peter

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types. PMID:29377902
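The traversal advantage described above can be illustrated with a toy in-memory graph: once pathway data is stored as nodes and adjacency rather than as rows to be joined, following a chain of relationships is a series of pointer lookups. The nodes, labels, and traversal below are illustrative only and do not reflect Reactome's actual schema or its Cypher queries.

```python
from collections import deque

# Toy pathway graph: each key is a node, each value its outgoing neighbours.
# In a relational store every hop would be a join; here (as in a graph
# database) a hop is a direct adjacency lookup.
edges = {
    "Pathway:Signaling": ["Reaction:A", "Reaction:B"],
    "Reaction:A": ["Protein:X"],
    "Reaction:B": ["Protein:X", "Protein:Y"],
    "Protein:X": [],
    "Protein:Y": [],
}

def participants(start):
    """Collect every node reachable from `start` -- the kind of
    variable-length path traversal a graph query language expresses directly."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

reached = participants("Pathway:Signaling")
```

The depth of the traversal does not change the per-hop cost, which is why query times drop sharply for highly interconnected data.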

  6. Reactome graph database: Efficient access to complex pathway data.

    Directory of Open Access Journals (Sweden)

    Antonio Fabregat

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.

  7. Reactome graph database: Efficient access to complex pathway data.

    Science.gov (United States)

    Fabregat, Antonio; Korninger, Florian; Viteri, Guilherme; Sidiropoulos, Konstantinos; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D'Eustachio, Peter; Hermjakob, Henning

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.

  8. Large scale access tests and online interfaces to ATLAS conditions databases

    International Nuclear Information System (INIS)

    Amorim, A; Lopes, L; Pereira, P; Simoes, J; Soloviev, I; Burckhart, D; Schmitt, J V D; Caprini, M; Kolos, S

    2008-01-01

    The access of the ATLAS Trigger and Data Acquisition (TDAQ) system to the ATLAS Conditions Databases sets strong reliability and performance requirements on the database storage and access infrastructures. Several applications were developed to support the integration of Conditions database access with the online services in TDAQ, including the interface to the Information Services (IS) and to the TDAQ Configuration Databases. The information storage requirements were the motivation for the ONline ASynchronous Interface to COOL (ONASIC) from the Information Service (IS) to LCG/COOL databases. ONASIC avoids possible backpressure from online database servers by managing a local cache. In parallel, OKS2COOL was developed to store Configuration Databases into an offline database with history records. The DBStressor application was developed to test and stress the access to the Conditions database using the LCG/COOL interface while operating in an integrated way as a TDAQ application. The performance scaling of simultaneous Conditions database read accesses was studied in the context of the ATLAS High Level Trigger large computing farms. A large set of tests was performed involving up to 1000 computing nodes that simultaneously accessed the LCG central database server infrastructure at CERN.
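The local-cache idea behind ONASIC can be sketched as a read-through cache: reads are served locally, and the backing store is only contacted on a miss, so a burst of reads does not translate into backpressure on the server. This is a minimal sketch of the pattern only; the class name and the fake backend below are hypothetical and not the ATLAS code.

```python
# Minimal read-through cache sketch (names are illustrative).
class CachingReader:
    def __init__(self, backend_fetch):
        self._fetch = backend_fetch   # callable hitting the backing store
        self._cache = {}
        self.backend_calls = 0        # counts actual server round-trips

    def get(self, key):
        # Only a cache miss reaches the backend; repeated reads are local.
        if key not in self._cache:
            self.backend_calls += 1
            self._cache[key] = self._fetch(key)
        return self._cache[key]

# Stand-in for a conditions-database lookup.
reader = CachingReader(lambda key: f"payload-for-{key}")
values = [reader.get("align/run123") for _ in range(1000)]
```

A thousand logical reads cost exactly one backend round-trip, which is the property the stress tests on large trigger farms are probing.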

  9. The AAS Working Group on Accessibility and Disability (WGAD) Year 1 Highlights and Database Access

    Science.gov (United States)

    Knierman, Karen A.; Diaz Merced, Wanda; Aarnio, Alicia; Garcia, Beatriz; Monkiewicz, Jacqueline A.; Murphy, Nicholas Arnold

    2017-06-01

    The AAS Working Group on Accessibility and Disability (WGAD) was formed in January of 2016 with the express purpose of seeking equity of opportunity and building inclusive practices for disabled astronomers at all educational and career stages. In this presentation, we will provide a summary of current activities, focusing on developing best practices for accessibility with respect to astronomical databases, publications, and meetings. Due to the reliance of space sciences on databases, it is important to have user centered design systems for data retrieval. The cognitive overload that may be experienced by users of current databases may be mitigated by use of multi-modal interfaces such as xSonify. Such interfaces would be in parallel or outside the original database and would not require additional software efforts from the original database. WGAD is partnering with the IAU Commission C1 WG Astronomy for Equity and Inclusion to develop such accessibility tools for databases and methods for user testing. To collect data on astronomical conference and meeting accessibility considerations, WGAD solicited feedback from January AAS attendees via a web form. These data, together with upcoming input from the community and analysis of accessibility documents of similar conferences, will be used to create a meeting accessibility document. Additionally, we will update the progress of journal access guidelines and our social media presence via Twitter. We recommend that astronomical journals form committees to evaluate the accessibility of their publications by performing user-centered usability studies.

  10. GeoBolivia the initiator Spatial Data Infrastructure of the Plurinational State of Bolivia's Node

    Science.gov (United States)

    Molina Rodriguez, Raul Fernando; Lesage, Sylvain

    2014-05-01

    Started in 2011, the GeoBolivia project (www.geo.gob.bo) aims at building the Spatial Data Infrastructure of the Plurinational State of Bolivia (IDE-EPB by its Spanish initials), as an effort of the Vice Presidency of the State to give open access to the public geographic information of Bolivia. The first phase of the project has already been completed. It consisted in implementing an infrastructure and a geoportal for accessing the geographic information through WMS, WFS, WCS and CSW services. The project is currently in its second phase, dedicated to decentralizing the structure of IDE-EPB and promoting its use throughout the Bolivian State. The whole platform uses free software and open standards. As a complement, an on-line training module was developed to undertake the transfer of the knowledge the project generated. The main software components used in the SDI are: gvSIG, QGIS and uDig as desktop GIS clients; PostgreSQL and PostGIS as the geographic database management system; geOrchestra as a framework containing the GeoServer map server, the GeoNetwork catalog server, and the OpenLayers and MapFish GIS web client; MapServer as a map server for generating OpenStreetMap tiles; Debian as the operating system; and Apache and Tomcat as web servers. Keywords: SDI, Bolivia, GIS, free software, catalog, gvSIG, QGIS, uDig, geOrchestra, OpenLayers, MapFish, GeoNetwork, MapServer, GeoServer, OGC, WFS, WMS, WCS, CSW, WMC.
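A client talks to the services listed above with plain HTTP requests; the first call is usually a WMS GetCapabilities. The sketch below only builds such a request URL (no network access). The endpoint host is hypothetical; the SERVICE/VERSION/REQUEST key-value parameters follow the OGC WMS 1.3.0 standard.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical GeoServer endpoint; any OGC-compliant WMS works the same way.
endpoint = "https://sdi.example.org/geoserver/wms"
params = {"SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetCapabilities"}
url = endpoint + "?" + urlencode(params)

# A client would now GET this URL and parse the XML capabilities document.
parsed = parse_qs(urlparse(url).query)
```

The same pattern, with different REQUEST and service-specific parameters, covers the WFS, WCS and CSW interfaces the geoportal exposes.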

  11. GEO portal

    Data.gov (United States)

    US Agency for International Development — The USAID GeoPortal is a new application that groups web-based capabilities for on-demand discovery of and access to geospatial content, services, expertise, and...

  12. Access database application in medical treatment management platform

    International Nuclear Information System (INIS)

    Wu Qingming

    2014-01-01

    For timely, accurate and flexible access to medical expenses data, we applied the Microsoft Access 2003 database management software and established a management platform for medical expenses. Through this platform, overall hospital medical expenses can be controlled, achieving real-time monitoring of medical expenses. Using the Access database management platform for medical expenses not only changes the management model, but also promotes a sound management system for medical expenses. (authors)
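The kind of real-time monitoring query such a platform runs can be sketched as an aggregate with a threshold check. The schema, data and threshold below are illustrative, not from the article, and Python's built-in sqlite3 stands in for Access 2003 so the example is self-contained.

```python
import sqlite3

# Illustrative expenses table (names and values are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE expenses (patient_id INTEGER, dept TEXT, amount REAL)")
conn.executemany("INSERT INTO expenses VALUES (?, ?, ?)", [
    (1, "radiology", 120.0),
    (1, "pharmacy",  80.0),
    (2, "pharmacy", 300.0),
])

# Aggregate per patient and flag totals above a monitoring threshold --
# the query a dashboard would re-run to keep expenses under control.
THRESHOLD = 250.0
flagged = conn.execute(
    "SELECT patient_id, SUM(amount) AS total FROM expenses "
    "GROUP BY patient_id HAVING total > ?", (THRESHOLD,)).fetchall()
```

In Access the same SQL would sit behind a form or report, which is what turns the query into a management tool.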

  13. Topologically Consistent Models for Efficient Big Geo-Spatio Data Distribution

    Science.gov (United States)

    Jahn, M. W.; Bradley, P. E.; Doori, M. Al; Breunig, M.

    2017-10-01

    Geo-spatio-temporal topology models are likely to become a key concept to check the consistency of 3D (spatial space) and 4D (spatial + temporal space) models for emerging GIS applications such as subsurface reservoir modelling or the simulation of energy and water supply of mega or smart cities. Furthermore, the data management for complex models consisting of big geo-spatial data is a challenge for GIS and geo-database research. General challenges, concepts, and techniques of big geo-spatial data management are presented. In this paper we introduce a sound mathematical approach for a topologically consistent geo-spatio-temporal model based on the concept of the incidence graph. We redesign DB4GeO, our service-based geo-spatio-temporal database architecture, on the way to the parallel management of massive geo-spatial data. Approaches for a new geo-spatio-temporal and object model of DB4GeO meeting the requirements of big geo-spatial data are discussed in detail. Finally, a conclusion and outlook on our future research are given on the way to support the processing of geo-analytics and -simulations in a parallel and distributed system environment.
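The incidence-graph concept named above can be made concrete with a tiny example: cells of each dimension are nodes, and the graph records which lower-dimensional cells bound each higher-dimensional cell. The structure and the consistency rule below are a simplified illustration under our own naming, not DB4GeO's actual model.

```python
# Incidence graph of a single triangle: one 2-cell (face), three 1-cells
# (edges), three 0-cells (vertices). Each entry maps a cell to the cells
# on its boundary.
incidence = {
    "f0": ["e0", "e1", "e2"],      # face bounded by three edges
    "e0": ["v0", "v1"],            # each edge bounded by two vertices
    "e1": ["v1", "v2"],
    "e2": ["v2", "v0"],
    "v0": [], "v1": [], "v2": [],  # vertices have empty boundary
}

def consistent(inc):
    """Simplified topological-consistency check: every edge must be bounded
    by exactly two vertices, and the face by exactly three edges."""
    ok_edges = all(len(inc[e]) == 2 for e in ("e0", "e1", "e2"))
    ok_face = len(inc["f0"]) == 3
    return ok_edges and ok_face

check = consistent(incidence)
```

Checks of this shape, generalized over 3D and 4D cell complexes, are what make the incidence graph useful for validating big geo-spatio-temporal models.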

  14. Database design for Physical Access Control System for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Sathishkumar, T., E-mail: satishkumart@igcar.gov.in; Rao, G. Prabhakara, E-mail: prg@igcar.gov.in; Arumugam, P., E-mail: aarmu@igcar.gov.in

    2016-08-15

    Highlights: • Database design needs to be optimized and highly efficient for real-time operation. • It requires a many-to-many mapping between the Employee table and the Doors table. • This mapping typically contains thousands of records and redundant data. • The proposed novel database design reduces the redundancy and provides abstraction. • This design is incorporated with the access control system developed in-house. - Abstract: An RFID (Radio Frequency IDentification) cum biometric based two-level Access Control System (ACS) was designed and developed for providing access to vital areas of nuclear facilities. The system has both hardware [access controller] and software components [the server application, the database and the web client software]. The proposed database design enables grouping of the employees based on the hierarchy of the organization and grouping of the doors based on Access Zones (AZ). The design also illustrates the mapping between the Employee Groups (EG) and the AZs. By following this approach in database design, a higher-level view can be presented to the system administrator, abstracting the inner details of the individual entities and doors. This paper describes the novel approach carried out in designing the database of the ACS.
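The grouping idea can be sketched in SQL: instead of a direct employee-to-door table with thousands of rows, employees map to Employee Groups (EG) and doors map to Access Zones (AZ), and one small EG-AZ table carries all the grants. The table and column names below are illustrative, not the paper's schema, and sqlite3 is used so the sketch runs anywhere.

```python
import sqlite3

# Illustrative schema: the only mapping table is eg_az, which stays small
# regardless of how many employees or doors exist.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, eg_id INTEGER);
CREATE TABLE door     (id INTEGER PRIMARY KEY, label TEXT, az_id INTEGER);
CREATE TABLE eg_az    (eg_id INTEGER, az_id INTEGER);
INSERT INTO employee VALUES (1,'A',10),(2,'B',10),(3,'C',20);
INSERT INTO door     VALUES (1,'lab',100),(2,'vault',200);
INSERT INTO eg_az    VALUES (10,100),(20,100),(20,200);
""")

def allowed(emp_id, door_id):
    """Access check: does the employee's group reach the door's zone?"""
    row = db.execute(
        "SELECT 1 FROM employee e "
        "JOIN eg_az m ON e.eg_id = m.eg_id "
        "JOIN door d  ON d.az_id = m.az_id "
        "WHERE e.id = ? AND d.id = ?", (emp_id, door_id)).fetchone()
    return row is not None

lab_ok = allowed(1, 1)    # group 10 is granted zone 100
vault_ok = allowed(1, 2)  # group 10 has no grant for zone 200
```

Adding an employee or a door now touches one row, and a group-level grant replaces many per-employee rows, which is where the redundancy reduction comes from.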

  15. Database design for Physical Access Control System for nuclear facilities

    International Nuclear Information System (INIS)

    Sathishkumar, T.; Rao, G. Prabhakara; Arumugam, P.

    2016-01-01

    Highlights: • Database design needs to be optimized and highly efficient for real-time operation. • It requires a many-to-many mapping between the Employee table and the Doors table. • This mapping typically contains thousands of records and redundant data. • The proposed novel database design reduces the redundancy and provides abstraction. • This design is incorporated with the access control system developed in-house. - Abstract: An RFID (Radio Frequency IDentification) cum biometric based two-level Access Control System (ACS) was designed and developed for providing access to vital areas of nuclear facilities. The system has both hardware [access controller] and software components [the server application, the database and the web client software]. The proposed database design enables grouping of the employees based on the hierarchy of the organization and grouping of the doors based on Access Zones (AZ). The design also illustrates the mapping between the Employee Groups (EG) and the AZs. By following this approach in database design, a higher-level view can be presented to the system administrator, abstracting the inner details of the individual entities and doors. This paper describes the novel approach carried out in designing the database of the ACS.

  16. Integrated Geo Hazard Management System in Cloud Computing Technology

    Science.gov (United States)

    Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.

    2016-11-01

    Geo-hazards can reduce environmental health and cause huge economic losses, especially in mountainous areas. In order to mitigate geo-hazards effectively, cloud computing technology is introduced for managing the geo-hazard database. Cloud computing technology and its services are capable of providing stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System comprises the network management and operations needed to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system provides an easily managed, flexible measuring system whose data management operates autonomously and which collects data and can be controlled remotely by commands through the "cloud" computing system. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using a "cloud" system. The system will later be used as part of the development activities, helping to minimize the frequency of geo-hazards and the risk in the research area.

  17. High-Performance Secure Database Access Technologies for HEP Grids

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications." There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the

  18. High-Performance Secure Database Access Technologies for HEP Grids

    International Nuclear Information System (INIS)

    Vranicar, Matthew; Weicher, John

    2006-01-01

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that 'Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications'. There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the secure

  19. Accessing Electronic Databases for Curriculum Delivery in Schools ...

    African Journals Online (AJOL)

    This paper discussed the role of electronic databases in education with emphasis on the means of accessing the electronic databases. The paper further highlighted the various types and categories of electronic databases which the schools can explore in the process of teaching and learning as well as the techniques of ...

  20. A review of accessibility of administrative healthcare databases in the Asia-Pacific region.

    Science.gov (United States)

    Milea, Dominique; Azmi, Soraya; Reginald, Praveen; Verpillat, Patrice; Francois, Clement

    2015-01-01

    We describe and compare the availability and accessibility of administrative healthcare databases (AHDB) in several Asia-Pacific countries: Australia, Japan, South Korea, Taiwan, Singapore, China, Thailand, and Malaysia. The study included hospital records, reimbursement databases, prescription databases, and data linkages. Databases were first identified through PubMed, Google Scholar, and the ISPOR database register. Database custodians were contacted. Six criteria were used to assess the databases and provided the basis for a tool to categorise databases into seven levels ranging from least accessible (Level 1) to most accessible (Level 7). We also categorised overall data accessibility for each country as high, medium, or low based on accessibility of databases as well as the number of academic articles published using the databases. Fifty-four administrative databases were identified. Only a limited number of databases allowed access to raw data and were at Level 7 [Medical Data Vision EBM Provider, Japan Medical Data Centre (JMDC) Claims database and Nihon-Chouzai Pharmacy Claims database in Japan, and Medicare, Pharmaceutical Benefits Scheme (PBS), Centre for Health Record Linkage (CHeReL), HealthLinQ, Victorian Data Linkages (VDL), SA-NT DataLink in Australia]. At Levels 3-6 were several databases from Japan [Hamamatsu Medical University Database, Medi-Trend, Nihon University School of Medicine Clinical Data Warehouse (NUSM)], Australia [Western Australia Data Linkage (WADL)], Taiwan [National Health Insurance Research Database (NHIRD)], South Korea [Health Insurance Review and Assessment Service (HIRA)], and Malaysia [United Nations University (UNU)-Casemix]. Countries were categorised as having a high level of data accessibility (Australia, Taiwan, and Japan), a medium level of accessibility (South Korea), or a low level of accessibility (Thailand, China, Malaysia, and Singapore). In some countries, data may be available but accessibility was restricted.

  1. A review of accessibility of administrative healthcare databases in the Asia-Pacific region

    Science.gov (United States)

    Milea, Dominique; Azmi, Soraya; Reginald, Praveen; Verpillat, Patrice; Francois, Clement

    2015-01-01

    Objective We describe and compare the availability and accessibility of administrative healthcare databases (AHDB) in several Asia-Pacific countries: Australia, Japan, South Korea, Taiwan, Singapore, China, Thailand, and Malaysia. Methods The study included hospital records, reimbursement databases, prescription databases, and data linkages. Databases were first identified through PubMed, Google Scholar, and the ISPOR database register. Database custodians were contacted. Six criteria were used to assess the databases and provided the basis for a tool to categorise databases into seven levels ranging from least accessible (Level 1) to most accessible (Level 7). We also categorised overall data accessibility for each country as high, medium, or low based on accessibility of databases as well as the number of academic articles published using the databases. Results Fifty-four administrative databases were identified. Only a limited number of databases allowed access to raw data and were at Level 7 [Medical Data Vision EBM Provider, Japan Medical Data Centre (JMDC) Claims database and Nihon-Chouzai Pharmacy Claims database in Japan, and Medicare, Pharmaceutical Benefits Scheme (PBS), Centre for Health Record Linkage (CHeReL), HealthLinQ, Victorian Data Linkages (VDL), SA-NT DataLink in Australia]. At Levels 3–6 were several databases from Japan [Hamamatsu Medical University Database, Medi-Trend, Nihon University School of Medicine Clinical Data Warehouse (NUSM)], Australia [Western Australia Data Linkage (WADL)], Taiwan [National Health Insurance Research Database (NHIRD)], South Korea [Health Insurance Review and Assessment Service (HIRA)], and Malaysia [United Nations University (UNU)-Casemix]. Countries were categorised as having a high level of data accessibility (Australia, Taiwan, and Japan), medium level of accessibility (South Korea), or a low level of accessibility (Thailand, China, Malaysia, and Singapore). In some countries, data may be available but

  2. Distributed Database Access in the LHC Computing Grid with CORAL

    CERN Document Server

    Molnár, Z; Düllmann, D; Giacomo, G; Kalkhof, A; Valassi, A; CERN. Geneva. IT Department

    2009-01-01

    The CORAL package is the LCG Persistency Framework foundation for accessing relational databases. From the start, CORAL has been designed to facilitate the deployment of the LHC experiment database applications in a distributed computing environment. In particular we cover: improvements to database service scalability through client connection management; platform-independent, multi-tier scalable database access through connection multiplexing and caching; and a secure authentication and authorisation scheme integrated with existing grid services. We summarize the deployment experience from several experiment productions using the distributed database infrastructure, which is now available in LCG. Finally, we present perspectives for future developments in this area.
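The connection-management idea can be sketched as a small pool: many logical sessions share and reuse a few physical connections instead of each opening its own, which is what keeps the load on the database service bounded. CORAL itself is C++; the Python class below is an illustrative sketch of the pattern with hypothetical names, not CORAL's API.

```python
# Minimal connection-pool sketch of the multiplexing idea (names illustrative).
class ConnectionPool:
    def __init__(self):
        self._free = []      # released physical connections, ready for reuse
        self.opened = 0      # how many physical connections were ever created

    def _open_physical(self):
        self.opened += 1
        return object()      # stand-in for a real database connection

    def acquire(self):
        # Reuse a released connection if one exists, otherwise open a new one.
        return self._free.pop() if self._free else self._open_physical()

    def release(self, conn):
        self._free.append(conn)

pool = ConnectionPool()
for _ in range(100):         # 100 sequential logical sessions...
    c = pool.acquire()
    pool.release(c)          # ...all multiplexed onto one physical connection
```

A real implementation adds connection validation, per-server limits and failover, but the scalability gain comes from exactly this reuse.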

  3. The GEOSS Clearinghouse based on the GeoNetwork opensource

    Science.gov (United States)

    Liu, K.; Yang, C.; Wu, H.; Huang, Q.

    2010-12-01

    The Global Earth Observation System of Systems (GEOSS) is established to support the study of the Earth system in a global community. It provides services for social management, quick response, academic research, and education. The purpose of GEOSS is to achieve comprehensive, coordinated and sustained observations of the Earth system, improve monitoring of the state of the Earth, increase understanding of Earth processes, and enhance prediction of the behavior of the Earth system. In 2009, GEO called for a competition to select an official GEOSS clearinghouse as a source for consolidating catalogs of Earth observations. The Joint Center for Intelligent Spatial Computing at George Mason University worked with USGS to submit a solution based on the open-source platform GeoNetwork. In the spring of 2010, this solution was selected as the product for the GEOSS clearinghouse. The GEOSS Clearinghouse is a common search facility for the intergovernmental Group on Earth Observations (GEO). By providing a list of harvesting functions in its business logic, the GEOSS clearinghouse can collect metadata from distributed catalogs including other GeoNetwork native nodes; WebDAV/sitemap/WAF; Catalogue Services for the Web (CSW) 2.0; the GEOSS Component and Service Registry (http://geossregistries.info/); OGC Web Services (WCS, WFS, WMS and WPS); OAI Protocol for Metadata Harvesting 2.0; ArcSDE Server; and local file systems. Metadata in the GEOSS clearinghouse are managed in a database (MySQL, PostgreSQL, Oracle, or MckoiDB) and an index of the metadata is maintained through the Lucene engine. Thus, EO data, services, and related resources can be discovered and accessed. The clearinghouse supports a variety of geospatial standards, including CSW and SRU for search, FGDC and ISO metadata, and WMS-related OGC standards for data access and visualization, as linked from the metadata.

  4. Intelligent Access to Sequence and Structure Databases (IASSD) - an interface for accessing information from major web databases.

    Science.gov (United States)

    Ganguli, Sayak; Gupta, Manoj Kumar; Basu, Protip; Banik, Rahul; Singh, Pankaj Kumar; Vishal, Vineet; Bera, Abhisek Ranjan; Chakraborty, Hirak Jyoti; Das, Sasti Gopal

    2014-01-01

    With the advent of the age of big data and advances in high-throughput technology, accessing data has become one of the most important steps in the entire knowledge discovery process. Most users are unable to decipher the query results obtained when non-specific keywords or combinations of keywords are used. Intelligent Access to Sequence and Structure Databases (IASSD) is a desktop application for the Windows operating system. It is written in Java and utilizes the Web Services Description Language (WSDL) files and JAR files of the E-utilities of various databases, such as the National Centre for Biotechnology Information (NCBI) and the Protein Data Bank (PDB). In addition, IASSD allows the user to view protein structures using a Jmol application which supports conditional editing. The JAR file is freely available through e-mail from the corresponding author.
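A client such as IASSD talks to NCBI through the E-utilities endpoints. As a minimal sketch of that interaction, the following builds an `esearch` request URL; the base URL is the documented E-utilities endpoint, while the database and query term are illustrative.

```python
from urllib.parse import urlencode

# Documented NCBI E-utilities base URL.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def esearch_url(db, term, retmax=20):
    """Compose an esearch URL for the given NCBI database and query term."""
    params = {"db": db, "term": term, "retmax": retmax, "retmode": "json"}
    return f"{EUTILS}/esearch.fcgi?{urlencode(params)}"

# Example: search the protein database for "human insulin".
url = esearch_url("protein", "human insulin")
```

Fetching the URL (e.g. with `urllib.request`) returns a JSON list of matching record IDs, which a tool like IASSD would then resolve with further E-utilities calls.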

  5. Database Access through Java Technologies

    Directory of Open Access Journals (Sweden)

    Nicolae MERCIOIU

    2010-09-01

    As a high-level development environment, Java technologies support the development of platform-independent distributed applications, providing a robust set of methods for accessing databases that are used to create software components on both the server side and the client side. Analyzing the evolution of Java data-access tools, we notice that they evolved from simple methods permitting queries, insertion, update and deletion of data to advanced implementations such as distributed transactions, cursors and batch processing. In client-server architectures, JDBC (Java Database Connectivity) allows the execution of SQL (Structured Query Language) statements and the manipulation of the results in an independent and consistent manner. The JDBC API (Application Programming Interface) creates the level of abstraction needed to issue SQL queries against any DBMS (Database Management System). The native JDBC driver, the ODBC (Open Database Connectivity)-JDBC bridge, and the classes and interfaces of the JDBC API are described. The four steps needed to build a JDBC-driven application are presented briefly, emphasizing how each step is accomplished and its expected results. Each step is evaluated against the characteristics of different database systems and the way the JDBC programming interface adapts to each one. The data types provided by the SQL2 and SQL3 standards are compared with the Java data types, emphasizing the discrepancies between them, as well as the methods of the ResultSet object that allow conversion between different data types. Next, starting from the role of metadata and the Java programming interfaces that allow querying of result sets, the advanced data-access features of JDBC are described. As an alternative to result sets, RowSets add new functionalities that
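The article's four JDBC steps (load the driver, open a connection, execute a statement, iterate the result set) are Java-specific, but the flow is language-neutral. As an analogy only, not the article's Java code, here is the same sequence in Python's DB-API using sqlite3:

```python
import sqlite3  # step 1: the driver module (JDBC: Class.forName / DriverManager)

# Step 2: open a connection (JDBC: DriverManager.getConnection(url)).
conn = sqlite3.connect(":memory:")

# Step 3: create a statement and execute SQL (JDBC: Statement.executeQuery).
cur = conn.cursor()
cur.execute("CREATE TABLE person (name TEXT, age INTEGER)")
cur.executemany("INSERT INTO person VALUES (?, ?)",
                [("Alice", 30), ("Bob", 25)])

# Step 4: iterate the result set (JDBC: ResultSet.next() / getString()).
cur.execute("SELECT name, age FROM person ORDER BY age")
rows = [(name, age) for name, age in cur.fetchall()]
conn.close()
```

In JDBC the same query would walk a `ResultSet` with `next()` and typed getters; the DB-API cursor plays both the `Statement` and `ResultSet` roles.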

  6. NCBI2RDF: Enabling Full RDF-Based Access to NCBI Databases

    Directory of Open Access Journals (Sweden)

    Alberto Anguita

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting to users the query results in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort to biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments.
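The core idea above is that a SPARQL query is decomposed and forwarded as E-utilities calls. The sketch below illustrates that translation for one fixed, invented query shape; it is not the NCBI2RDF implementation, and the `ncbi:db` / `ncbi:term` predicates are made up for the example.

```python
from urllib.parse import urlencode

# Documented NCBI E-utilities base URL.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def decompose(sparql):
    """Naively extract the target database and search term from a query
    of one fixed shape (a real system would parse the SPARQL algebra)."""
    db = sparql.split("ncbi:db")[1].split('"')[1]
    term = sparql.split("ncbi:term")[1].split('"')[1]
    return db, term

def to_eutils(sparql):
    """Forward the decomposed query as an esearch request URL."""
    db, term = decompose(sparql)
    return f"{EUTILS}/esearch.fcgi?{urlencode({'db': db, 'term': term})}"

query = '''SELECT ?id WHERE { ?r ncbi:db "pubmed" . ?r ncbi:term "brca1" . }'''
url = to_eutils(query)
```

The returned IDs would then be wrapped back into SPARQL results format so the caller never sees the E-utilities layer, which is what makes the endpoint "virtual".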

  7. NCBI2RDF: Enabling Full RDF-Based Access to NCBI Databases

    Science.gov (United States)

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting to users the query results in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort to biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments. PMID:23984425

  8. NCBI2RDF: enabling full RDF-based access to NCBI databases.

    Science.gov (United States)

    Anguita, Alberto; García-Remesal, Miguel; de la Iglesia, Diana; Maojo, Victor

    2013-01-01

    RDF has become the standard technology for enabling interoperability among heterogeneous biomedical databases. The NCBI provides access to a large set of life sciences databases through a common interface called Entrez. However, the latter does not provide RDF-based access to such databases, and, therefore, they cannot be integrated with other RDF-compliant databases and accessed via SPARQL query interfaces. This paper presents the NCBI2RDF system, aimed at providing RDF-based access to the complete NCBI data repository. This API creates a virtual endpoint for servicing SPARQL queries over different NCBI repositories and presenting to users the query results in SPARQL results format, thus enabling this data to be integrated and/or stored with other RDF-compliant repositories. SPARQL queries are dynamically resolved, decomposed, and forwarded to the NCBI-provided E-utilities programmatic interface to access the NCBI data. Furthermore, we show how our approach increases the expressiveness of the native NCBI querying system, allowing several databases to be accessed simultaneously. This feature significantly boosts productivity when working with complex queries and saves time and effort to biomedical researchers. Our approach has been validated with a large number of SPARQL queries, thus proving its reliability and enhanced capabilities in biomedical environments.

  9. The Hawaiian Freshwater Algal Database (HfwADB): a laboratory LIMS and online biodiversity resource

    Directory of Open Access Journals (Sweden)

    Sherwood Alison R

    2012-10-01

    Background: Biodiversity databases serve the important role of highlighting species-level diversity from defined geographical regions. Databases that are specially designed to accommodate the types of data gathered during regional surveys are valuable in allowing full data access and display to researchers not directly involved with the project, while serving as a Laboratory Information Management System (LIMS). The Hawaiian Freshwater Algal Database, or HfwADB, was modified from the Hawaiian Algal Database to showcase non-marine algal specimens collected from the Hawaiian Archipelago by accommodating the additional level of organization required for samples including multiple species. Description: The Hawaiian Freshwater Algal Database is a comprehensive and searchable database containing photographs and micrographs of samples and collection sites, geo-referenced collecting information, taxonomic data and standardized DNA sequence data. All data for individual samples are linked through unique 10-digit accession numbers ("Isolate Accession"), the first five of which correspond to the collection site ("Environmental Accession"). Users can search online for sample information by accession number, various levels of taxonomy, habitat or collection site. HfwADB is hosted at the University of Hawaii, and was made publicly accessible in October 2011. At present the database houses data for over 2,825 samples of non-marine algae from 1,786 collection sites across the Hawaiian Archipelago. These samples include cyanobacteria, red and green algae and diatoms, as well as lesser representation from some other algal lineages. Conclusions: HfwADB is a digital repository that acts as a Laboratory Information Management System for Hawaiian non-marine algal data. Users can interact with the repository through the web to view relevant habitat data (including geo-referenced collection locations) and download images of collection sites, specimen
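The accession scheme described above encodes structure in the identifier itself: a 10-digit Isolate Accession whose first five digits are the Environmental Accession of the collection site. A small helper makes that split explicit (the function name and example value are illustrative, not from HfwADB):

```python
def split_accession(isolate_accession: str):
    """Split a 10-digit HfwADB-style Isolate Accession into the
    Environmental Accession (collection site, first five digits)
    and the sample-specific suffix (last five digits)."""
    if len(isolate_accession) != 10 or not isolate_accession.isdigit():
        raise ValueError("expected a 10-digit numeric accession")
    return isolate_accession[:5], isolate_accession[5:]

# Hypothetical accession: site 01234, sample 00042 at that site.
env, sample = split_accession("0123400042")
```

Because the site prefix is positional, all samples from one site share a five-digit prefix, which is what lets the database link multiple species back to a single collection event.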

  10. Assessment of access to bibliographic databases and telemetry databases in Astronomy: A groundswell for development.

    Science.gov (United States)

    Diaz-Merced, Wanda Liz; Casado, Johanna; Garcia, Beatriz; Aarnio, Alicia; Knierman, Karen; Monkiewicz, Jacqueline

    2018-01-01

    "Big Data" is a subject that has taken on special relevance today, particularly in astrophysics, where continuous advances in technology are leading to ever larger data sets. A multimodal approach to the perception of astronomical data (achieved through sonification used for the processing of data) increases the detection of signals at very low signal-to-noise ratios and is of special importance for achieving greater inclusion in the field of astronomy. In the last ten years, different software tools have been developed that perform the sonification of astronomical data from tables or databases; among them, the best known and multiplatform are Sonification Sandbox, MathTrack, and xSonify. To determine the accessibility of software, we propose to begin by carrying out a conformity analysis against ISO (International Organization for Standardization) 9241-171:2008. This standard establishes the general guidelines that must be taken into account for accessibility in software design, and it applies to software used at work, in public places, and at home. To analyze the accessibility of web databases, we take into account the Web Content Accessibility Guidelines (WCAG) 2.0, accepted and published by ISO in the ISO/IEC 40500:2012 standard. In this poster, we present a User Centered Design (UCD), Human Computer Interaction (HCI), and User Experience (UX) framework to address a non-segregational provision of access to bibliographic databases and telemetry databases in astronomy. Our framework is based on an ISO evaluation of a selection of databases such as ADS, SIMBAD and SDSS. WCAG 2.0 and ISO 9241-171:2008 should not be taken as absolute accessibility standards: these guidelines are very general, are not absolute, and do not address particularities. They are not a substitute for UCD, HCI and UX design and evaluation. Based on our results, this research presents the framework for a focus group and qualitative data analysis aimed to

  11. Training Database Technology in DBMS MS Access

    OpenAIRE

    Nataliya Evgenievna Surkova

    2015-01-01

    The article describes the methodological issues of learning relational database technology and relational database management systems. Microsoft Access is used as the primer DBMS for learning. This methodology allows forming general cultural competences, such as command of the main methods, ways and means of producing, storing and processing information, and computer skills as a means of managing information. Professional competences must also be formed, such as the ability to coll...

  12. Freely Accessible Chemical Database Resources of Compounds for in Silico Drug Discovery.

    Science.gov (United States)

    Yang, JingFang; Wang, Di; Jia, Chenyang; Wang, Mengyao; Hao, GeFei; Yang, GuangFu

    2018-05-07

    In silico drug discovery has proven to be a solidly established key component of early drug discovery. However, this task is hampered by limitations in the quantity and quality of the compound databases available for screening. To overcome these obstacles, freely accessible database resources of compounds have bloomed in recent years. Nevertheless, choosing appropriate tools to handle these freely accessible databases is crucial. To the best of our knowledge, this is the first systematic review on this issue. In this review, the advantages and drawbacks of chemical databases are analyzed and summarized for six categories of freely accessible chemical databases collected from the literature. Suggestions on how and under which conditions these databases can reasonably be used are provided, and tools and procedures for building 3D-structure chemical libraries are introduced. In particular, the chemical information available for building chemical databases appears to be an attractive resource for drug design that can alleviate experimental pressure. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.

  13. Database application research in real-time data access of accelerator control system

    International Nuclear Information System (INIS)

    Chen Guanghua; Chen Jianfeng; Wan Tianmin

    2012-01-01

    The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system that involves many types and large amounts of real-time data access during operation. Database systems have wide application prospects in large-scale accelerator control systems; replacing disparate dedicated data structures with a mature, standardized database system is the future development direction for accelerator control. Based on database interface technology, real-time data access testing and system optimization research, this article discusses the feasibility of applying database systems to accelerators and lays the foundation for the wide-scale application of database systems in the SSRF accelerator control system. (authors)

  14. Development of RESTful services and map-based user interface tools for access and delivery of data and metadata from the Marine-Geo Digital Library

    Science.gov (United States)

    Morton, J. J.; Ferrini, V. L.

    2015-12-01

    The Marine Geoscience Data System (MGDS, www.marine-geo.org) operates an interactive digital data repository and metadata catalog that provides access to a variety of marine geology and geophysical data from throughout the global oceans. Its Marine-Geo Digital Library includes common marine geophysical data types and supporting data and metadata, as well as complementary long-tail data. The Digital Library also includes community data collections and custom data portals for the GeoPRISMS, MARGINS and Ridge2000 programs, for active-source reflection data (Academic Seismic Portal), and for marine data acquired by the US Antarctic Program (Antarctic and Southern Ocean Data Portal). Ensuring that these data are discoverable not only through our own interfaces but also through standards-compliant web services is critical for enabling investigators to find data of interest. Over the past two years, MGDS has developed several new RESTful web services that enable programmatic access to metadata and data holdings. These web services are compliant with the EarthCube GeoWS Building Blocks specifications and are currently used to drive our own user interfaces. New web applications have also been deployed to provide a more intuitive user experience for searching, accessing and browsing metadata and data. Our new map-based search interface combines components of the Google Maps API with our web services for dynamic searching and exploration of geospatially constrained data sets. Direct introspection of nearly all data formats for the hundreds of thousands of data files curated in the Marine-Geo Digital Library has allowed for precise geographic bounds, enabling geographic searches to an extent not previously possible. All MGDS map interfaces utilize the web services of the Global Multi-Resolution Topography (GMRT) synthesis for displaying global basemap imagery and for dynamically providing depth values at the cursor location.
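A map-based search like the one described above typically translates the visible map extent into a bounding-box query against a RESTful service. The sketch below only builds such a request URL; the endpoint path and parameter names are assumptions for illustration, not the documented Marine-Geo API.

```python
from urllib.parse import urlencode

# Hypothetical search endpoint; consult the actual MGDS service
# documentation for real paths and parameters.
BASE = "https://www.marine-geo.org/services/search"

def bbox_search_url(west, south, east, north, data_type=None):
    """Compose a geographically constrained search request
    (bbox given as west,south,east,north in decimal degrees)."""
    params = {"bbox": f"{west},{south},{east},{north}"}
    if data_type:
        params["type"] = data_type  # assumed parameter name
    return f"{BASE}?{urlencode(params)}"

# Example: bathymetry holdings in a box off the US east coast.
url = bbox_search_url(-70.0, 35.0, -60.0, 45.0, data_type="bathymetry")
```

The precise per-file geographic bounds mentioned in the abstract are what make such bbox queries return only files that genuinely intersect the requested region.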

  15. Internet-accessible radiographic database of Vietnam War casualties for medical student education.

    Science.gov (United States)

    Critchley, Eric P; Smirniotopoulos, James G

    2003-04-01

    The purpose of this study was to determine the feasibility of archiving radiographic images from Vietnam era conflict casualties into a personal computer-based electronic database of text and images and displaying the data using an Internet-accessible database for preservation and educational purposes. Thirty-two patient cases were selected at random from a pool of 1,000 autopsy reports in which radiographs were available. A total of 74 radiographs from these cases were digitized using a commercial image scanner and then uploaded into an Internet accessible database. The quality of the digitized images was assessed by administering an image-based test to a group of 12 medical students. No statistically significant (p > 0.05) differences were found between test scores when using the original radiographs versus using the digitized radiographs on the Internet-accessible database. An Internet-accessible database is capable of effectively archiving Vietnam era casualty radiographs for educational purposes.

  16. Geoscience data visualization and analysis using GeoMapApp

    Science.gov (United States)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data, images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and to quantitatively interrogate data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). 
Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D

  17. South African oil dependency : geo-political, geo-economic and geo-strategic considerations

    OpenAIRE

    2012-01-01

    Ph.D. There is little research undertaken on the economic assessment of oil security of supply from the dimensions of geo-politics, geo-economics and geo-strategy. This study seeks to bridge the gap by providing new analytical and empirical work that captures the impact of geo-politics, geo-economics and geo-strategy on oil supply, consumption and price. This study is the first to define, analyse and contextualise the South African oil security of supply from a geo-political, geo-economic ...

  18. Advanced technologies for scalable ATLAS conditions database access on the grid

    CERN Document Server

    Basset, R; Dimitrov, G; Girone, M; Hawkings, R; Nevski, P; Valassi, A; Vaniachine, A; Viegas, F; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic work-flow, ATLAS database scalability tests provided feedback for Conditions Db software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing characterized by peak loads, which can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This has been achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that the server overload conditions can be safely avoided in a production environment. Our analysi...

  19. Advanced technologies for scalable ATLAS conditions database access on the grid

    International Nuclear Information System (INIS)

    Basset, R; Canali, L; Girone, M; Hawkings, R; Valassi, A; Viegas, F; Dimitrov, G; Nevski, P; Vaniachine, A; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic work-flow, ATLAS database scalability tests provided feedback for Conditions Db software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing characterized by peak loads, which can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This has been achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that the server overload conditions can be safely avoided in a production environment. Our analysis of server performance under stress tests indicates that Conditions Db data access is limited by the disk I/O throughput. An unacceptable side-effect of the disk I/O saturation is a degradation of the WLCG 3D Services that update Conditions Db data at all ten ATLAS Tier-1 sites using the technology of Oracle Streams. To avoid such bottlenecks we prototyped and tested a novel approach for database peak load avoidance in Grid computing. Our approach is based upon the proven idea of pilot job submission on the Grid: instead of the actual query, an ATLAS utility library first sends a pilot query to the database server.
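The pilot-query idea above can be sketched as: send a cheap probe first, and submit the expensive query only if the server is not near overload. This is a toy illustration of the decision logic only; the ATLAS utility library is Oracle-based and far more involved, and here sqlite3 with a fake `status` table simulates the load signal.

```python
import sqlite3

def server_load(conn):
    """Pilot query: a trivial lookup standing in for a cheap probe of
    server health (here the load metric is simply stored in a table)."""
    return conn.execute("SELECT load FROM status").fetchone()[0]

def guarded_query(conn, sql, max_load=0.8):
    """Run the heavy query only when the pilot reports acceptable load;
    otherwise defer (return None) to avoid contributing to an overload."""
    if server_load(conn) > max_load:
        return None
    return conn.execute(sql).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE status (load REAL)")
conn.execute("INSERT INTO status VALUES (0.3)")
conn.execute("CREATE TABLE conditions (run INTEGER, value REAL)")
conn.execute("INSERT INTO conditions VALUES (1, 2.5)")

rows = guarded_query(conn, "SELECT run, value FROM conditions")  # load OK

conn.execute("UPDATE status SET load = 0.95")                    # simulate peak
deferred = guarded_query(conn, "SELECT run, value FROM conditions")
```

In the real system the deferred job would retry later, flattening the peak load the way pilot jobs flatten scheduling load on the Grid.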

  20. Web catalog of oceanographic data using GeoNetwork

    Science.gov (United States)

    Marinova, Veselka; Stefanov, Asen

    2017-04-01

    Most of the data collected, analyzed and used by the Bulgarian oceanographic data center (BgODC) from scientific cruises, Argo floats, FerryBoxes and real-time operating systems are spatially oriented and need to be displayed on a map. The challenge is to make spatial information more accessible to users, decision makers and scientists. To meet this challenge, BgODC concentrates its efforts on improving dynamic and standardized access to its geospatial data as well as data from various related organizations and institutions. BgODC is currently implementing a project to create a geospatial portal for distributing metadata and for searching, exchanging and harvesting spatial data. There are many open-source software solutions able to create such a spatial data infrastructure (SDI). GeoNetwork opensource was chosen, as it is already widespread. This software is a free and effective solution for implementing an SDI at the organization level. It is platform independent and runs under many operating systems. Filling the catalog goes through these practical steps: • managing and storing data reliably within an MS SQL spatial database; • registering maps and data of various formats and sources in GeoServer (the most popular open-source geospatial server, bundled with GeoNetwork); • adding metadata and publishing geospatial data through the GeoNetwork desktop. GeoServer and GeoNetwork are based on Java, so they require a servlet engine such as Tomcat. The experience gained from the use of GeoNetwork opensource confirms that the catalog meets the requirements for data management and is flexible enough to customize. Building the catalog facilitates sustainable data exchange between end users. The catalog is a big step towards implementation of the INSPIRE directive, owing to the availability of many features necessary for producing "INSPIRE compliant" metadata records. The catalog now contains all available GIS data provided by BgODC for Internet
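Once a GeoNetwork catalog like this is published, clients query it through its CSW interface. The sketch below composes a CSW 2.0.2 `GetRecords` request using the KVP (GET) binding; the host is illustrative, and while the parameter names follow the CSW KVP binding, a real client should check the server's capabilities document.

```python
from urllib.parse import urlencode

# Illustrative GeoNetwork CSW endpoint (not BgODC's actual address).
CSW_ENDPOINT = "https://example.org/geonetwork/srv/eng/csw"

def getrecords_url(keyword, max_records=10):
    """Compose a CSW 2.0.2 GetRecords KVP request filtering on AnyText."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "elementSetName": "brief",
        "resultType": "results",
        "maxRecords": max_records,
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": f"AnyText LIKE '%{keyword}%'",
    }
    return f"{CSW_ENDPOINT}?{urlencode(params)}"

url = getrecords_url("temperature")
```

The response is an XML set of brief metadata records; harvesters such as the GEOSS clearinghouse use the same protocol to pull whole catalogs.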

  1. Training Database Technology in DBMS MS Access

    Directory of Open Access Journals (Sweden)

    Nataliya Evgenievna Surkova

    2015-05-01

    The article describes the methodological issues of learning relational database technology and relational database management systems. Microsoft Access is used as the primer DBMS for learning. This methodology allows forming general cultural competences, such as command of the main methods, ways and means of producing, storing and processing information, and computer skills as a means of managing information. Professional competences must also be formed, such as the ability to collect, analyze and process the data necessary for solving professional tasks, and the ability to use modern technology and information technology for analytical and research tasks.

  2. The French-German initiative for Chernobyl: programme 2: REDAC, the radioecological database after the Chernobyl accident

    International Nuclear Information System (INIS)

    Deville-Cavelin, G.; Biesold, H.; Chabanyuk, V.

    2006-01-01

    Goals: to build a database integrating the results of the 'Radioecology' programme of the French-German Initiative: ecological portrait, initial contamination, waste management, soil-plant and animal transfer, transfer by runoff and in the aquatic environment, and countermeasures in urban, natural and agricultural environments. Specific methodology: an original 'Project Solutions Framework': an information system developed as a soft integrated portal; a geo-information system with all spatial data geo-coded. DB structure: Publications: all classical information and original data; Products: storage of open publications of the Project; Processes: management of the Project and Sub-projects; Services: information and software objects, help; Basics: information on system and organizational development. - Soft integration, cartography system: the map from the 'Ecological portrait' is integrated with the thematic databases and loaded in a special category (by the IS Geo Internet Map Server); cartographic functions: navigation, scaling, extracting, layer management; the database arrangement is independent of the map system architecture. - Soft integration, portlets and DDB: portlets = mini-applications for business functions and processes, made of web parts; Digital Dashboards (DDB) = portlets + web parts; DDB sites = collections of DDBs, adjustable by users. - General conclusions: REDAC is a powerful and useful radioecological tool: all elements are easily accessible through the original tool, ProSF, developed by IS Geo; relations are constructed between the documents (files, databases, documentation, reports, ...); all elements are structured by meta-information; search mechanisms; a global radioecological glossary; spatial data geo-coded; processes, tools and methodology suitable for similar projects; data useful for scientific studies, modelling, operational purposes, and communication with mass media. - Outlook: addition of functionality, support and maintenance; strong integration: thematic integration = merging of all DBs in an

  3. The French-German initiative for Chernobyl: programme 2: REDAC, the radioecological database after the Chernobyl accident

    Energy Technology Data Exchange (ETDEWEB)

    Deville-Cavelin, G. [Institut de Radioprotection et de Surete Nucleaire (IRSN), Environment and Emergency Operations Div. - Dept. for the Study of Radionuclide Behaviour in Ecosystems, 13 - Saint-Paul-lez-Durance (France); Biesold, H. [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Braunschweig (Germany); Chabanyuk, V. [Chornobyl Center (CC), Kiev region (Ukraine)

    2006-07-01

    Goals: to build a database integrating the results of the 'Radioecology' programme of the French-German Initiative: ecological portrait, initial contamination, waste management, soil-plant and animal transfer, transfer by runoff and in the aquatic environment, and countermeasures in urban, natural and agricultural environments. Specific methodology: an original 'Project Solutions Framework': an information system developed as a soft integrated portal; a geo-information system with all spatial data geo-coded. DB structure: Publications: all classical information and original data; Products: storage of open publications of the Project; Processes: management of the Project and Sub-projects; Services: information and software objects, help; Basics: information on system and organizational development. - Soft integration, cartography system: the map from the 'Ecological portrait' is integrated with the thematic databases and loaded in a special category (by the IS Geo Internet Map Server); cartographic functions: navigation, scaling, extracting, layer management; the database arrangement is independent of the map system architecture. - Soft integration, portlets and DDB: portlets = mini-applications for business functions and processes, made of web parts; Digital Dashboards (DDB) = portlets + web parts; DDB sites = collections of DDBs, adjustable by users. - General conclusions: REDAC is a powerful and useful radioecological tool: all elements are easily accessible through the original tool, ProSF, developed by IS Geo; relations are constructed between the documents (files, databases, documentation, reports, ...); all elements are structured by meta-information; search mechanisms; a global radioecological glossary; spatial data geo-coded; processes, tools and methodology suitable for similar projects; data useful for scientific studies, modelling, operational purposes, and communication with mass media. - Outlook: addition of functionality, support and maintenance; strong integration: thematic

  4. Mandatory and Location-Aware Access Control for Relational Databases

    Science.gov (United States)

    Decker, Michael

    Access control is concerned with determining which operations a particular user is allowed to perform on a particular electronic resource. For example, an access control decision could say that user Alice is allowed to perform the operation read (but not write) on the resource research report. With conventional access control this decision is based on the user's identity, whereas the basic idea of Location-Aware Access Control (LAAC) is also to evaluate a user's current location when deciding whether a particular request should be granted or denied. LAAC is an interesting approach for mobile information systems because these systems are exposed to specific security threats such as the loss of a device. Some data models for LAAC can be found in the literature, but almost all of them are based on RBAC and none of them is designed especially for Database Management Systems (DBMS). In this paper we therefore propose a LAAC approach for DBMS and describe a prototypical implementation of that approach that is based on database triggers.
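The trigger-based approach described in this abstract can be sketched in miniature. The following is a hypothetical illustration (not the paper's implementation), using SQLite from Python: a BEFORE INSERT trigger consults a table of last-known user zones and rejects writes by users outside a permitted zone. All table and column names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE user_location (user TEXT PRIMARY KEY, zone TEXT);
CREATE TABLE report (id INTEGER PRIMARY KEY, body TEXT, author TEXT);

-- Deny INSERTs on 'report' unless the author's last known zone is 'office'.
CREATE TRIGGER laac_insert_check
BEFORE INSERT ON report
WHEN COALESCE((SELECT zone FROM user_location
               WHERE user = NEW.author), '') <> 'office'
BEGIN
    SELECT RAISE(ABORT, 'location-aware access denied');
END;
""")
conn.execute("INSERT INTO user_location VALUES ('alice', 'office')")
conn.execute("INSERT INTO user_location VALUES ('bob', 'cafe')")

conn.execute("INSERT INTO report (body, author) VALUES ('ok', 'alice')")  # permitted
try:
    conn.execute("INSERT INTO report (body, author) VALUES ('no', 'bob')")
except sqlite3.IntegrityError as err:
    print(err)  # the RAISE message is surfaced as an IntegrityError
```

A production DBMS would obtain the location from a trusted positioning service rather than a self-reported table, but the trigger mechanics are the same.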

  5. The INIS database on another efficient site... and on free access

    International Nuclear Information System (INIS)

    Libmann, F.

    2009-01-01

    This article presents the INIS database, its history, document-type content, and availability. It stresses the recent opening of the database to free access, the functionality of the search interface, and the quality of the work and professionalism of the database producers. (J.S.)

  6. Managing and delivering 3D geo data across institutions via a web-based solution - intermediate results of the project GeoMol.

    Science.gov (United States)

    Gietzel, Jan; Schaeben, Helmut; Gabriel, Paul

    2014-05-01

    The increasing relevance of geological information for policy and economy at the transnational level has recently been recognized by the European Commission, which has called for harmonized information on reserves and resources in the EU Member States. GeoMol's transnational approach responds to that, providing consistent and seamless 3D geological information of the Alpine Foreland Basins based on harmonized data and agreed methodologies. However, until recently no adequate tool existed to ensure full interoperability among the involved GSOs and to distribute the multi-dimensional information of a transnational project facing diverse data policies, database systems and software solutions. In recent years, (open) standards describing 2D spatial data have been developed and implemented in different software systems, including production environments for 2D spatial data (such as regular 2D GI systems). Easy yet secure access to the data is of utmost importance and thus a priority for any spatial data infrastructure. To overcome limitations imposed by highly sophisticated, platform-dependent geo-modeling software packages, the functionality of a web portal can be utilized. Combining a web portal with a "check-in/check-out" system allows distributed, organized editing of data and models, but requires standards for the exchange of 3D geological information to ensure interoperability. Another major concern is the management of large models and the ability to tile 3D models into spatially restricted models with refined resolution, especially when creating countrywide models. Using GST ("Geosciences in Space and Time"), developed initially at TU Bergakademie Freiberg and continuously extended by the company GiGa infosystems, which incorporates these key issues and is based on an object-relational data model, it is possible to check out parts of models or whole models for edits and check them in again after modification.
GST is the core of GeoMol's web-based collaborative environment designed to

  7. USING THE INTERNATIONAL SCIENTOMETRIC DATABASES OF OPEN ACCESS IN SCIENTIFIC RESEARCH

    Directory of Open Access Journals (Sweden)

    O. Galchevska

    2015-05-01

    Full Text Available The article considers the use of international scientometric databases in research activities as web-oriented resources and services that are means of publishing and disseminating research results. Selection criteria for open-access scientometric platforms in conducting scientific research (coverage of Ukrainian scientific periodicals and publications, data accuracy, general characteristics of the international scientometric database, technical and functional characteristics, and their indexes) are emphasized. A review of the most popular open-access scientometric databases is given: Google Scholar, the Russian Science Citation Index (RSCI), Scholarometer, Index Copernicus (IC), and Microsoft Academic Search. The advantages of using the international scientometric database Google Scholar in conducting scientific research are determined, along with prospects for further research into the cloud-based information and analytical services of the system.

  8. HCUP State Emergency Department Databases (SEDD) - Restricted Access File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The State Emergency Department Databases (SEDD) contain the universe of emergency department visits in participating States. Restricted access data files are...

  9. libChEBI: an API for accessing the ChEBI database.

    Science.gov (United States)

    Swainston, Neil; Hastings, Janna; Dekker, Adriano; Muthukrishnan, Venkatesh; May, John; Steinbeck, Christoph; Mendes, Pedro

    2016-01-01

    ChEBI is a database and ontology of chemical entities of biological interest. It is widely used as a source of identifiers to facilitate unambiguous reference to chemical entities within biological models, databases, ontologies and literature. ChEBI contains a wealth of chemical data, covering over 46,500 distinct chemical entities, and related data such as chemical formula, charge, molecular mass, structure, synonyms and links to external databases. Furthermore, ChEBI is an ontology, and thus provides meaningful links between chemical entities. Unlike many other resources, ChEBI is fully human-curated, providing a reliable, non-redundant collection of chemical entities and related data. While ChEBI is supported by a web service for programmatic access and a number of download files, it does not have an API library to facilitate the use of ChEBI and its data in cheminformatics software. To provide this missing functionality, libChEBI, a comprehensive API library for accessing ChEBI data, is introduced. libChEBI is available in Java, Python and MATLAB versions from http://github.com/libChEBI, and provides full programmatic access to all data held within the ChEBI database through a simple and documented API. libChEBI is reliant upon the (automated) download and regular update of flat files that are held locally. As such, libChEBI can be embedded in both on- and off-line software applications. libChEBI allows better support of ChEBI and its data in the development of new cheminformatics software. Covering three key programming languages, it allows for the entirety of the ChEBI database to be accessed easily and quickly through a simple API. All code is open access and freely available.

  10. 3D visualization of geo-scientific data for research and development purposes

    International Nuclear Information System (INIS)

    Mangeot, A.; Tabani, P.; Yven, B.; Dewonck, S.; Napier, B.; Waston, C.J.; Baker, G.R.; Shaw, R.P.

    2012-01-01

    built with the numerical ground surface model, the aerial photos, some information from the geological model (Gocad) and with CAD files from the drawing office. Furthermore, linkages have been built to visualize geo-referenced Andra database information within GeoVisionary using the 'Query layer/middleware layer' approach. The main geo-scientific databases used at Andra are called SAGD (database related to sensors) and GEO (database related to boreholes and samples). Andra improves the efficiency of its information system by using the GeoVisionary software to visualize, analyze and share large volumes of data in a 3D environment in real time, but also to explain Andra's work to local people

  11. Access to digital library databases in higher education: design problems and infrastructural gaps.

    Science.gov (United States)

    Oswal, Sushil K

    2014-01-01

    After defining accessibility and usability, the author offers a broad survey of the research studies on digital content databases which have thus far primarily depended on data drawn from studies conducted by sighted researchers with non-disabled users employing screen readers and low vision devices. This article aims at producing a detailed description of the difficulties confronted by blind screen reader users with online library databases which now hold most of the academic, peer-reviewed journal and periodical content essential for research and teaching in higher education. The approach taken here is borrowed from descriptive ethnography which allows the author to create a complete picture of the accessibility and usability problems faced by an experienced academic user of digital library databases and screen readers. The author provides a detailed analysis of the different aspects of accessibility issues in digital databases under several headers with a special focus on full-text PDF files. The author emphasizes that long-term studies with actual, blind screen reader users employing both qualitative and computerized research tools can yield meaningful data for the designers and developers to improve these databases to a level that they begin to provide an equal access to the blind.

  12. GEO Supersites Data Exploitation Platform

    Science.gov (United States)

    Lengert, W.; Popp, H.-J.; Gleyzes, J.-P.

    2012-04-01

    In the framework of the GEO Geohazard Supersite initiative, an international partnership of organizations and scientists involved in the monitoring and assessment of geohazards has been established. The mission is to advance the scientific understanding of geohazards by improving geohazard monitoring through the combination of in-situ and space-based data, and by facilitating access to data relevant for geohazard research. The stakeholders are: (1) governmental organizations or research institutions responsible for the ground-based monitoring of earthquake and volcanic areas, (2) space agencies and satellite operators providing satellite data, (3) the global geohazard scientific community. Tens of thousands of ESA's SAR products have been accessible since the beginning of 2008 through ESA's "Virtual Archive", a cloud-computing asset that gives the global community the best possible download performance for these high-volume data sets at mass-market cost. In the GEO collaborative context, the management of ESA's "Virtual Archive" and the ordering of these large data sets is performed by UNAVCO, which also coordinates the data demand for several hundred co-PIs. ESA envisages providing scientists and developers with access to a highly elastic operational e-infrastructure, offering interdisciplinary data on a large scale as well as tools ensuring innovation and a permanent evolution of the products. Consequently, this science environment will help in defining and testing new applications and technologies, fostering innovation and new science findings. In Europe, EPOS ("European Plate Observatory System", led by INGV) and ESA, with support from DLR, ASI, and CNES, are the main institutional stakeholders for the GEO Supersites, contributing also to a unifying e-infrastructure. 
The overarching objective of the Geohazard Supersites is: "To implement a sustainable Global Earthquake Observation System and a Global Volcano Observation System as part of the

  13. Geo-demographic analysis of fatal motorcycle crashes

    Science.gov (United States)

    2001-01-01

    The objective of this study is to analyze the combined motor vehicle crash data from the Fatality Analysis Reporting System (FARS) with the Claritas geo-demographic database from the lifestyle perspective to determine the appropriate media to use in ...

  14. Rural and remote dental services shortages: filling the gaps through geo-spatial analysis evidence-based targeting.

    Science.gov (United States)

    Shiika, Yulia; Kruger, Estie; Tennant, Marc

    Australia has a significant maldistribution of its limited dental workforce. Outside the major capital cities, the distribution of accessible dental care is at best patchy. This study applied geo-spatial analysis technology to locate gaps in dental service accessibility for rural and remote dwelling Australians, in order to test the hypothesis that there are a few key locations in Australia where further dental services could make a significant contribution to ameliorating the immediate shortage crisis. A total of 2,086 dental practices were located in country areas, covering a combined catchment area of 1.84 million square kilometers, based on 50 km catchment zones around each clinic. Geo-spatial analysis technology was used to identify gaps in the accessibility of dental services for rural and remote dwelling Australians, and an extraction of data was obtained to analyse the integrated, geographically aligned database. Results: Resolving the lack of dental practices in 74 townships (of greater than 500 residents) across Australia could potentially address access for 104,000 people. An examination of the socio-economic mix found that the majority of the dental practices (84%) are located in areas classified as less disadvantaged. Output from the study provided a cohesive national map identifying locations where health could be improved by targeting dental services to those locations. The study identified potential location sites for dental clinics, to address the current inequity in accessing dental services in rural and remote Australia.
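The 50 km catchment test at the heart of this analysis reduces to a great-circle distance computation. A minimal sketch (not the study's GIS workflow), with invented clinic and town coordinates:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical clinic coordinates (roughly Perth and Kalgoorlie).
clinics = [(-31.95, 115.86), (-30.75, 121.47)]

def unserved(town, clinics, radius_km=50.0):
    """True if the town lies outside every clinic's 50 km catchment zone."""
    return all(haversine_km(*town, *c) > radius_km for c in clinics)

print(unserved((-26.59, 118.50), clinics))  # a remote town: True
print(unserved((-31.90, 115.90), clinics))  # near Perth: False
```

The study's map overlay is the same test applied to every township against every clinic catchment.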

  15. Full-Text Linking: Affiliated versus Nonaffiliated Access in a Free Database.

    Science.gov (United States)

    Grogg, Jill E.; Andreadis, Debra K.; Kirk, Rachel A.

    2002-01-01

    Presents a comparison of access to full-text articles from a free bibliographic database (PubSCIENCE) for affiliated and unaffiliated users. Found that affiliated users had access to more full-text articles than unaffiliated users had, and that both types of users could increase their level of access through additional searching and greater…

  16. Access to geo information in Europe : Is the marine sector showing the way?

    NARCIS (Netherlands)

    Welle Donker, F.M.; De Jong, J.

    2010-01-01

    In the digital age, geo-information or spatial data has become embedded in our daily lives. Although the term geo-information may not sound familiar, applications such as navigation systems, real estate information and weather forecasts are used by all for day-to-day decision-making. Most

  17. The ConnectinGEO Observation Inventory

    Science.gov (United States)

    Santoro, M.; Nativi, S.; Jirka, S.; McCallum, I.

    2016-12-01

    ConnectinGEO (Coordinating an Observation Network of Networks EnCompassing saTellite and IN-situ to fill the Gaps in European Observations) is an EU-funded project under the H2020 Framework Programme. The primary goal of the project is to link existing coordinated Earth Observation networks with science and technology (S&T) communities, the industry sector and the GEOSS and Copernicus stakeholders. An expected outcome of the project is a prioritized list of critical gaps within GEOSS (Global Earth Observation System of Systems) in observations and in models that translate observations into practice-relevant knowledge. The project defines and utilizes a formalized methodology to create a set of observation requirements that will be related to information on available observations in order to identify key gaps. Gaps in the information provided by current observation systems, as well as gaps in the systems themselves, will be derived from five different threads. One of these threads is the analysis of the observations and measurements that are currently registered in the GEO Discovery and Access Broker (DAB). To this aim, an Observation Inventory (OI) has been created and populated using the current metadata information harmonized by the DAB. This presentation describes the process defined to populate the ConnectinGEO OI and the resulting system architecture. In addition, it provides information on how to systematically access the OI for performing the gap analysis. Furthermore, it demonstrates initial findings of the gap analysis and shortcomings in the metadata that need attention. The research leading to these results benefited from funding by the European Union H2020 Framework Programme under grant agreement n. 641538 (ConnectinGEO).

  18. Optimization and Accessibility of the Qweak Database

    Science.gov (United States)

    Urban, Erik; Spayde, Damon

    2010-11-01

    The Qweak experiment is a multi-institutional collaborative effort at Thomas Jefferson National Accelerator Facility designed to accurately determine the weak nuclear charge of the proton through measurements of the parity-violating asymmetries of electron-proton elastic scattering that result from pulses of electrons with opposite helicities. Through the study of these scattering asymmetries, the Qweak experiment hopes to constrain extensions of the Standard Model or find indications of new physics. Since precision is critical to the success of the Qweak experiment, the collaboration will be taking data for thousands of hours. The Qweak database is responsible for storing the non-binary, processed data of this experiment in a meaningful and organized manner for use at a later date. The goal of this undertaking is not only to create a database which can input and output data quickly, but also to create one which can easily be accessed by those who have minimal knowledge of the database language. Through tests on the system, retrieval and insert times have been optimized and, in addition, the implementation of summary tables and additional programs should make the majority of commonly sought results readily available to database novices.
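The summary tables mentioned at the end are a standard optimization: aggregates are precomputed once so that novice users can query a flat table without writing GROUP BY logic. A small illustration using SQLite (the actual Qweak schema and DBMS are not described here, so all names and values are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE asymmetry (run INTEGER, value REAL, error REAL);
INSERT INTO asymmetry VALUES (1, 0.12, 0.03), (1, 0.10, 0.03), (2, 0.08, 0.02);

-- Summary table: one precomputed row per run, so common lookups
-- need no aggregation knowledge from the user.
CREATE TABLE run_summary AS
SELECT run, COUNT(*) AS n, AVG(value) AS mean_value
FROM asymmetry GROUP BY run;
""")
for row in conn.execute("SELECT * FROM run_summary ORDER BY run"):
    print(row)
```

In a production setting the summary table would be refreshed by a scheduled job as new runs arrive, trading a little staleness for fast reads.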

  19. Why new tools were developed for the 'GeoPortalNetwork : Liberty United" project

    NARCIS (Netherlands)

    Vanmeulebrouk, B.; Van Swol, R.; Kuyper, M.; Bulens, J.; Zevenbergen, J.A.

    2009-01-01

    As part of the national innovation co-funding scheme “Space for Geo-information” the project “GeoPortal Network: Liberty United” ran from late 2005 till the end of 2008. Purpose of the project was to promote access to geo-spatial information via web services. To achieve this goal, a network of

  20. Chroni - an Android Application for Geochronologists to Access Archived Sample Analyses from the NSF-Funded Geochron.Org Data Repository.

    Science.gov (United States)

    Nettles, J. J.; Bowring, J. F.

    2014-12-01

    NSF requires data management plans as part of funding proposals and geochronologists, among other scientists, are archiving their data and results to the public cloud archives managed by the NSF-funded Integrated Earth Data Applications, or IEDA. GeoChron is a database for geochronology housed within IEDA. The software application U-Pb_Redux developed at the Cyber Infrastructure Research and Development Lab for the Earth Sciences (CIRDLES.org) at the College of Charleston provides seamless connectivity to GeoChron for uranium-lead (U-Pb) geochronologists to automatically upload and retrieve their data and results. U-Pb_Redux also manages publication-quality documents including report tables and graphs. CHRONI is a lightweight mobile application for Android devices that provides easy access to these archived data and results. With CHRONI, U-Pb geochronologists can view archived data and analyses downloaded from the Geochron database, or any other location, in a customizable format. CHRONI uses the same extensible markup language (XML) schema and documents used by U-Pb_Redux and GeoChron. Report Settings are special XML files that can be customized in U-Pb_Redux, stored in the cloud, and then accessed and used in CHRONI to create the same customized data display on the mobile device. In addition to providing geologists effortless and mobile access to archived data and analyses, CHRONI allows users to manage their GeoChron credentials, quickly download private and public files via a specified IEDA International Geo Sample Number (IGSN) or URL, and view specialized graphics associated with particular IGSNs. Future versions of CHRONI will be developed to support iOS compatible devices. CHRONI is an open source project under the Apache 2 license and is hosted at https://github.com/CIRDLES/CHRONI. We encourage community participation in its continued development.

  1. The TJ-II Relational Database Access Library: A User's Guide

    International Nuclear Information System (INIS)

    Sanchez, E.; Portas, A. B.; Vega, J.

    2003-01-01

    A relational database has been developed to store data representing physical values from TJ-II discharges. This new database complements the existing TJ-II raw data database. The database resides in a host computer running the Windows 2000 Server operating system and is managed by SQL Server. A function library has been developed that permits remote access to these data from user programs running on computers connected to TJ-II local area networks via remote procedure call. In this document, a general description of the database and its organization is provided. Also given are a detailed description of the functions included in the library and examples of how to use these functions in computer programs written in the FORTRAN and C languages. (Author) 8 refs

  2. GeoSpark SQL: An Effective Framework Enabling Spatial Queries on Spark

    Directory of Open Access Journals (Sweden)

    Zhou Huang

    2017-09-01

    Full Text Available In the era of big data, Internet-based geospatial information services such as various LBS apps are deployed everywhere, followed by an increasing number of queries against the massive spatial data. As a result, traditional relational spatial databases (e.g., PostgreSQL with PostGIS and Oracle Spatial) cannot adapt well to the needs of large-scale spatial query processing. Spark is an emerging, outstanding distributed computing framework in the Hadoop ecosystem. This paper aims to address the increasingly large-scale spatial query-processing requirement in the era of big data, and proposes an effective framework, GeoSpark SQL, which enables spatial queries on Spark. On the one hand, GeoSpark SQL provides a convenient SQL interface; on the other hand, GeoSpark SQL achieves both efficient storage management and high-performance parallel computing through integrating Hive and Spark. In this study, the following key issues are discussed and addressed: (1) storage management methods under the GeoSpark SQL framework, (2) the spatial operator implementation approach in the Spark environment, and (3) spatial query optimization methods under Spark. Experimental evaluation is also performed and the results show that GeoSpark SQL is able to achieve real-time query processing. It should be noted that Spark is not a panacea. It is observed that the traditional spatial database PostGIS/PostgreSQL performs better than GeoSpark SQL in some query scenarios, especially for spatial queries with high selectivity, such as the point query and the window query. In general, GeoSpark SQL performs better when dealing with compute-intensive spatial queries such as the kNN query and the spatial join query.
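The selectivity contrast the authors observe can be illustrated with a naive in-memory sketch (invented data, not GeoSpark SQL code): a window query keeps only the few points inside a small rectangle, so an index can prune most of the data, while a kNN query must rank every point by distance, which is why it benefits more from parallel scanning.

```python
import random
from math import hypot

random.seed(0)
points = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(10_000)]

# Window query: highly selective -- only points inside a 1x1 rectangle survive,
# the case where an indexed RDBMS such as PostGIS tends to win.
window = [p for p in points if 40 <= p[0] <= 41 and 40 <= p[1] <= 41]

# kNN query: compute-intensive -- a distance to every point must be considered,
# the case where parallel scanning pays off.
q = (50.0, 50.0)
knn = sorted(points, key=lambda p: hypot(p[0] - q[0], p[1] - q[1]))[:5]

print(len(window), len(knn))
```

A real engine replaces the `sorted` full scan with a partitioned, distributed scan, but the asymptotic work per query is the same.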

  3. Preserving location and absence privacy in geo-social networks

    DEFF Research Database (Denmark)

    Freni, Dario; Vicente, Carmen Ruiz; Mascetti, Sergio

    2010-01-01

    The resulting geo-aware social networks (GeoSNs) pose privacy threats beyond those found in location-based services. Content published in a GeoSN is often associated with references to multiple users, without the publisher being aware of the privacy preferences of those users. Moreover, this content is often accessible to multiple users. This renders it difficult for GeoSN users to control which information about them is available and to whom it is available. This paper addresses two privacy threats that occur in GeoSNs: location privacy and absence privacy. The former concerns the availability of information about the presence of users in specific locations at given times, while the latter concerns the availability of information about the absence of an individual from specific locations during given periods of time. The challenge addressed is that of supporting privacy while still enabling useful services.

  4. A database in ACCESS for assessing vaccine serious adverse events

    Directory of Open Access Journals (Sweden)

    Thomas RE

    2015-04-01

    Full Text Available Roger E Thomas,1 Dave Jackson2,3 1Department of Family Medicine, G012 Health Sciences Centre, University of Calgary Medical School, Calgary, AB, Canada; 2Independent Research Consultant, Calgary, AB, Canada; 3Database Consultant, University of Calgary, Calgary, AB, Canada Purpose: To provide a free, flexible database for use by any researcher for assessing reports of adverse events after vaccination. Results: A database was developed in Microsoft ACCESS to assess reports of serious adverse events after yellow fever vaccination using Brighton Collaboration criteria. The database is partly automated (if data panels contain identical data fields, the data are automatically entered into those fields as well). The purpose is to provide the database free of charge for developers to add additional panels to assess other vaccines. Keywords: serious adverse events after vaccination, database, process to assess vaccine-associated events

  5. GEOS Atmospheric Model: Challenges at Exascale

    Science.gov (United States)

    Putman, William M.; Suarez, Max J.

    2017-01-01

    The Goddard Earth Observing System (GEOS) model at NASA's Global Modeling and Assimilation Office (GMAO) is used to simulate the multi-scale variability of the Earth's weather and climate, and is used primarily to assimilate conventional and satellite-based observations for weather forecasting and reanalysis. In addition, assimilations coupled to an ocean model are used for longer-term forecasting (e.g., El Nino) on seasonal to interannual time scales. The GMAO's research activities, including system development, focus on numerous time and space scales, as detailed on the GMAO website, where they are tabbed under five major themes: Weather Analysis and Prediction; Seasonal-Decadal Analysis and Prediction; Reanalysis; Global Mesoscale Modeling; and Observing System Science. A brief description of the GEOS systems can also be found at the GMAO website. GEOS executes as a collection of earth system components connected through the Earth System Modeling Framework (ESMF). The ESMF layer is supplemented with the MAPL (Modeling, Analysis, and Prediction Layer) software toolkit developed at the GMAO, which facilitates the organization of the computational components into a hierarchical architecture. GEOS systems run in parallel using a horizontal decomposition of the Earth's sphere into processing elements (PEs). Communication between PEs is primarily through a message-passing framework using the Message Passing Interface (MPI), and through explicit use of node-level shared memory access via the SHMEM (Symmetric Hierarchical Memory access) protocol. Production GEOS weather prediction systems currently run at 12.5-kilometer horizontal resolution with 72 vertical levels, decomposed into PEs associated with 5,400 MPI processes. Research GEOS systems run at resolutions as fine as 1.5 kilometers globally, using as many as 30,000 MPI processes. 
Looking forward, these systems can be expected to see a 2 times increase in horizontal resolution every two to three years, as well as
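The horizontal decomposition into PEs described above can be sketched as a simple block partition of a 2-D grid. This is a toy illustration, not GEOS code; the grid dimensions and processor counts are invented:

```python
def decompose(nx, ny, px, py):
    """Split an nx-by-ny horizontal grid into px*py roughly equal
    rectangular blocks, one block of grid columns per PE."""
    def splits(n, p):
        base, rem = divmod(n, p)
        sizes = [base + (1 if i < rem else 0) for i in range(p)]
        starts = [sum(sizes[:i]) for i in range(p)]
        return list(zip(starts, sizes))
    return [((ix, sx), (iy, sy))
            for ix, sx in splits(nx, px)
            for iy, sy in splits(ny, py)]

blocks = decompose(nx=180, ny=180, px=6, py=6)  # hypothetical 36-PE layout
print(len(blocks))  # 36
print(blocks[0])    # ((0, 30), (0, 30))
```

Each block's (start, size) pair in x and y defines the columns a PE owns; halo exchange between neighbouring blocks is what the MPI communication then carries.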

  6. Accessing the SEED genome databases via Web services API: tools for programmers.

    Science.gov (United States)

    Disz, Terry; Akhter, Sajia; Cuevas, Daniel; Olson, Robert; Overbeek, Ross; Vonstein, Veronika; Stevens, Rick; Edwards, Robert A

    2010-06-14

    The SEED integrates many publicly available genome sequences into a single resource. The database contains accurate and up-to-date annotations based on the subsystems concept, which leverages clustering between genomes and other clues to accurately and efficiently annotate microbial genomes. The backend is used as the foundation for many genome annotation tools, such as the Rapid Annotation using Subsystems Technology (RAST) server for whole-genome annotation, the metagenomics RAST server for random community genome annotations, and the annotation clearinghouse for exchanging annotations from different resources. In addition to a web user interface, the SEED also provides a Web services-based API for programmatic access to the data in the SEED, allowing the development of third-party tools and mash-ups. The currently exposed Web services encompass over forty different methods for accessing data related to microbial genome annotations. The Web services provide comprehensive access to the database back end, allowing any programmer access to the most consistent and accurate genome annotations available. The Web services are deployed using a platform-independent, service-oriented approach that allows the user to choose the most suitable programming platform for their application. Example code demonstrates that the Web services can be used to access the SEED using common bioinformatics programming languages such as Perl, Python, and Java. We present a novel approach to accessing the SEED database. Using Web services, a robust API for access to genomics data is provided without requiring large-volume downloads all at once. The API ensures timely access to the most current datasets available, including new genomes as soon as they come online.

  7. GeoCEGAS: natural gas distribution management system

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Lorena C.J. [Companhia de Gas do Ceara (CEGAS), Fortaleza, CE (Brazil); Targa, Fernando O. [Gestao Empresarial e Informatica Ltda. (GEMPI), Sao Paulo, SP (Brazil)

    2009-07-01

    This technical paper addresses the conception, architecture, design, construction, and implementation of GeoCEGAS, a spatially enabled corporate management information system designed to store, and provide Web access to, information associated with the natural gas distribution network owned by CEGAS. The paper reports the business processes, business entities and business intelligence addressed in the project, as well as an overview of the system architecture, applications, and technology used in the implementation of GeoCEGAS. Finally, an introduction to the work methodology used is presented, together with a synopsis of the benefits achieved. (author)

  8. [Project evidência [evidence]: research and education about accessing scientific databases in Azores].

    Science.gov (United States)

    Soares, Hélia; Pereira, Sandra M; Neves, Ajuda; Gomes, Amy; Teixeira, Bruno; Oliveira, Carolina; Sousa, Fábio; Tavares, Márcio; Tavares, Patrícia; Dutra, Raquel; Pereira, Hélder Rocha

    2013-04-01

    Project Evidência [Evidence] intends to promote the use of scientific databases among nurses. This study aims to design educational interventions that facilitate nurses' access to these databases, to determine nurses' habits regarding the use of scientific databases, and to determine the impact that educational interventions on scientific databases have on Azorean nurses who volunteered for this project. An intervention project was conducted, and a quantitative descriptive survey was designed to evaluate the impact two and five months after the educational intervention. This impact was investigated considering certain aspects, namely, the nurses' knowledge, habits and reasons for using scientific databases. A total of 192 nurses participated in this study, and the primary results indicate that the educational intervention had a positive impact, based not only on the increased frequency of use of platforms or databases of scientific information (DSIs), but also on competence and self-awareness regarding their use and on the reasons for accessing this information.

  9. Access To The PMM's Pixel Database

    Science.gov (United States)

    Monet, D.; Levine, S.

    1999-12-01

    The U.S. Naval Observatory Flagstaff Station is in the process of enabling access to the Precision Measuring Machine (PMM) program's pixel database. The initial release will include the pixels from the PMM's scans of the Palomar Observatory Sky Survey I (POSS-I) -O and -E surveys, the Whiteoak Extension, the European Southern Observatory-R survey, the Science and Engineering Council-J, -EJ, and -ER surveys, and the Anglo-Australian Observatory-R survey. (The SERC-ER and AAO-R surveys are currently incomplete.) As time allows, access to the POSS-II -J, -F, and -N surveys, the Palomar Infrared Milky Way Atlas, the Yale/San Juan Southern Proper Motion survey, and plates rejected by various surveys will be added. (POSS-II -J and -F are complete, but -N was never finished.) Eventually, some 10 Tbytes of pixel data will be available. Due to funding and technology limitations, the initial interface will have only limited functionality, and access time will be slow since the archive is stored on Digital Linear Tape (DLT). Usage of the pixel data will be restricted to non-commercial, scientific applications, and agreements on copyright issues have yet to be finalized. The poster presentation will give the URL.

  10. Database architecture optimized for the new bottleneck: Memory access

    NARCIS (Netherlands)

    P.A. Boncz (Peter); S. Manegold (Stefan); M.L. Kersten (Martin)

    1999-01-01

    In the past decade, advances in speed of commodity CPUs have far out-paced advances in memory latency. Main-memory access is therefore increasingly a performance bottleneck for many computer applications, including database systems. In this article, we use a simple scan test to show the

  11. Optimizing Database Architecture for the New Bottleneck: Memory Access

    NARCIS (Netherlands)

    S. Manegold (Stefan); P.A. Boncz (Peter); M.L. Kersten (Martin)

    2000-01-01

    In the past decade, advances in speed of commodity CPUs have far out-paced advances in memory latency. Main-memory access is therefore increasingly a performance bottleneck for many computer applications, including database systems. In this article, we use a simple scan test to show the
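
The "simple scan test" itself is not reproduced in these records. As a rough sketch of the idea, the Python below compares sequential and strided traversal of the same array; note that Python's interpreter overhead largely hides hardware cache behavior, so a C version would show the memory-latency gap far more dramatically:

```python
# Illustrative sketch (not the authors' code): a scan test traverses a large
# array sequentially and then with a large stride. On real hardware, large
# strides defeat spatial locality, so the strided scan pays a memory-latency
# penalty per element -- the bottleneck the article measures.
import array
import time

def scan(data, stride):
    """Sum every element, visiting indices `stride` apart, pass by pass."""
    n = len(data)
    total = 0
    for offset in range(stride):
        for i in range(offset, n, stride):
            total += data[i]
    return total

data = array.array("l", range(1 << 18))  # ~256K integers

t0 = time.perf_counter()
seq = scan(data, 1)
t_seq = time.perf_counter() - t0

t0 = time.perf_counter()
strided = scan(data, 4096)
t_strided = time.perf_counter() - t0

# Both traversals touch every element exactly once, so the sums must agree.
assert seq == strided == sum(range(1 << 18))
print(f"sequential {t_seq:.3f}s, strided {t_strided:.3f}s")
```

In a compiled language the strided scan slows down sharply once the stride exceeds a cache line, which is the effect the authors use to motivate cache-conscious database architecture.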

  12. Living with geo-resources and geo-hazards

    NARCIS (Netherlands)

    Hangx, Suzanne|info:eu-repo/dai/nl/30483579X; Niemeijer, André|info:eu-repo/dai/nl/370832132

    2015-01-01

    Two of the key strategic topics on the European Committee’s Horizon2020 Roadmap revolve around geo-resources and geo-hazards, and their impact on societal and economic development. On the way towards a better policy for sustainable geo-resources production, such as oil, gas, geothermal energy and

  13. A Model-driven Role-based Access Control for SQL Databases

    Directory of Open Access Journals (Sweden)

    Raimundas Matulevičius

    2015-07-01

    Full Text Available Nowadays security has become an important aspect in information systems engineering. A mainstream method for information system security is Role-based Access Control (RBAC), which restricts system access to authorised users. While the benefits of RBAC are widely acknowledged, the implementation and administration of RBAC policies remains a human-intensive activity, typically postponed until the implementation and maintenance phases of system development. This deferred security engineering approach makes it difficult for security requirements to be accurately captured and for the system’s implementation to be kept aligned with these requirements as the system evolves. In this paper we propose a model-driven approach to manage SQL database access under the RBAC paradigm. The starting point of the approach is an RBAC model captured in SecureUML. This model is automatically translated to Oracle Database views and instead-of triggers code, which implements the security constraints. The approach has been fully instrumented as a prototype and its effectiveness has been validated by means of a case study.
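
The paper's generated code targets Oracle Database; as a loose sketch of the same view-plus-INSTEAD-OF-trigger enforcement pattern, here is a SQLite stand-in in Python. The table, role, and column names are invented for illustration:

```python
# Sketch of view-based RBAC enforcement (SQLite standing in for Oracle).
# A role is granted access only through a restricted view; an INSTEAD OF
# trigger on that view blocks writes the role is not permitted to make.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE salaries (emp TEXT, dept TEXT, amount INT);
INSERT INTO salaries VALUES ('ann','hr',50), ('bob','it',60);

-- The hr role sees only the rows its department permission allows.
CREATE VIEW hr_salaries AS SELECT * FROM salaries WHERE dept = 'hr';

-- An INSTEAD OF trigger rejects updates routed through the restricted view.
CREATE TRIGGER hr_read_only INSTEAD OF UPDATE ON hr_salaries
BEGIN
    SELECT RAISE(ABORT, 'role hr may not update salaries');
END;
""")

rows = conn.execute("SELECT emp FROM hr_salaries").fetchall()
print(rows)  # [('ann',)] -- the hr role sees only its own department
```

In the paper's approach these views and triggers are not hand-written but generated automatically from the SecureUML model.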

  14. Pan European Phenological database (PEP725): a single point of access for European data

    Science.gov (United States)

    Templ, Barbara; Koch, Elisabeth; Bolmgren, Kjell; Ungersböck, Markus; Paul, Anita; Scheifinger, Helfried; Rutishauser, This; Busto, Montserrat; Chmielewski, Frank-M.; Hájková, Lenka; Hodzić, Sabina; Kaspar, Frank; Pietragalla, Barbara; Romero-Fresneda, Ramiro; Tolvanen, Anne; Vučetič, Višnja; Zimmermann, Kirsten; Zust, Ana

    2018-02-01

    The Pan European Phenology (PEP) project is a European infrastructure to promote and facilitate phenological research, education, and environmental monitoring. The main objective is to maintain and develop a Pan European Phenological database (PEP725) with an open, unrestricted data access for science and education. PEP725 is the successor of the database developed through the COST action 725 "Establishing a European phenological data platform for climatological applications" working as a single access point for European-wide plant phenological data. So far, 32 European meteorological services and project partners from across Europe have joined and supplied data collected by volunteers from 1868 to the present for the PEP725 database. Most of the partners actively provide data on a regular basis. The database presently holds almost 12 million records, about 46 growing stages and 265 plant species (including cultivars), and can be accessed via http://www.pep725.eu/. Users of the PEP725 database have studied a diversity of topics ranging from climate change impact, plant physiological questions, phenological modeling, and remote sensing of vegetation to ecosystem productivity.

  16. Distributed Access View Integrated Database (DAVID) system

    Science.gov (United States)

    Jacobs, Barry E.

    1991-01-01

    The Distributed Access View Integrated Database (DAVID) System, which was adopted by the Astrophysics Division for their Astrophysics Data System, is a solution to the system heterogeneity problem. The heterogeneous components of the Astrophysics problem are outlined. The Library and Library Consortium levels of the DAVID approach are described. The 'books' and 'kits' level is discussed. The Universal Object Typer Management System level is described. The relation of the DAVID project with the Small Business Innovative Research (SBIR) program is explained.

  17. Ibmdbpy-spatial : An Open-source implementation of in-database geospatial analytics in Python

    Science.gov (United States)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

    As the amount of spatial data acquired from several geodetic sources has grown over the years and as data infrastructure has become more powerful, the need for adoption of in-database analytic technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis, making it easier to predict risks and opportunities, identify trends and spot anomalies. Although there are a number of open-source spatial analysis libraries like geopandas and shapely available today, most of them have been restricted to manipulation and analysis of geometric objects, with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform called Bluemix. Working in-database reduces network overhead, since the complete dataset need not be replicated to the user's local system; only the required subset is fetched into memory at any one time. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features, such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4.
The basic architecture of the package consists of three main components - 1) a connection to the dashDB represented by the instance IdaDataBase, which uses a middleware API namely - pypyodbc or jaydebeapi to establish the database connection via
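
dashDB and ibmdbpy are not available here, so the following sketch illustrates only the general in-database pushdown idea the abstract describes, using SQLite as a stand-in; the table, columns, and bounding box are made up:

```python
# In-database filtering: the predicate runs inside the database, so only the
# matching subset of rows ever crosses the connection -- the same pushdown
# principle ibmdbpy applies to dashDB (illustrative SQLite stand-in).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sites (name TEXT, lon REAL, lat REAL);
INSERT INTO sites VALUES ('a', 9.1, 48.7), ('b', 120.9, 14.6), ('c', 8.5, 47.4);
""")

# A bounding-box query evaluated in-database; the client never sees row 'b'.
europe = conn.execute(
    "SELECT name FROM sites WHERE lon BETWEEN -10 AND 30 AND lat BETWEEN 35 AND 60"
).fetchall()
print(europe)  # [('a',), ('c',)]
```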

  18. GeoViQua: quality-aware geospatial data discovery and evaluation

    Science.gov (United States)

    Bigagli, L.; Papeschi, F.; Mazzetti, P.; Nativi, S.

    2012-04-01

    /tracking information such as provenance of data and metadata), and user-generated metadata (informal user comments, usage information, rating, etc.). Moreover, metadata should include sufficiently complete access information, to allow rich data visualization and propagation. The following main enabling components are currently identified within WP4: - Quality-aware access services, e.g. a quality-aware extension of the OGC Sensor Observation Service (SOS-Q) specification, to support quality constraints for sensor data publishing and access; - Quality-aware discovery services, namely a quality-aware extension of the OGC Catalog Service for the Web (CSW-Q), to cope with quality-constrained search; - Quality-augmentation broker (GeoViQua Broker), to support the linking and combination of the existing GCI metadata with GeoViQua- and user-generated metadata required to support the users in selecting the "best" data for their intended use. We are currently developing prototypes of the above quality-enabled geo-search components, which will be assessed in a sensor-based pilot case study in the coming months. In particular, the GeoViQua Broker will be integrated with the EuroGEOSS Broker, to implement CSW-Q and federate (either via distribution or harvesting schemes) quality-aware data sources. GeoViQua will constitute a valuable test-bed for advancing the current best practices and standards in geospatial quality representation and exploitation. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 265178.

  19. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-06-17

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. 
The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and
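
None of the four reviewed tools is shown in the record. As a generic illustration of the retrieve-and-integrate step they automate, the sketch below parses a fabricated stream-gage export in the tab-delimited, comment-prefixed style common to federal hydrologic data services (all field names and values are invented):

```python
# Parse a comment-prefixed, tab-delimited gage export into model-ready
# records -- a minimal stand-in for the data-harvesting step the reviewed
# tools perform before handing data to a framework such as FRAMES.
import csv
import io

RAW = """\
# hypothetical stream gage export
site_no\tdatetime\tdischarge_cfs
0123\t2009-06-01\t154.0
0123\t2009-06-02\t171.5
"""

def parse_gage(text):
    """Drop comment lines, then read the tab-delimited table as dicts."""
    lines = [l for l in text.splitlines() if not l.startswith("#")]
    reader = csv.DictReader(io.StringIO("\n".join(lines)), delimiter="\t")
    # Convert discharge to float; keep site_no as text (leading zeros matter).
    return [dict(row, discharge_cfs=float(row["discharge_cfs"])) for row in reader]

records = parse_gage(RAW)
print(len(records), records[0]["discharge_cfs"])  # 2 154.0
```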

  20. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    International Nuclear Information System (INIS)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-01-01

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. 
The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and

  1. Big Data, Small Data: Accessing and Manipulating Geoscience Data Ranging From Repositories to Student-Collected Data Sets Using GeoMapApp

    Science.gov (United States)

    Goodwillie, A. M.

    2015-12-01

    We often demand information and data to be accessible over the web at no cost, and no longer do we expect to spend time laboriously compiling data from myriad sources with frustratingly different formats. Instead, we increasingly expect convenience and consolidation. Recent advances in web-enabled technologies and cyberinfrastructure are answering those calls by providing data tools and resources that can transform undergraduate education. By freeing up valuable classroom time, students can focus upon gaining deeper insights and understanding from real-world data. GeoMapApp (http://www.geomapapp.org) is a map-based data discovery and visualisation tool developed at Lamont-Doherty Earth Observatory. GeoMapApp promotes U-Learning by working across all major computer platforms and functioning anywhere with internet connectivity, by lowering socio-economic barriers (it is free), by seamlessly integrating thousands of built-in research-grade data sets under intuitive menus, and by being adaptable to a range of learning environments - from lab sessions, group projects, and homework assignments to in-class pop-ups. GeoMapApp caters to casual and specialist users alike. Contours, artificial illumination, 3-D displays, data point manipulations, cross-sectional profiles, and other display techniques help students better grasp the content and geospatial context of data. Layering capabilities allow easy data set comparisons. The core functionality also applies to imported data sets: Student-collected data can thus be imported and analysed using the same techniques. A new Save Session function allows educators to preserve a pre-loaded state of GeoMapApp. When shared with a class, the saved file allows every student to open GeoMapApp at exactly the same starting point from which to begin their data explorations. 
Examples of built-in data sets include seafloor crustal age, earthquake locations and focal mechanisms, analytical geochemistry, ocean water physical properties, US and

  2. The design of a DataBase for Natural Resources in Danube Delta Biosphere Reserve DDBR

    Directory of Open Access Journals (Sweden)

    GRIGORAS Ion

    2016-12-01

    Full Text Available Efficient use of natural resources, especially in Natura 2000 sites, is an essential component of the Europe 2020 strategy. A web database is essential for good resource management, and it provides a communication channel for the main stakeholders: the protected area manager, scientists, resource evaluators and the local community. Access to information in the database is granted according to user competence. General information on natural resource use in the Danube Delta Biosphere Reserve (D.D.B.R.) is freely available, while different levels of access, especially for editing data, are applied to the main actors involved in the use of natural resources: evaluators, who are mainly scientists with a good biodiversity background; protected area staff, who apply the regulations on natural resources in relation to ecological conditions; and private companies or persons interested in harvesting natural resources. The user interface is built with open-source products: the web interface for tabular data uses the ExtJS JavaScript library, and the web map interface uses OpenLayers, GeoExt and Ext. For the database SQL server we chose PostgreSQL, with GeoServer as the map server.

  3. GEO-ENGINEERING MODELING THROUGH INTERNET INFORMATICS (GEMINI)

    Energy Technology Data Exchange (ETDEWEB)

    W. Lynn Watney; John H. Doveton

    2004-05-13

    GEMINI (Geo-Engineering Modeling through Internet Informatics) is a public-domain web application focused on analysis and modeling of petroleum reservoirs and plays (http://www.kgs.ukans.edu/Gemini/index.html). GEMINI creates a virtual project by "on-the-fly" assembly and analysis of on-line data either from the Kansas Geological Survey or uploaded from the user. GEMINI's suite of geological and engineering web applications for reservoir analysis includes: (1) petrofacies-based core and log modeling using an interactive relational rock catalog and log analysis modules; (2) a well profile module; (3) interactive cross sections to display "marked" wireline logs; (4) deterministic gridding and mapping of petrophysical data; (5) calculation and mapping of layer volumetrics; (6) material balance calculations; (7) PVT calculator; (8) DST analyst; (9) automated hydrocarbon association navigator (KHAN) for database mining; and (10) tutorial and help functions. The Kansas Hydrocarbon Association Navigator (KHAN) utilizes petrophysical databases to estimate hydrocarbon pay or other constituents at a play- or field-scale. Databases analyzed and displayed include digital logs, core analysis and photos, DST, and production data. GEMINI accommodates distant collaborations using secure password protection and authorized access. Assembled data, analyses, charts, and maps can readily be moved to other applications. GEMINI's target audience includes small independents and consultants seeking to find, quantitatively characterize, and develop subtle and bypassed pays by leveraging the growing base of digital data resources. Participating companies involved in the testing and evaluation of GEMINI included Anadarko, BP, Conoco-Phillips, Lario, Mull, Murfin, and Pioneer Resources.
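
GEMINI's own module code is not given in the record. As an illustration of the kind of computation its material balance module performs, here is the standard gas-reservoir p/z material balance solved for gas in place; all input values are invented:

```python
# Standard gas material balance: p/z declines linearly with cumulative
# production Gp, i.e. p/z = (p_i/z_i) * (1 - Gp/G). Solving for original
# gas in place G (not GEMINI's code; inputs are hypothetical).
def gas_in_place(p_i, z_i, p, z, Gp):
    """G from initial pressure p_i (psia), current pressure p, z-factors,
    and cumulative production Gp (scf)."""
    return Gp / (1 - (p / z) / (p_i / z_i))

G = gas_in_place(p_i=4000, z_i=0.9, p=3000, z=0.85, Gp=2.0e9)
print(round(G / 1e9, 2), "Bscf")  # 9.71 Bscf
```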

  4. The bovine QTL viewer: a web accessible database of bovine Quantitative Trait Loci

    Directory of Open Access Journals (Sweden)

    Xavier Suresh R

    2006-06-01

    Full Text Available Abstract Background Many important agricultural traits such as weight gain, milk fat content and intramuscular fat (marbling) in cattle are quantitative traits. Most of the information on these traits has not previously been integrated into a genomic context. Without such integration, application of these data to agricultural enterprises will remain slow and inefficient. Our goal was to populate a genomic database with data mined from the bovine quantitative trait literature and to make these data available in a genomic context to researchers via a user-friendly query interface. Description The QTL (Quantitative Trait Locus) data and related information for bovine QTL are gathered from published work and from existing databases. An integrated database schema was designed and the database (MySQL) populated with the gathered data. The bovine QTL Viewer was developed for the integration of QTL data available for cattle. The tool consists of an integrated database of bovine QTL and the QTL viewer to display QTL and their chromosomal positions. Conclusion We present a web accessible, integrated database of bovine (dairy and beef cattle) QTL for use by animal geneticists. The viewer and database are of general applicability to any livestock species for which there are public QTL data. The viewer can be accessed at http://bovineqtl.tamu.edu.
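
The record describes QTL stored with chromosomal positions and displayed through a viewer. A toy version of the underlying region query might look as follows (SQLite standing in for MySQL; the schema, traits, and positions are invented):

```python
# Minimal sketch of a QTL region query: find all QTL whose interval
# overlaps the viewer's window on a given chromosome.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE qtl (trait TEXT, chromosome INT, start_cm REAL, end_cm REAL);
INSERT INTO qtl VALUES ('marbling', 14, 20.0, 35.0),
                       ('milk fat', 14, 60.0, 72.0),
                       ('weight gain', 6, 10.0, 25.0);
""")

# All QTL overlapping a 15-65 cM window on chromosome 14: two intervals
# overlap when each starts before the other ends.
hits = conn.execute(
    "SELECT trait FROM qtl WHERE chromosome = 14 AND start_cm <= 65 AND end_cm >= 15"
).fetchall()
print(hits)  # [('marbling',), ('milk fat',)]
```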

  5. An Open Access Database of Genome-wide Association Results

    Directory of Open Access Journals (Sweden)

    Johnson Andrew D

    2009-01-01

    Full Text Available Abstract Background The number of genome-wide association studies (GWAS) is growing rapidly, leading to the discovery and replication of many new disease loci. Combining results from multiple GWAS datasets may potentially strengthen previous conclusions and suggest new disease loci, pathways or pleiotropic genes. However, no database or centralized resource currently exists that contains anywhere near the full scope of GWAS results. Methods We collected available results from 118 GWAS articles into a database of 56,411 significant SNP-phenotype associations and accompanying information, making this database freely available here. In doing so, we met and describe here a number of challenges to creating an open access database of GWAS results. Through preliminary analyses and characterization of available GWAS, we demonstrate the potential to gain new insights by querying a database across GWAS. Results Using a genomic bin-based density analysis to search for highly associated regions of the genome, positive control loci (e.g., MHC loci) were detected with high sensitivity. Likewise, an analysis of highly repeated SNPs across GWAS identified replicated loci (e.g., APOE, LPL). At the same time we identified novel, highly suggestive loci for a variety of traits that did not meet genome-wide significance thresholds in prior analyses, in some cases with strong support from the primary medical genetics literature (SLC16A7, CSMD1, OAS1), suggesting these genes merit further study. Additional adjustment for linkage disequilibrium within most regions with a high density of GWAS associations did not materially alter our findings. Having a centralized database with standardized gene annotation also allowed us to examine the representation of functional gene categories (gene ontologies) containing one or more associations among top GWAS results. 
Genes relating to cell adhesion functions were highly over-represented among significant associations (p < 10^-14), a finding
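
The "genomic bin-based density analysis" can be sketched simply: partition positions into fixed-width bins, count associations per bin, and flag the dense bins. The bin width, positions, and threshold below are illustrative, not taken from the database:

```python
# Toy bin-based density analysis: count SNP-phenotype associations per
# fixed-width genomic bin and report bins exceeding a hit threshold.
from collections import Counter

BIN = 1_000_000  # 1 Mb bins (illustrative choice)

def dense_bins(positions, min_hits):
    """Return sorted bin indices holding at least `min_hits` associations."""
    counts = Counter(p // BIN for p in positions)
    return sorted(b for b, n in counts.items() if n >= min_hits)

# Three associations cluster in the same 1 Mb bin, mimicking a hot locus.
snp_positions = [32_500_100, 32_600_250, 32_700_900, 5_100_000, 88_200_000]
print(dense_bins(snp_positions, min_hits=3))  # [32]
```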

  6. Using GeoRePORT to report socio-economic potential for geothermal development

    Energy Technology Data Exchange (ETDEWEB)

    Young, Katherine R.; Levine, Aaron

    2018-07-01

    The Geothermal Resource Portfolio Optimization and Reporting Tool (GeoRePORT, http://en.openei.org/wiki/GeoRePORT) was developed for reporting resource grades and project readiness levels, providing the U.S. Department of Energy a consistent and comprehensible means of evaluating projects. The tool helps funding organizations (1) quantitatively identify barriers, (2) develop measurable goals, (3) objectively evaluate proposals, including contribution to goals, (4) monitor progress, and (5) report portfolio performance. GeoRePORT assesses three categories: geological, technical, and socio-economic. Here, we describe GeoRePORT, then focus on the socio-economic assessment and its applications for assessing deployment potential in the U.S. Socio-economic attributes include land access, permitting, transmission, and market.
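
GeoRePORT's actual grading scheme is not detailed in the record; the sketch below only illustrates the general idea of rolling attribute grades into a project score. The attribute names follow the abstract, but the letter-grade scale and weights are invented:

```python
# Hypothetical weighted roll-up of socio-economic attribute grades,
# in the spirit of (but not taken from) GeoRePORT.
GRADES = {"A": 4, "B": 3, "C": 2, "D": 1}

def project_score(grades, weights):
    """Weighted mean of letter grades across socio-economic attributes."""
    total_w = sum(weights.values())
    return sum(GRADES[grades[k]] * w for k, w in weights.items()) / total_w

grades = {"land_access": "A", "permitting": "C", "transmission": "B", "market": "B"}
weights = {"land_access": 1, "permitting": 2, "transmission": 1, "market": 1}
print(round(project_score(grades, weights), 2))  # 2.8
```

Weighting permitting twice as heavily reflects the idea that a funding organization can tune the roll-up to the barriers it cares about most.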

  7. Evaluation of an Online Instructional Database Accessed by QR Codes to Support Biochemistry Practical Laboratory Classes

    Science.gov (United States)

    Yip, Tor; Melling, Louise; Shaw, Kirsty J.

    2016-01-01

    An online instructional database containing information on commonly used pieces of laboratory equipment was created. In order to make the database highly accessible and to promote its use, QR codes were utilized. The instructional materials were available anytime and accessed using QR codes located on the equipment itself and within undergraduate…

  8. The Population of Optically Faint GEO Debris

    Science.gov (United States)

    Seitzer, Patrick; Barker, Ed; Buckalew, Brent; Burkhardt, Andrew; Cowardin, Heather; Frith, James; Gomez, Juan; Kaleida, Catherine; Lederer, Susan M.; Lee, Chris H.

    2016-01-01

    The 6.5-m Magellan telescope 'Walter Baade' at the Las Campanas Observatory in Chile has been used for spot surveys of the GEO orbital regime to study the population of optically faint GEO debris. The goal is to estimate the size of the population of GEO debris at sizes much smaller than can be studied with 1-meter class telescopes. Despite the small size of the field of view of the Magellan instrument (diameter 0.5-degree), a significant population of objects fainter than R = 19th magnitude has been found with angular rates consistent with circular orbits at GEO. We compare the size of this population with the numbers of GEO objects found at brighter magnitudes by smaller telescopes. The observed detections have a wide range in characteristics, starting with those appearing as short uniform streaks. But there are a substantial number of detections with variations in brightness ('flashers') during the 5-second exposure. The duration of each of these flashes can be extremely brief: sometimes less than half a second. This is characteristic of a rapidly tumbling object with a quite variable projected size times albedo. If the albedo is of the order of 0.2, then the largest projected size of these objects is around 10 cm. The data in this paper were collected over the last several years using Magellan's IMACS camera in f/2 mode. The analysis shows the brightness bins for the observed GEO population as well as the periodicity of the flashers. All objects presented are correlated with the catalog: the focus of the paper will be on the uncorrelated, optically faint, objects. The goal of this project is to better characterize the faint debris population in GEO that access to a 6.5-m optical telescope in a superb site can provide.
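
The flash-periodicity analysis itself is not described in the record. A minimal, hypothetical version estimates a tumbling period from the gaps between flash times detected within a single 5-second exposure (the times below are synthetic):

```python
# Toy period estimate for a 'flasher': take the median gap between
# successive flash detections in one exposure. Not the authors' method;
# the flash times are invented for illustration.
import statistics

flash_times = [0.4, 0.85, 1.32, 1.78, 2.24, 2.71]  # seconds within exposure
gaps = [b - a for a, b in zip(flash_times, flash_times[1:])]
period = statistics.median(gaps)
print(round(period, 2))  # 0.46
```

A sub-half-second period like this one would be consistent with the rapid tumbling the abstract infers from the brief flashes.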

  9. Public sector information access policies in Europe

    NARCIS (Netherlands)

    Welle Donker, F.M.

    2010-01-01

    In the digital age geo-information has become embedded in our daily lives, such as navigation systems, community platforms, real estate information and weather forecasts. Everybody uses geo-information for their day-to-day decision making. Therefore, access to geo-information is of vital importance

  10. Using GeoMapApp in the Classroom

    Science.gov (United States)

    Goodwillie, A. M.

    2017-12-01

    The GeoMapApp tool has been updated with enhanced functionality that is useful in the classroom. Hosted as a service of the IEDA Facility at Columbia University, GeoMapApp (http://www.geomapapp.org) is a free resource that integrates a wide range of research-grade geoscience data in one intuitive map-based interface. It includes earthquake and volcano data, geological maps, plate tectonic data sets, and a high-resolution topography/bathymetry base map. Users can also import and analyse their own data files. Layering and transparency capabilities allow users to compare multiple data sets at once. The GeoMapApp interface presents data in its proper geospatial context, helping students more easily gain insight and understanding from the data. Simple tools for data manipulation allow students to analyse the data in different ways such as generating profiles and producing visualisations for reports. The new Save Session capability is designed to assist in the classroom: The educator saves a pre-loaded state of GeoMapApp. When shared with the class, the saved session file allows students to open GeoMapApp with exactly the same data sets loaded and the same display parameters chosen thus freeing up valuable time in which students can explore the data. In this presentation, activities related to plate tectonics will be highlighted. One activity helps students investigate plate boundaries by exploring earthquake and volcano locations. Another requires students to calculate the rate of seafloor spreading using crustal age data in various ocean basins. A third uses the GeoMapApp layering technique to explore the influence of geological forces in shaping the landscape. Educators report that using GeoMapApp in the classroom lowers the barriers to data accessibility for students; fosters an increased sense of data "ownership" - GeoMapApp presents the same data in the same tool used by researchers; allows engagement with authentic geoscience data; promotes STEM skills and

  11. The Distributed Geothermal Market Demand Model (dGeo): Documentation

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, Kevin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mooney, Meghan E [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sigrin, Benjamin O [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gleason, Michael [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Liu, Xiaobing [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-06

    The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and a robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to stakeholders interested in exploring the effects of techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.
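
    The bottom-up, agent-based approach can be illustrated with a heavily simplified sketch. The agents, the toy payback economics, and the adoption rule below are invented stand-ins for illustration only, not dGeo's actual methodology:

```python
import random

class Agent:
    """Hypothetical building agent; the attribute names are illustrative, not dGeo's."""
    def __init__(self, heat_demand_mwh, install_cost, fuel_price):
        self.heat_demand_mwh = heat_demand_mwh
        self.install_cost = install_cost
        self.fuel_price = fuel_price
        self.adopted = False

    def annual_savings(self):
        # Savings from displacing fuel purchases with a GHP (toy economics).
        return self.heat_demand_mwh * self.fuel_price

    def payback_years(self):
        return self.install_cost / self.annual_savings()

def simulate(agents, years, max_payback=10.0, seed=0):
    """Each simulated year, agents whose simple payback clears a threshold may
    adopt, with probability rising as the payback shortens."""
    rng = random.Random(seed)
    for _ in range(years):
        for a in agents:
            if a.adopted:
                continue
            pb = a.payback_years()
            if pb <= max_payback:
                p = 1.0 - pb / max_payback   # shorter payback -> likelier adoption
                if rng.random() < p * 0.3:   # 0.3 = arbitrary annual adoption cap
                    a.adopted = True
    return sum(a.adopted for a in agents)

agents = [Agent(20 + i % 15, 12000, 60 + (i % 7) * 5) for i in range(1000)]
adopters = simulate(agents, years=30)
print(adopters)  # count of agents that adopted after 30 simulated years
```

    A full market-demand model would layer resource availability, financing, and policy scenarios on top of such per-agent economics; the sketch only shows the agent-loop skeleton.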

  12. The Group on Earth Observations (GEO) through 2025

    Science.gov (United States)

    Ryan, Barbara; Cripe, Douglas

    Ministers from the Group on Earth Observations (GEO) Member governments, meeting in Geneva, Switzerland in January 2014, unanimously renewed the mandate of GEO through 2025. Through a Ministerial Declaration, they reconfirmed that GEO’s guiding principles - collaboration in leveraging national, regional and global investments, and development and coordination of strategies to achieve full and open access to Earth observations data and information in support of timely and knowledge-based decision-making - are catalysts for improving the quality of life of people around the world, advancing global sustainability, and preserving the planet and its biodiversity. GEO Ministers acknowledged and valued the contributions of GEO Member governments and invited all remaining Member States of the United Nations to consider joining GEO. The Ministers also encouraged all Members to strengthen national GEO arrangements, and - of particular interest to COSPAR - they highlighted the unique contributions of Participating Organizations. In this regard, ten more organizations saw their applications approved by Plenary and joined COSPAR in the ranks of Participating Organizations in GEO, bringing the current total to 77. Building on the efforts of a Post-2015 Working Group, in which COSPAR participated, Ministers provided additional guidance for GEO and the evolution of its Global Earth Observation System of Systems (GEOSS) through 2025. Five key areas of activity for the next decade include the following: 1.) Advocating for the value of Earth observations and the need to continue improving Earth observation worldwide; 2.) Urging the adoption and implementation of data sharing principles globally; 3.) Advancing the development of the GEOSS information system for the benefit of users; 4.) Developing a comprehensive interdisciplinary knowledge base defining and documenting the observations needed for all disciplines and facilitating the availability and accessibility of

  13. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database.

    Science.gov (United States)

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-28

    Vision navigation, which determines position and attitude through real-time image processing of data collected from imaging sensors, can operate without a high-performance global positioning system (GPS) and an inertial measurement unit (IMU). Vision navigation is widely used in indoor navigation, far space navigation, and multiple-sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach that is aided by imaging sensors and uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS coverage. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image search and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image against the GRID. The image matched with the real-time scene is then used to calculate the 3D navigation parameters of the multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and achieves navigation accuracies of 1.2 m in the horizontal plane and 1.8 m in height during GPS outages of up to 5 min and over distances of up to 1500 m.
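
    The linear-index idea - keying reference images by distance along a road segment so that matching candidates near the current position can be retrieved quickly - might be sketched as follows. The class and its data model are guesses for illustration, not the paper's actual storage scheme:

```python
import bisect

class RoadSegmentIndex:
    """Illustrative linear index: images keyed by chainage (metres along the road)."""
    def __init__(self):
        self._chainages = []   # sorted distances along the segment
        self._image_ids = []   # parallel list of image identifiers

    def add(self, chainage_m, image_id):
        i = bisect.bisect_left(self._chainages, chainage_m)
        self._chainages.insert(i, chainage_m)
        self._image_ids.insert(i, image_id)

    def nearby(self, chainage_m, window_m=25.0):
        """Return image ids within +/- window_m of the query position,
        in O(log n + k) rather than a full scan."""
        lo = bisect.bisect_left(self._chainages, chainage_m - window_m)
        hi = bisect.bisect_right(self._chainages, chainage_m + window_m)
        return self._image_ids[lo:hi]

idx = RoadSegmentIndex()
for d in range(0, 1500, 10):          # one reference image every 10 m
    idx.add(float(d), f"img_{d:04d}")
print(idx.nearby(432.0))
```

    Only the small retrieved candidate set then needs to be passed to the (much more expensive) image matching stage.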

  14. The consequences of the Chernobyl accident: REDAC, the radioecological database of the French-German Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Deville-Cavelin, G. [Institut de Radioprotection et de Surete Nucleaire, IRSN, BP 17, 92262 Fontenay-aux-Roses Cedex (France); Biesold, H. [Gesellschaft fuer Anlagen- und Reaktorsicherheit, GRS, mbH, Schwertnergasse 1, 50667 Koeln (Germany); Chabanyuk, V. [Intelligence Systems GEO, Chernobyl Centre for Nuclear Safety, Radioactive Wastes and Radioecology (Ukraine)

    2005-07-01

    The methodology made use of the following main portlets and the DDB: GlobalFunctions - interconnection between portlets; ContentTree - access to the content of REDAC; LibraryLocator - shows the location in the library; Index - search of documents by keywords; Search - search of documents according to chosen properties; Favorites - generation of sets of the most often used files; Briefcase - for downloading documents to the user's computer; MetaView - shows the metadata characterizing the files; DocView - displays the file loaded from the web server; ProductRelations, ActiveRelations, AllRelations - show the relations between the selected document and other associated documents; Glossary - global project glossary based on thematic ones. The following conclusions are highlighted: REDAC is a powerful and useful radioecological tool: - All elements are easily accessible through the original tool, ProSF, developed by IS Geo; - Relations are constructed between the documents (files, databases, documentation, reports, etc.); - All elements are structured by meta-information; - Mechanisms of search are provided; - A global radioecological glossary is available; - Spatial data are geo-coded; - The processes, tools and methodology are suitable for similar projects; - The data are useful for scientific studies, modelling, operational purposes, and communication with mass media. As prospects, the addition of functionality, support and maintenance are pointed out, as well as a stronger integration implying thematic integration (merging of all databases into a unique one) and information integration (a decision on 'strong integration' and information support)

  15. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    CERN Document Server

    Valassi, A; Kalkhof, A; Salnikov, A; Wache, M

    2011-01-01

    The CORAL software is widely used at CERN for accessing the data stored by the LHC experiments using relational database technologies. CORAL provides a C++ abstraction layer that supports data persistency for several backends and deployment models, including local access to SQLite files, direct client access to Oracle and MySQL servers, and read-only access to Oracle through the FroNTier web server and cache. Two new components have recently been added to CORAL to implement a model involving a middle tier "CORAL server" deployed close to the database and a tree of "CORAL server proxy" instances, with data caching and multiplexing functionalities, deployed close to the client. The new components are meant to provide advantages for read-only and read-write data access, in both offline and online use cases, in the areas of scalability and performance (multiplexing for several incoming connections, optional data caching) and security (authentication via proxy certificates). A first implementation of the two new c...
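
    The caching benefit of the proxy tier can be illustrated with a toy read-through cache: repeated read-only queries from many clients are served from the cache, and only misses travel on to the database tier. This is a sketch of the idea only, unrelated to the real C++ implementation:

```python
class ProxyCache:
    """Toy stand-in for the caching/multiplexing role of a CORAL server proxy."""
    def __init__(self, backend):
        self.backend = backend       # callable: query string -> rows
        self.cache = {}
        self.backend_calls = 0       # how often the database tier was hit

    def query(self, sql):
        # Serve repeated read-only queries from the cache; only cache
        # misses are forwarded to the backend.
        if sql not in self.cache:
            self.backend_calls += 1
            self.cache[sql] = self.backend(sql)
        return self.cache[sql]

def fake_db(sql):
    """Pretend database tier returning canned rows."""
    return [("row-for", sql)]

proxy = ProxyCache(fake_db)
for _ in range(3):                   # three clients issue the same query
    rows = proxy.query("SELECT * FROM conditions")
print(proxy.backend_calls)  # -> 1
```

    In the real deployment a tree of such proxies additionally multiplexes many client connections over few server connections and handles cache invalidation, which this sketch omits.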

  16. Open-access databases as unprecedented resources and drivers of cultural change in fisheries science

    Energy Technology Data Exchange (ETDEWEB)

    McManamay, Ryan A [ORNL; Utz, Ryan [National Ecological Observatory Network

    2014-01-01

    Open-access databases with utility in fisheries science have grown exponentially in quantity and scope over the past decade, with profound impacts on our discipline. The management, distillation, and sharing of an exponentially growing stream of open-access data present several fundamental challenges for fisheries science. Many of the currently available open-access resources may not be universally known among fisheries scientists. We therefore introduce many national- and global-scale open-access databases with applications in fisheries science and provide an example of how they can be harnessed to perform valuable analyses without additional field efforts. We also discuss how the development, maintenance, and utilization of open-access data are likely to pose technical, financial, and educational challenges to fisheries scientists. The cultural implications that will coincide with the rapidly increasing availability of free data should compel the American Fisheries Society to actively address these problems now to help ease the forthcoming cultural transition.

  17. Design and development of a geo-referenced database to radionuclides in food

    Science.gov (United States)

    Nascimento, L. M. E.; Ferreira, A. C. M.; Gonzalez, S. A.

    2018-03-01

    The primary purpose of the activities concerning information management for environmental assessment is to provide the scientific community with improved access to environmental data, as well as to support the decision-making loop in case of contamination events due to either accidental or intentional causes. In recent years, geotechnologies have become a key reference in environmental research and monitoring, since they deliver efficient retrieval and subsequent processing of data about natural resources. This study aimed at the development of a georeferenced database (SIGLARA – SIstema Georeferenciado Latino Americano de Radionuclídeos em Alimentos), designed for the storage of data on radioactivity in food, available in three languages (Spanish, Portuguese and English) and employing free software.

  18. Design and development of a geo-referenced database to radionuclides in food

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Lucia Maria Evangelista do; Ferreira, Ana Cristina de Melo; Gonzalez, Sergio de Albuquerque, E-mail: anacris@ird.gov.br [Instituto de Radioproteção e Dosimetria (RD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    The primary purpose of the activities concerning information management for environmental assessment is to provide the scientific community with improved access to environmental data, as well as to support the decision-making loop, in case of contamination events due to either accidental or intentional causes. In recent years, geotechnologies have become a key reference in environmental research and monitoring, since they deliver efficient retrieval and subsequent processing of data about natural resources. This study aimed at the development of a georeferenced database (SIGLARA - Sistema Georeferenciado Latino Americano de Radionuclídeos em Alimentos), designed for the storage of data on radioactivity in food, available in three languages (Spanish, Portuguese and English) and employing free software. (author)

  19. Design and development of a geo-referenced database to radionuclides in food

    International Nuclear Information System (INIS)

    Nascimento, Lucia Maria Evangelista do; Ferreira, Ana Cristina de Melo; Gonzalez, Sergio de Albuquerque

    2017-01-01

    The primary purpose of the activities concerning information management for environmental assessment is to provide the scientific community with improved access to environmental data, as well as to support the decision-making loop, in case of contamination events due to either accidental or intentional causes. In recent years, geotechnologies have become a key reference in environmental research and monitoring, since they deliver efficient retrieval and subsequent processing of data about natural resources. This study aimed at the development of a georeferenced database (SIGLARA - Sistema Georeferenciado Latino Americano de Radionuclídeos em Alimentos), designed for the storage of data on radioactivity in food, available in three languages (Spanish, Portuguese and English) and employing free software. (author)

  20. Geo-scientific information system

    International Nuclear Information System (INIS)

    Gedeon, M.; De Soete, H.

    2010-01-01

    Document available in extended abstract form only. In the framework of the geological disposal of radioactive waste, the characterization of the Boom Clay and its environment has been going on for more than 30 years. During this time, a great quantity of data was collected to support the research on the reference host rock. A geo-scientific information system was built to store the data acquired in this framework. The aim was to create a central place where all types of data could be looked up for further analysis and interpretation. All data stored in the system are geographically referenced. The GSIS database was created using the PostgreSQL database with the PostGIS spatial extension. PostgreSQL is an open-source object-relational database management system (ORDBMS) based on POSTGRES, developed at the University of California at Berkeley Computer Science Department. POSTGRES pioneered many concepts that only became available in some commercial database systems much later. PostgreSQL is an open-source descendant of this original Berkeley code. It supports SQL92 and SQL99 and offers many modern features: complex queries, foreign keys, triggers, views, transactional integrity, and multi-version concurrency control. PostGIS is an extension to the PostgreSQL object-relational database system which allows GIS (Geographic Information Systems) objects to be stored in the database. PostGIS includes support for GiST-based R-Tree spatial indexes, and functions for the analysis and processing of GIS objects. The GSIS database consists of three principal database domains, among them the objects database domain (ObjectsDB) and the data domain (DataDB). ObjectsDB includes the definitions (including the geometry/position) and relative hierarchy of the objects. The objects are defined as structures, enclosed areas or scientific instruments with definable geometry (2D or 3D), including samples used to acquire data (boreholes, piezometers, sampling locations, galleries, sensors, etc.). DataDB includes
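
    To illustrate why a spatial index matters for such geographically referenced data, the following toy grid index loosely mimics what PostGIS's GiST-based R-Tree indexes provide: a bounding-box query inspects only candidate cells instead of scanning every object. All object names and coordinates are invented:

```python
from collections import defaultdict

class GridIndex:
    """Toy uniform-grid spatial index (an R-tree stand-in, illustration only)."""
    def __init__(self, cell=100.0):
        self.cell = cell
        self.cells = defaultdict(list)   # (ix, iy) -> [(x, y, obj_id)]

    def _key(self, x, y):
        return (int(x // self.cell), int(y // self.cell))

    def insert(self, x, y, obj_id):
        self.cells[self._key(x, y)].append((x, y, obj_id))

    def query_bbox(self, xmin, ymin, xmax, ymax):
        """Only cells overlapping the box are scanned, not the whole table."""
        hits = []
        kx0, ky0 = self._key(xmin, ymin)
        kx1, ky1 = self._key(xmax, ymax)
        for ix in range(kx0, kx1 + 1):
            for iy in range(ky0, ky1 + 1):
                for x, y, obj_id in self.cells.get((ix, iy), []):
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        hits.append(obj_id)
        return sorted(hits)

idx = GridIndex(cell=100.0)
idx.insert(20, 30, "borehole_A")
idx.insert(250, 40, "piezometer_B")
idx.insert(90, 95, "sensor_C")
print(idx.query_bbox(0, 0, 100, 100))  # -> ['borehole_A', 'sensor_C']
```

    In PostGIS the same effect is obtained declaratively, by creating a GiST index on a geometry column and letting the planner use it for spatial predicates.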

  1. GeoSciML and EarthResourceML Update, 2012

    Science.gov (United States)

    Richard, S. M.; Commission for the Management and Application of Geoscience Information (CGI)

    2012-12-01

    CGI Interoperability Working Group activities during 2012 include deployment of services using the GeoSciML-Portrayal schema, addition of new vocabularies to support properties added in version 3.0, improvements to server software for deploying services, introduction of EarthResourceML v.2 for mineral resources, and collaboration with the IUSS on a markup language for soils information. GeoSciML and EarthResourceML have been used as the basis for the INSPIRE Geology and Mineral Resources specifications respectively. GeoSciML-Portrayal is an OGC GML simple-feature application schema for presentation of geologic map unit, contact, and shear displacement structure (fault and ductile shear zone) descriptions in web map services. Use of standard vocabularies for geologic age and lithology enables map services using shared legends to achieve visual harmonization of maps provided by different services. New vocabularies have been added to the collection of CGI vocabularies provided to support interoperable GeoSciML services, and can be accessed through http://resource.geosciml.org. Concept URIs can be dereferenced to obtain SKOS rdf or html representations using the SISSVoc vocabulary service. New releases of the FOSS GeoServer application greatly improve support for complex XML feature schemas like GeoSciML, and the ArcGIS for INSPIRE extension implements similar complex feature support for ArcGIS Server. These improved server implementations greatly facilitate deploying GeoSciML services. EarthResourceML v2 adds features for information related to mining activities. SoilML provides an interchange format for soil material, soil profile, and terrain information. Work is underway to add GeoSciML to the portfolio of Open Geospatial Consortium (OGC) specifications.

  2. Geo-Mechanical Characterization of Carbonate Rock Masses by Means of Laser Scanner Technique

    Science.gov (United States)

    Palma, Biagio; Parise, Mario; Ruocco, Anna

    2017-12-01

    Knowledge of the geometrical and structural setting of rock masses is crucial to evaluate their stability and to design the most suitable stabilization works. In this work we use Terrestrial Laser Scanning (TLS) at the site of the Grave of the Castellana Caves, a famous show cave in southern Italy. The Grave is the natural access to the cave system, produced by collapse of the vault due to upward progression of instabilities in the carbonate rock masses. It is about 55 m high, bell-shaped, with a maximum width of 120 m. The aim of the work is the characterization of carbonate rock masses from the structural and geo-mechanical standpoints through the use of innovative survey techniques. The TLS survey provides a product consisting of millions of geo-referenced points that can be managed in space and become a suitable database for morphological and geological-structural analysis. Studying a rock face by means of TLS, even where it is partly inaccessible or located in a very complex environment, makes it possible to investigate slopes over their overall areal extent, thus offering advantages both as regards the safety of the workers and the time needed for the survey. In addition to TLS, the traditional approach was also followed by performing scanline surveys along the rims of the Grave, following the ISRM recommendations for the characterization of discontinuities in rock masses. A quantitative comparison between the data obtained by the TLS technique and those deriving from the classical geo-mechanical survey is finally presented, to discuss the potentialities and drawbacks of the different techniques used for surveying the rock masses.

  3. Development of Geo-Marketing

    OpenAIRE

    Tatiana Ozhereleva

    2014-01-01

    This article analyzes the state and development of geo-marketing. The author illustrates the multi-aspect nature of geo-marketing as both an applied technology and a management technology. The article demonstrates that geo-marketing can be viewed as a reflection of the processes of co-evolution in society. The author brings to light the specifics of geo-marketing research and situational analysis in geo-marketing. The article describes applications of geo-marketing.

  4. Development of Geo-Marketing

    Directory of Open Access Journals (Sweden)

    Tatiana Ozhereleva

    2014-10-01

    Full Text Available This article analyzes the state and development of geo-marketing. The author illustrates the multi-aspect nature of geo-marketing as both an applied technology and a management technology. The article demonstrates that geo-marketing can be viewed as a reflection of the processes of co-evolution in society. The author brings to light the specifics of geo-marketing research and situational analysis in geo-marketing. The article describes applications of geo-marketing.

  5. Comparing speed of Web Map Service with GeoServer on ESRI Shapefile and PostGIS

    Directory of Open Access Journals (Sweden)

    Jan Růžička

    2016-07-01

    Full Text Available There are several options for configuring a Web Map Service using several map servers. GeoServer is one of the most popular map servers nowadays. GeoServer is able to read data from several sources. A very popular data source is the ESRI Shapefile. It is well documented, and most software for geodata processing is able to read and write data in this format. Another very popular data store is the PostgreSQL/PostGIS object-relational database. Both data sources have advantages and disadvantages, and the user of GeoServer has to decide which one to use. The paper describes a comparison of the performance of the GeoServer Web Map Service when reading data from an ESRI Shapefile or from a PostgreSQL/PostGIS database.
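
    A timing harness for such a comparison would issue repeated GetMap requests against a layer backed by each store and measure the response times. A minimal sketch of composing a standard WMS 1.1.1 GetMap request follows; the base URL and layer name are placeholders, not the paper's actual setup:

```python
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, size=(512, 512), srs="EPSG:4326",
               fmt="image/png", version="1.1.1"):
    """Compose a WMS GetMap request using the standard OGC WMS 1.1.1 parameters."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",                                # default styling
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),      # minx,miny,maxx,maxy
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)

# Hypothetical local GeoServer endpoint and layer name:
url = getmap_url("http://localhost:8080/geoserver/wms",
                 "demo:roads", (12.0, 48.0, 13.0, 49.0))
print(url)
```

    Fetching this URL in a loop (once for a Shapefile-backed layer, once for a PostGIS-backed one) and timing the responses reproduces the shape of the experiment described above.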

  6. Accessing the public MIMIC-II intensive care relational database for clinical research.

    Science.gov (United States)

    Scott, Daniel J; Lee, Joon; Silva, Ikaro; Park, Shinhyuk; Moody, George B; Celi, Leo A; Mark, Roger G

    2013-01-10

    The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge "Predicting mortality of ICU Patients". QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database.
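
    As an illustration of the kind of SQL exploration these tools support, here is a sketch against an invented miniature table held in SQLite; the table and column names are made up and do not match the real MIMIC-II schema:

```python
import sqlite3

# Stand-in for a MIMIC-II-style cohort query; schema is invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE icu_stays (patient_id INTEGER, los_hours REAL, survived INTEGER)")
conn.executemany("INSERT INTO icu_stays VALUES (?, ?, ?)",
                 [(1, 52.0, 1), (2, 310.5, 0), (3, 24.0, 1), (4, 190.0, 1)])

# Typical aggregate: mortality by length-of-stay bucket.
rows = conn.execute("""
    SELECT CASE WHEN los_hours < 48 THEN 'short' ELSE 'long' END AS bucket,
           COUNT(*) AS n,
           SUM(1 - survived) AS deaths
    FROM icu_stays
    GROUP BY bucket
    ORDER BY bucket
""").fetchall()
print(rows)  # -> [('long', 3, 1), ('short', 1, 0)]
```

    Against the real database the same pattern would be run through QueryBuilder or inside the downloadable VM, where the full relational schema is available.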

  7. For 481 biomedical open access journals, articles are not searchable in the Directory of Open Access Journals nor in conventional biomedical databases.

    Science.gov (United States)

    Liljekvist, Mads Svane; Andresen, Kristoffer; Pommergaard, Hans-Christian; Rosenberg, Jacob

    2015-01-01

    Background. Open access (OA) journals allow access to research papers free of charge to the reader. Traditionally, biomedical researchers use databases like MEDLINE and EMBASE to discover new advances. However, biomedical OA journals might not fulfill such databases' criteria, hindering dissemination. The Directory of Open Access Journals (DOAJ) is a database exclusively listing OA journals. The aim of this study was to investigate DOAJ's coverage of biomedical OA journals compared with the conventional biomedical databases. Methods. Information on all journals listed in four conventional biomedical databases (MEDLINE, PubMed Central, EMBASE and SCOPUS) and DOAJ was gathered. Journals were included if they were (1) actively publishing, (2) full OA, (3) prospectively indexed in one or more databases, and (4) of biomedical subject. Impact factor and journal language were also collected. DOAJ was compared with the conventional databases regarding the proportion of journals covered, along with their impact factor and publishing language. The proportion of journals with articles indexed by DOAJ was determined. Results. In total, 3,236 biomedical OA journals were included in the study. Of the included journals, 86.7% were listed in DOAJ. Combined, the conventional biomedical databases listed 75.0% of the journals; 18.7% in MEDLINE; 36.5% in PubMed Central; 51.5% in SCOPUS and 50.6% in EMBASE. Of the journals in DOAJ, 88.7% published in English and 20.6% had received an impact factor for 2012, compared with 93.5% and 26.0%, respectively, for journals in the conventional biomedical databases. A subset of 51.1% and 48.5% of the journals in DOAJ had articles indexed from 2012 and 2013, respectively. Of journals exclusively listed in DOAJ, one journal had received an impact factor for 2012, and 59.6% of the journals had no content from 2013 indexed in DOAJ. Conclusions. DOAJ is the most complete registry of biomedical OA journals compared with the four conventional biomedical databases.

  8. ASAView: Database and tool for solvent accessibility representation in proteins

    Directory of Open Access Journals (Sweden)

    Fawareh Hamed

    2004-05-01

    Full Text Available Abstract Background Accessible surface area (ASA), or solvent accessibility, of amino acids in a protein has important implications. Knowledge of surface residues helps in locating potential candidates for active sites. Therefore, a method to quickly see the surface residues in a two-dimensional model would help to immediately understand the population of amino acid residues on the surface and in the inner core of the proteins. Results ASAView is an algorithm, an application and a database of schematic representations of solvent accessibility of amino acid residues within proteins. A characteristic two-dimensional spiral plot of solvent accessibility provides a convenient graphical view of residues in terms of their exposed surface areas. In addition, sequential plots in the form of bar charts are also provided. Online plots of the proteins included in the entire Protein Data Bank (PDB) are provided, for each entire protein as well as for its chains separately. Conclusions These graphical plots of solvent accessibility are likely to provide a quick view of the overall topological distribution of residues in proteins. Chain-wise computation of solvent accessibility is also provided.
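
    The spiral layout can be sketched as follows. The placement rule used here (residues ordered by ASA along an Archimedean spiral, most exposed outermost) is an assumption for illustration, not ASAView's exact algorithm:

```python
import math

def spiral_coords(asa_values, turns=3.0):
    """Place residues on an Archimedean spiral, most solvent-exposed outermost.
    Sketch of the general idea only; ASAView's actual layout may differ."""
    order = sorted(range(len(asa_values)), key=lambda i: asa_values[i])
    n = len(asa_values)
    coords = [None] * n
    for rank, i in enumerate(order):
        t = rank / max(n - 1, 1)            # 0 = most buried, 1 = most exposed
        theta = 2 * math.pi * turns * t
        r = t                               # radius grows linearly along the spiral
        coords[i] = (r * math.cos(theta), r * math.sin(theta))
    return coords

asa = [5.2, 120.0, 33.1, 88.4]              # toy accessible-surface areas (A^2)
pts = spiral_coords(asa)
# The residue with the largest ASA lands farthest from the origin:
dist = [math.hypot(x, y) for x, y in pts]
print(dist.index(max(dist)) == asa.index(max(asa)))  # -> True
```

    Feeding such coordinates to any plotting library yields the kind of at-a-glance surface/core separation the abstract describes.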

  9. Ginseng Genome Database: an open-access platform for genomics of Panax ginseng.

    Science.gov (United States)

    Jayakodi, Murukarthick; Choi, Beom-Soon; Lee, Sang-Choon; Kim, Nam-Hoon; Park, Jee Young; Jang, Woojong; Lakshmanan, Meiyappan; Mohan, Shobhana V G; Lee, Dong-Yup; Yang, Tae-Jin

    2018-04-12

    Ginseng (Panax ginseng C.A. Meyer) is a perennial herbaceous plant that has been used in traditional oriental medicine for thousands of years. Ginsenosides, which have significant pharmacological effects on human health, are the foremost bioactive constituents in this plant. Given the importance of this plant to humans, an integrated omics resource is indispensable to facilitate genomic research, molecular breeding and pharmacological study of this herb. The first draft genome sequences of P. ginseng cultivar "Chunpoong" were reported recently. Here, using the draft genome, transcriptome, and functional annotation datasets of P. ginseng, we have constructed the Ginseng Genome Database (http://ginsengdb.snu.ac.kr/), the first open-access platform to provide comprehensive genomic resources of P. ginseng. The current version of this database provides the most up-to-date draft genome sequence (of approximately 3000 Mbp of scaffold sequences) along with the structural and functional annotations for 59,352 genes and digital expression of genes based on transcriptome data from different tissues, growth stages and treatments. In addition, tools for visualization and the genomic data from various analyses are provided. All data in the database were manually curated and integrated within a user-friendly query page. This database provides valuable resources for a range of research fields related to P. ginseng and other species belonging to the Apiales order as well as for plant research communities in general. The Ginseng Genome Database can be accessed at http://ginsengdb.snu.ac.kr/.

  10. Microsoft Access Small Business Solutions State-of-the-Art Database Models for Sales, Marketing, Customer Management, and More Key Business Activities

    CERN Document Server

    Hennig, Teresa; Linson, Larry; Purvis, Leigh; Spaulding, Brent

    2010-01-01

    Database models developed by a team of leading Microsoft Access MVPs that provide ready-to-use solutions for sales, marketing, customer management and other key business activities for most small businesses. As the most popular relational database in the world, Microsoft Access is widely used by small business owners. This book responds to the growing need for resources that help business managers and end users design and build effective Access database solutions for specific business functions. Coverage includes: Elements of a Microsoft Access Database; Relational Data Model; Dealing with C

  11. GENISES: A GIS Database for the Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Beckett, J.

    1991-01-01

    This paper provides a general description of the Geographic Nodal Information Study and Evaluation System (GENISES) database design. The GENISES database is the Geographic Information System (GIS) component of the Yucca Mountain Site Characterization Project Technical Database (TDB). The GENISES database has been developed and is maintained by EG&G Energy Measurements, Inc., Las Vegas, NV (EG&G/EM). As part of the Yucca Mountain Project (YMP) Site Characterization Technical Data Management System, GENISES provides a repository for geographically oriented technical data. The primary objective of the GENISES database is to support the Yucca Mountain Site Characterization Project with an effective tool for describing, analyzing, and archiving geo-referenced data. The database design provides the maximum efficiency in input/output, data analysis, data management and information display. This paper provides the systematic approach or plan for the GENISES database design and operation. The paper also discusses the techniques used for data normalization or the decomposition of complex data structures as they apply to a GIS database. ARC/INFO and INGRES files are linked or joined by establishing 'relate' fields through the common attribute names. Thus, through these keys, ARC can allow access to normalized INGRES files, greatly reducing redundancy and the size of the database.
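
    The relate-field join described above can be illustrated with a toy example: spatial features in one table, attributes in another, joined through a shared key so neither table duplicates the other's data. The table contents and the `site_id` key are invented for illustration:

```python
# Illustrative stand-in for GENISES-style "relate" fields (names invented).
arc_features = [
    {"site_id": 101, "geometry": "POINT(-116.46 36.85)"},
    {"site_id": 102, "geometry": "POINT(-116.43 36.88)"},
]
ingres_attributes = {
    101: {"site_name": "Drill Hole A", "rock_type": "tuff"},
    102: {"site_name": "Drill Hole B", "rock_type": "basalt"},
}

def relate(features, attributes, key="site_id"):
    """Join normalized tables through the common attribute, as a relate field does."""
    return [{**f, **attributes.get(f[key], {})} for f in features]

joined = relate(arc_features, ingres_attributes)
print(joined[0]["site_name"])  # -> Drill Hole A
```

    Keeping attributes normalized in one table and referencing them by key is exactly what reduces the redundancy and size of the combined database.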

  12. Geospacial information utilized under the access control strategy

    Institute of Scientific and Technical Information of China (English)

    TIAN Jie; ZHANG Xin-fang; WANG Tong-yang; XIANG Wei; Cheng Ming

    2007-01-01

    This paper introduces a solution to the security requirements of digital rights management (DRM) in the geospatial field by way of geospatial access control (GeoAC). It concentrates on the issues of authorization for geospatial DRM. One aspect of geospatial DRM is the declaration and enforcement of access rights based on geographic aspects. For the approbation of digital geographic content, it is important to adopt online access to geodata through a spatial data infrastructure (SDI). This results in interoperability requirements on three different levels: the data model level, the service level and the access control level. The interaction between the data model and service levels can be obtained through the standards of the Open Geospatial Consortium (OGC), and the interaction at the access control level may be reached by declaring and enforcing access restrictions in GeoAC. A prototype enforcement based on GeoAC is then elucidated. As one aspect of enforcing usage rights, the execution of access restrictions as an extension to a regular SDI is illuminated.
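
    A geographic access restriction of the kind GeoAC declares can be sketched as a point-in-polygon check: a request for geodata is granted only if it falls inside the requester's permitted region. The `may_access` helper and the region are hypothetical, not the paper's model:

```python
def point_in_polygon(x, y, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count crossings of a ray cast to the right of (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def may_access(request_point, permitted_region):
    """Grant the request only if it falls inside the user's permitted region."""
    return point_in_polygon(*request_point, permitted_region)

region = [(0, 0), (10, 0), (10, 10), (0, 10)]   # hypothetical licensed extent
print(may_access((4, 5), region), may_access((12, 5), region))  # -> True False
```

    A real GeoAC enforcement point would evaluate such spatial predicates against declared rights before the SDI service returns any data.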

  13. HEROD: a human ethnic and regional specific omics database.

    Science.gov (United States)

    Zeng, Xian; Tao, Lin; Zhang, Peng; Qin, Chu; Chen, Shangying; He, Weidong; Tan, Ying; Xia Liu, Hong; Yang, Sheng Yong; Chen, Zhe; Jiang, Yu Yang; Chen, Yu Zong

    2017-10-15

    Genetic and gene expression variations within and between populations and across geographical regions have substantial effects on biological phenotypes, diseases, and therapeutic response. The development of precision medicines can be facilitated by OMICS studies of patients of specific ethnicities and geographic regions. However, facilities for broadly and conveniently accessing such ethnic and regional specific OMICS data are inadequate. Here, we introduce HEROD, a new free human ethnic and regional specific OMICS database. Its first version contains the gene expression data of 53 070 patients of 169 diseases in seven ethnic populations from 193 cities/regions in 49 nations, curated from the Gene Expression Omnibus (GEO), the ArrayExpress Archive of Functional Genomics Data (ArrayExpress), the Cancer Genome Atlas (TCGA) and the International Cancer Genome Consortium (ICGC). Geographic region information for the curated patients was mainly extracted manually from the referenced publications of each original study. These data can be accessed and downloaded via keyword search, world map search, and menu-bar search by disease name, international classification of disease code, geographical region, location of sample collection, ethnic population, gender, age, sample source organ, patient type (patient or healthy), sample type (disease or normal tissue) and assay type on the web interface. The HEROD database is freely accessible at http://bidd2.nus.edu.sg/herod/index.php. The database and web interface are implemented in MySQL, PHP and HTML, with all major browsers supported. phacyz@nus.edu.sg. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  14. A method to implement fine-grained access control for personal health records through standard relational database queries.

    Science.gov (United States)

    Sujansky, Walter V; Faus, Sam A; Stone, Ethan; Brennan, Patricia Flatley

    2010-10-01

    Online personal health records (PHRs) enable patients to access, manage, and share some of their own health information electronically. This capability creates the need for precise access-control mechanisms that restrict the sharing of data to that intended by the patient. The authors describe the design and implementation of an access-control mechanism for PHR repositories that is modeled on the eXtensible Access Control Markup Language (XACML) standard but intended to reduce the cognitive and computational complexity of XACML. The authors implemented the mechanism entirely in a relational database system using ANSI-standard SQL statements. Based on a set of access-control rules encoded as relational table rows, the mechanism determines via a single SQL query whether a user who accesses patient data from a specific application is authorized to perform a requested operation on a specified data object. Testing of this query on a moderately large database demonstrated execution times consistently below 100 ms. The authors include the details of the implementation, including algorithms, examples, and a test database, as supplementary materials. Copyright © 2010 Elsevier Inc. All rights reserved.
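    The idea of rules-as-rows resolved by a single SQL query can be sketched as follows. The schema, rule fields, and deny-wins semantics here are simplified assumptions for illustration, not the paper's actual XACML-modeled design:

    ```python
    import sqlite3

    # Hypothetical minimal access-control table: each row is one rule.
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE acl (
        grantee TEXT, operation TEXT, data_class TEXT, allow INTEGER)""")
    con.executemany("INSERT INTO acl VALUES (?,?,?,?)", [
        ("dr_smith", "read", "medications", 1),
        ("dr_smith", "read", "mental_health", 0),
    ])

    def is_authorized(user, op, data_class):
        # One query decides the request; any matching deny (allow=0) wins,
        # and no matching rule at all (MIN over empty set -> NULL) means deny.
        row = con.execute(
            "SELECT MIN(allow) FROM acl WHERE grantee=? AND operation=? AND data_class=?",
            (user, op, data_class)).fetchone()
        return row[0] == 1

    print(is_authorized("dr_smith", "read", "medications"))    # True
    print(is_authorized("dr_smith", "read", "mental_health"))  # False
    ```

    The default-deny behavior falls out of the aggregate: an empty match yields NULL, which never compares equal to 1.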

  15. GeoSearch: a new virtual globe application for the submission, storage, and sharing of point-based ecological data

    Science.gov (United States)

    Cardille, J. A.; Gonzales, R.; Parrott, L.; Bai, J.

    2009-12-01

    How should researchers store and share data? For most of history, scientists with results and data to share have been largely limited to books and journal articles. In recent decades, the advent of personal computers and shared data formats has made it feasible, though often cumbersome, to transfer data between individuals or among small groups. Meanwhile, the use of automatic samplers, simulation models, and other data-production techniques has increased greatly. The result is that there is more and more data to store, and a greater expectation that they will be available at the click of a button. In 10 or 20 years, will we still send emails to each other to learn about what data exist? The development of, and widespread familiarity with, virtual globes like Google Earth and NASA WorldWind have created the potential, in just the last few years, to revolutionize the way we share data, search for and search through data, and understand the relationship between individual projects in research networks, where sharing and dissemination of knowledge is encouraged. For the last two years, we have been building the GeoSearch application, a cutting-edge online resource for the storage, sharing, search, and retrieval of data produced by research networks. Linking NASA’s WorldWind globe platform, the data browsing toolkit prefuse, and SQL databases, GeoSearch’s version 1.0 enables flexible searches and novel geovisualizations of large amounts of related scientific data. These data may be submitted to the database by individual researchers and processed by GeoSearch’s data parser. Ultimately, data from research groups gathered in a research network would be shared among users via the platform. Access is not limited to the scientists themselves; administrators can determine which data can be presented publicly and which require group membership. Under the auspices of Canada’s Sustainable Forestry Management Network of Excellence, we have created a moderate-sized database

  16. Understanding the patient perspective on research access to national health records databases for conduct of randomized registry trials.

    Science.gov (United States)

    Avram, Robert; Marquis-Gravel, Guillaume; Simard, François; Pacheco, Christine; Couture, Étienne; Tremblay-Gravel, Maxime; Desplantie, Olivier; Malhamé, Isabelle; Bibas, Lior; Mansour, Samer; Parent, Marie-Claude; Farand, Paul; Harvey, Luc; Lessard, Marie-Gabrielle; Ly, Hung; Liu, Geoffrey; Hay, Annette E; Marc Jolicoeur, E

    2018-07-01

    Use of health administrative databases is proposed for screening and monitoring of participants in randomized registry trials. However, access to these databases raises privacy concerns. We assessed patients' preferences regarding use of personal information to link their research records with national health databases, as part of a hypothetical randomized registry trial. Cardiology patients were invited to complete an anonymous self-reported survey that ascertained preferences related to the concept of accessing government health databases for research, the type of personal identifiers to be shared and the type of follow-up preferred as participants in a hypothetical trial. A total of 590 responders completed the survey (90% response rate); most were Caucasian (90.4%) and male (70.0%), with a median age of 65 years (interquartile range, 8). The majority of responders (80.3%) would grant researchers access to health administrative databases for screening and follow-up. To this end, responders endorsed the recording of their personal identifiers by researchers for future record linkage, including their name (90%) and health insurance number (83.9%), but fewer responders agreed with the recording of their social security number (61.4%, pgranting researchers access to the administrative databases (OR: 1.69, 95% confidence interval: 1.03-2.90; p=0.04). The majority of Cardiology patients surveyed supported the use of their personal identifiers to access administrative health databases and conduct long-term monitoring in the context of a randomized registry trial. Copyright © 2018 Elsevier Ireland Ltd. All rights reserved.

  17. Microcomputer Database Management Systems that Interface with Online Public Access Catalogs.

    Science.gov (United States)

    Rice, James

    1988-01-01

    Describes a study that assessed the availability and use of microcomputer database management interfaces to online public access catalogs. The software capabilities needed to effect such an interface are identified, and available software packages are evaluated by these criteria. A directory of software vendors is provided. (4 notes with…

  18. Geo-Neutrinos

    International Nuclear Information System (INIS)

    Dye, S.T.

    2009-01-01

    This paper briefly reviews recent developments in the field of geo-neutrinos. It describes current and future detection projects, discusses modeling projects, suggests an observational program, and visits geo-reactor hypotheses.

  19. Geo-Neutrinos

    Energy Technology Data Exchange (ETDEWEB)

    Dye, S.T. [Department of Physics and Astronomy, University of Hawaii at Manoa, 2505 Correa Road, Honolulu, Hawaii, 96822 (United States); College of Natural Sciences, Hawaii Pacific University, 45-045 Kamehameha Highway, Kaneohe, Hawaii, 96744 (United States)

    2009-03-15

    This paper briefly reviews recent developments in the field of geo-neutrinos. It describes current and future detection projects, discusses modeling projects, suggests an observational program, and visits geo-reactor hypotheses.

  20. JASPAR 2010: the greatly expanded open-access database of transcription factor binding profiles

    DEFF Research Database (Denmark)

    Portales-Casamar, Elodie; Thongjuea, Supat; Kwon, Andrew T

    2009-01-01

    JASPAR (http://jaspar.genereg.net) is the leading open-access database of matrix profiles describing the DNA-binding patterns of transcription factors (TFs) and other proteins interacting with DNA in a sequence-specific manner. Its fourth major release is the largest expansion of the core database...... to an active research community. As binding models are refined by newer data, the JASPAR database now uses versioning of matrices: in this release, 12% of the older models were updated to improved versions. Classification of TF families has been improved by adopting a new DNA-binding domain nomenclature...

  1. Toward an open-access global database for mapping, control, and surveillance of neglected tropical diseases.

    Directory of Open Access Journals (Sweden)

    Eveline Hürlimann

    2011-12-01

    BACKGROUND: After many years of general neglect, interest has grown and efforts came under way for the mapping, control, surveillance, and eventual elimination of neglected tropical diseases (NTDs. Disease risk estimates are a key feature to target control interventions, and serve as a benchmark for monitoring and evaluation. What is currently missing is a georeferenced global database for NTDs providing open-access to the available survey data that is constantly updated and can be utilized by researchers and disease control managers to support other relevant stakeholders. We describe the steps taken toward the development of such a database that can be employed for spatial disease risk modeling and control of NTDs. METHODOLOGY: With an emphasis on schistosomiasis in Africa, we systematically searched the literature (peer-reviewed journals and 'grey literature', contacted Ministries of Health and research institutions in schistosomiasis-endemic countries for location-specific prevalence data and survey details (e.g., study population, year of survey and diagnostic techniques. The data were extracted, georeferenced, and stored in a MySQL database with a web interface allowing free database access and data management. PRINCIPAL FINDINGS: At the beginning of 2011, our database contained more than 12,000 georeferenced schistosomiasis survey locations from 35 African countries available under http://www.gntd.org. Currently, the database is expanded to a global repository, including a host of other NTDs, e.g. soil-transmitted helminthiasis and leishmaniasis. CONCLUSIONS: An open-access, spatially explicit NTD database offers unique opportunities for disease risk modeling, targeting control interventions, disease monitoring, and surveillance. Moreover, it allows for detailed geostatistical analyses of disease distribution in space and time. With an initial focus on schistosomiasis in Africa, we demonstrate the proof-of-concept that the establishment

  2. Toward an Open-Access Global Database for Mapping, Control, and Surveillance of Neglected Tropical Diseases

    Science.gov (United States)

    Hürlimann, Eveline; Schur, Nadine; Boutsika, Konstantina; Stensgaard, Anna-Sofie; Laserna de Himpsl, Maiti; Ziegelbauer, Kathrin; Laizer, Nassor; Camenzind, Lukas; Di Pasquale, Aurelio; Ekpo, Uwem F.; Simoonga, Christopher; Mushinge, Gabriel; Saarnak, Christopher F. L.; Utzinger, Jürg; Kristensen, Thomas K.; Vounatsou, Penelope

    2011-01-01

    Background After many years of general neglect, interest has grown and efforts came under way for the mapping, control, surveillance, and eventual elimination of neglected tropical diseases (NTDs). Disease risk estimates are a key feature to target control interventions, and serve as a benchmark for monitoring and evaluation. What is currently missing is a georeferenced global database for NTDs providing open-access to the available survey data that is constantly updated and can be utilized by researchers and disease control managers to support other relevant stakeholders. We describe the steps taken toward the development of such a database that can be employed for spatial disease risk modeling and control of NTDs. Methodology With an emphasis on schistosomiasis in Africa, we systematically searched the literature (peer-reviewed journals and ‘grey literature’), contacted Ministries of Health and research institutions in schistosomiasis-endemic countries for location-specific prevalence data and survey details (e.g., study population, year of survey and diagnostic techniques). The data were extracted, georeferenced, and stored in a MySQL database with a web interface allowing free database access and data management. Principal Findings At the beginning of 2011, our database contained more than 12,000 georeferenced schistosomiasis survey locations from 35 African countries available under http://www.gntd.org. Currently, the database is expanded to a global repository, including a host of other NTDs, e.g. soil-transmitted helminthiasis and leishmaniasis. Conclusions An open-access, spatially explicit NTD database offers unique opportunities for disease risk modeling, targeting control interventions, disease monitoring, and surveillance. Moreover, it allows for detailed geostatistical analyses of disease distribution in space and time. With an initial focus on schistosomiasis in Africa, we demonstrate the proof-of-concept that the establishment and running of a

  3. Free access to INIS database provides a gateway to nuclear energy research results

    International Nuclear Information System (INIS)

    Tolonen, E.; Malmgren, M.

    2009-01-01

    Free access to the INIS database was opened to all Internet users around the world in May 2009. The article reviews the history of INIS (the International Nuclear Information System), the data acquisition process, database content and search possibilities. INIS focuses on the worldwide literature on the peaceful uses of nuclear energy, and the database is produced in close collaboration with the IEA/ETDE World Energy Base (ETDEWEB), a database covering all aspects of energy. The Nuclear Science Abstracts database (NSA), a comprehensive collection of international nuclear science and technology literature for the period 1948 through 1976, is also briefly discussed in the article. In Finland, the recently formed Aalto University is responsible for collecting and disseminating information (literature) and for preparing input to the INIS and IEA/ETDE databases at the national level.

  4. Entity Linking Leveraging the GeoDeepDive Cyberinfrastructure and Managing Uncertainty with Provenance.

    Science.gov (United States)

    Maio, R.; Arko, R. A.; Lehnert, K.; Ji, P.

    2017-12-01

    Unlocking the full, rich network of links between the scientific literature and the real-world entities to which data correspond - such as field expeditions (cruises) on oceanographic research vessels and physical samples collected during those expeditions - remains a challenge for the geoscience community. Doing so would enable data reuse and integration on a broad scale, making it possible to inspect the network and discover, for example, all rock samples reported in the scientific literature found within 10 kilometers of an undersea volcano, together with their associated geochemical analyses. Such a capability could facilitate new scientific discoveries. The GeoDeepDive project provides negotiated access to 4.2+ million documents from scientific publishers, enabling text and document mining via a public API and cyberinfrastructure. We mined this corpus using entity linking techniques, which are inherently uncertain, and recorded provenance information about each link. This opens the entity linking methodology to scrutiny and enables downstream applications to make informed assessments about the suitability of an entity link for consumption. A major challenge is how to model and disseminate the provenance information. We present results from entity linking between journal articles, research vessels and cruises, and physical samples from the Petrological Database (PetDB), and incorporate Linked Data resources such as cruises in the Rolling Deck to Repository (R2R) catalog where possible. Our work demonstrates the value and potential of the GeoDeepDive cyberinfrastructure in combination with the Linked Data infrastructure provided by the EarthCube GeoLink project. We present a research workflow for capturing provenance information that leverages the World Wide Web Consortium (W3C) PROV Ontology recommendation.
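    A minimal sketch of recording provenance for one uncertain entity link, loosely borrowing PROV-O terms (entity, activity, agent); every identifier, field name, and threshold below is hypothetical, not taken from the GeoDeepDive or GeoLink systems:

    ```python
    from datetime import datetime, timezone

    # Hypothetical provenance record for an entity link between a document
    # and a cruise, carrying a confidence score downstream consumers can
    # inspect before accepting the link.
    def make_link_provenance(source_doc, target_entity, score, matcher):
        return {
            "prov:entity": {"link": (source_doc, target_entity),
                            "confidence": score},
            "prov:activity": {"type": "entity-linking",
                              "endedAtTime": datetime.now(timezone.utc).isoformat()},
            "prov:wasAssociatedWith": {"agent": matcher},
        }

    record = make_link_provenance("doc:EXAMPLE-ARTICLE", "cruise:EXAMPLE-01",
                                  0.87, "string-similarity-matcher-v1")
    # A downstream application applies its own suitability threshold.
    accept = record["prov:entity"]["confidence"] >= 0.8
    print(accept)
    ```

    Keeping the matcher and timestamp alongside each link is what opens the linking methodology to scrutiny, as the abstract describes.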

  5. Geo-Seas: delivering harmonised marine geoscience data on a European scale

    Science.gov (United States)

    Glaves, Helen; Schaap, Dick

    2013-04-01

    A large amount of both raw and interpreted marine geoscience data is held by an array of European organisations, but its discovery and re-use can be very difficult. The data are stored in a variety of different formats, and a range of different nomenclatures, scales and co-ordinate systems is used at the organisational, national and international level. This lack of standardisation hinders users' ability to locate and access these datasets or to use them in an integrated way. The Geo-Seas project, an EU-funded Framework 7 initiative, has addressed these barriers to the re-use of marine geological and geophysical data through the development of an on-line data discovery and access service (http://www.geo-seas.eu). It allows the end user to identify, evaluate and download a range of standardised marine geoscience data sets from 26 federated data centres across 17 European maritime countries. The dedicated portal, which currently provides access to more than 100,000 datasets, has been developed by adopting and adapting the existing technologies, standards and methodologies developed by the SeaDataNet project for the management and delivery of oceanographic data. Through the re-use of this pre-existing architecture, including the associated common standards and vocabularies, a joint infrastructure for both marine geoscientific and oceanographic data has been created which supports the development of multidisciplinary ocean science. The Geo-Seas project has also brought together and incorporated the metadata services developed by previous EU-funded projects such as EUSeaSed and SEISCAN. The formats of this legacy metadata have not only been used as the basis for developing the Geo-Seas metadatabase, but have also led to these pre-existing metadata catalogues being upgraded to current international standards.
The Geo-Seas initiative has led to a major improvement in the availability of standardised marine geoscientific data throughout Europe, allowing end users better

  6. TRANSNET -- access to radioactive and hazardous materials transportation codes and databases

    International Nuclear Information System (INIS)

    Cashwell, J.W.

    1992-01-01

    TRANSNET has been developed and maintained by Sandia National Laboratories under the sponsorship of the United States Department of Energy (DOE) Office of Environmental Restoration and Waste Management to permit outside access to computerized routing, risk and systems analysis models, and associated databases. The goal of the TRANSNET system is to enable transfer of transportation analytical methods and data to qualified users by permitting direct, timely access to up-to-date versions of the codes and data. The TRANSNET facility comprises a dedicated computer with telephone ports on which these codes and databases are adapted, modified, and maintained. To reach the widest spectrum of outside users, TRANSNET is designed to minimize hardware and documentation requirements. The user is thus required to have only an IBM-compatible personal computer, a Hayes-compatible modem with communications software, and a telephone. Maintenance and operation of the TRANSNET facility are underwritten by the program sponsor(s), as are updates to the respective models and data; thus the only charges to the user of the system are telephone hookup charges. TRANSNET provides access to the most recent versions of the models and data developed by or for Sandia National Laboratories. Code modifications made since the last published documentation are noted to the user on the introductory screens. User-friendly interfaces have been developed for each of the codes and databases on TRANSNET. In addition, users are provided with default input data sets for typical problems, which can either be used directly or edited. Direct transfers of analytical or data files between codes are provided to permit the user to perform complex analyses with a minimum of input. Recent developments to the TRANSNET system include its use to pass data files directly between national and international users, as well as the development and integration of graphical depiction techniques.

  7. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    Science.gov (United States)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub, or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and in educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.

  8. Using the GeoFEST Faulted Region Simulation System

    Science.gov (United States)

    Parker, Jay W.; Lyzenga, Gregory A.; Donnellan, Andrea; Judd, Michele A.; Norton, Charles D.; Baker, Teresa; Tisdale, Edwin R.; Li, Peggy

    2004-01-01

    GeoFEST (the Geophysical Finite Element Simulation Tool) simulates stress evolution, fault slip and plastic/elastic processes in realistic materials, and so is suitable for earthquake cycle studies in regions such as Southern California. Many new capabilities and means of access for GeoFEST are now supported. New abilities include MPI-based cluster parallel computing using automatic PYRAMID/ParMETIS-based mesh partitioning, automatic mesh generation for layered media with rectangular faults, and results visualization that is integrated with remote sensing data. The parallel GeoFEST application has been successfully run on over a half-dozen computers, including Intel Xeon clusters, Itanium II and Altix machines, and the Apple G5 cluster. It is not separately optimized for different machines, but relies on good domain partitioning for load balance and low communication, and on careful writing of the parallel diagonally preconditioned conjugate gradient solver to keep communication overhead low. Demonstrated thousand-step solutions for over a million finite elements on 64 processors require under three hours, and scaling tests show high efficiency when using more than roughly 4,000 elements per processor. The source code and documentation for GeoFEST are available at no cost from the Open Channel Foundation. In addition, GeoFEST may be used through a browser-based portal environment available to approved users. That environment includes semi-automated geometry creation and mesh generation tools, GeoFEST, and RIVA-based visualization tools, including the ability to generate a flyover animation showing deformations and topography. Work is in progress to support simulation of a region with several faults using 16 million elements, using a strain energy metric to adapt the mesh to faithfully represent the solution in regions of widely varying strain.
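    The solver at the heart of GeoFEST's scaling figures, a diagonally (Jacobi) preconditioned conjugate gradient method, can be illustrated with a minimal serial sketch. This is standard textbook PCG on a tiny dense system, not GeoFEST source code, which operates in parallel on sparse finite element matrices:

    ```python
    # Jacobi-preconditioned conjugate gradient for a symmetric positive
    # definite system A x = b. The preconditioner is just 1/diag(A), which
    # requires no communication when rows are distributed across processors.
    def pcg(A, b, tol=1e-10, max_iter=100):
        n = len(b)
        x = [0.0] * n
        r = b[:]                                    # residual, since x = 0
        minv = [1.0 / A[i][i] for i in range(n)]    # diagonal preconditioner
        z = [minv[i] * r[i] for i in range(n)]
        p = z[:]
        rz = sum(r[i] * z[i] for i in range(n))
        for _ in range(max_iter):
            Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
            alpha = rz / sum(p[i] * Ap[i] for i in range(n))
            x = [x[i] + alpha * p[i] for i in range(n)]
            r = [r[i] - alpha * Ap[i] for i in range(n)]
            if sum(ri * ri for ri in r) ** 0.5 < tol:
                break
            z = [minv[i] * r[i] for i in range(n)]
            rz_new = sum(r[i] * z[i] for i in range(n))
            p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
            rz = rz_new
        return x

    # Small illustrative SPD system (exact solution: x = 2/9, y = 1/9, z = 13/9).
    A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
    b = [1.0, 2.0, 3.0]
    x = pcg(A, b)
    print(x)
    ```

    In the parallel setting, the matrix-vector product and the dot products are the only steps that need inter-processor communication, which is why domain partitioning quality dominates scaling.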

  9. GeoBrain for Facilitating Earth Science Education in Higher-Education Institutes--Experience and Lessons-learned

    Science.gov (United States)

    Deng, M.; di, L.

    2007-12-01

    Data integration and analysis are the foundation of scientific investigation in Earth science. In the past several decades, huge amounts of Earth science data have been collected, mainly through remote sensing, and those data have become a treasure for Earth science research. Training students to discover and use this huge volume of Earth science data in research has become one of the most important parts of preparing a qualified scientist. Developed by a NASA-funded project, the GeoBrain system has adopted and implemented the latest Web services and knowledge management technologies to provide innovative methods for publishing, accessing, visualizing, and analyzing geospatial data and for building and sharing geoscience knowledge. It provides a data-rich online learning and research environment enabled by the wealth of data and information available in the NASA Earth Observing System (EOS) Data and Information System (EOSDIS). Students, faculty members, and researchers from institutes worldwide can easily access, analyze, and model with the huge amount of NASA EOS data just as if they possessed such vast resources locally at their desktops. Although still in development, the GeoBrain system has been operational since 2005. A number of educational materials have been developed to facilitate the use of GeoBrain as a powerful education tool for Earth science education at both the undergraduate and graduate levels. Thousands of online higher-education users worldwide have used GeoBrain services. A number of faculty members at multiple universities have been funded as GeoBrain education partners to explore the use of GeoBrain in classroom teaching and student research. By summarizing and analyzing the feedback from the online users and the education partners, this presentation presents user experiences with GeoBrain in Earth science teaching and research. The feedback on classroom use of GeoBrain has demonstrated that GeoBrain is very useful for

  10. MoMoSat -- Mobile Service for Monitoring with GeoNotes via Satellite

    Energy Technology Data Exchange (ETDEWEB)

    Niemeyer, Irmgard [Forschungszentrum Juelich (Germany). Programme Group Systems Analysis and Technology Evaluation (STE); Jonas, Karl [Univ. of Applied Science Bonn-Rhein-Sieg, Sankt Augustin (Germany). FhG FOKUS CC SATCom; Horz, Alexander [horz informatik, Sankt Augustin (Germany); Wettschereck, Dietrich; Schmidt, Dirk [DIALOGIS GmbH, Bonn (Germany)

    2003-05-01

    The MoMoSat service will enable mobile end-users to view, manage, annotate, and communicate map-based information in the field. The handled information consists of a huge volume of raster data (satellite or aerial images) and vector data (i.e. street networks, cadastral maps or points of interest), as well as geo-referenced textual notes (the so-called 'GeoNotes') and real-time voice. Secure real-time communication between mobile units and the primary data store is an essential task of the MoMoSat service. The basic information is stored in the primary database, which is accessible through a virtual private network (VPN) and cached at a server at a base station in order to ensure data availability. The base station may be installed in a car or another mobile vehicle. The two servers periodically communicate with each other via secure satellite communication in order to check for updates. The base station supplies the relevant GIS data for the mobile units (people or even robots in the field at remote locations). The communication between the mobile units is based on a peer-to-peer wireless local area network (WLAN) architecture. The mobile units are equipped with mobile computers (i.e. laptop, tablet PC or PDA) combined with a satellite-based positioning system (GPS) that enables them to request the proper geographic data sets from the base station's map server. An interactive mapping application shows the current location on the map and allows the user to navigate (zoom, pan) through the high-resolution map display. The user can switch several thematic layers (i.e. street network or points of interest) 'on' or 'off' on the map. The software also supports the collaborative aspects of MoMoSat by offering tools for the management of the GeoNotes, which can be visualized by category. Users can extend existing GeoNotes with their personal comments or create new GeoNotes by defining categories, recipients and the level of

  11. Extending the Intermediate Data Structure (IDS) for longitudinal historical databases to include geographic data

    Directory of Open Access Journals (Sweden)

    Finn Hedefalk

    2014-09-01

    The Intermediate Data Structure (IDS) is a standardised database structure for longitudinal historical databases. Such a common structure facilitates data sharing and comparative research. In this study, we propose an extended version of IDS, named IDS-Geo, which also includes geographic data. The geographic data that will be stored in IDS-Geo are primarily buildings and/or property units, and the purpose of these geographic data is mainly to link individuals to places in space. When we want to assign such detailed spatial locations to individuals (in times before any detailed house addresses were available), we often have to create tailored geographic datasets. In those cases, there are benefits to storing geographic data in the same structure as the demographic data. Moreover, we propose the export of data from IDS-Geo using an eXtensible Markup Language (XML) Schema. IDS-Geo is implemented in a case study using historical property units, for the period 1804 to 1913, stored in a geographically extended version of the Scanian Economic Demographic Database (SEDD). To fit into the IDS-Geo data structure, we included an object lifeline representation of all of the property units (based on the snapshot time representation of single historical maps and poll-tax registers). The case study verifies that the IDS-Geo model is capable of handling geographic data that can be linked to demographic data.
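    The XML export described above can be illustrated with a short sketch. The element names, attributes, coordinate reference, and the geometry placeholder below are hypothetical inventions, not the actual IDS-Geo XML Schema:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical serialization of one property unit's lifeline, linking an
    # individual to a place over a time interval, in the spirit of IDS-Geo.
    unit = ET.Element("PropertyUnit", id="PU-001")
    lifeline = ET.SubElement(unit, "Lifeline", start="1804", end="1913")
    geom = ET.SubElement(lifeline, "Geometry", srs="EPSG:3006")
    geom.text = "POLYGON((...))"  # placeholder; a real export would carry coordinates
    ET.SubElement(unit, "IndividualLink",
                  {"individual": "IND-042", "from": "1850", "to": "1862"})

    xml_text = ET.tostring(unit, encoding="unicode")
    print(xml_text)
    ```

    The point of the lifeline representation is visible in the nesting: the geometry is valid for the unit's whole existence, while each individual link carries its own, shorter interval.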

  12. A global reference database of crowdsourced cropland data collected using the Geo-Wiki platform

    NARCIS (Netherlands)

    Laso Bayas, JC; Lesiv, M; Waldner, F; Schucknecht, A; Duerauer, M; See, L; Fritz, S.; Fraisl, D; Moorthy, I; McCallum, I.; Perger, C; Danylo, O; Defourny, P; Gallego, J; Gilliams, S; Akhtar, I.H.; Baishya, S. J.; Baruah, M; Bungnamei, K; Campos, A; Changkakati, T; Cipriani, A; Das, Krishna; Das, Keemee; Das, I; Davis, K.F.; Hazarika, P; Johnson, B.A.; Malek, Ziga; Molinari, M.E.; Panging, K; Pawe, C.K.; Pérez-Hoyos, A; Sahariah, P.K.; Sahariah, D; Saikia, A; Saikia, M; Schlesinger, Peter; Seidacaru, E; Singha, K; Wilson, John W

    2017-01-01

    A global reference data set on cropland was collected through a crowdsourcing campaign using the Geo-Wiki crowdsourcing tool. The campaign lasted three weeks, with over 80 participants from around the world reviewing almost 36,000 sample units, focussing on cropland identification. For quality

  13. GeoMapApp as a platform for visualizing marine data from Polar Regions

    Science.gov (United States)

    Nitsche, F. O.; Ryan, W. B.; Carbotte, S. M.; Ferrini, V.; Goodwillie, A. M.; O'hara, S. H.; Weissel, R.; McLain, K.; Chinhong, C.; Arko, R. A.; Chan, S.; Morton, J. J.; Pomeroy, D.

    2012-12-01

    To maximize the investment in expensive fieldwork, the resulting data should be re-used as much as possible. In addition, unnecessary duplication of data collection effort should be avoided. This becomes even more important when access to field areas is as difficult and expensive as it is in Polar Regions. Making existing data discoverable in an easy-to-use platform is key to improving re-use and avoiding duplication. A common obstacle is that use of existing data is often limited to specialists who know of the data's existence and also have the right tools to view and analyze them. GeoMapApp is a free, interactive, map-based tool that allows users to discover, visualize, and analyze a large number of data sets. In addition to a global view, it provides polar map projections for displaying data in Arctic and Antarctic areas. Data that have currently been added to the system include Arctic swath bathymetry data collected from the USCG icebreaker Healy. These data are collected almost continuously, including on cruises where bathymetry is not the main objective and for which the existence of the acquired data may not be well known. In contrast, the existence of seismic data from the Antarctic continental margin is well known in the seismic community. They are archived at and can be accessed through the Antarctic Seismic Data Library System (SDLS). Incorporating these data into GeoMapApp makes an even broader community aware of them, and the custom interface, which includes capabilities to visualize and explore these data, allows users without specific software or knowledge of the underlying data format to access the data. In addition to investigating these datasets, GeoMapApp provides links to the actual data sources to allow specialists the opportunity to re-use the original data. Identification of data sources and data references is achieved at different levels. 
For access to the actual Antarctic seismic data, GeoMapApp links to the SDLS site, where users have

  14. GeoChronos: An On-line Collaborative Platform for Earth Observation Scientists

    Science.gov (United States)

    Gamon, J. A.; Kiddle, C.; Curry, R.; Markatchev, N.; Zonta-Pastorello, G., Jr.; Rivard, B.; Sanchez-Azofeifa, G. A.; Simmonds, R.; Tan, T.

    2009-12-01

    Recent advances in cyberinfrastructure are offering new solutions to the growing challenges of managing and sharing large data volumes. Web 2.0 and social networking technologies provide the means for scientists to collaborate and share information more effectively. Cloud computing technologies can provide scientists with transparent and on-demand access to applications served over the Internet in a dynamic and scalable manner. Semantic Web technologies allow data to be linked together in a manner understandable by machines, enabling greater automation. Combining these technologies enables the creation of very powerful platforms. GeoChronos (http://geochronos.org/), part of a CANARIE Network Enabled Platforms project, is an online collaborative platform that incorporates these technologies to enable members of the earth observation science community to share data and scientific applications and to collaborate more effectively. The GeoChronos portal is built on an open source social networking platform called Elgg. Elgg provides a full set of social networking functionalities similar to Facebook, including blogs, tags, media/document sharing, wikis, friends/contacts, groups, discussions, message boards, calendars, status, activity feeds and more. An underlying cloud computing infrastructure enables scientists to access dynamically provisioned applications via the portal for visualizing and analyzing data. Users are able to access and run the applications from any computer that has a Web browser and Internet connectivity, and do not need to manage and maintain the applications themselves. Semantic Web technologies, such as the Resource Description Framework (RDF), are being employed for relating and linking together spectral, satellite, meteorological and other data. Social networking functionality plays an integral part in facilitating the sharing of data and applications. 
Examples of recent GeoChronos users during the early testing phase have

  15. Fluid migration through geo-membrane seams and through the interface between geo-membrane and geo-synthetic clay liner

    International Nuclear Information System (INIS)

    Barroso, M.

    2005-03-01

    Composite liners are used to limit contaminant migration from landfills. Their successful performance is closely related to the geo-membrane, as it provides the primary barrier to diffusive and advective transport of contaminants. Critical issues in the performance of geo-membranes are the seams between geo-membrane panels and the inevitable defects resulting, for instance, from inadequate installation activities. In landfills, where high density polyethylene geo-membranes are usually used, seams are typically made by the thermal-hot dual wedge method. A literature review on quality control of the seams showed that, in situ, the fluid-tightness of seams is evaluated only in qualitative terms (pass/fail criteria), despite its importance for ensuring appropriate performance of the geo-membranes as barriers. In addition, a synthesis of studies on geo-membrane defects indicated that defects varying in density from 0.7 to 15.3 per hectare can be found in landfills. Defects represent preferential flow paths for leachate. Various authors have developed analytical solutions and empirical equations for predicting the flow rate through composite liners due to defects in the geo-membrane. The validity of these methods for composite liners comprising a geo-membrane over a geo-synthetic clay liner (GCL) over a compacted clay liner (CCL) has never been studied from an experimental point of view. To address the problem of fluid migration through geo-membrane seams, an attempt is made to provide a test method, herein termed the 'gas permeation pouch test', for assessing the quality of the thermal-hot dual wedge seams. This test consists of pressurizing the air channel formed by the double seam with a gas to a specific pressure and then measuring the decrease in pressure over time. From the pressure decrease, both the gas permeation coefficients, in steady state conditions, and the time constant, in unsteady state conditions, can be estimated. 
Experiments were carried out

  16. EarthCube GeoLink: Semantics and Linked Data for the Geosciences

    Science.gov (United States)

    Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Cheatham, M.; Fils, D.; Hitzler, P.; Janowicz, K.; Ji, P.; Jones, M. B.; Krisnadhi, A.; Lehnert, K. A.; Mickle, A.; Narock, T.; O'Brien, M.; Raymond, L. M.; Schildhauer, M.; Shepherd, A.; Wiebe, P. H.

    2015-12-01

    The NSF EarthCube initiative is building next-generation cyberinfrastructure to aid geoscientists in collecting, accessing, analyzing, sharing, and visualizing their data and knowledge. The EarthCube GeoLink Building Block project focuses on a specific set of software protocols and vocabularies, often characterized as the Semantic Web and "Linked Data", to publish data online in a way that is easily discoverable, accessible, and interoperable. GeoLink brings together specialists from the computer science, geoscience, and library science domains, and includes data from a network of NSF-funded repositories that support scientific studies in marine geology, marine ecosystems, biogeochemistry, and paleoclimatology. We are working collaboratively with closely-related Building Block projects including EarthCollab and CINERGI, and solicit feedback from RCN projects including Cyberinfrastructure for Paleogeosciences (C4P) and iSamples. GeoLink has developed a modular ontology that describes essential geoscience research concepts; published data from seven collections (to date) on the Web as geospatially-enabled Linked Data using this ontology; matched and mapped data between collections using shared identifiers for investigators, repositories, datasets, funding awards, platforms, research cruises, physical specimens, and gazetteer features; and aggregated the results in a shared knowledgebase that can be queried via a standard SPARQL endpoint. Client applications have been built around the knowledgebase, including a Web/map-based data browser using the Leaflet JavaScript library and a simple query service using the OpenSearch format. Future development will include extending and refining the GeoLink ontology, adding content from additional repositories, developing semi-automated algorithms to enhance metadata, and further work on client applications.

  17. GeoInquiries: Addressing a Grand Challenge for Teaching with GIS in Schools

    Science.gov (United States)

    DiBiase, D.; Baker, T.

    2016-12-01

    According to the National Research Council (2006), geographic information systems (GIS) are a powerful tool for expanding students' abilities to think spatially, a critical skill for future STEM professionals. However, educators in mainstream subjects in U.S. education have struggled for decades to use GIS effectively in classrooms. GeoInquiries are no-cost, standards-based (NGSS or AP), Creative Commons-licensed instructional activities that guide inquiry around map-based concepts found in key subjects like Earth and environmental science. Web maps developed for GeoInquiries expand upon printed maps in leading textbooks by taking advantage of 21st-century GIS capabilities. GeoInquiry collections consist of 15 activities, each chosen to offer a map-based activity every few weeks throughout the school year. GeoInquiries use a common inquiry instructional framework, learned by many educators during their teacher preparation coursework. GeoInquiries are instructionally flexible, acting as much like building blocks for crafting custom activities as finished instructional materials. Over half a million GeoInquiries will be accessed in the next twelve months, serving an anticipated 15 million students. After a generation of outreach to educators, GIS is finally finding its way into the mainstream.

  18. Geo-energy Test Beds: part of the European Plate Observing System

    Science.gov (United States)

    Stephenson, Michael; Schofield, David; Luton, Christopher; Haslinger, Florian; Henninges, Jan; Giardini, Domenico

    2016-04-01

    For 2020, the EU has committed to cutting its greenhouse gas emissions to 20% below 1990 levels and further cuts are being decided for 2050. This commitment is one of the headline targets of the Europe 2020 growth strategy and is being implemented through binding legislation. This decarbonisation of the EU economy is one dimension of an overall EU energy and climate framework that is mutually interlinked with the need to ensure energy security, promote a fully integrated energy market, promote energy efficiency and promote research innovation and competitiveness. Power generation will have to take a particularly large part in emissions reductions (-54 to -68% by 2030 and -93 to -99% by 2050), mainly by focussing on increasing surface renewables (wind, tidal and solar) but also on carbon capture and storage on fossil fuel and biofuel power plants, shale gas, nuclear and geothermal power. All the above generation technologies share common geological challenges around containment, safety and environmental sustainability. In a densely populated continent, this means that high levels of subsurface management are needed to fully realise the energy potential. In response to this need, across Europe, public and private sector funded, experimental test and monitoring facilities and infrastructures (Geo-energy Test Beds, GETB) are being developed. These GETB investigate the processes, technology and practices that facilitate the sustainable exploitation of Geo-energy resources and are of intense interest to the public and regulators alike. The vision of EPOS IP Work Package 17 (wp17) is to promote research and innovation in Geo-energy that reflects core European energy priorities through provision of virtual access to data and protocols and trans-national access to GETB experiments. 
This will be achieved through provision of access to continuous strategic observations, promotion of the integrated use of data and models from European GETB, development of underpinning research

  19. Geo-neutrino Observation

    International Nuclear Information System (INIS)

    Dye, S. T.; Alderman, M.; Batygov, M.; Learned, J. G.; Matsuno, S.; Mahoney, J. M.; Pakvasa, S.; Rosen, M.; Smith, S.; Varner, G.; McDonough, W. F.

    2009-01-01

    Observations of geo-neutrinos measure radiogenic heat production within the earth, providing information on the thermal history and dynamic processes of the mantle. Two detectors currently observe geo-neutrinos from underground locations. Other detection projects in various stages of development include a deep ocean observatory. This paper presents the current status of geo-neutrino observation and describes the scientific capabilities of the deep ocean observatory, with emphasis on geology and neutrino physics.

  20. Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals

    Science.gov (United States)

    Zamyadi, A.; Pouliot, J.; Bédard, Y.

    2013-09-01

    Accessing 3D geospatial models, eventually at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR or classic topographic data. Metadata, which describes the content of data, is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information differ greatly from one geo-portal to another, and even for similar 3D resources within the same geo-portal. The inventory considered 971 data resources affiliated with elevation. 51% of them were from three geo-portals running at Canadian federal and municipal levels whose metadata resources did not consider 3D models by any definition. Among the remaining 49%, which do refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve 3D geospatial model discovery in geo-portals by adding a specific metadata-set. Based on knowledge and current practices in 3D modeling, and in 3D data acquisition and management, a set of metadata is proposed to increase its suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes. These classes are grouped in three packages: General and Complementary, covering contextual and structural information, and Availability, covering the transition from storage to delivery format. 
The proposed metadata set is compared with Canadian Geospatial

  1. Geo-Spatial Support for Assessment of Anthropic Impact on Biodiversity

    Directory of Open Access Journals (Sweden)

    Marco Piragnolo

    2014-04-01

    Full Text Available This paper discusses a methodology in which geo-spatial analysis tools are used to quantify the risk derived from anthropic activities on habitats and species. The method has been developed with a focus on simplification and on the quality of the standard procedures set out for flora and fauna protected by the European Directives. In this case study, the DPSIR framework (Drivers, Pressures, State, Impacts, Responses) is applied using spatial procedures in a geographical information system (GIS) framework. This approach can be inserted in a multidimensional space, as the analysis is applied to each threat, pressure and activity and also to each habitat and species, at the spatial and temporal scale. Threats, pressures and activities, stresses and indicators can be managed by means of a geo-database and analyzed using spatial analysis functions in a tested GIS workflow environment. The method applies a matrix with risk values, and the final product is a geo-spatial representation of impact indicators, which can be used as a support for decision-makers at various levels (regional, national and European).
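    The risk-matrix step can be sketched as follows. This is a toy illustration under assumed names and scores; the paper's actual matrix, pressures, and aggregation rule (summation is assumed here) are not specified in the abstract.

```python
# Toy DPSIR-style risk matrix: keys are (pressure, habitat) pairs,
# values are expert-assigned risk scores. All names and numbers are
# made up for illustration.
pressures = ["agriculture", "urbanisation", "tourism"]
habitats = ["wetland", "grassland"]
risk = {
    ("agriculture", "wetland"): 3, ("agriculture", "grassland"): 2,
    ("urbanisation", "wetland"): 2, ("urbanisation", "grassland"): 3,
    ("tourism", "wetland"): 1, ("tourism", "grassland"): 2,
}

def impact_indicator(habitat):
    """Aggregate risk over all pressures for one habitat (sum assumed)."""
    return sum(risk[(p, habitat)] for p in pressures)

indicators = {h: impact_indicator(h) for h in habitats}
print(indicators)  # {'wetland': 6, 'grassland': 7}
```

    In a real GIS workflow each indicator would then be joined back to the habitat geometries for map display.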

  2. The Ruby UCSC API: accessing the UCSC genome database using Ruby.

    Science.gov (United States)

    Mishima, Hiroyuki; Aerts, Jan; Katayama, Toshiaki; Bonnal, Raoul J P; Yoshiura, Koh-ichiro

    2012-09-21

    The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index, if available, when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will facilitate biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help is provided via the website at http://rubyucscapi.userecho.com/.
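    The bin index mentioned above is the standard UCSC binning scheme for fast interval lookups. As a language-neutral illustration (the API itself is written in Ruby), a minimal Python sketch of that scheme:

```python
def ucsc_bin(start, end):
    """Smallest standard UCSC bin fully containing the 0-based
    half-open interval [start, end); this mirrors the scheme the
    'bin' column in many UCSC tables is built on."""
    end -= 1  # make the end inclusive
    for shift, offset in ((17, 585), (20, 73), (23, 9), (26, 1), (29, 0)):
        if (start >> shift) == (end >> shift):
            return offset + (start >> shift)
    raise ValueError("interval out of range for the standard scheme")

def overlapping_bins(start, end):
    """All bins whose features could overlap [start, end); useful for
    building a 'WHERE bin IN (...)' clause."""
    end -= 1
    bins = [0]  # bin 0 spans the whole sequence
    for shift, offset in ((26, 1), (23, 9), (20, 73), (17, 585)):
        bins.extend(range(offset + (start >> shift), offset + (end >> shift) + 1))
    return bins

print(ucsc_bin(0, 100000))        # a 128 kb-level bin
print(overlapping_bins(0, 100000))
```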

  3. The Ruby UCSC API: accessing the UCSC genome database using Ruby

    Science.gov (United States)

    2012-01-01

    Background The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. Results The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index—if available—when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Conclusions Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will facilitate biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help is provided via the website at http://rubyucscapi.userecho.com/. PMID:22994508

  4. The Ruby UCSC API: accessing the UCSC genome database using Ruby

    Directory of Open Access Journals (Sweden)

    Mishima Hiroyuki

    2012-09-01

    Full Text Available Abstract Background The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. Results The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index, if available, when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Conclusions Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will facilitate biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help is provided via the website at http://rubyucscapi.userecho.com/.

  5. PostGIS-Based Heterogeneous Sensor Database Framework for the Sensor Observation Service

    Directory of Open Access Journals (Sweden)

    Ikechukwu Maduako

    2012-10-01

    Full Text Available Environmental monitoring and management systems in most cases deal with models and spatial analytics that involve the integration of in-situ and remote sensor observations. In-situ sensor observations and those gathered by remote sensors are usually provided by different databases and services in real-time dynamic services such as the Geo-Web Services. Thus, data have to be pulled from different databases and transferred over the network before they are fused and processed on the service middleware. This imposes heavy and unnecessary communication and processing load on the service: large rasters are downloaded from flat-file data sources each time a request is made, and the integration and geo-processing workload falls on the service middleware when it could be better leveraged at the database level. In this paper, we propose and present a heterogeneous sensor database framework for the integration, geo-processing and spatial analysis of remote and in-situ sensor observations at the database level. We also show how this framework can be integrated into the Sensor Observation Service (SOS) to reduce communication and workload on geospatial Web services, and to make query requests from the user end more flexible.
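    The core argument, push processing into the database instead of transferring every observation to the middleware, can be shown with a minimal sketch. SQLite stands in for PostGIS here, and the table and column names are illustrative assumptions.

```python
import sqlite3

# Sketch of database-level processing: the aggregation runs inside the
# database, so one row per sensor crosses the wire instead of the full
# observation set. SQLite stands in for PostGIS; names are illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE observation (sensor_id TEXT, value REAL)")
con.executemany(
    "INSERT INTO observation VALUES (?, ?)",
    [("s1", 10.0), ("s1", 14.0), ("s2", 7.0), ("s2", 9.0)],
)

rows = con.execute(
    "SELECT sensor_id, AVG(value) FROM observation "
    "GROUP BY sensor_id ORDER BY sensor_id"
).fetchall()
print(rows)  # [('s1', 12.0), ('s2', 8.0)]
```

    With PostGIS the same idea extends to spatial operators (e.g. intersections and raster algebra) executed in SQL rather than in the service middleware.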

  6. Physical Access Control Database -

    Data.gov (United States)

    Department of Transportation — This data set contains the personnel access card data (photo, name, activation/expiration dates, card number, and access level) as well as data about turnstiles and...

  7. Study on Mandatory Access Control in a Secure Database Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper proposes a security policy model for mandatory access control in a class B1 database management system whose level of labeling is the tuple. The relation-hierarchical data model is extended to a multilevel relation-hierarchical data model. Based on the multilevel relation-hierarchical data model, the concept of upper-lower layer relational integrity is presented after we analyze and eliminate the covert channels caused by the database integrity. Two SQL statements are extended to process polyinstantiation in the multilevel secure environment. The system is based on the multilevel relation-hierarchical data model and is capable of integratively storing and manipulating multilevel complicated objects (e.g., multilevel spatial data) and multilevel conventional data (e.g., integer, real number and character string).
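    Tuple-level mandatory access control of this kind is conventionally mediated by label comparison in the style of the Bell-LaPadula rules. The sketch below is a generic illustration of that idea, not the paper's model; level names and the tuple layout are assumptions.

```python
# Generic sketch of tuple-level mandatory access control: every subject
# and tuple carries a security level, and reads/writes are mediated by
# label comparison (Bell-LaPadula style). Names are illustrative.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2}

def can_read(subject_level, tuple_level):
    """'No read up': a subject may read tuples at or below its level."""
    return LEVELS[subject_level] >= LEVELS[tuple_level]

def can_write(subject_level, tuple_level):
    """'No write down': a subject may write tuples at or above its level."""
    return LEVELS[subject_level] <= LEVELS[tuple_level]

def visible(subject_level, tuples):
    """Filter a relation to the tuples the subject is cleared to see."""
    return [t for t in tuples if can_read(subject_level, t["level"])]

relation = [
    {"id": 1, "level": "unclassified"},
    {"id": 2, "level": "secret"},
]
print(visible("confidential", relation))  # only tuple 1
```

    Polyinstantiation then arises naturally: a write by a low-level subject to a key that already has a high-level tuple must create a second, low-labelled tuple rather than reveal or overwrite the high one.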

  8. Geo-data Acquisition Through Mobile GIS and Digital Video: an Urban Disaster Management Perspective

    NARCIS (Netherlands)

    Montoya, L.

    2003-01-01

    For the management of urban disaster risk, periodic updating of building and lifeline geo-databases is crucial, particularly in developing countries where urbanisation rates are very high. However, collecting information on the characteristics of buildings and lifelines through full ground surveys

  9. GeoBoost: accelerating research involving the geospatial metadata of virus GenBank records.

    Science.gov (United States)

    Tahsin, Tasnia; Weissenbacher, Davy; O'Connor, Karen; Magge, Arjun; Scotch, Matthew; Gonzalez-Hernandez, Graciela

    2018-05-01

    GeoBoost is a command-line software package developed to address sparse or incomplete metadata in GenBank sequence records that relate to the location of the infected host (LOIH) of viruses. Given a set of GenBank accession numbers corresponding to virus GenBank records, GeoBoost extracts, integrates and normalizes geographic information reflecting the LOIH of the viruses using integrated information from GenBank metadata and related full-text publications. In addition, to facilitate probabilistic geospatial modeling, GeoBoost assigns probability scores for each possible LOIH. Binaries and resources required for running GeoBoost are packed into a single zipped file and freely available for download at https://tinyurl.com/geoboost. A video tutorial is included to help users quickly and easily install and run the software. The software is implemented in Java 1.8, and supported on MS Windows and Linux platforms. gragon@upenn.edu. Supplementary data are available at Bioinformatics online.
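    The probability scores mentioned above suggest a normalisation over candidate locations. The toy sketch below illustrates only that final normalisation step; GeoBoost's actual scoring model is not described in the abstract, and the place names and scores are made up.

```python
# Toy illustration: several candidate locations of the infected host
# (LOIH), each with a raw evidence score, normalised into probability
# scores that sum to 1. Names and scores are made up.
def probability_scores(candidates):
    total = sum(candidates.values())
    return {place: score / total for place, score in candidates.items()}

scores = probability_scores({"Jakarta": 3.0, "Bandung": 1.0})
print(scores)  # {'Jakarta': 0.75, 'Bandung': 0.25}
```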

  10. Design of Nutrition Catering System for Athletes Based on Access Database

    OpenAIRE

    Hongjiang Wu; Haiyan Zhao; Xugang Liu; Mingshun Xing

    2015-01-01

    In order to monitor and adjust athletes' dietary nutrition scientifically, ActiveX Data Objects (ADO) and Structured Query Language (SQL) were used to develop the program under the development environment of Visual Basic 6.0 and an Access database. The consulting system on food nutrition and diet was developed by combining the two languages and organizing the latest nutrition information. Nutrition balance of physiological characteristics, assessment for nutrition intake, inquiring n...

  11. SciELO, Scientific Electronic Library Online, a Database of Open Access Journals

    Science.gov (United States)

    Meneghini, Rogerio

    2013-01-01

    This essay discusses SciELO, a scientific journal database operating in 14 countries. It covers over 1000 journals providing open access to full text and table sets of scientometrics data. In Brazil it is responsible for a collection of nearly 300 journals, selected along 15 years as the best Brazilian periodicals in natural and social sciences.…

  12. DbAccess: Interactive Statistics and Graphics for Plasma Physics Databases

    International Nuclear Information System (INIS)

    Davis, W.; Mastrovito, D.

    2003-01-01

    DbAccess is an X-windows application, written in IDL®, meeting many specialized statistical and graphical needs of NSTX [National Spherical Torus Experiment] plasma physicists, such as regression statistics and the analysis of variance. Flexible "views" and "joins," which include options for complex SQL expressions, facilitate mixing data from different database tables. General Atomics Plot Objects add extensive graphical and interactive capabilities. An example is included for plasma confinement-time scaling analysis using a multiple linear regression least-squares power fit.
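    The "multiple linear regression least-squares power fit" works by taking logarithms, so that a power law tau = C * Ip^a * n^b becomes linear in the exponents. A minimal Python sketch of that technique (not the DbAccess implementation; the synthetic data are generated from a = 0.8, b = 0.4 so the fit recovers them exactly):

```python
import math

def lstsq(X, y):
    """Solve the normal equations (X^T X) beta = X^T y by Gauss-Jordan
    elimination (no pivoting; fine for this small, well-conditioned case)."""
    n = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * yk for row, yk in zip(X, y)) for i in range(n)]
    for i in range(n):
        piv = A[i][i]
        A[i] = [v / piv for v in A[i]]
        b[i] /= piv
        for r in range(n):
            if r != i and A[r][i] != 0.0:
                f = A[r][i]
                A[r] = [rv - f * iv for rv, iv in zip(A[r], A[i])]
                b[r] -= f * b[i]
    return b

# Synthetic confinement times from tau = Ip^0.8 * n^0.4 (C = 1).
data = [(ip, dens) for ip in (0.5, 1.0, 2.0) for dens in (1.0, 3.0, 5.0)]
tau = [ip**0.8 * dens**0.4 for ip, dens in data]
# Taking logs linearises the power law: log tau = log C + a log Ip + b log n.
X = [[1.0, math.log(ip), math.log(dens)] for ip, dens in data]
logC, a_ip, b_n = lstsq(X, [math.log(t) for t in tau])
print(round(a_ip, 3), round(b_n, 3))  # -> 0.8 0.4
```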

  13. Research on geo-ontology construction based on spatial affairs

    Science.gov (United States)

    Li, Bin; Liu, Jiping; Shi, Lihong

    2008-12-01

    Information about flood change at different scales and ranges in the city can be extracted intelligently and automatically from the geo-ontology database. Moreover, relevant statistical information can be provided to government departments at all levels to help them take timely measures for disaster response and rescue. Compared with earlier approaches, the efficiency of processing flood information is improved to some extent, because plenty of information on different websites that is irrelevant to floods can be filtered out in advance using the geo-ontology-oriented retrieval method. In short, studying geo-ontology will take researchers a long time owing to limited resources, but geo-ontology is sure to be further refined, especially in the field of Geographic Information Systems, owing to its growing practical applications.

  14. Enhanced STEM Learning with the GeoMapApp Data Exploration Tool

    Science.gov (United States)

    Goodwillie, A. M.

    2014-12-01

    GeoMapApp (http://www.geomapapp.org) is a free, map-based data discovery and visualisation tool developed with NSF funding at Lamont-Doherty Earth Observatory. GeoMapApp provides casual and specialist users alike with access to hundreds of built-in geoscience data sets covering geology, geophysics, geochemistry, oceanography, climatology, cryospherics, and the environment. Users can also import their own data tables, spreadsheets, shapefiles, grids and images. Simple manipulation and analysis tools combined with layering capabilities and engaging visualisations provide a powerful platform with which to explore and interrogate geoscience data in its proper geospatial context, thus helping users to more easily gain insight into the meaning of the data. A global elevation base map covering the oceans as well as continents forms the backbone of GeoMapApp. The multi-resolution base map is updated regularly and includes data sources ranging from Space Shuttle elevation data for land areas to ultra-high-resolution surveys of coral reefs and seafloor hydrothermal vent fields. Examples of built-in data sets that can be layered over the elevation model include interactive earthquake and volcano data, plate tectonic velocities, hurricane tracks, land and ocean temperature, water column properties, age of the ocean floor, and deep submersible bottom photos. A versatile profiling tool provides instant access to data cross-sections. Contouring and 3-D views are also offered - the attached image shows a 3-D view of East Africa's Ngorongoro Crater as an example. Tabular data - both imported and built-in - can be displayed in a variety of ways, and a lasso tool enables users to quickly select data points directly from the map. A range of STEM-based education material based upon GeoMapApp is already available, including a number of self-contained modules for school- and college-level students (http://www.geomapapp.org/education/contributed_material.html). More learning modules are

  15. Exploiting relational database technology in a GIS

    Science.gov (United States)

    Batty, Peter

    1992-05-01

    All systems for managing data face common problems such as backup, recovery, auditing, security, data integrity, and concurrent update. Other challenges include the ability to share data easily between applications and to distribute data across several computers, while continuing to manage the problems already mentioned. Geographic information systems are no exception and need to tackle all these issues. Standard relational database-management systems (RDBMSs) provide many features to help solve the issues mentioned so far. This paper describes how the IBM geoManager product approaches these issues by storing all its geographic data in a standard RDBMS in order to take advantage of such features. Areas in which standard RDBMS functions need to be extended are highlighted, and the way in which geoManager does this is explained. The performance implications of storing all data in the relational database are discussed. An important distinction, which needs to be made when considering the applicability of relational database technology to GIS, is drawn between the storage and management of geographic data on the one hand and its manipulation and analysis on the other.

  16. Validity of administrative database code algorithms to identify vascular access placement, surgical revisions, and secondary patency.

    Science.gov (United States)

    Al-Jaishi, Ahmed A; Moist, Louise M; Oliver, Matthew J; Nash, Danielle M; Fleet, Jamie L; Garg, Amit X; Lok, Charmaine E

    2018-03-01

    We assessed the validity of physician billing codes and hospital admission data coded with the International Classification of Diseases, 10th revision, to identify vascular access placement, secondary patency, and surgical revisions in administrative data. We included adults (≥18 years) with a vascular access placed between 1 April 2004 and 31 March 2013 at the University Health Network, Toronto. Our reference standard was a prospective vascular access database (VASPRO) that contains information on vascular access type and dates of placement, dates of failure, and any revisions. We used VASPRO to assess the validity of different administrative coding algorithms by calculating the sensitivity, specificity, and positive predictive values of vascular access events. The sensitivity (95% confidence interval) of the best performing algorithm to identify arteriovenous access placement was 86% (83%, 89%) and specificity was 92% (89%, 93%). The corresponding numbers to identify catheter insertion were 84% (82%, 86%) and 84% (80%, 87%), respectively. The sensitivity of the best performing coding algorithm to identify arteriovenous access surgical revisions was 81% (67%, 90%) and specificity was 89% (87%, 90%). The algorithms capturing arteriovenous access placement and catheter insertion had positive predictive values greater than 90%, whereas the arteriovenous access surgical revisions algorithm had a positive predictive value of 20%. The duration of arteriovenous access secondary patency was on average 578 (553, 603) days in VASPRO and 555 (530, 580) days in the administrative databases. Administrative data algorithms have fair to good operating characteristics for identifying vascular access placement and arteriovenous access secondary patency. The low positive predictive value of the surgical revisions algorithm suggests that administrative data should only be used to rule out the occurrence of such events.
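    The validation metrics reported in this record follow directly from a 2x2 confusion matrix against the reference standard. A minimal sketch of the calculation (the counts below are invented for illustration, not taken from the study):

```python
# Sensitivity, specificity, and positive predictive value (PPV) from a
# 2x2 confusion matrix comparing an administrative coding algorithm
# against a reference standard such as VASPRO.

def validity_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, and PPV as fractions."""
    sensitivity = tp / (tp + fn)   # share of true events the codes capture
    specificity = tn / (tn + fp)   # share of non-events correctly excluded
    ppv = tp / (tp + fp)           # share of flagged events that are real
    return sensitivity, specificity, ppv

# Hypothetical counts, chosen only to illustrate the arithmetic.
sens, spec, ppv = validity_metrics(tp=86, fp=8, fn=14, tn=92)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} ppv={ppv:.2f}")
```

    Note how PPV depends on event prevalence: a rare outcome such as a surgical revision can have decent sensitivity yet a very low PPV, which is why the abstract recommends using the codes only to rule events out.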

  17. Towards the creation of a European Network of Earth Observation Networks within GEO. The ConnectinGEO project.

    Science.gov (United States)

    Masó, Joan; Serral, Ivette; Menard, Lionel; Wald, Lucien; Nativi, Stefano; Plag, Hans-Peter; Jules-Plag, Shelley; Nüst, Daniel; Jirka, Simon; Pearlman, Jay; De Maziere, Martine

    2015-04-01

    ConnectinGEO (Coordinating an Observation Network of Networks EnCompassing saTellite and IN-situ to fill the Gaps in European Observations) is a new H2020 Coordination and Support Action with the primary goal of linking existing Earth Observation networks with science and technology (S&T) communities, the industry sector, the Group on Earth Observations (GEO), and Copernicus. ConnectinGEO aims to facilitate a broader and more accessible knowledge base to support the needs of GEO, its Societal Benefit Areas (SBAs) and the users of the Global Earth Observing System of Systems (GEOSS). A broad range of subjects, from climate, natural resources and raw materials to the emerging UN Sustainable Development Goals (SDGs), will be addressed. The project will generate a prioritized list of critical gaps within available observation data and models to translate observations into practice-relevant knowledge, based on stakeholder consultation and systematic analysis. Ultimately, it will increase the coherency of European observation networks, increase the use of Earth observations for assessments and forecasts, and inform the planning of future observation systems. ConnectinGEO will initiate a European Network of Earth Observation Networks (ENEON) encompassing space-based, airborne and in-situ observation networks. ENEON will be composed of project partners representing thematic observation networks, along with the GEOSS Science and Technology Stakeholder Network, GEO Communities of Practice, Copernicus services, Sentinel missions and in-situ support data representatives, representatives of the European space-based, airborne and in-situ observation networks (e.g. EPOS, EMSO and GROOM), representatives of the industry sector, and European and national funding agencies, in particular those participating in the future ERA-PlaNET. At the beginning, ENEON will be created and managed by the project. Then the management will be transferred to the network itself to ensure

  18. GeoServer cookbook

    CERN Document Server

    Iacovella, Stefano

    2014-01-01

    This book is ideal for GIS experts, developers, and system administrators who have had a first glance at GeoServer and who are eager to explore all its features in order to configure professional map servers. Basic knowledge of GIS and GeoServer is required.

  19. Geostationary Coastal Ecosystem Dynamics Imager (GEO CEDI) for the GEO Coastal and Air Pollution Events (GEO CAPE) Mission. Concept Presentation

    Science.gov (United States)

    Janz, Scott; Smith, James C.; Mannino, Antonio

    2010-01-01

    This slide presentation reviews the concept of the Geostationary Coastal Ecosystem Dynamics Imager (GEO CEDI), which will be used on the GEO Coastal and Air Pollution Events (GEO CAPE) Mission. The primary science requirements call for scanning U.S. coastal waters three times per day during daylight hours. Included in the overview are presentations about the systems, the optics, the detectors, the mechanical systems, the electromechanical systems, the electrical design, the flight software, the thermal systems, and the contamination prevention requirements.

  20. File Specification for GEOS-5 FP (Forward Processing)

    Science.gov (United States)

    Lucchesi, R.

    2013-01-01

    horizontal grid. The majority of data products are time-averaged, but four instantaneous products are also available. Hourly data intervals are used for two-dimensional products, while 3-hourly intervals are used for three-dimensional products. These may be on the model's native 72-layer vertical grid or at 42 pressure surfaces extending to 0.1 hPa. This document describes the gridded output files produced by the GMAO near real-time operational FP, using the most recent version of the GEOS-5 assimilation system. Additional details about variables listed in this file specification can be found in a separate document, the GEOS-5 File Specification Variable Definition Glossary. Documentation about the current access methods for products described in this document can be found on the GMAO products page: http://gmao.gsfc.nasa.gov/products/.

  1. GeoCrystal: graphic-interactive access to geodata archives

    Science.gov (United States)

    Goebel, Stefan; Haist, Joerg; Jasnoch, Uwe

    2002-03-01

    Recently, a lot of effort has been spent on establishing information systems and global infrastructures that enable both data suppliers and users to describe (-> eCommerce, metadata) as well as to find appropriate data. Examples of this are metadata information systems, online shops or portals for geodata. The main disadvantage of existing approaches is the lack of methods and mechanisms for leading users to (e.g. spatial) data archives. This affects aspects of usability and personalization in general, as well as visual feedback techniques in the different steps of the information retrieval process. Several approaches aim at improving graphical user interfaces by using intuitive metaphors, but only some of them offer 3D interfaces in the form of information landscapes or geographic result scenes in the context of information systems for geodata. This paper presents GeoCrystal, whose basic idea is to adopt Venn diagrams to compose complex queries and to visualize search results in a 3D information and navigation space for geodata. These concepts are enhanced with spatial metaphors and 3D information landscapes (a library for geodata) in which users can specify searches for appropriate geodata and can interact graphically with the search results (book metaphor).

  2. Multilevel security for relational databases

    CERN Document Server

    Faragallah, Osama S; El-Samie, Fathi E Abd

    2014-01-01

    Concepts of Database Security: Database Concepts; Relational Database Security Concepts; Access Control in Relational Databases (Discretionary Access Control, Mandatory Access Control, Role-Based Access Control); Work Objectives; Book Organization. Basic Concepts of Multilevel Database Security: Introduction; Multilevel Database Relations; Polyinstantiation (Invisible Polyinstantiation, Visible Polyinstantiation, Types of Polyinstantiation); Architectural Consideration
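    The access-control models listed in this table of contents (discretionary, mandatory, role-based) differ in who decides what a user may do. As a minimal sketch of the role-based model only, with entirely hypothetical roles, users, and permissions:

```python
# Role-based access control (RBAC) in miniature: permissions attach to
# roles, and users obtain permissions solely through role membership.
# All names below are invented for illustration.

ROLE_PERMISSIONS = {
    "reader": {"select"},
    "editor": {"select", "insert", "update"},
    "admin":  {"select", "insert", "update", "delete", "grant"},
}

USER_ROLES = {
    "alice": {"editor"},
    "bob":   {"reader"},
}

def is_allowed(user: str, action: str) -> bool:
    """A user may perform an action if any of their roles grants it."""
    return any(action in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_allowed("alice", "update"))  # editors may update
print(is_allowed("bob", "delete"))    # readers may only select
```

    Mandatory access control, by contrast, would compare security labels (e.g. clearance versus classification) rather than consult role grants, which is what makes multilevel relations and polyinstantiation necessary.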

  3. GeoSciGraph: An Ontological Framework for EarthCube Semantic Infrastructure

    Science.gov (United States)

    Gupta, A.; Schachne, A.; Condit, C.; Valentine, D.; Richard, S.; Zaslavsky, I.

    2015-12-01

    The CINERGI (Community Inventory of EarthCube Resources for Geosciences Interoperability) project compiles an inventory of a wide variety of earth science resources including documents, catalogs, vocabularies, data models, data services, process models, information repositories, domain-specific ontologies etc. developed by research groups and data practitioners. We have developed a multidisciplinary semantic framework called GeoSciGraph for semantic integration of earth science resources. An integrated ontology is constructed with Basic Formal Ontology (BFO) as its upper ontology and currently ingests multiple component ontologies including the SWEET ontology, GeoSciML's lithology ontology, the Tematres controlled vocabulary server, GeoNames, GCMD vocabularies on equipment, platforms and institutions, a software ontology, the CUAHSI hydrology vocabulary, the environmental ontology (ENVO) and several more. These ontologies are connected through bridging axioms; GeoSciGraph identifies lexically close terms and creates equivalence-class or subclass relationships between them after human verification. GeoSciGraph allows a community to create community-specific customizations of the integrated ontology. GeoSciGraph uses Neo4j, a graph database that can hold several billion concepts and relationships. GeoSciGraph provides a number of REST services that can be called by other software modules like the CINERGI information augmentation pipeline. 1) Vocabulary services are used to find exact and approximate terms, term categories (community-provided clusters of terms, e.g., measurement-related terms or environmental-material-related terms), synonyms, term definitions and annotations. 2) Lexical services are used for text parsing to find entities, which can then be included in the ontology by a domain expert. 3) Graph services provide the ability to perform traversal-centric operations, e.g., finding paths and neighborhoods, which can be used to perform ontological operations like
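    The graph services described in point 3 revolve around traversal operations such as neighborhood queries over a concept graph. As a minimal illustration (a toy graph with invented terms, not GeoSciGraph's actual API or data), a k-hop neighborhood can be computed with a breadth-first search:

```python
from collections import deque

# Toy concept graph: each term maps to related terms (invented labels).
GRAPH = {
    "basalt": ["igneous rock", "lava"],
    "igneous rock": ["rock"],
    "lava": ["volcano"],
    "rock": [],
    "volcano": [],
}

def neighborhood(start, k):
    """Return all terms reachable from `start` within k hops (BFS)."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue  # do not expand beyond k hops
        for nbr in GRAPH.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    return seen

print(sorted(neighborhood("basalt", 1)))  # the term plus direct neighbors
```

    A production graph store such as Neo4j performs the same traversal server-side, so a REST facade only needs to accept the start term and hop count.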

  4. Approaches to communication in response to geo-hydrological risk: POLARIS an Italian web initiative.

    Science.gov (United States)

    Salvati, Paola; Pernice, Umberto; Bianchi, Cinzia; Fiorucci, Federica; Marchesini, Ivan; Guzzetti, Fausto

    2015-04-01

    information, considering the usability and accessibility of the website and key graphic aspects of web 2.0 information, making the website's communication more effective for users belonging to diversified audiences. Specific icons were designed to describe the geo-hydrological events, and maps to visualize their impact on the territory. The scientific and technical contents are edited using appropriate communication strategies that adopt a less technical and more widely comprehensible language, using intuitive and engaging web interfaces and linking messages to social media that encourage citizens' interactions. Monitoring user access to the website for more than a year after its publication, we noticed that peaks in access correspond to the occurrence of the worst geo-hydrological events and, in particular, to occasions when journalists or scientists promoted the website on television. This positive effect on the growth of user access encouraged us to strengthen our collaboration with science journalists by linking traditional media (i.e. TV) and social media, to further raise awareness of the website and to better explain to users how to use its information to increase their resilience to geo-hydrological hazards.

  5. Characterizing GEO Titan IIIC Transtage Fragmentations Using Ground-based and Telescopic Measurements

    Science.gov (United States)

    Cowardin, H.; Anz-Meador, P.; Reyes, J. A.

    In a continued effort to better characterize the geosynchronous orbit (GEO) environment, NASA’s Orbital Debris Program Office (ODPO) utilizes various ground-based optical assets to acquire photometric and spectral data of known debris associated with fragmentations in or near GEO. The Titan IIIC Transtage upper stage is known to have fragmented four times. Two of the four fragmentations were in GEO, while the Transtage fragmented a third time in GEO transfer orbit. The fourth fragmentation occurred in low Earth orbit. To better assess and characterize these fragmentations, the NASA ODPO acquired a Titan Transtage test and display article previously in the custody of the 309th Aerospace Maintenance and Regeneration Group (AMARG) in Tucson, Arizona. After initial inspections at AMARG demonstrated that it was of sufficient fidelity to be of interest, the test article was brought to NASA Johnson Space Center (JSC) to continue material analysis and historical documentation. The Transtage has undergone two separate spectral measurement campaigns to characterize the reflectance spectroscopy of historical aerospace materials. These data have been incorporated into the NASA Spectral Database, with the goal of using telescopic data comparisons for potential material identification. A Light Detection and Ranging (LIDAR) system scan has also been completed, and a scale model has been created for use in the Optical Measurement Center (OMC) for photometric analysis of an intact Transtage, including bidirectional reflectance distribution function (BRDF) measurements. A historical overview of the Titan IIIC Transtage, the analysis completed to date, and the future work in support of characterizing the GEO and near-GEO orbital debris environment will be discussed in the subsequent presentation.

  6. Formal modelling of processes and tasks to support use and search of geo-information in emergency response

    NARCIS (Netherlands)

    Zlatanova, S.

    2010-01-01

    Many Command & Control and Early Warning systems have been developed that provide access to large amounts of data (and metadata) via geo-portals, or by accessing predefined data sets relying on a Spatial Data Infrastructure. However, the users involved in emergency response are usually not geoinformation

  7. Geo-diversity as an indicator of natural resources for geopark in human society

    Science.gov (United States)

    Lin, Jiun-Chuan

    2017-04-01

    Geo-diversity is a concept expressing the richness and number of different landscapes within a small area; the higher the geo-diversity, the higher the potential attraction. Many geoparks make use of such landscapes for sustainable development. The purpose of this study is to evaluate the geomorphic resources for geoparks in Taiwan. For sustainable development, the geopark concept is one of the tools available to society. Evaluating geo-diversity improves our understanding of local resources and supports future management. Therefore, geomorphic resources should be evaluated systematically, with the aim of supporting the sustainable development of the geopark. The indicators of geo-diversity can be grouped into four characteristics: 1. the number of landscapes within the geopark; 2. the accessibility of the geopark's sites; 3. the dynamic processes of the landforms; 4. the mode of landform evolution. Taiwanese geoparks should make use of these four characteristics for conservation, management and education purposes. Yehliu, Matsu and Penghu geoparks are three typical cases used for demonstration in this paper.

  8. Access to public drinking water fountains in Berkeley, California: a geospatial analysis.

    Science.gov (United States)

    Avery, Dylan C; Smith, Charlotte D

    2018-01-24

    In January 2015, Berkeley, California became the first city in the United States to impose a tax on sugar-sweetened beverages. The tax is intended to discourage the purchase of sugary beverages and promote consumption of healthier alternatives such as tap water. The goal of the study was to assess the condition of public drinking water fountains and determine whether there is a difference in access to clean, functioning fountains based on race or socio-economic status. A mobile GIS app was created to locate and collect data on existing drinking water fountains in Berkeley, CA. Demographic variables related to race and socio-economic status (SES) were acquired from the US Census American Community Survey database. Disparities in access to, or the condition of, drinking water fountains relative to demographics were explored using spatial analyses. Spatial statistical analysis was performed to estimate the demographic characteristics of communities near the water fountains, and logistic regression was used to examine the relationship between household median income or race and the condition of a fountain. Although most fountains were classified as functioning, some were dirty, clogged, or both. No spatial relationships between demographic characteristics and fountain conditions were observed. All geo-located data and a series of maps were provided to the City of Berkeley and the public. The geo-database created as an outcome of this study is useful for prioritizing maintenance of existing fountains and planning the locations of future ones. The methodologies used for this study could be applied to a wide variety of asset inventory and assessment projects, such as clinics or pharmaceutical dispensaries, in both developed and developing countries.
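    A building block of the kind of proximity analysis described in this record is great-circle distance between geo-located points. A minimal sketch, assuming hypothetical fountain coordinates (not the study's data), that finds fountains within a radius of a point of interest using the haversine formula:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical fountain locations (roughly central Berkeley, CA).
fountains = {
    "civic_center": (37.8694, -122.2711),
    "campus_west":  (37.8715, -122.2650),
    "far_north":    (37.8900, -122.2600),
}

poi = (37.8702, -122.2700)  # hypothetical point of interest
nearby = [name for name, (lat, lon) in fountains.items()
          if haversine_m(poi[0], poi[1], lat, lon) <= 600]
print(sorted(nearby))
```

    A GIS package would perform the same neighbourhood selection with a spatial buffer; the haversine version shows what that operation computes.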

  9. Legacy2Drupal - Conversion of an existing oceanographic relational database to a semantically enabled Drupal content management system

    Science.gov (United States)

    Maffei, A. R.; Chandler, C. L.; Work, T.; Allen, J.; Groman, R. C.; Fox, P. A.

    2009-12-01

    Content Management Systems (CMSs) provide powerful features that can be of use to oceanographic (and other geo-science) data managers. However, in many instances, geo-science data management offices have previously designed customized schemas for their metadata. The WHOI Ocean Informatics initiative and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) have jointly sponsored a project to port an existing relational database containing oceanographic metadata, along with an existing interface coded in Cold Fusion middleware, to a Drupal6 Content Management System. The goal was to translate all the existing database tables, input forms, website reports, and other features present in the existing system to employ Drupal CMS features. The replacement features include Drupal content types, CCK node-reference fields, themes, RDB, SPARQL, workflow, and a number of other supporting modules. Strategic use of some Drupal6 CMS features enables three separate but complementary interfaces that provide access to oceanographic research metadata via the MySQL database: 1) a Drupal6-powered front-end; 2) a standard SQL port (used to provide a MapServer interface to the metadata and data); and 3) a SPARQL port (feeding a new faceted search capability being developed). Future plans include the creation of science ontologies, by scientist/technologist teams, that will drive the semantically-enabled faceted search capabilities planned for the site. Incorporation of semantic technologies included in the future Drupal 7 core release is also anticipated. Using a public-domain CMS as opposed to proprietary middleware, and taking advantage of the many features of Drupal 6 that are designed to support semantically-enabled interfaces, will help prepare the BCO-DMO database for interoperability with other ecosystem databases.

  10. Allelic database and accession divergence of a Brazilian mango collection based on microsatellite markers.

    Science.gov (United States)

    Dos Santos Ribeiro, I C N; Lima Neto, F P; Santos, C A F

    2012-12-19

    Allelic patterns and genetic distances were examined in a collection of 103 foreign and Brazilian mango (Mangifera indica) accessions in order to develop a reference database to support cultivar protection and breeding programs. An UPGMA dendrogram was generated using Jaccard's coefficients from a distance matrix based on 50 alleles of 12 microsatellite loci. The base pair number was estimated by the method of inverse mobility. The cophenetic correlation was 0.8. The accessions had a coefficient of similarity from 30 to 100%, which reflects high genetic variability. Three groups were observed in the UPGMA dendrogram; the first group was formed predominantly by foreign accessions, the second group was formed by Brazilian accessions, and the Dashehari accession was isolated from the others. The 50 microsatellite alleles did not separate all 103 accessions, indicating that there are duplicates in this mango collection. These 12 microsatellites need to be validated in order to establish a reliable set to identify mango cultivars.
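    The genetic distances in this record rest on Jaccard's coefficient over binary allele profiles. A minimal sketch of that similarity calculation, with made-up allele sets (the cultivar names are real mango accessions, but the data are invented):

```python
# Jaccard similarity between accessions scored for presence/absence of
# microsatellite alleles. The allele sets are invented for illustration.

def jaccard(a: set, b: set) -> float:
    """Shared alleles divided by total distinct alleles in either accession."""
    return len(a & b) / len(a | b)

accessions = {
    "Tommy Atkins": {"A1", "A3", "B2", "C1"},
    "Palmer":       {"A1", "A3", "B2", "C2"},
    "Dashehari":    {"A2", "B1", "C3", "D1"},
}

names = sorted(accessions)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        print(f"{x} vs {y}: {jaccard(accessions[x], accessions[y]):.2f}")
```

    The UPGMA dendrogram described above is then obtained by average-linkage clustering of the distance matrix 1 - J; identical profiles (J = 1) collapse into one leaf, which is how duplicates in a collection reveal themselves.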

  11. Database Organisation in a Web-Enabled Free and Open-Source Software (foss) Environment for Spatio-Temporal Landslide Modelling

    Science.gov (United States)

    Das, I.; Oberai, K.; Sarathi Roy, P.

    2012-07-01

    Landslides manifest themselves in different mass-movement processes and are considered among the most complex natural hazards occurring on the earth's surface. Making a landslide database available online via the WWW (World Wide Web) promotes the spreading of landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS reduces the cost of the project drastically, as proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along a national highway road corridor in the Indian Himalayas. All the geo-environmental datasets, along with the landslide susceptibility map, were served through a WebGIS client interface. The open-source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front-end, and PostgreSQL with the PostGIS extension serves as the back-end for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings an insight into the understanding of the landslides, and the resulting damage, closer to the affected people and the user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open-source or proprietary GIS software.
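    An OGC WFS layer such as the susceptibility dataset above is typically retrieved with a GetFeature request encoded as key-value pairs. A minimal sketch of building such a request URL (the host, layer name, and bounding box below are hypothetical):

```python
from urllib.parse import urlencode

# Hypothetical WFS endpoint and layer name, for illustration only.
base_url = "https://example.org/wfs"
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "landslides:susceptibility",
    "outputFormat": "GML2",
    "bbox": "77.5,30.0,78.5,31.0",  # lon/lat window along a road corridor
}

url = f"{base_url}?{urlencode(params)}"
print(url)
```

    Any OGC-compliant client, open-source or proprietary, accepts a URL of this shape, which is what makes the WFS publication in the record interoperable.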

  12. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, the installation of the designed database in the database system Oracle Database 10g Express Edition, and a demonstration of administration tasks in this database system. The database design was verified with a purpose-built access application.

  13. Wireless access to a pharmaceutical database: A demonstrator for data driven Wireless Application Prorocol (WAP) applications in medical information processing

    DEFF Research Database (Denmark)

    Hansen, Michael Schacht; Dørup, Jens

    2001-01-01

    script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. RESULTS: A free, open-source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. CONCLUSIONS: We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open-source freeware. The project opens perspectives for a further

  14. Preparation of Database for Land use Management in North East of Cairo

    International Nuclear Information System (INIS)

    El-Ghawaby, A.M.

    2012-01-01

    Environmental management in urban areas is difficult due to the amount and variety of data needed for decision making. This volume of data is of little use without adequate database systems and modern methodologies. A geo-database for the East Cairo City Area (ECCA) was built to be used in urban land-use suitability assessment, to achieve better performance compared with the usual methods. This geo-database required the availability of detailed, accurate, updated and geographically referenced data on the area's terrain, its physical characteristics and the environmental hazards that may occur there. A smart environmental suitability model for ECCA was developed and implemented using ERDAS IMAGINE 9.2. This model is capable of suggesting the most appropriate urban land use, based on the existing spatial and non-spatial potentials and constraints.

  15. Multi-Dimensional Bitmap Indices for Optimising Data Access within Object Oriented Databases at CERN

    CERN Document Server

    Stockinger, K

    2001-01-01

    Efficient query processing in high-dimensional search spaces is an important requirement for many analysis tools. In the literature on index data structures one can find a wide range of methods for optimising database access. In particular, bitmap indices have recently gained substantial popularity in data warehouse applications with large amounts of read-mostly data. Bitmap indices are implemented in various commercial database products and are used for querying typical business applications. However, scientific data, which is mostly characterised by non-discrete attribute values, cannot be queried efficiently with the techniques currently supported. In this thesis we propose a novel access method based on bitmap indices that efficiently handles multi-dimensional queries against typical scientific data. The algorithm is called GenericRangeEval and is an extension of a bitmap index for discrete attribute values. By means of a cost model we study the performance of queries with various selectivities against uniform...
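    The baseline that GenericRangeEval extends is an equality-encoded bitmap index: one bit vector per distinct value, with a range predicate answered by OR-ing the matching bitmaps. A minimal sketch of that baseline (not the thesis's algorithm itself), using Python integers as bit vectors over invented data:

```python
# Equality-encoded bitmap index: bitmaps[v] has bit i set iff row i
# holds value v. A range query ORs the bitmaps whose value qualifies.

from collections import defaultdict

values = [3, 1, 4, 1, 5, 2, 3, 5]  # attribute value of each row (invented)

bitmaps = defaultdict(int)
for row, v in enumerate(values):
    bitmaps[v] |= 1 << row

def range_query(lo, hi):
    """Row ids whose value lies in [lo, hi], via bitwise OR of bitmaps."""
    mask = 0
    for v, bm in bitmaps.items():
        if lo <= v <= hi:
            mask |= bm
    return [row for row in range(len(values)) if (mask >> row) & 1]

print(range_query(3, 5))
```

    The cost of this scheme grows with the number of distinct values, which is exactly why non-discrete scientific attributes need the binned, range-aware evaluation the thesis proposes.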

  16. Development of a geo-information system for the evaluation of active faults

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Sang Gi; Lim, Yu Jin [Paichai Univ., Taejon (Korea, Republic of)

    2001-04-15

    This project aims to develop an effective field system and database structure by analyzing quantitative and qualitative geological data, and to establish a plan for their application. The contents and scope of this study are as follows: developing geo-information software; producing digital data from previous works and information (geological data, age dating, trench information, topographic and geologic maps); and establishing a home page.

  17. GEOS. User Tutorials

    Energy Technology Data Exchange (ETDEWEB)

    Fu, Pengchen [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Settgast, Randolph R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Johnson, Scott M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walsh, Stuart D.C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Morris, Joseph P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ryerson, Frederick J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-12-17

    GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.

  18. MetaboLights: An Open-Access Database Repository for Metabolomics Data.

    Science.gov (United States)

    Kale, Namrata S; Haug, Kenneth; Conesa, Pablo; Jayseelan, Kalaivani; Moreno, Pablo; Rocca-Serra, Philippe; Nainala, Venkata Chandrasekhar; Spicer, Rachel A; Williams, Mark; Li, Xuefei; Salek, Reza M; Griffin, Julian L; Steinbeck, Christoph

    2016-03-24

    MetaboLights is the first general purpose, open-access database repository for cross-platform and cross-species metabolomics research at the European Bioinformatics Institute (EMBL-EBI). Based upon the open-source ISA framework, MetaboLights provides Metabolomics Standard Initiative (MSI) compliant metadata and raw experimental data associated with metabolomics experiments. Users can upload their study datasets into the MetaboLights Repository. These studies are then automatically assigned a stable and unique identifier (e.g., MTBLS1) that can be used for publication reference. The MetaboLights Reference Layer associates metabolites with metabolomics studies in the archive and is extensively annotated with data fields such as structural and chemical information, NMR and MS spectra, target species, metabolic pathways, and reactions. The database is manually curated with no specific release schedules. MetaboLights is also recommended by journals for metabolomics data deposition. This unit provides a guide to using MetaboLights, downloading experimental data, and depositing metabolomics datasets using user-friendly submission tools. Copyright © 2016 John Wiley & Sons, Inc.

  19. GeoGebra for Mathematical Statistics

    Science.gov (United States)

    Hewson, Paul

    2009-01-01

    The GeoGebra software is attracting a lot of interest in the mathematical community, consequently there is a wide range of experience and resources to help use this application. This article briefly outlines how GeoGebra will be of great value in statistical education. The release of GeoGebra is an excellent example of the power of free software…

  20. Designing and implementing a Quality Broker: the GeoViQua experience

    Science.gov (United States)

    Papeschi, Fabrizio; Bigagli, Lorenzo; Masò, Joan; Nativi, Stefano

    2014-05-01

    GeoViQua (QUAlity aware VIsualisation for the Global Earth Observation System of Systems) is an FP7 project aiming at complementing the Global Earth Observation System of Systems (GEOSS) with rigorous data quality specifications and quality-aware capabilities, in order to improve reliability in scientific studies and policy decision-making. GeoViQua main scientific and technical objective is to enhance the GEOSS Common Infrastructure (GCI) providing the user community with innovative quality-aware search and visualization tools, which will be integrated in the GEOPortal, as well as made available to other end-user interfaces. To this end, GeoViQua will promote the extension of the current standard metadata for geographic information with accurate and expressive quality indicators. Employing and extending several ISO standards such as 19115, 19157 and 19139, a common set of data quality indicators has been selected to be used within the project. The resulting work, in the form of a data model, is expressed in XML Schema Language and encoded in XML. Quality information can be stated both by data producers and by data users, actually resulting in two conceptually distinct data models, the Producer Quality model and the User Quality model (or User Feedback model). GeoViQua architecture is built on the brokering approach successfully experimented within the EuroGEOSS project and realized by the GEO DAB (Discovery and Access Broker) which is part of the GCI. The GEO DAB allows for harmonization and distribution in a transparent way for both users and data providers. This way, GeoViQua can effectively complement and extend the GEO DAB obtaining a Quality augmentation Broker (DAB-Q) which plays a central role in ensuring the consistency of the Producer and User quality models. The GeoViQua architecture also includes a Feedback Catalog, a particular service brokered by the DAB-Q which is dedicated to the storage and discovery of user feedbacks. A very important issue

  1. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    Science.gov (United States)

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  2. JASPAR 2010: the greatly expanded open-access database of transcription factor binding profiles

    Science.gov (United States)

    Portales-Casamar, Elodie; Thongjuea, Supat; Kwon, Andrew T.; Arenillas, David; Zhao, Xiaobei; Valen, Eivind; Yusuf, Dimas; Lenhard, Boris; Wasserman, Wyeth W.; Sandelin, Albin

    2010-01-01

    JASPAR (http://jaspar.genereg.net) is the leading open-access database of matrix profiles describing the DNA-binding patterns of transcription factors (TFs) and other proteins interacting with DNA in a sequence-specific manner. Its fourth major release is the largest expansion of the core database to date: the database now holds 457 non-redundant, curated profiles. The new entries include the first batch of profiles derived from ChIP-seq and ChIP-chip whole-genome binding experiments, and 177 yeast TF binding profiles. The introduction of a yeast division brings the convenience of JASPAR to an active research community. As binding models are refined by newer data, the JASPAR database now uses versioning of matrices: in this release, 12% of the older models were updated to improved versions. Classification of TF families has been improved by adopting a new DNA-binding domain nomenclature. A curated catalog of mammalian TFs is provided, extending the use of the JASPAR profiles to additional TFs belonging to the same structural family. The changes in the database set the system ready for more rapid acquisition of new high-throughput data sources. Additionally, three new special collections provide matrix profile data produced by recent alternative high-throughput approaches. PMID:19906716
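
JASPAR distributes its profiles as position frequency matrices (PFMs). As an illustration of how such a profile is typically applied, here is a minimal sketch that converts a PFM into a log-odds position weight matrix and scores candidate sites; the counts below are invented for illustration, not a real JASPAR matrix.

```python
import math

def pfm_to_pwm(pfm, bg=0.25, pseudo=0.8):
    # Convert a position frequency matrix (keys A, C, G, T) into a
    # log2-odds position weight matrix, using a pseudocount to avoid
    # taking the log of zero for unobserved bases.
    n_pos = len(pfm["A"])
    pwm = {}
    for base, counts in pfm.items():
        pwm[base] = []
        for i in range(n_pos):
            total = sum(pfm[b][i] for b in "ACGT")
            p = (counts[i] + pseudo * bg) / (total + pseudo)
            pwm[base].append(math.log2(p / bg))
    return pwm

def score(pwm, site):
    # Sum the per-position log-odds contributions of one candidate site.
    return sum(pwm[base][i] for i, base in enumerate(site))

# Toy 3-column matrix (hypothetical counts, not taken from JASPAR)
pfm = {"A": [8, 0, 1], "C": [0, 7, 1], "G": [1, 1, 7], "T": [0, 1, 0]}
pwm = pfm_to_pwm(pfm)
print(round(score(pwm, "ACG"), 2), round(score(pwm, "TTT"), 2))
```

The consensus site "ACG" scores well above a site the matrix never observed, which is the basis of motif scanning with these profiles.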

  3. Maritime Geo-Fence Letter Report

    Science.gov (United States)

    2016-07-01

    Maritime Geo-Fence Letter Report. Authors: Irene Gonin and Gregory Johnson. Distribution Statement A: Approved for public release; distribution is unlimited. July 2016. Report No. CG-D-10-16. United States Coast Guard Research & Development Center, 1 Chelsea Street, New London, CT 06320.

  4. Database of geo-hydrological disasters for civil protection purposes

    Czech Academy of Sciences Publication Activity Database

    Blahůt, Jan; Poretti, I.; De Amicis, M.; Sterlacchini, S.

    2012-01-01

    Roč. 60, č. 3 (2012), s. 1065-1083 ISSN 0921-030X Institutional research plan: CEZ:AV0Z30460519 Keywords : disaster database * civil protection * risk scenarios Subject RIV: DB - Geology ; Mineralogy Impact factor: 1.639, year: 2012

  5. Teaching Plate Tectonic Concepts using GeoMapApp Learning Activities

    Science.gov (United States)

    Goodwillie, A. M.; Kluge, S.

    2012-12-01

    based upon GeoMapApp (http://www.geomapapp.org), a free map-based data exploration and visualisation tool that allows students to access a wide range of geoscience data in a virtual lab-like environment.

  6. Effect of teaching mathematics using GeoGebra on students' with dissimilar spatial visualisation

    Science.gov (United States)

    Bakar, Kamariah Abu; Ayub, Ahmad Fauzi Mohd; Tarmizi, Rohani Ahmad; Luan, Wong Su

    2015-10-01

    This study examined the effects of GeoGebra on the mathematics performance of students with different spatial visualisation. A quasi-experimental, pretest-posttest control group design was conducted. A total of 71 students from two intact groups were involved in the study; the groups were randomly assigned to the experimental group (36 students) and the control group (35 students). A spatial visualisation test to identify students with high or low visualisation, and a mathematics performance pre-test, were administered at the initial stage of this study. A post-test was administered after 12 weeks of treatment using GeoGebra. Analysis of Covariance (ANCOVA) was used to adjust for the pre-test score. Findings showed that the group with access to GeoGebra achieved significantly better test scores in the post-test than the group which followed the traditional teaching method. A two-way ANCOVA used to analyse the effect of students' spatial visualisation on post-test performance showed that there was no such effect. The results from this study suggest that using GeoGebra helped the students score better in the post-test; however, there was no significant difference in mathematics performance between students with different types of spatial visualisation. This study indicates that GeoGebra is useful in enhancing the teaching and learning of mathematics.

  7. GeoBuilder: a geometric algorithm visualization and debugging system for 2D and 3D geometric computing.

    Science.gov (United States)

    Wei, Jyh-Da; Tsai, Ming-Hung; Lee, Gen-Cher; Huang, Jeng-Hung; Lee, Der-Tsai

    2009-01-01

    Algorithm visualization is a unique research topic that integrates engineering skills such as computer graphics, system programming, database management, computer networks, etc., to facilitate algorithmic researchers in testing their ideas, demonstrating new findings, and teaching algorithm design in the classroom. Within the broad applications of algorithm visualization, there still remain performance issues that deserve further research, e.g., system portability, collaboration capability, and animation effect in 3D environments. Using modern technologies of Java programming, we develop an algorithm visualization and debugging system, dubbed GeoBuilder, for geometric computing. The GeoBuilder system features Java's promising portability, engagement of collaboration in algorithm development, and automatic camera positioning for tracking 3D geometric objects. In this paper, we describe the design of the GeoBuilder system and demonstrate its applications.

  8. Geosynthetic-reinforced pavement systems

    International Nuclear Information System (INIS)

    Zornberg, J. G.

    2014-01-01

    Geosynthetics have been used as reinforcement inclusions to improve pavement performance. While there is clear field evidence of the benefit of using geosynthetic reinforcements, the specific conditions or mechanisms that govern the reinforcement of pavements are, at best, unclear and have remained largely unmeasured. Significant research has recently been conducted with the objectives of: (i) determining the relevant properties of geosynthetics that contribute to the enhanced performance of pavement systems, (ii) developing appropriate analytical, laboratory and field methods capable of quantifying pavement performance, and (iii) enabling the prediction of pavement performance as a function of the properties of the various types of geosynthetics. (Author)

  9. Encoding of Geological knowledge in the GeoPiemonte Map Data Base

    Science.gov (United States)

    Piana, Fabrizio; Lombardo, Vincenzo; Mimmo, Dario; Barale, Luca; Irace, Andrea; Mulazzano, Elia

    2017-04-01

    In modern digital geological maps and geo-databases, namely those devoted to interactive WebGIS services, there is a need to make explicit the geological assumptions underlying the design and compilation of the map geodatabase. The geodatabase of the Piemonte Geological Map, which consists of several thousands of Geologic Units and Geologic Structures, was designed in a way suitable for linking the knowledge of the geological domain at hand to more general levels of knowledge, represented in existing Earth Sciences ontologies and in a domain ontology (OntoGeonous), specifically designed for the project, though with a wide applicability in mind. The Geologic Units and Geologic Structures of the GeoPiemonte Map have been spatially correlated through the whole region, referring to a non-formal hierarchical scheme, which gives the parental relations between several orders of Geologic Units, putting them in relation with some main Geologic Events. The scheme reports the subdivisions we made of the Alps-Apennines orogenic belt (which constitutes the Piemonte geological framework), on which the architecture of the GeoDB relied. This contribution describes how the two different knowledge levels (specific domain vs. general knowledge) are assimilated within the GeoPiemonte informative system, providing relations between the contents of the geodatabase and the encoded concepts of the reference ontologies. Initiatives such as GeoScience Markup Language (GeoSciML 4.01, 2016) (1) and INSPIRE "Data Specification on Geology" (an operative simplification of GeoSciML, last version is 3.0, 2013) (2), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG), provided us with the authoritative standard geological source for knowledge encoding. Consistency and interoperability of geological data were thus sought, by classifying geologic features in an ontology-driven Data Model, while objects were described using GeoSciML controlled

  10. CEOS Ocean Variables Enabling Research and Applications for Geo (COVERAGE)

    Science.gov (United States)

    Tsontos, V. M.; Vazquez, J.; Zlotnicki, V.

    2017-12-01

    The CEOS Ocean Variables Enabling Research and Applications for GEO (COVERAGE) initiative seeks to facilitate joint utilization of different satellite data streams on ocean physics, better integrated with biological and in situ observations, including near real-time data streams in support of oceanographic and decision support applications for societal benefit. COVERAGE aligns with programmatic objectives of CEOS (the Committee on Earth Observation Satellites) and the missions of GEO-MBON (Marine Biodiversity Observation Network) and GEO-Blue Planet, which are to advance and exploit synergies among the many observational programs devoted to ocean and coastal waters. COVERAGE is conceived of as a 3-year pilot project involving international collaboration. It focuses on implementing technologies, including cloud-based solutions, to provide a data rich, web-based platform for integrated ocean data delivery and access: multi-parameter observations, easily discoverable and usable, organized by disciplines, available in near real-time, collocated to a common grid and including climatologies. These will be complemented by a set of value-added data services available via the COVERAGE portal including an advanced Web-based visualization interface, subsetting/extraction, data collocation/matchup and other relevant on demand processing capabilities. COVERAGE development will be organized around priority use cases and applications identified by GEO and agency partners. The initial phase will be to develop co-located 25km products from the four Ocean Virtual Constellations (VCs): Sea Surface Temperature, Sea Level, Ocean Color, and Sea Surface Winds. This aims to stimulate work among the ocean VCs while developing products and system functionality based on community recommendations. Products such as anomalies from a time mean would build on the theme of applications relevant to the CEOS/GEO mission and vision. Here we provide an overview of the COVERAGE initiative with an

  11. Genelab: Scientific Partnerships and an Open-Access Database to Maximize Usage of Omics Data from Space Biology Experiments

    Science.gov (United States)

    Reinsch, S. S.; Galazka, J..; Berrios, D. C; Chakravarty, K.; Fogle, H.; Lai, S.; Bokyo, V.; Timucin, L. R.; Tran, P.; Skidmore, M.

    2016-01-01

    NASA's mission includes expanding our understanding of biological systems to improve life on Earth and to enable long-duration human exploration of space. The GeneLab Data System (GLDS) is NASA's premier open-access omics data platform for biological experiments. GLDS houses standards-compliant, high-throughput sequencing and other omics data from spaceflight-relevant experiments. The GeneLab project at NASA-Ames Research Center is developing the database, and also partnering with spaceflight projects through sharing or augmentation of experiment samples to expand omics analyses on precious spaceflight samples. The partnerships ensure that the maximum amount of data is garnered from spaceflight experiments and made publicly available as rapidly as possible via the GLDS. GLDS Version 1.0 went online in April 2015. Software updates and new data releases occur at least quarterly. As of October 2016, the GLDS contains 80 datasets and has search and download capabilities. Version 2.0 is slated for release in September 2017 and will have expanded, integrated search capabilities leveraging other public omics databases (NCBI GEO, PRIDE, MG-RAST). Future versions in this multi-phase project will provide a collaborative platform for omics data analysis. Data from experiments that explore the biological effects of the spaceflight environment on a wide variety of model organisms are housed in the GLDS, including data from rodents, invertebrates, plants and microbes. Human datasets are currently limited to those with anonymized data (e.g., from cultured cell lines). GeneLab ensures prompt release and open access to high-throughput genomics, transcriptomics, proteomics, and metabolomics data from spaceflight and ground-based simulations of microgravity, radiation or other space environment factors. The data are meticulously curated to assure that accurate experimental and sample processing metadata are included with each data set. GLDS download volumes indicate strong

  12. Internet Geo-Location

    Science.gov (United States)

    2017-12-01

    Internet Geo-Location. Duke University, December 2017. Final technical report, covering May 2014 – May 2017. Approved for public release; distribution unlimited. ...of SpeedTest servers that are used by end users to measure the speed of their Internet connection. The servers log the IP address and the location

  13. DactyLoc : A minimally geo-referenced WiFi+GSM-fingerprint-based localization method for positioning in urban spaces

    DEFF Research Database (Denmark)

    Cujia, Kristian; Wirz, Martin; Kjærgaard, Mikkel Baun

    2012-01-01

    Fingerprinting-based localization methods relying on WiFi and GSM information provide sufficient localization accuracy for many mobile phone applications. Most of the existing approaches require a training set consisting of geo-referenced fingerprints to build a reference database. We propose a collaborative, semi-supervised WiFi+GSM fingerprinting method where only a small fraction of all fingerprints needs to be geo-referenced. Our approach enables indexing of areas in the absence of GPS reception as often found in urban spaces and indoors without manual labeling of fingerprints. The method takes...

  14. Development of SRS.php, a Simple Object Access Protocol-based library for data acquisition from integrated biological databases.

    Science.gov (United States)

    Barbosa-Silva, A; Pafilis, E; Ortega, J M; Schneider, R

    2007-12-11

    Data integration has become an important task for biological database providers. The current model for data exchange among different sources simplifies the manner that distinct information is accessed by users. The evolution of data representation from HTML to XML enabled programs, instead of humans, to interact with biological databases. We present here SRS.php, a PHP library that can interact with the data integration Sequence Retrieval System (SRS). The library has been written using SOAP definitions, and permits the programmatic communication through webservices with the SRS. The interactions are possible by invoking the methods described in WSDL by exchanging XML messages. The current functions available in the library have been built to access specific data stored in any of the 90 different databases (such as UNIPROT, KEGG and GO) using the same query syntax format. The inclusion of the described functions in the source of scripts written in PHP enables them as webservice clients to the SRS server. The functions permit one to query the whole content of any SRS database, to list specific records in these databases, to get specific fields from the records, and to link any record among any pair of linked databases. The case study presented exemplifies the library usage to retrieve information regarding registries of a Plant Defense Mechanisms database. The Plant Defense Mechanisms database is currently being developed, and the proposal of SRS.php library usage is to enable the data acquisition for the further warehousing tasks related to its setup and maintenance.
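
The library wraps SOAP calls whose methods are described in the server's WSDL, exchanged as XML messages. A rough stdlib-only sketch of what such a request looks like on the wire; the method name `getEntryList`, the parameter names, and the `urn:srs` namespace are placeholders for illustration, not the real SRS interface.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_envelope(method, params, ns="urn:srs"):
    # Build a minimal SOAP 1.1 request: an Envelope containing a Body,
    # which in turn holds one namespaced element per remote method call.
    # Real method and parameter names come from the service's WSDL.
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{ns}}}{method}")
    for name, value in params.items():
        ET.SubElement(call, name).text = str(value)
    return ET.tostring(env, encoding="unicode")

# Hypothetical query: list UNIPROT entries matching a keyword
msg = soap_envelope("getEntryList", {"database": "UNIPROT", "query": "defensin"})
print(msg)
```

A real client would POST this envelope over HTTP to the SRS endpoint and parse the XML response in the same way.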

  15. Multilingual access to full text databases

    International Nuclear Information System (INIS)

    Fluhr, C.; Radwan, K.

    1990-05-01

    Many full text databases are available in only one language, or they may contain documents in different languages. Even if the user is able to understand the language of the documents in the database, it may be easier for him to express his need in his own language. For databases containing documents in different languages, it is simpler to formulate the query in one language only and to retrieve documents in several languages. This paper presents the developments and the first experiments in multilingual search, applied to the French-English pair, for text data in the nuclear field, based on the SPIRIT system. After recalling the general problems of full text database search by queries formulated in natural language, we present the methods used to reformulate the queries and show how they can be expanded for multilingual search. The first results on data in the nuclear field are presented (AFCEN standards and INIS abstracts). 4 refs
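
At its simplest, the query-reformulation step can be reduced to dictionary-based substitution: each source-language term is replaced by its target-language translations, and untranslatable terms are kept so that proper nouns and codes still match. The lexicon entries below are invented toy examples; the actual SPIRIT reformulation uses morphological analysis and weighted transfer rules.

```python
# Toy French-to-English lexicon (hypothetical entries, not SPIRIT's)
FR_EN = {
    "combustible": ["fuel"],
    "réacteur": ["reactor"],
    "déchets": ["waste", "wastes"],
}

def expand_query(terms, lexicon):
    # Reformulate a French query for an English collection: substitute
    # each term with its dictionary translations; keep unknown terms
    # unchanged (acronyms and proper nouns often match as-is).
    expanded = []
    for term in terms:
        expanded.extend(lexicon.get(term.lower(), [term]))
    return expanded

print(expand_query(["combustible", "réacteur", "INIS"], FR_EN))
# → ['fuel', 'reactor', 'INIS']
```

Multi-translation entries (like "déchets") show why expansion, rather than one-to-one replacement, is the natural operation here.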

  16. Nuclear physics for geo-neutrino studies

    International Nuclear Information System (INIS)

    Fiorentini, Gianni; Ianni, Aldo; Korga, George; Suvorov, Yury; Lissia, Marcello; Mantovani, Fabio; Miramonti, Lino; Oberauer, Lothar; Obolensky, Michel; Smirnov, Oleg

    2010-01-01

    Geo-neutrino studies are based on theoretical estimates of geo-neutrino spectra. We propose a method for a direct measurement of the energy distribution of antineutrinos from decays of long-lived radioactive isotopes. We present preliminary results for the geo-neutrinos from ²¹⁴Bi decay, a process that accounts for about one-half of the total geo-neutrino signal. The feeding probability of the lowest state of ²¹⁴Bi--the most important for the geo-neutrino signal--is found to be p₀ = 0.177 ± 0.004 (stat) +0.003/−0.001 (sys), under the hypothesis of universal neutrino spectrum shape (UNSS). This value is consistent with the (indirect) estimate of the table of isotopes. We show that achievable larger statistics and reduction of systematics should allow for the testing of possible distortions of the neutrino spectrum from that predicted using the UNSS hypothesis. Implications on the geo-neutrino signal are discussed.

  17. The Geo/Geo/1+1 Queueing System with Negative Customers

    OpenAIRE

    Ma, Zhanyou; Guo, Yalin; Wang, Pengcheng; Hou, Yumei

    2013-01-01

    We study a Geo/Geo/1+1 queueing system with geometrical arrivals of both positive and negative customers in which killing strategies considered are removal of customers at the head (RCH) and removal of customers at the end (RCE). Using quasi-birth-death (QBD) process and matrix-geometric solution method, we obtain the stationary distribution of the queue length, the average waiting time of a new arrival customer, and the probabilities of servers in busy or idle period, respectively. Finally, ...
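
For readers wanting to experiment with discrete-time queues of this family, the stationary queue-length distribution of the plain Geo/Geo/1 chain (without negative customers, a simpler model than the paper's Geo/Geo/1+1 system) can be approximated numerically by truncating the state space. The sketch assumes the late-arrival convention: in each slot an arrival occurs with probability p and a service completes with probability q, independently.

```python
def stationary(p, q, n_max=100, sweeps=5000):
    # Queue-length distribution of a discrete-time Geo/Geo/1 queue,
    # truncated at n_max states and solved by power iteration on the
    # birth-death transition probabilities.
    pi = [1.0] + [0.0] * n_max
    for _ in range(sweeps):
        nxt = [0.0] * (n_max + 1)
        for n, mass in enumerate(pi):
            if mass == 0.0:
                continue
            up = p if n == 0 else p * (1 - q)      # grow by one
            down = 0.0 if n == 0 else (1 - p) * q  # shrink by one
            if n == n_max:
                up = 0.0                           # truncation boundary
            nxt[n] += mass * (1.0 - up - down)
            if up:
                nxt[n + 1] += mass * up
            if down:
                nxt[n - 1] += mass * down
        pi = nxt
    return pi

# Load p/q = 0.4 < 1, so the queue is stable
pi = stationary(p=0.2, q=0.5)
print(round(pi[0], 3))  # fraction of slots the system is empty, ≈ 0.6
```

The empty-system probability agrees with the rate-conservation argument P(busy) = p/q; the QBD matrix-geometric method in the paper generalizes this to the two-server system with negative customers.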

  18. Geo-collaboration under stress

    NARCIS (Netherlands)

    Looije, R.; Brake, G.M. te; Neerincx, M.A.

    2007-01-01

    “Most of the science and decision making involved in geo-information is the product of collaborative teams. Current geospatial technologies are a limiting factor because they do not provide any direct support for group efforts. In this paper we present a method to enhance geo-collaboration by

  19. [Brief introduction of geo-authentic herbs].

    Science.gov (United States)

    Liang, Fei; Li, Jian; Zhang, Wei; Zhang, Rui-Xian

    2013-05-01

    The science of geo-authentic herbs is a characteristic discipline of traditional Chinese medicine established during thousands of years of clinical practice. It has a long history under the guidance of profound theories of traditional Chinese medicine. The words of "geo-authentic product" were derived from an administrative division unit in the ancient times, which laid stress on the good quality of products in particular regions. In ancient records of traditional Chinese medicine, the words of "geo-authentic product" were first found in Concise Herbal Foundation Compilation of the Ming dynasty, and the words of "geo-authentic herbs" were first discovered in Peony Pavilion of the late Ming dynasty. After all, clinical effect is the fundamental evaluation standard of geo-authentic herbs.

  20. Reciprocal Estimation of Pedestrian Location and Motion State toward a Smartphone Geo-Context Computing Solution

    Directory of Open Access Journals (Sweden)

    Jingbin Liu

    2015-06-01

    Full Text Available The rapid advance in mobile communications has made information and services ubiquitously accessible. Location and context information have become essential for the effectiveness of services in the era of mobility. This paper proposes the concept of geo-context that is defined as an integral synthesis of geographical location, human motion state and mobility context. A geo-context computing solution consists of a positioning engine, a motion state recognition engine, and a context inference component. In the geo-context concept, the human motion states and mobility context are associated with the geographical location where they occur. A hybrid geo-context computing solution is implemented that runs on a smartphone, and it utilizes measurements of multiple sensors and signals of opportunity that are available within a smartphone. Pedestrian location and motion states are estimated jointly under the framework of hidden Markov models, and they are used in a reciprocal manner to improve their estimation performance of one another. It is demonstrated that pedestrian location estimation has better accuracy when its motion state is known, and in turn, the performance of motion state recognition can be improved with increasing reliability when the location is given. The geo-context inference is implemented simply with the expert system principle, and more sophisticated approaches will be developed.
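
The joint estimation idea can be illustrated with a tiny hidden Markov model whose hidden states couple a coarse location with a motion mode, decoded with the standard Viterbi algorithm. All state names, probabilities, and observation symbols below are invented for illustration; the paper's models are estimated from smartphone sensor measurements.

```python
import math

def viterbi(states, start, trans, emit, obs):
    # Standard Viterbi decoding in log space: track the best-scoring
    # path into each state and backtrack via stored paths.
    best = {s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}
    path = {s: [s] for s in states}
    for o in obs[1:]:
        new_best, new_path = {}, {}
        for s in states:
            prev = max(states, key=lambda r: best[r] + math.log(trans[r][s]))
            new_best[s] = best[prev] + math.log(trans[prev][s]) + math.log(emit[s][o])
            new_path[s] = path[prev] + [s]
        best, path = new_best, new_path
    return path[max(states, key=best.get)]

# Hypothetical joint states: (indoors/outdoors) x (still/walking)
states = ["in-still", "in-walk", "out-walk"]
start = {"in-still": 0.6, "in-walk": 0.3, "out-walk": 0.1}
trans = {
    "in-still": {"in-still": 0.7, "in-walk": 0.25, "out-walk": 0.05},
    "in-walk":  {"in-still": 0.3, "in-walk": 0.5,  "out-walk": 0.2},
    "out-walk": {"in-still": 0.1, "in-walk": 0.3,  "out-walk": 0.6},
}
# Toy observation alphabet standing in for accelerometer/RSSI features
emit = {
    "in-still": {"quiet": 0.8, "shaky": 0.1, "gps": 0.1},
    "in-walk":  {"quiet": 0.2, "shaky": 0.7, "gps": 0.1},
    "out-walk": {"quiet": 0.1, "shaky": 0.4, "gps": 0.5},
}
print(viterbi(states, start, trans, emit, ["quiet", "shaky", "gps", "gps"]))
```

Because location and motion live in one joint state, evidence about either one (a GPS fix, a burst of accelerometer activity) constrains both, which is the reciprocal-estimation effect the paper exploits.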

  1. PDTD: a web-accessible protein database for drug target identification

    Directory of Open Access Journals (Sweden)

    Gao Zhenting

    2008-02-01

    Full Text Available Abstract Background Target identification is important for modern drug discovery. With the advances in the development of molecular docking, potential binding proteins may be discovered by docking a small molecule to a repository of proteins with three-dimensional (3D structures. To complete this task, a reverse docking program and a drug target database with 3D structures are necessary. To this end, we have developed a web server tool, TarFisDock (Target Fishing Docking http://www.dddc.ac.cn/tarfisdock, which has been used widely by others. Recently, we have constructed a protein target database, Potential Drug Target Database (PDTD, and have integrated PDTD with TarFisDock. This combination aims to assist target identification and validation. Description PDTD is a web-accessible protein database for in silico target identification. It currently contains >1100 protein entries with 3D structures presented in the Protein Data Bank. The data are extracted from the literature and several online databases such as TTD, DrugBank and Thomson Pharma. The database covers diverse information on >830 known or potential drug targets, including protein and active site structures in both PDB and mol2 formats, related diseases, biological functions, as well as associated regulating (signaling pathways. Each target is categorized by both nosology and biochemical function. PDTD supports keyword search functions, such as PDB ID, target name, and disease name. Data sets generated by PDTD can be viewed with the plug-in of molecular visualization tools and can also be downloaded freely. Remarkably, PDTD is specially designed for target identification. In conjunction with TarFisDock, PDTD can be used to identify binding proteins for small molecules. The results can be downloaded in the form of a mol2 file with the binding pose of the probe compound and a list of potential binding targets according to their ranking scores. Conclusion PDTD serves as a comprehensive and

  2. Access 2013 bible

    CERN Document Server

    Alexander, Michael

    2013-01-01

    A comprehensive reference to the updated and new features of Access 2013 As the world's most popular database management tool, Access enables you to organize, present, analyze, and share data as well as build powerful database solutions. However, databases can be complex. That's why you need the expert guidance in this comprehensive reference. Access 2013 Bible helps you gain a solid understanding of database purpose, construction, and application so that whether you're new to Access or looking to upgrade to the 2013 version, this well-rounded resource provides you with a th

  3. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Directory of Open Access Journals (Sweden)

    Xiaohuan Yang

    2009-02-01

    Full Text Available The spatial distribution of population is closely related to land use and land cover (LULC patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B data integrated with a Pattern Decomposition Method (PDM and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM. The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable.

  4. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Science.gov (United States)

    Yang, Xiaohuan; Huang, Yaohuan; Dong, Pinliang; Jiang, Dong; Liu, Honghui

    2009-01-01

    The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable. PMID:22399959
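
The core allocation step of a population spatialization model can be sketched as simple dasymetric weighting: a census total for a region is spread over its grid cells in proportion to land-cover-derived weights. The weights and numbers below are invented for illustration; the PSM described in the paper uses richer natural and socio-economic variables.

```python
def spatialize(region_pop, cells):
    # Distribute a region's census population over its grid cells in
    # proportion to per-cell land-use weights (dasymetric mapping).
    # `cells` maps cell id -> weight (e.g. built-up land fraction).
    total_w = sum(cells.values())
    if total_w == 0:
        # No weighted land at all: fall back to a uniform spread.
        return {c: region_pop / len(cells) for c in cells}
    return {c: region_pop * w / total_w for c, w in cells.items()}

# Hypothetical 1 km cells: urban cells attract most of the population
cells = {"c1": 0.9, "c2": 0.5, "c3": 0.1, "c4": 0.0}
grid = spatialize(30000, cells)
print({k: round(v) for k, v in grid.items()})
```

Validation against finer census units, as done with the Yishui County township data, amounts to re-aggregating these cell values and comparing the sums.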

  5. DOT Online Database

    Science.gov (United States)

    Online document database providing full-text WebSearch access to DOT records, including Advisory Circulars (2092 records) and data collection and distribution policies. Document Database Website provided by MicroSearch.

  6. Harmful algal bloom historical database from Coastal waters of Florida from 01 November 1995 to 09 September 1996 (NODC Accession 0019216)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — In the later part of 1999, a relational Microsoft Access database was created to accommodate a wide range of data on the phytoplankton Karenia brevis. This database,...

  7. MetIDB: A Publicly Accessible Database of Predicted and Experimental 1H NMR Spectra of Flavonoids

    NARCIS (Netherlands)

    Mihaleva, V.V.; Beek, te T.A.; Zimmeren, van F.; Moco, S.I.A.; Laatikainen, R.; Niemitz, M.; Korhonen, S.P.; Driel, van M.A.; Vervoort, J.

    2013-01-01

    Identification of natural compounds, especially secondary metabolites, has been hampered by the lack of easy-to-use and accessible reference databases. Nuclear magnetic resonance (NMR) spectroscopy is the most selective technique for identification of unknown metabolites. High quality 1H NMR (proton

  8. LAND-deFeND - An innovative database structure for landslides and floods and their consequences.

    Science.gov (United States)

    Napolitano, Elisabetta; Marchesini, Ivan; Salvati, Paola; Donnini, Marco; Bianchi, Cinzia; Guzzetti, Fausto

    2018-02-01

    Information on historical landslides and floods - collectively called "geo-hydrological hazards" - is key to understanding the complex dynamics of the events, to estimating the temporal and spatial frequency of damaging events, and to quantifying their impact. A number of databases on geo-hydrological hazards and their consequences have been developed worldwide at different geographical and temporal scales. Of the few available database structures that can handle information on both landslides and floods, some are outdated and others were not designed to store, organize, and manage information on single phenomena or on the type and monetary value of the damages and the remediation actions. Here, we present the LANDslides and Floods National Database (LAND-deFeND), a new database structure able to store, organize, and manage in a single digital structure spatial information collected from various sources with different accuracy. In designing LAND-deFeND, we defined four groups of entities, namely: nature-related, human-related, geospatial-related, and information-source-related entities that collectively can fully describe the geo-hydrological hazards and their consequences. In LAND-deFeND, the main entities are the nature-related entities, encompassing: (i) the "phenomenon", a single landslide or local inundation, (ii) the "event", which represents the ensemble of the inundations and/or landslides that occurred in a conventional geographical area in a limited period, and (iii) the "trigger", which is the meteo-climatic or seismic cause of the geo-hydrological hazards. LAND-deFeND maintains the relations between the nature-related entities and the human-related entities even where the information is partially missing. The physical model of LAND-deFeND contains 32 tables, including nine input tables, 21 dictionary tables, and two association tables, and ten views, including specific views that make the database structure compliant with the EC INSPIRE and the Floods

  9. Pricing Analysis in Geo/Geo/1 Queueing System

    Directory of Open Access Journals (Sweden)

    Yan Ma

    2015-01-01

    This paper studies the equilibrium behavior of customers and optimal pricing strategies of servers in a Geo/Geo/1 queueing system. Two common pricing mechanisms are considered. The first one is called the ex-post payment (EPP) scheme, where the server collects tolls proportional to queue times, and the second one is called the ex-ante payment (EAP) scheme, where the server charges a flat fee for the total service. The server sets the toll price to maximize its own profit. It is found that, under a customer's choice equilibrium, the two toll mechanisms are equivalent from the economic point of view. Finally, we present several numerical experiments to investigate the effects of system parameters on the equilibrium customer joining rate and servers' profits.
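
    The underlying queue is easy to experiment with numerically. Below is a minimal discrete-time simulation sketch of a Geo/Geo/1 queue; the departure-first event ordering and the parameter values are assumptions for illustration, not taken from the paper. The mean sojourn time, which an EPP-style toll would be proportional to, then follows from Little's law:

```python
import random

def simulate_geo_geo_1(p, q, slots=200000, seed=1):
    """Discrete-time Geo/Geo/1 sketch: in each slot a customer arrives
    with probability p; if the server is busy, service completes with
    probability q (departures processed before arrivals, a modelling
    choice). Returns the time-average number in system."""
    random.seed(seed)
    n = 0       # customers currently in system
    area = 0    # accumulated customer-slots
    for _ in range(slots):
        if n > 0 and random.random() < q:   # possible service completion
            n -= 1
        if random.random() < p:             # possible arrival
            n += 1
        area += n
    return area / slots

L = simulate_geo_geo_1(p=0.3, q=0.5)  # stable since p < q
W = L / 0.3                           # mean sojourn time via Little's law
```

    Sweeping p or the toll-dependent joining rate in such a simulation is one way to reproduce the kind of numerical experiments the paper describes.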

  10. The geo-genic radon potential map of the aspiring 'Buzau Land' Geo-park

    International Nuclear Information System (INIS)

    Moldovan, M. C.; Burghele, B. D.; Roba, C. A.; Sferle, T. L.; Buterez, C.; Mitrofan, H.

    2017-01-01

    Mapping the geo-genic radon potential in Buzau County is part of a research project aiming to apply research to sustainable development and economic growth, following the principles of geo-conservation, in order to support the 'Buzau Land' UNESCO Geo-park initiative. The map of geo-genic radon will be used as an overview for planning purposes. The main geological formations of the studied area were identified as Cretaceous and Paleogene flysch, included in a thin-skinned nappe pile and consisting of alternating sandstones, marls, clays and, subordinately, conglomerates, all tightly folded or faulted. Significant variations in the radon concentration in the ground were therefore found. However, no high values were recorded, the maximum measured activity concentration being 101.6 kBq m⁻³. (authors)

  11. Open-access MIMIC-II database for intensive care research.

    Science.gov (United States)

    Lee, Joon; Scott, Daniel J; Villarroel, Mauricio; Clifford, Gari D; Saeed, Mohammed; Mark, Roger G

    2011-01-01

    The critical state of intensive care unit (ICU) patients demands close monitoring, and as a result a large volume of multi-parameter data is collected continuously. This represents a unique opportunity for researchers interested in clinical data mining. We sought to foster a more transparent and efficient intensive care research community by building a publicly available ICU database, namely Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II). The data harnessed in MIMIC-II were collected from the ICUs of Beth Israel Deaconess Medical Center from 2001 to 2008 and represent 26,870 adult hospital admissions (version 2.6). MIMIC-II consists of two major components: clinical data and physiological waveforms. The clinical data, which include patient demographics, intravenous medication drip rates, and laboratory test results, were organized into a relational database. The physiological waveforms, including 125 Hz signals recorded at bedside and corresponding vital signs, were stored in an open-source format. MIMIC-II data were also deidentified in order to remove protected health information. Any interested researcher can gain access to MIMIC-II free of charge after signing a data use agreement and completing human subjects training. MIMIC-II can support a wide variety of research studies, ranging from the development of clinical decision support algorithms to retrospective clinical studies. We anticipate that MIMIC-II will be an invaluable resource for intensive care research by stimulating fair comparisons among different studies.

  12. Download - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata]

    Lifescience Database Archive (English)

    First of all, please read the license of this database. Data can be obtained via simple search and download, or downloaded via FTP; the FTP server is sometimes jammed - if it is, access [here].

  13. Accessibility to health care facilities in Montreal Island: an application of relative accessibility indicators from the perspective of senior and non-senior residents

    Directory of Open Access Journals (Sweden)

    Morency Catherine

    2010-10-01

    Background: Geographical access to health care facilities is known to influence health services usage. As societies age, accessibility to health care becomes an increasingly acute public health concern. It is known that seniors tend to have lower mobility levels, and it is possible that this may negatively affect their ability to reach facilities and services. Therefore, it becomes important to examine the mobility situation of seniors vis-a-vis the spatial distribution of health care facilities, to identify areas where accessibility is low and interventions may be required. Methods: Accessibility is implemented using a cumulative opportunities measure. Instead of assuming a fixed bandwidth (i.e. a distance threshold) for measuring accessibility, in this paper the bandwidth is defined using model-based estimates of average trip length. Average trip length is an all-purpose indicator of individual mobility and geographical reach. Adoption of a spatial modelling approach allows us to tailor these estimates of travel behaviour to specific locations and person profiles. Replacing a fixed bandwidth with these estimates permits us to calculate customized location- and person-based accessibility measures that allow inter-personal as well as geographical comparisons. Data: The case study is Montreal Island. Geo-coded travel behaviour data, specifically average trip length, and relevant travellers' attributes are obtained from the Montreal Household Travel Survey. These data are complemented with information from the Census. Health care facilities, also geo-coded, are extracted from a comprehensive business point database. Health care facilities are selected based on Standard Industrial Classification codes 8011-21 (Medical Doctors and Dentists). Results: Model-based estimates of average trip length show that travel behaviour varies widely across space.
With the exception of seniors in the downtown area, older residents of Montreal Island tend to be
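
    A cumulative opportunities measure with a person-specific bandwidth can be sketched as follows; the coordinates, facility locations and bandwidth values are hypothetical, and a real implementation would typically use network rather than straight-line distance:

```python
from math import hypot

def cumulative_opportunities(origin, facilities, bandwidth):
    """Count facilities within `bandwidth` km of `origin`, where the
    bandwidth is the model-estimated average trip length for that
    person profile and location. Straight-line distance for brevity."""
    x0, y0 = origin
    return sum(1 for (x, y) in facilities
               if hypot(x - x0, y - y0) <= bandwidth)

# Hypothetical clinic locations (km grid) and trip-length estimates:
clinics = [(1.0, 0.5), (3.2, 1.1), (0.4, 2.0), (6.0, 6.0)]
senior_reach = cumulative_opportunities((0.0, 0.0), clinics, bandwidth=2.5)
nonsenior_reach = cumulative_opportunities((0.0, 0.0), clinics, bandwidth=4.0)
```

    Because the bandwidth varies by person profile, the same origin can yield different accessibility scores for seniors and non-seniors, which is exactly the comparison the paper makes.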

  14. User Defined Geo-referenced Information

    DEFF Research Database (Denmark)

    Konstantas, Dimitri; Villalba, Alfredo; di Marzo Serugendo, Giovanna

    2009-01-01

    In this paper we present two novel mobile and wireless collaborative services and concepts: the Hovering Information, a mobile, geo-referenced content information management system, and the QoS Information service, providing user-observed end-to-end infrastructure geo-related QoS information.

  15. SOIL Geo-Wiki: A tool for improving soil information

    Science.gov (United States)

    Skalský, Rastislav; Balkovic, Juraj; Fritz, Steffen; See, Linda; van der Velde, Marijn; Obersteiner, Michael

    2014-05-01

    Crowdsourcing is increasingly being used as a way of collecting data for scientific research, e.g. species identification, classification of galaxies and unravelling of protein structures. The WorldSoilProfiles.org database at ISRIC is a global collection of soil profiles, which have been 'crowdsourced' from experts. This system, however, requires contributors to have a priori knowledge about soils. Yet many soil parameters, such as stone content, soil depth or color, can be observed in the field without specific knowledge or equipment. By crowdsourcing this information over thousands of locations, the uncertainty in current soil datasets could be radically reduced, particularly in areas currently without information or where multiple interpretations are possible from different existing soil maps. Improved information on soils could benefit many research fields and applications. Better soil data could enhance assessments of soil ecosystem services (e.g. soil carbon storage) and facilitate improved process-based ecosystem modeling from local to global scales. Geo-Wiki is a crowdsourcing tool that was developed at IIASA for land cover validation using satellite imagery. Several branches are now available focused on specific aspects of land cover validation, e.g. validating cropland extent or urbanized areas. Geo-Wiki Pictures is a smart phone application for collecting land cover related information on the ground. The extension of Geo-Wiki to a mobile environment provides a tool for experts in land cover validation but is also a way of reaching the general public in the validation of land cover. Here we propose a Soil Geo-Wiki tool that builds on the existing functionality of the Geo-Wiki application, which will be largely designed for the collection and sharing of soil information. Two distinct applications are envisaged: an expert-oriented application mainly for scientific purposes, which will use soil science related language (e.g. 
WRB or any other global reference

  16. GEOS-5 Chemistry Transport Model User's Guide

    Science.gov (United States)

    Kouatchou, J.; Molod, A.; Nielsen, J. E.; Auer, B.; Putman, W.; Clune, T.

    2015-01-01

    The Goddard Earth Observing System version 5 (GEOS-5) General Circulation Model (GCM) makes use of the Earth System Modeling Framework (ESMF) to enable model configurations with many functions. One of the options of the GEOS-5 GCM is the GEOS-5 Chemistry Transport Model (GEOS-5 CTM), which is an offline simulation of chemistry and constituent transport driven by a specified meteorology and other model output fields. This document describes the basic components of the GEOS-5 CTM and serves as a user's guide on how to obtain and run simulations on the NCCS Discover platform. In addition, we provide information on how to change the model configuration input files to meet users' needs.

  17. NEMiD: a web-based curated microbial diversity database with geo-based plotting.

    Science.gov (United States)

    Bhattacharjee, Kaushik; Joshi, Santa Ram

    2014-01-01

    The majority of the Earth's microbes remain unknown, and their potential utility cannot be exploited until they are discovered and characterized. They provide wide scope for the development of new strains as well as biotechnological uses. The documentation and bioprospection of microorganisms carry enormous significance considering their relevance to human welfare. This calls for an urgent need to develop a database with emphasis on the microbial diversity of the largest untapped reservoirs in the biosphere. The data annotated in the North-East India Microbial database (NEMiD) were obtained by the isolation and characterization of microbes from different parts of the Eastern Himalayan region. The database was constructed as a relational database management system (RDBMS) for data storage in MySQL in the back-end on a Linux server and implemented in an Apache/PHP environment. This database provides a base for understanding the soil microbial diversity pattern in this megabiodiversity hotspot and indicates the distribution patterns of various organisms along with identification. The NEMiD database is freely available at www.mblabnehu.info/nemid/.

  18. NEMiD: A Web-Based Curated Microbial Diversity Database with Geo-Based Plotting

    Science.gov (United States)

    Bhattacharjee, Kaushik; Joshi, Santa Ram

    2014-01-01

    The majority of the Earth's microbes remain unknown, and their potential utility cannot be exploited until they are discovered and characterized. They provide wide scope for the development of new strains as well as biotechnological uses. The documentation and bioprospection of microorganisms carry enormous significance considering their relevance to human welfare. This calls for an urgent need to develop a database with emphasis on the microbial diversity of the largest untapped reservoirs in the biosphere. The data annotated in the North-East India Microbial database (NEMiD) were obtained by the isolation and characterization of microbes from different parts of the Eastern Himalayan region. The database was constructed as a relational database management system (RDBMS) for data storage in MySQL in the back-end on a Linux server and implemented in an Apache/PHP environment. This database provides a base for understanding the soil microbial diversity pattern in this megabiodiversity hotspot and indicates the distribution patterns of various organisms along with identification. The NEMiD database is freely available at www.mblabnehu.info/nemid/. PMID:24714636

  19. CORAL Server and CORAL Server Proxy: Scalable Access to Relational Databases from CORAL Applications

    International Nuclear Information System (INIS)

    Valassi, A; Kalkhof, A; Bartoldus, R; Salnikov, A; Wache, M

    2011-01-01

    The CORAL software is widely used at CERN by the LHC experiments to access the data they store on relational databases, such as Oracle. Two new components have recently been added to implement a model involving a middle tier 'CORAL server' deployed close to the database and a tree of 'CORAL server proxies', providing data caching and multiplexing, deployed close to the client. A first implementation of the two new components, released in the summer 2009, is now deployed in the ATLAS online system to read the data needed by the High Level Trigger, allowing the configuration of a farm of several thousand processes. This paper reviews the architecture of the software, its development status and its usage in ATLAS.

  20. Framework 'interstitial' oxygen in La10(GeO4)5(GeO5)O2 apatite electrolyte

    International Nuclear Information System (INIS)

    Pramana, S.S.; White, T.J.

    2007-01-01

    Oxygen conduction at low temperatures makes apatites potentially useful as electrolytes in solid-oxide fuel cells, but our understanding of the defect structures enabling ion migration is incomplete. While conduction along [001] channels is dominant, considerable inter-tunnel mobility has been recognized. Using neutron powder diffraction of stoichiometric 'La10(GeO4)6O3', it has been shown that this compound is more correctly described as an La10(GeO4)5(GeO5)O2 apatite, in which high concentrations of interstitial oxygen reside within the channel walls. It is suggested that these framework interstitial O atoms provide a reservoir of ions that can migrate into the conducting channels of the apatite, via a mechanism of inter-tunnel oxygen diffusion that transiently converts GeO4 tetrahedra to GeO5 distorted trigonal bipyramids. This structural modification is consistent with known crystal chemistry and may occur generally in oxide apatites. (orig.)

  1. SierraDNA – Demonstrating the Usefulness of Direct ILS Database Access

    Directory of Open Access Journals (Sweden)

    James Padgett

    2015-10-01

    Innovative Interfaces' Sierra™ Integrated Library System (ILS) brings with it a Database Navigator Application (SierraDNA) - in layman's terms, SierraDNA gives Sierra sites read access to their ILS database. Unlike the closed use cases produced by vendor-supplied APIs, which restrict libraries to limited development opportunities, SierraDNA enables sites to publish their own APIs and scripts based upon custom SQL code to meet their own needs and those of their users and processes. In this article we give examples showing how SierraDNA can be utilized to improve library services. We highlight three example use cases which have benefited our users, enhanced online security and improved our back office processes. In the first use case we employ user access data from our electronic resources proxy server (WAM) to detect hacked user accounts. Three scripts are used in conjunction to flag user accounts which are being hijacked to systematically steal content from our electronic resource providers' websites. In the second we utilize the reading histories of our users to augment our search experience with an Amazon-style "People who borrowed this book also borrowed… these books" feature. Two scripts are used together to determine which other items were borrowed by borrowers of the item currently of interest. And lastly, we use item holds data to improve our acquisitions workflow through an automated demand-based ordering process. Our explanation and SQL code should be of direct use for adoption or as examples for other Sierra customers willing to exploit their ILS data in similar ways, but the principles may also be useful to non-Sierra sites that wish to enhance security, improve user services or streamline back office processes.
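
    The "also borrowed" use case reduces to a self-join over a reading-history table. The sketch below uses an illustrative SQLite schema (Sierra's actual table and column names differ; only the join pattern is the point):

```python
import sqlite3

# Hypothetical reading-history table: checkout(patron_id, item_id).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE checkout (patron_id INTEGER, item_id TEXT);
INSERT INTO checkout VALUES
 (1,'A'),(1,'B'),(2,'A'),(2,'B'),(2,'C'),(3,'A'),(3,'C');
""")

# "People who borrowed item 'A' also borrowed..." - self-join on the
# patron, exclude the item itself, rank by number of shared borrowers.
rows = con.execute("""
SELECT c2.item_id, COUNT(DISTINCT c2.patron_id) AS borrowers
FROM checkout c1
JOIN checkout c2 ON c2.patron_id = c1.patron_id
WHERE c1.item_id = 'A' AND c2.item_id <> 'A'
GROUP BY c2.item_id
ORDER BY borrowers DESC, c2.item_id
""").fetchall()
```

    Against a read-only SierraDNA connection the same SELECT pattern would simply target the site's real circulation tables.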

  2. Geophysical Data Sets in GeoMapApp

    Science.gov (United States)

    Goodwillie, A. M.

    2017-12-01

    GeoMapApp (http://www.geomapapp.org), a free map-based data tool developed at Lamont-Doherty Earth Observatory, provides access to hundreds of integrated geoscience data sets that are useful for geophysical studies. Examples include earthquake and volcano catalogues, gravity and magnetics data, seismic velocity tomographic models, geological maps, geochemical analytical data, lithospheric plate boundary information, geodetic velocities, and high-resolution bathymetry and land elevations. Users can also import and analyse their own data files. Data analytical functions provide contouring, shading, profiling, layering and transparency, allowing multiple data sets to be seamlessly compared. A new digitization and field planning portal allows stations and waypoints to be generated. Sessions can be saved and shared with colleagues and students. In this eLightning presentation we will demonstrate some of GeoMapApp's capabilities with a focus upon subduction zones and tectonics. In the attached screen shot of the Cascadia margin, the contoured depth to the top of the subducting Juan de Fuca slab is overlain on a shear wave velocity depth slice. Geochemical data coloured on Al2O3 and scaled on MgO content is shown as circles. The stack of data profiles was generated along the white line.

  3. Utility assessment of a map-based online geo-collaboration tool.

    Science.gov (United States)

    Sidlar, Christopher L; Rinner, Claus

    2009-05-01

    Spatial group decision-making processes often include both informal and analytical components. Discussions among stakeholders or planning experts are an example of an informal component. When participants discuss spatial planning projects they typically express concerns and comments by pointing to places on a map. The Argumentation Map model provides a conceptual basis for collaborative tools that enable explicit linkages of arguments to the places to which they refer. These tools allow for the input of explicitly geo-referenced arguments as well as the visual access to arguments through a map interface. In this paper, we will review previous utility studies in geo-collaboration and evaluate a case study of a Web-based Argumentation Map application. The case study was conducted in the summer of 2005 when student participants discussed planning issues on the University of Toronto St. George campus. During a one-week unmoderated discussion phase, 11 participants wrote 60 comments on issues such as safety, facilities, parking, and building aesthetics. By measuring the participants' use of geographic references, we draw conclusions on how well the software tool supported the potential of the underlying concept. This research aims to contribute to a scientific approach to geo-collaboration in which the engineering of novel spatial decision support methods is complemented by a critical assessment of their utility in controlled, realistic experiments.

  4. Geo-Seas - building a unified e-infrastructure for marine geoscientific data management in Europe

    Science.gov (United States)

    Glaves, H.; Schaap, D.

    2012-04-01

    A significant barrier to marine geoscientific research in Europe is the lack of standardised marine geological and geophysical data and data products which could potentially facilitate multidisciplinary marine research extending across national and international boundaries. Although there are large volumes of geological and geophysical data available for the marine environment, it is currently very difficult to use these datasets in an integrated way due to different nomenclatures, formats, scales and coordinate systems being used within different organisations as well as between countries. This makes the direct use of primary data very difficult and also hampers use of the data to produce integrated multidisciplinary data products and services. The Geo-Seas project, an EU Framework 7 funded initiative, is developing a unified e-infrastructure to facilitate the sharing of marine geoscientific data within Europe. This e-infrastructure is providing on-line access to both discovery metadata and the associated federated data sets from 26 European data centres via a dedicated portal. The implementation of the Geo-Seas portal is allowing a range of end users to locate, assess and access standardised geoscientific data from multiple sources which is interoperable with other marine data types. Geo-Seas is building on the work already done by the existing SeaDataNet project, which currently provides a data management e-infrastructure for oceanographic data allowing users to locate and access federated oceanographic data sets. By adopting and adapting the SeaDataNet methodologies and technologies, the Geo-Seas project has not only avoided unnecessary duplication of effort by reusing existing and proven technologies but has also contributed to the development of a multidisciplinary approach to ocean science across Europe through the creation of a joint infrastructure for both marine geoscientific and oceanographic data. This approach is also leading to the development of

  5. User Experience Design in Professional Map-Based Geo-Portals

    Directory of Open Access Journals (Sweden)

    Bastian Zimmer

    2013-10-01

    We have recently been witnessing the growing establishment of map-centered web-based geo-portals on national, regional and local levels. However, a particular issue with these geo-portals is that each instance has been implemented in different ways in terms of design, usability, functionality, interaction possibilities, map size and symbologies. In this paper, we try to tackle these shortcomings by analyzing and formalizing the requirements for map-based geo-portals in a user experience based approach. First, we propose a holistic definition of the term "geo-portal". Then, we present our approach to user experience design for map-based geo-portals by defining the functional requirements of a geo-portal, by analyzing previous geo-portal developments, by distilling the results of our empirical user study into practically oriented user requirements, and finally by establishing a set of user experience design guidelines for the creation of map-based geo-portals. These design guidelines have been extracted for each of the main components of a geo-portal, i.e., the map, the search dialogue, the presentation of the search results, symbologies, and other aspects. These guidelines shall constitute the basis for future geo-portal developments to achieve standardization in the user-experience design of map-based geo-portals.

  6. GeoSciML v3.0 - a significant upgrade of the CGI-IUGS geoscience data model

    Science.gov (United States)

    Raymond, O.; Duclaux, G.; Boisvert, E.; Cipolloni, C.; Cox, S.; Laxton, J.; Letourneau, F.; Richard, S.; Ritchie, A.; Sen, M.; Serrano, J.-J.; Simons, B.; Vuollo, J.

    2012-04-01

    analytical data using the Observations and Measurements (ISO19156) and SWE Common v2 models. The GeoSciML v3 data model does not include vocabularies to support the data model. However, it does provide a standard pattern to reference controlled vocabulary concepts using HTTP-URIs. The international GeoSciML community has developed distributed RDF-based geoscience vocabularies that can be accessed by GeoSciML web services using the standard pattern recommended in GeoSciML v3. GeoSciML v3 is the first version of GeoSciML that will be accompanied by web service validation tools using Schematron rules. For example, these validation tools may check for compliance of a web service to a particular profile of GeoSciML, or for logical consistency of data content that cannot be enforced by the application schemas. This validation process will support accreditation of GeoSciML services and a higher degree of semantic interoperability. * International Union of Geological Sciences Commission for Management and Application of Geoscience Information (CGI-IUGS)

  7. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  8. Geo-neutrino review

    International Nuclear Information System (INIS)

    Tolich, N.

    2012-01-01

    The principal source of energy for dynamic processes of the earth, such as plate tectonics, is thought to come from the radioactive decays of 238U, 232Th, and 40K within the earth. These decays produce electron-antineutrinos, so-called geo-neutrinos, the measurement of which near the earth's surface allows for a direct measure of the total radiogenic heat production in the earth. The KamLAND and Borexino experiments have both measured a geo-neutrino flux significantly greater than zero. As shown in these proceedings, more precise future measurements will significantly constrain earth composition models.
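
    The link between isotope abundances and radiogenic heat can be illustrated with a back-of-the-envelope estimate. All numbers below (silicate-Earth mass, elemental abundances, heat-production rates) are commonly cited literature values, not figures from this review:

```python
# Rough radiogenic heat budget of the silicate Earth (mantle + crust).
mass_bse = 4.0e24  # kg, approximate silicate-Earth mass

# Illustrative bulk-silicate-Earth abundances (kg element per kg rock):
abundance = {"U": 20e-9, "Th": 80e-9, "K": 240e-6}

# Approximate heat production per kg of the natural element (W/kg);
# K is small per kg because only 40K (~0.012%) is radioactive:
heat_rate = {"U": 98.1e-6, "Th": 26.4e-6, "K": 3.5e-9}

total_W = sum(mass_bse * abundance[e] * heat_rate[e] for e in abundance)
total_TW = total_W / 1e12  # on the order of 20 TW, i.e. roughly half
                           # of Earth's ~44-47 TW surface heat flow
```

    Geo-neutrino flux measurements constrain exactly these abundance terms, which is why they bear directly on earth composition models.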

  9. GEO600: status and plans

    International Nuclear Information System (INIS)

    Willke, B

    2007-01-01

    The GEO600 gravitational wave detector located near Hannover in Germany is one of the four detectors of the LIGO Scientific Collaboration (LSC). For almost the entire year of 2006, GEO600 participated in the S5 science run of the LSC. Overall an equivalent of about 270 days of science data with an average peak sensitivity of better than 3 x 10^-22 Hz^-1/2 have been acquired so far. In this paper, we describe the status of the GEO600 project during the period between January 2006 and February 2007. In addition, plans for the near-term and medium-term future are discussed.

  10. Description and testing of the Geo Data Portal: Data integration framework and Web processing services for environmental science collaboration

    Science.gov (United States)

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.

    2011-01-01

    Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.

  11. Why Geo-Humanities

    Science.gov (United States)

    Graells, Robert Casals i.; Sibilla, Anna; Bohle, Martin

    2016-04-01

    Anthropogenic global change is a composite process. It consists of societal processes (in the 'noosphere') and natural processes (in the 'bio-geosphere'). The 'noosphere' is the ensemble of social, cultural or political insights ('shared subjective mental concepts') of people. Understanding the composite of societal and natural processes ('human geo-biosphere intersections'), which shapes the features of anthropogenic global change, would benefit from a description that draws equally on natural sciences, social sciences and humanities. To that end it is suggested to develop a concept of 'geo-humanities': This essay presents some aspects of its scope, discussing "knowledge that is to manage", "intentions that are to shape", "choices that are to justify" and "complexity that is to handle". Managing knowledge: That people understand anthropogenic global change requires insight into how 'human geosphere intersections' function. Insights are formed ('processed') in the noosphere by means of interactions between people. Understanding how 'human geosphere intersections' function combines scientific, engineering and economic studies with studies of the dynamics of the noosphere. Shaping intentions: During the last century anthropogenic global change developed as the collateral outcome of humankind's accumulated actions. It is caused by the number of people, the patterns of their consumption of resources, and the alterations of their environments. Nowadays, anthropogenic global change is either intentional negligence or a conscious act. Justifying choices: Humanity has alternatives for how to consciously alter Earth at planetary scale. For example, there is a choice to alter the geo-biosphere or to adjust the noosphere. Whatever the choice, it will depend on people's world-views, cultures and preferences. Thus, beyond the issue of whether science and technology are 'sound', overarching societal issues are to be tackled, such as: (i) how to appropriate and distribute natural

  12. Bi-static Optical Observations of GEO Objects

    Science.gov (United States)

    Seitzer, Patrick; Barker, Edwin S.; Cowardin, Heather; Lederer, Susan M.; Buckalew, Brent

    2014-01-01

    A bi-static study of objects at Geosynchronous Earth Orbit (GEO) was conducted using two ground-based wide-field optical telescopes. The University of Michigan's 0.6-m MODEST (Michigan Orbital Debris Survey Telescope) located at the Cerro Tololo Inter-American Observatory in Chile was employed in a series of coordinated observations with the U.S. Naval Observatory's (USNO) 1.3-m telescope at the USNO Flagstaff Station near Flagstaff, Arizona, USA. The goals of this project are twofold: (1) Obtain optical distances to known and unknown objects at GEO from the difference in the observed topocentric position of objects measured with respect to a reference star frame. The distance can be derived directly from these measurements, and is independent of any orbital solution. The wide geographical separation of these two telescopes means that the parallax difference is larger than ten degrees. (2) Compare optical photometry in similar filters of GEO objects taken during the same time period from the two sites. The objects' illuminated surfaces presented different angles of reflected sunlight to the two telescopes. During a four-hour period on the night of 22 February 2014 (UT), coordinated observations were obtained for eight different GEO positions. Each coordinated observation sequence was started on the hour or half-hour, and was selected to ensure the same cataloged GEO object was available in the field of view of both telescopes during the thirty-minute observing sequence. GEO objects were chosen to be both controlled and uncontrolled at a range of orbital inclinations, and the objects were not tracked. Instead, both telescopes were operated with all drives off in GEO survey mode to discover un-cataloged objects at GEO. The initial results from this proof-of-concept observing run will be presented, with the intent of laying the foundation for future large-scale bi-static observing campaigns of the GEO regime.
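    The orbit-independent distance described in goal (1) follows from simple triangulation: two widely separated stations see the target at slightly different positions against the star background. A minimal sketch of that geometry, using illustrative numbers (the baseline and parallax values below are assumptions, not measurements from the paper):

```python
import math

# Hedged sketch: range to a GEO object from simultaneous two-site astrometry.
# Baseline and parallax values are illustrative, not values from this study.
def range_from_parallax(baseline_km: float, parallax_deg: float) -> float:
    """Range from the chord baseline between two observatories and the
    observed topocentric parallax angle (isosceles-triangle approximation)."""
    p = math.radians(parallax_deg)
    return baseline_km / (2.0 * math.sin(p / 2.0))

# Illustrative: a ~7500 km Chile-Arizona chord and a parallax just over 10 degrees
r = range_from_parallax(7500.0, 10.8)  # on the order of a GEO topocentric range
```

    With a parallax exceeding ten degrees, the small-angle approximation is not needed; the chord formula above gives a range near 40,000 km, consistent with typical topocentric distances to the GEO belt.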

  13. The Case for GEO Hosted SSA Payloads

    Science.gov (United States)

    Welsch, C.; Armand, B.; Repp, M.; Robinson, A.

    2014-09-01

    Space situational awareness (SSA) in the geosynchronous earth orbit (GEO) belt presents unique challenges, and given the national importance and high value of GEO satellites, is increasingly critical as space becomes more congested and contested. Space situational awareness capabilities can serve as an effective deterrent against potential adversaries if they provide accurate, timely, and persistent information and are resilient to the threat environment. This paper will demonstrate how simple optical SSA payloads hosted on GEO commercial and government satellites can complement the SSA mission and data provided by the Space-Based Space Surveillance (SBSS) system and the Geosynchronous Space Situational Awareness Program (GSSAP). GSSAP was built by Orbital Sciences Corporation and launched on July 28, 2014. Analysis performed for this paper will show how GEO hosted SSA payloads, working in combination with SBSS and GSSAP, can increase persistence and timely coverage of high value assets in the GEO belt. The potential to further increase GEO object identification and tracking accuracy by integrating SSA data from multiple sources across different viewing angles, including GEO hosted SSA sources, will be addressed. Hosting SSA payloads on GEO platforms also increases SSA mission architecture resiliency, as the sensors are distributed across multiple platforms, including commercial platforms. This distributed architecture presents a challenging target for an adversary to attempt to degrade or disable. We will present a viable concept of operations to show how data from hosted SSA sensors could be integrated with SBSS and GSSAP data to present a comprehensive and more accurate data set to users. Lastly, we will present an acquisition approach using commercial practices and building on lessons learned from the Commercially Hosted Infrared Payload (CHIRP) to demonstrate the affordability of GEO hosted SSA payloads.

  14. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    Science.gov (United States)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and

  15. Development of a geo-information system for the evaluation of active faults

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Sang Gi; Lee, G. B.; Kim, H. J. [Paichai Univ., Taejon (Korea, Republic of)] (and others)

    2002-03-15

    This project aims to assist the participants of the active fault project by computerizing the field and laboratory data of the study area. The geo-information system therefore not only supports the participants while they are organizing and analyzing their data but also gathers detailed information in digital form. A geological database can be established by organizing the digital information gathered from the participants, and the database can easily be shared among specialists. To this end, a field system for use by the project participants was developed during the first project year. The field system contains not only software but also the available topographic and geological maps of the study area. The system is coded in Visual Basic, and the ESRI MapObjects component and the TrueDBGrid OCX are also utilized. Major functions of the system are tools for vector- and raster-format topographic maps, database design and application, geological symbol plotting, and database search for the plotted geological symbols.
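    The "database search for the plotted geological symbol" function described above amounts to attribute queries against a table of mapped observations. A minimal sketch of such a table, with an entirely hypothetical schema (the column names and values below are illustrative, not taken from the actual system):

```python
import sqlite3

# Hedged sketch: a toy table of the kind a geological field system might keep.
# Schema, symbols, and coordinates are hypothetical illustrations.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE outcrop (
    id INTEGER PRIMARY KEY,
    symbol TEXT,         -- geological symbol plotted on the map
    lon REAL, lat REAL,  -- plot position
    strike REAL, dip REAL)""")
con.executemany(
    "INSERT INTO outcrop (symbol, lon, lat, strike, dip) VALUES (?, ?, ?, ?, ?)",
    [("fault", 127.5, 35.9, 40.0, 75.0),
     ("bedding", 127.6, 35.8, 120.0, 30.0)])

# Search the database for a plotted geological symbol and return its attributes
rows = con.execute(
    "SELECT lon, lat, strike, dip FROM outcrop WHERE symbol = ?",
    ("fault",)).fetchall()
```

    The same pattern generalizes to the real system's richer attribute set; the point is that once field observations are in a relational table, symbol plotting and symbol search are both simple queries.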

  16. BIG GEO DATA MANAGEMENT: AN EXPLORATION WITH SOCIAL MEDIA AND TELECOMMUNICATIONS OPEN DATA

    Directory of Open Access Journals (Sweden)

    C. Arias Munoz

    2016-06-01

    The term Big Data has been recently used to define big, highly varied, complex data sets, which are created and updated at a high speed and require faster processing, namely, a reduced time to filter and analyse relevant data. These data are also increasingly becoming Open Data (data that can be freely distributed) made public by governments, agencies, private enterprises, among others. There are at least two issues that can obstruct the availability and use of Open Big Datasets: Firstly, the gathering and geoprocessing of these datasets are very computationally intensive; hence, it is necessary to integrate high-performance solutions, preferably internet based, to achieve the goals. Secondly, the problems of heterogeneity and inconsistency in geospatial data are well known and affect the data integration process, but they are particularly problematic for Big Geo Data. Therefore, Big Geo Data integration will be one of the most challenging issues to solve. With these applications, we demonstrate that it is possible to provide processed Big Geo Data to common users, using open geospatial standards and technologies. NoSQL databases like MongoDB and frameworks like RASDAMAN could offer different functionalities that facilitate working with larger volumes and more heterogeneous geospatial data sources.

  17. Big Geo Data Management: AN Exploration with Social Media and Telecommunications Open Data

    Science.gov (United States)

    Arias Munoz, C.; Brovelli, M. A.; Corti, S.; Zamboni, G.

    2016-06-01

    The term Big Data has been recently used to define big, highly varied, complex data sets, which are created and updated at a high speed and require faster processing, namely, a reduced time to filter and analyse relevant data. These data are also increasingly becoming Open Data (data that can be freely distributed) made public by governments, agencies, private enterprises, among others. There are at least two issues that can obstruct the availability and use of Open Big Datasets: Firstly, the gathering and geoprocessing of these datasets are very computationally intensive; hence, it is necessary to integrate high-performance solutions, preferably internet based, to achieve the goals. Secondly, the problems of heterogeneity and inconsistency in geospatial data are well known and affect the data integration process, but they are particularly problematic for Big Geo Data. Therefore, Big Geo Data integration will be one of the most challenging issues to solve. With these applications, we demonstrate that it is possible to provide processed Big Geo Data to common users, using open geospatial standards and technologies. NoSQL databases like MongoDB and frameworks like RASDAMAN could offer different functionalities that facilitate working with larger volumes and more heterogeneous geospatial data sources.
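    One concrete MongoDB functionality relevant to geo-tagged social-media records is its 2dsphere index, which supports proximity filters over GeoJSON points. A minimal sketch of the shape of such a query (the collection and field names, such as "loc", are hypothetical; with a live server this filter would be passed to a driver's find() call):

```python
# Hedged sketch: building a MongoDB 2dsphere proximity filter for geo-tagged
# records. The "loc" field name is a hypothetical example; $near, $geometry,
# and $maxDistance are standard MongoDB geospatial query operators.
def near_query(lon: float, lat: float, max_meters: float) -> dict:
    return {
        "loc": {
            "$near": {
                "$geometry": {"type": "Point", "coordinates": [lon, lat]},
                "$maxDistance": max_meters,
            }
        }
    }

# E.g., records within 5 km of central Milan (GeoJSON uses [longitude, latitude])
q = near_query(9.19, 45.46, 5000)
```

    Note the GeoJSON coordinate order (longitude first), a frequent source of integration errors when mixing heterogeneous geospatial sources.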

  18. Implementation of Multiple Access Techniques Applicable for Maritime Satellite Communications

    OpenAIRE

    Stojce Dimov Ilcev

    2013-01-01

    This paper introduces the fundamentals, characteristics, advantages and disadvantages of Multiple Access (MA) techniques employed for transmission in Maritime Mobile Satellite Communications (MMSC) between ships and a Coast Earth Station (CES) via Geostationary Earth Orbit (GEO) or non-GEO satellite constellations. In fixed satellite communication, as a rule, and especially in MMSC, many users are active at the same time. The problem of simultaneous communications between many single or multip...

  19. The energy geo-policy

    International Nuclear Information System (INIS)

    Duval, M.

    2005-01-01

    This analysis updates and develops the analysis of energy geo-policy proposed by the French Review of geo-policy. In this framework, the current policies of the different state and geographical actors, as suppliers and consumers of petroleum energy, are examined. The author then analyzes the political problems resulting from these transfers of petroleum energy by land and sea, and the problems arising specifically from nuclear energy. The last part presents the author's own opinions. (A.L.B.)

  20. Incident and Trafficking Database: New Systems for Reporting and Accessing State Information

    International Nuclear Information System (INIS)

    Dimitrovski, D.; Kittley, S.

    2015-01-01

    The IAEA's Incident and Trafficking Database (ITDB) is the Agency's authoritative source for information on incidents in which nuclear and other radioactive material is out of national regulatory control. It was established in 1995 and, as of June 2014, 126 States participate in the ITDB programme. Currently, the database contains over 2500 confirmed incidents, of which 21% involve nuclear material, 62% radioactive sources and 17% radioactively contaminated material. In recent years, the system for States to report incidents to the ITDB has been evolving — moving from fax-based to secure email and most recently to secure on-line reporting. A Beta version of the on-line system was rolled out this June, offering a simple, yet secure, communication channel for Member States to provide information. In addition, the system serves as a central hub for information related to official communication of the IAEA with Member States, so that communication traditionally shared by e-mail does not get lost when ITDB counterparts change. The new reporting system also incorporates optional features that allow multiple Member State users to collaboratively contribute toward an INF. States are also being given secure on-line access to a streamlined version of the ITDB. This improves States' capabilities to retrieve and analyze information for their own purposes. In addition, on-line access to ITDB statistical information on incidents is available to States through an ITDB Dashboard. The dashboard contains aggregate information on the number and types of incidents and the material involved, as well as some other statistics related to the ITDB that are typically provided in the ITDB Quarterly reports. (author)

  1. Atomic Spectra Database (ASD)

    Science.gov (United States)

    SRD 78 NIST Atomic Spectra Database (ASD) (Web, free access)   This database provides access and search capability for NIST critically evaluated data on atomic energy levels, wavelengths, and transition probabilities that are reasonably up-to-date. The NIST Atomic Spectroscopy Data Center has carried out these critical compilations.

  2. Evolution of grid-wide access to database resident information in ATLAS using Frontier

    CERN Document Server

    Barberis, D; The ATLAS collaboration; de Stefano, J; Dewhurst, A L; Dykstra, D; Front, D

    2012-01-01

    The ATLAS experiment deployed Frontier technology world-wide during the initial year of LHC collision data taking to enable user analysis jobs running on the World-wide LHC Computing Grid to access database resident data. Since that time, the deployment model has evolved to optimize resources, improve performance, and streamline maintenance of Frontier and related infrastructure. In this presentation we focus on specific changes in the deployment and improvements undertaken, such as the optimization of cache and launchpad location, the use of RPMs for more uniform deployment of underlying Frontier related components, improvements in monitoring, optimization of fail-over, and an increasing use of a centrally managed database containing site specific information (for configuration of services and monitoring). In addition, analysis of Frontier logs has allowed us a deeper understanding of problematic queries and of use cases. Use of the system has grown beyond just user analysis and subsyste...

  3. Italian Present-day Stress Indicators: IPSI Database

    Science.gov (United States)

    Mariucci, M. T.; Montone, P.

    2017-12-01

    In Italy, since the 1990s, research concerning the contemporary stress field has been developing at Istituto Nazionale di Geofisica e Vulcanologia (INGV) through local and regional scale studies. Throughout the years many data have been analysed and collected: now they are organized and available for easy end-use online. The IPSI (Italian Present-day Stress Indicators) database is the first geo-referenced repository of information on the crustal present-day stress field maintained at INGV, implemented as a web application database and website developed by Gabriele Tarabusi. Data consist of horizontal stress orientations analysed and compiled in a standardized format and quality-ranked for reliability and comparability on a global scale with other databases. Our first database release includes 855 data records updated to December 2015. Here we present an updated version that will be released in 2018, after new earthquake data entry up to December 2017. The IPSI web site (http://ipsi.rm.ingv.it/) provides access to the data on a standard map viewer and lets users easily choose which data (category and/or quality) to plot. The main information of each element (type, quality, orientation) can be viewed simply by hovering over the related symbol; all the information appears by clicking the element. At the same time, basic information on the different data types, tectonic regime assignment and the quality ranking method is available in pop-up windows. Data records can be downloaded in several common formats; moreover, it is possible to download a file directly usable with SHINE, a web-based application to interpolate stress orientations (http://shine.rm.ingv.it). IPSI is mainly conceived for those interested in studying the Italian peninsula and its surroundings, although the Italian data are part of the World Stress Map (http://www.world-stress-map.org/), as evidenced by the many links that redirect to this database for more details on standard practices in this field.

  4. SciELO, Scientific Electronic Library Online, a Database of Open Access Journals

    Directory of Open Access Journals (Sweden)

    Rogerio Meneghini

    2013-09-01

    This essay discusses SciELO, a scientific journal database operating in 14 countries. It covers over 1000 journals, providing open access to full text and sets of scientometric data. In Brazil it is responsible for a collection of nearly 300 journals, selected over 15 years as the best Brazilian periodicals in the natural and social sciences. Nonetheless, they are still national journals in the sense that over 80% of the articles are published by Brazilian scientists. Important initiatives focused on professionalization and internationalization are considered to bring these journals to a higher level of quality and visibility. DOI: 10.18870/hlrc.v3i3.153

  5. Cementation of nuclear graphite using geo-polymers

    International Nuclear Information System (INIS)

    Girke, N.A.; Steinmetz, H.J.; Bukaemsky, A.; Bosbach, D.; Hermann, E.; Griebel, I.

    2012-01-01

    Geo-polymers are solid aluminosilicate materials usually formed by alkali hydroxide or alkali silicate activation of solid precursors such as coal fly ash, calcined clay and/or metallurgical slag. Today the primary application of geo-polymer technology is the development of alternatives to Portland-based cements. Variations in the ratio of aluminium to silicon and of alkali to silicon, or the addition of structural support, produce geo-polymers with different physical and mechanical properties. These materials have an amorphous three-dimensional structure that gives geo-polymers properties such as fire and acid resistance and a low leach rate, which make them an ideal substitute for ordinary Portland cement (OPC) in a wide range of applications, especially the conditioning and storage of radioactive waste. Investigations have therefore been initiated into how, and in what proportion, graphite, as a hydrophobic material, can be mixed with cement or concrete to form stable waste products, and which concretes best fulfil the specifications. As a result, geo-polymers have been identified as a promising matrix for graphite-containing nuclear wastes. With geo-polymers, both favorable properties in the cementation process and high long-term structural stability of the products can be achieved. (authors)

  6. Quality, language, subdiscipline and promotion were associated with article accesses on Physiotherapy Evidence Database (PEDro).

    Science.gov (United States)

    Yamato, Tiê P; Arora, Mohit; Stevens, Matthew L; Elkins, Mark R; Moseley, Anne M

    2018-03-01

    To quantify the relationship between the number of times articles are accessed on the Physiotherapy Evidence Database (PEDro) and the article characteristics. A secondary aim was to examine the relationship between accesses and the number of citations of articles. The study was conducted to derive prediction models for the number of accesses of articles indexed on PEDro from factors that may influence an article's accesses. All articles available on PEDro from August 2014 to January 2015 were included. We extracted variables relating to the algorithm used to present PEDro search results (research design, year of publication, PEDro score, source of systematic review (Cochrane or non-Cochrane)) plus language, subdiscipline of physiotherapy, and whether articles were promoted to PEDro users. Three predictive models were examined using multiple regression analysis. Citation counts and journal impact factors were downloaded. There were 29,313 articles indexed in this period. We identified seven factors that predicted the number of accesses. More accesses were noted for factors related to the algorithm used to present PEDro search results (synthesis research (i.e., guidelines and reviews), recent articles, Cochrane reviews, and higher PEDro score) plus publication in English and being promoted to PEDro users. The musculoskeletal, neurology, orthopaedics, sports, and paediatrics subdisciplines were associated with more accesses. We also found that there was no association between number of accesses and citations. The number of times an article is accessed on PEDro is partly predicted by how condensed and how high in quality the evidence it contains is. Copyright © 2017 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
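    The multiple-regression approach described above can be sketched as follows. This is a toy fit on synthetic data, assuming a handful of predictors like those listed in the abstract; the variable names, coefficients, and sample size are invented for illustration and are not the PEDro dataset or its published model.

```python
import numpy as np

# Hedged sketch: multiple regression of (log) access counts on article
# features, using synthetic data -- not the PEDro data or its coefficients.
rng = np.random.default_rng(0)
n = 500
is_review = rng.integers(0, 2, n)     # synthesis research (guideline/review)
is_english = rng.integers(0, 2, n)    # published in English
pedro_score = rng.integers(0, 11, n)  # 0-10 methodological quality score
year_c = rng.integers(-20, 1, n)      # publication year, centred on the snapshot

# Synthetic ground truth: more accesses for reviews, English, quality, recency
log_acc = (3.0 + 0.8 * is_review + 0.5 * is_english
           + 0.1 * pedro_score + 0.05 * year_c + rng.normal(0, 0.3, n))

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), is_review, is_english, pedro_score, year_c])
beta, *_ = np.linalg.lstsq(X, log_acc, rcond=None)
```

    With enough articles, the recovered coefficients approximate the generating values, which is the sense in which such a model "predicts" access counts from article characteristics.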

  7. Real-Time Integration of Geo-data in ORM

    NARCIS (Netherlands)

    Balsters, Herman; Klaver, Chris; Huitema, George B.; Meersman, R; Dillon, T; Herrero, P

    2010-01-01

    Geographic information (geo-data; i.e., data with a spatial component.) is being used for civil, political, and commercial applications. Modeling geo-data can be involved due to its often very complex structure, hence placing high demands on the modeling language employed. Many geo-applications

  8. Assessment of the detectability of geo-hazards using Google Earth applied to the Three Parallel Rivers Area, Yunnan province of China

    Science.gov (United States)

    Voermans, Michiel; Mao, Zhun; Baartman, Jantiene EM; Stokes, Alexia

    2017-04-01

    Anthropogenic activities such as hydropower, mining and road construction in mountainous areas can induce and intensify mass-wasting geo-hazards (e.g. landslides, gullies, rockslides). This undermines local safety and socio-economic development, and endangers biodiversity at larger scale. Until today, data and knowledge to construct geo-hazard databases for further assessments have been lacking. This applies in particular to countries with recently emerged rapid economic growth, where there is no previous hazard documentation and where the means to gain data from e.g. intensive fieldwork or VHR satellite imagery and DEM processing are lacking. Google Earth (GE, https://www.google.com/earth/) is a freely available and relatively simple virtual globe, map and geographical information program, which is potentially useful for detecting geo-hazards. This research aimed at (i) testing the capability of Google Earth to detect locations of geo-hazards and (ii) identifying factors affecting the diagnostic quality of the detection, including effects of geo-hazard dimensions, environs setting, and the professional background and effort of GE users. This was tested on nine geo-hazard sites following road segments in the Three Parallel Rivers Area in the Yunnan province of China, where geo-hazards occur frequently. Along each road site, the position and size of each geo-hazard was measured in situ. Next, independent diagnosers with varying professional experience (students, researchers, engineers, etc.) were invited to detect geo-hazard occurrence along each of the eight sites via GE. Finally, the inventory and diagnostic data were compared to validate the objectives. Rates of detected geo-hazards from the 30 diagnosers ranged from 10% to 48%. No strong correlations were found between the type and size of the geo-hazards and their detection rates. The years of expertise of the diagnosers also proved not to make a difference, contrary to what might be expected. Meanwhile the amount of time

  9. Respiratory cancer database: An open access database of respiratory cancer gene and miRNA.

    Science.gov (United States)

    Choubey, Jyotsna; Choudhari, Jyoti Kant; Patel, Ashish; Verma, Mukesh Kumar

    2017-01-01

    Respiratory cancer database (RespCanDB) is a genomic and proteomic database of cancers of the respiratory organs. It also includes information on medicinal plants used for the treatment of various respiratory cancers, with the structures of their active constituents, as well as pharmacological and chemical information on drugs associated with various respiratory cancers. Data in RespCanDB have been manually collected from published research articles and from other databases. Data have been integrated using MySQL, a relational database management system. MySQL manages all data in the back-end and provides commands to retrieve and store data in the database. The web interface of the database has been built in ASP. RespCanDB is expected to contribute to the scientific community's understanding of respiratory cancer biology as well as to the development of new ways of diagnosing and treating respiratory cancer. Currently, the database contains oncogenomic information on lung cancer, laryngeal cancer, and nasopharyngeal cancer. Data for other cancers, such as oral and tracheal cancers, will be added in the near future. The URL of RespCanDB is http://ridb.subdic-bioinformatics-nitrr.in/.

  10. BioAtlas: Interactive web service for microbial distribution analysis

    DEFF Research Database (Denmark)

    Lund, Jesper; List, Markus; Baumbach, Jan

    Massive amounts of 16S rRNA sequencing data have been stored in publicly accessible databases, such as GOLD, SILVA, GreenGenes (GG), and the Ribosomal Database Project (RDP). Many of these sequences are tagged with geo-locations. Nevertheless, researchers currently lack a user-friendly tool...... to analyze microbial distribution in a location-specific context. BioAtlas is an interactive web application that closes this gap between sequence databases, taxonomy profiling and geo/body-location information. It enables users to browse taxonomically annotated sequences across (i) the world map, (ii) human...

  11. Geo-communication and web-based infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2005-01-01

    The role of geo-information and the distribution of geo-information have changed dramatically since the introduction of web-services on the Internet. In the framework of web-services maps should be seen as an index to further geo-information. Maps are no longer an aim in themselves. In this context...... web-services perform the function as index-portals on the basis of geoinformation. The introduction of web-services as index-portals based on geoinformation has changed the conditions for both content and form of geocommunication. A high number of players and interactions (as well as a very high...... number of all kinds of information and combinations of these) characterize web-services, where maps are only a part of the whole. These new conditions demand new ways of modelling the processes leading to geo-communication. One new aspect is the fact that the service providers have become a part...

  12. XAFS study of GeO2 glass under pressure

    CERN Document Server

    Ohtaka, O; Fukui, H; Murai, K; Okube, M; Takebe, H; Katayama, Y; Utsumi, W

    2002-01-01

    Using a large-volume high-pressure apparatus, Li2O-4GeO2 glass and pure GeO2 gel have been compressed to 14 GPa at room temperature and their local structural changes have been investigated by an in situ XAFS (x-ray absorption fine-structure) method. On compression of Li2O-4GeO2 glass, the Ge-O distance gradually shortens below 7 GPa, showing conventional compression of the GeO4 tetrahedron. An abrupt increase in the Ge-O distance occurs between 8 and 10 GPa, which corresponds to the coordination number (CN) changing from 4 to 6. The CN change is complete at 10 GPa. On decompression, the reverse transition occurs gradually below 10 GPa. In contrast to the case of Li2O-4GeO2 glass, the Ge-O distance in GeO2 gel gradually increases over the pressure range from 2 to 12 GPa, indicating that a continuous change in CN occurs. The Ge-O distance at 12 GPa is shorter than that of Li2O-4GeO2, indicating that the change in CN is not complete even at this pressure. O...

  13. Access 2010 for dummies

    CERN Document Server

    Ulrich Fuller, Laurie

    2010-01-01

    A friendly, step-by-step guide to the Microsoft Office database application Access may be the least understood and most challenging application in the Microsoft Office suite. This guide is designed to help anyone who lacks experience in creating and managing a database learn to use Access 2010 quickly and easily. In the classic For Dummies tradition, the book provides an education in Access, the interface, and the architecture of a database. It explains the process of building a database, linking information, sharing data, generating reports, and much more.As the Micr

  14. Children's Culture Database (CCD)

    DEFF Research Database (Denmark)

    Wanting, Birgit

    a Dialogue inspired database with documentation, network (individual and institutional profiles) and current news, paper presented at the research seminar: Electronic access to fiction, Copenhagen, November 11-13, 1996

  15. Implementation of Multiple Access Techniques Applicable for Maritime Satellite Communications

    Directory of Open Access Journals (Sweden)

    Stojce Dimov Ilcev

    2013-12-01

    This paper introduces the fundamentals, characteristics, advantages and disadvantages of Multiple Access (MA) techniques employed for transmission in Maritime Mobile Satellite Communications (MMSC) between ships and a Coast Earth Station (CES) via Geostationary Earth Orbit (GEO) or non-GEO satellite constellations. In fixed satellite communication, as a rule, and especially in MMSC, many users are active at the same time. The problem of simultaneous communications between many single or multipoint mobile satellite users can be solved by using MA techniques such as Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Space Division Multiple Access (SDMA) and Random (Packet) Division Multiple Access (RDMA). Since system resources such as transmitting power and bandwidth are limited, it is advisable to use the channels at full load and to provide different forms of MA to the channel. This creates the problem of summing and separating signals on the transmission and reception sides, respectively. Solving this problem consists in the development of orthogonal transmission channels, so that signals from the various users can be separated unambiguously on the reception side.
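    The orthogonality idea behind TDMA, the simplest of the schemes listed, can be illustrated with a toy slot scheduler: each station transmits only in its own recurring slot of a frame, so signals never overlap in time. The station names and frame length below are illustrative assumptions, not parameters from the paper.

```python
# Hedged toy sketch of TDMA channelization: each ship station is assigned a
# dedicated, non-overlapping time slot per frame, keeping signals orthogonal
# in time. Names and frame length are illustrative.
def tdma_schedule(stations, slots_per_frame):
    """Map each station to its recurring slot index within a frame."""
    if len(stations) > slots_per_frame:
        raise ValueError("more stations than slots: use more frames or another MA scheme")
    return {s: i for i, s in enumerate(stations)}

sched = tdma_schedule(["ship-A", "ship-B", "ship-C"], slots_per_frame=8)

# Orthogonality check: no two stations share a slot
assert len(set(sched.values())) == len(sched)
```

    FDMA and CDMA achieve the same separation in frequency and in code space respectively; in each case the design goal is the orthogonality the abstract describes.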

  16. Characterizing Journal Access at a Canadian University Using the Journal Citation Reports Database

    Directory of Open Access Journals (Sweden)

    Alan Gale

    2011-07-01

Full Text Available This article outlines a simple approach to characterizing the level of access to the scholarly journal literature in the physical sciences and engineering offered by a research library, particularly within the Canadian university system. The method utilizes the "Journal Citation Reports" (JCR) database to produce lists of journals, ranked by total citations, in the subject areas of interest. Details of the approach are illustrated using data from the University of Guelph. The examples cover chemistry, physics, mathematics and statistics, as well as engineering. In assessing the level of access, both the Library's current journal subscriptions and backfiles are considered. To gain greater perspective, data from both 2003 and 2008 are analyzed. In addition, the number of document delivery requests received from University of Guelph Library users in recent years is also reviewed. The approach taken in characterizing access to the journal literature is found to be simple and easy to implement, but time-consuming. The University of Guelph Library is shown to provide excellent access to the current journal literature in the subject areas examined. Access to the historical literature in those areas is also strong. In making these assessments, a broad and comprehensive array of journals is considered in each case. Document delivery traffic (i.e., Guelph requests) is found to have decreased markedly in recent years. This is attributed, at least in part, to improving access to the scholarly literature. For the University of Guelph, collection assessment is an ongoing process that must balance the needs of a diverse group of users. The results of analyses of the kind discussed in this article can be of practical significance and value to that process.
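The assessment method this abstract describes reduces to a small computation: rank journals by total JCR citations, then measure what fraction of the top-N list the library holds. The journal names and citation counts below are invented, not Guelph's data.

```python
def coverage_of_top_n(citation_counts, holdings, n):
    """Fraction of the n most-cited journals present in `holdings`."""
    ranked = sorted(citation_counts, key=citation_counts.get, reverse=True)
    top = ranked[:n]
    return sum(1 for j in top if j in holdings) / len(top)

# hypothetical JCR totals and library holdings
jcr = {"J. Chem. A": 90_000, "Phys. Rev. X": 60_000,
       "Eng. Lett.": 12_000, "Stat. Q.": 8_000}
held = {"J. Chem. A", "Phys. Rev. X", "Stat. Q."}
print(coverage_of_top_n(jcr, held, 3))  # holds 2 of the top 3
```

The time-consuming part the authors mention is not this arithmetic but matching JCR titles against catalogue records by hand.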

  17. GEOS Code Development Road Map - May, 2013

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Scott [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Settgast, Randolph [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fu, Pengcheng [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Antoun, Tarabay [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ryerson, F. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-05-03

    GEOS is a massively parallel computational framework designed to enable HPC-based simulations of subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS will enable coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. The overall architecture of the framework includes consistent data structures and will allow incorporation of additional physical and materials models as demanded by future applications. Along with predicting the initiation, propagation and reactivation of fractures, GEOS will also generate a seismic source term that can be linked with seismic wave propagation codes to generate synthetic microseismicity at surface and downhole arrays. Similarly, the output from GEOS can be linked with existing fluid/thermal transport codes. GEOS can also be linked with existing, non-intrusive uncertainty quantification schemes to constrain uncertainty in its predictions and sensitivity to the various parameters describing the reservoir and stimulation operations. We anticipate that an implicit-explicit 3D version of GEOS, including a preliminary seismic source model, will be available for parametric testing and validation against experimental and field data by Oct. 1, 2013.
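The solver-coupling pattern this road map describes (different physics solvers acting on consistent shared data structures) can be sketched minimally as explicit operator splitting: each "solver" advances a shared state in turn every time step. This is an illustration of the pattern only, with toy physics, not GEOS code or its API.

```python
class PressureSolver:
    def advance(self, state, dt):
        # toy relaxation of fluid pressure toward an aperture-dependent value
        state["p"] += dt * (state["aperture"] * 10.0 - state["p"])

class FractureSolver:
    def advance(self, state, dt):
        # toy fracture-aperture growth driven by pressure
        state["aperture"] += dt * 0.01 * state["p"]

def run(steps=100, dt=0.1):
    # the dict is the shared, consistent data structure both solvers see
    state = {"p": 0.0, "aperture": 1.0}
    solvers = [PressureSolver(), FractureSolver()]
    for _ in range(steps):
        for s in solvers:
            s.advance(state, dt)
    return state

print(run())  # pressure and aperture co-evolve through the shared state
```

Frameworks like the one described also support implicit coupling, where the solvers iterate to joint convergence within a step instead of advancing once each.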

  18. The PMDB Protein Model Database

    Science.gov (United States)

    Castrignanò, Tiziana; De Meo, Paolo D'Onorio; Cozzetto, Domenico; Talamo, Ivano Giuseppe; Tramontano, Anna

    2006-01-01

    The Protein Model Database (PMDB) is a public resource aimed at storing manually built 3D models of proteins. The database is designed to provide access to models published in the scientific literature, together with validating experimental data. It is a relational database and it currently contains >74 000 models for ∼240 proteins. The system is accessible at and allows predictors to submit models along with related supporting evidence and users to download them through a simple and intuitive interface. Users can navigate in the database and retrieve models referring to the same target protein or to different regions of the same protein. Each model is assigned a unique identifier that allows interested users to directly access the data. PMID:16381873

  19. Kennisagenda Geo-informatie: GISsen met beleid

    NARCIS (Netherlands)

    Dessing, N.; Lips, F.; Hoogenboom, J.; Vullings, L.A.E.

    2009-01-01

LNV wants to make more use of geo-information in the development and implementation of policy, and to supply policy documents more generously with map material. This means that geo-information must be used more often to make local bottlenecks, opportunities and the consequences of alternative solutions transparent. To achieve this

  20. Records Management Database

    Data.gov (United States)

US Agency for International Development — The Records Management Database is a tool created in Microsoft Access specifically for USAID use. It contains metadata in order to access and retrieve the information...

  1. JASPAR, the open access database of transcription factor-binding profiles: new content and tools in the 2008 update

    DEFF Research Database (Denmark)

    Bryne, J.C.; Valen, E.; Tang, M.H.E.

    2008-01-01

JASPAR is a popular open-access database for matrix models describing DNA-binding preferences for transcription factors and other DNA patterns. With its third major release, JASPAR has been expanded and equipped with additional functions aimed at both casual and power users. The heart of the JASPAR database, the JASPAR CORE sub-database, has increased by 12 in size, and three new specialized sub-databases have been added. New functions include clustering of matrix models by similarity, generation of random matrices by sampling from selected sets of existing models and a language-independent Web Service...
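The "matrix models" JASPAR stores are position frequency matrices scored along DNA sequences. A hedged sketch of that use: the 4x3 matrix below is invented for illustration, not a real JASPAR profile.

```python
PFM = {  # toy position frequency matrix for a 3-bp motif (counts per position)
    "A": [8, 0, 1],
    "C": [1, 9, 0],
    "G": [0, 1, 8],
    "T": [1, 0, 1],
}

def score(site, pfm):
    """Sum of per-position counts for one candidate binding site."""
    return sum(pfm[base][i] for i, base in enumerate(site))

def best_site(sequence, pfm, width=3):
    """Slide the motif over the sequence and keep the best-scoring site."""
    sites = [sequence[i:i + width] for i in range(len(sequence) - width + 1)]
    return max(sites, key=lambda s: score(s, pfm))

print(best_site("TTACGGA", PFM))  # "ACG" matches the consensus best
```

Real scanners convert counts to log-odds weights against a background model before scoring, but the sliding-window structure is the same.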

  2. Requirements elicitation for geo-information solutions

    NARCIS (Netherlands)

    Robbi Sluter, Claudia; van Elzakker, Corné P.J.M.; Ivanova, Ivana

    2017-01-01

    Geo-information solutions can achieve a higher level of quality if they are developed in accordance with a user-centred design that requires definition of the user requirements in the first step of solution construction. We treat a geo-information solution as a system designed to support human-based

  3. Security Research on Engineering Database System

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

The engine engineering database system is a CAD-oriented database management system with the capability to manage distributed data. The paper discusses the security of the engine engineering database management system (EDBMS). Through study and analysis of database security, a series of security rules is drawn up that reaches the B1-level security standard, which includes discretionary access control (DAC), mandatory access control (MAC) and audit. The EDBMS implements functions of DAC, ...
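The three controls this abstract names can be sketched together: DAC as an owner-managed access-control list, MAC as a Bell-LaPadula-style "no read up" check on sensitivity labels, and an audit trail recording every attempt. The users, objects and labels below are hypothetical, not the EDBMS's actual rule set.

```python
LEVELS = {"public": 0, "internal": 1, "secret": 2}
AUDIT = []  # every access attempt is recorded here, allowed or not

def dac_allows(acl, user, obj, action):
    """Discretionary check: is the action granted to this user on this object?"""
    return action in acl.get(obj, {}).get(user, set())

def mac_allows(clearance, labels, user, obj):
    """Mandatory check (no read up): clearance must dominate the object label."""
    return LEVELS[clearance[user]] >= LEVELS[labels[obj]]

def read(acl, clearance, labels, user, obj):
    ok = dac_allows(acl, user, obj, "read") and mac_allows(clearance, labels, user, obj)
    AUDIT.append((user, obj, "read", ok))
    return ok

acl = {"engine_spec": {"alice": {"read", "write"}, "bob": {"read"}}}
clearance = {"alice": "secret", "bob": "public"}
labels = {"engine_spec": "internal"}
print(read(acl, clearance, labels, "alice", "engine_spec"))  # True
print(read(acl, clearance, labels, "bob", "engine_spec"))    # False: label dominates clearance
```

The B1 criterion requires exactly this layering: discretionary permissions can only further restrict what the mandatory labels already allow.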

  4. Geostatistical analyses of communication routes in a geo-strategic and regional development perspective

    Directory of Open Access Journals (Sweden)

    Alexandru-Ionuţ Petrişor

    2017-12-01

Full Text Available Accessibility is a key concept in regional development, with numerous ties to territorial cohesion and polycentricity. Moreover, it also has a geo-strategic function, anchored in the international relationships between countries and continents. The article reviews several case studies, placing analyses of Romanian accessibility in a broader context. The results show that regional development, overall EU connectivity and possible transit fluxes are hindered by the configuration or lack of communication routes. Increasing the accessibility of regions must be a priority of governments, regardless of political opinion. The transition of the economy to the post-carbon era or to other models (green economy, knowledge-based economy, etc.) is expected to result in the emergence of new poles and axes of development and to ensure transport sustainability.

  5. An approach for access differentiation design in medical distributed applications built on databases.

    Science.gov (United States)

    Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K

    1999-01-01

A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application poses essential new problems for software. In particular, protection tools that are sufficient separately become deficient during integration, due to specific additional links and relationships not considered formerly. E.g., it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools if the object is stored as a record in DB tables. The solution of the problem should be sought only within the more general application framework. Appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications which use databases. An appropriate formal model as well as tools for its mapping to a DBMS are suggested. Remote users connected via global networks are considered too.

  6. GeoNetGIS: a Geodetic Network Geographical Information System to manage GPS networks in seismic and volcanic areas

    Science.gov (United States)

    Cristofoletti, P.; Esposito, A.; Anzidei, M.

    2003-04-01

This paper presents the methodologies and issues involved in the use of GIS techniques to manage geodetic information derived from networks in seismic and volcanic areas. The organization and manipulation of different geodetic, geological and seismic databases present a new challenge in the interpretation of information that has several dimensions, including spatial and temporal variations; the flexibility and broad range of tools available in GeoNetGIS also make it an attractive platform for earthquake risk assessment. During the last decade, the use of geodetic networks based on the Global Positioning System devoted to geophysical applications, especially crustal deformation monitoring in seismic and volcanic areas, increased dramatically. The large amount of data provided by these networks, combined with different and independent observations, such as epicentre distributions of recent and historical earthquakes, geological and structural data, and photo-interpretation of aerial and satellite images, can aid the detection and parameterization of seismogenic sources. In particular, we applied our geodetically oriented GIS to a new GPS network recently set up and surveyed in the Central Apennine region: the CA-GeoNet. GeoNetGIS is designed to analyze GPS sources in three and four dimensions and to improve crustal deformation analysis and its interpretation in relation to tectonic structures and seismicity. It manages several databases (DBMS) consisting of different classes, such as Geodesy, Topography, Seismicity, Geology, Geography and Raster Images, administered according to Thematic Layers. GeoNetGIS represents a powerful research tool that joins the analysis of all data layers, integrating the different databases to aid identification of the activity of known faults or structures and to suggest new evidence of active tectonics. A new approach to data integration given by GeoNetGIS capabilities allows us to create and deliver a wide range of maps, digital

  7. GeoLab: A Geological Workstation for Future Missions

    Science.gov (United States)

    Evans, Cynthia; Calaway, Michael; Bell, Mary Sue; Li, Zheng; Tong, Shuo; Zhong, Ye; Dahiwala, Ravi

    2014-01-01

The GeoLab glovebox was, until November 2012, fully integrated into NASA's Deep Space Habitat (DSH) Analog Testbed. The conceptual design for GeoLab came from several sources, including current research instruments (Microgravity Science Glovebox) used on the International Space Station, existing Astromaterials Curation Laboratory hardware and clean room procedures, and mission scenarios developed for earlier programs. GeoLab allowed NASA scientists to test science operations related to contained sample examination during simulated exploration missions. The team demonstrated science operations that enhance the early scientific returns from future missions and ensure that the best samples are selected for Earth return. The facility was also designed to foster the development of instrument technology. Since 2009, when GeoLab design and construction began, the GeoLab team [a group of scientists from the Astromaterials Acquisition and Curation Office within the Astromaterials Research and Exploration Science (ARES) Directorate at JSC] has progressively developed and reconfigured the GeoLab hardware and software interfaces and developed test objectives, which were to 1) determine requirements and strategies for sample handling and prioritization for geological operations on other planetary surfaces, 2) assess the scientific contribution of selective in-situ sample

  8. The GEO Geohazard Supersites and Natural Laboratories - GSNL 2.0: improving societal benefits of Geohazard science

    Science.gov (United States)

    Salvi, Stefano

    2016-04-01

The Geohazard Supersites and Natural Laboratories initiative began with the "Frascati declaration" at the conclusion of the 3rd International Geohazards workshop of GEO, held in November 2007 in Frascati, Italy. The recommendation of the workshop was "to stimulate an international and intergovernmental effort to monitor and study selected reference sites by establishing open access to relevant datasets according to GEO principles, to foster the collaboration between all various partners and end-users". This recommendation was later formalized in the GEO Work Plan as Component 2 of the GEO task DI-01, part of the GEO Disasters Societal Benefit Area. Today GSNL has grown into a voluntary collaboration among monitoring agencies, the scientific community and the CEOS space agencies, working to improve the scientific understanding of earthquake and volcanic phenomena and enable better risk assessment and emergency management. According to its principles, actions in GSNL are focused on specific areas of the world, the Supersites, for which large amounts of in situ and satellite data are made openly available to all scientists. These areas are selected based on the importance of the scientific problems as well as on the population at risk, and should be evenly distributed among developed and less developed countries. Seven Supersites have been established to date, six of which are on volcanic areas (Hawaii, US; Icelandic volcanoes; Mt. Etna, IT; Campi Flegrei, IT; Ecuadorian volcanoes; Taupo, NZ) and one on a seismic area (Western North Anatolian fault, TR). One more proposal is being evaluated: the Corinth Gulf in Greece. The Supersites have succeeded in promoting new scientific developments by providing a framework for easier access to EO and in situ data. Coordination among researchers at the global scale has been achieved only where the Supersite activities were sustained through well-established projects. For some Supersites a close coordination between

  9. SemantGeo: Powering Ecological and Environment Data Discovery and Search with Standards-Based Geospatial Reasoning

    Science.gov (United States)

    Seyed, P.; Ashby, B.; Khan, I.; Patton, E. W.; McGuinness, D. L.

    2013-12-01

Recent efforts to create and leverage standards for geospatial data specification and inference include the GeoSPARQL standard, geospatial OWL ontologies (e.g., GAZ, Geonames), and RDF triple stores that support GeoSPARQL (e.g., AllegroGraph, Parliament) using RDF instance data for geospatial features of interest. However, there remains a gap in how best to fuse software engineering best practices and GeoSPARQL within semantic web applications to enable flexible search driven by geospatial reasoning. In this abstract we introduce the SemantGeo module for the SemantEco framework, which helps fill this gap by enabling scientists to find data using geospatial semantics and reasoning. SemantGeo provides multiple types of geospatial reasoning for SemantEco modules. The server-side implementation uses the Parliament SPARQL endpoint accessed via a Tomcat servlet. SemantGeo uses the Google Maps API for user-specified polygon construction and JsTree for providing containment and categorical hierarchies for search. SemantGeo uses GeoSPARQL for spatial reasoning alone and in concert with RDFS/OWL reasoning capabilities to determine, e.g., which geofeatures are within, partially overlap with, or lie within a certain distance from a given polygon. We also leverage qualitative relationships defined by the Gazetteer ontology that are composites of spatial relationships as well as administrative designations or geophysical phenomena. We provide multiple mechanisms for exploring data, such as polygon (map-based) and named-feature (hierarchy-based) selection, that enable flexible search constraints using boolean combinations of selections. JsTree-based hierarchical search facets present named features and include a 'part of' hierarchy (e.g., measurement-site-01, Lake George, Adirondack Region, NY State) and type hierarchies (e.g., nodes in the hierarchy for WaterBody, Park, MeasurementSite), depending on the 'axis of choice' option selected. Using GeoSPARQL and aforementioned ontology
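The kind of query a module like this would send to a GeoSPARQL-capable endpoint can be sketched as follows: find features whose WKT geometry lies within a user-drawn polygon. The prefixes follow the GeoSPARQL standard; the `ex:` namespace, the feature class and the polygon are placeholders, not SemantGeo's actual vocabulary.

```python
def within_query(polygon_wkt, feature_class="ex:MeasurementSite"):
    """Build a GeoSPARQL query selecting features inside a polygon."""
    return f"""
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
PREFIX ex:   <http://example.org/ns#>
SELECT ?feature WHERE {{
  ?feature a {feature_class} ;
           geo:hasGeometry/geo:asWKT ?wkt .
  FILTER(geof:sfWithin(?wkt,
         "{polygon_wkt}"^^geo:wktLiteral))
}}"""

q = within_query(
    "POLYGON((-74.2 43.5, -73.5 43.5, -73.5 43.9, -74.2 43.9, -74.2 43.5))")
print(q)  # paste into a GeoSPARQL-capable endpoint such as Parliament
```

Swapping `geof:sfWithin` for `geof:sfOverlaps` or a `geof:distance` filter yields the other two relations the abstract mentions.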

  10. A GeoWall with Physics and Astronomy Applications

    Science.gov (United States)

    Dukes, Phillip; Bruton, Dan

    2008-03-01

A GeoWall is a passive stereoscopic projection system that can be used by students, teachers, and researchers for visualization of the structure and dynamics of three-dimensional systems and data. The type of system described here adequately provides 3-D visualization in natural color for large or small groups of viewers. The name "GeoWall" derives from its initial development to visualize data in the geosciences.1 An early GeoWall system was developed by Paul Morin at the electronic visualization laboratory at the University of Minnesota and was applied in an introductory geology course in spring of 2001. Since that time, several stereoscopic media, which are applicable to introductory-level physics and astronomy classes, have been developed and released into the public domain. In addition to the GeoWall's application in the classroom, there is considerable value in its use as part of a general science outreach program. In this paper we briefly describe the theory of operation of stereoscopic projection and the basic necessary components of a GeoWall system. Then we briefly describe how we are using a GeoWall as an instructional tool for the classroom and informal astronomy education and in research. Finally, we list sources for several of the free software media in physics and astronomy available for use with a GeoWall system.

  11. Geospatial data sharing, online spatial analysis and processing of Indian Biodiversity data in Internet GIS domain - A case study for raster based online geo-processing

    Science.gov (United States)

    Karnatak, H.; Pandey, K.; Oberai, K.; Roy, A.; Joshi, D.; Singh, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.

    2014-11-01

National Biodiversity Characterization at Landscape Level, a project jointly sponsored by the Department of Biotechnology and the Department of Space, was implemented to identify and map the potential biodiversity-rich areas in India. This project has generated spatial information at three levels, viz. satellite-based primary information (vegetation type map, spatial locations of roads and villages, fire occurrence); geospatially derived or modelled information (disturbance index, fragmentation, biological richness); and geospatially referenced field sample plots. The study provides information on high-disturbance and high-biological-richness areas, suggesting future management strategies and formulating action plans. The study has generated for the first time a baseline database in India which will be a valuable input towards climate change study in the Indian subcontinent. The spatial data generated during the study is organized as a central data repository in a Geo-RDBMS environment using PostgreSQL and POSTGIS. The raster and vector data are published as OGC WMS and WFS standard services for the development of a web-based geo-information system using Service Oriented Architecture (SOA). The WMS- and WFS-based system allows geo-visualization, online query and map output generation based on user request and response. This is a typical mashup-architecture-based geo-information system which allows access to remote web services like ISRO Bhuvan, OpenStreetMap, Google Maps etc., with overlay on biodiversity data for effective study of bio-resources. The spatial queries and analysis with vector data are achieved through SQL queries on POSTGIS and WFS-T operations. But the most important challenge is to develop a system for online raster-based geo-spatial analysis and processing based on a user-defined Area of Interest (AOI) for large raster data sets. The map data of this study is approximately 20 GB in size for each of its five data layers. An attempt has been made to develop a system using
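The OGC request pattern described above can be sketched as a WFS GetFeature call restricted to a user-defined area of interest via a BBOX parameter. The service URL and layer name below are placeholders for illustration, not the project's actual endpoint.

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base, layer, bbox, srs="EPSG:4326"):
    """Build an OGC WFS 1.1.0 GetFeature request as a KVP GET URL."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": layer,
        "bbox": ",".join(str(c) for c in bbox) + "," + srs,
        "outputFormat": "application/json",
    }
    return base + "?" + urlencode(params)

url = wfs_getfeature_url("https://example.org/geoserver/wfs",
                         "biodiversity:vegetation_type",
                         (73.0, 18.0, 78.0, 22.0))  # lon/lat AOI
print(url)
```

A WMS GetMap request follows the same key-value pattern, returning a rendered image instead of features; the raster-processing challenge the abstract raises is precisely what neither standard covers by itself.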

  12. DESIGN AND IMPLEMENTATION OF AN OPEN ACCESS GEOPORTAL

    OpenAIRE

    SARI, Fatih

    2018-01-01

GeoPortal systems are considered one of the most important components of the interoperability concept for spatial data management. With the developing technology of the information age, the need to access spatial data has driven institutes and organizations to establish national, regional and local information systems. Sharing and accessing spatial datasets between institutes and organizations is becoming more important within the interoperability concept. In this study, Open...

  13. Geologic Field Database

    Directory of Open Access Journals (Sweden)

    Katarina Hribernik

    2002-12-01

Full Text Available The purpose of the paper is to present the field data relational database, which was compiled from data gathered during thirty years of fieldwork on the Basic Geologic Map of Slovenia in scale 1:100,000. The database was created using MS Access software. The MS Access environment ensures its stability and effective operation despite changing, searching, and updating the data. It also enables faster and easier user-friendly access to the field data. Last but not least, in the long term, with the data transferred into the GIS environment, it will provide the basis for a sound geologic information system that will satisfy a broad spectrum of geologists' needs.
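A minimal relational sketch of a field-data table like the one this abstract describes, using SQLite in place of MS Access. The column names and the sample record are invented for illustration; the point is the pattern of a keyed table with coordinates that can later move into a GIS.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE field_point (
        id INTEGER PRIMARY KEY,
        sheet TEXT NOT NULL,     -- map sheet of the 1:100,000 survey
        lithology TEXT,
        x REAL, y REAL           -- location, later usable in a GIS
    )""")
con.execute("INSERT INTO field_point (sheet, lithology, x, y) "
            "VALUES ('Tolmin', 'limestone', 403210.0, 113455.0)")
rows = con.execute(
    "SELECT sheet, lithology FROM field_point WHERE lithology = ?",
    ("limestone",)).fetchall()
print(rows)  # [('Tolmin', 'limestone')]
```

Because the coordinates live in ordinary columns, exporting to a GIS layer is a single query, which is exactly the long-term path the paper anticipates.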

  14. Paradigm shift from cartography to geo-communication

    DEFF Research Database (Denmark)

    Brodersen, Lars

    2007-01-01

This paper argues that the domain of GIS, cartography, geo-information etc. is facing a paradigm shift. The implication of a paradigm shift is a complete and necessary re-definition of e.g. the philosophical foundation of the system, as well as a major upgrade and readjustment of procedures... geo-information is actually not possible at all without having a usage (a project identity and a purpose) in mind. Objective and neutral geo-information does not exist. Therefore the overall philosophy of the geo-domain will be that it is a communication discipline.

  15. Access 2010 Programmer's Reference

    CERN Document Server

    Hennig, Teresa; Griffith, Geoffrey L

    2010-01-01

A comprehensive guide to programming for Access 2010 and 2007. Millions of people use the Access database applications, and hundreds of thousands of developers work with Access daily. Access 2010 brings better integration with SQL Server and enhanced XML support; this Wrox guide shows developers how to take advantage of these and other improvements. With in-depth coverage of VBA, macros, and other programming methods for building Access applications, this book also provides real-world code examples to demonstrate each topic: Access is the leading database that is used worldwide; while VBA rem...

  16. Kennisagenda Geo-informatie: GISsen met beleid

    OpenAIRE

    Dessing, N.; Lips, F.; Hoogenboom, J.; Vullings, L.A.E.

    2009-01-01

LNV wants to make more use of geo-information in the development and implementation of policy, and to supply policy documents more generously with map material. This means that geo-information must be used more often to make local bottlenecks, opportunities and the consequences of alternative solutions transparent. To achieve this, the availability of adequate data and of user-friendly and new GIS techniques must improve considerably.

  17. Geo-registration of Unprofessional and Weakly-related Image and Precision Evaluation

    Directory of Open Access Journals (Sweden)

    LIU Yingzhen

    2015-09-01

Full Text Available The 3D geo-spatial model built from unprofessional, weakly related images is a significant source of geo-spatial information. Such images cannot provide useful geo-spatial information until they are geo-registered with accurate geo-spatial orientation and location. In this paper, we present an automatic geo-registration method using coordinates acquired by a real-time GPS module. We calculate 2D and 3D spatial transformation parameters based on the spatial similarity between image locations in the geo-spatial coordinate system and in the 3D reconstruction coordinate system. Because of the poor precision of GPS information, and especially the instability of elevation measurement, we use the RANSAC algorithm to reject outliers. In the experiment, we compare the geo-registered image positions to their differential GPS coordinates. The errors of translation, rotation and scaling are evaluated quantitatively and the causes of poor results are analyzed. The experiment demonstrates that this geo-registration method can achieve a precise result given enough images.
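The RANSAC outlier-rejection step described above can be sketched in its 2D form: repeatedly fit a similarity transform (scale, rotation, translation) to a minimal sample of two point pairs, and keep the model with the most inliers. The data, tolerance and iteration count below are synthetic, not the paper's setup.

```python
import math
import random

def fit_similarity(p, q, P, Q):
    """Exact 2D similarity mapping p->P and q->Q (complex-ratio form)."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    DX, DY = Q[0] - P[0], Q[1] - P[1]
    d = dx * dx + dy * dy
    a, b = (DX * dx + DY * dy) / d, (DY * dx - DX * dy) / d
    return a, b, P[0] - (a * p[0] - b * p[1]), P[1] - (b * p[0] + a * p[1])

def apply_t(m, pt):
    a, b, tx, ty = m
    return a * pt[0] - b * pt[1] + tx, b * pt[0] + a * pt[1] + ty

def ransac(src, dst, iters=200, tol=1.0, seed=1):
    rng = random.Random(seed)
    best_m, best_in = None, []
    for _ in range(iters):
        i, j = rng.sample(range(len(src)), 2)
        if src[i] == src[j]:          # degenerate sample, skip
            continue
        m = fit_similarity(src[i], src[j], dst[i], dst[j])
        inl = [k for k in range(len(src))
               if math.dist(apply_t(m, src[k]), dst[k]) < tol]
        if len(inl) > len(best_in):
            best_m, best_in = m, inl
    return best_m, best_in

# synthetic check: 10 consistent model/GPS pairs plus 3 gross outliers
truth = (2.0, 0.5, 10.0, -4.0)                      # a, b, tx, ty
src = [(float(x), float(x * x % 7)) for x in range(10)]
dst = [apply_t(truth, p) for p in src]
src += [(1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
dst += [(99.0, 99.0), (-50.0, 0.0), (0.0, -50.0)]
m, inliers = ransac(src, dst)
print(len(inliers))  # the 10 consistent pairs survive; the outliers are rejected
```

The paper's 3D case works the same way with a larger minimal sample; the noisy GPS elevations are exactly the coordinates the inlier threshold guards against.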

  18. Specialist Bibliographic Databases

    OpenAIRE

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A.; Trukhachev, Vladimir I.; Kostyukova, Elena I.; Gerasimov, Alexey N.; Kitas, George D.

    2016-01-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and d...

  19. Characterisation of irradiation effect on geo-polymers

    International Nuclear Information System (INIS)

    Chupin, Frederic

    2015-01-01

This study aims to improve knowledge about the radiation effect on geo-polymer behavior in terms of dihydrogen release and general strength, in order to consider geo-polymers as an alternative to the usual nuclear waste cementitious coating matrices. Using various characterization techniques (nitrogen adsorption, low-temperature DSC, FTIR and 1H NMR spectroscopy) and by means of simulation irradiations (gamma, heavy ions), it has been shown that all the water present in the geo-polymer can be radiolyzed and that there is a confinement effect on water radiolysis under low-LET irradiation, probably due to efficient energy transfers from the solid matrix to the interstitial solution. Three dihydrogen production rates have been identified with the absorbed dose, depending on the concentration of dissolved dioxygen and the dihydrogen accumulation in the geo-polymer matrix. The good mechanical strength of the geo-polymer has been shown up to 9 MGy under gamma irradiation and is due to its high stability under irradiation. This could be explained by the fast recombination of the defects observed by EPR spectroscopy. However, phase crystallization was revealed during irradiation with heavy ions, which may induce some weakening of the geo-polymer network under alpha irradiation. The overall results helped to understand the phenomenology in a waste package under storage conditions. (author) [fr

  20. The Key Driving Forces for Geo-Economic Relationships between China and ASEAN Countries

    Directory of Open Access Journals (Sweden)

    Shufang Wang

    2017-12-01

Full Text Available With the rise of China and the implementation of the "21st Century Maritime Silk Road" strategy, research on geo-economics between China and ASEAN (Association of Southeast Asian Nations) countries has become increasingly important. Current studies mainly focus on influencing factors, with little consideration of how these factors act on geo-economic relationships. Therefore, this paper explores the key driving forces for geo-economic relationships between China and ASEAN countries using structural equation modeling based on Partial Least Squares. There are three main findings: (1) Economic factors have the greatest impact on geo-economic relationships, with a total path effect of 0.778. Geo-location, geopolitics and geo-culture act on geo-economic relationships directly and indirectly; their total path effects are 0.731, 0.645 and 0.513, respectively. (2) The indirect effects of geo-location, geopolitics and geo-culture on geo-economic relationships are far greater than their direct effects. Geo-culture, in particular, has a vital mediating effect on geo-economic relationships. (3) Economic drivers promote geo-economic relationships through market, industrial policy, technical, network and benefit-sharing mechanisms. Political drivers improve geo-economic relationships through cooperation, negotiation, coordination and institutional mechanisms. Cultural drivers enhance geo-economic relationships through a transmission mechanism. Location drivers facilitate geo-economic relationships through a selection mechanism. We provide new insights on geo-economic relationships through quantitative analysis and enrich the existing literature by revealing the key driving forces and mechanisms for geo-economic relationships.
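Total path effects like those reported above decompose in a standard way in path models: the total effect is the direct coefficient plus, for each indirect route, the product of the coefficients along that route. The coefficients below are invented for illustration, not the study's estimates.

```python
def total_effect(direct, indirect_paths):
    """Total effect = direct effect + sum over indirect routes of the
    product of path coefficients along each route."""
    total = direct
    for chain in indirect_paths:
        product = 1.0
        for coeff in chain:
            product *= coeff
        total += product
    return total

# hypothetical: a factor acts on geo-economic relationships directly (0.2)
# and indirectly through economic factors (0.5 then 0.6)
print(round(total_effect(0.2, [[0.5, 0.6]]), 3))  # 0.5
```

This arithmetic is why the paper can report indirect effects exceeding direct ones: a modest direct path can be dominated by a strong mediated chain.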

  1. Providing Access and Visualization to Global Cloud Properties from GEO Satellites

    Science.gov (United States)

    Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.; Ayers, J. K.

    2015-12-01

    Providing public access to cloud macro- and microphysical properties is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and method that allow end users to easily browse and access cloud information that is otherwise difficult to acquire and manipulate. The core of the tool is an application programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the dynamically generated imagery as an input into their own workflows, for both image generation and cloud product requisition. This project builds upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product imagery accessible and easily searchable. As we see the increasing use of virtual supply chains that provide additional value at each link, there is value in making satellite-derived cloud product information available through a simple access method, as well as allowing users to browse and view that imagery as they need rather than in a manner most convenient for the data provider. Using the Open Geospatial Consortium's Web Processing Service as our access method, we describe a system that uses a hybrid local and cloud-based parallel processing system that can return both satellite imagery and cloud product imagery, as well as the binary data used to generate them, in multiple formats. The images and cloud products are sourced from multiple satellites and also from "merged" datasets created by temporally and spatially matching satellite sensors. Finally, the tool and API allow users to access information that spans the time ranges for which our group has information available. In the case of satellite imagery, the temporal range can span the entire lifetime of the sensor.
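An OGC Web Processing Service can be invoked with a simple key-value-pair GET request. The sketch below builds such an Execute URL; the endpoint, process identifier, and input names are hypothetical placeholders, not the Langley group's actual API:

```python
from urllib.parse import urlencode

# Sketch of a KVP (GET) Execute request against an OGC WPS 1.0.0 endpoint.
# "https://example.org/wps" and "CloudProductImage" are invented placeholders.
def wps_execute_url(endpoint, process_id, inputs):
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        # DataInputs in KVP form: semicolon-separated key=value pairs
        "datainputs": ";".join(f"{k}={v}" for k, v in inputs.items()),
    }
    return f"{endpoint}?{urlencode(params)}"

url = wps_execute_url(
    "https://example.org/wps",
    "CloudProductImage",
    {"satellite": "GOES-13", "product": "cloud_phase",
     "time": "2015-07-01T12:00:00Z"},
)
print(url)
```

A client workflow would fetch this URL and receive either imagery or the underlying binary data, depending on the requested output format.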

  2. Access 2013 for dummies

    CERN Document Server

    Ulrich Fuller, Laurie

    2013-01-01

    The easy guide to Microsoft Access returns with updates on the latest version! Microsoft Access allows you to store, organize, view, analyze, and share data; the new Access 2013 release enables you to build even more powerful, custom database solutions that integrate with the web and enterprise data sources. Access 2013 For Dummies covers all the new features of the latest version of Access and serves as an ideal reference, combining the latest Access features with the basics of building usable databases. You'll learn how to create an app from the Welcome screen, get support

  3. Pro Access 2010 Development

    CERN Document Server

    Collins, Mark

    2011-01-01

    Pro Access 2010 Development is a fundamental resource for developing business applications that take advantage of the features of Access 2010 and the many sources of data available to your business. In this book, you'll learn how to build database applications, create Web-based databases, develop macros and Visual Basic for Applications (VBA) tools for Access applications, integrate Access with SharePoint and other business systems, and much more. Using a practical, hands-on approach, this book will take you through all the facets of developing Access-based solutions, such as data modeling, co

  4. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    Science.gov (United States)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

    As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and to make modeling software available to non-modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now possible. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework, recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general-purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks which are not generally available in existing frameworks, and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together. 
The following codes are now being reengineered within the context of Pyre: Tecton, 3-D FE Visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic

  5. A novel insight into beaconless geo-routing

    KAUST Repository

    Bader, Ahmed

    2012-12-01

    Beaconless geo-routing protocols have traditionally been analyzed assuming equal communication ranges for the data and control packets. This is not true in reality, since the communication range is in practice a function of the packet length. As a consequence, a substantial discrepancy may exist between the analytical and empirical results offered in the beaconless geo-routing literature. Furthermore, the performance of beaconless geo-routing protocols has typically been evaluated using single-hop metrics only; end-to-end performance is considered in the literature only occasionally, and mainly by simulation. In this paper, we re-examine this class of protocols. We first incorporate practical packet detection models in order to capture the dependency of the communication range on the packet's length. We then develop a detailed analytical framework for the end-to-end delay and energy performance of beaconless geo-routing protocols. Finally, we present two different application scenarios and study various tradeoffs in light of the framework developed. © 2012 IEEE.
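The core observation — that longer packets travel a shorter effective range, which in turn changes hop count and end-to-end delay — can be illustrated with a toy model. This is not the paper's analytical framework; the range law, its exponent, and all parameter values below are invented for illustration:

```python
import math

# Toy model: effective communication range shrinks as packet length grows,
# so a route of fixed geographic length needs more hops for long data
# packets than for short control packets. All parameters are hypothetical.

def comm_range(packet_bits, r0=300.0, ref_bits=128, gamma=0.1):
    """Hypothetical range model: r0 metres at ref_bits, decaying with length."""
    return r0 * (ref_bits / packet_bits) ** gamma

def end_to_end_delay(distance_m, packet_bits, per_hop_delay_s=0.01):
    """Hops needed to cover the distance, and the resulting total delay."""
    hops = math.ceil(distance_m / comm_range(packet_bits))
    return hops, hops * per_hop_delay_s

print(end_to_end_delay(3000, 128))   # short, control-sized packet
print(end_to_end_delay(3000, 4096))  # long data packet: more hops, more delay
```

Analyses that assume one common range for both packet types would miss exactly this gap between the two results.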

  6. Functionally Graded Materials Database

    Science.gov (United States)

    Kisara, Katsuto; Konno, Tomomi; Niino, Masayuki

    2008-02-01

    The Functionally Graded Materials Database (hereinafter referred to as the FGMs Database) was opened to the public via the Internet in October 2002, and it has since been managed by the Japan Aerospace Exploration Agency (JAXA). As of October 2006, the database included 1,703 research information entries, with data on 2,429 researchers, 509 institutions and so on. Reading materials such as "Applicability of FGMs Technology to Space Plane" and "FGMs Application to Space Solar Power System (SSPS)" were prepared in FY 2004 and FY 2005, respectively. An English version of "FGMs Application to Space Solar Power System (SSPS)" is now under preparation. This paper explains the FGMs Database, describing the research information data, the sitemap and how to use it. Based on access analysis, user access results and users' interests are discussed.

  7. GeoBest - A contribution to the long term development of deep geothermal energy in Switzerland.

    Science.gov (United States)

    Kraft, T.; Wiemer, S.; Husen, S.

    2012-04-01

    The processes and conditions underpinning induced seismicity associated with deep geothermal operations are still not sufficiently well understood to make useful predictions as to the likely seismic response to reservoir development and exploitation. The empirical data include only a handful of well-monitored EGS experiments; models are consequently poorly constrained. Unfortunately, data sets of well-monitored deep hydrothermal experiments are missing, and empirical constraints of induced seismicity models for these cases do not exist. Given that the majority of the projects underway or planned in Europe are of the hydrothermal type, there is hope that this deficit can be remedied in the near future through close cooperation between the geothermal industry, science and public authorities. The GeoBest project was initiated in Switzerland to facilitate this dialogue between the geothermal industry, science and public authorities. The Swiss Seismological Service (SED) is implementing the GeoBest project on behalf of the Swiss Federal Office for Energy (SFOE) to provide cantonal and federal authorities with guidelines on how to handle seismic monitoring and hazard in the framework of the environmental risk assessment. Within GeoBest, selected pilot projects in Switzerland will be supported during the necessary seismic monitoring of natural and induced seismicity. GeoBest supports the pilot projects during the first two years, which are the most critical with respect to financial risk, by providing seismological instrumentation from the GeoBest instrument pool and partial financial support for the operation of the seismic monitoring network. In return, the pilot projects grant the SED access to project data needed for seismic hazard assessment and the development of best-practice guidelines. This type of collaboration offers the unique opportunity to collect high-quality seismological data and, by combining them with relevant project data, to gain first-hand practical experience for the

  8. A web-based, relational database for studying glaciers in the Italian Alps

    Science.gov (United States)

    Nigrelli, G.; Chiarle, M.; Nuzzi, A.; Perotti, L.; Torta, G.; Giardino, M.

    2013-02-01

    Glaciers are among the best terrestrial indicators of climate change and thus glacier inventories have attracted a growing, worldwide interest in recent years. In Italy, the first official glacier inventory was completed in 1925 and 774 glacial bodies were identified. As the amount of data continues to increase, and new techniques become available, there is a growing demand for computer tools that can efficiently manage the collected data. The Research Institute for Geo-hydrological Protection of the National Research Council, in cooperation with the Departments of Computer Science and Earth Sciences of the University of Turin, created a database that provides a modern tool for storing, processing and sharing glaciological data. The database was developed according to the need to store heterogeneous information, which can be retrieved through a set of web search queries. The database's architecture is server-side, and it was designed by means of open-source software. The website interface, simple and intuitive, was intended to meet the needs of a distributed public: through this interface, any type of glaciological data can be managed, specific queries can be performed, and the results can be exported in a standard format. The use of a relational database to store and organize a large variety of information about Italian glaciers collected over the last hundred years constitutes a significant step forward in ensuring the safety and accessibility of such data. Moreover, the same benefits also apply to the enhanced operability for handling information in the future, including new and emerging types of data formats, such as geographic and multimedia files. Future developments include the integration of cartographic data, such as base maps, satellite images and vector data. The relational database described in this paper will be the heart of a new geographic system that will merge data, data attributes and maps, leading to a complete description of Italian glacial
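The key design idea — heterogeneous observations normalized into related tables and retrieved via queries — can be sketched with an in-memory SQLite database. The table layout and numbers below are illustrative only, not the project's actual schema (Ciardoney is a real Italian glacier, but the area values are made up):

```python
import sqlite3

# Minimal relational sketch: one table of glaciers, one of repeat surveys,
# joined by a foreign key. Names and values are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE glacier (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    lat  REAL,
    lon  REAL
);
CREATE TABLE survey (
    id         INTEGER PRIMARY KEY,
    glacier_id INTEGER REFERENCES glacier(id),
    year       INTEGER,
    area_km2   REAL
);
""")
conn.execute("INSERT INTO glacier VALUES (1, 'Ciardoney', 45.52, 7.38)")
conn.executemany("INSERT INTO survey VALUES (?, ?, ?, ?)",
                 [(1, 1, 1971, 1.04), (2, 1, 2011, 0.58)])

# A typical web search query: area evolution per glacier over time
rows = conn.execute("""
    SELECT g.name, s.year, s.area_km2
    FROM survey s JOIN glacier g ON g.id = s.glacier_id
    ORDER BY s.year
""").fetchall()
print(rows)
```

The same pattern extends naturally to further tables (photographs, mass-balance series, cartographic layers) without touching the existing ones.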

  9. Building a database for long-term monitoring of benthic macrofauna in the Pertuis-Charentais (2004-2014).

    Science.gov (United States)

    Philippe, Anne S; Plumejeaud-Perreau, Christine; Jourde, Jérôme; Pineau, Philippe; Lachaussée, Nicolas; Joyeux, Emmanuel; Corre, Frédéric; Delaporte, Philippe; Bocher, Pierrick

    2017-01-01

    Long-term benthic monitoring is rewarding in terms of science, but labour-intensive, whether in the field, the laboratory, or behind the computer. Building and managing databases require multiple skills, including consistency over time as well as organisation via a systematic approach. Here, we introduce and share our spatially explicit benthic database, comprising 11 years of benthic data. It is the result of intensive benthic sampling that has been conducted on a regular grid (259 stations) covering the intertidal mudflats of the Pertuis-Charentais (Marennes-Oléron Bay and Aiguillon Bay). Samples were taken on foot or by boat during winter, depending on tidal height, from December 2003 to February 2014. The present dataset includes abundances and biomass densities of all mollusc species of the study regions and principal polychaetes, as well as their length, accessibility to shorebirds, energy content and shell mass when appropriate and available. This database has supported many studies dealing with the spatial distribution of benthic invertebrates and temporal variations in food resources for shorebird species, as well as latitudinal comparisons with other databases. In this paper, we introduce our benthos monitoring, share our data, and present a "guide of good practices" for building, cleaning and using it efficiently, providing examples of results with associated R code. The dataset has been formatted into a geo-referenced relational database, using the PostgreSQL open-source DBMS. We provide density information, measurements, energy content and accessibility of thirteen bivalve, nine gastropod and two polychaete taxa (a total of 66,620 individuals) for 11 consecutive winters. Figures and maps are provided to describe how the dataset was built, cleaned, and how it can be used. This dataset can again support studies concerning spatial and temporal variations in species abundance, interspecific interactions as well as evaluations of the availability of food

  10. Comparative Analysis of NOAA REFM and SNB3GEO Tools for the Forecast of the Fluxes of High-Energy Electrons at GEO

    Science.gov (United States)

    Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, Homayon; Sibeck, D. G.; Billings, S. A.

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field Bz observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
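The Heidke skill score mentioned here is computed from a 2×2 contingency table of forecast versus observed events; the counts below are hypothetical, not the GOES 13 verification statistics:

```python
# Heidke skill score (HSS) for a 2x2 forecast contingency table:
# 1 = perfect forecast, 0 = no skill beyond random chance, < 0 = worse
# than chance. Counts below are invented for illustration.

def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    num = 2 * (a * d - b * c)
    den = (a + c) * (c + d) + (a + b) * (b + d)
    return num / den

# Hypothetical daily verification of "very high flux" event forecasts
print(round(heidke_skill_score(20, 5, 10, 330), 3))
```

Because very high flux days are rare, a chance-corrected score like HSS is preferred over plain accuracy, which would look excellent even for a model that never forecasts an event.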

  11. Comparative analysis of NOAA REFM and SNB3GEO tools for the forecast of the fluxes of high-energy electrons at GEO

    Science.gov (United States)

    Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, H.; Sibeck, D. G.; Billings, S. A.

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field Bz observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.

  12. Software listing: CHEMTOX database

    International Nuclear Information System (INIS)

    Moskowitz, P.D.

    1993-01-01

    Initially launched in 1983, the CHEMTOX Database was among the first microcomputer databases containing hazardous chemical information. The database is used in many industries and government agencies in more than 17 countries. Updated quarterly, the CHEMTOX Database provides detailed environmental and safety information on 7500-plus hazardous substances covered by dozens of regulatory and advisory sources. This brief listing describes the method of accessing data and provides ordering information for those wishing to obtain the CHEMTOX Database

  13. Rheological behavior of alkali-activated metakaolin during geo-polymerization

    International Nuclear Information System (INIS)

    Poulesquen, A.; Frizon, F.; Lambertin, D.

    2011-01-01

    The dynamic rheological behavior of geo-polymers, inorganic materials synthesized by activation of an aluminosilicate source by an alkaline solution, is described. The pastes studied were mixtures of an activation solution (alkali + silica) and metakaolin. The influence of the activation solution (NaOH vs. KOH), the silica (Aerosil vs. Tixosil), and the temperature on the evolution of the elastic modulus (G') and viscous modulus (G'') over time was studied in the linear viscoelastic range. The results show that the nature of the silica has little influence on the viscous and elastic moduli when the geo-polymer is activated by KOH, and that the setting time is faster with sodium hydroxide and at higher temperatures regardless of the geo-polymer. In addition, during geo-polymerization the stepwise variation of the modulus values indicates that the formation of the 3D network occurs in several steps. Moreover, geo-polymers activated by potassium hydroxide exhibit slower kinetics but the interactions between constituents are stronger, as the loss tangent (tanδ = G''/G') is lower. Finally, the maximum loss tangent, tanδ, was also used as a criterion to determine the temperature dependence of the geo-polymers synthesized. This criterion is a precursor of the transition to the glassy state. The activation energies could thus be determined for the geo-polymers synthesized with potassium hydroxide or sodium hydroxide. (authors)
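Determining an activation energy from a kinetic criterion measured at several temperatures typically follows an Arrhenius analysis. The sketch below shows the two-temperature form; the setting times and temperatures are made-up illustrative values, not the paper's measurements:

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol*K)

# Arrhenius estimate of activation energy from a kinetic criterion
# (e.g. the time at which tan(delta) reaches its maximum) observed at
# two temperatures, assuming t = A * exp(E_a / (R * T)).
def activation_energy(t1_s, T1_K, t2_s, T2_K):
    return R_GAS * math.log(t1_s / t2_s) / (1.0 / T1_K - 1.0 / T2_K)

# Hypothetical data: criterion reached after 6 h at 20 °C, 1.5 h at 40 °C
Ea = activation_energy(6 * 3600, 293.15, 1.5 * 3600, 313.15)
print(round(Ea / 1000, 1), "kJ/mol")
```

With more than two temperatures, the same relation becomes a linear fit of ln(t) against 1/T, whose slope is E_a/R.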

  14. Geo-targeted Weather Alerts Coming to Millions of Mobile Devices

    Science.gov (United States)

    Gerber, M.

    2011-12-01

    The Personal Localized Alert Network (PLAN), aka the Commercial Mobile Alert System (CMAS), is readying for rollout and will be broadcasting emergency public alerts to millions of cell phones by the middle of 2012. Learn how the National Weather Service (NWS) is supplying PLAN with geo-referenced weather alert information in the industry-standard Common Alerting Protocol (CAP) format, and how you can access this same information for integration with mobile devices, other consumer electronics, and decision support systems. Information will also be provided on the NWS's new collaborative venue that encourages wide participation in the evolution and use of NWS CAP alerts in a variety of applications.
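CAP alerts are XML documents whose geo-reference lives in the `area`/`polygon` element as a list of "lat,lon" vertices. The snippet below parses a toy CAP 1.2 fragment (invented for illustration, not an actual NWS product):

```python
import xml.etree.ElementTree as ET

# Reading the geo-referenced fields from a minimal CAP 1.2 alert.
CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"
sample = f"""<alert xmlns="{CAP_NS}">
  <info>
    <event>Severe Thunderstorm Warning</event>
    <area>
      <areaDesc>Example County</areaDesc>
      <polygon>35.0,-97.5 35.2,-97.4 35.1,-97.2 35.0,-97.5</polygon>
    </area>
  </info>
</alert>"""

root = ET.fromstring(sample)
ns = {"cap": CAP_NS}
event = root.findtext("cap:info/cap:event", namespaces=ns)
polygon = root.findtext("cap:info/cap:area/cap:polygon", namespaces=ns)
# Each whitespace-separated vertex is "lat,lon"; parse into float pairs
vertices = [tuple(map(float, p.split(","))) for p in polygon.split()]
print(event, vertices[0])
```

A mobile device can test its own coordinates against this polygon to decide whether the alert applies to the user's location.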

  15. Modelling of Diffuse Failure and Fluidization in geo materials and Geo structures

    International Nuclear Information System (INIS)

    Pastor, M.

    2013-01-01

    Failure of geo structures is caused by changes in effective stresses induced by external loads (earthquakes, for instance), changes in the pore pressures (rain), in the geometry (erosion), or in material properties (chemical attack, degradation, weathering). Landslides can be analysed as the failure of a geo structure, the slope. There exist many alternative classifications of landslides, but we will consider here a simple classification into slides and flows. In the case of slides, the failure consists of the movement of a part of the slope, with deformations concentrated in a narrow zone, the failure surface. This can be idealized as localized failure, and it is typical of overconsolidated or dense materials exhibiting softening. On the other hand, flows are made of fluidized materials, flowing in a fluid-like manner. This mechanism of failure is known as diffuse failure, and it has received much less attention from researchers. Modelling of diffuse failure of slopes is complex, because difficulties appear in the mathematical, constitutive and numerical models, which have to account for a phase transition. This work deals with modelling, and we present here some tools recently developed by the author and his research group. (Author)

  16. Cell Centred Database (CCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Cell Centered Database (CCDB) is a web accessible database for high resolution 2D, 3D and 4D data from light and electron microscopy, including correlated imaging.

  17. GeoForum MV 2012. GIS schafft Energie. Contributions of geo-information science to the energy turnaround; GeoForum MV 2012. GIS schafft Energie. Beitraege der Geoinformationswirtschaft zur Energiewende

    Energy Technology Data Exchange (ETDEWEB)

    Bill, Ralf [Rostock Univ. (Germany). Professur fuer Geodaesie und Geoinformatik; Flach, Guntram [Fraunhofer IGD, Rostock (Germany); Klammer, Ulf; Lerche, Tobias (eds.) [GeoMV e.V. Verein der Geoinformationswirtschaft Mecklenburg-Vorpommern e.V., Rostock (Germany)

    2012-07-01

    Geo-information systems (GIS) have become indispensable in the development and implementation of concepts for enhanced use of renewable energy sources. Publications in geo-informatics have so far tended to focus on potential studies and regional planning aspects, but also on the establishment of land registers for energy sources and heat consumption. This year's GeoForum presented a comprehensive and concise picture of all these trends. Further subjects were discussed as well: 1. logistics, eMobility and the development of individualised services in public transportation; 2. geodata, especially of the state of Mecklenburg-Western Pomerania and with a view to the power supply sector; 3. basic technologies, as current trends in INSPIRE towards increasing data volumes and services will enhance their uses in the energy sector.

  18. Development of an electronic emergency department-based geo-information injury surveillance system in Hong Kong.

    Science.gov (United States)

    Chow, C B; Leung, M; Lai, Adela; Chow, Y H; Chung, Joanne; Tong, K M; Lit, Albert

    2012-06-01

    To describe the experience in the development of an electronic emergency department (ED)-based injury surveillance (IS) system in Hong Kong using data mining and geo-spatial information technology (IT) for a Safe Community setup. This paper describes the phased development of an emergency department-based IS system, based on the World Health Organization (WHO) injury surveillance guideline, to support safety promotion and injury prevention in a Safe Community in Hong Kong starting in 2002. The initial ED database collected data only on name, sex, age, address, eight general categories of injury type (traffic, domestic, common assault, indecent assault, battery, industrial, self-harm and sports) and disposal from the ED. Phase 1--manual collection of International Classification of External Causes of Injury pre-event data; Phase 2--the manual form was converted to electronic format using web-based data-mining technology with a built-in data quality monitoring mechanism; Phase 3--integration of injury surveillance data with in-patient hospital information; and Phase 4--geo-spatial information and body mapping were introduced to geo-code the exact place of injury on an electronic map and the site of injury on a body map. It was feasible to develop a geo-spatial IS system at a busy ED to collect valuable information for safety promotion and injury prevention in a Safe Community setting. The keys to successful development and implementation involve engagement of all stakeholders in the design and implementation of the system with injury prevention as the ultimate goal, detailed workflow planning at the front end, support from management, building on existing systems and appropriate utilisation of modern technology. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. The Correlation of Geo-Ecological Environment and Mountain Urban planning

    Science.gov (United States)

    Yang, Chun; Zeng, Wei

    2018-01-01

    As special areas with complex geological structures, mountain cities are particularly prone to geological disasters. As geo-ecological environment problems such as air pollution, ground subsidence, serious water pollution, earthquakes and floods have become increasingly serious, mountain urban planning faces ever more severe challenges. This article is therefore based on research into the correlation between the geo-ecological environment and mountain urban planning. It re-examines mountain urban planning from a geo-ecological perspective, coordinates the relationship between humans and nature through geo-ecological thinking, and raises the questions to which urban planning needs to pay attention. It advocates creating an integrated system of geo-ecological and mountain urban planning, and analyses the status and dynamics of present mountain urban planning.

  20. Data Management for Flexible Access - Implementation and Lessons Learned from work with Multiple User Communities

    Science.gov (United States)

    Benedict, K. K.; Scott, S.; Hudspeth, W. B.

    2012-12-01

    There is no shortage of community-specific and generic data discovery and download platforms and protocols (e.g. CUAHSI HIS, DataONE, GeoNetwork Open Source, GeoPortal, OGC CSW, OAI-PMH), documentation standards (e.g. FGDC, ISO 19115, EML, Dublin Core), data access and visualization standards and models (e.g. OGC WxS, OPeNDAP), and general-purpose web service models (i.e. REST and SOAP) upon which geo-informatics cyberinfrastructure (CI) may be built. When attempting to develop a robust platform that may service a wide variety of users and use cases, the challenge is one of identifying which existing platform (if any) may support those current needs while also allowing for future expansion for additional capabilities. In the case of the implementation of a data storage, discovery and delivery platform to support the multiple projects at the Earth Data Analysis Center at UNM, no single platform or protocol met the joint requirements of two initial applications (the New Mexico Resource Geographic Information System [http://rgis.unm.edu] and the New Mexico EPSCoR Data Portal [http://nmepscor.org/dataportal]), and furthermore none met anticipated additional requirements as new applications of the platform emerged. As a result of this assessment, three years ago EDAC embarked on the development of the Geographic Storage, Transformation, and Retrieval Engine (GSToRE) platform, a general-purpose platform upon which n-tiered, geospatially enabled, data-intensive applications could be built. When initially released in 2010, the focus was on the publication of dynamically generated Open Geospatial Consortium services based upon a PostgreSQL/PostGIS backend database. 
The identification of additional service interface requirements (implementation of the DataONE API and CUAHSI WaterML services), use cases provided by the NM EPSCoR education working group, and expanded metadata publication needs have led to a significant update to the underlying data management tier for GSToRE - the

  1. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working, as the available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and by adding new replicas should the load increase. Finally, database replication can provide fast local access, even if clients are geographically distributed, provided data copies are located close to the clients. Despite its advantages, replication is not a straightforward technique to apply, and
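The three benefits named above — fault-tolerance, load distribution, and local access — can be seen in a deliberately tiny toy sketch (not any real replication protocol): writes are propagated eagerly to every available replica, and reads can be served by whichever replica is reachable:

```python
# Toy replication sketch: eager (synchronous) write propagation to all
# available replicas; reads served by the first reachable replica.
# This illustrates the idea only -- real systems add ordering, consensus,
# and recovery protocols on top.

class Replica:
    def __init__(self, name):
        self.name = name
        self.data = {}
        self.alive = True

class ReplicatedStore:
    def __init__(self, replicas):
        self.replicas = replicas

    def write(self, key, value):
        # Eagerly apply the update at every live replica
        for r in self.replicas:
            if r.alive:
                r.data[key] = value

    def read(self, key):
        # Serve from the first available replica (e.g. the closest one)
        for r in self.replicas:
            if r.alive:
                return r.data.get(key)
        raise RuntimeError("no replica available")

store = ReplicatedStore([Replica("eu"), Replica("us")])
store.write("x", 1)
store.replicas[0].alive = False   # one replica fails...
print(store.read("x"))            # ...the surviving replica still answers
```

The non-trivial part the abstract alludes to starts exactly here: keeping replicas consistent when writes race with failures, which is what replication protocols exist to solve.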

  2. Geo-engineering: a curse or a blessing?

    NARCIS (Netherlands)

    Wissenburg, M.L.J.

    2016-01-01

    In recent years, geo-engineering has been suggested as a viable strategy in dealing with climate change, the main indicator of what has become known as ‘the Anthropocene’. In this paper, I investigate the effects of geo-engineering in terms of freedom – not the only but perhaps the most important

  3. Communication of geo-scientific safety arguments

    International Nuclear Information System (INIS)

    Flavelle, P.; Goodwin, B.; Jensen, M.; Linden, R.; Mazurek, M.; Srivastave, M.; Strom, A.; Sudicky, E.; Voinis, S.

    2007-01-01

    Working Group B addressed the communication of geo-scientific safety arguments through a discussion of practical experience as it related to the methods, types of information and specific arguments found to best communicate geo-scientific concepts and notions of safety with broad audiences, including colleagues, authorities and regulators, political decision makers, academics, and the general public. The following questions were suggested by the programme committee of the AMIGO-2 workshop for discussion by Working Group B with respect to the communication of geo-scientific information and safety arguments: - What is the place of geo-scientific arguments in relation to quantitative and qualitative topics like scenario and FEPs (features, events, processes) assessment, simulated repository evolution, calculated dose or risk impacts, engineering tests of materials, etc., when presenting a safety case to different audiences and with respect to the various stages of the repository programme? (see section 3). - Would we be better off focusing messages to the public on time scales of a few hundred years or a few generations? (see section 4). - How do you handle the fact that geoscience interpretations seldom are unique and data often are open to various interpretations? (see section 5). - How do you handle expert controversy on a specific topic? (see section 6). (authors)

  4. Instrumental Genesis in GeoGebra Based Board Game Design

    DEFF Research Database (Denmark)

    Misfeldt, Morten

    2013-01-01

    In this paper I address the use of digital tools (GeoGebra) in open-ended design activities with primary school children. I present results from the research and development project “Creative Digital Mathematics”, which aims to use the pupils’ development of mathematical board games as a vehicle...... in their work with GeoGebra and how they relate their work with GeoGebra and mathematics to fellow pupils and real-life situations. The results show that pupils consider the development of board games a meaningful mathematical activity and that they develop skills with GeoGebra; furthermore, the pupils consider...... potential use of their board game by classmates in their design activities....

  5. X-ray Photoelectron Spectroscopy Database (Version 4.1)

    Science.gov (United States)

    SRD 20 X-ray Photoelectron Spectroscopy Database (Version 4.1) (Web, free access)   The NIST XPS Database gives access to energies of many photoelectron and Auger-electron spectral lines. The database contains over 22,000 line positions, chemical shifts, doublet splittings, and energy separations of photoelectron and Auger-electron lines.

  6. The AMMA database

    Science.gov (United States)

    Boichard, Jean-Luc; Brissebrat, Guillaume; Cloche, Sophie; Eymard, Laurence; Fleury, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim

    2010-05-01

    The AMMA project includes aircraft, ground-based and ocean measurements, an intensive use of satellite data and diverse modelling studies. Therefore, the AMMA database aims at storing a great amount and a large variety of data, and at providing the data as rapidly and safely as possible to the AMMA research community. In order to stimulate the exchange of information and collaboration between researchers from different disciplines or using different tools, the database provides a detailed description of the products and uses standardized formats. The AMMA database contains: - AMMA field campaign datasets; - historical data in West Africa from 1850 (operational networks and previous scientific programmes); - satellite products from past and future satellites, (re-)mapped on a regular latitude/longitude grid and stored in NetCDF format (CF Convention); - model outputs from atmosphere or ocean operational (re-)analyses and forecasts, and from research simulations. The outputs are processed in the same way as the satellite products. Before accessing the data, any user has to sign the AMMA data and publication policy. This charter only covers the use of data in the framework of scientific objectives and categorically excludes the redistribution of data to third parties and usage for commercial applications. Some collaboration between data producers and users, and the mention of the AMMA project in any publication, are also required. The AMMA database and the associated on-line tools have been fully developed and are managed by two teams in France (IPSL Database Centre, Paris and OMP, Toulouse). Users can access data from both data centres through a single web portal. This website is composed of different modules: - Registration: forms to register, and to read and sign the data use charter when a user visits for the first time; - Data access interface: a user-friendly tool for building a data extraction request by selecting various criteria like location, time, parameters... The request can
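The "various criteria" of the data access interface above can be sketched as a simple filter over record metadata. This is a minimal illustration only; the field names and selection logic are hypothetical and do not reflect the AMMA portal's actual schema or API.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record metadata; the fields are illustrative, not the AMMA schema.
@dataclass
class Record:
    name: str
    lat: float
    lon: float
    time: datetime
    parameter: str

def select(records, lat_range, lon_range, time_range, parameters):
    """Keep records matching a bounding box, a time window and a parameter list."""
    lat0, lat1 = lat_range
    lon0, lon1 = lon_range
    t0, t1 = time_range
    return [r for r in records
            if lat0 <= r.lat <= lat1
            and lon0 <= r.lon <= lon1
            and t0 <= r.time <= t1
            and r.parameter in parameters]

records = [
    Record("niamey_rain", 13.5, 2.1, datetime(2006, 7, 15), "precipitation"),
    Record("dakar_wind", 14.7, -17.5, datetime(2006, 7, 15), "wind_speed"),
]
hits = select(records, (10, 20), (0, 10),
              (datetime(2006, 1, 1), datetime(2007, 1, 1)), {"precipitation"})
print([r.name for r in hits])  # only the Niamey precipitation record matches
```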

  7. Database Quality and Access Issues Relevant to Research Using Anesthesia Information Management System Data.

    Science.gov (United States)

    Epstein, Richard H; Dexter, Franklin

    2018-07-01

    For this special article, we reviewed the computer code used to extract the data, and the text of all 47 studies published between January 2006 and August 2017 using anesthesia information management system (AIMS) data from Thomas Jefferson University Hospital (TJUH). Data from this institution were used in the largest number (P = .0007) of papers describing the use of AIMS published in this time frame. The AIMS was replaced in April 2017, making this a finite sample. The objective of the current article was to identify factors that made TJUH successful in publishing anesthesia informatics studies. We examined the structured query language used for each study to determine the extent to which databases outside of the AIMS were used. We examined data quality from the perspectives of completeness, correctness, concordance, plausibility, and currency. Our results were that most studies could not have been completed without external database sources (36/47, 76.6%; P = .0003 compared with 50%). The operating room management system was linked to the AIMS and was used significantly more frequently (26/36, 72%) than other external sources. Access to these external data sources was provided, allowing exploration of data quality. The TJUH AIMS used high-resolution timestamps (to the nearest 3 milliseconds) and created audit tables to track changes to clinical documentation. Automatic data were recorded at 1-minute intervals and were not editable; data cleaning occurred during analysis. Few paired events with an expected order were out of sequence. Although most data elements were of high quality, there were notable exceptions, such as frequent missing values for estimated blood loss, height, and weight. Some values were duplicated with different units, and others were stored in varying locations. Our conclusions are that linking the TJUH AIMS to the operating room management system was a critical step in enabling publication of multiple studies using AIMS data.
Access to this and
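Two of the data-quality perspectives mentioned above, completeness (missing values) and plausibility (paired events in the expected order), can be illustrated with a small sketch. The case records and field names below are invented for illustration; they are not the TJUH AIMS schema.

```python
# Minimal data-quality checks on hypothetical case records: times are
# minutes from midnight; None marks a missing value.
cases = [
    {"id": 1, "anesthesia_start": 5, "incision": 20, "weight_kg": 80.0},
    {"id": 2, "anesthesia_start": 30, "incision": 25, "weight_kg": None},  # out of order + missing
]

def completeness(cases, field):
    """Fraction of cases with a non-missing value for `field`."""
    present = sum(1 for c in cases if c[field] is not None)
    return present / len(cases)

def out_of_sequence(cases, first, second):
    """IDs of cases where the event pair violates the expected order."""
    return [c["id"] for c in cases
            if c[first] is not None and c[second] is not None
            and c[first] >= c[second]]

print(completeness(cases, "weight_kg"))                        # 0.5
print(out_of_sequence(cases, "anesthesia_start", "incision"))  # [2]
```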

  8. Earth System Model Development and Analysis using FRE-Curator and Live Access Servers: On-demand analysis of climate model output with data provenance.

    Science.gov (United States)

    Radhakrishnan, A.; Balaji, V.; Schweitzer, R.; Nikonov, S.; O'Brien, K.; Vahlenkamp, H.; Burger, E. F.

    2016-12-01

    There are distinct phases in the development cycle of an Earth system model. During the model development phase, scientists make changes to code and parameters and require rapid access to results for evaluation. During the production phase, scientists may make an ensemble of runs with different settings, and produce large quantities of output that must be further analyzed and quality controlled for scientific papers and submission to international projects such as the Climate Model Intercomparison Project (CMIP). During this phase, provenance is a key concern: being able to track back from outputs to inputs. We will discuss one of the paths taken at GFDL in delivering tools across this lifecycle, offering on-demand analysis of data by integrating the use of GFDL's in-house FRE-Curator, Unidata's THREDDS and NOAA PMEL's Live Access Servers (LAS). Experience over this lifecycle suggests that a major difficulty in developing analysis capabilities lies only partly in the scientific content; much of the effort is devoted to answering the questions "where is the data?" and "how do I get to it?". "FRE-Curator" is the name of a database-centric paradigm used at NOAA GFDL to ingest information about the model runs into an RDBMS (Curator database). The components of FRE-Curator are integrated into the Flexible Runtime Environment workflow and can be invoked during climate model simulation. The front end to FRE-Curator, known as the Model Development Database Interface (MDBI), provides in-house web-based access to GFDL experiments: metadata, analysis output and more. In order to provide on-demand visualization, MDBI uses the Live Access Server, a highly configurable web server designed to provide flexible access to geo-referenced scientific data, which makes use of OPeNDAP. Model output saved in GFDL's tape archive, the size of the database and experiments, and continuous model development initiatives with more dynamic configurations add complexity and challenges in providing an on

  9. Geo Uruguay

    International Nuclear Information System (INIS)

    2008-06-01

    This book is based on the Geo Uruguay project, which consists of the analysis and diagnosis of the environmental impact on human welfare. The main topics covered in the different chapters are: human welfare, geographical aspects, climate change, transport and energy, changes in land use, coastal features, biodiversity, industrial urbanization, waste and territorial ordering, and energy supplies such as oil, wood, natural gas, coal and electricity

  10. HIV Structural Database

    Science.gov (United States)

    SRD 102 HIV Structural Database (Web, free access)   The HIV Protease Structural Database is an archive of experimentally determined 3-D structures of Human Immunodeficiency Virus 1 (HIV-1), Human Immunodeficiency Virus 2 (HIV-2) and Simian Immunodeficiency Virus (SIV) Proteases and their complexes with inhibitors or products of substrate cleavage.

  11. ON THE CARTOGRAPHICAL AND GEO INFORMATION TRAINING OF BACHELORS OF GEOGRAPHY

    Directory of Open Access Journals (Sweden)

    N. G. Ivliyeva

    2015-01-01

    Full Text Available The competence of a future expert is becoming a distinctive sign of the quality of education. When studying disciplines with a cartographical and geo-information orientation, students of geography gain practical skills in creating maps on the basis of GIS technologies. However, quite often, because of insufficient cartographical preparation, they create maps which do not satisfy the requirements of traditional cartography. The article describes the key points that users of GIS packages should know when drawing up analytical maps from the attribute tables of a GIS database. Using one method of cartographical visualization of data as an example, the functionality of two widespread GIS packages is compared.
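One standard example of the cartographic knowledge the article refers to is the choice of class breaks when mapping an attribute column: equal-interval and quantile classing of the same skewed data yield very different analytical maps. Both schemes below are textbook methods (not necessarily the ones compared in the article), and the data are invented for illustration.

```python
# Two common classing schemes for a choropleth map, applied to one
# invented, skewed attribute column (e.g. population density).
def equal_interval_breaks(values, k):
    """Upper bounds of k equal-width classes (all but the last)."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / k
    return [lo + step * i for i in range(1, k)]

def quantile_breaks(values, k):
    """Break values placing roughly equal counts in each of k classes."""
    s = sorted(values)
    n = len(s)
    return [s[(n * i) // k] for i in range(1, k)]

density = [3, 4, 5, 6, 8, 9, 12, 40, 55, 90]  # skewed, like many attribute tables
print(equal_interval_breaks(density, 3))  # wide classes dominated by outliers
print(quantile_breaks(density, 3))        # breaks hug the dense low end
```

With skewed data, equal intervals lump most units into the lowest class, while quantiles spread them out: the same table, two quite different maps.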

  12. Geo-Seas - a pan-European infrastructure for the management of marine geological and geophysical data.

    Science.gov (United States)

    Glaves, Helen; Graham, Colin

    2010-05-01

    Geo-Seas - a pan-European infrastructure for the management of marine geological and geophysical data. Helen Glaves1 and Colin Graham2 on behalf of the Geo-Seas consortium. The Geo-Seas project will create a network of twenty-six European marine geoscience data centres from seventeen coastal countries, including six from the Baltic Sea area. This will be achieved through the development of a pan-European infrastructure for the exchange of marine geoscientific data. Researchers will be able to locate and access harmonised and federated marine geological and geophysical datasets and data products held by the data centres through the Geo-Seas data portal, using a common data catalogue. The new infrastructure, an expansion of the existing SeaDataNet, will create an infrastructure covering oceanographic and marine geoscientific data. New data products and services will be developed following consultations with users on their current and future research requirements. Common data standards will be implemented across all of the data centres, and other geological and geophysical organisations will be encouraged to adopt the protocols, standards and tools which are developed as part of the Geo-Seas project. Oceanographic and marine data include a wide range of variables, an important category of which are the geological and geophysical datasets. These data include raw observational and analytical data as well as derived data products from seabed sediment samples, boreholes, geophysical surveys (seismic, gravity etc.) and sidescan sonar surveys, all of which are essential in order to produce a complete interpretation of seabed geology. Despite there being a large volume of geological and geophysical data available for the marine environment, it is currently very difficult to use these datasets in an integrated way between organisations due to different nomenclatures, formats, scales and coordinate systems being used within different organisations and also within different

  13. Simulation of Telescope Detectivity for Geo Survey and Tracking

    Science.gov (United States)

    Richard, P.

    2014-09-01

    As the number of space debris objects in Earth's orbit increases steadily, the need to survey, track and catalogue them becomes of key importance. In this context, CNES has been using the TAROT telescopes (Rapid Telescopes for Transient Objects, owned and operated by CNRS) for several years to conduct studies on space surveillance and tracking. Today, two testbeds of services using the TAROT telescopes are running every night: one for GEO situational awareness and the second for debris tracking. In addition to the CNES research activity in the space surveillance and tracking domain, an operational collision avoidance service for LEO and GEO satellites has been in place at CNES for several years. This service, named CAESAR (Conjunction Analysis and Evaluation: Alerts and Recommendations), is used by CNES as well as by external customers. As the optical debris tracking testbed based on the TAROT telescopes is the first step toward an operational provider of GEO measurements that could be used by CAESAR, simulations have been done to help choose the sites and types of telescopes that could be added to the GEO survey and debris tracking telescope network. One of the distinctive characteristics of the optical observation of space debris compared to traditional astronomical observation is the need to observe objects at low elevations. The two main reasons for this are the need to observe the GEO belt from non-equatorial sites and the need to observe debris at longitudes far from the telescope longitude. This paper presents the results of simulations of the detectivity for GEO debris of various telescopes and sites, based on models of the GEO belt, the atmosphere and the instruments. One of the conclusions is that clever detection of faint streaks and spread sources by image processing is one of the major keys to improving the detection of debris on the GEO belt.
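The low-elevation constraint noted above can be made concrete with the standard geostationary look-angle relation (spherical-Earth approximation). This is a generic textbook formula, not the paper's simulation model: for site latitude phi and longitude offset dlon from the satellite's sub-point, with k = R_earth / r_GEO.

```python
import math

def geo_elevation_deg(phi_deg, dlon_deg, k=6378.0 / 42164.0):
    """Elevation angle (degrees) of a geostationary satellite seen from a
    site at latitude phi_deg, offset dlon_deg in longitude from the
    satellite's sub-point. atan2 handles the sub-point case (c == 1)."""
    c = math.cos(math.radians(phi_deg)) * math.cos(math.radians(dlon_deg))
    return math.degrees(math.atan2(c - k, math.sqrt(1.0 - c * c)))

# An equatorial site looking 60 deg along the belt vs a mid-latitude site:
# the mid-latitude site sees the same slot much closer to the horizon.
print(round(geo_elevation_deg(0.0, 60.0), 1))
print(round(geo_elevation_deg(45.0, 60.0), 1))
```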

  14. JASPAR 2014: an extensively expanded and updated open-access database of transcription factor binding profiles.

    Science.gov (United States)

    Mathelier, Anthony; Zhao, Xiaobei; Zhang, Allen W; Parcy, François; Worsley-Hunt, Rebecca; Arenillas, David J; Buchman, Sorana; Chen, Chih-yu; Chou, Alice; Ienasescu, Hans; Lim, Jonathan; Shyr, Casper; Tan, Ge; Zhou, Michelle; Lenhard, Boris; Sandelin, Albin; Wasserman, Wyeth W

    2014-01-01

    JASPAR (http://jaspar.genereg.net) is the largest open-access database of matrix-based nucleotide profiles describing the binding preference of transcription factors from multiple species. The fifth major release greatly expands the heart of JASPAR-the JASPAR CORE subcollection, which contains curated, non-redundant profiles-with 135 new curated profiles (74 in vertebrates, 8 in Drosophila melanogaster, 10 in Caenorhabditis elegans and 43 in Arabidopsis thaliana; a 30% increase in total) and 43 older updated profiles (36 in vertebrates, 3 in D. melanogaster and 4 in A. thaliana; a 9% update in total). The new and updated profiles are mainly derived from published chromatin immunoprecipitation-seq experimental datasets. In addition, the web interface has been enhanced with advanced capabilities in browsing, searching and subsetting. Finally, the new JASPAR release is accompanied by a new BioPython package, a new R tool package and a new R/Bioconductor data package to facilitate access for both manual and automated methods.

  15. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

    Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on the examination of the accident databases by personal contact with the federal staff responsible for administration of the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and who to contact were prime questions to each of the database program managers. Additionally, how the agency uses the accident data was of major interest

  16. An Overview of the GEOS-5 Aerosol Reanalysis

    Science.gov (United States)

    da Silva, Arlindo; Colarco, Peter Richard; Damenov, Anton Spasov; Buchard-Marchant, Virginie; Randles, Cynthia A.; Gupta, Pawan

    2011-01-01

    GEOS-5 is the latest version of the NASA Global Modeling and Assimilation Office (GMAO) earth system model. GEOS-5 contains components for atmospheric circulation and composition (including data assimilation), ocean circulation and biogeochemistry, and land surface processes. In addition to traditional meteorological parameters, GEOS-5 includes modules representing the atmospheric composition, most notably aerosols and tropospheric/stratospheric chemical constituents, taking explicit account of the impact of these constituents on the radiative processes of the atmosphere. MERRA is a NASA meteorological reanalysis for the satellite era (1979-present) using GEOS-5. This project focuses on historical analyses of the hydrological cycle on a broad range of weather and climate time scales. As a first step towards an integrated Earth System Analysis (IESA), the GMAO is extending MERRA with reanalyses for other components of the earth system: land, ocean, bio-geochemistry and atmospheric constituents. In this talk we will present results from the MERRA-driven aerosol reanalysis covering the Aqua period (2003-present). The assimilation of Aerosol Optical Depth (AOD) in GEOS-5 involves very careful cloud screening and homogenization of the observing system by means of a Neural Net scheme that translates MODIS radiances into AERONET-calibrated AOD. These measurements are further quality controlled using an adaptive buddy check scheme, and assimilated using the Local Displacement Ensemble (LDE) methodology. For this reanalysis, GEOS-5 runs at a nominal 50 km horizontal resolution with 72 vertical layers (top at approx. 85 km). GEOS-5 is driven by daily biomass burning emissions derived from MODIS fire radiative power retrievals. We will present a summary of our efforts to validate such a dataset. The GEOS-5 assimilated aerosol fields are first validated by comparison to independent in-situ measurements (AERONET and PM2.5 surface concentrations). In order to assess aerosol

  17. NBIC: Search Ballast Report Database

    Science.gov (United States)

    The National Ballast Information Clearinghouse (NBIC) developed an online database that can be queried through its website. Data are accessible for all coastal states and the Great Lakes, and have been incorporated into the NBIC database as of August 2004. Information on data availability

  18. C# Database Basics

    CERN Document Server

    Schmalz, Michael

    2012-01-01

    Working with data and databases in C# certainly can be daunting if you're coming from VB6, VBA, or Access. With this hands-on guide, you'll shorten the learning curve considerably as you master accessing, adding, updating, and deleting data with C#-basic skills you need if you intend to program with this language. No previous knowledge of C# is necessary. By following the examples in this book, you'll learn how to tackle several database tasks in C#, such as working with SQL Server, building data entry forms, and using data in a web service. The book's code samples will help you get started

  19. Aviation Safety Issues Database

    Science.gov (United States)

    Morello, Samuel A.; Ricks, Wendell R.

    2009-01-01

    The aviation safety issues database was instrumental in the refinement and substantiation of the National Aviation Safety Strategic Plan (NASSP). The issues database is a comprehensive set of issues from an extremely broad base of aviation functions, personnel, and vehicle categories, both nationally and internationally. Several aviation safety stakeholders such as the Commercial Aviation Safety Team (CAST) have already used the database. This broader interest was the genesis for making the database publicly accessible and writing this report.

  20. Real Time Adaptive Stream-oriented Geo-data Filtering

    Directory of Open Access Journals (Sweden)

    A. A. Golovkov

    2016-01-01

    Full Text Available Modern maintenance software systems for various engineering objects are aimed at processing geo-location data coming from employees' mobile devices in real time. To reduce the amount of transmitted data, such systems usually apply various filtering methods to the geo-coordinates recorded directly on the mobile devices. The paper identifies the reasons for errors in geo-data coming from different sources and proposes an adaptive dynamic method to filter geo-location data. Compared with the static method previously described in the literature [1], the approach adaptively aligns the filtering threshold with the changing characteristics of coordinates from many sources of geo-location data. To evaluate the efficiency of the developed filtering method, about 400 thousand points were used, representing motion paths of different types (on foot, by car and by high-speed train) and parking (indoors, outdoors, near high-rise buildings), with data taken from different mobile devices. Analysis of the results has shown that the benefits of the proposed method are more precise location of long parking periods (up to 6 hours) and of coordinates when the user is in motion, and the capability to provide stream-oriented filtering of data from different sources, which allows the approach to be used in geo-information systems providing continuous monitoring of location in stream-oriented data processing in real time. The disadvantage is slightly higher computational complexity and an increased number of points in the final track as compared to other filtering techniques. In general, the developed approach enables a significant quality improvement of the displayed paths of moving mobile objects.
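A minimal sketch of threshold-based filtering of a geo-location stream, in the spirit of the method described: a fix is emitted only when it has moved farther than a multiple of its reported accuracy since the last emitted fix, so the threshold adapts per point. The abstract does not give the actual algorithm's parameters; everything below (the gain factor, the accuracy field, the sample track) is illustrative.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between (lat, lon) pairs in degrees."""
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def adaptive_filter(points, accuracies, gain=2.0):
    """Keep a point only if it moved more than `gain` x its reported
    accuracy since the last kept point; the threshold adapts to each fix."""
    kept = [points[0]]
    for p, acc in zip(points[1:], accuracies[1:]):
        if haversine_m(kept[-1], p) > gain * acc:
            kept.append(p)
    return kept

track = [(55.75, 37.61), (55.7501, 37.6101), (55.76, 37.62)]
acc = [10.0, 10.0, 10.0]  # reported accuracy per fix, metres
print(adaptive_filter(track, acc))  # jitter inside the threshold is dropped
```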

  1. Let's talk about it: dialogues with multimedia databases Database support for human activity

    NARCIS (Netherlands)

    de Vries, A.P.; van der Veer, Gerrit C.; Blanken, Henk

    We describe two scenarios of user tasks in which access to multimedia data plays a significant role. Because current multimedia databases cannot support these tasks, we introduce three new requirements on multimedia databases: multimedia objects should be active objects, querying is an interaction

  2. Calibration of GEO 600 for the S1 science run

    International Nuclear Information System (INIS)

    Hewitson, M; Grote, H; Heinzel, G; Strain, K A; Ward, H; Weiland, U

    2003-01-01

    In 2002, the interferometric gravitational wave detector GEO 600 took part in a coincident science run (S1) with other detectors world-wide. When completed, GEO will employ a dual-recycling scheme which will allow its peak sensitivity to be tuned over a range of frequencies in the detection band. Still in the commissioning phase, GEO was operated as a power-recycled Michelson for the duration of S1. The accurate calibration of the sensitivity of GEO to gravitational waves is a critical step in preparing GEO data for exchange with other detectors forming a world-wide detector network. An online calibration scheme has been developed to perform real-time calibration of the power-recycled GEO detector. This scheme will later be extended to cover the more complex case of the dual-recycled interferometer in which multiple output signals will need to be combined to optimally recover a calibrated strain channel. This report presents an outline of the calibration scheme that was used during S1. Also presented are results of detector characterization work that arises naturally from the calibration work

  3. Characterization of ginger essential oil/palygorskite composite (GEO-PGS) and its anti-bacteria activity.

    Science.gov (United States)

    Lei, Hong; Wei, Qiaonian; Wang, Qing; Su, Anxiang; Xue, Mei; Liu, Qin; Hu, Qiuhui

    2017-04-01

    To explore a novel kind of anti-bacterial composite material having excellent antibacterial ability, stability and specific-targeting capability, palygorskite (PGS) was used as the carrier of ginger essential oil (GEO) and a novel composite, GEO-PGS, was prepared by an ion exchange process. The characterization and antibacterial activity of GEO-PGS were investigated in this study. Results of FTIR, XPS, XRD and TG analysis and SEM observation demonstrated the combination of GEO and PGS: GEO was absorbed on the surface of PGS, and the content of GEO in the composite was estimated to be 18.66%. Results of minimal inhibitory concentration (MIC) analysis, growth curves and Gram staining analysis of Staphylococcus aureus and Escherichia coli exposed to GEO-PGS showed that GEO-PGS had much higher antibacterial activity than GEO, and that GEO-PGS had a specific-targeting antibacterial capability. Moreover, GEO-PGS showed thermo-stability and resistance to acidity and alkalinity in exerting its antibacterial activity. In conclusion, the novel composite GEO-PGS combined the bacteria-absorbent activity of PGS and the antibacterial activity of GEO, suggesting the great potential application of GEO-PGS as a novel composite substance with high antibacterial activity. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Geo-communication, web-services, and spatial data infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2007-01-01

    The introduction of web-services as index-portals based on geo-information has changed the conditions for both the content and form of geo-communication. A high number of players and interactions as well as a very high number of all kinds of information and combinations of these characterise web...... looks very complex, and it will get even more complex. Therefore, there is a strong need for theories and models that can describe this complex web in the SDI and geo-communication, consisting of active components, passive components, users, and information, in order to make it possible to handle...

  5. Geo textiles and related products used in the waterproofing of reservoirs. Situation in Morocco

    International Nuclear Information System (INIS)

    Leiro Lopez, A.; Mateo Sanz, B.

    2015-01-01

    The aim of this paper is to describe the geo textiles, and products related to geo textiles, used for the building of water-storage reservoirs, which are applicable to the construction of this kind of structure in Morocco. It presents the different types of geo textiles and related products most commonly used in reservoirs, such as geo nets, geo grids, geo mats and geo composites, describing their characteristics and experimental methodology. Furthermore, drawing on the Spanish Manual for Design, Construction, Operation and Maintenance of Reservoirs, emphasis is placed on the functions that geo synthetics can perform, such as protection and filtering in the case of geo textiles, and drainage in the case of geo nets and draining composites. Finally, several works on this sort of structure located in Morocco are cited. (Author)

  6. Database for the degradation risk assessment of groundwater resources (Southern Italy)

    Science.gov (United States)

    Polemio, M.; Dragone, V.; Mitolo, D.

    2003-04-01

    The risk characterisation of the quality degradation and availability lowering of groundwater resources has been pursued for a wide coastal plain (Basilicata region, Southern Italy), an area covering 40 km along the Ionian Sea and 10 km inland. The quality degradation is due to two phenomena: pollution from the discharge of waste water (coming from urban areas) and salt pollution, related to seawater intrusion but not exclusively. The availability lowering is due to overexploitation and also to drought effects. To this purpose, the historical data of 1,130 wells have been collected. The wells, homogeneously distributed over the area, were the source of geological, stratigraphical, hydrogeological and geochemical data. In order to manage space-related information via a GIS, a database system has been devised to encompass all the surveyed wells and the body of information available per well. Geo-databases were designed to comprise the four types of data collected: a database including geometrical, geological and hydrogeological data on wells (WDB), a database devoted to chemical and physical data on groundwater (CDB), a database including the geotechnical parameters (GDB), and a database concerning piezometric and hydrological (rainfall, air temperature, river discharge) data (HDB). The record pertaining to each well is identified in these databases by the progressive number of the well itself. The databases are designed as follows: a) the WDB contains 1,158 records of 28 and 31 fields, mainly describing the geometry of the well and of the stratigraphy; b) the CDB encompasses data on the 157 wells for which chemical and physical analyses of groundwater have been carried out. More than one record has been associated with these 157 wells, due to periodic monitoring and analysis; c) the GDB covers 61 wells for which geotechnical parameters were obtained from soil samples taken at various depths; d) the HDB is designed to permit the analysis of long time series (from 1918) of piezometric
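The linkage described, four tables keyed by each well's progressive number (with the CDB holding several monitoring records per well), can be sketched with SQLite. The table and column names below are illustrative, not the project's actual schema.

```python
import sqlite3

# Four illustrative tables keyed by well_id, mirroring WDB/CDB/GDB/HDB.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE wdb (well_id INTEGER PRIMARY KEY, depth_m REAL, lithology TEXT);
CREATE TABLE cdb (well_id INTEGER REFERENCES wdb, sampled TEXT, chloride_mg_l REAL);
CREATE TABLE gdb (well_id INTEGER REFERENCES wdb, sample_depth_m REAL, parameter TEXT);
CREATE TABLE hdb (well_id INTEGER REFERENCES wdb, year INTEGER, head_m REAL);
""")
con.execute("INSERT INTO wdb VALUES (42, 120.0, 'sand')")
# Periodic monitoring: several CDB records may refer to the same well.
con.executemany("INSERT INTO cdb VALUES (?, ?, ?)",
                [(42, "1995-06-01", 250.0), (42, "2001-06-01", 480.0)])
rows = con.execute("""
    SELECT w.well_id, c.sampled, c.chloride_mg_l
    FROM wdb w JOIN cdb c USING (well_id) ORDER BY c.sampled
""").fetchall()
print(rows)  # the well's full monitoring history via one join on well_id
```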

  7. AtomDB: Expanding an Accessible and Accurate Atomic Database for X-ray Astronomy

    Science.gov (United States)

    Smith, Randall

    Since its inception in 2001, the AtomDB has become the standard repository of accurate and accessible atomic data for the X-ray astrophysics community, including laboratory astrophysicists, observers, and modelers. Modern calculations of collisional excitation rates now exist - and are in AtomDB - for all abundant ions in a hot plasma. AtomDB has expanded beyond providing just a collisional model, and now also contains photoionization data from XSTAR as well as a charge exchange model, amongst others. However, building and maintaining an accurate and complete database that can fully exploit the diagnostic potential of high-resolution X-ray spectra requires further work. The Hitomi results, sadly limited as they were, demonstrated the urgent need for the best possible wavelength and rate data, not merely for the strongest lines but for the diagnostic features that may have 1% or less of the flux of the strong lines. In particular, incorporation of weak but powerfully diagnostic satellite lines will be crucial to understanding the spectra expected from upcoming deep observations with Chandra and XMM-Newton, as well as the XARM and Athena satellites. Beyond incorporating this new data, a number of groups, both experimental and theoretical, have begun to produce data with errors and/or sensitivity estimates. We plan to use this to create statistically meaningful spectral errors on collisional plasmas, providing practical uncertainties together with model spectra. We propose to continue to (1) engage the X-ray astrophysics community regarding their issues and needs, notably by a critical comparison with other related databases and tools, (2) enhance AtomDB to incorporate a large number of satellite lines as well as updated wavelengths with error estimates, (3) continue to update the AtomDB with the latest calculations and laboratory measurements, in particular velocity-dependent charge exchange rates, and (4) enhance existing tools, and create new ones as needed to

  8. Hybridization of Environmental Microbial Community Nucleic Acids by GeoChip.

    Science.gov (United States)

    Van Nostrand, Joy D; Yin, Huaqin; Wu, Liyou; Yuan, Tong; Zhou, Jizhong

    2016-01-01

    Functional gene arrays, like the GeoChip, allow for the study of tens of thousands of genes in a single assay. The GeoChip array (5.0) contains probes for genes involved in geochemical cycling (N, C, S, and P), metal homeostasis, stress response, organic contaminant degradation, antibiotic resistance, secondary metabolism, and virulence factors as well as genes specific for fungi, protists, and viruses. Here, we briefly describe GeoChip design strategies (gene selection and probe design) and discuss minimum quantity and quality requirements for nucleic acids. We then provide detailed protocols for amplification, labeling, and hybridization of samples to the GeoChip.

  9. Spatial Data Infrastructure in the Perspective of Modern Geo-communication

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2006-01-01

-edge of communication-theories play important roles. The introduction of web-services as index-portals based on geo-information has changed the conditions for both content and form of geo-communication. A high number of players and interactions as well as a very high number of all kinds of information and combinations...... the increasing complexity. Modern web-based geo-communication and its infrastructure look very complex, and they will get even more complex! Therefore there is a strong need for theories and models that can describe this complex web in the SDI in the perspective of modern geo-communication....

  10. Coordinating Mobile Databases: A System Demonstration

    OpenAIRE

    Zaihrayeu, Ilya; Giunchiglia, Fausto

    2004-01-01

In this paper we present the Peer Database Management System (PDBMS). This system runs on top of a standard database management system and allows its database to be connected with other (peer) databases on the network. A particularity of our solution is that PDBMS allows conventional database technology to be effectively operational in mobile settings. We think of database mobility as a database network, where databases appear and disappear spontaneously and their network access point...

  11. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  12. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  13. Scopus database: a review.

    Science.gov (United States)

    Burnham, Judy F

    2006-03-08

The Scopus database provides access to STM journal articles and the references included in those articles, allowing the searcher to search both forward and backward in time. The database can be used for collection development as well as for research. This review provides information on the key points of the database and compares it to Web of Science. Neither database is all-inclusive; rather, the two complement each other. If a library can afford only one, the choice must be based on institutional needs.

  14. World Database of Happiness

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    1995-01-01

    textabstractABSTRACT The World Database of Happiness is an ongoing register of research on subjective appreciation of life. Its purpose is to make the wealth of scattered findings accessible, and to create a basis for further meta-analytic studies. The database involves four sections:
    1.

  15. MAGA, a new database of gas natural emissions: a collaborative web environment for collecting data.

    Science.gov (United States)

    Cardellini, Carlo; Chiodini, Giovanni; Frigeri, Alessandro; Bagnato, Emanuela; Frondini, Francesco; Aiuppa, Alessandro

    2014-05-01

The data on volcanic and non-volcanic gas emissions available online are, as of today, incomplete and, most importantly, fragmentary. Hence, there is a need for common frameworks to aggregate available data, in order to characterize and quantify the phenomena at various scales. A new and detailed web database (MAGA: MApping GAs emissions) has been developed, and recently improved, to collect data on carbon degassing from volcanic and non-volcanic environments. The MAGA database allows researchers to insert data interactively and dynamically into a spatially referenced relational database management system, as well as to extract data. MAGA kicked off with the database set-up and with the ingestion into the database of data from: i) a literature survey on publications on volcanic gas fluxes including data on active craters degassing, diffuse soil degassing and fumaroles both from dormant closed-conduit volcanoes (e.g., Vulcano, Phlegrean Fields, Santorini, Nysiros, Teide, etc.) and open-vent volcanoes (e.g., Etna, Stromboli, etc.) in the Mediterranean area and Azores, and ii) the revision and update of the Googas database on non-volcanic emissions of the Italian territory (Chiodini et al., 2008), in the framework of the Deep Earth Carbon Degassing (DECADE) research initiative of the Deep Carbon Observatory (DCO). For each geo-located gas emission site, the database holds images and descriptions of the site and of the emission type (e.g., diffuse emission, plume, fumarole, etc.), gas chemical-isotopic composition (when available), gas temperature and gas flux magnitudes. Gas sampling, analysis and flux measurement methods are also reported together with references and contacts to researchers expert on each site. In this phase data can be accessed on the network from a web interface, and data-driven web services, through which software clients can request data directly from the database, are planned to be implemented shortly. This way Geographical Information Systems (GIS) and
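The core of a system like the one the abstract describes is a spatially referenced relational table of geo-located emission sites that can be queried by region. The sketch below illustrates the pattern with SQLite; the schema, column names and flux values are illustrative assumptions, not MAGA's actual design.

```python
import sqlite3

# In-memory stand-in for a spatially referenced emission-site table.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE gas_emission_site (
    id   INTEGER PRIMARY KEY,
    name TEXT,
    emission_type TEXT,
    lat  REAL,   -- decimal degrees, WGS84 assumed
    lon  REAL,
    co2_flux_t_per_day REAL)""")

# Hypothetical example rows (coordinates approximate, fluxes illustrative).
sites = [
    (1, "Vulcano",   "fumarole",          38.40,  14.96,  450.0),
    (2, "Stromboli", "open-vent plume",   38.79,  15.21, 1000.0),
    (3, "Teide",     "diffuse degassing", 28.27, -16.64,  100.0),
]
conn.executemany("INSERT INTO gas_emission_site VALUES (?,?,?,?,?,?)", sites)

# A simple bounding-box extraction (southern Tyrrhenian window).
rows = conn.execute("""SELECT name FROM gas_emission_site
                       WHERE lat BETWEEN 36 AND 40
                         AND lon BETWEEN 10 AND 20
                       ORDER BY name""").fetchall()
print([r[0] for r in rows])
```

A real deployment would use a spatial extension (e.g. PostGIS-style geometry columns and spatial indexes) rather than plain lat/lon range predicates, but the relational shape is the same.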

  16. EuroGeoSurveys

    Science.gov (United States)

    Demicheli, L.; Ludden, J. N.; Robida, F.

    2012-04-01

information and advice, EGS runs a number of Expert Groups in areas such as Carbon Capture and Storage, Earth Observation, Geochemistry, Spatial Information, Marine Geology, Mineral Resources, Water Resources, GeoEnergy, Natural Hazards, Soils Resources, as well as International Cooperation and Development or Communication to improve on external relations, dissemination and outreach. The Expert Groups consist of a panel of leading scientists from the member organisations of EGS who meet on a regular basis and provide technical support to the Secretariat. Having built its reputation as the leading source of European geological expertise to the European Institutions, EGS is now looking to develop its reputation in the private sector as well as its public profile through the Communication Strategy 2010-2016. EGS's international profile, already consolidated through association with international geological organisations such as the International Union of Geological Sciences (IUGS) or as a participating organisation in the Global Earth Observation System of Systems (GEOSS), has recently gained momentum through participation in outstanding projects (such as OneGeology). Most notably, in 2010 agreements were signed for increased collaboration with the European Environment Agency (EEA) and the U.S. Geological Survey (USGS). Already consolidated EU priorities and emerging ones, such as those induced by globalization and the financial crisis, have opened a series of challenges for geosciences, forcing geological surveys to re-organise themselves. EGS is preparing to evolve again to deal even more successfully with those challenges. In this framework the cooperation with EPOS is being followed with much interest, as it is clear for EGS that only an open access data policy and the exploitation of synergies with other geoscientific bodies can reinforce our joint capacity to improve the security, health and wealth of European citizens.

  17. Development of geo-information data management system and application to geological disposal of high-level radioactive waste in China

    Directory of Open Access Journals (Sweden)

    Wang Peng

    2017-01-01

Full Text Available In this paper, based on information technology, a geo-information database was established and a geo-information data management system (named HLW-GIS) was developed to facilitate the management of the multi-source and multidisciplinary data which are generated during the site selection process for a geological repository in China. Many important functions, such as basic create, retrieve, update, and delete operations, full-text search and download functions, can be achieved through this management system. Statistics and analysis functions for certain professional data can also be provided. Finally, a few hundred gigabytes of data from numerous different disciplines were integrated, stored, and managed successfully. Meanwhile, the management system can also provide a significant reference for data management work in related research fields, such as decommissioning and management of nuclear facilities, resource prospecting and environmental protection.

  18. The Neotoma Paleoecology Database

    Science.gov (United States)

    Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.

    2015-12-01

    The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and for developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software-interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community

  19. GeoBus: sharing science research with schools

    Science.gov (United States)

    Roper, Kathryn; Robinson, Ruth; Moorhouse, Ben

    2016-04-01

GeoBus (www.geobus.org.uk) is an educational outreach project that was developed in 2012 by the Department of Earth and Environmental Sciences at the University of St Andrews, and it is currently sponsored by industry, NERC, The Crown Estate, and the Scottish Government. The aims of GeoBus are to support the teaching of Earth Science in secondary (middle and high) schools by providing teaching support to schools that have little or no experience in teaching this subject. This is, in part, done through the sharing of new science research outcomes and the experiences of young researchers with school pupils to provide a bridge between industry, higher education institutions, research councils and schools. Since its launch, over 40,000 pupils will have been involved in experiential Earth science learning activities in 190 different schools (over 400 separate visits) across the length and breadth of Scotland: many of these schools are in remote and disadvantaged regions. A new GeoBus project is under development within the Department of Earth Sciences at UCL in London. A key aim of GeoBus is to incorporate new research into our workshops, with the main challenge being the development of appropriate resources that incorporate the key learning aims and requirements of the science and geography curricula. GeoBus works closely with researchers, teachers and educational practitioners to tailor the research outcomes to the curricula as much as possible. Over the past four years, GeoBus has developed 17 workshops, 5 challenge events and extensive field trips, and each of these activities is trialled and evaluated within the university, and adjustments are made before the activities are delivered in schools. Activities are continually reviewed and further developments are made in response to both teacher and pupil feedback. This critical reflection of the project's success and impact is important to ensure a positive and significant contribution to the science learning in

  20. A note on the optimal pricing strategy in the discrete-time Geo/Geo/1 queuing system with sojourn time-dependent reward

    Directory of Open Access Journals (Sweden)

    Doo Ho Lee

    Full Text Available This work studies the optimal pricing strategy in a discrete-time Geo/Geo/1 queuing system under the sojourn time-dependent reward. We consider two types of pricing schemes. The first one is called the ex-post payment scheme where the server charges a price that is proportional to the time a customer spends in the system, and the second one is called ex-ante payment scheme where the server charges a flat price for all services. In each pricing scheme, a departing customer receives the reward that is inversely proportional to his/her sojourn time. The server should make the optimal pricing decisions in order to maximize its expected profits per time unit in each pricing scheme. This work also investigates customer's equilibrium joining or balking behavior under server's optimal pricing strategy. Numerical experiments are also conducted to validate our analysis. Keywords: Optimal pricing, Equilibrium behavior, Geo/Geo/1 queue, Sojourn time-dependent reward
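The pricing comparison in the abstract above can be made concrete with a small simulation: in a discrete-time Geo/Geo/1 queue, arrivals occur each slot with probability p and the customer in service completes with probability q, so sojourn times can be sampled and the two revenue schemes compared. This is only an illustrative sketch; the parameter values (p, q, the per-slot rate r and flat price f) are hypothetical, and the paper's actual model also includes the reward and equilibrium analysis, which are omitted here.

```python
import random

def simulate_geo_geo_1(p, q, slots=200_000, seed=1):
    """Simulate a simplified discrete-time Geo/Geo/1 queue: each slot,
    the customer in service departs with probability q, then a new
    customer arrives with probability p (geometric interarrival and
    service times). Returns the sojourn times (in slots) of departures."""
    rng = random.Random(seed)
    queue = []        # arrival slot of each waiting / in-service customer
    sojourns = []
    for t in range(slots):
        if queue and rng.random() < q:       # service completion
            sojourns.append(t - queue.pop(0) + 1)
        if rng.random() < p:                 # Bernoulli arrival
            queue.append(t)
    return sojourns

# Stable regime requires p < q.
soj = simulate_geo_geo_1(p=0.3, q=0.5)
mean_w = sum(soj) / len(soj)

# Revenue per customer under the two schemes described in the abstract:
# ex-post charges r per slot of sojourn; ex-ante charges a flat price f.
r, f = 0.2, 1.0                 # hypothetical prices
ex_post_revenue = r * mean_w
ex_ante_revenue = f
```

Under the ex-post scheme the server's revenue grows with congestion, which is exactly why the customer's sojourn-time-dependent reward creates the joining-or-balking trade-off the paper analyzes.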

  1. Brain Tumor Database, a free relational database for collection and analysis of brain tumor patient information.

    Science.gov (United States)

    Bergamino, Maurizio; Hamilton, David J; Castelletti, Lara; Barletta, Laura; Castellan, Lucio

    2015-03-01

    In this study, we describe the development and utilization of a relational database designed to manage the clinical and radiological data of patients with brain tumors. The Brain Tumor Database was implemented using MySQL v.5.0, while the graphical user interface was created using PHP and HTML, thus making it easily accessible through a web browser. This web-based approach allows for multiple institutions to potentially access the database. The BT Database can record brain tumor patient information (e.g. clinical features, anatomical attributes, and radiological characteristics) and be used for clinical and research purposes. Analytic tools to automatically generate statistics and different plots are provided. The BT Database is a free and powerful user-friendly tool with a wide range of possible clinical and research applications in neurology and neurosurgery. The BT Database graphical user interface source code and manual are freely available at http://tumorsdatabase.altervista.org. © The Author(s) 2013.

  2. Legacy2Drupal: Conversion of an existing relational oceanographic database to a Drupal 7 CMS

    Science.gov (United States)

    Work, T. T.; Maffei, A. R.; Chandler, C. L.; Groman, R. C.

    2011-12-01

Content Management Systems (CMSs) such as Drupal provide powerful features that can be of use to oceanographic (and other geo-science) data managers. However, in many instances, geo-science data management offices have already designed and implemented customized schemas for their metadata. The NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) has ported an existing relational database containing oceanographic metadata, along with an existing interface coded in Cold Fusion middleware, to a Drupal 7 Content Management System. This is an update on an effort described as a proof-of-concept in poster IN21B-1051, presented at AGU2009. The BCO-DMO project has translated all the existing database tables, input forms, website reports, and other features present in the existing system into Drupal CMS features. The replacement features are made possible by the use of Drupal content types, CCK node-reference fields, a custom theme, and a number of other supporting modules. This presentation describes the process used to migrate content in the original BCO-DMO metadata database to Drupal 7, some problems encountered during migration, and the modules used to migrate the content successfully. Strategic use of Drupal 7 CMS features that enable three separate but complementary interfaces to provide access to oceanographic research metadata will also be covered: 1) a Drupal 7-powered user front-end; 2) REST-ful JSON web services (providing a MapServer interface to the metadata and data); and 3) a SPARQL interface to a semantic representation of the repository metadata (this feeding a new faceted search capability currently under development). The existing BCO-DMO ontology, developed in collaboration with Rensselaer Polytechnic Institute's Tetherless World Constellation, makes strategic use of pre-existing ontologies and will be used to drive semantically-enabled faceted search capabilities planned for the site. At this point, the use of semantic

  3. Balancing geo-privacy and spatial patterns in epidemiological studies

    Directory of Open Access Journals (Sweden)

    Chien-Chou Chen

    2017-11-01

Full Text Available To balance the protection of geo-privacy and the accuracy of spatial patterns, we developed a geo-spatial tool (GeoMasker) intended to mask the residential locations of patients or cases in a geographic information system (GIS). To elucidate the effects of geo-masking parameters, we applied 2010 dengue epidemic data from Taiwan to test the tool's performance in an empirical situation. The similarity of pre- and post-masking spatial patterns was measured by D statistics under a 95% confidence interval. In the empirical study, different magnitudes of anonymisation (estimated K-anonymity ≥ 10 and ≥ 100) were achieved and different degrees of agreement between the pre- and post-masking patterns were evaluated. The application is beneficial for public health workers and researchers when processing data with individuals' spatial information.
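A common family of geo-masking techniques displaces each residential point by a random distance within a bounded annulus ("donut" masking), which is what makes the pre/post pattern comparison above meaningful. The sketch below shows the basic geometry; it is an illustrative example of donut masking under an equirectangular approximation, not the actual GeoMasker algorithm, and the coordinates are hypothetical.

```python
import math
import random

def donut_mask(lat, lon, r_min_m, r_max_m, rng=random):
    """Displace (lat, lon) by a uniform random distance in
    [r_min_m, r_max_m] metres in a uniform random direction.
    Uses ~111,320 m per degree of latitude (spherical approximation)."""
    theta = rng.uniform(0.0, 2.0 * math.pi)
    dist = rng.uniform(r_min_m, r_max_m)
    dlat = (dist * math.cos(theta)) / 111_320
    dlon = (dist * math.sin(theta)) / (111_320 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

random.seed(42)
orig = (25.04, 121.56)                       # hypothetical point near Taipei
masked = donut_mask(orig[0], orig[1], r_min_m=100, r_max_m=500)
```

The inner radius guards against trivially re-identifying the true address; the outer radius bounds the distortion of the spatial pattern. K-anonymity-driven masking (as in the abstract) instead scales the displacement from the local population density, so sparse areas get larger displacements.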

  4. Database in Artificial Intelligence.

    Science.gov (United States)

    Wilkinson, Julia

    1986-01-01

    Describes a specialist bibliographic database of literature in the field of artificial intelligence created by the Turing Institute (Glasgow, Scotland) using the BRS/Search information retrieval software. The subscription method for end-users--i.e., annual fee entitles user to unlimited access to database, document provision, and printed awareness…

  5. Pro iOS Geo building apps with location based services

    CERN Document Server

    Andreucci, Giacomo

    2013-01-01

    Deepen your app development skills with Pro iOS Geo. This book shows you how to use geolocation-based tools to enhance the iOS apps you develop. Author Giacomo Andreucci describes different ways to integrate geo services, depending on the kind of app you're looking to develop: a web app, a hybrid app, or a native app. You'll discover how to use the Google Maps API features to integrate powerful geo capabilities in your apps with a little effort. You'll learn how to: Design geographic features for your apps while respecting usability criteria Design touristic geo apps Use HTML5 and the Google M

  6. Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment

    Science.gov (United States)

    Lee, Meemong; Bowman, Kevin

    2014-01-01

Geostationary Coastal and Air pollution Events (GEO-CAPE) is a NASA decadal survey mission to be designed to provide surface reflectance at high spectral, spatial, and temporal resolutions from a geostationary orbit necessary for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to the global and regional NOx emissions and improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from the GEO-CAPE OSSE framework for seven observation scenarios and three instrument systems.

  7. Database Optimizing Services

    Directory of Open Access Journals (Sweden)

    Adrian GHENCEA

    2010-12-01

Full Text Available Almost every organization has at its centre a database. The database provides support for conducting different activities, whether it is production, sales and marketing or internal operations. Every day, a database is accessed for help in strategic decisions. Meeting such needs therefore entails high-quality security and availability. Those needs can be realised using a DBMS (Database Management System), which is, in fact, the software for a database. Technically speaking, it is software which uses a standard method of cataloguing and recovering data and running different data queries. A DBMS manages the input data, organizes it, and provides ways of modifying or extracting the data for its users or other programs. Managing the database is an operation that requires periodic updates, optimization and monitoring.

  8. RTDB: A memory resident real-time object database

    International Nuclear Information System (INIS)

    Nogiec, Jerzy M.; Desavouret, Eugene

    2003-01-01

    RTDB is a fast, memory-resident object database with built-in support for distribution. It constitutes an attractive alternative for architecting real-time solutions with multiple, possibly distributed, processes or agents sharing data. RTDB offers both direct and navigational access to stored objects, with local and remote random access by object identifiers, and immediate direct access via object indices. The database supports transparent access to objects stored in multiple collaborating dispersed databases and includes a built-in cache mechanism that allows for keeping local copies of remote objects, with specifiable invalidation deadlines. Additional features of RTDB include a trigger mechanism on objects that allows for issuing events or activating handlers when objects are accessed or modified and a very fast, attribute based search/query mechanism. The overall architecture and application of RTDB in a control and monitoring system is presented
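The three access paths the RTDB abstract names (random access by object identifier, attribute-based search, and triggers fired on modification) can be sketched in a few dozen lines. The class below is an illustrative toy, not the RTDB API; the device names and attributes are hypothetical, and it omits RTDB's distribution, caching, and invalidation-deadline features.

```python
class MiniObjectDB:
    """Tiny in-memory object store modelled loosely on the features the
    RTDB abstract describes: get-by-id, an attribute index for fast
    queries, and modification triggers. Updates do not de-index old
    attribute values in this simplified sketch."""

    def __init__(self):
        self._objects = {}    # oid -> dict of attributes
        self._index = {}      # (attr, value) -> set of oids
        self._triggers = []   # callbacks fired on every put()

    def on_modify(self, callback):
        """Register a handler invoked when an object is stored/updated."""
        self._triggers.append(callback)

    def put(self, oid, **attrs):
        self._objects.setdefault(oid, {}).update(attrs)
        for key, value in attrs.items():
            self._index.setdefault((key, value), set()).add(oid)
        for callback in self._triggers:
            callback(oid, attrs)

    def get(self, oid):
        """Direct random access by object identifier."""
        return self._objects.get(oid)

    def query(self, attr, value):
        """Attribute-based search via the index (no table scan)."""
        return sorted(self._index.get((attr, value), ()))

events = []
db = MiniObjectDB()
db.on_modify(lambda oid, attrs: events.append(oid))   # trigger mechanism
db.put("magnet.01", current=120.5, status="ok")
db.put("magnet.02", current=118.0, status="ok")
```

In a monitoring context like the one the abstract mentions, the trigger is what turns a passive store into an event source: a handler can raise an alarm the moment a stored object's attributes change.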

  9. Software Engineering Laboratory (SEL) database organization and user's guide

    Science.gov (United States)

    So, Maria; Heller, Gerard; Steinberg, Sandra; Spiegel, Douglas

    1989-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base tables is described. In addition, techniques for accessing the database, through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL), are discussed.
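The distinction the SEL guide draws between base tables and views, with SQL as one access route, follows the standard relational pattern: raw records live in normalized base tables, and views map them into the shapes analysts query. The snippet below illustrates that pattern with SQLite; the table, column, and project names are hypothetical, not the actual SEL schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Base tables hold the raw records (hypothetical stand-ins for SEL data).
conn.execute("CREATE TABLE project (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE effort (project_id INTEGER, hours REAL)")
conn.execute("INSERT INTO project VALUES (1, 'FLIGHT_SW')")
conn.executemany("INSERT INTO effort VALUES (?, ?)", [(1, 40.0), (1, 12.5)])

# A view presents the base tables in the shape a user actually queries.
conn.execute("""CREATE VIEW project_effort AS
    SELECT p.name, SUM(e.hours) AS total_hours
    FROM project p JOIN effort e ON e.project_id = p.id
    GROUP BY p.name""")

row = conn.execute("SELECT name, total_hours FROM project_effort").fetchone()
```

Because views are defined over the base tables rather than copied from them, a mapping like the one the guide describes stays current automatically as new records are loaded.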

  10. Application of geo-information science methods in ecotourism exploitation

    Science.gov (United States)

    Dong, Suocheng; Hou, Xiaoli

    2004-11-01

Application of geo-information science methods in ecotourism development is discussed in this article. Since the 1990s, geo-information science methods, which take the 3S technologies (Geographic Information System, Global Positioning System, and Remote Sensing) as core techniques, have played an important role in resources reconnaissance, data management, environment monitoring, and regional planning. Geo-information science methods can easily analyze and convert geographic spatial data. The application of 3S methods is helpful to sustainable development in tourism. Various assignments are involved in the development of ecotourism, such as reconnaissance of ecotourism resources, drawing of tourism maps, dealing with mass data, and also tourism information inquiry, employee management, and quality management of products. The utilization of geo-information methods in ecotourism can make the development more efficient by promoting the sustainable development of tourism and the protection of the eco-environment.

  11. 3D GEO-INFORMATION REQUIREMENTS FOR DISASTER AND EMERGENCY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    E. Demir Ozbek

    2016-06-01

Full Text Available A conceptual approach is proposed to define 3D geo-information requirements for different types of disasters. This approach includes components such as Disaster Type-Sector-Actor-Process-Activity-Task-Data. According to disaster types, the processes, activities, tasks, sectors, and responsible and operational actors are derived. Based on the tasks, the needed level of detail for the 3D geo-information model is determined. The levels of detail are compliant with the 3D international standard CityGML. After a brief introduction on the disaster phases and the geo-information required by actors to perform the tasks, the paper discusses the current situation of disaster and emergency management in Turkey and elaborates on the components of the conceptual approach. This paper discusses the 3D geo-information requirements for the tasks to be used in the framework of a 3D geo-information model for the Disaster and Emergency Management System in Turkey. The framework is demonstrated for an industrial fire case in Turkey.

  12. Supply Chain Initiatives Database

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-11-01

    The Supply Chain Initiatives Database (SCID) presents innovative approaches to engaging industrial suppliers in efforts to save energy, increase productivity and improve environmental performance. This comprehensive and freely-accessible database was developed by the Institute for Industrial Productivity (IIP). IIP acknowledges Ecofys for their valuable contributions. The database contains case studies searchable according to the types of activities buyers are undertaking to motivate suppliers, target sector, organization leading the initiative, and program or partnership linkages.

  13. The IRPVM-DB database

    International Nuclear Information System (INIS)

    Davies, L.M.; Gillemot, F.; Yanko, L.; Lyssakov, V.

    1997-01-01

The IRPVM-DB (International Reactor Pressure Vessel Material Database) initiated by the IAEA IWG LMNPP is going to collect the available surveillance and research data world-wide on RPV material ageing. This paper presents the purpose of the database; it summarizes the type and the relationship of data included; it gives information about the data access and protection; and finally, it summarizes the state of the art of the database. (author). 1 ref., 2 figs.

  14. The IRPVM-DB database

    Energy Technology Data Exchange (ETDEWEB)

    Davies, L M [Davies Consultants, Oxford (United Kingdom); Gillemot, F [Atomic Energy Research Inst., Budapest (Hungary); Yanko, L [Minatom (Russian Federation); Lyssakov, V [International Atomic Energy Agency, Vienna (Austria)

    1997-09-01

The IRPVM-DB (International Reactor Pressure Vessel Material Database) initiated by the IAEA IWG LMNPP is going to collect the available surveillance and research data world-wide on RPV material ageing. This paper presents the purpose of the database; it summarizes the type and the relationship of data included; it gives information about the data access and protection; and finally, it summarizes the state of the art of the database. (author). 1 ref., 2 figs.

  15. Facilitating Geoscience Education in Higher-Education Institutes Worldwide With GeoBrain -- An Online Learning and Research Environment for Classroom Innovations

    Science.gov (United States)

    Deng, M.; di, L.

    2006-12-01

Higher education in geosciences has pressing goals: to prepare students with modern geoscience knowledge and skills to meet the increased demand for trained professionals to work on the big challenges faced by geoscience disciplines, such as global environmental change, world energy supplies, sustainable development, etc. In order to reach this goal, geoscience education in post-secondary institutes worldwide has to attract and retain enough students and to train students with the knowledge and skills needed by society. Classroom innovations that can encourage and support student investigations and research activities are key motivation mechanisms that help to reach the goal. This presentation describes the use of GeoBrain, an innovative geospatial knowledge system, as a powerful educational tool for motivating and facilitating innovative undergraduate and graduate teaching and research in geosciences. Developed in a NASA-funded project, the GeoBrain system has adopted and implemented the latest Web services and knowledge management technologies to provide innovative methods for publishing, accessing, visualizing, and analyzing geospatial data and for building and sharing geoscience knowledge. It provides a data-rich online learning and research environment enabled by the wealth of data and information available in the NASA Earth Observing System (EOS) Data and Information System (EOSDIS). Students, faculty members, and researchers from institutes worldwide can easily access, analyze, and model with the huge amount of NASA EOS data as if they possessed such vast resources locally at their desktops. The online environment provided by GeoBrain has brought significant positive changes to geoscience education in higher-education institutes because of its new concepts and technologies, motivation mechanisms, free exploration resources, and advanced geo-processing capabilities. With the system, the used-to-be very challenging or even impossible teaching tasks has

  16. Programming database tools for the casual user

    International Nuclear Information System (INIS)

    Katz, R.A.; Griffiths, C.

    1990-01-01

    The AGS Distributed Control System (AGSDCS) uses a relational database management system (INTERBASE) for the storage of all data associated with the control of the particle accelerator complex. This includes the static data which describes the component devices of the complex, as well as data for application program startup and data records that are used in analysis. Due to licensing constraints, it was necessary to develop tools to allow programs requiring database access to be unconcerned with whether or not they were running on a licensed node. An in-house database server program was written, using Apollo mailbox communication protocols, allowing application programs to access the INTERBASE database via calls to this server. Initially, the tools used by the server to actually access the database were written using the GDML C host-language interface. Through an evolutionary learning process these tools have been converted to Dynamic SQL. Additionally, these tools have been extracted from the exclusive province of the database server and placed in their own library. This enables application programs to use these same tools on a licensed node without using the database server and without having to modify the application code. The syntax of the C calls remains the same

  17. Strategies GeoCape Intelligent Observation Studies @ GSFC

    Science.gov (United States)

    Cappelaere, Pat; Frye, Stu; Moe, Karen; Mandl, Dan; LeMoigne, Jacqueline; Flatley, Tom; Geist, Alessandro

    2015-01-01

    This presentation provides a summary of the tradeoff studies conducted for GeoCape by the GSFC team on how to optimize GeoCape observation efficiency. Tradeoffs include total ground scheduling with simple priorities, ground scheduling with cloud forecast, ground scheduling with sub-area forecast, onboard scheduling with onboard cloud detection, and smart onboard scheduling with onboard image processing. The tradeoffs considered optimizing cost, downlink bandwidth, and the total number of images acquired.

  18. Radiation resistance of GeO2-doped silica core optical fibers

    International Nuclear Information System (INIS)

    Shibata, Shuichi; Nakahara, Motohiro; Omori, Yasuharu

    1985-01-01

    Effects of halogen addition to silica glass on the loss in optical fibers are examined by using halogen-free, chlorine-containing and fluorine-containing GeO2-doped silica core optical fibers. Measurements are made of the dependence of induced loss in these optical fibers on various factors such as wavelength and total dose of gamma radiation, as well as GeO2 content. Ultraviolet absorption spectra are also observed. In addition, effects of halogens added to pure silica fibers are considered on the basis of Raman spectra of three different optical fibers (pure, F-doped, and F- and GeO2-codoped silica core). It is concluded that (1) addition of halogens (F and Cl) serves to decrease GeO defects and Ge(3) defects in GeO2-doped silica optical fibers; (2) addition of halogens suppresses the increase in loss in GeO2-doped silica optical fibers induced by gamma radiation; and (3) there are close relations between the increase in loss induced by gamma radiation and defects originally existing in the fibers. Effects of halogens added to GeO2-doped and pure silica optical fibers can be explained on the basis of the latter relations. (Nogami, K.)

  19. The Features of Geo-Ecological Assessment within the Geo-Eco-Socio-Economic Approach to the Development of Northern Territories

    Directory of Open Access Journals (Sweden)

    Aleksander Ivanovich Semyachkov

    2015-12-01

    Full Text Available In modern conditions, for the purpose of preserving a territory's ecosystem when it is involved in economic circulation, it is necessary to carry out an anticipatory geo-ecological assessment indicating the degree of resistance to hypothetical anthropogenic influence. The existing methodological approaches to geo-ecological assessment are unified and can often be applied equally to various types of territories. A new methodical approach to geo-ecological assessment is brought forth in the article. It takes into account the specific character of the Ural region's northern territories. The approach is based on a point assessment of the territory, which is explained by its large area; moreover, the point assessment is proposed to be carried out before the development of the territory. This approach makes it possible to consider the specific features of the territory's ecosystem, namely its ability for self-restoration and self-cleaning during economic development and after it. It allows choosing the direction of economic activity as a whole while satisfying the condition of minimizing the damage from disturbing the territory's ecosystem and preserving its resource potential. The research results can be utilized in the studies of experts and students working on the geo-ecological assessment of territory

  20. Utilizing Free and Open Source Software to access, view and compare in situ observations, EO products and model output data

    Science.gov (United States)

    Vines, Aleksander; Hamre, Torill; Lygre, Kjetil

    2014-05-01

    The GreenSeas project (Development of global plankton data base and model system for eco-climate early warning) aims to advance the knowledge and predictive capacities of how marine ecosystems will respond to global change. A main task has been to set up a data delivery and monitoring core service following the open and free data access policy implemented in the Global Monitoring for Environment and Security (GMES) programme. The aim is to ensure open and free access to historical plankton data, new data (EO products and in situ measurements), model data (including estimates of simulation error) and biological, environmental and climatic indicators for a range of stakeholders, such as scientists, policy makers and environmental managers. To this end, we have developed a geo-spatial database of both historical and new in situ physical, biological and chemical parameters for the Southern Ocean, Atlantic, Nordic Seas and the Arctic, and organized related satellite-derived quantities and model forecasts in a joint geo-spatial repository. For easy access to these data, we have implemented a web-based GIS (Geographical Information System) where observed, derived and forecast parameters can be searched, displayed, compared and exported. Model forecasts can also be uploaded dynamically to the system, to allow modelers to quickly compare their results with available in situ and satellite observations. We have implemented the web-based GIS system using free and open source technologies: Thredds Data Server, ncWMS, GeoServer, OpenLayers, PostGIS, Liferay, Apache Tomcat, PRTree, NetCDF-Java, json-simple, Geotoolkit, Highcharts, GeoExt, MapFish, FileSaver, jQuery, jstree and qUnit. We also wanted to use open standards to communicate between the different services, and we use WMS, WFS, netCDF, GML, OPeNDAP, JSON, and SLD.
    The main advantage we got from using FOSS was that we did not have to reinvent the wheel, but could use
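The open standards listed in this record include OGC WMS, which the portal uses for map display. As a minimal illustration of that kind of standard, the sketch below builds a WMS 1.1.1 GetMap request URL; the server address and layer name are placeholders, not the actual GreenSeas endpoints.

```python
# Building an OGC WMS 1.1.1 GetMap URL with the request's required
# parameters. Endpoint and layer name are illustrative placeholders.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600):
    """Return a GetMap URL for `layer` over `bbox` = (minx, miny, maxx, maxy)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": "EPSG:4326",  # lon/lat bounding box in WMS 1.1.1
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "chlorophyll",
                     (-30.0, 60.0, 10.0, 80.0))
print(url)
```

A GIS client such as OpenLayers issues requests of exactly this shape behind the scenes; keeping to the standard is what lets the portal swap map servers without changing clients.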

  1. OLIO+: an osteopathic medicine database.

    Science.gov (United States)

    Woods, S E

    1991-01-01

    OLIO+ is a bibliographic database designed to meet the information needs of the osteopathic medical community. Produced by the American Osteopathic Association (AOA), OLIO+ is devoted exclusively to the osteopathic literature. The database is available only by subscription through AOA and may be accessed from any data terminal with modem or IBM-compatible personal computer with telecommunications software that can emulate VT100 or VT220. Apple access is also available, but some assistance from OLIO+ support staff may be necessary to modify the Apple keyboard.

  2. Influence of inert fillers on shrinkage cracking of meta-kaolin geo-polymers

    International Nuclear Information System (INIS)

    Kuenzel, C.; Boccaccini, A.R.

    2012-01-01

    Geo-polymers contain a network of tetrahedrally coordinated aluminate and silicate units, and are potential materials to immobilize/encapsulate nuclear wastes. They can exhibit shrinkage cracking when water is removed by drying, and in order to use geo-polymers for waste encapsulation this effect needs to be investigated and controlled. In this study, six different fillers were mixed with meta-kaolin and sodium silicate solution at high pH to form geo-polymers, and the influence of filler addition on mechanical properties was determined. The fillers used were Fe2O3, Al2O3, CaCO3, sand, glass and rubber; these do not react during the geo-polymerisation reactions. Geo-polymers were prepared containing 30 weight percent of filler. The mechanical properties of the geo-polymers were influenced by the type of filler, with low-density fillers increasing mortar viscosity. Geo-polymer samples containing fine filler particles exhibited shrinkage cracking on drying. This was not observed when coarser particles were added, and these samples also had significantly improved mechanical properties. (authors)

  3. Race and time from diagnosis to radical prostatectomy: does equal access mean equal timely access to the operating room?--Results from the SEARCH database.

    Science.gov (United States)

    Bañez, Lionel L; Terris, Martha K; Aronson, William J; Presti, Joseph C; Kane, Christopher J; Amling, Christopher L; Freedland, Stephen J

    2009-04-01

    African American men with prostate cancer are at higher risk for cancer-specific death than Caucasian men. We determine whether significant delays in management contribute to this disparity. We hypothesize that in an equal-access health care system, time interval from diagnosis to treatment would not differ by race. We identified 1,532 African American and Caucasian men who underwent radical prostatectomy (RP) from 1988 to 2007 at one of four Veterans Affairs Medical Centers that comprise the Shared Equal-Access Regional Cancer Hospital (SEARCH) database with known biopsy date. We compared time from biopsy to RP between racial groups using linear regression adjusting for demographic and clinical variables. We analyzed risk of potential clinically relevant delays by determining odds of delays >90 and >180 days. Median time interval from diagnosis to RP was 76 and 68 days for African Americans and Caucasian men, respectively (P = 0.004). After controlling for demographic and clinical variables, race was not associated with the time interval between diagnosis and RP (P = 0.09). Furthermore, race was not associated with increased risk of delays >90 (P = 0.45) or >180 days (P = 0.31). In a cohort of men undergoing RP in an equal-access setting, there was no significant difference between racial groups with regard to time interval from diagnosis to RP. Thus, equal-access includes equal timely access to the operating room. Given our previous finding of poorer outcomes among African Americans, treatment delays do not seem to explain these observations. Our findings need to be confirmed in patients electing other treatment modalities and in other practice settings.

  4. Oxygen transport and GeO2 stability during thermal oxidation of Ge

    Science.gov (United States)

    da Silva, S. R. M.; Rolim, G. K.; Soares, G. V.; Baumvol, I. J. R.; Krug, C.; Miotti, L.; Freire, F. L.; da Costa, M. E. H. M.; Radtke, C.

    2012-05-01

    Oxygen transport during thermal oxidation of Ge and desorption of the formed Ge oxide are investigated. Higher oxidation temperatures and lower oxygen pressures promote GeO desorption. An appreciable fraction of oxidized Ge desorbs during the growth of a GeO2 layer. The interplay between oxygen desorption and incorporation results in the exchange of O originally present in GeO2 by O from the gas phase throughout the oxide layer. This process is mediated by O vacancies generated at the GeO2/Ge interface. The formation of a substoichiometric oxide is shown to have direct relation with the GeO desorption.

  5. A service-oriented data access control model

    Science.gov (United States)

    Meng, Wei; Li, Fengmin; Pan, Juchen; Song, Song; Bian, Jiali

    2017-01-01

    The development of mobile computing, cloud computing and distributed computing meets growing individual service needs. Faced with complex application systems, ensuring real-time, dynamic, and fine-grained data access control is an urgent problem. By analyzing common data access control models, and building on the mandatory access control model, the paper proposes a service-oriented access control model. By regarding system services as subjects and database data as objects, the model defines access levels and access identifications for subjects and objects, and ensures that system services access databases securely.
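The model described above can be sketched in a few lines: services act as subjects, data items as objects, each carrying a level and an identification label, with a mandatory-style check before access. The class and rule below are a hypothetical illustration of that idea, not the paper's actual implementation.

```python
# Toy sketch of a service-oriented, mandatory-style access check:
# a subject (system service) may read an object (database item) only if
# its level dominates the object's level AND their labels intersect.
from dataclasses import dataclass

@dataclass(frozen=True)
class Subject:          # a system service
    name: str
    level: int          # clearance level
    labels: frozenset   # access identification

@dataclass(frozen=True)
class Obj:              # a data item in the database
    name: str
    level: int          # sensitivity level
    labels: frozenset

def may_read(subject: Subject, obj: Obj) -> bool:
    """Read-down rule plus label overlap, as in mandatory access control."""
    return subject.level >= obj.level and bool(subject.labels & obj.labels)

billing = Subject("billing-service", level=2, labels=frozenset({"finance"}))
ledger = Obj("ledger", level=2, labels=frozenset({"finance"}))
hr_data = Obj("employees", level=3, labels=frozenset({"hr"}))

print(may_read(billing, ledger))   # True: same level, shared label
print(may_read(billing, hr_data))  # False: object level too high
```

Evaluating the rule per request is what makes the control dynamic and fine-grained: changing a service's level or labels immediately changes what it may touch, without rewriting the services themselves.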

  6. Ensuring consistency and persistence to the Quality Information Model - The role of the GeoViQua Broker

    Science.gov (United States)

    Bigagli, Lorenzo; Papeschi, Fabrizio; Nativi, Stefano; Bastin, Lucy; Masó, Joan

    2013-04-01

    Only a few products are annotated with their PID; recent studies show that, of a total of about 100,000 Clearinghouse products, only 37 have the Product Identifier. Furthermore, the association should be persistent within the GeoViQua scope. The GeoViQua architecture is built on the brokering approach successfully experimented with in the EuroGEOSS project and realized by the GEO DAB (Discovery and Access Broker). Part of the GEOSS Common Infrastructure (GCI), the GEO DAB allows for harmonization and distribution in a way that is transparent for both users and data providers. In this way, GeoViQua can effectively complement and extend the GEO DAB, obtaining a quality-augmentation broker (the GeoViQua Broker) which plays a central role in ensuring the consistency of the Producer and User quality models. This work focuses on the typical use case in which the GeoViQua Broker performs data discovery from different data providers, then integrates the producer quality report and the feedback given by users into the Quality Information Model. In particular, this work highlights the problems faced by the GeoViQua Broker and the techniques adopted to ensure consistency and persistence also for quality reports whose target products are not annotated with a PID. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 265178.

  7. Formulation of caesium based and caesium containing geo-polymers

    Energy Technology Data Exchange (ETDEWEB)

    Berger, S.; Joussot-Dubien, C.; Frizon, F. [CEA Valrho, Dir. de l' Energie Nucleaire, DEN, Decontamination and Conditioning Department, DEN/DTCD/SPDE/L2ED, 30 - Marcoule (France)

    2009-10-15

    Cement encapsulation is widely used as a low- and intermediate level radioactive waste immobilisation process. Among these wastes, caesium ions are poorly immobilised by Portland cement based materials. This work consists of an experimental investigation into the ability of geo-polymers to effectively encapsulate this chemical species and to determine the impact of caesium incorporation on the geo-polymer properties. Geo-polymers were synthesised with several compositions based on the activation of metakaolin with an alkali hydroxide solution containing caesium. The setting time, mineralogy, porosity and mechanical properties of the samples were examined for one month. Leach tests were conducted during the same period to determine the immobilisation efficiency. The results depend to a large extent on the composition of the activation solution in terms of soluble silica content and alkali used. These parameters determine both the degree of condensation and the geo-polymer composition. (authors)

  8. Formulation of caesium based and caesium containing geo-polymers

    International Nuclear Information System (INIS)

    Berger, S.; Joussot-Dubien, C.; Frizon, F.

    2009-01-01

    Cement encapsulation is widely used as a low- and intermediate level radioactive waste immobilisation process. Among these wastes, caesium ions are poorly immobilised by Portland cement based materials. This work consists of an experimental investigation into the ability of geo-polymers to effectively encapsulate this chemical species and to determine the impact of caesium incorporation on the geo-polymer properties. Geo-polymers were synthesised with several compositions based on the activation of metakaolin with an alkali hydroxide solution containing caesium. The setting time, mineralogy, porosity and mechanical properties of the samples were examined for one month. Leach tests were conducted during the same period to determine the immobilisation efficiency. The results depend to a large extent on the composition of the activation solution in terms of soluble silica content and alkali used. These parameters determine both the degree of condensation and the geo-polymer composition. (authors)

  9. Analysis of CO in the tropical troposphere using Aura satellite data and the GEOS-Chem model: insights into transport characteristics of the GEOS meteorological products

    Directory of Open Access Journals (Sweden)

    Junhua Liu

    2010-12-01

    Full Text Available We use the GEOS-Chem chemistry-transport model (CTM to interpret the spatial and temporal variations of tropical tropospheric CO observed by the Microwave Limb Sounder (MLS and the Tropospheric Emission Spectrometer (TES. In so doing, we diagnose and evaluate transport in the GEOS-4 and GEOS-5 assimilated meteorological fields that drive the model, with a particular focus on vertical mixing at the end of the dry season when convection moves over the source regions. The results indicate that over South America, deep convection in both GEOS-4 and GEOS-5 decays at too low an altitude early in the wet season, and the source of CO from isoprene in the model (MEGAN v2.1 is too large, causing a lag in the model's seasonal maximum of CO compared to MLS CO in the upper troposphere (UT. TES and MLS data reveal problems with excessive transport of CO to the eastern equatorial Pacific and lofting in the ITCZ in August and September, particularly in GEOS-4. Over southern Africa, GEOS-4 and GEOS-5 simulations match the phase of the observed CO variation from the lower troposphere (LT to the UT fairly well, although the magnitude of the seasonal maximum is underestimated considerably due to low emissions in the model. A sensitivity run with increased emissions leads to improved agreement with observed CO in the LT and middle troposphere (MT, but the amplitude of the seasonal variation is too high in the UT in GEOS-4. Difficulty in matching CO in the LT and UT implies there may be overly vigorous vertical mixing in GEOS-4 early in the wet season. Both simulations and observations show a time lag between the peak in fire emissions (July and August and in CO (September and October. We argue that it is caused by the prevailing subsidence in the LT until convection moves south in September, as well as the low sensitivity of TES data in the LT over the African Plateau. 
The MLS data suggest that too much CO has been transported from fires in northern Africa to the UT

  10. The geo-reactor. A link between nuclear fission and geothermal energy?

    International Nuclear Information System (INIS)

    Degueldre, Claude; Fiorina, Carlo

    2013-01-01

    Recent high-precision isotope analysis data suggest the potential occurrence of a geo-reactor. Specific gas isotopes that could have been generated by binary and ternary fissions were identified in volcano emanations or as soluble/associated species in crystalline rocks, and were semi-quantitatively evaluated as isotopic ratios or estimated amounts. Although it is evident that, according to the actinide inventory of the Earth, local potential criticality of the geo-system may have been reached, several questions remain, such as why, where and when a geo-reactor would have been operational. Even if the hypothesis of geo-reactor operation in the proto-Earth period is acceptable, it could be difficult to anticipate that a geo-reactor is still operating today. This could be tested in the future by assessing and reconstructing the system by antineutrino detection and tomography through the Earth. The present paper focuses on the geo-reactor conditions, including history, spatial extension and regimes. The discussion, based on recent calculations, involves investigations of the limits in terms of fissile inventory, size and power, based on stratification through the gravitational field and the various features through the inner mantle, the boundary with the core, the external part and the inner core. The reconstruction allows formulating that, from the history point of view, there are possibilities that the geo-reactor reached criticality in a proto-Earth period as a thorium/uranium reactor triggered by an under-layer with heavier actinides. The geo-reactor could be a key component of geothermal energy sources. (author)

  11. Geo synthetics in hydraulic and coastal engineering: Filters, revetments and sand filled structures

    International Nuclear Information System (INIS)

    Bezuijen, A.; Pilarczyk, K. W.

    2014-01-01

    The paper deals with two applications of geotextiles in coastal and hydraulic engineering: geotextiles in filters and revetments, and geotextiles in sand-filled structures. Geotextiles are often replacing granular filters; however, they have different properties than a granular filter. For the application of geotextiles in revetments, the consequences of these different properties are shown: how permeability is influenced by a geotextile, and what the consequences of the weight differences between granular and geotextile filters can be. In the other application, the filter properties of geotextiles are only secondary. In geotextile tubes and containers, the geotextile is used as a wrapping material to create large units that will not erode under wave attack. Structures with geotextile tubes and containers serve as an alternative to rock-based structures. The first of these structures were more or less constructed by trial and error, but research on the shape of the structures, their stability under wave attack and the durability of the material used has made it possible to use design tools for these structures. Recently, the morphological aspects of these structures have also been investigated. This is of importance because structures with geotextile tubes regularly fail due to insufficient toe protection against the scour hole that develops in front of the structure, leading to undermining of the structure. Recent research in the Delta Flume of Deltares and the Large Wave Flume in Hannover has led to a better understanding of the mechanisms that determine stability under wave attack. It is shown that the degree of filling is also of importance, and that the position of the water level with respect to the tube has a large influence. (Author)

  12. Geo-Enabled, Mobile Services

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard

    2006-01-01

    We are witnessing the emergence of a global infrastructure that enables the widespread deployment of geo-enabled, mobile services in practice. At the same time, the research community has also paid increasing attention to data management aspects of mobile services. This paper offers me...

  13. LHCb distributed conditions database

    International Nuclear Information System (INIS)

    Clemencic, M

    2008-01-01

    The LHCb Conditions Database project provides the necessary tools to handle non-event time-varying data. The main users of conditions are reconstruction and analysis processes, which are running on the Grid. To allow efficient access to the data, we need to use a synchronized replica of the content of the database located at the same site as the event data file, i.e. the LHCb Tier1. The replica to be accessed is selected from information stored on LFC (LCG File Catalog) and managed with the interface provided by the LCG developed library CORAL. The plan to limit the submission of jobs to those sites where the required conditions are available will also be presented. LHCb applications are using the Conditions Database framework on a production basis since March 2007. We have been able to collect statistics on the performance and effectiveness of both the LCG library COOL (the library providing conditions handling functionalities) and the distribution framework itself. Stress tests on the CNAF hosted replica of the Conditions Database have been performed and the results will be summarized here

  14. Uranium Location Database

    Data.gov (United States)

    U.S. Environmental Protection Agency — A GIS compiled locational database in Microsoft Access of ~15,000 mines with uranium occurrence or production, primarily in the western United States. The metadata...

  15. Penn State geoPebble system: Design,Implementation, and Initial Results

    Science.gov (United States)

    Urbina, J. V.; Anandakrishnan, S.; Bilen, S. G.; Fleishman, A.; Burkett, P.

    2014-12-01

    The Penn State geoPebble system is a new network of wirelessly interconnected seismic and GPS sensor nodes with a flexible architecture. This network will be used for studies of ice sheets in Antarctica and Greenland, as well as to investigate mountain glaciers. The network will consist of ~150 geoPebbles that can be deployed in a user-defined spatial geometry. We present our design methodology, which has enabled us to develop these state-of-the-art sensors using commercial off-the-shelf hardware combined with custom-designed hardware and software. Each geoPebble is a self-contained, wirelessly connected sensor for collecting seismic measurements and position information. Key elements of each node encompass a three-component seismic recorder, which includes an amplifier, filter, and 24-bit analog-to-digital converter that can sample up to 10 kHz. Each unit also includes a microphone channel to record the ground-coupled airwave. The timing for each node is available from GPS measurements and a local precision oscillator that is conditioned by the GPS timing pulses. In addition, we record the carrier-phase measurement of the L1 GPS signal in order to determine location at sub-decimeter accuracy (relative to other geoPebbles within a few kilometers' radius). Each geoPebble includes 16 GB of solid-state storage, wireless communications capability to a central supervisory unit, and auxiliary measurement capability (including tilt from accelerometers, absolute orientation from magnetometers, and temperature). A novel aspect of the geoPebble is a wireless charging system for the internal battery (using inductive coupling techniques). The geoPebbles include all the sensors (geophones, GPS, microphone), communications (WiFi), and power (battery and charging) internally, so the geoPebble system can operate without any cabling connections (though we do provide an external connector so that different geophones can be used). We report initial field-deployment results and

  16. MAGIC: understanding Italy's seas and identifying their geo-hazards

    Directory of Open Access Journals (Sweden)

    Alessandro Bosman

    2010-03-01

    Full Text Available MAGIC project: Marine Geohazard along the Italian Coasts. The MAGIC Project is funded by the Italian Civil Protection Department (DPC) to produce a bathymetric database as a reference for compiling maps (1:50,000) of marine geo-hazards. During its 5-year life span (2007-2012), MAGIC will allow the acquisition of high-resolution multibeam bathymetry along the Italian continental margins and will involve the entire Italian scientific community currently active in the field of Marine Geology. More than 73,000 nautical miles of multibeam data will be analyzed, allowing comparison of geological features produced by sedimentary and tectonic processes (i.e. volcanic events, submarine landslides, active faulting). The main objective of MAGIC is to furnish the DPC with an accurate depiction of the superficial geology and related geo-hazards of the most sensitive and hazard-prone areas.

  17. Access to DNA and protein databases on the Internet.

    Science.gov (United States)

    Harper, R

    1994-02-01

    During the past year, the number of biological databases that can be queried via Internet has dramatically increased. This increase has resulted from the introduction of networking tools, such as Gopher and WAIS, that make it easy for research workers to index databases and make them available for on-line browsing. Biocomputing in the nineties will see the advent of more client/server options for the solution of problems in bioinformatics.

  18. Investigation Antiwear Properties of Lubricants with the Geo-Modifiers of Friction

    Directory of Open Access Journals (Sweden)

    I. Levanov

    2017-09-01

    Full Text Available The article describes the influence of geo-modifiers of friction on the antiwear properties of lubricants. Geo-modifiers of friction are fine powders of mineral materials. This work investigates the influence of geo-modifiers of friction, in the form of solid lubricant compositions based on the mineral serpentine, on the antiwear properties of greases and gear oils. The composition is fine serpentine powder with the addition of components such as chalk, borax, kaolin and talc. We compared the antiwear properties of greases without geo-modifiers of friction and of greases containing from 0.5% to 3% geo-modifiers of friction. Litol-24 grease and TAD-17 transmission oil were used for testing. A four-ball friction machine was used for the tests in accordance with GOST 9490-75. Serpentine with a particle fraction from 0.87 to 2.2 microns was used as the geo-modifier. The wear scar diameter was used to evaluate the antiwear properties of the lubricants. As a result of the tests, it was established that the antiwear properties of the greases improved by 26-50%, depending on the concentration of the geo-modifier of friction based on pure serpentine.

  19. MongoDB and Python Patterns and processes for the popular document-oriented database

    CERN Document Server

    O'Higgins, Niall

    2011-01-01

    Learn how to leverage MongoDB with your Python applications, using the hands-on recipes in this book. You get complete code samples for tasks such as making fast geo queries for location-based apps, efficiently indexing your user documents for social-graph lookups, and many other scenarios. This guide explains the basics of the document-oriented database and shows you how to set up a Python environment with it. Learn how to read and write to MongoDB, apply idiomatic MongoDB and Python patterns, and use the database with several popular Python web frameworks. You'll discover how to model your

  20. ARTI Refrigerant Database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M. [Calm (James M.), Great Falls, VA (United States)

    1994-05-27

    The Refrigerant Database consolidates and facilitates access to information to assist industry in developing equipment using alternative refrigerants. The underlying purpose is to accelerate phase out of chemical compounds of environmental concern.

  1. WMC Database Evaluation. Case Study Report

    Energy Technology Data Exchange (ETDEWEB)

    Palounek, Andrea P. T [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-29

    The WMC Database is ultimately envisioned to hold a collection of experimental data, design information, and information from computational models. This project was a first attempt at using the Database to access experimental data and extract information from it. This evaluation shows that the Database concept is sound and robust, and that the Database, once fully populated, should remain eminently usable for future researchers.

  2. ECG-ViEW II, a freely accessible electrocardiogram database

    Science.gov (United States)

    Park, Man Young; Lee, Sukhoon; Jeon, Min Seok; Yoon, Dukyong; Park, Rae Woong

    2017-01-01

    The Electrocardiogram Vigilance with Electronic data Warehouse II (ECG-ViEW II) is a large, single-center database comprising numeric parameter data of the surface electrocardiograms of all patients who underwent testing from 1 June 1994 to 31 July 2013. The electrocardiographic data include the test date, clinical department, RR interval, PR interval, QRS duration, QT interval, QTc interval, P axis, QRS axis, and T axis. These data are connected with patient age, sex, ethnicity, comorbidities, age-adjusted Charlson comorbidity index, prescribed drugs, and electrolyte levels. This longitudinal observational database contains 979,273 electrocardiograms from 461,178 patients over a 19-year study period. This database can provide an opportunity to study electrocardiographic changes caused by medications, disease, or other demographic variables. ECG-ViEW II is freely available at http://www.ecgview.org. PMID:28437484

  3. Newspaper archives + text mining = rich sources of historical geo-spatial data

    Science.gov (United States)

    Yzaguirre, A.; Smit, M.; Warren, R.

    2016-04-01

    Newspaper archives are rich sources of cultural, social, and historical information. These archives, even when digitized, are typically unstructured and organized by date rather than by subject or location, and require substantial manual effort to analyze. The effort of journalists to be accurate and precise means that there is often rich geo-spatial data embedded in the text, alongside text describing events that editors considered to be of sufficient importance to the region or the world to merit column inches. A regional newspaper can add over 100,000 articles to its database each year, and extracting information from this data for even a single country would pose a substantial Big Data challenge. In this paper, we describe a pilot study on the construction of a database of historical flood events (location(s), date, cause, magnitude) to be used in flood assessment projects, for example to calibrate models, estimate frequency, establish high water marks, or plan for future events in contexts ranging from urban planning to climate change adaptation. We then present a vision for extracting and using the rich geospatial data available in unstructured text archives, and suggest future avenues of research.
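The extraction step described in this pilot study can be illustrated with a deliberately naive sketch: a regular expression that pulls a date and a place from one invented archive sentence. Real pipelines would use named-entity recognition and gazetteers; the sentence, pattern and field names here are all hypothetical:

```python
import re

# A hypothetical archive sentence; the pattern and fields are illustrative only.
article = ("On 19 August 1927, the Saint John River overflowed its banks near "
           "Fredericton, flooding low-lying farmland.")

# Very naive extraction: a "day month year" date followed later by a "near <Place>" clause.
pattern = re.compile(r"On (?P<date>\d{1,2} \w+ \d{4}).*?near (?P<place>[A-Z][\w-]+)")
m = pattern.search(article)
event = {"date": m.group("date"), "place": m.group("place")}
print(event)  # → {'date': '19 August 1927', 'place': 'Fredericton'}
```

A flood-event database of the kind described (location, date, cause, magnitude) would be populated by running many such extractors, plus disambiguation and geocoding, over the full archive.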

  4. A fully distributed geo-routing scheme for wireless sensor networks

    KAUST Repository

    Bader, Ahmed

    2013-12-01

    When marrying randomized distributed space-time coding (RDSTC) to beaconless geo-routing, new performance horizons can be created. In order to reach those horizons, however, beaconless geo-routing protocols must evolve to operate in a fully distributed fashion. In this letter, we expose a technique to construct a fully distributed geo-routing scheme in conjunction with RDSTC. We then demonstrate the performance gains of this novel scheme by comparing it to one of the prominent classical schemes. © 2013 IEEE.

  5. A fully distributed geo-routing scheme for wireless sensor networks

    KAUST Repository

    Bader, Ahmed; Abed-Meraim, Karim; Alouini, Mohamed-Slim

    2013-01-01

    When marrying randomized distributed space-time coding (RDSTC) to beaconless geo-routing, new performance horizons can be created. In order to reach those horizons, however, beaconless geo-routing protocols must evolve to operate in a fully distributed fashion. In this letter, we expose a technique to construct a fully distributed geo-routing scheme in conjunction with RDSTC. We then demonstrate the performance gains of this novel scheme by comparing it to one of the prominent classical schemes. © 2013 IEEE.

  6. Specific Space Transportation Costs to GEO - Past, Present and Future

    Science.gov (United States)

    Koelle, Dietrich E.

    2002-01-01

    The largest share of space missions goes to the geosynchronous orbit (GEO); these missions have the highest commercial importance. The paper first shows the historic trend of specific transportation costs to GEO from 1963 to 2002. Costs started at more than 500 000 /kg (2002 value) and have come down to 36 000 /kg. This reduction looks impressive; however, the reason is NOT improved technology or new techniques but solely the growth of GEO payloads' unit mass. The first GEO satellite in 1963 had a mass of 36 kg (BoL). This has grown to 1600 kg (average of all GEO satellites) in the year 2000. Mass in GEO after injection is used here instead of GTO mass, since GTO mass depends on the launch-site latitude. The specific cost reduction is due only to the "law of scale" valid in the whole transportation business: the larger the payload, the lower the specific transportation cost. The paper shows the actual prices of launch services to GTO by the major launch vehicles. Finally, the potential GEO transportation costs of future launch systems are evaluated. What is the potential reduction of specific transportation costs if reusable elements are introduced in future systems? Examples show that cost reductions of up to 75 % compared to actual costs seem achievable, but only with launch systems optimized according to modern principles of cost engineering. (Presented at the 53rd International Astronautical Congress, World Space Congress, Houston.)

  7. Concept of a spatial data infrastructure for web-mapping, processing and service provision for geo-hazards

    Science.gov (United States)

    Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara

    2017-04-01

    Geo-hazards and their effects are distributed geographically over wide regions. Effective mapping and monitoring are essential for hazard assessment and mitigation, and are often best achieved using satellite imagery and new object-based image analysis approaches to identify and delineate geo-hazard objects (landslides, floods, forest fires, storm damages, etc.). At the moment, several local/national databases and platforms provide and publish data on different types of geo-hazards as well as web-based risk maps and decision support systems. The European Commission also implemented the Copernicus Emergency Management Service (EMS) in 2015, which publishes information about natural and man-made disasters and risks. Currently, no platform for landslides or geo-hazards as such exists that enables the integration of the user in the mapping and monitoring process. In this study we introduce the concept of a spatial data infrastructure for object delineation, web-processing and service provision of landslide information, with a focus on user interaction in all processes. A first prototype for the processing and mapping of landslides in Austria and Italy has been developed within the project Land@Slide, funded by the Austrian Research Promotion Agency FFG in the Austrian Space Applications Program ASAP. The spatial data infrastructure and its services for the mapping, processing and analysis of landslides can be extended to other regions and to all types of geo-hazards for analysis and delineation based on Earth Observation (EO) data. The architecture of the first prototypical spatial data infrastructure includes four main areas of technical components. The data tier consists of a file storage system and the spatial data catalogue for the management of EO data and other geospatial data on geo-hazards, as well as descriptions and protocols for the data processing and analysis. An interface to extend the data integration from external sources (e.g. Sentinel-2 data) is planned.

  8. Direct access to INIS

    International Nuclear Information System (INIS)

    Zheludev, I.S.; Romanenko, A.G.

    1981-01-01

    Librarians, researchers, and information specialists throughout the world now have the opportunity for direct access to coverage of almost 95% of the world's literature dealing with the peaceful uses of atomic energy and nuclear science. This opportunity has been provided by the International Nuclear Information System (INIS) of the IAEA. INIS, with the voluntary collaboration of more than 60 of the Agency's Member States, maintains a comprehensive, computer-resident data-base, containing the bibliographic details plus informative abstracts of the bulk of the world's literature on nuclear science and technology. Since this data-base is growing at a rate of 75,000 items per year, and already contains more than 500,000 items, it is obviously important to be able to search this collection conveniently and efficiently. The usefulness of this ability is enhanced when other data-bases on related subjects are made available on an information network. During the early 1970s, on-line interrogation of large bibliographic data-bases became the accepted method for searching this type of information resource. Direct interaction between the searcher and the data-base provides quick feed-back resulting in improved literature listings for launching research and development projects. On-line access enables organizations which cannot store a large data-base on their own computer to expand the information resources at their command. Because of these advantages, INIS undertook to extend to interested Member States on-line access to its data-base in Vienna

  9. NLTE4 Plasma Population Kinetics Database

    Science.gov (United States)

    SRD 159 NLTE4 Plasma Population Kinetics Database (Web database for purchase)   This database contains benchmark results for simulation of plasma population kinetics and emission spectra. The data were contributed by the participants of the 4th Non-LTE Code Comparison Workshop who have unrestricted access to the database. The only limitation for other users is in hidden labeling of the output results. Guest users can proceed to the database entry page without entering userid and password.

  10. Thallium pollution in China: A geo-environmental perspective.

    Science.gov (United States)

    Xiao, Tangfu; Yang, Fei; Li, Shehong; Zheng, Baoshan; Ning, Zengping

    2012-04-01

    It is well known that thallium (Tl) is a non-essential and toxic metal to human health, but less is known about the geo-environmentally-induced Tl pollution and its associated health impacts. High concentrations of Tl that are primarily associated with the epithermal metallogenesis of sulfide minerals have the potential of producing Tl pollution in the environment, which has been recognized as an emerging pollutant in China. This paper aims to review the research progress in China on Tl pollution in terms of the source, mobility, transportation pathway, and health exposure of Tl and to address the environmental concerns on Tl pollution in a geo-environmental perspective. Tl associated with the epithermal metallogenesis of sulfide minerals has been documented to disperse readily and accumulate through the geo-environmental processes of soil enrichment, water transportation and food crop growth beyond a mineralized zone. The enrichments of Tl in local soil, water, and crops may result in Tl pollution and consequent adverse health effects, e.g. chronic Tl poisoning. Investigation of the baseline Tl in the geo-environment, proper land use and health-related environmental planning and regulation are critical to prevent the Tl pollution. Examination of the human urinary Tl concentration is a quick approach to identify exposure of Tl pollution to humans. The experiences of Tl pollution in China can provide important lessons for many other regions in the world with similar geo-environmental contexts because of the high mobility and toxicity of Tl. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Geo synthetic-reinforced Pavement systems; Sistemas de pavimentos reforzados con geosinteticos

    Energy Technology Data Exchange (ETDEWEB)

    Zornberg, J. G.

    2014-02-01

    Geo synthetics have been used as reinforcement inclusions to improve pavement performance. While there is clear field evidence of the benefit of using geo synthetic reinforcements, the specific conditions or mechanisms that govern the reinforcement of pavements are, at best, unclear and have remained largely unmeasured. Significant research has recently been conducted with the objectives of: (i) determining the relevant properties of geo synthetics that contribute to the enhanced performance of pavement systems, (ii) developing appropriate analytical, laboratory and field methods capable of quantifying the pavement performance, and (iii) enabling the prediction of pavement performance as a function of the properties of the various types of geo synthetics. (Author)

  12. Impact of Access to Online Databases on Document Delivery Services within Iranian Academic Libraries

    Directory of Open Access Journals (Sweden)

    Zohreh Zahedi

    2007-04-01

    Full Text Available The present study investigates the impact of access to online databases on document delivery services in Iranian academic libraries, within the framework of factors such as the number of orders lodged over the years studied and their trends, and the expenditures made by each university, especially those universities and groups that had the highest number of orders. This investigation was carried out through a survey, by calling on the library document supply unit in each university, and through in-person interviews with the librarians in charge. The study sample was confined to the universities of Shiraz, Tehran and Tarbiyat Modaress along with their faculties. Findings indicate that the rate of document requests in the various universities depends on the target audience, capabilities, students' familiarity, and the mode of document delivery services.

  13. Creation of the NaSCoRD Database

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jankovsky, Zachary Kyle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stuart, William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    This report was written as part of a United States Department of Energy (DOE), Office of Nuclear Energy, Advanced Reactor Technologies program funded project to re-create the capabilities of the legacy Centralized Reliability Database Organization (CREDO) database. The CREDO database provided a record of component design and performance documentation across various systems that used sodium as a working fluid. Regaining this capability will allow the DOE complex and the domestic sodium reactor industry to better understand how previous systems were designed and built, for use in improving the design and operations of future loops. The contents of this report include: an overview of the current state of domestic sodium reliability databases; a summary of the ongoing effort to improve, understand, and process the CREDO information; a summary of the initial efforts to develop a unified sodium reliability database called the Sodium System Component Reliability Database (NaSCoRD); and an explanation of how potential users can access the domestic sodium reliability databases and the type of information that can be obtained from them.

  14. HCUP State Inpatient Databases (SID) - Restricted Access File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The State Inpatient Databases (SID) contain the universe of hospital inpatient discharge abstracts in States participating in HCUP that release their data through...

  15. Self-Access Centers: Maximizing Learners’ Access to Center Resources

    Directory of Open Access Journals (Sweden)

    Mark W. Tanner

    2010-09-01

    Full Text Available Originally published in TESL-EJ March 2009, Volume 12, Number 4 (http://tesl-ej.org/ej48/a2.html). Reprinted with permission from the authors. Although some students have discovered how to use self-access centers effectively, the majority appear to be unaware of the available resources. A website and database of materials were created to help students locate materials and use the Self-Access Study Center (SASC) at Brigham Young University's English Language Center (ELC) more effectively. Students took two surveys regarding their use of the SASC. The first survey was given before the website and database were made available. A second survey was administered 12 weeks after students had been introduced to the resource. An analysis of the data shows that students tend to use SASC resources more autonomously as a result of having a web-based database. The survey results suggest that SAC managers can encourage more autonomous use of center materials by providing a website and database to help students find appropriate materials to use to learn English.

  16. Fluid migration through geo-membrane seams and through the interface between geo-membrane and geo-synthetic clay liner; Contribution a l'etude des transferts de masse au niveau des joints de geomembrane et a l'interface entre geomembrane et geosynthetique bentonitique

    Energy Technology Data Exchange (ETDEWEB)

    Barroso, M

    2005-03-15

    Composite liners are used to limit contaminant migration from landfills. Their successful performance is closely related to the geo-membrane, as it provides the primary barrier to diffusive and advective transport of contaminants. Critical issues in the performance of geo-membranes are the seams between geo-membrane panels and the inevitable defects resulting, for instance, from inadequate installation activities. In landfills, where high-density polyethylene geo-membranes are usually used, seams are typically made by the thermal hot dual wedge method. A literature review on quality control of the seams showed that, in situ, the fluid-tightness of seams is evaluated in qualitative terms (pass/failure criteria), despite its importance in ensuring the appropriate performance of geo-membranes as barriers. In addition, a synthesis of studies on geo-membrane defects indicated that defects varying in density from 0.7 to 15.3 per hectare can be found in landfills. Defects represent preferential flow paths for leachate. Various authors have developed analytical solutions and empirical equations for predicting the flow rate through composite liners due to defects in the geo-membrane. The validity of these methods for composite liners comprising a geo-membrane over a geo-synthetic clay liner (GCL) over a compacted clay liner (CCL) has never been studied from an experimental point of view. To address the problem of fluid migration through geo-membrane seams, an attempt is made to provide a test method, herein termed the 'gas permeation pouch test', for assessing the quality of thermal hot dual wedge seams. This test consists of pressurizing the air channel formed by the double seam with a gas to a specific pressure and then measuring the decrease in pressure over time. From the pressure decrease, both the gas permeation coefficients, in steady-state conditions, and the time constant, in unsteady-state conditions, can be estimated. Experiments were
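The time constant estimated from the pressure decrease can be illustrated numerically. Assuming, purely for illustration, a single-exponential decay p(t) = p0·exp(-t/tau) toward zero gauge pressure, tau follows from a log-linear least-squares fit of the measured pressures; all numbers below are synthetic, not from the thesis:

```python
import math

p0, tau = 200.0, 50.0  # hypothetical initial gauge pressure (kPa) and true time constant (h)
times = [0, 10, 20, 40, 80]
pressures = [p0 * math.exp(-t / tau) for t in times]  # synthetic, noise-free decay curve

# Log-linear least squares: ln p = ln p0 - t/tau, so the fitted slope equals -1/tau.
n = len(times)
x, y = times, [math.log(p) for p in pressures]
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
tau_est = -1.0 / slope
print(round(tau_est, 1))  # → 50.0
```

With noise-free data the fit recovers tau exactly; with real pouch-test readings the same fit would give a least-squares estimate of the time constant.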

  17. International Nuclear Safety Center (INSC) database

    International Nuclear Information System (INIS)

    Sofu, T.; Ley, H.; Turski, R.B.

    1997-01-01

    As an integral part of DOE's International Nuclear Safety Center (INSC) at Argonne National Laboratory, the INSC Database has been established to provide an interactively accessible information resource for the world's nuclear facilities and to promote free and open exchange of nuclear safety information among nations. The INSC Database is a comprehensive resource database aimed at a scope and level of detail suitable for safety analysis and risk evaluation for the world's nuclear power plants and facilities. It also provides an electronic forum for international collaborative safety research for the Department of Energy and its international partners. The database is intended to provide plant design information, material properties, computational tools, and results of safety analysis. Initial emphasis in data gathering is given to Soviet-designed reactors in Russia, the former Soviet Union, and Eastern Europe. The implementation is performed under the Oracle database management system, and the World Wide Web is used to serve as the access path for remote users. An interface between the Oracle database and the Web server is established through a custom designed Web-Oracle gateway which is used mainly to perform queries on the stored data in the database tables

  18. The live service of video geo-information

    Science.gov (United States)

    Xue, Wu; Zhang, Yongsheng; Yu, Ying; Zhao, Ling

    2016-03-01

    In disaster rescue, emergency response and other scenarios, traditional aerial photogrammetry has difficulty meeting real-time monitoring and dynamic tracking demands. To achieve a live service of video geo-information, a system was designed and realized: an unmanned helicopter equipped with a video sensor, POS, and a high-band radio. This paper briefly introduces the concept and design of the system. The workflow of the video geo-information live service is described. Related experiments and some products are shown. In the end, conclusions and an outlook are given.

  19. The GEO Handbook on Biodiversity Observation Networks

    CSIR Research Space (South Africa)

    Walters, Michele

    2017-01-01

    Full Text Available across the planet. I congratulate GEO BON on creating this powerful mechanism and wish the GEO BON community great success in each of its future endeavours. Geneva, Switzerland. Barbara J. Ryan, Executive Director: Group on Earth Observations... of biodiversity data is the desired goal, it would be hard to achieve except via the mechanism of a network, simply because sampling and species identification is more cost-effective and situation-appropriate if conducted using local...

  20. Database on wind characteristics. Contents of database bank

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, K.S.

    2001-01-01

    for the available data in the established database bank, and part three is the Users Manual describing the various ways to access and analyse the data. The present report constitutes the second part of the Annex XVII reporting. Basically, the database bank contains three categories of data, i.e. i) high-sampled wind field time series; ii) high-sampled wind turbine structural response time series; and iii) wind resource data. The main emphasis, however, is on category i). The available data within each of the three categories are described in detail. The description embraces site characteristics, terrain type...

  1. Urate levels predict survival in amyotrophic lateral sclerosis: Analysis of the expanded Pooled Resource Open-Access ALS clinical trials database.

    Science.gov (United States)

    Paganoni, Sabrina; Nicholson, Katharine; Chan, James; Shui, Amy; Schoenfeld, David; Sherman, Alexander; Berry, James; Cudkowicz, Merit; Atassi, Nazem

    2018-03-01

    Urate has been identified as a predictor of amyotrophic lateral sclerosis (ALS) survival in some but not all studies. Here we leverage the recent expansion of the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database to study the association between urate levels and ALS survival. Pooled data of 1,736 ALS participants from the PRO-ACT database were analyzed. Cox proportional hazards regression models were used to evaluate associations between urate levels at trial entry and survival. After adjustment for potential confounders (i.e., creatinine and body mass index), there was an 11% reduction in risk of reaching a survival endpoint during the study with each 1-mg/dL increase in uric acid levels (adjusted hazard ratio 0.89, 95% confidence interval 0.82-0.97). This study supports an association between higher urate levels and longer survival in ALS and confirms the utility of the PRO-ACT database as a powerful resource for ALS epidemiological research. Muscle Nerve 57: 430-434, 2018. © 2017 Wiley Periodicals, Inc.
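The reported effect size can be unpacked numerically. In a Cox model the log-hazard is linear in the covariate, so a per-unit hazard ratio scales multiplicatively with the size of the change; a small sketch using the reported adjusted HR of 0.89 per 1 mg/dL (the 2 mg/dL extrapolation is illustrative only, not a figure from the paper):

```python
import math

hr_per_unit = 0.89                 # reported adjusted hazard ratio per 1 mg/dL urate
risk_reduction = 1.0 - hr_per_unit
print(f"{risk_reduction:.0%} risk reduction per 1 mg/dL")  # → 11% risk reduction per 1 mg/dL

# The Cox log-hazard is linear, so a k-unit change multiplies the hazard by HR**k.
beta = math.log(hr_per_unit)       # regression coefficient implied by the HR
hr_2 = math.exp(2 * beta)          # equivalently hr_per_unit ** 2
print(f"HR for a 2 mg/dL increase: {hr_2:.2f}")  # → HR for a 2 mg/dL increase: 0.79
```

This is why the abstract's "11% reduction per 1 mg/dL" phrasing follows directly from HR = 0.89.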

  2. Variable-scale Geo-information

    NARCIS (Netherlands)

    Meijers, B.M.

    2011-01-01

    The use of geo-information is changing with the advent of new mobile devices, such as tablet PCs, that harness a lot of computing power. This type of information is increasingly applied in mainstream digital consumer products, in a net-centric environment (i.e. dissemination takes place via the

  3. Global Ocean Currents Database (GOCD) (NCEI Accession 0093183)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Global Ocean Currents Database (GOCD) is a collection of quality controlled ocean current measurements such as observed current direction and speed obtained from...

  4. Encapsulation of Mg-Zr alloy in metakaolin-based geo-polymer

    International Nuclear Information System (INIS)

    Rooses, Adrien; Steins, Prune; Dannoux-Papin, Adeline; Lambertin, David; Poulesquen, Arnaud; Frizon, Fabien

    2013-01-01

    Investigations were carried out to propose a suitable material for the encapsulation of Mg-Zr alloy wastes from the fuel cladding of first-generation nuclear reactors. Stability over time, good mechanical properties and low gas production are the main requirements that embedding matrices must comply with in order to be suitable for long-term storage. One of the main issues in encapsulating Mg-Zr alloy in a mineral binder is the hydrogen production related to Mg-Zr alloy corrosion and the water radiolysis process. In this context, metakaolin geo-polymers offer an interesting outlook: corrosion densities of Mg-Zr alloys are significantly lower than in Portland cement. This work first presents the hydrogen production of Mg-Zr alloy embedded in geo-polymers prepared with different activation solutions (NaOH or KOH). The effect of the addition of fluorine on magnesium corrosion in geo-polymer has also been investigated. The results point out that sodium geo-polymer is a suitable binder for Mg-Zr alloy encapsulation with respect to magnesium corrosion resistance. Furthermore, the presence of fluorine significantly reduces the hydrogen release. The impact of fluorine on geo-polymer network formation was then studied by rheological, calorimetric and 19F NMR measurements. No direct effect of the addition of fluorine on the geo-polymer binder has been shown. Secondly, the formulation of the encapsulation matrix was adjusted to fulfil the expected physical and mechanical properties. Observations, dimensional evolution and compressive strengths demonstrated that the addition of sand to the geo-polymer binder is effective in meeting the storage criteria. Consequently, a matrix formulation compatible with Mg-Zr alloy encapsulation has been proposed. Finally, irradiation tests were carried out to assess the hydrogen radiolytic yield of the matrix under exposure to γ radiation. (authors)

  5. GEO Optical Data Association with Concurrent Metric and Photometric Information

    Science.gov (United States)

    Dao, P.; Monet, D.

    Data association in a congested area of the GEO belt with occasional visits by non-resident objects can be treated as a Multi-Target Tracking (MTT) problem. For a stationary sensor surveilling the GEO belt, geosynchronous and near-GEO objects are not completely motionless in the earth-fixed frame and can be observed as moving targets. In some clusters, metric or positional information is insufficiently accurate or up-to-date to associate the measurements. In the presence of measurements with uncertain origin, star tracks (residuals) and other sensor artifacts, heuristic techniques based on hard-decision assignment do not perform adequately. In the MTT community, Bar-Shalom [Bar-Shalom 2009] was the first to introduce the use of measurements to update the state of the target of interest in the tracking filter, e.g. the Kalman filter. Following Bar-Shalom's idea, we use the Probabilistic Data Association Filter (PDAF), but to make use of all information obtainable in measurements of three-axis-stabilized GEO satellites, we combine photometric with metric measurements to update the filter. Therefore, our technique, Concurrent Spatio-Temporal and Brightness (COSTB), has the stand-alone ability of associating a track with its identity for resident objects. That is possible because the light curve of a stabilized GEO satellite changes minimally from night to night. We exercised COSTB on camera-cadence data to associate measurements, correct mistags and detect non-residents in a simulated near-real-time cadence. Data on GEO clusters were used.
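The combined metric-plus-photometric association idea can be illustrated with a PDAF-style weight computation: each candidate measurement receives a weight proportional to the product of a positional (metric) Gaussian likelihood and a brightness (photometric) Gaussian likelihood, then the weights are normalized. The residuals, noise levels and independence assumption below are illustrative only, not values from the paper:

```python
import math

def gauss(residual, sigma):
    # 1-D Gaussian likelihood of a residual with standard deviation sigma.
    return math.exp(-0.5 * (residual / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Candidate measurements: (positional residual in arcsec, magnitude residual).
candidates = [(0.5, 0.05), (0.6, 1.2), (3.0, 0.02)]
sigma_pos, sigma_mag = 1.0, 0.1  # assumed measurement-noise levels

# PDAF-style weights: product of metric and photometric likelihoods, normalized.
raw = [gauss(dp, sigma_pos) * gauss(dm, sigma_mag) for dp, dm in candidates]
weights = [w / sum(raw) for w in raw]
best = max(range(len(candidates)), key=lambda i: weights[i])
print(best)  # → 0
```

Note how the second candidate is close in position but rejected by its brightness residual, while the third is close in brightness but rejected by position; only the joint likelihood picks the right one.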

  6. HEALTH GeoJunction: place-time-concept browsing of health publications.

    Science.gov (United States)

    MacEachren, Alan M; Stryker, Michael S; Turton, Ian J; Pezanowski, Scott

    2010-05-18

    The volume of health science publications is escalating rapidly. Thus, keeping up with developments is becoming harder as is the task of finding important cross-domain connections. When geographic location is a relevant component of research reported in publications, these tasks are more difficult because standard search and indexing facilities have limited or no ability to identify geographic foci in documents. This paper introduces HEALTH GeoJunction, a web application that supports researchers in the task of quickly finding scientific publications that are relevant geographically and temporally as well as thematically. HEALTH GeoJunction is a geovisual analytics-enabled web application providing: (a) web services using computational reasoning methods to extract place-time-concept information from bibliographic data for documents and (b) visually-enabled place-time-concept query, filtering, and contextualizing tools that apply to both the documents and their extracted content. This paper focuses specifically on strategies for visually-enabled, iterative, facet-like, place-time-concept filtering that allows analysts to quickly drill down to scientific findings of interest in PubMed abstracts and to explore relations among abstracts and extracted concepts in place and time. The approach enables analysts to: find publications without knowing all relevant query parameters, recognize unanticipated geographic relations within and among documents in multiple health domains, identify the thematic emphasis of research targeting particular places, notice changes in concepts over time, and notice changes in places where concepts are emphasized. PubMed is a database of over 19 million biomedical abstracts and citations maintained by the National Center for Biotechnology Information; achieving quick filtering is an important contribution due to the database size. Including geography in filters is important due to rapidly escalating attention to geographic factors in public
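The facet-like place-time-concept filtering described above can be sketched in a few lines: documents carry extracted place, time, and concept facets, and each active facet narrows the result set. The records and field names here are hypothetical, not from HEALTH GeoJunction:

```python
# Hypothetical abstract records with extracted place-time-concept facets.
records = [
    {"id": 1, "place": "Kenya",  "year": 2008, "concepts": {"malaria", "rainfall"}},
    {"id": 2, "place": "Kenya",  "year": 2012, "concepts": {"cholera"}},
    {"id": 3, "place": "Brazil", "year": 2012, "concepts": {"malaria"}},
]

def facet_filter(recs, place=None, years=None, concept=None):
    # Facet-style filtering: each non-None facet further narrows the result set.
    out = recs
    if place is not None:
        out = [r for r in out if r["place"] == place]
    if years is not None:
        out = [r for r in out if years[0] <= r["year"] <= years[1]]
    if concept is not None:
        out = [r for r in out if concept in r["concepts"]]
    return out

hits = facet_filter(records, place="Kenya", years=(2005, 2010), concept="malaria")
print([r["id"] for r in hits])  # → [1]
```

Iteratively tightening or relaxing facets in this way is what lets an analyst "drill down" without knowing all query parameters in advance.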

  7. HEALTH GeoJunction: place-time-concept browsing of health publications

    Directory of Open Access Journals (Sweden)

    Turton Ian J

    2010-05-01

    Full Text Available Abstract Background: The volume of health science publications is escalating rapidly. Thus, keeping up with developments is becoming harder, as is the task of finding important cross-domain connections. When geographic location is a relevant component of research reported in publications, these tasks are more difficult because standard search and indexing facilities have limited or no ability to identify geographic foci in documents. This paper introduces HEALTH GeoJunction, a web application that supports researchers in the task of quickly finding scientific publications that are relevant geographically and temporally as well as thematically. Results: HEALTH GeoJunction is a geovisual analytics-enabled web application providing: (a) web services using computational reasoning methods to extract place-time-concept information from bibliographic data for documents and (b) visually-enabled place-time-concept query, filtering, and contextualizing tools that apply to both the documents and their extracted content. This paper focuses specifically on strategies for visually-enabled, iterative, facet-like, place-time-concept filtering that allows analysts to quickly drill down to scientific findings of interest in PubMed abstracts and to explore relations among abstracts and extracted concepts in place and time. The approach enables analysts to: find publications without knowing all relevant query parameters, recognize unanticipated geographic relations within and among documents in multiple health domains, identify the thematic emphasis of research targeting particular places, notice changes in concepts over time, and notice changes in places where concepts are emphasized. Conclusions: PubMed is a database of over 19 million biomedical abstracts and citations maintained by the National Center for Biotechnology Information; achieving quick filtering is an important contribution due to the database size. Including geography in filters is important due to

  8. GeoCAM: A geovisual analytics workspace to contextualize and interpret statements about movement

    Directory of Open Access Journals (Sweden)

    Anuj Jaiswal

    2011-12-01

    Full Text Available This article focuses on integrating computational and visual methods in a system that supports analysts to identify, extract, map, and relate linguistic accounts of movement. We address two objectives: (1) build the conceptual, theoretical, and empirical framework needed to represent and interpret human-generated directions; and (2) design and implement a geovisual analytics workspace for direction document analysis. We have built a set of geo-enabled, computational methods to identify documents containing movement statements, and a visual analytics environment that uses natural language processing methods iteratively with geographic database support to extract, interpret, and map geographic movement references in context. Additionally, analysts can provide feedback to improve computational results. To demonstrate the value of this integrative approach, we have realized a proof-of-concept implementation focusing on identifying and processing documents that contain human-generated route directions. Using our visual analytic interface, an analyst can explore the results, provide feedback to improve those results, pose queries against a database of route directions, and interactively represent the route on a map.

  9. Smart Location Database - Service

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Smart Location Database (SLD) summarizes over 80 demographic, built environment, transit service, and destination accessibility attributes for every census block...

  10. Smart Location Database - Download

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Smart Location Database (SLD) summarizes over 80 demographic, built environment, transit service, and destination accessibility attributes for every census block...

  11. Making the GeoConnection: Web 2.0-based support for early-career geoscientists (Invited)

    Science.gov (United States)

    Martinez, C. M.; Gonzales, L. M.; Keane, C. M.

    2010-12-01

    The U.S. Bureau of Labor Statistics estimates that there will be an 18% increase in geoscience jobs between 2008 and 2018 in the United States, and demand for geoscientists is expected to rise worldwide as scientists tackle global challenges related to resources, hazards and climate. At the same time, the geoscience workforce is aging, with approximately half of the current workforce reaching retirement age within the next 10-15 years. A new generation of geoscientists must be ready to take the reins. To support this new generation, AGI’s geoscience workforce outreach programs were designed to help retain geoscience students through their degree programs and into careers in the field. These resources include support for early-career professional development and career planning. AGI’s GeoConnection Network for the Geosciences provides a venue for informal dissemination of career information and professional resources. The network links Web 2.0 platforms, including a Facebook page, YouTube channel and Twitter feed, to build a robust community of geoscientists at all stages of their careers. Early-career geoscientists can participate in GeoConnection to network with other scientists and to receive information about professional development and job opportunities. Through GeoConnection packets, students can join professional societies, which will assist their transition from school to the workplace. AGI’s member societies provide professional development course work, field trips, career services, interviewing opportunities, and community meetings. As part of the GeoConnection Network, AGI hosts informational webinars to highlight new workforce data, discuss current affairs in the geosciences, and provide information about geoscience careers. Between December 2009 and August 2010, AGI hosted 10 webinars with more than 300 total participants, and 5 additional webinars are planned for the remainder of the year. The webinars offer early

  12. Oceans of Data: In what ways can learning research inform the development of electronic interfaces and tools for use by students accessing large scientific databases?

    Science.gov (United States)

    Krumhansl, R. A.; Foster, J.; Peach, C. L.; Busey, A.; Baker, I.

    2012-12-01

    The practice of science and engineering is being revolutionized by the development of cyberinfrastructure for accessing near real-time and archived observatory data. Large cyberinfrastructure projects have the potential to transform the way science is taught in high school classrooms, making enormous quantities of scientific data available, giving students opportunities to analyze and draw conclusions from many kinds of complex data, and providing students with experiences using state-of-the-art resources and techniques for scientific investigations. However, online interfaces to scientific data are built by scientists for scientists, and their design can significantly impede broad use by novices. Knowledge relevant to the design of student interfaces to complex scientific databases is broadly dispersed among disciplines ranging from cognitive science to computer science and cartography, and is not easily accessible to designers of educational interfaces. To inform efforts at bridging scientific cyberinfrastructure to the high school classroom, Education Development Center, Inc. and the Scripps Institution of Oceanography conducted an NSF-funded 2-year interdisciplinary review of literature and expert opinion pertinent to making interfaces to large scientific databases accessible to and usable by precollege learners and their teachers. Project findings are grounded in the fundamentals of Cognitive Load Theory, Visual Perception, Schemata formation and Universal Design for Learning. The Knowledge Status Report (KSR) presents cross-cutting and visualization-specific guidelines that highlight how interface design features can address or ameliorate the challenges novice high school students face as they navigate complex databases to find data, and construct and look for patterns in maps, graphs, animations and other data visualizations. The guidelines present ways to make scientific databases more broadly accessible by: 1) adjusting the cognitive load imposed by the user

  13. NNDC database migration project

    Energy Technology Data Exchange (ETDEWEB)

    Burrows, Thomas W; Dunford, Charles L [U.S. Department of Energy, Brookhaven Science Associates (United States)

    2004-03-01

    NNDC Database Migration was necessary to replace obsolete hardware and software, to be compatible with the industry standard in relational databases (mature software, a large base of supporting software for administration and dissemination, and replication and synchronization tools), and to improve user access in terms of interface and speed. The Relational Database Management System (RDBMS) consists of a Sybase Adaptive Server Enterprise (ASE), from which data are relatively easy to move to different RDB systems (e.g., MySQL, MS SQL Server, or MS Access), the Structured Query Language (SQL), and administrative tools written in Java. Linux or UNIX platforms can be used. The existing ENSDF datasets are often very large and will need to be reworked, and both the CRP (adopted) and CRP (Budapest) datasets give elemental cross sections (not relative Iγ) in the RI field (so it is not immediately obvious which of the old values has been changed). But primary and secondary intensities are now available on the same scale; the intensity normalization has been done for us. We will gain access to a large volume of data from Budapest, and some of those gamma-ray intensity and energy data will be superior to what we already have.

  14. Distributed Digital Survey Logbook Built on GeoServer and PostGIS

    Science.gov (United States)

    Jovicic, Aleksandar; Castelli, Ana; Kljajic, Zoran

    2013-04-01

    Keeping track of events that happen during a survey (e.g. the position and time when instruments go into the water or come back on board, the depths from which samples are taken, or notes about equipment malfunctions and repairs) is essential for efficient post-processing and quality control of the collected data, especially in the case of suspicious measurements. Most scientists still use the good old paper method for such tasks and later transform the notes into digital form using spreadsheet applications. This approach looks more "safe" (if a person is not confident in their computer skills) but in reality it turns out to be more error-prone (especially when it comes to position recording and variations of sexagesimal representations, or if there are no hints about which timezone was used for time recording). As cruises usually involve various teams, not all of which make their own measurements at each station, keeping an eye on the current position is essential, especially if the cruise plan changes (due to bad weather or the discovery of underwater features that require more attention than originally planned). Also, the position is usually displayed on only one monitor (as most GPS receivers provide only serial connectivity, and distributing such a signal to multiple clients requires devices that are not widespread on the computer equipment market), which can create a messy situation in the control room when everybody tries to write down the current position and time. To overcome all of these obstacles, the Distributed Digital Survey Logbook was implemented. It is built on the Open Geospatial Consortium (OGC) compliant GeoServer, using a PostGIS database. It can handle geospatial content (charts and cruise plans) and record the vessel track and any kind of event that a team member wants to record. As GeoServer allows distribution of position data to an unlimited number of clients (from traditional PCs and laptops to tablets and smartphones), it can decrease pressure on the control room, no matter if all features are used or just as distant
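The GeoServer/PostGIS stack described above implies that clients record events through OGC web services. As a rough sketch of that idea (not taken from the record), the snippet below builds a WFS-T 1.1.0 Insert payload for one logbook event; the feature type name "events" and the attributes "geom", "event_time" and "note" are invented for illustration.

```python
import xml.etree.ElementTree as ET

WFS = "http://www.opengis.net/wfs"
GML = "http://www.opengis.net/gml"

def wfst_insert(lon, lat, event_time, note):
    """Build a WFS-T 1.1.0 Insert document for a single logbook event."""
    tx = ET.Element(f"{{{WFS}}}Transaction", {"service": "WFS", "version": "1.1.0"})
    ins = ET.SubElement(tx, f"{{{WFS}}}Insert")
    feature = ET.SubElement(ins, "events")  # hypothetical feature type name
    point = ET.SubElement(ET.SubElement(feature, "geom"),
                          f"{{{GML}}}Point", {"srsName": "EPSG:4326"})
    ET.SubElement(point, f"{{{GML}}}pos").text = f"{lon} {lat}"
    ET.SubElement(feature, "event_time").text = event_time
    ET.SubElement(feature, "note").text = note
    return ET.tostring(tx, encoding="unicode")

payload = wfst_insert(16.43, 43.51, "2013-04-10T09:15:00Z", "CTD into the water")
```

A tablet or laptop in the control room would POST such a payload to the GeoServer WFS endpoint, and GeoServer would write the feature into the PostGIS table behind the layer.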

  15. Enhanced DIII-D Data Management Through a Relational Database

    Science.gov (United States)

    Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.

    2000-10-01

    A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Metadata about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. The database may be accessed through programming languages such as C, Java, and IDL, or through ODBC-compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
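To illustrate the cross-shot querying style the abstract describes, here is a minimal sketch using Python's built-in sqlite3 module in place of the production RDBMS; the "shots" summary table and its columns (ip_ma, heating_mw) are invented for the example and do not reflect the actual DIII-D schema.

```python
import sqlite3

# An in-memory database stands in for the experiment summary database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE shots (shot INTEGER PRIMARY KEY, ip_ma REAL, heating_mw REAL)")
con.executemany("INSERT INTO shots VALUES (?, ?, ?)",
                [(131001, 1.2, 4.5), (131002, 0.9, 6.1), (131003, 1.4, 3.2)])

# Data mining across many shots in a single SQL statement:
rows = con.execute(
    "SELECT shot, ip_ma FROM shots WHERE heating_mw > 4 ORDER BY shot").fetchall()
print(rows)  # -> [(131001, 1.2), (131002, 0.9)]
```

The point of the design is exactly this: one declarative query spans many discharges, instead of opening each shot's data files in turn.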

  16. ExoGeoLab Pilot Project for Landers, Rovers and Instruments

    Science.gov (United States)

    Foing, Bernard

    2010-05-01

    We have developed a pilot facility with a Robotic Test Bench (ExoGeoLab) and a Mobile Lab Habitat (ExoHab). They can be used to validate concepts and external instruments from partner institutes. The ExoGeoLab research incubator project has started in the frame of a collaboration between ILEWG (International Lunar Exploration Working Group http://sci.esa.int/ilewg), ESTEC, NASA and academic partners, supported by a design and control desk in the European Space Incubator (ESI), as well as infrastructure. ExoGeoLab includes a sequence of technology and research pilot project activities: - Data analysis and interpretation of remote sensing and in-situ data, and merging of multi-scale data sets - Procurement and integration of geophysical, geo-chemical and astrobiological breadboard instruments on a surface station and rovers - Integration of cameras, environment and solar sensors, Visible and near IR spectrometer, Raman spectrometer, sample handling, cooperative rovers - Delivery of a generic small planetary lander demonstrator (ExoGeoLab lander, Sept 2009) as a platform for multi-instruments tests - Research operations and exploitation of ExoGeoLab test bench for various conceptual configurations, and support for definition and design of science surface packages (Moon, Mars, NEOs, outer moons) - Field tests of lander, rovers and instruments in analogue sites (Utah MDRS 2009 & 2010, Eifel volcanic park in Sept 2009, and future campaigns). Co-authors, ILEWG ExoGeoLab & ExoHab Team: B.H. Foing(1,11)*#, C. Stoker(2,11)*, P. Ehrenfreund(10,11), L. Boche-Sauvan(1,11)*, L. Wendt(8)*, C. Gross(8,11)*, C. Thiel(9)*, S. Peters(1,6)*, A. Borst(1,6)*, J. Zavaleta(2)*, P. Sarrazin(2)*, D. Blake(2), J. Page(1,4,11), V. Pletser(5,11)*, E. Monaghan(1)*, P. Mahapatra(1)#, A. Noroozi(3), P. Giannopoulos(1,11), A. Calzada(1,6,11), R. Walker(7), T. Zegers(1,15)#, G. Groemer(12)#, W. Stumptner(12)#, B. Foing(2,5), J. K. Blom(3)#, A. Perrin(14)#, M. Mikolajczak(14)#, S. Chevrier(14

  17. A distributed charge storage with GeO2 nanodots

    International Nuclear Information System (INIS)

    Chang, T.C.; Yan, S.T.; Hsu, C.H.; Tang, M.T.; Lee, J.F.; Tai, Y.H.; Liu, P.T.; Sze, S.M.

    2004-01-01

    In this study, a distributed charge storage with GeO2 nanodots is demonstrated. The mean size and areal density of the nanodots embedded in SiO2 are estimated to be about 5.5 nm and 4.3×10^11 cm^-2, respectively. The composition of the dots is also confirmed to be GeO2 by x-ray absorption near-edge structure analyses. A significant memory effect is observed through the electrical measurements. Under the low voltage operation of 5 V, the memory window is estimated to be ∼0.45 V. Also, a physical model is proposed to demonstrate the charge storage effect through the interfacial traps of the GeO2 nanodots.

  18. Towards Geo-spatial Hypermedia: Concepts and Prototype Implementation

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Vestergaard, Peter Posselt; Ørbæk, Peter

    2002-01-01

    This paper combines spatial hypermedia with techniques from Geographical Information Systems and location based services. We describe the Topos 3D Spatial Hypermedia system and how it has been developed to support geo-spatial hypermedia coupling hypermedia information to model representations ... of real world buildings and landscapes. The prototype experiments are primarily aimed at supporting architects and landscape architects in their work on site. Here it is useful to be able to superimpose and add different layers of information to, e.g. a landscape depending on the task being worked on. We ... and indirect navigation. Finally, we conclude with a number of research issues which are central to the future development of geo-spatial hypermedia, including design issues in combining metaphorical and literal hypermedia space, as well as a discussion of the role of spatial parsing in a geo-spatial context.

  19. Study on geo-information modelling

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana

    2006-01-01

    Roč. 5, č. 5 (2006), s. 1108-1113 ISSN 1109-2777 Institutional research plan: CEZ:AV0Z10750506 Keywords : control GIS * geo-information modelling * uncertainty * spatial temporal approach Web Services Subject RIV: BC - Control Systems Theory

  20. Probe into geo-information science and information science in nuclear and geography science in China

    International Nuclear Information System (INIS)

    Tang Bin

    2001-01-01

    In the past ten years a new science, Geo-Information Science, a branch of Geoscience, has developed very fast and has been highly valued and received much attention. Based on information science, the author analyzes the flows of material, energy, people and information and their relations, and presents the place of Geo-Information Science within Geoscience and its content in terms of Geo-Informatics, Geo-Information technology and their applications. Finally, the author discusses the main content of, and the problems existing in, Geo-Information Science as it involves Nuclear and Geography Science.

  1. Environmental data for the planning of off-shore wind parks from the EnerGEO Platform of Integrated Assessment (PIA)

    Energy Technology Data Exchange (ETDEWEB)

    Zelle, Hein; Mika, Agnes; Calkoen, Charles; Santbergen, Peter [BMT ARGOSS, Marknesse (Netherlands); Blanc, Isabelle; Guermont, Catherine; Menard, Lionel; Gschwind, Benoit [MINES ParisTech, Sophia Antipolis (France)

    2013-07-01

    The EU-sponsored EnerGEO project aims at providing decision makers with a modelling platform to assess the environmental impacts of different sources of renewable energy. One of the pillars of the project is the Wind Energy Pilot, addressing the effects of offshore wind parks on air pollution and energy use. The methods used in the pilot and the underlying environmental databases are integrated into a WebGIS client tool and made available to the public. This paper is dedicated to describing the environmental databases and supporting data incorporated in the client tool. A 27-km resolution, 11-year wind database is created using the WRF model. The wind database is used to assess the wind climate in the north-west Atlantic region and to derive the potential power output from offshore wind parks. Auxiliary data concerning water depth, distance to shore and distance to the nearest suitable port are created to aid the planning and maintenance phases. Seasonal workability conditions are assessed using a 20-year wave database. The distance at which future wind parks should be placed to exhibit different wind climates is investigated. (orig.)
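The step from a wind climatology to "potential power output" is usually taken through a turbine power curve. The sketch below is illustrative only: the cut-in, rated and cut-out speeds and the 2 MW rating are invented for the example, not EnerGEO values, and a real assessment would also account for park layout and losses.

```python
def turbine_power_mw(v, cut_in=3.5, rated_speed=13.0, cut_out=25.0, rated_mw=2.0):
    """Generic piecewise turbine power curve: 0 below cut-in and above cut-out,
    a cubic ramp up to rated speed, then constant rated power."""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_speed:
        return rated_mw
    # Wind power scales with v^3 between cut-in and rated speed.
    frac = (v**3 - cut_in**3) / (rated_speed**3 - cut_in**3)
    return rated_mw * frac

speeds = [2.0, 8.0, 14.0, 26.0]  # sample hourly wind speeds (m/s)
print([round(turbine_power_mw(v), 3) for v in speeds])  # -> [0.0, 0.436, 2.0, 0.0]
```

Applying such a function to every hour of an 11-year modelled wind time series, then averaging, gives the kind of potential-output map the pilot derives.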

  2. Environmental data for the planning of off-shore wind parks from the EnerGEO Platform of Integrated Assessment (PIA)

    International Nuclear Information System (INIS)

    Zelle, Hein; Mika, Agnes; Calkoen, Charles; Santbergen, Peter; Blanc, Isabelle; Guermont, Catherine; Menard, Lionel; Gschwind, Benoit

    2013-01-01

    The EU-sponsored EnerGEO project aims at providing decision makers with a modelling platform to assess the environmental impacts of different sources of renewable energy. One of the pillars of the project is the Wind Energy Pilot, addressing the effects of offshore wind parks on air pollution and energy use. The methods used in the pilot and the underlying environmental databases are integrated into a WebGIS client tool and made available to the public. This paper is dedicated to describing the environmental databases and supporting data incorporated in the client tool. A 27-km resolution, 11-year wind database is created using the WRF model. The wind database is used to assess the wind climate in the north-west Atlantic region and to derive the potential power output from offshore wind parks. Auxiliary data concerning water depth, distance to shore and distance to the nearest suitable port are created to aid the planning and maintenance phases. Seasonal workability conditions are assessed using a 20-year wave database. The distance at which future wind parks should be placed to exhibit different wind climates is investigated. (orig.)

  3. The ChArMEx database

    Science.gov (United States)

    Ferré, Hélène; Belmahfoud, Nizar; Boichard, Jean-Luc; Brissebrat, Guillaume; Cloché, Sophie; Descloitres, Jacques; Fleury, Laurence; Focsa, Loredana; Henriot, Nicolas; Mière, Arnaud; Ramage, Karim; Vermeulen, Anne; Boulanger, Damien

    2015-04-01

    The Chemistry-Aerosol Mediterranean Experiment (ChArMEx, http://charmex.lsce.ipsl.fr/) aims at a scientific assessment of the present and future state of the atmospheric environment in the Mediterranean Basin, and of its impacts on the regional climate, air quality, and marine biogeochemistry. The project includes long-term monitoring of environmental parameters, intensive field campaigns, use of satellite data and modelling studies. Therefore ChArMEx scientists produce and need to access a wide diversity of data. In this context, the objective of the database task is to organize data management, the distribution system and services, such as facilitating the exchange of information and stimulating the collaboration between researchers within the ChArMEx community, and beyond. The database relies on a strong collaboration between the ICARE, IPSL and OMP data centers and has been set up in the framework of the Mediterranean Integrated Studies at Regional And Local Scales (MISTRALS) program data portal. ChArMEx data, either produced or used by the project, are documented and accessible through the database website: http://mistrals.sedoo.fr/ChArMEx. The website offers the usual but user-friendly functionalities: data catalog, user registration procedure, search tool to select and access data... The metadata (data descriptions) are standardized and comply with international standards (ISO 19115-19139; INSPIRE European Directive; Global Change Master Directory Thesaurus). A Digital Object Identifier (DOI) assignment procedure automatically registers the datasets, in order to make them easier to access, cite, reuse and verify. At present, the ChArMEx database contains about 120 datasets, including more than 80 in situ datasets (2012, 2013 and 2014 summer campaigns, background monitoring station of Ersa...), 25 model output sets (dust model intercomparison, MEDCORDEX scenarios...), a high resolution emission inventory over the Mediterranean... Many in situ datasets

  4. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate ISO 19115-compliant complete product provenance metadata before and after the workflow execution. The generation of provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119, transparent, translucent, and opaque, are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offering of the products after a proper peer review of the models is conducted. Automated workflow composition has been demonstrated successfully based on ontologies and artificial
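The two-level (conceptual vs. concrete) workflow idea can be reduced to a toy sketch: a conceptual workflow is typed over data/service types, and instantiation binds each typed step to a concrete dataset when a user requests a matching virtual product. All names below (the "NDVI_map" product, step and dataset names) are invented for illustration; the real system encodes both levels in BPEL, not Python.

```python
# A conceptual workflow: ordered steps, each typed by an (ontology-defined) data type.
conceptual = {"virtual_product": "NDVI_map",
              "steps": [("subset", "MultiBandImage"), ("ndvi", "RedNirPair")]}

def instantiate(workflow, bindings):
    """Bind each typed step to a concrete dataset, yielding an executable plan."""
    return [(op, bindings[typ]) for op, typ in workflow["steps"]]

plan = instantiate(conceptual, {"MultiBandImage": "landsat_scene_042.tif",
                                "RedNirPair": "bands_3_4"})
print(plan)  # -> [('subset', 'landsat_scene_042.tif'), ('ndvi', 'bands_3_4')]
```

The design choice mirrored here is that the type-level model is authored once by domain experts, while binding happens automatically per user request.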

  5. Multi-User GeoGebra for Virtual Math Teams

    Directory of Open Access Journals (Sweden)

    Gerry Stahl

    2010-05-01

    Full Text Available The Math Forum is an online resource center for pre-algebra, algebra, geometry and pre-calculus. Its Virtual Math Teams (VMT) service provides an integrated web-based environment for small teams to discuss mathematics. The VMT collaboration environment now includes the dynamic mathematics application GeoGebra. It offers a multi-user version of GeoGebra, which can be used in concert with VMT’s chat, web browsers, curricula and wiki repository.

  6. Global Remote Sensing Data Subdivision Organization Based on GeoSOT%全球遥感数据剖分组织的 GeoSOT 网格应用

    Institute of Scientific and Technical Information of China (English)

    2014-01-01

    At present, different departments' data centers use a variety of grids to organize their data. In order to find a remote sensing image data organization grid that is compatible with the existing surveying and mapping data, a scheme of remote sensing data organization based on GeoSOT, a geographical coordinate subdividing grid with one-dimension integer coding on a 2^n-tree, is proposed. It is theoretically shown that GeoSOT has good isomorphism with the National Topographic Map sheet system and with other grids such as WorldWind, Google Earth, Google Maps, Bing Maps and Map World (Tianditu), which makes it easy for the GeoSOT grid to inherit traditional surveying and mapping data and to organize global remote sensing data. Under the premise of keeping the existing data organization unchanged, a "virtual one global grid" for global remote sensing data organization based on GeoSOT and a method of quickly generating specification data products by aggregating GeoSOT cells are introduced. Tests show that the virtual one global grid based on GeoSOT significantly improves the efficiency of remote sensing data integration.
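The 2^n-tree subdivision underlying such grids can be illustrated with a plain lat/lon quadtree: each level halves the cell in both axes and appends one digit to the cell code, so nearby points share code prefixes. This is only in the spirit of GeoSOT; the real GeoSOT code layout (extended 512°x512° space, Z-order integer codes) differs.

```python
def quad_code(lon, lat, level):
    """Quadtree cell code for a point: one digit (0-3) per subdivision level."""
    west, east, south, north = -180.0, 180.0, -90.0, 90.0
    code = ""
    for _ in range(level):
        mid_lon, mid_lat = (west + east) / 2, (south + north) / 2
        quadrant = 0
        if lon >= mid_lon:      # eastern half -> bit 1
            quadrant += 1
            west = mid_lon
        else:
            east = mid_lon
        if lat >= mid_lat:      # northern half -> bit 2
            quadrant += 2
            south = mid_lat
        else:
            north = mid_lat
        code += str(quadrant)
    return code

print(quad_code(116.4, 39.9, 4))  # Beijing area, 4 levels deep
```

Cell aggregation, as used for generating specification data products, then amounts to grouping codes by a shared prefix.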

  7. GeoMod 2014 - Modelling in geoscience

    Science.gov (United States)

    Leever, Karen; Oncken, Onno

    2016-08-01

    GeoMod is a biennial conference to review and discuss the latest developments in analogue and numerical modelling of lithospheric and mantle deformation. GeoMod2014 took place at the GFZ German Research Centre for Geosciences in Potsdam, Germany. Its focus was on rheology and deformation at a wide range of temporal and spatial scales: from earthquakes to long-term deformation, from micro-structures to orogens and subduction systems. It also addressed volcanotectonics and the interaction between tectonics and surface processes (Elger et al., 2014). The conference was followed by a 2-day short course on "Constitutive Laws: from Observation to Implementation in Models" and a 1-day hands-on tutorial on the ASPECT numerical modelling software.

  8. Parametric instability in GEO 600 interferometer

    International Nuclear Information System (INIS)

    Gurkovsky, A.G.; Vyatchanin, S.P.

    2007-01-01

    We present an analysis of the undesirable effect of parametric instability in the signal-recycled GEO 600 interferometer. The basis for this effect is the excitation of an additional (Stokes) optical mode, having frequency ω1, and a mirror elastic mode, having frequency ωm, when the optical energy stored in the main FP cavity mode, having frequency ω0, exceeds a certain threshold and the detuning Δ = ω0 - ω1 - ωm is small. We discuss the potential of observing parametric instability and its precursors in the GEO 600 interferometer. This approach provides the best option to become familiar with this phenomenon, to develop experimental methods to suppress it, and to test the effectiveness of these methods in situ.

  9. Quantum search of a real unstructured database

    Science.gov (United States)

    Broda, Bogusław

    2016-02-01

    A simple circuit implementation of the oracle for Grover's quantum search of a real unstructured classical database is proposed. The oracle contains a kind of quantumly accessible classical memory, which stores the database.
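Grover's search, which the proposed oracle serves, is easy to simulate classically for small N: the oracle flips the phase of the marked item's amplitude and the diffusion operator inverts all amplitudes about their mean. The sketch below is a generic statevector simulation for N = 8, not the circuit construction from the paper.

```python
import math

def grover(n_items, marked, iterations):
    """Simulate Grover iterations on a real amplitude vector of size n_items."""
    amp = [1 / math.sqrt(n_items)] * n_items   # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]             # oracle: phase flip on the marked item
        mean = sum(amp) / n_items              # diffusion: inversion about the mean
        amp = [2 * mean - a for a in amp]
    return amp

n = 8
best = math.floor(math.pi / 4 * math.sqrt(n))  # optimal iteration count (2 for N=8)
probs = [a * a for a in grover(n, marked=5, iterations=best)]
print(max(range(n), key=probs.__getitem__))    # -> 5
```

After the optimal ~(π/4)√N iterations the marked index dominates the measurement probability (about 0.95 here), which is the quadratic speedup over classical unstructured search.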

  10. The German-Chinese research collaboration YANGTZE-GEO: Assessing the geo-risks in the Three Gorges Reservoir area

    Science.gov (United States)

    Schönbrodt, S.; Behrens, T.; Bieger, K.; Ehret, D.; Frei, M.; Hörmann, G.; Seeber, C.; Schleier, M.; Schmalz, B.; Fohrer, N.; Kaufmann, H.; King, L.; Rohn, J.; Subklew, G.; Xiang, W.

    2012-04-01

    The river impoundment by the Three Gorges Dam leads to resettlement and land reclamation on steep slopes. As a consequence, ecosystem changes such as soil erosion, mass movements, and diffuse sediment and matter fluxes are widely expected to increase rapidly. In order to assess and analyse those ecosystem changes, the German-Chinese joint research project YANGTZE-GEO was set up in 2008. Within the framework of YANGTZE-GEO, five German universities (Tuebingen, Erlangen, Giessen, Kiel, Potsdam) conducted studies on soil erosion, mass movements, diffuse matter inputs, and land use change and vulnerability in close collaboration with Chinese scientists. The Chinese partners and institutions are, in alphabetical order of their home cities, the Chinese Research Academy of Environmental Sciences (CRAES; Beijing), the Standing Office of the State Council Three Gorges Project Construction Committee (Beijing), the National Climate Centre (NCC) of the China Meteorological Administration (CMA; Beijing), the Aero Geophysical Survey and Remote Sensing for Land and Resources (AES; Beijing), the Nanjing University, the CAS Institute of Soil Science (Nanjing), the Nanjing Institute of Geography and Limnology at CAS (NIGLAS; Nanjing), the China University of Geosciences (CUG; Wuhan), the CAS Institute of Hydrobiology (Wuhan), and the China Three Gorges University (Yichang). The overall aim of YANGTZE-GEO is the development of a risk assessment and forecasting system to locate high-risk areas using GIS-based erosion modelling, data mining tools for terrace condition analysis and landslide recognition, eco-hydrological modelling for diffuse matter inputs, and state-of-the-art remote sensing to assess the landscape's vulnerability. Furthermore, the project aims at the recommendation of sustainable land management systems. YANGTZE-GEO showed the relevance of such research and crucially contributes to the understanding of the dimension and dynamics of the ecological consequences of

  11. Tourism through Travel Club: A Database Project

    Science.gov (United States)

    Pratt, Renée M. E.; Smatt, Cindi T.; Wynn, Donald E.

    2017-01-01

    This applied database exercise utilizes a scenario-based case study to teach the basics of Microsoft Access and database management in introductory information systems and database courses. The case includes background information on a start-up business (i.e., Carol's Travel Club), a description of functional business requirements,…

  12. 47 CFR 69.120 - Line information database.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Line information database. 69.120 Section 69...) ACCESS CHARGES Computation of Charges § 69.120 Line information database. (a) A charge that is expressed... from a local exchange carrier database to recover the costs of: (1) The transmission facilities between...

  13. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    Science.gov (United States)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  14. Fire test database

    International Nuclear Information System (INIS)

    Lee, J.A.

    1989-01-01

    This paper describes a project recently completed for EPRI by Impell. The purpose of the project was to develop a reference database of fire tests performed on non-typical fire-rated assemblies. The database is designed for use by utility fire protection engineers to locate test reports for power plant fire-rated assemblies. As utilities prepare to respond to Information Notice 88-04, the database will identify utilities, vendors or manufacturers who have specific fire test data. The database contains fire test report summaries for 729 tested configurations. For each summary, a contact is identified from whom a copy of the complete fire test report can be obtained. Five types of configurations are included: doors, dampers, seals, wraps and walls. The database is computerized, with one version for IBM PCs and one for the Macintosh. Each database is accessed through user-friendly software that supports adding, deleting and browsing records. There are five major database files, one for each of the five types of tested configurations. The contents of each provide significant information regarding the test method and the physical attributes of the tested configuration. 3 figs

  15. The Eruption Forecasting Information System (EFIS) database project

    Science.gov (United States)

    Ogburn, Sarah; Harpel, Chris; Pesicek, Jeremy; Wellik, Jay; Pallister, John; Wright, Heather

    2016-04-01

    The Eruption Forecasting Information System (EFIS) project is a new initiative of the U.S. Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) with the goal of enhancing VDAP's ability to forecast the outcome of volcanic unrest. The EFIS project seeks to: (1) move away from relying on collective memory toward probability estimation using databases; (2) create databases useful for pattern recognition and for answering common VDAP questions, e.g. how commonly does unrest lead to eruption? how commonly do phreatic eruptions portend magmatic eruptions, and what is the range of antecedence times?; (3) create generic probabilistic event trees using global data for different volcano 'types'; (4) create background, volcano-specific, probabilistic event trees for frequently active or particularly hazardous volcanoes in advance of a crisis; and (5) quantify and communicate uncertainty in probabilities. A major component of the project is the global EFIS relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest, including the timing of phreatic eruptions, column heights, eruptive products, etc., and will initially be populated using chronicles of eruptive activity from Alaskan volcanic eruptions in the GeoDIVA database (Cameron et al. 2013). This database module allows us to query across other global databases such as the WOVOdat database of monitoring data and the Smithsonian Institution's Global Volcanism Program (GVP) database of eruptive histories and volcano information. The EFIS database is in the early stages of development and population; thus, this contribution also serves as a request for feedback from the community.
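
    A question such as "how commonly does unrest lead to eruption?" maps naturally onto a single aggregate query over a chronology table. The sketch below is illustrative only: the table, columns, and episode outcomes are invented stand-ins, not the actual EFIS schema or data, and SQLite stands in for the production database.

```python
import sqlite3

# Hypothetical sketch: table name, columns, and outcomes are invented,
# not the actual EFIS schema or GeoDIVA-derived data.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE unrest_episode (
    volcano TEXT, start_year INTEGER, led_to_eruption INTEGER)""")
conn.executemany(
    "INSERT INTO unrest_episode VALUES (?, ?, ?)",
    [("Volcano A", 1989, 1), ("Volcano B", 1992, 1),
     ("Volcano C", 1996, 0), ("Volcano D", 2006, 1)])

# "How commonly does unrest lead to eruption?" as one aggregate query.
(prob,) = conn.execute(
    "SELECT AVG(led_to_eruption) FROM unrest_episode").fetchone()
print(f"P(eruption | unrest) = {prob:.2f}")  # 3 of 4 episodes -> 0.75
```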

  16. Pre-Service Mathematics Teachers' Views about GeoGebra and Its Use

    Science.gov (United States)

    Horzum, Tugba; Ünlü, Melihan

    2017-01-01

    The purpose of this study was to determine the views of pre-service Mathematics teachers' (PMTs) about GeoGebra and its use after being exposed to GeoGebra activities designing processes. This is a case study which was conducted with 36 PMTs. Three open-ended questions were used, after the completion of the 14-week process of GeoGebra training and…

  17. Second-Tier Database for Ecosystem Focus, 2000-2001 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Van Holmes, Chris; Muongchanh, Christine; Anderson, James J. (University of Washington, School of Aquatic and Fishery Sciences, Seattle, WA)

    2001-11-01

    The Second-Tier Database for Ecosystem Focus (Contract 00004124) provides direct and timely public access to Columbia Basin environmental, operational, fishery and riverine data resources for federal, state, public and private entities. The Second-Tier Database known as Data Access in Realtime (DART) does not duplicate services provided by other government entities in the region. Rather, it integrates public data for effective access, consideration and application.

  18. Acceleration and volumetric strain generated by the Parkfield 2004 earthquake on the GEOS strong-motion array near Parkfield, California

    Science.gov (United States)

    Borcherdt, Rodger D.; Johnston, Malcolm J.S.; Dietel, Christopher; Glassmoyer, Gary; Myren, Doug; Stephens, Christopher

    2004-01-01

    An integrated array of 11 General Earthquake Observation System (GEOS) stations installed near Parkfield, CA provided on-scale, broadband, wide-dynamic-range measurements of acceleration and volumetric strain for the Parkfield earthquake (M 6.0) of September 28, 2004. Three-component measurements of acceleration were obtained at each of the stations. Measurements of collocated acceleration and volumetric strain were obtained at four of the stations. Measurements of velocity at most sites were on scale only for the initial P-wave arrival. When considered in the context of the extensive set of strong-motion recordings obtained on more than 40 analog stations by the California Strong-Motion Instrumentation Program (Shakal et al., 2004, http://www.quake.ca.gov/cisn-edc) and those on the dense array of Spudich et al. (1988), these recordings provide an unprecedented document of the nature of the near-source strong motion generated by a M 6.0 earthquake. The data set reported herein provides the most extensive set of near-field, broadband, wide-dynamic-range measurements of acceleration and volumetric strain for an earthquake as large as M 6 of which the authors are aware. As a result, considerable interest has been expressed in these data. This report is intended to describe the data and facilitate their use to resolve a number of scientific and engineering questions concerning earthquake rupture processes and resultant near-field motions and strains. This report provides a description of the array, its scientific objectives and the strong-motion recordings obtained of the main shock. The report provides copies of the uncorrected and corrected data. Copies of the inferred velocities, displacements, and pseudo-velocity response spectra are provided.
Digital versions of these recordings are accessible with information available through the internet at several locations: the National Strong-Motion Program web site (http://agram.wr.usgs.gov/), the COSMOS Virtual Data Center Web site

  19. Software Engineering Laboratory (SEL) database organization and user's guide, revision 2

    Science.gov (United States)

    Morusiewicz, Linda; Bristow, John

    1992-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base table is described. In addition, techniques for accessing the database through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL) are discussed.
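
    The base-table-and-view organization described above can be illustrated with a minimal sketch. The names below (`project`, `effort`, `project_effort`) are hypothetical stand-ins, not the actual SEL schema, and SQLite stands in for the ORACLE system the guide covers.

```python
import sqlite3

# Illustrative only: hypothetical tables and view, not the SEL schema.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE project (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE effort (project_id INTEGER, hours REAL)")
db.execute("INSERT INTO project VALUES (1, 'PROJ_X')")
db.executemany("INSERT INTO effort VALUES (?, ?)", [(1, 40.0), (1, 12.5)])

# A view presents base-table data in the shape users query via SQL.
db.execute("""CREATE VIEW project_effort AS
    SELECT p.name, SUM(e.hours) AS total_hours
    FROM project p JOIN effort e ON e.project_id = p.id
    GROUP BY p.name""")
rows = db.execute("SELECT * FROM project_effort").fetchall()
print(rows)  # [('PROJ_X', 52.5)]
```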

  20. GeoDataspaces: Simplifying Data Management Tasks with Globus

    Science.gov (United States)

    Malik, T.; Chard, K.; Tchoua, R. B.; Foster, I.

    2014-12-01

    Data and its management are central to the modern scientific enterprise. Typically, geoscientists rely on observations and model output data from several disparate sources (file systems, RDBMS, spreadsheets, remote data sources). Integrated data management solutions that provide intuitive semantics and uniform interfaces, irrespective of the kind of data source, are, however, lacking. Consequently, geoscientists are left to conduct low-level and time-consuming data management tasks, individually and repeatedly, for each data source, often resulting in handling errors. In this talk we will describe how the EarthCube GeoDataspace project is improving this situation for seismologists, hydrologists, and space scientists by simplifying some of the existing data management tasks that arise when developing computational models. We will demonstrate a GeoDataspace, bootstrapped with "geounits", which are self-contained metadata packages that provide a complete description of all data elements associated with a model run, including input/output and parameter files, the model executable and any associated libraries. Geounits link raw and derived data as well as provenance information describing how data was derived. We will discuss challenges in establishing geounits and describe machine learning and human annotation approaches that can be used for extracting and associating ad hoc and unstructured scientific metadata hidden in binary formats with data resources and models. We will show how geounits can improve search and discoverability of data associated with model runs. To support this model, we will describe efforts toward creating a scalable metadata catalog that helps to maintain, search and discover geounits within the Globus network of accessible endpoints. This talk will focus on the issue of creating comprehensive personal inventories of data assets for computational geoscientists, and describe a publishing mechanism, which can be used to

  1. Combination of a geolocation database access with infrastructure sensing in TV bands

    OpenAIRE

    Dionísio, Rogério; Ribeiro, Jorge; Marques, Paulo; Rodriguez, Jonathan

    2014-01-01

    This paper describes the implementation and the technical specifications of a geolocation database assisted by a spectrum-monitoring outdoor network. The geolocation database is populated according to Electronic Communications Committee (ECC) Report 186 methodology. The application programming interface (API) between the sensor network and the geolocation database implements an effective and secure connection to successfully gather sensing data and send it to the geolocation database for ...
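
    As a rough illustration of the kind of record such an API might carry from a sensor node to the geolocation database, the sketch below serializes one hypothetical measurement. Every field name here is an assumption for illustration, not the project's actual interface or the ECC Report 186 format.

```python
import json

# Hypothetical sensing report; field names are illustrative assumptions.
report = {
    "sensor_id": "node-07",
    "lat": 39.82,
    "lon": -7.49,
    "tv_channel": 42,     # TV channel whose occupancy was measured
    "rssi_dbm": -97.3,    # received signal strength at the sensor
}
payload = json.dumps(report)  # body the API connection would transmit
print(payload)
```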

  2. The GEO-3 Scenarios 2002-2032. Quantification and Analysis of Environmental Impacts

    International Nuclear Information System (INIS)

    Bakkes, J.; Potting, J.; Kemp-Benedict, E.; Raskin, P.; Masui, T.; Rana, A.; Nellemann, C.; Rothman, D.

    2004-01-01

    The four contrasting visions of the world's next three decades as presented in the third Global Environment Outlook (GEO-3) have many implications for policy - from hunger to climate change and from freshwater issues to biodiversity. The four scenarios analysed are Markets First, Policy First, Security First, Sustainability First. Presenting a deeper analysis than the original GEO-3 report, this Technical Report quantifies the impacts of the scenarios for all 19 GEO 'sub-regions', such as Eastern Africa and Central Europe. Regional impacts are discussed in the context of sustainable development. The report summary compares the impacts of the four scenarios across regions - and for the world as a whole - in the light of internationally agreed targets including those in the Millennium Declaration where applicable. It provides an account of the analytical methods, key assumptions, models and other tools, along with the approaches used in the analyses. Based on the methods and results, the report looks back on the process of producing the forward-looking analysis for GEO-3. Were all analytical centres on the same track? Did the approach adopted for GEO-3 contribute to the overall GEO objective of strengthening global-regional involvement and linkages?

  3. Geo-communication and web-based geospatial infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2005-01-01

    The introduction of web-services as index-portals based on geoinformation has changed the conditions for both content and form of geocommunication. A high number of players and interactions (as well as a very high number of all kinds of information and combinations of these) characterize web-services, where maps are only a part of the whole. These new conditions demand new ways of modelling the processes leading to geo-communication. One new aspect is the fact that the service providers have become a part of the geo-communication process with influence on the content. Another aspect...

  4. GiSAO.db: a database for ageing research

    Directory of Open Access Journals (Sweden)

    Grillari Johannes

    2011-05-01

    Abstract Background Age-related gene expression patterns of Homo sapiens as well as of model organisms such as Mus musculus, Saccharomyces cerevisiae, Caenorhabditis elegans and Drosophila melanogaster are a basis for understanding the genetic mechanisms of ageing. For an effective analysis and interpretation of expression profiles it is necessary to store and manage huge amounts of data in an organized way, so that these data can be accessed and processed easily. Description GiSAO.db (Genes involved in senescence, apoptosis and oxidative stress database) is a web-based database system for storing and retrieving ageing-related experimental data. Expression data of genes and miRNAs, annotation data like gene identifiers and GO terms, ortholog data and data of follow-up experiments are stored in the database. A user-friendly web application provides access to the stored data. KEGG pathways were incorporated and links to external databases augment the information in GiSAO.db. Search functions facilitate retrieval of data, which can also be exported for further processing. Conclusions We have developed a centralized database that is very well suited for the management of data for ageing research. The database can be accessed at https://gisao.genome.tugraz.at and all the stored data can be viewed with a guest account.

  5. SpeciesGeoCoder: Fast Categorization of Species Occurrences for Analyses of Biodiversity, Biogeography, Ecology, and Evolution.

    Science.gov (United States)

    Töpel, Mats; Zizka, Alexander; Calió, Maria Fernanda; Scharn, Ruud; Silvestro, Daniele; Antonelli, Alexandre

    2017-03-01

    Understanding the patterns and processes underlying the uneven distribution of biodiversity across space constitutes a major scientific challenge in systematic biology and biogeography, which largely relies on effectively mapping and making sense of rapidly increasing species occurrence data. There is thus an urgent need for making the process of coding species into spatial units faster, automated, transparent, and reproducible. Here we present SpeciesGeoCoder, an open-source software package written in Python and R, that allows for easy coding of species into user-defined operational units. These units may be of any size and be purely spatial (i.e., polygons) such as countries and states, conservation areas, biomes, islands, biodiversity hotspots, and areas of endemism, but may also include elevation ranges. This flexibility allows scoring species into complex categories, such as those encountered in topographically and ecologically heterogeneous landscapes. In addition, SpeciesGeoCoder can be used to facilitate sorting and cleaning of occurrence data obtained from online databases, and for testing the impact of incorrect identification of specimens on the spatial coding of species. The various outputs of SpeciesGeoCoder include quantitative biodiversity statistics, global and local distribution maps, and files that can be used directly in many phylogeny-based applications for ancestral range reconstruction, investigations of biome evolution, and other comparative methods. Our simulations indicate that even datasets containing hundreds of millions of records can be analyzed in relatively short time using a standard computer. We exemplify the use of SpeciesGeoCoder by inferring the historical dispersal of birds across the Isthmus of Panama, showing that lowland species crossed the Isthmus about twice as frequently as montane species with a marked increase in the number of dispersals during the last 10 million years. [ancestral area reconstruction; biodiversity
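
    The core operation, assigning an occurrence point to a user-defined spatial unit, can be sketched in pure Python with the standard ray-casting point-in-polygon test. The polygon and occurrence records below are toy stand-ins, not SpeciesGeoCoder's actual implementation or data model.

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test; polygon is a list of (lon, lat) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does the edge straddle the horizontal ray at this latitude?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical operational unit (a square) and two occurrence records.
unit = [(0, 0), (10, 0), (10, 10), (0, 10)]
occurrences = [("Species A", 5, 5), ("Species B", 15, 3)]
coded = {sp: point_in_polygon(x, y, unit) for sp, x, y in occurrences}
print(coded)  # {'Species A': True, 'Species B': False}
```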

  6. Geo-communication and information design

    DEFF Research Database (Denmark)

    Brodersen, Lars

    2009-01-01

    This article is an abstract of the book 'Geo-communication and information design'. The work involved in the book was inspired by the author's sense of wonder that there were apparently no existing theories, models etc. capable of identifying and choosing the content of information in systematic... of processes, procedures, factors, relations etc., all forming parts of a theory on geo-communication and information design. How do we decide whether to transmit content A or content B to another person? We make a decision. Making decisions does not normally give rise to difficulties, although a great deal of debate might occur during the decision-making process. But if the question is extended to include a demand for systematics and consciousness (control) in the procedure adopted, the whole issue becomes more complex. How do we decide to transmit content A or content B to another person on a systematic...

  7. NLM Emergency Access Initiative: FAQs

    Science.gov (United States)

    What is the Emergency Access Initiative? The Emergency Access Initiative (EAI) is a collaborative partnership between NLM and participating publishers to

  8. National Radiobiology Archives Distributed Access user's manual

    International Nuclear Information System (INIS)

    Watson, C.; Smith, S.; Prather, J.

    1991-11-01

    This User's Manual describes installation and use of the National Radiobiology Archives (NRA) Distributed Access package. The package consists of a distributed subset of information representative of the NRA databases and database access software which provide an introduction to the scope and style of the NRA Information Systems

  9. The TJ-II Relational Database Access Library: A User's Guide; Libreria de Acceso a la Base de Datos Relacional de TJ-II: Guia del Usuario

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, E.; Portas, A. B.; Vega, J.

    2003-07-01

    A relational database has been developed to store data representing physical values from TJ-II discharges. This new database complements the existing TJ-II raw data database. The database resides in a host computer running the Windows 2000 Server operating system and is managed by SQL Server. A function library has been developed that permits remote access to these data from user programs running on computers connected to TJ-II local area networks via remote procedure call. In this document a general description of the database and its organization is provided, together with a detailed description of the functions included in the library and examples of how to use these functions in computer programs written in the FORTRAN and C languages. (Author) 8 refs.

  10. XCOM: Photon Cross Sections Database

    Science.gov (United States)

    SRD 8 XCOM: Photon Cross Sections Database (Web, free access)   A web database is provided which can be used to calculate photon cross sections for scattering, photoelectric absorption and pair production, as well as total attenuation coefficients, for any element, compound or mixture (Z <= 100) at energies from 1 keV to 100 GeV.
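
    Total mass attenuation coefficients of the kind XCOM tabulates plug directly into the Beer-Lambert law, I/I0 = exp(-(mu/rho) · rho · x). The numbers in this sketch are illustrative placeholders chosen for the example, not actual XCOM lookups.

```python
import math

# Beer-Lambert attenuation through a slab. The coefficient below is an
# assumed placeholder, not a value retrieved from XCOM.
mu_over_rho = 0.15   # cm^2/g, illustrative total mass attenuation coeff.
density = 2.7        # g/cm^3, roughly aluminium
thickness = 1.0      # cm

transmitted = math.exp(-mu_over_rho * density * thickness)
print(f"Transmitted fraction: {transmitted:.3f}")
```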

  11. Geo-Hazards and Mountain Road Development in Nepal: Understanding the Science-Policy-Governance Interface

    Science.gov (United States)

    Dugar, Sumit; Dahal, Vaskar

    2015-04-01

    informed other road development schemes in Nepal. Geomorphological surveys and robust geo-hazard assessments that factor in the spatial and temporal dimensions of the seismic, fluvial and sediment hazards along the road corridor are critical for the sustainable development of mountain roads. However, scientific and technical research seldom informs mountain road development, primarily due to a lack of co-ordination between the respective government agencies, limited access to journal papers in developing countries, and unwillingness to adopt novel interventions in rural road construction practices. These challenges are further exacerbated by weak governance and a lack of proper policy enforcement, which often leads to the construction of poorly engineered roads, thereby increasing the risk of rural infrastructure damage from geo-hazards. Although there is a disconnect at the science-policy-governance interface, where information on geo-hazards is neglected in mountain road development owing to a lack of scientific research and government apathy, there is an opportunity to spur dialogue and raise awareness of these issues via trans-disciplinary approaches to disaster risk management.

  12. Timeliness and Predictability in Real-Time Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H

    1998-01-01

    The confluence of computers, communications, and databases is quickly creating a globally distributed database where many applications require real time access to both temporally accurate and multimedia data...

  13. RODOS database adapter

    International Nuclear Information System (INIS)

    Xie Gang

    1995-11-01

    Integrated data management is an essential aspect of many automated information systems such as RODOS, a real-time on-line decision support system for nuclear emergency management. In particular, the application software must provide access management to different commercial database systems. This report presents the tools necessary for adapting embedded SQL applications to both HP-ALLBASE/SQL and CA-Ingres/SQL databases. The design of the database adapter and the concept of the RODOS embedded SQL syntax are discussed by considering some of the most important features of SQL functions and identifying significant differences between SQL implementations. Finally, the software developed and the administrator's and installation guides are described. (orig.) [de
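
    The adapter idea, confining DBMS differences behind one application-facing interface, can be sketched as below. The class and the placeholder-style rewrite are hypothetical illustrations of the pattern, not the actual RODOS tools, and SQLite stands in for the commercial back ends.

```python
import sqlite3

# Hypothetical adapter: the application always writes one SQL flavour;
# dialect differences (here, just the parameter placeholder style) are
# confined to this class rather than scattered through the application.
class SQLAdapter:
    def __init__(self, conn, paramstyle):
        self.conn = conn
        self.paramstyle = paramstyle  # 'qmark' ('?') or 'format' ('%s')

    def execute(self, sql, params=()):
        if self.paramstyle == "format":
            sql = sql.replace("?", "%s")  # rewrite for '%s'-style back ends
        return self.conn.execute(sql, params)

db = SQLAdapter(sqlite3.connect(":memory:"), "qmark")
db.execute("CREATE TABLE dose (site TEXT, msv REAL)")
db.execute("INSERT INTO dose VALUES (?, ?)", ("A1", 0.12))
rows = db.execute("SELECT msv FROM dose WHERE site = ?", ("A1",)).fetchall()
print(rows)  # [(0.12,)]
```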

  14. Kansas Cartographic Database (KCD)

    Data.gov (United States)

    Kansas Data Access and Support Center — The Kansas Cartographic Database (KCD) is an exact digital representation of selected features from the USGS 7.5 minute topographic map series. Features that are...

  15. Specialist Bibliographic Databases.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A; Trukhachev, Vladimir I; Kostyukova, Elena I; Gerasimov, Alexey N; Kitas, George D

    2016-05-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarizing with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is mandatory for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find particularly useful source selection criteria and apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls.

  16. Specialist Bibliographic Databases

    Science.gov (United States)

    2016-01-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarizing with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is mandatory for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find particularly useful source selection criteria and apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls. PMID:27134485

  17. The design of distributed database system for HIRFL

    International Nuclear Information System (INIS)

    Wang Hong; Huang Xinmin

    2004-01-01

    This paper is focused on a kind of distributed database system used in the HIRFL distributed control system. The database of this distributed database system is established with SQL Server 2000, and its application system adopts the client/server model. Visual C++ is used to develop the applications, and the applications use ODBC to access the database. (authors)

  18. Nuclear Criticality Information System. Database examples

    Energy Technology Data Exchange (ETDEWEB)

    Foret, C.A.

    1984-06-01

    The purpose of this publication is to provide our users with a guide to using the Nuclear Criticality Information System (NCIS). It is comprised of an introduction, an information and resources section, a how-to-use section, and several useful appendices. The main objective of this report is to present a clear picture of the NCIS project and its available resources as well as assisting our users in accessing the database and using the TIS computer to process data. The introduction gives a brief description of the NCIS project, the Technology Information System (TIS), online user information, future plans and lists individuals to contact for additional information about the NCIS project. The information and resources section outlines the NCIS database and describes the resources that are available. The how-to-use section illustrates access to the NCIS database as well as searching datafiles for general or specific data. It also shows how to access and read the NCIS news section as well as connecting to other information centers through the TIS computer.

  19. Nuclear Criticality Information System. Database examples

    International Nuclear Information System (INIS)

    Foret, C.A.

    1984-06-01

    The purpose of this publication is to provide our users with a guide to using the Nuclear Criticality Information System (NCIS). It is comprised of an introduction, an information and resources section, a how-to-use section, and several useful appendices. The main objective of this report is to present a clear picture of the NCIS project and its available resources as well as assisting our users in accessing the database and using the TIS computer to process data. The introduction gives a brief description of the NCIS project, the Technology Information System (TIS), online user information, future plans and lists individuals to contact for additional information about the NCIS project. The information and resources section outlines the NCIS database and describes the resources that are available. The how-to-use section illustrates access to the NCIS database as well as searching datafiles for general or specific data. It also shows how to access and read the NCIS news section as well as connecting to other information centers through the TIS computer

  20. Applications of GIS and database technologies to manage a Karst Feature Database

    Science.gov (United States)

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
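
    The consistency guarantees mentioned above rest on transactions: every statement in a unit of work commits, or none does. A minimal sketch of that behaviour, with a hypothetical table and SQLite standing in for the production DBMS:

```python
import sqlite3

# Hypothetical table; the point is the transactional guarantee that a
# failed unit of work leaves the database unchanged.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE karst_feature (id INTEGER PRIMARY KEY, kind TEXT)")

try:
    with db:  # one transaction: commit on success, rollback on error
        db.execute("INSERT INTO karst_feature VALUES (1, 'sinkhole')")
        db.execute("INSERT INTO karst_feature VALUES (1, 'spring')")  # dup id
except sqlite3.IntegrityError:
    pass  # the first insert was rolled back along with the failing one

count = db.execute("SELECT COUNT(*) FROM karst_feature").fetchone()[0]
print(count)  # 0
```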

  1. GeoXp : An R Package for Exploratory Spatial Data Analysis

    Directory of Open Access Journals (Sweden)

    Thibault Laurent

    2012-04-01

    Full Text Available We present GeoXp, an R package implementing interactive graphics for exploratory spatial data analysis. We use a data set concerning public schools of the French Midi-Pyrenees region to illustrate the use of these exploratory techniques based on the coupling between a statistical graph and a map. Besides elementary plots like boxplots, histograms or simple scatterplots, GeoXp also couples maps with Moran scatterplots, variogram clouds, Lorenz curves and other graphical tools. In order to make the most of the multidimensionality of the data, GeoXp includes dimension-reduction techniques such as principal components analysis and cluster analysis, whose results are also linked to the map.

  2. Requirements for the next generation of nuclear databases and services

    Energy Technology Data Exchange (ETDEWEB)

    Pronyaev, Vladimir; Zerkin, Viktor; Muir, Douglas [International Atomic Energy Agency, Nuclear Data Section, Vienna (Austria); Winchell, David; Arcilla, Ramon [Brookhaven National Laboratory, National Nuclear Data Center, Upton, NY (United States)

    2002-08-01

    The use of relational database technology and general requirements for the next generation of nuclear databases and services are discussed. These requirements take into account an increased number of co-operating data centres working on diverse hardware and software platforms and users with different data-access capabilities. It is argued that the introduction of programming standards will allow the development of nuclear databases and data retrieval tools in a heterogeneous hardware and software environment. The functionality of this approach was tested with full-scale nuclear databases installed on different platforms having different operating and database management systems. User access through local network, internet, or CD-ROM has been investigated. (author)

  3. Synthesis of organic liquids/geo-polymer composites for the immobilization of nuclear wastes

    International Nuclear Information System (INIS)

    Cantarel, Vincent

    2016-01-01

    This work falls within the field of radioactive organic liquid management. The process is based on emulsification of the organic liquid in an alkali silicate solution, allowing the synthesis of a geo-polymer matrix. The first part of this work consists in carrying out a screening of different organic liquids. A model system representative of the various oils and a geo-polymer reference formulation are then defined. The second part deals with the structuration of the organic liquid/geo-polymer composite, from the mixture of the reactants to the final material. It aims at determining the phenomena allowing the synthesis of a homogeneous composite. The last two parts aim at characterizing the composite by studying its structure (chemical structure, porosity of the geo-polymer and dispersion of the oil) and its properties with respect to the immobilization of radioactive waste. Unlike calcium silicate-based cementitious matrices, the structure of the geo-polymer is not affected by the chemical nature of the organic liquids. Only acidic oils inhibit or slow down the geo-polymerization reaction. In order to obtain a homogeneous material, the presence of surfactant molecules is necessary. The emulsion stabilization mechanism at the base of the process relies on a synergy between the surfactant molecules and the aluminosilicate particles present in the geo-polymer paste. The kinetics (chemical and mechanical) of geo-polymerization are not affected by the presence of oil or surfactants. Only an increase in the viscoelastic moduli and in the elastic character of the pastes can be observed. This difference in rheological behavior is mainly due to the presence of surfactant. The structure of the matrix is identical to that of a pure geo-polymer of the same formulation. The organic liquid is dispersed in spherical inclusions whose radius is between 5 and 15 μm. These droplets are separated from each other, and from the environment by the

  4. Improved Information Retrieval Performance on SQL Database Using Data Adapter

    Science.gov (United States)

    Husni, M.; Djanali, S.; Ciptaningtyas, H. T.; Wicaksana, I. G. N. A.

    2018-02-01

    The NoSQL databases, short for Not Only SQL, are increasingly being used as the number of big data applications increases. Most systems still use relational databases (RDBs), but as data volumes grow each year, systems increasingly handle big data with NoSQL databases to analyze and access data more quickly. NoSQL emerged as a result of the exponential growth of the internet and the development of web applications. The query syntax in a NoSQL database differs from that of a SQL database, normally requiring code changes in the application. A data adapter allows applications to keep their SQL query syntax unchanged. Data adapters provide methods that can synchronize SQL databases with NoSQL databases. In addition, the data adapter provides an interface through which applications can run SQL queries. Hence, this research applied a data adapter system to synchronize data between a MySQL database and Apache HBase using a direct-access query approach, where the system allows the application to submit queries while a synchronization process is in progress. The tests performed show that the data adapter can synchronize between the SQL database, MySQL, and the NoSQL database, Apache HBase. The system's memory usage stayed in the range of 40% to 60%, and its processor usage varied between 10% and 90%. The tests also showed better performance from the NoSQL database than from the SQL database.
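    A minimal sketch of the data-adapter idea described above, assuming a simple write-through design. SQLite stands in for MySQL and a plain dict for Apache HBase; the class and method names are invented for illustration and are not taken from the paper:

    ```python
    import sqlite3

    class DataAdapter:
        """Accepts unchanged SQL from the application, executes it on the
        relational store, and mirrors rows into a NoSQL-style key-value
        store (a dict standing in for HBase)."""

        def __init__(self):
            self.sql = sqlite3.connect(":memory:")
            self.sql.execute("CREATE TABLE item (id INTEGER PRIMARY KEY, name TEXT)")
            self.nosql = {}  # row key -> column/value map, HBase-style

        def execute(self, query, params=()):
            with self.sql:  # transactional execution of the app's SQL
                cur = self.sql.execute(query, params)
            self._sync()    # keep both stores consistent after each query
            return cur.fetchall()

        def _sync(self):
            for rid, name in self.sql.execute("SELECT id, name FROM item"):
                self.nosql[f"item:{rid}"] = {"name": name}

    adapter = DataAdapter()
    adapter.execute("INSERT INTO item (name) VALUES (?)", ("sensor-log",))
    rows = adapter.execute("SELECT id, name FROM item")
    print(rows, adapter.nosql)
    ```

    The real system described in the paper synchronizes concurrently while queries are accepted; this sketch synchronizes after each statement only to keep the idea visible in a few lines.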

  5. West Bank Gaza Geo-MIS System

    Data.gov (United States)

    US Agency for International Development — The Geo-MIS System is USAID/West Bank and Gaza's primary system for capturing and managing projectrelated information. Its purpose is to assist USAID and its...

  6. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. The data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 9 figs
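    The design described above (material properties, plus configuration linking items to magnets, plus test results, joined to correlate magnet performance with constituent properties) can be illustrated with a toy schema. The table layout, names, and values are hypothetical, not MagCom's actual Sybase schema:

    ```python
    import sqlite3

    # Hypothetical mini-schema echoing the MagCom design.
    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE material    (item_id INTEGER PRIMARY KEY, property TEXT, measured REAL);
    CREATE TABLE config      (magnet TEXT, item_id INTEGER REFERENCES material);
    CREATE TABLE test_result (magnet TEXT, quench_current_a REAL);
    """)
    db.execute("INSERT INTO material VALUES (1, 'cable RRR', 98.0)")
    db.execute("INSERT INTO config VALUES ('DM-001', 1)")
    db.execute("INSERT INTO test_result VALUES ('DM-001', 6500.0)")

    # The correlation the abstract describes: join magnet performance
    # back to the measured properties of its constituent materials.
    row = db.execute("""
        SELECT t.quench_current_a, m.measured
        FROM test_result t
        JOIN config   c ON c.magnet  = t.magnet
        JOIN material m ON m.item_id = c.item_id
    """).fetchone()
    print(row)  # (6500.0, 98.0)
    ```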

  8. DOE Order 5480.28 Hanford facilities database

    Energy Technology Data Exchange (ETDEWEB)

    Hayenga, J.L., Westinghouse Hanford

    1996-09-01

    This document describes the development of a database of DOE and/or leased Hanford Site facilities. The completed database will consist of structure/facility parameters essential to the prioritization of these structures for natural phenomena hazard vulnerability in compliance with DOE Order 5480.28, 'Natural Phenomena Hazards Mitigation'. The prioritization process will be based upon each structure/facility's vulnerability to natural phenomena hazards. The Access-based database, 'Hanford Facilities Site Database', is generated from current Hanford Site information and databases.

  9. IAEA/NDS requirements related to database software

    International Nuclear Information System (INIS)

    Pronyaev, V.; Zerkin, V.

    2001-01-01

    Full text: The Nuclear Data Section of the IAEA disseminates data to NDS users through the Internet or on CD-ROMs and diskettes. An OSU Web server on DEC Alpha with OpenVMS and an Oracle/DEC DBMS provides, via CGI scripts and FORTRAN retrieval programs, access to the main nuclear databases supported by the networks of Nuclear Reactions Data Centres and Nuclear Structure and Decay Data Centres (CINDA, EXFOR, ENDF, NSR, ENSDF). For Web access to data from other libraries and files, hyperlinks to files stored in ASCII text or other formats are used. Databases on CD-ROM are usually provided with a retrieval system. They are distributed in run-time mode and comply with all license requirements for software used in their development. Although major development work is now done on PCs with MS Windows and Linux, NDS may not at present, due to some institutional conditions, use these platforms for organizing Web access to the data. Starting at the end of 1999, the NDS, in co-operation with other data centres, began to work out a strategy for migrating the main network nuclear databases onto platforms other than DEC Alpha/OpenVMS/DBMS. Because the different co-operating centres have their own preferences for hardware and software, the requirement to provide maximum platform independence for nuclear databases is the most important and desirable feature. This requirement determined some standards for nuclear database software development. Taking into account the present state and future development, these standards can be formulated as follows: 1. All numerical data (experimental, evaluated, recommended values and their uncertainties) prepared for inclusion in an IAEA/NDS nuclear database should be submitted in the form of ASCII text files and will be kept at NDS as a master file. 2. Databases with complex structure should be submitted in the form of files of standard SQL statements describing all their components. All extensions of standard SQL
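    The two submission standards listed above (numerical data as ASCII master files; complex structure as files of standard SQL statements) can be sketched as follows. sqlite3 is used purely as a stand-in DBMS, and the table name and values are invented for illustration:

    ```python
    import sqlite3

    # Build a tiny stand-in database (hypothetical cross-section table,
    # not an actual IAEA/NDS library).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE xsec (energy_ev REAL, barns REAL)")
    conn.executemany("INSERT INTO xsec VALUES (?, ?)",
                     [(0.0253, 98.7), (1.0e6, 2.1)])
    conn.commit()

    # Standard 1: numerical data kept as an ASCII master file.
    ascii_master = "\n".join(f"{e:.4e} {b:.4e}"
                             for e, b in conn.execute(
                                 "SELECT energy_ev, barns FROM xsec"))

    # Standard 2: structure shipped as standard SQL statements, so any
    # co-operating centre can rebuild the database on its own DBMS.
    sql_dump = "\n".join(conn.iterdump())

    print(ascii_master)
    print("CREATE TABLE" in sql_dump)
    ```

    The point of both standards is platform independence: plain ASCII and standard SQL survive any change of hardware, operating system, or DBMS vendor.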

  10. New Results from the Geoengineering Model Intercomparison Project (GeoMIP)

    Science.gov (United States)

    Robock, A.; Kravitz, B.

    2013-12-01

    The Geoengineering Model Intercomparison Project (GeoMIP) was designed to determine robust climate-system model responses to solar radiation management (SRM). While mitigation (reducing greenhouse gas emissions) is the most effective way of reducing future climate change, SRM (the deliberate modification of incoming solar radiation) has been proposed as a means of temporarily alleviating some of the effects of global warming. For society to make informed decisions as to whether SRM should ever be implemented, information is needed on the benefits, risks, and side effects, and GeoMIP seeks to aid in that endeavor. GeoMIP has organized four standardized climate model simulations involving reduction of insolation or increased amounts of stratospheric sulfate aerosols to counteract increasing greenhouse gases. Thirteen comprehensive atmosphere-ocean general circulation models have participated in the project so far. GeoMIP is a 'CMIP Coordinated Experiment' as part of the Climate Model Intercomparison Project 5 (CMIP5) and has been endorsed by SPARC (Stratosphere-troposphere Processes And their Role in Climate). GeoMIP has held three international workshops and has produced a number of recent journal articles. GeoMIP has found that if increasing greenhouse gases could be counteracted with insolation reduction, the global average temperature could be kept constant, but global average precipitation would decrease, particularly in summer monsoon regions around the world. Temperature changes would also not be uniform: the tropics would cool, but high latitudes would warm, with continuing, though reduced, sea ice and ice-sheet melting. Temperature extremes would still increase, but not as much as without SRM. If SRM were halted all at once, there would be rapid temperature and precipitation increases at 5-10 times the rates of gradual global warming. SRM combined with CO2 fertilization would have small impacts on rice production in China, but would increase maize production

  11. The Spatio-Temporal Evolution of Geo-Economic Relationships between China and ASEAN Countries: Competition or Cooperation?

    Directory of Open Access Journals (Sweden)

    Shufang Wang

    2017-06-01

    Full Text Available In the last 30 years, China’s economic power has experienced great changes and has brought about a profound impact on the world economy. This led us to ask a question: do changes in China’s economic power shift the geo-economic relationships between China and its neighboring countries? To answer this question, we researched the evolution of geo-economic relationships between China and the Association of Southeast Asian Nations (ASEAN) countries. Using the Euclidean distance method, we explored the changes in these geo-economic relationships between China and ASEAN countries from 1980 to 2014. Our findings resulted in five conclusions: (1) Over time, geo-economic relationships between China and ASEAN countries remained relatively stable. (2) Geographically, the main geo-economic relationships between China and continental ASEAN countries were complementary, while the main geo-economic relationships between China and island ASEAN countries were competitive. (3) Geopolitics and geo-culture were attributed to the changes in geo-economic relationships. (4) The evolution of geo-economic relationships was characterized by path dependence. (5) Geo-economic relationships between China and ASEAN countries could be classified into four types: game type, with high cooperation and competition; complementary type, with high cooperation and low competition; fight type, with low cooperation and high competition; and loose type, with low cooperation and competition. Our findings contribute to improving the understanding of geo-economic relationships.
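    A sketch of the Euclidean distance method named above, under the assumption that each country is represented by a normalized vector of economic indicators. The indicators and numbers are invented for illustration and are not the paper's data:

    ```python
    import math

    def euclidean(a, b):
        """Distance between two countries' indicator vectors: a smaller
        distance means a more similar economic structure (suggesting
        competition), a larger one a more complementary structure."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # Hypothetical normalized indicators: [export share of electronics,
    # of agriculture, of raw materials] -- illustrative numbers only.
    china   = [0.6, 0.2, 0.2]
    vietnam = [0.5, 0.3, 0.2]   # similar structure -> competitive
    laos    = [0.1, 0.5, 0.4]   # dissimilar structure -> complementary

    d_vn = euclidean(china, vietnam)
    d_la = euclidean(china, laos)
    print(round(d_vn, 3), round(d_la, 3))
    ```

    Mapping such distances, together with a cooperation measure, onto the paper's four quadrants (game, complementary, fight, loose) then only requires choosing thresholds on each axis.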

  12. The GEO-3 Scenarios 2002-2032. Quantification and Analysis of Environmental Impacts

    Energy Technology Data Exchange (ETDEWEB)

    Bakkes, J.; Potting, J. (eds.) [National Institute for Public Health and the Environment RIVM, Bilthoven (Netherlands); Henrichs, T. [Center for Environmental Systems Research CESR, University of Kassel, Kassel (Germany); Kemp-Benedict, E.; Raskin, P. [Stockholm Environment Institute SEI, Boston, MA (United States); Masui, T.; Rana, A. [National Institute for Environmental Studies NIES, Ibaraki (Japan); Nellemann, C. [United Nations Environment Programme UNEP, GRID Global and Regional Integrated Data centres Arendal, Lillehammer (Norway); Rothman, D. [International Centre for Integrative Studies ICIS, Maastricht University, Maastricht (Netherlands)

    2004-07-01

    The four contrasting visions of the world's next three decades as presented in the third Global Environment Outlook (GEO-3) have many implications for policy - from hunger to climate change and from freshwater issues to biodiversity. The four scenarios analysed are Markets First, Policy First, Security First, and Sustainability First. Presenting a deeper analysis than the original GEO-3 report, this Technical Report quantifies the impacts of the scenarios for all 19 GEO 'sub-regions', such as Eastern Africa and Central Europe. Regional impacts are discussed in the context of sustainable development. The report summary compares the impacts of the four scenarios across regions - and for the world as a whole - in the light of internationally agreed targets, including those in the Millennium Declaration where applicable. It provides an account of the analytical methods, key assumptions, models and other tools, along with the approaches used in the analyses. Based on the methods and results, the report looks back on the process of producing the forward-looking analysis for GEO-3. Were all analytical centres on the same track? Did the approach adopted for GEO-3 contribute to the overall GEO objective of strengthening global-regional involvement and linkages?

  13. Fast Updating National Geo-Spatial Databases with High Resolution Imagery: China's Methodology and Experience

    Science.gov (United States)

    Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.

    2014-04-01

    Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, i.e., their consistency with respect to the real world, plays a critical role in their value and applications. The continuous updating of map databases at the 1:50,000 scale is a massive and difficult task for larger countries covering more than several million square kilometers. This paper presents the research and technological development supporting national map updating at the 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of a continuous updating workflow. The use of many data sources, and the integration of these data to form a high-accuracy, quality-checked product, was required. This in turn required up-to-date techniques of image matching, semantic integration, generalization, database management and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high-resolution imagery and large-scale data generalization, such as map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image-matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole-process data quality control; a series of technical production specifications; and a network of updating production units in different geographic places in the country.

  14. Medicaid CHIP ESPC Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Environmental Scanning and Program Characteristic (ESPC) Database is in a Microsoft (MS) Access format and contains Medicaid and CHIP data, for the 50 states and...

  15. Complex Functions with GeoGebra

    Science.gov (United States)

    Breda, Ana Maria D'azevedo; Dos Santos, José Manuel Dos Santos

    2016-01-01

    Complex functions generally feature some interesting peculiarities when seen as extensions of real functions. The visualization of complex function properties usually requires the simultaneous visualization of two-dimensional spaces. The multiple windows of GeoGebra, combined with its ability to perform algebraic computation with complex numbers, allow the…

  16. Specification of electron radiation environment at GEO and MEO for surface charging estimates

    Science.gov (United States)

    Ganushkina, N.; Dubyagin, S.; Mateo Velez, J. C.; Liemohn, M. W.

    2017-12-01

    A series of anomalies at GEO have been attributed to electrons of energy below 100 keV, which are responsible for surface charging. The process at play is charge deposition on covering insulating surfaces and is directly linked to the space environment at a time scale of a few tens of seconds. Even though modern satellites have benefited from the analysis of past flight anomalies and losses, it appears that surface charging remains a source of problems. Accurate specification of the space environment at different orbits is of key importance. We present the operational model for low-energy electrons, the Inner Magnetosphere Particle Transport and Acceleration Model (IMPTAM). This model has been operating online since March 2013 (http://fp7-spacecast.eu and imptam.fmi.fi) and is driven by the real-time solar wind and IMF parameters and by the real-time Dst index. The model provides the low-energy electron flux at all L-shells and at all satellite orbits, when necessary. IMPTAM is used to simulate the fluxes of low-energy electrons inside the Earth's magnetosphere at the times of severe events measured on LANL satellites at GEO. There is no easier way to determine the flux of keV electrons at MEO when surface-charging events are detected at GEO than to use a model. The maximum electron fluxes obtained at MEO (L = 4.6) within a few tens of minutes to hours following the LANL events at GEO have been extracted to feed a database of theoretical/numerical worst-case environments for surface charging at MEO. All IMPTAM results are instantaneous; the data have not been averaged. In order to validate the IMPTAM output at MEO, we conduct a statistical analysis of electron fluxes measured onboard the Van Allen Probes (ECT HOPE, 20 eV-45 keV, and ECT MagEIS, 30-300 keV) at distances of 4.6 Re. The IMPTAM electron flux at MEO is used as input to SPIS, the Spacecraft Plasma Interaction System software toolkit for spacecraft-plasma interactions and spacecraft-charging modelling (http://dev.spis.org/projects/spine/home/spis). 
The research leading to these results

  17. Geo-neutrinos and earth's interior

    International Nuclear Information System (INIS)

    Fiorentini, Gianni; Lissia, Marcello; Mantovani, Fabio

    2007-01-01

    The deepest hole that has ever been dug is about 12 km deep. Geochemists analyze samples from the Earth's crust and from the top of the mantle. Seismology can reconstruct the density profile throughout the whole Earth, but not its composition. In this respect, our planet is mainly unexplored. Geo-neutrinos, the antineutrinos from the progeny of U, Th and 40K decays in the Earth, bring to the surface information from the whole planet concerning its content of natural radioactive elements. Their detection can shed light on the sources of the terrestrial heat flow, on the present composition, and on the origins of the Earth. Geo-neutrinos represent a new probe of our planet, which can be exploited as a consequence of two fundamental advances that occurred in the last few years: the development of extremely low background neutrino detectors and the progress in understanding neutrino propagation. We review the status and the prospects of the field

  18. THE EXTRAGALACTIC DISTANCE DATABASE

    International Nuclear Information System (INIS)

    Tully, R. Brent; Courtois, Helene M.; Jacobs, Bradley A.; Rizzi, Luca; Shaya, Edward J.; Makarov, Dmitry I.

    2009-01-01

    A database developed to promote access to information related to galaxy distances can be accessed on the Web at http://edd.ifa.hawaii.edu. The database has three functional components. First, tables from many literature sources have been gathered and enhanced with links through a distinct galaxy naming convention. Second, comparisons of results both at the level of parameters and of techniques have begun and are continuing, leading to increasing homogeneity and consistency of distance measurements. Third, new material is presented arising from ongoing observational programs at the University of Hawaii 2.2 m telescope, radio telescopes at Green Bank, Arecibo, and Parkes, and with the Hubble Space Telescope. This new observational material is made available in tandem with related material drawn from archives and passed through common analysis pipelines.
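    The first component described above (literature tables linked through a distinct galaxy naming convention) amounts to joining heterogeneous tables on one shared key. In this sketch the use of PGC numbers as that key and the table contents are illustrative assumptions, not data from the EDD:

    ```python
    # Two literature sources, each keyed by the same galaxy identifier
    # (hypothetical PGC numbers and distances, for illustration only).
    tully_fisher = {43495: {"dist_mpc": 7.9}}
    cepheids     = {43495: {"dist_mpc": 8.1},
                    28630: {"dist_mpc": 3.6}}

    def combined(pgc, *tables):
        """Gather every distance estimate the linked tables hold for one
        galaxy, enabling the cross-comparison the database promotes."""
        return [t[pgc]["dist_mpc"] for t in tables if pgc in t]

    estimates = combined(43495, tully_fisher, cepheids)
    print(estimates)  # [7.9, 8.1]
    ```

    With every table sharing one naming convention, such joins are what lets the database compare distance techniques against each other galaxy by galaxy.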

  19. 經由校園網路存取圖書館光碟資料庫之研究 Studies on Multiuser Access Library CD-ROM Database via Campus Network

    Directory of Open Access Journals (Sweden)

    Ruey-shun Chen

    1992-06-01

    Full Text Available Library CD-ROM, with its enormous storage, retrieval capabilities and reasonable price, has been gradually replacing some of its printed counterparts. But one of the greatest limitations on the use of a stand-alone CD-ROM workstation is that only one user can access the CD-ROM database at a time. This paper proposes a new method to solve this problem. The method uses personal computers, via the standard Ethernet network system, the high-speed fiber network FDDI, and the standard TCP/IP protocol, to access library CD-ROM databases, and implements a practical campus CD-ROM network system. Its advantages are that it reduces redundant CD-ROM purchase fees, reduces damage from discs being handed in and out, and allows multiple users to access the same CD-ROM disc simultaneously.
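    The multiuser access scheme the paper describes can be sketched as a tiny TCP service that serves slices of a shared disc image to concurrent clients. The protocol, names, and contents here are invented for illustration, not the paper's implementation:

    ```python
    import socket
    import socketserver
    import threading

    # In-memory stand-in for a shared CD-ROM disc image (hypothetical).
    DISC_IMAGE = b"CDROM-DATABASE-RECORD-0001"

    class DiscHandler(socketserver.BaseRequestHandler):
        """Each client sends 'offset length' and receives that slice of
        the shared disc image -- many clients can read concurrently,
        which a stand-alone CD-ROM workstation cannot offer."""
        def handle(self):
            offset, length = map(int, self.request.recv(64).split())
            self.request.sendall(DISC_IMAGE[offset:offset + length])

    server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), DiscHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # Any client on the campus TCP/IP network could issue the same request.
    with socket.create_connection(server.server_address) as sock:
        sock.sendall(b"0 5")
        reply = sock.recv(5)
    server.shutdown()
    print(reply)  # b'CDROM'
    ```

    The threading server is what lifts the one-user-at-a-time limit: each connection gets its own handler thread while the disc image is shared read-only.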

  20. GEO 600 online detector characterization system

    International Nuclear Information System (INIS)

    Balasubramanian, R; Babak, S; Churches, D; Cokelaer, T

    2005-01-01

    A world-wide network of interferometric gravitational wave detectors is currently operational. The detectors in the network are still in their commissioning phase and are expected to achieve their design sensitivity over the next year or so. Each detector is a complex instrument involving many optical, mechanical and electronic subsystems, and each subsystem is a source of noise at the output of the detector. Therefore, in addition to recording the main gravitational wave data channel at the output of the interferometer, the state of each detector subsystem is monitored and recorded. The analysis of these subsidiary data serves a dual purpose: first, it helps us to identify the primary sources of noise, which could then be either removed altogether or reduced substantially; second, it helps us in vetoing spurious signals at the output of the interferometer. However, since these subsidiary data are both large in volume (1 MB s⁻¹) and complex in nature, it is not possible to look at all these data manually. We require an online monitoring and analysis tool which can process all the data channels for various noise artefacts such as transients, drifting of narrowband noise sources, noise couplings between data channels, etc., and summarize the results of the analysis in a manner that can be accessed and interpreted conveniently. In this paper we describe the GEO 600 online detector characterization system (GODCS), which is the tool that is being used to monitor the output of the GEO 600 gravitational wave detector situated near Hanover in Germany. We describe the various algorithms that we use and how the results of several algorithms can be combined to make meaningful statements about the state of the detector. We also give implementation details such as the software architecture and the storage and retrieval of the output of GODCS. This paper will be useful to researchers in the area of gravitational wave astronomy as a record of the various analyses and