WorldWideScience

Sample records for geospatial catalogue web

  1. Grid Enabled Geospatial Catalogue Web Service

    Science.gov (United States)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC)'s Catalogue Service - Web information model, this paper proposes a new information model for the Geospatial Catalogue Web Service, named GCWS, which securely provides Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and draws on the geospatial data metadata standards from ISO 19115, FGDC and NASA EOS Core System, and the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, especially query on-demand data in the virtual community and retrieve it through the data-related services, which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science rather than on issues of computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.
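
    The catalogue querying that GCWS extends is, at its core, an OGC CSW interaction. As a point of reference, below is a minimal sketch of a plain (non-Grid) CSW GetRecords query over HTTP KVP; the endpoint URL and search term are hypothetical, and the Grid security layer (VO authentication, replica resolution) described above is not represented.

```python
# Minimal sketch of an OGC CSW 2.0.2 GetRecords query over HTTP KVP.
# The endpoint and search term are hypothetical; GCWS adds Grid security
# and replica resolution on top of this kind of baseline interaction.
import requests
import xml.etree.ElementTree as ET

CSW_ENDPOINT = "https://example.org/csw"  # hypothetical catalogue endpoint

params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "elementSetName": "summary",
    "resultType": "results",
    "constraintLanguage": "CQL_TEXT",
    "constraint_language_version": "1.1.0",
    "constraint": "csw:AnyText LIKE '%land cover%'",
    "maxRecords": "10",
}

response = requests.get(CSW_ENDPOINT, params=params, timeout=30)
response.raise_for_status()

# Print the titles of the returned summary records.
ns = {"csw": "http://www.opengis.net/cat/csw/2.0.2",
      "dc": "http://purl.org/dc/elements/1.1/"}
root = ET.fromstring(response.content)
for record in root.findall(".//csw:SummaryRecord", ns):
    print(record.findtext("dc:title", default="(untitled)", namespaces=ns))
```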

  2. Restful Implementation of Catalogue Service for Geospatial Data Provenance

    Science.gov (United States)

    Jiang, L. C.; Yue, P.; Lu, X. C.

    2013-10-01

    Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance could be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending ebXML Registry Information Model (ebRIM). Recent advance of the REpresentational State Transfer (REST) paradigm has shown great promise for the easy integration of distributed resources. RESTful Web Service aims to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach for provenance catalogue service could be improved by adopting the RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following RESTful architecture style. A middleware named REST Converter is added on the top of the legacy catalogue service to support a RESTful style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.
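
    To illustrate the RESTful interaction style the paper adopts, here is a small sketch of registering and then dereferencing a provenance record as a web resource. The base URL and resource paths are hypothetical; the actual REST Converter defines its own resource handlers and dispatching rules.

```python
# Illustrative sketch of a RESTful provenance catalogue interaction.
# The base URL and resource paths are hypothetical placeholders.
import requests

BASE = "https://example.org/csw-rest"  # hypothetical REST facade over a CSW

# Register a provenance document for a dataset (create a resource).
provenance_xml = "<prov:document xmlns:prov='http://www.w3.org/ns/prov#'/>"
created = requests.post(
    f"{BASE}/datasets/landsat-scene-123/provenance",
    data=provenance_xml,
    headers={"Content-Type": "application/xml"},
    timeout=30,
)
print(created.status_code, created.headers.get("Location"))

# Retrieve the same provenance record by dereferencing its URI.
fetched = requests.get(
    f"{BASE}/datasets/landsat-scene-123/provenance",
    headers={"Accept": "application/xml"},
    timeout=30,
)
print(fetched.status_code)
```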

  3. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to the Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data sources; interfaces for the Geospatial Semantic Web; VGI (Volunteered Geographic Information) and the Geospatial Semantic Web; challenges of the Geospatial Semantic Web; and the development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems, such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.
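
    Among the technologies listed, GeoSPARQL is the query-side piece. A minimal sketch of a GeoSPARQL spatial query is shown below; the SPARQL endpoint and dataset are hypothetical, and only the geo:/geof: vocabulary comes from the standard.

```python
# Minimal GeoSPARQL sketch: find features whose geometry intersects a
# bounding polygon expressed in WKT. Endpoint and data are hypothetical.
from SPARQLWrapper import SPARQLWrapper, JSON

query = """
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>

SELECT ?feature ?wkt WHERE {
  ?feature geo:hasGeometry ?geom .
  ?geom geo:asWKT ?wkt .
  FILTER(geof:sfIntersects(?wkt,
    "POLYGON((-10 35, 30 35, 30 60, -10 60, -10 35))"^^geo:wktLiteral))
}
LIMIT 10
"""

endpoint = SPARQLWrapper("https://example.org/sparql")  # hypothetical endpoint
endpoint.setQuery(query)
endpoint.setReturnFormat(JSON)
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["feature"]["value"], row["wkt"]["value"])
```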

  4. Geospatial Semantics and the Semantic Web

    CERN Document Server

    Ashish, Naveen

    2011-01-01

    The availability of geographic and geospatial information and services, especially on the open Web, has become abundant in the last several years with the proliferation of online maps, geo-coding services, geospatial Web services and geospatially enabled applications. The need for geospatial reasoning has significantly increased in many everyday applications, including personal digital assistants, Web search applications, location-aware mobile services, specialized systems for emergency response, medical triaging, intelligence analysis and more. Geospatial Semantics and the Semantic Web: Foundation

  5. Distributed Multi-interface Catalogue for Geospatial Data

    Science.gov (United States)

    Nativi, S.; Bigagli, L.; Mazzetti, P.; Mattia, U.; Boldrini, E.

    2007-12-01

    Several geosciences communities (e.g. atmospheric science, oceanography, hydrology) have developed tailored data and metadata models and service protocol specifications for enabling online data discovery, inventory, evaluation, access and download. These specifications are conceived either by profiling geospatial information standards or by extending well-accepted geosciences data models and protocols in order to capture more semantics. These artifacts have generated a set of related catalog and inventory services characterizing different communities, initiatives and projects. In fact, these geospatial data catalogs are discovery and access systems that use metadata as the target for queries on geospatial information. The indexed and searchable metadata provide a disciplined vocabulary against which intelligent geospatial search can be performed within or among communities. There exists a clear need to conceive and achieve solutions to implement interoperability among geosciences communities, in the context of the more general geospatial information interoperability framework. Such solutions should provide search and access capabilities across catalogs, inventory lists and their registered resources. Thus, the development of catalog clearinghouse solutions is a near-term challenge in support of fully functional and useful infrastructures for spatial data (e.g. INSPIRE, GMES, NSDI, GEOSS). This implies the implementation of components for query distribution and virtual resource aggregation. These solutions must implement distributed discovery functionalities in a heterogeneous environment, requiring metadata profile harmonization as well as protocol adaptation and mediation. We present a catalog clearinghouse solution for the interoperability of several well-known cataloguing systems (e.g. OGC CSW, THREDDS catalog and data services). The solution implements consistent resource discovery and evaluation over a dynamic federation of several well-known cataloguing and
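
    The query-distribution behaviour described above can be pictured as sending one query to several catalogue interfaces and merging the answers. A simplified Python sketch follows, with hypothetical endpoints and without the metadata-profile and protocol mediation (CSW, THREDDS, etc.) that a real clearinghouse performs.

```python
# Simplified picture of query distribution across several catalogues.
# Endpoints are hypothetical; profile/protocol mediation is omitted.
import requests
import xml.etree.ElementTree as ET

CATALOGUES = [
    "https://csw.example.org/csw",        # hypothetical OGC CSW
    "https://csw.example.net/catalogue",  # hypothetical OGC CSW
]

def search_catalogue(url, text):
    """Issue a CSW GetRecords KVP query and return matching titles."""
    params = {
        "service": "CSW", "version": "2.0.2", "request": "GetRecords",
        "typeNames": "csw:Record", "elementSetName": "brief",
        "resultType": "results", "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": f"csw:AnyText LIKE '%{text}%'",
    }
    ns = {"csw": "http://www.opengis.net/cat/csw/2.0.2",
          "dc": "http://purl.org/dc/elements/1.1/"}
    root = ET.fromstring(requests.get(url, params=params, timeout=30).content)
    return [r.findtext("dc:title", namespaces=ns)
            for r in root.findall(".//csw:BriefRecord", ns)]

# Distribute the query and aggregate the (possibly overlapping) results.
results = {url: search_catalogue(url, "sea surface temperature")
           for url in CATALOGUES}
for url, titles in results.items():
    print(url, titles)
```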

  6. The Geospatial Web and Local Geographical Education

    Science.gov (United States)

    Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.

    2010-01-01

    Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…

  7. Infrastructure for the Geospatial Web

    Science.gov (United States)

    Lake, Ron; Farley, Jim

    Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.
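
    As a small illustration of the GML building block mentioned above, the following sketch serializes a single point feature with Python's standard library; the feature type and property names are invented for the example, and only the gml: namespace elements come from GML itself.

```python
# Tiny illustration of GML as an encoding building block: serialize one
# point feature. The feature type and property names are invented.
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml"
ET.register_namespace("gml", GML)

feature = ET.Element("Monitor")                      # hypothetical feature type
ET.SubElement(feature, "stationName").text = "Station 42"
point = ET.SubElement(feature, f"{{{GML}}}Point", srsName="EPSG:4326")
ET.SubElement(point, f"{{{GML}}}pos").text = "45.07 7.69"  # lat lon

print(ET.tostring(feature, encoding="unicode"))
```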

  8. A resource-oriented architecture for a Geospatial Web

    Science.gov (United States)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by spatial/temporal reference. To this aim, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic Distributed Computing Infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. On the contrary

  9. BioCatalogue: a universal catalogue of web services for the life sciences.

    Science.gov (United States)

    Bhagat, Jiten; Tanoh, Franck; Nzuobontane, Eric; Laurent, Thomas; Orlowski, Jerzy; Roos, Marco; Wolstencroft, Katy; Aleksejevs, Sergejs; Stevens, Robert; Pettifer, Steve; Lopez, Rodrigo; Goble, Carole A

    2010-07-01

    The use of Web Services to enable programmatic access to on-line bioinformatics is becoming increasingly important in the Life Sciences. However, their number, distribution and the variable quality of their documentation can make their discovery and subsequent use difficult. A Web Services registry with information on available services will help to bring together service providers and their users. The BioCatalogue (http://www.biocatalogue.org/) provides a common interface for registering, browsing and annotating Web Services to the Life Science community. Services in the BioCatalogue can be described and searched in multiple ways based upon their technical types, bioinformatics categories, user tags, service providers or data inputs and outputs. They are also subject to constant monitoring, allowing the identification of service problems and changes and the filtering-out of unavailable or unreliable resources. The system is accessible via a human-readable 'Web 2.0'-style interface and a programmatic Web Service interface. The BioCatalogue follows a community approach in which all services can be registered, browsed and incrementally documented with annotations by any member of the scientific community.

  10. Automatic geospatial information Web service composition based on ontology interface matching

    Science.gov (United States)

    Xu, Xianbin; Wu, Qunyong; Wang, Qinmin

    2008-10-01

    With Web services technology, the functions of WebGIS can be presented as a kind of geospatial information service, helping to overcome the limitations of the information-isolated situation in the geospatial information sharing field. Thus geospatial information Web service composition, which conglomerates outsourced services working in tandem to offer value-added services, plays a key role in fully taking advantage of geospatial information services. This paper proposes an automatic geospatial information web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among interfaces. By matching input/output parameters and the semantic meaning of pairs of service interfaces, a geospatial information web service chain can be created from a number of candidate services. A practical application of the algorithm is also presented, and its results show the feasibility of this algorithm and its great promise for the emerging demand for geospatial information web service composition.
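
    The interface-matching step can be pictured as a WordNet similarity test between the output parameter of one service and the input parameter of the next. The sketch below uses NLTK's WordNet interface; the parameter names and threshold are invented, and the paper's actual algorithm is more elaborate.

```python
# Sketch of interface matching via WordNet semantic similarity, in the
# spirit of the algorithm above. Parameter names and the threshold are
# invented; requires the NLTK WordNet corpus to be installed.
from nltk.corpus import wordnet as wn

def similarity(term_a, term_b):
    """Best path similarity between any noun senses of the two terms."""
    scores = [a.path_similarity(b)
              for a in wn.synsets(term_a, pos=wn.NOUN)
              for b in wn.synsets(term_b, pos=wn.NOUN)]
    scores = [s for s in scores if s is not None]
    return max(scores, default=0.0)

# Output parameter of a candidate service vs. input parameter of the next
# service in the chain (hypothetical names).
provided, required = "elevation", "altitude"
score = similarity(provided, required)
print(f"match score {score:.2f}")
if score > 0.5:          # illustrative threshold
    print("interfaces considered compatible; services can be chained")
```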

  11. A Javascript GIS Platform Based on Invocable Geospatial Web Services

    Directory of Open Access Journals (Sweden)

    Konstantinos Evangelidis

    2018-04-01

    Semantic Web technologies have been increasingly adopted by the geospatial community during the last decade through the utilization of open standards for expressing and serving geospatial data. This has also been dramatically assisted by the ever-increasing access to and usage of geographic mapping and location-based services via smart devices in people's daily activities. In this paper, we explore the developmental framework of a pure JavaScript client-side GIS platform exclusively based on invocable geospatial Web services. We also extend JavaScript utilization on the server side by deploying a node server acting as a bridge between open source WPS libraries and popular geoprocessing engines. The vehicle for such an exploration is a cross-platform Web browser capable of interpreting JavaScript commands to achieve interaction with geospatial providers. The tool is a generic Web interface providing capabilities for acquiring spatial datasets, composing layouts and applying geospatial processes. In an ideal form, the end-user has to identify those services which satisfy a geo-related need and put them in the appropriate row. The final output may act as a potential collector of freely available geospatial web services. Its server-side components may exploit geospatial processing suppliers, composing in that way a lightweight, fully transparent open Web GIS platform.

  12. BPELPower—A BPEL execution engine for geospatial web services

    Science.gov (United States)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling the Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios are discussed in detail to demonstrate the capabilities of BPELPower. The study shows a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broader Web paradigms.
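
    The kind of chaining a BPEL engine such as BPELPower orchestrates can be pictured, outside any engine, as sequential calls in which the output of one standard service feeds the next. The plain Python sketch below uses hypothetical endpoints, a hypothetical type name and process identifier, and omits the fault handling and coordination a real workflow engine adds.

```python
# Plain-Python picture of geospatial service chaining: fetch GML features
# from a WFS, then hand them to a WPS process. All endpoints, the type
# name and the process identifier are hypothetical.
import requests

WFS = "https://example.org/wfs"  # hypothetical Web Feature Service
WPS = "https://example.org/wps"  # hypothetical Web Processing Service

# Step 1: retrieve input features as GML.
gml = requests.get(WFS, params={
    "service": "WFS", "version": "1.1.0", "request": "GetFeature",
    "typeName": "demo:watersheds", "maxFeatures": "50",
}, timeout=60).content

# Step 2: invoke a processing service with an Execute request document
# (inputs omitted for brevity; a real request embeds or references the GML).
execute_request = (
    b"<wps:Execute xmlns:wps='http://www.opengis.net/wps/1.0.0' "
    b"service='WPS' version='1.0.0'>"
    b"<ows:Identifier xmlns:ows='http://www.opengis.net/ows/1.1'>"
    b"demo:buffer</ows:Identifier>"
    b"</wps:Execute>"
)
result = requests.post(WPS, data=execute_request,
                       headers={"Content-Type": "text/xml"}, timeout=60)
print(len(gml), result.status_code)
```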

  13. Automated geospatial Web Services composition based on geodata quality requirements

    Science.gov (United States)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
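
    An illustrative reduction of the idea above: a geodata quality requirement, expressed as a rule, is checked against dataset metadata before the dataset is accepted as input to the next service in the composition, and nonconforming inputs trigger a contingency branch. Metadata fields, thresholds and the branching comment are all invented for the example.

```python
# Illustrative rule check in the spirit of quality-aware composition:
# fields, thresholds and candidate datasets are hypothetical.
candidate_scenes = [
    {"id": "scene-a", "cloud_cover_pct": 8,  "ground_resolution_m": 30},
    {"id": "scene-b", "cloud_cover_pct": 45, "ground_resolution_m": 30},
    {"id": "scene-c", "cloud_cover_pct": 5,  "ground_resolution_m": 250},
]

def meets_quality_rules(meta):
    """Rule-based geodata quality requirement (hypothetical thresholds)."""
    return meta["cloud_cover_pct"] <= 10 and meta["ground_resolution_m"] <= 60

usable = [s for s in candidate_scenes if meets_quality_rules(s)]
rejected = [s["id"] for s in candidate_scenes if s not in usable]

print("usable inputs:", [s["id"] for s in usable])
# A conditional plan would branch here, e.g. invoking an alternative data
# source or a resampling service for the rejected scenes.
print("needs contingency branch:", rejected)
```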

  14. Using the Geospatial Web to Deliver and Teach Giscience Education Programs

    Science.gov (United States)

    Veenendaal, B.

    2015-05-01

    Geographic information science (GIScience) education has undergone enormous changes over the past years. One major factor influencing this change is the role of the geospatial web in GIScience. In addition to the use of the web for enabling and enhancing GIScience education, it is also used as the infrastructure for communicating and collaborating among geospatial data and users. The web becomes both the means and the content for a geospatial education program. However, the web does not replace the traditional face-to-face environment, but rather is a means to enhance it, expand it and enable an authentic and real world learning environment. This paper outlines the use of the web in both the delivery and content of the GIScience program at Curtin University. The teaching of the geospatial web, web and cloud based mapping, and geospatial web services are key components of the program, and the use of the web and online learning are important to deliver this program. Some examples of authentic and real world learning environments are provided including joint learning activities with partner universities.

  15. Geospatial metadata retrieval from web services

    Directory of Open Access Journals (Sweden)

    Ivanildo Barbosa

    Nowadays, producers of geospatial data in either raster or vector formats are able to make them available on the World Wide Web by deploying web services that enable users to access and query those contents even without specific software for geoprocessing. Several providers around the world have deployed instances of WMS (Web Map Service), WFS (Web Feature Service) and WCS (Web Coverage Service), all of them specified by the Open Geospatial Consortium (OGC). In consequence, metadata about the available contents can be retrieved and compared with similar offline datasets from other sources. This paper presents a brief summary of, and describes the matching process between, the specifications for OGC web services (WMS, WFS and WCS) and the metadata specifications required by ISO 19115 - adopted as a reference for several national metadata profiles, including the Brazilian one. This process focuses on retrieving metadata about the identification and data quality packages, and also indicates directions for retrieving metadata related to other packages. Therefore, users are able to assess whether the provided contents fit their purposes.
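
    A minimal sketch of the kind of harvesting described above: request a WMS GetCapabilities document and pull out fields that map onto ISO 19115 identification metadata (title, abstract, layer extents). The endpoint is hypothetical; element paths follow WMS 1.3.0.

```python
# Sketch: extract identification metadata from a WMS GetCapabilities
# response. The service URL is hypothetical; paths follow WMS 1.3.0.
import requests
import xml.etree.ElementTree as ET

WMS = "https://example.org/wms"  # hypothetical service discovered on the web
ns = {"wms": "http://www.opengis.net/wms"}

caps = requests.get(WMS, params={
    "service": "WMS", "request": "GetCapabilities", "version": "1.3.0",
}, timeout=30)
root = ET.fromstring(caps.content)

# Service-level identification (candidate ISO 19115 title/abstract).
print("title:   ", root.findtext("wms:Service/wms:Title", namespaces=ns))
print("abstract:", root.findtext("wms:Service/wms:Abstract", namespaces=ns))

# Layer-level extents (candidate ISO 19115 geographic bounding boxes).
for layer in root.findall(".//wms:Layer", ns):
    name = layer.findtext("wms:Name", namespaces=ns)
    bbox = layer.find("wms:EX_GeographicBoundingBox", ns)
    if name and bbox is not None:
        west = bbox.findtext("wms:westBoundLongitude", namespaces=ns)
        east = bbox.findtext("wms:eastBoundLongitude", namespaces=ns)
        south = bbox.findtext("wms:southBoundLatitude", namespaces=ns)
        north = bbox.findtext("wms:northBoundLatitude", namespaces=ns)
        print(name, (west, south, east, north))
```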

  16. The geospatial web: how geobrowsers, social software and the web 2.0 are shaping the network society

    CERN Document Server

    Scharl, Arno; Tochtermann, Klaus

    2007-01-01

    The Geospatial Web will have a profound impact on managing knowledge, structuring work flows within and across organizations, and communicating with like-minded individuals in virtual communities. The enabling technologies for the Geospatial Web are geo-browsers such as NASA World Wind, Google Earth and Microsoft Live Local 3D. These three-dimensional platforms revolutionize the production and consumption of media products. They not only reveal the geographic distribution of Web resources and services, but also bring together people of similar interests, browsing behavior, or geographic location. This book summarizes the latest research on the Geospatial Web's technical foundations, describes information services and collaborative tools built on top of geo-browsers, and investigates the environmental, social and economic impacts of geospatial applications. The role of contextual knowledge in shaping the emerging network society deserves particular attention. By integrating geospatial and semantic technology, ...

  17. A geospatial search engine for discovering multi-format geospatial data across the web

    Science.gov (United States)

    Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney

    2014-01-01

    The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease with which data can now be created. However, challenges remain with connecting individuals searching for geospatial data with servers and websites where such data exist. The objective of this paper is to present a publicly...

  18. An Ontology-supported Approach for Automatic Chaining of Web Services in Geospatial Knowledge Discovery

    Science.gov (United States)

    Di, L.; Yue, P.; Yang, W.; Yu, G.

    2006-12-01

    Recent developments in the geospatial semantic Web have shown promise for the automatic discovery, access, and use of geospatial Web services to quickly and efficiently solve particular application problems. With semantic Web technology, it is highly feasible to construct intelligent geospatial knowledge systems that can provide answers to many geospatial application questions. A key challenge in constructing such an intelligent knowledge system is to automate the creation of a chain or process workflow that involves multiple services and highly diversified data and can generate the answer to a specific question from users. This presentation discusses an approach for automating the composition of geospatial Web service chains by employing geospatial semantics described by geospatial ontologies. It shows how ontology-based geospatial semantics are used for enabling the automatic discovery, mediation, and chaining of geospatial Web services. OWL-S is used to represent the geospatial semantics of each individual Web service, the type of service it belongs to, and the type of data it can handle. The hierarchy and classification of service types are described in the service ontology. The hierarchy and classification of data types are presented in the data ontology. For answering users' geospatial questions, an Artificial Intelligence (AI) planning algorithm is used to construct the service chain by using the service and data logic expressed in the ontologies. The chain can be expressed as a graph, with nodes representing services and connection weights representing degrees of semantic matching between nodes. The graph is a visual representation of the logical geo-processing path for answering users' questions. The graph can be instantiated to a physical service workflow for execution to generate the answer to a user's question. A prototype system, which includes real-world geospatial applications, is implemented to demonstrate the concept and approach.
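
    The chain-as-graph idea above can be sketched directly: services are nodes, edge weights encode the degree of semantic matching between an output and the next input, and a planner-like search extracts the best chain. Service names and similarity scores below are invented for illustration, and the sketch stands in for the AI planner described in the abstract.

```python
# Sketch of a service chain as a weighted graph. Names and similarity
# scores are invented; a shortest-path search stands in for the planner.
import networkx as nx

G = nx.DiGraph()
# add_edge(provider, consumer, similarity of output->input interfaces)
G.add_edge("LandsatDataService", "ReprojectionService", similarity=0.9)
G.add_edge("LandsatDataService", "NDVIService",         similarity=0.4)
G.add_edge("ReprojectionService", "NDVIService",        similarity=0.8)
G.add_edge("NDVIService", "DroughtIndexService",        similarity=0.7)

# Convert similarity (higher is better) into a cost (lower is better) so a
# shortest-path search prefers chains with strong semantic matches.
for u, v, data in G.edges(data=True):
    data["cost"] = 1.0 - data["similarity"]

chain = nx.shortest_path(G, "LandsatDataService", "DroughtIndexService",
                         weight="cost")
print(" -> ".join(chain))
```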

  19. An Automated End-To-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services

    Science.gov (United States)

    Shah, M.; Verma, Y.; Nandakumar, R.

    2012-07-01

    Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduction of business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain, and geospatial Web services address this problem. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web, which are often both computation- and data-intensive and involve diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to the service requester based on QoS.
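
    A minimal picture of QoS-based selection among functionally equivalent services: normalize each non-functional attribute and rank candidates by a weighted score. The candidate services, QoS values and weights below are invented, and the multi-agent machinery of the paper is not represented.

```python
# Sketch of QoS-driven selection: weighted scoring over normalized
# non-functional attributes. All values and weights are hypothetical.
candidates = {
    "wms-provider-a": {"response_ms": 250, "availability": 0.999, "cost": 0.0},
    "wms-provider-b": {"response_ms": 120, "availability": 0.990, "cost": 0.1},
    "wms-provider-c": {"response_ms": 400, "availability": 0.950, "cost": 0.0},
}
weights = {"response_ms": 0.4, "availability": 0.4, "cost": 0.2}

def normalize(metric, value):
    """Scale a QoS value to [0, 1] so that 1 is always 'better'."""
    values = [qos[metric] for qos in candidates.values()]
    lo, hi = min(values), max(values)
    if hi == lo:
        return 1.0
    scaled = (value - lo) / (hi - lo)
    # Lower is better for response time and cost; higher for availability.
    return scaled if metric == "availability" else 1.0 - scaled

scores = {
    name: sum(weights[m] * normalize(m, qos[m]) for m in weights)
    for name, qos in candidates.items()
}
print(sorted(scores.items(), key=lambda kv: -kv[1]))
print("best-fit service:", max(scores, key=scores.get))
```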

  20. GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.

    Science.gov (United States)

    Liang, Steve H L; Huang, Chih-Yuan

    2013-10-02

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with the demonstration of three real-world sensor web applications powered by GeoCENS, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  1. GeoCENS: A Geospatial Cyberinfrastructure for the World-Wide Sensor Web

    Directory of Open Access Journals (Sweden)

    Steve H.L. Liang

    2013-10-01

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with the demonstration of three real-world sensor web applications powered by GeoCENS, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  2. Using a Web GIS Plate Tectonics Simulation to Promote Geospatial Thinking

    Science.gov (United States)

    Bodzin, Alec M.; Anastasio, David; Sharif, Rajhida; Rutzmoser, Scott

    2016-01-01

    Learning with Web-based geographic information system (Web GIS) can promote geospatial thinking and analysis of georeferenced data. Web GIS can enable learners to analyze rich data sets to understand spatial relationships that are managed in georeferenced data visualizations. We developed a Web GIS plate tectonics simulation as a capstone learning…

  3. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value added information products to answer a wide range of questions. This article provides an overview of OMAR web based geospatial processing. OMAR is part of the Open Source Software Image Map project under the Open Source Geospatial Foundation. The primary contributors of OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example that open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near term development.

  4. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    Science.gov (United States)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

    With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from these massive and heterogeneous resources. The developments of past decades witnessed the operation of many service components to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge, for the following reasons: 1) Entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata, and it is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and the overhead of maintaining data consistency. 3) Heterogeneous semantics complicate data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues; however, integrating semantic technologies with existing services is challenging due to the expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most of the existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results. Furthermore, the presentation of the value

  5. The National 3-D Geospatial Information Web-Based Service of Korea

    Science.gov (United States)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis tools, be able to use all capabilities of the third dimension, and offer visualization. Currently, many human activities make steps toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, military applications, etc. To reflect this trend, the Korean government has started to construct the 3D geospatial data and service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service for people interested in this industry; we introduce not only the present conditions of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data has been constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, level of detail (LOD) 4 data, consisting of photo-realistic textured 3D models with corresponding ortho photographs, were constructed in 2012 for six metropolitan cities and Dokdo (an island belonging to Korea). In this paper, we present the composition and infrastructure of the web-based 3D map service system, and a comparison of V-World with the Google Earth service. We also present Open API-based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent an invasion of privacy, we processed image blurring, elimination and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea. Thus, the progress of

  6. Web mapping system for complex processing and visualization of environmental geospatial datasets

    Science.gov (United States)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services as well as client applications turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map legends and corresponding metadata information. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on the SOA and might be considered as complexes of interconnected software tools for working with geospatial data. In the report, a complex web mapping system including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive is presented. The GIS web client consists of three basic tiers: 1) a tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format; 2) a tier of JavaScript objects implementing methods for handling NetCDF metadata, a Task XML object for configuring user calculations and input/output formats, and OGC WMS/WFS cartographic services; 3) a graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic. The metadata tier consists of a number of JSON objects containing technical information describing the geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.). The middleware tier of JavaScript objects implementing methods for handling geospatial

  7. Geospatial Information Relevant to the Flood Protection Available on The Mainstream Web

    Directory of Open Access Journals (Sweden)

    Kliment Tomáš

    2014-03-01

    Flood protection is one of several disciplines where geospatial data are very important and a crucial component. Their management, processing and sharing form the foundation for their efficient use; therefore, special attention is required in the development of effective, precise, standardized, and interoperable models for the discovery and publishing of data on the Web. This paper describes the design of a methodology to discover Open Geospatial Consortium (OGC) services on the Web and collect descriptive information, i.e., metadata, in a geocatalogue. A pilot implementation of the proposed methodology - a geocatalogue of geospatial information provided by OGC services discovered on Google (hereinafter "Geocatalogue") - was used to search for available resources relevant to the area of flood protection. The result is an analysis of the availability of resources discovered through their metadata collected from the OGC services (WMS, WFS, etc.) and the resources they provide (WMS layers, WFS objects, etc.) within the domain of flood protection.
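
    The discovery step described above can be sketched as probing candidate URLs (found, e.g., through a web search) with a GetCapabilities request and keeping those that answer with an OGC capabilities document. The candidate URLs below are hypothetical, and the sketch only covers the WMS case.

```python
# Sketch: probe candidate URLs for OGC WMS capabilities documents.
# The candidate URLs are hypothetical placeholders.
import requests
import xml.etree.ElementTree as ET

candidates = [
    "https://maps.example.org/wms",   # hypothetical
    "https://data.example.net/ows",   # hypothetical
]

def probe_wms(url):
    """Return the service title if the URL behaves like a WMS, else None."""
    try:
        resp = requests.get(url, params={"service": "WMS",
                                         "request": "GetCapabilities"},
                            timeout=15)
        root = ET.fromstring(resp.content)
    except (requests.RequestException, ET.ParseError):
        return None
    if "Capabilities" not in root.tag:  # e.g. WMS_Capabilities
        return None
    # The Title element lives under Service/ in both WMS 1.1.1 and 1.3.0.
    title = root.find(".//{*}Service/{*}Title")
    return title.text if title is not None else "(untitled service)"

for url in candidates:
    title = probe_wms(url)
    print(url, "->", title or "not an OGC WMS")
```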

  8. Informal information for web-based engineering catalogues

    Science.gov (United States)

    Allen, Richard D.; Culley, Stephen J.; Hicks, Ben J.

    2001-10-01

    Success is highly dependent on the ability of a company to efficiently produce optimal designs. In order to achieve this, companies must minimize time to market and possess the ability to make fully informed decisions at the early phase of the design process. Such decisions may include the choice of component and suppliers, as well as cost and maintenance considerations. Computer modeling and electronic catalogues are becoming the preferred medium for the selection and design of mechanical components. In utilizing these techniques, the designer demands the capability to identify, evaluate and select mechanical components both quantitatively and qualitatively. Quantitative decisions generally encompass performance data included in the formal catalogue representation. It is in the area of qualitative decisions that the use of what the authors call 'Informal Information' is of crucial importance. Thus, 'Informal Information' must often be incorporated into the selection process and selection systems. This would enable more informed decisions to be made more quickly, without the need for information retrieval via discussion with colleagues in the design environment. This paper provides an overview of the use of electronic information in the design of mechanical systems, including a discussion of the limitations of current technology. The importance of Informal Information is discussed and the requirements for association with web-based electronic catalogues are developed. The proposed system is based on a flexible XML schema and enables the storage, classification and recall of Informal Information packets. Furthermore, a strategy for the inclusion of Informal Information is proposed, and an example case is used to illustrate the benefits.

  9. DEVELOPING WEB MAPPING APPLICATION USING ARCGIS SERVER WEB APPLICATION DEVELOPMENT FRAMEWORK (ADF) FOR GEOSPATIAL DATA GENERATED DURING REHABILITATION AND RECONSTRUCTION PROCESS OF POST-TSUNAMI 2004 DISASTER IN ACEH

    Directory of Open Access Journals (Sweden)

    Nizamuddin Nizamuddin

    2014-04-01

    ESRI ArcGIS Server is equipped with the ArcGIS Server Web Application Development Framework (ADF) and ArcGIS Web Controls integration for Visual Studio.NET. Both the ArcGIS Server Manager for .NET and the ArcGIS Web Controls can be easily utilized for developing ASP.NET-based ESRI Web mapping applications. In this study we implemented both tools for developing an ASP.NET-based ESRI Web mapping application for geospatial data generated during the rehabilitation and reconstruction process of the post-tsunami 2004 disaster in Aceh province. The rehabilitation and reconstruction process has produced a tremendous amount of geospatial data. This method was chosen in this study because, in the process of developing a web mapping application, one can easily and quickly create Mapping Services of huge geospatial data and also develop the Web mapping application without writing any code. However, when utilizing Visual Studio.NET 2008, one needs to have some coding ability.

  10. Streamlining geospatial metadata in the Semantic Web

    Science.gov (United States)

    Fugazza, Cristiano; Pepe, Monica; Oggioni, Alessandro; Tagliolato, Paolo; Carrara, Paola

    2016-04-01

    In the geospatial realm, data annotation and discovery rely on a number of ad-hoc formats and protocols. These have been created to enable domain-specific use cases for which generalized search is not feasible. Metadata are at the heart of the discovery process, and nevertheless they are often neglected or encoded in formats that either are not aimed at efficient retrieval of resources or are plainly outdated. In particular, the quantum leap represented by the Linked Open Data (LOD) movement has so far not induced a consistent, interlinked baseline in the geospatial domain. In a nutshell, datasets, the scientific literature related to them, and ultimately the researchers behind these products are only loosely connected; the corresponding metadata are intelligible only to humans, duplicated on different systems, and seldom consistent. Instead, our workflow for metadata management envisages i) editing via customizable web-based forms, ii) encoding of records in any XML application profile, iii) translation into RDF (involving the semantic lift of metadata records), and finally iv) storage of the metadata as RDF and back-translation into the original XML format with added semantics-aware features. Phase iii) hinges on relating resource metadata to RDF data structures that represent keywords from code lists and controlled vocabularies, toponyms, researchers, institutes, and virtually any description one can retrieve (or directly publish) in the LOD Cloud. In the context of a distributed Spatial Data Infrastructure (SDI) built on free and open-source software, we detail phases iii) and iv) of our workflow for the semantics-aware management of geospatial metadata.
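
    A minimal sketch of the "semantic lift" of phase iii): fields taken from an XML metadata record are re-expressed as RDF triples that can link out to LOD resources. The dataset URI, the vocabulary URI and the literal values below are invented examples, and DCAT/Dublin Core terms are used only as plausible target vocabularies.

```python
# Sketch of lifting XML metadata fields into RDF (phase iii above).
# All URIs and literal values are hypothetical examples.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

DCAT = Namespace("http://www.w3.org/ns/dcat#")

g = Graph()
g.bind("dcat", DCAT)
g.bind("dcterms", DCTERMS)

dataset = URIRef("https://example.org/dataset/lake-temperature")  # hypothetical
g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Lake water temperature, 2010-2015")))
# A free-text keyword is kept as a literal and also linked to a
# controlled-vocabulary concept (hypothetical URI standing in for a
# real code-list or thesaurus entry in the LOD Cloud).
g.add((dataset, DCAT.keyword, Literal("water temperature")))
g.add((dataset, DCTERMS.subject,
       URIRef("https://example.org/vocabulary/waterTemperature")))

print(g.serialize(format="turtle"))
```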

  11. Geospatial Web Services in Real Estate Information System

    Science.gov (United States)

    Radulovic, Aleksandra; Sladic, Dubravka; Govedarica, Miro; Popovic, Dragana; Radovic, Jovana

    2017-12-01

    Since the data of cadastral records are of great importance for the economic development of the country, they must be well structured and organized. Records of real estate on the territory of Serbia faced many problems in previous years. To prevent these problems and to achieve efficient access, sharing and exchange of cadastral data on the principles of interoperability, a domain model for real estate has been created according to current standards in the field of spatial data. The resulting profile of the domain model for the Serbian real estate cadastre is based on the current legislation and on the Land Administration Domain Model (LADM), which is specified in the ISO 19152 standard. On top of such organized data, and for their effective exchange, it is necessary to develop a model of the services that must be provided by the institutions interested in the exchange of cadastral data. This is achieved by introducing a service-oriented architecture into the information system of the real estate cadastre, which ensures the efficiency of the system. It is necessary to develop user services for download, review and use of real estate data through the web. These services should be provided to all users who need access to cadastral data (natural and legal persons as well as state institutions) through e-government. It is also necessary to provide search, view and download of cadastral spatial data by specifying geospatial services. Considering that real estate records contain geometric data for parcels and buildings, it is necessary to establish a set of geospatial services that would provide information and maps for the analysis of spatial data and for forming raster data. Besides the theme Cadastral parcels, the INSPIRE directive specifies several themes that involve data on buildings and land use, for which data can be provided from the real estate cadastre. In this paper, a model of geospatial services in Serbia is defined. A case study of using these services to estimate which household is at risk of

  12. Operational Marine Data Acquisition and Delivery Powered by Web and Geospatial Standards

    Science.gov (United States)

    Thomas, R.; Buck, J. J. H.

    2015-12-01

    As novel sensor types and new platforms are deployed to monitor the global oceans, the volumes of scientific and environmental data collected in the marine context are rapidly growing. In order to use these data in both traditional operational modes and in innovative "Big Data" applications, the data must be readily understood by software agents. One approach to achieving this is the application of both World Wide Web and Open Geospatial Consortium standards: namely Linked Data [1] and Sensor Web Enablement [2] (SWE). The British Oceanographic Data Centre (BODC) is adopting this strategy in a number of European Commission funded projects (NETMAR; SenseOCEAN; Ocean Data Interoperability Platform - ODIP; and AtlantOS) to combine its existing data archiving architecture with SWE components (such as Sensor Observation Services) and a Linked Data interface. These will evolve the data management and data transfer from a process that requires significant manual intervention to an automated operational process enabling the rapid, standards-based ingestion and delivery of data. This poster will show the current capabilities of BODC and the status of the on-going implementation of this strategy. References: 1. World Wide Web Consortium. (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 7th April 2015. 2. Open Geospatial Consortium. (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8th October 2014.
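
    A small client-side sketch of the SWE component mentioned above: request the capabilities document of a Sensor Observation Service and list the operations it advertises. The endpoint URL is hypothetical, and the sketch stops short of an actual GetObservation call.

```python
# Sketch of interrogating an OGC Sensor Observation Service (SWE):
# fetch GetCapabilities and list advertised operations. Hypothetical URL.
import requests
import xml.etree.ElementTree as ET

SOS = "https://example.org/sos"  # hypothetical marine SOS endpoint

caps = requests.get(SOS, params={
    "service": "SOS", "request": "GetCapabilities",
}, timeout=30)
root = ET.fromstring(caps.content)

ows = {"ows": "http://www.opengis.net/ows/1.1"}
print("service:", root.findtext("ows:ServiceIdentification/ows:Title",
                                namespaces=ows))
for op in root.findall(".//ows:OperationsMetadata/ows:Operation", ows):
    print("operation:", op.get("name"))  # e.g. GetObservation, DescribeSensor
```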

  13. Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies

    Science.gov (United States)

    Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.

    2007-12-01

    The value of real-time hydrologic data dissemination, including river stage, streamflow, and precipitation, for operational stormwater management efforts is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed due to administrative, political, but mostly technical barriers. Recent efforts on providing unified access to hydrological data have concentrated on creating new SOAP-based web services and common data formats (e.g. WaterML and the Observation Data Model) for users to access the data (e.g. HIS and HydroSeek). Geospatial Web technology, including OGC Sensor Web Enablement (SWE), GeoRSS, geotags, geospatial browsers such as Google Earth and Microsoft Virtual Earth, and other location-based service tools, provides possibilities for us to interact with a digital watershed in near-real-time. OGC SWE proposes a revolutionary concept towards web-connected/controllable sensor networks. However, these efforts have not provided the capability to allow dynamic data integration/fusion among heterogeneous sources, data filtering, and support for workflows or domain-specific applications where both push and pull modes of retrieving data may be needed. We propose a lightweight integration framework by extending SWE with an open source Enterprise Service Bus (e.g., Mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We will report our progress on building such a framework, where multi-agency sensor data and hydro-model outputs (with map layers) will be integrated and disseminated in a geospatial browser (e.g. Microsoft Virtual Earth). This is a collaborative project among NCSA, USGS Illinois Water

  14. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    Science.gov (United States)

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  15. Describing Geospatial Assets in the Web of Data: A Metadata Management Scenario

    Directory of Open Access Journals (Sweden)

    Cristiano Fugazza

    2016-12-01

    Metadata management is an essential enabling factor for geospatial assets because discovery, retrieval, and actual usage of the latter are tightly bound to the quality of these descriptions. Unfortunately, the multi-faceted landscape of metadata formats, requirements, and conventions makes it difficult to identify editing tools that can be easily tailored to the specificities of a given project, workgroup, and Community of Practice. Our solution is a template-driven metadata editing tool that can be customised to any XML-based schema. Its output is constituted by standards-compliant metadata records that also have a semantics-aware counterpart eliciting novel exploitation techniques. Moreover, external data sources can easily be plugged in to provide autocompletion functionalities on the basis of the data structures made available on the Web of Data. Besides presenting the essentials of customising the editor by means of two use cases, we extend the methodology to the whole life cycle of geospatial metadata. We demonstrate the novel capabilities enabled by RDF-based metadata representation with respect to traditional metadata management in the geospatial domain.

  16. The new OGC Catalogue Services 3.0 specification - status of work

    Science.gov (United States)

    Bigagli, Lorenzo; Voges, Uwe

    2013-04-01

    We report on the work of the Open Geospatial Consortium Catalogue Services 3.0 Standards Working Group (OGC Cat 3.0 SWG for short), started in March 2008 with the purpose of processing change requests on the Catalogue Services 2.0.2 Implementation Specification (OGC 07-006r1) and producing a revised version thereof, comprising the related XML schemas and abstract test suite. The work was initially intended as a minor revision (version 2.1), but was later retargeted as a major update of the standard and rescheduled (the anticipated roadmap ended in 2008). The target audience of Catalogue Services 3.0 includes: • Implementors of catalogue services solutions. • Designers and developers of catalogue services profiles. • Providers/users of catalogue services. The two main general areas of enhancement were: restructuring the specification document according to the OGC standard for modular specifications (OGC 08-131r3, also known as the Core and Extension model); and incorporating current mass-market technologies for discovery on the Web, namely OpenSearch. The document was initially split into four parts: the general model and the three protocol bindings HTTP, Z39.50, and CORBA. The CORBA binding, which was very rarely implemented, and the Z39.50 binding were later dropped. Parts of the Z39.50 binding, namely Search/Retrieve via URL (SRU; same semantics as Z39.50, but stateless), have been provided as a discussion paper (OGC 12-082) for possibly developing a future SRU profile. The Catalogue Services 3.0 specification is structured as follows: • Part 1: General Model (Core) • Part 2: HTTP Protocol Binding (CSW) In CSW, the GET/KVP encoding is mandatory. The POST/XML encoding is optional. SOAP is supported as a special case of the POST/XML encoding. OpenSearch must always be supported, regardless of the implemented profiles, along with the OpenSearch Geospatial and Temporal Extensions (OGC 10-032r2). The latter specifies spatial (e.g. point-plus-radius, bounding
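
    A minimal sketch of the OpenSearch interaction that Catalogue Services 3.0 makes mandatory: fill an OpenSearch URL with a free-text term and a geo:box parameter (west,south,east,north) from the OGC Geo and Time extensions. The endpoint URL and parameter names below stand in for what a real service advertises in its OpenSearch description document.

```python
# Sketch of an OpenSearch query with the OGC geo extension's bounding box.
# The endpoint and the concrete parameter names are hypothetical; a real
# client reads them from the service's OpenSearch description document,
# e.g. a template like ...?q={searchTerms}&bbox={geo:box?}.
import requests

ENDPOINT = "https://example.org/catalogue/opensearch"  # hypothetical

params = {
    "q": "land cover",               # maps to {searchTerms}
    "bbox": "-10.0,35.0,30.0,60.0",  # maps to {geo:box}: west,south,east,north
}
response = requests.get(ENDPOINT, params=params, timeout=30)
print(response.status_code, response.headers.get("Content-Type"))
```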

  17. Exchanging the Context between OGC Geospatial Web clients and GIS applications using Atom

    Science.gov (United States)

    Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier

    2013-04-01

    Currently, the discovery and sharing of geospatial information over the web still present difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication presents an extension of Atom to redistribute references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds and direct data, and share this status with other users who need the same information and use different client vendor products, in an interoperable way. The extensibility of the Atom format was essential to define a format that can be used in RSS-enabled web browsers, mass-market map viewers and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, it is possible to view the document in common places where Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description and downloading features. OWS Context uses GeoRSS so that the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context document is to develop an XSLT to transform the Atom feed into an HTML5 document that shows the exact status of the client view window that saved the context document. To accomplish this, we use the width and height of the client window, and the extent of the view in world (geographic) coordinates, in order to calculate the scale of the map. Then, we can mix elements in world coordinates (such as CF-NetCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles and direct SVG content). A smarter map browser application called MiraMon Map Browser is able to write a context document and read
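
    The scale computation mentioned above can be sketched directly: given the client window size in pixels and the view extent in projected (metric) coordinates, the scale denominator follows from the common OGC rendering convention of 0.28 mm per pixel. For geographic (lon/lat) extents a real client first converts degrees to metres; the numbers below are invented.

```python
# Sketch of deriving a map scale denominator from window size and extent,
# assuming a metric extent and the 0.28 mm-per-pixel rendering convention.
PIXEL_SIZE_M = 0.00028  # standardized rendering pixel size (0.28 mm)

def scale_denominator(extent_width_m, window_width_px):
    """Ground width shown in the window divided by its physical width."""
    return extent_width_m / (window_width_px * PIXEL_SIZE_M)

# Example: a 1024-pixel-wide window showing a 50 km wide extent.
print(round(scale_denominator(50_000, 1024)))  # roughly 1:174,000
```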

  18. A web service for service composition to aid geospatial modelers

    Science.gov (United States)

    Bigagli, L.; Santoro, M.; Roncella, R.; Mazzetti, P.

    2012-04-01

    The identification of appropriate mechanisms for process reuse, chaining and composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. In the Earth and Space Sciences, such a facility could primarily enable integrated and interoperable modeling, for which several approaches have been proposed and developed over the last years. In fact, GEOSS is specifically tasked with the development of the so-called "Model Web". At increasing levels of abstraction and generalization, the initial stove-pipe software tools have evolved into community-wide modeling frameworks and Component-Based Architecture solutions and, more recently, have started to embrace Service-Oriented Architecture technologies, such as the OGC WPS specification and the WS-* stack of W3C standards for service composition. However, so far the level of abstraction seems too low for implementing the Model Web vision, and far too many complex technological aspects must still be addressed by both providers and users, resulting in limited usability and, eventually, difficult uptake. In line with the recent ICT trend of resource virtualization, it has been suggested that users in need of a particular processing capability required by a given modeling workflow may benefit from outsourcing the composition activities to an external first-class service, according to the Composition as a Service (CaaS) approach. A CaaS system provides the necessary interoperability service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of executable workflows. This work introduces the architecture of a CaaS system, as a distributed information system for creating, validating, editing, storing, publishing, and executing geospatial workflows. This way, the users can be freed from the need of a composition infrastructure and

  19. Investigating Climate Change Issues With Web-Based Geospatial Inquiry Activities

    Science.gov (United States)

    Dempsey, C.; Bodzin, A. M.; Sahagian, D. L.; Anastasio, D. J.; Peffer, T.; Cirucci, L.

    2011-12-01

    In the Environmental Literacy and Inquiry middle school Climate Change curriculum we focus on essential climate literacy principles with an emphasis on weather and climate, Earth system energy balance, greenhouse gases, paleoclimatology, and how human activities influence climate change (http://www.ei.lehigh.edu/eli/cc/). It incorporates a related framework and set of design principles to provide guidance for the development of the geospatial technology-integrated Earth and environmental science curriculum materials. Students use virtual globes, Web-based tools including an interactive carbon calculator and geologic timeline, and inquiry-based lab activities to investigate climate change topics. The curriculum includes educative curriculum materials that are designed to promote and support teachers' learning of important climate change content and issues, geospatial pedagogical content knowledge, and geographic spatial thinking. The curriculum includes baseline instructional guidance for teachers and provides implementation and adaptation guidance for teaching with diverse learners, including low-level readers, English language learners and students with disabilities. In the curriculum, students use geospatial technology tools including Google Earth with embedded spatial data to investigate global temperature changes, areas affected by climate change, evidence of climate change, and the effects of sea level rise on the existing landscape. We conducted a design-based research implementation study with urban middle school students. Findings showed that use of the Climate Change curriculum led to significant improvement in urban middle school students' understanding of climate change concepts.

  20. Integrated web system of geospatial data services for climate research

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander

    2016-04-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required. An approach for integrated analysis of georeferenced climatological data sets, based on a combination of web and GIS technologies within the spatial data infrastructure paradigm, is presented. According to this approach a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and the OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.

  1. Global polar geospatial information service retrieval based on search engine and ontology reasoning

    Science.gov (United States)

    Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang

    2007-01-01

    In order to improve the access precision of polar geospatial information services on the web, a new methodology for retrieving global spatial information services based on geospatial service search and ontology reasoning is proposed: the geospatial service search is implemented to find coarse services from the web, and the ontology reasoning is designed to find refined services from the coarse services. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include search-engine-based service discovery and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.

  2. Interoperability in planetary research for geospatial data analysis

    Science.gov (United States)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
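
    Because the services listed above follow OGC interfaces, a planetary WMS can be consumed with generic client libraries. The sketch below uses OWSLib in Python; the endpoint URL and layer name are hypothetical stand-ins for any WMS-compliant planetary mapping server.

      from owslib.wms import WebMapService

      # Hypothetical planetary WMS endpoint and layer name.
      wms = WebMapService("https://planetarymaps.example.org/wms", version="1.1.1")

      # List the layers advertised in the capabilities document.
      for name, layer in wms.contents.items():
          print(name, "-", layer.title)

      # Fetch a global mosaic as PNG for a longitude/latitude bounding box.
      img = wms.getmap(
          layers=["global_mosaic"],
          styles=[""],
          srs="EPSG:4326",
          bbox=(-180, -90, 180, 90),
          size=(1024, 512),
          format="image/png",
      )
      with open("mosaic.png", "wb") as f:
          f.write(img.read())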

  3. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    Science.gov (United States)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure to remotely browse, visualize, and analyze the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
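
    Alongside WMS, the server described above exposes a WCS interface for the gridded products themselves. The sketch below builds a plain WCS 1.0.0 GetCoverage key-value-pair request with the requests library; the endpoint and coverage name are hypothetical.

      import requests

      # Hypothetical WCS endpoint and coverage identifier.
      ENDPOINT = "https://tops.example.org/wcs"

      params = {
          "service": "WCS",
          "version": "1.0.0",
          "request": "GetCoverage",
          "coverage": "gross_primary_production",
          "crs": "EPSG:4326",
          "bbox": "-125,32,-114,42",    # west,south,east,north
          "time": "2008-07-01",
          "width": "512",
          "height": "512",
          "format": "GeoTIFF",
      }

      resp = requests.get(ENDPOINT, params=params, timeout=60)
      resp.raise_for_status()
      with open("gpp_20080701.tif", "wb") as f:
          f.write(resp.content)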

  4. GEO Label Web Services for Dynamic and Effective Communication of Geospatial Metadata Quality

    Science.gov (United States)

    Lush, Victoria; Nüst, Daniel; Bastin, Lucy; Masó, Joan; Lumsden, Jo

    2014-05-01

    We present demonstrations of the GEO label Web services and their integration into a prototype extension of the GEOSS portal (http://scgeoviqua.sapienzaconsulting.com/web/guest/geo_home), the GMU portal (http://gis.csiss.gmu.edu/GADMFS/) and a GeoNetwork catalog application (http://uncertdata.aston.ac.uk:8080/geonetwork/srv/eng/main.home). The GEO label is designed to communicate, and facilitate interrogation of, geospatial quality information with a view to supporting efficient and effective dataset selection on the basis of quality, trustworthiness and fitness for use. The GEO label which we propose was developed and evaluated according to a user-centred design (UCD) approach in order to maximise the likelihood of user acceptance once deployed. The resulting label is dynamically generated from producer metadata in ISO or FGDC format, and incorporates user feedback on dataset usage, ratings and discovered issues, in order to supply a highly informative summary of metadata completeness and quality. The label was easily incorporated into a community portal as part of the GEO Architecture Implementation Programme (AIP-6) and has been successfully integrated into a prototype extension of the GEOSS portal, as well as the popular metadata catalog and editor, GeoNetwork. The design of the GEO label was based on 4 user studies conducted to: (1) elicit initial user requirements; (2) investigate initial user views on the concept of a GEO label and its potential role; (3) evaluate prototype label visualizations; and (4) evaluate and validate physical GEO label prototypes. The results of these studies indicated that users and producers support the concept of a label with a drill-down interrogation facility, combining eight geospatial data informational aspects, namely: producer profile, producer comments, lineage information, standards compliance, quality information, user feedback, expert reviews, and citations information. These are delivered as eight facets of a wheel

  5. Digital content sewed together within a library catalogue WebLib - The CERN Document Server

    CERN Document Server

    Vigen, Jens

    2002-01-01

    Aggregation, harvesting, personalization techniques, portals, service provision, etc. have all become buzzwords, most of them simply describing what librarians have been doing for hundreds of years. Prior to the Web, few people outside the libraries were concerned about these issues, a situation which today is completely turned upside down. Hopefully the new actors in the arena of knowledge management will take full advantage of all the available "savoir faire". At CERN, the European Organization for Nuclear Research, librarians and informaticians have set up a complete system, WebLib, based on the traditional library catalogue. Digital content is, within this framework, being integrated to the highest possible level in order to meet the strong requirements of the particle physics community. The paper gives an overview of the steps CERN has taken towards the digital library, from the day the laboratory conceived the World Wide Web to the present.

  6. Geospatial health

    DEFF Research Database (Denmark)

    Utzinger, Jürg; Rinaldi, Laura; Malone, John B.

    2011-01-01

    Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000 and the inaugural issue of its official journal was published in November 2006 with the aim of covering all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member strong editorial board...

  7. Automating Geospatial Visualizations with Smart Default Renderers for Data Exploration Web Applications

    Science.gov (United States)

    Ekenes, K.

    2017-12-01

    This presentation will outline the process of creating a web application for exploring large amounts of scientific geospatial data using modern automated cartographic techniques. Traditional cartographic methods, including data classification, may inadvertently hide geospatial and statistical patterns in the underlying data. This presentation demonstrates how to use smart web APIs that quickly analyze the data when it loads and provide suggestions for the most appropriate visualizations based on the statistics of the data. Since there are just a few ways to visualize any given dataset well, and since many users don't go beyond default values, it is imperative to provide smart default color schemes tailored to the dataset as opposed to static defaults. Multiple functions for automating visualizations are available in the Smart APIs, along with UI elements allowing users to create more than one visualization for a dataset, since there isn't a single best way to visualize a given dataset. Since bivariate and multivariate visualizations are particularly difficult to create effectively, this automated approach takes the guesswork out of the process and provides a number of ways to generate multivariate visualizations for the same variables. This allows the user to choose which visualization is most appropriate for their presentation. The methods used in these APIs and the renderers generated by them are not available elsewhere. The presentation will show how statistics can be used as the basis for automating default visualizations of data along continuous ramps, creating more refined visualizations while revealing the spread and outliers of the data. Adding interactive components to instantaneously alter visualizations allows users to unearth spatial patterns previously unknown among one or more variables. These applications may focus on a single dataset that is frequently updated, or configurable
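
    The statistics-driven defaults described above can be illustrated with a small conceptual sketch that derives the stops of a continuous color ramp from the mean and standard deviation of the data. This Python sketch is an illustration of the idea only, not the Smart APIs referred to in the abstract; all names and values are invented.

      import statistics

      def ramp_stops(values, n_std=1.0):
          """Return (low, mid, high) data values anchoring a continuous color ramp."""
          mean = statistics.fmean(values)
          std = statistics.pstdev(values)
          return (mean - n_std * std, mean, mean + n_std * std)

      def value_to_color(value, stops,
                         colors=((255, 245, 235), (253, 141, 60), (127, 39, 4))):
          """Linearly interpolate an RGB color for a value along the ramp."""
          low, mid, high = stops
          if value <= low:
              return colors[0]
          if value >= high:
              return colors[2]
          # Pick the segment (low..mid or mid..high) and interpolate within it.
          if value < mid:
              a, b, c0, c1 = low, mid, colors[0], colors[1]
          else:
              a, b, c0, c1 = mid, high, colors[1], colors[2]
          t = (value - a) / (b - a)
          return tuple(round(c0[i] + t * (c1[i] - c0[i])) for i in range(3))

      data = [3.1, 4.7, 5.0, 5.2, 6.8, 7.4, 9.9, 42.0]   # note the outlier
      stops = ramp_stops(data)
      print(stops, value_to_color(5.2, stops))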

  8. GIS-and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Wei [Colorado School of Mines, Golden, CO (United States); Minnick, Matthew [Colorado School of Mines, Golden, CO (United States); Geza, Mengistu [Colorado School of Mines, Golden, CO (United States); Murray, Kyle [Colorado School of Mines, Golden, CO (United States); Mattson, Earl [Colorado School of Mines, Golden, CO (United States)

    2012-09-30

    The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development in October of 2008. The ultimate goal of this research project is to develop a water resource geospatial infrastructure that serves as “baseline data” for creating solutions for water resource management and for supporting decision making on oil shale resource development. The project came to an end on September 30, 2012. This final project report presents the key findings from the project activity, major accomplishments, and expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase, as well as various other deliverables stored on digital storage media, will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activity include the quantitative spatial and temporal distribution of the water resource throughout the Piceance Basin, water consumption with respect to oil shale production, and data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase, automated data processing scripts (Matlab) for database linking with the surface water and geological models, an ArcGIS model for hydrogeologic data processing for groundwater model input, a 3D geological model, surface water/groundwater models, an energy resource development systems model, as well as a web-based geospatial infrastructure for data exploration, visualization and dissemination. This research will have broad impacts on the development of oil shale resources in the US. The geodatabase provides baseline data for further study of oil shale development and identification of further data collection needs. The 3D geological model provides better understanding through data interpolation and

  9. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    Science.gov (United States)

    Mazzetti, Paolo

    2010-05-01

    In the last decades two main paradigms for resource sharing emerged and reached maturity: the Web and the Grid. Both have proven suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (i.e. data, information, services, etc) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS) most of the handled information is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disaster Management, Environmental Sciences, etc. On the other hand, in several application areas there is the need to run complex models which require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies might be a valuable approach in order to enable advanced ESS applications. Currently both geo-information and Grid technologies have reached a high level of maturity, allowing to build such an

  10. Web-Based Geospatial Visualization of GPM Data with CesiumJS

    Science.gov (United States)

    Lammers, Matt

    2018-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology have made online visualization of large geospatial datasets, such as those coming from precipitation satellites, viable. These data benefit from being visualized on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS (http://cesiumjs.org), developed by Analytical Graphics, Inc., leverages WebGL to do just that. This presentation will describe how CesiumJS has been used in three-dimensional visualization products developed as part of the NASA Precipitation Processing System (PPS) STORM data-order website. Existing methods of interacting with Global Precipitation Measurement (GPM) Mission data primarily focus on two-dimensional static images, whether displaying vertical slices or horizontal surface/height-level maps. These methods limit interactivity with the robust three-dimensional data coming from the GPM core satellite. Integrating the data with CesiumJS in a web-based user interface has allowed us to create the following products. We have linked an on-the-fly visualization tool for any GPM/partner satellite orbit with the data-order interface. A version of this tool also focuses on high-impact weather events. It enables viewing of combined radar and microwave-derived precipitation data on mobile devices and in a way that can be embedded into other websites. We also have used CesiumJS to visualize a method of integrating gridded precipitation data with modeled wind speeds that animates over time. Emphasis in the presentation will be placed on how a variety of technical methods were used to create these tools, and how the flexibility of the CesiumJS framework facilitates creative approaches to interacting with the data.
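
    One common way to hand three-dimensional, time-tagged data to CesiumJS is the CZML packet format (JSON). The Python sketch below writes a minimal CZML document with a single precipitation sample rendered as a point at altitude; it is purely illustrative, uses invented values, and is not the PPS STORM production pipeline.

      import json

      # Minimal CZML: a document packet plus one point at 4 km altitude.
      czml = [
          {"id": "document", "name": "GPM sample (illustrative)", "version": "1.0"},
          {
              "id": "sample-0",
              "name": "Reflectivity sample",
              "description": "35 dBZ at 4 km altitude (invented value)",
              "position": {"cartographicDegrees": [-80.2, 25.8, 4000.0]},  # lon, lat, height (m)
              "point": {"pixelSize": 10, "color": {"rgba": [0, 114, 255, 255]}},
          },
      ]

      with open("gpm_sample.czml", "w") as f:
          json.dump(czml, f, indent=2)
      # The file can then be loaded client-side, e.g. with
      # Cesium.CzmlDataSource.load("gpm_sample.czml").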

  11. Design and Development of a Framework Based on OGC Web Services for the Visualization of Three Dimensional Large-Scale Geospatial Data Over the Web

    Science.gov (United States)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and the development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. Web GISes are traditionally bound to a simplified two-dimensional representation of reality, and though they successfully address the lack of flexibility and simplicity of traditional desktop clients, a lot of effort is still needed to reach desktop GIS features, like 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, therefore allowing traditional WebGIS features to be augmented with a 3D visualization framework. This work could be seen as an extension of the Cityvu project, started in 2008 with the aim of providing a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.

  12. A Smart Web-Based Geospatial Data Discovery System with Oceanographic Data as an Example

    Directory of Open Access Journals (Sweden)

    Yongyao Jiang

    2018-02-01

    Discovering and accessing geospatial data presents a significant challenge for the Earth sciences community as massive amounts of data are being produced on a daily basis. In this article, we report a smart web-based geospatial data discovery system that mines and utilizes data relevancy from metadata and user behavior. Specifically, (1) the system enables semantic query expansion and suggestion to assist users in finding more relevant data; (2) machine-learned ranking is utilized to provide the optimal search ranking based on a number of identified ranking features that can reflect users’ search preferences; (3) a hybrid recommendation module is designed to allow users to discover related data considering metadata attributes and user behavior; (4) an integrated graphic user interface design is developed to quickly and intuitively guide data consumers to the appropriate data resources. As a proof of concept, we focus on a well-defined domain, oceanography, and use oceanographic data discovery as an example. Experiments and a search example show that the proposed system can improve the scientific community’s data search experience by providing query expansion, suggestion, better search ranking, and data recommendation via a user-friendly interface.

  13. Testing OGC Web Feature and Coverage Service performance: Towards efficient delivery of geospatial data

    Directory of Open Access Journals (Sweden)

    Gregory Giuliani

    2013-12-01

    OGC Web Feature Service (WFS) and Web Coverage Service (WCS) specifications allow interoperable access to distributed geospatial data made available through spatial data infrastructures (SDIs). To ensure that a service is sufficiently responsive to fulfill users’ expectations and requirements, the performance of services must be measured and monitored to track latencies, bottlenecks, and errors that may negatively influence their overall quality. Despite the importance of data retrieval and access, little research has been published on this topic, and it mostly concentrates on the usability of services when integrating distributed data sources. Considering these issues, this paper extends and validates the FOSS4G approach to measure the server-side performance of different WFS and WCS services provided by various software implementations, and provides guidance to data providers looking to improve the quality of their services. Our results show that the performance of the tested implementations is generally satisfactory and that memory tuning and data/storage optimization are essential to increase the efficiency and reliability of services.
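
    The kind of server-side measurement the paper describes can be sketched as repeated timed GetFeature requests, as below. The endpoint and feature type name are hypothetical, and a real benchmark would additionally control for caching, concurrency and network variability.

      import time
      import statistics
      import requests

      # Hypothetical WFS endpoint and feature type.
      ENDPOINT = "https://sdi.example.org/wfs"

      params = {
          "service": "WFS",
          "version": "2.0.0",
          "request": "GetFeature",
          "typeNames": "ns:rivers",
          "count": "1000",
      }

      latencies = []
      for _ in range(10):
          t0 = time.perf_counter()
          resp = requests.get(ENDPOINT, params=params, timeout=120)
          resp.raise_for_status()
          latencies.append(time.perf_counter() - t0)

      print(f"median: {statistics.median(latencies):.2f} s, "
            f"worst: {max(latencies):.2f} s, "
            f"last payload: {len(resp.content) / 1e6:.1f} MB")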

  14. ESO Catalogue Facility Design and Performance

    Science.gov (United States)

    Moins, C.; Retzlaff, J.; Arnaboldi, M.; Zampieri, S.; Delmotte, N.; Forchí, V.; Klein Gebbinck, M.; Lockhart, J.; Micol, A.; Vera Sequeiros, I.; Bierwirth, T.; Peron, M.; Romaniello, M.; Suchar, D.

    2013-10-01

    The ESO Phase 3 Catalogue Facility provides investigators with the possibility to ingest catalogues resulting from ESO public surveys and large programs and to query and download their content according to positional and non-positional criteria. It relies on a chain of tools that covers the complete workflow from submission to validation and ingestion into the ESO archive and catalogue repository, and on a web application to browse and query catalogues. This repository consists of two components. One is a Sybase ASE relational database where catalogue meta-data are stored. The second one is a Sybase IQ data warehouse where the content of each catalogue is ingested into a specific table that returns all records matching a user's query. Spatial indexing has been implemented in Sybase IQ to speed up positional queries; it relies on the Spherical Geometry Toolkit from the Johns Hopkins University, which implements the Hierarchical Triangular Mesh (HTM) algorithm. This algorithm is based on a recursive decomposition of the celestial sphere into spherical triangles and the assignment of an index to each of them. It has been complemented with the use of optimized indexes on the non-positional columns that are likely to be frequently used as query constraints. First tests performed on catalogues such as 2MASS have confirmed that this approach provides a very good level of performance and a smooth user experience that are likely to facilitate the scientific exploitation of catalogues.
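
    The HTM decomposition mentioned above can be illustrated with a short, simplified sketch that assigns an index to a sky position by recursively subdividing spherical triangles. This is a didactic Python illustration only; it does not reproduce the canonical HTM base-triangle numbering or the JHU Spherical Geometry Toolkit used by ESO.

      import math

      def unit(ra_deg, dec_deg):
          ra, dec = math.radians(ra_deg), math.radians(dec_deg)
          return (math.cos(dec) * math.cos(ra), math.cos(dec) * math.sin(ra), math.sin(dec))

      def cross(a, b):
          return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

      def dot(a, b):
          return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

      def midpoint(a, b):
          s = (a[0] + b[0], a[1] + b[1], a[2] + b[2])
          n = math.sqrt(dot(s, s))
          return (s[0]/n, s[1]/n, s[2]/n)

      def inside(p, tri, eps=1e-12):
          a, b, c = tri
          return (dot(cross(a, b), p) >= -eps and
                  dot(cross(b, c), p) >= -eps and
                  dot(cross(c, a), p) >= -eps)

      # Eight octahedral base triangles (counterclockwise as seen from outside).
      N, S = (0, 0, 1), (0, 0, -1)
      E = [(1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0)]
      BASE = ([(E[i], E[(i + 1) % 4], N) for i in range(4)] +
              [(E[(i + 1) % 4], E[i], S) for i in range(4)])

      def htm_index(ra_deg, dec_deg, depth=5):
          """Return the index of the depth-level triangle containing (ra, dec)."""
          p = unit(ra_deg, dec_deg)
          idx, tri = next((i, t) for i, t in enumerate(BASE) if inside(p, t))
          for _ in range(depth):
              a, b, c = tri
              w0, w1, w2 = midpoint(b, c), midpoint(a, c), midpoint(a, b)
              children = [(a, w2, w1), (b, w0, w2), (c, w1, w0), (w0, w1, w2)]
              child, tri = next((i, t) for i, t in enumerate(children) if inside(p, t))
              idx = idx * 4 + child
          return idx  # positions in the same triangle share this index at the given depth

      print(htm_index(83.6, 22.0), htm_index(83.7, 22.1))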

  15. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    Science.gov (United States)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application

  16. Leveraging the geospatial advantage

    Science.gov (United States)

    Ben Butler; Andrew Bailey

    2013-01-01

    The Wildland Fire Decision Support System (WFDSS) web-based application leverages geospatial data to inform strategic decisions on wildland fires. A specialized data team, working within the Wildland Fire Management Research Development and Application group (WFM RD&A), assembles authoritative national-level data sets defining values to be protected. The use of...

  17. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    Science.gov (United States)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings, GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead it supports diverse needs ranging from just a feature-rich data management system, to complex scientific tools and workflows.

  18. NCI's Distributed Geospatial Data Server

    Science.gov (United States)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach of batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments in distributed computing using interactive access to significant cloud infrastructure open the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), OpenStreetMap tiles, or raw binary arrays under

  19. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

    The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. There are several important research issues to be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid, as an effective distributed computing paradigm, is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer capable of high-performance computation. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid provides exactly this utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic and the industry communities. Therefore, we propose an integrated framework that incorporates OWS standards into Grids. Upon this framework, distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.

  20. The geospatial data quality REST API for primary biodiversity data.

    Science.gov (United States)

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

    We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under the GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online.
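
    A call to the API can be sketched as below. The query parameter names (decimalLatitude, decimalLongitude, countryCode, scientificName) are assumptions modelled on Darwin Core terms rather than taken from the service documentation, so the exact request format should be checked against the API's own docs.

      import requests

      API = "http://api-geospatial.vertnet-portal.appspot.com/geospatial"

      # Parameter names below are assumptions (Darwin Core-style terms); consult
      # the service documentation for the exact request format.
      record = {
          "decimalLatitude": "42.45",
          "decimalLongitude": "-85.38",
          "countryCode": "US",
          "scientificName": "Puma concolor",
      }

      resp = requests.get(API, params=record, timeout=30)
      resp.raise_for_status()
      print(resp.json())   # JSON report flagging completeness/consistency issues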

  1. Nebhydro: Sharing Geospatial Data to Supportwater Management in Nebraska

    Science.gov (United States)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydro variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies; they are a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux, Python-based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers. The system provides online access, querying, visualization, and analysis of the hydrological data from several sources

  2. The Road to Responsive: University of Toronto Libraries’ Journey to a New Library Catalogue Interface

    Directory of Open Access Journals (Sweden)

    Lisa Gayhart

    2014-01-01

    With the recent surge in the mobile device market and an ever-expanding patron base with increasingly divergent levels of technical ability, the University of Toronto Libraries embarked on the development of a new catalogue discovery layer to fit the needs of its diverse users. The result: a mobile-friendly, flexible and intuitive web application that brings the full power of a faceted library catalogue to users without compromising quality or performance, employing Responsive Web Design principles.

  3. Using WebDewey (Usare WebDewey)

    OpenAIRE

    Baldi, Paolo

    2016-01-01

    This presentation shows how to use the WebDewey tool. Topics covered include: features of WebDewey; Italian WebDewey compared with American WebDewey; querying Italian WebDewey; Italian WebDewey and MARC21; Italian WebDewey and UNIMARC; numbers, captions and "equivalente verbale" (verbal equivalents): Dewey decimal classification in Italian catalogues; Italian WebDewey and Nuovo soggettario; Italian WebDewey and LCSH; Italian WebDewey compared with the printed version of the Italian Dewey Classification (22nd edition): advantages and disadvantages o...

  4. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    Science.gov (United States)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

    Geospatial data resources are the foundation of the construction of a geo portal, which is designed to provide online geoinformation services for government, enterprises and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we are facing is data acquisition; for us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided the construction of the National Geospatial Information Service Platform of China (NGISP) with effective technical support. NGISP is China's official geo portal, providing online geoinformation services based on the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data comes from these nodes, and the different datasets are heterogeneous. According to the results of the analysis of the heterogeneous datasets, the first step is to define the basic principles of data fusion, covering the following aspects: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. Then the technical procedure is researched, and the method used to process different categories of features, such as roads, railways, boundaries, rivers, settlements and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrates the applicability of the principles, procedure and method of multi-source geospatial data integration.

  5. Development of Web-Based Geographic Information System Software (Pengembangan Perangkat Lunak Sistem Informasi Geografis Berbasis Web)

    Directory of Open Access Journals (Sweden)

    Budi Santosa

    2015-04-01

    Geospatial information can nowadays be displayed not only with stand-alone GIS software but also over the Internet, which serves as a medium for distributing geospatial information. Through the internet, people throughout the world can access geospatial information and carry out the geographic information processing they need without being limited by location. Web-based GIS has evolved from web and client-server architectures towards a distributed, unified architecture. Internet technology provides a new form for all the functions of an information system: data collection, data storage, data retrieval, data analysis and data visualization. This paper reviews the latest technology, web-based GIS, with emphasis on the architecture and the stages of web-based GIS software development, from needs analysis through to the maintenance stage. Following the implementation stages of web-based GIS software development with the right process yields the right web-based GIS product.

  6. Architecture of a Process Broker for Interoperable Geospatial Modeling on the Web

    Directory of Open Access Journals (Sweden)

    Lorenzo Bigagli

    2015-04-01

    The identification of appropriate mechanisms for process sharing and reuse by means of composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. Modelers in need of running complex workflows may benefit from outsourcing process composition to a dedicated external service, according to the brokering approach. This work introduces our architecture of a process broker, as a distributed information system for creating, validating, editing, storing, publishing and executing geospatial-modeling workflows. The broker provides a service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of interoperable, executable workflows. The described solution has been experimentally applied in several use scenarios in the context of EU-funded projects and the Global Earth Observation System of Systems.

  7. Review of Web Mapping: Eras, Trends and Directions

    Directory of Open Access Journals (Sweden)

    Bert Veenendaal

    2017-10-01

    Web mapping and the use of geospatial information online have evolved rapidly over the past few decades. Almost everyone in the world uses mapping information, whether or not they realize it. Almost every mobile phone now has location services, and every event and object on the earth has a location. The use of this geospatial location data has expanded rapidly, thanks to the development of the Internet. Huge volumes of geospatial data are available and are being captured online daily, and are used in web applications and maps for viewing, analysis, modeling and simulation. This paper reviews the developments of web mapping from the first static online map images to the current highly interactive, multi-sourced web mapping services that have increasingly been moved to cloud computing platforms. The whole environment of web mapping captures the integration and interaction between three components found online, namely geospatial information, people and functionality. In this paper, the trends and interactions among these components are identified and reviewed in relation to the technology developments. The review then concludes by exploring some of the opportunities and directions.

  8. A CLOUD-BASED PLATFORM SUPPORTING GEOSPATIAL COLLABORATION FOR GIS EDUCATION

    Directory of Open Access Journals (Sweden)

    X. Cheng

    2015-05-01

    GIS-related education needs the support of geo-data and geospatial software. Although there is a large amount of geographic information resources distributed on the web, the discovery, processing and integration of these resources remain unsolved problems. Researchers and teachers have typically searched for geo-data with common search engines, but the results were not satisfying. They have also spent much money and effort on the purchase and maintenance of various kinds of geospatial software. To address these problems, a cloud-based geospatial collaboration platform called GeoSquare was designed and implemented. The platform serves as a geoportal encouraging geospatial data, information, and knowledge sharing through highly interactive and expressive graphic interfaces. Researchers and teachers can solve their problems effectively with this one-stop solution. Functions, specific design and implementation details are presented in this paper. The GeoSquare site is: http://geosquare.tianditu.com/

  9. A Cloud-Based Platform Supporting Geospatial Collaboration for GIS Education

    Science.gov (United States)

    Cheng, X.; Gui, Z.; Hu, K.; Gao, S.; Shen, P.; Wu, H.

    2015-05-01

    GIS-related education needs the support of geo-data and geospatial software. Although there is a large amount of geographic information resources distributed on the web, the discovery, processing and integration of these resources remain unsolved problems. Researchers and teachers have typically searched for geo-data with common search engines, but the results were not satisfying. They have also spent much money and effort on the purchase and maintenance of various kinds of geospatial software. To address these problems, a cloud-based geospatial collaboration platform called GeoSquare was designed and implemented. The platform serves as a geoportal encouraging geospatial data, information, and knowledge sharing through highly interactive and expressive graphic interfaces. Researchers and teachers can solve their problems effectively with this one-stop solution. Functions, specific design and implementation details are presented in this paper. The GeoSquare site is: http://geosquare.tianditu.com/

  10. LSIViewer 2.0 - A Client-Oriented Online Visualization Tool for Geospatial Vector Data

    Science.gov (United States)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualization systems have predominantly been applications that are installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model, in which the server handles the data and the client handles rendering and visualization, has been the most prevalent approach in Web-GIS. While client devices have become functionally more powerful over recent years, the above model has largely ignored this and still follows a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer - a simple, easy-to-use and robust online geospatial data visualisation system for the user's own data that harnesses the client's capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system can support multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render the vector data using LSIViewer is comparable to that of a desktop GIS application, QGIS, on an identical system.

  11. OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats

    Science.gov (United States)

    Erickson, T. A.; Koziol, B. W.; Rood, R. B.

    2011-12-01

    The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF, grib) while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
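
    The clipping idea at the heart of OpenClimateGIS can be sketched in a few lines: mask the cells of a gridded netCDF variable whose centres fall inside a user-supplied polygon. The file name, variable names and polygon below are assumptions; the sketch illustrates the concept rather than the OpenClimateGIS implementation itself.

      import numpy as np
      from netCDF4 import Dataset
      from shapely.geometry import Point, Polygon

      # Hypothetical area of interest (lon/lat) and input file/variable names.
      area_of_interest = Polygon([(-105, 35), (-95, 35), (-95, 45), (-105, 45)])

      with Dataset("tas_example.nc") as nc:
          lats = nc.variables["lat"][:]
          lons = nc.variables["lon"][:]
          tas = nc.variables["tas"][0, :, :]    # first time step

      lon2d, lat2d = np.meshgrid(lons, lats)
      inside = np.array([
          area_of_interest.contains(Point(x, y))
          for x, y in zip(lon2d.ravel(), lat2d.ravel())
      ]).reshape(lon2d.shape)

      clipped = np.where(inside, tas, np.nan)   # keep only cells inside the polygon
      print("mean over the area of interest:", np.nanmean(clipped))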

  12. Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost

    Science.gov (United States)

    Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.

    2016-12-01

    The ESA GlobPermafrost project (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises data product generation for various sites and regions as well as specific infrastructure allowing overview of and access to datasets. Based on an online user survey conducted within the project, the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionalities before accessing them. In response, we are developing the Permafrost Information System PerSys, which is conceptualized as an open-access geospatial data dissemination and visualization portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs, and land surface temperature datasets. The datasets will be published as WebGIS services relying on OGC-standardized Web Map Service (WMS) and Web Feature Service (WFS) technologies for data display and visualization. The WebGIS environment will be hosted at the AWI computing centre, where a geodata infrastructure has been implemented comprising ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an 'Access-Restricted Data Dissemination Service', which will be available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb). The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the

  13. The national atlas as a metaphor for improved use of a national geospatial data infrastructure

    NARCIS (Netherlands)

    Aditya Kurniawan Muhammad, T.

    2007-01-01

    Geospatial data infrastructures have been developed worldwide. Geoportals have been created as an interface to allow users or the community to discover and use geospatial data offered by providers of these initiatives. This study focuses on the development of a web-based national atlas as an alternative

  14. Model My Watershed and BiG CZ Data Portal: Interactive geospatial analysis and hydrological modeling web applications that leverage the Amazon cloud for scientists, resource managers and students

    Science.gov (United States)

    Aufdenkampe, A. K.; Mayorga, E.; Tarboton, D. G.; Sazib, N. S.; Horsburgh, J. S.; Cheetham, R.

    2016-12-01

    The Model My Watershed Web app (http://wikiwatershed.org/model/) was designed to enable citizens, conservation practitioners, municipal decision-makers, educators, and students to interactively select any area of interest anywhere in the continental USA to: (1) analyze real land use and soil data for that area; (2) model stormwater runoff and water-quality outcomes; and (3) compare how different conservation or development scenarios could modify runoff and water quality. The BiG CZ Data Portal is a web application for scientists for intuitive, high-performance map-based discovery, visualization, access and publication of diverse earth and environmental science data via a map-based interface that simultaneously performs geospatial analysis of selected GIS and satellite raster data for a selected area of interest. The two web applications share a common codebase (https://github.com/WikiWatershed and https://github.com/big-cz), high performance geospatial analysis engine (http://geotrellis.io/ and https://github.com/geotrellis) and deployment on the Amazon Web Services (AWS) cloud cyberinfrastructure. Users can use "on-the-fly" rapid watershed delineation over the national elevation model to select their watershed or catchment of interest. The two web applications also share the goal of enabling the scientists, resource managers and students alike to share data, analyses and model results. We will present these functioning web applications and their potential to substantially lower the bar for studying and understanding our water resources. We will also present work in progress, including a prototype system for enabling citizen-scientists to register open-source sensor stations (http://envirodiy.org/mayfly/) to stream data into these systems, so that they can be reshared using Water One Flow web services.

  15. Roma-BZCAT: a multifrequency catalogue of blazars

    Science.gov (United States)

    Massaro, E.; Giommi, P.; Leto, C.; Marchegiani, P.; Maselli, A.; Perri, M.; Piranomonte, S.; Sclavi, S.

    2009-02-01

    We present a new catalogue of blazars based on multifrequency surveys and on an extensive review of the literature. Blazars are classified as BL Lacertae objects, as flat spectrum radio quasars or as blazars of uncertain/transitional type. Each object is identified by a root name, coded as BZB, BZQ and BZU for these three subclasses respectively, and by its coordinates. This catalogue is being built as a tool useful for the identification of the extragalactic sources that will be detected by present and future experiments for X and gamma-ray astronomy, like Swift, AGILE, Fermi-GLAST and Simbol-X. An electronic version is available from the ASI Science Data Center web site at http://www.asdc.asi.it/bzcat.

  16. OpenSearch technology for geospatial resources discovery

    Science.gov (United States)

    Papeschi, Fabrizio; Enrico, Boldrini; Mazzetti, Paolo

    2010-05-01

    set of services for discovery, access, and processing of geospatial resources in a SOA framework. GI-cat is a distributed CSW framework implementation developed by the ESSI Lab of the Italian National Research Council (CNR-IMAA) and the University of Florence. It provides brokering and mediation functionalities towards heterogeneous resources and inventories, exposing several standard interfaces for query distribution. This work focuses on a new GI-cat interface which allows the catalog to be queried according to the OpenSearch syntax specification, thus filling the gap between the SOA architectural design of the CSW and the Web 2.0. At the moment, there is no OGC standard specification on this topic, but an official change request has been proposed in order to enable OGC catalogues to support OpenSearch queries. In this change request, an OpenSearch extension is proposed providing a standard mechanism to query a resource based on temporal and geographic extents. Two new catalog operations are also proposed, in order to publish a suitable OpenSearch interface. This extended interface is implemented by the modular GI-cat architecture by adding a new profiling module called "OpenSearch profiler". Since GI-cat also acts as a clearinghouse catalog, another component called "OpenSearch accessor" is added in order to access OpenSearch compliant services. An important role in the GI-cat extension is played by the adopted mapping strategy. Two different kinds of mappings are required: query mapping and response element mapping. Query mapping is provided in order to fit the simple OpenSearch query syntax to the complex CSW query expressed in the OGC Filter syntax. The GI-cat internal data model is based on the ISO-19115 profile, which is more complex than the simple XML syndication formats, such as RSS 2.0 and Atom 1.0, suggested by OpenSearch. Once response elements are available, in order to be presented, they need to be translated from the GI-cat internal data model to the above
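    A minimal sketch of what an OpenSearch query against such an interface could look like, assuming a hypothetical GI-cat OpenSearch endpoint and the parameter names commonly used by the OpenSearch Geo and Time extensions (searchTerms, bbox, start, end):

        import requests

        # Hypothetical OpenSearch endpoint exposed by a GI-cat "OpenSearch profiler".
        OPENSEARCH_URL = "https://example.org/gi-cat/opensearch"
        params = {
            "searchTerms": "sea surface temperature",
            "bbox": "-10,35,30,60",      # geo extension: west,south,east,north
            "start": "2009-01-01",       # time extension: start of interval
            "end": "2009-12-31",         # time extension: end of interval
            "count": 20,                 # results per page
            "startIndex": 1,
        }
        resp = requests.get(OPENSEARCH_URL, params=params, timeout=60)
        resp.raise_for_status()
        # The catalogue would typically answer with an Atom or RSS feed of results.
        print(resp.headers.get("Content-Type"))
        print(resp.text[:500])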

  17. Mapping a Difference: The Power of Geospatial Visualization

    Science.gov (United States)

    Kolvoord, B.

    2015-12-01

    Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students that want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GST's to hone their spatial thinking skills and to do extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of the use of these technologies on students' spatial thinking skills, and discuss the success and challenges of developing K-12 partnerships centered on geospatial visualization.

  18. GSKY: A scalable distributed geospatial data server on the cloud

    Science.gov (United States)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information coming from different geospatial collections is increasingly in demand by the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, and so such data manipulation must be performed on the fly using efficient, high performance techniques. Ideally this should be performed using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), has over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server, called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve large numbers of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process as
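    The kind of on-the-fly reprojection and resampling that GSKY performs server-side can be sketched in Python with rasterio; this is only an illustration of the operation itself, with placeholder input and output file names and an assumed target projection, not GSKY code.

        import rasterio
        from rasterio.warp import calculate_default_transform, reproject, Resampling

        dst_crs = "EPSG:3857"                       # target projection (assumed)

        with rasterio.open("input.tif") as src:     # placeholder source raster
            transform, width, height = calculate_default_transform(
                src.crs, dst_crs, src.width, src.height, *src.bounds)
            profile = src.meta.copy()
            profile.update(crs=dst_crs, transform=transform,
                           width=width, height=height)

            with rasterio.open("reprojected.tif", "w", **profile) as dst:
                for band in range(1, src.count + 1):
                    reproject(
                        source=rasterio.band(src, band),
                        destination=rasterio.band(dst, band),
                        src_transform=src.transform,
                        src_crs=src.crs,
                        dst_transform=transform,
                        dst_crs=dst_crs,
                        resampling=Resampling.bilinear)   # resampling on the fly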

  19. Recent innovation of geospatial information technology to support disaster risk management and responses

    Science.gov (United States)

    Une, Hiroshi; Nakano, Takayuki

    2018-05-01

    Geographic location is one of the most fundamental and indispensable information elements in the field of disaster response and prevention. For example, in the case of the Tohoku Earthquake in 2011, aerial photos taken immediately after the earthquake greatly improved information sharing among different government offices and facilitated rescue and recovery operations, and maps prepared after the disaster assisted in the rapid reconstruction of affected local communities. Thanks to the recent development of geospatial information technology, this information has become more essential for disaster response activities. Advancements in web mapping technology allow us to better understand the situation by overlaying various location-specific data on base maps on the web and specifying the areas on which activities should be focused. Through 3-D modelling technology, we can gain a more realistic understanding of the relationship between disaster and topography. Geospatial information technology can support proper preparation and emergency responses against disasters by individuals and local communities through hazard mapping and other information services using mobile devices. Thus, geospatial information technology is playing a more vital role at all stages of disaster risk management and responses. In acknowledging geospatial information's vital role in disaster risk reduction, the Sendai Framework for Disaster Risk Reduction 2015-2030, adopted at the Third United Nations World Conference on Disaster Risk Reduction, repeatedly underlines the importance of utilizing geospatial information technology for disaster risk reduction. This presentation aims to report the recent practical applications of geospatial information technology for disaster risk management and responses.

  20. Arab Libraries’ Web-based OPACs: An evaluative study in the light of IFLA’s Guidelines for Online Public Access Catalogue (OPAC) Displays

    Directory of Open Access Journals (Sweden)

    Sherif Kamel Shaheen

    2005-03-01

    Full Text Available The research aims at evaluating Arab libraries' web-based catalogues in the light of the principles and recommendations published in IFLA's Guidelines for OPAC Displays (September 30, 2003, Draft for Worldwide Review). The 38 recommendations were categorized under three main titles, as follows: User Needs (12 recommendations), Content and Arrangement Principle (25 recommendations), and Standardization Principle (1 recommendation). However, that number increased to 88 elements when the recommendations were formulated as evaluative criteria and included in the study's checklist.

  1. Technologies Connotation and Developing Characteristics of Open Geospatial Information Platform

    Directory of Open Access Journals (Sweden)

    GUO Renzhong

    2016-02-01

    Full Text Available Based on the background of developments in surveying, mapping and geoinformation, and aimed at the demands of data fusion, real-time sharing, in-depth processing and personalization, this paper analyzes significant features of geo-spatial services in the digital city, and focuses on the theory, methods and key techniques of an open cloud computing environment, multi-path data updating, full-scale urban geocoding, multi-source spatial data integration, adaptive geo-processing and adaptive web mapping. On this basis, the Open Geospatial Information Platform has been developed and successfully applied in digital Shenzhen.

  2. Enhancing the online discovery of geospatial data through ...

    African Journals Online (AJOL)

    However, geoportals are often known to geoinformation communities only and present technological limitations which make it difficult for general purpose web search engines to discover and index the data catalogued in (or registered with) a geoportal. The mismatch between standard spatial metadata content and the ...

  3. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    Science.gov (United States)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based and designed to work on a single computer, which imposes major limitations in terms of processing power, storage, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first one (VM1) runs on Amazon Web Services (AWS) and the second one (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards and prototype code, and demonstrates a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application serves as a collaborative geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available at all times, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming languages and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on two VMs that communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state

  4. Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

    Science.gov (United States)

    Singh, R.; Bermudez, L. E.

    2013-12-01

    Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years, two main trends have been disrupting geospatial applications: mobile devices and context sharing. People now have more and more mobile devices to support their work and personal lives. Mobile devices are intermittently connected to the internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the Shapefile format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. And it builds upon a database container - SQLite - that is self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout is described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is sharing client 'contexts'. When a user is looking into an article or a product on the web
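    Because a GeoPackage is a set of SQLite tables, its contents can be listed with nothing more than Python's standard sqlite3 module. The sketch below reads the gpkg_contents table, which the GeoPackage specification defines as the index of every feature table and tile matrix set in the container; the file name is a placeholder.

        import sqlite3

        # Placeholder file name; any .gpkg produced by a compliant tool will do.
        conn = sqlite3.connect("example.gpkg")
        rows = conn.execute(
            "SELECT table_name, data_type, srs_id, min_x, min_y, max_x, max_y "
            "FROM gpkg_contents"
        )
        for table_name, data_type, srs_id, *bbox in rows:
            # data_type is 'features' for vector tables and 'tiles' for tile pyramids.
            print(f"{table_name}: {data_type}, srs_id={srs_id}, extent={bbox}")
        conn.close()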

  5. TOWARDS IMPLEMENTATION OF THE FOG COMPUTING CONCEPT INTO THE GEOSPATIAL DATA INFRASTRUCTURES

    Directory of Open Access Journals (Sweden)

    E. A. Panidi

    2016-01-01

    Full Text Available Information technologies, and Global Network technologies in particular, are developing very quickly. Consequently, the problem of incorporating general-purpose technologies into information systems that operate with geospatial data remains relevant. The paper discusses the implementation feasibility of a number of new approaches and concepts that address the problems of publishing and managing spatial data on the Global Network. A brief review describes some contemporary concepts and technologies used for distributed data storage and management, which provide for the combined use of server-side and client-side resources. In particular, the concepts of Cloud Computing, Fog Computing and the Internet of Things are mentioned, together with the Java Web Start, WebRTC and WebTorrent technologies. The author's experience is described briefly, covering a number of projects devoted to the development of portable solutions for publishing geospatial data and GIS software on the Global Network.

  6. Catalogue 2.0 the future of the library catalogue

    CERN Document Server

    Chambers, Sally

    2014-01-01

    Brings together some of the foremost international cataloguing practitioners and thought leaders, including Lorcan Dempsey, Emmanuelle Bermès, Marshall Breeding and Karen Calhoun, to provide an overview of the current state of the art of the library catalogue and look ahead to see what the library catalogue might become.

  7. Where do we go with Union Catalogues?

    Directory of Open Access Journals (Sweden)

    Edmund Chamberlain

    2013-07-01

    Full Text Available The United Kingdom boasts union catalogues for its major research libraries, journal holdings, archives and, most recently, for its public library collections. For researchers wanting to locate material across the UK, such aggregations have long served as a first stop for finding the right material, and they have also provided a showcase for our formidable research collections. In the global networked environment, search engines and social networks can fulfil much of the functionality of union catalogues and have become the natural places to which our users go for search and discovery, even in academic situations. Right now, there is a ‘disconnect’ between the data describing our collections and the places users first turn to start their searches. This can be fixed by exposing descriptive data to wider audiences beyond the silo of the local catalogue, but data publishing is a fast moving area with little obvious short-term institutional-level gain and some start-up barriers. Publishing library data to the open web at the level of a national aggregation would utilize existing skill sets and infrastructure, minimize risk and maximize impact.

  8. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    Science.gov (United States)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, Multicore, etc. seem to offer the necessary functionalities to solve important problems in the Earth Science domain: storing, distribution, management, processing and security of geospatial data, execution of complex processing through task and data parallelism, etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, but also the development of standardized and specialized tools for storing, analyzing, processing and visualizing the geospatial data concerning this area. For achieving these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, geospatial Web services standardized by the Open Geospatial Consortium (OGC) and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyses the integration and execution of geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures based on application characteristics and user requirements through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project in different use cases such as: the execution of geospatial Web services both on Web and Grid infrastructures [2] and the execution of SWAT hydrological models both on Grid and Multicore architectures [3]. The current

  9. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    Science.gov (United States)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on modern web and GIS technologies. The paper describes a Web GIS for complex processing and visualization of geospatial datasets (mainly in NetCDF and PostGIS formats) as an integral part of the dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, which realizes the functionality of interaction with the computational core backend and the WMS/WFS/WPS cartographical services, and implements an open API for browser-based client software. Being secondary, this part provides only a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part comprising the Web GIS client, developed according to the "single page application" approach on the basis of the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to that of popular desktop GIS applications such as uDig and QuantumGIS. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development. According to general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map

  10. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    Science.gov (United States)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on top of the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build the multi-user hosting cloud computing infrastructure for GISpark. Virtualized storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter Notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that deal with other domains related to spatial properties. We
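    A minimal PySpark sketch of the kind of distributed spatiotemporal filtering such a platform builds on; the CSV file and column names are assumptions for illustration only, and the sketch does not use GISpark's own components (SuperMap GIScript etc.).

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("geospatial-sketch").getOrCreate()

        # Hypothetical point dataset with lon, lat and timestamp columns.
        points = spark.read.csv("observations.csv", header=True, inferSchema=True)

        # Distributed spatial filter: keep points inside a bounding box,
        # then count observations per day across the cluster.
        west, south, east, north = 116.2, 39.8, 116.6, 40.1
        daily = (
            points
            .where((F.col("lon") > west) & (F.col("lon") < east) &
                   (F.col("lat") > south) & (F.col("lat") < north))
            .groupBy(F.to_date("timestamp").alias("day"))
            .count()
            .orderBy("day")
        )
        daily.show()
        spark.stop()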

  11. International Atomic Energy Agency publications. Publications catalogue 2004

    International Nuclear Information System (INIS)

    2004-03-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2002, 2003 and forthcoming in early 2004. Most IAEA publications are issued in English, though some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  12. Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

    Science.gov (United States)

    Singh, R.; Percivall, G.

    2009-12-01

    (note: acronym glossary at end of abstract) For scientists to have confidence in the veracity of data sets and computational processes not under their control, operational transparency must be much greater than previously required. Being able to have a universally understood and machine-readable language for describing such things as the completeness of metadata, data provenance and uncertainty, and the discrete computational steps in a complex process takes on increased importance. OGC has been involved with technological issues associated with climate change since 2005 when we, along with the IEEE Committee on Earth Observation, began a close working relationship with GEO and GEOSS (http://earthobservations.org). GEO/GEOSS provide the technology platform to GCOS, which in turn represents the earth observation community to the UNFCCC. OGC and IEEE are the organizers of the GEO/GEOSS Architecture Implementation Pilot (see http://www.ogcnetwork.net/AIpilot). This continuing work involves working closely with GOOS (Global Ocean Observing System) and WMO (World Meteorological Organization). This session reports on the findings of recent work within the OGC's community of software developers and users to apply geospatial web services to the climate studies domain. The value of this work is to evolve OGC web services, moving from data access and query to geo-processing and workflows. Two projects will be described: the GEOSS AIP-2 and the CCIP. AIP is a task of the GEOSS Architecture and Data Committee. During its duration, two GEO Tasks defined the project: AIP-2 began as GEO Task AR-07-02, to lead the incorporation of contributed components consistent with the GEOSS Architecture using a GEO Web Portal and a Clearinghouse search facility to access services through GEOSS Interoperability Arrangements in support of the GEOSS Societal Benefit Areas. AIP-2 concluded as GEO Task AR-09-01b, to develop and pilot new process and infrastructure components for the GEOSS Common

  13. Bibliographic information organization in the semantic web

    CERN Document Server

    Willer, Mirna

    2013-01-01

    New technologies will underpin the future generation of library catalogues. To facilitate their role providing information, serving users, and fulfilling their mission as cultural heritage and memory institutions, libraries must take a technological leap; their standards and services must be transformed to those of the Semantic Web. Bibliographic Information Organization in the Semantic Web explores the technologies that may power future library catalogues, and argues the necessity of such a leap. The text introduces international bibliographic standards and models, and fundamental concepts in

  14. Geospatial Brokering - Challenges and Future Directions

    Science.gov (United States)

    White, C. E.

    2012-12-01

    An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols - e.g., OGC CSW, REST, and OpenSearch - to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight into the future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.
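    Programmatic catalogue access of the kind described above is commonly exercised through OWSLib in Python; the sketch below queries a CSW endpoint for records matching a keyword within a bounding box. The endpoint URL and search terms are placeholders, and the constraint classes come from OWSLib's owslib.fes module.

        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike, BBox

        # Hypothetical CSW endpoint of a geospatial broker/catalogue.
        csw = CatalogueServiceWeb("https://example.org/catalogue/csw")

        keyword = PropertyIsLike("csw:AnyText", "%land cover%")
        area = BBox([-125.0, 25.0, -66.0, 50.0])   # rough conterminous USA extent

        # A nested list combines the two constraints with a logical AND.
        csw.getrecords2(constraints=[[keyword, area]], maxrecords=10, esn="summary")

        for record in csw.records.values():
            print(record.identifier, "-", record.title)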

  15. Teaching Tectonics to Undergraduates with Web GIS

    Science.gov (United States)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curricula can support spatial learning. We present Web-based visualization and analysis tools developed with JavaScript APIs to enhance tectonics curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in the exploration of earthquake and volcano data sets, a subduction and elevation profile tool which facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS is platform independent and can be used on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation, using two- and three-dimensional visualization and analytical software. Coverages which allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and Environmental Science Structural Geology and Tectonics class and are freely available on the Web.

  16. A Catalogue of marine biodiversity indicators

    Directory of Open Access Journals (Sweden)

    Heliana Teixeira

    2016-11-01

    Full Text Available A Catalogue of Marine Biodiversity Indicators was developed with the aim of providing the basis for assessing the environmental status of the marine ecosystems. Useful for the implementation of the Marine Strategy Framework Directive (MSFD), this catalogue allows the navigation of a database of indicators mostly related to biological diversity, non-indigenous species, food webs, and seafloor integrity. Over 600 indicators were compiled, which were developed and used in the framework of different initiatives (e.g. EU policies, research projects) and in national and international contexts (e.g. Regional Seas Conventions, and assessments in non-European seas). The catalogue reflects the current scientific capability to address environmental assessment needs by providing a broad coverage of the most relevant indicators for marine biodiversity and ecosystem integrity. The available indicators are reviewed according to their typology, data requirements, development status, geographical coverage, relevance to habitats or biodiversity components, and related human pressures. Through this comprehensive overview, we discuss the potential of the current set of indicators in a wide range of contexts, from large-scale to local environmental programs, and we also address shortcomings in light of current needs. Developed by the DEVOTES Project, the catalogue is freely available through the DEVOTool software application, which provides browsing and query options for the associated metadata. The tool allows extraction of ranked indicator lists best fulfilling selected criteria, enabling users to search for suitable indicators to address a particular biodiversity component, ecosystem feature, habitat or pressure in a marine area of interest. This tool is useful for EU Member States, Regional Sea Conventions, the European Commission, non-governmental organizations, managers, scientists and any person interested in marine environmental assessment. It allows users to

  17. A Catalogue of Marine Biodiversity Indicators

    KAUST Repository

    Teixeira, Heliana; Berg, Torsten; Uusitalo, Laura; Fürhaupter, Karin; Heiskanen, Anna Stiina; Mazik, Krysia; Lynam, Christopher P.; Neville, Suzanna; Rodriguez, J. German; Papadopoulou, Nadia; Moncheva, Snejana; Churilova, Tanya; Kryvenko, Olga; Krause-Jensen, Dorte; Zaiko, Anastasija; Veríssimo, Helena; Pantazi, Maria; Carvalho, Susana; Patrício, Joana; Uyarra, Maria C.; Borja, Ángel

    2016-01-01

    A Catalogue of Marine Biodiversity Indicators was developed with the aim of providing the basis for assessing the environmental status of the marine ecosystems. Useful for the implementation of the Marine Strategy Framework Directive (MSFD), this catalogue allows the navigation of a database of indicators mostly related to biological diversity, non-indigenous species, food webs, and seafloor integrity. Over 600 indicators were compiled, which were developed and used in the framework of different initiatives (e.g., EU policies, research projects) and in national and international contexts (e.g., Regional Seas Conventions, and assessments in non-European seas). The catalogue reflects the current scientific capability to address environmental assessment needs by providing a broad coverage of the most relevant indicators for marine biodiversity and ecosystem integrity. The available indicators are reviewed according to their typology, data requirements, development status, geographical coverage, relevance to habitats or biodiversity components, and related human pressures. Through this comprehensive overview, we discuss the potential of the current set of indicators in a wide range of contexts, from large-scale to local environmental programs, and we also address shortcomings in light of current needs. Developed by the DEVOTES Project, the catalogue is freely available through the DEVOTool software application, which provides browsing and query options for the associated metadata. The tool allows extraction of ranked indicator lists best fulfilling selected criteria, enabling users to search for suitable indicators to address a particular biodiversity component, ecosystem feature, habitat, or pressure in a marine area of interest. This tool is useful for EU Member States, Regional Sea Conventions, the European Commission, non-governmental organizations, managers, scientists, and any person interested in marine environmental assessment. It allows users to build

  18. A Catalogue of Marine Biodiversity Indicators

    KAUST Repository

    Teixeira, Heliana

    2016-11-04

    A Catalogue of Marine Biodiversity Indicators was developed with the aim of providing the basis for assessing the environmental status of the marine ecosystems. Useful for the implementation of the Marine Strategy Framework Directive (MSFD), this catalogue allows the navigation of a database of indicators mostly related to biological diversity, non-indigenous species, food webs, and seafloor integrity. Over 600 indicators were compiled, which were developed and used in the framework of different initiatives (e.g., EU policies, research projects) and in national and international contexts (e.g., Regional Seas Conventions, and assessments in non-European seas). The catalogue reflects the current scientific capability to address environmental assessment needs by providing a broad coverage of the most relevant indicators for marine biodiversity and ecosystem integrity. The available indicators are reviewed according to their typology, data requirements, development status, geographical coverage, relevance to habitats or biodiversity components, and related human pressures. Through this comprehensive overview, we discuss the potential of the current set of indicators in a wide range of contexts, from large-scale to local environmental programs, and we also address shortcomings in light of current needs. Developed by the DEVOTES Project, the catalogue is freely available through the DEVOTool software application, which provides browsing and query options for the associated metadata. The tool allows extraction of ranked indicator lists best fulfilling selected criteria, enabling users to search for suitable indicators to address a particular biodiversity component, ecosystem feature, habitat, or pressure in a marine area of interest. This tool is useful for EU Member States, Regional Sea Conventions, the European Commission, non-governmental organizations, managers, scientists, and any person interested in marine environmental assessment. It allows users to build

  19. DESIGN FOR CONNECTING SPATIAL DATA INFRASTRUCTURES WITH SENSOR WEB (SENSDI)

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2016-06-01

    Full Text Available Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is about research to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open source SDI, match the APIs and functions between the Sensor Web and the SDI, and carry out case studies such as hazard applications, urban applications, etc. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS and metadata on the one hand, and the 'Sensor Observation Service' (SOS), the 'Sensor Planning Service' (SPS), the 'Sensor Alert Service' (SAS), and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS), on the other. Hence, in conclusion, it is of importance to geospatial studies to integrate SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
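    To show how the SOS interface mentioned above is exercised in practice, the sketch below sends a standard SOS 2.0 GetObservation request over plain HTTP; the service URL, offering identifier and observed property are hypothetical placeholders.

        import requests

        # Hypothetical Sensor Observation Service endpoint.
        SOS_URL = "https://example.org/sos/service"
        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": "urban_air_quality",                       # assumed offering id
            "observedProperty": "http://example.org/phenomenon/PM10",
            "temporalFilter": "om:phenomenonTime,2016-06-01/2016-06-02",
            "responseFormat": "http://www.opengis.net/om/2.0",
        }
        resp = requests.get(SOS_URL, params=params, timeout=60)
        resp.raise_for_status()
        print(resp.text[:400])   # an Observations & Measurements (O&M) XML document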

  20. High performance geospatial and climate data visualization using GeoJS

    Science.gov (United States)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some libraries are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations: to create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics; to develop an extensible library that can combine data from multiple sources and render them using multiple backends; and to build a library that works well with existing scientific visualization tools such as VTK. We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring

  1. SENSOR WEB SERVICES FOR EARLY FLOOD WARNINGS BASED ON SOIL MOISTURE PROFILES

    OpenAIRE

    T. Brinkhoff; S. Jansen

    2012-01-01

    As a result of improved computing and communication capabilities, the use of sensors and sensor networks for environmental monitoring has gained considerable importance in recent years. For an interoperable integration of sensor data such as sensor descriptions, sensor measurements and alarm events, the Open Geospatial Consortium (OGC) started the Sensor Web Enablement (SWE) initiative and proposed several specifications with respect to a geospatial sensor web. First implementations of ...

  2. Importance of the spatial data and the sensor web in the ubiquitous computing area

    Science.gov (United States)

    Akçit, Nuhcan; Tomur, Emrah; Karslıoǧlu, Mahmut O.

    2014-08-01

    Spatial data have become a critical issue in recent years. More than three quarters of databases are related, directly or indirectly, to locations referring to physical features, and these constitute the relevant aspects. Spatial data are necessary to identify or calculate the relationships between spatial objects when using spatial operators in programs or portals. Originally, calculations were conducted using Geographic Information System (GIS) programs on local computers. Subsequently, through the Internet, they formed a geospatial web, which is integrated into a discoverable collection of geographically related web standards and key features, and constitutes a global network of geospatial data that employs the World Wide Web to process textual data. In addition, the geospatial web is used to bring together spatial data producers, resources, and users. Standards also constitute a critical dimension in further globalizing the idea of the geospatial web. The sensor web is an example of the real-time service that the geospatial web can provide. Sensors around the world collect numerous types of data. The sensor web is a type of sensor network that is used for visualizing, calculating, and analyzing collected sensor data. Today, people use smart devices and systems more frequently because of the evolution of technology, and often have more than one mobile device. The considerable number of sensors and the different types of data collected around the world have driven the production of interoperable and platform-independent sensor web portals. The focus of such production has been on further developing the idea of an interoperable and interdependent sensor web of all devices that share and collect information. The other pivotal idea consists of encouraging people to use and send data voluntarily for numerous purposes with some level of credibility. The principal goal is to connect mobile and non-mobile devices in the sensor web platform together to

  3. Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals

    Science.gov (United States)

    Zamyadi, A.; Pouliot, J.; Bédard, Y.

    2013-09-01

    Accessing 3D geospatial models, ideally at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information is very different from one geo-portal to another, as well as for similar 3D resources within the same geo-portal. The inventory considered 971 data resources related to elevation. 51% of them were from three geo-portals running at the Canadian federal and municipal levels whose metadata resources did not consider 3D models by any definition. Regarding the remaining 49%, which refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve 3D geospatial model discovery in geo-portals by adding a specific metadata-set. Based on the knowledge and current practices in 3D modeling, and in 3D data acquisition and management, a set of metadata is proposed to increase its suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes. These classes are grouped into three packages: General and Complementary, covering contextual and structural information, and Availability, covering the transition from storage to delivery format. The proposed metadata set is compared with Canadian Geospatial

  4. Enhancing discovery in spatial data infrastructures using a search engine

    Directory of Open Access Journals (Sweden)

    Paolo Corti

    2018-05-01

    Full Text Available A spatial data infrastructure (SDI) is a framework of geospatial data, metadata, users and tools intended to provide an efficient and flexible way to use spatial information. One of the key software components of an SDI is the catalogue service which is needed to discover, query and manage the metadata. Catalogue services in an SDI are typically based on the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) standard which defines common interfaces for accessing the metadata information. A search engine is a software system capable of supporting fast and reliable search, which may use ‘any means necessary’ to get users to the resources they need quickly and efficiently. These techniques may include full text search, natural language processing, weighted results, fuzzy tolerance results, faceting, hit highlighting, recommendations and many others. In this paper we present an example of a search engine being added to an SDI to improve search against large collections of geospatial datasets. The Centre for Geographic Analysis (CGA) at Harvard University re-engineered the search component of its public domain SDI (Harvard WorldMap), which is based on the GeoNode platform. A search engine was added to the SDI stack to enhance the CSW catalogue discovery abilities. It is now possible to discover spatial datasets from metadata by using the standard search operations of the catalogue and to take advantage of the new abilities of the search engine, to return relevant and reliable content to SDI users.

  5. Real-time GIS data model and sensor web service platform for environmental data management.

    Science.gov (United States)

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is meaningful for human health. In the past, environmental data management involved developing a specific environmental data management system, but this approach often lacked real-time data retrieval and sharing/interoperation capabilities. With the development of information technology, a Geospatial Service Web method is proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. A real-time GIS (Geographic Information System) data model and a Sensor Web service platform to realize environmental data management under the Geospatial Service Web framework are proposed in this study. The real-time GIS data model manages real-time data. The Sensor Web service platform is applied to support the realization of the real-time GIS data model based on Sensor Web technologies. To support the realization of the proposed real-time GIS data model, a Sensor Web service platform is implemented. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases of real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform are realized and demonstrated. The total processing times in the two experiments are 3.7 s and 9.2 s, respectively. The experimental results show that the method integrating the real-time GIS data model and the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.

  6. SDI-based business processes: A territorial analysis web information system in Spain

    Science.gov (United States)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services into their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens' knowledge about their territory.
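    A sketch of how a developer might discover the geospatial operations exposed through such a WPS from Python, using OWSLib; the endpoint URL and the process identifier are placeholders rather than the actual IDEE service details.

        from owslib.wps import WebProcessingService

        # Hypothetical WPS endpoint of a territorial analysis system.
        wps = WebProcessingService("https://example.org/wps", verbose=False)

        # List the processes advertised in the capabilities document.
        print(wps.identification.title)
        for process in wps.processes:
            print(process.identifier, "-", process.title)

        # Inspect the inputs of one (assumed) process identifier.
        description = wps.describeprocess("landcover:AreaStatistics")
        for wps_input in description.dataInputs:
            print("input:", wps_input.identifier, wps_input.dataType)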

  7. National Geospatial Program

    Science.gov (United States)

    Carswell, William J.

    2011-01-01

    The National Geospatial Program (NGP; http://www.usgs.gov/ngpo/) satisfies the needs of customers by providing geospatial products and services that customers incorporate into their decisionmaking and operational activities. These products and services provide geospatial data that are organized and maintained in cost-effective ways and developed by working with partners and organizations whose activities align with those of the program. To accomplish its mission, the NGP organizes, maintains, publishes, and disseminates the geospatial baseline of the Nation's topography, natural landscape, and manmade environment through The National Map

  8. Web Map Services (WMS) Global Mosaic

    Science.gov (United States)

    Percivall, George; Plesea, Lucian

    2003-01-01

    The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a mosaic of Landsat 7 scenes, geographically accurate with 30- and 15-meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in their geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.
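    Requesting a piece of such a mosaic through the standard WMS interface amounts to a single GetMap call, sketched below with a placeholder endpoint and layer name rather than the original JPL service details.

        import requests

        # Placeholder WMS endpoint and layer name standing in for the Global Mosaic.
        WMS_URL = "https://example.org/wms"
        params = {
            "service": "WMS",
            "version": "1.1.1",
            "request": "GetMap",
            "layers": "global_mosaic",          # assumed layer name
            "styles": "",
            "srs": "EPSG:4326",
            "bbox": "-10,35,10,55",             # minx,miny,maxx,maxy in lon/lat
            "width": 1024,
            "height": 1024,
            "format": "image/jpeg",
        }
        resp = requests.get(WMS_URL, params=params, timeout=60)
        resp.raise_for_status()
        with open("mosaic_subset.jpg", "wb") as f:
            f.write(resp.content)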

  9. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (sensdi)

    Science.gov (United States)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop an automated integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after evaluating events in real time. The research work formalizes a notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geo-spatial data on a widely used platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructure to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms utilizing sensor web and spatial information achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on GeoNode, QGIS and Java, which bind most of the functionalities of the Internet, the sensor web and, nowadays, the Internet of Things, superseding the Internet of Sensors as well. In a nutshell, the project delivers a generalized real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  10. SMART CITIES INTELLIGENCE SYSTEM (SMACiSYS) INTEGRATING SENSOR WEB WITH SPATIAL DATA INFRASTRUCTURES (SENSDI)

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2017-09-01

    Full Text Available The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI), utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system that categorizes events and issues information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after event evaluation in real time. The research formalizes the notion of an integrated, independent, generalized, and automated geo-event analysing system that makes use of geospatial data on widely used platforms. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of the spatial data infrastructure to utilize the sensor web dynamically and in real time for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart city platforms with sensor web and spatial information by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on Geonode, QGIS and Java, which bind most of the functionalities of the Internet, the sensor web and, nowadays, the Internet of Things, which is superseding the Internet of Sensors. In a nutshell, the project delivers a generalized, real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  11. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    Science.gov (United States)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

    Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets, has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level of detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be
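
    The server-side generator described above is not publicly scripted here; the following toy sketch, assuming a hypothetical tile-serving URL, shows the kind of KML a server could emit: a NetworkLink whose Region/Lod settings make a KML-compliant client (e.g., Google Earth) request finer-detail content only when the user zooms into the corresponding area.

        # Sketch: emit KML that loads finer-detail content only when the viewer zooms
        # into a region (Region/Lod plus NetworkLink). The child URL is a hypothetical
        # placeholder for a server that generates the next level of detail.
        def lod_network_link(name, west, south, east, north, child_url):
            return f"""<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <Document>
            <NetworkLink>
              <name>{name}</name>
              <Region>
                <LatLonAltBox>
                  <north>{north}</north><south>{south}</south>
                  <east>{east}</east><west>{west}</west>
                </LatLonAltBox>
                <Lod>
                  <minLodPixels>256</minLodPixels>
                  <maxLodPixels>-1</maxLodPixels>
                </Lod>
              </Region>
              <Link>
                <href>{child_url}</href>
                <viewRefreshMode>onRegion</viewRefreshMode>
              </Link>
            </NetworkLink>
          </Document>
        </kml>"""

        print(lod_network_link("tile_0_0", -105.0, 39.0, -104.0, 40.0,
                               "https://example.org/kml?level=1&tile=0_0"))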

  12. Users and Union Catalogues

    Science.gov (United States)

    Hartley, R. J.; Booth, Helen

    2006-01-01

    Union catalogues have had an important place in libraries for many years. Their use has been little investigated. Recent interest in the relative merits of physical and virtual union catalogues and a recent collaborative project between a physical and several virtual union catalogues in the United Kingdom led to the opportunity to study how users…

  13. Exploring U.S Cropland - A Web Service based Cropland Data Layer Visualization, Dissemination and Querying System (Invited)

    Science.gov (United States)

    Yang, Z.; Han, W.; di, L.

    2010-12-01

    The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, which is a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production & transportation planning, environmental health research and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery or online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (external users) or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization & browsing, no geospatial query capability, nor online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service-based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a Web-service-based service-oriented architecture, adopts open standard geospatial information science technology and OGC specifications and standards, and re-uses functions/algorithms from GeoBrain Technology (developed by George Mason University). This system provides capabilities of online geospatial crop information access, query and online analytics via interactive maps. It disseminates all data to decision makers and users via real-time retrieval, processing and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications. This web-service-based system greatly improves equal-accessibility, interoperability, usability

  14. Geospatial Services Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: To process, store, and disseminate geospatial data to the Department of Defense and other Federal agencies. DESCRIPTION: The Geospatial Services Laboratory...

  15. Geo-communication and web-based geospatial infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2005-01-01

    The introduction of web-services as index-portals based on geoinformation has changed the conditions for both content and form of geocommunication. A high number of players and interactions (as well as a very high number of all kinds of information and combinations of these) characterize web-services, where maps are only a part of the whole. These new conditions demand new ways of modelling the processes leading to geo-communication. One new aspect is the fact that the service providers have become a part of the geo-communication process with influence on the content. Another aspect

  16. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    Science.gov (United States)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-03-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.
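
    The paper's implementation is not reproduced here; as a rough sketch of the kind of back-end query such a tabletop client might issue when a tangible object defines the visible map extent, the snippet below selects intersecting features from a PostGIS table. The table name, column names and connection settings are hypothetical placeholders.

        # Sketch: fetch features from a PostGIS table that intersect the map extent
        # currently shown on the tabletop. Table, column and connection settings are
        # hypothetical placeholders.
        import psycopg2

        conn = psycopg2.connect(host="localhost", dbname="urban_planning",
                                user="gis", password="secret")
        cur = conn.cursor()

        # Extent in WGS84 (lon/lat): west, south, east, north.
        extent = (6.08, 49.58, 6.18, 49.65)
        cur.execute(
            """
            SELECT id, ST_AsGeoJSON(geom)
            FROM land_parcels
            WHERE ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326));
            """,
            extent,
        )
        for parcel_id, geojson in cur.fetchall():
            print(parcel_id, geojson[:80])
        cur.close()
        conn.close()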

  17. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    Science.gov (United States)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-04-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.

  18. The IKEA Catalogue

    DEFF Research Database (Denmark)

    Brown, Barry; Bleecker, Julian; D'Adamo, Marco

    2016-01-01

    This paper is an introduction to the "Future IKEA Catalogue", enclosed here as an example of a design fiction produced from a long-standing industrial-academic collaboration. We introduce the catalogue here by discussing some of our experiences using design fiction with companies and public sector...

  19. Cataloguing outside the box a practical guide to cataloguing special collections materials

    CERN Document Server

    Falk, Patricia

    2010-01-01

    A practical guide to cataloguing and processing the unique special collections formats in the Browne Popular Culture Library (BPCL) and the Music Library and Sound Recordings Archives (MLSRA) at Bowling Green State University (BGSU) (e.g. fanzines, popular sound recordings, comic books, motion picture scripts and press kits, popular fiction). Cataloguing Outside the Box provides guidance to professionals in library and information science facing the same cataloguing challenges. Additionally, name authority work for these collections is addressed. The book provides practical guidelines and solutions for

  20. 3D geospatial visualizations: Animation and motion effects on spatial objects

    Science.gov (United States)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

    Digital Elevation Models (DEMs), in combination with high quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an amazing navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (Javascript) makes it also possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). Going one step further, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be attributed to 3D models. However, major GIS-based functionalities in combination with all the above mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) as well as motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this, we developed and made available to the research community an open geospatial software application prototype that provides high level capabilities for dynamically creating user defined virtual geospatial worlds populated by selected animated and moving 3D models on user specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations, with the geospatial aspect of a virtual world.
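
    The prototype itself is not shown here; as a minimal illustration of attaching an open 3D representation model to a georeferenced location, the sketch below writes a KML Placemark whose Model element references a Collada (.dae) file. The coordinates and the model URL are hypothetical placeholders.

        # Sketch: a KML Placemark that anchors a Collada (.dae) model at a
        # georeferenced position. Coordinates and the model URL are placeholders.
        KML_MODEL = """<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <Placemark>
            <name>animated-fish</name>
            <Model>
              <altitudeMode>relativeToGround</altitudeMode>
              <Location>
                <longitude>23.72</longitude>
                <latitude>37.97</latitude>
                <altitude>10</altitude>
              </Location>
              <Orientation><heading>45</heading><tilt>0</tilt><roll>0</roll></Orientation>
              <Scale><x>2</x><y>2</y><z>2</z></Scale>
              <Link><href>https://example.org/models/fish.dae</href></Link>
            </Model>
          </Placemark>
        </kml>"""

        with open("fish_model.kml", "w", encoding="utf-8") as f:
            f.write(KML_MODEL)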

  1. On Hydronymic Catalogues Composition Principles: Cataloguing of Hydronyms of the Msta River Basin

    OpenAIRE

    Valery L. Vasilyev; Nina N. Vikhrova

    2015-01-01

    The article presents a brief review of the few Russian hydronymic catalogues (relating to the basins of the Don, Oka, Svir and other rivers) based on the hydrographic principle. The authors argue that, in comparison with alphabetized hydronymic dictionaries, hydronymic catalogues have some obvious advantages for onomastic lexicography. Such catalogues should include, firstly, all historically attested forms of a hydronym (including those considered to be occasional miswritings) and, s...

  2. Semantic Sensor Web Enablement for COAST, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Sensor Web Enablement (SWE) is an Open Geospatial Consortium (OGC) standard Service Oriented Architecture (SOA) that facilitates discovery and integration of...

  3. CERN Technical Training: Autumn 2007 Course Catalogue

    CERN Multimedia

    2007-01-01

    The following course sessions are scheduled in the framework of the CERN Technical Training Program 2007. You may find the full updated Technical Training course programme in our web-catalogue.
    OFFICE SOFTWARE
    CERN EDMS MTF en pratique (F, 4.9, 1/2 d)
    WORD 2007 (Short Course III) - How to work with long documents (E/F, 14.9, 1/2 d)
    FrontPage 2003 - niveau 1 (E/F, 17-18.9, 2 d)
    WORD 2007 - Niveau 1: ECDL (F, 20-21.9, 2 d)
    ACCESS 2007 - Level 1: ECDL (E, 20-21.9, 2 d)
    EXCEL 2007 (Short Course I) - How to work with Formulae (E/F, 21.9, 1/2 d)
    CERN EDMS Introduction (E, 24.9, 1 d)
    Java 2 Enterprise Edition - Part 1: Web Applications (E, 24-25.9, 2 d)
    Outlook 2007 (Short Course II) - Calendar, Tasks and Notes (E/F, 28.9, 1/2 d)
    Outlook 2007 (Short Course III) - Meeting and Delegation ...

  4. On Hydronymic Catalogues Composition Principles: Cataloguing of Hydronyms of the Msta River Basin

    Directory of Open Access Journals (Sweden)

    Valery L. Vasilyev

    2015-06-01

    Full Text Available The article presents a brief review of the few Russian hydronymic catalogues (relating to the basins of the Don, Oka, Svir and other rivers) based on the hydrographic principle. The authors argue that, in comparison with alphabetized hydronymic dictionaries, hydronymic catalogues have some obvious advantages for onomastic lexicography. Such catalogues should include, firstly, all historically attested forms of a hydronym (including those considered to be occasional miswritings) and, secondly, all non-hydronymic names forming part of the respective hydronymic microsystem and providing “external” (i.e., chronological, derivational, etymological, ethno-historical) information about the hydronym. The authors point out that the cataloguing of hydronyms based on the hydrographic principle entails some difficulties: the impossibility of localizing some bodies of water mentioned in ancient and medieval documents; differences in the indication of the same bodies of water on old and contemporary maps; historical differences in establishing hydrographic hierarchies; historical changes of lake-river systems, etc. The authors also share their experience in creating a hydronymic catalogue of the Msta River basin in the Novgorod and Tver Regions of Russia. They describe the principles of the composition of the catalogue and present a short excerpt of it that orders names in the system of the Volma River, one of the Msta’s left tributaries.

  5. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    Science.gov (United States)

    Kulo, Violet; Bodzin, Alec

    2013-02-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.

  6. The Euro-Mediterranean Tsunami Catalogue

    Directory of Open Access Journals (Sweden)

    Alessandra Maramai

    2014-08-01

    Full Text Available A unified catalogue containing 290 tsunamis generated in the European and Mediterranean seas from 6150 B.C. to the present day is presented. It is the result of a systematic and detailed review of all the regional catalogues available in the literature covering the study area, each of them having its own format and level of accuracy. The realization of a single catalogue covering such a wide area and involving several countries was a complex task that posed a series of challenges, the standardization and the quality of the data being the most demanding. A “reliability” value was used to rate the quality of the data for each event on a uniform basis, and this parameter was assigned based on the trustworthiness of the information related to the generating cause, the accuracy of the tsunami description and also on the availability of coeval bibliographical sources. Following these criteria we included in the catalogue events whose reliability ranges from 0 (“very improbable tsunami”) to 4 (“definite tsunami”). About 900 documentary sources, including historical documents, books, scientific reports, newspapers and previous catalogues, support the tsunami data and descriptions gathered in this catalogue. As a result, in the present paper a list of the 290 tsunamis with their main parameters is reported. The online version of the catalogue, available at http://roma2.rm.ingv.it/en/facilities/data_bases/52/catalogue_of_the_euro-mediterranean_tsunamis, provides additional information such as detailed descriptions, pictures, etc., and the complete list of bibliographical sources. Most of the included events have a high reliability value (3 = “probable” and 4 = “definite”), which makes the Euro-Mediterranean Tsunami Catalogue an essential tool for the implementation of tsunami hazard and risk assessment.

  7. Python geospatial development

    CERN Document Server

    Westra, Erik

    2013-01-01

    This is a tutorial-style book that teaches the usage of Python tools for GIS using simple practical examples and then shows you how to build a complete mapping application from scratch. The book assumes basic knowledge of Python; no knowledge of Open Source GIS is required. It is aimed at experienced Python developers who want to learn about geospatial concepts, work with geospatial data, solve spatial problems, and build map-based applications. This book will be useful to those who want to get up to speed with Open Source GIS in order to build GIS applications or integrate GeoSpatial features into their existing ap

  8. Geospatial Authentication

    Science.gov (United States)

    Lyle, Stacey D.

    2009-01-01

    A software package has been designed to allow authentication by determining whether the rover(s) is/are within a set of boundaries or a specific area before granting access to critical geospatial information, using GPS signal structures as a means to authenticate mobile devices into a network wirelessly and in real time. The advantage lies in the fact that the system only admits those within the designated geospatial boundaries or areas into the server.
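
    The software package itself is not available for inspection here; a minimal sketch of the core geospatial check, namely whether a GPS-reported position falls inside a designated boundary, can be written with the Shapely library. The boundary coordinates below are hypothetical placeholders.

        # Sketch: grant access only if the device's GPS position lies inside the
        # designated boundary polygon. Coordinates are hypothetical placeholders.
        from shapely.geometry import Point, Polygon

        # Authorized area as a lon/lat polygon (e.g., a campus boundary).
        authorized_area = Polygon([
            (-97.33, 27.71), (-97.31, 27.71), (-97.31, 27.73), (-97.33, 27.73),
        ])

        def is_authorized(lon: float, lat: float) -> bool:
            """Return True if the reported position is inside the authorized area."""
            return authorized_area.contains(Point(lon, lat))

        print(is_authorized(-97.32, 27.72))   # True  -> allow access
        print(is_authorized(-97.40, 27.80))   # False -> deny access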

  9. The Semantics of Web Services: An Examination in GIScience Applications

    Directory of Open Access Journals (Sweden)

    Xuan Shi

    2013-09-01

    Full Text Available Web service is a technological solution for software interoperability that supports the seamless integration of diverse applications. In the vision of web service architecture, web services are described by the Web Service Description Language (WSDL), discovered through Universal Description, Discovery and Integration (UDDI) and communicate by the Simple Object Access Protocol (SOAP). Such a vision has never been fully accomplished. Although WSDL was criticized for providing only a syntactic, and not a semantic, definition of web services, prior initiatives in semantic web services did not establish a correct methodology to resolve the problem. This paper examines the distinction and relationship between the syntactic and semantic definitions for web services that characterize different purposes in service computation. Further, this paper proposes that the semantics of web services are neutral and independent of the service interface definition, data types and platform. Such a conclusion can be a universal law in software engineering and service computing. Several use cases in GIScience applications are examined in this paper, while the formalization of geospatial services needs to be constructed by the GIScience community towards a comprehensive ontology of the conceptual definitions and relationships for geospatial computation. Advancements in semantic web services research will happen in domain science applications.

  10. Sustainable energy catalogue - for European decision-makers. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Gram, S.; Jacobsen, Soeren

    2006-10-15

    is a list of contacts, pamphlets, web pages etc., where it is possible to find more information about the individual technologies. Each chapter offers an overview of the single technology concerning development stage, best available technology, supply potential, environmental impact etc., a comparison between the different technologies, and information about how the technologies can interact with each other and with the energy system. Furthermore, a timeline extending towards (further) commercialisation is drawn for each technology. This timeline includes significant events with regard to research and development, the market and policies that are expected to be of importance to the success of the technology. The text on each technology is written on a background of expert knowledge supplied by a selection of experts on each technology presented in the catalogue. The texts are furthermore reviewed and evaluated by an external expert. A review is placed in connection with the individual technology. (au)

  11. A Geospatial Cyberinfrastructure for Urban Economic Analysis and Spatial Decision-Making

    Directory of Open Access Journals (Sweden)

    Michael F. Goodchild

    2013-05-01

    Full Text Available Urban economic modeling and effective spatial planning are critical tools towards achieving urban sustainability. However, in practice, many technical obstacles, such as information islands, poor documentation of data and lack of software platforms to facilitate virtual collaboration, are challenging the effectiveness of decision-making processes. In this paper, we report on our efforts to design and develop a geospatial cyberinfrastructure (GCI) for urban economic analysis and simulation. This GCI provides an operational graphic user interface, built upon a service-oriented architecture to allow (1) widespread sharing and seamless integration of distributed geospatial data; (2) an effective way to address the uncertainty and positional errors encountered in fusing data from diverse sources; (3) the decomposition of complex planning questions into atomic spatial analysis tasks and the generation of a web service chain to tackle such complex problems; and (4) capturing and representing provenance of geospatial data to trace its flow in the modeling task. The Greater Los Angeles Region serves as the test bed. We expect this work to contribute to effective spatial policy analysis and decision-making through the adoption of advanced GCI and to broaden the application coverage of GCI to include urban economic simulations.

  12. The two union catalogues of Myanmar

    Energy Technology Data Exchange (ETDEWEB)

    Hla, Win [Myanmar Scientific and Technological Research Dept., Yangon (Myanmar)]

    1995-04-01

    The article describes the two union catalogues of Myanmar. The first one is the "Consolidated Catalogue of journals and the periodicals contained in the libraries of Kasuali, Calcutta, Bombay, Madras, Coonoor, Rangoon and Shillong". This was published by the Indian Research Fund Association of Calcutta in 1933. This is the first union catalogue of medical periodicals for both Myanmar and India. The second one is "the Regional Union Catalogue of Scientific Serials: Yangon". This was published in 1977, with a second printing in 1989. This union catalogue excludes medical serials. Twenty libraries took part in the compilation and publishing of the union catalogue, with the Technical Information Centre of the Myanmar Scientific and Technological Research Department (formerly the Central Research Organization), No. 6, Kaba Aye Pagoda Road, Yankin P.O., Yangon, Myanmar, taking the leading role.

  13. The two union catalogues of Myanmar

    International Nuclear Information System (INIS)

    Hla, Win

    1995-01-01

    The article describes the two union catalogues of Myanmar. The first one is the "Consolidated Catalogue of journals and the periodicals contained in the libraries of Kasuali, Calcutta, Bombay, Madras, Coonoor, Rangoon and Shillong". This was published by the Indian Research Fund Association of Calcutta in 1933. This is the first union catalogue of medical periodicals for both Myanmar and India. The second one is "the Regional Union Catalogue of Scientific Serials: Yangon". This was published in 1977, with a second printing in 1989. This union catalogue excludes medical serials. Twenty libraries took part in the compilation and publishing of the union catalogue, with the Technical Information Centre of the Myanmar Scientific and Technological Research Department (formerly the Central Research Organization), No. 6, Kaba Aye Pagoda Road, Yankin P.O., Yangon, Myanmar, taking the leading role.

  14. Building an Elastic Parallel OGC Web Processing Service on a Cloud-Based Cluster: A Case Study of Remote Sensing Data Processing Service

    Directory of Open Access Journals (Sweden)

    Xicheng Tan

    2015-10-01

    Full Text Available Since the Open Geospatial Consortium (OGC) proposed the geospatial Web Processing Service (WPS), standard OGC Web Service (OWS)-based geospatial processing has become the major type of distributed geospatial application. However, improving the performance and sustainability of the distributed geospatial applications has become the dominant challenge for OWSs. This paper presents the construction of an elastic parallel OGC WPS service on a cloud-based cluster and the designs of a high-performance, cloud-based WPS service architecture, the scalability scheme of the cloud, and the algorithm of the elastic parallel geoprocessing. Experiments of the remote sensing data processing service demonstrate that our proposed method can provide a higher-performance WPS service that uses less computing resources. Our proposed method can also help institutions reduce hardware costs, raise the rate of hardware usage, and conserve energy, which is important in building green and sustainable geospatial services or applications.
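
    The cloud-hosted service described above is not scripted here; as a generic sketch of how a client invokes an OGC WPS process, the snippet below uses the Python OWSLib client. The endpoint URL, process identifier and input values are hypothetical placeholders.

        # Sketch: discover and execute a process on an OGC WPS endpoint with OWSLib.
        # Endpoint URL, process identifier and inputs are hypothetical placeholders.
        from owslib.wps import WebProcessingService, monitorExecution

        wps = WebProcessingService("https://example.org/wps")
        wps.getcapabilities()
        for process in wps.processes:
            print(process.identifier, "-", process.title)

        # Kick off a (hypothetical) NDVI computation on a remote-sensing scene.
        execution = wps.execute(
            "compute_ndvi",
            inputs=[("scene_id", "LC08_L1TP_123032_20150101")],
        )
        monitorExecution(execution)          # poll until the job finishes
        print(execution.status)
        for output in execution.processOutputs:
            print(output.identifier, output.reference)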

  15. Lowering the barriers for accessing distributed geospatial big data to advance spatial data science: the PolarHub solution

    Science.gov (United States)

    Li, W.

    2017-12-01

    Data is the crux of science. The widespread availability of big data today is of particular importance for fostering new forms of geospatial innovation. This paper reports a state-of-the-art solution that addresses a key cyberinfrastructure research problem—providing ready access to big, distributed geospatial data resources on the Web. We first formulate this data-access problem and introduce its indispensable elements, including identifying the cyber-location, space and time coverage, theme, and quality of the dataset. We then propose strategies to tackle each data-access issue and make the data more discoverable and usable for geospatial data users and decision makers. Among these strategies is large-scale web crawling as a key technique to support automatic collection of online geospatial data that are highly distributed, intrinsically heterogeneous, and known to be dynamic. To better understand the content and scientific meanings of the data, methods including space-time filtering, ontology-based thematic classification, and service quality evaluation are incorporated. To serve a broad scientific user community, these techniques are integrated into an operational data crawling system, PolarHub, which is also an important cyberinfrastructure building block to support effective data discovery. A series of experiments were conducted to demonstrate the outstanding performance of the PolarHub system. We expect this work to contribute significantly in building the theoretical and methodological foundation for data-driven geography and the emerging spatial data science.
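
    PolarHub's crawler is far more elaborate than can be shown here; the toy sketch below illustrates only the basic idea of probing harvested links with an OGC GetCapabilities request to decide whether they point at live geospatial services. The seed URL is a hypothetical placeholder, and a production crawler would additionally need politeness rules, deduplication and large-scale scheduling.

        # Sketch: probe hyperlinks found on a seed page with OGC GetCapabilities
        # requests to see whether they point at live geospatial web services.
        # The seed URL is a hypothetical placeholder.
        import re
        import requests

        def find_links(page_url):
            html = requests.get(page_url, timeout=15).text
            return set(re.findall(r'https?://[^\s"\'<>]+', html))

        def looks_like_ogc_service(url, service="WMS"):
            try:
                resp = requests.get(
                    url,
                    params={"service": service, "request": "GetCapabilities"},
                    timeout=15,
                )
                return resp.ok and "Capabilities" in resp.text[:2000]
            except requests.RequestException:
                return False

        seed = "https://example.org/geospatial-data-directory"  # hypothetical seed page
        for link in sorted(find_links(seed)):
            if looks_like_ogc_service(link):
                print("Candidate WMS endpoint:", link)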

  16. The African Geospatial Sciences Institute (agsi): a New Approach to Geospatial Training in North Africa

    Science.gov (United States)

    Oeldenberger, S.; Khaled, K. B.

    2012-07-01

    The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate the geospatial capacity development in North-Africa, providing the facilities for geospatial project and management training to regional government employees, university graduates, private individuals and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training, providing actual project execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities, e. g. the ITC, and international organizations, such as the ISPRS, the ICA and the OGC. Through a close cooperation with African organizations, such as the AARSE, the RCMRD and RECTAS, the network and exchange of ideas, experiences, technology and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will be steering the AGSI operations and will ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility with approximately 30 part- and full-time general staff and lecturers in Tunis during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and compact LIDAR instrument for local, community-scale data capture. Surveying training, the photogrammetric processing of aerial images, GIS data capture and remote sensing training will be the main components of the practical training courses

  17. Geospatial Cyberinfrastructure and Geoprocessing Web—A Review of Commonalities and Differences of E-Science Approaches

    Directory of Open Access Journals (Sweden)

    Barbara Hofer

    2013-08-01

    Full Text Available Online geoprocessing gains momentum through increased online data repositories, web service infrastructures, online modeling capabilities and the required online computational resources. Advantages of online geoprocessing include reuse of data and services, extended collaboration possibilities among scientists, and efficiency thanks to distributed computing facilities. In the field of Geographic Information Science (GIScience), two recent approaches exist that have the goal of supporting science in online environments: the geospatial cyberinfrastructure and the geoprocessing web. Due to its historical development, the geospatial cyberinfrastructure has strengths related to the technologies required for data storage and processing. The geoprocessing web focuses on providing components for model development and sharing. These components shall allow expert users to develop, execute and document geoprocessing workflows in online environments. Despite this difference in the emphasis of the two approaches, the objectives, concepts and technologies they use overlap. This paper provides a review of the definitions and representative implementations of the two approaches. The provided overview clarifies which aspects of e-Science are highlighted in approaches differentiated in the geographic information domain. The discussion of the two approaches leads to the conclusion that synergies in research on e-Science environments shall be extended. Full-fledged e-Science environments will require the integration of approaches with different strengths.

  18. Gestión documental y de contenidos web: informe de situación

    OpenAIRE

    Saorín, Tomás; Pástor-Sánchez, Juan-Antonio

    2012-01-01

    Review of major 2011 developments in the field of bibliographic cataloguing, document management and documentary languages. The use of content management systems for web publishing, the emergence of new approaches such as web experience management and the integration of web productivity components are analyzed

  19. A Spatial Data Infrastructure Integrating Multisource Heterogeneous Geospatial Data and Time Series: A Study Case in Agriculture

    Directory of Open Access Journals (Sweden)

    Gloria Bordogna

    2016-05-01

    Full Text Available Currently, the best practice to support land planning calls for the development of Spatial Data Infrastructures (SDI) capable of integrating both geospatial datasets and time series information from multiple sources, e.g., multitemporal satellite data and Volunteered Geographic Information (VGI). This paper describes an original OGC standard interoperable SDI architecture and a geospatial data and metadata workflow for creating and managing multisource heterogeneous geospatial datasets and time series, and discusses it in the framework of the Space4Agri project study case developed to support the agricultural sector in the Lombardy region, Northern Italy. The main novel contributions go beyond the application domain for which the SDI has been developed and are the following: the ingestion within an a-centric SDI, potentially distributed in several nodes on the Internet to support scalability, of products derived by processing remote sensing images, authoritative data, georeferenced in-situ measurements and voluntary information (VGI) created by farmers and agronomists using an original Smart App; the workflow automation for publishing sets and time series of heterogeneous multisource geospatial data and the related web services; and, finally, the project geoportal, which can ease the analysis of the geospatial datasets and time series by providing complex intelligent spatio-temporal query and answering facilities.

  20. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Full Text Available Social media is valuable for propagating information during disasters because of its timeliness and availability, and it assists in decision-making when tagged with locations. Considering the ambiguity and inaccuracy of some social data, additional authoritative data are needed for verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis only after the fact. Moreover, current works organize the data from the perspective of the spatial location, but not from the perspective of the disaster, making it difficult to analyze the disaster dynamically. All of the disaster-related data around the affected locations need to be retrieved. To address these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework and proceeds as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and attributes were extracted from the web using natural language processing (NLP) and used in the semantic similarity matching of the geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed here within the GEGIS and shows that the system would be effective when typhoons occur.

  1. Improvement Of Search Process In Electronic Catalogues

    Directory of Open Access Journals (Sweden)

    Titas Savickas

    2014-05-01

    Full Text Available The paper presents an investigation of search in electronic catalogues. The chosen problem domain is the search system of the electronic catalogue of Lithuanian Academic Libraries. The catalogue uses the ALEPH system with the MARC21 bibliographic format. The article presents an analysis of problems pertaining to the current search engine and of user expectations related to the search system of the electronic catalogue of academic libraries. Following this analysis, the paper presents the architecture of a semantic search system for the electronic catalogue that uses a search process designed to improve search results for users.

  2. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    Science.gov (United States)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, currently Earth scientists, educators, and students face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and
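
    GeoBrain's federated catalogue services are not reproduced here; as a generic sketch of the kind of catalogue query being federated, the snippet below searches an OGC Catalogue Service for the Web (CSW) with the Python OWSLib client. The endpoint URL and search term are hypothetical placeholders.

        # Sketch: search an OGC Catalogue Service for the Web (CSW) for datasets whose
        # metadata mentions a keyword. The endpoint URL is a hypothetical placeholder.
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        csw = CatalogueServiceWeb("https://example.org/csw")
        query = PropertyIsLike("csw:AnyText", "%Landsat%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        for rec_id, record in csw.records.items():
            print(rec_id, "-", record.title)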

  3. Examining the Effect of Enactment of a Geospatial Curriculum on Students' Geospatial Thinking and Reasoning

    Science.gov (United States)

    Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara

    2014-08-01

    A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gain in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of curriculum enactment. Teacher-level factors, such as implementation of critical components of the curriculum or the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide support that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.

  4. Technology Catalogue

    International Nuclear Information System (INIS)

    1994-02-01

    The Department of Energy's Office of Environmental Restoration and Waste Management (EM) is responsible for remediating its contaminated sites and managing its waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste management programs within EM's Office of Environmental Restoration and Office of Waste Management. The purpose of the Technology Catalogue is to provide performance data on OTD-developed technologies to scientists and engineers assessing and recommending technical solutions within the Department's clean-up and waste management programs, as well as to industry, other federal and state agencies, and the academic community. OTD's applied research and demonstration activities are conducted in programs referred to as Integrated Demonstrations (IDs) and Integrated Programs (IPs). The IDs test and evaluate systems, consisting of coupled technologies, at specific sites to address generic problems, such as the sensing, treatment, and disposal of buried waste containers. The IPs support applied research activities in specific applications areas, such as in situ remediation, efficient separations processes, and site characterization. The Technology Catalogue is a means for communicating the status of the development of these innovative technologies. The FY93 Technology Catalogue features technologies successfully demonstrated in the field through IDs and sufficiently mature to be used in the near-term. Technologies from the following IDs are featured in the FY93 Technology Catalogue: Buried Waste ID (Idaho National Engineering Laboratory, Idaho); Mixed Waste Landfill ID (Sandia National Laboratories, New Mexico); Underground Storage Tank ID (Hanford, Washington); Volatile organic compound (VOC) Arid ID (Richland, Washington); and VOC Non-Arid ID (Savannah River Site, South Carolina)

  5. On compact galaxies in the UGC catalogue

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1980-01-01

    A problem of the separation of compact galaxies in the UGC Catalogue is considered. A surface brightness equal to or less than 21^m per square second of arc was used as the compactness criterion. 96 galaxies brighter than 14^m.5 satisfy this criterion. Among the compact galaxies discovered in the UGC Catalogue, 7% are Zwicky galaxies, 15% belong to the Markarian galaxies and 27% are part of a galaxy list with high surface brightness. A considerable divergence in the estimates of the total share of compact galaxies in the B.A. Worontsov-Veljaminov Morphological Catalogue of Galaxies (MCG) and the UGC Catalogue is noted. This divergence results from a systematic underestimation of the visible sizes of compact galaxies in the MCG Catalogue as compared with the UGC Catalogue [ru]

  6. Efficient Extraction of Content from Enriched Geospatial and Networked Data

    DEFF Research Database (Denmark)

    Qu, Qiang

    Social network services such as Google Places and Twitter have led to a proliferation of user-generated web content that is constantly shared among users. These services enable access to various types of content, covering geospatial locations, textual descriptions, social relationships, and so forth, which makes it possible to extract relevant and interesting information that can then be utilized in different applications. However, web content is often semantically rich, structurally complex, and highly dynamic. This dissertation addresses some of the challenges posed by the use of such data … by merging edges and nodes in the original graph. Generalized, compressed graphs provide a way to interpret large networks. The dissertation reports on studies that compare the proposed solutions with respect to their tradeoffs between result complexity and quality. The findings suggest that the solutions...
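
    The dissertation's algorithms are not reproduced here; the toy sketch below only illustrates the general idea of compressing a graph by merging nodes, in this case nodes that share exactly the same neighbours, using the NetworkX library.

        # Sketch: compress a graph by contracting nodes that have identical neighbour
        # sets; a toy stand-in for the dissertation's summarization techniques.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([("a", "x"), ("b", "x"), ("a", "y"), ("b", "y"), ("c", "x")])

        # Group nodes by their neighbourhood signature.
        by_neighbors = {}
        for node in list(G.nodes):
            signature = frozenset(G.neighbors(node))
            by_neighbors.setdefault(signature, []).append(node)

        # Contract each group into its first member.
        summary = G.copy()
        for group in by_neighbors.values():
            keep, *rest = group
            for other in rest:
                summary = nx.contracted_nodes(summary, keep, other, self_loops=False)

        print(summary.nodes(data=True))   # "a" now stands in for "b" as well
        print(summary.edges())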

  7. Geospatial Technologies and Geography Education in a Changing World : Geospatial Practices and Lessons Learned

    NARCIS (Netherlands)

    2015-01-01

    Book published by IGU Commission on Geographical Education. It focuses particularly on what has been learned from geospatial projects and research from the past decades of implementing geospatial technologies in formal and informal education.

  8. A Geospatial Online Instruction Model

    OpenAIRE

    Athena OWEN-NAGEL; John C. RODGERS III; Shrinidhi AMBINAKUDIGE

    2012-01-01

    The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model’s effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based teaching effectiveness has not yet been clearly demonstrated for geospatial courses. The pedagogical model implemented in this study heavily utilizes ...

  9. Collective Sensing: Integrating Geospatial Technologies to Understand Urban Systems—An Overview

    Directory of Open Access Journals (Sweden)

    Geoffrey J. Hay

    2011-08-01

    Full Text Available Cities are complex systems composed of numerous interacting components that evolve over multiple spatio-temporal scales. Consequently, no single data source is sufficient to satisfy the information needs required to map, monitor, model, and ultimately understand and manage our interaction within such urban systems. Remote sensing technology provides a key data source for mapping such environments, but is not sufficient for fully understanding them. In this article we provide a condensed urban perspective of critical geospatial technologies and techniques: (i) Remote Sensing; (ii) Geographic Information Systems; (iii) object-based image analysis; and (iv) sensor webs, and recommend a holistic integration of these technologies within the language of Open Geospatial Consortium (OGC) standards in order to more fully understand urban systems. We then discuss the potential of this integration and conclude that this extends the monitoring and mapping options beyond “hard infrastructure” by addressing “humans as sensors”, mobility and human-environment interactions, and future improvements to quality of life and of social infrastructures.

  10. A Collaborative Geospatial Shoreline Inventory Tool to Guide Coastal Development and Habitat Conservation

    Directory of Open Access Journals (Sweden)

    Peter Gies

    2013-05-01

    Full Text Available We are developing a geospatial inventory tool that will guide habitat conservation, restoration and coastal development and benefit several stakeholders who seek mitigation and adaptation strategies to shoreline changes resulting from erosion and sea level rise. The ESRI Geoportal Server, which is a type of web portal used to find and access geospatial information in a central repository, is customized by adding a Geoinventory tool capability that allows any shoreline related data to be searched, displayed and analyzed on a map viewer. Users will be able to select sections of the shoreline and generate statistical reports in the map viewer to allow for comparisons. The tool will also facilitate map-based discussion forums and creation of user groups to encourage citizen participation in decisions regarding shoreline stabilization and restoration, thereby promoting sustainable coastal development.

  11. NREL Information Resources Catalogue 1999

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-04-03

    This is the sixth annual catalogue listing documents produced by NREL during the last fiscal year. Each year the catalogue is mailed to state energy offices, DOE support offices, and to anyone looking to find out more information about NREL's activities and publications.

  12. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    Science.gov (United States)

    Di Giacomo, Domenico; Engdhal, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product to use for seismic hazard studies. The new catalogue was necessary as improved seismic hazard studies necessitate that earthquake catalogues are homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue both for earthquakes that occurred beyond 2009 and for earthquakes listed in the International Seismological Summary (ISS) which fell below the cut-off magnitude of 6.25. This extension is part of a four-year program that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies as the ISC-GEM catalogue should be used as a basis for cross-checking the consistency in location and magnitude of those earthquakes listed both in the ISC-GEM global catalogue and in regional catalogues.
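
    As a small illustration of the time-variable cut-off magnitudes quoted above, the sketch below applies them as a selection filter; the event list is made-up example data, not catalogue content.

        # Sketch: apply the ISC-GEM time-variable cut-off magnitudes quoted above
        # (Ms >= 7.5 before 1918, >= 6.25 for 1918-1963, >= 5.5 from 1964 onwards).
        # The sample events are made-up illustrative data, not catalogue entries.
        def passes_cutoff(year: int, ms: float) -> bool:
            if year < 1918:
                return ms >= 7.5
            if year <= 1963:
                return ms >= 6.25
            return ms >= 5.5

        events = [(1905, 7.9), (1950, 6.0), (1950, 6.4), (1972, 5.6)]
        selected = [(y, m) for y, m in events if passes_cutoff(y, m)]
        print(selected)   # [(1905, 7.9), (1950, 6.4), (1972, 5.6)]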

  13. BKE: The catalogue of Bunker-Est Vesuvian station

    International Nuclear Information System (INIS)

    Sarao, A.; Peresan, A.; Vaccari, F.; De Natale, G.; Mariano, A.

    2002-06-01

    A catalogue of 9003 earthquakes associated with the activity of the Vesuvian volcano, as recorded at the Bunker Est station (BKE), located on Mt. Vesuvius and operated by the Osservatorio Vesuviano (OV), is presented here. The aim of this catalogue is to integrate the information collected in the catalogue compiled by the OV that contains the volcanic earthquakes recorded at station OVO (OVO catalogue) since February 1972. A brief statistical description of the data included in the BKE catalogue and of the empirical relations used for the estimate of the magnitude MBKE from the duration is provided, together with the essential information about the catalogue source. A complete list of the events reported in the BKE catalogue and a description of its format are given in the Appendix. The BKE catalogue has been realized with the cooperation of the University of Trieste - Department of Earth Sciences, the International Centre for Theoretical Physics (ICTP) - Structure and Non-linear Dynamics of the Earth (SAND) Group, and the Osservatorio Vesuviano of Naples, in the framework of the project 'Eruptive Scenarios from Physical Modeling and Experimental Volcanology' funded by INGV. (author)

  14. Geospatial Information Response Team

    Science.gov (United States)

    Witt, Emitt C.

    2010-01-01

    Extreme emergency events of national significance that include manmade and natural disasters seem to have become more frequent during the past two decades. The Nation is becoming more resilient to these emergencies through better preparedness, reduced duplication, and establishing better communications so every response and recovery effort saves lives and mitigates the long-term social and economic impacts on the Nation. The National Response Framework (NRF) (http://www.fema.gov/NRF) was developed to provide the guiding principles that enable all response partners to prepare for and provide a unified national response to disasters and emergencies. The NRF provides five key principles for better preparation, coordination, and response: 1) engaged partnerships, 2) a tiered response, 3) scalable, flexible, and adaptable operations, 4) unity of effort, and 5) readiness to act. The NRF also describes how communities, tribes, States, Federal Government, private-sector, and non-governmental partners apply these principles for a coordinated, effective national response. The U.S. Geological Survey (USGS) has adopted the NRF doctrine by establishing several earth-sciences, discipline-level teams to ensure that USGS science, data, and individual expertise are readily available during emergencies. The Geospatial Information Response Team (GIRT) is one of these teams. The USGS established the GIRT to facilitate the effective collection, storage, and dissemination of geospatial data information and products during an emergency. The GIRT ensures that timely geospatial data are available for use by emergency responders, land and resource managers, and for scientific analysis. In an emergency and response capacity, the GIRT is responsible for establishing procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing coordinated products and services utilizing the USGS' exceptional pool of

  15. THROES: a caTalogue of HeRschel Observations of Evolved Stars. I. PACS range spectroscopy

    Science.gov (United States)

    Ramos-Medina, J.; Sánchez Contreras, C.; García-Lario, P.; Rodrigo, C.; da Silva Santos, J.; Solano, E.

    2018-03-01

    This is the first of a series of papers presenting the THROES (a caTalogue of HeRschel Observations of Evolved Stars) project, intended to provide a comprehensive overview of the spectroscopic results obtained in the far-infrared (55-670 μm) with the Herschel space observatory on low-to-intermediate-mass evolved stars in our Galaxy. Here we introduce the catalogue of interactively reprocessed Photoconductor Array Camera and Spectrometer (PACS) spectra covering the 55-200 μm range for 114 stars in this category for which PACS range spectroscopic data are available in the Herschel Science Archive (HSA). Our sample includes objects spanning a range of evolutionary stages, from the asymptotic giant branch to the planetary nebula phase, displaying a wide variety of chemical and physical properties. The THROES/PACS catalogue is accessible via a dedicated web-based interface and includes not only the science-ready Herschel spectroscopic data for each source, but also complementary photometric and spectroscopic data from other infrared observatories, namely IRAS, ISO, or AKARI, at overlapping wavelengths. Our goal is to create a legacy-value Herschel dataset that can be used by the scientific community in the future to deepen our knowledge and understanding of these latest stages of the evolution of low-to-intermediate-mass stars. The THROES/PACS catalogue is accessible at https://throes.cab.inta-csic.es/

  16. UCI2001: The updated catalogue of Italy

    International Nuclear Information System (INIS)

    Peresan, A.; Panza, G.F.

    2002-05-01

    A new updated earthquake catalogue for the Italian territory, named UCI2001, is described here; it consists of an updated and revised version of the CCI1996 catalogue (Peresan et al., 1997). The revision essentially corresponds to the incorporation of data from the NEIC (National Earthquake Information Centre) and ALPOR (Catalogo delle Alpi Orientali) catalogues, while the updating is performed using the NEIC Preliminary Determinations of Epicenters since 1986. A brief overview of the catalogues used for monitoring seismicity in the Italian area is provided, together with the essential information about the structure of the UCI2001 catalogue and a description of its format. A complete list of the events, as of May 1, 2002, is given in the Appendix. (author)

  17. The geo-spatial information infrastructure at the Centre for Control and Prevention of Zoonoses, University of Ibadan, Nigeria: an emerging sustainable One-Health pavilion.

    Science.gov (United States)

    Olugasa, B O

    2014-12-01

    The World-Wide-Web, as a contemporary means of information sharing, offers a platform for geo-spatial information dissemination to improve education about spatio-temporal patterns of disease spread at the human-animal-environment interface in developing countries of West Africa. In assessing the quality of exposure to geospatial information applications among students in five purposively selected institutions in West Africa, this study reviewed course contents and postgraduate programmes in zoonoses surveillance. Geospatial information content and associated practical exercises in zoonoses surveillance were scored. Seven criteria were used to categorize and score capability, namely: spatial data capture; thematic map design and interpretation; spatio-temporal analysis; remote sensing of data; statistical modelling; management of spatial data profiles; and web-based map sharing within an organization. These criteria were used to compute weighted exposure during training at the institutions. The institution with the highest computed Cumulative Exposure Point Average (CEPA) was described using an illustration based on retrospective records of rabies cases in humans, animals and the environment, sourced from Grand Bassa County, Liberia, to create and share maps and information with faculty, staff, students and the neighbourhood about animal bite injury surveillance and the spatial distribution of rabies-like illness. Uniformly low CEPA values (0-1.3) were observed across academic departments. The highest (3.8) was observed at the Centre for Control and Prevention of Zoonoses (CCPZ), University of Ibadan, Nigeria, where geospatial techniques were systematically taught, and thematic and predictive maps were produced and shared online with other institutions in West Africa. In addition, a short course in zoonosis surveillance, which offers inclusive learning in geospatial applications, is taught at CCPZ. The paper
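
    As a rough illustration of how a weighted exposure score over the seven listed criteria could be computed, the sketch below averages per-criterion scores with user-supplied weights. The weights and the 0-4 scoring scale are assumptions; the paper's actual scoring rubric is not reproduced here.

        # Illustrative computation of a weighted exposure score over the seven
        # criteria listed above. Weights and the 0-4 scoring scale are assumptions.
        CRITERIA = ["data_capture", "thematic_mapping", "spatiotemporal_analysis",
                    "remote_sensing", "statistical_modelling", "data_profile_mgmt",
                    "web_map_sharing"]

        def weighted_exposure(scores, weights=None):
            """Return the weighted mean of per-criterion scores (e.g. 0-4 each)."""
            if weights is None:
                weights = {c: 1.0 for c in CRITERIA}
            total_w = sum(weights[c] for c in CRITERIA)
            return sum(scores[c] * weights[c] for c in CRITERIA) / total_w

        example = {c: 4 for c in CRITERIA}
        print(weighted_exposure(example))  # -> 4.0 for a maximal profile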

  18. Enhancing the Teaching of Digital Processing of Remote Sensing Image Course through Geospatial Web Processing Services

    Science.gov (United States)

    di, L.; Deng, M.

    2010-12-01

    Remote sensing (RS) is an essential method of collecting data for Earth science research. Huge amounts of remote sensing data, most of them in image form, have been acquired. Almost all geography departments in the world offer courses in digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amounts of multi-source images to solve real-world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles to effectively teaching such courses remain. The major obstacles include 1) difficulties in finding, accessing, integrating and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent developments in geospatial Web processing service systems, which make massive data, computing power, and processing capabilities available to average Internet users anywhere in the world, promise the removal of these obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in the GRASS Open Source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high-performance cluster server with large disk storage and a fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing, taught at the Department of Geography and Geoinformation Science, George Mason University. The course uses the textbook "Introductory

  19. Prototype of a web - based participative decision support platform in natural hazards and risk management

    NARCIS (Netherlands)

    Aye, Z.C.; Jaboyedoff, M.; Derron, M.H.; van Westen, C.J.

    2015-01-01

    This paper presents the current state and development of a prototype web-GIS (Geographic Information System) decision support platform intended for application in natural hazards and risk management, mainly for floods and landslides. This web platform uses open-source geospatial software and

  20. Evidence Based Cataloguing: Moving Beyond the Rules

    Directory of Open Access Journals (Sweden)

    Kathy Carter

    2010-12-01

    Full Text Available Cataloguing is sometimes regarded as a rule-bound, production-based activity that offers little scope for professional judgement and decision-making. In reality, cataloguing involves challenging decisions that can have significant service and financial impacts. The current environment for cataloguing is a maelstrom of changing demands and competing visions for the future. With information-seekers turning en masse to Google and their behaviour receiving greater attention, library vendors are offering “discovery layer” products to replace traditional OPACs, and cataloguers are examining and debating a transformed version of their descriptive cataloguing rules (Resource Description and Access or RDA. In his “Perceptions of the future of cataloging: Is the sky really falling?” (2009, Ivey provides a good summary of this environment. At the same time, myriad new metadata formats and schema are being developed and applied for digital collections in libraries and other institutions. In today’s libraries, cataloguing is no longer limited to management of traditional AACR and MARC-based metadata for traditional library collections. And like their parent institutions, libraries cannot ignore growing pressures to demonstrate accountability and tangible value provided by their services. More than ever, research and an evidence based approach can help guide cataloguing decision-making.

  1. A framework for efficient spatial web object retrieval

    DEFF Research Database (Denmark)

    Wu, Dingming; Cong, Gao; Jensen, Christian S.

    2012-01-01

    The conventional Internet is acquiring a geospatial dimension. Web documents are being geo-tagged and geo-referenced objects such as points of interest are being associated with descriptive text documents. The resulting fusion of geo-location and documents enables new kinds of queries that take...

  2. A multimembership catalogue for 1876 open clusters using UCAC4 data

    Science.gov (United States)

    Sampedro, L.; Dias, W. S.; Alfaro, E. J.; Monteiro, H.; Molino, A.

    2017-10-01

    The main objective of this work is to determine the cluster members of 1876 open clusters, using positions and proper motions of the astrometric fourth United States Naval Observatory (USNO) CCD Astrograph Catalog (UCAC4). For this purpose, we apply three different methods, all based on a Bayesian approach, but with different formulations: a purely parametric method, another completely non-parametric algorithm and a third, recently developed by Sampedro & Alfaro, using both formulations at different steps of the whole process. The first and second statistical moments of the members' phase-space subspace, obtained after applying the three methods, are compared for every cluster. Although, on average, the three methods yield similar results, there are also specific differences between them, as well as for some particular clusters. The comparison with other published catalogues shows good agreement. We have also estimated, for the first time, the mean proper motion for a sample of 18 clusters. The results are organized in a single catalogue formed by two main files, one with the most relevant information for each cluster, partially including that in UCAC4, and the other showing the individual membership probabilities for each star in the cluster area. The final catalogue, with an interface design that enables an easy interaction with the user, is available in electronic format at the Stellar Systems Group (SSG-IAA) web site (http://ssg.iaa.es/en/content/sampedro-cluster-catalog).
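
    For readers unfamiliar with parametric membership methods of this kind, the sketch below shows the usual two-population (cluster + field) Gaussian mixture in proper-motion space, from which a posterior membership probability follows by Bayes' rule. It is a generic textbook formulation with placeholder parameters, not the specific algorithms compared in the paper.

        import numpy as np
        from scipy.stats import multivariate_normal

        # Generic two-component (cluster + field) mixture in proper-motion space.
        # Means, covariances and the mixing fraction are illustrative placeholders.
        cluster = multivariate_normal(mean=[2.0, -4.0], cov=np.diag([0.5, 0.5]) ** 2)
        field = multivariate_normal(mean=[0.0, 0.0], cov=np.diag([8.0, 8.0]) ** 2)
        f_cluster = 0.3  # assumed fraction of cluster stars in the field of view

        def membership_probability(pm):
            """Posterior probability that a star with proper motion pm is a cluster member."""
            pc = f_cluster * cluster.pdf(pm)
            pf = (1.0 - f_cluster) * field.pdf(pm)
            return pc / (pc + pf)

        print(round(membership_probability([2.1, -3.8]), 3))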

  3. Using Web GIS for Public Health Education

    Science.gov (United States)

    Reed, Rajika E.; Bodzin, Alec M.

    2016-01-01

    An interdisciplinary curriculum unit that used Web GIS mapping to investigate malaria disease patterns and spread in relation to the environment for a high school Advanced Placement Environmental Science course was developed. A feasibility study was conducted to investigate the efficacy of the unit to promote geospatial thinking and reasoning…

  4. Practical cataloguing AACR, RDA and MARC 21

    CERN Document Server

    Welsh, Anne

    2012-01-01

    Written at a time of transition in international cataloguing, this book provides cataloguers and students with a background in general cataloguing principles, the code (AACR2) and format (MARC 21) and the new standard (RDA). It provides library managers with an overview of the development of RDA in order to equip them to make the transition.

  5. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    Science.gov (United States)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    Twenty-first-century geoscience faces the challenges of Big Data, spikes in computing requirements (e.g., when a natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discoveries. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies from their existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform-as-a-service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This

  6. SOFTWARE ARCHITECTURE DESIGN OF GIS WEB SERVICE AGGREGATION BASED ON SERVICE GROUP

    Directory of Open Access Journals (Sweden)

    J.-C. Liu

    2012-08-01

    Full Text Available Based on an analysis of the research status of domestic and international GIS web service aggregation and the development tendency of public platforms for GIS web services, this paper designs a software architecture for GIS web service aggregation based on GIS web service groups. Firstly, using a heterogeneous GIS services model, the software architecture converts a variety of heterogeneous services to a unified GIS service interface and divides different types of GIS services into different service groups according to the descriptions of the GIS services. Secondly, a service aggregation process model was designed. This model completes the task of a specific service aggregation instance by automatically selecting member GIS web services within the same service group, achieving dynamic capability and automatic adaptation of the GIS web service aggregation process. Thirdly, the paper designs a service evaluation model for GIS web service aggregation based on service groups from three aspects, i.e. the GIS web service itself, networking conditions and the service consumer. This model implements effective quality evaluation and performance monitoring of GIS web service aggregation and can be used to guide the execution, monitoring and service selection of the aggregation process, thereby improving the robustness of the aggregated GIS web service. Finally, the software architecture has been widely used in the public platform of GIS web services and a number of geo-spatial framework constructions for digital cities in Sichuan Province, aggregating various GIS web services such as World Map (the National Public Platform of Geo-spatial Services), ArcGIS, SuperMap, MapGIS, NewMap, etc. Applications have shown that this software architecture is practical.
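
    A minimal sketch of the "service group" idea described above -- heterogeneous GIS web services wrapped behind one interface, with a member selected automatically at invocation time -- is given below. All class and method names are invented for illustration and do not mirror the platform described in the paper.

        # Sketch of the service-group idea: heterogeneous GIS web services are
        # wrapped behind one interface and a member is picked per request.
        # All names are illustrative; they do not mirror the described platform.
        class GISService:
            def __init__(self, name, quality_score):
                self.name = name
                self.quality_score = quality_score  # e.g. from an evaluation model

            def get_map(self, bbox, layer):
                return f"{self.name}: map of {layer} for {bbox}"

        class ServiceGroup:
            """A group of functionally equivalent services (e.g. all map services)."""
            def __init__(self, members):
                self.members = members

            def invoke(self, bbox, layer):
                # Automatic member selection: choose the currently best-rated service.
                best = max(self.members, key=lambda s: s.quality_score)
                return best.get_map(bbox, layer)

        group = ServiceGroup([GISService("ArcGIS", 0.8), GISService("SuperMap", 0.9)])
        print(group.invoke((103.9, 30.6, 104.2, 30.8), "roads"))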

  7. Alcohol promotions in Australian supermarket catalogues.

    Science.gov (United States)

    Johnston, Robyn; Stafford, Julia; Pierce, Hannah; Daube, Mike

    2017-07-01

    In Australia, most alcohol is sold as packaged liquor from off-premises retailers, a market increasingly dominated by supermarket chains. Competition between retailers may encourage marketing approaches, for example, discounting, that evidence indicates contribute to alcohol-related harms. This research documented the nature and variety of promotional methods used by two major supermarket retailers to promote alcohol products in their supermarket catalogues. Weekly catalogues from the two largest Australian supermarket chains were reviewed for alcohol-related content over 12 months. Alcohol promotions were assessed for promotion type, product type, number of standard drinks, purchase price and price/standard drink. Each store catalogue included, on average, 13 alcohol promotions/week, with price-based promotions most common. Forty-five percent of promotions required the purchase of multiple alcohol items. Wine was the most frequently promoted product (44%), followed by beer (24%) and spirits (18%). Most (99%) wine cask (2-5 L container) promotions required multiple (two to three) casks to be purchased. The average number of standard drinks required to be purchased to participate in catalogue promotions was 31.7 (SD = 24.9; median = 23.1). The median price per standard drink was $1.49 (range $0.19-$9.81). Cask wines had the lowest cost per standard drink across all product types. Supermarket catalogues' emphasis on low prices/high volumes of alcohol reflects that retailers are taking advantage of limited restrictions on off-premise sales and promotion, which allow them to approach market competition in ways that may increase alcohol-related harms in consumers. Regulation of alcohol marketing should address retailer catalogue promotions. [Johnston R, Stafford J, Pierce H, Daube M. Alcohol promotions in Australian supermarket catalogues. Drug Alcohol Rev 2017;36:456-463]. © 2016 Australasian Professional Society on Alcohol and other Drugs.
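
    The per-drink price metric reported above is simply the promoted purchase price divided by the total number of standard drinks that must be bought. A trivial sketch follows; the promotion figures are invented for illustration and are not data from the study.

        # Price per standard drink = purchase price / total standard drinks required.
        # The promotion below is invented for illustration; it is not study data.
        def price_per_standard_drink(purchase_price, standard_drinks):
            return purchase_price / standard_drinks

        # e.g. two 4 L wine casks at $12.99 each, roughly 43 standard drinks per cask
        print(round(price_per_standard_drink(2 * 12.99, 2 * 43), 2))  # ~$0.30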

  8. Development and challenges of using web-based GIS for health applications

    DEFF Research Database (Denmark)

    Gao, Sheng; Mioc, Darka; Boley, Harold

    2011-01-01

    Web-based GIS is increasingly used in health applications. It has the potential to provide critical information in a timely manner, support health care policy development, and educate decision makers and the general public. This paper describes the trends and recent development of health...... applications using a Web-based GIS. Recent progress on the database storage and geospatial Web Services has advanced the use of Web-based GIS for health applications, with various proprietary software, open source software, and Application Programming Interfaces (APIs) available. Current challenges in applying...... care planning, and public health participation....

  9. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    Science.gov (United States)

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  10. Geospatial Technology in Geography Education

    NARCIS (Netherlands)

    Muniz Solari, Osvaldo; Demirci, A.; van der Schee, J.A.

    2015-01-01

    The book is presented as an important starting point for new research in Geography Education (GE) related to the use and application of geospatial technologies (GSTs). For this purpose, the selection of topics was based on central ideas to GE in its relationship with GSTs. The process of geospatial

  11. Teaching And Learning Tectonics With Web-GIS

    Science.gov (United States)

    Anastasio, D. J.; Sahagian, D. L.; Bodzin, A.; Teletzke, A. L.; Rutzmoser, S.; Cirucci, L.; Bressler, D.; Burrows, J. E.

    2012-12-01

    Tectonics is a new curriculum enhancement consisting of six Web GIS investigations designed to augment a traditional middle school Earth science curriculum. The investigations are aligned to the Disciplinary Core Ideas: Earth and Space Science from the National Research Council's (2012) Framework for K-12 Science Education and to tectonics benchmark ideas articulated in the AAAS Project 2061 (2007) Atlas of Science Literacy. The curriculum emphasizes geospatial thinking and scientific inquiry and consists of the following modules: Geohazards: which plate boundary is closest to me? How do we recognize plate boundaries? How does thermal energy move around the Earth? What happens when plates diverge? What happens when plates move sideways past each other? What happens when plates collide? The Web GIS interface uses JavaScript for simplicity, intuition, and convenience of implementation on a variety of platforms, making it easier for diverse middle school learners and their teachers to conduct authentic Earth science investigations, including multidisciplinary visualization, analysis, and synthesis of data. Instructional adaptations allow students who are English language learners, have disabilities, or are reluctant readers to perform advanced desktop GIS functions including spatial analysis, map visualization and query. The Web GIS interface integrates graphics, multimedia, and animation, in addition to newly developed features which allow users to explore and discover geospatial patterns that would not be easily visible using typical classroom instructional materials. The Tectonics curriculum uses a spatial learning design model that incorporates a related set of frameworks and design principles. The framework builds on the work of other successful technology-integrated curriculum projects and includes alignment of materials and assessments with learning goals, casting key ideas in real-world problems, and engaging students in scientific practices that foster the use of key

  12. An All-Sky Portable (ASP) Optical Catalogue

    Science.gov (United States)

    Flesch, Eric Wim

    2017-06-01

    This optical catalogue combines the all-sky USNO-B1.0/A1.0 and most-sky APM catalogues, plus overlays of SDSS optical data, into a single all-sky map presented in a sparse binary format that is easily downloaded at 9 Gb zipped. Total count is 1 163 237 190 sources and each has J2000 astrometry, red and blue magnitudes with PSFs and variability indicator, and flags for proper motion, epoch, and source survey and catalogue for each of the photometry and astrometry. The catalogue is available on http://quasars.org/asp.html, and additional data for this paper is available at http://dx.doi.org/10.4225/50/5807fbc12595f.

  13. Dynamic Science Data Services for Display, Analysis and Interaction in Widely-Accessible, Web-Based Geospatial Platforms, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — TerraMetrics, Inc., proposes a Phase II R/R&D program to implement the TerraBlocksTM Server architecture that provides geospatial data authoring, storage and...

  14. Brokered virtual hubs for facilitating access and use of geospatial Open Data

    Science.gov (United States)

    Mazzetti, Paolo; Latre, Miguel; Kamali, Nargess; Brumana, Raffaella; Braumann, Stefan; Nativi, Stefano

    2016-04-01

    Open Data is a major trend in the current information technology scenario and is often publicised as one of the pillars of the information society in the near future. In particular, geospatial Open Data have huge potential for the Earth Sciences, through the enablement of innovative applications and services integrating heterogeneous information. However, open does not mean usable. As was recognized at the very beginning of the Web revolution, many different degrees of openness exist: from simple sharing in a proprietary format to advanced sharing in standard formats including semantic information. Therefore, to fully unleash the potential of geospatial Open Data, advanced infrastructures are needed to increase the degree of data openness and enhance their usability. In October 2014, the ENERGIC OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". The ENERGIC OD Virtual Hubs aim to facilitate the use of geospatial Open Data by lowering and possibly removing the main barriers that hamper geo-information (GI) usage by end-users and application developers. Data and service heterogeneity is recognized as one of the major barriers to Open Data (re-)use. It forces end-users and developers to spend considerable effort in accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer specific challenges and to address specific use-cases. Thus

  15. Analysis of Web Searching Behavior and Information Evaluation Processes

    OpenAIRE

    種市, 淳子; 逸村, 裕; TANEICHI, Junko; ITSUMURA, Hiroshi

    2005-01-01

    In this study, we discussed information seeking behavior on the Web. First, current Web-searching studies are reviewed from the perspective of: (1) Web-searching characteristics; (2) the process model for how users evaluate Web resources. Secondly, we investigated information seeking processes using the Web search engine and online public access catalogue (OPAC) system by undergraduate students, through an experiment and its protocol analysis. The results indicate that: (1) Web-searching p...

  16. Gaia Data Release 1. Catalogue validation

    NARCIS (Netherlands)

    Arenou, F.; Luri, X.; Babusiaux, C.; Fabricius, C.; Helmi, A.; Robin, A. C.; Vallenari, A.; Blanco-Cuaresma, S.; Cantat-Gaudin, T.; Findeisen, K.; Reylé, C.; Ruiz-Dern, L.; Sordo, R.; Turon, C.; Walton, N. A.; Shih, I.-C.; Antiche, E.; Barache, C.; Barros, M.; Breddels, M.; Carrasco, J. M.; Costigan, G.; Diakité, S.; Eyer, L.; Figueras, F.; Galluccio, L.; Heu, J.; Jordi, C.; Krone-Martins, A.; Lallement, R.; Lambert, S.; Leclerc, N.; Marrese, P. M.; Moitinho, A.; Mor, R.; Romero-Gómez, M.; Sartoretti, P.; Soria, S.; Soubiran, C.; Souchay, J.; Veljanoski, J.; Ziaeepour, H.; Giuffrida, G.; Pancino, E.; Bragaglia, A.

    Context. Before the publication of the Gaia Catalogue, the contents of the first data release have undergone multiple dedicated validation tests. Aims: These tests aim to provide in-depth analysis of the Catalogue content in order to detect anomalies and individual problems in specific objects or in

  17. OSGeo - Open Source Geospatial Foundation

    Directory of Open Access Journals (Sweden)

    Margherita Di Leo

    2012-09-01

    Full Text Available The need that arose towards the end of 2005 to select and organize more than 200 FOSS4G projects led to the founding, in February 2006, of OSGeo (the Open Source Geospatial Foundation), an international organization whose mission is to promote the collaborative development of free software focused on geographic information (FOSS4G). The Open Source Geospatial Foundation (OSGeo) is a not-for-profit organization, created in early 2006, that aims to support the collaborative development of geospatial open source software and promote its widespread use. The foundation provides financial, organizational and legal support to the broader open source geospatial community. It also serves as an independent legal entity to which community members can contribute code, funding and other resources, secure in the knowledge that their contributions will be maintained for public benefit. OSGeo also serves as an outreach and advocacy organization for the open source geospatial community, and provides a common forum and shared infrastructure for improving cross-project collaboration. The foundation's projects are all freely available and usable under an OSI-certified open source license. The Italian OSGeo local chapter is named GFOSS.it (Associazione Italiana per l'Informazione Geografica Libera).

  18. OGC® Sensor Web Enablement Standards

    Directory of Open Access Journals (Sweden)

    George Percivall

    2006-09-01

    Full Text Available This article provides a high-level overview of and architecture for the Open Geospatial Consortium (OGC) standards activities that focus on sensors, sensor networks, and a concept called the "Sensor Web". This OGC work area is known as Sensor Web Enablement (SWE). This article has been condensed from "OGC® Sensor Web Enablement: Overview And High Level Architecture," an OGC White Paper by Mike Botts, PhD, George Percivall, Carl Reed, PhD, and John Davidson, which can be downloaded from http://www.opengeospatial.org/pt/15540. Readers interested in greater technical and architectural detail can download and read the OGC SWE Architecture Discussion Paper titled "The OGC Sensor Web Enablement Architecture" (OGC document 06-021r1, http://www.opengeospatial.org/pt/14140).

  19. DECENTRALIZED ORCHESTRATION OF COMPOSITE OGC WEB PROCESSING SERVICES IN THE CLOUD

    Directory of Open Access Journals (Sweden)

    F. Xiao

    2016-09-01

    Full Text Available Current web-based GIS or RS applications generally rely on a centralized structure, which has inherent drawbacks such as single points of failure, network congestion, and data inconsistency. These inherent disadvantages of traditional GISs need to be addressed for new applications on the Internet or Web. Decentralized orchestration offers performance improvements in terms of increased throughput and scalability and lower response time. This paper investigates build-time and runtime issues related to the decentralized orchestration of composite geospatial processing services based on the OGC WPS standard specification. A case study of dust storm detection is used to evaluate the proposed method, and the experimental results indicate that the method proposed in this study is effective, producing a high-quality solution at a low communication cost for the geospatial processing service composition problem.
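
    As background for readers unfamiliar with OGC WPS, the sketch below issues a KVP-encoded WPS 1.0.0 Execute request with Python's standard library. The endpoint, process identifier and input names are placeholders, not the dust-storm detection chain used in the paper.

        from urllib.parse import urlencode
        from urllib.request import urlopen

        # Minimal KVP-encoded WPS 1.0.0 Execute request. The endpoint, process
        # identifier and input are placeholders, not the service used in the paper.
        WPS_ENDPOINT = "http://example.org/wps"

        def wps_execute(identifier, inputs):
            params = {
                "service": "WPS",
                "version": "1.0.0",
                "request": "Execute",
                "identifier": identifier,
                "DataInputs": ";".join(f"{k}={v}" for k, v in inputs.items()),
            }
            with urlopen(f"{WPS_ENDPOINT}?{urlencode(params)}") as resp:
                return resp.read()  # ExecuteResponse XML

        # Hypothetical usage:
        # xml = wps_execute("NDDI", {"scene": "MOD021KM.A2010123"})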

  20. A Geospatial Online Instruction Model

    Science.gov (United States)

    Rodgers, John C., III; Owen-Nagel, Athena; Ambinakudige, Shrinidhi

    2012-01-01

    The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model's effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based…

  1. SMART CITIES INTELLIGENCE SYSTEM (SMACiSYS) INTEGRATING SENSOR WEB WITH SPATIAL DATA INFRASTRUCTURES (SENSDI)

    OpenAIRE

    D. Bhattacharya; M. Painho

    2017-01-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is development of automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop automated integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists...

  2. From Geomatics to Geospatial Intelligent Service Science

    Directory of Open Access Journals (Sweden)

    LI Deren

    2017-10-01

    Full Text Available The paper reviews the 60 years of development from traditional surveying and mapping to today's geospatial intelligent service science. The three important stages of surveying and mapping, namely the analogue, analytical and digital stages, are summarized. The author introduces the integration of GNSS, RS and GIS (3S), which forms the rise of geospatial informatics (Geomatics). The development of geo-spatial information science in the digital earth era is analyzed, and the latest progress of geo-spatial information science towards real-time intelligent service in the smart earth era is discussed. This paper focuses on the three development levels of "Internet plus" spatial information intelligent service. In the era of big data, traditional geomatics will surely take advantage of the integration of communication, navigation, remote sensing, artificial intelligence, virtual reality and brain cognition science, and become geospatial intelligent service science, thereby making contributions to the national economy, defense and people's livelihood.

  3. Data Quality, Provenance and IPR Management services: their role in empowering geospatial data suppliers and users

    Science.gov (United States)

    Millard, Keiran

    2015-04-01

    This paper looks at the current experiences of geospatial users and suppliers and how they have been limited by the lack of suitable frameworks for managing and communicating data quality, data provenance and intellectual property rights (IPR). Current political and technological drivers mean that increasing volumes of geospatial data are available through a plethora of different products and services, and whilst this is inherently a good thing, it does create a new generation of challenges. This paper considers two examples where these issues have been examined and looks at the challenges and possible solutions from a data user and data supplier perspective. The first example is the IQmulus project, which is researching fusion environments for big geospatial point clouds and coverages. The second example is the EU Emodnet programme, which is establishing thematic data portals for public marine and coastal data. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR and numerical simulations; these data are simply too big for routine and ad-hoc analysis, yet they could yield a myriad of disparate, and readily usable, information products with the right infrastructure in place. IQmulus is researching how to deliver this infrastructure technically, but a financially sustainable delivery depends on being able to track and manage ownership and IPR across the numerous data sets being processed. This becomes complex when the data are composed of multiple overlapping coverages; however, managing this allows users to be delivered highly bespoke products that meet their budget and technical needs. The Emodnet programme delivers harmonised marine data at the EU scale across seven thematic portals. As part of the Emodnet programme, a series of 'check points' has been initiated to examine how useful these services and other public data services actually are for solving real-world problems. One key finding is that users have been confused by the fact that often

  4. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    Science.gov (United States)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights into the complex dynamics of the energy-water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; it will support uncertainty visualization and the exploration of data provenance, as well as machine-learning discoveries, to render diverse types of geospatial data and facilitate interactive analysis. Key components of the system architecture include NASA's WebWorldWind, the Globus toolkit, and PostgreSQL, as well as other custom-built software modules.

  5. Geospatial Image Stream Processing: Models, techniques, and applications in remote sensing change detection

    Science.gov (United States)

    Rueda-Velasquez, Carlos Alberto

    Detection of changes in environmental phenomena using remotely sensed data is a major requirement in the Earth sciences, especially in natural disaster related scenarios where real-time detection plays a crucial role in the saving of human lives and the preservation of natural resources. Although various approaches formulated to model multidimensional data can in principle be applied to the inherent complexity of remotely sensed geospatial data, there are still challenging peculiarities that demand a precise characterization in the context of change detection, particularly in scenarios of fast changes. In the same vein, geospatial image streams do not fit appropriately in the standard Data Stream Management System (DSMS) approach because these systems mainly deal with tuple-based streams. Recognizing the necessity for a systematic effort to address the above issues, the work presented in this thesis is a concrete step toward the foundation and construction of an integrated Geospatial Image Stream Processing framework, GISP. First, we present a data and metadata model for remotely sensed image streams. We introduce a precise characterization of images and image streams in the context of remotely sensed geospatial data. On this foundation, we define spatially-aware temporal operators with a consistent semantics for change analysis tasks. We address the change detection problem in settings where multiple image stream sources are available, and thus we introduce an architectural design for the processing of geospatial image streams from multiple sources. With the aim of targeting collaborative scientific environments, we construct a realization of our architecture based on Kepler, a robust and widely used scientific workflow management system, as the underlying computational support; and open data and Web interface standards, as a means to facilitate the interoperability of GISP instances with other processing infrastructures and client applications. We demonstrate our

  6. Library catalogues as resources for book history: case study of Novosel’s bookstore catalogue in Zagreb (1794 - 1825

    Directory of Open Access Journals (Sweden)

    Marijana Tomić

    2008-07-01

    Full Text Available The aim of the paper is to analyze the book catalogue of Novosel’s bookstore, which operated in Zagreb from 1794 to 1825, and investigate the history of books and writing in Zagreb at the turn of the 19th century. The catalogue we analyzed is believed to have been published in 1801. Bearing in mind that the market-based economy started to develop in the late 18th century, it can be stipulated that Novosel and his staff and successors based the offer in their bookstore on market analysis, i.e. their readers’ needs. The increase in offer has sparked off new advertising techniques, i.e. printing of catalogues. It follows that their book catalogue reflects the image of the cultural and intellectual status and needs of readers in those times. The paper provides a short overview of book trade in the late 18th century Zagreb and of bookstore advertisements published both in books and individually, as well as a short overview of Novosel’s bookstore business. In the analysis we partly use the methodology introduced by Robert Darnton, the so-called Darnton’s circle, which takes a holistic view of the history of books taking into consideration all stages a book needs to go through - from the author, publisher, printer, bookstores, to readers, including the author him/herself as a reader. Every element is considered in relation to other elements in the circle, and in connection with external factors such as the economic and social environment, and political and intellectual influences. The books presented in the catalogue have been analyzed using different criteria: language, genre and country where they were printed. Books printed in Croatia and those written in Croatian have been given priority. In the catalogue analysis we used the database Skupni katalog hrvatskih knjižnica (joint Croatian library catalogue in order to reconstruct the printing year and printing shops that have not been listed in the catalogues. Using this methodology, we partly

  7. Cataloguing In Special Libraries In The 1990s

    Directory of Open Access Journals (Sweden)

    Elizabeth Makin

    1996-01-01

    Full Text Available Cataloguing in special libraries has been virtually ignored in the literature since the turn of the century, although there are many books and papers on cataloguing in general. It is not clear why this should be so, since it can be argued that the needs of special libraries are different from those of public, academic and national libraries. Special libraries are primarily interested in the information content of documents in the sense that they have little or no interest in documents except as "packages" in which information may be encapsulated. It is therefore reasonable to assume, a priori, that special libraries would undertake detailed indexing and light cataloguing, perhaps reducing the catalogue to the status of a finding list. This paper reports the results of a survey of current cataloguing practice in special libraries.

  8. A Global Geospatial Database of 5000+ Historic Flood Event Extents

    Science.gov (United States)

    Tellman, B.; Sullivan, J.; Doyle, C.; Kettner, A.; Brakenridge, G. R.; Erickson, T.; Slayback, D. A.

    2017-12-01

    A key dataset that is missing for global flood model validation and for understanding historic spatial flood vulnerability is a global historical geo-database of flood event extents. Decades of Earth-observing satellites and cloud computing now make it possible not only to detect floods in near real time, but also to run these water detection algorithms back in time to capture the spatial extent of large numbers of specific events. This talk will show results from the largest global historical flood database developed to date. We use the Dartmouth Flood Observatory flood catalogue to map over 5000 floods (from 1985-2017) using the MODIS, Landsat, and Sentinel-1 satellites. All events are available for public download via the Earth Engine Catalogue and via a website that allows the user to query floods by area or date, assess population exposure trends over time, and download flood extents in geospatial format. In this talk, we will highlight major trends in global flood exposure per continent, land use type, and eco-region. We will also make suggestions on how to use this dataset in conjunction with other global datasets to i) validate global flood models, ii) assess the potential role of climatic change in flood exposure, iii) understand how urbanization and other land change processes may influence spatial flood exposure, iv) assess how innovative flood interventions (e.g. wetland restoration) influence flood patterns, v) control for event magnitude to assess the role of social vulnerability and damage assessment, and vi) aid in rapid probabilistic risk assessment to enable microinsurance markets. The authors of this paper are already using the database for the latter three applications and will show examples of wetland intervention analysis in Argentina, social vulnerability analysis in the USA, and microinsurance in India.
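
    For readers who want to explore the flood extents programmatically, the sketch below uses the Earth Engine Python API. The collection ID and the filtering shown are assumptions about how the public Global Flood Database is typically exposed, so they should be checked against the current Earth Engine data catalogue entry.

        import ee  # Earth Engine Python API; requires prior authentication

        ee.Initialize()

        # Collection ID is an assumption; check the current Earth Engine data
        # catalogue entry for the Global Flood Database before relying on it.
        floods = ee.ImageCollection("GLOBAL_FLOOD_DB/MODIS_EVENTS/V1")

        # Flood events intersecting a point of interest during 2010-2011.
        poi = ee.Geometry.Point([88.36, 22.57])
        events = (floods
                  .filterDate("2010-01-01", "2012-01-01")
                  .filterBounds(poi))
        print(events.size().getInfo(), "events found near the point")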

  9. Updated earthquake catalogue for seismic hazard analysis in Pakistan

    Science.gov (United States)

    Khan, Sarfraz; Waseem, Muhammad; Khan, Muhammad Asif; Ahmed, Waqas

    2018-03-01

    A reliable and homogenized earthquake catalogue is essential for seismic hazard assessment in any area. This article describes the compilation and processing of an updated earthquake catalogue for Pakistan. The earthquake catalogue compiled in this study for the region (a quadrangle bounded by the geographical limits 20-40° N and 40-83° E) includes 36,563 earthquake events, reported in the moment magnitude (MW) range 4.0-8.3 and spanning from 25 AD to 2016. Relationships are developed between moment magnitude and the body-wave and surface-wave magnitude scales to unify the catalogue in terms of MW. The catalogue includes earthquakes from Pakistan and neighbouring countries to minimize the effects of geopolitical boundaries in seismic hazard assessment studies. Earthquakes reported by local and international agencies as well as individual catalogues are included. The proposed catalogue is further used to obtain the magnitude of completeness after removal of dependent events using four different algorithms. Finally, seismicity parameters of the seismic sources are reported, and recommendations are made for seismic hazard assessment studies in Pakistan.
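
    The homogenization step mentioned above is usually done with empirical linear regressions of the form MW = a*M + b per reported scale. The sketch below only illustrates that step; the coefficients are placeholders and not the relationships derived in the paper.

        # Illustrative magnitude homogenisation: convert mb or Ms to Mw with
        # linear regressions Mw = a*M + b. The coefficients below are placeholders;
        # the paper derives its own region-specific relationships.
        CONVERSIONS = {
            "mb": (1.0, 0.6),   # placeholder (a, b) for Mw = a*mb + b
            "Ms": (0.67, 2.1),  # placeholder (a, b) for Mw = a*Ms + b
            "Mw": (1.0, 0.0),
        }

        def to_mw(magnitude, scale):
            a, b = CONVERSIONS[scale]
            return a * magnitude + b

        print(round(to_mw(5.8, "Ms"), 2))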

  10. Database Organisation in a Web-Enabled Free and Open-Source Software (foss) Environment for Spatio-Temporal Landslide Modelling

    Science.gov (United States)

    Das, I.; Oberai, K.; Sarathi Roy, P.

    2012-07-01

    Landslides exhibit themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth's surface. Making landslide databases available online via the WWW (World Wide Web) promotes the spreading and reaching out of landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free and Open Source Software (FOSS). FOSS reduces the cost of the project drastically, as proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets, along with the landslide susceptibility map, were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front-end and PostgreSQL with the PostGIS extension serves as the back-end for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings an insight into the understanding of landslides and the resulting damage closer to the affected people and the user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.
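
    Since the susceptibility layer is exposed as an OGC WFS, any OGC-compliant client can retrieve it. The sketch below issues a plain GetFeature request with Python's standard library; the endpoint and type name are placeholders, not the actual service described here.

        from urllib.parse import urlencode
        from urllib.request import urlopen

        # Plain WFS 1.1.0 GetFeature request; endpoint and type name are placeholders.
        WFS_ENDPOINT = "http://example.org/cgi-bin/mapserv?map=landslide.map"

        def get_features(typename, max_features=50):
            params = {
                "service": "WFS",
                "version": "1.1.0",
                "request": "GetFeature",
                "typename": typename,
                "maxfeatures": max_features,
            }
            with urlopen(f"{WFS_ENDPOINT}&{urlencode(params)}") as resp:
                return resp.read()  # GML feature collection

        # Hypothetical usage:
        # gml = get_features("landslide_susceptibility")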

  11. Smart sensor-based geospatial architecture for dike monitoring

    Science.gov (United States)

    Herle, S.; Becker, R.; Blankenbach, J.

    2016-04-01

    Artificial hydraulic structures like dams or dikes used for water level regulation or flood prevention are continuously under the influence of the weather and variable river regimes. Thus, ongoing monitoring and simulation are crucial in order to determine the inner condition. Potentially life-threatening situations, in the extreme case a failure, must be counteracted by all available means. Nowadays flood warning systems rely exclusively on water level forecasts without considering the state of the structure itself. Area-covering, continuous knowledge of the inner state, including time-dependent changes, increases the capability of recognizing and locating vulnerable spots for early treatment. In case of a predicted breach, the advance warning time for alerting affected citizens can be extended. Our approach is composed of smart sensors integrated in a service-oriented geospatial architecture to monitor and simulate artificial hydraulic structures continuously. The sensors observe the inner state of the construction, such as soil moisture or stress and deformation over time, but also various external influences like water levels or wind speed. They are interconnected in a distributed network architecture by a so-called sensor bus system based on lightweight protocols like Message Queue Telemetry Transport for Sensor Networks (MQTT-SN). These sensor data streams are transferred into an OGC Sensor Web Enablement (SWE) data structure, providing high-level geo web services to end users. Bundled with third-party geo web services (WMS etc.), powerful processing and simulation tools can be invoked using the Web Processing Service (WPS) standard. Results will be visualized in a geoportal allowing user access to all information.
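
    To illustrate the lightweight sensor-bus idea, the sketch below publishes a soil-moisture reading over plain MQTT with the paho-mqtt client (MQTT-SN, used on constrained links, typically requires a gateway in front of a standard broker). The broker address, topic naming and payload layout are assumptions, not the project's actual schema.

        import json
        import time
        import paho.mqtt.client as mqtt  # plain MQTT; MQTT-SN needs a gateway

        # Broker address, topic naming and payload layout are assumptions made
        # for illustration; they are not the project's actual sensor-bus schema.
        BROKER = "broker.example.org"
        TOPIC = "dike/section-07/soil_moisture"

        client = mqtt.Client()
        client.connect(BROKER, 1883, keepalive=60)

        reading = {"sensor_id": "sm-0042", "value_pct": 31.5, "ts": time.time()}
        client.publish(TOPIC, json.dumps(reading), qos=1)
        client.disconnect()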

  12. Catalogue of Korean manuscripts and rare books

    DEFF Research Database (Denmark)

    Lerbæk Pedersen, Bent

    2014-01-01

    Catalogue of Korean manuscripts and rare books in The Royal Library, Copenhagen and the National Museum of Denmark.

  13. Catalogue of Icelandic Volcanoes

    Science.gov (United States)

    Ilyinskaya, Evgenia; Larsen, Gudrún; Gudmundsson, Magnús T.; Vogfjörd, Kristin; Jonsson, Trausti; Oddsson, Björn; Reynisson, Vidir; Pagneux, Emmanuel; Barsotti, Sara; Karlsdóttir, Sigrún; Bergsveinsson, Sölvi; Oddsdóttir, Thorarna

    2017-04-01

    The Catalogue of Icelandic Volcanoes (CIV) is a newly developed open-access web resource (http://icelandicvolcanoes.is) intended to serve as an official source of information about volcanoes in Iceland for the public and decision makers. CIV contains text and graphic information on all 32 active volcanic systems in Iceland, as well as real-time data from monitoring systems in a format that enables non-specialists to understand the volcanic activity status. The CIV data portal contains scientific data on all eruptions since Eyjafjallajökull 2010 and is an unprecedented endeavour in making volcanological data open and easy to access. CIV forms a part of an integrated volcanic risk assessment project in Iceland GOSVÁ (commenced in 2012), as well as being part of the European Union funded effort FUTUREVOLC (2012-2016) on establishing an Icelandic volcano supersite. The supersite concept implies integration of space and ground based observations for improved monitoring and evaluation of volcanic hazards, and open data policy. This work is a collaboration of the Icelandic Meteorological Office, the Institute of Earth Sciences at the University of Iceland, and the Civil Protection Department of the National Commissioner of the Iceland Police, with contributions from a large number of specialists in Iceland and elsewhere.

  14. Central Asia earthquake catalogue from ancient time to 2009

    Directory of Open Access Journals (Sweden)

    Natalya N. Mikhailova

    2015-04-01

    Full Text Available In this work, we present the seismic catalogue compiled for Central Asia (Kazakhstan, Kyrgyzstan, Tajikistan, Uzbekistan and Turkmenistan) in the framework of the Earthquake Model Central Asia (EMCA) project. The catalogue, spanning from 2000 B.C. to 2009 A.D., is composed of 33,034 earthquakes in the MLH magnitude range from 1.5 to 8.3 (the magnitude from surface waves on horizontal components, widely used in practice in the former USSR countries). The catalogue includes both macroseismically and instrumentally constrained data, with about 32,793 earthquakes after 1900 A.D. The main sources and procedures used to compile the catalogue are discussed, and a comparison with the ISC-GEM catalogue is presented. Magnitude of completeness analysis shows that the catalogue is complete down to magnitude 4 from 1959 and to magnitude 7 from 1873, whereas the obtained regional b value is 0.805.
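
    A b value such as the 0.805 quoted above can be estimated from a complete subset of a catalogue with the standard Aki (1965) maximum-likelihood formula, b = log10(e) / (mean(M) - (Mc - dM/2)), where Mc is the completeness magnitude and dM the binning width. A minimal sketch, with illustrative magnitudes rather than EMCA data:

        import math

        def b_value_mle(magnitudes, mc, dm=0.1):
            """Aki (1965) maximum-likelihood b value for events with M >= mc.

            dm is the magnitude binning width; magnitudes below mc are ignored.
            """
            complete = [m for m in magnitudes if m >= mc]
            mean_m = sum(complete) / len(complete)
            return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

        # Illustrative magnitudes only, not the EMCA catalogue itself.
        sample = [4.1, 4.3, 4.0, 5.2, 4.6, 4.8, 4.2, 6.1, 4.4, 4.0]
        print(round(b_value_mle(sample, mc=4.0), 2))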

  15. Geospatial field applications within United States Department of Agriculture, Veterinary Services.

    Science.gov (United States)

    FitzMaurice, Priscilla L; Freier, Jerome E; Geter, Kenneth D

    2007-01-01

    Epidemiologists, veterinary medical officers and animal health technicians within Veterinary Services (VS) are actively utilising global positioning system (GPS) technology to obtain positional data on livestock and poultry operations throughout the United States. Geospatial data, if acquired for monitoring and surveillance purposes, are stored within the VS Generic Database (GDB). If the information is collected in response to an animal disease outbreak, the data are entered into the Emergency Management Response System (EMRS). The Spatial Epidemiology group within the Centers for Epidemiology and Animal Health (CEAH) has established minimum data accuracy standards for geodata acquisition. To ensure that field-collected geographic coordinates meet these minimum standards, field personnel are trained in proper data collection procedures. Positional accuracy is validated with digital atlases, aerial photographs, Web-based parcel maps, or address geocoding. Several geospatial methods and technologies are under investigation for future use within VS. These include the direct transfer of coordinates from GPS receivers to computers, GPS-enabled digital cameras, tablet PCs, and GPS receivers preloaded with custom ArcGIS maps - all with the objective of reducing transcription and data entry errors and improving the ease of data collection in the field.

  16. Addressing the Challenge: Cataloguing Electronic Books in Academic Libraries

    Directory of Open Access Journals (Sweden)

    Shuzhen Zhao

    2010-03-01

    Full Text Available Objective - This paper explores the various issues and challenges arising from e-book cataloguing experienced at the University of Windsor's Leddy Library and the Ontario Council of University Libraries (OCUL). This discussion uses an evidence based approach to identify and resolve issues relevant to academic libraries as well as to consortia. With the ever rising popularity of e-books within academic libraries, cataloguing librarians are actively seeking more effective methods of managing this new electronic medium, including the development of new cataloguing policies and procedures. This paper will explore the various issues and challenges surrounding e-book cataloguing and processing within academic libraries, and will identify new policies and procedures that may be used to effectively assist in e-book management. Methods - This paper presents a case study of e-book cataloguing practices undertaken by a Canadian academic library and the consortium with which it is affiliated. Towards this end, the University of Windsor's Leddy Library will be the prime focus of this study, with its establishment of a new e-book MARC records database. The research is based on the results of the e-book MARC project undertaken by the Leddy Library and the Ontario Council of University Libraries (OCUL). Through analysis of various suppliers' MARC records and the actual implementation of the e-book MARC project, the authors developed and evaluated a new approach to e-book cataloguing for use in academic libraries. Results - This practice-based approach towards the development of a new method of e-book cataloguing required continual modification and examination of e-book MARC records within the target library. The Leddy Library's e-book MARC project provided an excellent opportunity to test the library's existing cataloguing standards and procedures for print format, while at the same time identifying related e-book issues

  17. A FRAMEWORK FOR AN OPEN SOURCE GEOSPATIAL CERTIFICATION MODEL

    Directory of Open Access Journals (Sweden)

    T. U. R. Khan

    2016-06-01

    Full Text Available The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards are gaining significance in the geospatial and IT arena as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission “Making geospatial education and opportunities accessible to all”. Discussions in this initiative and the growth and maturity of geospatial Open Source software prompted the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and modes of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body of Knowledge, the USGIF Essential Body of Knowledge, the “Geographic Information: Need to Know” curriculum, currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and substantially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and

  18. A Framework for an Open Source Geospatial Certification Model

    Science.gov (United States)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards are gaining significance in the geospatial and IT arena as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software prompted the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and modes of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body of Knowledge, the USGIF Essential Body of Knowledge, the "Geographic Information: Need to Know" curriculum, currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and substantially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  19. Visualization and Ontology of Geospatial Intelligence

    Science.gov (United States)

    Chan, Yupo

    Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced in the prevalence of location-based services, as afforded by ubiquitous cell phone usage. It is also manifested by the popularity of such internet engines as Google Earth. As we commute to work, or travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.

  20. On the moroccan tsunami catalogue

    Directory of Open Access Journals (Sweden)

    F. Kaabouben

    2009-07-01

    Full Text Available A primary tool for regional tsunami hazard assessment is a reliable historical and instrumental catalogue of events. Owing to its geographical situation, with two coastlines, one stretching along the Atlantic to the west and the other along the Mediterranean to the north, Morocco is the country in Western Africa most exposed to the risk of tsunamis. Previous information on tsunami events affecting Morocco is included in the Iberian and/or Mediterranean lists of tsunami events, as is the case with the European GITEC Tsunami Catalogue, but there is a need to organize this information in a dedicated dataset and to assess the likelihood of claimed historical tsunamis in Morocco. Because Moroccan sources are scarce, this compilation relies on historical documentation from neighbouring countries (Portugal and Spain), and so the compatibility between the new tsunami catalogue presented here and those that cover the same source areas is also discussed.

  1. Catalogue of HI PArameters (CHIPA)

    Science.gov (United States)

    Saponara, J.; Benaglia, P.; Koribalski, B.; Andruchow, I.

    2015-08-01

    The Catalogue of HI Parameters of galaxies (CHIPA) is the natural continuation of the compilation by M.C. Martin in 1998. CHIPA provides the most important parameters of nearby galaxies derived from observations of the neutral hydrogen line. The catalogue contains information on 1400 galaxies across the sky and of different morphological types. Parameters such as the optical diameter of the galaxy, the blue magnitude, the distance, the morphological type, and the HI extension are listed, among others. Maps of the HI distribution, velocity, and velocity dispersion can also be displayed in some cases. The main objective of this catalogue is to facilitate bibliographic queries through a database accessible from the internet that will be available in 2015 (the website is under construction). The database was built using MySQL, an open source relational database management system based on SQL (Structured Query Language), while the website was built with HTML (Hypertext Markup Language) and PHP (Hypertext Preprocessor).

  2. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    Science.gov (United States)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic under a distributed computing environment. In order to make use of these local resources together to solve larger geospatial information processing problems related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a concept termed the Equivalent Distributed Program of global geospatial queries to solve geospatial distributed computing problems under heterogeneous GIS environments. First, we present a geospatial query processing schema for distributed computing, together with a method for the equivalent transformation of a global geospatial query into distributed local queries at the SQL (Structured Query Language) level, to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment consisting of autonomous geospatial information resources, so as to achieve decentralized and consistent synchronization among global geospatial resource directories and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
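
    As an editorial illustration of the query-decomposition idea summarised above, the following minimal Python sketch maps a global bounding-box query onto equivalent local SQL queries for the peers whose extents intersect it. It is not the authors' implementation; the peer directory, table name, and column names are invented for illustration.

        # Hypothetical sketch: transforming a "global" geospatial query into
        # equivalent local SQL queries for loosely coupled peer resources.
        from typing import Dict

        # Each peer advertises the bounding box it can answer queries for
        # (min_lon, min_lat, max_lon, max_lat); values are invented.
        PEER_DIRECTORY: Dict[str, tuple] = {
            "peer_a": (-10.0, 35.0, 5.0, 45.0),
            "peer_b": (5.0, 35.0, 20.0, 45.0),
        }

        def intersects(a: tuple, b: tuple) -> bool:
            """Axis-aligned bounding-box intersection test."""
            return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

        def decompose(global_bbox: tuple) -> Dict[str, str]:
            """Map a global query window onto per-peer local SQL queries."""
            local_queries: Dict[str, str] = {}
            for peer, extent in PEER_DIRECTORY.items():
                if intersects(global_bbox, extent):
                    local_queries[peer] = (
                        "SELECT id, geom FROM roads "
                        f"WHERE lon BETWEEN {global_bbox[0]} AND {global_bbox[2]} "
                        f"AND lat BETWEEN {global_bbox[1]} AND {global_bbox[3]};"
                    )
            return local_queries

        for peer, sql in decompose((0.0, 36.0, 12.0, 44.0)).items():
            print(peer, "->", sql)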

  3. Catalogue of Exoplanets in Multiple-Star-Systems

    Science.gov (United States)

    Schwarz, Richard; Funk, Barbara; Bazsó, Ákos; Pilat-Lohinger, Elke

    2017-07-01

    Cataloguing the data of exoplanetary systems is becoming more and more important, because such catalogues summarise the observations and support theoretical studies. Since 1995 there has been a database which lists most of the known exoplanets (The Extrasolar Planets Encyclopaedia is available at http://exoplanet.eu/ and described in Schneider et al. 2011). With the growing number of exoplanets detected in binary and multiple star systems, it became important to mark them and collect them in a separate database. We therefore started to compile a catalogue for binary and multiple star systems. Since 2013 the catalogue can be found at http://www.univie.ac.at/adg/schwarz/multiple.html (a description can be found in Schwarz et al. 2016), which will be updated regularly and is linked to the Extrasolar Planets Encyclopaedia. The data of the binary catalogue can be downloaded as a file (.csv) and used for statistical purposes. Our database is divided into two parts: the data of the stars and of the planets, given in separate lists. Every column of the lists can be sorted in two directions: ascending (from the lowest value to the highest) or descending. In addition, an introduction and help are given in the menu bar of the catalogue, including an example list.

  4. Catalogue of Meteor Showers and Storms in Korean History

    Directory of Open Access Journals (Sweden)

    Sang-Hyeon Ahn

    2004-03-01

    Full Text Available We present a more complete and accurate catalogue of astronomical records of meteor showers and meteor storms that appeared in primary official Korean history books, such as the Samguk-sagi, Koryo-sa, Seungjeongwon-ilgi, and Choson-Wangjo-Sillok. So far the catalogue compiled by Imoto and Hasegawa in 1958 has been widely used in the international astronomical community. That catalogue is based on a report by Sekiguchi from 1917 that draws mainly on secondary history books. We observed that the catalogue has a number of errors in either the dates or the sources of the records. We have therefore thoroughly checked the primary official history books, instead of the secondary ones, in order to produce a corrected and extended catalogue. The catalogue contains 25 records of meteor storms, four records of intense meteor showers, and five records of ordinary showers in Korean history. We also find that some of those records seem to correspond to presently active meteor showers such as the Leonids, the Perseids, and the η-Aquarids-Orionids pair. However, a large number of those records do not correspond to such present showers. The catalogue we obtained can be useful for various astrophysical studies in the future.

  5. Technology Catalogue. First edition

    Energy Technology Data Exchange (ETDEWEB)

    1994-02-01

    The Department of Energy's Office of Environmental Restoration and Waste Management (EM) is responsible for remediating its contaminated sites and managing its waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste management programs within EM's Office of Environmental Restoration and Office of Waste Management. The purpose of the Technology Catalogue is to provide performance data on OTD-developed technologies to scientists and engineers assessing and recommending technical solutions within the Department's clean-up and waste management programs, as well as to industry, other federal and state agencies, and the academic community. OTD's applied research and demonstration activities are conducted in programs referred to as Integrated Demonstrations (IDs) and Integrated Programs (IPs). The IDs test and evaluate systems, consisting of coupled technologies, at specific sites to address generic problems, such as the sensing, treatment, and disposal of buried waste containers. The IPs support applied research activities in specific application areas, such as in situ remediation, efficient separations processes, and site characterization. The Technology Catalogue is a means for communicating the status of the development of these innovative technologies. The FY93 Technology Catalogue features technologies successfully demonstrated in the field through IDs and sufficiently mature to be used in the near term. Technologies from the following IDs are featured in the FY93 Technology Catalogue: Buried Waste ID (Idaho National Engineering Laboratory, Idaho); Mixed Waste Landfill ID (Sandia National Laboratories, New Mexico); Underground Storage Tank ID (Hanford, Washington); Volatile organic compound (VOC) Arid ID (Richland, Washington); and VOC Non-Arid ID (Savannah River Site, South Carolina).

  6. Competencies and materials for repositioning cataloguers for ...

    African Journals Online (AJOL)

    The purpose of this study was to determine the competencies and materials for repositioning cataloguers for information management in an electronic era. The survey method was adopted for the research design, using a questionnaire for data collection. The population comprised 44 cataloguers in 12 universities in ...

  7. Discovering Land Cover Web Map Services from the Deep Web with JavaScript Invocation Rules

    Directory of Open Access Journals (Sweden)

    Dongyang Hou

    2016-06-01

    Full Text Available Automatic discovery of isolated land cover web map services (LCWMSs) can potentially help in sharing land cover data. Currently, various search engine-based and crawler-based approaches have been developed for finding services dispersed throughout the surface web. In fact, with the prevalence of geospatial web applications, a considerable number of LCWMSs are hidden in JavaScript code, which belongs to the deep web. However, discovering LCWMSs from JavaScript code remains an open challenge. This paper aims to solve this challenge by proposing a focused deep web crawler for finding more LCWMSs from deep web JavaScript code and the surface web. First, the names of a group of JavaScript links are abstracted as initial judgements. Through name matching, these judgements are used to determine whether or not the fetched webpages contain predefined JavaScript links that may prompt JavaScript code to invoke WMSs. Second, some JavaScript invocation functions and URL formats for WMS are summarized as JavaScript invocation rules from prior knowledge of how WMSs are employed and coded in JavaScript. These invocation rules are used to identify the JavaScript code from which candidate WMSs are extracted through rule matching. The above two operations are incorporated into a traditional focused crawling strategy, situated between the tasks of fetching webpages and parsing webpages. Third, LCWMSs are selected by matching services against a set of land cover keywords. Moreover, a search engine for LCWMSs is implemented that uses the focused deep web crawler to retrieve and integrate the LCWMSs it discovers. In the first experiment, eight online geospatial web applications serve as seed URLs (Uniform Resource Locators) and crawling scopes; the proposed crawler addresses only the JavaScript code in these eight applications. All 32 available WMSs hidden in JavaScript code were found using the proposed crawler, while not one WMS was discovered through the focused crawler
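
    To make the idea of JavaScript invocation rules concrete, the short Python sketch below scans a snippet of JavaScript source for URLs that look like WMS endpoints, using simple regular-expression rules. The patterns and the sample snippet are illustrative only and are not the rules used in the paper.

        # Minimal sketch: extract candidate WMS endpoints from JavaScript code
        # by matching quoted URLs against simple WMS-related invocation rules.
        import re

        URL_PATTERN = re.compile(r"""["'](https?://[^"']+?)["']""", re.IGNORECASE)
        WMS_HINT = re.compile(r"service=wms|request=getcapabilities|/wms\b", re.IGNORECASE)

        def extract_candidate_wms(js_code: str) -> list:
            """Return URLs in JavaScript code that look like WMS endpoints."""
            return [m.group(1) for m in URL_PATTERN.finditer(js_code)
                    if WMS_HINT.search(m.group(1))]

        sample_js = """
        var layer = new TileWMS({
          url: 'http://example.org/geoserver/landcover/wms?service=WMS&request=GetCapabilities'
        });
        """
        print(extract_candidate_wms(sample_js))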

  8. Planck 2015 results. XXVIII. The Planck Catalogue of Galactic Cold Clumps

    CERN Document Server

    Ade, P.A.R.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Catalano, A.; Chamballu, A.; Chiang, H.C.; Christensen, P.R.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T.A.; Eriksen, H.K.; Falgarone, E.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.L.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, A.H.; Jaffe, T.R.; Jones, W.C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T.S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C.R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P.B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P.M.; Macías-Pérez, J.F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Marshall, D.J.; Martin, P.G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Mazzotta, P.; McGehee, P.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J.A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Nørgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T.J.; Pelkonen, V.-M.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Pratt, G.W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J.P.; Reach, W.T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Stolyarov, V.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J.A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L.A.; Wandelt, B.D.; Wehus, I.K.; Yvon, D.; Zacchei, A.

    2016-01-01

    We present the Planck Catalogue of Galactic Cold Clumps (PGCC), an all-sky catalogue of Galactic cold clump candidates detected by Planck. This catalogue is the full version of the Early Cold Core (ECC) catalogue, which was made available in 2011 with the Early Release Compact Source Catalogue (ERCSC) and contained 915 high S/N sources. It is based on the Planck 48-month mission data that are currently being released to the astronomical community. The PGCC catalogue is an observational catalogue consisting exclusively of Galactic cold sources. The three highest Planck bands (857, 545, and 353 GHz) have been combined with IRAS data at 3 THz to perform a multi-frequency detection of sources colder than their local environment. After rejection of possible extragalactic contaminants, the PGCC catalogue contains 13188 Galactic sources spread across the whole sky, i.e., from the Galactic plane to high latitudes, following the spatial distribution of the main molecular cloud complexes. The median temperature of PGCC so...

  9. Economic Assessment of the Use Value of Geospatial Information

    Directory of Open Access Journals (Sweden)

    Richard Bernknopf

    2015-07-01

    Full Text Available Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.

  10. Economic assessment of the use value of geospatial information

    Science.gov (United States)

    Bernknopf, Richard L.; Shapiro, Carl D.

    2015-01-01

    Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
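
    The VOI definition used in both versions of this record reduces to a simple difference of discounted net benefits. The toy Python sketch below makes that arithmetic explicit; the cash-flow numbers and discount rate are invented for illustration and do not come from the study.

        # Toy illustration of the value of information (VOI) described above:
        # VOI = present value of net benefits with the geospatial data
        #     - present value of net benefits without it.
        def present_value(cash_flows, discount_rate):
            """Discount a list of annual net benefits back to present value."""
            return sum(cf / (1.0 + discount_rate) ** t for t, cf in enumerate(cash_flows))

        with_information = [2.0, 2.5, 2.5, 3.0]       # hypothetical annual net benefits
        without_information = [1.0, 1.2, 1.5, 1.5]    # (e.g., millions of dollars)
        rate = 0.05

        voi = present_value(with_information, rate) - present_value(without_information, rate)
        print(f"Value of geospatial information: {voi:.2f} million (present value)")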

  11. WYSIWYG GEOPROCESSING: COUPLING SENSOR WEB AND GEOPROCESSING SERVICES IN VIRTUAL GLOBES

    Directory of Open Access Journals (Sweden)

    X. Zhai

    2012-08-01

    Full Text Available We propose to advance the scientific understanding and applications of geospatial data by coupling Sensor Web and Geoprocessing Services in Virtual Globes for higher-education teaching and research. The vision is the concept of "What You See Is What You Get" geoprocessing, known for short as WYSIWYG geoprocessing. Virtual Globes offer tremendous opportunities, such as providing a learning tool that helps educational users and researchers digest global-scale geospatial information about the world, and acting as WYSIWYG platforms where domain experts can immediately see the results of their actions in an interactive three-dimensional virtual environment. In the meantime, Sensor Web and Web Service technologies make a large number of Earth observing sensors and geoprocessing functionalities as easily accessible to educational users and researchers as their local resources. Coupling Sensor Web and Geoprocessing Services in Virtual Globes will bring a virtual learning and research environment to the desktops of students and professors, empowering them with WYSIWYG geoprocessing capabilities. The implementation combines the visualization and communication power of Virtual Globes with the on-demand data collection and analysis functionalities of Sensor Web and geoprocessing services, to help students and researchers investigate various scientific problems in an environment with a natural and intuitive user experience. The work will contribute to the scientific and educational activities of geoinformatics communities in that they will have an easily accessible platform that helps them perceive world space and perform live geoscientific processes.
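
    As a small, hedged illustration of the visualisation half of this coupling (not taken from the paper), the sketch below writes a single sensor observation as a KML placemark that a virtual globe such as Google Earth can display. The station name, coordinates, and value are invented, and the simplekml package is assumed to be available.

        # Publish one (hypothetical) sensor observation as a KML placemark
        # for display in a virtual globe.
        import simplekml

        kml = simplekml.Kml()
        point = kml.newpoint(name="Station A (hypothetical)",
                             coords=[(114.30, 30.50)])       # (lon, lat)
        point.description = "Air temperature: 23.4 degC, observed 2012-08-01T12:00Z"
        kml.save("observations.kml")                          # open in a virtual globe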

  12. Geospatial Modeling of Asthma Population in Relation to Air Pollution

    Science.gov (United States)

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Young, John H.; Luvall, Jeffrey C.; Alhamdan, Mohammad

    2013-01-01

    Current observations indicate that asthma is growing every year in the United States, but the specific reasons for this are not well understood. This study stems from an ongoing research effort to investigate the spatio-temporal behavior of asthma and its relatedness to air pollution. The association between environmental variables such as air quality and asthma-related health issues across the state of Mississippi is investigated using Geographic Information Systems (GIS) tools and applications. Health data concerning asthma were obtained from the Mississippi State Department of Health (MSDH) for the 9-year period 2003-2011, and data on air pollutant concentrations (PM2.5) were collected from USEPA web resources; these are analyzed geospatially to establish the impacts of air quality on human health, specifically related to asthma. Disease mapping using geospatial techniques provides valuable insights into the spatial nature, variability, and association of asthma with air pollution. Asthma patient hospitalization data for Mississippi have been analyzed and mapped using quantitative choropleth techniques in ArcGIS, with patients geocoded to their respective ZIP codes. Potential air pollutant sources such as Interstate highways and industries, together with other land use data, have been integrated in a common geospatial platform to understand their adverse contribution to human health. Existing hospitals and emergency clinics are being incorporated into the analysis to further understand their proximity and ease of access to patient locations. At the current level of analysis and understanding, the spatial distribution of asthma is observed in the populations of ZIP code regions on the Gulf Coast, along the southern interstates, and in counties of northeast Mississippi. It is also found that asthma is prevalent in most of the urban population. This GIS-based project would be useful for health risk assessment and for providing information support to administrators and decision makers when establishing satellite clinics in the future.
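
    A minimal sketch of the kind of choropleth mapping described above, using GeoPandas rather than ArcGIS; the shapefile name, column names, and rate calculation are assumptions for illustration only.

        # Choropleth of asthma hospitalization rates by ZIP code (hypothetical inputs).
        import geopandas as gpd
        import matplotlib.pyplot as plt

        zips = gpd.read_file("ms_zipcodes.shp")                          # ZIP code polygons
        zips["rate"] = zips["asthma_cases"] / zips["population"] * 10_000

        ax = zips.plot(column="rate", cmap="OrRd", legend=True, figsize=(8, 8))
        ax.set_title("Asthma hospitalizations per 10,000 residents (hypothetical data)")
        plt.savefig("asthma_choropleth.png", dpi=150)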

  13. An atlas of classification. Signage between open shelves, the Web and the catalogue

    Directory of Open Access Journals (Sweden)

    Andrea Fabbrizzi

    2014-05-01

    This signage is based on cross-media communication and integrates the library's modes of communication at several levels, both within the same medium and across different media: between the signs at the ends of the shelves, between these signs and the library's website, and between the website and the catalogue. Mobile devices such as tablets and smartphones are particularly well suited to this integrated system, because they make it possible to access the Web while moving among the shelves. The direct link between the classified open shelves and the catalogue is made possible by QR codes printed on the signs.
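
    As an aside, generating the QR codes that link shelf-end signs to the web catalogue is straightforward; the sketch below uses the Python qrcode package with a hypothetical catalogue URL.

        # Generate a QR code linking a shelf-end sign to a class in the web catalogue.
        import qrcode

        catalogue_url = "https://catalogue.example.org/browse?class=025.3"  # hypothetical
        img = qrcode.make(catalogue_url)   # returns an image object
        img.save("shelf_025_3_qr.png")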

  14. Second ROSAT all-sky survey (2RXS) source catalogue

    Science.gov (United States)

    Boller, Th.; Freyberg, M. J.; Trümper, J.; Haberl, F.; Voges, W.; Nandra, K.

    2016-04-01

    Aims: We present the second ROSAT all-sky survey source catalogue, hereafter referred to as the 2RXS catalogue. This is the second publicly released ROSAT catalogue of point-like sources obtained from the ROSAT all-sky survey (RASS) observations performed with the position-sensitive proportional counter (PSPC) between June 1990 and August 1991, and is an extended and revised version of the bright and faint source catalogues. Methods: We used the latest version of the RASS processing to produce overlapping X-ray images of 6.4° × 6.4° sky regions. To create a source catalogue, a likelihood-based detection algorithm was applied to these, which accounts for the variable point-spread function (PSF) across the PSPC field of view. Improvements in the background determination compared to 1RXS were also implemented. X-ray control images showing the source and background extraction regions were generated, which were visually inspected. Simulations were performed to assess the spurious source content of the 2RXS catalogue. X-ray spectra and light curves were extracted for the 2RXS sources, with spectral and variability parameters derived from these products. Results: We obtained about 135 000 X-ray detections in the 0.1-2.4 keV energy band down to a likelihood threshold of 6.5, as adopted in the 1RXS faint source catalogue. Our simulations show that the expected spurious content of the catalogue is a strong function of detection likelihood, and the full catalogue is expected to contain about 30% spurious detections. A more conservative likelihood threshold of 9, on the other hand, yields about 71 000 detections with a 5% spurious fraction. We recommend thresholds appropriate to the scientific application. X-ray images and overlaid X-ray contour lines provide an additional user product to evaluate the detections visually, and we performed our own visual inspections to flag uncertain detections. Intra-day variability in the X-ray light curves was quantified based on the
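
    The trade-off between catalogue depth and spurious fraction comes down to the detection-likelihood cut applied to each source. The toy sketch below filters a few invented catalogue rows at the two thresholds quoted above; the rows and field name are illustrative, not 2RXS data.

        # Toy sketch: stricter likelihood cuts keep fewer sources but reduce
        # the expected spurious fraction (rows below are invented).
        detections = [
            {"name": "SRC1", "det_likelihood": 7.2},
            {"name": "SRC2", "det_likelihood": 12.8},
            {"name": "SRC3", "det_likelihood": 6.6},
            {"name": "SRC4", "det_likelihood": 25.1},
        ]

        def select(catalogue, threshold):
            """Keep only detections at or above the chosen likelihood threshold."""
            return [row for row in catalogue if row["det_likelihood"] >= threshold]

        for threshold in (6.5, 9.0):
            kept = select(detections, threshold)
            print(f"threshold {threshold}: kept {[row['name'] for row in kept]}")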

  15. A 'new generation' earthquake catalogue

    Directory of Open Access Journals (Sweden)

    E. Boschi

    2000-06-01

    Full Text Available In 1995, we published the first release of the Catalogo dei Forti Terremoti in Italia, 461 a.C. - 1980, in Italian (Boschi et al., 1995). Two years later this was followed by a second release, again in Italian, that included more earthquakes, more accurate research and a longer time span (461 B.C. to 1990) (Boschi et al., 1997). Aware that the record of Italian historical seismicity is probably the most extensive in the whole world, and hence that our catalogue could be of interest to a wider international readership, we recognised that Italian was clearly not the appropriate language to share this experience with colleagues from foreign countries. Three years after publication of the second release, therefore, and after much additional research and fine tuning of methodologies and algorithms, I am proud to introduce this third release in English. All the tools and accessories have been translated, along with the texts describing the development of the underlying research strategies and current contents. The English title is Catalogue of Strong Italian Earthquakes, 461 B.C. to 1997. This Preface briefly describes the scientific context within which the Catalogue of Strong Italian Earthquakes was conceived and progressively developed. The catalogue is perhaps the most important outcome of a well-established joint project between the Istituto Nazionale di Geofisica, the leading Italian institute for basic and applied research in seismology and solid earth geophysics, and SGA (Storia Geofisica Ambiente), a private firm specialising in the historical investigation and systematisation of natural phenomena. In her contribution "Method of investigation, typology and taxonomy of the basic data: navigating between seismic effects and historical contexts", Emanuela Guidoboni outlines the general framework of modern historical seismology and its complex relation with instrumental seismology on the one hand and historical research on the other. This presentation also highlights

  16. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels over the world. Nowadays, it is widely recognized that how to update these established geospatial databases and keep them up to date is critical to their value. Therefore, more and more effort has been devoted to the continuous updating of these geospatial databases. Currently, there are two main types of methods for geospatial database updating: direct updating with remote sensing images or field surveying materials, and indirect updating with other updated data, such as newly updated larger-scale data. The former method is fundamental, because the update data sources in both methods ultimately derive from field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after the larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the consistency of the multi-scale geospatial database. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. The latter is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger-scale data in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first. A brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing the updating of geospatial data from larger scales, including technical
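
    One of the core map-generalization operations implied here is line simplification when deriving a smaller-scale dataset from a larger-scale one. The sketch below shows this with Shapely's Douglas-Peucker simplification; the coordinates and tolerance are invented, and in practice the tolerance would depend on the target scale.

        # Line simplification as a building block of map generalization.
        from shapely.geometry import LineString

        detailed_road = LineString([(0, 0), (1, 0.1), (2, -0.1), (3, 0.05), (4, 0)])

        # Douglas-Peucker simplification with a tolerance suited to the target scale.
        generalized_road = detailed_road.simplify(0.2, preserve_topology=True)

        print("original vertices:   ", len(detailed_road.coords))
        print("generalized vertices:", len(generalized_road.coords))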

  17. Geospatial Health: the first five years

    Directory of Open Access Journals (Sweden)

    Jürg Utzinger

    2011-11-01

    Full Text Available Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000 and the inaugural issue of its official journal was published in November 2006 with the aim to cover all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member strong editorial board. This overview takes stock of the first five years of publishing: 133 contributions have been published so far, primarily original research (79.7%), followed by reviews (7.5%), announcements (6.0%), editorials and meeting reports (3.0% each), and a preface in the first issue. A content analysis of all the original research articles and reviews reveals that three quarters of the publications focus on human health with the remainder dealing with veterinary health. Two thirds of the papers come from Africa, Asia and Europe with similar numbers of contributions from each continent. Studies of more than 35 different diseases, injuries and risk factors have been presented. Malaria and schistosomiasis were identified as the two most important diseases (11.2% each). Almost half the contributions were based on GIS, one third on spatial analysis, often using advanced Bayesian geostatistics (13.8%), and one quarter on remote sensing. The 120 original research articles, reviews and editorials were produced by 505 authors based at institutions and universities in 52 countries

  18. Catalogue of National Health and Social Care Data Collections

    LENUS (Irish Health Repository)

    Kamel Boulos, Maged N

    2011-12-21

    Abstract 'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world.

  19. VIRAC: The VVV Infrared Astrometric Catalogue

    OpenAIRE

    Smith, L. C.; Lucas, P. W.; Kurtev, R.; Smart, R.; Minniti, D.; Borissova, J.; Jones, H. R. A; Zhang, Z. H.; Marocco, F.; Peña, C. Contreras; Gromadzki, M.; Kuhn, M. A.; Drew, J. E.; Pinfield, D. J.; Bedin, L. R.

    2017-01-01

    We present VIRAC version 1, a near-infrared proper motion and parallax catalogue of the VISTA VVV survey for 312,587,642 unique sources averaged across all overlapping pawprint and tile images covering 560 deg$^2$ of the bulge of the Milky Way and southern disk. The catalogue includes 119 million high quality proper motion measurements, of which 47 million have statistical uncertainties below 1 mas yr$^{-1}$. In the 11$

  20. Advancing the Implementation of Hydrologic Models as Web-based Applications

    Science.gov (United States)

    Dahal, P.; Tarboton, D. G.; Castronova, A. M.

    2017-12-01

    Advanced computer simulations are required to understand hydrologic phenomena such as rainfall-runoff response, groundwater hydrology, snow hydrology, etc. Building a hydrologic model instance to simulate a watershed requires investment in data (diverse geospatial datasets such as terrain and soil) and computer resources, typically demands a wide skill set from the analyst, and involves a workflow that is often difficult to reproduce. This work introduces a prototype web-based infrastructure in the form of a web application that provides researchers with easy-to-use access to complete hydrological modeling functionality. This includes creating the necessary geospatial and forcing data, preparing input files for a model by applying complex data preprocessing, running the model for a user-defined watershed, and saving the results to a web repository. The open source Tethys Platform was used to develop the web app front-end Graphical User Interface (GUI). We used HydroDS, a web service that provides data preparation and processing capabilities to support the backend computations used by the app. Results are saved in HydroShare, a hydrologic information system that supports the sharing of hydrologic data, models, and analysis tools. The TOPographic Kinematic APproximation and Integration (TOPKAPI) model served as the example for which we developed a complete hydrologic modeling service to demonstrate the approach. The final product is a complete modeling system, accessible through the web, to create input files and run the TOPKAPI hydrologic model for a watershed of interest. We are investigating similar functionality for the preparation of input to the Regional Hydro-Ecological Simulation System (RHESSys). Key Words: hydrologic modeling, web services, hydrologic information system, HydroShare, HydroDS, Tethys Platform
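
    The pattern of delegating data preparation to a web service can be illustrated with a plain HTTP request. The endpoint, parameters, and response shape below are hypothetical placeholders, not the actual HydroDS API.

        # Hedged sketch: asking a (hypothetical) data-preparation service to
        # delineate a watershed for a given outlet location.
        import requests

        SERVICE_URL = "https://hydro-data.example.org/api/delineatewatershed"  # placeholder
        payload = {
            "outlet_lat": 41.74,     # watershed outlet latitude
            "outlet_lon": -111.83,   # watershed outlet longitude
            "epsg_code": 4326,
            "output_name": "example_watershed",
        }

        response = requests.get(SERVICE_URL, params=payload, timeout=300)
        response.raise_for_status()
        print(response.json())       # e.g., URLs of the generated raster/vector outputs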

  1. INTAMAP: The design and implementation of an interoperable automated interpolation web service

    NARCIS (Netherlands)

    Pebesma, E.; Cornford, D.; Dubois, G.; Heuvelink, G.B.M.; Hristopulos, D.; Pilz, J.; Stohlker, U.; Morin, G.; Skoien, J.O.

    2011-01-01

    INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. The requirements were (i) using open standards for spatial data, such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) using a suitable environment for statistical modelling and
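
    As a minimal, non-INTAMAP illustration of the underlying task (automatic interpolation of scattered point measurements onto a grid), the sketch below uses SciPy's griddata; the synthetic measurements stand in for the point data such a service would receive.

        # Interpolate scattered point measurements onto a regular grid.
        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(42)
        points = rng.uniform(0, 10, size=(50, 2))            # measurement locations (x, y)
        values = np.sin(points[:, 0]) + 0.1 * points[:, 1]   # synthetic measured values

        grid_x, grid_y = np.meshgrid(np.linspace(0, 10, 100), np.linspace(0, 10, 100))
        predicted = griddata(points, values, (grid_x, grid_y), method="cubic")

        print(predicted.shape)   # (100, 100) interpolated surface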

  2. Geospatial Information is the Cornerstone of Effective Hazards Response

    Science.gov (United States)

    Newell, Mark

    2008-01-01

    Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders, and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS. The GIRT

  3. A participatory web map service: the case of Theewaterskloof Dam ...

    African Journals Online (AJOL)

    Previously, GIS was critiqued as a segregating science used exclusively by geospatial experts. ... It presents a case study methodology for the development and testing of a web GIS that can be optimised for smartphones and tablets so that communities can access updated information while using the dam, which is rated as ...

  4. From Analysis to Impact: Challenges and Outcomes from Google's Cloud-based Platforms for Analyzing and Leveraging Petapixels of Geospatial Data

    Science.gov (United States)

    Thau, D.

    2017-12-01

    For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools. [Image Credit: Tyler A. Erickson]
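
    For readers unfamiliar with the platform, the short sketch below (assuming an authorised Earth Engine Python client) shows the style of analysis described above: building a cloud-filtered annual Landsat 8 composite around an arbitrary point. The asset ID and property name follow the public Earth Engine data catalog; the location and thresholds are arbitrary.

        # Build a median Landsat 8 surface-reflectance composite for 2020
        # around a point of interest using the Earth Engine Python API.
        import ee

        ee.Initialize()

        point = ee.Geometry.Point([-122.26, 37.87])

        composite = (
            ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
              .filterBounds(point)
              .filterDate("2020-01-01", "2021-01-01")
              .filter(ee.Filter.lt("CLOUD_COVER", 20))
              .median()            # per-pixel median suppresses residual clouds
        )

        print(composite.bandNames().getInfo())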

  5. A Python Geospatial Language Toolkit

    Science.gov (United States)

    Fillmore, D.; Pletzer, A.; Galloy, M.

    2012-12-01

    The volume and scope of geospatial data archives, such as collections of satellite remote sensing or climate model products, has been rapidly increasing and will continue to do so in the near future. The recently launched (October 2011) Suomi National Polar-orbiting Partnership satellite (NPP), for instance, is the first of a new generation of Earth observation platforms that will monitor the atmosphere, oceans, and ecosystems, and its suite of instruments will generate several terabytes each day in the form of multi-spectral images and derived datasets. Full exploitation of such data for scientific analysis and decision support applications has become a major computational challenge. Geophysical data exploration and knowledge discovery could benefit, in particular, from intelligent mechanisms for extracting and manipulating subsets of data relevant to the problem of interest. Potential developments include enhanced support for natural language queries and directives to geospatial datasets. The translation of natural language (that is, human spoken or written phrases) into complex but unambiguous objects and actions can be based on a context, or knowledge domain, that represents the underlying geospatial concepts. This poster describes a prototype Python module that maps English phrases onto basic geospatial objects and operations. This module, along with the associated computational geometry methods, enables the resolution of natural language directives that include geographic regions of arbitrary shape and complexity.
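
    A toy sketch of the phrase-to-geospatial-object mapping this record describes is given below; the vocabulary, bounding boxes, and parsing strategy are invented stand-ins, far simpler than what a real toolkit would use.

        # Map a short English phrase onto an (operation, region bounding box) pair.
        REGIONS = {
            "colorado": (-109.05, 36.99, -102.04, 41.00),   # (W, S, E, N), approximate
            "iceland": (-24.5, 63.3, -13.5, 66.6),
        }
        OPERATIONS = {"average": "mean", "total": "sum", "maximum": "max"}

        def parse(phrase: str):
            """Return the (operation, region_bbox) recognised in a phrase, if any."""
            words = phrase.lower().split()
            op = next((OPERATIONS[w] for w in words if w in OPERATIONS), None)
            region = next((REGIONS[w] for w in words if w in REGIONS), None)
            return op, region

        print(parse("average snow depth over Colorado in January"))
        # -> ('mean', (-109.05, 36.99, -102.04, 41.0))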

  6. Publications catalogue 1982-83

    International Nuclear Information System (INIS)

    1982-04-01

    This catalogue lists the technical reports, papers, speeches, regulatory documents, news releases, information bulletins, notices, and miscellaneous documents issued by the Canadian Atomic Energy Control Board between 1977 and 1982

  7. Planck 2013 results. XXVIII. The Planck Catalogue of Compact Sources

    DEFF Research Database (Denmark)

    Planck Collaboration,; Ade, P. A. R.; Aghanim, N.

    2013-01-01

    The Planck Catalogue of Compact Sources (PCCS) is the catalogue of sources detected in the Planck nominal mission data. It consists of nine single-frequency catalogues of compact sources containing reliable sources, both Galactic and extragalactic, detected over the entire sky. The PCCS covers th...

  8. CHALLENGES AND OPPORTUNITIES OF CATALOGUE RETAILING

    OpenAIRE

    Heri Bezic; Katija Vojvodic; Zrinka Gjanovic

    2012-01-01

    Today's retail environment is characterised by new, store and non-store, retailing formats, a wide range of new products, the use of new information and communication technologies and, consequently, the changing customer behaviour. Catalogue retailing is a non-store retail format that has a long history in North America and Europe. Previous research revealed that the primary shopping motives related to catalogue retailing were convenience oriented. Other motives included recreational orientat...

  9. Planck 2013 results. XXIX. Planck catalogue of Sunyaev-Zeldovich sources

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Aussel, H.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Barrena, R.; Bartelmann, M.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bikmaev, I.; Bobin, J.; Bock, J.J.; Bohringer, H.; Bonaldi, A.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bridges, M.; Bucher, M.; Burenin, R.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chary, R.R.; Chen, X.; Chiang, L.Y.; Chiang, H.C.; Chon, G.; Christensen, P.R.; Churazov, E.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Dahle, H.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Democles, J.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dolag, K.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Feroz, F.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Fromenteau, S.; Galeotta, S.; Ganga, K.; Genova-Santos, R.T.; Giard, M.; Giardino, G.; Gilfanov, M.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Grainge, K.J.B.; Gratton, S.; Gregorio, A.; N, E.Groeneboom; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.; Hempel, A.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Hurley-Walker, N.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Khamitov, I.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Li, C.; Liddle, A.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; MacTavish, C.J.; Maffei, B.; Maino, D.; Mandolesi, N.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Mei, S.; Meinhold, P.R.; Melchiorri, A.; Melin, J.B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Nesvadba, N.P.H.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Olamaie, M.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T.J.; Perdereau, O.; Perotto, L.; Perrott, Y.C.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Reach, W.T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rumsey, C.; Rusholme, B.; Sandri, M.; Santos, D.; Saunders, R.D.E.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Shimwell, T.W.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; 
Tucci, M.; Tuovinen, J.; Turler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vibert, L.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; White, M.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We describe the all-sky Planck catalogue of clusters and cluster candidates derived from Sunyaev--Zeldovich (SZ) effect detections using the first 15.5 months of Planck satellite observations. The catalogue contains 1227 entries, making it over six times the size of the Planck Early SZ (ESZ) sample and the largest SZ-selected catalogue to date. It contains 861 confirmed clusters, of which 178 have been confirmed as clusters, mostly through follow-up observations, and a further 683 are previously-known clusters. The remaining 366 have the status of cluster candidates, and we divide them into three classes according to the quality of evidence that they are likely to be true clusters. The Planck SZ catalogue is the deepest all-sky cluster catalogue, with redshifts up to about one, and spans the broadest cluster mass range, from (0.1 to 1.6) x 10^{15} Msun. Confirmation of cluster candidates through comparison with existing surveys or cluster catalogues is extensively described, as is the statistical characterization...

  10. Dark Energy Survey Year 1 Results: Weak Lensing Shape Catalogues

    Energy Technology Data Exchange (ETDEWEB)

    Zuntz, J.; et al.

    2017-08-04

    We present two galaxy shape catalogues from the Dark Energy Survey Year 1 data set, covering 1500 square degrees with a median redshift of $0.59$. The catalogues cover two main fields: Stripe 82, and an area overlapping the South Pole Telescope survey region. We describe our data analysis process and in particular our shape measurement using two independent shear measurement pipelines, METACALIBRATION and IM3SHAPE. The METACALIBRATION catalogue uses a Gaussian model with an innovative internal calibration scheme, and was applied to $riz$-bands, yielding 34.8M objects. The IM3SHAPE catalogue uses a maximum-likelihood bulge/disc model calibrated using simulations, and was applied to $r$-band data, yielding 21.9M objects. Both catalogues pass a suite of null tests that demonstrate their fitness for use in weak lensing science. We estimate the 1$\sigma$ uncertainties in multiplicative shear calibration to be $0.013$ and $0.025$ for the METACALIBRATION and IM3SHAPE catalogues, respectively.

  11. Matilda, where are you: subject description of juvenile fiction in the Slovenian catalogue and catalogues of neighbouring countries

    Directory of Open Access Journals (Sweden)

    Alenka Šauperl

    2009-01-01

    Full Text Available Differences in the subject description of juvenile fiction were investigated for five examples of international classics in five library catalogues, in September 2008: the Oton Župančič Public Library (Knjižnica Otona Župančiča) in Ljubljana, Slovenia; the Stadtbibliothek public library in Graz, Austria; and the integrated catalogues of libraries in the Gorizia region in Italy (Sistema bibliotecario della Provincia di Gorizia) and the Karlovac region in Croatia (Skupni katalog knjižnica Karlovačke županije). As Slovenian youth rarely speak the languages of the neighbouring countries, the British Library catalogue was added. Results show that catalogue records are inconsistent within an individual library as well as in comparison with other libraries in the sample. Librarians do not make consistent subject descriptions. The class number, which is present in all catalogues except the Austrian one, usually represents the author’s country, language and/or nationality, the literary genre, and the target audience. Subject headings in the sample bring information on the subject (aboutness), the author’s country, language and/or nationality, the literary genre, and the target audience. Summaries tell more about the story, but they can also bring information on the emotional experience of the reader, on the author, or on the history of the literary work. It would be economically beneficial if subject description could be more consistent, but uniform subject description is not possible because of diverse library collections and users. The solution might be the use of multiple levels of subject description according to the type of library.

  12. The Raincoast eCatalogue: the creation of an electronic catalogue as a supplemental selling tool for sales representatives

    OpenAIRE

    Kemp, Elizabeth Anne

    2011-01-01

    Raincoast Books Distribution Ltd. is a Canadian book distributor that provides sales, marketing and distribution services for a number of international and Canadian publishers. Each publishing season Raincoast Books distributes approximately 25,000 paper catalogues to sales representatives and retail accounts. Traditional paper catalogues have major disadvantages including their static format, high cost of production and distribution, inclusion of frontlist titles only and environmental impac...

  13. Integration of Geospatial Science in Teacher Education

    Science.gov (United States)

    Hauselt, Peggy; Helzer, Jennifer

    2012-01-01

    One of the primary missions of our university is to train future primary and secondary teachers. Geospatial sciences, including GIS, have long been excluded from teacher education curriculum. This article explains the curriculum revisions undertaken to increase the geospatial technology education of future teachers. A general education class…

  14. First Prototype of a Web Map Interface for ESA's Planetary Science Archive (PSA)

    Science.gov (United States)

    Manaud, N.; Gonzalez, J.

    2014-04-01

    We present a first prototype of a Web Map Interface that will serve as a proof of concept and design for ESA's future fully web-based Planetary Science Archive (PSA) User Interface. The PSA is ESA's planetary science archiving authority and central repository for all scientific and engineering data returned by ESA's Solar System missions [1]. All data are compliant with NASA's Planetary Data System (PDS) Standards and are accessible through several interfaces [2]: in addition to serving all public data via FTP and the Planetary Data Access Protocol (PDAP), a Java-based User Interface provides advanced search, preview, download, notification and delivery-basket functionality. It allows the user to query and visualise instrument observation footprints using a map-based interface (currently only available for the Mars Express HRSC and OMEGA instruments). During the last decade, the planetary mapping science community has increasingly been adopting Geographic Information System (GIS) tools and standards, originally developed for and used in Earth science. There is an ongoing effort to produce and share cartographic products through Open Geospatial Consortium (OGC) Web Services, or as standalone data sets, so that they can be readily used in existing GIS applications [3,4,5]. Previous studies conducted at ESAC [6,7] have helped identify the needs of Planetary GIS users, and define key areas of improvement for the future Web PSA User Interface. Its web map interface will provide access to the full geospatial content of the PSA, including (1) observation geometry footprints of all remote sensing instruments, and (2) all georeferenced cartographic products, such as HRSC map-projected data or OMEGA global maps from Mars Express. It shall aim to provide a rich user experience for search and visualisation of this content using modern and interactive web mapping technology. A comprehensive set of built-in context maps from external sources, such as MOLA topography, TES

  15. Geospatial Data Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...

  16. Bridging the Gap Between Surveyors and the Geo-Spatial Society

    Science.gov (United States)

    Müller, H.

    2016-06-01

    For many years FIG, the International Association of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, with the geospatial industries in particular. Traditionally the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on an in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. The list of professional tasks underpins the capabilities of surveyors to contribute to high-quality geospatial data and information management. In that way modern surveyors support the needs of a geo-spatial society. The paper discusses several approaches to define the role of the surveyor within the modern geospatial society.

  17. Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India

    Science.gov (United States)

    Mohan, M.

    2016-06-01

    In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. Geospatial information is processed with geospatial technologies, which play an important role in the development of smart cities, particularly in developing countries of the world such as India. The study is based on the latest geospatial satellite imagery available, which is multi-date, multi-stage, multi-sensor, and multi-resolution. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems have been used for 3-D geovisualisation, geospatial digital mapping and geospatial analysis for the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information supports 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for the geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides a platform for information visualisation, which is also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state-level planners and policy makers to better understand and address issues attributed to cities, using geospatial information from satellite imagery for the geovisualisation of smart cities in an emerging and developing country, India.

  18. Sensor Webs with a Service-Oriented Architecture for On-demand Science Products

    Science.gov (United States)

    Mandl, Daniel; Ungar, Stephen; Ames, Troy; Justice, Chris; Frye, Stuart; Chien, Steve; Tran, Daniel; Cappelaere, Patrice; Derezinski, Linda; Paules, Granville

    2007-01-01

    This paper describes the work being managed by the NASA Goddard Space Flight Center (GSFC) Information System Division (ISD) under a NASA Earth Science Technology Office (ESTO) Advanced Information System Technology (AIST) grant to develop a modular sensor web architecture which enables discovery of sensors and workflows that can create customized science products via a high-level service-oriented architecture based on Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) web service standards. These capabilities serve as a prototype of a user-centric architecture for the Global Earth Observing System of Systems (GEOSS). This work builds on and extends previous sensor web efforts conducted at NASA/GSFC using the Earth Observing 1 (EO-1) satellite and other low-earth-orbiting satellites.
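
    As a rough illustration of the OGC SWE service interfaces mentioned above, the sketch below issues plain key-value-pair requests against a Sensor Observation Service (SOS). The endpoint URL, offering name and observed property are hypothetical placeholders, not services from the EO-1 sensor web itself.

```python
import requests

# Hypothetical SOS endpoint; replace with a real Sensor Observation Service URL.
SOS_URL = "https://example.org/sos"

# 1. Discover the service: GetCapabilities lists the offerings, procedures and
#    observed properties exposed by the sensor web node.
caps = requests.get(SOS_URL, params={
    "service": "SOS",
    "request": "GetCapabilities",
    "acceptVersions": "2.0.0",
})
print(caps.status_code, caps.headers.get("Content-Type"))

# 2. Pull observations for one (hypothetical) offering and observed property.
obs = requests.get(SOS_URL, params={
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "surface_reflectance",      # placeholder offering id
    "observedProperty": "radiance",         # placeholder property
    "responseFormat": "http://www.opengis.net/om/2.0",
})
print(obs.text[:500])  # raw O&M XML; parse with lxml or owslib in real code
```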

  19. Broad Absorption Line Quasar catalogues with Supervised Neural Networks

    International Nuclear Information System (INIS)

    Scaringi, Simone; Knigge, Christian; Cottis, Christopher E.; Goad, Michael R.

    2008-01-01

    We have applied a Learning Vector Quantization (LVQ) algorithm to SDSS DR5 quasar spectra in order to create a large catalogue of broad absorption line quasars (BALQSOs). We first discuss the problems with BALQSO catalogues constructed using the conventional balnicity and/or absorption indices (BI and AI), and then describe the supervised LVQ network we have trained to recognise BALQSOs. The resulting BALQSO catalogue should be substantially more robust and complete than BI- or AI-based ones.

  20. Key Technologies and Applications of Satellite and Sensor Web-coupled Real-time Dynamic Web Geographic Information System

    Directory of Open Access Journals (Sweden)

    CHEN Nengcheng

    2017-10-01

    Full Text Available The geo-spatial information service has long failed to reflect the live status of observed sites and to meet the needs of integrated monitoring and real-time information. To tackle the problems in observation sharing and the integrated management of space-borne, air-borne, and ground-based platforms, and the efficient service of spatio-temporal information, an observation sharing model is proposed. The key technologies in a real-time dynamic geographical information system (GIS), including the maximum spatio-temporal-coverage-based optimal layout of an earth-observation sensor web, task-driven and feedback-based control, real-time access to streaming observations, dynamic simulation, warning and decision support, are detailed. A real-time dynamic web geographical information system (WebGIS) named GeoSensor and its applications in sensing and managing spatio-temporal information of the Yangtze River basin, including navigation, flood prevention and power generation, are also introduced.

  1. PROCESSING, CATALOGUING AND DISTRIBUTION OF UAS IMAGES IN NEAR REAL TIME

    Directory of Open Access Journals (Sweden)

    I. Runkel

    2013-08-01

    Full Text Available Why do UAS generate such hype? UAS make data capture flexible, fast and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture holds up to the end of the processing chain, all intermediate steps, such as data processing and data dissemination to the customer, need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution. This is the focus of the presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data processing computer, or it is stored on a mobile device and hooked up to the data processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata are converted into an ISO-conformant format and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, respectively the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready to use for GIS applications, image processing or direct interpretation via web applications – wherever you want. The whole processing chain is built in a generic manner. It can be adapted to a multitude of applications. The UAV imagery can be processed and catalogued as single orthoimages or as an image mosaic. Furthermore, image data from various cameras can be fused. By using WPS (web processing services), image enhancement and image analysis workflows such as change detection layers can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server. The image analyst has no data and no software on his local computer. This workflow is proven to be fast, stable and accurate. It is designed to support time-critical applications for security

  2. Processing, Cataloguing and Distribution of Uas Images in Near Real Time

    Science.gov (United States)

    Runkel, I.

    2013-08-01

    Why do UAS generate such hype? UAS make data capture flexible, fast and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture holds up to the end of the processing chain, all intermediate steps, such as data processing and data dissemination to the customer, need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution. This is the focus of the presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data processing computer, or it is stored on a mobile device and hooked up to the data processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata are converted into an ISO-conformant format and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, respectively the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready to use for GIS applications, image processing or direct interpretation via web applications - wherever you want. The whole processing chain is built in a generic manner. It can be adapted to a multitude of applications. The UAV imagery can be processed and catalogued as single orthoimages or as an image mosaic. Furthermore, image data from various cameras can be fused. By using WPS (web processing services), image enhancement and image analysis workflows such as change detection layers can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server. The image analyst has no data and no software on his local computer. This workflow is proven to be fast, stable and accurate. It is designed to support time-critical applications for security demands - the images
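
    The WPS-driven image-analysis step described in the two records above can be exercised from Python with OWSLib. The sketch below is only illustrative: the endpoint URL, process identifier and input names are hypothetical placeholders, not the GEOSYSTEMS/APOLLO configuration.

```python
from owslib.wps import WebProcessingService

# Hypothetical WPS endpoint exposed by the raster data management server.
wps = WebProcessingService("https://example.org/wps", version="1.0.0")

# List the processes the server advertises (e.g. a change-detection workflow).
for process in wps.processes:
    print(process.identifier, "-", process.title)

# Execute a (hypothetical) change-detection process on two catalogued images.
inputs = [
    ("image_before", "catalog:uav_2013_05_01"),   # placeholder references
    ("image_after", "catalog:uav_2013_06_01"),
]
execution = wps.execute("change_detection", inputs)
print(execution.status)
```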

  3. Statistical Validation of a Web-Based GIS Application and Its Applicability to Cardiovascular-Related Studies

    Directory of Open Access Journals (Sweden)

    Jae Eun Lee

    2015-12-01

    Full Text Available Purpose: There is abundant evidence that neighborhood characteristics are significantly linked to the health of the inhabitants of a given space within a given time frame. This study statistically validates a web-based GIS application designed to support cardiovascular-related research, developed by the NIH-funded Research Centers in Minority Institutions (RCMI) Translational Research Network (RTRN) Data Coordinating Center (DCC), and discusses its applicability to cardiovascular studies. Methods: Geo-referencing, geocoding and geospatial analyses were conducted for 500 randomly selected home addresses in a U.S. southeastern metropolitan area. The correlation coefficient, factor analysis and Cronbach’s alpha (α) were estimated to quantify measures of the internal consistency, reliability and construct/criterion/discriminant validity of the cardiovascular-related geospatial variables (walk score, number of hospitals, fast food restaurants, parks and sidewalks). Results: Cronbach’s α for the CVD geospatial variables was 95.5%, implying successful internal consistency. Walk scores were significantly correlated with the number of hospitals (r = 0.715; p < 0.0001), fast food restaurants (r = 0.729; p < 0.0001), parks (r = 0.773; p < 0.0001) and sidewalks (r = 0.648; p < 0.0001) within a mile of homes. They were also significantly associated with the diversity index (r = 0.138; p = 0.0023), median household income (r = −0.181; p < 0.0001), and owner-occupied rates (r = −0.440; p < 0.0001). However, no significant correlation was found with median age, vulnerability, unemployment rate, labor force, and population growth rate. Conclusion: Our data demonstrate that the geospatial data generated by the web-based application were internally consistent and demonstrated satisfactory validity. Therefore, the GIS application may be useful in cardiovascular-related studies aimed at investigating the potential impact of geospatial factors on diseases and/or the long
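
    The internal-consistency and correlation statistics reported above are standard and can be reproduced for any comparable table of geospatial indicators. Below is a minimal Python sketch, assuming a hypothetical CSV file geo_vars.csv with columns such as walk_score, hospitals, fast_food, parks and sidewalks; the file and column names are illustrative, not taken from the study.

```python
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a table whose columns are the scale items."""
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    k = items.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical table of geospatial variables, one row per address.
df = pd.read_csv("geo_vars.csv")
items = df[["walk_score", "hospitals", "fast_food", "parks", "sidewalks"]]

print(f"Cronbach's alpha: {cronbach_alpha(items):.3f}")

# Pairwise Pearson correlations of walk score against the other indicators.
for col in ["hospitals", "fast_food", "parks", "sidewalks"]:
    r, p = stats.pearsonr(df["walk_score"], df[col])
    print(f"walk_score vs {col}: r = {r:.3f}, p = {p:.4g}")
```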

  4. GEOSPATIAL INFORMATION FROM SATELLITE IMAGERY FOR GEOVISUALISATION OF SMART CITIES IN INDIA

    Directory of Open Access Journals (Sweden)

    M. Mohan

    2016-06-01

    Full Text Available In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. Geospatial information is processed with geospatial technologies, which play an important role in the development of smart cities, particularly in developing countries of the world such as India. The study is based on the latest geospatial satellite imagery available, which is multi-date, multi-stage, multi-sensor, and multi-resolution. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems have been used for 3-D geovisualisation, geospatial digital mapping and geospatial analysis for the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information supports 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for the geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides a platform for information visualisation, which is also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state-level planners and policy makers to better understand and address issues attributed to cities, using geospatial information from satellite imagery for the geovisualisation of smart cities in an emerging and developing country, India.

  5. GeoSpatial Data Analysis for DHS Programs

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, Eric G.; Burke, John S.; Carlson, Carrie A.; Gillen, David S.; Joslyn, Cliff A.; Olsen, Bryan K.; Critchlow, Terence J.

    2009-05-10

    The Department of Homeland Security law enforcement faces the continual challenge of analyzing their custom data sources in a geospatial context. From a strategic perspective law enforcement has certain requirements to first broadly characterize a given situation using their custom data sources and then once it is summarily understood, to geospatially analyze their data in detail.

  6. A new version of the European tsunami catalogue: updating and revision

    Directory of Open Access Journals (Sweden)

    S. Tinti

    2001-01-01

    Full Text Available A new version of the European catalogue of tsunamis is presented here. It differs in some important respects from the latest release of the catalogue, which was produced in 1998 and is known as the GITEC tsunami catalogue. In the first place, it is a database built on the Visual FoxPro 6.0 DBMS that can be used and maintained under the PC operating systems currently available, whereas the GITEC catalogue was compatible only with Windows 95 and older PC platforms. In the second place, it is enriched by new facilities and a new type of data, such as a database of pictures that can be accessed easily from the main screen of the catalogue. Thirdly, it has been updated by including newly published references. A minute and painstaking search for new data has been undertaken to re-evaluate cases that were not included in the GITEC catalogue, though they were mentioned in previous catalogues; their exclusion had been motivated by a lack of data. This last work has so far focused on Italian cases of the last two centuries. The result is that at least two events have been found which deserve inclusion in the new catalogue: one occurred in 1809 in the Gulf of La Spezia, and the other occurred in 1940 in the Gulf of Palermo. Two further events are presently under investigation.

  7. International Atomic Energy Agency Publications. Catalogue 1986-1999

    International Nuclear Information System (INIS)

    2000-11-01

    This catalogue lists all sales publications of the International Atomic Energy Agency issued from 1986 up to the end of 1999 and still available. Some earlier titles which form part of an established series or are still considered important have also been included. The catalogue is in CD-ROM format

  8. Catalogue of meteorites from South America

    CERN Document Server

    Acevedo, Rogelio Daniel; García, Víctor Manuel

    2014-01-01

    The first Catalogue of Meteorites from South America includes new specimens never previously reported, while doubtful cases and pseudometeorites have been deliberately omitted.The falling of these objects is a random event, but the sites where old meteorites are found tend to be focused in certain areas, e.g. in the deflation surfaces in Chile's Atacama Desert, due to favorable climate conditions and ablation processes.Our Catalogue provides basic information on each specimen like its provenance and the place where it was discovered (in geographic co-ordinates and with illustrative maps), its

  9. Capacity Building through Geospatial Education in Planning and School Curricula

    Science.gov (United States)

    Kumar, P.; Siddiqui, A.; Gupta, K.; Jain, S.; Krishna Murthy, Y. V. N.

    2014-11-01

    Geospatial technology has widespread usage in development planning and resource management. It offers pragmatic tools to help urban and regional planners to realize their goals. At the request of the Ministry of Urban Development, Govt. of India, the Indian Institute of Remote Sensing (IIRS), Dehradun has taken the initiative to study the model syllabi of the All India Council for Technical Education for planning curricula of Bachelor and Master (five disciplines) programmes. It is inferred that the geospatial content across the semesters in various planning fields needs revision. It is also realized that students pursuing planning curricula are invariably exposed to spatial mapping tools, but the popular digital drafting software has limitations for the geospatial analysis of planning phenomena. Therefore, students need exposure to geospatial technologies to understand various real-world phenomena. Inputs were given to seamlessly merge and incorporate geospatial components throughout the semesters wherever relevant. Another initiative was taken by IIRS to enhance the understanding and essence of space and geospatial technologies amongst young minds at the 10+2 level. The content was proposed in a manner such that youngsters start realizing the innumerable contributions made by space and geospatial technologies in their day-to-day life. This effort, at both school and college level, would help not only in enhancing job opportunities for the young generation but also in utilizing untapped human resource potential. In the era of smart cities, higher economic growth and aspirations for a better tomorrow, the integration of geospatial technologies with conventional wisdom can no longer be ignored.

  10. Challenges in sharing of geospatial data by data custodians in South Africa

    Science.gov (United States)

    Kay, Sissiel E.

    2018-05-01

    As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, which has the sharing of geospatial data as a key objective. The collection and maintenance of geospatial data is expensive and time consuming, so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geo-spatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not, what the reasons for this are. Through an international comparative assessment it then determines whether compliance with the SDI Act is too onerous for the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.

  11. An Assessment of Online Public Access Catalogue (OPAC ...

    African Journals Online (AJOL)

    The main purpose of this study was to assess the computerized catalogue and its utilization in university libraries in Lagos state. Survey research method was employed for the study. The population for the study was drawn from two university libraries in Lagos state that have automated their catalogues. These libraries are ...

  12. The ASAS-SN bright supernova catalogue - III. 2016

    DEFF Research Database (Denmark)

    Holoien, T. W. -S.; Brown, J. S.; Stanek, K. Z.

    2017-01-01

    This catalogue summarizes information for all supernovae discovered by the All-Sky Automated Survey for SuperNovae (ASAS-SN) and all other bright (m(peak)...

  13. A NoSQL–SQL Hybrid Organization and Management Approach for Real-Time Geospatial Data: A Case Study of Public Security Video Surveillance

    Directory of Open Access Journals (Sweden)

    Chen Wu

    2017-01-01

    Full Text Available With the widespread deployment of ground, air and space sensor sources (internet of things or IoT, social networks, sensor networks), the integrated application of real-time geospatial data from ubiquitous sensors, especially in the public security and smart city domains, is becoming a challenging issue. The traditional geographic information system (GIS) mostly manages time-discretized geospatial data by means of a Structured Query Language (SQL) database management system (DBMS) and emphasizes query and retrieval of massive historical geospatial data on disk. This limits its capability for on-the-fly access to real-time geospatial data for online analysis in real time. This paper proposes a hybrid database organization and management approach with SQL relational databases (RDB) and not-only-SQL (NoSQL) databases (including the main memory database, MMDB, and the distributed file system, DFS). This hybrid approach makes full use of the advantages of NoSQL and SQL DBMSs for real-time access to input data and structured on-the-fly analysis results, which can meet the requirements of increased spatio-temporal big data linking analysis. The MMDB facilitates real-time access to the latest input data, such as from the sensor web and IoT, and supports real-time queries for online geospatial analysis. The RDB stores change information such as multi-modal features and abnormal events extracted from the real-time input data. The DFS on disk manages the massive geospatial data, and the extensible storage architecture and distributed scheduling of a NoSQL database satisfy the performance requirements of incremental storage and multi-user concurrent access. A case study of geographic video (GeoVideo) surveillance for public security is presented to prove the feasibility of this hybrid organization and management approach.
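
    A minimal sketch of the hybrid idea described above, assuming Redis in the in-memory (MMDB) role for the latest sensor readings and PostgreSQL/PostGIS in the relational (RDB) role for extracted events; the archival DFS tier is omitted, and the table, key names and connection settings are illustrative only, not the system from the paper.

```python
import json
import redis
import psycopg2

# In-memory store (MMDB role): keep only the latest reading per sensor.
mem = redis.Redis(host="localhost", port=6379, db=0)
reading = {"sensor_id": "cam-042", "lon": 114.35, "lat": 30.53, "speed": 4.2}
mem.set(f"latest:{reading['sensor_id']}", json.dumps(reading))

# Real-time query: fetch the freshest observation without touching disk.
latest = json.loads(mem.get("latest:cam-042"))

# Relational store (RDB role): persist an extracted "abnormal event" with a
# PostGIS point geometry so it can be joined and analysed later.
conn = psycopg2.connect(dbname="surveillance", user="gis", password="gis")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        INSERT INTO abnormal_events (sensor_id, event_type, geom)
        VALUES (%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
        """,
        (latest["sensor_id"], "crowd_gathering", latest["lon"], latest["lat"]),
    )
conn.close()
```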

  14. A PUBLIC PLATFORM FOR GEOSPATIAL DATA SHARING FOR DISASTER RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    S. Balbo

    2014-01-01

    This paper presents a case study scenario of setting up a Web platform based on GeoNode. It is a public platform called MASDAP, promoted by the Government of Malawi in order to support the development of the country and build resilience against natural disasters. A substantial amount of geospatial data has already been collected about hydrogeological risk, as well as other disaster-related information. Moreover, this platform will help to ensure that the data created by a number of past or ongoing projects are maintained and that this information remains accessible and useful. An Integrated Flood Risk Management Plan for a river basin has already been included in the platform, and data from future disaster risk management projects will be added as well.

  15. Communicating Thematic Data Quality with Web Map Services

    Directory of Open Access Journals (Sweden)

    Jon D. Blower

    2015-10-01

    Full Text Available Geospatial information of many kinds, from topographic maps to scientific data, is increasingly being made available through web mapping services. These allow georeferenced map images to be served from data stores and displayed in websites and geographic information systems, where they can be integrated with other geographic information. The Open Geospatial Consortium’s Web Map Service (WMS) standard has been widely adopted in diverse communities for sharing data in this way. However, current services typically provide little or no information about the quality or accuracy of the data they serve. In this paper we describe the design and implementation of a new “quality-enabled” profile of WMS, which we call “WMS-Q”. This describes how information about data quality can be transmitted to the user through WMS. Such information can exist at many levels, from entire datasets to individual measurements, and includes the many different ways in which data uncertainty can be expressed. We also describe proposed extensions to the Symbology Encoding specification, which include provision for visualizing uncertainty in raster data in a number of different ways, including contours, shading and bivariate colour maps. We also describe new open-source implementations of the new specifications, which include both clients and servers.
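
    Standard WMS requests of the kind a WMS-Q service would extend can be issued with OWSLib. The sketch below reads a service's capabilities and fetches a map image; the endpoint URL and layer name are hypothetical, and no WMS-Q-specific quality extensions are shown.

```python
from owslib.wms import WebMapService

# Hypothetical WMS endpoint and layer name.
wms = WebMapService("https://example.org/wms", version="1.1.1")

# Inspect the advertised layers and their metadata (title, abstract, styles).
for name, layer in wms.contents.items():
    print(name, "-", layer.title)

# Request a georeferenced map image for one layer.
img = wms.getmap(
    layers=["sea_surface_temperature"],   # placeholder layer name
    srs="EPSG:4326",
    bbox=(-10.0, 40.0, 10.0, 60.0),        # lon/lat bounding box
    size=(512, 512),
    format="image/png",
    transparent=True,
)
with open("map.png", "wb") as f:
    f.write(img.read())
```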

  16. VIRAC: the VVV Infrared Astrometric Catalogue

    Science.gov (United States)

    Smith, L. C.; Lucas, P. W.; Kurtev, R.; Smart, R.; Minniti, D.; Borissova, J.; Jones, H. R. A.; Zhang, Z. H.; Marocco, F.; Contreras Peña, C.; Gromadzki, M.; Kuhn, M. A.; Drew, J. E.; Pinfield, D. J.; Bedin, L. R.

    2018-02-01

    We present VIRAC version 1, a near-infrared proper motion and parallax catalogue of the VISTA Variables in the Via Lactea (VVV) survey for 312 587 642 unique sources averaged across all overlapping pawprint and tile images covering 560 deg2 of the bulge of the Milky Way and southern disc. The catalogue includes 119 million high-quality proper motion measurements, of which 47 million have statistical uncertainties below 1 mas yr-1. Science applications include studies of stars and brown dwarfs, subdwarfs and white dwarfs, and kinematic distance measurements of young stellar objects. Nearby objects discovered include LTT 7251 B, an L7 benchmark companion to a G dwarf with over 20 published elemental abundances, a bright L subdwarf, VVV 1256-6202, with extremely blue colours, and nine new members of the 25 pc sample. We also demonstrate why this catalogue remains useful in the era of Gaia. Future versions will be based on profile-fitting photometry, use the Gaia absolute reference frame and incorporate the longer time baseline of the VVV extended survey.

  17. Integrating Free and Open Source Solutions into Geospatial Science Education

    Directory of Open Access Journals (Sweden)

    Vaclav Petras

    2015-06-01

    Full Text Available While free and open source software becomes increasingly important in geospatial research and industry, open science perspectives are generally less reflected in universities’ educational programs. We present an example of how free and open source software can be incorporated into geospatial education to promote open and reproducible science. Since 2008 graduate students at North Carolina State University have the opportunity to take a course on geospatial modeling and analysis that is taught with both proprietary and free and open source software. In this course, students perform geospatial tasks simultaneously in the proprietary package ArcGIS and the free and open source package GRASS GIS. By ensuring that students learn to distinguish between geospatial concepts and software specifics, students become more flexible and stronger spatial thinkers when choosing solutions for their independent work in the future. We also discuss ways to continually update and improve our publicly available teaching materials for reuse by teachers, self-learners and other members of the GIS community. Only when free and open source software is fully integrated into geospatial education will we be able to encourage a culture of openness and, thus, enable greater reproducibility in research and development applications.
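
    As a small example of the kind of scripted, reproducible workflow such a course encourages, the GRASS GIS Python scripting API can drive an analysis end to end. The sketch below assumes it is run inside an existing GRASS session with a hypothetical elevation raster named elevation; it is an illustration, not material from the course.

```python
# Run inside a GRASS GIS session (e.g. started with `grass --text`).
import grass.script as gs

# Derive slope and aspect from a (hypothetical) elevation raster.
gs.run_command(
    "r.slope.aspect",
    elevation="elevation",
    slope="slope",
    aspect="aspect",
    overwrite=True,
)

# Print univariate statistics of the derived slope map.
stats = gs.parse_command("r.univar", map="slope", flags="g")
print(f"mean slope: {float(stats['mean']):.2f} degrees")
```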

  18. Catalogue of knowledge and skills for sleep medicine.

    Science.gov (United States)

    Penzel, Thomas; Pevernagie, Dirk; Dogas, Zoran; Grote, Ludger; de Lacy, Simone; Rodenbeck, Andrea; Bassetti, Claudio; Berg, Søren; Cirignotta, Fabio; d'Ortho, Marie-Pia; Garcia-Borreguero, Diego; Levy, Patrick; Nobili, Lino; Paiva, Teresa; Peigneux, Philippe; Pollmächer, Thomas; Riemann, Dieter; Skene, Debra J; Zucconi, Marco; Espie, Colin

    2014-04-01

    Sleep medicine is evolving globally into a medical subspeciality in its own right, and in parallel, behavioural sleep medicine and sleep technology are expanding rapidly. Educational programmes are being implemented at different levels in many European countries. However, these programmes would benefit from a common, interdisciplinary curriculum. This 'catalogue of knowledge and skills' for sleep medicine is proposed, therefore, as a template for developing more standardized curricula across Europe. The Board and The Sleep Medicine Committee of the European Sleep Research Society (ESRS) have compiled the catalogue based on textbooks, standard of practice publications, systematic reviews and professional experience, validated subsequently by an online survey completed by 110 delegates specialized in sleep medicine from different European countries. The catalogue comprises 10 chapters covering physiology, pathology, diagnostic and treatment procedures to societal and organizational aspects of sleep medicine. Required levels of knowledge and skills are defined, as is a proposed workload of 60 points according to the European Credit Transfer System (ECTS). The catalogue is intended to be a basis for sleep medicine education, for sleep medicine courses and for sleep medicine examinations, serving not only physicians with a medical speciality degree, but also PhD and MSc health professionals such as clinical psychologists and scientists, technologists and nurses, all of whom may be involved professionally in sleep medicine. In the future, the catalogue will be revised in accordance with advances in the field of sleep medicine. © 2013 European Sleep Research Society.

  19. Applying Geospatial Technologies for International Development and Public Health: The USAID/NASA SERVIR Program

    Science.gov (United States)

    Hemmings, Sarah; Limaye, Ashutosh; Irwin, Dan

    2011-01-01

    adaptation strategies for nations affected by climate change. Conclusions: SERVIR is a platform for collaboration and cross-agency coordination, international partnerships, and delivery of web-based geospatial information services and applications. SERVIR makes a variety of geospatial data available for use in studies of environmental health outcomes.

  20. Geospatial Absorption and Regional Effects

    Directory of Open Access Journals (Sweden)

    IOAN MAC

    2009-01-01

    Full Text Available Geospatial absorptions are characterized by a specific complexity both in content and in their phenomenological and spatial fields of manifestation. Such processes are differentiated, according to their specificity, into pre-absorption, absorption and post-absorption. The mechanisms that contribute to absorption are extremely numerous: aggregation, extension, diffusion, substitution, resistivity (resilience), stratification, borrowings, etc. Frequent relations are established between these mechanisms, determining an amplification of the process and of its regional effects. The installation of the geographic osmosis phenomenon in a given territory (a place, for example) leads to a homogenization of the geospatial state and to the installation of regional homogeneity.

  1. Planck 2015 results. XXVI. The Second Planck Catalogue of Compact Sources

    CERN Document Server

    Ade, P.A.R.; Argueso, F.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartolo, N.; Battaner, E.; Beichman, C.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bock, J.J.; Bohringer, H.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Cardoso, J.F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chary, R.R.; Chiang, H.C.; Christensen, P.R.; Clemens, M.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Ensslin, T.A.; Eriksen, H.K.; Falgarone, E.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giraud-Heraud, Y.; Gjerlow, E.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.L.; Helou, G.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, A.H.; Jaffe, T.R.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Lattanzi, M.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J.A.; Naselsky, P.; Nati, F.; Natoli, P.; Negrello, M.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T.J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Reach, W.T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Sanghera, H.S.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Stolyarov, V.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tornikoski, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L.A.; Walter, B.; Wandelt, B.D.; Wehus, I.K.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-01-01

    The Second Planck Catalogue of Compact Sources is a catalogue of sources detected in single-frequency maps from the full duration of the Planck mission and supersedes previous versions of the Planck compact source catalogues. It consists of compact sources, both Galactic and extragalactic, detected over the entire sky. Compact sources detected in the lower frequency channels are assigned to the PCCS2, while at higher frequencies they are assigned to one of two sub-catalogues, the PCCS2 or PCCS2E, depending on their location on the sky. The first of these catalogues covers most of the sky and allows the user to produce subsamples at higher reliabilities than the target 80% integral reliability of the catalogue. The PCCS2E contains sources detected in sky regions where the diffuse emission makes it difficult to quantify the reliability of the detections. Both the PCCS2 and PCCS2E include polarization measurements, in the form of polarized flux densities, or upper limits, and orientation angles for all seven pol...

  2. Remapping simulated halo catalogues in redshift space

    OpenAIRE

    Mead, Alexander; Peacock, John

    2014-01-01

    We discuss the extension to redshift space of a rescaling algorithm, designed to alter the effective cosmology of a pre-existing simulated particle distribution or catalogue of dark matter haloes. The rescaling approach was initially developed by Angulo & White and was adapted and applied to halo catalogues in real space in our previous work. This algorithm requires no information other than the initial and target cosmological parameters, and it contains no tuned parameters. It is shown here ...

  3. Creation of Defects Catalogue for Nonconforming Product Identification in the Foundry Organization

    Directory of Open Access Journals (Sweden)

    Andrea Sütőová

    2013-12-01

    Full Text Available The paper deals with the problem of classifying casting defects and with the creation of a defects catalogue in a foundry organization. The literature review describes the value of correct defect classification and identification, and some tools for defect classification are mentioned. Existing defect classifications and catalogues are often unusable for particular production processes and casting technologies, so many foundries create their own defects catalogues. A sample of the created catalogue, which classifies and describes defects occurring in an aluminium foundry organization, and its benefits are presented in the paper. The created catalogue primarily serves as a visual support for production operators and quality control processes.

  4. Biosecurity and geospatial analysis of mycoplasma infections in ...

    African Journals Online (AJOL)

    Geospatial databases of farm locations and biosecurity measures are essential to control disease outbreaks. A study was conducted to establish a geospatial database on poultry farms in the Al-Jabal Al-Gharbi region of Libya, to evaluate the biosecurity level of each farm and to determine the seroprevalence of mycoplasma and ...

  5. Searches over graphs representing geospatial-temporal remote sensing data

    Science.gov (United States)

    Brost, Randolph; Perkins, David Nikolaus

    2018-03-06

    Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Geospatial-temporal graph searches are made computationally efficient by taking advantage of characteristics of geospatial-temporal data in remote sensing images through the application of various graph search techniques.
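
    A toy version of the geospatial-temporal graph described above can be built with NetworkX: objects become nodes, spatial relations are stored as adjacency/distance edges, and time edges point from an object's state in one image to its state in the next. The node names, attributes and query below are illustrative assumptions, not the representation from the record.

```python
import networkx as nx

# Directed graph; spatial relations are added in both directions,
# temporal edges only forward in time.
G = nx.DiGraph()

# Objects extracted from two remote sensing images (t=0 and t=1).
G.add_node("building_A_t0", kind="building", t=0)
G.add_node("lot_B_t0", kind="parking_lot", t=0)
G.add_node("building_A_t1", kind="building", t=1)
G.add_node("lot_B_t1", kind="parking_lot", t=1)

# Spatial relationship at each time step (adjacency, with a distance attribute).
for u, v in [("building_A_t0", "lot_B_t0"), ("building_A_t1", "lot_B_t1")]:
    G.add_edge(u, v, rel="adjacent", distance_m=15.0)
    G.add_edge(v, u, rel="adjacent", distance_m=15.0)

# Temporal edges link the same object across acquisition times.
G.add_edge("building_A_t0", "building_A_t1", rel="time")
G.add_edge("lot_B_t0", "lot_B_t1", rel="time")

# Simple search: buildings that are adjacent to a parking lot at any time.
hits = [
    n for n, d in G.nodes(data=True)
    if d["kind"] == "building"
    and any(G[n][m].get("rel") == "adjacent"
            and G.nodes[m]["kind"] == "parking_lot" for m in G[n])
]
print(hits)  # ['building_A_t0', 'building_A_t1']
```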

  6. Uniform Title in Theory and in Slovenian and Croatian Cataloguing Practice

    Directory of Open Access Journals (Sweden)

    Marija Petek

    2013-09-01

    Full Text Available Purpose: The paper investigates the importance and development of the uniform title, which enables collocation in the library catalogue. Research results on the use of uniform titles in two union catalogues, the Slovenian COBISS and the Croatian CROLIST, are also presented. Methodology/approach: Theoretical aspects of the uniform title are treated: first by Panizzi, then in the Paris Principles, which formed the basis for Verona's cataloguing code; in the latest International Cataloguing Principles, including the conceptual models Functional Requirements for Bibliographic Records (FRBR) and Functional Requirements for Authority Data (FRAD); and, last but not least, in the international cataloguing code Resource Description and Access (RDA). To find out whether uniform titles are used consistently according to Verona's cataloguing code and to the requirements of the bibliographic formats COMARC and UNIMARC, the frequency of tags 300 and 500 in bibliographic records is explored. Results: The research results indicate that the use of uniform titles in COBISS and CROLIST is not satisfactory and that tags 300 and 500 are often missing in bibliographic records. In online catalogues special attention should be given to the uniform title, as it is considered an efficient linking device in the catalogue and enables collocation. Research limitations: The research is limited to bibliographic records for translations of works of personal authors and of anonymous works; corporate authors are not included. Originality/practical implications: Presenting the development of the uniform title from the very beginning up to now, and the first research on the uniform title in COBISS.

  7. IAEA Publications Catalogue 2013-2014 - full details of publications published 2012-2014, forthcoming publications and a stocklist of publications published 2010-2013

    International Nuclear Information System (INIS)

    2013-08-01

    This publications catalogue lists all sales publications of the IAEA published in 2012 and 2013 and those forthcoming in 2013-2014. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  8. IAEA Publications Catalogue 2014-2015 - full details of publications published 2013-2015, forthcoming publications and a stocklist of publications published 2011-2014

    International Nuclear Information System (INIS)

    2014-07-01

    This publications catalogue lists all sales publications of the IAEA published in 2013 and 2014 and those forthcoming in 2014-2015. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  9. IAEA Publications Catalogue 2015-2016 - full details of publications published 2014-2016, forthcoming publications and a stocklist of publications published 2012-2015

    International Nuclear Information System (INIS)

    2015-01-01

    This publications catalogue lists all sales publications of the IAEA published in 2014 and 2015 and those forthcoming in 2015-2016. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  10. Library Catalogue Users Are Influenced by Trends in Web Searching Search Strategies. A review of: Novotny, Eric. “I Don’t Think I Click: A Protocol Analysis Study of Use of a Library Online Catalog in the Internet Age.” College & Research Libraries, 65.6 (Nov. 2004): 525-37.

    Directory of Open Access Journals (Sweden)

    Susan Haigh

    2006-09-01

    Full Text Available Objective – To explore how Web-savvy users think about and search an online catalogue. Design – Protocol analysis study. Setting – Academic library (Pennsylvania State University Libraries). Subjects – Eighteen users (17 students, 1 faculty member) of an online public access catalog, divided into two groups of nine first-time and nine experienced users. Method – The study team developed five tasks that represented a range of activities commonly performed by library users, such as searching for a specific item, identifying a library location, and requesting a copy. Seventeen students and one faculty member, divided evenly between novice and experienced searchers, were recruited to “think aloud” through the performance of the tasks. Data were gathered through audio recordings, screen capture software, and investigator notes. The time taken for each task was recorded, and investigators rated task completion as “successful,” “partially successful,” “fail,” or “search aborted.” After the searching session, participants were interviewed to clarify their actions and provide further commentary on the catalogue search. Main results – Participants in both test groups were relatively unsophisticated subject searchers. They made minimal use of Boolean operators, and tended not to repair failed searches by rethinking the search vocabulary and using synonyms. Participants did not have a strong understanding of library catalogue contents or structure and showed little curiosity in developing an understanding of how to utilize the catalogue. Novice users were impatient both in choosing search options and in evaluating their search results. They assumed search results were sorted by relevance, and thus would not typically browse past the initial screen. They quickly followed links, fearlessly tried different searches and options, and rapidly abandoned false trails. Experienced users were more effective and efficient searchers than

  11. Energy research projects in the Nordic countries - catalogue 1983

    International Nuclear Information System (INIS)

    1983-01-01

    The Nordic energy ministers at their meeting February 9, 1982 agreed upon a working plan for the Nordic energy cooperation. As part of this plan a contact group was established in order to maintain coordination and cooperation within the area of energy research and development. This group decided in April 1982 to establish a catalogue of energy research projects in the Nordic countries. A pilot catalogue was published in June 1982. The 1983 catalogue gives an up-to-date survey of energy research and development projects in the Nordic countries. About 2125 projects are described, and information is given on investigator(s), performing organization, financing body, funds, and period. The catalogue is prepared by the Nordic energy libraries through their cooperation in the Nordic Atomic Libraries Joint Secretariat. The information is also included in the database Nordic Energy Index (NEI), which is online accessible at I/S Datacentralen, Copenhagen, via EURONET, SCANNET, TYMNET, and TELENET. (BP)

  12. Remapping dark matter halo catalogues between cosmological simulations

    Science.gov (United States)

    Mead, A. J.; Peacock, J. A.

    2014-05-01

    We present and test a method for modifying the catalogue of dark matter haloes produced from a given cosmological simulation, so that it resembles the result of a simulation with an entirely different set of parameters. This extends the method of Angulo & White, which rescales the full particle distribution from a simulation. Working directly with the halo catalogue offers an advantage in speed, and also allows modifications of the internal structure of the haloes to account for non-linear differences between cosmologies. Our method can be used directly on a halo catalogue in a self-contained manner without any additional information about the overall density field; although the large-scale displacement field is required by the method, this can be inferred from the halo catalogue alone. We show proof of concept of our method by rescaling a matter-only simulation with no baryon acoustic oscillation (BAO) features to a more standard Λ cold dark matter model containing a cosmological constant and a BAO signal. In conjunction with the halo occupation approach, this method provides a basis for the rapid generation of mock galaxy samples spanning a wide range of cosmological parameters.

  13. Towards a Next-Generation Catalogue Cross-Match Service

    Science.gov (United States)

    Pineau, F.; Boch, T.; Derriere, S.; Arches Consortium

    2015-09-01

    In the past we have developed several catalogue cross-match tools. On one hand, the CDS XMatch service (Pineau et al. 2011) is able to perform basic but very efficient cross-matches, scalable to the largest catalogues on a single regular server. On the other hand, as part of the European project ARCHES1, we have been developing a generic and flexible tool which performs potentially complex multi-catalogue cross-matches and which computes probabilities of association based on a novel statistical framework. Although the two approaches have so far been managed as different tracks, the need for next-generation cross-match services dealing with both efficiency and complexity is becoming pressing with forthcoming projects which will produce huge, high-quality catalogues. We are addressing this challenge, which is both theoretical and technical. In ARCHES we generalize to N catalogues the candidate selection criteria - based on the chi-square distribution - described in Pineau et al. (2011). We formulate and test a number of Bayesian hypotheses, which necessarily increases dramatically with the number of catalogues. To assign a probability to each hypothesis, we rely on estimated priors which account for local densities of sources. We validated our developments by comparing the theoretical curves we derived with the results of Monte-Carlo simulations. The current prototype is able to take into account heterogeneous positional errors, object extension and proper motion. The technical complexity is managed by OO programming design patterns and SQL-like functionalities. Large tasks are split into smaller independent pieces for scalability. Performance is achieved by resorting to multi-threading, sequential reads and several tree data structures. In addition to kd-trees, we account for heterogeneous positional errors and object extension using M-trees. Proper motions are supported using a modified M-tree we developed, inspired by Time Parametrized R-trees (TPR
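
    For two catalogues with circular positional errors, the chi-square candidate selection mentioned above reduces to comparing the error-normalised angular separation against a chi-square quantile with two degrees of freedom. The sketch below uses Astropy for the nearest-neighbour match; the catalogue values and error sizes are made up for illustration, and the full multi-catalogue, multi-hypothesis machinery of ARCHES is not reproduced.

```python
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord
from scipy.stats import chi2

# Two toy catalogues with positional uncertainties (arcsec, 1-sigma, circular).
cat1 = SkyCoord(ra=[10.001, 10.500] * u.deg, dec=[41.000, 41.200] * u.deg)
cat2 = SkyCoord(ra=[10.0012, 10.700] * u.deg, dec=[41.0001, 41.100] * u.deg)
sig1 = np.array([0.5, 0.5])   # arcsec
sig2 = np.array([0.7, 0.7])   # arcsec

# Nearest neighbour in cat2 for every source in cat1.
idx, d2d, _ = cat1.match_to_catalog_sky(cat2)

# Chi-square criterion with 2 degrees of freedom at 99.7% completeness.
sep = d2d.to(u.arcsec).value
chi2_val = sep**2 / (sig1**2 + sig2[idx]**2)
keep = chi2_val <= chi2.ppf(0.997, df=2)

for i, (j, ok) in enumerate(zip(idx, keep)):
    print(f"cat1[{i}] -> cat2[{j}]: sep={sep[i]:.2f} arcsec, candidate={bool(ok)}")
```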

  14. Planck 2015 results: XXVIII. The Planck Catalogue of Galactic cold clumps

    DEFF Research Database (Denmark)

    Ade, P. A R; Aghanim, N.; Arnaud, M.

    2016-01-01

    We present the Planck Catalogue of Galactic Cold Clumps (PGCC), an all-sky catalogue of Galactic cold clump candidates detected by Planck. This catalogue is the full version of the Early Cold Core (ECC) catalogue, which was made available in 2011 with the Early Release Compact Source Catalogue (E...

  15. Injury surveillance in low-resource settings using Geospatial and Social Web technologies

    Directory of Open Access Journals (Sweden)

    Schuurman Nadine

    2010-05-01

    Full Text Available Abstract Background Extensive public health gains have benefited high-income countries in recent decades; however, citizens of low and middle-income countries (LMIC) have largely not enjoyed the same advancements. This is in part due to the fact that public health data - the foundation for public health advances - are rarely collected in many LMIC. Injury data are particularly scarce in many low-resource settings, despite the huge associated burden of morbidity and mortality. Advances in freely accessible and easy-to-use information and communication (ICT) technology may provide the impetus for increased public health data collection in settings with limited financial and personnel resources. Methods and Results A pilot study was conducted at a hospital in Cape Town, South Africa to assess the utility and feasibility of using free (non-licensed) and easy-to-use Social Web and GeoWeb tools for injury surveillance in low-resource settings. Data entry, geocoding, data exploration, and data visualization were successfully conducted using these technologies, including Google Spreadsheet, Mapalist, BatchGeocode, and Google Earth. Conclusion This study examined the potential for Social Web and GeoWeb technologies to contribute to public health data collection and analysis in low-resource settings through an injury surveillance pilot study conducted in Cape Town, South Africa. The success of this study illustrates the great potential for these technologies to be leveraged for public health surveillance in resource-constrained environments, given their ease of use and low cost, and the sharing and collaboration capabilities they afford. The possibilities and potential limitations of these technologies are discussed in relation to the study, and to the field of public health in general.
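
    Free geocoding of the kind used in the pilot (there via Google tools such as BatchGeocode) can also be scripted with open libraries. The sketch below uses geopy with the OpenStreetMap Nominatim geocoder as a comparable free service; the address and user agent are placeholders, and this is not the workflow from the study.

```python
from geopy.geocoders import Nominatim

# Nominatim requires a descriptive user agent; the value here is a placeholder.
geolocator = Nominatim(user_agent="injury-surveillance-demo")

# Geocode an example facility address to latitude/longitude.
location = geolocator.geocode("Groote Schuur Hospital, Cape Town, South Africa")
if location is not None:
    print(location.latitude, location.longitude)
```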

  16. Mapping and Analysis of Forest and Land Fire Potential Using Geospatial Technology and Mathematical Modeling

    International Nuclear Information System (INIS)

    Suliman, M D H; Mahmud, M; Reba, M N M; S, L W

    2014-01-01

    Forest and land fires can have negative implications for forest ecosystems, biodiversity, air quality and soil structure. However, these implications can be minimized through an effective disaster management system. Effective disaster management mechanisms can be developed through an appropriate early warning system as well as an efficient delivery system. This study focuses on two aspects, namely mapping forest and land fire potential and delivering the information to users through a WebGIS application. Geospatial technology and mathematical modeling were used in this study for identifying, classifying and mapping potential burn areas. The mathematical model used is the Analytical Hierarchy Process (AHP), while the geospatial technologies involved include remote sensing, Geographic Information Systems (GIS) and digital field data collection. The entire state of Selangor was chosen as the study area based on the number of cases reported over the last two decades. The AHP model was designed to assess the comparison between the three main criteria of fuel, topography and human factors. Contributions of experts directly involved in forest and land fire fighting operations, comprising officials from the Fire and Rescue Department Malaysia, were also evaluated in this model. The study found that about 32.83 square kilometers of the total area of Selangor state have extreme potential for fire. The extreme potential areas identified are in Bestari Jaya and Kuala Langat High Ulu. Continuous information on forest and land fire potential is displayed in a WebGIS application on the internet. Displaying information through WebGIS applications is a better approach to support the decision-making process with a high level of confidence and near-real conditions. Agencies involved in disaster management such as Jawatankuasa Pengurusan Dan Bantuan Bencana (JPBB) at district, state and national level under the National Security Division and the Fire and Rescue
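
    A minimal sketch of the AHP weighting step described above, assuming an illustrative 3x3 pairwise comparison matrix for the fuel, topography and human-factor criteria (the study's actual expert judgements are not reproduced here). The priority vector is taken from the principal eigenvector and a consistency ratio is checked.

```python
# Derive AHP criterion weights from a pairwise comparison matrix.
import numpy as np

criteria = ["fuel", "topography", "human factors"]
A = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 1/2],
              [1/2, 2.0, 1.0]])               # illustrative Saaty-style judgements

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                               # normalised priority vector

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.58
print(dict(zip(criteria, w.round(3))), "CR =", round(cr, 3))
```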

  17. Mapping and Analysis of Forest and Land Fire Potential Using Geospatial Technology and Mathematical Modeling

    Science.gov (United States)

    Suliman, M. D. H.; Mahmud, M.; Reba, M. N. M.; S, L. W.

    2014-02-01

    Forest and land fires can have negative implications for forest ecosystems, biodiversity, air quality and soil structure. However, these implications can be minimized through an effective disaster management system. Effective disaster management mechanisms can be developed through an appropriate early warning system as well as an efficient delivery system. This study focuses on two aspects, namely mapping forest and land fire potential and delivering the information to users through a WebGIS application. Geospatial technology and mathematical modeling were used in this study for identifying, classifying and mapping potential burn areas. The mathematical model used is the Analytical Hierarchy Process (AHP), while the geospatial technologies involved include remote sensing, Geographic Information Systems (GIS) and digital field data collection. The entire state of Selangor was chosen as the study area based on the number of cases reported over the last two decades. The AHP model was designed to assess the comparison between the three main criteria of fuel, topography and human factors. Contributions of experts directly involved in forest and land fire fighting operations, comprising officials from the Fire and Rescue Department Malaysia, were also evaluated in this model. The study found that about 32.83 square kilometers of the total area of Selangor state have extreme potential for fire. The extreme potential areas identified are in Bestari Jaya and Kuala Langat High Ulu. Continuous information on forest and land fire potential is displayed in a WebGIS application on the internet. Displaying information through WebGIS applications is a better approach to support the decision-making process with a high level of confidence and near-real conditions. Agencies involved in disaster management such as Jawatankuasa Pengurusan Dan Bantuan Bencana (JPBB) at district, state and national level under the National Security Division and the Fire and Rescue

  18. Revelation of `Hidden' Balinese Geospatial Heritage on A Map

    Science.gov (United States)

    Soeria Atmadja, Dicky A. S.; Wikantika, Ketut; Budi Harto, Agung; Putra, Daffa Gifary M.

    2018-05-01

    Bali is not just about beautiful nature. It also has a unique and interesting cultural heritage, including `hidden' geospatial heritage. Tri Hita Karana is a Hindu concept of life consisting of human relations to God, to other humans and to nature (Parahiyangan, Pawongan and Palemahan). Based on it - in terms of geospatial aspects - the Balinese derived their spatial orientation, spatial planning and layout, and measurement, as well as color and typography. Introducing this particular heritage would be a very interesting contribution to Bali tourism. In response to these issues, a question arises on how to reveal this unique and highly valuable geospatial heritage on a map which can be used to introduce and disseminate it to tourists. Symbols (patterns and colors), orientation, distance, scale, layout and toponymy are well known elements of a map. There is a chance to apply Balinese geospatial heritage in representing these map elements.

  19. Australian comments on data catalogues

    Energy Technology Data Exchange (ETDEWEB)

    Symonds, J L [A.A.E.C. Research Establishment, Lucas Heights (Australia)

    1968-05-01

    Between the need for some neutron data and a final evaluated set of data, the need for an action file, a bibliographic and reference file or catalogue, and a data storage and retrieval file is discussed.

  20. Sensor Webs and Virtual Globes: Enabling Understanding of Changes in a partially Glaciated Watershed

    Science.gov (United States)

    Heavner, M.; Fatland, D. R.; Habermann, M.; Berner, L.; Hood, E.; Connor, C.; Galbraith, J.; Knuth, E.; O'Brien, W.

    2008-12-01

    The University of Alaska Southeast is currently implementing a sensor web identified as the SouthEast Alaska MOnitoring Network for Science, Telecommunications, Education, and Research (SEAMONSTER). SEAMONSTER is operating in the partially glaciated Mendenhall and Lemon Creek watersheds, in the Juneau area, on the margins of the Juneau Icefield. These watersheds are studied both for (1) long-term monitoring of changes and (2) detection and analysis of transient events (such as glacier lake outburst floods). The heterogeneous sensors (meteorologic, dual-frequency GPS, water quality, lake level, etc.), power and bandwidth constraints, and competing time scales of interest require autonomous reactivity of the sensor web. They also present challenges for operational management of the sensor web. The harsh conditions on the glaciers provide additional operating constraints. The tight integration of the sensor web and virtual globe enabling technologies enhances the project in multiple ways. We are utilizing virtual globe infrastructures to enhance both sensor web management and data access. SEAMONSTER utilizes virtual globes for education and public outreach, sensor web management, data dissemination, and enabling collaboration. Using a PostgreSQL database with GIS extensions coupled to the Open Geospatial Consortium (OGC) compliant GeoServer, we generate near-real-time, auto-updating geobrowser files of the data in multiple OGC standard formats (e.g. KML, WCS). Additionally, embedding wiki pages in this database allows the development of a geospatially aware wiki describing the projects for better public outreach and education. In this presentation we will describe how we have implemented these technologies to date, the lessons learned, and our efforts towards greater OGC standard implementation. A major focus will be on demonstrating how geobrowsers and virtual globes have made this project possible.
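
    A minimal sketch of the near-real-time export step described above, assuming a PostGIS-enabled PostgreSQL database. The connection string, table and column names are hypothetical, and simplekml stands in for whatever KML generation the project pairs with GeoServer.

```python
# Read recent sensor observations from PostGIS and write them out as KML.
import psycopg2
import simplekml

conn = psycopg2.connect("dbname=seamonster user=reader password=secret host=localhost")
cur = conn.cursor()
cur.execute("""
    SELECT station_name, ST_X(geom), ST_Y(geom), water_level, obs_time
    FROM lake_level_obs
    WHERE obs_time > now() - interval '1 hour'
""")

kml = simplekml.Kml()
for name, lon, lat, level, obs_time in cur.fetchall():
    point = kml.newpoint(name=name, coords=[(lon, lat)])
    point.description = f"water level {level} m at {obs_time}"
kml.save("latest_lake_levels.kml")

cur.close()
conn.close()
```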

  1. Bim and Gis: when Parametric Modeling Meets Geospatial Data

    Science.gov (United States)

    Barazzetti, L.; Banfi, F.

    2017-12-01

    Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software can perform advanced geospatial analyses, but it lacks several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings has limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, although research work is currently under way to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure scale (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by "pure" GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.
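
    As one concrete illustration of the "transformation between reference systems" operation mentioned above, the short sketch below re-projects a point from a projected CRS to WGS84 with pyproj. The EPSG codes and coordinates are illustrative assumptions, not taken from the paper's case study.

```python
# Re-project a point from UTM zone 32N to WGS84 (lon/lat).
from pyproj import Transformer

transformer = Transformer.from_crs("EPSG:32632", "EPSG:4326", always_xy=True)
easting, northing = 514942.0, 5034588.0      # an assumed point near Milan
lon, lat = transformer.transform(easting, northing)
print(f"lon={lon:.6f}, lat={lat:.6f}")
```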

  2. BIM AND GIS: WHEN PARAMETRIC MODELING MEETS GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2017-12-01

    Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software can perform advanced geospatial analyses, but it lacks several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings has limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, although research work is currently under way to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure scale (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by “pure” GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  3. Regional Geology Web Map Application Development: Javascript v2.0

    International Nuclear Information System (INIS)

    Russell, Glenn

    2017-01-01

    This is a milestone report for the FY2017 continuation of the Spent Fuel, Storage, and Waste Technology (SFSWT) program (formerly the Used Fuel Disposal (UFD) program) development of the Regional Geology Web Mapping Application by the Idaho National Laboratory Geospatial Science and Engineering group. This application was developed for general public use and is an interactive web-based application built in JavaScript to visualize, reference, and analyze pertinent US geological features for the SFSWT program. This tool is a version upgrade from Adobe Flex technology. It is designed to facilitate informed decision making regarding the geology of the continental US relevant to the SFSWT program.

  4. Regional Geology Web Map Application Development: Javascript v2.0

    Energy Technology Data Exchange (ETDEWEB)

    Russell, Glenn [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-19

    This is a milestone report for the FY2017 continuation of the Spent Fuel, Storage, and Waste Technology (SFSWT) program (formerly the Used Fuel Disposal (UFD) program) development of the Regional Geology Web Mapping Application by the Idaho National Laboratory Geospatial Science and Engineering group. This application was developed for general public use and is an interactive web-based application built in JavaScript to visualize, reference, and analyze pertinent US geological features for the SFSWT program. This tool is a version upgrade from Adobe Flex technology. It is designed to facilitate informed decision making regarding the geology of the continental US relevant to the SFSWT program.

  5. Assessing the catalogue module of Alice for window software ...

    African Journals Online (AJOL)

    The paper presents a general description of the Alice for Windows software with a detailed analysis of its catalogue module. It highlights the basic features of the module, such as add, edit, delete, the search field and the grab button. The cataloguing process is clearly delineated. The paper also discusses Alice for Windows ...

  6. A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs

    Science.gov (United States)

    Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav

    2015-01-01

    With the increasing abundance of technologies and smart devices, equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has now become effortless. This, in particular, is the case for photographs, with vast numbers being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the “right” information for the appropriate purpose. This paper describes an approach to create semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and implementation for searching and querying the semantic geospatial metadata repository to enable a user or third party system to find the information they are looking for. PMID:26205265
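
    A minimal sketch of reading the raw low-level geo-metadata (EXIF GPS tags) from a smartphone photograph, the starting point of the enrichment pipeline described above. It assumes a recent Pillow release and a placeholder file name; the paper's own data model and query service are not reproduced.

```python
# Extract GPS latitude/longitude from a photograph's EXIF block.
from PIL import Image, ExifTags

def gps_decimal(dms, ref):
    """Convert EXIF degrees/minutes/seconds to signed decimal degrees."""
    deg, minutes, seconds = (float(v) for v in dms)
    value = deg + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

img = Image.open("photo.jpg")                       # placeholder file
gps_ifd = img.getexif().get_ifd(0x8825)             # 0x8825 = GPSInfo IFD
tags = {ExifTags.GPSTAGS.get(k, k): v for k, v in gps_ifd.items()}

lat = gps_decimal(tags["GPSLatitude"], tags["GPSLatitudeRef"])
lon = gps_decimal(tags["GPSLongitude"], tags["GPSLongitudeRef"])
print(lat, lon)
```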

  7. Geospatial Information Service System Based on GeoSOT Grid & Encoding

    Directory of Open Access Journals (Sweden)

    LI Shizhong

    2016-12-01

    With the rapid development of space and earth observation technology, it is important to establish a multi-source, multi-scale and unified cross-platform reference for global data. In practice, the production and maintenance of geospatial data are scattered across different units, and the standard of the data grid varies between departments and systems. All this leads to a disunity of standards among different historical periods and organizations. For the geospatial information security library of the national high-resolution earth observation programme, there are demands for global display, associated retrieval, template applications and other integrated services for geospatial data. Based on the GeoSOT grid and encoding theory, a data subdivision and organization solution for globally unified grid-encoding management of the geospatial information security library has been proposed, and system-level analysis, research and design have been carried out. The experimental results show that the data organization and management method based on GeoSOT can significantly improve the overall efficiency of the geospatial information security service system.

  8. Analysis of the seismic catalogues for the Vrancea Region, Romania

    International Nuclear Information System (INIS)

    Romashkova, L.L.; Kossobokov, V.G.

    2005-11-01

    Vrancea (Romania) is a geographical region between the Eastern and Southern Carpathian Mountains. The region is characterized by a rather high level of seismic activity, mainly at intermediate depths (up to 200 km). These intermediate-depth earthquakes occur between 45 deg-46 deg N and 26 deg-27 deg E. The shallow earthquakes are dispersed over a much broader territory. We performed a comparative analysis of the earthquake catalogues available for the Vrancea region, aiming at the compilation of a data set that is as complete and homogeneous as possible, which will hopefully be used for the prediction of strong and possibly moderate earthquakes in the region by means of the M8 algorithm. The two catalogues under study are: 1) the Global Hypocenter Data Base catalogue, NEIC (GHDB, 1989) and 2) the local Vrancea seismic catalogue (Moldoveanu et al., 1995), and their updates. (author)

  9. Derivation of photometric redshifts for the 3XMM catalogue

    Science.gov (United States)

    Georgantopoulos, I.; Corral, A.; Mountrichas, G.; Ruiz, A.; Masoura, V.; Fotopoulou, S.; Watson, M.

    2017-10-01

    We present the results from our ESA Prodex project that aims to derive photometric redshifts for the 3XMM catalogue. The 3XMM-DR6 catalogue is the largest X-ray survey, containing 470,000 unique sources over 1000 sq. degrees. We cross-correlate the X-ray positions with optical and near-IR catalogues using Bayesian statistics. The optical catalogue used so far is the SDSS, while currently we are employing the recently released Pan-STARRS catalogue. In the near-IR we use the VIKING, VHS and UKIDSS surveys and also the WISE W1 and W2 filters. The estimation of photometric redshifts is based on the TPZ software. The training sample is based on X-ray selected samples with available SDSS spectroscopy. We present here the results for the 40,000 3XMM sources with available SDSS counterparts. Our analysis provides very reliable photometric redshifts with sigma(mad)=0.05 and a fraction of outliers of 8% for the optically extended sources. We discuss the wide range of applications that are feasible using this unprecedented resource.
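
    An illustrative stand-in for the tree-based photometric-redshift step described above: train a random forest on sources with spectroscopic redshifts and evaluate the sigma(NMAD) quality metric. TPZ is a dedicated package; scikit-learn and the synthetic photometry below are only assumptions made for this sketch.

```python
# Toy photo-z estimation with a random forest and the NMAD quality metric.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
mags = rng.uniform(16, 24, size=(n, 5))                      # e.g. ugriz magnitudes
z_spec = 0.05 * (mags[:, 2] - 16) + rng.normal(0, 0.02, n)   # toy mag-z relation

X_train, X_test, y_train, y_test = train_test_split(mags, z_spec, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)
z_phot = rf.predict(X_test)

# Normalised median absolute deviation of (z_phot - z_spec) / (1 + z_spec).
dz = (z_phot - y_test) / (1 + y_test)
sigma_nmad = 1.4826 * np.median(np.abs(dz - np.median(dz)))
print(f"sigma_NMAD = {sigma_nmad:.3f}")
```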

  10. Web interfaces for bibliographic databases: the CERN scientific information system

    CERN Document Server

    Brugnolo, F

    1997-01-01

    Analysis of how to develop and organise a scientific information service based on the World Wide Web, the specific nature of word-oriented databases and the problems linked to information retrieval on the WWW. The analysis is carried out from both a theoretical and a practical point of view. The case of the CERN scientific information service is taken into account. We study the reorganisation of the whole architecture and the development of the Web user interface. We conclude with a description of the Personal Virtual Library service, developed for the CERN Library Catalogue.

  11. Developing a prenatal nursing care International Classification for Nursing Practice catalogue.

    Science.gov (United States)

    Liu, L; Coenen, A; Tao, H; Jansen, K R; Jiang, A L

    2017-09-01

    This study aimed to develop a prenatal nursing care catalogue of the International Classification for Nursing Practice. As a programme of the International Council of Nurses, the International Classification for Nursing Practice aims to support standardized electronic nursing documentation and facilitate the collection of comparable nursing data across settings. This initiative enables the study of relationships among nursing diagnoses, nursing interventions and nursing outcomes for best practice, healthcare management decisions, and policy development. The catalogues are usually focused on target populations. Pregnant women are the nursing population addressed in this project. According to the guidelines for catalogue development, three research steps were adopted: (a) identifying relevant nursing diagnoses, interventions and outcomes; (b) developing a conceptual framework for the catalogue; (c) expert validation. This project established a prenatal nursing care catalogue with 228 terms in total, including 69 nursing diagnoses, 92 nursing interventions and 67 nursing outcomes; among them, 57 nursing terms were newly developed. All terms in the catalogue were organized by a framework with two main categories, i.e. Expected Changes of Pregnancy and Pregnancy at Risk. Each category had four domains, representing the physical, psychological, behavioral and environmental perspectives of nursing practice. This catalogue can ease the documentation workload among prenatal care nurses, and facilitate the storage and retrieval of standardized data for many purposes, such as quality improvement, administrative decision support and research. The documentation of prenatal care provides data that can be more fluently communicated, compared and evaluated across various healthcare providers and clinical settings. © 2016 International Council of Nurses.

  12. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address

  13. SWOT analysis on National Common Geospatial Information Service Platform of China

    Science.gov (United States)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to the integrated service of geospatial information, such as the GOS of the U.S. Under these circumstances, surveying and mapping in China is inevitably shifting from 4D product services to NCGISPC (National Common Geospatial Information Service Platform of China)-centered services. Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are becoming more and more obvious, due to the emerging requirements of e-government construction, the remarkable development of IT and the growing online geospatial service demands of various lines of business. NCGISPC, which aims to provide authoritative online one-stop geospatial information services and APIs for further development to government, business and the public, is now the strategic core of the SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using a SWOT (Strength, Weakness, Opportunity and Threat) analysis to compare it to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future geospatial information service mode of China, and will surely have a great impact not only on the construction of digital China, but also on the way everyone uses geospatial information services.

  14. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin

    2010-03-01

    Geospatial information systems provide an abundance of information for researchers and scientists. Unfortunately this type of data can usually only be analyzed a few megapixels at a time, giving researchers a very narrow view into these voluminous data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented levels expediting analysis, interrogation, and discovery. ©2010 IEEE.

  15. Modeling photovoltaic diffusion: an analysis of geospatial datasets

    International Nuclear Information System (INIS)

    Davidson, Carolyn; Drury, Easan; Lopez, Anthony; Elmore, Ryan; Margolis, Robert

    2014-01-01

    This study combines address-level residential photovoltaic (PV) adoption trends in California with several types of geospatial information—population demographics, housing characteristics, foreclosure rates, solar irradiance, vehicle ownership preferences, and others—to identify which subsets of geospatial information are the best predictors of historical PV adoption. Number of rooms, heating source and house age were key variables that had not been previously explored in the literature, but are consistent with the expected profile of a PV adopter. The strong relationship provided by foreclosure indicators and mortgage status has less of an intuitive connection to PV adoption, but these variables may be highly correlated with characteristics inherent in PV adopters. Next, we explore how these predictive factors and model performance vary between different Investor Owned Utility (IOU) regions in California, and at different spatial scales. Results suggest that models trained with small subsets of geospatial information (five to eight variables) may provide similar explanatory power as models using hundreds of geospatial variables. Further, the predictive performance of models generally decreases at higher resolution, i.e., below ZIP code level, since several geospatial variables with coarse native resolution become less useful for representing high resolution variations in PV adoption trends. However, for California we find that model performance improves if parameters are trained at the regional IOU level rather than the state-wide level. We also find that models trained within one IOU region are generally representative for other IOU regions in CA, suggesting that a model trained with data from one state may be applicable in another state. (letter)
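
    A hedged sketch of the kind of analysis described above: a small subset of geospatial predictors is used to explain a PV-adoption indicator and evaluated by cross-validation. The variable names and synthetic data are assumptions, not the study's address-level California data.

```python
# Cross-validated regression of a PV-adoption indicator on a few geospatial variables.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "rooms": rng.integers(3, 10, n),
    "house_age": rng.integers(0, 80, n),
    "solar_irradiance": rng.uniform(4.5, 6.5, n),   # kWh/m2/day (assumed units)
    "foreclosure_rate": rng.uniform(0.0, 0.1, n),
})
# Synthetic response: PV adopters per 1000 households.
df["pv_adoption"] = (0.8 * df.rooms - 0.05 * df.house_age
                     + 2.0 * df.solar_irradiance
                     - 20.0 * df.foreclosure_rate
                     + rng.normal(0, 1, n))

X, y = df.drop(columns="pv_adoption"), df["pv_adoption"]
scores = cross_val_score(LinearRegression(), X, y, cv=5, scoring="r2")
print("mean cross-validated R^2:", round(scores.mean(), 3))
```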

  16. IAEA Publications Catalogue 2016-2017 - full details of publications published 2015-2016, forthcoming publications 2016-2017 and a stocklist of publications published 2013-2016

    International Nuclear Information System (INIS)

    2016-08-01

    This publications catalogue lists all sales publications of the IAEA published in 2015-2016 and those forthcoming in 2016-2017. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  17. Geospatial Services in Special Libraries: A Needs Assessment Perspective

    Science.gov (United States)

    Barnes, Ilana

    2013-01-01

    Once limited to geographers and mapmakers, Geographic Information Systems (GIS) have taken on a growing central role in information management and visualization. Geospatial services run the gamut of different products and services, from Google Maps to ArcGIS servers to mobile development. Geospatial services are not new. Libraries have been writing about…

  18. The Database of the Catalogue of Clinical Practice Guidelines Published via Internet in the Czech Language -The Current State

    Czech Academy of Sciences Publication Activity Database

    Zvolský, Miroslav

    2010-01-01

    Roč. 6, č. 1 (2010), s. 83-89 ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : internet * World Wide Web * database * clinical practice guideline * clinical practice * evidence-based medicine * formalisation * GLIF (Guideline Interchange Format) * doctor of medicine * decision support systems Subject RIV: IN - Informatics, Computer Science http://www.ejbi.org/en/ejbi/article/63-en-the-database-of-the-catalogue-of-clinical-practice-guidelines-published-via-internet-in-the-czech-language-the-current-state.html

  19. AN AUTOMATED END-TO-END MULTI-AGENT QOS BASED ARCHITECTURE FOR SELECTION OF GEOSPATIAL WEB SERVICES

    Directory of Open Access Journals (Sweden)

    M. Shah

    2012-07-01

    With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to the service requester based on QoS.
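
    A minimal sketch of QoS-based selection in the spirit of the abstract above: functionally equivalent candidate services are ranked by a weighted sum over normalised non-functional properties and the best fit is returned. The attributes, weights and candidate values are illustrative; they are not the paper's multi-agent architecture.

```python
# Rank candidate services by a weighted, normalised QoS score.
candidates = {
    "WMS-A": {"response_ms": 220, "availability": 0.999, "cost": 0.00},
    "WMS-B": {"response_ms": 90,  "availability": 0.990, "cost": 0.02},
    "WMS-C": {"response_ms": 150, "availability": 0.995, "cost": 0.01},
}
weights = {"response_ms": 0.4, "availability": 0.4, "cost": 0.2}
higher_is_better = {"response_ms": False, "availability": True, "cost": False}

def normalise(attr, value):
    """Scale an attribute to [0, 1] so that 1 is always the best value."""
    vals = [c[attr] for c in candidates.values()]
    lo, hi = min(vals), max(vals)
    if hi == lo:
        return 1.0
    score = (value - lo) / (hi - lo)
    return score if higher_is_better[attr] else 1.0 - score

def qos_score(props):
    return sum(weights[a] * normalise(a, v) for a, v in props.items())

scores = {name: round(qos_score(p), 3) for name, p in candidates.items()}
best = max(scores, key=scores.get)
print(best, scores)
```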

  20. Catalogue of theses

    International Nuclear Information System (INIS)

    Paranjpe, S.V.

    1975-01-01

    The catalogue lists 442 theses submitted by the scientists of the Bhabha Atomic Research Centre, since its inception, to the various universities in India and abroad for the award of M. Sc. and Ph. D. degrees. Theses are grouped under broad subject headings which are arranged in the order of Universal Decimal Classification Scheme. In addition to the author and guide index, a detailed subject index is appended which enhances the utility of the compilation. (S.V.P.)

  1. The Value of Information - Accounting for a New Geospatial Paradigm

    Science.gov (United States)

    Pearlman, J.; Coote, A. M.

    2014-12-01

    A new frontier in the consideration of socio-economic benefit is valuing information as an asset, often referred to as Infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet clearly for many of the largest corporations, such as Google and Facebook, it is their principal asset. This is exacerbated for public sector organizations, as those that are information-centric rather than information-enabled are relatively few - statistics, archiving and mapping agencies are perhaps the only examples - so it's not at the top of the agenda for Government. However, it is a hugely important issue when valuing geospatial data and information. Geospatial data allows public institutions to operate, and facilitates the provision of essential services for emergency response and national defense. In this respect, geospatial data is strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciates in the same way that physical infrastructure depreciates. It needs to be maintained, otherwise its functionality and value in use decline. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.

  2. Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data

    Science.gov (United States)

    Zhao, Peisheng; Lynnes, Christopher; Vollmer, Bruce; Savtchenko, Andrey; Theobald, Michael; Yang, Wenli

    2011-01-01

    The Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real-time Capability for EOS (LANCE) element at the Goddard Earth Sciences Data and Information Services Center (GES DISC) provides information on the global and regional atmospheric state, with very low temporal latency, to support climate research and improve weather forecasting. An open and interoperable platform is useful to facilitate access to, and integration of, LANCE AIRS NRT data. As Web services technology has matured in recent years, a new scalable Service-Oriented Architecture (SOA) is emerging as the basic platform for distributed computing and large networks of interoperable applications. Following the provide-register-discover-consume SOA paradigm, this presentation discusses how to use open-source geospatial software components to build Web services for publishing and accessing AIRS NRT data, explore the metadata relevant to registering and discovering data and services in the catalogue systems, and implement a Web portal to facilitate users' consumption of the data and services.
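
    A hedged sketch of the "consume" step of the provide-register-discover-consume pattern mentioned above, using OWSLib to read a WMS capabilities document and fetch a map image. The endpoint URL and layer name are placeholders, not the actual LANCE AIRS NRT service addresses.

```python
# List the layers of a WMS endpoint and download one rendered map.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/wms", version="1.3.0")  # placeholder URL
print(list(wms.contents))                  # layers advertised in GetCapabilities

img = wms.getmap(
    layers=["AIRS_L2_NRT_Temperature"],    # assumed layer name
    srs="EPSG:4326",
    bbox=(-180, -90, 180, 90),
    size=(1024, 512),
    format="image/png",
)
with open("airs_nrt.png", "wb") as f:
    f.write(img.read())
```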

  3. Generation of Multiple Metadata Formats from a Geospatial Data Repository

    Science.gov (United States)

    Hudspeth, W. B.; Benedict, K. K.; Scott, S.

    2012-12-01

    The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general purpose web services based upon the REST service model, and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. title, abstract, publication data, geographic extent, projection information, and database elements), but also the processing steps used to

  4. Development of a web application for water resources based on open source software

    Science.gov (United States)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents the research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, jQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  5. Netherlands Oil and Gas Catalogue 2010

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-04-15

    The Netherlands Oil and Gas Catalogue 2010 is a specialist publication from IRO (the Association of Dutch Suppliers in the Oil and Gas Industry) and Pedemex. A wealth of relevant business information has been collected with the greatest care for inclusion in this publication, and subsequently brought together in an easy-to-digest order. The catalogue is broken down into the following headings: (1) Engineering, Consultancy and Research; (2) Exploration, Drilling and Production; (3) Construction and Fabrication; (4) Contracting, Transport and Installation; (5) Equipment Supply; (6) Miscellaneous. In addition, by using keywords, you can rapidly identify the company you are looking for. A list is also enclosed with the names and contact details of IRO members, and the sectors in which they are active.

  6. M3.2.3 Personas Catalogue

    DEFF Research Database (Denmark)

    Guldbæk Rasmussen, Katja; Iversen, Rie; Petersen, Gitte

    This catalogue contains 7 personas developed for use in the Europeana projects. The premise of this work has been to find already existing personas within the domains of archives, museums and libraries in Europe. These have then been pared down to their essentials and rebuilt, using input from Europeana partners and research on behavior and search patterns. If you have never worked with personas before, please take the time to read the short introduction in the chapter about method. The personas, and a brief “How To”, are the central issue in this catalogue and are therefore placed at the front. For those wanting to dig a little deeper into how the personas were created, more in-depth material can be found in the chapters at the back.

  7. Uncertainty visualisation in the Model Web

    Science.gov (United States)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only few tools for visualisation of data in a standardised way exist. Furthermore, they are usually realised as thick clients, and lack the functionality to handle data coming from web services as envisaged in the Model Web. We present an interactive web tool for the visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan, zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions and statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool
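
    A minimal sketch of the raster case described above: a mean field and its standard deviation are read from a NetCDF file and mapped side by side. The file and variable names are assumptions about how an uncertain raster might be encoded.

```python
# Plot a mean field and its standard deviation from a NetCDF file.
import matplotlib.pyplot as plt
from netCDF4 import Dataset

ds = Dataset("pm10_forecast.nc")             # placeholder file
mean = ds.variables["pm10_mean"][:]          # 2-D mean field (assumed name)
std = ds.variables["pm10_std"][:]            # matching uncertainty field

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
for ax, field, title in ((ax1, mean, "PM10 mean"), (ax2, std, "PM10 std. dev.")):
    im = ax.imshow(field, origin="lower")
    ax.set_title(title)
    fig.colorbar(im, ax=ax)
plt.savefig("pm10_uncertainty.png", dpi=150)
```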

  8. Catalogue of ptyctimous mites (Acari, Oribatida) of the world.

    Science.gov (United States)

    NiedbaŁa, Wojciech; Liu, Dong

    2018-03-11

    As important representatives of Oribatida (Acari), ptyctimous mites comprise more than 1400 described species in 40 genera and subgenera, with a nearly cosmopolitan distribution except for the Arctic and Antarctic regions. They are capable of folding the aspidosoma under the opisthosoma to protect their appendages, and are primarily soil and litter inhabitants, feeding on fungi and decaying plant remains with various levels of specificity. Our purpose was to provide a detailed catalogue of all known ptyctimous mite species in the world with information on distribution, taxonomic issues and some remarks. Data on known juvenile instars of ptyctimous mites which were not included in Norton & Ermilov (2014) were added. We hope that our catalogue with bibliography will be helpful in taxonomic and ecological studies. The catalogue presents taxonomic information and the geographic distribution of 1431 known species of the world belonging to 42 genera and eight families (not including data on genus and species inquirenda, nomina nuda and species without author name). Among them, 261 species are listed as synonyms, 43 as species inquirenda and nine as homonyms; 17 new synonyms, one new subgenus (Mahuntritia subgen. nov.) and three new names are included in the catalogue.

  9. Version 2000 of the Catalogue of Galactic Planetary Nebulae

    Science.gov (United States)

    Kohoutek, L.

    2001-11-01

    The ``Catalogue of Galactic Planetary Nebulae (Version 2000)'' appears in Abhandlungen aus der Hamburger Sternwarte, Band XII in the year 2001. It is a continuation of CGPN(1967) and contains 1510 objects classified as galactic PNe up to the end of 1999. The lists of possible pre-PNe and possible post-PNe are also given. The catalogue is restricted only to the data belonging to the location and identification of the objects. It gives identification charts of PNe discovered since 1965 (published in the supplements to CGPN) and those charts of objects discovered earlier, which have wrong or uncertain identification. The question ``what is a planetary nebula'' is discussed and the typical values of PNe and of their central stars are summarized. Short statistics about the discoveries of PNe are given. The catalogue is also available in the Centre de Données, Strasbourg and at Hamburg Observatory via internet. The Catalogue is only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A/378/843

  10. INTERACT Station Catalogue - 2015

    DEFF Research Database (Denmark)

    INTERACT stations are located in all major environmental envelopes of the Arctic, providing an ideal platform for studying climate change and its impact on the environment and local communities. Since alpine environments face similar changes and challenges as the Arctic, the INTERACT network also ... The catalogue includes descriptions of 73 research stations included in the network at the time of printing.

  11. Planck 2013 results. XXIX. Planck catalogue of Sunyaev-Zeldovich sources

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.

    2013-01-01

    We describe the all-sky Planck catalogue of clusters and cluster candidates derived from Sunyaev-Zeldovich (SZ) effect detections using the first 15.5 months of Planck satellite observations. The catalogue contains 1227 entries, making it over six times the size of the Planck Early SZ (ESZ) sampl...

  12. Geospatial data sharing, online spatial analysis and processing of Indian Biodiversity data in Internet GIS domain - A case study for raster based online geo-processing

    Science.gov (United States)

    Karnatak, H.; Pandey, K.; Oberai, K.; Roy, A.; Joshi, D.; Singh, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.

    2014-11-01

    National Biodiversity Characterization at Landscape Level, a project jointly sponsored by the Department of Biotechnology and the Department of Space, was implemented to identify and map the potential biodiversity-rich areas in India. This project has generated spatial information at three levels, viz. satellite-based primary information (vegetation type map, spatial locations of roads and villages, fire occurrence), geospatially derived or modelled information (Disturbance Index, Fragmentation, Biological Richness) and geospatially referenced field sample plots. The study provides information on high-disturbance and high-biological-richness areas, suggesting future management strategies and helping formulate action plans. The study has generated for the first time a baseline database for India which will be a valuable input towards climate change studies in the Indian subcontinent. The spatial data generated during the study are organized as a central data repository in a geo-RDBMS environment using PostgreSQL and PostGIS. The raster and vector data are published as OGC WMS and WFS standards for the development of a web-based geoinformation system using a Service Oriented Architecture (SOA). The WMS- and WFS-based system allows geo-visualization, online queries and map output generation based on user requests and responses. This is a typical mashup-architecture-based geo-information system which allows access to remote web services like ISRO Bhuvan, OpenStreetMap, Google Maps etc., with overlay on the biodiversity data for effective study of bio-resources. Spatial queries and analysis of vector data are achieved through SQL queries on PostGIS and WFS-T operations. But the most important challenge is to develop a system for online raster-based geospatial analysis and processing based on a user-defined Area of Interest (AOI) for large raster data sets. The map data of this study amount to approximately 20 GB for each of the five data layers. An attempt has been made to develop a system using
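
    A minimal sketch of the user-defined AOI step for raster processing described above: a raster layer is clipped to a polygon and summarised. The file name, band meaning and AOI are placeholders, not the project's actual Biological Richness grid.

```python
# Clip a raster to a user-defined AOI polygon and summarise the values.
import numpy as np
import rasterio
from rasterio.mask import mask
from shapely.geometry import box, mapping

aoi = box(77.0, 28.0, 78.0, 29.0)            # stand-in for a user-drawn AOI

with rasterio.open("biological_richness.tif") as src:   # placeholder file
    clipped, transform = mask(src, [mapping(aoi)], crop=True)
    nodata = src.nodata

band = clipped[0]
valid = band[band != nodata] if nodata is not None else band.ravel()
print("pixels in AOI:", valid.size, "mean value:", float(np.mean(valid)))
```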

  13. Creating of Central Geospatial Database of the Slovak Republic and Procedures of its Revision

    Science.gov (United States)

    Miškolci, M.; Šafář, V.; Šrámková, R.

    2016-06-01

    The article describes the creation of the initial three-dimensional geodatabase, from planning and design through the determination of technological and production processes to the practical use of the Central Geospatial Database (CGD; the official name in Slovak is Centrálna Priestorová Databáza - CPD), and briefly describes the procedures of its revision. CGD ensures the proper collection, processing, storing, transferring and displaying of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis management tasks and by the Integrated Rescue System. For military personnel, CGD is run on the MoD intranet; for other users outside the MoD it is transformed to ZbGIS (the Primary Geodatabase of the Slovak Republic) and run on a public web site. CGD is a global set of geospatial information. CGD is a vector computer model which completely covers the entire territory of Slovakia. The seamless CGD is created by digitizing the real world using photogrammetric stereoscopic methods and measurement of object properties. The basic vector model of CGD (from photogrammetric processing) is then taken to the field for inspection and additional gathering of object properties over the whole mapped area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD gives us the opportunity to get to know the territory comprehensively in all three spatial dimensions. Every entity in CGD has a recorded time of collection, which allows users to assess the timeliness of the information. CGD can be utilized for geographical analysis, geo-referencing, cartographic purposes as well as various special-purpose mapping, and has the ambition not only to cover the needs of the MoD, but to become a reference model for the national geographical infrastructure.

  14. SemantGeo: Powering Ecological and Environment Data Discovery and Search with Standards-Based Geospatial Reasoning

    Science.gov (United States)

    Seyed, P.; Ashby, B.; Khan, I.; Patton, E. W.; McGuinness, D. L.

    2013-12-01

    Recent efforts to create and leverage standards for geospatial data specification and inference include the GeoSPARQL standard, geospatial OWL ontologies (e.g., GAZ, Geonames), and RDF triple stores that support GeoSPARQL (e.g., AllegroGraph, Parliament) using RDF instance data for geospatial features of interest. However, there remains a gap on how best to fuse software engineering best practices and GeoSPARQL within semantic web applications to enable flexible search driven by geospatial reasoning. In this abstract we introduce the SemantGeo module for the SemantEco framework that helps fill this gap, enabling scientists to find data using geospatial semantics and reasoning. SemantGeo provides multiple types of geospatial reasoning for SemantEco modules. The server-side implementation uses the Parliament SPARQL endpoint accessed via a Tomcat servlet. SemantGeo uses the Google Maps API for user-specified polygon construction and JsTree for providing containment and categorical hierarchies for search. SemantGeo uses GeoSPARQL for spatial reasoning alone and in concert with RDFS/OWL reasoning capabilities to determine, e.g., which geofeatures are within, partially overlap with, or lie within a certain distance of a given polygon. We also leverage qualitative relationships defined by the Gazetteer ontology that are composites of spatial relationships as well as administrative designations or geophysical phenomena. We provide multiple mechanisms for exploring data, such as polygon (map-based) and named-feature (hierarchy-based) selection, that enable flexible search constraints using boolean combinations of selections. JsTree-based hierarchical search facets present named features and include a 'part of' hierarchy (e.g., measurement-site-01, Lake George, Adirondack Region, NY State) and type hierarchies (e.g., nodes in the hierarchy for WaterBody, Park, MeasurementSite), depending on the 'axis of choice' option selected. Using GeoSPARQL and the aforementioned ontology
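
    A hedged example of the kind of GeoSPARQL query such a module could send to a Parliament endpoint: select features whose geometry lies inside a user-drawn polygon. The endpoint URL, class IRI and WKT polygon are placeholders rather than SemantGeo's actual vocabulary.

```python
# Run a GeoSPARQL containment query against a SPARQL endpoint.
from SPARQLWrapper import SPARQLWrapper, JSON

query = """
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>

SELECT ?site ?wkt WHERE {
  ?site a <http://example.org/semanteco#MeasurementSite> ;
        geo:hasGeometry/geo:asWKT ?wkt .
  FILTER(geof:sfWithin(?wkt,
    "POLYGON((-74.0 43.0, -73.5 43.0, -73.5 43.8, -74.0 43.8, -74.0 43.0))"^^geo:wktLiteral))
}
"""

endpoint = SPARQLWrapper("http://localhost:8080/parliament/sparql")  # assumed URL
endpoint.setQuery(query)
endpoint.setReturnFormat(JSON)
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["site"]["value"])
```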

  15. Geo-spatial technologies in urban environments policy, practice, and pixels

    CERN Document Server

    Jensen, Ryan R; McLean, Daniel

    2004-01-01

    Using Geospatial Technologies in Urban Environments simultaneously fills two gaping vacuums in the scholarly literature on urban geography. The first is the clear and straightforward application of geospatial technologies to practical urban issues. By using remote sensing and statistical techniques (correlation-regression analysis, the expansion method, factor analysis, and analysis of variance), the authors of these 12 chapters contribute significantly to our understanding of how geospatial methodologies enhance urban studies. For example, the GIS Specialty Group of the Association of American Geographers (AAG) has the largest membership of all the AAG specialty groups, followed by the Urban Geography Specialty Group. Moreover, the Urban Geography Specialty Group has the largest number of cross-memberships with the GIS Specialty Group. This book advances this important geospatial and urban link. Second, the book fills a wide void in the urban-environment literature. Although the Annals of the Association of ...

  16. Towards Geo-spatial Hypermedia: Concepts and Prototype Implementation

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Vestergaard, Peter Posselt; Ørbæk, Peter

    2002-01-01

    This paper combines spatial hypermedia with techniques from Geographical Information Systems and location based services. We describe the Topos 3D Spatial Hypermedia system and how it has been developed to support geo-spatial hypermedia coupling hypermedia information to model representations of real world buildings and landscapes. The prototype experiments are primarily aimed at supporting architects and landscape architects in their work on site. Here it is useful to be able to superimpose and add different layers of information to, e.g. a landscape depending on the task being worked on. We ... and indirect navigation. Finally, we conclude with a number of research issues which are central to the future development of geo-spatial hypermedia, including design issues in combining metaphorical and literal hypermedia space, as well as a discussion of the role of spatial parsing in a geo-spatial context.

  17. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    Science.gov (United States)

    Hu, H.; Ge, Y. J.

    2013-11-01

    As social networking and network socialisation have brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study phenomena and disciplines in the natural sciences has prompted many specialists and scholars to innovate in their research. Though politics has been integrally involved in hyperlinked-world issues since the 1990s, and the automatic assembly of different geospatial web and distributed geospatial information systems utilizing service chaining has been explored and built recently, the information collection and data visualisation of geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpected characteristics of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology and text analysis methods. The results indicate that the tag cloud, frequency map, attitude pie chart, individual mention ratios and dissemination flow graph based on the collected and processed data highlight not only the subject and theme vocabularies of related topics but also certain issues and problems behind them. Being able to express the time-space relationship of text information and to disseminate information regarding geo-events, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion on political events.
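
    A minimal sketch of the word-frequency part of the text analysis described above: fetch a page, strip the markup and count terms. The URL is a placeholder, and a production crawler (the paper builds on Heritrix) would also handle politeness, deduplication and link scheduling.

```python
# Fetch one page and count the most frequent terms.
import re
from collections import Counter

import requests
from bs4 import BeautifulSoup

url = "https://example.org/news/huangyan-island"   # placeholder URL
html = requests.get(url, timeout=10).text
text = BeautifulSoup(html, "html.parser").get_text(" ")

tokens = re.findall(r"[A-Za-z]{3,}", text.lower())
stopwords = {"the", "and", "for", "that", "with", "are", "was", "this"}
freq = Counter(t for t in tokens if t not in stopwords)
print(freq.most_common(20))
```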

  18. HR diagrams derived from the Michigan Spectral Catalogue

    International Nuclear Information System (INIS)

    Houk, N.; Fesen, R.

    1978-01-01

    The authors present some HR diagrams constructed using data from the Michigan Spectral Catalogues. Houk (1975) has been systematically reclassifying the Henry Draper stars on the MK system, from the south pole northward. Objective-prism plates, with a reciprocal dispersion of 108 A/mm, have been taken with the Michigan Curtis Schmidt telescope at Cerro Tololo Inter-American Observatory in Chile. The spectra are classified visually from the plates, and the results are put onto IBM cards and magnetic tape from which the catalogues are produced. (Auth.)

  19. VizieR Online Data Catalog: A unified supernova catalogue (Lennarz+, 2012)

    Science.gov (United States)

    Lennarz, D.; Altmann, D.; Wiebusch, C.

    2011-11-01

    A supernova catalogue containing data for 5526 extragalactic supernovae that were discovered up to 2010 December 31. It combines several catalogues that are currently available online in a consistent and traceable way. During the comparison of the catalogues inconsistent entries were identified and resolved where possible. Remaining inconsistencies are marked transparently and can be easily identified. Thus it is possible to select a high-quality sample in a most simple way. Where available, redshift-based distance estimates to the supernovae were replaced by journal-refereed distances. (1 data file).

  20. Soil Monitor: an open source web application for real-time soil sealing monitoring and assessment

    Science.gov (United States)

    Langella, Giuliano; Basile, Angelo; Giannecchini, Simone; Iamarino, Michela; Munafò, Michele; Terribile, Fabio

    2016-04-01

    Soil sealing is one of the most important causes of land degradation and desertification. In Europe, the area of soil covered by impermeable materials has increased by about 80% since the Second World War, while the population has grown by only one third. There is increasing concern at high political levels about the need to limit imperviousness itself and its effects on soil functions. The European Commission promulgated a roadmap (COM(2011) 571) under which net land take would reach zero by 2050, and in 2011 it also published a report providing best practices and guidelines for limiting soil sealing and imperviousness. In this scenario, we developed an open-source-based Soil Sealing Geospatial Cyber Infrastructure (SS-GCI) named "Soil Monitor". This tool merges a webGIS with parallel geospatial computation in a fast and dynamic fashion in order to provide real-time assessments of soil sealing at high spatial resolution (20 metres and below) over the whole of Italy. Common open source webGIS packages, such as GeoServer and MapStore, are used to implement both the data management and the visualization infrastructures. The high-speed geospatial computation is ensured by GPU parallelism using the CUDA (Compute Unified Device Architecture) framework by NVIDIA®. This kind of parallelism required writing, from scratch, all the code needed to perform the geospatial computation behind the soil sealing toolbox. The combination of GPU computing with webGIS infrastructures is relatively novel and required particular attention to the Java-CUDA programming interface. As a result, Soil Monitor can perform very time-consuming calculations (for instance, querying an Italian administrative region as the area of interest) in less than one minute. The web application runs in a web browser and nothing must be installed before using it. Potentially everybody can use it, but the main targets are the
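
    The GPU code behind Soil Monitor is not published in the abstract; as a rough CPU stand-in, the sketch below (NumPy, synthetic arrays) shows the kind of per-area aggregation involved: the fraction of sealed pixels inside an area-of-interest mask. The array names, the 30% threshold and the raster shape are assumptions for illustration.

```python
# CPU stand-in for the kind of aggregation Soil Monitor runs on the GPU:
# fraction of sealed (impervious) pixels inside an area of interest.
import numpy as np


def sealed_fraction(imperviousness, aoi_mask, threshold=30):
    """imperviousness: 2-D array of imperviousness percentages (0-100);
    aoi_mask: boolean array of the same shape selecting the region of interest;
    threshold: percentage above which a pixel counts as sealed (assumed)."""
    sealed = (imperviousness >= threshold) & aoi_mask
    n_aoi = int(aoi_mask.sum())
    return sealed.sum() / n_aoi if n_aoi else float("nan")


# Example with synthetic data standing in for a 20 m resolution raster:
imp = np.random.randint(0, 101, size=(1000, 1000))
aoi = np.zeros_like(imp, dtype=bool)
aoi[200:800, 300:900] = True
print(f"sealed fraction: {sealed_fraction(imp, aoi):.3f}")
```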

  1. Points of View: Herbert Bayer’s Exhibition Catalogue for the 1930 Section Allemande

    Directory of Open Access Journals (Sweden)

    Wallis Miller

    2017-01-01

    Full Text Available Sigfried Giedion called Herbert Bayer’s exhibition catalogue for the 1930 'Section Allemande' a “minor typographical masterpiece.” Like similar catalogues, it is inexpensive, provides an inventory list, has an introduction, functions as a guide, and is illustrated. However, the majority of its images are of installations, not their contents. Bayer accommodates the catalogue type for applied arts exhibitions by listing installations as objects, but he confronts the type by showing installations as display contexts that establish points of view, emulating, idealizing and interpreting the experience of the exhibition. By independently constructing ways of seeing and understanding the exhibition, the catalogue resists being an appendage to the exhibition, despite their close relationship. Giedion may have viewed Bayer’s catalogue as an important but secondary work of graphic design, but this article argues that it is of primary significance as an exhibition catalogue, an unusual essay on the book typology that is conscious of its history while moving outside — to other types of book design and to exhibitions — to transform it.

  2. Selling pictures: the illustrated auction catalogue

    Directory of Open Access Journals (Sweden)

    Elizabeth Pergam

    2014-12-01

    Full Text Available This essay is based upon a survey of reproductions in auction catalogues – from their first appearance in the early eighteenth century until their more consistent use in the second decade of the twentieth century. Examining the role of these illustrations sheds light on how auctions functioned; it was not just the works of art that were traded, but knowledge about those works of art became currency to be exchanged. In contrast to the high end engravings and photographs of luxury illustrated art books, reproductions in auction catalogues – publications produced as ephemeral marketing tools – were of noticeably lower quality. This study of the status of reproductions, therefore, investigates the evolving understanding of art knowledge, both aesthetic and economic, and the interdependence of the market and connoisseurship.

  3. CMS offline web tools

    International Nuclear Information System (INIS)

    Metson, S; Newbold, D; Belforte, S; Kavka, C; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Tuura, L; Evans, D; Fanfani, A; Feichtinger, D; Kuznetsov, V; Lingen, F van; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor the various components of the computing system and to interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. Like its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Because of this distributed nature, effective provision of collaborative tools is essential to maximise the physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the World Wide Web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added to HEP experiments as an afterthought. In the CMS offline project we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and different practical knowledge of the systems to use the CMS computing system effectively. The CMS web tools project aims to provide a consistent interface to all these tools

  4. CMS offline web tools

    Energy Technology Data Exchange (ETDEWEB)

    Metson, S; Newbold, D [H.H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Belforte, S; Kavka, C [INFN, Sezione di Trieste (Italy); Bockelman, B [University of Nebraska Lincoln, Lincoln, NE (United States); Dziedziniewicz, K [CERN, Geneva (Switzerland); Egeland, R [University of Minnesota Twin Cities, Minneapolis, MN (United States); Elmer, P [Princeton (United States); Eulisse, G; Tuura, L [Northeastern University, Boston, MA (United States); Evans, D [Fermilab MS234, Batavia, IL (United States); Fanfani, A [Universita degli Studi di Bologna (Italy); Feichtinger, D [PSI, Villigen (Switzerland); Kuznetsov, V [Cornell University, Ithaca, NY (United States); Lingen, F van [California Institute of Technology, Pasedena, CA (United States); Wakefield, S [Blackett Laboratory, Imperial College, London (United Kingdom)

    2008-07-15

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor the various components of the computing system and to interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. Like its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Because of this distributed nature, effective provision of collaborative tools is essential to maximise the physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the World Wide Web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added to HEP experiments as an afterthought. In the CMS offline project we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and different practical knowledge of the systems to use the CMS computing system effectively. The CMS web tools project aims to provide a consistent interface to all these tools.

  5. New implementation of OGC Web Processing Service in Python programming language. PyWPS-4 and issues we are facing with processing of large raster data using OGC WPS

    OpenAIRE

    J. Čepický; L. M. de Sousa

    2016-01-01

    The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes and the client discovery of, and binding to, those processes within workflows. Data ...
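
    For readers unfamiliar with the WPS interface the record refers to, the sketch below issues the standard GetCapabilities and DescribeProcess key-value-pair requests against a hypothetical PyWPS endpoint; the endpoint URL and the process identifier are placeholders, not part of the original record.

```python
# Minimal OGC WPS 1.0.0 KVP requests (GetCapabilities / DescribeProcess).
# The endpoint URL and the process identifier are hypothetical.
import requests

WPS_URL = "https://example.org/wps"   # placeholder PyWPS endpoint

caps = requests.get(WPS_URL, params={
    "service": "WPS",
    "request": "GetCapabilities",
}, timeout=30)
print(caps.status_code, caps.headers.get("Content-Type"))

describe = requests.get(WPS_URL, params={
    "service": "WPS",
    "version": "1.0.0",
    "request": "DescribeProcess",
    "identifier": "buffer",           # hypothetical process name
}, timeout=30)
print(describe.text[:500])            # XML process description
```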

  6. International Atomic Energy Agency. Publications catalogue 2009 including full details of publications published in 2008-2009, forthcoming publications and a stocklist of publications published in 2006-2007

    International Nuclear Information System (INIS)

    2009-06-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2008 and 2009 and forthcoming in 2009. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  7. TopoCad - A unified system for geospatial data and services

    Science.gov (United States)

    Felus, Y. A.; Sagi, Y.; Regev, R.; Keinan, E.

    2013-10-01

    "E-government" is a leading trend in public sector activities in recent years. The Survey of Israel set as a vision to provide all of its services and datasets online. The TopoCad system is the latest software tool developed in order to unify a number of services and databases into one on-line and user friendly system. The TopoCad system is based on Web 1.0 technology; hence the customer is only a consumer of data. All data and services are accessible for the surveyors and geo-information professional in an easy and comfortable way. The future lies in Web 2.0 and Web 3.0 technologies through which professionals can upload their own data for quality control and future assimilation with the national database. A key issue in the development of this complex system was to implement a simple and easy (comfortable) user experience (UX). The user interface employs natural language dialog box in order to understand the user requirements. The system then links spatial data with alpha-numeric data in a flawless manner. The operation of the TopoCad requires no user guide or training. It is intuitive and self-taught. The system utilizes semantic engines and machine understanding technologies to link records from diverse databases in a meaningful way. Thus, the next generation of TopoCad will include five main modules: users and projects information, coordinates transformations and calculations services, geospatial data quality control, linking governmental systems and databases, smart forms and applications. The article describes the first stage of the TopoCad system and gives an overview of its future development.

  8. National Geospatial-Intelligence Agency Academic Research Program

    Science.gov (United States)

    Loomer, S. A.

    2004-12-01

    "Know the Earth.Show the Way." In fulfillment of its vision, the National Geospatial-Intelligence Agency (NGA) provides geospatial intelligence in all its forms and from whatever source-imagery, imagery intelligence, and geospatial data and information-to ensure the knowledge foundation for planning, decision, and action. To achieve this, NGA conducts a multi-disciplinary program of basic research in geospatial intelligence topics through grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program (NARP) are: - NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. - Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. - Director of Central Intelligence Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports and how other researchers and institutions can apply for grants under the program.

  9. A Spatial Data Infrastructure to Share Earth and Space Science Data

    Science.gov (United States)

    Nativi, S.; Mazzetti, P.; Bigagli, L.; Cuomo, V.

    2006-05-01

    A Spatial Data Infrastructure (SDI), also known as a Geospatial Data Infrastructure, is fundamentally a mechanism to facilitate the sharing and exchange of geospatial data. An SDI is a scheme necessary for the effective collection, management, access, delivery and utilization of geospatial data; it is important for objective decision making and sound land-based policy, for supporting economic development, and for encouraging socially and environmentally sustainable development. As far as data models and semantics are concerned, a valuable and effective SDI should be able to cross the boundaries between the Geographic Information System/Science (GIS) and Earth and Space Science (ESS) communities. Hence, an SDI should be able to discover, access and share information and data produced and managed by both the GIS and ESS communities in an integrated way. In other terms, an SDI must be built on a conceptual and technological framework which abstracts the nature and structure of the shared datasets: feature-based data or Imagery, Gridded and Coverage Data (IGCD). ISO TC211 and the Open Geospatial Consortium provided important artifacts to build up this framework. In particular, the OGC Web Services (OWS) initiatives and several Interoperability Experiments (e.g. the GALEON IE) are extremely useful for this purpose. We present an SDI solution which is able to manage both GIS and ESS datasets. It is based on OWS and other well-accepted or promising technologies, such as UNIDATA netCDF and CDM, ncML and ncML-GML. Moreover, it uses a specific technology to implement a distributed and federated system of catalogues: the GI-Cat. This technology performs data model mediation and protocol adaptation tasks. It is used to work out a metadata clearinghouse service, implementing a common (federal) catalogue model which is based on the ISO 19115 core metadata for geo-datasets. Nevertheless, other well-accepted or standard catalogue data models can easily be implemented as the common view (e.g. OGC CS-W, the next coming
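
    Since the clearinghouse described here exposes a catalogue view in the OGC style, a minimal discovery call might look like the sketch below: a plain CSW 2.0.2 KVP GetRecords query with a CQL full-text constraint. The catalogue URL and search term are placeholders, and no GI-Cat-specific behaviour is modelled.

```python
# Minimal OGC CSW 2.0.2 KVP GetRecords query (full-text search).
# The catalogue endpoint and the search term are placeholders.
import requests

CSW_URL = "https://example.org/csw"    # hypothetical clearinghouse endpoint

response = requests.get(CSW_URL, params={
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "resultType": "results",
    "elementSetName": "brief",
    "constraintLanguage": "CQL_TEXT",
    "constraint_language_version": "1.1.0",
    "constraint": "AnyText like '%netCDF%'",
    "maxRecords": "10",
}, timeout=30)
print(response.text[:800])             # brief catalogue records as XML
```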

  10. Catalogue of European earthquakes with intensities higher than 4

    International Nuclear Information System (INIS)

    Van Gils, J.M.; Leydecker, G.

    1991-01-01

    The catalogue of European earthquakes with intensities higher than 4 contains some 20 000 seismic events that happened in member countries of the European Communities, Switzerland and Austria. It was prepared on the basis of already existing national catalogues and includes historical data as well as present-day data. All historical data are harmonized as far as possible to the same intensity scale (MSK-scale) to make them suitable for computerization. Present-day data include instrumental and macroseismic data. Instrumental data are expressed in terms of magnitude (Richter scale) while macroseismic data are given in intensities. Compilation of seismic data can provide a basis for statistically supported studies of site selection procedures and the qualitative assessment of seismic risks. Three groups of seismic maps illustrate the content of the catalogue for different time periods and different intensities

  11. GASS-WEB: a web server for identifying enzyme active sites based on genetic algorithms.

    Science.gov (United States)

    Moraes, João P A; Pappa, Gisele L; Pires, Douglas E V; Izidoro, Sandro C

    2017-07-03

    Enzyme active sites are important and conserved functional regions of proteins whose identification can be an invaluable step toward protein function prediction. Most of the existing methods for this task are based on active site similarity and present limitations, including performing only exact matches on template residues, imposing template size restraints, and not being capable of finding inter-domain active sites. To fill this gap, we propose GASS-WEB, a user-friendly web server that uses GASS (Genetic Active Site Search), a method based on an evolutionary algorithm to search for similar active sites in proteins. GASS-WEB can be used under two different scenarios: (i) given a protein of interest, to match a set of specific active site templates; or (ii) given an active site template, to look for it in a database of protein structures. The method has been shown to be very effective in a range of experiments and was able to correctly identify >90% of the catalogued active sites from the Catalytic Site Atlas. It also achieved a Matthews correlation coefficient of 0.63 on the Critical Assessment of protein Structure Prediction (CASP 10) dataset. In our analysis, GASS ranked fourth among 18 methods. GASS-WEB is freely available at http://gass.unifei.edu.br/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
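
    The Matthews correlation coefficient quoted in this record is a standard metric computable from confusion-matrix counts; the short sketch below shows the formula with purely illustrative counts (not the CASP 10 evaluation data).

```python
# Matthews correlation coefficient from binary confusion-matrix counts.
from math import sqrt


def mcc(tp, tn, fp, fn):
    """MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))."""
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0


# Illustrative counts only:
print(round(mcc(tp=80, tn=90, fp=15, fn=20), 3))
```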

  12. Brandenburg 3D - a comprehensive 3D Subsurface Model, Conception of an Infrastructure Node and a Web Application

    Science.gov (United States)

    Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim

    2014-05-01

    The Energiewende and the increasing scarcity of raw materials will lead to an intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Initiated by the development of the European geospatial infrastructure INSPIRE, the German State Geological Offices started digitizing their predominantly analog archive inventory. Until now, however, a comprehensive 3D subsurface model of Brandenburg did not exist. The project B3D therefore strove to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (Geospatial Infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. On the server side, all available spatial data are published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard, which provides the interface that allows requests for geographical features. In addition, GeoServer implements, among others, the high-performance, certified-compliant Web Map Service (WMS) that serves geo-referenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geo-data, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is solely based on Free and Open Source Software and relies on the WebGL JavaScript API, which allows the interactive rendering of 2D and 3D graphics with GPU-accelerated physics and image processing as part of the web page canvas, without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera). The web
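
    As an illustration of the kind of OGC request the GeoServer back end answers, the sketch below fetches a map image with a standard WMS 1.3.0 GetMap call; the endpoint, layer name and bounding box are placeholders, not the actual Brandenburg services.

```python
# Minimal WMS 1.3.0 GetMap request; endpoint, layer and bbox are placeholders.
import requests

WMS_URL = "https://example.org/geoserver/wms"   # hypothetical endpoint

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "b3d:geology",        # hypothetical layer name
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "51.3,11.2,53.6,14.8",  # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "width": "800",
    "height": "600",
    "format": "image/png",
}
resp = requests.get(WMS_URL, params=params, timeout=60)
with open("map.png", "wb") as fh:   # save the returned map image
    fh.write(resp.content)
```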

  13. International Atomic Energy Agency. Publications Catalogue 2011/12 - full details of publications published 2010-2012, forthcoming publications and a stocklist of publications published in 2008-2011

    International Nuclear Information System (INIS)

    2011-06-01

    This publications catalogue lists all sales publications of the IAEA published in 2010 and 2011 and those forthcoming in 2011/12. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  14. Impact of magnitude uncertainties on seismic catalogue properties

    Science.gov (United States)

    Leptokaropoulos, K. M.; Adamaki, A. K.; Roberts, R. G.; Gkarlaouni, C. G.; Paradisopoulou, P. M.

    2018-05-01

    Catalogue-based studies are of central importance in seismological research for investigating the temporal, spatial and size distribution of earthquakes in specified study areas. Methods for estimating the fundamental catalogue parameters, like the Gutenberg-Richter (G-R) b-value and the completeness magnitude (Mc), are well established and routinely applied. However, the magnitudes reported in seismicity catalogues contain measurement uncertainties which may significantly distort the estimation of the derived parameters. In this study, we use numerical simulations of synthetic data sets to assess the reliability of different methods for determining the b-value and Mc, assuming the validity of the G-R law. After contaminating the synthetic catalogues with Gaussian noise (with selected standard deviations), the analysis is performed for numerous data sets of different sample size (N). The noise introduced to the data generally leads to a systematic overestimation of magnitudes close to and above Mc. This causes an increase in the average number of events above Mc, which in turn leads to an apparent decrease of the b-value. This may result in a significant overestimation of the seismicity rate even well above the actual completeness level. The b-value can in general be reliably estimated even for relatively small data sets (N < 1000) when only magnitudes higher than the actual completeness level are used. Nevertheless, a correction of the total number of events belonging to each magnitude class (i.e. 0.1 unit) should be considered to deal with the magnitude uncertainty effect. Because magnitude uncertainties (here in the form of Gaussian noise) are inevitable in all instrumental catalogues, this finding is fundamental for seismicity rate and seismic hazard assessment analyses. Also important is that for some data analyses significant bias cannot necessarily be avoided by choosing a high Mc value for analysis. In such cases, there may be a risk of severe miscalculation of
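
    For context, the Gutenberg-Richter b-value discussed in this record is commonly estimated with Aki's maximum-likelihood formula, using Utsu's correction for magnitude binning. The sketch below is a minimal illustration in that spirit, not the authors' simulation code: it estimates b on a synthetic catalogue before and after adding Gaussian magnitude noise.

```python
# Aki/Utsu maximum-likelihood b-value estimate on a synthetic catalogue
# contaminated with Gaussian magnitude noise (illustrative only).
import numpy as np


def b_value(magnitudes, mc, dm=0.1):
    """Aki ML estimator with Utsu's correction for binning width dm:
    b = log10(e) / (mean(M >= Mc) - (Mc - dm/2))."""
    m = np.round(np.asarray(magnitudes) / dm) * dm   # bin to catalogue precision
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))


rng = np.random.default_rng(42)
b_true, mc = 1.0, 2.0
# Magnitudes above Mc drawn from the G-R (exponential) distribution.
mags = mc + rng.exponential(scale=np.log10(np.e) / b_true, size=5000)
noisy = mags + rng.normal(0.0, 0.2, size=mags.size)  # magnitude uncertainty

print(f"b (noise-free): {b_value(mags, mc):.2f}")
print(f"b (noisy):      {b_value(noisy, mc):.2f}")    # typically biased low
```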

  15. E-Catalogue “Knowledge Management Practices in Nuclear Organizations”

    International Nuclear Information System (INIS)

    Sheveleva, S.; Pasztory, Z.

    2014-01-01

    The objectives of NKM E-Catalogue: Many nuclear organizations from IAEA Member States have considerable experiences and excellent achievements in the development of Knowledge Management Systems. Depending on organization’s strategy and type of business, they choose various methods and tools of knowledge management for realizing their aims. This catalogue will be available to all Member States interested in learning about collected knowledge management practices in order to enhance their own knowledge management programmes

  16. MATCHING ALTERNATIVE ADDRESSES: A SEMANTIC WEB APPROACH

    Directory of Open Access Journals (Sweden)

    S. Ariannamazi

    2015-12-01

    Full Text Available The rapid development of crowd-sourcing or volunteered geographic information (VGI) provides opportunities for authorities that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has resulted in a growing number of POIs in OpenStreetMap, a well-known VGI source, which causes the datasets to become outdated in short periods of time. Changes made to spatial and aspatial attributes of features, such as names and addresses, may cause confusion or ambiguity in processes that require a feature's literal information, like addressing and geocoding. VGI sources will neither conform to specific vocabularies nor remain in a specific schema for long periods of time. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology, in pursuit of an efficient way of finding the desired results. The Semantic Web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. There is a vast amount of literal-spatial data demonstrating the capability of linguistic information in knowledge modelling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous yields a helpful tool based on spatial data integration and lexical annotation matching and disambiguation.
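
    The Semantic Web pipeline itself is not given in the abstract; as a purely lexical stand-in, the sketch below scores alternative address spellings with a simple normalised string similarity, the kind of signal that annotation matching and disambiguation would build on. The normalisation rules and example addresses are hypothetical.

```python
# Purely lexical stand-in for matching alternative addresses: normalise the
# strings and score them with a sequence-similarity ratio. This is only the
# string-matching signal; the record's approach adds Semantic Web annotation.
import re
from difflib import SequenceMatcher


def normalise(address):
    a = address.lower()
    a = re.sub(r"\bst\.?\b", "street", a)      # toy abbreviation expansion
    a = re.sub(r"[^a-z0-9 ]", " ", a)
    return re.sub(r"\s+", " ", a).strip()


def address_similarity(a, b):
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio()


# Hypothetical alternative spellings of the same POI address:
print(round(address_similarity("12 Azadi St., Tehran", "12 Azadi Street Tehran"), 2))
```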

  17. Statistical Validation of a Web-Based GIS Application and Its Applicability to Cardiovascular-Related Studies.

    Science.gov (United States)

    Lee, Jae Eun; Sung, Jung Hye; Malouhi, Mohamad

    2015-12-22

    There is abundant evidence that neighborhood characteristics are significantly linked to the health of the inhabitants of a given space within a given time frame. This study statistically validates a web-based GIS application designed to support cardiovascular-related research, developed by the NIH-funded Research Centers in Minority Institutions (RCMI) Translational Research Network (RTRN) Data Coordinating Center (DCC), and discusses its applicability to cardiovascular studies. Geo-referencing, geocoding and geospatial analyses were conducted for 500 randomly selected home addresses in a southeastern U.S. metropolitan area. Correlation coefficients, factor analysis and Cronbach's alpha (α) were estimated to quantify the internal consistency, reliability and construct/criterion/discriminant validity of the cardiovascular-related geospatial variables (walk score, number of hospitals, fast food restaurants, parks and sidewalks). Cronbach's α for the cardiovascular geospatial variables was 95.5%, implying successful internal consistency. Walk scores were significantly correlated with the number of hospitals (r = 0.715) and of fast food restaurants (r = 0.729). The geospatial variables generated by the application were internally consistent and demonstrated satisfactory validity. Therefore, the GIS application may be useful in cardiovascular-related studies that aim to investigate the potential impact of geospatial factors on diseases and/or the long-term effect of clinical trials.
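
    For reference, the Cronbach's α reported in this record follows the standard item-variance formula, α = k/(k-1) · (1 - Σ var(item)/var(total)). A minimal NumPy version is sketched below on synthetic item scores, not the study's variables.

```python
# Standard Cronbach's alpha from an observations-by-items score matrix.
# The data below are synthetic, not the study's geospatial variables.
import numpy as np


def cronbach_alpha(items):
    """items: 2-D array, rows = observations, columns = items/indicators."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)


rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))
scores = latent + rng.normal(scale=0.5, size=(500, 5))   # 5 correlated items
print(round(cronbach_alpha(scores), 3))
```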

  18. UKRVO Astronomical WEB Services

    Directory of Open Access Journals (Sweden)

    Mazhaev, O.E.

    2017-01-01

    Full Text Available The Ukraine Virtual Observatory (UkrVO) has been a member of the International Virtual Observatory Alliance (IVOA) since 2011. The virtual observatory (VO) is not a magic solution to all problems of data storage and processing, but it provides certain standards for building the infrastructure of an astronomical data center. The astronomical databases help data mining and offer users easy access to observation metadata, images within the celestial sphere and results of image processing. The astronomical web services (AWS) of UkrVO give users handy tools for selecting data from large astronomical catalogues for a relatively small region of interest in the sky. Examples of AWS usage are shown.

  19. MultiSpec: A Desktop and Online Geospatial Image Data Processing Tool

    Science.gov (United States)

    Biehl, L. L.; Hsu, W. K.; Maud, A. R. M.; Yeh, T. T.

    2017-12-01

    MultiSpec is an easy to learn and use, freeware image processing tool for interactively analyzing a broad spectrum of geospatial image data, with capabilities such as image display, unsupervised and supervised classification, feature extraction, feature enhancement, and several other functions. Originally developed for Macintosh and Windows desktop computers, it has a community of several thousand users worldwide, including researchers and educators, as a practical and robust solution for analyzing multispectral and hyperspectral remote sensing data in several different file formats. More recently MultiSpec was adapted to run in the HUBzero collaboration platform so that it can be used within a web browser, allowing new user communities to be engaged through science gateways. MultiSpec Online has also been extended to interoperate with other components (e.g., data management) in HUBzero through integration with the geospatial data building blocks (GABBs) project. This integration enables a user to directly launch MultiSpec Online from data that is stored and/or shared in a HUBzero gateway and to save output data from MultiSpec Online to hub storage, allowing data sharing and multi-step workflows without having to move data between different systems. MultiSpec has also been used in K-12 classes for which one example is the GLOBE program (www.globe.gov) and in outreach material such as that provided by the USGS (eros.usgs.gov/educational-activities). MultiSpec Online now provides teachers with another way to use MultiSpec without having to install the desktop tool. Recently MultiSpec Online was used in a geospatial data session with 30-35 middle school students at the Turned Onto Technology and Leadership (TOTAL) Camp in the summers of 2016 and 2017 at Purdue University. The students worked on a flood mapping exercise using Landsat 5 data to learn about land remote sensing using supervised classification techniques. Online documentation is available for Multi
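
    MultiSpec's own algorithms are not reproduced here; the sketch below shows the generic shape of the supervised-classification exercise mentioned in this record (labelled training pixels, then per-pixel prediction), using scikit-learn on synthetic band values. The class means, band count and classifier choice are assumptions for illustration.

```python
# Generic supervised classification of multispectral pixels, in the spirit of
# the flood-mapping exercise (synthetic data; not MultiSpec's implementation).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Synthetic training pixels: 6 "bands" per pixel, two classes (water / land).
water = rng.normal(loc=[0.05, 0.04, 0.03, 0.02, 0.01, 0.01], scale=0.01, size=(200, 6))
land = rng.normal(loc=[0.10, 0.12, 0.15, 0.30, 0.25, 0.20], scale=0.03, size=(200, 6))
X = np.vstack([water, land])
y = np.array([0] * 200 + [1] * 200)          # 0 = water, 1 = land

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Classify a synthetic "scene" of pixel band vectors:
scene = rng.normal(loc=0.12, scale=0.08, size=(1000, 6)).clip(0, 1)
labels = clf.predict(scene)
print("water pixels:", int((labels == 0).sum()), "of", len(labels))
```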

  20. Issues on Building Kazakhstan Geospatial Portal to Implement E-Government

    Science.gov (United States)

    Sagadiyev, K.; Kang, H. K.; Li, K. J.

    2016-06-01

    A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework for integrating and organizing them. In particular, it is very useful to integrate the land management process in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We developed a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open source geospatial software, including GeoServer, PostGIS and OpenLayers. With this geoportal, we expect three achievements. First, we establish a transparent governmental process, which is one of the main goals of e-government: every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort spent in the governmental process. For example, a grant procedure for a building construction has taken more than one year, with more than 50 steps; it is expected that this procedure would be reduced to two weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have been a critical issue in governmental administration processes.

  1. ISSUES ON BUILDING KAZAKHSTAN GEOSPATIAL PORTAL TO IMPLEMENT E-GOVERNMENT

    Directory of Open Access Journals (Sweden)

    K. Sagadiyev

    2016-06-01

    Full Text Available A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework for integrating and organizing them. In particular, it is very useful to integrate the land management process in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We developed a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open source geospatial software, including GeoServer, PostGIS and OpenLayers. With this geoportal, we expect three achievements. First, we establish a transparent governmental process, which is one of the main goals of e-government: every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort spent in the governmental process. For example, a grant procedure for a building construction has taken more than one year, with more than 50 steps; it is expected that this procedure would be reduced to two weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have been a critical issue in governmental administration processes.
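
    To make the GeoServer/PostGIS/OpenLayers stack mentioned in the two records above a little more concrete, the sketch below issues a standard WFS 2.0.0 GetFeature request for parcel features; the endpoint, feature type name and GeoJSON output format are placeholders, not the Kazakhstan geoportal's actual services.

```python
# Minimal OGC WFS 2.0.0 GetFeature request; endpoint and feature type
# are placeholders, not the geoportal's actual services.
import requests

WFS_URL = "https://example.org/geoserver/wfs"    # hypothetical endpoint

resp = requests.get(WFS_URL, params={
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "cadastre:parcels",             # hypothetical layer
    "count": "10",
    "outputFormat": "application/json",          # GeoJSON, if the server offers it
}, timeout=60)
features = resp.json().get("features", [])
print(len(features), "features returned")
```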

  2. IAEA Publications Catalogue 2017-2018 - full details of publications published 2016-2017, forthcoming publications 2017-2018 and a stocklist of publications published 2014-2017

    International Nuclear Information System (INIS)

    2016-08-01

    This publications catalogue lists all sales publications of the IAEA published in 2016–2017 and those forthcoming in 2017–2018. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. Most publications are issued in softcover. A complete listing of all IAEA priced publications is available on the IAEA’s web site: www.iaea.org/books

  3. WFCatalog: A catalogue for seismological waveform data

    Science.gov (United States)

    Trani, Luca; Koymans, Mathijs; Atkinson, Malcolm; Sleeman, Reinoud; Filgueira, Rosa

    2017-09-01

    This paper reports advances in seismic waveform description and discovery leading to a new seismological service and presents the key steps in its design, implementation and adoption. This service, named WFCatalog, which stands for waveform catalogue, accommodates features of seismological waveform data. It therefore meets the need for seismologists to be able to select waveform data based on seismic waveform features as well as sensor geolocations and temporal specifications. We describe the collaborative design methods and the technical solution, showing the central role of seismic feature catalogues in framing the technical and operational delivery of the new service. We also provide an overview of the complex environment in which this endeavour is scoped and discuss the related challenges. As multi-disciplinary, multi-organisational and global collaboration is necessary to address today's challenges, canonical representations can provide a focus for collaboration and conceptual tools for agreeing directions. Such collaborations can be fostered and formalised by rallying intellectual effort into the design of novel scientific catalogues and the services that support them. This work offers an example of the benefits generated by involving cross-disciplinary skills (e.g. data and domain expertise) from the early stages of design, and by sustaining the engagement with the target community throughout the delivery and deployment process.

  4. International Atomic Energy Agency publications. Publications catalogue 2007 including full details of publications published in 2005-2007 and forthcoming and a stocklist of publications published in 2003-2004

    International Nuclear Information System (INIS)

    2007-01-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2005, 2006 and 2007 and forthcoming. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  5. International Atomic Energy Agency publications. Publications catalogue 2007 including full details of publications published in 2005-2007 and forthcoming and a stocklist of publications published in 2003-2004

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-07-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2005, 2006 and 2007 and forthcoming. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books.

  6. Planck 2013 results. XXVIII. The Planck Catalogue of Compact Sources

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chen, X.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clemens, M.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Leroy, C.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; McGehee, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Negrello, M.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T.J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Reach, W.T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Schammel, M.P.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Walter, B.; Wandelt, B.D.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    The Planck Catalogue of Compact Sources (PCCS) is the catalogue of sources detected in the first 15 months of Planck operations, the "nominal" mission. It consists of nine single-frequency catalogues of compact sources, both Galactic and extragalactic, detected over the entire sky. The PCCS covers the frequency range 30-857 GHz with higher sensitivity (it is 90% complete at 180 mJy in the best channel) and better angular resolution (from ~33' to ~5') than previous all-sky surveys in this frequency band. By construction its reliability is >80%, and more than 65% of the sources have been detected in at least two contiguous Planck channels. In this paper we present the construction and validation of the PCCS, its contents and its statistical characterization.

  7. The Availability of MeSH in Vendor-Supplied Cataloguing Records, as Seen Through the Catalogue of a Canadian Academic Health Library

    Directory of Open Access Journals (Sweden)

    Pamela S. Morgan

    2007-09-01

    Full Text Available This study examines the prevalence of medical subject headings in vendor-supplied cataloguing records for publications contained within aggregated databases or publisher collections. In the first phase, the catalogue of one Canadian academic medical library was examined to determine the extent to which medical subject headings (MeSH are available in the vendor-supplied records. In the second phase, these results were compared to the catalogues of other Canadian academic medical libraries in order to reach a generalization regarding the availability of MeSH headings for electronic resources. MeSH was more widespread in records for electronic journals but was noticeably lacking in records for electronic monographs, and for Canadian publications. There is no standard for ensuring MeSH are assigned to monograph records for health titles and there is no library in Canada with responsibility for ensuring that Canadian health publications receive Medical Subject Headings. It is incumbent upon libraries using MeSH to ensure that vendors are aware of this need when purchasing record sets.

  8. Exploring best cataloguing rules in the 21st century: Changes from ...

    African Journals Online (AJOL)

    In this digital era, the need to embrace change is inevitable. The authors describe fundamental changes that are necessary to move cataloguing practice to the next level. Some of these changes include, but are not limited to: cataloguing working tools, changes in information resources, vocabulary, main entry points and ...

  9. GIBS Geospatial Data Abstraction Library (GDAL)

    Data.gov (United States)

    National Aeronautics and Space Administration — GDAL is an open source translator library for raster geospatial data formats that presents a single abstract data model to the calling application for all supported...
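
    As a small usage sketch of the library this record describes, the snippet below opens a raster through GDAL's Python bindings and reads one band into a NumPy array; the input file path is a placeholder.

```python
# Open a raster with GDAL's Python bindings and read a band as a NumPy array.
# The file path is a placeholder; any GDAL-supported raster format works.
from osgeo import gdal

gdal.UseExceptions()                      # raise Python exceptions on errors
ds = gdal.Open("example_scene.tif")       # hypothetical input raster
band = ds.GetRasterBand(1)
arr = band.ReadAsArray()                  # 2-D NumPy array of pixel values

print("size:", ds.RasterXSize, "x", ds.RasterYSize)
print("geotransform:", ds.GetGeoTransform())
print("band 1 min/max:", float(arr.min()), float(arr.max()))
```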

  10. Towards Web-based representation and processing of health information

    DEFF Research Database (Denmark)

    Gao, S.; Mioc, Darka; Yi, X.L.

    2009-01-01

    Background: There is great concern within health surveillance on how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data...... For the representation of health information through Web-mapping applications, there still lacks a standard format to accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators in the representation of health information. Furthermore, net-centric computing has not been...... facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new solution in better health data representation and initial exploration of the Web-based processing of health information. Conclusion: The designed......

  11. Searching the databases: a quick look at Amazon and two other online catalogues.

    Science.gov (United States)

    Potts, Hilary

    2003-01-01

    The Amazon Online Catalogue was compared with the Library of Congress Catalogue and the British Library Catalogue, both also available online, by searching on both neutral (Gay, Lesbian, Homosexual) and pejorative (Perversion, Sex Crime) subject terms, and also by searches using Boolean logic in an attempt to identify Lesbian Fiction items and religion-based anti-gay material. Amazon was much more likely to be the first port of call for non-academic enquiries. Although excluding much material necessary for academic research, it carried more information about the individual books and less historical homophobic baggage in its terminology than the great national catalogues. Its back catalogue of second-hand books outnumbered those in print. Current attitudes may partially be gauged by the relative numbers of titles published under each heading--e.g., there may be an inverse relationship between concern about child sex abuse and homophobia, more noticeable in U.S. because of the activities of the religious right.

  12. Geospatial Information System Capability Maturity Models

    Science.gov (United States)

    2017-06-01

    To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...

  13. Clay club catalogue of characteristics of argillaceous rocks

    International Nuclear Information System (INIS)

    2005-01-01

    The OECD/NEA Working Group on the Characterisation, the Understanding and the Performance of Argillaceous Rocks as Repository Host Formations, namely the Clay Club, examines the various argillaceous rocks that are being considered for the deep geological disposal of radioactive waste, i.e. from plastic, soft, poorly indurated clays to brittle, hard mud-stones or shales. The Clay Club considered it necessary and timely to provide a catalogue to gather in a structured way the key geo-scientific characteristics of the various argillaceous formations that are - or were - studied in NEA member countries with regard to radioactive waste disposal. The present catalogue represents the outcomes of this Clay Club initiative. (author)

  14. Towards Geo-spatial Information Science in Big Data Era

    Directory of Open Access Journals (Sweden)

    LI Deren

    2016-04-01

    Full Text Available Since the 1990s, with the advent of the worldwide information revolution and the development of the internet, geospatial information science has come of age, pushing forward the building of the digital Earth and cyber cities. As we entered the 21st century, with the development and integration of global information technology and industrialization, the internet of things and cloud computing came into being, and human society entered the big data era. This article covers the key features (ubiquity, multi-dimensionality and dynamics, internet+networking, full automation and real-time operation, from sensing to recognition, crowdsourcing and VGI, and service orientation) of geospatial information science in the big data era and addresses the key technical issues (a non-linear four-dimensional Earth reference frame system, space-based enhanced GNSS, space-air-land unified network communication techniques, on-board processing techniques for multi-source image data, smart interface service techniques for space-borne information, space-based resource scheduling and network security, and the design and development of a payload-based multi-functional satellite platform) that need to be resolved to provide a new definition of geospatial information science in the big data era. Based on the discussion in this paper, the author finally proposes a new definition of geospatial information science (geomatics): geomatics is a multidisciplinary science and technology which, using a systematic approach, integrates all the means for spatio-temporal data acquisition, information extraction, networked management, knowledge discovery, spatial sensing and recognition, as well as intelligent location-based services for any physical objects and human activities around the earth and its environment. Starting from this new definition, geospatial information science will find many more opportunities and tasks in the big data era for the generation of the smart Earth and smart cities. Our profession

  15. European wind turbine catalogue

    International Nuclear Information System (INIS)

    1994-01-01

    The THERMIE European Community programme is designed to promote the greater use of European technology and this catalogue contributes to the fulfillment of this aim by dissemination of information on 50 wind turbines from 30 manufacturers. These turbines are produced in Europe and are commercially available. The manufacturers presented produce and sell grid-connected turbines which have been officially approved in countries where this approval is acquired, however some of the wind turbines included in the catalogue have not been regarded as fully commercially available at the time of going to print. The entries, which are illustrated by colour photographs, give company profiles, concept descriptions, measured power curves, prices, and information on design and dimension, safety systems, stage of development, special characteristics, annual energy production, and noise pollution. Lists are given of wind turbine manufacturers and agents and of consultants and developers in the wind energy sector. Exchange rates used in the conversion of the prices of wind turbines are also given. Information can be found on the OPET network (organizations recognised by the European Commission as an Organization for the Promotion of Energy Technologies (OPET)). An article describes the development of the wind power industry during the last 10-15 years and another article on certification aims to give an overview of the most well-known and acknowledged type approvals currently issued in Europe. (AB)

  16. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    Science.gov (United States)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real-time is a task that pushes the boundaries of computing hardware and software. But the integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap: it allows easy integration of climate datasets with geospatial datasets and provides sophisticated visualization and analysis capabilities. The objective for TrikeND-iGlobe is the continued building of an open source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real-time for the purpose of analysis - locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc
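
    Since the integration challenge described in this record is largely about reading heterogeneous formats, a minimal example of pulling a geolocated variable out of a NetCDF file is sketched below; the file name, variable names and dimension layout are placeholders.

```python
# Minimal NetCDF read for a geolocated climate variable (names are placeholders).
from netCDF4 import Dataset

ds = Dataset("climate_model_output.nc")      # hypothetical file
lats = ds.variables["lat"][:]
lons = ds.variables["lon"][:]
temp = ds.variables["tas"][0, :, :]          # first time step; hypothetical variable

print("grid:", lats.size, "x", lons.size)
print("mean value of first time step:", float(temp.mean()))
ds.close()
```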

  17. Student Focused Geospatial Curriculum Initiatives: Internships and Certificate Programs at NCCU

    Science.gov (United States)

    Vlahovic, G.; Malhotra, R.

    2009-12-01

    This paper reports recent efforts by the Department of Environmental, Earth and Geospatial Sciences faculty at North Carolina Central University (NCCU) to develop a leading geospatial sciences program that will be considered a model for other Historically Black College/University (HBCU) peers nationally. NCCU was established in 1909 and is the nation’s first state supported public liberal arts college funded for African Americans. In the most recent annual ranking of America’s best black colleges by the US News and World Report (Best Colleges 2010), NCCU was ranked 10th in the nation. As one of only two HBCUs in the southeast offering an undergraduate degree in Geography (McKee, J.O. and C. V. Dixon. Geography in Historically Black Colleges/ Universities in the Southeast, in The Role of the South in Making of American Geography: Centennial of the AAG, 2004), NCCU is uniquely positioned to positively affect talent and diversity of the geospatial discipline in the future. Therefore, successful creation of research and internship pathways for NCCU students has national implications because it will increase the number of minority students joining the workforce and applying to PhD programs. Several related efforts will be described, including research and internship projects with Fugro EarthData Inc., Center for Remote Sensing and Mapping Science at the University of Georgia, Center for Earthquake Research and Information at the University of Memphis and the City of Durham. The authors will also outline requirements and recent successes of ASPRS Provisional Certification Program, developed and pioneered as collaborative effort between ASPRS and NCCU. This certificate program allows graduating students majoring in geospatial technologies and allied fields to become provisionally certified by passing peer-review and taking the certification exam. At NCCU, projects and certification are conducted under the aegis of the Geospatial Research, Innovative Teaching and

  18. Focused sunlight factor of forest fire danger assessment using Web-GIS and RS technologies

    Science.gov (United States)

    Baranovskiy, Nikolay V.; Sherstnyov, Vladislav S.; Yankovich, Elena P.; Engel, Marina V.; Belov, Vladimir V.

    2016-08-01

    The Timiryazevskiy forestry of Tomsk region (Siberia, Russia) is the study area of the current research. Forest fire danger assessment is based on a unique technology using a probabilistic criterion, statistical data on forest fires, meteorological conditions, forest site classification and remote sensing data. MODIS products are used for estimating some meteorological conditions and the current forest fire situation. Geoinformation technologies are used for geospatial analysis of the forest fire danger situation in controlled forested territories. The GIS engine provides opportunities to construct electronic maps with different levels of forest fire probability and to support a raster layer for satellite remote sensing data on current forest fires. A web interface is used for loading data onto a dedicated web site and for representing forest fire danger data via the World Wide Web. Special web forms provide an interface for choosing the relevant input data in order to process the forest fire danger data and assess the forest fire probability.

  19. Catalogue of tide gauges in the Pacific

    National Research Council Canada - National Science Library

    Ridgway, N. M

    1984-01-01

    Although this catalogue is primarily intended to provide a list of sources for tidal data which can be used in postevent studies of tsunamis, it may also be useful in other branches of oceanographic...

  20. Autonomous Mission Operations for Sensor Webs

    Science.gov (United States)

    Underbrink, A.; Witt, K.; Stanley, J.; Mandl, D.

    2008-12-01

    We present interim results of a 2005 ROSES AIST project entitled "Using Intelligent Agents to Form a Sensor Web for Autonomous Mission Operations", or SWAMO. The goal of the SWAMO project is to shift the control of spacecraft missions from a ground-based, centrally controlled architecture to a collaborative, distributed set of intelligent agents. The network of intelligent agents is intended to reduce management requirements by utilizing model-based system prediction and autonomic model/agent collaboration. SWAMO agents are distributed throughout the Sensor Web environment, which may include multiple spacecraft, aircraft, ground systems, and ocean systems, as well as manned operations centers. The agents monitor and manage sensor platforms, Earth sensing systems, and Earth sensing models and processes. The SWAMO agents form a Sensor Web of agents via peer-to-peer coordination. Some of the intelligent agents are mobile and able to traverse between on-orbit and ground-based systems. Other agents in the network are responsible for encapsulating system models to perform prediction of future behavior of the modeled subsystems and components to which they are assigned. The software agents use semantic web technologies to enable improved information sharing among the operational entities of the Sensor Web. The semantics include ontological conceptualizations of the Sensor Web environment, plus conceptualizations of the SWAMO agents themselves. By conceptualizations of the agents, we mean knowledge of their state, operational capabilities, current operational capacities, Web Service search and discovery results, agent collaboration rules, etc. The need for ontological conceptualizations over the agents is to enable autonomous and autonomic operations of the Sensor Web. The SWAMO ontology enables automated decision making and responses to the dynamic Sensor Web environment and to end user science requests. The current ontology is compatible with Open Geospatial Consortium (OGC

  1. International Atomic Energy Agency publications. Publications catalogue 2005 including full details of publications published in 2003-2004 and forthcoming in 2005 and a stocklist of publications published in 2001-2002

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-03-15

    This Publications Catalogue lists all sales publications of the IAEA published in 2003, 2004 and forthcoming in 2005. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books.

  2. International Atomic Energy Agency publications. Publications catalogue 2006 including full details of publications published in 2004-2005 and forthcoming in 2006 and a stocklist of publications published in 2002-2003

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-03-15

    This Publications Catalogue lists all sales publications of the IAEA published in 2004, 2005 and forthcoming in 2006. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books.

  3. International Atomic Energy Agency publications. Publications catalogue 2006 including full details of publications published in 2004-2005 and forthcoming in 2006 and a stocklist of publications published in 2002-2003

    International Nuclear Information System (INIS)

    2006-03-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2004, 2005 and forthcoming in 2006. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  4. International Atomic Energy Agency publications. Publications catalogue 2005 including full details of publications published in 2003-2004 and forthcoming in 2005 and a stocklist of publications published in 2001-2002

    International Nuclear Information System (INIS)

    2005-03-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2003, 2004 and forthcoming in 2005. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  5. Transport Infrastructure in the Process of Cataloguing Brownfields

    Science.gov (United States)

    Kramářová, Zuzana

    2017-10-01

    To begin with, the identification and subsequent revitalisation of brownfields is a burning issue in territorial planning as well as in construction engineering. The phenomenon occurs not only in the Czech Republic and Europe but worldwide, and experts investigate it carefully. These issues may be divided into several areas. The first is the identification and cataloguing of individual localities; the next is the complex process of locality revitalisation. The legislative framework represents a separate area, which is highly specific in individual countries in accordance with the existing laws, norms and regulations (it concerns mainly territorial planning and the segmentation of territory into appropriate administrative units). The legislative base of the Czech Republic was analysed in an article at WMCAUS in 2016. The identification and subsequent cataloguing of brownfields is carried out in the form of regional studies within the legislation of the Czech Republic. Due to the huge scale of issues to be tackled, their content is only loosely defined with regard to the Building Act and its implementing regulations, e.g. examining the layout of future construction in the area, locating architecturally or otherwise interesting objects, transport or technical infrastructure management, tourism, socially excluded localities etc. Since no legislative base exists, there is no common method for identifying and cataloguing brownfields. Therefore, individual catalogue lists are subject to customers' requirements. All the same, the relevant information contained in the database may always be examined. One such item is the part concerning transport infrastructure. This information may be divided into three subareas: information on the transport accessibility of the locality, information on the actual infrastructure in the locality, and information on the transport accessibility of human resources.

  6. Geospatial Data as a Service: Towards planetary scale real-time analytics

    Science.gov (United States)

    Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.

    2017-12-01

    The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data, and different interfaces have been defined to provide data services. Unfortunately, there are considerable differences among the standards, protocols and data models, which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observation, climate and weather forecasting are examples of these communities, which generate large amounts of geospatial data. The NCI has invested significant effort in developing a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform on which new infrastructures can be built. One of the key capabilities these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach for geospatial data discovery and delivery based on OGC standards. We will present the architecture and motivating use cases that drove GSKY's collaborative design, development and production deployment. We show our approach offers the community valuable exploratory
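
    Because GSKY speaks standard OGC interfaces, a client does not need a bespoke API. The sketch below shows a generic OGC WMS 1.3.0 GetMap request issued with the requests library; the endpoint URL and layer name are hypothetical placeholders, and only the standard WMS parameters are assumed, not any GSKY-specific extension.

      # Sketch of an OGC WMS 1.3.0 GetMap request using the requests library.
      # The endpoint URL and layer name are hypothetical; the parameters follow
      # the WMS standard, not a GSKY-specific API.
      import requests

      WMS_ENDPOINT = "https://example.org/ows"   # hypothetical service endpoint

      params = {
          "service": "WMS",
          "version": "1.3.0",
          "request": "GetMap",
          "layers": "example_layer",             # hypothetical layer name
          "crs": "EPSG:4326",
          "bbox": "-44.0,112.0,-10.0,154.0",     # WMS 1.3.0: lat/lon axis order for EPSG:4326
          "width": "512",
          "height": "512",
          "format": "image/png",
          "time": "2017-01-01T00:00:00Z",        # optional TIME dimension
      }

      response = requests.get(WMS_ENDPOINT, params=params, timeout=60)
      response.raise_for_status()
      with open("map.png", "wb") as f:
          f.write(response.content)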

  7. Scalable global grid catalogue for Run3 and beyond

    Science.gov (United States)

    Martinez Pedreira, M.; Grigoras, C.; ALICE Collaboration

    2017-10-01

    The AliEn (ALICE Environment) file catalogue is a global unique namespace providing mapping between a UNIX-like logical name structure and the corresponding physical files distributed over 80 storage elements worldwide. Powerful search tools and hierarchical metadata information are integral parts of the system and are used by the Grid jobs as well as local users to store and access all files on the Grid storage elements. The catalogue has been in production since 2005 and over the past 11 years has grown to more than 2 billion logical file names. The backend is a set of distributed relational databases, ensuring smooth growth and fast access. Due to the anticipated fast future growth, we are looking for ways to enhance the performance and scalability by simplifying the catalogue schema while keeping the functionality intact. We investigated different backend solutions, such as distributed key value stores, as replacement for the relational database. This contribution covers the architectural changes in the system, together with the technology evaluation, benchmark results and conclusions.
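
    To make the core idea concrete, the toy sketch below maps a UNIX-like logical file name to its physical replicas on storage elements and supports a simple hierarchical prefix search. This is an illustration of the concept only; the class, the storage element names and the file names are invented and do not reflect the AliEn schema or its database backend.

      # Toy illustration of a logical-to-physical file name mapping, the core
      # idea behind a Grid file catalogue. Not the AliEn schema; all names and
      # storage elements below are invented.
      from collections import defaultdict

      class ToyCatalogue:
          def __init__(self):
              # logical file name (LFN) -> list of (storage_element, physical_file_name)
              self._replicas = defaultdict(list)

          def register(self, lfn, storage_element, pfn):
              self._replicas[lfn].append((storage_element, pfn))

          def locate(self, lfn):
              """Return all known replicas of a logical file name."""
              return list(self._replicas.get(lfn, []))

          def find(self, prefix):
              """Simple hierarchical search: all LFNs under a directory prefix."""
              return sorted(l for l in self._replicas if l.startswith(prefix))

      cat = ToyCatalogue()
      cat.register("/alice/data/2016/run1/file001.root", "CERN::EOS", "root://eos.example/f001")
      cat.register("/alice/data/2016/run1/file001.root", "FZK::SE", "root://se.example/abc")
      print(cat.locate("/alice/data/2016/run1/file001.root"))
      print(cat.find("/alice/data/2016/"))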

  8. Studies on Colombian Cryptogams IV. A catalogue of the Hepaticae of Colombia

    NARCIS (Netherlands)

    Gradstein, S.R.; Hekking, W.H.A.

    1979-01-01

    The main purpose of this catalogue is to provide a complete listing of the species of liverworts hitherto known from Colombia and to summarize our present knowledge of species distribution within the country. It was prepared in parallel with a catalogue of the mosses (Musci), which is being published

  9. Planck 2015 results: XXVII. The second Planck catalogue of Sunyaev-Zeldovich sources

    DEFF Research Database (Denmark)

    Ade, P. A R; Aghanim, N.; Arnaud, M.

    2016-01-01

    We present the all-sky Planck catalogue of Sunyaev-Zeldovich (SZ) sources detected from the 29 month full-mission data. The catalogue (PSZ2) is the largest SZ-selected sample of galaxy clusters yet produced and the deepest systematic all-sky survey of galaxy clusters. It contains 1653 detections, ...

  10. MapFactory - Towards a mapping design pattern for big geospatial data

    Science.gov (United States)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
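
    A generic sketch of the factory design pattern applied to map creation, where a design specification selects which map product is built. The class names, the dict-based specification and its fields are invented for illustration and do not reflect the MapFactory implementation or the ISO 19115-1 elements it uses.

      # Generic sketch of a factory design pattern for map creation: a design
      # specification (here a plain dict) selects which map product is built.
      # Class and field names are invented and do not reflect the MapFactory code.
      from abc import ABC, abstractmethod

      class Map(ABC):
          @abstractmethod
          def render(self, data):
              ...

      class ChoroplethMap(Map):
          def render(self, data):
              return f"choropleth over {len(data)} regions"

      class HeatMap(Map):
          def render(self, data):
              return f"heat map from {len(data)} points"

      class MapFactory:
          _registry = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

          @classmethod
          def create(cls, spec):
              """Build a map object from a design specification."""
              map_type = spec.get("mapType", "choropleth")
              return cls._registry[map_type]()

      spec = {"mapType": "heatmap", "title": "Example"}   # invented specification
      print(MapFactory.create(spec).render([(0, 0), (1, 1)]))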

  11. Center of Excellence for Geospatial Information Science research plan 2013-18

    Science.gov (United States)

    Usery, E. Lynn

    2013-01-01

    The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.

  12. Modern Special Collections Cataloguing: A University of London Case Study

    OpenAIRE

    Attar, Karen

    2013-01-01

    Recent years have seen a growing emphasis on modern special collections (in themselves no new phenomenon), with a dichotomy between guidance for detailed cataloguing in Descriptive Cataloging of Rare Materials (Books) (DCRM(B), 2007) and the value of clearing cataloguing backlogs expeditiously. This article describes the De la Mare Family Archive of Walter de la Mare's Printed Oeuvre at Senate House Library, University of London, as an example of a modern author collection in an institutiona...

  13. IAEA Catalogue of Services for Nuclear Infrastructure Development. Rev. 1, April 2014

    International Nuclear Information System (INIS)

    2014-04-01

    This IAEA Catalogue offers a wide range of services to Member States embarking on a new nuclear power programme or expanding an existing one. A new IAEA Catalogue of Services for Nuclear Infrastructure Development helps Member States to identify and request IAEA assistance for national organizations at different stages of the development or expansion of a nuclear power programme. This IAEA Catalogue of Services is presented in two tables. It is based on the IAEA Milestones Approach for nuclear power infrastructure development, documented in 'Milestones in the Development of a National Infrastructure for Nuclear Power' (IAEA Nuclear Energy Series NG-G-3.1). The two tables allow users to identify and select available IAEA services by: i) The three phases of the IAEA Milestones Approach, or ii) Organizations typically involved in the development of a nuclear power programme: the government / Nuclear Energy Programme Implementing Organization (NEPIO), the regulatory body and the owner operator of a nuclear power plant. This Catalogue includes information on the following IAEA services: i) Workshops / Training Courses; ii) Expert Missions / Advisory Services; iii) Review Missions / Peer Reviews; iv) Training tools and networks. The Catalogue lists both existing IAEA services and those being developed for the 19 issues to be addressed in developing a national nuclear infrastructure. Each existing service is linked to a relevant IAEA webpage that either describes a particular service or gives practical examples of the type of assistance that the Agency offers (e.g. workshops or missions). The owners of these webpages can be contacted for more detailed information or to request assistance. This IAEA Catalogue of Services will be updated regularly

  14. Using Web Crawler Technology for Geo-Events Analysis: A Case Study of the Huangyan Island Incident

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2014-04-01

    Full Text Available Social networking and network socialization bring abundant text information and social relationships into our daily lives. Making full use of these data in the big data era is of great significance for us to better understand the changing world and the information-based society. Although politics has been integrally involved in hyperlinked world issues since the 1990s, the text analysis and data visualization of geo-events have faced the bottleneck of traditional manual analysis. Although the automatic assembly of different geospatial web services and distributed geospatial information systems utilizing service chaining has been explored and built recently, data mining and information collection are not comprehensive enough because of the sensitivity, complexity, relativity, timeliness, and unexpected characteristics of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency, and dissemination path of the Huangyan Island incident were studied using web crawler technology and text analysis. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios, and dissemination flow graph based on the crawled information and data processing not only highlight the characteristics of the geo-event itself, but also reveal many interesting phenomena and deep-seated problems behind it, such as related topics, theme vocabularies, subject contents, hot countries, event bodies, opinion leaders, high-frequency vocabularies, information sources, semantic structure, propagation paths, the distribution of different attitudes, and regional differences in netizens' responses to the Huangyan Island incident. Furthermore, text analysis of network information with the help of a focused web crawler is able to express the time-space relationship of the crawled information and the information characteristics of the semantic network of geo-events. Therefore, it is a useful tool to
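
    The word-frequency tabulation behind a tag cloud or frequency map can be illustrated with a naive sketch applied to text that has already been crawled. This is not the Heritrix-based pipeline used in the study; the stopword list and sample documents are invented.

      # Naive sketch of the word-frequency step behind a tag cloud, applied to
      # already-crawled text. Not the Heritrix-based pipeline used in the study;
      # the stopwords and sample text are invented.
      import re
      from collections import Counter

      STOPWORDS = {"the", "a", "an", "of", "and", "in", "on", "to", "is"}

      def word_frequencies(documents, top_n=10):
          counter = Counter()
          for text in documents:
              tokens = re.findall(r"[a-z']+", text.lower())
              counter.update(t for t in tokens if t not in STOPWORDS)
          return counter.most_common(top_n)

      docs = [
          "Island dispute dominates the regional news coverage.",
          "News coverage of the island incident spreads across social media.",
      ]
      print(word_frequencies(docs))   # e.g. [('island', 2), ('coverage', 2), ...]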

  15. Big Web data, small focus: An ethnosemiotic approach to culturally themed selective Web archiving

    Directory of Open Access Journals (Sweden)

    Saskia Huc-Hepher

    2015-07-01

    Full Text Available This paper proposes a multimodal ethnosemiotic conceptual framework for culturally themed selective Web archiving, taking as a practical example the curation of the London French Special Collection (LFSC) in the UK Web Archive. Its focus on a particular ‘community’ is presented as advantageous in overcoming the sheer scale of data available on the Web; yet, it is argued that these ethnographic boundaries may be flawed if they do not map onto the collective self-perception of the London French. The approach establishes several theoretical meeting points between Pierre Bourdieu’s ethnography and Gunther Kress’s multimodal social semiotics, notably, the foregrounding of practice and the meaning-making potentialities of the everyday; the implications of language and categorisation; the interplay between (curating/researching) subject and (curated/researched) object; evolving notions of agency, authorship and audience; together with social engagement, and the archive as dynamic process and product. The curation rationale proposed stems from Bourdieu’s three-stage field analysis model, which places a strong emphasis on habitus, considered to be most accurately (re)presented through blogs, yet necessitates its contextualisation within the broader (diasporic) field(s), through institutional websites, for example, whilst advocating a reflexive awareness of the researcher/curator’s (subjective) role. This, alongside the Kressian acknowledgement of the inherent multimodality of on-line resources, lends itself convincingly to selection and valuation strategies, whilst the discussion of language, genre, authorship and audience is relevant to the potential cataloguing of Web objects. By conceptualising the culturally themed selective Web-archiving process within the ethnosemiotic framework constructed, concrete recommendations emerge regarding curation, classification and crowd-sourcing.

  16. Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools

    Science.gov (United States)

    Zalles, Daniel R.; Manitakos, James

    2016-01-01

    Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), a 4.5-year National Science Foundation funded project, explored the strategies that stimulate teacher commitment to the project's driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate,…

  17. Recursos para el desarrollo de bibliotecas digitales en ambiente web 2.0

    OpenAIRE

    Céspedes Escobar, Nazly; Díaz Souza, Eddy

    2007-01-01

    The emergence of Information and Communication Technologies (ICT) drove this change and facilitated the appearance of digital libraries, which initially promoted their online catalogues; these acted as bridges or gateways connecting their user communities (on-site and remote) with the physical collection. Today, the development of these technologies proposes the use of tools and resources that call for the transformation of digital libraries within Web 2.0, where the philosophy aim...

  18. An application of data mining techniques in designing catalogue for a laundry service

    Directory of Open Access Journals (Sweden)

    Khasanah Annisa Uswatun

    2018-01-01

    Full Text Available Catalogues are the media that companies use to promote their products or services. Since the catalogue is a marketing medium, the first essential step before designing a product catalogue is determining the target market. It is also important to include information that appeals to the target market, such as discounts or promotions, by analysing customers' preference patterns in using services or buying products. This study applies two data mining techniques. The first is clustering analysis to segment customers, and the second is association rule mining to discover interesting patterns in the services commonly used by customers at the same service time. The results are used as recommendations for an attractive marketing strategy to be included in the service catalogue promotion for a laundry in Sleman, Yogyakarta. The clustering results show that the biggest customer segment is university students who come three to five times a month, on weekends, while the association rule results show that clothes, shoes, and bed sheets have a strong relationship. The catalogue design is presented at the end of the paper.
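
    A sketch of the association-rule step using the apriori implementation in the mlxtend library, which is one possible tool for this kind of analysis. The transactions and thresholds below are invented and do not reproduce the paper's laundry data or its chosen algorithm settings.

      # Sketch of association rule mining with mlxtend's apriori. The
      # transactions are invented and do not reproduce the paper's data;
      # the support/confidence thresholds are arbitrary.
      import pandas as pd
      from mlxtend.preprocessing import TransactionEncoder
      from mlxtend.frequent_patterns import apriori, association_rules

      transactions = [
          ["clothes", "shoes", "bed sheet"],
          ["clothes", "bed sheet"],
          ["clothes", "shoes"],
          ["clothes", "jacket"],
      ]

      te = TransactionEncoder()
      onehot = pd.DataFrame(te.fit(transactions).transform(transactions),
                            columns=te.columns_)

      frequent = apriori(onehot, min_support=0.5, use_colnames=True)
      rules = association_rules(frequent, metric="confidence", min_threshold=0.6)
      print(rules[["antecedents", "consequents", "support", "confidence"]])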

  19. Geospatial Big Data Handling Theory and Methods: A Review and Research Challenges

    DEFF Research Database (Denmark)

    Li, Songnian; Dragicevic, Suzana; Anton, François

    2016-01-01

    Big data has now become a strong focus of global interest that is increasingly attracting the attention of academia, industry, government and other organizations. Big data can be situated in the disciplinary area of traditional geospatial data handling theory and methods. The increasing volume...... for Photogrammetry and Remote Sensing (ISPRS) Technical Commission II (TC II) revisits the existing geospatial data handling methods and theories to determine if they are still capable of handling emerging geospatial big data. Further, the paper synthesises problems, major issues and challenges with current...... developments as well as recommending what needs to be developed further in the near future....

  20. Towards the Development of a Taxonomy for Visualisation of Streamed Geospatial Data

    Science.gov (United States)

    Sibolla, B. H.; Van Zyl, T.; Coetzee, S.

    2016-06-01

    Geospatial data has very specific characteristics that need to be carefully captured in its visualisation, in order for the user and the viewer to gain knowledge from it. The science of visualisation has gained much traction over the last decade as a response to various visualisation challenges. During the development of an open source based, dynamic two-dimensional visualisation library, that caters for geospatial streaming data, it was found necessary to conduct a review of existing geospatial visualisation taxonomies. The review was done in order to inform the design phase of the library development, such that either an existing taxonomy can be adopted or extended to fit the needs at hand. The major challenge in this case is to develop dynamic two dimensional visualisations that enable human interaction in order to assist the user to understand the data streams that are continuously being updated. This paper reviews the existing geospatial data visualisation taxonomies that have been developed over the years. Based on the review, an adopted taxonomy for visualisation of geospatial streaming data is presented. Example applications of this taxonomy are also provided. The adopted taxonomy will then be used to develop the information model for the visualisation library in a further study.

  1. Tendencies in the application of the concept of catalogue marketing in Republic of Serbia and the world

    Directory of Open Access Journals (Sweden)

    Zelić Darko

    2010-01-01

    Full Text Available Catalogue marketing is one of the direct marketing channels. This concept implies making many strategic and tactical decisions that determine a catalogue's market success. Catalogue sales are most developed in the USA (where they originated) and in Western Europe. In Serbia, catalogue marketing has been applied only in the last few years, since big foreign catalogue companies started doing business in this region. Here, catalogue marketing is at a lower level of development than in the developed countries, and it comprises a minor part of total trade turnover. The positive development is that there are now laws regulating this area, which is encouraging for its growth. More and more companies in Serbia are presenting and selling their product ranges through Internet catalogues. The survey, whose results are briefly presented in this article, showed that consumers in Serbia shop by print catalogues less than consumers in developed countries, and that the proportion of those who buy through e-catalogues is increasing. With an increase in the standard of living and the overcoming of the crisis, there is a chance for catalogue marketing to become a much more important concept among consumers and companies in Serbia.

  2. The compiled catalogue of galaxies in machine-readable form and its statistical investigation

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1982-01-01

    The compilation of a machine-readable catalogue of relatively bright galaxies was undertaken in Abastumani Astrophysical Observatory in order to facilitate the statistical analysis of a large observational material on galaxies from the Palomar Sky Survey. In compiling the catalogue of galaxies the following problems were considered: the collection of existing information for each galaxy; a critical approach to data aimed at the selection of the most important features of the galaxies; the recording of data in computer-readable form; and the permanent updating of the catalogue. (Auth.)

  3. Planck 2013 results. XXXII. The updated Planck catalogue of Sunyaev-Zeldovich sources

    Science.gov (United States)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Aussel, H.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Barrena, R.; Bartelmann, M.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bikmaev, I.; Bobin, J.; Bock, J. J.; Böhringer, H.; Bonaldi, A.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Bridges, M.; Bucher, M.; Burenin, R.; Burigana, C.; Butler, R. C.; Cardoso, J.-F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chary, R.-R.; Chen, X.; Chiang, H. C.; Chiang, L.-Y.; Chon, G.; Christensen, P. R.; Churazov, E.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Comis, B.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Da Silva, A.; Dahle, H.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Démoclès, J.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Enßlin, T. A.; Eriksen, H. K.; Feroz, F.; Ferragamo, A.; Finelli, F.; Flores-Cacho, I.; Forni, O.; Frailis, M.; Franceschi, E.; Fromenteau, S.; Galeotta, S.; Ganga, K.; Génova-Santos, R. T.; Giard, M.; Giardino, G.; Gilfanov, M.; Giraud-Héraud, Y.; González-Nuevo, J.; Górski, K. M.; Grainge, K. J. B.; Gratton, S.; Gregorio, A.; Groeneboom, N., E.; Gruppuso, A.; Hansen, F. K.; Hanson, D.; Harrison, D.; Hempel, A.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Hurley-Walker, N.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Khamitov, I.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Laureijs, R. J.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Li, C.; Liddle, A.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; MacTavish, C. J.; Maffei, B.; Maino, D.; Mandolesi, N.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Mei, S.; Meinhold, P. R.; Melchiorri, A.; Melin, J.-B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mikkelsen, K.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nastasi, A.; Nati, F.; Natoli, P.; Nesvadba, N. P. H.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I. J.; Olamaie, M.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrott, Y. C.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reach, W. T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rumsey, C.; Rusholme, B.; Sandri, M.; Santos, D.; Saunders, R. D. E.; Savini, G.; Schammel, M. P.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Shimwell, T. W.; Spencer, L. 
D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Streblyanska, A.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tramonte, D.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vibert, L.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; White, M.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2015-09-01

    We update the all-sky Planck catalogue of 1227 clusters and cluster candidates (PSZ1) published in March 2013, derived from detections of the Sunyaev-Zeldovich (SZ) effect using the first 15.5 months of Planck satellite observations. As an addendum, we deliver an updated version of the PSZ1 catalogue, reporting the further confirmation of 86 Planck-discovered clusters. In total, the PSZ1 now contains 947 confirmed clusters, of which 214 were confirmed as newly discovered clusters through follow-up observations undertaken by the Planck Collaboration. The updated PSZ1 contains redshifts for 913 systems, of which 736 (~ 80.6%) are spectroscopic, and associated mass estimates derived from the Yz mass proxy. We also provide a new SZ quality flag for the remaining 280 candidates. This flag was derived from a novel artificial neural-network classification of the SZ signal. Based on this assessment, the purity of the updated PSZ1 catalogue is estimated to be 94%. In this release, we provide the full updated catalogue and an additional readme file with further information on the Planck SZ detections. The catalogue is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/581/A14

  4. The British Film Catalogue: 1895-1970.

    Science.gov (United States)

    Gifford, Denis

    This reference book catalogues nearly every commercial film produced in Britain for public entertainment from 1895 to 1970. The entries are listed chronologically by year and month. Each entry is limited to a single film and contains a cross index code number, exhibition date, main title, length, color system, production company, distribution…

  5. Geospatial Data as a Service: The GEOGLAM Rangelands and Pasture Productivity Map Experience

    Science.gov (United States)

    Evans, B. J. K.; Antony, J.; Guerschman, J. P.; Larraondo, P. R.; Richards, C. J.

    2017-12-01

    Empowering end-users such as pastoralists, land management specialists and land policy makers in the use of earth observation data for both day-to-day and seasonal planning requires both interactive delivery of multiple geospatial datasets and the capability of supporting on-the-fly dynamic queries, while simultaneously fostering a community around the effort. The use and wide adoption of large data archives, like those produced by earth observation missions, are often limited by the compute and storage capabilities of the remote user. We demonstrate that wide-scale use of large data archives can be facilitated by end-users dynamically requesting value-added products using open standards (WCS, WMS, WPS), with compute running in the cloud or dedicated data centres and visualization of outputs on web front ends. As an example, we will demonstrate how a tool called GSKY can empower a remote end-user by providing the data delivery and analytics capabilities for the GEOGLAM Rangelands and Pasture Productivity (RAPP) Map tool. The GEOGLAM RAPP initiative from the Group on Earth Observations (GEO) and its Agricultural Monitoring subgroup aims at providing practical tools to end-users, focusing on the important role of rangelands and pasture systems in providing food production security from both agricultural crops and animal protein. Figure 1 is a screen capture from the RAPP Map interface for an important pasture area in the Namibian rangelands. The RAPP Map has been in production for six months and has garnered significant interest from groups and users all over the world. GSKY, formulated around the theme of open geospatial Data-as-a-Service capabilities, uses distributed computing and storage to facilitate this. It works behind the scenes, accepting OGC standard requests in WCS, WMS and WPS. Results from these requests are rendered on a web front end. In this way, the complexities of data locality and compute execution are masked from the end user. On-the-fly computation of
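
    Complementing the WMS rendering path, a client can pull the underlying gridded values with a standard WCS request. The sketch below issues an OGC WCS 2.0.1 GetCoverage request for a spatial subset of a coverage; the endpoint URL and coverage identifier are hypothetical, and only the standard key-value parameters are assumed, not any GSKY- or RAPP-specific extension.

      # Sketch of an OGC WCS 2.0.1 GetCoverage request for a spatial subset of a
      # gridded coverage. Endpoint URL and coverage identifier are hypothetical;
      # the key-value parameters follow the WCS standard.
      import requests

      WCS_ENDPOINT = "https://example.org/ows"          # hypothetical endpoint

      params = [
          ("service", "WCS"),
          ("version", "2.0.1"),
          ("request", "GetCoverage"),
          ("coverageId", "pasture_productivity"),       # hypothetical coverage id
          ("subset", "Lat(-30.0,-20.0)"),
          ("subset", "Long(120.0,130.0)"),
          ("format", "image/tiff"),
      ]

      response = requests.get(WCS_ENDPOINT, params=params, timeout=120)
      response.raise_for_status()
      with open("subset.tif", "wb") as f:
          f.write(response.content)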

  6. Fast Deployment on the Cloud of Integrated Postgres, API and a Jupyter Notebook for Geospatial Collaboration

    Science.gov (United States)

    Fatland, R.; Tan, A.; Arendt, A. A.

    2016-12-01

    We describe a Python-based implementation of a PostgreSQL database accessed through an Application Programming Interface (API) hosted on the Amazon Web Services public cloud. The data are geospatial and concern hydrological model results in the glaciated catchment basins of southcentral and southeast Alaska. The implementation, however, is intended to be generalized to other forms of geophysical data, particularly data that are intended to be shared across a collaborative team or publicly. An example (moderate-size) dataset is provided together with the code base and a complete installation tutorial on GitHub. An enthusiastic scientist with some familiarity with software installation can replicate the example system in two hours. The installation includes the database, the API, a test client and a supporting Jupyter Notebook, specifically oriented towards Python 3 and markup text, to comprise an executable paper. Installation 'on the cloud' often engenders discussion and consideration of cloud cost and safety. By treating the process as a somewhat "cookbook" exercise, we hope first to demonstrate the feasibility of the proposition. A discussion of cost and data security is provided in this presentation and in the accompanying tutorial/documentation. This geospatial data system case study is part of a larger effort at the University of Washington to enable research teams to take advantage of the public cloud to meet challenges in data management and analysis.
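
    A minimal sketch of a Python API endpoint over PostgreSQL of the kind described. The framework choice (Flask with psycopg2), the connection settings, and the table and column names are assumptions made for illustration; they are not taken from the project's actual code base on GitHub.

      # Minimal sketch of a Python API over PostgreSQL. Flask and psycopg2 are
      # one possible stack; the connection string, table name and columns are
      # assumptions, not the project's actual schema.
      import os
      import psycopg2
      from flask import Flask, jsonify, request

      app = Flask(__name__)

      def get_conn():
          return psycopg2.connect(os.environ.get("DATABASE_URL",
                                                 "dbname=hydro user=postgres"))

      @app.route("/runoff")
      def runoff():
          """Return modelled runoff records for a named catchment basin."""
          basin = request.args.get("basin", "example_basin")   # hypothetical value
          with get_conn() as conn, conn.cursor() as cur:
              cur.execute(
                  "SELECT obs_date, runoff_mm FROM model_results "
                  "WHERE basin = %s ORDER BY obs_date",
                  (basin,),
              )
              rows = [{"date": d.isoformat(), "runoff_mm": float(r)}
                      for d, r in cur.fetchall()]
          return jsonify(rows)

      if __name__ == "__main__":
          app.run(port=5000)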

  7. A Metadata Schema for Geospatial Resource Discovery Use Cases

    Directory of Open Access Journals (Sweden)

    Darren Hardy

    2014-07-01

    Full Text Available We introduce a metadata schema that focuses on GIS discovery use cases for patrons in a research library setting. Text search, faceted refinement, and spatial search and relevancy are among GeoBlacklight's primary use cases for federated geospatial holdings. The schema supports a variety of GIS data types and enables contextual, collection-oriented discovery applications as well as traditional portal applications. One key limitation of GIS resource discovery is the general lack of normative metadata practices, which has led to a proliferation of metadata schemas and duplicate records. The ISO 19115/19139 and FGDC standards specify metadata formats, but are intricate, lengthy, and not focused on discovery. Moreover, they require sophisticated authoring environments and cataloging expertise. Geographic metadata standards target preservation and quality measure use cases, but they do not provide for simple inter-institutional sharing of metadata for discovery use cases. To this end, our schema reuses elements from Dublin Core and GeoRSS to leverage their normative semantics, community best practices, open-source software implementations, and extensive examples already deployed in discovery contexts such as web search and mapping. Finally, we discuss a Solr implementation of the schema using a "geo" extension to MODS.
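
    To give a flavour of such a discovery record, the sketch below combines Dublin Core-style descriptive fields with a GeoRSS-style bounding box and flattens them into field/value pairs of the kind a Solr index expects. The field names and the record contents are simplified for illustration and are not the normative GeoBlacklight element names.

      # Illustrative discovery record combining Dublin Core-style descriptive
      # fields with a GeoRSS-style bounding box. Field names are simplified and
      # are not the normative GeoBlacklight element names.
      record = {
          "dc:title": "Hydrography of an Example County",
          "dc:description": "Polygon layer of lakes and rivers (illustrative).",
          "dc:creator": ["Example University Library"],
          "dc:format": "Shapefile",
          "dc:rights": "Public",
          "dct:temporal": "2010",
          "georss:box": "41.2 -91.4 42.0 -90.1",   # south west north east
          "layer_geom_type": "Polygon",
      }

      def to_solr_doc(rec):
          """Flatten the record into the kind of field/value pairs a Solr index expects."""
          return {k.replace(":", "_"): v for k, v in rec.items()}

      print(to_solr_doc(record))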

  8. CURRENT TRENDS IN CATALOGUING AND THE CHALLENGES ...

    African Journals Online (AJOL)

    resources collected by libraries, results in rich metadata that can be used for many .... in creating timely and high quality records, cataloguers need to develop a ... is a professional function for which there is no substitutes for the human begin.

  9. Representation of activity in images using geospatial temporal graphs

    Science.gov (United States)

    Brost, Randolph; McLendon, III, William C.; Parekh, Ojas D.; Rintoul, Mark Daniel; Watson, Jean-Paul; Strip, David R.; Diegert, Carl

    2018-05-01

    Various technologies pertaining to modeling patterns of activity observed in remote sensing images using geospatial-temporal graphs are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Activity patterns may be discerned from the graphs by coding nodes representing persistent objects like buildings differently from nodes representing ephemeral objects like vehicles, and examining the geospatial-temporal relationships of ephemeral nodes within the graph.
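
    A small sketch of the graph encoding described, built with the networkx library: persistent and ephemeral objects become attributed nodes, spatial relationships become symmetric edges, and changes in time become directed edges. The objects, coordinates and attribute names are invented for illustration.

      # Small sketch of the described encoding: objects become attributed nodes,
      # spatial relationships become symmetric edge pairs, and changes in time
      # become directed edges. Built with networkx; the objects are invented.
      import networkx as nx

      g = nx.DiGraph()

      # A persistent object (e.g. a building) and an ephemeral object (e.g. a
      # vehicle observed in two images taken at different times).
      g.add_node("building_1", kind="persistent", lon=30.510, lat=50.450)
      g.add_node("vehicle_1@t0", kind="ephemeral", lon=30.512, lat=50.451, time=0)
      g.add_node("vehicle_1@t1", kind="ephemeral", lon=30.515, lat=50.452, time=1)

      # Undirected spatial relationship stored as a symmetric pair of edges.
      g.add_edge("building_1", "vehicle_1@t0", relation="adjacent")
      g.add_edge("vehicle_1@t0", "building_1", relation="adjacent")

      # Directed temporal edge: the same vehicle observed at the next time step.
      g.add_edge("vehicle_1@t0", "vehicle_1@t1", relation="next")

      ephemeral = [n for n, d in g.nodes(data=True) if d["kind"] == "ephemeral"]
      print(ephemeral)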

  10. Efficient Retrieval of the Top-k Most Relevant Spatial Web Objects

    DEFF Research Database (Denmark)

    Cong, Gao; Jensen, Christian Søndergaard; Wu, Dingming

    2009-01-01

    The conventional Internet is acquiring a geo-spatial dimension. Web documents are being geo-tagged, and geo-referenced objects such as points of interest are being associated with descriptive text documents. The resulting fusion of geo-location and documents enables a new kind of top-k query...... that takes into account both location proximity and text relevancy. To our knowledge, only naive techniques exist that are capable of computing a general web information retrieval query while also taking location into account. This paper proposes a new indexing framework for location-aware top-k text...... both text relevancy and location proximity to prune the search space. Results of empirical studies with an implementation of the framework demonstrate that the paper’s proposal offers scalability and is capable of excellent performance....
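
    The query semantics can be illustrated with a naive baseline that combines text relevancy and location proximity into one score and returns the k best objects. This is only a linear scan showing what the query computes; it is not the indexing framework proposed in the paper, and the objects, weighting and scoring functions are invented.

      # Naive baseline illustrating a location-aware top-k text query: each
      # object gets a score combining text relevancy and spatial proximity, and
      # the k best are returned. Not the paper's index; objects and weights are
      # invented for illustration.
      import heapq
      import math

      def score(obj, query_terms, query_loc, alpha=0.5):
          terms = obj["text"].lower().split()
          text_rel = sum(terms.count(t) for t in query_terms) / max(len(terms), 1)
          proximity = 1.0 / (1.0 + math.dist(obj["loc"], query_loc))
          return alpha * text_rel + (1 - alpha) * proximity

      def top_k(objects, query_terms, query_loc, k=2):
          return heapq.nlargest(k, objects,
                                key=lambda o: score(o, query_terms, query_loc))

      objects = [
          {"id": 1, "text": "italian pizza restaurant", "loc": (0.1, 0.2)},
          {"id": 2, "text": "pizza and pasta bar",      "loc": (5.0, 5.0)},
          {"id": 3, "text": "hardware store",           "loc": (0.0, 0.1)},
      ]
      print([o["id"] for o in top_k(objects, ["pizza"], (0.0, 0.0))])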

  11. The ASAS-SN bright supernova catalogue - III. 2016

    Science.gov (United States)

    Holoien, T. W.-S.; Brown, J. S.; Stanek, K. Z.; Kochanek, C. S.; Shappee, B. J.; Prieto, J. L.; Dong, Subo; Brimacombe, J.; Bishop, D. W.; Bose, S.; Beacom, J. F.; Bersier, D.; Chen, Ping; Chomiuk, L.; Falco, E.; Godoy-Rivera, D.; Morrell, N.; Pojmanski, G.; Shields, J. V.; Strader, J.; Stritzinger, M. D.; Thompson, Todd A.; Woźniak, P. R.; Bock, G.; Cacella, P.; Conseil, E.; Cruz, I.; Fernandez, J. M.; Kiyota, S.; Koff, R. A.; Krannich, G.; Marples, P.; Masi, G.; Monard, L. A. G.; Nicholls, B.; Nicolas, J.; Post, R. S.; Stone, G.; Wiethoff, W. S.

    2017-11-01

    This catalogue summarizes information for all supernovae discovered by the All-Sky Automated Survey for SuperNovae (ASAS-SN) and all other bright (mpeak ≤ 17), spectroscopically confirmed supernovae discovered in 2016. We then gather the near-infrared through ultraviolet magnitudes of all host galaxies and the offsets of the supernovae from the centres of their hosts from public data bases. We illustrate the results using a sample that now totals 668 supernovae discovered since 2014 May 1, including the supernovae from our previous catalogues, with type distributions closely matching those of the ideal magnitude limited sample from Li et al. This is the third of a series of yearly papers on bright supernovae and their hosts from the ASAS-SN team.

  12. Stakeholder Alignment and Changing Geospatial Information Capabilities

    Science.gov (United States)

    Winter, S.; Cutcher-Gershenfeld, J.; King, J. L.

    2015-12-01

    Changing geospatial information capabilities can have major economic and social effects on activities such as drought monitoring, weather forecasts, agricultural productivity projections, water and air quality assessments, the effects of forestry practices, and so on. Whose interests are served by such changes? Two common mistakes are assuming stability in the community of stakeholders and consistency in stakeholder behavior. Stakeholder communities can reconfigure dramatically as some leave the discussion, others enter, and circumstances shift, all resulting in dynamic points of alignment and misalignment. New stakeholders can bring new interests, and existing stakeholders can change their positions. Stakeholders and their interests need to be considered as geospatial information capabilities change, but this is easier said than done. New ways of thinking about stakeholder alignment in light of changes in capability are presented.

  13. Provenance metadata gathering and cataloguing of EFIT++ code execution

    Energy Technology Data Exchange (ETDEWEB)

    Lupelli, I., E-mail: ivan.lupelli@ccfe.ac.uk [CCFE, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Muir, D.G.; Appel, L.; Akers, R.; Carr, M. [CCFE, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Abreu, P. [Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade de Lisboa, 1049-001 Lisboa (Portugal)

    2015-10-15

    Highlights: • An approach for automatic gathering of provenance metadata has been presented. • A provenance metadata catalogue has been created. • The overhead in the code runtime is less than 10%. • The metadata/data size ratio is about ∼20%. • A visualization interface based on Gephi, has been presented. - Abstract: Journal publications, as the final product of research activity, are the result of an extensive complex modeling and data analysis effort. It is of paramount importance, therefore, to capture the origins and derivation of the published data in order to achieve high levels of scientific reproducibility, transparency, internal and external data reuse and dissemination. The consequence of the modern research paradigm is that high performance computing and data management systems, together with metadata cataloguing, have become crucial elements within the nuclear fusion scientific data lifecycle. This paper describes an approach to the task of automatically gathering and cataloguing provenance metadata, currently under development and testing at Culham Center for Fusion Energy. The approach is being applied to a machine-agnostic code that calculates the axisymmetric equilibrium force balance in tokamaks, EFIT++, as a proof of principle test. The proposed approach avoids any code instrumentation or modification. It is based on the observation and monitoring of input preparation, workflow and code execution, system calls, log file data collection and interaction with the version control system. Pre-processing, post-processing, and data export and storage are monitored during the code runtime. Input data signals are captured using a data distribution platform called IDAM. The final objective of the catalogue is to create a complete description of the modeling activity, including user comments, and the relationship between data output, the main experimental database and the execution environment. For an intershot or post-pulse analysis (∼1000
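
    A small sketch of the kind of run-level provenance record such a non-intrusive approach might gather: hashes of the input files, the version-control state of the code, and a timestamp. The command and file names are hypothetical, and this is a generic illustration, not the EFIT++/IDAM tooling itself.

      # Small sketch of a run-level provenance record gathered without
      # instrumenting the monitored code: input file hashes, version-control
      # state and a timestamp. The file paths are hypothetical; this is not the
      # EFIT++/IDAM tooling itself.
      import hashlib
      import json
      import subprocess
      from datetime import datetime, timezone
      from pathlib import Path

      def file_sha256(path):
          return hashlib.sha256(Path(path).read_bytes()).hexdigest()

      def git_commit(repo="."):
          return subprocess.run(["git", "rev-parse", "HEAD"], cwd=repo,
                                capture_output=True, text=True).stdout.strip()

      def provenance_record(inputs, repo="."):
          return {
              "timestamp": datetime.now(timezone.utc).isoformat(),
              "code_version": git_commit(repo),
              "inputs": {p: file_sha256(p) for p in inputs},
          }

      # Example (hypothetical input file):
      # record = provenance_record(["equilibrium_input.dat"])
      # print(json.dumps(record, indent=2))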

  14. Provenance metadata gathering and cataloguing of EFIT++ code execution

    International Nuclear Information System (INIS)

    Lupelli, I.; Muir, D.G.; Appel, L.; Akers, R.; Carr, M.; Abreu, P.

    2015-01-01

    Highlights: • An approach for automatic gathering of provenance metadata has been presented. • A provenance metadata catalogue has been created. • The overhead in the code runtime is less than 10%. • The metadata/data size ratio is about ∼20%. • A visualization interface based on Gephi, has been presented. - Abstract: Journal publications, as the final product of research activity, are the result of an extensive complex modeling and data analysis effort. It is of paramount importance, therefore, to capture the origins and derivation of the published data in order to achieve high levels of scientific reproducibility, transparency, internal and external data reuse and dissemination. The consequence of the modern research paradigm is that high performance computing and data management systems, together with metadata cataloguing, have become crucial elements within the nuclear fusion scientific data lifecycle. This paper describes an approach to the task of automatically gathering and cataloguing provenance metadata, currently under development and testing at Culham Center for Fusion Energy. The approach is being applied to a machine-agnostic code that calculates the axisymmetric equilibrium force balance in tokamaks, EFIT++, as a proof of principle test. The proposed approach avoids any code instrumentation or modification. It is based on the observation and monitoring of input preparation, workflow and code execution, system calls, log file data collection and interaction with the version control system. Pre-processing, post-processing, and data export and storage are monitored during the code runtime. Input data signals are captured using a data distribution platform called IDAM. The final objective of the catalogue is to create a complete description of the modeling activity, including user comments, and the relationship between data output, the main experimental database and the execution environment. For an intershot or post-pulse analysis (∼1000

  15. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin; Kuester, Falk

    2010-01-01

    data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented

  16. NASA space geodesy program: Catalogue of site information

    Science.gov (United States)

    Bryant, M. A.; Noll, C. E.

    1993-01-01

    This is the first edition of the NASA Space Geodesy Program: Catalogue of Site Information. This catalogue supersedes all previous versions of the Crustal Dynamics Project: Catalogue of Site Information, last published in May 1989. This document is prepared under the direction of the Space Geodesy and Altimetry Projects Office (SGAPO), Code 920.1, Goddard Space Flight Center. SGAPO has assumed the responsibilities of the Crustal Dynamics Project, which officially ended December 31, 1991. The catalog contains information on all NASA-supported sites as well as sites from cooperating international partners. This catalog is designed to provide descriptions and occupation histories of high-accuracy geodetic measuring sites employing space-related techniques. The emphasis of the catalog has been in the past, and continues to be with this edition, station information for facilities and remote locations utilizing the Satellite Laser Ranging (SLR), Lunar Laser Ranging (LLR), and Very Long Baseline Interferometry (VLBI) techniques. With the proliferation of high-quality Global Positioning System (GPS) receivers and Doppler Orbitography and Radiopositioning Integrated by Satellite (DORIS) transponders, many co-located at established SLR and VLBI observatories, the requirement for accurate station and localized survey information for an ever-broadening base of scientists and engineers has been recognized. It is our objective to provide accurate station information to scientific groups interested in these facilities.

  17. Process and results of the development of an ICNP® Catalogue for Cancer Pain

    Directory of Open Access Journals (Sweden)

    Marisaulina Wanderley Abrantes de Carvalho

    2013-10-01

    Full Text Available This was a methodological study conducted to describe the process and results of the development of an International Classification for Nursing Practice (ICNP®) Catalogue for Cancer Pain. According to the International Council of Nurses (ICN), this catalogue contains a subset of nursing diagnoses, outcomes, and interventions to document the implementation of the nursing process in cancer patients. This catalogue was developed in several steps according to the guidelines recommended by the ICN. As a result, 68 statements on nursing diagnoses/outcomes were obtained, which were classified according to the theoretical model for nursing care related to cancer pain into physical (28), psychological (29), and sociocultural and spiritual (11) aspects. A total of 116 corresponding nursing interventions were obtained. The proposed ICNP® Catalogue for Cancer Pain aims to provide safe and systematic orientation to nurses who work in this field, thus improving the quality of patient care and facilitating the performance of the nursing process.

  18. Vienna International Centre Library Film and Video Catalogue: Peaceful applications of nuclear energy 1928-1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The catalogue lists films and videos which are available on free loan from Vienna International Centre Library for educational, non-commercial, non-profit showings involving no admission charges or appeals for funds. Much of the material listed has been donated to the IAEA by the Governments of Member States. The items are arranged in the catalogue by number. The catalogue also includes a title index and a subject index

  19. Vienna International Centre Library Film and Video Catalogue: Peaceful applications of nuclear energy 1928-1998

    International Nuclear Information System (INIS)

    1998-01-01

    The catalogue lists films and videos which are available on free loan from Vienna International Centre Library for educational, non-commercial, non-profit showings involving no admission charges or appeals for funds. Much of the material listed has been donated to the IAEA by the Governments of Member States. The items are arranged in the catalogue by number. The catalogue also includes a title index and a subject index

  20. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    Science.gov (United States)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

    The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community to develop nested gridding standards to tackle these issues over many years. These efforts have recently culminated in a Discrete Global Grid Systems (DGGS) standard developed under the auspices of the Open Geospatial Consortium (OGC). A DGGS provides a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. A DGGS addresses the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cells are the principal aspects of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. During
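
    The idea of a hierarchical tessellation with a unique identifier per cell can be illustrated with a toy quadtree-style index over latitude and longitude. This sketch is not an OGC DGGS implementation: it ignores the equal-area requirement and the other conformance criteria of the standard, and serves only to show how nearby points share index prefixes at coarser resolutions.

      # Toy quadtree-style cell index over latitude/longitude, illustrating a
      # hierarchical tessellation with a unique identifier per cell and
      # resolution. NOT an OGC DGGS implementation: it ignores equal-area cells
      # and the other requirements of the standard.
      def cell_id(lat, lon, resolution):
          """Return a string of quadrant digits, one per refinement level."""
          lat_lo, lat_hi = -90.0, 90.0
          lon_lo, lon_hi = -180.0, 180.0
          digits = []
          for _ in range(resolution):
              lat_mid = (lat_lo + lat_hi) / 2
              lon_mid = (lon_lo + lon_hi) / 2
              quadrant = (2 if lat >= lat_mid else 0) + (1 if lon >= lon_mid else 0)
              digits.append(str(quadrant))
              lat_lo, lat_hi = (lat_mid, lat_hi) if lat >= lat_mid else (lat_lo, lat_mid)
              lon_lo, lon_hi = (lon_mid, lon_hi) if lon >= lon_mid else (lon_lo, lon_mid)
          return "".join(digits)

      # Aggregation to a coarser cell is a simple prefix truncation.
      fine = cell_id(-35.3, 149.1, 8)
      coarse = cell_id(-35.3, 149.1, 4)
      print(fine, coarse, fine.startswith(coarse))   # the coarse id prefixes the fine id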

  1. NativeView: A Geospatial Curriculum for Native Nation Building

    Science.gov (United States)

    Rattling Leaf, J.

    2007-12-01

    In the spirit of collaboration and reciprocity, James Rattling Leaf of Sinte Gleska University on the Rosebud Reservation of South Dakota will present recent developments, experiences, insights and a vision for education in Indian Country. As a thirty-year-young institution, Sinte Gleska University was founded on a strong vision of ancestral leadership and the values of the Lakota Way of Life. Sinte Gleska University (SGU) has initiated the development of a Geospatial Education Curriculum project. NativeView: A Geospatial Curriculum for Native Nation Building is a two-year project that entails a disciplined approach towards the development of a relevant geospatial academic curriculum. This project is designed to meet the educational and land management needs of the Rosebud Lakota Tribe through the utilization of Geographic Information Systems (GIS), Remote Sensing (RS) and Global Positioning Systems (GPS). In conjunction with the strategy and progress of this academic project, a formal presentation and demonstration of the SGU-based geospatial software RezMapper will exemplify an innovative example of state-of-the-art information technology. RezMapper is an interactive CD software package focused on the 21 Lakota communities on the Rosebud Reservation that utilizes an ingenious concept of multimedia mapping and state-of-the-art data compression and presentation. This ongoing development utilizes geographic data, imagery from space, historical aerial photography and cultural features such as historic Lakota documents, language, song, video and historical photographs in a multimedia fashion. As a tangible product, RezMapper will be a project deliverable tool for use in the classroom and by a broad range of learners.

  2. The new geospatial tools: global transparency enhancing safeguards verification

    International Nuclear Information System (INIS)

    Pabian, Frank Vincent

    2010-01-01

    This paper focuses on the importance and potential role of the new, freely available, geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency' and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based Blogs and Wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D Modeling and visualization) was only made possible following the acquisition of higher resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  3. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    Science.gov (United States)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility of the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library "arc4nix" to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run in the native Python environment. It uses functional programming and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arc4nix scales linearly in a distributed environment. Arc4nix is open-source software.
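
    The decoupled client-server pattern can be sketched as follows; the server URL, endpoint name and JSON payload are hypothetical and do not reproduce arc4nix's actual protocol, which is not detailed in the abstract.

        # Hypothetical sketch: the client dynamically builds a string of arcpy code,
        # ships it to a remote worker that has ArcGIS installed, and retrieves the
        # result. URL and payload fields are invented for illustration.
        import requests

        def remote_buffer(server_url, in_features, out_features, distance="10 Kilometers"):
            # Construct the geoprocessing code to be executed on the server side.
            code = (
                "import arcpy\n"
                f"arcpy.Buffer_analysis({in_features!r}, {out_features!r}, {distance!r})\n"
            )
            # Send the generated code and wait for the server to report completion.
            response = requests.post(f"{server_url}/execute", json={"code": code}, timeout=600)
            response.raise_for_status()
            return response.json()

        # result = remote_buffer("http://gis-worker.example.org", "roads.shp", "roads_buf.shp")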

  4. The sixth catalogue of galactic Wolf-Rayet stars, their past and present

    International Nuclear Information System (INIS)

    Hucht, K.A. van der; Conti, P.S.; Lundstroem, I.; Stenholm, B.

    1981-01-01

    This paper presents the Sixth Catalogue of galactic Wolf-Rayet stars (Pop I), a short history on the five earlier WR catalogues, improved spectral classification, finding charts, a discussion on related objects, and a review of the current statur of Wolf-Rayet star research. The appendix presents a bibliography on most of the Wolf-Rayet literature published since 1867. (orig.)

  5. Catalogue of high-mass X-ray binaries in the Galaxy (4th edition)

    NARCIS (Netherlands)

    Liu, Q.Z.; van Paradijs, J.; van den Heuvel, E.P.J.

    2006-01-01

    We present a new edition of the catalogue of high-mass X-ray binaries in the Galaxy. The catalogue contains source name(s), coordinates, finding chart, X-ray luminosity, system parameters, and stellar parameters of the components and other characteristic properties of 114 high-mass X-ray binaries,

  6. The LaMMA Consortium Geoportal: dissemination of meteorological data in near real-time via OGC standards and Open Source software

    Directory of Open Access Journals (Sweden)

    Simone Giannechini

    2014-02-01

    This paper describes the spatial data infrastructure (SDI) used by the LaMMA Consortium (Environmental Modelling and Monitoring Laboratory for Sustainable Development) of the Tuscany Region for sharing, viewing and cataloguing (metadata and related information) all geospatial data that are daily processed and used operationally in many meteorological and environmental applications. The SDI was developed using Open Source technologies; moreover, the geospatial data have been exposed through protocols based on OGC (Open Geospatial Consortium) standards such as WMS, WFS and CSW. GeoServer was used for disseminating geospatial data and maps through the OGC WMS and WFS protocols, while GeoNetwork was used as the cataloguing and search portal, also through the CSW protocol; finally, MapStore was used to implement the mash-up front-end. The innovative aspect of this portal is that it currently ingests, fuses and disseminates geospatial data related to the MetOc field from various sources in near real-time, in a comprehensive manner that allows users to create added-value visualizations for the support of operational use cases as well as to access and download the underlying data (where applicable).
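
    As a concrete illustration of the WMS access path described above, the following sketch uses the OWSLib Python library against a GeoServer WMS endpoint; the service URL, layer name and bounding box are placeholders rather than the actual LaMMA services.

        # Sketch of consuming an OGC WMS published by GeoServer, using OWSLib.
        # The endpoint URL and layer name are placeholders for illustration only.
        from owslib.wms import WebMapService

        wms = WebMapService("https://geoportal.example.org/geoserver/wms", version="1.1.1")

        # List the layers advertised in the capabilities document.
        for name, layer in wms.contents.items():
            print(name, "-", layer.title)

        # Request a PNG map of one (hypothetical) meteorological layer over Tuscany.
        img = wms.getmap(
            layers=["meteo:temperature_2m"],
            styles=[""],                     # default styling
            srs="EPSG:4326",
            bbox=(9.0, 42.0, 12.5, 44.6),    # lon/lat extent, WMS 1.1.1 axis order
            size=(600, 400),
            format="image/png",
        )
        with open("temperature.png", "wb") as f:
            f.write(img.read())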

  7. Developing a distributed HTML5-based search engine for geospatial resource discovery

    Science.gov (United States)

    ZHOU, N.; XIA, J.; Nebert, D.; Yang, C.; Gui, Z.; Liu, K.

    2013-12-01

    With the explosive growth of data, Geospatial Cyberinfrastructure (GCI) components are developed to manage geospatial resources, such as data discovery and data publishing. However, the efficiency of geospatial resource discovery is still challenging in that: (1) existing GCIs are usually developed for users of specific domains, so users may have to visit a number of GCIs to find appropriate resources; (2) the complexity of the decentralized network environment usually results in slow response and poor user experience; (3) users who use different browsers and devices may have very different user experiences because of the diversity of front-end platforms (e.g. Silverlight, Flash or HTML). To address these issues, we developed a distributed and HTML5-based search engine. Specifically, (1) the search engine adopts a brokering approach to retrieve geospatial metadata from various and distributed GCIs; (2) the asynchronous record retrieval mode enhances the search performance and user interactivity; (3) the search engine, based on HTML5, is able to provide unified access capabilities for users with different devices (e.g. tablet and smartphone).
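
    A minimal sketch of the brokering and asynchronous retrieval idea, assuming a set of hypothetical catalogue endpoints that accept a keyword query over HTTP and return JSON; the URLs and response format are invented for illustration.

        # Broker that fans a keyword query out to several (hypothetical) catalogue
        # endpoints in parallel and merges the records as they arrive.
        from concurrent.futures import ThreadPoolExecutor, as_completed
        import requests

        CATALOGUE_ENDPOINTS = [
            "https://gci-one.example.org/search",
            "https://gci-two.example.org/search",
        ]

        def query_endpoint(url, keyword):
            resp = requests.get(url, params={"q": keyword, "format": "json"}, timeout=20)
            resp.raise_for_status()
            return resp.json().get("records", [])

        def brokered_search(keyword):
            merged = []
            with ThreadPoolExecutor(max_workers=len(CATALOGUE_ENDPOINTS)) as pool:
                futures = {pool.submit(query_endpoint, url, keyword): url for url in CATALOGUE_ENDPOINTS}
                for future in as_completed(futures):      # asynchronous, per-endpoint retrieval
                    try:
                        merged.extend(future.result())
                    except requests.RequestException:
                        pass                              # a slow or failing GCI does not block the others
            return merged

        # records = brokered_search("land cover")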

  8. Using ESO Reflex with Web Services

    Science.gov (United States)

    Järveläinen, P.; Savolainen, V.; Oittinen, T.; Maisala, S.; Ullgrén, M.; Hook, R.

    2008-08-01

    ESO Reflex is a prototype graphical workflow system, based on Taverna, and primarily intended to be a flexible way of running ESO data reduction recipes along with other legacy applications and user-written tools. ESO Reflex can also readily use the Taverna Web Services features that are based on the Apache Axis SOAP implementation. Taverna is a general purpose Web Service client, and requires no programming to use such services. However, Taverna also has some restrictions: for example, no numerical types such as integers. In addition, the preferred binding style is document/literal wrapped, but most astronomical services publish the Axis default WSDL using RPC/encoded style. Despite these minor limitations we have created a simple but very promising test VO workflow using the Sesame name resolver service at CDS Strasbourg, the Hubble SIAP server at the Multi-Mission Archive at Space Telescope (MAST) and the WESIX image cataloging and catalogue cross-referencing service at the University of Pittsburgh. ESO Reflex can also pass files and URIs via the PLASTIC protocol to visualisation tools and has its own viewer for VOTables. We picked these three Web Services to try to set up a realistic and useful ESO Reflex workflow. They also demonstrate ESO Reflex's ability to use many kinds of Web Services, because each of them requires a different interface. We describe each of these services in turn and comment on how it was used.

  9. The Efficacy of Educative Curriculum Materials to Support Geospatial Science Pedagogical Content Knowledge

    Science.gov (United States)

    Bodzin, Alec; Peffer, Tamara; Kulo, Violet

    2012-01-01

    Teaching and learning about geospatial aspects of energy resource issues requires that science teachers apply effective science pedagogical approaches to implement geospatial technologies into classroom instruction. To address this need, we designed educative curriculum materials as an integral part of a comprehensive middle school energy…

  10. Nansat: a Scientist-Orientated Python Package for Geospatial Data Processing

    Directory of Open Access Journals (Sweden)

    Anton A. Korosov

    2016-10-01

    Nansat is a Python toolbox for analysing and processing 2-dimensional geospatial data, such as satellite imagery, output from numerical models, and gridded in-situ data. It is created with a strong focus on facilitating research, and on the development of algorithms and autonomous processing systems. Nansat extends the widely used Geospatial Data Abstraction Library (GDAL) by adding scientific meaning to the datasets through metadata, and by adding common functionality for data analysis and handling (e.g., exporting to various data formats). Nansat uses metadata vocabularies that follow international metadata standards, in particular the Climate and Forecast (CF) conventions, and the NASA Directory Interchange Format (DIF) and Global Change Master Directory (GCMD) keywords. Functionality that is commonly needed in scientific work, such as seamless access to local or remote geospatial data in various file formats, collocation of datasets from different sources and geometries, and visualization, is also built into Nansat. The paper presents Nansat workflows, its functional structure, and examples of typical applications.
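
    Since Nansat extends GDAL, the metadata and band access it builds on can be illustrated with the plain osgeo.gdal bindings; the file name below is a placeholder and the sketch does not use Nansat's own API.

        # Plain-GDAL sketch of the metadata and band access that Nansat builds upon.
        # 'scene.nc' is a placeholder for any GDAL-readable satellite or model file.
        from osgeo import gdal

        gdal.UseExceptions()
        ds = gdal.Open("scene.nc")

        # Dataset-level metadata; Nansat layers CF/GCMD-style vocabularies on top of this.
        for key, value in ds.GetMetadata().items():
            print(key, "=", value)

        # Read the first band into a NumPy array for analysis or visualization.
        band = ds.GetRasterBand(1)
        data = band.ReadAsArray()
        print("band shape:", data.shape, "no-data value:", band.GetNoDataValue())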

  11. The Future of Geospatial Standards

    Science.gov (United States)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds

  12. Interoperability And Value Added To Earth Observation Data

    Science.gov (United States)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
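
    To make the catalogue-discovery step concrete, here is a sketch using OWSLib against a generic CSW endpoint; the catalogue URL and search term are placeholders, not a specific operational service.

        # Sketch of searching an OGC CSW catalogue for Earth Observation data with OWSLib.
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        csw = CatalogueServiceWeb("https://catalogue.example.org/csw")

        # Full-text style constraint on the csw:AnyText queryable.
        query = PropertyIsLike("csw:AnyText", "%flood%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        for identifier, record in csw.records.items():
            print(identifier, "-", record.title)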

  13. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    Science.gov (United States)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use-cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes of hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using WebGL and the Canvas2D API. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application that is built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https

  14. lawn: An R client for the Turf JavaScript Library for Geospatial Analysis

    Science.gov (United States)

    lawn is an R package to provide access to the geospatial analysis capabilities in the Turf javascript library. Turf expects data in GeoJSON format. Given that many datasets are now available natively in GeoJSON providing an easier method for conducting geospatial analyses on thes...

  15. Promenade Among Words and Things: The Gallery as Catalogue, the Catalogue as Gallery

    Directory of Open Access Journals (Sweden)

    Mari Lending

    2015-12-01

    In the mid nineteenth century new casting techniques allowed for the production of huge architectural fragments. Well-selected collections could ideally display perfect series in galleries in which the visitor could wander among monuments and experience architectural history at full scale. The disembodied material of plaster was considered capable of embodying a number of modern historical taxonomies and aesthetic programs, most importantly chronology, comparison, style, and evolution. Veritable showcases of historicism, the casts could illustrate in spatial arrangements new conceptions of the history, contemporaneity and future of architecture. Plaster casts became a main medium in which to publish antiquities as novelties for grand audiences, taking the printed and published beyond the two-dimensional space of words and images. However, due to the increasing market for casts and their sheer size and weight, the reproductions as mounted in the galleries often behaved as unruly as architecture does outside curatorial control. In the end only the catalogues, the paper versions of these imaginary museums, were capable of creating the orders that their plaster referents constantly aspired to destroy. An important chapter in the history of the architecture museum, these plaster monuments belong to a part of architectural print culture in which catalogues were curated and galleries edited. Metaphors drawn from the realm of writing saturated the discourse on the display of casts. Images and texts fluctuated, and the image-objects were compared to books, paper, pages, documents and libraries, but above all to illustrations inviting promenades in time and space.

  16. The WATCH solar X-ray burst catalogue

    DEFF Research Database (Denmark)

    Crosby, N.; Lund, Niels; Vilmer, N.

    1998-01-01

    The WATCH experiment aboard the GRANAT satellite provides observations of the Sun in the deka-keV range covering the years 1990 through mid-1992. An introduction to the experiment is given followed by an explanation of how the WATCH solar burst catalogue was created. The different parameters list...

  17. A Geospatial Data Recommender System based on Metadata and User Behaviour

    Science.gov (United States)

    Li, Y.; Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; Finch, C. J.; McGibbney, L. J.

    2017-12-01

    Earth observations are produced at a fast velocity by real-time sensors, reaching tera- to peta-bytes of geospatial data daily. Discovering and accessing the right data within this massive volume of geospatial data is like finding a needle in a haystack. To help researchers find the right data for study and decision support, a considerable amount of research focused on improving search performance has been proposed, including recommendation algorithms. However, few papers have discussed how to implement a recommendation algorithm in a geospatial data retrieval system. In order to address this problem, we propose a recommendation engine to improve the discovery of relevant geospatial data by mining and utilizing metadata and user behaviour data: 1) metadata-based recommendation considers the correlation of each attribute (i.e., spatiotemporal, categorical, and ordinal) to the data to be found; in particular, a phrase extraction method is used to improve the accuracy of the description similarity; 2) user behaviour data are utilized to predict the interest of a user through collaborative filtering; 3) an integration method is designed to combine the results of the above two methods to achieve better recommendations. Experiments show that in the hybrid recommendation list, all the precisions are larger than 0.8 from position 1 to 10.
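
    The hybrid combination step can be sketched as a weighted blend of a metadata-similarity score and a collaborative-filtering score; the weights, toy vectors and scores below are generic assumptions for illustration and not the integration method actually evaluated in the paper.

        # Generic sketch of blending metadata similarity with collaborative filtering.
        import numpy as np

        def cosine(u, v):
            return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

        def hybrid_scores(query_vec, item_vecs, cf_scores, alpha=0.5):
            """Blend content-based (metadata) similarity with user-behaviour scores."""
            meta = np.array([cosine(query_vec, v) for v in item_vecs])
            cf = np.asarray(cf_scores, dtype=float)
            return alpha * meta + (1.0 - alpha) * cf

        # Toy example: three candidate datasets described by 4-dimensional metadata vectors.
        query = np.array([1.0, 0.0, 1.0, 0.5])
        items = [np.array([1.0, 0.1, 0.9, 0.4]),
                 np.array([0.0, 1.0, 0.0, 0.2]),
                 np.array([0.8, 0.0, 1.0, 0.6])]
        cf = [0.2, 0.9, 0.4]                      # predicted interest from collaborative filtering
        ranking = np.argsort(-hybrid_scores(query, items, cf))
        print("recommended order:", ranking)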

  18. Dark Energy Survey Year 1 Results: galaxy mock catalogues for BAO

    Energy Technology Data Exchange (ETDEWEB)

    Avila, S.; et al.

    2017-12-17

    Mock catalogues are a crucial tool in the analysis of galaxy survey data, both for the accurate computation of covariance matrices and for the optimisation of analysis methodology and validation of data sets. In this paper, we present a set of 1800 galaxy mock catalogues designed to match the Dark Energy Survey Year-1 BAO sample (Crocce et al. 2017) in abundance, observational volume, redshift distribution and uncertainty, and redshift-dependent clustering. The simulated samples were built upon HALOGEN (Avila et al. 2015) halo catalogues, based on a 2LPT density field with an exponential bias. For each of them, a lightcone is constructed by the superposition of snapshots in the redshift range $0.45\ldots$ The clustering of the mock catalogues is compared to that of the data using the angular correlation function $w(\theta)$, the comoving transverse separation clustering $\xi_{\mu<0.8}(s_{\perp})$ and the angular power spectrum $C_\ell$.

  19. KINGDOM OF SAUDI ARABIA GEOSPATIAL INFORMATION INFRASTRUCTURE – AN INITIAL STUDY

    Directory of Open Access Journals (Sweden)

    S. H. Alsultan

    2015-10-01

    This paper reviews the current Geographic Information System (Longley et al.) implementation and status in the Kingdom of Saudi Arabia (KSA). Based on the review, several problems were identified and discussed. The characteristics of these problems show that the country needs a national geospatial centre. As a new initiative for a national geospatial centre, a study is being conducted, especially on best practice from other countries, the availability of a national committee for standards and policies on data sharing, and the best proposed organizational structure inside the administration for the KSA. The study also covers the degree of readiness and awareness among the main GIS stakeholders within the country as well as private parties. At the end of this paper, strategic steps for the national geospatial management centre were proposed as the initial output of the study.

  20. A Geospatial Decision Support System Toolkit, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and commercialize a working prototype Geospatial Decision Support Toolkit (GeoKit). GeoKit will enable scientists, agencies, and stakeholders to...

  1. Nuclear Knowledge Management Case Studies Catalogue “NKM CSC”

    International Nuclear Information System (INIS)

    Atieh, T.

    2016-01-01

    Over the past several years, many nuclear organizations in IAEA Member States have accumulated considerable experience and achievements in the development and application of nuclear knowledge management (NKM) methodology and tools to improve their organizational performance. The IAEA NKM Section has initiated a project entitled “NKM Case Studies Catalogue (NKM CSC)” to capture, document and preserve NKM experience and to facilitate its sharing among NKM practitioners and experts. This is done through the collection and preservation of relevant experiential knowledge in “case study” format. The catalogue will therefore support community of practice mechanisms. An input template is currently under development and will be used to help contributors in Member States provide a concise set of information about their respective case studies. This information will be made searchable and easily retrievable through a platform that supports collaboration among NKM practitioners and experts. It is planned to launch the Nuclear Knowledge Management Case Studies Catalogue “NKM CSC” on the occasion of the “Third International Conference on Nuclear Knowledge Management—Challenges and Approaches, 7–11 November 2016, Vienna, Austria”, and to include the accepted case studies submitted to this Conference. (author)

  2. Resolving taxonomic discrepancies: Role of Electronic Catalogues of Known Organisms

    Directory of Open Access Journals (Sweden)

    Vishwas Chavan

    2005-01-01

    There is a disparity between the availability of nomenclature-change literature to taxonomists of the developing world and the availability of taxonomic papers published by developing-world scientists to their counterparts in the developed part of the globe. This has resulted in several discrepancies in the naming of organisms. The development of electronic catalogues of names of known organisms would help in pointing out these issues. We have attempted to highlight a few such discrepancies found while developing IndFauna, an electronic catalogue of known Indian fauna, and comparing it with existing global and regional databases.

  3. The new geospatial tools: global transparency enhancing safeguards verification

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank Vincent [Los Alamos National Laboratory

    2010-09-16

    This paper focuses on the importance and potential role of the new, freely available, geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency' and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based Blogs and Wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D Modeling and visualization) was only made possible following the acquisition of higher resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  4. Catalogue Creation for Space Situational Awareness with Optical Sensors

    Science.gov (United States)

    Hobson, T.; Clarkson, I.; Bessell, T.; Rutten, M.; Gordon, N.; Moretti, N.; Morreale, B.

    2016-09-01

    In order to safeguard the continued use of space-based technologies, effective monitoring and tracking of man-made resident space objects (RSOs) is paramount. The diverse characteristics, behaviours and trajectories of RSOs make space surveillance a challenging application of the discipline that is tracking and surveillance. When surveillance systems are faced with non-canonical scenarios, it is common for human operators to intervene while researchers adapt and extend traditional tracking techniques in search of a solution. A complementary strategy for improving the robustness of space surveillance systems is to place greater emphasis on the anticipation of uncertainty. Namely, give the system the intelligence necessary to autonomously react to unforeseen events and to intelligently and appropriately act on tenuous information rather than discard it. In this paper we build from our 2015 campaign and describe the progression of a low-cost intelligent space surveillance system capable of autonomously cataloguing and maintaining track of RSOs. It currently exploits robotic electro-optical sensors, high-fidelity state-estimation and propagation as well as constrained initial orbit determination (IOD) to intelligently and adaptively manage its sensors in order to maintain an accurate catalogue of RSOs. In a step towards fully autonomous cataloguing, the system has been tasked with maintaining surveillance of a portion of the geosynchronous (GEO) belt. Using a combination of survey and track-refinement modes, the system is capable of maintaining a track of known RSOs and initiating tracks on previously unknown objects. Uniquely, due to the use of high-fidelity representations of a target's state uncertainty, as few as two images of previously unknown RSOs may be used to subsequently initiate autonomous search and reacquisition. To achieve this capability, particularly within the congested environment of the GEO-belt, we use a constrained admissible region (CAR) to

  5. Multifractal Omori law for earthquake triggering: new tests on the California, Japan and worldwide catalogues

    Science.gov (United States)

    Ouillon, G.; Sornette, D.; Ribeiro, E.

    2009-07-01

    The Multifractal Stress-Activated model is a statistical model of triggered seismicity based on mechanical and thermodynamic principles. It predicts that, above a triggering magnitude cut-off M0, the exponent p of the Omori law for the time decay of the rate of aftershocks is a linearly increasing function p(M) = a0M + b0 of the main shock magnitude M. We previously reported empirical support for this prediction, using the Southern California Earthquake Center (SCEC) catalogue. Here, we confirm this observation using an updated, longer version of the same catalogue, as well as new methods to estimate p. One of these methods is the newly defined Scaling Function Analysis (SFA), adapted from the wavelet transform. This method is able to measure a mathematical singularity (hence a p-value), erasing the possible regular part of a time-series. The SFA also proves particularly efficient at revealing the coexistence and superposition of several types of relaxation laws (typical Omori sequences and short-lived swarm sequences) which can be mixed within the same catalogue. Another new method consists of monitoring the largest aftershock magnitude observed in successive time intervals, which shortcuts the problem of missing events with small magnitudes in aftershock catalogues. The same methods are used on data from the worldwide Harvard Centroid Moment Tensor (CMT) catalogue and show results compatible with those of Southern California. For the Japan Meteorological Agency (JMA) catalogue, we still observe a linear dependence of p on M, but with a smaller slope. The SFA shows, however, that results for this catalogue may be biased by numerous swarm sequences, despite our efforts to remove them before the analysis.
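
    For reference, the modified Omori law for the aftershock rate and the magnitude-dependent exponent discussed above are commonly written as below, where K is the productivity and c the usual time offset; this is the standard form of the law, restated here for clarity.

        n(t, M) = \frac{K(M)}{(c + t)^{p(M)}}, \qquad p(M) = a_0\,M + b_0 \quad \text{for } M \ge M_0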

  6. Assessing the socioeconomic impact and value of open geospatial information

    Science.gov (United States)

    Pearlman, Francoise; Pearlman, Jay; Bernknopf, Richard; Coote, Andrew; Craglia, Massimo; Friedl, Lawrence; Gallo, Jason; Hertzfeld, Henry; Jolly, Claire; Macauley, Molly K.; Shapiro, Carl; Smart, Alan

    2016-03-10

    The production and accessibility of geospatial information including Earth observation is changing greatly both technically and in terms of human participation. Advances in technology have changed the way that geospatial data are produced and accessed, resulting in more efficient processes and greater accessibility than ever before. Improved technology has also created opportunities for increased participation in the gathering and interpretation of data through crowdsourcing and citizen science efforts. Increased accessibility has resulted in greater participation in the use of data as prices for Government-produced data have fallen and barriers to access have been reduced.

  7. Sextant: Visualizing time-evolving linked geospatial data

    NARCIS (Netherlands)

    C. Nikolaou (Charalampos); K. Dogani (Kallirroi); K. Bereta (Konstantina); G. Garbis (George); M. Karpathiotakis (Manos); K. Kyzirakos (Konstantinos); M. Koubarakis (Manolis)

    2015-01-01

    textabstractThe linked open data cloud is constantly evolving as datasets get continuously updated with newer versions. As a result, representing, querying, and visualizing the temporal dimension of linked data is crucial. This is especially important for geospatial datasets that form the backbone

  8. A cross-sectional ecological analysis of international and sub-national health inequalities in commercial geospatial resource availability.

    Science.gov (United States)

    Dotse-Gborgbortsi, Winfred; Wardrop, Nicola; Adewole, Ademola; Thomas, Mair L H; Wright, Jim

    2018-05-23

    Commercial geospatial data resources are frequently used to understand healthcare utilisation. Although there is widespread evidence of a digital divide for other digital resources and infrastructure, it is unclear how commercial geospatial data resources are distributed relative to health need. To examine the distribution of commercial geospatial data resources relative to health needs, we assembled coverage and quality metrics for commercial geocoding, neighbourhood characterisation, and travel time calculation resources for 183 countries. We developed a country-level, composite index of commercial geospatial data quality/availability and examined its distribution relative to age-standardised all-cause and cause specific (for three main causes of death) mortality using two inequality metrics, the slope index of inequality and relative concentration index. In two sub-national case studies, we also examined geocoding success rates versus area deprivation by district in Eastern Region, Ghana and Lagos State, Nigeria. Internationally, commercial geospatial data resources were inversely related to all-cause mortality. This relationship was more pronounced when examining mortality due to communicable diseases. Commercial geospatial data resources for calculating patient travel times were more equitably distributed relative to health need than resources for characterising neighbourhoods or geocoding patient addresses. Countries such as South Africa have comparatively high commercial geospatial data availability despite high mortality, whilst countries such as South Korea have comparatively low data availability and low mortality. Sub-nationally, evidence was mixed as to whether geocoding success was lowest in more deprived districts. To our knowledge, this is the first global analysis of commercial geospatial data resources in relation to health outcomes. In countries such as South Africa where there is high mortality but also comparatively rich commercial geospatial
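
    One common formulation of the slope index of inequality (SII) used in analyses of this kind regresses the indicator on the population-weighted relative rank of each unit; the sketch below implements that generic formulation with toy numbers and is not the authors' exact specification.

        # Generic sketch of the slope index of inequality: regress the indicator on the
        # midpoint of each unit's cumulative population share, weighted by population.
        import numpy as np

        def slope_index_of_inequality(values, populations, rank_by):
            order = np.argsort(rank_by)                  # rank units (e.g., countries) by the ranking variable
            v = np.asarray(values, dtype=float)[order]
            w = np.asarray(populations, dtype=float)[order]
            share = w / w.sum()
            midpoints = np.cumsum(share) - share / 2.0   # relative rank of each unit on (0, 1)
            # Weighted least squares of v on the midpoints; the fitted slope is the SII.
            X = np.column_stack([np.ones_like(midpoints), midpoints])
            W = np.diag(w)
            beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ v)
            return beta[1]

        # Toy example: a data-quality index for four countries ranked by mortality.
        quality = [0.9, 0.7, 0.5, 0.3]
        population = [50, 80, 120, 60]
        mortality = [400, 600, 800, 1000]
        print("SII:", slope_index_of_inequality(quality, population, rank_by=mortality))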

  9. Research and Practical Trends in Geospatial Sciences

    Science.gov (United States)

    Karpik, A. P.; Musikhin, I. A.

    2016-06-01

    In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and rapid evolution of geospatial science. Almost all professional communities have been affected. Communities are embracing digital techniques, modern equipment, software and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of those trends that will be in great demand in 3-7 years. This paper reviews current scientific and practical activities of such non-governmental international organizations as International Federation of Surveyors, International Cartographic Association, and International Society for Photogrammetry and Remote Sensing, analyzes and groups most relevant topics brought up at their scientific events, forecasts most probable research and practical trends in geospatial sciences, outlines topmost leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation and joint implementation projects.

  10. Implementation of VGI-Based Geoportal for Empowering Citizen's Geospatial Observatories Related to Urban Disaster Management

    Science.gov (United States)

    Lee, Sanghoon

    2016-06-01

    Volunteered geospatial information (VGI) can be an efficient and cost-effective method for generating and sharing large amounts of disaster-related geospatial data. National mapping organizations, which have played the role of the major geospatial data collectors, have been moving toward considering public-participation data collection methods. Because VGI can encourage public participation and empower citizens, mapping agencies could form partnerships with members of the VGI community to help provide well-structured geospatial data. In order for the public semantics, datasets and action model of the public-participation GeoPortal to be effectively understood and shared, the implemented VGI-GeoPortal was designed on the basis of ISO 19154, ISO 19101 and the OGC Reference Model. The proof of concept of the VGI-GeoPortal has been implemented for an urban flooding use-case in the Republic of Korea to collect from the public, and to analyze, disaster-related geospatial data, including high-disaster-potential information such as the locations of poorly draining sewers, early signs of occurring landslides, the flooding vulnerability of urban structures, etc.

  11. Methods and Tools to Align Curriculum to the Skills and Competencies Needed by the Workforce - an Example from Geospatial Science and Technology

    Science.gov (United States)

    Johnson, A. B.

    2012-12-01

    Geospatial science and technology (GST), including geographic information systems, remote sensing, global positioning systems and mobile applications, are valuable tools for geoscientists and students learning to become geoscientists. GST allows the user to analyze data spatially and temporally and then visualize the data and outcomes in multiple formats (digital, web and paper). GST has evolved rapidly and it has been difficult to create effective curriculum, as few guidelines existed to help educators. In 2010, the US Department of Labor (DoL), in collaboration with the National Geospatial Center of Excellence (GeoTech Center), a National Science Foundation supported grant, approved the Geospatial Technology Competency Model (GTCM). The GTCM was developed and vetted with industry experts and provided the structure and example competencies needed across the industry. While the GTCM was helpful, a more detailed list of skills and competencies needed to be identified in order to build appropriate curriculum. The GeoTech Center carried out multiple DACUM events to identify the skills and competencies needed by entry-level workers. DACUM (Developing a Curriculum) is a job analysis process whereby expert workers are convened to describe what they do for a specific occupation. The outcomes from multiple DACUMs were combined into a MetaDACUM and reviewed by hundreds of GST professionals. This provided a list of more than 320 skills and competencies needed by the workforce. The GeoTech Center then held multiple workshops across the U.S. where more than 100 educators knowledgeable in teaching GST parsed the list into Model Courses and a Model Certificate Program. During this process, tools were developed that helped educators define which competencies should be included in a specific course and the depth of instruction for each competency. This presentation will provide details about the process, methodology and tools used to create the Models and suggest how they can be used

  12. Development of Geospatial Map Based Election Portal

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.

    2014-11-01

    The Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed in order to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs, such as DDA, MCD, DJB, the State Election Department, DMRC, etc., for the benefit of all citizens of the GNCTD. This paper describes the development of the Geospatial Map based Election Portal (GMEP) of the NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for planning and management by the Department of the Chief Electoral Officer, and as an election-related information search tool (polling station, assembly and parliamentary constituency, etc.) for the citizens of the NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end on a Microsoft Windows environment. The GMEP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMEP include delimited precinct area boundaries of voter areas of polling stations, assembly constituencies, parliamentary constituencies, election districts, and landmark locations of polling stations and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMEP helps achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for the management of elections. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  13. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.
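
    As a generic illustration of what a signed status message can look like (this is not AliEn's actual envelope format), the sketch below has a storage element sign the observed file size and checksum with a shared-secret HMAC, and the catalogue verify the signature before registering the entry.

        # Generic illustration only: a storage element signs the size and checksum it
        # observed, and the catalogue verifies the signature before trusting them.
        import hashlib
        import hmac
        import json

        SHARED_SECRET = b"se-and-catalogue-shared-secret"   # placeholder key material

        def sign_status(lfn, size, checksum):
            payload = json.dumps({"lfn": lfn, "size": size, "checksum": checksum},
                                 sort_keys=True).encode()
            signature = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
            return payload, signature

        def verify_and_register(catalogue, payload, signature):
            expected = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
            if not hmac.compare_digest(expected, signature):
                raise ValueError("status message rejected: bad signature")
            entry = json.loads(payload)
            catalogue[entry["lfn"]] = {"size": entry["size"], "checksum": entry["checksum"]}

        catalogue = {}
        msg, sig = sign_status("/example/data/file.root", 104857600, "ad0234829205b9033196ba818f7a872b")
        verify_and_register(catalogue, msg, sig)
        print(catalogue)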

  14. An implementation of OGC WPS and BPEL4WS compliant dynamic geoprocessing services chain

    Science.gov (United States)

    Xie, Bin; Zhang, Denghui; Yu, Le; Zhang, Dengrong

    2008-12-01

    How to use web services quickly and efficiently is quite important in geospatial applications. A possible solution for sharing and integrating geospatial resources in an open web environment is to chain distributed and diversified geodata and geoprocessing by using web services. This paper presents an approach for chaining geoprocessing by employing the Web Processing Service (WPS) and the Business Process Execution Language for Web Services (BPEL4WS) under the service-oriented architecture (SOA) and Open Geospatial Consortium (OGC) standards. A workflow control model and a SQL Server-based register centre are used in a prototype system for chaining geoprocessing web services, which have undergone functional decomposition and been packaged using an extended WPS.
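
    The individual WPS invocations that such a chain orchestrates can be sketched with OWSLib; the service URL, process identifier and input names are placeholders, and the BPEL4WS orchestration layer itself is not shown.

        # Sketch of invoking a single OGC WPS process with OWSLib; in the paper's
        # architecture a BPEL engine chains several such Execute requests.
        from owslib.wps import WebProcessingService, ComplexDataInput, monitorExecution

        wps = WebProcessingService("https://geoprocessing.example.org/wps")
        print([p.identifier for p in wps.processes])     # processes advertised by the service

        # Execute one (hypothetical) buffering process.
        features = ComplexDataInput("https://data.example.org/roads.gml")
        execution = wps.execute("demo:Buffer",
                                inputs=[("features", features), ("distance", "100")])
        monitorExecution(execution)                      # poll until the run completes
        print(execution.status)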

  15. AGWA: The Automated Geospatial Watershed Assessment Tool

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment Tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  16. Users Manual for the Geospatial Stream Flow Model (GeoSFM)

    Science.gov (United States)

    Artan, Guleid A.; Asante, Kwabena; Smith, Jodie; Pervez, Md Shahriar; Entenmann, Debbie; Verdin, James P.; Rowland, James

    2008-01-01

    The monitoring of wide-area hydrologic events requires the manipulation of large amounts of geospatial and time series data into concise information products that characterize the location and magnitude of the event. To perform these manipulations, scientists at the U.S. Geological Survey Center for Earth Resources Observation and Science (EROS), with the cooperation of the U.S. Agency for International Development, Office of Foreign Disaster Assistance (USAID/OFDA), have implemented a hydrologic modeling system. The system includes a data assimilation component to generate data for a Geospatial Stream Flow Model (GeoSFM) that can be run operationally to identify and map wide-area streamflow anomalies. GeoSFM integrates a geographical information system (GIS) for geospatial preprocessing and postprocessing tasks and hydrologic modeling routines implemented as dynamically linked libraries (DLLs) for time series manipulations. Model results include maps depicting the status of streamflow and soil water conditions. This Users Manual provides step-by-step instructions for running the model and for downloading and processing the input data required for initial model parameterization and daily operation.

  17. A catalogue of crude oil and oil product properties, 1990

    International Nuclear Information System (INIS)

    Bobra, M.A.; Callaghan, S.

    1990-09-01

    This catalogue is a compilation of available data on crude oils and petroleum products. The emphasis of the catalogue is upon oils which could potentially impact Canada's environment. Other oils which are unlikely to be of direct Canadian concern are also included because they have been well characterized and used in relevant studies. The properties listed for each oil are those which will provide an indication of a spilled oil's environmental behaviour and effects. The properties on which data is provided include API gravity, density, viscosity, interfacial tension, pour point, flash point, vapor pressure, volatility and component distribution, emulsion formation tendency and stability, weathering, dispersability, major hydrocarbon groups, aqueous solubility, toxicity, sulfur content, fire point, and wax content. Most of the chemical-physical properties listed in this catalogue were measured using standard tests. For certain properties, data are given at different temperatures and for different degrees of oil weathering. An oil's degree of weathering is expressed as the volume or weight percent evaporated from the fresh oil. Weathered oils used for testing were artificially weathered by gas stripping following the method of Mackay and Stiver. 109 refs

  18. A catalogue of crude oil and oil product properties, 1992

    International Nuclear Information System (INIS)

    Whiticar, S.; Bobra, M.; Liuzzo, P.; Callaghan, S.; Fingas, M.; Jokuty, P.; Ackerman, F.; Cao, J.

    1993-02-01

    This catalogue is a compilation of available data on crude oils and petroleum products. The emphasis of the catalogue is upon oils which could potentially impact Canada's environment. Other oils which are unlikely to be of direct Canadian concern are also included because they have been well characterized and used in relevant studies. The properties listed for each oil are those which will provide an indication of a spilled oil's environmental behaviour and effects. The properties on which data is provided include API gravity, density, viscosity, interfacial tension, pour point, flash point, vapor pressure, volatility and component distribution, emulsion formation tendency and stability, weathering, dispersability, major hydrocarbon groups, aqueous solubility, toxicity, sulfur content, fire point, and wax content. Most of the chemical-physical properties listed in this catalogue were measured using standard tests. For certain properties, data are given at different temperatures and for different degrees of oil weathering. An oil's degree of weathering is expressed as the volume or weight percent evaporated from the fresh oil. Weathered oils used for testing were artificially weathered by gas stripping following the method of Mackay and Stiver. 140 refs

  19. VizieR Online Data Catalog: WATCH Solar X-Ray Burst Catalogue (Crosby+ 1998)

    Science.gov (United States)

    Crosby, N.; Lund, N.; Vilmer, N.; Sunyaev, R.

    1998-01-01

    Catalogue containing solar X-ray bursts measured by the Danish Wide Angle Telescope for Cosmic Hard X-Rays (WATCH) experiment aboard the Russian satellite GRANAT in the deca-keV energy range. Table 1 lists the periods during which solar observations with WATCH are available (WATCH ON-TIME) and where the bursts listed in the catalogue have been observed. (2 data files).

  20. Prototype-based analysis of GAMA galaxy catalogue data

    NARCIS (Netherlands)

    Nolte, A.; Wang, L.; Biehl, M; Verleysen, Michel

    2018-01-01

    We present a prototype-based machine learning analysis of labeled galaxy catalogue data containing parameters from the Galaxy and Mass Assembly (GAMA) survey. Using both an unsupervised and supervised method, the Self-Organizing Map and Generalized Relevance Matrix Learning Vector Quantization, we

  1. Procedures and challenges of retrospective catalogue conversion in ...

    African Journals Online (AJOL)

    The study recommended that the management of the universities should provide stand-by electricity generators and upgrade Internet network services, among other things, in the two university libraries for effective and efficient service delivery. Key words: Catalogue, Libraries Procedures, Conversion, Universities ...

  2. AN INTEROPERABLE ARCHITECTURE FOR AIR POLLUTION EARLY WARNING SYSTEM BASED ON SENSOR WEB

    Directory of Open Access Journals (Sweden)

    F. Samadzadegan

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real-time and obtaining valuable information from them are challenging issues in these systems from a technical and scientific point of view. The ever-increasing population growth in urban areas has caused certain problems in developing countries, which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality is to consider real-time, up-to-date air quality information gathered by spatially distributed sensors in mega cities, using sensor web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionalities of a geospatial information system as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real-time. It is possible to overcome interoperability challenges by using a standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed for discovering emergency situations, and if necessary air quality reports are sent to the authorities. This research

  3. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    Science.gov (United States)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real-time and obtaining valuable information from them are challenging issues in these systems from a technical and scientific point of view. The ever-increasing population growth in urban areas has caused certain problems in developing countries, which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality is to consider real-time, up-to-date air quality information gathered by spatially distributed sensors in mega cities, using sensor web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionalities of a geospatial information system as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real-time. It is possible to overcome interoperability challenges by using a standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed for discovering emergency situations, and if necessary air quality reports are sent to the authorities. This research proposed an
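
    As an illustration of the SWE data-access step, the sketch below issues a key-value-pair GetObservation request to a hypothetical OGC Sensor Observation Service (SOS) endpoint; the URL, offering, observed property and JSON response format are placeholders.

        # Sketch of pulling air-quality observations from an OGC SOS endpoint with a
        # plain KVP GetObservation request. All parameter values are placeholders.
        import requests

        SOS_URL = "https://airquality.example.org/sos"

        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": "urn:example:offering:pm10",
            "observedProperty": "http://example.org/phenomena/PM10",
            "responseFormat": "application/json",
        }

        resp = requests.get(SOS_URL, params=params, timeout=30)
        resp.raise_for_status()
        observations = resp.json().get("observations", [])
        print(len(observations), "observations retrieved")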

  4. USGS Geospatial Fabric and Geo Data Portal for Continental Scale Hydrology Simulations

    Science.gov (United States)

    Sampson, K. M.; Newman, A. J.; Blodgett, D. L.; Viger, R.; Hay, L.; Clark, M. P.

    2013-12-01

    This presentation describes use of United States Geological Survey (USGS) data products and server-based resources for continental-scale hydrologic simulations. The USGS Modeling of Watershed Systems (MoWS) group provides a consistent national geospatial fabric built on NHDPlus. They have defined more than 100,000 hydrologic response units (HRUs) over the continental United States based on points of interest (POIs) and split into left and right bank based on the corresponding stream segment. Geophysical attributes are calculated for each HRU that can be used to define parameters in hydrologic and land-surface models. The Geo Data Portal (GDP) project at the USGS Center for Integrated Data Analytics (CIDA) provides access to downscaled climate datasets and processing services via web-interface and python modules for creating forcing datasets for any polygon (such as an HRU). These resources greatly reduce the labor required for creating model-ready data in-house, contributing to efficient and effective modeling applications. We will present an application of this USGS cyber-infrastructure for assessments of impacts of climate change on hydrology over the continental United States.
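
    A minimal sketch of how such server-based processing resources can be discovered programmatically with OWSLib is shown below; the Geo Data Portal WPS URL is an assumption that should be verified against the current USGS documentation.

    ```python
    from owslib.wps import WebProcessingService

    # Placeholder URL for the USGS Geo Data Portal WPS; the live endpoint may differ.
    GDP_WPS_URL = "https://cida.usgs.gov/gdp/process/WebProcessingService"

    # Fetch the capabilities document and list the advertised processes.
    wps = WebProcessingService(GDP_WPS_URL)
    print(wps.identification.title)
    for process in wps.processes:
        print(process.identifier, "-", process.title)
    ```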

  5. Identification of stars and digital version of the catalogue of 1958 by Brodskaya and Shajn

    Science.gov (United States)

    Gorbunov, M. A.; Shlyapnikov, A. A.

    2017-12-01

    The following topics are considered: the identification of objects on search maps, the determination of their coordinates at the epoch 2000, and the conversion of the published version of the 1958 catalogue by Brodskaya and Shajn into a machine-readable format. Statistics for the photometric and spectral data from the original catalogue are presented. A digital version of the catalogue is described, as well as its presentation in HTML, VOTable and AJS formats and the basic principles of working with it in the Aladin Sky Atlas, an interactive application of the International Virtual Observatory.

  6. A study on state of Geospatial courses in Indian Universities

    Science.gov (United States)

    Shekhar, S.

    2014-12-01

    Today the world is dominated by three technologies: nanotechnology, biotechnology and geospatial technology. This creates a huge demand for experts in each field, both to disseminate knowledge and to carry out innovative research. Therefore, the prime need is to train the existing fraternity to gain up-to-date knowledge of these technologies and to impart it to the student community. Geospatial technology faces some problems that the other two do not because of its interdisciplinary, multi-disciplinary nature. It attracts students and mid-career professionals from various disciplines including physics, computer science, engineering, geography, geology, agriculture, forestry, town planning and so on. Hence there is always competition to grab and stabilize their position. Students taking a Master's degree in geospatial science face two types of problem. The first is the lack of a unique identity in the academic field: they are neither exempted from the National Eligibility Test for lectureship nor given the opportunity to take that exam in geospatial science. The second is differential treatment by the industrial world: students are either given low-grade jobs or poorly paid for their work. Thus, the future of this course in the universities and its recognition in the academic and industrial worlds is a serious issue. Universities should make this course more job-oriented in consultation with industry, and industry should come forward to share its demands and requirements with the universities so that the necessary curriculum changes can be made to meet industrial requirements.

  7. Maintenir la continuité des collections à l'heure d'Internet : du catalogue de vente au site web de maison de vente

    OpenAIRE

    Jacquet , Françoise

    2014-01-01

    International audience; The Bibliothèque nationale de France has preserved for centuries an important collection of sale catalogues in paper form. With the arrival of the Internet, however, documentary data in the art field are becoming dematerialized. Today the websites of auction houses publish online the results of recent or archived sales, and some sales are now announced only on the Internet. This information, which is not to be found in the ca...

  8. RESEARCH AND PRACTICAL TRENDS IN GEOSPATIAL SCIENCES

    Directory of Open Access Journals (Sweden)

    A. P. Karpik

    2016-06-01

    Full Text Available In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and the rapid evolution of geospatial science. Almost all professional communities have been affected. Communities are embracing digital techniques, modern equipment, software and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of the trends that will be in great demand in 3-7 years. This paper reviews the current scientific and practical activities of non-governmental international organizations such as the International Federation of Surveyors, the International Cartographic Association, and the International Society for Photogrammetry and Remote Sensing; analyses and groups the most relevant topics brought up at their scientific events; forecasts the most probable research and practical trends in the geospatial sciences; and outlines the leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation and joint implementation projects.

  9. Multiband Study of Radio Sources of the RCR Catalogue with Virtual Observatory Tools

    Directory of Open Access Journals (Sweden)

    Zhelenkova O. P.

    2012-09-01

    Full Text Available We present early results of our multiband study of the RATAN Cold Revised (RCR) catalogue obtained from seven cycles of the “Cold” survey carried out with the RATAN-600 radio telescope at 7.6 cm in 1980-1999, at the declination of the SS 433 source. We used the 2MASS and LAS UKIDSS infrared surveys, the DSS-II and SDSS DR7 optical surveys, as well as the USNO-B1 and GSC-II catalogues, and the VLSS, TXS, NVSS, FIRST and GB6 radio surveys, to accumulate information about the sources. For radio sources that have no detectable optical candidate in optical or infrared catalogues, we additionally looked through images in several bands from the SDSS, LAS UKIDSS, DPOSS and 2MASS surveys and also used co-added frames in different bands. We reliably identified 76% of the radio sources of the RCR catalogue. We used the ALADIN and SAOImage DS9 scripting capabilities, the interoperability services of ALADIN and TOPCAT, and also other Virtual Observatory (VO) tools and resources, such as CASJobs, NED, VizieR, and WSA, for effective data access, visualization and analysis. Without VO tools it would have been problematic to perform our study.
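
    For readers unfamiliar with the VO tooling mentioned above, the sketch below shows one programmatic way to perform a similar cross-identification step, querying the 2MASS point source catalogue around a radio position with astroquery; the coordinates are made up for illustration and are not taken from the RCR catalogue.

    ```python
    import astropy.units as u
    from astropy.coordinates import SkyCoord
    from astroquery.vizier import Vizier

    # Made-up radio source position, for illustration only.
    position = SkyCoord(ra=287.956 * u.deg, dec=4.983 * u.deg, frame="icrs")

    # "II/246" is the VizieR identifier of the 2MASS Point Source Catalogue.
    vizier = Vizier(columns=["RAJ2000", "DEJ2000", "Jmag", "Hmag", "Kmag"])
    result = vizier.query_region(position, radius=10 * u.arcsec, catalog="II/246")

    if len(result) > 0:
        print(result[0])
    else:
        print("no infrared counterpart within 10 arcsec")
    ```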

  10. Geospatial technology perspectives for mining vis-a-vis sustainable forest ecosystems

    Directory of Open Access Journals (Sweden)

    Goparaju Laxmi

    2017-06-01

    Full Text Available Forests, the backbone of biogeochemical cycles and life-supporting systems, are under severe pressure due to varied anthropogenic activities. Mining activities are among the major reasons for forest destruction, calling into question the survivability and sustainability of the flora and fauna existing in the area. Thus, monitoring and managing the impact of mining activities on natural resources at regular intervals is necessary to check the status of their depleted condition and to take up restoration and conservation measures. Geospatial technology provides means to identify the impact of different mining operations on forest ecosystems and helps in proposing initiatives for safeguarding the forest environment. In this context, the present study highlights the problems related to mining in forest ecosystems and elucidates how geospatial technology can be employed at various stages of mining activities to achieve a sustainable forest ecosystem. The study collates information from various sources and highlights the role of geospatial technology in mining industries and the reclamation process.

  11. Geospatial Data Management Platform for Urban Groundwater

    Science.gov (United States)

    Gaitanaru, D.; Priceputu, A.; Gogu, C. R.

    2012-04-01

    Due to the large number of civil work projects and research studies, large quantities of geo-data are produced for urban environments. These data are often redundant and are spread across different institutions and private companies. Time-consuming operations such as data processing and information harmonisation are the main reasons why the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. The underground structures (subway lines, deep foundations, underground car parks, and others), the urban facility networks (sewer systems, water supply networks, heating conduits, etc.), the drainage systems, the surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. However, because these activities provide a large quantity of data, aquifer modelling and subsequent behaviour prediction can be carried out using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has now become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages such as GML, GeoSciML, WaterML, GWML, CityML). They allow large geospatial databases to be easily updated and shared over the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania) an integrated platform for groundwater geospatial data management is being developed under the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis

  12. VizieR Online Data Catalog: Catalogue of Galactic Planetary Nebulae (Kohoutek, 2001)

    Science.gov (United States)

    Kohoutek, L.

    2001-05-01

    The "Catalogue of Galactic Planetary Nebulae (Version 2000)" appears in Abhandlungen aus der Hamburger Sternwarte, Band XII in the year 2001. It is a continuation of CGPN(1967) and contains 1510 objects classified as galactic PNe up to the end of 1999. The lists of possible pre-PNe and possible post-PNe are also given. The catalogue is restricted only to the data belonging to the location and identification of the objects. It gives identification charts of PNe discovered since 1965 (published in the supplements to CGPN) and those charts of objects discovered earlier, which have wrong or uncertain identification. The question "what is a planetary nebula" is discussed and the typical values of PNe and of their central stars are summarized. Short statistics about the discoveries of PNe are given. The catalogue is also available in the Centre de Donnees, Strasbourg and at Hamburg Observatory via internet. (15 data files).

  13. Solar Maps | Geospatial Data Science | NREL

    Science.gov (United States)

    These solar maps provide average daily total solar resource information for the United States; state-level solar resource maps are available from the NREL Geospatial Data Science team.

  14. Increasing the value of geospatial informatics with open approaches for Big Data

    Science.gov (United States)

    Percivall, G.; Bermudez, L. E.

    2017-12-01

    Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in Testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases: Collection and Ingest (remote-sensed data processing; data stream processing); Prepare and Structure (SQL and NoSQL databases; data linking; feature identification); Analytics and Visualization (spatial-temporal analytics; machine learning; data exploration); Modeling and Prediction (integrated environmental models; urban 4D models). Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in Testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following. Open cloud computing: avoid vendor lock-in through API interoperability and application portability. Open source extensions: implement geospatial data representations in projects from Apache, LocationTech, and OSGeo; investigate parallelization strategies for N-dimensional spatial data. Geospatial data representations: schemas to improve processing and analysis using geospatial concepts (features, coverages, DGGS); use geospatial encodings such as NetCDF and GeoPackage. Big linked geodata: use linked data methods scaled to big geodata. Analysis-ready data: support "download as last resort" and "analytics as a service"; promote elements common to "datacubes."
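
    Since the White Paper points to encodings such as NetCDF for analysis-ready gridded data, the minimal sketch below (with a hypothetical file and variable name) shows how such a coverage can be subset in space and time with xarray before analysis.

    ```python
    import xarray as xr

    # Hypothetical file and variable names; any CF-style gridded NetCDF would do.
    ds = xr.open_dataset("precipitation.nc")

    # Subset the coverage to a time window and a bounding box, then summarize.
    subset = ds["precip"].sel(
        time=slice("2016-07-01", "2016-08-31"),
        lat=slice(35.0, 40.0),
        lon=slice(-125.0, -120.0),
    )
    print(float(subset.mean()))
    ```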

  15. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    Directory of Open Access Journals (Sweden)

    Shaoming Pan

    Full Text Available Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.

  16. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    Science.gov (United States)

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
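
    The two records above describe the placement idea only at a high level. The sketch below is a deliberately simplified illustration of that idea, not the paper's exact heuristic: co-access counts are mined from session logs, and frequently co-accessed images are pushed onto different storage nodes so they can be read in parallel.

    ```python
    from collections import defaultdict
    from itertools import combinations

    def access_correlation_matrix(sessions):
        """Count how often two images are requested in the same session.
        `sessions` is a list of lists of image ids parsed from the access log."""
        corr = defaultdict(int)
        for ids in sessions:
            for a, b in combinations(sorted(set(ids)), 2):
                corr[(a, b)] += 1
        return corr

    def assign_to_nodes(images, corr, n_nodes):
        """Greedy heuristic: place each image on the node where it is least
        correlated with the images already stored there, so frequently
        co-accessed images land on different nodes and can be fetched in parallel."""
        nodes = [set() for _ in range(n_nodes)]
        for img in images:
            def penalty(node):
                return sum(corr.get((min(img, o), max(img, o)), 0) for o in node)
            best = min(range(n_nodes), key=lambda k: (penalty(nodes[k]), len(nodes[k])))
            nodes[best].add(img)
        return nodes

    # Toy usage with a tiny synthetic access log.
    sessions = [["a", "b"], ["a", "b", "c"], ["c", "d"]]
    corr = access_correlation_matrix(sessions)
    print(assign_to_nodes(["a", "b", "c", "d"], corr, n_nodes=2))
    ```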

  17. The Academic SDI—Towards understanding spatial data infrastructures for research and education

    CSIR Research Space (South Africa)

    Coetzee, S

    2017-05-01

    Full Text Available facilitating and coordinating the exchange of geospatial data and services between stakeholders from different levels in the spatial data community. Universities and other research organisations typically have well-established libraries and digital catalogues...

  18. Remote Sensing Technologies and Geospatial Modelling Hierarchy for Smart City Support

    Science.gov (United States)

    Popov, M.; Fedorovsky, O.; Stankevich, S.; Filipovich, V.; Khyzhniak, A.; Piestova, I.; Lubskyi, M.; Svideniuk, M.

    2017-12-01

    An approach to implementing remote sensing technologies and geospatial modelling for smart city support is presented. The hierarchical structure and basic components of the smart city information support subsystem are considered. Some already available and useful practical developments are described, including city land-use planning, urban vegetation analysis, thermal condition forecasting, geohazard detection and flooding risk assessment. A remote sensing data fusion approach for comprehensive geospatial analysis is discussed. Long-term city development forecasting with the Forrester–Graham system dynamics model is provided for the Kiev urban area.

  19. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    Science.gov (United States)

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016

  20. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    Science.gov (United States)

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.
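
    To make the "black box" publishing idea above more concrete, the sketch below wraps a trivial stand-in model as a small HTTP service with Flask. The paper itself targets standard Web services for interoperation, so this is only an illustration of the wrapping pattern under assumed names, not the authors' implementation.

    ```python
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def ndvi(red, nir):
        """Stand-in for a wrapped RS model: a trivial NDVI computation."""
        return (nir - red) / (nir + red)

    @app.route("/models/ndvi", methods=["GET"])
    def run_model():
        # The model is treated as a black box behind a simple web interface.
        red = float(request.args["red"])
        nir = float(request.args["nir"])
        return jsonify({"ndvi": ndvi(red, nir)})

    if __name__ == "__main__":
        # e.g. GET http://localhost:8080/models/ndvi?red=0.2&nir=0.6
        app.run(port=8080)
    ```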

  1. Partially populated catalogue of measured properties of field sections.

    Science.gov (United States)

    2014-10-01

    This catalogue documents the construction, monitoring, and mixture information of 11 test sections: four in SH 15 in the north Amarillo, three in US 62 in Childress, and four in Loop 820 in Fort Worth.

  2. Halo substructure in the SDSS-Gaia catalogue: streams and clumps

    Science.gov (United States)

    Myeong, G. C.; Evans, N. W.; Belokurov, V.; Amorisco, N. C.; Koposov, S. E.

    2018-04-01

    We use the Sloan Digital Sky Survey (SDSS)-Gaia Catalogue to identify six new pieces of halo substructure. SDSS-Gaia is an astrometric catalogue that exploits SDSS data release 9 to provide first-epoch photometry for objects in the Gaia source catalogue. We use a version of the catalogue containing 245 316 stars with all phase-space coordinates within a heliocentric distance of ~10 kpc. We devise a method to assess the significance of halo substructures based on their clustering in velocity space. The two most substantial structures are multiple wraps of a stream which has undergone considerable phase mixing (S1, with 94 members) and a kinematically cold stream (S2, with 61 members). The member stars of S1 have a median position of (X, Y, Z) = (8.12, -0.22, 2.75) kpc and a median metallicity of [Fe/H] = -1.78. The stars of S2 have median coordinates (X, Y, Z) = (8.66, 0.30, 0.77) kpc and a median metallicity of [Fe/H] = -1.91. They lie in velocity space close to some of the stars in the stream reported by Helmi et al. By modelling, we estimate that both structures had progenitors with virial masses of ≈10^10 M⊙ and infall times ≳9 Gyr ago. Using abundance matching, these correspond to stellar masses between 10^6 and 10^7 M⊙. These are somewhat larger than the masses inferred through the mass-metallicity relation, by factors of 5 to 15. Additionally, we identify two further substructures (S3 and S4, with 55 and 40 members) and two clusters or moving groups (C1 and C2, with 24 and 12 members). In all six cases, clustering in kinematics is found to correspond to clustering in both configuration space and metallicity, adding credence to the reliability of our detections.
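
    The clustering step is described only at a high level here. As a purely illustrative stand-in (not the authors' significance test), the sketch below groups stars by their velocity components with DBSCAN from scikit-learn, using made-up data in which one tight clump is hidden in a smooth halo background.

    ```python
    import numpy as np
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(0)

    # Made-up velocities (km/s): a smooth halo background plus one tight clump.
    background = rng.normal(0.0, 120.0, size=(2000, 3))
    clump = rng.normal([150.0, -60.0, 40.0], 8.0, size=(60, 3))
    velocities = np.vstack([background, clump])

    # DBSCAN labels dense groups in (vx, vy, vz); the label -1 marks unclustered stars.
    labels = DBSCAN(eps=20.0, min_samples=10).fit_predict(velocities)
    for k in sorted(set(labels) - {-1}):
        print(f"group {k}: {np.sum(labels == k)} stars")
    ```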

  3. Provisional host catalogue of Fig wasps (Hymenoptera, Chalcidoidea)

    NARCIS (Netherlands)

    Wiebes, J.T.

    1966-01-01

    INTRODUCTION In this catalogue — entitled "provisional" because our knowledge of the subject is still so evidently incomplete — all species of Ficus mentioned as hosts of fig wasps, are listed with the Hymenoptera Chalcidoidea reared from their receptacles. The names used for the Agaonidae are in

  4. White dwarf-main sequence binaries from LAMOST: the DR5 catalogue

    Science.gov (United States)

    Ren, J.-J.; Rebassa-Mansergas, A.; Parsons, S. G.; Liu, X.-W.; Luo, A.-L.; Kong, X.; Zhang, H.-T.

    2018-03-01

    We present the data release (DR) 5 catalogue of white dwarf-main sequence (WDMS) binaries from the Large Area Multi-Object fiber Spectroscopic Telescope (LAMOST). The catalogue contains 876 WDMS binaries, of which 757 are additions to our previous LAMOST DR1 sample and 357 are systems that have not been published before. We also describe a LAMOST-dedicated survey that aims at obtaining spectra of photometrically-selected WDMS binaries from the Sloan Digital Sky Survey (SDSS) that are expected to contain cool white dwarfs and/or early type M dwarf companions. This is a population under-represented in previous SDSS WDMS binary catalogues. We determine the stellar parameters (white dwarf effective temperatures, surface gravities and masses, and M dwarf spectral types) of the LAMOST DR5 WDMS binaries and make use of the parameter distributions to analyse the properties of the sample. We find that, despite our efforts, systems containing cool white dwarfs remain under-represented. Moreover, we make use of LAMOST DR5 and SDSS DR14 (when available) spectra to measure the Na I λλ 8183.27, 8194.81 absorption doublet and/or Hα emission radial velocities of our systems. This allows identifying 128 binaries displaying significant radial velocity variations, 76 of which are new. Finally, we cross-match our catalogue with the Catalina Surveys and identify 57 systems displaying light curve variations. These include 16 eclipsing systems, two of which are new, and nine binaries that are new eclipsing candidates. We calculate periodograms from the photometric data and measure (estimate) the orbital periods of 30 (15) WDMS binaries.

  5. IMPLEMENTATION OF VGI-BASED GEOPORTAL FOR EMPOWERING CITIZEN’S GEOSPATIAL OBSERVATORIES RELATED TO URBAN DISASTER MANAGEMENT

    Directory of Open Access Journals (Sweden)

    S. Lee

    2016-06-01

    Full Text Available Volunteered geospatial information (VGI) can be an efficient and cost-effective method for generating and sharing large amounts of disaster-related geospatial data. National mapping organizations, which have traditionally played the role of major geospatial data collectors, have been moving towards public-participation data collection methods. Because VGI can encourage public participation and empower citizens, mapping agencies can form partnerships with members of the VGI community to help provide well-structured geospatial data. In order for the public semantics, datasets and action model of the public-participation GeoPortal to be effectively understood and shared, the implemented VGI-GeoPortal is designed on the basis of ISO 19154, ISO 19101 and the OGC Reference Model. A proof of concept of the VGI-GeoPortal has been implemented for an urban flooding use case in the Republic of Korea to collect from the public, and analyse, disaster-related geospatial data, including high disaster-potential information such as the locations of poorly draining sewers, early signs of landslides, the flooding vulnerability of urban structures, and so on.

  6. ATLAS EventIndex Data Collection Supervisor and Web Interface

    CERN Document Server

    Garcia Montoro, Carlos; The ATLAS collaboration; Sanchez, Javier

    2016-01-01

    The EventIndex project consists in the development and deployment of a complete catalogue of events for the ATLAS experiment [1][2] at the LHC accelerator at CERN. In 2015 the ATLAS experiment has produced 12 billion real events in 1 million files, and 5 billion simulated events in 8 million files. The ATLAS EventIndex is running in production since mid-2015, reliably collecting information worldwide about all produced events and storing them in a central Hadoop infrastructure. A subset of this information is copied to an Oracle relational database. This paper presents two components of the ATLAS EventIndex [3]: its data collection supervisor and its web interface partner.

  7. ATLAS EventIndex Data Collection Supervisor and Web Interface

    CERN Document Server

    Garcia Montoro, Carlos; The ATLAS collaboration

    2016-01-01

    The EventIndex project consists in the development and deployment of a complete catalogue of events for the ATLAS experiment at the LHC accelerator at CERN. In 2015 the ATLAS experiment has produced 12 billion real events in 1 million files, and 5 billion simulated events in 8 million files. The ATLAS EventIndex is running in production since mid- 2015, reliably collecting information worldwide about all produced events and storing them in a central Hadoop infrastructure. A subset of this information is copied to an Oracle relational database. These slides present two components of the ATLAS EventIndex: its data collection supervisor and its web interface partner.

  8. Adoption of Geospatial Systems towards evolving Sustainable Himalayan Mountain Development

    Science.gov (United States)

    Murthy, M. S. R.; Bajracharya, B.; Pradhan, S.; Shestra, B.; Bajracharya, R.; Shakya, K.; Wesselmann, S.; Ali, M.; Bajracharya, S.; Pradhan, S.

    2014-11-01

    The natural-resource dependence of mountain communities, rapid social and developmental changes, disaster proneness and climate change are conceived as the critical factors regulating sustainable Himalayan mountain development. The Himalayan region, with its typical geographic settings and diverse physical and cultural character, presents a formidable challenge for collecting and managing data and information and for understanding varied socio-ecological settings. Recent advances in earth observation, near real-time data and in-situ measurements, in combination with information and communication technology, have transformed the way we collect, process and generate information and how we use such information for societal benefit. Glacier dynamics, land cover changes, disaster risk reduction systems, food security and ecosystem conservation are a few thematic areas where geospatial information and knowledge have significantly contributed to informed decision-making systems over the region. The emergence and adoption of near real-time systems, unmanned aerial vehicles (UAV), broad-scale citizen science (crowd-sourcing), mobile services and mapping, and cloud computing have paved the way towards developing automated environmental monitoring systems, enhanced scientific understanding of geophysical and biophysical processes, coupled management of socio-ecological systems and community-based adaptation models tailored to the mountain-specific environment. There are differentiated capacities among the ICIMOD regional member countries with regard to the utilization of earth observation and geospatial technologies. The region can greatly benefit from a coordinated and collaborative approach to capture the opportunities offered by earth observation and geospatial technologies. Regional-level data sharing, knowledge exchange, a Himalayan GEO supporting geospatial platforms, spatial data infrastructure, and unique region-specific satellite systems to address trans-boundary challenges would go a long way in

  9. Transportation of Large Wind Components: A Review of Existing Geospatial Data

    Energy Technology Data Exchange (ETDEWEB)

    Mooney, Meghan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maclaurin, Galen [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    This report features the geospatial data component of a larger project evaluating logistical and infrastructure requirements for transporting oversized and overweight (OSOW) wind components. The goal of the larger project was to assess the status and opportunities for improving the infrastructure and regulatory practices necessary to transport wind turbine towers, blades, and nacelles from current and potential manufacturing facilities to end-use markets. The purpose of this report is to summarize existing geospatial data on wind component transportation infrastructure and to provide a data gap analysis, identifying areas for further analysis and data collection.

  10. QBCov: A Linked Data interface for Discrete Global Grid Systems, a new approach to delivering coverage data on the web

    Science.gov (United States)

    Zhang, Z.; Toyer, S.; Brizhinev, D.; Ledger, M.; Taylor, K.; Purss, M. B. J.

    2016-12-01

    We are witnessing a rapid proliferation of geoscientific and geospatial data from an increasing variety of sensors and sensor networks. This data presents great opportunities to resolve cross-disciplinary problems. However, working with it often requires an understanding of file formats and protocols seldom used outside of scientific computing, potentially limiting the data's value to other disciplines. In this paper, we present a new approach to serving satellite coverage data on the web, which improves ease-of-access using the principles of linked data. Linked data adapts the concepts and protocols of the human-readable web to machine-readable data; the number of developers familiar with web technologies makes linked data a natural choice for bringing coverages to a wider audience. Our approach to using linked data also makes it possible to efficiently service high-level SPARQL queries: for example, "Retrieve all Landsat ETM+ observations of San Francisco between July and August 2016" can easily be encoded in a single query. We validate the new approach, which we call QBCov, with a reference implementation of the entire stack, including a simple web-based client for interacting with Landsat observations. In addition to demonstrating the utility of linked data for publishing coverages, we investigate the heretofore unexplored relationship between Discrete Global Grid Systems (DGGS) and linked data. Our conclusions are informed by the aforementioned reference implementation of QBCov, which is backed by a hierarchical file format designed around the rHEALPix DGGS. Not only does the choice of a DGGS-based representation provide an efficient mechanism for accessing large coverages at multiple scales, but the ability of DGGS to produce persistent, unique identifiers for spatial regions is especially valuable in a linked data context. This suggests that DGGS has an important role to play in creating sustainable and scalable linked data infrastructures. QBCov is being
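
    The abstract's example query can be sketched in SPARQL issued from Python with SPARQLWrapper, as below. The endpoint URL and the vocabulary (prefixes and property names) are hypothetical placeholders chosen only to illustrate the style of query QBCov is meant to serve; they are not the project's actual terms.

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON

    # Hypothetical linked-data endpoint for illustration.
    sparql = SPARQLWrapper("http://example.org/qbcov/sparql")
    sparql.setQuery("""
    PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
    PREFIX qb:  <http://purl.org/linked-data/cube#>
    PREFIX ex:  <http://example.org/def/coverage#>
    SELECT ?obs ?value WHERE {
      ?obs a qb:Observation ;
           ex:platform "Landsat ETM+" ;
           ex:cell ?cell ;
           ex:time ?t ;
           ex:value ?value .
      FILTER (?t >= "2016-07-01"^^xsd:date && ?t <= "2016-08-31"^^xsd:date)
    }
    LIMIT 10
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    print(len(results["results"]["bindings"]), "observations returned")
    ```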

  11. A catalogue of the genera of the Vespidae (Hymenoptera)

    NARCIS (Netherlands)

    Vecht, van der J.; Carpenter, J.M.

    1990-01-01

    A comprehensive generic catalogue of the Vespidae is presented. New nomenclatural changes include synonymy of Alastoroides Saussure, 1856, with Paralastor Saussure 1856; Araucodynerus Willink, 1968, with Hypodynerus Saussure 1855; and Paranortonia Bequaert, 1940, with Parazumia Saussure, 1855.

  12. GEOSPATIAL DATA PROCESSING FOR 3D CITY MODEL GENERATION, MANAGEMENT AND VISUALIZATION

    Directory of Open Access Journals (Sweden)

    I. Toschi

    2017-05-01

    Full Text Available Recent developments of 3D technologies and tools have increased availability and relevance of 3D data (from 3D points to complete city models) in the geospatial and geo-information domains. Nevertheless, the potential of 3D data is still underexploited and mainly confined to visualization purposes. Therefore, the major challenge today is to create automatic procedures that make best use of available technologies and data for the benefits and needs of public administrations (PA) and national mapping agencies (NMA) involved in “smart city” applications. The paper aims to demonstrate a step forward in this process by presenting the results of the SENECA project (Smart and SustaiNablE City from Above – http://seneca.fbk.eu). State-of-the-art processing solutions are investigated in order to (i) efficiently exploit the photogrammetric workflow (aerial triangulation and dense image matching), (ii) derive topologically and geometrically accurate 3D geo-objects (i.e. building models) at various levels of detail and (iii) link geometries with non-spatial information within a 3D geo-database management system accessible via web-based client. The developed methodology is tested on two case studies, i.e. the cities of Trento (Italy) and Graz (Austria). Both spatial (i.e. nadir and oblique imagery) and non-spatial (i.e. cadastral information and building energy consumptions) data are collected and used as input for the project workflow, starting from 3D geometry capture and modelling in urban scenarios to geometry enrichment and management within a dedicated webGIS platform.

  13. Geospatial Data Processing for 3d City Model Generation, Management and Visualization

    Science.gov (United States)

    Toschi, I.; Nocerino, E.; Remondino, F.; Revolti, A.; Soria, G.; Piffer, S.

    2017-05-01

    Recent developments of 3D technologies and tools have increased availability and relevance of 3D data (from 3D points to complete city models) in the geospatial and geo-information domains. Nevertheless, the potential of 3D data is still underexploited and mainly confined to visualization purposes. Therefore, the major challenge today is to create automatic procedures that make best use of available technologies and data for the benefits and needs of public administrations (PA) and national mapping agencies (NMA) involved in "smart city" applications. The paper aims to demonstrate a step forward in this process by presenting the results of the SENECA project (Smart and SustaiNablE City from Above - http://seneca.fbk.eu). State-of-the-art processing solutions are investigated in order to (i) efficiently exploit the photogrammetric workflow (aerial triangulation and dense image matching), (ii) derive topologically and geometrically accurate 3D geo-objects (i.e. building models) at various levels of detail and (iii) link geometries with non-spatial information within a 3D geo-database management system accessible via web-based client. The developed methodology is tested on two case studies, i.e. the cities of Trento (Italy) and Graz (Austria). Both spatial (i.e. nadir and oblique imagery) and non-spatial (i.e. cadastral information and building energy consumptions) data are collected and used as input for the project workflow, starting from 3D geometry capture and modelling in urban scenarios to geometry enrichment and management within a dedicated webGIS platform.

  14. INTEGRATING GEOSPATIAL TECHNOLOGIES AND SECONDARY STUDENT PROJECTS: THE GEOSPATIAL SEMESTER

    Directory of Open Access Journals (Sweden)

    Bob Kolvoord

    2012-12-01

    Full Text Available The Geospatial Semester is a geographical education activity focused on students in their final year of secondary school in the United States, who acquire specific skills in geographic information systems (GIS), GPS and remote sensing. Through a project-based learning methodology, students are motivated and involved in carrying out research projects in which they analyse, and even propose solutions to, processes, problems or issues of a spatial nature. The project is coordinated by James Madison University and has been running for seven years in schools across the State of Virginia, involving more than 20 schools and 1,500 students. The university management of the Geospatial Semester not only ensures proper coaching, guidance and GIS training for school teachers, but has also established a system whereby students who pass this secondary school course gain recognition of certain credits from the university. Key words: geographic information system, teaching, geographic education, geospatial semester.

  15. NASA SensorWeb and OGC Standards for Disaster Management

    Science.gov (United States)

    Mandl, Dan

    2010-01-01

    I. Goal: Enable users to cost-effectively find and create customized data products to help manage disasters: a) on demand; b) with low-cost and non-specialized tools such as Google Earth and browsers; c) with access via an open network but with sufficient security. II. Use standards to interface the various sensors and resultant data: a) wrap sensors in Open Geospatial Consortium (OGC) standards; b) wrap data processing algorithms and servers with OGC standards; c) use standardized workflows to orchestrate and script the creation of these data products. III. Target the Web 2.0 mass market: a) make it simple and easy to use; b) leverage new capabilities and tools that are emerging; c) improve speed and responsiveness.

  16. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Full Text Available Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only due to the massive data volume but also due to the intrinsic complexity and high dimensions of the geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies were carried out to investigate adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce computing resource utilization (by 80% in our example) while delivering similar performance to a full-powered cluster; and (2) effectively handle spikes in the processing workload by automatically increasing the computing resources to ensure that processing is finished within an acceptable time. Such an auto-scaling approach provides a valuable reference for optimizing the performance of geospatial applications to address data- and computation-intensity challenges in GIScience in a more cost-efficient manner.
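
    The auto-scaling algorithms themselves are not reproduced in the abstract. As a purely illustrative sketch of the underlying idea, the snippet below sizes a cluster to the current geoprocessing backlog within fixed bounds; a real system would query the Hadoop resource manager and call the cloud provider's API rather than printing, and the thresholds shown are assumptions.

    ```python
    def decide_scale(pending_tasks, active_nodes, tasks_per_node=8,
                     min_nodes=2, max_nodes=64):
        """Very simplified scaling rule: size the cluster to the backlog,
        clamped to [min_nodes, max_nodes]. Returns the node delta."""
        wanted = max(min_nodes, min(max_nodes, -(-pending_tasks // tasks_per_node)))
        return wanted - active_nodes  # >0 means add nodes, <0 means remove nodes

    # Toy monitoring loop over a few observed (backlog, cluster size) states.
    for pending, active in [(120, 4), (120, 15), (8, 15)]:
        delta = decide_scale(pending, active)
        print(f"pending={pending} active={active} -> scale by {delta:+d}")
    ```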

  17. The AKARI IRC asteroid flux catalogue: updated diameters and albedos

    Science.gov (United States)

    Alí-Lagoa, V.; Müller, T. G.; Usui, F.; Hasegawa, S.

    2018-05-01

    The AKARI IRC all-sky survey provided more than twenty thousand thermal infrared observations of over five thousand asteroids. Diameters and albedos were obtained by fitting an empirically calibrated version of the standard thermal model to these data. After the publication of the flux catalogue in October 2016, our aim here is to present the AKARI IRC all-sky survey data and discuss valuable scientific applications in the field of small body physical properties studies. As an example, we update the catalogue of asteroid diameters and albedos based on AKARI using the near-Earth asteroid thermal model (NEATM). We fit the NEATM to derive asteroid diameters and, whenever possible, infrared beaming parameters. We fit groups of observations taken for the same object at different epochs of the survey separately, so we compute more than one diameter for approximately half of the catalogue. We obtained a total of 8097 diameters and albedos for 5170 asteroids, and we fitted the beaming parameter for almost two thousand of them. When it was not possible to fit the beaming parameter, we used a straight-line fit to our sample's beaming parameter-versus-phase angle plot to set the default value for each fit individually instead of using a single average value. Our diameters agree with stellar-occultation-based diameters well within the accuracy expected for the model. They also match the previous AKARI-based catalogue at phase angles lower than 50°, but we find a systematic deviation at higher phase angles, at which near-Earth and Mars-crossing asteroids were observed. The AKARI IRC all-sky survey is an essential source of information about asteroids, especially the large ones, since it provides observations at different observation geometries, rotational coverages and aspect angles. For example, by comparing in more detail a few asteroids for which dimensions were derived from occultations, we discuss how the multiple observations per object may already provide three
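
    For readers unfamiliar with radiometric diameters, the standard relation between absolute magnitude H, geometric albedo p_V and effective diameter D that is commonly used alongside thermal models such as the NEATM can be evaluated as in the sketch below; the example numbers are illustrative, not values from the AKARI catalogue.

    ```python
    from math import sqrt

    def diameter_km(H, p_v):
        """Standard conversion between absolute magnitude H, geometric albedo p_V
        and effective diameter D in km: D = 1329 / sqrt(p_V) * 10**(-H / 5)."""
        return 1329.0 / sqrt(p_v) * 10 ** (-H / 5.0)

    # Example: an H = 10 asteroid with p_V = 0.20 is roughly 30 km across.
    print(round(diameter_km(10.0, 0.20), 1))
    ```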

  18. A global catalogue of Ceres impact craters ≥ 1 km and preliminary analysis

    Science.gov (United States)

    Gou, Sheng; Yue, Zongyu; Di, Kaichang; Liu, Zhaoqin

    2018-03-01

    The orbital data products of Ceres, including the global LAMO image mosaic and the global HAMO DTM with resolutions of 35 m/pixel and 135 m/pixel respectively, are utilized in this research to create a global catalogue of impact craters with diameters ≥ 1 km, and their morphometric parameters are calculated. Statistics show: (1) there are 29,219 craters in the catalogue, and the craters have a variety of morphologies, e.g. polygonal craters, floor-fractured craters, complex craters with central peaks, etc.; (2) the smallest identifiable crater size is extended to 1 km, and the crater numbers have been updated when compared with the crater catalogue (D ≥ 20 km) released by the Dawn Science Team; (3) the d/D ratios for fresh simple craters, obviously degraded simple craters and polygonal simple craters are 0.11 ± 0.04, 0.05 ± 0.04 and 0.14 ± 0.02 respectively; (4) the d/D ratios for non-polygonal complex craters and polygonal complex craters are 0.08 ± 0.04 and 0.09 ± 0.03. The global crater catalogue created in this work can be further applied to many other scientific investigations, such as comparing d/D with other bodies, inferring subsurface properties, determining surface age, and estimating average erosion rates.

  19. The PMA Catalogue as a realization of the extragalactic reference system in optical and near infrared wavelengths

    Science.gov (United States)

    Akhmetov, Volodymyr S.; Fedorov, Peter N.; Velichko, Anna B.

    2018-04-01

    We combined the data from the Gaia DR1 and Two-Micron All Sky Survey (2MASS) catalogues in order to derive the absolute proper motions of more than 420 million stars distributed all over the sky, in a stellar magnitude range starting at 8 mag. For the 2MASS catalogue objects, a 2-dimensional median filter was used. The PMA system of proper motions has been obtained by direct link to 1.6 million extragalactic sources. A short analysis of the absolute proper motions of the PMA catalogue stars is presented in this work. From a comparison of these data with the same stars from the TGAS, UCAC4 and PPMXL catalogues, the equatorial components of the mutual rotation vectors of these coordinate systems are determined.

  20. Geospatial Database for Strata Objects Based on Land Administration Domain Model (ladm)

    Science.gov (United States)

    Nasorudin, N. N.; Hassan, M. I.; Zulkifli, N. A.; Rahman, A. Abdul

    2016-09-01

    Recently in our country, the construction of buildings has become more complex, and a strata objects database has become more important for registering the real world, as people now own and use multiple levels of space. Furthermore, strata titles are increasingly important and need to be well managed. LADM, also known as ISO 19152, is a standard model for land administration that allows integrated 2D and 3D representation of spatial units. The aim of this paper is to develop a strata objects database using LADM. The paper discusses the current 2D geospatial database and the need for a 3D geospatial database in the future. It also attempts to develop a strata objects database using a standard data model (LADM) and to analyse the developed strata objects database against the LADM data model. The current cadastre system in Malaysia, including strata titles, is discussed; the problems in the 2D geospatial database are listed; and the need for a 3D geospatial database in the future is also discussed. The processes for designing a strata objects database are conceptual, logical and physical database design. The strata objects database will allow us to find both non-spatial and spatial strata title information and thus show the location of each strata unit. This development of a strata objects database may help in handling strata titles and related information.
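
    As a minimal sketch of how a few core LADM (ISO 19152) classes could be mapped to relational tables for a strata-title register, the snippet below builds a tiny in-memory schema and links a party to a strata unit through a right. The column choices and sample values are illustrative assumptions, not the standard's full schema or the paper's design.

    ```python
    import sqlite3

    # Simplified tables named after core LADM classes: LA_Party, LA_BAUnit,
    # LA_SpatialUnit and LA_RRR (rights, restrictions, responsibilities).
    ddl = """
    CREATE TABLE la_party       (pid INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE la_baunit      (uid INTEGER PRIMARY KEY, unit_name TEXT);
    CREATE TABLE la_spatialunit (sid INTEGER PRIMARY KEY,
                                 uid INTEGER REFERENCES la_baunit(uid),
                                 level INTEGER, geometry_wkt TEXT);
    CREATE TABLE la_rrr         (rid INTEGER PRIMARY KEY,
                                 pid INTEGER REFERENCES la_party(pid),
                                 uid INTEGER REFERENCES la_baunit(uid),
                                 rrr_type TEXT);
    """
    con = sqlite3.connect(":memory:")
    con.executescript(ddl)
    con.execute("INSERT INTO la_party VALUES (1, 'Unit owner A')")
    con.execute("INSERT INTO la_baunit VALUES (10, 'Strata lot 12, level 3')")
    con.execute("INSERT INTO la_spatialunit VALUES (100, 10, 3, 'POLYGON((...))')")
    con.execute("INSERT INTO la_rrr VALUES (1000, 1, 10, 'ownership')")

    # Who owns which strata unit?
    rows = con.execute(
        "SELECT name, unit_name FROM la_party "
        "JOIN la_rrr USING (pid) JOIN la_baunit USING (uid)"
    ).fetchall()
    print(rows)
    ```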

  1. Sharing human-generated observations by integrating HMI and the Semantic Sensor Web.

    Science.gov (United States)

    Sigüenza, Alvaro; Díaz-Pardo, David; Bernat, Jesús; Vancea, Vasile; Blanco, José Luis; Conejero, David; Gómez, Luis Hernández

    2012-01-01

    Current "Internet of Things" concepts point to a future where connected objects gather meaningful information about their environment and share it with other objects and people. In particular, objects embedding Human Machine Interaction (HMI), such as mobile devices and, increasingly, connected vehicles, home appliances, urban interactive infrastructures, etc., may not only be conceived as sources of sensor information, but, through interaction with their users, they can also produce highly valuable context-aware human-generated observations. We believe that the great promise offered by combining and sharing all of the different sources of information available can be realized through the integration of HMI and Semantic Sensor Web technologies. This paper presents a technological framework that harmonizes two of the most influential HMI and Sensor Web initiatives: the W3C's Multimodal Architecture and Interfaces (MMI) and the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) with its semantic extension, respectively. Although the proposed framework is general enough to be applied in a variety of connected objects integrating HMI, a particular development is presented for a connected car scenario where drivers' observations about the traffic or their environment are shared across the Semantic Sensor Web. For implementation and evaluation purposes an on-board OSGi (Open Services Gateway Initiative) architecture was built, integrating several available HMI, Sensor Web and Semantic Web technologies. A technical performance test and a conceptual validation of the scenario with potential users are reported, with results suggesting the approach is sound.

  2. Multi-source Geospatial Data Analysis with Google Earth Engine

    Science.gov (United States)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. https://earthengine.google.org
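
    A minimal sketch of the Earth Engine Python API is shown below, building a median Landsat 8 composite around a point for one summer. It assumes credentials are already configured, and the collection identifier and point are illustrative assumptions that may need adjusting to the current catalog.

    ```python
    import ee

    ee.Initialize()  # assumes Earth Engine credentials are already configured

    # Illustrative point and collection id; verify against the current data catalog.
    point = ee.Geometry.Point(-122.42, 37.77)
    collection = (
        ee.ImageCollection("LANDSAT/LC08/C02/T1_TOA")
        .filterBounds(point)
        .filterDate("2016-06-01", "2016-09-01")
    )

    # Server-side reduction to a median composite; only metadata is pulled locally.
    composite = collection.median()
    print(composite.bandNames().getInfo())
    ```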

  3. The QuakeSim Project: Web Services for Managing Geophysical Data and Applications

    Science.gov (United States)

    Pierce, Marlon E.; Fox, Geoffrey C.; Aktas, Mehmet S.; Aydin, Galip; Gadgil, Harshawardhan; Qi, Zhigang; Sayar, Ahmet

    2008-04-01

    We describe our distributed systems research efforts to build the “cyberinfrastructure” components that constitute a geophysical Grid, or more accurately, a Grid of Grids. Service-oriented computing principles are used to build a distributed infrastructure of Web accessible components for accessing data and scientific applications. Our data services fall into two major categories: Archival, database-backed services based around Geographical Information System (GIS) standards from the Open Geospatial Consortium, and streaming services that can be used to filter and route real-time data sources such as Global Positioning System data streams. Execution support services include application execution management services and services for transferring remote files. These data and execution service families are bound together through metadata information and workflow services for service orchestration. Users may access the system through the QuakeSim scientific Web portal, which is built using a portlet component approach.

  4. Nebula observations. Catalogues and archive of photoplates

    Science.gov (United States)

    Shlyapnikov, A. A.; Smirnova, M. A.; Elizarova, N. V.

    2017-12-01

    A process of data systematization based on "Academician G.A. Shajn's Plan" for studying the Galaxy structure related to nebula observations is considered. The creation of digital versions of catalogues of observations and publications is described, as well as their presentation in HTML, VOTable and AJS formats and basic principles of work in the interactive application of International Virtual Observatory the Aladin Sky Atlas.

  5. Radioisotopes and radiopharmaceuticals catalogue

    International Nuclear Information System (INIS)

    2002-01-01

    The Chilean Nuclear Energy Commission (CCHEN) presents its 2002 radioisotopes and radiopharmaceuticals catalogue. It gives the physical characteristics of 9 different reactor-produced radioisotopes (Tc-99m, I-131, Sm-153, Ir-192, P-32, Na-24, K-42, Cu-64, Rb-86), 7 radiopharmaceuticals (MDP, DTPA, DMSA, Disida, Phytate, S-Colloid, Red Blood Cells In-Vivo, Red Blood Cells In-Vitro) and 4 labelled compounds (DMSA-Tc99m, DTPA-Tc99m, MIBG-I131, EDTMP-Sm153). In the near future the number of items will be increased with new reactor and cyclotron products. Our production system will be certified under ISO 9000 in March 2003. CCHEN is interested in being a national and international supplier of these products (RS)

  6. Technology catalogue. Second edition

    International Nuclear Information System (INIS)

    1995-04-01

    The Department of Energy's (DOE's) Office of Environmental Management (EM) is responsible for remediating DOE contaminated sites and managing the DOE waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste-management programs within EM. The purpose of the Technology Catalogue is to: (a) provide performance data on OTD-developed technologies to scientists and engineers responsible for preparing Remedial Investigation/Feasibility Studies (RI/FSs) and other compliance documents for the DOE's clean-up and waste-management programs; and (b) identify partnering and commercialization opportunities with industry, other federal and state agencies, and the academic community

  7. Technology catalogue. Second edition

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    The Department of Energy's (DOE's) Office of Environmental Management (EM) is responsible for remediating DOE contaminated sites and managing the DOE waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste-management programs within EM. The purpose of the Technology Catalogue is to: (a) provide performance data on OTD-developed technologies to scientists and engineers responsible for preparing Remedial Investigation/Feasibility Studies (RI/FSs) and other compliance documents for the DOE's clean-up and waste-management programs; and (b) identify partnering and commercialization opportunities with industry, other federal and state agencies, and the academic community.

  8. High Performance Processing and Analysis of Geospatial Data Using CUDA on GPU

    Directory of Open Access Journals (Sweden)

    STOJANOVIC, N.

    2014-11-01

    In this paper, the high-performance processing of massive geospatial data on a many-core GPU (Graphic Processing Unit) is presented. We use the CUDA (Compute Unified Device Architecture) programming framework to implement parallel processing of common Geographic Information Systems (GIS) algorithms, such as viewshed analysis and map-matching. Experimental evaluation indicates an improvement in performance with respect to CPU-based solutions and shows the feasibility of using GPU and CUDA for parallel implementation of GIS algorithms over large-scale geospatial datasets.
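
    The paper's viewshed and map-matching kernels are not reproduced here, but the one-thread-per-feature pattern such GPU implementations rely on can be sketched in Python with Numba's CUDA backend; the random data and the brute-force nearest-vertex search below are illustrative assumptions, not the authors' algorithm.

        # One thread per GPS point: brute-force search for the nearest road vertex,
        # a simplified stand-in for the parallel pattern behind GPU map-matching.
        import numpy as np
        from numba import cuda

        @cuda.jit
        def nearest_road_vertex(points, vertices, out_idx):
            i = cuda.grid(1)
            if i < points.shape[0]:
                best = 1.0e30
                best_j = -1
                for j in range(vertices.shape[0]):
                    dx = points[i, 0] - vertices[j, 0]
                    dy = points[i, 1] - vertices[j, 1]
                    d = dx * dx + dy * dy
                    if d < best:
                        best = d
                        best_j = j
                out_idx[i] = best_j

        points = np.random.rand(100_000, 2).astype(np.float32)    # GPS fixes
        vertices = np.random.rand(20_000, 2).astype(np.float32)   # road network nodes
        out = np.full(points.shape[0], -1, dtype=np.int32)

        threads = 128
        blocks = (points.shape[0] + threads - 1) // threads
        nearest_road_vertex[blocks, threads](points, vertices, out)
        print(out[:10])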

  9. Catalogue of nuclear fusion codes - 1976

    International Nuclear Information System (INIS)

    1976-10-01

    A catalogue is presented of the computer codes in nuclear fusion research developed at JAERI, in particular by the Division of Thermonuclear Fusion Research and the Division of Large Tokamak Development. It contains a total of about 100 codes under the categories: Atomic Process, Data Handling, Experimental Data Processing, Engineering, Input and Output, Special Languages and Their Application, Mathematical Programming, Miscellaneous, Numerical Analysis, Nuclear Physics, Plasma Physics and Fusion Research, Plasma Simulation and Numerical Technique, Reactor Design, Solid State Physics, Statistics, and System Program. (auth.)

  10. Mapping Heritage: Geospatial Online Databases of Historic Roads. The Case of the N-340 Roadway Corridor on the Spanish Mediterranean

    Directory of Open Access Journals (Sweden)

    Mar Loren-Méndez

    2018-04-01

    The study has developed an online geospatial database for assessing the complexity of roadway heritage, overcoming the limitations of traditional heritage catalogues and databases: the itemization of heritage assets and the rigidity of the database structure. Reflecting the current openness in the field of heritage studies, the research proposes an interdisciplinary approach that reframes heritage databases, both conceptually and technologically. Territorial scale is key for heritage interpretation, the complex characteristics of each type of heritage, and social appropriation. The system is based on an open-source content-management system and framework called ProcessWire, allowing flexibility in the definition of data fields and serving as an internal working tool for research collaboration. Accessibility, flexibility, and ease of use do not preclude rigor: the database works in conjunction with a GIS (Geographic Information System) support system and is complemented by a bibliographical archive. A hierarchical multiscalar heritage characterization has been implemented in order to include the different territorial scales and to facilitate the creation of itineraries. Having attained the main goals of conceptual heritage coherence, accessibility, and rigor, the database should strive for broader capacity to integrate GIS information and stimulate public participation, a step toward controlled crowdsourcing and collaborative heritage characterization.

  11. Automated Geospatial Watershed Assessment Tool (AGWA) Poster Presentation

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  12. A Research Agenda for Geospatial Technologies and Learning

    Science.gov (United States)

    Baker, Tom R.; Battersby, Sarah; Bednarz, Sarah W.; Bodzin, Alec M.; Kolvoord, Bob; Moore, Steven; Sinton, Diana; Uttal, David

    2015-01-01

    Knowledge around geospatial technologies and learning remains sparse, inconsistent, and overly anecdotal. Studies are needed that are better structured; more systematic and replicable; attentive to progress and findings in the cognate fields of science, technology, engineering, and math education; and coordinated for multidisciplinary approaches.…

  13. Academic research opportunities at the National Geospatial-Intelligence Agency(NGA)

    Science.gov (United States)

    Loomer, Scott A.

    2006-05-01

    The vision of the National Geospatial-Intelligence Agency (NGA) is to "Know the Earth...Show the Way." To achieve this vision, the NGA provides geospatial intelligence in all its forms and from whatever source-imagery, imagery intelligence, and geospatial data and information-to ensure the knowledge foundation for planning, decision, and action. Academia plays a key role in the NGA research and development program through the NGA Academic Research Program. This multi-disciplinary program of basic research in geospatial intelligence topics provides grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program are: *NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. *Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. *Intelligence Community Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports and how researchers and institutions can apply for grants under the program. In addition, other opportunities for academia to engage with NGA through

  14. A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb

    Science.gov (United States)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don

    2011-01-01

    A SensorWeb is a set of sensors, which can include ground, airborne and space-based sensors, interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products; and (4) enables automated delivery of the data products to the user's desktop. A recent addition to the SensorWeb Toolbox is a new user interface, together with web services co-resident with the sensors, that enables rapid creation, loading and execution of new algorithms for processing sensor data. The web service, along with the user interface, follows the Open Geospatial Consortium (OGC) standard called Web Coverage Processing Service (WCPS). This presentation details the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.
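
    WCPS expresses processing as a declarative query shipped to the server; a hedged sketch of submitting such a query over HTTP is shown below, where the endpoint URL, KVP parameter names and coverage identifier are assumptions for illustration rather than the SensorWeb Toolbox's actual interface.

        # Submitting a Web Coverage Processing Service (WCPS) query over HTTP;
        # endpoint, request parameters and coverage name are hypothetical.
        import requests

        # A WCPS expression: subset a coverage spatially and return it as GeoTIFF.
        wcps_query = (
            'for c in (EO1_HYPERION_SCENE) '
            'return encode(c[Lat(35.0:36.0), Long(-120.0:-119.0)], "image/tiff")'
        )

        resp = requests.get(
            "https://example.org/ows",          # hypothetical service endpoint
            params={
                "service": "WCS",
                "version": "2.0.1",
                "request": "ProcessCoverages",
                "query": wcps_query,
            },
            timeout=60,
        )
        resp.raise_for_status()
        with open("subset.tif", "wb") as f:
            f.write(resp.content)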

  15. A geospatial soil-based DSS to reconcile landscape management and land protection

    Science.gov (United States)

    Manna, Piero; Basile, Angelo; Bonfante, Antonello; D'Antonio, Amedeo; De Michele, Carlo; Iamarino, Michela; Langella, Giuliano; Florindo Mileti, Antonio; Pileri, Paolo; Vingiani, Simona; Terribile, Fabio

    2017-04-01

    The implementation of UN Agenda 2030 may represent a great opportunity to place soil science at the heart of many Sustainable Development Goals (e.g. SDGs 2, 3, 13, 15, 15.3, 16.7). On the other hand, the high complexity embedded in the actual implementation of the SDGs and of many other ambitious objectives (e.g. FAO goals) may cause new frustration if these policy documents do not bring real progress. The scientific communities are asked to help disentangle this complexity and possibly identify a "way to go". This may also help the large number of European directives (e.g. WFD, EIA), regulations and communications that aim to achieve a better environment but still face great difficulties in their full implementation (e.g. COM2015/120; COM2013/683). This contribution aims to provide a different perspective: the full implementation of the SDGs and of integrated land policies requires tackling some key overlooked issues, including full competence in (and capability to manage) landscape variability, its multi-functionality (e.g. agriculture/environment) and its dynamic nature (many processes, including crop growth and the fate of pollutants, are dynamic); moreover, it requires supporting actions at a very detailed local scale, since many processes and problems are site-specific. The soil is the pulsing heart of the landscape and of all the issues above. Accordingly, we aim to demonstrate the multiple benefits of using a smart geoSpatial Decision Support System (S-DSS) grounded in soil modelling, called SOILCONSWEB (an EU LIFE+ project and its extensions). It is a freely accessible web platform based on a Geospatial Cyber-Infrastructure (GCI) and developed in Valle Telesina (southern Italy) over an area of 20,000 ha. It supports multilevel decision-making in agriculture and the environment, including the interaction with other land uses (such as landscape and urban planning), and thus simultaneously contributes to SDGs 2, 3, 13, 15, 15.3 and 16.7.

  16. Conceptual models in the field of library catalogues

    Directory of Open Access Journals (Sweden)

    Marija Petek

    2000-01-01

    The publishing world is changing quickly, and so must bibliographic control. It is time to re-examine cataloguing rules and MARC formats. This can be done by the method of conceptual modelling. Some conceptual models are presented; an IFLA study on the functional requirements for bibliographic records is described in detail.

  17. Improving Library Management by Using Cost Analysis Tools: A Case Study for Cataloguing Processes

    Directory of Open Access Journals (Sweden)

    Lorena Siguenza-Guzman

    2014-02-01

    TDABC (time-driven activity-based costing) is a relatively new cost-management technique, initially developed for manufacturing processes, which is gaining attention in libraries. This is because TDABC is a fast and simple method that requires only two parameters: an estimate of the time required to perform an activity and the unit cost per time of supplying capacity. A few case studies of TDABC in libraries have been documented, all of them oriented towards analysing specific library activities such as inter-library loan, acquisition and circulation processes. The primary focus of this paper is to describe a TDABC implementation in one of the most important library processes, namely cataloguing. In particular, original and copy cataloguing are analysed through a case study to demonstrate the applicability and usefulness of TDABC for performing cost analysis of cataloguing processes.
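
    The two-parameter calculation at the core of TDABC can be made concrete with a short worked sketch; the staffing cost, practical capacity and per-item times below are hypothetical figures, not values from the case study.

        # TDABC in two parameters: cost per minute of supplying capacity and
        # minutes per activity. All numbers are illustrative assumptions.

        # Parameter 1: unit cost per minute of supplying capacity.
        monthly_department_cost = 12_000.0                 # salaries, systems, overhead
        practical_capacity_minutes = 2 * 160 * 60 * 0.8    # 2 FTEs at 80% practical capacity
        cost_per_minute = monthly_department_cost / practical_capacity_minutes

        # Parameter 2: estimated time required to perform each activity (minutes per item).
        time_per_item = {"original cataloguing": 25.0, "copy cataloguing": 6.0}

        for activity, minutes in time_per_item.items():
            print(f"{activity}: {minutes * cost_per_minute:.2f} per item")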

  18. Geospatial cryptography: enabling researchers to access private, spatially referenced, human subjects data for cancer control and prevention.

    Science.gov (United States)

    Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre

    2017-07-01

    As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. Geospatial cryptography holds substantial promise for accelerating the
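
    For the de-duplication use case, a common simplified approach (not the secure multi-party system developed in the article) is for registries to exchange keyed digests of normalized identifiers rather than the identifiers themselves; the sketch below illustrates that idea, with the shared key and record identifiers as assumptions.

        # Keyed-hash comparison of record identifiers between two registries:
        # a simplified stand-in for privacy-aware de-duplication, not the
        # article's secure multi-party protocol.
        import hashlib
        import hmac

        def keyed_digest(record_id: str, key: bytes) -> str:
            """HMAC-SHA256 of a normalized identifier (e.g. surname|given name|dob)."""
            return hmac.new(key, record_id.strip().lower().encode(), hashlib.sha256).hexdigest()

        shared_key = b"agreed-out-of-band"    # hypothetical pre-shared key

        registry_a = ["doe|john|1950-02-01", "roe|jane|1961-07-15"]
        registry_b = ["roe|jane|1961-07-15", "poe|edgar|1949-01-19"]

        digests_a = {keyed_digest(r, shared_key) for r in registry_a}
        digests_b = {keyed_digest(r, shared_key) for r in registry_b}

        duplicates = digests_a & digests_b
        print(f"{len(duplicates)} record(s) appear in both registries")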

  19. Architecture of a spatial data service system for statistical analysis and visualization of regional climate changes

    Science.gov (United States)

    Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of the general architecture of a virtual research environment (VRE) for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available on the node. It also contains geospatial data processing services (WPS) based on a modular computing backend that implements the statistical processing functionality and thus provides analysis of large datasets, with results available for visualization and export to files in standard formats (XML, binary, etc.). Several cartographical web services have been developed in a prototype of the system to provide capabilities for working with raster and vector geospatial data through OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.
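
    A hedged sketch of how a client could discover the statistical-processing operations one of these nodes exposes through its WPS endpoint is shown below, using OWSLib; the service URL is hypothetical and the advertised process identifiers depend on the deployment.

        # Discovering the processes advertised by an SDI node's WPS endpoint;
        # the URL is hypothetical.
        from owslib.wps import WebProcessingService

        wps = WebProcessingService("https://example.org/climate-node/wps", skip_caps=True)
        wps.getcapabilities()

        print(wps.identification.title)
        for process in wps.processes:
            print(process.identifier, "-", process.title)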

  20. Big Data analytics in the Geo-Spatial Domain

    NARCIS (Netherlands)

    R.A. Goncalves (Romulo); M.G. Ivanova (Milena); M.L. Kersten (Martin); H. Scholten; S. Zlatanova; F. Alvanaki (Foteini); P. Nourian (Pirouz); E. Dias

    2014-01-01

    Big data collections in many scientific domains have inherently rich spatial and geo-spatial features. Spatial location is among the core aspects of data in Earth observation sciences, astronomy, and seismology to name a few. The goal of our project is to design an efficient data