WorldWideScience

Sample records for geospatial catalogue web

  1. Grid Enabled Geospatial Catalogue Web Service

    Science.gov (United States)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bai, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC)'s Catalogue Service for the Web information model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which securely provides Grid-based publishing, management, and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information models of the Grid Replica Location Service (RLS) and Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service for the Web (CSW), and draws on the geospatial data metadata standards from ISO 19115, FGDC, and the NASA EOS Core System, and the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and retrieve it through data-related services providing functions such as subsetting, reformatting, and reprojection. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science rather than on issues of computing capacity, data location, processing, and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  2. Grid computing enhances standards-compatible geospatial catalogue service

    Science.gov (United States)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards—the International Organization for Standardization (ISO)'s 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkits are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share and
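
    The catalogue service described in the two records above is queried through the CSW GetRecords operation. A minimal sketch of building such a request in its key-value-pair form with the Python standard library; the endpoint URL is hypothetical, and the parameter set follows common CSW 2.0.2 deployments:

```python
from urllib.parse import urlencode

# Hypothetical CSW endpoint; real catalogues publish their own base URL.
CSW_BASE = "http://example.org/csw"

def build_getrecords_url(keyword, max_records=10):
    """Build a CSW 2.0.2 GetRecords KVP request filtering on a keyword."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "elementSetName": "summary",
        "maxRecords": str(max_records),
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        # Free-text constraint over all queryable metadata fields.
        "constraint": f"AnyText LIKE '%{keyword}%'",
    }
    return CSW_BASE + "?" + urlencode(params)

url = build_getrecords_url("landsat")
```

    A Grid-enabled deployment such as the one described would layer authorization (e.g. VO membership checks) in front of the same interface rather than change the request shape.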

  3. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data source; interfaces for Geospatial Semantic Web, VGI (Volunteered Geographic Information) and Geospatial Semantic Web; challenges of Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use the Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.
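
    Of the technologies the book lists, GeoSPARQL is the query-language piece. A sketch of a spatial query using its `geof:sfIntersects` filter function; the prefixes follow the OGC GeoSPARQL vocabulary, while the data being queried is hypothetical:

```python
# A GeoSPARQL query selecting features whose geometry intersects a
# bounding polygon given as a WKT literal. The prefix URIs are the
# standard OGC GeoSPARQL ones; the feature data itself is hypothetical.
QUERY = """
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>

SELECT ?feature WHERE {
  ?feature geo:hasGeometry ?geom .
  ?geom geo:asWKT ?wkt .
  FILTER(geof:sfIntersects(?wkt,
    "POLYGON((15 45, 16 45, 16 46, 15 46, 15 45))"^^geo:wktLiteral))
}
"""
```

    Such a query would be submitted to a GeoSPARQL-capable triple store; the book's interoperability argument is that the same query works across any compliant store.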

  4. Borderless Geospatial Web (bolegweb)

    Science.gov (United States)

    Cetl, V.; Kliment, T.; Kliment, M.

    2016-06-01

    Effective access to and use of geospatial information (GI) resources is of critical importance in the modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate discovery and use of geospatial data. These data are stored in databases located in a layer called the invisible web and are thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which are structurally diverse from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, to enrich its information extent. A public, global, and user-friendly portal of OGC resources available on the web ensures and enhances the use of GI within a multidisciplinary context and bridges the geospatial web from the end-user perspective, thus opening its borders to everybody. The project "Crosswalking the layers of geospatial information resources to enable a borderless geospatial web", with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb, Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to finish by the end of 2016. This paper provides an overview of the project, its research questions and methodology, the results achieved so far, and future steps.

  5. BORDERLESS GEOSPATIAL WEB (BOLEGWEB)

    Directory of Open Access Journals (Sweden)

    V. Cetl

    2016-06-01

    Full Text Available Effective access to and use of geospatial information (GI) resources is of critical importance in the modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate discovery and use of geospatial data. These data are stored in databases located in a layer called the invisible web and are thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which are structurally diverse from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, to enrich its information extent. A public, global, and user-friendly portal of OGC resources available on the web ensures and enhances the use of GI within a multidisciplinary context and bridges the geospatial web from the end-user perspective, thus opening its borders to everybody. The project “Crosswalking the layers of geospatial information resources to enable a borderless geospatial web”, with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb, Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to finish by the end of 2016. This paper provides an overview of the project, its research questions and methodology, the results achieved so far, and future steps.

  6. Semantic web-based intelligent geospatial web services

    CERN Document Server

    Yue, Peng

    2013-01-01

    By introducing Semantic Web technologies into geospatial Web services, this book addresses the semantic description of geospatial data and standards-based Web services, discovery of geospatial data and services, and generation of composite services. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered in geospatial catalogue services. The ontology-based approach helps to improve the recall and precision of data and services discovery. Semantics-enabled metadata tracking and satisfaction allows analysts to focus on the g

  7. Integrating semantic web technologies and geospatial catalog services for geospatial information discovery and processing in cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Yue, Peng [Wuhan University; Gong, Jianya [Wuhan University; Di, Liping [George Mason University; He, Lianlian [Hubei University; Wei, Yaxing [ORNL

    2011-04-01

    A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered through extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.

  8. Geospatial Semantics and the Semantic Web

    CERN Document Server

    Ashish, Naveen

    2011-01-01

    The availability of geographic and geospatial information and services, especially on the open Web, has become abundant in the last several years with the proliferation of online maps, geo-coding services, geospatial Web services, and geospatially enabled applications. The need for geospatial reasoning has significantly increased in many everyday applications, including personal digital assistants, Web search applications, location-aware mobile services, specialized systems for emergency response, medical triaging, intelligence analysis, and more. Geospatial Semantics and the Semantic Web: Foundation

  9. Fire alerts on the geospatial semantic web

    CSIR Research Space (South Africa)

    Mcferren, GA

    2006-01-01

    Full Text Available concept-based queries of data and knowledge repositories. Future AFIS versions would supply highly tuned, meaningful and customised fire alerts to users based on an open framework of geospatial Web services, ontologies and software agents. Other Web-based...

  10. A flexible geospatial sensor observation service for diverse sensor data based on Web service

    Science.gov (United States)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Min, Min

    Achieving a flexible and efficient geospatial Sensor Observation Service (SOS) is difficult, given the diversity of sensor networks, the heterogeneity of sensor data storage, and the differing requirements of users. This paper describes development of a service-oriented multi-purpose SOS framework. The goal is to create a single method of access to the data by integrating the sensor observation service with other Open Geospatial Consortium (OGC) services — Catalogue Service for the Web (CSW), Transactional Web Feature Service (WFS-T) and Transactional Web Coverage Service (WCS-T). The framework includes an extensible sensor data adapter, an OGC-compliant geospatial SOS, a geospatial catalogue service, a WFS-T, and a WCS-T for the SOS, and a geospatial sensor client. The extensible sensor data adapter finds, stores, and manages sensor data from live sensors, sensor models, and simulation systems. Abstract factory design patterns are used during design and implementation. A sensor observation service compatible with the SWE is designed, following the OGC "core" and "transaction" specifications. It is implemented using Java servlet technology. It can be easily deployed in any Java servlet container and automatically exposed for discovery using Web Service Description Language (WSDL). Interaction sequences between a Sensor Web data consumer and an SOS, between a producer and an SOS, and between an SOS and a CSW are described in detail. The framework has been successfully demonstrated in application scenarios for EO-1 observations, weather observations, and water height gauge observations.
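
    An SOS like the one described is queried with the GetObservation operation. A sketch of the key-value-pair form supported by many SOS 1.0.0 implementations; the endpoint, offering name, and observed-property URN below are hypothetical:

```python
from urllib.parse import urlencode

# Hypothetical SOS endpoint; a real deployment publishes its own URL,
# offerings, and observed properties in its capabilities document.
SOS_BASE = "http://example.org/sos"

def build_getobservation_url(offering, observed_property):
    """Build an SOS 1.0.0 GetObservation KVP request."""
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return SOS_BASE + "?" + urlencode(params)

url = build_getobservation_url("WATER_GAUGE",
                               "urn:ogc:def:phenomenon:waterHeight")
```

    In the framework above, the sensor client would issue such requests against the SOS, while the CSW registration makes the offering discoverable in the first place.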

  11. GeoWeb Crawler: An Extensible and Scalable Web Crawling Framework for Discovering Geospatial Web Resources

    Directory of Open Access Journals (Sweden)

    Chih-Yuan Huang

    2016-08-01

    Full Text Available With the advance of World-Wide Web (WWW) technology, people can easily share content on the Web, including geospatial data and web services. Thus, “big geospatial data management” issues have started attracting attention. Among these issues, this research focuses on discovering distributed geospatial resources. As resources are scattered on the WWW, users cannot find resources of interest efficiently. While the WWW has Web search engines addressing web resource discovery issues, we envision that the geospatial Web (i.e., the GeoWeb) also requires GeoWeb search engines. To realize a GeoWeb search engine, one of the first steps is to proactively discover GeoWeb resources on the WWW. Hence, in this study, we propose the GeoWeb Crawler, an extensible Web crawling framework that can find various types of GeoWeb resources, such as Open Geospatial Consortium (OGC) web services, Keyhole Markup Language (KML) files, and Environmental Systems Research Institute, Inc. (ESRI) Shapefiles. In addition, we apply the distributed computing concept to improve the performance of the GeoWeb Crawler. The result shows that for 10 targeted resource types, the GeoWeb Crawler discovered 7351 geospatial services and 194,003 datasets. As a result, the proposed GeoWeb Crawler framework is proven to be extensible and scalable enough to provide a comprehensive index of the GeoWeb.
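
    The crawler's core job of recognizing OGC service endpoints among ordinary hyperlinks can be sketched with the standard-library HTML parser. The heuristic and the sample page below are illustrative assumptions, not the paper's actual detection rules:

```python
from html.parser import HTMLParser

class OGCLinkFinder(HTMLParser):
    """Collect hrefs that look like OGC GetCapabilities endpoints."""
    def __init__(self):
        super().__init__()
        self.ogc_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        # Simple heuristic: a GetCapabilities request naming one of the
        # classic OGC service types.
        if "getcapabilities" in href.lower() and any(
            s in href.upper() for s in ("WMS", "WFS", "WCS", "CSW", "SOS")
        ):
            self.ogc_links.append(href)

# Fabricated sample page standing in for a crawled document.
SAMPLE = """
<html><body>
<a href="http://example.org/geoserver/ows?service=WMS&request=GetCapabilities">map</a>
<a href="http://example.org/about.html">about</a>
</body></html>
"""

finder = OGCLinkFinder()
finder.feed(SAMPLE)
```

    A full crawler would fetch each discovered endpoint's capabilities document to confirm and index it; distributing that fetch-and-parse work is where the framework's scalability comes from.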

  12. Geospatial metadata retrieval from web services

    Directory of Open Access Journals (Sweden)

    Ivanildo Barbosa

    Full Text Available Nowadays, producers of geospatial data in either raster or vector formats are able to make them available on the World Wide Web by deploying web services that enable users to access and query those contents even without specific geoprocessing software. Several providers around the world have deployed instances of WMS (Web Map Service), WFS (Web Feature Service), and WCS (Web Coverage Service), all of them specified by the Open Geospatial Consortium (OGC). In consequence, metadata about the available contents can be retrieved and compared with similar offline datasets from other sources. This paper presents a brief summary of, and describes, the matching process between the specifications for the OGC web services (WMS, WFS and WCS) and the metadata specifications required by ISO 19115, adopted as the reference for several national metadata profiles, including the Brazilian one. This process focuses on retrieving metadata for the identification and data quality packages, and indicates directions for retrieving metadata related to other packages. Thereby, users are able to assess whether the provided contents fit their purposes.
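
    The matching process described, pulling identification metadata out of a service's capabilities response, can be sketched with `xml.etree` on a trimmed WMS fragment. The XML below is fabricated and namespace-free for brevity; real capabilities documents are namespaced and far larger, and the field mapping shown is an approximation of the ISO 19115 crosswalk:

```python
import xml.etree.ElementTree as ET

# Fabricated, namespace-free excerpt of a WMS capabilities response.
CAPS = """
<WMS_Capabilities version="1.3.0">
  <Service>
    <Title>Sample Map Server</Title>
    <Abstract>Demonstration layers.</Abstract>
  </Service>
</WMS_Capabilities>
"""

def wms_service_to_iso_fields(xml_text):
    """Map WMS <Service> elements onto ISO 19115-style identification fields."""
    root = ET.fromstring(xml_text)
    service = root.find("Service")
    return {
        # Approximate MD_DataIdentification equivalents.
        "citation.title": service.findtext("Title"),
        "abstract": service.findtext("Abstract"),
    }

fields = wms_service_to_iso_fields(CAPS)
```

    The same pattern, with different element paths, covers WFS and WCS capabilities; the data quality package requires additional elements that capabilities documents only partially supply.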

  13. Publishing Geospatial Data through Geospatial Web Service and XML Database System

    OpenAIRE

    Pouria Amirian; Ali A. Alesheikh

    2008-01-01

    Technically the spatial non-interoperability problem associated with current geospatial processing systems can be categorized as data and access non-interoperability. In GIS community, Open GIS Consortium (OGC) geospatial Web services have been introduced to overcome spatial non-interoperability problem associated with most geospatial processing systems. At the same time, in Information Technology (IT) world, the best solution for providing interoperability among heterogeneous systems is Web ...

  14. A geospatial search engine for discovering multi-format geospatial data across the web

    Science.gov (United States)

    Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney

    2014-01-01

    The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease with which data can now be created. However, challenges remain in connecting individuals searching for geospatial data with the servers and websites where such data exist. The objective of this paper is to present a publicly...

  15. A Javascript GIS Platform Based on Invocable Geospatial Web Services

    Directory of Open Access Journals (Sweden)

    Konstantinos Evangelidis

    2018-04-01

    Full Text Available Semantic Web technologies have been increasingly adopted by the geospatial community during the last decade through the utilization of open standards for expressing and serving geospatial data. This was also dramatically assisted by the ever-increasing access and usage of geographic mapping and location-based services via smart devices in people’s daily activities. In this paper, we explore the developmental framework of a pure JavaScript client-side GIS platform exclusively based on invocable geospatial Web services. We also extend JavaScript utilization on the server side by deploying a node server acting as a bridge between open-source WPS libraries and popular geoprocessing engines. The vehicle for such an exploration is a cross-platform Web browser capable of interpreting JavaScript commands to achieve interaction with geospatial providers. The tool is a generic Web interface providing capabilities for acquiring spatial datasets, composing layouts, and applying geospatial processes. In its ideal form, the end user has only to identify those services which satisfy a geo-related need and put them in the appropriate row. The final output may act as a potential collector of freely available geospatial web services. Its server-side components may exploit geospatial processing suppliers, composing in that way a lightweight, fully transparent, open Web GIS platform.

  16. A resource-oriented architecture for a Geospatial Web

    Science.gov (United States)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this aim, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic Distributed Computing Infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility, and low entry barrier of the original Web. On the contrary

  17. A GEOSPATIAL WEB SERVICES COMPOSITION FRAMEWORK SUPPORTING REAL-TIME STATUS MONITORING

    OpenAIRE

    L. You; Z. Gui; W. Guo; S. Shen; H. Wu

    2012-01-01

    Geospatial web services composition becomes one of the main solutions for complex computing in the GIS realm with the development of information interoperability and advanced IT technologies. Standard geospatial web services only have two simple statuses: success or failure. However the procedures for geospatial information processing and analysis always feature intensive data, complex computation, and long processing times. Thus, standard geospatial web services composition only pro...

  18. An Ontology-supported Approach for Automatic Chaining of Web Services in Geospatial Knowledge Discovery

    Science.gov (United States)

    di, L.; Yue, P.; Yang, W.; Yu, G.

    2006-12-01

    Recent developments in the geospatial semantic Web have shown promise for automatic discovery, access, and use of geospatial Web services to quickly and efficiently solve particular application problems. With semantic Web technology, it is highly feasible to construct intelligent geospatial knowledge systems that can provide answers to many geospatial application questions. A key challenge in constructing such an intelligent knowledge system is to automate the creation of a chain or process workflow that involves multiple services and highly diversified data and can generate the answer to a specific user question. This presentation discusses an approach for automating the composition of geospatial Web service chains by employing geospatial semantics described by geospatial ontologies. It shows how ontology-based geospatial semantics are used to enable the automatic discovery, mediation, and chaining of geospatial Web services. OWL-S is used to represent the geospatial semantics of each individual Web service, the type of service it belongs to, and the type of data it can handle. The hierarchy and classification of service types are described in the service ontology. The hierarchy and classification of data types are presented in the data ontology. To answer users' geospatial questions, an Artificial Intelligence (AI) planning algorithm is used to construct the service chain using the service and data logics expressed in the ontologies. The chain can be expressed as a graph, with nodes representing services and connection weights representing degrees of semantic matching between nodes. The graph is a visual representation of the logical geo-processing path for answering a user's question. The graph can be instantiated to a physical service workflow and executed to generate the answer to the user's question. A prototype system, which includes real-world geospatial applications, has been implemented to demonstrate the concept and approach.
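
    The planning idea, working backwards from the requested data type through services whose outputs match, can be sketched as a tiny backward-chaining planner. The service registry and type names are hypothetical stand-ins for what the ontologies would supply, and no semantic-matching weights are modeled:

```python
# Hypothetical service registry: name -> (input data type, output data type).
# In the described system these signatures would come from OWL-S
# descriptions and the service/data ontologies.
SERVICES = {
    "ReprojectService": ("RawImage", "ProjectedImage"),
    "NDVIService": ("ProjectedImage", "NDVIMap"),
}

def plan_chain(have, want, services=SERVICES):
    """Backward-chain from the wanted type to a type we already have.

    Returns the ordered list of service names, or None if no chain
    exists. (No cycle guard: adequate only for acyclic registries.)
    """
    if have == want:
        return []
    for name, (inp, out) in services.items():
        if out == want:
            prefix = plan_chain(have, inp, services)
            if prefix is not None:
                return prefix + [name]
    return None

chain = plan_chain("RawImage", "NDVIMap")
```

    The real system scores candidate edges by degree of semantic match rather than requiring exact type equality, which is what turns the chain into the weighted graph the abstract describes.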

  19. The geospatial web how geobrowsers, social software and the web 2 0 are shaping the network society

    CERN Document Server

    Scharl, Arno; Tochtermann, Klaus

    2007-01-01

    The Geospatial Web will have a profound impact on managing knowledge, structuring work flows within and across organizations, and communicating with like-minded individuals in virtual communities. The enabling technologies for the Geospatial Web are geo-browsers such as NASA World Wind, Google Earth and Microsoft Live Local 3D. These three-dimensional platforms revolutionize the production and consumption of media products. They not only reveal the geographic distribution of Web resources and services, but also bring together people of similar interests, browsing behavior, or geographic location. This book summarizes the latest research on the Geospatial Web's technical foundations, describes information services and collaborative tools built on top of geo-browsers, and investigates the environmental, social and economic impacts of geospatial applications. The role of contextual knowledge in shaping the emerging network society deserves particular attention. By integrating geospatial and semantic technology, ...

  20. A web service for service composition to aid geospatial modelers

    Science.gov (United States)

    Bigagli, L.; Santoro, M.; Roncella, R.; Mazzetti, P.

    2012-04-01

    The identification of appropriate mechanisms for process reuse, chaining, and composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. In the Earth and Space Sciences, such a facility could primarily enable integrated and interoperable modeling, for which several approaches have been proposed and developed over recent years. In fact, GEOSS is specifically tasked with the development of the so-called "Model Web". At increasing levels of abstraction and generalization, the initial stove-pipe software tools have evolved into community-wide modeling frameworks and Component-Based Architecture solutions, and, more recently, have started to embrace Service-Oriented Architecture technologies, such as the OGC WPS specification and the WS-* stack of W3C standards for service composition. However, so far the level of abstraction seems too low for implementing the Model Web vision, and far too complex technological aspects must still be addressed by both providers and users, resulting in limited usability and, eventually, difficult uptake. In line with the recent ICT trend of resource virtualization, it has been suggested that users in need of a particular processing capability, required by a given modeling workflow, may benefit from outsourcing the composition activities to an external first-class service, according to the Composition as a Service (CaaS) approach. A CaaS system provides the necessary interoperability service framework for adaptation, reuse, and complementation of existing processing resources (including models and geospatial services in general) in the form of executable workflows. This work introduces the architecture of a CaaS system, as a distributed information system for creating, validating, editing, storing, publishing, and executing geospatial workflows. This way, the users can be freed from the need of a composition infrastructure and

  1. Geospatial Web Services in Real Estate Information System

    Science.gov (United States)

    Radulovic, Aleksandra; Sladic, Dubravka; Govedarica, Miro; Popovic, Dragana; Radovic, Jovana

    2017-12-01

    Since cadastral records are of great importance for the economic development of the country, they must be well structured and organized. Real estate records on the territory of Serbia faced many problems in previous years. To prevent these problems and to achieve efficient access, sharing, and exchange of cadastral data on the principles of interoperability, a domain model for real estate was created according to current standards in the field of spatial data. The resulting profile of the domain model for the Serbian real estate cadastre is based on the current legislation and on the Land Administration Domain Model (LADM), which is specified in the ISO 19152 standard. On top of such organized data, and for their effective exchange, it is necessary to develop a model of the services that must be provided by the institutions interested in the exchange of cadastral data. This is achieved by introducing a service-oriented architecture into the information system of the real estate cadastre, which ensures the efficiency of the system. It is necessary to develop user services for the download, review, and use of real estate data through the web. These services should be provided to all users who need access to cadastral data (natural and legal persons as well as state institutions) through e-government. It is also necessary to provide search, view, and download of cadastral spatial data by specifying geospatial services. Considering that real estate records contain geometric data for parcels and buildings, it is necessary to establish a set of geospatial services that would provide information and maps for the analysis of spatial data and for forming raster data. Besides the theme Cadastral parcels, the INSPIRE directive specifies several themes that involve data on buildings and land use, for which data can be provided from the real estate cadastre. In this paper, a model of geospatial services in Serbia is defined. A case study of using these services to estimate which household is at risk of

  2. Using a Web GIS Plate Tectonics Simulation to Promote Geospatial Thinking

    Science.gov (United States)

    Bodzin, Alec M.; Anastasio, David; Sharif, Rajhida; Rutzmoser, Scott

    2016-01-01

    Learning with Web-based geographic information system (Web GIS) can promote geospatial thinking and analysis of georeferenced data. Web GIS can enable learners to analyze rich data sets to understand spatial relationships that are managed in georeferenced data visualizations. We developed a Web GIS plate tectonics simulation as a capstone learning…

  3. A catalogue of bird bones: an exercise in semantic web practice

    OpenAIRE

    Gudmundsson, Gudmundur; Brewington, Seth D.; McGovern, Thomas H.; Petersen, Aevar

    2010-01-01

    The vast databases of natural history collections are increasingly being made accessible through the internet. The challenge is to place these data in a wider context that may reach beyond the interests of scholars alone. The North Atlantic Biocultural Organization and the Icelandic Institute of Natural History are jointly developing a web-based catalogue of bird bones, comprising digital images and related information from the museum database. Linking the bird bone catalogue wit...

  4. Integrated web system of geospatial data services for climate research

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander

    2016-04-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required. An approach for integrated analysis of georeferenced climatological data sets based on a combination of web and GIS technologies in the framework of the spatial data infrastructure paradigm is presented. According to this approach, a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.

  5. GeoCENS: A Geospatial Cyberinfrastructure for the World-Wide Sensor Web

    Science.gov (United States)

    Liang, Steve H.L.; Huang, Chih-Yuan

    2013-01-01

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver large amount of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstration of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision. PMID:24152921

  6. GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.

    Science.gov (United States)

    Liang, Steve H L; Huang, Chih-Yuan

    2013-10-02

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver large amount of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstration of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  7. The use of geospatial web services for exchanging utilities data

    Science.gov (United States)

    Kuczyńska, Joanna

    2013-04-01

    Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information on technical infrastructure that is important for many institutions. It requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions that administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology which can be implemented on a variety of hardware and software platforms. This methodology uses the Unified Modeling Language (UML), eXtensible Markup Language (XML), and Geography Markup Language (GML). The proposed methodology is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases. The models were written in the universal modeling language UML. A combined model that defines a common data structure was also built. This model was transformed into GML, the standard developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in the .xsd file. Network services were selected and implemented in the system designed for data exchange based on open source tools. The methodology was implemented and tested. Data in the agreed data structure and metadata were set up on the server. Data access was provided by geospatial network services: data searching possibilities by Catalog Service for the Web (CSW), data
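As a sketch of the kind of GML-encoded record such an exchange might carry, the snippet below parses a single hypothetical utility feature with Python's standard library. The `gesut` namespace, element names, and values are invented for illustration; they are not the actual GESUT/GML application schema.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical GML fragment describing one utility line;
# the gesut namespace and elements are illustrative, not the real schema.
GML = """<gesut:Utility xmlns:gesut="http://example.org/gesut"
                        xmlns:gml="http://www.opengis.net/gml">
  <gesut:kind>water</gesut:kind>
  <gesut:owner>City Waterworks</gesut:owner>
  <gml:LineString><gml:posList>20.99 52.20 21.01 52.21</gml:posList></gml:LineString>
</gesut:Utility>"""

NS = {"gesut": "http://example.org/gesut", "gml": "http://www.opengis.net/gml"}

def parse_utility(xml_text):
    """Extract attributes and the vertex list from a single GML utility feature."""
    root = ET.fromstring(xml_text)
    coords = root.find("gml:LineString/gml:posList", NS).text.split()
    vertices = [(float(x), float(y)) for x, y in zip(coords[0::2], coords[1::2])]
    return {
        "kind": root.find("gesut:kind", NS).text,
        "owner": root.find("gesut:owner", NS).text,
        "vertices": vertices,
    }

feature = parse_utility(GML)
```

In a real exchange, the `.xsd` schema derived from the combined UML model would govern which elements are valid; here the structure is hard-coded for brevity.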

  8. Geospatial Information Relevant to the Flood Protection Available on The Mainstream Web

    Directory of Open Access Journals (Sweden)

    Kliment Tomáš

    2014-03-01

    Full Text Available Flood protection is one of several disciplines where geospatial data is a crucial component. Its management, processing and sharing form the foundation for its efficient use; therefore, special attention is required in the development of effective, precise, standardized, and interoperable models for the discovery and publishing of data on the Web. This paper describes the design of a methodology to discover Open Geospatial Consortium (OGC) services on the Web and collect descriptive information, i.e., metadata, in a geocatalogue. A pilot implementation of the proposed methodology - a geocatalogue of geospatial information provided by OGC services discovered on Google (hereinafter “Geocatalogue”) - was used to search for available resources relevant to the area of flood protection. The result is an analysis of the availability of resources discovered through their metadata collected from the OGC services (WMS, WFS, etc.) and the resources they provide (WMS layers, WFS objects, etc.) within the domain of flood protection.
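The metadata-harvesting step the abstract describes can be illustrated with a minimal sketch: parse a WMS GetCapabilities response and collect one catalogue record per named layer. The capabilities document below is a trimmed, hand-made sample, not the output of a real service.

```python
import xml.etree.ElementTree as ET

# Trimmed WMS 1.3.0 GetCapabilities response; real documents are much larger.
CAPS = """<WMS_Capabilities xmlns="http://www.opengis.net/wms" version="1.3.0">
  <Service><Title>Flood mapping service</Title></Service>
  <Capability><Layer>
    <Layer><Name>flood_zones</Name><Title>Flood hazard zones</Title></Layer>
    <Layer><Name>river_network</Name><Title>River network</Title></Layer>
  </Layer></Capability>
</WMS_Capabilities>"""

NS = {"wms": "http://www.opengis.net/wms"}

def harvest_layers(caps_xml):
    """Collect catalogue records (name, title) for every named WMS layer."""
    root = ET.fromstring(caps_xml)
    return [
        {"name": l.find("wms:Name", NS).text, "title": l.find("wms:Title", NS).text}
        for l in root.iter("{http://www.opengis.net/wms}Layer")
        if l.find("wms:Name", NS) is not None  # container layers have no Name
    ]

records = harvest_layers(CAPS)
```

A harvester would fetch such documents over HTTP for each discovered service endpoint and insert the records into the geocatalogue; the filter on `Name` skips grouping layers, which per the WMS specification need not be requestable.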

  9. GEO Label Web Services for Dynamic and Effective Communication of Geospatial Metadata Quality

    Science.gov (United States)

    Lush, Victoria; Nüst, Daniel; Bastin, Lucy; Masó, Joan; Lumsden, Jo

    2014-05-01

    We present demonstrations of the GEO label Web services and their integration into a prototype extension of the GEOSS portal (http://scgeoviqua.sapienzaconsulting.com/web/guest/geo_home), the GMU portal (http://gis.csiss.gmu.edu/GADMFS/) and a GeoNetwork catalog application (http://uncertdata.aston.ac.uk:8080/geonetwork/srv/eng/main.home). The GEO label is designed to communicate, and facilitate interrogation of, geospatial quality information with a view to supporting efficient and effective dataset selection on the basis of quality, trustworthiness and fitness for use. The GEO label which we propose was developed and evaluated according to a user-centred design (UCD) approach in order to maximise the likelihood of user acceptance once deployed. The resulting label is dynamically generated from producer metadata in ISO or FGDC format, and incorporates user feedback on dataset usage, ratings and discovered issues, in order to supply a highly informative summary of metadata completeness and quality. The label was easily incorporated into a community portal as part of the GEO Architecture Implementation Programme (AIP-6) and has been successfully integrated into a prototype extension of the GEOSS portal, as well as the popular metadata catalog and editor, GeoNetwork. The design of the GEO label was based on four user studies conducted to: (1) elicit initial user requirements; (2) investigate initial user views on the concept of a GEO label and its potential role; (3) evaluate prototype label visualizations; and (4) evaluate and validate physical GEO label prototypes. The results of these studies indicated that users and producers support the concept of a label with a drill-down interrogation facility, combining eight geospatial data informational aspects, namely: producer profile, producer comments, lineage information, standards compliance, quality information, user feedback, expert reviews, and citations information. These are delivered as eight facets of a wheel
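A toy rendition of how the eight label facets might be derived from a producer-metadata record; the field names and the available/missing states are assumptions for illustration, not the actual GEO label API.

```python
# The abstract names eight informational facets of the GEO label. This sketch
# (field names are assumptions, not the real service's schema) derives each
# facet's state from a metadata record, ready for drill-down rendering.
FACETS = [
    "producer_profile", "producer_comments", "lineage", "standards_compliance",
    "quality", "user_feedback", "expert_reviews", "citations",
]

def label_summary(metadata):
    """Map each facet to 'available' or 'missing' for a given metadata record."""
    return {f: ("available" if metadata.get(f) else "missing") for f in FACETS}

summary = label_summary({
    "producer_profile": "Met Office",
    "lineage": "Derived from station observations",
    "quality": "RMSE 0.4 K",
})
```

The real services additionally merge producer metadata with community feedback before rendering the wheel; this sketch only shows the completeness-summary idea.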

  10. The growing role of web-based geospatial technology in disaster response and support.

    Science.gov (United States)

    Kawasaki, Akiyuki; Berman, Merrick Lex; Guan, Wendy

    2013-04-01

    This paper examines changes in disaster response and relief efforts and recent web-based geospatial technological developments through an evaluation of the experiences of the Center for Geographic Analysis, Harvard University, in the Sichuan (2008) and Haiti (2010) earthquake responses. This paper outlines how conventional GIS (geographic information systems) disaster responses by governmental agencies and relief response organisations and the means for geospatial data-sharing have been transformed into a more dynamic, more transparent, and decentralised form with wide participation. It begins by briefly reviewing historical changes in the employment of geospatial technologies in major devastating disasters, including the Sichuan and Haiti earthquakes (case studies for our geospatial portal project). It goes on to assess changes in the available dataset types and in geospatial disaster responders, as well as the impact of geospatial technological changes on disaster relief efforts. Finally, the paper discusses lessons learned from recent responses and offers some thoughts for future development. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  11. Digital content sewed together within a library catalogue WebLib - The CERN Document Server

    CERN Document Server

    Vigen, Jens

    2002-01-01

    Aggregation, harvesting, personalization techniques, portals, service provision, etc. have all become buzzwords, most of them simply describing what librarians have been doing for hundreds of years. Prior to the Web, few people outside the libraries were concerned about these issues, a situation which today is completely turned upside down. Hopefully the new actors in the arena of knowledge management will take full advantage of all the available "savoir faire". At CERN, the European Organization for Nuclear Research, librarians and informaticians have set up a complete system, WebLib, actually based on the traditional library catalogue. Digital content is, within this framework, being integrated to the highest possible level in order to meet the strong requirements of the particle physics community. The paper gives an overview of the steps CERN has made towards the digital library from the day the laboratory conceived the World Wide Web to the present.

  12. Geo-communication and web-based geospatial infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2005-01-01

    The introduction of web-services as index-portals based on geoinformation has changed the conditions for both content and form of geocommunication. A high number of players and interactions (as well as a very high number of all kinds of information and combinations of these) characterize web-serv...

  13. GeoCENS: A Geospatial Cyberinfrastructure for the World-Wide Sensor Web

    Directory of Open Access Journals (Sweden)

    Steve H.L. Liang

    2013-10-01

    Full Text Available The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of sensor web has thus far not been revealed. In order to harvest the world-wide sensor web’s full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver large amount of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstration of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  14. An Automated End-To-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services

    Science.gov (United States)

    Shah, M.; Verma, Y.; Nandakumar, R.

    2012-07-01

    Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduced business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain. Geospatial Web services solve this problem. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive and involve diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to the service requester based on QoS.
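A minimal sketch of QoS-based best-fit selection: normalise each candidate's non-functional properties and rank by a weighted sum. The metric names and weights are illustrative, and the paper's multi-agent architecture is not reproduced here.

```python
# QoS-driven selection boiled down to a weighted sum over normalised metrics
# in [0, 1]. response_time and cost are inverted because lower is better.
WEIGHTS = {"availability": 0.4, "response_time": 0.4, "cost": 0.2}

def score(qos):
    """Composite QoS score; higher is better."""
    return (WEIGHTS["availability"] * qos["availability"]
            + WEIGHTS["response_time"] * (1.0 - qos["response_time"])
            + WEIGHTS["cost"] * (1.0 - qos["cost"]))

def best_fit(candidates):
    """Return the candidate service with the highest composite score."""
    return max(candidates, key=lambda c: score(c["qos"]))

services = [
    {"name": "wms-a", "qos": {"availability": 0.99, "response_time": 0.7, "cost": 0.5}},
    {"name": "wms-b", "qos": {"availability": 0.90, "response_time": 0.2, "cost": 0.4}},
]
chosen = best_fit(services)
```

Here the slightly less available but much faster and cheaper `wms-b` wins; adjusting the weights encodes different end-user constraints, which is the role the paper assigns to its agents.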

  15. Web mapping system for complex processing and visualization of environmental geospatial datasets

    Science.gov (United States)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, big dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of a unified terminology, the development of environmental geodata access, processing and visualization services as well as client applications turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map legends and corresponding metadata information. It should be noted that modern web mapping systems as integrated geoportal applications are developed based on the SOA and might be considered as complexes of interconnected software tools for working with geospatial data. In the report a complex web mapping system including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive is presented. There are three basic tiers of the GIS web client in it:

    1. A tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format
    2. A tier of JavaScript objects implementing methods handling:
       - NetCDF metadata
       - Task XML objects for configuring user calculations, input and output formats
       - OGC WMS/WFS cartographical services
    3. A graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic

    The metadata tier consists of a number of JSON objects containing technical information describing geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.). The middleware tier of JavaScript objects implementing methods for handling geospatial
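The JSON metadata tier can be pictured with a toy record and a task-validation step over it; the keys mirror the kinds of properties listed in the abstract (resolution, parameters, valid processing methods) but are not the system's actual schema.

```python
import json

# Illustrative JSON metadata record for one dataset in the repository tier;
# key names and values are invented, not the system's real schema.
META_JSON = """{
  "dataset": "era_interim_t2m",
  "spatial_resolution_deg": 0.75,
  "temporal_resolution": "6h",
  "parameters": ["air_temperature"],
  "valid_methods": ["mean", "trend", "anomaly"]
}"""

def validate_task(meta, method, parameter):
    """Check a user-configured processing task against the dataset's metadata."""
    return method in meta["valid_methods"] and parameter in meta["parameters"]

meta = json.loads(META_JSON)
ok = validate_task(meta, "trend", "air_temperature")
bad = validate_task(meta, "eof", "air_temperature")
```

In the described system this check would happen in the middleware tier before a Task XML object is sent off for server-side computation, rejecting method/parameter combinations the dataset does not support.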

  16. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    Science.gov (United States)

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  17. Operational Marine Data Acquisition and Delivery Powered by Web and Geospatial Standards

    Science.gov (United States)

    Thomas, R.; Buck, J. J. H.

    2015-12-01

    As novel sensor types and new platforms are deployed to monitor the global oceans, the volumes of scientific and environmental data collected in the marine context are rapidly growing. In order to use these data in both the traditional operational modes and in innovative "Big Data" applications the data must be readily understood by software agents. One approach to achieving this is the application of both World Wide Web and Open Geospatial Consortium standards: namely Linked Data [1] and Sensor Web Enablement [2] (SWE). The British Oceanographic Data Centre (BODC) is adopting this strategy in a number of European Commission funded projects (NETMAR; SenseOCEAN; Ocean Data Interoperability Platform - ODIP; and AtlantOS) to combine its existing data archiving architecture with SWE components (such as Sensor Observation Services) and a Linked Data interface. These will evolve the data management and data transfer from a process that requires significant manual intervention to an automated operational process enabling the rapid, standards-based, ingestion and delivery of data. This poster will show the current capabilities of BODC and the status of the on-going implementation of this strategy. References: [1] World Wide Web Consortium. (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 7th April 2015. [2] Open Geospatial Consortium. (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8th October 2014.

  18. Investigating Climate Change Issues With Web-Based Geospatial Inquiry Activities

    Science.gov (United States)

    Dempsey, C.; Bodzin, A. M.; Sahagian, D. L.; Anastasio, D. J.; Peffer, T.; Cirucci, L.

    2011-12-01

    In the Environmental Literacy and Inquiry middle school Climate Change curriculum we focus on essential climate literacy principles with an emphasis on weather and climate, Earth system energy balance, greenhouse gases, paleoclimatology, and how human activities influence climate change (http://www.ei.lehigh.edu/eli/cc/). It incorporates a related set of a framework and design principles to provide guidance for the development of the geospatial technology-integrated Earth and environmental science curriculum materials. Students use virtual globes, Web-based tools including an interactive carbon calculator and geologic timeline, and inquiry-based lab activities to investigate climate change topics. The curriculum includes educative curriculum materials that are designed to promote and support teachers' learning of important climate change content and issues, geospatial pedagogical content knowledge, and geographic spatial thinking. The curriculum includes baseline instructional guidance for teachers and provides implementation and adaptation guidance for teaching with diverse learners including low-level readers, English language learners and students with disabilities. In the curriculum, students use geospatial technology tools including Google Earth with embedded spatial data to investigate global temperature changes, areas affected by climate change, evidence of climate change, and the effects of sea level rise on the existing landscape. We conducted a designed-based research implementation study with urban middle school students. Findings showed that the use of the Climate Change curriculum showed significant improvement in urban middle school students' understanding of climate change concepts.

  19. Describing Geospatial Assets in the Web of Data: A Metadata Management Scenario

    Directory of Open Access Journals (Sweden)

    Cristiano Fugazza

    2016-12-01

    Full Text Available Metadata management is an essential enabling factor for geospatial assets because discovery, retrieval, and actual usage of the latter are tightly bound to the quality of these descriptions. Unfortunately, the multi-faceted landscape of metadata formats, requirements, and conventions makes it difficult to identify editing tools that can be easily tailored to the specificities of a given project, workgroup, and Community of Practice. Our solution is a template-driven metadata editing tool that can be customised to any XML-based schema. Its output is constituted by standards-compliant metadata records that also have a semantics-aware counterpart eliciting novel exploitation techniques. Moreover, external data sources can easily be plugged in to provide autocompletion functionalities on the basis of the data structures made available on the Web of Data. Besides presenting the essentials on customisation of the editor by means of two use cases, we extend the methodology to the whole life cycle of geospatial metadata. We demonstrate the novel capabilities enabled by RDF-based metadata representation with respect to traditional metadata management in the geospatial domain.
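The idea of pairing a standards-compliant XML record with a semantics-aware (RDF-style) counterpart can be sketched as one record with two serialisations; the element and predicate names here are invented for illustration and are not those of the editor described.

```python
# One metadata record, two serialisations: schema-style XML for catalogues,
# RDF-like subject-predicate-object triples for Web-of-Data exploitation.
# Element and predicate names (ex:) are hypothetical.
def to_xml(record):
    """Serialise a flat record as nested XML elements."""
    fields = "".join(f"<{k}>{v}</{k}>" for k, v in record.items())
    return f"<metadata>{fields}</metadata>"

def to_triples(record, subject="urn:example:dataset1"):
    """Serialise the same record as RDF-style triples."""
    return [(subject, f"ex:{k}", v) for k, v in record.items()]

record = {"title": "River gauges", "keyword": "hydrography"}
xml = to_xml(record)
triples = to_triples(record)
```

The triple form is what enables the "novel exploitation techniques" the abstract mentions, e.g. linking the `keyword` value to a controlled vocabulary term for autocompletion.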

  20. A Smart Web-Based Geospatial Data Discovery System with Oceanographic Data as an Example

    Directory of Open Access Journals (Sweden)

    Yongyao Jiang

    2018-02-01

    Full Text Available Discovering and accessing geospatial data presents a significant challenge for the Earth sciences community as massive amounts of data are being produced on a daily basis. In this article, we report a smart web-based geospatial data discovery system that mines and utilizes data relevancy from metadata user behavior. Specifically, (1) the system enables semantic query expansion and suggestion to assist users in finding more relevant data; (2) machine-learned ranking is utilized to provide the optimal search ranking based on a number of identified ranking features that can reflect users’ search preferences; (3) a hybrid recommendation module is designed to allow users to discover related data considering metadata attributes and user behavior; (4) an integrated graphic user interface design is developed to quickly and intuitively guide data consumers to the appropriate data resources. As a proof of concept, we focus on a well-defined domain, oceanography, and use oceanographic data discovery as an example. Experiments and a search example show that the proposed system can improve the scientific community’s data search experience by providing query expansion, suggestion, better search ranking, and data recommendation via a user-friendly interface.
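A toy version of the query-expansion and ranking steps: expand the user's term with a small related-term vocabulary, then rank datasets by how many expanded terms they match. The vocabulary and datasets are hand-made examples, not the system's mined relevancy data.

```python
# Hand-made related-term vocabulary standing in for the mined semantic model.
RELATED = {
    "sst": ["sea surface temperature"],
    "salinity": ["practical salinity", "psu"],
}

def expand_query(query):
    """Return the original term plus any related terms (semantic expansion)."""
    terms = [query.lower()]
    terms += RELATED.get(terms[0], [])
    return terms

def search(datasets, query):
    """Rank datasets by how many expanded terms their description contains."""
    terms = expand_query(query)
    hits = [(sum(t in d["description"].lower() for t in terms), d["name"])
            for d in datasets]
    return [name for score, name in sorted(hits, reverse=True) if score > 0]

datasets = [
    {"name": "ghrsst", "description": "Global sea surface temperature fields"},
    {"name": "argo_s", "description": "Argo float practical salinity profiles"},
]
results = search(datasets, "SST")
```

The real system replaces this term-count score with a machine-learned ranking over many features and layers a recommendation module on top; the sketch only shows why expansion recovers matches that a literal "SST" query would miss.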

  1. OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats

    Science.gov (United States)

    Erickson, T. A.; Koziol, B. W.; Rood, R. B.

    2011-12-01

    The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF, grib) while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
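The clip-and-translate idea at the core of the project can be sketched in a few lines: select grid cells whose centres fall inside an area of interest and emit them as GeoJSON-like features. A bounding box stands in for an arbitrary user-supplied vector geometry here, and the in-memory lists stand in for data the real system reads remotely via OPeNDAP.

```python
# Clip a regular lon/lat grid to an area of interest and translate the
# surviving cells into GeoJSON-like point features. Values are made up.
def clip_grid(lons, lats, values, bbox):
    """bbox = (min_lon, min_lat, max_lon, max_lat); returns vector features."""
    min_lon, min_lat, max_lon, max_lat = bbox
    features = []
    for i, lat in enumerate(lats):
        for j, lon in enumerate(lons):
            if min_lon <= lon <= max_lon and min_lat <= lat <= max_lat:
                features.append({
                    "type": "Feature",
                    "geometry": {"type": "Point", "coordinates": [lon, lat]},
                    "properties": {"value": values[i][j]},
                })
    return features

lons = [-105.0, -104.0, -103.0]
lats = [39.0, 40.0]
values = [[280.1, 281.2, 282.3], [279.0, 280.5, 281.9]]
feats = clip_grid(lons, lats, values, (-104.5, 38.5, -102.5, 39.5))
```

The real system performs this intersection against arbitrary polygons (via Shapely) and serialises to several vector formats rather than plain dicts, but the data-volume reduction happens at exactly this step.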

  2. Automating Geospatial Visualizations with Smart Default Renderers for Data Exploration Web Applications

    Science.gov (United States)

    Ekenes, K.

    2017-12-01

    This presentation will outline the process of creating a web application for exploring large amounts of scientific geospatial data using modern automated cartographic techniques. Traditional cartographic methods, including data classification, may inadvertently hide geospatial and statistical patterns in the underlying data. This presentation demonstrates how to use smart web APIs that quickly analyze the data when it loads and provide suggestions for the most appropriate visualizations based on the statistics of the data. Since there are just a few ways to visualize any given dataset well, it is imperative to provide smart default color schemes tailored to the dataset as opposed to static defaults. Since many users don't go beyond default values, it is imperative that they are provided with smart default visualizations. Multiple functions for automating visualizations are available in the Smart APIs, along with UI elements allowing users to create more than one visualization for a dataset, since there isn't a single best way to visualize a given dataset. Since bivariate and multivariate visualizations are particularly difficult to create effectively, this automated approach removes the guesswork from the process and provides a number of ways to generate multivariate visualizations for the same variables. This allows the user to choose which visualization is most appropriate for their presentation. The methods used in these APIs and the renderers generated by them are not available elsewhere. The presentation will show how statistics can be used as the basis for automating default visualizations of data along continuous ramps, creating more refined visualizations while revealing the spread and outliers of the data. Adding interactive components to instantaneously alter visualizations allows users to unearth spatial patterns previously unknown among one or more variables. These applications may focus on a single dataset that is frequently updated, or configurable
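A sketch of deriving a "smart default" continuous color ramp from the data's own statistics (mean plus/minus one standard deviation) rather than static breaks; the stop structure and colors are illustrative, not the actual Smart API output.

```python
import statistics

# Derive continuous-ramp stops from the loaded data's statistics so the
# default rendering reveals spread and outliers instead of hiding them.
# The returned structure is a hypothetical renderer description.
def default_ramp(values, low_color="#2c7bb6", high_color="#d7191c"):
    """Build ramp stops at mean - sd, mean, and mean + sd of the data."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    return {
        "type": "continuous",
        "stops": [
            {"value": mean - sd, "color": low_color},
            {"value": mean, "color": "#ffffbf"},
            {"value": mean + sd, "color": high_color},
        ],
    }

ramp = default_ramp([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
```

Because the stops are recomputed whenever the dataset loads, a frequently updated layer keeps a meaningful default visualization without manual reclassification.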

  3. GIS-and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Wei [Colorado School of Mines, Golden, CO (United States); Minnick, Matthew [Colorado School of Mines, Golden, CO (United States); Geza, Mengistu [Colorado School of Mines, Golden, CO (United States); Murray, Kyle [Colorado School of Mines, Golden, CO (United States); Mattson, Earl [Colorado School of Mines, Golden, CO (United States)

    2012-09-30

    The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development in October of 2008. The ultimate goal of this research project is to develop a water resource geospatial infrastructure that serves as “baseline data” for creating solutions for water resource management and for supporting decision making on oil shale resource development. The project came to an end on September 30, 2012. This final project report presents the key findings from the project activity, major accomplishments, and expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase as well as other various deliverables stored on digital storage media will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activity include the quantitative spatial and temporal distribution of the water resource throughout the Piceance Basin, water consumption with respect to oil shale production, and data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase, automated data processing scripts (Matlab) for database linkage with the surface water and geological models, an ArcGIS model for hydrogeologic data processing for groundwater model input, a 3D geological model, surface water/groundwater models, an energy resource development systems model, as well as a web-based geospatial infrastructure for data exploration, visualization and dissemination. This research will have broad impacts on the development of oil shale resources in the US. The geodatabase provides “baseline” data for further study of oil shale development and identification of further data collection needs. The 3D geological model provides better understanding through data interpolation and

  4. Testing OGC Web Feature and Coverage Service performance: Towards efficient delivery of geospatial data

    Directory of Open Access Journals (Sweden)

    Gregory Giuliani

    2013-12-01

    Full Text Available OGC Web Feature Service (WFS) and Web Coverage Service (WCS) specifications allow interoperable access to distributed geospatial data made available through spatial data infrastructures (SDIs). To ensure that a service is sufficiently responsive to fulfill users’ expectations and requirements, the performance of services must be measured and monitored to track latencies, bottlenecks, and errors that may negatively influence their overall quality. Despite the importance of data retrieval and access, little research has been published on this topic, and what exists mostly concentrates on the usability of services when integrating distributed data sources. Considering these issues, this paper extends and validates the FOSS4G approach to measure the server-side performance of different WFS and WCS services provided by various software implementations, and provides guidance to data providers looking to improve the quality of their services. Our results show that the performance of the tested implementations is generally satisfactory and that memory tuning and data/storage optimization are essential to improve the efficiency and reliability of services.
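Server-side performance measurement of the kind described reduces to timing repeated identical requests and summarising the latencies. The request function is stubbed below; in practice it would issue an HTTP GetFeature or GetCoverage call against the service under test.

```python
import statistics
import time

# Benchmark skeleton: run the same request N times, record per-request
# latency with a monotonic clock, and report summary statistics.
def benchmark(request_fn, runs=20):
    """Time `request_fn` over `runs` invocations and summarise latencies."""
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        request_fn()  # stub; would be a WFS GetFeature / WCS GetCoverage call
        latencies.append(time.perf_counter() - start)
    return {
        "runs": runs,
        "mean_s": statistics.mean(latencies),
        "max_s": max(latencies),
    }

# A cheap stand-in workload in place of a real network request.
report = benchmark(lambda: sum(range(1000)), runs=5)
```

Comparing `mean_s` and `max_s` across software implementations, payload sizes, and concurrency levels is what exposes the bottlenecks and tuning needs the paper discusses.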

  5. Web-Based Geospatial Visualization of GPM Data with CesiumJS

    Science.gov (United States)

    Lammers, Matt

    2018-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology have made online visualization of large geospatial datasets, such as those coming from precipitation satellites, viable. These data benefit from being visualized on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS (http://cesiumjs.org), developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. This presentation will describe how CesiumJS has been used in three-dimensional visualization products developed as part of the NASA Precipitation Processing System (PPS) STORM data-order website. Existing methods of interacting with Global Precipitation Measurement (GPM) Mission data primarily focus on two-dimensional static images, whether displaying vertical slices or horizontal surface/height-level maps. These methods limit interactivity with the robust three-dimensional data coming from the GPM core satellite. Integrating the data with CesiumJS in a web-based user interface has allowed us to create the following products. We have linked an on-the-fly visualization tool for any GPM/partner satellite orbit with the data-order interface. A version of this tool also focuses on high-impact weather events. It enables viewing of combined radar and microwave-derived precipitation data on mobile devices and in a way that can be embedded into other websites. We have also used CesiumJS to visualize a method of integrating gridded precipitation data with modeled wind speeds that animates over time. Emphasis in the presentation will be placed on how a variety of technical methods were used to create these tools, and on how the flexibility of the CesiumJS framework facilitates creative approaches to interacting with the data.
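    As an illustration of how such a tool might feed CesiumJS, the sketch below builds a minimal CZML document, the JSON packet format CesiumJS can load via its CzmlDataSource, for a handful of precipitation samples. The function name and the sample coordinates are invented for the example; the document packet and cartographicDegrees layout follow the CZML format.

```python
import json

def gpm_orbit_czml(name, samples):
    """Build a minimal CZML document: one mandatory document packet plus one
    point entity per (lon, lat, height_m) sample, for display above terrain."""
    doc = [{"id": "document", "name": name, "version": "1.0"}]
    for i, (lon, lat, height) in enumerate(samples):
        doc.append({
            "id": f"sample-{i}",
            # CZML cartographicDegrees order is [longitude, latitude, height]
            "position": {"cartographicDegrees": [lon, lat, height]},
            "point": {"pixelSize": 6},
        })
    return json.dumps(doc)

czml = gpm_orbit_czml("gpm-overpass", [(-80.2, 25.8, 4000.0), (-80.0, 26.4, 4500.0)])
print(czml)
```

    A server endpoint returning such a document lets any CesiumJS page, including an embedded or mobile one, render the samples in 3D.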

  6. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    Science.gov (United States)

    Mazzetti, Paolo

    2010-05-01

    In recent decades two main paradigms for resource sharing have emerged and reached maturity: the Web and the Grid. Both have proved suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (i.e. data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS) most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disasters Management, Environmental Sciences, etc. On the other hand, several application areas need to run complex models which require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies might be a valuable approach to enable advanced ESS applications. Currently both geo-information and Grid technologies have reached a high level of maturity, allowing to build such an

  7. Globalization and Mobilization of Earth Science Education with GeoBrain Geospatial Web Service Technology

    Science.gov (United States)

    Deng, M.; di, L.

    2005-12-01

    The need for Earth science education to prepare students as a globally-trained geoscience workforce has increased tremendously with the globalization of the economy. However, current academic programs often have difficulty providing students with world-view training or experiences in a global context, due to a lack of resources and suitable teaching technology. This paper presents a NASA-funded project with insights into and solutions for this problem. The project aims to establish a geospatial data-rich learning and research environment that enables students, faculty and researchers from institutes all over the world to easily access, analyze and model with the huge amount of NASA EOS data, just as if they possessed those vast resources locally at their desktops. With this environment, classroom demonstration and training for students to deal with global climate and environment issues for any part of the world are possible in any classroom with an Internet connection. Globalization and mobilization of Earth science education can be truly realized through the environment. This project, named NASA EOS Higher Education Alliance: Mobilization of NASA EOS Data and Information through Web Services and Knowledge Management Technologies for Higher Education Teaching and Research, is built on solid technology and infrastructure foundations including web service technology, NASA EOS data resources, and open interoperability standards. An open, distributed, standards-compliant, interoperable web-based system, called GeoBrain, is being developed by this project to provide a data-rich on-line learning and research environment. The system allows users to dynamically and collaboratively develop interoperable, web-executable geospatial process and analysis modules and models, and run them on-line against any part of the peta-byte archives to get back customized information products rather than raw data.
The system makes a data-rich globally-capable Earth science learning and research

  8. WebGL Visualisation of 3D Environmental Models Based on Finnish Open Geospatial Data Sets

    Science.gov (United States)

    Krooks, A.; Kahkonen, J.; Lehto, L.; Latvala, P.; Karjalainen, M.; Honkavaara, E.

    2014-08-01

    Recent developments in spatial data infrastructures have enabled real-time GIS analysis and visualization using open input data sources and service interfaces. In this study we present a new concept where metric point clouds derived from national open airborne laser scanning (ALS) and photogrammetric image data are processed, analyzed, and finally visualised through open service interfaces to produce user-driven analysis products for targeted areas. The concept is demonstrated in three environmental applications: assessment of forest storm damage, assessment of volumetric changes in an open pit mine, and 3D city model visualization. One of the main objectives was to study the usability and requirements of national-level photogrammetric imagery in these applications. The results demonstrated that user-driven 3D geospatial analyses are possible with the proposed approach and current technology; for instance, a landowner could easily assess the number of fallen trees within his property borders after a storm using any web browser. On the other hand, our study indicated that there are still many uncertainties, especially due to the insufficient standardization of photogrammetric products and processes and their quality indicators.

  9. Bim-Gis Integrated Geospatial Information Model Using Semantic Web and Rdf Graphs

    Science.gov (United States)

    Hor, A.-H.; Jadidi, A.; Sohn, G.

    2016-06-01

    In recent years, 3D virtual indoor/outdoor urban modelling has become a key spatial information framework for many civil and engineering applications such as evacuation planning, emergency and facility management. For accomplishing such sophisticated decision tasks, there is a large demand for building multi-scale and multi-sourced 3D urban models. Currently, Building Information Models (BIM) and Geographical Information Systems (GIS) are broadly used as the modelling sources. However, sharing data and exchanging information between the two modelling domains is still a huge challenge, as existing syntactic or semantic approaches do not fully support the exchange of rich semantic and geometric information from BIM into GIS or vice versa. This paper proposes a novel approach for integrating BIM and GIS using semantic web technologies and Resource Description Framework (RDF) graphs. The novelty of the proposed solution comes from the benefits of integrating BIM and GIS technologies into one unified model, the so-called Integrated Geospatial Information Model (IGIM). The proposed approach consists of three main modules: construction of BIM-RDF and GIS-RDF graphs, integration of the two RDF graphs, and querying of information through the IGIM-RDF graph using SPARQL. The IGIM generates queries from both the BIM and GIS RDF graphs, resulting in a semantically integrated model with entities representing both BIM classes and GIS feature objects with respect to the target client application. The linkage between BIM-RDF and GIS-RDF is achieved through SPARQL endpoints and defined by a query using a set of datasets and entity classes with complementary properties, relationships and geometries. To validate the proposed approach and its performance, a case study was also conducted using the IGIM system design.
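    The linkage idea can be illustrated with a toy example: two small triple sets standing in for the BIM-RDF and GIS-RDF graphs, merged by set union and traversed with naive triple-pattern matching (the basic operation a SPARQL engine performs). All prefixes, entities and the bridge predicate are made up for the sketch; a real implementation would use an RDF library and actual SPARQL endpoints.

```python
# Hypothetical building-model triples ("BIM-RDF" stand-in)
BIM = {
    ("bim:Door_12", "rdf:type", "bim:Door"),
    ("bim:Door_12", "bim:inSpace", "bim:Room_3"),
}
# Hypothetical geographic-feature triples ("GIS-RDF" stand-in)
GIS = {
    ("gis:Room_3", "rdf:type", "gis:IndoorSpace"),
    ("gis:Room_3", "gis:hasGeometry", "POLYGON((0 0, 4 0, 4 3, 0 3, 0 0))"),
}
# owl:sameAs-style bridges linking entities across the two graphs
LINKS = {
    ("bim:Room_3", "owl:sameAs", "gis:Room_3"),
}

def match(graph, s=None, p=None, o=None):
    """Naive triple-pattern matching; None acts as a SPARQL variable."""
    return [t for t in graph
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

igim = BIM | GIS | LINKS  # RDF graph merge is just set union of triples

# "Which geometry does the space containing Door_12 have?"
space = match(igim, s="bim:Door_12", p="bim:inSpace")[0][2]
gis_space = match(igim, s=space, p="owl:sameAs")[0][2]
geometry = match(igim, s=gis_space, p="gis:hasGeometry")[0][2]
print(geometry)
```

    The query crosses the BIM graph, the bridge, and the GIS graph in three hops, which is exactly the kind of cross-domain traversal the integrated model enables.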

  10. Enhancing the Teaching of Digital Processing of Remote Sensing Image Course through Geospatial Web Processing Services

    Science.gov (United States)

    di, L.; Deng, M.

    2010-12-01

    Remote sensing (RS) is an essential method of collecting data for Earth science research. Huge amounts of remote sensing data, most of them in image form, have been acquired. Almost all geography departments in the world offer courses in digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amounts of multi-source images to solve real-world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles to effectively teaching such courses remain. The major obstacles include 1) difficulties in finding, accessing, integrating and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent developments in geospatial Web processing service systems, which make massive data, computing power, and processing capabilities available to average Internet users anywhere in the world, promise the removal of these obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in the GRASS open-source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high-performance cluster server with large disk storage and a fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing, taught at the Department of Geography and Geoinformation Science, George Mason University.
The course uses the textbook "Introductory

  11. An atlas of classification. Signage between open shelves, the Web and the catalogue

    Directory of Open Access Journals (Sweden)

    Andrea Fabbrizzi

    2014-05-01

    This signage is based on cross-media communication, and integrates the library's modes of communication at various levels, both within the same medium and across different media: between the signs on the end panels of the shelves, between these signs and the library's website, and between the website and the catalogue. Mobile devices such as tablets and smartphones are particularly well suited to this integrated system, because they provide access to the Web while moving among the shelves. The direct link between the classified open shelves and the catalogue is made possible by QR codes printed on the signs.

  12. Geospatial health

    DEFF Research Database (Denmark)

    Utzinger, Jürg; Rinaldi, Laura; Malone, John B.

    2011-01-01

    Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000 and the inaugural issue of its official journal was published in November 2006 with the aim to cover all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member strong editorial board.

  13. A "Neogeographical Education"? The Geospatial Web, GIS and Digital Art in Adult Education

    Science.gov (United States)

    Papadimitriou, Fivos

    2010-01-01

    Neogeography provides a link between the science of geography and digital art. The carriers of this link are geospatial technologies (global navigational satellite systems such as the global positioning system, Geographical Information System [GIS] and satellite imagery) along with ubiquitous information and communication technologies (such as…

  14. Design and Development of a Framework Based on Ogc Web Services for the Visualization of Three Dimensional Large-Scale Geospatial Data Over the Web

    Science.gov (United States)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. WebGISes are traditionally bound to a bi-dimensional, simplified representation of reality, and though they successfully address the lack of flexibility and simplicity of traditional desktop clients, a lot of effort is still needed to reach desktop GIS features such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, allowing traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.

  15. Arab Libraries’ Web-based OPACs: An evaluative study in the light of IFLA’s Guidelines For Online Public Access Catalogue (OPAC Displays

    Directory of Open Access Journals (Sweden)

    Sherif Kamel Shaheen

    2005-03-01

    Full Text Available The research aims at evaluating Arab libraries' Web-based catalogues in the light of the principles and recommendations published in IFLA's Guidelines for OPAC Displays (September 30, 2003 Draft for Worldwide Review). The 38 recommendations were categorized under three main headings: User Needs (12 recommendations), Content and Arrangement (25 recommendations), and Standardization (one recommendation). However, that number increased to 88 elements when the recommendations were formulated as evaluative criteria and included in the study's checklist.

  16. GeoSemOLAP: Geospatial OLAP on the Semantic Web Made Easy

    DEFF Research Database (Denmark)

    Gur, Nurefsan; Nielsen, Jacob; Hose, Katja

    2017-01-01

    The proliferation of spatial data and the publication of multidimensional (MD) data on the Semantic Web (SW) have led to new opportunities for Spatial On-Line Analytical Processing (SOLAP) over spatial data using SPARQL. However, formulating such queries results in verbose statements and can easily become

  17. Injury surveillance in low-resource settings using Geospatial and Social Web technologies

    Directory of Open Access Journals (Sweden)

    Schuurman Nadine

    2010-05-01

    Full Text Available Abstract Background Extensive public health gains have benefited high-income countries in recent decades; however, citizens of low- and middle-income countries (LMIC) have largely not enjoyed the same advancements. This is in part due to the fact that public health data - the foundation for public health advances - are rarely collected in many LMIC. Injury data are particularly scarce in many low-resource settings, despite the huge associated burden of morbidity and mortality. Advances in freely accessible and easy-to-use information and communication technology (ICT) may provide the impetus for increased public health data collection in settings with limited financial and personnel resources. Methods and Results A pilot study was conducted at a hospital in Cape Town, South Africa to assess the utility and feasibility of using free (non-licensed) and easy-to-use Social Web and GeoWeb tools for injury surveillance in low-resource settings. Data entry, geocoding, data exploration, and data visualization were successfully conducted using these technologies, including Google Spreadsheet, Mapalist, BatchGeocode, and Google Earth. Conclusion This study examined the potential for Social Web and GeoWeb technologies to contribute to public health data collection and analysis in low-resource settings through an injury surveillance pilot study conducted in Cape Town, South Africa. The success of this study illustrates the great potential for these technologies to be leveraged for public health surveillance in resource-constrained environments, given their ease of use and low cost, and the sharing and collaboration capabilities they afford. The possibilities and potential limitations of these technologies are discussed in relation to the study, and to the field of public health in general.

  18. Injury surveillance in low-resource settings using Geospatial and Social Web technologies.

    Science.gov (United States)

    Cinnamon, Jonathan; Schuurman, Nadine

    2010-05-24

    Extensive public health gains have benefited high-income countries in recent decades; however, citizens of low- and middle-income countries (LMIC) have largely not enjoyed the same advancements. This is in part due to the fact that public health data - the foundation for public health advances - are rarely collected in many LMIC. Injury data are particularly scarce in many low-resource settings, despite the huge associated burden of morbidity and mortality. Advances in freely accessible and easy-to-use information and communication technology (ICT) may provide the impetus for increased public health data collection in settings with limited financial and personnel resources. A pilot study was conducted at a hospital in Cape Town, South Africa to assess the utility and feasibility of using free (non-licensed) and easy-to-use Social Web and GeoWeb tools for injury surveillance in low-resource settings. Data entry, geocoding, data exploration, and data visualization were successfully conducted using these technologies, including Google Spreadsheet, Mapalist, BatchGeocode, and Google Earth. This study examined the potential for Social Web and GeoWeb technologies to contribute to public health data collection and analysis in low-resource settings through an injury surveillance pilot study conducted in Cape Town, South Africa. The success of this study illustrates the great potential for these technologies to be leveraged for public health surveillance in resource-constrained environments, given their ease of use and low cost, and the sharing and collaboration capabilities they afford. The possibilities and potential limitations of these technologies are discussed in relation to the study, and to the field of public health in general.

  19. Dynamic Science Data Services for Display, Analysis and Interaction in Widely-Accessible, Web-Based Geospatial Platforms, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — TerraMetrics, Inc., proposes a Phase II R/R&D program to implement the TerraBlocksTM Server architecture that provides geospatial data authoring, storage and...

  20. Catalogue of Icelandic Volcanoes

    Science.gov (United States)

    Ilyinskaya, Evgenia; Larsen, Gudrun; Gudmundsson, Magnus T.; Vogfjord, Kristin; Pagneux, Emmanuel; Oddsson, Bjorn; Barsotti, Sara; Karlsdottir, Sigrun

    2016-04-01

    The Catalogue of Icelandic Volcanoes is a newly developed open-access web resource in English intended to serve as an official source of information about active volcanoes in Iceland and their characteristics. The Catalogue forms part of an integrated volcanic risk assessment project in Iceland, GOSVÁ (commenced in 2012), as well as being part of the effort of FUTUREVOLC (2012-2016) to establish an Icelandic volcano supersite. Volcanic activity in Iceland occurs on volcanic systems that usually comprise a central volcano and a fissure swarm. Over 30 systems have been active during the Holocene (the time since the end of the last glaciation - approximately the last 11,500 years). In the last 50 years, over 20 eruptions have occurred in Iceland, displaying very varied activity in terms of eruption styles, eruptive environments, eruptive products and the distribution of lava and tephra. Although basaltic eruptions are most common, the majority of eruptions are explosive, not least due to magma-water interaction in ice-covered volcanoes. Extensive research has taken place on Icelandic volcanism, with the results reported in numerous scientific papers and other publications. In 2010, the International Civil Aviation Organisation (ICAO) funded a 3-year project to collate the current state of knowledge and create a comprehensive catalogue readily available to decision makers, stakeholders and the general public. The work on the Catalogue began in 2011, and was then further supported by the Icelandic government and the EU through the FP7 project FUTUREVOLC. The Catalogue of Icelandic Volcanoes is a collaboration of the Icelandic Meteorological Office (the state volcano observatory), the Institute of Earth Sciences at the University of Iceland, and the Civil Protection Department of the National Commissioner of the Icelandic Police, with contributions from a large number of specialists in Iceland and elsewhere. The Catalogue is built up of chapters with texts and various

  1. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    Science.gov (United States)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application

  2. Technology Catalogue

    International Nuclear Information System (INIS)

    1994-02-01

    The Department of Energy's Office of Environmental Restoration and Waste Management (EM) is responsible for remediating its contaminated sites and managing its waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste management programs within EM's Office of Environmental Restoration and Office of Waste Management. The purpose of the Technology Catalogue is to provide performance data on OTD-developed technologies to scientists and engineers assessing and recommending technical solutions within the Department's clean-up and waste management programs, as well as to industry, other federal and state agencies, and the academic community. OTD's applied research and demonstration activities are conducted in programs referred to as Integrated Demonstrations (IDs) and Integrated Programs (IPs). The IDs test and evaluate systems, consisting of coupled technologies, at specific sites to address generic problems, such as the sensing, treatment, and disposal of buried waste containers. The IPs support applied research activities in specific application areas, such as in situ remediation, efficient separations processes, and site characterization. The Technology Catalogue is a means of communicating the status of the development of these innovative technologies. The FY93 Technology Catalogue features technologies successfully demonstrated in the field through IDs and sufficiently mature to be used in the near term.
Technologies from the following IDs are featured in the FY93 Technology Catalogue: Buried Waste ID (Idaho National Engineering Laboratory, Idaho); Mixed Waste Landfill ID (Sandia National Laboratories, New Mexico); Underground Storage Tank ID (Hanford, Washington); Volatile organic compound (VOC) Arid ID (Richland, Washington); and VOC Non-Arid ID (Savannah River Site, South Carolina)

  3. Leveraging the geospatial advantage

    Science.gov (United States)

    Ben Butler; Andrew Bailey

    2013-01-01

    The Wildland Fire Decision Support System (WFDSS) web-based application leverages geospatial data to inform strategic decisions on wildland fires. A specialized data team, working within the Wildland Fire Management Research Development and Application group (WFM RD&A), assembles authoritative national-level data sets defining values to be protected. The use of...

  4. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    Science.gov (United States)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

    With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from the massive and heterogeneous resources. Past decades' developments witnessed the operation of many service components to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge, for the following reasons: 1) Entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata. It is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and the overhead of maintaining data consistency. 3) Heterogeneous semantics are an issue in data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues. However, integrating semantic technologies with existing services is challenging due to the expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results.
Furthermore, the presentation of the value
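    Issue 3 above, keyword matching versus semantic similarity, can be sketched in a few lines: rank metadata records by token overlap with the query, optionally expanding the query through a tiny synonym table. The records and the thesaurus are invented for the example; real discovery services would use ontologies and richer similarity measures.

```python
def jaccard(a, b):
    """Set-overlap similarity between two token sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def rank(query, records, synonyms=None):
    """Rank metadata records by token overlap with the (optionally
    synonym-expanded) query; a stand-in for semantic similarity evaluation."""
    synonyms = synonyms or {}
    q = set(query.lower().split())
    for term in list(q):
        q |= set(synonyms.get(term, []))
    scored = [(jaccard(q, set(r.lower().split())), r) for r in records]
    return [r for s, r in sorted(scored, reverse=True) if s > 0]

records = [
    "MODIS land surface temperature dataset",
    "precipitation gauge network metadata",
    "SRTM elevation model",
]
# Plain keyword matching misses the relevant record; a tiny thesaurus finds it.
print(rank("rainfall data", records))
print(rank("rainfall data", records, synonyms={"rainfall": ["precipitation"]}))
```

    The gap between the two calls is precisely the precision/recall problem the abstract attributes to keyword-only search.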

  5. Geospatial Technology

    Science.gov (United States)

    Reed, Philip A.; Ritz, John

    2004-01-01

    Geospatial technology refers to a system that is used to acquire, store, analyze, and output data in two or three dimensions. This data is referenced to the earth by some type of coordinate system, such as a map projection. Geospatial systems include thematic mapping, the Global Positioning System (GPS), remote sensing (RS), telemetry, and…

  6. AN AUTOMATED END-TO-END MULTI-AGENT QOS BASED ARCHITECTURE FOR SELECTION OF GEOSPATIAL WEB SERVICES

    Directory of Open Access Journals (Sweden)

    M. Shah

    2012-07-01

    With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to the service requester based on QoS.
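    A sketch of the selection step such a system might perform, using simple additive weighting over normalized QoS attributes (one common approach to QoS-based selection, not necessarily the paper's exact method). The service names and QoS figures are hypothetical.

```python
def normalize(values, benefit=True):
    """Scale QoS values to [0, 1]; for cost-type attributes such as response
    time, lower is better, so the scale is inverted."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    return [(v - lo) / (hi - lo) if benefit else (hi - v) / (hi - lo) for v in values]

def best_service(services, weights):
    """Pick the service with the highest weighted QoS score."""
    names = list(services)
    rt = normalize([services[n]["response_ms"] for n in names], benefit=False)
    av = normalize([services[n]["availability"] for n in names], benefit=True)
    scores = {n: weights["response"] * r + weights["availability"] * a
              for n, r, a in zip(names, rt, av)}
    return max(scores, key=scores.get), scores

services = {  # hypothetical providers of the same geospatial operation
    "wps-a": {"response_ms": 900, "availability": 0.99},
    "wps-b": {"response_ms": 250, "availability": 0.95},
    "wps-c": {"response_ms": 400, "availability": 0.97},
}
winner, scores = best_service(services, {"response": 0.6, "availability": 0.4})
print(winner)
```

    Note the trade-off: wps-b is fastest and wps-a most available, but the weighted score selects the balanced wps-c.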

  7. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    Science.gov (United States)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on modern web and GIS technologies. The paper describes the Web GIS for complex processing and visualization of geospatial (mainly NetCDF and PostGIS format) datasets as an integral part of the dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, realizing the functionality of interaction with the computational core backend and the WMS/WFS/WPS cartographical services, as well as implementing an open API for browser-based client software. Being the secondary one, this part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part comprising the Web GIS client, developed as a "single page application" based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to that of popular desktop GIS applications such as uDIG, QuantumGIS, etc. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development. 
In line with general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map

  8. Dynamic Science Data Services for Display, Analysis and Interaction in Widely-Accessible, Web-Based Geospatial Platforms, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — TerraMetrics, Inc., proposes an SBIR Phase I R/R&D program to investigate and develop a key web services architecture that provides data processing, storage and...

  9. Cataloguer's workstation: implications for cataloguing theory and ...

    African Journals Online (AJOL)

    The significance of the Technical Services Workstation (TSW) both in the effective retrieval of information and in cataloguing is discussed. A well-defined future in the development and management of the Technical Services Workstation is recommended. Keywords: developing countries, cataloguing, practice, theory, workstation

  10. Geospatial Authentication

    Science.gov (United States)

    Lyle, Stacey D.

    2009-01-01

    A software package has been developed that uses GPS signal structures to authenticate mobile devices into a network wirelessly and in real time, granting access to critical geospatial information only when the rover(s) are within a designated set of boundaries or a specific area. The advantage is that the system admits to the server only those within the designated geospatial boundaries or areas. The Geospatial Authentication software has two parts: server and client. The server software is a virtual private network (VPN) developed on the Linux operating system using the Perl programming language. The server can be a stand-alone VPN server or can be combined with other applications and services. The client software is GUI-based Windows CE software (Mobile Graphical Software) that allows users to authenticate into a network. The purpose of the client software is to pass the needed satellite information to the server for authentication.

  11. Approach to Facilitating Geospatial Data and Metadata Publication Using a Standard Geoservice

    Directory of Open Access Journals (Sweden)

    Sergio Trilles

    2017-04-01

    Nowadays, the existence of metadata is one of the most important aspects of effective discovery of geospatial data published in Spatial Data Infrastructures (SDIs). However, because the data workflow lacks efficient, integrated mechanisms to assist users in metadata generation, a lot of low-quality and outdated metadata are stored in the catalogues. This paper presents a mechanism for generating and publishing metadata through a publication service. This mechanism is provided as a web service implemented with a standard interface, a Web Processing Service, which improves interoperability with other SDI components. This work extends previous research in which a publication service was designed in the framework of the European Directive Infrastructure for Spatial Information in Europe (INSPIRE) as a solution to assist users in automatically publishing geospatial data and metadata in order to improve, among other aspects, SDI maintenance and usability. This work also adds extra features in order to support more geospatial formats, such as sensor data.

  12. ESO Catalogue Facility Design and Performance

    Science.gov (United States)

    Moins, C.; Retzlaff, J.; Arnaboldi, M.; Zampieri, S.; Delmotte, N.; Forchí, V.; Klein Gebbinck, M.; Lockhart, J.; Micol, A.; Vera Sequeiros, I.; Bierwirth, T.; Peron, M.; Romaniello, M.; Suchar, D.

    2013-10-01

    The ESO Phase 3 Catalogue Facility provides investigators with the possibility to ingest catalogues resulting from ESO public surveys and large programs, and to query and download their content according to positional and non-positional criteria. It relies on a chain of tools that covers the complete workflow from submission to validation and ingestion into the ESO archive and catalogue repository, plus a web application to browse and query catalogues. This repository consists of two components. One is a Sybase ASE relational database where catalogue metadata are stored. The second is a Sybase IQ data warehouse where the content of each catalogue is ingested into a specific table that returns all records matching a user's query. Spatial indexing has been implemented in Sybase IQ to speed up positional queries; it relies on the Spherical Geometry Toolkit from Johns Hopkins University, which implements the Hierarchical Triangular Mesh (HTM) algorithm. HTM is based on a recursive decomposition of the celestial sphere into spherical triangles and the assignment of an index to each of them. It has been complemented with optimized indexes on the non-positional columns that are likely to be frequently used as query constraints. First tests performed on catalogues such as 2MASS have confirmed that this approach provides a very good level of performance and a smooth user experience that are likely to facilitate the scientific exploitation of the catalogues.
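The recursive decomposition behind HTM can be sketched in a few lines: starting from the eight faces of an octahedron, each spherical triangle is split at its edge midpoints, and the child containing the query direction is recorded at each level. The face labels and child ordering below follow common HTM conventions but are illustrative only, not ESO's exact implementation:

```python
# Simplified Hierarchical Triangular Mesh (HTM) sketch: recursively subdivide
# the eight faces of an octahedron into spherical triangles and build an index
# string for the trixel containing a given direction vector.

def _norm(v):
    n = sum(x * x for x in v) ** 0.5
    return (v[0] / n, v[1] / n, v[2] / n)

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def _mid(a, b):
    # Normalized edge midpoint: projects the chord midpoint back onto the sphere.
    return _norm((a[0] + b[0], a[1] + b[1], a[2] + b[2]))

def _inside(p, tri):
    # p lies inside a spherical triangle (counterclockwise seen from outside)
    # if it is on the inner side of each of the three great-circle edges.
    a, b, c = tri
    eps = -1e-12
    return (_dot(p, _cross(a, b)) >= eps and
            _dot(p, _cross(b, c)) >= eps and
            _dot(p, _cross(c, a)) >= eps)

# Octahedron vertices and the eight base spherical triangles.
V = [(0, 0, 1), (1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0), (0, 0, -1)]
FACES = {
    "S0": (V[1], V[5], V[2]), "S1": (V[2], V[5], V[3]),
    "S2": (V[3], V[5], V[4]), "S3": (V[4], V[5], V[1]),
    "N0": (V[1], V[0], V[4]), "N1": (V[4], V[0], V[3]),
    "N2": (V[3], V[0], V[2]), "N3": (V[2], V[0], V[1]),
}

def htm_id(p, depth):
    """Return an index string like 'N33' for the trixel containing p."""
    p = _norm(p)
    for name, tri in FACES.items():
        if _inside(p, tri):
            break
    else:
        raise ValueError("point not located")
    v0, v1, v2 = tri
    digits = []
    for _ in range(depth):
        w0, w1, w2 = _mid(v1, v2), _mid(v0, v2), _mid(v0, v1)
        children = ((v0, w2, w1), (v1, w0, w2), (v2, w1, w0), (w0, w1, w2))
        for i, child in enumerate(children):
            if _inside(p, child):
                digits.append(str(i))
                v0, v1, v2 = child
                break
    return name + "".join(digits)

print(htm_id((1, 1, 1), 0))  # base face containing the (+,+,+) octant
print(htm_id((1, 1, 1), 3))  # deeper trixel index
```

Because each level appends one digit, trixels with a common prefix are spatially nested, which is what makes the index usable as a range constraint in a relational database.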

  13. Geospatial Engineering

    Science.gov (United States)

    2017-02-22

    fashion, as the staff furthers its analysis, the level of detail required to fulfill the staff's additional geospatial information requirements...freeze in winter, and subterranean channels and outlets may shift in location. D-5. Wells may yield large quantities of water if they tap into...the unclassified network, the Secret Internet Protocol Router Network, and on the Combined Enterprise Regional Information Exchange System−Korea

  14. Geospatial Services Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: To process, store, and disseminate geospatial data to the Department of Defense and other Federal agencies.DESCRIPTION: The Geospatial Services Laboratory...

  15. The IKEA Catalogue

    DEFF Research Database (Denmark)

    Brown, Barry; Bleecker, Julian; D'Adamo, Marco

    2016-01-01

    This paper is an introduction to the "Future IKEA Catalogue", enclosed here as an example of a design fiction produced from a long-standing industrial-academic collaboration. We introduce the catalogue here by discussing some of our experiences using design fiction with companies and public sector...

  16. Real-Time Geospatial Data Viewer (RETIGO): Web-Based Tool for Researchers and Citizen Scientists to Explore their Air Measurements

    Science.gov (United States)

    The collection of air measurements in real-time on moving platforms, such as wearable, bicycle-mounted, or vehicle-mounted air sensors, is becoming an increasingly common method to investigate local air quality. However, visualizing and analyzing geospatial air monitoring data re...

  17. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    Science.gov (United States)

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  18. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    OpenAIRE

    Ra Nagisetty Venkateswara; Anyanwu Matthew N; Viangteeravat Teeradache; Kamel Boulos Maged N; Kuscu Emin

    2011-01-01

    Abstract The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of hum...

  19. The new OGC Catalogue Services 3.0 specification - status of work

    Science.gov (United States)

    Bigagli, Lorenzo; Voges, Uwe

    2013-04-01

    We report on the work of the Open Geospatial Consortium Catalogue Services 3.0 Standards Working Group (OGC Cat 3.0 SWG for short), started in March 2008 with the purpose of processing change requests on the Catalogue Services 2.0.2 Implementation Specification (OGC 07-006r1) and producing a revised version thereof, comprising the related XML schemas and abstract test suite. The work was initially intended as a minor revision (version 2.1), but was later retargeted as a major update of the standard and rescheduled (the anticipated roadmap ended in 2008). The target audience of Catalogue Services 3.0 includes: • Implementors of catalogue services solutions. • Designers and developers of catalogue services profiles. • Providers/users of catalogue services. The two main general areas of enhancement were: restructuring the specification document according to the OGC standard for modular specifications (OGC 08-131r3, also known as the Core and Extension model); and incorporating current mass-market technologies for discovery on the Web, namely OpenSearch. The document was initially split into four parts: the general model and the three protocol bindings HTTP, Z39.50, and CORBA. The CORBA binding, which was very rarely implemented, and the Z39.50 binding have since been dropped. Parts of the Z39.50 binding, namely Search/Retrieve via URL (SRU; same semantics as Z39.50, but stateless), have been provided as a discussion paper (OGC 12-082) for possibly developing a future SRU profile. The Catalogue Services 3.0 specification is structured as follows: • Part 1: General Model (Core) • Part 2: HTTP Protocol Binding (CSW) In CSW, the GET/KVP encoding is mandatory; the POST/XML encoding is optional, and SOAP is supported as a special case of the POST/XML encoding. OpenSearch must always be supported, regardless of the implemented profiles, along with the OpenSearch Geospatial and Temporal Extensions (OGC 10-032r2). The latter specifies spatial (e.g. point-plus-radius, bounding
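As a concrete illustration of the mandatory GET/KVP encoding mentioned above, the sketch below assembles a GetRecords URL with an OpenSearch-style keyword term. The endpoint is hypothetical, and the parameter spellings follow common CSW conventions; verify them against the published HTTP binding before relying on them:

```python
# Sketch: building a CSW GetRecords request in the mandatory GET/KVP encoding.
# The endpoint URL is hypothetical; parameter names follow common CSW
# conventions and should be checked against the OGC specification.
from urllib.parse import urlencode

def getrecords_url(endpoint, keyword, max_records=10):
    params = {
        "service": "CSW",
        "version": "3.0.0",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "elementSetName": "summary",
        "q": keyword,              # OpenSearch-style keyword query
        "maxRecords": max_records,
    }
    return endpoint + "?" + urlencode(params)

url = getrecords_url("https://example.org/csw", "land cover")
print(url)
```

The same parameters could equally be carried in the optional POST/XML encoding; KVP is shown here because the abstract notes it is the mandatory one.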

  20. NCI's Distributed Geospatial Data Server

    Science.gov (United States)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach of batch-processing data and storing all the output for later analysis rapidly becomes infeasible, and often requires additional work to publish for others to use. Recent developments in distributed computing using interactive access to significant cloud infrastructure open the door to new ways of processing data on demand, alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. The system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems, such as handling different file formats and data types or harmonising coordinate projections and temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. 
The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under
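Distributing an aggregation among nodes, as the abstract describes, typically follows a map-reduce pattern: each node computes a partial aggregate over its chunk and a merge step combines the partials. A minimal schematic of the general pattern, not NCI's actual implementation:

```python
# Schematic of distributing an aggregation across data chunks: each "node"
# returns a partial (sum, count), and the merge step combines them into the
# global mean without any node seeing the full dataset.

def partial_mean(chunk):
    """Work done independently on one node/chunk."""
    return (sum(chunk), len(chunk))

def merge(partials):
    partials = list(partials)
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count

chunks = [[1.0, 2.0], [3.0], [4.0, 5.0, 6.0]]
print(merge(partial_mean(c) for c in chunks))  # same as mean of all values
```

Aggregates such as min, max, sum and count merge the same way; quantiles and medians need sketch structures, which is one reason on-demand services restrict the set of user-defined reductions.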

  1. Global polar geospatial information service retrieval based on search engine and ontology reasoning

    Science.gov (United States)

    Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang

    2007-01-01

    In order to improve the access precision of polar geospatial information services on the web, a new methodology for retrieving global spatial information services is proposed, based on geospatial service search and ontology reasoning: the geospatial service search finds coarse candidate services on the web, and ontology reasoning refines the coarse results. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include search-engine-based service discovery and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.
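The two-stage retrieval described above, a coarse search followed by ontology-based refinement, can be sketched with a toy subclass hierarchy. The class names and services below are hypothetical; a real system would use OWL/RDF tooling rather than a dict:

```python
# Sketch of coarse search + ontology refinement. The ontology classes and
# service names are invented for illustration.

SUBCLASS_OF = {          # child -> parent
    "SeaIceConcentrationService": "IceObservationService",
    "IceObservationService": "PolarGeospatialService",
}

def is_a(cls, ancestor):
    """Transitive subclass check: the simplest form of ontology reasoning."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = SUBCLASS_OF.get(cls)
    return False

# Coarse results from a keyword-based service search: (name, ontology class).
services = [
    ("vvv-ice-wms", "SeaIceConcentrationService"),
    ("topo-wms", "TopographyService"),
]

refined = [n for n, c in services if is_a(c, "PolarGeospatialService")]
print(refined)  # only the polar service survives refinement
```

The refinement step raises precision: services that merely matched keywords but do not belong to the requested concept are filtered out.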

  2. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

    The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. Several important research issues must be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid, as an effective distributed computing paradigm, is a good choice: it uses a series of middleware components to interconnect and merge various distributed resources into a super-computer with high-performance computation capability. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment; the Grid provides the utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic and industry communities. Therefore, we propose an integrated framework incorporating OWS standards into Grids. With this framework, distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.

  3. Catalogue 2.0 the future of the library catalogue

    CERN Document Server

    Chambers, Sally

    2014-01-01

    Brings together some of the foremost international cataloguing practitioners and thought leaders, including Lorcan Dempsey, Emmanuelle Bermès, Marshall Breeding and Karen Calhoun, to provide an overview of the current state of the art of the library catalogue and look ahead to see what the library catalogue might become.

  4. Influence of automated cataloguing system on manual cataloguing ...

    African Journals Online (AJOL)

    This study examined the automation of cataloguing and classification practices in academic libraries in South-West Nigeria and the effect the automated cataloguing system has had on manual cataloguing in the libraries. The study population comprised 110 library professional and paraprofessional personnel working in ...

  5. Introduction to geospatial semantics and technology workshop handbook

    Science.gov (United States)

    Varanka, Dalia E.

    2012-01-01

    The workshop is a tutorial on introductory geospatial semantics with hands-on exercises using standard Web browsers. The workshop is divided into two sections, general semantics on the Web and specific examples of geospatial semantics using data from The National Map of the U.S. Geological Survey and the Open Ontology Repository. The general semantics section includes information and access to publicly available semantic archives. The specific session includes information on geospatial semantics with access to semantically enhanced data for hydrography, transportation, boundaries, and names. The Open Ontology Repository offers open-source ontologies for public use.

  6. Developing a Tile-Based Rendering Method to Improve Rendering Speed of 3D Geospatial Data with HTML5 and WebGL

    Directory of Open Access Journals (Sweden)

    Seokchan Kang

    2017-01-01

    A dedicated plug-in traditionally had to be installed to visualize three-dimensional (3D) city modeling spatial data in web-based applications. However, plug-in methods are gradually becoming obsolete, owing to their limited performance with respect to installation errors, unsupported cross-browser operation, and security vulnerabilities. In particular, in 2015 NPAPI support was terminated in most existing web browsers except Internet Explorer. To overcome these problems, the HTML5/WebGL technology (the next-generation web standard, confirmed in October 2014) emerged. In particular, WebGL is able to display 3D spatial data without plug-ins in browsers. In this study, we identify the requirements and limitations of displaying 3D city modeling spatial data using HTML5/WebGL, and we propose an alternative method based on a bin-packing algorithm that aggregates individual 3D city modeling data, including buildings, into tile units. The proposed method reduces the operational complexity and the number and volume of transmissions required for rendering, improving the speed of 3D data rendering. The proposed method was validated on real data to evaluate its effectiveness for 3D visualization of city modeling data in web-based applications.
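The aggregation idea can be illustrated with a classic first-fit-decreasing bin-packing sketch: building models (sized here by a made-up triangle count) are grouped into tiles with a fixed capacity, so fewer, larger units are transmitted and rendered. This is a generic packing heuristic, not necessarily the paper's exact algorithm:

```python
# First-fit-decreasing bin packing sketch: aggregate individual building
# models into tiles with a fixed capacity, reducing the number of
# transmissions/draw calls. Sizes and capacity are invented for illustration.

def pack_into_tiles(sizes, capacity):
    tiles = []  # each tile is a list of building sizes
    for size in sorted(sizes, reverse=True):
        for tile in tiles:           # first tile with room wins
            if sum(tile) + size <= capacity:
                tile.append(size)
                break
        else:                        # no tile fits: open a new one
            tiles.append([size])
    return tiles

buildings = [120, 340, 80, 500, 60, 220]
tiles = pack_into_tiles(buildings, capacity=600)
print(len(tiles), tiles)
```

Six separate models collapse into three tile-sized batches here; in a renderer that translates directly into fewer requests and draw calls.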

  7. Geospatial Data Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...

  8. National Geospatial Program

    Science.gov (United States)

    Carswell, William J.

    2011-01-01

    The National Geospatial Program (NGP; http://www.usgs.gov/ngpo/) satisfies the needs of customers by providing geospatial products and services that customers incorporate into their decision-making and operational activities. These products and services provide geospatial data that are organized and maintained in cost-effective ways and developed by working with partners and organizations whose activities align with those of the program. To accomplish its mission, the NGP organizes, maintains, publishes, and disseminates the geospatial baseline of the Nation's topography, natural landscape, and manmade environment through The National Map

  9. INTERACT Station Catalogue - 2015

    DEFF Research Database (Denmark)

    INTERACT stations are located in all major environmental envelopes of the Arctic, providing an ideal platform for studying climate change and its impact on the environment and local communities. Since alpine environments face similar changes and challenges as the Arctic, the INTERACT network also includes some alpine stations located outside the Arctic. The INTERACT research stations provide an ideal platform for circumarctic research and monitoring. Activities span from small short-term research projects to larger long-term monitoring programmes. The stations are thus visited by many researchers and research groups. Therefore, INTERACT has produced a catalogue of research stations including descriptions of the physical setting, facilities and services offered at the stations. It is our hope that this catalogue will help researchers identify research stations that suit their specific needs. The 2015...

  10. Technology Catalogue. First edition

    Energy Technology Data Exchange (ETDEWEB)

    1994-02-01

    The Department of Energy's Office of Environmental Restoration and Waste Management (EM) is responsible for remediating its contaminated sites and managing its waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste management programs within EM's Office of Environmental Restoration and Office of Waste Management. The purpose of the Technology Catalogue is to provide performance data on OTD-developed technologies to scientists and engineers assessing and recommending technical solutions within the Department's clean-up and waste management programs, as well as to industry, other federal and state agencies, and the academic community. OTD's applied research and demonstration activities are conducted in programs referred to as Integrated Demonstrations (IDs) and Integrated Programs (IPs). The IDs test and evaluate systems, consisting of coupled technologies, at specific sites to address generic problems, such as the sensing, treatment, and disposal of buried waste containers. The IPs support applied research activities in specific application areas, such as in situ remediation, efficient separations processes, and site characterization. The Technology Catalogue is a means for communicating the status of the development of these innovative technologies. The FY93 Technology Catalogue features technologies successfully demonstrated in the field through IDs and sufficiently mature to be used in the near term. 
Technologies from the following IDs are featured in the FY93 Technology Catalogue: Buried Waste ID (Idaho National Engineering Laboratory, Idaho); Mixed Waste Landfill ID (Sandia National Laboratories, New Mexico); Underground Storage Tank ID (Hanford, Washington); Volatile organic compound (VOC) Arid ID (Richland, Washington); and VOC Non-Arid ID (Savannah River Site, South Carolina).

  11. Catalogue of theses

    International Nuclear Information System (INIS)

    Paranjpe, S.V.

    1975-01-01

    The catalogue lists 442 theses submitted by the scientists of the Bhabha Atomic Research Centre, since its inception, to the various universities in India and abroad for the award of M. Sc. and Ph. D. degrees. Theses are grouped under broad subject headings which are arranged in the order of Universal Decimal Classification Scheme. In addition to the author and guide index, a detailed subject index is appended which enhances the utility of the compilation. (S.V.P.)

  12. The geospatial data quality REST API for primary biodiversity data.

    Science.gov (United States)

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

    We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues, as well as general errors, in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under the GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
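The completeness and consistency checks such a service performs can be illustrated with a tiny, self-contained sketch. The flag names below are invented for illustration and do not reflect the API's actual response schema:

```python
# Hedged sketch of typical geospatial quality checks on a biodiversity
# occurrence record: coordinate range tests plus a common-error heuristic.
# Flag names are invented; consult the API documentation for real ones.

def geospatial_flags(lat, lon):
    flags = []
    if not -90 <= lat <= 90:
        flags.append("invalid_latitude")
    if not -180 <= lon <= 180:
        flags.append("invalid_longitude")
    if (lat, lon) == (0, 0):
        flags.append("zero_zero_coordinates")  # frequent data-entry artifact
    return flags

print(geospatial_flags(100, 20))     # ['invalid_latitude']
print(geospatial_flags(0, 0))        # ['zero_zero_coordinates']
print(geospatial_flags(40.4, -3.7))  # []
```

Production services add consistency tests this sketch omits, such as checking that the coordinates fall inside the stated country, which require reference geometries.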

  13. Python geospatial development

    CERN Document Server

    Westra, Erik

    2013-01-01

    This is a tutorial-style book that will teach usage of Python tools for GIS using simple practical examples, and then show you how to build a complete mapping application from scratch. The book assumes basic knowledge of Python; no knowledge of Open Source GIS is required. It is aimed at experienced Python developers who want to learn about geospatial concepts, work with geospatial data, solve spatial problems, and build map-based applications. This book will be useful to those who want to get up to speed with Open Source GIS in order to build GIS applications or integrate geospatial features into their existing ap

  14. European wind turbine catalogue

    International Nuclear Information System (INIS)

    1994-01-01

    The THERMIE European Community programme is designed to promote the greater use of European technology and this catalogue contributes to the fulfillment of this aim by dissemination of information on 50 wind turbines from 30 manufacturers. These turbines are produced in Europe and are commercially available. The manufacturers presented produce and sell grid-connected turbines which have been officially approved in countries where this approval is acquired, however some of the wind turbines included in the catalogue have not been regarded as fully commercially available at the time of going to print. The entries, which are illustrated by colour photographs, give company profiles, concept descriptions, measured power curves, prices, and information on design and dimension, safety systems, stage of development, special characteristics, annual energy production, and noise pollution. Lists are given of wind turbine manufacturers and agents and of consultants and developers in the wind energy sector. Exchange rates used in the conversion of the prices of wind turbines are also given. Information can be found on the OPET network (organizations recognised by the European Commission as an Organization for the Promotion of Energy Technologies (OPET)). An article describes the development of the wind power industry during the last 10-15 years and another article on certification aims to give an overview of the most well-known and acknowledged type approvals currently issued in Europe. (AB)

  15. Usare WebDewey

    OpenAIRE

    Baldi, Paolo

    2016-01-01

    This presentation shows how to use the WebDewey tool. Features of WebDewey. Italian WebDewey compared with American WebDewey. Querying Italian WebDewey. Italian WebDewey and MARC21. Italian WebDewey and UNIMARC. Numbers, captions, "equivalente verbale": Dewey decimal classification in Italian catalogues. Italian WebDewey and Nuovo soggettario. Italian WebDewey and LCSH. Italian WebDewey compared with printed version of Italian Dewey Classification (22. edition): advantages and disadvantages o...

  16. Semantic Sensor Web Enablement for COAST Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Sensor Web Enablement (SWE) is an Open Geospatial Consortium (OGC) standard Service Oriented Architecture (SOA) that facilitates discovery and integration of...

  17. Interoperability in planetary research for geospatial data analysis

    Science.gov (United States)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
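Cached-tile services of the kind listed above (Web Map Tile Services) boil down to a deterministic mapping from coordinates and zoom level to a tile index. The sketch below uses a simple global equirectangular grid, which is common for planetary bodies where Web Mercator is inappropriate; the grid layout itself is an assumption for illustration, since a real service defines it in its tile matrix set:

```python
# Sketch of tile indexing for a WMTS-style cached-tile service on a global
# equirectangular grid: at zoom z there are 2**(z+1) columns and 2**z rows.
# The grid layout is assumed for illustration; real services publish theirs
# in a tile matrix set document.

def tile_for(lon, lat, zoom):
    cols, rows = 2 ** (zoom + 1), 2 ** zoom
    col = min(int((lon + 180.0) / 360.0 * cols), cols - 1)
    row = min(int((90.0 - lat) / 180.0 * rows), rows - 1)
    return col, row

print(tile_for(0.0, 0.0, 1))      # (2, 1)
print(tile_for(-180.0, 90.0, 3))  # (0, 0)
```

Because the mapping is stateless, clients can request exactly the tiles covering their viewport and caches can serve them without any server-side rendering.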

  18. VIRAC: the VVV Infrared Astrometric Catalogue

    Science.gov (United States)

    Smith, L. C.; Lucas, P. W.; Kurtev, R.; Smart, R.; Minniti, D.; Borissova, J.; Jones, H. R. A.; Zhang, Z. H.; Marocco, F.; Contreras Peña, C.; Gromadzki, M.; Kuhn, M. A.; Drew, J. E.; Pinfield, D. J.; Bedin, L. R.

    2018-02-01

    We present VIRAC version 1, a near-infrared proper motion and parallax catalogue of the VISTA Variables in the Via Lactea (VVV) survey for 312 587 642 unique sources averaged across all overlapping pawprint and tile images covering 560 deg2 of the bulge of the Milky Way and southern disc. The catalogue includes 119 million high-quality proper motion measurements, of which 47 million have statistical uncertainties below 1 mas yr-1. The parallaxes show good agreement with the Tycho-Gaia Astrometric Solution, though caution is advised for data with modest significance. The SQL database housing the data is made available via the web. We give example applications for studies of Galactic structure, nearby objects (low-mass stars and brown dwarfs, subdwarfs, white dwarfs) and kinematic distance measurements of young stellar objects. Nearby objects discovered include LTT 7251 B, an L7 benchmark companion to a G dwarf with over 20 published elemental abundances, a bright L subdwarf, VVV 1256-6202, with extremely blue colours, and nine new members of the 25 pc sample. We also demonstrate why this catalogue remains useful in the era of Gaia. Future versions will be based on profile-fitting photometry, use the Gaia absolute reference frame and incorporate the longer time baseline of the VVV extended survey.
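As a toy illustration of the kind of query the web-accessible SQL database supports, the quality cut mentioned above (proper-motion uncertainties below 1 mas yr-1) can be expressed in SQL. The table and column names here are hypothetical, not the actual VIRAC schema, and the in-memory SQLite table stands in for the real server:

```python
import sqlite3

# Toy stand-in for a VIRAC-like source table; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE virac
                (source_id INTEGER, pmra REAL, pmdec REAL, pm_err REAL)""")
rows = [(1, 4.1, -2.0, 0.8),   # pm_err below 1 mas/yr: passes the cut
        (2, 10.5, 3.3, 2.4),   # fails the cut
        (3, -6.7, 1.1, 0.5)]   # passes the cut
conn.executemany("INSERT INTO virac VALUES (?, ?, ?, ?)", rows)

# Select sources meeting the < 1 mas/yr proper-motion uncertainty cut
good = conn.execute(
    "SELECT source_id FROM virac WHERE pm_err < 1.0 ORDER BY source_id"
).fetchall()
```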

  19. Library Catalogue Users Are Influenced by Trends in Web Searching Search Strategies. A review of: Novotny, Eric. “I Don’t Think I Click: A Protocol Analysis Study of Use of a Library Online Catalog in the Internet Age.” College & Research Libraries 65.6 (Nov. 2004): 525-37.

    Directory of Open Access Journals (Sweden)

    Susan Haigh

    2006-09-01

    Full Text Available Objective – To explore how Web-savvy users think about and search an online catalogue. Design – Protocol analysis study. Setting – Academic library (Pennsylvania State University Libraries). Subjects – Eighteen users (17 students, 1 faculty member) of an online public access catalog, divided into two groups of nine first-time and nine experienced users. Method – The study team developed five tasks that represented a range of activities commonly performed by library users, such as searching for a specific item, identifying a library location, and requesting a copy. Seventeen students and one faculty member, divided evenly between novice and experienced searchers, were recruited to “think aloud” through the performance of the tasks. Data were gathered through audio recordings, screen capture software, and investigator notes. The time taken for each task was recorded, and investigators rated task completion as “successful,” “partially successful,” “fail,” or “search aborted.” After the searching session, participants were interviewed to clarify their actions and provide further commentary on the catalogue search. Main results – Participants in both test groups were relatively unsophisticated subject searchers. They made minimal use of Boolean operators, and tended not to repair failed searches by rethinking the search vocabulary and using synonyms. Participants did not have a strong understanding of library catalogue contents or structure and showed little curiosity in developing an understanding of how to utilize the catalogue. Novice users were impatient both in choosing search options and in evaluating their search results. They assumed search results were sorted by relevance, and thus would not typically browse past the initial screen. They quickly followed links, fearlessly tried different searches and options, and rapidly abandoned false trails. Experienced users were more effective and efficient searchers than

  20. NREL Information Resources Catalogue 1999

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-04-03

    This is the sixth annual catalogue listing documents produced by NREL during the last fiscal year. Each year the catalogue is mailed to state energy offices, DOE support offices, and to anyone looking to find out more information about NREL's activities and publications.

  1. Radioisotopes and radiopharmaceuticals catalogue

    International Nuclear Information System (INIS)

    2002-01-01

    The Chilean Nuclear Energy Commission (CCHEN) presents its radioisotopes and radiopharmaceuticals 2002 catalogue. In it we find physical characteristics of 9 different reactor-produced radioisotopes (Tc-99m, I-131, Sm-153, Ir-192, P-32, Na-24, K-42, Cu-64, Rb-86), 7 radiopharmaceuticals (MDP, DTPA, DMSA, Disida, Phytate, S-Colloid, Red Blood Cells In-Vivo, Red Blood Cells In-Vitro) and 4 labelled compounds (DMSA-Tc99m, DTPA-Tc99m, MIBG-I131, EDTMP-Sm153). In the near future the number of items will be increased with new reactor and cyclotron products. Our production system will be certified under ISO 9000 in March 2003. CCHEN is interested in being a national and an international supplier of these products (RS)

  2. Technology catalogue. Second edition

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    The Department of Energy's (DOE's) Office of Environmental Management (EM) is responsible for remediating DOE contaminated sites and managing the DOE waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste-management programs within EM. The purpose of the Technology Catalogue is to: (a) provide performance data on OTD-developed technologies to scientists and engineers responsible for preparing Remedial Investigation/Feasibility Studies (RI/FSs) and other compliance documents for the DOE's clean-up and waste-management programs; and (b) identify partnering and commercialization opportunities with industry, other federal and state agencies, and the academic community.

  3. Technology catalogue. Second edition

    International Nuclear Information System (INIS)

    1995-04-01

    The Department of Energy's (DOE's) Office of Environmental Management (EM) is responsible for remediating DOE contaminated sites and managing the DOE waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste-management programs within EM. The purpose of the Technology Catalogue is to: (a) provide performance data on OTD-developed technologies to scientists and engineers responsible for preparing Remedial Investigation/Feasibility Studies (RI/FSs) and other compliance documents for the DOE's clean-up and waste-management programs; and (b) identify partnering and commercialization opportunities with industry, other federal and state agencies, and the academic community

  4. A Catalogue of marine biodiversity indicators

    Directory of Open Access Journals (Sweden)

    Heliana Teixeira

    2016-11-01

    Full Text Available A Catalogue of Marine Biodiversity Indicators was developed with the aim of providing the basis for assessing the environmental status of the marine ecosystems. Useful for the implementation of the Marine Strategy Framework Directive (MSFD), this catalogue allows the navigation of a database of indicators mostly related to biological diversity, non-indigenous species, food webs, and seafloor integrity. Over 600 indicators were compiled, which were developed and used in the framework of different initiatives (e.g. EU policies, research projects) and in national and international contexts (e.g. Regional Seas Conventions, and assessments in non-European seas). The catalogue reflects the current scientific capability to address environmental assessment needs by providing a broad coverage of the most relevant indicators for marine biodiversity and ecosystem integrity. The available indicators are reviewed according to their typology, data requirements, development status, geographical coverage, relevance to habitats or biodiversity components, and related human pressures. Through this comprehensive overview, we discuss the potential of the current set of indicators in a wide range of contexts, from large-scale to local environmental programs, and we also address shortcomings in light of current needs. Developed by the DEVOTES Project, the catalogue is freely available through the DEVOTool software application, which provides browsing and query options for the associated metadata. The tool allows extraction of ranked indicator lists best fulfilling selected criteria, enabling users to search for suitable indicators to address a particular biodiversity component, ecosystem feature, habitat or pressure in a marine area of interest. This tool is useful for EU Member States, Regional Sea Conventions, the European Commission, non-governmental organizations, managers, scientists and any person interested in marine environmental assessment. It allows users to
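The ranked-list extraction that DEVOTool provides can be sketched as a simple scoring function over indicator metadata: each indicator gets one point per fulfilled criterion, and the list is sorted by score. The field names and records below are hypothetical illustrations, not the actual DEVOTool schema:

```python
# Toy metadata records mimicking catalogue indicator entries
# (field names are hypothetical, not the DEVOTool schema).
indicators = [
    {"name": "A", "component": "food webs", "habitat": "pelagic",
     "pressures": {"fishing", "eutrophication"}},
    {"name": "B", "component": "seafloor integrity", "habitat": "benthic",
     "pressures": {"abrasion"}},
    {"name": "C", "component": "food webs", "habitat": "benthic",
     "pressures": {"fishing"}},
]

def rank(indicators, **criteria):
    """Rank indicators by how many of the given criteria they fulfil."""
    def score(ind):
        s = 0
        for key, wanted in criteria.items():
            value = ind.get(key)
            if isinstance(value, set):
                s += wanted in value   # set-valued field: membership test
            else:
                s += value == wanted   # scalar field: equality test
        return s
    # Stable sort, highest score first
    return sorted(indicators, key=score, reverse=True)

best = rank(indicators, component="food webs", pressures="fishing")
```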

  5. A Catalogue of Marine Biodiversity Indicators

    KAUST Repository

    Teixeira, Heliana

    2016-11-04

    A Catalogue of Marine Biodiversity Indicators was developed with the aim of providing the basis for assessing the environmental status of the marine ecosystems. Useful for the implementation of the Marine Strategy Framework Directive (MSFD), this catalogue allows the navigation of a database of indicators mostly related to biological diversity, non-indigenous species, food webs, and seafloor integrity. Over 600 indicators were compiled, which were developed and used in the framework of different initiatives (e.g., EU policies, research projects) and in national and international contexts (e.g., Regional Seas Conventions, and assessments in non-European seas). The catalogue reflects the current scientific capability to address environmental assessment needs by providing a broad coverage of the most relevant indicators for marine biodiversity and ecosystem integrity. The available indicators are reviewed according to their typology, data requirements, development status, geographical coverage, relevance to habitats or biodiversity components, and related human pressures. Through this comprehensive overview, we discuss the potential of the current set of indicators in a wide range of contexts, from large-scale to local environmental programs, and we also address shortcomings in light of current needs. Developed by the DEVOTES Project, the catalogue is freely available through the DEVOTool software application, which provides browsing and query options for the associated metadata. The tool allows extraction of ranked indicator lists best fulfilling selected criteria, enabling users to search for suitable indicators to address a particular biodiversity component, ecosystem feature, habitat, or pressure in a marine area of interest. This tool is useful for EU Member States, Regional Sea Conventions, the European Commission, non-governmental organizations, managers, scientists, and any person interested in marine environmental assessment. It allows users to build

  6. TOWARDS COMPATIBILITY OF CONTEMPORARY GEOSPATIAL STANDARDS WITH THE FOG COMPUTING CONCEPT

    Directory of Open Access Journals (Sweden)

    E. A. Panidi

    2017-01-01

    Full Text Available This position paper considers the possibility of implementation of the Fog Computing paradigm into contemporary Geographic Information Systems (GISs) and into the geospatial Web services that provide data access. In particular, the paper is focused on the issue of compatibility of the existing geospatial standards developed by the Open Geospatial Consortium (OGC) with the principles of Fog information systems. The WMS, WMTS, WFS, WCS, WPS and CS standards are highlighted. The conclusion is made that the OGC standards can be extended by new request types to ensure the implementation of Fog Computing functionality and inverse compatibility with currently used Cloud-based Web services. Two fundamental problems are highlighted that arise when designing geospatial Fog Web services. The first one is the need to provide processing and management operations on spatial data at client devices (particularly on mobile devices) when the geospatial Fog Web service is deployed on such a device. The second problem is the necessity to ensure geospatial data transmission using the HyperText Transfer Protocol, which is used in contemporary geospatial Web services. The JavaScript programming language and the WebRTC (Web Real-Time Communication) technology are mentioned as examples of basic technologies that can be applied to geospatial Fog Web services. It is concluded that contemporary technologies used in GISs and Web services in general ensure the implementation of Fog Computing for geospatial data management tasks. However, known examples of such implementation do not exist today, and further research and development are required in this direction.
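The extension mechanism the paper argues for, new request types alongside the standard OGC ones, can be sketched as a key-value-pair dispatcher. The "GetLocalCache" request below is a hypothetical example of a Fog-style operation served from a device's local data; it is not part of any OGC standard:

```python
from urllib.parse import parse_qs

def handle_kvp(query_string):
    """Dispatch an OGC-style key-value-pair request by its REQUEST parameter."""
    params = {k.upper(): v[0] for k, v in parse_qs(query_string).items()}
    request = params.get("REQUEST", "")
    handlers = {
        "GetCapabilities": lambda p: "capabilities-document",
        "GetMap": lambda p: "map-image",
        # Hypothetical extension of the kind the paper proposes: a request
        # answered directly by a Fog node (e.g. a mobile device) from local data.
        "GetLocalCache": lambda p: "locally-cached-tiles",
    }
    if request not in handlers:
        raise ValueError("unsupported REQUEST: " + request)
    return handlers[request](params)
```

A standard client keeps working ("GetMap" behaves as before), while a Fog-aware client can issue the extra request type, which matches the inverse-compatibility point made in the abstract.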

  7. The Road to Responsive: University of Toronto Libraries’ Journey to a New Library Catalogue Interface

    Directory of Open Access Journals (Sweden)

    Lisa Gayhart

    2014-01-01

    Full Text Available With the recent surge in the mobile device market and an ever-expanding patron base with increasingly divergent levels of technical ability, the University of Toronto Libraries embarked on the development of a new catalogue discovery layer to fit the needs of its diverse users. The result: a mobile-friendly, flexible and intuitive web application that brings the full power of a faceted library catalogue to users without compromising quality or performance, employing Responsive Web Design principles.

  8. RDA: an innovation in cataloguing

    Directory of Open Access Journals (Sweden)

    Stuart Hunt

    2013-07-01

    Full Text Available With effect from 31 March 2013, Resource Description and Access (RDA) has become the cataloguing content standard used by the British Library and the Library of Congress. Concurrently with these institutions, other libraries, principally in the English-speaking world, have also adopted, or are planning to adopt, RDA. This article will discuss what RDA is, how and why it is an innovation in cataloguing, and will then examine its adoption by libraries. It will also address implications for library catalogues. Particular emphasis will be placed on the pattern of adoption, applying Everett Rogers' categorization to libraries as they implement RDA.

  9. A catalogue quality audit tool

    OpenAIRE

    Chapman, Ann; Massey, Owen

    2002-01-01

    The current need for performance measurement and quality targets for services to users requires suitable performance indicators for libraries to use. This paper looks at the self-assessment audit tool for catalogue quality developed by UKOLN in collaboration with Essex libraries. For the tool a checklist of errors was drawn up, which can then be used to assess the quality of records within a catalogue using a sample of library stock. The tool can be used to assess the quality of catalogue rec...

  10. Geospatial Information Response Team

    Science.gov (United States)

    Witt, Emitt C.

    2010-01-01

    Extreme emergency events of national significance that include manmade and natural disasters seem to have become more frequent during the past two decades. The Nation is becoming more resilient to these emergencies through better preparedness, reduced duplication, and establishing better communications so every response and recovery effort saves lives and mitigates the long-term social and economic impacts on the Nation. The National Response Framework (NRF) (http://www.fema.gov/NRF) was developed to provide the guiding principles that enable all response partners to prepare for and provide a unified national response to disasters and emergencies. The NRF provides five key principles for better preparation, coordination, and response: 1) engaged partnerships, 2) a tiered response, 3) scalable, flexible, and adaptable operations, 4) unity of effort, and 5) readiness to act. The NRF also describes how communities, tribes, States, Federal Government, private-sector, and non-governmental partners apply these principles for a coordinated, effective national response. The U.S. Geological Survey (USGS) has adopted the NRF doctrine by establishing several earth-sciences, discipline-level teams to ensure that USGS science, data, and individual expertise are readily available during emergencies. The Geospatial Information Response Team (GIRT) is one of these teams. The USGS established the GIRT to facilitate the effective collection, storage, and dissemination of geospatial data information and products during an emergency. The GIRT ensures that timely geospatial data are available for use by emergency responders, land and resource managers, and for scientific analysis.
In an emergency and response capacity, the GIRT is responsible for establishing procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing coordinated products and services utilizing the USGS' exceptional pool of

  11. The national atlas as a metaphor for improved use of a national geospatial data infrastructure

    NARCIS (Netherlands)

    Aditya Kurniawan Muhammad, T.

    2007-01-01

    Geospatial Data infrastructures have been developed worldwide. Geoportals have been created as an interface to allow users or the community to discover and use geospatial data offered by providers of these initiatives. This study focuses on the development of a web national atlas as an alternative

  12. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    Science.gov (United States)

    Kulo, Violet; Bodzin, Alec

    2013-01-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade…

  13. Open Geospatial Education

    Directory of Open Access Journals (Sweden)

    Mariana Belgiu

    2015-04-01

    Full Text Available The advances in open data, free and open-source software solutions and open access to research publications have influenced the emergence of open educational resources (OER) initiatives. These initiatives permit access to openly licensed learning resources including courses, webinars, training materials and textbooks. Thereby, an increasing number of users have the opportunity to broaden their knowledge and gain new skills. The goal of this paper is to evaluate open education initiatives in the geospatial domain and their synergies with the open spatial data and software movements. The paper focuses on the Massive Open Online Course (MOOC) movement. The advantages and challenges of open geospatial education will be thoroughly discussed.

  14. Publications catalogue 1982-83

    International Nuclear Information System (INIS)

    1982-04-01

    This catalogue lists the technical reports, papers, speeches, regulatory documents, news releases, information bulletins, notices, and miscellaneous documents issued by the Canadian Atomic Energy Control Board between 1977 and 1982

  15. Catalogue of Tephritidae of Colombia

    Science.gov (United States)

    The present Catalogue includes 93 species and 23 genera of Tephritidae that have been recorded in Colombia. Four subfamilies (Blepharoneurinae, Dacinae, Trypetinae and Tephritinae), and eight tribes (Acrotaeniini, Carpomyini, Dacini, Eutretini, Myopitini, Noeetini, Tephritini, and Toxotrypanini) are...

  16. OpenSearch technology for geospatial resources discovery

    Science.gov (United States)

    Papeschi, Fabrizio; Enrico, Boldrini; Mazzetti, Paolo

    2010-05-01

    set of services for discovery, access, and processing of geospatial resources in a SOA framework. GI-cat is a distributed CSW framework implementation developed by the ESSI Lab of the Italian National Research Council (CNR-IMAA) and the University of Florence. It provides brokering and mediation functionalities towards heterogeneous resources and inventories, exposing several standard interfaces for query distribution. This work focuses on a new GI-cat interface which allows the catalog to be queried according to the OpenSearch syntax specification, thus filling the gap between the SOA architectural design of the CSW and Web 2.0. At the moment, there is no OGC standard specification on this topic, but an official change request has been proposed in order to enable OGC catalogues to support OpenSearch queries. In this change request, an OpenSearch extension is proposed providing a standard mechanism to query a resource based on temporal and geographic extents. Two new catalog operations are also proposed, in order to publish a suitable OpenSearch interface. This extended interface is implemented by the modular GI-cat architecture by adding a new profiling module called "OpenSearch profiler". Since GI-cat also acts as a clearinghouse catalog, another component called "OpenSearch accessor" is added in order to access OpenSearch-compliant services. An important role in the GI-cat extension is played by the adopted mapping strategy. Two different kinds of mappings are required: query mapping and response elements mapping. Query mapping is provided in order to fit the simple OpenSearch query syntax to the more complex CSW query expressed in the OGC Filter syntax. The GI-cat internal data model is based on the ISO-19115 profile, which is more complex than the simple XML syndication formats, such as RSS 2.0 and Atom 1.0, suggested by OpenSearch.
Once response elements are available, in order to be presented, they need to be translated from the GI-cat internal data model, to the above
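An OpenSearch description document advertises a URL template whose placeholders the client fills in, e.g. {searchTerms}, plus {geo:box} and {time:start} from the proposed geo/time extension. A minimal substitution sketch with a hypothetical endpoint (optional parameters, marked with "?", are simply dropped when no value is supplied):

```python
import re

def fill_template(template, values):
    """Substitute OpenSearch URL-template parameters; blank out unfilled ones."""
    def repl(match):
        name = match.group(1)
        return str(values.get(name, ""))
    # Matches {name} and optional {name?} placeholders
    return re.sub(r"\{([^}?]+)\??\}", repl, template)

# Hypothetical OpenSearch endpoint exposing geo/time extension parameters
template = ("https://example.org/csw/opensearch?"
            "q={searchTerms}&bbox={geo:box?}&start={time:start?}")
url = fill_template(template, {"searchTerms": "elevation",
                               "geo:box": "-10,35,5,45"})
```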

  17. Future Teachers' Dispositions toward Teaching with Geospatial Technologies

    Science.gov (United States)

    Jo, Injeong

    2016-01-01

    This study examined the effect of a minimal Web-based GIS experience within a semester-long methods course on enhancing preservice teachers' dispositions regarding the use of geospatial technologies for teaching. Fourteen preservice teachers enrolled in a senior-level methods course offered in geography and focused exclusively on how to teach…

  18. On the moroccan tsunami catalogue

    Directory of Open Access Journals (Sweden)

    F. Kaabouben

    2009-07-01

    Full Text Available A primary tool for regional tsunami hazard assessment is a reliable historical and instrumental catalogue of events. Morocco, by its geographical situation, with two marine sides, stretching along the Atlantic coast to the west and along the Mediterranean coast to the north, is the country of western Africa most exposed to the risk of tsunamis. Previous information on tsunami events affecting Morocco is included in the Iberian and/or the Mediterranean lists of tsunami events, as is the case of the European GITEC Tsunami Catalogue, but there is a need to organize this information in a dataset and to assess the likelihood of claimed historical tsunamis in Morocco. Because Moroccan sources are scarce, this compilation relies on historical documentation from neighbouring countries (Portugal and Spain), and so the compatibility between the new tsunami catalogue presented here and those that correspond to the same source areas is also discussed.

  19. Catalogue of HI PArameters (CHIPA)

    Science.gov (United States)

    Saponara, J.; Benaglia, P.; Koribalski, B.; Andruchow, I.

    2015-08-01

    The Catalogue of HI Parameters of galaxies (CHIPA) is the natural continuation of the compilation by M.C. Martin in 1998. CHIPA provides the most important parameters of nearby galaxies derived from observations of the neutral hydrogen line. The catalogue contains information on 1400 galaxies across the sky and of different morphological types. Parameters such as the optical diameter of the galaxy, the blue magnitude, the distance, the morphological type and the HI extension are listed, among others. Maps of the HI distribution, velocity and velocity dispersion can also be displayed in some cases. The main objective of this catalogue is to facilitate bibliographic queries through a database accessible from the internet that will be available in 2015 (the website is under construction). The database was built using MySQL, an open-source relational database management system (SQL: Structured Query Language), while the website was built with HTML (Hypertext Markup Language) and PHP (Hypertext Preprocessor).

  20. Semantic Sensor Web Enablement for COAST, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Sensor Web Enablement (SWE) is an Open Geospatial Consortium (OGC) standard Service Oriented Architecture (SOA) that facilitates discovery and integration of...

  1. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    Science.gov (United States)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on, and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user cloud computing infrastructure hosting GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that apply to other domains with spatial properties.
We
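The core spatial operation such a platform parallelizes can be illustrated on a single machine: a bounding-box point filter, here in plain Python. In GISpark the same predicate would run over Spark RDD/DataFrame partitions; all names below are illustrative, not the GISpark API:

```python
def in_bbox(point, bbox):
    """Test whether a (lon, lat) point lies inside bbox = (minx, miny, maxx, maxy)."""
    x, y = point
    minx, miny, maxx, maxy = bbox
    return minx <= x <= maxx and miny <= y <= maxy

def bbox_filter(points, bbox):
    # In a Spark setting this predicate would be applied per partition via
    # rdd.filter(...); here we just map it over a local list.
    return [p for p in points if in_bbox(p, bbox)]

pts = [(116.4, 39.9), (121.5, 31.2), (2.35, 48.86)]
query_window = (115.0, 39.0, 118.0, 41.0)  # roughly the Beijing area
hits = bbox_filter(pts, query_window)
```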

  2. International Atomic Energy Agency publications. Publications catalogue 2004

    International Nuclear Information System (INIS)

    2004-03-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2002, 2003 and forthcoming in early 2004. Most IAEA publications are issued in English, though some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  3. Considerations on Geospatial Big Data

    Science.gov (United States)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  4. Enhancing the online discovery of geospatial data through ...

    African Journals Online (AJOL)

    However, geoportals are often known to geoinformation communities only and present technological limitations which make it difficult for general purpose web search engines to discover and index the data catalogued in (or registered with) a geoportal. The mismatch between standard spatial metadata content and the ...

  5. Catalogue of Korean manuscripts and rare books

    DEFF Research Database (Denmark)

    Lerbæk Pedersen, Bent

    2014-01-01

    Catalogue of Korean manuscripts and rare books in The Royal Library, Copenhagen and the National Museum of Denmark

  6. Technologies Connotation and Developing Characteristics of Open Geospatial Information Platform

    Directory of Open Access Journals (Sweden)

    GUO Renzhong

    2016-02-01

    Full Text Available Against the background of developments in surveying, mapping and geoinformation, and aimed at the demands of data fusion, real-time sharing, in-depth processing and personalization, this paper analyzes significant features of geospatial services in the digital city, focusing on the theory, methods and key techniques of an open cloud computing environment, multi-path data updating, full-scale urban geocoding, multi-source spatial data integration, adaptive geo-processing and adaptive Web mapping. On this basis, the Open Geospatial Information Platform is developed and successfully applied in digital Shenzhen.

  7. Effect of Information Communications Technology on Cataloguing ...

    African Journals Online (AJOL)

    ICT also facilitates the exchange of information resources between libraries (7.1%); encourages the use of virtual software and MARC 21 cataloguing (42%); it also facilitates subject heading determination for original cataloguing and makes automatic creation of online catalogues possible (14.3%). Key Words: Effect, Information ...

  8. Retrospective /Backlog Cataloguing: The Experience At The ...

    African Journals Online (AJOL)

    This paper looked at retrospective cataloguing in the Library of the University of Cape Coast (UCC) from 1991 to 2004. It was discovered that over twelve thousand (12,000) volumes of books had not been processed or catalogued in this Library. This means that these materials did not have any traces in the public catalogue ...

  9. Practical cataloguing AACR, RDA and MARC 21

    CERN Document Server

    Welsh, Anne

    2012-01-01

    Written at a time of transition in international cataloguing, this book provides cataloguers and students with a background in general cataloguing principles, the code (AACR2) and format (MARC 21) and the new standard (RDA). It provides library managers with an overview of the development of RDA in order to equip them to make the transition.

  10. Geospatial Technology in Geography Education

    NARCIS (Netherlands)

    Muniz Solari, Osvaldo; Demirci, A.; van der Schee, J.A.

    2015-01-01

    The book is presented as an important starting point for new research in Geography Education (GE) related to the use and application of geospatial technologies (GSTs). For this purpose, the selection of topics was based on central ideas to GE in its relationship with GSTs. The process of geospatial

  11. CERN Technical Training: Autumn 2007 Course Catalogue

    CERN Multimedia

    2007-01-01

    The following course sessions are scheduled in the framework of the CERN Technical Training Program 2007. You may find the full updated Technical Training course programme in our web-catalogue.
    OFFICE SOFTWARE
    - CERN EDMS MTF en pratique (F) 4.9, 1/2 d
    - WORD 2007 (Short Course III) - How to work with long documents (E/F) 14.9, 1/2 d
    - FrontPage 2003 - niveau 1 (E/F) 17-18.9, 2 d
    - WORD 2007 - Niveau 1: ECDL (F) 20-21.9, 2 d
    - ACCESS 2007 - Level 1: ECDL (E) 20-21.9, 2 d
    - EXCEL 2007 (Short Course I) - How to work with Formulae (E/F) 21.9, 1/2 d
    - CERN EDMS Introduction (E) 24.9, 1 d
    - Java 2 Enterprise Edition - Part 1: Web Applications (E) 24-25.9, 2 d
    - Outlook 2007 (Short Course II) - Calendar, Tasks and Notes (E/F) 28.9, 1/2 d
    - Outlook 2007 (Short Course III) - Meeting and Delegation ...

  12. Improvement Of Search Process In Electronic Catalogues

    Directory of Open Access Journals (Sweden)

    Titas Savickas

    2014-05-01

    Full Text Available The paper presents investigation on search in electronic catalogues. The chosen problem domain is the search system in the electronic catalogue of Lithuanian Academic Libraries. The catalogue uses ALEPH system with MARC21 bibliographic format. The article presents analysis of problems pertaining to the current search engine and user expectations related to the search system of the electronic catalogue of academic libraries. Subsequent to analysis, the research paper presents the architecture for a semantic search system in the electronic catalogue that uses search process designed to improve search results for users.

  13. Finding geospatial pattern of unstructured data by clustering routes

    Science.gov (United States)

    Boustani, M.; Mattmann, C. A.; Ramirez, P.; Burke, W.

    2016-12-01

    Today the majority of data generated has a geospatial context to it, either in attribute form as a latitude or longitude, or as the name of a location, or cross-referenceable using other means such as an external gazetteer or location service. Our research is interested in exploiting geospatial location and context in unstructured data such as that found on the web in HTML pages, images, videos, documents, and other areas, and in structured information repositories found on intranets, in scientific environments, and elsewhere. We are working together on the DARPA MEMEX project to exploit open source software tools such as the Lucene Geo Gazetteer, Apache Tika, Apache Lucene, and Apache OpenNLP to automatically extract, and make meaning out of, geospatial information. In particular, we are interested in unstructured descriptors, e.g., a phone number or a named entity, and the ability to automatically learn geospatial paths related to these descriptors. For example, a particular phone number may represent an entity that travels on a monthly basis, according to easily identifiable and sometimes more difficult-to-track patterns. We will present a set of automatic techniques to extract descriptors and then to geospatially infer their paths across unstructured data.
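The extract-then-infer workflow described above can be sketched minimally in Python; the gazetteer entries, phone-number format and documents below are invented toy data, not part of the actual MEMEX toolchain:

```python
import re

# Toy gazetteer mapping place names to (lat, lon); a real system would use
# a service such as the Lucene Geo Gazetteer (these entries are made up).
GAZETTEER = {
    "Los Angeles": (34.05, -118.24),
    "Las Vegas": (36.17, -115.14),
    "Phoenix": (33.45, -112.07),
}

PHONE_RE = re.compile(r"\b\d{3}-\d{3}-\d{4}\b")

def extract_descriptors(text):
    """Pull phone-number descriptors and known place names from free text."""
    phones = PHONE_RE.findall(text)
    places = [name for name in GAZETTEER if name in text]
    return phones, places

def infer_path(docs):
    """Group place mentions by phone number to sketch a geospatial path."""
    paths = {}
    for text in docs:
        phones, places = extract_descriptors(text)
        for phone in phones:
            paths.setdefault(phone, []).extend(GAZETTEER[p] for p in places)
    return paths

docs = [
    "Call 555-123-4567 when you reach Los Angeles.",
    "555-123-4567 was last seen near Las Vegas.",
]
print(infer_path(docs))
```

A production pipeline would replace the regex and dictionary lookup with trained extractors and a full gazetteer, but the grouping of locations by descriptor is the same idea.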

  14. A 'new generation' earthquake catalogue

    Directory of Open Access Journals (Sweden)

    E. Boschi

    2000-06-01

    Full Text Available In 1995, we published the first release of the Catalogo dei Forti Terremoti in Italia, 461 a.C. - 1980, in Italian (Boschi et al., 1995). Two years later this was followed by a second release, again in Italian, that included more earthquakes, more accurate research and a longer time span (461 B.C. to 1990) (Boschi et al., 1997). Aware that the record of Italian historical seismicity is probably the most extensive of the whole world, and hence that our catalogue could be of interest for a wider international readership, Italian was clearly not the appropriate language to share this experience with colleagues from foreign countries. Three years after publication of the second release therefore, and after much additional research and fine tuning of methodologies and algorithms, I am proud to introduce this third release in English. All the tools and accessories have been translated along with the texts describing the development of the underlying research strategies and current contents. The English title is Catalogue of Strong Italian Earthquakes, 461 B.C. to 1997. This Preface briefly describes the scientific context within which the Catalogue of Strong Italian Earthquakes was conceived and progressively developed. The catalogue is perhaps the most important outcome of a well-established joint project between the Istituto Nazionale di Geofisica, the leading Italian institute for basic and applied research in seismology and solid earth geophysics, and SGA (Storia Geofisica Ambiente), a private firm specialising in the historical investigation and systematisation of natural phenomena. In her contribution "Method of investigation, typology and taxonomy of the basic data: navigating between seismic effects and historical contexts", Emanuela Guidoboni outlines the general framework of modern historical seismology and its complex relation with instrumental seismology on the one hand and historical research on the other. This presentation also highlights

  15. On compact galaxies in the UGC catalogue

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1980-01-01

    The problem of separating compact galaxies in the UGC Catalogue is considered. A surface brightness equal to or less than 21ᵐ per square second of arc was used as the compactness criterion; 96 galaxies brighter than 14ᵐ.5 satisfy this criterion. Among the compact galaxies found in the UGC Catalogue, 7% are Zwicky galaxies, 15% belong to the Markarian galaxies and 27% form part of a list of galaxies with high surface brightness. A considerable divergence between estimates of the total share of compact galaxies in the B.A. Vorontsov-Velyaminov Morphological Catalogue of Galaxies (MCG) and in the UGC Catalogue is noted. This divergence results from a systematic underestimation of the apparent sizes of compact galaxies in the MCG as compared with the UGC Catalogue [ru

  16. The two union catalogues of Myanmar

    International Nuclear Information System (INIS)

    Hla, Win

    1995-01-01

    The article describes the two union catalogues of Myanmar. The first is the ''Consolidated Catalogue of journals and the periodicals contained in the libraries of Kasuali, Calcutta, Bombay, Madras, Coonoor, Rangoon and Shillong'', published by the Indian Research Fund Association of Calcutta in 1933. This is the first union catalogue of medical periodicals for both Myanmar and India. The second is ''the Regional Union Catalogue of Scientific Serials: Yangon'', published in 1977, with a second printing in 1989. This union catalogue excludes medical serials. Twenty libraries took part in the compilation and publishing of the union catalogue, with the Technical Information Centre of the Myanmar Scientific and Technological Research Department (formerly the Central Research Organization), No. 6, Kaba Aye Pagoda Road, Yankin P.O., Yangon, Myanmar, taking the leading role

  17. Geospatial Brokering - Challenges and Future Directions

    Science.gov (United States)

    White, C. E.

    2012-12-01

    An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols - e.g., OGC CSW, REST, and OpenSearch - to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight to future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.
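Programmatic access of the kind described above commonly goes through OGC CSW. A minimal sketch of building a CSW 2.0.2 GetRecords request in key-value-pair form follows; the catalogue endpoint and CQL filter are hypothetical:

```python
from urllib.parse import urlencode

def csw_getrecords_url(endpoint, cql_filter, max_records=10):
    """Build a KVP GetRecords request per OGC CSW 2.0.2.

    The endpoint passed in below is invented for illustration; the
    parameter names follow the CSW 2.0.2 specification."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "elementSetName": "summary",
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": cql_filter,
        "maxRecords": str(max_records),
    }
    return endpoint + "?" + urlencode(params)

url = csw_getrecords_url("https://catalog.example.org/csw",
                         "AnyText like '%elevation%'")
print(url)
```

The same URL serves both audiences the abstract mentions: a human can paste it into a browser, while a broker issues it programmatically and parses the returned records.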

  18. Making geospatial data in ASF archive readily accessible

    Science.gov (United States)

    Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.

    2015-12-01

    The way geospatial data are searched, managed, processed and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces that were specifically developed for its particular archive and data sets. ASF then moved to using an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data, yet it was up to the user to develop the tools needed for more tailored access to the data. We present two new approaches for serving data to users. In response to the recent Nepal earthquake we developed a data feed for distributing ESA's Sentinel data. Users can subscribe to the data feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach is an Open Geospatial Consortium (OGC) web feature service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface for including the geospatial information in the archive directly in the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015). Marquez, A., 2015. PostGIS Essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide. Packt Publishing, 350 p.
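A subscriber to the kind of data feed described above only needs to parse each entry for its title and download link. A minimal sketch, assuming an Atom-style feed; the feed content, granule name and URLs below are invented:

```python
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

# Hand-written Atom entry standing in for an ASF-style data feed
# (the granule name and download URL are made up for illustration).
FEED = """\
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <title>S1A_IW_GRDH_20150501</title>
    <link rel="enclosure" href="https://datapool.example.org/S1A_IW_GRDH_20150501.zip"/>
    <updated>2015-05-01T12:00:00Z</updated>
  </entry>
</feed>
"""

def new_granules(feed_xml):
    """Return (title, download URL) pairs for each entry in the feed."""
    root = ET.fromstring(feed_xml)
    out = []
    for entry in root.findall(ATOM + "entry"):
        title = entry.findtext(ATOM + "title")
        link = entry.find(ATOM + "link[@rel='enclosure']")
        out.append((title, link.get("href")))
    return out

print(new_granules(FEED))
```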

  19. Geospatial resources for the geologic community: The USGS National Map

    Science.gov (United States)

    Witt, Emitt C.

    2015-01-01

    Geospatial data are a key component of investigating, interpreting, and communicating the geological sciences. Locating geospatial data can be time-consuming, which detracts from time spent on a study because these data are not obviously placed in central locations or are served from many disparate databases. The National Map of the US Geological Survey is a publicly available resource for accessing the geospatial base map data needs of the geological community from a central location. The National Map data are available through a viewer and download platform providing access to eight primary data themes, plus the US Topo and scanned historical topographic maps. The eight themes are elevation, orthoimagery, hydrography, geographic names, boundaries, transportation, structures, and land cover, and they are being offered for download as predefined tiles in formats supported by leading geographic information system software. Data tiles are periodically refreshed to capture the most current content and are an efficient method for disseminating and receiving geospatial information. Elevation data, for example, are offered as a download from the National Map as 1° × 1° tiles for the 10- and 30- m products and as 15′ × 15′ tiles for the higher-resolution 3-m product. Vector data sets with smaller file sizes are offered at several tile sizes and formats. Partial tiles are not a download option—any prestaged data that intersect the requesting bounding box will be, in their entirety, part of the download order. While there are many options for accessing geospatial data via the Web, the National Map represents authoritative sources of data that are documented and can be referenced for citation and inclusion in scientific publications. Therefore, National Map products and services should be part of a geologist’s first stop for geospatial information and data.
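The tiling behaviour described above, whole tiles for any intersecting bounding box, can be sketched for the 1° × 1° case; this is an illustrative computation, not a National Map API:

```python
import math

def tiles_for_bbox(min_lon, min_lat, max_lon, max_lat):
    """1-degree x 1-degree tiles (SW-corner coordinates) that a bounding
    box intersects.

    Mirrors the download behaviour described above: any tile the box
    touches is included in its entirety; partial tiles are not an option."""
    lons = range(math.floor(min_lon), math.ceil(max_lon))
    lats = range(math.floor(min_lat), math.ceil(max_lat))
    return [(lon, lat) for lat in lats for lon in lons]

# A box straddling four tiles around 92.5 W, 38.5 N:
print(tiles_for_bbox(-92.5, 38.5, -91.5, 39.5))
```

The higher-resolution 3-m elevation product would use the same logic with a 15′ × 15′ grid step instead of 1°.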

  20. Flipping the cataloguing class: equipping and empowering cataloguers for the hybrid cataloguing environment

    OpenAIRE

    Welsh, A.

    2014-01-01

    With not only the Library of Congress and British Library moving to RDA in 2013 (Wiggins, 2012; Danskin, 2013), but also major research libraries including Cambridge University Library, the Bodleian and Trinity College, Dublin (Carty, 2013; O’Reilly, 2013; McManus, 2013), while others are adopting a wait and see approach (Gryspeerdt, 2012), it is not only current cataloguing staff who are required to understand and be able to create records in both the old (AACR2) and the new (RDA) cataloguin...

  1. A Cloud-enabled Service-oriented Spatial Web Portal for Facilitating Arctic Data Discovery, Integration, and Utilization

    Science.gov (United States)

    dias, S. B.; Yang, C.; Li, Z.; XIA, J.; Liu, K.; Gui, Z.; Li, W.

    2013-12-01

    Global climate change has become one of the biggest concerns for humankind in the 21st century due to its broad impacts on society and ecosystems across the world. The Arctic has been observed to be one of the regions most vulnerable to climate change. In order to understand the impacts of climate change on the natural environment, ecosystems, biodiversity and other aspects of the Arctic region, and thus to better support the planning and decision-making process, cross-disciplinary research is required to monitor and analyze changes in the Arctic region such as water, sea level, biodiversity and so on. Conducting such research demands the efficient utilization of various geospatially referenced data, web services and information related to the Arctic region. In this paper, we propose a cloud-enabled and service-oriented Spatial Web Portal (SWP) to support the discovery, integration and utilization of Arctic-related geospatial resources, serving as a building block of a polar CI. This SWP leverages the following techniques: 1) a hybrid searching mechanism combining centralized local search, distributed catalogue search and specialized Internet search for effectively discovering Arctic data and web services from multiple sources; 2) a service-oriented, quality-enabled framework for the seamless integration and utilization of various geospatial resources; and 3) a cloud-enabled parallel spatial-index-building approach to facilitate near-real-time resource indexing and searching. A proof-of-concept prototype is developed to demonstrate the feasibility of the proposed SWP, using an example of analyzing Arctic snow cover change over the past 50 years.
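The hybrid searching mechanism in point 1 amounts to merging and de-duplicating results from several back-ends. A minimal sketch; the source names, search functions and record schema here are invented for illustration:

```python
def hybrid_search(query, sources):
    """Merge results from several catalogue back-ends, de-duplicating by id.

    'sources' maps a source name to a search callable; both the names and
    the result schema are hypothetical."""
    seen, merged = set(), []
    for name, search in sources.items():
        for record in search(query):
            if record["id"] not in seen:
                seen.add(record["id"])
                record["source"] = name  # remember which back-end won
                merged.append(record)
    return merged

# Stand-ins for a centralized local search and a distributed CSW search:
local = lambda q: [{"id": "sea-ice-extent"}, {"id": "snow-cover"}]
remote = lambda q: [{"id": "snow-cover"}, {"id": "permafrost"}]
print(hybrid_search("arctic", {"local": local, "csw": remote}))
```

Ordering the sources by trust (local index first, Internet search last) is what lets the cheaper, curated back-end claim duplicates before the noisier ones.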

  2. Open Source Testing Capability for Geospatial Software

    Science.gov (United States)

    Bermudez, L. E.

    2013-12-01

    Geospatial Software enables scientists to discover, access and process information for better understanding of the Earth. Hundreds, if not thousands, of geospatial software packages exist today. Many of these implement open standards. The OGC Implementation Statistics page [1] reports, for example, more than 450 software products that implement the OGC Web Map Service (WMS) 1.1.1 standard. Even though organizations voluntarily report their products as implementing the WMS standard, not all of these implementations can interoperate with each other. For example, a WMS client may not interact with all these WMS servers in the same functional way. Making the software work with other software, even when implementing the same standard, still remains a challenge, and the main reason is that not all implementations implement the standard correctly. The Open Geospatial Consortium (OGC) Compliance Program provides a testing infrastructure to test for the correct implementation of OGC standards in interfaces and encodings that enable communication between geospatial clients and servers. The OGC testing tool and the tests are all freely available, including the source code and access to the testing facility. The Test, Evaluation, And Measurement (TEAM) Engine is a test harness that executes test suites written using the OGC Compliance Testing Language (CTL) or the TestNG framework. TEAM Engine is available in Sourceforge. OGC hosts an official stable [2] deployment of TEAM Engine with the approved test suites. OGC also hosts a Beta TEAM Engine [3] with the tests in Beta and with new TEAM Engine functionality. Both deployments are freely available to everybody. The OGC testing infrastructure not only enables developers to test OGC standards, but it can be configured to test profiles of OGC standards and community-developed application agreements. These agreements can be any interface and encoding agreement, not only OGC based. The OGC Compliance Program is thus an important

  3. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for

  4. CURRENT TRENDS IN CATALOGUING AND THE CHALLENGES ...

    African Journals Online (AJOL)

    CURRENT TRENDS IN CATALOGUING AND THE CHALLENGES OF. A CATALOGUER IN THE DIGITAL ... Information Communication Technology (ICT) and the attendant innovations and trends that are required to cope in this new ... much tact and techniques including doggedness to keep pace with it. The tact one would.

  5. Competencies and materials for repositioning cataloguers for ...

    African Journals Online (AJOL)

    The purpose of this study was to determine the competencies and materials for repositioning cataloguers for information management in an electronic era. The survey method was adopted for the research design using questionnaire for data collection. The population comprised of 44 cataloguers in 12 universities in ...

  6. The 3XMM-DR4 Catalogue

    Science.gov (United States)

    Rosen, S.; Watson, M.; Pye, J.; Webb, N.; Schwope, A.; Freyberg, M.; Motch, C.; Ballet, J.; Carrera, F.; Page, M.; Page, C.

    2015-09-01

    The 3XMM-DR4 catalogue is the third generation catalogue of serendipitous X-ray sources from the European Space Agency's (ESA) XMM-Newton observatory, and has been created by the XMM-Newton Survey Science Centre (SSC) on behalf of ESA. Released in July 2013, 3XMM-DR4 contains 531261 detections from 372728 unique sources observed in 7427 XMM-Newton observations, about 50% more than in the preceding 2XMMi-DR3 catalogue, made public in April 2010. We review some of the key science-driven algorithmic and calibration changes to the processing pipeline adopted to enhance the scientific quality of the catalogue, such as the optimised filtering of background flares, improvements to the astrometric analysis and upgrades to the catalogue construction process. Examples of the gains obtained are illustrated.

  7. Sustainable energy catalogue - for European decision-makers. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Gram, S.; Jacobsen, Soeren

    2006-10-15

    is a list of contacts, pamphlets, web pages etc., where it is possible to find more information about the individual technologies. Each chapter offers an overview of the single technology concerning development stage, best available technology, supply potential, environmental impact etc., a comparison between the different technologies, and information about how the technologies can interact with each other and with the energy system. Furthermore, a timeline that extends towards (further) commercialisation is drawn for each technology. This timeline includes significant events with regard to research and development, the market and policies that are expected to be of importance to the success of the technology. The text on each technology is written against a background of expert knowledge supplied by a selection of experts on each technology presented in the catalogue. The texts are furthermore reviewed and evaluated by an external expert. A review is placed in connection with the individual technology. (au)

  8. Towards a geospatial wikipedia

    Science.gov (United States)

    Fritz, S.; McCallum, I.; Schill, C.; Perger, C.; Kraxner, F.; Obersteiner, M.

    2009-04-01

    Based on the Google Earth (http://earth.google.com) platform we have developed a geospatial Wikipedia (geo-wiki.org). The tool allows everybody in the world to contribute to spatial validation and is made available to the internet community interested in that task. We illustrate how this tool can be used for different applications. In our first application we combine uncertainty hotspot information from three global land cover datasets (GLC, MODIS, GlobCover). With an ever-increasing amount of high-resolution images available on Google Earth, it is becoming increasingly possible to distinguish land cover features with a high degree of accuracy. We first direct the land cover validation community to certain hotspots of land cover uncertainty and then ask them to fill in a small popup menu giving the type of land cover, possibly a picture at that location facing the different cardinal points, as well as the date and the type of validation chosen (Google Earth imagery/Panoramio, or whether the person has ground truth data). We have implemented the tool via a land cover validation community on FACEBOOK, based on a snowball system that allows the tracking of individuals and the possibility to ignore users who misuse the system. In a second application we illustrate how the tool could be used for mapping malaria occurrence and small water bodies as well as overall malaria risk. For this application we have implemented a polygon as well as an attribute function using Google Maps along with Virtual Earth via OpenLayers. The third application deals with illegal logging and how an alert system for the detection of illegal logging within a certain land tenure system could be implemented. Here we show how the tool can be used to document illegal logging via a YouTube video.

  9. Framework research of semantic sharing and interoperability of geospatial information

    Science.gov (United States)

    Zhao, Hu; Li, Lin; Shi, Yunfei

    2008-12-01

    Knowledge sharing and semantic interoperability is a significant research theme in Geographical Information Science (GIScience) because many researchers believe that semantic heterogeneity is the main obstacle to GIScience development. Interoperability issues can exist at three levels: syntactic, structural (also called systemic) and semantic. The former two, however, can be achieved by implementing international or domain standards proposed by several organizations, for example, the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the International Organization for Standardization/Technical Committee for Geographic information/Geomatics (ISO/TC 211). In this paper we concentrate on semantic interoperability, the sort of topic that halts conversations and causes people's eyes to glaze over, from two aspects: data/information/knowledge and operation/processing. We present a service-centered architecture for the semantic interoperability of geospatial data and processes. OGC standards such as the Web Feature Service (WFS) and Web Map Service (WMS) have been employed as normative interfaces for analyzing requests, dividing requests and delivering small requests. Ontology has been introduced to describe distributed resources, including various data and geo-processing operations. The role of interoperability, especially from the semantic perspective, is distinguished in the first section of this paper. As a fundamental principle, the following section introduces the Semantic Web, Web services and other related work in this direction. We present our service-based architecture in detail, with a simple application, in part three. Conclusions and further orientations are given in the last section.

  10. Leveraging Industry Standards for GeoSpatial Portal Development

    Science.gov (United States)

    Zimble, D.; Garegnani, J. J.

    2005-12-01

    Rapid advances in mainstream IT data sharing techniques, through the leveraging of mainstream IT standards such as the World Wide Web Consortium (W3C) extensible markup language (XML), simple object access protocol (SOAP) based web services and the Java Community Process (JCP) driven portlet technology (JSR-168), in addition to the wide adoption of Open Geospatial Consortium (OGC) GIS web service specifications (WMS, WFS, WCS, WMC, CS-W etc.), are intersecting within commercial GIS technologies. For example, the next-generation GIS portal technology for the U.S. Government's Geospatial One-Stop has been developed to help establish an industrial-strength geospatial portal that can be used as the primary U.S. Government coordinating portal for geospatial-related activities. In addition to these technologies providing common, highly interoperable portals, heavier desktop and server applications are further integrating technologies that will enable the scientific communities to link into these mainstream information portals. By way of example, we will discuss the incorporation of the open-source scripting language Python into the commercial GIS platform, both on the desktop and on the server. Users have already developed Python code that can be deployed to give the GIS user access to large repositories of scientific multidimensional data via the OPeNDAP protocol, which can be incorporated into GIS analysis and workflows. Additional development in support of NetCDF, and in the future of additional scientific data formats, will expand the use of such formats within the GIS community. This presentation will provide an overview and demonstrations of these technologies and how they are relevant to the Earth and Space Science Informatics community.
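Scripted access of the kind described, e.g., subsetting a remote dataset over OPeNDAP before pulling it into a GIS workflow, often reduces to building a constraint-expression URL. A minimal sketch using OPeNDAP's hyperslab syntax; the dataset URL and variable name are hypothetical:

```python
def opendap_subset_url(dataset_url, variable, *index_ranges):
    """Build an OPeNDAP ASCII subset request using hyperslab constraints.

    index_ranges are (start, stop) index pairs, one per array dimension;
    the dataset URL used below is made up for illustration."""
    hyperslab = "".join(f"[{a}:{b}]" for a, b in index_ranges)
    return f"{dataset_url}.ascii?{variable}{hyperslab}"

url = opendap_subset_url(
    "https://opendap.example.org/data/sst.nc", "sst",
    (0, 0), (10, 20), (30, 40))
print(url)
```

In practice one would hand such a URL (or its binary `.dods` counterpart) to a client library rather than fetch the ASCII response directly, but the constraint expression is the part a GIS script has to assemble.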

  11. GSKY: A scalable distributed geospatial data server on the cloud

    Science.gov (United States)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information coming from different geospatial collections is in increasing demand in the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door to such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), has over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process as
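The on-the-fly resampling a server like GSKY performs before blending collections of differing resolution can be illustrated with a toy nearest-neighbour version; this sketch is not GSKY code, just the underlying idea:

```python
def resample_nearest(grid, out_rows, out_cols):
    """Nearest-neighbour resampling of a 2-D list-of-lists grid.

    A toy stand-in for the on-the-fly resampling a geospatial data
    server performs when serving a grid at a requested resolution."""
    in_rows, in_cols = len(grid), len(grid[0])
    return [
        [grid[r * in_rows // out_rows][c * in_cols // out_cols]
         for c in range(out_cols)]
        for r in range(out_rows)
    ]

coarse = [[1, 2],
          [3, 4]]
print(resample_nearest(coarse, 4, 4))
```

A real server would do this per request, combined with reprojection, so that no resampled copy of the collection is ever stored.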

  12. Catalogues, conceptual models, data model: the orientations of research and the thematic guidelines in Library and information science

    Directory of Open Access Journals (Sweden)

    Antonella Trombone

    2016-05-01

    Full Text Available The adoption of new conceptual models for bibliographic and authority data, the latest normative structures of the catalogue, the languages of data encoding and their communication on the Semantic Web are some of the factors that are transforming library catalogues, cataloguing and the management of both bibliographic and authority data. Moreover, these changes are related to the development of new environments for the discovery, visualization and dissemination of bibliographic information. The essay proposes an analysis of the elements that underlie such changes, carried out also through an examination of the themes developed by the scientific literature in the same area of interest.

  13. TerraService.NET: An Introduction to Web Services

    OpenAIRE

    Barclay, Tom; Gray, Jim; Strand, Eric; Ekblad, Steve; Richter, Jeffrey

    2002-01-01

    This article explores the design and construction of a geo-spatial Internet web service application from the host web site perspective and from the perspective of an application using the web service. The TerraService.NET web service was added to the popular TerraServer database and web site with no major structural changes to the database. The article discusses web service design, implementation, and deployment concepts and design guidelines. Web services enable applications that aggregate a...

  14. Bibliographic information organization in the semantic web

    CERN Document Server

    Willer, Mirna

    2013-01-01

    New technologies will underpin the future generation of library catalogues. To facilitate their role providing information, serving users, and fulfilling their mission as cultural heritage and memory institutions, libraries must take a technological leap; their standards and services must be transformed to those of the Semantic Web. Bibliographic Information Organization in the Semantic Web explores the technologies that may power future library catalogues, and argues the necessity of such a leap. The text introduces international bibliographic standards and models, and fundamental concepts in

  15. A CLOUD-BASED PLATFORM SUPPORTING GEOSPATIAL COLLABORATION FOR GIS EDUCATION

    Directory of Open Access Journals (Sweden)

    X. Cheng

    2015-05-01

    Full Text Available GIS-related education needs the support of geo-data and geospatial software. Although a large amount of geographic information resources is distributed on the web, the discovery, processing and integration of these resources remain unsolved problems. Researchers and teachers have typically searched for geo-data with common search engines, but the results were not satisfactory. They have also spent much money and effort on the purchase and maintenance of various kinds of geospatial software. To address these problems, a cloud-based geospatial collaboration platform called GeoSquare was designed and implemented. The platform serves as a geoportal encouraging geospatial data, information, and knowledge sharing through highly interactive and expressive graphic interfaces. Researchers and teachers can solve their problems effectively with this one-stop solution. Functions, specific design and implementation details are presented in this paper. GeoSquare is available at: http://geosquare.tianditu.com/

  16. a Cloud-Based Platform Supporting Geospatial Collaboration for GIS Education

    Science.gov (United States)

    Cheng, X.; Gui, Z.; Hu, K.; Gao, S.; Shen, P.; Wu, H.

    2015-05-01

    GIS-related education needs the support of geo-data and geospatial software. Although a large amount of geographic information resources is distributed on the web, the discovery, processing and integration of these resources remain unsolved problems. Researchers and teachers have typically searched for geo-data with common search engines, but the results were not satisfactory. They have also spent much money and effort on the purchase and maintenance of various kinds of geospatial software. To address these problems, a cloud-based geospatial collaboration platform called GeoSquare was designed and implemented. The platform serves as a geoportal encouraging geospatial data, information, and knowledge sharing through highly interactive and expressive graphic interfaces. Researchers and teachers can solve their problems effectively with this one-stop solution. Functions, specific design and implementation details are presented in this paper. GeoSquare is available at: http://geosquare.tianditu.com/

  17. Geospatial Absorption and Regional Effects

    Directory of Open Access Journals (Sweden)

    IOAN MAC

    2009-01-01

    Full Text Available Geospatial absorptions are characterized by a specific complexity, both in content and in their phenomenological and spatial fields of manifestation. Such processes are differentiated, according to their specificity, into pre-absorption, absorption and post-absorption. The mechanisms that contribute to absorption are extremely numerous: aggregation, extension, diffusion, substitution, resistivity (resilience), stratification, borrowings, etc. Frequent relations are established between these mechanisms, determining an amplification of the process and of its regional effects. The installation of the geographic osmosis phenomenon in a given territory (a place, for example) leads to a homogenization of the geospatial state and to the installation of regional homogeneity.

  18. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  19. From Card Catalogues to WebPACs: Celebrating Cataloguing in the 20th Century.

    Science.gov (United States)

    Gorman, Michael

    This paper provides an overview of cataloging in the 20th century. Highlights include: (1) issues in 1901, including the emerging cooperative cataloging system and the work of Charles Ammi Cutter; (2) the 1908 code, i.e., "Catalog Rules: Author and Title Entries," published in British and American editions; (3) the Vatican rules, a code…

  20. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    Science.gov (United States)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering-enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available as open source and through VM images on Amazon's AWS Marketplace. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  1. Improving pest risk assessment and management through the aid of geospatial information technology standards

    Directory of Open Access Journals (Sweden)

    Trond Rafoss

    2013-09-01

    Full Text Available Delivery of geospatial information over the Internet for the management of risks from invasive alien species is an increasingly important service. The evolution of information technology standards for geospatial data is a key factor in simplifying network publishing and exchange of maps and data. The World Wide Web Consortium (W3C) geolocation specification is a recent addition that may prove useful for pest risk management. In this article we implement the W3C geolocation specification and Open Geospatial Consortium (OGC) mapping standards in a Web browser application for smartphones and tablet computers to improve field surveys for alien invasive species. We report our first season of field experiences using this tool for online mapping of plant disease outbreaks and host plant occurrence. It is expected that the improved field data collection tools will result in increased data availability and thereby new opportunities for risk assessment, because data needs and availability are crucial for species distribution modelling and model-based forecasts of pest establishment potential. Finally, we close with a comment on the future potential of geospatial information standards to enhance the translation from data to decisions regarding pest risks, which should enable earlier detection of emerging risks as well as more robust projections of pest risks in novel areas. The forthcoming standard for processing of geospatial information, the Web Processing Service (WPS) standard, should open new technological capabilities both for automatic initiation and updating of risk assessment models based on new incoming data, and subsequent early warning.
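The OGC mapping standards mentioned above are, in the simplest case, URL-based: a WMS GetMap request is an HTTP GET with a fixed parameter set. A hedged sketch in Python, where the endpoint and layer name are hypothetical placeholders:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    base_url and layer are placeholders; substitute a real WMS
    endpoint and layer name. Note that WMS 1.3.0 with EPSG:4326
    expects the BBOX axis order as lat,lon.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.org/wms", "pest_outbreaks",
                     (58.0, 4.0, 71.0, 31.0))
print(url)
```

A browser or mobile field-survey client can drop such a URL straight into an image element, which is what makes the standard attractive for lightweight applications.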

  2. Catalogue of tide gauges in the Pacific

    National Research Council Canada - National Science Library

    Ridgway, N. M

    1984-01-01

    Although this catalogue is primarily intended to provide a list of sources for tidal data which can be used in postevent studies of tsunamis, it may also be useful in other branches of oceanographic...

  3. The new «Catalogue of Strong Italian Earthquakes»

    Directory of Open Access Journals (Sweden)

    G. Valensise

    1995-06-01

    Full Text Available We describe a new catalogue of strong Italian earthquakes that the Istituto Nazionale di Geofisica, in collaboration with SGA, has recently made available to the international scientific community and to the general public. The new catalogue differs from previous efforts in that for each event the usual seismic parameters are complemented by a list of intensity-rated localities, a complete list of relevant references, a series of synoptic comments describing different aspects of the earthquake phenomenology, and in most cases even the text of the original written sources. The printed part of the catalogue has been published as a special monograph which also contains a computer version of the full database in the form of a CD-ROM. The software package includes a computer program for retrieving, selecting and displaying the catalogue data.

  4. Geospatial Health: the first five years

    Directory of Open Access Journals (Sweden)

    Jürg Utzinger

    2011-11-01

    Full Text Available Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000 and the inaugural issue of its official journal was published in November 2006 with the aim of covering all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and as an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member strong editorial board. This overview takes stock of the first five years of publishing: 133 contributions have been published so far, primarily original research (79.7%), followed by reviews (7.5%), announcements (6.0%), editorials and meeting reports (3.0% each) and a preface in the first issue. A content analysis of all the original research articles and reviews reveals that three quarters of the publications focus on human health, with the remainder dealing with veterinary health. Two thirds of the papers come from Africa, Asia and Europe, with similar numbers of contributions from each continent. Studies of more than 35 different diseases, injuries and risk factors have been presented. Malaria and schistosomiasis were identified as the two most important diseases (11.2% each). Almost half the contributions were based on GIS, one third on spatial analysis, often using advanced Bayesian geostatistics (13.8%), and one quarter on remote sensing. The 120 original research articles, reviews and editorials were produced by 505 authors based at institutions and universities in 52 countries.

  5. Nebhydro: Sharing Geospatial Data to Supportwater Management in Nebraska

    Science.gov (United States)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydro variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux, Python-based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers.
The system provides online access, querying, visualization, and analysis of the hydrological data from several sources

  6. Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

    Science.gov (United States)

    Singh, R.; Bermudez, L. E.

    2013-12-01

    The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years two main trends have been disrupting geospatial applications: mobile computing and context sharing. People now have more and more mobile devices to support their work and personal lives. Mobile devices are intermittently connected to the internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the Shapefile format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. And it builds upon a database container - SQLite - that's self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout is described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is sharing client 'contexts'. When a user is looking into an article or a product on the web
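The point that a GeoPackage is "a set of SQLite database tables" can be made concrete with Python's built-in sqlite3 module. The sketch below mimics only the gpkg_contents table of contents; a spec-conformant GeoPackage requires several additional metadata tables (e.g. gpkg_spatial_ref_sys) that are omitted here:

```python
import sqlite3

# A GeoPackage is an SQLite file whose layout is fixed by the OGC
# spec; gpkg_contents is its table of contents. This in-memory
# sketch mimics that one table only.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE gpkg_contents (
        table_name TEXT PRIMARY KEY,
        data_type  TEXT NOT NULL,
        identifier TEXT,
        min_x REAL, min_y REAL, max_x REAL, max_y REAL,
        srs_id INTEGER
    )""")
db.execute(
    "INSERT INTO gpkg_contents VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
    ("roads", "features", "Road network", -180, -90, 180, 90, 4326))

# A client discovers the available layers by reading the contents table.
layers = list(db.execute(
    "SELECT table_name, data_type FROM gpkg_contents"))
print(layers)  # → [('roads', 'features')]
```

Because everything lives in one self-describing SQLite file, a mobile client can open it with the platform's stock SQLite library, which is exactly the sharing advantage the abstract claims over multi-file Shapefiles.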

  7. OSGeo - Open Source Geospatial Foundation

    Directory of Open Access Journals (Sweden)

    Margherita Di Leo

    2012-09-01

    Full Text Available The need, arising toward the end of 2005, to select and organize more than 200 FOSS4G projects led to the founding in February 2006 of OSGeo (the Open Source Geospatial Foundation), an international organization whose mission is to promote the collaborative development of free software focused on geographic information (FOSS4G). The Open Source Geospatial Foundation (OSGeo) is a not-for-profit organization, created in early 2006 with the aim of supporting the collaborative development of geospatial open source software and promoting its widespread use. The foundation provides financial, organizational and legal support to the broader open source geospatial community. It also serves as an independent legal entity to which community members can contribute code, funding and other resources, secure in the knowledge that their contributions will be maintained for public benefit. OSGeo also serves as an outreach and advocacy organization for the open source geospatial community, and provides a common forum and shared infrastructure for improving cross-project collaboration. The foundation's projects are all freely available and usable under an OSI-certified open source license. The Italian OSGeo local chapter is named GFOSS.it (Associazione Italiana per l'Informazione Geografica Libera).

  8. Geospatial economics of the woody biomass supply in Kansas -- A case study

    Science.gov (United States)

    Olga Khaliukova; Darci Paull; Sarah L. Lewis-Gonzales; Nicolas Andre; Larry E. Biles; Timothy M. Young; James H. Perdue

    2017-01-01

    This research assessed the geospatial supply of cellulosic feedstocks for potential mill sites in Kansas (KS), with procurement zones extending to Arkansas (AR), Iowa (IA), Missouri (MO), Oklahoma (OK), and Nebraska (NE). A web-based modeling system, the Kansas Biomass Supply Assessment Tool, was developed to identify least-cost sourcing areas for logging residues and...

  9. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    Science.gov (United States)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage the end-to-end application services and virtualized computing framework of HUBzero. Funded by the NSF Data Infrastructure Building Blocks (DIBBs) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization, and will be made available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data, without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.

  10. Geospatial Technologies and Geography Education in a Changing World : Geospatial Practices and Lessons Learned

    NARCIS (Netherlands)

    2015-01-01

    Book published by IGU Commission on Geographical Education. It focuses particularly on what has been learned from geospatial projects and research from the past decades of implementing geospatial technologies in formal and informal education.

  11. DCC Briefing Paper: Curating Geospatial Data

    OpenAIRE

    McGarva, Guy

    2006-01-01

    Geospatial data relate to the location of geographical features and the relationships between those features. They are vital for a wide range of business and government functions including defence, transportation, education, engineering, and recreation; in fact, almost all activities rely on geospatial data to some extent. It is widely accepted that the majority of all data held in corporate and government databases include some kind of geospatial characteristics. Such data are important not ...

  12. Constructing catalogue of temporal situations

    Directory of Open Access Journals (Sweden)

    Violetta Koseska-Toszewa

    2015-11-01

    Full Text Available The paper aims to create a common basis for the description, comparison, and analysis of natural languages. As a subject of comparison we have chosen the temporal structures of some languages. For such a choice there exists a perfect tool describing basic temporal phenomena, namely the ordering of states and events in time, certainty and uncertainty, independence of the histories of separate objects, and necessity and possibility. This tool is supported by the Petri net formalism, which seems to be well suited for expressing the above-mentioned phenomena. Petri nets are built from three primitive notions: states, events that begin or end the states, and a so-called flow relation indicating the succession of states and events. These simple constituents give rise to many possibilities for representing temporal phenomena; it turns out that such representations are sufficient for many (clearly, not necessarily all) temporal situations appearing in natural languages. In the description formalisms used until now there is no possibility of expressing such phenomena as temporal dependencies in compound statements, or the combination of temporality and modality. Moreover, using these formalisms one cannot distinguish between two different sources of uncertainty of the speaker while describing reality: one due to the speaker's lack of knowledge of what is going on in the outside world, the other due to the objective impossibility of foreseeing the ways in which some conflict situations will be (or already have been) resolved. The Petri net formalism seems perfectly suited for such differentiations. Two main description principles underlie this paper. First, assigning meaning to names of grammatical structures in different languages may lead to misunderstanding. Two grammatical structures with apparently close names may describe different realities.
Additionally, some grammatical terms used in one language may be

  13. Strategic Model for Future Geospatial Education

    National Research Council Canada - National Science Library

    Hacker, Gary

    1998-01-01

    .... Increases in computing technology combined with unclassified access to high resolution satellite imagery, geospatial information, and positioning accuracy provided by the Global Positioning System (GPS...

  14. Catalogue of meteorites from South America

    CERN Document Server

    Acevedo, Rogelio Daniel; García, Víctor Manuel

    2014-01-01

    The first Catalogue of Meteorites from South America includes new specimens never previously reported, while doubtful cases and pseudometeorites have been deliberately omitted. The falling of these objects is a random event, but the sites where old meteorites are found tend to be focused in certain areas, e.g. in the deflation surfaces in Chile's Atacama Desert, due to favorable climate conditions and ablation processes. Our Catalogue provides basic information on each specimen like its provenance and the place where it was discovered (in geographic co-ordinates and with illustrative maps), its

  15. National Recordal System IK holder catalogue process

    CSIR Research Space (South Africa)

    Pretorius, R

    2012-10-01

    Full Text Available possible for a representative body to represent more than one community. The catalogue process was defined to encourage community ownership of the IK activities, whilst keeping an active audit trail to protect the communities and IK holders against bio-piracy [2]. The IK audit trail will ensure that IK holders and communities are not locked out of any potential socio-economic benefits that may flow from their respective IK. One of the first activities when establishing a new...

  16. LSIVIEWER 2.0 – A CLIENT-ORIENTED ONLINE VISUALIZATION TOOL FOR GEOSPATIAL VECTOR DATA

    Directory of Open Access Journals (Sweden)

    K. Manikanta

    2017-09-01

    Full Text Available Geospatial data visualization has predominantly been through applications that are installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model for data handling, data rendering and visualization has been the most prevalent approach in Web-GIS. While client devices have become functionally more powerful over recent years, the above model has largely ignored this and is still in a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer – a simple, easy-to-use and robust online geospatial data visualisation system for the user’s own data that harnesses the client’s capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system can support multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render the vector data using LSIViewer is comparable to a desktop GIS application, QGIS, on an identical system.

  17. Lsiviewer 2.0 - a Client-Oriented Online Visualization Tool for Geospatial Vector Data

    Science.gov (United States)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualization has predominantly been through applications that are installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model for data handling, data rendering and visualization has been the most prevalent approach in Web-GIS. While client devices have become functionally more powerful over recent years, the above model has largely ignored this and is still in a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer - a simple, easy-to-use and robust online geospatial data visualisation system for the user's own data that harnesses the client's capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system can support multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render the vector data using LSIViewer is comparable to a desktop GIS application, QGIS, on an identical system.
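A first step any client-side vector renderer performs is computing the data's bounding box so the dataset can be fitted to the viewport. A minimal Python sketch for toy GeoJSON data (LSIViewer itself is JavaScript; this only illustrates the idea, for Point and LineString geometries):

```python
import json

# Toy GeoJSON FeatureCollection; coordinates are invented.
geojson = json.loads("""{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature", "geometry":
       {"type": "Point", "coordinates": [78.5, 17.4]}},
    {"type": "Feature", "geometry":
       {"type": "LineString",
        "coordinates": [[77.0, 16.0], [79.2, 18.1]]}}
  ]
}""")

def bbox(fc):
    """Bounding box (min_x, min_y, max_x, max_y) of a
    FeatureCollection containing Point/LineString geometries."""
    xs, ys = [], []
    for f in fc["features"]:
        g = f["geometry"]
        coords = [g["coordinates"]] if g["type"] == "Point" \
                 else g["coordinates"]
        for x, y in coords:
            xs.append(x)
            ys.append(y)
    return min(xs), min(ys), max(xs), max(ys)

print(bbox(geojson))  # → (77.0, 16.0, 79.2, 18.1)
```

From the bounding box, a renderer derives the scale and offset that map coordinates to canvas pixels, which is what lets the client do the drawing without a round trip to the server.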

  18. A High-performance Service-Oriented Geospatial Cyberinfrastructure for Rapid Disaster Response and Decision Making

    Science.gov (United States)

    Li, W.; Ren, Y.

    2013-12-01

    High population growth, urbanization and global climate change have resulted in more frequent occurrences of disasters, affecting people's lives and property safety all over the world. Worse than the disaster itself is the vulnerability of existing disaster management systems, which fail to achieve timely collection of disaster-related data, estimation of damage, evacuation planning, resource scheduling, and other decision-making in disastrous situations. The emerging geospatial cyberinfrastructure (GCI) provides a promising solution to address these issues. This paper reports our efforts in establishing a high-performance cyberinfrastructure for rapid disaster response and decision-making. This GCI is built upon a service-oriented architecture, with improved performance supported by a distributed computing cluster for efficient data transmission and rendering. Unlike most work in the literature, which improves the client-side performance of geospatial web services, this cluster addresses the fundamental performance issue on the server side. A web portal is also developed to integrate the real-time geospatial web services reporting disaster-related information for integral analysis and collaborative decision-making. We expect this work to contribute to effective disaster management and geospatial interoperability.
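The server-side speed-up described above comes from spreading independent work units, such as map-tile rendering, across many nodes. A single-machine sketch of the same fan-out pattern using a thread pool, where render_tile is a hypothetical stand-in for the real rendering call:

```python
from concurrent.futures import ThreadPoolExecutor

def render_tile(tile):
    """Hypothetical stand-in for an expensive tile-rendering call."""
    z, x, y = tile
    return (tile, f"tile-{z}/{x}/{y}.png")

# Render a small batch of tiles in parallel; on a real cluster the
# same fan-out happens across nodes rather than threads.
tiles = [(3, x, y) for x in range(2) for y in range(2)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(render_tile, tiles))

print(len(results))  # → 4
```

Because each tile is independent, throughput scales with the number of workers until data transmission becomes the bottleneck, which is why the abstract pairs the cluster with efficient transmission.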

  19. Cataloguing outside the box a practical guide to cataloguing special collections materials

    CERN Document Server

    Falk, Patricia

    2010-01-01

    A practical guide to cataloguing and processing the unique special collections formats held in the Browne Popular Culture Library (BPCL) and the Music Library and Sound Recordings Archives (MLSRA) at Bowling Green State University (BGSU) (e.g. fanzines, popular sound recordings, comic books, motion picture scripts and press kits, popular fiction). Cataloguing Outside the Box provides practical guidelines and solutions for professionals in library and information science facing the same cataloguing challenges. Additionally, name authority work for these collections is addressed.

  20. Sharing geoscience algorithms in a Web service-oriented environment (GRASS GIS example)

    Science.gov (United States)

    Li, Xiaoyan; Di, Liping; Han, Weiguo; Zhao, Peisheng; Dadi, Upendra

    2010-08-01

    Effective use of the large amounts of geospatial data available for geospatial research and applications is needed. In this paper, the emerging SOAP-based Web service technologies have been used to develop a large number of standards-compliant, chainable geospatial Web services from existing geospatial modules in software systems or specific geoscientific algorithms. A prototype for wrapping legacy software modules or geoscientific algorithms into loosely coupled Web services is proposed from an implementation viewpoint. Module development for Web services adheres to Open Geospatial Consortium (OGC) implementation specifications and World Wide Web Consortium (W3C) standards. The Web service interfaces are described using Web Services Description Language (WSDL) documents. This paper presents how the granularity of an individual existing geospatial service module used by other geoscientific workflows is decided. A treatment of concurrent processes and clustered deployment of Web services is used to overcome multi-user access and network speed limitations. This endeavor should allow extensive use of geoscientific algorithms and geospatial data.
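The wrapping approach described above can be sketched, in miniature, as a function that turns a legacy routine into a loosely coupled operation exchanging self-describing messages. Everything here is invented for illustration: the toy slope routine stands in for a real GRASS GIS module, and plain JSON stands in for the SOAP/WSDL message layer.

```python
import json

# Hypothetical "legacy" function (stand-in for an existing geospatial module):
# slope magnitude from the four neighbouring cell elevations of a raster cell.
def slope_from_elevation(cell_size, z_neighbors):
    dzdx = (z_neighbors["east"] - z_neighbors["west"]) / (2 * cell_size)
    dzdy = (z_neighbors["north"] - z_neighbors["south"]) / (2 * cell_size)
    return (dzdx ** 2 + dzdy ** 2) ** 0.5

def make_service(fn):
    """Wrap a legacy function as a loosely coupled operation that accepts and
    returns serialized messages, isolating callers from the module's internals."""
    def operation(request_json):
        params = json.loads(request_json)
        try:
            result = fn(**params)
            return json.dumps({"status": "ok", "result": result})
        except (TypeError, KeyError) as exc:
            # Malformed requests become structured faults, not crashes.
            return json.dumps({"status": "error", "message": str(exc)})
    return operation

slope_service = make_service(slope_from_elevation)
reply = slope_service(json.dumps({
    "cell_size": 30.0,
    "z_neighbors": {"east": 110.0, "west": 100.0, "north": 105.0, "south": 100.0},
}))
```

The wrapper keeps the legacy code untouched while giving every module the same request/response contract, which is what makes the services chainable into workflows.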

  1. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    Science.gov (United States)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub, or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.

  2. Learning R for geospatial analysis

    CERN Document Server

    Dorman, Michael

    2014-01-01

    This book is intended for anyone who wants to learn how to efficiently analyze geospatial data with R, including GIS analysts, researchers, educators, and students who work with spatial data and who are interested in expanding their capabilities through programming. The book assumes familiarity with the basic geographic information concepts (such as spatial coordinates), but no prior experience with R and/or programming is required. By focusing on R exclusively, you will not need to depend on any external software: a working installation of R is all that is necessary to begin.

  3. Nontraditional Resources Catalogue: Opening Trade Barriers.

    Science.gov (United States)

    Porter, Jeanne Harber, Ed.

    This catalogue provides a list of resources relevant to non-traditional careers, including work pattern information on flextime, job sharing, and industry-supported child care. The printed and audiovisual materials highlight journal articles, films, publications, test preparations, slides, cassettes, apprenticeship information centers, and Women's…

  4. Metadata-catalogue of European spatial datasets

    NARCIS (Netherlands)

    Willemen, J.P.M.; Kooistra, L.

    2004-01-01

    In order to facilitate a more effective accessibility of European spatial datasets, an assessment was carried out by the GeoDesk of the WUR to identify and describe key datasets that will be relevant for research carried out within WUR and MNP. The outline of the Metadata catalogue European spatial

  5. Catalogue of fish species of the Netherlands

    NARCIS (Netherlands)

    Nijssen, H.; Groot, de S.J.

    1974-01-01

    A catalogue of 179 fish species occurring in the fresh and coastal waters of the Netherlands is published. Scientific, Dutch and English names are given for each species as well as information on their abundance in the fresh waters of Holland and in the Dutch coastal waters within the twelve mile

  6. Catalogue of gamma rays from radionuclides

    International Nuclear Information System (INIS)

    Ekstroem, L.P.; Andersson, P.

    1983-10-01

    A catalogue of almost 11000 gamma rays is presented. The gamma rays are sorted by energy. In addition to the gamma-ray intensity per 100 decays of the parent, the decay half-life and associated gamma rays are given. All data are from a computer processing of a recent ENSDF file. (author)
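As a rough illustration of how an energy-sorted gamma-ray catalogue is queried when identifying an unknown peak, the sketch below uses a tiny invented table (the energies and intensities are approximate textbook values, not entries copied from the Ekstroem & Andersson catalogue) and a binary search over the energy column:

```python
import bisect

# Miniature catalogue rows: (energy_keV, intensity_per_100_decays, parent, half_life),
# kept sorted by energy as in the printed catalogue.
CATALOGUE = [
    (121.8, 28.6, "Eu-152", "13.5 y"),
    (344.3, 26.5, "Eu-152", "13.5 y"),
    (661.7, 85.1, "Cs-137", "30.1 y"),
    (1173.2, 99.9, "Co-60", "5.27 y"),
    (1332.5, 100.0, "Co-60", "5.27 y"),
]
ENERGIES = [row[0] for row in CATALOGUE]  # separate key list for bisect

def lines_in_window(e_min, e_max):
    """Return all catalogue lines with energy in [e_min, e_max] keV."""
    lo = bisect.bisect_left(ENERGIES, e_min)
    hi = bisect.bisect_right(ENERGIES, e_max)
    return CATALOGUE[lo:hi]

# Identify an unknown peak near 661 keV with a +/- 2 keV search window:
matches = lines_in_window(659.0, 663.0)
```

Because the table is sorted by energy, each window query is a pair of binary searches rather than a scan of all ~11000 lines.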

  7. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    Science.gov (United States)

    Kulo, Violet; Bodzin, Alec

    2013-02-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.

  8. A Global Geospatial Database of 5000+ Historic Flood Event Extents

    Science.gov (United States)

    Tellman, B.; Sullivan, J.; Doyle, C.; Kettner, A.; Brakenridge, G. R.; Erickson, T.; Slayback, D. A.

    2017-12-01

    A key dataset that is missing for global flood model validation and for understanding historic spatial flood vulnerability is a global historical geo-database of flood event extents. Decades of Earth-observing satellites and cloud computing now make it possible not only to detect floods in near real time, but to run these water detection algorithms back in time to capture the spatial extent of large numbers of specific events. This talk will show results from the largest global historical flood database developed to date. We use the Dartmouth Flood Observatory flood catalogue to map over 5000 floods (from 1985-2017) using the MODIS, Landsat, and Sentinel-1 satellites. All events are available for public download via the Earth Engine Catalogue and via a website that allows the user to query floods by area or date, assess population exposure trends over time, and download flood extents in geospatial format. In this talk, we will highlight major trends in global flood exposure per continent, land use type, and eco-region. We will also suggest how to use this dataset in conjunction with other global datasets to i) validate global flood models, ii) assess the potential role of climatic change in flood exposure, iii) understand how urbanization and other land change processes may influence spatial flood exposure, iv) assess how innovative flood interventions (e.g. wetland restoration) influence flood patterns, v) control for event magnitude to assess the role of social vulnerability and damage assessment, and vi) aid in rapid probabilistic risk assessment to enable microinsurance markets. The authors are already using the database for the latter three applications and will show examples of wetland intervention analysis in Argentina, social vulnerability analysis in the USA, and microinsurance in India.
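A minimal sketch of the query-by-area-or-date interface described above, over a few invented flood records (the ids, dates, bounding boxes and exposure figures are all made up for illustration):

```python
from datetime import date

# Invented records mimicking the described catalogue fields:
# bbox = (lon_min, lat_min, lon_max, lat_max)
FLOODS = [
    {"id": 1, "start": date(1998, 8, 1), "bbox": (88.0, 22.0, 92.0, 26.0), "exposed_pop": 1_200_000},
    {"id": 2, "start": date(2010, 7, 20), "bbox": (67.0, 24.0, 71.0, 28.0), "exposed_pop": 3_500_000},
    {"id": 3, "start": date(2016, 8, 10), "bbox": (-93.0, 29.0, -90.0, 31.0), "exposed_pop": 400_000},
]

def overlaps(b1, b2):
    """Axis-aligned bounding-box intersection test."""
    return not (b1[2] < b2[0] or b2[2] < b1[0] or b1[3] < b2[1] or b2[3] < b1[1])

def query(area=None, after=None, before=None):
    """Filter the event list by spatial extent and/or event date."""
    hits = FLOODS
    if area is not None:
        hits = [f for f in hits if overlaps(f["bbox"], area)]
    if after is not None:
        hits = [f for f in hits if f["start"] >= after]
    if before is not None:
        hits = [f for f in hits if f["start"] <= before]
    return hits
```

Real flood extents are polygons rather than boxes, but the same filter-then-download pattern applies.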

  9. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    Science.gov (United States)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud and Multicore offer the functionality needed to solve important problems in the Earth Science domain: storage, distribution, management, processing and security of geospatial data; execution of complex processing through task and data parallelism; etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, as well as of standardized and specialized tools for storing, analyzing, processing and visualizing the geospatial data concerning this area. To achieve these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, geospatial Web services standardized by the Open Geospatial Consortium (OGC), and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyses the integration and execution of geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project on different use cases, such as the execution of geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and Multicore architectures [3]. The current

  10. Web GIS and Public Health

    Directory of Open Access Journals (Sweden)

    Maryam Pourhassan

    2010-04-01

    Both government and private sector organizations are seeking ways to maintain and improve public health around the world while controlling costs. To this end, the Internet and the use of georeferenced public health information in Geographic Information System applications are an important and exciting development for the nation's Department of Health and Human Services and other health agencies. Technological progress towards public health geospatial data integration, analysis, and visualization of space-time events using the Web portends eventual robust use of Geographic Information Systems by public health and other sectors of the economy. Increasing Web resources from distributed spatial data portals and global geospatial libraries, and a growing suite of Web integration tools, will provide new opportunities to advance disease surveillance, control and prevention, and ensure public access and community empowerment in public health decision making.

  11. On Hydronymic Catalogues Composition Principles: Cataloguing of Hydronyms of the Msta River Basin

    Directory of Open Access Journals (Sweden)

    Valery L. Vasilyev

    2015-06-01

    The article presents a brief review of the few Russian hydronymic catalogues (relating to the basins of the Don, Oka, Svir and other rivers) based on the hydrographic principle. The authors argue that, in comparison with alphabetized hydronymic dictionaries, hydronymic catalogues have some obvious advantages for onomastic lexicography. This kind of catalogue should include, firstly, all historically attested forms of a hydronym (including those considered to be occasional miswritings) and, secondly, all non-hydronymic names making part of the respective hydronymic microsystem and providing “external” (i.e., chronological, derivational, etymological, ethno-historical) information about the hydronym. The authors point out that the cataloguing of hydronyms based on the hydrographic principle entails some difficulties: impossibility to localize some bodies of water mentioned in ancient and medieval documents; differences in the indication of the same bodies of water on old and contemporary maps; historical differences in establishing hydrographic hierarchies; historical changes of lake-river systems, etc. The authors also share their experience in creating a hydronymic catalogue of the Msta River basin in the Novgorod and Tver Regions of Russia. They describe the principles of the composition of the catalogue and present a short excerpt of it that orders names in the system of the Volma River, one of the Msta's left tributaries.

  12. A Comparative Study of the Guo Shoujing Star Catalogue and the Ulugh Beg Star Catalogue

    Science.gov (United States)

    Sun, Xiaochun; Yang, Fan; Zhao, Yongheng

    2015-08-01

    The Chinese Star Catalogue by Guo Shoujing (1231-1316) contained equatorial coordinates of 678 stars, more than double the number of stars in previous Chinese star catalogues. In the period 1420-1437, using astronomical instruments at the Samarkand Observatory, Ulugh Beg (1394-1449) made independent observations and determined the positions of 1018 stars. An analysis of the two star catalogues shows the observational techniques behind them and their accuracies. Both astronomers tried to increase the accuracy of measurement by enlarging their astronomical instruments. The Chinese catalogue gives equatorial coordinates of stars; the coordinates were read directly off the armillary sphere, which was equatorially mounted. Sun Xiaochun (1996) suggested that the data of the extant Guo Shoujing catalogue were actually observed around 1380, at the beginning of the Ming dynasty. The Ulugh Beg catalogue gives ecliptic coordinates of stars. Does this mean they were directly measured using an ecliptic instrument? Using Fourier analysis we discover a 3 arc minute systematic error in the declinations derived from the ecliptic coordinates, suggesting the data might have been first measured equatorially and then converted to ecliptic coordinates, following Ptolemaic tradition. The 3 arc minute systematic error was caused by the misalignment of the instrument's pole and the celestial north pole. Our comparative study might throw some light on the transmission of astronomical knowledge and techniques between China and Central Asia in medieval times.
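The Fourier detection of a pole-misalignment error can be sketched as a first-harmonic fit of declination residuals against right ascension: a tilted instrument pole imprints a sinusoid in declination whose amplitude equals the pole offset. The data below are synthetic, constructed only to exercise the estimator, not taken from either catalogue:

```python
import math

def first_harmonic_amplitude(ra_deg, residual_arcmin):
    """Amplitude of the first Fourier harmonic of declination residuals
    as a function of right ascension (discrete Fourier coefficients a, b)."""
    n = len(ra_deg)
    a = 2.0 / n * sum(r * math.cos(math.radians(ra)) for ra, r in zip(ra_deg, residual_arcmin))
    b = 2.0 / n * sum(r * math.sin(math.radians(ra)) for ra, r in zip(ra_deg, residual_arcmin))
    return math.hypot(a, b)

# Synthetic test: a 3-arcmin pole offset at position angle 40 deg imprints a
# 3-arcmin sinusoid in the declination residuals of uniformly spread stars.
ras = [i * 3.6 for i in range(100)]                       # 100 stars over 0..360 deg of RA
resid = [3.0 * math.cos(math.radians(ra - 40.0)) for ra in ras]
amp = first_harmonic_amplitude(ras, resid)                # recovers ~3.0 arcmin
```

With real catalogue residuals the harmonic sits on top of random measurement noise, but the noise averages out of the Fourier sums while the systematic sinusoid survives.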

  13. The Use Of The Internet For Cataloguing And Classification | Zaid ...

    African Journals Online (AJOL)

    This paper discusses approaches to utilize various online library catalogues to facilitate cataloguing processes for bibliographic benefits. References will be made to the University of Lagos Library where success has been recorded as a result of use of various online catalogues to complement the tools used in classifying ...

  14. A Comparative Study of Job Satisfaction between Cataloguers in ...

    African Journals Online (AJOL)

    The study aimed at comparing job satisfaction among cataloguers in federal and private university libraries in Nigeria and to identify what cataloguers are satisfied and dissatisfied with. The instrument used for data collection was the questionnaire. Data was collected from cataloguers in federal and private university ...

  15. Seismic Catalogue and Seismic Network in Haiti

    Science.gov (United States)

    Belizaire, D.; Benito, B.; Carreño, E.; Meneses, C.; Huerfano, V.; Polanco, E.; McCormack, D.

    2013-05-01

    The destructive earthquake that occurred on January 12, 2010 in Haiti highlighted the country's lack of preparedness to address seismic phenomena. At the moment of the earthquake, there was no seismic network operating in the country, and only partial knowledge of past seismicity was possible, due to the absence of a national catalogue. After the 2010 earthquake, advances began towards the installation of a national network and the elaboration of a seismic catalogue providing the necessary input for seismic hazard studies. This paper presents the state of the work carried out covering both aspects. First, a seismic catalogue has been built, compiling data on historical and instrumental events in the Hispaniola Island and surroundings, in the frame of the SISMO-HAITI project, supported by the Technical University of Madrid (UPM) and developed in cooperation with the Observatoire National de l'Environnement et de la Vulnérabilité of Haiti (ONEV). Data from different agencies all over the world were gathered, with a relevant role played by the Dominican Republic and Puerto Rico seismological services, which provided local data from their national networks. Almost 30000 events recorded in the area from 1551 to 2011 were compiled in a first catalogue, among them 7700 events with Mw ranging between 4.0 and 8.3. Since different magnitude scales were given by the different agencies (Ms, mb, MD, ML), this first catalogue was affected by important heterogeneity in the size parameter. It was then homogenized to moment magnitude Mw using the empirical equations developed by Bonzoni et al (2011) for the eastern Caribbean. At present, this is the most exhaustive catalogue of the country, although it is difficult to assess its degree of completeness. Regarding the seismic network, 3 stations were installed just after the 2010 earthquake by the Canadian Government. The data were sent by telemetry through the Canadian system CARINA. In 2012, the Spanish IGN together
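The homogenization step can be sketched as a per-scale conversion table applied to each event. The linear coefficients below are illustrative placeholders, not the Bonzoni et al (2011) regressions:

```python
# Placeholder linear conversions scale -> Mw (invented for illustration only;
# a real catalogue would use regionally calibrated regressions).
CONVERSIONS = {
    "Mw": lambda m: m,                  # already moment magnitude
    "Ms": lambda m: 0.97 * m + 0.30,
    "mb": lambda m: 1.05 * m + 0.10,
    "ML": lambda m: 1.00 * m + 0.05,
}

def homogenize(events):
    """Return events with a unified 'Mw' field derived from each record's native scale."""
    out = []
    for ev in events:
        scale, value = ev["scale"], ev["mag"]
        if scale not in CONVERSIONS:
            raise ValueError(f"no conversion defined for scale {scale!r}")
        out.append({**ev, "Mw": round(CONVERSIONS[scale](value), 2)})
    return out
```

Once every event carries an Mw value, completeness analysis and hazard computations can treat the catalogue as a single homogeneous sample.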

  16. The Future of Geospatial Standards

    Science.gov (United States)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds

  17. M3.2.3 Personas Catalogue

    DEFF Research Database (Denmark)

    Guldbæk Rasmussen, Katja; Iversen, Rie; Petersen, Gitte

    This catalogue contains 7 personas developed for use in the Europeana projects. The premise of this work has been to find already existing personas within the domains of archives, museums and libraries in Europe. These have then been pared down to their essentials and rebuilt, using input from Europeana partners and research on behavior and search patterns. If you have never worked with personas before, please take the time to read the short introduction in the chapter about method. The personas, and a brief “How To”, are the central issue in this catalogue and are therefore placed at the front. For those wanting to dig a little deeper into how the personas were created, more in-depth material can be found in the chapters at the back.

  18. Gamification and geospatial health management

    International Nuclear Information System (INIS)

    Wortley, David

    2014-01-01

    Sensor and Measurement technologies are rapidly developing for many consumer applications which have the potential to make a major impact on business and society. One of the most important areas for building a sustainable future is in health management. This opportunity arises because of the growing popularity of lifestyle monitoring devices such as the Jawbone UP bracelet, Nike Fuelband and Samsung Galaxy GEAR. These devices measure physical activity and calorie consumption and, when visualised on mobile and portable devices, enable users to take more responsibility for their personal health. This presentation looks at how the process of gamification can be applied to develop important geospatial health management applications that could not only improve the health of nations but also significantly address some of the issues in global health such as the ageing society and obesity

  19. Gamification and geospatial health management

    Science.gov (United States)

    Wortley, David

    2014-06-01

    Sensor and Measurement technologies are rapidly developing for many consumer applications which have the potential to make a major impact on business and society. One of the most important areas for building a sustainable future is in health management. This opportunity arises because of the growing popularity of lifestyle monitoring devices such as the Jawbone UP bracelet, Nike Fuelband and Samsung Galaxy GEAR. These devices measure physical activity and calorie consumption and, when visualised on mobile and portable devices, enable users to take more responsibility for their personal health. This presentation looks at how the process of gamification can be applied to develop important geospatial health management applications that could not only improve the health of nations but also significantly address some of the issues in global health such as the ageing society and obesity.

  20. Visualization and Ontology of Geospatial Intelligence

    Science.gov (United States)

    Chan, Yupo

    Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced in the prevalence of location-based services, as afforded by the ubiquitous cell phone usage. It is also manifested by the popularity of such internet engines as Google Earth. As we commute to work, travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.

  1. The Hipparcos, Tycho, TRC, and ACT catalogues - A whole sky comparison of the proper motions

    NARCIS (Netherlands)

    Hoogerwerf, R; Blaauw, A

    We present a whole sky comparison of the proper motions contained in the Hipparcos Catalogue, the Tycho Catalogue, the Tycho Reference Catalogue (TRC), and the Astrographic Catalogue plus Tycho Reference Catalogue (ACT). The catalogues are compared in the 20 declination zones defined by the
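The zonal comparison described above reduces, in outline, to grouping stars by declination band and averaging the per-star proper-motion differences between two catalogues in each band. The sketch below uses invented inputs:

```python
from collections import defaultdict

def zone_mean_differences(stars, n_zones=20):
    """Group stars into equal-width declination zones covering -90..+90 deg and
    average the proper-motion differences (e.g. mu_Hip - mu_Tycho) per zone.

    stars: iterable of (dec_deg, delta_mu_mas_per_yr) pairs.
    Returns {zone_index: mean_difference} for the populated zones.
    """
    width = 180.0 / n_zones
    sums = defaultdict(lambda: [0.0, 0])          # zone -> [running sum, count]
    for dec, dmu in stars:
        zone = min(int((dec + 90.0) / width), n_zones - 1)  # clamp dec = +90 into last zone
        sums[zone][0] += dmu
        sums[zone][1] += 1
    return {z: s / n for z, (s, n) in sorted(sums.items())}
```

A systematic trend of the zone means with declination would point to zonal errors in one of the catalogues, which is exactly what such a whole-sky comparison is designed to expose.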

  2. Catalogue of nuclear fusion codes - 1976

    International Nuclear Information System (INIS)

    1976-10-01

    A catalogue is presented of the computer codes in nuclear fusion research developed by JAERI, Division of Thermonuclear Fusion Research and Division of Large Tokamak Development in particular. It contains a total of about 100 codes under the categories: Atomic Process, Data Handling, Experimental Data Processing, Engineering, Input and Output, Special Languages and Their Application, Mathematical Programming, Miscellaneous, Numerical Analysis, Nuclear Physics, Plasma Physics and Fusion Research, Plasma Simulation and Numerical Technique, Reactor Design, Solid State Physics, Statistics, and System Program. (auth.)

  3. Nebula observations. Catalogues and archive of photoplates

    Science.gov (United States)

    Shlyapnikov, A. A.; Smirnova, M. A.; Elizarova, N. V.

    2017-12-01

    A process of data systematization based on "Academician G.A. Shajn's Plan" for studying the Galaxy structure related to nebula observations is considered. The creation of digital versions of catalogues of observations and publications is described, as well as their presentation in HTML, VOTable and AJS formats, and the basic principles of work in the Aladin Sky Atlas, an interactive application of the International Virtual Observatory.

  4. National Recordal System IK holder catalogue process

    CSIR Research Space (South Africa)

    Pretorius, R

    2011-11-01

    … development, and protection of IKS in South Africa. The IKS policy states [1]: “In order to secure rights to knowledge, a recordal system needs to be put in place where communities, guilds and other IK holders can record their knowledge holdings in order…”, and secondly to facilitate recording and documentation of IK for the preservation and protection against unauthorized access to data, for instance commercial exploitation, distortion and fraudulent acts. The first version of the catalogue process...

  5. WFCatalog: A catalogue for seismological waveform data

    Science.gov (United States)

    Trani, Luca; Koymans, Mathijs; Atkinson, Malcolm; Sleeman, Reinoud; Filgueira, Rosa

    2017-09-01

    This paper reports advances in seismic waveform description and discovery leading to a new seismological service, and presents the key steps in its design, implementation and adoption. This service, named WFCatalog, which stands for waveform catalogue, accommodates features of seismological waveform data. It thereby meets the need for seismologists to be able to select waveform data based on seismic waveform features as well as sensor geolocations and temporal specifications. We describe the collaborative design methods and the technical solution, showing the central role of seismic feature catalogues in framing the technical and operational delivery of the new service. We also provide an overview of the complex environment wherein this endeavour is scoped and discuss the related challenges. As multi-disciplinary, multi-organisational and global collaboration is necessary to address today's challenges, canonical representations can provide a focus for collaboration and conceptual tools for agreeing directions. Such collaborations can be fostered and formalised by rallying intellectual effort into the design of novel scientific catalogues and the services that support them. This work offers an example of the benefits generated by involving cross-disciplinary skills (e.g. data and domain expertise) from the early stages of design, and by sustaining the engagement with the target community throughout the delivery and deployment process.
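The feature-based selection such a waveform catalogue enables can be sketched as filtering per-day metadata documents on quality features instead of touching the waveforms themselves. The document fields and values below are invented for illustration, not the WFCatalog schema:

```python
from datetime import datetime

# Invented miniature waveform-feature documents (one per station-day):
DOCS = [
    {"network": "NL", "station": "HGN", "start": datetime(2016, 1, 1),
     "sample_rate": 40.0, "percent_availability": 99.8, "max_gap_s": 1.2},
    {"network": "NL", "station": "WIT", "start": datetime(2016, 1, 1),
     "sample_rate": 40.0, "percent_availability": 87.5, "max_gap_s": 3600.0},
]

def select(min_availability=0.0, max_gap_s=float("inf"), after=None):
    """Pick station-days whose waveform features meet the quality criteria,
    without ever reading the raw waveform data."""
    hits = []
    for d in DOCS:
        if d["percent_availability"] < min_availability:
            continue
        if d["max_gap_s"] > max_gap_s:
            continue
        if after is not None and d["start"] < after:
            continue
        hits.append(d)
    return hits
```

Selecting on precomputed features keeps the expensive raw-data transfer for only the records that actually qualify.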

  6. INTEGRATING GEOSPATIAL TECHNOLOGIES AND SECONDARY STUDENT PROJECTS: THE GEOSPATIAL SEMESTER

    Directory of Open Access Journals (Sweden)

    Bob Kolvoord

    2012-12-01

    Abstract (translated from the Spanish): The Geospatial Semester is a geographical education activity in which students in their final year of secondary school in the United States acquire specific competencies and skills in geographic information systems, GPS and remote sensing. Through a project-based learning methodology, students are motivated and engaged in research projects in which they analyse, and even propose solutions to, various processes, problems or questions of a spatial nature. The project is coordinated by James Madison University and has been running for seven years in high schools across the State of Virginia, involving more than 20 schools and 1,500 students. Students who pass this secondary-school course are granted certain academic credits at the partner university. Keywords: geographic information systems, teaching, geography education, geospatial semester.
    Abstract: The Geospatial Semester is a geographical education activity focused on students in their final year of secondary schools in the U.S. acquiring specific skills in GIS, GPS and remote sensing. Through a project-based learning methodology, students are motivated and involved in conducting research using geographic information systems, analysing, and even proposing solutions to, different processes, problems or issues spatial in nature. The Geospatial Semester's university management not only ensures proper coaching, guidance and GIS training for the schools' teachers, but has also established a system whereby students who pass this secondary-education course gain recognition of certain credits from the university. Key words: geographic information system, teaching, geographic education, geospatial semester.

  7. GIBS Geospatial Data Abstraction Library (GDAL)

    Data.gov (United States)

    National Aeronautics and Space Administration — GDAL is an open source translator library for raster geospatial data formats that presents a single abstract data model to the calling application for all supported...

  8. Geospatial Information System Capability Maturity Models

    Science.gov (United States)

    2017-06-01

    To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...

  9. Geospatial Modeling of Asthma Population in Relation to Air Pollution

    Science.gov (United States)

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Young, John H.; Luvall, Jeffrey C.; Alhamdan, Mohammad

    2013-01-01

    Current observations indicate that asthma is growing every year in the United States, but the specific reasons for this are not well understood. This study stems from an ongoing research effort to investigate the spatio-temporal behavior of asthma and its relatedness to air pollution. The association between environmental variables such as air quality and asthma-related health issues over the State of Mississippi is investigated using Geographic Information Systems (GIS) tools and applications. Health data concerning asthma, obtained from the Mississippi State Department of Health (MSDH) for the 9-year period 2003-2011, and air pollutant concentration (PM2.5) data collected from USEPA web resources are analyzed geospatially to establish the impacts of air quality on human health, specifically related to asthma. Disease mapping using geospatial techniques provides valuable insights into the spatial nature, variability, and association of asthma with air pollution. Asthma patient hospitalization data for Mississippi have been analyzed and mapped using quantitative choropleth techniques in ArcGIS, with patients geocoded to their respective ZIP codes. Potential air pollutant sources such as Interstate highways and industries, together with other land-use data, have been integrated in a common geospatial platform to understand their adverse contribution to human health. Existing hospitals and emergency clinics are also included in the analysis to further understand their proximity and ease of access from patient locations. At the current level of analysis and understanding, a spatial concentration of asthma is observed in the populations of ZIP code regions on the Gulf Coast, along the Interstates of the south, and in counties of northeast Mississippi. It is also found that asthma is prevalent in most of the urban population. This GIS-based project would be useful for health risk assessment and for providing information support to administrators and decision makers when establishing satellite clinics in the future.

  10. Compiling an earthquake catalogue for the Arabian Plate, Western Asia

    Science.gov (United States)

    Deif, Ahmed; Al-Shijbi, Yousuf; El-Hussain, Issa; Ezzelarab, Mohamed; Mohamed, Adel M. E.

    2017-10-01

    The Arabian Plate is surrounded by regions of relatively high seismicity. Accounting for this seismicity is of great importance for seismic hazard and risk assessments, seismic zoning, and land use. In this study, a homogeneous moment-magnitude (Mw) earthquake catalogue for the Arabian Plate is provided. The comprehensive and homogeneous earthquake catalogue spatially covers the entire Arabian Peninsula and neighboring areas, including all earthquake sources that can generate substantial hazard for the Arabian Plate mainland. The catalogue extends in time from A.D. 19 to 2015, with a total of 13,156 events, of which 497 are historical events. Four polygons covering the entire Arabian Plate were delineated, and different data sources, including special studies and local, regional and international catalogues, were used to prepare the earthquake catalogue. Moment magnitudes (Mw) provided by the original sources were given the highest magnitude-type priority and introduced into the catalogue with their references. Earthquakes with magnitudes on scales other than Mw were converted to this scale using empirical relationships derived in the current or in previous studies. The four polygon catalogues were combined into two comprehensive earthquake catalogues covering the historical and instrumental periods. Duplicate events were identified and discarded. Finally, the catalogue was declustered so that it contains only independent events, and its completeness with time over different magnitude ranges was investigated.
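The homogenization step described above (keep native Mw, convert other scales via empirical relations) can be sketched as follows. This is a minimal illustration, not the study's own regressions: the Ms branch follows the widely used Scordilis (2006)-style piecewise form, while the mb coefficients here are placeholders.

```python
def to_mw(mag, mtype):
    """Homogenize a catalogue magnitude to moment magnitude Mw.

    Native Mw is kept as-is (highest priority, as in the study); other
    scales are converted with illustrative linear relations - the
    coefficients below are placeholders, not those derived by the authors.
    """
    conversions = {
        "Mw": lambda m: m,
        # Scordilis (2006)-style piecewise surface-wave relation
        "Ms": lambda m: 0.67 * m + 2.07 if m < 6.2 else 0.99 * m + 0.08,
        # illustrative body-wave relation
        "mb": lambda m: 0.85 * m + 1.03,
    }
    return round(conversions[mtype](mag), 2)

# A tiny mixed-scale catalogue fragment (hypothetical events):
catalogue = [(5.4, "Ms"), (6.1, "Mw"), (5.0, "mb")]
print([to_mw(m, t) for m, t in catalogue])  # [5.69, 6.1, 5.28]
```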

  11. GEOPACKAGE DATA FORMAT FOR COLLABORATIVE MAPPING OF GEOSPATIAL DATA IN LIMITED NETWORK ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    M. H. Rashidan

    2016-09-01

    Full Text Available The growth of technology in earth and space science informatics has led to a revolution in a wide range of geospatial practice. Collaborative mapping has become a new hot spot, following mobile and web GIS. This paper explores the potential use of GeoPackage for collaborative mapping of geospatial data in limited network environments. GeoPackage is a data format that is open-standard, platform-independent, portable, and self-describing. The paper focuses on the implementation of GeoPackage in a mobile application for field data collection. A mobile application was developed that uses the GeoPackage data format as an internal database to support offline mapping. The developed application demonstrates that vector and raster data can be stored in a single data format, which reduces device storage consumption. The details of how GeoPackage data contribute to mobile GIS to achieve collaborative mapping in limited network environments are discussed. The findings show that the GeoPackage data format has great potential to improve existing mobile GIS applications.
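A GeoPackage is a single SQLite file, and the OGC specification mandates a `gpkg_contents` table listing every vector layer and tile set it holds, which is what makes the "one file, vector plus raster" property described above possible. The sketch below builds a minimal in-memory stand-in for that table (the layer names and file name are hypothetical) and queries it the way an offline mobile app would enumerate its layers.

```python
import sqlite3

# In practice you would open a real file: sqlite3.connect("fieldwork.gpkg")
db = sqlite3.connect(":memory:")

# Minimal subset of the columns required by the GeoPackage spec's
# gpkg_contents table; data_type is 'features' (vector) or 'tiles' (raster).
db.execute("""
    CREATE TABLE gpkg_contents (
        table_name TEXT PRIMARY KEY,
        data_type  TEXT NOT NULL,
        identifier TEXT,
        srs_id     INTEGER
    )""")
db.executemany(
    "INSERT INTO gpkg_contents VALUES (?, ?, ?, ?)",
    [("survey_points", "features", "Field survey points", 4326),
     ("basemap", "tiles", "Offline basemap tiles", 3857)])

# Enumerate all layers stored in the single file
layers = db.execute(
    "SELECT table_name, data_type FROM gpkg_contents ORDER BY table_name"
).fetchall()
for name, kind in layers:
    print(name, kind)
```

Because the container is plain SQLite, this enumeration needs no network access at all, which is the property the paper exploits for limited-network field work.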

  12. Energy renovation solutions - catalogue; Energirenoveringstiltag - katalog

    Energy Technology Data Exchange (ETDEWEB)

    Tommerup, H.

    2010-07-15

    The project's aim has been to develop methods and examples of extensive energy renovations to stimulate energy conservation and increased use of renewable energy in existing buildings. The current report presents an extensive technology catalogue of typical energy renovation measures used in the renovation of existing buildings. For every measure, the main aspects are explained concerning issues such as technology, use, barriers, indoor climate, energy conservation and prices. The report is mainly targeted at the construction industry, but many other stakeholders can also benefit from it. (ln)

  13. Tools for open geospatial science

    Science.gov (United States)

    Petras, V.; Petrasova, A.; Mitasova, H.

    2017-12-01

    Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, just using open source software or making the code public does not make the research reproducible. Moreover, the scientists face the challenge of learning new unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For the novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and a lab notebook. We will see how Git helps us not to get lost in revisions and how Docker is used to wrap all the parts together using a single text file so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.

  14. Central Asia earthquake catalogue from ancient time to 2009

    Directory of Open Access Journals (Sweden)

    Natalya N. Mikhailova

    2015-04-01

    Full Text Available In this work, we present the seismic catalogue compiled for Central Asia (Kazakhstan, Kyrgyzstan, Tajikistan, Uzbekistan and Turkmenistan) in the framework of the Earthquake Model Central Asia (EMCA) project. The catalogue, from 2000 B.C. to 2009 A.D., comprises 33,034 earthquakes in the MLH magnitude range from 1.5 to 8.3 (magnitude by surface waves on horizontal components, widely used in practice in the former USSR countries). The catalogue includes both macroseismically and instrumentally constrained data, with about 32,793 earthquakes after 1900 A.D. The main sources and procedures used to compile the catalogue are discussed, and a comparison with the ISC-GEM catalogue is presented. Magnitude of completeness analysis shows that the catalogue is complete down to magnitude 4 from 1959 and to magnitude 7 from 1873, whereas the obtained regional b value is 0.805.
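The b value quoted above is conventionally estimated from a complete catalogue with Aki's (1965) maximum-likelihood formula. The sketch below applies it to a synthetic Gutenberg-Richter sample, not the EMCA data; for continuous synthetic magnitudes no binning correction is needed.

```python
import math, random

def b_value_mle(mags, mc):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki 1965).

    mags: catalogue magnitudes (continuous values, so no binning
    correction is applied); mc: magnitude of completeness. Events
    below mc are discarded before estimation.
    """
    complete = [m for m in mags if m >= mc]
    return math.log10(math.e) / (sum(complete) / len(complete) - mc)

# Synthetic Gutenberg-Richter sample (illustrative only): above mc,
# magnitude excesses are exponential with rate b * ln(10).
random.seed(1)
b_true = 0.8
mags = [4.0 + random.expovariate(b_true * math.log(10)) for _ in range(20000)]
print(round(b_value_mle(mags, mc=4.0), 2))  # close to b_true
```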

  15. Large geospatial images discovery: metadata model and technological framework

    Directory of Open Access Journals (Sweden)

    Lukáš Brůha

    2015-12-01

    Full Text Available The advancements in geospatial web technology triggered efforts for disclosure of valuable resources of historical collections. This paper focuses on the role of spatial data infrastructures (SDI in such efforts. The work describes the interplay between SDI technologies and potential use cases in libraries such as cartographic heritage. The metadata model is introduced to link up the sources from these two distinct fields. To enhance the data search capabilities, the work focuses on the representation of the content-based metadata of raster images, which is the crucial prerequisite to target the search in a more effective way. The architecture of the prototype system for automatic raster data processing, storage, analysis and distribution is introduced. The architecture responds to the characteristics of input datasets, namely to the continuous flow of very large raster data and related metadata. Proposed solutions are illustrated on the case study of cartometric analysis of digitised early maps and related metadata encoding.

  16. Geospatial Technology Strategic Plan 1997-2000

    Science.gov (United States)

    D'Erchia, Frank; D'Erchia, Terry D.; Getter, James; McNiff, Marcia; Root, Ralph; Stitt, Susan; White, Barbara

    1997-01-01

    Executive Summary -- Geospatial technology applications have been identified in many U.S. Geological Survey Biological Resources Division (BRD) proposals for grants awarded through internal and partnership programs. Because geospatial data and tools have become more sophisticated, accessible, and easy to use, BRD scientists frequently are using these tools and capabilities to enhance a broad spectrum of research activities. Bruce Babbitt, Secretary of the Interior, has acknowledged--and lauded--the important role of geospatial technology in natural resources management. In his keynote address to more than 5,500 people representing 87 countries at the Environmental Systems Research Institute Annual Conference (May 21, 1996), Secretary Babbitt stated, '. . .GIS [geographic information systems], if properly used, can provide a lot more than sets of data. Used effectively, it can help stakeholders to bring consensus out of conflict. And it can, by providing information, empower the participants to find new solutions to their problems.' This Geospatial Technology Strategic Plan addresses the use and application of geographic information systems, remote sensing, satellite positioning systems, image processing, and telemetry; describes methods of meeting national plans relating to geospatial data development, management, and serving; and provides guidance for sharing expertise and information. Goals are identified along with guidelines that focus on data sharing, training, and technology transfer. To measure success, critical performance indicators are included. The ability of the BRD to use and apply geospatial technology across all disciplines will greatly depend upon its success in transferring the technology to field biologists and researchers. The Geospatial Technology Strategic Planning Development Team coordinated and produced this document in the spirit of this premise. 
Individual Center and Program managers have the responsibility to implement the Strategic Plan.

  17. Astrometric Star Catalogues as Combination of Hipparcos/Tycho Catalogues with Ground-Based Observations

    Science.gov (United States)

    Vondrak, J.

    The successful ESA mission Hipparcos provided very precise parallaxes, positions and proper motions of many stars at optical wavelengths. Therefore, it is the primary representation of the International Celestial Reference System at these wavelengths. However, the shortness of the mission (less than four years) causes some problems with the proper motions of stars that are double or multiple. A combination of the positions measured by the Hipparcos satellite with ground-based observations, which have a much longer history, therefore provides a better reference frame that is more stable in time. Several examples of such combinations (ACT, TYCHO-2, FK6, GC+HIP, TYC2+HIP, ARIHIP) are presented and briefly described. Emphasis is put on the most recent Earth Orientation Catalogue (EOC), which uses about 4.4 million optical observations of latitude/universal-time variations (made during the twentieth century at 33 observatories in Earth orientation programmes) in combination with some of the above-mentioned combined catalogues. The second version of the new catalogue, EOC-2, contains 4418 objects, and the precision of their proper motions is far better than that of the Hipparcos Catalogue.

  18. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    Science.gov (United States)

    Di Giacomo, Domenico; Engdhal, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product for use in seismic hazard studies. The new catalogue was necessary because improved seismic hazard studies require earthquake catalogues that are homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue, both for earthquakes that occurred beyond 2009 and for earthquakes listed in the International Seismological Summary (ISS) which fell below the cut-off magnitude of 6.25. This extension is part of a four-year programme that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as a basis for cross-checking the consistency in location and magnitude of earthquakes listed both in the ISC-GEM global catalogue and in regional catalogues.

  19. Broad Absorption Line Quasar catalogues with Supervised Neural Networks

    International Nuclear Information System (INIS)

    Scaringi, Simone; Knigge, Christian; Cottis, Christopher E.; Goad, Michael R.

    2008-01-01

    We have applied a Learning Vector Quantization (LVQ) algorithm to SDSS DR5 quasar spectra in order to create a large catalogue of broad absorption line quasars (BALQSOs). We first discuss the problems with BALQSO catalogues constructed using the conventional balnicity and/or absorption indices (BI and AI), and then describe the supervised LVQ network we have trained to recognise BALQSOs. The resulting BALQSO catalogue should be substantially more robust and complete than BI- or AI-based ones.
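The core LVQ update rule is simple: the nearest prototype is pulled toward a training sample when the class labels match and pushed away otherwise. Below is a toy LVQ1 sketch on 2-D points standing in for spectral feature vectors; it illustrates the rule only and is far simpler than the network trained in the paper.

```python
def lvq1_train(samples, labels, prototypes, proto_labels, lr=0.1, epochs=20):
    """One-prototype-per-class LVQ1 (illustrative, not the paper's network).

    Moves the nearest prototype toward a sample when the class matches,
    and away from it otherwise.
    """
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # nearest prototype by squared Euclidean distance
            d, k = min((sum((a - b) ** 2 for a, b in zip(x, p)), i)
                       for i, p in enumerate(prototypes))
            sign = 1.0 if proto_labels[k] == y else -1.0
            prototypes[k] = [p + sign * lr * (a - p)
                             for a, p in zip(x, prototypes[k])]
    return prototypes

def classify(x, prototypes, proto_labels):
    d, k = min((sum((a - b) ** 2 for a, b in zip(x, p)), i)
               for i, p in enumerate(prototypes))
    return proto_labels[k]

# Toy data: "BAL" spectra cluster near (1, 1), "non-BAL" near (-1, -1).
train = [(1.2, 0.9), (0.8, 1.1), (-1.0, -0.9), (-1.1, -1.2)]
lbls = ["BAL", "BAL", "non-BAL", "non-BAL"]
protos = [[0.5, 0.5], [-0.5, -0.5]]
plabels = ["BAL", "non-BAL"]
lvq1_train(train, lbls, protos, plabels)
print(classify((1.0, 1.0), protos, plabels))  # prints: BAL
```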

  20. Using Web GIS for Public Health Education

    Science.gov (United States)

    Reed, Rajika E.; Bodzin, Alec M.

    2016-01-01

    An interdisciplinary curriculum unit that used Web GIS mapping to investigate malaria disease patterns and spread in relation to the environment for a high school Advanced Placement Environmental Science course was developed. A feasibility study was conducted to investigate the efficacy of the unit to promote geospatial thinking and reasoning…

  1. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Full Text Available Social media is valuable in propagating information during disasters, thanks to its timeliness and availability, and it assists in decision-making when posts are tagged with locations. Considering the ambiguity and inaccuracy of some social data, additional authoritative data are needed for verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis after the fact. Moreover, current works organize the data from the perspective of the spatial location, not from the perspective of the disaster, making it difficult to analyze the disaster dynamically; all of the disaster-related data around the affected locations need to be retrieved. To address these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework, which proceeds as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and their attributes were extracted from the web using natural language processing (NLP) and used in the semantic similarity matching of geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed here within the GEGIS and shows that the system would be effective when typhoons occur.
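Step (2) above, extracting a geo-event and its attributes from web text, can be sketched in miniature. The pattern and the two-entry gazetteer below are hypothetical toys, far simpler than the NLP pipeline in the paper; they only illustrate turning a post into a structured, location-tagged event record.

```python
import re

def extract_geo_event(post):
    """Toy extraction of a geo-event record from a social-media post.

    A stand-in for the GEGIS NLP step: the regex and the tiny gazetteer
    are hypothetical, not the system's actual resources.
    """
    gazetteer = {"Hainan": (19.2, 109.7), "Zhejiang": (29.2, 120.5)}
    m = re.search(r"[Tt]yphoon\s+(\w+)", post)
    event = {"type": "typhoon", "name": m.group(1)} if m else {}
    for place, coord in gazetteer.items():
        if place in post:
            event["location"], event["coord"] = place, coord
    return event

post = "Typhoon Rammasun made landfall in Hainan this morning"
print(extract_geo_event(post))
```

The resulting record carries both a semantic type (for ontology matching) and coordinates (for retrieving geospatial resources around the affected location).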

  2. From Geomatics to Geospatial Intelligent Service Science

    Directory of Open Access Journals (Sweden)

    LI Deren

    2017-10-01

    Full Text Available The paper reviews 60 years of development from traditional surveying and mapping to today's geospatial intelligent service science. The three important stages of surveying and mapping, namely the analogue, analytical and digital stages, are summarized. The author introduces the integration of GNSS, RS and GIS (3S), which gave rise to geospatial informatics (geomatics). The development of geo-spatial information science in the digital-earth era is analyzed, and the latest progress of geo-spatial information science towards real-time intelligent service in the smart-earth era is discussed. This paper focuses on the three development levels of "Internet plus" spatial information intelligent service. In the era of big data, traditional geomatics will surely take advantage of the integration of communication, navigation, remote sensing, artificial intelligence, virtual reality and brain cognition science, and become a geospatial intelligent service science, thereby contributing to the national economy, defense and people's livelihood.

  3. U.S. EPA's Public Geospatial Metadata Service

    Data.gov (United States)

    U.S. Environmental Protection Agency — EPA's public geospatial metadata service provides external parties (Data.gov, GeoPlatform.gov, and the general public) with access to EPA's geospatial metadata...

  4. A Method for Automating Geospatial Dataset Metadata

    Directory of Open Access Journals (Sweden)

    Robert I. Dunfey

    2009-11-01

    Full Text Available Metadata have long been recognised as crucial to geospatial asset management and discovery, and yet their creation remains an unenviable task, often avoided. This paper proposes a practical approach designed to address such concerns, decomposing various data creation, management, update and documentation process steps that are subsequently leveraged to contribute towards metadata record completion. Using a customised utility embedded within a common GIS application, metadata elements are computationally derived from an imposed feature metadata standard, dataset geometry, an integrated storage protocol and pre-prepared content, and instantiated within a common geospatial discovery convention. Yielding 27 out of 32 total metadata elements (or 15 out of 17 mandatory elements), the approach demonstrably lessens the burden of metadata authorship. It also encourages improved geospatial asset management whilst outlining core requisites for developing a more open metadata strategy not bound to any particular application domain.
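One of the cheapest elements to derive computationally from dataset geometry is the geographic extent. The sketch below shows that single step under stated assumptions: the element names follow the ISO 19115 EX_GeographicBoundingBox convention, and the input vertices are hypothetical, not taken from the paper.

```python
def derive_extent_element(coords):
    """Derive a geographic bounding-box metadata element from geometry.

    coords: iterable of (lon, lat) vertices. Element names follow the
    ISO 19115 EX_GeographicBoundingBox convention; the inputs used
    below are hypothetical.
    """
    lons = [c[0] for c in coords]
    lats = [c[1] for c in coords]
    return {
        "westBoundLongitude": min(lons),
        "eastBoundLongitude": max(lons),
        "southBoundLatitude": min(lats),
        "northBoundLatitude": max(lats),
    }

bbox = derive_extent_element([(-3.2, 55.9), (-3.1, 55.95), (-3.25, 55.85)])
print(bbox["westBoundLongitude"], bbox["northBoundLatitude"])  # -3.25 55.95
```

In the paper's approach, elements like this are filled automatically so that the author only supplies the content that cannot be computed.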

  5. Visualising the past: potential applications of Geospatial tools to paleoclimate research

    Science.gov (United States)

    Cook, A.; Turney, C. S.

    2012-12-01

    Recent advances in geospatial data acquisition, analysis and web-based data sharing offer new possibilities for understanding and visualising past modes of change. The availability, accessibility and cost-effectiveness of data is better than ever. Researchers can access remotely sensed data including terrain models; use secondary data from large consolidated repositories; make more accurate field measurements and combine data from disparate sources to form a single asset. An increase in the quantity and consistency of data is coupled with subtle yet significant improvements to the way in which geospatial systems manage data interoperability, topological and textual integrity, resulting in more stable analytical and modelling environments. Essentially, researchers now have greater control and more confidence in analytical tools and outputs. Web-based data sharing is growing rapidly, enabling researchers to publish and consume data directly into their spatial systems through OGC-compliant Web Map Services (WMS), Web Feature Services (WFS) and Web Coverage Services (WCS). This has been implemented at institutional, organisational and project scale around the globe. Some institutions have gone one step further and established Spatial Data Infrastructures (SDI) based on Federated Data Structures where the participating data owners retain control over who has access to what. It is important that advances in knowledge are transferred to audiences outside the scientific community in a way that is interesting and meaningful. The visualisation of paleodata through multi-media offers significant opportunities to highlight the parallels and distinctions between past climate dynamics and the challenges of today and tomorrow. Here we present an assessment of key innovations that demonstrate how Geospatial tools can be applied to palaeo-research and used to communicate the results to a diverse array of audiences in the digital age.
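The OGC-compliant services mentioned above are plain HTTP requests, so publishing palaeo-data through them is straightforward to consume programmatically. The sketch below builds a WMS 1.3.0 GetMap request; the endpoint URL and layer name are hypothetical, while the parameter names come from the WMS 1.3.0 specification.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(800, 600)):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    base and layer are hypothetical examples; the KVP parameter names
    are those defined by the WMS 1.3.0 specification. Note that in
    WMS 1.3.0 with CRS EPSG:4326 the BBOX axis order is lat,lon.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minlat,minlon,maxlat,maxlon
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
        "STYLES": "",
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "holocene_terrain",
                     (-44.0, 146.0, -41.0, 149.0))
print(url)
```

A WFS or WCS request differs only in the service name and operation parameters, which is why researchers can consume all three directly into their spatial systems.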

  6. A multimembership catalogue for 1876 open clusters using UCAC4 data

    Science.gov (United States)

    Sampedro, L.; Dias, W. S.; Alfaro, E. J.; Monteiro, H.; Molino, A.

    2017-10-01

    The main objective of this work is to determine the cluster members of 1876 open clusters, using positions and proper motions of the astrometric fourth United States Naval Observatory (USNO) CCD Astrograph Catalog (UCAC4). For this purpose, we apply three different methods, all based on a Bayesian approach, but with different formulations: a purely parametric method, another completely non-parametric algorithm and a third, recently developed by Sampedro & Alfaro, using both formulations at different steps of the whole process. The first and second statistical moments of the members' phase-space subspace, obtained after applying the three methods, are compared for every cluster. Although, on average, the three methods yield similar results, there are also specific differences between them, as well as for some particular clusters. The comparison with other published catalogues shows good agreement. We have also estimated, for the first time, the mean proper motion for a sample of 18 clusters. The results are organized in a single catalogue formed by two main files, one with the most relevant information for each cluster, partially including that in UCAC4, and the other showing the individual membership probabilities for each star in the cluster area. The final catalogue, with an interface design that enables an easy interaction with the user, is available in electronic format at the Stellar Systems Group (SSG-IAA) web site (http://ssg.iaa.es/en/content/sampedro-cluster-catalog).
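The parametric formulation referred to above typically models the proper-motion distribution as a narrow cluster component plus a broad field component, with a star's membership probability given by Bayes' rule. The 1-D sketch below assumes the component parameters have already been fitted; the numbers are illustrative, not those of any UCAC4 cluster.

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def membership_prob(pm, f_cluster, mu_c, sig_c, mu_f, sig_f):
    """Bayesian membership probability in 1-D proper-motion space.

    Two-component (cluster + field) Gaussian model with already-fitted
    parameters: f_cluster is the cluster mixing fraction, (mu_c, sig_c)
    the narrow cluster component and (mu_f, sig_f) the broad field one.
    All values below are illustrative.
    """
    pc = f_cluster * normal_pdf(pm, mu_c, sig_c)
    pf = (1 - f_cluster) * normal_pdf(pm, mu_f, sig_f)
    return pc / (pc + pf)

# Narrow cluster component at -5 mas/yr, broad field component at 0 mas/yr
p_near = membership_prob(-5.0, 0.3, -5.0, 1.0, 0.0, 10.0)
p_far = membership_prob(20.0, 0.3, -5.0, 1.0, 0.0, 10.0)
print(round(p_near, 3), round(p_far, 3))
```

A star moving with the cluster gets a probability near one, while a field-like star gets a probability near zero; thresholding these values yields the membership lists compiled in the catalogue.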

  7. An Assessment of Online Public Access Catalogue (OPAC ...

    African Journals Online (AJOL)

    The main purpose of this study was to assess the computerized catalogue and its utilization in university libraries in Lagos state. Survey research method was employed for the study. The population for the study was drawn from two university libraries in Lagos state that have automated their catalogues. These libraries are ...

  8. Subject Catalogue Use at the Hezekiah Oluwasanmi Library ...

    African Journals Online (AJOL)

    A survey of the subject catalogue use at the Hezekiah Oluwasanmi Library, Obafemi Awolowo University, Ile-Ife was carried out for a period of six weeks using questionnaire. The focus of the study was to find out the extent to which the subject catalogue in the library, met the users needs, the level of use and the ...

  9. Challenges associated with cataloguing of electronic resources in ...

    African Journals Online (AJOL)

    The aim of the paper is to identify challenges associated with the cataloguing of e resources in some selected university libraries in south –south Nigeria. The descriptive survey design involving the use of questionnaire as the research instrument was adopted. The population comprised of cataloguers in five selected ...

  10. The ASAS-SN bright supernova catalogue - III. 2016

    DEFF Research Database (Denmark)

    Holoien, T. W. -S.; Brown, J. S.; Stanek, K. Z.

    2017-01-01

    This catalogue summarizes information for all supernovae discovered by the All-Sky Automated Survey for SuperNovae (ASAS-SN) and all other bright (m(peak)...

  11. A Survey Of Cataloguing Practices And Job Satisfaction In Nigerian ...

    African Journals Online (AJOL)

    The paper presents a survey of cataloguers in Nigerian academic libraries. With the use of a questionnaire, it attempts to identify the cataloguers' demography and work practices. It then attempts to explain the result as inferred satisfaction. Tables of frequency and percentages were used for data presentation. Findings ...

  12. International Atomic Energy Agency Publications. Catalogue 1986-1999

    International Nuclear Information System (INIS)

    2000-11-01

    This catalogue lists all sales publications of the International Atomic Energy Agency issued from 1986 up to the end of 1999 and still available. Some earlier titles which form part of an established series or are still considered important have also been included. The catalogue is in CD-ROM format

  13. Assessing the catalogue module of Alice for window software ...

    African Journals Online (AJOL)

    The paper presents a general description of Alice For Window Software with a detailed analysis of the catalogue module. It highlights the basic features of the module such as add, edit, delete, search field and the grab button. The cataloguing process is clearly delineated. The paper also discusses Alice For Window ...

  14. Security in a Replicated Metadata Catalogue

    CERN Document Server

    Koblitz, B

    2007-01-01

    The gLite-AMGA metadata catalogue has been developed by NA4 to provide simple relational metadata access for the EGEE user community. As advanced features, which will be the focus of this presentation, AMGA provides very fine-grained security, also in connection with the built-in support for replication and federation of metadata. AMGA is extensively used by the biomedical community to store medical-image metadata, by digital libraries, in HEP for logging and bookkeeping data, and in the climate community. The biomedical community intends to deploy a distributed metadata system for medical images consisting of various sites, which range from hospitals to computing centres. Only safe sharing of the highly sensitive metadata, as provided by AMGA, makes such a scenario possible. Other scenarios are digital libraries, which federate copyright-protected (meta-)data into a common catalogue. The biomedical and digital-library systems have been deployed using a centralized structure already for some time. They now intend to decentralize ...

  15. DESIGN FOR CONNECTING SPATIAL DATA INFRASTRUCTURES WITH SENSOR WEB (SENSDI

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2016-06-01

    Full Text Available Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is research into harnessing the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between Sensor Web and SDI, and conduct case studies such as hazard and urban applications. We take up cooperative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS and metadata on the SDI side and the 'Sensor Observation Service' (SOS), the 'Sensor Planning Service' (SPS), the 'Sensor Alert Service' (SAS), and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS). In conclusion, it is important for geospatial studies to integrate SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.

  16. Data Quality, Provenance and IPR Management services: their role in empowering geospatial data suppliers and users

    Science.gov (United States)

    Millard, Keiran

    2015-04-01

    This paper looks at the current experiences of geospatial users and suppliers and how they have been limited by the lack of suitable frameworks for managing and communicating data quality, data provenance and intellectual property rights (IPR). Current political and technological drivers mean that increasing volumes of geospatial data are available through a plethora of different products and services, and whilst this is inherently a good thing it does create a new generation of challenges. This paper considers two examples of where these issues have been examined and looks at the challenges and possible solutions from a data-user and data-supplier perspective. The first example is the IQmulus project, which is researching fusion environments for big geospatial point clouds and coverages. The second example is the EU Emodnet programme, which is establishing thematic data portals for public marine and coastal data. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR and numerical simulations. These data are simply too big for routine and ad hoc analysis, yet they could yield a myriad of disparate, readily usable information products with the right infrastructure in place. IQmulus is researching how to deliver this infrastructure technically, but financially sustainable delivery depends on being able to track and manage ownership and IPR across the numerous data sets being processed. This becomes complex when the data are composed of multiple overlapping coverages; however, managing this allows users to be delivered highly bespoke products that meet their budget and technical needs. The Emodnet programme delivers harmonised marine data at the EU scale across seven thematic portals. As part of the programme, a series of 'check points' have been initiated to examine how useful these and other public data services actually are for solving real-world problems. One key finding is that users have been confused by the fact that often

  17. Development of an earthquake catalogue for western Canada

    Energy Technology Data Exchange (ETDEWEB)

    Addo, K.O. [BC Hydro, Burnaby, BC (Canada); Falero, V.M.; Youngs, R.R. [AMEC Geomatrix, Oakland, CA (United States)

    2009-07-01

This paper discussed an earthquake catalogue for western Canada developed as part of an ongoing probabilistic seismic hazard assessment (PSHA) being conducted by BC Hydro. The official Geological Survey of Canada (GSC) earthquake catalogue was updated to include recent events as well as seismic events occurring south of the Canada-USA border. The Canadian and United States catalogues were merged, and the magnitudes of the events were converted to moment magnitude (M{sub w}). Aftershocks and anthropogenic events were removed, and magnitude completeness was established for the different regions within the study area. The catalogue will be used to prioritize emergency response plans and select access routes and locations for emergency supplies. It was concluded that the catalogue can be used to estimate seismicity-based recurrence parameters for the PSHA study as well as to identify realistic epicentres and earthquake magnitudes for simulations and emergency planning exercises. 35 refs., 6 tabs., 15 figs.
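The seismicity-based recurrence parameters mentioned above are typically the Gutenberg-Richter a- and b-values. A minimal sketch of the standard maximum-likelihood b-value estimator (Aki, 1965, with Utsu's binning correction), applied to hypothetical magnitudes rather than the BC Hydro catalogue:

```python
import math

def b_value_mle(mags, mc, dm=0.1):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965),
    with Utsu's correction for the magnitude binning width dm."""
    above = [m for m in mags if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# Hypothetical declustered magnitudes above completeness Mc = 4.0
mags = [4.0, 4.1, 4.3, 4.0, 4.6, 5.0, 4.2, 4.1, 4.8, 4.4]
b = b_value_mle(mags, mc=4.0)
```

For a real catalogue the estimate is only meaningful after declustering and above the magnitude of completeness, which is exactly why the abstract stresses those two processing steps.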

  18. Cataloguing In Special Libraries In The 1990s

    Directory of Open Access Journals (Sweden)

    Elizabeth Makin

    1996-01-01

    Full Text Available Cataloguing in special libraries has been virtually ignored in the literature since the turn of the century, although there are many books and papers on cataloguing in general. It is not clear why this should be so, since it can be argued that the needs of special libraries are different from those of public, academic and national libraries. Special libraries are primarily interested in the information content of documents in the sense that they have little or no interest in documents except as "packages" in which information may be encapsulated. It is therefore reasonable to assume, a priori, that special libraries would undertake detailed indexing and light cataloguing, perhaps reducing the catalogue to the status of a finding list. This paper reports the results of a survey of current cataloguing practice in special libraries.

  19. Dealing with orphans: Catalogue synchronisation with SynCat

    Energy Technology Data Exchange (ETDEWEB)

    Millar, A Paul [Deutsches Elektronen-Synchrotron (DESY), Notkestrasse 85, 22607 Hamburg (Germany); Donno, Flavia; Lo Presti, Giuseppe [European Organization for Nuclear Research (CERN), CH-1211, Geneve 23 (Switzerland); Jensen, Jens; De Witt, Shaun, E-mail: paul.millar@desy.d [Science and Technology Facilities Council (STFC), Rutherford Appleton Laboratory, Didcot OX11 0QX (United Kingdom)

    2010-04-01

    In the gLite grid model a site will typically have a Storage Element (SE) that has no direct mechanism for updating any central or experiment-specific catalogues. This loose coupling was a deliberate decision that simplifies SE design; however, a consequence of this is that the catalogues may provide an incorrect view of what is stored on a SE. In this paper, we present SynCat: a mechanism to allow catalogues to re-synchronise with SEs. The paper describes how catalogues can be sure, within certain tolerance, that files believed to be stored at various SEs are really stored there. SynCat also allows catalogues to be aware of transitory file metadata (such as whether a file normally stored on tape is currently available from disk) with low latency.
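The consistency check SynCat performs can be illustrated with simple set arithmetic: comparing the catalogue's view of a Storage Element against the SE's actual namespace yields orphans (catalogue entries with no backing file) and "dark data" (files no catalogue knows about). A toy sketch with hypothetical file paths, not the actual SynCat protocol:

```python
def diff_namespaces(catalogue_entries, se_listing):
    """Compare the catalogue's view of an SE against the SE's reality.

    Returns (orphans, dark_data): orphans are catalogued files the SE
    no longer holds; dark data are SE files absent from the catalogue.
    """
    cat = set(catalogue_entries)
    se = set(se_listing)
    return sorted(cat - se), sorted(se - cat)

orphans, dark = diff_namespaces(
    ["/grid/exp/run1.root", "/grid/exp/run2.root"],   # catalogue view
    ["/grid/exp/run2.root", "/grid/exp/tmp.dat"],     # SE namespace dump
)
# orphans == ["/grid/exp/run1.root"], dark == ["/grid/exp/tmp.dat"]
```

The hard part the paper addresses is doing this within a stated tolerance and with low latency, since both sides change while the comparison runs.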

  20. Smart sensor-based geospatial architecture for dike monitoring

    Science.gov (United States)

    Herle, S.; Becker, R.; Blankenbach, J.

    2016-04-01

Artificial hydraulic structures like dams or dikes used for water level regulation or flood prevention are continuously under the influence of the weather and variable river regimes. Ongoing monitoring and simulation are therefore crucial in order to determine their inner condition. Potentially life-threatening situations, in the extreme case a failure, must be counteracted by all available means. Nowadays flood warning systems rely exclusively on water level forecasts without considering the state of the structure itself. Area-covering, continuous knowledge of the inner state, including time-dependent changes, increases the capability of recognizing and locating vulnerable spots for early treatment. In case of a predicted breach, the advance warning time for alerting affected citizens can be extended. Our approach is composed of smart sensors integrated in a service-oriented geospatial architecture to monitor and simulate artificial hydraulic structures continuously. The sensors observe the inner state of the construction, such as soil moisture or stress and deformation over time, but also various external influences like water levels or wind speed. They are interconnected in a distributed network architecture by a so-called sensor bus system based on lightweight protocols like Message Queue Telemetry Transport for Sensor Networks (MQTT-SN). These sensor data streams are transferred into an OGC Sensor Web Enablement (SWE) data structure providing high-level geo web services to end users. Bundled with third-party geo web services (WMS, etc.), powerful processing and simulation tools can be invoked using the Web Processing Service (WPS) standard. Results will be visualized in a geoportal allowing user access to all information.
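The sensor-bus pattern described above can be sketched as sensors publishing compact observation payloads on hierarchical topics, which a gateway would later map into SWE structures. The topic scheme and field names below are assumptions for illustration; the MQTT-SN transport itself is omitted:

```python
import json
from datetime import datetime, timezone

def make_observation(dike_id, sensor_id, prop, value, unit):
    """Build an MQTT-style topic plus a compact JSON observation payload.

    Topic scheme and payload keys are hypothetical, chosen only to show
    the hierarchical-topic idea of a sensor bus.
    """
    topic = f"dike/{dike_id}/{sensor_id}/{prop}"
    payload = json.dumps({
        "t": datetime.now(timezone.utc).isoformat(),  # observation time
        "v": value,                                   # observed value
        "u": unit,                                    # unit of measure
    })
    return topic, payload

topic, payload = make_observation("D07", "sm-21", "soil_moisture", 0.31, "m3/m3")
```

A gateway subscribing to `dike/#` could then translate each payload into an SWE Observation for the geo web services layer.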

  1. A Spatial Data Infrastructure Integrating Multisource Heterogeneous Geospatial Data and Time Series: A Study Case in Agriculture

    Directory of Open Access Journals (Sweden)

    Gloria Bordogna

    2016-05-01

    Full Text Available Currently, the best practice to support land planning calls for the development of Spatial Data Infrastructures (SDI capable of integrating both geospatial datasets and time series information from multiple sources, e.g., multitemporal satellite data and Volunteered Geographic Information (VGI. This paper describes an original OGC standard interoperable SDI architecture and a geospatial data and metadata workflow for creating and managing multisource heterogeneous geospatial datasets and time series, and discusses it in the framework of the Space4Agri project study case developed to support the agricultural sector in Lombardy region, Northern Italy. The main novel contributions go beyond the application domain for which the SDI has been developed and are the following: the ingestion within an a-centric SDI, potentially distributed in several nodes on the Internet to support scalability, of products derived by processing remote sensing images, authoritative data, georeferenced in-situ measurements and voluntary information (VGI created by farmers and agronomists using an original Smart App; the workflow automation for publishing sets and time series of heterogeneous multisource geospatial data and relative web services; and, finally, the project geoportal, that can ease the analysis of the geospatial datasets and time series by providing complex intelligent spatio-temporal query and answering facilities.

  2. 3D geospatial visualizations: Animation and motion effects on spatial objects

    Science.gov (United States)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

Digital Elevation Models (DEMs), in combination with high-quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an impressive navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (JavaScript) makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). Going one step further, WebGL frameworks (e.g. Cesium.js, three.js) allow animation and motion effects to be attributed to 3D models. However, major GIS-based functionalities combined with the above-mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) or motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. To address this, we developed, and made available to the research community, an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models on user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.

  3. Geospatial Image Stream Processing: Models, techniques, and applications in remote sensing change detection

    Science.gov (United States)

    Rueda-Velasquez, Carlos Alberto

Detection of changes in environmental phenomena using remotely sensed data is a major requirement in the Earth sciences, especially in natural disaster related scenarios where real-time detection plays a crucial role in the saving of human lives and the preservation of natural resources. Although various approaches formulated to model multidimensional data can in principle be applied to the inherent complexity of remotely sensed geospatial data, there are still challenging peculiarities that demand a precise characterization in the context of change detection, particularly in scenarios of fast changes. In the same vein, geospatial image streams do not fit appropriately in the standard Data Stream Management System (DSMS) approach because these systems mainly deal with tuple-based streams. Recognizing the necessity for a systematic effort to address the above issues, the work presented in this thesis is a concrete step toward the foundation and construction of an integrated Geospatial Image Stream Processing framework, GISP. First, we present a data and metadata model for remotely sensed image streams. We introduce a precise characterization of images and image streams in the context of remotely sensed geospatial data. On this foundation, we define spatially-aware temporal operators with a consistent semantics for change analysis tasks. We address the change detection problem in settings where multiple image stream sources are available, and thus we introduce an architectural design for the processing of geospatial image streams from multiple sources. With the aim of targeting collaborative scientific environments, we construct a realization of our architecture based on Kepler, a robust and widely used scientific workflow management system, as the underlying computational support; and open data and Web interface standards, as a means to facilitate the interoperability of GISP instances with other processing infrastructures and client applications. We demonstrate our
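In its simplest form, a spatially-aware temporal operator of the kind the thesis describes reduces to per-pixel differencing of consecutive frames against a threshold. A toy pure-Python sketch on tiny 2x2 "images", not the GISP operator algebra:

```python
def change_mask(prev, curr, threshold):
    """Flag pixels whose absolute change between two consecutive frames
    exceeds a threshold; a toy stand-in for a temporal change operator."""
    return [
        [abs(c - p) > threshold for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]

# Two consecutive frames of a hypothetical 2x2 reflectance image
prev = [[0.10, 0.12], [0.50, 0.90]]
curr = [[0.11, 0.45], [0.52, 0.10]]
mask = change_mask(prev, curr, threshold=0.2)
# mask == [[False, True], [False, True]]
```

A streaming framework applies such an operator over a window of incoming frames, which is where the DSMS mismatch the abstract mentions arises: the unit of data is an image, not a tuple.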

  4. The African Geospatial Sciences Institute (agsi): a New Approach to Geospatial Training in North Africa

    Science.gov (United States)

    Oeldenberger, S.; Khaled, K. B.

    2012-07-01

The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate geospatial capacity development in North Africa, providing the facilities for geospatial project and management training to regional government employees, university graduates, private individuals and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training, providing actual project execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities, e.g. the ITC, and international organizations, such as the ISPRS, the ICA and the OGC. Through a close cooperation with African organizations, such as the AARSE, the RCMRD and RECTAS, the network and exchange of ideas, experiences, technology and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will steer the AGSI operations and will ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility with approximately 30 part- and full-time general staff and lecturers in Tunis during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and compact LIDAR instrument for local, community-scale data capture. Surveying training, the photogrammetric processing of aerial images, GIS data capture and remote sensing training will be the main components of the practical training courses.

  5. Identification of the condition of crops based on geospatial data embedded in graph databases

    Science.gov (United States)

    Idziaszek, P.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Koszela, K.; Fojud, A.

    2017-07-01

    The Web application presented here supports plant production and works with the graph database Neo4j shell to support the assessment of the condition of crops on the basis of geospatial data, including raster and vector data. The adoption of a graph database as a tool to store and manage the data, including geospatial data, is completely justified in the case of those agricultural holdings that have a wide range of types and sizes of crops. In addition, the authors tested the option of using the technology of Microsoft Cognitive Services at the level of produced application that enables an image analysis using the services provided. The presented application was designed using ASP.NET MVC technology and a wide range of leading IT tools.
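The holding-field-crop graph model suggested by the abstract can be mimicked with an in-memory adjacency structure; the node labels and relationship types below are hypothetical, not the authors' Neo4j schema:

```python
# Edges as (source, relation, target) triples: a tiny stand-in
# for Neo4j nodes and relationships, under an assumed schema.
edges = [
    ("Holding:H1", "HAS_FIELD", "Field:F1"),
    ("Holding:H1", "HAS_FIELD", "Field:F2"),
    ("Field:F1", "GROWS", "Crop:wheat"),
    ("Field:F2", "GROWS", "Crop:maize"),
    ("Field:F2", "ASSESSED_AS", "Condition:poor"),
]

def neighbours(node, relation):
    """Follow outgoing edges of one relation type from a node."""
    return [t for s, r, t in edges if s == node and r == relation]

def poor_fields(holding):
    """Fields of a holding whose assessment is 'poor'."""
    return [f for f in neighbours(holding, "HAS_FIELD")
            if "Condition:poor" in neighbours(f, "ASSESSED_AS")]
```

In Cypher the last query would read roughly `MATCH (h:Holding)-[:HAS_FIELD]->(f)-[:ASSESSED_AS]->(:Condition {name: 'poor'}) RETURN f`, under the same assumed schema; the advantage of the graph model is precisely such variable-length, heterogeneous traversals across holdings of very different sizes.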

  6. A Collaborative Geospatial Shoreline Inventory Tool to Guide Coastal Development and Habitat Conservation

    Directory of Open Access Journals (Sweden)

    Peter Gies

    2013-05-01

    Full Text Available We are developing a geospatial inventory tool that will guide habitat conservation, restoration and coastal development and benefit several stakeholders who seek mitigation and adaptation strategies to shoreline changes resulting from erosion and sea level rise. The ESRI Geoportal Server, which is a type of web portal used to find and access geospatial information in a central repository, is customized by adding a Geoinventory tool capability that allows any shoreline related data to be searched, displayed and analyzed on a map viewer. Users will be able to select sections of the shoreline and generate statistical reports in the map viewer to allow for comparisons. The tool will also facilitate map-based discussion forums and creation of user groups to encourage citizen participation in decisions regarding shoreline stabilization and restoration, thereby promoting sustainable coastal development.

  7. Malacological survey and geospatial distribution of Indoplanor ...

    African Journals Online (AJOL)

    Infected I. exutus, intermediate snail host of Schistosoma nasale, a cattle schistosome was found in two sites. The logistic regression analysis using remotely-sensed environmental data showed that NDVI was the significant variable influencing I. exutus distribution (B = -8.460, Sig = 0.043). Geospatial distribution maps were ...

  8. Geospatial intelligence about urban areas using SAR

    NARCIS (Netherlands)

    Broek, A.C. van den; Dekker, R.J.

    2007-01-01

Radar satellites are important for geospatial intelligence about urban areas and urban situational awareness, since these satellites can collect data day and night, independently of weather conditions, ensuring that information can be obtained at regular intervals and in time. For this

  9. Geospatial Technologies: Real Projects in Real Classrooms

    Science.gov (United States)

    Kolvoord, Bob

    2008-01-01

    Geospatial technologies of geographic information systems, global positioning systems, and remote sensing are just a few of the projects that evoke an unexpected drive and devotion from high school students in Virginia. Their integration into different curricular areas lets students focus on understanding their community and the many issues that…

  10. Geospatial Technologies and Higher Education in Argentina

    Science.gov (United States)

    Leguizamon, Saturnino

    2010-01-01

    The term "geospatial technologies" encompasses a large area of fields involving cartography, spatial analysis, geographic information system, remote sensing, global positioning systems and many others. These technologies should be expected to be available (as "natural tools") for a country with a large surface and a variety of…

  11. Teaching Tectonics to Undergraduates with Web GIS

    Science.gov (United States)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curriculum can support spatial learning. We present Web-based visualization and analysis tools developed with Javascript APIs to enhance tectonic curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploration of earthquake and volcano data sets, a subduction and elevation profile tool which facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS platform is independent and can be implemented on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation using two- and three- dimensional visualization and analytical software. Coverages which allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  12. Examining the Effect of Enactment of a Geospatial Curriculum on Students' Geospatial Thinking and Reasoning

    Science.gov (United States)

    Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara

    2014-08-01

    A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gain in energy content knowledge were significantly explanatory variables for their geospatial achievement at the end of curriculum enactment, p < .001. Teacher enactment factors, including adherence to implementing the critical components of the curriculum or the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide support that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.

  13. Gestión documental y de contenidos web: informe de situación

    OpenAIRE

    Saorín, Tomás; Pástor-Sánchez, Juan-Antonio

    2012-01-01

    Review of major 2011 developments in the field of bibliographic cataloguing, document management and documentary languages. The use of content management systems for web publishing, the emergence of new approaches such as web experience management and the integration of web productivity components are analyzed

  14. Building an Elastic Parallel OGC Web Processing Service on a Cloud-Based Cluster: A Case Study of Remote Sensing Data Processing Service

    Directory of Open Access Journals (Sweden)

    Xicheng Tan

    2015-10-01

Full Text Available Since the Open Geospatial Consortium (OGC) proposed the geospatial Web Processing Service (WPS), standard OGC Web Service (OWS)-based geospatial processing has become the major type of distributed geospatial application. However, improving the performance and sustainability of distributed geospatial applications has become the dominant challenge for OWSs. This paper presents the construction of an elastic parallel OGC WPS service on a cloud-based cluster, along with the design of a high-performance, cloud-based WPS service architecture, the scalability scheme of the cloud, and the algorithm for elastic parallel geoprocessing. Experiments on a remote sensing data processing service demonstrate that the proposed method provides a higher-performance WPS service that uses fewer computing resources. It can also help institutions reduce hardware costs, increase hardware utilization, and conserve energy, which is important in building green and sustainable geospatial services and applications.
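The core of elastic parallel geoprocessing is splitting one WPS request into tiles that a worker pool of adjustable size processes concurrently. A minimal sketch with a toy per-tile operation; a real elastic service would dispatch to cloud nodes and resize the pool on load, not run local threads:

```python
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile):
    """Toy per-tile operation standing in for a WPS geoprocess
    (e.g., a band computation applied to one raster tile)."""
    return [v * 2 for v in tile]

def parallel_process(tiles, workers):
    """Fan tiles out to a worker pool and reassemble results in order,
    mimicking the parallel execution of a single WPS Execute request."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_tile, tiles))

tiles = [[1, 2], [3, 4], [5, 6]]
out = parallel_process(tiles, workers=2)
# out == [[2, 4], [6, 8], [10, 12]]
```

The "elastic" part is then a policy layer that grows or shrinks `workers` (or the cluster itself) with the request queue length.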

  15. Generation of Multiple Metadata Formats from a Geospatial Data Repository

    Science.gov (United States)

    Hudspeth, W. B.; Benedict, K. K.; Scott, S.

    2012-12-01

The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general purpose web services based upon the REST service model, and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. title, abstract, publication data, geographic extent, projection information, and database elements), but also the processing steps used to
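Generating several metadata serializations from one internal record can be sketched with ElementTree; the element names below are drastically simplified stand-ins for the real ISO 19139 and FGDC CSDGM schemas, and the record itself is hypothetical:

```python
import xml.etree.ElementTree as ET

# Hypothetical internal record of core catalog elements
record = {"title": "MODIS NDVI composite", "abstract": "Example record."}

def to_simple_iso(rec):
    """Emit a drastically simplified ISO-19139-flavoured XML tree
    (real ISO metadata uses namespaced gmd:/gco: elements)."""
    root = ET.Element("MD_Metadata")
    ident = ET.SubElement(root, "identificationInfo")
    ET.SubElement(ident, "title").text = rec["title"]
    ET.SubElement(ident, "abstract").text = rec["abstract"]
    return ET.tostring(root, encoding="unicode")

def to_simple_fgdc(rec):
    """Emit a drastically simplified FGDC-CSDGM-flavoured XML tree."""
    root = ET.Element("metadata")
    idinfo = ET.SubElement(root, "idinfo")
    ET.SubElement(idinfo, "title").text = rec["title"]
    ET.SubElement(idinfo, "abstract").text = rec["abstract"]
    return ET.tostring(root, encoding="unicode")
```

The design point is that the persisted record is format-neutral, and each target schema gets its own serializer, so adding a new metadata format never touches the data store.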

  16. Lowering the barriers for accessing distributed geospatial big data to advance spatial data science: the PolarHub solution

    Science.gov (United States)

    Li, W.

    2017-12-01

Data is the crux of science. The widespread availability of big data today is of particular importance for fostering new forms of geospatial innovation. This paper reports a state-of-the-art solution that addresses a key cyberinfrastructure research problem—providing ready access to big, distributed geospatial data resources on the Web. We first formulate this data-access problem and introduce its indispensable elements, including identifying the cyber-location, space and time coverage, theme, and quality of the dataset. We then propose strategies to tackle each data-access issue and make the data more discoverable and usable for geospatial data users and decision makers. Among these strategies is large-scale web crawling as a key technique to support automatic collection of online geospatial data that are highly distributed, intrinsically heterogeneous, and known to be dynamic. To better understand the content and scientific meanings of the data, methods including space-time filtering, ontology-based thematic classification, and service quality evaluation are incorporated. To serve a broad scientific user community, these techniques are integrated into an operational data crawling system, PolarHub, which is also an important cyberinfrastructure building block to support effective data discovery. A series of experiments were conducted to demonstrate the outstanding performance of the PolarHub system. We expect this work to contribute significantly to building the theoretical and methodological foundation for data-driven geography and the emerging spatial data science.
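The space-time filtering step can be sketched as a bounding-box intersection test combined with a time-interval overlap test over crawled records; the dataset entries below are hypothetical, not PolarHub's internal model:

```python
def bbox_intersects(a, b):
    """Axis-aligned boxes as (min_lon, min_lat, max_lon, max_lat)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def time_overlaps(a, b):
    """Half-open (start, end) year intervals."""
    return a[0] < b[1] and b[0] < a[1]

def filter_datasets(datasets, bbox, period):
    """Keep crawled records matching both a spatial and a temporal window."""
    return [d["id"] for d in datasets
            if bbox_intersects(d["bbox"], bbox)
            and time_overlaps(d["period"], period)]

# Hypothetical crawled records
datasets = [
    {"id": "sea-ice", "bbox": (-180, 60, 180, 90), "period": (2002, 2016)},
    {"id": "sahel-ndvi", "bbox": (-20, 10, 40, 25), "period": (2000, 2010)},
]
hits = filter_datasets(datasets, bbox=(-180, 66, 180, 90), period=(2010, 2012))
# hits == ["sea-ice"]
```

Thematic classification and quality evaluation would then rank the survivors of this cheap geometric filter.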

  17. Catalogue of Meteor Showers and Storms in Korean History

    Directory of Open Access Journals (Sweden)

    Sang-Hyeon Ahn

    2004-03-01

Full Text Available We present a more complete and accurate catalogue of astronomical records of meteor showers and meteor storms that appeared in the primary official Korean history books Samguk-sagi, Koryo-sa, Seungjeongwon-ilgi, and Choson-Wangjo-Sillok. Until now the catalogue compiled by Imoto and Hasegawa in 1958 has been widely used in the international astronomical community. That catalogue is based on a report by Sekiguchi in 1917 that draws mainly on secondary history books. We found that it contains a number of errors in either the dates or the sources of the records. We have therefore thoroughly checked the primary official history books, instead of the secondary ones, in order to produce a corrected and extended catalogue. The catalogue contains 25 records of meteor storms, four records of intense meteor showers, and five records of ordinary showers in Korean history. We also find that some of those records seem to correspond to presently active meteor showers such as the Leonids, the Perseids, and the η-Aquarids-Orionids pair. However, a large number of the records do not correspond to such present showers. The catalogue we obtained can be useful for various astrophysical studies in the future.

  18. Updated earthquake catalogue for seismic hazard analysis in Pakistan

    Science.gov (United States)

    Khan, Sarfraz; Waseem, Muhammad; Khan, Muhammad Asif; Ahmed, Waqas

    2018-03-01

A reliable and homogenized earthquake catalogue is essential for seismic hazard assessment in any area. This article describes the compilation and processing of an updated earthquake catalogue for Pakistan. The earthquake catalogue compiled in this study for the region (a quadrangle bounded by the geographical limits 40-83° E and 20-40° N) includes 36,563 earthquake events, with reported moment magnitudes (Mw) of 4.0-8.3, spanning from 25 AD to 2016. Relationships are developed between moment magnitude and the body-wave and surface-wave magnitude scales to unify the catalogue in terms of Mw. The catalogue includes earthquakes from Pakistan and neighbouring countries to minimize the effects of geopolitical boundaries in seismic hazard assessment studies. Earthquakes reported by local and international agencies as well as individual catalogues are included. The proposed catalogue is further used to obtain the magnitude of completeness after removal of dependent events using four different algorithms. Finally, seismicity parameters of the seismic sources are reported, and recommendations are made for seismic hazard assessment studies in Pakistan.
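Magnitude homogenization of the kind described means converting reported body-wave (mb) and surface-wave (Ms) magnitudes to Mw through empirical linear relations. The coefficients below are illustrative, in the style of globally fitted relations such as Scordilis (2006), not the ones derived in this study:

```python
def to_mw(value, scale):
    """Convert a reported magnitude to moment magnitude Mw using
    illustrative linear relations; the coefficients are assumptions
    of the kind fitted from paired magnitude observations, not the
    relations derived in the paper."""
    if scale == "Mw":
        return value
    if scale == "mb":          # body-wave magnitude
        return 0.85 * value + 1.03
    if scale == "Ms":          # surface-wave magnitude
        return 0.67 * value + 2.07
    raise ValueError(f"no conversion for scale {scale!r}")

mw = to_mw(5.0, "mb")   # 0.85 * 5.0 + 1.03 = 5.28
```

Such relations only hold over the magnitude ranges they were fitted on, which is why a homogenized catalogue states the validity bounds of each conversion.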

  19. THROES: a caTalogue of HeRschel Observations of Evolved Stars. I. PACS range spectroscopy

    Science.gov (United States)

    Ramos-Medina, J.; Sánchez Contreras, C.; García-Lario, P.; Rodrigo, C.; da Silva Santos, J.; Solano, E.

    2018-03-01

This is the first of a series of papers presenting the THROES (A caTalogue of HeRschel Observations of Evolved Stars) project, intended to provide a comprehensive overview of the spectroscopic results obtained in the far-infrared (55-670 μm) with the Herschel space observatory on low-to-intermediate mass evolved stars in our Galaxy. Here we introduce the catalogue of interactively reprocessed Photoconductor Array Camera and Spectrometer (PACS) spectra covering the 55-200 μm range for 114 stars in this category for which PACS range spectroscopic data are available in the Herschel Science Archive (HSA). Our sample includes objects spanning a range of evolutionary stages, from the asymptotic giant branch to the planetary nebula phase, displaying a wide variety of chemical and physical properties. The THROES/PACS catalogue is accessible via a dedicated web-based interface and includes not only the science-ready Herschel spectroscopic data for each source, but also complementary photometric and spectroscopic data from other infrared observatories, namely IRAS, ISO, or AKARI, at overlapping wavelengths. Our goal is to create a legacy-value Herschel dataset that can be used by the scientific community in the future to deepen our knowledge and understanding of these latest stages of the evolution of low-to-intermediate mass stars. The THROES/PACS catalogue is accessible at https://throes.cab.inta-csic.es/

  20. TOPCAT: Tool for OPerations on Catalogues And Tables

    Science.gov (United States)

    Taylor, Mark

    2011-01-01

    TOPCAT is an interactive graphical viewer and editor for tabular data. Its aim is to provide most of the facilities that astronomers need for analysis and manipulation of source catalogues and other tables, though it can be used for non-astronomical data as well. It understands a number of different astronomically important formats (including FITS and VOTable) and more formats can be added. It offers a variety of ways to view and analyse tables, including a browser for the cell data themselves, viewers for information about table and column metadata, and facilities for 1-, 2-, 3- and higher-dimensional visualisation, calculating statistics and joining tables using flexible matching algorithms. Using a powerful and extensible Java-based expression language new columns can be defined and row subsets selected for separate analysis. Table data and metadata can be edited and the resulting modified table can be written out in a wide range of output formats. It is a stand-alone application which works quite happily with no network connection. However, because it uses Virtual Observatory (VO) standards, it can cooperate smoothly with other tools in the VO world and beyond, such as VODesktop, Aladin and ds9. Between 2006 and 2009 TOPCAT was developed within the AstroGrid project, and is offered as part of a standard suite of applications on the AstroGrid web site, where you can find information on several other VO tools. The program is written in pure Java and available under the GNU General Public Licence. It has been developed in the UK within the Starlink and AstroGrid projects, and under PPARC and STFC grants. Its underlying table processing facilities are provided by STIL.
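TOPCAT's table joins include sky crossmatching; a minimal nearest-neighbour match within an angular tolerance, using the haversine separation, gives the flavour (a sketch, not STIL's actual matching algorithm, which uses spatial indexing rather than this O(n*m) scan):

```python
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine form)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    h = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(h)))

def crossmatch(cat_a, cat_b, tol_deg):
    """Pair each (id, ra, dec) row of cat_a with the nearest cat_b row
    lying within tol_deg; unmatched rows are dropped."""
    pairs = []
    for ia, ra, dec in cat_a:
        best = min(cat_b, key=lambda row: ang_sep_deg(ra, dec, row[1], row[2]))
        if ang_sep_deg(ra, dec, best[1], best[2]) <= tol_deg:
            pairs.append((ia, best[0]))
    return pairs

# Hypothetical catalogues; tolerance of 5 arcseconds
pairs = crossmatch([("a1", 10.0, 20.0)],
                   [("b1", 10.0004, 20.0003), ("b2", 11.0, 20.0)],
                   tol_deg=5 / 3600)
# pairs == [("a1", "b1")]
```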

  1. The Sloan Digital Sky Survey QSO absorption line catalogue

    Science.gov (United States)

    York, Donald G.; vanden Berk, Daniel; Richards, Gordon T.; Crotts, Arlin P. S.; Khare, Pushpa; Lauroesch, James; Lemoine, Martin; Burles, Scott; Bernardi, Mariangela; Castander, Francisco J.; Frieman, Josh; Loveday, Jon; Meiksin, Avery; Nichol, Robert; Schlegel, David; Schneider, Donald P.; Subbarao, Mark; Stoughton, Chris; Szalay, Alex; Yanny, Brian; Alsayyad, Yusra; Kumar, Abhishek; Lundgren, Britt; Shanidze, Natela; Vanlandingham, Johnny; Wood, Matthew; Baugher, Britt; Brinkmann, Jon; Brunner, Robert; Fukugita, Masaaka; Hall, Patrick B.; Heckman, Timothy M.; Hobbs, Lewis M.; Hogan, Craig J.; Hui, Lam; Jenkins, Edward B.; Kunstz, Daniel; Menard, Brice; Nakamura, Osamu; Quashnock, Jean M.; Stein, Michael; Thakar, Aniruddha R.; Turnshek, David; Welty, Daniel E.; SDSS Collaboration

    2005-03-01

    The spectra of the Sloan Digital Sky Survey (SDSS) are being used to construct a catalogue of QSO absorption lines, for use in studies of abundances, relevant radiation fields, number counts as a function of redshift, and other matters, including the evolution of these parameters. The catalogue includes intervening, associated, and BAL absorbers, in order to allow a clearer definition of the relationships between these three classes. We describe the motivation for and the data products of the project to build the SDSS QSO absorption line catalogue.

  2. Planck 2013 results. XXVIII. The Planck Catalogue of Compact Sources

    DEFF Research Database (Denmark)

    Planck Collaboration,; Ade, P. A. R.; Aghanim, N.

    2013-01-01

    The Planck Catalogue of Compact Sources (PCCS) is the catalogue of sources detected in the Planck nominal mission data. It consists of nine single-frequency catalogues of compact sources containing reliable sources, both Galactic and extragalactic, detected over the entire sky. The PCCS covers...... the frequency range between 30--857 GHz with higher sensitivity (it is 90% complete at 180 mJy in the best channel) and better angular resolution (from ~33' to ~5') than previous all-sky surveys in the microwave band. By definition its reliability is >80% and more than 65% of the sources have been detected...

  3. Analysis of the seismic catalogues for the Vrancea Region, Romania

    International Nuclear Information System (INIS)

    Romashkova, L.L.; Kossobokov, V.G.

    2005-11-01

    Vrancea (Romania) is a geographical region between the Eastern and Southern Carpathian Mountains. The region is characterized by a rather high level of seismic activity, mainly at intermediate (up to 200 km) depths. These intermediate-depth earthquakes occur between 45 deg-46 deg N and 26 deg-27 deg E. The shallow earthquakes are dispersed over a much broader territory. We performed a comparative analysis of the earthquake catalogues available for the Vrancea region, aiming to compile a data set as complete and homogeneous as possible, which will hopefully be used for the prediction of strong and possibly moderate earthquakes in the region by means of the M8 algorithm. The two catalogues under study are: 1) the Global Hypocenter Data Base catalogue, NEIC (GHDB, 1989), and 2) the local Vrancea seismic catalogue (Moldoveanu et al., 1995), and their updates. (author)
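    A comparison of this kind hinges on deciding when two catalogues record the same event. A minimal sketch of such duplicate matching, assuming hypothetical event tuples and crude time/distance tolerances (not the authors' actual criteria):

```python
from datetime import datetime, timedelta
from math import radians, cos, sqrt

# Hypothetical event records: (origin time, lat, lon, magnitude).
ghdb = [(datetime(1986, 8, 30, 21, 28), 45.55, 26.32, 7.1)]
local = [(datetime(1986, 8, 30, 21, 28, 37), 45.52, 26.49, 7.0),
         (datetime(1990, 5, 30, 10, 40), 45.83, 26.89, 6.9)]

def same_event(a, b, dt=timedelta(minutes=1), dd=0.5):
    """Crude duplicate test: origin times within dt and epicentres
    within dd degrees (flat-Earth approximation)."""
    close_t = abs(a[0] - b[0]) <= dt
    dlat = a[1] - b[1]
    dlon = (a[2] - b[2]) * cos(radians(a[1]))
    return close_t and sqrt(dlat ** 2 + dlon ** 2) <= dd

# Merge: keep every GHDB event, add local events not already present.
merged = list(ghdb)
merged += [e for e in local if not any(same_event(e, g) for g in ghdb)]
print(len(merged))  # the 1986 event matches once, so 2 events in total
```

A production merge would of course weigh magnitudes, depths and location uncertainties as well; the point here is only the shape of the deduplication step.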

  4. Energy research projects in the Nordic countries - catalogue 1983

    International Nuclear Information System (INIS)

    1983-01-01

    The Nordic energy ministers, at their meeting of February 9, 1982, agreed upon a working plan for Nordic energy cooperation. As part of this plan a contact group was established in order to maintain coordination and cooperation within the area of energy research and development. This group decided in April 1982 to establish a catalogue of energy research projects in the Nordic countries. A pilot catalogue was published in June 1982. The 1983 catalogue gives an up-to-date survey of energy research and development projects in the Nordic countries. About 2125 projects are described, and information is given on investigator(s), performing organization, financing body, funds, and period. The catalogue is prepared by the Nordic energy libraries through their cooperation in the Nordic Atomic Libraries Joint Secretariat. The information is also included in the database Nordic Energy Index (NEI), which is accessible online at I/S Datacentralen, Copenhagen, via EURONET, SCANNET, TYMNET, and TELENET. (BP)

  5. An All-Sky Portable (ASP) Optical Catalogue

    Science.gov (United States)

    Flesch, Eric Wim

    2017-06-01

    This optical catalogue combines the all-sky USNO-B1.0/A1.0 and most-sky APM catalogues, plus overlays of SDSS optical data, into a single all-sky map presented in a sparse binary format that is easily downloaded at 9 GB zipped. The total count is 1 163 237 190 sources, and each has J2000 astrometry, red and blue magnitudes with PSFs and a variability indicator, and flags for proper motion, epoch, and source survey and catalogue for each of the photometry and astrometry. The catalogue is available at http://quasars.org/asp.html, and additional data for this paper are available at http://dx.doi.org/10.4225/50/5807fbc12595f.
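    A fixed-width binary catalogue format of this general kind can be read and written with Python's struct module. The record layout below (RA, Dec, two magnitudes, one flags byte) is a simplified, hypothetical stand-in and does not reproduce the ASP catalogue's real record structure:

```python
import struct

# Hypothetical per-source record: RA, Dec (J2000, degrees) as doubles,
# red and blue magnitudes as 32-bit floats, and one flags byte.
record = struct.Struct("<ddffB")  # 8 + 8 + 4 + 4 + 1 = 25 bytes per source

packed = record.pack(187.70593, 12.39112, 12.9, 13.7, 0b00000101)
assert len(packed) == record.size

ra, dec, red, blue, flags = record.unpack(packed)
print(round(red, 1), flags)  # 12.9 5
```

Packing millions of such fixed-width records, with flags folded into single bytes, is what keeps formats like this compact enough to distribute as a single download.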

  6. CMS offline web tools

    International Nuclear Information System (INIS)

    Metson, S; Newbold, D; Belforte, S; Kavka, C; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Tuura, L; Evans, D; Fanfani, A; Feichtinger, D; Kuznetsov, V; Lingen, F van; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools

  7. Catalogue of chemical and isotopic nuclear reference materials

    International Nuclear Information System (INIS)

    Le Duigou, Y.

    1980-01-01

    An inventory of available reference materials for chemical, isotopic and trace element analyses in the nuclear field has been established. Emphasis has been given to substances containing uranium and plutonium. Reference values, sample sizes, prices and addresses of suppliers are indicated. The present catalogue is a revised version of the EUR report 5229e entitled 'Catalogue of reference materials of interest to nuclear energy' (1974)

  8. The SHARE European Earthquake Catalogue (SHEEC) 1000–1899

    OpenAIRE

    Stucchi, M.; Rovida, A.; Gomez Capera, A. A.; Alexandre, P.; Camelbeeck, T.; Demircioglu, M. B.; Gasperini, P.; Kouskouna, V.; Musson, R. M. W.; Radulian, M.; Sesetyan, K.; Vilanova, S.; Baumont, D.; Bungum, H.; Fäh, D.

    2012-01-01

    In the frame of the European Commission project “Seismic Hazard Harmonization in Europe” (SHARE), aiming at harmonizing seismic hazard at a European scale, the compilation of a homogeneous, European parametric earthquake catalogue was planned. The goal was to be achieved by considering the most updated historical dataset and assessing homogenous magnitudes, with support from several institutions. This paper describes the SHARE European Earthquake Catalogue (SHEEC), ...

  9. FishTrace: a genetic catalogue of European fishes

    OpenAIRE

    ZANZI ANTONELLA; MARTINSOHN JANN

    2017-01-01

    Abstract FishTrace is a genetic catalogue for species identification associated to reference collections of taxonomically identified vouchers from more than 200 commercial marine fish species. The main purpose of the genetic catalogue is to enable reliable species identification for research purposes as well as in support of traceability schemes under the remit of food and feed laws. A major asset of FishTrace is that all genetic data are linked to biological collections of vouchers, that is ...

  10. Research and Practical Trends in Geospatial Sciences

    Science.gov (United States)

    Karpik, A. P.; Musikhin, I. A.

    2016-06-01

    In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and rapid evolution of geospatial science. Almost all professional communities have been affected. Communities are embracing digital techniques, modern equipment, software and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of those trends that will be in great demand in 3-7 years. This paper reviews current scientific and practical activities of such non-governmental international organizations as International Federation of Surveyors, International Cartographic Association, and International Society for Photogrammetry and Remote Sensing, analyzes and groups most relevant topics brought up at their scientific events, forecasts most probable research and practical trends in geospatial sciences, outlines topmost leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation and joint implementation projects.

  11. RESEARCH AND PRACTICAL TRENDS IN GEOSPATIAL SCIENCES

    Directory of Open Access Journals (Sweden)

    A. P. Karpik

    2016-06-01

    Full Text Available In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and rapid evolution of geospatial science. Almost all professional communities have been affected. Communities are embracing digital techniques, modern equipment, software and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of those trends that will be in great demand in 3-7 years. This paper reviews current scientific and practical activities of such non-governmental international organizations as International Federation of Surveyors, International Cartographic Association, and International Society for Photogrammetry and Remote Sensing, analyzes and groups most relevant topics brought up at their scientific events, forecasts most probable research and practical trends in geospatial sciences, outlines topmost leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation and joint implementation projects.

  12. Dark Energy Survey Year 1 Results: Weak Lensing Shape Catalogues

    Energy Technology Data Exchange (ETDEWEB)

    Zuntz, J.; et al.

    2017-08-04

    We present two galaxy shape catalogues from the Dark Energy Survey Year 1 data set, covering 1500 square degrees with a median redshift of $0.59$. The catalogues cover two main fields: Stripe 82, and an area overlapping the South Pole Telescope survey region. We describe our data analysis process and in particular our shape measurement using two independent shear measurement pipelines, METACALIBRATION and IM3SHAPE. The METACALIBRATION catalogue uses a Gaussian model with an innovative internal calibration scheme, and was applied to $riz$-bands, yielding 34.8M objects. The IM3SHAPE catalogue uses a maximum-likelihood bulge/disc model calibrated using simulations, and was applied to $r$-band data, yielding 21.9M objects. Both catalogues pass a suite of null tests that demonstrate their fitness for use in weak lensing science. We estimate the 1$\\sigma$ uncertainties in multiplicative shear calibration to be $0.013$ and $0.025$ for the METACALIBRATION and IM3SHAPE catalogues, respectively.
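    The quoted multiplicative calibration uncertainties enter shear estimates through the standard multiplicative-bias model g_true = g_obs / (1 + m). The sketch below applies that model with invented example values for the observed shear and bias; the 1-sigma value comes from the abstract, but this is not code from the METACALIBRATION or IM3SHAPE pipelines:

```python
# Correct a measured shear for a multiplicative bias m, and propagate
# the 1-sigma calibration uncertainty sigma_m to first order.
def calibrate(g_obs, m, sigma_m):
    g = g_obs / (1.0 + m)
    # d g / d m = -g_obs / (1 + m)^2, so |delta g| ~ |g| * sigma_m / (1 + m)
    sigma_g = abs(g) * sigma_m / (1.0 + m)
    return g, sigma_g

# Example values: g_obs and m are invented; 0.013 is the quoted
# METACALIBRATION multiplicative calibration uncertainty.
g, sg = calibrate(0.02, 0.01, 0.013)
print(g, sg)
```

For a percent-level shear signal the calibration uncertainty is thus itself at the few-times-1e-4 level, which is why the paper reports it as a headline systematic.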

  13. Promenade Among Words and Things: The Gallery as Catalogue, the Catalogue as Gallery

    Directory of Open Access Journals (Sweden)

    Mari Lending

    2015-12-01

    Full Text Available In the mid-nineteenth century, new casting techniques allowed for the production of huge architectural fragments. Well-selected collections could ideally display perfect series in galleries in which the visitor could wander among monuments and experience architectural history at full scale. The disembodied material of plaster was considered capable of embodying a number of modern historical taxonomies and aesthetic programs, most importantly chronology, comparison, style, and evolution. Veritable showcases of historicism, the casts could illustrate in spatial arrangements new conceptions of the history, contemporaneity and future of architecture. The plaster cast became a main medium in which to publish antiquities as novelties for grand audiences, taking the printed and published beyond the two-dimensional space of words and images. However, due to the increasing market for casts and their sheer size and weight, the reproductions as mounted in the galleries often behaved as unruly as architecture does outside curatorial control. In the end only the catalogues, the paper versions of these imaginary museums, were capable of creating the orders that their plaster referents constantly aspired to destroy. An important chapter in the history of the architecture museum, these plaster monuments belong to a part of architectural print culture in which catalogues were curated and galleries edited. Metaphors drawn from the realm of writing saturated the discourse on the display of casts. Images and texts fluctuated, and the image-objects were compared to books, paper, pages, documents and libraries, but above all to illustrations inviting promenades in time and space.

  14. TOWARDS IMPLEMENTATION OF THE FOG COMPUTING CONCEPT INTO THE GEOSPATIAL DATA INFRASTRUCTURES

    Directory of Open Access Journals (Sweden)

    E. A. Panidi

    2016-01-01

    Full Text Available Information technologies, and Global Network technologies in particular, are developing very quickly. As a consequence, the incorporation of these general-purpose technologies into information systems that operate with geospatial data remains a pressing problem. The paper discusses the implementation feasibility of a number of new approaches and concepts that solve the problems of spatial data publishing and management on the Global Network. A brief review describes some contemporary concepts and technologies used for distributed data storage and management, which provide combined use of server-side and client-side resources. In particular, the concepts of Cloud Computing, Fog Computing, and the Internet of Things are mentioned, together with the Java Web Start, WebRTC and WebTorrent technologies. The author's experience is described briefly, encompassing a number of projects devoted to the development of portable solutions for geospatial data and GIS software publication on the Global Network.

  15. Catalogue of Life: 2013 Annual Checklist

    Science.gov (United States)

    Nicolson, David T.; Roskov, Yuri; Kunze, Thomas; Paglinawan, Luvie; Orrell, Thomas; Culham, Alistair; Bailly, Nicolas; Kirk, Paul; Bourgoin, Thierry; Baillargeon, Guy; Hernandez, Franciso; De Wever, Aaike

    2013-01-01

    The most comprehensive and authoritative global index of species currently available, it consists of a single integrated species checklist and taxonomic hierarchy. It is available (1) as a DVD and booklet; and (2) on the Web. The contact for the booklet and DVD is Thomas Orrell at the Smithsonian Institution, Washington,DC. The URL for the online version is http://www.catalogueoflife.org/annual-checklist/2013/info/ac

  16. The library catalogue as a retrieval tool in academic libraries: a case ...

    African Journals Online (AJOL)

    The study revealed that the low awareness of catalogue use as a retrieval tool led to other i dentified causes of low utilization of library catalogue in retrieving information. Amongst the recommendations for enhanced use of the library catalogue were: practical application on the use of library catalogue to be emphasized and ...

  17. The Development of an Interoperable Open Source Geographic Information Technology Stack for Ingest, Management, and Delivery of Earth Observation and Geospatial Products

    Science.gov (United States)

    Benedict, K. K.; Sanchez-Silva, R.; Cavner, J. A.; Hudspeth, W. B.

    2009-12-01

    The rapid growth of geospatial data volume and number of sources has highlighted the need for, and spurred the growth and adoption of, interoperable geospatial data services. For nearly a decade the Earth Data Analysis Center at The University of New Mexico has been developing standards-based geospatial data management systems built on a core collection of Open Source technologies, with the collection of employed technologies contributing to a unified information architecture enabled by interoperability standards. These technologies include geodatabases (PostGIS), geospatial data access libraries and associated utility programs (GDAL and OGR), scripting languages that enable automated data processing and management (Python), online mapping servers (MapServer), online mapping clients (OpenLayers, MapFish, GeoExt), and desktop GIS applications (uDig, QGIS, and GRASS). The interoperability standards upon which EDAC's geospatial information architectures are built include those coming out of the Open Geospatial Consortium (WMS, WFS, WCS, KML, GML), the World Wide Web Consortium (HTML, CSS, SOAP, XML), and ECMA (ECMAScript, a.k.a. JavaScript). This paper outlines the complementary roles that these various Open Source applications play in the multi-tiered Service-Oriented Architectures developed by EDAC in support of a variety of projects, and provides an illustration of how the capabilities enabled by these technologies are interconnected using well-defined open standards. These capabilities include data ingest and query services that support searching for data content based upon keywords and defined spatial extent. They also include data administration services that support data product ingest and registration, data product modification, and deletion from the data registry. Finally, the system supports dynamic generation of Open Geospatial Consortium services for each geospatial data product in the system, enabling integration of data from the system into a wide variety
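    The interoperability that such OGC-standard services provide comes down to well-known request shapes. As a concrete illustration, a WMS 1.1.1 GetMap request can be assembled with nothing but the Python standard library; the endpoint and layer name below are placeholders, not EDAC's actual service addresses:

```python
from urllib.parse import urlencode

# Build an OGC WMS 1.1.1 GetMap request URL from its standard parameters.
def getmap_url(base, layer, bbox, size=(512, 512), srs="EPSG:4326"):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = getmap_url("https://example.org/wms", "landcover",
                 (-109.05, 31.33, -103.00, 37.00))
print(url)
```

Because every WMS server accepts this same parameter vocabulary, any client that can emit such a URL can pull map layers from any compliant server, which is the interoperability point the paper makes.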

  18. OGC® Sensor Web Enablement Standards

    Directory of Open Access Journals (Sweden)

    George Percivall

    2006-09-01

    Full Text Available This article provides a high-level overview of and architecture for the Open Geospatial Consortium (OGC) standards activities that focus on sensors, sensor networks, and a concept called the “Sensor Web”. This OGC work area is known as Sensor Web Enablement (SWE). This article has been condensed from "OGC® Sensor Web Enablement: Overview And High Level Architecture," an OGC White Paper by Mike Botts, PhD, George Percivall, Carl Reed, PhD, and John Davidson, which can be downloaded from http://www.opengeospatial.org/pt/15540. Readers interested in greater technical and architectural detail can download and read the OGC SWE Architecture Discussion Paper titled “The OGC Sensor Web Enablement Architecture” (OGC document 06-021r1, http://www.opengeospatial.org/pt/14140).
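    SWE services exchange sensor observations as XML documents. The snippet below parses a much-simplified observation with the standard library; real SWE documents use namespaced SensorML/O&M schemas, which this sketch deliberately does not reproduce, and all element names here are illustrative:

```python
import xml.etree.ElementTree as ET

# A heavily simplified, O&M-flavoured observation document (invented
# element names; real SWE XML is namespaced and far richer).
doc = """
<Observation>
  <procedure>urn:example:sensor:thermometer-1</procedure>
  <observedProperty>air_temperature</observedProperty>
  <result uom="Cel">21.4</result>
</Observation>
"""

root = ET.fromstring(doc)
value = float(root.find("result").text)
unit = root.find("result").get("uom")
print(value, unit)  # 21.4 Cel
```

The separation of procedure (which sensor), observed property (what was measured), and result (value plus unit of measure) mirrors the structure the SWE standards formalise.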

  19. The Raincoast eCatalogue: the creation of an electronic catalogue as a supplemental selling tool for sales representatives

    OpenAIRE

    Kemp, Elizabeth Anne

    2011-01-01

    Raincoast Books Distribution Ltd. is a Canadian book distributor that provides sales, marketing and distribution services for a number of international and Canadian publishers. Each publishing season Raincoast Books distributes approximately 25,000 paper catalogues to sales representatives and retail accounts. Traditional paper catalogues have major disadvantages including their static format, high cost of production and distribution, inclusion of frontlist titles only and environmental impac...

  20. A Geospatial Cyberinfrastructure for Urban Economic Analysis and Spatial Decision-Making

    Directory of Open Access Journals (Sweden)

    Michael F. Goodchild

    2013-05-01

    Full Text Available Urban economic modeling and effective spatial planning are critical tools towards achieving urban sustainability. However, in practice, many technical obstacles, such as information islands, poor documentation of data and lack of software platforms to facilitate virtual collaboration, are challenging the effectiveness of decision-making processes. In this paper, we report on our efforts to design and develop a geospatial cyberinfrastructure (GCI) for urban economic analysis and simulation. This GCI provides an operational graphic user interface, built upon a service-oriented architecture, to allow (1) widespread sharing and seamless integration of distributed geospatial data; (2) an effective way to address the uncertainty and positional errors encountered in fusing data from diverse sources; (3) the decomposition of complex planning questions into atomic spatial analysis tasks and the generation of a web service chain to tackle such complex problems; and (4) capturing and representing provenance of geospatial data to trace its flow in the modeling task. The Greater Los Angeles Region serves as the test bed. We expect this work to contribute to effective spatial policy analysis and decision-making through the adoption of advanced GCI and to broaden the application coverage of GCI to include urban economic simulations.
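    The decomposition of a planning question into atomic tasks chained together (point 3 above) can be pictured as a pipeline in which each step consumes the previous step's output. The sketch below is a purely local stand-in for a web service chain; the step functions, tract codes, and rent figures are all invented for illustration:

```python
# Atomic analysis steps, each a small function (stand-ins for services).
def clip_to_region(parcels, region):
    return [p for p in parcels if p["tract"] == region]

def attach_rents(parcels, rents):
    return [{**p, "rent": rents[p["id"]]} for p in parcels]

def mean_rent(parcels):
    return sum(p["rent"] for p in parcels) / len(parcels)

# The "service chain": run the steps in order, piping results along.
chain = [
    lambda d: clip_to_region(d, "LA-001"),
    lambda d: attach_rents(d, {1: 1800, 2: 2400}),
    mean_rent,
]

data = [{"id": 1, "tract": "LA-001"}, {"id": 2, "tract": "LA-001"},
        {"id": 3, "tract": "LA-002"}]
for step in chain:
    data = step(data)
print(data)  # 2100.0
```

In the GCI itself each step would be a remote geospatial web service and the chain would also carry provenance metadata (point 4); the composition pattern is the same.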

  1. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    Science.gov (United States)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    Geoscience in the 21st century faces the challenges of Big Data, spiking computing requirements (e.g., when a natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discovery. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform-as-a-service (PaaS) packages. Based on these developed common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This

  2. Geospatial considerations for a multiorganizational, landscape-scale program

    Science.gov (United States)

    O'Donnell, Michael S.; Assal, Timothy J.; Anderson, Patrick J.; Bowen, Zachary H.

    2013-01-01

    Geospatial data play an increasingly important role in natural resources management, conservation, and science-based projects. The management and effective use of spatial data becomes significantly more complex when the efforts involve a myriad of landscape-scale projects combined with a multiorganizational collaboration. There is sparse literature to guide users on this daunting subject; therefore, we present a framework of considerations for working with geospatial data that will provide direction to data stewards, scientists, collaborators, and managers for developing geospatial management plans. The concepts we present apply to a variety of geospatial programs or projects, which we describe as a “scalable framework” of processes for integrating geospatial efforts with management, science, and conservation initiatives. Our framework includes five tenets of geospatial data management: (1) the importance of investing in data management and standardization, (2) the scalability of content/efforts addressed in geospatial management plans, (3) the lifecycle of a geospatial effort, (4) a framework for the integration of geographic information systems (GIS) in a landscape-scale conservation or management program, and (5) the major geospatial considerations prior to data acquisition. We conclude with a discussion of future considerations and challenges.

  3. Addressing the Challenge: Cataloguing Electronic Books in Academic Libraries

    Directory of Open Access Journals (Sweden)

    Shuzhen Zhao

    2010-03-01

    Full Text Available Objective – This paper explores the various issues and challenges arising from e-book cataloguing experienced at the University of Windsor’s Leddy Library and the Ontario Council of University Libraries (OCUL). This discussion uses an evidence-based approach to identify and resolve issues relevant to academic libraries as well as to consortia. With the ever-rising popularity of e-books within academic libraries, cataloguing librarians are actively seeking more effective methods of managing this new electronic medium, including the development of new cataloguing policies and procedures. This paper will explore the various issues and challenges surrounding e-book cataloguing and processing within academic libraries, and will identify new policies and procedures that may be used to effectively assist in e-book management. Methods – This paper presents a case study of e-book cataloguing practices undertaken by a Canadian academic library and the consortium with which it is affiliated. Towards this end, the University of Windsor’s Leddy Library will be the prime focus of this study, with its establishment of a new e-book MARC records database. The research is based on the results of the e-book MARC project undertaken by the Leddy Library and the Ontario Council of University Libraries (OCUL). Through analysis of various suppliers’ MARC records and the actual implementation of the e-book MARC project, the authors developed and evaluated a new approach to e-book cataloguing for use in academic libraries. Results – This practice-based approach towards the development of a new method of e-book cataloguing required continual modification and examination of e-book MARC records within the target library. The Leddy Library’s e-book MARC project provided an excellent opportunity to test the library’s existing cataloguing standards and procedures for print format, while at the same time identifying related e-book issues
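    Evaluating supplier-provided MARC records typically starts with a completeness check on required fields. The sketch below represents records as plain dicts of tag to value, which is only a stand-in for real MARC 21 structures (a library such as pymarc would be used in practice); the field choices, such as 245 for the title and 856 for the access URL, follow MARC 21 conventions but the records themselves are invented:

```python
# Required MARC fields for an e-book record in this illustrative check:
# 245 (title statement) and 856 (electronic location and access).
REQUIRED = {"245": "title", "856": "access URL"}

def missing_fields(record):
    """Return human-readable names of required fields the record lacks."""
    return [name for tag, name in REQUIRED.items() if tag not in record]

records = [
    {"245": "Digital libraries /", "856": "http://example.org/ebook/1"},
    {"245": "Cataloguing e-books /"},  # no access URL: unusable as an e-book record
]

problems = [missing_fields(r) for r in records]
print(problems)  # [[], ['access URL']]
```

Batch checks of this shape are one way the "continual modification and examination" of supplier records described above can be partly automated.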

  4. Biases in cometary catalogues and Planet X

    Science.gov (United States)

    Horner, J.; Evans, N. W.

    2002-09-01

    catalogues.

  5. Catalogue of ISO LWS observations of asteroids

    Science.gov (United States)

    Hormuth, F.; Müller, T. G.

    2009-04-01

    Context: The long wavelength spectrometer (LWS) onboard the infrared space observatory (ISO) observed the four large main-belt asteroids (1) Ceres, (2) Pallas, (4) Vesta, and (10) Hygiea multiple times. The photometric and spectroscopic data cover the wavelength range between 43 and 197 μm, and are a unique dataset for future investigations and detailed characterisations of these bodies. Aims: The standard ISO archive products, produced through the last post-mission LWS pipeline, were still affected by instrument artefacts. Our goal was to provide the best possible data products to exploit the full scientific potential of these observations. Methods: For all asteroid observations we analysed in detail the dark current, the calibration reference flashes, the space environment effects (glitches), memory effects, tracking influences, and various other sources of uncertainty. We performed a refined reduction of all measurements, corrected for the various effects, and re-calibrated the data. We outline the data reduction process and give an overview of the available data and the quality of the observations. We apply a thermophysical model to the flux measurements to derive far-IR based diameter and albedo values of the asteroids. The measured thermal rotational lightcurve of (4) Vesta is compared to model predictions. Results: The catalogue of LWS (long wavelength spectrometer) observations of asteroids contains 57 manually reduced datasets, including seven non-standard observations, which as such did not have final pipeline products available before. In total, the archive now contains 11 spectral scans and 46 fixed grating measurements with a simultaneous observation at 10 key wavelengths distributed over the full LWS range. The new data products are now accessible via the ISO data archive as highly processed data products (HPDP). Conclusions: The quality of the data products was checked against state-of-the-art thermophysical model predictions and an excellent

  6. The effects of geography lessons with geospatial technologies on the development of high school students' relational thinking

    NARCIS (Netherlands)

    Favier, Tim; van der Schee, Joop

    Geospatial technologies offer access to geospatial information via digital representations, such as digital maps, and tools for interaction with those representations. The question is whether geography lessons with geospatial technologies really contribute to the development of students' geospatial

  7. The effects of geography lessons with geospatial technologies on the development of high school students' relational thinking

    NARCIS (Netherlands)

    Favier, T.T.; van der Schee, J.A.

    2014-01-01

    Geospatial technologies offer access to geospatial information via digital representations, such as digital maps, and tools for interaction with those representations. The question is whether geography lessons with geospatial technologies really contribute to the development of students' geospatial

  8. Web Map Services (WMS) Global Mosaic

    Science.gov (United States)

    Percivall, George; Plesea, Lucian

    2003-01-01

    The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a mosaic of Landsat 7 scenes, geographically accurate with 30 and 15 meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in their geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.
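The WMS interface mentioned above is a simple HTTP request protocol. The sketch below assembles a standard WMS 1.1.1 GetMap request URL; the endpoint and layer name are hypothetical placeholders, while the query parameters (SERVICE, VERSION, REQUEST, LAYERS, SRS, BBOX, WIDTH, HEIGHT, FORMAT) are those defined by the WMS specification.

```python
from urllib.parse import urlencode

def build_getmap_url(base_url, layer, bbox, width, height,
                     srs="EPSG:4326", fmt="image/jpeg"):
    """Assemble an OGC WMS 1.1.1 GetMap request URL.

    `base_url` and `layer` are illustrative; the parameter names are
    fixed by the WMS specification. `bbox` is (minx, miny, maxx, maxy)
    in the units of `srs`.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# hypothetical whole-Earth request against an example server
url = build_getmap_url("https://example.org/wms", "global_mosaic",
                       (-180, -90, 180, 90), 1024, 512)
```

Any WMS-capable client could issue such a request and composite the returned image with its own layers, which is exactly the interoperability the mosaic was built to demonstrate.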

  9. Maximum likelihood random galaxy catalogues and luminosity function estimation

    Science.gov (United States)

    Cole, Shaun

    2011-09-01

    We present a new algorithm to generate a random (unclustered) version of a magnitude-limited observational galaxy redshift catalogue. It takes into account both galaxy evolution and the perturbing effects of large-scale structure. The key to the algorithm is a maximum likelihood (ML) method for jointly estimating both the luminosity function (LF) and the overdensity as a function of redshift. The random catalogue algorithm then works by cloning each galaxy in the original catalogue, with the number of clones determined by the ML solution. Each of these cloned galaxies is then assigned a random redshift uniformly distributed over the accessible survey volume, taking account of the survey magnitude limit(s) and, optionally, both luminosity and number density evolution. The resulting random catalogues, which can be employed in traditional estimates of galaxy clustering, make fuller use of the information available in the original catalogue and hence are superior to simply fitting a functional form to the observed redshift distribution. They are particularly well suited to studies of the dependence of galaxy clustering on galaxy properties as each galaxy in the random catalogue has the same list of attributes as measured for the galaxies in the genuine catalogue. The derivation of the joint overdensity and LF estimator reveals the limit in which the ML estimate reduces to the standard 1/Vmax LF estimate, namely when one makes the prior assumption that there are no fluctuations in the radial overdensity. The new ML estimator can be viewed as a generalization of the 1/Vmax estimate in which Vmax is replaced by a density-corrected Vdc,max.
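The cloning step described above can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: the per-galaxy `zmax` (the maximum redshift at which the galaxy would still pass the flux limit) and the number of clones are assumed given, whereas in the paper they come out of the ML fit, and a low-redshift Euclidean volume (V ∝ z³) stands in for the proper comoving volume.

```python
import random

def random_catalogue(galaxies, n_clones, z_survey_max, seed=0):
    """Clone each observed galaxy and assign each clone a redshift
    drawn uniformly in volume over the range accessible to that
    galaxy under the survey magnitude limit.

    `galaxies`: list of dicts, each with a 'zmax' field (assumed
    precomputed). All other attributes are copied to the clones,
    which is what makes such randoms useful for property-dependent
    clustering studies.
    """
    rng = random.Random(seed)
    randoms = []
    for gal in galaxies:
        zmax = min(gal["zmax"], z_survey_max)
        for _ in range(n_clones):
            clone = dict(gal)  # clone keeps the full attribute list
            # uniform in Euclidean volume: P(<z) ∝ z^3, so invert the CDF
            clone["z"] = zmax * rng.random() ** (1.0 / 3.0)
            randoms.append(clone)
    return randoms

cat = [{"id": 1, "zmax": 0.2, "mag": 17.1},
       {"id": 2, "zmax": 0.35, "mag": 16.4}]
rand = random_catalogue(cat, n_clones=100, z_survey_max=0.3)
```

Because every clone inherits its parent's attributes, the random catalogue can be cut on any galaxy property in exactly the same way as the genuine catalogue.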

  10. Completeness and accuracy of WWW-based catalogues of medical E-learning modules.

    Science.gov (United States)

    Stausberg, Jürgen; Bludssat, Kevin; Geueke, Martin

    2005-09-01

    Catalogues of medical E-learning modules offer easy access to learning material that is available on the World Wide Web (WWW) free of charge. Therefore, a study was conducted on the retrieval performance of four WWW-based catalogues: CAL reviews, KELDAmed, LMU, and LRSMed. LRSMed is run by the authors. Completeness was calculated pairwise. Two reviewers checked the availability of 80 modules independently. Five criteria were chosen to calculate accuracy: gynaecology and microbiology as medical fields; case study as type of learning resource; diabetes mellitus and AIDS as free-text diagnoses. Also, two reviewers evaluated independently whether the module is really an appropriate resource for that criterion or not. The analysis is based on a consensual decision about the correct votes. From the URLs, 93% were available at the evaluation of completeness, 92% at the evaluation of accuracy. The kappa values for inter-rater reliability were 0.83 and 0.36. The best service offers 60.8% of the pooled E-learning resources. The resources retrieved by the five criteria were rated as correct in 69.3% (LRSMed), 76.6% (KELDAmed), and 82.7% (CAL reviews). The overall accuracy was 76.7%. Medical students and other potential users of E-learning modules should be aware that the completeness of WWW-based catalogues in this area is not satisfactory. The retrieval accuracy is better: roughly four out of five retrieved resources correspond with the search criterion. Up to now, most of the services lack an application programming interface that could be used for a meta-search to improve completeness.
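The inter-rater reliability figures quoted above (0.83 and 0.36) are Cohen's kappa values. As a reminder of how such a statistic is computed, here is a minimal two-rater implementation; the example votes are invented, not the study's data.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters judging the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance from each
    rater's marginal category frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

# two reviewers voting on whether each module matches a criterion
k = cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0])  # → 0.5
```

A kappa of 0.36, as reported for one of the evaluations, indicates only fair agreement beyond chance, which is worth keeping in mind when reading the accuracy figures.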

  11. Applying Geospatial Technologies for International Development and Public Health: The USAID/NASA SERVIR Program

    Science.gov (United States)

    Hemmings, Sarah; Limaye, Ashutosh; Irwin, Dan

    2011-01-01

    adaptation strategies for nations affected by climate change. Conclusions: SERVIR is a platform for collaboration and cross-agency coordination, international partnerships, and delivery of web-based geospatial information services and applications. SERVIR makes a variety of geospatial data available for use in studies of environmental health outcomes.

  12. SDI-based business processes: A territorial analysis web information system in Spain

    Science.gov (United States)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.

  13. A Metadata Schema for Geospatial Resource Discovery Use Cases

    Directory of Open Access Journals (Sweden)

    Darren Hardy

    2014-07-01

    We introduce a metadata schema that focuses on GIS discovery use cases for patrons in a research library setting. Text search, faceted refinement, and spatial search and relevancy are among GeoBlacklight's primary use cases for federated geospatial holdings. The schema supports a variety of GIS data types and enables contextual, collection-oriented discovery applications as well as traditional portal applications. One key limitation of GIS resource discovery is the general lack of normative metadata practices, which has led to a proliferation of metadata schemas and duplicate records. The ISO 19115/19139 and FGDC standards specify metadata formats, but are intricate, lengthy, and not focused on discovery. Moreover, they require sophisticated authoring environments and cataloging expertise. Geographic metadata standards target preservation and quality measure use cases, but they do not provide for simple inter-institutional sharing of metadata for discovery use cases. To this end, our schema reuses elements from Dublin Core and GeoRSS to leverage their normative semantics, community best practices, open-source software implementations, and extensive examples already deployed in discovery contexts such as web search and mapping. Finally, we discuss a Solr implementation of the schema using a "geo" extension to MODS.
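To make the Dublin Core plus GeoRSS reuse concrete, here is a sketch of what such a discovery record might look like as a flat Solr-style document. The field names below only approximate the GeoBlacklight schema and the dataset is invented; treat this as an illustration of the design (simple, flat, discovery-oriented fields plus a spatial envelope), not an authoritative rendering.

```python
import json

# Illustrative discovery record: Dublin Core-derived elements
# (dc_title, dc_creator, ...) plus a GeoRSS-style bounding box and a
# Solr spatial field. Field names and suffixes are assumptions.
record = {
    "dc_title_s": "Land Use, San Francisco Bay Area, 2010",
    "dc_creator_sm": ["Example University Library"],
    "dc_format_s": "Shapefile",
    "dc_rights_s": "Public",
    "dct_spatial_sm": ["San Francisco Bay Area, California"],
    "georss_box_s": "37.2 -123.0 38.3 -121.7",          # S W N E
    "solr_geom": "ENVELOPE(-123.0, -121.7, 38.3, 37.2)",  # W, E, N, S
    "layer_slug_s": "example-sfbay-landuse-2010",
}

# records this simple can be serialized and shared between
# institutions without specialized authoring tools
serialized = json.dumps(record, indent=2)
```

The contrast with ISO 19115/19139 is the point: a handful of flat, semantically familiar fields is enough for text search, faceting, and spatial search.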

  14. Geospatial Visualization of Scientific Data Through Keyhole Markup Language

    Science.gov (United States)

    Wernecke, J.; Bailey, J. E.

    2008-12-01

    The development of virtual globes has provided a fun and innovative tool for exploring the surface of the Earth. However, it has been the parallel maturation of Keyhole Markup Language (KML) that has created a new medium and perspective through which to visualize scientific datasets. KML was originally created by Keyhole Inc., which Google acquired in 2004; in 2007 KML was given over to the Open Geospatial Consortium (OGC). It became an OGC international standard on 14 April 2008, and has subsequently been adopted by all major geobrowser developers (e.g., Google, Microsoft, ESRI, NASA) and many smaller ones (e.g., Earthbrowser). By making KML a standard at a relatively young stage in its evolution, developers of the language are seeking to avoid the issues that plagued the early World Wide Web and development of Hypertext Markup Language (HTML). The popularity and utility of Google Earth, in particular, has been enhanced by KML features such as the Smithsonian volcano layer and the dynamic weather layers. Through KML, users can view real-time earthquake locations (USGS), view animations of polar sea-ice coverage (NSIDC), or read about the daily activities of chimpanzees (Jane Goodall Institute). Perhaps even more powerful is the fact that any user can create, edit, and share their own KML, with no or relatively little knowledge of manipulating computer code. We present an overview of the best current scientific uses of KML and a guide to how scientists can learn to use KML themselves.
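KML documents are plain XML, which is why they are so easy to generate programmatically. The sketch below builds a minimal single-Placemark KML document with the standard library; the placemark content is an arbitrary example, and note that KML orders coordinates as longitude,latitude[,altitude].

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"  # KML 2.2 namespace

def placemark_kml(name, lon, lat, description=""):
    """Build a minimal KML document containing one Placemark.

    KML coordinates are written longitude,latitude,altitude; getting
    this order wrong is a classic beginner mistake.
    """
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    pm = ET.SubElement(kml, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    ET.SubElement(pm, f"{{{KML_NS}}}description").text = description
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    coords = ET.SubElement(point, f"{{{KML_NS}}}coordinates")
    coords.text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

# example placemark (location values are illustrative)
doc = placemark_kml("Erebus", 167.17, -77.53, "Volcano, Antarctica")
```

Saved with a `.kml` extension, such a file opens directly in any KML-aware geobrowser, which is the low barrier to entry the abstract emphasizes.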

  15. A participatory web map service: the case of Theewaterskloof Dam ...

    African Journals Online (AJOL)

    Previously, GIS was critiqued as a segregating science used exclusively by geospatial experts. ... It presents a case study methodology for the development and testing of a web GIS that can be optimised for smartphones and tablets so that communities can access updated information while using the dam, which is rated as ...

  16. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a web-based geospatial application, similar to Google Maps but built for tracking adversaries. It helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and simple to use without much training.

  17. A framework for efficient spatial web object retrieval

    DEFF Research Database (Denmark)

    Wu, Dingming; Cong, Gao; Jensen, Christian S.

    2012-01-01

    The conventional Internet is acquiring a geospatial dimension. Web documents are being geo-tagged and geo-referenced objects such as points of interest are being associated with descriptive text documents. The resulting fusion of geo-location and documents enables new kinds of queries that take...

  18. Catalogue of Exoplanets in Multiple-Star-Systems

    Science.gov (United States)

    Schwarz, Richard; Funk, Barbara; Bazsó, Ákos; Pilat-Lohinger, Elke

    2017-07-01

    Cataloguing the data of exoplanetary systems is becoming more and more important, because such catalogues consolidate the observations and support theoretical studies. Since 1995 there has been a database which lists most of the known exoplanets (The Extrasolar Planets Encyclopaedia, available at http://exoplanet.eu/ and described in Schneider et al. 2011). With the growing number of exoplanets detected in binary and multiple star systems, it became important to flag these systems and separate them into a new database. We therefore started to compile a catalogue for binary and multiple star systems. Since 2013 the catalogue can be found at http://www.univie.ac.at/adg/schwarz/multiple.html (a description can be found in Schwarz et al. 2016); it is updated regularly and is linked to the Extrasolar Planets Encyclopaedia. The data of the binary catalogue can be downloaded as a file (.csv) and used for statistical purposes. Our database is divided into two parts: the data of the stars and of the planets, given in separate lists. Every column of the list can be sorted in two directions: ascending, from the lowest value to the highest, or descending. In addition, an introduction and help are given in the menu bar of the catalogue, including an example list.

  19. A catalogue of AKARI FIS BSC extragalactic objects

    Science.gov (United States)

    Marton, Gabor; Toth, L. Viktor; Gyorgy Balazs, Lajos

    2015-08-01

    We combined photometric data of about 70 thousand point sources from the AKARI Far-Infrared Surveyor Bright Source Catalogue with AllWISE catalogue data to identify galaxies. We used Quadratic Discriminant Analysis (QDA) to classify our sources. The classification was based on a 6D parameter space that contained AKARI [F65/F90], [F90/F140], [F140/F160] and WISE W1-W2 colours along with WISE W1 magnitudes and AKARI [F140] flux values. Sources were classified into 3 main object types: YSO candidates, evolved stars and galaxies. The training samples were SIMBAD entries of the input point sources wherever an associated SIMBAD object was found within a 30 arcsecond search radius. The QDA yielded more than 5000 AKARI galaxy candidate sources. The selection was tested by cross-correlating our AKARI extragalactic catalogue with the Revised IRAS-FSC Redshift Catalogue (RIFSCz). A very good match was found. A further classification attempt was also made to differentiate between extragalactic subtypes using Support Vector Machines (SVMs). The results of the various methods showed that we can confidently separate cirrus dominated objects (type 1 of RIFSCz). Some of our “galaxy candidate” sources are associated with 2MASS extended objects, and listed in the NASA Extragalactic Database so far without clear proofs of their extragalactic nature. Examples will be presented in our poster. Finally, other AKARI extragalactic catalogues will also be compared to our statistical selection.
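QDA, the classifier named above, simply fits one Gaussian (mean, covariance, prior) per class and assigns each source to the class with the highest log-density. The from-scratch sketch below runs on invented 2D "colours" with two toy classes; the real analysis works in the 6D AKARI/WISE space with SIMBAD-derived labels and three classes.

```python
import numpy as np

def qda_fit(X, y):
    """Fit per-class Gaussian parameters: the QDA model."""
    model = {}
    for c in np.unique(y):
        Xc = X[y == c]
        model[c] = (Xc.mean(axis=0),
                    np.cov(Xc, rowvar=False),
                    len(Xc) / len(X))
    return model

def qda_predict(model, X):
    """QDA decision rule: pick the class maximizing Gaussian
    log-density plus log-prior."""
    classes, logps = [], []
    for c, (mu, cov, prior) in model.items():
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = X - mu
        quad = np.einsum("ij,jk,ik->i", d, inv, d)  # per-row (x-mu)^T S^-1 (x-mu)
        classes.append(c)
        logps.append(-0.5 * quad - 0.5 * logdet + np.log(prior))
    return np.array(classes)[np.vstack(logps).argmax(axis=0)]

rng = np.random.default_rng(0)
# two toy object types, well separated in a 2D colour space
X = np.vstack([rng.normal([0, 0], 1, (200, 2)),
               rng.normal([5, 5], 1, (200, 2))])
y = np.array(["galaxy"] * 200 + ["star"] * 200)
pred = qda_predict(qda_fit(X, y), X)
```

Unlike linear discriminant analysis, each class keeps its own covariance matrix, so the decision boundaries are quadratic; that flexibility is useful when object types occupy differently shaped regions of colour space.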

  20. Catalogue of knowledge and skills for sleep medicine.

    Science.gov (United States)

    Penzel, Thomas; Pevernagie, Dirk; Dogas, Zoran; Grote, Ludger; de Lacy, Simone; Rodenbeck, Andrea; Bassetti, Claudio; Berg, Søren; Cirignotta, Fabio; d'Ortho, Marie-Pia; Garcia-Borreguero, Diego; Levy, Patrick; Nobili, Lino; Paiva, Teresa; Peigneux, Philippe; Pollmächer, Thomas; Riemann, Dieter; Skene, Debra J; Zucconi, Marco; Espie, Colin

    2014-04-01

    Sleep medicine is evolving globally into a medical subspeciality in its own right, and in parallel, behavioural sleep medicine and sleep technology are expanding rapidly. Educational programmes are being implemented at different levels in many European countries. However, these programmes would benefit from a common, interdisciplinary curriculum. This 'catalogue of knowledge and skills' for sleep medicine is proposed, therefore, as a template for developing more standardized curricula across Europe. The Board and the Sleep Medicine Committee of the European Sleep Research Society (ESRS) have compiled the catalogue based on textbooks, standard of practice publications, systematic reviews and professional experience, validated subsequently by an online survey completed by 110 delegates specialized in sleep medicine from different European countries. The catalogue comprises 10 chapters, ranging from physiology, pathology, and diagnostic and treatment procedures to societal and organizational aspects of sleep medicine. Required levels of knowledge and skills are defined, as is a proposed workload of 60 points according to the European Credit Transfer System (ECTS). The catalogue is intended to be a basis for sleep medicine education, for sleep medicine courses and for sleep medicine examinations, serving not only physicians with a medical speciality degree, but also PhD and MSc health professionals such as clinical psychologists and scientists, technologists and nurses, all of whom may be involved professionally in sleep medicine. In the future, the catalogue will be revised in accordance with advances in the field of sleep medicine. © 2013 European Sleep Research Society.

  1. Development of Geospatial Map Based Election Portal

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.

    2014-11-01

    Geospatial Delhi Limited (GSDL), a Govt. of NCT of Delhi company, was formed to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs, such as DDA, MCD, DJB, the State Election Department, and DMRC, for the benefit of all citizens of the GNCTD. This paper describes the development of the Geospatial Map based Election Portal (GMEP) of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for the planning and management needs of the Department of the Chief Electoral Officer, and as an election-related information search tool (polling station, assembly and parliamentary constituency, etc.) for the citizens of NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end on a Microsoft Windows environment, and is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMEP include delimited precinct boundaries of voter areas of polling stations, assembly constituencies, parliamentary constituencies, election districts, landmark locations of polling stations, and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMEP helps achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for the management of elections. It enables a faster response to changing ground realities in development planning, owing to its built-in scientific approach and open-ended design.

  2. UKRVO Astronomical WEB Services

    Directory of Open Access Journals (Sweden)

    Mazhaev, O.E.

    2017-01-01

    Ukraine Virtual Observatory (UkrVO) has been a member of the International Virtual Observatory Alliance (IVOA) since 2011. The virtual observatory (VO) is not a magic solution to all problems of data storage and processing, but it provides certain standards for building the infrastructure of an astronomical data centre. Astronomical databases help with data mining and offer users easy access to observation metadata, images of the celestial sphere, and results of image processing. The astronomical web services (AWS) of UkrVO give users handy tools for data selection from large astronomical catalogues for a relatively small region of interest in the sky. Examples of AWS usage are shown.

  3. Intelligence, mapping, and geospatial exploitation system (IMAGES)

    Science.gov (United States)

    Moellman, Dennis E.; Cain, Joel M.

    1998-08-01

    This paper provides further detail on one facet of the battlespace visualization concept described in last year's paper, Battlespace Situation Awareness for Force XXI. It focuses on the National Imagery and Mapping Agency (NIMA) goal to 'provide customers seamless access to tailorable imagery, imagery intelligence, and geospatial information.' This paper describes the Intelligence, Mapping, and Geospatial Exploitation System (IMAGES), an exploitation element capable of CONUS baseplant operations or field deployment to provide NIMA geospatial information collaboratively into a reconnaissance, surveillance, and target acquisition (RSTA) environment through the United States Imagery and Geospatial Information System (USIGS). In a baseplant CONUS setting, IMAGES could be used to produce foundation data to support mission planning. In the field, it could be directly associated with a tactical sensor receiver or ground station (e.g. UAV or UGV) to provide near real-time and mission-specific RSTA to support mission execution. This paper provides the IMAGES functional-level design; describes the technologies, their interactions and interdependencies; and presents a notional operational scenario to illustrate the system's flexibility. Using an intelligent software agent technology, the Open Agent Architecture™ (OAA), as a system backbone, IMAGES combines multimodal data entry, natural language understanding, and perceptual and evidential reasoning for system management. Configured to be DII COE compliant, it would utilize, to the extent possible, COTS application software for data management, processing, fusion, exploitation, and reporting. It would also be modular, scalable, and reconfigurable. This paper describes how the OAA achieves data synchronization and enables the necessary level of information to be rapidly available to various command echelons for making informed decisions. The reasoning component will provide for the best information to be developed in the timeline

  4. Planck 2013 results. XXVIII. The Planck Catalogue of Compact Sources

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Cardoso, J.F.; Carvalho, P.; Catalano, A.; Challinor, A.; Chamballu, A.; Chen, X.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clemens, M.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dupac, X.; Efstathiou, G.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Hansen, F.K.; Hanson, D.; Harrison, D.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Jaffe, T.R.; Jaffe, A.H.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Laureijs, R.J.; Lawrence, C.R.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Leroy, C.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; McGehee, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; 
Miville-Deschenes, M.A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Negrello, M.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Pearson, T.J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Reach, W.T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savini, G.; Schammel, M.P.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sunyaev, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Walter, B.; Wandelt, B.D.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    The Planck Catalogue of Compact Sources (PCCS) is the catalogue of sources detected in the first 15 months of Planck operations, the "nominal" mission. It consists of nine single-frequency catalogues of compact sources, both Galactic and extragalactic, detected over the entire sky. The PCCS covers the frequency range 30-857 GHz with higher sensitivity (it is 90% complete at 180 mJy in the best channel) and better angular resolution (from ~33' to ~5') than previous all-sky surveys in this frequency band. By construction its reliability is >80% and more than 65% of the sources have been detected in at least two contiguous Planck channels. In this paper we present the construction and validation of the PCCS, its contents and its statistical characterization.

  5. A Probabilistic Catalogue of Unresolved High Latitude Fermi LAT Sources

    Science.gov (United States)

    Portillo, Stephen; Daylan, Tansu; Finkbeiner, Douglas P.

    2016-01-01

    Several groups have identified a highly significant and spatially extended excess of GeV gamma-rays in the Inner Galaxy using data from the Fermi LAT. While this signal's properties are consistent with those expected from dark matter annihilation, another interpretation is that it is the emission from a population of unresolved point sources. Motivated by the point source interpretation, we implement a Bayesian method for producing probabilistic catalogues to constrain the population of point sources below the Fermi LAT detection limit. To validate our method, we apply it to the high latitude Fermi LAT data to confirm that the probabilistic catalogue recovers the resolved sources in the Fermi Collaboration's 3FGL catalogue. Then, we compare our constraints on the unresolved point source population at high latitude to those obtained using non-Poissonian template fitting.

  6. Catalogue of European earthquakes with intensities higher than 4

    International Nuclear Information System (INIS)

    Van Gils, J.M.; Leydecker, G.

    1991-01-01

    The catalogue of European earthquakes with intensities higher than 4 contains some 20 000 seismic events that happened in member countries of the European Communities, Switzerland and Austria. It was prepared on the basis of already existing national catalogues and includes historical data as well as present-day data. All historical data are harmonized as far as possible to the same intensity scale (MSK-scale) to make them suitable for computerization. Present-day data include instrumental and macroseismic data. Instrumental data are expressed in terms of magnitude (Richter scale) while macroseismic data are given in intensities. Compilation of seismic data can provide a basis for statistically supported studies of site selection procedures and the qualitative assessment of seismic risks. Three groups of seismic maps illustrate the content of the catalogue for different time periods and different intensities.

  7. Capacity Building through Geospatial Education in Planning and School Curricula

    Science.gov (United States)

    Kumar, P.; Siddiqui, A.; Gupta, K.; Jain, S.; Krishna Murthy, Y. V. N.

    2014-11-01

    Geospatial technology has widespread usage in development planning and resource management. It offers pragmatic tools to help urban and regional planners realize their goals. At the request of the Ministry of Urban Development, Govt. of India, the Indian Institute of Remote Sensing (IIRS), Dehradun has taken an initiative to study the model syllabi of the All India Council for Technical Education for planning curricula of Bachelor and Master (five disciplines) programmes. It is inferred that geospatial content across the semesters in various planning fields needs revision. It is also realized that students pursuing planning curricula are invariably exposed to spatial mapping tools, but the popular digital drafting software has limitations on geospatial analysis of planning phenomena. Therefore, students need exposure to geospatial technologies to understand various real-world phenomena. Inputs were given to seamlessly merge and incorporate geospatial components throughout the semesters wherever it seemed relevant. IIRS took another initiative to enhance the understanding and essence of space and geospatial technologies amongst young minds at the 10+2 level. The content was proposed in a manner such that youngsters start realizing the innumerable contributions made by space and geospatial technologies in their day-to-day life. This effort, at both school and college level, would help not only in enhancing job opportunities for the young generation but also in utilizing untapped human resource potential. In the era of smart cities, higher economic growth and aspirations for a better tomorrow, integration of geospatial technologies with conventional wisdom can no longer be ignored.

  8. Biosecurity and geospatial analysis of mycoplasma infections in ...

    African Journals Online (AJOL)

    Geospatial databases of farm locations and biosecurity measures are essential to control disease outbreaks. A study was conducted to establish a geospatial database on poultry farms in the Al-Jabal Al-Gharbi region of Libya, to evaluate the biosecurity level of each farm and to determine the seroprevalence of mycoplasma and ...

  9. Geospatial Services in Special Libraries: A Needs Assessment Perspective

    Science.gov (United States)

    Barnes, Ilana

    2013-01-01

    Once limited to geographers and mapmakers, Geographic Information Systems (GIS) have taken a growing central role in information management and visualization. Geospatial services run the gamut of products and services, from Google Maps to ArcGIS servers to mobile development. Geospatial services are not new. Libraries have been writing about…

  10. Incidental Learning of Geospatial Concepts across Grade Levels: Map Overlay

    Science.gov (United States)

    Battersby, Sarah E.; Golledge, Reginald G.; Marsh, Meredith J.

    2006-01-01

    In this paper, the authors evaluate map overlay, a concept central to geospatial thinking, to determine how it is naively and technically understood, as well as to identify when it is learned innately. The evaluation is supported by results from studies at three grade levels to show the progression of incidentally learned geospatial knowledge as…

  11. GeoSpatial Data Analysis for DHS Programs

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, Eric G.; Burke, John S.; Carlson, Carrie A.; Gillen, David S.; Joslyn, Cliff A.; Olsen, Bryan K.; Critchlow, Terence J.

    2009-05-10

    Department of Homeland Security law enforcement faces the continual challenge of analyzing its custom data sources in a geospatial context. From a strategic perspective, law enforcement needs first to broadly characterize a given situation using its custom data sources and then, once it is summarily understood, to analyze the data geospatially in detail.

  12. Fostering 21st Century Learning with Geospatial Technologies

    Science.gov (United States)

    Hagevik, Rita A.

    2011-01-01

    Global positioning systems (GPS) receivers and other geospatial tools can help teachers create engaging, hands-on activities in all content areas. This article provides a rationale for using geospatial technologies in the middle grades and describes classroom-tested activities in English language arts, science, mathematics, and social studies.…

  13. RSS as a distribution medium for geo-spatial hypermedia

    DEFF Research Database (Denmark)

    Hansen, Frank Allan; Christensen, Bent Guldbjerg; Bouvin, Niels Olof

    2005-01-01

    This paper describes how the XML-based RSS syndication formats used in weblogs can be utilized as the distribution medium for geo-spatial hypermedia, and how this approach can be used to create a highly distributed multi-user annotation system for geo-spatial hypermedia. It is demonstrated how…
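    The idea of carrying geo-spatial annotations inside a syndication feed can be sketched with a GeoRSS-Simple style point element. This is an illustrative encoding, not the authors' actual schema, and the title, link and coordinates below are invented:

```python
import xml.etree.ElementTree as ET

GEORSS_NS = "http://www.georss.org/georss"
ET.register_namespace("georss", GEORSS_NS)

def rss_item_with_point(title, link, lat, lon):
    """Build a single RSS <item> carrying a GeoRSS-Simple point annotation."""
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "link").text = link
    # GeoRSS-Simple encodes a point as "lat lon" in one element
    ET.SubElement(item, f"{{{GEORSS_NS}}}point").text = f"{lat} {lon}"
    return ET.tostring(item, encoding="unicode")

xml = rss_item_with_point("Annotation near Aarhus", "http://example.org/a/1", 56.162, 10.204)
print(xml)
```

    A feed reader that understands the `georss` namespace can place each item on a map, while ordinary RSS clients simply ignore the extra element.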

  14. The PMA Catalogue: 420 million positions and absolute proper motions

    Science.gov (United States)

    Akhmetov, V. S.; Fedorov, P. N.; Velichko, A. B.; Shulga, V. M.

    2017-07-01

    We present a catalogue that contains about 420 million absolute proper motions of stars. It was derived from the combination of positions from Gaia DR1 and 2MASS, with a mean difference of epochs of about 15 yr. Most of the systematic zonal errors inherent in the 2MASS Catalogue were eliminated before deriving the absolute proper motions. The absolute calibration procedure (zero-pointing of the proper motions) was carried out using about 1.6 million positions of extragalactic sources. The mean formal error of the absolute calibration is less than 0.35 mas yr-1. The derived proper motions cover the whole celestial sphere without gaps for a range of stellar magnitudes from 8 to 21 mag. In the sky areas where the extragalactic sources are invisible (the avoidance zone), a dedicated procedure was used that transforms the relative proper motions into absolute ones. The rms error of proper motions depends on stellar magnitude and ranges from 2-5 mas yr-1 for stars with 10 mag < G < 17 mag to 5-10 mas yr-1 for faint ones. The present catalogue contains the Gaia DR1 positions of stars for the J2015 epoch. The system of the PMA proper motions does not depend on the systematic errors of the 2MASS positions, and in the range from 14 to 21 mag represents an independent realization of a quasi-inertial reference frame in the optical and near-infrared wavelength range. The Catalogue also contains stellar magnitudes taken from the Gaia DR1 and 2MASS catalogues. A comparison of the PMA proper motions of stars with similar data from certain recent catalogues has been undertaken.
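    The core computation behind such a catalogue, deriving a proper motion from positions at two epochs separated by about 15 yr, can be sketched as below. The positions are illustrative, not PMA data, and the real pipeline additionally removes systematic zonal errors and performs the absolute calibration described above:

```python
import math

MAS_PER_DEG = 3.6e6  # milliarcseconds per degree

def proper_motion(ra1_deg, dec1_deg, ra2_deg, dec2_deg, epoch_diff_yr):
    """Proper motion (mu_alpha*, mu_delta) in mas/yr from positions at two epochs.
    Small-angle approximation; mu_alpha* includes the cos(dec) factor."""
    cosd = math.cos(math.radians(0.5 * (dec1_deg + dec2_deg)))
    mu_ra = (ra2_deg - ra1_deg) * cosd * MAS_PER_DEG / epoch_diff_yr
    mu_dec = (dec2_deg - dec1_deg) * MAS_PER_DEG / epoch_diff_yr
    return mu_ra, mu_dec

# hypothetical star: 2MASS-like epoch vs Gaia DR1 epoch, 15 yr baseline as in PMA
mu_ra, mu_dec = proper_motion(10.0, 45.0, 10.0000100, 45.0000050, 15.0)
print(round(mu_ra, 2), round(mu_dec, 2))
```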

  15. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    Science.gov (United States)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-03-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure an acceptable usability.
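    The central coordinate mapping in such a system, turning a tracked object's normalized tabletop position (as a TUIO tracker reports it) into a geographic coordinate on the projected map, can be sketched as follows. The bounding box and positions are illustrative; the actual system routes this through reacTIVision and GeoTools:

```python
def tuio_to_geo(x_norm, y_norm, bbox):
    """Map a normalized TUIO tabletop position (x, y in [0, 1], y growing downward)
    to lon/lat inside the projected map's bounding box.
    bbox = (min_lon, min_lat, max_lon, max_lat); simple plate-carree assumption."""
    min_lon, min_lat, max_lon, max_lat = bbox
    lon = min_lon + x_norm * (max_lon - min_lon)
    lat = max_lat - y_norm * (max_lat - min_lat)  # flip: TUIO y axis points down
    return lon, lat

# hypothetical map extent roughly around Luxembourg City
print(tuio_to_geo(0.5, 0.5, (6.05, 49.55, 6.20, 49.65)))
```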

  16. The Historical Evolution and Reflection of Geospatial Information Grid

    Directory of Open Access Journals (Sweden)

    WAN Gang

    2016-12-01

    Full Text Available With the development of science and practice, the grid concept has continued to evolve, and different industries understand its connotation in different but related ways. First, the theoretical characteristics of the Geospatial Grid are analyzed, and its broad and narrow senses are put forward. The history of Geospatial Grid development is then reviewed and analyzed; the Geospatial Grid is considered a theory of human spatial cognition that, under informatized conditions, has developed into the Geospatial Information Grid. Finally, this paper argues that the research object of the Geospatial Information Grid is the informational earth system: on the basis of model theory, a service and standards system will be constructed, and the range of its services will extend from humans to intelligent platforms, leading to a bright future.

  17. Generic key and catalogue of Mymaridae (Hymenoptera) of Mexico.

    Science.gov (United States)

    Guzmán-Larralde, Adriana J; Huber, John T; Martínez, Humberto Quiroz

    2017-04-12

    The Mexican genera of Mymaridae (Hymenoptera: Chalcidoidea) are keyed in English and Spanish, and a catalogue of species occurring in Mexico is presented. Thirty-six genera, including 79 named species in 20 of the genera, are reported. These are mentioned in about 100 publications, either as original species descriptions or as publications that specifically mention species and/or specimens from Mexico. In the catalogue, species distributions by state are given based on literature records, and collection data are compiled from about 3630 specimens examined in eight collections in Canada, Mexico and the USA. Hosts are listed for specimens reared mainly in Mexico. A few extralimital host records are also given.

  18. Natural gas and the environment - environmental research catalogue

    International Nuclear Information System (INIS)

    1996-06-01

    An updated account of environmental research currently underway by Canadian Gas Association member companies is presented in this catalogue. The catalogue provides a review of research conducted in the areas of energy efficiency and conservation, NOx emissions, greenhouse gas emissions, indoor air quality, land restoration, site contamination, and herbicide usage. Projects are listed in subject groupings; for each project there is a title, the name of the sponsoring organization, the report date/study completion date where available, a brief abstract characterizing the project, and the name and address of a contact person

  19. Gaia DR1 documentation Chapter 7: Catalogue consolidation and validation

    Science.gov (United States)

    Arenou, F.; Babusiaux, C.; Blanco-Cuaresma, S.; Borrachero, R.; Cantat-Gaudin, T.; Fabricius, C.; Findeisen, K.; Helmi, A.; Hutton, A.; Luri, X.; Marrese, P.; Marinoni, S.; Robin, A.; Sordo, R.; Soria, S.; Turon, C.; Utrilla Molina, E.; Vallenari, A.

    2017-12-01

    Producing the Gaia Catalogue does not only yield a wealth of data; it also represents complex processing before a Catalogue can be issued. The main data processing is handled by three DPAC Coordination Units: CU3 for the astrometric data, CU5 for the photometric data and CU6 for the spectroscopic data. Three further Coordination Units then analyse the processed data: CU4 for non-single stars, solar system objects and extended objects, CU7 for variable stars, and CU8 for classification. Finally, CU9 takes care of the intermediate and final publication of the Gaia data. For Gaia DR1, the situation was simplified in the sense that CU4, CU6 and CU8 did not contribute to the first Catalogue. At the last step, several data fields may have been computed by several Coordination Units (e.g., parallaxes computed by CU3, then again by CU4 with a fit of an astrometric + binary model if the star happens to have a significant binary motion; or a mean magnitude computed by CU5 may be superseded by another estimate from CU7 if the star happens to be a periodic variable; etc.), in several Data Processing Centres, so an (a) homogeneous, (b) convenient, (c) consistent Catalogue has to be built. First, astrometric and photometric information is attached to a so-called CompleteSource, then possible variability information is integrated, producing a homogeneous Catalogue. Second, sources that do not meet some minimum astrometric or photometric quality are filtered out. The filters applied are described in Section 4 of Gaia Collaboration et al. (2016a). Third, while flat files are kept for further operations, the data is integrated into the Gaia Archive Core System (GACS) database; a crossmatch with external catalogues is also performed, providing convenient access to the data. Fourth, the consistency of the Catalogue is ensured through a dedicated validation of its content. Sources that do not pass the validation criteria are then filtered out. This chapter describes these
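    The precedence rule described above, where a later Coordination Unit's refined value supersedes an upstream one, can be sketched as a simple ordered merge. The field names and values below are illustrative, not the DPAC data model:

```python
def consolidate(astrometry, photometry, variability):
    """Merge per-CU results into one source record.
    Later analysis (e.g. a CU7 mean magnitude for a periodic variable)
    supersedes the upstream CU5 value, mirroring the precedence described above."""
    record = {}
    record.update(astrometry)   # CU3: positions, parallaxes
    record.update(photometry)   # CU5: mean magnitudes
    record.update(variability)  # CU7: refined values for variables win
    return record

cu3 = {"ra": 10.68, "dec": 41.27, "parallax": 0.75}
cu5 = {"g_mean_mag": 15.2}
cu7 = {"g_mean_mag": 15.05, "variable": True}  # periodic variable: CU7 refines the magnitude
print(consolidate(cu3, cu5, cu7)["g_mean_mag"])  # -> 15.05, CU7 supersedes CU5
```

    A quality filter is then just a predicate applied to each merged record before it enters the archive.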

  20. Development of Web-Based Geographic Information System Software

    Directory of Open Access Journals (Sweden)

    Budi Santosa

    2015-04-01

    Full Text Available Geospatial information can currently be displayed not only with stand-alone GIS software but also over the Internet, which serves as a medium for distributing geospatial information. Through the Internet, anyone in the world can access geospatial information, and the web provides a medium for processing the desired geographic information without being limited by location. Web-based GIS has evolved from web and client/server architectures into a distributed whole. Internet technology provides a new form for all information system functions: data collection, data storage, data retrieval, data analysis and data visualization. This paper presents current web-based GIS technology, with emphasis on the architecture and the stages of web-based GIS software development, from requirements analysis to the maintenance stage. The implementation phase of web-based GIS software development should produce the right web-based GIS product through the right process.
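    A typical building block of such web-based GIS software is the standard OGC WMS GetMap request, which a browser client issues to a map server to fetch a rendered map image. A minimal sketch, where the service URL and layer name are placeholders, not a real service:

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width=512, height=512):
    """Build an OGC WMS 1.3.0 GetMap request URL.
    Note: with CRS=EPSG:4326, WMS 1.3.0 expects the BBOX in
    lat,lon axis order: (min_lat, min_lon, max_lat, max_lon)."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base}?{urlencode(params)}"

# hypothetical endpoint and layer, bbox roughly around Jakarta
print(wms_getmap_url("http://example.org/wms", "landuse", (-6.4, 106.6, -6.0, 107.0)))
```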

  1. The Database of the Catalogue of Clinical Practice Guidelines Published via Internet in the Czech Language -The Current State

    Czech Academy of Sciences Publication Activity Database

    Zvolský, Miroslav

    2010-01-01

    Roč. 6, č. 1 (2010), s. 83-89 ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: internet * World Wide Web * database * clinical practice guideline * clinical practice * evidence-based medicine * formalisation * GLIF (Guideline Interchange Format) * doctor of medicine * decision support systems Subject RIV: IN - Informatics, Computer Science http://www.ejbi.org/en/ejbi/article/63-en-the-database-of-the-catalogue-of-clinical-practice-guidelines-published-via-internet-in-the-czech-language-the-current-state.html

  2. Library catalogues as resources for book history: case study of Novosel’s bookstore catalogue in Zagreb (1794 - 1825)

    Directory of Open Access Journals (Sweden)

    Marijana Tomić

    2008-07-01

    Full Text Available The aim of the paper is to analyze the book catalogue of Novosel’s bookstore, which operated in Zagreb from 1794 to 1825, and to investigate the history of books and writing in Zagreb at the turn of the 19th century. The catalogue we analyzed is believed to have been published in 1801. Bearing in mind that the market-based economy started to develop in the late 18th century, it can be assumed that Novosel, his staff and his successors based their bookstore's offer on market analysis, i.e. their readers’ needs. The increase in offer sparked new advertising techniques, i.e. the printing of catalogues. It follows that their book catalogue reflects the cultural and intellectual status and needs of readers of the time. The paper provides a short overview of the book trade in late 18th-century Zagreb and of bookstore advertisements published both in books and individually, as well as a short overview of Novosel’s bookstore business. In the analysis we partly use the methodology introduced by Robert Darnton, the so-called Darnton circle, which takes a holistic view of the history of books by considering all the stages a book goes through, from the author, publisher, printer and bookstores to the readers, including the author him/herself as a reader. Every element is considered in relation to the other elements in the circle, and in connection with external factors such as the economic and social environment and political and intellectual influences. The books presented in the catalogue have been analyzed using different criteria: language, genre and country where they were printed. Books printed in Croatia and those written in Croatian have been given priority. In the catalogue analysis we used the database Skupni katalog hrvatskih knjižnica (joint Croatian library catalogue) in order to reconstruct the printing years and printing shops that are not listed in the catalogues. Using this methodology, we partly

  3. Geospatial Data Management Platform for Urban Groundwater

    Science.gov (United States)

    Gaitanaru, D.; Priceputu, A.; Gogu, C. R.

    2012-04-01

    Due to the large number of civil engineering projects and research studies, large quantities of geo-data are produced for urban environments. These data are usually redundant and are spread across different institutions and private companies. Time-consuming operations like data processing and information harmonisation are the main reasons the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. Underground structures (subway lines, deep foundations, underground parking, and others), urban utility networks (sewer systems, water supply networks, heating conduits, etc.), drainage systems, surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. Because these activities provide a large quantity of data, aquifer modelling and behaviour prediction can be done using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages like GML, GeoSciML, WaterML, GWML, CityGML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or research centres that do not necessarily use the same database structures. For Bucharest City (Romania), an integrated platform for groundwater geospatial data management is being developed in the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) - financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis

  4. The geo-spatial information infrastructure at the Centre for Control and Prevention of Zoonoses, University of Ibadan, Nigeria: an emerging sustainable One-Health pavilion.

    Science.gov (United States)

    Olugasa, B O

    2014-12-01

    The World Wide Web, as a contemporary means of information sharing, offers a platform for geo-spatial information dissemination to improve education about spatio-temporal patterns of disease spread at the human-animal-environment interface in developing countries of West Africa. In assessing the quality of exposure to geospatial information applications among students in five purposively selected institutions in West Africa, this study reviewed course contents and postgraduate programmes in zoonoses surveillance. Geospatial information content and associated practical exercises in zoonoses surveillance were scored. Seven criteria were used to categorize and score capability, namely: spatial data capture; thematic map design and interpretation; spatio-temporal analysis; remote sensing of data; statistical modelling; the management of spatial data profiles; and web-based map sharing within an organization. These criteria were used to compute weighted exposure during training at the institutions. A categorical description of the institution with the highest computed Cumulative Exposure Point Average (CEPA) was based on an illustration with retrospective records of rabies cases, using data from humans, animals and the environment sourced from Grand Bassa County, Liberia, to create and share maps and information with faculty, staff, students and the neighbourhood about animal-bite injury surveillance and the spatial distribution of rabies-like illness. Uniformly low CEPA values (0-1.3) were observed across academic departments. The highest (3.8) was observed at the Centre for Control and Prevention of Zoonoses (CCPZ), University of Ibadan, Nigeria, where geospatial techniques are systematically taught, and thematic and predictive maps are produced and shared online with other institutions in West Africa. In addition, a short course in zoonosis surveillance, which offers inclusive learning in geospatial applications, is taught at CCPZ. The paper
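    One plausible reading of the weighted-exposure computation is a weighted mean of per-criterion scores. The abstract does not give the exact formula, so the following is only an illustration, and the scores are invented:

```python
# the seven scoring criteria named in the study
CRITERIA = ["spatial data capture", "thematic map design and interpretation",
            "spatio-temporal analysis", "remote sensing of data",
            "statistical modelling", "spatial data-profile management",
            "web-based map sharing"]

def cepa(scores, weights=None):
    """Cumulative Exposure Point Average as a (weighted) mean of
    per-criterion scores -- an illustrative reading, not the paper's formula."""
    if weights is None:
        weights = [1.0] * len(scores)
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

scores = [5, 4, 4, 3, 4, 3, 4]  # hypothetical 0-5 score per criterion, in CRITERIA order
assert len(scores) == len(CRITERIA)
print(round(cepa(scores), 1))
```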

  5. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    Science.gov (United States)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

    Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer web-based geospatial clients also provide user-interactive functionality that depends on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application continually issues data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method has been developed for efficiently controlling the networked access of a web-based geospatial browser to server-based datasets, in particular massively sized datasets. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent, dynamically generated KML code that directs the client application to make follow-on requests for higher level-of-detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be
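    The cascading strategy rests on standard KML Region and NetworkLink elements: a link's target KML is fetched only once its region occupies enough screen pixels. A minimal generator sketch, where the tile name and URL are hypothetical:

```python
def region_link(name, href, north, south, east, west, min_lod=128):
    """One KML NetworkLink that loads finer-detail KML only when the
    region occupies at least min_lod screen pixels (minLodPixels) --
    the core of a cascading level-of-detail strategy."""
    return f"""<NetworkLink>
  <name>{name}</name>
  <Region>
    <LatLonAltBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonAltBox>
    <Lod><minLodPixels>{min_lod}</minLodPixels></Lod>
  </Region>
  <Link><href>{href}</href><viewRefreshMode>onRegion</viewRefreshMode></Link>
</NetworkLink>"""

print(region_link("tile_0_0", "http://example.org/tiles/0/0.kml",
                  45.0, 44.0, -104.0, -105.0))
```

    A server emitting one such NetworkLink per child tile, inside the KML it returns for the parent tile, yields exactly the server-driven traversal described above.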

  6. IAEA Publications Catalogue 2013-2014 - full details of publications published 2012-2014, forthcoming publications and a stocklist of publications published 2010-2013

    International Nuclear Information System (INIS)

    2013-08-01

    This publications catalogue lists all sales publications of the IAEA published in 2012 and 2013 and those forthcoming in 2013-2014. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  7. IAEA Publications Catalogue 2015-2016 - full details of publications published 2014-2016, forthcoming publications and a stocklist of publications published 2012-2015

    International Nuclear Information System (INIS)

    2015-01-01

    This publications catalogue lists all sales publications of the IAEA published in 2014 and 2015 and those forthcoming in 2015-2016. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  8. IAEA Publications Catalogue 2017-2018 - full details of publications published 2016-2017, forthcoming publications 2017-2018 and a stocklist of publications published 2014-2017

    International Nuclear Information System (INIS)

    2016-08-01

    This publications catalogue lists all sales publications of the IAEA published in 2016–2017 and those forthcoming in 2017–2018. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. Most publications are issued in softcover. A complete listing of all IAEA priced publications is available on the IAEA’s web site: www.iaea.org/books

  9. IAEA Publications Catalogue 2016-2017 - full details of publications published 2015-2016, forthcoming publications 2016-2017 and a stocklist of publications published 2013-2016

    International Nuclear Information System (INIS)

    2016-08-01

    This publications catalogue lists all sales publications of the IAEA published in 2015-2016 and those forthcoming in 2016-2017. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  10. IAEA Publications Catalogue 2014-2015 - full details of publications published 2013-2015, forthcoming publications and a stocklist of publications published 2011-2014

    International Nuclear Information System (INIS)

    2014-07-01

    This publications catalogue lists all sales publications of the IAEA published in 2013 and 2014 and those forthcoming in 2014-2015. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  11. Processing, Cataloguing and Distribution of Uas Images in Near Real Time

    Science.gov (United States)

    Runkel, I.

    2013-08-01

    Why is there such hype around UAS? UAS make data capture flexible, fast and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture holds up to the end of the processing chain, all intermediate steps, like data processing and data dissemination to the customer, need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution; this is the focus of the presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data processing computer, or it is stored on a mobile device that is hooked up to the data processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata is converted into an ISO-conformant format, and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, i.e. the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready for use in GIS applications, image processing or direct interpretation via web applications - wherever you want. The whole processing chain is built in a generic manner and can be adapted to a multitude of applications. The UAV imagery can be processed and catalogued as single orthoimages or as an image mosaic. Furthermore, image data from various cameras can be fused. Using WPS (web processing services), image enhancement and image analysis workflows, such as change detection layers, can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server; the image analyst has no data and no software on his local computer. This workflow has proven to be fast, stable and accurate. It is designed to support time-critical applications for security demands - the images
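    The georeferencing step, assigning each captured frame the GPS fix nearest in time from the flight log, can be sketched as below. The timestamps and fixes are invented; the actual GEOSYSTEMS workflow is not public:

```python
import bisect

def georeference(image_time, track):
    """Assign a captured frame the GPS fix nearest in time.
    track: time-sorted list of (unix_time, lat, lon) from the flight log."""
    times = [t for t, _, _ in track]
    i = bisect.bisect_left(times, image_time)
    # the nearest fix is either the one just before or just at/after image_time
    candidates = track[max(0, i - 1):i + 1]
    return min(candidates, key=lambda fix: abs(fix[0] - image_time))

log = [(100.0, 49.01, 8.40), (101.0, 49.02, 8.41), (102.0, 49.03, 8.42)]
print(georeference(100.8, log))  # nearest fix is the one at t=101.0
```

    A production pipeline would interpolate between fixes and account for camera/GPS clock offset, but nearest-fix lookup is the simplest serviceable version.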

  12. PROCESSING, CATALOGUING AND DISTRIBUTION OF UAS IMAGES IN NEAR REAL TIME

    Directory of Open Access Journals (Sweden)

    I. Runkel

    2013-08-01

    Full Text Available Why is there such hype around UAS? UAS make data capture flexible, fast and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture holds up to the end of the processing chain, all intermediate steps, like data processing and data dissemination to the customer, need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution; this is the focus of the presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data processing computer, or it is stored on a mobile device that is hooked up to the data processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata is converted into an ISO-conformant format, and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, i.e. the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready for use in GIS applications, image processing or direct interpretation via web applications – wherever you want. The whole processing chain is built in a generic manner and can be adapted to a multitude of applications. The UAV imagery can be processed and catalogued as single orthoimages or as an image mosaic. Furthermore, image data from various cameras can be fused. Using WPS (web processing services), image enhancement and image analysis workflows, such as change detection layers, can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server; the image analyst has no data and no software on his local computer. This workflow has proven to be fast, stable and accurate. It is designed to support time-critical applications for security

  13. a Public Platform for Geospatial Data Sharing for Disaster Risk Management

    Science.gov (United States)

    Balbo, S.; Boccardo, P.; Dalmasso, S.; Pasquali, P.

    2013-01-01

    Several studies have been conducted in Africa to assist local governments in addressing risks related to natural hazards. Geospatial data containing information on vulnerability, impacts, climate change and disaster risk reduction is usually part of the output of such studies and is valuable to national and international organizations seeking to reduce the risks and mitigate the impacts of disasters. Nevertheless, this data is not widely or efficiently distributed and often resides in remote storage solutions that are hard to reach. Spatial Data Infrastructures are technical solutions capable of solving this issue by storing geospatial data and making it widely available through the internet. Among these solutions, GeoNode, an open-source online platform for geospatial data sharing, has been developed in recent years. GeoNode is a platform for the management and publication of geospatial data. It brings together mature and stable open-source software projects under a consistent and easy-to-use interface, allowing users with little training to quickly and easily share data and create interactive maps. GeoNode data management tools allow for integrated creation of data, metadata and map visualizations. Each dataset in the system can be shared publicly or restricted to allow access only to specific users. Social features like user profiles and commenting and rating systems allow for the development of communities around each platform to facilitate the use, management and quality control of the data the GeoNode instance contains (http://geonode.org/). This paper presents a case study of setting up a web platform based on GeoNode: a public platform called MASDAP, promoted by the Government of Malawi in order to support the development of the country and build resilience against natural disasters. A substantial amount of geospatial data has already been collected about hydrogeological risk, as well as several other disaster-related datasets. Moreover this

  14. Web Caching

    Indian Academy of Sciences (India)

    Web Caching - A Technique to Speedup Access to Web Contents. Harsha Srinath and Shiva Shankar Ramanna. General Article, Resonance – Journal of Science Education, Volume 7, Issue 7, July 2002, pp. 54-62. Keywords: World Wide Web; data caching; internet traffic; web page access.
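    The technique the article describes can be illustrated with a tiny LRU page cache, the core mechanism a caching proxy uses to serve repeat requests locally instead of re-fetching from the origin server (URLs and page bodies below are invented):

```python
from collections import OrderedDict

class PageCache:
    """Tiny LRU cache for fetched web pages: repeat requests are served
    locally, cutting internet traffic and access latency."""
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.pages = OrderedDict()

    def get(self, url, fetch):
        if url in self.pages:                 # cache hit: refresh recency
            self.pages.move_to_end(url)
            return self.pages[url]
        body = fetch(url)                     # cache miss: fetch from origin
        self.pages[url] = body
        if len(self.pages) > self.capacity:   # evict the least recently used page
            self.pages.popitem(last=False)
        return body

fetches = []
def fetch(url):
    fetches.append(url)
    return f"<html>{url}</html>"

cache = PageCache(capacity=2)
cache.get("/a", fetch); cache.get("/b", fetch); cache.get("/a", fetch)
print(len(fetches))  # 2 origin fetches: the repeat request for /a was a hit
```

    Real web caches add freshness rules (expiry headers, validation), but the hit/miss/evict cycle above is the essence of the speedup.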

  15. Digitizing Villanova University's Eclipsing Binary Card Catalogue

    Science.gov (United States)

    Guzman, Giannina; Dalton, Briana; Conroy, Kyle; Prsa, Andrej

    2018-01-01

    Villanova University’s Department of Astrophysics and Planetary Science has years of hand-written archival data on Eclipsing Binaries at its disposal. This card catalog began at Princeton in the 1930s with notable contributions from scientists such as Henry Norris Russell. During World War II, the archive was moved to the University of Pennsylvania, which was one of the world centers for Eclipsing Binary research; consequently, the contributions to the catalog during this time were immense. It was then moved to the University of Florida at Gainesville before being accepted by Villanova in the 1990s. The catalog has been kept in storage since then. The objective of this project is to digitize this archive and create a fully functional online catalog that contains the information available on the cards, along with scans of the actual cards. Our group has built a database using a Python-powered infrastructure to contain the collected data. The team has also built a prototype web-based searchable interface as a front-end to the catalog. Following the data-entry process, information such as the Right Ascension and Declination will be checked against SIMBAD, and any differences between values will be noted as part of the catalog. Information published online from the card catalog, and even discrepancies in the information for a star, could be a catalyst for new studies of these Eclipsing Binaries. Once completed, the database-driven interface will be made available to astronomers worldwide. The group will also derive, from the database, a list of referenced articles that have yet to be found online, in order to further pursue their digitization. This list will comprise references on the cards that were found neither on ADS nor elsewhere online during the data-entry process. Pursuing the integration of these references into online queries such as ADS will be an ongoing process that will contribute to and further facilitate studies of Eclipsing Binaries.
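    The planned SIMBAD cross-check amounts to comparing each card's coordinates against a reference position within some tolerance. A sketch, where the star name, coordinate values and the 1-arcsecond tolerance are all illustrative:

```python
import math

def flag_discrepancy(card, reference, tol_deg=1.0 / 3600):
    """Compare a card's RA/Dec (degrees) against a reference position
    (e.g. from a SIMBAD lookup) and flag differences larger than
    tol_deg (default: 1 arcsecond). RA difference is scaled by cos(dec)."""
    d_ra = abs(card["ra"] - reference["ra"]) * math.cos(math.radians(card["dec"]))
    d_dec = abs(card["dec"] - reference["dec"])
    return d_ra > tol_deg or d_dec > tol_deg

# hypothetical card entry vs. reference position
card = {"name": "example EB", "ra": 353.0145, "dec": 53.0217}
ref = {"ra": 353.0146, "dec": 53.0217}
print(flag_discrepancy(card, ref))  # -> False: within 1 arcsecond
```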

  16. The WATCH solar X-ray burst catalogue

    DEFF Research Database (Denmark)

    Crosby, N.; Lund, Niels; Vilmer, N.

    1998-01-01

    The WATCH experiment aboard the GRANAT satellite provides observations of the Sun in the deka-keV range covering the years 1990 through mid-1992. An introduction to the experiment is given followed by an explanation of how the WATCH solar burst catalogue was created. The different parameters list...

  17. Provisional host catalogue of Fig wasps (Hymenoptera, Chalcidoidea)

    NARCIS (Netherlands)

    Wiebes, J.T.

    1966-01-01

    INTRODUCTION In this catalogue — entitled "provisional" because our knowledge of the subject is still so evidently incomplete — all species of Ficus mentioned as hosts of fig wasps, are listed with the Hymenoptera Chalcidoidea reared from their receptacles. The names used for the Agaonidae are in

  18. Catalogue of Videorecordings and Films, Kindergarten to Grade 6, 1993.

    Science.gov (United States)

    Manitoba Dept. of Education, Winnipeg. Instructional Resources Branch.

    This catalogue lists and indexes 2,233 videorecordings, 16mm film, and videodisc titles held by the Library, Manitoba Education and Training for borrowing; some are also available for dubbing. The catalog indexes materials intended for children in kindergarten through grade 6, and is divided into three parts: an annotated title and series index, a…

  19. An annotated catalogue of the generic names of the Bromeliaceae

    NARCIS (Netherlands)

    Grant, J.R.; Zijlstra, G.

    1998-01-01

    An annotated catalogue of the known generic names of the Bromeliaceae is presented. It accounts for 187 names in six lists: I. Generic names (133), II. Invalid names (7), III. A synonymized checklist of the genera of the Bromeliaceae (56 accepted genera, and 77 synonyms), IV. Nothogenera (bigeneric

  20. Derivation of photometric redshifts for the 3XMM catalogue

    Science.gov (United States)

    Georgantopoulos, I.; Corral, A.; Mountrichas, G.; Ruiz, A.; Masoura, V.; Fotopoulou, S.; Watson, M.

    2017-10-01

    We present the results from our ESA Prodex project that aims to derive photometric redshifts for the 3XMM catalogue. The 3XMM DR-6 offers the largest X-ray survey, containing 470,000 unique sources over 1000 sq. degrees. We cross-correlate the X-ray positions with optical and near-IR catalogues using Bayesian statistics. The optical catalogue used so far is the SDSS, while currently we are employing the recently released Pan-STARRS catalogue. In the near-IR we use the VIKING, VHS and UKIDSS surveys, as well as the WISE W1 and W2 filters. The estimation of photometric redshifts is based on the TPZ software. The training sample is based on X-ray selected samples with available SDSS spectroscopy. We present here the results for the 40,000 3XMM sources with available SDSS counterparts. Our analysis provides very reliable photometric redshifts, with sigma(mad)=0.05 and a fraction of outliers of 8% for the optically extended sources. We discuss the wide range of applications that are feasible using this unprecedented resource.
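The quality figures quoted above (sigma(mad) and the outlier fraction) can be computed with the conventional photo-z statistics. The sketch below uses the standard sigma_NMAD definition and the common 0.15 outlier threshold; these exact definitions and the function name are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def photoz_quality(z_spec, z_phot, outlier_thresh=0.15):
    """Conventional photo-z quality metrics.

    sigma_NMAD: normalised median absolute deviation of
                dz = (z_phot - z_spec) / (1 + z_spec)
    outlier fraction: share of sources with |dz| > outlier_thresh
    """
    z_spec = np.asarray(z_spec, dtype=float)
    z_phot = np.asarray(z_phot, dtype=float)
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    sigma_nmad = 1.4826 * np.median(np.abs(dz - np.median(dz)))
    outlier_frac = float(np.mean(np.abs(dz) > outlier_thresh))
    return sigma_nmad, outlier_frac
```

For example, `photoz_quality(z_spec_sample, z_phot_sample)` on a training/validation split gives the pair of numbers reported in abstracts like the one above.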

  1. The oldest preserved Slovenian library catalogue – 1st part

    Directory of Open Access Journals (Sweden)

    Stanislav Južnič

    2008-01-01

    Full Text Available The article presents the professional work of the librarian Filip Terpin from Selca (North of Škofja Loka, Slovenia. Terpin’s catalogue of the bishop’s library in Gornji grad is the oldest catalogue in Slovenia, with the exception of some protestant book lists which date even earlier. It was created in 1655, thirteen years before the second oldest preserved catalogue, which was made by his contemporary Schönleben. Terpin’s catalogue is carefully described and compared with Schönleben’s list of Volf Engelbert Auersperg’s library, which was made in 1668. The author highlights the professional relations between the two librarians and describes the fate of Primož Trubar’s library and the bishop’s library in Gornji grad. The article is part of an extensive research project set out to identify and explore the provenance of science books which were under Terpin’s custody in the Gornji grad library, especially those owned by the Slovenian protestants or even Primož Trubar himself.

  2. EURISCO: The European search catalogue for plant genetic resources

    NARCIS (Netherlands)

    Weise, Stephan; Oppermann, Markus; Maggioni, Lorenzo; Hintum, van Theo; Knüpffer, Helmut

    2017-01-01

    The European Search Catalogue for Plant Genetic Resources, EURISCO, provides information about 1.8 million crop plant accessions preserved by almost 400 institutes in Europe and beyond. EURISCO is being maintained on behalf of the European Cooperative Programme for Plant Genetic Resources. It is

  3. The WATCH solar X-ray burst catalogue

    DEFF Research Database (Denmark)

    Crosby, N.; Lund, Niels; Vilmer, N.

    1998-01-01

    The WATCH experiment aboard the GRANAT satellite provides observations of the Sun in the deka-keV range covering the years 1990 through mid-1992. An introduction to the experiment is given followed by an explanation of how the WATCH solar burst catalogue was created. The different parameters listed...

  4. Networking the library catalogue: Lessons from the Kwame ...

    African Journals Online (AJOL)

    This paper describes the general procedure that the Kwame Nkrumah University of Science and Technology (KNUST) Library followed for carrying out the automation and subsequent networking of its library catalogues. The highlights of these activities include choice of software, selection of vendors, conversion of records ...

  5. consortial efforts in cataloguing, bibliography and indexing services ...

    African Journals Online (AJOL)

    manda

    consortium is a “formal association of a number of organizations, usually in a specific geographical area, with agreed goals and objectives. Services covered can include collection development, cataloguing, computer alliances, systems support, education and training, inter-library loans, library automation, purchasing, etc”.

  6. Modelling and Implementation of Catalogue Cards Using FreeMarker

    Science.gov (United States)

    Radjenovic, Jelena; Milosavljevic, Branko; Surla, Dusan

    2009-01-01

    Purpose: The purpose of this paper is to report on a study involving the specification (using Unified Modelling Language (UML) 2.0) of information requirements and implementation of the software components for generating catalogue cards. The implementation in a Java environment is developed using the FreeMarker software.…

  7. Impact of magnitude uncertainties on seismic catalogue properties

    Science.gov (United States)

    Leptokaropoulos, K. M.; Adamaki, A. K.; Roberts, R. G.; Gkarlaouni, C. G.; Paradisopoulou, P. M.

    2018-05-01

    Catalogue-based studies are of central importance in seismological research, to investigate the temporal, spatial and size distribution of earthquakes in specified study areas. Methods for estimating the fundamental catalogue parameters like the Gutenberg-Richter (G-R) b-value and the completeness magnitude (Mc) are well established and routinely applied. However, the magnitudes reported in seismicity catalogues contain measurement uncertainties which may significantly distort the estimation of the derived parameters. In this study, we use numerical simulations of synthetic data sets to assess the reliability of different methods for determining b-value and Mc, assuming the G-R law validity. After contaminating the synthetic catalogues with Gaussian noise (with selected standard deviations), the analysis is performed for numerous data sets of different sample size (N). The noise introduced to the data generally leads to a systematic overestimation of magnitudes close to and above Mc. This fact causes an increase of the average number of events above Mc, which in turn leads to an apparent decrease of the b-value. This may result in a significant overestimation of seismicity rate even well above the actual completeness level. The b-value can in general be reliably estimated even for relatively small data sets (N < 1000) when only magnitudes higher than the actual completeness level are used. Nevertheless, a correction of the total number of events belonging in each magnitude class (i.e. 0.1 unit) should be considered, to deal with the magnitude uncertainty effect. Also important is that for some data analyses significant bias cannot necessarily be avoided by choosing a high Mc value for analysis. In such cases, there may be a risk of severe miscalculation of seismicity rate regardless of the selected magnitude threshold, unless possible bias is properly assessed.
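The simulation scheme described above can be sketched in a few lines: draw exponentially distributed (G-R) magnitudes, contaminate them with Gaussian noise, and compare maximum-likelihood b-value estimates. This is a minimal NumPy sketch, not the authors' code; the Aki/Utsu estimator and all parameter values (b = 1, sigma = 0.2, Mc = 2.5) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def gr_magnitudes(n, b=1.0, m_min=2.0):
    # Gutenberg-Richter magnitudes are exponentially distributed above m_min,
    # with rate beta = b * ln(10)
    return m_min + rng.exponential(1.0 / (b * np.log(10.0)), size=n)

def b_value_ml(mags, mc, dm=0.1):
    # Aki/Utsu maximum-likelihood b-value with the dm/2 binning correction
    m = mags[mags >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

mags = gr_magnitudes(50_000)                          # "true" catalogue, b = 1
noisy = mags + rng.normal(0.0, 0.2, size=mags.size)   # Gaussian magnitude errors

# More small events are scattered above the threshold than large events below it,
# inflating the apparent number of events above Mc, as described in the abstract.
b_clean = b_value_ml(np.round(mags, 1), mc=2.5)
b_noisy = b_value_ml(np.round(noisy, 1), mc=2.5)
```

Comparing `(noisy >= 2.5).sum()` with `(mags >= 2.5).sum()` reproduces the rate-inflation effect; repeating the experiment over many noise realisations and sample sizes N gives the Monte Carlo assessment the study performs.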

  8. Retrospective Conversion of Card catalogue at the University of ...

    African Journals Online (AJOL)

    Also, the article describes the conversion methods employed, procedures followed, and pre-conversion preparations made by the library to automate its card catalogue and to assign barcode labels to the collection. Finally it concludes by highlighting problems encountered throughout the project and by giving statistical ...

  9. Procedures and challenges of retrospective catalogue conversion in ...

    African Journals Online (AJOL)

    The study recommended that management of the universities should provide a stand-by electricity generator and upgrade Internet network services, among other things, in the two university libraries for effective and efficient service delivery. Key words: Catalogue, Libraries Procedures, Conversion, Universities ...

  10. International Catalogue of Sealed Radioactive Sources and Devices

    International Nuclear Information System (INIS)

    2010-01-01

    The International Catalogue of Sealed Radioactive Sources and Devices has two major objectives. The first objective is to provide vital information for a wide range of individuals and organizations on industrially manufactured radioactive sources and devices. The second is to facilitate identification of design specifications based on limited information from orphan sources and devices, to allow safe handling of these items.

  11. Problems and Challenges of Automating Cataloguing Process at ...

    African Journals Online (AJOL)

    This paper discusses the problems faced by Kenneth Dike Library in automating its cataloguing process since 1992. It further attempts to identify some of the constraints inhibiting the success of the process: inadequate funding, dearth of systems analysts, absence of dedicated commitment to automation on the part of the ...

  12. The Semantics of Web Services: An Examination in GIScience Applications

    Directory of Open Access Journals (Sweden)

    Xuan Shi

    2013-09-01

    Full Text Available Web service is a technological solution for software interoperability that supports the seamless integration of diverse applications. In the vision of the web service architecture, web services are described by the Web Service Description Language (WSDL), discovered through Universal Description, Discovery and Integration (UDDI), and communicate by the Simple Object Access Protocol (SOAP). Such a vision has never been fully accomplished. Although WSDL was criticized for providing only a syntactic, not semantic, definition of web services, prior initiatives in semantic web services did not establish a correct methodology to resolve the problem. This paper examines the distinction and relationship between the syntactic and semantic definitions of web services, which characterize different purposes in service computation. Further, this paper proposes that the semantics of a web service are neutral and independent of the service interface definition, data types and platform. Such a conclusion can be a universal law in software engineering and service computing. Several use cases in GIScience applications are examined in this paper, while the formalization of geospatial services needs to be constructed by the GIScience community towards a comprehensive ontology of the conceptual definitions and relationships for geospatial computation. Advancements in semantic web services research will happen in domain science applications.

  13. The Planck Compact Source Catalogues: present and future

    Science.gov (United States)

    López-Caniego, Marcos

    The Planck Collaboration has produced catalogues of radio and sub-millimeter compact sources at the nine Planck frequencies in total intensity and polarization. In particular, the 2015 Second Planck Catalogue of Compact Sources (PCCS2) contains over 45,000 sources detected in the Planck full mission maps. Since the Planck instruments have polarization capabilities in seven of its nine detectors, we were able to measure the polarized flux density of over 600 sources between 30 and 353 GHz. But we search not only for compact sources in single frequency maps; we also take advantage of the large frequency coverage of Planck to search for objects with specific emission laws. This is the case for the SZ catalogue of clusters of galaxies (PSZ2), which lists 1653 clusters, 1203 of which are confirmed clusters with clear associations in external data-sets, and the Galactic cold clump catalogue (PGCC) with 13188 objects. The Planck Collaboration has also published a list of high-redshift source candidates (see the report by Ludovic Montier here). These objects are rare bright sub-millimeter sources with a spectral energy distribution peaking between 353 and 857 GHz, and have been detected combining Planck and IRAS data. The colours of most of these objects are consistent with redshifts z>2, a fraction of which could be lensed objects with redshifts between 2 and 4. But new catalogues are foreseen. A multi-frequency compact source catalogue is being produced by selecting sources at radio frequencies and studying them across all Planck bands. Multi-frequency catalogues can be difficult to produce in experiments like Planck that have a large frequency coverage and very different resolutions across bands. In some cases, a source can be very bright across the whole Planck frequency range and it is easy to make the associations across channels. However, it is frequent to find unrelated sub-millimeter sources within the half-degree beam of the 30 GHz low frequency detector, and the

  14. Web Mining

    Science.gov (United States)

    Fürnkranz, Johannes

    The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to Web data and documents. This chapter provides a brief overview of web mining techniques and research areas, most notably hypertext classification, wrapper induction, recommender systems and web usage mining.

  15. Impact of magnitude uncertainties on seismic catalogue properties

    Science.gov (United States)

    Leptokaropoulos, K. M.; Adamaki, A. K.; Roberts, R. G.; Gkarlaouni, C. G.; Paradisopoulou, P. M.

    2018-01-01

    Catalogue based studies are of central importance in seismological research, to investigate the temporal, spatial and size distribution of earthquakes in specified study areas. Methods for estimating the fundamental catalogue parameters like the Gutenberg-Richter (G-R) b-value and the completeness magnitude (Mc) are well established and routinely applied. However, the magnitudes reported in seismicity catalogues contain measurement uncertainties which may significantly distort the estimation of the derived parameters. In this study, we use numerical simulations of synthetic data sets to assess the reliability of different methods for determining b-value and Mc, assuming the G-R law validity. After contaminating the synthetic catalogues with Gaussian noise (with selected standard deviations), the analysis is performed for numerous data sets of different sample size (N). The noise introduced to the data generally leads to a systematic overestimation of magnitudes close to and above Mc. This fact causes an increase of the average number of events above Mc, which in turn leads to an apparent decrease of the b-value. This may result to a significant overestimation of seismicity rate even well above the actual completeness level. The b-value can in general be reliably estimated even for relatively small data sets (N < 1000) when only magnitudes higher than the actual completeness level are used. Nevertheless, a correction of the total number of events belonging in each magnitude class (i.e. 0.1 unit) should be considered, to deal with the magnitude uncertainty effect. Because magnitude uncertainties (here with the form of Gaussian noise) are inevitable in all instrumental catalogues, this finding is fundamental for seismicity rate and seismic hazard assessment analyses. Also important is that for some data analyses significant bias cannot necessarily be avoided by choosing a high Mc value for analysis. In such cases there may be a risk of severe miscalculation of

  16. Improving catalogue matching by supplementing astrometry with additional photometric information

    Science.gov (United States)

    Wilson, Tom J.; Naylor, Tim

    2018-02-01

    The matching of sources between photometric catalogues can lead to cases where objects of differing brightness are incorrectly assumed to be detections of the same source. The rejection of unphysical matches can be achieved through the inclusion of information about the sources' magnitudes. The method described here uses the additional photometric information from both catalogues in the process of accepting or rejecting counterparts, providing approximately a factor of 10 improvement in Bayes' factor with its inclusion. When folding in the photometric information we avoid using prior astrophysical knowledge. Additionally, the method allows for the possibility of no counterparts to sources and the possibility that sources overlap multiple potential counterparts. We formally describe the probability of two sources being the same astrometric object, allowing systematic effects of astrometric perturbation (by e.g. contaminant objects) to be accounted for. We apply the method to two cases. First, we test INT Photometric Hα Survey of the Northern Galactic Plane (IPHAS)-Gaia matches to compare the resulting matches in two catalogues of similar wavelength coverage but differing dynamic ranges. Second, we apply the method to matches between IPHAS and the Two Micron All Sky Survey and show that the method holds when considering two catalogues with approximately equal astrometric precision. We discuss the importance of including the magnitude information in each case. Additionally, we discuss extending the method to multiple catalogue matches through an iterative matching process. The method allows for the selection of high-quality matches by providing an overall probability for each pairing, giving the flexibility to choose stars known to be good matches.
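The core idea of probabilistic cross-matching can be illustrated with the classical Sutherland-Saunders-style likelihood ratio: a Gaussian positional likelihood divided by the local density of chance interlopers. This sketch keeps only the astrometric term; the paper's method additionally folds in magnitude information from both catalogues, which is omitted here, and the function names are illustrative:

```python
import math

def positional_likelihood_ratio(sep_arcsec, sigma_arcsec, background_density):
    """Simplified match likelihood ratio: probability density of observing the
    separation if the sources are the same object (Gaussian with astrometric
    uncertainty sigma), over the density of unrelated sources per sq. arcsec.
    """
    sigma2 = sigma_arcsec ** 2
    gaussian = math.exp(-0.5 * sep_arcsec ** 2 / sigma2) / (2.0 * math.pi * sigma2)
    return gaussian / background_density

def match_probability(lr):
    # Convert a likelihood ratio into a match probability, assuming equal priors
    return lr / (1.0 + lr)
```

A close pair (separation well inside sigma) yields a probability near 1, while a pair several sigma apart is rejected; the paper's full treatment would multiply this astrometric term by a photometric likelihood ratio before normalising over all candidate counterparts.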

  17. Catalogue of Tenebrionidae (Coleoptera) of North America

    Directory of Open Access Journals (Sweden)

    Yves Bousquet

    2018-01-01

    Full Text Available This catalogue includes all valid family-group (8 subfamilies, 52 tribes, 14 subtribes), genus-group (349 genera, 86 subgenera), and species-group names (2825 species, 215 subspecies) of darkling beetles (Coleoptera: Tenebrionidae) known to occur in North America1 and their available synonyms. Data on extant, subfossil and fossil taxa are given. For each name the author and year and page number of the description are provided, with additional information (e.g., type species for genus-group names, author of synonymies for invalid taxa) depending on the taxon rank. Several new nomenclatural acts are included. One new genus, Lepidocnemeplatia Bousquet and Bouchard, is described. Spelaebiosis Bousquet and Bouchard [for Ardoinia Özdikmen, 2004], Blapstinus marcuzzii Aalbu [for Blapstinus kulzeri Marcuzzi, 1977], and Hymenorus campbelli Bouchard [for Hymenorus oculatus Doyen and Poinar, 1994] are proposed as new replacement names. Supporting evidence is provided for the conservation of usage of Tarpela micans (Fabricius, 1798) nomen protectum over Tarpela vittata (Olivier, 1793) nomen oblitum. The generic names Psilomera Motschulsky, 1870 [= Stenomorpha Solier, 1836], Steneleodes Blaisdell, 1909 [= Xysta Eschscholtz, 1829], Ooconibius Casey, 1895 and Euconibius Casey, 1895 [= Conibius LeConte, 1851] are new synonyms (valid names in square brackets). The following 127 new synonymies of species-group names, listed in their original combination, are proposed (valid names, in their current combination, placed in square brackets): Bothrasida mucorea Wilke, 1922 [= Pelecyphorus guanajuatensis (Champion, 1884)]; Parasida zacualpanicola Wilke, 1922 [= Pelecyphorus asidoides Solier, 1836]; Stenosides kulzeri Pallister, 1954, Stenosides bisinuatus Pallister, 1954, and Parasida trisinuata Pallister, 1954 [= Pelecyphorus dispar (Champion, 1892)]; Asida favosa Champion, 1884 and Asida similata Champion, 1884 [= Pelecyphorus fallax (Champion, 1884)]; Ologlyptus bicarinatus

  18. Catalogue of Tenebrionidae (Coleoptera) of North America

    Science.gov (United States)

    Bousquet, Yves; Thomas, Donald B.; Bouchard, Patrice; Smith, Aaron D.; Aalbu, Rolf L.; Johnston, M. Andrew; Jr., Warren E. Steiner

    2018-01-01

    Abstract This catalogue includes all valid family-group (8 subfamilies, 52 tribes, 14 subtribes), genus-group (349 genera, 86 subgenera), and species-group names (2825 species, 215 subspecies) of darkling beetles (Coleoptera: Tenebrionidae) known to occur in North America1 and their available synonyms. Data on extant, subfossil and fossil taxa are given. For each name the author and year and page number of the description are provided, with additional information (e.g., type species for genus-group names, author of synonymies for invalid taxa) depending on the taxon rank. Several new nomenclatural acts are included. One new genus, Lepidocnemeplatia Bousquet and Bouchard, is described. Spelaebiosis Bousquet and Bouchard [for Ardoinia Özdikmen, 2004], Blapstinus marcuzzii Aalbu [for Blapstinus kulzeri Marcuzzi, 1977], and Hymenorus campbelli Bouchard [for Hymenorus oculatus Doyen and Poinar, 1994] are proposed as new replacement names. Supporting evidence is provided for the conservation of usage of Tarpela micans (Fabricius, 1798) nomen protectum over Tarpela vittata (Olivier, 1793) nomen oblitum. The generic names Psilomera Motschulsky, 1870 [= Stenomorpha Solier, 1836], Steneleodes Blaisdell, 1909 [= Xysta Eschscholtz, 1829], Ooconibius Casey, 1895 and Euconibius Casey, 1895 [= Conibius LeConte, 1851] are new synonyms (valid names in square brackets). The following 127 new synonymies of species-group names, listed in their original combination, are proposed (valid names, in their current combination, placed in square brackets): Bothrasida mucorea Wilke, 1922 [= Pelecyphorus guanajuatensis (Champion, 1884)]; Parasida zacualpanicola Wilke, 1922 [= Pelecyphorus asidoides Solier, 1836]; Stenosides kulzeri Pallister, 1954, Stenosides bisinuatus Pallister, 1954, and Parasida trisinuata Pallister, 1954 [= Pelecyphorus dispar (Champion, 1892)]; Asida favosa Champion, 1884 and Asida similata Champion, 1884 [= Pelecyphorus fallax (Champion, 1884)]; Ologlyptus bicarinatus

  19. VizieR Online Data Catalog: LUT Survey Catalogue Data Release 1 (Men+, 2016)

    Science.gov (United States)

    Meng, X.-M.; Han, X.-H.; Wei, J.-Y.; Wang, J.; Cao, L.; Qiu, Y.-L.; Wu, C.; Deng, J.-S.; Cai, H.-B.; Xin, L.-P.

    2016-06-01

    In the first release version, the catalogue provides high-confidence sources which have been cross-identified with the Tycho-2 catalogue. The catalogue provides equatorial coordinate positions, magnitudes measured by aperture photometry and PSF photometry, the photometric apertures used, and the corresponding Tycho-2 records of the stars detected by the LUT survey program. No aperture correction or extinction correction was applied to these measurements. The catalogue is in IPAC table format. (1 data file).

  20. A novel algorithm for fully automated mapping of geospatial ontologies

    Science.gov (United States)

    Chaabane, Sana; Jaziri, Wassim

    2018-01-01

    Geospatial information is collected from different sources, making spatial ontologies built for the same geographic domain heterogeneous; different and heterogeneous conceptualizations may therefore coexist. Ontology integration helps create a common repository of geospatial ontologies and allows the heterogeneities between existing ontologies to be removed. Ontology mapping is a process used in ontology integration and consists in finding correspondences between the source ontologies. This paper deals with the mapping process for geospatial ontologies, which consists in applying an automated algorithm to find correspondences between concepts according to the definitions of matching relationships. The proposed algorithm, called the "geographic ontologies mapping algorithm", defines three types of mapping: semantic, topological and spatial.
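A toy version of the semantic part of such a mapping step can be sketched as pairing concepts from two ontologies whose labels are lexically similar. Everything here (function name, threshold, use of string similarity) is an illustrative assumption; the paper's algorithm additionally defines topological and spatial matching relationships, which would require geometry-aware comparisons rather than label similarity:

```python
from difflib import SequenceMatcher

def semantic_mappings(source_concepts, target_concepts, threshold=0.8):
    """For each source concept, find the most lexically similar target
    concept; keep the pair as a candidate correspondence if the similarity
    reaches the threshold."""
    mappings = []
    for s in source_concepts:
        best, score = None, 0.0
        for t in target_concepts:
            r = SequenceMatcher(None, s.lower(), t.lower()).ratio()
            if r > score:
                best, score = t, r
        if score >= threshold:
            mappings.append((s, best, round(score, 2)))
    return mappings
```

For example, `semantic_mappings(["River", "Road"], ["river", "Building"])` keeps only the River/river correspondence; a real mapping algorithm would then verify such candidates against topological and spatial relations before merging the ontologies.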

  1. Searches over graphs representing geospatial-temporal remote sensing data

    Energy Technology Data Exchange (ETDEWEB)

    Brost, Randolph; Perkins, David Nikolaus

    2018-03-06

    Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Geospatial-temporal graph searches are made computationally efficient by taking advantage of characteristics of geospatial-temporal data in remote sensing images through the application of various graph search techniques.
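The graph representation described above (objects as nodes, undirected spatial edges, directed temporal edges) can be sketched as a small data structure. The class and method names are illustrative, not the patented implementation:

```python
class GeoTemporalGraph:
    """Minimal sketch of a geospatial-temporal graph: image objects are nodes;
    undirected edges carry a distance/adjacency relation between objects in the
    same scene; directed edges link an object to its successor in a later scene."""

    def __init__(self):
        self.nodes = {}    # node_id -> attribute dict
        self.spatial = []  # (a, b, relation): undirected, within one scene
        self.temporal = [] # (earlier, later): directed, across scenes

    def add_object(self, node_id, **attrs):
        self.nodes[node_id] = attrs

    def add_spatial_edge(self, a, b, relation):
        self.spatial.append((a, b, relation))

    def add_temporal_edge(self, earlier, later):
        self.temporal.append((earlier, later))

    def successors(self, node_id):
        # follow directed time edges forward from a node
        return [b for a, b in self.temporal if a == node_id]

# Two scenes of the same area: a building adjacent to a road, observed twice.
g = GeoTemporalGraph()
g.add_object("building_t0", kind="building")
g.add_object("building_t1", kind="building")
g.add_object("road_t0", kind="road")
g.add_spatial_edge("building_t0", "road_t0", relation="adjacent")
g.add_temporal_edge("building_t0", "building_t1")
```

A search for "objects of interest" then becomes subgraph matching over such structures, e.g. "a building adjacent to a road that changes between two dates".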

  2. GPU based framework for geospatial analyses

    Science.gov (United States)

    Cosmin Sandric, Ionut; Ionita, Cristian; Dardala, Marian; Furtuna, Titus

    2017-04-01

    Parallel processing on multiple CPU cores is already used at large scale in geocomputing, but parallel processing on graphics cards is just at the beginning. Being able to use a simple laptop with a dedicated graphics card for advanced and very fast geocomputation is an advantage that every scientist wants to have. The need for high-speed computation in the geosciences has increased in the last 10 years, mostly due to the increase in available datasets. These datasets are becoming more and more detailed and hence require more space to store and more time to process. Distributed computation on multicore CPUs and GPUs plays an important role by processing small parts of these big datasets one by one. This way of computing speeds up the process because, instead of using just one process for each dataset, the user can use all the cores of a CPU or up to hundreds of cores of a GPU. The framework provides the end user with standalone tools for morphometric analyses at multiple scales. An important part of the framework is dedicated to uncertainty propagation in geospatial analyses. The uncertainty may come from the data collection, may be induced by the model, or may have countless other sources. These uncertainties play an important role when a spatial delineation of the phenomenon is modelled. Uncertainty propagation is implemented inside the GPU framework using Monte Carlo simulations. The GPU framework with its standalone tools proved to be a reliable tool for modelling complex natural phenomena. The framework is based on NVidia CUDA technology and is written in the C++ programming language. The source code will be available on GitHub at https://github.com/sandricionut/GeoRsGPU Acknowledgement: GPU framework for geospatial analysis, Young Researchers Grant (ICUB-University of Bucharest) 2016, director Ionut Sandric
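The Monte Carlo uncertainty-propagation scheme mentioned above can be illustrated independently of the GPU: perturb the input raster with noise many times, re-run the terrain model, and summarise the spread of the outputs. This NumPy sketch only shows the statistical scheme, with a hypothetical finite-difference slope as the terrain model; the real framework executes the per-run work on the GPU in C++/CUDA:

```python
import numpy as np

rng = np.random.default_rng(0)

def monte_carlo_uncertainty(dem, model, sigma=0.5, n_runs=200):
    """Perturb the DEM with Gaussian noise (std sigma, in elevation units)
    n_runs times, apply the terrain model to each realisation, and return the
    per-cell mean and standard deviation of the model outputs."""
    outputs = np.stack([model(dem + rng.normal(0.0, sigma, dem.shape))
                        for _ in range(n_runs)])
    return outputs.mean(axis=0), outputs.std(axis=0)

def slope_x(dem, cell=30.0):
    # simple finite-difference slope along the x axis (dimensionless rise/run)
    return np.abs(np.diff(dem, axis=1)) / cell

# Tiny 2x3 elevation grid (metres), 30 m cells
dem = np.array([[100.0, 103.0, 109.0],
                [101.0, 104.0, 110.0]])
mean_slope, slope_sd = monte_carlo_uncertainty(dem, slope_x)
```

The standard-deviation surface (`slope_sd`) is the uncertainty map: cells where the derived quantity is sensitive to input noise stand out, which is exactly what matters when a spatial delineation of a phenomenon is drawn from the model output.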

  3. Matilda, where are you: subject description of juvenile fiction in the Slovenian catalogue and catalogues of neighbouring countries

    Directory of Open Access Journals (Sweden)

    Alenka Šauperl

    2009-01-01

    Full Text Available Differences in the subject description of juvenile fiction were investigated for five examples of international classics in five library catalogues in September 2008: the Oton Župančič Public Library (Knjižnica Otona Župančiča) in Ljubljana, Slovenia, the Stadtbibliothek public library in Graz, Austria, and the integrated catalogues of libraries in the Gorizia region in Italy (Sistema bibliotecario della Provincia di Gorizia) and the Karlovac region in Croatia (Skupni katalog knjižnica Karlovačke županije). As Slovenian youth rarely speak the languages of neighbouring countries, the British Library catalogue was added. Results show that catalogue records are inconsistent within an individual library as well as in comparison with other libraries in the sample. Librarians do not make consistent subject descriptions. The class number, which is present in all catalogues except the Austrian one, usually represents the author’s country, language and/or nationality, the literary genre, and the target audience. Subject headings in the sample give information on the subject (aboutness), the author’s country, language and/or nationality, the literary genre, and the target audience. Summaries tell more about the story, but they can also carry information on the emotional experience of the reader, on the author, or on the history of the literary work. It would be economically beneficial if subject description were more consistent, but uniform subject description is not possible because of diverse library collections and users. The solution might be the use of multiple levels of subject description according to the type of library.

  4. Brokered virtual hubs for facilitating access and use of geospatial Open Data

    Science.gov (United States)

    Mazzetti, Paolo; Latre, Miguel; Kamali, Nargess; Brumana, Raffaella; Braumann, Stefan; Nativi, Stefano

    2016-04-01

    Open Data is a major trend in the current information technology scenario and is often publicised as one of the pillars of the information society of the near future. In particular, geospatial Open Data have a huge potential also for the Earth Sciences, through the enablement of innovative applications and services integrating heterogeneous information. However, open does not mean usable. As was recognized at the very beginning of the Web revolution, many different degrees of openness exist: from simple sharing in a proprietary format to advanced sharing in standard formats including semantic information. Therefore, to fully unleash the potential of geospatial Open Data, advanced infrastructures are needed to increase the degree of data openness, enhancing their usability. In October 2014, the ENERGIC OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". The ENERGIC OD Virtual Hubs aim to facilitate the use of geospatial Open Data by lowering and possibly removing the main barriers which hamper geo-information (GI) usage by end-users and application developers. Data and services heterogeneity is recognized as one of the major barriers to Open Data (re-)use. It forces end-users and developers to spend a lot of effort accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer specific challenges and to address specific use-cases. Thus

  5. A Framework for an Open Source Geospatial Certification Model

    Science.gov (United States)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the coming years, with an extended need for a well-educated workforce; ongoing education and training therefore play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have gained significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative, together with the growth and maturity of geospatial Open Source software, prompted the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification is already offered by numerous organisations (e.g., the GIS Certification Institute, GeoAcademy, ASPRS) and software vendors (e.g., Esri, Oracle, and RedHat). These offerings focus on different fields of expertise, with different levels and modes of examination, for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry, and essentially shaped the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  6. A Geospatial Decision Support System Toolkit, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to design a working prototype Geospatial Decision Support Toolkit (GeoKit) that will enable scientists, agencies, and stakeholders to configure and deploy...

  7. A Geospatial Decision Support System Toolkit, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and commercialize a working prototype Geospatial Decision Support Toolkit (GeoKit). GeoKit will enable scientists, agencies, and stakeholders to...

  8. A FRAMEWORK FOR AN OPEN SOURCE GEOSPATIAL CERTIFICATION MODEL

    Directory of Open Access Journals (Sweden)

    T. U. R. Khan

    2016-06-01

    Full Text Available The geospatial industry is forecast to grow enormously in the coming years, with an extended need for a well-educated workforce; ongoing education and training therefore play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have gained significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the ICA-OSGeo-Lab initiative with its mission “Making geospatial education and opportunities accessible to all”. Discussions in this initiative, together with the growth and maturity of geospatial Open Source software, prompted the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification is already offered by numerous organisations (e.g., the GIS Certification Institute, GeoAcademy, ASPRS) and software vendors (e.g., Esri, Oracle, and RedHat). These offerings focus on different fields of expertise, with different levels and modes of examination, for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the “Geographic Information: Need to Know” (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry, and essentially shaped the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and

  9. Dhaka megacity geospatial perspectives on urbanisation, environment and health

    CERN Document Server

    Dewan, Ashraf

    2014-01-01

    Focused on Dhaka, and applicable to other cities, this book uses geospatial techniques to explore land use, climate variability, urban sprawl, population density modeling, flooding, water quality, urban growth modeling, infectious disease and quality of life.

  10. A Federated Geospatial and Imagery Exploitation Service (GIXS) Model

    National Research Council Canada - National Science Library

    Weber, Derek

    2000-01-01

    In order for the Geospatial and Imagery Exploitation Service (GIXS) architecture to take advantage of distributed processing of image exploitation tasks, it needs to be adapted to suit a federated environment...

  11. Conference on Geospatial Approaches to Cancer Control and Population Sciences

    Science.gov (United States)

    The purpose of this conference is to bring together a community of researchers across the cancer control continuum using geospatial tools, models and approaches to address cancer prevention and control.

  12. CREATING OF CENTRAL GEOSPATIAL DATABASE OF THE SLOVAK REPUBLIC AND PROCEDURES OF ITS REVISION

    Directory of Open Access Journals (Sweden)

    M. Miškolci

    2016-06-01

    Full Text Available The article describes the creation of the initial three-dimensional geodatabase, from planning and design through the determination of technological and manufacturing processes to the practical use of the Central Geospatial Database (CGD – the official name in Slovak is Centrálna Priestorová Databáza, CPD), and briefly describes the procedures for its revision. CGD ensures the proper collection, processing, storing, transferring and displaying of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis management tasks and by the Integrated Rescue System. For military personnel CGD runs on the MoD intranet; for users outside the MoD it is transformed into ZbGIS (the Primary Geodatabase of the Slovak Republic) and runs on a public web site. CGD is a global set of geospatial information: a vector computer model that completely covers the entire territory of Slovakia. The seamless CGD is created by digitizing the real world using photogrammetric stereoscopic methods and measurements of object properties. The basic vector model of CGD (from photogrammetric processing) is then taken into the field for inspection and additional gathering of object properties across the whole mapping area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD offers the opportunity to get to know the territory comprehensively in all three spatial dimensions. Every entity in CGD records its time of collection, which allows an individual to assess the timeliness of the information. CGD can be utilized for geographical analysis, geo-referencing, cartographic purposes and various special-purpose mapping, and has the ambition to cover the needs of not only the MoD, but to become a reference model for the national geographical infrastructure.

  13. Creating of Central Geospatial Database of the Slovak Republic and Procedures of its Revision

    Science.gov (United States)

    Miškolci, M.; Šafář, V.; Šrámková, R.

    2016-06-01

    The article describes the creation of the initial three-dimensional geodatabase, from planning and design through the determination of technological and manufacturing processes to the practical use of the Central Geospatial Database (CGD - the official name in Slovak is Centrálna Priestorová Databáza - CPD), and briefly describes the procedures for its revision. CGD ensures the proper collection, processing, storing, transferring and displaying of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis management tasks and by the Integrated Rescue System. For military personnel CGD runs on the MoD intranet; for users outside the MoD it is transformed into ZbGIS (the Primary Geodatabase of the Slovak Republic) and runs on a public web site. CGD is a global set of geospatial information: a vector computer model that completely covers the entire territory of Slovakia. The seamless CGD is created by digitizing the real world using photogrammetric stereoscopic methods and measurements of object properties. The basic vector model of CGD (from photogrammetric processing) is then taken into the field for inspection and additional gathering of object properties across the whole mapping area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD offers the opportunity to get to know the territory comprehensively in all three spatial dimensions. Every entity in CGD records its time of collection, which allows an individual to assess the timeliness of the information. CGD can be utilized for geographical analysis, geo-referencing, cartographic purposes and various special-purpose mapping, and has the ambition to cover the needs of not only the MoD, but to become a reference model for the national geographical infrastructure.

  14. Real-time GIS data model and sensor web service platform for environmental data management.

    Science.gov (United States)

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is meaningful for human health. In the past, environmental data management meant developing a purpose-built environmental data management system, an approach that often lacks real-time data retrieval and sharing/interoperation capabilities. With the development of information technology, a Geospatial Service Web approach can be employed for environmental data management. The purpose of this study is to determine how to realize environmental data management under the Geospatial Service Web framework. A real-time GIS (Geographic Information System) data model and a Sensor Web service platform are proposed for this purpose. The real-time GIS data model manages real-time data; the Sensor Web service platform, built on Sensor Web technologies, supports the realization of that data model and is implemented in this study. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases, real-time air quality monitoring and real-time soil moisture monitoring, based on the real-time GIS data model are realized and demonstrated in the platform. The total response times for the two experiments are 3.7 s and 9.2 s, respectively. The experimental results show that integrating the real-time GIS data model with the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.
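
    The record above describes pairing static geographic features with continuously arriving sensor observations. As a minimal sketch of such a real-time GIS data model (the class and field names here are hypothetical illustrations, not the authors' actual schema), a feature can hold a rolling series of timestamped observations per property and answer latest-value queries:

    ```python
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Dict, List, Tuple

    @dataclass
    class Observation:
        """A single timestamped sensor reading."""
        time: datetime
        value: float

    @dataclass
    class RealTimeFeature:
        """A geographic feature whose attribute values change over time."""
        feature_id: str
        location: Tuple[float, float]  # (lon, lat)
        observations: Dict[str, List[Observation]] = field(default_factory=dict)

        def ingest(self, prop: str, time: datetime, value: float) -> None:
            """Append a new observation for one observed property."""
            self.observations.setdefault(prop, []).append(Observation(time, value))

        def latest(self, prop: str) -> Observation:
            """Return the most recent observation of a property."""
            return max(self.observations[prop], key=lambda o: o.time)

    # Usage: a hypothetical air-quality station streaming PM2.5 readings
    station = RealTimeFeature("station-042", (114.3, 30.6))
    station.ingest("pm25", datetime(2015, 1, 9, 8, 0, tzinfo=timezone.utc), 35.0)
    station.ingest("pm25", datetime(2015, 1, 9, 9, 0, tzinfo=timezone.utc), 42.5)
    print(station.latest("pm25").value)  # -> 42.5
    ```

    In a Sensor Web deployment the `ingest` calls would be fed by sensor service notifications rather than direct method calls, but the separation of stable feature identity from streaming observations is the same.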

  15. Accuracy vs. Performance: Finding the Sweet Spot in the Geospatial Resolution of Satellite Metadata

    Science.gov (United States)

    Baskin, W. E.; Mangosing, D. C.; Rinsland, P. L.

    2010-12-01

    NASA’s Atmospheric Science Data Center (ASDC) and the Cloud-Aerosol LIDAR and Infrared Pathfinder Satellite Observation (CALIPSO) team at the NASA Langley Research Center recently collaborated in the development of a new CALIPSO Search and Subset web application. The web application comprises three elements: (1) a PostGIS-enabled PostgreSQL database system, used to store temporal and geospatial metadata from CALIPSO’s LIDAR, Infrared, and Wide Field Camera datasets; (2) the SciFlo engine, a data flow engine that enables semantic, scientific data flow executions in a grid or clustered network computational environment; and (3) a PHP-based web application whose interface incorporates Web 2.0 / AJAX technologies. The search portion of the web application leverages the geodetic indexing and search capabilities that became available in the February 2010 release of PostGIS version 1.5. This presentation highlights the lessons learned in experimenting with various geospatial resolutions of CALIPSO’s LIDAR sensor ground track metadata. Details of the various spatial resolutions, spatial database schema designs, spatial indexing strategies, and performance results will be discussed. The focus will be on illustrating our findings on the spatial resolutions for ground track metadata that optimized search time and search accuracy in the CALIPSO Search and Subset Application. The CALIPSO satellite provides new insight into the role that clouds and atmospheric aerosols (airborne particles) play in regulating Earth's weather, climate, and air quality. CALIPSO combines an active LIDAR instrument with passive infrared and visible imagers to probe the vertical structure and properties of thin clouds and aerosols over the globe. The CALIPSO satellite was launched on April 28, 2006 and is part of the A-train satellite constellation. The ASDC in Langley’s Science Directorate leads NASA’s program for the processing, archival and
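
    The trade-off the record explores, between how densely a sensor ground track is stored and how fast and accurate a spatial search can be, can be illustrated without a database. The sketch below uses a synthetic track and a plain bounding-box filter (not CALIPSO data or the actual PostGIS schema): coarser resolutions store fewer points and scan faster, but return fewer candidate matches for a region.

    ```python
    # Sketch: trade-off between ground-track metadata resolution and search cost.
    # The track, box, and resolutions are synthetic examples, not CALIPSO data.

    def downsample(track, step):
        """Keep every `step`-th point of a (lon, lat) ground track."""
        return track[::step]

    def points_in_box(track, lon_min, lon_max, lat_min, lat_max):
        """Naive bounding-box search over the stored track points."""
        return [p for p in track
                if lon_min <= p[0] <= lon_max and lat_min <= p[1] <= lat_max]

    # Synthetic polar-orbit-like track: 1000 points sweeping latitude
    # with a slow longitude drift
    track = [(i * 0.01, -50 + i * 0.1) for i in range(1000)]

    for step in (1, 10, 100):
        coarse = downsample(track, step)
        hits = points_in_box(coarse, 0.0, 10.0, 0.0, 10.0)
        print(f"1-in-{step}: {len(coarse)} stored points, {len(hits)} hits in box")
    ```

    A production system would replace the linear scan with a geodetic index (as PostGIS 1.5's geography type does), but the resolution question is the same: too coarse and short box intersections are missed; too fine and storage and index traversal costs grow.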

  16. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels around the world. It is now widely recognized that keeping these established geospatial databases up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, there are two main approaches to geospatial database updating: direct updating with remote sensing images or field surveying materials, and indirect updating with other updated data, such as newly updated larger-scale data. The former is fundamental, because the update sources in both approaches ultimately derive from field surveying and remote sensing; the latter is often more economical and faster. Therefore, after a larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the multi-scale geospatial databases consistent. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating; this is recognized as one of the most promising updating methods, especially in a collaborative updating environment in which databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger-scale data in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, and a brief review of geospatial data updating based on digital map generalization is given. Based on the requirements analysis and review, we analyze the key factors for implementing updating geospatial data from large scale including technical
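
    A classic building block for deriving smaller-scale data from a larger-scale source is line simplification. The following sketch implements the well-known Douglas-Peucker algorithm on a hypothetical road polyline; it stands in for the map generalization step discussed above and is not the paper's specific method:

    ```python
    def perp_dist(p, a, b):
        """Perpendicular distance from point p to the line through a and b."""
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        if dx == dy == 0:
            return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
        # Twice the triangle area divided by the base length
        return abs(dx * (ay - py) - dy * (ax - px)) / (dx * dx + dy * dy) ** 0.5

    def douglas_peucker(points, tol):
        """Simplify a polyline, keeping points that deviate more than `tol`."""
        if len(points) < 3:
            return list(points)
        # Find the point farthest from the chord between the endpoints
        idx, dmax = 0, 0.0
        for i in range(1, len(points) - 1):
            d = perp_dist(points[i], points[0], points[-1])
            if d > dmax:
                idx, dmax = i, d
        if dmax <= tol:
            return [points[0], points[-1]]
        left = douglas_peucker(points[: idx + 1], tol)
        right = douglas_peucker(points[idx:], tol)
        return left[:-1] + right

    # A road captured at large scale, generalized for a smaller-scale map
    road = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6),
            (5, 7), (6, 8.1), (7, 9), (8, 9), (9, 9)]
    print(douglas_peucker(road, 1.0))
    ```

    Increasing the tolerance produces progressively coarser representations, which is exactly the kind of controlled reduction a smaller-scale database needs when it is refreshed from a newly updated larger-scale one.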

  17. Geospatial Information is the Cornerstone of Effective Hazards Response

    Science.gov (United States)

    Newell, Mark

    2008-01-01

    Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity; such events include earthquakes, floods, hurricanes, landslides, tsunamis, volcanoes, and wildfires. Man-made disasters are events caused by man, either intentionally or by accident, that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure the rapid coordination and availability of geospatial information for effective response by emergency responders and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate the disposition of USGS staff in support of DHS response center activities as requested by DHS. 
The GIRT

  18. FogGIS: Fog Computing for Geospatial Big Data Analytics

    OpenAIRE

    Barik, Rabindra K.; Dubey, Harishchandra; Samaddar, Arun B.; Gupta, Rajan D.; Ray, Prakash K.

    2016-01-01

    Cloud Geographic Information Systems (GIS) have emerged as a tool for the analysis, processing and transmission of geospatial data. Fog computing is a paradigm in which fog devices help to increase throughput and reduce latency at the edge, close to the client. This paper develops a fog-based framework named FogGIS for mining analytics from geospatial data. We built a prototype using the Intel Edison, an embedded microprocessor, and validated FogGIS through preliminary analysis, including compression,...
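
    Compression is one of the preliminary analyses the record mentions for reducing transmission volume between a fog device and the cloud. A minimal stdlib-only sketch (the payload below is hypothetical, not the paper's dataset) compresses a GeoJSON-like feature collection before upload:

    ```python
    import gzip
    import json

    # Hypothetical sensor payload: a GeoJSON-like feature collection
    features = [
        {"type": "Feature",
         "geometry": {"type": "Point", "coordinates": [85.8, 20.3]},
         "properties": {"station": i, "pm25": 40 + i % 5}}
        for i in range(200)
    ]
    payload = json.dumps(
        {"type": "FeatureCollection", "features": features}).encode()

    # Compress at the fog device before uploading to the cloud GIS
    compressed = gzip.compress(payload)
    ratio = len(compressed) / len(payload)
    print(f"{len(payload)} B -> {len(compressed)} B (ratio {ratio:.2f})")

    # The cloud side can recover the payload losslessly
    assert gzip.decompress(compressed) == payload
    ```

    Text-heavy geospatial formats such as GeoJSON typically compress well because of repeated keys, which is why moving compression to the edge can meaningfully reduce uplink traffic.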

  19. Domestic Disasters and Geospatial Technology for the Defense Logistics Agency

    Science.gov (United States)

    2014-12-01

    the public closer to geospatial information (Carpenter & Snell, 2013). Individuals create geospatial data at low cost and identify patterns within...is often customized to meet the needs of a specific government agency (Carpenter & Snell, 2013). Using cloud computing for a GIS has also...alternative solutions to data collection. Three trends seem to drive the acceptance of cloud computing in GIS (Carpenter & Snell, 2013). The first is

  20. Global Geospatial Information Management: un'iniziativa delle Nazioni Unite

    Directory of Open Access Journals (Sweden)

    Mauro Salvemini

    2010-03-01

    Full Text Available What is Global Geospatial Information Management? There is general agreement on the urgent need for an inter-governmental consultative mechanism that can play a leadership role in setting the agenda for the development of global geospatial information and promote its use to address key global challenges, and that can liaise and coordinate among Member States, and between Member States and international organizations.

  1. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin

    2010-03-01

    Geospatial information systems provide an abundance of information for researchers and scientists. Unfortunately this type of data can usually only be analyzed a few megapixels at a time, giving researchers a very narrow view into these voluminous data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented levels expediting analysis, interrogation, and discovery. ©2010 IEEE.
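
    Serving imagery at hundreds of megapixels typically means partitioning the extent into tiles that can be gathered and rendered in parallel. The sketch below computes such a tile grid; the mosaic dimensions and tile size are hypothetical, not taken from the DIGI-vis system:

    ```python
    import math

    def tile_grid(width, height, tile):
        """Split a width x height raster into (col, row, w, h) tiles."""
        cols = math.ceil(width / tile)
        rows = math.ceil(height / tile)
        return [(c, r,
                 min(tile, width - c * tile),   # edge tiles may be narrower
                 min(tile, height - r * tile))  # or shorter than `tile`
                for r in range(rows) for c in range(cols)]

    # A hypothetical 300-megapixel mosaic split into 4096-pixel tiles,
    # e.g. for distribution across a tiled display wall
    tiles = tile_grid(20000, 15000, 4096)
    print(len(tiles), "tiles; last edge tile size:", tiles[-1][2:])
    ```

    Each tile can then be fetched, decoded, and displayed by a separate node, which is the basic mechanism that lets a distributed viewer show far more pixels than any single machine could decode at once.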

  2. Chamber catalogues of optical and fluorescent signatures distinguish bioaerosol classes

    Science.gov (United States)

    Hernandez, Mark; Perring, Anne E.; McCabe, Kevin; Kok, Greg; Granger, Gary; Baumgardner, Darrel

    2016-07-01

    Rapid bioaerosol characterization has immediate applications in the military, environmental and public health sectors. Recent technological advances have facilitated single-particle detection of fluorescent aerosol in near real time; this leverages controlled ultraviolet exposures at single or multiple wavelengths, followed by characterization of the associated fluorescence. This type of ultraviolet-induced fluorescence has been used to detect airborne microorganisms and their fragments in laboratory studies, and it has been extended to field studies suggesting that bioaerosols compose a substantial fraction of supermicron atmospheric particles. To enhance the information yield that new-generation fluorescence instruments can provide, we report the compilation of a referential aerobiological catalogue including more than 50 pure cultures of common airborne bacteria, fungi and pollens, recovered at water activity equilibrium in a mesoscale chamber (1 m³). This catalogue juxtaposes intrinsic optical properties with select bandwidths of fluorescence emissions, which together clearly distinguish between major classes of airborne microbes and pollens.

  3. Clay club catalogue of characteristics of argillaceous rocks

    International Nuclear Information System (INIS)

    2005-01-01

    The OECD/NEA Working Group on the Characterisation, the Understanding and the Performance of Argillaceous Rocks as Repository Host Formations, namely the Clay Club, examines the various argillaceous rocks that are being considered for the deep geological disposal of radioactive waste, i.e. from plastic, soft, poorly indurated clays to brittle, hard mudstones or shales. The Clay Club considered it necessary and timely to provide a catalogue that gathers, in a structured way, the key geo-scientific characteristics of the various argillaceous formations that are - or were - studied in NEA member countries with regard to radioactive waste disposal. The present catalogue represents the outcome of this Clay Club initiative. (author)

  4. Economic Assessment of the Use Value of Geospatial Information

    Directory of Open Access Journals (Sweden)

    Richard Bernknopf

    2015-07-01

    Full Text Available Geospatial data inform decision makers. An economic model that involves the application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present-value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
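
    The VOI definition in the record, the difference between expected net benefits with and without the information, can be made concrete with a toy decision problem. All probabilities and payoffs below are hypothetical, and "with information" is modeled as perfect information about the state:

    ```python
    # Toy sketch of the value-of-information (VOI) calculation described above:
    # VOI = expected net benefit of the decision with the geospatial data,
    # minus the expected net benefit without it. All numbers are hypothetical.

    # Two states of the world (e.g. "contamination present" / "absent")
    p_state = {"present": 0.3, "absent": 0.7}

    # Net benefit of each action in each state (e.g. regulate vs. not)
    payoff = {
        ("regulate", "present"): 80, ("regulate", "absent"): -20,
        ("ignore", "present"): -100, ("ignore", "absent"): 0,
    }
    actions = ["regulate", "ignore"]

    def expected(action):
        """Expected net benefit of committing to one action."""
        return sum(p_state[s] * payoff[(action, s)] for s in p_state)

    # Without information: commit to the single best action in expectation
    without_info = max(expected(a) for a in actions)

    # With perfect geospatial information: pick the best action per state
    with_info = sum(p_state[s] * max(payoff[(a, s)] for a in actions)
                    for s in p_state)

    voi = with_info - without_info
    print(f"without: {without_info}, with: {with_info}, VOI: {voi}")
    ```

    With imperfect information the "with" term would be an expectation over noisy signals rather than over true states, but the subtraction that defines VOI is unchanged.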

  5. BIM AND GIS: WHEN PARAMETRIC MODELING MEETS GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2017-12-01

    Full Text Available Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software are able to perform advanced geospatial analyses, but they lack several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings have limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, notwithstanding research work is currently under development to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by “pure” GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  6. The Value of Information - Accounting for a New Geospatial Paradigm

    Science.gov (United States)

    Pearlman, J.; Coote, A. M.

    2014-12-01

    A new frontier in the consideration of socio-economic benefit is valuing information as an asset, often referred to as infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet for many of the largest corporations, such as Google and Facebook, information is clearly their principal asset. The problem is exacerbated in the public sector, as organizations that are information-centric rather than information-enabled are relatively few (statistics, archiving and mapping agencies are perhaps the only examples), so the issue is not at the top of the agenda for Government. It is, however, hugely important when valuing geospatial data. Geospatial data allow public institutions to operate and facilitate the provision of essential services for emergency response and national defense. In this respect, geospatial data are strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciate in the same way that physical infrastructure depreciates: they need to be maintained, otherwise their functionality and value in use decline. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.

  7. Integrating Free and Open Source Solutions into Geospatial Science Education

    Directory of Open Access Journals (Sweden)

    Vaclav Petras

    2015-06-01

    Full Text Available While free and open source software becomes increasingly important in geospatial research and industry, open science perspectives are generally less reflected in universities’ educational programs. We present an example of how free and open source software can be incorporated into geospatial education to promote open and reproducible science. Since 2008 graduate students at North Carolina State University have the opportunity to take a course on geospatial modeling and analysis that is taught with both proprietary and free and open source software. In this course, students perform geospatial tasks simultaneously in the proprietary package ArcGIS and the free and open source package GRASS GIS. By ensuring that students learn to distinguish between geospatial concepts and software specifics, students become more flexible and stronger spatial thinkers when choosing solutions for their independent work in the future. We also discuss ways to continually update and improve our publicly available teaching materials for reuse by teachers, self-learners and other members of the GIS community. Only when free and open source software is fully integrated into geospatial education, we will be able to encourage a culture of openness and, thus, enable greater reproducibility in research and development applications.

  8. BIM and GIS: When Parametric Modeling Meets Geospatial Data

    Science.gov (United States)

    Barazzetti, L.; Banfi, F.

    2017-12-01

    Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software are able to perform advanced geospatial analyses, but they lack several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings have limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, notwithstanding research work is currently under development to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by "pure" GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  9. Transport Infrastructure in the Process of Cataloguing Brownfields

    Science.gov (United States)

    Kramářová, Zuzana

    2017-10-01

    To begin with, the identification and follow-up revitalisation of brownfields is a pressing issue in territorial planning as well as in construction engineering. This phenomenon occurs not only in the Czech Republic and Europe; experts worldwide investigate it carefully. These issues may be divided into several areas: first, identifying and cataloguing individual territorial localities; next, the complex process of locality revitalisation. The legislative framework represents a separate area, and it is highly specific to individual countries in accordance with their existing laws, norms and regulations (it mainly concerns territorial planning and the segmentation of territory into appropriate administrative units). The legislative base of the Czech Republic was analysed in an article at WMCAUS in 2016. The identification and subsequent cataloguing of brownfields is worked out in the form of Regional Studies within the legislation of the Czech Republic. Due to the huge scale of issues to be tackled, their content is only loosely defined with regard to the Building Act and its implementing regulations, e.g. examining the layout of future construction in the area, locating architecturally or otherwise interesting objects, transport or technical infrastructure management, tourism, socially excluded localities etc. There is no legislative base and no common method for identifying and cataloguing brownfields; individual catalogue lists are therefore subject to the customer’s requirements. All the same, the relevant information contained in the database may always be examined. One part of it concerns transport infrastructure. That information may be divided into three subareas: information on the transport accessibility of the locality, information on the actual infrastructure in the locality, and information on the transport accessibility of human resources.

  10. Catalogue of gamma rays from radionuclides ordered by nuclide

    International Nuclear Information System (INIS)

    Ekstroem, L.P.; Andersson, P.; Sheppard, H.M.

    1984-01-01

    A catalogue of about 28500 gamma-ray energies from 2338 radionuclides is presented. The nuclides are listed in order of increasing (A,Z) of the daughter nuclide. In addition the gamma-ray intensity per 100 decays of the parent (if known) and the decay half-life are given. All data are from a computer processing of a recent ENSDF (Evaluated Nuclear Structure Data File) file. (authors)
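    The ordering rule described above, increasing (A, Z) of the daughter nuclide, is easy to sketch. The record layout, field names and sample values below are illustrative, not taken from the ENSDF-derived catalogue:

    ```python
    # Sketch: ordering gamma-ray records by increasing (A, Z) of the daughter
    # nuclide, then by energy within a nuclide. Values are illustrative.
    from dataclasses import dataclass

    @dataclass
    class GammaRecord:
        daughter_a: int      # mass number A of the daughter nuclide
        daughter_z: int      # atomic number Z of the daughter nuclide
        energy_kev: float    # gamma-ray energy in keV
        intensity: float     # intensity per 100 decays of the parent (if known)
        half_life_s: float   # decay half-life of the parent, in seconds

    records = [
        GammaRecord(60, 28, 1332.5, 99.98, 1.66e8),   # 60Co -> 60Ni
        GammaRecord(137, 56, 661.7, 85.1, 9.49e8),    # 137Cs -> 137Ba
        GammaRecord(60, 27, 58.6, 2.0, 6.1e5),        # hypothetical entry
    ]

    # Sort key: (A, Z) of the daughter first, then gamma-ray energy.
    records.sort(key=lambda r: (r.daughter_a, r.daughter_z, r.energy_kev))
    ```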

  11. IAEA Library Catalogue of Books 1968-1970

    International Nuclear Information System (INIS)

    1971-01-01

    This is the first cumulative volume of the IAEA Library's new acquisitions. It lists new material received during the period March 1968 - December 1970. The catalogue is divided into four major sections. The first contains the full bibliographic listing for each entry. It is arranged by broad subjects, and within each subject by the Universal Decimal Classification (UDC) number. Each entry was then assigned a consecutive item number. The other three sections contain the personal author, title and corporate entry indexes, respectively.

  12. English translations of German standards. Catalogue 1988. 24. ed.

    International Nuclear Information System (INIS)

    1988-01-01

    The catalogue contains a list of all currently available English translations of DIN standards, and of English translations of DIN handbooks, a numerical index, an alphabetical index, and an index of DIN EN, DIN IEC, DIN ISO standards, LN and VG standards. Some useful information on standards work in Germany and on the activities of DIN Deutsches Institut fuer Normung e.V. is given. (orig./HP)

  13. Geospatial Cyberinfrastructure and Geoprocessing Web—A Review of Commonalities and Differences of E-Science Approaches

    Directory of Open Access Journals (Sweden)

    Barbara Hofer

    2013-08-01

    Full Text Available Online geoprocessing gains momentum through increased online data repositories, web service infrastructures, online modeling capabilities and the required online computational resources. Advantages of online geoprocessing include reuse of data and services, extended collaboration possibilities among scientists, and efficiency thanks to distributed computing facilities. In the field of Geographic Information Science (GIScience), two recent approaches exist that have the goal of supporting science in online environments: the geospatial cyberinfrastructure and the geoprocessing web. Due to its historical development, the geospatial cyberinfrastructure has strengths related to the technologies required for data storage and processing. The geoprocessing web focuses on providing components for model development and sharing. These components shall allow expert users to develop, execute and document geoprocessing workflows in online environments. Despite this difference in the emphasis of the two approaches, the objectives, concepts and technologies they use overlap. This paper provides a review of the definitions and representative implementations of the two approaches. The provided overview clarifies which aspects of e-Science are highlighted in approaches differentiated in the geographic information domain. The discussion of the two approaches leads to the conclusion that synergies in research on e-Science environments shall be extended. Full-fledged e-Science environments will require the integration of approaches with different strengths.

  14. Diagnostic imaging and cataloguing of female genital malformations

    Directory of Open Access Journals (Sweden)

    Pedro Acién

    2016-08-01

    Full Text Available Abstract To help physicians and radiologists in the diagnosis of female genito-urinary malformations, especially of complex cases, the embryology of the female genital tract, the basis for Müllerian development anomalies, the current classifications for such anomalies and the comparison for inclusion and cataloguing of female genital malformations are briefly reviewed. The use of the embryological system to catalogue female genito-urinary malformations may ultimately be more useful in correlations with clinical presentations and in helping with the appropriate diagnosis and treatment. Diagnostic imaging of the different genito-urinary anomalies is presented, placing particular emphasis on the anomalies within group II of the embryological and clinical classification (distal mesonephric anomalies), all of them associated with unilateral renal agenesis or dysplasia. Similarly, emphasis is placed on cases of cervico-vaginal agenesis, cavitated noncommunicated uterine horns, and cloacal and urogenital sinus anomalies and malformative combinations, all of them complex malformations. Diagnostic imaging for all these anomalies is essential. The best imaging tools and when to evaluate for other anomalies are also analysed in this review. Teaching points • The appropriate cataloguing of female genital malformations is controversial. • An embryological classification system suggests the best diagnosis and appropriate management. • The anomalies most frequently diagnosed incorrectly are the distal mesonephric anomalies (DMAs. • DMAs are associated with unilateral renal agenesis or renal dysplasia with ectopic ureter. • We analyse other complex malformations. Diagnostic imaging for these anomalies is essential.

  15. A catalogue of crude oil and oil product properties, 1990

    International Nuclear Information System (INIS)

    Bobra, M.A.; Callaghan, S.

    1990-09-01

    This catalogue is a compilation of available data on crude oils and petroleum products. The emphasis of the catalogue is upon oils which could potentially impact Canada's environment. Other oils which are unlikely to be of direct Canadian concern are also included because they have been well characterized and used in relevant studies. The properties listed for each oil are those which will provide an indication of a spilled oil's environmental behaviour and effects. The properties on which data is provided include API gravity, density, viscosity, interfacial tension, pour point, flash point, vapor pressure, volatility and component distribution, emulsion formation tendency and stability, weathering, dispersability, major hydrocarbon groups, aqueous solubility, toxicity, sulfur content, fire point, and wax content. Most of the chemical-physical properties listed in this catalogue were measured using standard tests. For certain properties, data are given at different temperatures and for different degrees of oil weathering. An oil's degree of weathering is expressed as the volume or weight percent evaporated from the fresh oil. Weathered oils used for testing were artificially weathered by gas stripping following the method of Mackay and Stiver. 109 refs

  16. A catalogue of crude oil and oil product properties, 1992

    International Nuclear Information System (INIS)

    Whiticar, S.; Bobra, M.; Liuzzo, P.; Callaghan, S.; Fingas, M.; Jokuty, P.; Ackerman, F.; Cao, J.

    1993-02-01

    This catalogue is a compilation of available data on crude oils and petroleum products. The emphasis of the catalogue is upon oils which could potentially impact Canada's environment. Other oils which are unlikely to be of direct Canadian concern are also included because they have been well characterized and used in relevant studies. The properties listed for each oil are those which will provide an indication of a spilled oil's environmental behaviour and effects. The properties on which data is provided include API gravity, density, viscosity, interfacial tension, pour point, flash point, vapor pressure, volatility and component distribution, emulsion formation tendency and stability, weathering, dispersability, major hydrocarbon groups, aqueous solubility, toxicity, sulfur content, fire point, and wax content. Most of the chemical-physical properties listed in this catalogue were measured using standard tests. For certain properties, data are given at different temperatures and for different degrees of oil weathering. An oil's degree of weathering is expressed as the volume or weight percent evaporated from the fresh oil. Weathered oils used for testing were artificially weathered by gas stripping following the method of Mackay and Stiver. 140 refs

  17. Nuclear Knowledge Management Case Studies Catalogue “NKM CSC”

    International Nuclear Information System (INIS)

    Atieh, T.

    2016-01-01

    Full text: Over the past several years, many nuclear organizations in IAEA’s Member States have accumulated considerable experiences and achievements in the development and application of nuclear knowledge management (NKM) methodology and tools to improve their organizational performance. The IAEA NKM Section has initiated a project entitled “NKM Case Studies Catalogue (NKM CSC)” to capture, document and preserve NKM experience and facilitate its sharing among NKM practitioners and experts. This is done through collection and preservation of relevant experiential knowledge in “case study” format. The catalogue will therefore support community of practice mechanisms. An input template is currently under development and will be used to help contributors in Member States provide a concise set of information about their respective case studies. This information will be made searchable and easily retrievable through a platform that supports collaboration among NKM practitioners and experts. It is planned to launch the Nuclear Knowledge Management Case Studies Catalogue “NKM CSC” at the occasion of the “Third International Conference on Nuclear Knowledge Management—Challenges and Approaches, 7–11 November 2016, Vienna, Austria”, and to include the accepted case studies submitted to this Conference. (author)

  18. Fast Deployment on the Cloud of Integrated Postgres, API and a Jupyter Notebook for Geospatial Collaboration

    Science.gov (United States)

    Fatland, R.; Tan, A.; Arendt, A. A.

    2016-12-01

    We describe a Python-based implementation of a PostgreSQL database accessed through an Application Programming Interface (API) hosted on the Amazon Web Services public cloud. The data is geospatial and concerns hydrological model results in the glaciated catchment basins of southcentral and southeast Alaska. This implementation, however, is intended to be generalized to other forms of geophysical data, particularly data that is intended to be shared across a collaborative team or publicly. An example (moderate-size) dataset is provided together with the code base and a complete installation tutorial on GitHub. An enthusiastic scientist with some familiarity with software installation can replicate the example system in two hours. This installation includes database, API, a test Client and a supporting Jupyter Notebook, specifically oriented towards Python 3 and markup text to comprise an executable paper. The installation 'on the cloud' often engenders discussion and consideration of cloud cost and safety. By treating the process as somewhat "cookbook" we hope to first demonstrate the feasibility of the proposition. A discussion of cost and data security is provided in this presentation and in the accompanying tutorial/documentation. This geospatial data system case study is part of a larger effort at the University of Washington to enable research teams to take advantage of the public cloud to meet challenges in data management and analysis.
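    The database-behind-an-API pattern described above can be sketched in a few lines. In this sketch sqlite3 stands in for PostgreSQL, a plain handler function stands in for an HTTP endpoint, and the table layout, basin names and discharge values are invented for illustration, not taken from the Alaska dataset:

    ```python
    # Minimal sketch: geospatial model results in a relational store, exposed
    # through an API-style handler that returns JSON. sqlite3 stands in for
    # PostgreSQL; table and column names are hypothetical.
    import json
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE discharge (
        basin TEXT, day TEXT, cms REAL)""")  # modelled discharge, m^3/s
    conn.executemany("INSERT INTO discharge VALUES (?, ?, ?)", [
        ("wolverine", "2016-07-01", 11.2),
        ("wolverine", "2016-07-02", 13.9),
        ("gulkana",   "2016-07-01", 4.7),
    ])

    def get_discharge(basin: str) -> str:
        """API-style handler: return the time series for one basin as JSON."""
        rows = conn.execute(
            "SELECT day, cms FROM discharge WHERE basin = ? ORDER BY day",
            (basin,)).fetchall()
        return json.dumps({"basin": basin,
                           "series": [{"day": d, "cms": c} for d, c in rows]})
    ```

    A Jupyter notebook client would then call such an endpoint over HTTP and plot the returned series, which is the "executable paper" workflow the abstract describes.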

  19. A NoSQL–SQL Hybrid Organization and Management Approach for Real-Time Geospatial Data: A Case Study of Public Security Video Surveillance

    Directory of Open Access Journals (Sweden)

    Chen Wu

    2017-01-01

    Full Text Available With the widespread deployment of ground, air and space sensor sources (the internet of things or IoT, social networks, sensor networks), the integrated application of real-time geospatial data from ubiquitous sensors, especially in the public security and smart city domains, is becoming a challenging issue. The traditional geographic information system (GIS) mostly manages time-discretized geospatial data by means of a Structured Query Language (SQL) database management system (DBMS) and emphasizes query and retrieval of massive historical geospatial data on disk. This limits its capability for on-the-fly access of real-time geospatial data for online analysis in real time. This paper proposes a hybrid database organization and management approach with SQL relational databases (RDB) and not-only-SQL (NoSQL) databases (including a main memory database, MMDB, and a distributed file system, DFS). This hybrid approach makes full use of the advantages of NoSQL and SQL DBMSs for the real-time access of input data and structured on-the-fly analysis results, which can meet the requirements of increased spatio-temporal big data linking analysis. The MMDB facilitates real-time access of the latest input data, such as the sensor web and IoT, and supports real-time queries for online geospatial analysis. The RDB stores change information such as multi-modal features and abnormal events extracted from real-time input data. The DFS on disk manages the massive geospatial data, and the extensible storage architecture and distributed scheduling of a NoSQL database satisfy the performance requirements of incremental storage and multi-user concurrent access. A case study of geographic video (GeoVideo) surveillance of public security is presented to prove the feasibility of this hybrid organization and management approach.
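    The division of labour between the memory tier and the relational tier can be sketched as follows. A dict plays the MMDB holding only the latest observation per sensor, and sqlite3 plays the RDB receiving extracted change events; the sensor names, fields and the anomaly rule are illustrative assumptions, not the paper's implementation:

    ```python
    # Sketch of the hybrid organization: in-memory latest-state tier (MMDB)
    # plus a relational tier (RDB) for extracted change events.
    import sqlite3
    import time

    mmdb = {}  # sensor_id -> latest observation (real-time tier)

    rdb = sqlite3.connect(":memory:")
    rdb.execute("CREATE TABLE events (sensor_id TEXT, ts REAL, kind TEXT)")

    def ingest(sensor_id, lat, lon, speed):
        """Real-time path: overwrite the latest state in memory, and push a
        structured event to the RDB only when an anomaly is detected."""
        obs = {"lat": lat, "lon": lon, "speed": speed, "ts": time.time()}
        prev = mmdb.get(sensor_id)
        mmdb[sensor_id] = obs          # online queries read this tier
        if prev and abs(speed - prev["speed"]) > 30:   # crude anomaly rule
            rdb.execute("INSERT INTO events VALUES (?, ?, ?)",
                        (sensor_id, obs["ts"], "speed_jump"))

    ingest("cam-01", 30.52, 114.31, 40.0)
    ingest("cam-01", 30.52, 114.32, 85.0)   # triggers a speed_jump event
    ```

    Bulk archival of the raw streams to a distributed file system (the DFS tier) would sit alongside this, outside the sketch.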

  20. The Federal Geospatial Platform a shared infrastructure for publishing, discovering and exploiting public data and spatial applications.

    Science.gov (United States)

    Dabolt, T. O.

    2016-12-01

    The proliferation of open data and data services continues to thrive and is creating new challenges for how researchers, policy analysts and other decision makers can quickly discover and use relevant data. While traditional metadata catalog approaches used by applications such as data.gov prove to be useful starting points for data search, they can quickly frustrate end users who are seeking ways to quickly find and then use data in machine-to-machine environments. The Geospatial Platform is overcoming these obstacles and providing end users and application developers a richer, more productive user experience. The Geospatial Platform leverages a collection of open source and commercial technology hosted on Amazon Web Services, providing an ecosystem of services delivering trusted, consistent data in open formats to all users as well as a shared infrastructure for federal partners to serve their spatial data assets. It supports a diverse array of communities of practice ranging on topics from the 16 National Geospatial Data Assets Themes, to homeland security and climate adaptation. Come learn how you can contribute your data and leverage others or check it out on your own at https://www.geoplatform.gov/

  1. Web evolution and Web Science

    OpenAIRE

    Hall, Wendy; Tiropanis, Thanassis

    2012-01-01

    This paper examines the evolution of the World Wide Web as a network of networks and discusses the emergence of Web Science as an interdisciplinary area that can provide us with insights on how the Web developed, and how it has affected and is affected by society. Through its different stages of evolution, the Web has gradually changed from a technological network of documents to a network where documents, data, people and organisations are interlinked in various and often unexpected ways. It...

  2. Provenance metadata gathering and cataloguing of EFIT++ code execution

    International Nuclear Information System (INIS)

    Lupelli, I.; Muir, D.G.; Appel, L.; Akers, R.; Carr, M.; Abreu, P.

    2015-01-01

    Highlights: • An approach for automatic gathering of provenance metadata has been presented. • A provenance metadata catalogue has been created. • The overhead in the code runtime is less than 10%. • The metadata/data size ratio is about ∼20%. • A visualization interface based on Gephi, has been presented. - Abstract: Journal publications, as the final product of research activity, are the result of an extensive complex modeling and data analysis effort. It is of paramount importance, therefore, to capture the origins and derivation of the published data in order to achieve high levels of scientific reproducibility, transparency, internal and external data reuse and dissemination. The consequence of the modern research paradigm is that high performance computing and data management systems, together with metadata cataloguing, have become crucial elements within the nuclear fusion scientific data lifecycle. This paper describes an approach to the task of automatically gathering and cataloguing provenance metadata, currently under development and testing at Culham Center for Fusion Energy. The approach is being applied to a machine-agnostic code that calculates the axisymmetric equilibrium force balance in tokamaks, EFIT++, as a proof of principle test. The proposed approach avoids any code instrumentation or modification. It is based on the observation and monitoring of input preparation, workflow and code execution, system calls, log file data collection and interaction with the version control system. Pre-processing, post-processing, and data export and storage are monitored during the code runtime. Input data signals are captured using a data distribution platform called IDAM. The final objective of the catalogue is to create a complete description of the modeling activity, including user comments, and the relationship between data output, the main experimental database and the execution environment. For an intershot or post-pulse analysis (∼1000
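    The non-intrusive approach described above, observing input preparation, execution and outputs without instrumenting the code, might be sketched as a wrapper that records a provenance entry per run. The helper names, file names and the JSON-lines catalogue format are assumptions for illustration, not the EFIT++/IDAM implementation:

    ```python
    # Sketch: capture provenance by observing a code run from the outside
    # (no code modification): command line, input/output hashes, timestamps.
    import hashlib
    import json
    import subprocess
    import time

    def sha256(path):
        """Content hash so a catalogue entry pins exact file versions."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def run_with_provenance(cmd, inputs, outputs, catalogue="provenance.jsonl"):
        record = {"cmd": cmd,
                  "inputs": {p: sha256(p) for p in inputs},
                  "started": time.time()}
        result = subprocess.run(cmd, capture_output=True, text=True)
        record.update(finished=time.time(),
                      returncode=result.returncode,
                      outputs={p: sha256(p) for p in outputs})
        with open(catalogue, "a") as f:        # append-only catalogue
            f.write(json.dumps(record) + "\n")
        return record
    ```

    A real system would add user comments, version-control state and links to the experimental database, as the abstract describes.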

  3. Catalogue Creation for Space Situational Awareness with Optical Sensors

    Science.gov (United States)

    Hobson, T.; Clarkson, I.; Bessell, T.; Rutten, M.; Gordon, N.; Moretti, N.; Morreale, B.

    2016-09-01

    In order to safeguard the continued use of space-based technologies, effective monitoring and tracking of man-made resident space objects (RSOs) is paramount. The diverse characteristics, behaviours and trajectories of RSOs make space surveillance a challenging application of the discipline that is tracking and surveillance. When surveillance systems are faced with non-canonical scenarios, it is common for human operators to intervene while researchers adapt and extend traditional tracking techniques in search of a solution. A complementary strategy for improving the robustness of space surveillance systems is to place greater emphasis on the anticipation of uncertainty. Namely, give the system the intelligence necessary to autonomously react to unforeseen events and to intelligently and appropriately act on tenuous information rather than discard it. In this paper we build from our 2015 campaign and describe the progression of a low-cost intelligent space surveillance system capable of autonomously cataloguing and maintaining track of RSOs. It currently exploits robotic electro-optical sensors, high-fidelity state-estimation and propagation as well as constrained initial orbit determination (IOD) to intelligently and adaptively manage its sensors in order to maintain an accurate catalogue of RSOs. In a step towards fully autonomous cataloguing, the system has been tasked with maintaining surveillance of a portion of the geosynchronous (GEO) belt. Using a combination of survey and track-refinement modes, the system is capable of maintaining a track of known RSOs and initiating tracks on previously unknown objects. Uniquely, due to the use of high-fidelity representations of a target's state uncertainty, as few as two images of previously unknown RSOs may be used to subsequently initiate autonomous search and reacquisition. To achieve this capability, particularly within the congested environment of the GEO-belt, we use a constrained admissible region (CAR) to

  4. Web archives

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2018-01-01

    This article deals with general web archives and the principles for selection of materials to be preserved. It opens with a brief overview of reasons why general web archives are needed. Sections two and three present major long-term web archive initiatives, discuss the purposes and possible values of web archives, and ask how to meet unknown future needs, demands and concerns. Section four analyses three main principles in contemporary web archiving strategies (topic-centric, domain-centric and time-centric archiving) and section five discusses how to combine these to provide a broad and rich archive. Section six is concerned with inherent limitations and why web archives are always flawed. The last sections deal with the question of how web archives may fit into the rapidly expanding, but fragmented, landscape of digital repositories taking care of various parts

  5. Geospatial database for heritage building conservation

    Science.gov (United States)

    Basir, W. N. F. W. A.; Setan, H.; Majid, Z.; Chong, A.

    2014-02-01

    Heritage buildings are icons from the past that exist in present time. Through heritage architecture, we can learn about economic issues and social activities of the past. Nowadays, heritage buildings are under threat from natural disaster, uncertain weather, pollution and others. In order to preserve this heritage for the future generation, recording and documenting of heritage buildings are required. With the development of information system and data collection technique, it is possible to create a 3D digital model. This 3D information plays an important role in recording and documenting heritage buildings. 3D modeling and virtual reality techniques have demonstrated the ability to visualize the real world in 3D. It can provide a better platform for communication and understanding of heritage building. Combining 3D modelling with technology of Geographic Information System (GIS) will create a database that can make various analyses about spatial data in the form of a 3D model. Objectives of this research are to determine the reliability of Terrestrial Laser Scanning (TLS) technique for data acquisition of heritage building and to develop a geospatial database for heritage building conservation purposes. The result from data acquisition will become a guideline for 3D model development. This 3D model will be exported to the GIS format in order to develop a database for heritage building conservation. In this database, requirements for heritage building conservation process are included. Through this research, a proper database for storing and documenting of the heritage building conservation data will be developed.

  6. With Geospatial in Path of Smart City

    Science.gov (United States)

    Homainejad, A. S.

    2015-04-01

    With the growth of urbanisation, there is a requirement to use the leverage of the smart city in city management. The core of a smart city is Information and Communication Technologies (ICT), and one of its elements is smart transport, which includes sustainable transport and Intelligent Transport Systems (ITS). Cities, and especially megacities, are facing an urgent transport challenge in traffic management. Geospatial technology can provide reliable tools for monitoring and coordinating traffic. In this paper a method for monitoring and managing ongoing road traffic using aerial images and CCTV is addressed. In this method, the road network was initially extracted, geo-referenced and captured in a 3D model. The aim is to detect and geo-reference any vehicles on the road from images in order to assess the density and volume of vehicles on the roads. If a traffic jam is recognised from the images, an alternative route is suggested for easing the traffic jam. In a separate test, a road network was replicated in the computer and simulated traffic was implemented in order to assess traffic management during peak time using this method.

  7. Dynamic object-oriented geospatial modeling

    Directory of Open Access Journals (Sweden)

    Tomáš Richta

    2010-02-01

    Full Text Available Published literature about moving objects (MO) simplifies the problem to the representation and storage of moving points, moving lines, or moving regions. The main insufficiency of this approach is the lack of modeling of the MO's inner structure and dynamics – the autonomy of the moving agent. This paper describes the basics of an object-oriented geospatial methodology for modeling complex systems consisting of agents which move within a spatial environment. The main idea is that during the agent's movement, different kinds of connections with other moving or stationary objects are established or disposed of, based on the satisfaction or non-fulfilment of some spatial constraint. The methodology is constructed with regard to the following two main conditions: 1) the inner behavior of agents should be representable by any formalism, e.g. a Petri net, a finite state machine, etc., and 2) the spatial characteristics of the environment should be supplied by any information system that is able to store a defined set of spatial types and support a defined set of spatial operations. Finally, the methodology is demonstrated on a simple simulation model of a tram transportation system.
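    The two ingredients named in the abstract, an agent whose inner behaviour is a formalism such as a finite state machine, and a spatial environment that answers constraint queries, can be sketched in a toy form. The tram scenario, class names and the 50 m proximity rule are illustrative assumptions, not the paper's model:

    ```python
    # Toy sketch: a moving agent whose FSM state transitions are driven by a
    # spatial predicate (proximity to a stop) evaluated by the environment.
    import math

    class TramAgent:
        """Moving agent: inner state changes when a spatial constraint holds."""
        def __init__(self, pos):
            self.pos = pos            # (x, y) in some planar CRS
            self.state = "moving"     # FSM states: moving <-> boarding

        def step(self, new_pos, stops, radius=50.0):
            self.pos = new_pos
            near_stop = any(math.dist(new_pos, s) <= radius for s in stops)
            # FSM transition driven by the spatial constraint
            self.state = "boarding" if near_stop else "moving"

    stops = [(0.0, 0.0), (1000.0, 0.0)]
    tram = TramAgent((500.0, 0.0))
    tram.step((980.0, 10.0), stops)   # within 50 m of the second stop
    ```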

  8. Geospatial database for heritage building conservation

    International Nuclear Information System (INIS)

    Basir, W N F W A; Setan, H; Majid, Z; Chong, A

    2014-01-01

    Heritage buildings are icons from the past that exist in present time. Through heritage architecture, we can learn about economic issues and social activities of the past. Nowadays, heritage buildings are under threat from natural disaster, uncertain weather, pollution and others. In order to preserve this heritage for the future generation, recording and documenting of heritage buildings are required. With the development of information system and data collection technique, it is possible to create a 3D digital model. This 3D information plays an important role in recording and documenting heritage buildings. 3D modeling and virtual reality techniques have demonstrated the ability to visualize the real world in 3D. It can provide a better platform for communication and understanding of heritage building. Combining 3D modelling with technology of Geographic Information System (GIS) will create a database that can make various analyses about spatial data in the form of a 3D model. Objectives of this research are to determine the reliability of Terrestrial Laser Scanning (TLS) technique for data acquisition of heritage building and to develop a geospatial database for heritage building conservation purposes. The result from data acquisition will become a guideline for 3D model development. This 3D model will be exported to the GIS format in order to develop a database for heritage building conservation. In this database, requirements for heritage building conservation process are included. Through this research, a proper database for storing and documenting of the heritage building conservation data will be developed

  9. A PUBLIC PLATFORM FOR GEOSPATIAL DATA SHARING FOR DISASTER RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    S. Balbo

    2014-01-01

    This paper presents a case study scenario of setting up a Web platform based on GeoNode. It is a public platform called MASDAP, promoted by the Government of Malawi in order to support development of the country and build resilience against natural disasters. A substantial amount of geospatial data has already been collected about hydrogeological risk, as well as other disaster-related information. Moreover, this platform will help to ensure that the data created by a number of past or ongoing projects is maintained and that this information remains accessible and useful. An Integrated Flood Risk Management Plan for a river basin has already been included in the platform, and data from future disaster risk management projects will be added as well.

  10. An updated geospatial liquefaction model for global application

    Science.gov (United States)

    Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.

    2017-01-01

    We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and thus is the model we recommend for global implementation.
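    The way such a geospatial model turns proxy layers into a liquefaction probability can be sketched as a logistic regression over the recommended noncoastal predictors (PGV, VS30, water table depth, distance to the closest water body, precipitation). The coefficient values below are placeholders chosen for illustration, not the published model coefficients:

    ```python
    # Sketch: logistic geospatial liquefaction model. Coefficients are
    # hypothetical placeholders, NOT the values from the paper.
    import math

    def liquefaction_probability(pgv, vs30, wtd, dist_water, precip,
                                 coeffs=(8.8, 0.33, -1.9, -0.05, -0.01, 0.0005)):
        """P(liquefaction) = 1 / (1 + exp(-X)), with X a linear combination of
        ln(PGV) [cm/s], ln(VS30) [m/s], water table depth [m], distance to the
        closest water body [km] and precipitation [mm]."""
        b0, b_pgv, b_vs30, b_wtd, b_dw, b_pr = coeffs
        x = (b0 + b_pgv * math.log(pgv) + b_vs30 * math.log(vs30)
             + b_wtd * wtd + b_dw * dist_water + b_pr * precip)
        return 1.0 / (1.0 + math.exp(-x))
    ```

    With signs as above, soft wet sites (low VS30, shallow water table) come out more probable than stiff dry ones at the same shaking level, which is the qualitative behaviour a liquefaction proxy model should show.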

  11. GeoSpatial Workforce Development: enhancing the traditional learning environment in geospatial information technology

    Science.gov (United States)

    Lawhead, Pamela B.; Aten, Michelle L.

    2003-04-01

    The Center for GeoSpatial Workforce Development is embarking on a new era in education by developing a repository of dynamic online courseware authored by the foremost industry experts within the remote sensing and GIS industries. Virtual classrooms equipped with the most advanced instructions, computations, communications, course evaluation, and management facilities amplify these courses to enhance the learning environment and provide rapid feedback between instructors and students. The launch of this program included the objective development of the Model Curriculum by an independent consortium of remote sensing industry leaders. The Center's research and development focus on recruiting additional industry experts to develop the technical content of the courseware and then utilize state-of-the-art technology to enhance their material with visually stimulating animations, compelling audio clips and entertaining, interactive exercises intended to reach the broadest audience possible by targeting various learning styles. The courseware will be delivered via various media: Internet, CD-ROM, DVD, and compressed video, that translates into anywhere, anytime delivery of GeoSpatial Information Technology education.

  12. Tendencies in the application of the concept of catalogue marketing in Republic of Serbia and the world

    OpenAIRE

    Zelić Darko

    2010-01-01

    Catalogue marketing is one of the direct marketing channels. This concept implies making a lot of strategic and tactical decisions that determine a catalogue's market success. Catalogue sales are most developed in the USA (where the concept originated) and in Western Europe. In Serbia, catalogue marketing has been applied only in the last few years, since big foreign catalogue companies started their business in this region. Here, catalogue marketing is at a lower level of development than in the developed countries, an...

  13. A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs

    Science.gov (United States)

    Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav

    2015-01-01

    With the increasing abundance of technologies and smart devices, equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has now become effortless. This, in particular, is the case for photographs with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the “right” information for the appropriate purpose. This paper describes an approach to create semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and implementation for searching and querying the semantic geospatial metadata repository to enable a user or third party system to find the information they are looking for. PMID:26205265

  14. A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs

    Directory of Open Access Journals (Sweden)

    Andrew Ennis

    2015-07-01

    Full Text Available With the increasing abundance of technologies and smart devices, equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has now become effortless. This, in particular, is the case for photographs with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the “right” information for the appropriate purpose. This paper describes an approach to create semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and implementation for searching and querying the semantic geospatial metadata repository to enable a user or third party system to find the information they are looking for.
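
    The enrichment step described above can be sketched in miniature: attach a named region to a photograph's raw latitude/longitude via a point-in-polygon lookup. This is an illustrative sketch, not the paper's implementation; the region polygon, photo record, and field names are invented for the example.

```python
# Illustrative sketch (not the paper's code): enriching a photograph's raw
# lat/lon geo-metadata with a named region via point-in-polygon lookup.
# The region polygon, photo record, and field names are hypothetical.

def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: is (lon, lat) inside the polygon (list of (x, y))?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right.
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

def enrich_photo(photo, regions):
    """Attach the first matching region name as semantic metadata."""
    for name, polygon in regions.items():
        if point_in_polygon(photo["lon"], photo["lat"], polygon):
            return {**photo, "region": name}
    return {**photo, "region": None}

# Hypothetical region polygon (lon, lat pairs) and a geotagged photo.
regions = {"city_park": [(-6.28, 54.58), (-6.20, 54.58),
                         (-6.20, 54.62), (-6.28, 54.62)]}
photo = {"id": "IMG_0042", "lon": -6.25, "lat": 54.60}
print(enrich_photo(photo, regions)["region"])  # city_park
```

    In the paper's pipeline the polygons would come from external geospatial data sources rather than a hard-coded dictionary.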

  15. Geospatial Technologies to Improve Urban Energy Efficiency

    Directory of Open Access Journals (Sweden)

    Bharanidharan Hemachandran

    2011-07-01

    Full Text Available The HEAT (Home Energy Assessment Technologies) pilot project is a FREE Geoweb mapping service, designed to empower the urban energy efficiency movement by allowing residents to visualize the amount and location of waste heat leaving their homes and communities as easily as clicking on their house in Google Maps. HEAT incorporates Geospatial solutions for residential waste heat monitoring using Geographic Object-Based Image Analysis (GEOBIA) and Canadian-built Thermal Airborne Broadband Imager technology (TABI-320) to provide users with timely, in-depth, easy-to-use, location-specific waste-heat information, as well as opportunities to save money and reduce their greenhouse-gas emissions. We first report on the HEAT Phase I pilot project, which evaluates 368 residences in the Brentwood community of Calgary, Alberta, Canada, and describe the development and implementation of interactive waste heat maps, energy use models, a Hot Spot tool able to view the 6+ hottest locations on each home and a new HEAT Score for inter-city waste heat comparisons. We then describe current challenges, lessons learned and new solutions as we begin Phase II and scale from 368 to 300,000+ homes with the newly developed TABI-1800. Specifically, we introduce a new object-based mosaicing strategy, an adaptation of Emissivity Modulation to correct for emissivity differences, and a new Thermal Urban Road Normalization (TURN) technique to correct for scene-wide microclimatic variation. We also describe a new Carbon Score and opportunities to update city cadastral errors with automatically defined thermal house objects.

  16. Digital forestry maps representation using web mapping services

    Directory of Open Access Journals (Sweden)

    Martin Klimánek

    2008-01-01

    Full Text Available The Web Mapping Service (WMS) is a very useful means for presenting digital geospatial data in the Internet environment. A typical Open Source example of these services is the development environment MapServer, which was originally developed by the University of Minnesota ForNet project in cooperation with NASA and the Minnesota Department of Natural Resources. MapServer is not a full-featured Geographical Information System (GIS), but provides the core functionality to support a wide variety of web applications. A complex and open information system about forest (and cultural) land is presented in a real example of a MapServer application with data from the Mendel University Training Forest. MapServer has been used effectively to present data to the University Forest staff, students and the general public since October 2002. MapServer is also applied in the GIS and Remote Sensing education process and for sharing the geospatial data of the Faculty of Forestry and Wood Technology Departments.
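
    A WMS server such as MapServer answers GetMap requests built from a small set of standard query parameters. As a minimal sketch, the following builds such a request URL; the server endpoint and layer name are hypothetical, while the parameter names follow the OGC WMS 1.1.1 specification.

```python
# Illustrative sketch: building an OGC WMS 1.1.1 GetMap request URL of the
# kind a MapServer instance answers. The base URL and layer name below are
# hypothetical; the query parameters follow the WMS 1.1.1 specification.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, size, srs="EPSG:4326",
                   fmt="image/png"):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "STYLES": "",                            # default styling
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/cgi-bin/mapserv",
                     ["forest_stands"], (16.5, 49.2, 16.8, 49.4), (800, 600))
print(url)
```

    Fetching the resulting URL would return a rendered map image; the same pattern works against any WMS-compliant server.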

  17. Web 25

    DEFF Research Database (Denmark)

    Web 25: Histories from the First 25 Years of the World Wide Web celebrates the 25th anniversary of the Web. Since the beginning of the 1990s, the Web has played an important role in the development of the Internet as well as in the development of most societies at large, from its early grey...... and blue webpages introducing the hyperlink for a wider public, to today’s multifaceted uses of the Web as an integrated part of our daily lives. This is the first book to look back at 25 years of Web evolution, and it tells some of the histories about how the Web was born and has developed. It takes...... the reader on an exciting time travel journey to learn more about the prehistory of the hyperlink, the birth of the Web, the spread of the early Web, and the Web’s introduction to the general public in mainstream media. Furthermore, case studies of blogs, literature, and traditional media going online...

  18. Web Engineering

    Energy Technology Data Exchange (ETDEWEB)

    White, Bebo

    2003-06-23

    Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.

  19. Catalogue of earthquakes (≥ M 3.0) in peninsular India

    International Nuclear Information System (INIS)

    Guha, S.K.; Basu, P.C.

    1993-03-01

    This comprehensive catalogue contains the earthquake data down to magnitude 3.0 for peninsular India. It is a collective work of earthquake data from seismic networks. This compilation is useful for assessing earthquake parameters for the design of nuclear power plants and other critical structures. In view of the temporal nature of seismicity, the earthquake data around the Koyna reservoir are listed separately under Appendix I, and the database for the region from the seismic array station Gauribidanur is tabulated in Appendices IIA and IIB for the periods 1968-1975 and 1976 onwards. 50 refs., 2 ills

  20. Planck early results. VII. The Early Release Compact Source Catalogue

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Aghanim, N.; Arnaud, M.

    2011-01-01

    the different techniques, an implementation of the PowellSnakes source extraction technique was used at the five frequencies between 30 and 143 GHz while the SExtractor technique was used between 217 and 857GHz. The 10σ photometric flux density limit of the catalogue at |b| > 30 is 0.49, 1.0, 0.67, 0.5, 0.33, 0....... In addition, two early release catalogs that contain 915 cold molecular cloud core candidates and 189 SZ cluster candidates that have been generated using multifrequency algorithms are presented. The entire source list, with more than 15 000 unique sources, is ripe for follow-up characterisation with Herschel...

  1. An instrumental earthquake catalogue for northeastern Italy since 1900

    International Nuclear Information System (INIS)

    Margottini, C.; Martini, G.; Slejko, D.

    1991-01-01

    An earthquake catalogue of instrumental data for northeastern Italy since 1900 is presented. The different types of magnitude, which are the main parameters of the present study, have been evaluated so as to be as homogeneous as possible. Comparisons of the different magnitude values show linear dependence, at least in the medium magnitude range represented by the available data set. Correlations between the magnitude most significant for this region and chosen macroseismic data indicate a methodology for assessing the macroseismic magnitude of historical earthquakes which seems to be stable. (author)

  2. Profile catalogue for airfoil sections based on 3D computations

    DEFF Research Database (Denmark)

    Bertagnolio, F.; Sørensen, Niels N.; Johansen, Jeppe

    2006-01-01

    This report is a continuation of the Wind Turbine Airfoil Catalogue [1], whose objective was, firstly, to provide a database of aerodynamic characteristics for a wide range of airfoil profiles aimed at wind turbine applications, and secondly to test the two-dimensional Navier-Stokes solver EllipSys2D...... and the actual fluid flow, and thereby the incorrect prediction of airfoil characteristics. In addition, other features of the flow solver, such as transition and turbulence modelling, and their influence on the numerical results are investigated. Conclusions are drawn regarding the evaluation of airfoil...... aerodynamic characteristics, as well as the use of the Navier-Stokes solver for fluid flow calculations in general....

  3. Features, events and processes evaluation catalogue for argillaceous media

    International Nuclear Information System (INIS)

    Mazurek, M.; Pearson, F.J.; Volckaert, G.; Bock, H.

    2003-01-01

    The OECD/NEA Working Group on the Characterisation, the Understanding and the Performance of Argillaceous Rocks as Repository Host Formations for the disposal of radioactive waste (known as the 'Clay Club') launched a project called FEPCAT (Features, Events and Processes Catalogue for argillaceous media) in late 1998. The present report provides the results of work performed by an expert group to develop a FEPs database related to argillaceous formations, whether soft or indurated. It describes the methodology used for the work performed, provides a list of relevant FEPs and summarises the knowledge on each of them. It also provides general conclusions and identifies priorities for future work. (authors)

  4. Vienna International Centre Library Film and Video Catalogue: Peaceful applications of nuclear energy 1928-1998

    International Nuclear Information System (INIS)

    1998-01-01

    The catalogue lists films and videos which are available on free loan from Vienna International Centre Library for educational, non-commercial, non-profit showings involving no admission charges or appeals for funds. Much of the material listed has been donated to the IAEA by the Governments of Member States. The items are arranged in the catalogue by number. The catalogue also includes a title index and a subject index

  5. Geospatial information infrastructures to address spatial needs in health: Collaboration, challenges and opportunities

    OpenAIRE

    Granell Canut, Carlos; Belmonte Fernández, Óscar; Díaz Sánchez, Laura

    2013-01-01

    Most health-related issues such as public health outbreaks and epidemiological threats are better understood from a spatial–temporal perspective and clearly demand related geospatial datasets and services, so that decision makers may jointly make informed decisions and coordinate response plans. Although current health applications support some geospatial features, these are still disconnected from the wide range of geospatial services and datasets that geospatial information infrastruct...

  6. Progresses and Prospects in Geospatial Big Data for E-government

    OpenAIRE

    LIU Jiping; ZHANG Fuhao; XU Shenghua

    2017-01-01

    In recent years, geospatial big data have attracted great attention from industry, academia, research and government sectors, and even have triggered a lot of industry changes. Geospatial big data for E-government provide new means for government information management and decision making. This paper analyzes the concepts and characteristics of geospatial big data for E-government, mainly reviews the key technologies in geospatial big data for E-government, including data integration, storage...

  7. Don’t Make Me Type: A Study of Students’ Perceptions of Library Catalogues on Tablet Computers

    Directory of Open Access Journals (Sweden)

    Erik Gordon Christiansen

    2015-06-01

    Full Text Available The objective of this mixed methods pilot study was to ascertain university students’ perceptions of online library catalogues using tablet computers, to determine how the participants used tablets and whether or not the NEOS consortium catalogue (NEOS played an important role in the participants’ academic research. The researcher recruited four students from the University of Alberta who were each asked to use NEOS to complete a series of simple timed usability tasks on a tablet computer of their choosing. The participants also answered a variety of semi-structured interview questions regarding their tablet usage, internet browsing habits, device preferences, general impressions of NEOS, and whether they were receptive to the idea of a mobile NEOS application. Overall, the students found the functionality and design of NEOS to be adequate. Typing, authentication, and scrolling through lists presented consistent usability problems while on a tablet. Only one participant was receptive to the idea of a NEOS application, while the other three participants said tablets were not conducive to conducting academic research and that they preferred using a web interface on a laptop or desktop computer instead.

  8. Web Caching

    Indian Academy of Sciences (India)

    operating systems, computer networks, distributed systems, E-commerce and security. The World Wide Web has been growing in leaps and bounds. Studies have indicated that this massive distributed system can benefit greatly by making use of appropriate caching methods. Intelligent Web caching can lessen the burden ...

  9. Turning Music Catalogues into Archives of Musical Scores – or Vice Versa: Music Archives and Catalogues Based on MEI XML

    DEFF Research Database (Denmark)

    Geertinger, Axel Teich

    2014-01-01

    work in some presentation format (primarily PDF). Both types of collections are technically easy to build, but they have a number of limitations in terms of long-term preservation, data exchange and data re-use, and flexibility. A text-based data structure sophisticated enough to contain both detailed...... metadata and fully-featured scores may be a way of overcoming some of these limitations and at the same time include catalogue data in the score and vice versa. The Music Encoding Initiative (MEI) offers a framework for such an approach based on XML files. The article discusses pros and cons...

  10. Assessing the socioeconomic impact and value of open geospatial information

    Science.gov (United States)

    Pearlman, Francoise; Pearlman, Jay; Bernknopf, Richard; Coote, Andrew; Craglia, Massimo; Friedl, Lawrence; Gallo, Jason; Hertzfeld, Henry; Jolly, Claire; Macauley, Molly K.; Shapiro, Carl; Smart, Alan

    2016-03-10

    The production and accessibility of geospatial information including Earth observation is changing greatly both technically and in terms of human participation. Advances in technology have changed the way that geospatial data are produced and accessed, resulting in more efficient processes and greater accessibility than ever before. Improved technology has also created opportunities for increased participation in the gathering and interpretation of data through crowdsourcing and citizen science efforts. Increased accessibility has resulted in greater participation in the use of data as prices for Government-produced data have fallen and barriers to access have been reduced.

  11. Integration and magnitude homogenization of the Egyptian earthquake catalogue

    International Nuclear Information System (INIS)

    Hussein, H.M.; Abou Elenean, K.A.; Marzouk, I.A.; Abu El-Nader, E.; Peresan, A.; Korrat, I.M.; Panza, G.F.; El-Gabry, M.N.

    2008-03-01

    The aim of the present work is to compile and update a catalogue of the instrumentally recorded earthquakes in Egypt, with uniform and homogeneous source parameters as required for the analysis of seismicity and seismic hazard assessment. This in turn requires a detailed analysis and comparison of the properties of different available sources, including the distribution of events with time, the magnitude completeness and the scaling relations between different kinds of magnitude reported by different agencies. The observational data cover the time interval 1900-2004 and an area between 22--33.5 deg N and 25--36 deg E. The linear regressions between various magnitude types have been evaluated for different magnitude ranges. Using the best linear relationship determined for each available pair of magnitudes, as well as those identified between the magnitudes and the seismic moment, we convert the different magnitude types into moment magnitudes MW through a multi-step conversion process. Analysis of the catalogue completeness, based on the MW thus estimated, allows us to identify two different time intervals with homogeneous properties. The first one (1900-1984) appears to be complete for MW ≥ 4.5, while the second one (1985-2004) can be considered complete for magnitudes MW ≥ 3. (author)
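
    The multi-step conversion described above rests on ordinary least-squares fits between pairs of magnitude types. A minimal sketch of one such step (not the authors' code; the (mb, MW) pairs below are synthetic values made up for demonstration):

```python
# Illustrative sketch (not the catalogue's actual regression): fitting a
# linear relation between two magnitude types, then using it to convert a
# body-wave magnitude mb into a proxy moment magnitude MW. The sample
# pairs are synthetic, chosen to lie exactly on MW = 1.1*mb - 0.5.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Synthetic (mb, MW) pairs lying exactly on MW = 1.1*mb - 0.5.
mb = [3.0, 4.0, 5.0, 6.0]
mw = [1.1 * m - 0.5 for m in mb]
a, b = linear_fit(mb, mw)
print(round(a, 3), round(b, 3))  # 1.1 -0.5
print(round(a * 4.5 + b, 2))     # converted MW for mb = 4.5 -> 4.45
```

    In practice each magnitude pair gets its own fit over its valid magnitude range, and the fits are chained to reach MW.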

  12. The SHARE European Earthquake Catalogue (SHEEC) 1000-1899

    Science.gov (United States)

    Stucchi, M.; Rovida, A.; Gomez Capera, A. A.; Alexandre, P.; Camelbeeck, T.; Demircioglu, M. B.; Gasperini, P.; Kouskouna, V.; Musson, R. M. W.; Radulian, M.; Sesetyan, K.; Vilanova, S.; Baumont, D.; Bungum, H.; Fäh, D.; Lenhardt, W.; Makropoulos, K.; Martinez Solares, J. M.; Scotti, O.; Živčić, M.; Albini, P.; Batllo, J.; Papaioannou, C.; Tatevossian, R.; Locati, M.; Meletti, C.; Viganò, D.; Giardini, D.

    2013-04-01

    In the frame of the European Commission project "Seismic Hazard Harmonization in Europe" (SHARE), aiming at harmonizing seismic hazard at a European scale, the compilation of a homogeneous, European parametric earthquake catalogue was planned. The goal was to be achieved by considering the most updated historical dataset and assessing homogeneous magnitudes, with support from several institutions. This paper describes the SHARE European Earthquake Catalogue (SHEEC), which covers the time window 1000-1899. It strongly relies on the experience of the European Commission project "Network of Research Infrastructures for European Seismology" (NERIES), a module of which was dedicated to create the European "Archive of Historical Earthquake Data" (AHEAD) and to establish methodologies to homogeneously derive earthquake parameters from macroseismic data. AHEAD has supplied the final earthquake list, obtained after sorting out duplications and eliminating many fake events; in addition, it supplied the most updated historical dataset. Macroseismic data points (MDPs) provided by AHEAD have been processed with updated, repeatable procedures, regionally calibrated against a set of recent, instrumental earthquakes, to obtain earthquake parameters. From the same data, a set of epicentral intensity-to-magnitude relations has been derived, with the aim of providing another set of homogeneous Mw estimates. Then, a strategy focused on maximizing the homogeneity of the final epicentral location and Mw has been adopted. Special care has also been devoted to supplying location and Mw uncertainties. The paper focuses on the procedure adopted for the compilation of SHEEC and briefly comments on the achieved results.

  13. Ground Zero/Fresh Kills: Cataloguing Ruins, Garbage, and Memory

    Directory of Open Access Journals (Sweden)

    Cinzia Scarpino

    2011-09-01

    Full Text Available The aim of this paper is to show how the rise and fall of the Twin Towers can be read in relation to the rise and fall of the Staten Island Fresh Kills landfill, how their destinies were entwined from the start, and how the immediate cultural response to the collapse of the former and the closing of the latter took the form of catalogues of objects, words, and images. From this angle it will be possible to place the events within a larger, if somewhat unusual, cultural frame encompassing the history of two different yet complementary symbols of New York up to 2001 (the WTC and Fresh Kills). From Don DeLillo’s Underworld (1997) and Falling Man (2007) through Holman, Steve Zeitlin and Joe Dobkin’s Crisis (2001-2002); from Art Spiegelman’s In the Shadow of No Towers (2004) to Artists Respond’s 9-11 (2002); from the New York Times to Bearing Witness to History, the 2003-2006 retrospective of the Smithsonian Museum, relevant collective or individual responses to the 2001 attacks took the form of a catalogue, a list, a vertical or horizontal juxtaposition of data, objects, and memories, evoking a suggestive parallel to the organizing principle of past relics collected in museums and garbage stratified in sanitary landfills.

  14. HALOGEN: a tool for fast generation of mock halo catalogues

    Science.gov (United States)

    Avila, Santiago; Murray, Steven G.; Knebe, Alexander; Power, Chris; Robotham, Aaron S. G.; Garcia-Bellido, Juan

    2015-06-01

    We present a simple method of generating approximate synthetic halo catalogues: HALOGEN. This method uses a combination of second-order Lagrangian Perturbation Theory (2LPT) in order to generate the large-scale matter distribution, analytical mass functions to generate halo masses, and a single-parameter stochastic model for halo bias to position haloes. HALOGEN represents a simplification of similar recently published methods. Our method is constrained to recover the two-point function at intermediate scales (10 h-1 Mpc < r < 50 h-1 Mpc); we compare further statistics (including redshift-space distortions) with results from N-body simulations to determine the validity of our method for different purposes. One of the benefits of HALOGEN is its flexibility, and we demonstrate this by showing how it can be adapted to varying cosmologies and simulation specifications. A driving motivation for the development of such approximate schemes is the need to compute covariance matrices and study the systematic errors for large galaxy surveys, which requires thousands of simulated realizations. We discuss the applicability of our method in this context, and conclude that it is well suited to mass production of appropriate halo catalogues. The code is publicly available at https://github.com/savila/halogen.

  15. eGenomics: Cataloguing Our Complete Genome Collection III

    Directory of Open Access Journals (Sweden)

    Dawn Field

    2007-01-01

    Full Text Available This meeting report summarizes the proceedings of the “eGenomics: Cataloguing our Complete Genome Collection III” workshop held September 11–13, 2006, at the National Institute for Environmental eScience (NIEeS), Cambridge, United Kingdom. This 3rd workshop of the Genomic Standards Consortium was divided into two parts. The first half of the three-day workshop was dedicated to reviewing the genomic diversity of our current and future genome and metagenome collection, and exploring linkages to a series of existing projects through formal presentations. The second half was dedicated to strategic discussions. Outcomes of the workshop include a revised “Minimum Information about a Genome Sequence” (MIGS) specification (v1.1), consensus on a variety of features to be added to the Genome Catalogue (GCat), agreement by several researchers to adopt MIGS for imminent genome publications, and an agreement by the EBI and NCBI to input their genome collections into GCat for the purpose of quantifying the amount of optional data already available (e.g., for geographic location coordinates) and working towards a single, global list of all public genomes and metagenomes.

  16. Creating OGC Web Processing Service workflows using a web-based editor

    Science.gov (United States)

    de Jesus, J.; Walker, P.; Grant, M.

    2012-04-01

    The OGC WPS (Web Processing Service) specifies how geospatial algorithms may be accessed in an SOA (Service Oriented Architecture). Service providers can encode both simple and sophisticated algorithms as WPS processes and publish them as web services. These services are not only useful individually but may be built into complex processing chains (workflows) that can solve complex data analysis and/or scientific problems. The NETMAR project has extended the Web Processing Service (WPS) framework to provide transparent integration between it and the commonly used WSDL (Web Service Description Language) that describes the web services and its default SOAP (Simple Object Access Protocol) binding. The extensions allow WPS services to be orchestrated using commonly used tools (in this case Taverna Workbench, but BPEL based systems would also be an option). We have also developed a WebGUI service editor, based on HTML5 and the WireIt! Javascript API, that allows users to create these workflows using only a web browser. The editor is coded entirely in Javascript and performs all XSLT transformations needed to produce a Taverna compatible (T2FLOW) workflow description which can be exported and run on a local Taverna Workbench or uploaded to a web-based orchestration server and run there. Here we present the NETMAR WebGUI service chain editor and discuss the problems associated with the development of a WebGUI for scientific workflow editing; content transformation into the Taverna orchestration language (T2FLOW/SCUFL); final orchestration in the Taverna engine and how to deal with the large volumes of data being transferred between different WPS services (possibly running on different servers) during workflow orchestration. We will also demonstrate using the WebGUI for creating a simple workflow making use of published web processing services, showing how simple services may be chained together to produce outputs that would previously have required a GIS (Geographic
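
    A workflow editor of the kind described above ultimately submits Execute request documents to WPS servers. As a minimal sketch, the following composes such a document; the process and input identifiers are hypothetical, while the element and namespace names follow the OGC WPS 1.0.0 specification.

```python
# Illustrative sketch: composing a minimal OGC WPS 1.0.0 Execute request
# of the kind a workflow engine would POST to a server. The process id
# ("chl_front_detection") and input name ("threshold") are hypothetical.
import xml.etree.ElementTree as ET

WPS = "http://www.opengis.net/wps/1.0.0"
OWS = "http://www.opengis.net/ows/1.1"
ET.register_namespace("wps", WPS)
ET.register_namespace("ows", OWS)

def execute_request(process_id, literal_inputs):
    """Build a wps:Execute document with literal data inputs."""
    root = ET.Element(f"{{{WPS}}}Execute",
                      {"service": "WPS", "version": "1.0.0"})
    ET.SubElement(root, f"{{{OWS}}}Identifier").text = process_id
    inputs = ET.SubElement(root, f"{{{WPS}}}DataInputs")
    for name, value in literal_inputs.items():
        inp = ET.SubElement(inputs, f"{{{WPS}}}Input")
        ET.SubElement(inp, f"{{{OWS}}}Identifier").text = name
        data = ET.SubElement(inp, f"{{{WPS}}}Data")
        ET.SubElement(data, f"{{{WPS}}}LiteralData").text = str(value)
    return ET.tostring(root, encoding="unicode")

xml = execute_request("chl_front_detection", {"threshold": 0.05})
print(xml)
```

    Chaining such requests, so that one process's output feeds the next one's input, is precisely what the Taverna-based orchestration described in the abstract automates.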

  17. Evaluation of groundwater potential using geospatial techniques

    Science.gov (United States)

    Hussein, Abdul-Aziz; Govindu, Vanum; Nigusse, Amare Gebre Medhin

    2017-09-01

    The issue of unsustainable groundwater utilization is becoming an increasingly evident problem and a key concern for many developing countries. One of the problems is the absence of updated spatial information on the quantity and distribution of the groundwater resource. As in other developing countries, groundwater evaluation in Ethiopia has usually been conducted using field surveys, which are not feasible in terms of time and resources. This study was conducted in Northern Ethiopia, Wollo Zone, in the Gerado River Catchment district to spatially delineate the groundwater potential areas using geospatial and MCDA tools. To do so, eight major biophysical and environmental factors, namely geomorphology, lithology, slope, rainfall, land use land cover (LULC), soil, lineament density and drainage density, were considered. The sources of these data were satellite imagery, a digital elevation model (DEM), existing thematic maps and meteorological station data. A Landsat image was used in ERDAS Imagine to derive the LULC of the area, while the geomorphology, soil, and lithology of the area were identified and classified through field survey and digitized from existing maps using the ArcGIS software. The slope, lineament and drainage density of the area were derived from the DEM using spatial analysis tools. The rainfall surface map was generated using Thiessen polygon interpolation. Finally, after all these thematic maps were organized, the weight of each factor and its field values was computed using the IDRISI software. At last, all the factors were integrated and combined using a weighted overlay to map the potential groundwater areas. The findings depicted that the most potential groundwater areas are found in the central and eastern parts of the study area, while the northern and western parts of the Gerado River Catchment have poor potential of groundwater availability. This is mainly due to the cumulative effect of steep topographic and
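
    The weighted-overlay combination of reclassified factor maps described above can be sketched as follows. This is an illustrative sketch, not the study's model: the tiny 2x2 grids, class scores, and weights are invented for demonstration.

```python
# Illustrative sketch (not the study's model): weighted overlay of
# reclassified factor rasters to score groundwater potential per cell.
# Grids, suitability scores (1-5), and weights below are made up.

# Each factor raster already reclassified to suitability scores 1-5.
factors = {
    "slope":             [[5, 4], [2, 1]],
    "lineament_density": [[3, 5], [4, 2]],
    "drainage_density":  [[4, 3], [5, 1]],
}
# AHP-style weights; they must sum to 1.
weights = {"slope": 0.5, "lineament_density": 0.3, "drainage_density": 0.2}

rows = len(next(iter(factors.values())))
cols = len(next(iter(factors.values()))[0])
# Per-cell weighted sum across all factor rasters.
potential = [[round(sum(weights[f] * factors[f][r][c] for f in factors), 2)
              for c in range(cols)] for r in range(rows)]
print(potential)  # [[4.2, 4.1], [3.2, 1.3]]
```

    In a GIS this per-cell weighted sum is exactly what a weighted-overlay tool computes over full rasters, after which the scores are binned into potential classes (e.g. poor to very good).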

  18. SMART CITIES INTELLIGENCE SYSTEM (SMACiSYS) INTEGRATING SENSOR WEB WITH SPATIAL DATA INFRASTRUCTURES (SENSDI)

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2017-09-01

    Full Text Available The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is development of automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop automated integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after events evaluation in real time. Research work formalizes a notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geo-spatial data under popular usage platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructure to utilize sensor web, dynamically and in real time for smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms utilizing sensor web and spatial information achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by Open Geospatial Consortium (OGC) keeping entire platform open access and open source. SENSDI is based on Geonode, QGIS and Java, that bind most of the functionalities of Internet, sensor web and nowadays Internet of Things superseding Internet of Sensors as well. In a nutshell, the project delivers a generalized real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  19. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (sensdi)

    Science.gov (United States)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is development of automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop automated integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after events evaluation in real time. Research work formalizes a notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geo-spatial data under popular usage platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructure to utilize sensor web, dynamically and in real time for smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms utilizing sensor web and spatial information achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by Open Geospatial Consortium (OGC) keeping entire platform open access and open source. SENSDI is based on Geonode, QGIS and Java, that bind most of the functionalities of Internet, sensor web and nowadays Internet of Things superseding Internet of Sensors as well. In a nutshell, the project delivers a generalized real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  20. Sensor web

    Science.gov (United States)

    Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)

    2011-01-01

    A Sensor Web formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock, so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization of all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and have responded, and then listen only during time slots corresponding to those pods which responded.
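    The duty-cycling scheme this abstract describes (a cheap coarse clock sync followed by a fine sync, then pinging neighbors and listening only during the slots of pods that answered) can be sketched roughly as follows. All names, the offset model, and the 10 ms slot width are illustrative assumptions, not details from the patent.

    ```python
    # Illustrative sketch of the pod-synchronization and slot-listening scheme.
    # The clock is modelled as an offset (in ms) from the master clock.

    class SensorPod:
        SLOT_MS = 10.0  # assumed slot width, purely for illustration

        def __init__(self, pod_id, clock_offset_ms):
            self.pod_id = pod_id
            self.clock = clock_offset_ms   # offset from the master clock
            self.active_slots = set()      # slots of neighbors that answered a ping

        def coarse_sync(self, master_time):
            # cheap, low-power step: remove the offset to the nearest slot width,
            # leaving only a small residual error
            self.clock -= round(self.clock - master_time, -1)

        def fine_sync(self, master_time):
            # precise, higher-power step, run only after the coarse sync
            self.clock = master_time

        def responds(self):
            # a real pod would answer only if awake; always true in this sketch
            return True

        def ping_neighbours(self, neighbours):
            # record which pods answered; only their slots will be monitored
            self.active_slots = {p.pod_id for p in neighbours if p.responds()}

        def should_listen(self, time_slot):
            # listen only during slots of pods that responded to the ping
            return time_slot in self.active_slots

    pod = SensorPod(1, 7.3)
    pod.coarse_sync(0.0)   # residual offset now within half a slot
    pod.fine_sync(0.0)
    print(pod.clock)       # 0.0
    ```

    The two-stage sync mirrors the power argument in the abstract: most of the correction is done by the inexpensive coarse step, so the expensive fine step has little work left.
    
    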

  1. Survey of Available Web Services for Maritime Tracking

    OpenAIRE

    DIMITROVA TATYANA

    2011-01-01

    In this paper we present a review of available web-services capable of tracking maritime vessels and containers. Many of the reviewed services are provided directly by the cargo carriers (with 57 such international carriers being catalogued here). They are analysed in-depth with the following criteria being used for evaluation purposes: type of tracking and useful information. We also review a range of services provided by non-carrier organizations specialising in maritime data. Reviewed serv...

  2. Big Data analytics in the Geo-Spatial Domain

    NARCIS (Netherlands)

    R.A. Goncalves (Romulo); M.G. Ivanova (Milena); M.L. Kersten (Martin); H. Scholten; S. Zlatanova; F. Alvanaki (Foteini); P. Nourian (Pirouz); E. Dias

    2014-01-01

    Big data collections in many scientific domains have inherently rich spatial and geo-spatial features. Spatial location is among the core aspects of data in Earth observation sciences, astronomy, and seismology to name a few. The goal of our project is to design an efficient data

  3. A study on state of Geospatial courses in Indian Universities

    Science.gov (United States)

    Shekhar, S.

    2014-12-01

    Today the world is dominated by three technologies: nanotechnology, biotechnology and geospatial technology. This creates a huge demand for experts in each field, both for disseminating knowledge and for innovative research. The prime need, therefore, is to train the existing fraternity to gain progressive knowledge in these technologies and impart the same to the student community. Geospatial technology faces more peculiar problems than the other two because of its interdisciplinary, multi-disciplinary nature. It attracts students and mid-career professionals from various disciplines including Physics, Computer Science, Engineering, Geography, Geology, Agriculture, Forestry, Town Planning and so on; hence there is constant competition to grab and stabilize positions. Students of Master's degrees in geospatial science face two types of problem. The first is the lack of a unique identity in the academic field: they are neither exempted from the National Eligibility Test for lectureship nor given the opportunity to take that exam in geospatial science. The second is differential treatment by the industrial world: the students are either given low-grade jobs or poorly paid. Thus the future of this course in the universities, and its recognition in the academic and industrial worlds, is a serious issue. The universities should make this course more job oriented in consultation with the industries, and the industries should come forward to share their demands and requirements with the universities, so that the necessary changes in the curriculum can be made to meet industrial requirements.

  4. Global Geospatial Information Management: un'iniziativa delle Nazioni Unite

    Directory of Open Access Journals (Sweden)

    Mauro Salvemini

    2010-03-01

    role in setting the agenda for the development of global geospatial information and to promote its use to address key global challenges; to liaise and coordinate among Member States, and between Member States and international organizations.

  5. Challenges of Broadening Participation in the Geospatial Technology Workforce

    Science.gov (United States)

    DiBiase, D.

    2015-12-01

    In this presentation I'll describe the geospatial technology industry and its workforce needs, in relation to the geosciences. The talk will consider the special challenge of recruiting and retaining women and under-represented minorities in high tech firms like Esri. Finally, I'll discuss what my company is doing to help realize the benefits of a diverse workforce.

  6. Sextant: Visualizing time-evolving linked geospatial data

    NARCIS (Netherlands)

    C. Nikolaou (Charalampos); K. Dogani (Kallirroi); K. Bereta (Konstantina); G. Garbis (George); M. Karpathiotakis (Manos); K. Kyzirakos (Konstantinos); M. Koubarakis (Manolis)

    2015-01-01

    The linked open data cloud is constantly evolving as datasets get continuously updated with newer versions. As a result, representing, querying, and visualizing the temporal dimension of linked data is crucial. This is especially important for geospatial datasets that form the backbone

  7. Geospatial Technology In Environmental Impact Assessments – Retrospective.

    Directory of Open Access Journals (Sweden)

    Goparaju Laxmi

    2015-10-01

    Full Text Available Environmental Impact Assessments are studies conducted to give us an insight into the various impacts caused by an upcoming industry or any developmental activity. It should address various social, economic and environmental issues ensuring that negative impacts are mitigated. In this context, geospatial technology has been used widely in recent times.

  8. Crisp Clustering Algorithm for 3D Geospatial Vector Data Quantization

    DEFF Research Database (Denmark)

    Azri, Suhaibah; Anton, François; Ujang, Uznir

    2015-01-01

    In the next few years, 3D data is expected to be an intrinsic part of geospatial data. However, issues on 3D spatial data management are still in the research stage. One of the issues is performance deterioration during 3D data retrieval. Thus, a practical 3D index structure is required for effic...

  9. Shared Geospatial Metadata Repository for Ontario University Libraries: Collaborative Approaches

    Science.gov (United States)

    Forward, Erin; Leahey, Amber; Trimble, Leanne

    2015-01-01

    Successfully providing access to special collections of digital geospatial data in academic libraries relies upon complete and accurate metadata. Creating and maintaining metadata using specialized standards is a formidable challenge for libraries. The Ontario Council of University Libraries' Scholars GeoPortal project, which created a shared…

  10. What Lives Where & Why? Understanding Biodiversity through Geospatial Exploration

    Science.gov (United States)

    Trautmann, Nancy M.; Makinster, James G.; Batek, Michael

    2013-01-01

    Using an interactive map-based PDF, students learn key concepts related to biodiversity while developing data-analysis and critical-thinking skills. The Bird Island lesson provides students with experience in translating geospatial data into bar graphs, then interpreting these graphs to compare biodiversity across ecoregions on a fictional island.…

  11. application of geospatial tools for landslide hazard assessment

    African Journals Online (AJOL)

    immax

    incorporate the use of geospatial tools in Uganda's disaster management strategies. ... technologies have proved to be useful landslide assessment tools primarily because they combine mapping, field ... Guzzetti (2002) observes that landslides are localized “point” events controlled by the intensity, duration ...

  12. Geospatial Data Repository. Sharing Data Across the Organization and Beyond

    National Research Council Canada - National Science Library

    Ruiz, Marilyn

    2001-01-01

    .... This short Technical Note discusses a five-part approach to creating a data repository that addresses the problems of the historical organizational framework for geospatial data. Fort Hood, Texas was the site used to develop the prototype. A report documenting the complete study will be available in late Spring 2001.

  13. 75 FR 10309 - Announcement of National Geospatial Advisory Committee Meeting

    Science.gov (United States)

    2010-03-05

    ... Geospatial Advisory Committee (NGAC) will meet on March 24-25, 2010 at the One Washington Circle Hotel, 1 Washington Circle, NW., Washington, DC 20037. The meeting will be held in the Meridian Room. The NGAC, which... to the public, seating may be limited due to room capacity. DATES: The meeting will be held from 8:30...

  14. Geospatial Analysis of Renewable Energy Technical Potential on Tribal Lands

    Energy Technology Data Exchange (ETDEWEB)

    Doris, E.; Lopez, A.; Beckley, D.

    2013-02-01

    This technical report uses an established geospatial methodology to estimate the technical potential for renewable energy on tribal lands, for the purpose of allowing Tribes to prioritize the development of renewable energy resources either for community-scale use on tribal land or for revenue-generating electricity sales.

  15. Preparing Preservice Teachers to Incorporate Geospatial Technologies in Geography Teaching

    Science.gov (United States)

    Harte, Wendy

    2017-01-01

    This study evaluated the efficacy of geospatial technology (GT) learning experiences in two geography curriculum courses to determine their effectiveness for developing preservice teacher confidence and preparing preservice teachers to incorporate GT in their teaching practices. Surveys were used to collect data from preservice teachers at three…

  16. A Research Agenda for Geospatial Technologies and Learning

    Science.gov (United States)

    Baker, Tom R.; Battersby, Sarah; Bednarz, Sarah W.; Bodzin, Alec M.; Kolvoord, Bob; Moore, Steven; Sinton, Diana; Uttal, David

    2015-01-01

    Knowledge around geospatial technologies and learning remains sparse, inconsistent, and overly anecdotal. Studies are needed that are better structured; more systematic and replicable; attentive to progress and findings in the cognate fields of science, technology, engineering, and math education; and coordinated for multidisciplinary approaches.…

  17. Office of Biological Informatics and Outreach geospatial technology activities

    Science.gov (United States)

    ,

    1998-01-01

    The U.S. Geological Survey (USGS) Office of Biological Informatics and Outreach (OBIO) in Reston, Virginia, and its Center for Biological Informatics (CBI) in Denver, Colorado, provide leadership in the development and use of geospatial technologies to advance the Nation's biological science activities.

  18. Persistent Teaching Practices after Geospatial Technology Professional Development

    Science.gov (United States)

    Rubino-Hare, Lori A.; Whitworth, Brooke A.; Bloom, Nena E.; Claesgens, Jennifer M.; Fredrickson, Kristi M.; Sample, James C.

    2016-01-01

    This case study described teachers with varying technology skills who were implementing the use of geospatial technology (GST) within project-based instruction (PBI) at varying grade levels and contexts 1 to 2 years following professional development. The sample consisted of 10 fifth- to ninth-grade teachers. Data sources included artifacts,…

  19. Detection of Climate Trends Over Ethiopia Using Geospatial ...

    African Journals Online (AJOL)

    Raster climate data including maximum temperature for warm season (April- June), cold season (October- January) and rainfall for the rainy season (June- September) for the years 1946 and 2006 were extracted from Climate Research Unit (CRU) Geospatial Raster Data Portal for Ethiopia. The change detection is ...

  20. International Atomic Energy Agency. Publications Catalogue 2011/12 - full details of publications published 2010-2012, forthcoming publications and a stocklist of publications published in 2008-2011

    International Nuclear Information System (INIS)

    2011-06-01

    This publications catalogue lists all sales publications of the IAEA published in 2010 and 2011 and those forthcoming in 2011/12. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  1. International Atomic Energy Agency publications. Publications catalogue 2007 including full details of publications published in 2005-2007 and forthcoming and a stocklist of publications published in 2003-2004

    International Nuclear Information System (INIS)

    2007-01-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2005, 2006 and 2007 and forthcoming. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  2. International Atomic Energy Agency publications. Publications catalogue 2006 including full details of publications published in 2004-2005 and forthcoming in 2006 and a stocklist of publications published in 2002-2003

    International Nuclear Information System (INIS)

    2006-03-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2004, 2005 and forthcoming in 2006. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  3. International Atomic Energy Agency publications. Publications catalogue 2005 including full details of publications published in 2003-2004 and forthcoming in 2005 and a stocklist of publications published in 2001-2002

    International Nuclear Information System (INIS)

    2005-03-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2003, 2004 and forthcoming in 2005. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  4. International Atomic Energy Agency. Publications catalogue 2009 including full details of publications published in 2008-2009, forthcoming publications and a stocklist of publications published in 2006-2007

    International Nuclear Information System (INIS)

    2009-06-01

    This Publications Catalogue lists all sales publications of the IAEA published in 2008 and 2009 and forthcoming in 2009. Most IAEA publications are issued in English, some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books

  5. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  6. Geospatial Multi-Agency Coordination (GeoMAC) wildland fire perimeters, 2008

    Science.gov (United States)

    Walters, Sandra P.; Schneider, Norma J.; Guthrie, John D.

    2011-01-01

    The Geospatial Multi-Agency Coordination (GeoMAC) has been collecting and storing data on wildland fire perimeters since August 2000. The dataset presented via this U.S. Geological Survey Data Series product contains the GeoMAC wildland fire perimeter data for the calendar year 2008, which are based upon input from incident intelligence sources, Global Positioning System (GPS) data, and infrared (IR) imagery. Wildland fire perimeter data are obtained from the incidents, evaluated for completeness and accuracy, and processed to reflect consistent field names and attributes. After a quality check, the perimeters are loaded to GeoMAC databases, which support the GeoMAC Web application for access by wildland fire managers and the public. The wildland fire perimeters are viewed through the Web application. The data are subsequently archived according to year and state and are made available for downloading through the Internet in shapefile and Keyhole Markup Language (KML) format. These wildland fire perimeter data are also retained for historical, planning, and research purposes. The datasets that pertain to this report can be found on the Rocky Mountain Geographic Science Center HTTP site at http://rmgsc.cr.usgs.gov/outgoing/GeoMAC/historic_fire_data/. The links are also provided on the sidebar.
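    As a rough illustration of consuming the KML downloads this record describes, the sketch below extracts fire-perimeter names from a KML document using only the standard library. The sample document and fire names are invented; real archive files are the year/state downloads under the historic_fire_data URL in the record.

    ```python
    # Illustrative sketch: pull perimeter names out of a GeoMAC-style KML file.
    import xml.etree.ElementTree as ET

    # KML 2.2 default namespace, needed for element lookups
    KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

    def perimeter_names(kml_text):
        """Return the <name> text of every Placemark in a KML document."""
        root = ET.fromstring(kml_text)
        return [el.text for el in root.iterfind(".//kml:Placemark/kml:name", KML_NS)]

    # Made-up sample document standing in for a downloaded archive file
    sample = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Document>
        <Placemark><name>EXAMPLE FIRE A</name></Placemark>
        <Placemark><name>EXAMPLE FIRE B</name></Placemark>
      </Document>
    </kml>"""

    print(perimeter_names(sample))  # ['EXAMPLE FIRE A', 'EXAMPLE FIRE B']
    ```

    The same namespaced lookup pattern extends to coordinates or attribute fields; shapefile downloads would instead need a third-party reader, since the standard library has no shapefile support.
    
    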

  7. GRANAT/WATCH catalogue of cosmic gamma-ray bursts: December 1989 to September 1994

    DEFF Research Database (Denmark)

    Sazonov, S.Y.; Sunyaev, R.A.; Terekhov, O.V.

    1998-01-01

    We present the catalogue of gamma-ray bursts (GRB) observed with the WATCH all-sky monitor on board the GRANAT satellite during the period December 1989 to September 1994. The cosmic origin of 95 bursts comprising the catalogue is confirmed either by their localization with WATCH or by their dete...

  8. Planck 2013 results. XXIX. Planck catalogue of Sunyaev-Zeldovich sources

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.

    2013-01-01

    We describe the all-sky Planck catalogue of clusters and cluster candidates derived from Sunyaev-Zeldovich (SZ) effect detections using the first 15.5 months of Planck satellite observations. The catalogue contains 1227 entries, making it over six times the size of the Planck Early SZ (ESZ) sampl...

  9. A quantitative study of history in the English Short-Title Catalogue (ESTC), 1470-1800

    NARCIS (Netherlands)

    Lahti, Leo; Ilomäki, Niko; Tolonen, Mikko

    2015-01-01

    This article analyses publication trends in the field of history in early modern Britain and North America in 1470-1800, based on English Short-Title Catalogue (ESTC) data. Its major contribution is to demonstrate the potential of digitized library catalogues as an essential

  10. The Tycho-2 Catalogue of the 2.5 Million Brightest Stars

    Science.gov (United States)

    2000-01-01

    The Tycho-2 Catalogue supersedes in size and quality the Tycho-1 Catalogue itself with respect to photometry and astrometry of single and double stars. It also

  11. Planck 2015 results. XXVIII. The Planck Catalogue of Galactic Cold Clumps

    CERN Document Server

    Ade, P.A.R.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Catalano, A.; Chamballu, A.; Chiang, H.C.; Christensen, P.R.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T.A.; Eriksen, H.K.; Falgarone, E.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.L.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, A.H.; Jaffe, T.R.; Jones, W.C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T.S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C.R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P.B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P.M.; Macías-Pérez, J.F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Marshall, D.J.; Martin, P.G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Mazzotta, P.; McGehee, P.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J.A.; 
Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Nørgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T.J.; Pelkonen, V.-M.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Pratt, G.W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J.P.; Reach, W.T.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J.A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Stolyarov, V.; Sudiwala, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J.A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L.A.; Wandelt, B.D.; Wehus, I.K.; Yvon, D.; Zacchei, A.

    2016-01-01

    We present the Planck Catalogue of Galactic Cold Clumps (PGCC), an all-sky catalogue of Galactic cold clump candidates detected by Planck. This catalogue is the full version of the Early Cold Core (ECC) catalogue, which was made available in 2011 with the Early Release Compact Source Catalogue (ERCSC) and contained 915 high-S/N sources. It is based on the Planck 48-month mission data that are currently being released to the astronomical community. The PGCC catalogue is an observational catalogue consisting exclusively of Galactic cold sources. The three highest Planck bands (857, 545, 353 GHz) have been combined with IRAS data at 3 THz to perform a multi-frequency detection of sources colder than their local environment. After rejection of possible extragalactic contaminants, the PGCC catalogue contains 13188 Galactic sources spread across the whole sky, i.e., from the Galactic plane to high latitudes, following the spatial distribution of the main molecular cloud complexes. The median temperature of PGCC so...

  12. A new version of the European tsunami catalogue: updating and revision

    Directory of Open Access Journals (Sweden)

    S. Tinti

    2001-01-01

    Full Text Available A new version of the European catalogue of tsunamis is presented here. It differs in some important aspects from the latest release of the catalogue, which was produced in 1998 and is known as the GITEC tsunami catalogue. In the first place, it is a database built on the Visual FoxPro 6.0 DBMS that can be used and maintained under the PC operating systems currently available; conversely, the GITEC catalogue was compatible only with Windows 95 and older PC platforms. In the second place, it is enriched by new facilities and a new type of data, such as a database of pictures that can be accessed easily from the main screen of the catalogue. Thirdly, it has been updated by including newly published references. A minute and painstaking search for new data has been undertaken to re-evaluate cases that were not included in the GITEC catalogue although they were mentioned in previous catalogues, their exclusion having been motivated by a lack of data. This last work has focused so far on Italian cases of the last two centuries. The result is that at least two events have been found which deserve inclusion in the new catalogue: one occurred in 1809 in the Gulf of La Spezia, and the other occurred in 1940 in the Gulf of Palermo. Two further events are presently under investigation.

  13. Exploring best cataloguing rules in the 21st century: Changes from ...

    African Journals Online (AJOL)

    In this digital era, the need to embrace change is inevitable. The authors described fundamental changes that were necessary to move cataloguing practice to the next level. Some of these changes include but not limited to: cataloguing working tools, changes in information resources, vocabulary, main entry points and ...

  14. EURISCO: The European search catalogue for plant genetic resources.

    Science.gov (United States)

    Weise, Stephan; Oppermann, Markus; Maggioni, Lorenzo; van Hintum, Theo; Knüpffer, Helmut

    2017-01-04

    The European Search Catalogue for Plant Genetic Resources, EURISCO, provides information about 1.8 million crop plant accessions preserved by almost 400 institutes in Europe and beyond. EURISCO is being maintained on behalf of the European Cooperative Programme for Plant Genetic Resources. It is based on a network of National Inventories of 43 member countries and represents an important effort for the preservation of the world's agrobiological diversity by providing information about the large genetic diversity kept by the collaborating collections. Moreover, EURISCO also assists its member countries in fulfilling legal obligations and commitments, e.g. with respect to the International Treaty on Plant Genetic Resources, the Second Global Plan of Action for Plant Genetic Resources for Food and Agriculture of the United Nations' Food and Agriculture Organization, or the Convention on Biological Diversity. EURISCO is accessible at http://eurisco.ecpgr.org. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. International Atomic Energy Agency Publications. Catalogue 1980-1995

    International Nuclear Information System (INIS)

    1996-08-01

    This catalogue lists all sales publications of the International Atomic Energy Agency issued from 1980 up to the end of 1995 and still available. Some earlier titles which form part of an established series or are still considered of importance have been included. Most Agency publications are issued in English, though some are also available in Chinese, French, Russian or Spanish. This is noted as C for Chinese, E for English, F for French, R for Russian and S for Spanish by the relevant ISBN number. Proceedings of conferences, symposia, seminars and panels of experts contain papers in their original language (English, French, Russian or Spanish) with abstracts in English and in the original language.

  16. Gamma radiography of defects in concrete. Catalogue of reference exposures

    International Nuclear Information System (INIS)

    1974-01-01

    A catalogue of reference exposures, serving as a basic document for the interpretation of gamma-radiographs of reinforced and prestressed concrete structures, is presented. The radiation sources are Iridium-192, Caesium-137 and Cobalt-60. Photographic films are used as radiation detectors, combined with intensifying screens and filters of lead, copper and iron. The concrete specimens were designed and made for the purpose of studying, as a function of the concrete thickness, the possibility of detecting certain characteristic inclusions or defects. Each set of standard specimens consists of seven standard blocks with the dimensions 15 x 15 x 50 cm. The thicknesses of the specimens range from 15 to 75 cm (1 to 5 blocks).

  17. Performance Evaluation of INACT - INDECT Advanced Image Cataloguing Tool

    Directory of Open Access Journals (Sweden)

    Libor Michalek

    2012-01-01

    Full Text Available In this article, we describe the performance evaluation of the INACT tool, which is developed for cataloguing high-level and low-level metadata of evidence material. The INACT tool can be used by police forces in the prosecution of crimes such as possession and distribution of child pornography (CP). In live forensic cases, the time to first hit (the time when the first image containing, e.g., CP is found) is important, as further legal actions are then justified (such as the arrest of the suspect and seizure of his hardware). The performance evaluation of first hit was performed on real data with the cooperation of the Czech Police, Department of Internet Crime.

  18. Hipparcos to deliver its final results catalogue soon

    Science.gov (United States)

    1995-10-01

    them, almost 30 years ago, to propose carrying out these observations from the relatively benign environment of space. Hipparcos is, by present standards, a medium-sized satellite, with a 30 cm telescope sensing simply ordinary light. But it has been described as the most imaginative in the short history of space astronomy. This foresight has been amply repaid. In the long history of stargazing it ranks with the surveys by Hipparchus the Greek in the 2nd Century BC and by Tichy Brahe the Dane in the 16th Century AD, both of which transformed human perceptions of the Universe. Positions derived from the Hipparcos satellite are better than a millionth of a degree, and newly a thousand times more accurate than star positions routinely determined from he ground. This accuracy makes it possible to measure directly the distances to the stars. While it took 250 years between astronomers first setting out on the exacting task of measuring the distance to a star, and a stellar distance being measured for the first time, ESA's Hipparcos mission has revolutionised this long, painstaking, and fundamental task by measuring accurate distances and movements of more than one hundred thousand. The measurement concept involved he satellite triangulating its way between he stars all wound the sky, building up a celestial map in much the same way as land surveyors use triangulation between hill-tops to measure distances accurately. Only the angles involved are much smaller : the accuracy that has been achieved with the Hipparcos Catalogue is such that he two edges of a coin, viewed from he other side of the Atlantic Ocean, could be distinguished. The results from Hipparcos will deliver scientists with long-awaited details of our place in he Milky Way Galaxy. 
Most of the stars visible to the naked eye are, to a large extent, companions of the Sun, in a great orbital march around the centre of the Galaxy, a journey so long that it takes individual stars 250 million years to complete, in
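The triangulation the article describes is the classical annual-parallax measurement, and turning a measured parallax angle into a distance is simple arithmetic. A minimal illustrative sketch (the function name and example value are ours, not from the article):

```python
def parallax_distance_pc(parallax_mas: float) -> float:
    """Distance in parsecs from an annual parallax given in milliarcseconds.

    By definition, 1 parsec is the distance at which the Earth-Sun baseline
    (1 AU) subtends an angle of 1 arcsecond, so
        d [pc] = 1 / p [arcsec] = 1000 / p [mas].
    """
    return 1000.0 / parallax_mas

# Hipparcos reached roughly milliarcsecond precision, i.e. about a
# millionth of a degree (1 mas = 1/3 600 000 degree).
print(parallax_distance_pc(100.0))  # a 100 mas parallax -> 10.0 pc
```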

  19. Catalogue of data on Thorium intake, organ burden and excretion

    International Nuclear Information System (INIS)

    1989-05-01

    The Atomic Energy Control Board is initiating the critical evaluation of current biokinetic and dosimetric models applicable to estimating exposure to the common chemical and physical forms of thorium. The identification and location of the relevant sets of data are the first steps of this project. This report describes the collection methods used and presents catalogues of the data on human and animal studies that have resulted from exposures under controlled experimental conditions, chronic occupational or environmental situations and acute accidental conditions. The data were identified through the use of computerized literature searches of the Cancerlit, Chemical Exposure, Embase, BIOSIS, NTIS, INIS, MEDLINE and Occupational Safety and Health (NIOSH) databases, library research and telephone contact with currently active researchers in the field. A table is presented which categorizes researchers in the field according to affiliation and country.

  20. Catalogue of data on uranium intake, organ burden and excretion

    International Nuclear Information System (INIS)

    1987-11-01

    The Atomic Energy Control Board is initiating the critical evaluation of current biokinetic and dosimetric models applicable to radiation workers exposed to the common chemical and physical forms of uranium that are encountered throughout the nuclear fuel cycle. The identification and location of the relevant sets of data are the first steps of this project. This report describes the collection methods used and presents catalogues of the data on human and animal studies that have resulted from exposures under controlled experimental conditions, chronic occupational or environmental situations and acute accidental conditions. The data were identified through the use of computerized literature searches of the BIOSIS, NTIS, MEDLINE and Occupational Safety and Health (NIOSH) databases, library research and telephone contact with currently active researchers in the field. A table is presented which categorizes researchers in the field according to affiliation and country.

  1. Catalogue to select the initial guess spectrum during unfolding

    CERN Document Server

    Vega-Carrillo, H R

    2002-01-01

    A new method to select the initial guess spectrum is presented. Neutron spectra unfolded from Bonner sphere data are dependent on the initial guess spectrum used in the unfolding code. The method is based on a catalogue of detector count rates calculated from a set of reported neutron spectra. The spectra of three isotopic neutron sources, ²⁵²Cf, ²³⁹PuBe and ²⁵²Cf/D₂O, were measured to test the method. The unfolding was carried out using the three initial guess options included in the BUNKIUT code. Neutron spectra were also calculated using the MCNP code. Unfolded spectra were compared with those calculated; in all the cases our method gives the best results.
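The selection step described above can be sketched as a nearest-neighbour lookup over the catalogue: pick the reference spectrum whose calculated count-rate vector best matches the measured Bonner-sphere rates. The catalogue values below are invented for illustration; they are not the paper's data, and the real method feeds the selected spectrum into BUNKIUT rather than stopping here.

```python
import math

# Hypothetical catalogue: for each reference source, the count rates
# expected in a set of Bonner spheres (arbitrary normalised units).
CATALOGUE = {
    "252Cf":     [0.12, 0.45, 0.80, 0.95, 0.60],
    "239PuBe":   [0.08, 0.30, 0.65, 1.00, 0.75],
    "252Cf/D2O": [0.40, 0.85, 0.90, 0.55, 0.25],
}

def best_initial_guess(measured):
    """Return the catalogue entry whose normalised count-rate vector is
    closest (Euclidean distance) to the measured rates."""
    def norm(v):
        peak = max(v)
        return [x / peak for x in v]
    meas = norm(measured)
    def dist(name):
        ref = norm(CATALOGUE[name])
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(meas, ref)))
    return min(CATALOGUE, key=dist)

print(best_initial_guess([0.11, 0.44, 0.82, 0.93, 0.61]))  # -> 252Cf
```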

  2. Modeling photovoltaic diffusion: an analysis of geospatial datasets

    International Nuclear Information System (INIS)

    Davidson, Carolyn; Drury, Easan; Lopez, Anthony; Elmore, Ryan; Margolis, Robert

    2014-01-01

    This study combines address-level residential photovoltaic (PV) adoption trends in California with several types of geospatial information—population demographics, housing characteristics, foreclosure rates, solar irradiance, vehicle ownership preferences, and others—to identify which subsets of geospatial information are the best predictors of historical PV adoption. Number of rooms, heating source and house age were key variables that had not been previously explored in the literature, but are consistent with the expected profile of a PV adopter. The strong relationship provided by foreclosure indicators and mortgage status has less of an intuitive connection to PV adoption, but may be highly correlated with characteristics inherent in PV adopters. Next, we explore how these predictive factors and model performance vary between different Investor Owned Utility (IOU) regions in California, and at different spatial scales. Results suggest that models trained with small subsets of geospatial information (five to eight variables) may provide similar explanatory power as models using hundreds of geospatial variables. Further, the predictive performance of models generally decreases at higher resolution, i.e. below the ZIP code level, since several geospatial variables with coarse native resolution become less useful for representing high-resolution variations in PV adoption trends. However, for California we find that model performance improves if parameters are trained at the regional IOU level rather than the state-wide level. We also find that models trained within one IOU region are generally representative for other IOU regions in CA, suggesting that a model trained with data from one state may be applicable in another state. (letter)
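The finding that a handful of well-chosen predictors can match a model with hundreds of variables is the familiar subset-regression situation. A minimal sketch on synthetic data (the variable names, coefficients and data are invented; the study's actual models and data are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical standardised ZIP-level predictors: rooms, house age, irradiance.
n = 200
X = rng.normal(size=(n, 3))
true_coef = np.array([0.8, -0.5, 1.2])
# Synthetic response: PV adoptions per 1000 homes, signal plus noise.
y = X @ true_coef + rng.normal(scale=0.3, size=n)

# Ordinary least squares via numpy's lstsq; with informative predictors,
# a small subset already explains most of the variance.
Xd = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
pred = Xd @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 with 3 predictors: {r2:.2f}")
```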

  3. Towards Geo-spatial Information Science in Big Data Era

    Directory of Open Access Journals (Sweden)

    LI Deren

    2016-04-01

    Full Text Available Since the 1990s, with the advent of the worldwide information revolution and the development of the internet, geospatial information science has also come of age, which pushed forward the building of the digital Earth and cyber cities. As we entered the 21st century, with the development and integration of global information technology and industrialization, the internet of things and cloud computing came into being, and human society entered the big data era. This article covers the key features of geospatial information science in the big data era (ubiquitous; multi-dimensional and dynamic; internet+ networking; fully automated and real-time; from sensing to recognition; crowdsourcing and VGI; and service-oriented) and addresses the key technical issues (a non-linear four-dimensional Earth reference frame system, space-based enhanced GNSS, space-air-land unified network communication techniques, on-board processing techniques for multi-source image data, smart interface service techniques for space-borne information, space-based resource scheduling and network security, and the design and development of a payload-based multi-functional satellite platform) that need to be resolved to provide a new definition of geospatial information science in the big data era. Based on the discussion in this paper, the author finally proposes a new definition of geospatial information science (geomatics): Geomatics is a multi-disciplinary science and technology which, using a systematic approach, integrates all the means for spatio-temporal data acquisition, information extraction, networked management, knowledge discovery, spatial sensing and recognition, as well as intelligent location-based services for any physical objects and human activities around the earth and its environment. Starting from this new definition, geospatial information science will find many more opportunities and tasks in the big data era for the generation of the smart earth and smart city. Our profession

  4. MultiSpec: A Desktop and Online Geospatial Image Data Processing Tool

    Science.gov (United States)

    Biehl, L. L.; Hsu, W. K.; Maud, A. R. M.; Yeh, T. T.

    2017-12-01

    MultiSpec is an easy-to-learn and easy-to-use freeware image processing tool for interactively analyzing a broad spectrum of geospatial image data, with capabilities such as image display, unsupervised and supervised classification, feature extraction, feature enhancement, and several other functions. Originally developed for Macintosh and Windows desktop computers, it has a community of several thousand users worldwide, including researchers and educators, as a practical and robust solution for analyzing multispectral and hyperspectral remote sensing data in several different file formats. More recently, MultiSpec was adapted to run in the HUBzero collaboration platform so that it can be used within a web browser, allowing new user communities to be engaged through science gateways. MultiSpec Online has also been extended to interoperate with other components (e.g., data management) in HUBzero through integration with the geospatial data building blocks (GABBs) project. This integration enables a user to directly launch MultiSpec Online from data that is stored and/or shared in a HUBzero gateway and to save output data from MultiSpec Online to hub storage, allowing data sharing and multi-step workflows without having to move data between different systems. MultiSpec has also been used in K-12 classes, one example being the GLOBE program (www.globe.gov), and in outreach material such as that provided by the USGS (eros.usgs.gov/educational-activities). MultiSpec Online now provides teachers with another way to use MultiSpec without having to install the desktop tool. Recently, MultiSpec Online was used in a geospatial data session with 30-35 middle school students at the Turned Onto Technology and Leadership (TOTAL) Camp in the summers of 2016 and 2017 at Purdue University. The students worked on a flood mapping exercise using Landsat 5 data to learn about land remote sensing using supervised classification techniques. Online documentation is available for Multi
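The supervised classification exercise mentioned above boils down to labelling each pixel by its spectral similarity to training samples. A toy minimum-distance-to-means sketch (the class names and band values are made up; MultiSpec itself offers this and richer classifiers such as maximum likelihood):

```python
import numpy as np

# Toy training pixels (reflectance in three bands) for two classes,
# e.g. "water" vs "land" in a flood-mapping exercise.
train = {
    "water": np.array([[0.05, 0.10, 0.02], [0.06, 0.12, 0.03]]),
    "land":  np.array([[0.30, 0.25, 0.40], [0.28, 0.22, 0.38]]),
}
# Class signature = mean spectrum of its training pixels.
means = {cls: samples.mean(axis=0) for cls, samples in train.items()}

def classify(pixel):
    """Assign the class whose mean spectrum is nearest in Euclidean distance."""
    return min(means, key=lambda cls: np.linalg.norm(pixel - means[cls]))

print(classify(np.array([0.07, 0.11, 0.04])))  # -> water
```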

  5. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    Science.gov (United States)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  6. INDXENDF, Preparation of Visual Catalogue of ENDF Format Data

    International Nuclear Information System (INIS)

    Silva, Orion de O.; Paviotti Corcuera, R.; De Moraes Cunha, M.; Ferreira, P.A.

    1996-01-01

    1 - Description of program or function: This program is a video catalogue for libraries in the ENDF-4, ENDF-5 or ENDF-6 format (Evaluated Nuclear Data File) which can be run on an IBM-PC or compatible computer. This user-friendly catalogue is of interest to nuclear and reactor physics researchers. The input is the filename of the ENDF data, and the two output files contain: i. the list of materials with the corresponding laboratory, author and date of evaluation; ii. information about the MF and MT numbers for each material. The program is written in the C language, whose capability of providing windows and 'interrupts', along with speed and portability, has been greatly exploited. The system allows output of options (i) and (ii) either on screen, printer or hard disk. 2 - Method of solution: The source code of about 3000 lines was written in C. The routines for windowing were based on the following works: Hummel (1988), Stevens (1989), Lafore (1987), Borland International (1988a, 1988b) and Schildt (1988, 1989). 3 - Restrictions on the complexity of the problem: The executable program occupies about 52 Kb of memory. The extra hard disk space needed depends upon the size of the ENDF/B data file to be processed (e.g. the Activation file contains about 1.3 M-bytes; the General Purpose ENDF/B-VI has four parts, each containing about 12 M-bytes). To run the program, the 'datafile' and the executable code '.EXE' file should be on the hard drive. The program may be run on any IBM/PC or compatible with at least 640 Kb RAM
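The MF/MT indexing that INDXENDF performs rests on the fixed-column layout of ENDF records: each 80-column record carries the material number MAT in columns 67-70, the file number MF in columns 71-72 and the section number MT in columns 73-75. A small parsing sketch (the column layout is from the ENDF format itself; the example record's numeric contents are invented, though MAT 9437 is the conventional number for ²³⁹Pu):

```python
def endf_control_fields(line: str):
    """Extract the MAT, MF and MT control numbers from one ENDF record.

    Columns are 1-based in the format description: MAT in 67-70,
    MF in 71-72, MT in 73-75; Python slices are 0-based.
    """
    mat = int(line[66:70])
    mf = int(line[70:72])
    mt = int(line[72:75])
    return mat, mf, mt

# Illustrative record: six 11-column data fields, then MAT=9437, MF=3, MT=18.
record = (" 2.390000+4 2.369986+2          0"
          "          0          0          0"
          "9437 3 18    ")
print(endf_control_fields(record))  # -> (9437, 3, 18)
```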

  7. Biomass production on marginal lands - catalogue of bioenergy crops

    Science.gov (United States)

    Baumgarten, Wibke; Ivanina, Vadym; Hanzhenko, Oleksandr

    2017-04-01

    Marginal lands are the poorest type of land, with various limitations for traditional agriculture. However, they can be used to produce biomass for bioenergy based on perennial plants or trees. The main advantage of biomass as an energy source compared to fossil fuels is its positive influence on the global carbon dioxide balance in the atmosphere: during combustion of biofuels, less carbon dioxide is emitted than is absorbed by plants during photosynthesis. Besides, 20 to 30 times less sulphur oxide and 3 to 4 times less ash is formed as compared with coal. Growing bioenergy crops also creates additional workplaces in rural areas. Soil and climatic conditions of most European regions are suitable for growing perennial energy crops that are capable of rapidly transforming solar energy into energy-intensive biomass. Selected plants are not demanding of soil fertility, do not require significant amounts of fertilizers and pesticides, and can therefore be cultivated on unproductive lands of Europe. They prevent soil erosion, contribute to the preservation and improvement of agroecosystems and provide low-cost biomass. A catalogue of potential bioenergy plants was developed within the EU H2020 project SEEMLA, including woody and perennial crops that are allowed to be grown in the territory of the EU and Ukraine. The catalogue lists high-productivity woody and perennial crops that are undemanding as to growing conditions and can guarantee stable, high yields of high-energy-capacity biomass on marginal lands of various categories of marginality. Biomass of perennial plants and trees is composed of cellulose, hemicellulose and lignin, which are directly used to produce solid biofuels. Thanks to the well-developed root systems of trees and perennial plants, they are better adapted to poor soils and do not require careful maintenance. Therefore, they can be grown on marginal lands. Particular C4 bioenergy crops are well adapted to a lack of moisture and high

  8. Matching Alternative Addresses: a Semantic Web Approach

    Science.gov (United States)

    Ariannamazi, S.; Karimipour, F.; Hakimpour, F.

    2015-12-01

    Rapid development of crowd-sourcing or volunteered geographic information (VGI) provides opportunities for authorities that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has resulted in a growing number of POIs in OpenStreetMap, a well-known VGI source, which causes the datasets to become outdated in short periods of time. These changes to spatial and aspatial attributes of features, such as names and addresses, may cause confusion or ambiguity in processes that rely on features' literal information, like addressing and geocoding. VGI sources neither conform to specific vocabularies nor remain in a specific schema for long periods of time. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology, pursuing an efficient way of finding desired results. The semantic web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. There is a vast amount of literal-spatial data representing the capability of linguistic information in knowledge modeling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous yields a helpful tool based on spatial data integration and lexical annotation matching and disambiguation.
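The core of matching alternative addresses is lexical normalisation followed by a similarity test. A crude stdlib-only sketch (the abbreviation rules and threshold are illustrative assumptions, not the paper's ontology-based method):

```python
from difflib import SequenceMatcher

def normalise(addr: str) -> str:
    """Crude lexical normalisation: lower-case, expand a few common
    abbreviations, strip punctuation (illustrative rules only)."""
    subs = {"st.": "street", "ave.": "avenue", "blvd.": "boulevard"}
    a = addr.lower().replace(",", " ")
    for short, full in subs.items():
        a = a.replace(short, full)
    return " ".join(a.split())

def same_address(a: str, b: str, threshold: float = 0.85) -> bool:
    """Treat two address strings as alternatives of one another when
    their normalised forms are sufficiently similar."""
    ratio = SequenceMatcher(None, normalise(a), normalise(b)).ratio()
    return ratio >= threshold

print(same_address("12 Main St., Springfield", "12 main street springfield"))  # -> True
```

A real system would add semantic disambiguation (e.g. RDF vocabularies for address parts) on top of this purely lexical step.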

  9. VizieR Online Data Catalog: Catalogue of Stellar Spectral Classifications (Skiff, 2009-2012)

    Science.gov (United States)

    Skiff, B. A.

    2010-11-01

    Morgan & Abt 'MKA' paper (1972AJ.....77...35M) not appearing in the two later lists are added. Keenan made continual adjustments to the standards lists up to the time of his death. Thus the late-type standards comprise those marked as high-weight standards in the 1989 Perkins catalogue (1989ApJS...71..245K = III/150); the revised S-type standards in collaboration with Boeshaar (1980ApJS...43..379K); plus the carbon standards and class IIIb 'clump giants' in collaboration with Barnbaum (1996ApJS..105..419B and 1999ApJ...518..859K). In addition, I have made use of the final types by Keenan up to January 2000 shown at the Ohio State Web site (http://www.astronomy.ohio-state.edu/MKCool), accessed in autumn 2003. Though the present file contains all the stars in these lists, only those marked as standards are flagged as such. Garrison's list of MK 'anchor points' might also be consulted in this regard (1994mpyp.conf....3G, and http://www.astro.utoronto.ca/~garrison/mkstds.html). The catalogue includes for the first time results from many large-scale objective-prism spectral surveys done at Case, Stockholm, Crimea, Abastumani, and elsewhere. The stars in these surveys were usually identified only on charts or by other indirect means, and have been overlooked heretofore because of the difficulty in recovering the stars. More complete results from these separate publications, including notes and identifications, have been made available to the CDS, and are kept at the Lowell Observatory ftp area (ftp://ftp.lowell.edu/pub/bas/starcats). Not all of these stars are present in SIMBAD. As a 'living catalogue', an attempt will be made to keep up with current literature, and to extend the indexing of citations back in time. (2 data files).

  10. VizieR Online Data Catalog: Catalogue of Stellar Spectral Classifications (Skiff, 2009-2013)

    Science.gov (United States)

    Skiff, B. A.

    2013-05-01

    Morgan & Abt 'MKA' paper (1972AJ.....77...35M) not appearing in the two later lists are added. Keenan made continual adjustments to the standards lists up to the time of his death. Thus the late-type standards comprise those marked as high-weight standards in the 1989 Perkins catalogue (1989ApJS...71..245K = III/150); the revised S-type standards in collaboration with Boeshaar (1980ApJS...43..379K); plus the carbon standards and class IIIb 'clump giants' in collaboration with Barnbaum (1996ApJS..105..419B and 1999ApJ...518..859K). In addition, I have made use of the final types by Keenan up to January 2000 shown at the Ohio State Web site (http://www.astronomy.ohio-state.edu/MKCool), accessed in autumn 2003. Though the present file contains all the stars in these lists, only those marked as standards are flagged as such. Garrison's list of MK 'anchor points' might also be consulted in this regard (1994mpyp.conf....3G, and http://www.astro.utoronto.ca/~garrison/mkstds.html). The catalogue includes for the first time results from many large-scale objective-prism spectral surveys done at Case, Stockholm, Crimea, Abastumani, and elsewhere. The stars in these surveys were usually identified only on charts or by other indirect means, and have been overlooked heretofore because of the difficulty in recovering the stars. More complete results from these separate publications, including notes and identifications, have been made available to the CDS, and are kept at the Lowell Observatory ftp area (ftp://ftp.lowell.edu/pub/bas/starcats). Not all of these stars are present in SIMBAD. As a 'living catalogue', an attempt will be made to keep up with current literature, and to extend the indexing of citations back in time. (2 data files).

  11. VizieR Online Data Catalog: Catalogue of Stellar Spectral Classifications (Skiff, 2009-2016)

    Science.gov (United States)

    Skiff, B. A.

    2014-10-01

    Morgan & Abt 'MKA' paper (1972AJ.....77...35M) not appearing in the two later lists are added. Keenan made continual adjustments to the standards lists up to the time of his death. Thus the late-type standards comprise those marked as high-weight standards in the 1989 Perkins catalogue (1989ApJS...71..245K = III/150); the revised S-type standards in collaboration with Boeshaar (1980ApJS...43..379K); plus the carbon standards and class IIIb 'clump giants' in collaboration with Barnbaum (1996ApJS..105..419B and 1999ApJ...518..859K). In addition, I have made use of the final types by Keenan up to January 2000 shown at the Ohio State Web site (http://www.astronomy.ohio-state.edu/MKCool), accessed in autumn 2003. Though the present file contains all the stars in these lists, only those marked as standards are flagged as such. Garrison's list of MK 'anchor points' might also be consulted in this regard (1994mpyp.conf....3G, and http://www.astro.utoronto.ca/~garrison/mkstds.html). The catalogue includes for the first time results from many large-scale objective-prism spectral surveys done at Case, Stockholm, Crimea, Abastumani, and elsewhere. The stars in these surveys were usually identified only on charts or by other indirect means, and have been overlooked heretofore because of the difficulty in recovering the stars. More complete results from these separate publications, including notes and identifications, have been made available to the CDS, and are kept at the Lowell Observatory ftp area (ftp://ftp.lowell.edu/pub/bas/starcats). Not all of these stars are present in SIMBAD. As a 'living catalogue', an attempt will be made to keep up with current literature, and to extend the indexing of citations back in time. (2 data files).

  12. VizieR Online Data Catalog: Catalogue of Stellar Spectral Classifications (Skiff, 2009-2014)

    Science.gov (United States)

    Skiff, B. A.

    2014-10-01

    Morgan & Abt 'MKA' paper (1972AJ.....77...35M) not appearing in the two later lists are added. Keenan made continual adjustments to the standards lists up to the time of his death. Thus the late-type standards comprise those marked as high-weight standards in the 1989 Perkins catalogue (1989ApJS...71..245K = III/150); the revised S-type standards in collaboration with Boeshaar (1980ApJS...43..379K); plus the carbon standards and class IIIb 'clump giants' in collaboration with Barnbaum (1996ApJS..105..419B and 1999ApJ...518..859K). In addition, I have made use of the final types by Keenan up to January 2000 shown at the Ohio State Web site (http://www.astronomy.ohio-state.edu/MKCool), accessed in autumn 2003. Though the present file contains all the stars in these lists, only those marked as standards are flagged as such. Garrison's list of MK 'anchor points' might also be consulted in this regard (1994mpyp.conf....3G, and http://www.astro.utoronto.ca/~garrison/mkstds.html). The catalogue includes for the first time results from many large-scale objective-prism spectral surveys done at Case, Stockholm, Crimea, Abastumani, and elsewhere. The stars in these surveys were usually identified only on charts or by other indirect means, and have been overlooked heretofore because of the difficulty in recovering the stars. More complete results from these separate publications, including notes and identifications, have been made available to the CDS, and are kept at the Lowell Observatory ftp area (ftp://ftp.lowell.edu/pub/bas/starcats). Not all of these stars are present in SIMBAD. As a 'living catalogue', an attempt will be made to keep up with current literature, and to extend the indexing of citations back in time. (2 data files).

  13. Discovering Land Cover Web Map Services from the Deep Web with JavaScript Invocation Rules

    Directory of Open Access Journals (Sweden)

    Dongyang Hou

    2016-06-01

    Full Text Available Automatic discovery of isolated land cover web map services (LCWMSs) can potentially help in sharing land cover data. Currently, various search engine-based and crawler-based approaches have been developed for finding services dispersed throughout the surface web. In fact, with the prevalence of geospatial web applications, a considerable number of LCWMSs are hidden in JavaScript code, which belongs to the deep web. However, discovering LCWMSs from JavaScript code remains an open challenge. This paper aims to solve this challenge by proposing a focused deep web crawler for finding more LCWMSs from deep web JavaScript code and the surface web. First, the names of a group of JavaScript links are abstracted as initial judgements. Through name matching, these judgements are utilized to judge whether or not the fetched webpages contain predefined JavaScript links that may prompt JavaScript code to invoke WMSs. Secondly, some JavaScript invocation functions and URL formats for WMS are summarized as JavaScript invocation rules from prior knowledge of how WMSs are employed and coded in JavaScript. These invocation rules are used to identify the JavaScript code for extracting candidate WMSs through rule matching. The above two operations are incorporated into a traditional focused crawling strategy situated between the tasks of fetching webpages and parsing webpages. Thirdly, LCWMSs are selected by matching services with a set of land cover keywords. Moreover, a search engine for LCWMSs is implemented that uses the focused deep web crawler to retrieve and integrate the LCWMSs it discovers. In the first experiment, eight online geospatial web applications serve as seed URLs (Uniform Resource Locators) and crawling scopes; the proposed crawler addresses only the JavaScript code in these eight applications. All 32 available WMSs hidden in JavaScript code were found using the proposed crawler, while not one WMS was discovered through the focused crawler
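The "invocation rule" idea, matching known WMS URL formats inside JavaScript string literals, can be sketched with a single regular expression. The pattern and example snippet below are illustrative assumptions; the paper's actual rule set also covers named invocation functions of mapping libraries:

```python
import re

# Hypothetical invocation rule: WMS endpoints typically appear in JavaScript
# string literals as URLs carrying SERVICE=WMS or a GetCapabilities request.
WMS_URL_RE = re.compile(
    r"""["']                     # URL opens a JS string literal
        (https?://[^"'\s]+?      # scheme and host/path, non-greedy
         (?:service=wms|request=getcapabilities)  # WMS markers
         [^"'\s]*)               # rest of the query string
        ["']""",
    re.IGNORECASE | re.VERBOSE,
)

def extract_wms_urls(js_code: str):
    """Return candidate WMS endpoint URLs found in a blob of JavaScript."""
    return WMS_URL_RE.findall(js_code)

js = """
var layer = new ol.layer.Tile({
  source: new ol.source.TileWMS({
    url: 'http://example.org/geoserver/wms?SERVICE=WMS&REQUEST=GetMap&LAYERS=landcover'
  })
});
"""
print(extract_wms_urls(js))
```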

  14. Usage of Web Mapping Systems and Services for Information Support of Regional Management

    Directory of Open Access Journals (Sweden)

    Shaparev Nicolay

    2016-01-01

    Full Text Available This work considers information and computing technologies that support regional decision making and are based on geoinformation web-systems and mapping web-services. The use of such systems for the information support of regional management is now becoming common. Long-term strategic forecasting and planning of territorial development, and the solution of various institutional and sectoral problems, are today often based on an integrated information and computing environment and complex information systems of regional management, which are built on geospatial (mapping) data. This paper discusses the technologies and web-services used in the creation and implementation of regional geoinformation web-systems. Such systems provide access to huge arrays of geospatial information and services distributed on the Internet, and to remote data processing with high-performance multi-user computers. Problems of choosing basic software, such as the geoinformation platform, and the advantages and disadvantages of existing solutions are discussed. The software structure and basic web-GIS components are analyzed. Examples of completed projects are given.

  15. TopoCad - A unified system for geospatial data and services

    Science.gov (United States)

    Felus, Y. A.; Sagi, Y.; Regev, R.; Keinan, E.

    2013-10-01

    "E-government" is a leading trend in public sector activities in recent years. The Survey of Israel set as a vision to provide all of its services and datasets online. The TopoCad system is the latest software tool developed in order to unify a number of services and databases into one on-line and user friendly system. The TopoCad system is based on Web 1.0 technology; hence the customer is only a consumer of data. All data and services are accessible for the surveyors and geo-information professional in an easy and comfortable way. The future lies in Web 2.0 and Web 3.0 technologies through which professionals can upload their own data for quality control and future assimilation with the national database. A key issue in the development of this complex system was to implement a simple and easy (comfortable) user experience (UX). The user interface employs natural language dialog box in order to understand the user requirements. The system then links spatial data with alpha-numeric data in a flawless manner. The operation of the TopoCad requires no user guide or training. It is intuitive and self-taught. The system utilizes semantic engines and machine understanding technologies to link records from diverse databases in a meaningful way. Thus, the next generation of TopoCad will include five main modules: users and projects information, coordinates transformations and calculations services, geospatial data quality control, linking governmental systems and databases, smart forms and applications. The article describes the first stage of the TopoCad system and gives an overview of its future development.

  16. The GLIMS geospatial glacier database: A new tool for studying glacier change

    Science.gov (United States)

    Raup, Bruce; Racoviteanu, Adina; Khalsa, Siri Jodha Singh; Helm, Christopher; Armstrong, Richard; Arnaud, Yves

    2007-03-01

    The Global Land Ice Measurements from Space (GLIMS) project is a cooperative effort of over sixty institutions world-wide with the goal of inventorying a majority of the world's estimated 160 000 glaciers. Each institution (called a Regional Center, or RC) oversees the analysis of satellite imagery for a particular region containing glacier ice. Data received by the GLIMS team at the National Snow and Ice Data Center (NSIDC) in Boulder, Colorado are ingested into a spatially-enabled database (PostGIS) and made available via a website featuring an interactive map, and a Web Mapping Service (WMS). The WMS, an Open Geospatial Consortium (OGC)-compliant web interface, makes GLIMS glacier data available to other data servers. The GLIMS Glacier Database is accessible on the World Wide Web at http://nsidc.org/glims/. There, users can browse custom maps, display various data layers, query information within the GLIMS database, and download query results in different GIS-compatible formats. Map layers include glacier outlines, footprints of ASTER satellite optical images acquired over glaciers, and Regional Center information. The glacier and ASTER footprint layers may be queried for scalar attribute data, such as analyst name and date of contribution for glacier data, and acquisition time and browse imagery for the ASTER footprint layer. We present an example analysis of change in Cordillera Blanca glaciers, as determined by comparing data in the GLIMS Glacier Database to historical data. Results show marked changes in that system over the last 30 years, but also point out the need for establishing clear protocols for glacier monitoring from remote-sensing data.
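An OGC-compliant WMS such as the one described here is driven by simple keyword-value HTTP requests. A sketch of assembling a WMS 1.3.0 GetMap URL (the endpoint path, layer name and bounding box below are hypothetical, not the actual GLIMS service parameters; the request keys themselves come from the WMS specification):

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layers, bbox, size=(800, 600),
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Assemble an OGC WMS 1.3.0 GetMap request URL.

    `base` is the service endpoint; the remaining parameters map onto the
    required GetMap keys of the WMS specification.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)

# Hypothetical layer name and extent (Cordillera Blanca region):
url = wms_getmap_url("http://example.org/glims_wms", ["glacier_outlines"],
                     bbox=(-10.0, -78.0, -8.5, -77.0))
print(url)
```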

  17. The Use of Geospatial Technologies Instruction within a Student/Teacher/Scientist Partnership: Increasing Students' Geospatial Skills and Atmospheric Concept Knowledge

    Science.gov (United States)

    Hedley, Mikell Lynne; Templin, Mark A.; Czajkowski, Kevin; Czerniak, Charlene

    2013-01-01

    Many 21st century careers rely on geospatial skills; yet, curricula and professional development lag behind in incorporating these skills. As a result, many teachers have limited experience or preparation for teaching geospatial skills. One strategy for overcoming such problems is the creation of a student/teacher/scientist (STS) partnership…

  18. Developing a prenatal nursing care International Classification for Nursing Practice catalogue.

    Science.gov (United States)

    Liu, L; Coenen, A; Tao, H; Jansen, K R; Jiang, A L

    2017-09-01

    This study aimed to develop a prenatal nursing care catalogue of the International Classification for Nursing Practice. As a programme of the International Council of Nurses, the International Classification for Nursing Practice aims to support standardized electronic nursing documentation and facilitate the collection of comparable nursing data across settings. This initiative enables the study of relationships among nursing diagnoses, nursing interventions and nursing outcomes for best practice, healthcare management decisions, and policy development. The catalogues are usually focused on target populations; pregnant women are the nursing population addressed in this project. According to the guidelines for catalogue development, three research steps were adopted: (a) identifying relevant nursing diagnoses, interventions and outcomes; (b) developing a conceptual framework for the catalogue; (c) expert validation. This project established a prenatal nursing care catalogue with 228 terms in total, including 69 nursing diagnoses, 92 nursing interventions and 67 nursing outcomes, of which 57 terms were newly developed. All terms in the catalogue were organized by a framework with two main categories, i.e. Expected Changes of Pregnancy and Pregnancy at Risk. Each category had four domains, representing the physical, psychological, behavioral and environmental perspectives of nursing practice. This catalogue can ease the documentation workload among prenatal care nurses, and facilitate the storage and retrieval of standardized data for many purposes, such as quality improvement, administrative decision support and research. The documentation of prenatal care provides data that can be more fluently communicated, compared and evaluated across various healthcare providers and clinical settings. © 2016 International Council of Nurses.

  19. The Planck Catalogue of High-z source candidates

    Science.gov (United States)

    Montier, Ludovic

    2015-08-01

    The Planck satellite has provided the first FIR/submm all-sky survey with a sensitivity allowing us to identify the rarest, most luminous high-z dusty star-forming sources on the sky. It opens a new window on these extreme star-forming systems at redshifts above 1.5, providing a powerful laboratory to study the mechanisms of galaxy evolution and enrichment in the frame of large-scale structure growth. I will describe how the Planck catalogue of high-z source candidates (PHz, Planck 2015 in prep.) has been built and characterized over 25% of the sky by selecting the brightest red submm sources at a 5' resolution. Follow-up observations with Herschel/SPIRE over 228 Planck candidates have shown that 93% of these candidates are actually overdensities of red sources with SEDs peaking at 350um (Planck Int. results XXVII 2014). Complementary to this population of objects, 12 Planck high-z candidates have been identified as strongly lensed star-forming galaxies at redshifts lying between 2.2 and 3.6 (Canameras et al 2015 subm.), with flux densities larger than 400 mJy up to 1 Jy at 350um, and strong magnification factors. These Planck lensed star-forming galaxies are the rarest and brightest lensed galaxies in the submm range, providing a unique opportunity to extend the exploration of star-forming systems in this range of mass and redshift. I will detail further a specific analysis performed on a proto-cluster candidate, PHz G95.5-61.6, identified as a double structure at z=1.7 and z=2.03, using an extensive follow-up program (Flores-Cacho et al 2015 subm.). This is the first Planck proto-cluster candidate with spectroscopic confirmation, which opens a new field of statistical analysis of the evolution of dusty star-forming galaxies in such accreting structures. I will finally discuss how the PHz catalogue may help to answer some fundamental questions, such as: At what cosmic epoch did massive galaxy clusters form most of their stars? Is star formation more or less vigorous

  20. Value of Hipparcos Catalogue shown by planet assessments

    Science.gov (United States)

    1996-08-01

    , or deuterium. Even the "worst-case" mass quoted here for the companion of 47 Ursae Majoris, 22 Jupiter masses, is only a maximum, not a measurement. So the companion is almost certainly a true planet with less than 17 times the mass of Jupiter. For the star 70 Virginis, the distance newly established by Hipparcos is 59 light-years. Even on the least favourable assumptions about its orbit, the companion cannot have more than 65 Jupiter masses. It could be a brown dwarf rather than a planet, but not a true star. Much more ambiguous is the result for 51 Pegasi. Its distance is 50 light-years and theoretically the companion could have more than 500 Jupiter masses, or half the mass of the Sun. This is a peculiar case anyway, because the companion is very close to 51 Pegasi. Small planets of the size of the Earth might be more promising as abodes of life than the large planets detectable by present astronomical methods. Space scientists are now reviewing methods of detecting the presence of life on alien planets by detecting the infrared signature of ozone in a planet's atmosphere. Ozone is a by-product of oxygen gas, which in turn is supposed to be generated only by life similar to that on the Earth. Meanwhile the detection of planets of whatever size is a tour de force for astronomers, and by analogy with the Solar System one may suppose that large planets are often likely to be accompanied by smaller ones. "Hipparcos was not conceived to look for planets," comments Michael Perryman, ESA's project scientist for Hipparcos, "and this example of assistance to our fellow-astronomers involves a very small sample of our measurements. But it is a timely result when we are considering planet-hunting missions for the 21st Century. The possibilities include a super-Hipparcos that could detect directly the wobbles in nearby stars due to the presence of planets."

Hipparcos Catalogue ready for use

The result from Hipparcos on alien planets coincides with the completion of the Hipparcos