WorldWideScience

Sample records for large size geospatial

  1. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

In the past decades, many geospatial databases have been established at national, regional and municipal levels around the world. It is now widely recognized that keeping these established geospatial databases up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, there are two main types of methods for geospatial database updating: direct updating with remote sensing images or field survey materials, and indirect updating with other already-updated data, such as newly updated larger-scale data. The former is the more fundamental method, since the update data sources in both cases ultimately derive from field surveying and remote sensing, but the latter is often more economical and faster. Therefore, after a larger-scale database is updated, the smaller-scale databases should be updated correspondingly in order to keep multi-scale geospatial databases consistent. In this situation, it is natural to apply map generalization technology to the process of geospatial database updating. This is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in which databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger-scale data in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, followed by a brief review of geospatial data updating based on digital map generalization. Based on the requirements analysis and review, we analyze the key factors for implementing the updating of geospatial data from large scale including technical
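The abstract centers on map generalization as the engine for deriving smaller-scale updates from larger-scale data. One core generalization operation is line simplification; below is a minimal, illustrative Douglas-Peucker sketch in Python (not the authors' implementation; the tolerance and coordinates are invented):

```python
# Minimal Douglas-Peucker line simplification, a core map-generalization
# operation (illustrative sketch only; not the paper's implementation).

def _point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    # Area of the parallelogram divided by the base length gives the height.
    return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5

def simplify(points, tol):
    """Return a subset of points approximating the polyline within tol."""
    if len(points) < 3:
        return list(points)
    # Find the point farthest from the chord between the endpoints.
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _point_line_dist(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tol:
        return [points[0], points[-1]]
    # Recurse on both halves and merge, dropping the duplicated split point.
    left = simplify(points[: idx + 1], tol)
    right = simplify(points[idx:], tol)
    return left[:-1] + right

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(simplify(line, 1.0))
```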

  2. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    Science.gov (United States)

    Kulo, Violet; Bodzin, Alec

    2013-02-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.

  3. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

This book covers key issues related to the Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data sources; interfaces for the Geospatial Semantic Web; VGI (Volunteered Geographic Information) and the Geospatial Semantic Web; challenges of the Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems, such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.

  4. Transportation of Large Wind Components: A Review of Existing Geospatial Data

    Energy Technology Data Exchange (ETDEWEB)

Mooney, Meghan [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Maclaurin, Galen [National Renewable Energy Lab. (NREL), Golden, CO (United States)]

    2016-09-01

    This report features the geospatial data component of a larger project evaluating logistical and infrastructure requirements for transporting oversized and overweight (OSOW) wind components. The goal of the larger project was to assess the status and opportunities for improving the infrastructure and regulatory practices necessary to transport wind turbine towers, blades, and nacelles from current and potential manufacturing facilities to end-use markets. The purpose of this report is to summarize existing geospatial data on wind component transportation infrastructure and to provide a data gap analysis, identifying areas for further analysis and data collection.

  5. Using Cluster Analysis to Compartmentalize a Large Managed Wetland Based on Physical, Biological, and Climatic Geospatial Attributes.

    Science.gov (United States)

    Hahus, Ian; Migliaccio, Kati; Douglas-Mankin, Kyle; Klarenberg, Geraldine; Muñoz-Carpena, Rafael

    2018-04-27

    Hierarchical and partitional cluster analyses were used to compartmentalize Water Conservation Area 1, a managed wetland within the Arthur R. Marshall Loxahatchee National Wildlife Refuge in southeast Florida, USA, based on physical, biological, and climatic geospatial attributes. Single, complete, average, and Ward's linkages were tested during the hierarchical cluster analyses, with average linkage providing the best results. In general, the partitional method, partitioning around medoids, found clusters that were more evenly sized and more spatially aggregated than those resulting from the hierarchical analyses. However, hierarchical analysis appeared to be better suited to identify outlier regions that were significantly different from other areas. The clusters identified by geospatial attributes were similar to clusters developed for the interior marsh in a separate study using water quality attributes, suggesting that similar factors have influenced variations in both the set of physical, biological, and climatic attributes selected in this study and water quality parameters. However, geospatial data allowed further subdivision of several interior marsh clusters identified from the water quality data, potentially indicating zones with important differences in function. Identification of these zones can be useful to managers and modelers by informing the distribution of monitoring equipment and personnel as well as delineating regions that may respond similarly to future changes in management or climate.
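The partitional method named in the abstract, partitioning around medoids (PAM), can be sketched for tiny inputs as an exhaustive search over candidate medoid sets. This illustrates the general technique only; the study's attributes, distance measure and software are not shown, and the toy points below are invented:

```python
# Tiny partitioning-around-medoids (PAM-style) sketch on toy attribute
# vectors; an illustration of the partitional method named in the abstract,
# not the study's actual workflow or data.
import itertools

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def total_cost(points, medoids):
    # Cost = sum over points of the distance to the nearest medoid.
    return sum(min(dist(p, m) for m in medoids) for p in points)

def pam(points, k):
    """Exhaustive PAM for small inputs: pick the k medoids minimizing cost."""
    best = min(itertools.combinations(points, k),
               key=lambda meds: total_cost(points, meds))
    # Assign each point to its nearest medoid.
    clusters = {m: [] for m in best}
    for p in points:
        clusters[min(best, key=lambda m: dist(p, m))].append(p)
    return clusters

# Toy "geospatial attribute" vectors forming two obvious groups.
pts = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
clusters = pam(pts, 2)
for m, members in clusters.items():
    print(m, members)
```

Real PAM implementations use iterative swap heuristics rather than exhaustive search, which only works for toy-sized inputs like this one.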

  6. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    Science.gov (United States)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to data storage, computing and analysis technologies. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is built on the latest virtualized computing infrastructure and distributed computing architectures. OpenStack and Docker are used to build a multi-user cloud computing infrastructure hosting GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides spatiotemporal computational models and advanced geospatial visualization tools for other domains with a spatial dimension.

  7. Large Scale Analysis of Geospatial Data with Dask and XArray

    Science.gov (United States)

    Zender, C. S.; Hamman, J.; Abernathey, R.; Evans, K. J.; Rocklin, M.; Zender, C. S.; Rocklin, M.

    2017-12-01

The analysis of geospatial data with high-level languages has accelerated innovation and the impact of existing data resources. However, as datasets grow beyond single-machine memory, data structures within these high-level languages can become a bottleneck. New libraries like Dask and XArray resolve some of these scalability issues, providing interactive workflows that are familiar to high-level-language researchers while also scaling out to much larger datasets. This broadens researchers' access to larger datasets on high-performance computers and, through interactive development, reduces time-to-insight compared to traditional parallel programming techniques (MPI). This talk describes Dask, a distributed dynamic task scheduler; Dask.array, a multi-dimensional array that copies the popular NumPy interface; and XArray, a library that wraps NumPy/Dask.array with labeled and indexed axes, implementing the CF conventions. We discuss both the basic design of these libraries and how they change interactive analysis of geospatial data, as well as recent benefits and challenges of distributed computing on clusters of machines.
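As a hedged illustration of the pattern this abstract describes (assuming dask and xarray are installed; the array shape and dimension names below are invented):

```python
# A minimal sketch of the out-of-core pattern described above: dask.array
# mimics the NumPy interface but evaluates lazily over chunks, and xarray
# adds labeled dimensions on top.
import dask.array as da
import xarray as xr

# A much larger array would work the same way; chunks keep memory bounded.
data = da.ones((400, 600), chunks=(100, 100))   # lazy, nothing computed yet
arr = xr.DataArray(data, dims=("lat", "lon"), name="t2m")

# Label-aware reduction; .compute() triggers the parallel task graph.
zonal_mean = arr.mean(dim="lon").compute()
print(zonal_mean.shape)   # (400,)
```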

  8. National Geospatial Program

    Science.gov (United States)

    Carswell, William J.

    2011-01-01

    The National Geospatial Program (NGP; http://www.usgs.gov/ngpo/) satisfies the needs of customers by providing geospatial products and services that customers incorporate into their decisionmaking and operational activities. These products and services provide geospatial data that are organized and maintained in cost-effective ways and developed by working with partners and organizations whose activities align with those of the program. To accomplish its mission, the NGP— organizes, maintains, publishes, and disseminates the geospatial baseline of the Nation's topography, natural landscape, and manmade environment through The National Map

  9. Designing a two-rank acceptance sampling plan for quality inspection of geospatial data products

    Science.gov (United States)

    Tong, Xiaohua; Wang, Zhenhua; Xie, Huan; Liang, Dan; Jiang, Zuoqin; Li, Jinchao; Li, Jun

    2011-10-01

    To address the disadvantages of classical sampling plans designed for traditional industrial products, we originally propose a two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first rank sampling plan is to inspect the lot consisting of map sheets, and the second is to inspect the lot consisting of features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, which covers two lot size cases. The first case is for a small lot size with nonconformities being modeled by a hypergeometric distribution function, and the second is for a larger lot size with nonconformities being modeled by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that: (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items and (2) the proposed acceptance sampling plan based on TRASP performs better than other classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard used specifically for industrial outputs, i.e., "lots with different sizes corresponding to the same sampling plan."
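The two lot-size cases can be made concrete with a small sketch: the acceptance probability of a single plan (sample size n, acceptance number c) under each distribution. The plan parameters below are invented for illustration and are not the paper's optimized values:

```python
# Acceptance probability for a single sampling plan (n, c) under the two
# lot-size models named in the abstract: hypergeometric for small lots,
# Poisson for large ones. Numbers are illustrative only.
from math import comb, exp, factorial

def p_accept_hypergeom(N, D, n, c):
    """P(accept): at most c nonconforming items in a sample of n drawn
    without replacement from a lot of N containing D nonconforming."""
    return sum(comb(D, d) * comb(N - D, n - d) for d in range(c + 1)) / comb(N, n)

def p_accept_poisson(n, p, c):
    """Large-lot approximation: nonconformity count ~ Poisson(n * p)."""
    lam = n * p
    return sum(exp(-lam) * lam ** d / factorial(d) for d in range(c + 1))

# Small lot: 50 map sheets, 5 nonconforming, sample 10, accept if <= 1 bad.
print(round(p_accept_hypergeom(50, 5, 10, 1), 4))
# Large-lot approximation with the same 10% nonconforming rate.
print(round(p_accept_poisson(10, 0.10, 1), 4))
```

Designing a plan then amounts to searching over (n, c) for the smallest sample size whose acceptance probability at the AQL meets the required risk levels.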

  10. The geospatial data quality REST API for primary biodiversity data.

    Science.gov (United States)

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues, as well as general errors, in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under the GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
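A client call to such a service might be assembled as below. The endpoint is the one given in the abstract; the query-parameter names (decimalLatitude, decimalLongitude, countryCode) are illustrative assumptions rather than documented API fields:

```python
# Sketch of building a request URL for a REST geospatial-quality API.
# Only the base endpoint comes from the abstract; the parameter names
# are hypothetical and used purely to show the pattern.
from urllib.parse import urlencode

BASE = "http://api-geospatial.vertnet-portal.appspot.com/geospatial"

def build_quality_request(lat, lon, country):
    params = {"decimalLatitude": lat, "decimalLongitude": lon,
              "countryCode": country}
    return f"{BASE}?{urlencode(params)}"

url = build_quality_request(40.4168, -3.7038, "ES")
print(url)
```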

  11. NCI's Distributed Geospatial Data Server

    Science.gov (United States)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach to batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments on distributed computing using interactive access to significant cloud infrastructure opens the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. 
The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under

  12. Geospatial Services Laboratory

    Data.gov (United States)

Federal Laboratory Consortium — FUNCTION: To process, store, and disseminate geospatial data to the Department of Defense and other Federal agencies. DESCRIPTION: The Geospatial Services Laboratory...

  13. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    Science.gov (United States)

Baxes, Gregory; Mixon, Brian; Linger, Tim

    2013-01-01

Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that depends on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application continually issues data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method has been developed for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent, dynamically generated KML code that directs the client application to make follow-on requests for higher level-of-detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be
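The cascading strategy can be sketched as server-side generation of KML NetworkLink elements whose Region/Lod settings make the client request finer imagery only when the user zooms in far enough. The element names below are standard KML; the tile URL pattern and pixel threshold are hypothetical:

```python
# Sketch of server-side generation of a cascading-KML fragment: a
# NetworkLink whose Region/Lod triggers the next level-of-detail request
# only when the region occupies enough screen pixels. Illustrative only.

def lod_network_link(name, href, north, south, east, west, min_pixels=256):
    """Return a KML NetworkLink string gated by a Region/Lod threshold."""
    return f"""<NetworkLink>
  <name>{name}</name>
  <Region>
    <LatLonAltBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonAltBox>
    <Lod><minLodPixels>{min_pixels}</minLodPixels><maxLodPixels>-1</maxLodPixels></Lod>
  </Region>
  <Link><href>{href}</href><viewRefreshMode>onRegion</viewRefreshMode></Link>
</NetworkLink>"""

# The server would emit one such link per child tile of the current view.
kml = lod_network_link("tile_0_0", "http://example.com/tiles/1/0/0.kml",
                       45.0, 0.0, 45.0, 0.0)
print(kml)
```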

  14. Grid Enabled Geospatial Catalogue Web Service

    Science.gov (United States)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium's (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under a Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, and to the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular can query on-demand data in the virtual community and retrieve it through data-related services providing functions such as subsetting, reformatting, reprojection, etc. This work facilitates the sharing and interoperation of geospatial resources under a Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science, and not on issues with computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  15. Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India

    Science.gov (United States)

    Mohan, M.

    2016-06-01

In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. This geospatial information is processed with geospatial technologies, which are playing important roles in the development of smart cities, particularly in developing countries of the world such as India. The study is based on the latest multi-date, multi-stage, multi-sensor, and multi-resolution geospatial satellite imagery. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems have been used for 3-D geovisualisation, geospatial digital mapping and geospatial analysis for the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information helps in 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for the geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides a platform for information visualisation, which is also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for the geovisualisation of smart cities in an emerging and developing country, India.

  16. A CLOUD-BASED PLATFORM SUPPORTING GEOSPATIAL COLLABORATION FOR GIS EDUCATION

    Directory of Open Access Journals (Sweden)

    X. Cheng

    2015-05-01

Full Text Available GIS-related education needs the support of geo-data and geospatial software. Although a large amount of geographic information resources is distributed on the web, the discovery, processing and integration of these resources remain unsolved problems. Researchers and teachers often search for geo-data with common search engines, but the results are unsatisfactory. They also spend considerable money and effort on the purchase and maintenance of various kinds of geospatial software. To address these problems, a cloud-based geospatial collaboration platform called GeoSquare was designed and implemented. The platform serves as a geoportal encouraging geospatial data, information, and knowledge sharing through highly interactive and expressive graphic interfaces. Researchers and teachers can solve their problems effectively in this one-stop solution. Functions, specific design and implementation details are presented in this paper. The site of GeoSquare is: http://geosquare.tianditu.com/

  17. a Cloud-Based Platform Supporting Geospatial Collaboration for GIS Education

    Science.gov (United States)

    Cheng, X.; Gui, Z.; Hu, K.; Gao, S.; Shen, P.; Wu, H.

    2015-05-01

GIS-related education needs the support of geo-data and geospatial software. Although a large amount of geographic information resources is distributed on the web, the discovery, processing and integration of these resources remain unsolved problems. Researchers and teachers often search for geo-data with common search engines, but the results are unsatisfactory. They also spend considerable money and effort on the purchase and maintenance of various kinds of geospatial software. To address these problems, a cloud-based geospatial collaboration platform called GeoSquare was designed and implemented. The platform serves as a geoportal encouraging geospatial data, information, and knowledge sharing through highly interactive and expressive graphic interfaces. Researchers and teachers can solve their problems effectively in this one-stop solution. Functions, specific design and implementation details are presented in this paper. The site of GeoSquare is: http://geosquare.tianditu.com/

  18. GEOSPATIAL INFORMATION FROM SATELLITE IMAGERY FOR GEOVISUALISATION OF SMART CITIES IN INDIA

    Directory of Open Access Journals (Sweden)

    M. Mohan

    2016-06-01

Full Text Available In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. This geospatial information is processed with geospatial technologies, which are playing important roles in the development of smart cities, particularly in developing countries of the world such as India. The study is based on the latest multi-date, multi-stage, multi-sensor, and multi-resolution geospatial satellite imagery. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems have been used for 3-D geovisualisation, geospatial digital mapping and geospatial analysis for the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information helps in 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for the geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides a platform for information visualisation, which is also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for the geovisualisation of smart cities in an emerging and developing country, India.

  19. Python geospatial development

    CERN Document Server

    Westra, Erik

    2013-01-01

This is a tutorial-style book that teaches the use of Python tools for GIS through simple practical examples and then shows you how to build a complete mapping application from scratch. The book assumes basic knowledge of Python; no knowledge of open source GIS is required. It is aimed at experienced Python developers who want to learn about geospatial concepts, work with geospatial data, solve spatial problems, and build map-based applications. This book will be useful for those who want to get up to speed with open source GIS in order to build GIS applications or integrate geospatial features into their existing ap

  20. Geospatial Authentication

    Science.gov (United States)

    Lyle, Stacey D.

    2009-01-01

A software package has been designed to provide authentication by determining whether a rover is within a set of boundaries or a specific area before it may access critical geospatial information, using GPS signal structures as a means to authenticate mobile devices into a network wirelessly and in real time. The advantage lies in that the system admits only those within the designated geospatial boundaries or areas into the server.
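The boundary check at the heart of such geospatial authentication can be sketched as a standard ray-casting point-in-polygon test (illustrative only; the package's GPS signal-structure authentication is not shown, and the polygon below is invented):

```python
# Ray-casting point-in-polygon check: a standard way to decide whether a
# rover's position lies inside an authorized geospatial boundary.
# Illustrative sketch only, not the described software package.

def inside(point, polygon):
    """True if point (x, y) lies inside the polygon (list of vertices)."""
    x, y = point
    n = len(polygon)
    crossings = 0
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges that cross the horizontal ray to the right of the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                crossings += 1
    return crossings % 2 == 1

# A rover at (2, 2) is inside this authorized area; one at (9, 9) is not.
area = [(0, 0), (5, 0), (5, 5), (0, 5)]
print(inside((2, 2), area), inside((9, 9), area))
```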

  1. Infrastructure for the Geospatial Web

    Science.gov (United States)

    Lake, Ron; Farley, Jim

    Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.

  2. Geospatial Semantics and the Semantic Web

    CERN Document Server

    Ashish, Naveen

    2011-01-01

The availability of geographic and geospatial information and services, especially on the open Web, has become abundant in the last several years with the proliferation of online maps, geocoding services, geospatial Web services and geospatially enabled applications. The need for geospatial reasoning has significantly increased in many everyday applications, including personal digital assistants, Web search applications, location-aware mobile services, specialized systems for emergency response, medical triaging, intelligence analysis and more. Geospatial Semantics and the Semantic Web: Foundation

  3. Multi-source Geospatial Data Analysis with Google Earth Engine

    Science.gov (United States)

    Erickson, T.

    2014-12-01

The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org

  4. Geospatial health

    DEFF Research Database (Denmark)

    Utzinger, Jürg; Rinaldi, Laura; Malone, John B.

    2011-01-01

    Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000 and the inaugural issue of its official journal was published in November 2006 with the aim to cover all aspects of geographical...... information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and an open-access online version. The journal is referenced in major...... databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member strong editorial board...

  5. The African Geospatial Sciences Institute (agsi): a New Approach to Geospatial Training in North Africa

    Science.gov (United States)

    Oeldenberger, S.; Khaled, K. B.

    2012-07-01

    The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate geospatial capacity development in North Africa by providing facilities for geospatial project and management training to regional government employees, university graduates, private individuals and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training, providing actual project-execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities, e.g. the ITC, and international organizations such as the ISPRS, the ICA and the OGC. Through close cooperation with African organizations such as the AARSE, the RCMRD and RECTAS, the network and exchange of ideas, experiences, technology and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will steer the AGSI operations and will ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility in Tunis with approximately 30 part- and full-time general staff and lecturers during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and a compact LIDAR instrument for local, community-scale data capture. Surveying training, the photogrammetric processing of aerial images, GIS data capture and remote sensing training will be the main components of the practical training courses

  6. Geospatial Data as a Service: Towards planetary scale real-time analytics

    Science.gov (United States)

    Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.

    2017-12-01

    The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data, and different interfaces have been defined to provide data services. Unfortunately, there are considerable differences among the standards, protocols and data models, which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observation, climate and weather forecasting are examples of these communities which generate large amounts of geospatial data. The NCI has been carrying out a significant effort to develop a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform on which new infrastructures can be built. One of the key capabilities these technologies offer is "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach for geospatial data discovery and delivery based on OGC standards. We will present the architecture and motivating use-cases that drove GSKY's collaborative design, development and production deployment. 
We show our approach offers the community valuable exploratory

  7. Examining the Effect of Enactment of a Geospatial Curriculum on Students' Geospatial Thinking and Reasoning

    Science.gov (United States)

    Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara

    2014-08-01

    A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gain in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of curriculum enactment. Teacher-level factors, such as enactment of critical components of the curriculum or the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide support that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.

  8. Geospatial Data Management Platform for Urban Groundwater

    Science.gov (United States)

    Gaitanaru, D.; Priceputu, A.; Gogu, C. R.

    2012-04-01

    Due to the large number of civil work projects and research studies, large quantities of geo-data are produced for urban environments. These data are often redundant, and they are spread across different institutions and private companies. Time-consuming operations like data processing and information harmonisation are the main reasons the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. The underground structures (subway lines, deep foundations, underground parking structures, and others), the urban utility networks (sewer systems, water supply networks, heating conduits, etc.), the drainage systems, the surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. However, because these activities provide a large quantity of data, aquifer modelling and behaviour prediction can be done using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages such as GML, GeoSciML, WaterML, GWML, CityGML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania) an integrated platform for groundwater geospatial data management is being developed under the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis
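
    The mark-up languages mentioned above (GML, WaterML, GWML) share geo-data as structured XML. A toy sketch of serializing one groundwater-level observation with Python's standard library (element names are simplified for illustration and the well id is hypothetical; the actual schemas are namespaced and far richer):

```python
import xml.etree.ElementTree as ET

# Serialize a single groundwater-level observation into a WaterML-flavoured
# XML fragment. Element names are simplified stand-ins, not the real schema.

obs = ET.Element("observation")
ET.SubElement(obs, "site").text = "Bucharest-W12"  # hypothetical well id
ET.SubElement(obs, "parameter").text = "groundwater_level"
value = ET.SubElement(obs, "value", unit="m")
value.text = "-4.37"

xml_doc = ET.tostring(obs, encoding="unicode")
print(xml_doc)
```

    Because the payload is plain, schema-described XML, a receiving research centre can parse it without sharing the sender's database structure.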

  9. The large sample size fallacy.

    Science.gov (United States)

    Lantz, Björn

    2013-06-01

    Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
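
    The fallacy described above can be demonstrated numerically: with a large enough sample, a trivial true effect produces an extreme test statistic. A stdlib-only simulation sketch (the effect size, sample size and seed are arbitrary choices for illustration):

```python
import math
import random

# With n large enough, a trivial true difference between groups yields an
# "extremely significant" test statistic even though the standardized effect
# size (Cohen's d) stays negligible.
random.seed(42)
n = 100_000
a = [random.gauss(0.00, 1.0) for _ in range(n)]
b = [random.gauss(0.03, 1.0) for _ in range(n)]  # tiny true effect, d = 0.03

def mean(x):
    return sum(x) / len(x)

def var(x):
    m = mean(x)
    return sum((v - m) ** 2 for v in x) / (len(x) - 1)

# Two-sample z statistic (t is effectively z at this n) and pooled Cohen's d.
se = math.sqrt(var(a) / n + var(b) / n)
z = (mean(b) - mean(a)) / se
d = (mean(b) - mean(a)) / math.sqrt((var(a) + var(b)) / 2)

print(f"z = {z:.2f}, Cohen's d = {d:.3f}")
```

    The difference is statistically significant, yet d stays near 0.03, far below even a conventionally "small" effect, which is exactly why effect sizes should be reported alongside p-values.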

  10. Geospatial Technologies and Geography Education in a Changing World : Geospatial Practices and Lessons Learned

    NARCIS (Netherlands)

    2015-01-01

    Book published by IGU Commission on Geographical Education. It focuses particularly on what has been learned from geospatial projects and research from the past decades of implementing geospatial technologies in formal and informal education.

  11. A Geospatial Online Instruction Model

    OpenAIRE

    Athena OWEN-NAGEL; John C. RODGERS III; Shrinidhi AMBINAKUDIGE

    2012-01-01

    The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model’s effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based teaching effectiveness has not yet been clearly demonstrated for geospatial courses. The pedagogical model implemented in this study heavily utilizes ...

  12. GSKY: A scalable distributed geospatial data server on the cloud

    Science.gov (United States)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information coming from different geospatial collections is in increasing demand in the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), has over 10 petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve large numbers of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. 
This is achieved by decoupling the data ingestion and indexing process as
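
    The on-the-fly transformations GSKY performs include resampling a stored grid to the resolution a client requests. A minimal nearest-neighbour resampling sketch in pure Python (illustrative only; a production server would use optimized raster libraries):

```python
# Resample a raster grid to a requested output size by nearest-neighbour
# lookup, the simplest of the on-the-fly transformations a data server
# can perform instead of storing multiple copies of the data.

def resample_nearest(grid, out_rows, out_cols):
    in_rows, in_cols = len(grid), len(grid[0])
    out = []
    for r in range(out_rows):
        src_r = min(in_rows - 1, int(r * in_rows / out_rows))
        row = []
        for c in range(out_cols):
            src_c = min(in_cols - 1, int(c * in_cols / out_cols))
            row.append(grid[src_r][src_c])
        out.append(row)
    return out

dem = [[1, 2], [3, 4]]
print(resample_nearest(dem, 4, 4))
# → [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

    Each output cell is computed independently, which is what lets a server like GSKY distribute the work across cloud nodes.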

  13. Geospatial Information Response Team

    Science.gov (United States)

    Witt, Emitt C.

    2010-01-01

    Extreme emergency events of national significance that include manmade and natural disasters seem to have become more frequent during the past two decades. The Nation is becoming more resilient to these emergencies through better preparedness, reduced duplication, and establishing better communications so every response and recovery effort saves lives and mitigates the long-term social and economic impacts on the Nation. The National Response Framework (NRF) (http://www.fema.gov/NRF) was developed to provide the guiding principles that enable all response partners to prepare for and provide a unified national response to disasters and emergencies. The NRF provides five key principles for better preparation, coordination, and response: 1) engaged partnerships, 2) a tiered response, 3) scalable, flexible, and adaptable operations, 4) unity of effort, and 5) readiness to act. The NRF also describes how communities, tribes, States, the Federal Government, private-sector, and non-governmental partners apply these principles for a coordinated, effective national response. The U.S. Geological Survey (USGS) has adopted the NRF doctrine by establishing several earth-sciences, discipline-level teams to ensure that USGS science, data, and individual expertise are readily available during emergencies. The Geospatial Information Response Team (GIRT) is one of these teams. The USGS established the GIRT to facilitate the effective collection, storage, and dissemination of geospatial data information and products during an emergency. The GIRT ensures that timely geospatial data are available for use by emergency responders, land and resource managers, and for scientific analysis. 
In an emergency and response capacity, the GIRT is responsible for establishing procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing coordinated products and services utilizing the USGS' exceptional pool of

  14. A geospatial search engine for discovering multi-format geospatial data across the web

    Science.gov (United States)

    Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney

    2014-01-01

    The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease with which data can now be created. However, challenges remain in connecting individuals searching for geospatial data with the servers and websites where such data exist. The objective of this paper is to present a publicly...

  15. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    Science.gov (United States)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, Multicore, etc. offer the necessary functionalities to solve important problems in the Earth Science domain: storage, distribution, management, processing and security of geospatial data, execution of complex processing through task and data parallelism, etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, but also the development of standardized and specialized tools for storing, analyzing, processing and visualizing the geospatial data concerning this area. For achieving these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, geospatial Web services standardized by the Open Geospatial Consortium (OGC) and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyses the integration and execution of geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures based on application characteristics and user requirements through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project on different use cases, such as the execution of geospatial Web services both on Web and Grid infrastructures [2] and the execution of SWAT hydrological models both on Grid and Multicore architectures [3]. The current

  16. The Geospatial Web and Local Geographical Education

    Science.gov (United States)

    Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.

    2010-01-01

    Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…

  17. High Performance Processing and Analysis of Geospatial Data Using CUDA on GPU

    Directory of Open Access Journals (Sweden)

    STOJANOVIC, N.

    2014-11-01

    Full Text Available In this paper, the high-performance processing of massive geospatial data on a many-core GPU (Graphics Processing Unit) is presented. We use the CUDA (Compute Unified Device Architecture) programming framework to implement parallel processing of common Geographic Information Systems (GIS) algorithms, such as viewshed analysis and map-matching. Experimental evaluation indicates an improvement in performance with respect to CPU-based solutions and shows the feasibility of using GPUs and CUDA for parallel implementation of GIS algorithms over large-scale geospatial datasets.

  18. Geospatial Technology in Geography Education

    NARCIS (Netherlands)

    Muniz Solari, Osvaldo; Demirci, A.; van der Schee, J.A.

    2015-01-01

    The book is presented as an important starting point for new research in Geography Education (GE) related to the use and application of geospatial technologies (GSTs). For this purpose, the selection of topics was based on central ideas to GE in its relationship with GSTs. The process of geospatial

  19. OSGeo - Open Source Geospatial Foundation

    Directory of Open Access Journals (Sweden)

    Margherita Di Leo

    2012-09-01

    Full Text Available The need, which arose toward the end of 2005, to select and organize more than 200 FOSS4G projects led to the founding in February 2006 of OSGeo (the Open Source Geospatial Foundation), an international organization whose mission is to promote the collaborative development of free software focused on geographic information (FOSS4G). The Open Source Geospatial Foundation (OSGeo) is a not-for-profit organization created in early 2006 with the aim of supporting the collaborative development of geospatial open source software and promoting its widespread use. The foundation provides financial, organizational and legal support to the broader open source geospatial community. It also serves as an independent legal entity to which community members can contribute code, funding and other resources, secure in the knowledge that their contributions will be maintained for public benefit. OSGeo also serves as an outreach and advocacy organization for the open source geospatial community, and provides a common forum and shared infrastructure for improving cross-project collaboration. The foundation's projects are all freely available and usable under an OSI-certified open source license. The Italian OSGeo local chapter is named GFOSS.it (Associazione Italiana per l'informazione Geografica Libera).

  20. A Geospatial Online Instruction Model

    Science.gov (United States)

    Rodgers, John C., III; Owen-Nagel, Athena; Ambinakudige, Shrinidhi

    2012-01-01

    The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model's effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based…

  1. From Geomatics to Geospatial Intelligent Service Science

    Directory of Open Access Journals (Sweden)

    LI Deren

    2017-10-01

    Full Text Available The paper reviews the 60 years of development from traditional surveying and mapping to today's geospatial intelligent service science. The three important stages of surveying and mapping, namely the analogue, analytical and digital stages, are summarized. The author introduces the integration of GNSS, RS and GIS (3S), which forms the rise of geospatial informatics (Geomatics). The development of geo-spatial information science in the digital earth era is analyzed, and the latest progress of geo-spatial information science towards real-time intelligent service in the smart earth era is discussed. This paper focuses on the three development levels of "Internet plus" spatial information intelligent service. In the era of big data, the traditional geomatics will surely take advantage of the integration of communication, navigation, remote sensing, artificial intelligence, virtual reality and brain cognition science, and become geospatial intelligent service science, thereby making contributions to the national economy, defense and people's livelihood.

  2. Implementation of VGI-Based Geoportal for Empowering Citizen's Geospatial Observatories Related to Urban Disaster Management

    Science.gov (United States)

    Lee, Sanghoon

    2016-06-01

    Volunteered geospatial information (VGI) can be an efficient and cost-effective method for generating and sharing large disaster-related geospatial datasets. National mapping organizations, which have played the role of major geospatial data collectors, have been moving toward considering public-participation data collection methods. Because VGI can encourage public participation and empower citizens, mapping agencies could form partnerships with members of the VGI community to help provide well-structured geospatial data. So that the semantics, datasets and action model of the public-participation GeoPortal can be effectively understood and shared, the implemented VGI-GeoPortal was designed on the basis of ISO 19154, ISO 19101 and the OGC Reference Model. A proof of concept of the VGI-GeoPortal has been implemented for an urban-flooding use case in the Republic of Korea to collect disaster-related geospatial data from the public and analyze it, including high-disaster-potential information such as the locations of poorly draining sewers, early signs of landslides, the flooding vulnerability of urban structures, etc.

  3. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    Science.gov (United States)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility of the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run in the native Python environment. It uses functional-programming and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arcpy scales linearly in a distributed environment. Arc4nix is open-source software.
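
    The decoupled pattern described for arc4nix, generating Python code on the client and executing it remotely, can be sketched with exec() standing in for the remote server (the function names below are illustrative, not the arc4nix API):

```python
# Toy client-server code-generation pattern: the client serializes a call
# into a Python code string; the "server" executes it and returns the result.
# exec() stands in for the remote execution environment here; a real system
# would authenticate, sandbox, and transport the code over the network.

def build_task(function_name, *args):
    # Client side: serialize the call into a code string.
    arg_list = ", ".join(repr(a) for a in args)
    return f"result = {function_name}({arg_list})"

def server_execute(code, environment):
    # Server side: run the generated code and return its result.
    namespace = dict(environment)
    exec(code, namespace)
    return namespace["result"]

# A hypothetical "server-side" geoprocessing function (trivial area calc).
server_env = {"buffer_area": lambda radius: 3.14159 * radius ** 2}

task = build_task("buffer_area", 10)
print(task)  # result = buffer_area(10)
print(server_execute(task, server_env))
```

    Because only a code string and a result cross the client-server boundary, the client does not need the server's geospatial stack installed locally, which is the core of the decoupling idea.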

  4. Users Manual for the Geospatial Stream Flow Model (GeoSFM)

    Science.gov (United States)

    Artan, Guleid A.; Asante, Kwabena; Smith, Jodie; Pervez, Md Shahriar; Entenmann, Debbie; Verdin, James P.; Rowland, James

    2008-01-01

    The monitoring of wide-area hydrologic events requires the manipulation of large amounts of geospatial and time series data into concise information products that characterize the location and magnitude of the event. To perform these manipulations, scientists at the U.S. Geological Survey Center for Earth Resources Observation and Science (EROS), with the cooperation of the U.S. Agency for International Development, Office of Foreign Disaster Assistance (USAID/OFDA), have implemented a hydrologic modeling system. The system includes a data assimilation component to generate data for a Geospatial Stream Flow Model (GeoSFM) that can be run operationally to identify and map wide-area streamflow anomalies. GeoSFM integrates a geographical information system (GIS) for geospatial preprocessing and postprocessing tasks and hydrologic modeling routines implemented as dynamically linked libraries (DLLs) for time series manipulations. Model results include maps depicting the status of streamflow and soil water conditions. This Users Manual provides step-by-step instructions for running the model and for downloading and processing the input data required for initial model parameterization and daily operation.
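
    GeoSFM's anomaly maps rest on comparing current streamflow with the historical norm for a location. A minimal standardized-anomaly sketch (the data, units and threshold are illustrative, not GeoSFM's actual algorithm):

```python
import statistics

# Standardized streamflow anomaly: how many standard deviations the current
# flow sits from the historical mean for this location. Data and the z > 2
# flagging threshold are illustrative only.

historical_flow = [120.0, 135.0, 110.0, 128.0, 140.0, 122.0, 131.0]  # m^3/s
current_flow = 180.0  # m^3/s

mean = statistics.mean(historical_flow)
std = statistics.stdev(historical_flow)
anomaly = (current_flow - mean) / std

status = "high-flow anomaly" if anomaly > 2 else "normal range"
print(f"z = {anomaly:.2f} -> {status}")
```

    Computing this per grid cell and colour-coding the z values yields the kind of wide-area anomaly map the abstract describes.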

  5. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

    Full Text Available The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. There are several important research issues to be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid as an effective distributed computing paradigm is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer with the capability of high-performance computation. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid provides the utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic community and the industry community. Therefore, we propose an integrated framework by incorporating OWS standards into Grids. Upon this framework, distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.
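
    OWS interfaces achieve interoperability by standardizing geospatial queries as HTTP requests. A sketch of building a WFS 2.0 GetFeature request with key-value-pair encoding (the endpoint and feature type below are placeholders, not a real service):

```python
from urllib.parse import urlencode

# Build an OGC WFS 2.0 GetFeature request as a key-value-pair URL. Any
# OWS-compliant server can answer such a request, regardless of which
# software stack implements it. Endpoint and typeNames are placeholders.

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "ns:rivers",          # placeholder feature type
    "bbox": "23.5,44.0,29.7,48.3",     # lon/lat extent of interest
    "outputFormat": "application/gml+xml; version=3.2",
}
url = "https://example.org/geoserver/wfs?" + urlencode(params)
print(url)
```

    Because the request is just a standardized URL, a Grid middleware layer can forward, authenticate, or load-balance it without understanding the geospatial payload.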

  6. Increasing the value of geospatial informatics with open approaches for Big Data

    Science.gov (United States)

    Percivall, G.; Bermudez, L. E.

    2017-12-01

    Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in Testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases: Collection and Ingest: Remote sensed data processing; Data stream processing. Prepare and Structure: SQL and NoSQL databases; Data linking; Feature identification. Analytics and Visualization: Spatial-temporal analytics; Machine Learning; Data Exploration. Modeling and Prediction: Integrated environmental models; Urban 4D models. Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in Testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following: Open Cloud Computing: Avoid vendor lock-in through API interoperability and application portability. Open Source Extensions: Implement geospatial data representations in projects from Apache, LocationTech, and OSGeo. Investigate parallelization strategies for N-dimensional spatial data. Geospatial Data Representations: Schemas to improve processing and analysis using geospatial concepts: Features, Coverages, DGGS. Use geospatial encodings like NetCDF and GeoPackage. Big Linked Geodata: Use linked data methods scaled to big geodata. Analysis Ready Data: Support "Download as last resort" and "Analytics as a service". Promote elements common to "datacubes."
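
    A Discrete Global Grid System (DGGS), as used in the Arctic Spatial Data Pilot above, assigns every point on Earth a cell identifier at a chosen resolution. A toy equal-angle indexing sketch (real DGGS implementations use hierarchical equal-area cells; this shows only the indexing idea, and the coordinates are arbitrary):

```python
# Toy equal-angle grid index: map (lat, lon) to an integer cell id at a
# given resolution. Real DGGS use equal-area hierarchical cells; this only
# illustrates how cell ids let big geodata be partitioned and joined.

def cell_id(lat, lon, resolution):
    # resolution = number of cells along latitude; longitude gets twice as many.
    cell_deg = 180.0 / resolution
    row = min(resolution - 1, int((lat + 90.0) / cell_deg))
    col = min(2 * resolution - 1, int((lon + 180.0) / cell_deg))
    return row * 2 * resolution + col

# Nearby points share coarse cells and separate at finer resolutions.
p1, p2 = (46.5, 25.0), (47.1, 25.6)
print(cell_id(*p1, 4), cell_id(*p2, 4))      # same coarse cell
print(cell_id(*p1, 360), cell_id(*p2, 360))  # different fine cells
```

    Grouping observations by cell id is what makes DGGS useful for datacube-style analytics: records from different sources land in the same addressable bins.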

  7. A FRAMEWORK FOR AN OPEN SOURCE GEOSPATIAL CERTIFICATION MODEL

    Directory of Open Access Journals (Sweden)

    T. U. R. Khan

    2016-06-01

    Full Text Available The geospatial industry is forecast to see enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors such as Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" curriculum, currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of certification. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and

  8. A Framework for an Open Source Geospatial Certification Model

    Science.gov (United States)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to see enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors such as Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" curriculum, currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of certification. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  9. Visualization and Ontology of Geospatial Intelligence

    Science.gov (United States)

    Chan, Yupo

    Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced in the prevalence of location-based services, as afforded by the ubiquitous cell phone usage. It is also manifested by the popularity of such internet engines as Google Earth. As we commute to work, travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.

  10. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    Science.gov (United States)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic under a distributed computing environment. In order to make use of these local resources together to solve larger geospatial information processing problems that are related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a term named as Equivalent Distributed Program of global geospatial queries to solve geospatial distributed computing problems under heterogeneous GIS environments. First, a geospatial query process schema for distributed computing as well as a method for equivalent transformation from a global geospatial query to distributed local queries at SQL (Structured Query Language) level to solve the coordinating problem among heterogeneous resources are presented. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment that consists of autonomous geospatial information resources, thus to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
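
The equivalent transformation described above, where a global query is rewritten into local queries executed at each peer and the partial results are merged, can be sketched as follows. This is a minimal illustration, not the paper's mechanism: the peers, the `roads` schema, and the sample rows are invented, and in-memory SQLite databases stand in for heterogeneous, autonomous GIS nodes.

```python
import sqlite3

def make_peer(rows):
    """Create an in-memory database standing in for one autonomous GIS node."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE roads (name TEXT, length_km REAL)")
    db.executemany("INSERT INTO roads VALUES (?, ?)", rows)
    return db

peers = [
    make_peer([("A1", 12.5), ("B2", 3.0)]),
    make_peer([("C3", 7.2)]),
]

def distributed_query(local_sql):
    """Equivalent Distributed Program idea: run the same local query on every
    peer and merge the partial results into one global answer."""
    merged = []
    for db in peers:
        merged.extend(db.execute(local_sql).fetchall())
    return merged

result = distributed_query("SELECT name, length_km FROM roads WHERE length_km > 5")
print(sorted(result))
```

A real system would additionally rewrite the global query per peer schema and coordinate transactions, which this sketch omits.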

  11. IMPLEMENTATION OF VGI-BASED GEOPORTAL FOR EMPOWERING CITIZEN’S GEOSPATIAL OBSERVATORIES RELATED TO URBAN DISASTER MANAGEMENT

    Directory of Open Access Journals (Sweden)

    S. Lee

    2016-06-01

    Full Text Available Volunteered geospatial information (VGI) will be an efficient and cost-effective method for generating and sharing large amounts of disaster-related geospatial data. National mapping organizations, which have traditionally played the role of major geospatial data collectors, have been moving toward considering public-participation data collection methods. Because VGI can encourage public participation and empower citizens, mapping agencies could form partnerships with members of the VGI community to help provide well-structured geospatial data. In order for the semantics, datasets and action model of the public participation GeoPortal to be effectively understood and shared, the implemented VGI-GeoPortal was designed on the basis of ISO 19154, ISO 19101 and the OGC Reference Model. A proof of concept of the VGI-GeoPortal has been implemented for an urban flooding use case in the Republic of Korea to collect from the public, and analyze, disaster-related geospatial data, including high-disaster-potential information such as the locations of poorly draining sewers, early signs of landslides, flooding vulnerability of urban structures, etc.

  12. Large-sized seaweed monitoring based on MODIS

    Science.gov (United States)

    Ma, Long; Li, Ying; Lan, Guo-xin; Li, Chuan-long

    2009-10-01

    In recent years, large-sized seaweed, such as Ulva lactuca, has bloomed frequently in coastal waters in China, threatening the marine eco-environment. In order to take effective measures, it is important to conduct operational surveillance. A case of large-sized seaweed blooming (i.e., Enteromorpha) that occurred in June 2008 in the sea near Qingdao city is studied. The seaweed bloom is dynamically monitored using the Moderate Resolution Imaging Spectroradiometer (MODIS). After analyzing the imaging spectral characteristics of Enteromorpha, MODIS bands 1 and 2 are used to create a band ratio algorithm for detecting and mapping large-sized seaweed blooms. In addition, chlorophyll-a concentration is retrieved based on an empirical model developed using MODIS. Chlorophyll-a concentration maps are derived using multitemporal MODIS data, and chlorophyll-a concentration change is analyzed. Results show that the presented methods are useful for obtaining the dynamic distribution and growth of large-sized seaweed, and can support contingency planning.
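
As a rough illustration of the band ratio idea (not the paper's calibrated algorithm), the sketch below flags pixels whose near-infrared to red reflectance ratio exceeds a threshold: floating vegetation is bright in MODIS band 2 (NIR) relative to band 1 (red), while open water is not. The threshold value and the toy reflectance grids are assumptions.

```python
def seaweed_mask(band1, band2, threshold=1.5):
    """Return a boolean grid marking pixels where band2/band1 exceeds threshold."""
    mask = []
    for row1, row2 in zip(band1, band2):
        mask.append([(r2 / r1) > threshold if r1 > 0 else False
                     for r1, r2 in zip(row1, row2)])
    return mask

# Toy 2x2 reflectance grids: water stays dark in the NIR, while floating
# vegetation brightens it, raising the band 2 / band 1 ratio.
band1 = [[0.05, 0.04], [0.06, 0.05]]  # MODIS band 1 (red, ~645 nm)
band2 = [[0.04, 0.09], [0.05, 0.12]]  # MODIS band 2 (NIR, ~859 nm)

mask = seaweed_mask(band1, band2)
print(mask)  # True marks candidate seaweed pixels
```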

  13. Economic Assessment of the Use Value of Geospatial Information

    Directory of Open Access Journals (Sweden)

    Richard Bernknopf

    2015-07-01

    Full Text Available Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.

  14. Economic assessment of the use value of geospatial information

    Science.gov (United States)

    Bernknopf, Richard L.; Shapiro, Carl D.

    2015-01-01

    Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
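
The VOI definition above can be made concrete with a toy decision problem. In this hedged sketch the hazard probability and payoff figures are invented; it only demonstrates the definition: VOI is the expected net benefit of deciding with the information minus that of deciding without it.

```python
def expected_net_benefit(action, p_hazard, payoffs):
    """Expected present-value net benefit of an action under hazard uncertainty."""
    return (p_hazard * payoffs[(action, "hazard")]
            + (1 - p_hazard) * payoffs[(action, "no_hazard")])

payoffs = {  # illustrative net benefits in $M for each (action, state)
    ("mitigate", "hazard"): -10, ("mitigate", "no_hazard"): -10,
    ("ignore", "hazard"): -100, ("ignore", "no_hazard"): 0,
}
p = 0.2  # assumed prior probability of an earthquake-type hazard

# Without information: commit to the single best action under the prior.
without = max(expected_net_benefit(a, p, payoffs) for a in ("mitigate", "ignore"))

# With (perfect) geospatial information: choose the best action in each state.
with_info = (p * max(payoffs[("mitigate", "hazard")], payoffs[("ignore", "hazard")])
             + (1 - p) * max(payoffs[("mitigate", "no_hazard")], payoffs[("ignore", "no_hazard")]))

voi = with_info - without
print(voi)  # value of the information, in the same $M units
```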

  15. NebHydro: Sharing Geospatial Data to Support Water Management in Nebraska

    Science.gov (United States)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydro variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux and Python based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers. The system provides online access, querying, visualization, and analysis of the hydrological data from several sources

  16. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    Directory of Open Access Journals (Sweden)

    Shaoming Pan

    Full Text Available Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. However, many small geospatial image data files cannot be split further for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm achieves a total parallel access probability approximately 10-15% higher than those of other algorithms, and that performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.

  17. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    Science.gov (United States)

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. However, many small geospatial image data files cannot be split further for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm achieves a total parallel access probability approximately 10-15% higher than those of other algorithms, and that performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
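
The first step described above, constructing an access correlation matrix from the historical access log, might look like the following sketch. The log format (session id, block id) and the session-based notion of co-access are illustrative assumptions, not the paper's exact definitions.

```python
from collections import defaultdict
from itertools import combinations

log = [  # hypothetical access log: (session_id, image_block_id)
    ("s1", "tile_a"), ("s1", "tile_b"),
    ("s2", "tile_a"), ("s2", "tile_b"), ("s2", "tile_c"),
    ("s3", "tile_c"),
]

def access_correlation(log):
    """Count how often each pair of blocks is accessed in the same session."""
    sessions = defaultdict(set)
    for sid, block in log:
        sessions[sid].add(block)
    matrix = defaultdict(int)
    for blocks in sessions.values():
        for a, b in combinations(sorted(blocks), 2):
            matrix[(a, b)] += 1
    return dict(matrix)

corr = access_correlation(log)
print(corr)
# A placement heuristic would then put highly correlated blocks on
# *different* storage nodes so a typical query can read them in parallel.
```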

  18. Geospatial Information is the Cornerstone of Effective Hazards Response

    Science.gov (United States)

    Newell, Mark

    2008-01-01

    Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders, and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS. 
The GIRT

  19. A Python Geospatial Language Toolkit

    Science.gov (United States)

    Fillmore, D.; Pletzer, A.; Galloy, M.

    2012-12-01

    The volume and scope of geospatial data archives, such as collections of satellite remote sensing or climate model products, has been rapidly increasing and will continue to do so in the near future. The recently launched (October 2011) Suomi National Polar-orbiting Partnership satellite (NPP) for instance, is the first of a new generation of Earth observation platforms that will monitor the atmosphere, oceans, and ecosystems, and its suite of instruments will generate several terabytes each day in the form of multi-spectral images and derived datasets. Full exploitation of such data for scientific analysis and decision support applications has become a major computational challenge. Geophysical data exploration and knowledge discovery could benefit, in particular, from intelligent mechanisms for extracting and manipulating subsets of data relevant to the problem of interest. Potential developments include enhanced support for natural language queries and directives to geospatial datasets. The translation of natural language (that is, human spoken or written phrases) into complex but unambiguous objects and actions can be based on a context, or knowledge domain, that represents the underlying geospatial concepts. This poster describes a prototype Python module that maps English phrases onto basic geospatial objects and operations. This module, along with the associated computational geometry methods, enables the resolution of natural language directives that include geographic regions of arbitrary shape and complexity.
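
A toy version of such a mapping from a restricted English phrase to a basic geospatial operation might look like this. The grammar, the gazetteer entries and coordinates, and the resulting query object are invented for illustration and do not reproduce the module described in the poster.

```python
import re

# Hypothetical gazetteer: place name -> (lat, lon)
GAZETTEER = {"boulder": (40.01, -105.27), "denver": (39.74, -104.99)}

def parse(phrase):
    """Parse 'within <d> km of <place>' into a buffer-style query object."""
    m = re.match(r"within (\d+) km of (\w+)", phrase.lower())
    if not m:
        raise ValueError("unsupported phrase")
    radius_km, place = int(m.group(1)), m.group(2)
    return {"op": "buffer", "center": GAZETTEER[place], "radius_km": radius_km}

query = parse("within 50 km of Boulder")
print(query)
```

A real implementation would back this with a knowledge domain of geospatial concepts and computational geometry for arbitrarily shaped regions, rather than a regular expression.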

  20. Parallel Agent-as-a-Service (P-AaaS) Based Geospatial Service in the Cloud

    Directory of Open Access Journals (Sweden)

    Xicheng Tan

    2017-04-01

    Full Text Available To optimize the efficiency of the geospatial service in the flood response decision making system, a Parallel Agent-as-a-Service (P-AaaS) method is proposed and implemented in the cloud. The prototype system and comparisons demonstrate the advantages of our approach over existing methods. The P-AaaS method includes both parallel architecture and a mechanism for adjusting the computational resources: the parallel geocomputing mechanism of the P-AaaS method used to execute a geospatial service and the execution algorithm of the P-AaaS based geospatial service chain, respectively. The P-AaaS based method has the following merits: (1) it inherits the advantages of the AaaS-based method (i.e., avoiding transfer of large volumes of remote sensing data or raster terrain data, agent migration, and intelligent conversion into services to improve domain expert collaboration); (2) it optimizes the low performance and the concurrent geoprocessing capability of the AaaS-based method, which is critical for special applications (e.g., highly concurrent applications and emergency response applications); and (3) it adjusts the computing resources dynamically according to the number and the performance requirements of concurrent requests, which allows the geospatial service chain to support a large number of concurrent requests by scaling up the cloud-based clusters in use and optimizes computing resources and costs by reducing the number of virtual machines (VMs) when the number of requests decreases.
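
The dynamic resource adjustment in point (3) can be illustrated with a simple sizing rule: scale the number of VMs with the number of concurrent requests, bounded below by a minimum footprint and above by a configured cap. The per-VM capacity and the bounds here are assumptions, not values from the paper.

```python
import math

def vms_needed(concurrent_requests, requests_per_vm=20, min_vms=1, max_vms=50):
    """Return how many VMs the geospatial service chain should run."""
    needed = math.ceil(concurrent_requests / requests_per_vm)
    return max(min_vms, min(max_vms, needed))

print(vms_needed(0))     # idle: keep the minimum footprint
print(vms_needed(95))    # burst: scale the cluster up
print(vms_needed(5000))  # capped at the configured maximum
```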

  1. Geospatial environmental data modelling applications using remote sensing, GIS and spatial statistics

    Energy Technology Data Exchange (ETDEWEB)

    Siljander, M.

    2010-07-01

    This thesis presents novel modelling applications for environmental geospatial data using remote sensing, GIS and statistical modelling techniques. The studied themes can be classified into four main themes: (i) developing advanced geospatial databases. Paper (I) demonstrates the creation of a geospatial database for the Glanville fritillary butterfly (Melitaea cinxia) in the Åland Islands, south-western Finland; (ii) analysing species diversity and distribution using GIS techniques. Paper (II) presents a diversity and geographical distribution analysis for Scopulini moths at a world-wide scale; (iii) studying spatiotemporal forest cover change. Paper (III) presents a study of exotic and indigenous tree cover change detection in the Taita Hills, Kenya, using airborne imagery and GIS analysis techniques; (iv) exploring predictive modelling techniques using geospatial data. In Paper (IV) human population occurrence and abundance in the Taita Hills highlands was predicted using the generalized additive modelling (GAM) technique. Paper (V) presents techniques to enhance fire prediction and burned area estimation at a regional scale in East Caprivi, Namibia. Paper (VI) compares eight state-of-the-art predictive modelling methods to improve fire prediction, burned area estimation and fire risk mapping in East Caprivi, Namibia. The results in Paper (I) showed that geospatial data can be managed effectively using advanced relational database management systems. Metapopulation data for the Melitaea cinxia butterfly were successfully combined with GPS-delimited habitat patch information and climatic data. Using the geospatial database, spatial analyses were successfully conducted at the habitat patch level or at coarser analysis scales. Moreover, this study showed that spatially correlated weather conditions at a large scale appear to be one of the primary causes of spatially correlated changes in Melitaea cinxia population sizes. 
In Paper (II) spatiotemporal characteristics

  2. Integration of Geospatial Science in Teacher Education

    Science.gov (United States)

    Hauselt, Peggy; Helzer, Jennifer

    2012-01-01

    One of the primary missions of our university is to train future primary and secondary teachers. Geospatial sciences, including GIS, have long been excluded from teacher education curriculum. This article explains the curriculum revisions undertaken to increase the geospatial technology education of future teachers. A general education class…

  3. Geospatial Data Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...

  4. Large litter sizes

    DEFF Research Database (Denmark)

    Sandøe, Peter; Rutherford, K.M.D.; Berg, Peer

    2012-01-01

    This paper presents some key results and conclusions from a review (Rutherford et al. 2011) undertaken regarding the ethical and welfare implications of breeding for large litter size in the domestic pig and about different ways of dealing with these implications. Focus is primarily on the direct...... possible to achieve a drop in relative piglet mortality and the related welfare problems. However, there will be a growing problem with the need to use foster or nurse sows which may have negative effects on both sows and piglets. This gives rise to new challenges for management....

  5. Bridging the Gap Between Surveyors and the Geo-Spatial Society

    Science.gov (United States)

    Müller, H.

    2016-06-01

    For many years FIG, the International Federation of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, and with the geospatial industries in particular. Traditionally the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on an in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors today are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. This list of professional tasks underpins the capability of surveyors to contribute to high-quality geospatial data and information management. In that way modern surveyors support the needs of a geo-spatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.

  6. Automatic geospatial information Web service composition based on ontology interface matching

    Science.gov (United States)

    Xu, Xianbin; Wu, Qunyong; Wang, Qinmin

    2008-10-01

    With Web services technology, the functions of WebGIS can be presented as a kind of geospatial information service, helping to overcome the isolation of information in the geospatial information sharing field. Geospatial information Web service composition, which conglomerates outsourced services working in tandem to offer value-added services, therefore plays a key role in fully exploiting geospatial information services. This paper proposes an automatic geospatial information web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among service interfaces. By matching input/output parameters and the semantic meaning of pairs of service interfaces, a geospatial information web service chain can be created from a number of candidate services. A practical application of the algorithm is also presented, and its results show the feasibility of the algorithm and its great promise for the emerging demand for geospatial information web service composition.
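
A minimal sketch of composition by interface matching is shown below. In place of the WordNet semantic distances used by the algorithm, a hand-coded synonym table stands in for the semantic comparison, and the service names and interface terms are invented.

```python
# Stand-in for WordNet-based similarity: terms in the same set are synonyms.
SYNONYMS = {"dem": {"dem", "elevation_model"}, "slope": {"slope", "gradient"}}

def compatible(output_term, input_term):
    """Two interface terms match if any synonym set contains both."""
    if output_term == input_term:
        return True
    return any(output_term in s and input_term in s for s in SYNONYMS.values())

services = [  # (name, input term, output term)
    ("TerrainLoader", "region", "dem"),
    ("SlopeAnalysis", "elevation_model", "gradient"),
    ("RiskMapper", "slope", "risk_map"),
]

def compose_chain(services):
    """Greedily link services whose output matches the next service's input."""
    chain = [services[0]]
    for candidate in services[1:]:
        if compatible(chain[-1][2], candidate[1]):
            chain.append(candidate)
    return [name for name, _, _ in chain]

chain_names = compose_chain(services)
print(chain_names)
```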

  7. Mapping a Difference: The Power of Geospatial Visualization

    Science.gov (United States)

    Kolvoord, B.

    2015-12-01

    Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students that want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GST's to hone their spatial thinking skills and to do extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of the use of these technologies on students' spatial thinking skills, and discuss the success and challenges of developing K-12 partnerships centered on geospatial visualization.

  8. Activity-Based Intelligence: predicting the future by observing the present with Hexagon Geospatial tools

    Directory of Open Access Journals (Sweden)

    Massimo Zotti

    2015-06-01

    Full Text Available The intelligence of human activities on the earth's surface, obtained through the analysis of earth observation data and other geospatial information, is vital for the planning and execution of any military action, for peacekeeping and for humanitarian emergencies. The success of these actions largely depends on the ability to analyze timely data from multiple sources. However, the proliferation of new sources of intelligence in a geospatial big data scenario increasingly complicates the analysis of such activities by human analysts. Modern technologies solve these problems by enabling Activity Based Intelligence, a methodology that improves the efficiency and timeliness of intelligence through the analysis of historical, current and future activity, to identify patterns, trends and relationships hidden in large data collections from different sources.

  9. Geospatial-temporal semantic graph representations of trajectories from remote sensing and geolocation data

    Science.gov (United States)

    Perkins, David Nikolaus; Brost, Randolph; Ray, Lawrence P.

    2017-08-08

    Various technologies for facilitating analysis of large remote sensing and geolocation datasets to identify features of interest are described herein. A search query can be submitted to a computing system that executes searches over a geospatial temporal semantic (GTS) graph to identify features of interest. The GTS graph comprises nodes corresponding to objects described in the remote sensing and geolocation datasets, and edges that indicate geospatial or temporal relationships between pairs of nodes in the nodes. Trajectory information is encoded in the GTS graph by the inclusion of movable nodes to facilitate searches for features of interest in the datasets relative to moving objects such as vehicles.
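The data structure described in this record can be sketched minimally as follows. This is an illustrative Python sketch only, with hypothetical node identifiers, attribute names and relation labels (none are taken from the patent itself): nodes carry a kind, an observation time and a position, and undirected labeled edges encode spatial relationships.

```python
# Minimal sketch of a geospatial-temporal semantic (GTS) graph:
# nodes represent observed objects, undirected edges carry a spatial
# relationship label, and each node records when it was observed.
from collections import defaultdict

class GTSGraph:
    def __init__(self):
        self.nodes = {}                 # node id -> attributes
        self.edges = defaultdict(dict)  # node id -> {neighbor: relation}

    def add_node(self, node_id, kind, time, position):
        self.nodes[node_id] = {"kind": kind, "time": time, "pos": position}

    def add_edge(self, a, b, relation):
        # Undirected edge: store the relation in both directions.
        self.edges[a][b] = relation
        self.edges[b][a] = relation

    def neighbors(self, node_id, relation=None):
        """Nodes adjacent to node_id, optionally filtered by relation type."""
        return [n for n, r in self.edges[node_id].items()
                if relation is None or r == relation]

# A movable node (a vehicle) observed adjacent to a fixed building.
g = GTSGraph()
g.add_node("building-1", "building", time=0, position=(35.0, -106.6))
g.add_node("vehicle-7", "vehicle", time=1, position=(35.0, -106.6))
g.add_edge("building-1", "vehicle-7", "adjacent")
print(g.neighbors("building-1", relation="adjacent"))  # ['vehicle-7']
```

A query over such a graph then reduces to traversing edges whose relation labels match the search criteria.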

  10. A Javascript GIS Platform Based on Invocable Geospatial Web Services

    Directory of Open Access Journals (Sweden)

    Konstantinos Evangelidis

    2018-04-01

    Full Text Available Semantic Web technologies have been increasingly adopted by the geospatial community over the last decade through the utilization of open standards for expressing and serving geospatial data. This adoption was also dramatically assisted by the ever-increasing access and usage of geographic mapping and location-based services via smart devices in people’s daily activities. In this paper, we explore the developmental framework of a pure JavaScript client-side GIS platform exclusively based on invocable geospatial Web services. We also extend JavaScript utilization to the server side by deploying a node server acting as a bridge between open source WPS libraries and popular geoprocessing engines. The vehicle for such an exploration is a cross-platform Web browser capable of interpreting JavaScript commands to achieve interaction with geospatial providers. The tool is a generic Web interface providing capabilities for acquiring spatial datasets, composing layouts and applying geospatial processes. Ideally, the end-user identifies the services that satisfy a geo-related need and arranges them in the appropriate sequence. The final output may act as a potential collector of freely available geospatial web services. Its server-side components may exploit geospatial processing suppliers, thus composing a lightweight, fully transparent open Web GIS platform.

  11. GeoSpatial Data Analysis for DHS Programs

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, Eric G.; Burke, John S.; Carlson, Carrie A.; Gillen, David S.; Joslyn, Cliff A.; Olsen, Bryan K.; Critchlow, Terence J.

    2009-05-10

    Law enforcement within the Department of Homeland Security faces the continual challenge of analyzing its custom data sources in a geospatial context. From a strategic perspective, law enforcement must first broadly characterize a given situation using these custom data sources and then, once the situation is summarily understood, analyze the data geospatially in detail.

  12. Automated geospatial Web Services composition based on geodata quality requirements

    Science.gov (United States)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
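The rule-based quality requirements mentioned in this abstract can be illustrated with a small sketch. This is an assumption-laden toy example, not the authors' system: the attribute names, rule format and thresholds are invented here purely to show how a planner might gate service chaining on declared geodata-quality rules.

```python
# Illustrative sketch of rule-based geodata-quality preconditions:
# a composition step is admitted only when the candidate dataset
# satisfies every declared quality rule.
def satisfies(dataset, rule):
    """A quality rule is a (attribute, operator, threshold) triple."""
    attr, op, threshold = rule
    value = dataset[attr]
    return value <= threshold if op == "<=" else value >= threshold

# Hypothetical dataset metadata and quality rules.
dataset = {"resolution_m": 30, "cloud_cover": 0.1}
rules = [("resolution_m", "<=", 50), ("cloud_cover", "<=", 0.2)]
print(all(satisfies(dataset, r) for r in rules))  # True
```

A conditional planner extends this idea by branching on which rules may fail at execution time, rather than assuming all inputs conform.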

  13. Capacity Building through Geospatial Education in Planning and School Curricula

    Science.gov (United States)

    Kumar, P.; Siddiqui, A.; Gupta, K.; Jain, S.; Krishna Murthy, Y. V. N.

    2014-11-01

    Geospatial technology has widespread usage in development planning and resource management. It offers pragmatic tools to help urban and regional planners to realize their goals. At the request of the Ministry of Urban Development, Govt. of India, the Indian Institute of Remote Sensing (IIRS), Dehradun has taken an initiative to study the model syllabi of the All India Council for Technical Education for planning curricula of Bachelor and Master (five disciplines) programmes. It is inferred that the geospatial content across the semesters in various planning fields needs revision. It is also realized that students pursuing planning curricula are invariably exposed to spatial mapping tools, but the popular digital drafting software have limitations for geospatial analysis of planning phenomena. Therefore, students need exposure to geospatial technologies to understand various real-world phenomena. Inputs were given to seamlessly merge and incorporate geospatial components throughout the semesters wherever relevant. Another initiative by IIRS was taken to enhance the understanding and essence of space and geospatial technologies amongst young minds at the 10+2 level. The content was proposed in a manner such that youngsters start realizing the innumerable contributions made by space and geospatial technologies in their day-to-day life. This effort, both at school and college level, would help not only in enhancing job opportunities for the young generation but also in utilizing the untapped human resource potential. In the era of smart cities, higher economic growth and aspirations for a better tomorrow, integration of geospatial technologies with conventional wisdom can no longer be ignored.

  14. Challenges in sharing of geospatial data by data custodians in South Africa

    Science.gov (United States)

    Kay, Sissiel E.

    2018-05-01

    As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, a key objective of which is to share geospatial data. The collection and maintenance of geospatial data is expensive and time-consuming, and so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not, what the reasons for this are. Through an international comparative assessment, it then determines whether compliance with the SDI Act is too onerous for the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.

  15. Lsiviewer 2.0 - a Client-Oriented Online Visualization Tool for Geospatial Vector Data

    Science.gov (United States)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualization systems have predominantly been applications that are installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model for data handling, rendering and visualization has been the most prevalent approach in Web-GIS. While client devices have become functionally more powerful over recent years, the above model has largely ignored them and remains in a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer - a simple, easy-to-use and robust online geospatial data visualization system for the user's own data that harnesses the client's capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system can support multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render vector data using LSIViewer is comparable to that of a desktop GIS application, QGIS, on an identical system.

  16. A resource-oriented architecture for a Geospatial Web

    Science.gov (United States)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    In this presentation we discuss some architectural issues on the design of an architecture for a Geospatial Web, that is an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space, has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as the geospatial information, that is information characterized by spatial/temporal reference. To this aim an investigation on the nature of the Web and on the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web in a generic Distributed Computing Infrastructure. While these efforts were definitely successful enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit in the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposes a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies, as in the way they are used. Maintaining the Web architecture conform to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. 

  17. Global polar geospatial information service retrieval based on search engine and ontology reasoning

    Science.gov (United States)

    Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang

    2007-01-01

    In order to improve the access precision of polar geospatial information services on the web, a new methodology for retrieving global spatial information services based on geospatial service search and ontology reasoning is proposed. The geospatial service search finds coarse candidate services on the web, while ontology reasoning refines these coarse results. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include service discovery based on a search engine, and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.

  18. Integrating Free and Open Source Solutions into Geospatial Science Education

    Directory of Open Access Journals (Sweden)

    Vaclav Petras

    2015-06-01

    Full Text Available While free and open source software becomes increasingly important in geospatial research and industry, open science perspectives are generally less reflected in universities’ educational programs. We present an example of how free and open source software can be incorporated into geospatial education to promote open and reproducible science. Since 2008, graduate students at North Carolina State University have had the opportunity to take a course on geospatial modeling and analysis that is taught with both proprietary and free and open source software. In this course, students perform geospatial tasks simultaneously in the proprietary package ArcGIS and the free and open source package GRASS GIS. By ensuring that students learn to distinguish between geospatial concepts and software specifics, students become more flexible and stronger spatial thinkers when choosing solutions for their independent work in the future. We also discuss ways to continually update and improve our publicly available teaching materials for reuse by teachers, self-learners and other members of the GIS community. Only when free and open source software is fully integrated into geospatial education will we be able to encourage a culture of openness and, thus, enable greater reproducibility in research and development applications.

  19. Geospatial Absorption and Regional Effects

    Directory of Open Access Journals (Sweden)

    IOAN MAC

    2009-01-01

    Full Text Available Geospatial absorptions are characterized by a specific complexity both in content and in their phenomenological and spatial fields of manifestation. Such processes are differentiated, according to their specificity, into pre-absorption, absorption and post-absorption. The mechanisms that contribute to absorption are extremely numerous: aggregation, extension, diffusion, substitution, resistivity (resilience), stratification, borrowings, etc. Frequent relations are established between these mechanisms, amplifying the process and its regional effects. The installation of the geographic osmosis phenomenon in a given territory (a place, for example) leads to a homogenization of the geospatial state and to the installation of regional homogeneity.

  20. Biosecurity and geospatial analysis of mycoplasma infections in ...

    African Journals Online (AJOL)

    Geospatial database of farm locations and biosecurity measures are essential to control disease outbreaks. A study was conducted to establish geospatial database on poultry farms in Al-Jabal Al-Gharbi region of Libya, to evaluate the biosecurity level of each farm and to determine the seroprevalence of mycoplasma and ...

  1. Searches over graphs representing geospatial-temporal remote sensing data

    Science.gov (United States)

    Brost, Randolph; Perkins, David Nikolaus

    2018-03-06

    Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Geospatial-temporal graph searches are made computationally efficient by taking advantage of characteristics of geospatial-temporal data in remote sensing images through the application of various graph search techniques.
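The search procedure this record describes can be sketched as a labeled-graph traversal. The following Python sketch is illustrative only (the node kinds, edge lists and hop limit are invented for the example, not taken from the patent): a breadth-first search collects nodes of a target kind reachable from a start node within a bounded number of hops.

```python
# Illustrative breadth-first search over a geospatial-temporal graph:
# find nodes of a given kind reachable from a start node within max_hops.
from collections import deque

def search(graph, start, target_kind, kinds, max_hops=3):
    """graph: node -> list of neighbors; kinds: node -> kind label."""
    seen, found = {start}, []
    queue = deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if kinds.get(node) == target_kind:
            found.append(node)
        if hops < max_hops:
            for neighbor in graph.get(node, []):
                if neighbor not in seen:
                    seen.add(neighbor)
                    queue.append((neighbor, hops + 1))
    return found

# Toy scene: a crater node connected to two observation nodes.
edges = {"crater-1": ["obs-a", "obs-b"], "obs-a": ["crater-1"], "obs-b": ["crater-1"]}
kinds = {"crater-1": "crater", "obs-a": "vehicle", "obs-b": "building"}
print(search(edges, "crater-1", "vehicle", kinds))  # ['obs-a']
```

The efficiency techniques the record alludes to amount to pruning such traversals early using the spatial and temporal structure of the data.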

  2. The National 3-D Geospatial Information Web-Based Service of Korea

    Science.gov (United States)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis tools and be able to exploit all capabilities of the third dimension, including visualization. Currently, many human activities make steps toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, military applications, etc. To reflect this trend, the Korean government has started to construct a 3D geospatial data and service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led web-based 3D geospatial information service for people who are interested in this industry; we introduce not only the present state of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data has been constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, level of detail (LOD) 4 data, i.e., photo-realistic textured 3D models including corresponding ortho photographs, were constructed for six metropolitan cities and Dokdo (an island belonging to Korea) in 2012. In this paper, we present the composition and infrastructure of the web-based 3D map service system, along with a comparison of V-World with the Google Earth service. We also present Open API-based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent an invasion of privacy, we applied image blurring, elimination and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea.

  3. Improving the Slum Planning Through Geospatial Decision Support System

    Science.gov (United States)

    Shekhar, S.

    2014-11-01

    In India, a number of schemes and programmes have been launched from time to time in order to promote integrated city development and to enable slum dwellers to gain access to basic services. Despite the use of geospatial technologies in planning, the local, state and central governments have been only partially successful in dealing with these problems. The study of existing policies and programmes also proved that when the government is the sole provider or mediator, GIS can become a tool of coercion rather than of participatory decision-making. It has also been observed that local-level administrators who have adopted geospatial technology for local planning continue to base decision-making on existing political processes. At this juncture, a geospatial decision support system (GSDSS) can provide a framework for integrating database management systems with analytical models, graphical display, tabular reporting capabilities and the expert knowledge of decision makers. This assists decision-makers in generating and evaluating alternative solutions to spatial problems. During this process, decision-makers undertake a process of decision research, producing a large number of possible decision alternatives, and provide opportunities to involve the community in decision making. The objective is to help decision makers and planners to find solutions through a quantitative spatial evaluation and verification process. The study investigates the options for slum development in the formal framework of RAY (Rajiv Awas Yojana), an ambitious programme of the Indian Government for slum development. The software modules for realizing the GSDSS were developed using ArcGIS and CommunityViz software for Gulbarga city.

  4. Revelation of `Hidden' Balinese Geospatial Heritage on A Map

    Science.gov (United States)

    Soeria Atmadja, Dicky A. S.; Wikantika, Ketut; Budi Harto, Agung; Putra, Daffa Gifary M.

    2018-05-01

    Bali is not just about beautiful nature. It also has a unique and interesting cultural heritage, including a `hidden' geospatial heritage. Tri Hita Karana is a Hindu concept of life consisting of the human relations to God, to other humans and to nature (Parahiyangan, Pawongan and Palemahan). Based on it, in terms of geospatial aspects, the Balinese derived their spatial orientation, spatial planning and layout, and measurement, as well as color and typography. Introducing this particular heritage would be a very interesting contribution to Bali tourism. In response to these issues, a question arises on how to reveal this unique and highly valuable geospatial heritage on a map, which can be used to introduce and disseminate it to tourists. Symbols (patterns and colors), orientation, distance, scale, layout and toponymy are well known as elements of a map. There is a chance to apply Balinese geospatial heritage in representing these map elements.

  5. Interoperability in planetary research for geospatial data analysis

    Science.gov (United States)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.

  6. Bim and Gis: when Parametric Modeling Meets Geospatial Data

    Science.gov (United States)

    Barazzetti, L.; Banfi, F.

    2017-12-01

    Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software are able to perform advanced geospatial analyses, but they lack several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings have limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, notwithstanding research work is currently under development to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by "pure" GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  7. BIM AND GIS: WHEN PARAMETRIC MODELING MEETS GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2017-12-01

    Full Text Available Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software are able to perform advanced geospatial analyses, but they lack several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings have limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, notwithstanding research work is currently under development to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by “pure” GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  8. GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.

    Science.gov (United States)

    Liang, Steve H L; Huang, Chih-Yuan

    2013-10-02

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstrations of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  9. A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs

    Science.gov (United States)

    Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav

    2015-01-01

    With the increasing abundance of technologies and smart devices, equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has now become effortless. This, in particular, is the case for photographs, with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the “right” information for the appropriate purpose. This paper describes an approach to create semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this, we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and an implementation for searching and querying the semantic geospatial metadata repository to enable a user or third-party system to find the information they are looking for. PMID:26205265
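The enrichment step this record describes, attaching meaningful place information to a photo's raw GPS coordinates, can be sketched as follows. This is a toy illustration, not the authors' system: the gazetteer entries and field names are invented, and a real implementation would query external geospatial data sources rather than a hard-coded list.

```python
# Minimal sketch of geospatial semantic enrichment: attach the nearest
# named place to a photo's raw GPS metadata, using the haversine
# formula to approximate great-circle distance.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical gazetteer of named places (name, lat, lon).
gazetteer = [("Belfast City Hall", 54.5964, -5.9301),
             ("Titanic Belfast", 54.6079, -5.9098)]

def enrich(photo):
    """Return the photo metadata with the nearest place name attached."""
    name, *_ = min(gazetteer,
                   key=lambda p: haversine_km(photo["lat"], photo["lon"], p[1], p[2]))
    return {**photo, "place": name}

photo = {"lat": 54.5960, "lon": -5.9300}
print(enrich(photo)["place"])  # Belfast City Hall
```

Once a place name is attached, queries such as "photos near Belfast City Hall" become simple metadata lookups instead of raw coordinate comparisons.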

  10. Geospatial Information Service System Based on GeoSOT Grid & Encoding

    Directory of Open Access Journals (Sweden)

    LI Shizhong

    2016-12-01

    Full Text Available With the rapid development of space and earth observation technology, it is important to establish a multi-source, multi-scale and unified cross-platform reference for global data. In practice, the production and maintenance of geospatial data are scattered across different units, and the standard of the data grid varies between departments and systems. All this brings about a disunity of standards among different historical periods and organizations. For the geospatial information security library of the national high-resolution earth observation program, there are demands for global display, associated retrieval, template applications and other integrated services for geospatial data. Based on the GeoSOT grid and encoding theory system, a data subdivision and organization solution for managing the geospatial information security library with globally unified grid encoding has been proposed, and system-level analyses, research and designs have been carried out. The experimental results show that the data organization and management method based on GeoSOT can significantly improve the overall efficiency of the geospatial information security service system.
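The core idea behind grid encoding schemes of this kind can be shown with a simplified sketch. The following is an illustration of the quadtree-style subdivision principle only, not the actual GeoSOT-2012 specification (GeoSOT uses its own extended 512 degree space and encoding rules): each level splits the current cell into four and appends one quadrant digit to the code.

```python
# Simplified quadtree-style grid code: each level halves the cell in
# latitude and longitude and appends a quadrant digit (0-3) to the code.
def grid_code(lat, lon, levels=8):
    south, north, west, east = -90.0, 90.0, -180.0, 180.0
    code = ""
    for _ in range(levels):
        mid_lat, mid_lon = (south + north) / 2, (west + east) / 2
        quadrant = 0
        if lat >= mid_lat:
            south = mid_lat
            quadrant += 2
        else:
            north = mid_lat
        if lon >= mid_lon:
            west = mid_lon
            quadrant += 1
        else:
            east = mid_lon
        code += str(quadrant)
    return code

# Nearby points share a code prefix, which is what makes such codes
# useful as a unified multi-scale index key across data sources.
print(grid_code(39.90, 116.40)[:4] == grid_code(39.91, 116.41)[:4])  # True
```

Storing and querying data by such codes lets heterogeneous datasets be co-indexed at any resolution simply by truncating the code to the desired level.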

  11. BPELPower—A BPEL execution engine for geospatial web services

    Science.gov (United States)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios are discussed in detail to demonstrate the capabilities of BPELPower. The study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broad Web paradigms.

  12. Restful Implementation of Catalogue Service for Geospatial Data Provenance

    Science.gov (United States)

    Jiang, L. C.; Yue, P.; Lu, X. C.

    2013-10-01

    Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance can be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending the ebXML Registry Information Model (ebRIM). Recent advances in the REpresentational State Transfer (REST) paradigm have shown great promise for the easy integration of distributed resources. RESTful Web services aim to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach to the provenance catalogue service can be improved by adopting a RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following the RESTful architectural style. A middleware named REST Converter is added on top of the legacy catalogue service to support a RESTful interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.
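
The paper's REST Converter pairs a resource request dispatcher with resource handlers. A minimal sketch of that dispatch idea (the route pattern and handler below are hypothetical, not the service's actual six resources):

```python
import re

class RestDispatcher:
    """Minimal REST dispatcher: maps (method, path) to a resource handler."""
    def __init__(self):
        self.routes = []

    def route(self, method, pattern):
        # Turn "/thing/{id}" into a named-group regex.
        regex = re.compile("^" + re.sub(r"\{(\w+)\}", r"(?P<\1>[^/]+)", pattern) + "$")
        def register(handler):
            self.routes.append((method, regex, handler))
            return handler
        return register

    def dispatch(self, method, path):
        for m, regex, handler in self.routes:
            match = regex.match(path)
            if m == method and match:
                return handler(**match.groupdict())
        return {"status": 404}

api = RestDispatcher()

@api.route("GET", "/provenance/{dataset_id}")   # hypothetical resource URL
def get_provenance(dataset_id):
    return {"status": 200, "lineage": f"sources of {dataset_id}"}
```

In the actual middleware, handlers like this would translate RESTful requests into the legacy CSW's XML operations and back.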

  13. An alternative method for determining particle-size distribution of forest road aggregate and soil with large-sized particles

    Science.gov (United States)

    Hakjun Rhee; Randy B. Foltz; James L. Fridley; Finn Krogstad; Deborah S. Page-Dumroese

    2014-01-01

    Measurement of particle-size distribution (PSD) of soil with large-sized particles (e.g., 25.4 mm diameter) requires a large sample and numerous particle-size analyses (PSAs). A new method is needed that would reduce time, effort, and cost for PSAs of the soil and aggregate material with large-sized particles. We evaluated a nested method for sampling and PSA by...
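
For context, a particle-size analysis typically reduces to computing cumulative percent passing from the mass retained on each sieve. The sketch below shows that generic computation only; it is not the nested sampling method the abstract proposes, and the pan fraction is folded into the finest sieve here:

```python
def percent_passing(sieve_sizes_mm, mass_retained_g):
    """Cumulative percent passing for each sieve.

    sieve_sizes_mm: sieve openings in descending order (coarsest first).
    mass_retained_g: mass retained on each sieve, same order, pan included
    in the last entry for this simplified sketch.
    """
    total = sum(mass_retained_g)
    passing, cumulative = {}, 0.0
    for size, mass in zip(sieve_sizes_mm, mass_retained_g):
        cumulative += mass
        passing[size] = 100.0 * (1.0 - cumulative / total)
    return passing
```

With large-sized particles the practical difficulty is the sample mass needed for the coarse sieves, which is what motivates the nested approach in the record above.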

  14. SWOT analysis on National Common Geospatial Information Service Platform of China

    Science.gov (United States)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, the surveying and mapping of China is inevitably shifting from 4D product services to services centered on the NCGISPC (National Common Geospatial Information Service Platform of China). Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are increasingly obvious, owing to the emerging requirements of e-government construction, the remarkable development of IT technology and growing online geospatial service demands across lines of business. NCGISPC, which aims to provide authoritative online one-stop geospatial information services and APIs for further development to government, business and the public, is now the strategic core of the SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using a SWOT (Strength, Weakness, Opportunity and Threat) analysis and comparing it to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode for geospatial information in China, and will surely have a great impact not only on the construction of digital China, but also on the way everyone uses geospatial information services.

  15. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin

    2010-03-01

    Geospatial information systems provide an abundance of information for researchers and scientists. Unfortunately this type of data can usually only be analyzed a few megapixels at a time, giving researchers a very narrow view into these voluminous data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented levels expediting analysis, interrogation, and discovery. ©2010 IEEE.

  16. Modeling photovoltaic diffusion: an analysis of geospatial datasets

    International Nuclear Information System (INIS)

    Davidson, Carolyn; Drury, Easan; Lopez, Anthony; Elmore, Ryan; Margolis, Robert

    2014-01-01

    This study combines address-level residential photovoltaic (PV) adoption trends in California with several types of geospatial information (population demographics, housing characteristics, foreclosure rates, solar irradiance, vehicle ownership preferences, and others) to identify which subsets of geospatial information are the best predictors of historical PV adoption. Number of rooms, heating source and house age were key variables that had not been previously explored in the literature, but are consistent with the expected profile of a PV adopter. The strong relationships provided by foreclosure indicators and mortgage status have less of an intuitive connection to PV adoption, but may be highly correlated with characteristics inherent in PV adopters. Next, we explore how these predictive factors and model performance vary between different Investor Owned Utility (IOU) regions in California, and at different spatial scales. Results suggest that models trained with small subsets of geospatial information (five to eight variables) may provide similar explanatory power to models using hundreds of geospatial variables. Further, the predictive performance of models generally decreases at higher resolution, i.e., below the ZIP code level, since several geospatial variables with coarse native resolution become less useful for representing high-resolution variations in PV adoption trends. However, for California we find that model performance improves if parameters are trained at the regional IOU level rather than the state-wide level. We also find that models trained within one IOU region are generally representative of other IOU regions in CA, suggesting that a model trained with data from one state may be applicable in another state. (letter)
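
One simple way to screen candidate geospatial predictors, in the spirit of selecting small variable subsets, is to rank them by absolute correlation with adoption counts. The sketch below uses toy, hypothetical data and plain Pearson correlation; it is not the study's actual model:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def rank_predictors(features, adoption):
    """Rank candidate variables by |correlation| with PV adoption counts."""
    return sorted(features, key=lambda name: -abs(pearson(features[name], adoption)))

# Hypothetical toy data: three candidate predictors over five areas.
features = {
    "rooms":       [4, 5, 6, 7, 8],
    "house_age":   [60, 50, 40, 30, 20],
    "foreclosure": [9, 2, 7, 1, 8],
}
adoption = [2, 3, 5, 6, 8]
```

Real model selection would of course control for collinearity and spatial autocorrelation rather than rely on pairwise correlations alone.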

  17. Assessment of economically optimal water management and geospatial potential for large-scale water storage

    Science.gov (United States)

    Weerasinghe, Harshi; Schneider, Uwe A.

    2010-05-01

    Water is an essential but limited and vulnerable resource for all socio-economic development and for maintaining healthy ecosystems. Water scarcity, accelerated by population expansion, improved living standards, and rapid growth in economic activities, has profound environmental and social implications. These include severe environmental degradation, declining groundwater levels, and increasing water conflicts. Water scarcity is predicted to be one of the key factors limiting development in the 21st century. Climate scientists have projected spatial and temporal changes in precipitation and changes in the probability of intense floods and droughts in the future. As scarcity of accessible and usable water increases, demand for efficient water management and adaptation strategies increases as well. Addressing water scarcity requires an intersectoral and multidisciplinary approach to managing water resources, which would in turn keep social welfare and economic benefit at their optimal balance without compromising the sustainability of ecosystems. This paper presents a geographically explicit method to assess the potential for water storage with reservoirs and a dynamic model that identifies the dimensions and material requirements under an economically optimal water management plan. The methodology is applied to the Elbe and Nile river basins. Input data for geospatial analysis at watershed level are taken from global data repositories and include data on elevation, rainfall, soil texture, soil depth, drainage, land use and land cover, which are then downscaled to 1 km spatial resolution. Runoff potential for different combinations of land use and hydraulic soil groups and for mean annual precipitation levels is derived by the SCS-CN method, using overlay and decision tree algorithms...
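
The SCS-CN method mentioned above estimates direct runoff from rainfall depth and a curve number. A minimal sketch of the standard formulation, Q = (P - Ia)^2 / (P - Ia + S), with potential retention S = 1000/CN - 10 (inches) and initial abstraction Ia = 0.2 S:

```python
def scs_runoff(rainfall_in, curve_number, ia_ratio=0.2):
    """SCS Curve Number direct runoff (inches).

    rainfall_in: storm rainfall depth P in inches.
    curve_number: CN in (0, 100], from land use and hydrologic soil group.
    ia_ratio: initial abstraction as a fraction of S (0.2 is conventional).
    """
    s = 1000.0 / curve_number - 10.0   # potential maximum retention
    ia = ia_ratio * s                  # initial abstraction
    if rainfall_in <= ia:
        return 0.0                     # all rainfall abstracted, no runoff
    return (rainfall_in - ia) ** 2 / (rainfall_in - ia + s)
```

For example, CN = 75 gives S = 3.33 in, so a 4-inch storm yields about 1.67 inches of direct runoff; in a geospatial workflow the CN itself comes from overlaying land use and soil group layers.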

  18. Geospatial Services in Special Libraries: A Needs Assessment Perspective

    Science.gov (United States)

    Barnes, Ilana

    2013-01-01

    Once limited to geographers and mapmakers, Geographic Information Systems (GIS) has taken a growing central role in information management and visualization. Geospatial services run a gamut of different products and services from Google maps to ArcGIS servers to Mobile development. Geospatial services are not new. Libraries have been writing about…

  19. An Ontology-supported Approach for Automatic Chaining of Web Services in Geospatial Knowledge Discovery

    Science.gov (United States)

    di, L.; Yue, P.; Yang, W.; Yu, G.

    2006-12-01

    Recent developments in the geospatial semantic Web have shown promise for automatic discovery, access, and use of geospatial Web services to quickly and efficiently solve particular application problems. With semantic Web technology, it is highly feasible to construct intelligent geospatial knowledge systems that can provide answers to many geospatial application questions. A key challenge in constructing such an intelligent knowledge system is to automate the creation of a chain or process workflow that involves multiple services and highly diversified data and can generate the answer to a user's specific question. This presentation discusses an approach for automating the composition of geospatial Web service chains by employing geospatial semantics described by geospatial ontologies. It shows how ontology-based geospatial semantics enable the automatic discovery, mediation, and chaining of geospatial Web services. OWL-S is used to represent the geospatial semantics of each individual Web service, the type of service it belongs to, and the types of data it can handle. The hierarchy and classification of service types are described in the service ontology; the hierarchy and classification of data types are presented in the data ontology. For answering users' geospatial questions, an Artificial Intelligence (AI) planning algorithm is used to construct the service chain using the service and data logic expressed in the ontologies. The chain can be expressed as a graph, with nodes representing services and connection weights representing degrees of semantic matching between nodes. The graph is a visual representation of the logical geo-processing path for answering the user's question, and can be instantiated to a physical service workflow for execution to generate the answer. A prototype system, which includes real-world geospatial applications, is implemented to demonstrate the concept and approach.
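
The planning step can be pictured as a graph search over service descriptions, from the data a user has to the data type that answers the question. The sketch below is a toy breadth-first planner over a hypothetical service ontology; it stands in for, but is not, the OWL-S machinery the abstract describes:

```python
from collections import deque

def plan_chain(services, source_type, goal_type):
    """Breadth-first search over service descriptions (input -> output types)
    to find the shortest chain turning source data into the goal data type."""
    queue = deque([(source_type, [])])
    seen = {source_type}
    while queue:
        dtype, chain = queue.popleft()
        if dtype == goal_type:
            return chain
        for name, (inp, out) in services.items():
            if inp == dtype and out not in seen:
                seen.add(out)
                queue.append((out, chain + [name]))
    return None  # no chain of services reaches the goal type

# Hypothetical service ontology: service name -> (input type, output type).
services = {
    "WFS_GetFeature": ("query", "features"),
    "WPS_Buffer":     ("features", "buffered_features"),
    "WPS_Overlay":    ("buffered_features", "risk_map"),
}
```

A semantic planner generalizes this by matching types through ontology subsumption and by scoring candidate edges with degrees of semantic similarity rather than exact string equality.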

  20. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    Full Text Available The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value added information products to answer a wide range of questions. This article provides an overview of OMAR web based geospatial processing. OMAR is part of the Open Source Software Image Map project under the Open Source Geospatial Foundation. The primary contributors of OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example that open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near term development.

  1. The Value of Information - Accounting for a New Geospatial Paradigm

    Science.gov (United States)

    Pearlman, J.; Coote, A. M.

    2014-12-01

    A new frontier in the consideration of socio-economic benefit is valuing information as an asset, often referred to as infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet for many of the largest corporations, such as Google and Facebook, it is clearly their principal asset. This is exacerbated for public sector organizations, as those that are information-centric rather than information-enabled are relatively few (statistics, archiving and mapping agencies are perhaps the only examples), so it is not at the top of the agenda for government. However, it is a hugely important issue when valuing geospatial data and information. Geospatial data allows public institutions to operate, and facilitates the provision of essential services for emergency response and national defense. In this respect, geospatial data is strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciates in the same way that physical infrastructure depreciates: it needs to be maintained, otherwise its functionality and value in use decline. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.

  2. GeoCENS: A Geospatial Cyberinfrastructure for the World-Wide Sensor Web

    Directory of Open Access Journals (Sweden)

    Steve H.L. Liang

    2013-10-01

    Full Text Available The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the sensor web long-tail issue, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, through demonstrations of three real-world sensor web applications powered by GeoCENS, we argue that the GeoCENS architecture can successfully address the sensor web long-tail issue and consequently realize the world-wide sensor web vision.

  3. Large size space construction for space exploitation

    Science.gov (United States)

    Kondyurin, Alexey

    2016-07-01

    Space exploitation is impossible without large space structures. We need to build sufficiently large volumes of pressurized, protective frames for crews, passengers, space processing equipment, and more; we cannot remain limited in space. At present, the size and mass of space constructions are limited by the capacity of the launch vehicle, which constrains both human space exploitation and the development of the space industry. Large-size space constructions can be made using curing technology for fiber-filled composites with a reactionable matrix, applied directly in free space. For curing, fabric impregnated with a liquid matrix (prepreg) is prepared under terrestrial conditions and shipped in a container to orbit. In due time the prepreg is unfolded by inflation. After the polymerization reaction, the durable construction can be fitted out with air, apparatus and life support systems. Our experimental studies of curing processes in a simulated free-space environment showed that the curing of composites in free space is possible, so large-size space constructions can be developed. Projects for a space station, Moon base, Mars base, mining station, interplanetary spaceship, telecommunication station, space observatory, space factory, antenna dish, radiation shield and solar sail are proposed and overviewed. The study was supported by the Humboldt Foundation, ESA (contract 17083/03/NL/SFe), the NASA stratospheric balloon program and RFBR grants (05-08-18277, 12-08-00970 and 14-08-96011).

  4. Information gathering, management and transferring for geospatial intelligence - A conceptual approach to create a spatial data infrastructure

    Science.gov (United States)

    Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena

    2017-06-01

    Information has long been a key factor for military organizations. In the military context, the success of joint and combined operations depends on an accurate flow of information and knowledge concerning the operational theatre: provision of resources, evolution of the environment, targets' locations, and where and when an event will occur. Modern military operations cannot be conceived without maps and geospatial information. Staffs and forces in the field request large volumes of information during the planning and execution process, and horizontal and vertical integration of geospatial information is critical for the decision cycle. Information and knowledge management are fundamental to clarify an environment full of uncertainty. Geospatial information (GI) management arises as a branch of information and knowledge management, responsible for the conversion process from raw data collected by human or electronic sensors to knowledge. Geospatial information and intelligence systems allow us to integrate all other forms of intelligence and act as a main platform to process and display geospatially and temporally referenced events. Combining explicit knowledge with personal know-how generates a continuous learning cycle that supports real-time decisions, mitigates the fog of war and provides knowledge supremacy. This paper presents the analysis done after applying a questionnaire and interviews about GI and intelligence management in a military organization. The study intended to identify stakeholders' requirements for a military spatial data infrastructure, as well as the requirements for a future software system development.

  5. Geo-spatial technologies in urban environments policy, practice, and pixels

    CERN Document Server

    Jensen, Ryan R; McLean, Daniel

    2004-01-01

    Using Geospatial Technologies in Urban Environments simultaneously fills two gaping vacuums in the scholarly literature on urban geography. The first is the clear and straightforward application of geospatial technologies to practical urban issues. By using remote sensing and statistical techniques (correlation-regression analysis, the expansion method, factor analysis, and analysis of variance), the authors of these 12 chapters contribute significantly to our understanding of how geospatial methodologies enhance urban studies. For example, the GIS Specialty Group of the Association of American Geographers (AAG) has the largest membership of all the AAG specialty groups, followed by the Urban Geography Specialty Group. Moreover, the Urban Geography Specialty Group has the largest number of cross-memberships with the GIS Specialty Group. This book advances this important geospatial and urban link. Second, the book fills a wide void in the urban-environment literature. Although the Annals of the Association of ...

  6. Towards Geo-spatial Hypermedia: Concepts and Prototype Implementation

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Vestergaard, Peter Posselt; Ørbæk, Peter

    2002-01-01

    This paper combines spatial hypermedia with techniques from Geographical Information Systems and location based services. We describe the Topos 3D Spatial Hypermedia system and how it has been developed to support geo-spatial hypermedia coupling hypermedia information to model representations...... of real world buildings and landscapes. The prototype experiments are primarily aimed at supporting architects and landscape architects in their work on site. Here it is useful to be able to superimpose and add different layers of information to, e.g. a landscape depending on the task being worked on. We...... and indirect navigation. Finally, we conclude with a number of research issues which are central to the future development of geo-spatial hypermedia, including design issues in combining metaphorical and literal hypermedia space, as well as a discussion of the role of spatial parsing in a geo-spatial context....

  7. Using the Geospatial Web to Deliver and Teach Giscience Education Programs

    Science.gov (United States)

    Veenendaal, B.

    2015-05-01

    Geographic information science (GIScience) education has undergone enormous changes over the past years. One major factor influencing this change is the role of the geospatial web in GIScience. In addition to the use of the web for enabling and enhancing GIScience education, it is also used as the infrastructure for communicating and collaborating among geospatial data and users. The web becomes both the means and the content for a geospatial education program. However, the web does not replace the traditional face-to-face environment, but rather is a means to enhance it, expand it and enable an authentic and real world learning environment. This paper outlines the use of the web in both the delivery and content of the GIScience program at Curtin University. The teaching of the geospatial web, web and cloud based mapping, and geospatial web services are key components of the program, and the use of the web and online learning are important to deliver this program. Some examples of authentic and real world learning environments are provided including joint learning activities with partner universities.

  8. Leveraging the geospatial advantage

    Science.gov (United States)

    Ben Butler; Andrew Bailey

    2013-01-01

    The Wildland Fire Decision Support System (WFDSS) web-based application leverages geospatial data to inform strategic decisions on wildland fires. A specialized data team, working within the Wildland Fire Management Research Development and Application group (WFM RD&A), assembles authoritative national-level data sets defining values to be protected. The use of...

  9. National Geospatial-Intelligence Agency Academic Research Program

    Science.gov (United States)

    Loomer, S. A.

    2004-12-01

    "Know the Earth. Show the Way." In fulfillment of its vision, the National Geospatial-Intelligence Agency (NGA) provides geospatial intelligence in all its forms and from whatever source (imagery, imagery intelligence, and geospatial data and information) to ensure the knowledge foundation for planning, decision, and action. To achieve this, NGA conducts a multi-disciplinary program of basic research in geospatial intelligence topics through grants and fellowships to leading investigators, research universities, and colleges across the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program (NARP) are: - NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. - Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities and Minority Institutions across the US academic community. - Director of Central Intelligence Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports, and how other researchers and institutions can apply for grants under the program.

  10. Allocation of Tutors and Study Centers in Distance Learning Using Geospatial Technologies

    Directory of Open Access Journals (Sweden)

    Shahid Nawaz Khan

    2018-05-01

    Full Text Available Allama Iqbal Open University (AIOU is Pakistan’s largest distance learning institute, providing education to 1.4 million students. This is a fairly large setup across a country where students are highly geographically distributed. Currently, the system works using a manual approach, which is not efficient. Allocation of tutors and study centers to students plays a key role in creating a better learning environment for distance learning. Assigning tutors and study centers to distance learning students is a challenging task when there is a huge geographic spread. Using geospatial technologies in open and distance learning can fix allocation problems. This research analyzes real data from the twin cities Islamabad and Rawalpindi. The results show that geospatial technologies can be used for efficient and proper resource utilization and allocation, which in turn can save time and money. The overall idea fits into an improved distance learning framework and related analytics.
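
At its core, the allocation task is assigning each student to the nearest study center by great-circle distance. A minimal sketch, with hypothetical coordinates and ignoring tutor availability and center capacity:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def assign_centers(students, centers):
    """Assign each student to the geographically nearest study center."""
    return {
        sid: min(centers, key=lambda c: haversine_km(lat, lon, *centers[c]))
        for sid, (lat, lon) in students.items()
    }

# Hypothetical coordinates for the twin cities and two students.
centers = {"Islamabad": (33.6844, 73.0479), "Rawalpindi": (33.5651, 73.0169)}
students = {"s1": (33.70, 73.05), "s2": (33.55, 73.00)}
```

A production system would layer capacity limits and road-network travel times on top of this straight-line nearest-center rule.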

  11. Issues on Building Kazakhstan Geospatial Portal to Implement E-Government

    Science.gov (United States)

    Sagadiyev, K.; Kang, H. K.; Li, K. J.

    2016-06-01

    A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework for integrating and organizing them. In particular, it is very useful to integrate the land management process in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We developed a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open source geospatial software, including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements. First, we establish a transparent governmental process, which is one of the main goals of e-government: every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort spent in the government process. For example, a grant procedure for a building construction has taken more than one year, with more than 50 steps; it is expected that this procedure would be reduced to two weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have been a critical issue in governmental administration processes.

  12. ISSUES ON BUILDING KAZAKHSTAN GEOSPATIAL PORTAL TO IMPLEMENT E-GOVERNMENT

    Directory of Open Access Journals (Sweden)

    K. Sagadiyev

    2016-06-01

    Full Text Available A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework for integrating and organizing them. In particular, it is very useful to integrate the land management process in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We developed a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open source geospatial software, including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements. First, we establish a transparent governmental process, which is one of the main goals of e-government: every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort spent in the government process. For example, a grant procedure for a building construction has taken more than one year, with more than 50 steps; it is expected that this procedure would be reduced to two weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have been a critical issue in governmental administration processes.

  13. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    Science.gov (United States)

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  14. Geo-spatial Cognition on Human's Social Activity Space Based on Multi-scale Grids

    Directory of Open Access Journals (Sweden)

    ZHAI Weixin

    2016-12-01

    Full Text Available Widely applied location-aware devices, including mobile phones and GPS receivers, have provided great convenience for collecting large volumes of individuals' geographical information. Research on humans' social activity space has attracted an increasing number of researchers. In our research, based on location-based Flickr data from 2004 to May 2014 in China, we choose five levels of spatial grids to form a multi-scale frame for investigating the correlation between scale and geo-spatial cognition of human social activity space. The ht-index, a fractal metric inspired by Alexander, is selected to estimate the maturity of social activity at different scales. The results indicate that the scale characteristics are related to spatial cognition to a certain extent. It is favorable to use the spatial grid as a tool to control scales for geo-spatial cognition of human social activity space.
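
The ht-index used above counts how many times the "far more small things than large things" pattern recurs in a data set. A simplified sketch of the head/tail-breaks idea (head = values above the mean; recursion continues while the head stays a minority, here under 40% of the data; the published definition differs in details):

```python
def ht_index(values, head_limit=0.4):
    """Simplified ht-index: number of hierarchical levels at which the head
    (values above the mean) remains a minority of the data."""
    ht = 1
    data = list(values)
    while len(data) > 1:
        mean = sum(data) / len(data)
        head = [v for v in data if v > mean]
        # Stop when the head is empty or no longer a clear minority.
        if not head or len(head) / len(data) >= head_limit:
            break
        ht += 1
        data = head  # recurse into the head only
    return ht
```

Heavy-tailed data (e.g. powers of two) yields a higher ht-index than uniform data, which is what makes it usable as a maturity measure across grid scales.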

  15. GIBS Geospatial Data Abstraction Library (GDAL)

    Data.gov (United States)

    National Aeronautics and Space Administration — GDAL is an open source translator library for raster geospatial data formats that presents a single abstract data model to the calling application for all supported...

  16. Foreword to the theme issue on geospatial computer vision

    Science.gov (United States)

    Wegner, Jan Dirk; Tuia, Devis; Yang, Michael; Mallet, Clement

    2018-06-01

    Geospatial Computer Vision has become one of the most prevalent emerging fields of investigation in Earth Observation in the last few years. In this theme issue, we aim at showcasing a number of works at the interface between remote sensing, photogrammetry, image processing, computer vision and machine learning. In light of recent sensor developments, both on the ground and from above, an unprecedented (and ever growing) quantity of geospatial data is available for tackling challenging and urgent tasks such as environmental monitoring (deforestation, carbon sequestration, climate change mitigation), disaster management, autonomous driving or the monitoring of conflicts. The new bottleneck for serving these applications is the extraction of relevant information from such large amounts of multimodal data. This includes sources stemming from multiple sensors that differ in physical nature, quality, and spatial, spectral and temporal resolution. They are as diverse as multi-/hyperspectral satellite sensors, color cameras on drones, laser scanning devices, existing open land-cover geodatabases and social media. Such core data processing is mandatory so as to generate semantic land-cover maps, accurate detection and trajectories of objects of interest, as well as by-products of superior added value: georeferenced data, images with enhanced geometric and radiometric qualities, or Digital Surface and Elevation Models.

  17. Geospatial Information System Capability Maturity Models

    Science.gov (United States)

    2017-06-01

    To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...

  18. Experimental study on propagation properties of large size TEM antennas

    International Nuclear Information System (INIS)

    Zhang Guowei; Wang Haiyang; Chen Weiqing; Wang Wei; Zhu Xiangqin; Xie Linshen

    2014-01-01

    The propagation properties of large-size TEM antennas were studied experimentally. The size of the TEM antennas is 60 m × 20 m × 10 m and the characteristic impedance is 120 Ω. A dielectric foil switch was designed to integrate compactly with the TEM antennas; it can generate a double exponential waveform with an amplitude of 10 kV and a rise time of 1.2 ns. The radiated field distribution was measured. The relationships between rise time/amplitude and distance were provided, and the propagation properties of large-size TEM antennas were summarized. (authors)
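A double exponential waveform of the kind the foil switch generates is commonly modelled as v(t) = v0·(e^(−αt) − e^(−βt)) with β ≫ α. The sketch below, using illustrative parameters rather than values from the paper, estimates the 10%–90% rise time of such a pulse numerically:

```python
import math

def double_exp(t, v0=10e3, alpha=2e7, beta=2.4e9):
    """Idealized double-exponential pulse v(t) = v0*(exp(-alpha*t) - exp(-beta*t)).
    Parameters here are illustrative, not fitted to the experiment."""
    return v0 * (math.exp(-alpha * t) - math.exp(-beta * t))

def rise_time_10_90(f, t_max, n=200001):
    """Numerically estimate the 10%-90% rise time of pulse f on [0, t_max]
    by sampling and finding the first crossings of 10% and 90% of the peak."""
    ts = [t_max * i / (n - 1) for i in range(n)]
    vs = [f(t) for t in ts]
    peak = max(vs)
    t10 = next(t for t, v in zip(ts, vs) if v >= 0.1 * peak)
    t90 = next(t for t, v in zip(ts, vs) if v >= 0.9 * peak)
    return t90 - t10

print(rise_time_10_90(double_exp, 5e-9))  # sub-nanosecond for these parameters
```

The rise time is dominated by the fast exponent β, while the fall is governed by the slow exponent α.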

  19. Processing and properties of large-sized ceramic slabs

    Energy Technology Data Exchange (ETDEWEB)

    Raimondo, M.; Dondi, M.; Zanelli, C.; Guarini, G.; Gozzi, A.; Marani, F.; Fossa, L.

    2010-07-01

    Large-sized ceramic slabs with dimensions up to 360 × 120 cm² and thickness down to 2 mm are manufactured through an innovative ceramic process, starting from porcelain stoneware formulations and involving wet ball milling, spray drying, die-less slow-rate pressing, a single stage of fast drying-firing, and finishing (trimming, assembling of ceramic-fiberglass composites). Fired and unfired industrial slabs were selected and characterized from the technological, compositional (XRF, XRD) and microstructural (SEM) viewpoints. Semi-finished products exhibit a remarkable microstructural uniformity and stability in a rather wide window of firing schedules. The phase composition and compact microstructure of fired slabs are very similar to those of porcelain stoneware tiles. The values of water absorption, bulk density, closed porosity, and functional performances, as well as mechanical and tribological properties, conform to the top quality range of porcelain stoneware tiles. However, the large size coupled with low thickness bestows on the slab a certain degree of flexibility, which is emphasized in ceramic-fiberglass composites. These outstanding performances make the large-sized slabs suitable for use in novel applications: building and construction (new floorings without dismantling the previous paving, ventilated facades, tunnel coverings, insulating panelling), indoor furniture (table tops, doors), and support for photovoltaic ceramic panels. (Author) 24 refs.

  20. Processing and properties of large-sized ceramic slabs

    International Nuclear Information System (INIS)

    Raimondo, M.; Dondi, M.; Zanelli, C.; Guarini, G.; Gozzi, A.; Marani, F.; Fossa, L.

    2010-01-01

    Large-sized ceramic slabs with dimensions up to 360 × 120 cm² and thickness down to 2 mm are manufactured through an innovative ceramic process, starting from porcelain stoneware formulations and involving wet ball milling, spray drying, die-less slow-rate pressing, a single stage of fast drying-firing, and finishing (trimming, assembling of ceramic-fiberglass composites). Fired and unfired industrial slabs were selected and characterized from the technological, compositional (XRF, XRD) and microstructural (SEM) viewpoints. Semi-finished products exhibit a remarkable microstructural uniformity and stability in a rather wide window of firing schedules. The phase composition and compact microstructure of fired slabs are very similar to those of porcelain stoneware tiles. The values of water absorption, bulk density, closed porosity, and functional performances, as well as mechanical and tribological properties, conform to the top quality range of porcelain stoneware tiles. However, the large size coupled with low thickness bestows on the slab a certain degree of flexibility, which is emphasized in ceramic-fiberglass composites. These outstanding performances make the large-sized slabs suitable for use in novel applications: building and construction (new floorings without dismantling the previous paving, ventilated facades, tunnel coverings, insulating panelling), indoor furniture (table tops, doors), and support for photovoltaic ceramic panels. (Author) 24 refs.

  1. Technological Aspects of Creating Large-size Optical Telescopes

    Directory of Open Access Journals (Sweden)

    V. V. Sychev

    2015-01-01

    Full Text Available The concept for creating a telescope depends, first of all, both on the choice of optical scheme, to form optical radiation and images with minimum losses of energy and information, and on the choice of design, to meet requirements for strength, stiffness, and stabilization characteristics under real telescope operating conditions. The concept of creating large-size telescopes thus necessarily involves the use of adaptive optics methods and means. The successful development of large-size optical telescopes is in many respects defined by the level of technological capability to realize scientific and engineering ideas. All developers pursue the same aim: to raise the amount of information gathered by increasing the diameter of the telescope's main mirror. The article analyses the adaptive telescope designs developed in our country. Using the domestic ACT-25 telescope as an example, it considers the creation of large-size optical telescopes in terms of technological aspects. It also describes the features of the telescope creation concept that allow reaching the marginally possible characteristics to ensure a maximum amount of information. The article compares a wide range of large-size telescope projects. It shows that the domestic project to create the adaptive ACT-25 super-telescope surpasses its foreign counterparts, and that there is no sense in implementing the Euro50 (50 m) and OWL (100 m) projects. The considered material gives a clear understanding of the role of technological aspects in the development of such complicated optoelectronic complexes as large-size optical telescopes. The technological assessment criteria offered in the article, namely the specific informational content of the telescope, its specific mass, and its specific cost, allow us to reveal weaknesses in project development and define reserves for further improvement of the telescope. The analysis of results and their judgment have shown that improvement of optical large-size telescopes in terms of their maximum

  2. Towards Geo-spatial Information Science in Big Data Era

    Directory of Open Access Journals (Sweden)

    LI Deren

    2016-04-01

    Full Text Available Since the 1990s, with the advent of the worldwide information revolution and the development of the internet, geospatial information science has also come of age, pushing forward the building of the digital Earth and cyber cities. As we entered the 21st century, with the development and integration of global information technology and industrialization, the internet of things and cloud computing came into being, and human society entered the big data era. This article covers the key features of geospatial information science in the big data era (ubiquity, multi-dimensionality and dynamics, internet+ networking, full automation and real-time operation, from sensing to recognition, crowdsourcing and VGI, and service orientation) and addresses the key technical issues that need to be resolved to provide a new definition of geospatial information science in the big data era (a non-linear four-dimensional Earth reference frame system, space-based enhanced GNSS, space-air-land unified network communication techniques, on-board processing techniques for multi-source image data, smart interface service techniques for space-borne information, space-based resource scheduling and network security, and the design and development of a payload-based multi-functional satellite platform). Based on the discussion in this paper, the author finally proposes a new definition of geospatial information science (geomatics): Geomatics is a multi-disciplinary science and technology which, using a systematic approach, integrates all the means for spatio-temporal data acquisition, information extraction, networked management, knowledge discovery, spatial sensing and recognition, as well as intelligent location-based services for any physical objects and human activities around the earth and its environment. Starting from this new definition, geospatial information science will find many more opportunities and tasks in the big data era for the generation of the smart Earth and smart cities. Our profession

  3. Student Focused Geospatial Curriculum Initiatives: Internships and Certificate Programs at NCCU

    Science.gov (United States)

    Vlahovic, G.; Malhotra, R.

    2009-12-01

    This paper reports recent efforts by the Department of Environmental, Earth and Geospatial Sciences faculty at North Carolina Central University (NCCU) to develop a leading geospatial sciences program that will be considered a model for other Historically Black College/University (HBCU) peers nationally. NCCU was established in 1909 and is the nation's first state-supported public liberal arts college for African Americans. In the most recent annual ranking of America's best black colleges by the US News and World Report (Best Colleges 2010), NCCU was ranked 10th in the nation. As one of only two HBCUs in the southeast offering an undergraduate degree in Geography (McKee, J.O. and C. V. Dixon. Geography in Historically Black Colleges/Universities in the Southeast, in The Role of the South in the Making of American Geography: Centennial of the AAG, 2004), NCCU is uniquely positioned to positively affect the talent and diversity of the geospatial discipline in the future. Therefore, successful creation of research and internship pathways for NCCU students has national implications because it will increase the number of minority students joining the workforce and applying to PhD programs. Several related efforts will be described, including research and internship projects with Fugro EarthData Inc., the Center for Remote Sensing and Mapping Science at the University of Georgia, the Center for Earthquake Research and Information at the University of Memphis, and the City of Durham. The authors will also outline requirements and recent successes of the ASPRS Provisional Certification Program, developed and pioneered as a collaborative effort between ASPRS and NCCU. This certificate program allows graduating students majoring in geospatial technologies and allied fields to become provisionally certified by passing peer review and taking the certification exam. At NCCU, projects and certification are conducted under the aegis of the Geospatial Research, Innovative Teaching and

  4. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    Science.gov (United States)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advances in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes from hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application that is built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https
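The out-of-core behaviour such backends provide can be illustrated with a minimal, standard-library-only sketch: filter an arbitrarily large stream of points against a bounding box one bounded-size chunk at a time, so memory use stays constant regardless of total data size. The data source and chunking below are invented stand-ins, not GeoNotebook's actual API:

```python
from itertools import islice

def read_points():
    """Stand-in for a huge geospatial point source (e.g. tiles on HDFS/S3);
    here just a deterministic generator of (lon, lat, value) tuples laid
    out on a grid."""
    for i in range(1_000_000):
        lon = (i % 360) - 180.0
        lat = ((i // 360) % 180) - 90.0
        yield (lon, lat, float(i))

def filter_bbox_chunked(points, bbox, chunk_size=100_000):
    """Filter a point stream against a (min_lon, min_lat, max_lon, max_lat)
    bounding box one chunk at a time, keeping memory bounded -- the core
    idea behind out-of-core geospatial processing."""
    min_lon, min_lat, max_lon, max_lat = bbox
    it = iter(points)
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            return
        for lon, lat, val in chunk:
            if min_lon <= lon <= max_lon and min_lat <= lat <= max_lat:
                yield (lon, lat, val)

hits = sum(1 for _ in filter_bbox_chunked(read_points(), (-10, -10, 10, 10)))
print(hits)  # → 6615
```

Real backends add spatial indexing and distribute the chunks across workers, but the streaming contract is the same.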

  5. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    Science.gov (United States)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-03-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation, as well as the usability, of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure an acceptable usability.
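The core coordinate mapping in such a system, from a tracked object's normalized tabletop position (as reacTIVision/TUIO-style trackers report it) to the geographic coordinates of the projected map, can be sketched as follows. This simplified version assumes the map exactly fills the table and is axis-aligned, ignoring calibration:

```python
def table_to_map(x_norm, y_norm, extent):
    """Convert a normalized tabletop position (x, y in [0, 1], origin at the
    top-left, as TUIO-style trackers typically report) into map coordinates,
    given the geographic extent (min_x, min_y, max_x, max_y) projected onto
    the table. Simplified: assumes the map fills the table exactly."""
    min_x, min_y, max_x, max_y = extent
    map_x = min_x + x_norm * (max_x - min_x)
    map_y = max_y - y_norm * (max_y - min_y)  # screen y grows downward
    return map_x, map_y

# a token at the table centre lands at the centre of the projected extent
print(table_to_map(0.5, 0.5, (5.0, 49.0, 7.0, 51.0)))  # → (6.0, 50.0)
```

With the position in map coordinates, the system can then query the PostGIS database for the features under the physical token.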

  6. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    Science.gov (United States)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-04-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation, as well as the usability, of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure an acceptable usability.

  7. A geospatial database model for the management of remote sensing datasets at multiple spectral, spatial, and temporal scales

    Science.gov (United States)

    Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George

    2017-10-01

    In this study the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata are presented. To develop the multi-source geospatial database we used a Relational Database Management System (RDBMS) on a Structured Query Language (SQL) server, which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information were linked (e.g. location, date, and instrument). Airborne data, at different processing levels (digital numbers through geocorrected reflectance), were implemented in the geospatial database, where the datasets are linked spatially and temporally. An example dataset is presented, consisting of airborne hyperspectral imagery collected for inter- and intra-annual vegetation characterization and detection of potential hydrocarbon seepage events over pipeline areas. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.
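The logical associations described (location, date, instrument, processing level) might be modelled along these lines. The schema below is a hypothetical miniature in SQLite; all table and column names are invented for illustration, not the authors' actual geodatabase design:

```python
import sqlite3

# Toy relational model: an acquisition links an instrument, a date, and a
# bounding-box location; image products at several processing levels
# reference the acquisition. Names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE instrument (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE acquisition (
    id INTEGER PRIMARY KEY,
    instrument_id INTEGER NOT NULL REFERENCES instrument(id),
    acquired_on TEXT NOT NULL,  -- ISO date
    min_lon REAL, min_lat REAL, max_lon REAL, max_lat REAL
);
CREATE TABLE image_product (
    id INTEGER PRIMARY KEY,
    acquisition_id INTEGER NOT NULL REFERENCES acquisition(id),
    level TEXT NOT NULL CHECK (level IN ('DN', 'radiance', 'reflectance')),
    path TEXT NOT NULL
);
""")
con.execute("INSERT INTO instrument VALUES (1, 'hyperspectral-sensor')")
con.execute("INSERT INTO acquisition VALUES (1, 1, '2016-07-14', -75.9, 45.3, -75.7, 45.5)")
con.execute("INSERT INTO image_product VALUES (1, 1, 'reflectance', '/data/f1_refl.tif')")

# products for a given date, joined across the three linked tables
rows = con.execute("""
    SELECT i.name, a.acquired_on, p.level
    FROM image_product p
    JOIN acquisition a ON p.acquisition_id = a.id
    JOIN instrument  i ON a.instrument_id = i.id
    WHERE a.acquired_on = '2016-07-14'
""").fetchall()
print(rows)  # → [('hyperspectral-sensor', '2016-07-14', 'reflectance')]
```

The same joins are what an ArcGIS geodatabase layer performs implicitly when it resolves an image back to its flight and sensor metadata.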

  8. MapFactory - Towards a mapping design pattern for big geospatial data

    Science.gov (United States)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
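MapFactory, as its name suggests, follows the classic factory design pattern: a design specification selects which concrete map product gets built. Below is a hedged sketch of that dispatch, with invented class and field names loosely echoing ISO 19115-style metadata, not the paper's actual design:

```python
from dataclasses import dataclass

@dataclass
class MapSpec:
    """A minimal stand-in for a metadata-driven design specification."""
    title: str
    representation: str  # e.g. 'choropleth' or 'heatmap'

class ChoroplethMap:
    def __init__(self, spec):
        self.spec = spec
    def describe(self):
        return f"choropleth: {self.spec.title}"

class HeatMap:
    def __init__(self, spec):
        self.spec = spec
    def describe(self):
        return f"heatmap: {self.spec.title}"

class MapFactory:
    """Factory pattern: clients ask for a map by specification and never
    name the concrete class, so new map types only touch the registry."""
    _registry = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

    @classmethod
    def create(cls, spec: MapSpec):
        map_cls = cls._registry.get(spec.representation)
        if map_cls is None:
            raise ValueError(f"no map type for {spec.representation!r}")
        return map_cls(spec)

m = MapFactory.create(MapSpec("population density", "choropleth"))
print(m.describe())  # → choropleth: population density
```

The reuse benefit claimed in the abstract comes precisely from this indirection: map-making code depends only on the factory and the specification, not on each concrete map type.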

  9. Center of Excellence for Geospatial Information Science research plan 2013-18

    Science.gov (United States)

    Usery, E. Lynn

    2013-01-01

    The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.

  10. Visual exposure to large and small portion sizes and perceptions of portion size normality: Three experimental studies.

    Science.gov (United States)

    Robinson, Eric; Oldham, Melissa; Cuckson, Imogen; Brunstrom, Jeffrey M; Rogers, Peter J; Hardman, Charlotte A

    2016-03-01

    Portion sizes of many foods have increased in recent times. In three studies we examined the effect that repeated visual exposure to larger versus smaller food portion sizes has on perceptions of what constitutes a normal-sized food portion and measures of portion size selection. In studies 1 and 2 participants were visually exposed to images of large or small portions of spaghetti bolognese, before making evaluations about an image of an intermediate sized portion of the same food. In study 3 participants were exposed to images of large or small portions of a snack food before selecting a portion size of snack food to consume. Across the three studies, visual exposure to larger as opposed to smaller portion sizes resulted in participants considering a normal portion of food to be larger than a reference intermediate sized portion. In studies 1 and 2 visual exposure to larger portion sizes also increased the size of self-reported ideal meal size. In study 3 visual exposure to larger portion sizes of a snack food did not affect how much of that food participants subsequently served themselves and ate. Visual exposure to larger portion sizes may adjust visual perceptions of what constitutes a 'normal' sized portion. However, we did not find evidence that visual exposure to larger portions altered snack food intake. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Geospatial Modelling for Micro Zonation of Groundwater Regime in Western Assam, India

    Science.gov (United States)

    Singh, R. P.

    2016-12-01

    Water, the most precious natural resource on earth, is vital to sustaining the natural system and human civilisation. The Assam state, located in the north-eastern part of India, has a relatively good source of groundwater due to its geographic and physiographic location, but deterioration of groundwater quality is causing major health problems in the area. In this study, an integrated approach combining remote sensing, GIS, and chemical analysis of groundwater samples was used to shed light on the groundwater regime and provide information for decision makers to support sustainable water resource management. The geospatial modelling was performed by integrating hydrogeomorphic features. Geomorphology, lineament, drainage, and land use/land cover layers were generated through visual interpretation of satellite imagery (LISS III) based on the tone, texture, shape, size, and arrangement of features. The slope layer was prepared using the SRTM DEM dataset. The land use/land cover of the area was categorized into six classes: agricultural field, forest area, river, settlement, tree-clad area, and wetlands. The geospatial modelling was performed through a weightage-and-rank method in GIS, depending on the influence of the features on the groundwater regime. To assess the groundwater quality of the area, 45 groundwater samples were collected in the field and chemically analysed using standard laboratory methods. The overall groundwater quality of the area was assessed through a Water Quality Index, which found that about 70% of samples are not potable for drinking purposes due to higher concentrations of arsenic, fluoride, and iron. It appears that all these pollutants are geologically and geomorphologically derived. The interpolated Water Quality Index layer and the geospatially modelled groundwater potential layer provide a holistic view of the groundwater scenario and direction for better planning and groundwater resource management. The study will be discussed in detail
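A weightage-and-rank overlay of the kind described reduces, per raster cell, to a weighted sum of ranked thematic layers. The sketch below uses invented ranks and weights for illustration, not the study's calibrated values:

```python
def weighted_overlay(layers, weights):
    """Combine ranked thematic raster layers (equal-shaped lists of rows of
    class ranks) into one suitability score per cell:
    score[r][c] = sum(weight_i * rank_i[r][c])."""
    assert len(layers) == len(weights)
    rows, cols = len(layers[0]), len(layers[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for layer, w in zip(layers, weights):
        for r in range(rows):
            for c in range(cols):
                out[r][c] += w * layer[r][c]
    return out

# three 2x2 layers, e.g. geomorphology, slope, and land use ranks (1 = poor, 3 = good),
# with illustrative weights reflecting each layer's influence on groundwater potential
geomorph = [[3, 2], [1, 2]]
slope    = [[2, 2], [3, 1]]
landuse  = [[1, 3], [2, 2]]
score = weighted_overlay([geomorph, slope, landuse], [0.5, 0.3, 0.2])
print(score)
```

In a GIS this same arithmetic is applied cell-by-cell across full rasters; the resulting score surface is then classified into groundwater potential zones.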

  12. Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools

    Science.gov (United States)

    Zalles, Daniel R.; Manitakos, James

    2016-01-01

    Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), a 4.5-year National Science Foundation funded project, explored the strategies that stimulate teacher commitment to the project's driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate,…

  13. Geospatial Big Data Handling Theory and Methods: A Review and Research Challenges

    DEFF Research Database (Denmark)

    Li, Songnian; Dragicevic, Suzana; Anton, François

    2016-01-01

    Big data has now become a strong focus of global interest that is increasingly attracting the attention of academia, industry, government and other organizations. Big data can be situated in the disciplinary area of traditional geospatial data handling theory and methods. The increasing volume...... for Photogrammetry and Remote Sensing (ISPRS) Technical Commission II (TC II) revisits the existing geospatial data handling methods and theories to determine if they are still capable of handling emerging geospatial big data. Further, the paper synthesises problems, major issues and challenges with current...... developments as well as recommending what needs to be developed further in the near future....

  14. Towards the Development of a Taxonomy for Visualisation of Streamed Geospatial Data

    Science.gov (United States)

    Sibolla, B. H.; Van Zyl, T.; Coetzee, S.

    2016-06-01

    Geospatial data has very specific characteristics that need to be carefully captured in its visualisation, in order for the user and the viewer to gain knowledge from it. The science of visualisation has gained much traction over the last decade in response to various visualisation challenges. During the development of an open-source, dynamic two-dimensional visualisation library that caters for geospatial streaming data, it was found necessary to conduct a review of existing geospatial visualisation taxonomies. The review was done to inform the design phase of the library's development, such that an existing taxonomy could either be adopted or extended to fit the needs at hand. The major challenge in this case is to develop dynamic two-dimensional visualisations that enable human interaction, in order to assist the user in understanding data streams that are continuously being updated. This paper reviews the existing geospatial data visualisation taxonomies that have been developed over the years. Based on the review, an adopted taxonomy for visualisation of geospatial streaming data is presented. Example applications of this taxonomy are also provided. The adopted taxonomy will then be used to develop the information model for the visualisation library in a further study.

  15. Artificial Intelligence in geospatial analysis: applications of self-organizing maps in the context of geographic information science.

    OpenAIRE

    Henriques, Roberto André Pereira

    2011-01-01

    A thesis submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems. The size and dimensionality of available geospatial repositories increases every day, placing additional pressure on existing analysis tools, as they are expected to extract more knowledge from these databases. Most of these tools were created in a data poor environment and thus rarely address concerns of efficiency, dimensionality and automatic exploration. In ...

  16. Representation of activity in images using geospatial temporal graphs

    Science.gov (United States)

    Brost, Randolph; McLendon, III, William C.; Parekh, Ojas D.; Rintoul, Mark Daniel; Watson, Jean-Paul; Strip, David R.; Diegert, Carl

    2018-05-01

    Various technologies pertaining to modeling patterns of activity observed in remote sensing images using geospatial-temporal graphs are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Activity patterns may be discerned from the graphs by coding nodes representing persistent objects like buildings differently from nodes representing ephemeral objects like vehicles, and examining the geospatial-temporal relationships of ephemeral nodes within the graph.
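The graph encoding described (persistent versus ephemeral nodes, undirected spatial edges, directed temporal edges) can be sketched with plain dictionaries and sets. The labels and structures here are illustrative, not the patented system's actual representation:

```python
# Nodes are objects observed in images, tagged persistent (buildings) or
# ephemeral (vehicles); undirected edges capture spatial adjacency within
# one image, and directed edges link observations forward in time.
nodes = {
    "building_1@t0": {"kind": "persistent"},
    "vehicle_7@t0":  {"kind": "ephemeral"},
    "building_1@t1": {"kind": "persistent"},
    "vehicle_7@t1":  {"kind": "ephemeral"},
}
spatial_edges = {            # undirected: each adjacency stored once
    ("building_1@t0", "vehicle_7@t0"),
    ("building_1@t1", "vehicle_7@t1"),
}
temporal_edges = {           # directed: (earlier observation, later observation)
    ("building_1@t0", "building_1@t1"),
    ("vehicle_7@t0", "vehicle_7@t1"),
}

def ephemeral_tracks(nodes, temporal_edges):
    """Follow directed time edges between ephemeral nodes -- the kind of
    activity pattern (e.g. a moving vehicle) mined from such a graph."""
    return [(a, b) for a, b in sorted(temporal_edges)
            if nodes[a]["kind"] == "ephemeral" and nodes[b]["kind"] == "ephemeral"]

print(ephemeral_tracks(nodes, temporal_edges))  # → [('vehicle_7@t0', 'vehicle_7@t1')]
```

Coding persistent and ephemeral nodes differently is what lets queries separate stable scene structure from activity, as the abstract describes.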

  17. Synthesis of mesoporous carbon nanoparticles with large and tunable pore sizes

    Science.gov (United States)

    Liu, Chao; Yu, Meihua; Li, Yang; Li, Jiansheng; Wang, Jing; Yu, Chengzhong; Wang, Lianjun

    2015-07-01

    Mesoporous carbon nanoparticles (MCNs) with large and adjustable pores have been synthesized by using poly(ethylene oxide)-b-polystyrene (PEO-b-PS) as a template and resorcinol-formaldehyde (RF) as a carbon precursor. The resulting MCNs possess small diameters (100-126 nm) and high BET surface areas (up to 646 m² g⁻¹). By using home-designed block copolymers, the pore size of MCNs can be tuned in the range of 13-32 nm. Importantly, the pore size of 32 nm is the largest among the MCNs prepared by the soft-templating route. The formation mechanism and structure evolution of MCNs were studied by TEM and DLS measurements, based on which a soft-templating/sphere packing mechanism was proposed. Because of the large pores and small particle sizes, the resulting MCNs were excellent nano-carriers to deliver biomolecules into cancer cells. MCNs were further demonstrated to have negligible toxicity. It is anticipated that this carbon material with large pores and small particle sizes may have excellent potential in drug/gene delivery.

  18. Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

    Science.gov (United States)

    Singh, R.; Bermudez, L. E.

    2013-12-01

The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years, two main trends have been disrupting geospatial applications: mobility and context sharing. People rely on more and more mobile devices to support their work and personal lives. Mobile devices are intermittently connected to the internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the Shapefile format is extremely limited. It has four major shortcomings: only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. And it builds upon a database container - SQLite - that is self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout are described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is sharing client 'contexts'. When a user is looking into an article or a product on the web
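Because a GeoPackage is just a set of SQLite tables, any SQLite client can open one. The following is a minimal sketch of that idea using Python's standard library; the `gpkg_contents` table shown is a simplified subset of what the GeoPackage specification actually requires (real packages also need tables such as `gpkg_spatial_ref_sys`, and stricter column definitions).

```python
import sqlite3

# A GeoPackage is a single SQLite file; its layout is defined by tables such as
# gpkg_contents. This is a simplified sketch, not a spec-compliant package.
conn = sqlite3.connect(":memory:")  # a real file would end in .gpkg
conn.execute("""
    CREATE TABLE gpkg_contents (
        table_name  TEXT PRIMARY KEY,
        data_type   TEXT NOT NULL,   -- e.g. 'features' or 'tiles'
        identifier  TEXT,
        min_x REAL, min_y REAL, max_x REAL, max_y REAL
    )""")
conn.execute(
    "INSERT INTO gpkg_contents VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("roads", "features", "Road centerlines", -180.0, -90.0, 180.0, 90.0),
)

# Any SQLite client can now discover what the package contains:
for name, dtype in conn.execute("SELECT table_name, data_type FROM gpkg_contents"):
    print(name, dtype)  # prints: roads features
```

This single-file, serverless design is exactly what makes the format attractive on intermittently connected mobile devices.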

  19. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    Science.gov (United States)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from the massive and heterogeneous resources. Developments over the past decades have produced many service components that facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge for the following reasons: 1) Entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata, and it is hard for end users to learn all of these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and the overhead of maintaining data consistency. 3) Semantics in data discovery are heterogeneous. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues, but integrating them with existing services is challenging due to limits on the expandability of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results. 
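The recall problem with pure keyword matching described in point 3 can be illustrated with a toy comparison. The Jaccard score below is only a stand-in: real similarity evaluation in such systems typically relies on ontologies or semantic reasoning, not simple set overlap.

```python
# Illustrative only: exact keyword matching versus a graded set-similarity
# score. A graded score lets near-miss records still rank, improving recall.

def keyword_match(query_terms: set, record_terms: set) -> bool:
    # Exact matching: the record is found only if it shares a literal term.
    return bool(query_terms & record_terms)

def jaccard(query_terms: set, record_terms: set) -> float:
    # |intersection| / |union|: partial overlap earns partial credit.
    union = query_terms | record_terms
    return len(query_terms & record_terms) / len(union) if union else 0.0

record = {"precipitation", "grid", "monthly"}
print(keyword_match({"rainfall"}, record))          # False: synonym missed
print(jaccard({"precipitation", "daily"}, record))  # 0.25: partial credit
```

Keyword matching silently drops the synonym query entirely, which is exactly the precision/recall weakness the abstract points to.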
Furthermore, the presentation of the value

  20. Stakeholder Alignment and Changing Geospatial Information Capabilities

    Science.gov (United States)

    Winter, S.; Cutcher-Gershenfeld, J.; King, J. L.

    2015-12-01

Changing geospatial information capabilities can have major economic and social effects on activities such as drought monitoring, weather forecasting, agricultural productivity projections, water and air quality assessments, the effects of forestry practices, and so on. Whose interests are served by such changes? Two common mistakes are assuming stability in the community of stakeholders and consistency in stakeholder behavior. Stakeholder communities can reconfigure dramatically as some leave the discussion, others enter, and circumstances shift, all resulting in dynamic points of alignment and misalignment. New stakeholders can bring new interests, and existing stakeholders can change their positions. Stakeholders and their interests need to be considered as geospatial information capabilities change, but this is easier said than done. New ways of thinking about stakeholder alignment in light of changes in capability are presented.

  1. Lowering the barriers for accessing distributed geospatial big data to advance spatial data science: the PolarHub solution

    Science.gov (United States)

    Li, W.

    2017-12-01

Data is the crux of science. The widespread availability of big data today is of particular importance for fostering new forms of geospatial innovation. This paper reports a state-of-the-art solution that addresses a key cyberinfrastructure research problem: providing ready access to big, distributed geospatial data resources on the Web. We first formulate this data-access problem and introduce its indispensable elements, including identifying the cyber-location, space and time coverage, theme, and quality of the dataset. We then propose strategies to tackle each data-access issue and make the data more discoverable and usable for geospatial data users and decision makers. Among these strategies is large-scale web crawling, a key technique to support automatic collection of online geospatial data that are highly distributed, intrinsically heterogeneous, and known to be dynamic. To better understand the content and scientific meaning of the data, methods including space-time filtering, ontology-based thematic classification, and service quality evaluation are incorporated. To serve a broad scientific user community, these techniques are integrated into an operational data crawling system, PolarHub, which is also an important cyberinfrastructure building block to support effective data discovery. A series of experiments were conducted to demonstrate the outstanding performance of the PolarHub system. We expect this work to contribute significantly to building the theoretical and methodological foundation for data-driven geography and the emerging spatial data science.
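One step such a crawler must perform after discovering a service endpoint is reading its capabilities document and space-time filtering the advertised layers. The sketch below is heavily simplified and is not PolarHub's actual code: the XML fragment is a toy stand-in (real OGC capabilities documents are namespaced and far richer), but the bounding-box overlap test is the standard spatial filter.

```python
# Simplified sketch: parse a capabilities-like document and keep only layers
# whose bounding box overlaps a query region (the "space" part of
# space-time filtering).
import xml.etree.ElementTree as ET

capabilities = """
<Capabilities>
  <Layer name="sea_ice"><BBox minx="-180" miny="60" maxx="180" maxy="90"/></Layer>
  <Layer name="land_cover"><BBox minx="-20" miny="-35" maxx="55" maxy="38"/></Layer>
</Capabilities>"""

def layers_intersecting(xml_text, qminx, qminy, qmaxx, qmaxy):
    """Return names of layers whose bounding box overlaps the query box."""
    hits = []
    for layer in ET.fromstring(xml_text).iter("Layer"):
        b = layer.find("BBox").attrib
        if (float(b["minx"]) <= qmaxx and float(b["maxx"]) >= qminx and
                float(b["miny"]) <= qmaxy and float(b["maxy"]) >= qminy):
            hits.append(layer.get("name"))
    return hits

# A polar query region keeps only the polar layer:
print(layers_intersecting(capabilities, -180, 65, 180, 90))  # ['sea_ice']
```

Running the same filter with thematic (ontology-based) and quality checks layered on top gives the discoverability pipeline the abstract describes.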

  2. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    Science.gov (United States)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub, or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and in educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.
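The abstract does not describe iData's internals, so the following is a hypothetical sketch of the kind of metadata extraction it performs: pulling a bounding box out of an uploaded GeoJSON file so the record can be indexed for map-based search. The function name and the point-only handling are illustrative assumptions.

```python
# Hypothetical metadata-extraction step: compute the spatial extent of a
# GeoJSON upload (points only, for brevity) for use in a map-based index.
import json

def geojson_bbox(text: str):
    """Return (minx, miny, maxx, maxy) over all Point coordinates."""
    features = json.loads(text)["features"]
    xs, ys = [], []
    for f in features:
        geom = f["geometry"]
        if geom["type"] == "Point":
            x, y = geom["coordinates"]
            xs.append(x)
            ys.append(y)
    return (min(xs), min(ys), max(xs), max(ys))

doc = json.dumps({"type": "FeatureCollection", "features": [
    {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-86.9, 40.4]}},
    {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-87.3, 41.0]}},
]})
print(geojson_bbox(doc))  # (-87.3, 40.4, -86.9, 41.0)
```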

  3. Geospatial Brokering - Challenges and Future Directions

    Science.gov (United States)

    White, C. E.

    2012-12-01

An important feature of many brokers is facilitating straightforward human access to scientific data while maintaining programmatic access for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols, e.g., OGC CSW, REST, and OpenSearch, to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight into the future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.
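Of the protocols named above, OpenSearch is the simplest to illustrate: a service advertises a URL template with placeholders that any client, human-driven or programmatic, can fill in. The endpoint below is a made-up example, not a real service; only the placeholder convention (`{searchTerms}`, `{startIndex}`) follows the OpenSearch specification.

```python
# Fill an OpenSearch URL template with percent-encoded parameter values.
from urllib.parse import quote

TEMPLATE = "https://example.org/search?q={searchTerms}&start={startIndex}&fmt=atom"

def fill_opensearch(template: str, **params) -> str:
    url = template
    for key, value in params.items():
        url = url.replace("{" + key + "}", quote(str(value)))
    return url

print(fill_opensearch(TEMPLATE, searchTerms="sea surface temperature", startIndex=1))
# https://example.org/search?q=sea%20surface%20temperature&start=1&fmt=atom
```

This same URL serves both audiences the abstract mentions: a browser renders the Atom response for a person, while a broker consumes it programmatically.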

  4. Large-scale drivers of malaria and priority areas for prevention and control in the Brazilian Amazon region using a novel multi-pathogen geospatial model.

    Science.gov (United States)

    Valle, Denis; Lima, Joanna M Tucker

    2014-11-20

Most of the malaria burden in the Americas is concentrated in the Brazilian Amazon, but a detailed spatial characterization of malaria risk has yet to be undertaken. Utilizing 2004-2008 malaria incidence data collected from six Brazilian Amazon states, large-scale spatial patterns of malaria risk were characterized with a novel Bayesian multi-pathogen geospatial model. Data included 2.4 million malaria cases spread across 3.6 million sq km. Remotely sensed variables (deforestation rate, forest cover, rainfall, dry season length, and proximity to large water bodies), socio-economic variables (rural population size, income, literacy rate, under-five child mortality rate, and migration patterns), and GIS variables (proximity to roads, hydro-electric dams and gold mining operations) were incorporated as covariates. Borrowing information across pathogens allowed for better spatial predictions of malaria caused by Plasmodium falciparum, as evidenced by ten-fold cross-validation. Malaria incidence for both Plasmodium vivax and P. falciparum tended to be higher in areas with greater forest cover. Proximity to gold mining operations was another important risk factor, corroborated by a positive association between migration rates and malaria incidence. Finally, areas with a longer dry season and areas with higher average rural income tended to have higher malaria risk. Risk maps reveal striking spatial heterogeneity in malaria risk across the region, yet these mean disease risk surface maps can be misleading if uncertainty is ignored. By combining mean spatial predictions with their associated uncertainty, several sites were consistently classified as hotspots, suggesting their importance as priority areas for malaria prevention and control. This article provides several contributions. From a methodological perspective, the benefits of jointly modelling multiple pathogens for spatial predictions were illustrated. In addition, maps of mean disease risk were

  5. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin; Kuester, Falk

    2010-01-01

    data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented

  6. 3D geospatial visualizations: Animation and motion effects on spatial objects

    Science.gov (United States)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

Digital Elevation Models (DEMs), in combination with high-quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an amazing navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (JavaScript) makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). One step beyond, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be attributed to 3D models. However, major GIS-based functionalities combined with all the above-mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) or motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this end, we developed and made available to the research community an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models on user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.
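The motion effects described above ultimately reduce to sampling a georeferenced path once per animation frame. The sketch below linearly interpolates a (lon, lat, elevation) position along a polyline of waypoints by a parameter t in [0, 1]; it is a toy stand-in for what WebGL frameworks provide natively (e.g. Cesium's sampled position properties offer richer interpolation).

```python
# Interpolate a position along a polyline of (lon, lat, elev) waypoints.
def sample_path(waypoints, t):
    """Return the interpolated position at parameter t in [0, 1]."""
    if t <= 0:
        return waypoints[0]
    if t >= 1:
        return waypoints[-1]
    segments = len(waypoints) - 1
    pos = t * segments          # which segment, and how far into it
    i = int(pos)
    frac = pos - i
    a, b = waypoints[i], waypoints[i + 1]
    return tuple(a[k] + frac * (b[k] - a[k]) for k in range(3))

path = [(23.0, 40.5, 100.0), (23.2, 40.6, 150.0), (23.4, 40.8, 120.0)]
print(sample_path(path, 0.25))  # halfway along the first segment
```

Calling this once per render tick with an increasing t is what moves "the animal over the hill" along its georeferenced terrain path.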

  7. Evaluating the Open Source Data Containers for Handling Big Geospatial Raster Data

    Directory of Open Access Journals (Sweden)

    Fei Hu

    2018-04-01

Big geospatial raster data pose a grand challenge to data management technologies for effective big data query and processing. To address these challenges, various big data container solutions have been developed or enhanced to facilitate data storage, retrieval, and analysis, including containers developed or enhanced to handle geospatial data: for example, Rasdaman was developed to handle raster data, and GeoSpark/SpatialHadoop were enhanced from Spark/Hadoop to handle vector data. However, there are few studies that systematically compare and evaluate the features and performance of these popular data containers. This paper provides a comprehensive evaluation of six popular data containers (i.e., Rasdaman, SciDB, Spark, ClimateSpark, Hive, and MongoDB) for handling multi-dimensional, array-based geospatial raster datasets. Their architectures, technologies, capabilities, and performance are compared and evaluated from two perspectives: (a) system design and architecture (distributed architecture, logical data model, physical data model, and data operations); and (b) practical use experience and performance (data preprocessing, data uploading, query speed, and resource consumption). Four major conclusions are offered: (1) no data container except ClimateSpark has good support for the HDF data format used in this paper, requiring time- and resource-consuming data preprocessing to load data; (2) SciDB, Rasdaman, and MongoDB handle small/moderate volumes of data query well, whereas Spark and ClimateSpark can handle large volumes of data with stable resource consumption; (3) SciDB and Rasdaman provide mature array-based data operation and analytical functions, while the others lack these functions for users; and (4) SciDB, Spark, and Hive have better support of user-defined functions (UDFs) to extend the system capability.

  8. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    Science.gov (United States)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard, developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. 
During
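The hierarchical cell indexing described above can be sketched with a toy quadtree over a lon/lat rectangle: each extra digit of the cell identifier halves the cell in both directions, and nearby points share an identifier prefix. This is only an illustration of the indexing idea; real DGGSs conforming to the OGC standard use equal-area tessellations of the globe, which this lat/lon sketch does not.

```python
# Toy hierarchical cell index: one quadtree digit per level of refinement.
def cell_id(lon, lat, depth):
    """Return a quadtree digit string locating (lon, lat) at a given depth."""
    w, e, s, n = -180.0, 180.0, -90.0, 90.0
    digits = []
    for _ in range(depth):
        mx, my = (w + e) / 2, (s + n) / 2
        q = 0
        if lon >= mx:
            q += 1
            w = mx
        else:
            e = mx
        if lat >= my:
            q += 2
            s = my
        else:
            n = my
        digits.append(str(q))
    return "".join(digits)

# Nearby points share a prefix, so prefix matching doubles as aggregation:
print(cell_id(2.35, 48.85, 4))   # central Paris
print(cell_id(2.29, 48.86, 4))   # a nearby point: same leading cells
```

Aggregating records by identifier prefix is the mechanism that makes decomposition and aggregation cheap in a hierarchical grid.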

  9. EnviroAtlas: Providing Nationwide Geospatial Ecosystem Goods and Services Indicators and Indices to Inform Decision-Making, Research, and Education

    Science.gov (United States)

    EnviroAtlas is a multi-organization effort led by the US Environmental Protection Agency to develop, host and display a large suite of nation-wide geospatial indicators and indices of ecosystem services. This open access tool allows users to view, analyze, and download a wealth o...

  10. NativeView: A Geospatial Curriculum for Native Nation Building

    Science.gov (United States)

    Rattling Leaf, J.

    2007-12-01

In the spirit of collaboration and reciprocity, James Rattling Leaf of Sinte Gleska University on the Rosebud Reservation of South Dakota will present recent developments, experiences, insights and a vision for education in Indian Country. A thirty-year-young institution, Sinte Gleska University was founded on a strong vision of ancestral leadership and the values of the Lakota Way of Life. Sinte Gleska University (SGU) has initiated the development of a Geospatial Education Curriculum project. NativeView: A Geospatial Curriculum for Native Nation Building is a two-year project that entails a disciplined approach to the development of a relevant geospatial academic curriculum. This project is designed to meet the educational and land management needs of the Rosebud Lakota Tribe through the utilization of Geographic Information Systems (GIS), Remote Sensing (RS) and Global Positioning Systems (GPS). In conjunction with the strategy and progress of this academic project, a formal presentation and demonstration of the SGU-based geospatial software RezMapper will exemplify an innovative example of state-of-the-art information technology. RezMapper is an interactive CD software package focused on the 21 Lakota communities on the Rosebud Reservation that utilizes an ingenious concept of multimedia mapping and state-of-the-art data compression and presentation. This ongoing development utilizes geographic data, imagery from space, historical aerial photography and cultural features such as historic Lakota documents, language, song, video and historical photographs in a multimedia fashion. As a tangible product, RezMapper will be a project deliverable tool for use in the classroom and by a broad range of learners.

  11. The new geospatial tools: global transparency enhancing safeguards verification

    International Nuclear Information System (INIS)

    Pabian, Frank Vincent

    2010-01-01

This paper focuses on the importance and potential role of the new, freely available geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency', and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as to support ongoing monitoring and verification of various treaty-relevant (e.g., NPT, FMCT) activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how value-added follow-up information can be derived from recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible following the acquisition of higher-resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  12. Visual exposure to large and small portion sizes and perceptions of portion size normality: Three experimental studies

    OpenAIRE

    Robinson, Eric; Oldham, Melissa; Cuckson, Imogen; Brunstrom, Jeffrey M.; Rogers, Peter J.; Hardman, Charlotte A.

    2016-01-01

    Portion sizes of many foods have increased in recent times. In three studies we examined the effect that repeated visual exposure to larger versus smaller food portion sizes has on perceptions of what constitutes a normal-sized food portion and measures of portion size selection. In studies 1 and 2 participants were visually exposed to images of large or small portions of spaghetti bolognese, before making evaluations about an image of an intermediate sized portion of the same food. In study ...

  13. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    Science.gov (United States)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

Geospatial data resources are the foundation of the construction of a geo portal designed to provide online geoinformation services for government, enterprises and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation, geo search and so on. One of the major problems we face is data acquisition, and integrating multi-source geospatial data is our main means of data acquisition. This paper introduces a practical integration approach for multi-source geospatial data with different data models, structures and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geo portal, providing online geoinformation services based on the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data comes from these nodes, and the different datasets are heterogeneous. According to the results of analysis of the heterogeneous datasets, the first step is to define the basic principles of data fusion, covering the following aspects: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. A technical procedure was then developed, and a method for processing different categories of features such as roads, railways, boundaries, rivers, settlements and buildings is proposed based on these principles. A case study in Jiangsu province demonstrated the applicability of the principles, procedure and method of multi-source geospatial data integration.
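The first fusion principle listed above, location precision, typically translates into a positional-tolerance rule when conflating features from two source nodes. The sketch below is a hypothetical illustration of such a rule (the feature names, coordinates, and 50 m tolerance are invented), not the NGISP method itself.

```python
# Match features from two sources when their projected coordinates agree
# within a positional tolerance (metres).
import math

def match_features(source_a, source_b, tolerance_m=50.0):
    """Pair features whose coordinates fall within tolerance_m of each other."""
    matches = []
    for name_a, (xa, ya) in source_a.items():
        for name_b, (xb, yb) in source_b.items():
            if math.hypot(xa - xb, ya - yb) <= tolerance_m:
                matches.append((name_a, name_b))
    return matches

national = {"bridge_101": (500120.0, 3545880.0)}
provincial = {"qiao_A": (500150.0, 3545900.0), "qiao_B": (501000.0, 3546500.0)}
print(match_features(national, provincial))  # [('bridge_101', 'qiao_A')]
```

Once candidate pairs are found this way, the remaining principles (geometry, currency, attributes, spatial relationships) decide which representation survives in the fused dataset.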

  14. Recent innovation of geospatial information technology to support disaster risk management and responses

    Science.gov (United States)

    Une, Hiroshi; Nakano, Takayuki

    2018-05-01

Geographic location is one of the most fundamental and indispensable information elements in the field of disaster response and prevention. For example, in the case of the Tohoku Earthquake in 2011, aerial photos taken immediately after the earthquake greatly improved information sharing among different government offices and facilitated rescue and recovery operations, and maps prepared after the disaster assisted in the rapid reconstruction of affected local communities. Thanks to the recent development of geospatial information technology, this information has become even more essential for disaster response activities. Advancements in web mapping technology allow us to better understand the situation by overlaying various location-specific data on base maps on the web and specifying the areas on which activities should be focused. Through 3-D modelling technology, we can gain a more realistic understanding of the relationship between disaster and topography. Geospatial information technology can support proper preparation and emergency responses to disasters by individuals and local communities through hazard mapping and other information services using mobile devices. Thus, geospatial information technology is playing an ever more vital role in all stages of disaster risk management and response. In acknowledging this vital role in disaster risk reduction, the Sendai Framework for Disaster Risk Reduction 2015-2030, adopted at the Third United Nations World Conference on Disaster Risk Reduction, repeatedly highlights the importance of utilizing geospatial information technology for disaster risk reduction. This presentation reports recent practical applications of geospatial information technology for disaster risk management and responses.

  15. Developing a distributed HTML5-based search engine for geospatial resource discovery

    Science.gov (United States)

    ZHOU, N.; XIA, J.; Nebert, D.; Yang, C.; Gui, Z.; Liu, K.

    2013-12-01

With the explosive growth of data, Geospatial Cyberinfrastructure (GCI) components are developed to manage geospatial resources, such as data discovery and data publishing. However, efficient geospatial resource discovery is still challenging in that: (1) existing GCIs are usually developed for users of specific domains, so users may have to visit a number of GCIs to find appropriate resources; (2) the complexity of the decentralized network environment usually results in slow response and poor user experience; (3) users who use different browsers and devices may have very different user experiences because of the diversity of front-end platforms (e.g. Silverlight, Flash or HTML). To address these issues, we developed a distributed, HTML5-based search engine. Specifically, (1) the search engine adopts a brokering approach to retrieve geospatial metadata from various distributed GCIs; (2) an asynchronous record retrieval mode enhances search performance and user interactivity; and (3) being based on HTML5, the search engine is able to provide unified access capabilities for users with different devices (e.g. tablet and smartphone).
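The brokered, asynchronous retrieval described above can be sketched as a fan-out that queries several catalogues concurrently and merges records as each responds. The catalogues and delays below are simulated stand-ins; a real broker would issue HTTP requests against each GCI's search API.

```python
# Minimal sketch of asynchronous brokered search: query catalogues
# concurrently and merge results as they arrive.
import asyncio

async def query_catalogue(name, delay, results):
    await asyncio.sleep(delay)          # stands in for network latency
    return [f"{name}:{r}" for r in results]

async def broker_search():
    tasks = [
        query_catalogue("gcmd", 0.02, ["sea_ice_extent"]),
        query_catalogue("csw_node", 0.01, ["dem_30m", "landsat_scene"]),
    ]
    merged = []
    # as_completed lets the UI show records from fast catalogues immediately,
    # instead of waiting for the slowest federated node.
    for finished in asyncio.as_completed(tasks):
        merged.extend(await finished)
    return merged

records = asyncio.run(broker_search())
print(sorted(records))
```

Streaming results in completion order is what keeps the interface responsive even when one federated node in the decentralized network is slow.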

  16. The Efficacy of Educative Curriculum Materials to Support Geospatial Science Pedagogical Content Knowledge

    Science.gov (United States)

    Bodzin, Alec; Peffer, Tamara; Kulo, Violet

    2012-01-01

    Teaching and learning about geospatial aspects of energy resource issues requires that science teachers apply effective science pedagogical approaches to implement geospatial technologies into classroom instruction. To address this need, we designed educative curriculum materials as an integral part of a comprehensive middle school energy…

  17. Nansat: a Scientist-Orientated Python Package for Geospatial Data Processing

    Directory of Open Access Journals (Sweden)

    Anton A. Korosov

    2016-10-01

Nansat is a Python toolbox for analysing and processing 2-dimensional geospatial data, such as satellite imagery, output from numerical models, and gridded in-situ data. It is created with a strong focus on facilitating research and the development of algorithms and autonomous processing systems. Nansat extends the widely used Geospatial Data Abstraction Library (GDAL) by adding scientific meaning to the datasets through metadata, and by adding common functionality for data analysis and handling (e.g., exporting to various data formats). Nansat uses metadata vocabularies that follow international metadata standards, in particular the Climate and Forecast (CF) conventions, and the NASA Directory Interchange Format (DIF) and Global Change Master Directory (GCMD) keywords. Functionality that is commonly needed in scientific work, such as seamless access to local or remote geospatial data in various file formats, collocation of datasets from different sources and geometries, and visualization, is also built into Nansat. The paper presents Nansat workflows, its functional structure, and examples of typical applications.

  18. The Future of Geospatial Standards

    Science.gov (United States)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations, with more than 6,000 people registered on the OGC communication platform, drives the development of standards that are freely available for anyone to use and that improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, adapting to the particular needs of each community while ensuring interoperability through common base models and appropriate support services. Other standards address orthogonal aspects such as the handling of Big Data, crowd-sourced information, geosemantics, or containers for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP), wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can then be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium, including standards and testbeds.

  19. lawn: An R client for the Turf JavaScript Library for Geospatial Analysis

    Science.gov (United States)

    lawn is an R package providing access to the geospatial analysis capabilities of the Turf JavaScript library. Turf expects data in GeoJSON format. Given that many datasets are now available natively in GeoJSON, providing an easier method for conducting geospatial analyses on thes...

  20. A Geospatial Data Recommender System based on Metadata and User Behaviour

    Science.gov (United States)

    Li, Y.; Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; Finch, C. J.; McGibbney, L. J.

    2017-12-01

    Earth observations are produced at a fast velocity by real-time sensors, reaching terabytes to petabytes of geospatial data daily. Discovering and accessing the right data within this massive volume is like finding a needle in a haystack. To help researchers find the right data for study and decision support, a good deal of research focused on improving search performance has been proposed, including recommendation algorithms. However, few papers have discussed how to implement a recommendation algorithm in a geospatial data retrieval system. To address this problem, we propose a recommendation engine that improves the discovery of relevant geospatial data by mining and utilizing metadata and user behaviour data: 1) metadata-based recommendation considers the correlation of each attribute (i.e., spatiotemporal, categorical, and ordinal) to the data to be found; in particular, a phrase-extraction method is used to improve the accuracy of the description similarity; 2) user behaviour data are utilized to predict the interest of a user through collaborative filtering; 3) an integration method is designed to combine the results of the above two methods to achieve better recommendations. Experiments show that in the hybrid recommendation list, all precisions from position 1 to 10 are larger than 0.8.
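
    A minimal sketch of the hybrid approach, assuming Jaccard similarity over metadata keywords, user-based collaborative filtering over explicit ratings, and a simple linear blend (the paper's actual similarity measures and weighting scheme are not reproduced here):

```python
from math import sqrt

def keyword_similarity(meta_a, meta_b):
    # Jaccard similarity over categorical metadata keywords.
    a, b = set(meta_a["keywords"]), set(meta_b["keywords"])
    return len(a & b) / len(a | b) if a | b else 0.0

def cf_score(target_user, item, ratings):
    # User-based collaborative filtering: weight other users' ratings of
    # `item` by their cosine similarity to the target user's ratings.
    def cosine(u, v):
        common = set(u) & set(v)
        num = sum(u[i] * v[i] for i in common)
        den = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
        return num / den if den else 0.0
    num = den = 0.0
    for user, rated in ratings.items():
        if user == target_user or item not in rated:
            continue
        w = cosine(ratings[target_user], rated)
        num += w * rated[item]
        den += abs(w)
    return num / den if den else 0.0

def hybrid_score(meta_sim, cf, alpha=0.5):
    # Linear blend of the two signals; alpha is a tunable weight.
    return alpha * meta_sim + (1 - alpha) * cf
```

A production system would add the spatiotemporal attribute correlations and phrase-based description similarity described above as further terms in the blend.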

  1. KINGDOM OF SAUDI ARABIA GEOSPATIAL INFORMATION INFRASTRUCTURE – AN INITIAL STUDY

    Directory of Open Access Journals (Sweden)

    S. H. Alsultan

    2015-10-01

    Full Text Available This paper reviews the current Geographic Information System (GIS) implementation and status in the Kingdom of Saudi Arabia (KSA). Based on the review, several problems were identified and discussed. The characteristics of these problems show that the country needs a national geospatial centre. As a new initiative for a national geospatial centre, a study is being conducted, especially on best practice from other countries, the availability of a national committee for standards and policies on data sharing, and the best proposed organizational structure inside the administration of the KSA. The study also covers the degree of readiness and awareness among the main GIS stakeholders within the country as well as private parties. At the end of this paper, strategic steps for the national geospatial management centre are proposed as the initial output of the study.

  2. A Geospatial Decision Support System Toolkit, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and commercialize a working prototype Geospatial Decision Support Toolkit (GeoKit). GeoKit will enable scientists, agencies, and stakeholders to...

  3. The new geospatial tools: global transparency enhancing safeguards verification

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank Vincent [Los Alamos National Laboratory

    2010-09-16

    This paper focuses on the importance and potential role of the new, freely available geospatial tools for enhancing IAEA safeguards, and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency', and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as to support ongoing monitoring and verification of various treaty-relevant (e.g., NPT, FMCT) activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'tourist attraction' given the site's abandonment by China in the early 1980s). That open-source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting, with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible following the acquisition of higher-resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  4. Reliable pipeline repair system for very large pipe size

    Energy Technology Data Exchange (ETDEWEB)

    Charalambides, John N.; Sousa, Alexandre Barreto de [Oceaneering International, Inc., Houston, TX (United States)

    2004-07-01

    The oil and gas industry worldwide has mainly depended on the long-term reliability of rigid pipelines to ensure the transportation of hydrocarbons, crude oil, gas, fuel, etc. Many other methods are also utilized onshore and offshore (e.g. flexible lines, FPSOs, etc.), but when it comes to the underwater transportation of very high volumes of oil and gas, the industry commonly uses large-size rigid pipelines (i.e. steel pipes). Oil and gas operators have learned to depend on the long-lasting integrity of these very large pipelines, and many times they forget or disregard that even steel pipelines degrade over time and, more often than not, are also susceptible to various forms of damage (minor or major, environmental or external, etc.). Over recent years the industry has recognized the need to implement an 'emergency repair plan' to account for such unforeseen events, and oil and gas operators have become 'smarter' by being 'pro-active' in order to ensure 'flow assurance'. When we consider very large diameter steel pipelines such as 42'' and 48'' nominal pipe size (NPS), the industry worldwide does not provide 'ready-made', 'off-the-shelf' repair hardware that can be easily shipped to the offshore location to effect a major repair within acceptable time frames and avoid substantial profit losses due to 'down-time' in production. The typical time required to establish a solid repair system for large pipe diameters could be as long as six or more months (depending on the availability of raw materials). This paper will present in detail the Emergency Pipeline Repair Systems (EPRS) that Oceaneering successfully designed, manufactured, tested, and provided to two major oil and gas operators, located on two different continents (Gulf of Mexico, U.S.A. and Arabian Gulf, U.A.E.), for two different very large pipe sizes (42'' and 48'' nominal pipe sizes).

  5. Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals

    Science.gov (United States)

    Zamyadi, A.; Pouliot, J.; Bédard, Y.

    2013-09-01

    Accessing 3D geospatial models, eventually at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR, or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information differ greatly from one geo-portal to another, as well as for similar 3D resources within the same geo-portal. The inventory considered 971 data resources affiliated with elevation. 51% of them were from three geo-portals running at Canadian federal and municipal levels whose metadata resources did not consider 3D models by any definition. Among the remaining 49%, which do refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve 3D geospatial model discovery in geo-portals by adding a specific metadata-set. Based on the knowledge and current practices in 3D modeling and 3D data acquisition and management, a set of metadata is proposed to increase its suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes, grouped into three packages: General and Complementary, covering contextual and structural information, and Availability, covering the transition from storage to delivery format. 
The proposed metadata-set is compared with Canadian Geospatial

  6. Technologies Connotation and Developing Characteristics of Open Geospatial Information Platform

    Directory of Open Access Journals (Sweden)

    GUO Renzhong

    2016-02-01

    Full Text Available Against the background of developments in surveying, mapping and geoinformation, and aimed at the demands of data fusion, real-time sharing, in-depth processing and personalization, this paper analyzes significant features of geospatial services in the digital city and focuses on the theory, methods and key techniques of an open cloud-computing environment, multi-path data updating, full-scale urban geocoding, multi-source spatial data integration, adaptive geo-processing and adaptive Web mapping. On this basis, the Open Geospatial information platform is developed and successfully applied in Digital Shenzhen.

  7. Assessing the socioeconomic impact and value of open geospatial information

    Science.gov (United States)

    Pearlman, Francoise; Pearlman, Jay; Bernknopf, Richard; Coote, Andrew; Craglia, Massimo; Friedl, Lawrence; Gallo, Jason; Hertzfeld, Henry; Jolly, Claire; Macauley, Molly K.; Shapiro, Carl; Smart, Alan

    2016-03-10

    The production and accessibility of geospatial information including Earth observation is changing greatly both technically and in terms of human participation. Advances in technology have changed the way that geospatial data are produced and accessed, resulting in more efficient processes and greater accessibility than ever before. Improved technology has also created opportunities for increased participation in the gathering and interpretation of data through crowdsourcing and citizen science efforts. Increased accessibility has resulted in greater participation in the use of data as prices for Government-produced data have fallen and barriers to access have been reduced.

  8. Sextant: Visualizing time-evolving linked geospatial data

    NARCIS (Netherlands)

    C. Nikolaou (Charalampos); K. Dogani (Kallirroi); K. Bereta (Konstantina); G. Garbis (George); M. Karpathiotakis (Manos); K. Kyzirakos (Konstantinos); M. Koubarakis (Manolis)

    2015-01-01

    textabstractThe linked open data cloud is constantly evolving as datasets get continuously updated with newer versions. As a result, representing, querying, and visualizing the temporal dimension of linked data is crucial. This is especially important for geospatial datasets that form the backbone

  9. Geospatial metadata retrieval from web services

    Directory of Open Access Journals (Sweden)

    Ivanildo Barbosa

    Full Text Available Nowadays, producers of geospatial data in either raster or vector formats are able to make them available on the World Wide Web by deploying web services that enable users to access and query those contents even without specific geoprocessing software. Several providers around the world have deployed instances of WMS (Web Map Service), WFS (Web Feature Service) and WCS (Web Coverage Service), all specified by the Open Geospatial Consortium (OGC). In consequence, metadata about the available contents can be retrieved and compared with similar offline datasets from other sources. This paper presents a brief summary of, and describes, the matching process between the specifications for OGC web services (WMS, WFS and WCS) and the metadata specifications required by ISO 19115 - adopted as the reference for several national metadata profiles, including the Brazilian one. This process focuses on retrieving metadata for the identification and data quality packages, and indicates directions for retrieving metadata related to other packages. Thus, users are able to assess whether the provided contents fit their purposes.
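
    As an illustration of the kind of matching described, the sketch below extracts identification and extent fields from a simplified, namespace-free WMS GetCapabilities fragment; a real parser must handle the http://www.opengis.net/wms namespace and many more elements before mapping onto ISO 19115 packages:

```python
import xml.etree.ElementTree as ET

# A simplified WMS 1.3.0 GetCapabilities fragment (namespaces omitted).
CAPS = """
<WMS_Capabilities version="1.3.0">
  <Capability>
    <Layer>
      <Layer>
        <Name>rivers</Name>
        <Title>Hydrography</Title>
        <EX_GeographicBoundingBox>
          <westBoundLongitude>-74.0</westBoundLongitude>
          <eastBoundLongitude>-33.0</eastBoundLongitude>
          <southBoundLatitude>-34.0</southBoundLatitude>
          <northBoundLatitude>5.0</northBoundLatitude>
        </EX_GeographicBoundingBox>
      </Layer>
    </Layer>
  </Capability>
</WMS_Capabilities>
"""

def extract_layer_metadata(xml_text):
    # Pull identification (name/title) and geographic extent for each
    # named layer, the kind of fields mapped onto ISO 19115 metadata.
    root = ET.fromstring(xml_text)
    layers = []
    for layer in root.iter("Layer"):
        name = layer.find("Name")
        if name is None:
            continue  # container layers carry no Name of their own
        bbox = layer.find("EX_GeographicBoundingBox")
        extent = {c.tag: float(c.text) for c in bbox} if bbox is not None else {}
        layers.append({"name": name.text, "title": layer.findtext("Title"),
                       "extent": extent})
    return layers
```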

  10. A cross-sectional ecological analysis of international and sub-national health inequalities in commercial geospatial resource availability.

    Science.gov (United States)

    Dotse-Gborgbortsi, Winfred; Wardrop, Nicola; Adewole, Ademola; Thomas, Mair L H; Wright, Jim

    2018-05-23

    Commercial geospatial data resources are frequently used to understand healthcare utilisation. Although there is widespread evidence of a digital divide for other digital resources and infrastructure, it is unclear how commercial geospatial data resources are distributed relative to health need. To examine this distribution, we assembled coverage and quality metrics for commercial geocoding, neighbourhood characterisation, and travel time calculation resources for 183 countries. We developed a country-level, composite index of commercial geospatial data quality/availability and examined its distribution relative to age-standardised all-cause and cause-specific (for the three main causes of death) mortality using two inequality metrics, the slope index of inequality and the relative concentration index. In two sub-national case studies, we also examined geocoding success rates versus area deprivation by district in Eastern Region, Ghana and Lagos State, Nigeria. Internationally, commercial geospatial data resources were inversely related to all-cause mortality. This relationship was more pronounced when examining mortality due to communicable diseases. Commercial geospatial data resources for calculating patient travel times were more equitably distributed relative to health need than resources for characterising neighbourhoods or geocoding patient addresses. Countries such as South Africa have comparatively high commercial geospatial data availability despite high mortality, whilst countries such as South Korea have comparatively low data availability and low mortality. Sub-nationally, evidence was mixed as to whether geocoding success was lowest in more deprived districts. To our knowledge, this is the first global analysis of commercial geospatial data resources in relation to health outcomes.
In countries such as South Africa where there is high mortality but also comparatively rich commercial geospatial
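
    One of the two inequality metrics used above, the relative concentration index, is commonly computed via the standard covariance formula C = 2·cov(y, r)/μ, where r is the fractional rank of each unit when ordered by the socioeconomic variable and μ is the mean of y. A minimal sketch (the study's own data and ranking variable are not reproduced here):

```python
def relative_concentration_index(values, ranks=None):
    # C = 2 * cov(y, r) / mean(y); `values` must be ordered by the
    # socioeconomic ranking variable. Default fractional ranks place
    # unit i at (i + 0.5) / n. C > 0 means the quantity is concentrated
    # among higher-ranked units; C = 0 means an equal distribution.
    n = len(values)
    r = ranks or [(i + 0.5) / n for i in range(n)]
    mean_y = sum(values) / n
    mean_r = sum(r) / n
    cov = sum(y * ri for y, ri in zip(values, r)) / n - mean_y * mean_r
    return 2 * cov / mean_y if mean_y else 0.0
```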

  11. Research and Practical Trends in Geospatial Sciences

    Science.gov (United States)

    Karpik, A. P.; Musikhin, I. A.

    2016-06-01

    In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and the rapid evolution of geospatial science. Almost all professional communities have been affected. Communities are embracing digital techniques, modern equipment, software and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of those trends that will be in great demand in 3-7 years. This paper reviews the current scientific and practical activities of such non-governmental international organizations as the International Federation of Surveyors, the International Cartographic Association, and the International Society for Photogrammetry and Remote Sensing; analyzes and groups the most relevant topics brought up at their scientific events; forecasts the most probable research and practical trends in the geospatial sciences; and identifies the leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation and joint implementation projects.

  12. Higher albedos and size distribution of large transneptunian objects

    Science.gov (United States)

    Lykawka, Patryk Sofia; Mukai, Tadashi

    2005-11-01

    Transneptunian objects (TNOs) orbit beyond Neptune and offer important clues about the formation of our solar system. Although observations have been increasing the number of discovered TNOs and improving their orbital elements, very little is known about elementary physical properties such as sizes, albedos and compositions. Due to TNOs' large distances (>40 AU) and observational limitations, reliable physical information can be obtained only from brighter objects (supposedly larger bodies). According to the size and albedo measurements available, it is evident that the traditionally assumed albedo p=0.04 cannot hold for all TNOs, especially those with absolute magnitudes H⩽5.5. That is, the largest TNOs possess higher albedos (generally >0.04) that strongly appear to increase as a function of size. Using a compilation of published data, we derived empirical relations which provide estimations of diameters and albedos as a function of absolute magnitude. The calculations result in more accurate size/albedo estimations for TNOs with H⩽5.5 than just assuming p=0.04. Nevertheless, considering the low statistics, the value p=0.04 remains convenient for H>5.5 non-binary TNOs as a group. We also discuss the physical processes (e.g., collisions, intrinsic activity and the presence of tenuous atmospheres) responsible for the increase of albedo among large bodies. Currently, all big TNOs (>700 km) would be capable of sustaining thin atmospheres or icy frosts composed of CH4, CO or N2, even for body bulk densities as low as 0.5 g cm-3. A size-dependent albedo has important consequences for the TNO size distribution, cumulative luminosity function and total mass estimations. According to our analysis, the latter can be reduced by up to 50% if higher albedos are common among large bodies. Lastly, by analyzing orbital properties of classical TNOs (42 AU… bodies. For both populations, distinct absolute magnitude distributions are maximized for an inclination threshold
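
    The paper's fitted empirical relations are not reproduced in the abstract, but the standard conversion between absolute magnitude H, geometric albedo p, and diameter D on which such estimates build is D = (1329 km / √p) · 10^(−H/5), which shows directly why assuming p = 0.04 for a high-albedo body overestimates its size:

```python
from math import sqrt

def diameter_km(H, albedo=0.04):
    # Standard relation between absolute magnitude H, geometric albedo p,
    # and diameter D: D = (1329 / sqrt(p)) * 10**(-H / 5)   [km].
    return 1329.0 / sqrt(albedo) * 10 ** (-H / 5.0)
```

For example, at the H = 5.5 boundary discussed above, the traditional p = 0.04 gives a diameter of roughly 530 km; a higher albedo of 0.16 would halve that estimate.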

  13. Streamlining geospatial metadata in the Semantic Web

    Science.gov (United States)

    Fugazza, Cristiano; Pepe, Monica; Oggioni, Alessandro; Tagliolato, Paolo; Carrara, Paola

    2016-04-01

    In the geospatial realm, data annotation and discovery rely on a number of ad-hoc formats and protocols. These have been created to enable domain-specific use cases for which generalized search is not feasible. Metadata are at the heart of the discovery process, and nevertheless they are often neglected or encoded in formats that either are not aimed at efficient retrieval of resources or are plainly outdated. In particular, the quantum leap represented by the Linked Open Data (LOD) movement has so far not induced a consistent, interlinked baseline in the geospatial domain. In a nutshell, datasets, the scientific literature related to them, and ultimately the researchers behind these products are only loosely connected; the corresponding metadata are intelligible only to humans and duplicated on different systems, seldom consistently. Instead, our workflow for metadata management envisages i) editing via customizable web-based forms, ii) encoding of records in any XML application profile, iii) translation into RDF (involving the semantic lift of metadata records), and finally iv) storage of the metadata as RDF and back-translation into the original XML format with added semantics-aware features. Phase iii) hinges on relating resource metadata to RDF data structures that represent keywords from code lists and controlled vocabularies, toponyms, researchers, institutes, and virtually any description one can retrieve (or directly publish) in the LOD Cloud. In the context of a distributed Spatial Data Infrastructure (SDI) built on free and open-source software, we detail phases iii) and iv) of our workflow for the semantics-aware management of geospatial metadata.
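
    The semantic lift of phase iii) can be sketched as turning flat XML fields into RDF triples whose objects are URIs. In this toy version the example.org URIs and the record layout are hypothetical placeholders, the predicates are the real Dublin Core terms, and a production workflow would use an RDF library rather than string assembly:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML metadata record (stand-in for a real application profile).
RECORD = """
<metadata id="ds42">
  <title>River discharge 2010-2015</title>
  <keyword>hydrology</keyword>
  <keyword>discharge</keyword>
  <creator>Jane Roe</creator>
</metadata>
"""

# Hypothetical namespaces standing in for LOD vocabularies.
BASE = "http://example.org/dataset/"
KW = "http://example.org/keyword/"

def lift_to_ntriples(xml_text):
    # 'Semantic lift': keywords become linkable URIs rather than plain
    # strings, so they can be matched against controlled vocabularies
    # and other resources in the LOD Cloud.
    root = ET.fromstring(xml_text)
    subject = f"<{BASE}{root.get('id')}>"
    triples = [f'{subject} <http://purl.org/dc/terms/title> "{root.findtext("title")}" .']
    for kw in root.iter("keyword"):
        triples.append(f"{subject} <http://purl.org/dc/terms/subject> <{KW}{kw.text}> .")
    triples.append(f'{subject} <http://purl.org/dc/terms/creator> "{root.findtext("creator")}" .')
    return triples
```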

  14. Development of Geospatial Map Based Election Portal

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.

    2014-11-01

    Geospatial Delhi Limited (GSDL) is a Government of NCT of Delhi company formed to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs such as DDA, MCD, DJB, the State Election Department, DMRC, etc., for the benefit of all citizens of the GNCTD. This paper describes the development of the Geospatial Map based Election Portal (GMEP) of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for planning and management by the Department of the Chief Electoral Officer, and as an election-related information search tool (polling station, assembly and parliamentary constituency, etc.) for the citizens of NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end on a Microsoft Windows environment. The GMEP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in GMEP include delimited precinct area boundaries of voter areas of polling stations, assembly constituencies, parliamentary constituencies, election districts, and landmark locations of polling stations and basic amenities (police stations, hospitals, schools, fire stations, etc.). GMEP helps achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for the management of elections. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  15. Spin-torque oscillation in large size nano-magnet with perpendicular magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Linqiang, E-mail: LL6UK@virginia.edu [Department of Physics, University of Virginia, Charlottesville, VA 22904 (United States); Kabir, Mehdi [Department of Electrical & Computer Engineering, University of Virginia, Charlottesville, VA 22904 (United States); Dao, Nam; Kittiwatanakul, Salinporn [Department of Materials Science & Engineering, University of Virginia, Charlottesville, VA 22904 (United States); Cyberey, Michael [Department of Electrical Engineering, University of Virginia, Charlottesville, VA 22904 (United States); Wolf, Stuart A. [Department of Physics, University of Virginia, Charlottesville, VA 22904 (United States); Department of Materials Science & Engineering, University of Virginia, Charlottesville, VA 22904 (United States); Institute of Defense Analyses, Alexandria, VA 22311 (United States); Stan, Mircea [Department of Electrical & Computer Engineering, University of Virginia, Charlottesville, VA 22904 (United States); Lu, Jiwei [Department of Materials Science & Engineering, University of Virginia, Charlottesville, VA 22904 (United States)

    2017-06-15

    Highlights: • A 500 nm nano-pillar device was fabricated by photolithography techniques. • A magnetic hybrid structure was achieved with perpendicular magnetic fields. • Spin-torque switching and oscillation were demonstrated in the large-sized device. • Micromagnetic simulations accurately reproduced the experimental results. • Simulations demonstrated the synchronization of magnetic inhomogeneities. - Abstract: DC-current-induced magnetization reversal and magnetization oscillation were observed in 500 nm large-size Co₉₀Fe₁₀/Cu/Ni₈₀Fe₂₀ pillars. A perpendicular external field enhanced the coercive field separation between the reference layer (Co₉₀Fe₁₀) and free layer (Ni₈₀Fe₂₀) in the pseudo spin valve, allowing a large window of external magnetic field for exploring the free-layer reversal. A magnetic hybrid structure was achieved for the study of spin-torque oscillation by applying a perpendicular field >3 kOe. The magnetization precession was manifested in terms of the multiple peaks on the differential resistance curves. Depending on the bias current and applied field, the regions of magnetic switching and magnetization precession on a dynamical stability diagram are discussed in detail. Micromagnetic simulations are shown to be in good agreement with experimental results and provide insight into the synchronization of inhomogeneities in large-sized devices. The ability to manipulate spin dynamics in large-size devices could prove useful for increasing the output power of spin-transfer nano-oscillators (STNOs).

  16. A comparison of geospatially modeled fire behavior and fire management utility of three data sources in the southeastern United States

    Science.gov (United States)

    LaWen T. Hollingsworth; Laurie L. Kurth; Bernard R. Parresol; Roger D. Ottmar; Susan J. Prichard

    2012-01-01

    Landscape-scale fire behavior analyses are important to inform decisions on resource management projects that meet land management objectives and protect values from adverse consequences of fire. Deterministic and probabilistic geospatial fire behavior analyses are conducted with various modeling systems including FARSITE, FlamMap, FSPro, and Large Fire Simulation...

  17. AGWA: The Automated Geospatial Watershed Assessment Tool

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment Tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  18. Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)

    Science.gov (United States)

    Hancher, M.

    2013-12-01

    Geoscientists have ever-greater access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
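
    The just-in-time computation model can be illustrated with a toy deferred-evaluation graph: operations only record nodes in an expression tree, and nothing is evaluated until a result is explicitly requested. This is an illustration of the idea, not the Earth Engine API:

```python
class Lazy:
    # Each operation records a node in an expression graph; nothing runs
    # until compute() is called, mirroring a just-in-time model where
    # evaluation happens only for the result (or pixels) actually requested.
    def __init__(self, fn, deps=()):
        self.fn, self.deps = fn, deps

    @staticmethod
    def constant(v):
        return Lazy(lambda: v)

    def zip_with(self, other, f):
        # Derived node combining two inputs; f runs only at compute() time.
        return Lazy(f, (self, other))

    def compute(self):
        args = [d.compute() for d in self.deps]
        return self.fn(*args)

# Build a small expression graph: the per-pixel mean of two "bands".
band1 = Lazy.constant([1.0, 2.0])
band2 = Lazy.constant([3.0, 4.0])
mean = band1.zip_with(band2, lambda a, b: [(x + y) / 2 for x, y in zip(a, b)])
result = mean.compute()
```

In a real system the same graph structure also lets the backend preview a small spatial window cheaply, since only the requested region of each node needs evaluating.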

  19. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    Science.gov (United States)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Integrated Repository and Services (PAIRS) has been developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and regridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles between consecutive layers. Each pixel on the PAIRS grid has an index that is a combination of location and time stamp. The indexing allows quick access to data sets that are part of the global data layers, retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to digest, curate, and analyze the data sets while being very robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is the ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data

  20. Geospatial intelligence and visual classification of environmentally observed species in the Future Internet

    Science.gov (United States)

    Arbab-Zavar, B.; Chakravarthy, A.; Sabeur, Z. A.

    2012-04-01

    The rapid development of advanced smart communication tools with good-quality, high-resolution video cameras, audio and GPS devices in the last few years will have profound impacts on the way future environmental observations are conducted and accessed by communities. The resulting large-scale interconnections of these "Future Internet Things" form a large environmental sensing network which will generate large volumes of quality environmental observations at highly localised spatial scales. This enablement of environmental sensing at local scales will contribute greatly to the study of fauna and flora in the near future, particularly of the effect of climate change on biodiversity in various regions of Europe and beyond. The Future Internet could also potentially become the de facto information space for participative real-time sensing by communities, improving our situational awareness of the effect of climate on local environments. In the ENVIROFI (2011-2013) Usage Area project in the FP7 FI-PPP programme, a set of requirements for specific (and generic) enablers is achieved with the potential establishment of participating community observatories of the future. In particular, the specific enablement of interest concerns the building of future interoperable services for the intelligent management of environmental data with tagged contextual geo-spatial information generated by multiple operators in communities (using smart phones). The classification of observed species in the resulting images is achieved with structured data pre-processing, semantic enrichment using contextual geospatial information, and high-level fusion with controlled uncertainty estimations. The returned identification of species is further improved using future ground-truth corrections and learning by the specific enablers.

  1. The geospatial web how geobrowsers, social software and the web 2 0 are shaping the network society

    CERN Document Server

    Scharl, Arno; Tochtermann, Klaus

    2007-01-01

    The Geospatial Web will have a profound impact on managing knowledge, structuring work flows within and across organizations, and communicating with like-minded individuals in virtual communities. The enabling technologies for the Geospatial Web are geo-browsers such as NASA World Wind, Google Earth and Microsoft Live Local 3D. These three-dimensional platforms revolutionize the production and consumption of media products. They not only reveal the geographic distribution of Web resources and services, but also bring together people of similar interests, browsing behavior, or geographic location. This book summarizes the latest research on the Geospatial Web's technical foundations, describes information services and collaborative tools built on top of geo-browsers, and investigates the environmental, social and economic impacts of geospatial applications. The role of contextual knowledge in shaping the emerging network society deserves particular attention. By integrating geospatial and semantic technology, ...

  2. A study on state of Geospatial courses in Indian Universities

    Science.gov (United States)

    Shekhar, S.

    2014-12-01

    Today the world is dominated by three technologies: nanotechnology, biotechnology and geospatial technology. This creates a huge demand for experts in each field, both for disseminating knowledge and for innovative research. Therefore, the prime need is to train the existing fraternity to gain progressive knowledge in these technologies and impart the same to the student community. Geospatial technology faces problems peculiar to it, compared with the other two, because of its interdisciplinary, multi-disciplinary nature. It attracts students and mid-career professionals from various disciplines including physics, computer science, engineering, geography, geology, agriculture, forestry, town planning and so on. Hence there is constant competition to grab and stabilize their position. Students of Master's degree programmes in geospatial science face two types of problems. The first is the lack of a unique identity in the academic field: they are neither exempted from the National Eligibility Test for lectureship nor given the opportunity to take that exam in geospatial science. The second is differential treatment by the industrial world: the students are either given low-grade jobs or poorly paid for their work. Thus, the future of this course in the universities, and its recognition in the academic and industrial worlds, is a serious issue. The universities should make the course more job-oriented in consultation with industry, and industry should come forward to share its demands and requirements with the universities, so that the necessary changes in the curriculum can be made to meet industrial requirements.

  3. Quantifying environmental limiting factors on tree cover using geospatial data.

    Science.gov (United States)

    Greenberg, Jonathan A; Santos, Maria J; Dobrowski, Solomon Z; Vanderbilt, Vern C; Ustin, Susan L

    2015-01-01

    Environmental limiting factors (ELFs) are the thresholds that determine the maximum or minimum biological response for a given suite of environmental conditions. We asked the following questions: 1) Can we detect ELFs on percent tree cover across the eastern slopes of the Lake Tahoe Basin, NV? 2) How are the ELFs distributed spatially? 3) To what extent are unmeasured environmental factors limiting tree cover? ELFs are difficult to quantify as they require significant sample sizes. We addressed this by using geospatial data over a relatively large spatial extent, where the wall-to-wall sampling ensures the inclusion of the rare data points which define the minimum or maximum response to environmental factors. We tested mean temperature, minimum temperature, potential evapotranspiration (PET) and PET minus precipitation (PET-P) as potential limiting factors on percent tree cover. We found that the study area showed system-wide limitations on tree cover, and each of the factors showed evidence of being limiting on tree cover. However, only 1.2% of the total area appeared to be limited by the four environmental factors, suggesting other unmeasured factors are limiting much of the tree cover in the study area. Where sites were near their theoretical maximum, non-forest sites (tree cover demand, and closed-canopy forests were not limited by any particular environmental factor. The detection of ELFs is necessary in order to fully understand the breadth of limitations that species experience within their geographic range.
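Detecting a limiting factor from a wall-to-wall sample comes down to estimating the maximum response across the range of the factor. A minimal sketch of that envelope idea: bin the environmental factor and take an upper percentile of tree cover in each bin. This illustrates the general approach only, not the authors' exact method; the bin count and percentile are assumed defaults.

```python
import numpy as np

def limiting_envelope(factor, cover, n_bins=20, q=99):
    """Upper-quantile (q-th percentile) of cover within equal-width bins of a factor.

    The envelope approximates the maximum biological response the factor permits.
    """
    factor, cover = np.asarray(factor, float), np.asarray(cover, float)
    edges = np.linspace(factor.min(), factor.max(), n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Assign each sample to a bin, clamping edge cases into the valid range.
    idx = np.clip(np.digitize(factor, edges) - 1, 0, n_bins - 1)
    envelope = np.full(n_bins, np.nan)
    for b in range(n_bins):
        vals = cover[idx == b]
        if vals.size:
            envelope[b] = np.percentile(vals, q)
    return centers, envelope
```

Points far below the envelope are then candidates for limitation by some other, unmeasured factor.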

  4. RESEARCH AND PRACTICAL TRENDS IN GEOSPATIAL SCIENCES

    Directory of Open Access Journals (Sweden)

    A. P. Karpik

    2016-06-01

    Full Text Available In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and the rapid evolution of geospatial science. Almost all professional communities have been affected. Communities are embracing digital techniques, modern equipment, software and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of the trends that will be in great demand in 3-7 years. This paper reviews the current scientific and practical activities of such non-governmental international organizations as the International Federation of Surveyors, the International Cartographic Association, and the International Society for Photogrammetry and Remote Sensing; analyzes and groups the most relevant topics brought up at their scientific events; forecasts the most probable research and practical trends in the geospatial sciences; and outlines the leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation and joint implementation projects.

  5. Fast Updating National Geo-Spatial Databases with High Resolution Imagery: China's Methodology and Experience

    Science.gov (United States)

    Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.

    2014-04-01

    Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, i.e. their consistency with respect to the real world, plays a critical role in their value and applications. The continuous updating of map databases at 1:50,000 scale is a massive and difficult task for large countries covering several million square kilometers. This paper presents the research and technological development supporting national map updating at 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of a continuous updating workflow. Many data sources had to be used and integrated to form a high-accuracy, quality-checked product. This in turn required up-to-date techniques for image matching, semantic integration, generalization, database management and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high-resolution imagery and large-scale data generalization, such as map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas and whole-process data quality control; a series of technical production specifications; and a network of updating production units in different geographic places in the country.
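Of the techniques listed, change interpretation from imagery lends itself to a small sketch: a per-pixel change mask computed from two co-registered rasters of the same area at different dates. This is a deliberately simplified stand-in for the production-grade tools the paper describes; the normalized-difference threshold is an assumption.

```python
import numpy as np

def detect_changes(old_img, new_img, threshold=0.2):
    """Flag pixels whose normalized difference between two dates exceeds a threshold.

    Assumes the two rasters are co-registered, same shape, and radiometrically
    comparable (e.g. reflectance); returns a boolean change mask.
    """
    old = np.asarray(old_img, float)
    new = np.asarray(new_img, float)
    diff = np.abs(new - old)
    # Guard against division by zero where both images are dark.
    denom = np.maximum(np.abs(old) + np.abs(new), 1e-9)
    return (diff / denom) > threshold
```

In a real updating workflow the mask would only seed interactive interpretation; spurious change from registration error and illumination must still be screened out.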

  6. Geospatial technology perspectives for mining vis-a-vis sustainable forest ecosystems

    Directory of Open Access Journals (Sweden)

    Goparaju Laxmi

    2017-06-01

    Full Text Available Forests, the backbone of biogeochemical cycles and life-supporting systems, are under severe pressure from varied anthropogenic activities. Mining is among the major causes of forest destruction, calling into question the survival and sustainability of the flora and fauna existing in affected areas. Thus, monitoring and managing the impact of mining activities on natural resources at regular intervals is necessary to assess their depleted condition and to take up restoration and conservation measures. Geospatial technology provides the means to identify the impact of different mining operations on forest ecosystems and helps in proposing initiatives for safeguarding the forest environment. In this context, the present study highlights the problems related to mining in forest ecosystems and elucidates how geospatial technology can be employed at various stages of mining activity to achieve a sustainable forest ecosystem. The study collates information from various sources and highlights the role of geospatial technology in mining industries and the reclamation process.

  7. Solar Maps | Geospatial Data Science | NREL

    Science.gov (United States)

    These solar maps provide average daily total solar resource information. State-level solar resource maps are available for each U.S. state; for accessibility assistance, contact the Geospatial Data Science Team.

  8. Remote Sensing Technologies and Geospatial Modelling Hierarchy for Smart City Support

    Science.gov (United States)

    Popov, M.; Fedorovsky, O.; Stankevich, S.; Filipovich, V.; Khyzhniak, A.; Piestova, I.; Lubskyi, M.; Svideniuk, M.

    2017-12-01

    The approach to implementing remote sensing technologies and geospatial modelling for smart city support is presented. The hierarchical structure and basic components of the smart city information support subsystem are considered. Some already available practical developments are described, including city land use planning, urban vegetation analysis, thermal condition forecasting, geohazard detection and flooding risk assessment. A remote sensing data fusion approach for comprehensive geospatial analysis is discussed. Long-term city development forecasting with the Forrester-Graham system dynamics model is provided for the Kiev urban area.

  9. Architecture of a Process Broker for Interoperable Geospatial Modeling on the Web

    Directory of Open Access Journals (Sweden)

    Lorenzo Bigagli

    2015-04-01

    Full Text Available The identification of appropriate mechanisms for process sharing and reuse by means of composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. Modelers in need of running complex workflows may benefit from outsourcing process composition to a dedicated external service, according to the brokering approach. This work introduces our architecture of a process broker, as a distributed information system for creating, validating, editing, storing, publishing and executing geospatial-modeling workflows. The broker provides a service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of interoperable, executable workflows. The described solution has been experimentally applied in several use scenarios in the context of EU-funded projects and the Global Earth Observation System of Systems.
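A broker that validates and executes composed workflows can be sketched as dependency-ordered execution of named steps. This is a generic illustration of workflow composition (with cycle detection as the validation step), not the architecture of the described broker; all names are hypothetical.

```python
def run_workflow(steps, dependencies, inputs):
    """Execute named processing steps in dependency order (a minimal broker sketch).

    steps: {name: callable(dict_of_upstream_results) -> value}
    dependencies: {name: [upstream names]}; inputs: {name: value} seed data.
    """
    results = dict(inputs)
    resolved = set(inputs)
    pending = dict(dependencies)
    while pending:
        # A step is ready once every upstream result it needs is available.
        ready = [n for n, deps in pending.items() if all(d in resolved for d in deps)]
        if not ready:
            raise ValueError("workflow has a cycle or a missing input")
        for name in ready:
            results[name] = steps[name]({d: results[d] for d in pending[name]})
            resolved.add(name)
            del pending[name]
    return results
```

In a real broker each callable would wrap a remote geospatial service invocation rather than a local function.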

  10. HARVESTING, INTEGRATING AND DISTRIBUTING LARGE OPEN GEOSPATIAL DATASETS USING FREE AND OPEN-SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    R. Oliveira

    2016-06-01

    Full Text Available Federal, state and local government agencies in the USA are investing heavily in the dissemination of the open data sets each of them produces. The main driver behind this thrust is to increase agencies’ transparency and accountability, as well as to improve citizens’ awareness. However, not all open data sets are easy to access and integrate with other open data sets, even those from the same agency. The City and County of Denver Open Data Portal distributes several types of geospatial datasets; one of them is the city parcels layer containing 224,256 records. Although this data layer contains many pieces of information, it is incomplete for some custom purposes. Open-source software was used first to collect data from diverse City of Denver open data sets, then to upload them to a repository in the cloud, where they were processed using a cloud-hosted PostgreSQL installation and Python scripts. Our method was able to extract non-spatial information from a ‘not-ready-to-download’ source that could then be combined with the initial data set to enhance its potential use.
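The integration step, combining a parcels layer with non-spatial attributes collected from another open data set, reduces to a key-based left join. A sketch with pandas; the column names and tiny tables here are hypothetical stand-ins, not the actual Denver Open Data schema.

```python
import pandas as pd

# Hypothetical miniature versions of a parcels layer and a supplementary data set.
parcels = pd.DataFrame({"parcel_id": [1, 2, 3], "area_sqft": [5000, 6200, 4100]})
permits = pd.DataFrame({"parcel_id": [1, 3], "permit_count": [2, 1]})

# Left join keeps every parcel; parcels without supplementary data get NaN.
enhanced = parcels.merge(permits, on="parcel_id", how="left")
enhanced["permit_count"] = enhanced["permit_count"].fillna(0).astype(int)
```

The same join expressed in the paper's PostgreSQL setting would be a `LEFT JOIN ... ON parcel_id`, with the Python layer handling download and cleanup.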

  11. Large-scale ocean connectivity and planktonic body size

    KAUST Repository

    Villarino, Ernesto; Watson, James R.; Jönsson, Bror; Gasol, Josep M.; Salazar, Guillem; Acinas, Silvia G.; Estrada, Marta; Massana, Ramón; Logares, Ramiro; Giner, Caterina R.; Pernice, Massimo C.; Olivar, M. Pilar; Citores, Leire; Corell, Jon; Rodríguez-Ezpeleta, Naiara; Acuña, José Luis; Molina-Ramírez, Axayacatl; González-Gordillo, J. Ignacio; Cózar, Andrés; Martí, Elisa; Cuesta, José A.; Agusti, Susana; Fraile-Nuez, Eugenio; Duarte, Carlos M.; Irigoien, Xabier; Chust, Guillem

    2018-01-01

    Global patterns of planktonic diversity are mainly determined by the dispersal of propagules with ocean currents. However, the role that abundance and body size play in determining spatial patterns of diversity remains unclear. Here we analyse spatial community structure - β-diversity - for several planktonic and nektonic organisms from prokaryotes to small mesopelagic fishes collected during the Malaspina 2010 Expedition. β-diversity was compared to surface ocean transit times derived from a global circulation model, revealing a significant negative relationship that is stronger than environmental differences. Estimated dispersal scales for different groups show a negative correlation with body size, where less abundant large-bodied communities have significantly shorter dispersal scales and larger species spatial turnover rates than more abundant small-bodied plankton. Our results confirm that the dispersal scale of planktonic and micro-nektonic organisms is determined by local abundance, which scales with body size, ultimately setting global spatial patterns of diversity.
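The β-diversity analysed above is commonly quantified with the Bray-Curtis dissimilarity between community abundance vectors. A minimal sketch of a pairwise β-diversity matrix; the paper's actual metric choices and workflow are more involved, so treat this as the textbook form only.

```python
import numpy as np

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance vectors (0 = identical)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.abs(a - b).sum() / (a + b).sum())

def beta_diversity_matrix(communities):
    """Symmetric pairwise dissimilarity matrix over a list of abundance vectors."""
    n = len(communities)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d[i, j] = d[j, i] = bray_curtis(communities[i], communities[j])
    return d
```

Each entry of such a matrix can then be regressed against the corresponding surface ocean transit time to test the dispersal relationship the paper reports.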

  12. Large-scale ocean connectivity and planktonic body size

    KAUST Repository

    Villarino, Ernesto

    2018-01-04

    Global patterns of planktonic diversity are mainly determined by the dispersal of propagules with ocean currents. However, the role that abundance and body size play in determining spatial patterns of diversity remains unclear. Here we analyse spatial community structure - β-diversity - for several planktonic and nektonic organisms from prokaryotes to small mesopelagic fishes collected during the Malaspina 2010 Expedition. β-diversity was compared to surface ocean transit times derived from a global circulation model, revealing a significant negative relationship that is stronger than environmental differences. Estimated dispersal scales for different groups show a negative correlation with body size, where less abundant large-bodied communities have significantly shorter dispersal scales and larger species spatial turnover rates than more abundant small-bodied plankton. Our results confirm that the dispersal scale of planktonic and micro-nektonic organisms is determined by local abundance, which scales with body size, ultimately setting global spatial patterns of diversity.

  13. Lunar Mapping and Modeling On-the-Go: A mobile framework for viewing and interacting with large geospatial datasets

    Science.gov (United States)

    Chang, G.; Kim, R.; Bui, B.; Sadaqathullah, S.; Law, E.; Malhotra, S.

    2012-12-01

    bookmark those layers for quick access in subsequent sessions. A search tool is also provided to allow users to quickly find points of interest on the Moon and to view the auxiliary data associated with each feature. More advanced features include the ability to interact with the data. Using the services provided by the portal, users are able to log in and access the same scientific analysis tools provided on the web site, including measuring between two points, generating subsets, and running other analysis tools, all through a customized touch interface immediately familiar to users of these smart mobile devices. Users can also access their own storage on the portal and view or send the data to other users. Finally, there are features that utilize functionality only mobile devices can enable. This includes the use of gyroscopes and motion sensors to provide a haptic interface for visualizing lunar data in 3D, on the device as well as potentially on a large screen. The mobile framework that we have developed for LMMP provides a glimpse of what is possible in visualizing and manipulating large geospatial data on small portable devices. While the framework is currently tuned to our portal, we hope to generalize the tool to use data sources from any type of GIS service.
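The "measuring between two points" tool mentioned above reduces, in its simplest form, to a great-circle distance on a lunar sphere. A sketch using the haversine formula and the Moon's mean radius (1737.4 km); this is illustrative only, not LMMP's actual implementation, which would account for terrain and a more accurate lunar shape.

```python
import math

LUNAR_MEAN_RADIUS_KM = 1737.4

def lunar_distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lunar surface points via haversine."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * LUNAR_MEAN_RADIUS_KM * math.asin(math.sqrt(a))
```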

  14. Adoption of Geospatial Systems towards evolving Sustainable Himalayan Mountain Development

    Science.gov (United States)

    Murthy, M. S. R.; Bajracharya, B.; Pradhan, S.; Shestra, B.; Bajracharya, R.; Shakya, K.; Wesselmann, S.; Ali, M.; Bajracharya, S.; Pradhan, S.

    2014-11-01

    Natural resource dependence of mountain communities, rapid social and developmental change, disaster proneness and climate change are considered the critical factors governing sustainable Himalayan mountain development. The Himalayan region, with its typical geographic settings and great physical and cultural diversity, presents a formidable challenge to collecting and managing data and information and to understanding its varied socio-ecological settings. Recent advances in earth observation, near-real-time data and in-situ measurements, in combination with information and communication technology, have transformed the way we collect, process, and generate information and how we use such information for societal benefit. Glacier dynamics, land cover change, disaster risk reduction systems, food security and ecosystem conservation are a few thematic areas where geospatial information and knowledge have significantly contributed to informed decision-making over the region. The emergence and adoption of near-real-time systems, unmanned aerial vehicles (UAVs), broad-scale citizen science (crowd-sourcing), mobile services and mapping, and cloud computing have paved the way towards automated environmental monitoring systems, enhanced scientific understanding of geophysical and biophysical processes, coupled management of socio-ecological systems, and community-based adaptation models tailored to mountain-specific environments. There are differentiated capacities among the ICIMOD regional member countries with regard to the utilization of earth observation and geospatial technologies. The region can greatly benefit from a coordinated and collaborative approach to capture the opportunities offered by earth observation and geospatial technologies.
    Regional-level data sharing, knowledge exchange, Himalayan GEO-supporting geospatial platforms, spatial data infrastructure, and unique region-specific satellite systems to address trans-boundary challenges would go a long way in

  15. Estimated spatial requirements of the medium- to large-sized ...

    African Journals Online (AJOL)

    Conservation planning in the Cape Floristic Region (CFR) of South Africa, a recognised world plant diversity hotspot, required information on the estimated spatial requirements of selected medium- to large-sized mammals within each of 102 Broad Habitat Units (BHUs) delineated according to key biophysical parameters.

  16. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    Science.gov (United States)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    The 21st century geoscience faces challenges of Big Data, spikes in computing requirements (e.g., when natural disasters happen), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With the flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution providing core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discoveries. This presentation uses GeoCloud to illustrate the process and strategies of building a common platform for geoscience communities that enables the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. To achieve this objective, multiple projects are nominated each year by federal agencies from their existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform-as-a-service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information.
This

  17. INTEGRATING GEOSPATIAL TECHNOLOGIES AND SECONDARY STUDENT PROJECTS: THE GEOSPATIAL SEMESTER

    Directory of Open Access Journals (Sweden)

    Bob Kolvoord

    2012-12-01

    Full Text Available The Geospatial Semester is a geographical education activity in which students in their final year of secondary school in the U.S. acquire specific skills in GIS, GPS and remote sensing. Through a project-based learning methodology, students are motivated and involved in conducting research with geographic information systems, analyzing, and even proposing solutions to, various processes, problems or issues that are spatial in nature. The project is coordinated by James Madison University and has run for seven years in high schools across the State of Virginia, involving more than 20 schools and 1,500 students. The university's management of the Geospatial Semester not only ensures proper coaching, guidance and GIS training for the schools' teachers, but has also established a system whereby students who pass this secondary-school course receive credit from the university. Keywords: geographic information systems, teaching, geographic education, geospatial semester.

  18. Building a multi-scaled geospatial temporal ecology database from disparate data sources: Fostering open science through data reuse

    Science.gov (United States)

    Soranno, Patricia A.; Bissell, E.G.; Cheruvelil, Kendra S.; Christel, Samuel T.; Collins, Sarah M.; Fergus, C. Emi; Filstrup, Christopher T.; Lapierre, Jean-Francois; Lottig, Noah R.; Oliver, Samantha K.; Scott, Caren E.; Smith, Nicole J.; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A.; Gries, Corinna; Henry, Emily N.; Skaff, Nick K.; Stanley, Emily H.; Stow, Craig A.; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E.

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km2). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated
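The core integration pattern described, attaching sparse site-based limnology records to a wall-to-wall geospatial table while tracking provenance, can be sketched as a left join with a match indicator. The miniature tables below are hypothetical stand-ins for the LAGOSGEO and LAGOSLIMNO modules, with invented column names.

```python
import pandas as pd

# Hypothetical miniature stand-ins for the two LAGOS modules.
limno = pd.DataFrame({"lake_id": [10, 11], "tp_ugL": [12.5, 30.1],
                      "source": ["dataset_A", "dataset_B"]})
geo = pd.DataFrame({"lake_id": [10, 11, 12], "area_ha": [45.0, 120.0, 8.0],
                    "landuse_ag_pct": [10.0, 55.0, 5.0]})

# Keep every lake with geospatial data; water-quality data exist only for a
# subset, and the indicator column records which rows matched (provenance).
lagos = geo.merge(limno, on="lake_id", how="left", indicator="provenance")
```

The `indicator` column distinguishes lakes with limnology data (`both`) from geospatial-only lakes (`left_only`), mirroring the ~50,000 geo vs ~10,000 limno lake counts in the abstract.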

  19. Building a multi-scaled geospatial temporal ecology database from disparate data sources: fostering open science and data reuse.

    Science.gov (United States)

    Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E

    2015-01-01

    Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km(2)). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. Our procedures make a large, complex, and integrated

  20. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only due to the massive data volume but also due to the intrinsic complexity and high dimensions of the geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies were carried out to investigate adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce the computing resource utilization (by 80% in our example) while delivering similar performance as a full-powered cluster; and (2) effectively handle the spike processing workload by automatically increasing the computing resources to ensure the processing is finished within an acceptable time. Such an auto-scaling approach provides a valuable reference to optimize the performance of geospatial applications to address data- and computational-intensity challenges in GIScience in a more cost-efficient manner.
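The threshold-style scaling decision described above can be sketched as a simple rule that sizes the cluster to the pending workload. This is only a minimal illustration with hypothetical task counts and node limits, not the paper's actual algorithm, which monitors the Hadoop job queue and cloud metrics:

```python
def scale_decision(pending_tasks, tasks_per_node=8, min_nodes=2, max_nodes=20):
    """Return a target cluster size for the current workload.

    A toy threshold rule: provision just enough nodes to cover the
    pending tasks, clamped to the allowed cluster-size range.
    """
    needed = -(-pending_tasks // tasks_per_node)  # ceiling division
    return max(min_nodes, min(max_nodes, needed))

# Light load: shrink toward the minimum instead of idling a full cluster.
print(scale_decision(pending_tasks=10))   # -> 2
# Spike load: grow, but never beyond the configured maximum.
print(scale_decision(pending_tasks=500))  # -> 20
```

In a real deployment the decision would be evaluated periodically and fed to the cloud provider's API to add or remove worker nodes.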

  1. An Automated End-to-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services

    Science.gov (United States)

    Shah, M.; Verma, Y.; Nandakumar, R.

    2012-07-01

    Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduced business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain, and geospatial Web services solve this problem. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive, involving diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to the service requester based on QoS.
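The core of a QoS-driven selection step is a weighted score over normalized non-functional attributes. A minimal sketch (the service names, attribute values, and weights are hypothetical; the paper's multi-agent architecture adds negotiation and discovery around this kind of ranking):

```python
def best_service(candidates, weights):
    """Pick the service with the highest weighted QoS score.

    candidates: {name: {attribute: normalized value in [0, 1]}}, where
    higher is always better (cost-type attributes such as response time
    must be inverted during normalization).
    weights: {attribute: importance}, summing to 1.
    """
    def score(qos):
        return sum(weights[a] * qos[a] for a in weights)
    return max(candidates, key=lambda name: score(candidates[name]))

services = {
    "WMS-A": {"availability": 0.99, "response": 0.40, "accuracy": 0.90},
    "WMS-B": {"availability": 0.90, "response": 0.80, "accuracy": 0.85},
}
# A requester who values fast responses gets a different best fit than
# one who values availability.
print(best_service(services, {"availability": 0.3, "response": 0.5, "accuracy": 0.2}))
print(best_service(services, {"availability": 0.8, "response": 0.1, "accuracy": 0.1}))
```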

  2. DESIGN AND DEVELOPMENT OF A LARGE SIZE NON-TRACKING SOLAR COOKER

    Directory of Open Access Journals (Sweden)

    N. M. NAHAR

    2009-09-01

    A large size novel non-tracking solar cooker has been designed, developed and tested. The cooker has been designed in such a way that the width-to-length ratio of the reflector and glass window is about 4, so that maximum radiation falls on the glass window. This eliminates the azimuthal tracking required in a simple hot box solar cooker, which must be turned towards the Sun every hour because its reflector's width-to-length ratio is 1. Stagnation temperatures were found to be 118.5 °C in the large size non-tracking solar cooker and 108 °C in the hot box solar cooker. The cooker takes about 2 h for soft food and 3 h for hard food, and is capable of cooking 4.0 kg of food at a time. The efficiency of the large size non-tracking solar cooker has been found to be 27.5%, and it saves 5175 MJ of energy per year. The cost of the cooker is Rs. 10000.00 (1.0 US$ = Rs. 50.50). The payback period has been calculated by considering 10% annual interest, 5% maintenance cost and 5% inflation in fuel prices and maintenance cost. The payback period is shortest (1.58 yr) with respect to electricity and longest (4.89 yr) with respect to kerosene; the payback periods increase in the order electricity, coal, firewood, liquid petroleum gas, and kerosene. The short payback periods suggest that the use of the large size non-tracking solar cooker is economical.
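The payback-period calculation described here can be sketched as a discounted cash-flow loop in which annual fuel savings escalate with inflation and are discounted at the interest rate. A minimal whole-year sketch (the annual fuel-saving values are hypothetical; the paper's 1.58-4.89 yr figures interpolate fractional years using actual fuel prices and maintenance costs, which are not given here):

```python
def payback_years(capital_cost, first_year_saving,
                  interest=0.10, inflation=0.05):
    """Discounted payback: whole years until cumulative discounted fuel
    savings (escalating with inflation) recover the capital cost."""
    cumulative, year = 0.0, 0
    while cumulative < capital_cost:
        year += 1
        saving = first_year_saving * (1 + inflation) ** (year - 1)
        cumulative += saving / (1 + interest) ** year
        if year > 100:
            raise ValueError("savings never recover the cost")
    return year

# Hypothetical example: a Rs. 10000 cooker displacing Rs. 6500/yr of fuel.
print(payback_years(10000, 6500))  # -> 2
```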

  3. Geospatial Database for Strata Objects Based on Land Administration Domain Model (LADM)

    Science.gov (United States)

    Nasorudin, N. N.; Hassan, M. I.; Zulkifli, N. A.; Rahman, A. Abdul

    2016-09-01

    Recently in our country, the construction of buildings has become more complex, and a strata objects database has become more important for registering the real world as people now own and use multiple levels of space. Furthermore, strata titles are increasingly important and need to be well managed. LADM, also known as ISO 19152, is a standard model for land administration that allows integrated 2D and 3D representation of spatial units. The aim of this paper is to develop a strata objects database using LADM. This paper discusses the current 2D geospatial database and the need for a 3D geospatial database in the future; it also attempts to develop a strata objects database using the standard data model (LADM) and to analyze the resulting database. The current cadastre system in Malaysia, including strata titles, is discussed. The problems with the 2D geospatial database are listed, and the need for a future 3D geospatial database is also discussed. The process of designing the strata objects database comprises conceptual, logical and physical database design. The strata objects database will allow us to find both non-spatial and spatial strata title information and thus show the location of each strata unit. This development of a strata objects database may help in handling strata titles and related information.

  4. Distributed Multi-interface Catalogue for Geospatial Data

    Science.gov (United States)

    Nativi, S.; Bigagli, L.; Mazzetti, P.; Mattia, U.; Boldrini, E.

    2007-12-01

    Several geosciences communities (e.g. atmospheric science, oceanography, hydrology) have developed tailored data and metadata models and service protocol specifications for enabling online data discovery, inventory, evaluation, access and download. These specifications are conceived either by profiling geospatial information standards or by extending the well-accepted geosciences data models and protocols in order to capture more semantics. These artifacts have generated a set of related catalog and inventory services characterizing different communities, initiatives and projects. In fact, these geospatial data catalogs are discovery and access systems that use metadata as the target for queries on geospatial information. The indexed and searchable metadata provide a disciplined vocabulary against which intelligent geospatial search can be performed within or among communities. There exists a clear need to conceive and achieve solutions to implement interoperability among geosciences communities, in the context of the more general geospatial information interoperability framework. Such solutions should provide search and access capabilities across catalogs, inventory lists and their registered resources. Thus, the development of catalog clearinghouse solutions is a near-term challenge in support of fully functional and useful infrastructures for spatial data (e.g. INSPIRE, GMES, NSDI, GEOSS). This implies the implementation of components for query distribution and virtual resource aggregation. These solutions must implement distributed discovery functionalities in a heterogeneous environment, requiring metadata profile harmonization as well as protocol adaptation and mediation. We present a catalog clearinghouse solution for the interoperability of several well-known cataloguing systems (e.g. OGC CSW, THREDDS catalog and data services). 
The solution implements consistent resource discovery and evaluation over a dynamic federation of several well-known cataloguing and

  5. The national atlas as a metaphor for improved use of a national geospatial data infrastructure

    NARCIS (Netherlands)

    Aditya Kurniawan Muhammad, T.

    2007-01-01

    Geospatial Data infrastructures have been developed worldwide. Geoportals have been created as an interface to allow users or the community to discover and use geospatial data offered by providers of these initiatives. This study focuses on the development of a web national atlas as an alternative

  6. Land degradation assessment by geo-spatially modeling different soil erodibility equations in a semi-arid catchment.

    Science.gov (United States)

    Saygın, Selen Deviren; Basaran, Mustafa; Ozcan, Ali Ugur; Dolarslan, Melda; Timur, Ozgur Burhan; Yilman, F Ebru; Erpul, Gunay

    2011-09-01

    Land degradation by soil erosion is one of the most serious problems and environmental issues in many ecosystems of arid and semi-arid regions. Especially, the disturbed areas have greater soil detachability and transportability capacity. Evaluation of land degradation in terms of soil erodibility, by using geostatistical modeling, is vital to protect and reclaim susceptible areas. Soil erodibility, described as the ability of soils to resist erosion, can be measured either directly under natural or simulated rainfall conditions, or indirectly estimated by empirical regression models. This study compares three empirical equations used to determine the soil erodibility factor of revised universal soil loss equation prediction technology based on their geospatial performances in the semi-arid catchment of the Saraykoy II Irrigation Dam located in Cankiri, Turkey. A total of 311 geo-referenced soil samples were collected with irregular intervals from the top soil layer (0-10 cm). Geostatistical analysis was performed with the point values of each equation to determine its spatial pattern. Results showed that equations that used soil organic matter in combination with the soil particle size better agreed with the variations in land use and topography of the catchment than the one using only the particle size distribution. It is recommended that the equations which dynamically integrate soil intrinsic properties with land use, topography, and its influences on the local microclimates, could be successfully used to geospatially determine sites highly susceptible to water erosion, and therefore, to select the agricultural and bio-engineering control measures needed.
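Empirical erodibility equations of the kind compared in this study typically combine particle-size terms with organic matter. A minimal sketch of one widely cited form, the Wischmeier nomograph approximation of the (R)USLE K factor in US-customary units (the constants below are the commonly quoted ones, not necessarily the exact equations evaluated in the paper):

```python
def k_factor(silt_vfs, clay, om, structure=3, permeability=3):
    """Approximate the soil erodibility factor K (US-customary units).

    silt_vfs: % silt + very fine sand; clay: % clay; om: % organic
    matter; structure/permeability: nomograph class codes.
    """
    m = silt_vfs * (100.0 - clay)  # particle-size parameter
    return (2.1e-4 * m ** 1.14 * (12.0 - om)
            + 3.25 * (structure - 2)
            + 2.5 * (permeability - 3)) / 100.0

# More organic matter -> lower erodibility, as the study's comparison
# of OM-aware equations would lead one to expect.
print(k_factor(65, 20, om=2.0) > k_factor(65, 20, om=4.0))
```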

  7. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
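At its simplest, the multi-criteria optimization behind such a siting tool is a weighted overlay: each candidate cell gets a score from user-weighted, normalized criterion layers, with excluded areas masked out. A toy sketch (cell identifiers, criteria, and weights are hypothetical, not those of the NREL tool):

```python
def site_scores(layers, weights, exclusions):
    """Rank candidate cells by weighted-overlay suitability score.

    layers: {criterion: {cell_id: normalized suitability in [0, 1]}}
    weights: {criterion: user-defined importance}
    exclusions: cells ruled out entirely (e.g. protected areas)
    """
    cells = next(iter(layers.values())).keys()
    scores = {}
    for cell in cells:
        if cell in exclusions:
            continue  # hard constraint: no score at all
        scores[cell] = sum(w * layers[c][cell] for c, w in weights.items())
    return sorted(scores, key=scores.get, reverse=True)

layers = {
    "solar":         {"A": 0.9, "B": 0.8, "C": 0.95},
    "slope":         {"A": 0.5, "B": 0.9, "C": 0.4},
    "grid_distance": {"A": 0.7, "B": 0.5, "C": 0.9},
}
weights = {"solar": 0.5, "slope": 0.3, "grid_distance": 0.2}
print(site_scores(layers, weights, exclusions={"C"}))  # -> ['B', 'A']
```

Changing the weights re-ranks the sites, which is exactly the user-driven, stakeholder-specific behaviour the report argues for.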

  8. Interlayer catalytic exfoliation realizing scalable production of large-size pristine few-layer graphene

    OpenAIRE

    Geng, Xiumei; Guo, Yufen; Li, Dongfang; Li, Weiwei; Zhu, Chao; Wei, Xiangfei; Chen, Mingliang; Gao, Song; Qiu, Shengqiang; Gong, Youpin; Wu, Liqiong; Long, Mingsheng; Sun, Mengtao; Pan, Gebo; Liu, Liwei

    2013-01-01

    Mass production of reduced graphene oxide and graphene nanoplatelets has recently been achieved. However, a great challenge still remains in realizing large-quantity and high-quality production of large-size thin few-layer graphene (FLG). Here, we create a novel route to solve the issue by employing one-time-only interlayer catalytic exfoliation (ICE) of salt-intercalated graphite. The typical FLG with a large lateral size of tens of microns and a thickness less than 2 nm have been obtained b...

  9. Automated Geospatial Watershed Assessment Tool (AGWA) Poster Presentation

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  10. A Research Agenda for Geospatial Technologies and Learning

    Science.gov (United States)

    Baker, Tom R.; Battersby, Sarah; Bednarz, Sarah W.; Bodzin, Alec M.; Kolvoord, Bob; Moore, Steven; Sinton, Diana; Uttal, David

    2015-01-01

    Knowledge around geospatial technologies and learning remains sparse, inconsistent, and overly anecdotal. Studies are needed that are better structured; more systematic and replicable; attentive to progress and findings in the cognate fields of science, technology, engineering, and math education; and coordinated for multidisciplinary approaches.…

  11. Academic research opportunities at the National Geospatial-Intelligence Agency (NGA)

    Science.gov (United States)

    Loomer, Scott A.

    2006-05-01

    The vision of the National Geospatial-Intelligence Agency (NGA) is to "Know the Earth...Show the Way." To achieve this vision, the NGA provides geospatial intelligence in all its forms and from whatever source-imagery, imagery intelligence, and geospatial data and information-to ensure the knowledge foundation for planning, decision, and action. Academia plays a key role in the NGA research and development program through the NGA Academic Research Program. This multi-disciplinary program of basic research in geospatial intelligence topics provides grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program are: *NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. *Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. *Intelligence Community Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports and how researchers and institutions can apply for grants under the program. 
In addition, other opportunities for academia to engage with NGA through

  12. Geospatial Information Relevant to the Flood Protection Available on The Mainstream Web

    Directory of Open Access Journals (Sweden)

    Kliment Tomáš

    2014-03-01

    Flood protection is one of several disciplines in which geospatial data is a crucial component. Its management, processing and sharing form the foundation for its efficient use; therefore, special attention is required in the development of effective, precise, standardized, and interoperable models for the discovery and publishing of data on the Web. This paper describes the design of a methodology to discover Open Geospatial Consortium (OGC) services on the Web and collect descriptive information, i.e., metadata, in a geocatalogue. A pilot implementation of the proposed methodology - a geocatalogue of geospatial information provided by OGC services discovered on Google (hereinafter “Geocatalogue”) - was used to search for available resources relevant to the area of flood protection. The result is an analysis of the availability of resources discovered through their metadata collected from the OGC services (WMS, WFS, etc.) and the resources they provide (WMS layers, WFS objects, etc.) within the domain of flood protection.

  13. Large- and small-size advantages in sneaking behaviour in the dusky frillgoby Bathygobius fuscus

    Science.gov (United States)

    Takegaki, Takeshi; Kaneko, Takashi; Matsumoto, Yukio

    2012-04-01

    The sneaking tactic, a male alternative reproductive tactic involving sperm competition, is generally adopted by small individuals because of its inconspicuousness. However, large size is an advantage when sneakers compete with each other to fertilize eggs. Here, we suggest that both large- and small-size advantages of sneaker males are present within the same species. Large sneaker males of the dusky frillgoby Bathygobius fuscus showed a high success rate in intruding into spawning nests because of their advantage in competition among sneaker males in keeping a suitable position from which to sneak, whereas small sneakers had few chances to sneak. However, small sneaker males were able to stay in the nests longer than large sneaker males when they succeeded in sneak intrusion, suggesting a possible increase in their paternity. These size-specific behavioural advantages may be important in considering the evolution of size-related reproductive traits.

  14. Classification of large-sized hyperspectral imagery using fast machine learning algorithms

    Science.gov (United States)

    Xia, Junshi; Yokoya, Naoto; Iwasaki, Akira

    2017-07-01

    We present a framework of fast machine learning algorithms for the classification of large-sized hyperspectral images, from a theoretical to a practical viewpoint. In particular, we assess the performance of random forest (RF), rotation forest (RoF), and extreme learning machine (ELM), as well as ensembles of RF and ELM. These classifiers are applied to two large-sized hyperspectral images and compared to support vector machines. For quantitative analysis, we compare these methods when working with high input dimensions and a limited/sufficient training set. Moreover, other important issues such as computational cost and robustness against noise are also discussed.
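Ensemble classifiers such as RF combine many fast base learners by majority voting over per-pixel label predictions. A minimal illustration of the voting step only (not the paper's actual RF/ELM implementations, which also handle training and feature sampling):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier label sequences by per-pixel majority vote,
    the aggregation step underlying ensembles such as random forest.

    predictions: list of label lists, one list per base classifier,
    all of equal length (one label per pixel).
    """
    combined = []
    for labels in zip(*predictions):  # iterate pixel by pixel
        combined.append(Counter(labels).most_common(1)[0][0])
    return combined

# Three base classifiers disagree on individual pixels; the ensemble
# output follows the majority at each position.
print(majority_vote([[0, 1, 1, 0], [0, 0, 1, 1], [1, 1, 1, 0]]))
```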

  15. Phased array inspection of large size forged steel parts

    Science.gov (United States)

    Dupont-Marillia, Frederic; Jahazi, Mohammad; Belanger, Pierre

    2018-04-01

    High strength forged steel requires uncompromising quality to warrant advanced performance in numerous critical applications. Ultrasonic inspection is commonly used in nondestructive testing to detect cracks and other defects. In steel blocks of relatively small dimensions (at least two directions not exceeding a few centimetres), phased array inspection is a trusted method to generate images of the inside of the blocks and thereby identify and size defects. However, casting of large size forged ingots introduces changes in mechanical parameters such as grain size, Young's modulus, Poisson's ratio, and the chemical composition. These heterogeneities affect wave propagation and, consequently, the reliability of ultrasonic inspection and the imaging capabilities for these blocks. In this context, a custom phased array transducer designed for a 40-ton bainitic forged ingot was investigated. Following a previous study that provided local mechanical parameters for a similar block, two-dimensional simulations were made to compute the optimal transducer parameters, including the pitch, width and number of elements. It appeared that, depending on the number of elements, backwall reconstruction can generate high amplitude artefacts: the large dimensions of the simulated block introduce numerous constructive interferences from backwall reflections, which may lead to significant artefacts. To increase image quality, the reconstruction algorithm was adapted, and promising results were observed and compared with the scattering cone filter method available in the CIVA software.
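The delay law for a phased array follows directly from geometry: each element is delayed so that all wavefronts arrive at the focal point simultaneously. A minimal sketch for a linear array (the element count, pitch, and the 5900 m/s longitudinal wave speed for steel are illustrative assumptions, not the custom transducer's actual parameters):

```python
import math

def focal_delays(num_elements, pitch, focus_x, focus_z, c=5900.0):
    """Transmit delay law focusing a linear array at (focus_x, focus_z).

    Elements lie on z = 0, centred on x = 0; c is the longitudinal wave
    speed (m/s). Returns per-element delays in seconds, with the element
    farthest from the focus firing first (zero extra delay).
    """
    xs = [(i - (num_elements - 1) / 2) * pitch for i in range(num_elements)]
    times = [math.hypot(focus_x - x, focus_z) / c for x in xs]
    t_max = max(times)
    return [t_max - t for t in times]

# 16 elements at 1.5 mm pitch, focused 200 mm deep on the array axis:
# central elements are delayed most, edge elements fire first.
delays = focal_delays(16, 1.5e-3, 0.0, 0.2)
```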

  16. Geospatial cryptography: enabling researchers to access private, spatially referenced, human subjects data for cancer control and prevention.

    Science.gov (United States)

    Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre

    2017-07-01

    As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. 
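The de-duplication use case (1) can be illustrated with a keyed hash: registries exchange HMAC tokens of identifying fields instead of the identifiers themselves, so matches can be found without revealing records. This is a simplified stand-in assuming a pre-shared key, not the secure multi-party protocols the article actually develops:

```python
import hmac
import hashlib

def keyed_token(record, key):
    """Privacy-preserving match token: only keyed hashes of identifying
    fields are shared, never the identifiers themselves."""
    msg = "|".join(record).encode("utf-8")
    return hmac.new(key, msg, hashlib.sha256).hexdigest()

def cross_registry_duplicates(registry_a, registry_b, key):
    """Find records present in both registries by comparing tokens."""
    tokens_a = {keyed_token(r, key) for r in registry_a}
    return [r for r in registry_b if keyed_token(r, key) in tokens_a]

key = b"shared-secret"  # hypothetical key agreed upon by the registries
reg_a = [("DOE", "JOHN", "1950-01-02")]
reg_b = [("DOE", "JOHN", "1950-01-02"), ("ROE", "JANE", "1960-03-04")]
print(cross_registry_duplicates(reg_a, reg_b, key))
```

Real deployments would additionally standardize the fields before hashing and use protocols that tolerate typographical variation, which a plain HMAC does not.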
Geospatial cryptography holds substantial promise for accelerating the

  17. Big Data analytics in the Geo-Spatial Domain

    NARCIS (Netherlands)

    R.A. Goncalves (Romulo); M.G. Ivanova (Milena); M.L. Kersten (Martin); H. Scholten; S. Zlatanova; F. Alvanaki (Foteini); P. Nourian (Pirouz); E. Dias

    2014-01-01

    Big data collections in many scientific domains have inherently rich spatial and geo-spatial features. Spatial location is among the core aspects of data in Earth observation sciences, astronomy, and seismology to name a few. The goal of our project is to design an efficient data

  18. Intelligence, mapping, and geospatial exploitation system (IMAGES)

    Science.gov (United States)

    Moellman, Dennis E.; Cain, Joel M.

    1998-08-01

    This paper provides further detail to one facet of the battlespace visualization concept described in last year's paper Battlespace Situation Awareness for Force XXI. It focuses on the National Imagery and Mapping Agency (NIMA) goal to 'provide customers seamless access to tailorable imagery, imagery intelligence, and geospatial information.' This paper describes Intelligence, Mapping, and Geospatial Exploitation System (IMAGES), an exploitation element capable of CONUS baseplant operations or field deployment to provide NIMA geospatial information collaboratively into a reconnaissance, surveillance, and target acquisition (RSTA) environment through the United States Imagery and Geospatial Information System (USIGS). In a baseplant CONUS setting IMAGES could be used to produce foundation data to support mission planning. In the field it could be directly associated with a tactical sensor receiver or ground station (e.g. UAV or UGV) to provide near real-time and mission specific RSTA to support mission execution. This paper provides IMAGES functional level design; describes the technologies, their interactions and interdependencies; and presents a notional operational scenario to illustrate the system flexibility. Using as a system backbone an intelligent software agent technology, called Open Agent Architecture™ (OAA™), IMAGES combines multimodal data entry, natural language understanding, and perceptual and evidential reasoning for system management. Configured to be DII COE compliant, it would utilize, to the extent possible, COTS applications software for data management, processing, fusion, exploitation, and reporting. It would also be modular, scalable, and reconfigurable. This paper describes how the OAA™ achieves data synchronization and enables the necessary level of information to be rapidly available to various command echelons for making informed decisions. 
The reasoning component will provide for the best information to be developed in the timeline

  19. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank V [Los Alamos National Laboratory

    2012-08-14

    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation relevant activities, facilities, and programs. Foremost among the geospatial tools are the 'Digital Virtual Globes' (i.e., GoogleEarth, Virtual Earth, etc.) that are far better than previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest which can be critical to: (1) Site familiarization and true geospatial context awareness; (2) Pre-inspection planning; (3) Onsite orientation and navigation; (4) Post-inspection reporting; (5) Site monitoring over time for changes; (6) Verification of states' site declarations and for input to State Evaluation reports; and (7) A common basis for discussions among all interested parties (Member States). Additionally, as an 'open-source', such virtual globes can also provide a new, essentially free, means to conduct broad area search for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet BLOGS and WIKI Layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a.: 'crowdsourcing'), that can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant

  20. IMPRINT Analysis of an Unmanned Air System Geospatial Information Process

    National Research Council Canada - National Science Library

    Hunn, Bruce P; Schweitzer, Kristin M; Cahir, John A; Finch, Mary M

    2008-01-01

    ... intelligence, geospatial analysis cell. The Improved Performance Research Integration Tool (IMPRINT) modeling program was used to understand this process and to assess crew workload during several test scenarios...

  1. Efficient Extraction of Content from Enriched Geospatial and Networked Data

    DEFF Research Database (Denmark)

    Qu, Qiang

    Social network services such as Google Places and Twitter have led to a proliferation of user-generated web content that is constantly shared among users. These services enable access to various types of content, covering geospatial locations, textual descriptions, social relationships, and so forth, which makes it possible to extract relevant and interesting information that can then be utilized in different applications. However, web content is often semantically rich, structurally complex, and highly dynamic. This dissertation addresses some of the challenges posed by the use of such data … by merging edges and nodes in the original graph. Generalized, compressed graphs provide a way to interpret large networks. The dissertation reports on studies that compare the proposed solutions with respect to their tradeoffs between result complexity and quality. The findings suggest that the solutions

  2. Learning R for geospatial analysis

    CERN Document Server

    Dorman, Michael

    2014-01-01

    This book is intended for anyone who wants to learn how to efficiently analyze geospatial data with R, including GIS analysts, researchers, educators, and students who work with spatial data and who are interested in expanding their capabilities through programming. The book assumes familiarity with the basic geographic information concepts (such as spatial coordinates), but no prior experience with R and/or programming is required. By focusing on R exclusively, you will not need to depend on any external software-a working installation of R is all that is necessary to begin.

  3. Integrated web system of geospatial data services for climate research

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander

    2016-04-01

Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required. An approach for integrated analysis of georeferenced climatological data sets based on a combination of web and GIS technologies in the framework of the spatial data infrastructure paradigm is presented. According to this approach, a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.

  4. A Geospatial Cyberinfrastructure for Urban Economic Analysis and Spatial Decision-Making

    Directory of Open Access Journals (Sweden)

    Michael F. Goodchild

    2013-05-01

Urban economic modeling and effective spatial planning are critical tools towards achieving urban sustainability. However, in practice, many technical obstacles, such as information islands, poor documentation of data and lack of software platforms to facilitate virtual collaboration, are challenging the effectiveness of decision-making processes. In this paper, we report on our efforts to design and develop a geospatial cyberinfrastructure (GCI) for urban economic analysis and simulation. This GCI provides an operational graphical user interface, built upon a service-oriented architecture, to allow (1) widespread sharing and seamless integration of distributed geospatial data; (2) an effective way to address the uncertainty and positional errors encountered in fusing data from diverse sources; (3) the decomposition of complex planning questions into atomic spatial analysis tasks and the generation of a web service chain to tackle such complex problems; and (4) capturing and representing provenance of geospatial data to trace its flow in the modeling task. The Greater Los Angeles Region serves as the test bed. We expect this work to contribute to effective spatial policy analysis and decision-making through the adoption of advanced GCI and to broaden the application coverage of GCI to include urban economic simulations.

  5. Bridging the Gap between NASA Hydrological Data and the Geospatial Community

    Science.gov (United States)

    Rui, Hualan; Teng, Bill; Vollmer, Bruce; Mocko, David M.; Beaudoing, Hiroko K.; Nigro, Joseph; Gary, Mark; Maidment, David; Hooper, Richard

    2011-01-01

There is a vast and ever-increasing amount of data on the Earth's interconnected energy and hydrological systems, available from NASA remote sensing and modeling systems, and yet one challenge persists: increasing the usefulness of these data for, and thus their use by, the geospatial communities. The Hydrology Data and Information Services Center (HDISC), part of the Goddard Earth Sciences DISC, has continually worked to better understand the hydrological data needs of geospatial end users, to thus be better able to bridge the gap between NASA data and the geospatial communities. This paper will cover some of the hydrological data sets available from HDISC, and the various tools and services developed for data searching, data subsetting, format conversion, online visualization and analysis, interoperable access, etc., to facilitate the integration of NASA hydrological data by end users. The NASA Goddard data analysis and visualization system, Giovanni, is described. Two case examples of user-customized data services are given, involving the EPA BASINS (Better Assessment Science Integrating point & Non-point Sources) project and the CUAHSI Hydrologic Information System, with the common requirement of on-the-fly retrieval of long-duration time series for a geographical point.

  6. Large- and small-size advantages in sneaking behaviour in the dusky frillgoby Bathygobius fuscus

    OpenAIRE

    Takegaki, Takeshi; Kaneko, Takashi; Matsumoto, Yukio

    2012-01-01

The sneaking tactic, a male alternative reproductive tactic involving sperm competition, is generally adopted by small individuals because of its inconspicuousness. However, large size is an advantage when competition occurs between sneakers for the fertilization of eggs. Here, we suggest that both large- and small-size advantages of sneaker males are present within the same species. Large sneaker males of the dusky frillgoby Bathygobius fuscus showed a high success rate in intruding into spawning n...

  7. Creating 3D models of historical buildings using geospatial data

    Science.gov (United States)

    Alionescu, Adrian; Bǎlǎ, Alina Corina; Brebu, Floarea Maria; Moscovici, Anca-Maria

    2017-07-01

Recently, a lot of interest has been shown in understanding a real-world object by acquiring 3D images of it using laser scanning technology and panoramic images. A realistic impression of geometric 3D data can be generated by draping real colour textures simultaneously captured by a colour camera. In this context, a new concept of geospatial data acquisition, based on panoramic images, has rapidly revolutionized the method of determining the spatial position of objects. This article describes an approach that combines terrestrial laser scanning and panoramic images captured with Trimble V10 Imaging Rover technology to enhance the detail and realism of the geospatial data set, in order to obtain 3D urban plans and virtual reality applications.

  8. Persistent Teaching Practices after Geospatial Technology Professional Development

    Science.gov (United States)

    Rubino-Hare, Lori A.; Whitworth, Brooke A.; Bloom, Nena E.; Claesgens, Jennifer M.; Fredrickson, Kristi M.; Sample, James C.

    2016-01-01

    This case study described teachers with varying technology skills who were implementing the use of geospatial technology (GST) within project-based instruction (PBI) at varying grade levels and contexts 1 to 2 years following professional development. The sample consisted of 10 fifth- to ninth-grade teachers. Data sources included artifacts,…

  9. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

Social media is valuable for propagating information during disasters because of its timeliness and availability, and it assists decision-making when posts are tagged with locations. Considering the ambiguity and inaccuracy in some social data, additional authoritative data are needed for verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis after the fact. Moreover, current works organize the data from the perspective of the spatial location, not from the perspective of the disaster, making it difficult to analyze the disaster dynamically. All of the disaster-related data around the affected locations need to be retrieved. To address these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework and proceeds as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and attributes were extracted from the web using natural language processing (NLP) and used in the semantic similarity match of the geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed within GEGIS and shows that the system would be effective when typhoons occur.

  10. Geospatial Technology: A Tool to Aid in the Elimination of Malaria in Bangladesh

    Directory of Open Access Journals (Sweden)

    Karen E. Kirk

    2014-12-01

Bangladesh is a malaria endemic country. There are 13 districts in the country, bordering India and Myanmar, that are at risk of malaria. The majority of malaria morbidity and mortality cases are in the Chittagong Hill Tracts, the mountainous southeastern region of Bangladesh. In recent years, the malaria burden has declined in the country. In this study, we reviewed and summarized published data (through 2014) on the use of geospatial technologies in malaria epidemiology in Bangladesh and outlined potential contributions of geospatial technologies to eliminating malaria in the country. We completed a literature review using “malaria, Bangladesh” search terms and found 218 articles published in peer-reviewed journals listed in PubMed. After a detailed review, 201 articles were excluded because they did not meet our inclusion criteria, and 17 articles were selected for final evaluation. Published studies indicated that geospatial technology tools (Geographic Information System, Global Positioning System, and Remote Sensing) were used to determine vector-breeding sites, land cover classification, accessibility to health facilities, treatment-seeking behaviors, and risk mapping at the household, regional, and national levels in Bangladesh. To achieve the goal of malaria elimination in Bangladesh, we concluded that further research using geospatial technologies should be integrated into the country’s ongoing surveillance system to identify and better assess progress towards malaria elimination.

  11. The welfare implications of large litter size in the domestic pig I

    DEFF Research Database (Denmark)

    Rutherford, K.M.D; Baxter, E.M.; D'Eath, R.B.

    2013-01-01

    Increasing litter size has long been a goal of pig breeders and producers, and may have implications for pig (Sus scrofa domesticus) welfare. This paper reviews the scientific evidence on biological factors affecting sow and piglet welfare in relation to large litter size. It is concluded that, i...

  12. SCHISTOSOMIASIS: GEOSPATIAL SURVEILLANCE AND RESPONSE SYSTEMS IN SOUTHEAST ASIA

    Directory of Open Access Journals (Sweden)

    J. Malone

    2016-10-01

Geographic information systems (GIS) and remote sensing (RS) from Earth-observing satellites offer opportunities for rapid assessment of areas endemic for vector-borne diseases, including estimates of populations at risk and guidance for intervention strategies. This presentation deals with GIS and RS applications for the control of schistosomiasis in China and the Philippines. It includes large-scale risk mapping, including the identification of suitable habitats for Oncomelania hupensis, the intermediate host snail of Schistosoma japonicum. Predictions of infection risk are discussed with reference to ecological transformations and the potential impact of climate change, including the potential for long-term temperature increases in the North as well as the impact on rivers, lakes and water resource developments. Potential integration of geospatial mapping and modeling into schistosomiasis surveillance and response systems in Asia, within Global Earth Observation System of Systems (GEOSS) guidelines in the health societal benefit area, is discussed.

  13. Theoretical multi-tier trust framework for the geospatial domain

    CSIR Research Space (South Africa)

    Umuhoza, D

    2010-01-01

... chain or workflow from data acquisition to knowledge discovery. The authors present work in progress on a theoretical multi-tier trust framework for the processing chain from data acquisition to knowledge discovery in the geospatial domain. Holistic trust...

  14. Computational scalability of large size image dissemination

    Science.gov (United States)

    Kooper, Rob; Bajcsy, Peter

    2011-01-01

We have investigated the computational scalability of image pyramid building needed for dissemination of very large image data. The sources of large images include high resolution microscopes and telescopes, remote sensing and airborne imaging, and high resolution scanners. The term 'large' is understood from a user perspective, meaning either larger than a display size or larger than a memory/disk to hold the image data. The application drivers for our work are digitization projects such as the Lincoln Papers project (each image scan is about 100-150MB or about 5000x8000 pixels, with the total number to be around 200,000) and the UIUC library scanning project for historical maps from the 17th and 18th century (a smaller number, but larger images). The goal of our work is to understand the computational scalability of web-based dissemination using image pyramids for these large image scans, as well as the preservation aspects of the data. We report our computational benchmarks for (a) building image pyramids to be disseminated using the Microsoft Seadragon library, (b) a computation execution approach using hyper-threading to generate image pyramids and to utilize the underlying hardware, and (c) an image pyramid preservation approach using various hard drive configurations of Redundant Array of Independent Disks (RAID) drives for input/output operations. The benchmarks are obtained with a map (334.61 MB, JPEG format, 17591x15014 pixels). The discussion combines the speed and preservation objectives.
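The pyramid arithmetic behind these benchmarks can be sketched in a few lines. Assuming a Deep Zoom style pyramid (the structure the Seadragon library consumes), in which the image is halved per level down to 1x1 and each level is cut into fixed-size tiles, the level and tile counts for the paper's 17591x15014 map scan are:

```python
import math

def pyramid_levels(width, height):
    """Number of levels in a Deep Zoom style pyramid: halve until 1x1."""
    return math.ceil(math.log2(max(width, height))) + 1

def total_tiles(width, height, tile_size=256):
    """Total tile count across all pyramid levels."""
    total = 0
    w, h = width, height
    while True:
        total += math.ceil(w / tile_size) * math.ceil(h / tile_size)
        if w == 1 and h == 1:
            break
        w, h = math.ceil(w / 2), math.ceil(h / 2)
    return total

# Map scan from the paper: 17591 x 15014 pixels
print(pyramid_levels(17591, 15014))  # 16
print(total_tiles(17591, 15014))
```

The per-level tile counts fall off by roughly a factor of four per level, which is why pyramid building is dominated by work at the base resolution.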

  15. Damage threshold from large retinal spot size repetitive-pulse laser exposures.

    Science.gov (United States)

    Lund, Brian J; Lund, David J; Edsall, Peter R

    2014-10-01

    The retinal damage thresholds for large spot size, multiple-pulse exposures to a Q-switched, frequency doubled Nd:YAG laser (532 nm wavelength, 7 ns pulses) have been measured for 100 μm and 500 μm retinal irradiance diameters. The ED50, expressed as energy per pulse, varies only weakly with the number of pulses, n, for these extended spot sizes. The previously reported threshold for a multiple-pulse exposure for a 900 μm retinal spot size also shows the same weak dependence on the number of pulses. The multiple-pulse ED50 for an extended spot-size exposure does not follow the n dependence exhibited by small spot size exposures produced by a collimated beam. Curves derived by using probability-summation models provide a better fit to the data.

  16. Folding and unfolding of large-size shell construction for application in Earth orbit

    Science.gov (United States)

    Kondyurin, Alexey; Pestrenina, Irena; Pestrenin, Valery; Rusakov, Sergey

    2016-07-01

Future exploration of space requires technology for large modules for biological, technological, logistic and other applications in Earth orbits [1-3]. This report describes the possibility of using large-sized shell structures deployable in space. The structure is delivered to orbit in the spaceship container, with the shell folded for transportation. The shell material is either rigid plastic or multilayer prepreg comprising rigid reinforcements (such as reinforcing fibers). The unfolding process (bringing the construction to the unfolded state by loading with internal pressure) needs to be considered in the presence of both stretching and bending deformations. An analysis of the deployment conditions (the minimum internal pressure bringing a construction from the folded state to the unfolded state) of large laminated CFRP shell structures is formulated in this report. The solution of this mechanics of deformable solids (MDS) problem for the shell structure is based on the following assumptions: the shell is made of components whose median surface has a reamer; in the relaxed state of a separate structural element (not stressed and not deformed), its median surface coincides with its reamer (this assumption allows the relaxed state of the structure to be chosen correctly); and structural elements are joined (sewn together) by a seam that does not resist rotation around the tangent to the seam line. Ways of folding large shell structures whose median surface has a reamer are suggested. The unfolding of cylindrical, conical (full and truncated cones), and large-size composite shells (cylinder-cone, cone-cone) is considered. These results show that the unfolding pressure of such large-size structures (0.01-0.2 atm) is comparable to the deploying pressure of pneumatic parts (0.001-0.1 atm) [3]. It would be possible to extend this approach to investigate the unfolding process of large-sized shells with a ruled median surface or non-developable surfaces. This research was

  17. Geospatial big data and cartography : research challenges and opportunities for making maps that matter

    OpenAIRE

    Robinson, Anthony C.; Demsar, Urska; Moore, Antoni B.; Buckley, Aileen; Jiang, Bin; Field, Kenneth; Kraak, Menno-Jan; Camboim, Silvana P; Sluter, Claudia R

    2017-01-01

    Geospatial big data present a new set of challenges and opportunities for cartographic researchers in technical, methodological, and artistic realms. New computational and technical paradigms for cartography are accompanying the rise of geospatial big data. Additionally, the art and science of cartography needs to focus its contemporary efforts on work that connects to outside disciplines and is grounded in problems that are important to humankind and its sustainability. Following the develop...

  18. Accuracy of the photogrametric measuring system for large size elements

    Directory of Open Access Journals (Sweden)

    M. Grzelka

    2011-04-01

The aim of this paper is to present methods of estimating, and guidelines for verifying, the accuracy of optical photogrammetric measuring systems used for the measurement of large size elements. Measuring systems applied to measure workpieces of a large size, which often reach more than 10000 mm, require the use of appropriate standards. The standards provided by the manufacturer of photogrammetric systems are certified and are inspected annually. To make sure that these systems work properly, a special standard was developed: VDI/VDE 2634, "Optical 3D measuring systems. Imaging systems with point-by-point probing." According to the recommendations described in this standard, research on the accuracy of a photogrammetric measuring system was conducted using K class gauge blocks dedicated to calibrating and testing the accuracy of classic CMMs. The paper presents results of research estimating the actual error of indication for size measurement (MPEE) for the photogrammetric coordinate measuring system TRITOP.

  19. Describing Geospatial Assets in the Web of Data: A Metadata Management Scenario

    Directory of Open Access Journals (Sweden)

    Cristiano Fugazza

    2016-12-01

Metadata management is an essential enabling factor for geospatial assets because discovery, retrieval, and actual usage of the latter are tightly bound to the quality of these descriptions. Unfortunately, the multi-faceted landscape of metadata formats, requirements, and conventions makes it difficult to identify editing tools that can be easily tailored to the specificities of a given project, workgroup, and Community of Practice. Our solution is a template-driven metadata editing tool that can be customised to any XML-based schema. Its output is constituted by standards-compliant metadata records that also have a semantics-aware counterpart eliciting novel exploitation techniques. Moreover, external data sources can easily be plugged in to provide autocompletion functionalities on the basis of the data structures made available on the Web of Data. Besides presenting the essentials of customisation of the editor by means of two use cases, we extend the methodology to the whole life cycle of geospatial metadata. We demonstrate the novel capabilities enabled by RDF-based metadata representation with respect to traditional metadata management in the geospatial domain.

  20. A research on the security of wisdom campus based on geospatial big data

    Science.gov (United States)

    Wang, Haiying

    2018-05-01

A wisdom campus faces difficulties such as geospatial big data sharing, function expansion, data management, and the analysis and mining of geospatial big data; in particular, the problem that data security cannot be guaranteed has attracted increasingly prominent attention. In this article we put forward a data-oriented software architecture, designed around the ideology of orienting to data and taking data as the kernel, to solve the problems of the traditional software architecture, broaden campus spatial data research, and develop wisdom campus applications.

  1. Large exon size does not limit splicing in vivo.

    Science.gov (United States)

    Chen, I T; Chasin, L A

    1994-03-01

    Exon sizes in vertebrate genes are, with a few exceptions, limited to less than 300 bases. It has been proposed that this limitation may derive from the exon definition model of splice site recognition. In this model, a downstream donor site enhances splicing at the upstream acceptor site of the same exon. This enhancement may require contact between factors bound to each end of the exon; an exon size limitation would promote such contact. To test the idea that proximity was required for exon definition, we inserted random DNA fragments from Escherichia coli into a central exon in a three-exon dihydrofolate reductase minigene and tested whether the expanded exons were efficiently spliced. DNA from a plasmid library of expanded minigenes was used to transfect a CHO cell deletion mutant lacking the dhfr locus. PCR analysis of DNA isolated from the pooled stable cotransfectant populations displayed a range of DNA insert sizes from 50 to 1,500 nucleotides. A parallel analysis of the RNA from this population by reverse transcription followed by PCR showed a similar size distribution. Central exons as large as 1,400 bases could be spliced into mRNA. We also tested individual plasmid clones containing exon inserts of defined sizes. The largest exon included in mRNA was 1,200 bases in length, well above the 300-base limit implied by the survey of naturally occurring exons. We conclude that a limitation in exon size is not part of the exon definition mechanism.

  2. Research on presentation and query service of geo-spatial data based on ontology

    Science.gov (United States)

    Li, Hong-wei; Li, Qin-chao; Cai, Chang

    2008-10-01

The paper analyzes the deficiencies in the presentation and querying of geo-spatial data in current GIS, and discusses the advantages that ontology offers for the formalization of geo-spatial data and the presentation of semantic granularity. Taking a land-use classification system as an example, a domain ontology is constructed and described in OWL. Grade-level and category presentation of land-use data is realized, benefiting from the idea of vertical and horizontal navigation. Ontology-based query modes for geo-spatial data are then discussed, including queries based on types and grade levels, queries based on instances and spatial relations, and synthetic queries based on types and instances. These methods enrich the query modes of current GIS and are a useful attempt. The paper points out that the key to ontology-based presentation and query of spatial data is to construct a domain ontology that correctly reflects geo-concepts and their spatial relations and realizes a fine formalized description of them.
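The grade-level and instance queries described here can be illustrated without OWL tooling. The sketch below uses hypothetical class and parcel names, and a plain transitive subclass walk stands in for an ontology reasoner:

```python
# Toy land-use hierarchy ("vertical" structure: child -> parent class)
SUBCLASS_OF = {
    "Farmland": "LandUse",
    "PaddyField": "Farmland",
    "DryLand": "Farmland",
    "Urban": "LandUse",
}
# Instance assertions: parcel -> its (most specific) land-use class
INSTANCES = {"parcel42": "PaddyField", "parcel7": "Urban"}

def ancestors(cls):
    """All classes above cls, walking the subclass chain upward."""
    out = []
    while cls in SUBCLASS_OF:
        cls = SUBCLASS_OF[cls]
        out.append(cls)
    return out

def instances_of(cls):
    """Instances whose class is cls or any subclass of it (category query)."""
    return sorted(i for i, c in INSTANCES.items()
                  if c == cls or cls in ancestors(c))

print(instances_of("Farmland"))  # ['parcel42']
print(instances_of("LandUse"))   # ['parcel42', 'parcel7']
```

In a real system the hierarchy would come from the OWL ontology and the transitive walk from a reasoner or a SPARQL property path such as `rdfs:subClassOf*`.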

  3. Body size evolution in an old insect order: No evidence for Cope's Rule in spite of fitness benefits of large size.

    Science.gov (United States)

    Waller, John T; Svensson, Erik I

    2017-09-01

    We integrate field data and phylogenetic comparative analyses to investigate causes of body size evolution and stasis in an old insect order: odonates ("dragonflies and damselflies"). Fossil evidence for "Cope's Rule" in odonates is weak or nonexistent since the last major extinction event 65 million years ago, yet selection studies show consistent positive selection for increased body size among adults. In particular, we find that large males in natural populations of the banded demoiselle (Calopteryx splendens) over several generations have consistent fitness benefits both in terms of survival and mating success. Additionally, there was no evidence for stabilizing or conflicting selection between fitness components within the adult life-stage. This lack of stabilizing selection during the adult life-stage was independently supported by a literature survey on different male and female fitness components from several odonate species. We did detect several significant body size shifts among extant taxa using comparative methods and a large new molecular phylogeny for odonates. We suggest that the lack of Cope's rule in odonates results from conflicting selection between fitness advantages of large adult size and costs of long larval development. We also discuss competing explanations for body size stasis in this insect group. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.

  4. Size Reduction Techniques for Large Scale Permanent Magnet Generators in Wind Turbines

    Science.gov (United States)

    Khazdozian, Helena; Hadimani, Ravi; Jiles, David

    2015-03-01

Increased wind penetration is necessary to reduce U.S. dependence on fossil fuels, combat climate change and increase national energy security. The U.S. Department of Energy has recommended large scale and offshore wind turbines to achieve 20% wind electricity generation by 2030. Currently, geared doubly-fed induction generators (DFIGs) are typically employed in the drivetrain for conversion of mechanical to electrical energy. Yet, gearboxes account for the greatest downtime of wind turbines, decreasing reliability and contributing to loss of profit. Direct drive permanent magnet generators (PMGs) offer a reliable alternative to DFIGs by eliminating the gearbox. However, PMGs scale up in size and weight much more rapidly than DFIGs as rated power is increased, presenting significant challenges for large scale wind turbine application. Thus, size reduction techniques are needed for viability of PMGs in large scale wind turbines. Two size reduction techniques are presented. It is demonstrated that 25% size reduction of a 10MW PMG is possible with a high remanence theoretical permanent magnet. Additionally, the use of a Halbach cylinder in an outer rotor PMG is investigated to focus magnetic flux over the rotor surface in order to increase torque. This work was supported by the National Science Foundation under Grant No. 1069283 and a Barbara and James Palmer Endowment at Iowa State University.
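The flux-focusing property of the Halbach cylinder mentioned above can be made concrete with the textbook result for an idealized dipole Halbach cylinder; this standard approximation is not stated in the abstract and is not the authors' model:

```latex
B_{\text{bore}} = B_r \ln\!\left(\frac{r_o}{r_i}\right)
```

where $B_r$ is the remanence of the magnet material and $r_o$, $r_i$ are the outer and inner radii of the cylinder. Notably, the bore field can exceed $B_r$ itself whenever $r_o/r_i > e$, which is the basis for using Halbach arrays to concentrate flux and raise torque.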

  5. Geospatial Technology In Environmental Impact Assessments – Retrospective.

    Directory of Open Access Journals (Sweden)

    Goparaju Laxmi

    2015-10-01

Environmental Impact Assessments are studies conducted to give us an insight into the various impacts caused by an upcoming industry or any developmental activity. They should address various social, economic and environmental issues, ensuring that negative impacts are mitigated. In this context, geospatial technology has been used widely in recent times.

  6. A Spatial Data Infrastructure Integrating Multisource Heterogeneous Geospatial Data and Time Series: A Study Case in Agriculture

    Directory of Open Access Journals (Sweden)

    Gloria Bordogna

    2016-05-01

Currently, best practice to support land planning calls for the development of Spatial Data Infrastructures (SDI) capable of integrating both geospatial datasets and time series information from multiple sources, e.g., multitemporal satellite data and Volunteered Geographic Information (VGI). This paper describes an original OGC-standard interoperable SDI architecture and a geospatial data and metadata workflow for creating and managing multisource heterogeneous geospatial datasets and time series, and discusses it in the framework of the Space4Agri project study case developed to support the agricultural sector in the Lombardy region, Northern Italy. The main novel contributions go beyond the application domain for which the SDI has been developed and are the following: the ingestion, within an a-centric SDI potentially distributed over several nodes on the Internet to support scalability, of products derived by processing remote sensing images, authoritative data, georeferenced in-situ measurements, and voluntary information (VGI) created by farmers and agronomists using an original smart app; the workflow automation for publishing sets and time series of heterogeneous multisource geospatial data and the related web services; and, finally, the project geoportal, which can ease the analysis of the geospatial datasets and time series by providing complex intelligent spatio-temporal query and answering facilities.
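As an illustration of the OGC-standard access such an SDI exposes, the sketch below builds a WFS 2.0 GetFeature request using only the standard library; the endpoint and layer name are invented for the example, not taken from the Space4Agri project:

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base, type_name, bbox, srs="EPSG:4326"):
    """Build an OGC WFS 2.0 GetFeature KVP request for one layer and bbox."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "bbox": ",".join(map(str, bbox)) + "," + srs,
        "outputFormat": "application/json",
    }
    return base + "?" + urlencode(params)

# Hypothetical endpoint and layer; the bbox roughly covers Lombardy.
url = wfs_getfeature_url("https://example.org/geoserver/wfs",
                         "s4a:field_observations",
                         (8.5, 44.7, 11.4, 46.6))
print(url)
```

A client would fetch this URL and receive the matching features (here requested as GeoJSON), which is the interoperability layer that lets the geoportal and external tools query the same datasets.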

  7. Business models for implementing geospatial technologies in transportation decision-making

    Science.gov (United States)

    2007-03-31

This report describes six State DOTs' business models for implementing geospatial technologies. It provides a comparison of the organizational factors influencing how Arizona DOT, Delaware DOT, Georgia DOT, Montana DOT, North Carolina DOT, and Okla...

  8. 75 FR 10309 - Announcement of National Geospatial Advisory Committee Meeting

    Science.gov (United States)

    2010-03-05

    ... Geospatial Advisory Committee (NGAC) will meet on March 24-25, 2010 at the One Washington Circle Hotel, 1... implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting...

  9. GEO-SPATIAL MODELING OF TRAVEL TIME TO MEDICAL FACILITIES IN MUNA BARAT DISTRICT, SOUTHEAST SULAWESI PROVINCE, INDONESIA

    Directory of Open Access Journals (Sweden)

    Nelson Sula

    2018-03-01

Background: Health services are strongly influenced by regional topography, and road infrastructure is key to access to health services. Geographic information systems are a tool for modeling that access. Objective: To analyze geospatial data on travel time to medical facilities in Muna Barat district, Southeast Sulawesi Province, Indonesia. Methods: This research used geospatial analysis, classifying raster data and overlaying it with layers such as Digital Elevation Model (DEM) data, vector road data, and the locations of public health centers (Puskesmas). Results: The geospatial analysis showed that the travel time to a Puskesmas in the Napano Kusambi and Kusambi sub-districts is between 90 and 120 minutes, and the travel time to the hospital in Kusambi sub-district is more than 2 hours. Conclusion: The output of this geospatial analysis can be an input for local government in planning infrastructure development in Muna Barat District, Indonesia.
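A toy version of such a travel-time surface can be computed with Dijkstra's algorithm over a per-cell speed raster; the grid, speeds, and facility location below are invented for illustration, and a real analysis would derive the cost surface from the DEM and road layers rather than this sketch:

```python
import heapq

def travel_time_raster(speed, facility, cell_km=1.0):
    """Minutes to reach each cell from `facility`, via Dijkstra over
    4-neighbour moves on a speed raster (km/h per cell)."""
    rows, cols = len(speed), len(speed[0])
    INF = float("inf")
    time = [[INF] * cols for _ in range(rows)]
    fr, fc = facility
    time[fr][fc] = 0.0
    pq = [(0.0, fr, fc)]
    while pq:
        t, r, c = heapq.heappop(pq)
        if t > time[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and speed[nr][nc] > 0:
                # minutes to cross one cell at the neighbour's speed
                step = 60.0 * cell_km / speed[nr][nc]
                if t + step < time[nr][nc]:
                    time[nr][nc] = t + step
                    heapq.heappush(pq, (t + step, nr, nc))
    return time

# 40 km/h along a "road", 4 km/h walking elsewhere; facility at (0, 0)
speed = [[40, 40, 40],
         [ 4,  4, 40],
         [ 4,  4, 40]]
t = travel_time_raster(speed, (0, 0))
print(round(t[2][2], 1))  # four road cells at 40 km/h: 4 x 1.5 min = 6.0
```

Classifying the resulting minutes into bands (e.g., 90-120 minutes, over 2 hours) reproduces the kind of accessibility map the study reports.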

  10. Quantifying environmental limiting factors on tree cover using geospatial data.

    Directory of Open Access Journals (Sweden)

    Jonathan A Greenberg

Environmental limiting factors (ELFs) are the thresholds that determine the maximum or minimum biological response for a given suite of environmental conditions. We asked the following questions: (1) Can we detect ELFs on percent tree cover across the eastern slopes of the Lake Tahoe Basin, NV? (2) How are the ELFs distributed spatially? (3) To what extent are unmeasured environmental factors limiting tree cover? ELFs are difficult to quantify as they require significant sample sizes. We addressed this by using geospatial data over a relatively large spatial extent, where the wall-to-wall sampling ensures the inclusion of rare data points which define the minimum or maximum response to environmental factors. We tested mean temperature, minimum temperature, potential evapotranspiration (PET) and PET minus precipitation (PET-P) as potential limiting factors on percent tree cover. We found that the study area showed system-wide limitations on tree cover, and each of the factors showed evidence of being limiting on tree cover. However, only 1.2% of the total area appeared to be limited by the four (4) environmental factors, suggesting other unmeasured factors are limiting much of the tree cover in the study area. Where sites were near their theoretical maximum, non-forest sites (tree cover < 25%) were primarily limited by cold mean temperatures, open-canopy forest sites (tree cover between 25% and 60%) were primarily limited by evaporative demand, and closed-canopy forests were not limited by any particular environmental factor. The detection of ELFs is necessary in order to fully understand the width of limitations that species experience within their geographic range.
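The envelope-detection idea (the maximum response within each slice of an environmental factor) can be sketched with a bin-and-percentile estimator; the data below are synthetic, and the authors' actual estimator may differ:

```python
import random

def upper_envelope(factor, response, nbins=10, pct=0.99):
    """Approximate limiting-factor envelope: the ~99th percentile of the
    response within each bin of the environmental factor."""
    lo, hi = min(factor), max(factor)
    width = (hi - lo) / nbins or 1.0
    bins = [[] for _ in range(nbins)]
    for x, y in zip(factor, response):
        i = min(int((x - lo) / width), nbins - 1)
        bins[i].append(y)
    env = []
    for b in bins:
        b.sort()
        env.append(b[int(pct * (len(b) - 1))] if b else None)
    return env

# Synthetic wall-to-wall sample: tree cover capped by a temperature ramp
random.seed(0)
temp = [random.uniform(-5.0, 15.0) for _ in range(5000)]
cover = [random.uniform(0.0, min(100.0, max(0.0, (t + 5.0) * 10.0)))
         for t in temp]
env = upper_envelope(temp, cover)
print([None if e is None else round(e) for e in env])
```

With dense sampling, the per-bin upper percentile traces the ramp and then plateaus, which is the signature of a factor that limits the response only over part of its range.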

  11. Geospatial techniques for developing a sampling frame of watersheds across a region

    Science.gov (United States)

    Gresswell, Robert E.; Bateman, Douglas S.; Lienkaemper, George; Guy, T.J.

    2004-01-01

    Current land-management decisions that affect the persistence of native salmonids are often influenced by studies of individual sites that are selected based on judgment and convenience. Although this approach is useful for some purposes, extrapolating results to areas that were not sampled is statistically inappropriate because the sampling design is usually biased. Therefore, in recent investigations of coastal cutthroat trout (Oncorhynchus clarki clarki) located above natural barriers to anadromous salmonids, we used a methodology for extending the statistical scope of inference. The purpose of this paper is to apply geospatial tools to identify a population of watersheds and develop a probability-based sampling design for coastal cutthroat trout in western Oregon, USA. The population of mid-size watersheds (500-5800 ha) west of the Cascade Range divide was derived from watershed delineations based on digital elevation models. Because a database with locations of isolated populations of coastal cutthroat trout did not exist, a sampling frame of isolated watersheds containing cutthroat trout had to be developed. After the sampling frame of watersheds was established, isolated watersheds with coastal cutthroat trout were stratified by ecoregion and erosion potential based on dominant bedrock lithology (i.e., sedimentary and igneous). A stratified random sample of 60 watersheds was selected with proportional allocation in each stratum. By comparing watershed drainage areas of streams in the general population to those in the sampling frame and the resulting sample (n = 60), we were able to evaluate how representative the subset of watersheds was of the population of watersheds. Geospatial tools provided a relatively inexpensive means to generate the information necessary to develop a statistically robust, probability-based sampling design.
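Stratified random sampling with proportional allocation, as used above, is simple to express in code. A sketch with a toy sampling frame (the stratum names and 70/30 split are illustrative, not the study's):

```python
import random
from collections import Counter

def stratified_sample(frame, strata_of, n_total, seed=0):
    """Draw a stratified random sample with proportional allocation.
    `frame` is the sampling frame (e.g. watershed IDs); `strata_of` maps
    each unit to its stratum (e.g. ecoregion x lithology class)."""
    rng = random.Random(seed)
    by_stratum = {}
    for unit in frame:
        by_stratum.setdefault(strata_of[unit], []).append(unit)
    sample = []
    for stratum, units in sorted(by_stratum.items()):
        k = round(n_total * len(units) / len(frame))   # proportional share
        sample.extend(rng.sample(units, min(k, len(units))))
    return sample

# Toy frame: 100 watersheds, 70 sedimentary and 30 igneous; sample of 10.
frame = list(range(100))
strata = {w: ("sedimentary" if w < 70 else "igneous") for w in frame}
picked = stratified_sample(frame, strata, n_total=10)
counts = Counter(strata[w] for w in picked)
```

With unequal stratum fractions the per-stratum rounding can make the total drift from `n_total`; production designs usually use a largest-remainder rule to fix the allocation exactly.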

  12. Reviews of Geospatial Information Technology and Collaborative Data Delivery for Disaster Risk Management

    Directory of Open Access Journals (Sweden)

    Hiroyuki Miyazaki

    2015-09-01

    Full Text Available Because geospatial information technology is considered necessary for disaster risk management (DRM), the need for more effective collaboration between providers and end users in data delivery is increasing. This paper reviews the following: (i) schemes of disaster risk management and collaborative data operation in DRM; (ii) geospatial information technology in terms of applications to the schemes reviewed; and (iii) ongoing practices of collaborative data delivery under the schemes reviewed. The paper concludes by discussing the future of collaborative data delivery and the progress of the technologies.

  13. Qualitative-Geospatial Methods of Exploring Person-Place Transactions in Aging Adults: A Scoping Review.

    Science.gov (United States)

    Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri

    2017-06-01

    Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data and assessed whether the geospatial and qualitative methods were supported by a specified methodology, the methods of data analysis, and the extent of integration of geospatial and qualitative methods. Fifteen studies were included and used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    Science.gov (United States)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. Interrogating and analyzing these large datasets in real time is a task that pushes the boundaries of computing hardware and software. But the integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap: it allows easy integration of climate datasets with geospatial datasets and provides sophisticated visualization and analysis capabilities. The objective of TrikeND-iGlobe is the continued building of an open-source 4D virtual globe application, using NASA World Wind technology, that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis, locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.).

  15. Free and Open Source Software for Geospatial in the field of planetary science

    Science.gov (United States)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of OpenData and data from collaborative mapping projects has increased interest in the tools, procedures and methods used to handle spatially-related information. Free and Open Source Software projects devoted to geospatial data handling are enjoying considerable success, as the use of interoperable formats and protocols allows users to choose whatever pipeline of tools and libraries is needed to solve a particular task, adapting the software scene to their specific problem. In particular, the Free and Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way to interact among several institutions. When it comes to planetary sciences, geospatial Free and Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are being distributed along with their source code, and the interaction between user and developer is often very close, creating a continuum between these two figures. A widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and Free and Open Source GIS, open GIS formats and network protocols make it possible to extend existing tools and methods, developed to solve Earth-based problems, to the study of solar system bodies. A day in the working life of a researcher using Free and Open Source Software for geospatial will be presented, as well as benefits and

  16. Geospatial Image Stream Processing: Models, techniques, and applications in remote sensing change detection

    Science.gov (United States)

    Rueda-Velasquez, Carlos Alberto

    Detection of changes in environmental phenomena using remotely sensed data is a major requirement in the Earth sciences, especially in natural disaster related scenarios where real-time detection plays a crucial role in the saving of human lives and the preservation of natural resources. Although various approaches formulated to model multidimensional data can in principle be applied to the inherent complexity of remotely sensed geospatial data, there are still challenging peculiarities that demand a precise characterization in the context of change detection, particularly in scenarios of fast changes. In the same vein, geospatial image streams do not fit appropriately in the standard Data Stream Management System (DSMS) approach because these systems mainly deal with tuple-based streams. Recognizing the necessity for a systematic effort to address the above issues, the work presented in this thesis is a concrete step toward the foundation and construction of an integrated Geospatial Image Stream Processing framework, GISP. First, we present a data and metadata model for remotely sensed image streams. We introduce a precise characterization of images and image streams in the context of remotely sensed geospatial data. On this foundation, we define spatially-aware temporal operators with a consistent semantics for change analysis tasks. We address the change detection problem in settings where multiple image stream sources are available, and thus we introduce an architectural design for the processing of geospatial image streams from multiple sources. With the aim of targeting collaborative scientific environments, we construct a realization of our architecture based on Kepler, a robust and widely used scientific workflow management system, as the underlying computational support; and open data and Web interface standards, as a means to facilitate the interoperability of GISP instances with other processing infrastructures and client applications. 
We demonstrate our

  17. Design and Development of a Framework Based on Ogc Web Services for the Visualization of Three Dimensional Large-Scale Geospatial Data Over the Web

    Science.gov (United States)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. WebGIS systems are traditionally bound to a simplified two-dimensional representation of reality, and though they successfully address the lack of flexibility and simplicity of traditional desktop clients, considerable effort is still needed to reach desktop GIS features such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, and therefore allows traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of building a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.

  18. Investigating Climate Change Issues With Web-Based Geospatial Inquiry Activities

    Science.gov (United States)

    Dempsey, C.; Bodzin, A. M.; Sahagian, D. L.; Anastasio, D. J.; Peffer, T.; Cirucci, L.

    2011-12-01

    In the Environmental Literacy and Inquiry middle school Climate Change curriculum we focus on essential climate literacy principles with an emphasis on weather and climate, Earth system energy balance, greenhouse gases, paleoclimatology, and how human activities influence climate change (http://www.ei.lehigh.edu/eli/cc/). It incorporates a related framework and set of design principles that guide the development of the geospatial technology-integrated Earth and environmental science curriculum materials. Students use virtual globes, Web-based tools including an interactive carbon calculator and geologic timeline, and inquiry-based lab activities to investigate climate change topics. The curriculum includes educative curriculum materials that are designed to promote and support teachers' learning of important climate change content and issues, geospatial pedagogical content knowledge, and geographic spatial thinking. The curriculum provides baseline instructional guidance for teachers as well as implementation and adaptation guidance for teaching diverse learners, including low-level readers, English language learners and students with disabilities. In the curriculum, students use geospatial technology tools including Google Earth with embedded spatial data to investigate global temperature changes, areas affected by climate change, evidence of climate change, and the effects of sea level rise on the existing landscape. We conducted a design-based research implementation study with urban middle school students. Findings showed that the Climate Change curriculum significantly improved urban middle school students' understanding of climate change concepts.

  19. Organizational needs for managing and preserving geospatial data and related electronic records

    Directory of Open Access Journals (Sweden)

    R R Downs

    2006-01-01

    Full Text Available Government agencies and other organizations are required to manage and preserve the records that they create and use in order to facilitate future access and reuse. The increasing use of geospatial data and related electronic records presents new challenges for these organizations, which have relied on traditional practices for managing and preserving records in printed form. This article reports on an investigation of current and future needs for managing and preserving geospatial electronic records on the part of local- and state-level organizations in the New York City metropolitan region. It introduces the study and describes the organizational needs observed, including needs for organizational coordination and interorganizational cooperation throughout the entire data lifecycle.

  20. Gastro-oesophageal reflux in large-sized, deep-chested versus small-sized, barrel-chested dogs undergoing spinal surgery in sternal recumbency.

    Science.gov (United States)

    Anagnostou, Tilemahos L; Kazakos, George M; Savvas, Ioannis; Kostakis, Charalampos; Papadopoulou, Paraskevi

    2017-01-01

    The aim of this study was to investigate whether an increased frequency of gastro-oesophageal reflux (GOR) is more common in large-sized, deep-chested dogs undergoing spinal surgery in sternal recumbency than in small-sized, barrel-chested dogs. Prospective, cohort study. Nineteen small-sized, barrel-chested dogs (group B) and 26 large-sized, deep-chested dogs (group D). All animals were premedicated with intramuscular (IM) acepromazine (0.05 mg kg⁻¹) and pethidine (3 mg kg⁻¹). Anaesthesia was induced with intravenous sodium thiopental and maintained with halothane in oxygen. Lower oesophageal pH was monitored continuously after induction of anaesthesia. Gastro-oesophageal reflux was considered to have occurred whenever pH values > 7.5 or < 4 were recorded. If GOR was detected during anaesthesia, measures were taken to avoid aspiration of gastric contents into the lungs and to prevent the development of oesophagitis or oesophageal stricture. The frequency of GOR during anaesthesia was significantly higher in group D (6/26 dogs; 23.07%) than in group B (0/19 dogs; 0%) (p = 0.032). Signs indicative of aspiration pneumonia, oesophagitis or oesophageal stricture were not reported in any of the GOR cases. In large-sized, deep-chested dogs undergoing spinal surgery in sternal recumbency, it would seem prudent to consider measures aimed at preventing GOR and its potentially devastating consequences (oesophagitis/oesophageal stricture, aspiration pneumonia). Copyright © 2016 Association of Veterinary Anaesthetists and American College of Veterinary Anesthesia and Analgesia. Published by Elsevier Ltd. All rights reserved.

  1. Geospatial Health: the first five years

    Directory of Open Access Journals (Sweden)

    Jürg Utzinger

    2011-11-01

    Full Text Available Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000, and the inaugural issue of its official journal was published in November 2006 with the aim of covering all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and as an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member editorial board. This overview takes stock of the first five years of publishing: 133 contributions have been published so far, primarily original research (79.7%), followed by reviews (7.5%), announcements (6.0%), editorials and meeting reports (3.0% each) and a preface in the first issue. A content analysis of all the original research articles and reviews reveals that three quarters of the publications focus on human health, with the remainder dealing with veterinary health. Two thirds of the papers come from Africa, Asia and Europe, with similar numbers of contributions from each continent. Studies of more than 35 different diseases, injuries and risk factors have been presented. Malaria and schistosomiasis were identified as the two most important diseases (11.2% each). Almost half the contributions were based on GIS, one third on spatial analysis, often using advanced Bayesian geostatistics (13.8%), and one quarter on remote sensing. The 120 original research articles, reviews and editorials were produced by 505 authors based at institutions and universities in 52 countries.

  2. Fission gas release during post irradiation annealing of large grain size fuels from Hinkley point B

    International Nuclear Information System (INIS)

    Killeen, J.C.

    1997-01-01

    A series of post-irradiation anneals has been carried out on fuel taken from an experimental stringer from Hinkley Point B AGR. The stringer was part of an experimental programme in the reactor to study the effect of large grain size fuel. Three differing fuel types were present in separate pins in the stringer. One variant of large grain size fuel had been prepared by using an MgO dopant during fuel manufacture, a second by high-temperature sintering of standard fuel, and the third was a reference 12 μm grain size fuel. Both large grain size variants had similar grain sizes of around 35 μm. The present experiments took fuel samples from highly rated pins from the stringer, with local burn-up in excess of 25 GWd/tU, and annealed these at temperatures of up to 1535 deg. C under reducing conditions to allow a comparison of fission gas behaviour at high release levels. The results demonstrate the beneficial effect of large grain size on the release rate of 85Kr following interlinkage. At low temperatures and release rates there was no difference between the fuel types, but at temperatures in excess of 1400 deg. C the release rate was found to be inversely dependent on the fuel grain size. The experiments showed some differences between the doped and undoped large grain size fuel, in that the former became interlinked at a lower temperature, releasing fission gas at an increased rate at this temperature. At higher temperatures the grain size effect was dominant. The temperature dependence of fission gas release was determined over a narrow range of temperature and found to be similar for all three types and for both pre-interlinkage and post-interlinkage releases; the difference between the release rates is then seen to be controlled by grain size. (author). 4 refs, 7 figs, 3 tabs

  3. Fission gas release during post irradiation annealing of large grain size fuels from Hinkley point B

    Energy Technology Data Exchange (ETDEWEB)

    Killeen, J C [Nuclear Electric plc, Barnwood (United Kingdom)

    1997-08-01

    A series of post-irradiation anneals has been carried out on fuel taken from an experimental stringer from Hinkley Point B AGR. The stringer was part of an experimental programme in the reactor to study the effect of large grain size fuel. Three differing fuel types were present in separate pins in the stringer. One variant of large grain size fuel had been prepared by using an MgO dopant during fuel manufacture, a second by high-temperature sintering of standard fuel, and the third was a reference 12 μm grain size fuel. Both large grain size variants had similar grain sizes of around 35 μm. The present experiments took fuel samples from highly rated pins from the stringer, with local burn-up in excess of 25 GWd/tU, and annealed these at temperatures of up to 1535 deg. C under reducing conditions to allow a comparison of fission gas behaviour at high release levels. The results demonstrate the beneficial effect of large grain size on the release rate of 85Kr following interlinkage. At low temperatures and release rates there was no difference between the fuel types, but at temperatures in excess of 1400 deg. C the release rate was found to be inversely dependent on the fuel grain size. The experiments showed some differences between the doped and undoped large grain size fuel, in that the former became interlinked at a lower temperature, releasing fission gas at an increased rate at this temperature. At higher temperatures the grain size effect was dominant. The temperature dependence of fission gas release was determined over a narrow range of temperature and found to be similar for all three types and for both pre-interlinkage and post-interlinkage releases; the difference between the release rates is then seen to be controlled by grain size. (author). 4 refs, 7 figs, 3 tabs.

  4. FOSS geospatial libraries in scientific workflow environments: experiences and directions

    CSIR Research Space (South Africa)

    McFerren, G

    2011-07-01

    Full Text Available of experiments. In the context of three sets of research (wildfire research, flood modelling and the linking of disease outbreaks to multi-scale environmental conditions), we describe our efforts to provide geospatial capability for scientific workflow software...

  5. Fast Deployment on the Cloud of Integrated Postgres, API and a Jupyter Notebook for Geospatial Collaboration

    Science.gov (United States)

    Fatland, R.; Tan, A.; Arendt, A. A.

    2016-12-01

    We describe a Python-based implementation of a PostgreSQL database accessed through an Application Programming Interface (API) hosted on the Amazon Web Services public cloud. The data are geospatial and concern hydrological model results for the glaciated catchment basins of southcentral and southeast Alaska. The implementation, however, is intended to generalize to other forms of geophysical data, particularly data intended to be shared across a collaborative team or publicly. An example (moderate-size) dataset is provided together with the code base and a complete installation tutorial on GitHub. An enthusiastic scientist with some familiarity with software installation can replicate the example system in two hours. The installation includes a database, an API, a test client and a supporting Jupyter Notebook, specifically oriented towards Python 3 and markup text so as to comprise an executable paper. Installation 'on the cloud' often engenders discussion of cloud cost and data safety. By treating the process as somewhat of a "cookbook" we hope first to demonstrate the feasibility of the proposition. A discussion of cost and data security is provided in this presentation and in the accompanying tutorial/documentation. This geospatial data system case study is part of a larger effort at the University of Washington to enable research teams to take advantage of the public cloud to meet challenges in data management and analysis.
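The record does not spell out the API's response format, but geospatial APIs of this kind conventionally return GeoJSON. A minimal sketch of serializing database rows into a GeoJSON FeatureCollection; the record tuples, basin names and runoff values below are hypothetical, not the project's data:

```python
import json

def to_feature_collection(records):
    """Serialize (id, lon, lat, properties) rows into the GeoJSON
    FeatureCollection payload a small geospatial API would typically return.
    GeoJSON coordinate order is [longitude, latitude]."""
    features = []
    for rec_id, lon, lat, props in records:
        features.append({
            "type": "Feature",
            "id": rec_id,
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": props,
        })
    return {"type": "FeatureCollection", "features": features}

# Hypothetical model-output records for two glaciated catchments.
records = [
    (1, -148.9, 60.1, {"basin": "Wolverine", "runoff_mm": 410.0}),
    (2, -134.6, 58.4, {"basin": "Lemon Creek", "runoff_mm": 523.5}),
]
payload = json.dumps(to_feature_collection(records))
```

Returning a standard format like this is what lets a Jupyter Notebook client, a web map, or another team's tooling consume the same endpoint without custom parsing.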

  6. Small-size pedestrian detection in large scene based on fast R-CNN

    Science.gov (United States)

    Wang, Shengke; Yang, Na; Duan, Lianghua; Liu, Lu; Dong, Junyu

    2018-04-01

    Pedestrian detection is a canonical sub-problem of object detection that has been in high demand in recent years. Although recent deep learning object detectors such as Fast/Faster R-CNN have shown excellent performance for general object detection, they have had limited success with small-size pedestrian detection in large-view scenes. We show that the insufficient resolution of feature maps leads to unsatisfactory accuracy when handling small instances. In this paper, we investigate issues involving Fast R-CNN for pedestrian detection. Driven by these observations, we propose a very simple but effective baseline for pedestrian detection based on Fast R-CNN: we employ the DPM detector to generate proposals for accuracy, and train a Fast R-CNN style network to jointly optimize small-size pedestrian detection, with skip connections concatenating features from different layers to address the coarseness of the feature maps. Accuracy for small-size pedestrian detection in real large scenes is improved as a result.
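A minimal, framework-free sketch of the skip-connection idea the abstract describes: a coarse (low-resolution, semantically strong) feature map is upsampled to the resolution of a finer map, and the two are concatenated channel-wise. The shapes and values are illustrative only, and real detectors use learned upsampling rather than nearest-neighbour.

```python
def upsample_nearest(fmap, factor):
    """Nearest-neighbour upsampling of a [channel][row][col] feature map."""
    out = []
    for chan in fmap:
        rows = []
        for r in range(len(chan) * factor):
            src = chan[r // factor]
            rows.append([src[c // factor] for c in range(len(src) * factor)])
        out.append(rows)
    return out

def skip_concat(fine, coarse, factor):
    """Fuse a fine, high-resolution feature map with an upsampled coarse
    one by channel-wise concatenation, as in a skip connection."""
    return fine + upsample_nearest(coarse, factor)

# Toy maps: one 4x4 "fine" channel and one 2x2 "coarse" channel.
fine   = [[[0] * 4 for _ in range(4)]]
coarse = [[[1, 2],
           [3, 4]]]
fused = skip_concat(fine, coarse, factor=2)   # 2 channels, each 4x4
```

The fused map keeps the fine layer's spatial detail while carrying the coarse layer's context, which is exactly what small instances need from otherwise too-coarse feature maps.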

  7. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    Science.gov (United States)

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-04-01

    With the acceleration of China's informatization process, the Chinese government has made substantive strides in advancing the development and application of digital technology, which promotes the evolution of e-government and its informatization. Meanwhile, cloud computing, as a service model built on pooled resources, can connect huge resource pools to provide a variety of IT services, and has become a relatively mature technical pattern backed by extensive study and massive practical application. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified but physically dispersed fundamental database, and developed a national integrated information database system supporting major e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.

  8. Online Resources to Support Professional Development for Managing and Preserving Geospatial Data

    Science.gov (United States)

    Downs, R. R.; Chen, R. S.

    2013-12-01

    Improved capabilities of information and communication technologies (ICT) enable the development of new systems and applications for collecting, managing, disseminating, and using scientific data. New knowledge, skills, and techniques are also being developed to leverage these new ICT capabilities and improve scientific data management practices throughout the entire data lifecycle. In light of these developments and in response to increasing recognition of the wider value of scientific data for society, government agencies are requiring plans for the management, stewardship, and public dissemination of data and research products that are created by government-funded studies. Recognizing that data management and dissemination have not been part of traditional science education programs, new educational programs and learning resources are being developed to prepare new and practicing scientists, data scientists, data managers, and other data professionals with skills in data science and data management. Professional development and training programs also are being developed to address the need for scientists and professionals to improve their expertise in using the tools and techniques for managing and preserving scientific data. The Geospatial Data Preservation Resource Center offers an online catalog of various open access publications, open source tools, and freely available information for the management and stewardship of geospatial data and related resources, such as maps, GIS, and remote sensing data. Containing over 500 resources that can be found by type, topic, or search query, the geopreservation.org website enables discovery of various types of resources to improve capabilities for managing and preserving geospatial data. Applications and software tools can be found for use online or for download. Online journal articles, presentations, reports, blogs, and forums are also available through the website. 
Available education and training materials include

  9. Learning transfer of geospatial technologies in secondary science and mathematics core areas

    Science.gov (United States)

    Nielsen, Curtis P.

    The purpose of this study was to investigate the transfer of geospatial technology knowledge and skill presented in a social sciences course context to other core areas of the curriculum. Specifically, this study explored the transfer of geospatial technology knowledge and skill to the STEM-related core areas of science and mathematics among ninth-grade students. Haskell's (2001) research on "levels of transfer" provided the theoretical framework for this study, which sought to demonstrate the experimental group's higher ability to transfer geospatial skills, higher mean assignment scores, higher post-test scores, higher geospatial skill application and deeper levels of transfer application than the control group. The participants of the study consisted of thirty ninth-graders enrolled in U.S. History, Earth Science and Integrated Mathematics 1 courses. The primary investigator of this study had no previous classroom experience with this group of students. The participants, who were enrolled in the school's existing two-section class configuration, were assigned to experimental and control groups. The experimental group had ready access to Macintosh MacBook laptop computers, and the control group had ready access to Macintosh iPads. All participants in U.S. History received instruction with, and were required to use, ArcGIS Explorer Online during a Westward Expansion project. All participants were given the ArcGIS Explorer Online content assessment following the completion of the U.S. History project. Once the project in U.S. History was completed, Earth Science and Integrated Mathematics 1 began units of instruction with a multiple-choice content pre-test created by the classroom teachers. Experimental participants received the same unit of instruction without the use or influence of ArcGIS Explorer Online. At the end of the Earth Science and Integrated Math 1 units, the same multiple-choice test was administered as the content post-test. Following the

  10. High performance geospatial and climate data visualization using GeoJS

    Science.gov (United States)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs, together with sophisticated Scalable Vector Graphics (SVG) features, in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some are limited with regard to geoinformatics (information visualization in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, integrating them into other GIS libraries requires that geoinformatics visualizations be constructed manually and separately, or that the code be mixed in an unintuitive way. We developed GeoJS with the following motivations:
• To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics
• To develop an extensible library that can combine data from multiple sources and render them using multiple backends
• To build a library that works well with existing scientific visualization tools such as VTK
We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring

  11. Geospatial Image Mining For Nuclear Proliferation Detection: Challenges and New Opportunities

    Energy Technology Data Exchange (ETDEWEB)

Vatsavai, Raju [ORNL; Bhaduri, Budhendra L [ORNL; Cheriyadat, Anil M [ORNL; Arrowood, Lloyd [Y-12 National Security Complex; Bright, Eddie A [ORNL; Gleason, Shaun Scott [ORNL; Diegert, Carl [Sandia National Laboratories (SNL); Katsaggelos, Aggelos K [ORNL; Pappas, Thrasos N [ORNL; Porter, Reid [Los Alamos National Laboratory (LANL); Bollinger, Jim [Savannah River National Laboratory (SRNL); Chen, Barry [Lawrence Livermore National Laboratory (LLNL); Hohimer, Ryan [Pacific Northwest National Laboratory (PNNL)]

    2010-01-01

With the increasing understanding and availability of nuclear technologies, and the increasing pursuit of nuclear technologies by several new countries, it is becoming increasingly important to monitor nuclear proliferation activities. There is a great need to develop technologies that automatically or semi-automatically detect nuclear proliferation activities using remote sensing. Images acquired from earth observation satellites are an important source of information for detecting proliferation activities. High-resolution remote sensing images are highly useful in verifying the correctness, as well as the completeness, of any nuclear program. DOE national laboratories are interested in detecting nuclear proliferation by developing advanced geospatial image mining algorithms. In this paper we describe the current understanding of geospatial image mining techniques, enumerate key gaps, and identify future research needs in the context of nuclear proliferation.

  12. Leveraging geospatial data, technology, and methods for improving the health of communities: priorities and strategies from an expert panel convened by the CDC.

    Science.gov (United States)

    Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L

    2010-04-01

In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training and education; and engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur in parallel or recur across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.

  13. A geospatial soil-based DSS to reconcile landscape management and land protection

    Science.gov (United States)

    Manna, Piero; Basile, Angelo; Bonfante, Antonello; D'Antonio, Amedeo; De Michele, Carlo; Iamarino, Michela; Langella, Giuliano; Florindo Mileti, Antonio; Pileri, Paolo; Vingiani, Simona; Terribile, Fabio

    2017-04-01

The implementation of the UN 2030 Agenda may represent a great opportunity to place soil science at the heart of many Sustainable Development Goals (e.g. SDGs 2, 3, 13, 15, 15.3, 16.7). On the other hand, the high complexity embedded in the factual implementation of the SDGs and many other ambitious objectives (e.g. FAO goals) may cause new frustrations if these policy documents do not bring real progress. The scientific communities are asked to help disentangle this complexity and possibly identify a way forward. This may help the large number of European directives (e.g. WFD, EIA), regulations and communications that aim to achieve a better environment but still face large difficulties in their full implementation (e.g. COM2015/120; COM2013/683). This contribution aims to provide a different perspective: the full implementation of the SDGs and of integrated land policies requires challenging some key overlooked issues, including full competence in (and the capability to manage) landscape variability, its multi-functionality (e.g. agriculture/environment) and its dynamic nature (many processes, including crop growth and the fate of pollutants, are dynamic); moreover, it requires supporting actions at a very detailed local scale, since many processes and problems are site specific. Soil is the pulsing heart of the landscape and of all the above issues. Accordingly, we aim to demonstrate the multiple benefits of using a smart geoSpatial Decision Support System (S-DSS) grounded in soil modelling, called SOILCONSWEB (an EU LIFE+ project and its extensions). It is a freely-accessible web platform based on a Geospatial Cyber-Infrastructure (GCI) and developed in Valle Telesina (South Italy) over an area of 20,000 ha. It supports multilevel decision-making in agriculture and the environment, including interaction with other land uses (such as landscape and urban planning), and thus simultaneously contributes to SDGs 2, 3, 13, 15, 15.3, 16.7.

  14. Contextualizing Cave Maps as Geospatial Information: Case Study of Indonesia

    Science.gov (United States)

    Reinhart, H.

    2017-12-01

Caves are the result of solution processes. Because they originate from geochemical and tectonic activity, they can be considered geosphere phenomena. As geosphere phenomena, particularly of karst landforms, caves have spatial dimensions and aspects. The utilization and development of caves are increasing in many sectors such as hydrology, earth science, and the tourism industry. However, the spatial aspects of caves receive little attention due to the lack of recognition of cave maps. Many stakeholders do not know the significance and importance of cave maps in guiding the development of a cave, largely because of a lack of information. Therefore, it is necessary to put cave maps into the right context in order to make stakeholders realize their significance. Cave maps would then be officially regarded as tools related to policy, development, and conservation of caves, and their use and application would be regulated. This paper aims to contextualize cave maps with respect to the relevant legal act, Act Number 4 Year 2011 About Geospatial Information. The contextualization is done by scrutinizing every article and clause related to cave maps and seeking the contextual elements in them. The results show that cave maps can be regarded as geospatial information and classified as thematic geospatial information, and that their use can be regulated through Act Number 4 Year 2011. The regulations cover data acquisition, databases, authorities, surveyors, and the obligation to provide cave maps when planning a cave's development and the surrounding environment.

  15. Geo-Spatial Tactical Decision Aid Systems: Fuzzy Logic for Supporting Decision Making

    National Research Council Canada - National Science Library

    Grasso, Raffaele; Giannecchini, Simone

    2006-01-01

    .... This paper describes a tactical decision aid system based on fuzzy logic reasoning for data fusion and on current Open Geospatial Consortium specifications for interoperability, data dissemination...
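The fuzzy-logic fusion mentioned in the abstract can be illustrated with a minimal, self-contained sketch: triangular membership functions map raw sensor readings to degrees of truth in [0, 1], and a min operator (a common fuzzy AND) fuses them. The variable names, breakpoints and threat rule below are illustrative assumptions, not details from the paper.

```python
def tri_membership(x, a, b, c):
    """Triangular membership function: rises from a to a peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzy_and(*degrees):
    """Mamdani-style fuzzy AND: the minimum of the membership degrees."""
    return min(degrees)

# Illustrative fusion: how strongly does a contact qualify as "threatening"?
speed_fast = tri_membership(28.0, a=10.0, b=30.0, c=50.0)   # knots
range_close = tri_membership(4.0, a=0.0, b=2.0, c=10.0)     # nautical miles
threat = fuzzy_and(speed_fast, range_close)
print(round(threat, 2))  # -> 0.75
```

A real tactical decision aid would add rule bases and defuzzification; the point here is only the shape of the membership-and-fusion step.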

  16. Geospatial Web Services in Real Estate Information System

    Science.gov (United States)

    Radulovic, Aleksandra; Sladic, Dubravka; Govedarica, Miro; Popovic, Dragana; Radovic, Jovana

    2017-12-01

Since the data in cadastral records are of great importance for the economic development of the country, they must be well structured and organized. Real estate records on the territory of Serbia faced many problems in previous years. To prevent such problems and to achieve efficient access, sharing and exchange of cadastral data on the principles of interoperability, a domain model for real estate was created according to current standards in the field of spatial data. The resulting profile of the domain model for the Serbian real estate cadastre is based on the current legislation and on the Land Administration Domain Model (LADM) specified in the ISO 19152 standard. On top of such organized data, and for their effective exchange, it is necessary to develop a model of the services that must be provided by the institutions interested in the exchange of cadastral data. This is achieved by introducing a service-oriented architecture into the information system of the real estate cadastre, which ensures the efficiency of the system. It is necessary to develop user services for the download, review and use of real estate data through the web. These services should be provided through e-government to all users who need access to cadastral data (natural and legal persons as well as state institutions). It is also necessary to provide search, view and download of cadastral spatial data by specifying geospatial services. Considering that real estate records contain geometric data for parcels and buildings, it is necessary to establish a set of geospatial services that would provide information and maps for the analysis of spatial data and for forming raster data. Besides the theme Cadastral parcels, the INSPIRE directive specifies several themes that involve data on buildings and land use, for which data can be provided from the real estate cadastre. In this paper, a model of geospatial services in Serbia is defined. A case study of using these services to estimate which household is at risk of

  17. Interlayer catalytic exfoliation realizing scalable production of large-size pristine few-layer graphene

    Science.gov (United States)

    Geng, Xiumei; Guo, Yufen; Li, Dongfang; Li, Weiwei; Zhu, Chao; Wei, Xiangfei; Chen, Mingliang; Gao, Song; Qiu, Shengqiang; Gong, Youpin; Wu, Liqiong; Long, Mingsheng; Sun, Mengtao; Pan, Gebo; Liu, Liwei

    2013-01-01

Mass production of reduced graphene oxide and graphene nanoplatelets has recently been achieved. However, a great challenge still remains in realizing large-quantity and high-quality production of large-size thin few-layer graphene (FLG). Here, we create a novel route to solve the issue by employing one-time-only interlayer catalytic exfoliation (ICE) of salt-intercalated graphite. Typical FLG with a large lateral size of tens of microns and a thickness of less than 2 nm has been obtained by a mild and durative ICE. The high-quality graphene layers preserve intact basal crystal planes owing to avoidance of the degradation reaction during both intercalation and ICE. Furthermore, we reveal that the high-quality FLG ensures remarkable lithium-storage stability (>1,000 cycles) and a large reversible specific capacity (>600 mAh g-1). This simple and scalable technique for acquiring high-quality FLG offers considerable potential for future realistic applications.

  18. Semi-empirical formula for large pore-size estimation from o-Ps annihilation lifetime

    International Nuclear Information System (INIS)

    Nguyen Duc Thanh; Tran Quoc Dung; Luu Anh Tuyen; Khuong Thanh Tuan

    2007-01-01

The o-Ps annihilation rate in large pores was investigated using a semi-classical approach. A semi-empirical formula that simply correlates pore size with o-Ps lifetime is proposed. The calculated results agree well with experiment for pore sizes ranging from a few angstroms to several tens of nanometers. (author)
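For context, the relation being extended here is the classic Tao-Eldrup model, which links the o-Ps pickoff lifetime to the radius of a spherical free-volume hole and holds only for small pores (up to roughly 1 nm); the authors' semi-empirical formula for larger pores is not reproduced in the abstract. A sketch of the classic model, with the standard electron-layer thickness ΔR ≈ 0.166 nm, might look like:

```python
import math

LAMBDA_A = 2.0   # ns^-1, spin-averaged annihilation rate inside the electron layer
DELTA_R = 0.166  # nm, empirical electron-layer thickness

def ops_lifetime_ns(radius_nm):
    """Tao-Eldrup o-Ps lifetime (ns) for a spherical pore of given radius (nm)."""
    r0 = radius_nm + DELTA_R
    x = radius_nm / r0
    rate = LAMBDA_A * (1.0 - x + math.sin(2.0 * math.pi * x) / (2.0 * math.pi))
    return 1.0 / rate

# Lifetime grows monotonically with pore size:
for r in (0.2, 0.4, 0.8):
    print(f"R = {r} nm -> tau = {ops_lifetime_ns(r):.2f} ns")
```

The monotonic rise of the lifetime with pore radius is what makes o-Ps annihilation usable as a pore-size probe.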

  19. Large increase in nest size linked to climate change: an indicator of life history, senescence and condition.

    Science.gov (United States)

    Møller, Anders Pape; Nielsen, Jan Tøttrup

    2015-11-01

    Many animals build extravagant nests that exceed the size required for successful reproduction. Large nests may signal the parenting ability of nest builders suggesting that nests may have a signaling function. In particular, many raptors build very large nests for their body size. We studied nest size in the goshawk Accipiter gentilis, which is a top predator throughout most of the Nearctic. Both males and females build nests, and males provision their females and offspring with food. Nest volume in the goshawk is almost three-fold larger than predicted from their body size. Nest size in the goshawk is highly variable and may reach more than 600 kg for a bird that weighs ca. 1 kg. While 8.5% of nests fell down, smaller nests fell down more often than large nests. There was a hump-shaped relationship between nest volume and female age, with a decline in nest volume late in life, as expected for senescence. Clutch size increased with nest volume. Nest volume increased during 1977-2014 in an accelerating fashion, linked to increasing spring temperature during April, when goshawks build and start reproduction. These findings are consistent with nest size being a reliable signal of parental ability, with large nest size signaling superior parenting ability and senescence, and also indicating climate warming.

  20. Ecosystem Services Provided by Agricultural Land as Modeled by Broad Scale Geospatial Analysis

    Science.gov (United States)

    Kokkinidis, Ioannis

conversion take place. While the quantity of wheat produced through extensification is equal to 4.2 times 2012 production, conversion will lead to large increases in runoff (4.1 to 39.4%) and erosion (6 times). This study advances the state of geospatial tools for quantification of ecosystem services.

  1. Geospatial Associations Between Tobacco Retail Outlets and Current Use of Cigarettes and e-Cigarettes among Youths in Texas.

    Science.gov (United States)

    Pérez, Adriana; Chien, Lung-Chang; Harrell, Melissa B; Pasch, Keryn E; Obinwa, Udoka C; Perry, Cheryl L

    2017-10-01

To identify the geospatial association between the presence of tobacco retail outlets (TRO) around school neighborhoods and current use of cigarettes and e-cigarettes among adolescents in four counties in Texas. Students in grades 6, 8, and 10 were surveyed in their schools in 2014-2015. The schools' addresses were geocoded to determine the presence of at least one TRO within half a mile of the school. Two outcomes were considered: past 30-day use of (a) cigarettes and (b) e-cigarettes. Bayesian structured additive regression models and Kriging methods were used to estimate the geospatial associations between the presence of TRO and use in three counties: Dallas/Tarrant, Harris, and Travis. We observed a geospatial association between the presence of TRO around schools and current use of cigarettes in the eastern area of Dallas County and in the southeastern area of Harris County. A geospatial association between the presence of TRO around schools and current use of e-cigarettes was also observed across Tarrant County and in the northeastern area of Harris County. There were geospatial associations between the presence of TRO around some schools and cigarette/e-cigarette use among students, but this association was not consistent across all the counties. More research is needed to determine why some areas are at higher risk for this association.
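The study's exposure measure, the presence of at least one TRO within half a mile of a geocoded school, reduces to a distance test. A minimal sketch using a plain haversine great-circle distance (with made-up coordinates, not the study's data) might be:

```python
import math

EARTH_RADIUS_MI = 3958.8  # mean Earth radius in miles

def haversine_mi(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))

def has_tro_within(school, outlets, radius_mi=0.5):
    """True if any tobacco retail outlet lies within radius_mi of the school."""
    return any(haversine_mi(*school, *o) <= radius_mi for o in outlets)

# Hypothetical school and outlet locations (degrees lat, lon)
school = (29.7604, -95.3698)
outlets = [(29.7604, -95.3648), (29.8104, -95.3698)]  # roughly 0.3 mi and 3.5 mi away
print(has_tro_within(school, outlets))  # -> True (the first outlet is in range)
```

Production GIS workflows would use projected coordinates and spatial indexes, but the half-mile buffer logic is exactly this comparison.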

  2. Web mapping system for complex processing and visualization of environmental geospatial datasets

    Science.gov (United States)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata information. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on SOA and might be considered complexes of interconnected software tools for working with geospatial data. In this report a complex web mapping system is presented, including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive. The GIS web client comprises three basic tiers:
1. A tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format;
2. A tier of JavaScript objects implementing methods for handling NetCDF metadata, a Task XML object for configuring user calculations and input and output formats, and OGC WMS/WFS cartographical services;
3. A graphical user interface (GUI) tier of JavaScript objects implementing the web application business logic.
The metadata tier consists of a number of JSON objects containing technical information describing the geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc). The middleware tier of JavaScript objects implementing methods for handling geospatial
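The OGC WMS services in the middleware tier are plain HTTP requests with standardized query parameters. As a sketch (the endpoint and layer name are placeholders, not the system's actual services), a WMS 1.3.0 GetMap URL can be assembled with the standard library:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, crs="EPSG:4326",
                   width=800, height=600, fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL using the standard parameter names."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy (axis order per CRS)
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return f"{base_url}?{urlencode(params)}"

# Placeholder endpoint and layer for illustration:
url = wms_getmap_url("https://example.org/wms", "temperature_2m",
                     bbox=(50.0, 60.0, 60.0, 90.0))
print(url)
```

A WFS GetFeature request is assembled the same way, with `REQUEST=GetFeature` and a `TYPENAMES` parameter instead of `LAYERS`.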

  3. Preparing Preservice Teachers to Incorporate Geospatial Technologies in Geography Teaching

    Science.gov (United States)

    Harte, Wendy

    2017-01-01

    This study evaluated the efficacy of geospatial technology (GT) learning experiences in two geography curriculum courses to determine their effectiveness for developing preservice teacher confidence and preparing preservice teachers to incorporate GT in their teaching practices. Surveys were used to collect data from preservice teachers at three…

  4. Geospatial Analysis of Renewable Energy Technical Potential on Tribal Lands

    Energy Technology Data Exchange (ETDEWEB)

    Doris, E.; Lopez, A.; Beckley, D.

    2013-02-01

This technical report uses an established geospatial methodology to estimate the technical potential for renewable energy on tribal lands, for the purpose of allowing Tribes to prioritize the development of renewable energy resources either for community-scale use on tribal land or for revenue-generating electricity sales.

  5. Shared Geospatial Metadata Repository for Ontario University Libraries: Collaborative Approaches

    Science.gov (United States)

    Forward, Erin; Leahey, Amber; Trimble, Leanne

    2015-01-01

    Successfully providing access to special collections of digital geospatial data in academic libraries relies upon complete and accurate metadata. Creating and maintaining metadata using specialized standards is a formidable challenge for libraries. The Ontario Council of University Libraries' Scholars GeoPortal project, which created a shared…

  6. Statistical characteristics and stability index (si) of large-sized landslide dams around the world

    International Nuclear Information System (INIS)

    Iqbal, J.; Dai, F.; Raja, I.A.

    2014-01-01

In the last few decades, landslide dams have received greater attention from researchers, as they have caused losses of property and human lives. Over 261 large-sized landslide dams from different countries of the world, each with a volume greater than 1 × 10^5 m^3, have been reviewed for this study. The data collected show that 58% of the catastrophic landslides were triggered by earthquakes and 21% by rainfall, revealing that earthquakes and rainfall are the two major triggers of large-sized landslide dams. These landslides were most frequent during the last two decades (1990-2010) throughout the world. The mean landslide dam volume of the studied cases was 53.39 x 10 m with a mean dam height of 71.98 m, while the mean lake volume was found to be 156.62 x 10 m. Failure of these large landslide dams poses a severe threat to the property and people living downstream, hence immediate attention is required to deal with this problem. A stability index (SI) has been derived on the basis of 59 large-sized landslide dams (out of the 261 dams) with complete parametric information. (author)
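The abstract does not give the authors' SI formula, so as a stand-in the sketch below uses a related published measure, the dimensionless blockage index of Ermini and Casagli, DBI = log10(Ab·Hd/Vd): a larger catchment (Ab) and taller dam (Hd) relative to the blockage volume (Vd) indicate a less stable dam. The input values are illustrative only, not the paper's data.

```python
import math

def blockage_index(catchment_area, dam_height, dam_volume):
    """Dimensionless blockage index, DBI = log10(Ab * Hd / Vd).

    Higher values indicate a greater tendency toward dam failure
    (large catchment and tall dam relative to the blockage volume).
    Inputs must be in mutually consistent units.
    """
    return math.log10(catchment_area * dam_height / dam_volume)

# Two hypothetical dams differing only in blockage volume (illustrative numbers):
small_blockage = blockage_index(catchment_area=100.0, dam_height=70.0, dam_volume=1.0)
large_blockage = blockage_index(catchment_area=100.0, dam_height=70.0, dam_volume=50.0)
print(small_blockage > large_blockage)  # -> True: smaller dam volume, less stable
```

Only the relative ranking is meaningful here; calibrated stability/instability thresholds depend on the units and dataset used in the original calibration.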

  7. Initial PDS4 Support for the Geospatial Data Abstraction Library (GDAL)

    Science.gov (United States)

    Hare, T. M.; Gaddis, L. R.

    2018-04-01

    We introduce initial support for PDS4 within the Geospatial Data Abstraction Library (GDAL). Both highlights and limitations are presented, as well as a short discussion on methods for supporting a GDAL-based workflow for PDS4 conversions.

  8. A comparison of workplace safety perceptions among financial decision-makers of medium- vs. large-size companies.

    Science.gov (United States)

    Huang, Yueng-Hsiang; Leamon, Tom B; Courtney, Theodore K; Chen, Peter Y; DeArmond, Sarah

    2011-01-01

    This study, through a random national survey in the U.S., explored how corporate financial decision-makers perceive important workplace safety issues as a function of the size of the company for which they worked (medium- vs. large-size companies). Telephone surveys were conducted with 404 U.S. corporate financial decision-makers: 203 from medium-size companies and 201 from large companies. Results showed that the patterns of responding for participants from medium- and large-size companies were somewhat similar. The top-rated safety priorities in resource allocation reported by participants from both groups were overexertion, repetitive motion, and bodily reaction. They believed that there were direct and indirect costs associated with workplace injuries and for every dollar spent improving workplace safety, more than four dollars would be returned. They perceived the top benefits of an effective safety program to be predominately financial in nature - increased productivity and reduced costs - and the safety modification participants mentioned most often was to have more/better safety-focused training. However, more participants from large- than medium-size companies reported that "falling on the same level" was the major cause of workers' compensation loss, which is in line with industry loss data. Participants from large companies were more likely to see their safety programs as better than those of other companies in their industries, and those of medium-size companies were more likely to mention that there were no improvements needed for their companies. Copyright © 2009 Elsevier Ltd. All rights reserved.

  9. Encoding and analyzing aerial imagery using geospatial semantic graphs

    Energy Technology Data Exchange (ETDEWEB)

    Watson, Jean-Paul; Strip, David R.; McLendon, William Clarence,; Parekh, Ojas D.; Diegert, Carl F.; Martin, Shawn Bryan; Rintoul, Mark Daniel

    2014-02-01

    While collection capabilities have yielded an ever-increasing volume of aerial imagery, analytic techniques for identifying patterns in and extracting relevant information from this data have seriously lagged. The vast majority of imagery is never examined, due to a combination of the limited bandwidth of human analysts and limitations of existing analysis tools. In this report, we describe an alternative, novel approach to both encoding and analyzing aerial imagery, using the concept of a geospatial semantic graph. The advantages of our approach are twofold. First, intuitive templates can be easily specified in terms of the domain language in which an analyst converses. These templates can be used to automatically and efficiently search large graph databases, for specific patterns of interest. Second, unsupervised machine learning techniques can be applied to automatically identify patterns in the graph databases, exposing recurring motifs in imagery. We illustrate our approach using real-world data for Anne Arundel County, Maryland, and compare the performance of our approach to that of an expert human analyst.
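The template-search idea described above can be sketched as naive matching of a label chain against a small labeled adjacency structure; the scene graph and labels below are invented for illustration, and a production system would use an indexed graph database rather than brute-force permutation.

```python
from itertools import permutations

# A toy geospatial semantic graph: nodes carry a land-cover label,
# edges encode an "adjacent-to" relation extracted from imagery.
nodes = {1: "building", 2: "parking_lot", 3: "pond", 4: "building", 5: "forest"}
edges = {(1, 2), (2, 3), (4, 5)}

def adjacent(a, b):
    return (a, b) in edges or (b, a) in edges

def find_template(template):
    """Return node tuples whose labels match the template and which form
    a chain of pairwise-adjacent nodes, in order."""
    matches = []
    for combo in permutations(nodes, len(template)):
        if all(nodes[n] == lbl for n, lbl in zip(combo, template)) and \
           all(adjacent(combo[i], combo[i + 1]) for i in range(len(combo) - 1)):
            matches.append(combo)
    return matches

# Analyst template: a building adjacent to a parking lot adjacent to a pond.
print(find_template(["building", "parking_lot", "pond"]))  # -> [(1, 2, 3)]
```

The template is stated entirely in the analyst's domain language (labels and adjacency), which is the property the report highlights.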

  10. BAND STRUCTURE OF NON-STEIOCHIOMETRIC LARGE-SIZED NANOCRYSTALLITES

    Directory of Open Access Journals (Sweden)

    I.V.Kityk

    2004-01-01

A band structure of large-sized (from 20 to 35 nm) non-stoichiometric nanocrystallites (NC) of Si2-xCx (1.04 < x < 1.10) has been investigated using different band energy approaches and a modified Car-Parrinello molecular dynamics structure optimization of the NC interfaces. The non-stoichiometric excess of carbon favors the appearance of a thin, prevailingly carbon-containing layer (with a thickness of about 1 nm) covering the crystallites. As a consequence, one can observe a substantial structural reconstruction of the boundary SiC crystalline layers. The numerical modeling has shown that these NC can be considered as SiC reconstructed crystalline films with a thickness of about 2 nm covering the SiC crystallites. The observed data are considered within different one-electron band structure methods. It was shown that the nano-sized carbon sheet plays a key role in the modified band structure. An independent manifestation of the important role played by the reconstructed confined layers is the experimentally discovered excitonic-like resonances. Low-temperature absorption measurements confirm the existence of sharp absorption resonances originating from the reconstructed layers.

  11. Fatigue-crack propagation in gamma-based titanium aluminide alloys at large and small crack sizes

    International Nuclear Information System (INIS)

    Kruzic, J.J.; Campbell, J.P.; Ritchie, R.O.

    1999-01-01

Most evaluations of the fracture and fatigue-crack propagation properties of γ+α2 titanium aluminide alloys to date have been performed using standard large-crack samples, e.g., compact-tension specimens containing crack sizes on the order of tens of millimeters, i.e., large compared to microstructural dimensions. However, these alloys have been targeted for applications, such as blades in gas-turbine engines, where relevant crack sizes are much smaller. In this work, a comparison is made of the propagation behavior of large (> 5 mm) and small (c ≅ 25-300 μm) cracks in a γ-TiAl based alloy, of composition Ti-47Al-2Nb-2Cr-0.2B (at.%), specifically for duplex (average grain size ≅17 μm) and refined lamellar (average colony size ≅150 μm) microstructures. It is found that, whereas the lamellar microstructure displays far superior fracture toughness and fatigue-crack growth resistance in the presence of large cracks, in small-crack testing the duplex microstructure exhibits a better combination of properties. The reasons for such contrasting behavior are examined in terms of the intrinsic and extrinsic (i.e., crack bridging) contributions to cyclic crack advance
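As general background to the large-versus-small crack comparison (and not the paper's data), large-crack growth is conventionally correlated with the Paris law, da/dN = C(ΔK)^m with ΔK = YΔσ√(πa); the paper's point is precisely that small cracks can defeat such large-crack correlations. With arbitrary illustrative coefficients:

```python
import math

def delta_k(stress_range_mpa, crack_len_m, geometry_factor=1.0):
    """Stress-intensity range, MPa*sqrt(m): dK = Y * dSigma * sqrt(pi * a)."""
    return geometry_factor * stress_range_mpa * math.sqrt(math.pi * crack_len_m)

def paris_growth_rate(dk, c=1e-11, m=3.0):
    """Paris law da/dN = C * (dK)^m, in metres/cycle (C and m are illustrative)."""
    return c * dk ** m

a_small, a_large = 100e-6, 5e-3  # a 100 um "small" crack vs a 5 mm "large" crack
for a in (a_small, a_large):
    dk = delta_k(stress_range_mpa=200.0, crack_len_m=a)
    print(f"a = {a:.0e} m, dK = {dk:.1f} MPa*sqrt(m), da/dN = {paris_growth_rate(dk):.2e} m/cycle")
```

Because ΔK scales with √a, the same stress range gives a far lower nominal driving force for the small crack, which is why small-crack growth at large-crack threshold levels is so significant.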

  12. Development, management and benefit from Internet-based geospatial data sources through knowledge management for GIS-based regional geography applications

    International Nuclear Information System (INIS)

    Thunemann, H.G.

    2009-01-01

The volume of data and information provided on the Internet grows daily. Geoscientific applications, especially those using geographic information systems (GIS), often need changing geospatial data and thus possibly different data sources. Geospatial data should be easily available, and the Internet has become an increasingly important medium for its exchange. The problem of finding appropriate data sources on the Internet, however, remains with the user. The Internet, originally designed as a tool for information exchange, has changed the practice of dealing with knowledge and information in a fundamental and previously unforeseeable manner. Its many individual uses have social consequences concerning the production and distribution of knowledge, and these significantly shape the development of different solutions, including the production, deployment and use of geospatial data, with all their strengths and problems. Various solutions for the provision of geospatial data are available on the Internet, but the targeted searching of these geodata sources remains a shortcoming. Knowledge management, among other approaches, could ease the compilation, storage, connection, popularization and, ultimately, the application of geodata sources on the Internet. Communication, as a central element of knowledge management, should be employed in the form of a communication platform. The present study describes the variety of deployment options for geospatial data and the problems of finding data sources on the Internet. Potential hazards of geospatial data provision (also) via the Internet, as well as an option to manage, update and use such data for various applications on the Internet, are pointed out. (author)

  13. Geospatial Data Repository. Sharing Data Across the Organization and Beyond

    National Research Council Canada - National Science Library

    Ruiz, Marilyn

    2001-01-01

    .... This short Technical Note discusses a five-part approach to creating a data repository that addresses the problems of the historical organizational framework for geospatial data. Fort Hood, Texas was the site used to develop the prototype. A report documenting the complete study will be available in late Spring 2001.

  14. Influence Factors of Sports Bra Evaluation and Design Based on Large Size

    Directory of Open Access Journals (Sweden)

    Zhang Lingxi

    2016-01-01

    Full Text Available The purpose of this paper was to find the main influence factors of sports bra evaluation through subjective assessment of different styles of commercial sports bras, and to summarize the design elements of sports bras for large sizes. Ten women of large size (>C80) were chosen to evaluate 9 different sports bras. The main influence factors were extracted by factor analysis, and all the product samples were classified by Q-cluster analysis. The conclusions show that breast stability, wearing comfort and bust modelling are the three key factors in sports bra evaluation, and a classification-positioning model of sports bra products was established. The findings provide a theoretical basis and guidance for the research and design of sports bras for both academia and sportswear or underwear enterprises, and also offer reference value for women customers.

  15. Data Democracy and Decision Making: Enhancing the Use and Value of Geospatial Data and Scientific Information

    Science.gov (United States)

    Shapiro, C. D.

    2014-12-01

    Data democracy is a concept with great relevance to the use and value of geospatial data and scientific information. It describes a world in which data and information are widely and broadly accessible, understandable, and usable. The concept operationalizes the public good nature of scientific information and provides a framework for increasing the benefits from its use. Data democracy encompasses efforts to increase accessibility to geospatial data and to expand participation in its collection, analysis, and application. These two pillars are analogous to demand and supply relationships. Improved accessibility, or demand, includes increased knowledge about geospatial data and low barriers to retrieval and use. Expanded participation, or supply, encompasses a broader community involved in developing geospatial data and scientific information; this pillar of data democracy is characterized by methods such as citizen science or crowd sourcing. A framework is developed for advancing the use of data democracy. This includes efforts to assess the societal benefits (economic and social) of scientific information. This knowledge is critical to continued monitoring of the effectiveness of data democracy implementation and of its potential impact on the use and value of scientific information. The framework also includes an assessment of opportunities for advancing data democracy on both the supply and demand sides. These opportunities include relatively inexpensive efforts to reduce barriers to use, as well as the identification of situations in which participation in scientific efforts can be expanded to enhance the breadth of involvement, including participation by non-traditional communities. This framework provides an initial perspective on ways to expand the "scientific community" of data users and providers. It also describes a way forward for enhancing the societal benefits from geospatial data and scientific information. As a result, data

  16. Open Data, Jupyter Notebooks and Geospatial Data Standards Combined - Opening up large volumes of marine and climate data to other communities

    Science.gov (United States)

    Clements, O.; Siemen, S.; Wagemann, J.

    2017-12-01

    The EU-funded EarthServer-2 project aims to offer on-demand access to large volumes of environmental data (Earth Observation, Marine, Climate and Planetary data) via the Web Coverage Service (WCS) interface standard defined by the Open Geospatial Consortium. Providing access to data via OGC web services (e.g. WCS and WMS) has the potential to open up services to a wider audience, especially to users outside the respective communities. WCS 2.0 in particular, with its processing extension, the Web Coverage Processing Service (WCPS), is highly beneficial for making large data volumes accessible to non-expert communities. Users do not have to deal with custom community data formats, such as GRIB for the meteorological community, but can directly access the data in a format they are more familiar with, such as NetCDF, JSON or CSV. Data requests can further be integrated directly into custom processing routines, and users are no longer required to download gigabytes of data. WCS supports trim (reduction of data extent) and slice (reduction of data dimension) operations on multi-dimensional data, giving users very flexible on-demand access to the data. WCPS allows the user to craft queries to run on the data using a text-based query language similar to SQL. These queries can be very powerful, e.g. condensing a three-dimensional data cube into its two-dimensional mean; however, the more complex the query, the more processing-intensive it is. As part of the EarthServer-2 project, we developed a Python library that helps users generate complex WCPS queries from Python, a programming language they are more familiar with. The interactive presentation aims to give practical examples of how users can benefit from two specific WCS services from the Marine and Climate communities. Use-cases from the two communities will show different approaches to taking advantage of a Web Coverage (Processing) Service.
The entire content is available with Jupyter Notebooks, as they prove to be a highly beneficial tool
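The kind of query such a Python helper would emit can be sketched as follows. The coverage name "AvgTemperature" and the time axis label "ansi" are hypothetical placeholders, and the helper shown is an illustrative sketch rather than the project library's actual API.

```python
# A minimal sketch of generating a WCPS query with Python, in the spirit of
# the helper library described above (not its actual API). The coverage name
# "AvgTemperature" and the time axis label "ansi" are hypothetical.

def wcps_time_average(coverage, time_axis, t_start, t_end):
    """Build a WCPS query that trims a data cube to a time interval and
    condenses it to its mean, encoded as CSV."""
    return (
        f"for c in ({coverage}) "
        f'return encode(avg(c[{time_axis}("{t_start}":"{t_end}")]), "csv")'
    )

query = wcps_time_average("AvgTemperature", "ansi", "2000-01-01", "2000-12-31")
print(query)
```

Sending this string to a WCPS endpoint would return the average as plain CSV, without any GRIB or NetCDF handling on the client side.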

  17. Preparation and provisional validation of a large size dried spike: Batch SAL-9931

    International Nuclear Information System (INIS)

    Jammet, G.; Zoigner, A.; Doubek, N.; Grabmueller, G.; Bagliano, G.

    1990-05-01

    To determine uranium and plutonium concentration using isotope dilution mass spectrometry, weighed aliquands of a synthetic mixture containing about 2 mg of Pu (with a 239Pu abundance of about 98%) and 40 mg of U (with a 235U enrichment of about 19%) have been prepared and verified by SAL, to be used to spike samples of concentrated spent fuel solutions with a high burn-up and a low 235U enrichment. The advantages of such a Large Size Dried (LSD) Spike have been pointed out elsewhere and proof of its usefulness in the field reported. Certified Reference Materials Pu-NBL-126, natural U-NBS-960 and 93% enriched U-NBL-116 were used to prepare a stock solution containing 1.8 mg/ml of Pu and 37.3 mg/ml of 19.4% enriched U. Before shipment to the Reprocessing Plant, aliquands of the stock solution are dried to give Large Size Dried Spikes which resist the shocks encountered during transportation, so that they can readily be recovered quantitatively at the plant. This paper describes the preparation and validation of a Large Size Dried Spike which is intended to be used as a common spike by the plant operator and the national and IAEA inspectorates. 6 refs, 7 tabs
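The isotope dilution principle the spike serves can be illustrated with the basic IDMS relation. This is a simplified form that omits the isotopic-abundance correction factors used in practice, and the numbers are purely illustrative, not those of batch SAL-9931.

```python
# Simplified isotope dilution mass spectrometry (IDMS) relation: the analyte
# amount in the sample follows from the spike amount and the isotope ratios
# measured in spike, sample, and blend. Abundance corrections are omitted.

def idms_amount(n_spike, r_spike, r_sample, r_mix):
    """Amount of analyte in the sample via n_x = n_s*(r_s - r_m)/(r_m - r_x),
    where r is the tracer/reference isotope ratio in the spike (s),
    the sample (x), and the spiked blend (m)."""
    return n_spike * (r_spike - r_mix) / (r_mix - r_sample)

# illustrative values: spike ratio 100, sample ratio 1, measured blend ratio 10
print(idms_amount(2.0, 100.0, 1.0, 10.0))  # → 20.0
```

The blend ratio r_m is the only quantity that must be measured on the spiked sample, which is why a shock-resistant, quantitatively recoverable dried spike is so convenient in the field.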

  18. The research of the quantitative prediction of the deposits concentrated regions of the large and super-large sized mineral deposits in China

    International Nuclear Information System (INIS)

    Zhao Zhenyu; Wang Shicheng

    2003-01-01

    Using the general theory and method of synthetic-information mineral resources prognosis, locative and quantitative prediction of the large and super-large sized mineral deposits of solid resources at 1:5,000,000 scale was carried out for China. The deposit-concentrated region is the model unit, and the anomaly-concentrated region is the prediction unit. The synthetic-information mineral prognosis is developed on a GIS platform. The technical route and working method for locating large and super-large sized mineral resources, and the basic principles for compiling the attribute tables of the variables and the response variables, are presented. In the prediction of resource quantities, locative and quantitative predictions are processed separately by quantification theory III and by corresponding characteristic analysis, and the two methods are compared. This is very important and helpful for resource prediction in the ten western provinces of China. (authors)

  19. New Sequences with Low Correlation and Large Family Size

    Science.gov (United States)

    Zeng, Fanxin

    In direct-sequence code-division multiple-access (DS-CDMA) communication systems and direct-sequence ultra-wideband (DS-UWB) radios, sequences with low correlation and large family size are important for reducing multiple access interference (MAI) and accepting more active users, respectively. In this paper, a new collection of families of sequences of length p^n - 1, comprising three constructions, is proposed. The maximum number of cyclically distinct families without GMW sequences in each construction is φ(p^n - 1)/n · φ(p^m - 1)/m, where p is a prime number, n is an even number with n = 2m, and the sequences can be binary or polyphase depending on the choice of the parameter p. In Construction I, there are p^n distinct sequences within each family, and the new sequences take at most d+2 nontrivial periodic correlation values {-p^m - 1, -1, p^m - 1, 2p^m - 1, ..., dp^m - 1}. In Construction II, the new sequences have large family size p^{2n} and possibly take nontrivial correlation values in {-p^m - 1, -1, p^m - 1, 2p^m - 1, ..., (3d-4)p^m - 1}. In Construction III, the new sequences possess the largest family size p^{(d-1)n} and have at most 2d correlation levels {-p^m - 1, -1, p^m - 1, 2p^m - 1, ..., (2d-2)p^m - 1}. The three constructions are near-optimal with respect to the Welch bound because the values of their Welch ratios are moderate: WR ≈ d, WR ≈ 3d-4 and WR ≈ 2d-2, respectively. Each family in Constructions I, II and III contains a GMW sequence. In addition, Helleseth sequences and Niho sequences are special cases of Constructions I and III, and their restriction conditions on the integers m and n, p^m ≠ 2 (mod 3) and n ≡ 0 (mod 4), respectively, are removed in our sequences. Our sequences in Construction III also include the sequences with Niho-type decimation 3·2^m - 2. Finally, some open questions are pointed out and an example that illustrates the performance of these sequences is given.
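The periodic correlation values discussed above can be made concrete with a toy computation. The sequence below is an ordinary m-sequence of period 7 in ±1 form, chosen for its well-known two-level autocorrelation, not one of the paper's constructions.

```python
# A toy illustration of periodic correlation, using an m-sequence of period
# 2^3 - 1 = 7 mapped to +1/-1 (the binary case p = 2), rather than one of the
# constructions proposed in the paper.

def periodic_correlation(a, b, shift):
    """Periodic cross-correlation of two equal-length +1/-1 sequences."""
    n = len(a)
    return sum(a[i] * b[(i + shift) % n] for i in range(n))

m_seq = [1, 1, 1, -1, -1, 1, -1]   # binary m-sequence 1110010 mapped to +/-1
# an m-sequence has two-level autocorrelation: 7 at shift 0, -1 elsewhere
values = {periodic_correlation(m_seq, m_seq, s) for s in range(1, 7)}
print(values)  # → {-1}
```

Low off-peak values like these are what keeps multiple access interference small when many shifted or distinct sequences share the channel.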

  20. The Geospatial Data Cloud: An Implementation of Applying Cloud Computing in Geosciences

    Directory of Open Access Journals (Sweden)

    Xuezhi Wang

    2014-11-01

    Full Text Available The rapid growth in the volume of remote sensing data and its increasing computational requirements bring huge challenges for researchers as traditional systems cannot adequately satisfy the huge demand for service. Cloud computing has the advantage of high scalability and reliability, which can provide firm technical support. This paper proposes a highly scalable geospatial cloud platform named the Geospatial Data Cloud, which is constructed based on cloud computing. The architecture of the platform is first introduced, and then two subsystems, the cloud-based data management platform and the cloud-based data processing platform, are described.  ––– This paper was presented at the First Scientific Data Conference on Scientific Research, Big Data, and Data Science, organized by CODATA-China and held in Beijing on 24-25 February, 2014.

  1. Brokered virtual hubs for facilitating access and use of geospatial Open Data

    Science.gov (United States)

    Mazzetti, Paolo; Latre, Miguel; Kamali, Nargess; Brumana, Raffaella; Braumann, Stefan; Nativi, Stefano

    2016-04-01

    Open Data is a major trend in the current information technology scenario and is often publicised as one of the pillars of the information society in the near future. In particular, geospatial Open Data also have huge potential for the Earth Sciences, through the enablement of innovative applications and services integrating heterogeneous information. However, open does not mean usable. As was recognized at the very beginning of the Web revolution, many different degrees of openness exist: from simple sharing in a proprietary format to advanced sharing in standard formats including semantic information. Therefore, to fully unleash the potential of geospatial Open Data, advanced infrastructures are needed to increase the degree of data openness and enhance usability. In October 2014, the ENERGIC OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". The ENERGIC OD Virtual Hubs aim to facilitate the use of geospatial Open Data by lowering and possibly removing the main barriers that hamper geo-information (GI) usage by end-users and application developers. Data and service heterogeneity is recognized as one of the major barriers to Open Data (re-)use: it forces end-users and developers to spend considerable effort in accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer specific challenges and to address specific use-cases. Thus

  2. Infusion of Climate Change and Geospatial Science Concepts into Environmental and Biological Science Curriculum

    Science.gov (United States)

    Balaji Bhaskar, M. S.; Rosenzweig, J.; Shishodia, S.

    2017-12-01

    The objective of our activity is to improve students' understanding and interpretation of geospatial science and climate change concepts and their applications in the field of Environmental and Biological Sciences in the College of Science, Engineering and Technology (COEST) at Texas Southern University (TSU) in Houston, TX. The GIS for Environment, Ecology and Microbiology courses were selected for the curriculum infusion. A total of ten GIS hands-on lab modules, along with two NCAR (National Center for Atmospheric Research) lab modules on climate change, were implemented in the "GIS for Environment" course. GIS and Google Earth labs, along with climate change lectures, were infused into the Microbiology and Ecology courses. Critical thinking and empirical skills of the students were assessed in all the courses. The student learning outcomes of these courses include the ability of students to interpret geospatial maps and to demonstrate knowledge of the basic principles and concepts of GIS (Geographic Information Systems) and climate change. At the end of the courses, students had developed a comprehensive understanding of geospatial data, its applications in understanding climate change, and its interpretation at the local and regional scales across multiple years.

  3. Geospatial Data Availability for Haiti: An Aid in the Development of GIS-Based Natural Resource Assessments for Conservation Planning.

    Science.gov (United States)

    Maya Quinones; William Gould; Carlos D. Rodriguez-Pedraza

    2007-01-01

    This report documents the type and source of geospatial data available for Haiti. It was compiled to serve as a resource for geographic information system (GIS)-based land management and planning. It will be useful for conservation planning, reforestation efforts, and agricultural extension projects. Our study indicates that there is a great deal of geospatial...

  4. Geospatial Modeling of Asthma Population in Relation to Air Pollution

    Science.gov (United States)

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Young, John H.; Luvall, Jeffrey C.; Alhamdan, Mohammad

    2013-01-01

    Current observations indicate that asthma is growing every year in the United States, but the specific reasons for this are not well understood. This study stems from an ongoing research effort to investigate the spatio-temporal behavior of asthma and its relatedness to air pollution. The association between environmental variables such as air quality and asthma-related health issues across the State of Mississippi is investigated using Geographic Information Systems (GIS) tools and applications. Health data concerning asthma, obtained from the Mississippi State Department of Health (MSDH) for the 9-year period 2003-2011, and air pollutant concentration data (PM2.5), collected from USEPA web resources, are analyzed geospatially to establish the impacts of air quality on human health, specifically related to asthma. Disease mapping using geospatial techniques provides valuable insights into the spatial nature, variability, and association of asthma with air pollution. Asthma patient hospitalization data for Mississippi have been analyzed and mapped using quantitative choropleth techniques in ArcGIS, with patients geocoded to their respective ZIP codes. Potential air pollutant sources such as Interstate highways and industries, together with other land use data, have been integrated in a common geospatial platform to understand their adverse contribution to human health. Existing hospitals and emergency clinics are being brought into the analysis to further understand their proximity and ease of access for patient locations. At the current level of analysis and understanding, a spatial concentration of asthma is observed in the populations of ZIP code regions on the Gulf Coast, along the interstates of the south, and in counties of northeast Mississippi. It is also found that asthma is prevalent in most of the urban population. This GIS-based project would be useful for making health risk assessments and for providing information support to administrators and decision makers in establishing satellite clinics in the future.
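The quantitative choropleth step can be sketched in simplified form: hospitalization counts per ZIP code are converted to rates and binned into quantile classes, which a GIS (ArcGIS in the study) would then shade. All data values below are invented, not taken from the MSDH dataset.

```python
# A hedged sketch of quantitative choropleth classification: asthma
# hospitalization counts per ZIP code become rates per 1,000 residents,
# binned into quartile classes for map shading. Values are invented.
from statistics import quantiles

cases_per_zip = {"39201": 120, "39501": 310, "38801": 95, "39530": 240}
pop_per_zip   = {"39201": 20000, "39501": 31000, "38801": 19000, "39530": 16000}

# hospitalization rate per 1,000 residents
rates = {z: 1000 * cases_per_zip[z] / pop_per_zip[z] for z in cases_per_zip}
breaks = quantiles(rates.values(), n=4)   # three quartile class breaks

def classify(rate):
    """Map a rate to a choropleth class 0..3 using the quartile breaks."""
    return sum(rate > b for b in breaks)

classes = {z: classify(r) for z, r in rates.items()}
print(classes)
```

Quartiles are only one classing scheme; equal-interval or natural-breaks classing would change which ZIP codes share a shade, which is why the classing method always belongs in the map legend.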

  5. Crisp Clustering Algorithm for 3D Geospatial Vector Data Quantization

    DEFF Research Database (Denmark)

    Azri, Suhaibah; Anton, François; Ujang, Uznir

    2015-01-01

    In the next few years, 3D data is expected to be an intrinsic part of geospatial data. However, issues on 3D spatial data management are still in the research stage. One of the issues is performance deterioration during 3D data retrieval. Thus, a practical 3D index structure is required for effic...

  6. Geo-Spatial Support for Assessment of Anthropic Impact on Biodiversity

    Directory of Open Access Journals (Sweden)

    Marco Piragnolo

    2014-04-01

    Full Text Available This paper discusses a methodology in which geo-spatial analysis tools are used to quantify the risk derived from anthropic activities on habitats and species. The method has been developed with a focus on simplification and on the quality of the standard procedures established for flora and fauna protected by the European Directives. In this case study, the DPSIR (Drivers, Pressures, State, Impacts, Responses) framework is applied using spatial procedures in a geographical information system (GIS) framework. This approach can be set in a multidimensional space, as the analysis is applied to each threat, pressure and activity, and also to each habitat and species, at the spatial and temporal scale. Threats, pressures and activities, stresses and indicators can be managed by means of a geo-database and analyzed using spatial analysis functions in a tested GIS workflow environment. The method applies a matrix with risk values, and the final product is a geo-spatial representation of impact indicators, which can be used as support for decision-makers at various levels (regional, national and European).
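The risk-matrix step can be sketched in simplified form: each combination of pressure level and habitat sensitivity is looked up in a matrix of risk values, yielding an impact indicator per habitat. The 3x2 matrix, the class labels, and the scores below are invented for illustration, not the paper's actual values.

```python
# A hedged sketch of the risk-matrix lookup behind the impact indicators:
# (pressure level, habitat sensitivity) -> risk score. The levels and the
# scores in this matrix are invented placeholders.

RISK_MATRIX = {
    ("low",    "low"): 1, ("low",    "high"): 2,
    ("medium", "low"): 2, ("medium", "high"): 3,
    ("high",   "low"): 3, ("high",   "high"): 4,
}

def impact_indicator(pressure_level, sensitivity):
    """Risk value for one threat/pressure acting on one habitat."""
    return RISK_MATRIX[(pressure_level, sensitivity)]

# one pressure assessed against two habitats of differing sensitivity
print(impact_indicator("high", "high"), impact_indicator("high", "low"))
```

In the GIS workflow, this lookup would run per polygon, producing the geo-spatial layer of impact indicators that the decision-makers consult.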

  7. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    Science.gov (United States)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where the data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore natural to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not natively supported by the existing big data frameworks. Instead, such file formats are supported by software libraries that are restricted to single-CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method by loading billions of points into a commodity hardware compute cluster, and we discuss the implications for scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
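The map-based ingestion pattern can be sketched as follows. A toy binary format of packed (x, y, z) doubles stands in for a real point cloud format library such as a LAS reader, and the Spark call shown in the comment assumes a `SparkContext` named `sc`; neither is the paper's actual implementation.

```python
# A minimal sketch of map-based point cloud ingestion. The toy format packs
# each point as three little-endian float64 values; a real deployment would
# call a LAS/LAZ reader library inside the same per-file function.
import os
import struct
import tempfile

def read_points(path):
    """Read one point file into a list of (x, y, z) tuples. On a cluster
    this function is shipped to the workers, e.g.:
        rdd = sc.parallelize(file_paths).flatMap(read_points)
    so each node ingests a subset of the files in parallel."""
    points = []
    with open(path, "rb") as f:
        while chunk := f.read(24):          # 3 doubles * 8 bytes per point
            points.append(struct.unpack("<3d", chunk))
    return points

# write a tiny two-point file and read it back
with tempfile.NamedTemporaryFile(suffix=".bin", delete=False) as tmp:
    tmp.write(struct.pack("<3d", 1.0, 2.0, 3.0))
    tmp.write(struct.pack("<3d", 4.0, 5.0, 6.0))
pts = read_points(tmp.name)
os.unlink(tmp.name)
print(pts)  # → [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
```

Because each file is read entirely inside one task, the single-CPU format library never needs to be parallelism-aware; Spark's scheduler provides the parallelism by distributing file paths.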

  8. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    Science.gov (United States)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which is a major limitation in many ways: limited processing power, storage, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards and prototype code, and presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaborative geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multiuser collaboration platform, uses interoperable programming-language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state

  9. Gamification and geospatial health management

    Science.gov (United States)

    Wortley, David

    2014-06-01

    Sensor and Measurement technologies are rapidly developing for many consumer applications which have the potential to make a major impact on business and society. One of the most important areas for building a sustainable future is in health management. This opportunity arises because of the growing popularity of lifestyle monitoring devices such as the Jawbone UP bracelet, Nike Fuelband and Samsung Galaxy GEAR. These devices measure physical activity and calorie consumption and, when visualised on mobile and portable devices, enable users to take more responsibility for their personal health. This presentation looks at how the process of gamification can be applied to develop important geospatial health management applications that could not only improve the health of nations but also significantly address some of the issues in global health such as the ageing society and obesity.

  10. Gamification and geospatial health management

    International Nuclear Information System (INIS)

    Wortley, David

    2014-01-01

    Sensor and Measurement technologies are rapidly developing for many consumer applications which have the potential to make a major impact on business and society. One of the most important areas for building a sustainable future is in health management. This opportunity arises because of the growing popularity of lifestyle monitoring devices such as the Jawbone UP bracelet, Nike Fuelband and Samsung Galaxy GEAR. These devices measure physical activity and calorie consumption and, when visualised on mobile and portable devices, enable users to take more responsibility for their personal health. This presentation looks at how the process of gamification can be applied to develop important geospatial health management applications that could not only improve the health of nations but also significantly address some of the issues in global health such as the ageing society and obesity

  11. Measuring the Interdisciplinary Impact of Using Geospatial Data with Remote Sensing Data

    Science.gov (United States)

    Downs, R. R.; Chen, R. S.; Schumacher, J.

    2017-12-01

    Various disciplines offer benefits to society by contributing to the scientific progress that informs the knowledge and decisions that improve the lives, safety, and conditions of people around the globe. In addition to disciplines within the natural sciences, other disciplines, including those in the social, health, and computer sciences, provide benefits to society by collecting, preparing, and analyzing data in the process of conducting research. Preparing geospatial environmental and socioeconomic data together with remote sensing data from satellite-based instruments for wider use by heterogeneous communities of users increases the potential impact of these data by enabling their use in different application areas and sectors of society. Furthermore, enabling wider use of scientific data can bring to bear resources and expertise that will improve reproducibility, quality, methodological transparency, interoperability, and improved understanding by diverse communities of users. In line with its commitment to open data, the NASA Socioeconomic Data and Applications Center (SEDAC), which focuses on human interactions in the environment, curates and disseminates freely and publicly available geospatial data for use across many disciplines and societal benefit areas. We describe efforts to broaden the use of SEDAC data and to publicly document their impact, assess the interdisciplinary impact of the use of SEDAC data with remote sensing data, and characterize these impacts in terms of their influence across disciplines by analyzing citations of geospatial data with remote sensing data within scientific journals.

  12. Mobile Traffic Alert and Tourist Route Guidance System Design Using Geospatial Data

    Science.gov (United States)

    Bhattacharya, D.; Painho, M.; Mishra, S.; Gupta, A.

    2017-09-01

    The present study describes an integrated system for traffic data collection and alert warning. Geographical-information-based decision making related to traffic destinations and routes is proposed through the design. The system includes a geospatial database holding a profile relating to a user of a mobile device; the processing and understanding of scanned maps and other digital data input leads to route guidance. The system includes a server configured to receive traffic information relating to a route and location information relating to the mobile device. The server is configured to send a traffic alert to the mobile device when the traffic information and the location information indicate that the mobile device is traveling toward traffic congestion. The proposed system has geospatial and mobile data sets pertaining to Bangalore city in India. It is envisaged to be helpful for tourism as a route guidance and alert-relaying information system, notifying tourists of proximity to sites worth seeing in a city they have entered. The system is modular in architecture, and the novelty lies in the integration of different modules carrying different technologies into a complete traffic information system. The generic information processing and delivery system has been tested and found functional and speedy in test geospatial domains. In a restricted prototype model with geo-referenced route data, the required information was delivered correctly over sustained trials to designated cell numbers, with an average time frame of 27.5 seconds (maximum 50, minimum 5 seconds). Testing of traffic geo-data sets is underway.
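The server's alert decision can be sketched in simplified form: send a warning when the device's position and bearing indicate it is heading toward a congestion point within some radius. The flat-earth distance approximation, the thresholds, and the coordinates are illustrative assumptions, not the paper's actual implementation.

```python
# A hedged sketch of the "traveling toward congestion" test: the congestion
# point must lie within radius_km of the device and within cone_deg of its
# current bearing. Uses an equirectangular (flat-earth) approximation.
import math

def heading_toward(lat, lon, bearing_deg, cong_lat, cong_lon,
                   radius_km=5.0, cone_deg=30.0):
    """True if the device should receive a traffic alert."""
    dx = (cong_lon - lon) * 111.32 * math.cos(math.radians(lat))  # km east
    dy = (cong_lat - lat) * 110.57                                # km north
    if math.hypot(dx, dy) > radius_km:
        return False
    target = math.degrees(math.atan2(dx, dy)) % 360   # bearing to congestion
    diff = abs((target - bearing_deg + 180) % 360 - 180)
    return diff <= cone_deg

# a device near Bangalore heading north, congestion about 2 km to the north
print(heading_toward(12.97, 77.59, 0, 12.99, 77.59))  # → True
```

A production system would use a proper geodesic distance and road-network routing rather than a straight-line cone, but the decision structure (proximity test plus direction test) stays the same.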

  13. Genome size variation affects song attractiveness in grasshoppers: evidence for sexual selection against large genomes.

    Science.gov (United States)

    Schielzeth, Holger; Streitner, Corinna; Lampe, Ulrike; Franzke, Alexandra; Reinhold, Klaus

    2014-12-01

    Genome size is largely uncorrelated with organismal complexity and adaptive scenarios. Genetic drift as well as intragenomic conflict have been put forward to explain this observation. We here study the impact of genome size on sexual attractiveness in the bow-winged grasshopper Chorthippus biguttulus. Grasshoppers show particularly large variation in genome size due to the high prevalence of supernumerary chromosomes that are considered (mildly) selfish, as evidenced by non-Mendelian inheritance and by fitness costs when present in high numbers. We ranked male grasshoppers by song characteristics that are known to affect female preferences in this species and scored the genome sizes of attractive and unattractive individuals from the extremes of this distribution. We find that attractive singers have significantly smaller genomes, demonstrating that genome size is reflected in male courtship songs and that females prefer the songs of males with small genomes. Such a genome-size-dependent mate preference effectively selects against selfish genetic elements that tend to increase genome size. The data therefore provide a novel example of how sexual selection can reinforce natural selection and act as an agent in an intragenomic arms race. Furthermore, our findings indicate an underappreciated route by which choosy females could gain indirect benefits. © 2014 The Author(s). Evolution © 2014 The Society for the Study of Evolution.

  14. Collective Sensing: Integrating Geospatial Technologies to Understand Urban Systems—An Overview

    Directory of Open Access Journals (Sweden)

    Geoffrey J. Hay

    2011-08-01

    Full Text Available Cities are complex systems composed of numerous interacting components that evolve over multiple spatio-temporal scales. Consequently, no single data source is sufficient to satisfy the information needs required to map, monitor, model, and ultimately understand and manage our interaction within such urban systems. Remote sensing technology provides a key data source for mapping such environments, but is not sufficient for fully understanding them. In this article we provide a condensed urban perspective of critical geospatial technologies and techniques: (i) Remote Sensing; (ii) Geographic Information Systems; (iii) object-based image analysis; and (iv) sensor webs, and recommend a holistic integration of these technologies within the language of Open Geospatial Consortium (OGC) standards in order to more fully understand urban systems. We then discuss the potential of this integration and conclude that it extends the monitoring and mapping options beyond “hard infrastructure” by addressing “humans as sensors”, mobility and human-environment interactions, and future improvements to quality of life and of social infrastructures.

  15. Geospatial climate monitoring products: Tools for food security assessment

    Science.gov (United States)

    Verdin, James Patrick

    Many of the 250 million people living in the drylands of Sub-Saharan Africa are food insecure---they lack access at all times to enough food for an active and healthy life. Their vulnerability is due in large measure to highly variable climatic conditions and a dependence on rainfed agriculture. Famine, the most extreme food security emergency, is caused by crop failure due to bad weather, conflict, or both. Famine is a slow onset disaster, culminating after two or more bad growing seasons. After the disastrous African famines of the 1970s and 1980s, the U.S. established the Famine Early Warning System (FEWS) to make the observations of climatic and socioeconomic variables needed for early detection of food security emergencies. Two geospatial climate monitoring products, rainfall estimate and vegetation index images derived from satellite data, are operationally used by FEWS analysts. This dissertation describes research to derive new products from them to reduce ambiguity and improve the link between early warning and early response. First, rainfall estimate images were used in a geospatial crop water accounting scheme. The resulting water requirement satisfaction index was used to estimate crop yield, and a correlation of 0.80 with conventional yield reports was obtained for the 1997 maize harvest in Zimbabwe. Thus, the agricultural significance of remotely sensed patterns of precipitation in time and space was made more clear. The second product tested was the expression of a seasonal climate forecast as a series of vegetation index anomaly images. Correlations between sea surface temperature anomalies in the equatorial Pacific and vegetation index anomalies in Southern Africa were established and predictive relationships cross-validated. 
Using model forecast values of Pacific sea surface temperature from the National Oceanic and Atmospheric Administration for January, February, and March, forecast images of vegetation index anomalies were prepared prior to the
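
The water requirement satisfaction index mentioned above is, at heart, a seasonal crop water-balance bookkeeping exercise. The sketch below illustrates that idea only; the operational FEWS formulation uses a fuller soil water balance, and the rainfall and requirement figures here are invented.

```python
# Simplified water requirement satisfaction index (WRSI) sketch.
# Illustrative only: the operational FEWS index uses a fuller soil
# water balance; the inputs below are made-up per-dekad values.

def wrsi(rainfall_by_dekad, requirement_by_dekad, soil_capacity=100.0):
    """Return a 0-100 index of how well rainfall met crop water needs."""
    soil_water = 0.0
    total_deficit = 0.0
    total_requirement = sum(requirement_by_dekad)
    for rain, need in zip(rainfall_by_dekad, requirement_by_dekad):
        available = soil_water + rain
        if available >= need:
            # Surplus is carried over, capped at the soil's holding capacity.
            soil_water = min(available - need, soil_capacity)
        else:
            total_deficit += need - available
            soil_water = 0.0
    return 100.0 * (1.0 - total_deficit / total_requirement)

# A season in which rainfall always covers demand scores 100.
print(wrsi([30, 40, 50], [20, 30, 40]))  # -> 100.0
```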

  16. OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats

    Science.gov (United States)

    Erickson, T. A.; Koziol, B. W.; Rood, R. B.

    2011-12-01

    The goal of the OpenClimateGIS project is to make climate model datasets readily available in modern geospatial formats commonly used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF, GRIB), while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful web service API for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
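
The clip-and-vectorize idea described above can be sketched with nothing more than a point-in-polygon test: keep the grid cells whose centers fall inside a user-supplied polygon and emit them as GeoJSON-like point features. This is an illustration of the concept, not OpenClimateGIS's implementation, which relies on Shapely/PostGIS geometry operations.

```python
# Sketch of clipping a regular grid to a vector geometry: keep grid
# cells whose centers fall inside the polygon and emit them as
# GeoJSON-like point features. (Illustrative only.)

def point_in_polygon(x, y, polygon):
    """Even-odd ray casting; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def clip_grid(lons, lats, values, polygon):
    features = []
    for i, lat in enumerate(lats):
        for j, lon in enumerate(lons):
            if point_in_polygon(lon, lat, polygon):
                features.append({
                    "type": "Feature",
                    "geometry": {"type": "Point", "coordinates": [lon, lat]},
                    "properties": {"value": values[i][j]},
                })
    return features

# 3x3 grid clipped to a unit square around the middle cell center.
lons, lats = [0.0, 1.0, 2.0], [0.0, 1.0, 2.0]
values = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
square = [(0.5, 0.5), (1.5, 0.5), (1.5, 1.5), (0.5, 1.5)]
kept = clip_grid(lons, lats, values, square)
print(len(kept), kept[0]["properties"]["value"])  # -> 1 5
```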

  17. Geospatial data sharing, online spatial analysis and processing of Indian Biodiversity data in Internet GIS domain - A case study for raster based online geo-processing

    Science.gov (United States)

    Karnatak, H.; Pandey, K.; Oberai, K.; Roy, A.; Joshi, D.; Singh, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.

    2014-11-01

    National Biodiversity Characterization at Landscape Level, a project jointly sponsored by the Department of Biotechnology and the Department of Space, was implemented to identify and map the potential biodiversity-rich areas in India. This project has generated spatial information at three levels, viz. satellite-based primary information (vegetation type maps, spatial locations of roads and villages, fire occurrences); geospatially derived or modelled information (disturbance index, fragmentation, biological richness); and geospatially referenced field sample plots. The study provides information on areas of high disturbance and high biological richness, suggesting future management strategies and helping to formulate action plans. The study has generated for the first time a baseline database for India, which will be a valuable input to climate change studies in the Indian subcontinent. The spatial data generated during the study are organized as a central data repository in a geo-RDBMS environment using PostgreSQL and PostGIS. The raster and vector data are published as OGC WMS and WFS standards for the development of a web-based geo-information system using Service-Oriented Architecture (SOA). The WMS- and WFS-based system allows geo-visualization, online queries and map output generation based on user request and response. This is a typical mashup-architecture-based geo-information system which allows access to remote web services such as ISRO Bhuvan, OpenStreetMap, Google Maps, etc., overlaid on the biodiversity data for effective study of bio-resources. Spatial queries and analysis of vector data are achieved through SQL queries on PostGIS and WFS-T operations. But the most important challenge is to develop a system for online raster-based geospatial analysis and processing based on a user-defined Area of Interest (AOI) for large raster datasets. The map data of this study amount to approximately 20 GB for each of the five data layers. An attempt has been made to develop a system using
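
Publishing layers through OGC WMS, as described above, means map clients fetch rendered tiles via standard `GetMap` requests. A minimal sketch of building such a request follows; the endpoint and layer name are invented for illustration.

```python
# Sketch of an OGC WMS 1.3.0 GetMap request URL, as a map client for
# such a portal would issue. The base URL and layer name are made up.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png"):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        # Note: WMS 1.3.0 with EPSG:4326 expects lat/lon axis order.
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/geoserver/wms",
                     "biodiversity:biological_richness",
                     (8.0, 68.0, 37.0, 97.0))
print(url)
```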

  18. Large Time Asymptotics for a Continuous Coagulation-Fragmentation Model with Degenerate Size-Dependent Diffusion

    KAUST Repository

    Desvillettes, Laurent

    2010-01-01

    We study a continuous coagulation-fragmentation model with constant kernels for reacting polymers (see [M. Aizenman and T. Bak, Comm. Math. Phys., 65 (1979), pp. 203-230]). The polymers are set to diffuse within a smooth bounded one-dimensional domain with no-flux boundary conditions. In particular, we consider size-dependent diffusion coefficients, which may degenerate for small and large cluster-sizes. We prove that the entropy-entropy dissipation method applies directly in this inhomogeneous setting. We first show the necessary basic a priori estimates in dimension one, and second we show faster-than-polynomial convergence toward global equilibria for diffusion coefficients which vanish not faster than linearly for large sizes. This extends the previous results of [J.A. Carrillo, L. Desvillettes, and K. Fellner, Comm. Math. Phys., 278 (2008), pp. 433-451], which assumes that the diffusion coefficients are bounded below. © 2009 Society for Industrial and Applied Mathematics.
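
For reference, the Aizenman-Bak model with constant coagulation and fragmentation kernels referred to above takes the following form (sketched here from the cited literature; the paper's notation may differ):

```latex
% c = c(t, x, y): density of polymers of size y at position x, with
% size-dependent diffusion d(y) and no-flux boundary conditions.
\partial_t c - d(y)\,\partial_{xx} c
  = \underbrace{\int_0^y c(y')\,c(y-y')\,\mathrm{d}y'
      - 2\,c(y)\int_0^\infty c(y')\,\mathrm{d}y'}_{\text{coagulation}}
  \; + \; \underbrace{2\int_y^\infty c(y')\,\mathrm{d}y' - y\,c(y)}_{\text{fragmentation}},
\qquad \partial_x c = 0 \ \text{on the boundary.}
```

The degeneracy discussed in the abstract concerns diffusion coefficients d(y) that may vanish as the cluster size y tends to zero or infinity.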

  19. Automating Geospatial Visualizations with Smart Default Renderers for Data Exploration Web Applications

    Science.gov (United States)

    Ekenes, K.

    2017-12-01

    This presentation will outline the process of creating a web application for exploring large amounts of scientific geospatial data using modern automated cartographic techniques. Traditional cartographic methods, including data classification, may inadvertently hide geospatial and statistical patterns in the underlying data. This presentation demonstrates how to use smart web APIs that quickly analyze the data when it loads and suggest the most appropriate visualizations based on the statistics of the data. Since only a few visualization choices suit any given dataset well, and many users never go beyond default values, it is imperative to provide smart default color schemes tailored to the dataset rather than static defaults. Multiple functions for automating visualizations are available in the smart APIs, along with UI elements that allow users to create more than one visualization for a dataset, since there isn't a single best way to visualize a given dataset. Because bivariate and multivariate visualizations are particularly difficult to create effectively, this automated approach takes the guesswork out of the process and provides a number of ways to generate multivariate visualizations for the same variables. This allows the user to choose which visualization is most appropriate for their presentation. The methods used in these APIs and the renderers generated by them are not available elsewhere. The presentation will show how statistics can be used as the basis for automating default visualizations of data along continuous ramps, creating more refined visualizations while revealing the spread and outliers of the data. Adding interactive components to instantaneously alter visualizations allows users to unearth spatial patterns previously unknown among one or more variables. 
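
One way to picture a statistics-driven default, as described above, is to derive continuous color-ramp stops from the data itself (for instance, mean plus or minus one standard deviation) instead of fixed class breaks. This toy sketch is hypothetical and is not the actual Smart Mapping API, whose heuristics are considerably richer.

```python
# Hypothetical "smart default" sketch: derive continuous color ramp
# stops from the dataset's own statistics rather than static breaks.
import statistics

def ramp_stops(values):
    """Five ramp stops: min, mean - sd, mean, mean + sd, max."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    lo, hi = min(values), max(values)
    # Clamp the +/- 1 sd stops to the data range so outliers still map.
    return [lo, max(lo, mean - sd), mean, min(hi, mean + sd), hi]

print(ramp_stops([2, 4, 4, 4, 5, 5, 7, 9]))
```

Stops centered on the mean spread the ramp over where the data actually lies, while the min/max endpoints keep outliers visible at the extremes.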
These applications may focus on a single dataset that is frequently updated, or configurable

  20. RESOURCE SAVING TECHNOLOGICAL PROCESS OF LARGE-SIZE DIE THERMAL TREATMENT

    Directory of Open Access Journals (Sweden)

    L. A. Glazkov

    2009-01-01

    Full Text Available The given paper presents the development of a technological process for hardening large-size parts made of die steel. The proposed process applies a water-air mixture instead of the conventional hardening medium, industrial oil. While developing this new technological process it has been necessary to solve the following problems: reduction of thermal treatment duration, reduction of energy resource consumption (natural gas and mineral oil), elimination of fire danger, and improvement of the ecological efficiency of the process. 

  1. Development of Geospatial Map Based Portal for New Delhi Municipal Council

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Sharma, P. Kumar

    2017-09-01

    The Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide geospatial information on the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs, such as DDA, MCD, DJB, the State Election Department, DMRC etc., for the benefit of all citizens of the GNCTD. This paper describes the development of a Geospatial Map based Portal (GMP) for the New Delhi Municipal Council (NDMC) of NCT of Delhi. The GMP has been developed as a map-based spatial decision support system (SDSS) for planning and development of the NDMC area, and it has inbuilt information-searching tools (identification of locations, nearest utility locations, distance measurement etc.) for the citizens of NCTD. The GMP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with .NET (pronounced dot net) technology. The GMP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMP include Circle, Division and Sub-division boundaries of departments pertaining to the New Delhi Municipal Council; parcels of residential, commercial, and government buildings; basic amenities (police stations, hospitals, schools, banks, ATMs, fire stations etc.); over-ground and underground utility network lines; and road and railway features. The GMP could help achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for development and management of the NDMC area. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  2. DEVELOPMENT OF GEOSPATIAL MAP BASED PORTAL FOR NEW DELHI MUNICIPAL COUNCIL

    Directory of Open Access Journals (Sweden)

    A. Kumar Chandra Gupta

    2017-09-01

    Full Text Available The Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide geospatial information on the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs, such as DDA, MCD, DJB, the State Election Department, DMRC etc., for the benefit of all citizens of the GNCTD. This paper describes the development of a Geospatial Map based Portal (GMP) for the New Delhi Municipal Council (NDMC) of NCT of Delhi. The GMP has been developed as a map-based spatial decision support system (SDSS) for planning and development of the NDMC area, and it has inbuilt information-searching tools (identification of locations, nearest utility locations, distance measurement etc.) for the citizens of NCTD. The GMP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with .NET (pronounced dot net) technology. The GMP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMP include Circle, Division and Sub-division boundaries of departments pertaining to the New Delhi Municipal Council; parcels of residential, commercial, and government buildings; basic amenities (police stations, hospitals, schools, banks, ATMs, fire stations etc.); over-ground and underground utility network lines; and road and railway features. The GMP could help achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for development and management of the NDMC area. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  3. Open cyberGIS software for geospatial research and education in the big data era

    Science.gov (United States)

    Wang, Shaowen; Liu, Yan; Padmanabhan, Anand

    CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new generation of GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies, namely open access, open source, and open integration, to serve the various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.

  4. Open cyberGIS software for geospatial research and education in the big data era

    Directory of Open Access Journals (Sweden)

    Shaowen Wang

    2016-01-01

    Full Text Available CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new generation of GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies, namely open access, open source, and open integration, to serve the various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.

  5. Processing and properties of large-sized ceramic slabs

    Directory of Open Access Journals (Sweden)

    Fossa, L.

    2010-10-01

    Full Text Available Large-sized ceramic slabs – with dimensions up to 360x120 cm² and thickness down to 2 mm – are manufactured through an innovative ceramic process, starting from porcelain stoneware formulations and involving wet ball milling, spray drying, die-less slow-rate pressing, a single stage of fast drying-firing, and finishing (trimming, assembling of ceramic-fiberglass composites). Fired and unfired industrial slabs were selected and characterized from the technological, compositional (XRF, XRD) and microstructural (SEM) viewpoints. Semi-finished products exhibit a remarkable microstructural uniformity and stability in a rather wide window of firing schedules. The phase composition and compact microstructure of fired slabs are very similar to those of porcelain stoneware tiles. The values of water absorption, bulk density, closed porosity and functional performance, as well as mechanical and tribological properties, conform to the top quality range of porcelain stoneware tiles. However, the large size coupled with low thickness bestows on the slab a certain degree of flexibility, which is emphasized in ceramic-fiberglass composites. These outstanding performances make the large-sized slabs suitable for novel applications: building and construction (new floorings laid without dismantling the previous paving, ventilated façades, tunnel coverings, insulating panelling), indoor furniture (table tops, doors), and supports for photovoltaic ceramic panels.

    Large-format slabs, with dimensions up to 360x120 cm and thickness under 2 mm, have been manufactured using innovative production methods, starting from porcelain stoneware compositions and employing wet ball milling, spray drying, slow-rate die-less pressing, single-stage fast drying and firing, and a finishing step that includes bonding fiberglass to the ceramic support and grinding the final piece.

  6. A Novel Divisive Hierarchical Clustering Algorithm for Geospatial Analysis

    Directory of Open Access Journals (Sweden)

    Shaoning Li

    2017-01-01

    Full Text Available In the fields of geographic information systems (GIS) and remote sensing (RS), clustering algorithms have been widely used for image segmentation, pattern recognition, and cartographic generalization. Although clustering analysis plays a key role in geospatial modelling, traditional clustering methods are limited in computational complexity, noise resistance and robustness. Furthermore, traditional methods are more focused on the adjacent spatial context, which makes it hard for these clustering methods to be applied to multi-density discrete objects. In this paper, a new method, cell-dividing hierarchical clustering (CDHC), is proposed based on convex hull retraction. The main steps are as follows. First, a convex hull structure is constructed to describe the global spatial context of the geospatial objects. Then, the retracting structure of each borderline is established in sequence by setting the initial parameter. The objects are split into two clusters (i.e., “sub-clusters”) if the retracting structure intersects with the borderlines. Finally, clusters are repeatedly split and the initial parameter is updated until the termination condition is satisfied. The experimental results show that CDHC separates multi-density objects from noise sufficiently and also reduces complexity compared to the traditional agglomerative hierarchical clustering algorithm.
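
The repeated-splitting control flow of divisive ("top-down") hierarchical clustering can be sketched in a few lines. The toy below splits a one-dimensional cluster at its largest coordinate gap until no gap exceeds a threshold; it illustrates the generic divisive scheme only and does not reproduce CDHC's convex-hull retraction step.

```python
# Simplified divisive hierarchical clustering sketch: recursively split
# a cluster at its widest coordinate gap while that gap exceeds a
# threshold. Illustrates the split-until-terminate control flow only;
# the CDHC algorithm splits along convex-hull retraction structures.

def divisive_split(points, min_gap):
    points = sorted(points)
    gaps = [(points[i + 1] - points[i], i) for i in range(len(points) - 1)]
    if not gaps:
        return [points]  # singleton cluster
    widest, idx = max(gaps)
    if widest < min_gap:
        return [points]  # terminate: cluster is dense enough
    left, right = points[:idx + 1], points[idx + 1:]
    return divisive_split(left, min_gap) + divisive_split(right, min_gap)

clusters = divisive_split([1.0, 1.2, 1.1, 9.0, 9.3, 20.0], min_gap=3.0)
print(clusters)  # -> [[1.0, 1.1, 1.2], [9.0, 9.3], [20.0]]
```

Note how the isolated point 20.0 ends up in its own cluster, which is the behaviour that lets divisive schemes separate multi-density objects from sparse noise.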

  7. National Geospatial Data Asset Lifecycle Baseline Maturity Assessment for the Federal Geographic Data Committee

    Science.gov (United States)

    Peltz-Lewis, L. A.; Blake-Coleman, W.; Johnston, J.; DeLoatch, I. B.

    2014-12-01

    The Federal Geographic Data Committee (FGDC) is designing a portfolio management process for 193 geospatial datasets contained within the 16 topical National Spatial Data Infrastructure themes managed under OMB Circular A-16, "Coordination of Geographic Information and Related Spatial Data Activities." The 193 datasets are designated as National Geospatial Data Assets (NGDAs) because of their significance to the missions of multiple levels of government, partners and stakeholders. As a starting point, the data managers of these NGDAs will conduct a baseline maturity assessment of the dataset(s) for which they are responsible. Maturity is measured against benchmarks related to each of the seven stages of the data lifecycle management framework promulgated within the OMB Circular A-16 Supplemental Guidance issued by OMB in November 2010. This framework was developed by the interagency Lifecycle Management Work Group (LMWG), consisting of 16 Federal agencies, under the Geospatial Line of Business, a 2004 Presidential Initiative, using OMB Circular A-130, "Management of Federal Information Resources," as guidance. The seven lifecycle stages are: Define, Inventory/Evaluate, Obtain, Access, Maintain, Use/Evaluate, and Archive. This paper will focus on the Lifecycle Baseline Maturity Assessment and on efforts to integrate the FGDC approach with other data maturity assessments.

  8. Using a Web GIS Plate Tectonics Simulation to Promote Geospatial Thinking

    Science.gov (United States)

    Bodzin, Alec M.; Anastasio, David; Sharif, Rajhida; Rutzmoser, Scott

    2016-01-01

    Learning with Web-based geographic information system (Web GIS) can promote geospatial thinking and analysis of georeferenced data. Web GIS can enable learners to analyze rich data sets to understand spatial relationships that are managed in georeferenced data visualizations. We developed a Web GIS plate tectonics simulation as a capstone learning…

  9. Implementing a High School Level Geospatial Technologies and Spatial Thinking Course

    Science.gov (United States)

    Nielsen, Curtis P.; Oberle, Alex; Sugumaran, Ramanathan

    2011-01-01

    Understanding geospatial technologies (GSTs) and spatial thinking is increasingly vital to contemporary life including common activities and hobbies; learning in science, mathematics, and social science; and employment within fields as diverse as engineering, health, business, and planning. As such, there is a need for a stand-alone K-12…

  10. SIDELOADING – INGESTION OF LARGE POINT CLOUDS INTO THE APACHE SPARK BIG DATA ENGINE

    Directory of Open Access Journals (Sweden)

    J. Boehm

    2016-06-01

    Full Text Available In the geospatial domain we have now reached the point where the data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore attractive to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not natively supported by existing big data frameworks. Instead, such file formats are supported by software libraries that are restricted to single-CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity-hardware compute cluster and discuss the implications for scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
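
The ingestion work that such a map function distributes comes down to decoding fixed-size binary point records. The pure-Python stand-in below sketches that decoding step with `struct` (real LAS/LAZ records carry more fields than an x, y, z triple, and the Spark wiring is omitted).

```python
# Stand-in for the per-node ingestion step: decode a byte chunk of
# fixed-size binary point records into (x, y, z) tuples. In Spark this
# reader would run inside a map/flatMap over file splits.
import struct

RECORD = struct.Struct("<ddd")  # one point: x, y, z little-endian doubles

def read_points(chunk):
    """Decode a byte chunk holding whole records into (x, y, z) tuples."""
    return [RECORD.unpack_from(chunk, off)
            for off in range(0, len(chunk), RECORD.size)]

# Simulate a file split: pack three points, then decode the chunk.
blob = b"".join(RECORD.pack(float(i), float(i) * 2, 0.5) for i in range(3))
points = read_points(blob)
print(len(points), points[1])  # -> 3 (1.0, 2.0, 0.5)
```

A plausible (hypothetical) Spark wiring would then be something like `sc.binaryFiles(path).flatMap(lambda kv: read_points(kv[1]))`, with each node decoding its own splits in parallel.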

  11. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    Science.gov (United States)

    Dednam, W.; Botha, A. E.

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively long simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular, we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution
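
The fluctuation route described above can be illustrated with an ideal gas: for an open sub-volume v in a large reservoir, a Kirkwood-Buff integral can be estimated as G ≈ v(⟨N²⟩ − ⟨N⟩² − ⟨N⟩)/⟨N⟩², which should come out near zero for uncorrelated particles. The toy below only demonstrates the counting idea; the paper's finite-size-scaling analysis is far more involved.

```python
# Toy illustration of estimating a Kirkwood-Buff integral from particle
# number fluctuations in an open sub-volume. For an ideal gas the counts
# are nearly Poisson, so G should be close to zero (slightly negative
# here because the closed reservoir holds a fixed total N).
import random

random.seed(42)
BOX = 1.0          # reservoir edge length
SUB = 0.4          # sub-volume edge length
N_TOTAL = 800      # ideal-gas particles in the reservoir
SAMPLES = 2000

counts = []
for _ in range(SAMPLES):
    pts = [(random.random(), random.random(), random.random())
           for _ in range(N_TOTAL)]
    counts.append(sum(1 for x, y, z in pts
                      if x < SUB and y < SUB and z < SUB))

mean = sum(counts) / SAMPLES
var = sum((n - mean) ** 2 for n in counts) / SAMPLES
g_over_v = (var - mean) / mean ** 2  # G / v; ~0 for an ideal gas
print(round(mean, 1), abs(g_over_v) < 0.05)
```

With interacting particles the variance would deviate from the mean, and G would pick up the net excess or depletion of neighbours that Kirkwood-Buff theory links to thermodynamic properties.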

  12. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    International Nuclear Information System (INIS)

    Dednam, W; Botha, A E

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively long simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular, we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution

  13. Comprehensive, Mixed-Methods Assessment of a Blended Learning Model for Geospatial Literacy Instruction

    Science.gov (United States)

    Brodeur, J. J.; Maclachlan, J. C.; Bagg, J.; Chiappetta-Swanson, C.; Vine, M. M.; Vajoczki, S.

    2013-12-01

    Geospatial literacy -- the ability to conceptualize, capture, analyze and communicate spatial phenomena -- represents an important competency for 21st Century learners in a period of 'Geospatial Revolution'. Though relevant to in-course learning, these skills are often taught externally, placing time and resource pressures on the service providers, commonly libraries, that are relied upon to provide instruction. The emergence of online and blended modes of instruction has presented a potential means of increasing the cost-effectiveness of such activities, by simultaneously reducing instructional costs, expanding the audience for these resources, and addressing student preferences for asynchronous learning and '24-7' access. During 2011 and 2012, McMaster University Library coordinated the development, implementation and assessment of blended learning modules for geospatial literacy instruction in first-year undergraduate Social Science courses. In this paper, we present the results of a comprehensive mixed-methods approach to assess the efficacy of implementing blended learning modules to replace traditional (face-to-face), library-led, first-year undergraduate geospatial literacy instruction. Focus groups, personal interviews and an online survey were used to assess modules across dimensions of: student use, satisfaction and accessibility requirements (via Universal Instructional Design [UID] principles); instructor and teaching staff perception of pedagogical efficacy and instructional effectiveness; and administrator cost-benefit assessment of development and implementation. Results showed that both instructors and students identified significant value in using the online modules in a blended-learning setting. Reaffirming assumptions of students' '24/7' learning preferences, over 80% of students reported using the modules on a repeat basis. 
Students were more likely to use the modules to better understand course content than simply to increase their grade in

  14. Feasibility study of geospatial mapping of chronic disease risk to inform public health commissioning.

    Science.gov (United States)

    Noble, Douglas; Smith, Dianna; Mathur, Rohini; Robson, John; Greenhalgh, Trisha

    2012-01-01

    To explore the feasibility of producing small-area geospatial maps of chronic disease risk for use by clinical commissioning groups and public health teams. Cross-sectional geospatial analysis using routinely collected general practitioner electronic record data. Tower Hamlets, an inner-city district of London, UK, characterised by high socioeconomic and ethnic diversity and high prevalence of non-communicable diseases. The authors used type 2 diabetes as an example. The data set was drawn from electronic general practice records on all non-diabetic individuals aged 25-79 years in the district (n=163 275). The authors used a validated instrument, QDScore, to calculate 10-year risk of developing type 2 diabetes. Using specialist mapping software (ArcGIS), the authors produced visualisations of how these data varied by lower and middle super output area across the district. The authors enhanced these maps with information on examples of locality-based social determinants of health (population density, fast food outlets and green spaces). Data were piloted as three types of geospatial map (basic, heat and ring). The authors noted practical, technical and information governance challenges involved in producing the maps. Usable data were obtained on 96.2% of all records. One in 11 adults in our cohort was at 'high risk' of developing type 2 diabetes with a 20% or more 10-year risk. Small-area geospatial mapping illustrated 'hot spots' where up to 17.3% of all adults were at high risk of developing type 2 diabetes. Ring maps allowed visualisation of high risk for type 2 diabetes by locality alongside putative social determinants in the same locality. The task of downloading, cleaning and mapping data from electronic general practice records posed some technical challenges, and judgement was required to group data at an appropriate geographical level. Information governance issues were time consuming and required local and national consultation and agreement. Producing
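
Underneath such risk maps sits a simple small-area aggregation: group individual risk scores by area code and report the share at "high risk" (a 10-year risk of 20% or more). A minimal sketch follows; the area codes and scores are invented, and this is not the QDScore instrument itself.

```python
# Minimal sketch of small-area aggregation for a risk map: the share
# of individuals at "high risk" (>= 20% 10-year risk) per area code.
# Area codes and risk scores below are invented for illustration.

def high_risk_share(records, threshold=0.20):
    totals, high = {}, {}
    for area, risk in records:
        totals[area] = totals.get(area, 0) + 1
        if risk >= threshold:
            high[area] = high.get(area, 0) + 1
    return {area: high.get(area, 0) / n for area, n in totals.items()}

records = [("E01", 0.05), ("E01", 0.25), ("E01", 0.22), ("E02", 0.10)]
print(high_risk_share(records))  # E01 ~ 0.67, E02 = 0.0
```

Joining the resulting per-area shares to area boundary geometries is what turns this table into the 'hot spot' choropleth described above.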

  15. Geospatial database of estimates of groundwater discharge to streams in the Upper Colorado River Basin

    Science.gov (United States)

    Garcia, Adriana; Masbruch, Melissa D.; Susong, David D.

    2014-01-01

    The U.S. Geological Survey, as part of the Department of the Interior’s WaterSMART (Sustain and Manage America’s Resources for Tomorrow) initiative, compiled published estimates of groundwater discharge to streams in the Upper Colorado River Basin as a geospatial database. For the purpose of this report, groundwater discharge to streams is the baseflow portion of streamflow that includes contributions of groundwater from various flow paths. Reported estimates of groundwater discharge were assigned as attributes to stream reaches derived from the high-resolution National Hydrography Dataset. A total of 235 estimates of groundwater discharge to streams were compiled and included in the dataset. Feature class attributes of the geospatial database include groundwater discharge (acre-feet per year), method of estimation, citation abbreviation, defined reach, and 8-digit hydrologic unit code(s). Baseflow index (BFI) estimates of groundwater discharge were calculated using an existing streamflow characteristics dataset and were included as an attribute in the geospatial database. A comparison of the BFI estimates to the compiled estimates of groundwater discharge found that the BFI estimates were greater than the reported groundwater discharge estimates.
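A baseflow index of the kind attributed to stream reaches in this database can be sketched with a standard one-parameter digital filter. The snippet below is an illustrative Lyne-Hollick-style separation over a daily hydrograph; it is not the method used to build the USGS dataset, and the parameter value is only the conventional default:

```python
def baseflow_index(q, alpha=0.925):
    """One-parameter Lyne-Hollick digital filter (single forward pass).

    q     : sequence of daily streamflow values
    alpha : filter parameter, commonly 0.925 for daily data
    Returns (baseflow_series, BFI) where BFI = sum(baseflow) / sum(flow).
    """
    qf = 0.0                  # quickflow filter state
    baseflow = []
    prev = q[0]
    for flow in q:
        qf = alpha * qf + (1 + alpha) / 2 * (flow - prev)
        qb = flow - qf if qf > 0 else flow
        # baseflow is constrained between zero and total flow
        baseflow.append(max(0.0, min(qb, flow)))
        prev = flow
    bfi = sum(baseflow) / sum(q)
    return baseflow, bfi
```

With a constant hydrograph the filter assigns all flow to baseflow (BFI = 1); flashier hydrographs yield proportionally lower index values.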

  16. Evaluation of Data Management Systems for Geospatial Big Data

    OpenAIRE

    Amirian, Pouria; Basiri, Anahid; Winstanley, Adam C.

    2014-01-01

Big Data encompasses the collection, management, processing and analysis of huge amounts of data that vary in type and change with high frequency. The data component of Big Data often has a positional component as an important part of it, in various forms such as postal address, Internet Protocol (IP) address and geographical location. If the positional components in Big Data are extensively used in storage, retrieval, analysis, processing, visualization and knowledge discovery (geospatial Big Dat...

  17. MOBILE TRAFFIC ALERT AND TOURIST ROUTE GUIDANCE SYSTEM DESIGN USING GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2017-09-01

Full Text Available The present study describes an integrated system for traffic data collection and alert warning. The design supports geographical-information-based decision making about traffic destinations and routes. The system includes a geospatial database holding a profile for each user of a mobile device; processing and interpretation of scanned maps and other digital data inputs provide route guidance. A server is configured to receive traffic information relating to a route and location information relating to the mobile device, and to send a traffic alert to the device when the traffic information and the location information indicate that the device is traveling toward traffic congestion. The proposed system uses geospatial and mobile data sets pertaining to the city of Bangalore in India. It is envisaged as a route guidance and alert-relaying system for touristic purposes, notifying tourists of proximity to sites worth seeing in a city they have entered. The system is modular in architecture, and the novelty lies in the integration of different modules carrying different technologies into a complete traffic information system. The generic information processing and delivery system has been tested to be functional and speedy under test geospatial domains. In a restricted prototype with geo-referenced route data, the required information was delivered correctly to designated cell numbers over sustained trials, with an average delivery time of 27.5 seconds (minimum 5, maximum 50). Trials on the traffic geo-data set are underway.
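The server-side congestion check described above can be sketched minimally: the alert fires when the device is close to a congestion point and its travel heading points at it. All names, thresholds, and the planar distance approximation below are illustrative assumptions; the paper does not publish its API:

```python
import math

def heading_toward(lat, lon, heading_deg, c_lat, c_lon,
                   max_km=5.0, cone_deg=45.0):
    """Return True when a device at (lat, lon) travelling on heading_deg
    (degrees clockwise from north) is within max_km of a congestion
    point (c_lat, c_lon) and pointed toward it."""
    # equirectangular distance approximation (adequate at city scale)
    kx = 111.32 * math.cos(math.radians(lat))   # km per degree longitude
    dx = (c_lon - lon) * kx
    dy = (c_lat - lat) * 111.32                 # km per degree latitude
    if math.hypot(dx, dy) > max_km:
        return False
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    # smallest angular difference between bearing and heading
    diff = abs((bearing - heading_deg + 180) % 360 - 180)
    return diff <= cone_deg
```

A device heading north toward congestion 2 km ahead would trigger the alert, while the same congestion behind it would not.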

  18. Path Network Recovery Using Remote Sensing Data and Geospatial-Temporal Semantic Graphs

    Energy Technology Data Exchange (ETDEWEB)

    McLendon, William C.,; Brost, Randolph

    2016-05-01

    Remote sensing systems produce large volumes of high-resolution images that are difficult to search. The GeoGraphy (pronounced Geo-Graph-y) framework [2, 20] encodes remote sensing imagery into a geospatial-temporal semantic graph representation to enable high level semantic searches to be performed. Typically scene objects such as buildings and trees tend to be shaped like blocks with few holes, but other shapes generated from path networks tend to have a large number of holes and can span a large geographic region due to their connectedness. For example, we have a dataset covering the city of Philadelphia in which there is a single road network node spanning a 6 mile x 8 mile region. Even a simple question such as "find two houses near the same street" might give unexpected results. More generally, nodes arising from networks of paths (roads, sidewalks, trails, etc.) require additional processing to make them useful for searches in GeoGraphy. We have assigned the term Path Network Recovery to this process. Path Network Recovery is a three-step process involving (1) partitioning the network node into segments, (2) repairing broken path segments interrupted by occlusions or sensor noise, and (3) adding path-aware search semantics into GeoQuestions. This report covers the path network recovery process, how it is used, and some example use cases of the current capabilities.
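Step (2) of the recovery process, repairing path segments interrupted by occlusions or sensor noise, can be illustrated with a toy endpoint-merging routine: segments whose endpoints fall within a tolerance are bridged into one path. This is a sketch of the idea only, not the GeoGraphy implementation:

```python
def repair_gaps(segments, tol=2.0):
    """Merge path segments whose endpoints lie within `tol` units,
    bridging small breaks caused by occlusion or sensor noise.
    Each segment is a list of (x, y) points."""
    def close(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol ** 2

    segs = [list(s) for s in segments]
    merged = True
    while merged:                       # repeat until no pair can merge
        merged = False
        for i in range(len(segs)):
            for j in range(i + 1, len(segs)):
                a, b = segs[i], segs[j]
                if close(a[-1], b[0]):      # tail of a meets head of b
                    segs[i] = a + b
                elif close(a[-1], b[-1]):   # tail meets tail: reverse b
                    segs[i] = a + b[::-1]
                elif close(a[0], b[-1]):    # head meets tail
                    segs[i] = b + a
                elif close(a[0], b[0]):     # head meets head
                    segs[i] = b[::-1] + a
                else:
                    continue
                del segs[j]
                merged = True
                break
            if merged:
                break
    return segs
```

Two collinear road segments separated by a one-unit occlusion gap merge into a single path, while a distant segment is left alone.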

  19. When bigger is not better: selection against large size, high condition and fast growth in juvenile lemon sharks.

    Science.gov (United States)

    Dibattista, J D; Feldheim, K A; Gruber, S H; Hendry, A P

    2007-01-01

    Selection acting on large marine vertebrates may be qualitatively different from that acting on terrestrial or freshwater organisms, but logistical constraints have thus far precluded selection estimates for the former. We overcame these constraints by exhaustively sampling and repeatedly recapturing individuals in six cohorts of juvenile lemon sharks (450 age-0 and 255 age-1 fish) at an enclosed nursery site (Bimini, Bahamas). Data on individual size, condition factor, growth rate and inter-annual survival were used to test the 'bigger is better', 'fatter is better' and 'faster is better' hypotheses of life-history theory. For age-0 sharks, selection on all measured traits was weak, and generally acted against large size and high condition. For age-1 sharks, selection was much stronger, and consistently acted against large size and fast growth. These results suggest that selective pressures at Bimini may be constraining the evolution of large size and fast growth, an observation that fits well with the observed small size and low growth rate of juveniles at this site. Our results support those of some other recent studies in suggesting that bigger/fatter/faster is not always better, and may often be worse.

  20. Resonant atom-field interaction in large-size coupled-cavity arrays

    International Nuclear Information System (INIS)

    Ciccarello, Francesco

    2011-01-01

We consider an array of coupled cavities with staggered intercavity couplings, where each cavity mode interacts with an atom. In contrast to large-size arrays with uniform hopping rates, where the atomic dynamics is known to be frozen in the strong-hopping regime, we show that resonant atom-field dynamics with significant energy exchange can occur in the case of staggered hopping rates even in the thermodynamic limit. This effect arises from the joint emergence of an energy gap in the free photonic dispersion relation and a discrete frequency at the gap's center. The latter corresponds to a bound normal mode stemming solely from the finiteness of the array length. Depending on which cavity is excited, either the atomic dynamics is frozen or a Jaynes-Cummings-like energy exchange is triggered between the bound photonic mode and its atomic analog. As these phenomena are effective with any number of cavities, they should be experimentally observable even in small-size arrays.

  1. Geospatial Data Quality of the Servir CORS Network

    Science.gov (United States)

    Santos, J.; Teodoro, R.; Mira, N.; Mendes, V. B.

    2015-08-01

The SERVIR Continuous Operation Reference Stations (CORS) network was implemented in 2006 to facilitate land surveying with Global Navigation Satellite Systems (GNSS) positioning techniques. Nowadays, the network covers all of mainland Portugal. The SERVIR data are provided to many users, such as surveyors, universities (for education and research purposes) and companies that deal with geographic information. By mid-2012, there was a significant change in the network-access paradigm, the most important aspect being the increased responsibility of managing the network to guarantee permanent availability and the highest quality of the geospatial data. In addition, the software used to manage the network and to compute the differential corrections was replaced by a new software package. These facts were decisive in performing quality control of the SERVIR network and evaluating positional accuracy. To perform this quality control, a significant number of geodetic monuments spread throughout the country were chosen. Some of these monuments are located where the network geometry is worst, in order to evaluate the accuracy of positions in worst-case scenarios. Data were collected using different GNSS positioning modes and compared against benchmark positions determined from data acquired in static mode in 3-hour sessions. We conclude that the geospatial data calculated and provided to the user community by the network are, for surveying purposes, accurate and precise and fit the needs of those users.
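The benchmark comparison described above amounts to computing positional error statistics for the surveyed points. A minimal horizontal-RMSE helper (a generic accuracy measure, not the paper's exact assessment procedure) might look like:

```python
import math

def horizontal_rmse(observed, benchmark):
    """Root-mean-square horizontal error of surveyed points against
    benchmark coordinates, both given as (easting, northing) in metres."""
    sq = [(e - be) ** 2 + (n - bn) ** 2
          for (e, n), (be, bn) in zip(observed, benchmark)]
    return math.sqrt(sum(sq) / len(sq))
```

A single point offset by 3 m east and 4 m north of its benchmark yields an RMSE of exactly 5 m.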

  2. From nanoparticles to large aerosols: Ultrafast measurement methods for size and concentration

    International Nuclear Information System (INIS)

    Keck, Lothar; Spielvogel, Juergen; Grimm, Hans

    2009-01-01

A major challenge in aerosol technology is the fast measurement of number size distributions with good accuracy and size resolution. The dedicated instruments are frequently based on particle charging and electric detection. Established fast systems, however, still feature a number of shortcomings. We have developed a new instrument that consists of a high-flow Differential Mobility Analyser (high-flow DMA) and a high-sensitivity Faraday Cup Electrometer (FCE). The system enables variable flow rates of up to 150 lpm, and the scan time for a size distribution can be shortened considerably owing to the short residence time of the particles in the DMA. Three different electrodes can be employed in order to cover a large size range. First test results demonstrate that the scan time can be reduced to less than 1 s for small particles, and that the results from the fast scans show no significant difference from those of the established slow method. The fields of application for the new instrument comprise the precise monitoring of fast processes involving nanoparticles, including monitoring of engine exhaust in automotive research.

  3. From nanoparticles to large aerosols: Ultrafast measurement methods for size and concentration

    Science.gov (United States)

    Keck, Lothar; Spielvogel, Jürgen; Grimm, Hans

    2009-05-01

A major challenge in aerosol technology is the fast measurement of number size distributions with good accuracy and size resolution. The dedicated instruments are frequently based on particle charging and electric detection. Established fast systems, however, still feature a number of shortcomings. We have developed a new instrument that consists of a high-flow Differential Mobility Analyser (high-flow DMA) and a high-sensitivity Faraday Cup Electrometer (FCE). The system enables variable flow rates of up to 150 lpm, and the scan time for a size distribution can be shortened considerably owing to the short residence time of the particles in the DMA. Three different electrodes can be employed in order to cover a large size range. First test results demonstrate that the scan time can be reduced to less than 1 s for small particles, and that the results from the fast scans show no significant difference from those of the established slow method. The fields of application for the new instrument comprise the precise monitoring of fast processes involving nanoparticles, including monitoring of engine exhaust in automotive research.

  4. Geospatial revolution and remote sensing LiDAR in Mesoamerican archaeology

    Science.gov (United States)

    Chase, Arlen F.; Fisher, Christopher T.; Leisz, Stephen J.; Weishampel, John F.

    2012-01-01

The application of light detection and ranging (LiDAR), a laser-based remote-sensing technology that is capable of penetrating overlying vegetation and forest canopies, is generating a fundamental shift in Mesoamerican archaeology and has the potential to transform research in forested areas world-wide. Much as radiocarbon dating moved archaeology forward half a century ago by grounding archaeological remains in time, LiDAR is proving to be a catalyst for an improved spatial understanding of the past. With LiDAR, ancient societies can be contextualized within a fully defined landscape. Interpretations about the scale and organization of densely forested sites no longer are constrained by sample size, as they were when mapping required laborious on-ground survey. The ability to articulate ancient landscapes fully permits a better understanding of the complexity of ancient Mesoamerican urbanism and also aids in modern conservation efforts. The importance of this geospatial innovation is demonstrated with newly acquired LiDAR data from the archaeological sites of Caracol, Cayo, Belize and Angamuco, Michoacán, Mexico. These data illustrate the potential of technology to act as a catalytic enabler of rapid transformational change in archaeological research and interpretation and also underscore the value of on-the-ground archaeological investigation in validating and contextualizing results. PMID:22802623

  5. Geospatial revolution and remote sensing LiDAR in Mesoamerican archaeology.

    Science.gov (United States)

    Chase, Arlen F; Chase, Diane Z; Fisher, Christopher T; Leisz, Stephen J; Weishampel, John F

    2012-08-07

The application of light detection and ranging (LiDAR), a laser-based remote-sensing technology that is capable of penetrating overlying vegetation and forest canopies, is generating a fundamental shift in Mesoamerican archaeology and has the potential to transform research in forested areas world-wide. Much as radiocarbon dating moved archaeology forward half a century ago by grounding archaeological remains in time, LiDAR is proving to be a catalyst for an improved spatial understanding of the past. With LiDAR, ancient societies can be contextualized within a fully defined landscape. Interpretations about the scale and organization of densely forested sites no longer are constrained by sample size, as they were when mapping required laborious on-ground survey. The ability to articulate ancient landscapes fully permits a better understanding of the complexity of ancient Mesoamerican urbanism and also aids in modern conservation efforts. The importance of this geospatial innovation is demonstrated with newly acquired LiDAR data from the archaeological sites of Caracol, Cayo, Belize and Angamuco, Michoacán, Mexico. These data illustrate the potential of technology to act as a catalytic enabler of rapid transformational change in archaeological research and interpretation and also underscore the value of on-the-ground archaeological investigation in validating and contextualizing results.

  6. SemantGeo: Powering Ecological and Environment Data Discovery and Search with Standards-Based Geospatial Reasoning

    Science.gov (United States)

    Seyed, P.; Ashby, B.; Khan, I.; Patton, E. W.; McGuinness, D. L.

    2013-12-01

Recent efforts to create and leverage standards for geospatial data specification and inference include the GeoSPARQL standard, geospatial OWL ontologies (e.g., GAZ, Geonames), and RDF triple stores that support GeoSPARQL (e.g., AllegroGraph, Parliament) over RDF instance data for geospatial features of interest. However, there remains a gap in how best to fuse software engineering best practices and GeoSPARQL within semantic web applications to enable flexible search driven by geospatial reasoning. In this abstract we introduce the SemantGeo module for the SemantEco framework, which helps fill this gap by enabling scientists to find data using geospatial semantics and reasoning. SemantGeo provides multiple types of geospatial reasoning for SemantEco modules. The server-side implementation uses the Parliament SPARQL endpoint accessed via a Tomcat servlet. SemantGeo uses the Google Maps API for user-specified polygon construction and jsTree for providing containment and categorical hierarchies for search. SemantGeo uses GeoSPARQL for spatial reasoning alone and in concert with RDFS/OWL reasoning capabilities to determine, e.g., which geofeatures are within, partially overlap with, or lie within a certain distance from a given polygon. We also leverage qualitative relationships defined by the Gazetteer ontology that are composites of spatial relationships as well as administrative designations or geophysical phenomena. We provide multiple mechanisms for exploring data, such as polygon (map-based) and named-feature (hierarchy-based) selection, that enable flexible search constraints using boolean combinations of selections. jsTree-based hierarchical search facets present named features and include a 'part of' hierarchy (e.g., measurement-site-01, Lake George, Adirondack Region, NY State) and type hierarchies (e.g., nodes in the hierarchy for WaterBody, Park, MeasurementSite), depending on the 'axis of choice' option selected. Using GeoSPARQL and aforementioned ontology
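A polygon-containment search of the kind described above can be expressed as a GeoSPARQL query. The sketch below composes one in Python using the standard OGC prefixes; the feature schema (`geo:hasGeometry`/`geo:asWKT` over arbitrary features) is illustrative and does not reproduce SemantEco's actual classes:

```python
def within_query(wkt_polygon):
    """Compose a GeoSPARQL query selecting features whose geometry lies
    within a user-drawn polygon, roughly as a polygon (map-based)
    search might issue it to a SPARQL endpoint such as Parliament."""
    return f"""
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
SELECT ?feature WHERE {{
  ?feature geo:hasGeometry ?g .
  ?g geo:asWKT ?wkt .
  FILTER(geof:sfWithin(?wkt,
    "{wkt_polygon}"^^geo:wktLiteral))
}}"""
```

The returned string would be posted to the endpoint; `geof:sfWithin` is the simple-features containment test defined by the GeoSPARQL standard.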

  7. Development of Geospatial Map Based Portal for Delimitation of Mcd Wards

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Sharma, P. Kumar

    2017-09-01

Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide geospatial information on the National Capital Territory of Delhi (NCTD) to the Government of NCT of Delhi (GNCTD) and its organs such as DDA, MCD, DJB, the State Election Department, DMRC etc., for the benefit of all citizens of NCTD. This paper describes the development of the Geospatial Map based Portal for Delimitation of MCD Wards (GMPDW) and the election of the 3 Municipal Corporations of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for delimitation of MCD wards and drawing of peripheral ward boundaries, supporting planning and management of the MCD election process by the State Election Commission, and as an MCD-election information search tool (polling stations, MCD wards, assembly constituencies etc.) for the citizens of NCTD. The GMPDW is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with .NET technology. The GMPDW is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMPDW include Enumeration Block (EB) and Enumeration Block Group (EBG) boundaries, assembly constituencies, parliamentary constituencies, election districts, and landmark locations of polling stations and basic amenities (police stations, hospitals, schools, fire stations etc.). The GMPDW helps achieve the desired transparency and ease in the planning process and provides efficient and effective tools for managing the MCD election. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  8. DEVELOPMENT OF GEOSPATIAL MAP BASED PORTAL FOR DELIMITATION OF MCD WARDS

    Directory of Open Access Journals (Sweden)

    A. Kumar Chandra Gupta

    2017-09-01

Full Text Available Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide geospatial information on the National Capital Territory of Delhi (NCTD) to the Government of NCT of Delhi (GNCTD) and its organs such as DDA, MCD, DJB, the State Election Department, DMRC etc., for the benefit of all citizens of NCTD. This paper describes the development of the Geospatial Map based Portal for Delimitation of MCD Wards (GMPDW) and the election of the 3 Municipal Corporations of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for delimitation of MCD wards and drawing of peripheral ward boundaries, supporting planning and management of the MCD election process by the State Election Commission, and as an MCD-election information search tool (polling stations, MCD wards, assembly constituencies etc.) for the citizens of NCTD. The GMPDW is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with .NET technology. The GMPDW is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMPDW include Enumeration Block (EB) and Enumeration Block Group (EBG) boundaries, assembly constituencies, parliamentary constituencies, election districts, and landmark locations of polling stations and basic amenities (police stations, hospitals, schools, fire stations etc.). The GMPDW helps achieve the desired transparency and ease in the planning process and provides efficient and effective tools for managing the MCD election. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  9. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    International Nuclear Information System (INIS)

    Dasari, K.B.; Acharya, R.

    2014-01-01

Evaluation of the total uncertainty budget on a determined concentration value is important under a quality assurance programme. Concentration calculation in NAA is carried out by relative NAA or by the k0-based internal monostandard NAA (IM-NAA) method. The IM-NAA method has been used for small and large sample analysis of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples has been evaluated and compared. (author)
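Combining independent uncertainty components into a total budget is conventionally done in quadrature. A minimal GUM-style sketch follows, with component names and values that are typical of activation analysis but purely illustrative, not the paper's actual budget:

```python
import math

def combined_uncertainty(components):
    """Combine independent relative standard uncertainties (in %)
    by root-sum-of-squares (quadrature)."""
    return math.sqrt(sum(u ** 2 for u in components.values()))

# illustrative component values, in percent
budget = {
    "counting statistics": 1.5,
    "k0 factors":          1.0,
    "efficiency ratio":    2.0,
    "gamma attenuation":   0.8,
}
u_c = combined_uncertainty(budget)
```

The combined value always exceeds the largest single component but is well below their arithmetic sum.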

  10. A geospatial modelling approach to predict seagrass habitat recovery under multiple stressor regimes

    Science.gov (United States)

    Restoration of estuarine seagrass habitats requires a clear understanding of the modes of action of multiple interacting stressors including nutrients, climate change, coastal land-use change, and habitat modification. We have developed and demonstrated a geospatial modeling a...

  11. Establishing Accurate and Sustainable Geospatial Reference Layers in Developing Countries

    Science.gov (United States)

    Seaman, V. Y.

    2017-12-01

Accurate geospatial reference layers (settlement names & locations, administrative boundaries, and population) are not readily available for most developing countries. This critical information gap makes it challenging for governments to efficiently plan, allocate resources, and provide basic services. It also hampers international agencies' response to natural disasters, humanitarian crises, and other emergencies. The current work involves a recent successful effort, led by the Bill & Melinda Gates Foundation and the Government of Nigeria, to obtain such data. The data collection began in 2013, with local teams collecting names, coordinates, and administrative attributes for over 100,000 settlements using ODK-enabled smartphones. A settlement feature layer extracted from satellite imagery was used to ensure all settlements were included. Administrative boundaries (Ward, LGA) were created using the settlement attributes. These "new" boundary layers were much more accurate than existing shapefiles used by the government and international organizations. The resulting data sets helped Nigeria eradicate polio from all areas except in the extreme northeast, where security issues limited access and vaccination activities. In addition to the settlement and boundary layers, a GIS-based population model was developed, in partnership with Oak Ridge National Laboratory and Flowminder, that used the extracted settlement areas and characteristics, along with targeted microcensus data. This model provides population and demographics estimates independent of census or other administrative data, at a resolution of 90 meters. These robust geospatial data layers found many other uses, including establishing catchment area settlements and populations for health facilities, validating denominators for population-based surveys, and applications across a variety of government sectors.
Based on the success of the Nigeria effort, a partnership between DfID and the Bill & Melinda Gates

  12. Decision Performance Using Spatial Decision Support Systems: A Geospatial Reasoning Ability Perspective

    Science.gov (United States)

    Erskine, Michael A.

    2013-01-01

    As many consumer and business decision makers are utilizing Spatial Decision Support Systems (SDSS), a thorough understanding of how such decisions are made is crucial for the information systems domain. This dissertation presents six chapters encompassing a comprehensive analysis of the impact of geospatial reasoning ability on…

  13. A Collaborative Geospatial Shoreline Inventory Tool to Guide Coastal Development and Habitat Conservation

    Directory of Open Access Journals (Sweden)

    Peter Gies

    2013-05-01

    Full Text Available We are developing a geospatial inventory tool that will guide habitat conservation, restoration and coastal development and benefit several stakeholders who seek mitigation and adaptation strategies to shoreline changes resulting from erosion and sea level rise. The ESRI Geoportal Server, which is a type of web portal used to find and access geospatial information in a central repository, is customized by adding a Geoinventory tool capability that allows any shoreline related data to be searched, displayed and analyzed on a map viewer. Users will be able to select sections of the shoreline and generate statistical reports in the map viewer to allow for comparisons. The tool will also facilitate map-based discussion forums and creation of user groups to encourage citizen participation in decisions regarding shoreline stabilization and restoration, thereby promoting sustainable coastal development.

  14. Development of superconducting poloidal field coils for medium and large size tokamaks

    International Nuclear Information System (INIS)

    Dittrich, H.-G.; Forster, S.; Hofmann, A.

    1983-01-01

Large long-pulse tokamak fusion experiments require the use of superconducting poloidal field (PF) coils. In the past, not much attention has been paid to the development of such coils, so a development programme has recently been initiated at KfK. In this report we start by summarizing the relevant PF coil parameters of some medium and large size tokamaks presently under construction or design. The most important areas of research and development work are deduced from these parameters. Design considerations and first experimental results concerning low-loss conductors, cooling concepts and structural components are given.

  15. A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data

    Science.gov (United States)

    Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.

    2017-12-01

Big Data is becoming the norm in geoscience domains. A platform capable of efficiently managing, accessing, analyzing, mining, and learning from big data to extract new information and knowledge is desired. This paper introduces our latest effort to develop such a platform, building on our past years' experience with cloud and high performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with appropriate network, computer, and storage systems; b) the 2nd layer is a cloud computing layer, based on virtualization, that provides on-demand computing services for the upper layers; c) the 3rd layer consists of big data containers customized for dealing with different types of data and functionalities; d) the 4th layer is a big data presentation layer that supports the efficient management, access, analysis, mining and learning of big geospatial data.

  16. TOWARDS IMPLEMENTATION OF THE FOG COMPUTING CONCEPT INTO THE GEOSPATIAL DATA INFRASTRUCTURES

    Directory of Open Access Journals (Sweden)

    E. A. Panidi

    2016-01-01

Full Text Available Information technologies, and Global Network technologies in particular, are developing very quickly. Accordingly, the problem of incorporating these general-purpose technologies into information systems that operate on geospatial data remains relevant. The paper discusses the implementation feasibility of a number of new approaches and concepts that address the publishing and management of spatial data on the Global Network. A brief review describes some contemporary concepts and technologies used for distributed data storage and management that combine the use of server-side and client-side resources. In particular, the concepts of Cloud Computing, Fog Computing, and the Internet of Things, along with the Java Web Start, WebRTC and WebTorrent technologies, are mentioned. The author's experience is described briefly, covering a number of projects devoted to the development of portable solutions for publishing geospatial data and GIS software on the Global Network.

  17. Generating a geospatial database of U.S. regional feedstock production for use in evaluating the environmental footprint of biofuels.

    Science.gov (United States)

    Holder, Christopher T; Cleland, Joshua C; LeDuc, Stephen D; Andereck, Zac; Hogan, Chris; Martin, Kristen M

    2016-04-01

    The potential environmental effects of increased U.S. biofuel production often vary depending upon the location and type of land used to produce biofuel feedstocks. However, complete, annual data are generally lacking regarding feedstock production by specific location. Corn is the dominant biofuel feedstock in the U.S., so here we present methods for estimating where bioethanol corn feedstock is grown annually and how much is used by U.S. ethanol biorefineries. We use geospatial software and publicly available data to map locations of biorefineries, estimate their corn feedstock requirements, and estimate the feedstock production locations and quantities. We combined these data and estimates into a Bioethanol Feedstock Geospatial Database (BFGD) for years 2005-2010. We evaluated the performance of the methods by assessing how well the feedstock geospatial model matched our estimates of locally-sourced feedstock demand. On average, the model met approximately 89 percent of the total estimated local feedstock demand across the studied years-within approximately 25-to-40 kilometers of the biorefinery in the majority of cases. We anticipate that these methods could be used for other years and feedstocks, and can be subsequently applied to estimate the environmental footprint of feedstock production. Methods used to develop the Bioethanol Feedstock Geospatial Database (BFGD) provide a means of estimating the amount and location of U.S. corn harvested for use as U.S. bioethanol feedstock. Such estimates of geospatial feedstock production may be used to evaluate environmental impacts of bioethanol production and to identify conservation priorities. The BFGD is available for 2005-2010, and the methods may be applied to additional years, locations, and potentially other biofuels and feedstocks.
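Matching biorefineries to nearby feedstock sources, as the methods above do within roughly 25-to-40 km, reduces to a radius query over great-circle distances. The sketch below uses the haversine formula; the function, names, and locations are illustrative assumptions, not the BFGD's actual allocation procedure:

```python
import math

def within_radius(biorefinery, fields, radius_km=40.0):
    """Select feedstock source locations within a sourcing radius of a
    biorefinery. `biorefinery` is (lat, lon) in degrees; `fields` maps
    a source name to its (lat, lon)."""
    lat0, lon0 = map(math.radians, biorefinery)
    selected = []
    for name, (lat, lon) in fields.items():
        la, lo = math.radians(lat), math.radians(lon)
        # haversine great-circle distance on a spherical Earth
        a = (math.sin((la - lat0) / 2) ** 2
             + math.cos(lat0) * math.cos(la)
             * math.sin((lo - lon0) / 2) ** 2)
        d_km = 2 * 6371.0 * math.asin(math.sqrt(a))
        if d_km <= radius_km:
            selected.append(name)
    return selected
```

In a real allocation, each selected source would then be weighted by its harvested corn quantity against the biorefinery's feedstock requirement.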

  18. Geospatial Data as a Service: The GEOGLAM Rangelands and Pasture Productivity Map Experience

    Science.gov (United States)

    Evans, B. J. K.; Antony, J.; Guerschman, J. P.; Larraondo, P. R.; Richards, C. J.

    2017-12-01

    Empowering end-users like pastoralists, land management specialists, and land policy makers to use earth observation data for both day-to-day and seasonal planning requires interactive delivery of multiple geospatial datasets and support for on-the-fly dynamic queries, while simultaneously fostering a community around the effort. The use and wide adoption of large data archives, like those produced by earth observation missions, are often limited by the compute and storage capabilities of the remote user. We demonstrate that wide-scale use of large data archives can be facilitated by end-users dynamically requesting value-added products using open standards (WCS, WMS, WPS), with compute running in the cloud or dedicated data centres and outputs visualized on web front ends. As an example, we will demonstrate how a tool called GSKY can empower a remote end-user by providing the data delivery and analytics capabilities for the GEOGLAM Rangelands and Pasture Productivity (RAPP) Map tool. The GEOGLAM RAPP initiative from the Group on Earth Observations (GEO) and its Agricultural Monitoring subgroup aims at providing practical tools to end-users, focusing on the important role of rangelands and pasture systems in providing food production security from both agricultural crops and animal protein. Figure 1 is a screen capture from the RAPP Map interface for an important pasture area in the Namibian rangelands. The RAPP Map has been in production for six months and has garnered significant interest from groups and users all over the world. GSKY, formulated around the theme of open geospatial data-as-a-service capabilities, uses distributed computing and storage to facilitate this. It works behind the scenes, accepting OGC standard requests in WCS, WMS and WPS. Results from these requests are rendered on a web front end. In this way, the complexities of data locality and compute execution are masked from an end user. On-the-fly computation of
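
    The OGC requests a service like GSKY accepts can be illustrated with a plain WMS 1.3.0 GetMap URL. The endpoint and layer name below are hypothetical; only the query parameters follow the WMS standard:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png", time=None):
    """Assemble a WMS 1.3.0 GetMap request for a single layer.

    bbox is (min_y, min_x, max_y, max_x) in CRS axis order for EPSG:4326.
    """
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "CRS": crs, "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    if time:                       # optional temporal slice, e.g. "2017-06-01"
        params["TIME"] = time
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer, for illustration only:
url = wms_getmap_url("https://example.org/wms", "rapp:pasture",
                     (-30, 110, -10, 155), time="2017-06-01")
```

    A web front end issues such requests and simply renders the returned image, which is how the complexities of data locality and compute are hidden from the user.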

  19. Highly crystallized nanometer-sized zeolite a with large Cs adsorption capability for the decontamination of water.

    Science.gov (United States)

    Torad, Nagy L; Naito, Masanobu; Tatami, Junichi; Endo, Akira; Leo, Sin-Yen; Ishihara, Shinsuke; Wu, Kevin C-W; Wakihara, Toru; Yamauchi, Yusuke

    2014-03-01

    Nanometer-sized zeolite A with a large cesium (Cs) uptake capability is prepared through a simple post-milling recrystallization method. This method is suitable for producing nanometer-sized zeolite on a large scale, as additional organic compounds are not needed to control zeolite nucleation and crystal growth. Herein, we perform a quartz crystal microbalance (QCM) study to evaluate the uptake of Cs ions by zeolite, to the best of our knowledge, for the first time. In comparison to micrometer-sized zeolite A, nanometer-sized zeolite A can rapidly accommodate a larger amount of Cs ions into the zeolite crystal structure, owing to its high external surface area. Nanometer-sized zeolite is a promising candidate for the removal of radioactive Cs ions from polluted water. Our QCM study of Cs adsorption uptake behavior provides information on adsorption kinetics (e.g., adsorption amounts and rates). This technique is applicable to other zeolites, which will be highly valuable for further consideration of radioactive Cs removal in the future. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Evaluating Progression in Students' Relational Thinking While Working on Tasks with Geospatial Technologies

    NARCIS (Netherlands)

    Favier, Tim|info:eu-repo/dai/nl/33811534X; van der Schee, Joop|info:eu-repo/dai/nl/072719575

    2014-01-01

    One of the facets of geographic literacy is the ability to think in a structured way about geographic relationships. Geospatial technologies offer many opportunities to stimulate students’ geographic relational thinking. The question is: How can these opportunities be effectuated? This paper

  1. Applying Geospatial Technologies for International Development and Public Health: The USAID/NASA SERVIR Program

    Science.gov (United States)

    Hemmings, Sarah; Limaye, Ashutosh; Irwin, Dan

    2011-01-01

    Background: SERVIR -- the Regional Visualization and Monitoring System -- helps people use Earth observations and predictive models based on data from orbiting satellites to make timely decisions that benefit society. SERVIR operates through a network of regional hubs in Mesoamerica, East Africa, and the Hindu Kush-Himalayas. USAID and NASA support SERVIR, with the long-term goal of transferring SERVIR capabilities to the host countries. Objective/Purpose: The purpose of this presentation is to describe how the SERVIR system helps the SERVIR regions address the eight areas of societal benefit identified by the Group on Earth Observations (GEO): health, disasters, ecosystems, biodiversity, weather, water, climate, and agriculture. This presentation will describe environmental health applications of data in the SERVIR system, as well as ongoing and future efforts to incorporate additional health applications into the SERVIR system. Methods: This presentation will discuss how the SERVIR Program makes environmental data available for use in environmental health applications. SERVIR accomplishes its mission by providing member nations with access to geospatial data and predictive models, information visualization, training and capacity building, and partnership development. SERVIR conducts needs assessments in partner regions, develops custom applications of Earth observation data, and makes NASA and partner data available through an online geospatial data portal at SERVIRglobal.net. Results: Decision makers use SERVIR to improve their ability to monitor air quality, extreme weather, biodiversity, and changes in land cover. In the past several years, the system has been used over 50 times to respond to environmental threats such as wildfires, floods, landslides, and harmful algal blooms. Given that the SERVIR regions are experiencing increased stress under greater climate variability than historically observed, SERVIR provides information to support the development of

  2. Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies

    Science.gov (United States)

    Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.

    2007-12-01

    The value of real-time hydrologic data dissemination including river stage, streamflow, and precipitation for operational stormwater management efforts is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed due to administrative, political, but mostly technical barriers. Recent efforts on providing unified access to hydrological data have concentrated on creating new SOAP-based web services and common data formats (e.g., WaterML and the Observation Data Model) for users to access the data (e.g., HIS and HydroSeek). Geospatial Web technology, including OGC sensor web enablement (SWE), GeoRSS, geotags, geospatial browsers such as Google Earth and Microsoft Virtual Earth, and other location-based service tools, makes it possible to interact with a digital watershed in near-real-time. OGC SWE proposes a revolutionary concept towards web-connected and controllable sensor networks. However, these efforts have not provided the capability for dynamic data integration/fusion among heterogeneous sources, data filtering, or support for workflows or domain-specific applications where both push and pull modes of retrieving data may be needed. We propose a lightweight integration framework that extends SWE with an open-source Enterprise Service Bus (e.g., Mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We will report our progress on building such a framework, in which multi-agency sensor data and hydro-model outputs (with map layers) will be integrated and disseminated in a geospatial browser (e.g., Microsoft Virtual Earth). 
This is a collaborative project among NCSA, USGS Illinois Water

  3. Study on external reactor vessel cooling capacity for advanced large size PWR

    International Nuclear Information System (INIS)

    Jin Di; Liu Xiaojing; Cheng Xu; Li Fei

    2014-01-01

    External reactor vessel cooling (ERVC) is widely adopted as part of in-vessel retention (IVR) in severe accident management strategies. In this paper, several flow parameters and boundary conditions, e.g., inlet and outlet area, water inlet temperature, heating power of the lower head, the annular gap size at the position of the lower head, and flooding water level, were considered in order to qualitatively study their effect on the natural circulation capacity of external reactor vessel cooling for an advanced large-size PWR using the RELAP5 code. The calculation results provide a basis for the structural design and for analyzing the subsequent transient response behavior of the system. (authors)

  4. Prey size and availability limits maximum size of rainbow trout in a large tailwater: insights from a drift-foraging bioenergetics model

    Science.gov (United States)

    Dodrill, Michael J.; Yackulic, Charles B.; Kennedy, Theodore A.; Haye, John W

    2016-01-01

    The cold and clear water conditions present below many large dams create ideal conditions for the development of economically important salmonid fisheries. Many of these tailwater fisheries have experienced declines in the abundance and condition of large trout species, yet the causes of these declines remain uncertain. Here, we develop, assess, and apply a drift-foraging bioenergetics model to identify the factors limiting rainbow trout (Oncorhynchus mykiss) growth in a large tailwater. We explored the relative importance of temperature, prey quantity, and prey size by constructing scenarios where these variables, both singly and in combination, were altered. Predicted growth matched empirical mass-at-age estimates, particularly for younger ages, demonstrating that the model accurately describes how current temperature and prey conditions interact to determine rainbow trout growth. Modeling scenarios that artificially inflated prey size and abundance demonstrate that rainbow trout growth is limited by the scarcity of large prey items and overall prey availability. For example, shifting 10% of the prey biomass to the 13 mm (large) length class, without increasing overall prey biomass, increased lifetime maximum mass of rainbow trout by 88%. Additionally, warmer temperatures resulted in lower predicted growth at current and lower levels of prey availability; however, growth was similar across all temperatures at higher levels of prey availability. Climate change will likely alter flow and temperature regimes in large rivers with corresponding changes to invertebrate prey resources used by fish. Broader application of drift-foraging bioenergetics models to build a mechanistic understanding of how changes to habitat conditions and prey resources affect growth of salmonids will benefit management of tailwater fisheries.
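
    A drift-foraging bioenergetics model of the kind applied here balances energy intake from drifting prey against metabolic cost. The sketch below is a heavily simplified stand-in with illustrative coefficients (not the paper's calibrated model), but it reproduces the qualitative result that larger or more abundant prey raises growth, while warmer water raises cost:

```python
def simulate_growth(mass_g, prey_per_m3, prey_mass_mg, days=365, temp_c=12.0):
    """Euler-step a minimal drift-foraging energy budget.

    All coefficients are illustrative placeholders, not fitted values.
    """
    capture_area, swim_speed = 0.05, 0.3     # m^2 reactive area; m/s drift velocity
    joules_per_mg, growth_eff = 5.0, 1e-4    # prey energy density; g gained per J surplus
    forage_s = 14 * 3600                     # seconds of foraging per day
    for _ in range(days):
        encounters = prey_per_m3 * capture_area * swim_speed       # prey per second
        intake = encounters * prey_mass_mg * joules_per_mg * forage_s  # J per day
        cost = 2.0 * mass_g ** 0.8 * 1.07 ** (temp_c - 10.0)       # toy respiration, J per day
        mass_g = max(mass_g + growth_eff * (intake - cost), 0.1)
    return mass_g
```

    Running scenarios that vary `prey_mass_mg` while holding total biomass fixed mirrors the paper's experiment of shifting prey into the large (13 mm) length class.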

  5. Mechanical properties of duplex steel welded joints in large-size constructions

    OpenAIRE

    J. Nowacki

    2012-01-01

    Purpose: On the basis of literature sources and our own experiments, an analysis of the mechanical properties, applications, and material and technological problems of ferritic-austenitic steel welding was carried out. The areas of welding applications were shown, particularly welding of large-size structures, based on the example of FCAW welding of UNS S31803 duplex steel in the construction of chemical cargo ships. Design/methodology/approach: Welding tests were carried out for duple...

  6. Mock-up test of remote controlled dismantling apparatus for large-sized vessels (contract research)

    Energy Technology Data Exchange (ETDEWEB)

    Myodo, Masato; Miyajima, Kazutoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Okane, Shogo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment

    2001-03-01

    The remote dismantling apparatus, which is equipped with multiple units for washing, cutting, collection of cut pieces and so on, has been constructed to dismantle the large-sized vessels in JAERI's Reprocessing Test Facility (JRTF). The apparatus has five-axis movement capability and is operated remotely. Mock-up tests were performed to evaluate the applicability of the apparatus to actual dismantling activities by using mock-ups of LV-3 and LV-5 in the facility. It was confirmed that each unit functioned satisfactorily under remote operation. Efficient procedures for dismantling the large-sized vessels were studied and various data were obtained in the mock-up tests. The apparatus was found to be applicable to actual dismantling activity in JRTF. (author)

  7. Mock-up test of remote controlled dismantling apparatus for large-sized vessels (contract research)

    International Nuclear Information System (INIS)

    Myodo, Masato; Miyajima, Kazutoshi; Okane, Shogo

    2001-03-01

    The remote dismantling apparatus, which is equipped with multiple units for washing, cutting, collection of cut pieces and so on, has been constructed to dismantle the large-sized vessels in JAERI's Reprocessing Test Facility (JRTF). The apparatus has five-axis movement capability and is operated remotely. Mock-up tests were performed to evaluate the applicability of the apparatus to actual dismantling activities by using mock-ups of LV-3 and LV-5 in the facility. It was confirmed that each unit functioned satisfactorily under remote operation. Efficient procedures for dismantling the large-sized vessels were studied and various data were obtained in the mock-up tests. The apparatus was found to be applicable to actual dismantling activity in JRTF. (author)

  8. Bridging IMO e-Navigation Policy and Offshore Oil and Gas Operations through Geospatial Standards

    Directory of Open Access Journals (Sweden)

    Filipe Modesto Da Rocha

    2016-04-01

    Full Text Available In offshore industry activities, the suitable onboard provision of asset locations and geospatial marine information during operations is essential. Currently, most companies use their own data structures, resulting in incompatibility between processes. In order to promote data exchange, oil and gas industry associations have pursued initiatives to standardize spatial information. In turn, the IMO - International Maritime Organization - started the implementation of the e-Navigation policy, the standardization of technologies and protocols applied to maritime information and navigation. This paper identifies relationships and integration points between the maritime activities of the oil and gas industry and e-Navigation technologies and processes, highlighting geospatial information. It also outlines an initiative for a suitable product specification for the offshore oil and gas industry, compliant with e-Navigation and the IHO S-100 international standards.

  9. Impact basins on Ganymede and Callisto and implications for the large-projectile size distribution

    Science.gov (United States)

    Wagner, R.; Neukum, G.; Wolf, U.; Greeley, R.; Klemaszewski, J. E.

    2003-04-01

    It has been conjectured that the projectile family which impacted the Galilean satellites of Jupiter was depleted in large projectiles, a conclusion drawn from a "dearth" of large craters (> 60 km) (e.g. [1]). Geologic mapping, aided by spatial filtering of new Galileo as well as older Voyager data, shows, however, that large projectiles have left an imprint of palimpsests and multi-ring structures on both Ganymede and Callisto (e.g. [2]). Most of these impact structures are heavily degraded and hence difficult to recognize. In this paper, we present (1) maps showing the outlines of these basins, and (2) updated crater size-frequency diagrams of the two satellites. The crater diameter was reconstructed from a palimpsest diameter using a formula derived by [3]. The calculation of the crater diameter Dc from the outer boundary Do of a multi-ring structure is much less constrained and on the order of Dc = k · Do, with k ≈ 0.25-0.3 [4]. Despite the uncertainties in locating the "true" crater rims, the resulting shape of the distribution in the range from kilometer-sized craters to sizes of ≈ 500 km is lunar-like and strongly suggests a collisionally evolved projectile family, very likely of asteroidal origin. An alternative explanation for this shape could be that comets are collisionally evolved bodies in a similar way as asteroids, which as of yet is still uncertain and under discussion. Also, the crater size distributions on Ganymede and Callisto are shifted towards smaller crater sizes compared to the Moon, caused by the much lower impact velocity of impactors, which were preferentially in planetocentric orbits [5]. References: [1] Strom et al., JGR 86, 8659-8674, 1981. [2] J. E. Klemaszewski et al., Ann. Geophys. 16, suppl. III, 1998. [3] Iaquinta-Ridolfi & Schenk, LPSC XXVI (abstr.), 651-652, 1995. [4] Schenk & Moore, LPSC XXX, abstr. no. 1786 [CD-ROM], 1999. [5] Horedt & Neukum, JGR 89, 10,405-10,410, 1984.
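
    The rim-diameter reconstruction quoted for multi-ring structures is a one-line scaling; a minimal sketch using the bounds stated in the text (the function name is invented for illustration):

```python
def crater_from_multiring(outer_diameter_km, k=0.275):
    """Estimate the original crater rim diameter Dc from the outer
    boundary Do of a multi-ring structure via Dc = k * Do, with
    k ~ 0.25-0.3 as quoted from [4] in the abstract."""
    if not 0.25 <= k <= 0.3:
        raise ValueError("k outside the 0.25-0.3 range quoted in the text")
    return k * outer_diameter_km
```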

  10. Does Demand Fall When Customers Perceive That Prices Are Unfair? The Case of Premium Pricing for Large Sizes

    OpenAIRE

    Eric T. Anderson; Duncan I. Simester

    2008-01-01

    We analyze a large-scale field test conducted with a mail-order catalog firm to investigate how customers react to premium prices for larger sizes of women's apparel. We find that customers who demand large sizes react unfavorably to paying a higher price than customers for small sizes. Further investigation suggests that these consumers perceive that the price premium is unfair. Overall, premium pricing led to a 6% to 8% decrease in gross profits.

  11. Vertebral Adaptations to Large Body Size in Theropod Dinosaurs.

    Directory of Open Access Journals (Sweden)

    John P Wilson

    Full Text Available Rugose projections on the anterior and posterior aspects of vertebral neural spines appear throughout Amniota and result from the mineralization of the supraspinous and interspinous ligaments via metaplasia, the process of permanent tissue-type transformation. In mammals, this metaplasia is generally pathological or stress induced, but is a normal part of development in some clades of birds. Such structures, though phylogenetically sporadic, appear throughout the fossil record of non-avian theropod dinosaurs, yet their physiological and adaptive significance has remained unexamined. Here we show novel histologic and phylogenetic evidence that neural spine projections were a physiological response to biomechanical stress in large-bodied theropod species. Metaplastic projections also appear to vary between immature and mature individuals of the same species, with immature animals either lacking them or exhibiting smaller projections, supporting the hypothesis that these structures develop through ontogeny as a result of increasing bending stress subjected to the spinal column. Metaplastic mineralization of spinal ligaments would likely affect the flexibility of the spinal column, increasing passive support for body weight. A stiff spinal column would also provide biomechanical support for the primary hip flexors and, therefore, may have played a role in locomotor efficiency and mobility in large-bodied species. This new association of interspinal ligament metaplasia in Theropoda with large body size contributes additional insight to our understanding of the diverse biomechanical coping mechanisms developed throughout Dinosauria, and stresses the significance of phylogenetic methods when testing for biological trends, evolutionary or not.

  12. Vertebral Adaptations to Large Body Size in Theropod Dinosaurs.

    Science.gov (United States)

    Wilson, John P; Woodruff, D Cary; Gardner, Jacob D; Flora, Holley M; Horner, John R; Organ, Chris L

    2016-01-01

    Rugose projections on the anterior and posterior aspects of vertebral neural spines appear throughout Amniota and result from the mineralization of the supraspinous and interspinous ligaments via metaplasia, the process of permanent tissue-type transformation. In mammals, this metaplasia is generally pathological or stress induced, but is a normal part of development in some clades of birds. Such structures, though phylogenetically sporadic, appear throughout the fossil record of non-avian theropod dinosaurs, yet their physiological and adaptive significance has remained unexamined. Here we show novel histologic and phylogenetic evidence that neural spine projections were a physiological response to biomechanical stress in large-bodied theropod species. Metaplastic projections also appear to vary between immature and mature individuals of the same species, with immature animals either lacking them or exhibiting smaller projections, supporting the hypothesis that these structures develop through ontogeny as a result of increasing bending stress subjected to the spinal column. Metaplastic mineralization of spinal ligaments would likely affect the flexibility of the spinal column, increasing passive support for body weight. A stiff spinal column would also provide biomechanical support for the primary hip flexors and, therefore, may have played a role in locomotor efficiency and mobility in large-bodied species. This new association of interspinal ligament metaplasia in Theropoda with large body size contributes additional insight to our understanding of the diverse biomechanical coping mechanisms developed throughout Dinosauria, and stresses the significance of phylogenetic methods when testing for biological trends, evolutionary or not.

  13. Technical trends of large-size photomasks for flat panel displays

    Science.gov (United States)

    Yoshida, Koichiro

    2017-06-01

    Currently, flat panel displays (FPDs) are among the main components of information technology devices. From the 1990's to the 2000's, liquid crystal displays (LCDs) and plasma displays were the mainstream FPDs. In the middle of the 2000's, demand for plasma displays declined and organic light emitting diodes (OLEDs) newly entered the FPD market. Today, the major FPD technologies are LCDs and OLEDs; for mobile devices in particular, the penetration of OLEDs is remarkable. In FPD panel production, photolithography is the key technology, just as in LSI. Photomasks for FPDs are used not only as the original masters of circuit patterns, but also as tools to form other functional structures of FPDs. Photomasks for FPDs are called "large-size photomasks (LSPMs)", since their most remarkable feature is size, which can exceed one meter square and 100 kg. In this report, we discuss three LSPM technical topics in the context of FPD technical transitions and trends. The first topic is the upsizing of LSPMs, the second is the challenge of higher-resolution patterning, and the last is the "multi-tone mask" for "half-tone exposure".

  14. Ultra-large size austenitic stainless steel forgings for fast breeder reactor 'Monju'

    International Nuclear Information System (INIS)

    Tsukada, Hisashi; Suzuki, Komei; Sato, Ikuo; Miura, Ritsu.

    1988-01-01

    The large SUS 304 austenitic stainless steel forgings for the reactor vessel of the prototype FBR 'Monju' of 280 MWe output were successfully manufactured. The reactor vessel contains the reactor core and sodium coolant at 530 deg C; its inside diameter is about 7 m and its height about 18 m. It is composed of 12 large forgings: very thick flanges and shells made by ring forging, and an end plate made by disk forging and hot forming using a special press machine. The manufacture of these large forgings drew on the results of basic tests on material properties in a high-temperature environment and on the effects that manufacturing factors exert on material properties, as well as on the development of manufacturing techniques for super-large forgings. The main problems were the manufacturing techniques for high-purity large ingots of the 250 t class, hot working techniques for stainless steel of fine grain size, forging techniques for super-large rings and disks, and high-precision machining techniques for particularly large-diameter, thin-wall rings. The manufacture of these large stainless steel forgings is reported. (Kako, I.)

  15. New Techniques for Deep Learning with Geospatial Data using TensorFlow, Earth Engine, and Google Cloud Platform

    Science.gov (United States)

    Hancher, M.

    2017-12-01

    Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems like cloud detection, building detection, land cover classification, as well as more novel problems like illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.
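
    One routine preprocessing step when feeding Earth observation rasters to a convolutional model is tiling imagery into fixed-size patches. The helper below is a generic NumPy sketch under that assumption; it is not part of the Earth Engine or TensorFlow APIs described above:

```python
import numpy as np

def extract_patches(raster, patch=64, stride=64):
    """Tile a (H, W, bands) array into an (N, patch, patch, bands) batch
    ready to feed a convolutional model.  Edge rows/columns that do not
    fill a complete patch are dropped."""
    h, w, _bands = raster.shape
    tiles = [raster[r:r + patch, c:c + patch, :]
             for r in range(0, h - patch + 1, stride)
             for c in range(0, w - patch + 1, stride)]
    return np.stack(tiles)
```

    With `stride < patch` the same helper produces overlapping patches, a common choice for dense prediction tasks such as cloud or building detection.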

  16. Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

    Science.gov (United States)

    Singh, R.; Percivall, G.

    2009-12-01

    (note: acronym glossary at end of abstract) For scientists to have confidence in the veracity of data sets and computational processes not under their control, operational transparency must be much greater than previously required. Having a universally understood and machine-readable language for describing such things as the completeness of metadata, data provenance and uncertainty, and the discrete computational steps in a complex process takes on increased importance. OGC has been involved with technological issues associated with climate change since 2005, when we, along with the IEEE Committee on Earth Observation, began a close working relationship with GEO and GEOSS (http://earthobservations.org). GEO/GEOSS provide the technology platform to GCOS, which in turn represents the earth observation community to the UNFCCC. OGC and IEEE are the organizers of the GEO/GEOSS Architecture Implementation Pilot (see http://www.ogcnetwork.net/AIpilot). This continuing work involves working closely with GOOS (Global Ocean Observing System) and WMO (World Meteorological Organization). This session reports on the findings of recent work within the OGC's community of software developers and users to apply geospatial web services to the climate studies domain. The value of this work is to evolve OGC web services, moving from data access and query to geo-processing and workflows. Two projects will be described: the GEOSS AIP-2 and the CCIP. AIP is a task of the GEOSS Architecture and Data Committee. During its duration, two GEO Tasks defined the project: AIP-2 began as GEO Task AR-07-02, to lead the incorporation of contributed components consistent with the GEOSS Architecture using a GEO Web Portal and a Clearinghouse search facility to access services through GEOSS Interoperability Arrangements in support of the GEOSS Societal Benefit Areas. AIP-2 concluded as GEO Task AR-09-01b, to develop and pilot new process and infrastructure components for the GEOSS Common

  17. A hybrid adaptive large neighborhood search algorithm applied to a lot-sizing problem

    DEFF Research Database (Denmark)

    Muller, Laurent Flindt; Spoorendonk, Simon

    This paper presents a hybrid of a general heuristic framework that has been successfully applied to vehicle routing problems and a general-purpose MIP solver. The framework uses local search and an adaptive procedure which chooses among a set of large neighborhoods to be searched. A mixed integer...... of a solution and to investigate the feasibility of elements in such a neighborhood. The hybrid heuristic framework is applied to the multi-item capacitated lot sizing problem with dynamic lot sizes, where experiments have been conducted on a series of instances from the literature. On average the heuristic...
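
    The destroy-and-repair loop with adaptive operator weights that the framework describes can be sketched on a toy single-item uncapacitated lot-sizing instance. This is a simplified illustration only: the paper's problem is multi-item and capacitated, and its repair step uses a MIP solver rather than the greedy rule below.

```python
import random

def cost(periods, demand, setup=100.0, hold=1.0):
    """Setup plus holding cost when production occurs in `periods`
    (which must include period 0) and each run covers demand up to
    the next production period."""
    prods = sorted(periods)
    total = len(prods) * setup
    for i, p in enumerate(prods):
        end = prods[i + 1] if i + 1 < len(prods) else len(demand)
        for t in range(p, end):
            total += hold * (t - p) * demand[t]
    return total

def alns_lot_size(demand, iters=300, seed=1):
    """Adaptive large neighborhood search: pick a destroy operator by
    weight, repair greedily, and reward operators that improve."""
    rng = random.Random(seed)
    T = len(demand)
    best = cur = frozenset(range(T))           # start: produce every period
    weights = [1.0, 1.0]                       # adaptive destroy-operator weights
    for _ in range(iters):
        op = rng.choices([0, 1], weights)[0]
        cand = set(cur)
        if op == 0 and len(cand) > 1:          # destroy: drop a random later period
            cand.discard(rng.choice([p for p in cand if p != 0]))
        elif op == 1 and len(cand) > 1:        # destroy: drop the latest period
            cand.discard(max(p for p in cand if p != 0))
        improved = True                        # repair: re-add periods while it helps
        while improved:
            improved = False
            for p in range(1, T):
                if p not in cand and cost(cand | {p}, demand) < cost(cand, demand):
                    cand.add(p)
                    improved = True
        cand = frozenset(cand)
        if cost(cand, demand) < cost(cur, demand):
            cur = cand
            weights[op] += 0.5                 # reward the operator that helped
        if cost(cur, demand) < cost(best, demand):
            best = cur
    return sorted(best), cost(best, demand)
```

    The weight update is the "adaptive" part: operators that recently produced improving solutions are chosen more often in later iterations.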

  18. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in application of a code for performance assessment and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale, hydrology physical model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated

  19. A NoSQL–SQL Hybrid Organization and Management Approach for Real-Time Geospatial Data: A Case Study of Public Security Video Surveillance

    Directory of Open Access Journals (Sweden)

    Chen Wu

    2017-01-01

    Full Text Available With the widespread deployment of ground, air and space sensor sources (internet of things or IoT, social networks, sensor networks, the integrated applications of real-time geospatial data from ubiquitous sensors, especially in public security and smart city domains, are becoming challenging issues. The traditional geographic information system (GIS mostly manages time-discretized geospatial data by means of the Structured Query Language (SQL database management system (DBMS and emphasizes query and retrieval of massive historical geospatial data on disk. This limits its capability for on-the-fly access of real-time geospatial data for online analysis in real time. This paper proposes a hybrid database organization and management approach with SQL relational databases (RDB and not only SQL (NoSQL databases (including the main memory database, MMDB, and distributed files system, DFS. This hybrid approach makes full use of the advantages of NoSQL and SQL DBMS for the real-time access of input data and structured on-the-fly analysis results which can meet the requirements of increased spatio-temporal big data linking analysis. The MMDB facilitates real-time access of the latest input data such as the sensor web and IoT, and supports the real-time query for online geospatial analysis. The RDB stores change information such as multi-modal features and abnormal events extracted from real-time input data. The DFS on disk manages the massive geospatial data, and the extensible storage architecture and distributed scheduling of a NoSQL database satisfy the performance requirements of incremental storage and multi-user concurrent access. A case study of geographic video (GeoVideo surveillance of public security is presented to prove the feasibility of this hybrid organization and management approach.
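
    The division of labor the abstract describes (a main-memory store for the latest sensor readings, a relational store for extracted change information and abnormal events, and a distributed file system for the massive archive) can be sketched with a dict standing in for the MMDB and SQLite for the RDB. Class, table, and column names are invented for illustration:

```python
import sqlite3
import time

class HybridStore:
    """Toy NoSQL-SQL hybrid: a dict plays the main-memory database
    (latest reading per sensor, for on-the-fly queries) while SQLite
    stands in for the relational store of extracted events."""

    def __init__(self):
        self.mmdb = {}                                   # sensor_id -> (ts, value)
        self.rdb = sqlite3.connect(":memory:")
        self.rdb.execute(
            "CREATE TABLE events (sensor_id TEXT, ts REAL, kind TEXT, value REAL)")

    def ingest(self, sensor_id, value, threshold=50.0):
        ts = time.time()
        self.mmdb[sensor_id] = (ts, value)               # real-time layer
        if value > threshold:                            # extracted "abnormal event"
            self.rdb.execute("INSERT INTO events VALUES (?,?,?,?)",
                             (sensor_id, ts, "over_threshold", value))

    def latest(self, sensor_id):
        return self.mmdb.get(sensor_id)

    def events(self, sensor_id):
        cur = self.rdb.execute(
            "SELECT kind, value FROM events WHERE sensor_id=?", (sensor_id,))
        return cur.fetchall()
```

    In a real deployment the archival tier (the DFS in the paper) would absorb the full observation stream; here it is omitted for brevity.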

  20. Effect of pore size on performance of monolithic tube chromatography of large biomolecules.

    Science.gov (United States)

    Podgornik, Ales; Hamachi, Masataka; Isakari, Yu; Yoshimoto, Noriko; Yamamoto, Shuichi

    2017-11-01

    Effect of pore size on the performance of ion-exchange monolith tube chromatography of large biomolecules was investigated. Radial flow 1 mL polymer-based monolith tubes of different pore sizes (1.5, 2, and 6 μm) were tested with model samples such as 20-mer poly-T DNA, basic proteins, and acidic proteins (molecular weight 14 000-670 000). Pressure drop, pH transient, the number of binding sites, dynamic binding capacity, and peak width were examined. Pressure drop-flow rate curves and dynamic binding capacity values were well correlated with the nominal pore size. While the duration of the pH transient curves depends on the pore size, it was found that the pH duration normalized by the estimated surface area was constant, indicating that the ligand density is the same. This was also confirmed by the number of binding sites being constant and independent of pore size. The peak width values were similar to those for axial flow monolith chromatography. These results showed that it is easy to scale up axial flow monolith chromatography to radial flow monolith tube chromatography by choosing the right pore size in terms of the pressure drop and capacity. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Scrum of scrums solution for large size teams using scrum methodology

    OpenAIRE

    Qurashi, Saja Al; Qureshi, M. Rizwan Jameel

    2014-01-01

    Scrum is a structured framework to support complex product development. However, the Scrum methodology faces the challenge of managing large teams. To address this challenge, in this paper we propose a solution called Scrum of Scrums. In Scrum of Scrums, we divide the Scrum team into teams of the right size, and then organize them hierarchically into a Scrum of Scrums. The main goals of the proposed solution are to optimize communication between teams in the Scrum of Scrums; to make the system work aft...

  2. Introduction to Large-sized Test Facility for validating Containment Integrity under Severe Accidents

    International Nuclear Information System (INIS)

    Na, Young Su; Hong, Seongwan; Hong, Seongho; Min, Beongtae

    2014-01-01

    An overall assessment of containment integrity can be conducted properly by examining the hydrogen behavior in the containment building. Under severe accidents, a large amount of hydrogen gas can be generated by metal oxidation and corium-concrete interaction. Hydrogen behavior in the containment building strongly depends on complicated thermal hydraulic conditions with mixed gases and steam. The performance of a PAR can be directly affected by the thermal hydraulic conditions, steam contents, gas mixture behavior and aerosol characteristics, as well as the operation of other engineered safety systems such as a spray. The models in computer codes for severe accident assessment can be validated based on the experimental results from a large-sized test facility. The Korea Atomic Energy Research Institute (KAERI) is now preparing a large-sized test facility to examine in detail the safety issues related to hydrogen, including the performance of safety devices such as a PAR in various severe accident situations. This paper introduces the KAERI test facility for validating containment integrity under severe accidents. To validate containment integrity, a large-sized test facility is necessary for simulating the complicated phenomena induced by the large amounts of steam and gases, especially hydrogen, released into the containment building under severe accidents. A pressure vessel 9.5 m in height and 3.4 m in diameter was designed for the KAERI test facility, based on the THAI test facility, whose experimental safety and reliable measurement systems have been certified over a long period. This large-sized pressure vessel, operated with steam and iodine as a corrosive agent, was made of stainless steel 316L for its corrosion resistance over long operating times, and the vessel was installed at KAERI in March 2014. In the future, the control systems for temperature and pressure in the vessel will be constructed, and the measurement system

  3. Large Scale Behavior and Droplet Size Distributions in Crude Oil Jets and Plumes

    Science.gov (United States)

    Katz, Joseph; Murphy, David; Morra, David

    2013-11-01

    The 2010 Deepwater Horizon blowout introduced several million barrels of crude oil into the Gulf of Mexico. Injected initially as a turbulent jet containing crude oil and gas, the spill caused formation of a subsurface plume stretching for tens of miles. The behavior of such buoyant multiphase plumes depends on several factors, such as the oil droplet and bubble size distributions, current speed, and ambient stratification. While large droplets quickly rise to the surface, fine ones together with entrained seawater form intrusion layers. Many elements of the physics of droplet formation by an immiscible turbulent jet and their resulting size distribution have not been elucidated, but are known to be significantly influenced by the addition of dispersants, which vary the Weber number by orders of magnitude. We present experimental high speed visualizations of turbulent jets of sweet petroleum crude oil (MC 252) premixed with Corexit 9500A dispersant at various dispersant to oil ratios. Observations were conducted in a 0.9 m × 0.9 m × 2.5 m towing tank, where the large-scale behavior of the jet, both stationary and towed at various speeds to simulate cross-flow, has been recorded at high speed. Preliminary data on oil droplet size and spatial distributions were also measured using a videoscope and pulsed light sheet. Sponsored by Gulf of Mexico Research Initiative (GoMRI).

  4. NOSQL FOR STORAGE AND RETRIEVAL OF LARGE LIDAR DATA COLLECTIONS

    Directory of Open Access Journals (Sweden)

    J. Boehm

    2015-08-01

    Full Text Available Developments in LiDAR technology over the past decades have made LiDAR a mature and widely accepted source of geospatial information. This in turn has led to an enormous growth in data volume. The central idea for a file-centric storage of LiDAR point clouds is the observation that large collections of LiDAR data are typically delivered as large collections of files, rather than single files of terabyte size. This split of the dataset, commonly referred to as tiling, was usually done to accommodate a specific processing pipeline. It therefore makes sense to preserve this split. A document-oriented NoSQL database can easily emulate this data partitioning by representing each tile (file) in a separate document. The document stores the metadata of the tile. The actual files are stored in a distributed file system emulated by the NoSQL database. We demonstrate the use of MongoDB, a highly scalable document-oriented NoSQL database, for storing large LiDAR files. MongoDB, like any NoSQL database, allows for queries on the attributes of the document. As a specialty, MongoDB also allows spatial queries. Hence we can perform spatial queries on the bounding boxes of the LiDAR tiles. Inserting and retrieving files on a cloud-based database is compared to native file system and cloud storage transfer speeds.

  5. Nosql for Storage and Retrieval of Large LIDAR Data Collections

    Science.gov (United States)

    Boehm, J.; Liu, K.

    2015-08-01

    Developments in LiDAR technology over the past decades have made LiDAR a mature and widely accepted source of geospatial information. This in turn has led to an enormous growth in data volume. The central idea for a file-centric storage of LiDAR point clouds is the observation that large collections of LiDAR data are typically delivered as large collections of files, rather than single files of terabyte size. This split of the dataset, commonly referred to as tiling, was usually done to accommodate a specific processing pipeline. It therefore makes sense to preserve this split. A document-oriented NoSQL database can easily emulate this data partitioning by representing each tile (file) in a separate document. The document stores the metadata of the tile. The actual files are stored in a distributed file system emulated by the NoSQL database. We demonstrate the use of MongoDB, a highly scalable document-oriented NoSQL database, for storing large LiDAR files. MongoDB, like any NoSQL database, allows for queries on the attributes of the document. As a specialty, MongoDB also allows spatial queries. Hence we can perform spatial queries on the bounding boxes of the LiDAR tiles. Inserting and retrieving files on a cloud-based database is compared to native file system and cloud storage transfer speeds.
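    The tile-per-document idea reduces spatial retrieval to a bounding-box test over tile metadata, which MongoDB would evaluate against a geospatial index. The sketch below shows the equivalent filter in pure Python over illustrative tile documents; the file and field names are invented, and a real deployment would issue the query through a MongoDB driver instead.

```python
# Each LiDAR tile's metadata is a small document carrying its bounding box.
tiles = [
    {"file": "tile_000.las", "bbox": {"xmin": 0, "ymin": 0, "xmax": 100, "ymax": 100}},
    {"file": "tile_001.las", "bbox": {"xmin": 100, "ymin": 0, "xmax": 200, "ymax": 100}},
    {"file": "tile_002.las", "bbox": {"xmin": 0, "ymin": 100, "xmax": 100, "ymax": 200}},
]

def boxes_intersect(a, b):
    """Axis-aligned bounding-box overlap test."""
    return (a["xmin"] <= b["xmax"] and b["xmin"] <= a["xmax"]
            and a["ymin"] <= b["ymax"] and b["ymin"] <= a["ymax"])

def query_tiles(tiles, region):
    """Return the files of all tiles whose bounding box touches `region`."""
    return [t["file"] for t in tiles if boxes_intersect(t["bbox"], region)]

# Which tiles must be fetched to cover a window straddling the tile corners?
print(query_tiles(tiles, {"xmin": 80, "ymin": 80, "xmax": 130, "ymax": 130}))
# -> ['tile_000.las', 'tile_001.las', 'tile_002.las']
```

    Because only the small metadata documents are scanned, the (much larger) point-cloud files themselves are fetched from the file store only for the tiles the query actually returns.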

  6. Geospatial field applications within United States Department of Agriculture, Veterinary Services.

    Science.gov (United States)

    FitzMaurice, Priscilla L; Freier, Jerome E; Geter, Kenneth D

    2007-01-01

    Epidemiologists, veterinary medical officers and animal health technicians within Veterinary Services (VS) are actively utilising global positioning system (GPS) technology to obtain positional data on livestock and poultry operations throughout the United States. Geospatial data, if acquired for monitoring and surveillance purposes, are stored within the VS Generic Database (GDB). If the information is collected in response to an animal disease outbreak, the data are entered into the Emergency Management Response System (EMRS). The Spatial Epidemiology group within the Centers for Epidemiology and Animal Health (CEAH) has established minimum data accuracy standards for geodata acquisition. To ensure that field-collected geographic coordinates meet these minimum standards, field personnel are trained in proper data collection procedures. Positional accuracy is validated with digital atlases, aerial photographs, Web-based parcel maps, or address geocoding. Several geospatial methods and technologies are under investigation for future use within VS. These include the direct transfer of coordinates from GPS receivers to computers, GPS-enabled digital cameras, tablet PCs, and GPS receivers preloaded with custom ArcGIS maps - all with the objective of reducing transcription and data entry errors and improving the ease of data collection in the field.

  7. Newspaper archives + text mining = rich sources of historical geo-spatial data

    Science.gov (United States)

    Yzaguirre, A.; Smit, M.; Warren, R.

    2016-04-01

    Newspaper archives are rich sources of cultural, social, and historical information. These archives, even when digitized, are typically unstructured and organized by date rather than by subject or location, and require substantial manual effort to analyze. The effort of journalists to be accurate and precise means that there is often rich geo-spatial data embedded in the text, alongside text describing events that editors considered to be of sufficient importance to the region or the world to merit column inches. A regional newspaper can add over 100,000 articles to its database each year, and extracting information from this data for even a single country would pose a substantial Big Data challenge. In this paper, we describe a pilot study on the construction of a database of historical flood events (location(s), date, cause, magnitude) to be used in flood assessment projects, for example to calibrate models, estimate frequency, establish high water marks, or plan for future events in contexts ranging from urban planning to climate change adaptation. We then present a vision for extracting and using the rich geospatial data available in unstructured text archives, and suggest future avenues of research.

  8. Large-sized and highly radioactive 3H and 109Cd Langmuir-Blodgett films

    International Nuclear Information System (INIS)

    Shibata, S.; Kawakami, H.; Kato, S.

    1994-02-01

    A device for the deposition of radioactive Langmuir-Blodgett (LB) films was developed with the use of: (1) a modified horizontal lifting method, (2) an extremely shallow trough, and (3) a surface pressure-generating system without piston oil. It allowed a precious radioactive subphase solution to be used repeatedly while keeping its radioactivity concentration as high as possible. Thin films of any large size can be prepared by simply changing the trough size. Two monomolecular layers of Y-type films of cadmium [3H]icosanoate and 109Cd icosanoate were built up as 3H and 109Cd β-sources for electron spectroscopy, with intensities of 1.5 GBq (40 mCi) and 7.4 MBq (200 μCi), respectively, and a size of 65×200 mm2. Excellent uniformity of the distribution of deposited radioactivity was confirmed by autoradiography and photometry. (author)

  9. Geospatial Analysis Using Remote Sensing Images: Case Studies of Zonguldak Test Field

    Science.gov (United States)

    Bayık, Çağlar; Topan, Hüseyin; Özendi, Mustafa; Oruç, Murat; Cam, Ali; Abdikan, Saygın

    2016-06-01

    Inclined topographies are one of the most challenging problems for geospatial analysis of air-borne and space-borne imageries. However, flat areas are mostly misleading to exhibit the real performance. For this reason, researchers generally require a study area which includes mountainous topography and various land cover and land use types. Zonguldak and its vicinity is a very suitable test site for performance investigation of remote sensing systems due to the fact that it contains different land use types such as dense forest, river, sea, urban area; different structures such as open pit mining operations, thermal power plant; and its mountainous structure. In this paper, we reviewed more than 120 proceeding papers and journal articles about geospatial analysis that are performed on the test field of Zonguldak and its surroundings. Geospatial analysis performed with imageries include elimination of systematic geometric errors, 2/3D georeferencing accuracy assessment, DEM and DSM generation and validation, ortho-image production, evaluation of information content, image classification, automatic feature extraction and object recognition, pan-sharpening, land use and land cover change analysis and deformation monitoring. In these applications many optical satellite images are used i.e. ASTER, Bilsat-1, IKONOS, IRS-1C, KOMPSAT-1, KVR-1000, Landsat-3-5-7, Orbview-3, QuickBird, Pleiades, SPOT-5, TK-350, RADARSAT-1, WorldView-1-2; as well as radar data i.e. JERS-1, Envisat ASAR, TerraSAR-X, ALOS PALSAR and SRTM. These studies are performed by Departments of Geomatics Engineering at Bülent Ecevit University, at İstanbul Technical University, at Yıldız Technical University, and Institute of Photogrammetry and GeoInformation at Leibniz University Hannover. These studies are financially supported by TÜBİTAK (Turkey), the Universities, ESA, Airbus DS, ERSDAC (Japan) and Jülich Research Centre (Germany).

  10. GEOSPATIAL ANALYSIS USING REMOTE SENSING IMAGES: CASE STUDIES OF ZONGULDAK TEST FIELD

    Directory of Open Access Journals (Sweden)

    Ç. Bayık

    2016-06-01

    Full Text Available Inclined topographies are one of the most challenging problems for geospatial analysis of air-borne and space-borne imageries. However, flat areas are mostly misleading to exhibit the real performance. For this reason, researchers generally require a study area which includes mountainous topography and various land cover and land use types. Zonguldak and its vicinity is a very suitable test site for performance investigation of remote sensing systems due to the fact that it contains different land use types such as dense forest, river, sea, urban area; different structures such as open pit mining operations, thermal power plant; and its mountainous structure. In this paper, we reviewed more than 120 proceeding papers and journal articles about geospatial analysis that are performed on the test field of Zonguldak and its surroundings. Geospatial analysis performed with imageries include elimination of systematic geometric errors, 2/3D georeferencing accuracy assessment, DEM and DSM generation and validation, ortho-image production, evaluation of information content, image classification, automatic feature extraction and object recognition, pan-sharpening, land use and land cover change analysis and deformation monitoring. In these applications many optical satellite images are used i.e. ASTER, Bilsat-1, IKONOS, IRS-1C, KOMPSAT-1, KVR-1000, Landsat-3-5-7, Orbview-3, QuickBird, Pleiades, SPOT-5, TK-350, RADARSAT-1, WorldView-1-2; as well as radar data i.e. JERS-1, Envisat ASAR, TerraSAR-X, ALOS PALSAR and SRTM. These studies are performed by Departments of Geomatics Engineering at Bülent Ecevit University, at İstanbul Technical University, at Yıldız Technical University, and Institute of Photogrammetry and GeoInformation at Leibniz University Hannover. These studies are financially supported by TÜBİTAK (Turkey), the Universities, ESA, Airbus DS, ERSDAC (Japan) and Jülich Research Centre (Germany).

  11. FOSS Tools and Applications for Education in Geospatial Sciences

    Directory of Open Access Journals (Sweden)

    Marco Ciolli

    2017-07-01

    Full Text Available While the theory and implementation of geographic information systems (GIS have a history of more than 50 years, the development of dedicated educational tools and applications in this field is more recent. This paper presents a free and open source software (FOSS approach for education in the geospatial disciplines, which has been used over the last 20 years at two Italian universities. The motivations behind the choice of FOSS are discussed with respect to software availability and development, as well as educational material licensing. Following this philosophy, a wide range of educational tools have been developed, covering topics from numerical cartography and GIS principles to the specifics regarding different systems for the management and analysis of spatial data. Various courses have been implemented for diverse recipients, ranging from professional training workshops to PhD courses. Feedback from the students of those courses provides an invaluable assessment of the effectiveness of the approach, supplying at the same time directions for further improvement. Finally, lessons learned after 20 years are discussed, highlighting how the management of educational materials can be difficult even with a very open approach to licensing. Overall, the use of free and open source software for geospatial (FOSS4G science provides a clear advantage over other approaches, not only simplifying software and data management, but also ensuring that all of the information related to system design and implementation is available.

  12. Geospatial Informational Security Risks and Concerns of the U.S. Air Force GeoBase Program

    National Research Council Canada - National Science Library

    Bryant, Scott A

    2007-01-01

    Technological advancements such as Geospatial Information Systems (GIS) and the Internet have made it easier and affordable to share information, which enables complex and time sensitive decisions to be made with higher confidence...

  13. Q0000-398 is a high-redshift quasar with a large angular size

    International Nuclear Information System (INIS)

    Gearhart, M.R.; Pacht, E.

    1977-01-01

    A study is described, using the three-element interferometer at the National Radio Astronomy Observatory, West Virginia, to investigate whether any quasars exist that might be radio sources. It was found that Q0000-398 appeared to be a quasar of high redshift and large angular size. The interferometer was operated with a 300-1200-1500 m baseline configuration at 2695 MHz. The radio map for Q0000-398 is shown, and has two weak components separated by 134 ± 40 arcsec. If these components are associated with the optical object, this quasar has the largest known angular size for its redshift value. The results reported for Q0000-398 and other quasars having considerable angular extent demonstrate the importance of considering radio selection effects in the angular diameter-redshift relationship, and since any radio selection effects are removed when quasars are selected optically, more extensive mapping programs should be undertaken, looking particularly for large-scale structure around optically selected high-z quasars. (U.K.)

  14. Generation of Multiple Metadata Formats from a Geospatial Data Repository

    Science.gov (United States)

    Hudspeth, W. B.; Benedict, K. K.; Scott, S.

    2012-12-01

    The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general purpose web services based upon the REST service model, and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. title, abstract, publication data, geographic extent, projection information, and database elements), but also the processing steps used to
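    The single-source, multi-format delivery pattern the abstract describes can be sketched as serializing one internal metadata record into more than one target schema. The element names below are simplified placeholders, not the actual ISO 19139 or FGDC CSDGM schemas, and the record values are invented.

```python
import xml.etree.ElementTree as ET

# One internal record, serialized into two schema-like XML shapes.
record = {
    "title": "MODIS NDVI 2011",
    "abstract": "16-day composite",
    "west": -109.05,
    "east": -103.0,
}

def to_iso_like(rec):
    """Render the record in a simplified ISO-19139-style layout."""
    root = ET.Element("MD_Metadata")
    ET.SubElement(root, "title").text = rec["title"]
    ET.SubElement(root, "abstract").text = rec["abstract"]
    ext = ET.SubElement(root, "geographicExtent")
    ET.SubElement(ext, "westBound").text = str(rec["west"])
    ET.SubElement(ext, "eastBound").text = str(rec["east"])
    return ET.tostring(root, encoding="unicode")

def to_fgdc_like(rec):
    """Render the same record in a simplified FGDC-CSDGM-style layout."""
    root = ET.Element("metadata")
    idinfo = ET.SubElement(root, "idinfo")
    ET.SubElement(idinfo, "title").text = rec["title"]
    ET.SubElement(idinfo, "abstract").text = rec["abstract"]
    return ET.tostring(root, encoding="unicode")

print(to_iso_like(record))
print(to_fgdc_like(record))
```

    The point of keeping one internal record is that adding a further output format (or a provenance block) means adding one more renderer, not re-extracting the metadata.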

  15. High Resolution Dsm and Classified Volumetric Generation: AN Operational Approach to the Improvement of Geospatial Intelligence

    Science.gov (United States)

    Boccardo, P.; Gentili, G.

    2011-09-01

    As mentioned by Bacastow and Bellafiore, Geospatial Intelligence (GEOINT) is a field of knowledge, a process, and a profession. As knowledge, it is information integrated in a coherent space-time context that supports descriptions, explanations, or forecasts of human activities, on the basis of which decision makers take action. As a process, it is the means by which data and information are collected, manipulated, geospatially reasoned about, and disseminated to decision makers. The geospatial intelligence professional establishes the scope of activities, interdisciplinary associations, competencies, and standards in academe, government, and the private sector. Given that GEOINT is crucial for broad organizations, the BLOM Group, a leading international provider of acquisition, processing and modeling of geographic information, and ITHACA, a non-profit organization devoted to delivering products and services to the UN System in the field of geomatics, set up and provided GEOINT data to the main Italian companies operating in the field of mobile phone networking. These data, extremely useful for telecom network planning, have been derived and produced using a standardized and effective (from the production point of view) approach. In this paper, all the procedures used for the production are described and tested, with the aim of investigating the suitability of the data and the procedures themselves for other possible fields of application.

  16. Compilation of geospatial data for the mineral industries and related infrastructure of Latin America and the Caribbean

    Science.gov (United States)

    Baker, Michael S.; Buteyn, Spencer D.; Freeman, Philip A.; Trippi, Michael H.; Trimmer III, Loyd M.

    2017-07-31

    This report describes the U.S. Geological Survey’s (USGS) ongoing commitment to its mission of understanding the nature and distribution of global mineral commodity supply chains by updating and publishing the georeferenced locations of mineral commodity production and processing facilities, mineral exploration and development sites, and mineral commodity exporting ports in Latin America and the Caribbean. The report includes an overview of data sources and an explanation of the geospatial PDF map format. The geodatabase and geospatial data layers described in this report create a new geographic information product in the form of a geospatial portable document format (PDF) map. The geodatabase contains additional data layers from USGS, foreign governmental, and open-source sources as follows: (1) coal occurrence areas, (2) electric power generating facilities, (3) electric power transmission lines, (4) hydrocarbon resource cumulative production data, (5) liquefied natural gas terminals, (6) oil and gas concession leasing areas, (7) oil and gas field center points, (8) oil and gas pipelines, (9) USGS petroleum provinces, (10) railroads, (11) recoverable proven plus probable hydrocarbon resources, (12) major cities, (13) major rivers, and (14) undiscovered porphyry copper tracts.

  17. Small, medium, large or supersize? The development and evaluation of interventions targeted at portion size

    Science.gov (United States)

    Vermeer, W M; Steenhuis, I H M; Poelman, M P

    2014-01-01

    In the past decades, portion sizes of high-caloric foods and drinks have increased and can be considered an important environmental obesogenic factor. This paper describes a research project in which the feasibility and effectiveness of environmental interventions targeted at portion size was evaluated. The studies that we conducted revealed that portion size labeling, offering a larger variety of portion sizes, and proportional pricing (that is, a comparable price per unit regardless of the size) were considered feasible to implement according to both consumers and point-of-purchase representatives. Studies into the effectiveness of these interventions demonstrated that the impact of portion size labeling on the (intended) consumption of soft drinks was, at most, modest. Furthermore, the introduction of smaller portion sizes of hot meals in worksite cafeterias in addition to the existing size stimulated a moderate number of consumers to replace their large meals by a small meal. Elaborating on these findings, we advocate further research into communication and marketing strategies related to portion size interventions; the development of environmental portion size interventions as well as educational interventions that improve people's ability to deal with a 'super-sized' environment; the implementation of regulation with respect to portion size labeling, and the use of nudges to stimulate consumers to select healthier portion sizes. PMID:25033959

  18. Using Geospatial Analysis to Align Little Free Library Locations with Community Literacy Needs

    Science.gov (United States)

    Rebori, Marlene K.; Burge, Peter

    2017-01-01

    We used geospatial analysis tools to develop community maps depicting fourth-grade reading proficiency test scores and locations of facilities offering public access to reading materials (i.e., public libraries, elementary schools, and Little Free Libraries). The maps visually highlighted areas with struggling readers and areas without adequate…

  19. Data Quality, Provenance and IPR Management services: their role in empowering geospatial data suppliers and users

    Science.gov (United States)

    Millard, Keiran

    2015-04-01

    This paper looks at the current experiences of geospatial users and suppliers and how they have been limited by the lack of suitable frameworks for managing and communicating data quality, data provenance and intellectual property rights (IPR). Current political and technological drivers mean that increasing volumes of geospatial data are available through a plethora of different products and services, and whilst this is inherently a good thing it does create a new generation of challenges. This paper considers two examples of where these issues have been examined and looks at the challenges and possible solutions from a data user and data supplier perspective. The first example is the IQmulus project, which is researching fusion environments for big geospatial point clouds and coverages. The second example is the EU Emodnet programme, which is establishing thematic data portals for public marine and coastal data. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR and numerical simulations are simply too big for routine and ad-hoc analysis, yet they could yield a myriad of disparate, and readily usable, information products with the right infrastructure in place. IQmulus is researching how to deliver this infrastructure technically, but a financially sustainable delivery depends on being able to track and manage ownership and IPR across the numerous data sets being processed. This becomes complex when the data are composed of multiple overlapping coverages; however, managing it allows users to be delivered highly bespoke products that meet their budget and technical needs. The Emodnet programme delivers harmonised marine data at the EU scale across seven thematic portals. As part of the Emodnet programme, a series of 'check points' have been initiated to examine how useful these services and other public data services actually are for solving real-world problems. One key finding is that users have been confused by the fact that often

  20. Validation Of Intermediate Large Sample Analysis (With Sizes Up to 100 G) and Associated Facility Improvement

    International Nuclear Information System (INIS)

    Bode, P.; Koster-Ammerlaan, M.J.J.

    2018-01-01

    Pragmatic rather than physical correction factors for neutron and gamma-ray shielding were studied for samples of intermediate size, i.e. in the 10-100 gram range. It was found that for most biological and geological materials, the neutron self-shielding is less than 5 % and the gamma-ray self-attenuation can easily be estimated. A trueness control material of 1 kg size was made based on the use of leftovers of materials used in laboratory intercomparisons. A design study for a large sample pool-side facility, handling plate-type volumes, had to be stopped because of a reduction in the human resources available for this CRP. The large sample NAA facilities were made available to guest scientists from Greece and Brazil. The laboratory for neutron activation analysis participated in the world’s first laboratory intercomparison utilizing large samples. (author)

  1. Optimal integrated sizing and planning of hubs with midsize/large CHP units considering reliability of supply

    International Nuclear Information System (INIS)

    Moradi, Saeed; Ghaffarpour, Reza; Ranjbar, Ali Mohammad; Mozaffari, Babak

    2017-01-01

    Highlights: • A new hub planning formulation is proposed to exploit the assets of midsize/large CHPs. • Linearization approaches are proposed for the two-variable nonlinear CHP fuel function. • Efficient operation of the addressed CHPs & hub devices at contingencies is considered. • Reliability-embedded integrated planning & sizing is formulated as one single MILP. • Noticeable results for costs & reliability-embedded planning due to mid/large CHPs. - Abstract: The use of multi-carrier energy systems and the energy hub concept has recently become a widespread trend worldwide. However, most of the related research specializes in CHP systems with constant electricity/heat ratios and linear operating characteristics. In this paper, integrated energy hub planning and sizing is developed for energy systems with mid-scale and large-scale CHP units, by taking their wide operating range into consideration. The proposed formulation is aimed at making the best use of the beneficial degrees of freedom associated with these units for decreasing total costs and increasing reliability. High-accuracy piecewise linearization techniques with approximation errors of about 1% are introduced for the nonlinear two-dimensional CHP input-output function, making it possible to successfully integrate the CHP sizing. Efficient operation of the CHP and the hub at contingencies is extracted via a new formulation, which is developed to be incorporated into the planning and sizing problem. Optimal operation, planning, sizing and contingency operation of hub components are integrated and formulated as a single comprehensive MILP problem. Results on a case study with midsize CHPs reveal a 33% reduction in total costs, and it is demonstrated that the proposed formulation eliminates the need for additional components/capacities for increasing reliability of supply.
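    The paper's two-variable CHP fuel function is not given in the abstract, so the sketch below illustrates the piecewise-linearization idea on an invented one-dimensional convex fuel curve: the nonlinear curve is replaced by straight segments between breakpoints (as an MILP formulation requires), and the worst-case relative approximation error over the operating range is measured.

```python
def fuel_input(p):
    """Illustrative convex fuel curve: fuel input as a function of power output (MW)."""
    return 1.2 * p + 0.004 * p ** 2 + 15.0

def piecewise(breakpoints, f, p):
    """Interpolate f linearly between consecutive breakpoints."""
    for lo, hi in zip(breakpoints, breakpoints[1:]):
        if lo <= p <= hi:
            w = (p - lo) / (hi - lo)
            return (1 - w) * f(lo) + w * f(hi)
    raise ValueError("p outside linearized range")

# Invented operating range and breakpoint grid.
breakpoints = [50, 100, 150, 200, 250]

# Worst-case relative error of the linearized curve over the range.
worst = max(
    abs(piecewise(breakpoints, fuel_input, p) - fuel_input(p)) / fuel_input(p)
    for p in range(50, 251)
)
print(f"max relative error: {worst:.2%}")
```

    Tightening the breakpoint grid drives the error down quadratically with segment width, which is how a formulation can target an error budget on the order of the ~1% the authors report while keeping the MILP small.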

  2. Advancing Geospatial Technologies in Science and Social Science: A Case Study in Collaborative Education

    Science.gov (United States)

    Williams, N. A.; Morris, J. N.; Simms, M. L.; Metoyer, S.

    2007-12-01

    The Advancing Geospatial Skills in Science and Social Sciences (AGSSS) program, funded by NSF, provides middle and high school teacher-partners with access to graduate student scientists for classroom collaboration and curriculum adaptation to incorporate and advance skills in spatial thinking. AGSSS Fellows aid in the delivery of geospatially-enhanced activities utilizing technology such as geographic information systems, remote sensing, and virtual globes. The partnership also provides advanced professional development for both participating teachers and fellows. The AGSSS program is mutually beneficial to all parties involved. This successful collaboration of scientists, teachers, and students results in greater understanding and enthusiasm for the use of spatial thinking strategies and geospatial technologies. In addition, the partnership produces measurable improvements in student efficacy and attitudes toward processes of spatial thinking. The teacher partner training and classroom resources provided by AGSSS will continue the integration of geospatial activities into the curriculum after the project concludes. Time and resources are the main costs in implementing this partnership. Graduate fellows invest considerable time and energy, outside of academic responsibilities, to develop materials for the classroom. Fellows are required to be available during K-12 school hours, which necessitates forethought in scheduling other graduate duties. However, the benefits far outweigh the costs. Graduate fellows gain experience in working in classrooms. In exchange, students gain exposure to working scientists and their research. This affords graduate fellows the opportunity to hone their communication skills, and specifically allows them to address the issue of translating technical information for a novice audience. Teacher-partners and students benefit by having scientific expertise readily available. In summation, these experiences result in changes in teacher

  3. A procedure to detect flaws inside large size marble blocks by ultrasound

    OpenAIRE

    Bramanti, Mauro; Bozzi, Edoardo

    1999-01-01

    In stone and marble industry there is considerable interest in the possibility of using ultrasound diagnostic techniques for non-destructive testing of large size blocks in order to detect internal flaws such as faults, cracks and fissures. In this paper some preliminary measurements are reported in order to acquire basic knowledge of the fundamental properties of ultrasound, such as propagation velocity and attenuation, in the media here considered. We then outline a particular diagnostic pr...

  4. Local Government GIS and Geospatial Capabilities : Suitability for Integrated Transportation & Land Use Planning (California SB 375)

    Science.gov (United States)

    2009-11-01

    This report examines two linked phenomena in transportation planning: the geospatial analysis capabilities of local planning agencies and the increasing demands on such capabilities imposed by comprehensive planning mandates.

  5. Automating the Analysis of Spatial Grids A Practical Guide to Data Mining Geospatial Images for Human & Environmental Applications

    CERN Document Server

    Lakshmanan, Valliappa

    2012-01-01

    The ability to create automated algorithms to process gridded spatial data is increasingly important as remotely sensed datasets increase in volume and frequency. Whether in business, social science, ecology, meteorology or urban planning, the ability to create automated applications to analyze and detect patterns in geospatial data is increasingly important. This book provides students with a foundation in topics of digital image processing and data mining as applied to geospatial datasets. The aim is for readers to be able to devise and implement automated techniques to extract information from spatial grids such as radar, satellite or high-resolution survey imagery.

  6. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    Science.gov (United States)

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  7. Comparison of silicon strip tracker module size using large sensors from 6 inch wafers

    CERN Multimedia

    Honma, Alan

    1999-01-01

    Two large silicon strip sensors made from 6-inch wafers are placed next to each other to simulate the size of a CMS outer silicon tracker module. On the left is a prototype two-sensor CMS inner-endcap silicon tracker module made from 4-inch wafers.

  8. Finite-time and finite-size scalings in the evaluation of large-deviation functions: Numerical approach in continuous time.

    Science.gov (United States)

    Guevara Hidalgo, Esteban; Nemoto, Takahiro; Lecomte, Vivien

    2017-06-01

    Rare trajectories of stochastic systems are important to understand because of their potential impact. However, their properties are by definition difficult to sample directly. Population dynamics provides a numerical tool allowing their study, by means of simulating a large number of copies of the system, which are subjected to selection rules that favor the rare trajectories of interest. Such algorithms are plagued by finite simulation time and finite population size, effects that can render their use delicate. In this paper, we present a numerical approach which uses the finite-time and finite-size scalings of estimators of the large deviation functions associated to the distribution of rare trajectories. The method we propose allows one to extract the infinite-time and infinite-size limit of these estimators, which, as shown on the contact process, provides a significant improvement of the large deviation function estimators compared to the standard one.
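    The population-dynamics estimator described above can be illustrated on a deliberately minimal example. Here the observable is the "activity" of an i.i.d. Bernoulli process, for which the scaled cumulant generating function is known exactly; the population size, time horizon and bias are illustrative assumptions, and because the copies carry no internal state the selection step reduces to reweighting (a real contact-process implementation would clone and prune configurations).

```python
import math
import random

def cloning_scgf(s, n_copies=2000, t_max=200, p=0.3, seed=0):
    """Estimate psi(s) = (1/T) ln E[exp(-s * A_T)], where A_T is the sum of
    Bernoulli(p) increments, with a finite population over a finite time,
    the two effects whose scalings the paper exploits."""
    rng = random.Random(seed)
    log_growth = 0.0
    for _ in range(t_max):
        # Each copy draws an increment and receives the exponential bias.
        weights = [math.exp(-s * (1 if rng.random() < p else 0))
                   for _ in range(n_copies)]
        mean_w = sum(weights) / n_copies
        log_growth += math.log(mean_w)
        # A full cloning step would now resample copies proportionally to
        # weight; with stateless copies this changes nothing.
    return log_growth / t_max

def exact(s, p=0.3):
    """Exact SCGF of the Bernoulli activity, for comparison."""
    return math.log(p * math.exp(-s) + 1 - p)
```

    For this toy observable the finite-population estimator already sits close to the exact value; the finite-time and finite-size corrections analyzed in the paper become important precisely when the copies are correlated through the selection step.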

  9. Finite-time and finite-size scalings in the evaluation of large-deviation functions: Numerical approach in continuous time

    Science.gov (United States)

    Guevara Hidalgo, Esteban; Nemoto, Takahiro; Lecomte, Vivien

    2017-06-01

    Rare trajectories of stochastic systems are important to understand because of their potential impact. However, their properties are by definition difficult to sample directly. Population dynamics provides a numerical tool allowing their study, by means of simulating a large number of copies of the system, which are subjected to selection rules that favor the rare trajectories of interest. Such algorithms are plagued by finite simulation time and finite population size, effects that can render their use delicate. In this paper, we present a numerical approach which uses the finite-time and finite-size scalings of estimators of the large deviation functions associated to the distribution of rare trajectories. The method we propose allows one to extract the infinite-time and infinite-size limit of these estimators, which—as shown on the contact process—provides a significant improvement of the large deviation function estimators compared to the standard one.

  10. Detection of tiny amounts of fissile materials in large-sized containers with radioactive waste

    Science.gov (United States)

    Batyaev, V. F.; Skliarov, S. V.

    2018-01-01

    The paper is devoted to non-destructive control of tiny amounts of fissile materials in large-sized containers filled with radioactive waste (RAW). The aim of this work is to model an active neutron interrogation facility for detection of fissile materials inside NZK type containers with RAW and determine the minimal detectable mass of U-235 as a function of various parameters: matrix type, nonuniformity of container filling, neutron generator parameters (flux, pulse frequency, pulse duration), measurement time. As a result the dependence of minimal detectable mass on fissile materials location inside the container is shown. Nonuniformity of the thermal neutron flux inside a container is the main reason for the spatial heterogeneity of minimal detectable mass inside a large-sized container. Our experiments with tiny amounts of uranium-235 (<1 g) confirm the detection of fissile materials in NZK containers by using the active neutron interrogation technique.

  11. Detection of tiny amounts of fissile materials in large-sized containers with radioactive waste

    Directory of Open Access Journals (Sweden)

    Batyaev V.F.

    2018-01-01

    The paper is devoted to non-destructive control of tiny amounts of fissile materials in large-sized containers filled with radioactive waste (RAW). The aim of this work is to model an active neutron interrogation facility for detection of fissile materials inside NZK type containers with RAW and determine the minimal detectable mass of U-235 as a function of various parameters: matrix type, nonuniformity of container filling, neutron generator parameters (flux, pulse frequency, pulse duration), measurement time. As a result the dependence of minimal detectable mass on fissile materials location inside the container is shown. Nonuniformity of the thermal neutron flux inside a container is the main reason for the spatial heterogeneity of minimal detectable mass inside a large-sized container. Our experiments with tiny amounts of uranium-235 (<1 g) confirm the detection of fissile materials in NZK containers by using the active neutron interrogation technique.

  12. Comprehensive geo-spatial data creation for Najran region in the KSA

    Science.gov (United States)

    Alrajhi, M.; Hawarey, M.

    2009-04-01

    The General Directorate for Surveying and Mapping (GDSM) of the Deputy Ministry for Land and Surveying (DMLS) of the Ministry of Municipal and Rural Affairs (MOMRA) in the Kingdom of Saudi Arabia (KSA) has the exclusive mandate to carry out aerial photography and produce large-scale detailed maps for about 220 cities and villages in the KSA. This presentation is about the comprehensive geo-spatial data creation for the Najran region, South KSA, that was founded on country-wide horizontal geodetic ground control using Global Navigation Satellite Systems (GNSS) within the MOMRA's Terrestrial Reference Frame 2000 (MTRF2000) that is tied to International Terrestrial Reference Frame 2000 (ITRF2000) Epoch 2004.0, and vertical geodetic ground control using precise digital leveling in reference to Jeddah 1969 mean sea level, and included aerial photography of area 917 km2 at 1:5,500 scale and 14,304 km2 at 1:45,000 scale, full aerial triangulation, and production of orthophoto maps at scale of 1:10,000 (298 sheets) for 14,304 km2, with aerial photography lasting from May 2006 until July 2006.

  13. Comprehensive geo-spatial data creation for Asir region in the KSA

    Science.gov (United States)

    Alrajhi, M.; Hawarey, M.

    2009-04-01

    The General Directorate for Surveying and Mapping (GDSM) of the Deputy Ministry for Land and Surveying (DMLS) of the Ministry of Municipal and Rural Affairs (MOMRA) in the Kingdom of Saudi Arabia (KSA) has the exclusive mandate to carry out aerial photography and produce large-scale detailed maps for about 220 cities and villages in the KSA. This presentation is about the comprehensive geo-spatial data creation for the Asir region, South West KSA, that was founded on country-wide horizontal geodetic ground control using Global Navigation Satellite Systems (GNSS) within the MOMRA's Terrestrial Reference Frame 2000 (MTRF2000) that is tied to International Terrestrial Reference Frame 2000 (ITRF2000) Epoch 2004.0, and vertical geodetic ground control using precise digital leveling in reference to Jeddah 1969 mean sea level, and included aerial photography of area 2,188 km2 at 1:5,500 scale and 32,640 km2 at 1:45,000 scale, full aerial triangulation, and production of orthophoto maps at scale of 1:10,000 (680 sheets) for 32,640 km2, with aerial photography lasting from July 2007 through October 2007.

  14. The geo-spatial information infrastructure at the Centre for Control and Prevention of Zoonoses, University of Ibadan, Nigeria: an emerging sustainable One-Health pavilion.

    Science.gov (United States)

    Olugasa, B O

    2014-12-01

    The World-Wide-Web as a contemporary means of information sharing offers a platform for geo-spatial information dissemination to improve education about spatio-temporal patterns of disease spread at the human-animal-environment interface in developing countries of West Africa. In assessing the quality of exposure to geospatial information applications among students in five purposively selected institutions in West Africa, this study reviewed course contents and postgraduate programmes in zoonoses surveillance. Geospatial information content and associated practical exercises in zoonoses surveillance were scored. Seven criteria were used to categorize and score capability, namely: spatial data capture; thematic map design and interpretation; spatio-temporal analysis; remote sensing of data; statistical modelling; management of the spatial data profile; and web-based map-sharing operation within an organization. These criteria were used to compute weighted exposure during training at the institutions. The institution with the highest computed Cumulative Exposure Point Average (CEPA) was characterized using an illustration based on retrospective records of rabies cases, with data from humans, animals and the environment sourced from Grand Bassa County, Liberia, to create and share maps and information with faculty, staff, students and the neighbourhood about animal-bite injury surveillance and the spatial distribution of rabies-like illness. Uniformly low CEPA values (0-1.3) were observed across academic departments. The highest (3.8) was observed at the Centre for Control and Prevention of Zoonoses (CCPZ), University of Ibadan, Nigeria, where geospatial techniques are systematically taught, and thematic and predictive maps are produced and shared online with other institutions in West Africa. In addition, a short course in zoonosis surveillance, which offers inclusive learning in geospatial applications, is taught at CCPZ. The paper

  15. A NEW INITIATIVE FOR TILING, STITCHING AND PROCESSING GEOSPATIAL BIG DATA IN DISTRIBUTED COMPUTING ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and extendability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.
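    The tile-then-stitch pattern at the heart of this record can be sketched in a few lines. This is a single-process illustration of the data-partitioning idea only; tile size, the threshold "processing" step and the NumPy representation are assumptions, not IQLib's actual API.

```python
import numpy as np

def tile(raster, tile_h, tile_w):
    """Split a 2-D raster into a dict {(row, col): tile}; edge tiles may be
    smaller. Mimics the partitioning step of a tile-based workflow."""
    tiles = {}
    h, w = raster.shape
    for i in range(0, h, tile_h):
        for j in range(0, w, tile_w):
            tiles[(i // tile_h, j // tile_w)] = raster[i:i + tile_h, j:j + tile_w]
    return tiles

def stitch(tiles, tile_h, tile_w, shape):
    """Reassemble tiles (e.g. after each was processed on a separate node)."""
    out = np.empty(shape, dtype=next(iter(tiles.values())).dtype)
    for (ti, tj), t in tiles.items():
        i, j = ti * tile_h, tj * tile_w
        out[i:i + t.shape[0], j:j + t.shape[1]] = t
    return out

raster = np.arange(30000).reshape(150, 200)
tiles = tile(raster, 64, 64)
# Stand-in "processing": threshold each tile independently, then stitch.
processed = {k: (t > 15000).astype(np.uint8) for k, t in tiles.items()}
result = stitch(processed, 64, 64, raster.shape)
```

    A per-pixel operation like this thresholding is embarrassingly parallel; the harder cases the paper alludes to are operations whose stencil crosses tile boundaries, which is where the stitching (and overlap handling) strategy matters.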

  16. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    Science.gov (United States)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and extendability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.

  17. River predisposition to ice jams: a simplified geospatial model

    Directory of Open Access Journals (Sweden)

    S. De Munck

    2017-07-01

    Floods resulting from river ice jams pose a great risk to many riverside municipalities in Canada. The location of an ice jam is mainly influenced by channel morphology. The goal of this work was therefore to develop a simplified geospatial model to estimate the predisposition of a river channel to ice jams. Rather than predicting the timing of river ice breakup, the main question here was to predict where broken ice is susceptible to jam based on the river's geomorphological characteristics. Thus, six parameters referred to in the literature as potential causes of ice jams were initially selected: presence of an island, narrowing of the channel, high sinuosity, presence of a bridge, confluence of rivers, and slope break. A GIS-based tool was used to generate the aforementioned factors over regularly spaced segments along the entire channel using available geospatial data. An ice jam predisposition index (IJPI) was calculated by combining the weighted optimal factors. Three Canadian rivers (province of Québec) were chosen as test sites. The resulting maps were assessed against historical observations and local knowledge. Results show that 77 % of the observed ice jam sites on record occurred in river sections that the model considered as having high or medium predisposition. This leaves 23 % of false-negative errors (missed occurrences). Between 7 and 11 % of the highly predisposed river sections did not have an ice jam on record (false-positive cases). Results, limitations, and potential improvements are discussed.
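    The weighted combination behind an index like the IJPI can be sketched directly. The factor weights, the [0, 1] scoring and the class thresholds below are illustrative assumptions; the paper derives its own optimal weights.

```python
# Sketch of a weighted ice-jam predisposition index for one river segment.
# Factor weights and class cut-offs are hypothetical, not the paper's values.

FACTORS = ["island", "narrowing", "sinuosity", "bridge", "confluence", "slope_break"]
WEIGHTS = {"island": 0.20, "narrowing": 0.20, "sinuosity": 0.15,
           "bridge": 0.15, "confluence": 0.15, "slope_break": 0.15}

def ijpi(segment_scores):
    """Weighted sum of factor scores (each in [0, 1]) -> index in [0, 1]."""
    return sum(WEIGHTS[f] * segment_scores.get(f, 0.0) for f in FACTORS)

def classify(index):
    """Map the index to the high/medium/low predisposition classes."""
    if index >= 0.66:
        return "high"
    if index >= 0.33:
        return "medium"
    return "low"

# Example segment: an island and a confluence, moderate narrowing.
seg = {"island": 1.0, "narrowing": 0.8, "sinuosity": 0.4, "bridge": 0.0,
       "confluence": 1.0, "slope_break": 0.2}
```

    In the GIS workflow each factor would be rasterized or computed per segment from the channel geometry; the index itself is just this weighted overlay.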

  18. Representing Geospatial Environment Observation Capability Information: A Case Study of Managing Flood Monitoring Sensors in the Jinsha River Basin

    Science.gov (United States)

    Hu, Chuli; Guan, Qingfeng; Li, Jie; Wang, Ke; Chen, Nengcheng

    2016-01-01

    Sensor inquirers cannot understand comprehensive or accurate observation capability information because current observation capability modeling does not consider the union of multiple sensors nor the effect of geospatial environmental features on the observation capability of sensors. These limitations result in a failure to discover credible sensors or plan for their collaboration for environmental monitoring. The Geospatial Environmental Observation Capability (GEOC) is proposed in this study and can be used as an information basis for the reliable discovery and collaborative planning of multiple environmental sensors. A field-based GEOC (GEOCF) information representation model is built. Quintuple GEOCF feature components and two GEOCF operations are formulated based on the geospatial field conceptual framework. The proposed GEOCF markup language is used to formalize the proposed GEOCF. A prototype system called GEOCapabilityManager is developed, and a case study is conducted for flood observation in the lower reaches of the Jinsha River Basin. The applicability of the GEOCF is verified through the reliable discovery of flood monitoring sensors and planning for the collaboration of these sensors. PMID:27999247

  19. Urban Image Classification: Per-Pixel Classifiers, Sub-Pixel Analysis, Object-Based Image Analysis, and Geospatial Methods (Chapter 10)

    Science.gov (United States)

    Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.

    2013-01-01

    Remote sensing methods used to generate base maps to analyze the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within that pixel. They range from simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems that assign urban land covers. Researchers recognize, however, that even with the smallest pixel size the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process. The primary difference, though, is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post
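    The NDVI mentioned above is the simplest per-pixel method: each output pixel depends only on that pixel's own band values. A minimal sketch with synthetic band arrays (the reflectance values are made up for illustration):

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - red) / (NIR + red), the classic example of a
    per-pixel environmental index; zero-denominator pixels are set to 0."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Tiny synthetic scene: two vegetated pixels, one dark pixel, one no-data pixel.
nir = np.array([[0.5, 0.8], [0.1, 0.0]])
red = np.array([[0.1, 0.2], [0.1, 0.0]])
v = ndvi(nir, red)
```

    Sub-pixel, object-based and geospatial methods all start from band arithmetic like this but add mixture modeling, segmentation, or neighborhood statistics on top of it.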

  20. Exchanging the Context between OGC Geospatial Web clients and GIS applications using Atom

    Science.gov (United States)

    Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier

    2013-04-01

    Currently, the discovery and sharing of geospatial information over the web still presents difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication presents an extension of Atom for redistributing references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds and direct data, and share this status with other users who need the same information and use different client vendor products, in an interoperable way. The extensibility of the Atom format was essential to define a format that could be used in RSS-enabled web browsers, mass-market map viewers and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, it is possible to view the document in common places where Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description and downloading features. OWS Context uses GeoRSS so that the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context is to develop an XSLT to transform the Atom feed into an HTML5 document that shows the exact status of the client view window that saved the context document. To accomplish this, we use the width and height of the client window, and the extent of the view in world (geographic) coordinates, in order to calculate the scale of the map. Then, we can mix elements in world coordinates (such as CF-NetCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles and direct SVG content).
A smarter map browser application called MiraMon Map Browser is able to write a context document and read
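    The scale computation described above (window size in pixels plus geographic extent gives a map scale) can be sketched as follows. The equirectangular approximation, the 96 dpi screen resolution and the WGS84 radius are common defaults assumed here, not values taken from the paper.

```python
import math

def map_scale_denominator(width_px, lon_min, lon_max, lat_center, dpi=96):
    """Approximate map scale denominator from window width in pixels and the
    view's geographic extent, assuming an equirectangular view."""
    earth_radius_m = 6378137.0  # WGS84 semi-major axis
    # Ground distance spanned by the window at the view's central latitude.
    ground_width_m = (math.radians(lon_max - lon_min)
                      * earth_radius_m * math.cos(math.radians(lat_center)))
    # Physical width of the window on screen: pixels -> inches -> metres.
    screen_width_m = width_px / dpi * 0.0254
    return ground_width_m / screen_width_m

# A 1024 px wide window showing one degree of longitude near Barcelona.
scale = map_scale_denominator(1024, 2.0, 3.0, 41.5)
```

    A context document that stores the window size and the world-coordinate extent therefore carries enough information for any reader to reconstruct the scale, which is what lets pixel-coordinate layers (WMS, WMTS tiles) be mixed with world-coordinate data.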

  1. An automated system for the preparation of Large Size Dried (LSD) Spikes

    International Nuclear Information System (INIS)

    Verbruggen, A.; Bauwens, J.; Jakobsson, U.; Eykens, R.; Wellum, R.; Aregbe, Y.; Van De Steene, N.

    2008-01-01

    Large size dried (LSD) spikes have been produced to fulfill the existing requirement for reliable and traceable isotopic reference materials for nuclear safeguards. A system to produce certified nuclear isotopic reference material as a U/Pu mixture in the form of large size dried spikes, comparable to those produced using traditional methods has been installed in collaboration with Nucomat, a company with a recognized reputation in design and development of integrated automated systems. The major components of the system are a robot, two balances, a dispenser and a drying unit fitted into a glove box. The robot is software driven and designed to control all movements inside the glove-box, to identify unambiguously the penicillin vials with a bar-code reader, to dispense the LSD batch solution into the vials and to weigh the amount dispensed. The system functionality has been evaluated and the performance validated by comparing the results from a series of samples dispensed and weighed by the automated system with the results by manual substitution weighing. After applying the proper correction factors to the data from the automated system balance no significant difference was observed between the two. However, an additional component of uncertainty of 3×10^-4 is introduced in the uncertainty budget for the certified weights provided by the automatic system. (authors)

  2. An automated system for the preparation of Large Size Dried (LSD) Spikes

    Energy Technology Data Exchange (ETDEWEB)

    Verbruggen, A.; Bauwens, J.; Jakobsson, U.; Eykens, R.; Wellum, R.; Aregbe, Y. [European Commission - Joint Research Centre, Institute for Reference Materials and Measurements (IRMM), Retieseweg 211, B2440 Geel (Belgium); Van De Steene, N. [Nucomat, Mercatorstraat 206, B9100 Sint Niklaas (Belgium)

    2008-07-01

    Large size dried (LSD) spikes have been produced to fulfill the existing requirement for reliable and traceable isotopic reference materials for nuclear safeguards. A system to produce certified nuclear isotopic reference material as a U/Pu mixture in the form of large size dried spikes, comparable to those produced using traditional methods has been installed in collaboration with Nucomat, a company with a recognized reputation in design and development of integrated automated systems. The major components of the system are a robot, two balances, a dispenser and a drying unit fitted into a glove box. The robot is software driven and designed to control all movements inside the glove-box, to identify unambiguously the penicillin vials with a bar-code reader, to dispense the LSD batch solution into the vials and to weigh the amount dispensed. The system functionality has been evaluated and the performance validated by comparing the results from a series of samples dispensed and weighed by the automated system with the results by manual substitution weighing. After applying the proper correction factors to the data from the automated system balance no significant difference was observed between the two. However, an additional component of uncertainty of 3×10^-4 is introduced in the uncertainty budget for the certified weights provided by the automatic system. (authors)

  3. Salt-assisted direct exfoliation of graphite into high-quality, large-size, few-layer graphene sheets.

    Science.gov (United States)

    Niu, Liyong; Li, Mingjian; Tao, Xiaoming; Xie, Zhuang; Zhou, Xuechang; Raju, Arun P A; Young, Robert J; Zheng, Zijian

    2013-08-21

    We report a facile and low-cost method to directly exfoliate graphite powders into large-size, high-quality, and solution-dispersible few-layer graphene sheets. In this method, aqueous mixtures of graphite and inorganic salts such as NaCl and CuCl2 are stirred, and subsequently dried by evaporation. Finally, the mixture powders are dispersed into an orthogonal organic solvent solution of the salt by low-power and short-time ultrasonication, which exfoliates graphite into few-layer graphene sheets. We find that the as-made graphene sheets contain little oxygen, and 86% of them are 1-5 layers with lateral sizes as large as 210 μm^2. Importantly, the as-made graphene can be readily dispersed into aqueous solution in the presence of surfactant and thus is compatible with various solution-processing techniques towards graphene-based thin film devices.

  4. Assessing and Valuing Historical Geospatial Data for Decisions

    Science.gov (United States)

    Sylak-Glassman, E.; Gallo, J.

    2016-12-01

    We will present a method for assessing the use and valuation of historical geospatial data and information products derived from Earth observations (EO). Historical data is widely used in the establishment of baseline reference cases, time-series analysis, and Earth system modeling. Historical geospatial data is used in diverse application areas, such as risk assessment in the insurance and reinsurance industry, disaster preparedness and response planning, historical demography, land-use change analysis, and paleoclimate research, among others. Establishing the current value of previously collected data, often from EO systems that are no longer operating, is difficult since the costs associated with their preservation, maintenance, and dissemination are current, while the costs associated with their original collection are sunk. Understanding their current use and value can aid in funding decisions about the data management infrastructure and workforce allocation required to maintain their availability. Using a value-tree framework to trace the application of data from EO systems, sensors, networks, and surveys, to weighted key Federal objectives, we are able to estimate the relative contribution of individual EO systems, sensors, networks, and surveys to meeting those objectives. The analysis relies on a modified Delphi method to elicit relative levels of reliance on individual EO data inputs, including historical data, from subject matter experts. This results in the identification of a representative portfolio of all EO data used to meet key Federal objectives. Because historical data is collected in conjunction with all other EO data within a weighted framework, its contribution to meeting key Federal objectives can be specifically identified and evaluated in relationship to other EO data. The results of this method could be applied to better understand and project the long-term value of data from current and future EO systems.

  5. Comparative analysis of non-destructive methods to control fissile materials in large-size containers

    Directory of Open Access Journals (Sweden)

    Batyaev V.F.

    2017-01-01

    The analysis of various non-destructive methods to control fissile materials (FM) in large-size containers filled with radioactive waste (RAW) has been carried out. The difficulty of applying passive gamma-neutron monitoring of FM in large containers filled with concreted RAW is shown. Selection of an active non-destructive assay technique depends on the container contents; in the case of a concrete or iron matrix with very-low-activity and low-activity RAW, the neutron radiation method appears preferable to the photonuclear one.

  6. Geospatial distribution modeling and determining suitability of groundwater quality for irrigation purpose using geospatial methods and water quality index (WQI) in Northern Ethiopia

    Science.gov (United States)

    Gidey, Amanuel

    2018-06-01

    Determining suitability and vulnerability of groundwater quality for irrigation use is a key alarm and first aid for careful management of groundwater resources to diminish the impacts on irrigation. This study was conducted to determine the overall suitability of groundwater quality for irrigation use and to generate their spatial distribution maps in Elala catchment, Northern Ethiopia. Thirty-nine groundwater samples were collected to analyze and map the water quality variables. Atomic absorption spectrophotometer, ultraviolet spectrophotometer, titration and calculation methods were used for laboratory groundwater quality analysis. Arc GIS, geospatial analysis tools, semivariogram model types and interpolation methods were used to generate geospatial distribution maps. Twelve and eight water quality variables were used to produce weighted overlay and irrigation water quality index models, respectively. Root-mean-square error, mean square error, absolute square error, mean error, root-mean-square standardized error, measured values versus predicted values were used for cross-validation. The overall weighted overlay model result showed that 146 km2 areas are highly suitable, 135 km2 moderately suitable and 60 km2 area unsuitable for irrigation use. The result of irrigation water quality index confirms 10.26% with no restriction, 23.08% with low restriction, 20.51% with moderate restriction, 15.38% with high restriction and 30.76% with the severe restriction for irrigation use. GIS and irrigation water quality index are better methods for irrigation water resources management to achieve a full yield irrigation production to improve food security and to sustain it for a long period, to avoid the possibility of increasing environmental problems for the future generation.
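
    An irrigation water quality index of the kind used above is typically a weighted sum of per-variable ratings. The variables, limits, and weights below are illustrative assumptions, not the values used in the Elala catchment study:

```python
# Minimal sketch of a weighted irrigation water quality index (WQI).
def rating(value, ideal, maximum):
    """Map a measured value to a 0-100 rating (100 = at/below ideal limit)."""
    if value <= ideal:
        return 100.0
    if value >= maximum:
        return 0.0
    return 100.0 * (maximum - value) / (maximum - ideal)

# (measured value, ideal limit, upper limit, weight) per variable;
# weights sum to 1. All numbers are hypothetical.
samples = {
    "EC_uS_cm": (900.0, 700.0, 3000.0, 0.30),
    "SAR":      (6.0,   3.0,   9.0,    0.40),
    "Cl_meq_L": (3.0,   4.0,   10.0,   0.30),
}

wqi = sum(w * rating(v, lo, hi) for v, lo, hi, w in samples.values())
print(round(wqi, 1))
```

    The resulting score can then be binned into the restriction classes reported above (no restriction, low, moderate, high, severe) and interpolated across sample points to produce the spatial distribution map.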

  7. Sizing the star cluster population of the Large Magellanic Cloud

    Science.gov (United States)

    Piatti, Andrés E.

    2018-04-01

    The number of star clusters that populate the Large Magellanic Cloud (LMC) at deprojected distances knowledge of the LMC cluster formation and dissolution histories, we closely revisited such a compilation of objects and found that only ˜35 per cent of the previously known catalogued clusters have been included. The remaining entries are likely related to stellar overdensities of the LMC composite star field, because there is a remarkable enhancement of objects with assigned ages older than log(t yr-1) ˜ 9.4, which contrasts with the existence of the LMC cluster age gap; the assumption of a cluster formation rate similar to that of the LMC star field does not help to reconcile so large a number of clusters either; and nearly 50 per cent of them come from cluster search procedures known to produce more than 90 per cent of false detections. The lack of further analyses confirming the identified overdensities as genuine star clusters also casts doubt on those results. We support that the actual size of the LMC main body cluster population is close to that previously known.

  8. A method for examining the geospatial distribution of CO2 storage resources applied to the Pre-Punta Gorda Composite and Dollar Bay reservoirs of the South Florida Basin, U.S.A

    Science.gov (United States)

    Roberts-Ashby, Tina; Ashby, Brandon N.

    2016-01-01

    This paper demonstrates geospatial modification of the USGS methodology for assessing geologic CO2 storage resources, and was applied to the Pre-Punta Gorda Composite and Dollar Bay reservoirs of the South Florida Basin. The study provides detailed evaluation of porous intervals within these reservoirs and utilizes GIS to evaluate the potential spatial distribution of reservoir parameters and volume of CO2 that can be stored. This study also shows that incorporating spatial variation of parameters using detailed and robust datasets may improve estimates of storage resources when compared to applying uniform values across the study area derived from small datasets, like many assessment methodologies. Geospatially derived estimates of storage resources presented here (Pre-Punta Gorda Composite = 105,570 MtCO2; Dollar Bay = 24,760 MtCO2) were greater than previous assessments, which was largely attributed to the fact that detailed evaluation of these reservoirs resulted in higher estimates of porosity and net-porous thickness, and areas of high porosity and thick net-porous intervals were incorporated into the model, likely increasing the calculated volume of storage space available for CO2 sequestration. The geospatial method for evaluating CO2 storage resources also provides the ability to identify areas that potentially contain higher volumes of storage resources, as well as areas that might be less favorable.
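
    The per-cell storage estimate behind such a geospatial assessment follows the standard volumetric form M = A · h · φ · ρ · E. The grids, CO2 density, and efficiency factor below are illustrative stand-ins, not values from the South Florida Basin assessment:

```python
import numpy as np

# Volumetric CO2 storage mass per grid cell: M = A * h * phi * rho * E.
cell_area_m2 = 1000.0 * 1000.0                 # 1 km x 1 km cells
net_porous_thickness_m = np.array([[30.0, 45.0],
                                   [60.0, 20.0]])
porosity = np.array([[0.15, 0.20],
                     [0.25, 0.10]])
rho_co2_kg_m3 = 700.0                          # CO2 density at reservoir conditions
efficiency = 0.10                              # storage efficiency factor

mass_kg = (cell_area_m2 * net_porous_thickness_m * porosity
           * rho_co2_kg_m3 * efficiency)
total_mt = mass_kg.sum() / 1e9                 # kilograms -> megatonnes
print(total_mt)
```

    Letting thickness and porosity vary cell by cell, rather than applying one uniform value basin-wide, is exactly the modification that produced the higher resource estimates reported above: cells with thick, highly porous intervals contribute disproportionately to the total.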

  9. An Application of Geospatial Information Systems (GIS) Technology to Anatomic Dental Charting

    OpenAIRE

    Bartling, William C.; Schleyer, Titus K.L.

    2003-01-01

    Historically, an anatomic dental chart is a compilation of color-coded symbols and numbers used within a template, either paper or computerized, to create a graphic record of a patient’s oral health status. This poster depicts how Geospatial Information System (GIS) technology can be used to create an accurate, current anatomic dental chart that contains detailed information not present in current charting systems.

  10. An exploration of counterfeit medicine surveillance strategies guided by geospatial analysis: lessons learned from counterfeit Avastin detection in the US drug supply chain.

    Science.gov (United States)

    Cuomo, Raphael E; Mackey, Tim K

    2014-12-02

    To explore healthcare policy and system improvements that would more proactively respond to future penetration of counterfeit cancer medications in the USA drug supply chain using geospatial analysis. A statistical and geospatial analysis of areas that received notices from the Food and Drug Administration (FDA) about the possibility of counterfeit Avastin penetrating the US drug supply chain. Data from FDA warning notices were compared to data from 44 demographic variables available from the US Census Bureau via correlation, means testing and geospatial visualisation. Results were interpreted in light of existing literature in order to recommend improvements to surveillance of counterfeit medicines. This study analysed 791 distinct healthcare provider addresses that received FDA warning notices across 30,431 zip codes in the USA. Statistical outputs were Pearson's correlation coefficients and t values. Geospatial outputs were cartographic visualisations. These data were used to generate the overarching study outcome, which was a recommendation for a strategy for drug safety surveillance congruent with existing literature on counterfeit medication. Zip codes with greater numbers of individuals age 65+ and greater numbers of ethnic white individuals were most correlated with receipt of a counterfeit Avastin notice. Geospatial visualisations designed in conjunction with statistical analysis of demographic variables appeared more capable of suggesting areas and populations that may be at risk for undetected counterfeit Avastin penetration. This study suggests that dual incorporation of statistical and geospatial analysis in surveillance of counterfeit medicine may be helpful in guiding efforts to prevent, detect and visualise counterfeit medicines penetrations in the US drug supply chain and other settings. Importantly, the information generated by these analyses could be utilised to identify at-risk populations associated with demographic characteristics
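
    The zip-code level screening described above reduces to correlating a binary "received an FDA notice" indicator with demographic variables. The data below are synthetic (the real study used 44 Census variables across 30,431 zip codes), constructed so that older areas are more likely to receive a notice:

```python
import numpy as np

rng = np.random.default_rng(0)
n_zips = 2000
pct_age_65_plus = rng.uniform(5, 35, n_zips)   # synthetic demographic variable

# Synthetic ground truth: notice receipt more likely in older zip codes.
p = 1 / (1 + np.exp(-(pct_age_65_plus - 20) / 5))
received_notice = (rng.uniform(size=n_zips) < 0.1 * p).astype(float)

# Pearson correlation between the demographic variable and notice receipt.
r = np.corrcoef(pct_age_65_plus, received_notice)[0, 1]
print(round(r, 3))
```

    In the study, coefficients like this were computed for each demographic variable and then read alongside cartographic visualisations to flag populations at risk of undetected counterfeit penetration.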

  11. Growth of large-size-two-dimensional crystalline pentacene grains for high performance organic thin film transistors

    Directory of Open Access Journals (Sweden)

    Chuan Du

    2012-06-01

    A new approach is presented for the growth of pentacene crystalline thin films with large grain size. Modification of dielectric surfaces using a monolayer of small molecules results in the formation of pentacene thin films with well-ordered, large crystalline domain structures. This suggests that pentacene molecules may have a significantly large diffusion constant on the modified surface. An average hole mobility of about 1.52 cm2/Vs for pentacene-based organic thin film transistors (OTFTs) is achieved with good reproducibility.

  12. Geospatial data infrastructure: The development of metadata for geo-information in China

    Science.gov (United States)

    Xu, Baiquan; Yan, Shiqiang; Wang, Qianju; Lian, Jian; Wu, Xiaoping; Ding, Keyong

    2014-03-01

    Stores of geoscience records are in constant flux. These stores are continually added to by new information, ideas and data, which are frequently revised. The geoscience record is restrained by human thought and technology for handling information. Conventional methods strive, with limited success, to maintain geoscience records which are readily susceptible and renewable. The information system must adapt to the diversity of ideas and data in geoscience and their changes through time. In China, more than 400,000 types of important geological data were collected and produced in geological work during the last two decades, including oil, natural gas and marine data, mine exploration, geophysical, geochemical, remote sensing and important local geological survey and research reports. Numerous geospatial databases have been formed and stored in the National Geological Archives (NGA) in available formats of MapGIS, ArcGIS, ArcINFO, Metalfile, Raster, SQL Server, Access and JPEG. But there is no effective way to warrant that the quality of information is adequate in theory and practice for decision making. The need of the Geographic Information System (GIS) communities for fast, reliable, accurate and up-to-date information is becoming insistent for all geoinformation producers and users in China. Since 2010, a series of geoinformation projects has been carried out under the leadership of the Ministry of Land and Resources (MLR), including (1) Integration, update and maintenance of geoinformation databases; (2) Standards research on clusterization and industrialization of information services; (3) Platform construction of geological data sharing; (4) Construction of key borehole databases; (5) Product development of information services. A "Nine-System" basic framework has been proposed for the development and improvement of the geospatial data infrastructure, which is focused on the construction of the cluster organization, cluster service, convergence

  13. Geospatial data infrastructure: The development of metadata for geo-information in China

    International Nuclear Information System (INIS)

    Xu, Baiquan; Yan, Shiqiang; Wang, Qianju; Lian, Jian; Wu, Xiaoping; Ding, Keyong

    2014-01-01

    Stores of geoscience records are in constant flux. These stores are continually added to by new information, ideas and data, which are frequently revised. The geoscience record is restrained by human thought and technology for handling information. Conventional methods strive, with limited success, to maintain geoscience records which are readily susceptible and renewable. The information system must adapt to the diversity of ideas and data in geoscience and their changes through time. In China, more than 400,000 types of important geological data were collected and produced in geological work during the last two decades, including oil, natural gas and marine data, mine exploration, geophysical, geochemical, remote sensing and important local geological survey and research reports. Numerous geospatial databases have been formed and stored in the National Geological Archives (NGA) in available formats of MapGIS, ArcGIS, ArcINFO, Metalfile, Raster, SQL Server, Access and JPEG. But there is no effective way to warrant that the quality of information is adequate in theory and practice for decision making. The need of the Geographic Information System (GIS) communities for fast, reliable, accurate and up-to-date information is becoming insistent for all geoinformation producers and users in China. Since 2010, a series of geoinformation projects has been carried out under the leadership of the Ministry of Land and Resources (MLR), including (1) Integration, update and maintenance of geoinformation databases; (2) Standards research on clusterization and industrialization of information services; (3) Platform construction of geological data sharing; (4) Construction of key borehole databases; (5) Product development of information services. A "Nine-System" basic framework has been proposed for the development and improvement of the geospatial data infrastructure, which is focused on the construction of the cluster organization, cluster

  14. Thinking Critically in Space: Toward a Mixed-Methods Geospatial Approach to Education Policy Analysis

    Science.gov (United States)

    Yoon, Ee-Seul; Lubienski, Christopher

    2018-01-01

    This paper suggests that synergies can be produced by using geospatial analyses as a bridge between traditional qualitative-quantitative distinctions in education research. While mapping tools have been effective for informing education policy studies, especially in terms of educational access and choice, they have also been underutilized and…

  15. Does company size matter? Validation of an integrative model of safety behavior across small and large construction companies.

    Science.gov (United States)

    Guo, Brian H W; Yiu, Tak Wing; González, Vicente A

    2018-02-01

    Previous safety climate studies primarily focused on either large construction companies or the construction industry as a whole, while little is known about whether company size has significant effects on workers' understanding of safety climate measures and relationships between safety climate factors and safety behavior. Thus, this study aims to: (a) test the measurement equivalence (ME) of a safety climate measure across workers from small and large companies; (b) investigate if company size alters the causal structure of the integrative model developed by Guo, Yiu, and González (2016). Data were collected from 253 construction workers in New Zealand using a safety climate measure. This study used multi-group confirmatory factor analyses (MCFA) to test the measurement equivalence of the safety climate measure and structure invariance of the integrative model. Results indicate that workers from small and large companies understood the safety climate measure in a similar manner. In addition, it was suggested that company size does not change the causal structure and mediational processes of the integrative model. Both measurement equivalence of the safety climate measure and structural invariance of the integrative model were supported by this study. Practical applications: Findings of this study provided strong support for a meaningful use of the safety climate measure across construction companies in different sizes. Safety behavior promotion strategies designed based on the integrative model may be well suited for both large and small companies. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  16. Geospatial exposure to point-of-sale tobacco: real-time craving and smoking-cessation outcomes.

    Science.gov (United States)

    Kirchner, Thomas R; Cantrell, Jennifer; Anesetti-Rothermel, Andrew; Ganz, Ollie; Vallone, Donna M; Abrams, David B

    2013-10-01

    Little is known about the factors that drive the association between point-of-sale marketing and behavior, because methods that directly link individual-level use outcomes to real-world point-of-sale exposure are only now beginning to be developed. Daily outcomes during smoking cessation were examined as a function of both real-time geospatial exposure to point-of-sale tobacco (POST) and subjective craving to smoke. Continuous individual geospatial location data collected over the first month of a smoking-cessation attempt in 2010-2012 (N=475) were overlaid on a POST outlet geodatabase (N=1060). Participants' mobility data were used to quantify the number of times they came into contact with a POST outlet. Participants recorded real-time craving levels and smoking status via ecological momentary assessment (EMA) on cellular telephones. The final data set spanned a total of 12,871 days of EMA and geospatial tracking. Lapsing was significantly more likely on days with any POST contact (OR=1.19, 95% CI=1.18, 1.20), and increasingly likely as the number of daily POST contacts increased (OR=1.07, 95% CI=1.06, 1.08). Overall, daily POST exposure was significantly associated with lapsing when craving was low (OR=1.22, 95% CI=1.20, 1.23); high levels of craving were more directly associated with lapse outcomes. These data shed light on the way mobility patterns drive a dynamic interaction between individuals and the POST environment, demonstrating that quantification of individuals' exposure to POST marketing can be used to identify previously unrecognized patterns of association among individual mobility, the built environment, and behavioral outcomes. © 2013 American Journal of Preventive Medicine.
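
    The core geospatial step above, counting how often a participant's mobility track comes within range of a POST outlet, can be sketched as follows. The coordinates and the 100 m contact radius are illustrative assumptions, not the study's parameters:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def daily_post_contacts(track, outlets, radius_m=100.0):
    """Count GPS fixes that fall within radius_m of any outlet."""
    return sum(
        any(haversine_m(lat, lon, olat, olon) <= radius_m for olat, olon in outlets)
        for lat, lon in track
    )

# One illustrative day of mobility fixes and two outlet locations.
outlets = [(40.7580, -73.9855), (40.7128, -74.0060)]
track = [(40.7581, -73.9854), (40.7300, -73.9900), (40.7127, -74.0061)]
print(daily_post_contacts(track, outlets))
```

    The daily contact counts produced this way are what the study's models relate, day by day, to the EMA-reported craving and lapse outcomes.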

  17. A Geospatial Database for Wind and Solar Energy Applications: The Kingdom of Bahrain Study Case

    Directory of Open Access Journals (Sweden)

    Al-Joburi Khalil

    2017-01-01

    This research is aimed at designing, implementing, and testing a geospatial database for wind and solar energy applications in the Kingdom of Bahrain. All decision making needed to determine economic feasibility and establish site locations for wind turbines or solar panels depends primarily on geospatial feature theme information and non-spatial (attribute) data for wind, solar, rainfall, temperature and weather characteristics of a particular region. Spatial data includes, but is not limited to, digital elevation, slopes, land use, zonings, parks, population density, road utility maps, and other related information. Digital elevations for over 450,000 spots at 50 m spatial horizontal resolution, plus field surveying and GPS (at selected locations), were obtained from the Surveying and Land Registration Bureau (SLRB). Road, utility, and population density data were obtained from the Central Information Organization (CIO). Land use zoning, recreational parks, and other data were obtained from the Ministry of Municipalities and Agricultural Affairs. Wind, solar, humidity, rainfall, and temperature data were obtained from the Ministry of Transportation, Civil Aviation Section. Landsat satellite and other images were obtained from NASA and online sources, respectively. The collected geospatial data was geo-referenced to Ain el-Abd UTM Zone 39 North. A 3D Digital Elevation Model (DEM) at 50 m spatial resolution was created using SLRB spot elevations. Slope and aspect maps were generated based on the DEM. Supervised image classification to identify open spaces was performed utilizing satellite images. Other geospatial data were converted to raster format with the same cell resolution. Non-spatial data were entered as attributes of spatial features. To eliminate ambiguous solutions, a multi-criteria GIS model was developed based on vector (discrete point, line, and polygon) representations as well as a raster model (continuous representation). The model was tested at the Al

  18. A Geospatial Database for Wind and Solar Energy Applications: The Kingdom of Bahrain Study Case

    Science.gov (United States)

    Al-Joburi, Khalil; Dahman, Nidal

    2017-11-01

    This research is aimed at designing, implementing, and testing a geospatial database for wind and solar energy applications in the Kingdom of Bahrain. All decision making needed to determine economic feasibility and establish site locations for wind turbines or solar panels depends primarily on geospatial feature theme information and non-spatial (attribute) data for wind, solar, rainfall, temperature and weather characteristics of a particular region. Spatial data includes, but is not limited to, digital elevation, slopes, land use, zonings, parks, population density, road utility maps, and other related information. Digital elevations for over 450,000 spots at 50 m spatial horizontal resolution, plus field surveying and GPS (at selected locations), were obtained from the Surveying and Land Registration Bureau (SLRB). Road, utility, and population density data were obtained from the Central Information Organization (CIO). Land use zoning, recreational parks, and other data were obtained from the Ministry of Municipalities and Agricultural Affairs. Wind, solar, humidity, rainfall, and temperature data were obtained from the Ministry of Transportation, Civil Aviation Section. Landsat satellite and other images were obtained from NASA and online sources, respectively. The collected geospatial data was geo-referenced to Ain el-Abd UTM Zone 39 North. A 3D Digital Elevation Model (DEM) at 50 m spatial resolution was created using SLRB spot elevations. Slope and aspect maps were generated based on the DEM. Supervised image classification to identify open spaces was performed utilizing satellite images. Other geospatial data were converted to raster format with the same cell resolution. Non-spatial data were entered as attributes of spatial features. To eliminate ambiguous solutions, a multi-criteria GIS model was developed based on vector (discrete point, line, and polygon) representations as well as a raster model (continuous representation). The model was tested at the Al-Areen proposed
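
    Deriving slope (and aspect) rasters from a gridded DEM, as described above, reduces to finite-difference gradients. The 4x4 elevation grid below is an illustrative stand-in for the Bahrain 50 m DEM, and the aspect convention (0° = north) is one common choice:

```python
import numpy as np

# Tiny synthetic DEM: elevation rises 2 m per row (south-facing ramp).
dem = np.array([
    [10.0, 10.0, 10.0, 10.0],
    [12.0, 12.0, 12.0, 12.0],
    [14.0, 14.0, 14.0, 14.0],
    [16.0, 16.0, 16.0, 16.0],
])
cell_size_m = 50.0

# np.gradient returns the derivative along rows (y) first, then columns (x),
# in elevation change per metre when given the cell spacing.
dz_dy, dz_dx = np.gradient(dem, cell_size_m)

slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
aspect_deg = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0

print(slope_deg.round(2))
```

    A uniform 2 m rise per 50 m cell gives a constant slope of about 2.29° everywhere, which is a quick sanity check before running the same computation on a real DEM grid.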

  19. MultiSpec: A Desktop and Online Geospatial Image Data Processing Tool

    Science.gov (United States)

    Biehl, L. L.; Hsu, W. K.; Maud, A. R. M.; Yeh, T. T.

    2017-12-01

    MultiSpec is an easy to learn and use, freeware image processing tool for interactively analyzing a broad spectrum of geospatial image data, with capabilities such as image display, unsupervised and supervised classification, feature extraction, feature enhancement, and several other functions. Originally developed for Macintosh and Windows desktop computers, it has a community of several thousand users worldwide, including researchers and educators, as a practical and robust solution for analyzing multispectral and hyperspectral remote sensing data in several different file formats. More recently MultiSpec was adapted to run in the HUBzero collaboration platform so that it can be used within a web browser, allowing new user communities to be engaged through science gateways. MultiSpec Online has also been extended to interoperate with other components (e.g., data management) in HUBzero through integration with the geospatial data building blocks (GABBs) project. This integration enables a user to directly launch MultiSpec Online from data that is stored and/or shared in a HUBzero gateway and to save output data from MultiSpec Online to hub storage, allowing data sharing and multi-step workflows without having to move data between different systems. MultiSpec has also been used in K-12 classes for which one example is the GLOBE program (www.globe.gov) and in outreach material such as that provided by the USGS (eros.usgs.gov/educational-activities). MultiSpec Online now provides teachers with another way to use MultiSpec without having to install the desktop tool. Recently MultiSpec Online was used in a geospatial data session with 30-35 middle school students at the Turned Onto Technology and Leadership (TOTAL) Camp in the summers of 2016 and 2017 at Purdue University. The students worked on a flood mapping exercise using Landsat 5 data to learn about land remote sensing using supervised classification techniques. Online documentation is available for Multi

  20. Emerging trends in geospatial artificial intelligence (geoAI): potential applications for environmental epidemiology.

    Science.gov (United States)

    VoPham, Trang; Hart, Jaime E; Laden, Francine; Chiang, Yao-Yi

    2018-04-17

    Geospatial artificial intelligence (geoAI) is an emerging scientific discipline that combines innovations in spatial science, artificial intelligence methods in machine learning (e.g., deep learning), data mining, and high-performance computing to extract knowledge from spatial big data. In environmental epidemiology, exposure modeling is a commonly used approach to conduct exposure assessment to determine the distribution of exposures in study populations. geoAI technologies provide important advantages for exposure modeling in environmental epidemiology, including the ability to incorporate large amounts of big spatial and temporal data in a variety of formats; computational efficiency; flexibility in algorithms and workflows to accommodate relevant characteristics of spatial (environmental) processes including spatial nonstationarity; and scalability to model other environmental exposures across different geographic areas. The objectives of this commentary are to provide an overview of key concepts surrounding the evolving and interdisciplinary field of geoAI including spatial data science, machine learning, deep learning, and data mining; recent geoAI applications in research; and potential future directions for geoAI in environmental epidemiology.

  1. Precise large deviations of aggregate claims in a size-dependent renewal risk model with stopping time claim-number process

    Directory of Open Access Journals (Sweden)

    Shuo Zhang

    2017-04-01

    In this paper, we consider a size-dependent renewal risk model with stopping time claim-number process. In this model, we do not make any assumption on the dependence structure of claim sizes and inter-arrival times. We study large deviations of the aggregate amount of claims. For the subexponential heavy-tailed case, we obtain a precise large-deviation formula; our method substantially relies on a martingale for the structure of our models.
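
    For orientation, the classical precise large-deviation asymptotic for heavy-tailed aggregate claims, which results of this kind extend to size-dependent models, takes the benchmark form below. This is the standard textbook statement, not the paper's exact theorem:

```latex
% S(t) = \sum_{i=1}^{N(t)} X_i : aggregate claims, with claim-size
% distribution F (finite mean \mu) and mean claim-number function
% \lambda(t) = \mathbb{E} N(t).
\[
  \Pr\!\bigl( S(t) - \mu \lambda(t) > x \bigr) \;\sim\; \lambda(t)\, \overline{F}(x),
  \qquad t \to \infty,
\]
% holding uniformly for x \ge \gamma \lambda(t) for any fixed \gamma > 0,
% when F lies in a suitable heavy-tailed class (e.g. consistently varying tails).
```

    Intuitively, a large deviation of the aggregate is most likely caused by a single very large claim, so the tail of the sum behaves like the expected number of claims times the tail of one claim.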

  2. Statistical characterization of a large geochemical database and effect of sample size

    Science.gov (United States)

    Zhang, C.; Manheim, F.T.; Hinde, J.; Grossman, J.N.

    2005-01-01

    smaller numbers of data points showed that few elements passed standard statistical tests for normality or log-normality until sample size decreased to a few hundred data points. Large sample size enhances the power of statistical tests, and leads to rejection of most statistical hypotheses for real data sets. For large sample sizes (e.g., n > 1000), graphical methods such as histogram, stem-and-leaf, and probability plots are recommended for rough judgement of probability distribution if needed. © 2005 Elsevier Ltd. All rights reserved.
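
    The effect described above, where large samples cause normality tests to reject almost everything, is easy to reproduce. The data below are synthetic (lognormal, a common shape for geochemical concentrations), not the actual database:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# 5000 synthetic "concentrations" with a lognormal shape.
concentrations = rng.lognormal(mean=1.0, sigma=0.8, size=5000)

# D'Agostino-Pearson normality test on raw and log-transformed values.
_, p_raw = stats.normaltest(concentrations)
_, p_log = stats.normaltest(np.log(concentrations))

print(p_raw, p_log)
```

    With n = 5000 the raw data are rejected at any conventional level, while the log-transformed values fit far better; shrinking the sample to a few dozen points would leave the test too underpowered to reject either, which is the pattern the paper reports.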

  3. Comparative analysis of non-destructive methods to control fissile materials in large-size containers

    Science.gov (United States)

    Batyaev, V. F.; Sklyarov, S. V.

    2017-09-01

    The analysis of various non-destructive methods to control fissile materials (FM) in large-size containers filled with radioactive waste (RAW) has been carried out. The difficulty of applying passive gamma-neutron monitoring of FM in large containers filled with concreted RAW is shown. Selection of an active non-destructive assay technique depends on the container contents; in the case of a concrete or iron matrix with very-low-activity and low-activity RAW, the neutron radiation method appears preferable to the photonuclear one. Note to the reader: the pdf file has been changed on September 22, 2017.

  4. VISA: AN AUTOMATIC AWARE AND VISUAL AIDS MECHANISM FOR IMPROVING THE CORRECT USE OF GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    J. H. Hong

    2016-06-01

    With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of “differences” implicitly hidden behind the acquired georesources. We argue the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be respectively used for addressing analyzed results, namely, virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion, where the standardized metadata serve as the known facts about the selected geospatial resources, algorithms for analyzing the differences in temporality and quality of the geospatial resources were designed, and the transformation of analyzed results into visual aids was automatically executed. It successfully presents a new way of bridging the communication gap between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it fails to provide a friendly and correct working platform.

  5. Local government GIS and geospatial capabilities : suitability for integrated transportation and land use planning (California SB 375).

    Science.gov (United States)

    2009-11-01

    This report examines two linked phenomena in transportation planning: the geospatial analysis capabilities of local planning agencies and the increasing demands on such capabilities imposed by comprehensive planning mandates. The particular examples ...

  6. New directions in valuing geospatial information - how to value geospatial information for policy and business decisions in the future

    Science.gov (United States)

    Smart, A. C.

    2014-12-01

    Governments are increasingly asking for more evidence of the benefits of investing in geospatial data and infrastructure before investing. They are looking for a clearer articulation of the economic, environmental and social benefits than has been possible in the past. Development of techniques has accelerated in the past five years as governments and industry become more involved in the capture and use of geospatial data. However, evaluation practitioners have struggled to answer these emerging questions. The paper explores the types of questions that decision makers are asking and discusses the different approaches and methods that have been used recently to answer them. It explores the need for better business case models. The emerging approaches are then discussed and their attributes reviewed. These include methods of analysing tangible economic benefits, intangible benefits and societal benefits. The paper explores the use of value chain analysis and real options analysis to better articulate the impacts on international competitiveness and how to value the potential benefits of innovations enabled by the geospatial data that is produced. The paper concludes by illustrating the potential for these techniques in current and future decision making.

  7. Challenges and opportunities : One stop processing of automatic large-scale base map production using airborne lidar data within gis environment case study: Makassar City, Indonesia

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme to accelerate large-scale topographic base map provision by the Geospatial Information

  8. A NEW FRAMEWORK FOR GEOSPATIAL SITE SELECTION USING ARTIFICIAL NEURAL NETWORKS AS DECISION RULES: A CASE STUDY ON LANDFILL SITES

    Directory of Open Access Journals (Sweden)

    S. K. M. Abujayyab

    2015-10-01

    Full Text Available This paper briefly introduces the theory and framework of geospatial site selection (GSS) and discusses the application and framework of artificial neural networks (ANNs). The literature on the use of ANNs as decision rules in GSS from 2000 to 2015 is scarce. As this study found, ANNs are not only adaptable to dynamic changes but also capable of improving the objectivity of acquisition in GSS, reducing time consumption, and providing high validation. ANNs make a powerful tool for solving geospatial decision-making problems by enabling geospatial decision makers to implement their constraints and imprecise concepts; the tool offers a way to represent and handle uncertainty. Specifically, ANNs are decision rules implemented to enhance conventional GSS frameworks. The main assumption in implementing ANNs in GSS is that the current characteristics of existing sites are indicative of the degree of suitability of new locations with similar characteristics. GSS requires several input criteria that embody specific requirements and the desired site characteristics. In this study, the proposed framework consists of four stages for implementing ANNs in GSS. A multilayer feed-forward network with a backpropagation algorithm was used to train the networks on prior sites to assess, generalize, and evaluate the outputs on the basis of the inputs for the new sites. Two metrics, namely, the confusion matrix and receiver operating characteristic tests, were utilized to achieve high accuracy and validation. Results proved that ANNs provide reasonable and efficient results as an accurate and inexpensive quantitative technique for GSS.
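The record above describes a multilayer feed-forward network trained with backpropagation and evaluated with a confusion matrix. As a minimal illustration of that pipeline (not the paper's actual model or data), the sketch below trains a one-hidden-layer network on synthetic, normalized site criteria and tabulates the confusion matrix; all sample values are invented for the example.

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class TinyMLP:
    """One-hidden-layer feed-forward network trained with backpropagation."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.w2 = [rng.uniform(-1, 1) for _ in range(n_hidden)]
        self.b2 = 0.0

    def forward(self, x):
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(self.w1, self.b1)]
        y = sigmoid(sum(w * hi for w, hi in zip(self.w2, h)) + self.b2)
        return h, y

    def train(self, data, epochs=5000, lr=0.5):
        for _ in range(epochs):
            for x, t in data:
                h, y = self.forward(x)
                dy = (y - t) * y * (1 - y)  # squared-error gradient at output
                for j, hj in enumerate(h):
                    dh = dy * self.w2[j] * hj * (1 - hj)  # backpropagated error
                    self.w2[j] -= lr * dy * hj
                    for i, xi in enumerate(x):
                        self.w1[j][i] -= lr * dh * xi
                    self.b1[j] -= lr * dh
                self.b2 -= lr * dy

# Toy "site" samples: two normalized criteria (e.g. slope, road access);
# label 1 = suitable. Entirely synthetic.
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.7, 0.9), 1), ((0.9, 0.6), 1),
        ((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.3, 0.2), 0), ((0.1, 0.4), 0)]
net = TinyMLP(n_in=2, n_hidden=4)
net.train(data)
preds = [1 if net.forward(x)[1] > 0.5 else 0 for x, _ in data]
cm = [[0, 0], [0, 0]]  # confusion matrix: [[TN, FP], [FN, TP]]
for (x, t), p in zip(data, preds):
    cm[t][p] += 1
print(cm)
```

On this trivially separable toy set, the network classifies all training sites correctly and the confusion matrix is diagonal.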

  9. Metadata Wizard: an easy-to-use tool for creating FGDC-CSDGM metadata for geospatial datasets in ESRI ArcGIS Desktop

    Science.gov (United States)

    Ignizio, Drew A.; O'Donnell, Michael S.; Talbert, Colin B.

    2014-01-01

    Creating compliant metadata for scientific data products is mandated for all federal Geographic Information Systems professionals and is a best practice for members of the geospatial data community. However, the complexity of the Federal Geographic Data Committee’s Content Standard for Digital Geospatial Metadata, the limited availability of easy-to-use tools, and recent changes in the ESRI software environment continue to make metadata creation a challenge. Staff at the U.S. Geological Survey Fort Collins Science Center have developed a Python toolbox for ESRI ArcDesktop to facilitate a semi-automated workflow for creating and updating metadata records in ESRI’s 10.x software. The U.S. Geological Survey Metadata Wizard tool automatically populates several metadata elements: the spatial reference, spatial extent, geospatial presentation format, vector feature count or raster column/row count, native system/processing environment, and the metadata creation date. Once the software auto-populates these elements, users can easily add attribute definitions and other relevant information in a simple graphical user interface. The tool, which offers a simple design free of esoteric metadata language, has the potential to save many government and non-government organizations a significant amount of time and cost by facilitating the development of metadata compliant with the Federal Geographic Data Committee’s Content Standard for Digital Geospatial Metadata for ESRI software users. A working version of the tool is now available for ESRI ArcDesktop, versions 10.0, 10.1, and 10.2 (downloadable at http://www.sciencebase.gov/metadatawizard).
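The abstract describes auto-populating metadata elements such as the spatial extent and metadata creation date. A minimal sketch of that idea, assuming nothing about the Metadata Wizard's internals: it builds a skeleton FGDC-CSDGM record with Python's standard library, using the standard CSDGM element names (westbc/eastbc/northbc/southbc under idinfo/spdom/bounding, and metd under metainfo).

```python
import datetime
import xml.etree.ElementTree as ET

def start_fgdc_record(west, east, north, south):
    """Auto-populate a skeleton FGDC-CSDGM record with the spatial extent
    and metadata date, mirroring what a tool can derive from the dataset
    itself before a user fills in the descriptive elements."""
    root = ET.Element("metadata")
    idinfo = ET.SubElement(root, "idinfo")
    spdom = ET.SubElement(idinfo, "spdom")
    bounding = ET.SubElement(spdom, "bounding")
    for tag, val in (("westbc", west), ("eastbc", east),
                     ("northbc", north), ("southbc", south)):
        ET.SubElement(bounding, tag).text = f"{val:.6f}"
    metainfo = ET.SubElement(root, "metainfo")
    # metd: metadata creation date, CSDGM YYYYMMDD convention
    ET.SubElement(metainfo, "metd").text = datetime.date.today().strftime("%Y%m%d")
    return root

# Hypothetical extent (roughly Colorado); a real tool would read this
# from the dataset's spatial reference and envelope.
record = start_fgdc_record(west=-109.05, east=-102.04, north=41.0, south=37.0)
print(ET.tostring(record, encoding="unicode"))
```

A full CSDGM record has many more mandatory elements; the point here is only the auto-population step the abstract describes.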

  10. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil and Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface, the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) according to the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available: AGWA 1.5 for

  11. Flexible Multi-Bit Feedback Design for HARQ Operation of Large-Size Data Packets in 5G

    DEFF Research Database (Denmark)

    Khosravirad, Saeed; Mudolo, Luke; Pedersen, Klaus I.

    2017-01-01

    A reliable feedback channel is vital to report decoding acknowledgments in retransmission mechanisms such as the hybrid automatic repeat request (HARQ). While the feedback bits are known to be costly for the wireless link, a feedback message more informative than the conventional single-bit feedback can increase resource utilization efficiency. Considering the practical limitations on increasing feedback message size, this paper proposes a framework for the design of flexible-content multi-bit feedback. The proposed design is capable of efficiently indicating the faulty segments of a failed large-size data packet, thanks to which the transmitter node can reduce the retransmission size to only include the initially failed segments of the packet. We study the effect of feedback size on retransmission efficiency through extensive link-level simulations over realistic channel models. Numerical...
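The abstract's core idea, multi-bit feedback indicating which segments of a failed packet need retransmission, can be sketched as a simple bitmap with one bit per segment. This illustrates the concept only, not the paper's proposed flexible-content design:

```python
def encode_feedback(crc_ok_per_segment):
    """Receiver side: pack per-segment decode results into a multi-bit
    feedback word (bit i set = segment i decoded correctly)."""
    word = 0
    for i, ok in enumerate(crc_ok_per_segment):
        if ok:
            word |= 1 << i
    return word

def segments_to_retransmit(feedback_word, n_segments):
    """Transmitter side: decode the feedback word into the list of
    segment indices that must be sent again."""
    return [i for i in range(n_segments) if not (feedback_word >> i) & 1]

# 8-segment packet; segments 2 and 5 failed their CRC at the receiver.
ok = [True, True, False, True, True, False, True, True]
fb = encode_feedback(ok)
print(segments_to_retransmit(fb, 8))  # → [2, 5]
```

With conventional single-bit feedback, one failed segment would force retransmission of all 8 segments; here only the 2 failed ones are resent, at the cost of an 8-bit feedback word.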

  12. Eco-friendly preparation of large-sized graphene via short-circuit discharge of lithium primary battery.

    Science.gov (United States)

    Kang, Shaohong; Yu, Tao; Liu, Tingting; Guan, Shiyou

    2018-02-15

    We propose, for the first time, a method of preparing large-sized graphene by short-circuit discharge of a lithium-graphite primary battery. LiCx is obtained through intercalation of lithium ions into the graphite cathode of this primary battery. Graphene is then acquired by chemical reaction between LiCx and stripper agents, with dispersion under sonication. The obtained graphene was characterized by Raman spectroscopy, X-ray diffraction (XRD), transmission electron microscopy (TEM), X-ray photoelectron spectroscopy (XPS), atomic force microscopy (AFM) and scanning electron microscopy (SEM). The results indicate that the as-prepared graphene has a large size and few defects, and is monolayer or fewer than three layers. The quality of the graphene is significantly improved compared to previously reported electrochemical methods. The yield of graphene reaches 8.76% when the ratio of H2O to NMP is 3:7. This method provides a potential solution for the recycling of waste lithium-ion batteries. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Comprehensive geo-spatial data creation for Ar-Riyadh region in the KSA

    Science.gov (United States)

    Alrajhi, M.; Hawarey, M.

    2009-04-01

    The General Directorate for Surveying and Mapping (GDSM) of the Deputy Ministry for Land and Surveying (DMLS) of the Ministry of Municipal and Rural Affairs (MOMRA) in the Kingdom of Saudi Arabia (KSA) has the exclusive mandate to carry out aerial photography and produce large-scale detailed maps for about 220 cities and villages in the KSA. This presentation is about the comprehensive geo-spatial data creation for the Ar-Riyadh region, Central KSA. The work was founded on country-wide horizontal geodetic ground control using Global Navigation Satellite Systems (GNSS) within MOMRA's Terrestrial Reference Frame 2000 (MTRF2000), which is tied to the International Terrestrial Reference Frame 2000 (ITRF2000) Epoch 2004.0, and on vertical geodetic ground control using precise digital leveling referenced to the Jeddah 1969 mean sea level. It included aerial photography of 3,000 km2 at 1:5,500 scale and 10,000 km2 at 1:45,000 scale, full aerial triangulation, and production of orthophoto maps at a scale of 1:10,000 (480 sheets) for 10,000 km2, with the aerial photography lasting from July 2007 through August 2007.

  14. Assessing Vulnerability to Heat: A Geospatial Analysis for the City of Philadelphia

    Directory of Open Access Journals (Sweden)

    Laura Barron

    2018-04-01

    Full Text Available The urban heat island (UHI) effect is an increasingly prominent health and environmental hazard that is linked to urbanization and climate change. Greening reduces the negative impacts of UHI; trees specifically are the most effective in reducing ambient temperature. This paper investigates vulnerability to heat in Philadelphia, Pennsylvania and identifies where street trees can be planted as a public intervention. We used geographic information system (GIS) software to map a validated Heat Vulnerability Index and identify vulnerability at the block level. Using a high-low geospatial cluster analysis, we assessed where the City of Philadelphia can most effectively plant street trees to address UHI. This information was then aggregated to the neighborhood level for more effective citizen communication and policymaking. We identified that 26 of 48 (54%) neighborhoods that were vulnerable to heat also lacked street trees. Of 158 Philadelphia neighborhoods, 63 (40%) contained block groups of high vulnerability in either heat or street-tree infrastructure. The neighborhoods ranked highest in both classifications were two adjacent neighborhoods in West Philadelphia. Planting street trees is a public intervention through which a city can potentially reduce the negative health impacts of UHI, and GIS can be used to identify and recommend street-tree plantings to reduce urban heat.
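A high-low cluster analysis flags blocks whose neighborhoods score well above or below the citywide average. The sketch below is a simplified stand-in for the Getis-Ord Gi*-style statistic typically used in GIS hot-spot analysis; the index values, adjacency list, and z-score threshold are all invented for illustration.

```python
import statistics

def local_hotspots(values, neighbors, z_thresh=0.9):
    """Flag 'high' clusters: blocks whose neighborhood mean (block plus its
    neighbors) sits well above the global mean, measured in global standard
    deviations. A simplified stand-in for the Getis-Ord Gi* statistic."""
    mu = statistics.mean(values)
    sigma = statistics.pstdev(values)
    flags = []
    for i, nbrs in enumerate(neighbors):
        local = [values[i]] + [values[j] for j in nbrs]
        z = (statistics.mean(local) - mu) / sigma
        flags.append("high" if z > z_thresh else "low" if z < -z_thresh else "ns")
    return flags

# Heat-vulnerability index for 6 hypothetical block groups along a line,
# each adjacent to its immediate neighbors.
hvi = [0.9, 0.95, 0.85, 0.2, 0.15, 0.1]
adj = [[1], [0, 2], [1, 3], [2, 4], [3, 5], [4]]
print(local_hotspots(hvi, adj))  # → ['high', 'high', 'ns', 'ns', 'low', 'low']
```

The "ns" (not significant) blocks sit on the boundary between the hot and cold clusters, where the neighborhood mean is pulled toward the global average.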

  15. ANALYSIS OF RADAR AND OPTICAL SPACE BORNE DATA FOR LARGE SCALE TOPOGRAPHICAL MAPPING

    Directory of Open Access Journals (Sweden)

    W. Tampubolon

    2015-03-01

    Full Text Available Normally, in order to provide high resolution 3-dimensional (3D) geospatial data, large scale topographical mapping needs input from conventional airborne campaigns, which in Indonesia are bureaucratically complicated, especially during legal administration procedures, i.e. security clearance from the military/defense ministry. This often causes additional time delays besides technical constraints such as weather and limited aircraft availability. Geospatial data quality is, of course, an important issue for many applications. The increasing demand for geospatial data nowadays consequently requires high resolution datasets as well as a sufficient level of accuracy. Therefore an integration of different technologies is required in many cases to achieve the expected result, especially in the context of disaster preparedness and emergency response. Another important issue in this context is the fast delivery of relevant data, expressed by the term “Rapid Mapping”. In this paper we present first results of on-going research to integrate different data sources such as space borne radar and optical platforms. Initially, the orthorectification of Very High Resolution Satellite (VHRS) imagery, i.e. SPOT-6, has been done as a continuous process to the DEM generation using TerraSAR-X/TanDEM-X data. The role of Ground Control Points (GCPs) from GNSS surveys is mandatory in order to fulfil geometrical accuracy. In addition, this research aims at providing a suitable processing algorithm for space borne data for large scale topographical mapping, as described in section 3.2. Recently, radar space borne data has been used for medium scale topographical mapping, e.g. for the 1:50.000 map scale in Indonesian territories. The goal of this on-going research is to increase the accuracy of remote sensing data by different activities, e.g. the integration of different data sources (optical and radar or the usage of the GCPs in both, the optical and the

  16. Creating of Central Geospatial Database of the Slovak Republic and Procedures of its Revision

    Science.gov (United States)

    Miškolci, M.; Šafář, V.; Šrámková, R.

    2016-06-01

    The article describes the creation of an initial three-dimensional geodatabase, from planning and design through the determination of technological and manufacturing processes to practical use of the Central Geospatial Database (CGD; the official name in Slovak is Centrálna Priestorová Databáza, CPD), and briefly describes the procedures for its revision. CGD ensures proper collection, processing, storing, transfer and display of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis-management tasks and by the Integrated Rescue System. For military personnel, CGD runs on the MoD intranet; for other users outside the MoD it is transformed into ZbGIS (the Primary Geodatabase of the Slovak Republic) and runs on a public web site. CGD is a global set of geospatial information: a vector computer model that completely covers the entire territory of Slovakia. The seamless CGD is created by digitizing the real world using photogrammetric stereoscopic methods and measurements of object properties. The basic vector model of CGD (from photogrammetric processing) is then taken into the field for inspection and additional collection of object properties across the whole mapping area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD thus makes it possible to know the territory comprehensively in all three spatial dimensions. Every entity in CGD records its time of collection, which allows users to assess the timeliness of the information. CGD can be utilized for geographical analysis, geo-referencing and cartographic purposes as well as various special-purpose mapping, and it has the ambition to cover not only the needs of the MoD but to become a reference model for the national geographical infrastructure.

  17. Recent Advances in Geospatial Visualization with the New Google Earth

    Science.gov (United States)

    Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.

    2017-12-01

    Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
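A multi-resolution map tile pyramid addresses tiles by zoom level and (x, y) index in the Web Mercator scheme. The standard coordinate-to-tile conversion (generic to tile pyramids, not specific to Google Earth's KML extension) can be sketched as:

```python
import math

def deg2tile(lat_deg, lon_deg, zoom):
    """Map a WGS84 coordinate to Web Mercator tile indices (x, y) at a
    given zoom level -- the addressing scheme used by map tile pyramids."""
    lat = math.radians(lat_deg)
    n = 2 ** zoom  # tiles per axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

# Each zoom level quadruples the tile count; a viewer streams only the
# tiles covering the current view at the current level of detail.
print(deg2tile(37.7749, -122.4194, 12))  # San Francisco → (655, 1583)
```

Tile production pipelines run this mapping in reverse, cutting source rasters into the z/x/y pyramid that clients then fetch on demand.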

  18. High voltage distribution scheme for large size GEM detector

    International Nuclear Information System (INIS)

    Saini, J.; Kumar, A.; Dubey, A.K.; Negi, V.S.; Chattopadhyay, S.

    2016-01-01

    Gas Electron Multiplier (GEM) detectors will be used for Muon tracking in the Compressed Baryonic Matter (CBM) experiment at the Facility for Anti-proton Ion Research (FAIR) at Darmstadt, Germany. The detector modules in the Muon chambers are of the order of 1 metre x 0.5 metre in size. For construction of these chambers, three GEM foils are used per chamber. These foils are made from a two-layered, 50 μm thin kapton foil, and each GEM foil has millions of holes in it. In such large-scale manufacturing of the foils, even after stringent quality control, some holes may still have defects, or defects might develop over time under operating conditions. These defects may result in a short-circuit of the entire GEM foil; a short in even a single hole makes the entire foil unusable. To reduce such occurrences, high voltage (HV) segmentation within the foils has been introduced. These segments are powered either by an individual HV supply per segment or through an active HV distribution that manages the large number of segments across the foil. Individual supplies, apart from being costly, are highly complex to implement. Additionally, CBM will have a high intensity of particles bombarding the detector, causing the resistive-chain current feeding the GEM detector to vary with the particle intensity. This leads to voltage fluctuations across the foil, resulting in gain variation with particle intensity. Hence, a low-cost active HV distribution was designed to address the above issues.

  19. Multi-class geospatial object detection based on a position-sensitive balancing framework for high spatial resolution remote sensing imagery

    Science.gov (United States)

    Zhong, Yanfei; Han, Xiaobing; Zhang, Liangpei

    2018-04-01

    Multi-class geospatial object detection from high spatial resolution (HSR) remote sensing imagery is attracting increasing attention in a wide range of object-related civil and engineering applications. However, the distribution of objects in HSR remote sensing imagery is location-variable and complicated, and accurately detecting these objects is a critical problem. Due to the powerful feature extraction and representation capability of deep learning, integrated frameworks combining deep-learning-based region proposal generation and object detection have greatly improved the performance of multi-class geospatial object detection for HSR remote sensing imagery. However, because of the translation invariance introduced by the convolution operations in a convolutional neural network (CNN), although the performance of the classification stage is seldom affected, the localization accuracy of the predicted bounding boxes in the detection stage is easily degraded. This dilemma between translation invariance in the classification stage and translation variance in the object detection stage has not been addressed for HSR remote sensing imagery, and causes positional accuracy problems for multi-class geospatial object detection with region proposal generation and object detection. To further improve the performance of such integrated frameworks, a position-sensitive balancing (PSB) framework is proposed in this paper for multi-class geospatial object detection from HSR remote sensing imagery. The proposed PSB framework takes full advantage of a fully convolutional network (FCN), on the basis of a residual network, to resolve the dilemma between translation invariance in the classification stage and translation variance in the object detection stage. In addition, a pre-training mechanism is utilized to accelerate the training procedure.

  20. Multiscale virtual particle based elastic network model (MVP-ENM) for normal mode analysis of large-sized biomolecules.

    Science.gov (United States)

    Xia, Kelin

    2017-12-20

    In this paper, a multiscale virtual particle based elastic network model (MVP-ENM) is proposed for the normal mode analysis of large-sized biomolecules. The multiscale virtual particle (MVP) model is proposed for the discretization of biomolecular density data. With this model, large-sized biomolecular structures can be coarse-grained into virtual particles such that a balance between model accuracy and computational cost can be achieved. An elastic network is constructed by assuming "connections" between virtual particles. The connection is described by a special harmonic potential function, which considers the influence from both the mass distributions and distance relations of the virtual particles. Two independent models, i.e., the multiscale virtual particle based Gaussian network model (MVP-GNM) and the multiscale virtual particle based anisotropic network model (MVP-ANM), are proposed. It has been found that in the Debye-Waller factor (B-factor) prediction, the results from our MVP-GNM with a high resolution are as good as the ones from GNM. Even with low resolutions, our MVP-GNM can still capture the global behavior of the B-factor very well with mismatches predominantly from the regions with large B-factor values. Further, it has been demonstrated that the low-frequency eigenmodes from our MVP-ANM are highly consistent with the ones from ANM even with very low resolutions and a coarse grid. Finally, the great advantage of MVP-ANM model for large-sized biomolecules has been demonstrated by using two poliovirus virus structures. The paper ends with a conclusion.
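An elastic network's "connections" can be illustrated with the standard Gaussian-network-model Kirchhoff (connectivity) matrix, built from a distance cutoff. This sketch shows only the uniform-spring base case; the MVP models in the record additionally weight the springs by the mass distributions and distances of the virtual particles.

```python
import math

def kirchhoff_matrix(coords, cutoff=7.0):
    """Standard GNM connectivity (Kirchhoff) matrix: Gamma[i][j] = -1 if
    nodes i and j lie within the cutoff distance, and each diagonal entry
    holds that node's contact count, so every row sums to zero."""
    n = len(coords)
    g = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(coords[i], coords[j]) <= cutoff:
                g[i][j] = g[j][i] = -1.0
                g[i][i] += 1.0
                g[j][j] += 1.0
    return g

# Four coarse-grained nodes on a line, 5 A apart; with a 7 A cutoff only
# sequential neighbors are connected, giving a chain topology.
nodes = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (10.0, 0.0, 0.0), (15.0, 0.0, 0.0)]
gamma = kirchhoff_matrix(nodes)
print(gamma)
```

In GNM, predicted B-factors are proportional to the diagonal of the pseudoinverse of this matrix, and the low-frequency normal modes are its smallest nonzero eigenvectors; coarse-graining (as in MVP) shrinks the matrix so those computations stay tractable for large biomolecules.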

  1. A simple, compact, and rigid piezoelectric step motor with large step size

    Science.gov (United States)

    Wang, Qi; Lu, Qingyou

    2009-08-01

    We present a novel piezoelectric stepper motor featuring high compactness, rigidity, simplicity, and operability in any direction. Although tested at room temperature, it is believed to work at low temperatures, owing to its loose operating conditions and large step size. The motor is implemented with a piezoelectric scanner tube that is axially cut into almost two halves and clamp-holds a hollow shaft inside at both ends via the spring parts of the shaft. Two driving voltages that singly deform the two halves of the piezotube in one direction and recover simultaneously will move the shaft in the opposite direction, and vice versa.

  2. Integrated Sustainable Planning for Industrial Region Using Geospatial Technology

    Science.gov (United States)

    Tiwari, Manish K.; Saxena, Aruna; Katare, Vivek

    2012-07-01

    Geospatial techniques and their scope of applications have undergone an order-of-magnitude change since their advent, and they are now universally accepted as a modern and important tool for mapping and monitoring various natural resources as well as amenities and infrastructure. The huge and voluminous spatial database generated from various remote sensing platforms needs proper management, such as storage, retrieval, manipulation and analysis, to extract desired information, which is beyond the capability of the human brain. This is where computer-aided GIS technology came into existence. A GIS with major input from remote sensing satellites for natural resource management applications must be able to handle spatiotemporal data, supporting spatiotemporal queries and other spatial operations. Software and computer-based tools are designed to make things easier for the user and to improve the efficiency and quality of information processing tasks. The natural resources are a common heritage, which we have shared with past generations, and our future generations will inherit these resources from us. Our greed for resources and our tremendous technological capacity to exploit them at a much larger scale have created a situation where we have started withdrawing from future stocks. The Bhopal capital region has attracted the attention of planners since the beginning of the five-year-plan strategy for industrial development. A number of projects were carried out in the individual districts (Bhopal, Rajgarh, Shajapur, Raisen, Sehore), which gave fruitful results, but no serious effort has been made to involve the entire region, nor to use the latest geospatial techniques (remote sensing, GIS, GPS) to prepare a well-structured computerized database, without which it is very difficult to retrieve, analyze and compare the data for monitoring as well as for planning developmental activities in the future.

  3. Contribution of large-sized primary sensory neuronal sensitization to mechanical allodynia by upregulation of hyperpolarization-activated cyclic nucleotide gated channels via cyclooxygenase 1 cascade.

    Science.gov (United States)

    Sun, Wei; Yang, Fei; Wang, Yan; Fu, Han; Yang, Yan; Li, Chun-Li; Wang, Xiao-Liang; Lin, Qing; Chen, Jun

    2017-02-01

    Under physiological conditions, small- and medium-sized dorsal root ganglion (DRG) neurons are believed to mediate nociceptive behavioral responses to painful stimuli. However, it has recently been found that a number of large-sized neurons are also involved in nociceptive transmission under neuropathic conditions. Nonetheless, the mechanisms by which large-sized DRG neurons mediate nociception are poorly understood. In the present study, the role of large-sized neurons in bee venom (BV)-induced mechanical allodynia and the underlying mechanisms were investigated. Behaviorally, it was found that mechanical allodynia was still evoked by BV injection in rats in which the transient receptor potential vanilloid 1-positive DRG neurons had been chemically deleted. Electrophysiologically, in vitro patch clamp recordings of large-sized neurons showed hyperexcitability in these neurons. Interestingly, the firing pattern of these neurons changed from phasic to tonic under the BV-inflamed state. It has been suggested that hyperpolarization-activated cyclic nucleotide gated channels (HCNs) expressed in large-sized DRG neurons contribute importantly to repetitive firing. We therefore examined the roles of HCNs in BV-induced mechanical allodynia. Consistent with the overexpression of HCN1/2 detected by immunofluorescence, the HCN-mediated hyperpolarization-activated cation current (Ih) was significantly increased in the BV-treated samples. Pharmacological experiments demonstrated that the hyperexcitability and upregulation of Ih in large-sized neurons were mediated by the cyclooxygenase-1 (COX-1)-prostaglandin E2 pathway, as evidenced by the fact that a COX-1 inhibitor significantly attenuated the BV-induced mechanical allodynia. These results suggest that BV can excite large-sized DRG neurons at least in part by increasing Ih through activation of COX-1. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Dynamic Science Data Services for Display, Analysis and Interaction in Widely-Accessible, Web-Based Geospatial Platforms, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — TerraMetrics, Inc., proposes a Phase II R/R&D program to implement the TerraBlocksTM Server architecture that provides geospatial data authoring, storage and...

  5. CLUSTER DYNAMICS LARGELY SHAPES PROTOPLANETARY DISK SIZES

    Energy Technology Data Exchange (ETDEWEB)

    Vincke, Kirsten; Pfalzner, Susanne, E-mail: kvincke@mpifr-bonn.mpg.de [Max Planck Institute for Radio Astronomy, Auf dem Hügel 69, D-53121 Bonn (Germany)

    2016-09-01

    To what degree the cluster environment influences the sizes of protoplanetary disks surrounding young stars is still an open question. This is particularly true for the short-lived clusters typical of the solar neighborhood, in which the stellar density and therefore the influence of the cluster environment change considerably over the first 10 Myr. In previous studies, the effect of the gas on the cluster dynamics has often been neglected; this is remedied here. Using the code NBody6++, we study the stellar dynamics in the different developmental phases (embedded, expulsion, and expansion), including the gas, and quantify the effect of fly-bys on disk size. We concentrate on massive clusters (M_cl = 10^3 to 6 × 10^4 M_Sun), which are representative of clusters like the Orion Nebula Cluster (ONC) or NGC 6611. We find that not only the stellar density but also the duration of the embedded phase matters. The densest clusters react fastest to the gas expulsion and drop quickly in density; here 98% of relevant encounters happen before gas expulsion. By contrast, disks in sparser clusters are initially less affected, but because these clusters expand more slowly, 13% of disks are truncated after gas expulsion. For ONC-like clusters, we find that disks larger than 500 au are usually affected by the environment, which corresponds to the observation that 200 au-sized disks are common. For NGC 6611-like clusters, disk sizes are cut down on average to roughly 100 au. A testable hypothesis would be that the disks in the center of NGC 6611 should be on average ≈20 au and therefore considerably smaller than those in the ONC.

  6. Seeing through the clouds: Processes and challenges for sharing geospatial data for disaster management in Haiti

    DEFF Research Database (Denmark)

    Clark, Nathan Edward; Guiffault, Flore

    2018-01-01

    This article examines the ways in which the production and sharing of geospatial data for disaster management purposes have evolved in Haiti, within the context of the 2010 earthquake and 2016 Hurricane Matthew. The conditions for these developments are traced through the institutional and operat...

  7. Tsunami vertical-evacuation planning in the U.S. Pacific Northwest as a geospatial, multi-criteria decision problem

    Science.gov (United States)

    Wood, Nathan; Jones, Jeanne; Schelling, John; Schmidtlein, Mathew

    2014-01-01

    Tsunami vertical-evacuation (TVE) refuges can be effective risk-reduction options for coastal communities with local tsunami threats but no accessible high ground for evacuations. Deciding where to locate TVE refuges is a complex risk-management question, given the potential for conflicting stakeholder priorities and multiple, suitable sites. We use the coastal community of Ocean Shores (Washington, USA) and the local tsunami threat posed by Cascadia subduction zone earthquakes as a case study to explore the use of geospatial, multi-criteria decision analysis for framing the locational problem of TVE siting. We demonstrate a mixed-methods approach that uses potential TVE sites identified at community workshops, geospatial analysis to model changes in pedestrian evacuation times for TVE options, and statistical analysis to develop metrics for comparing population tradeoffs and to examine influences in decision making. Results demonstrate that no one TVE site can save all at-risk individuals in the community and each site provides varying benefits to residents, employees, customers at local stores, tourists at public venues, children at schools, and other vulnerable populations. The benefit of some proposed sites varies depending on whether or not nearby bridges will be functioning after the preceding earthquake. Relative rankings of the TVE sites are fairly stable under various criteria-weighting scenarios but do vary considerably when comparing strategies to exclusively protect tourists or residents. The proposed geospatial framework can serve as an analytical foundation for future TVE siting discussions.
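The criteria-weighting analysis described above can be illustrated with a weighted-sum multi-criteria score. This is a generic MCDA sketch; the site names, criteria, scores, and weights are invented, not taken from the Ocean Shores study.

```python
def rank_sites(scores, weights):
    """Weighted-sum multi-criteria score per site; higher is better.
    `scores` maps site -> per-criterion values normalized to [0, 1]."""
    totals = {site: sum(w * v for w, v in zip(weights, vals))
              for site, vals in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Three hypothetical TVE sites scored on (residents served, tourists
# served, cost-effectiveness); values are illustrative only.
sites = {"school": (0.9, 0.3, 0.6),
         "hotel":  (0.4, 0.9, 0.5),
         "berm":   (0.7, 0.6, 0.8)}

# Rankings under two stakeholder weighting scenarios.
print(rank_sites(sites, (0.6, 0.1, 0.3)))  # prioritize residents
print(rank_sites(sites, (0.1, 0.6, 0.3)))  # prioritize tourists
```

Comparing rankings across weighting scenarios is exactly the stability check the abstract describes: a site that stays near the top under all plausible weight sets is a robust choice, while rank reversals reveal which stakeholder priorities the decision actually hinges on.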

  8. Strength and fatigue testing of large size wind turbines rotors. Vol. II: Full size natural vibration and static strength test, a reference case

    Energy Technology Data Exchange (ETDEWEB)

    Arias, F.; Soria, E.

    1996-12-01

This report presents the methods and procedures selected to define a strength test for large-size wind turbines, in particular their application to a 500 kW blade, together with the results obtained in the test carried out in July 1995 at Asinel's test plant (Madrid). Henceforth, this project is designated in abbreviated form by the acronym SFAT. (Author)

  9. Strength and fatigue testing of large size wind turbines rotors. Volume II. Full size natural vibration and static strength test, a reference case

    International Nuclear Information System (INIS)

    Arias, F.; Soria, E.

    1996-01-01

This report presents the methods and procedures selected to define a strength test for large-size wind turbines, in particular their application to a 500 kW blade, together with the results obtained in the test carried out in July 1995 at the Asinel test plant (Madrid). Henceforth, this project is designated in abbreviated form by the acronym SFAT. (Author)

  10. Job Stress in the United Kingdom: Are Small and Medium-Sized Enterprises and Large Enterprises Different?

    Science.gov (United States)

    Lai, Yanqing; Saridakis, George; Blackburn, Robert

    2015-08-01

This paper examines the relationship between firm size and employees' experience of work stress. We used a matched employer-employee dataset (Workplace Employment Relations Survey 2011) that comprises 7182 employees from 1210 private organizations in the United Kingdom. Initially, we find that employees in small and medium-sized enterprises experience a lower level of overall job stress than those in large enterprises, although the effect disappears when we control for individual and organizational characteristics in the model. We also find that quantitative work overload, job insecurity, poor promotion opportunities, good work relationships and poor communication are strongly associated with job stress in small and medium-sized enterprises, whereas qualitative work overload, poor job autonomy and employee engagement are more strongly associated with job stress in larger enterprises. Hence, our estimates show that the association and magnitude of estimated effects differ significantly by enterprise size. Copyright © 2013 John Wiley & Sons, Ltd.

  11. Ulysses: accurate detection of low-frequency structural variations in large insert-size sequencing libraries.

    Science.gov (United States)

    Gillet-Markowska, Alexandre; Richard, Hugues; Fischer, Gilles; Lafontaine, Ingrid

    2015-03-15

The detection of structural variations (SVs) in short-range Paired-End (PE) libraries remains challenging because SV breakpoints can involve large dispersed repeated sequences, or carry inherent complexity, hardly resolvable with classical PE sequencing data. In contrast, large insert-size sequencing libraries (Mate-Pair libraries) provide higher physical coverage of the genome and give access to repeat-containing regions. They can thus theoretically overcome previous limitations as they become routinely accessible. Nevertheless, broad insert-size distributions and high rates of chimeric sequences are usually associated with this type of library, which makes the accurate annotation of SVs challenging. Here, we present Ulysses, a tool that achieves drastically higher detection accuracy than existing tools, both on simulated and real mate-pair sequencing datasets from the 1000 Human Genome project. Ulysses achieves high specificity over the complete spectrum of variants by assessing, in a principled manner, the statistical significance of each possible variant (duplications, deletions, translocations, insertions and inversions) against an explicit model for the generation of experimental noise. This statistical model proves particularly useful for the detection of low-frequency variants. SV detection performed on a large-insert Mate-Pair library from a breast cancer sample revealed a high level of somatic duplications in the tumor and, to a lesser extent, in the blood sample as well. Altogether, these results show that Ulysses is a valuable tool for the characterization of somatic mosaicism in human tissues and in cancer genomes. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
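Ulysses' core idea, assessing each candidate variant against an explicit noise model, can be illustrated with a toy Poisson test: given an expected rate of chimeric (noise) mate-pairs per window, how surprising is a cluster of k discordant pairs supporting one variant? The rate and count below are invented for illustration; the tool's actual statistical model is more elaborate.

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), via the complementary CDF."""
    cdf = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))
    return 1.0 - cdf

# Hypothetical noise model: chimeric pairs land uniformly across the genome,
# so one window expects lam noise pairs.
lam = 0.8          # expected chimeric pairs per window (assumed)
cluster = 9        # observed discordant pairs supporting one candidate SV

p = poisson_sf(cluster, lam)
# a cluster of 9 against an expectation of 0.8 is extremely unlikely noise
print(f"P(>= {cluster} noise pairs) = {p:.2e}")
```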

  12. Analyzing Personal Happiness from Global Survey and Weather Data: A Geospatial Approach.

    Science.gov (United States)

    Peng, Yi-Fan; Tang, Jia-Hong; Fu, Yang-chih; Fan, I-chun; Hor, Maw-Kae; Chan, Ta-Chien

    2016-01-01

    Past studies have shown that personal subjective happiness is associated with various macro- and micro-level background factors, including environmental conditions, such as weather and the economic situation, and personal health behaviors, such as smoking and exercise. We contribute to this literature of happiness studies by using a geospatial approach to examine both macro and micro links to personal happiness. Our geospatial approach incorporates two major global datasets: representative national survey data from the International Social Survey Program (ISSP) and corresponding world weather data from the National Oceanic and Atmospheric Administration (NOAA). After processing and filtering 55,081 records of ISSP 2011 survey data from 32 countries, we extracted 5,420 records from China and 25,441 records from 28 other countries. Sensitivity analyses of different intervals for average weather variables showed that macro-level conditions, including temperature, wind speed, elevation, and GDP, are positively correlated with happiness. To distinguish the effects of weather conditions on happiness in different seasons, we also adopted climate zone and seasonal variables. The micro-level analysis indicated that better health status and eating more vegetables or fruits are highly associated with happiness. Never engaging in physical activity appears to make people less happy. The findings suggest that weather conditions, economic situations, and personal health behaviors are all correlated with levels of happiness.

  13. Operational Marine Data Acquisition and Delivery Powered by Web and Geospatial Standards

    Science.gov (United States)

    Thomas, R.; Buck, J. J. H.

    2015-12-01

As novel sensor types and new platforms are deployed to monitor the global oceans, the volumes of scientific and environmental data collected in the marine context are rapidly growing. In order to use these data in both the traditional operational modes and in innovative "Big Data" applications the data must be readily understood by software agents. One approach to achieving this is the application of both World Wide Web and Open Geospatial Consortium standards: namely Linked Data [1] and Sensor Web Enablement [2] (SWE). The British Oceanographic Data Centre (BODC) is adopting this strategy in a number of European Commission funded projects (NETMAR; SenseOCEAN; Ocean Data Interoperability Platform - ODIP; and AtlantOS) to combine its existing data archiving architecture with SWE components (such as Sensor Observation Services) and a Linked Data interface. These will evolve the data management and data transfer from a process that requires significant manual intervention to an automated operational process enabling the rapid, standards-based, ingestion and delivery of data. This poster will show the current capabilities of BODC and the status of on-going implementation of this strategy. References: 1. World Wide Web Consortium. (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 7th April 2015. 2. Open Geospatial Consortium. (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8th October 2014.

  14. Categorizing natural disaster damage assessment using satellite-based geospatial techniques

    Science.gov (United States)

    Myint, S.W.; Yuan, M.; Cerveny, R.S.; Giri, C.

    2008-01-01

    Remote sensing of a natural disaster's damage offers an exciting backup and/or alternative to traditional means of on-site damage assessment. Although necessary for complete assessment of damage areas, ground-based damage surveys conducted in the aftermath of natural hazard passage can sometimes be potentially complicated due to on-site difficulties (e.g., interaction with various authorities and emergency services) and hazards (e.g., downed power lines, gas lines, etc.), the need for rapid mobilization (particularly for remote locations), and the increasing cost of rapid physical transportation of manpower and equipment. Satellite image analysis, because of its global ubiquity, its ability for repeated independent analysis, and, as we demonstrate here, its ability to verify on-site damage assessment provides an interesting new perspective and investigative aide to researchers. Using one of the strongest tornado events in US history, the 3 May 1999 Oklahoma City Tornado, as a case example, we digitized the tornado damage path and co-registered the damage path using pre- and post-Landsat Thematic Mapper image data to perform a damage assessment. We employed several geospatial approaches, specifically the Getis index, Geary's C, and two lacunarity approaches to categorize damage characteristics according to the original Fujita tornado damage scale (F-scale). Our results indicate strong relationships between spatial indices computed within a local window and tornado F-scale damage categories identified through the ground survey. Consequently, linear regression models, even incorporating just a single band, appear effective in identifying F-scale damage categories using satellite imagery. This study demonstrates that satellite-based geospatial techniques can effectively add spatial perspectives to natural disaster damages, and in particular for this case study, tornado damages.
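The local-window statistics named above (Getis index, Geary's C, lacunarity) all reduce to scanning a window over the damage raster and summarizing the values inside it. A simplified Getis-type hot-spot ratio can be sketched as follows; this is a toy formulation and raster, not the study's exact index or its F-scale regression.

```python
# Toy local Getis-Ord-style statistic on a damage raster: for each cell,
# the ratio of values inside a (2r+1)x(2r+1) window to the global sum.
# High ratios flag spatial clusters of damage.

def local_g(grid, r=1):
    n, m = len(grid), len(grid[0])
    total = sum(sum(row) for row in grid)
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = sum(
                grid[a][b]
                for a in range(max(0, i - r), min(n, i + r + 1))
                for b in range(max(0, j - r), min(m, j + r + 1))
            )
            out[i][j] = s / total
    return out

# Hypothetical 5x5 "damage" raster with a hot spot at the centre.
raster = [
    [0, 0, 0, 0, 0],
    [0, 1, 2, 1, 0],
    [0, 2, 5, 2, 0],
    [0, 1, 2, 1, 0],
    [0, 0, 0, 0, 0],
]
g = local_g(raster)
print(f"centre G = {g[2][2]:.2f}, corner G = {g[0][0]:.2f}")
```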

  15. A lake-centric geospatial database to guide research and inform management decisions in an Arctic watershed in northern Alaska experiencing climate and land-use changes

    Science.gov (United States)

    Jones, Benjamin M.; Arp, Christopher D.; Whitman, Matthew S.; Nigro, Debora A.; Nitze, Ingmar; Beaver, John; Gadeke, Anne; Zuck, Callie; Liljedahl, Anna K.; Daanen, Ronald; Torvinen, Eric; Fritz, Stacey; Grosse, Guido

    2017-01-01

    Lakes are dominant and diverse landscape features in the Arctic, but conventional land cover classification schemes typically map them as a single uniform class. Here, we present a detailed lake-centric geospatial database for an Arctic watershed in northern Alaska. We developed a GIS dataset consisting of 4362 lakes that provides information on lake morphometry, hydrologic connectivity, surface area dynamics, surrounding terrestrial ecotypes, and other important conditions describing Arctic lakes. Analyzing the geospatial database relative to fish and bird survey data shows relations to lake depth and hydrologic connectivity, which are being used to guide research and aid in the management of aquatic resources in the National Petroleum Reserve in Alaska. Further development of similar geospatial databases is needed to better understand and plan for the impacts of ongoing climate and land-use changes occurring across lake-rich landscapes in the Arctic.

  16. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

Full Text Available One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible; miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI’s ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of its validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Attempts to find appropriate validation techniques for ABM therefore seem necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  17. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible; miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of its validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Attempts to find appropriate validation techniques for ABM therefore seem necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.
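One conventional validation route that such papers contrast with, checking an emergent statistic of the model against an analytically known value, can be sketched with a minimal agent model: unbiased random-walk agents, whose mean squared displacement should grow linearly with time. The agent count, step count, and tolerance are illustrative choices.

```python
import random

# Minimal ABM validation sketch: agents perform unbiased unit random walks
# in 1-D. Theory predicts mean squared displacement E[x^2] = t after t steps,
# so the emergent statistic can be compared against the known value --
# a simple face-validation technique for agent-based models.

def simulate_msd(n_agents, steps, seed=42):
    rng = random.Random(seed)
    positions = [0] * n_agents
    for _ in range(steps):
        positions = [x + rng.choice((-1, 1)) for x in positions]
    return sum(x * x for x in positions) / n_agents

msd = simulate_msd(n_agents=20_000, steps=100)
# validation check: emergent statistic should sit near the theoretical 100
assert abs(msd - 100) / 100 < 0.1, "model fails validation check"
print(f"simulated MSD = {msd:.1f}, theory = 100")
```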

  18. A Smart Web-Based Geospatial Data Discovery System with Oceanographic Data as an Example

    Directory of Open Access Journals (Sweden)

    Yongyao Jiang

    2018-02-01

Full Text Available Discovering and accessing geospatial data presents a significant challenge for the Earth sciences community as massive amounts of data are being produced on a daily basis. In this article, we report a smart web-based geospatial data discovery system that mines and utilizes data relevancy from metadata user behavior. Specifically, (1) the system enables semantic query expansion and suggestion to assist users in finding more relevant data; (2) machine-learned ranking is utilized to provide the optimal search ranking based on a number of identified ranking features that can reflect users’ search preferences; (3) a hybrid recommendation module is designed to allow users to discover related data considering metadata attributes and user behavior; (4) an integrated graphic user interface design is developed to quickly and intuitively guide data consumers to the appropriate data resources. As a proof of concept, we focus on a well-defined domain, oceanography, and use oceanographic data discovery as an example. Experiments and a search example show that the proposed system can improve the scientific community’s data search experience by providing query expansion, suggestion, better search ranking, and data recommendation via a user-friendly interface.
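The ranking step can be sketched as a scoring function over per-dataset ranking features. The feature names, weights, and dataset identifiers below are invented; in the system described, a learning-to-rank model fits such weights from observed user behavior rather than hard-coding them.

```python
# Sketch of feature-based search ranking: each candidate dataset gets a
# score = w . features, and results are returned best-first. Features and
# weights are hypothetical stand-ins for a learned ranking model.

FEATURES = ("text_relevance", "click_popularity", "spatial_overlap")
WEIGHTS = {"text_relevance": 0.5, "click_popularity": 0.3, "spatial_overlap": 0.2}

def score(doc):
    return sum(WEIGHTS[f] * doc[f] for f in FEATURES)

def rank(docs):
    return sorted(docs, key=score, reverse=True)

results = rank([
    {"id": "sst_daily",   "text_relevance": 0.9, "click_popularity": 0.2, "spatial_overlap": 0.7},
    {"id": "chlorophyll", "text_relevance": 0.6, "click_popularity": 0.9, "spatial_overlap": 0.9},
    {"id": "bathymetry",  "text_relevance": 0.3, "click_popularity": 0.4, "spatial_overlap": 0.2},
])
print([d["id"] for d in results])
```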

  19. Development and introduction of stamping technique for large-size laterals of NPP pipelines

    International Nuclear Information System (INIS)

    Romashko, N.I.; Moshnin, E.N.; Timokhin, V.S.; Bryukhanov, Yu.V.; Lebedev, V.A.

    1984-01-01

The results of the development and introduction of a stamping technique for large-size laterals of NPP high-pressure pipelines are presented, together with the main experimental data characterizing the technological possibilities of the process. The technological process and the stamp design ensure the production of laterals from ovalized bars in one heating of the bar and one stroke of the press crosshead. The new technology has decreased the labour input of lateral production while increasing the reliability and serviceability of the pipelines, and its introduction yields a considerable benefit.

  20. Geospatial Technologies and i-Tree Eco Inventory for Predicting Climate Change on Urban Environment

    Science.gov (United States)

    Sriharan, S.; Robinson, L.; Ghariban, N.; Comar, M.; Pope, B.; Frey, G.

    2015-12-01

Urban forests can be useful both in mitigating climate change and in helping cities adapt to higher temperatures and other impacts of climate change. Understanding and managing the impacts of climate change on urban forest trees and natural communities will help us maintain their environmental, cultural, and economic benefits. A tree inventory can provide important information on tree species, height, crown width, overall condition, health and maintenance needs. This presentation will demonstrate that a tree database system is necessary for developing a sustainable urban tree program. The Virginia State University (VSU) campus benefits from a large number and diversity of trees that help by cleaning the air, retaining water, and providing shade on the buildings to reduce energy costs. The objectives of this study were to develop a campus inventory of the trees, identify the tree species, map the locations of the trees with user-friendly tools such as i-Tree Eco and geospatial technologies while assessing the cost/benefit of employing student labor for training and ground validation of the results, and help campus landscape managers implement adaptive responses to climate change impacts. Data was collected on the location, species, and size of trees using the i-Tree urban forestry analysis software. This data was transferred to the i-Tree inventory system to record the types of trees and their diameter, height, and vintage. The study site was mapped by collecting waypoints with GPS (Global Positioning System) at the trees and uploading these waypoints into ArcMap.
The results of this study showed that: (i) students make good field crews; (ii) if more trees were placed in the proper areas, heating and cooling costs would be reduced; and (iii) a tree database system is necessary for the planning, designing, planting, maintenance, and removal of campus trees. Research sponsored by the NIFA Grant, "Urban Forestry Management" (2012-38821-20153).

  1. Geospatial technology and the "exposome": new perspectives on addiction.

    Science.gov (United States)

    Stahler, Gerald J; Mennis, Jeremy; Baron, David A

    2013-08-01

    Addiction represents one of the greatest public health problems facing the United States. Advances in addiction research have focused on the neurobiology of this disease. We discuss potential new breakthroughs in understanding the other side of gene-environment interactions-the environmental context or "exposome" of addiction. Such research has recently been made possible by advances in geospatial technologies together with new mobile and sensor computing platforms. These advances have fostered interdisciplinary collaborations focusing on the intersection of environment and behavior in addiction research. Although issues of privacy protection for study participants remain, these advances could potentially improve our understanding of initiation of drug use and relapse and help develop innovative technology-based interventions to improve treatment and continuing care services.

  2. An Examination of Teachers' Perceptions and Practice when Teaching Large and Reduced-Size Classes: Do Teachers Really Teach Them in the Same Way?

    Science.gov (United States)

    Harfitt, Gary James

    2012-01-01

    Class size research suggests that teachers do not vary their teaching strategies when moving from large to smaller classes. This study draws on interviews and classroom observations of three experienced English language teachers working with large and reduced-size classes in Hong Kong secondary schools. Findings from the study point to subtle…

  3. Analyzing Damping Vibration Methods of Large-Size Space Vehicles in the Earth's Magnetic Field

    Directory of Open Access Journals (Sweden)

    G. A. Shcheglov

    2016-01-01

Full Text Available It is known that most of today's space vehicles comprise large antennas, which are bracket-attached to the vehicle body. The dimensions of reflector antennas may be 30 ... 50 m, and the weight of such constructions can reach approximately 200 kg. Since the antenna dimensions are significantly larger than the size of the vehicle body, and the points attaching the brackets to the space vehicle have low stiffness, conventional dampers may be inefficient. The paper proposes to consider damping the antenna through its interaction with the Earth's magnetic field. A simple dynamic model of a space vehicle equipped with a large-size structure is built: the space vehicle is a parallelepiped to which the antenna is attached through a beam. To solve the model problems, a simplified model of the Earth's magnetic field was used: uniform, with field lines parallel to each other and perpendicular to the plane of the antenna. The paper considers two layouts of coils with respect to the antenna, namely a vertical one, in which the axis of the magnetic dipole is perpendicular to the antenna plane, and a horizontal one, in which the axis of the magnetic dipole lies in the antenna plane. It also explores two ways of magnetically damping the oscillations: through a controlled current supplied from the power supply system of the space vehicle, and through the self-induction current in the coil. Thus, four tasks were formulated. For each task an oscillation equation was formulated; then the ratio of oscillation amplitudes and their decay time were estimated. It was found that each task requires certain parameters either of the antenna itself (its dimensions and moment of inertia) or of the coil and, correspondingly, of the current supplied from the space vehicle. For each task, the ranges of these parameters that allow efficient damping of the vibrations were found. The conclusion can be drawn based on the analysis of tasks that a specialized control system

  4. Small genomes and large seeds: chromosome numbers, genome size and seed mass in diploid Aesculus species (Sapindaceae).

    Science.gov (United States)

    Krahulcová, Anna; Trávnícek, Pavel; Krahulec, František; Rejmánek, Marcel

    2017-04-01

Aesculus L. (horse chestnut, buckeye) is a genus of 12-19 extant woody species native to the temperate Northern Hemisphere. This genus is known for unusually large seeds among angiosperms. While chromosome counts are available for many Aesculus species, only one has had its genome size measured. The aim of this study is to provide more genome size data and analyse the relationship between genome size and seed mass in this genus. Chromosome numbers in root tip cuttings were confirmed for four species and reported for the first time for three additional species. Flow cytometric measurements of 2C nuclear DNA values were conducted on eight species, and mean seed mass values were estimated for the same taxa. The same chromosome number, 2n = 40, was determined in all investigated taxa. Original measurements of 2C values for seven Aesculus species (eight taxa), added to the single reliable datum for A. hippocastanum, confirmed the notion that the genome size in this genus with relatively large seeds is surprisingly low, ranging from 2C = 0.955 pg in A. parviflora to 2C = 1.275 pg in A. glabra var. glabra. The chromosome number 2n = 40 seems to be conclusively the universal number for non-hybrid species in this genus. Aesculus genome sizes are relatively small, not only within its own family, Sapindaceae, but also within woody angiosperms. The genome sizes seem to be distinct and non-overlapping among the four major Aesculus clades. These results provide extra support for the most recent reconstruction of Aesculus phylogeny. The correlation between the 2C values and seed masses in the examined Aesculus species is slightly negative and not significant. However, when the four major clades are treated separately, there is a consistent positive association between larger genome size and larger seed mass within individual lineages. © The Author 2017. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved.
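The sign reversal reported above, a slightly negative pooled correlation but positive associations within clades, is a classic grouping effect and can be reproduced with toy numbers. The values below are invented for illustration, not the measured 2C or seed-mass data.

```python
# Toy illustration of the reported pattern: pooled correlation between
# genome size (x) and seed mass (y) is negative, yet within each "clade"
# the correlation is positive. All numbers are made up.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

clades = {
    "A": ([1.0, 1.1, 1.2], [10.0, 11.0, 12.0]),  # larger genome -> larger seed
    "B": ([2.0, 2.1, 2.2], [5.0, 6.0, 7.0]),     # same trend, lower seed mass
}
pooled_x = [x for xs, _ in clades.values() for x in xs]
pooled_y = [y for _, ys in clades.values() for y in ys]

print("pooled r =", round(pearson(pooled_x, pooled_y), 2))
for name, (xs, ys) in clades.items():
    print(f"clade {name} r =", round(pearson(xs, ys), 2))
```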

  5. Preparation and validation of a large size dried spike: Batch SAL-9924

    International Nuclear Information System (INIS)

    Bagliano, G.; Cappis, J.; Doubek, N.; Jammet, G.; Raab, W.; Zoigner, A.

    1989-12-01

To determine uranium and plutonium concentration using isotope dilution mass spectrometry, weighed aliquands of a synthetic mixture containing 2 to 4 mg of Pu (with a 239Pu abundance of about 97%) and 40 to 200 mg of U (with a 235U enrichment of about 18%) can be advantageously used to spike a concentrated spent fuel solution with a high burn-up and a low 235U enrichment. This simplifies the conditioning of the sample by 1) reducing the preparation time (from more than one day for the conventional technique to 2-3 hours); 2) reducing the burden on the operator while making it easy for the inspector to witness the entire procedure (accurate dilution of the spent fuel sample before spiking is no longer necessary). Furthermore, this type of spike could be used as a common spike for the operator and the inspector. The source materials are available in sufficient quantity and are sufficiently cheaper than the commonly used 233U and 242Pu or 244Pu tracers that the costs of the overall operator-inspector procedures will be reduced. Certified Reference Materials Pu-NBL-126, natural U-NBS-960 and 93% enriched U-NBL-116 were used to prepare a stock solution containing 1.7 mg/ml of Pu and 68 mg/ml of 17.5% enriched U. Before shipment to the reprocessing plant, aliquands of the stock solution must be dried to give Large Size Dried Spikes which resist the shocks encountered during transportation, so that they can readily be recovered quantitatively at the plant. This paper describes the preparation and the validation of the Large Size Dried Spike. Proof of usefulness in the field will be provided at a later date in parallel with analysis by the conventional technique. Refs and tabs
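The calculation underlying such a spike follows the standard isotope-dilution relation: with R the 235U/238U ratio in sample (x), spike (y), and blend (b), the sample's 238U amount is n_x = n_y (R_y - R_b) / (R_b - R_x). A numeric sketch of that algebra follows; all values are illustrative, not batch SAL-9924 data.

```python
# Isotope dilution mass spectrometry (IDMS) sketch for uranium.
# R = n(235U)/n(238U). Mixing sample (x) with a known spike (y) gives a
# measured blend ratio R_b, from which the unknown amount is recovered via
#   n238_x * (R_b - R_x) = n238_y * (R_y - R_b)
# All numbers are hypothetical, chosen only to illustrate the algebra.

def idms_sample_amount(n238_spike, r_spike, r_sample, r_blend):
    """Amount of 238U in the sample from the measured blend ratio."""
    return n238_spike * (r_spike - r_blend) / (r_blend - r_sample)

R_SAMPLE = 0.00725   # natural-like 235/238 ratio (assumed)
R_SPIKE = 0.2195     # roughly 18 % enriched spike (assumed)

# forward-simulate a blend of 10 units of sample plus 1 unit of spike ...
n_x_true, n_y = 10.0, 1.0
r_blend = (R_SAMPLE * n_x_true + R_SPIKE * n_y) / (n_x_true + n_y)

# ... then invert it with the IDMS relation
n_x = idms_sample_amount(n_y, R_SPIKE, R_SAMPLE, r_blend)
print(f"recovered sample amount: {n_x:.6f} (true {n_x_true})")
```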

  6. Some Key Technologies of Geospatial Information System for China Water Census

    Directory of Open Access Journals (Sweden)

    CAI Yang

    2015-05-01

Full Text Available We have pioneered research on a geospatial information system for the national water census and its application. Addressing the main issues of information acquisition, data management, quality control, and project organization, the overall approach is presented: it takes fundamental data as support and the data model as precursor, assigns intelligent tools a protective role, and combines management theory with technical methods. The key techniques developed include digital basin extraction, data modelling oriented to water-resources objects, data acquisition and processing under defined rules, and the application of multidimensional themes.

  7. Mining User spatiotemporal Behavior in Geospatial Cyberinfrastructure --using GEOSS Clearinghouse as an example

    Science.gov (United States)

    XIA, J.; Yang, C.; Liu, K.; Huang, Q.; Li, Z.

    2013-12-01

    Big Data becomes increasingly important in almost all scientific domains, especially in geoscience where hundreds to millions of sensors are collecting data of the Earth continuously (Whitehouse News 2012). With the explosive growth of data, various Geospatial Cyberinfrastructure (GCI) (Yang et al. 2010) components are developed to manage geospatial resources and provide data access for the public. These GCIs are accessed by different users intensively on a daily basis. However, little research has been done to analyze the spatiotemporal patterns of user behavior, which could be critical to the management of Big Data and the operation of GCIs (Yang et al. 2011). For example, the spatiotemporal distribution of end users helps us better arrange and locate GCI computing facilities. A better indexing and caching mechanism could be developed based on the spatiotemporal pattern of user queries. In this paper, we use GEOSS Clearinghouse as an example to investigate spatiotemporal patterns of user behavior in GCIs. The investigation results show that user behaviors are heterogeneous but with patterns across space and time. Identified patterns include (1) the high access frequency regions; (2) local interests; (3) periodical accesses and rush hours; (4) spiking access. Based on identified patterns, this presentation reports several solutions to better support the operation of the GEOSS Clearinghouse and other GCIs. Keywords: Big Data, EarthCube, CyberGIS, Spatiotemporal Thinking and Computing, Data Mining, User Behavior Reference: Fayyad, U. M., Piatetsky-Shapiro, G., Smyth, P., & Uthurusamy, R. 1996. Advances in knowledge discovery and data mining. Whitehouse. 2012. Obama administration unveils 'BIG DATA' initiative: announces $200 million in new R&D investments. Whitehouse. Retrieved from http://www.whitehouse.gov/sites/default/files/microsites/ostp/big_data_press_release_final_2.pdf [Accessed 14 June 2013] Yang, C., Wu, H., Huang, Q., Li, Z., & Li, J. 2011. 
Using spatial
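The "rush hours" pattern described in the abstract above can be extracted from an access log by binning request timestamps by hour of day and flagging hours well above the mean. The log entries and threshold below are synthetic illustrations.

```python
from collections import Counter
from datetime import datetime

# Detect "rush hours" in a GCI access log: bin requests by hour-of-day and
# flag hours whose count exceeds mean + 1 standard deviation.

def rush_hours(timestamps, threshold_sigma=1.0):
    counts = Counter(datetime.fromisoformat(t).hour for t in timestamps)
    per_hour = [counts.get(h, 0) for h in range(24)]
    mean = sum(per_hour) / 24
    std = (sum((c - mean) ** 2 for c in per_hour) / 24) ** 0.5
    return [h for h in range(24) if per_hour[h] > mean + threshold_sigma * std]

# synthetic log: bursts at 09h and 14h over a flat background
log = (
    ["2013-06-14T09:%02d:00" % m for m in range(30)]
    + ["2013-06-14T14:%02d:00" % m for m in range(25)]
    + ["2013-06-14T%02d:05:00" % h for h in range(24)]
)
print(rush_hours(log))
```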

  8. Simulation of reflecting surface deviations of centimeter-band parabolic space radiotelescope (SRT) with the large-size mirror

    Science.gov (United States)

    Kotik, A.; Usyukin, V.; Vinogradov, I.; Arkhipov, M.

    2017-11-01

The realization of astrophysical research requires the development of high-sensitivity centimeter-band parabolic space radiotelescopes (SRT) with large-size mirrors. Constructively, such an SRT with a mirror size of more than 10 m can be realized as a deployable rigid structure. Mesh structures of such size do not provide the reflecting-surface accuracy which is necessary for centimeter-band observations. Such a telescope with a 10 m diameter mirror is now being developed in Russia within the "SPECTR - R" program. The external dimensions of the telescope exceed the size of existing thermo-vacuum chambers used to verify the SRT reflecting-surface accuracy parameters under the action of space environment factors. That is why numerical simulation turns out to be the basis required to accept the adopted designs. Such modeling should be based on experimental characterization of the basic construction materials and elements of the future reflector. In this article, computational modeling of the reflecting-surface deviations of a centimeter-band, large-size deployable space reflector at the stage of its orbital functioning is considered. An analysis of the factors that determine the deviations, both deterministic (temperature fields) and non-deterministic (telescope manufacturing and installation faults; deformations caused by the behavior of composite materials in space), is carried out. A finite-element model and a complex of methods are developed. They allow computational modeling of the reflecting-surface deviations caused by the influence of all factors, taking into account deviation correction by the space-vehicle orientation system. The results of modeling for two modes of functioning (orientation at the Sun) of the SRT are presented.
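Amplitude-ratio and decay-time estimates of the kind mentioned in the damping study above (record 3) follow from the standard damped-oscillator model I*theta'' + c*theta' + k*theta = 0, whose envelope decays as exp(-c*t / (2*I)). A numeric sketch under invented parameter values (not those of an actual reflector or antenna):

```python
import math

# Damped torsional oscillator sketch: I*theta'' + c*theta' + k*theta = 0.
# The amplitude envelope decays as exp(-t/tau) with tau = 2*I/c, so the
# amplitude ratio over one damped period T follows directly.

def decay_metrics(inertia, damping, stiffness):
    tau = 2.0 * inertia / damping                     # envelope time constant [s]
    omega0 = math.sqrt(stiffness / inertia)           # undamped frequency [rad/s]
    omega_d = math.sqrt(omega0**2 - (1.0 / tau)**2)   # damped frequency [rad/s]
    period = 2.0 * math.pi / omega_d
    ratio_per_period = math.exp(-period / tau)        # A(t+T) / A(t)
    return tau, period, ratio_per_period

# hypothetical structure: I = 500 kg*m^2, damping c = 20 N*m*s, k = 50 N*m/rad
tau, T, r = decay_metrics(500.0, 20.0, 50.0)
print(f"tau = {tau:.0f} s, period = {T:.1f} s, amplitude ratio per period = {r:.2f}")
```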

  9. Analysis of Large Seeds from Three Different Medicago truncatula Ecotypes Reveals a Potential Role of Hormonal Balance in Final Size Determination of Legume Grains

    Directory of Open Access Journals (Sweden)

    Kaustav Bandyopadhyay

    2016-09-01

    Legume seeds are an important source of protein and oil in the human diet. Understanding how their final seed size is determined is crucial to improving crop yield. In this study, we analyzed seed development in three accessions of the model legume Medicago truncatula displaying contrasting seed sizes. By comparing two large-seeded accessions to the reference accession A17, we described mechanisms associated with large seed size determination and potential factors modulating final seed size. We observed that early events during embryogenesis had a major impact on final seed size, and that delayed heart-stage embryo development resulted in large seeds. We also observed that the difference in seed growth rate was mainly due to a difference in embryo cell number, implicating a role for the rate of cell division. The large seed size of these accessions could be explained by an extended period of cell division due to a longer embryogenesis phase. Consistent with our observations and recent reports, the ratio of auxin (IAA) to abscisic acid (ABA) could be a key determinant of cell division regulation at the end of embryogenesis. Overall, our study highlights that the timing of events occurring during early seed development plays a decisive role in final seed size determination.

  10. Elucidating hydraulic fracturing impacts on groundwater quality using a regional geospatial statistical modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Burton, Taylour G., E-mail: tgburton@uh.edu [Civil and Environmental Engineering, University of Houston, W455 Engineering Bldg. 2, Houston, TX 77204-4003 (United States); Rifai, Hanadi S., E-mail: rifai@uh.edu [Civil and Environmental Engineering, University of Houston, N138 Engineering Bldg. 1, Houston, TX 77204-4003 (United States); Hildenbrand, Zacariah L., E-mail: zac@informenv.com [Inform Environmental, LLC, Dallas, TX 75206 (United States); Collaborative Laboratories for Environmental Analysis and Remediation, University of Texas at Arlington, Arlington, TX 76019 (United States); Carlton, Doug D., E-mail: doug.carlton@mavs.uta.edu [Collaborative Laboratories for Environmental Analysis and Remediation, University of Texas at Arlington, Arlington, TX 76019 (United States); Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States); Fontenot, Brian E., E-mail: brian.fonteno@mavs.uta.edu [Collaborative Laboratories for Environmental Analysis and Remediation, University of Texas at Arlington, Arlington, TX 76019 (United States); Schug, Kevin A., E-mail: kschug@uta.edu [Collaborative Laboratories for Environmental Analysis and Remediation, University of Texas at Arlington, Arlington, TX 76019 (United States); Department of Chemistry & Biochemistry, The University of Texas at Arlington, Arlington, TX (United States)

    2016-03-01

    Highlights:
    • Migration pathways from fractured wells to groundwater are poorly understood
    • Geospatial modeling correlated groundwater chemicals to Barnett fractured wells
    • Increased beryllium strongly associated with hydraulically fractured gas wells
    • Indirect evidence of pollutant migration via microannular fissures in well casing
    • Large-scale and spatial approach needed to detect groundwater quality changes

  11. Elucidating hydraulic fracturing impacts on groundwater quality using a regional geospatial statistical modeling approach

    International Nuclear Information System (INIS)

    Burton, Taylour G.; Rifai, Hanadi S.; Hildenbrand, Zacariah L.; Carlton, Doug D.; Fontenot, Brian E.; Schug, Kevin A.

    2016-01-01

    Highlights:
    • Migration pathways from fractured wells to groundwater are poorly understood
    • Geospatial modeling correlated groundwater chemicals to Barnett fractured wells
    • Increased beryllium strongly associated with hydraulically fractured gas wells
    • Indirect evidence of pollutant migration via microannular fissures in well casing
    • Large-scale and spatial approach needed to detect groundwater quality changes

  12. GIS-and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Wei [Colorado School of Mines, Golden, CO (United States); Minnick, Matthew [Colorado School of Mines, Golden, CO (United States); Geza, Mengistu [Colorado School of Mines, Golden, CO (United States); Murray, Kyle [Colorado School of Mines, Golden, CO (United States); Mattson, Earl [Colorado School of Mines, Golden, CO (United States)

    2012-09-30

    The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) in October 2008 to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development. The ultimate goal of this research project is to develop a water resource geospatial infrastructure that serves as "baseline data" for creating solutions for water resource management and for supporting decision making on oil shale resource development. The project ended on September 30, 2012. This final report presents the key findings from the project activity, major accomplishments, and expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase, as well as other deliverables stored on digital storage media, will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activity include the quantitative spatial and temporal distribution of the water resource throughout the Piceance Basin, water consumption with respect to oil shale production, and the data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase; automated data processing scripts (Matlab) linking the database with surface water and geological models; an ArcGIS model for hydrogeologic data processing for groundwater model input; a 3D geological model; surface water/groundwater models; an energy resource development systems model; and a web-based geospatial infrastructure for data exploration, visualization, and dissemination. This research will have broad impacts on the development of oil shale resources in the US. The geodatabase provides baseline data for further study of oil shale development and identification of further data collection needs. The 3D geological model provides better understanding through data interpolation and

  13. Tools for open geospatial science

    Science.gov (United States)

    Petras, V.; Petrasova, A.; Mitasova, H.

    2017-12-01

    Open science uses open source to address reproducibility challenges in the data and computational sciences. However, just using open source software or making the code public does not make the research reproducible. Moreover, scientists face the challenge of learning unfamiliar new tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools that make validation and reuse by a wider professional community possible. For novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and lab notebook. We will see how Git helps us not to get lost in revisions, and how Docker is used to wrap all the parts together in a single text file so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain that use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.
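
    The single-command reproducibility described above can be illustrated with a minimal, self-contained Python sketch. The script, data, and file name here are invented for illustration; the point is that a fixed random seed and scripted outputs let anyone regenerate identical results with one command.

```python
# reproduce.py -- regenerate all derived outputs for a report with:
#     python reproduce.py
# A fixed random seed makes every run produce identical results,
# which is the core idea behind scripted, reproducible analyses.
import csv
import random
import statistics

def run_analysis(seed=42, n=100):
    """Simulate an elevation sample and return summary statistics."""
    rng = random.Random(seed)  # fixed seed -> deterministic "data"
    elevations = [100 + 50 * rng.random() for _ in range(n)]
    return {
        "n": n,
        "mean": statistics.mean(elevations),
        "stdev": statistics.stdev(elevations),
    }

def write_summary(summary, path="summary.csv"):
    """Write the summary table that a figure script or report consumes."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["statistic", "value"])
        for key, value in summary.items():
            writer.writerow([key, round(value, 3)])

if __name__ == "__main__":
    summary = run_analysis()
    write_summary(summary)
    print(f"mean elevation: {summary['mean']:.1f} m")
```

    In a real course setting, the same pattern extends naturally: the script lives in a Git repository, runs inside a Docker container, and is narrated in a Jupyter Notebook.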

  14. Authoring Tours of Geospatial Data With KML and Google Earth

    Science.gov (United States)

    Barcay, D. P.; Weiss-Malik, M.

    2008-12-01

    As virtual globes become widely adopted by the general public, the use of geospatial data has expanded greatly. With the popularization of Google Earth and other platforms, GIS systems have become virtual reality platforms. Using these platforms, a casual user can easily explore the world, browse massive datasets, create powerful 3D visualizations, and share those visualizations with millions of people using the KML language. This technology has raised the bar for professionals and academics alike: it is now expected that studies and projects will be accompanied by compelling, high-quality visualizations. In this new landscape, a presentation of geospatial data can be the most effective form of advertisement for a project, engaging both the general public and the scientific community in a unified interactive experience. On the other hand, merely dumping a dataset into a virtual globe can be a disorienting, alienating experience for many users. To create an effective, far-reaching presentation, an author must take care to make their data approachable to a wide variety of users with varying knowledge of the subject matter, expertise with virtual globes, and attention spans. To that end, we present techniques for creating self-guided interactive tours of data represented in KML and visualized in Google Earth. Using these methods, we provide the ability to move the camera through the world while dynamically varying the content, style, and visibility of the displayed data. Such tours can automatically guide users through massive, complex datasets, engaging a broad user base and conveying subtle concepts that are not immediately apparent when viewing the raw data. To the casual user, these techniques result in an extremely compelling experience similar to watching video. Unlike video, though, these techniques maintain the rich interactive environment provided by the virtual globe, allowing users to explore the data in detail and to add other data sources to the presentation.
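
    The tour structure described above lives in KML's Google extension namespace: a gx:Tour contains a gx:Playlist of gx:FlyTo steps, and Google Earth interpolates the camera between each step's LookAt. The following Python snippet (a minimal sketch; the tour name and coordinates are made up) builds a two-stop tour using only the standard library:

```python
# Build a minimal KML camera tour (sketch; names/coordinates invented).
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
GX_NS = "http://www.google.com/kml/ext/2.2"
ET.register_namespace("", KML_NS)
ET.register_namespace("gx", GX_NS)

def make_tour(name, stops):
    """stops: iterable of (lon, lat, range_m, duration_s) camera stops."""
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    tour = ET.SubElement(doc, f"{{{GX_NS}}}Tour")
    ET.SubElement(tour, f"{{{KML_NS}}}name").text = name
    playlist = ET.SubElement(tour, f"{{{GX_NS}}}Playlist")
    for lon, lat, range_m, duration_s in stops:
        # Each FlyTo moves the camera to a new LookAt over duration_s.
        flyto = ET.SubElement(playlist, f"{{{GX_NS}}}FlyTo")
        ET.SubElement(flyto, f"{{{GX_NS}}}duration").text = str(duration_s)
        lookat = ET.SubElement(flyto, f"{{{KML_NS}}}LookAt")
        ET.SubElement(lookat, f"{{{KML_NS}}}longitude").text = str(lon)
        ET.SubElement(lookat, f"{{{KML_NS}}}latitude").text = str(lat)
        ET.SubElement(lookat, f"{{{KML_NS}}}range").text = str(range_m)
    return ET.tostring(kml, encoding="unicode")

tour_kml = make_tour("Study area flyover", [
    (-122.08, 37.42, 5000, 3.0),  # wide establishing view
    (-122.00, 37.50, 1500, 5.0),  # zoom to the feature of interest
])
```

    Loading the resulting file in Google Earth plays the camera path automatically, while the user remains free to pause and explore.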

  15. Geospatial analysis of food environment demonstrates associations with gestational diabetes.

    Science.gov (United States)

    Kahr, Maike K; Suter, Melissa A; Ballas, Jerasimos; Ramin, Susan M; Monga, Manju; Lee, Wesley; Hu, Min; Shope, Cindy D; Chesnokova, Arina; Krannich, Laura; Griffin, Emily N; Mastrobattista, Joan; Dildy, Gary A; Strehlow, Stacy L; Ramphul, Ryan; Hamilton, Winifred J; Aagaard, Kjersti M

    2016-01-01

    Gestational diabetes mellitus (GDM) is one of the most common complications of pregnancy, with incidence rates varying by maternal age, race/ethnicity, obesity, parity, and family history. Given its increasing prevalence in recent decades, covariant environmental and sociodemographic factors may be additional determinants of GDM occurrence. We hypothesized that environmental risk factors, in particular measures of the food environment, may contribute to diabetes risk. We employed geospatial modeling in a populous US county to characterize the association of the relative availability of fast food restaurants and supermarkets with GDM. Utilizing a perinatal database with >4900 encoded antenatal and outcome variables inclusive of ZIP code data, 8912 consecutive pregnancies were analyzed for correlations between GDM and the food environment based on countywide food permit registration data. Linkage between pregnancies and the food environment was achieved on the basis of validated 5-digit ZIP code data. The prevalence of supermarkets and fast food restaurants per 100,000 inhabitants for each ZIP code was gathered from publicly available food permit sources. To independently authenticate our findings with objective data, we measured hemoglobin A1c levels as a function of the geospatial distribution of the food environment in a matched subset (n = 80). Residence in neighborhoods with a high prevalence of fast food restaurants (fourth quartile) was significantly associated with an increased risk of developing GDM (relative to the first quartile: adjusted odds ratio, 1.63; 95% confidence interval, 1.21-2.19). In multivariate analysis, this association held true after controlling for potential confounders (P = .002). Hemoglobin A1c levels in the matched subset were significantly increased in association with residence in a ZIP code with a higher fast food/supermarket ratio (n = 80, r = 0.251 P analysis, a relationship of food environment and risk for gestational diabetes was
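
    The quartile comparison reported above can be illustrated with a simple unadjusted odds-ratio calculation. This is a minimal sketch with invented counts; the study's 1.63 estimate came from an adjusted multivariate model, not from a raw 2x2 table like this one.

```python
import math

def odds_ratio(exposed_cases, exposed_controls,
               unexposed_cases, unexposed_controls):
    """Unadjusted odds ratio with a Woolf 95% confidence interval."""
    or_ = (exposed_cases * unexposed_controls) / \
          (exposed_controls * unexposed_cases)
    # Standard error of log(OR) from the four cell counts.
    se_log = math.sqrt(1 / exposed_cases + 1 / exposed_controls +
                       1 / unexposed_cases + 1 / unexposed_controls)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)

# Hypothetical counts: GDM cases/controls among pregnancies in the
# 4th vs 1st quartile of fast-food-restaurant density.
estimate, (ci_lo, ci_hi) = odds_ratio(60, 540, 40, 560)
```

    A real analysis would additionally adjust for confounders (e.g., with logistic regression), which is why the published adjusted odds ratio differs from any raw table calculation.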

  16. Large and abundant flowers increase indirect costs of corollas: a study of coflowering sympatric Mediterranean species of contrasting flower size.

    Science.gov (United States)

    Teixido, Alberto L; Valladares, Fernando

    2013-09-01

    Large floral displays receive more pollinator visits but involve higher production and maintenance costs. This can result in indirect costs which may negatively affect functions like reproductive output. In this study, we explored the relationship between floral display and indirect costs in two pairs of coflowering sympatric Mediterranean Cistus of contrasting flower size. We hypothesized that: (1) corolla production entails direct costs in dry mass, N and P, (2) corollas entail significant indirect costs in terms of fruit set and seed production, (3) indirect costs increase with floral display, (4) indirect costs are greater in larger-flowered sympatric species, and (5) local climatic conditions influence indirect costs. We compared fruit set and seed production of petal-removed flowers and unmanipulated control flowers and evaluated the influence of mean flower number and mean flower size on relative fruit and seed gain of petal-removed and control flowers. Fruit set and seed production were significantly higher in petal-removed flowers in all the studied species. A positive relationship was found between relative fruit gain and mean individual flower size within species. In one pair of species, fruit gain was higher in the large-flowered species, as was the correlation between fruit gain and mean number of open flowers. In the other pair, the correlation between fruit gain and mean flower size was also higher in the large-flowered species. These results reveal that Mediterranean environments impose significant constraints on floral display, counteracting advantages of large flowers from the pollination point of view with increased indirect costs of such flowers.

  17. Comparative study of cocoa black ants temporal population distribution utilizing geospatial analysis

    Science.gov (United States)

    Adnan, N. A.; Bakar, S.; Mazlan, A. H.; Yusoff, Z. Mohd; Rasam, A. R. Abdul

    2018-02-01

    Cocoa plantations are also subject to disease and pest infestation. Some pests not only reduce yield but also inhibit tree growth. Therefore, the Malaysia Cocoa Board (MCB) has explored cocoa black ants (CBA) as one of its biological control mechanisms to reduce infestation by the cocoa pod borer (CPB). The CPB damages cocoa beans and thereby reduces the quality of dried cocoa beans. This study integrates geospatial analysis into the understanding of CBA population distribution patterns to enhance their use in controlling CPB infestation. The two objectives of the study are: i) to generate temporal CBA distributions for two different blocks of a cocoa plantation, and ii) to visually compare the CBA population distribution patterns with the aid of geospatial techniques. The study found that CBA populations showed a modest, low distribution in February 2007, reached their highest levels in September 2007, and decreased by the end of 2009 in the two blocks (10B and 18A). GIS is therefore useful for explaining CBA population patterns in mature cocoa fields. This finding might be used as an indicator of the optimum distribution of CBA needed as a biological control agent against the CPB in the future.

  18. Geospatial analysis of long-term morphological changes in Cochin estuary, SW coast of India

    Digital Repository Service at National Institute of Oceanography (India)

    DineshKumar, P.K.; Gopinath, G.; Manimurali, R.; Muraleedharan, K.R.

    are complex, where resource and management systems often confront multiple conflicts. Estuarine functioning is sensitive to changes in environmental factors and human interventions. The morphology of estuaries is generally characterized by the strong... for the future. Synchronous environmental data would be useful in understanding the carrying capacity, as well as the problems and potential of fisheries, tourism, and navigation. CONCLUSION In the discussion above, we examined the long-term geospatial information...

  19. Mapping the world: cartographic and geographic visualization by the United Nations Geospatial Information Section (formerly Cartographic Section)

    Science.gov (United States)

    Kagawa, Ayako; Le Sourd, Guillaume

    2018-05-01

    In support of United Nations Secretariat activities, mapping began in 1946, and by 1951 the need for maps had increased and an office with a team of cartographers was established. Since then, with the development of technologies including the internet, remote sensing, unmanned aerial systems, relational database management, and information systems, geospatial information has provided an ever-increasing variety of support to the work of the Organization for planning of operations, decision making, and monitoring of crises. However, the need for maps has remained intact. This presentation aims to highlight some of the cartographic representation styles over the decades by reviewing the evolution of selected maps by the office, noting the changing cognitive and semiotic aspects of the cartographic and geographic visualization required by the United Nations. Through presentation and analysis of these maps, the changing dynamics of the Organization in information management can be reflected, with a reminder of the continuing and expanding deconstructionist role of the cartographer, now a geospatial information management expert.

  20. The Whole World In Your Hands: Using an Interactive Virtual Reality Sandbox for Geospatial Education and Outreach

    Science.gov (United States)

    Clucas, T.; Wirth, G. S.; Broderson, D.

    2014-12-01

    Traditional geospatial education tools such as maps and computer screens don't convey the rich topography present on Earth. Translating contour lines on a topo map into relief in a landscape can be a challenging concept to convey. A partnership between Alaska EPSCoR and the Geographic Information Network of Alaska has successfully constructed an Interactive Virtual Reality Sandbox, an education tool that projects and updates topographic contours in real time on the surface of a sandbox. The sandbox has been successfully deployed at public science events as well as professional geospatial and geodesy conferences. Landscape change, precipitation, and evaporation can all be modeled, much to the delight of our enthusiasts, who range in age from 3 to 90. Visually, as well as haptically, demonstrating the effects of events (such as dragging a hand through the sand) on a landscape, as well as the intuitive realization of the meaning of topographic contour lines, has proven to be engaging.
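
    At its core, a sandbox of this kind repeatedly classifies each point of a live elevation grid into a contour band and projects the band boundaries onto the sand. A minimal sketch of that classification step follows (with a made-up grid; the real system processes a depth-camera feed every frame):

```python
# Classify an elevation grid into contour bands and locate band
# boundaries, as the sandbox projector does for each video frame.
# (A sketch: the grid values and contour interval are invented.)

def band_index(elevation, interval=10.0):
    """Which contour band (0, 1, 2, ...) an elevation falls into."""
    return int(elevation // interval)

def classify_grid(grid, interval=10.0):
    """Map a 2D elevation grid to contour-band indices."""
    return [[band_index(z, interval) for z in row] for row in grid]

def contour_cells(bands):
    """Cells lying on a contour line: any cell whose right or lower
    neighbour belongs to a different band."""
    edges = set()
    for i, row in enumerate(bands):
        for j, b in enumerate(row):
            if j + 1 < len(row) and row[j + 1] != b:
                edges.add((i, j))
            if i + 1 < len(bands) and bands[i + 1][j] != b:
                edges.add((i, j))
    return edges

grid = [
    [2.0,  8.0, 14.0],
    [4.0, 12.0, 21.0],
    [6.0, 16.0, 27.0],
]
bands = classify_grid(grid)
edges = contour_cells(bands)
```

    Coloring each band and highlighting the edge cells gives the familiar projected topo-map look; rerunning the loop as the sand is reshaped makes the contours follow the user's hands.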