WorldWideScience

Sample records for geospatial visualization platform

  1. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    Science.gov (United States)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to data storage, computing, and analysis technologies. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is built on the latest virtualized computing infrastructure and a distributed computing architecture. OpenStack and Docker are used to build a multi-user cloud computing hosting infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides spatiotemporal computational models and advanced geospatial visualization tools for other domains with a spatial component. We
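
    As a rough illustration of the kind of Spark-based parallel aggregation such a platform targets, the sketch below bins point records into grid cells with plain PySpark; it does not use GISpark's own API, and the input path and column names are assumptions.

      # Minimal PySpark sketch: bin point records into 0.1-degree grid cells and count them.
      # Illustrative only; GISpark's actual API is not shown. Input path and column names are assumed.
      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("grid-aggregation-sketch").getOrCreate()

      # Hypothetical CSV of GPS points with "lon" and "lat" columns.
      points = spark.read.csv("gps_points.csv", header=True, inferSchema=True)

      cells = (points
               .withColumn("cell_x", F.floor(F.col("lon") / 0.1))
               .withColumn("cell_y", F.floor(F.col("lat") / 0.1))
               .groupBy("cell_x", "cell_y")
               .count())

      cells.write.mode("overwrite").parquet("cell_counts.parquet")
      spark.stop()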

  2. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java*

    Science.gov (United States)

    Hogan, P.; Coughlan, J.

    2006-12-01

    NASA World Wind has only one goal: to provide the maximum opportunity for geospatial information to be experienced, be it for education, science, research, business, or government. The benefits to understanding when information is delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats, including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curricula. The National Guard uses World Wind for emergency response activities, and State governments have incorporated high-resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  3. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    Science.gov (United States)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub, or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings, GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs ranging from just a feature-rich data management system to complex scientific tools and workflows.

  4. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java* for EDUCATION

    Science.gov (United States)

    Hogan, P.; Kuehnel, F.

    2006-12-01

    NASA World Wind has only one goal: to provide the maximum opportunity for geospatial information to be experienced, be it for education, science, research, business, or government. The benefits to understanding when information is delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats, including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curricula. The National Guard uses World Wind for emergency response activities, and State governments have incorporated high-resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  5. A Javascript GIS Platform Based on Invocable Geospatial Web Services

    Directory of Open Access Journals (Sweden)

    Konstantinos Evangelidis

    2018-04-01

    Semantic Web technologies have been increasingly adopted by the geospatial community over the last decade through the utilization of open standards for expressing and serving geospatial data. This has also been dramatically assisted by the ever-increasing access and usage of geographic mapping and location-based services via smart devices in people's daily activities. In this paper, we explore the developmental framework of a pure JavaScript client-side GIS platform exclusively based on invocable geospatial Web services. We also extend JavaScript utilization on the server side by deploying a node server acting as a bridge between open source WPS libraries and popular geoprocessing engines. The vehicle for such an exploration is a cross-platform Web browser capable of interpreting JavaScript commands to achieve interaction with geospatial providers. The tool is a generic Web interface providing capabilities for acquiring spatial datasets, composing layouts and applying geospatial processes. Ideally, the end user only has to identify the services that satisfy a geo-related need and put them in the appropriate row. The final output may act as a potential collector of freely available geospatial web services. Its server-side components may exploit geospatial processing suppliers, composing in this way a lightweight, fully transparent, open Web GIS platform.
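
    Since the platform is built entirely around invocable OGC Web services, a client request to a WPS geoprocessing service reduces to a single HTTP call. The sketch below uses the standard WPS 1.0.0 key-value encoding with the requests library; the endpoint URL, process identifier and input encoding are placeholder assumptions, not the paper's actual services.

      # Sketch of an OGC WPS 1.0.0 Execute request using key-value-pair encoding.
      # Endpoint, process identifier and input syntax are placeholders for illustration.
      import requests

      endpoint = "https://example.org/wps"          # hypothetical WPS endpoint
      params = {
          "service": "WPS",
          "version": "1.0.0",
          "request": "Execute",
          "identifier": "geo:buffer",               # assumed process name
          "datainputs": "geometry=POINT(23.7 37.9);distance=100",
      }

      response = requests.get(endpoint, params=params, timeout=60)
      print(response.status_code)
      print(response.text[:500])                    # ExecuteResponse XML, truncated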

  6. A CLOUD-BASED PLATFORM SUPPORTING GEOSPATIAL COLLABORATION FOR GIS EDUCATION

    Directory of Open Access Journals (Sweden)

    X. Cheng

    2015-05-01

    GIS-related education needs the support of geo-data and geospatial software. Although a large amount of geographic information resources is distributed on the web, the discovery, processing and integration of these resources remain unsolved problems. Researchers and teachers often search for geo-data with common search engines, but the results are not satisfactory. They also spend much money and energy on the purchase and maintenance of various kinds of geospatial software. To address these problems, a cloud-based geospatial collaboration platform called GeoSquare was designed and implemented. The platform serves as a geoportal encouraging geospatial data, information, and knowledge sharing through highly interactive and expressive graphic interfaces. Researchers and teachers can solve their problems effectively in this one-stop solution. Functions, specific design and implementation details are presented in this paper. The site of GeoSquare is: http://geosquare.tianditu.com/

  7. a Cloud-Based Platform Supporting Geospatial Collaboration for GIS Education

    Science.gov (United States)

    Cheng, X.; Gui, Z.; Hu, K.; Gao, S.; Shen, P.; Wu, H.

    2015-05-01

    GIS-related education needs the support of geo-data and geospatial software. Although a large amount of geographic information resources is distributed on the web, the discovery, processing and integration of these resources remain unsolved problems. Researchers and teachers often search for geo-data with common search engines, but the results are not satisfactory. They also spend much money and energy on the purchase and maintenance of various kinds of geospatial software. To address these problems, a cloud-based geospatial collaboration platform called GeoSquare was designed and implemented. The platform serves as a geoportal encouraging geospatial data, information, and knowledge sharing through highly interactive and expressive graphic interfaces. Researchers and teachers can solve their problems effectively in this one-stop solution. Functions, specific design and implementation details are presented in this paper. The site of GeoSquare is: http://geosquare.tianditu.com/

  8. Geospatial Data Management Platform for Urban Groundwater

    Science.gov (United States)

    Gaitanaru, D.; Priceputu, A.; Gogu, C. R.

    2012-04-01

    Due to the large number of civil works projects and research studies, large quantities of geo-data are produced for urban environments. These data are usually redundant and are spread across different institutions and private companies. Time-consuming operations such as data processing and information harmonisation are the main reasons the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. The underground structures (subway lines, deep foundations, underground parking, and others), the urban facility networks (sewer systems, water supply networks, heating conduits, etc.), the drainage systems, the surface water works and many others are continuously modified. As a consequence, their influence on groundwater changes systematically. However, these activities provide a large quantity of data, so aquifer modelling and behaviour prediction can be done using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages such as GML, GeoSciML, WaterML, GWML, CityML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania), an integrated platform for groundwater geospatial data management is being developed under the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis

  9. Recent Advances in Geospatial Visualization with the New Google Earth

    Science.gov (United States)

    Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.

    2017-12-01

    Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
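
    For context, the tile pyramids the extension targets follow the standard Web Mercator XYZ scheme, in which each zoom level doubles the number of tiles per axis. The short sketch below shows the usual index math for locating the tile that covers a longitude/latitude at a given zoom level; it illustrates the general scheme only, not Google Earth's specific KML extension.

      # Standard Web Mercator (XYZ) tile-pyramid math: which tile covers a lon/lat at a zoom level.
      # Generic scheme only; not the KML extension described above.
      import math

      def lonlat_to_tile(lon_deg: float, lat_deg: float, zoom: int):
          lat_rad = math.radians(lat_deg)
          n = 2 ** zoom                                   # tiles per axis at this zoom level
          x = int((lon_deg + 180.0) / 360.0 * n)
          y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
          return x, y

      print(lonlat_to_tile(-122.08, 37.42, 12))           # -> (659, 1588)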

  10. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    Science.gov (United States)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    Twenty-first-century geoscience faces the challenges of Big Data, spikes in computing requirements (e.g., when a natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With the flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discoveries. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities that enables the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform as a service (PaaS) packages. Based on these developed common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This

  11. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin

    2010-03-01

    Geospatial information systems provide an abundance of information for researchers and scientists. Unfortunately this type of data can usually only be analyzed a few megapixels at a time, giving researchers a very narrow view into these voluminous data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented levels expediting analysis, interrogation, and discovery. ©2010 IEEE.

  12. Technologies Connotation and Developing Characteristics of Open Geospatial Information Platform

    Directory of Open Access Journals (Sweden)

    GUO Renzhong

    2016-02-01

    Based on the background of developments in surveying, mapping and geoinformation, and aimed at the demands of data fusion, real-time sharing, in-depth processing and personalization, this paper analyzes significant features of geospatial services in the digital city and focuses on the theory, methods and key techniques of an open cloud computing environment, multi-path data updating, full-scale urban geocoding, multi-source spatial data integration, adaptive geo-processing and adaptive Web mapping. On this basis, the Open Geospatial Information Platform is developed and successfully implemented in digital Shenzhen.

  13. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin; Kuester, Falk

    2010-01-01

    data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented

  14. Sextant: Visualizing time-evolving linked geospatial data

    NARCIS (Netherlands)

    C. Nikolaou (Charalampos); K. Dogani (Kallirroi); K. Bereta (Konstantina); G. Garbis (George); M. Karpathiotakis (Manos); K. Kyzirakos (Konstantinos); M. Koubarakis (Manolis)

    2015-01-01

    The linked open data cloud is constantly evolving as datasets get continuously updated with newer versions. As a result, representing, querying, and visualizing the temporal dimension of linked data is crucial. This is especially important for geospatial datasets that form the backbone

  15. A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data

    Science.gov (United States)

    Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.

    2017-12-01

    Big Data is becoming a norm in geoscience domains. A platform capable of efficiently managing, accessing, analyzing, mining, and learning from big data to extract new information and knowledge is desired. This paper introduces our latest effort on developing such a platform based on our past years of experience with cloud and high-performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with proper network, computer, and storage systems; b) the 2nd layer is a cloud computing layer based on virtualization to provide on-demand computing services for upper layers; c) the 3rd layer consists of big data containers that are customized for dealing with different types of data and functionalities; d) the 4th layer is a big data presentation layer that supports the efficient management, access, analysis, mining and learning of big geospatial data.

  16. SWOT analysis on National Common Geospatial Information Service Platform of China

    Science.gov (United States)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, the surveying and mapping of China is inevitably shifting from 4D product services to services centered on NCGISPC (the National Common Geospatial Information Service Platform of China). Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are increasingly obvious, due to the emerging requirements of e-government construction, the remarkable development of IT technology and the growing online geospatial service demands of various lines of business. NCGISPC, which aims to provide authoritative online one-stop geospatial information services and APIs for further development to government, business and the public, is now the strategic core of SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using a SWOT (Strength, Weakness, Opportunity and Threat) analysis compared to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode of geospatial information in China, and it will surely have a great impact not only on the construction of digital China, but also on the way that everyone uses geospatial information services.

  17. High performance geospatial and climate data visualization using GeoJS

    Science.gov (United States)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations: • To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics • To develop an extensible library that can combine data from multiple sources and render them using multiple backends • To build a library that works well with existing scientific visualization tools such as VTK. We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring

  18. KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery

    Science.gov (United States)

    Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan

    2013-05-01

    KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multi-scale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high-throughput, wide-format video, also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results and apply geospatial visual analytic tools on the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms are available to assist the analyst and increase human effectiveness.

  19. Mapping a Difference: The Power of Geospatial Visualization

    Science.gov (United States)

    Kolvoord, B.

    2015-12-01

    Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students that want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GST's to hone their spatial thinking skills and to do extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of the use of these technologies on students' spatial thinking skills, and discuss the success and challenges of developing K-12 partnerships centered on geospatial visualization.

  20. A Platform for Scalable Satellite and Geospatial Data Analysis

    Science.gov (United States)

    Beneke, C. M.; Skillman, S.; Warren, M. S.; Kelton, T.; Brumby, S. P.; Chartrand, R.; Mathis, M.

    2017-12-01

    At Descartes Labs, we use the commercial cloud to run global-scale machine learning applications over satellite imagery. We have processed over 5 Petabytes of public and commercial satellite imagery, including the full Landsat and Sentinel archives. By combining open-source tools with a FUSE-based filesystem for cloud storage, we have enabled a scalable compute platform that has demonstrated reading over 200 GB/s of satellite imagery into cloud compute nodes. In one application, we generated global 15m Landsat-8, 20m Sentinel-1, and 10m Sentinel-2 composites from 15 trillion pixels, using over 10,000 CPUs. We recently created a public open-source Python client library that can be used to query and access preprocessed public satellite imagery from within our platform, and made this platform available to researchers for non-commercial projects. In this session, we will describe how you can use the Descartes Labs Platform for rapid prototyping and scaling of geospatial analyses and demonstrate examples in land cover classification.

  1. VISA: AN AUTOMATIC AWARE AND VISUAL AIDS MECHANISM FOR IMPROVING THE CORRECT USE OF GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    J. H. Hong

    2016-06-01

    With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of “differences” implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be respectively used for presenting analyzed results, namely, the virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion, where standardized metadata serve as the known facts about the selected geospatial resources, algorithms analyze the differences in temporality and quality of the geospatial resources, and the transformation of analyzed results into visual aids is executed automatically. It successfully presents a new way of bridging the communication gaps between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it fails to provide a friendly and correct working platform.

  2. A PUBLIC PLATFORM FOR GEOSPATIAL DATA SHARING FOR DISASTER RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    S. Balbo

    2014-01-01

    This paper presents a case study scenario of setting up a Web platform based on GeoNode. The platform, called MASDAP, is public and promoted by the Government of Malawi in order to support the development of the country and build resilience against natural disasters. A substantial amount of geospatial data has already been collected about hydrogeological risk, as well as other disaster-related information. Moreover, this platform will help to ensure that the data created by a number of past or ongoing projects are maintained and that this information remains accessible and useful. An Integrated Flood Risk Management Plan for a river basin has already been included in the platform, and other data from future disaster risk management projects will be added as well.

  3. New Techniques for Deep Learning with Geospatial Data using TensorFlow, Earth Engine, and Google Cloud Platform

    Science.gov (United States)

    Hancher, M.

    2017-12-01

    Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems like cloud detection, building detection, and land cover classification, as well as more novel problems like illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.
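
    As a hedged illustration of the modeling half of such a workflow, the sketch below defines a small Keras convolutional network for classifying multi-band image patches (for example, patches exported from Earth Engine). The patch size, band count, class count and the random stand-in data are assumptions; the export and cloud-training steps described above are omitted.

      # Minimal Keras CNN for classifying multi-band image patches. Patch size (64x64),
      # band count (6), class count (4) and the random stand-in data are assumptions.
      import numpy as np
      import tensorflow as tf

      num_bands, patch, classes = 6, 64, 4

      model = tf.keras.Sequential([
          tf.keras.layers.Input(shape=(patch, patch, num_bands)),
          tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Conv2D(64, 3, activation="relu", padding="same"),
          tf.keras.layers.GlobalAveragePooling2D(),
          tf.keras.layers.Dense(classes, activation="softmax"),
      ])
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

      # Stand-in random arrays; in practice these would be exported training patches and labels.
      x = np.random.rand(32, patch, patch, num_bands).astype("float32")
      y = np.random.randint(0, classes, size=(32,))
      model.fit(x, y, epochs=1, batch_size=8)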

  4. 3D geospatial visualizations: Animation and motion effects on spatial objects

    Science.gov (United States)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

    Digital Elevation Models (DEMs), in combination with high-quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an amazing navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (JavaScript) makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). Going one step beyond, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be attributed to 3D models. However, major GIS-based functionalities in combination with all the above-mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) as well as motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this, we developed and made available to the research community an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models on user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.
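
    One generic, standards-based way to express an object moving along a georeferenced path is KML's gx:Track element (time-stamped coordinates that earth browsers can animate). The sketch below writes such a track from Python; the waypoints are hypothetical and this is not the prototype's own code.

      # Write a minimal KML gx:Track: a time-stamped georeferenced path that earth browsers
      # can animate. The waypoints (lon, lat, alt, ISO time) are hypothetical.
      waypoints = [
          (23.720, 37.970, 120.0, "2018-02-01T10:00:00Z"),
          (23.725, 37.972, 135.0, "2018-02-01T10:01:00Z"),
          (23.731, 37.975, 150.0, "2018-02-01T10:02:00Z"),
      ]

      whens = "\n".join(f"<when>{t}</when>" for _, _, _, t in waypoints)
      coords = "\n".join(f"<gx:coord>{lon} {lat} {alt}</gx:coord>" for lon, lat, alt, _ in waypoints)

      kml = f"""<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:gx="http://www.google.com/kml/ext/2.2">
        <Placemark>
          <name>moving object</name>
          <gx:Track>
            <altitudeMode>relativeToGround</altitudeMode>
      {whens}
      {coords}
          </gx:Track>
        </Placemark>
      </kml>"""

      with open("track.kml", "w") as f:
          f.write(kml)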

  5. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    Science.gov (United States)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility of the market-dominating ArcGIS software stack and Linux operating system. This manuscript details a cross-platform geospatial library "arc4nix" to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run on the native Python environment. It uses functional programming and meta-programming language to dynamically construct Python codes containing actual geospatial calculations, send them to a server and retrieve results. Arc4nix allows users to employ their arcpy-based script in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arcpy scales linearly in a distributed environment. Arc4nix is open-source software.
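
    Because arc4nix is described as exposing an arcpy-compatible API, an existing arcpy script should in principle run against it with little more than a changed import. The sketch below shows a standard arcpy-style geoprocessing call; the arc4nix import line, workspace path and dataset names are assumptions for illustration.

      # Standard arcpy-style buffer operation. The abstract describes arc4nix as arcpy-compatible,
      # so a script like this could in principle run against it with the import swapped.
      # The arc4nix import, workspace path and dataset names below are assumptions.
      try:
          import arc4nix as arcpy      # assumed drop-in module name on Linux/HPC
      except ImportError:
          import arcpy                 # classic ArcGIS Python site package

      arcpy.env.workspace = "C:/data/storm.gdb"      # hypothetical workspace
      arcpy.Buffer_analysis("rain_gauges", "rain_gauges_buf", "10 Kilometers")
      print(arcpy.GetMessages())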

  6. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    Science.gov (United States)

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-04-01

    With the acceleration of China's informatization process, our party and government have taken a substantive stride in advancing the development and application of digital technology, which promotes the evolution of e-government and its informatization. Meanwhile, as a service mode based on innovative resources, cloud computing may connect huge pools together to provide a variety of IT services, and has become a relatively mature technical pattern with further studies and massive practical applications. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified and physically dispersed fundamental database, and developed a national integrated information database system supporting main e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.

  7. Lsiviewer 2.0 - a Client-Oriented Online Visualization Tool for Geospatial Vector Data

    Science.gov (United States)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualization has predominantly been carried out through applications that are installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model for data handling, data rendering and visualization has been the most prevalent approach in Web-GIS. While client devices have become functionally more powerful over recent years, the above model has largely ignored this and remains a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer - a simple, easy-to-use and robust online geospatial data visualisation system for the user's own data that harnesses the client's capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system can support multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render the vector data using LSIViewer is comparable to a desktop GIS application, QGIS, over an identical system.

  8. Web mapping system for complex processing and visualization of environmental geospatial datasets

    Science.gov (United States)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services as well as client applications turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata information. It should be noted that modern web mapping systems as integrated geoportal applications are developed based on SOA and might be considered as complexes of interconnected software tools for working with geospatial data. In the report a complex web mapping system including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive is presented. The GIS web client consists of three basic tiers: 1. a tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format; 2. a tier of JavaScript objects implementing methods handling NetCDF metadata, a Task XML object for configuring user calculations, input and output formats, and OGC WMS/WFS cartographical services; 3. a graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic. The metadata tier consists of a number of JSON objects containing technical information describing geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.). The middleware tier of JavaScript objects implementing methods for handling geospatial
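
    For reference, the cartographic tier above speaks standard OGC WMS; a GetMap request of the kind such a client issues can be reproduced in a few lines (the endpoint, layer name and bounding box below are placeholders, not the system's actual services).

      # Generic OGC WMS 1.3.0 GetMap request; endpoint, layer and bounding box are placeholders.
      import requests

      params = {
          "service": "WMS",
          "version": "1.3.0",
          "request": "GetMap",
          "layers": "air_temperature",     # assumed layer name
          "styles": "",
          "crs": "EPSG:4326",
          "bbox": "40,60,60,120",          # WMS 1.3.0 axis order for EPSG:4326: minLat,minLon,maxLat,maxLon
          "width": 800,
          "height": 400,
          "format": "image/png",
      }
      response = requests.get("https://example.org/geoserver/wms", params=params, timeout=60)
      with open("map.png", "wb") as f:
          f.write(response.content)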

  9. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    Science.gov (United States)

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  10. Automating Geospatial Visualizations with Smart Default Renderers for Data Exploration Web Applications

    Science.gov (United States)

    Ekenes, K.

    2017-12-01

    This presentation will outline the process of creating a web application for exploring large amounts of scientific geospatial data using modern automated cartographic techniques. Traditional cartographic methods, including data classification, may inadvertently hide geospatial and statistical patterns in the underlying data. This presentation demonstrates how to use smart web APIs that quickly analyze the data when it loads and provide suggestions for the most appropriate visualizations based on the statistics of the data. Since there are just a few ways to visualize any given dataset well, it is imperative to provide smart default color schemes tailored to the dataset as opposed to static defaults. Because many users never go beyond default values, providing them with smart default visualizations is essential. Multiple functions for automating visualizations are available in the Smart APIs, along with UI elements allowing users to create more than one visualization for a dataset, since there isn't a single best way to visualize a given dataset. Since bivariate and multivariate visualizations are particularly difficult to create effectively, this automated approach takes the guesswork out of the process and provides a number of ways to generate multivariate visualizations for the same variables. This allows the user to choose which visualization is most appropriate for their presentation. The methods used in these APIs and the renderers generated by them are not available elsewhere. The presentation will show how statistics can be used as the basis for automating default visualizations of data along continuous ramps, creating more refined visualizations while revealing the spread and outliers of the data. Adding interactive components to instantaneously alter visualizations allows users to unearth spatial patterns previously unknown among one or more variables. These applications may focus on a single dataset that is frequently updated, or configurable
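
    The underlying idea of a statistics-driven default can be sketched independently of any particular API: derive continuous color-ramp stops from summary statistics of the field rather than from fixed class breaks. The one-standard-deviation spread and the colors below are illustrative assumptions, not the Smart APIs' actual defaults.

      # Sketch of a statistics-driven continuous color ramp: stops at mean - 1 std dev, mean,
      # and mean + 1 std dev, clamped to the data range. Colors and spread are illustrative.
      import numpy as np

      def smart_ramp(values):
          v = np.asarray(values, dtype=float)
          mean, std = v.mean(), v.std()
          lo, hi = v.min(), v.max()
          stops = [max(lo, mean - std), mean, min(hi, mean + std)]
          colors = ["#2b83ba", "#ffffbf", "#d7191c"]      # blue -> yellow -> red
          return list(zip(stops, colors))

      field = np.random.lognormal(mean=2.0, sigma=0.8, size=1000)   # skewed stand-in data
      for value, color in smart_ramp(field):
          print(f"{value:8.2f} -> {color}")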

  11. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    Science.gov (United States)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and run these models in real time. A geospatial data platform called Physical Analytics Information Repository and Services (PAIRS) is developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and re-gridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that is a combination of location and time stamp. The indexing allows quick access to data sets that are part of the global data layers, retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to digest, curate, and analyze the data sets while being very robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is the ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data
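
    The nested-grid, location-plus-time indexing described above behaves much like a quadkey. The sketch below computes such a key for a latitude/longitude at a given level and appends a timestamp; it is purely an illustration of the idea, not the actual PAIRS key layout.

      # Quadkey-style index for a nested global grid whose resolution doubles at each level,
      # combined with a timestamp. Illustrative only; not the actual PAIRS key layout.
      from datetime import datetime, timezone

      def grid_key(lat: float, lon: float, level: int, when: datetime) -> str:
          x = (lon + 180.0) / 360.0            # normalize longitude to [0, 1)
          y = (90.0 - lat) / 180.0             # normalize latitude to [0, 1)
          digits = []
          for _ in range(level):
              x, y = x * 2, y * 2
              ix, iy = int(x), int(y)          # quadrant at this level
              digits.append(str(iy * 2 + ix))
              x, y = x - ix, y - iy
          return "".join(digits) + "@" + when.strftime("%Y%m%d%H%M")

      print(grid_key(41.0, -73.7, 16, datetime(2015, 7, 1, 12, 0, tzinfo=timezone.utc)))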

  12. From Analysis to Impact: Challenges and Outcomes from Google's Cloud-based Platforms for Analyzing and Leveraging Petapixels of Geospatial Data

    Science.gov (United States)

    Thau, D.

    2017-12-01

    For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools.

  13. Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)

    Science.gov (United States)

    Hancher, M.

    2013-12-01

    Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
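
    A minimal sketch of this just-in-time, server-side computation model using the Earth Engine Python API is shown below; it assumes an authenticated Earth Engine account, and the collection ID, date range and band name are example choices rather than anything specific to the talk.

      # Earth Engine Python API sketch: build a low-cloud Landsat 8 median composite server-side
      # and pull back only a small statistic. Requires an authenticated Earth Engine account.
      import ee

      ee.Initialize()

      point = ee.Geometry.Point([-122.26, 37.87])
      collection = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                    .filterBounds(point)
                    .filterDate("2020-01-01", "2021-01-01")
                    .filter(ee.Filter.lt("CLOUD_COVER", 20)))

      composite = collection.median()
      stat = composite.select("SR_B5").reduceRegion(
          reducer=ee.Reducer.mean(), geometry=point.buffer(500), scale=30)

      print(stat.getInfo())      # computation runs on Google's servers; only the result is returned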

  14. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    Science.gov (United States)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real time is a task that pushes the boundaries of computing hardware and software. But the integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allows easy integration of climate datasets with geospatial datasets and provides sophisticated visualization and analysis capabilities. The objective for TrikeND-iGlobe is the continued building of an open source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis - locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc
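
    Pulling a georeferenced slice out of a NetCDF climate file, the kind of dataset such a globe client ingests, takes only a few lines with xarray; the file name and the variable and coordinate names below are assumptions about the dataset's CF-style conventions.

      # Read one time slice of a variable from a CF-style NetCDF file and report its extent.
      # File name and variable/coordinate names ("tas", "time", "lat", "lon") are assumptions.
      import xarray as xr

      ds = xr.open_dataset("tas_monthly.nc")      # hypothetical surface-temperature file
      field = ds["tas"].isel(time=0)              # first time step

      print(field.shape)
      print("lat range:", float(ds["lat"].min()), "to", float(ds["lat"].max()))
      print("lon range:", float(ds["lon"].min()), "to", float(ds["lon"].max()))

      field.to_netcdf("tas_slice.nc")             # slice ready for overlay on a virtual globe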

  15. Visualization and Ontology of Geospatial Intelligence

    Science.gov (United States)

    Chan, Yupo

    Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced in the prevalence of location-based services, as afforded by the ubiquitous cell phone usage. It is also manifested by the popularity of such internet engines as Google Earth. As we commute to work, travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.

  16. Authoring Tours of Geospatial Data With KML and Google Earth

    Science.gov (United States)

    Barcay, D. P.; Weiss-Malik, M.

    2008-12-01

    As virtual globes become widely adopted by the general public, the use of geospatial data has expanded greatly. With the popularization of Google Earth and other platforms, GIS systems have become virtual reality platforms. Using these platforms, a casual user can easily explore the world, browse massive data-sets, create powerful 3D visualizations, and share those visualizations with millions of people using the KML language. This technology has raised the bar for professionals and academics alike. It is now expected that studies and projects will be accompanied by compelling, high-quality visualizations. In this new landscape, a presentation of geospatial data can be the most effective form of advertisement for a project: engaging both the general public and the scientific community in a unified interactive experience. On the other hand, merely dumping a dataset into a virtual globe can be a disorienting, alienating experience for many users. To create an effective, far-reaching presentation, an author must take care to make their data approachable to a wide variety of users with varying knowledge of the subject matter, expertise in virtual globes, and attention spans. To that end, we present techniques for creating self-guided interactive tours of data represented in KML and visualized in Google Earth. Using these methods, we provide the ability to move the camera through the world while dynamically varying the content, style, and visibility of the displayed data. Such tours can automatically guide users through massive, complex datasets: engaging a broad user-base, and conveying subtle concepts that aren't immediately apparent when viewing the raw data. To the casual user these techniques result in an extremely compelling experience similar to watching video. Unlike video though, these techniques maintain the rich interactive environment provided by the virtual globe, allowing users to explore the data in detail and to add other data sources to the presentation.
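
    The tours described above are expressed with the gx:Tour element of the KML extensions namespace: a scripted playlist of camera moves that Google Earth plays back like video while the globe stays fully interactive. The sketch below writes a minimal two-stop tour from Python; the coordinates, ranges and durations are arbitrary example values.

      # Write a minimal KML gx:Tour with two smooth camera moves. Coordinates, ranges and
      # durations are arbitrary example values.
      stops = [
          # (lon, lat, range_m, duration_s)
          (-155.29, 19.41, 50000, 6.0),
          (-155.61, 19.48, 15000, 8.0),
      ]

      fly_tos = "\n".join(
          f"<gx:FlyTo><gx:duration>{dur}</gx:duration><gx:flyToMode>smooth</gx:flyToMode>"
          f"<LookAt><longitude>{lon}</longitude><latitude>{lat}</latitude>"
          f"<tilt>60</tilt><range>{rng}</range></LookAt></gx:FlyTo>"
          for lon, lat, rng, dur in stops)

      kml = f"""<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:gx="http://www.google.com/kml/ext/2.2">
        <gx:Tour>
          <name>example tour</name>
          <gx:Playlist>
      {fly_tos}
          </gx:Playlist>
        </gx:Tour>
      </kml>"""

      with open("tour.kml", "w") as f:
          f.write(kml)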

  17. Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform

    Science.gov (United States)

    Shrestha, S. R.; Viswambharan, V.; Doshi, A.

    2017-12-01

    Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications across society, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS for scientific data management, processing, and analysis, as well as for creating information products from large volumes of data using image server technology, are becoming widely used in earth science and across other domains. We will discuss the challenges the geospatial science community faces with big data and how we have addressed these challenges in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on-premises or in the cloud), disseminate them dynamically, process and analyze them on the fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing using raster function capabilities can be extended to create persisted data and information products using raster analytics
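
    To illustrate the general idea of on-the-fly raster processing with deferred raster functions (a conceptual sketch only, not Esri's raster function API), the following Python fragment chains an operation over a raster source and evaluates it only for the tile window that a client requests.

      import numpy as np

      class ArraySource:
          """In-memory stand-in for an image service or mosaic dataset."""
          def __init__(self, array):
              self.array = array
          def read(self, window):
              (r0, r1), (c0, c1) = window
              return self.array[r0:r1, c0:c1]

      class RasterFunction:
          """Deferred operation; nothing is computed until a tile is read."""
          def __init__(self, func, source):
              self.func, self.source = func, source
          def read(self, window):
              return self.func(self.source.read(window))

      def stretch(lo, hi):
          return lambda a: np.clip((a - lo) / (hi - lo), 0.0, 1.0)

      # Chain: source -> contrast stretch, evaluated per requested window.
      src = ArraySource(np.random.default_rng(0).normal(280.0, 10.0, (1024, 1024)))
      chain = RasterFunction(stretch(260.0, 300.0), src)
      tile = chain.read(((0, 256), (0, 256)))
      print(tile.shape, float(tile.min()), float(tile.max()))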

  18. Time-varying spatial data integration and visualization: 4 Dimensions Environmental Observations Platform (4-DEOS)

    Science.gov (United States)

    Paciello, Rossana; Coviello, Irina; Filizzola, Carolina; Genzano, Nicola; Lisi, Mariano; Mazzeo, Giuseppe; Pergola, Nicola; Sileo, Giancanio; Tramutoli, Valerio

    2014-05-01

    In environmental studies, the integration of heterogeneous and time-varying data is a very common requirement for investigating, and possibly visualizing, correlations among physical parameters underlying the dynamics of complex phenomena. Datasets used in such applications often have different spatial and temporal resolutions. In some cases superimposition of asynchronous layers is required. Traditionally, the platforms used to perform spatio-temporal visual data analyses allow users to overlay spatial data, managing time with a 'snapshot' data model in which each stack of layers is labeled with a different time. But this kind of architecture incorporates neither temporal indexing nor the third spatial dimension, which is usually given as an independent additional layer. Conversely, the full representation of a generic environmental parameter P(x,y,z,t) in the 4D space-time domain would allow handling asynchronous datasets as well as less traditional data products (e.g. vertical sections, point time-series, etc.). In this paper we present the 4 Dimensions Environmental Observation Platform (4-DEOS), a system based on a Client-Broker-Server web services architecture. This platform is a new open source solution for both timely access and easy integration and visualization of heterogeneous (maps, vertical profiles or sections, point time series, etc.), asynchronous, geospatial products. The innovative aspect of the 4-DEOS system is that users can analyze data/products individually while moving through time, with the possibility of pausing the display of some data/products to focus on other parameters and better study their temporal evolution. The platform gives the opportunity to choose between two distinct display modes, by time interval or by single instant. Users can choose to visualize data/products in two ways: i) showing each parameter in a dedicated window or ii) visualizing all parameters overlapped in a single window. A sliding time bar allows
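
    The fragment below sketches, with synthetic xarray data, the kind of 4D handling described above: a 6-hourly parameter is aligned onto an hourly time axis, then a vertical section and a point time series are extracted. All coordinate values, variable names and resolutions are invented for illustration.

      import numpy as np
      import pandas as pd
      import xarray as xr

      # Two asynchronous parameters on the same grid (synthetic).
      t_hourly = pd.date_range("2014-01-01", periods=48, freq="h")
      t_6h = pd.date_range("2014-01-01", periods=8, freq="6h")
      lat = np.linspace(40.0, 41.0, 11)
      lon = np.linspace(15.0, 16.0, 11)
      z = np.linspace(0.0, 5000.0, 21)          # vertical levels in metres

      p1 = xr.DataArray(np.random.rand(48, 11, 11),
                        coords=[t_hourly, lat, lon],
                        dims=["time", "lat", "lon"], name="P1")
      p2 = xr.DataArray(np.random.rand(8, 21, 11, 11),
                        coords=[t_6h, z, lat, lon],
                        dims=["time", "z", "lat", "lon"], name="P2")

      # Align the 6-hourly parameter onto the hourly time axis.
      p2_hourly = p2.interp(time=p1.time)

      # Two of the less traditional products mentioned above.
      section = p2_hourly.sel(time="2014-01-01T12:00").sel(lat=40.5, method="nearest")
      point_ts = p1.sel(lat=40.5, lon=15.5, method="nearest")
      print(section.dims, point_ts.sizes)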

  19. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    Science.gov (United States)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics are playing an increasingly vital role in generating new, critical insights into the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; support for uncertainty visualization and the exploration of data provenance; and machine-learning-driven discovery to render diverse types of geospatial data and facilitate interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, PostgreSQL, as well as other custom-built software modules.
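
    Dynamic time warping is one of the analysis methods listed above; as a reference point, the short sketch below implements the classic quadratic-time DTW distance between two 1-D series in plain numpy (illustrative data only, not part of the platform's code).

      import numpy as np

      def dtw_distance(a, b):
          """Classic O(len(a) * len(b)) dynamic time warping distance."""
          n, m = len(a), len(b)
          cost = np.full((n + 1, m + 1), np.inf)
          cost[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  d = abs(a[i - 1] - b[j - 1])
                  cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
          return cost[n, m]

      # e.g. comparing two seasonal series sampled at slightly different rates.
      x = np.sin(np.linspace(0.0, 3.0 * np.pi, 36))
      y = np.sin(np.linspace(0.4, 3.0 * np.pi + 0.4, 40))
      print(round(dtw_distance(x, y), 3))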

  20. The National 3-D Geospatial Information Web-Based Service of Korea

    Science.gov (United States)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis tools and be able to exploit all capabilities of the third dimension, including visualization. Currently, many human activities are moving toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, military applications, etc. To reflect this trend, the Korean government has started to construct 3D geospatial data and a service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service for people who are interested in this industry; we introduce not only the present state of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data has been constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, level of detail (LOD) 4 data, i.e. photo-realistic textured 3D models with corresponding ortho photographs, were constructed for six metropolitan cities and Dokdo (an island belonging to Korea) in 2012. In this paper, we present the web-based 3D map service system composition and infrastructure, and a comparison of V-World with the Google Earth service. We also present Open API based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent an invasion of privacy, we processed image blurring, elimination and camouflage. The importance of public-private cooperation and advanced geospatial information policy is emphasized in Korea. Thus, the progress of

  1. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    Science.gov (United States)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which imposes major limitations in many ways, starting from limited processing power, storage, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first one (VM1) runs on Amazon Web Services (AWS) and the second one (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application provides a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application serves as a collaborative geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, and works in a distributed computing environment; it creates a real-time multiuser collaboration platform, its programming-language code and components are interoperable, and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  2. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    Science.gov (United States)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications solves important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, and Multicore offer the necessary functionalities to solve important problems in the Earth Science domain: storage, distribution, management, processing and security of geospatial data, execution of complex processing through task and data parallelism, etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, but also the development of standardized and specialized tools for storing, analyzing, processing and visualizing the geospatial data concerning this area. To achieve these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, geospatial Web services standardized by the Open Geospatial Consortium (OGC) and others, on parallel and distributed architectures to maximize performance. This presentation analyzes the integration and execution of geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project on different use cases, such as the execution of geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and Multicore architectures [3]. The current
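
    As a minimal illustration of the data-parallel case on a multicore machine (not the enviroGRIDS platform itself), the sketch below splits a raster into tiles and processes them in parallel with Python's multiprocessing module; the per-tile operation is a placeholder.

      from multiprocessing import Pool
      import numpy as np

      def process_tile(tile):
          """Stand-in for a per-tile geospatial operation (index, filter, ...)."""
          return float(tile.mean())

      def split_tiles(raster, size):
          for r in range(0, raster.shape[0], size):
              for c in range(0, raster.shape[1], size):
                  yield raster[r:r + size, c:c + size]

      if __name__ == "__main__":
          raster = np.random.rand(2048, 2048)
          with Pool(processes=4) as pool:
              results = pool.map(process_tile, list(split_tiles(raster, 512)))
          print(len(results), "tiles processed")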

  3. Visual guidance of mobile platforms

    Science.gov (United States)

    Blissett, Rodney J.

    1993-12-01

    Two systems are described and results presented demonstrating aspects of real-time visual guidance of autonomous mobile platforms. The first approach incorporates prior knowledge in the form of rigid geometrical models linking visual references within the environment. The second approach is based on a continuous synthesis of information extracted from image tokens to generate a coarse-grained world model, from which potential obstacles are inferred. The use of these techniques in workplace applications is discussed.

  4. Three-D Google Earth based geospatial visualization tool for the smart grid distribution

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, K. [Enterprise Horizons, Fremont, CA (United States)

    2009-07-01

    Smart grids can be used to liberalize markets, ensure reliability and reduce the environmental footprint of electric utilities. This presentation discussed a geospatial visualization tool for smart grid distribution. The tool can be used to visualize transmission lines and substations, and is capable of displaying millions of topographical components. It was designed to track and monitor the health of assets and to increase awareness of vulnerabilities, vegetation, and regional demographics. The tool is also capable of identifying potential issues before a rolling blackout situation as well as anticipating islanding spike events. Views can be segmented by population and industrial belts, and the tool is able to provide diagnostics on power factor turbulence for congestion bottlenecks. When used for transmission line and substation siting, the tool can provide terrain feasibility analyses and environmental impact analyses. Weather-based demand forecasting can be used to determine critical customers impacted by potential outages. CAD drawings can be used to visualize assets in virtual reality and can be linked to consumer indexing and smart metering initiatives. It was concluded that the web-based tool can also be used for workforce and dispatch management. tabs., figs.

  5. Diagnosing Geospatial Uncertainty Visualization Challenges in Seasonal Temperature and Precipitation Forecasts

    Science.gov (United States)

    Speciale, A.; Kenney, M. A.; Gerst, M.; Baer, A. E.; DeWitt, D.; Gottschalk, J.; Handel, S.

    2017-12-01

    The uncertainty of future weather and climate conditions is important for many decisions made in communities and economic sectors. One tool that decision-makers use in gauging this uncertainty is forecasts, especially maps (or visualizations) of probabilistic forecast results. However, visualizing geospatial uncertainty is challenging because including probability introduces an extra variable to represent, and probability is often poorly understood by users. Using focus group and survey methods, this study seeks to understand the barriers to using probabilistic temperature and precipitation visualizations for specific decisions in the agriculture, energy, emergency management, and water resource sectors. Preliminary results shown here focus on findings about emergency manager needs. Our experimental design uses the National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Center (CPC) climate outlooks, which provide probabilistic temperature and precipitation forecast visualizations at the 6-10 day, 8-14 day, 3-4 week, and 1 and 3 month timeframes. Users were asked to complete questions about how they use weather information, how uncertainty is represented, and the design elements (e.g., color, contour lines) of the visualizations. Preliminary results from the emergency management sector indicate there is significant confusion about how "normal" weather is defined, the boundaries between probability ranges, and the meaning of the contour lines. After a complete understandability diagnosis is made using results from all sectors, we will collaborate with CPC to suggest modifications to the climate outlook visualizations. These modifications will then be retested in similar focus groups and web-based surveys to confirm that they better meet the needs of users.

  6. Web-Based Geospatial Visualization of GPM Data with CesiumJS

    Science.gov (United States)

    Lammers, Matt

    2018-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology have made online visualization of large geospatial datasets, such as those coming from precipitation satellites, viable. These data benefit from being visualized on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS (http://cesiumjs.org), developed by Analytical Graphics, Inc., leverages WebGL to do just that. This presentation will describe how CesiumJS has been used in three-dimensional visualization products developed as part of the NASA Precipitation Processing System (PPS) STORM data-order website. Existing methods of interacting with Global Precipitation Measurement (GPM) Mission data primarily focus on two-dimensional static images, whether displaying vertical slices or horizontal surface/height-level maps. These methods limit interactivity with the robust three-dimensional data coming from the GPM core satellite. Integrating the data with CesiumJS in a web-based user interface has allowed us to create the following products. We have linked an on-the-fly visualization tool for any GPM/partner satellite orbit with the data-order interface. A version of this tool also focuses on high-impact weather events. It enables viewing of combined radar and microwave-derived precipitation data on mobile devices and in a way that can be embedded into other websites. We have also used CesiumJS to visualize a method of integrating gridded precipitation data with modeled wind speeds, animated over time. Emphasis in the presentation will be placed on how a variety of technical methods were used to create these tools, and how the flexibility of the CesiumJS framework facilitates creative approaches to interacting with the data.
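
    Cesium can ingest time-tagged scenes through CZML, a JSON-based format; the sketch below writes a minimal CZML document from Python with one illustrative point above the surface (the coordinates, heights and identifiers are invented, not real GPM values).

      import json

      czml = [
          {"id": "document", "name": "GPM overpass sketch", "version": "1.0"},
          {
              "id": "precip-sample-1",
              "description": "Illustrative reflectivity column centroid",
              "position": {"cartographicDegrees": [-80.0, 25.0, 4000.0]},
              "point": {"pixelSize": 12, "color": {"rgba": [0, 80, 255, 255]}},
          },
      ]

      with open("gpm_sample.czml", "w") as f:
          json.dump(czml, f, indent=2)
      # The resulting file can be loaded in a CesiumJS viewer, e.g. via CzmlDataSource.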

  7. GEOSPATIAL DATA PROCESSING FOR 3D CITY MODEL GENERATION, MANAGEMENT AND VISUALIZATION

    Directory of Open Access Journals (Sweden)

    I. Toschi

    2017-05-01

    Full Text Available Recent developments of 3D technologies and tools have increased availability and relevance of 3D data (from 3D points to complete city models) in the geospatial and geo-information domains. Nevertheless, the potential of 3D data is still underexploited and mainly confined to visualization purposes. Therefore, the major challenge today is to create automatic procedures that make best use of available technologies and data for the benefits and needs of public administrations (PA) and national mapping agencies (NMA) involved in “smart city” applications. The paper aims to demonstrate a step forward in this process by presenting the results of the SENECA project (Smart and SustaiNablE City from Above – http://seneca.fbk.eu). State-of-the-art processing solutions are investigated in order to (i) efficiently exploit the photogrammetric workflow (aerial triangulation and dense image matching), (ii) derive topologically and geometrically accurate 3D geo-objects (i.e. building models) at various levels of detail and (iii) link geometries with non-spatial information within a 3D geo-database management system accessible via web-based client. The developed methodology is tested on two case studies, i.e. the cities of Trento (Italy) and Graz (Austria). Both spatial (i.e. nadir and oblique imagery) and non-spatial (i.e. cadastral information and building energy consumptions) data are collected and used as input for the project workflow, starting from 3D geometry capture and modelling in urban scenarios to geometry enrichment and management within a dedicated webGIS platform.

  8. Geospatial Data Processing for 3d City Model Generation, Management and Visualization

    Science.gov (United States)

    Toschi, I.; Nocerino, E.; Remondino, F.; Revolti, A.; Soria, G.; Piffer, S.

    2017-05-01

    Recent developments of 3D technologies and tools have increased availability and relevance of 3D data (from 3D points to complete city models) in the geospatial and geo-information domains. Nevertheless, the potential of 3D data is still underexploited and mainly confined to visualization purposes. Therefore, the major challenge today is to create automatic procedures that make best use of available technologies and data for the benefits and needs of public administrations (PA) and national mapping agencies (NMA) involved in "smart city" applications. The paper aims to demonstrate a step forward in this process by presenting the results of the SENECA project (Smart and SustaiNablE City from Above - http://seneca.fbk.eu). State-of-the-art processing solutions are investigated in order to (i) efficiently exploit the photogrammetric workflow (aerial triangulation and dense image matching), (ii) derive topologically and geometrically accurate 3D geo-objects (i.e. building models) at various levels of detail and (iii) link geometries with non-spatial information within a 3D geo-database management system accessible via web-based client. The developed methodology is tested on two case studies, i.e. the cities of Trento (Italy) and Graz (Austria). Both spatial (i.e. nadir and oblique imagery) and non-spatial (i.e. cadastral information and building energy consumptions) data are collected and used as input for the project workflow, starting from 3D geometry capture and modelling in urban scenarios to geometry enrichment and management within a dedicated webGIS platform.

  9. OnSight: Multi-platform Visualization of the Surface of Mars

    Science.gov (United States)

    Abercrombie, S. P.; Menzies, A.; Winter, A.; Clausen, M.; Duran, B.; Jorritsma, M.; Goddard, C.; Lidawer, A.

    2017-12-01

    A key challenge of planetary geology is to develop an understanding of an environment that humans cannot (yet) visit. Instead, scientists rely on visualizations created from images sent back by robotic explorers, such as the Curiosity Mars rover. OnSight is a multi-platform visualization tool that helps scientists and engineers visualize the surface of Mars. Terrain visualization allows scientists to understand the scale and geometric relationships of the environment around the Curiosity rover, both for scientific understanding and for tactical consideration in safely operating the rover. OnSight includes a web-based 2D/3D visualization tool, as well as an immersive mixed reality visualization. In addition, OnSight offers a novel feature for communication among the science team. Using the multiuser feature of OnSight, scientists can meet virtually on Mars to discuss geology in a shared spatial context. Combining web-based visualization with immersive visualization allows OnSight to leverage the strengths of both platforms. This project demonstrates how 3D visualization can be adapted to either an immersive environment or a computer screen; the presentation will discuss advantages and disadvantages of both platforms.

  10. NebHydro: Sharing Geospatial Data to Support Water Management in Nebraska

    Science.gov (United States)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydrological variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux- and Python-based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers. The system provides online access, querying, visualization, and analysis of the hydrological data from several sources

  11. Geospatial Services in Special Libraries: A Needs Assessment Perspective

    Science.gov (United States)

    Barnes, Ilana

    2013-01-01

    Once limited to geographers and mapmakers, Geographic Information Systems (GIS) have taken on a growing central role in information management and visualization. Geospatial services run a gamut of different products and services, from Google Maps to ArcGIS servers to mobile development. Geospatial services are not new. Libraries have been writing about…

  12. The Geospatial Data Cloud: An Implementation of Applying Cloud Computing in Geosciences

    Directory of Open Access Journals (Sweden)

    Xuezhi Wang

    2014-11-01

    Full Text Available The rapid growth in the volume of remote sensing data and its increasing computational requirements bring huge challenges for researchers as traditional systems cannot adequately satisfy the huge demand for service. Cloud computing has the advantage of high scalability and reliability, which can provide firm technical support. This paper proposes a highly scalable geospatial cloud platform named the Geospatial Data Cloud, which is constructed based on cloud computing. The architecture of the platform is first introduced, and then two subsystems, the cloud-based data management platform and the cloud-based data processing platform, are described.  ––– This paper was presented at the First Scientific Data Conference on Scientific Research, Big Data, and Data Science, organized by CODATA-China and held in Beijing on 24-25 February, 2014.

  13. Multi-source Geospatial Data Analysis with Google Earth Engine

    Science.gov (United States)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
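
    A typical server-side chain of the kind described above might look like the hedged Python sketch below, which builds a Landsat 8 median composite around a point and pulls back a small regional mean; it assumes the earthengine-api package, prior authentication, and that the dataset ID "LANDSAT/LC08/C02/T1_L2" and band names SR_B5/SR_B4 are still current.

      import ee

      ee.Initialize()  # assumes prior ee.Authenticate() / project configuration

      point = ee.Geometry.Point([-122.3, 37.9])
      collection = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                    .filterBounds(point)
                    .filterDate("2020-01-01", "2021-01-01"))

      # Server-side reduction over time, then a band-math product.
      composite = collection.median()
      ndvi = composite.normalizedDifference(["SR_B5", "SR_B4"]).rename("NDVI")

      # Bring only a small summary back to the client.
      stats = ndvi.reduceRegion(reducer=ee.Reducer.mean(),
                                geometry=point.buffer(5000), scale=30)
      print(stats.getInfo())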

  14. Automated geospatial Web Services composition based on geodata quality requirements

    Science.gov (United States)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
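
    A toy version of the rule-based quality gating idea is sketched below: each rule is a predicate over a dataset's metadata, and a nonconformity selects the alternative branch that a conditional plan would encode. The rule names, thresholds and metadata keys are invented for illustration and do not come from the paper.

      from dataclasses import dataclass
      from typing import Callable

      @dataclass
      class QualityRule:
          name: str
          check: Callable[[dict], bool]

      RULES = [
          QualityRule("positional_accuracy<=10m",
                      lambda m: m.get("positional_accuracy_m", 1e9) <= 10),
          QualityRule("cloud_cover<=20%",
                      lambda m: m.get("cloud_cover_pct", 100) <= 20),
      ]

      def next_step(metadata: dict) -> str:
          violated = [r.name for r in RULES if not r.check(metadata)]
          if violated:
              # Conditional branch: fall back to an alternative dataset/service.
              return "request alternative dataset (violated: " + ", ".join(violated) + ")"
          return "continue nominal service chain"

      print(next_step({"positional_accuracy_m": 4.0, "cloud_cover_pct": 35}))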

  15. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    Full Text Available The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value-added information products that answer a wide range of questions. This article provides an overview of OMAR web-based geospatial processing. OMAR is part of the Open Source Software Image Map (OSSIM) project under the Open Source Geospatial Foundation. The primary contributors to OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example of how open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near-term development.

  16. A Python Geospatial Language Toolkit

    Science.gov (United States)

    Fillmore, D.; Pletzer, A.; Galloy, M.

    2012-12-01

    The volume and scope of geospatial data archives, such as collections of satellite remote sensing or climate model products, have been rapidly increasing and will continue to do so in the near future. The recently launched (October 2011) Suomi National Polar-orbiting Partnership (NPP) satellite, for instance, is the first of a new generation of Earth observation platforms that will monitor the atmosphere, oceans, and ecosystems, and its suite of instruments will generate several terabytes each day in the form of multi-spectral images and derived datasets. Full exploitation of such data for scientific analysis and decision support applications has become a major computational challenge. Geophysical data exploration and knowledge discovery could benefit, in particular, from intelligent mechanisms for extracting and manipulating subsets of data relevant to the problem of interest. Potential developments include enhanced support for natural language queries and directives to geospatial datasets. The translation of natural language (that is, human spoken or written phrases) into complex but unambiguous objects and actions can be based on a context, or knowledge domain, that represents the underlying geospatial concepts. This poster describes a prototype Python module that maps English phrases onto basic geospatial objects and operations. This module, along with the associated computational geometry methods, enables the resolution of natural language directives that include geographic regions of arbitrary shape and complexity.
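
    In the same spirit as the prototype described above (though not its actual code), the toy resolver below maps a couple of English spatial phrases onto structured geospatial operations; the grammar, gazetteer and place coordinates are invented for illustration.

      import re

      GAZETTEER = {"boulder": (-105.27, 40.01), "nairobi": (36.82, -1.29)}

      def resolve(phrase: str) -> dict:
          """Map a small set of English spatial phrases to geospatial operations."""
          text = phrase.lower()
          m = re.match(r"within (\d+) km of (\w+)", text)
          if m:
              radius_km, place = int(m.group(1)), m.group(2)
              lon, lat = GAZETTEER[place]
              return {"op": "buffer", "center": (lon, lat), "radius_km": radius_km}
          m = re.match(r"bounding box (\S+) to (\S+)", text)
          if m:
              return {"op": "bbox", "corners": (m.group(1), m.group(2))}
          raise ValueError("unrecognized phrase: " + phrase)

      print(resolve("within 100 km of Boulder"))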

  17. Mapping the world: cartographic and geographic visualization by the United Nations Geospatial Information Section (formerly Cartographic Section)

    Science.gov (United States)

    Kagawa, Ayako; Le Sourd, Guillaume

    2018-05-01

    Within United Nations Secretariat activities, mapping began in 1946, and by 1951 the need for maps had increased and an office with a team of cartographers was established. Since then, with the development of technologies including the internet, remote sensing, unmanned aerial systems, relational database management and information systems, geospatial information has provided an ever-increasing variety of support to the work of the Organization for planning of operations, decision-making and monitoring of crises. However, the need for maps has remained intact. This presentation aims to highlight some of the cartographic representation styles used over the decades by reviewing the evolution of selected maps produced by the office, and noting the changing cognitive and semiotic aspects of cartographic and geographic visualization required by the United Nations. Through presentation and analysis of these maps, the changing dynamics of the Organization's information management can be traced, with a reminder of the continuing and expanding deconstructionist role of the cartographer, now a geospatial information management expert.

  18. Developing a Cloud-Based Online Geospatial Information Sharing and Geoprocessing Platform to Facilitate Collaborative Education and Research

    Science.gov (United States)

    Yang, Z. L.; Cao, J.; Hu, K.; Gui, Z. P.; Wu, H. Y.; You, L.

    2016-06-01

    Efficient online discovery and application of geospatial information resources (GIRs) is critical in the Earth Science domain as well as for cross-disciplinary applications. However, achieving this is challenging due to the heterogeneity, complexity and privacy of online GIRs. In this article, GeoSquare, a collaborative online geospatial information sharing and geoprocessing platform, was developed to tackle this problem. Specifically, (1) GIR registration and multi-view query functions allow users to publish and discover GIRs more effectively. (2) Online geoprocessing and real-time execution status checking help users process data and conduct analysis without pre-installing cumbersome professional tools on their own machines. (3) A service chain orchestration function enables domain experts to contribute and share their domain knowledge with community members through workflow modeling. (4) User inventory management allows registered users to collect and manage their own GIRs, monitor their execution status, and track their own geoprocessing histories. In addition, to enhance the flexibility and capacity of GeoSquare, distributed storage and cloud computing technologies are employed. To support interactive teaching and training, GeoSquare adopts rich internet application (RIA) technology to create a user-friendly graphical user interface (GUI). Results show that GeoSquare can integrate and foster collaboration among dispersed GIRs, computing resources and people. Subsequently, educators and researchers can share and exchange resources in an efficient and harmonious way.

  19. Applying Geospatial Technologies for International Development and Public Health: The USAID/NASA SERVIR Program

    Science.gov (United States)

    Hemmings, Sarah; Limaye, Ashutosh; Irwin, Dan

    2011-01-01

    Background: SERVIR -- the Regional Visualization and Monitoring System -- helps people use Earth observations and predictive models based on data from orbiting satellites to make timely decisions that benefit society. SERVIR operates through a network of regional hubs in Mesoamerica, East Africa, and the Hindu Kush-Himalayas. USAID and NASA support SERVIR, with the long-term goal of transferring SERVIR capabilities to the host countries. Objective/Purpose: The purpose of this presentation is to describe how the SERVIR system helps the SERVIR regions address eight areas of societal benefit identified by the Group on Earth Observations (GEO): health, disasters, ecosystems, biodiversity, weather, water, climate, and agriculture. This presentation will describe environmental health applications of data in the SERVIR system, as well as ongoing and future efforts to incorporate additional health applications into the SERVIR system. Methods: This presentation will discuss how the SERVIR Program makes environmental data available for use in environmental health applications. SERVIR accomplishes its mission by providing member nations with access to geospatial data and predictive models, information visualization, training and capacity building, and partnership development. SERVIR conducts needs assessments in partner regions, develops custom applications of Earth observation data, and makes NASA and partner data available through an online geospatial data portal at SERVIRglobal.net. Results: Decision makers use SERVIR to improve their ability to monitor air quality, extreme weather, biodiversity, and changes in land cover. In the past several years, the system has been used over 50 times to respond to environmental threats such as wildfires, floods, landslides, and harmful algal blooms. Given that the SERVIR regions are experiencing increased stress under greater climate variability than historically observed, SERVIR provides information to support the development of

  20. Geospatial Information Service System Based on GeoSOT Grid & Encoding

    Directory of Open Access Journals (Sweden)

    LI Shizhong

    2016-12-01

    Full Text Available With the rapid development of space and earth observation technology, it is important to establish a multi-source, multi-scale and unified cross-platform reference for global data. In practice, the production and maintenance of geospatial data are scattered across different units, and the standard of the data grid varies between departments and systems. All of this results in a disunity of standards across different historical periods and organizations. For the geospatial information security library of the national high-resolution earth observation program, there are demands for global display, associated retrieval, template applications and other integrated services for geospatial data. Based on the GeoSOT grid and encoding theory system, a data subdivision organization solution for globally unified grid-encoding management of the geospatial information security library has been proposed, and system-level analysis, research and design have been carried out. The experimental results show that the data organization and management method based on GeoSOT can significantly improve the overall efficiency of the geospatial information security service system.
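
    The core idea of a globally unified, hierarchical cell code can be illustrated with a generic quadtree key, as in the sketch below; note this is a Bing-style quadkey used only as an analogy, not the actual GeoSOT encoding.

      def quadkey(lon: float, lat: float, level: int) -> str:
          """Generic hierarchical grid code for a point (analogy only, not GeoSOT)."""
          x = (lon + 180.0) / 360.0          # normalize longitude to [0, 1)
          y = (90.0 - lat) / 180.0           # normalize latitude to [0, 1)
          digits = []
          for _ in range(level):
              x *= 2.0
              y *= 2.0
              digits.append(str((int(x) % 2) + 2 * (int(y) % 2)))
          return "".join(digits)

      # Cell code for central Beijing at subdivision level 12.
      print(quadkey(116.39, 39.91, 12))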

  1. Geospatial Data as a Service: Towards planetary scale real-time analytics

    Science.gov (United States)

    Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.

    2017-12-01

    The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data. Different interfaces have been defined to provide data services; unfortunately, there is considerable divergence among the standards, protocols and data models, which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observation, climate and weather forecasting are examples of communities which generate large amounts of geospatial data. The NCI has invested significant effort in developing a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform where new infrastructures can be built. One of the key capabilities these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach for geospatial data discovery and delivery based on OGC standards. We will present the architecture and motivating use-cases that drove GSKY's collaborative design, development and production deployment. We show our approach offers the community valuable exploratory
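
    Because GSKY speaks standard OGC interfaces, a client needs nothing more than an HTTP request; the hedged sketch below issues a WMS 1.3.0 GetMap call with the requests package. The endpoint URL, layer name and TIME value are placeholders, not actual GSKY services.

      import requests

      WMS_URL = "https://example.org/ows"        # placeholder endpoint
      params = {
          "SERVICE": "WMS",
          "VERSION": "1.3.0",
          "REQUEST": "GetMap",
          "LAYERS": "surface_temperature",       # assumed layer name
          "STYLES": "",
          "CRS": "EPSG:4326",
          # WMS 1.3.0 with EPSG:4326 uses lat/lon axis order: minLat,minLon,maxLat,maxLon
          "BBOX": "-44.0,112.0,-10.0,154.0",
          "WIDTH": 1024,
          "HEIGHT": 768,
          "FORMAT": "image/png",
          "TIME": "2017-06-01T00:00:00Z",        # time-enabled layers accept a TIME dimension
      }

      resp = requests.get(WMS_URL, params=params, timeout=60)
      resp.raise_for_status()
      with open("getmap.png", "wb") as f:
          f.write(resp.content)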

  2. Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India

    Science.gov (United States)

    Mohan, M.

    2016-06-01

    In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. Geospatial information is processed through geospatial technologies, which are playing important roles in the development of smart cities, particularly in developing countries of the world like India. The study is based on the latest available geospatial satellite imagery, which is multi-date, multi-stage, multi-sensor, and multi-resolution. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems for 3-D geovisualisation, geospatial digital mapping and geospatial analysis for the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information helps in 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides the platform for information visualisation, which is also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state level planners and policy makers better understand and address issues attributed to cities, using the geospatial information from satellite imagery for geovisualisation of Smart Cities in an emerging and developing country, India.

  3. Visualizing NASA's Planetary Data with Google Earth

    Science.gov (United States)

    Beyer, R. A.; Hancher, M. D.; Broxton, M.; Weiss-Malik, M.; Gorelick, N.; Kolb, E.

    2008-12-01

    There is a vast store of planetary geospatial data that has been collected by NASA but is difficult to access and visualize. As a 3D geospatial browser, the Google Earth client is one way to visualize planetary data. KML imagery super-overlays enable us to create a non-Earth planetary globe within Google Earth, and conversion of planetary meta-data allows display of the footprint locations of various higher-resolution data sets. Once our group, or any group, performs these data conversions the KML can be made available on the Web, where anyone can download it and begin using it in Google Earth (or any other geospatial browser), just like a Web page. Lucian Plesea at JPL offers several KML basemaps (MDIM, colorized MDIM, MOC composite, THEMIS day time infrared, and both grayscale and colorized MOLA). We have created TES Thermal Inertia maps, and a THEMIS night time infrared overlay, as well. Many data sets for Mars have already been converted to KML. We provide coverage polygons overlaid on the globe, whose icons can be clicked on and lead to the full PDS data URL. We have built coverage maps for the following data sets: MOC narrow angle, HRSC imagery and DTMs, SHARAD tracks, CTX, and HiRISE. The CRISM team is working on providing their coverage data via publicly-accessible KML. The MSL landing site process is also providing data for potential landing sites via KML. The Google Earth client and KML allow anyone to contribute data for everyone to see via the Web. The Earth sciences community is already utilizing KML and Google Earth in a variety of ways as a geospatial browser, and we hope that the planetary sciences community will do the same. Using this paradigm for sharing geospatial data will not only enable planetary scientists to more easily build and share data within the scientific community, but will also provide an easy platform for public outreach and education efforts, and will easily allow anyone to layer geospatial information on top of planetary data

  4. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data source; interfaces for Geospatial Semantic Web, VGI (Volunteered Geographic Information) and Geospatial Semantic Web; challenges of Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use the Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.

  5. Visualizing Europe’s Refugee Crisis on the ‘Debating Europe’ Platform

    Directory of Open Access Journals (Sweden)

    Camelia Cmeciu

    2017-03-01

    Full Text Available The images picturing the refugee crisis are heavily emotion-laden and the picture of the dead boy Aylan on the beach is such an example. Besides newspapers where pictures of refugees have been used to stir the readers’ attention, debating platforms have used visual images to initiate debates with the EU citizens about Europe’s refugee crisis. Designed on a ‘bottom-up approach’, the ‘Debating Europe’ platform empowers citizens by encouraging a dialogue between Europe’s policymakers and experts, on the one hand, and citizens, on the other hand. Each debate embeds an issue to be addressed and visual images which may serve as incentives for a vivid debate. The selection of these visuals plays a significant role in the representation of a particular issue. The sample used for this qualitative analysis consists of the visual images (photographs and infographics) of nine debates on Europe’s refugee crisis (2013-2015). Since Europe’s refugee crisis is both about attributing responsibility and human interest, we will provide an integrated visual framework for our analysis. Using a qualitative content analysis of the visual images depicting the refugee crisis we want to identify (1) the types and the salience of the participants depicted, (2) the communication strategies and the (re)bordering issues used to (de)legitimate these represented participants, (3) the types of emotions used by the ‘Debating Europe’ platform to visually frame the refugee crisis.

  6. An Integrated Web-Based 3d Modeling and Visualization Platform to Support Sustainable Cities

    Science.gov (United States)

    Amirebrahimi, S.; Rajabifard, A.

    2012-07-01

    Sustainable development is seen as the key solution for preserving the sustainability of cities in the face of ongoing population growth and its negative impacts. This is complex and requires holistic and multidisciplinary decision making. A variety of stakeholders with different backgrounds also need to be considered and involved. Numerous web-based modeling and visualization tools have been designed and developed to support this process. There have been some success stories; however, the majority have failed to provide a comprehensive platform that supports the different aspects of sustainable development. In this work, in the context of SDI and Land Administration, the CSDILA Platform - a 3D visualization and modeling platform - was proposed, which can be used to model and visualize different dimensions to facilitate the achievement of sustainability, in particular in the urban context. The methodology involved the design of a generic framework for the development of an analytical and visualization tool over the web. The CSDILA Platform was then implemented using a number of technologies based on the guidelines provided by the framework. The platform has a modular structure and uses Service-Oriented Architecture (SOA). It is capable of managing spatial objects in a 4D data store and can flexibly incorporate a variety of developed models using the platform's API. Development scenarios can be modeled and tested using the analysis and modeling component in the platform, and the results are visualized in a seamless 3D environment. The platform was further tested using a number of scenarios and showed promising results and the potential to serve a wider need. In this paper, the design process of the generic framework, the implementation of the CSDILA Platform and technologies used, and also findings and future research directions will be presented and discussed.

  7. Visualization on supercomputing platform level II ASC milestone (3537-1B) results from Sandia.

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk (Kitware, Inc., Clifton Park, NY); Fabian, Nathan; Marion, Patrick (Kitware, Inc., Clifton Park, NY); Moreland, Kenneth D.

    2010-09-01

    This report provides documentation for the completion of the Sandia portion of the ASC Level II Visualization on the platform milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratories and Los Alamos National Laboratories. This milestone contains functionality required for performing visualization directly on a supercomputing platform, which is necessary for peta-scale visualization. Sandia's contribution concerns in-situ visualization, running a visualization in tandem with a solver. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. Scientific simulation on parallel supercomputers is traditionally performed in four
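
    The in-situ idea, stripped to its essence, is a solver loop that hands each timestep's field to an analysis callback instead of dumping everything to disk; the Python sketch below is purely conceptual and is not the API of any particular co-processing library.

      import numpy as np

      def analyze(step, field, summaries):
          """Lightweight in-situ reduction: keep only a per-timestep summary."""
          summaries.append({"step": step,
                            "mean": float(field.mean()),
                            "max": float(field.max())})

      def run_simulation(nsteps=100, shape=(256, 256)):
          rng = np.random.default_rng(0)
          field = np.zeros(shape)
          summaries = []
          for step in range(nsteps):
              field += rng.normal(0.0, 0.01, shape)   # stand-in for the solver update
              analyze(step, field, summaries)         # in-situ hook, no full dump
          return summaries

      print(run_simulation(5)[-1])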

  8. GEOSPATIAL INFORMATION FROM SATELLITE IMAGERY FOR GEOVISUALISATION OF SMART CITIES IN INDIA

    Directory of Open Access Journals (Sweden)

    M. Mohan

    2016-06-01

    Full Text Available In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. Geospatial information is processed through geospatial technologies, which are playing important roles in the development of smart cities, particularly in developing countries of the world like India. The study is based on the latest available geospatial satellite imagery, which is multi-date, multi-stage, multi-sensor, and multi-resolution. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems for 3-D geovisualisation, geospatial digital mapping and geospatial analysis for the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information helps in 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides the platform for information visualisation, which is also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state level planners and policy makers better understand and address issues attributed to cities, using the geospatial information from satellite imagery for geovisualisation of Smart Cities in an emerging and developing country, India.

  9. 3D Geospatial Models for Visualization and Analysis of Groundwater Contamination at a Nuclear Materials Processing Facility

    Science.gov (United States)

    Stirewalt, G. L.; Shepherd, J. C.

    2003-12-01

    Analysis of hydrostratigraphy and uranium and nitrate contamination in groundwater at a former nuclear materials processing facility in Oklahoma were undertaken employing 3-dimensional (3D) geospatial modeling software. Models constructed played an important role in the regulatory decision process of the U.S. Nuclear Regulatory Commission (NRC) because they enabled visualization of temporal variations in contaminant concentrations and plume geometry. Three aquifer systems occur at the site, comprised of water-bearing fractured shales separated by indurated sandstone aquitards. The uppermost terrace groundwater system (TGWS) aquifer is composed of terrace and alluvial deposits and a basal shale. The shallow groundwater system (SGWS) aquifer is made up of three shale units and two sandstones. It is separated from the overlying TGWS and underlying deep groundwater system (DGWS) aquifer by sandstone aquitards. Spills of nitric acid solutions containing uranium and radioactive decay products around the main processing building (MPB), leakage from storage ponds west of the MPB, and leaching of radioactive materials from discarded equipment and waste containers contaminated both the TGWS and SGWS aquifers during facility operation between 1970 and 1993. Constructing 3D geospatial property models for analysis of groundwater contamination at the site involved use of EarthVision (EV), a 3D geospatial modeling software developed by Dynamic Graphics, Inc. of Alameda, CA. A viable 3D geohydrologic framework model was initially constructed so property data could be spatially located relative to subsurface geohydrologic units. The framework model contained three hydrostratigraphic zones equivalent to the TGWS, SGWS, and DGWS aquifers in which groundwater samples were collected, separated by two sandstone aquitards. Groundwater data collected in the three aquifer systems since 1991 indicated high concentrations of uranium (>10,000 micrograms/liter) and nitrate (> 500 milligrams

  10. Two Contrasting Approaches to Building High School Teacher Capacity to Teach About Local Climate Change Using Powerful Geospatial Data and Visualization Technology

    Science.gov (United States)

    Zalles, D. R.

    2011-12-01

    The presentation will compare and contrast two different place-based approaches to helping high school science teachers use geospatial data visualization technology to teach about climate change in their local regions. The approaches are being used in the development, piloting, and dissemination of two projects for high school science led by the author: the NASA-funded Data-enhanced Investigations for Climate Change Education (DICCE) and the NSF funded Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE). DICCE is bringing an extensive portal of Earth observation data, the Goddard Interactive Online Visualization and Analysis Infrastructure, to high school classrooms. STORE is making available data for viewing results of a particular IPCC-sanctioned climate change model in relation to recent data about average temperatures, precipitation, and land cover for study areas in central California and western New York State. Across the two projects, partner teachers of academically and ethnically diverse students from five states are participating in professional development and pilot testing. Powerful geospatial data representation technologies are difficult to implement in high school science because of challenges that teachers and students encounter navigating data access and making sense of data characteristics and nomenclature. Hence, on DICCE, the researchers are testing the theory that by providing a scaffolded technology-supported process for instructional design, starting from fundamental questions about the content domain, teachers will make better instructional decisions. Conversely, the STORE approach is rooted in the perspective that co-design of curricular materials among researchers and teacher partners that work off of "starter" lessons covering focal skills and understandings will lead to the most effective utilizations of the technology in the classroom. The projects' goals and strategies for student

  11. Geospatial Visualization of Scientific Data Through Keyhole Markup Language

    Science.gov (United States)

    Wernecke, J.; Bailey, J. E.

    2008-12-01

    The development of virtual globes has provided a fun and innovative tool for exploring the surface of the Earth. However, it has been the parallel maturation of Keyhole Markup Language (KML) that has created a new medium and perspective through which to visualize scientific datasets. KML was originally created by Keyhole Inc., which was acquired by Google in 2004; in 2007, KML was handed over to the Open Geospatial Consortium (OGC). It became an OGC international standard on 14 April 2008, and has subsequently been adopted by all major geobrowser developers (e.g., Google, Microsoft, ESRI, NASA) and many smaller ones (e.g., Earthbrowser). By making KML a standard at a relatively young stage in its evolution, developers of the language are seeking to avoid the issues that plagued the early World Wide Web and the development of Hypertext Markup Language (HTML). The popularity and utility of Google Earth, in particular, have been enhanced by KML features such as the Smithsonian volcano layer and the dynamic weather layers. Through KML, users can view real-time earthquake locations (USGS), view animations of polar sea-ice coverage (NSIDC), or read about the daily activities of chimpanzees (Jane Goodall Institute). Perhaps even more powerful is the fact that any user can create, edit, and share their own KML, with little or no knowledge of computer code. We present an overview of the best current scientific uses of KML and a guide to how scientists can learn to use KML themselves.
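
    To give a sense of how compact KML is, the short Python sketch below writes a single placemark file that any KML-aware geobrowser should be able to open; the coordinates, names, and output path are invented for illustration and are not drawn from any of the datasets mentioned above.

        # Sketch: generate a minimal KML placemark file (all values are illustrative).
        KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <Document>
            <Placemark>
              <name>{name}</name>
              <description>{description}</description>
              <Point>
                <coordinates>{lon},{lat},0</coordinates>
              </Point>
            </Placemark>
          </Document>
        </kml>
        """

        def write_placemark(path, name, description, lon, lat):
            # KML orders coordinates as longitude,latitude[,altitude].
            with open(path, "w", encoding="utf-8") as f:
                f.write(KML_TEMPLATE.format(name=name, description=description,
                                            lon=lon, lat=lat))

        if __name__ == "__main__":
            # Hypothetical example: mark an earthquake epicentre for a geobrowser.
            write_placemark("epicentre.kml", "M5.1 earthquake",
                            "Illustrative event location", -122.42, 37.77)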

  12. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    Science.gov (United States)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

    With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources among these massive and heterogeneous resources. The developments of past decades have witnessed the operation of many service components that facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge, for the following reasons: 1) Entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata, and it is hard for end users to learn all of these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and the overhead of maintaining data consistency. 3) Heterogeneous semantics complicate data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues; however, integrating semantic technologies with existing services is challenging due to the expandability limitations of the service frameworks and metadata templates. 4) The capabilities that help users make a final selection are inadequate. Most existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results. Furthermore, the presentation of the value

  13. Knowledge Discovery for Smart Grid Operation, Control, and Situation Awareness -- A Big Data Visualization Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gu, Yi; Jiang, Huaiguang; Zhang, Yingchen; Zhang, Jun Jason; Gao, Tianlu; Muljadi, Eduard

    2016-11-21

    In this paper, a big data visualization platform is designed to discover hidden, useful knowledge for smart grid (SG) operation, control and situation awareness. The proliferation of smart sensors at both the grid side and the customer side can provide large volumes of heterogeneous data that collect information across all time scales. Extracting useful knowledge from this big-data pool is still challenging. In this paper, Apache Spark, an open source cluster computing framework, is used to process the big data to effectively discover the hidden knowledge. A high-speed communication architecture utilizing the Open System Interconnection (OSI) model is designed to transmit the data to a visualization platform. This visualization platform uses Google Earth, a global geographic information system (GIS), to link the geographical information with the SG knowledge and visualize the information in a user-defined fashion. The University of Denver's campus grid is used as an SG test bench, and several demonstrations are presented for the proposed platform.
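
    As a rough sketch of the kind of Spark job such a platform might run (the HDFS paths, column names, and hourly roll-up below are hypothetical and not taken from the University of Denver test bench), raw smart-meter readings could be condensed into per-feeder averages like this:

        # Sketch: aggregate hypothetical smart-meter readings with Apache Spark (PySpark).
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("sg-knowledge-discovery").getOrCreate()

        # Hypothetical CSV of sensor readings: feeder_id, timestamp, voltage, power_kw
        readings = spark.read.csv("hdfs:///sg/meter_readings.csv",
                                  header=True, inferSchema=True)

        # Roll the raw stream up to hourly averages per feeder -- the sort of
        # condensed knowledge a GIS front end such as Google Earth could display.
        hourly = (readings
                  .withColumn("hour", F.date_trunc("hour", F.col("timestamp")))
                  .groupBy("feeder_id", "hour")
                  .agg(F.avg("voltage").alias("avg_voltage"),
                       F.avg("power_kw").alias("avg_power_kw")))

        hourly.write.mode("overwrite").parquet("hdfs:///sg/hourly_summary")
        spark.stop()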

  14. The geospatial web: how geobrowsers, social software and the Web 2.0 are shaping the network society

    CERN Document Server

    Scharl, Arno; Tochtermann, Klaus

    2007-01-01

    The Geospatial Web will have a profound impact on managing knowledge, structuring work flows within and across organizations, and communicating with like-minded individuals in virtual communities. The enabling technologies for the Geospatial Web are geo-browsers such as NASA World Wind, Google Earth and Microsoft Live Local 3D. These three-dimensional platforms revolutionize the production and consumption of media products. They not only reveal the geographic distribution of Web resources and services, but also bring together people of similar interests, browsing behavior, or geographic location. This book summarizes the latest research on the Geospatial Web's technical foundations, describes information services and collaborative tools built on top of geo-browsers, and investigates the environmental, social and economic impacts of geospatial applications. The role of contextual knowledge in shaping the emerging network society deserves particular attention. By integrating geospatial and semantic technology, ...

  15. VAAPA: a web platform for visualization and analysis of alternative polyadenylation.

    Science.gov (United States)

    Guan, Jinting; Fu, Jingyi; Wu, Mingcheng; Chen, Longteng; Ji, Guoli; Quinn Li, Qingshun; Wu, Xiaohui

    2015-02-01

    Polyadenylation [poly(A)] is an essential process during the maturation of most mRNAs in eukaryotes. Alternative polyadenylation (APA) as an important layer of gene expression regulation has been increasingly recognized in various species. Here, a web platform for visualization and analysis of alternative polyadenylation (VAAPA) was developed. This platform can visualize the distribution of poly(A) sites and poly(A) clusters of a gene or a section of a chromosome. It can also highlight genes with switched APA sites among different conditions. VAAPA is an easy-to-use web-based tool that provides functions of poly(A) site query, data uploading, downloading, and APA sites visualization. It was designed in a multi-tier architecture and developed based on Smart GWT (Google Web Toolkit) using Java as the development language. VAAPA will be a valuable addition to the community for the comprehensive study of APA, not only by making the high quality poly(A) site data more accessible, but also by providing users with numerous valuable functions for poly(A) site analysis and visualization. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Not Just a Game … When We Play Together, We Learn Together: Interactive Virtual Environments and Gaming Engines for Geospatial Visualization

    Science.gov (United States)

    Shipman, J. S.; Anderson, J. W.

    2017-12-01

    An ideal tool for ecologists and land managers to investigate the impacts of both projected environmental changes and policy alternatives is the creation of immersive, interactive, virtual landscapes. As a new frontier in visualizing and understanding geospatial data, virtual landscapes require a new toolbox for data visualization that includes traditional GIS tools and less common tools such as the Unity3D game engine. Game engines provide capabilities not only to explore data but also to build and interact with dynamic models collaboratively. These virtual worlds can be used to display and illustrate data in ways that are often more understandable and plausible to both stakeholders and policy makers than traditional maps. Within this context we will present funded research that has been developed utilizing virtual landscapes for geographic visualization and decision support among varied stakeholders. We will highlight the challenges and lessons learned when developing interactive virtual environments that require large multidisciplinary team efforts with varied competencies. The results will emphasize the importance of visualization and interactive virtual environments and the link with emerging research disciplines within Visual Analytics.

  17. SVIP-N 1.0: An integrated visualization platform for neutronics analysis

    International Nuclear Information System (INIS)

    Luo Yuetong; Long Pengcheng; Wu Guoyong; Zeng Qin; Hu Liqin; Zou Jun

    2010-01-01

    Post-processing is an important part of neutronics analysis, and SVIP-N 1.0 (scientific visualization integrated platform for neutronics analysis) is designed to ease the post-processing of neutronics analysis through visualization technologies. The main capabilities of SVIP-N 1.0 include: (1) the ability to manage neutronics analysis results; (2) the ability to preprocess neutronics analysis results; (3) the ability to visualize neutronics analysis result data in different ways. The paper describes the system architecture and main features of SVIP-N, some of the advanced visualization techniques used in SVIP-N 1.0, and some preliminary applications, such as ITER.

  18. Developing a distributed HTML5-based search engine for geospatial resource discovery

    Science.gov (United States)

    ZHOU, N.; XIA, J.; Nebert, D.; Yang, C.; Gui, Z.; Liu, K.

    2013-12-01

    With explosive growth of data, Geospatial Cyberinfrastructure(GCI) components are developed to manage geospatial resources, such as data discovery and data publishing. However, the efficiency of geospatial resources discovery is still challenging in that: (1) existing GCIs are usually developed for users of specific domains. Users may have to visit a number of GCIs to find appropriate resources; (2) The complexity of decentralized network environment usually results in slow response and pool user experience; (3) Users who use different browsers and devices may have very different user experiences because of the diversity of front-end platforms (e.g. Silverlight, Flash or HTML). To address these issues, we developed a distributed and HTML5-based search engine. Specifically, (1)the search engine adopts a brokering approach to retrieve geospatial metadata from various and distributed GCIs; (2) the asynchronous record retrieval mode enhances the search performance and user interactivity; (3) the search engine based on HTML5 is able to provide unified access capabilities for users with different devices (e.g. tablet and smartphone).
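
    A minimal sketch of the brokering idea is shown below; the catalogue endpoints, query parameter, and JSON field names are placeholders rather than the interfaces of any actual GCI, but the pattern of fanning a keyword query out in parallel and merging the returned records is the essence of the approach.

        # Sketch: fan a keyword search out to several (hypothetical) catalogue endpoints
        # in parallel and merge the results, in the spirit of a brokering search engine.
        from concurrent.futures import ThreadPoolExecutor, as_completed
        import requests

        CATALOGUES = [
            "https://catalogue-a.example.org/search",   # placeholder endpoints
            "https://catalogue-b.example.org/search",
        ]

        def query_catalogue(url, keyword):
            # Assumes each catalogue accepts ?q=<keyword> and returns JSON records.
            resp = requests.get(url, params={"q": keyword}, timeout=10)
            resp.raise_for_status()
            return resp.json().get("records", [])

        def broker_search(keyword):
            merged = []
            with ThreadPoolExecutor(max_workers=len(CATALOGUES)) as pool:
                futures = {pool.submit(query_catalogue, url, keyword): url
                           for url in CATALOGUES}
                for future in as_completed(futures):
                    try:
                        merged.extend(future.result())
                    except requests.RequestException:
                        pass  # a slow or failing catalogue should not block the others
            return merged

        if __name__ == "__main__":
            print(len(broker_search("land cover")))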

  19. National Geospatial Program

    Science.gov (United States)

    Carswell, William J.

    2011-01-01

    The National Geospatial Program (NGP; http://www.usgs.gov/ngpo/) satisfies the needs of customers by providing geospatial products and services that customers incorporate into their decisionmaking and operational activities. These products and services provide geospatial data that are organized and maintained in cost-effective ways and developed by working with partners and organizations whose activities align with those of the program. To accomplish its mission, the NGP organizes, maintains, publishes, and disseminates the geospatial baseline of the Nation's topography, natural landscape, and manmade environment through The National Map.

  20. Bridging the Gap between NASA Hydrological Data and the Geospatial Community

    Science.gov (United States)

    Rui, Hualan; Teng, Bill; Vollmer, Bruce; Mocko, David M.; Beaudoing, Hiroko K.; Nigro, Joseph; Gary, Mark; Maidment, David; Hooper, Richard

    2011-01-01

    There is a vast and ever-increasing amount of data on the Earth's interconnected energy and hydrological systems available from NASA remote sensing and modeling systems, and yet one challenge persists: increasing the usefulness of these data for, and thus their use by, the geospatial communities. The Hydrology Data and Information Services Center (HDISC), part of the Goddard Earth Sciences DISC, has continually worked to better understand the hydrological data needs of geospatial end users, to thus be better able to bridge the gap between NASA data and the geospatial communities. This paper will cover some of the hydrological data sets available from HDISC, and the various tools and services developed for data searching, data subsetting, format conversion, online visualization and analysis, interoperable access, etc., to facilitate the integration of NASA hydrological data by end users. The NASA Goddard data analysis and visualization system, Giovanni, is described. Two case examples of user-customized data services are given, involving the EPA BASINS (Better Assessment Science Integrating point & Non-point Sources) project and the CUAHSI Hydrologic Information System, with the common requirement of on-the-fly retrieval of long-duration time series for a geographical point.

  1. Geospatial Services Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: To process, store, and disseminate geospatial data to the Department of Defense and other Federal agencies. DESCRIPTION: The Geospatial Services Laboratory...

  2. Open cyberGIS software for geospatial research and education in the big data era

    Science.gov (United States)

    Wang, Shaowen; Liu, Yan; Padmanabhan, Anand

    CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new generation of GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) to serve the various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.

  3. Open cyberGIS software for geospatial research and education in the big data era

    Directory of Open Access Journals (Sweden)

    Shaowen Wang

    2016-01-01

    Full Text Available CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new generation of GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) to serve the various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.

  4. expVIP: a Customizable RNA-seq Data Analysis and Visualization Platform.

    Science.gov (United States)

    Borrill, Philippa; Ramirez-Gonzalez, Ricardo; Uauy, Cristobal

    2016-04-01

    The majority of transcriptome sequencing (RNA-seq) expression studies in plants remain underutilized and inaccessible due to the use of disparate transcriptome references and the lack of skills and resources to analyze and visualize these data. We have developed expVIP, an expression visualization and integration platform, which allows easy analysis of RNA-seq data combined with an intuitive and interactive interface. Users can analyze public and user-specified data sets with minimal bioinformatics knowledge using the expVIP virtual machine. This generates a custom Web browser to visualize, sort, and filter the RNA-seq data and provides outputs for differential gene expression analysis. We demonstrate expVIP's suitability for polyploid crops and evaluate its performance across a range of biologically relevant scenarios. To exemplify its use in crop research, we developed a flexible wheat (Triticum aestivum) expression browser (www.wheat-expression.com) that can be expanded with user-generated data in a local virtual machine environment. The open-access expVIP platform will facilitate the analysis of gene expression data from a wide variety of species by enabling the easy integration, visualization, and comparison of RNA-seq data across experiments. © 2016 American Society of Plant Biologists. All Rights Reserved.

  5. An Ontology-supported Approach for Automatic Chaining of Web Services in Geospatial Knowledge Discovery

    Science.gov (United States)

    di, L.; Yue, P.; Yang, W.; Yu, G.

    2006-12-01

    Recent developments in the geospatial semantic Web have shown promise for the automatic discovery, access, and use of geospatial Web services to quickly and efficiently solve particular application problems. With semantic Web technology, it is highly feasible to construct intelligent geospatial knowledge systems that can provide answers to many geospatial application questions. A key challenge in constructing such an intelligent knowledge system is to automate the creation of a chain or process workflow that involves multiple services and highly diversified data and can generate the answer to a specific question from users. This presentation discusses an approach for automating the composition of geospatial Web service chains by employing geospatial semantics described by geospatial ontologies. It shows how ontology-based geospatial semantics are used to enable the automatic discovery, mediation, and chaining of geospatial Web services. OWL-S is used to represent the geospatial semantics of each individual Web service, the type of service it belongs to, and the types of data it can handle. The hierarchy and classification of service types are described in the service ontology. The hierarchy and classification of data types are presented in the data ontology. To answer users' geospatial questions, an Artificial Intelligence (AI) planning algorithm is used to construct the service chain by using the service and data logics expressed in the ontologies. The chain can be expressed as a graph with nodes representing services and connection weights representing degrees of semantic matching between nodes. The graph is a visual representation of the logical geo-processing path for answering users' questions. The graph can be instantiated as a physical service workflow and executed to generate the answer to a user's question. A prototype system, which includes real-world geospatial applications, has been implemented to demonstrate the concept and approach.
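
    The chaining step can be illustrated with a toy planner; the service names and input/output types below are invented and far simpler than the OWL-S descriptions the approach actually relies on, but they show how a chain can be assembled by working backwards from the requested data type.

        # Sketch: toy backward-chaining over service input/output types to build a
        # service chain (a stand-in for the ontology-driven AI planning described above).
        SERVICES = {
            # hypothetical services: name -> (required input types, produced output type)
            "reproject":      ({"raw_dem"}, "projected_dem"),
            "compute_slope":  ({"projected_dem"}, "slope_map"),
            "classify_risk":  ({"slope_map", "rainfall_grid"}, "landslide_risk_map"),
        }

        def plan(goal, available, chain=None):
            """Return an ordered list of services that derives `goal` from `available` types."""
            chain = chain or []
            if goal in available:
                return chain
            for name, (inputs, output) in SERVICES.items():
                if output == goal and name not in chain:
                    sub = chain
                    ok = True
                    for needed in inputs:
                        sub = plan(needed, available, sub)
                        if sub is None:
                            ok = False
                            break
                    if ok:
                        return sub + [name]
            return None

        if __name__ == "__main__":
            # User holds a raw DEM and a rainfall grid and asks for a landslide risk map.
            print(plan("landslide_risk_map", {"raw_dem", "rainfall_grid"}))
            # -> ['reproject', 'compute_slope', 'classify_risk']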

  6. Eodataservice.org: Big Data Platform to Enable Multi-disciplinary Information Extraction from Geospatial Data

    Science.gov (United States)

    Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.

    2017-12-01

    In 1999, US Vice-President Al Gore outlined the concept of `Digital Earth' as a multi-resolution, three-dimensional representation of the planet to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time and to access historical and forecast data in support of scientists, policy-makers, and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections, allowing data discovery, visualization, combination, processing and download. It implements a "virtual datacube" approach in which data stored at distributed data centers are made available via standardized OGC-compliant interfaces. Dedicated web-based graphical user interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter notebooks), desktop GIS tools and command line interfaces can be used to access and manipulate the data. The platform can be fully customized to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguarding of cultural heritage sites; and support to farmers and (re)insurance companies in the agricultural field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.

  7. Improving data discoverability, accessibility, and interoperability with the Esri ArcGIS Platform at the NASA Atmospheric Science Data Center (ASDC).

    Science.gov (United States)

    Tisdale, M.

    2017-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying user requirements of government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), and OGC Web Coverage Services (WCS), while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services through the development of applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript. These services provide greater exposure of ASDC data holdings to the GIS community and allow for broader sharing and distribution to various end users. These capabilities provide interactive visualization tools and improved geospatial analytical tools for mission-critical understanding in the areas of the earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.

  8. Nansat: a Scientist-Orientated Python Package for Geospatial Data Processing

    Directory of Open Access Journals (Sweden)

    Anton A. Korosov

    2016-10-01

    Full Text Available Nansat is a Python toolbox for analysing and processing 2-dimensional geospatial data, such as satellite imagery, output from numerical models, and gridded in-situ data. It is created with a strong focus on facilitating research and the development of algorithms and autonomous processing systems. Nansat extends the widely used Geospatial Data Abstraction Library (GDAL) by adding scientific meaning to the datasets through metadata, and by adding common functionality for data analysis and handling (e.g., exporting to various data formats). Nansat uses metadata vocabularies that follow international metadata standards, in particular the Climate and Forecast (CF) conventions, the NASA Directory Interchange Format (DIF), and the Global Change Master Directory (GCMD) keywords. Functionality that is commonly needed in scientific work, such as seamless access to local or remote geospatial data in various file formats, collocation of datasets from different sources and geometries, and visualization, is also built into Nansat. The paper presents Nansat workflows, its functional structure, and examples of typical applications.
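
    Nansat's own API is not reproduced here; instead, the sketch below shows the lower-level GDAL workflow that Nansat builds on and streamlines (the file name and band index are placeholders), reading one band of a raster into a NumPy array along with its geolocation metadata.

        # Sketch: the plain-GDAL workflow that Nansat wraps with scientific metadata
        # and convenience functions (file name and band index are placeholders).
        import numpy as np
        from osgeo import gdal

        gdal.UseExceptions()

        dataset = gdal.Open("sst_scene.tif")          # any GDAL-readable raster
        band = dataset.GetRasterBand(1)               # first band of the scene
        values = band.ReadAsArray().astype(np.float32)

        # Geolocation comes from the affine geotransform and projection metadata.
        geotransform = dataset.GetGeoTransform()
        projection = dataset.GetProjection()

        print(values.shape, geotransform, projection[:40])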

  9. Real-time GIS data model and sensor web service platform for environmental data management.

    Science.gov (United States)

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is meaningful for human health. In the past, environmental data management involved developing a specific environmental data management system, but this method often lacks real-time data retrieval and sharing/interoperation capabilities. With the development of information technology, a Geospatial Service Web approach has been proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. A real-time GIS (Geographic Information System) data model and a Sensor Web service platform to realize environmental data management under the Geospatial Service Web framework are proposed in this study. The real-time GIS data model manages real-time data. The Sensor Web service platform is applied to support the realization of the real-time GIS data model based on Sensor Web technologies. To support the realization of the proposed real-time GIS data model, a Sensor Web service platform is implemented. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases of real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform are realized and demonstrated. The total times for the two experiments are 3.7 s and 9.2 s, respectively. The experimental results show that the method integrating the real-time GIS data model and the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.
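
    As a sketch of the kind of Sensor Web interaction such a platform builds on, the snippet below issues an OGC Sensor Observation Service (SOS) GetObservation request over HTTP; the endpoint, offering, observed property, and response structure are placeholders, not those of the platform described above.

        # Sketch: request recent observations from a (hypothetical) OGC Sensor
        # Observation Service endpoint using the SOS 2.0 key-value-pair binding.
        import requests

        SOS_ENDPOINT = "https://sensors.example.org/sos"   # placeholder endpoint

        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": "air_quality_stations",            # placeholder offering id
            "observedProperty": "PM2.5",                    # placeholder property id
            "responseFormat": "application/json",
        }

        response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
        response.raise_for_status()
        # The JSON layout varies by server implementation; "observations" is assumed here.
        observations = response.json()
        print(len(observations.get("observations", [])))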

  10. The role of visualization in learning from computer-based images

    Science.gov (United States)

    Piburn, Michael D.; Reynolds, Stephen J.; McAuliffe, Carla; Leedy, Debra E.; Birk, James P.; Johnson, Julia K.

    2005-05-01

    Among the sciences, the practice of geology is especially visual. To assess the role of spatial ability in learning geology, we designed an experiment using: (1) web-based versions of spatial visualization tests, (2) a geospatial test, and (3) multimedia instructional modules built around QuickTime Virtual Reality movies. Students in control and experimental sections were administered measures of spatial orientation and visualization, as well as a content-based geospatial examination. All subjects improved significantly in their scores on spatial visualization and the geospatial examination. There was no change in their scores on spatial orientation. A three-way analysis of variance, with the geospatial examination as the dependent variable, revealed significant main effects favoring the experimental group and a significant interaction between treatment and gender. These results demonstrate that spatial ability can be improved through instruction, that learning of geological content will improve as a result, and that differences in performance between the genders can be eliminated.

  11. Towards Geo-spatial Information Science in Big Data Era

    Directory of Open Access Journals (Sweden)

    LI Deren

    2016-04-01

    Full Text Available Since the 1990s, with the advent of the worldwide information revolution and the development of the internet, geospatial information science has also come of age, pushing forward the building of the digital Earth and the cyber city. As we entered the 21st century, with the development and integration of global information technology and industrialization, the internet of things and cloud computing came into being, and human society entered the big data era. This article covers the key features of geospatial information science in the big data era (ubiquity, multi-dimensionality and dynamics, internet+ networking, full automation and real-time operation, from sensing to recognition, crowdsourcing and VGI, and service orientation) and addresses the key technical issues that need to be resolved (a non-linear four-dimensional Earth reference frame system, space-based enhanced GNSS, unified space-air-ground network communication techniques, on-board processing techniques for multi-source image data, smart interface service techniques for space-borne information, space-based resource scheduling and network security, and the design and development of a payload-based multi-functional satellite platform) in order to provide a new definition of geospatial information science in the big data era. Based on the discussion in this paper, the author finally proposes a new definition of geospatial information science (geomatics): geomatics is a multi-disciplinary science and technology which, using a systematic approach, integrates all the means for spatio-temporal data acquisition, information extraction, networked management, knowledge discovery, spatial sensing and recognition, as well as intelligent location-based services for any physical objects and human activities around the earth and its environment. Starting from this new definition, geospatial information science will find many more opportunities and tasks in the big data era for the generation of the smart earth and the smart city. Our profession

  12. Designing algorithm visualization on mobile platform: The proposed guidelines

    Science.gov (United States)

    Supli, A. A.; Shiratuddin, N.

    2017-09-01

    This paper describes an ongoing study of design guidelines for algorithm visualization (AV) on the mobile platform, aimed at helping students learn the data structures and algorithms (DSA) subject effectively. Our previous review indicated that design guidelines for AV on the mobile platform are still few; most previous AV guidelines were developed for desktop and website platforms. Mobile learning has been shown to enhance engagement in learning and thus affect students' performance. In addition, researchers highly recommend including UI design and interactivity when designing an effective AV system; however, the discussion of these two aspects in previous AV design guidelines is not comprehensive. UI design in this paper describes the arrangement of AV features in the mobile environment, whereas interactivity concerns the active-learning strategy features based on learning experiences (how to engage learners). Thus, the main objective of this study is to propose design guidelines for AV on the mobile platform (AVOMP) that comprehensively cover UI design and interactivity. These guidelines are developed through content analysis and comparative analysis of various related studies, and are useful for AV designers constructing AVOMP for various DSA topics.

  13. BUILDING A BILLION SPATIO-TEMPORAL OBJECT SEARCH AND VISUALIZATION PLATFORM

    Directory of Open Access Journals (Sweden)

    D. Kakkar

    2017-10-01

    Full Text Available With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.

  14. Building a Billion Spatio-Temporal Object Search and Visualization Platform

    Science.gov (United States)

    Kakkar, D.; Lewis, B.

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.
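
    A sketch of the kind of spatial query such an index supports is shown below; the Solr host, collection, and field names are placeholders rather than the BOP's actual schema, but the geofilt filter illustrates how geo-tweets within a radius of a point can be pulled from the index.

        # Sketch: query a Solr index of geo-tweets for documents within 10 km of a point,
        # using Solr's geofilt spatial filter (host, collection and fields are placeholders).
        import requests

        SOLR_SELECT = "http://localhost:8983/solr/geotweets/select"   # placeholder

        params = {
            "q": "text:flood",                                        # keyword part of the query
            "fq": "{!geofilt sfield=location pt=42.37,-71.11 d=10}",  # 10 km around Cambridge, MA
            "rows": 20,
            "wt": "json",
        }

        resp = requests.get(SOLR_SELECT, params=params, timeout=30)
        resp.raise_for_status()
        docs = resp.json()["response"]["docs"]
        print(len(docs))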

  15. Grid Enabled Geospatial Catalogue Web Service

    Science.gov (United States)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas, over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web information model, this paper proposes a new information model for the Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service for the Web (CSW), and draws on geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, as well as service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and retrieve it through data-related services that provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science, and not on issues with computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  16. Development of a Web-Based Visualization Platform for Climate Research Using Google Earth

    Science.gov (United States)

    Sun, Xiaojuan; Shen, Suhung; Leptoukh, Gregory G.; Wang, Panxing; Di, Liping; Lu, Mingyue

    2011-01-01

    Recently, it has become easier to access climate data from satellites, ground measurements, and models from various data centers. However, searching, accessing, and processing heterogeneous data from different sources are very time-consuming tasks. There is a lack of a comprehensive visual platform to acquire distributed and heterogeneous scientific data and to render processed images from a single access point for climate studies. This paper documents the design and implementation of a Web-based visual, interoperable, and scalable platform that is able to access climatological fields from models, satellites, and ground stations from a number of data sources using Google Earth (GE) as a common graphical interface. The development is based on the TCP/IP protocol and various open data-sharing technologies, such as OPeNDAP, GDS, Web Processing Service (WPS), and Web Mapping Service (WMS). The visualization capability of integrating various measurements into GE dramatically extends the awareness and visibility of scientific results. Using the geographic information embedded in GE, the designed system improves our understanding of the relationships of different elements in a four-dimensional domain. The system enables easy and convenient synergistic research on a virtual platform for professionals and the general public, greatly advancing global data sharing and scientific research collaboration.
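
    One of the building blocks mentioned above can be sketched as follows: constructing a WMS GetMap URL whose PNG result a geobrowser such as GE could overlay; the server address and layer name are placeholders, not those of the actual system.

        # Sketch: build a WMS 1.1.1 GetMap request for a map image that a geobrowser
        # such as Google Earth could overlay (server URL and layer name are placeholders).
        from urllib.parse import urlencode

        WMS_SERVER = "https://climate.example.org/wms"   # placeholder WMS endpoint

        params = {
            "service": "WMS",
            "version": "1.1.1",
            "request": "GetMap",
            "layers": "surface_air_temperature",         # placeholder layer
            "styles": "",
            "srs": "EPSG:4326",
            "bbox": "-180,-90,180,90",                   # minx,miny,maxx,maxy
            "width": 1024,
            "height": 512,
            "format": "image/png",
            "transparent": "true",
        }

        print(WMS_SERVER + "?" + urlencode(params))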

  17. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    Science.gov (United States)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

    Geospatial data resources are the foundation of the construction of a geo portal, which is designed to provide online geoinformation services for government, enterprises and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation, geo search and so on. One of the major problems we are facing is data acquisition, and for us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided the construction of the National Geospatial Information Service Platform of China (NGISP) with effective technical support. NGISP is China's official geo portal, providing online geoinformation services based on the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data therefore come from these nodes, and the different datasets are heterogeneous. According to the results of the analysis of the heterogeneous datasets, the first step is to define the basic principles of data fusion, covering the following aspects: (1) location precision; (2) geometric representation; (3) up-to-dateness; (4) attribute values; and (5) spatial relationships. The technical procedure is then researched, and the method used to process different categories of features, such as roads, railways, boundaries, rivers, settlements and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrates the applicability of the principles, procedure and method of multi-source geospatial data integration.

  18. Python geospatial development

    CERN Document Server

    Westra, Erik

    2013-01-01

    This is a tutorial-style book that teaches the use of Python tools for GIS through simple practical examples and then shows you how to build a complete mapping application from scratch. The book assumes basic knowledge of Python; no knowledge of open source GIS is required. It is aimed at experienced Python developers who want to learn about geospatial concepts, work with geospatial data, solve spatial problems, and build map-based applications. This book will be useful to those who want to get up to speed with open source GIS in order to build GIS applications or integrate geospatial features into their existing ap
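
    As a flavour of the kind of task such a book covers (the shapefile name and attribute field below are placeholders), a few lines of Python with the GDAL/OGR bindings are enough to open a vector dataset and walk through its features:

        # Sketch: read a (placeholder) shapefile with the GDAL/OGR Python bindings and
        # report each feature's name attribute and geometry type.
        from osgeo import ogr

        datasource = ogr.Open("world_cities.shp")       # placeholder dataset
        layer = datasource.GetLayer()

        print("features:", layer.GetFeatureCount())
        for feature in layer:
            geometry = feature.GetGeometryRef()
            # "NAME" is a placeholder attribute field; real data will differ.
            print(feature.GetField("NAME"), geometry.GetGeometryName())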

  19. Geospatial Authentication

    Science.gov (United States)

    Lyle, Stacey D.

    2009-01-01

    A software package has been designed to provide authentication by determining whether a rover is within a set of boundaries or a specific area before it may access critical geospatial information, using GPS signal structures as a means to authenticate mobile devices into a network wirelessly and in real time. The advantage is that the system only admits devices located within designated geospatial boundaries or areas into the server.
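
    The record does not describe the package's internal algorithm, but the core geofencing test it depends on can be sketched with a standard ray-casting point-in-polygon check; the boundary and rover coordinates below are invented examples.

        # Sketch: decide whether a GPS fix falls inside an authorized boundary using the
        # standard ray-casting point-in-polygon test (coordinates are invented examples).
        def point_in_polygon(lon, lat, polygon):
            """polygon: list of (lon, lat) vertices; returns True if the point is inside."""
            inside = False
            n = len(polygon)
            for i in range(n):
                x1, y1 = polygon[i]
                x2, y2 = polygon[(i + 1) % n]
                # Count edge crossings of a horizontal ray cast from the point.
                if (y1 > lat) != (y2 > lat):
                    x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
                    if lon < x_cross:
                        inside = not inside
            return inside

        AUTHORIZED_AREA = [(-97.45, 27.70), (-97.25, 27.70), (-97.25, 27.85), (-97.45, 27.85)]

        rover_fix = (-97.33, 27.76)   # hypothetical GPS position
        print(point_in_polygon(*rover_fix, AUTHORIZED_AREA))   # True -> grant access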

  20. Infrastructure for the Geospatial Web

    Science.gov (United States)

    Lake, Ron; Farley, Jim

    Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.

  1. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    Science.gov (United States)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application

  2. Using a Web GIS Plate Tectonics Simulation to Promote Geospatial Thinking

    Science.gov (United States)

    Bodzin, Alec M.; Anastasio, David; Sharif, Rajhida; Rutzmoser, Scott

    2016-01-01

    Learning with Web-based geographic information system (Web GIS) can promote geospatial thinking and analysis of georeferenced data. Web GIS can enable learners to analyze rich data sets to understand spatial relationships that are managed in georeferenced data visualizations. We developed a Web GIS plate tectonics simulation as a capstone learning…

  3. Geospatial Semantics and the Semantic Web

    CERN Document Server

    Ashish, Naveen

    2011-01-01

    The availability of geographic and geospatial information and services, especially on the open Web, has become abundant in the last several years with the proliferation of online maps, geo-coding services, geospatial Web services and geospatially enabled applications. The need for geospatial reasoning has significantly increased in many everyday applications, including personal digital assistants, Web search applications, location-aware mobile services, specialized systems for emergency response, medical triaging, intelligence analysis and more. Geospatial Semantics and the Semantic Web: Foundation

  4. A Geospatial Cyberinfrastructure for Urban Economic Analysis and Spatial Decision-Making

    Directory of Open Access Journals (Sweden)

    Michael F. Goodchild

    2013-05-01

    Full Text Available Urban economic modeling and effective spatial planning are critical tools towards achieving urban sustainability. However, in practice, many technical obstacles, such as information islands, poor documentation of data and lack of software platforms to facilitate virtual collaboration, are challenging the effectiveness of decision-making processes. In this paper, we report on our efforts to design and develop a geospatial cyberinfrastructure (GCI) for urban economic analysis and simulation. This GCI provides an operational graphic user interface, built upon a service-oriented architecture, to allow (1) widespread sharing and seamless integration of distributed geospatial data; (2) an effective way to address the uncertainty and positional errors encountered in fusing data from diverse sources; (3) the decomposition of complex planning questions into atomic spatial analysis tasks and the generation of a web service chain to tackle such complex problems; and (4) capturing and representing provenance of geospatial data to trace its flow in the modeling task. The Greater Los Angeles Region serves as the test bed. We expect this work to contribute to effective spatial policy analysis and decision-making through the adoption of advanced GCI and to broaden the application coverage of GCI to include urban economic simulations.

  5. PAVICS: A Platform for the Analysis and Visualization of Climate Science

    Science.gov (United States)

    Gauvin St-Denis, B.; Landry, T.; Huard, D. B.; Byrns, D.; Chaumont, D.; Foucher, S.

    2016-12-01

    Climate service providers are boundary organizations working at the interface of climate science research and users of climate information. Users include academics in other disciplines looking for credible, customized future climate scenarios, government planners, resource managers, asset owners, as well as service utilities. These users are looking for relevant information regarding the impacts of climate change as well as information to support decisions regarding adaptation options. As climate change concerns become mainstream, the pressure on climate service providers to deliver tailored, high-quality information in a timely manner increases rapidly. To meet this growing demand, Ouranos, a climate service center located in Montreal, is collaborating with the Centre de recherche informatique de Montreal (CRIM) to develop a climate data analysis web-based platform interacting with RESTful services covering data access and retrieval, geospatial analysis, bias correction, distributed climate indicator computing and results visualization. The project, financed by CANARIE, relies on the experience of the UV-CDAT and ESGF-CWT teams, as well as on the Birdhouse framework developed by the German Climate Research Center (DKRZ) and the French IPSL. Climate data is accessed through OPeNDAP, while computations are carried out through WPS. Regions such as watersheds or user-defined polygons, used as spatial selections for computations, are managed by GeoServer, which also provides WMS, WFS and WPS capabilities. The services are hosted on independent servers communicating over a high-throughput network. Deployment, maintenance and collaboration with other development teams are eased by the use of Docker and OpenStack VMs. Web-based tools are developed with modern web frameworks such as React-Redux, OpenLayers 3, Cesium and Plotly. Although the main objective of the project is to build a functioning, usable data analysis pipeline within two years, time is also devoted to explore emerging technologies and
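
    The OPeNDAP access step mentioned above can be sketched with the netCDF4 library, which reads OPeNDAP URLs transparently when built with DAP support; the dataset URL and variable name below are placeholders, not actual PAVICS endpoints.

        # Sketch: open a (placeholder) OPeNDAP dataset with netCDF4 and pull a small
        # subset of a climate variable without downloading the whole file.
        from netCDF4 import Dataset

        OPENDAP_URL = "https://data.example.org/thredds/dodsC/tasmax_day.nc"  # placeholder

        ds = Dataset(OPENDAP_URL)               # netCDF4 opens OPeNDAP URLs like local files
        tasmax = ds.variables["tasmax"]         # placeholder variable name

        # Only the requested slice travels over the network.
        first_week = tasmax[0:7, :, :]
        print(first_week.shape, tasmax.units if hasattr(tasmax, "units") else "no units")
        ds.close()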

  6. Pro Visual C++/CLI and the .NET 3.5 Platform

    CERN Document Server

    Fraser, Stephen

    2008-01-01

    Pro Visual C++/CLI and the .NET 3.5 Platform is about writing .NET applications using C++/CLI. While readers are learning the ins and outs of .NET application development, they will also be learning the syntax of C++, both old and new to .NET. Readers will also gain a good understanding of the .NET architecture. This is truly a .NET book applying C++ as its development language not another C++ syntax book that happens to cover .NET.

  7. A Web-Based Visualization and Animation Platform for Digital Logic Design

    Science.gov (United States)

    Shoufan, Abdulhadi; Lu, Zheng; Huss, Sorin A.

    2015-01-01

    This paper presents a web-based education platform for the visualization and animation of the digital logic design process. This includes the design of combinatorial circuits using logic gates, multiplexers, decoders, and look-up-tables as well as the design of finite state machines. Various configurations of finite state machines can be selected…

  8. WaveformECG: A Platform for Visualizing, Annotating, and Analyzing ECG Data.

    Science.gov (United States)

    Winslow, Raimond L; Granite, Stephen; Jurado, Christian

    2016-01-01

    The electrocardiogram (ECG) is the most commonly collected data in cardiovascular research because of the ease with which it can be measured and because changes in ECG waveforms reflect underlying aspects of heart disease. Accessed through a browser, WaveformECG is an open source platform supporting interactive analysis, visualization, and annotation of ECGs.

  9. Geospatial health

    DEFF Research Database (Denmark)

    Utzinger, Jürg; Rinaldi, Laura; Malone, John B.

    2011-01-01

    Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000 and the inaugural issue of its official journal was published in November 2006 with the aim to cover all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and as an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member strong editorial board...

  10. The African Geospatial Sciences Institute (agsi): a New Approach to Geospatial Training in North Africa

    Science.gov (United States)

    Oeldenberger, S.; Khaled, K. B.

    2012-07-01

    The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate the geospatial capacity development in North-Africa, providing the facilities for geospatial project and management training to regional government employees, university graduates, private individuals and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training, providing actual project execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities, e. g. the ITC, and international organizations, such as the ISPRS, the ICA and the OGC. Through a close cooperation with African organizations, such as the AARSE, the RCMRD and RECTAS, the network and exchange of ideas, experiences, technology and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will be steering the AGSI operations and will ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility with approximately 30 part- and full-time general staff and lecturers in Tunis during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and compact LIDAR instrument for local, community-scale data capture. Surveying training, the photogrammetric processing of aerial images, GIS data capture and remote sensing training will be the main components of the practical training courses

  11. The Future of Geospatial Standards

    Science.gov (United States)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds

  12. The new geospatial tools: global transparency enhancing safeguards verification

    International Nuclear Information System (INIS)

    Pabian, Frank Vincent

    2010-01-01

    This paper focuses on the importance and potential role of the new, freely available, geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency' and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based Blogs and Wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D Modeling and visualization) was only made possible following the acquisition of higher resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  13. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank V [Los Alamos National Laboratory

    2012-08-14

    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation relevant activities, facilities, and programs. Foremost among the geospatial tools are the 'Digital Virtual Globes' (i.e., Google Earth, Virtual Earth, etc.) that are far better than previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest which can be critical to: (1) Site familiarization and true geospatial context awareness; (2) Pre-inspection planning; (3) Onsite orientation and navigation; (4) Post-inspection reporting; (5) Site monitoring over time for changes; (6) Verification of states' site declarations and for input to State Evaluation reports; and (7) A common basis for discussions among all interested parties (Member States). Additionally, as an 'open source', such virtual globes can also provide a new, essentially free, means to conduct broad area search for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet BLOGS and WIKI Layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a.: 'crowdsourcing'), that can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant

  14. An Interactive Platform to Visualize Data-Driven Clinical Pathways for the Management of Multiple Chronic Conditions.

    Science.gov (United States)

    Zhang, Yiye; Padman, Rema

    2017-01-01

    Patients with multiple chronic conditions (MCC) pose an increasingly complex health management challenge worldwide, particularly due to the significant gap in our understanding of how to provide coordinated care. Drawing on our prior research on learning data-driven clinical pathways from actual practice data, this paper describes a prototype, interactive platform for visualizing the pathways of MCC to support shared decision making. Created using a Python web framework, a JavaScript library and our clinical pathway learning algorithm, the visualization platform allows clinicians and patients to learn the dominant patterns of co-progression of multiple clinical events from their own data, and interactively explore and interpret the pathways. We demonstrate the functionalities of the platform using a cluster of 36 patients, identified from a dataset of 1,084 patients, who are diagnosed with at least chronic kidney disease, hypertension, and diabetes. Future evaluation studies will explore the use of this platform to better understand and manage MCC.
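
    The abstract names only generic building blocks (a Python web framework and a JavaScript library), so the following is a hedged sketch of how such a platform might expose a learned pathway graph to a browser front end. Flask, the route name and the pathways.json file are assumptions chosen for illustration, not details taken from the paper.

        # Minimal sketch of a pathway-serving web endpoint (assumptions: Flask as the
        # web framework and learned pathways stored in a local JSON file named
        # "pathways.json"; the paper does not specify either detail).
        import json
        from flask import Flask, jsonify

        app = Flask(__name__)

        with open("pathways.json") as f:          # hypothetical file of learned pathways
            PATHWAYS = json.load(f)               # e.g. {"cluster_36": {"nodes": [...], "edges": [...]}}

        @app.route("/pathways/<cluster_id>")
        def get_pathway(cluster_id):
            """Return the nodes/edges of one patient cluster for a JavaScript front end."""
            pathway = PATHWAYS.get(cluster_id)
            if pathway is None:
                return jsonify({"error": "unknown cluster"}), 404
            return jsonify(pathway)

        if __name__ == "__main__":
            app.run(debug=True)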

  15. Adoption of Geospatial Systems towards evolving Sustainable Himalayan Mountain Development

    Science.gov (United States)

    Murthy, M. S. R.; Bajracharya, B.; Pradhan, S.; Shestra, B.; Bajracharya, R.; Shakya, K.; Wesselmann, S.; Ali, M.; Bajracharya, S.; Pradhan, S.

    2014-11-01

    Natural resources dependence of mountain communities, rapid social and developmental changes, disaster proneness and climate change are conceived as the critical factors regulating sustainable Himalayan mountain development. The Himalayan region, with its distinctive geographic setting and great physical and cultural diversity, presents a formidable challenge for collecting and managing data and information and for understanding varied socio-ecological settings. Recent advances in earth observation, near real-time data and in-situ measurements, in combination with information and communication technology, have transformed the way we collect, process, and generate information and how we use such information for societal benefits. Glacier dynamics, land cover changes, disaster risk reduction systems, food security and ecosystem conservation are a few thematic areas where geospatial information and knowledge have significantly contributed to informed decision making systems over the region. The emergence and adoption of near-real-time systems, unmanned aerial vehicles (UAV), broad-scale citizen science (crowd-sourcing), mobile services and mapping, and cloud computing have paved the way towards developing automated environmental monitoring systems, enhanced scientific understanding of geophysical and biophysical processes, coupled management of socio-ecological systems and community-based adaptation models tailored to mountain-specific environments. There are differentiated capacities among the ICIMOD regional member countries with regard to utilization of earth observation and geospatial technologies. The region can greatly benefit from a coordinated and collaborative approach to capture the opportunities offered by earth observation and geospatial technologies. Regional-level data sharing, knowledge exchange, and Himalayan GEO supporting geospatial platforms, spatial data infrastructure, and unique region-specific satellite systems to address trans-boundary challenges would go a long way in

  16. Evaluation of Data Management Systems for Geospatial Big Data

    OpenAIRE

    Amirian, Pouria; Basiri, Anahid; Winstanley, Adam C.

    2014-01-01

    Big Data encompasses the collection, management, processing and analysis of huge amounts of data that vary in type and change with high frequency. Often the data component of Big Data has a positional component as an important part of it, in various forms such as postal address, Internet Protocol (IP) address and geographical location. If the positional components in Big Data are extensively used in storage, retrieval, analysis, processing, visualization and knowledge discovery (geospatial Big Dat...

  17. Examining the Effect of Enactment of a Geospatial Curriculum on Students' Geospatial Thinking and Reasoning

    Science.gov (United States)

    Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara

    2014-08-01

    A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gain in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of curriculum enactment. Teacher-level factors, such as enactment of critical components of the curriculum or the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide support that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.
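
    For readers unfamiliar with hierarchical linear modeling of students nested within teachers, the sketch below shows a two-level random-intercept model in the spirit of the analysis described above. It uses statsmodels; the file name and column names (gtr_posttest, gtr_pretest, energy_content_gain, teacher_id) are hypothetical stand-ins, since the study's actual dataset and coding are not given here.

        # Hedged sketch of a two-level (students nested in teachers) model, in the
        # spirit of the hierarchical linear modeling the study reports.  All column
        # names and the CSV file are hypothetical placeholders.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("gtr_students.csv")      # hypothetical file: one row per student

        # Random intercept for each teacher; student-level predictors as fixed effects.
        model = smf.mixedlm(
            "gtr_posttest ~ gtr_pretest + energy_content_gain",
            data=df,
            groups=df["teacher_id"],
        )
        result = model.fit()
        print(result.summary())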

  18. Using Geospatial Analysis to Align Little Free Library Locations with Community Literacy Needs

    Science.gov (United States)

    Rebori, Marlene K.; Burge, Peter

    2017-01-01

    We used geospatial analysis tools to develop community maps depicting fourth-grade reading proficiency test scores and locations of facilities offering public access to reading materials (i.e., public libraries, elementary schools, and Little Free Libraries). The maps visually highlighted areas with struggling readers and areas without adequate…

  19. Prototype of a web - based participative decision support platform in natural hazards and risk management

    NARCIS (Netherlands)

    Aye, Z.C.; Jaboyedoff, M.; Derron, M.H.; van Westen, C.J.

    2015-01-01

    This paper presents the current state and development of a prototype web-GIS (Geographic Information System) decision support platform intended for application in natural hazards and risk management, mainly for floods and landslides. This web platform uses open-source geospatial software and

  20. Tutoring math platform accessible for visually impaired people.

    Science.gov (United States)

    Maćkowski, Michał Sebastian; Brzoza, Piotr Franciszek; Spinczyk, Dominik Roland

    2018-04-01

    There are many problems with teaching and assessing impaired students in higher education, especially in technical science, where the knowledge is represented mostly by structural information like math formulae, charts, graphs, etc. Developing an e-learning platform for distance education solves this problem only partially due to the lack of accessibility for the blind. The proposed method is based on the decomposition of a typical mathematical exercise into a sequence of elementary sub-exercises. This allows for interactive resolving of math exercises and assessment of the correctness of exercise solutions at every stage. The presented methods were prepared and evaluated by visually impaired people and students. The article presents the accessible interactive tutoring platform for math teaching and assessment, and experience in exploring it. The results of the conducted research confirm good understanding of math formulae described according to the elaborated rules. Regardless of the level of complexity of the math formulae, the level of understanding is higher for the alternative structural description. The proposed solution enables alternative descriptions of math formulae. Based on the research results, a tool for computer-aided interactive learning of mathematics adapted to the needs of the blind has been designed, implemented and deployed as a platform for on-site, online and distance learning. The designed solution can be very helpful in overcoming many barriers that occur while teaching impaired students. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Geospatial database for heritage building conservation

    Science.gov (United States)

    Basir, W. N. F. W. A.; Setan, H.; Majid, Z.; Chong, A.

    2014-02-01

    Heritage buildings are icons from the past that exist in present time. Through heritage architecture, we can learn about economic issues and social activities of the past. Nowadays, heritage buildings are under threat from natural disaster, uncertain weather, pollution and others. In order to preserve this heritage for the future generation, recording and documenting of heritage buildings are required. With the development of information system and data collection technique, it is possible to create a 3D digital model. This 3D information plays an important role in recording and documenting heritage buildings. 3D modeling and virtual reality techniques have demonstrated the ability to visualize the real world in 3D. It can provide a better platform for communication and understanding of heritage building. Combining 3D modelling with technology of Geographic Information System (GIS) will create a database that can make various analyses about spatial data in the form of a 3D model. Objectives of this research are to determine the reliability of Terrestrial Laser Scanning (TLS) technique for data acquisition of heritage building and to develop a geospatial database for heritage building conservation purposes. The result from data acquisition will become a guideline for 3D model development. This 3D model will be exported to the GIS format in order to develop a database for heritage building conservation. In this database, requirements for heritage building conservation process are included. Through this research, a proper database for storing and documenting of the heritage building conservation data will be developed.

  2. Geospatial database for heritage building conservation

    International Nuclear Information System (INIS)

    Basir, W N F W A; Setan, H; Majid, Z; Chong, A

    2014-01-01

    Heritage buildings are icons from the past that exist in present time. Through heritage architecture, we can learn about economic issues and social activities of the past. Nowadays, heritage buildings are under threat from natural disaster, uncertain weather, pollution and others. In order to preserve this heritage for the future generation, recording and documenting of heritage buildings are required. With the development of information system and data collection technique, it is possible to create a 3D digital model. This 3D information plays an important role in recording and documenting heritage buildings. 3D modeling and virtual reality techniques have demonstrated the ability to visualize the real world in 3D. It can provide a better platform for communication and understanding of heritage building. Combining 3D modelling with technology of Geographic Information System (GIS) will create a database that can make various analyses about spatial data in the form of a 3D model. Objectives of this research are to determine the reliability of Terrestrial Laser Scanning (TLS) technique for data acquisition of heritage building and to develop a geospatial database for heritage building conservation purposes. The result from data acquisition will become a guideline for 3D model development. This 3D model will be exported to the GIS format in order to develop a database for heritage building conservation. In this database, requirements for heritage building conservation process are included. Through this research, a proper database for storing and documenting of the heritage building conservation data will be developed

  3. Geospatial Technologies and Geography Education in a Changing World : Geospatial Practices and Lessons Learned

    NARCIS (Netherlands)

    2015-01-01

    Book published by IGU Commission on Geographical Education. It focuses particularly on what has been learned from geospatial projects and research from the past decades of implementing geospatial technologies in formal and informal education.

  4. A Geospatial Online Instruction Model

    OpenAIRE

    Athena OWEN-NAGEL; John C. RODGERS III; Shrinidhi AMBINAKUDIGE

    2012-01-01

    The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model’s effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based teaching effectiveness has not yet been clearly demonstrated for geospatial courses. The pedagogical model implemented in this study heavily utilizes ...

  5. Avogadro: an advanced semantic chemical editor, visualization, and analysis platform.

    Science.gov (United States)

    Hanwell, Marcus D; Curtis, Donald E; Lonie, David C; Vandermeersch, Tim; Zurek, Eva; Hutchison, Geoffrey R

    2012-08-13

    The Avogadro project has developed an advanced molecule editor and visualizer designed for cross-platform use in computational chemistry, molecular modeling, bioinformatics, materials science, and related areas. It offers flexible, high quality rendering, and a powerful plugin architecture. Typical uses include building molecular structures, formatting input files, and analyzing output of a wide variety of computational chemistry packages. By using the CML file format as its native document type, Avogadro seeks to enhance the semantic accessibility of chemical data types. The work presented here details the Avogadro library, which is a framework providing a code library and application programming interface (API) with three-dimensional visualization capabilities; and has direct applications to research and education in the fields of chemistry, physics, materials science, and biology. The Avogadro application provides a rich graphical interface using dynamically loaded plugins through the library itself. The application and library can each be extended by implementing a plugin module in C++ or Python to explore different visualization techniques, build/manipulate molecular structures, and interact with other programs. We describe some example extensions, one which uses a genetic algorithm to find stable crystal structures, and one which interfaces with the PackMol program to create packed, solvated structures for molecular dynamics simulations. The 1.0 release series of Avogadro is the main focus of the results discussed here. Avogadro offers a semantic chemical builder and platform for visualization and analysis. For users, it offers an easy-to-use builder, integrated support for downloading from common databases such as PubChem and the Protein Data Bank, extracting chemical data from a wide variety of formats, including computational chemistry output, and native, semantic support for the CML file format. For developers, it can be easily extended via a powerful

  6. Avogadro: an advanced semantic chemical editor, visualization, and analysis platform

    Directory of Open Access Journals (Sweden)

    Hanwell Marcus D

    2012-08-01

    Full Text Available Abstract Background The Avogadro project has developed an advanced molecule editor and visualizer designed for cross-platform use in computational chemistry, molecular modeling, bioinformatics, materials science, and related areas. It offers flexible, high quality rendering, and a powerful plugin architecture. Typical uses include building molecular structures, formatting input files, and analyzing output of a wide variety of computational chemistry packages. By using the CML file format as its native document type, Avogadro seeks to enhance the semantic accessibility of chemical data types. Results The work presented here details the Avogadro library, which is a framework providing a code library and application programming interface (API with three-dimensional visualization capabilities; and has direct applications to research and education in the fields of chemistry, physics, materials science, and biology. The Avogadro application provides a rich graphical interface using dynamically loaded plugins through the library itself. The application and library can each be extended by implementing a plugin module in C++ or Python to explore different visualization techniques, build/manipulate molecular structures, and interact with other programs. We describe some example extensions, one which uses a genetic algorithm to find stable crystal structures, and one which interfaces with the PackMol program to create packed, solvated structures for molecular dynamics simulations. The 1.0 release series of Avogadro is the main focus of the results discussed here. Conclusions Avogadro offers a semantic chemical builder and platform for visualization and analysis. For users, it offers an easy-to-use builder, integrated support for downloading from common databases such as PubChem and the Protein Data Bank, extracting chemical data from a wide variety of formats, including computational chemistry output, and native, semantic support for the CML file format

  7. The new geospatial tools: global transparency enhancing safeguards verification

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank Vincent [Los Alamos National Laboratory

    2010-09-16

    This paper focuses on the importance and potential role of the new, freely available, geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency' and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based Blogs and Wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D Modeling and visualization) was only made possible following the acquisition of higher resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  8. SmartR: an open-source platform for interactive visual analytics for translational research data.

    Science.gov (United States)

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data combined with strong visual analytical capabilities will significantly accelerate the scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR , a plugin for tranSMART, that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR . reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  9. Building the Platform of Digital Earth with Sphere Split Bricks

    Directory of Open Access Journals (Sweden)

    WANG Jinxin

    2015-06-01

    Full Text Available Discrete global grids, a modeling framework for big geospatial data, are often used to build Digital Earth platforms. Based on sphere split bricks (Earth system spatial grids), such a platform can not only build a true three-dimensional digital Earth model, but can also achieve the integration, fusion, expression and application of spatial data located on, under or above the Earth's surface. The theoretical system of the spheroid geodesic QTM octree grid is discussed, including the partition principle, analysis of grid geometry features and coding/decoding methods, and a prototype system of a true-3D digital Earth platform based on sphere split bricks is developed. The functions of the system mainly include arbitrary sphere segmentation and the visualization of physical models of underground, surface and aerial entities. Results show that the sphere geodesic QTM octree grid has many application advantages, such as simple subdivision rules, a neat grid system, clear geometric features and strong applicability. In particular, it can be extended to the ellipsoid, so it can be used for organization, management, integration and application of global spatial big data.

  10. Geospatial Information Response Team

    Science.gov (United States)

    Witt, Emitt C.

    2010-01-01

    Extreme emergency events of national significance that include manmade and natural disasters seem to have become more frequent during the past two decades. The Nation is becoming more resilient to these emergencies through better preparedness, reduced duplication, and establishing better communications so every response and recovery effort saves lives and mitigates the long-term social and economic impacts on the Nation. The National Response Framework (NRF) (http://www.fema.gov/NRF) was developed to provide the guiding principles that enable all response partners to prepare for and provide a unified national response to disasters and emergencies. The NRF provides five key principles for better preparation, coordination, and response: 1) engaged partnerships, 2) a tiered response, 3) scalable, flexible, and adaptable operations, 4) unity of effort, and 5) readiness to act. The NRF also describes how communities, tribes, States, Federal Government, private-sector, and non-governmental partners apply these principles for a coordinated, effective national response. The U.S. Geological Survey (USGS) has adopted the NRF doctrine by establishing several earth-sciences, discipline-level teams to ensure that USGS science, data, and individual expertise are readily available during emergencies. The Geospatial Information Response Team (GIRT) is one of these teams. The USGS established the GIRT to facilitate the effective collection, storage, and dissemination of geospatial data information and products during an emergency. The GIRT ensures that timely geospatial data are available for use by emergency responders, land and resource managers, and for scientific analysis. In an emergency and response capacity, the GIRT is responsible for establishing procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing coordinated products and services utilizing the USGS' exceptional pool of

  11. Storytelling in Interactive 3D Geographic Visualization Systems

    Directory of Open Access Journals (Sweden)

    Matthias Thöny

    2018-03-01

    Full Text Available The objective of interactive geographic maps is to provide geographic information to a large audience in a captivating and intuitive way. Storytelling helps to create exciting experiences and to explain complex or otherwise hidden relationships of geospatial data. Furthermore, interactive 3D applications offer a wide range of attractive elements for advanced visual story creation and offer the possibility to convey the same story in many different ways. In this paper, we discuss and analyze storytelling techniques in 3D geographic visualizations so that authors and developers working with geospatial data can use these techniques to conceptualize their visualization and interaction design. Finally, we outline two examples which apply the given concepts.

  12. Dynamic Science Data Services for Display, Analysis and Interaction in Widely-Accessible, Web-Based Geospatial Platforms, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — TerraMetrics, Inc., proposes a Phase II R/R&D program to implement the TerraBlocksTM Server architecture that provides geospatial data authoring, storage and...

  13. Operational Marine Data Acquisition and Delivery Powered by Web and Geospatial Standards

    Science.gov (United States)

    Thomas, R.; Buck, J. J. H.

    2015-12-01

    As novel sensor types and new platforms are deployed to monitor the global oceans, the volumes of scientific and environmental data collected in the marine context are rapidly growing. In order to use these data in both the traditional operational modes and in innovative "Big Data" applications the data must be readily understood by software agents. One approach to achieving this is the application of both World Wide Web and Open Geospatial Consortium standards: namely Linked Data [1] and Sensor Web Enablement [2] (SWE). The British Oceanographic Data Centre (BODC) is adopting this strategy in a number of European Commission funded projects (NETMAR; SenseOCEAN; Ocean Data Interoperability Platform - ODIP; and AtlantOS) to combine its existing data archiving architecture with SWE components (such as Sensor Observation Services) and a Linked Data interface. These will evolve the data management and data transfer from a process that requires significant manual intervention to an automated operational process enabling the rapid, standards-based, ingestion and delivery of data. This poster will show the current capabilities of BODC and the status of on-going implementation of this strategy. References: [1] World Wide Web Consortium (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 7th April 2015. [2] Open Geospatial Consortium (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8th October 2014.
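
    As a concrete illustration of the Sensor Observation Services mentioned above, the sketch below issues a standard OGC SOS 2.0 GetObservation request over HTTP using key-value-pair parameters. The endpoint URL and the offering and observed-property identifiers are placeholders for illustration, not actual BODC services.

        # Minimal sketch of an OGC Sensor Observation Service (SOS) 2.0 KVP request.
        # The endpoint URL and the offering/observedProperty identifiers below are
        # placeholders, not values from any real service.
        import requests

        SOS_ENDPOINT = "https://example.org/sos"   # hypothetical SOS endpoint

        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": "sea_water_temperature_offering",   # placeholder identifier
            "observedProperty": "sea_water_temperature",     # placeholder identifier
            "responseFormat": "http://www.opengis.net/om/2.0",
        }

        response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
        response.raise_for_status()
        print(response.text[:500])   # start of the O&M XML describing the observations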

  14. Increasing the value of geospatial informatics with open approaches for Big Data

    Science.gov (United States)

    Percivall, G.; Bermudez, L. E.

    2017-12-01

    Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in Testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases: Collection and Ingest: remotely sensed data processing; data stream processing. Prepare and Structure: SQL and NoSQL databases; data linking; feature identification. Analytics and Visualization: spatial-temporal analytics; machine learning; data exploration. Modeling and Prediction: integrated environmental models; urban 4D models. Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in Testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following: Open Cloud Computing: avoid vendor lock-in through API interoperability and application portability. Open Source Extensions: implement geospatial data representations in projects from Apache, Location Tech, and OSGeo; investigate parallelization strategies for N-dimensional spatial data. Geospatial Data Representations: schemas to improve processing and analysis using geospatial concepts (Features, Coverages, DGGS); use geospatial encodings like NetCDF and GeoPackage. Big Linked Geodata: use linked data methods scaled to big geodata. Analysis Ready Data: support "Download as last resort" and "Analytics as a service"; promote elements common to "datacubes."
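
    To make the "analysis ready data" and NetCDF recommendations above more tangible, here is a minimal sketch of opening a NetCDF file and computing a simple spatio-temporal summary with xarray. The file name and variable name are placeholders; any CF-style gridded dataset with a time dimension would work the same way.

        # Hedged sketch: opening an analysis-ready NetCDF file and computing a simple
        # spatio-temporal summary with xarray.  File and variable names are placeholders.
        import xarray as xr

        ds = xr.open_dataset("surface_temperature.nc")     # hypothetical NetCDF file
        temps = ds["t2m"]                                   # hypothetical variable name

        # Mean over time at every grid cell, then an area-naive global average.
        climatology = temps.mean(dim="time")
        print(float(climatology.mean()))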

  15. A geospatial search engine for discovering multi-format geospatial data across the web

    Science.gov (United States)

    Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney

    2014-01-01

    The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease with which data can now be created. However, challenges remain with connecting individuals searching for geospatial data with servers and websites where such data exist. The objective of this paper is to present a publicly...

  16. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    Science.gov (United States)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

    The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new and there has been a significant effort within the geospatial community to develop nested gridding standards to tackle these issues over many years. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard which has been developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell are the principal aspects of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. During
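
    The hierarchical, prefix-based indexing idea at the heart of a DGGS can be illustrated with a toy example. The sketch below subdivides a plain latitude/longitude rectangle as a quadtree; real OGC DGGS implementations instead use equal-area cells on the globe, so this is only an illustration of how coarser cell indexes become prefixes of finer ones.

        # Toy illustration of hierarchical cell indexing by recursive subdivision.
        # This lat/lon quadtree is NOT the OGC DGGS standard; it only demonstrates
        # the "finer resolution = longer index" property that DGGS rely on.
        def cell_index(lat, lon, resolution):
            """Return a string index; each extra character halves the cell in both axes."""
            lat_min, lat_max = -90.0, 90.0
            lon_min, lon_max = -180.0, 180.0
            digits = []
            for _ in range(resolution):
                lat_mid = (lat_min + lat_max) / 2
                lon_mid = (lon_min + lon_max) / 2
                quadrant = (2 if lat >= lat_mid else 0) + (1 if lon >= lon_mid else 0)
                digits.append(str(quadrant))
                lat_min, lat_max = (lat_mid, lat_max) if lat >= lat_mid else (lat_min, lat_mid)
                lon_min, lon_max = (lon_mid, lon_max) if lon >= lon_mid else (lon_min, lon_mid)
            return "".join(digits)

        # Coarser cells are prefixes of finer cells, which is what makes hierarchical
        # aggregation and decomposition cheap.
        idx_fine = cell_index(46.95, 7.45, 6)
        idx_coarse = cell_index(46.95, 7.45, 3)
        print(idx_fine, idx_coarse, idx_fine.startswith(idx_coarse))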

  17. Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

    Science.gov (United States)

    Singh, R.; Bermudez, L. E.

    2013-12-01

    The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years, two main trends have been disrupting geospatial applications: mobile and context sharing. People now have more and more mobile devices to support their work and personal life. Mobile devices are intermittently connected to the internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. And it builds upon a database container - SQLite - that's self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout is described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is sharing client 'contexts'. When a user is looking into an article or a product on the web
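
    Because a GeoPackage is a single SQLite file, its catalogue can be inspected with nothing more than Python's built-in sqlite3 module, as the minimal sketch below shows. The gpkg_contents table is defined by the GeoPackage specification; the file name here is a placeholder.

        # Minimal sketch: read the GeoPackage catalogue table (gpkg_contents, defined
        # by the GeoPackage specification) with Python's built-in sqlite3 module.
        # "example.gpkg" is a placeholder file name.
        import sqlite3

        with sqlite3.connect("example.gpkg") as conn:
            rows = conn.execute(
                "SELECT table_name, data_type, srs_id FROM gpkg_contents"
            ).fetchall()

        for table_name, data_type, srs_id in rows:
            print(f"{table_name}: {data_type} (SRS {srs_id})")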

  18. A Dynamic Information Framework: A Multi-Sector, Geospatial Gateway for Environmental Conservation and Adaptation to Climate Change

    Science.gov (United States)

    Fernandes, E. C.; Norbu, C.; Juizo, D.; Wangdi, T.; Richey, J. E.

    2011-12-01

    Landscapes, watersheds, and their downstream coastal and lacustrine zones are facing a series of challenges critical to their future, centered on the availability and distribution of water. Management options cover a range of issues, from bringing safe water to local villages for the rural poor, to developing adaptation strategies for both rural and urban populations and large infrastructure, to sustaining environmental flows and ecosystem services needed for natural and human-dominated ecosystems. These targets represent a very complex set of intersecting issues of scale, cross-sector science and technology, education, politics, and economics, and the desired sustainable development is closely linked to how the nominally responsible governmental Ministries respond to the information they have. In practice, such information and even such perspectives are virtually absent in much of the developing world. A Dynamic Information Framework (DIF) is being designed as a knowledge platform whereby decision-makers in information-sparse regions can consider rigorous scenarios of alternative futures and obtain decision support for complex environmental and economic decisions. The DIF is a geospatial gateway, with functional components of base data layers; directed data layers focused on synthetic objectives; geospatially explicit, process-based, cross-sector simulation models (requiring data from the directed data layers); facilitated input/output (including visualizations); and decision support and scenario-testing capabilities. A fundamental aspect of a DIF is not only the convergence of multi-sector information, but how that information can be (a) integrated, (b) used for robust simulations and projections, and (c) conveyed to policymakers and stakeholders in the most compelling, and visual, manner. Examples are given of emerging applications. The ZambeziDIF was used to establish baselines for agriculture, biodiversity, and water resources in the lower

  19. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-01-01

    -friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface...... such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data is available and most analysis functions in GProX create customizable high quality graphical...... displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics...

  20. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Beaver, Justin M [ORNL; BogenII, Paul L. [Google Inc.; Drouhard, Margaret MEG G [ORNL; Pyle, Joshua M [ORNL

    2015-01-01

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to the high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.
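
    As a toy illustration of the incremental, per-message processing pattern that stream analytics such as Matisse's sentiment component rely on, the sketch below tallies a keyword-based sentiment score over a stream of short texts. It is deliberately far simpler than the automated analytics described above and is not taken from the paper.

        # Toy illustration of incremental sentiment scoring over a text stream.  The
        # keyword lists are arbitrary examples; real systems use trained models.
        from collections import Counter

        POSITIVE = {"good", "great", "safe", "relief"}
        NEGATIVE = {"bad", "terrible", "explosion", "fear"}

        def score(message: str) -> int:
            words = message.lower().split()
            return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

        def consume(stream):
            totals = Counter()
            for message in stream:                 # e.g. an iterator over incoming posts
                totals["messages"] += 1
                totals["sentiment"] += score(message)
            return totals

        print(consume(["great relief effort", "explosion reported, fear downtown"]))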

  1. Information gathering, management and transferring for geospatial intelligence - A conceptual approach to create a spatial data infrastructure

    Science.gov (United States)

    Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena

    2017-06-01

    Information has long been a key factor for military organizations. In the military context, the success of joint and combined operations depends on accurate information and knowledge flow concerning the operational theatre: provision of resources, evolution of the environment, targets' locations, and where and when an event will occur. Modern military operations cannot be conceived without maps and geospatial information. Staffs and forces in the field request large volumes of information during the planning and execution process, and horizontal and vertical geospatial information integration is critical for the decision cycle. Information and knowledge management are fundamental to clarifying an environment full of uncertainty. Geospatial information (GI) management arises as a branch of information and knowledge management, responsible for the conversion process from raw data collected by human or electronic sensors to knowledge. Geospatial information and intelligence systems allow us to integrate all other forms of intelligence and act as a main platform to process and display geospatially and temporally referenced events. Combining explicit knowledge with personal know-how generates a continuous learning cycle that supports real-time decisions, mitigates the influence of the fog of war and provides knowledge supremacy. This paper presents the analysis done after applying a questionnaire and interviews about GI and intelligence management in a military organization. The study intended to identify the stakeholders' requirements for a military spatial data infrastructure as well as the requirements for future software system development.

  2. The Geospatial Web and Local Geographical Education

    Science.gov (United States)

    Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.

    2010-01-01

    Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…

  3. Design and Development of a Framework Based on Ogc Web Services for the Visualization of Three Dimensional Large-Scale Geospatial Data Over the Web

    Science.gov (United States)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. WebGISes are traditionally bound to a simplified two-dimensional representation of reality and, though they successfully address the lack of flexibility and simplicity of traditional desktop clients, considerable effort is still needed to reach desktop GIS features such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, and therefore allows traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.
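
    To ground the discussion of OGC services feeding a browser-based 3D client, the sketch below builds a standard WMS 1.3.0 GetMap request with key-value-pair parameters. The endpoint URL and layer name are placeholders rather than services from the paper; a CityGML-oriented client would typically combine such requests with other OGC services for the 3D content itself.

        # Minimal sketch of the kind of OGC request a web client issues while
        # streaming map content: a WMS 1.3.0 GetMap with standard KVP parameters.
        # The endpoint URL and layer name are placeholders.
        import requests

        WMS_ENDPOINT = "https://example.org/wms"   # hypothetical WMS endpoint

        params = {
            "service": "WMS",
            "version": "1.3.0",
            "request": "GetMap",
            "layers": "city_buildings",            # placeholder layer name
            "styles": "",
            "crs": "EPSG:4326",
            "bbox": "45.0,11.0,45.1,11.1",         # minLat,minLon,maxLat,maxLon in WMS 1.3.0
            "width": 512,
            "height": 512,
            "format": "image/png",
        }

        tile = requests.get(WMS_ENDPOINT, params=params, timeout=30)
        tile.raise_for_status()
        with open("tile.png", "wb") as f:
            f.write(tile.content)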

  4. Multivariate Gradient Analysis for Evaluating and Visualizing a Learning System Platform for Computer Programming

    Science.gov (United States)

    Mather, Richard

    2015-01-01

    This paper explores the application of canonical gradient analysis to evaluate and visualize student performance and acceptance of a learning system platform. The subject of evaluation is a first year BSc module for computer programming. This uses "Ceebot," an animated and immersive game-like development environment. Multivariate…

  5. Visual, tangible, and touch-screen: Comparison of platforms for displaying simple graphics.

    Science.gov (United States)

    Gershon, Pnina; Klatzky, Roberta L; Palani, Hari; Giudice, Nicholas A

    2016-01-01

    Four different platforms were compared in a task of exploring an angular stimulus and reporting its value. The angle was explored visually, tangibly as raised fine-grit sandpaper, or on a touch-screen with a frictional or vibratory signal. All platforms produced highly accurate angle judgments. Differences were found, however, in exploration time, with vision fastest as expected, followed by tangible, vibration, and friction. Relative to the tangible display, touch-screens evidenced greater noise in the perceived angular value, with a particular disadvantage for friction. The latter must be interpreted in the context of a first-generation display and a rapidly advancing technology. On the whole, the results point both to promise and barriers in the use of refreshable graphical displays for blind users.

  6. Geospatial Technology in Geography Education

    NARCIS (Netherlands)

    Muniz Solari, Osvaldo; Demirci, A.; van der Schee, J.A.

    2015-01-01

    The book is presented as an important starting point for new research in Geography Education (GE) related to the use and application of geospatial technologies (GSTs). For this purpose, the selection of topics was based on central ideas to GE in its relationship with GSTs. The process of geospatial

  7. Generation of Multiple Metadata Formats from a Geospatial Data Repository

    Science.gov (United States)

    Hudspeth, W. B.; Benedict, K. K.; Scott, S.

    2012-12-01

    The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general purpose web services based upon the REST service model, and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. title, abstract, publication data, geographic extent, projection information, and database elements), but also the processing steps used to
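
    The sketch below illustrates the general pattern of serializing one internal metadata record into more than one XML layout, in the spirit of the multi-format delivery described above. The element names are heavily abbreviated stand-ins; real ISO 19139 and FGDC CSDGM documents are far richer, and Gstore's own transformation code is not shown here.

        # Hedged sketch of serialising one internal metadata record into two different
        # simplified XML layouts.  Element names are abbreviated stand-ins, not the
        # full ISO 19139 or FGDC CSDGM schemas.
        import xml.etree.ElementTree as ET

        record = {"title": "Example dataset", "abstract": "Demo record"}

        def to_fgdc_like(rec):
            root = ET.Element("metadata")
            idinfo = ET.SubElement(root, "idinfo")
            ET.SubElement(idinfo, "title").text = rec["title"]
            ET.SubElement(idinfo, "abstract").text = rec["abstract"]
            return ET.tostring(root, encoding="unicode")

        def to_iso_like(rec):
            root = ET.Element("MD_Metadata")
            ident = ET.SubElement(root, "identificationInfo")
            ET.SubElement(ident, "citation").text = rec["title"]
            ET.SubElement(ident, "abstract").text = rec["abstract"]
            return ET.tostring(root, encoding="unicode")

        print(to_fgdc_like(record))
        print(to_iso_like(record))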

  8. Field: a new meta-authoring platform for data-intensive scientific visualization

    Science.gov (United States)

    Downie, M.; Ameres, E.; Fox, P. A.; Goebel, J.; Graves, A.; Hendler, J.

    2012-12-01

    This presentation will demonstrate a new platform for data-intensive scientific visualization, called Field, that rethinks the problem of visual data exploration. Several new opportunities for scientific visualization present themselves at this moment in time. We believe that when taken together they may catalyze a transformation of the practice of science and begin to seed a technical culture within science that fuses data analysis, programming and myriad visual strategies. It is at integrative levels that the principal challenges exist, for many fundamental technical components of our field are now well understood and widely available. File formats from CSV through HDF all have broad library support; low-level high-performance graphics APIs (OpenGL) are in a period of stable growth; and a dizzying ecosystem of analysis and machine learning libraries abounds. The hardware of computer graphics offers unprecedented computing power within commodity components; programming languages and platforms are coalescing around a core set of umbrella runtimes. Each of these trends is set to continue: computer graphics hardware is developing at a super-Moore-law rate, and trends in publication and dissemination point only towards increasing access to code and data. The critical opportunity here for scientific visualization is, we maintain, not in developing a new statistical library, nor a new tool centered on a particular technique, but rather a new visual, "live" programming environment that is promiscuous in its scope. We can identify the necessary methodological practice and traditions required here not in science or engineering but in the "live-coding" practices prevalent in the fields of digital art and design. We can define this practice as an approach to programming that is live, iterative, integrative, speculative and exploratory. "Live" because it is exclusively practiced in real-time (often during performance); "iterative", because

  9. OSGeo - Open Source Geospatial Foundation

    Directory of Open Access Journals (Sweden)

    Margherita Di Leo

    2012-09-01

    Full Text Available The need that arose toward the end of 2005 to select and organize more than 200 FOSS4G projects led to the founding, in February 2006, of OSGeo (the Open Source Geospatial Foundation), an international organization whose mission is to promote the collaborative development of free software focused on geographic information (FOSS4G). The Open Source Geospatial Foundation (OSGeo) is a not-for-profit organization created in early 2006 with the aim of supporting the collaborative development of geospatial open source software and promoting its widespread use. The foundation provides financial, organizational and legal support to the broader open source geospatial community. It also serves as an independent legal entity to which community members can contribute code, funding and other resources, secure in the knowledge that their contributions will be maintained for public benefit. OSGeo also serves as an outreach and advocacy organization for the open source geospatial community, and provides a common forum and shared infrastructure for improving cross-project collaboration. The foundation's projects are all freely available and usable under an OSI-certified open source license. The Italian OSGeo local chapter is named GFOSS.it (Associazione Italiana per l'Informazione Geografica Libera).

  10. A Geospatial Online Instruction Model

    Science.gov (United States)

    Rodgers, John C., III; Owen-Nagel, Athena; Ambinakudige, Shrinidhi

    2012-01-01

    The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model's effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based…

  11. From Geomatics to Geospatial Intelligent Service Science

    Directory of Open Access Journals (Sweden)

    LI Deren

    2017-10-01

    Full Text Available The paper reviews the 60 years of development from traditional surveying and mapping to today's geospatial intelligent service science. The three important stages of surveying and mapping, namely the analogue, analytical and digital stages, are summarized. The author introduces the integration of GNSS, RS and GIS (3S), which marks the rise of geospatial informatics (Geomatics). The development of geospatial information science in the digital Earth era is analyzed, and the latest progress of geospatial information science towards real-time intelligent services in the smart Earth era is discussed. This paper focuses on the three development levels of "Internet plus" spatial information intelligent services. In the era of big data, traditional geomatics will surely take advantage of the integration of communication, navigation, remote sensing, artificial intelligence, virtual reality and brain cognition science, and become geospatial intelligent service science, thereby contributing to the national economy, defense and people's livelihood.

  12. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for
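
    The DEM-based parameterization workflow described above can be illustrated with a small, hedged sketch. The code below is not AGWA code and the file name is an assumption; it only shows how one such model input, percent slope, might be derived from a DEM with rasterio and NumPy.

    ```python
    # Hypothetical illustration of one parameterization step of the kind AGWA automates:
    # deriving percent slope from a DEM. Not AGWA code; the DEM path is assumed.
    import numpy as np
    import rasterio

    with rasterio.open("dem_utm.tif") as src:      # assumed DEM, projected in metres
        dem = src.read(1).astype(float)
        xres, yres = src.res

    dz_dy, dz_dx = np.gradient(dem, yres, xres)    # elevation change per metre along each axis
    slope_pct = np.hypot(dz_dx, dz_dy) * 100.0     # slope magnitude in percent

    print(f"mean slope: {np.nanmean(slope_pct):.1f} %")
    ```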

  13. Kameleon Live: An Interactive Cloud Based Analysis and Visualization Platform for Space Weather Researchers

    Science.gov (United States)

    Pembroke, A. D.; Colbert, J. A.

    2015-12-01

    The Community Coordinated Modeling Center (CCMC) provides hosting for many of the simulations used by the space weather community of scientists, educators, and forecasters. CCMC users may submit model runs through the Runs on Request system, which produces static visualizations of model output in the browser, while further analysis may be performed off-line via Kameleon, CCMC's cross-language access and interpolation library. Off-line analysis may be suitable for power-users, but storage and coding requirements present a barrier to entry for non-experts. Moreover, a lack of a consistent framework for analysis hinders reproducibility of scientific findings. To that end, we have developed Kameleon Live, a cloud based interactive analysis and visualization platform. Kameleon Live allows users to create scientific studies built around selected runs from the Runs on Request database, perform analysis on those runs, collaborate with other users, and disseminate their findings among the space weather community. In addition to showcasing these novel collaborative analysis features, we invite feedback from CCMC users as we seek to advance and improve on the new platform.

  14. Architecture of a spatial data service system for statistical analysis and visualization of regional climate changes

    Science.gov (United States)

    Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of a virtual research environment (VRE) general architecture for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available at the node. It also contains geospatial data processing services (WPS) based on a modular computing backend that implements the statistical processing functionality and thus provides analysis of large datasets, with results available for cartographical visualization and for export to files in standard formats (XML, binary, etc.). Several cartographical web services have been developed in a prototype of the system to provide capabilities to work with raster and vector geospatial data based on OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.
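
    A minimal client-side sketch of consuming the cartographical web services such a node exposes is given below, assuming the OWSLib library; the endpoint URL and layer name are hypothetical placeholders rather than services of the described system.

    ```python
    # Hedged sketch of a WMS GetMap request against an SDI node (assumed OWSLib API);
    # the endpoint and layer name are placeholders.
    from owslib.wms import WebMapService

    wms = WebMapService("https://sdi.example.org/wms", version="1.1.1")   # assumed endpoint
    img = wms.getmap(
        layers=["temperature_anomaly"],      # assumed layer name
        styles=[""],
        srs="EPSG:4326",
        bbox=(60.0, 50.0, 110.0, 75.0),      # lon/lat bounding box (illustrative)
        size=(800, 500),
        format="image/png",
        transparent=True,
    )
    with open("anomaly_map.png", "wb") as f:
        f.write(img.read())
    ```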

  15. Development, management and benefit from Internet-based geospatial data sources through knowledge management for GIS-based regional geography applications

    International Nuclear Information System (INIS)

    Thunemann, H.G.

    2009-01-01

    The provision of data and information on the Internet is growing daily. Geoscientific applications, especially those using geographic information systems (GIS), often need changing geospatial data and thus possibly different data sources. Geospatial data should be easily available, and the Internet has become an increasingly important medium for exchanging them. The problem of finding appropriate data sources on the Internet, however, is left to the user. The Internet, designed as a tool for information exchange, has changed the practice of dealing with knowledge and information in a fundamental and previously unforeseeable manner. The many individual acts have social consequences concerning the production and distribution of knowledge, and these significantly determine the development of different solutions, including the production, deployment and use of geospatial data, with all their strengths and problems. Various solutions for the provision of geospatial data are available on the Internet, but the targeted searching of these geodata sources remains a shortcoming. The options of knowledge management, among other solutions, could ease the compilation, storage, connection, popularization and ultimately the application of geodata sources on the Internet. Communication, as a central element of knowledge management, should be used in the form of a communication platform. The present study describes the variety of deployment options for geospatial data and the problems of finding data sources on the Internet. Potential hazards of geospatial data provision (also) via the Internet, as well as an option to manage, update and use such data for various applications on the Internet, are pointed out. (author) [de

  16. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

    Full Text Available The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. There are several important research issues to be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid, as an effective distributed computing paradigm, is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer with high-performance computation capability. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid readily provides the utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic and the industry communities. Therefore, we propose an integrated framework that incorporates OWS standards into Grids. Upon this framework, distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.

  17. BUILDING A COMPLETE FREE AND OPEN SOURCE GIS INFRASTRUCTURE FOR HYDROLOGICAL COMPUTING AND DATA PUBLICATION USING GIS.LAB AND GISQUICK PLATFORMS

    Directory of Open Access Journals (Sweden)

    M. Landa

    2017-07-01

    Full Text Available Building a complete free and open source GIS computing and data publication platform can be a relatively easy task. This paper describes an automated deployment of such a platform using two open source software projects – GIS.lab and Gisquick. GIS.lab (http://web.gislab.io) is a project for rapid deployment of a complete, centrally managed and horizontally scalable GIS infrastructure in the local area network, data center or cloud. It provides a comprehensive set of free geospatial software seamlessly integrated into one easy-to-use system. A platform for GIS computing (in our case demonstrated on hydrological data processing) requires core components such as a geoprocessing server, a map server, and a computation engine, e.g. GRASS GIS, SAGA, or other similar GIS software. All these components can be rapidly and automatically deployed by the GIS.lab platform. In our demonstrated solution, PyWPS is used for serving WPS processes built on top of the GRASS GIS computation platform. GIS.lab can be easily extended by other components running in Docker containers. This approach is demonstrated with the seamless integration of Gisquick. Gisquick (http://gisquick.org) is an open source platform for publishing geospatial data in the sense of rapid sharing of QGIS projects on the web. The platform consists of a QGIS plugin, a Django-based server application, QGIS Server, and web/mobile clients. This paper shows how to easily deploy a complete open source GIS infrastructure allowing all required operations, such as data preparation on the desktop, data sharing, and geospatial computation as a service. It also includes data publication in the sense of OGC Web Services and, importantly, also as interactive web mapping applications.
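
    Since the described platform serves geoprocessing through PyWPS on top of GRASS GIS, a hedged skeleton of what such a WPS process can look like is sketched below. It assumes the PyWPS 4.x API, and the identifier, inputs and trivial computation are invented stand-ins for a real GRASS-backed hydrological operation.

    ```python
    # Hedged WPS process skeleton (assumed PyWPS 4.x API); the process itself is a toy
    # stand-in for a GRASS GIS-backed hydrological computation.
    from pywps import Process, LiteralInput, LiteralOutput

    class CatchmentArea(Process):
        def __init__(self):
            inputs = [
                LiteralInput("cell_count", "Number of upstream cells", data_type="integer"),
                LiteralInput("cell_size", "Cell size in metres", data_type="float"),
            ]
            outputs = [LiteralOutput("area_km2", "Catchment area [km2]", data_type="float")]
            super().__init__(self._handler,
                             identifier="catchment_area",
                             title="Toy catchment area process",
                             inputs=inputs,
                             outputs=outputs)

        def _handler(self, request, response):
            cells = request.inputs["cell_count"][0].data
            size = request.inputs["cell_size"][0].data
            response.outputs["area_km2"].data = cells * size * size / 1.0e6
            return response
    ```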

  18. A FRAMEWORK FOR AN OPEN SOURCE GEOSPATIAL CERTIFICATION MODEL

    Directory of Open Access Journals (Sweden)

    T. U. R. Khan

    2016-06-01

    Full Text Available The geospatial industry is forecasted to have enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US American oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of the certification. In addition to the theoretical analysis of existing resources, the geospatial community was integrated twofold. An online survey about the relevance of Open Source was performed and

  19. A Framework for an Open Source Geospatial Certification Model

    Science.gov (United States)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecasted to have enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US American oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of the certification. In addition to the theoretical analysis of existing resources, the geospatial community was integrated twofold. An online survey about the relevance of Open Source was performed and evaluated with 105

  20. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    Science.gov (United States)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic under a distributed computing environment. In order to make use of these local resources together to solve larger geospatial information processing problems that are related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a concept termed the Equivalent Distributed Program of global geospatial queries to solve geospatial distributed computing problems under heterogeneous GIS environments. First, a geospatial query process schema for distributed computing, as well as a method for equivalent transformation from a global geospatial query to distributed local queries at the SQL (Structured Query Language) level to solve the coordination problem among heterogeneous resources, are presented. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment that consists of autonomous geospatial information resources, in order to achieve decentralized and consistent synchronization among global geospatial resource directories and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
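
    The idea of an Equivalent Distributed Program (rewriting one global query into local SQL queries executed at each autonomous peer and merging the partial results) can be sketched as follows. The two in-memory SQLite databases and the simple bounding-box predicate are illustrative assumptions, not the paper's prototype.

    ```python
    # Hedged sketch of decomposing a global geospatial query into local SQL queries and
    # merging the results; two in-memory SQLite databases stand in for autonomous peers.
    import sqlite3

    def make_peer(rows):
        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE stations (name TEXT, lon REAL, lat REAL)")
        con.executemany("INSERT INTO stations VALUES (?, ?, ?)", rows)
        return con

    peers = [
        make_peer([("A1", 10.1, 46.2), ("A2", 11.4, 45.9)]),
        make_peer([("B1", 10.8, 46.0)]),
    ]

    # Global query: all stations inside one bounding box, wherever they are stored.
    bbox = (10.0, 45.5, 11.0, 46.5)   # (min_lon, min_lat, max_lon, max_lat)
    local_sql = ("SELECT name, lon, lat FROM stations "
                 "WHERE lon BETWEEN ? AND ? AND lat BETWEEN ? AND ?")
    params = (bbox[0], bbox[2], bbox[1], bbox[3])

    merged = []
    for peer in peers:                # distribute the equivalent local query, then merge
        merged.extend(peer.execute(local_sql, params).fetchall())

    print(sorted(merged))
    ```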

  1. Geospatial technology and the "exposome": new perspectives on addiction.

    Science.gov (United States)

    Stahler, Gerald J; Mennis, Jeremy; Baron, David A

    2013-08-01

    Addiction represents one of the greatest public health problems facing the United States. Advances in addiction research have focused on the neurobiology of this disease. We discuss potential new breakthroughs in understanding the other side of gene-environment interactions-the environmental context or "exposome" of addiction. Such research has recently been made possible by advances in geospatial technologies together with new mobile and sensor computing platforms. These advances have fostered interdisciplinary collaborations focusing on the intersection of environment and behavior in addiction research. Although issues of privacy protection for study participants remain, these advances could potentially improve our understanding of initiation of drug use and relapse and help develop innovative technology-based interventions to improve treatment and continuing care services.

  2. Geospatial Analysis Platform and tools: supporting planning and decision making across scales, borders, sectors and disciplines

    CSIR Research Space (South Africa)

    Naude, AH

    2008-04-01

    Full Text Available observation and geospatial analysis technologies, as well as the associated need for spatially explicit and sectorally integrated growth and development plans (including plans that deal with multi-scale or cross-border issues), the required statistical... planning. This requires planning and analysis that can (1) facilitate the sharing of spatial and other data, (2) deal with multi-scale or cross-border issues, and (3) support the understanding of patterns and inter-regional dynamics at regional...

  3. Economic Assessment of the Use Value of Geospatial Information

    Directory of Open Access Journals (Sweden)

    Richard Bernknopf

    2015-07-01

    Full Text Available Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.

  4. Economic assessment of the use value of geospatial information

    Science.gov (United States)

    Bernknopf, Richard L.; Shapiro, Carl D.

    2015-01-01

    Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
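
    Stated compactly, the VOI definition used in both records above can be written as a difference of expected discounted net benefits; the notation below is an illustrative sketch, not the authors' exact formulation.

    ```latex
    % VOI as the difference in expected present-value net benefits with and without
    % the geospatial information (illustrative notation).
    \[
      \mathrm{VOI}
        = \mathbb{E}\!\left[\sum_{t=0}^{T} \frac{NB_t^{\mathrm{with\ info}}}{(1+r)^{t}}\right]
        - \mathbb{E}\!\left[\sum_{t=0}^{T} \frac{NB_t^{\mathrm{without\ info}}}{(1+r)^{t}}\right]
    \]
    % NB_t: net benefit of the decision in year t; r: discount rate; T: planning horizon.
    ```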

  5. Geospatial Modeling of Asthma Population in Relation to Air Pollution

    Science.gov (United States)

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Young, John H.; Luvall, Jeffrey C.; Alhamdan, Mohammad

    2013-01-01

    Current observations indicate that asthma is growing every year in the United States, but the specific reasons for this are not well understood. This study stems from an ongoing research effort to investigate the spatio-temporal behavior of asthma and its relatedness to air pollution. The association between environmental variables such as air quality and asthma-related health issues over the State of Mississippi is investigated using Geographic Information Systems (GIS) tools and applications. Health data concerning asthma, obtained from the Mississippi State Department of Health (MSDH) for the 9-year period 2003-2011, and air pollutant concentration (PM2.5) data, collected from USEPA web resources, are analyzed geospatially to establish the impacts of air quality on human health, specifically related to asthma. Disease mapping using geospatial techniques provides valuable insights into the spatial nature, variability, and association of asthma to air pollution. Asthma patient hospitalization data for Mississippi have been analyzed and mapped using quantitative choropleth techniques in ArcGIS. Patients have been geocoded to their respective zip codes. Potential air pollutant sources such as Interstate highways and industries, together with other land use data, have been integrated in a common geospatial platform to understand their adverse contribution to human health. Existing hospitals and emergency clinics are included in the analysis to further understand their proximity and ease of access from patient locations. At the current level of analysis and understanding, the spatial distribution of asthma is observed in the populations of zip code regions on the Gulf Coast, along the interstates of the south, and in counties of northeast Mississippi. It is also found that asthma is prevalent in most of the urban population. This GIS-based project would be useful for health risk assessment and for providing information support to administrators and decision makers for establishing satellite clinics in the future.
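
    A hedged sketch of the kind of zip-code-level choropleth described above is given below using GeoPandas and Matplotlib; the file names and column names are hypothetical, and the MSDH data are not reproduced.

    ```python
    # Hedged choropleth sketch: asthma hospitalization rates by zip code area.
    # File names and column names are assumptions, not the study's actual data.
    import geopandas as gpd
    import pandas as pd
    import matplotlib.pyplot as plt

    zctas = gpd.read_file("ms_zcta.shp")                 # assumed zip-code polygons
    cases = pd.read_csv("asthma_by_zip.csv")             # assumed columns: zip, rate_per_10k

    merged = zctas.merge(cases, left_on="ZCTA5CE10", right_on="zip", how="left")

    ax = merged.plot(column="rate_per_10k", cmap="OrRd", legend=True,
                     missing_kwds={"color": "lightgrey"}, figsize=(8, 8))
    ax.set_title("Asthma hospitalizations per 10,000 residents (illustrative)")
    ax.set_axis_off()
    plt.savefig("asthma_choropleth.png", dpi=150)
    ```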

  6. Inspection of Pole-Like Structures Using a Visual-Inertial Aided VTOL Platform with Shared Autonomy.

    Science.gov (United States)

    Sa, Inkyu; Hrabar, Stefan; Corke, Peter

    2015-09-02

    This paper presents an algorithm and a system for vertical infrastructure inspection using a vertical take-off and landing (VTOL) unmanned aerial vehicle and shared autonomy. Inspecting vertical structures such as light and power distribution poles is a difficult task that is time-consuming, dangerous and expensive. Recently, micro VTOL platforms (i.e., quad-, hexa- and octa-rotors) have been rapidly gaining interest in research, military and even public domains. The unmanned, low-cost and VTOL properties of these platforms make them ideal for situations where inspection would otherwise be time-consuming and/or hazardous to humans. There are, however, challenges involved with developing such an inspection system, for example flying in close proximity to a target while maintaining a fixed stand-off distance from it, being immune to wind gusts and exchanging useful information with the remote user. To overcome these challenges, we require accurate and high-update rate state estimation and high performance controllers to be implemented onboard the vehicle. Ease of control and a live video feed are required for the human operator. We demonstrate a VTOL platform that can operate at close-quarters, whilst maintaining a safe stand-off distance and rejecting environmental disturbances. Two approaches are presented: Position-Based Visual Servoing (PBVS) using an Extended Kalman Filter (EKF) and estimator-free Image-Based Visual Servoing (IBVS). Both use monocular visual, inertia, and sonar data, allowing the approaches to be applied for indoor or GPS-impaired environments. We extensively compare the performances of PBVS and IBVS in terms of accuracy, robustness and computational costs. Results from simulations and indoor/outdoor (day and night) flight experiments demonstrate the system is able to successfully inspect and circumnavigate a vertical pole.

  7. Inspection of Pole-Like Structures Using a Visual-Inertial Aided VTOL Platform with Shared Autonomy

    Directory of Open Access Journals (Sweden)

    Inkyu Sa

    2015-09-01

    Full Text Available This paper presents an algorithm and a system for vertical infrastructure inspection using a vertical take-off and landing (VTOL) unmanned aerial vehicle and shared autonomy. Inspecting vertical structures such as light and power distribution poles is a difficult task that is time-consuming, dangerous and expensive. Recently, micro VTOL platforms (i.e., quad-, hexa- and octa-rotors) have been rapidly gaining interest in research, military and even public domains. The unmanned, low-cost and VTOL properties of these platforms make them ideal for situations where inspection would otherwise be time-consuming and/or hazardous to humans. There are, however, challenges involved with developing such an inspection system, for example flying in close proximity to a target while maintaining a fixed stand-off distance from it, being immune to wind gusts and exchanging useful information with the remote user. To overcome these challenges, we require accurate and high-update rate state estimation and high performance controllers to be implemented onboard the vehicle. Ease of control and a live video feed are required for the human operator. We demonstrate a VTOL platform that can operate at close-quarters, whilst maintaining a safe stand-off distance and rejecting environmental disturbances. Two approaches are presented: Position-Based Visual Servoing (PBVS) using an Extended Kalman Filter (EKF) and estimator-free Image-Based Visual Servoing (IBVS). Both use monocular visual, inertia, and sonar data, allowing the approaches to be applied for indoor or GPS-impaired environments. We extensively compare the performances of PBVS and IBVS in terms of accuracy, robustness and computational costs. Results from simulations and indoor/outdoor (day and night) flight experiments demonstrate the system is able to successfully inspect and circumnavigate a vertical pole.

  8. Inspection of Pole-Like Structures Using a Visual-Inertial Aided VTOL Platform with Shared Autonomy

    Science.gov (United States)

    Sa, Inkyu; Hrabar, Stefan; Corke, Peter

    2015-01-01

    This paper presents an algorithm and a system for vertical infrastructure inspection using a vertical take-off and landing (VTOL) unmanned aerial vehicle and shared autonomy. Inspecting vertical structures such as light and power distribution poles is a difficult task that is time-consuming, dangerous and expensive. Recently, micro VTOL platforms (i.e., quad-, hexa- and octa-rotors) have been rapidly gaining interest in research, military and even public domains. The unmanned, low-cost and VTOL properties of these platforms make them ideal for situations where inspection would otherwise be time-consuming and/or hazardous to humans. There are, however, challenges involved with developing such an inspection system, for example flying in close proximity to a target while maintaining a fixed stand-off distance from it, being immune to wind gusts and exchanging useful information with the remote user. To overcome these challenges, we require accurate and high-update rate state estimation and high performance controllers to be implemented onboard the vehicle. Ease of control and a live video feed are required for the human operator. We demonstrate a VTOL platform that can operate at close-quarters, whilst maintaining a safe stand-off distance and rejecting environmental disturbances. Two approaches are presented: Position-Based Visual Servoing (PBVS) using an Extended Kalman Filter (EKF) and estimator-free Image-Based Visual Servoing (IBVS). Both use monocular visual, inertia, and sonar data, allowing the approaches to be applied for indoor or GPS-impaired environments. We extensively compare the performances of PBVS and IBVS in terms of accuracy, robustness and computational costs. Results from simulations and indoor/outdoor (day and night) flight experiments demonstrate the system is able to successfully inspect and circumnavigate a vertical pole. PMID:26340631
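
    The stand-off-keeping behaviour that all three versions of this record describe can be illustrated with a heavily simplified, hedged sketch: given a pole position estimated in the body frame (e.g. by an EKF fusing camera, IMU and sonar data), a proportional PBVS-style law commands velocity along the line of sight to hold a fixed distance. The gains and geometry below are illustrative, not the paper's controller.

    ```python
    # Simplified PBVS-style stand-off controller (illustrative, not the paper's design).
    import numpy as np

    def pbvs_velocity_command(pole_xy, desired_standoff=2.0, kp=0.8, v_max=1.0):
        """pole_xy: estimated pole position [x, y] in the body frame (metres)."""
        pole_xy = np.asarray(pole_xy, dtype=float)
        rng = np.linalg.norm(pole_xy)
        bearing = pole_xy / rng                  # unit vector towards the pole
        range_error = rng - desired_standoff     # >0: too far, <0: too close
        v_cmd = kp * range_error * bearing       # move along the line of sight
        speed = np.linalg.norm(v_cmd)
        if speed > v_max:                        # saturate the command for safety
            v_cmd *= v_max / speed
        return v_cmd

    print(pbvs_velocity_command([3.5, 0.5]))     # closes in towards the 2 m stand-off
    ```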

  9. Distributed Multi-interface Catalogue for Geospatial Data

    Science.gov (United States)

    Nativi, S.; Bigagli, L.; Mazzetti, P.; Mattia, U.; Boldrini, E.

    2007-12-01

    Several geosciences communities (e.g. atmospheric science, oceanography, hydrology) have developed tailored data and metadata models and service protocol specifications for enabling online data discovery, inventory, evaluation, access and download. These specifications are conceived either by profiling geospatial information standards or by extending well-accepted geosciences data models and protocols in order to capture more semantics. These artifacts have generated a set of related catalog and inventory services characterizing different communities, initiatives and projects. In fact, these geospatial data catalogs are discovery and access systems that use metadata as the target for queries on geospatial information. The indexed and searchable metadata provide a disciplined vocabulary against which intelligent geospatial search can be performed within or among communities. There exists a clear need to conceive and achieve solutions to implement interoperability among geosciences communities, in the context of the more general geospatial information interoperability framework. Such solutions should provide search and access capabilities across catalogs, inventory lists and their registered resources. Thus, the development of catalog clearinghouse solutions is a near-term challenge in support of fully functional and useful infrastructures for spatial data (e.g. INSPIRE, GMES, NSDI, GEOSS). This implies the implementation of components for query distribution and virtual resource aggregation. These solutions must implement distributed discovery functionalities in a heterogeneous environment, requiring metadata profile harmonization as well as protocol adaptation and mediation. We present a catalog clearinghouse solution for the interoperability of several well-known cataloguing systems (e.g. OGC CSW, THREDDS catalog and data services). The solution implements consistent resource discovery and evaluation over a dynamic federation of several well-known cataloguing and
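
    A hedged sketch of the discovery step such a clearinghouse federates is shown below for one of the mentioned interfaces, OGC CSW, using the OWSLib client; the catalogue endpoint is a hypothetical placeholder.

    ```python
    # Hedged CSW discovery sketch (assumed OWSLib API); the endpoint URL is a placeholder.
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    csw = CatalogueServiceWeb("https://catalog.example.org/csw")      # assumed endpoint
    query = PropertyIsLike("csw:AnyText", "%sea surface temperature%")
    csw.getrecords2(constraints=[query], maxrecords=10)

    for rec_id, rec in csw.records.items():
        print(rec_id, "-", rec.title)
    ```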

  10. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels over the world. Nowadays, it has been widely recognized that how to update these established geo-spatial databases and keep them up to date is most critical for their value. So, more and more efforts have been devoted to the continuous updating of these geospatial databases. Currently, there exist two main types of methods for geo-spatial database updating: directly updating with remote sensing images or field surveying materials, and indirectly updating with other updated data results such as newly updated data at a larger scale. The former method is fundamental, because the update data sources of both methods ultimately derive from field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after the larger scale database is updated, the smaller scale database should be updated correspondingly in order to keep the consistency of the multi-scale geo-spatial database. In this situation, it is very reasonable to apply map generalization technology to the process of geo-spatial database updating. The latter is recognized as one of the most promising methods of geo-spatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where databases at different scales are produced and maintained separately by organizations at different levels, such as in China. This paper focuses on applying digital map generalization to the updating of geo-spatial databases from larger scales in the collaborative updating environment for SDI. The requirements for the application of map generalization to spatial database updating are analyzed first. A brief review on geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing the updating of geospatial data from large scale including technical

  11. Geospatial Information is the Cornerstone of Effective Hazards Response

    Science.gov (United States)

    Newell, Mark

    2008-01-01

    Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders, and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS. The GIRT

  12. DEVELOPING WEB MAPPING APPLICATION USING ARCGIS SERVER WEB APPLICATION DEVELOPMENT FRAMEWORK (ADF) FOR GEOSPATIAL DATA GENERATED DURING REHABILITATION AND RECONSTRUCTION PROCESS OF POST-TSUNAMI 2004 DISASTER IN ACEH

    Directory of Open Access Journals (Sweden)

    Nizamuddin Nizamuddin

    2014-04-01

    Full Text Available ESRI ArcGIS Server is equipped with the ArcGIS Server Web Application Development Framework (ADF) and ArcGIS Web Controls integration for Visual Studio.NET. Both the ArcGIS Server Manager for .NET and the ArcGIS Web Controls can be easily utilized for developing ASP.NET-based ESRI Web mapping applications. In this study we implemented both tools to develop an ASP.NET-based ESRI Web mapping application for geospatial data generated during the rehabilitation and reconstruction process of the post-tsunami 2004 disaster in Aceh province. The rehabilitation and reconstruction process has produced a tremendous amount of geospatial data. This method was chosen in this study because, in the process of developing a web mapping application, one can easily and quickly create mapping services for huge geospatial datasets and also develop the Web mapping application without writing any code. However, when utilizing Visual Studio.NET 2008, one needs to have some coding ability.

  13. Data Mining and the Twitter Platform for Prescribed Burn and Wildfire Incident Reporting with Geospatial Applications

    Science.gov (United States)

    Endsley, K.; McCarty, J. L.

    2012-12-01

    Data mining techniques have been applied to social media in a variety of contexts, from mapping the evolution of the Tahrir Square protests in Egypt to predicting influenza outbreaks. The Twitter platform is a particular favorite due to its robust application programming interface (API) and high throughput. Twitter, Inc. estimated in 2011 that over 2,200 messages or "tweets" are generated every second. Also helpful is Twitter's semblance in operation to the short message service (SMS), better known as "texting," available on cellular phones and the most popular means of wide telecommunications in many developing countries. In the United States, Twitter has been used by a number of federal, state and local officials as well as motivated individuals to report prescribed burns in advance (sometimes as part of a reporting obligation) or to communicate the emergence, response to, and containment of wildfires. These reports are unstructured and, like all Twitter messages, limited to 140 UTF-8 characters. Through internal research and development at the Michigan Tech Research Institute, the authors have developed a data mining routine that gathers potential tweets of interest using the Twitter API, eliminates duplicates ("retweets"), and extracts relevant information such as the approximate size and condition of the fire. Most importantly, the message is geocoded and/or contains approximate locational information, allowing for prescribed and wildland fires to be mapped. Natural language processing techniques, adapted to improve computational performance, are used to tokenize and tag these elements for each tweet. The entire routine is implemented in the Python programming language, using open-source libraries. As such, it is demonstrated in a web-based framework where prescribed burns and/or wildfires are mapped in real time, visualized through a JavaScript-based mapping client in any web browser. The practices demonstrated here generalize to an SMS platform (or any short
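
    A hedged sketch of the kind of lightweight extraction described above is shown below: drop retweets, then pull an approximate burn size and county name out of the free text. The regular expressions and example tweets are illustrative, not the authors' actual routine.

    ```python
    # Hedged tweet-mining sketch: filter retweets and extract burn size and county.
    import re

    tweets = [
        "Prescribed burn planned today: approx 120 acres, Marquette County. Smoke visible from M-28.",
        "RT @fireinfo: Prescribed burn planned today: approx 120 acres, Marquette County.",
        "Wildfire update: containment at 60%, est. 45 acres burned near Baraga County line.",
    ]

    SIZE_RE = re.compile(r"(\d+(?:\.\d+)?)\s*acres", re.I)
    COUNTY_RE = re.compile(r"([A-Z][a-z]+)\s+County")

    for text in tweets:
        if text.startswith("RT "):               # crude retweet filter
            continue
        size = SIZE_RE.search(text)
        county = COUNTY_RE.search(text)
        print({"acres": float(size.group(1)) if size else None,
               "county": county.group(1) if county else None})
    ```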

  14. Remote Sensing and Geospatial Technological Applications for Site-specific Management of Fruit and Nut Crops: A Review

    Directory of Open Access Journals (Sweden)

    Joel O. Paz

    2010-08-01

    Full Text Available Site-specific crop management (SSCM) is one facet of precision agriculture which is helping increase production with minimal input. It has enhanced the cost-benefit scenario in crop production. Even though SSCM is very widely used in row crop agriculture such as corn, wheat, rice, and soybean, it has very little application in cash crops like fruit and nut. The main goal of this review paper was to conduct a comprehensive review of advanced technologies, including geospatial technologies, used in site-specific management of fruit and nut crops. The review explores various remote sensing data from different platforms like satellite, LIDAR, aerial, and field imaging. The study analyzes the use of satellite sensors, such as Quickbird, Landsat, SPOT, and IRS imagery, as well as hyperspectral narrow-band remote sensing data, in the study of fruit and nut crops such as blueberry, citrus, peach, and apple. The study also explores other geospatial technologies such as GPS, GIS spatial modeling, advanced image processing techniques, and information technology for suitability studies, orchard delineation, and classification accuracy assessment. The study also provides an example of a geospatial model developed in ArcGIS ModelBuilder to automate the blueberry production suitability analysis. The GIS spatial model is developed using various crop characteristics such as chilling hours, soil permeability, drainage, and pH, together with land cover, to determine the best sites for growing blueberry in Georgia, U.S. The study also provides a list of spectral reflectance curves developed for some fruit and nut crops: blueberry, crowberry, redblush citrus, orange, prickly pear, and peach. The study also explains these curves in detail to help researchers choose the image platform, sensor, and spectral wavelengths for SSCM of various fruit and nut crops.
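
    The weighted-overlay logic behind such a suitability model can be sketched in a hedged way as below; the toy grids, reclassification thresholds and weights are invented for illustration and are not the study's ModelBuilder values.

    ```python
    # Hedged raster weighted-overlay suitability sketch (illustrative values only).
    import numpy as np

    chill_hours = np.array([[650, 700, 720], [600, 640, 690], [560, 610, 660]])
    soil_ph     = np.array([[4.8, 5.0, 5.4], [4.6, 4.9, 5.2], [5.6, 5.8, 6.1]])
    drainage    = np.array([[3, 3, 2], [3, 2, 2], [1, 1, 2]])   # 3 = well drained

    # Reclassify each criterion to a 0-1 suitability score.
    s_chill = (chill_hours >= 600).astype(float)
    s_ph    = ((soil_ph >= 4.5) & (soil_ph <= 5.5)).astype(float)
    s_drain = drainage / 3.0

    weights = {"chill": 0.4, "ph": 0.4, "drain": 0.2}           # assumed weights
    suitability = (weights["chill"] * s_chill
                   + weights["ph"] * s_ph
                   + weights["drain"] * s_drain)

    print(np.round(suitability, 2))   # cells near 1.0 are the most suitable
    ```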

  15. IVAG: An Integrative Visualization Application for Various Types of Genomic Data Based on R-Shiny and the Docker Platform.

    Science.gov (United States)

    Lee, Tae-Rim; Ahn, Jin Mo; Kim, Gyuhee; Kim, Sangsoo

    2017-12-01

    Next-generation sequencing (NGS) technology has become a trend in the genomics research area. There are many software programs and automated pipelines to analyze NGS data, which can ease the pain for traditional scientists who are not familiar with computer programming. However, downstream analyses, such as finding differentially expressed genes or visualizing linkage disequilibrium maps and genome-wide association study (GWAS) data, still remain a challenge. Here, we introduce a dockerized web application written in R using the Shiny platform to visualize pre-analyzed RNA sequencing and GWAS data. In addition, we have integrated a genome browser based on the JBrowse platform and an automated intermediate parsing process required for custom track construction, so that users can easily build and navigate their personal genome tracks with in-house datasets. This application will help scientists perform series of downstream analyses and obtain a more integrative understanding about various types of genomic data by interactively visualizing them with customizable options.

  16. Virtual interconnection platform initiative scoping study

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yong [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kou, Gefei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pan, Zuohong [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Liu, Yilu [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); King Jr., Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-01-01

    Due to security and liability concerns, the research community has limited access to realistic large-scale power grid models to test and validate new operation and control methodologies. It is also difficult for industry to evaluate the relative value of competing new tools without a common platform for comparison. This report proposes to develop a large-scale virtual power grid model that retains basic features and represents future trends of major U.S. electric interconnections. This model will include realistic power flow and dynamics information as well as a relevant geospatial distribution of assets. This model will be made widely available to the research community for various power system stability and control studies and can be used as a common platform for comparing the efficacies of various new technologies.

  17. Intelligence, mapping, and geospatial exploitation system (IMAGES)

    Science.gov (United States)

    Moellman, Dennis E.; Cain, Joel M.

    1998-08-01

    This paper provides further detail to one facet of the battlespace visualization concept described in last year's paper Battlespace Situation Awareness for Force XXI. It focuses on the National Imagery and Mapping Agency (NIMA) goal to 'provide customers seamless access to tailorable imagery, imagery intelligence, and geospatial information.' This paper describes Intelligence, Mapping, and Geospatial Exploitation System (IMAGES), an exploitation element capable of CONUS baseplant operations or field deployment to provide NIMA geospatial information collaboratively into a reconnaissance, surveillance, and target acquisition (RSTA) environment through the United States Imagery and Geospatial Information System (USIGS). In a baseplant CONUS setting IMAGES could be used to produce foundation data to support mission planning. In the field it could be directly associated with a tactical sensor receiver or ground station (e.g. UAV or UGV) to provide near real-time and mission specific RSTA to support mission execution. This paper provides IMAGES functional level design; describes the technologies, their interactions and interdependencies; and presents a notional operational scenario to illustrate the system flexibility. Using as a system backbone an intelligent software agent technology, called Open Agent Architecture™ (OAA™), IMAGES combines multimodal data entry, natural language understanding, and perceptual and evidential reasoning for system management. Configured to be DII COE compliant, it would utilize, to the extent possible, COTS applications software for data management, processing, fusion, exploitation, and reporting. It would also be modular, scaleable, and reconfigurable. This paper describes how the OAA™ achieves data synchronization and enables the necessary level of information to be rapidly available to various command echelons for making informed decisions. The reasoning component will provide for the best information to be developed in the timeline

  18. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  19. Geospatial intelligence and visual classification of environmentally observed species in the Future Internet

    Science.gov (United States)

    Arbab-Zavar, B.; Chakravarthy, A.; Sabeur, Z. A.

    2012-04-01

    The rapid development of advanced smart communication tools with good quality and resolution video cameras, audio and GPS devices in the last few years will lead to profound impacts on the way future environmental observations are conducted and accessed by communities. The resulting large-scale interconnections of these "Future Internet Things" form a large environmental sensing network which will generate large volumes of quality environmental observations at highly localised spatial scales. This enablement of environmental sensing at local scales will be of great importance for the study of fauna and flora in the near future, particularly regarding the effect of climate change on biodiversity in various regions of Europe and beyond. The Future Internet could also potentially become the de facto information space to provide participative real-time sensing by communities and improve our situation awareness of the effect of climate on local environments. In the ENVIROFI (2011-2013) Usage Area project in the FP7 FI-PPP programme, a set of requirements for specific (and generic) enablers is achieved with the potential establishment of participating community observatories of the future. In particular, the specific enablement of interest concerns the building of future interoperable services for managing environmental data intelligently, with tagged contextual geo-spatial information generated by multiple operators in communities (using smart phones). The classification of observed species in the resulting images is achieved with structured data pre-processing, semantic enrichment using contextual geospatial information, and high-level fusion with controlled uncertainty estimations. The returned identification of species is further improved using future ground truth corrections and learning by the specific enablers.

  20. Integration of Geospatial Science in Teacher Education

    Science.gov (United States)

    Hauselt, Peggy; Helzer, Jennifer

    2012-01-01

    One of the primary missions of our university is to train future primary and secondary teachers. Geospatial sciences, including GIS, have long been excluded from teacher education curriculum. This article explains the curriculum revisions undertaken to increase the geospatial technology education of future teachers. A general education class…

  1. Geospatial Data Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...

  2. Bridging the Gap Between Surveyors and the Geo-Spatial Society

    Science.gov (United States)

    Müller, H.

    2016-06-01

    For many years FIG, the International Association of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, with the geospatial industries in particular. Traditionally the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on an in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors today increasingly develop into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. The list of professional tasks underpins the capabilities of surveyors to contribute to high-quality geospatial data and information management. In that way modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.

  3. Exploring U.S Cropland - A Web Service based Cropland Data Layer Visualization, Dissemination and Querying System (Invited)

    Science.gov (United States)

    Yang, Z.; Han, W.; di, L.

    2010-12-01

    The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, which is a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production & transportation planning, environmental health research and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery or online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (external users) or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization & browsing, no geospatial query capability, nor online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a Web-service based service-oriented architecture, adopts open standard geospatial information science technology and OGC specifications and standards, and re-uses functions/algorithms from GeoBrain technology (developed at George Mason University). This system provides capabilities for on-line geospatial crop information access, query and on-line analytics via interactive maps. It disseminates all data to decision makers and users via real-time retrieval, processing and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications. This web service based system greatly improves equal-accessibility, interoperability, usability
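
    The point-query capability such a system adds can be sketched in a hedged way; the endpoint URL and parameter names below are hypothetical placeholders and do not describe the actual NASS service interface.

    ```python
    # Hedged sketch of a crop-category point query against a CDL-style web service.
    # The endpoint and parameters are placeholders, not the real NASS/CropScape API.
    import requests

    def crop_at_point(lon, lat, year):
        resp = requests.get(
            "https://cdl.example.org/query",     # assumed endpoint
            params={"lon": lon, "lat": lat, "year": year, "format": "json"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()                       # e.g. {"category": "Corn", "code": 1}

    # print(crop_at_point(-93.5, 41.9, 2010))    # example call; requires a live service
    ```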

  4. Preparing for a Product Platform

    DEFF Research Database (Denmark)

    Fiil-Nielsen, Ole; Munk, Lone; Mortensen, Niels Henrik

    2005-01-01

    Experience in the industry as well as recent related scientific publications show the benefits of product development platforms. Companies use platforms to develop not a single but multiple products (i.e. a product family) simultaneously. When these product development projects are coordinated … on commonalities and similarities in the product family, and variance should be based on customer demands. To relate these terms and to improve the basis on which decisions are made, we need a way of visualizing the hierarchy of the product family as well as the commonality and variance. This visualization method … of the platform or ensuring that the platform can meet future demands will be very useful in the preparation process of a platform synthesis as well as in the updating or reengineering of an existing product development platform.

  5. Automatic geospatial information Web service composition based on ontology interface matching

    Science.gov (United States)

    Xu, Xianbin; Wu, Qunyong; Wang, Qinmin

    2008-10-01

    With Web services technology, the functions of WebGIS can be presented as a kind of geospatial information service, helping to overcome the information isolation that has limited geospatial information sharing. Thus Geospatial Information Web service composition, which conglomerates outsourced services working in tandem to offer value-added services, plays the key role in fully taking advantage of geospatial information services. This paper proposes an automatic geospatial information web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among service interfaces. By matching input/output parameters against the semantic meaning of pairs of service interfaces, a geospatial information web service chain can be created from a number of candidate services. A prototype implementation of the algorithm is also presented, and its results show the feasibility of this algorithm and its great promise for the emerging demand for geospatial information web service composition.
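
    The interface-matching idea (scoring how close an output parameter of one service is to an input parameter of another using WordNet) can be sketched in a hedged way with NLTK's WordNet corpus; this is an illustration of the technique, not the authors' implementation.

    ```python
    # Hedged sketch of parameter matching via WordNet path similarity.
    # Requires nltk with the WordNet corpus: nltk.download("wordnet").
    from nltk.corpus import wordnet as wn

    def best_similarity(term_a, term_b):
        """Maximum path similarity over all noun senses of two parameter names."""
        best = 0.0
        for sa in wn.synsets(term_a, pos=wn.NOUN):
            for sb in wn.synsets(term_b, pos=wn.NOUN):
                best = max(best, sa.path_similarity(sb) or 0.0)
        return best

    # Can a "rainfall" output plausibly feed an input named "precipitation"?
    print(best_similarity("rainfall", "precipitation"))   # high score: candidate chain link
    print(best_similarity("rainfall", "elevation"))       # low score: unlikely match
    ```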

  6. Visualizing uncertainties with the North Wyke Farm Platform Data Sets

    Science.gov (United States)

    Harris, Paul; Brunsdon, Chris; Lee, Michael

    2016-04-01

    The North Wyke Farm Platform (NWFP) is a systems-based, farm-scale experiment with the aim of addressing agricultural productivity and ecosystem responses to different management practices. The 63 ha site captures the spatial and/or temporal data necessary to develop a better understanding of the dynamic processes and underlying mechanisms that can be used to model how agricultural grassland systems respond to different management inputs. Via cattle beef and sheep production, the underlying principle is to manage each of three farmlets (each consisting of five hydrologically-isolated sub-catchments) in three contrasting ways: (i) improvement of permanent pasture through use of mineral fertilizers; (ii) improvement through use of legumes; and (iii) improvement through innovation. The connectivity between the timing and intensity of the different management operations, together with the transport of nutrients and potential pollutants from the NWFP is evaluated using numerous inter-linked data collection exercises. In this paper, we introduce some of the visualization opportunities that are possible with this rich data resource, and methods of analysis that might be applied to it, in particular with respect to data and model uncertainty operating across both temporal and spatial dimensions. An important component of the NWFP experiment is the representation of trade-offs with respect to: (a) economic profits, (b) environmental concerns, and (c) societal benefits, under the umbrella of sustainable intensification. Various visualizations exist to display such trade-offs and here we demonstrate ways to tailor them to relay key uncertainties and assessments of risk; and also consider how these visualizations can be honed to suit different audiences.
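
    One of the simpler visualizations discussed, a per-farmlet time series with an uncertainty band, can be sketched as below; the data are synthetic, and the variable, units and treatment means are illustrative assumptions rather than NWFP measurements.

    ```python
    # Hedged sketch of a per-farmlet time series with a 2-sigma uncertainty band.
    # All values are synthetic; they do not reproduce NWFP data.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(1)
    weeks = np.arange(52)
    for name, base in [("mineral fertilizer", 4.0), ("legumes", 3.2), ("innovation", 2.8)]:
        mean = base + 0.8 * np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 0.1, weeks.size)
        sd = 0.4 + 0.1 * rng.random(weeks.size)
        plt.plot(weeks, mean, label=name)
        plt.fill_between(weeks, mean - 2 * sd, mean + 2 * sd, alpha=0.2)

    plt.xlabel("week of year")
    plt.ylabel("nitrate-N concentration (mg/l, synthetic)")
    plt.legend()
    plt.tight_layout()
    plt.savefig("farmlet_uncertainty.png", dpi=150)
    ```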

  7. GeoSpatial Data Analysis for DHS Programs

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, Eric G.; Burke, John S.; Carlson, Carrie A.; Gillen, David S.; Joslyn, Cliff A.; Olsen, Bryan K.; Critchlow, Terence J.

    2009-05-10

    Law enforcement within the Department of Homeland Security faces the continual challenge of analyzing its custom data sources in a geospatial context. From a strategic perspective, law enforcement has certain requirements: first to broadly characterize a given situation using its custom data sources, and then, once the situation is summarily understood, to analyze the data geospatially in detail.

  8. Helicopter flight simulation motion platform requirements

    Science.gov (United States)

    Schroeder, Jeffery Allyn

    Flight simulators attempt to reproduce in-flight pilot-vehicle behavior on the ground. This reproduction is challenging for helicopter simulators, as the pilot is often inextricably dependent on external cues for pilot-vehicle stabilization. One important simulator cue is platform motion; however, its required fidelity is unknown. To determine the required motion fidelity, several unique experiments were performed. A large displacement motion platform was used that allowed pilots to fly tasks with matched motion and visual cues. Then, the platform motion was modified to give cues varying from full motion to no motion. Several key results were found. First, lateral and vertical translational platform cues had significant effects on fidelity. Their presence improved performance and reduced pilot workload. Second, yaw and roll rotational platform cues were not as important as the translational platform cues. In particular, the yaw rotational motion platform cue did not appear at all useful in improving performance or reducing workload. Third, when the lateral translational platform cue was combined with visual yaw rotational cues, pilots believed the platform was rotating when it was not. Thus, simulator systems can be made more efficient by proper combination of platform and visual cues. Fourth, motion fidelity specifications were revised that now provide simulator users with a better prediction of motion fidelity based upon the frequency responses of their motion control laws. Fifth, vertical platform motion affected pilot estimates of steady-state altitude during altitude repositionings. This refutes the view that pilots estimate altitude and altitude rate in simulation solely from visual cues. Finally, the combined results led to a general method for configuring helicopter motion systems and for developing simulator tasks that more likely represent actual flight. The overall results can serve as a guide to future simulator designers and to today's operators.

  9. Geospatial Analysis of Oil and Gas Wells in California

    Science.gov (United States)

    Riqueros, N. S.; Kang, M.; Jackson, R. B.

    2015-12-01

    California currently ranks third in oil production by U.S. state and more than 200,000 wells have been drilled in the state. Oil and gas wells provide a potential pathway for subsurface migration, leading to groundwater contamination and emissions of methane and other fluids to the atmosphere. Here we compile available public databases on oil and gas wells from the California Department of Conservation's Division of Oil, Gas, and Geothermal Resources, the U.S. Geological Survey, and other state and federal sources. We perform geospatial analysis at the county and field levels to characterize depths, producing formations, spud/completion/abandonment dates, land cover, population, and land ownership of active, idle, buried, abandoned, and plugged wells in California. The compiled database is designed to serve as a quantitative platform for developing field-based groundwater and air emission monitoring plans.
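
    As an illustration of the kind of county-level geospatial analysis described, the hedged sketch below spatially joins a point layer of wells to county polygons with GeoPandas and summarizes well counts by status. File paths and column names are hypothetical placeholders, not the compiled database itself.

    ```python
    # Hedged sketch: aggregating well records by county with GeoPandas.
    # File paths and column names are hypothetical placeholders.
    import geopandas as gpd

    wells = gpd.read_file("california_wells.geojson")        # point geometries with a 'status' column
    counties = gpd.read_file("california_counties.geojson")  # polygon geometries with a 'name' column

    # Ensure both layers share a coordinate reference system before joining.
    wells = wells.to_crs(counties.crs)

    # Attach the containing county to each well, then summarize by county and status.
    # ('predicate=' assumes GeoPandas >= 0.10; older versions use 'op=' instead.)
    joined = gpd.sjoin(wells, counties[["name", "geometry"]], how="inner", predicate="within")
    summary = joined.groupby(["name", "status"]).size().unstack(fill_value=0)
    print(summary.head())
    ```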

  10. Geospatial Health: the first five years

    Directory of Open Access Journals (Sweden)

    Jürg Utzinger

    2011-11-01

    Full Text Available Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000 and the inaugural issue of its official journal was published in November 2006 with the aim to cover all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member strong editorial board. This overview takes stock of the first five years of publishing: 133 contributions have been published so far, primarily original research (79.7%), followed by reviews (7.5%), announcements (6.0%), editorials and meeting reports (3.0% each) and a preface in the first issue. A content analysis of all the original research articles and reviews reveals that three quarters of the publications focus on human health with the remainder dealing with veterinary health. Two thirds of the papers come from Africa, Asia and Europe with similar numbers of contributions from each continent. Studies of more than 35 different diseases, injuries and risk factors have been presented. Malaria and schistosomiasis were identified as the two most important diseases (11.2% each). Almost half the contributions were based on GIS, one third on spatial analysis, often using advanced Bayesian geostatistics (13.8%), and one quarter on remote sensing. The 120 original research articles, reviews and editorials were produced by 505 authors based at institutions and universities in 52 countries.

  11. Capacity Building through Geospatial Education in Planning and School Curricula

    Science.gov (United States)

    Kumar, P.; Siddiqui, A.; Gupta, K.; Jain, S.; Krishna Murthy, Y. V. N.

    2014-11-01

    Geospatial technology has widespread usage in development planning and resource management. It offers pragmatic tools to help urban and regional planners to realize their goals. At the request of the Ministry of Urban Development, Govt. of India, the Indian Institute of Remote Sensing (IIRS), Dehradun has taken an initiative to study the model syllabi of the All India Council for Technical Education for planning curricula of Bachelor and Master (five disciplines) programmes. It is inferred that geospatial content across the semesters in various planning fields needs revision. It is also realized that students pursuing planning curricula are invariably exposed to spatial mapping tools, but the popular digital drafting software packages have limitations on geospatial analysis of planning phenomena. Therefore, students need exposure to geospatial technologies to understand various real world phenomena. Inputs were given to seamlessly merge and incorporate geospatial components throughout the semesters wherever relevant. Another initiative was taken by IIRS to enhance the understanding and essence of space and geospatial technologies amongst young minds at the 10+2 level. The content was proposed in a manner such that youngsters start realizing the innumerable contributions made by space and geospatial technologies in their day-to-day life. This effort, both at school and college level, would help not only in enhancing job opportunities for the young generation but also in utilizing the untapped human resource potential. In the era of smart cities, higher economic growth and aspirations for a better tomorrow, integration of geospatial technologies with conventional wisdom can no longer be ignored.

  12. Challenges in sharing of geospatial data by data custodians in South Africa

    Science.gov (United States)

    Kay, Sissiel E.

    2018-05-01

    As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, which has the sharing of geospatial data as a key objective. The collection and maintenance of geospatial data is expensive and time consuming, and so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing and, if not, what the reasons for this are. Through an international comparative assessment, it then determines whether compliance with the SDI Act is too onerous on the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.

  13. The geospatial data quality REST API for primary biodiversity data.

    Science.gov (United States)

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

    We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Supplementary data are available at Bioinformatics online.
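
    A hedged sketch of calling the API with the requests library is shown below. The endpoint URL is the one given above, but the query parameter names (Darwin Core style latitude/longitude/country fields) are assumptions about the request format, not documented behaviour.

    ```python
    # Hedged sketch of calling the Geospatial Data Quality API with the requests library.
    # The endpoint comes from the abstract; the parameter names are assumptions and may
    # not match the actual request schema.
    import requests

    API_URL = "http://api-geospatial.vertnet-portal.appspot.com/geospatial"

    record = {
        "decimalLatitude": "42.72",     # assumed Darwin Core style field names
        "decimalLongitude": "-1.64",
        "countryCode": "ES",
    }

    response = requests.get(API_URL, params=record, timeout=30)
    response.raise_for_status()
    report = response.json()  # expected to contain completeness/consistency flags
    print(report)
    ```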

  14. A resource-oriented architecture for a Geospatial Web

    Science.gov (United States)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this aim, an investigation into the nature of the Web and into the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic Distributed Computing Infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. On the contrary

  15. Global polar geospatial information service retrieval based on search engine and ontology reasoning

    Science.gov (United States)

    Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang

    2007-01-01

    In order to improve the access precision of polar geospatial information services on the Web, a new methodology for retrieving global spatial information services based on geospatial service search and ontology reasoning is proposed: the geospatial service search finds coarse candidate services on the Web, and the ontology reasoning refines the selection from these coarse candidates. The proposed framework includes standardized distributed geospatial Web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include service discovery based on a search engine as well as service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.

  16. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    Science.gov (United States)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

    Servers supporting Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets, has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level of detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be
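
    The sketch below illustrates the cascading-KML idea in a hedged way: each server response embeds a Region with a Lod threshold and a NetworkLink to the next level of detail, so the client only requests finer imagery when the user zooms in. The URLs and tile naming are hypothetical placeholders; this is not the system described above.

    ```python
    # Illustrative sketch of the cascading-KML idea: each response embeds a Region/Lod
    # and a NetworkLink pointing at the next level of detail, so the client only asks
    # for finer imagery when the user zooms in. URLs and tile naming are hypothetical.
    from xml.sax.saxutils import escape

    def lod_networklink(name, href, west, south, east, north, min_lod_pixels=256):
        return f"""<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Document>
        <NetworkLink>
          <name>{escape(name)}</name>
          <Region>
            <LatLonAltBox>
              <north>{north}</north><south>{south}</south>
              <east>{east}</east><west>{west}</west>
            </LatLonAltBox>
            <Lod><minLodPixels>{min_lod_pixels}</minLodPixels></Lod>
          </Region>
          <Link>
            <href>{escape(href)}</href>
            <viewRefreshMode>onRegion</viewRefreshMode>
          </Link>
        </NetworkLink>
      </Document>
    </kml>"""

    # Launch KML covering an initial region; the linked document would itself contain
    # child NetworkLinks for its quadrants, generated the same way on demand.
    print(lod_networklink("level-0", "https://example.com/tiles/0/0/0.kml",
                          west=-110.0, south=35.0, east=-100.0, north=45.0))
    ```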

  17. Integrating Free and Open Source Solutions into Geospatial Science Education

    Directory of Open Access Journals (Sweden)

    Vaclav Petras

    2015-06-01

    Full Text Available While free and open source software becomes increasingly important in geospatial research and industry, open science perspectives are generally less reflected in universities’ educational programs. We present an example of how free and open source software can be incorporated into geospatial education to promote open and reproducible science. Since 2008 graduate students at North Carolina State University have the opportunity to take a course on geospatial modeling and analysis that is taught with both proprietary and free and open source software. In this course, students perform geospatial tasks simultaneously in the proprietary package ArcGIS and the free and open source package GRASS GIS. By ensuring that students learn to distinguish between geospatial concepts and software specifics, students become more flexible and stronger spatial thinkers when choosing solutions for their independent work in the future. We also discuss ways to continually update and improve our publicly available teaching materials for reuse by teachers, self-learners and other members of the GIS community. Only when free and open source software is fully integrated into geospatial education, we will be able to encourage a culture of openness and, thus, enable greater reproducibility in research and development applications.

  18. Geospatial Absorption and Regional Effects

    Directory of Open Access Journals (Sweden)

    IOAN MAC

    2009-01-01

    Full Text Available The geospatial absorptions are characterized by a specific complexity both in content and in their phenomenological and spatial manifestation fields. Such processes are differentiated according to their specificity into pre-absorption, absorption or post-absorption. The mechanisms that contribute to absorption are extremely numerous: aggregation, extension, diffusion, substitution, resistivity (resilience), stratification, borrowings, etc. Between these mechanisms frequent relations are established, determining an amplification of the process and of its regional effects. The installation of the geographic osmosis phenomenon in a given territory (a place, for example) leads to a homogenization of the geospatial state and to the installation of regional homogeneity.

  19. Biosecurity and geospatial analysis of mycoplasma infections in ...

    African Journals Online (AJOL)

    Geospatial database of farm locations and biosecurity measures are essential to control disease outbreaks. A study was conducted to establish geospatial database on poultry farms in Al-Jabal Al-Gharbi region of Libya, to evaluate the biosecurity level of each farm and to determine the seroprevalence of mycoplasma and ...

  20. Searches over graphs representing geospatial-temporal remote sensing data

    Science.gov (United States)

    Brost, Randolph; Perkins, David Nikolaus

    2018-03-06

    Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Geospatial-temporal graph searches are made computationally efficient by taking advantage of characteristics of geospatial-temporal data in remote sensing images through the application of various graph search techniques.
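
    A hedged sketch of such a geospatial-temporal graph, built with NetworkX, is shown below: objects become nodes, spatial relations become symmetric edges, and temporal change becomes directed edges between observations. The object identifiers and the toy query are hypothetical.

    ```python
    # Hedged sketch of a geospatial-temporal graph along the lines described: image objects
    # become nodes, spatial relations become (symmetric) edges, and temporal change becomes
    # directed edges between observations of the same scene. Object IDs are hypothetical.
    import networkx as nx

    g = nx.DiGraph()

    # Objects extracted from a remote sensing image at time t0.
    g.add_node("building_1@t0", kind="building", t=0)
    g.add_node("pond_1@t0", kind="pond", t=0)
    # Spatial relations are undirected; model them as a pair of directed edges.
    g.add_edge("building_1@t0", "pond_1@t0", relation="adjacent", distance_m=40)
    g.add_edge("pond_1@t0", "building_1@t0", relation="adjacent", distance_m=40)

    # The same objects observed at time t1, linked by directed "time" edges.
    g.add_node("building_1@t1", kind="building", t=1)
    g.add_node("pond_1@t1", kind="pond", t=1)
    g.add_edge("building_1@t0", "building_1@t1", relation="time")
    g.add_edge("pond_1@t0", "pond_1@t1", relation="time")

    # A toy "search": buildings at t0 that have at least one adjacency relation.
    hits = [n for n, d in g.nodes(data=True)
            if d["kind"] == "building" and d["t"] == 0
            and any(g[n][m].get("relation") == "adjacent" for m in g.successors(n))]
    print(hits)
    ```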

  1. ADMS Evaluation Platform

    Energy Technology Data Exchange (ETDEWEB)

    2018-01-23

    Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.

  2. NCI's Distributed Geospatial Data Server

    Science.gov (United States)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach to batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments on distributed computing using interactive access to significant cloud infrastructure opens the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under
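
    A hedged sketch of consuming such a service through one of the standards mentioned (WMS) with OWSLib follows; the service URL and layer name are placeholders rather than NCI's actual endpoints.

    ```python
    # Hedged sketch of consuming a geospatial data server through its OGC WMS interface
    # with OWSLib. The URL and layer name are placeholders, not NCI's actual endpoints.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
    print(list(wms.contents))  # advertised layers

    img = wms.getmap(
        layers=["landsat_ndvi"],            # hypothetical layer name
        styles=[""],
        srs="EPSG:4326",
        bbox=(110.0, -45.0, 155.0, -10.0),  # roughly Australia
        size=(800, 600),
        format="image/png",
        transparent=True,
    )
    with open("ndvi.png", "wb") as f:
        f.write(img.read())
    ```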

  3. Cloud Geospatial Analysis Tools for Global-Scale Comparisons of Population Models for Decision Making

    Science.gov (United States)

    Hancher, M.; Lieber, A.; Scott, L.

    2017-12-01

    The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.
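
    The hedged sketch below shows the general shape of such a comparison with the Earth Engine Python API: two gridded population products are reduced to a total count over a region of interest. The dataset identifiers and the region are assumptions for illustration, not the application's actual configuration.

    ```python
    # Hedged sketch using the Earth Engine Python API to compare population models over a
    # region of interest. The dataset IDs are assumptions about catalog names and may differ.
    import ee

    ee.Initialize()  # assumes prior authentication (ee.Authenticate())

    roi = ee.Geometry.Rectangle([36.6, -1.5, 37.1, -1.1])  # hypothetical area of interest

    # Two gridded population products (IDs assumed), reduced to a total count over the ROI.
    gpw = ee.Image(ee.ImageCollection("CIESIN/GPWv411/GPW_Population_Count").first())
    worldpop = (ee.ImageCollection("WorldPop/GP/100m/pop")
                .filter(ee.Filter.eq("year", 2020))
                .mosaic())

    def total(image, scale):
        stats = image.reduceRegion(reducer=ee.Reducer.sum(), geometry=roi,
                                   scale=scale, maxPixels=1e9)
        return stats.getInfo()

    print("GPW total:", total(gpw, 1000))
    print("WorldPop total:", total(worldpop, 100))
    ```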

  4. Revelation of `Hidden' Balinese Geospatial Heritage on A Map

    Science.gov (United States)

    Soeria Atmadja, Dicky A. S.; Wikantika, Ketut; Budi Harto, Agung; Putra, Daffa Gifary M.

    2018-05-01

    Bali is not just about beautiful nature. It also has a unique and interesting cultural heritage, including `hidden' geospatial heritage. Tri Hita Karana is a Hinduism concept of life consisting of human relation to God, to other humans and to the nature (Parahiyangan, Pawongan and Palemahan), Based on it, - in term of geospatial aspect - the Balinese derived its spatial orientation, spatial planning & lay out, measurement as well as color and typography. Introducing these particular heritage would be a very interesting contribution to Bali tourism. As a respond to these issues, a question arise on how to reveal these unique and highly valuable geospatial heritage on a map which can be used to introduce and disseminate them to the tourists. Symbols (patterns & colors), orientation, distance, scale, layout and toponimy have been well known as elements of a map. There is an chance to apply Balinese geospatial heritage in representing these map elements.

  5. The geo-spatial information infrastructure at the Centre for Control and Prevention of Zoonoses, University of Ibadan, Nigeria: an emerging sustainable One-Health pavilion.

    Science.gov (United States)

    Olugasa, B O

    2014-12-01

    The World-Wide-Web as a contemporary means of information sharing offers a platform for geo-spatial information dissemination to improve education about spatio-temporal patterns of disease spread at the human-animal-environment interface in developing countries of West Africa. In assessing the quality of exposure to geospatial information applications among students in five purposively selected institutions in West Africa, this study reviewed course contents and postgraduate programmes in zoonoses surveillance. Geospatial information content and associated practical exercises in zoonoses surveillance were scored. Seven criteria were used to categorize and score capability, namely, spatial data capture; thematic map design and interpretation; spatio-temporal analysis; remote sensing of data; statistical modelling; the management of spatial data-profile; and web-based map sharing operation within an organization. These criteria were used to compute weighted exposure during training at the institutions. A categorical description of the institution with the highest computed Cumulative Exposure Point Average (CEPA) was based on an illustration with retrospective records of rabies cases, using data from humans, animals and the environment, that were sourced from Grand Bassa County, Liberia, to create and share maps and information with faculty, staff, students and the neighbourhood about animal bite injury surveillance and the spatial distribution of rabies-like illness. Uniformly low CEPA values (0-1.3) were observed across academic departments. The highest (3.8) was observed at the Centre for Control and Prevention of Zoonoses (CCPZ), University of Ibadan, Nigeria, where geospatial techniques were systematically taught, and thematic and predictive maps were produced and shared online with other institutions in West Africa. In addition, a short course in zoonosis surveillance, which offers inclusive learning in geospatial applications, is taught at CCPZ. The paper

  6. Interoperability in planetary research for geospatial data analysis

    Science.gov (United States)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.

  7. Bim and Gis: when Parametric Modeling Meets Geospatial Data

    Science.gov (United States)

    Barazzetti, L.; Banfi, F.

    2017-12-01

    Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software is able to perform advanced geospatial analyses, but it lacks several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings has limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, although research work is currently under way to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by "pure" GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  9. A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs

    Science.gov (United States)

    Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav

    2015-01-01

    With the increasing abundance of technologies and smart devices equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has now become effortless. This, in particular, is the case for photographs, with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the "right" information for the appropriate purpose. This paper describes an approach to create semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and implementation for searching and querying the semantic geospatial metadata repository to enable a user or third party system to find the information they are looking for. PMID:26205265
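
    As a hedged illustration of enriching a photograph's raw geotag with higher-level geospatial metadata, the sketch below reverse-geocodes the coordinates with geopy/Nominatim. This is one possible data source, not the paper's own data model or service.

    ```python
    # Hedged sketch of enriching a photograph's raw GPS tag with place-name metadata via
    # reverse geocoding (one possible geospatial source; not the paper's own model).
    from geopy.geocoders import Nominatim

    def enrich_photo(lat: float, lon: float) -> dict:
        geolocator = Nominatim(user_agent="photo-enrichment-demo")
        location = geolocator.reverse((lat, lon), language="en")
        address = location.raw.get("address", {}) if location else {}
        return {
            "latitude": lat,
            "longitude": lon,
            "country": address.get("country"),
            "city": address.get("city") or address.get("town") or address.get("village"),
            "suburb": address.get("suburb"),
        }

    print(enrich_photo(54.5970, -5.9300))  # example coordinates in Belfast
    ```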

  10. BPELPower—A BPEL execution engine for geospatial web services

    Science.gov (United States)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements are especially in its capabilities for handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with enhancements needed only at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high performance parallel processing and broad Web paradigms.

  11. Restful Implementation of Catalogue Service for Geospatial Data Provenance

    Science.gov (United States)

    Jiang, L. C.; Yue, P.; Lu, X. C.

    2013-10-01

    Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance could be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending the ebXML Registry Information Model (ebRIM). Recent advances in the REpresentational State Transfer (REST) paradigm have shown great promise for the easy integration of distributed resources. RESTful Web services aim to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach to the provenance catalogue service could be improved by adopting the RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following the RESTful architectural style. A middleware named REST Converter is added on top of the legacy catalogue service to support a RESTful style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.
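
    The hedged sketch below illustrates the general idea of a RESTful facade in front of a legacy catalogue service: a small dispatcher maps resource URLs to handlers that would translate requests into CSW queries. The routes and the backend call are hypothetical placeholders, not the REST Converter itself.

    ```python
    # Hedged sketch of a RESTful facade over a legacy catalogue service: a small dispatcher
    # maps resource URLs to handlers, which would translate them into CSW requests behind
    # the scenes. Routes and the backend call are hypothetical placeholders.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def query_legacy_csw(constraint: dict) -> list:
        """Placeholder for the translation into an OGC CSW GetRecords request."""
        return [{"id": "prov-001", "constraint": constraint}]

    @app.route("/provenance", methods=["GET"])
    def list_provenance():
        # e.g. GET /provenance?dataset=landcover2020
        return jsonify(query_legacy_csw(dict(request.args)))

    @app.route("/provenance/<record_id>", methods=["GET"])
    def get_provenance(record_id):
        # e.g. GET /provenance/prov-001
        return jsonify({"id": record_id, "lineage": "placeholder"})

    if __name__ == "__main__":
        app.run(port=8080)
    ```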

  12. Analyzing engagement in a web-based intervention platform through visualizing log-data.

    Science.gov (United States)

    Morrison, Cecily; Doherty, Gavin

    2014-11-13

    Engagement has emerged as a significant cross-cutting concern within the development of Web-based interventions. There have been calls to institute a more rigorous approach to the design of Web-based interventions, to increase both the quantity and quality of engagement. One approach would be to use log-data to better understand the process of engagement and patterns of use. However, an important challenge lies in organizing log-data for productive analysis. Our aim was to conduct an initial exploration of the use of visualizations of log-data to enhance understanding of engagement with Web-based interventions. We applied exploratory sequential data analysis to highlight sequential aspects of the log data, such as time or module number, to provide insights into engagement. After applying a number of processing steps, a range of visualizations were generated from the log-data. We then examined the usefulness of these visualizations for understanding the engagement of individual users and the engagement of cohorts of users. The visualizations created are illustrated with two datasets drawn from studies using the SilverCloud Platform: (1) a small, detailed dataset with interviews (n=19) and (2) a large dataset (n=326) with 44,838 logged events. We present four exploratory visualizations of user engagement with a Web-based intervention, including Navigation Graph, Stripe Graph, Start-Finish Graph, and Next Action Heat Map. The first represents individual usage and the last three, specific aspects of cohort usage. We provide examples of each with a discussion of salient features. Log-data analysis through data visualization is an alternative way of exploring user engagement with Web-based interventions, which can yield different insights than more commonly used summative measures. We describe how understanding the process of engagement through visualizations can support the development and evaluation of Web-based interventions. Specifically, we show how visualizations
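
    As a hedged illustration of the kind of log-data visualization discussed, the sketch below draws a stripe-style timeline of events per user with matplotlib; the data are synthetic stand-ins, not the SilverCloud logs.

    ```python
    # Hedged sketch of a stripe-style timeline of logged events per user, loosely in the
    # spirit of the visualizations described (synthetic data, not the SilverCloud logs).
    import matplotlib.pyplot as plt

    # Synthetic log: user -> list of event days since enrolment.
    events = {
        "user_a": [0, 1, 2, 7, 8, 30],
        "user_b": [0, 3, 4, 5, 6, 10, 12, 14],
        "user_c": [0, 20, 21, 60],
    }

    fig, ax = plt.subplots(figsize=(8, 2.5))
    for row, (user, days) in enumerate(events.items()):
        ax.scatter(days, [row] * len(days), marker="|", s=200)
    ax.set_yticks(range(len(events)))
    ax.set_yticklabels(events.keys())
    ax.set_xlabel("Days since enrolment")
    ax.set_title("Logged events per user")
    plt.tight_layout()
    plt.savefig("stripe_graph.png")
    ```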

  13. PAVICS: A platform for the Analysis and Visualization of Climate Science - adopting a workflow-based analysis method for dealing with a multitude of climate data sources

    Science.gov (United States)

    Gauvin St-Denis, B.; Landry, T.; Huard, D. B.; Byrns, D.; Chaumont, D.; Foucher, S.

    2017-12-01

    As the number of scientific studies and policy decisions requiring tailored climate information continues to increase, the demand for support from climate service centers to provide the latest information in the format most helpful for the end-user is also on the rise. Ouranos, being one such organization based in Montreal, has partnered with the Centre de recherche informatique de Montreal (CRIM) to develop a platform that will offer climate data products that have been identified as most useful for users through years of consultation. The platform is built as modular components that target the various requirements of climate data analysis. The data components host and catalog NetCDF data as well as geographical and political delimitations. The analysis components are made available as atomic operations through Web Processing Service (WPS) or as workflows, whereby the operations are chained through a simple JSON structure and executed on a distributed network of computing resources. The visualization components range from Web Map Service (WMS) to a complete frontend for searching the data, launching workflows and interacting with maps of the results. Each component can easily be deployed and executed as an independent service through the use of Docker technology and a proxy is available to regulate user workspaces and access permissions. PAVICS includes various components from birdhouse, a collection of WPS initially developed by the German Climate Research Center (DKRZ) and Institut Pierre Simon Laplace (IPSL) and is designed to be highly interoperable with other WPS as well as many Open Geospatial Consortium (OGC) standards. Further connectivity is made with the Earth System Grid Federation (ESGF) nodes and local results are made searchable using the same API terminology. Other projects conducted by CRIM that integrate with PAVICS include the OGC Testbed 13 Innovation Program (IP) initiative that will enhance advanced cloud capabilities, application packaging

  14. Modeling photovoltaic diffusion: an analysis of geospatial datasets

    International Nuclear Information System (INIS)

    Davidson, Carolyn; Drury, Easan; Lopez, Anthony; Elmore, Ryan; Margolis, Robert

    2014-01-01

    This study combines address-level residential photovoltaic (PV) adoption trends in California with several types of geospatial information—population demographics, housing characteristics, foreclosure rates, solar irradiance, vehicle ownership preferences, and others—to identify which subsets of geospatial information are the best predictors of historical PV adoption. Number of rooms, heating source and house age were key variables that had not been previously explored in the literature, but are consistent with the expected profile of a PV adopter. The strong relationship provided by foreclosure indicators and mortgage status have less of an intuitive connection to PV adoption, but may be highly correlated with characteristics inherent in PV adopters. Next, we explore how these predictive factors and model performance varies between different Investor Owned Utility (IOU) regions in California, and at different spatial scales. Results suggest that models trained with small subsets of geospatial information (five to eight variables) may provide similar explanatory power as models using hundreds of geospatial variables. Further, the predictive performance of models generally decreases at higher resolution, i.e., below ZIP code level since several geospatial variables with coarse native resolution become less useful for representing high resolution variations in PV adoption trends. However, for California we find that model performance improves if parameters are trained at the regional IOU level rather than the state-wide level. We also find that models trained within one IOU region are generally representative for other IOU regions in CA, suggesting that a model trained with data from one state may be applicable in another state. (letter)
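
    The hedged sketch below reproduces the modeling idea on synthetic data: a regression trained on a small subset of predictors is compared with one trained on many, using cross-validated R^2. The variables are stand-ins, not the study's geospatial datasets.

    ```python
    # Hedged sketch of the modeling idea: compare a model trained on a small subset of
    # geospatial predictors with one trained on many. Data here are synthetic stand-ins.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 500
    # Synthetic "geospatial" predictors; only a few actually drive adoption.
    X_full = rng.normal(size=(n, 100))
    y = (0.8 * X_full[:, 0] - 0.5 * X_full[:, 1] + 0.3 * X_full[:, 2]
         + rng.normal(scale=0.5, size=n))

    small_subset = X_full[:, :5]  # e.g. rooms, house age, heating source, irradiance, income

    r2_small = cross_val_score(LinearRegression(), small_subset, y, cv=5, scoring="r2").mean()
    r2_full = cross_val_score(LinearRegression(), X_full, y, cv=5, scoring="r2").mean()
    print(f"5 predictors:   R^2 = {r2_small:.2f}")
    print(f"100 predictors: R^2 = {r2_full:.2f}")
    ```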

  15. Brokered virtual hubs for facilitating access and use of geospatial Open Data

    Science.gov (United States)

    Mazzetti, Paolo; Latre, Miguel; Kamali, Nargess; Brumana, Raffaella; Braumann, Stefan; Nativi, Stefano

    2016-04-01

    , beyond a certain extent, heterogeneity is irreducible especially in interdisciplinary contexts. ENERGIC OD Virtual Hubs address heterogeneity adopting a mediation and brokering approach: specific components (brokers) are dedicated to harmonize service interfaces, metadata and data models, enabling seamless discovery and access to heterogeneous infrastructures and datasets. As an innovation project, ENERGIC OD integrates several existing technologies to implement Virtual Hubs as single points of access to geospatial datasets provided by new or existing platforms and infrastructures, including INSPIRE-compliant systems and Copernicus services. A first version of the ENERGIC OD brokers has been implemented based on the GI-Suite Brokering Framework developed by CNR-IIA, and complemented with other tools under integration and development. It already enables mediated discovery and harmonized access to different geospatial Open Data sources. It is accessible by users as Software-as-a-Service through a browser. Moreover, open APIs and a Javascript library are available for application developers. Six ENERGIC OD Virtual Hubs have been currently deployed: one at regional level (Berlin metropolitan area) and five at national-level (in France, Germany, Italy, Poland and Spain). Each Virtual Hub manager decided the deployment strategy (local infrastructure or commercial Infrastructure-as-a-Service cloud), and the list of connected Open Data sources. The ENERGIC OD Virtual Hubs are under test and validation through the development of ten different mobile and Web applications.

  16. The Value of Information - Accounting for a New Geospatial Paradigm

    Science.gov (United States)

    Pearlman, J.; Coote, A. M.

    2014-12-01

    A new frontier in consideration of socio-economic benefit is valuing information as an asset, often referred to as Infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet clearly for many of the largest corporations, such as Google and Facebook, it is their principal asset. This is exacerbated for public sector organizations, as those that are information-centric rather than information-enabled are relatively few - statistics, archiving and mapping agencies are perhaps the only examples - so it's not at the top of the agenda for Government. However, it is a hugely important issue when valuing Geospatial data and information. Geospatial data allows public institutions to operate, and facilitates the provision of essential services for emergency response and national defense. In this respect, geospatial data is strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciates in the same way that physical infrastructure depreciates. It needs to be maintained, otherwise its functionality and value in use decline. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.

  17. Surface temperatures in New York City: Geospatial data enables the accurate prediction of radiative heat transfer.

    Science.gov (United States)

    Ghandehari, Masoud; Emig, Thorsten; Aghamohamadnia, Milad

    2018-02-02

    Despite decades of research seeking to derive the urban energy budget, the dynamics of thermal exchange in the densely constructed environment are not yet well understood. Using New York City as a study site, we present a novel hybrid experimental-computational approach for a better understanding of the radiative heat transfer in complex urban environments. The aim of this work is to contribute to the calculation of the urban energy budget, particularly the stored energy. We will focus our attention on surface thermal radiation. Improved understanding of urban thermodynamics incorporating the interaction of various bodies, particularly in high rise cities, will have implications for energy conservation at the building scale, and for human health and comfort at the urban scale. The platform presented is based on longwave hyperspectral imaging of nearly 100 blocks of Manhattan, in addition to a geospatial radiosity model that describes the collective radiative heat exchange between multiple buildings. Despite assumptions in surface emissivity and thermal conductivity of building walls, the close comparison of temperatures derived from measurements and computations is promising. Results imply that the presented geospatial thermodynamic model of urban structures can enable accurate and high resolution analysis of instantaneous urban surface temperatures.
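
    As a hedged worked example of the basic ingredient of such a radiosity model, the sketch below computes the net longwave exchange between two wall surfaces from the Stefan-Boltzmann law and a view factor, using a common first-order grey-surface simplification. All values are illustrative, not measurements from the study.

    ```python
    # Hedged worked example of the basic ingredient of a radiosity model: net longwave
    # exchange between two wall surfaces via the Stefan-Boltzmann law and a view factor.
    # Temperatures, areas, emissivities and the view factor are illustrative values only.
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def net_exchange(t1_k, t2_k, area_m2, view_factor, eps1=0.92, eps2=0.92):
        """Approximate net radiative flow (W) from surface 1 to surface 2, using a
        first-order grey, diffuse-surface simplification (eps1 * eps2 * F)."""
        return eps1 * eps2 * SIGMA * area_m2 * view_factor * (t1_k**4 - t2_k**4)

    # A sunlit facade at 35 C facing a shaded facade at 22 C across a street canyon.
    q = net_exchange(t1_k=308.15, t2_k=295.15, area_m2=500.0, view_factor=0.4)
    print(f"Net longwave exchange: {q/1000:.1f} kW")
    ```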

  18. Virtual globes and geospatial health: the potential of new tools in the management and control of vector-borne diseases

    Directory of Open Access Journals (Sweden)

    Anna-Sofie Stensgaard

    2009-05-01

    Full Text Available The rapidly growing field of three-dimensional software modeling of the Earth holds promise for applications in the geospatial health sciences. Easy-to-use, intuitive virtual globe technologies such as Google Earth™ enable scientists around the world to share their data and research results in a visually attractive and readily understandable fashion without the need for highly sophisticated geographical information systems (GIS) or much technical assistance. This paper discusses the utility of the rapid and simultaneous visualization of how the agents of parasitic diseases are distributed, as well as that of their vectors and/or intermediate hosts, together with other spatially-explicit information. The resulting better understanding of the epidemiology of infectious diseases, and the multidimensional environment in which they occur, is highlighted. In particular, the value of Google Earth™, and its web-based counterpart Google Maps™, is reviewed from a public health viewpoint, combining results from literature searches and experiences gained thus far from a multidisciplinary project aimed at optimizing schistosomiasis control and transmission surveillance in sub-Saharan Africa. Although the basic analytical capabilities of virtual globe applications are limited, we conclude that they have considerable potential in the support and promotion of the geospatial health sciences as a user-friendly, straightforward GIS tool for the improvement of data collation, visualization and exploration. The potential of these systems for data sharing and broad dissemination of scientific research and results is emphasized.

  19. The Whole World In Your Hands: Using an Interactive Virtual Reality Sandbox for Geospatial Education and Outreach

    Science.gov (United States)

    Clucas, T.; Wirth, G. S.; Broderson, D.

    2014-12-01

    Traditional geospatial education tools such as maps and computer screens don't convey the rich topography present on Earth. Translating contour lines on a topo map to relief in a landscape can be a challenging concept to convey. A partnership between Alaska EPSCoR and the Geographic Information Network of Alaska has successfully constructed an Interactive Virtual Reality Sandbox, an education tool that projects and updates topographic contours on the surface of a sandbox in real time. The sandbox has been successfully deployed at public science events as well as professional geospatial and geodesy conferences. Landscape change, precipitation, and evaporation can all be modeled, much to the delight of our enthusiasts, who range in age from 3 to 90. Visually, as well as haptically, demonstrating the effects of events (such as dragging a hand through the sand) on a landscape, as well as the intuitive realization of the meaning of topographic contour lines, has proven to be engaging.

  20. Geospatial Modelling for Micro Zonation of Groundwater Regime in Western Assam, India

    Science.gov (United States)

    Singh, R. P.

    2016-12-01

    Water, the most precious natural resource on earth, is vital to sustain the natural system and human civilisation on the earth. The Assam state, located in the north-eastern part of India, has a relatively good source of groundwater due to its geographic and physiographic location, but deterioration of groundwater quality is causing major health problems in the area. In this study, an integrated approach of remote sensing, GIS and chemical analysis of groundwater samples was used to throw light on the groundwater regime and provide information for decision makers to support sustainable water resource management. The geospatial modelling was performed by integrating hydrogeomorphic features: geomorphology, lineament, drainage and land use/land cover layers were generated through visual interpretation of satellite imagery (LISS III) based on tone, texture, shape, size, and arrangement of the features. The slope layer was prepared using the SRTM DEM dataset. The LULC of the area was categorized into six classes of agricultural field, forest area, river, settlement, tree-clad area and wetlands. The geospatial modelling was performed through a weightage-and-rank method in GIS, depending on the influence of the features on the groundwater regime. To assess the groundwater quality of the area, 45 groundwater samples were collected from the field and chemically analyzed in the laboratory using standard methods. The overall groundwater quality of the area was assessed through a Water Quality Index, and about 70% of the samples were found not potable for drinking purposes due to higher concentrations of arsenic, fluoride and iron. It appears that all these pollutants are geologically and geomorphologically derived. The interpolated Water Quality Index layer and the geospatially modelled groundwater potential layer provide a holistic view of the groundwater scenario and give direction for better planning and groundwater resource management. The study will be discussed in detail.
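
    The hedged sketch below shows the weightage-and-rank overlay in miniature with NumPy: ranked thematic layers are weighted, summed into a potential index and classified into zones. The grids and weights are illustrative stand-ins, not the study's values.

    ```python
    # Hedged sketch of the weightage-and-rank overlay: each thematic layer is reclassified
    # to ranks, multiplied by a layer weight, and summed into a potential index. The small
    # arrays and the weights below are illustrative, not the study's actual values.
    import numpy as np

    # Ranked thematic layers on a toy 3x3 grid (higher rank = more favourable for recharge).
    geomorphology = np.array([[3, 3, 2], [2, 2, 1], [1, 1, 1]])
    drainage      = np.array([[2, 3, 3], [2, 2, 2], [1, 2, 1]])
    slope         = np.array([[3, 2, 2], [3, 2, 1], [2, 1, 1]])
    landuse       = np.array([[2, 2, 3], [1, 2, 2], [1, 1, 2]])

    weights = {"geomorphology": 0.35, "drainage": 0.25, "slope": 0.25, "landuse": 0.15}

    potential = (weights["geomorphology"] * geomorphology
                 + weights["drainage"] * drainage
                 + weights["slope"] * slope
                 + weights["landuse"] * landuse)

    # Classify the composite index into low / moderate / high potential zones.
    zones = np.digitize(potential, bins=[1.5, 2.25])
    print(potential.round(2))
    print(zones)  # 0 = low, 1 = moderate, 2 = high
    ```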

  1. GIS-and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Wei [Colorado School of Mines, Golden, CO (United States); Minnick, Matthew [Colorado School of Mines, Golden, CO (United States); Geza, Mengistu [Colorado School of Mines, Golden, CO (United States); Murray, Kyle [Colorado School of Mines, Golden, CO (United States); Mattson, Earl [Colorado School of Mines, Golden, CO (United States)

    2012-09-30

    The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development in October of 2008. The ultimate goal of this research project is to develop a water resource geo-spatial infrastructure that serves as "baseline data" for creating solutions on water resource management and for supporting decision making on oil shale resource development. The project came to an end on September 30, 2012. This final project report will report the key findings from the project activity, major accomplishments, and expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase as well as other various deliverables stored on digital storage media will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activity include the quantitative spatial and temporal distribution of the water resource throughout the Piceance Basin, water consumption with respect to oil shale production, and the data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase, automated data processing scripts (Matlab) linking the database with the surface water and geological models, an ArcGIS model for hydrogeologic data processing for groundwater model input, a 3D geological model, surface water/groundwater models, an energy resource development systems model, as well as a web-based geo-spatial infrastructure for data exploration, visualization and dissemination. This research will have broad impacts on the development of the oil shale resources in the US. The geodatabase provides "baseline" data for further study of the oil shale development and identification of further data collection needs. The 3D geological model provides better understanding through data interpolation and

  2. Geo-spatial technologies in urban environments policy, practice, and pixels

    CERN Document Server

    Jensen, Ryan R; McLean, Daniel

    2004-01-01

    Using Geospatial Technologies in Urban Environments simultaneously fills two gaping vacuums in the scholarly literature on urban geography. The first is the clear and straightforward application of geospatial technologies to practical urban issues. By using remote sensing and statistical techniques (correlation-regression analysis, the expansion method, factor analysis, and analysis of variance), the authors of these 12 chapters contribute significantly to our understanding of how geospatial methodologies enhance urban studies. For example, the GIS Specialty Group of the Association of American Geographers (AAG) has the largest membership of all the AAG specialty groups, followed by the Urban Geography Specialty Group. Moreover, the Urban Geography Specialty Group has the largest number of cross-memberships with the GIS Specialty Group. This book advances this important geospatial and urban link. Second, the book fills a wide void in the urban-environment literature. Although the Annals of the Association of ...

  3. Geospatial data infrastructure: The development of metadata for geo-information in China

    Science.gov (United States)

    Xu, Baiquan; Yan, Shiqiang; Wang, Qianju; Lian, Jian; Wu, Xiaoping; Ding, Keyong

    2014-03-01

    Stores of geoscience records are in constant flux. These stores are continually added to by new information, ideas and data, which are frequently revised. The geoscience record is restrained by human thought and by the technology available for handling information. Conventional methods strive, with limited success, to maintain geoscience records that are readily accessible and renewable. The information system must adapt to the diversity of ideas and data in geoscience and their changes through time. In China, more than 400,000 types of important geological data have been collected and produced in geological work during the last two decades, including oil, natural gas and marine data, mine exploration, geophysical, geochemical and remote sensing data, and important local geological survey and research reports. Numerous geospatial databases have been formed and are stored in the National Geological Archives (NGA) in formats including MapGIS, ArcGIS, ArcINFO, Metafile, Raster, SQL Server, Access and JPEG. But there is no effective way to warrant that the quality of information is adequate in theory and practice for decision making. The need for fast, reliable, accurate and up-to-date information provided to the Geographic Information System (GIS) communities is becoming insistent for all geoinformation producers and users in China. Since 2010, a series of geoinformation projects have been carried out under the leadership of the Ministry of Land and Resources (MLR), including (1) Integration, update and maintenance of geoinformation databases; (2) Standards research on clusterization and industrialization of information services; (3) Platform construction of geological data sharing; (4) Construction of key borehole databases; (5) Product development of information services. A "Nine-System" basic framework has been proposed for the development and improvement of the geospatial data infrastructure, focused on the construction of the cluster organization, cluster service, convergence

  4. Geospatial data infrastructure: The development of metadata for geo-information in China

    International Nuclear Information System (INIS)

    Xu, Baiquan; Yan, Shiqiang; Wang, Qianju; Lian, Jian; Wu, Xiaoping; Ding, Keyong

    2014-01-01

    Stores of geoscience records are in constant flux. These stores are continually added to by new information, ideas and data, which are frequently revised. The geoscience record is restrained by human thought and by the technology available for handling information. Conventional methods strive, with limited success, to maintain geoscience records that are readily accessible and renewable. The information system must adapt to the diversity of ideas and data in geoscience and their changes through time. In China, more than 400,000 types of important geological data have been collected and produced in geological work during the last two decades, including oil, natural gas and marine data, mine exploration, geophysical, geochemical and remote sensing data, and important local geological survey and research reports. Numerous geospatial databases have been formed and are stored in the National Geological Archives (NGA) in formats including MapGIS, ArcGIS, ArcINFO, Metafile, Raster, SQL Server, Access and JPEG. But there is no effective way to warrant that the quality of information is adequate in theory and practice for decision making. The need for fast, reliable, accurate and up-to-date information provided to the Geographic Information System (GIS) communities is becoming insistent for all geoinformation producers and users in China. Since 2010, a series of geoinformation projects have been carried out under the leadership of the Ministry of Land and Resources (MLR), including (1) Integration, update and maintenance of geoinformation databases; (2) Standards research on clusterization and industrialization of information services; (3) Platform construction of geological data sharing; (4) Construction of key borehole databases; (5) Product development of information services. A "Nine-System" basic framework has been proposed for the development and improvement of the geospatial data infrastructure, focused on the construction of the cluster organization, cluster

  5. Towards Geo-spatial Hypermedia: Concepts and Prototype Implementation

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Vestergaard, Peter Posselt; Ørbæk, Peter

    2002-01-01

    This paper combines spatial hypermedia with techniques from Geographical Information Systems and location based services. We describe the Topos 3D Spatial Hypermedia system and how it has been developed to support geo-spatial hypermedia coupling hypermedia information to model representations...... of real world buildings and landscapes. The prototype experiments are primarily aimed at supporting architects and landscape architects in their work on site. Here it is useful to be able to superimpose and add different layers of information to, e.g. a landscape depending on the task being worked on. We...... and indirect navigation. Finally, we conclude with a number of research issues which are central to the future development of geo-spatial hypermedia, including design issues in combining metaphorical and literal hypermedia space, as well as a discussion of the role of spatial parsing in a geo-spatial context....

  6. Using the Geospatial Web to Deliver and Teach Giscience Education Programs

    Science.gov (United States)

    Veenendaal, B.

    2015-05-01

    Geographic information science (GIScience) education has undergone enormous changes over the past years. One major factor influencing this change is the role of the geospatial web in GIScience. In addition to the use of the web for enabling and enhancing GIScience education, it is also used as the infrastructure for communicating and collaborating among geospatial data and users. The web becomes both the means and the content for a geospatial education program. However, the web does not replace the traditional face-to-face environment, but rather is a means to enhance it, expand it and enable an authentic and real world learning environment. This paper outlines the use of the web in both the delivery and content of the GIScience program at Curtin University. The teaching of the geospatial web, web and cloud based mapping, and geospatial web services are key components of the program, and the use of the web and online learning are important to deliver this program. Some examples of authentic and real world learning environments are provided including joint learning activities with partner universities.

  7. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    Science.gov (United States)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on the usage of modern web and GIS technologies. The paper describes the Web GIS for complex processing and visualization of geospatial (mainly NetCDF and PostGIS format) datasets as an integral part of the dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of the economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, realizing the functionality of interaction with the computational core backend and the WMS/WFS/WPS cartographical services, as well as implementing an open API for browser-based client software. Being the secondary one, this part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part comprising the Web GIS client developed according to "single page application" technology based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to the interfaces of popular desktop GIS applications such as uDig, QuantumGIS, etc. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development. In accordance with general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map
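
    To illustrate the kind of request a client sends to the WMS services mentioned above, the following sketch retrieves a rendered layer with OWSLib; the endpoint URL, layer name and bounding box are placeholders, not the actual services of the described geoportal:

```python
# Sketch of a WMS GetMap request from Python using OWSLib.
# The URL, layer name and extent below are HYPOTHETICAL placeholders.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/geoportal/wms", version="1.3.0")

response = wms.getmap(
    layers=["climate:near_surface_air_temperature"],  # hypothetical layer name
    srs="EPSG:4326",
    bbox=(60.0, 50.0, 110.0, 80.0),                   # illustrative lon/lat extent
    size=(800, 600),
    format="image/png",
    transparent=True,
)

with open("temperature_overlay.png", "wb") as f:
    f.write(response.read())                          # save the rendered map tile
```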

  8. Integrated Sustainable Planning for Industrial Region Using Geospatial Technology

    Science.gov (United States)

    Tiwari, Manish K.; Saxena, Aruna; Katare, Vivek

    2012-07-01

    Geospatial techniques and their scope of application have undergone an order of magnitude change since their advent, and they have now been universally accepted as the most important and modern tools for mapping and monitoring various natural resources as well as amenities and infrastructure. The huge and voluminous spatial databases generated from various remote sensing platforms need proper management, such as storage, retrieval, manipulation and analysis, to extract the desired information, which is beyond the capability of the human brain. This is where computer-aided GIS technology came into existence. A GIS with major input from remote sensing satellites for natural resource management applications must be able to handle spatiotemporal data, supporting spatiotemporal queries and other spatial operations. Software and computer-based tools are designed to make things easier for the user and to improve the efficiency and quality of information processing tasks. The natural resources are a common heritage, which we have shared with past generations, and our future generations will inherit these resources from us. Our greed for resources and our tremendous technological capacity to exploit them at a much larger scale have created a situation where we have started withdrawing from future stocks. The Bhopal capital region has attracted the attention of planners since the beginning of the five-year plan strategy for industrial development. However, a number of projects were carried out in the individual districts (Bhopal, Rajgarh, Shajapur, Raisen, Sehore), which also gave fruitful results, but no serious efforts have been made to involve the entire region. No use has been made of the latest geospatial techniques (remote sensing, GIS, GPS) to prepare a well-structured computerized database, without which it is very difficult to retrieve, analyze and compare the data for monitoring as well as for planning developmental activities in the future.

  9. Leveraging the geospatial advantage

    Science.gov (United States)

    Ben Butler; Andrew Bailey

    2013-01-01

    The Wildland Fire Decision Support System (WFDSS) web-based application leverages geospatial data to inform strategic decisions on wildland fires. A specialized data team, working within the Wildland Fire Management Research Development and Application group (WFM RD&A), assembles authoritative national-level data sets defining values to be protected. The use of...

  10. Data Quality, Provenance and IPR Management services: their role in empowering geospatial data suppliers and users

    Science.gov (United States)

    Millard, Keiran

    2015-04-01

    This paper looks at the current experiences of geospatial users and geospatial suppliers and how they have been limited by the lack of suitable frameworks for managing and communicating data quality, data provenance and intellectual property rights (IPR). Current political and technological drivers mean that increasing volumes of geospatial data are available through a plethora of different products and services, and whilst this is inherently a good thing it does create a new generation of challenges. This paper considers two examples of where these issues have been examined and looks at the challenges and possible solutions from a data user and data supplier perspective. The first example is the IQmulus project, which is researching fusion environments for big geospatial point clouds and coverages. The second example is the EU Emodnet programme, which is establishing thematic data portals for public marine and coastal data. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR and numerical simulations. These data are simply too big for routine and ad-hoc analysis, yet they could realise a myriad of disparate, and readily useable, information products with the right infrastructure in place. IQmulus is researching how to deliver this infrastructure technically, but a financially sustainable delivery depends on being able to track and manage ownership and IPR across the numerous data sets being processed. This becomes complex when the data is composed of multiple overlapping coverages; however, managing this allows users to be delivered highly bespoke products to meet their budget and technical needs. The Emodnet programme delivers harmonised marine data at the EU scale across seven thematic portals. As part of the Emodnet programme a series of 'check points' have been initiated to examine how useful these services and other public data services actually are for solving real-world problems. One key finding is that users have been confused by the fact that often

  11. An operative dengue risk stratification system in Argentina based on geospatial technology

    Directory of Open Access Journals (Sweden)

    Ximena Porcasi

    2012-09-01

    Full Text Available Based on an agreement between the Ministry of Health and the National Space Activities Commission in Argentina, an integrated informatics platform for dengue risk, using geospatial technology for the surveillance and prediction of risk areas for dengue fever, has been designed. The task focused on developing a stratification based on the environmental (historical and current), viral, social and entomological situation for >3,000 cities as part of the system. The platform, developed with open-source software with pattern design, following the European Space Agency standards for space informatics, delivers two products: a national risk map consisting of point vectors for each city/town/locality and an approximately 50 m resolution urban risk map modelling the risk inside selected high-risk cities. The operative system, architecture and tools used in the development are described, including a detailed list of end users’ requirements. Additionally, an algorithm based on the bibliography and landscape epidemiology concepts is presented and discussed. The system, in operation since September 2011, is capable of continuously improving its algorithms, producing improved risk stratifications even without a complete set of inputs. The platform was specifically developed for surveillance of dengue fever, as this disease has re-emerged in Argentina, but the aim is to widen the scope to include other relevant vector-borne diseases such as Chagas disease, malaria and leishmaniasis, as well as other countries belonging to the southern region of Latin America.
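
    The abstract does not spell out the stratification algorithm, so the following is only an illustrative toy: a weighted combination of normalised environmental, viral, social and entomological indicators into a per-city risk score, with hypothetical component names and weights:

```python
# TOY risk stratification, illustrative only; component names, values and
# weights are hypothetical and do NOT reproduce the platform's algorithm.
import pandas as pd

cities = pd.DataFrame({
    "city": ["City A", "City B", "City C"],
    "environmental": [0.8, 0.4, 0.6],   # e.g. climate suitability, 0-1
    "viral":         [1.0, 0.0, 0.5],   # recent circulation indicator, 0-1
    "social":        [0.6, 0.3, 0.9],   # e.g. population density proxy, 0-1
    "entomological": [0.7, 0.2, 0.4],   # e.g. ovitrap index proxy, 0-1
})

weights = {"environmental": 0.3, "viral": 0.3, "social": 0.2, "entomological": 0.2}

# Weighted sum of the normalised components, then binning into strata.
cities["risk"] = sum(cities[c] * w for c, w in weights.items())
cities["stratum"] = pd.cut(cities["risk"], bins=[0, 0.33, 0.66, 1.0],
                           labels=["low", "medium", "high"])
print(cities[["city", "risk", "stratum"]])
```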

  12. PIVOT: platform for interactive analysis and visualization of transcriptomics data.

    Science.gov (United States)

    Zhu, Qin; Fisher, Stephen A; Dueck, Hannah; Middleton, Sarah; Khaladkar, Mugdha; Kim, Junhyong

    2018-01-05

    Many R packages have been developed for transcriptome analysis but their use often requires familiarity with R and integrating results of different packages requires scripts to wrangle the datatypes. Furthermore, exploratory data analyses often generate multiple derived datasets such as data subsets or data transformations, which can be difficult to track. Here we present PIVOT, an R-based platform that wraps open source transcriptome analysis packages with a uniform user interface and graphical data management that allows non-programmers to interactively explore transcriptomics data. PIVOT supports more than 40 popular open source packages for transcriptome analysis and provides an extensive set of tools for statistical data manipulations. A graph-based visual interface is used to represent the links between derived datasets, allowing easy tracking of data versions. PIVOT further supports automatic report generation, publication-quality plots, and program/data state saving, such that all analysis can be saved, shared and reproduced. PIVOT will allow researchers with broad background to easily access sophisticated transcriptome analysis tools and interactively explore transcriptome datasets.

  13. National Geospatial-Intelligence Agency Academic Research Program

    Science.gov (United States)

    Loomer, S. A.

    2004-12-01

    "Know the Earth.Show the Way." In fulfillment of its vision, the National Geospatial-Intelligence Agency (NGA) provides geospatial intelligence in all its forms and from whatever source-imagery, imagery intelligence, and geospatial data and information-to ensure the knowledge foundation for planning, decision, and action. To achieve this, NGA conducts a multi-disciplinary program of basic research in geospatial intelligence topics through grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program (NARP) are: - NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. - Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. - Director of Central Intelligence Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports and how other researchers and institutions can apply for grants under the program.

  14. A new android smartphone app for geospatial mapping from drones and kites

    Science.gov (United States)

    Anderson, Karen; Griffiths, Dave; Debell, Leon; Steve, Hancock; James, Duffy; Jamie, Shutler; Liam, Reinhardt; Griffiths, Amber; Threadgill, Katie

    2016-04-01

    sensing devices. The application uses a visual coding 'scheme blocks' framework, so that users can customise their own data capture tools in the field. In our presentation we will demonstrate the coding framework, and then we will show the results that were gathered when we used the app to collect data during test flights - utilising various kite and lightweight drone platforms. We have also developed a simple-to-use open-source geospatial toolkit to allow geographical information system (GIS)-ready GeoTIFF images to be processed from the metadata stored by the app. We will demonstrate how this works in our presentation. Two Android smartphones were used in testing - a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset. We will show that the best results were obtained with the higher specification phone when it was attached to a single line kite or to a gliding drone. Finally, we will use data collected using the app over a farmyard to demonstrate the power of the resultant fine-grained products for a simple application - advising farmers about small-scale interventions they can make to improve the quality of water run-off from their farms. The app can be downloaded freely, and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. disaster zones, in teaching or for grassroots democratic mapping). [https://play.google.com/store/apps/details?id=foam.uavtoolkit&hl=en]
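
    As a rough illustration of the georeferencing step such a toolkit performs, the sketch below writes a GIS-ready GeoTIFF from an ordinary aerial photograph and a known geographic extent using rasterio (not the project's own toolkit); the file names and bounding box are placeholders:

```python
# Minimal sketch: turn a plain aerial photo plus a geographic extent into a
# GeoTIFF. File names and the WGS84 bounding box are HYPOTHETICAL placeholders.
import numpy as np
import rasterio
from rasterio.transform import from_bounds
from PIL import Image

img = np.array(Image.open("kite_photo.jpg"))          # shape (rows, cols, 3)
rows, cols, bands = img.shape

# Hypothetical footprint of the photo (west, south, east, north) in lon/lat.
transform = from_bounds(-5.102, 50.145, -5.098, 50.148, cols, rows)

with rasterio.open(
    "kite_photo_georef.tif", "w",
    driver="GTiff", height=rows, width=cols, count=bands,
    dtype=str(img.dtype), crs="EPSG:4326", transform=transform,
) as dst:
    for b in range(bands):
        dst.write(img[:, :, b], b + 1)                 # GeoTIFF bands are 1-based
```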

  15. Visual analytics of inherently noisy crowdsourced data on ultra high resolution displays

    Science.gov (United States)

    Huynh, Andrew; Ponto, Kevin; Lin, Albert Yu-Min; Kuester, Falko

    The increasing prevalence of distributed human microtasking, or crowdsourcing, has followed the exponential increase in data collection capabilities. The large scale and distributed nature of these microtasks produce overwhelming amounts of information that is inherently noisy due to the nature of human input. Furthermore, these inputs create a constantly changing dataset with additional information added on a daily basis. Methods to quickly visualize, filter, and understand this information over temporal and geospatial constraints are key to the success of crowdsourcing. This paper presents novel methods to visually analyze geospatial data collected through crowdsourcing on top of remote sensing satellite imagery. An ultra high resolution tiled display system is used to explore the relationship between human and satellite remote sensing data at scale. A case study is provided that evaluates the presented technique in the context of an archaeological field expedition. A team in the field communicated in real time with, and was guided by, researchers in the remote visual analytics laboratory, swiftly sifting through incoming crowdsourced data to identify target locations that were identified as viable archaeological sites.

  16. Product Platform Modeling

    DEFF Research Database (Denmark)

    Pedersen, Rasmus

    This PhD thesis has the title Product Platform Modelling. The thesis is about product platforms and visual product platform modelling. Product platforms have gained an increasing attention in industry and academia in the past decade. The reasons are many, yet the increasing globalisation ... for customisation of products. In many companies these changes in the business environment have created a controversy between the need for a wide variety of products offered to the marketplace and a desire to reduce variation within the company in order to increase efficiency. Many companies use the concept ... other. These groups can be varied and combined to form different product variants without increasing the internal variety in the company. Based on the Theory of Domains, the concept of encapsulation in the organ domain is introduced, and organs are formulated as platform elements. Included ...

  17. MultiSpec: A Desktop and Online Geospatial Image Data Processing Tool

    Science.gov (United States)

    Biehl, L. L.; Hsu, W. K.; Maud, A. R. M.; Yeh, T. T.

    2017-12-01

    MultiSpec is an easy to learn and use, freeware image processing tool for interactively analyzing a broad spectrum of geospatial image data, with capabilities such as image display, unsupervised and supervised classification, feature extraction, feature enhancement, and several other functions. Originally developed for Macintosh and Windows desktop computers, it has a community of several thousand users worldwide, including researchers and educators, as a practical and robust solution for analyzing multispectral and hyperspectral remote sensing data in several different file formats. More recently MultiSpec was adapted to run in the HUBzero collaboration platform so that it can be used within a web browser, allowing new user communities to be engaged through science gateways. MultiSpec Online has also been extended to interoperate with other components (e.g., data management) in HUBzero through integration with the geospatial data building blocks (GABBs) project. This integration enables a user to directly launch MultiSpec Online from data that is stored and/or shared in a HUBzero gateway and to save output data from MultiSpec Online to hub storage, allowing data sharing and multi-step workflows without having to move data between different systems. MultiSpec has also been used in K-12 classes for which one example is the GLOBE program (www.globe.gov) and in outreach material such as that provided by the USGS (eros.usgs.gov/educational-activities). MultiSpec Online now provides teachers with another way to use MultiSpec without having to install the desktop tool. Recently MultiSpec Online was used in a geospatial data session with 30-35 middle school students at the Turned Onto Technology and Leadership (TOTAL) Camp in the summers of 2016 and 2017 at Purdue University. The students worked on a flood mapping exercise using Landsat 5 data to learn about land remote sensing using supervised classification techniques. Online documentation is available for Multi
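
    The flood-mapping exercise mentioned above is an instance of supervised per-pixel classification. The sketch below reproduces the general idea with scikit-learn on synthetic bands rather than with MultiSpec itself; the band values and training labels are fabricated purely for illustration:

```python
# Supervised per-pixel classification sketch on SYNTHETIC "Landsat-like" bands;
# the toy labelling rule simply stands in for real field/training labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
bands = rng.random((6, 100, 100))                # 6 synthetic reflective bands

# Flatten to an (n_pixels, n_bands) feature matrix.
X = bands.reshape(6, -1).T

# Hypothetical training pixels labelled water (1) / land (0).
train_idx = rng.choice(X.shape[0], size=200, replace=False)
y_train = (X[train_idx, 3] < 0.3).astype(int)    # toy rule, not real labels

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X[train_idx], y_train)

flood_map = clf.predict(X).reshape(100, 100)     # per-pixel class map
print("water pixels:", int(flood_map.sum()))
```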

  18. Issues on Building Kazakhstan Geospatial Portal to Implement E-Government

    Science.gov (United States)

    Sagadiyev, K.; Kang, H. K.; Li, K. J.

    2016-06-01

    A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework to integrate and organize them. In particular, it is very useful to integrate the process of land management in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We develop a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open source geospatial software including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements as follows. First, we establish a transparent governmental process, which is one of the main goals of e-government. Every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort in the government process. For example, a grant procedure for a building construction has taken more than one year with more than 50 steps. It is expected that this procedure would be reduced to 2 weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have been a critical issue of governmental administration processes.

  19. ISSUES ON BUILDING KAZAKHSTAN GEOSPATIAL PORTAL TO IMPLEMENT E-GOVERNMENT

    Directory of Open Access Journals (Sweden)

    K. Sagadiyev

    2016-06-01

    Full Text Available A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework to integrate and organize them. In particular, it is very useful to integrate the process of land management in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We develop a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open source geospatial software including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements as follows. First, we establish a transparent governmental process, which is one of the main goals of e-government. Every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort in the government process. For example, a grant procedure for a building construction has taken more than one year with more than 50 steps. It is expected that this procedure would be reduced to 2 weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have been a critical issue of governmental administration processes.
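
    Since the geoportal stack includes PostGIS, a typical backend operation would be a spatial containment query. The sketch below shows one such query from Python; the connection details and the land_parcels/admin_boundaries tables are hypothetical, not the project's actual schema:

```python
# Sketch of a PostGIS containment query a geoportal backend might run.
# Connection credentials, table and column names are HYPOTHETICAL.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="geoportal",
                        user="gis", password="secret")  # placeholder credentials
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT p.parcel_id, ST_Area(p.geom::geography) AS area_m2
        FROM land_parcels AS p
        JOIN admin_boundaries AS b ON ST_Within(p.geom, b.geom)
        WHERE b.name = %s
        ORDER BY area_m2 DESC
        LIMIT 10;
    """, ("Astana",))
    for parcel_id, area in cur.fetchall():
        print(parcel_id, round(area, 1))         # largest parcels in the boundary
conn.close()
```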

  20. Visualizing and Understanding Socio-Environmental Dynamics in Baltimore

    Science.gov (United States)

    Zaitchik, B. F.; Omeara, K.; Guikema, S.; Scott, A.; Bessho, A.; Logan, T. M.

    2015-12-01

    The City of Baltimore, like any city, is the sum of its component neighborhoods, institutions, businesses, cultures, and, ultimately, its people. It is also an organism in its own right, with distinct geography, history, infrastructure, and environments that shape its residents even as it is shaped by them. Sometimes these interactions are obvious but often they are not; while basic economic patterns are widely documented, the distribution of socio-spatial and environmental connections often hides below the surface, as does the potential that those connections hold. Here we present results of a collaborative initiative on the geography, design, and policy of socio-environmental dynamics of Baltimore. Geospatial data derived from satellite imagery, demographic databases, social media feeds, infrastructure plans, and in situ environmental networks, among other sources, are applied to generate an interactive portrait of Baltimore City's social, health, and well-being dynamics. The layering of data serves as a platform for visualizing the interconnectedness of the City and as a database for modeling risk interactions, vulnerabilities, and strengths within and between communities. This presentation will provide an overview of project findings and highlight linkages to education and policy.

  1. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    Science.gov (United States)

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  2. Prototyping a Sensor Enabled 3d Citymodel on Geospatial Managed Objects

    Science.gov (United States)

    Kjems, E.; Kolář, J.

    2013-09-01

    One of the major development efforts within the GI Science domain is pointing at sensor-based information and the usage of real-time information coming from geographically referenced features in general. At the same time, 3D city models are mostly justified as objects for visualization purposes rather than constituting the foundation of a geographic data representation of the world. The combination of 3D city models and real-time information based systems, though, can provide a whole new setup for data fusion within an urban environment and provide time-critical information, preserving our limited resources in the most sustainable way. Using 3D models with consistent object definitions gives us the possibility to avoid troublesome abstractions of reality and to design even complex urban systems fusing information from various sources of data. These systems are difficult to design with the traditional software development approach based on major software packages and traditional data exchange. The data stream varies from urban domain to urban domain and from system to system, which is why it is almost impossible to design a complete system taking care of all thinkable instances, now and in the future, within one constrained software design complex. On several occasions we have been advocating for a new and advanced formulation of real-world features using the concept of Geospatial Managed Objects (GMO). This paper presents the outcome of the InfraWorld project, a 4 million Euro project financed primarily by the Norwegian Research Council, where the concept of GMOs has been applied in various situations on various running platforms of an urban system. The paper will focus on user experiences and interfaces rather than core technical and developmental issues. The project was primarily focused on prototyping rather than realistic implementations, although the results concerning applicability are quite clear.

  3. GIBS Geospatial Data Abstraction Library (GDAL)

    Data.gov (United States)

    National Aeronautics and Space Administration — GDAL is an open source translator library for raster geospatial data formats that presents a single abstract data model to the calling application for all supported...
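
    A short sketch of what GDAL's single abstract data model looks like from the Python bindings: regardless of the underlying format, a dataset exposes its size, band count, projection and geotransform through the same calls (the file name below is a placeholder):

```python
# Inspect a raster through GDAL's abstract data model; the input file name is
# a placeholder for any GDAL-supported raster.
from osgeo import gdal

gdal.UseExceptions()
ds = gdal.Open("example_layer.tif")

print("size:", ds.RasterXSize, "x", ds.RasterYSize)
print("bands:", ds.RasterCount)
print("projection:", ds.GetProjection()[:60], "...")
print("geotransform:", ds.GetGeoTransform())   # origin, pixel size, rotation

band = ds.GetRasterBand(1)
stats = band.GetStatistics(True, True)         # approx_ok, force: min/max/mean/std
print("band 1 stats:", stats)
```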

  4. Geospatial Information System Capability Maturity Models

    Science.gov (United States)

    2017-06-01

    To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...

  5. Arsenic removal from contaminated groundwater by membrane-integrated hybrid plant: optimization and control using Visual Basic platform.

    Science.gov (United States)

    Chakrabortty, S; Sen, M; Pal, P

    2014-03-01

    A simulation software package (ARRPA) has been developed on the Microsoft Visual Basic platform for optimization and control of a novel membrane-integrated arsenic separation plant, against the backdrop of the absence of such software. The user-friendly, menu-driven software is based on a dynamic linearized mathematical model developed for the hybrid treatment scheme. The model captures the chemical kinetics in the pre-treating chemical reactor and the separation and transport phenomena involved in nanofiltration. The software has been validated through extensive experimental investigations. The agreement between the outputs from the computer simulation program and the experimental findings is excellent and consistent under varying operating conditions, reflecting the high degree of accuracy and reliability of the software. High values of the overall correlation coefficient (R² = 0.989) and Willmott d-index (0.989) are indicators of the capability of the software in analyzing the performance of the plant. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of an integrated plant visually on a graphical platform. Performance analysis of the whole system as well as the individual units is possible using the tool. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for removal of arsenic from contaminated groundwater.
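
    The two goodness-of-fit measures quoted above can be reproduced with a few lines of NumPy. The observed and predicted values below are synthetic placeholders, not the plant data; only the formulas (squared Pearson correlation and Willmott's index of agreement) follow their standard definitions:

```python
# Compute R^2 and Willmott's d-index for SYNTHETIC observed/predicted arrays.
import numpy as np

obs = np.array([0.82, 0.75, 0.64, 0.51, 0.40, 0.33])   # placeholder observations
pred = np.array([0.80, 0.77, 0.62, 0.53, 0.41, 0.31])  # placeholder predictions

# Coefficient of determination as the squared Pearson correlation.
r = np.corrcoef(obs, pred)[0, 1]
r2 = r ** 2

# Willmott's index of agreement:
# d = 1 - sum((P - O)^2) / sum((|P - mean(O)| + |O - mean(O)|)^2)
obar = obs.mean()
d = 1 - np.sum((pred - obs) ** 2) / np.sum(
    (np.abs(pred - obar) + np.abs(obs - obar)) ** 2)

print(f"R^2 = {r2:.3f}, Willmott d = {d:.3f}")
```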

  6. Connecting Music and Place: Exploring Library Collection Data Using Geo-visualizations

    Directory of Open Access Journals (Sweden)

    Carolyn Doi

    2017-06-01

    Full Text Available Abstract Objectives – This project had two stated objectives: (1) to compare the location and concentration of Saskatchewan-based large ensembles (bands, orchestras, choirs) within the province, with the intention of drawing conclusions about the history of community-based musical activity within the province; and (2) to enable location-based browsing of Saskatchewan music materials through an interactive search interface. Methods – Data was harvested from MARC metadata found in the library catalogue for a special collection of Saskatchewan music at the University of Saskatchewan. Microsoft Excel and OpenRefine were used to screen, clean, and enhance the dataset. Data was imported into ArcGIS software, where it was plotted using a geo-visualization showing the location and concentrations of musical activity by large ensembles within the province. The geo-visualization also allows users to filter results based on the ensemble type (band, orchestra, or choir). Results – The geo-visualization shows that albums from large community ensembles appear across the province, in cities and towns of all sizes. The ensembles are concentrated in the southern portion of the province and there is a correlation between population density and ensemble location. Choral ensembles are more prevalent than bands and orchestras, and appear more widely across the province, whereas bands and orchestras are concentrated around larger centres. Conclusions – Library catalogue data contains unique information for research based on special collections, though additional cleaning is needed. Using geospatial visualizations to navigate collections allows for more intuitive searching by location and allows users to compare facets. While not appropriate for all kinds of searching, maps are useful for browsing and for location-based searches. Information is displayed in a visual way that allows users to explore and connect with other platforms for more information.

  7. Plataforma de desarrollo para el control en tiempo real de estructuras cinemáticas con realimentación visual//Platform to develop real time visual servoing control in kinematics systems

    Directory of Open Access Journals (Sweden)

    René González-Rodríguez

    2012-09-01

    Full Text Available This paper presents a development platform for the real-time control of kinematic structures with visual feedback. A generic configuration has been designed that allows the implementation of any visual control variant. For image processing, a strategy has been proposed that permits the use of different commercial tools or in-house algorithms for image capture and feature extraction. The use of Real Time Workshop and Real Time Windows Target in the inner control loop makes it possible to implement visual servoing control algorithms in real time. At the end of the paper, the results of a visual servoing control scheme applied to an industrial manipulator are presented. The proposed platform constitutes a development tool for industrial visual servoing applications and supports the teaching of mechatronics at undergraduate and postgraduate levels. Keywords: visual servoing control, real-time control, kinematic structures._______________________________________________________________________________Abstract In this work we propose a platform to develop visual servoing control systems. The platform has a generic design with the possibility to implement direct or look-and-move visual servoing systems. For the image processing we present a generic design allowing the use of any image processing library, like Matrox MIL, Intel IPP or OpenCV, or any algorithms for image capture and target characteristics extraction. The use of Real Time Workshop and Real Time Windows Target in the internal loop permits modifying the control structure in SIMULINK very easily. Key words: visual servoing, real time control, kinematics systems.

  8. SemantGeo: Powering Ecological and Environment Data Discovery and Search with Standards-Based Geospatial Reasoning

    Science.gov (United States)

    Seyed, P.; Ashby, B.; Khan, I.; Patton, E. W.; McGuinness, D. L.

    2013-12-01

    Recent efforts to create and leverage standards for geospatial data specification and inference include the GeoSPARQL standard, geospatial OWL ontologies (e.g., GAZ, Geonames), and RDF triple stores that support GeoSPARQL (e.g., AllegroGraph, Parliament) and use RDF instance data for geospatial features of interest. However, there remains a gap on how best to fuse software engineering best practices and GeoSPARQL within semantic web applications to enable flexible search driven by geospatial reasoning. In this abstract we introduce the SemantGeo module for the SemantEco framework that helps fill this gap, enabling scientists to find data using geospatial semantics and reasoning. SemantGeo provides multiple types of geospatial reasoning for SemantEco modules. The server-side implementation uses the Parliament SPARQL endpoint accessed via a Tomcat servlet. SemantGeo uses the Google Maps API for user-specified polygon construction and JsTree for providing containment and categorical hierarchies for search. SemantGeo uses GeoSPARQL for spatial reasoning alone and in concert with RDFS/OWL reasoning capabilities to determine, e.g., which geofeatures are within, partially overlap with, or are within a certain distance from, a given polygon. We also leverage qualitative relationships defined by the Gazetteer ontology that are composites of spatial relationships as well as administrative designations or geophysical phenomena. We provide multiple mechanisms for exploring data, such as polygon (map-based) and named-feature (hierarchy-based) selection, that enable flexible search constraints using boolean combinations of selections. JsTree-based hierarchical search facets present named features and include a 'part of' hierarchy (e.g., measurement-site-01, Lake George, Adirondack Region, NY State) and type hierarchies (e.g., nodes in the hierarchy for WaterBody, Park, MeasurementSite), depending on the 'axis of choice' option selected. Using GeoSPARQL and aforementioned ontology
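
    A minimal sketch of the kind of GeoSPARQL containment query described above, issued with SPARQLWrapper against a Parliament endpoint; the endpoint URL, instance-data prefix and polygon WKT are placeholders:

```python
# GeoSPARQL containment query via SPARQLWrapper; endpoint URL, the ex: prefix
# and the polygon coordinates are HYPOTHETICAL placeholders.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://localhost:8089/parliament/sparql")
endpoint.setReturnFormat(JSON)
endpoint.setQuery("""
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
PREFIX ex:   <http://example.org/semanteco#>

SELECT ?site ?wkt WHERE {
  ?site a ex:MeasurementSite ;
        geo:hasGeometry ?g .
  ?g geo:asWKT ?wkt .
  FILTER(geof:sfWithin(?wkt,
    "POLYGON((-74.0 43.2, -73.4 43.2, -73.4 43.7, -74.0 43.7, -74.0 43.2))"^^geo:wktLiteral))
}
""")

# Print every measurement site whose geometry falls inside the polygon.
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["site"]["value"], row["wkt"]["value"][:40])
```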

  9. Design of intelligent power consumption optimization and visualization management platform for large buildings based on internet of things

    Directory of Open Access Journals (Sweden)

    Gong Shulan

    2017-01-01

    Full Text Available Buildings make a significant contribution to total energy consumption and CO2 emissions. It has been estimated that the development of an intelligent power consumption monitoring and control system will result in about 30% savings in energy consumption. This design innovatively integrates advanced technologies such as the internet of things, the internet, intelligent buildings and intelligent electricity, which can offer an open, efficient and convenient energy consumption detection platform on the demand side and a visual management demonstration application platform on the power enterprise side. The system was created to maximize the effective and efficient use of energy resources. It was developed around sensor networks, an intelligent gateway and the monitoring centre software. This will realize the high integration and comprehensive application of energy and information to meet the needs of intelligent buildings.

  10. Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

    Science.gov (United States)

    Singh, R.; Percivall, G.

    2009-12-01

    (note: acronym glossary at end of abstract) For scientists to have confidence in the veracity of data sets and computational processes not under their control, operational transparency must be much greater than previously required. Being able to have a universally understood and machine-readable language for describing such things as the completeness of metadata, data provenance and uncertainty, and the discrete computational steps in a complex process take on increased importance. OGC has been involved with technological issues associated with climate change since 2005 when we, along with the IEEE Committee on Earth Observation, began a close working relationship with GEO and GEOSS (http://earthobservations.org). GEO/GEOS provide the technology platform to GCOS who in turn represents the earth observation community to UNFCCC. OGC and IEEE are the organizers of the GEO/GEOSS Architecture Implementation Pilot (see http://www.ogcnetwork.net/AIpilot). This continuing work involves closely working with GOOS (Global Ocean Observing System) and WMO (World Meteorological Organization). This session reports on the findings of recent work within the OGC’s community of software developers and users to apply geospatial web services to the climate studies domain. The value of this work is to evolve OGC web services, moving from data access and query to geo-processing and workflows. Two projects will be described, the GEOSS API-2 and the CCIP. AIP is a task of the GEOSS Architecture and Data Committee. During its duration, two GEO Tasks defined the project: AIP-2 began as GEO Task AR-07-02, to lead the incorporation of contributed components consistent with the GEOSS Architecture using a GEO Web Portal and a Clearinghouse search facility to access services through GEOSS Interoperability Arrangements in support of the GEOSS Societal Benefit Areas. AIP-2 concluded as GEOS Task AR-09-01b, to develop and pilot new process and infrastructure components for the GEOSS Common

  11. LEGO Mindstorms NXT for elderly and visually impaired people in need: A platform.

    Science.gov (United States)

    Al-Halhouli, Ala'aldeen; Qitouqa, Hala; Malkosh, Nancy; Shubbak, Alaa; Al-Gharabli, Samer; Hamad, Eyad

    2016-07-27

    This paper presents the employment of LEGO Mindstorms NXT robotics as the core component of a low-cost multidisciplinary platform for assisting elderly and visually impaired people. The LEGO Mindstorms system offers a plug-and-play programmable robotics toolkit, incorporating construction guides, microcontrollers and sensors, all connected via a comprehensive programming language. It facilitates, without special training and at low cost, the use of such a device for interpersonal communication and for handling multiple tasks required by elderly and visually impaired people in need. The research project provides a model for larger-scale implementation, tackling the issues of creating additional functions in order to assist people in need. The new functions were built and programmed using MATLAB through a user-friendly Graphical User Interface (GUI). The power consumption problem, along with the integration of a WiFi connection, has been resolved, and incorporating the GPS application on smartphones enhanced the guiding and tracking functions. We believe in developing and expanding the system to encompass a range of applications beyond the initial design schematics, which currently ease the conduct of only a limited number of pre-described protocols. However, the beneficiaries of the proposed research would be limited to elderly people who require assistance within their household, with the assistive robot facilitating a low-cost solution for a highly demanding health circumstance.

  12. An Internet-Based GIS Platform Providing Data for Visualization and Spatial Analysis of Urbanization in Major Asian and African Cities

    Directory of Open Access Journals (Sweden)

    Hao Gong

    2017-08-01

    Full Text Available Rapid urbanization in developing countries has been observed to be relatively high in the last two decades, especially in the Asian and African regions. Although many researchers have made efforts to improve the understanding of the urbanization trends of various cities in Asia and Africa, the absence of platforms where local stakeholders can visualize and obtain processed urbanization data for their specific needs or analysis still remains a gap. In this paper, we present an Internet-based GIS platform called MEGA-WEB. The platform was developed in view of the urban planning and management challenges in developing countries of Asia and Africa due to the limited availability of data resources, effective tools, and proficiency in data analysis. MEGA-WEB provides online access, visualization, spatial analysis, and data sharing services following a mashup framework of the MEGA-WEB Geo Web Services (GWS) with third-party map services using HTML5/JavaScript techniques. Through the integration of GIS, remote sensing, geo-modelling, and Internet GIS, several indicators for analyzing urbanization are provided in MEGA-WEB to give diverse perspectives on urbanization covering not only the physical land surface condition, but also the relationships of population, energy use, and the environment. The design, architecture, system functions, and uses of MEGA-WEB are discussed in the paper. The MEGA-WEB project is aimed at contributing to sustainable urban development in developing countries of Asia and Africa.

  13. Student Focused Geospatial Curriculum Initiatives: Internships and Certificate Programs at NCCU

    Science.gov (United States)

    Vlahovic, G.; Malhotra, R.

    2009-12-01

    This paper reports recent efforts by the Department of Environmental, Earth and Geospatial Sciences faculty at North Carolina Central University (NCCU) to develop a leading geospatial sciences program that will be considered a model for other Historically Black College/University (HBCU) peers nationally. NCCU was established in 1909 and is the nation’s first state supported public liberal arts college funded for African Americans. In the most recent annual ranking of America’s best black colleges by the US News and World Report (Best Colleges 2010), NCCU was ranked 10th in the nation. As one of only two HBCUs in the southeast offering an undergraduate degree in Geography (McKee, J.O. and C. V. Dixon. Geography in Historically Black Colleges/ Universities in the Southeast, in The Role of the South in Making of American Geography: Centennial of the AAG, 2004), NCCU is uniquely positioned to positively affect talent and diversity of the geospatial discipline in the future. Therefore, successful creation of research and internship pathways for NCCU students has national implications because it will increase the number of minority students joining the workforce and applying to PhD programs. Several related efforts will be described, including research and internship projects with Fugro EarthData Inc., Center for Remote Sensing and Mapping Science at the University of Georgia, Center for Earthquake Research and Information at the University of Memphis and the City of Durham. The authors will also outline requirements and recent successes of ASPRS Provisional Certification Program, developed and pioneered as collaborative effort between ASPRS and NCCU. This certificate program allows graduating students majoring in geospatial technologies and allied fields to become provisionally certified by passing peer-review and taking the certification exam. At NCCU, projects and certification are conducted under the aegis of the Geospatial Research, Innovative Teaching and

  14. Ocean Thermal Extractable Energy Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ascari, Matthew [Lockheed Martin Corporation, Bethesda, MD (United States)

    2012-10-28

    The Ocean Thermal Extractable Energy Visualization (OTEEV) project focuses on assessing the Maximum Practicably Extractable Energy (MPEE) from the world’s ocean thermal resources. MPEE is defined as being sustainable and technically feasible, given today’s state-of-the-art ocean energy technology. Under this project the OTEEV team developed a comprehensive Geospatial Information System (GIS) dataset and software tool, and used the tool to provide a meaningful assessment of MPEE from the global and domestic U.S. ocean thermal resources.

  15. PsyPad: a platform for visual psychophysics on the iPad.

    Science.gov (United States)

    Turpin, Andrew; Lawson, David J; McKendrick, Allison M

    2014-03-11

    This article introduces PsyPad, a customizable, open-source platform for configuring and conducting visual psychophysics experiments on iPads without the need for any code development for the iPad. Stimuli for experiments are created off-line as a library of images. The PsyPad app (obtainable from the Apple App Store) presents the images according to either built-in, customizable staircase or method of constant stimuli procedures, mapping stimuli levels to images based on the image file names. On-screen buttons for responses are configurable and matched to "correct" using the image file name of any given stimulus. All actions are logged into a text file and sent to a specified server at the end of the test if an Internet connection is available. If the iPad is not connected, the results are uploaded the next time the iPad is online. We provide a secure server for this purpose, but the server-side software is also open source if researchers choose to run their own server.
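
    The built-in staircase behaviour described above can be sketched as a simple 2-down/1-up rule that selects the next stimulus image by file name. The naming convention, step rule parameters and simulated observer below are hypothetical, intended only to show the control flow:

```python
# Toy 2-down/1-up staircase that picks stimulus images by file name.
# File-naming convention, parameters and the observer model are HYPOTHETICAL.
import random

levels = list(range(0, 31, 2))                  # e.g. contrast levels in dB
level_idx = len(levels) - 1                     # start at the easiest stimulus
correct_streak, reversals, last_direction = 0, 0, 0
history = []

def simulated_observer(level: int) -> bool:
    """Stand-in for the participant: more likely correct on easier stimuli."""
    return random.random() < min(0.95, 0.5 + level / 40)

while reversals < 6 and len(history) < 100:
    level = levels[level_idx]
    image = f"stim_level_{level:02d}.png"       # stimulus chosen via file name
    correct = simulated_observer(level)
    history.append((image, correct))

    step = 0
    if correct:
        correct_streak += 1
        if correct_streak == 2:                 # 2 correct in a row -> harder
            step, correct_streak = -1, 0
    else:                                       # any miss -> easier
        step, correct_streak = 1, 0

    if step:
        if last_direction and step != last_direction:
            reversals += 1                      # direction change = a reversal
        last_direction = step
        level_idx = min(max(level_idx + step, 0), len(levels) - 1)

print("threshold estimate:", levels[level_idx], "after", len(history), "trials")
```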

  16. Monitoring of In-Field Variability for Site Specific Crop Management Through Open Geospatial Information

    Science.gov (United States)

    Řezník, T.; Lukas, V.; Charvát, K.; Charvát, K., Jr.; Horáková, Š.; Křivánek, Z.; Herman, L.

    2016-06-01

    The agricultural sector is in a unique position due to its strategic importance around the world. It is crucial for both citizens (consumers) and the economy (both regional and global), which, ideally, should ensure that the whole sector is a network of interacting organisations. It is important to develop new tools, management methods, and applications to improve the management and logistic operations of agricultural producers (farms) and agricultural service providers. From a geospatial perspective, this involves identifying cost optimization pathways, reducing transport, reducing environmental loads, and improving the energy balance, while maintaining production levels, etc. This paper describes the benefits of, and open issues arising from, the development of the Open Farm Management Information System. Emphasis is placed on descriptions of available remote sensing and other geospatial data, and their harmonization, processing, and presentation to users. At the same time, the FOODIE platform also offers a novel approach to yield potential estimation. Validation for one farm of 1,284 hectares demonstrated a 70% success rate when comparing measured yield results with the results of a theoretical yield potential model. The presented Open Farm Management Information System has already been successfully registered under Phase 8 of the Global Earth Observation System of Systems (GEOSS) Architecture Implementation Pilot in order to support the wide variety of demands that are primarily aimed at agriculture and water pollution monitoring by means of remote sensing.

  17. ESTIMATION OF CARBON SEQUESTRATION BY RUSSIAN FORESTS: GEOSPATIAL ISSUE

    Directory of Open Access Journals (Sweden)

    N. V. Malysheva

    2017-01-01

    Full Text Available Categories of carbon sequestration assessment for Russian forests are identified using a GIS toolkit. These are strata that are uniform in bioclimatic and site-specific conditions and correspond to the modern version of the bioclimatic forest district division. Stratifying the forests at an early stage substantially reduces the ambiguity of the evaluation, because the phytomass conversion and expansion factors used to calculate the forest carbon sink depend on site-specific conditions. Forest management units were linked to the strata, and the biomass conversion and expansion factors for forest carbon sink assessment were recalculated for each forest management unit. All operations were carried out with the analytical toolkit of the GIS. Units for the forest carbon storage inventory and forest carbon balance calculation were localized. Production capacity parameters and forest carbon sequestration capacity have been visualized on maps compiled in ArcGIS. Based on this spatially explicit information, we found that the greatest annual rates of carbon accumulation in Russian forests occur in the mixed coniferous-deciduous forests of the European-Ural part of Russia (Kaliningrad, Smolensk and Briansk Regions), the coniferous-deciduous forests near the boundary of Khabarovsk Region and Primorskij Kray in the Far East, and separate forest management units of the Kabardino-Balkariya North Caucasian mountain area. A geospatial visualization of carbon sequestration by Russian forests and an assessment of the carbon balance are presented.

  18. MapFactory - Towards a mapping design pattern for big geospatial data

    Science.gov (United States)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
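    To make the idea of a map factory concrete, the sketch below shows one conventional reading of the factory design pattern in Python, in which a design specification (here a plain dictionary loosely inspired by ISO 19115-1 metadata elements) selects which map product is built; the class and field names are illustrative assumptions, not the MapFactory implementation itself.

        # Hedged sketch of a factory pattern for map creation driven by a
        # metadata-like design specification (field names are hypothetical).
        from dataclasses import dataclass

        @dataclass
        class ChoroplethMap:
            title: str
            attribute: str
            def render(self):
                return f"choropleth of '{self.attribute}' titled '{self.title}'"

        @dataclass
        class HeatMap:
            title: str
            attribute: str
            def render(self):
                return f"heat map of '{self.attribute}' titled '{self.title}'"

        class MapFactory:
            _products = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

            @classmethod
            def create(cls, spec):
                # The specification decides which concrete map type is produced.
                map_cls = cls._products[spec["presentationForm"]]
                return map_cls(title=spec["title"], attribute=spec["attribute"])

        spec = {"title": "Population density 2018",
                "presentationForm": "choropleth",
                "attribute": "pop_per_km2"}
        print(MapFactory.create(spec).render())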

  19. Center of Excellence for Geospatial Information Science research plan 2013-18

    Science.gov (United States)

    Usery, E. Lynn

    2013-01-01

    The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.

  20. Comparative study of cocoa black ants temporal population distribution utilizing geospatial analysis

    Science.gov (United States)

    Adnan, N. A.; Bakar, S.; Mazlan, A. H.; Yusoff, Z. Mohd; Rasam, A. R. Abdul

    2018-02-01

    Cocoa plantations are also subject to disease and pest infestation. Some pests not only reduce yield but also inhibit the growth of the trees. The Malaysia Cocoa Board (MCB) has therefore explored the Cocoa Black Ant (CBA) as one of its biological control mechanisms to reduce infestation by the Cocoa Pod Borer (CPB). CPB damages cocoa beans and consequently reduces the quality of the dried beans. This study integrates geospatial analysis into the understanding of the CBA population distribution pattern in order to enhance its capability to control CPB infestation. The study has two objectives: i) to generate the temporal CBA distribution of a cocoa plantation for two different blocks, and ii) to visually compare the CBA population distribution patterns with the aid of geospatial techniques. For the two blocks (10B and 18A), the study found a CBA population pattern that showed a spatially modest, low distribution in February 2007, reached its highest levels in September 2007, and declined by the end of 2009. The use of GIS is therefore important in explaining the CBA population pattern in the mature cocoa field. This finding might be used as an indicator for examining the optimum distribution of CBA, which is needed as a biological control agent against CPB in the future.

  1. Prototyping a sensor enabled 3D citymodel on geospatial managed objects

    DEFF Research Database (Denmark)

    Kjems, Erik; Kolář, Jan

    2013-01-01

    One of the major development efforts within the GI Science domain is pointing at sensor-based information and the usage of real-time information coming from geographically referenced features in general. At the same time, 3D city models are mostly justified as being objects for visualization purposes rather than constituting the foundation of a geographic data representation of the world. The combination of 3D city models and real-time information based systems, though, can provide a whole new setup for data fusion within an urban environment and provide time-critical information preserving our limited … one constraint software design complex. On several occasions we have been advocating for a new and advanced formulation of real world features using the concept of Geospatial Managed Objects (GMO). This paper presents the outcome of the InfraWorld project, a 4 million Euro project financed primarily …

  2. Novel data visualizations of X-ray data for aviation security applications using the Open Threat Assessment Platform (OTAP)

    Science.gov (United States)

    Gittinger, Jaxon M.; Jimenez, Edward S.; Holswade, Erica A.; Nunna, Rahul S.

    2017-02-01

    This work demonstrates traditional and non-traditional visualizations of X-ray images for aviation security applications that become feasible with open system architecture initiatives such as the Open Threat Assessment Platform (OTAP). Anomalies of interest to aviation security are fluid: their characteristic signals can evolve rapidly. OTAP is a limited-scope, open-architecture baggage screening prototype intended to allow third-party vendors to develop and easily implement, integrate, and deploy detection algorithms and specialized hardware on a field-deployable screening technology [13]. In this study, stereoscopic images were created using an unmodified, field-deployed system and rendered on the Oculus Rift, a commercial virtual reality video gaming headset. The example described in this work is not dependent on the Oculus Rift and is possible using any comparable hardware configuration capable of rendering stereoscopic images. The depth information gained from viewing the images aids in the detection of characteristic signals from anomalies of interest. If successful, OTAP has the potential to allow aviation security to adapt more readily to the evolution of anomalies of interest. This work demonstrates one example, easily implemented on the OTAP platform, that could lead to a future generation of ATR algorithms and data visualization approaches.

  3. Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools

    Science.gov (United States)

    Zalles, Daniel R.; Manitakos, James

    2016-01-01

    Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), a 4.5-year National Science Foundation funded project, explored the strategies that stimulate teacher commitment to the project's driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate,…

  4. Geospatial Big Data Handling Theory and Methods: A Review and Research Challenges

    DEFF Research Database (Denmark)

    Li, Songnian; Dragicevic, Suzana; Anton, François

    2016-01-01

    Big data has now become a strong focus of global interest that is increasingly attracting the attention of academia, industry, government and other organizations. Big data can be situated in the disciplinary area of traditional geospatial data handling theory and methods. The increasing volume … The International Society for Photogrammetry and Remote Sensing (ISPRS) Technical Commission II (TC II) revisits the existing geospatial data handling methods and theories to determine whether they are still capable of handling emerging geospatial big data. Further, the paper synthesises problems, major issues and challenges with current … developments, as well as recommending what needs to be developed further in the near future.

  5. Towards the Development of a Taxonomy for Visualisation of Streamed Geospatial Data

    Science.gov (United States)

    Sibolla, B. H.; Van Zyl, T.; Coetzee, S.

    2016-06-01

    Geospatial data has very specific characteristics that need to be carefully captured in its visualisation in order for the user and the viewer to gain knowledge from it. The science of visualisation has gained much traction over the last decade in response to various visualisation challenges. During the development of an open-source, dynamic two-dimensional visualisation library that caters for geospatial streaming data, it was found necessary to conduct a review of existing geospatial visualisation taxonomies. The review was done in order to inform the design phase of the library development, such that an existing taxonomy could either be adopted or extended to fit the needs at hand. The major challenge in this case is to develop dynamic two-dimensional visualisations that enable human interaction in order to assist the user to understand data streams that are continuously being updated. This paper reviews the existing geospatial data visualisation taxonomies that have been developed over the years. Based on the review, an adopted taxonomy for visualisation of geospatial streaming data is presented. Example applications of this taxonomy are also provided. The adopted taxonomy will then be used to develop the information model for the visualisation library in a further study.

  6. a Web-Based Interactive Platform for Co-Clustering Spatio-Temporal Data

    Science.gov (United States)

    Wu, X.; Poorthuis, A.; Zurita-Milla, R.; Kraak, M.-J.

    2017-09-01

    Since current studies on clustering analysis mainly focus on exploring spatial or temporal patterns separately, a co-clustering algorithm is utilized in this study to enable the concurrent analysis of spatio-temporal patterns. To allow users to adopt and adapt the algorithm for their own analysis, it is integrated within the server side of an interactive web-based platform. The client side of the platform, running within any modern browser, is a graphical user interface (GUI) with multiple linked visualizations that facilitates the understanding, exploration and interpretation of the raw dataset and co-clustering results. Users can also upload their own datasets and adjust clustering parameters within the platform. To illustrate the use of this platform, an annual temperature dataset from 28 weather stations over 20 years in the Netherlands is used. After the dataset is loaded, it is visualized in a set of linked visualizations: a geographical map, a timeline and a heatmap. This aids the user in understanding the nature of their dataset and the appropriate selection of co-clustering parameters. Once the dataset is processed by the co-clustering algorithm, the results are visualized in the small multiples, a heatmap and a timeline to provide various views for better understanding and also further interpretation. Since the visualization and analysis are integrated in a seamless platform, the user can explore different sets of co-clustering parameters and instantly view the results in order to do iterative, exploratory data analysis. As such, this interactive web-based platform allows users to analyze spatio-temporal data using the co-clustering method and also helps the understanding of the results using multiple linked visualizations.
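    The server-side co-clustering step can be illustrated with a generic spectral co-clustering routine applied to a stations-by-years temperature matrix; this is a stand-in for the platform's own algorithm, and the synthetic data, cluster counts, and library choice (scikit-learn) are assumptions for illustration only.

        # Hedged sketch: co-clustering a stations x years temperature matrix so that
        # groups of stations and groups of years are found simultaneously.
        import numpy as np
        from sklearn.cluster import SpectralCoclustering

        rng = np.random.default_rng(0)
        temps = rng.normal(loc=10.0, scale=2.0, size=(28, 20))  # 28 stations, 20 years (synthetic)

        model = SpectralCoclustering(n_clusters=4, random_state=0)
        model.fit(temps)

        station_groups = model.row_labels_      # one cluster label per station
        year_groups = model.column_labels_      # one cluster label per year

        # Reorder the matrix by cluster labels; a heatmap of `ordered` is what the
        # web client would show in its linked visualizations.
        ordered = temps[np.argsort(station_groups)][:, np.argsort(year_groups)]
        print(station_groups, year_groups, ordered.shape)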

  7. Prototype of a Web-based Participative Decision Support Platform in Natural Hazards and Risk Management

    Directory of Open Access Journals (Sweden)

    Zar Chi Aye

    2015-07-01

    Full Text Available This paper presents the current state and development of a prototype web-GIS (Geographic Information System) decision support platform intended for application in natural hazards and risk management, mainly for floods and landslides. The web platform uses open-source geospatial software and technologies, particularly the Boundless (formerly OpenGeo) framework and its client-side software development kit (SDK). The main purpose of the platform is to assist experts and stakeholders in the decision-making process for the evaluation and selection of different risk management strategies through an interactive participation approach, integrating a web-GIS interface with a decision support tool based on a compromise programming approach. The access rights and functionality of the platform vary depending on the roles and responsibilities of the stakeholders in managing the risk. The application of the prototype platform is demonstrated for an example case study site, the Malborghetto Valbruna municipality in north-eastern Italy, where flash floods and landslides are frequent and major events occurred in 2003. The preliminary feedback collected from stakeholders in the region is discussed to understand their perspectives on the proposed prototype platform.
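    A compromise programming ranking of the kind used in such a decision support tool can be sketched as follows: criteria scores for each risk management alternative are normalized and the alternative closest to the ideal point under a weighted Lp metric is preferred. The alternatives, criteria, weights, and p value below are hypothetical, and the formulation shown is one common variant rather than the paper's exact method.

        # Hedged sketch of compromise programming: rank alternatives by their
        # weighted Lp distance to the ideal solution (all values hypothetical).
        import numpy as np

        alternatives = ["dike reinforcement", "retention basin", "early warning system"]
        # rows = alternatives, columns = criteria (cost, risk reduction, acceptance);
        # all criteria are expressed so that larger values are better.
        scores = np.array([[0.4, 0.9, 0.6],
                           [0.7, 0.6, 0.7],
                           [0.9, 0.4, 0.8]])
        weights = np.array([0.3, 0.5, 0.2])
        p = 2.0

        ideal = scores.max(axis=0)               # best achievable value per criterion
        anti_ideal = scores.min(axis=0)
        norm = (ideal - scores) / (ideal - anti_ideal)   # 0 = ideal, 1 = worst
        distances = (weights * norm ** p).sum(axis=1) ** (1.0 / p)

        for name, d in sorted(zip(alternatives, distances), key=lambda t: t[1]):
            print(f"{name}: Lp distance to ideal = {d:.3f}")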

  8. Scientific visualization uncertainty, multifield, biomedical, and scalable visualization

    CERN Document Server

    Chen, Min; Johnson, Christopher; Kaufman, Arie; Hagen, Hans

    2014-01-01

    Based on the seminar that took place in Dagstuhl, Germany in June 2011, this contributed volume studies the four important topics within the scientific visualization field: uncertainty visualization, multifield visualization, biomedical visualization and scalable visualization. Uncertainty visualization deals with uncertain data from simulations or sampled data, uncertainty due to the mathematical processes operating on the data, and uncertainty in the visual representation. Multifield visualization addresses the need to depict multiple data at individual locations and the combination of multiple datasets. Biomedical visualization is a vast field with select subtopics addressed, from scanning methodologies to structural applications to biological applications. Scalability in scientific visualization is critical as data grows and computational devices range from hand-held mobile devices to exascale computational platforms. Scientific Visualization will be useful to practitioners of scientific visualization, ...

  9. Declarative language design for interactive visualization.

    Science.gov (United States)

    Heer, Jeffrey; Bostock, Michael

    2010-01-01

    We investigate the design of declarative, domain-specific languages for constructing interactive visualizations. By separating specification from execution, declarative languages can simplify development, enable unobtrusive optimization, and support retargeting across platforms. We describe the design of the Protovis specification language and its implementation within an object-oriented, statically-typed programming language (Java). We demonstrate how to support rich visualizations without requiring a toolkit-specific data model and extend Protovis to enable declarative specification of animated transitions. To support cross-platform deployment, we introduce rendering and event-handling infrastructures decoupled from the runtime platform, letting designers retarget visualization specifications (e.g., from desktop to mobile phone) with reduced effort. We also explore optimizations such as runtime compilation of visualization specifications, parallelized execution, and hardware-accelerated rendering. We present benchmark studies measuring the performance gains provided by these optimizations and compare performance to existing Java-based visualization tools, demonstrating scalability improvements exceeding an order of magnitude.
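    The central idea of separating specification from execution can be illustrated with a toy declarative spec interpreted by a small Python runtime (using matplotlib as one possible rendering back end); this is a generic illustration of the principle, not the Protovis language or its Java implementation, and the spec keys are invented.

        # Hedged sketch: a declarative visualization spec (plain data) interpreted by
        # a small runtime, separating *what* to draw from *how* it is drawn.
        import matplotlib.pyplot as plt

        spec = {                                  # hypothetical specification format
            "mark": "circle",
            "encoding": {"x": "longitude", "y": "latitude", "size": "magnitude"},
        }
        data = [{"longitude": 151.2, "latitude": -33.9, "magnitude": 4.0},
                {"longitude": 144.9, "latitude": -37.8, "magnitude": 2.5},
                {"longitude": 115.9, "latitude": -31.9, "magnitude": 3.1}]

        def render(spec, data):
            enc = spec["encoding"]
            xs = [row[enc["x"]] for row in data]
            ys = [row[enc["y"]] for row in data]
            sizes = [20 * row[enc["size"]] for row in data]
            if spec["mark"] == "circle":          # the runtime decides how marks are drawn
                plt.scatter(xs, ys, s=sizes)
            plt.xlabel(enc["x"])
            plt.ylabel(enc["y"])
            plt.savefig("marks.png")              # could be retargeted to other back ends

        render(spec, data)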

  10. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    Science.gov (United States)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure to remotely browse, visualize, and analyze the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
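    Because the server exposes standard OGC interfaces, any client can request a rendered map with an ordinary WMS GetMap call, as sketched below; the endpoint URL and layer name are placeholders and do not refer to the actual TOPS service.

        # Hedged sketch: requesting a rendered map image from an OGC WMS endpoint.
        # The URL and layer name are hypothetical placeholders.
        import requests

        WMS_URL = "https://example.org/wms"      # placeholder endpoint

        params = {
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "tops:gross_primary_production",   # hypothetical layer name
            "STYLES": "",
            "SRS": "EPSG:4326",
            "BBOX": "-125,32,-114,42",           # lon/lat bounding box over California
            "WIDTH": 512,
            "HEIGHT": 512,
            "FORMAT": "image/png",
        }

        response = requests.get(WMS_URL, params=params, timeout=30)
        response.raise_for_status()
        with open("tops_gpp.png", "wb") as f:
            f.write(response.content)            # a PNG map ready for a web client or GIS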

  11. Representation of activity in images using geospatial temporal graphs

    Science.gov (United States)

    Brost, Randolph; McLendon, III, William C.; Parekh, Ojas D.; Rintoul, Mark Daniel; Watson, Jean-Paul; Strip, David R.; Diegert, Carl

    2018-05-01

    Various technologies pertaining to modeling patterns of activity observed in remote sensing images using geospatial-temporal graphs are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Activity patterns may be discerned from the graphs by coding nodes representing persistent objects like buildings differently from nodes representing ephemeral objects like vehicles, and examining the geospatial-temporal relationships of ephemeral nodes within the graph.
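    The graph construction described here can be sketched with networkx: persistent and ephemeral objects become attributed nodes, undirected spatial relationships become edge pairs, and directed edges link observations across time. The object identifiers, distances, and query at the end are invented for illustration and do not reproduce the patented system.

        # Hedged sketch: building a small geospatial-temporal graph with networkx.
        # Node and edge attributes are invented for illustration.
        import networkx as nx

        g = nx.MultiDiGraph()

        # Nodes: objects observed in two images, coded persistent vs. ephemeral.
        g.add_node("building_1@t0", kind="persistent")
        g.add_node("vehicle_7@t0", kind="ephemeral")
        g.add_node("building_1@t1", kind="persistent")
        g.add_node("vehicle_7@t1", kind="ephemeral")

        # Undirected spatial relationships (stored here as edge pairs in both directions).
        for a, b, dist in [("building_1@t0", "vehicle_7@t0", 12.5),
                           ("building_1@t1", "vehicle_7@t1", 80.0)]:
            g.add_edge(a, b, relation="distance", meters=dist)
            g.add_edge(b, a, relation="distance", meters=dist)

        # Directed temporal edges linking the same object across acquisition times.
        g.add_edge("building_1@t0", "building_1@t1", relation="time")
        g.add_edge("vehicle_7@t0", "vehicle_7@t1", relation="time")

        # A simple activity query: follow the temporal edges of ephemeral objects.
        for node, data in g.nodes(data=True):
            if data["kind"] == "ephemeral" and node.endswith("@t0"):
                successors = [v for _, v, d in g.out_edges(node, data=True)
                              if d["relation"] == "time"]
                print(node, "->", successors)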

  12. Urban Image Classification: Per-Pixel Classifiers, Sub-Pixel Analysis, Object-Based Image Analysis, and Geospatial Methods. 10; Chapter

    Science.gov (United States)

    Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.

    2013-01-01

    Remote sensing methods used to generate base maps to analyze the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within that pixel. They are used for everything from simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems that assign urban land covers. Researchers recognize, however, that even with the smallest pixel size, the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process. The primary difference, though, is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post-classification stages.
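    As a minimal example of the per-pixel approach mentioned above, the sketch below computes NDVI from red and near-infrared bands and thresholds it into coarse urban land-cover classes; the synthetic arrays and threshold values are illustrative assumptions, not values from the chapter.

        # Hedged sketch of per-pixel classification: NDVI from red/NIR bands,
        # thresholded into coarse classes (synthetic data, illustrative thresholds).
        import numpy as np

        rng = np.random.default_rng(42)
        red = rng.uniform(0.05, 0.4, size=(4, 4))   # stand-ins for reflectance bands
        nir = rng.uniform(0.1, 0.6, size=(4, 4))

        ndvi = (nir - red) / (nir + red + 1e-9)      # avoid division by zero

        # Each pixel is assigned a class from its own spectral values only.
        classes = np.where(ndvi < 0.1, "impervious",
                           np.where(ndvi < 0.3, "sparse vegetation", "vegetation"))
        print(np.round(ndvi, 2))
        print(classes)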

  13. CyberTORCS: An Intelligent Vehicles Simulation Platform for Cooperative Driving

    Directory of Open Access Journals (Sweden)

    Ming Yang

    2011-05-01

    Full Text Available Simulation platforms play an important role in intelligent vehicle research, especially research on cooperative driving, due to the high cost and risk of real-world experiments. To make cooperative driving tests easier and more convenient, we introduce an intelligent vehicle simulation platform, called CyberTORCS, for research on cooperative driving. Details of the simulator modules, including vehicle body control, vehicle visualization modeling and track visualization modeling, are presented. Two simulation examples are given to validate the feasibility and effectiveness of the proposed simulation platform.

  14. Data management for geospatial vulnerability assessment of interdependencies in US power generation

    Energy Technology Data Exchange (ETDEWEB)

    Shih, C.Y.; Scown, C.D.; Soibelman, L.; Matthews, H.S.; Garrett, J.H.; Dodrill, K.; McSurdy, S. [Carnegie Mellon University, Pittsburgh, PA (United States). Dept. of Civil & Environmental Engineering

    2009-09-15

    Critical infrastructures maintain our society's stability, security, and quality of life. These systems are also interdependent, which means that the disruption of one infrastructure system can significantly impact the operation of other systems. Because of the heavy reliance on electricity production, it is important to assess possible vulnerabilities. Determining the source of these vulnerabilities can provide insight for risk management and emergency response efforts. This research uses data warehousing and visualization techniques to explore the interdependencies between coal mines, rail transportation, and electric power plants. By merging geospatial and nonspatial data, we are able to model the potential impacts of a disruption to one or more mines, rail lines, or power plants, and visually display the results using a geographical information system. A scenario involving a severe earthquake in the New Madrid Seismic Zone is used to demonstrate the capabilities of the model when given input in the form of a potentially impacted area. This type of interactive analysis can help decision makers to understand the vulnerabilities of the coal distribution network and the potential impact it can have on electricity production.

  15. Stakeholder Alignment and Changing Geospatial Information Capabilities

    Science.gov (United States)

    Winter, S.; Cutcher-Gershenfeld, J.; King, J. L.

    2015-12-01

    Changing geospatial information capabilities can have major economic and social effects on activities such as drought monitoring, weather forecasts, agricultural productivity projections, water and air quality assessments, the effects of forestry practices, and so on. Whose interests are served by such changes? Two common mistakes are assuming stability in the community of stakeholders and consistency in stakeholder behavior. Stakeholder communities can reconfigure dramatically as some leave the discussion, others enter, and circumstances shift, all resulting in dynamic points of alignment and misalignment. New stakeholders can bring new interests, and existing stakeholders can change their positions. Stakeholders and their interests need to be considered as geospatial information capabilities change, but this is easier said than done. New ways of thinking about stakeholder alignment in light of changes in capability are presented.

  16. Establishment of the Northeast Coastal Watershed Geospatial Data Network (NECWGDN)

    Energy Technology Data Exchange (ETDEWEB)

    Hannigan, Robyn [University of Massachusetts Boston

    2014-02-17

    The goals of NECWGDN were to establish integrated geospatial databases that interfaced with existing open-source environmental data server technologies (e.g., HydroDesktop) and included ecological and human data to enable evaluation, prediction, and adaptation in coastal environments to climate- and human-induced threats to the coastal marine resources within the Gulf of Maine. We have completed the development and testing of a "test bed" architecture that is compatible with HydroDesktop and have identified key metadata structures that will enable seamless integration and delivery of environmental, ecological, and human data as well as models to predict threats to end-users. Uniquely, this database integrates point as well as model data and so offers capabilities to end-users not found in other databases. Future efforts will focus on the development of integrated environmental-human dimension models that can serve, in near real time, visualizations of threats to coastal resources and habitats.

  17. Visual Analysis of Air Traffic Data

    Science.gov (United States)

    Albrecht, George Hans; Pang, Alex

    2012-01-01

    In this paper, we present visual analysis tools to help study the impact of policy changes on air traffic congestion. The tools support visualization of time-varying air traffic density over an area of interest using different time granularity. We use this visual analysis platform to investigate how changing the aircraft separation volume can reduce congestion while maintaining key safety requirements. The same platform can also be used as a decision aid for processing requests for unmanned aerial vehicle operations.

  18. Geospatial Brokering - Challenges and Future Directions

    Science.gov (United States)

    White, C. E.

    2012-12-01

    An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols - e.g., OGC CSW, REST, and OpenSearch - to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight to future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.
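    For the programmatic side, a broker catalogue exposing OGC CSW can be searched with OWSLib as sketched below; the catalogue URL and search term are placeholders, and the snippet simply assumes the OWSLib package is installed rather than describing any specific broker deployment.

        # Hedged sketch: a programmatic catalogue search against an OGC CSW endpoint
        # using OWSLib (the endpoint URL and search term are placeholders).
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        csw = CatalogueServiceWeb("https://example.org/csw")     # placeholder broker endpoint

        # Full-text style constraint over the catalogue's AnyText queryable.
        query = PropertyIsLike("csw:AnyText", "%land cover%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        for rec_id, rec in csw.records.items():
            # Each record carries the human-readable metadata a portal UI would show.
            print(rec_id, "-", rec.title)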

  19. SIMULTANEOUS VISUALIZATION OF DIFFERENT UTILITY NETWORKS FOR DISASTER MANAGEMENT

    OpenAIRE

    S. Semm; T. Becker; T. H. Kolbe

    2012-01-01

    Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and enforce situational awareness by presenting and representing relevant information. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific decision-making throughout the crisis. Since operators' attention span and working memory are limiting factors in acquiring and interpreting information, the cartographic presentation has t...

  20. ASAP: a web-based platform for the analysis and interactive visualization of single-cell RNA-seq data.

    Science.gov (United States)

    Gardeux, Vincent; David, Fabrice P A; Shajkofci, Adrian; Schwalie, Petra C; Deplancke, Bart

    2017-10-01

    Single-cell RNA-sequencing (scRNA-seq) allows whole transcriptome profiling of thousands of individual cells, enabling the molecular exploration of tissues at the cellular level. Such analytical capacity is of great interest to many research groups in the world, yet these groups often lack the expertise to handle complex scRNA-seq datasets. We developed a fully integrated, web-based platform aimed at the complete analysis of scRNA-seq data post genome alignment: from the parsing, filtering and normalization of the input count data files, to the visual representation of the data, identification of cell clusters, differentially expressed genes (including cluster-specific marker genes), and functional gene set enrichment. This Automated Single-cell Analysis Pipeline (ASAP) combines a wide range of commonly used algorithms with sophisticated visualization tools. Compared with existing scRNA-seq analysis platforms, researchers (including those lacking computational expertise) are able to interact with the data in a straightforward fashion and in real time. Furthermore, given the overlap between scRNA-seq and bulk RNA-seq analysis workflows, ASAP should conceptually be broadly applicable to any RNA-seq dataset. As a validation, we demonstrate how we can use ASAP to simply reproduce the results from a single-cell study of 91 mouse cells involving five distinct cell types. The tool is freely available at asap.epfl.ch and R/Python scripts are available at github.com/DeplanckeLab/ASAP. bart.deplancke@epfl.ch. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  1. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    Science.gov (United States)

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit the quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.

  2. NativeView: A Geospatial Curriculum for Native Nation Building

    Science.gov (United States)

    Rattling Leaf, J.

    2007-12-01

    In the spirit of collaboration and reciprocity, James Rattling Leaf of Sinte Gleska University on the Rosebud Reservation of South Dakota will present recent developments, experiences, insights and a vision for education in Indian Country. A thirty-year-young institution, Sinte Gleska University was founded on a strong vision of ancestral leadership and the values of the Lakota Way of Life. Sinte Gleska University (SGU) has initiated the development of a Geospatial Education Curriculum project. NativeView: A Geospatial Curriculum for Native Nation Building is a two-year project that entails a disciplined approach towards the development of a relevant geospatial academic curriculum. This project is designed to meet the educational and land management needs of the Rosebud Lakota Tribe through the utilization of Geographic Information Systems (GIS), Remote Sensing (RS) and Global Positioning Systems (GPS). In conjunction with the strategy and progress of this academic project, a formal presentation and demonstration of the SGU-based RezMapper software will exemplify an innovative example of state-of-the-art information technology. RezMapper is an interactive CD software package focused on the 21 Lakota communities on the Rosebud Reservation that utilizes an ingenious concept of multimedia mapping and state-of-the-art data compression and presentation. This ongoing development utilizes geographic data, imagery from space, historical aerial photography and cultural features such as historic Lakota documents, language, song, video and historical photographs in a multimedia fashion. As a tangible product, RezMapper will be a project deliverable tool for use in the classroom and by a broad range of learners.

  3. Interactive 3D visualization for theoretical virtual observatories

    Science.gov (United States)

    Dykes, T.; Hassan, A.; Gheller, C.; Croton, D.; Krokos, M.

    2018-06-01

    Virtual observatories (VOs) are online hubs of scientific knowledge. They encompass a collection of platforms dedicated to the storage and dissemination of astronomical data, from simple data archives to e-research platforms offering advanced tools for data exploration and analysis. Whilst the more mature platforms within VOs primarily serve the observational community, there are also services fulfilling a similar role for theoretical data. Scientific visualization can be an effective tool for analysis and exploration of data sets made accessible through web platforms for theoretical data, which often contain spatial dimensions and properties inherently suitable for visualization via e.g. mock imaging in 2D or volume rendering in 3D. We analyse the current state of 3D visualization for big theoretical astronomical data sets through scientific web portals and virtual observatory services. We discuss some of the challenges for interactive 3D visualization and how it can augment the workflow of users in a virtual observatory context. Finally we showcase a lightweight client-server visualization tool for particle-based data sets, allowing quantitative visualization via data filtering, highlighting two example use cases within the Theoretical Astrophysical Observatory.

  4. Interactive 3D Visualization for Theoretical Virtual Observatories

    Science.gov (United States)

    Dykes, Tim; Hassan, A.; Gheller, C.; Croton, D.; Krokos, M.

    2018-04-01

    Virtual Observatories (VOs) are online hubs of scientific knowledge. They encompass a collection of platforms dedicated to the storage and dissemination of astronomical data, from simple data archives to e-research platforms offering advanced tools for data exploration and analysis. Whilst the more mature platforms within VOs primarily serve the observational community, there are also services fulfilling a similar role for theoretical data. Scientific visualization can be an effective tool for analysis and exploration of datasets made accessible through web platforms for theoretical data, which often contain spatial dimensions and properties inherently suitable for visualization via e.g. mock imaging in 2d or volume rendering in 3d. We analyze the current state of 3d visualization for big theoretical astronomical datasets through scientific web portals and virtual observatory services. We discuss some of the challenges for interactive 3d visualization and how it can augment the workflow of users in a virtual observatory context. Finally we showcase a lightweight client-server visualization tool for particle-based datasets allowing quantitative visualization via data filtering, highlighting two example use cases within the Theoretical Astrophysical Observatory.

  5. Recent innovation of geospatial information technology to support disaster risk management and responses

    Science.gov (United States)

    Une, Hiroshi; Nakano, Takayuki

    2018-05-01

    Geographic location is one of the most fundamental and indispensable information elements in the field of disaster response and prevention. For example, in the case of the Tohoku Earthquake in 2011, aerial photos taken immediately after the earthquake greatly improved information sharing among different government offices and facilitated rescue and recovery operations, and maps prepared after the disaster assisted in the rapid reconstruction of affected local communities. Thanks to the recent development of geospatial information technology, this information has become even more essential for disaster response activities. Advances in web mapping technology allow us to better understand the situation by overlaying various location-specific data on base maps on the web and specifying the areas on which activities should be focused. Through 3-D modelling technology, we can gain a more realistic understanding of the relationship between disasters and topography. Geospatial information technology can support proper preparation and emergency responses against disasters by individuals and local communities through hazard mapping and other information services using mobile devices. Thus, geospatial information technology is playing an ever more vital role at all stages of disaster risk management and response. In acknowledging geospatial information's vital role in disaster risk reduction, the Sendai Framework for Disaster Risk Reduction 2015-2030, adopted at the Third United Nations World Conference on Disaster Risk Reduction, repeatedly stresses the importance of utilizing geospatial information technology for disaster risk reduction. This presentation aims to report recent practical applications of geospatial information technology for disaster risk management and responses.

  6. The Efficacy of Educative Curriculum Materials to Support Geospatial Science Pedagogical Content Knowledge

    Science.gov (United States)

    Bodzin, Alec; Peffer, Tamara; Kulo, Violet

    2012-01-01

    Teaching and learning about geospatial aspects of energy resource issues requires that science teachers apply effective science pedagogical approaches to implement geospatial technologies into classroom instruction. To address this need, we designed educative curriculum materials as an integral part of a comprehensive middle school energy…

  7. Geoscience data visualization and analysis using GeoMapApp

    Science.gov (United States)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data, images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and for quantitatively interrogating data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D …

  8. Central Asia Water (CAWa) - A visualization platform for hydro-meteorological sensor data

    Science.gov (United States)

    Stender, Vivien; Schroeder, Matthias; Wächter, Joachim

    2014-05-01

    … (SAS) for sending alerts. An open-source web platform bundles the data provided by the SWE web services of the hydro-meteorological stations and provides tools for data visualization and data access. The visualization tool was implemented using open-source components such as GeoExt/ExtJS and OpenLayers. Using the application, the user can query the relevant sensor data, select parameters and time periods, visualize the results, and finally download the data. [1] http://www.cawa-project.net
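    Sensor data published through OGC Sensor Web Enablement services can typically be pulled with a plain HTTP GetObservation request, as sketched below; the service URL, offering, observed property, and time window are placeholders rather than the actual CAWa endpoints or identifiers.

        # Hedged sketch: fetching observations from an OGC Sensor Observation Service
        # (SOS 1.0.0 key-value request; URL and identifiers are placeholders).
        import requests

        SOS_URL = "https://example.org/sos"      # placeholder SWE endpoint

        params = {
            "service": "SOS",
            "version": "1.0.0",
            "request": "GetObservation",
            "offering": "HYDROMET_STATION_01",            # hypothetical offering id
            "observedProperty": "air_temperature",        # hypothetical property
            "responseFormat": "text/xml;subtype=\"om/1.0.0\"",
            "eventTime": "2013-06-01T00:00:00Z/2013-06-30T23:59:59Z",
        }

        response = requests.get(SOS_URL, params=params, timeout=60)
        response.raise_for_status()
        print(response.text[:500])               # O&M XML to be parsed and plotted by the portal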

  9. lawn: An R client for the Turf JavaScript Library for Geospatial Analysis

    Science.gov (United States)

    lawn is an R package to provide access to the geospatial analysis capabilities in the Turf javascript library. Turf expects data in GeoJSON format. Given that many datasets are now available natively in GeoJSON providing an easier method for conducting geospatial analyses on thes...

  10. A Geospatial Data Recommender System based on Metadata and User Behaviour

    Science.gov (United States)

    Li, Y.; Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; Finch, C. J.; McGibbney, L. J.

    2017-12-01

    Earth observations are produced at a fast velocity by real-time sensors, reaching tera- to peta-bytes of geospatial data daily. Discovering and accessing the right data within these massive geospatial holdings is like finding a needle in a haystack. To help researchers find the right data for study and decision support, a great deal of research focused on improving search performance has been proposed, including recommendation algorithms. However, few papers have discussed how to implement a recommendation algorithm in a geospatial data retrieval system. In order to address this problem, we propose a recommendation engine that improves the discovery of relevant geospatial data by mining and utilizing metadata and user behavior data: 1) metadata-based recommendation considers the correlation of each attribute (i.e., spatiotemporal, categorical, and ordinal) with the data to be found; in particular, a phrase extraction method is used to improve the accuracy of the description similarity; 2) user behavior data are utilized to predict the interest of a user through collaborative filtering; 3) an integration method is designed to combine the results of the above two methods to achieve better recommendations. Experiments show that in the hybrid recommendation list, all precisions from position 1 to 10 are larger than 0.8.
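    The integration step can be illustrated with a simple weighted combination of a metadata-similarity score and a collaborative-filtering score for each candidate dataset; the scores, weight, and dataset names below are invented for illustration and do not reproduce the paper's algorithm.

        # Hedged sketch: blending metadata-based similarity with a collaborative
        # filtering prediction into one hybrid ranking (all numbers invented).
        def hybrid_rank(candidates, alpha=0.6):
            # alpha weights the metadata similarity; (1 - alpha) weights the CF score.
            scored = [
                (name, alpha * meta_sim + (1.0 - alpha) * cf_score)
                for name, meta_sim, cf_score in candidates
            ]
            return sorted(scored, key=lambda t: t[1], reverse=True)

        candidates = [
            # (dataset id, metadata similarity to the query, CF prediction for the user)
            ("sea_surface_temperature_L3", 0.82, 0.40),
            ("chlorophyll_a_monthly", 0.55, 0.75),
            ("ocean_wind_speed_daily", 0.30, 0.20),
        ]

        for name, score in hybrid_rank(candidates):
            print(f"{name}: hybrid score = {score:.2f}")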

  11. NASA's Earth Science Gateway: A Platform for Interoperable Services in Support of the GEOSS Architecture

    Science.gov (United States)

    Alameh, N.; Bambacus, M.; Cole, M.

    2006-12-01

    NASA's Earth Science as well as interdisciplinary research and applications activities require access to earth observations, analytical models and specialized tools and services, from diverse distributed sources. Interoperability and open standards for geospatial data access and processing greatly facilitate such access among the information and processing components related to spacecraft, airborne, and in situ sensors; predictive models; and decision support tools. To support this mission, NASA's Geosciences Interoperability Office (GIO) has been developing the Earth Science Gateway (ESG; online at http://esg.gsfc.nasa.gov) by adapting and deploying a standards-based commercial product. Thanks to extensive use of open standards, ESG can tap into a wide array of online data services, serve a variety of audiences and purposes, and adapt to technology and business changes. Most importantly, the use of open standards allows ESG to function as a platform within a larger context of distributed geoscience processing, such as the Global Earth Observing System of Systems (GEOSS). ESG shares the goals of GEOSS to ensure that observations and products shared by users will be accessible, comparable, and understandable by relying on common standards and adaptation to user needs. By maximizing interoperability, modularity, extensibility and scalability, ESG's architecture fully supports the stated goals of GEOSS. As such, ESG's role extends beyond that of a gateway to NASA science data to become a shared platform that can be leveraged by GEOSS via: a modular and extensible architecture; consensus and community-based standards (e.g., ISO and OGC standards); a variety of clients and visualization techniques, including WorldWind and Google Earth; a variety of services (including catalogs) with standard interfaces; data integration and interoperability; mechanisms for user involvement and collaboration; and mechanisms for supporting interdisciplinary and domain-specific applications. ESG …

  12. Planetary SUrface Portal (PSUP): a tool for easy visualization and analysis of Martian surface

    Science.gov (United States)

    Poulet, Francois; Quantin-Nataf, Cathy; Ballans, Hervé; Lozac'h, Loic; Audouard, Joachim; Carter, John; Dassas, karin; Malapert, Jean-Christophe; Marmo, Chiara; Poulleau, Gilles; Riu, Lucie; Séjourné, antoine

    2016-10-01

    PSUP comprises two software application platforms for working with raster, vector, DTM, and hyperspectral data acquired by various space instruments analyzing the surface of Mars from orbit. The first platform of PSUP is MarsSI (Martian surface data processing Information System, http://emars.univ-lyon1.fr). It provides data analysis functionalities to select and download ready-to-use products or to process data through specific and validated pipelines. To date, MarsSI handles CTX, HiRISE and CRISM data of the NASA/MRO mission, HRSC and OMEGA data of the ESA/MEx mission and THEMIS data of the NASA/ODY mission (Lozac'h et al., EPSC 2015). The second part of PSUP is also open to the scientific community and can be visited at http://psup.ias.u-psud.fr/. This web-based user interface provides access to many data products for Mars: image footprints and rasters from the MarsSI tool; compositional maps from OMEGA and TES; albedo and thermal inertia from OMEGA and TES; mosaics from THEMIS, Viking, and CTX; and high-level specific products (defined as catalogues) such as hydrated mineral sites derived from CRISM and OMEGA data, central peaks mineralogy, … In addition, OMEGA C channel data cubes corrected for atmospheric and aerosol contributions can be downloaded. The architecture of PSUP data management and visualization is based on SITools2 and MIZAR, two CNES generic tools developed by a joint effort between CNES and scientific laboratories. SITools2 provides a self-manageable data access layer deployed on the PSUP data, while MIZAR is a 3D application running in a browser for discovering and visualizing geospatial data. Further developments, including the addition of high-level products of Mars (regional geological maps, new global compositional maps, …), are foreseen. Ultimately, PSUP will be adapted to other planetary surfaces and space missions in which the French research institutes are involved.

  13. KINGDOM OF SAUDI ARABIA GEOSPATIAL INFORMATION INFRASTRUCTURE – AN INITIAL STUDY

    Directory of Open Access Journals (Sweden)

    S. H. Alsultan

    2015-10-01

    Full Text Available This paper reviews the current Geographic Information System (GIS; Longley et al.) implementation and status in the Kingdom of Saudi Arabia (KSA). Based on the review, several problems were identified and discussed. The characteristics of these problems show that the country needs a national geospatial centre. As a new initiative for a national geospatial centre, a study is being conducted, particularly on best practice from other countries, the availability of a national committee for standards and policies on data sharing, and the best proposed organizational structure inside the administration of the KSA. The study also covers the degree of readiness and awareness among the main GIS stakeholders within the country as well as private parties. At the end of this paper, strategic steps for the national geospatial management centre are proposed as the initial output of the study.

  14. Real-time visual tracking of less textured three-dimensional objects on mobile platforms

    Science.gov (United States)

    Seo, Byung-Kuk; Park, Jungsik; Park, Hanhoon; Park, Jong-Il

    2012-12-01

    Natural feature-based approaches are still challenging for mobile applications (e.g., mobile augmented reality), because they are feasible only in limited environments such as highly textured and planar scenes/objects, and they need powerful mobile hardware for fast and reliable tracking. In many cases where conventional approaches are not effective, three-dimensional (3-D) knowledge of target scenes would be beneficial. We present a well-established framework for real-time visual tracking of less textured 3-D objects on mobile platforms. Our framework is based on model-based tracking that efficiently exploits partially known 3-D scene knowledge such as object models and a background's distinctive geometric or photometric knowledge. Moreover, we elaborate on implementation in order to make it suitable for real-time vision processing on mobile hardware. The performance of the framework is tested and evaluated on recent commercially available smartphones, and its feasibility is shown by real-time demonstrations.

  15. GeoBoost: accelerating research involving the geospatial metadata of virus GenBank records.

    Science.gov (United States)

    Tahsin, Tasnia; Weissenbacher, Davy; O'Connor, Karen; Magge, Arjun; Scotch, Matthew; Gonzalez-Hernandez, Graciela

    2018-05-01

    GeoBoost is a command-line software package developed to address sparse or incomplete metadata in GenBank sequence records that relate to the location of the infected host (LOIH) of viruses. Given a set of GenBank accession numbers corresponding to virus GenBank records, GeoBoost extracts, integrates and normalizes geographic information reflecting the LOIH of the viruses using integrated information from GenBank metadata and related full-text publications. In addition, to facilitate probabilistic geospatial modeling, GeoBoost assigns probability scores for each possible LOIH. Binaries and resources required for running GeoBoost are packed into a single zipped file and freely available for download at https://tinyurl.com/geoboost. A video tutorial is included to help users quickly and easily install and run the software. The software is implemented in Java 1.8, and supported on MS Windows and Linux platforms. gragon@upenn.edu. Supplementary data are available at Bioinformatics online.

  16. A Geospatial Decision Support System Toolkit, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and commercialize a working prototype Geospatial Decision Support Toolkit (GeoKit). GeoKit will enable scientists, agencies, and stakeholders to...

  17. GSKY: A scalable distributed geospatial data server on the cloud

    Science.gov (United States)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information coming from different geospatial collections is in increasing demand by the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, and so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), holds over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server, called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve large numbers of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process as …

  18. Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals

    Science.gov (United States)

    Zamyadi, A.; Pouliot, J.; Bédard, Y.

    2013-09-01

    Accessing 3D geospatial models, eventually at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information is very different from one geo-portal to another, as well as for similar 3D resources in the same geo-portal. The inventory considered 971 data resources affiliated with elevation. 51% of them were from three geo-portals running at Canadian federal and municipal levels whose metadata resources did not consider 3D models by any definition. Regarding the remaining 49%, which refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve 3D geospatial model discovery in geo-portals by adding a specific metadata-set. Based on the knowledge and current practices of 3D modeling, and 3D data acquisition and management, a set of metadata is proposed to increase its suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes. These classes are organized into three packages: General and Complementary, covering contextual and structural information, and Availability, covering the transition from storage to delivery format. The proposed metadata set is compared with Canadian Geospatial

  19. Assessing the socioeconomic impact and value of open geospatial information

    Science.gov (United States)

    Pearlman, Francoise; Pearlman, Jay; Bernknopf, Richard; Coote, Andrew; Craglia, Massimo; Friedl, Lawrence; Gallo, Jason; Hertzfeld, Henry; Jolly, Claire; Macauley, Molly K.; Shapiro, Carl; Smart, Alan

    2016-03-10

    The production and accessibility of geospatial information including Earth observation is changing greatly both technically and in terms of human participation. Advances in technology have changed the way that geospatial data are produced and accessed, resulting in more efficient processes and greater accessibility than ever before. Improved technology has also created opportunities for increased participation in the gathering and interpretation of data through crowdsourcing and citizen science efforts. Increased accessibility has resulted in greater participation in the use of data as prices for Government-produced data have fallen and barriers to access have been reduced.

  20. Introduction of geospatial perspective to the ecology of fish-habitat relationships in Indonesian coral reefs: A remote sensing approach

    Science.gov (United States)

    Sawayama, Shuhei; Nurdin, Nurjannah; Akbar AS, Muhammad; Sakamoto, Shingo X.; Komatsu, Teruhisa

    2015-06-01

    Coral reef ecosystems worldwide are now being harmed by various stresses accompanying the degradation of fish habitats, and thus knowledge of fish-habitat relationships is urgently required. Because conventional research methods were not practical for this purpose due to the lack of a geospatial perspective, we attempted to develop a research method integrating visual fish observation with a seabed habitat map and to expand knowledge to a two-dimensional scale. WorldView-2 satellite imagery of the Spermonde Archipelago, Indonesia obtained in September 2012 was analyzed and classified into four typical substrates: live coral, dead coral, seagrass and sand. The overall classification accuracy of this map was 81.3%, considered precise enough for subsequent analyses. Three sub-areas (CC: continuous coral reef, BC: boundary of coral reef and FC: few live coral zone) around reef slopes were extracted from the map. Visual transect surveys for several fish species were conducted within each sub-area in June 2013. As a result, the mean density (Ind. / 300 m2) of Chaetodon octofasciatus, known as an obligate feeder of corals, was significantly higher at BC than at the others (p < 0.05), implying that this species' density is strongly influenced by the spatial configuration of its habitat, like the "edge effect." This indicates that future conservation procedures for coral reef fishes should consider not only coral cover but also its spatial configuration. The present study also indicates that the introduction of a geospatial perspective derived from remote sensing has great potential to advance conventional ecological studies on coral reef fishes.

  1. Geospatial metadata retrieval from web services

    Directory of Open Access Journals (Sweden)

    Ivanildo Barbosa

    Full Text Available Nowadays, producers of geospatial data in either raster or vector formats are able to make them available on the World Wide Web by deploying web services that enable users to access and query those contents even without specific software for geoprocessing. Several providers around the world have deployed instances of WMS (Web Map Service), WFS (Web Feature Service) and WCS (Web Coverage Service), all of them specified by the Open Geospatial Consortium (OGC). In consequence, metadata about the available contents can be retrieved to be compared with similar offline datasets from other sources. This paper presents a brief summary and describes the matching process between the specifications for OGC web services (WMS, WFS and WCS) and the specifications for metadata required by ISO 19115 - adopted as a reference for several national metadata profiles, including the Brazilian one. This process focuses on retrieving metadata about the identification and data quality packages, as well as indicating the directions to retrieve metadata related to other packages. Therefore, users are able to assess whether the provided contents fit their purposes.
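
    A minimal sketch of this kind of metadata retrieval is shown below: it requests a WMS GetCapabilities document and pulls out a few service-level fields that roughly map to ISO 19115 identification metadata. The endpoint URL is a hypothetical placeholder, and the snippet is illustrative only, not the matching process described in the paper.

```python
# Illustrative sketch: retrieving identification metadata from an OGC WMS by
# parsing its GetCapabilities response. The endpoint URL is a placeholder; any
# OGC-compliant WMS should answer this request.
import requests
import xml.etree.ElementTree as ET

WMS_ENDPOINT = "https://example.org/geoserver/wms"  # hypothetical service

params = {"service": "WMS", "request": "GetCapabilities", "version": "1.3.0"}
response = requests.get(WMS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()

root = ET.fromstring(response.content)
ns = {"wms": "http://www.opengis.net/wms"}

# Service-level metadata roughly mapping to ISO 19115 identification fields.
title = root.findtext("wms:Service/wms:Title", namespaces=ns)
abstract = root.findtext("wms:Service/wms:Abstract", namespaces=ns)
layers = [el.text for el in root.findall(".//wms:Layer/wms:Name", ns)]

print("Title:", title)
print("Abstract:", abstract)
print("Layers offered:", layers)
```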

  2. A cross-sectional ecological analysis of international and sub-national health inequalities in commercial geospatial resource availability.

    Science.gov (United States)

    Dotse-Gborgbortsi, Winfred; Wardrop, Nicola; Adewole, Ademola; Thomas, Mair L H; Wright, Jim

    2018-05-23

    Commercial geospatial data resources are frequently used to understand healthcare utilisation. Although there is widespread evidence of a digital divide for other digital resources and infrastructure, it is unclear how commercial geospatial data resources are distributed relative to health need. To examine the distribution of commercial geospatial data resources relative to health needs, we assembled coverage and quality metrics for commercial geocoding, neighbourhood characterisation, and travel time calculation resources for 183 countries. We developed a country-level, composite index of commercial geospatial data quality/availability and examined its distribution relative to age-standardised all-cause and cause-specific (for three main causes of death) mortality using two inequality metrics, the slope index of inequality and the relative concentration index. In two sub-national case studies, we also examined geocoding success rates versus area deprivation by district in Eastern Region, Ghana and Lagos State, Nigeria. Internationally, commercial geospatial data resources were inversely related to all-cause mortality. This relationship was more pronounced when examining mortality due to communicable diseases. Commercial geospatial data resources for calculating patient travel times were more equitably distributed relative to health need than resources for characterising neighbourhoods or geocoding patient addresses. Countries such as South Africa have comparatively high commercial geospatial data availability despite high mortality, whilst countries such as South Korea have comparatively low data availability and low mortality. Sub-nationally, evidence was mixed as to whether geocoding success was lowest in more deprived districts. To our knowledge, this is the first global analysis of commercial geospatial data resources in relation to health outcomes. In countries such as South Africa where there is high mortality but also comparatively rich commercial geospatial
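
    For readers unfamiliar with the relative concentration index used here, the sketch below computes a generic fractional-rank (covariance) form of the index over made-up country data. It is a standard textbook formulation, not necessarily the authors' exact specification.

```python
# Generic sketch of a relative concentration index using the fractional-rank
# (covariance) formulation. This is a textbook form and not necessarily the
# exact specification used by the authors; the data are made up.
import numpy as np

def concentration_index(outcome, ranking_variable):
    """Concentration of `outcome` across units ranked by `ranking_variable`."""
    order = np.argsort(ranking_variable)          # rank units, e.g. by mortality
    y = np.asarray(outcome, dtype=float)[order]
    n = y.size
    fractional_rank = (np.arange(1, n + 1) - 0.5) / n
    mu = y.mean()
    return 2.0 * np.cov(y, fractional_rank, bias=True)[0, 1] / mu

# Hypothetical example: data-availability index for five countries ranked by
# age-standardised mortality (low to high).
mortality = [450, 600, 720, 900, 1100]
data_index = [0.9, 0.7, 0.6, 0.4, 0.2]
# Negative value: data resources concentrated in low-mortality countries.
print(round(concentration_index(data_index, mortality), 3))
```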

  3. Research and Practical Trends in Geospatial Sciences

    Science.gov (United States)

    Karpik, A. P.; Musikhin, I. A.

    2016-06-01

    In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and rapid evolution of geospatial science. Almost all professional communities have been affected. Communities are embracing digital techniques, modern equipment, software and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of those trends that will be in great demand in 3-7 years. This paper reviews current scientific and practical activities of such non-governmental international organizations as International Federation of Surveyors, International Cartographic Association, and International Society for Photogrammetry and Remote Sensing, analyzes and groups most relevant topics brought up at their scientific events, forecasts most probable research and practical trends in geospatial sciences, outlines topmost leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation and joint implementation projects.

  4. Assessing UAV platform types and optical sensor specifications

    Science.gov (United States)

    Altena, B.; Goedemé, T.

    2014-05-01

    Photogrammetric acquisition with unmanned aerial vehicles (UAV) has grown extensively over the last couple of years. Such mobile platforms and their processing software have matured, resulting in a market which offers off-the-shelf mapping solutions to surveying companies and geospatial enterprises. Different approaches in platform type and optical instruments exist, though their resulting products have similar specifications. To demonstrate differences in acquisition practice, a case study over an open mine was flown with two different off-the-shelf UAVs (a fixed-wing and a multi-rotor). The resulting imagery is analyzed to clarify the differences in collection quality. We look at image settings and stress the importance of photographic experience if manual settings are applied. For mapping production it might be safest to set the camera on automatic. Furthermore, we try to estimate whether blur is present due to image motion. A subtle trend seems to be present for the fast-flying platform, though its extent is of a similar order to that of the slow-moving one. This shows that both systems operate at their limits. Finally, the lens distortion is assessed, with special attention to chromatic aberration. Here we see that through calibration such aberrations could be present; however, detecting this phenomenon directly on the imagery is not straightforward. For such effects a normal lens is sufficient, though a better lens and collimator do give significant improvement.

  5. Implementation of Vgi-Based Geoportal for Empowering Citizen's Geospatial Observatories Related to Urban Disaster Management

    Science.gov (United States)

    Lee, Sanghoon

    2016-06-01

    Volunteered geospatial information (VGI) can be an efficient and cost-effective method for generating and sharing large volumes of disaster-related geospatial data. National mapping organizations, which have played the role of major geospatial data collectors, have been moving toward considering public-participation data collection methods. Because VGI can encourage public participation and empower citizens, mapping agencies could form partnerships with members of the VGI community to help provide well-structured geospatial data. In order for the semantics, datasets and action model of the public-participation GeoPortal to be effectively understood and shared, the implemented VGI-GeoPortal was designed on the basis of ISO 19154, ISO 19101 and the OGC Reference Model. A proof of concept of the VGI-GeoPortal has been implemented for an urban-flooding use case in the Republic of Korea to collect from the public, and analyze, disaster-related geospatial data, including high-disaster-potential information such as the locations of poorly draining sewers, early signs of landslides, the flooding vulnerability of urban structures, and so on.

  6. Streamlining geospatial metadata in the Semantic Web

    Science.gov (United States)

    Fugazza, Cristiano; Pepe, Monica; Oggioni, Alessandro; Tagliolato, Paolo; Carrara, Paola

    2016-04-01

    In the geospatial realm, data annotation and discovery rely on a number of ad-hoc formats and protocols. These have been created to enable domain-specific use cases for which generalized search is not feasible. Metadata are at the heart of the discovery process; nevertheless, they are often neglected or encoded in formats that either are not aimed at efficient retrieval of resources or are plainly outdated. In particular, the quantum leap represented by the Linked Open Data (LOD) movement has not so far induced a consistent, interlinked baseline in the geospatial domain. In a nutshell, datasets, the scientific literature related to them, and ultimately the researchers behind these products are only loosely connected; the corresponding metadata are intelligible only to humans, duplicated on different systems, and seldom consistent. Instead, our workflow for metadata management envisages i) editing via customizable web-based forms, ii) encoding of records in any XML application profile, iii) translation into RDF (involving the semantic lift of metadata records), and finally iv) storage of the metadata as RDF and back-translation into the original XML format with added semantics-aware features. Phase iii) hinges on relating resource metadata to RDF data structures that represent keywords from code lists and controlled vocabularies, toponyms, researchers, institutes, and virtually any description one can retrieve (or directly publish) in the LOD Cloud. In the context of a distributed Spatial Data Infrastructure (SDI) built on free and open-source software, we detail phases iii) and iv) of our workflow for the semantics-aware management of geospatial metadata.
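
    The sketch below illustrates the idea behind the semantic-lift step (phase iii): a few fields from an XML-derived metadata record are expressed as RDF triples pointing at LOD resources. It is an assumption-laden illustration using rdflib, not the authors' implementation; the record fields, URIs and vocabularies are placeholders.

```python
# Illustrative sketch (not the authors' implementation) of a "semantic lift":
# turning a few fields of an XML metadata record into RDF triples linked to
# LOD vocabularies. Assumes rdflib is installed; the record is hypothetical.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

DCAT = Namespace("http://www.w3.org/ns/dcat#")

record = {  # fields as they might be extracted from an ISO/XML profile
    "identifier": "https://example.org/dataset/lake-temperature",
    "title": "Lake surface temperature, 2010-2015",
    "theme_uri": "http://vocab.example.org/keywords/limnology",   # code-list term
    "creator_uri": "https://orcid.org/0000-0000-0000-0000",       # researcher
}

g = Graph()
dataset = URIRef(record["identifier"])
g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal(record["title"], lang="en")))
g.add((dataset, DCAT.theme, URIRef(record["theme_uri"])))
g.add((dataset, DCTERMS.creator, URIRef(record["creator_uri"])))

print(g.serialize(format="turtle"))
```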

  7. Development of Geospatial Map Based Election Portal

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.

    2014-11-01

    Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide geospatial information on the National Capital Territory of Delhi (NCTD) to the Government of the National Capital Territory of Delhi (GNCTD) and its organs, such as the DDA, MCD, DJB, State Election Department, DMRC, etc., for the benefit of all citizens of the NCTD. This paper describes the development of the Geospatial Map based Election Portal (GMEP) of the NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for planning and management by the Department of the Chief Electoral Officer, and as an election-related information search tool (polling stations, Assembly and Parliamentary constituencies, etc.) for the citizens of the NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end in a Microsoft Windows environment. The GMEP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMEP include delimited precinct boundaries of voter areas of polling stations, Assembly constituencies, Parliamentary constituencies, election districts, and landmark locations of polling stations and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMEP helps achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for the management of elections. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  8. AGWA: The Automated Geospatial Watershed Assessment Tool

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment Tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  9. Portable, universal, and visual ion sensing platform based on the light emitting diode-based self-referencing-ion selective field-effect transistor.

    Science.gov (United States)

    Zhang, Xiaowei; Han, Yanchao; Li, Jing; Zhang, Libing; Jia, Xiaofang; Wang, Erkang

    2014-02-04

    In this work, a novel and universal ion sensing platform was presented, which enables the visual detection of various ions with high sensitivity and selectivity. Coaxial potential signals (millivolt-scale) of the sample from the self-referencing (SR) ion-selective chip can be fed into the AD620-based amplifier, producing volt-scale output potentials. The amplified voltage is high enough to drive a light emitting diode (LED), which can be used as an amplifier and indicator to report the sample information. With this double-amplification device (light emitting diode-based self-referencing ion-selective field-effect transistor, LED-SR-ISFET), a tiny change in the sample concentration can be observed as a distinguishable variation of LED brightness by visual inspection. This LED-based luminescent platform provided a facile, low-cost, and rapid sensing strategy without the need for additional expensive chemiluminescence reagents and instruments. Moreover, the SR mode also endows this device with excellent stability and reliability. With this innovative design, sensitive determination of K(+), H(+), and Cl(-) by the naked eye was achieved. It should also be noted that this sensing strategy can easily be extended to other ions (or molecules) by simply integrating the corresponding ion (or molecule) selective electrode.
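
    To make the amplification requirement concrete, the back-of-the-envelope sketch below computes the ideal Nernstian response of an ion-selective electrode and the gain needed to reach a typical LED forward voltage. The numbers are illustrative assumptions, not measurements from the paper.

```python
# Back-of-the-envelope sketch of why a millivolt-scale ion-selective signal
# needs amplification before it can drive an LED. Numbers are illustrative and
# not taken from the paper.
import math

R, F, T = 8.314, 96485.0, 298.15   # gas constant, Faraday constant, 25 degC
z = 1                              # charge of a monovalent ion such as K+

# Ideal Nernstian slope: potential change per tenfold change in ion activity.
slope_mV_per_decade = 1000 * (R * T / (z * F)) * math.log(10)
print(f"Nernstian slope: {slope_mV_per_decade:.1f} mV/decade")   # ~59.2 mV

# A one-decade concentration change therefore yields only ~59 mV, far below a
# typical LED forward voltage (~2 V), so a gain of a few tens is required.
led_forward_voltage_mV = 2000
required_gain = led_forward_voltage_mV / slope_mV_per_decade
print(f"Approximate gain needed: {required_gain:.0f}x")
```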

  10. Users Manual for the Geospatial Stream Flow Model (GeoSFM)

    Science.gov (United States)

    Artan, Guleid A.; Asante, Kwabena; Smith, Jodie; Pervez, Md Shahriar; Entenmann, Debbie; Verdin, James P.; Rowland, James

    2008-01-01

    The monitoring of wide-area hydrologic events requires the manipulation of large amounts of geospatial and time series data into concise information products that characterize the location and magnitude of the event. To perform these manipulations, scientists at the U.S. Geological Survey Center for Earth Resources Observation and Science (EROS), with the cooperation of the U.S. Agency for International Development, Office of Foreign Disaster Assistance (USAID/OFDA), have implemented a hydrologic modeling system. The system includes a data assimilation component to generate data for a Geospatial Stream Flow Model (GeoSFM) that can be run operationally to identify and map wide-area streamflow anomalies. GeoSFM integrates a geographical information system (GIS) for geospatial preprocessing and postprocessing tasks and hydrologic modeling routines implemented as dynamically linked libraries (DLLs) for time series manipulations. Model results include maps depicting the status of streamflow and soil water conditions. This Users Manual provides step-by-step instructions for running the model and for downloading and processing the input data required for initial model parameterization and daily operation.
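
    As a rough illustration of what mapping streamflow anomalies can mean in practice, the sketch below flags days whose flow departs from a day-of-year climatology by more than two standard deviations. It uses synthetic data and is not GeoSFM's actual anomaly algorithm.

```python
# Illustrative sketch (not GeoSFM's internal method) of flagging streamflow
# anomalies by comparing daily flows against a day-of-year climatology.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_days = 20, 365
# Synthetic daily streamflow for one basin: seasonal cycle plus noise.
doy = np.arange(n_days)
climatology_truth = 50 + 30 * np.sin(2 * np.pi * doy / 365)
flows = climatology_truth + rng.normal(0, 8, size=(n_years, n_days))

mean = flows.mean(axis=0)            # per-day climatological mean
std = flows.std(axis=0)              # per-day climatological spread

# Flag days in the current year whose standardized anomaly exceeds 2 sigma.
current_year = climatology_truth + rng.normal(0, 8, size=n_days)
current_year[150:160] += 40          # inject a synthetic high-flow event
z = (current_year - mean) / std
print("Anomalous days of year:", np.where(np.abs(z) > 2)[0])
```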

  11. BrainBrowser: distributed, web-based neurological data visualization

    Directory of Open Access Journals (Sweden)

    Tarek eSherif

    2015-01-01

    Full Text Available Recent years have seen massive, distributed datasets become the norm in neuroimaging research, and the methodologies used to analyze them have, in response, become more collaborative and exploratory. Tools and infrastructure are continuously being developed and deployed to facilitate research in this context: grid computation platforms to process the data, distributed data stores to house and share them, high-speed networks to move them around, and collaborative, often web-based, platforms to provide access to and sometimes manage the entire system. BrainBrowser is a lightweight, high-performance JavaScript visualization library built to provide easy-to-use, powerful, on-demand visualization of remote datasets in this new research environment. BrainBrowser leverages modern Web technologies, such as WebGL, HTML5 and Web Workers, to visualize 3D surface and volumetric neuroimaging data in any modern web browser without requiring any browser plugins. It is thus trivial to integrate BrainBrowser into any web-based platform. BrainBrowser is simple enough to produce a basic web-based visualization in a few lines of code, while at the same time being robust enough to create full-featured visualization applications. BrainBrowser can dynamically load the data required for a given visualization, so no network bandwidth needs to be wasted on data that will not be used. BrainBrowser's integration into the standardized web platform also allows users to consider using 3D data visualization in novel ways, such as for data distribution, data sharing and dynamic online publications. BrainBrowser is already being used in two major online platforms, CBRAIN and LORIS, and has been used to make the 1 TB MACACC dataset openly accessible.

  12. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    Directory of Open Access Journals (Sweden)

    Pirouz Nourian

    2018-03-01

    Full Text Available This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential and applicability in urban planning and urban data analytics. This review is not only based on the technical factors such as capabilities of the programming languages but also the ease of developing and sharing complex data processing workflows. The arena of web-based computing platforms is currently under rapid development and is too volatile to be predictable; therefore, in this article we focus on the specification of the requirements and potentials from an urban planning point of view rather than speculating about the fate of computing platforms or programming languages. The article presents a list of promising computing technologies, a technical specification of the essential data models and operators for geo-spatial data processing, and mathematical models for an ideal urban computing platform.

  13. A study on state of Geospatial courses in Indian Universities

    Science.gov (United States)

    Shekhar, S.

    2014-12-01

    Today the world is dominated by three technologies: nanotechnology, biotechnology and geospatial technology. This creates huge demand for experts in the respective fields, both for disseminating knowledge and for innovative research. Therefore, the prime need is to train the existing fraternity to gain progressive knowledge in these technologies and impart the same to the student community. Geospatial technology faces more peculiar problems than the other two technologies because of its interdisciplinary, multi-disciplinary nature. It attracts students and mid-career professionals from various disciplines including Physics, Computer Science, Engineering, Geography, Geology, Agriculture, Forestry, Town Planning and so on. Hence there is always competition to grab and stabilize their position. Students of Master's degree programmes in geospatial science face two types of problems. The first is the lack of a unique identity in the academic field: they are neither exempted from the National Eligibility Test for Lectureship nor given an opportunity to take the exam in geospatial science. The second is differential treatment by the industrial world: students are either given low-grade jobs or poorly paid for their work. Thus, there are serious questions about the future of this course in the universities and its recognition in the academic and industrial worlds. Universities should make this course more job-oriented in consultation with industry, and industry should come forward to share its demands and requirements with the universities, so that the necessary changes in the curriculum can be made to meet industrial requirements.

  14. B-CAN: a resource sharing platform to improve the operation, visualization and integrated analysis of TCGA breast cancer data.

    Science.gov (United States)

    Wen, Can-Hong; Ou, Shao-Min; Guo, Xiao-Bo; Liu, Chen-Feng; Shen, Yan-Bo; You, Na; Cai, Wei-Hong; Shen, Wen-Jun; Wang, Xue-Qin; Tan, Hai-Zhu

    2017-12-12

    Breast cancer is a high-risk heterogeneous disease with myriad subtypes and complicated biological features. The Cancer Genome Atlas (TCGA) breast cancer database provides researchers with large-scale genome and clinical data via web portals and FTP services. Researchers are able to gain new insights into their related fields and evaluate experimental discoveries with TCGA. However, TCGA is difficult to access and operate on for researchers who have little experience with databases and bioinformatics, because of its complex data format and diverse files. For ease of use, we built the breast cancer (B-CAN) platform, which enables data customization, data visualization, and a private data center. The B-CAN platform runs on an Apache server and interacts with the backend MySQL database via PHP. Users can customize data based on their needs by combining tables from the original TCGA database and selecting variables from each table. The private data center is applicable to private data and two types of customized data. A key feature of B-CAN is that it provides single-table and multiple-table displays. Customized data with one barcode corresponding to many records, as well as processed customized data, are allowed in the multiple-table display. B-CAN is an intuitive and highly efficient data-sharing platform.

  15. RESEARCH AND PRACTICAL TRENDS IN GEOSPATIAL SCIENCES

    Directory of Open Access Journals (Sweden)

    A. P. Karpik

    2016-06-01

    Full Text Available In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and rapid evolution of geospatial science. Almost all professional communities have been affected. Communities are embracing digital techniques, modern equipment, software and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of those trends that will be in great demand in 3-7 years. This paper reviews current scientific and practical activities of such non-governmental international organizations as International Federation of Surveyors, International Cartographic Association, and International Society for Photogrammetry and Remote Sensing, analyzes and groups most relevant topics brought up at their scientific events, forecasts most probable research and practical trends in geospatial sciences, outlines topmost leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation and joint implementation projects.

  16. The COMET Sleep Research Platform.

    Science.gov (United States)

    Nichols, Deborah A; DeSalvo, Steven; Miller, Richard A; Jónsson, Darrell; Griffin, Kara S; Hyde, Pamela R; Walsh, James K; Kushida, Clete A

    2014-01-01

    The Comparative Outcomes Management with Electronic Data Technology (COMET) platform is extensible and designed for facilitating multicenter electronic clinical research. Our research goals were the following: (1) to conduct a comparative effectiveness trial (CET) for two obstructive sleep apnea treatments-positive airway pressure versus oral appliance therapy; and (2) to establish a new electronic network infrastructure that would support this study and other clinical research studies. The COMET platform was created to satisfy the needs of CET with a focus on creating a platform that provides comprehensive toolsets, multisite collaboration, and end-to-end data management. The platform also provides medical researchers the ability to visualize and interpret data using business intelligence (BI) tools. COMET is a research platform that is scalable and extensible, and which, in a future version, can accommodate big data sets and enable efficient and effective research across multiple studies and medical specialties. The COMET platform components were designed for an eventual move to a cloud computing infrastructure that enhances sustainability, overall cost effectiveness, and return on investment.

  17. Geospatial technology perspectives for mining vis-a-vis sustainable forest ecosystems

    Directory of Open Access Journals (Sweden)

    Goparaju Laxmi

    2017-06-01

    Full Text Available Forests, the backbone of biogeochemical cycles and life supporting systems, are under severe pressure due to varied anthropogenic activities. Mining activities are among the major causes of forest destruction, threatening the survival and sustainability of the flora and fauna existing in those areas. Thus, monitoring and managing the impact of mining activities on natural resources at regular intervals is necessary to assess the status of their depletion and to take up restoration and conservation measures. Geospatial technology provides means to identify the impact of different mining operations on forest ecosystems and helps in proposing initiatives for safeguarding the forest environment. In this context, the present study highlights the problems related to mining in forest ecosystems and elucidates how geospatial technology can be employed at various stages of mining activities to achieve a sustainable forest ecosystem. The study collates information from various sources and highlights the role of geospatial technology in mining industries and the reclamation process.

  18. Public-private collaboration in spatial data infrastructure: Overview of exposure, acceptance and sharing platform in Malaysia

    Science.gov (United States)

    Othman, Raha binti; Bakar, Muhamad Shahbani Abu; Mahamud, Ku Ruhana Ku

    2017-10-01

    While a Spatial Data Infrastructure (SDI) has been established in Malaysia, its full potential has yet to be realized. To a large degree, geospatial industry users are hopeful that they can easily get access to the system and start utilizing the data. Some users expect the SDI to provide them with readily available data without the necessary steps of requesting the data from the data providers, or the steps of processing and preparing the data for their use. Some further argue that the usability of the system could be improved if an appropriate combination of data sharing and focused applications were found within the services. In order to address the current challenges and to enhance the effectiveness of the SDI in Malaysia, there is the possibility of establishing a collaborative business venture between public and private entities, which can help address these issues and expectations. In this paper, we discuss the possibility of collaboration between these two types of entities. Interviews with seven entities were held to collect information on exposure, acceptance and the sharing platform. The outcomes indicate that although the growth of GIS technology and the high level of technology acceptance provide a solid basis for utilizing geospatial data, the absence of a concrete policy on data sharing, of quality geospatial data, and of an authoritative coordinating agency leaves a vacuum for the successful implementation of the SDI initiative.

  19. Is a visual worth more than a thousand words? An investigation into brand engagement and social shopping on visual social media

    OpenAIRE

    Bennett, DR; Kunze, C

    2016-01-01

    Social media networking is now the most popular online activity worldwide and a large part of social media interaction involves sharing visual content on platforms such as Instagram and Pinterest. The phenomenal growth in numbers of users makes visual platforms enticing for marketers who spend ever-increasing time, effort and money on social media in the hope of generating consumer engagement for their brands. This research investigates customer behaviour for the largest visual social media p...

  20. Solar Maps | Geospatial Data Science | NREL

    Science.gov (United States)

    These solar maps provide average daily total solar resource information, including U.S. state solar resource maps and national U.S. solar resource maps.

  1. Regulating outdoor advertisement boards; employing spatial decision support system to control urban visual pollution

    Science.gov (United States)

    Wakil, K.; Hussnain, MQ; Tahir, A.; Naeem, M. A.

    2016-06-01

    Unmanaged placement, size, location, structure and contents of outdoor advertisement boards have resulted in severe urban visual pollution and deterioration of the socio-physical living environment in urban centres of Pakistan. As per the regulatory instruments, the approval decision for a new advertisement installation is supposed to be based on the locational density of existing boards and their proximity or remoteness to certain land uses. In cities where regulatory tools for the control of advertisement boards exist, responsible authorities are handicapped in effective implementation due to the absence of geospatial analysis capacity. This study presents the development of a spatial decision support system (SDSS) for the regularization of advertisement boards in terms of their location and placement. The knowledge module of the proposed SDSS is based on provisions and restrictions prescribed in regulatory documents, while the user interface allows visualization and scenario evaluation to determine whether a new board would affect the existing linear density on a particular road and whether it would violate any buffer restrictions around a particular land use. Technically, the proposed SDSS is structured as a web-based solution which includes open geospatial tools such as OpenGeo Suite, GeoExt, PostgreSQL, and PHP. It uses three key data sets, including the road network, locations of existing billboards and building parcels with land-use information, to perform the analysis. Locational suitability has been calculated using pairwise comparison through the analytical hierarchy process (AHP) and weighted linear combination (WLC). Our results indicate that open geospatial tools can be helpful in developing an SDSS which can assist in solving space-related iterative decision challenges concerning outdoor advertisements. Employing such a system will result in effective implementation of regulations, leading to visual harmony and aesthetic improvement in urban communities.
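
    The pairwise comparison (AHP) and weighted linear combination (WLC) steps mentioned above can be sketched as follows. The comparison matrix, criteria and scores are invented for illustration and do not reproduce the study's actual model.

```python
# Illustrative sketch (not the authors' exact model) of deriving criterion
# weights from a pairwise comparison matrix via AHP (principal eigenvector)
# and combining criterion scores with a weighted linear combination (WLC).
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix for criteria:
# [board density on road, proximity to restricted land use, road class]
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigenvalues, eigenvectors = np.linalg.eig(A)
principal = eigenvectors[:, np.argmax(eigenvalues.real)].real
weights = principal / principal.sum()
print("AHP weights:", np.round(weights, 3))

# WLC suitability for one candidate billboard location, with criterion scores
# normalized to [0, 1] (1 = most suitable). Values are made up.
scores = np.array([0.4, 0.9, 0.7])
suitability = float(weights @ scores)
print("Location suitability:", round(suitability, 3))
```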

  2. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    Directory of Open Access Journals (Sweden)

    Shaoming Pan

    Full Text Available Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.

  3. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    Science.gov (United States)

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
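
    A toy sketch of the two ideas at the core of this approach, building an access correlation matrix from a historical log and spreading strongly correlated files across storage nodes, is given below. The log, file names and two-node setup are invented; this is not the paper's heuristic.

```python
# Illustrative sketch: build an access correlation matrix from a historical
# access log, then heuristically place strongly correlated image files on
# different storage nodes so they can be fetched in parallel.
from collections import defaultdict
from itertools import combinations

access_log = [            # hypothetical sessions, each a set of requested tiles
    {"tileA", "tileB"},
    {"tileA", "tileB", "tileC"},
    {"tileC", "tileD"},
    {"tileA", "tileD"},
]

# Access correlation: how often two files are requested within the same session.
correlation = defaultdict(int)
for session in access_log:
    for a, b in combinations(sorted(session), 2):
        correlation[(a, b)] += 1

# Greedy placement: assign each file to the node with the least accumulated
# correlation to files already stored there (so correlated files end up apart).
nodes = {0: set(), 1: set()}
files = sorted({f for s in access_log for f in s})
for f in files:
    def load(node_files):
        return sum(correlation.get(tuple(sorted((f, g))), 0) for g in node_files)
    best = min(nodes, key=lambda n: (load(nodes[n]), len(nodes[n])))
    nodes[best].add(f)

print(dict(correlation))
print(nodes)
```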

  4. Remote Sensing Technologies and Geospatial Modelling Hierarchy for Smart City Support

    Science.gov (United States)

    Popov, M.; Fedorovsky, O.; Stankevich, S.; Filipovich, V.; Khyzhniak, A.; Piestova, I.; Lubskyi, M.; Svideniuk, M.

    2017-12-01

    An approach to implementing remote sensing technologies and geospatial modelling for smart city support is presented. The hierarchical structure and basic components of the smart city information support subsystem are considered. Some already available, useful practical developments are described, including city land-use planning, urban vegetation analysis, thermal condition forecasting, geohazard detection, and flood risk assessment. A remote sensing data fusion approach for comprehensive geospatial analysis is discussed. Long-term city development forecasting with the Forrester-Graham system dynamics model is provided for the Kiev urban area.

  5. Architecture of a Process Broker for Interoperable Geospatial Modeling on the Web

    Directory of Open Access Journals (Sweden)

    Lorenzo Bigagli

    2015-04-01

    Full Text Available The identification of appropriate mechanisms for process sharing and reuse by means of composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. Modelers in need of running complex workflows may benefit from outsourcing process composition to a dedicated external service, according to the brokering approach. This work introduces our architecture of a process broker, as a distributed information system for creating, validating, editing, storing, publishing and executing geospatial-modeling workflows. The broker provides a service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of interoperable, executable workflows. The described solution has been experimentally applied in several use scenarios in the context of EU-funded projects and the Global Earth Observation System of Systems.

  6. IMPLEMENTATION OF VGI-BASED GEOPORTAL FOR EMPOWERING CITIZEN’S GEOSPATIAL OBSERVATORIES RELATED TO URBAN DISASTER MANAGEMENT

    Directory of Open Access Journals (Sweden)

    S. Lee

    2016-06-01

    Full Text Available Volunteered geospatial information (VGI) can be an efficient and cost-effective method for generating and sharing large volumes of disaster-related geospatial data. National mapping organizations, which have played the role of major geospatial data collectors, have been moving toward considering public-participation data collection methods. Because VGI can encourage public participation and empower citizens, mapping agencies could form partnerships with members of the VGI community to help provide well-structured geospatial data. In order for the semantics, datasets and action model of the public-participation GeoPortal to be effectively understood and shared, the implemented VGI-GeoPortal was designed on the basis of ISO 19154, ISO 19101 and the OGC Reference Model. A proof of concept of the VGI-GeoPortal has been implemented for an urban-flooding use case in the Republic of Korea to collect from the public, and analyze, disaster-related geospatial data, including high-disaster-potential information such as the locations of poorly draining sewers, early signs of landslides, the flooding vulnerability of urban structures, and so on.

  7. Transportation of Large Wind Components: A Review of Existing Geospatial Data

    Energy Technology Data Exchange (ETDEWEB)

    Mooney, Meghan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maclaurin, Galen [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    This report features the geospatial data component of a larger project evaluating logistical and infrastructure requirements for transporting oversized and overweight (OSOW) wind components. The goal of the larger project was to assess the status and opportunities for improving the infrastructure and regulatory practices necessary to transport wind turbine towers, blades, and nacelles from current and potential manufacturing facilities to end-use markets. The purpose of this report is to summarize existing geospatial data on wind component transportation infrastructure and to provide a data gap analysis, identifying areas for further analysis and data collection.

  8. INTEGRATING GEOSPATIAL TECHNOLOGIES AND SECONDARY STUDENT PROJECTS: THE GEOSPATIAL SEMESTER

    Directory of Open Access Journals (Sweden)

    Bob Kolvoord

    2012-12-01

    Full Text Available Abstract (translated from the Spanish): The Geospatial Semester is a geographical education activity focused on having students in their final year of secondary school in U.S. high schools acquire specific competencies and skills in geographic information systems, GPS and remote sensing. Through a project-based learning methodology, students are motivated and engaged in carrying out research projects in which they analyze, and even propose solutions to, various processes, problems or questions of a spatial nature. The project is coordinated by James Madison University and has been running for seven years in high schools across the State of Virginia, involving more than 20 schools and 1,500 students. Students who pass this secondary-school subject are granted credit for certain academic courses at the partner university. Keywords: geographic information systems, teaching, geography education, geospatial semester. Abstract: The Geospatial Semester is a geographical education activity focused on students in their final year of secondary school in the U.S. acquiring specific skills in GIS, GPS and remote sensing. Through a project-based learning methodology, students are motivated and involved in conducting research using geographic information systems in which they analyze, and even propose solutions to, different processes, problems or issues of a spatial nature. The Geospatial Semester's university management not only ensures proper coaching, guidance and GIS training for the teachers of the participating schools, but has also established a system whereby students who pass this secondary-school course gain recognition of certain credits from the University. Key words: geographic information system, teaching, geographic education, geospatial semester. Résumé (translated from the French, truncated): The Geospatial Semester is an activity focused on the geographical education of students in their final

  9. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Full Text Available Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only due to the massive data volume but also due to the intrinsic complexity and high dimensions of the geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies were carried out to investigate adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce the computing resource utilization (by 80% in our example) while delivering similar performance as a full-powered cluster; and (2) effectively handle the spike processing workload by automatically increasing the computing resources to ensure the processing is finished within an acceptable time. Such an auto-scaling approach provides a valuable reference to optimize the performance of geospatial applications to address data- and computational-intensity challenges in GIScience in a more cost-efficient manner.
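
    The core scaling decision the abstract describes can be caricatured as a simple feedback rule: keep the pending geoprocessing workload per node near a target. The sketch below is such a caricature with invented thresholds; it is not the framework's actual algorithm and omits the cloud API calls that would actually add or remove Hadoop nodes.

```python
# Illustrative sketch (not the paper's framework) of an auto-scaling rule:
# grow or shrink the cluster so that the pending geoprocessing workload per
# node stays near a target. Thresholds and limits are hypothetical.
def decide_cluster_size(pending_tasks, current_nodes,
                        target_tasks_per_node=10,
                        min_nodes=2, max_nodes=32):
    """Return the desired number of worker nodes for the current workload."""
    desired = max(1, round(pending_tasks / target_tasks_per_node))
    return max(min_nodes, min(max_nodes, desired))

# Simulated workload spike and decay.
workload = [20, 20, 300, 300, 120, 40, 15]
nodes = 2
for pending in workload:
    nodes = decide_cluster_size(pending, nodes)
    print(f"pending={pending:4d} -> scale cluster to {nodes} nodes")
```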

  10. An Automated End-to-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services

    Science.gov (United States)

    Shah, M.; Verma, Y.; Nandakumar, R.

    2012-07-01

    Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduced business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain; geospatial Web services address this problem. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive and involve diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to the service requester based on QoS.
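
    A quality-driven selection step of the kind the paper automates can be sketched as a weighted scoring of candidate services over normalized QoS attributes. The services, attribute values and weights below are invented placeholders, and the snippet ignores the multi-agent negotiation described in the paper.

```python
# Illustrative sketch (not the paper's multi-agent architecture) of a
# quality-driven selection step: normalize each QoS attribute, apply user
# weights, and pick the best-scoring candidate service. Values are made up.
candidates = {
    # service name: (response time in ms, availability %, cost per 1000 calls)
    "wms-provider-a": (850.0, 99.2, 1.20),
    "wms-provider-b": (400.0, 97.5, 2.50),
    "wms-provider-c": (600.0, 99.9, 1.80),
}
weights = {"response_time": 0.5, "availability": 0.3, "cost": 0.2}

def normalize(values, lower_is_better):
    """Scale attribute values to [0, 1], where 1 is best."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [((hi - v) if lower_is_better else (v - lo)) / span for v in values]

names = list(candidates)
rt, av, cost = zip(*candidates.values())
scores = {}
for name, n_rt, n_av, n_cost in zip(
    names, normalize(rt, True), normalize(av, False), normalize(cost, True)
):
    scores[name] = (
        weights["response_time"] * n_rt
        + weights["availability"] * n_av
        + weights["cost"] * n_cost
    )

best = max(scores, key=scores.get)
print(scores, "->", best)
```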

  11. SPATIOTEMPORAL VISUALIZATION OF TIME-SERIES SATELLITE-DERIVED CO2 FLUX DATA USING VOLUME RENDERING AND GPU-BASED INTERPOLATION ON A CLOUD-DRIVEN DIGITAL EARTH

    Directory of Open Access Journals (Sweden)

    S. Wu

    2017-10-01

    Full Text Available The ocean carbon cycle has a significant influence on global climate, and is commonly evaluated using time-series satellite-derived CO2 flux data. Location-aware and globe-based visualization is an important technique for analyzing and presenting the evolution of climate change. To achieve realistic simulation of the spatiotemporal dynamics of ocean carbon, a cloud-driven digital earth platform is developed to support the interactive analysis and display of multi-geospatial data, and an original visualization method based on our digital earth is proposed to demonstrate the spatiotemporal variations of carbon sinks and sources using time-series satellite data. Specifically, a volume rendering technique using half-angle slicing and a particle system is implemented to dynamically display the released or absorbed CO2 gas. To enable location-aware visualization within the virtual globe, we present a 3D particle-mapping algorithm to render particle-slicing textures onto geospace. In addition, a GPU-based interpolation framework using CUDA during real-time rendering is designed to obtain smooth effects in both spatial and temporal dimensions. To demonstrate the capabilities of the proposed method, a series of satellite data is applied to simulate the air-sea carbon cycle in the China Sea. The results show that the suggested strategies provide realistic simulation effects and acceptable interactive performance on the digital earth.
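
    The temporal part of that interpolation amounts to blending successive satellite time steps; the NumPy sketch below shows only this arithmetic on a synthetic grid, standing in for the CUDA kernels the paper runs on the GPU. Grid size, values and units are assumptions.

```python
# Illustrative sketch of the temporal-interpolation idea: given CO2 flux grids
# at two satellite time steps, render intermediate frames by linear blending.
# The paper does this on the GPU with CUDA; this NumPy version only shows the
# arithmetic, with a synthetic 4x4 grid standing in for real flux data.
import numpy as np

flux_t0 = np.random.default_rng(1).normal(0.0, 1.0, size=(4, 4))  # made-up units
flux_t1 = flux_t0 + 0.5                                           # next time step

def interpolate(frame_fraction):
    """Blend the two time steps; frame_fraction in [0, 1]."""
    return (1.0 - frame_fraction) * flux_t0 + frame_fraction * flux_t1

for alpha in (0.0, 0.25, 0.5, 1.0):
    frame = interpolate(alpha)
    print(f"alpha={alpha:0.2f}, mean flux={frame.mean():+.3f}")
```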

  12. A 3D City Model with Dynamic Behaviour Based on Geospatial Managed Objects

    DEFF Research Database (Denmark)

    Kjems, Erik; Kolář, Jan

    2014-01-01

    One of the major development efforts within the GI Science domain is pointing at real-time information coming from geographically referenced features in general. At the same time, 3D city models are mostly justified as being objects for visualization purposes rather than constituting the foundation of a geographic data representation of the world. The combination of 3D city models and real-time information-based systems, though, can provide a whole new setup for data fusion within an urban environment and provide time-critical information, preserving our limited resources in the most sustainable way. Using 3D ... occasions we have been advocating for a new and advanced formulation of real-world features using the concept of Geospatial Managed Objects (GMO). This chapter presents the outcome of the InfraWorld project, a 4 million Euro project financed primarily by the Norwegian Research Council, where the concept...

  13. Geospatial Database for Strata Objects Based on Land Administration Domain Model (ladm)

    Science.gov (United States)

    Nasorudin, N. N.; Hassan, M. I.; Zulkifli, N. A.; Rahman, A. Abdul

    2016-09-01

    Recently in our country, the construction of buildings has become more complex, and a strata objects database has become more important for registering the real world as people now own and use multiple levels of space. Furthermore, strata titles are increasingly important and need to be well managed. LADM, also known as ISO 19152, is a standard model for land administration that allows integrated 2D and 3D representation of spatial units. The aim of this paper is to develop a strata objects database using LADM. This paper discusses the current 2D geospatial database and the need for a 3D geospatial database in the future. This paper also attempts to develop a strata objects database using a standard data model (LADM) and to analyze the developed database against the LADM data model. The current cadastre system in Malaysia, including strata titles, is discussed in this paper. The problems in the 2D geospatial database are listed, and the need for a 3D geospatial database in the future is also discussed. The processes to design a strata objects database are conceptual, logical and physical database design. The strata objects database allows us to find both non-spatial and spatial strata title information and thus shows the location of each strata unit. This development may help in handling strata titles and their information.

  14. The national atlas as a metaphor for improved use of a national geospatial data infrastructure

    NARCIS (Netherlands)

    Aditya Kurniawan Muhammad, T.

    2007-01-01

    Geospatial Data infrastructures have been developed worldwide. Geoportals have been created as an interface to allow users or the community to discover and use geospatial data offered by providers of these initiatives. This study focuses on the development of a web national atlas as an alternative

  15. High Performance Processing and Analysis of Geospatial Data Using CUDA on GPU

    Directory of Open Access Journals (Sweden)

    STOJANOVIC, N.

    2014-11-01

    Full Text Available In this paper, the high-performance processing of massive geospatial data on a many-core GPU (Graphic Processing Unit) is presented. We use the CUDA (Compute Unified Device Architecture) programming framework to implement parallel processing of common Geographic Information Systems (GIS) algorithms, such as viewshed analysis and map-matching. Experimental evaluation indicates the improvement in performance with respect to CPU-based solutions and shows the feasibility of using GPU and CUDA for parallel implementation of GIS algorithms over large-scale geospatial datasets.

  16. Visualizing dynamic geosciences phenomena using an octree-based view-dependent LOD strategy within virtual globes

    Science.gov (United States)

    Li, Jing; Wu, Huayi; Yang, Chaowei; Wong, David W.; Xie, Jibo

    2011-09-01

    Geoscientists build dynamic models to simulate various natural phenomena for a better understanding of our planet. Interactive visualizations of these geoscience models and their outputs through virtual globes on the Internet can help the public understand the dynamic phenomena related to the Earth more intuitively. However, challenges arise when the volume of four-dimensional (4D) data, 3D in space plus time, is too large for rendering. Datasets loaded from geographically distributed data servers require synchronization between ingesting and rendering data. Also, the visualization capability of display clients varies significantly in such an online visualization environment; some may not have high-end graphics cards. To enhance the efficiency of visualizing dynamic volumetric data in virtual globes, this paper proposes a systematic framework, in which an octree-based multiresolution data structure is implemented to organize time series 3D geospatial data to be used in virtual globe environments. This framework includes a view-dependent continuous level of detail (LOD) strategy formulated as a synchronized part of the virtual globe rendering process. Through the octree-based data retrieval process, the LOD strategy enables the rendering of the 4D simulation at a consistent and acceptable frame rate. To demonstrate the capabilities of this framework, data of a simulated dust storm event are rendered in World Wind, an open source virtual globe. The rendering performances with and without the octree-based LOD strategy are compared. The experimental results show that using the proposed data structure and processing strategy significantly enhances the visualization performance when rendering dynamic geospatial phenomena in virtual globes.
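
    A minimal sketch of view-dependent LOD selection over an octree is given below: a node is refined only while its projected size exceeds an error threshold. The node layout, thresholds and the distance-based screen-space proxy are simplifying assumptions for illustration; they do not reproduce the paper's data structure or rendering pipeline.

```python
# Illustrative sketch of view-dependent LOD selection over an octree: refine a
# node only while its projected size on screen exceeds an error threshold, so
# distant volumes are rendered coarsely. Not the paper's implementation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class OctreeNode:
    center: tuple          # (x, y, z) in world units
    size: float            # edge length of the cubic cell
    children: List["OctreeNode"] = field(default_factory=list)

def subdivide(node: OctreeNode):
    """Create the eight children of a node (geometry only, no data payload)."""
    half, quarter = node.size / 2.0, node.size / 4.0
    cx, cy, cz = node.center
    for dx in (-quarter, quarter):
        for dy in (-quarter, quarter):
            for dz in (-quarter, quarter):
                node.children.append(OctreeNode((cx + dx, cy + dy, cz + dz), half))

def select_lod(node: OctreeNode, camera_distance: float,
               pixel_error: float = 2.0, max_depth: int = 4, depth: int = 0):
    """Return the nodes to render for the current viewpoint."""
    # Screen-space size proxy: cell size divided by distance to the camera.
    projected = node.size / max(camera_distance, 1e-6)
    if depth >= max_depth or projected < pixel_error:
        return [node]
    if not node.children:
        subdivide(node)
    selected = []
    for child in node.children:
        # Simplification: all children share the parent's camera distance.
        selected.extend(select_lod(child, camera_distance, pixel_error,
                                   max_depth, depth + 1))
    return selected

root = OctreeNode(center=(0.0, 0.0, 0.0), size=1024.0)
print("nodes rendered (far view):", len(select_lod(root, camera_distance=4000.0)))
print("nodes rendered (near view):", len(select_lod(root, camera_distance=300.0)))
```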

  17. Teaching Tectonics to Undergraduates with Web GIS

    Science.gov (United States)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curriculum can support spatial learning. We present Web-based visualization and analysis tools developed with JavaScript APIs to enhance tectonic curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploration of earthquake and volcano data sets, a subduction and elevation profile tool which facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS is platform independent and can be used on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation using two- and three-dimensional visualization and analytical software. Coverages that allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  18. Automated Geospatial Watershed Assessment Tool (AGWA) Poster Presentation

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  19. A Research Agenda for Geospatial Technologies and Learning

    Science.gov (United States)

    Baker, Tom R.; Battersby, Sarah; Bednarz, Sarah W.; Bodzin, Alec M.; Kolvoord, Bob; Moore, Steven; Sinton, Diana; Uttal, David

    2015-01-01

    Knowledge around geospatial technologies and learning remains sparse, inconsistent, and overly anecdotal. Studies are needed that are better structured; more systematic and replicable; attentive to progress and findings in the cognate fields of science, technology, engineering, and math education; and coordinated for multidisciplinary approaches.…

  20. Academic research opportunities at the National Geospatial-Intelligence Agency (NGA)

    Science.gov (United States)

    Loomer, Scott A.

    2006-05-01

    The vision of the National Geospatial-Intelligence Agency (NGA) is to "Know the Earth...Show the Way." To achieve this vision, the NGA provides geospatial intelligence in all its forms and from whatever source-imagery, imagery intelligence, and geospatial data and information-to ensure the knowledge foundation for planning, decision, and action. Academia plays a key role in the NGA research and development program through the NGA Academic Research Program. This multi-disciplinary program of basic research in geospatial intelligence topics provides grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program are: *NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. *Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. *Intelligence Community Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports and how researchers and institutions can apply for grants under the program. In addition, other opportunities for academia to engage with NGA through

  1. Geospatial Information Relevant to the Flood Protection Available on The Mainstream Web

    Directory of Open Access Journals (Sweden)

    Kliment Tomáš

    2014-03-01

    Full Text Available Flood protection is one of several disciplines where geospatial data is very important and is a crucial component. Its management, processing and sharing form the foundation for their efficient use; therefore, special attention is required in the development of effective, precise, standardized, and interoperable models for the discovery and publishing of data on the Web. This paper describes the design of a methodology to discover Open Geospatial Consortium (OGC) services on the Web and collect descriptive information, i.e., metadata in a geocatalogue. A pilot implementation of the proposed methodology - Geocatalogue of geospatial information provided by OGC services discovered on Google (hereinafter “Geocatalogue”) - was used to search for available resources relevant to the area of flood protection. The result is an analysis of the availability of resources discovered through their metadata collected from the OGC services (WMS, WFS, etc.) and the resources they provide (WMS layers, WFS objects, etc.) within the domain of flood protection.

  2. EarthServer - 3D Visualization on the Web

    Science.gov (United States)

    Wagner, Sebastian; Herzig, Pasquale; Bockholt, Ulrich; Jung, Yvonne; Behr, Johannes

    2013-04-01

    EarthServer (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, is a project to enable the management, access and exploration of massive, multi-dimensional datasets using Open Geospatial Consortium (OGC) query and processing language standards like WCS 2.0 and WCPS. To this end, a server/client architecture designed to handle Petabyte/Exabyte volumes of multi-dimensional data is being developed and deployed. As an important part of the EarthServer project, six Lighthouse Applications, major scientific data exploitation initiatives, are being established to make cross-domain, Earth Sciences-related data repositories available in an open and unified manner, as service endpoints based on solutions and infrastructure developed within the project. Client technology developed and deployed in EarthServer ranges from mobile and web clients to immersive virtual reality systems, all designed to interact with a physically and logically distributed server infrastructure using exclusively OGC standards. In this contribution, we would like to present our work on a web-based 3D visualization and interaction client for Earth Sciences data using only technology found in standard web browsers, without requiring the user to install plugins or add-ons. Additionally, we are able to run the earth data visualization client on a wide range of different platforms with very different soft- and hardware requirements, such as smart phones (e.g. iOS, Android), different desktop systems, etc. High-quality, hardware-accelerated visualization of 3D and 4D content in standard web browsers can be realized now, and we believe it will become more and more common to use this fast, lightweight and ubiquitous platform to provide insights into big datasets without requiring the user to set up a specialized client first. With that in mind, we will also point out some of the limitations we encountered using current web technologies. Underlying the EarthServer web client
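
    For readers unfamiliar with WCPS, a processing request can be issued over plain HTTP; the snippet below is only a sketch (the endpoint URL, coverage name, axis labels, and the exact key-value encoding of the ProcessCoverages operation are assumptions and may differ between server implementations).

```python
# Illustrative sketch of querying a WCPS endpoint over HTTP (hypothetical
# endpoint and coverage name; parameter encoding may vary by server).
import requests

ENDPOINT = "https://example.org/rasdaman/ows"   # hypothetical service URL
wcps_query = (
    'for c in (MeanTemperature) '
    'return encode(c[Lat(40:50), Long(0:10), ansi("2013-01-01")], "image/png")'
)

response = requests.get(
    ENDPOINT,
    params={
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",   # WCS Processing Extension operation
        "query": wcps_query,
    },
    timeout=60,
)
response.raise_for_status()
with open("subset.png", "wb") as f:
    f.write(response.content)            # PNG slice ready for a web/3D client
```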

  3. A geospatial soil-based DSS to reconcile landscape management and land protection

    Science.gov (United States)

    Manna, Piero; Basile, Angelo; Bonfante, Antonello; D'Antonio, Amedeo; De Michele, Carlo; Iamarino, Michela; Langella, Giuliano; Florindo Mileti, Antonio; Pileri, Paolo; Vingiani, Simona; Terribile, Fabio

    2017-04-01

    The implementation of the UN 2030 Agenda may represent a great opportunity to place soil science at the heart of many Sustainable Development Goals (e.g. SDGs 2, 3, 13, 15, 15.3, 16.7). On the other hand, the high complexity embedded in the actual implementation of the SDGs and of many other ambitious objectives (e.g. FAO goals) may cause new frustration if these policy documents do not bring real progress. The scientific community is asked to help disentangle this complexity and possibly identify a way forward. This may also help the large number of European directives (e.g. WFD, EIA), regulations and communications that aim to achieve a better environment but still face great difficulties in their full implementation (e.g. COM2015/120; COM2013/683). This contribution offers a different perspective: the full implementation of the SDGs and of integrated land policies requires tackling some key overlooked issues, including full competence in (and capability to manage) landscape variability, its multi-functionality (e.g. agriculture/environment) and its dynamic nature (many processes, including crop growth and the fate of pollutants, are dynamic); moreover, it requires supporting actions at a very detailed local scale, since many processes and problems are site specific. The soil is the pulsing heart of the landscape and of all the issues above. Accordingly, we aim to demonstrate the multiple benefits of using a smart geoSpatial Decision Support System (S-DSS) grounded in soil modelling, called SOILCONSWEB (an EU LIFE+ project and its extensions). It is a freely accessible web platform based on a Geospatial Cyber-Infrastructure (GCI) and developed in Valle Telesina (Southern Italy) over an area of 20,000 ha. It supports multilevel decision-making in agriculture and the environment, including interaction with other land uses (such as landscape and urban planning), and thus delivers simultaneously to SDGs 2, 3, 13, 15, 15.3, 16.7.

  4. Geospatial cryptography: enabling researchers to access private, spatially referenced, human subjects data for cancer control and prevention.

    Science.gov (United States)

    Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre

    2017-07-01

    As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. Geospatial cryptography holds substantial promise for accelerating the

  5. Big Data analytics in the Geo-Spatial Domain

    NARCIS (Netherlands)

    R.A. Goncalves (Romulo); M.G. Ivanova (Milena); M.L. Kersten (Martin); H. Scholten; S. Zlatanova; F. Alvanaki (Foteini); P. Nourian (Pirouz); E. Dias

    2014-01-01

    Big data collections in many scientific domains have inherently rich spatial and geo-spatial features. Spatial location is among the core aspects of data in Earth observation sciences, astronomy, and seismology to name a few. The goal of our project is to design an efficient data

  6. Methods and Tools to Align Curriculum to the Skills and Competencies Needed by the Workforce - an Example from Geospatial Science and Technology

    Science.gov (United States)

    Johnson, A. B.

    2012-12-01

    Geospatial science and technology (GST), including geographic information systems, remote sensing, global positioning systems and mobile applications, provides valuable tools for geoscientists and students learning to become geoscientists. GST allows the user to analyze data spatially and temporally and then visualize the data and outcomes in multiple formats (digital, web and paper). GST has evolved rapidly, and it has been difficult to create effective curricula as few guidelines existed to help educators. In 2010, the US Department of Labor (DoL), in collaboration with the National Geospatial Center of Excellence (GeoTech Center), a National Science Foundation-supported grant, approved the Geospatial Technology Competency Model (GTCM). The GTCM was developed and vetted with industry experts and provided the structure and example competencies needed across the industry. While the GTCM was helpful, a more detailed list of skills and competencies needed to be identified in order to build appropriate curricula. The GeoTech Center carried out multiple DACUM events to identify the skills and competencies needed by entry-level workers. DACUM (Developing a Curriculum) is a job analysis process whereby expert workers are convened to describe what they do for a specific occupation. The outcomes from multiple DACUMs were combined into a MetaDACUM and reviewed by hundreds of GST professionals. This provided a list of more than 320 skills and competencies needed by the workforce. The GeoTech Center then held multiple workshops across the U.S. where more than 100 educators knowledgeable in teaching GST parsed the list into Model Courses and a Model Certificate Program. During this process, tools were developed that helped educators define which competency should be included in a specific course and the depth of instruction for that competency. This presentation will provide details about the process, methodology and tools used to create the Models and suggest how they can be used

  7. Investigating Methods for Serving Visualizations of Vertical Profiles

    Science.gov (United States)

    Roberts, J. T.; Cechini, M. F.; Lanjewar, K.; Rodriguez, J.; Boller, R. A.; Baynes, K.

    2017-12-01

    Several geospatial web servers, web service standards, and mapping clients exist for the visualization of two-dimensional raster and vector-based Earth science data products. However, data products with a vertical component (i.e., vertical profiles) do not have the same mature set of technologies and pose a greater technical challenge when it comes to visualizations. There are a variety of tools and proposed standards, but no obvious solution that can handle the variety of visualizations found with vertical profiles. An effort is being led by members of the NASA Global Imagery Browse Services (GIBS) team to gather a list of technologies relevant to existing vertical profile data products and user stories. The goal is to find a subset of technologies, standards, and tools that can be used to build publicly accessible web services that can handle the greatest number of use cases for the widest audience possible. This presentation will describe results of the investigation and offer directions for moving forward with building a system that is capable of effectively and efficiently serving visualizations of vertical profiles.

  8. The Prodiguer Messaging Platform

    Science.gov (United States)

    Denvil, S.; Greenslade, M. A.; Carenton, N.; Levavasseur, G.; Raciazek, J.

    2015-12-01

    CONVERGENCE is a French multi-partner national project designed to gather HPC and informatics expertise to innovate in the context of running French global climate models with differing grids and at differing resolutions. Efficient and reliable execution of these models and the management and dissemination of model output are some of the complexities that CONVERGENCE aims to resolve. At any one moment in time, researchers affiliated with the Institut Pierre Simon Laplace (IPSL) climate modeling group are running hundreds of global climate simulations. These simulations execute upon a heterogeneous set of French High Performance Computing (HPC) environments. The IPSL's simulation execution runtime libIGCM (library for the IPSL Global Climate Modeling group) has recently been enhanced to support hitherto impossible real-time use cases such as simulation monitoring, data publication, metrics collection, simulation control, and visualizations. At the core of this enhancement is Prodiguer: an AMQP (Advanced Message Queuing Protocol)-based, event-driven, asynchronous distributed messaging platform. libIGCM now dispatches copious amounts of information, in the form of messages, to the platform for remote processing by Prodiguer software agents at IPSL servers in Paris. Such processing takes several forms: persisting message content to database(s); launching rollback jobs upon simulation failure; notifying downstream applications; and automating visualization pipelines. We will describe and/or demonstrate the platform's technical implementation, its inherent ease of scalability, its adaptiveness with respect to supervising simulations, and a web portal receiving simulation notifications in real time.
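
    Prodiguer's message schema is not given in the abstract; as a generic sketch of the AMQP publish pattern it describes (a running simulation dispatching a monitoring event to a broker for asynchronous processing by remote agents), using the pika client with made-up exchange, routing-key and payload names:

```python
# Generic AMQP publish sketch with pika (hypothetical exchange, routing key
# and message fields; not Prodiguer's actual schema).
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.exchange_declare(exchange="simulation.monitoring",
                         exchange_type="topic", durable=True)

message = {
    "simulation_id": "ipsl-run-0001",     # hypothetical identifiers
    "event": "timestep_completed",
    "timestep": 1200,
}
channel.basic_publish(
    exchange="simulation.monitoring",
    routing_key="simulation.monitoring.progress",
    body=json.dumps(message),
    properties=pika.BasicProperties(content_type="application/json",
                                    delivery_mode=2),   # persist the message
)
connection.close()
```

    A consumer agent on the server side would bind a queue to the same exchange and persist, forward or visualize each message as it arrives.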

  9. IMPRINT Analysis of an Unmanned Air System Geospatial Information Process

    National Research Council Canada - National Science Library

    Hunn, Bruce P; Schweitzer, Kristin M; Cahir, John A; Finch, Mary M

    2008-01-01

    ... intelligence, geospatial analysis cell. The Improved Performance Research Integration Tool (IMPRINT) modeling program was used to understand this process and to assess crew workload during several test scenarios...

  10. Learning R for geospatial analysis

    CERN Document Server

    Dorman, Michael

    2014-01-01

    This book is intended for anyone who wants to learn how to efficiently analyze geospatial data with R, including GIS analysts, researchers, educators, and students who work with spatial data and who are interested in expanding their capabilities through programming. The book assumes familiarity with the basic geographic information concepts (such as spatial coordinates), but no prior experience with R and/or programming is required. By focusing on R exclusively, you will not need to depend on any external software-a working installation of R is all that is necessary to begin.

  11. A GEOSPATIAL ANALYSIS OF THE RELATIONSHIP BETWEEN ENVIRONMENTAL DRIVERS AND VECTOR-BORNE DISEASES

    Directory of Open Access Journals (Sweden)

    MARIA IOANA VLAD-ȘANDRU

    2015-10-01

    Full Text Available A Geospatial Analysis of the Relationship between Environmental Drivers and Vector-Borne Diseases. Human health is profoundly affected by weather and climate. Environmental health is becoming a major preoccupation on a world-wide scale; there is a close correlation between a population’s state of health and the quality of its environment, considering that many infectious diseases are at least partly dependent on environmental factors. When we talk about the environment, we realize that it includes and affects many fields of action in our daily life. Earth observation from space, with validation from in situ observations, provides a greater understanding of the environment and enables us to monitor and predict key environmental phenomena and events that can affect our livelihoods and health. Even though the use of Earth observation is growing for a wide variety of purposes, it is extremely unlikely that Earth observation will be able to detect infectious diseases directly. Instead, Earth observation can be used to detect high NDVI values (and possibly attribute the high surface chlorophyll concentration to a particular disease), and to help predict the movement of the agents carrying vector-borne disease. Many diseases need certain temperature and moisture conditions to breed. The primary objective of analyzing environmental health risks and vulnerabilities is to support the Development Regions in strengthening their capacity to assess, visualize and analyze health risks and to incorporate the results of this analysis in a health risk map for disaster risk reduction, emergency preparedness and response plans. At the same time, such an analysis applied in health allows the collection and homogenization of baseline data, information and maps to begin, helping health authorities and decision makers to take informed decisions in times of crisis. The Informational Health Platform would be used for the integration of data coming from different sources in order to
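
    For reference, NDVI is computed from the red and near-infrared reflectance bands as NDVI = (NIR − Red) / (NIR + Red); a minimal NumPy sketch with toy values:

```python
# Minimal NDVI sketch: NDVI = (NIR - Red) / (NIR + Red), values in [-1, 1].
import numpy as np

red = np.array([[0.10, 0.12], [0.30, 0.25]])   # toy red-band reflectance
nir = np.array([[0.45, 0.50], [0.35, 0.28]])   # toy near-infrared reflectance

ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)   # guard against divide-by-zero
print(ndvi.round(2))   # dense vegetation tends toward +1, bare soil/water toward 0 or below
```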

  12. Modeling process platforms based on an object-oriented visual diagrammatic modeling language

    NARCIS (Netherlands)

    Zhang, L.

    2009-01-01

    Process platforms have been recognised as a promising means of dealing with product variety while achieving a near mass production efficiency. To assist practitioners to better understand, implement and use process platforms, this study addresses the underlying logic for coping with the challenges

  13. Using Immersive Visualizations to Improve Decision Making and Enhancing Public Understanding of Earth Resource and Climate Issues

    Science.gov (United States)

    Yu, K. C.; Raynolds, R. G.; Dechesne, M.

    2008-12-01

    New visualization technologies, from ArcGIS to Google Earth, have allowed for the integration of complex, disparate data sets to produce visually rich and compelling three-dimensional models of sub-surface and surface resource distribution patterns. The rendering of these models allows the public to quickly understand complicated geospatial relationships that would otherwise take much longer to explain using traditional media. We have impacted the community through topical policy presentations at both state and city levels, adult education classes at the Denver Museum of Nature and Science (DMNS), and public lectures at DMNS. We have constructed three-dimensional models from well data and surface observations which allow policy makers to better understand the distribution of groundwater in sandstone aquifers of the Denver Basin. Our presentations to local governments in the Denver metro area have allowed resource managers to better project future ground water depletion patterns, and to encourage development of alternative sources. DMNS adult education classes on water resources, geography, and regional geology, as well as public lectures on global issues such as earthquakes, tsunamis, and resource depletion, have utilized the visualizations developed from these research models. In addition to presenting GIS models in traditional lectures, we have also made use of the immersive display capabilities of the digital "fulldome" Gates Planetarium at DMNS. The real-time Uniview visualization application installed at Gates was designed for teaching astronomy, but it can be re-purposed for displaying our model datasets in the context of the Earth's surface. The 17-meter diameter dome of the Gates Planetarium allows an audience to have an immersive experience---similar to virtual reality CAVEs employed by the oil exploration industry---that would otherwise not be available to the general public. Public lectures in the dome allow audiences of over 100 people to comprehend

  14. Visual Basic 2012 programmer's reference

    CERN Document Server

    Stephens, Rod

    2012-01-01

    The comprehensive guide to Visual Basic 2012 Microsoft Visual Basic (VB) is the most popular programming language in the world, with millions of lines of code used in businesses and applications of all types and sizes. In this edition of the bestselling Wrox guide, Visual Basic expert Rod Stephens offers novice and experienced developers a comprehensive tutorial and reference to Visual Basic 2012. This latest edition introduces major changes to the Visual Studio development platform, including support for developing mobile applications that can take advantage of the Windows 8 operating system

  15. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    Science.gov (United States)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes of hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https

  16. How bicycle level of traffic stress correlate with reported cyclist accidents injury severities: A geospatial and mixed logit analysis.

    Science.gov (United States)

    Chen, Chen; Anderson, Jason C; Wang, Haizhong; Wang, Yinhai; Vogt, Rachel; Hernandez, Salvador

    2017-11-01

    Transportation agencies need efficient methods to determine how to reduce bicycle accidents while promoting cycling activities and prioritizing safety improvement investments. Many studies have used standalone methods, such as level of traffic stress (LTS) and bicycle level of service (BLOS), to better understand bicycle mode share and network connectivity for a region. However, in most cases, other studies rely on crash severity models to explain what variables contribute to the severity of bicycle related crashes. This research uniquely correlates bicycle LTS with reported bicycle crash locations for four cities in New Hampshire through geospatial mapping. LTS measurements and crash locations are compared visually using a GIS framework. Next, a bicycle injury severity model, that incorporates LTS measurements, is created through a mixed logit modeling framework. Results of the visual analysis show some geospatial correlation between higher LTS roads and "Injury" type bicycle crashes. It was determined, statistically, that LTS has an effect on the severity level of bicycle crashes and high LTS can have varying effects on severity outcome. However, it is recommended that further analyses be conducted to better understand the statistical significance and effect of LTS on injury severity. As such, this research will validate the use of LTS as a proxy for safety risk regardless of the recorded bicycle crash history. This research will help identify the clustering patterns of bicycle crashes on high-risk corridors and, therefore, assist with bicycle route planning and policy making. This paper also suggests low-cost countermeasures or treatments that can be implemented to address high-risk areas. Specifically, with the goal of providing safer routes for cyclists, such countermeasures or treatments have the potential to substantially reduce the number of fatalities and severe injuries. Published by Elsevier Ltd.

  17. Creating 3D models of historical buildings using geospatial data

    Science.gov (United States)

    Alionescu, Adrian; Bǎlǎ, Alina Corina; Brebu, Floarea Maria; Moscovici, Anca-Maria

    2017-07-01

    Recently, a lot of interest has been shown in understanding real-world objects by acquiring 3D images of them using laser scanning technology and panoramic images. A realistic impression of geometric 3D data can be generated by draping real colour textures simultaneously captured with a colour camera. In this context, a new concept of geospatial data acquisition based on panoramic images has rapidly revolutionized the method of determining the spatial position of objects. This article describes an approach that combines terrestrial laser scanning and panoramic images captured with Trimble V10 Imaging Rover technology to enhance the detail and realism of the geospatial data set, in order to obtain 3D urban plans and virtual reality applications.

  18. Novel Scientific Visualization Interfaces for Interactive Information Visualization and Sharing

    Science.gov (United States)

    Demir, I.; Krajewski, W. F.

    2012-12-01

    As geoscientists are confronted with increasingly massive datasets from environmental observations to simulations, one of the biggest challenges is having the right tools to gain scientific insight from the data and communicate the understanding to stakeholders. Recent developments in web technologies make it easy to manage, visualize and share large data sets with general public. Novel visualization techniques and dynamic user interfaces allow users to interact with data, and modify the parameters to create custom views of the data to gain insight from simulations and environmental observations. This requires developing new data models and intelligent knowledge discovery techniques to explore and extract information from complex computational simulations or large data repositories. Scientific visualization will be an increasingly important component to build comprehensive environmental information platforms. This presentation provides an overview of the trends and challenges in the field of scientific visualization, and demonstrates information visualization and communication tools in the Iowa Flood Information System (IFIS), developed within the light of these challenges. The IFIS is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, flood forecasts both short-term and seasonal, and other flood-related data for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values, and flooding scenarios with contributions from multiple rivers. Real-time and historical data of water levels, gauge heights, and

  19. Ibmdbpy-spatial : An Open-source implementation of in-database geospatial analytics in Python

    Science.gov (United States)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

    As the amount of spatial data acquired from several geodetic sources has grown over the years and as data infrastructure has become more powerful, the need for adoption of in-database analytic technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis for making better predictions about risks and opportunities, identifying trends and spotting anomalies. Although there are a number of open-source spatial analysis libraries such as geopandas and shapely available today, most of them have been restricted to the manipulation and analysis of geometric objects, with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform called Bluemix. Working in-database reduces the network overload, as the complete data need not be replicated to the user's local system and only a subset of the entire dataset is fetched into memory in a single instance. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features, such as columnar storage and parallel processing. The package is currently supported on Python versions from 2.7 up to 3.4. The basic architecture of the package consists of three main components: 1) a connection to dashDB represented by an IdaDataBase instance, which uses a middleware API, namely pypyodbc or jaydebeapi, to establish the database connection via
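
    Based only on the components named above (an IdaDataBase connection established through pypyodbc or jaydebeapi, with operations pushed down into dashDB), a session might look roughly like the sketch below; the connection string, table name and the exact behaviour of the listed methods are assumptions rather than a verified account of the ibmdbpy-spatial API.

```python
# Hedged sketch of in-database access via ibmdbpy (connection details and
# table name are placeholders/assumptions, not verified endpoints).
from ibmdbpy import IdaDataBase, IdaDataFrame

# JDBC-style connection string to a dashDB instance (hypothetical credentials).
idadb = IdaDataBase(dsn="jdbc:db2://host.bluemix.net:50000/BLUDB:user=me;password=secret;")

# Lazily wrap a table; operations on it are translated to in-database SQL,
# so only the requested subset ever reaches local memory.
counties = IdaDataFrame(idadb, "SAMPLES.GEO_COUNTY")
print(counties.shape)        # row/column counts computed in-database
print(counties.head(5))      # fetch just a small preview into memory

idadb.close()
```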

  20. Persistent Teaching Practices after Geospatial Technology Professional Development

    Science.gov (United States)

    Rubino-Hare, Lori A.; Whitworth, Brooke A.; Bloom, Nena E.; Claesgens, Jennifer M.; Fredrickson, Kristi M.; Sample, James C.

    2016-01-01

    This case study described teachers with varying technology skills who were implementing the use of geospatial technology (GST) within project-based instruction (PBI) at varying grade levels and contexts 1 to 2 years following professional development. The sample consisted of 10 fifth- to ninth-grade teachers. Data sources included artifacts,…

  1. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Full Text Available Social media is valuable for propagating information during disasters because of its timeliness and availability, and it assists in decision-making when tagged with locations. Considering the ambiguity and inaccuracy in some social data, additional authoritative data are needed for important verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis after the fact. Moreover, current works organize the data from the perspective of the spatial location, not from the perspective of the disaster, making it difficult to analyze the disaster dynamically. All of the disaster-related data around the affected locations need to be retrieved. To address these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework and proceeds as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and attributes were extracted from the web using natural language processing (NLP) and used in the semantic similarity matching of the geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed within the GEGIS and shows that the system would be effective when typhoons occur.
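
    The semantic similarity matching in GEGIS is ontology-driven and not detailed in the abstract; as a deliberately simplified stand-in, the sketch below ranks geospatial resources against terms extracted for a geo-event using plain Jaccard similarity (the event terms, resource tags and threshold are invented for illustration).

```python
# Simplified stand-in for matching geo-event terms to geospatial resources
# (Jaccard similarity over keyword sets; data and threshold are illustrative).
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

event_terms = {"typhoon", "rainfall", "flood", "wind", "coast"}

resources = {
    "precipitation_radar_mosaic": {"rainfall", "radar", "precipitation", "storm"},
    "coastal_dem_10m": {"coast", "elevation", "dem"},
    "land_registry_parcels": {"cadastre", "parcel", "ownership"},
}

ranked = sorted(
    ((name, jaccard(event_terms, tags)) for name, tags in resources.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, score in ranked:
    if score >= 0.1:                      # keep only loosely relevant resources
        print(f"{name}: {score:.2f}")
```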

  2. Geospatial Technology: A Tool to Aid in the Elimination of Malaria in Bangladesh

    Directory of Open Access Journals (Sweden)

    Karen E. Kirk

    2014-12-01

    Full Text Available Bangladesh is a malaria endemic country. There are 13 districts in the country bordering India and Myanmar that are at risk of malaria. The majority of malaria morbidity and mortality cases are in the Chittagong Hill Tracts, the mountainous southeastern region of Bangladesh. In recent years, the malaria burden has declined in the country. In this study, we reviewed and summarized published data (through 2014) on the use of geospatial technologies on malaria epidemiology in Bangladesh and outlined potential contributions of geospatial technologies for eliminating malaria in the country. We completed a literature review using the search terms “malaria, Bangladesh” and found 218 articles published in peer-reviewed journals listed in PubMed. After a detailed review, 201 articles were excluded because they did not meet our inclusion criteria, and 17 articles were selected for final evaluation. Published studies indicated that geospatial technology tools (Geographic Information System, Global Positioning System, and Remote Sensing) were used to determine vector-breeding sites, land cover classification, accessibility to health facilities, treatment seeking behaviors, and risk mapping at the household, regional, and national levels in Bangladesh. To achieve the goal of malaria elimination in Bangladesh, we concluded that further research using geospatial technologies should be integrated into the country’s ongoing surveillance system to identify and better assess progress towards malaria elimination.

  3. Theoretical multi-tier trust framework for the geospatial domain

    CSIR Research Space (South Africa)

    Umuhoza, D

    2010-01-01

    Full Text Available chain or workflow from data acquisition to knowledge discovery. The authors present work in progress on a theoretical multi-tier trust framework for the processing chain from data acquisition to knowledge discovery in the geospatial domain. Holistic trust...

  4. Simulator platform for fast reactor operation and safety technology demonstration

    International Nuclear Information System (INIS)

    Vilim, R.B.; Park, Y.S.; Grandy, C.; Belch, H.; Dworzanski, P.; Misterka, J.

    2012-01-01

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the-art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  5. Simulator platform for fast reactor operation and safety technology demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Vilim, R. B.; Park, Y. S.; Grandy, C.; Belch, H.; Dworzanski, P.; Misterka, J. (Nuclear Engineering Division)

    2012-07-30

    A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the-art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.

  6. Geospatial big data and cartography : research challenges and opportunities for making maps that matter

    OpenAIRE

    Robinson, Anthony C.; Demsar, Urska; Moore, Antoni B.; Buckley, Aileen; Jiang, Bin; Field, Kenneth; Kraak, Menno-Jan; Camboim, Silvana P; Sluter, Claudia R

    2017-01-01

    Geospatial big data present a new set of challenges and opportunities for cartographic researchers in technical, methodological, and artistic realms. New computational and technical paradigms for cartography are accompanying the rise of geospatial big data. Additionally, the art and science of cartography needs to focus its contemporary efforts on work that connects to outside disciplines and is grounded in problems that are important to humankind and its sustainability. Following the develop...

  7. Describing Geospatial Assets in the Web of Data: A Metadata Management Scenario

    Directory of Open Access Journals (Sweden)

    Cristiano Fugazza

    2016-12-01

    Full Text Available Metadata management is an essential enabling factor for geospatial assets because discovery, retrieval, and actual usage of the latter are tightly bound to the quality of these descriptions. Unfortunately, the multi-faceted landscape of metadata formats, requirements, and conventions makes it difficult to identify editing tools that can be easily tailored to the specificities of a given project, workgroup, and Community of Practice. Our solution is a template-driven metadata editing tool that can be customised to any XML-based schema. Its output consists of standards-compliant metadata records that also have a semantics-aware counterpart, enabling novel exploitation techniques. Moreover, external data sources can easily be plugged in to provide autocompletion functionalities on the basis of the data structures made available on the Web of Data. Besides presenting the essentials of customising the editor by means of two use cases, we extend the methodology to the whole life cycle of geospatial metadata. We demonstrate the novel capabilities enabled by RDF-based metadata representation with respect to traditional metadata management in the geospatial domain.
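
    As a small illustration of the "semantics-aware counterpart" idea (not the project's actual vocabulary, schema or identifiers), a metadata record can be mirrored as RDF with rdflib and Dublin Core terms:

```python
# Illustrative RDF counterpart of a geospatial metadata record using rdflib
# and Dublin Core terms (identifiers and values are placeholders).
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

DCAT = Namespace("http://www.w3.org/ns/dcat#")

g = Graph()
g.bind("dcterms", DCTERMS)
g.bind("dcat", DCAT)

dataset = URIRef("https://example.org/dataset/landcover-2016")   # hypothetical ID
g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Regional land-cover map 2016")))
g.add((dataset, DCTERMS.spatial, Literal("POLYGON((8 45, 11 45, 11 47, 8 47, 8 45))")))
g.add((dataset, DCTERMS.license, URIRef("https://creativecommons.org/licenses/by/4.0/")))

print(g.serialize(format="turtle"))
```

    Once expressed this way, the record can be queried with SPARQL or linked to external vocabularies on the Web of Data, which is the kind of exploitation the article has in mind.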

  8. A Comprehensive Optimization Strategy for Real-time Spatial Feature Sharing and Visual Analytics in Cyberinfrastructure

    Science.gov (United States)

    Li, W.; Shao, H.

    2017-12-01

    For geospatial cyberinfrastructure-enabled web services, the ability to rapidly transmit and share spatial data over the Internet plays a critical role in meeting the demands of real-time change detection, response and decision-making. Especially for vector datasets, which serve as irreplaceable and concrete material in data-driven geospatial applications, rich geometry and property information facilitates the development of interactive, efficient and intelligent data analysis and visualization applications. However, the big-data issues of vector datasets have hindered their wide adoption in web services. In this research, we propose a comprehensive optimization strategy to enhance the performance of vector data transmission and processing. This strategy combines: 1) pre-computed and on-the-fly generalization, which automatically determines the proper simplification level through the introduction of an appropriate distance tolerance (ADT) to meet various visualization requirements while speeding up simplification; 2) a progressive attribute transmission method to reduce data size and therefore service response time; and 3) compressed data transmission with dynamic selection of a compression method to maximize service efficiency under different computing and network environments. A cyberinfrastructure web portal was developed to implement the proposed technologies. After applying our optimization strategies, a substantial performance enhancement is achieved. We expect this work to widen the use of web services providing vector data to support real-time spatial feature sharing, visual analytics and decision-making.
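
    The three optimizations can be pictured with a small self-contained sketch; the tolerance value, the attribute subset and the use of Shapely and gzip here are illustrative choices, not the authors' implementation.

```python
# Illustrative sketch of the three optimizations: geometry generalization,
# partial attribute transfer, and compressed payloads (not the paper's code).
import gzip
import json
from shapely.geometry import LineString, mapping

# A dense polyline standing in for a detailed vector feature.
coords = [(x / 100.0, ((x % 7) - 3) / 50.0) for x in range(500)]
feature_geom = LineString(coords)
attributes = {"id": 42, "name": "river reach", "flow_cms": 12.7, "surveyor": "field team A"}

# 1) Generalize with a distance tolerance chosen for the current zoom level.
simplified = feature_geom.simplify(0.05, preserve_topology=True)

# 2) Send only the attributes the client currently needs.
wanted = {"id", "name"}
payload = {
    "type": "Feature",
    "geometry": mapping(simplified),
    "properties": {k: v for k, v in attributes.items() if k in wanted},
}

# 3) Compress the encoded feature before transmission.
raw = json.dumps(payload).encode("utf-8")
compressed = gzip.compress(raw)
print(len(coords), "->", len(simplified.coords), "vertices;",
      len(raw), "->", len(compressed), "bytes")
```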

  9. A research on the security of wisdom campus based on geospatial big data

    Science.gov (United States)

    Wang, Haiying

    2018-05-01

    A smart ("wisdom") campus faces several difficulties, such as sharing geospatial big data, expanding functionality, and managing, analyzing and mining geospatial big data; in particular, data security cannot be guaranteed, a problem that has attracted increasing attention. In this article we put forward a data-oriented software architecture, designed around the idea of treating data as the kernel, to address the problems of traditional software architectures, broaden campus spatial data research, and develop smart campus applications.

  10. Testing RISKGIS Platform with Students to Improve Learning and Teaching Skills

    Science.gov (United States)

    Olyazadeh, R.; Aye, Z. C.; Jaboyedoff, M.; Derron, M. H.

    2016-12-01

    Nowadays, open-source developments in the field of natural hazards and risk management are increasing rapidly. Governments, NGOs and research institutes are producing data for risk and disaster analysis, but few platforms are available to bring a real-life experience to students. This work focuses on the preliminary results of testing a WebGIS platform called RISKGIS with bachelor students at the University of Lausanne. The platform is built on a geospatial open-source technology called OpenGeo (Boundless). It can calculate the potential risk to buildings and assists students in understanding situations for risk reduction, mitigation and decision-making. The centre of Jomsom in Nepal, which may be affected by earthquake amplification, was selected for the first exercise. The shaking intensity map was designed by an expert based on the geological characteristics and DEM (Digital Elevation Model) of the area. All building data were extracted from OpenStreetMap using QGIS and adapted to the platform. A video tutorial was prepared to guide the students through the platform; 80 students tested the application online successfully, and 40 of them used Moodle (a free open-source software package for educators) for online feedback and a quiz. Among those, 30 completely answered both. We obtained interesting results for effectiveness, efficiency and satisfaction based on the System Usability Scale (SUS). The SUS score for this platform was 68.6 out of 100. The average quiz result was 9.39 out of 10, with 8 to 33 minutes taken to answer the quiz. There were several outliers for this duration, at 2 minutes (two students) and 9 to 18 hours (three students). Further exercises will be carried out with students by adding more advanced functions to the platform and improving willingness to participate in this online learning platform. This project is funded by Fonds d'innovation p
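
    For context, a System Usability Scale score such as the 68.6 reported above is computed from ten 1-5 Likert items: each odd-numbered (positively worded) item contributes (response − 1), each even-numbered item contributes (5 − response), and the sum is multiplied by 2.5 to give a 0-100 score. A quick sketch with fabricated responses:

```python
# SUS scoring sketch (responses are fabricated for illustration only).
def sus_score(responses):
    """responses: list of ten 1-5 ratings, item 1 first."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)   # odd: r-1, even: 5-r
    return total * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))   # -> 75.0
```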

  11. Research on presentation and query service of geo-spatial data based on ontology

    Science.gov (United States)

    Li, Hong-wei; Li, Qin-chao; Cai, Chang

    2008-10-01

    The paper analyzes the deficiencies in the presentation and querying of geospatial data in current GIS and discusses the advantages of ontology for the formalization of geospatial data and the presentation of semantic granularity. Taking a land-use classification system as an example, a domain ontology is constructed and described in OWL. Level- and category-based presentation of land-use data is realized, drawing on the ideas of vertical and horizontal navigation. The paper then discusses ontology-based query modes for geospatial data, including queries based on types and levels, queries based on instances and spatial relations, and synthetic queries based on types and instances. These methods enrich the query modes of current GIS and are a useful first attempt. The key point of ontology-based presentation and querying of spatial data is to construct a domain ontology that correctly reflects geographic concepts and their spatial relations and to give it a fine-grained formal description.

  12. MUDMAP: Simulation model for releases from offshore platforms

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    The present article deals with a Norwegian-developed simulation model dubbed MUDMAP. MUDMAP is a numerical model that simulates releases of drill muds and cuttings, produced water and other substances from offshore platforms. The model is envisioned as an advanced tool to assist in the rapid design and placement of intakes and release pipes on platforms, as well as in evaluating potential long-term impacts in the water and on the sea floor. MUDMAP allows rapid visual/graphical analysis of potential alternative solutions under various realistic environmental conditions, and supports planning and executing platform monitoring projects. 4 figs

  13. Geo-spatial reporting for monitoring of household immunization coverage through mobile phones: Findings from a feasibility study.

    Science.gov (United States)

    Kazi, A M; Ali, M; K, Ayub; Kalimuddin, H; Zubair, K; Kazi, A N; A, Artani; Ali, S A

    2017-11-01

    The addition of a Global Positioning System (GPS) to a mobile phone makes it a very powerful tool for surveillance and for monitoring the coverage of health programs. This technology enables the transfer of data directly into computer applications and cross-referencing with Geographic Information System (GIS) maps, which enhances the assessment of coverage and trends. Utilization of these systems in low- and middle-income countries is currently limited, particularly for immunization coverage assessments and polio vaccination campaigns. We piloted the use of such a system and discuss its potential to improve the efficiency of field-based health providers and health managers in monitoring the immunization program. Using the "30×7" WHO sampling technique, a survey of children less than five years of age was conducted in random clusters in three high-risk towns of Karachi, Pakistan, where a polio case was detected in 2011. The centre point of each cluster was calculated by the mobile application. Data and location coordinates were collected through a mobile phone. These data were linked with an automated mHealth-based monitoring system for Supplementary Immunization Activities (SIAs) in Karachi. After each SIA, a visual report was generated from the coordinates collected in the survey. A total of 3535 participants consented to answer a baseline survey. We found that mobile phones combined with GIS maps can improve the efficiency of health providers through real-time reporting and by replacing paper-based questionnaires for household-level data collection. Visual maps generated from the data and geospatial analysis can also give a better assessment of immunization coverage and polio vaccination campaigns. The study supports a model system in resource-constrained settings that allows routine capture of individual-level data through GPS-enabled mobile phones, providing actionable information and geospatial maps to local public health managers, policy makers

  14. Real-time analytics techniques to analyze and visualize streaming data

    CERN Document Server

    Ellis, Byron

    2014-01-01

    Construct a robust end-to-end solution for analyzing and visualizing streaming data Real-time analytics is the hottest topic in data analytics today. In Real-Time Analytics: Techniques to Analyze and Visualize Streaming Data, expert Byron Ellis teaches data analysts technologies to build an effective real-time analytics platform. This platform can then be used to make sense of the constantly changing data that is beginning to outpace traditional batch-based analysis platforms. The author is among a very few leading experts in the field. He has a prestigious background in research, development,

  15. Improving the Slum Planning Through Geospatial Decision Support System

    Science.gov (United States)

    Shekhar, S.

    2014-11-01

    In India, a number of schemes and programmes have been launched from time to time in order to promote integrated city development and to enable slum dwellers to gain access to basic services. Despite the use of geospatial technologies in planning, the local, state and central governments have only been partially successful in dealing with these problems. The study of existing policies and programmes also proved that when the government is the sole provider or mediator, GIS can become a tool of coercion rather than participatory decision-making. It has also been observed that local-level administrators who have adopted geospatial technology for local planning continue to base decision-making on existing political processes. At this juncture, a geospatial decision support system (GSDSS) can provide a framework for integrating database management systems with analytical models, graphical display and tabular reporting capabilities, and the expert knowledge of decision makers. This assists decision-makers in generating and evaluating alternative solutions to spatial problems. During this process, decision-makers undertake decision research, producing a large number of possible decision alternatives, and opportunities are provided to involve the community in decision-making. The objective is to help decision makers and planners find solutions through a quantitative spatial evaluation and verification process. The study investigates the options for slum development within the formal framework of RAY (Rajiv Awas Yojana), an ambitious programme of the Indian Government for slum development. The software modules realizing the GSDSS were developed using ArcGIS and Community-VIZ software for Gulbarga city.

  16. Geospatial Technology In Environmental Impact Assessments – Retrospective.

    Directory of Open Access Journals (Sweden)

    Goparaju Laxmi

    2015-10-01

    Full Text Available Environmental Impact Assessments are studies conducted to give us an insight into the various impacts caused by an upcoming industry or any developmental activity. They should address various social, economic and environmental issues, ensuring that negative impacts are mitigated. In this context, geospatial technology has been used widely in recent times.

  17. A Spatial Data Infrastructure Integrating Multisource Heterogeneous Geospatial Data and Time Series: A Study Case in Agriculture

    Directory of Open Access Journals (Sweden)

    Gloria Bordogna

    2016-05-01

    Currently, best practice to support land planning calls for the development of Spatial Data Infrastructures (SDI) capable of integrating both geospatial datasets and time series information from multiple sources, e.g., multitemporal satellite data and Volunteered Geographic Information (VGI). This paper describes an original OGC-standard, interoperable SDI architecture and a geospatial data and metadata workflow for creating and managing multisource heterogeneous geospatial datasets and time series, and discusses it in the framework of the Space4Agri project study case, developed to support the agricultural sector in the Lombardy region, Northern Italy. The main novel contributions go beyond the application domain for which the SDI was developed and are the following: the ingestion, within an a-centric SDI potentially distributed across several nodes on the Internet to support scalability, of products derived by processing remote sensing images, authoritative data, georeferenced in-situ measurements and voluntary information (VGI) created by farmers and agronomists using an original Smart App; the workflow automation for publishing sets and time series of heterogeneous multisource geospatial data and the related web services; and, finally, the project geoportal, which eases the analysis of the geospatial datasets and time series by providing complex, intelligent spatio-temporal query and answering facilities.
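
    As a hedged illustration of consuming an OGC-standard service published by such an SDI node, the sketch below uses the owslib library against a hypothetical WMS endpoint; the endpoint URL, layer name, bounding box and time value are assumptions, not the actual Space4Agri services.

```python
from owslib.wms import WebMapService

# Hypothetical WMS endpoint exposed by one SDI node
wms = WebMapService("https://example.org/geoserver/ows", version="1.1.1")

# List the layers the node publishes (e.g., multitemporal satellite-derived products)
for name, layer in wms.contents.items():
    print(name, layer.boundingBoxWGS84)

# Request a rendered map for one layer and one time step of a time series
img = wms.getmap(
    layers=["ndvi_composite"],        # hypothetical layer name
    srs="EPSG:4326",
    bbox=(8.5, 44.7, 11.5, 46.7),     # roughly the Lombardy region (lon/lat)
    size=(512, 512),
    format="image/png",
    time="2015-06-01",                # only if the layer exposes a time dimension
)
with open("ndvi.png", "wb") as f:
    f.write(img.read())
```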

  18. Business models for implementing geospatial technologies in transportation decision-making

    Science.gov (United States)

    2007-03-31

    This report describes six state DOTs' business models for implementing geospatial technologies. It provides a comparison of the organizational factors influencing how Arizona DOT, Delaware DOT, Georgia DOT, Montana DOT, North Carolina DOT, and Okla...

  19. Simultaneous Visualization of Different Utility Networks for Disaster Management

    Science.gov (United States)

    Semm, S.; Becker, T.; Kolbe, T. H.

    2012-07-01

    Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and to reinforce Situational Awareness by presenting and representing relevant information. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific decision-making throughout the crisis. Since operators' attention span and working memory are limiting factors in acquiring and interpreting information, the cartographic presentation has to support individuals in coordinating their activities and in handling highly dynamic situations. The Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for reaching appropriate decisions. Utility networks are among the most complex and most needed systems within a city. The visualization of utility infrastructure in crisis situations is addressed in this paper. The paper provides a conceptual approach for simplifying, aggregating, and visualizing multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.

  20. THE DESIGN OF A HIGH PERFORMANCE EARTH IMAGERY AND RASTER DATA MANAGEMENT AND PROCESSING PLATFORM

    Directory of Open Access Journals (Sweden)

    Q. Xie

    2016-06-01

    This paper summarizes the general requirements and specific characteristics of both a geospatial raster database management system and a raster data processing platform, from a domain-specific perspective as well as from a computing point of view. It also discusses the need for tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global-scale, high-performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model and supports image compression, data manipulation, general and spatial indices, content- and context-based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  1. The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform

    Science.gov (United States)

    Xie, Qingyun

    2016-06-01

    This paper summarizes the general requirements and specific characteristics of both a geospatial raster database management system and a raster data processing platform, from a domain-specific perspective as well as from a computing point of view. It also discusses the need for tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global-scale, high-performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model and supports image compression, data manipulation, general and spatial indices, content- and context-based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.
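
    Oracle GeoRaster itself is accessed through SQL and PL/SQL APIs, which are not reproduced here; the following is a generic, hedged sketch of the block-wise concurrent raster processing idea the paper describes, using the open-source rasterio library and a hypothetical GeoTIFF.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np
import rasterio

PATH = "scene.tif"  # hypothetical earth-imagery file

def block_mean(window):
    # Each task reads and summarizes one native block of the raster
    with rasterio.open(PATH) as src:
        return float(src.read(1, window=window, masked=True).mean())

# Enumerate the raster's internal block structure (its natural processing tiles)
with rasterio.open(PATH) as src:
    windows = [w for _, w in src.block_windows(1)]

# Process blocks concurrently, mirroring the concurrent/parallel HPC features described
with ThreadPoolExecutor(max_workers=8) as pool:
    block_means = list(pool.map(block_mean, windows))

print("mean of block means:", np.mean(block_means))
```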

  2. 75 FR 10309 - Announcement of National Geospatial Advisory Committee Meeting

    Science.gov (United States)

    2010-03-05

    ... Geospatial Advisory Committee (NGAC) will meet on March 24-25, 2010 at the One Washington Circle Hotel, 1... implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting...

  3. GEO-SPATIAL MODELING OF TRAVEL TIME TO MEDICAL FACILITIES IN MUNA BARAT DISTRICT, SOUTHEAST SULAWESI PROVINCE, INDONESIA

    Directory of Open Access Journals (Sweden)

    Nelson Sula

    2018-03-01

    Background: Health services are strongly influenced by regional topography, and road infrastructure is key to access to health services. Geographic information systems are a tool for modeling access to health services. Objective: To analyze geospatial data on travel time to medical facilities in Muna Barat district, Southeast Sulawesi Province, Indonesia. Methods: This research used geospatial analysis in which classified raster data were overlaid with Digital Elevation Model (DEM) data, road vector data, and the locations of public health centers (Puskesmas). Results: The geospatial analysis showed that travel time to the Puskesmas in Napano Kusambi and Kusambi sub-districts is between 90 and 120 minutes, and travel time to the hospital in Kusambi sub-district is more than 2 hours. Conclusion: The output of this geospatial analysis can be an input for the local government in planning infrastructure development in Muna Barat district, Indonesia.
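
    A minimal sketch of this kind of raster travel-time (cost-distance) analysis is given below, using scikit-image's MCP_Geometric; the cost values, road layout, and facility locations are purely illustrative, not the study's data.

```python
import numpy as np
from skimage.graph import MCP_Geometric

# Per-cell traversal time in minutes: slower off-road, faster along a road
minutes_per_cell = np.full((200, 200), 5.0)
minutes_per_cell[100, :] = 0.5            # a hypothetical east-west road

# Hypothetical row/col locations of health facilities (Puskesmas)
facilities = [(100, 20), (40, 150)]

# Accumulate minimum travel time outward from all facilities at once
mcp = MCP_Geometric(minutes_per_cell)
travel_time, _ = mcp.find_costs(starts=facilities)

# Classify travel time the way the abstract reports it (e.g., 90-120 min, >120 min)
print("cells within 90-120 minutes:", int(((travel_time >= 90) & (travel_time <= 120)).sum()))
print("cells more than 120 minutes away:", int((travel_time > 120).sum()))
```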

  4. Reviews of Geospatial Information Technology and Collaborative Data Delivery for Disaster Risk Management

    Directory of Open Access Journals (Sweden)

    Hiroyuki Miyazaki

    2015-09-01

    Because geospatial information technology is considered necessary for disaster risk management (DRM), the need for more effective collaboration between providers and end users in data delivery is increasing. This paper reviews the following: (i) schemes of disaster risk management and collaborative data operation in DRM; (ii) geospatial information technology in terms of applications to the schemes reviewed; and (iii) ongoing practices of collaborative data delivery with the schemes reviewed. The paper concludes by discussing the future of collaborative data delivery and the progress of the technologies.

  5. Visual dataflow language for educational robots programming

    OpenAIRE

    ZIMIN G.A.; MORDVINOV D.A.

    2016-01-01

    Visual domain-specific languages usually have a low entry barrier; sometimes even children can program in such languages by working with visual representations. This is widely used in the educational robotics domain, where the most commonly used programming environments are visual. The paper describes a novel dataflow visual programming environment for embedded robotic platforms. Obviously, complex dataflow languages are not easy to understand. The purpose of our tool is to "bridge" between light...

  6. Qualitative-Geospatial Methods of Exploring Person-Place Transactions in Aging Adults: A Scoping Review.

    Science.gov (United States)

    Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri

    2017-06-01

    Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data and assessed whether the geospatial and qualitative methods were supported by a specified methodology, the methods of data analysis, and the extent of integration of geospatial and qualitative methods. Fifteen studies were included and used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation.

  7. Free and Open Source Software for Geospatial in the field of planetary science

    Science.gov (United States)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analysis has spread quickly in the last ten years. The availability of open data and data from collaborative mapping projects has increased interest in tools, procedures and methods to handle spatially related information. Free Open Source Software projects devoted to geospatial data handling are proving successful, as the use of interoperable formats and protocols allows users to choose the pipeline of tools and libraries needed to solve a particular task, adapting the software scene to the specific problem. In particular, the Free Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way to interact among several institutions. When it comes to planetary sciences, geospatial Free Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are distributed along with their source code, and the interaction between user and developer is often very close, creating a continuum between these two roles. A very widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and Free Open Source GIS, open GIS formats and network protocols allow existing tools and methods developed to solve Earth-based problems to be extended to the study of solar system bodies. A day in the working life of a researcher using Free Open Source Software for geospatial will be presented, as well as benefits and
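
    As a minimal example of the GDAL planetary-format support mentioned above, the sketch below opens a raster product with the GDAL Python bindings; the file name is hypothetical, and any format handled by GDAL's PDS or VICAR drivers would be read the same way.

```python
from osgeo import gdal

gdal.UseExceptions()

# Hypothetical PDS product; GDAL picks the appropriate driver automatically
ds = gdal.Open("example_pds_image.img")
print("driver:", ds.GetDriver().ShortName)        # e.g. "PDS" if the PDS driver handled it
print("size:", ds.RasterXSize, "x", ds.RasterYSize)
print("projection:", ds.GetProjection())

# Read the first band into a NumPy array for further geospatial processing
band = ds.GetRasterBand(1)
data = band.ReadAsArray()
print("min/max pixel values:", data.min(), data.max())
```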

  8. Geospatial Image Stream Processing: Models, techniques, and applications in remote sensing change detection

    Science.gov (United States)

    Rueda-Velasquez, Carlos Alberto

    Detection of changes in environmental phenomena using remotely sensed data is a major requirement in the Earth sciences, especially in natural disaster related scenarios where real-time detection plays a crucial role in the saving of human lives and the preservation of natural resources. Although various approaches formulated to model multidimensional data can in principle be applied to the inherent complexity of remotely sensed geospatial data, there are still challenging peculiarities that demand a precise characterization in the context of change detection, particularly in scenarios of fast changes. In the same vein, geospatial image streams do not fit appropriately in the standard Data Stream Management System (DSMS) approach because these systems mainly deal with tuple-based streams. Recognizing the necessity for a systematic effort to address the above issues, the work presented in this thesis is a concrete step toward the foundation and construction of an integrated Geospatial Image Stream Processing framework, GISP. First, we present a data and metadata model for remotely sensed image streams. We introduce a precise characterization of images and image streams in the context of remotely sensed geospatial data. On this foundation, we define spatially-aware temporal operators with a consistent semantics for change analysis tasks. We address the change detection problem in settings where multiple image stream sources are available, and thus we introduce an architectural design for the processing of geospatial image streams from multiple sources. With the aim of targeting collaborative scientific environments, we construct a realization of our architecture based on Kepler, a robust and widely used scientific workflow management system, as the underlying computational support; and open data and Web interface standards, as a means to facilitate the interoperability of GISP instances with other processing infrastructures and client applications. We demonstrate our
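
    The GISP framework itself is not reproduced here; the sketch below is only a hedged illustration of a spatially-aware temporal operator over an image stream, comparing each incoming co-registered scene against a sliding-window baseline.

```python
from collections import deque
import numpy as np

def detect_changes(image_stream, window=5, threshold=0.2):
    """Yield a boolean change mask for each incoming co-registered image.

    The baseline is the mean of the previous `window` images; a pixel is flagged
    as changed when it deviates from that baseline by more than `threshold`.
    """
    history = deque(maxlen=window)
    for image in image_stream:
        if history:
            baseline = np.stack(history).mean(axis=0)
            yield np.abs(image - baseline) > threshold
        else:
            yield np.zeros_like(image, dtype=bool)   # no baseline yet
        history.append(image)

# Usage with a synthetic stream of ten random 100x100 "scenes"
stream = (np.random.rand(100, 100) for _ in range(10))
for t, mask in enumerate(detect_changes(stream)):
    print(f"t={t}: {int(mask.sum())} changed pixels")
```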

  9. Investigating Climate Change Issues With Web-Based Geospatial Inquiry Activities

    Science.gov (United States)

    Dempsey, C.; Bodzin, A. M.; Sahagian, D. L.; Anastasio, D. J.; Peffer, T.; Cirucci, L.

    2011-12-01

    In the Environmental Literacy and Inquiry middle school Climate Change curriculum we focus on essential climate literacy principles, with an emphasis on weather and climate, Earth system energy balance, greenhouse gases, paleoclimatology, and how human activities influence climate change (http://www.ei.lehigh.edu/eli/cc/). It incorporates a related framework and set of design principles to guide the development of the geospatial technology-integrated Earth and environmental science curriculum materials. Students use virtual globes, Web-based tools including an interactive carbon calculator and geologic timeline, and inquiry-based lab activities to investigate climate change topics. The curriculum includes educative materials designed to promote and support teachers' learning of important climate change content and issues, geospatial pedagogical content knowledge, and geographic spatial thinking. It provides baseline instructional guidance for teachers as well as implementation and adaptation guidance for teaching diverse learners, including low-level readers, English language learners and students with disabilities. In the curriculum, students use geospatial technology tools including Google Earth with embedded spatial data to investigate global temperature changes, areas affected by climate change, evidence of climate change, and the effects of sea level rise on the existing landscape. We conducted a design-based research implementation study with urban middle school students. Findings showed that use of the Climate Change curriculum significantly improved urban middle school students' understanding of climate change concepts.

  10. Geospatial Technologies to Improve Urban Energy Efficiency

    Directory of Open Access Journals (Sweden)

    Bharanidharan Hemachandran

    2011-07-01

    The HEAT (Home Energy Assessment Technologies) pilot project is a free Geoweb mapping service, designed to empower the urban energy efficiency movement by allowing residents to visualize the amount and location of waste heat leaving their homes and communities as easily as clicking on their house in Google Maps. HEAT incorporates geospatial solutions for residential waste heat monitoring, using Geographic Object-Based Image Analysis (GEOBIA) and Canadian-built Thermal Airborne Broadband Imager technology (TABI-320), to provide users with timely, in-depth, easy-to-use, location-specific waste-heat information, as well as opportunities to save money and reduce their greenhouse gas emissions. We first report on the HEAT Phase I pilot project, which evaluates 368 residences in the Brentwood community of Calgary, Alberta, Canada, and describe the development and implementation of interactive waste heat maps, energy use models, a Hot Spot tool able to view the 6+ hottest locations on each home, and a new HEAT Score for inter-city waste heat comparisons. We then describe current challenges, lessons learned and new solutions as we begin Phase II and scale from 368 to 300,000+ homes with the newly developed TABI-1800. Specifically, we introduce a new object-based mosaicing strategy, an adaptation of Emissivity Modulation to correct for emissivity differences, and a new Thermal Urban Road Normalization (TURN) technique to correct for scene-wide microclimatic variation. We also describe a new Carbon Score and opportunities to update city cadastral errors with automatically defined thermal house objects.

  11. Organizational needs for managing and preserving geospatial data and related electronic records

    Directory of Open Access Journals (Sweden)

    R R Downs

    2006-01-01

    Government agencies and other organizations are required to manage and preserve records that they create and use to facilitate future access and reuse. The increasing use of geospatial data and related electronic records presents new challenges for these organizations, which have relied on traditional practices for managing and preserving records in printed form. This article reports on an investigation of current and future needs for managing and preserving geospatial electronic records on the part of local- and state-level organizations in the New York City metropolitan region. It introduces the study and describes organizational needs observed, including needs for organizational coordination and interorganizational cooperation throughout the entire data lifecycle.

  12. Effects of the visual-feedback-based force platform training with functional electric stimulation on the balance and prevention of falls in older adults: a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Zhen Li

    2018-01-01

    Background: Force platform training with functional electric stimulation aimed at improving balance may be effective in fall prevention for older adults. The aim of the study was to evaluate the effects of visual-feedback-based force platform balance training with functional electric stimulation on balance and fall prevention in older adults. Methods: A single-centre, unblinded, randomized controlled trial was conducted. One hundred and twenty older adults were randomly allocated to two groups: the control group (n = 60; one-leg standing balance exercise, 12 min/d) or the intervention group (n = 60; force platform training with functional electric stimulation, 12 min/d). The training was provided 15 days a month for 3 months by physical therapists. Medial–lateral and anterior–posterior maximal range of sway with eyes open and closed, the Berg Balance Scale, the Barthel Index, and the Falls Efficacy Scale-International were assessed at baseline and after the 3-month intervention. A fall diary was kept by each participant during the 6-month follow-up. Results: Compared with the control group, the intervention group showed significantly decreased (p < 0.01) medial–lateral and anterior–posterior maximal range of sway with eyes open and closed. There was significantly greater improvement in the Berg Balance Scale (p < 0.05), the Barthel Index (p < 0.05) and the Falls Efficacy Scale-International (p < 0.05), along with a significantly lower number of injurious fallers (p < 0.05), number of fallers (p < 0.05), and fall rate (p < 0.05) during the 6-month follow-up in the intervention group. Conclusion: This study showed that visual-feedback-based force platform training with functional electric stimulation improved balance and prevented falls in older adults.

  13. FOSS geospatial libraries in scientific workflow environments: experiences and directions

    CSIR Research Space (South Africa)

    McFerren, G

    2011-07-01

    In the context of three sets of research (wildfire research, flood modelling and the linking of disease outbreaks to multi-scale environmental conditions), we describe our efforts to provide geospatial capability for scientific workflow software...

  14. The VISPA Internet Platform for Students

    Science.gov (United States)

    Asseldonk, D. v.; Erdmann, M.; Fischer, R.; Glaser, C.; Müller, G.; Quast, T.; Rieger, M.; Urban, M.

    2016-04-01

    The VISPA internet platform enables users to remotely run Python scripts and view the resulting plots or inspect their output data. With a standard web browser as the only user requirement on the client side, the system is suitable for blended learning approaches for university physics students. VISPA was used in two consecutive years, each by approximately 100 third-year physics students at RWTH Aachen University, for their homework assignments. For example, in one exercise students gained a deeper understanding of Einstein's mass-energy relation by analyzing experimental data of electron-positron pairs revealing the J/ψ and Z particles. Because the students were free to choose their working hours, only a few users accessed the platform simultaneously. The positive feedback from students and the stability of the platform led to further development of the concept. This year, students accessed the platform in parallel while they analyzed data recorded by experiments demonstrated live in the lecture hall. The platform is based on experience in the development of professional analysis tools. It combines core technologies from previous projects: an object-oriented C++ library, a modular data-driven analysis flow, and visual analysis steering. We present the platform and discuss its benefits in the context of teaching, based on surveys that are conducted each semester.
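
    As a hedged sketch of the kind of homework analysis described, the code below reconstructs the invariant mass of electron-positron pairs, whose distribution peaks near the J/ψ and Z masses; the four-momenta are made up for illustration and are not the course data.

```python
import numpy as np

def invariant_mass(e1, px1, py1, pz1, e2, px2, py2, pz2):
    """Invariant mass of a two-particle system (natural units, c = 1)."""
    e, px, py, pz = e1 + e2, px1 + px2, py1 + py2, pz1 + pz2
    return np.sqrt(np.maximum(e**2 - px**2 - py**2 - pz**2, 0.0))

# Two illustrative back-to-back e+e- pairs (GeV), chosen so the masses land
# near the J/psi (~3.1 GeV) and Z (~91 GeV) resonances
pairs = np.array([
    [1.55, 0.0,  1.55, 0.0,   1.55, 0.0, -1.55, 0.0],
    [45.6, 0.0,  45.6, 0.0,   45.6, 0.0, -45.6, 0.0],
])
print(invariant_mass(*pairs.T))   # -> approximately [3.1, 91.2]
```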

  15. Species identification and other data collected from visual observation and other data from the COMMONWEALTH and other platforms in the North Pacific Ocean from 01 October 1950 to 01 August 1963 (NODC Accession 7200709)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Species identification and other data were collected using visual observation, net, and photograph from the COMMONWEALTH and other platforms in the North Pacific...

  16. Online Resources to Support Professional Development for Managing and Preserving Geospatial Data

    Science.gov (United States)

    Downs, R. R.; Chen, R. S.

    2013-12-01

    Improved capabilities of information and communication technologies (ICT) enable the development of new systems and applications for collecting, managing, disseminating, and using scientific data. New knowledge, skills, and techniques are also being developed to leverage these new ICT capabilities and improve scientific data management practices throughout the entire data lifecycle. In light of these developments and in response to increasing recognition of the wider value of scientific data for society, government agencies are requiring plans for the management, stewardship, and public dissemination of data and research products that are created by government-funded studies. Recognizing that data management and dissemination have not been part of traditional science education programs, new educational programs and learning resources are being developed to prepare new and practicing scientists, data scientists, data managers, and other data professionals with skills in data science and data management. Professional development and training programs also are being developed to address the need for scientists and professionals to improve their expertise in using the tools and techniques for managing and preserving scientific data. The Geospatial Data Preservation Resource Center offers an online catalog of various open access publications, open source tools, and freely available information for the management and stewardship of geospatial data and related resources, such as maps, GIS, and remote sensing data. Containing over 500 resources that can be found by type, topic, or search query, the geopreservation.org website enables discovery of various types of resources to improve capabilities for managing and preserving geospatial data. Applications and software tools can be found for use online or for download. Online journal articles, presentations, reports, blogs, and forums are also available through the website. Available education and training materials include

  17. Learning transfer of geospatial technologies in secondary science and mathematics core areas

    Science.gov (United States)

    Nielsen, Curtis P.

    The purpose of this study was to investigate the transfer of geospatial technology knowledge and skill presented in a social sciences course context to other core areas of the curriculum. Specifically, this study explored the transfer of geospatial technology knowledge and skill to the STEM-related core areas of science and mathematics among ninth-grade students. Haskell's (2001) research on "levels of transfer" provided the theoretical framework for this study, which sought to demonstrate the experimental group's higher ability to transfer geospatial skills, higher mean assignment scores, higher post-test scores, higher geospatial skill application and deeper levels of transfer application than the control group. The participants of the study consisted of thirty ninth-graders enrolled in U.S. History, Earth Science and Integrated Mathematics 1 courses. The primary investigator of this study had no previous classroom experiences with this group of students. The participants who were enrolled in the school's existing two-section class configuration were assigned to experimental and control groups. The experimental group had ready access to Macintosh MacBook laptop computers, and the control group had ready access to Macintosh iPads. All participants in U.S. History received instruction with and were required to use ArcGIS Explorer Online during a Westward Expansion project. All participants were given the ArcGIS Explorer Online content assessment following the completion of the U.S. History project. Once the project in U.S. History was completed, Earth Science and Integrated Mathematics 1 began units of instruction beginning with a multiple-choice content pre-test created by the classroom teachers. Experimental participants received the same unit of instruction without the use or influence of ArcGIS Explorer Online. At the end of the Earth Science and Integrated Math 1 units, the same multiple-choice test was administered as the content post-test. Following the

  18. Visualizer: 3D Gridded Data Visualization Software for Geoscience Education and Research

    Science.gov (United States)

    Harwood, C.; Billen, M. I.; Kreylos, O.; Jadamec, M.; Sumner, D. Y.; Kellogg, L. H.; Hamann, B.

    2008-12-01

    In both research and education learning is an interactive and iterative process of exploring and analyzing data or model results. However, visualization software often presents challenges on the path to learning because it assumes the user already knows the locations and types of features of interest, instead of enabling flexible and intuitive examination of results. We present examples of research and teaching using the software, Visualizer, specifically designed to create an effective and intuitive environment for interactive, scientific analysis of 3D gridded data. Visualizer runs in a range of 3D virtual reality environments (e.g., GeoWall, ImmersaDesk, or CAVE), but also provides a similar level of real-time interactivity on a desktop computer. When using Visualizer in a 3D-enabled environment, the software allows the user to interact with the data images as real objects, grabbing, rotating or walking around the data to gain insight and perspective. On the desktop, simple features, such as a set of cross-bars marking the plane of the screen, provide extra 3D spatial cues that allow the user to more quickly understand geometric relationships within the data. This platform portability allows the user to more easily integrate research results into classroom demonstrations and exercises, while the interactivity provides an engaging environment for self-directed and inquiry-based learning by students. Visualizer software is freely available for download (www.keckcaves.org) and runs on Mac OSX and Linux platforms.

  19. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    Science.gov (United States)

    Kulo, Violet; Bodzin, Alec

    2013-02-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.

  20. Development of an environment for 3D visualization of riser dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Bernardes Junior, Joao Luiz; Martins, Clovis de Arruda [Universidade de Sao Paulo (USP), SP (Brazil). Escola Politecnica]. E-mails: joao.bernardes@poli.usp.br; cmartins@usp.br

    2006-07-01

    This paper describes the merging of Virtual Reality and Scientific Visualization techniques in the development of Riser View, a multi-platform 3D environment for real-time, interactive visualization of riser dynamics. Its features, architecture, and unusual collision detection algorithm are discussed, along with how it was customized for the project. Using OpenGL through VRK, the software is able to make use of the resources available in most modern graphics acceleration hardware to improve performance. IUP/LED allows for a native look-and-feel on MS-Windows or Linux platforms. The paper discusses conflicts that arise between scientific visualization and aspects such as realism and immersion, and how the visualization is prioritized. (author)

  1. Information-computational platform for collaborative multidisciplinary investigations of regional climatic changes and their impacts

    Science.gov (United States)

    Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts by researchers. To do this in a timely and reliable way, a modern information-computational infrastructure supporting integrated studies in the field of environmental sciences is needed. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern web 2.0 approach, GIS functionality and capabilities to run climate and meteorological models, process large geophysical datasets and support relevant analysis. It also supports joint software development by distributed research groups, and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the integrated WRF and «Planet Simulator» models, preprocessing of modeling results and visualization are also provided. All functions of the platform are accessible to users through a web portal using a common graphical web browser, in the form of an interactive graphical user interface which provides, in particular, capabilities for selecting a geographical region of interest (pan and zoom), manipulating data layers (order, enable/disable, feature extraction) and visualizing results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary research efforts. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic and satellite monitoring datasets through
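
    A hedged sketch of the kind of processing and visualization module described is shown below, computing an annual regional-mean temperature series from a NetCDF file with xarray; the file name, variable name, region bounds and coordinate conventions (dimensions named time/lat/lon, latitude stored in decreasing order) are assumptions, not the platform's actual modules.

```python
import xarray as xr
import matplotlib.pyplot as plt

# Hypothetical reanalysis or model-output file with a 2 m temperature field "t2m"
ds = xr.open_dataset("reanalysis.nc")

# Select a regional window and average spatially (lat slice assumes descending latitudes)
region = ds["t2m"].sel(lat=slice(70, 50), lon=slice(60, 120))
annual = region.mean(dim=["lat", "lon"]).resample(time="YS").mean()

# Simple trend-style visualization of the annual means
annual.plot(marker="o")
plt.title("Regional mean 2 m temperature, annual means")
plt.savefig("t2m_trend.png", dpi=150)
```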

  2. Geospatial environmental data modelling applications using remote sensing, GIS and spatial statistics

    Energy Technology Data Exchange (ETDEWEB)

    Siljander, M.

    2010-07-01

    This thesis presents novel modelling applications for environmental geospatial data using remote sensing, GIS and statistical modelling techniques. The studies can be grouped into four main themes: (i) developing advanced geospatial databases: Paper (I) demonstrates the creation of a geospatial database for the Glanville fritillary butterfly (Melitaea cinxia) in the Aaland Islands, south-western Finland; (ii) analysing species diversity and distribution using GIS techniques: Paper (II) presents a diversity and geographical distribution analysis for Scopulini moths at a world-wide scale; (iii) studying spatiotemporal forest cover change: Paper (III) presents a study of exotic and indigenous tree cover change detection in the Taita Hills, Kenya, using airborne imagery and GIS analysis techniques; and (iv) exploring predictive modelling techniques using geospatial data: in Paper (IV) human population occurrence and abundance in the Taita Hills highlands was predicted using the generalized additive modelling (GAM) technique, Paper (V) presents techniques to enhance fire prediction and burned area estimation at a regional scale in East Caprivi, Namibia, and Paper (VI) compares eight state-of-the-art predictive modelling methods to improve fire prediction, burned area estimation and fire risk mapping in East Caprivi, Namibia. The results in Paper (I) showed that geospatial data can be managed effectively using advanced relational database management systems. Metapopulation data for the Melitaea cinxia butterfly were successfully combined with GPS-delimited habitat patch information and climatic data. Using the geospatial database, spatial analyses were successfully conducted at the habitat patch level or at coarser analysis scales. Moreover, this study showed that, at a large scale, spatially correlated weather conditions are one of the primary causes of spatially correlated changes in Melitaea cinxia population sizes. In Paper (II) spatiotemporal characteristics
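
    The sketch below is a hedged illustration of the generalized additive modelling (GAM) step used in Paper (IV), here with the pygam library and synthetic predictors; the thesis' actual predictors, data, and software are not reproduced.

```python
import numpy as np
from pygam import PoissonGAM, s

rng = np.random.default_rng(0)
n = 500
elevation = rng.uniform(600, 2200, n)    # synthetic elevation predictor (m a.s.l.)
rainfall = rng.uniform(500, 1400, n)     # synthetic rainfall predictor (mm/yr)
X = np.column_stack([elevation, rainfall])

# Synthetic population counts with smooth, nonlinear responses to both predictors
rate = np.exp(2.0 - ((elevation - 1400) / 600) ** 2 + 0.001 * rainfall)
y = rng.poisson(rate)

# One smooth term per predictor; Poisson link suits count/abundance data
gam = PoissonGAM(s(0) + s(1)).fit(X, y)
gam.summary()

pred = gam.predict(X)
print("predicted vs observed total abundance:", round(pred.sum()), y.sum())
```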

  3. Study on integrated design and analysis platform of NPP

    International Nuclear Information System (INIS)

    Lu Dongsen; Gao Zuying; Zhou Zhiwei

    2001-01-01

    Many calculation software packages have been developed for nuclear system design and safety analysis, such as structural design software, fuel design and management software, thermal-hydraulic analysis software, and severe accident simulation software. This study integrates these packages into a single platform and develops visual modeling tools for Retran and NGFM90. The platform also provides a distributed calculation method for coupled calculations between different packages. The study will improve the design and analysis of NPPs.

  4. Geomorphic Regionalization of Coastal Zone Using Geospatial Technology

    Directory of Open Access Journals (Sweden)

    Manoranjan Mishra

    2016-08-01

    The world's coastal environments are made up of diverse landforms and are potentially vulnerable to climate variability, delta sinking, extreme events and anthropogenic interference. Sustainable management of coastal resources and the transfer of quality ecosystem services to future generations are the goals of Integrated Coastal Zone Management (ICZM). Homogeneous geographic units are the basic implementation locus and backbone of this kind of integrated management strategy and its activities. However, coastal zone management projects in the developing world use arbitrary landward and seaward boundaries from a physical reference as the unit of management. These oversimplified fixed-distance approaches are not able to capture the spatial and temporal changes in coastal systems. The spatio-temporal variations of coastal systems are expressed in geomorphic landforms, which in turn result from the interaction between natural forces and anthropogenic inputs. The present work presents a simplified method for regionalizing geomorphic landforms using geospatial platforms, delineating the Orissa coast into smaller homogeneous geographic units as reference points for future management. Geomorphic landforms were reconstructed using Enhanced Thematic Mapper Plus (ETM+) imagery, Survey of India topographic maps, field surveys and Digital Elevation Model data within a geographic information system (GIS). Seventy geomorphic features covering an area of 5033.64 km2 were identified and further regionalized into five homogeneous geographic units. The need of the hour is to recognize unsustainable coastal systems in these homogeneous geographic units by fine-tuning development parameters while at the same time allowing coastal systems to adapt naturally to any kind of variability. Although the methodology was applied to the Orissa coast for delineating homogeneous geographic units, it can be replicated for any coast in the world.

  5. Improving the User Experience of Finding and Visualizing Oceanographic Data

    Science.gov (United States)

    Rauch, S.; Allison, M. D.; Groman, R. C.; Chandler, C. L.; Galvarino, C.; Gegg, S. R.; Kinkade, D.; Shepherd, A.; Wiebe, P. H.; Glover, D. M.

    2013-12-01

    Searching for and locating data of interest can be a challenge to researchers as increasing volumes of data are made available online through various data centers, repositories, and archives. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) is keenly aware of this challenge and, as a result, has implemented features and technologies aimed at improving data discovery and enhancing the user experience. BCO-DMO was created in 2006 to manage and publish data from research projects funded by the Division of Ocean Sciences (OCE) Biological and Chemical Oceanography Sections and the Division of Polar Programs (PLR) Antarctic Sciences Organisms and Ecosystems Program (ANT) of the US National Science Foundation (NSF). The BCO-DMO text-based and geospatial-based data access systems provide users with tools to search, filter, and visualize data in order to efficiently find data of interest. The geospatial interface, developed using a suite of open-source software (including MapServer [1], OpenLayers [2], ExtJS [3], and MySQL [4]), allows users to search and filter/subset metadata based on program, project, or deployment, or by using a simple word search. The map responds based on user selections, presents options that allow the user to choose specific data parameters (e.g., a species or an individual drifter), and presents further options for visualizing those data on the map or in "quick-view" plots. The data managed and made available by BCO-DMO are very heterogeneous in nature, from in-situ biogeochemical, ecological, and physical data, to controlled laboratory experiments. Due to the heterogeneity of the data types, a 'one size fits all' approach to visualization cannot be applied. Datasets are visualized in a way that will best allow users to assess fitness for purpose. An advanced geospatial interface, which contains a semantically-enabled faceted search [5], is also available. These search facets are highly interactive and responsive, allowing

  6. Prototype development of a web-based participative decision support platform in risk management

    Science.gov (United States)

    Aye, Zar Chi; Olyazadeh, Roya; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    This paper discusses the proposed background architecture and prototype development of an internet-based decision support system (DSS) in the field of natural hazards and risk management, using open-source geospatial software and web technologies. It is based on a three-tier client-server architecture supported by the Boundless (OpenGeo) framework and its client-side SDK application environment, using customized gxp components and data utility classes. The main purpose of the system is to systematically integrate the risk management workflow, with the involvement of diverse stakeholders from different organizations dealing with natural hazards and risk, for the evaluation of management measures through an active online participation approach. It aims to develop an adaptive, user-friendly, web-based environment that allows users to set up risk management strategies based on the actual context and data, by integrating web-GIS and DSS functionality with process flow and other visualization tools. A web-GIS interface has been integrated within the DSS to deliver maps and provide certain geoprocessing capabilities on the web, which can be easily accessed and shared by the different organizations located in the case study sites of the project. This platform could be envisaged not only as a common web-based platform for the centralized sharing of data such as hazard maps, elements-at-risk maps and additional information, but also as an integrated risk management platform where users can upload data, analyze risk and identify possible alternative scenarios for risk reduction, especially for floods and landslides, either quantitatively or qualitatively depending on the risk information provided by the stakeholders in the case study regions. The level of involvement, access to and interaction with the provided functionality of the system varies depending on the roles and responsibilities of the stakeholders; for example, only the experts (planners, geological

  7. Parallel Agent-as-a-Service (P-AaaS Based Geospatial Service in the Cloud

    Directory of Open Access Journals (Sweden)

    Xicheng Tan

    2017-04-01

    To optimize the efficiency of the geospatial service in a flood response decision making system, a Parallel Agent-as-a-Service (P-AaaS) method is proposed and implemented in the cloud. The prototype system and comparisons demonstrate the advantages of our approach over existing methods. The P-AaaS method includes both a parallel architecture and a mechanism for adjusting the computational resources: the parallel geocomputing mechanism of the P-AaaS method used to execute a geospatial service and the execution algorithm of the P-AaaS based geospatial service chain, respectively. The P-AaaS based method has the following merits: (1) it inherits the advantages of the AaaS-based method (i.e., avoiding transfer of large volumes of remote sensing data or raster terrain data, agent migration, and intelligent conversion into services to improve domain expert collaboration); (2) it improves on the low performance and limited concurrent geoprocessing capability of the AaaS-based method, which is critical for special applications (e.g., highly concurrent applications and emergency response applications); and (3) it adjusts the computing resources dynamically according to the number and the performance requirements of concurrent requests, which allows the geospatial service chain to support a large number of concurrent requests by scaling up the cloud-based clusters in use and optimizes computing resources and costs by reducing the number of virtual machines (VMs) when the number of requests decreases.
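
    A much-simplified, hedged sketch of the resource-adjustment idea is given below: the worker pool is sized from the number of queued geoprocessing requests. In the real system this corresponds to scaling cloud VMs rather than local processes, and the request handler is only a stand-in.

```python
from concurrent.futures import ProcessPoolExecutor
import math

MAX_WORKERS = 16
REQUESTS_PER_WORKER = 4

def run_geoprocess(request_id):
    # Stand-in for a real geospatial service step (e.g., flood-extent extraction)
    return request_id, sum(i * i for i in range(100_000))

def serve(pending_requests):
    # More concurrent requests -> more workers (up to a cap); fewer requests -> fewer workers,
    # mirroring the dynamic scale-up/scale-down of cloud resources described above
    n_workers = min(MAX_WORKERS, max(1, math.ceil(len(pending_requests) / REQUESTS_PER_WORKER)))
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(run_geoprocess, pending_requests))

if __name__ == "__main__":
    results = serve(range(40))
    print(len(results), "requests served")
```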

  8. Developing an Interactive Data Visualization Tool to Assess the Impact of Decision Support on Clinical Operations.

    Science.gov (United States)

    Huber, Timothy C; Krishnaraj, Arun; Monaghan, Dayna; Gaskin, Cree M

    2018-05-18

    Due to mandates from recent legislation, clinical decision support (CDS) software is being adopted by radiology practices across the country. This software provides imaging study decision support for referring providers at the point of order entry. CDS systems produce a large volume of data, providing opportunities for research and quality improvement. In order to better visualize and analyze trends in this data, an interactive data visualization dashboard was created using a commercially available data visualization platform. Following the integration of a commercially available clinical decision support product into the electronic health record, a dashboard was created using a commercially available data visualization platform (Tableau, Seattle, WA). Data generated by the CDS were exported from the data warehouse, where they were stored, into the platform. This allowed for real-time visualization of the data generated by the decision support software. The creation of the dashboard allowed the output from the CDS platform to be more easily analyzed and facilitated hypothesis generation. Integrating data visualization tools into clinical decision support tools allows for easier data analysis and can streamline research and quality improvement efforts.

  9. Geospatial Image Mining For Nuclear Proliferation Detection: Challenges and New Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Vatsavai, Raju [ORNL; Bhaduri, Budhendra L [ORNL; Cheriyadat, Anil M [ORNL; Arrowood, Lloyd [Y-12 National Security Complex; Bright, Eddie A [ORNL; Gleason, Shaun Scott [ORNL; Diegert, Carl [Sandia National Laboratories (SNL); Katsaggelos, Aggelos K [ORNL; Pappas, Thrasos N [ORNL; Porter, Reid [Los Alamos National Laboratory (LANL); Bollinger, Jim [Savannah River National Laboratory (SRNL); Chen, Barry [Lawrence Livermore National Laboratory (LLNL); Hohimer, Ryan [Pacific Northwest National Laboratory (PNNL)

    2010-01-01

    With the increasing understanding and availability of nuclear technologies, and the increasing pursuit of nuclear technologies by several new countries, it is becoming increasingly important to monitor nuclear proliferation activities. There is a great need to develop technologies that automatically or semi-automatically detect nuclear proliferation activities using remote sensing. Images acquired from earth observation satellites are an important source of information for detecting proliferation activities. High-resolution remote sensing images are highly useful in verifying the correctness, as well as the completeness, of any nuclear program. DOE national laboratories are interested in detecting nuclear proliferation by developing advanced geospatial image mining algorithms. In this paper we describe the current understanding of geospatial image mining techniques, enumerate key gaps, and identify future research needs in the context of nuclear proliferation.

  10. Leveraging geospatial data, technology, and methods for improving the health of communities: priorities and strategies from an expert panel convened by the CDC.

    Science.gov (United States)

    Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L

    2010-04-01

    In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training and education; and, engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur either in parallel, or occur multiple times across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C of the action items plan concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.

  11. Contextualizing Cave Maps as Geospatial Information: Case Study of Indonesia

    Science.gov (United States)

    Reinhart, H.

    2017-12-01

    Caves are the result of solution processes. Because they are formed by geochemical and tectonic activity, they can be considered geosphere phenomena. As geosphere phenomena, particularly of karst landforms, caves have spatial dimensions and aspects. Cave utilization and development are increasing in many sectors such as hydrology, earth science, and the tourism industry. However, the spatial aspects of caves receive little attention, due to the lack of recognition of cave maps. Many stakeholders do not know the significance and importance of cave maps in guiding the development of a cave; a lack of information can be considered the cause. Therefore, it is necessary to put cave maps into the right context in order to make stakeholders realize their significance. Cave maps would then be officially regarded as tools related to policy, development, and the conservation of caves, and their use and application would be regulated. This paper aims to contextualize cave maps with respect to the relevant legal act, Act Number 4 of 2011 on Geospatial Information. The contextualization is done by scrutinizing every article and clause related to cave maps and identifying the contextual elements in them. The results show that cave maps can be regarded as geospatial information and classified as thematic geospatial information. Their use can be regulated through Act Number 4 of 2011. The regulations cover data acquisition, databases, authorities, surveyors, and the obligation to provide cave maps when planning a cave's development and its surrounding environment.

  12. Geo-Spatial Tactical Decision Aid Systems: Fuzzy Logic for Supporting Decision Making

    National Research Council Canada - National Science Library

    Grasso, Raffaele; Giannecchini, Simone

    2006-01-01

    .... This paper describes a tactical decision aid system based on fuzzy logic reasoning for data fusion and on current Open Geospatial Consortium specifications for interoperability, data dissemination...

  13. Geospatial Web Services in Real Estate Information System

    Science.gov (United States)

    Radulovic, Aleksandra; Sladic, Dubravka; Govedarica, Miro; Popovic, Dragana; Radovic, Jovana

    2017-12-01

    Since cadastral records are of great importance for the economic development of the country, they must be well structured and organized. Real estate records in Serbia faced many problems in previous years. To prevent these problems and to achieve efficient access, sharing and exchange of cadastral data on the principles of interoperability, a domain model for real estate was created according to current standards in the field of spatial data. The resulting profile of the domain model for the Serbian real estate cadastre is based on current legislation and on the Land Administration Domain Model (LADM), which is specified in the ISO 19152 standard. On top of such organized data, and for their effective exchange, it is necessary to develop a model of the services that must be provided by the institutions interested in the exchange of cadastral data. This is achieved by introducing a service-oriented architecture into the real estate cadastre information system, which ensures the efficiency of the system. It is necessary to develop user services for downloading, reviewing and using real estate data through the web. These services should be provided to all users who need access to cadastral data (natural and legal persons as well as state institutions) through e-government. It is also necessary to provide search, viewing and download of cadastral spatial data by specifying geospatial services. Considering that real estate records contain geometric data for parcels and buildings, it is necessary to establish a set of geospatial services that provide information and maps for the analysis of spatial data and for forming raster data. Besides the Cadastral Parcels theme, the INSPIRE directive specifies several themes that involve data on buildings and land use, for which data can be provided from the real estate cadastre. In this paper, a model of geospatial services in Serbia is defined. A case study of using these services to estimate which household is at risk of

  14. Physically Based Rendering in the Nightshade NG Visualization Platform

    Science.gov (United States)

    Berglund, Karrie; Larey-Williams, Trystan; Spearman, Rob; Bogard, Arthur

    2015-01-01

    This poster describes our work on creating a physically based rendering model in Nightshade NG planetarium simulation and visualization software (project website: NightshadeSoftware.org). We discuss techniques used for rendering realistic scenes in the universe and dealing with astronomical distances in real time on consumer hardware. We also discuss some of the challenges of rewriting the software from scratch, a project which began in 2011.Nightshade NG can be a powerful tool for sharing data and visualizations. The desktop version of the software is free for anyone to download, use, and modify; it runs on Windows and Linux (and eventually Mac). If you are looking to disseminate your data or models, please stop by to discuss how we can work together.Nightshade software is used in literally hundreds of digital planetarium systems worldwide. Countless teachers and astronomy education groups run the software on flat screens. This wide use makes Nightshade an effective tool for dissemination to educators and the public.Nightshade NG is an especially powerful visualization tool when projected on a dome. We invite everyone to enter our inflatable dome in the exhibit hall to see this software in a 3D environment.

  15. Paper-based Platform for Urinary Creatinine Detection.

    Science.gov (United States)

    Sittiwong, Jarinya; Unob, Fuangfa

    2016-01-01

    A new paper platform was developed for the colorimetric detection of creatinine. Filter paper was coated with 3-propylsulfonic acid trimethoxysilane and used as the platform. Creatinine in its cationic form was extracted onto the paper via an ion-exchange mechanism and detected through the Jaffé reaction, resulting in a yellow-orange colored complex. The color change on the paper could be observed visually, and quantitative detection of creatinine was achieved by monitoring the change in color intensity. Plotting the color intensity of the creatinine complexes on the paper platform as a function of creatinine concentration gave a linear range of 10 - 60 mg L(-1) and a detection limit of 4.2 mg L(-1). The accuracy of the proposed paper-based method was comparable to that of the conventional standard Jaffé method. This paper platform could be applied for simple and rapid detection of creatinine in human urine samples with low reagent consumption.

  16. Combining photorealistic immersive geovisualization and high-resolution geospatial data to enhance human-scale viewshed modelling

    Science.gov (United States)

    Tabrizian, P.; Petrasova, A.; Baran, P.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.

    2017-12-01

    Viewshed modelling, the process of defining, parsing and analysing the structure of landscape visual space within GIS, has been commonly used in applications ranging from landscape planning and ecosystem services assessment to geography and archaeology. However, less effort has been made to understand whether, and to what extent, these objective analyses predict the actual on-the-ground perception of a human observer. Moreover, viewshed modelling at the human scale requires the incorporation of fine-grained landscape structure (e.g., vegetation) and patterns (e.g., landcover) that are typically omitted from visibility calculations or unrealistically simulated, leading to significant error in predicting visual attributes. This poster illustrates how photorealistic Immersive Virtual Environments and high-resolution geospatial data can be used to integrate objective and subjective assessments of visual characteristics at the human scale. We performed viewshed modelling for a systematically sampled set of viewpoints (N=340) across an urban park using open-source GIS (GRASS GIS). For each point a binary viewshed was computed on a 3D surface model derived from high-density leaf-off LIDAR (QL2) points. The viewshed map was combined with high-resolution (0.5 m) landcover derived through fusion of orthoimagery, lidar vegetation, and vector data. Geostatistics and landscape structure analysis were performed to compute topological and compositional metrics for visual scale (e.g., openness), complexity (pattern, shape and object diversity), and naturalness. Based on the viewshed model output, a sample of 24 viewpoints representing the variation of visual characteristics was selected and geolocated. For each location, 360° imagery was captured using a DSLR camera mounted on a GigaPan robot. We programmed a virtual reality application through which human subjects (N=100) immersively experienced a random representation of selected environments via a head-mounted display (Oculus Rift CV1), and
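    The binary viewshed step described above can be reproduced in principle with the r.viewshed module of GRASS GIS. The sketch below assumes an active GRASS session and an existing lidar-derived DSM raster named dsm_leaf_off; the raster name, viewpoint coordinates and parameter values are illustrative, not those of the study.

        import grass.script as gs

        def binary_viewshed(x, y, observer_height=1.7, max_dist=500.0, output="viewshed_pt"):
            """Compute a 0/1 visibility raster around a viewpoint at (x, y)."""
            gs.run_command(
                "r.viewshed",
                input="dsm_leaf_off",            # 3D surface model from leaf-off lidar (assumed name)
                output=output,                   # binary viewshed raster
                coordinates=(x, y),              # viewpoint location in map units
                observer_elevation=observer_height,
                max_distance=max_dist,
                flags="b",                       # boolean output: visible = 1, invisible = 0
                overwrite=True,
            )
            return output

        # Example call for one sampled viewpoint (placeholder coordinates)
        binary_viewshed(642300.0, 3962150.0)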

  17. The Value of Information and Geospatial Technologies for the analysis of tidal current patterns in the Guanabara Bay (Rio de Janeiro)

    Science.gov (United States)

    Isotta Cristofori, Elena; Demarchi, Alessandro; Facello, Anna; Cámaro, Walther; Hermosilla, Fernando; López, Jaime

    2016-04-01

    The study and validation of tidal current patterns relies on the combination of several data sources such as numerical weather prediction models, hydrodynamic models, weather stations, current drifters and remote sensing observations. Assessing the accuracy and reliability of the produced patterns and communicating the results, including an easy-to-understand visualization of the data, is crucial for a variety of stakeholders, including decision-makers. The wide diffusion of geospatial equipment such as GPS, current drifters and aerial photogrammetry makes it possible to collect data in the field using mobile and portable devices with relatively limited effort in terms of time and economic resources. These real-time measurements are essential in order to validate the models and, specifically, to assess the skill of the model during critical environmental conditions. Moreover, the considerable development of remote sensing technologies, cartographic services and GPS applications has enabled the creation of Geographic Information Systems (GIS) capable of storing, analyzing, managing and integrating spatial or geographical information with hydro-meteorological data. This valuable contribution of information and geospatial technologies can benefit many decision-makers, including high-level sport athletes. While the numerical approach commonly used to validate models with in-situ data is familiar to scientific users, high-level sport users are not familiar with numerical representations of data. Therefore the integration of data collected in the field into a GIS allows an immediate visualization of the performed analysis on geographic maps. This visualization represents a particularly effective way to communicate current pattern assessment results and the uncertainty in the information, leading to an increased level of confidence in the forecast. The aim of this paper is to present the methodology set up, in collaboration with the Austrian Sailing Federation, for the study of

  18. Geospatial Associations Between Tobacco Retail Outlets and Current Use of Cigarettes and e-Cigarettes among Youths in Texas.

    Science.gov (United States)

    Pérez, Adriana; Chien, Lung-Chang; Harrell, Melissa B; Pasch, Keryn E; Obinwa, Udoka C; Perry, Cheryl L

    2017-10-01

    To identify the geospatial association between the presence of tobacco retail outlets (TRO) in schools' neighborhoods and current use of cigarettes and e-cigarettes among adolescents in four counties in Texas. Students in grades 6, 8 and 10 were surveyed in their schools in 2014-2015. The schools' addresses were geocoded to determine the presence of at least one TRO within half a mile of the school. Two outcomes were considered: past 30-day use of (a) cigarettes and (b) e-cigarettes. Bayesian structured additive regression models and kriging methods were used to estimate the geospatial associations between the presence of TRO and use in three counties: Dallas/Tarrant, Harris, and Travis. We observed a geospatial association between the presence of TRO around schools and current use of cigarettes in the eastern area of Dallas County and in the southeastern area of Harris County. A geospatial association between the presence of TRO around schools and current use of e-cigarettes was also observed across Tarrant County and in the northeastern area of Harris County. There were geospatial associations between the presence of TRO around some schools and cigarette/e-cigarette use among students, but this association was not consistent across all the counties. More research is needed to determine why some areas are at higher risk for this association.

  19. Geo-Spatial Social Network Analysis of Social Media to Mitigate Disasters

    Science.gov (United States)

    Carley, K. M.

    2017-12-01

    Understanding the spatial layout of human activity can afford a better understanding of many phenomena, such as local culture, the spread of ideas, and the scope of a disaster. Today, social media is one of the key sensors for acquiring information on socio-cultural activity, some of it with cues as to geo-location. We ask: what can be learned by putting such data on maps? For example, are people who chat online more likely to be near each other? Can Twitter data support disaster planning or early warning? In this talk, such issues are examined using data collected via Twitter and analyzed using ORA. ORA is a network analysis and visualization system. It supports not just social networks (who is interacting with whom), but also high-dimensional networks with many types of nodes (e.g. people, organizations, resources, activities …) and relations, geo-spatial network analysis, dynamic network analysis, and geo-temporal analysis. Using ORA, lessons learned from five case studies are considered: the Arab Spring, tsunami warning in Padang, Indonesia, Twitter around Fukushima in Japan, Typhoon Haiyan (Yolanda), and regional conflict. Using the Padang, Indonesia data, we characterize the strengths and limitations of social media data to support disaster planning and early warning, identify at-risk areas and issues of concern, and estimate where people are and which areas are impacted. Using the Fukushima, Japan data, social media is used to estimate geo-spatial regularities in movement and communication that can inform disaster response and risk estimation. Using the Arab Spring data, we find that the spread of bots and extremists varies by country and time, to the extent that using Twitter to understand who is important or which ideas are critical can be compromised. Bots and extremists can exploit disaster messaging to create havoc and facilitate criminal activity, e.g. human trafficking. Event discovery mechanisms that support isolating the geo-epicenters of key events therefore become crucial. Spatial inference

  20. Geospatial-temporal semantic graph representations of trajectories from remote sensing and geolocation data

    Science.gov (United States)

    Perkins, David Nikolaus; Brost, Randolph; Ray, Lawrence P.

    2017-08-08

    Various technologies for facilitating analysis of large remote sensing and geolocation datasets to identify features of interest are described herein. A search query can be submitted to a computing system that executes searches over a geospatial-temporal semantic (GTS) graph to identify features of interest. The GTS graph comprises nodes corresponding to objects described in the remote sensing and geolocation datasets, and edges that indicate geospatial or temporal relationships between pairs of those nodes. Trajectory information is encoded in the GTS graph by the inclusion of movable nodes to facilitate searches for features of interest in the datasets relative to moving objects such as vehicles.
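    As a rough illustration of the data structure described in this record (not the patented implementation), a geospatial-temporal semantic graph can be sketched with networkx: nodes carry location or track attributes, edges carry a relationship type, and a movable node models a vehicle trajectory. All identifiers, coordinates and timestamps below are invented.

        import networkx as nx

        G = nx.MultiDiGraph()

        # Static object nodes extracted from remote sensing data
        G.add_node("building_17", kind="building", lon=-106.62, lat=35.08)
        G.add_node("road_A", kind="road")

        # A movable node representing a tracked vehicle, with a time-stamped track
        G.add_node("vehicle_42", kind="vehicle",
                   track=[("2016-05-01T10:00Z", -106.63, 35.07),
                          ("2016-05-01T10:05Z", -106.62, 35.08)])

        # Geospatial and temporal relationship edges
        G.add_edge("building_17", "road_A", relation="adjacent_to")
        G.add_edge("vehicle_42", "building_17", relation="near", time="2016-05-01T10:05Z")

        # A toy query: objects the vehicle was near at any time
        near = [v for _, v, d in G.out_edges("vehicle_42", data=True) if d["relation"] == "near"]
        print(near)   # ['building_17']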

  1. Preparing Preservice Teachers to Incorporate Geospatial Technologies in Geography Teaching

    Science.gov (United States)

    Harte, Wendy

    2017-01-01

    This study evaluated the efficacy of geospatial technology (GT) learning experiences in two geography curriculum courses to determine their effectiveness for developing preservice teacher confidence and preparing preservice teachers to incorporate GT in their teaching practices. Surveys were used to collect data from preservice teachers at three…

  2. Geospatial Analysis of Renewable Energy Technical Potential on Tribal Lands

    Energy Technology Data Exchange (ETDEWEB)

    Doris, E.; Lopez, A.; Beckley, D.

    2013-02-01

    This technical report uses an established geospatial methodology to estimate the technical potential for renewable energy on tribal lands, for the purpose of allowing Tribes to prioritize the development of renewable energy resources either for community-scale use on tribal land or for revenue-generating electricity sales.

  3. Shared Geospatial Metadata Repository for Ontario University Libraries: Collaborative Approaches

    Science.gov (United States)

    Forward, Erin; Leahey, Amber; Trimble, Leanne

    2015-01-01

    Successfully providing access to special collections of digital geospatial data in academic libraries relies upon complete and accurate metadata. Creating and maintaining metadata using specialized standards is a formidable challenge for libraries. The Ontario Council of University Libraries' Scholars GeoPortal project, which created a shared…

  4. A WEB-BASED PLATFORM FOR VISUALIZING SPATIOTEMPORAL DYNAMICS OF BIG TAXI DATA

    Directory of Open Access Journals (Sweden)

    H. Xiong

    2017-09-01

    Full Text Available With more and more vehicles equipped with Global Positioning System (GPS) receivers, access to large-scale taxi trajectory data has become increasingly easy. Taxis are valuable sensors, and information associated with taxi trajectories can provide unprecedented insight into many aspects of city life. But analysing these data presents many challenges. Visualization of taxi data is an efficient way to represent its distributions and structures and to reveal hidden patterns in the data. However, most of the existing visualization systems have some shortcomings. On the one hand, passenger loading status and speed information cannot be expressed. On the other hand, a mono-visualization form limits the information presentation. In view of these problems, this paper designs and implements a visualization system in which we use colour and shape to indicate passenger loading status and speed information and integrate various forms of taxi visualization. The main work is as follows: 1. Pre-processing and storing the taxi data in a MongoDB database. 2. Visualization of hotspots for taxi pickup points: through the DBSCAN clustering algorithm, we cluster the extracted taxi passengers' pickup locations to produce passenger hotspots. 3. Visualizing the dynamics of taxi movement trajectories using interactive animation: we use a thinning algorithm to reduce the amount of data and design a preloading strategy to load the data smoothly. Colour and shape are used to visualize the taxi trajectory data.

  5. a Web-Based Platform for Visualizing Spatiotemporal Dynamics of Big Taxi Data

    Science.gov (United States)

    Xiong, H.; Chen, L.; Gui, Z.

    2017-09-01

    With more and more vehicles equipped with Global Positioning System (GPS) receivers, access to large-scale taxi trajectory data has become increasingly easy. Taxis are valuable sensors, and information associated with taxi trajectories can provide unprecedented insight into many aspects of city life. But analysing these data presents many challenges. Visualization of taxi data is an efficient way to represent its distributions and structures and to reveal hidden patterns in the data. However, most of the existing visualization systems have some shortcomings. On the one hand, passenger loading status and speed information cannot be expressed. On the other hand, a mono-visualization form limits the information presentation. In view of these problems, this paper designs and implements a visualization system in which we use colour and shape to indicate passenger loading status and speed information and integrate various forms of taxi visualization. The main work is as follows: 1. Pre-processing and storing the taxi data in a MongoDB database. 2. Visualization of hotspots for taxi pickup points: through the DBSCAN clustering algorithm, we cluster the extracted taxi passengers' pickup locations to produce passenger hotspots. 3. Visualizing the dynamics of taxi movement trajectories using interactive animation: we use a thinning algorithm to reduce the amount of data and design a preloading strategy to load the data smoothly. Colour and shape are used to visualize the taxi trajectory data.
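    A minimal sketch of the first two steps listed in these records, under assumed names: pickup points are read from a MongoDB collection with pymongo and clustered into hotspots with scikit-learn's DBSCAN. The connection string, collection schema and clustering parameters are placeholders, not those of the original system.

        import numpy as np
        from pymongo import MongoClient
        from sklearn.cluster import DBSCAN

        client = MongoClient("mongodb://localhost:27017")
        pickups = client["taxi"]["pickups"]   # documents like {"lon": ..., "lat": ..., "occupied": true}

        coords = np.array([[doc["lat"], doc["lon"]] for doc in pickups.find({"occupied": True})])

        # DBSCAN with a haversine metric expects radians; eps of ~200 m on the Earth's surface
        kms_per_radian = 6371.0
        db = DBSCAN(eps=0.2 / kms_per_radian, min_samples=20, metric="haversine")
        labels = db.fit_predict(np.radians(coords))

        n_hotspots = labels.max() + 1
        print(f"{n_hotspots} pickup hotspots found ({np.sum(labels == -1)} noise points)")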

  6. Large Scale Analysis of Geospatial Data with Dask and XArray

    Science.gov (United States)

    Zender, C. S.; Hamman, J.; Abernathey, R.; Evans, K. J.; Rocklin, M.; Zender, C. S.; Rocklin, M.

    2017-12-01

    The analysis of geospatial data with high-level languages has accelerated innovation and the impact of existing data resources. However, as datasets grow beyond single-machine memory, data structures within these high-level languages can become a bottleneck. New libraries like Dask and XArray resolve some of these scalability issues, providing interactive workflows that are both familiar to high-level-language researchers and able to scale out to much larger datasets. This broadens researchers' access to larger datasets on high-performance computers and, through interactive development, reduces time-to-insight when compared to traditional parallel programming techniques (MPI). This talk describes Dask, a distributed dynamic task scheduler; Dask.array, a multi-dimensional array that copies the popular NumPy interface; and XArray, a library that wraps NumPy/Dask.array with labeled and indexed axes, implementing the CF conventions. We discuss both the basic design of these libraries and how they change interactive analysis of geospatial data, as well as recent benefits and challenges of distributed computing on clusters of machines.
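    A short sketch of the workflow described in this talk, assuming a CF-compliant NetCDF file of gridded temperature; the file name, variable name and chunk size are placeholders.

        import xarray as xr

        # Opening with `chunks` backs the arrays with Dask, so nothing is loaded eagerly
        ds = xr.open_dataset("air_temperature.nc", chunks={"time": 365})

        # Label-based operations build a lazy task graph ...
        climatology = ds["t2m"].groupby("time.month").mean("time")

        # ... which only executes (serially or on a distributed cluster) when computed
        result = climatology.compute()
        print(result)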

  7. A software platform for continuum modeling of ion channels based on unstructured mesh

    International Nuclear Information System (INIS)

    Tu, B; Bai, S Y; Xie, Y; Zhang, L B; Lu, B Z; Chen, M X

    2014-01-01

    Most traditional continuum molecular modeling adopted finite difference or finite volume methods which were based on a structured mesh (grid). Unstructured meshes were only occasionally used, but an increasing number of applications are emerging in molecular simulations. To facilitate the continuum modeling of biomolecular systems based on unstructured meshes, we are developing a software platform with tools which are particularly beneficial to those approaches. This work describes the software system specifically for the simulation of a typical, complex molecular procedure: ion transport through a three-dimensional channel system that consists of a protein and a membrane. The platform contains three parts: a meshing tool chain for ion channel systems, a parallel finite element solver for the Poisson–Nernst–Planck equations describing the electrodiffusion process of ion transport, and a visualization program for continuum molecular modeling. The meshing tool chain in the platform, which consists of a set of mesh generation tools, is able to generate high-quality surface and volume meshes for ion channel systems. The parallel finite element solver in our platform is based on the parallel adaptive finite element package PHG which was developed by one of the authors [1]. As a featured component of the platform, a new visualization program, VCMM, has specifically been developed for continuum molecular modeling with an emphasis on providing useful facilities for unstructured mesh-based methods and for their output analysis and visualization. VCMM provides a graphic user interface and consists of three modules: a molecular module, a meshing module and a numerical module. A demonstration of the platform is provided with a study of two real proteins, the connexin 26 and hemolysin ion channels. (paper)

  8. SIMULTANEOUS VISUALIZATION OF DIFFERENT UTILITY NETWORKS FOR DISASTER MANAGEMENT

    Directory of Open Access Journals (Sweden)

    S. Semm

    2012-07-01

    Full Text Available Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and enforce Situational Awareness by presenting and representing relevant information. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific decision-making throughout the crisis. Since operators' attention span and working memory are limiting factors in the process of acquiring and interpreting information, the cartographic presentation has to support individuals in coordinating their activities and handling highly dynamic situations. The Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for coming to appropriate decisions. Utility networks are among the most complex and most needed systems within a city. The visualization of utility infrastructure in crisis situations is addressed in this paper. The paper provides a conceptual approach on how to simplify, aggregate, and visualize multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.

  9. Visualization of elongation measurements using an SER universal testing platform

    Czech Academy of Sciences Publication Activity Database

    Pivokonský, Radek; Filip, Petr; Zelenková, Jana

    2015-01-01

    Vol. 25, No. 1 (2015), pp. 1-8. ISSN 1430-6395. R&D Projects: GA ČR (CZ) GAP105/11/2342. Institutional support: RVO:67985874. Keywords: elongational viscosity * Universal Testing Platform (SER) * polymer melts * LDPE. Subject RIV: BK - Fluid Dynamics. Impact factor: 1.241, year: 2015

  10. Initial PDS4 Support for the Geospatial Data Abstraction Library (GDAL)

    Science.gov (United States)

    Hare, T. M.; Gaddis, L. R.

    2018-04-01

    We introduce initial support for PDS4 within the Geospatial Data Abstraction Library (GDAL). Both highlights and limitations are presented, as well as a short discussion on methods for supporting a GDAL-based workflow for PDS4 conversions.
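    A hedged example of the kind of conversion such a workflow enables, using GDAL's Python bindings and its PDS4 raster driver; the file names are placeholders and the available creation options depend on the installed GDAL version.

        from osgeo import gdal

        gdal.UseExceptions()

        src = gdal.Open("input_dem.tif")
        # The PDS4 driver writes an XML label alongside the image data
        gdal.Translate("output_pds4.xml", src, format="PDS4")
        src = None   # close the dataset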

  11. State of the art of parallel scientific visualization applications on PC clusters

    International Nuclear Information System (INIS)

    Juliachs, M.

    2004-01-01

    In this state of the art review of parallel scientific visualization applications on PC clusters, we deal with both surface and volume rendering approaches. We first analyze available PC cluster configurations and existing software components for parallel graphics rendering. CEA/DIF has been studying cluster visualization since 2001. This report is part of a study to set up a new visualization research platform. This platform, consisting of an eight-node PC cluster under Linux and a tiled display, was installed in collaboration with Versailles-Saint-Quentin University in August 2003. (author)

  12. Geospatial Techniques for Improved Water Management in Jordan

    Directory of Open Access Journals (Sweden)

    Jawad T. Al-Bakri

    2016-04-01

    Full Text Available This research presents a case from Jordan where geospatial techniques were utilized for irrigation water auditing. The work was based on assessing records of groundwater abstraction in relation to irrigated areas and estimated crop water consumption in three water basins: Yarmouk, Amman-Zarqa and Azraq. Mapping of irrigated areas and crop water requirements was carried out using Landsat 8 remote sensing data and daily weather records. The methodology was based on visual interpretation and unsupervised classification of the remote sensing data, supported by ground surveys. Net (NCWR) and gross (GCWR) crop water requirements were calculated by merging crop evapotranspiration (ETc), calculated from daily weather records, with maps of irrigated crops. Gross water requirements were compared with groundwater abstractions recorded at the farm level to assess the levels of abstraction in relation to the groundwater safe yield. Results showed that the irrigated area and GCWR were higher than the officially recorded cropped area and abstracted groundwater. The over-abstraction of groundwater was estimated to range from 144% to 360% of the safe yield in the three basins. Overlaying the maps of irrigation and groundwater wells enabled the Ministry of Water and Irrigation (MWI) to detect and uncover violations and illegal practices of irrigation, in the form of unlicensed wells, incorrect metering of pumped water and water conveyance over long distances. Results from the work were utilized at a high level of decision-making and changes to the water law were made, with remote sensing data being accredited for monitoring water resources in Jordan.
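    The core of the water-audit arithmetic described above can be illustrated with a small numerical sketch (all values are invented, not the paper's data): gross crop water requirement is derived from crop evapotranspiration and an assumed irrigation efficiency, then compared with recorded abstraction and the safe yield.

        ETc_mm = 850.0       # seasonal crop evapotranspiration (Kc * ET0), mm
        area_ha = 120.0      # mapped irrigated area, ha
        efficiency = 0.75    # assumed irrigation efficiency

        NCWR_m3 = ETc_mm / 1000.0 * area_ha * 10_000   # net requirement, m^3 (1 mm over 1 ha = 10 m^3)
        GCWR_m3 = NCWR_m3 / efficiency                 # gross requirement, m^3

        safe_yield_m3 = 450_000.0
        recorded_abstraction_m3 = 600_000.0

        print(f"GCWR                : {GCWR_m3:,.0f} m^3")
        print(f"Recorded abstraction: {recorded_abstraction_m3:,.0f} m^3")
        print(f"GCWR vs safe yield  : {100 * GCWR_m3 / safe_yield_m3:.0f}%")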

  13. Smart sensor-based geospatial architecture for dike monitoring

    Science.gov (United States)

    Herle, S.; Becker, R.; Blankenbach, J.

    2016-04-01

    Artificial hydraulic structures like dams or dikes used for water level regulation or flood prevention are continuously under the influence of the weather and variable river regimes. Ongoing monitoring and simulation are therefore crucial in order to determine their inner condition. Potentially life-threatening situations, in the extreme case a failure, must be counteracted by all available means. Nowadays, flood warning systems rely exclusively on water level forecasts without considering the state of the structure itself. Area-covering, continuous knowledge of the inner state, including time-dependent changes, increases the capability of recognizing and locating vulnerable spots for early treatment. In case of a predicted breach, the advance warning time for alerting affected citizens can be extended. Our approach is composed of smart sensors integrated in a service-oriented geospatial architecture to monitor and simulate artificial hydraulic structures continuously. The sensors observe the inner state of the construction, such as soil moisture or stress and deformation over time, but also various external influences like water levels or wind speed. They are interconnected in a distributed network architecture by a so-called sensor bus system based on lightweight protocols like Message Queue Telemetry Transport for Sensor Networks (MQTT-SN). These sensor data streams are transferred into an OGC Sensor Web Enablement (SWE) data structure providing high-level geo web services to end users. Bundled with 3rd-party geo web services (WMS etc.), powerful processing and simulation tools can be invoked using the Web Processing Service (WPS) standard. Results will be visualized in a geoportal allowing user access to all information.
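    As a rough sketch of the sensor bus idea, the snippet below publishes one soil moisture reading with paho-mqtt (1.x constructor style). The original system uses MQTT-SN on constrained nodes; plain MQTT stands in here for the gateway side, and the broker address, topic and payload layout are assumptions.

        import json
        import time
        import paho.mqtt.client as mqtt

        client = mqtt.Client(client_id="dike-node-07")   # paho-mqtt 1.x style constructor
        client.connect("broker.example.org", 1883)

        reading = {
            "sensor": "soil_moisture_07",
            "value": 0.31,               # volumetric water content
            "unit": "m3/m3",
            "timestamp": time.time(),
        }
        client.publish("dike/section3/soil_moisture", json.dumps(reading), qos=1)
        client.disconnect()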

  14. Geospatial Technology in Disease Mapping, E- Surveillance and Health Care for Rural Population in South India

    Science.gov (United States)

    Praveenkumar, B. A.; Suresh, K.; Nikhil, A.; Rohan, M.; Nikhila, B. S.; Rohit, C. K.; Srinivas, A.

    2014-11-01

    Providing healthcare to rural populations has been a challenge for medical service providers, especially in developing countries. For this to be effective, scalable and sustainable, certain strategic decisions have to be taken during the planning phase. Also, there is a big gap between the services available and the availability of doctors and medical resources in rural areas. The use of information technology can help close this gap to a good extent. In this paper, a mobile application has been developed to gather data from the field. A cloud-based interface has been developed to store the data in the cloud for effective usage and management. A decision tree based solution developed in this paper helps in diagnosing a patient based on their health parameters. Interactive geospatial maps have been developed to provide an effective data visualization facility. This will help both the user community and decision makers to carry out long-term strategic planning.
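    A hypothetical illustration of the decision-tree triage idea mentioned above, using scikit-learn; the features, labels and values are invented for the sketch and are not the paper's clinical model.

        from sklearn.tree import DecisionTreeClassifier

        # Health parameters per patient: [systolic_bp, fasting_glucose, body_temp_C]
        X = [[118, 90, 36.8], [150, 95, 37.0], [122, 160, 36.9], [145, 170, 38.4]]
        y = ["normal", "hypertension", "diabetes_risk", "refer_to_doctor"]

        clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

        new_patient = [[140, 155, 37.1]]
        print(clf.predict(new_patient))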

  15. Stethoscope: A platform for interactive visual analysis of query execution plans

    NARCIS (Netherlands)

    M.M. Gawade (Mrunal); M.L. Kersten (Martin)

    2012-01-01

    Searching for the performance bottleneck in an execution trace is an error-prone and time-consuming activity. Existing tools offer some comfort by providing a visual representation of the trace for analysis. In this paper we present the Stethoscope, an interactive visual tool to inspect and

  16. Stethoscope: a platform for interactive visual analysis of query execution plans

    NARCIS (Netherlands)

    Gawade, M.; Kersten, M.

    2012-01-01

    Searching for the performance bottleneck in an execution trace is an error-prone and time-consuming activity. Existing tools offer some comfort by providing a visual representation of the trace for analysis. In this paper we present the Stethoscope, an interactive visual tool to inspect and analyze

  17. Geospatial Data Repository. Sharing Data Across the Organization and Beyond

    National Research Council Canada - National Science Library

    Ruiz, Marilyn

    2001-01-01

    .... This short Technical Note discusses a five-part approach to creating a data repository that addresses the problems of the historical organizational framework for geospatial data. Fort Hood, Texas was the site used to develop the prototype. A report documenting the complete study will be available in late Spring 2001.

  18. Data Democracy and Decision Making: Enhancing the Use and Value of Geospatial Data and Scientific Information

    Science.gov (United States)

    Shapiro, C. D.

    2014-12-01

    Data democracy is a concept that has great relevance to the use and value of geospatial data and scientific information. Data democracy describes a world in which data and information are widely and broadly accessible, understandable, and useable. The concept operationalizes the public-good nature of scientific information and provides a framework for increasing the benefits from its use. Data democracy encompasses efforts to increase accessibility to geospatial data and to expand participation in its collection, analysis, and application. These two pillars are analogous to demand and supply relationships. Improved accessibility, or demand, includes increased knowledge about geospatial data and low barriers to retrieval and use. Expanded participation, or supply, encompasses a broader community involved in developing geospatial data and scientific information. This pillar of data democracy is characterized by methods such as citizen science or crowdsourcing. A framework is developed for advancing the use of data democracy. This includes efforts to assess the societal benefits (economic and social) of scientific information. This knowledge is critical to continued monitoring of the effectiveness of data democracy implementation and of its potential impact on the use and value of scientific information. The framework also includes an assessment of opportunities for advancing data democracy on both the supply and demand sides. These opportunities include relatively inexpensive efforts to reduce barriers to use, as well as the identification of situations in which participation in scientific efforts can be expanded to broaden involvement and reach non-traditional communities. This framework provides an initial perspective on ways to expand the "scientific community" of data users and providers. It also describes a way forward for enhancing the societal benefits from geospatial data and scientific information. As a result, data

  19. Innovazione tecnologica e sinergie tra soluzioni geospaziali Lo "scenario Intergraph"

    Directory of Open Access Journals (Sweden)

    Andrea Fiduccia

    2012-04-01

    Full Text Available In recent years we have witnessed the expansion of Geographic Information technologies from their original domain of digital cartography for territorial knowledge (automated mapping) and for the management of infrastructure and utility networks (facility management) towards a broader domain called "geospatial". Technological innovation and synergies between geospatial solutions: the "Intergraph scenario". Geospatial is much more than GIS. We can understand the evolution of Geographic Information by considering some new working groups and OWS demonstrations of the Open Geospatial Consortium: Sensor Web Enablement, Sensor Fusion Enablement, Feature & Decision Fusion, Aviation, Emergency and Disaster Management, etc. Another point of observation is to analyze some brand new products of a multinational enterprise committed to innovation, like Intergraph Corporation. Part of Hexagon AB, Intergraph, Leica Geosystems and ERDAS are working together to leverage joint strengths in geospatial innovation. Intergraph's Motion Video Exploitation solution leverages full-motion video, giving analysts the ability to collect, analyze, and maximize the value of video assets. GeoMedia 3D is a GeoMedia add-on product that extends the functionality of Intergraph's geospatial solutions through an integrated 3D visualization and analysis environment. You can visualize, navigate, analyze, and interact with 3D data natively in GeoMedia. GeoMedia Smart Client delivers an enterprise geospatial platform engineered to support large numbers of users who are unable to operate full desktop products, but whose workflows need advanced geospatial functionality that cannot be supported by web mapping tools. G/Technology Fiber Optic Works 1.0 streamlines the management of fiber optic infrastructure for utilities, municipalities, agencies and

  20. Innovazione tecnologica e sinergie tra soluzioni geospaziali Lo "scenario Intergraph"

    Directory of Open Access Journals (Sweden)

    Andrea Fiduccia

    2012-04-01

    Full Text Available In recent years we have witnessed the expansion of Geographic Information technologies from their original domain of digital cartography for territorial knowledge (automated mapping) and for the management of infrastructure and utility networks (facility management) towards a broader domain called "geospatial". Technological innovation and synergies between geospatial solutions: the "Intergraph scenario". Geospatial is much more than GIS. We can understand the evolution of Geographic Information by considering some new working groups and OWS demonstrations of the Open Geospatial Consortium: Sensor Web Enablement, Sensor Fusion Enablement, Feature & Decision Fusion, Aviation, Emergency and Disaster Management, etc. Another point of observation is to analyze some brand new products of a multinational enterprise committed to innovation, like Intergraph Corporation. Part of Hexagon AB, Intergraph, Leica Geosystems and ERDAS are working together to leverage joint strengths in geospatial innovation. Intergraph's Motion Video Exploitation solution leverages full-motion video, giving analysts the ability to collect, analyze, and maximize the value of video assets. GeoMedia 3D is a GeoMedia add-on product that extends the functionality of Intergraph's geospatial solutions through an integrated 3D visualization and analysis environment. You can visualize, navigate, analyze, and interact with 3D data natively in GeoMedia. GeoMedia Smart Client delivers an enterprise geospatial platform engineered to support large numbers of users who are unable to operate full desktop products, but whose workflows need advanced geospatial functionality that cannot be supported by web mapping tools. G/Technology Fiber Optic Works 1.0 streamlines the management of fiber optic infrastructure for utilities, municipalities, agencies and

  1. 3D Visibility Analysis in Urban Environment - Cognition Research Based on VGE

    Science.gov (United States)

    Lin, T. P.; Lin, H.; Hu, M. Y.

    2013-09-01

    The author in this research attempts to illustrate a measurable relationship between the physical environment and human visual perception, including distance, visual angle and visual field (a 3D isovist conception), against human cognition, by using a 3D visibility analysis method based on a Virtual Geographic Environment (VGE) platform. The project is carried out on the CUHK campus (the Chinese University of Hong Kong), adopting a virtual 3D model of the whole campus together with a survey in the real world. A possible model for the simulation of human cognition in urban spaces is expected as the output of this research, covering what humans perceive from the environment, how their feelings and behaviours arise, and how they affect the surrounding world. Kevin Lynch raised five elements of urban design in the 1960s: "vitality, sense, fit, access and control". As urban design has developed, several problems around human cognition and behaviour have emerged. Due to the limits of sensing knowledge in urban spaces, research on the "sense" and the "fit" of urban design has received little attention in recent decades. The geo-spatial cognition field emerged in 1997 and has developed over the past 15 years, making great efforts in way-finding and urban behaviour simulation based on GIS (geographic information system) or VGE platforms. The VGE platform is recognized as a proper tool for the analysis of human perception in urban places because of its efficient 3D spatial data management and excellent 3D visualization of output results. This article describes the visibility analysis method based on the 3D VGE platform. Given the uncertainty and variety of human perception in this research, the author arranges a survey of observer investigation and validation for the analysis results. Four figures related to space and human perception are mainly concerned in this proposal

  2. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    Science.gov (United States)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with the development of an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web-service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, the Earth and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). Its data management services have been built on top of a set of open technologies including Object Oriented Data Technology (OODT), an open source data catalog, archive, file management, and data grid framework
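    Consuming an OGC WMS endpoint of the kind the Portal exposes can be sketched with OWSLib; the service URL and layer name below are placeholders, not the actual LMMP endpoints.

        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/lmmp/wms", version="1.1.1")
        print(list(wms.contents))              # available lunar map layers

        img = wms.getmap(
            layers=["lro_wac_mosaic"],         # hypothetical layer name
            srs="EPSG:4326",
            bbox=(-180, -90, 180, 90),
            size=(1024, 512),
            format="image/png",
        )
        with open("lunar_mosaic.png", "wb") as f:
            f.write(img.read())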

  3. GeoMapApp as a platform for visualizing marine data from Polar Regions

    Science.gov (United States)

    Nitsche, F. O.; Ryan, W. B.; Carbotte, S. M.; Ferrini, V.; Goodwillie, A. M.; O'hara, S. H.; Weissel, R.; McLain, K.; Chinhong, C.; Arko, R. A.; Chan, S.; Morton, J. J.; Pomeroy, D.

    2012-12-01

    To maximize the investment in expensive fieldwork, the resulting data should be re-used as much as possible. In addition, unnecessary duplication of data collection effort should be avoided. This becomes even more important when access to field areas is as difficult and expensive as it is in Polar Regions. Making existing data discoverable in an easy-to-use platform is key to improving re-use and avoiding duplication. A common obstacle is that use of existing data is often limited to specialists who know of the data's existence and also have the right tools to view and analyze it. GeoMapApp is a free, interactive, map-based tool that allows users to discover, visualize, and analyze a large number of data sets. In addition to a global view, it provides polar map projections for displaying data in Arctic and Antarctic areas. Data that have currently been added to the system include Arctic swath bathymetry data collected from the USCG icebreaker Healy. These data are collected almost continuously, including on cruises where bathymetry is not the main objective and for which the existence of the acquired data may not be well known. In contrast, the existence of seismic data from the Antarctic continental margin is well known in the seismic community. These data are archived at, and can be accessed through, the Antarctic Seismic Data Library System (SDLS). Incorporating them into GeoMapApp makes an even broader community aware of these data, and the custom interface, which includes capabilities to visualize and explore them, allows users without specific software or knowledge of the underlying data format to access the data. In addition to investigating these datasets, GeoMapApp provides links to the actual data sources to allow specialists the opportunity to re-use the original data. Identification of data sources and data references is achieved on different levels. For access to the actual Antarctic seismic data, GeoMapApp links to the SDLS site, where users have

  4. GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.

    Science.gov (United States)

    Liang, Steve H L; Huang, Chih-Yuan

    2013-10-02

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been realized. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amount of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstrations of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  5. Virtual reality stimuli for force platform posturography.

    Science.gov (United States)

    Tossavainen, Timo; Juhola, Martti; Ilmari, Pyykö; Aalto, Heikki; Toppila, Esko

    2002-01-01

    People who rely heavily on vision for the control of posture are known to have an elevated risk of falling. Dependence on visual control is an important parameter in the diagnosis of balance disorders. We have previously shown that virtual reality methods can be used to produce visual stimuli that affect balance, but suitable stimuli need to be found. In this study, the effect of six different virtual reality stimuli on the balance of 22 healthy test subjects was evaluated using force platform posturography. According to the tests, two of the stimuli have a significant effect on balance.

  6. Implementing a product platform in 35 man-days:

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik; Pedersen, Rasmus; Nielsen, Ole Fiil

    2008-01-01

    benefits are reduced costs, reduced lead time and increased ability to focus engineering resources on aspects providing value to the customer. A so-called visual approach has been utilised. By means of a Product Family Master Plan, the content and scope of the platform have been modelled and visualised...

  7. A technical review of flexible endoscopic multitasking platforms.

    Science.gov (United States)

    Yeung, Baldwin Po Man; Gourlay, Terence

    2012-01-01

    Further development of advanced therapeutic endoscopic techniques and natural orifice translumenal endoscopic surgery (NOTES) requires a powerful flexible endoscopic multitasking platform. A Medline search was performed to identify literature relating to flexible endoscopic multitasking platforms from 2004 to 2011 using the keywords: flexible endoscopic multitasking platform, NOTES, instrumentation, endoscopic robotic surgery, and the specific names of various endoscopic multitasking platforms. Key articles from article references were reviewed. Flexible multitasking platforms can be classified as either mechanical or robotic. Purely mechanical systems include the dual channel endoscope (DCE) (Olympus), the R-Scope (Olympus), the EndoSamurai (Olympus), the ANUBIScope (Karl-Storz), the Incisionless Operating Platform (IOP) (USGI), and the DDES system (Boston Scientific). Robotic systems include the MASTER system (Nanyang University, Singapore) and the Viacath (Hansen Medical). The DCE, the R-Scope, the EndoSamurai and the ANUBIScope have integrated visual and instrument manipulation functions. The IOP and DDES systems rely on a conventional flexible endoscope for visualization, and instrument manipulation is integrated through the use of a flexible, often lockable, multichannel access device. The advantage of the access device concept is that it allows optics and instrument dissociation. Due to the anatomical constraints of the pharynx, systems are designed to have a diameter of less than 20 mm. All systems are controlled by traction cable systems actuated either by hand or by robotic machinery. In a flexible system, this method of actuation inevitably leads to significant hysteresis. This problem is accentuated with a long endoscope such as that required for colonic procedures. Systems often require multiple operators. To date, the DCE, the R-Scope, the IOP, and the Viacath system have published data relating to their application in humans. Alternative forms of

  8. Infusion of Climate Change and Geospatial Science Concepts into Environmental and Biological Science Curriculum

    Science.gov (United States)

    Balaji Bhaskar, M. S.; Rosenzweig, J.; Shishodia, S.

    2017-12-01

    The objective of our activity is to improve students' understanding and interpretation of geospatial science and climate change concepts and their applications in the fields of Environmental and Biological Sciences in the College of Science, Engineering and Technology (COEST) at Texas Southern University (TSU) in Houston, TX. The courses GIS for Environment, Ecology and Microbiology were selected for the curriculum infusion. A total of ten GIS hands-on lab modules, along with two NCAR (National Center for Atmospheric Research) lab modules on climate change, were implemented in the "GIS for Environment" course. GIS and Google Earth labs, along with climate change lectures, were infused into the Microbiology and Ecology courses. Critical thinking and empirical skills of the students were assessed in all the courses. The student learning outcomes of these courses include the ability of students to interpret geospatial maps and to demonstrate knowledge of the basic principles and concepts of GIS (Geographic Information Systems) and climate change. At the end of the courses, students had developed a comprehensive understanding of geospatial data, its applications in understanding climate change, and its interpretation at local and regional scales across multiple years.

  9. Geospatial Data Availability for Haiti: An Aid in the Development of GIS-Based Natural Resource Assessments for Conservation Planning.

    Science.gov (United States)

    Maya Quinones; William Gould; Carlos D. Rodriguez-Pedraza

    2007-01-01

    This report documents the type and source of geospatial data available for Haiti. It was compiled to serve as a resource for geographic information system (GIS)-based land management and planning. It will be useful for conservation planning, reforestation efforts, and agricultural extension projects. Our study indicates that there is a great deal of geospatial...

  10. Enhancing interdisciplinary collaboration and decisionmaking with J-Earth: an open source data sharing, visualization and GIS analysis platform

    Science.gov (United States)

    Prashad, L. C.; Christensen, P. R.; Fink, J. H.; Anwar, S.; Dickenshied, S.; Engle, E.; Noss, D.

    2010-12-01

    Our society is currently facing a number of major environmental challenges, most notably the threat of climate change. A multifaceted, interdisciplinary approach involving physical and social scientists, engineers and decision-makers is critical to adequately address these complex issues. To best facilitate this interdisciplinary approach, data and models at various scales, from local to global, must be quickly and easily shared between disciplines to effectively understand environmental phenomena and human-environmental interactions. When data are acquired and studied on different scales and within different disciplines, researchers and practitioners may not be able to easily learn from each other's results. For example, climate change models are often developed at a global scale, while strategies that address human vulnerability to climate change and mitigation/adaptation strategies are often assessed at a local level. Linkages between urban heat island phenomena and global climate change may be better understood with increased data flow between researchers and those making policy decisions. In these cases it would be useful to have a single platform to share, visualize, and analyze numerical model and satellite/airborne remote sensing data together with social, environmental, and economic data, between researchers and practitioners. The Arizona State University 100 Cities Project and Mars Space Flight Facility are developing the open source application J-Earth with the goal of providing this single platform, which facilitates data sharing, visualization, and analysis between researchers and applied practitioners around environmental and other sustainability challenges. This application is being designed for user communities including physical and social scientists, NASA researchers, non-governmental organizations, and decision-makers to share and analyze data at multiple scales. We are initially focusing on urban heat island and urban ecology studies, with data and users from

  11. SALOME. A software integration platform for multi-physics, pre-processing and visualisation

    International Nuclear Information System (INIS)

    Bergeaud, Vincent; Lefebvre, Vincent

    2010-01-01

    In order to ease the development of applications integrating simulation codes, CAD modelers and post-processing tools, CEA and EDF R&D have invested in the SALOME platform, a tool dedicated to the environment of scientific codes. The platform comes in the shape of a toolbox which offers functionalities for CAD, meshing, code coupling, visualization, and GUI development. These tools can be combined to create integrated applications that make the scientific codes easier to use and well interfaced with their environment, be it other codes, CAD and meshing tools or visualization software. Many projects in CEA and EDF R&D now use SALOME, bringing technical coherence to the software suites of our institutions. (author)

  12. Data Management for Flexible Access - Implementation and Lessons Learned from work with Multiple User Communities

    Science.gov (United States)

    Benedict, K. K.; Scott, S.; Hudspeth, W. B.

    2012-12-01

    There is no shortage of community-specific and generic data discovery and download platforms and protocols (e.g. CUAHSI HIS, DataONE, GeoNetwork Open Source, GeoPortal, OGC CSW, OAI-PMH), documentation standards (e.g. FGDC, ISO 19115, EML, Dublin Core), data access and visualization standards and models (e.g. OGC WxS, OPeNDAP), and general-purpose web service models (i.e. REST & SOAP) upon which geoinformatics cyberinfrastructure (CI) may be built. When attempting to develop a robust platform that may service a wide variety of users and use cases, the challenge is one of identifying which existing platform (if any) may support those current needs while also allowing for future expansion for additional capabilities. In the case of the implementation of a data storage, discovery and delivery platform to support the multiple projects at the Earth Data Analysis Center at UNM, no single platform or protocol met the joint requirements of two initial applications (the New Mexico Resource Geographic Information System [http://rgis.unm.edu] and the New Mexico EPSCoR Data Portal [http://nmepscor.org/dataportal]), and furthermore none met anticipated additional requirements as new applications of the platform emerged. As a result of this assessment three years ago, EDAC embarked on the development of the Geographic Storage, Transformation, and Retrieval Engine (GSToRE) platform as a general-purpose platform upon which n-tiered, geospatially enabled, data-intensive applications could be built. When initially released in 2010, the focus was on the publication of dynamically generated Open Geospatial Consortium services based upon a PostgreSQL/PostGIS backend database. The identification of additional service interface requirements (implementation of the DataONE API and CUAHSI WaterML services), use cases provided by the NM EPSCoR education working group, and expanded metadata publication needs have led to a significant update to the underlying data management tier for GSToRE - the
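    The service tier described here sits on a PostgreSQL/PostGIS backend; a minimal sketch of the kind of query such a tier issues is shown below, fetching features in a bounding box as GeoJSON with psycopg2. Connection details, table and column names are assumptions for illustration only.

        import json
        import psycopg2

        conn = psycopg2.connect(dbname="gstore", user="reader", password="secret", host="localhost")
        cur = conn.cursor()

        cur.execute(
            """
            SELECT name, ST_AsGeoJSON(geom)
            FROM watersheds
            WHERE geom && ST_MakeEnvelope(%s, %s, %s, %s, 4326)
            """,
            (-109.05, 31.33, -103.00, 37.00),   # rough New Mexico bounding box
        )

        features = [
            {"type": "Feature", "properties": {"name": name}, "geometry": json.loads(gj)}
            for name, gj in cur.fetchall()
        ]
        print(json.dumps({"type": "FeatureCollection", "features": features})[:200])

        cur.close()
        conn.close()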

  13. Crisp Clustering Algorithm for 3D Geospatial Vector Data Quantization

    DEFF Research Database (Denmark)

    Azri, Suhaibah; Anton, François; Ujang, Uznir

    2015-01-01

    In the next few years, 3D data is expected to be an intrinsic part of geospatial data. However, issues on 3D spatial data management are still in the research stage. One of the issues is performance deterioration during 3D data retrieval. Thus, a practical 3D index structure is required for effic...

  14. Digital geospatial presentation of geoelectrical and geotechnical data for the lower American River and flood plain, east Sacramento, California

    Science.gov (United States)

    Ball, Lyndsay B.; Burton, Bethany L.; Powers, Michael H.; Asch, Theodore H.

    2015-01-01

    To characterize the extent and thickness of lithologic units that may have differing scour potential, the U.S. Geological Survey, in cooperation with the U.S. Army Corps of Engineers, has performed several geoelectrical surveys of the lower American River channel and flood plain between Cal Expo and the Rio Americano High School in east Sacramento, California. Additional geotechnical data have been collected by the U.S. Army Corps of Engineers and its contractors. Data resulting from these surveys have been compiled into similar database formats and converted to uniform geospatial datums and projections. These data have been visualized in a digital three-dimensional framework project that can be viewed using freely available software. These data facilitate a comprehensive analysis of the resistivity structure underlying the lower American River corridor and assist in levee system management.

  15. iview: an interactive WebGL visualizer for protein-ligand complex.

    Science.gov (United States)

    Li, Hongjian; Leung, Kwong-Sak; Nakane, Takanori; Wong, Man-Hon

    2014-02-25

    Visualization of protein-ligand complexes plays an important role in elaborating protein-ligand interactions and aiding novel drug design. Most existing web visualizers either rely on slow software rendering or lack virtual reality support. The vital feature of macromolecular surface construction is also unavailable. We have developed iview, an easy-to-use interactive WebGL visualizer of protein-ligand complexes. It exploits hardware acceleration rather than software rendering. It features three special effects in virtual reality settings, namely anaglyph, parallax barrier and Oculus Rift, resulting in visually appealing identification of intermolecular interactions. It supports four surface representations: Van der Waals surface, solvent-excluded surface, solvent-accessible surface and molecular surface. Moreover, based on the feature-rich version of iview, we have also developed a neat and tailor-made version specifically for our istar web platform for protein-ligand docking purposes. This demonstrates the excellent portability of iview. Using innovative 3D techniques, we provide a user-friendly visualizer that is not intended to compete with professional visualizers, but to enable easy accessibility and platform independence.

  16. Geo-Spatial Support for Assessment of Anthropic Impact on Biodiversity

    Directory of Open Access Journals (Sweden)

    Marco Piragnolo

    2014-04-01

    Full Text Available This paper discusses a methodology where geo-spatial analysis tools are used to quantify risk derived from anthropic activities on habitats and species. The method has been developed with a focus on simplification and the quality of standard procedures set on flora and fauna protected by the European Directives. In this case study, the DPSIR (Drivers, Pressures, State, Impacts, Responses) framework is applied using spatial procedures in a geographical information system (GIS) framework. This approach can be inserted in a multidimensional space as the analysis is applied to each threat, pressure and activity and also to each habitat and species, at the spatial and temporal scale. Threats, pressures and activities, stress and indicators can be managed by means of a geo-database and analyzed using spatial analysis functions in a tested GIS workflow environment. The method applies a matrix with risk values, and the final product is a geo-spatial representation of impact indicators, which can be used as a support for decision-makers at various levels (regional, national and European).

  17. COMBINING INDEPENDENT VISUALIZATION AND TRACKING SYSTEMS FOR AUGMENTED REALITY

    Directory of Open Access Journals (Sweden)

    P. Hübner

    2018-05-01

    Full Text Available The basic requirement for the successful deployment of a mobile augmented reality application is a reliable tracking system with high accuracy. Recently, a helmet-based inside-out tracking system which meets this demand has been proposed for self-localization in buildings. To realize an augmented reality application based on this tracking system, a display has to be added for visualization purposes. Therefore, the relative pose of this visualization platform with respect to the helmet has to be tracked. In the case of hand-held visualization platforms like smartphones or tablets, this can be achieved by means of image-based tracking methods like marker-based or model-based tracking. In this paper, we present two marker-based methods for tracking the relative pose between the helmet-based tracking system and a tablet-based visualization system. Both methods were implemented and comparatively evaluated in terms of tracking accuracy. Our results show that mobile inside-out tracking systems without integrated displays can easily be supplemented with a hand-held tablet as visualization device for augmented reality purposes.
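
    The paper's two marker-based methods are not reproduced here; as a generic illustration of marker-based relative pose estimation of the kind described, the sketch below recovers a marker's pose from four already-detected corner points with OpenCV's solvePnP (the camera intrinsics and pixel coordinates are invented placeholders):

      import cv2
      import numpy as np

      # Placeholder camera intrinsics; real values come from camera calibration.
      camera_matrix = np.array([[800.0, 0.0, 320.0],
                                [0.0, 800.0, 240.0],
                                [0.0, 0.0, 1.0]])
      dist_coeffs = np.zeros(5)
      marker_len = 0.05   # marker side length in metres (assumed)

      # Assume the four marker corners were already found in the image
      # (e.g. by an ArUco-style detector); pixel values here are invented.
      img_pts = np.array([[310.0, 220.0],
                          [410.0, 225.0],
                          [405.0, 325.0],
                          [305.0, 320.0]], dtype=np.float32)

      # Corresponding 3D corners in the marker's own frame (z = 0 plane).
      half = marker_len / 2.0
      obj_pts = np.array([[-half,  half, 0.0],
                          [ half,  half, 0.0],
                          [ half, -half, 0.0],
                          [-half, -half, 0.0]], dtype=np.float32)

      ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, camera_matrix, dist_coeffs)
      if ok:
          rot_mat, _ = cv2.Rodrigues(rvec)   # marker rotation w.r.t. the camera
          print("marker position in camera frame (m):", tvec.ravel())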

  18. Comparison and analysis of FDA reported visual outcomes of the three latest platforms for LASIK: wavefront guided Visx iDesign, topography guided WaveLight Allegro Contoura, and topography guided Nidek EC-5000 CATz

    Directory of Open Access Journals (Sweden)

    Moshirfar M

    2017-01-01

    Full Text Available Majid Moshirfar,1,2 Tirth J Shah,3 David Franklin Skanchy,4 Steven H Linn,1 Paul Kang,3 Daniel S Durrie5 1HDR Research Center, Hoopes Vision, Salt Lake City, UT, 2Department of Ophthalmology and Visual Sciences, John A Moran Eye Center, University of Utah School of Medicine, Salt Lake City, UT, 3University of Arizona College of Medicine – Phoenix, Phoenix, AZ, 4McGovern Medical School, The University of Texas Health Science Center at Houston, TX, 5Durrie Vision, Kansas City, KS, USA Purpose: To compare and analyze the differences in visual outcomes between Visx iDesign Advanced WaveScan Studio™ System, Alcon Wavelight Allegro Topolyzer and Nidek EC-5000 using Final Fit™ Custom Ablation Treatment Software from the submitted summary of safety and effectiveness of the US Food and Drug Administration (FDA) data. Methods: In this retrospective comparative study, 334 eyes from Visx iDesign, 212 eyes from Alcon Contour, and 135 eyes from Nidek CATz platforms were analyzed for primary and secondary visual outcomes. These outcomes were compared via side-by-side graphical and tabular representation of the FDA data. Statistical significance was calculated when appropriate to assess differences. A P-value <0.05 was considered statistically significant. Results: The mean postoperative uncorrected distance visual acuity (UDVA) at 12 months was 20/19.25±8.76, 20/16.59±5.94, and 20/19.17±4.46 for Visx iDesign, Alcon Contoura, and Nidek CATz, respectively. In at least 90% of treated eyes at 3 months and 12 months, all three lasers showed either no change or a gain of corrected distance visual acuity (CDVA). Mesopic contrast sensitivity at 6 months showed a clinically significant increase of 41.3%, 25.1%, and 10.6% for eyes using Visx iDesign, Alcon Contoura, and Nidek CATz, respectively. Photopic contrast sensitivity at 6 months showed a clinically significant increase of 19.2%, 31.9%, and 10.6% for eyes using Visx iDesign, Alcon Contoura, and Nidek CATz

  19. Gamification and geospatial health management

    Science.gov (United States)

    Wortley, David

    2014-06-01

    Sensor and Measurement technologies are rapidly developing for many consumer applications which have the potential to make a major impact on business and society. One of the most important areas for building a sustainable future is in health management. This opportunity arises because of the growing popularity of lifestyle monitoring devices such as the Jawbone UP bracelet, Nike Fuelband and Samsung Galaxy GEAR. These devices measure physical activity and calorie consumption and, when visualised on mobile and portable devices, enable users to take more responsibility for their personal health. This presentation looks at how the process of gamification can be applied to develop important geospatial health management applications that could not only improve the health of nations but also significantly address some of the issues in global health such as the ageing society and obesity.

  20. Gamification and geospatial health management

    International Nuclear Information System (INIS)

    Wortley, David

    2014-01-01

    Sensor and Measurement technologies are rapidly developing for many consumer applications which have the potential to make a major impact on business and society. One of the most important areas for building a sustainable future is in health management. This opportunity arises because of the growing popularity of lifestyle monitoring devices such as the Jawbone UP bracelet, Nike Fuelband and Samsung Galaxy GEAR. These devices measure physical activity and calorie consumption and, when visualised on mobile and portable devices, enable users to take more responsibility for their personal health. This presentation looks at how the process of gamification can be applied to develop important geospatial health management applications that could not only improve the health of nations but also significantly address some of the issues in global health such as the ageing society and obesity

  1. Measuring the Interdisciplinary Impact of Using Geospatial Data with Remote Sensing Data

    Science.gov (United States)

    Downs, R. R.; Chen, R. S.; Schumacher, J.

    2017-12-01

    Various disciplines offer benefits to society by contributing to the scientific progress that informs the knowledge and decisions that improve the lives, safety, and conditions of people around the globe. In addition to disciplines within the natural sciences, other disciplines, including those in the social, health, and computer sciences, provide benefits to society by collecting, preparing, and analyzing data in the process of conducting research. Preparing geospatial environmental and socioeconomic data together with remote sensing data from satellite-based instruments for wider use by heterogeneous communities of users increases the potential impact of these data by enabling their use in different application areas and sectors of society. Furthermore, enabling wider use of scientific data can bring to bear resources and expertise that will improve reproducibility, quality, methodological transparency, interoperability, and improved understanding by diverse communities of users. In line with its commitment to open data, the NASA Socioeconomic Data and Applications Center (SEDAC), which focuses on human interactions in the environment, curates and disseminates freely and publicly available geospatial data for use across many disciplines and societal benefit areas. We describe efforts to broaden the use of SEDAC data and to publicly document their impact, assess the interdisciplinary impact of the use of SEDAC data with remote sensing data, and characterize these impacts in terms of their influence across disciplines by analyzing citations of geospatial data with remote sensing data within scientific journals.

  2. Mobile Traffic Alert and Tourist Route Guidance System Design Using Geospatial Data

    Science.gov (United States)

    Bhattacharya, D.; Painho, M.; Mishra, S.; Gupta, A.

    2017-09-01

    The present study describes an integrated system for traffic data collection and alert warning. Geographical-information-based decision making related to traffic destinations and routes is proposed through the design. The system includes a geospatial database holding a profile relating to the user of a mobile device. The processing and understanding of scanned maps and other digital data input leads to route guidance. The system includes a server configured to receive traffic information relating to a route and location information relating to the mobile device. The server is configured to send a traffic alert to the mobile device when the traffic information and the location information indicate that the device is traveling toward traffic congestion. The proposed system has geospatial and mobile data sets pertaining to Bangalore city in India. It is envisaged as a route-guidance and alert-relaying system for tourists, notifying them of nearby sites worth seeing in the city they have entered. The system is modular in architecture, and the novelty lies in the integration of different modules, carrying different technologies, into a complete traffic information system. The generic information processing and delivery system has been tested and found functional and fast over the test geospatial domains. In a restricted prototype with geo-referenced route data, the required information was delivered correctly over sustained trials to designated cell numbers, with an average time frame of 27.5 seconds (maximum 50 and minimum 5 seconds). Testing on the traffic geo-data set trials is underway.
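
    The server logic is not published in the abstract; as a rough sketch of the alerting rule described (alert when the device is traveling toward congestion), the snippet below combines a great-circle distance check with a heading test, with all thresholds and coordinates invented for illustration:

      import math

      def haversine_km(lat1, lon1, lat2, lon2):
          """Great-circle distance between two WGS84 points, in kilometres."""
          r = 6371.0
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dphi = math.radians(lat2 - lat1)
          dlmb = math.radians(lon2 - lon1)
          a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
          return 2 * r * math.asin(math.sqrt(a))

      def bearing_deg(lat1, lon1, lat2, lon2):
          """Initial bearing from point 1 to point 2, in degrees from north."""
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dlmb = math.radians(lon2 - lon1)
          y = math.sin(dlmb) * math.cos(p2)
          x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
          return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

      def should_alert(device, congestion, radius_km=2.0, cone_deg=45.0):
          """Alert if congestion is nearby and roughly ahead of the device's heading."""
          d = haversine_km(device["lat"], device["lon"], congestion["lat"], congestion["lon"])
          brg = bearing_deg(device["lat"], device["lon"], congestion["lat"], congestion["lon"])
          off = abs((brg - device["heading"] + 180.0) % 360.0 - 180.0)
          return d <= radius_km and off <= cone_deg

      # Illustrative values only (roughly central Bangalore coordinates).
      device = {"lat": 12.9716, "lon": 77.5946, "heading": 90.0}
      congestion = {"lat": 12.9720, "lon": 77.6100}
      print(should_alert(device, congestion))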

  3. GeoCENS: A Geospatial Cyberinfrastructure for the World-Wide Sensor Web

    Directory of Open Access Journals (Sweden)

    Steve H.L. Liang

    2013-10-01

    Full Text Available The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with demonstrations of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  4. Collective Sensing: Integrating Geospatial Technologies to Understand Urban Systems—An Overview

    Directory of Open Access Journals (Sweden)

    Geoffrey J. Hay

    2011-08-01

    Full Text Available Cities are complex systems composed of numerous interacting components that evolve over multiple spatio-temporal scales. Consequently, no single data source is sufficient to satisfy the information needs required to map, monitor, model, and ultimately understand and manage our interaction within such urban systems. Remote sensing technology provides a key data source for mapping such environments, but is not sufficient for fully understanding them. In this article we provide a condensed urban perspective of critical geospatial technologies and techniques: (i) Remote Sensing; (ii) Geographic Information Systems; (iii) object-based image analysis; and (iv) sensor webs, and recommend a holistic integration of these technologies within the language of Open Geospatial Consortium (OGC) standards in order to more fully understand urban systems. We then discuss the potential of this integration and conclude that this extends the monitoring and mapping options beyond “hard infrastructure” by addressing “humans as sensors”, mobility and human-environment interactions, and future improvements to quality of life and of social infrastructures.

  5. Multimodal Microchannel and Nanowell-Based Microfluidic Platforms for Bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Geng, Tao; Smallwood, Chuck R.; Zhu, Ying; Bredeweg, Erin L.; Baker, Scott E.; Evans, James E.; Kelly, Ryan T.

    2017-03-30

    Modern live-cell imaging approaches permit real-time visualization of biological processes. However, limitations for unicellular organism trapping, culturing and long-term imaging can preclude complete understanding of how such microorganisms respond to perturbations in their local environment or linking single-cell variability to whole population dynamics. We have developed microfluidic platforms to overcome prior technical bottlenecks to allow both chemostat and compartmentalized cellular growth conditions using the same device. Additionally, a nanowell-based platform enables a high throughput approach to scale up compartmentalized imaging optimized within the microfluidic device. These channel and nanowell platforms are complementary, and both provide fine control over the local environment as well as the ability to add/replace media components at any experimental time point.

  6. JS-MS: a cross-platform, modular javascript viewer for mass spectrometry signals.

    Science.gov (United States)

    Rosen, Jebediah; Handy, Kyle; Gillan, André; Smith, Rob

    2017-11-06

    Despite the ubiquity of mass spectrometry (MS), data processing tools can be surprisingly limited. To date, there is no stand-alone, cross-platform 3-D visualizer for MS data. Available visualization toolkits require large libraries with multiple dependencies and are not well suited for custom MS data processing modules, such as MS storage systems or data processing algorithms. We present JS-MS, a 3-D, modular JavaScript client application for viewing MS data. JS-MS provides several advantages over existing MS viewers, such as a dependency-free, browser-based, one-click, cross-platform install and better navigation interfaces. The client includes a modular Java backend with a novel streaming mzML parser to demonstrate the API-based serving of MS data to the viewer. JS-MS enables custom MS data processing and evaluation by providing fast, 3-D visualization using improved navigation without dependencies. JS-MS is publicly available with a GPLv2 license at github.com/optimusmoose/jsms.

  7. Development of Geospatial Map Based Portal for New Delhi Municipal Council

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Sharma, P. Kumar

    2017-09-01

    The Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs such as DDA, MCD, DJB, the State Election Department, DMRC etc., for the benefit of all citizens of the GNCTD. This paper describes the development of the Geospatial Map based Portal (GMP) for the New Delhi Municipal Council (NDMC) of NCT of Delhi. The GMP has been developed as a map-based spatial decision support system (SDSS) for planning and development of the NDMC area for the NDMC department, and it has inbuilt information searching tools (identification of locations, nearest utility locations, distance measurement etc.) for the citizens of NCTD. The GMP is based on a Client-Server architecture model. It has been developed using ArcGIS Server 10.0 with .NET (pronounced dot net) technology. The GMP is scalable to an enterprise SDSS with enterprise geodatabase & Virtual Private Network (VPN) connectivity. Spatial data in the GMP includes Circle, Division and Sub-division boundaries of departments pertaining to the New Delhi Municipal Council; parcels of residential, commercial, and government buildings; basic amenities (Police Stations, Hospitals, Schools, Banks, ATMs and Fire Stations etc.); over-ground and underground utility network lines; roads; and railway features. The GMP could help achieve not only the desired transparency and ease in the planning process but also facilitate development and management of the MCD area through efficient and effective tools. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.
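
    The portal's internal services are not public here; purely as an illustration of how a client could query an ArcGIS Server map service of this kind, the sketch below calls the standard ArcGIS REST query operation on a hypothetical layer URL:

      # Sketch: query a hypothetical ArcGIS Server REST map-service layer.
      # The URL, layer index, field names and filter are placeholders.
      import requests

      LAYER_URL = "https://example.org/arcgis/rest/services/NDMC/Amenities/MapServer/0/query"

      params = {
          "where": "TYPE='Hospital'",     # hypothetical attribute filter
          "outFields": "NAME,ADDRESS",
          "returnGeometry": "true",
          "f": "json",
      }

      resp = requests.get(LAYER_URL, params=params, timeout=30)
      resp.raise_for_status()
      for feat in resp.json().get("features", []):
          print(feat["attributes"].get("NAME"))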

  8. DEVELOPMENT OF GEOSPATIAL MAP BASED PORTAL FOR NEW DELHI MUNICIPAL COUNCIL

    Directory of Open Access Journals (Sweden)

    A. Kumar Chandra Gupta

    2017-09-01

    Full Text Available The Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs such as DDA, MCD, DJB, the State Election Department, DMRC etc., for the benefit of all citizens of the GNCTD. This paper describes the development of the Geospatial Map based Portal (GMP) for the New Delhi Municipal Council (NDMC) of NCT of Delhi. The GMP has been developed as a map-based spatial decision support system (SDSS) for planning and development of the NDMC area for the NDMC department, and it has inbuilt information searching tools (identification of locations, nearest utility locations, distance measurement etc.) for the citizens of NCTD. The GMP is based on a Client-Server architecture model. It has been developed using ArcGIS Server 10.0 with .NET (pronounced dot net) technology. The GMP is scalable to an enterprise SDSS with enterprise geodatabase & Virtual Private Network (VPN) connectivity. Spatial data in the GMP includes Circle, Division and Sub-division boundaries of departments pertaining to the New Delhi Municipal Council; parcels of residential, commercial, and government buildings; basic amenities (Police Stations, Hospitals, Schools, Banks, ATMs and Fire Stations etc.); over-ground and underground utility network lines; roads; and railway features. The GMP could help achieve not only the desired transparency and ease in the planning process but also facilitate development and management of the MCD area through efficient and effective tools. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  9. Effects of the visual-feedback-based force platform training with functional electric stimulation on the balance and prevention of falls in older adults: a randomized controlled trial.

    Science.gov (United States)

    Li, Zhen; Wang, Xiu-Xia; Liang, Yan-Yi; Chen, Shu-Yan; Sheng, Jing; Ma, Shao-Jun

    2018-01-01

    Force platform training with functional electric stimulation aimed at improving balance may be effective in fall prevention for older adults. The aim of the study is to evaluate the effects of visual-feedback-based force platform balance training with functional electric stimulation on balance and fall prevention in older adults. A single-centre, unblinded, randomized controlled trial was conducted. One hundred and twenty older adults were randomly allocated to two groups: the control group (n = 60, one-leg standing balance exercise, 12 min/d) or the intervention group (n = 60, force platform training with functional electric stimulation, 12 min/d). The training was provided 15 days a month for 3 months by physical therapists. Medial-lateral and anterior-posterior maximal range of sway with eyes open and closed, the Berg Balance Scale, the Barthel Index, and the Falls Efficacy Scale-International were assessed at baseline and after the 3-month intervention. A fall diary was kept by each participant during the 6-month follow-up. On comparing the two groups, the intervention group showed significantly decreased maximal range of sway, significantly lower Falls Efficacy Scale-International scores, and significantly lower fall rates, supporting this training for the prevention of falls in older adults.

  10. A Novel Divisive Hierarchical Clustering Algorithm for Geospatial Analysis

    Directory of Open Access Journals (Sweden)

    Shaoning Li

    2017-01-01

    Full Text Available In the fields of geographic information systems (GIS) and remote sensing (RS), the clustering algorithm has been widely used for image segmentation, pattern recognition, and cartographic generalization. Although clustering analysis plays a key role in geospatial modelling, traditional clustering methods are limited in terms of computational complexity, noise resistance and robustness. Furthermore, traditional methods are more focused on the adjacent spatial context, which makes it hard for the clustering methods to be applied to multi-density discrete objects. In this paper, a new method, cell-dividing hierarchical clustering (CDHC), is proposed based on convex hull retraction. The main steps are as follows. First, a convex hull structure is constructed to describe the global spatial context of the geospatial objects. Then, the retracting structure of each borderline is established in sequence by setting the initial parameter. The objects are split into two clusters (i.e., “sub-clusters”) if the retracting structure intersects with the borderlines. Finally, clusters are repeatedly split and the initial parameter is updated until the termination condition is satisfied. The experimental results show that CDHC separates multi-density objects from noise sufficiently and also reduces complexity compared to the traditional agglomerative hierarchical clustering algorithm.
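
    The full CDHC procedure (borderline retraction and iterative splitting) is not reproduced here; the sketch below only illustrates the first step named in the abstract, constructing the convex hull that describes the global spatial context of a 2D point set, using SciPy:

      # Sketch: convex hull of a 2D point set, the global "spatial context"
      # structure that CDHC starts from (illustration only, not the full algorithm).
      import numpy as np
      from scipy.spatial import ConvexHull

      rng = np.random.default_rng(42)
      points = rng.random((200, 2))          # synthetic planar objects

      hull = ConvexHull(points)
      border = points[hull.vertices]          # hull vertices in counter-clockwise order

      print(f"{len(border)} border points out of {len(points)} objects")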

  11. Foreword to the theme issue on geospatial computer vision

    Science.gov (United States)

    Wegner, Jan Dirk; Tuia, Devis; Yang, Michael; Mallet, Clement

    2018-06-01

    Geospatial Computer Vision has become one of the most prevalent emerging fields of investigation in Earth Observation in the last few years. In this theme issue, we aim at showcasing a number of works at the interface between remote sensing, photogrammetry, image processing, computer vision and machine learning. In light of recent sensor developments - both from the ground and from above - an unprecedented (and ever growing) quantity of geospatial data is available for tackling challenging and urgent tasks such as environmental monitoring (deforestation, carbon sequestration, climate change mitigation), disaster management, autonomous driving or the monitoring of conflicts. The new bottleneck for serving these applications is the extraction of relevant information from such large amounts of multimodal data. This includes sources, stemming from multiple sensors, that are of distinct physical nature and of heterogeneous quality and spatial, spectral and temporal resolution. They are as diverse as multi-/hyperspectral satellite sensors, color cameras on drones, laser scanning devices, existing open land-cover geodatabases and social media. Such core data processing is mandatory so as to generate semantic land-cover maps, accurate detection and trajectories of objects of interest, as well as by-products of superior added value: georeferenced data, images with enhanced geometric and radiometric qualities, or Digital Surface and Elevation Models.

  12. Importance of the spatial data and the sensor web in the ubiquitous computing area

    Science.gov (United States)

    Akçit, Nuhcan; Tomur, Emrah; Karslıoǧlu, Mahmut O.

    2014-08-01

    Spatial data has become a critical issue in recent years. In past years, more than three quarters of databases were related directly or indirectly to locations referring to physical features, which constitute the relevant aspects. Spatial data is necessary to identify or calculate the relationships between spatial objects when using spatial operators in programs or portals. Originally, calculations were conducted using Geographic Information System (GIS) programs on local computers. Subsequently, through the Internet, they formed a geospatial web, which is integrated into a discoverable collection of geographically related web standards and key features, and constitutes a global network of geospatial data that employs the World Wide Web to process textual data. In addition, the geospatial web is used to gather spatial data producers, resources, and users. Standards also constitute a critical dimension in further globalizing the idea of the geospatial web. The sensor web is an example of the real-time service that the geospatial web can provide. Sensors around the world collect numerous types of data. The sensor web is a type of sensor network that is used for visualizing, calculating, and analyzing collected sensor data. Today, people use smart devices and systems more frequently because of the evolution of technology and have more than one mobile device. The considerable number of sensors and different types of data that are positioned around the world have driven the production of interoperable and platform-independent sensor web portals. The focus of such production has been on further developing the idea of an interoperable and interdependent sensor web of all devices that share and collect information. The other pivotal idea consists of encouraging people to use and send data voluntarily for numerous purposes with some level of credibility. The principal goal is to connect mobile and non-mobile devices in the sensor web platform together to

  13. National Geospatial Data Asset Lifecycle Baseline Maturity Assessment for the Federal Geographic Data Committee

    Science.gov (United States)

    Peltz-Lewis, L. A.; Blake-Coleman, W.; Johnston, J.; DeLoatch, I. B.

    2014-12-01

    The Federal Geographic Data Committee (FGDC) is designing a portfolio management process for 193 geospatial datasets contained within the 16 topical National Spatial Data Infrastructure themes managed under OMB Circular A-16 "Coordination of Geographic Information and Related Spatial Data Activities." The 193 datasets are designated as National Geospatial Data Assets (NGDA) because of their significance to the missions of multiple levels of government, partners and stakeholders. As a starting point, the data managers of these NGDAs will conduct a baseline maturity assessment of the dataset(s) for which they are responsible. The maturity is measured against benchmarks related to each of the seven stages of the data lifecycle management framework promulgated within the OMB Circular A-16 Supplemental Guidance issued by OMB in November 2010. This framework was developed by the interagency Lifecycle Management Work Group (LMWG), consisting of 16 Federal agencies, under the 2004 Presidential Initiative, the Geospatial Line of Business, using OMB Circular A-130 "Management of Federal Information Resources" as guidance. The seven lifecycle stages are: Define, Inventory/Evaluate, Obtain, Access, Maintain, Use/Evaluate, and Archive. This paper will focus on the Lifecycle Baseline Maturity Assessment, and on efforts to integrate the FGDC approach with other data maturity assessments.

  14. Implementing a High School Level Geospatial Technologies and Spatial Thinking Course

    Science.gov (United States)

    Nielsen, Curtis P.; Oberle, Alex; Sugumaran, Ramanathan

    2011-01-01

    Understanding geospatial technologies (GSTs) and spatial thinking is increasingly vital to contemporary life including common activities and hobbies; learning in science, mathematics, and social science; and employment within fields as diverse as engineering, health, business, and planning. As such, there is a need for a stand-alone K-12…

  15. Comprehensive, Mixed-Methods Assessment of a Blended Learning Model for Geospatial Literacy Instruction

    Science.gov (United States)

    Brodeur, J. J.; Maclachlan, J. C.; Bagg, J.; Chiappetta-Swanson, C.; Vine, M. M.; Vajoczki, S.

    2013-12-01

    Geospatial literacy -- the ability to conceptualize, capture, analyze and communicate spatial phenomena -- represents an important competency for 21st Century learners in a period of 'Geospatial Revolution'. Though relevant to in-course learning, these skills are often taught externally, placing time and resource pressures on the service providers - commonly libraries - that are relied upon to provide instruction. The emergence of online and blended modes of instruction has presented a potential means of increasing the cost-effectiveness of such activities, by simultaneously reducing instructional costs, expanding the audience for these resources, and addressing student preferences for asynchronous learning and '24-7' access. During 2011 and 2012, McMaster University Library coordinated the development, implementation and assessment of blended learning modules for geospatial literacy instruction in first-year undergraduate Social Science courses. In this paper, we present the results of a comprehensive mixed-methods approach to assess the efficacy of implementing blended learning modules to replace traditional (face-to-face), library-led, first-year undergraduate geospatial literacy instruction. Focus groups, personal interviews and an online survey were used to assess modules across dimensions of: student use, satisfaction and accessibility requirements (via Universal Instructional Design [UID] principles); instructor and teaching staff perception of pedagogical efficacy and instructional effectiveness; and, administrator cost-benefit assessment of development and implementation. Results showed that both instructors and students identified significant value in using the online modules in a blended-learning setting. Reaffirming assumptions of students' '24/7' learning preferences, over 80% of students reported using the modules on a repeat basis. Students were more likely to use the modules to better understand course content than simply to increase their grade in

  16. Engagement sensitive visual stimulation

    Directory of Open Access Journals (Sweden)

    Deepesh Kumar

    2016-06-01

    Full Text Available Stroke is one of leading cause of death and disability worldwide. Early detection during golden hour and treatment of individual neurological dysfunction in stroke using easy-to-access biomarkers based on a simple-to-use, cost-effective, clinically-valid screening tool can bring a paradigm shift in healthcare, both urban and rural. In our research we have designed a quantitative automatic home-based oculomotor assessment tool that can play an important complementary role in prognosis of neurological disorders like stroke for the neurologist. Once the patient has been screened for stroke, the next step is to design proper rehabilitation platform to alleviate the disability. In addition to the screening platform, in our research, we work in designing virtual reality based rehabilitation exercise platform that has the potential to deliver visual stimulation and in turn contribute to improving one’s performance.

  17. Feasibility study of geospatial mapping of chronic disease risk to inform public health commissioning.

    Science.gov (United States)

    Noble, Douglas; Smith, Dianna; Mathur, Rohini; Robson, John; Greenhalgh, Trisha

    2012-01-01

    To explore the feasibility of producing small-area geospatial maps of chronic disease risk for use by clinical commissioning groups and public health teams. Cross-sectional geospatial analysis using routinely collected general practitioner electronic record data. Tower Hamlets, an inner-city district of London, UK, characterised by high socioeconomic and ethnic diversity and high prevalence of non-communicable diseases. The authors used type 2 diabetes as an example. The data set was drawn from electronic general practice records on all non-diabetic individuals aged 25-79 years in the district (n=163 275). The authors used a validated instrument, QDScore, to calculate 10-year risk of developing type 2 diabetes. Using specialist mapping software (ArcGIS), the authors produced visualisations of how these data varied by lower and middle super output area across the district. The authors enhanced these maps with information on examples of locality-based social determinants of health (population density, fast food outlets and green spaces). Data were piloted as three types of geospatial map (basic, heat and ring). The authors noted practical, technical and information governance challenges involved in producing the maps. Usable data were obtained on 96.2% of all records. One in 11 adults in our cohort was at 'high risk' of developing type 2 diabetes with a 20% or more 10-year risk. Small-area geospatial mapping illustrated 'hot spots' where up to 17.3% of all adults were at high risk of developing type 2 diabetes. Ring maps allowed visualisation of high risk for type 2 diabetes by locality alongside putative social determinants in the same locality. The task of downloading, cleaning and mapping data from electronic general practice records posed some technical challenges, and judgement was required to group data at an appropriate geographical level. Information governance issues were time consuming and required local and national consultation and agreement. Producing
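
    A minimal sketch of the kind of small-area risk map described, assuming a GeoPandas-readable boundary file and a risk table keyed by area code (the file paths and column names are invented, not those of the study):

      # Sketch: choropleth of the share of adults at high diabetes risk per small area.
      # File paths and column names are hypothetical.
      import geopandas as gpd
      import pandas as pd
      import matplotlib.pyplot as plt

      areas = gpd.read_file("lsoa_boundaries.shp")      # small-area polygons
      risk = pd.read_csv("qdscore_by_lsoa.csv")         # columns: lsoa_code, pct_high_risk

      gdf = areas.merge(risk, left_on="LSOA_CODE", right_on="lsoa_code")

      ax = gdf.plot(column="pct_high_risk", cmap="OrRd", legend=True, figsize=(8, 8))
      ax.set_axis_off()
      ax.set_title("Adults at high 10-year risk of type 2 diabetes (%)")
      plt.savefig("risk_map.png", dpi=200)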

  18. Geospatial database of estimates of groundwater discharge to streams in the Upper Colorado River Basin

    Science.gov (United States)

    Garcia, Adriana; Masbruch, Melissa D.; Susong, David D.

    2014-01-01

    The U.S. Geological Survey, as part of the Department of the Interior’s WaterSMART (Sustain and Manage America’s Resources for Tomorrow) initiative, compiled published estimates of groundwater discharge to streams in the Upper Colorado River Basin as a geospatial database. For the purpose of this report, groundwater discharge to streams is the baseflow portion of streamflow that includes contributions of groundwater from various flow paths. Reported estimates of groundwater discharge were assigned as attributes to stream reaches derived from the high-resolution National Hydrography Dataset. A total of 235 estimates of groundwater discharge to streams were compiled and included in the dataset. Feature class attributes of the geospatial database include groundwater discharge (acre-feet per year), method of estimation, citation abbreviation, defined reach, and 8-digit hydrologic unit code(s). Baseflow index (BFI) estimates of groundwater discharge were calculated using an existing streamflow characteristics dataset and were included as an attribute in the geospatial database. A comparison of the BFI estimates to the compiled estimates of groundwater discharge found that the BFI estimates were greater than the reported groundwater discharge estimates.
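
    As a small, hypothetical illustration of the comparison mentioned at the end of the abstract, the snippet below joins baseflow-index (BFI) estimates to compiled discharge estimates by reach and reports how often the BFI value is larger; the column names and values are invented, not the actual schema or data of the USGS geodatabase:

      # Sketch: compare BFI-derived discharge with compiled literature estimates
      # per stream reach. Column names and values are illustrative only.
      import pandas as pd

      compiled = pd.DataFrame({
          "reach_id": [101, 102, 103],
          "gw_discharge_af_yr": [12000.0, 8500.0, 23000.0],   # acre-feet per year
      })
      bfi = pd.DataFrame({
          "reach_id": [101, 102, 103],
          "bfi_discharge_af_yr": [14500.0, 9100.0, 21000.0],
      })

      merged = compiled.merge(bfi, on="reach_id")
      merged["bfi_higher"] = merged["bfi_discharge_af_yr"] > merged["gw_discharge_af_yr"]
      print(merged)
      print("BFI exceeds compiled estimate for",
            int(merged["bfi_higher"].sum()), "of", len(merged), "reaches")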

  19. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    Science.gov (United States)

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  20. U.S. Geological Survey Geospatial Data To Support STEM Education And Communication

    Science.gov (United States)

    Molnia, B. F.

    2017-12-01

    The U.S. Geological Survey (USGS) has a long history of contributing to STEM education, outreach, and communication. The USGS EarthExplorer website: https://earthexplorer.usgs.gov is the USGS gateway to more than 150 geospatial data sets that are freely available to STEM students, educators, and researchers. Two in particular, Global Fiducials data and Declassified Satellite Imagery provide the highest resolution visual record of the Earth's surface that is available for unlimited, unrestricted download. Global Fiducials Data - Since the mid-1990s, more than 500 locations, each termed a 'Fiducial Site', have been systematically and repeatedly imaged with U.S. National Imagery Systems space-based sensors. Each location was selected for long-term monitoring, based on its history and environmental values. Since 2008, imagery from about a quarter of the sites has been publicly released and is available on EarthExplorer. These 5,000 electro-optical (EO) images, with 1.0 - 1.3 m resolution, comprise more than 140 time-series. Individual time-series focus on wildland fire recovery, Arctic sea ice change, Antarctic habitats, temperate glacier behavior, eroding barrier islands, coastline evolution, resource and ecosystem management, natural disaster response, global change studies, and other topics. Declassified Satellite Imagery - Nearly 1 million declassified photographs, collected between 1960 and 1984, by U.S. intelligence satellites KH-1 through KH-9 have been released to the public. The USGS has copies of most of the released film and provides a digital finding aid that can be accessed from the USGS EarthExplorer website. Individual frames were collected at resolutions that range from 0.61 m - 7.6 m. Imagery exists for locations on all continents. Combined with Landsat imagery, also available from the USGS EarthExplorer website, the STEM Community has access to more than 7.5 million images providing nearly 50 years of visual observations of Earth's dynamic surface.

  1. MOBILE TRAFFIC ALERT AND TOURIST ROUTE GUIDANCE SYSTEM DESIGN USING GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2017-09-01

    Full Text Available The present study describes an integrated system for traffic data collection and alert warning. Geographical-information-based decision making related to traffic destinations and routes is proposed through the design. The system includes a geospatial database holding a profile relating to the user of a mobile device. The processing and understanding of scanned maps and other digital data input leads to route guidance. The system includes a server configured to receive traffic information relating to a route and location information relating to the mobile device. The server is configured to send a traffic alert to the mobile device when the traffic information and the location information indicate that the device is traveling toward traffic congestion. The proposed system has geospatial and mobile data sets pertaining to Bangalore city in India. It is envisaged as a route-guidance and alert-relaying system for tourists, notifying them of nearby sites worth seeing in the city they have entered. The system is modular in architecture, and the novelty lies in the integration of different modules, carrying different technologies, into a complete traffic information system. The generic information processing and delivery system has been tested and found functional and fast over the test geospatial domains. In a restricted prototype with geo-referenced route data, the required information was delivered correctly over sustained trials to designated cell numbers, with an average time frame of 27.5 seconds (maximum 50 and minimum 5 seconds). Testing on the traffic geo-data set trials is underway.

  2. Towards gaze-controlled platform games

    DEFF Research Database (Denmark)

    Muñoz, Jorge; Yannakakis, Georgios N.; Mulvey, Fiona

    2011-01-01

    This paper introduces the concept of using gaze as a sole modality for fully controlling player characters of fast-paced action computer games. A user experiment is devised to collect gaze and gameplay data from subjects playing a version of the popular Super Mario Bros platform game. The initial analysis shows that there is a rather limited grid around Mario where the efficient player focuses her attention the most while playing the game. The useful grid, as we name it, projects the amount of meaningful visual information a designer should use towards creating successful player character controllers with the use of artificial intelligence for a platform game like Super Mario. Information about the eyes' position on the screen and the state of the game are utilized as inputs of an artificial neural network, which is trained to approximate which keyboard action is to be performed at each game
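
    The paper's network architecture and features are not specified in the excerpt above; as a generic sketch of the stated idea (gaze position plus game state in, keyboard action out), the snippet below trains a small multilayer perceptron on synthetic data with scikit-learn:

      # Sketch: map (gaze x, gaze y, game-state features) -> keyboard action.
      # The features and labels here are synthetic; the paper's actual setup differs.
      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.random((2000, 6))                  # gaze (2) + game-state features (4)
      y = rng.integers(0, 4, size=2000)          # 4 possible key actions

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

      clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", round(clf.score(X_te, y_te), 3))   # ~chance on random data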

  3. Scalable Multi-Platform Distribution of Spatial 3d Contents

    Science.gov (United States)

    Klimke, J.; Hagedorn, B.; Döllner, J.

    2013-09-01

    Virtual 3D city models provide powerful user interfaces for communication of 2D and 3D geoinformation. Providing high quality visualization of massive 3D geoinformation in a scalable, fast, and cost efficient manner is still a challenging task. Especially for mobile and web-based system environments, software and hardware configurations of target systems differ significantly. This makes it hard to provide fast, visually appealing renderings of 3D data throughout a variety of platforms and devices. Current mobile or web-based solutions for 3D visualization usually require raw 3D scene data such as triangle meshes together with textures delivered from server to client, which makes them strongly limited in terms of the size and complexity of the models they can handle. In this paper, we introduce a new approach for provisioning of massive, virtual 3D city models on different platforms, namely web browsers, smartphones or tablets, by means of an interactive map assembled from artificial oblique image tiles. The key concept is to synthesize such images of a virtual 3D city model by a 3D rendering service in a preprocessing step. This service encapsulates model handling and 3D rendering techniques for high quality visualization of massive 3D models. By generating image tiles using this service, the 3D rendering process is shifted from the client side, which provides major advantages: (a) the complexity of the 3D city model data is decoupled from data transfer complexity; (b) the implementation of client applications is simplified significantly, as 3D rendering is encapsulated on the server side; and (c) 3D city models can be easily deployed for and used by a large number of concurrent users, leading to a high degree of scalability of the overall approach. All core 3D rendering techniques are performed on a dedicated 3D rendering server, and thin-client applications can be compactly implemented for various devices and platforms.

  4. Geospatial Data Quality of the Servir CORS Network

    Science.gov (United States)

    Santos, J.; Teodoro, R.; Mira, N.; Mendes, V. B.

    2015-08-01

    The SERVIR Continuous Operation Reference Stations (CORS) network was implemented in 2006 to facilitate land surveying with Global Navigation Satellite Systems (GNSS) positioning techniques. Nowadays, the network covers the entire Portuguese mainland. The SERVIR data are provided to many users, such as surveyors, universities (for education and research purposes) and companies that deal with geographic information. By mid-2012, there was a significant change in the network access paradigm, the most important aspect being the increased responsibility of managing the network to guarantee permanent availability and the highest quality of the geospatial data. In addition, the software used to manage the network and to compute the differential corrections was replaced by a new software package. These facts made it necessary to perform quality control of the SERVIR network and evaluate its positional accuracy. To perform such quality control, a significant number of geodetic monuments spread throughout the country were chosen. Some of these monuments are located in the worst locations with regard to network geometry, in order to evaluate the accuracy of positions for worst-case scenarios. Data collection was carried out using different GNSS positioning modes, and the results were compared against benchmark positions determined using data acquired in static mode in 3-hour sessions. We conclude that the geospatial data calculated and provided to the user community by the network are, for surveying purposes, accurate and precise, and fit the needs of those users.
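
    A minimal sketch of the kind of quality check described, comparing surveyed positions against benchmark monument coordinates in a projected system and reporting horizontal RMSE (the coordinates are invented, not SERVIR data):

      # Sketch: horizontal RMSE of GNSS-derived positions against benchmark monuments.
      # Coordinates (easting/northing, metres) are illustrative only.
      import numpy as np

      benchmark = np.array([[48732.10, 162455.32],
                            [51210.44, 158990.07],
                            [49877.95, 160102.88]])
      surveyed = np.array([[48732.13, 162455.29],
                           [51210.40, 158990.11],
                           [49877.99, 160102.84]])

      diff = surveyed - benchmark
      horizontal_err = np.hypot(diff[:, 0], diff[:, 1])
      rmse = np.sqrt(np.mean(horizontal_err ** 2))
      print(f"horizontal RMSE: {rmse * 100:.1f} cm")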

  5. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    Science.gov (United States)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in the atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow users to access and visualize large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain and weather information, and water simulation. The web-based simulation system provides an environment for students to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.

  6. Geospatial Data as a Service: The GEOGLAM Rangelands and Pasture Productivity Map Experience

    Science.gov (United States)

    Evans, B. J. K.; Antony, J.; Guerschman, J. P.; Larraondo, P. R.; Richards, C. J.

    2017-12-01

    Empowering end-users like pastoralists, land management specialists and land policy makers in the use of earth observation data for both day-to-day and seasonal planning needs both interactive delivery of multiple geospatial datasets and the capability of supporting on-the-fly dynamic queries, while simultaneously fostering a community around the effort. The use and wide adoption of large data archives, like those produced by earth observation missions, are often limited by the compute and storage capabilities of the remote user. We demonstrate that wide-scale use of large data archives can be facilitated by end-users dynamically requesting value-added products using open standards (WCS, WMS, WPS), with compute running in the cloud or dedicated data-centres and outputs visualized on web front ends. As an example, we will demonstrate how a tool called GSKY can empower a remote end-user by providing the data delivery and analytics capabilities for the GEOGLAM Rangelands and Pasture Productivity (RAPP) Map tool. The GEOGLAM RAPP initiative from the Group on Earth Observations (GEO) and its Agricultural Monitoring subgroup aims at providing practical tools to end-users, focusing on the important role of rangelands and pasture systems in providing food production security from both agricultural crops and animal protein. Figure 1 is a screen capture from the RAPP Map interface for an important pasture area in the Namibian rangelands. The RAPP Map has been in production for six months and has garnered significant interest from groups and users all over the world. GSKY, formulated around the theme of Open Geospatial Data-as-a-Service capabilities, uses distributed computing and storage to facilitate this. It works behind the scenes, accepting OGC standard requests in WCS, WMS and WPS. Results from these requests are rendered on a web front end. In this way, the complexities of data locality and compute execution are masked from an end user. On-the-fly computation of
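
    As an illustration of the open-standards access pattern described (not GSKY's actual endpoint), the sketch below issues a standard OGC WMS GetMap request over plain HTTP; the service URL, layer name, bounding box and time value are placeholders:

      # Sketch: fetch a map image from a hypothetical OGC WMS endpoint.
      import requests

      WMS_URL = "https://example.org/ows"          # placeholder service URL

      params = {
          "service": "WMS",
          "version": "1.3.0",
          "request": "GetMap",
          "layers": "rapp:pasture_productivity",   # hypothetical layer name
          "styles": "",
          "crs": "EPSG:4326",
          "bbox": "-30,10,-20,30",                 # lat/lon order for EPSG:4326 in WMS 1.3.0
          "width": 512,
          "height": 512,
          "format": "image/png",
          "time": "2017-06-01",                    # optional TIME dimension
      }

      resp = requests.get(WMS_URL, params=params, timeout=60)
      resp.raise_for_status()
      with open("rapp_tile.png", "wb") as fh:
          fh.write(resp.content)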

  7. Multi-Layer Visualization of Mobile Mapping Data

    Directory of Open Access Journals (Sweden)

    D. Eggert

    2013-10-01

    Full Text Available For the visualization of mobile mapping data in an application, various different visualization schemes are conceivable. This paper presents a multi-layer based visualization method, enabling fast data browsing of mobile mapping data. In contrast to systems like Google Street View, the proposed visualization is not based on 360° panoramas, but on colored point clouds projected on partially translucent images. Those images are rendered as overlapping textures, preserving the depth of the recorded data and still enabling fast rendering on any kind of platform. Furthermore, the proposed visualization allows the user to inspect the mobile mapping data in a panoramic fashion with an immersive depth illusion using the parallax scrolling technique.

  8. Development of Geospatial Map Based Portal for Delimitation of Mcd Wards

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Sharma, P. Kumar

    2017-09-01

    The Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs such as DDA, MCD, DJB, the State Election Department, DMRC etc., for the benefit of all citizens of the GNCTD. This paper describes the development of the Geospatial Map based Portal for Delimitation of MCD Wards (GMPDW) and the election of the 3 Municipal Corporations of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for the delimitation of MCD Wards and the drawing of peripheral ward boundaries, supporting the planning and management of the MCD election process of the State Election Commission, and as an MCD-election-related information searching tool (Polling Stations, MCD Wards, Assembly Constituencies etc.) for the citizens of NCTD. The GMPDW is based on a Client-Server architecture model. It has been developed using ArcGIS Server 10.0 with .NET (pronounced dot net) technology. The GMPDW is scalable to an enterprise SDSS with enterprise geodatabase & Virtual Private Network (VPN) connectivity. Spatial data in the GMPDW includes Enumeration Block (EB) and Enumeration Block Group (EBG) boundaries of the citizens of Delhi; Assembly Constituency, Parliamentary Constituency and Election District boundaries; and landmark locations of Polling Stations & basic amenities (Police Stations, Hospitals, Schools and Fire Stations etc.). The GMPDW could help achieve not only the desired transparency and ease in the planning process but also facilitate management of the MCD election through efficient and effective tools. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  9. DEVELOPMENT OF GEOSPATIAL MAP BASED PORTAL FOR DELIMITATION OF MCD WARDS

    Directory of Open Access Journals (Sweden)

    A. Kumar Chandra Gupta

    2017-09-01

    Full Text Available The Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs such as DDA, MCD, DJB, the State Election Department, DMRC etc., for the benefit of all citizens of the GNCTD. This paper describes the development of the Geospatial Map based Portal for Delimitation of MCD Wards (GMPDW) and the election of the 3 Municipal Corporations of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for the delimitation of MCD Wards and the drawing of peripheral ward boundaries, supporting the planning and management of the MCD election process of the State Election Commission, and as an MCD-election-related information searching tool (Polling Stations, MCD Wards, Assembly Constituencies etc.) for the citizens of NCTD. The GMPDW is based on a Client-Server architecture model. It has been developed using ArcGIS Server 10.0 with .NET (pronounced dot net) technology. The GMPDW is scalable to an enterprise SDSS with enterprise geodatabase & Virtual Private Network (VPN) connectivity. Spatial data in the GMPDW includes Enumeration Block (EB) and Enumeration Block Group (EBG) boundaries of the citizens of Delhi; Assembly Constituency, Parliamentary Constituency and Election District boundaries; and landmark locations of Polling Stations & basic amenities (Police Stations, Hospitals, Schools and Fire Stations etc.). The GMPDW could help achieve not only the desired transparency and ease in the planning process but also facilitate management of the MCD election through efficient and effective tools. It enables a faster response to changing ground realities in development planning, owing to its in-built scientific approach and open-ended design.

  10. A Hierarchical Visualization Analysis Model of Power Big Data

    Science.gov (United States)

    Li, Yongjie; Wang, Zheng; Hao, Yang

    2018-01-01

    Based on the concept of integrating VR scenes with power big data analysis, a hierarchical visualization analysis model of power big data is proposed, in which levels are designed for different abstract modules such as transaction, engine, computation, control and storage. The previously separate modules for power data storage, data mining and analysis, and data visualization are integrated into one platform by this model. It provides a visual analysis solution for power big data.

  11. DOE's SciDAC Visualization and Analytics Center for EnablingTechnologies -- Strategy for Petascale Visual Data Analysis Success

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E Wes; Johnson, Chris; Aragon, Cecilia; Rubel, Oliver; Weber, Gunther; Pascucci, Valerio; Childs, Hank; Bremer, Peer-Timo; Whitlock, Brad; Ahern, Sean; Meredith, Jeremey; Ostrouchov, George; Joy, Ken; Hamann, Bernd; Garth, Christoph; Cole, Martin; Hansen, Charles; Parker, Steven; Sanderson, Allen; Silva, Claudio; Tricoche, Xavier

    2007-10-01

    The focus of this article is on how one group of researchers, the DOE SciDAC Visualization and Analytics Center for Enabling Technologies (VACET), is tackling the daunting task of enabling knowledge discovery through visualization and analytics on some of the world's largest and most complex datasets and on some of the world's largest computational platforms. As a Center for Enabling Technology, VACET's mission is the creation of usable, production-quality visualization and knowledge discovery software infrastructure that runs on large, parallel computer systems at DOE's Open Computing facilities and that provides solutions to the challenging visual data exploration and knowledge discovery needs of modern science, particularly the DOE science community.

  12. VISUALIZATION OF VGI DATA THROUGH THE NEW NASA WEB WORLD WIND VIRTUAL GLOBE

    Directory of Open Access Journals (Sweden)

    M. A. Brovelli

    2016-06-01

    Full Text Available GeoWeb 2.0, laying the foundations of Volunteered Geographic Information (VGI) systems, has led to platforms where users can contribute to the geographic knowledge that is open to access. Moreover, as a result of the advancements in 3D visualization, virtual globes able to visualize geographic data even on browsers emerged. However, the integration of VGI systems and virtual globes has not been fully realized. The study presented aims to visualize volunteered data in 3D, considering also the ease-of-use aspects for the general public, using Free and Open Source Software (FOSS). The new Application Programming Interface (API) of NASA, Web World Wind, written in JavaScript and based on the Web Graphics Library (WebGL), is cross-platform and cross-browser, so that the virtual globe created using this API is accessible through any WebGL-supported browser on different operating systems and devices, without requiring any installation or configuration on the client side; this makes the collected data more usable to users, which is not the case with World Wind for Java, as installation and configuration of the Java Virtual Machine (JVM) is required. Furthermore, the data collected through various VGI platforms might be in different formats, stored in a traditional relational database or in a NoSQL database. The project developed aims to visualize and query data collected through the Open Data Kit (ODK) platform and a cross-platform application, where the data are stored in a relational PostgreSQL database and a NoSQL CouchDB database, respectively.

  13. Visualization of Vgi Data Through the New NASA Web World Wind Virtual Globe

    Science.gov (United States)

    Brovelli, M. A.; Kilsedar, C. E.; Zamboni, G.

    2016-06-01

    GeoWeb 2.0, laying the foundations of Volunteered Geographic Information (VGI) systems, has led to platforms where users can contribute to geographic knowledge that is open to access. Moreover, as a result of advancements in 3D visualization, virtual globes able to visualize geographic data even in browsers have emerged. However, the integration of VGI systems and virtual globes has not been fully realized. The study presented aims to visualize volunteered data in 3D, while also considering ease of use for the general public, using Free and Open Source Software (FOSS). NASA's new Application Programming Interface (API), Web World Wind, written in JavaScript and based on the Web Graphics Library (WebGL), is cross-platform and cross-browser, so a virtual globe created with this API can be accessed through any WebGL-supported browser on different operating systems and devices. No installation or configuration is required on the client side, which makes the collected data more usable; this is not the case with World Wind for Java, where installation and configuration of the Java Virtual Machine (JVM) are required. Furthermore, the data collected through various VGI platforms might be in different formats, stored in a traditional relational database or in a NoSQL database. The project developed aims to visualize and query data collected through the Open Data Kit (ODK) platform and a cross-platform application, where the data are stored in a relational PostgreSQL database and a NoSQL CouchDB database, respectively.
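    To make the client-side globe concrete, the following sketch (not part of the record) shows the typical pattern for creating a Web World Wind globe and overlaying a single placemark that could represent a volunteered observation. It assumes the Web World Wind library is loaded and exposes a global WorldWind namespace, that the page contains a canvas element with id "globe", and that the coordinates are illustrative only.

```typescript
// A minimal sketch, not part of the record: creating a Web World Wind globe and
// overlaying one placemark that could represent a volunteered observation.
// Assumes the Web World Wind bundle is loaded and exposes a global `WorldWind`
// namespace, and that the page contains a canvas element with id "globe".
declare const WorldWind: any;

const wwd = new WorldWind.WorldWindow("globe");

// Base imagery and a compass overlay shipped with the library.
wwd.addLayer(new WorldWind.BMNGOneImageLayer());
wwd.addLayer(new WorldWind.CompassLayer());

// A renderable layer holding a single (illustrative) observation.
const vgiLayer = new WorldWind.RenderableLayer("VGI observations");
const position = new WorldWind.Position(45.478, 9.228, 100); // lat, lon, altitude (m)
const placemark = new WorldWind.Placemark(position, false, null);
placemark.label = "Sample ODK observation";
vgiLayer.addRenderable(placemark);
wwd.addLayer(vgiLayer);

wwd.redraw();
```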

  14. A geospatial modelling approach to predict seagrass habitat recovery under multiple stressor regimes

    Science.gov (United States)

    Restoration of estuarine seagrass habitats requires a clear understanding of the modes of action of multiple interacting stressors including nutrients, climate change, coastal land-use change, and habitat modification. We have developed and demonstrated a geospatial modeling a...

  15. A Microsoft Windows version of the MCNP visual editor

    International Nuclear Information System (INIS)

    Schwarz, R.A.; Carter, L.L.; Pfohl, J.

    1999-01-01

    Work has started on a Microsoft Windows version of the MCNP visual editor, which provides a graphical user interface for displaying and creating MCNP geometries. The visual editor is currently available from the Radiation Safety Information Computational Center (RSICC) and the Nuclear Energy Agency (NEA) as software package PSR-358, and it runs on the major UNIX platforms (IBM, SGI, HP, SUN) and on Linux. The initial Windows conversion focuses on the display capabilities of the visual editor; the geometry creation capability may be included in future upgrades.

  16. Establishing Accurate and Sustainable Geospatial Reference Layers in Developing Countries

    Science.gov (United States)

    Seaman, V. Y.

    2017-12-01

    Accurate geospatial reference layers (settlement names & locations, administrative boundaries, and population) are not readily available for most developing countries. This critical information gap makes it challenging for governments to plan efficiently, allocate resources, and provide basic services. It also hampers international agencies' response to natural disasters, humanitarian crises, and other emergencies. The current work involves a recent successful effort, led by the Bill & Melinda Gates Foundation and the Government of Nigeria, to obtain such data. Data collection began in 2013, with local teams collecting names, coordinates, and administrative attributes for over 100,000 settlements using ODK-enabled smartphones. A settlement feature layer extracted from satellite imagery was used to ensure all settlements were included. Administrative boundaries (Ward, LGA) were created using the settlement attributes. These "new" boundary layers were much more accurate than the existing shapefiles used by the government and international organizations. The resulting data sets helped Nigeria eradicate polio from all areas except the extreme northeast, where security issues limited access and vaccination activities. In addition to the settlement and boundary layers, a GIS-based population model was developed in partnership with Oak Ridge National Laboratory and Flowminder that used the extracted settlement areas and characteristics, along with targeted microcensus data. This model provides population and demographic estimates independent of census or other administrative data, at a resolution of 90 meters. These robust geospatial data layers have found many other uses, including establishing catchment-area settlements and populations for health facilities, validating denominators for population-based surveys, and applications across a variety of government sectors. Based on the success of the Nigeria effort, a partnership between DfID and the Bill & Melinda Gates

  17. Decision Performance Using Spatial Decision Support Systems: A Geospatial Reasoning Ability Perspective

    Science.gov (United States)

    Erskine, Michael A.

    2013-01-01

    As many consumer and business decision makers are utilizing Spatial Decision Support Systems (SDSS), a thorough understanding of how such decisions are made is crucial for the information systems domain. This dissertation presents six chapters encompassing a comprehensive analysis of the impact of geospatial reasoning ability on…

  18. How NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying government, private, public and academic communities' driven requirements.

    Science.gov (United States)

    Tisdale, M.

    2016-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability and to meet requirements driven by a diversifying set of government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Map Services (WMS) and OGC Web Coverage Services (WCS), and is leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and the ASDC are utilizing these services, developing applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for mission-critical understanding of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry.
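    As a hedged illustration of how a client might consume services of the kind described above, the sketch below requests a rendered image from a hypothetical ArcGIS Image Service using the standard exportImage REST operation; the service URL and product name are placeholders, not actual ASDC endpoints.

```typescript
// A minimal sketch, not from the abstract: requesting a rendered global image
// from an ArcGIS Image Service of the kind ASDC describes. The service URL is a
// hypothetical placeholder; exportImage and its bbox, bboxSR, size, format and f
// parameters are standard ArcGIS REST API options for Image Services.
const imageServiceUrl =
  "https://example.org/arcgis/rest/services/ASDC/SampleProduct/ImageServer";

async function fetchGlobalImage(): Promise<Blob> {
  const params = new URLSearchParams({
    bbox: "-180,-90,180,90", // whole globe in geographic coordinates
    bboxSR: "4326",          // WGS84 spatial reference for the bbox
    size: "1024,512",        // output width,height in pixels
    format: "png",
    f: "image",              // return the rendered image directly
  });
  const response = await fetch(`${imageServiceUrl}/exportImage?${params.toString()}`);
  if (!response.ok) {
    throw new Error(`Image request failed with status ${response.status}`);
  }
  return response.blob();
}
```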

  19. A Collaborative Geospatial Shoreline Inventory Tool to Guide Coastal Development and Habitat Conservation

    Directory of Open Access Journals (Sweden)

    Peter Gies

    2013-05-01

    We are developing a geospatial inventory tool that will guide habitat conservation, restoration and coastal development and will benefit several stakeholders who seek mitigation and adaptation strategies for shoreline changes resulting from erosion and sea level rise. The ESRI Geoportal Server, a web portal used to find and access geospatial information in a central repository, is customized by adding a Geoinventory tool capability that allows any shoreline-related data to be searched, displayed and analyzed in a map viewer. Users will be able to select sections of the shoreline and generate statistical reports in the map viewer to allow for comparisons. The tool will also facilitate map-based discussion forums and the creation of user groups to encourage citizen participation in decisions regarding shoreline stabilization and restoration, thereby promoting sustainable coastal development.
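    The sketch below (not from the article) suggests how a client might query a Geoportal Server catalog for shoreline-related records, assuming the classic REST search endpoint (rest/find/document) is exposed; the portal URL, search text and bounding box are illustrative placeholders.

```typescript
// A minimal sketch, not from the article: querying a Geoportal Server catalog for
// shoreline-related records, assuming the classic REST search endpoint
// (rest/find/document) is exposed. The portal URL, search text and bounding box
// are illustrative placeholders.
async function findShorelineRecords(portalUrl: string): Promise<unknown> {
  const params = new URLSearchParams({
    searchText: "shoreline",
    bbox: "-125.0,46.0,-122.0,49.0", // illustrative lon/lat extent (west,south,east,north)
    f: "json",
    max: "20",
  });
  const response = await fetch(`${portalUrl}/rest/find/document?${params.toString()}`);
  if (!response.ok) {
    throw new Error(`Catalog search failed with status ${response.status}`);
  }
  return response.json();
}
```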

  20. TOWARDS IMPLEMENTATION OF THE FOG COMPUTING CONCEPT INTO THE GEOSPATIAL DATA INFRASTRUCTURES

    Directory of Open Access Journals (Sweden)

    E. A. Panidi

    2016-01-01

    Information technologies, and Global Network technologies in particular, are developing very quickly. Consequently, the problem of incorporating these general-purpose technologies into information systems that operate with geospatial data remains relevant. The paper discusses the implementation feasibility of a number of new approaches and concepts that address the problems of publishing and managing spatial data on the Global Network. A brief review describes some contemporary concepts and technologies used for distributed data storage and management that provide combined use of server-side and client-side resources. In particular, the concepts of Cloud Computing, Fog Computing, and the Internet of Things are mentioned, along with the Java Web Start, WebRTC and WebTorrent technologies. The author's experience is described briefly, covering a number of projects devoted to the development of portable solutions for publishing geospatial data and GIS software on the Global Network.
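    To illustrate the idea of shifting part of the geospatial data exchange onto client-side resources, the minimal sketch below (not from the paper) lets two browser peers share a small GeoJSON feature over a WebRTC data channel; signaling between the peers is application-specific and omitted.

```typescript
// A minimal sketch, not from the paper: the fog-computing idea of moving some
// geospatial data exchange onto client-side resources, illustrated by browser
// peers sharing a small GeoJSON feature over a WebRTC data channel. Signaling
// (exchange of offers, answers and ICE candidates) is application-specific and
// is omitted here.
const peer = new RTCPeerConnection();
const channel = peer.createDataChannel("geodata");

channel.onopen = () => {
  const feature = {
    type: "Feature",
    geometry: { type: "Point", coordinates: [30.3, 59.95] }, // illustrative lon/lat
    properties: { name: "sample point" },
  };
  channel.send(JSON.stringify(feature));
};

channel.onmessage = (event) => {
  // The receiving peer would parse the payload and hand it to its map client.
  console.log("Received geospatial payload:", event.data);
};
```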