WorldWideScience

Sample records for learning geospatial analysis

  1. Learning R for geospatial analysis

    CERN Document Server

    Dorman, Michael

    2014-01-01

    This book is intended for anyone who wants to learn how to efficiently analyze geospatial data with R, including GIS analysts, researchers, educators, and students who work with spatial data and who are interested in expanding their capabilities through programming. The book assumes familiarity with basic geographic information concepts (such as spatial coordinates), but no prior experience with R or programming is required. By focusing on R exclusively, you will not need to depend on any external software; a working installation of R is all that is necessary to begin.

  2. Geospatial Data Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...

  3. Geospatial Technologies and Geography Education in a Changing World : Geospatial Practices and Lessons Learned

    NARCIS (Netherlands)

    2015-01-01

    Book published by IGU Commission on Geographical Education. It focuses particularly on what has been learned from geospatial projects and research from the past decades of implementing geospatial technologies in formal and informal education.

  4. Geospatial and machine learning techniques for wicked social science problems: analysis of crash severity on a regional highway corridor

    Science.gov (United States)

    Effati, Meysam; Thill, Jean-Claude; Shabani, Shahin

    2015-04-01

    The contention of this paper is that many social science research problems are too "wicked" to be suitably studied using conventional statistical and regression-based methods of data analysis. This paper argues that an integrated geospatial approach based on methods of machine learning is well suited to this purpose. Recognizing the intrinsic wickedness of traffic safety issues, such an approach is used to unravel the complexity of traffic crash severity on highway corridors as an example of such problems. The support vector machine (SVM) and coactive neuro-fuzzy inference system (CANFIS) algorithms are tested as inferential engines to predict crash severity and uncover spatial and non-spatial factors that systematically relate to crash severity, while a sensitivity analysis is conducted to determine the relative influence of crash severity factors. Different specifications of the two methods are implemented, trained, and evaluated against crash events recorded over a 4-year period on a regional highway corridor in Northern Iran. Overall, the SVM model outperforms CANFIS by a notable margin. The combined use of spatial analysis and artificial intelligence is effective at identifying leading factors of crash severity, while explicitly accounting for spatial dependence and spatial heterogeneity effects. Owing to the demonstrated effectiveness of sensitivity analysis, this approach produces comprehensive results that are consistent with existing traffic safety theories and supports the prioritization of effective safety measures that are geographically targeted and behaviorally sound on regional highway corridors.
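
    The one-at-a-time sensitivity procedure described above can be sketched in a few lines. The scorer, feature names, and weights below are hypothetical stand-ins for the paper's trained SVM/CANFIS models; only the perturb-and-compare pattern is the point.

```python
# Toy linear severity scorer standing in for a trained model.
WEIGHTS = {"speed": 0.6, "curvature": 0.3, "traffic_volume": 0.1}

def severity_score(features):
    """Weighted sum of feature values (placeholder for a real model)."""
    return sum(WEIGHTS[k] * v for k, v in features.items())

def sensitivity(features, delta=0.1):
    """Perturb each input by +delta and record the change in model output."""
    base = severity_score(features)
    impacts = {}
    for name in features:
        perturbed = dict(features)
        perturbed[name] += delta
        impacts[name] = severity_score(perturbed) - base
    return impacts

# Hypothetical crash-site attributes (normalized units)
crash_site = {"speed": 1.0, "curvature": 0.5, "traffic_volume": 2.0}
impacts = sensitivity(crash_site)
```

    Ranking `impacts` by magnitude gives the relative influence of each factor, which is the form of result the abstract reports.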

  5. An exploration of counterfeit medicine surveillance strategies guided by geospatial analysis: lessons learned from counterfeit Avastin detection in the US drug supply chain.

    Science.gov (United States)

    Cuomo, Raphael E; Mackey, Tim K

    2014-12-02

    To explore healthcare policy and system improvements that would more proactively respond to future penetration of counterfeit cancer medications in the US drug supply chain using geospatial analysis. A statistical and geospatial analysis of areas that received notices from the Food and Drug Administration (FDA) about the possibility of counterfeit Avastin penetrating the US drug supply chain. Data from FDA warning notices were compared to data from 44 demographic variables available from the US Census Bureau via correlation, means testing and geospatial visualisation. Results were interpreted in light of existing literature in order to recommend improvements to surveillance of counterfeit medicines. This study analysed 791 distinct healthcare provider addresses that received FDA warning notices across 30,431 zip codes in the USA. Statistical outputs were Pearson's correlation coefficients and t values. Geospatial outputs were cartographic visualisations. These data were used to generate the overarching study outcome, which was a recommendation for a strategy for drug safety surveillance congruent with existing literature on counterfeit medication. Zip codes with greater numbers of individuals age 65+ and greater numbers of ethnic white individuals were most correlated with receipt of a counterfeit Avastin notice. Geospatial visualisations designed in conjunction with statistical analysis of demographic variables appeared more capable of suggesting areas and populations that may be at risk for undetected counterfeit Avastin penetration. This study suggests that dual incorporation of statistical and geospatial analysis in surveillance of counterfeit medicine may be helpful in guiding efforts to prevent, detect and visualise counterfeit medicine penetrations in the US drug supply chain and other settings. Importantly, the information generated by these analyses could be utilised to identify at-risk populations associated with demographic characteristics.
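
    The study's core statistical output, Pearson's correlation between ZIP-code demographics and notice receipt, can be computed with the standard library alone. The per-ZIP values below are invented for illustration and are not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-ZIP values: share of residents 65+ vs. warning notices received
pct_over_65 = [0.10, 0.15, 0.22, 0.30, 0.35]
notices     = [0,    1,    1,    2,    3]
r = pearson_r(pct_over_65, notices)
```

    A strongly positive `r` across many ZIP codes is the kind of signal the authors paired with cartographic visualisation to flag at-risk areas.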

  6. IMPRINT Analysis of an Unmanned Air System Geospatial Information Process

    National Research Council Canada - National Science Library

    Hunn, Bruce P; Schweitzer, Kristin M; Cahir, John A; Finch, Mary M

    2008-01-01

    ... intelligence, geospatial analysis cell. The Improved Performance Research Integration Tool (IMPRINT) modeling program was used to understand this process and to assess crew workload during several test scenarios...

  7. A Research Agenda for Geospatial Technologies and Learning

    Science.gov (United States)

    Baker, Tom R.; Battersby, Sarah; Bednarz, Sarah W.; Bodzin, Alec M.; Kolvoord, Bob; Moore, Steven; Sinton, Diana; Uttal, David

    2015-01-01

    Knowledge around geospatial technologies and learning remains sparse, inconsistent, and overly anecdotal. Studies are needed that are better structured; more systematic and replicable; attentive to progress and findings in the cognate fields of science, technology, engineering, and math education; and coordinated for multidisciplinary approaches.…

  8. Machine learning on geospatial big data

    CSIR Research Space (South Africa)

    Van Zyl, T

    2014-02-01

    Full Text Available When trying to understand the difference between machine learning and statistics, it is important to note that it is not so much the set of techniques and theory that are used but more importantly the intended use of the results. In fact, many...

  9. Geospatial Field Methods: An Undergraduate Course Built Around Point Cloud Construction and Analysis to Promote Spatial Learning and Use of Emerging Technology in Geoscience

    Science.gov (United States)

    Bunds, M. P.

    2017-12-01

    , assessed for accuracy, and analyzed in Geographic Information System software. Student projects have included mapping and analyzing landslide morphology, fault scarps, and earthquake ground surface rupture. Students have praised the geospatial skills they learn, whereas helping them stay on schedule to finish their projects is a challenge.

  10. Multi-source Geospatial Data Analysis with Google Earth Engine

    Science.gov (United States)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
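
    One of the "traditionally onerous" tasks mentioned above is resampling when combining sensors of different resolutions. The sketch below shows the idea with a minimal nearest-neighbor resampler over a plain list-of-lists raster; it illustrates the concept only and is not Earth Engine's implementation.

```python
def resample_nearest(grid, new_rows, new_cols):
    """Nearest-neighbor resampling of a 2-D list-of-lists raster."""
    rows, cols = len(grid), len(grid[0])
    return [
        [grid[r * rows // new_rows][c * cols // new_cols] for c in range(new_cols)]
        for r in range(new_rows)
    ]

# A 2x2 raster upsampled to 4x4: each source cell covers a 2x2 block
raster = [[1, 2],
          [3, 4]]
upsampled = resample_nearest(raster, 4, 4)
```

    Platforms like Earth Engine apply this kind of regridding (plus reprojection) automatically so that algorithms can mix, say, 30 m Landsat with 250 m MODIS pixels.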

  11. New Techniques for Deep Learning with Geospatial Data using TensorFlow, Earth Engine, and Google Cloud Platform

    Science.gov (United States)

    Hancher, M.

    2017-12-01

    Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems like cloud detection, building detection, land cover classification, as well as more novel problems like illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.

  12. Geospatial Analysis of Grey Wolf Movement Patterns

    Science.gov (United States)

    Sur, D.

    2017-12-01

    The grey wolf is a top predator that lives across diverse habitats, ranging from Europe to North America. Wolves often hunt in packs, preferring caribou, deer, and elk as prey. Currently, many grey wolves live in Denali National Park and Preserve. In this study, several wolf packs were studied in three distinct regions of Denali. The purpose of my research was to investigate the links between wolf habitat, movement patterns, and prey thresholds. These are needed for projecting the future population growth and distribution of wolves in the studied region. I also investigated the effect wolves have on the ecological structure of the communities they inhabit. I carried out a quantitative analysis of wolf population trends and daily distance moved by using analysis of variance (ANOVA) in JMP Pro 12 (SAS Institute, Cary, NC) to assess regional differences in pack size, wolf density, and average daily distance moved. I found a clear link between wolf habitat and prey thresholds; the habitat directly influences the types of prey available. However, there was no link between daily distance moved, wolf habitat, and prey density.
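
    The regional comparison rests on a one-way ANOVA, whose F statistic can be computed directly. The daily-distance values for the three regions below are invented for illustration; only the decomposition into between-group and within-group variance is the point.

```python
def anova_f(groups):
    """One-way ANOVA F statistic for a list of numeric samples."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group means vs. the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: observations vs. their own group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical daily distances moved (km) by packs in three regions of Denali
region_a = [10.2, 11.5, 9.8]
region_b = [14.1, 13.3, 15.0]
region_c = [10.0, 10.9, 11.2]
f = anova_f([region_a, region_b, region_c])
```

    A large F relative to the F-distribution's critical value (here with 2 and 6 degrees of freedom) indicates a real regional difference in average daily movement.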

  13. Allocation of Tutors and Study Centers in Distance Learning Using Geospatial Technologies

    Directory of Open Access Journals (Sweden)

    Shahid Nawaz Khan

    2018-05-01

    Full Text Available Allama Iqbal Open University (AIOU) is Pakistan's largest distance learning institute, providing education to 1.4 million students. This is a fairly large setup across a country in which students are highly geographically dispersed. Currently, the system works using a manual approach, which is not efficient. Allocation of tutors and study centers to students plays a key role in creating a better learning environment for distance learning, and assigning them is a challenging task when there is a huge geographic spread. Using geospatial technologies in open and distance learning can address these allocation problems. This research analyzes real data from the twin cities of Islamabad and Rawalpindi. The results show that geospatial technologies can be used for efficient resource utilization and allocation, which in turn can save time and money. The overall idea fits into an improved distance learning framework and related analytics.
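
    The simplest geospatial allocation rule, assigning each student to the nearest study center by great-circle distance, can be sketched as below. The coordinates and center names are illustrative placeholders, not AIOU's actual data, and the paper's method may use additional criteria such as capacity.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def assign_center(student, centers):
    """Assign a student to the nearest study center by distance."""
    return min(centers, key=lambda name: haversine_km(*student, *centers[name]))

# Hypothetical coordinates: a student near Rawalpindi and two candidate centers
centers = {"Islamabad": (33.6844, 73.0479), "Rawalpindi": (33.5651, 73.0169)}
student = (33.58, 73.02)
nearest = assign_center(student, centers)
```

    Run over all enrolled students, this kind of rule replaces the manual allocation the abstract describes as inefficient.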

  14. Large Scale Analysis of Geospatial Data with Dask and XArray

    Science.gov (United States)

    Zender, C. S.; Hamman, J.; Abernathey, R.; Evans, K. J.; Rocklin, M.; Zender, C. S.; Rocklin, M.

    2017-12-01

    The analysis of geospatial data with high-level languages has accelerated innovation and the impact of existing data resources. However, as datasets grow beyond single-machine memory, data structures within these high-level languages can become a bottleneck. New libraries like Dask and XArray resolve some of these scalability issues, providing interactive workflows that are familiar to high-level-language researchers while also scaling out to much larger datasets. This broadens researchers' access to larger datasets on high performance computers and, through interactive development, reduces time-to-insight when compared to traditional parallel programming techniques (MPI). This talk describes Dask, a distributed dynamic task scheduler; Dask.array, a multi-dimensional array that copies the popular NumPy interface; and XArray, a library that wraps NumPy/Dask.array with labeled and indexed axes, implementing the CF conventions. We discuss both the basic design of these libraries and how they change interactive analysis of geospatial data, as well as recent benefits and challenges of distributed computing on clusters of machines.
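
    The key idea behind Dask.array, computing over a dataset one chunk at a time rather than holding it all in memory, can be illustrated without Dask itself. This stdlib sketch mimics a blocked reduction; Dask additionally builds a task graph and schedules chunks in parallel.

```python
def chunked_mean(chunks):
    """Mean of a dataset computed one chunk at a time, mimicking the
    blocked reductions Dask.array performs on out-of-core arrays."""
    total, count = 0.0, 0
    for chunk in chunks:          # each chunk could be streamed from disk
        total += sum(chunk)
        count += len(chunk)
    return total / count

def make_chunks(n, size):
    """Yield the sequence 0..n-1 in fixed-size chunks."""
    for start in range(0, n, size):
        yield range(start, min(start + size, n))

# One million values processed in 10,000-element chunks
m = chunked_mean(make_chunks(1_000_000, 10_000))
```

    Because only one chunk is resident at a time, the same pattern scales to arrays far larger than memory, which is the scalability gap the abstract describes.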

  15. A Novel Divisive Hierarchical Clustering Algorithm for Geospatial Analysis

    Directory of Open Access Journals (Sweden)

    Shaoning Li

    2017-01-01

    Full Text Available In the fields of geographic information systems (GIS) and remote sensing (RS), the clustering algorithm has been widely used for image segmentation, pattern recognition, and cartographic generalization. Although clustering analysis plays a key role in geospatial modelling, traditional clustering methods are limited in computational complexity, noise resistance, and robustness. Furthermore, traditional methods focus on the adjacent spatial context, which makes them hard to apply to multi-density discrete objects. In this paper, a new method, cell-dividing hierarchical clustering (CDHC), is proposed based on convex hull retraction. The main steps are as follows. First, a convex hull structure is constructed to describe the global spatial context of geospatial objects. Then, the retracting structure of each borderline is established in sequence by setting the initial parameter. The objects are split into two clusters (i.e., "sub-clusters") if the retracting structure intersects with the borderlines. Finally, clusters are repeatedly split and the initial parameter is updated until the termination condition is satisfied. The experimental results show that CDHC separates multi-density objects from noise sufficiently and also reduces complexity compared to the traditional agglomerative hierarchical clustering algorithm.
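
    The first step of CDHC, building a convex hull around the point set, can be sketched with Andrew's monotone-chain algorithm (a standard construction; the paper does not specify which hull algorithm it uses).

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns vertices counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # Cross product of OA x OB; positive means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                       # build the lower boundary
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):             # build the upper boundary
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]      # endpoints shared, drop duplicates

cloud = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]  # (1, 1) is interior
hull = convex_hull(cloud)
```

    CDHC then retracts the hull's borderlines inward; where the retracting structure crosses a borderline, the point set is split into two sub-clusters.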

  16. Geospatial Analysis Application to Forecast Wildfire Occurrences in South Carolina

    Directory of Open Access Journals (Sweden)

    Stephen L. Sperry

    2012-05-01

    Full Text Available Wildfire occurrence and intensity have increased over the last few decades and, at times, have been national news. Wildfire occurrence is somewhat predictable based on physical factors like meteorological conditions, fuel loads, and vegetation dynamics. Socioeconomic factors have not been widely used in wildfire occurrence models. We used a geospatial (or geographic information system) analysis approach to identify socioeconomic variables that contribute to wildfire occurrence. Key variables considered were population change, population density, poverty rate, educational level, geographic mobility, and road density (transportation network). Hot spot analysis was the primary research tool. Wildfire occurrence seemed to be positively related to low population densities, low levels of population change, high poverty rates, low educational attainment levels, and low road density. Obviously, some of these variables are correlated, and this is a complex problem. However, socioeconomic variables appeared to contribute to wildfire occurrence and should be considered in the development of wildfire occurrence forecasting models.
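
    Hot spot analysis asks, for each location, whether it and its neighbors jointly show unusually high values. The sketch below is a simplified illustration in the spirit of Getis-Ord hot spot analysis, not the exact Gi* statistic the GIS tools compute; the county counts and adjacency lists are invented.

```python
import statistics

def hot_spots(values, neighbors, threshold=0.5):
    """Flag locations whose neighborhood average exceeds the global mean
    by more than `threshold` standard deviations."""
    mean = statistics.mean(values.values())
    sd = statistics.pstdev(values.values())
    spots = []
    for loc, nbrs in neighbors.items():
        local = [values[loc]] + [values[n] for n in nbrs]
        if statistics.mean(local) > mean + threshold * sd:
            spots.append(loc)
    return spots

# Hypothetical wildfire counts per county and county adjacency lists
counts = {"A": 2, "B": 3, "C": 20, "D": 18, "E": 5}
adjacency = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"],
             "D": ["C", "E"], "E": ["D"]}
spots = hot_spots(counts, adjacency)
```

    Only locations that are high *together with their neighbors* are flagged, which is what distinguishes a hot spot from an isolated outlier.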

  17. lawn: An R client for the Turf JavaScript Library for Geospatial Analysis

    Science.gov (United States)

    lawn is an R package providing access to the geospatial analysis capabilities of the Turf JavaScript library. Turf expects data in GeoJSON format. Given that many datasets are now available natively in GeoJSON, this provides an easier method for conducting geospatial analyses on thes...
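
    GeoJSON, the interchange format Turf (and hence lawn) consumes, is plain JSON and can be built and parsed with any standard JSON library. A minimal Feature looks like this (the example coordinates and properties are arbitrary):

```python
import json

# A minimal GeoJSON Feature: geometry plus free-form properties
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-122.4194, 37.7749]},
    "properties": {"name": "San Francisco"},
}

encoded = json.dumps(feature)          # serialize for transport
decoded = json.loads(encoded)          # round-trip back to a dict
lon, lat = decoded["geometry"]["coordinates"]  # GeoJSON order is [lon, lat]
```

    Note the coordinate order: GeoJSON specifies longitude first, a frequent source of bugs when moving data between libraries.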

  18. Biosecurity and geospatial analysis of mycoplasma infections in ...

    African Journals Online (AJOL)

    Geospatial database of farm locations and biosecurity measures are essential to control disease outbreaks. A study was conducted to establish geospatial database on poultry farms in Al-Jabal Al-Gharbi region of Libya, to evaluate the biosecurity level of each farm and to determine the seroprevalence of mycoplasma and ...

  19. GeoSpatial Data Analysis for DHS Programs

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, Eric G.; Burke, John S.; Carlson, Carrie A.; Gillen, David S.; Joslyn, Cliff A.; Olsen, Bryan K.; Critchlow, Terence J.

    2009-05-10

    Law enforcement within the Department of Homeland Security faces the continual challenge of analyzing custom data sources in a geospatial context. From a strategic perspective, law enforcement must first broadly characterize a given situation using its custom data sources and then, once the situation is summarily understood, geospatially analyze the data in detail.

  20. Contextual object understanding through geospatial analysis and reasoning (COUGAR)

    Science.gov (United States)

    Douglas, Joel; Antone, Matthew; Coggins, James; Rhodes, Bradley J.; Sobel, Erik; Stolle, Frank; Vinciguerra, Lori; Zandipour, Majid; Zhong, Yu

    2009-05-01

    Military operations in urban areas often require detailed knowledge of the location and identity of commonly occurring objects and spatial features. The ability to rapidly acquire and reason over urban scenes is critically important to such tasks as mission and route planning, visibility prediction, communications simulation, target recognition, and inference of higher-level form and function. Under DARPA's Urban Reasoning and Geospatial ExploitatioN Technology (URGENT) Program, the BAE Systems team has developed a system that combines a suite of complementary feature extraction and matching algorithms with higher-level inference and contextual reasoning to detect, segment, and classify urban entities of interest in a fully automated fashion. Our system operates solely on colored 3D point clouds, and considers object categories with a wide range of specificity (fire hydrants, windows, parking lots), scale (street lights, roads, buildings, forests), and shape (compact shapes, extended regions, terrain). As no single method can recognize the diverse set of categories under consideration, we have integrated multiple state-of-the-art technologies that couple hierarchical associative reasoning with robust computer vision and machine learning techniques. Our solution leverages contextual cues and evidence propagation from features to objects to scenes in order to exploit the combined descriptive power of 3D shape, appearance, and learned inter-object spatial relationships. The result is a set of tools designed to significantly enhance the productivity of analysts in exploiting emerging 3D data sources.

  1. Geospatial Analysis of Oil and Gas Wells in California

    Science.gov (United States)

    Riqueros, N. S.; Kang, M.; Jackson, R. B.

    2015-12-01

    California currently ranks third in oil production by U.S. state, and more than 200,000 wells have been drilled in the state. Oil and gas wells provide a potential pathway for subsurface migration, which can lead to groundwater contamination and emissions of methane and other fluids to the atmosphere. Here we compile available public databases on oil and gas wells from the California Department of Conservation's Division of Oil, Gas, and Geothermal Resources, the U.S. Geological Survey, and other state and federal sources. We perform geospatial analysis at the county and field levels to characterize depths, producing formations, spud/completion/abandonment dates, land cover, population, and land ownership of active, idle, buried, abandoned, and plugged wells in California. The compiled database is designed to serve as a quantitative platform for developing field-based groundwater and air emission monitoring plans.
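
    County-level characterization of a well database reduces to grouped aggregation. The records below are invented placeholders (the real fields come from the state databases named above); the grouping pattern is the point.

```python
from collections import defaultdict

# Hypothetical well records: (county, status, depth_m)
wells = [
    ("Kern", "active", 1200.0),
    ("Kern", "plugged", 950.0),
    ("Los Angeles", "idle", 800.0),
    ("Kern", "active", 1500.0),
]

counts = defaultdict(int)        # wells per (county, status)
depth_sums = defaultdict(float)  # total depth per county
for county, status, depth in wells:
    counts[(county, status)] += 1
    depth_sums[county] += depth

# Mean well depth per county
mean_depth = {c: depth_sums[c] / sum(n for (cc, _), n in counts.items() if cc == c)
              for c in depth_sums}
```

    The same pattern extends to any of the attributes listed in the abstract (dates, land cover, ownership) by adding more keyed accumulators.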

  2. Geospatial analysis of food environment demonstrates associations with gestational diabetes.

    Science.gov (United States)

    Kahr, Maike K; Suter, Melissa A; Ballas, Jerasimos; Ramin, Susan M; Monga, Manju; Lee, Wesley; Hu, Min; Shope, Cindy D; Chesnokova, Arina; Krannich, Laura; Griffin, Emily N; Mastrobattista, Joan; Dildy, Gary A; Strehlow, Stacy L; Ramphul, Ryan; Hamilton, Winifred J; Aagaard, Kjersti M

    2016-01-01

    Gestational diabetes mellitus (GDM) is one of the most common complications of pregnancy, with incidence rates varying by maternal age, race/ethnicity, obesity, parity, and family history. Given its increasing prevalence in recent decades, covariant environmental and sociodemographic factors may be additional determinants of GDM occurrence. We hypothesized that environmental risk factors, in particular measures of the food environment, may be a diabetes contributor. We employed geospatial modeling in a populous US county to characterize the association of the relative availability of fast food restaurants and supermarkets to GDM. Utilizing a perinatal database with >4900 encoded antenatal and outcome variables inclusive of ZIP code data, 8912 consecutive pregnancies were analyzed for correlations between GDM and food environment based on countywide food permit registration data. Linkage between pregnancies and food environment was achieved on the basis of validated 5-digit ZIP code data. The prevalence of supermarkets and fast food restaurants per 100,000 inhabitants for each ZIP code were gathered from publicly available food permit sources. To independently authenticate our findings with objective data, we measured hemoglobin A1c levels as a function of geospatial distribution of food environment in a matched subset (n = 80). Residence in neighborhoods with a high prevalence of fast food restaurants (fourth quartile) was significantly associated with an increased risk of developing GDM (relative to first quartile: adjusted odds ratio, 1.63; 95% confidence interval, 1.21-2.19). In multivariate analysis, this association held true after controlling for potential confounders (P = .002). Measurement of hemoglobin A1c levels in a matched subset were significantly increased in association with residence in a ZIP code with a higher fast food/supermarket ratio (n = 80, r = 0.251 P analysis, a relationship of food environment and risk for gestational diabetes was
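
    The headline result is an odds ratio with a 95% confidence interval. The standard computation from a 2x2 exposure table is shown below; the cell counts are invented for illustration and are not the study's data (which reported an adjusted OR of 1.63, CI 1.21-2.19).

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI (Woolf's method) from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: GDM vs. no GDM in high vs. low fast-food quartiles
or_, lo, hi = odds_ratio_ci(120, 880, 80, 920)
```

    A lower confidence bound above 1.0, as in the study's reported interval, indicates an association unlikely to be due to chance alone (before adjustment for confounders).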

  3. Advancing Collaborative Climate Studies through Globally Distributed Geospatial Analysis

    Science.gov (United States)

    Singh, R.; Percivall, G.

    2009-12-01

    Infrastructure and the broader GEOSS architecture. Of specific interest to this session is the work on geospatial workflows, geo-processing, and data discovery and access. CCIP demonstrates standards-based interoperability between geospatial applications in the service of climate change analysis. CCIP is planned to be a yearly exercise. It consists of a network of online data services (WCS, WFS, SOS), analysis services (WPS, WCPS, WMS), and clients that exercise those services. In 2009, CCIP focuses on Australia and the initial application of existing OGC services to climate studies. The results of the 2009 CCIP will serve as requirements for more complex geo-processing services to be developed for CCIP 2010. The benefits of CCIP include accelerating the implementation of the GCOS and building confidence that implementations using multi-vendor interoperable technologies can help resolve vexing climate change questions. Acronyms: AIP-2, Architecture Implementation Pilot, Phase 2; CCIP, Climate Challenge Integration Plugfest; GEO, Group on Earth Observations; GEOSS, Global Earth Observing System of Systems; GCOS, Global Climate Observing System; OGC, Open Geospatial Consortium; SOS, Sensor Observation Service; WCS, Web Coverage Service; WCPS, Web Coverage Processing Service; WFS, Web Feature Service; WMS, Web Mapping Service.

  4. A Platform for Scalable Satellite and Geospatial Data Analysis

    Science.gov (United States)

    Beneke, C. M.; Skillman, S.; Warren, M. S.; Kelton, T.; Brumby, S. P.; Chartrand, R.; Mathis, M.

    2017-12-01

    At Descartes Labs, we use the commercial cloud to run global-scale machine learning applications over satellite imagery. We have processed over 5 Petabytes of public and commercial satellite imagery, including the full Landsat and Sentinel archives. By combining open-source tools with a FUSE-based filesystem for cloud storage, we have enabled a scalable compute platform that has demonstrated reading over 200 GB/s of satellite imagery into cloud compute nodes. In one application, we generated global 15m Landsat-8, 20m Sentinel-1, and 10m Sentinel-2 composites from 15 trillion pixels, using over 10,000 CPUs. We recently created a public open-source Python client library that can be used to query and access preprocessed public satellite imagery from within our platform, and made this platform available to researchers for non-commercial projects. In this session, we will describe how you can use the Descartes Labs Platform for rapid prototyping and scaling of geospatial analyses and demonstrate examples in land cover classification.

  5. Comprehensive, Mixed-Methods Assessment of a Blended Learning Model for Geospatial Literacy Instruction

    Science.gov (United States)

    Brodeur, J. J.; Maclachlan, J. C.; Bagg, J.; Chiappetta-Swanson, C.; Vine, M. M.; Vajoczki, S.

    2013-12-01

    Geospatial literacy -- the ability to conceptualize, capture, analyze and communicate spatial phenomena -- represents an important competency for 21st Century learners in a period of 'Geospatial Revolution'. Though relevant to in-course learning, these skills are often taught externally, placing time and resource pressures on the service providers - commonly libraries - that are relied upon to provide instruction. The emergence of online and blended modes of instruction has presented a potential means of increasing the cost-effectiveness of such activities, by simultaneously reducing instructional costs, expanding the audience for these resources, and addressing student preferences for asynchronous learning and '24-7' access. During 2011 and 2012, McMaster University Library coordinated the development, implementation and assessment of blended learning modules for geospatial literacy instruction in first-year undergraduate Social Science courses. In this paper, we present the results of a comprehensive mixed-methods approach to assess the efficacy of implementing blended learning modules to replace traditional (face-to-face), library-led, first-year undergraduate geospatial literacy instruction. Focus groups, personal interviews and an online survey were used to assess modules across dimensions of: student use, satisfaction and accessibility requirements (via Universal Instructional Design [UID] principles); instructor and teaching staff perception of pedagogical efficacy and instructional effectiveness; and, administrator cost-benefit assessment of development and implementation. Results showed that both instructors and students identified significant value in using the online modules in a blended-learning setting. Reaffirming assumptions of students' '24/7' learning preferences, over 80% of students reported using the modules on a repeat basis. Students were more likely to use the modules to better understand course content than simply to increase their grade in

  6. A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data

    Science.gov (United States)

    Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.

    2017-12-01

    Big data is becoming the norm in geoscience domains. A platform that can efficiently manage, access, analyze, mine, and learn from big data to produce new information and knowledge is desired. This paper introduces our latest effort to develop such a platform, based on our past years' experience with cloud and high performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with proper network, computer, and storage systems; b) the second layer is a cloud computing layer based on virtualization, providing on-demand computing services to the upper layers; c) the third layer consists of big data containers customized for dealing with different types of data and functionalities; d) the fourth layer is a big data presentation layer that supports the efficient management, access, analysis, mining, and learning of big geospatial data.

  7. SWOT analysis on National Common Geospatial Information Service Platform of China

    Science.gov (United States)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, the surveying and mapping of China is inevitably shifting from 4D product service to NCGISPC (National Common Geospatial Information Service Platform of China)-centered service. Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources, and agriculture, the shortcomings of the traditional service mode are increasingly obvious, due to the emerging requirements of e-government construction, the remarkable development of IT technology, and the online geospatial service demands of various lines of business. NCGISPC, which aims to provide multiple authoritative online one-stop geospatial information services, plus APIs for further development, to government, business, and the public, is now the strategic core of the SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using SWOT (Strength, Weakness, Opportunity and Threat) analysis, compared to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode of geospatial information in China, and will surely have a great impact not only on the construction of digital China, but also on the way that everyone uses geospatial information services.

  8. Interoperability in planetary research for geospatial data analysis

    Science.gov (United States)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
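The OGC services listed above are plain HTTP interfaces, so a request can be composed with nothing more than a query string. As a minimal sketch (the endpoint URL and layer name below are hypothetical, and WMS version 1.3.0 is assumed), a GetMap request might be built like this:

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    bbox is (min_lat, min_lon, max_lat, max_lon): WMS 1.3.0 uses
    latitude-first axis order for EPSG:4326.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical planetary WMS endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.org/mars/wms", "MOLA_elevation",
                     (-90, -180, 90, 180))
print(url)
```

The same pattern extends to the other services; only the REQUEST and service-specific parameters change (e.g. GetFeature for WFS, GetCoverage for WCS).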

  9. Geospatial Analysis of Renewable Energy Technical Potential on Tribal Lands

    Energy Technology Data Exchange (ETDEWEB)

    Doris, E.; Lopez, A.; Beckley, D.

    2013-02-01

    This technical report uses an established geospatial methodology to estimate the technical potential for renewable energy on tribal lands, for the purpose of allowing Tribes to prioritize the development of renewable energy resources either for community-scale use on tribal land or for revenue-generating electricity sales.

  10. Modeling photovoltaic diffusion: an analysis of geospatial datasets

    International Nuclear Information System (INIS)

    Davidson, Carolyn; Drury, Easan; Lopez, Anthony; Elmore, Ryan; Margolis, Robert

    2014-01-01

    This study combines address-level residential photovoltaic (PV) adoption trends in California with several types of geospatial information—population demographics, housing characteristics, foreclosure rates, solar irradiance, vehicle ownership preferences, and others—to identify which subsets of geospatial information are the best predictors of historical PV adoption. Number of rooms, heating source and house age were key variables that had not been previously explored in the literature, but are consistent with the expected profile of a PV adopter. The strong relationship provided by foreclosure indicators and mortgage status has less of an intuitive connection to PV adoption, but may be highly correlated with characteristics inherent in PV adopters. Next, we explore how these predictive factors and model performance vary between different Investor Owned Utility (IOU) regions in California, and at different spatial scales. Results suggest that models trained with small subsets of geospatial information (five to eight variables) may provide similar explanatory power as models using hundreds of geospatial variables. Further, the predictive performance of models generally decreases at higher resolution, i.e., below the ZIP-code level, since several geospatial variables with coarse native resolution become less useful for representing high-resolution variations in PV adoption trends. However, for California we find that model performance improves if parameters are trained at the regional IOU level rather than the state-wide level. We also find that models trained within one IOU region are generally representative of other IOU regions in CA, suggesting that a model trained with data from one state may be applicable in another state. (letter)
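The finding that five to eight variables can rival hundreds is easy to reproduce in miniature with ordinary least squares. The sketch below uses purely synthetic data (none of the study's actual variables), assuming only that a few predictors carry most of the signal:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Synthetic "geospatial" predictors: 5 informative columns, 95 pure noise.
X_info = rng.normal(size=(n, 5))
X_noise = rng.normal(size=(n, 95))
X = np.hstack([X_info, X_noise])
y = X_info @ np.array([2.0, -1.5, 1.0, 0.5, 3.0]) + rng.normal(scale=0.5, size=n)

def r_squared(X, y):
    """Training R^2 of an OLS fit with intercept."""
    Xd = np.hstack([np.ones((len(y), 1)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1 - resid.var() / y.var()

r2_small = r_squared(X[:, :5], y)   # 5-variable model
r2_full = r_squared(X, y)           # 100-variable model
print(f"  5 vars: R^2 = {r2_small:.3f}")
print(f"100 vars: R^2 = {r2_full:.3f}")   # barely better despite 20x variables
```

With the signal concentrated in a few columns, the 95 extra regressors buy almost nothing, mirroring the study's conclusion about small variable subsets.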

  11. Assessing Student Learning About Climate Change With Earth System Place-Based Geospatial Data

    Science.gov (United States)

    Zalles, D. R.; Krumhansl, R. A.; Acker, J. G.; Manitakos, J.; Elston, A.

    2012-12-01

    Powerful web-based data sets about geospatially situated Earth system phenomena are now available for analysis by the general public, including for any teacher or set of students who have the requisite skills to partake in the analyses. Unfortunately there exist impediments to successful use of these data. Teachers and students may lack (1) readiness to use the software interfaces for querying and representing the data, (2) needed scientific practice skills such as interpreting geographic information system-based maps and time series plots, and (3) needed understandings of the fundamental scientific concepts to make sense of the data. Hence, to evaluate any program designed to engage students and teachers with these data resources, there need to be assessment strategies to check for understanding. Assessment becomes the key to identifying learning needs and intervening appropriately with additional task scaffolding or other forms of instructional support. The paper will describe contrasting assessment strategies being carried out in two climate change education projects funded by NASA and NSF. The NASA project, Data Enhanced Investigations for Climate Change Education (DICCE), brings data from NASA satellite missions to the classroom. A bank of DICCE assessment items is being developed to measure students' abilities to transfer their skills in analyzing data about their local region to other regions of the world. Teachers choose pre-post assessment items for variables of Earth system phenomena that they target in their instruction. The data vary depending on what courses the teachers are teaching. For example, Earth science teachers are likely to choose data about atmospheric phenomena and biology teachers are more likely to choose land cover data. The NSF project, Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), provides to teachers recent climatological and vegetation data about "study areas" in Central

  12. Geospatial analysis platform: Supporting strategic spatial analysis and planning

    CSIR Research Space (South Africa)

    Naude, A

    2008-11-01

    Whilst there have been rapid advances in satellite imagery and related fine-resolution mapping and web-based interfaces (e.g. Google Earth), the development of capabilities for strategic spatial analysis and planning support has lagged behind...

  13. Geo-Spatial Social Network Analysis of Social Media to Mitigate Disasters

    Science.gov (United States)

    Carley, K. M.

    2017-12-01

    Understanding the spatial layout of human activity can afford a better understanding of many phenomena, such as local culture, the spread of ideas, and the scope of a disaster. Today, social media is one of the key sensors for acquiring information on socio-cultural activity, some of it with cues as to geo-location. We ask: what can be learned by putting such data on maps? For example, are people who chat online more likely to be near each other? Can Twitter data support disaster planning or early warning? In this talk, such issues are examined using data collected via Twitter and analyzed using ORA. ORA is a network analysis and visualization system. It supports not just social networks (who is interacting with whom), but also high-dimensional networks with many types of nodes (e.g. people, organizations, resources, activities …) and relations, geo-spatial network analysis, dynamic network analysis, and geo-temporal analysis. Using ORA, lessons learned from five case studies are considered: the Arab Spring, tsunami warning in Padang, Indonesia, Twitter around Fukushima in Japan, Typhoon Haiyan (Yolanda), and regional conflict. Using the Padang, Indonesia data, we characterize the strengths and limitations of social media data for supporting disaster planning and early warning, identifying at-risk areas and issues of concern, and estimating where people are and which areas are impacted. Using the Fukushima, Japan data, social media is used to estimate geo-spatial regularities in movement and communication that can inform disaster response and risk estimation. Using Arab Spring data, we find that the spread of bots and extremists varies by country and time, to the extent that using Twitter to understand who is important or what ideas are critical can be compromised. Bots and extremists can exploit disaster messaging to create havoc and facilitate criminal activity, e.g. human trafficking. Event discovery mechanisms that support isolating geo-epicenters for key events become crucial. Spatial inference
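The first question posed above (are people who chat online more likely to be near each other?) reduces to comparing geographic distances between linked and unlinked user pairs. A toy sketch with invented coordinates and edges (this is not ORA, which is a full analysis system; it only illustrates the comparison):

```python
import math

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

# Toy geolocated users and who-mentions-whom edges (all data invented).
users = {
    "a": (35.68, 139.69),  # Tokyo
    "b": (35.44, 139.64),  # Yokohama
    "c": (34.69, 135.50),  # Osaka
    "d": (43.06, 141.35),  # Sapporo
}
edges = {("a", "b"), ("a", "c")}

def mean_distance(pairs):
    d = [haversine_km(users[u], users[v]) for u, v in pairs]
    return sum(d) / len(d)

all_pairs = [(u, v) for u in users for v in users if u < v]
linked = [p for p in all_pairs if p in edges or (p[1], p[0]) in edges]
unlinked = [p for p in all_pairs if p not in linked]
print(f"linked pairs mean:   {mean_distance(linked):.0f} km")
print(f"unlinked pairs mean: {mean_distance(unlinked):.0f} km")
```

A real analysis would of course test the gap statistically (e.g. by permuting edges) rather than eyeballing two means.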

  14. High Performance Processing and Analysis of Geospatial Data Using CUDA on GPU

    Directory of Open Access Journals (Sweden)

    STOJANOVIC, N.

    2014-11-01

    In this paper, the high-performance processing of massive geospatial data on a many-core GPU (Graphics Processing Unit) is presented. We use the CUDA (Compute Unified Device Architecture) programming framework to implement parallel processing of common Geographic Information Systems (GIS) algorithms, such as viewshed analysis and map-matching. Experimental evaluation indicates an improvement in performance with respect to CPU-based solutions and shows the feasibility of using GPUs and CUDA for parallel implementation of GIS algorithms over large-scale geospatial datasets.
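Of the two algorithms named, map-matching in its simplest form snaps each GPS fix to the nearest point of a candidate road segment, and every fix is independent of the others — exactly the structure a CUDA kernel exploits with one thread per point. A CPU-side sketch of that per-point computation, with NumPy broadcasting standing in for threads (coordinates invented, planar geometry assumed):

```python
import numpy as np

def snap_to_segment(points, a, b):
    """Project each point onto segment a-b, vectorized over points.

    Every point is independent, so on a GPU each thread would handle
    one point; here NumPy broadcasting plays that role on the CPU.
    """
    ab = b - a
    t = ((points - a) @ ab) / (ab @ ab)   # parameter along the segment
    t = np.clip(t, 0.0, 1.0)              # clamp to the segment ends
    return a + t[:, None] * ab            # snapped coordinates

# Toy road segment and noisy GPS fixes (invented planar coordinates).
a, b = np.array([0.0, 0.0]), np.array([100.0, 0.0])
gps = np.array([[10.0, 3.0], [50.0, -2.0], [120.0, 1.0]])
snapped = snap_to_segment(gps, a, b)
print(snapped)   # last fix clamps to the segment end (100, 0)
```

A full map-matcher would repeat this against every nearby segment and keep the closest candidate (often with a hidden-Markov smoothing step), but the per-point kernel above is the part that parallelizes.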

  15. Using Geospatial Analysis to Align Little Free Library Locations with Community Literacy Needs

    Science.gov (United States)

    Rebori, Marlene K.; Burge, Peter

    2017-01-01

    We used geospatial analysis tools to develop community maps depicting fourth-grade reading proficiency test scores and locations of facilities offering public access to reading materials (i.e., public libraries, elementary schools, and Little Free Libraries). The maps visually highlighted areas with struggling readers and areas without adequate…

  16. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank V [Los Alamos National Laboratory

    2012-08-14

    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation-relevant activities, facilities, and programs. Foremost among the geospatial tools are the 'digital virtual globes' (i.e., Google Earth, Virtual Earth, etc.), which are far better than the simple 2-D plan-view line drawings previously used for visualization of known and suspected facilities of interest, and which can be critical to: (1) site familiarization and true geospatial context awareness; (2) pre-inspection planning; (3) onsite orientation and navigation; (4) post-inspection reporting; (5) site monitoring over time for changes; (6) verification of states' site declarations and input to State Evaluation reports; and (7) a common basis for discussions among all interested parties (Member States). Additionally, as an open source, such virtual globes can also provide a new, essentially free means to conduct broad-area searches for undeclared nuclear sites and activities - whether alleged through open-source leads; identified on internet blogs and wiki layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a. 'crowdsourcing'), which can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant

  17. Preliminary Geospatial Analysis of Arctic Ocean Hydrocarbon Resources

    Energy Technology Data Exchange (ETDEWEB)

    Long, Philip E.; Wurstner, Signe K.; Sullivan, E. C.; Schaef, Herbert T.; Bradley, Donald J.

    2008-10-01

    Ice coverage of the Arctic Ocean is predicted to become thinner and to cover less area with time. The combination of more ice-free waters for exploration and navigation, along with increasing demand for hydrocarbons and improvements in technologies for the discovery and exploitation of new hydrocarbon resources, has focused attention on the hydrocarbon potential of the Arctic Basin and its margins. The purpose of this document is to 1) summarize the results of a review of published hydrocarbon resources in the Arctic, including both conventional oil and gas and methane hydrates, and 2) develop a set of digital maps of the hydrocarbon potential of the Arctic Ocean. These maps can be combined with predictions of ice-free areas to enable estimates of the likely regions and sequence of hydrocarbon production development in the Arctic. In this report, conventional oil and gas resources are explicitly linked with potential gas hydrate resources. This has not been attempted previously and is particularly powerful as the likelihood of gas production from marine gas hydrates increases. Available or planned infrastructure, such as pipelines, combined with the geospatial distribution of hydrocarbons, is a very strong determinant of the temporal-spatial development of Arctic hydrocarbon resources. Significant unknowns decrease the certainty of predictions for development of hydrocarbon resources. These include: 1) areas in the Russian Arctic that are poorly mapped, 2) disputed ownership, primarily the Lomonosov Ridge, 3) lack of detailed information on gas hydrate distribution, and 4) technical risk associated with the ability to extract methane gas from gas hydrates. Logistics may control areas of exploration more than hydrocarbon potential: accessibility, established ownership, and leasing of exploration blocks may trump quality of source rock, reservoir, and size of target. With this in mind, the main areas that are likely to be explored first are the Bering Strait and Chukchi

  18. Cloud Geospatial Analysis Tools for Global-Scale Comparisons of Population Models for Decision Making

    Science.gov (United States)

    Hancher, M.; Lieber, A.; Scott, L.

    2017-12-01

    The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.
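The model-comparison step described here — checking gridded population models against ground-truth tabular counts per region — boils down to zonal aggregation. A self-contained sketch with synthetic rasters, standing in for what Earth Engine's reducers would do at scale (all grids and numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
regions = rng.integers(0, 3, size=(50, 50))    # region id for each grid cell
truth_density = np.array([10.0, 40.0, 5.0])    # true mean count per cell
true_grid = truth_density[regions]

# Two hypothetical population models: one biased, one noisy but unbiased.
model_a = true_grid * 1.3                      # systematic 30% overestimate
model_b = true_grid + rng.normal(scale=3.0, size=true_grid.shape)

# "Ground truth" tabular counts per region, e.g. from a census.
ground_truth = np.array([true_grid[regions == r].sum() for r in range(3)])

def zonal_error(model):
    """Total relative error of the model's per-region population sums."""
    zonal = np.array([model[regions == r].sum() for r in range(3)])
    return np.abs(zonal - ground_truth).sum() / ground_truth.sum()

for name, m in [("model A", model_a), ("model B", model_b)]:
    print(f"{name}: relative zonal error = {zonal_error(m):.3f}")
```

The noisy-but-unbiased model wins at the zonal level because cell-level noise averages out over each region, while the biased model's error never cancels — the kind of trade-off such a comparison tool is meant to surface.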

  19. A Geospatial Cyberinfrastructure for Urban Economic Analysis and Spatial Decision-Making

    Directory of Open Access Journals (Sweden)

    Michael F. Goodchild

    2013-05-01

    Urban economic modeling and effective spatial planning are critical tools towards achieving urban sustainability. However, in practice, many technical obstacles, such as information islands, poor documentation of data and lack of software platforms to facilitate virtual collaboration, are challenging the effectiveness of decision-making processes. In this paper, we report on our efforts to design and develop a geospatial cyberinfrastructure (GCI) for urban economic analysis and simulation. This GCI provides an operational graphic user interface, built upon a service-oriented architecture, to allow (1) widespread sharing and seamless integration of distributed geospatial data; (2) an effective way to address the uncertainty and positional errors encountered in fusing data from diverse sources; (3) the decomposition of complex planning questions into atomic spatial analysis tasks and the generation of a web service chain to tackle such complex problems; and (4) capturing and representing provenance of geospatial data to trace its flow in the modeling task. The Greater Los Angeles Region serves as the test bed. We expect this work to contribute to effective spatial policy analysis and decision-making through the adoption of advanced GCI and to broaden the application coverage of GCI to include urban economic simulations.

  20. Python geospatial development

    CERN Document Server

    Westra, Erik

    2013-01-01

    This is a tutorial-style book that will teach the usage of Python tools for GIS using simple practical examples and then show you how to build a complete mapping application from scratch. The book assumes basic knowledge of Python; no knowledge of Open Source GIS is required. It is aimed at experienced Python developers who want to learn about geospatial concepts, work with geospatial data, solve spatial problems, and build map-based applications. This book will be useful for those who want to get up to speed with Open Source GIS in order to build GIS applications or integrate geospatial features into their existing ap

  1. Geospatial Analysis Platform and tools: supporting planning and decision making across scales, borders, sectors and disciplines

    CSIR Research Space (South Africa)

    Naude, AH

    2008-04-01

    ...observation and geospatial analysis technologies, as well as the associated need for spatially explicit and sectorally integrated growth and development plans (including plans that deal with multi-scale or cross-border issues), the required statistical... planning. This requires planning and analysis that can (1) facilitate the sharing of spatial and other data, (2) deal with multi-scale or cross-border issues, and (3) support the understanding of patterns and inter-regional dynamics at regional...

  2. Geospatial Analysis Using Remote Sensing Images: Case Studies of Zonguldak Test Field

    Science.gov (United States)

    Bayık, Çağlar; Topan, Hüseyin; Özendi, Mustafa; Oruç, Murat; Cam, Ali; Abdikan, Saygın

    2016-06-01

    Inclined topography is one of the most challenging problems for geospatial analysis of airborne and spaceborne imagery, whereas flat areas can be misleading when assessing real performance. For this reason, researchers generally require a study area that includes mountainous topography and various land cover and land use types. Zonguldak and its vicinity is a very suitable test site for performance investigation of remote sensing systems because it contains different land use types such as dense forest, river, sea and urban area; different structures such as open-pit mining operations and a thermal power plant; and a mountainous structure. In this paper, we reviewed more than 120 proceedings papers and journal articles about geospatial analysis performed on the test field of Zonguldak and its surroundings. Geospatial analyses performed with imagery include elimination of systematic geometric errors, 2D/3D georeferencing accuracy assessment, DEM and DSM generation and validation, ortho-image production, evaluation of information content, image classification, automatic feature extraction and object recognition, pan-sharpening, land use and land cover change analysis and deformation monitoring. In these applications many optical satellite images are used, i.e. ASTER, Bilsat-1, IKONOS, IRS-1C, KOMPSAT-1, KVR-1000, Landsat-3-5-7, Orbview-3, QuickBird, Pleiades, SPOT-5, TK-350, RADARSAT-1, WorldView-1-2; as well as radar data, i.e. JERS-1, Envisat ASAR, TerraSAR-X, ALOS PALSAR and SRTM. These studies were performed by the Departments of Geomatics Engineering at Bülent Ecevit University, İstanbul Technical University and Yıldız Technical University, and the Institute of Photogrammetry and GeoInformation at Leibniz University Hannover. These studies were financially supported by TÜBİTAK (Turkey), the universities, ESA, Airbus DS, ERSDAC (Japan) and Jülich Research Centre (Germany).

  3. GEOSPATIAL ANALYSIS USING REMOTE SENSING IMAGES: CASE STUDIES OF ZONGULDAK TEST FIELD

    Directory of Open Access Journals (Sweden)

    Ç. Bayık

    2016-06-01

    Inclined topography is one of the most challenging problems for geospatial analysis of airborne and spaceborne imagery, whereas flat areas can be misleading when assessing real performance. For this reason, researchers generally require a study area that includes mountainous topography and various land cover and land use types. Zonguldak and its vicinity is a very suitable test site for performance investigation of remote sensing systems because it contains different land use types such as dense forest, river, sea and urban area; different structures such as open-pit mining operations and a thermal power plant; and a mountainous structure. In this paper, we reviewed more than 120 proceedings papers and journal articles about geospatial analysis performed on the test field of Zonguldak and its surroundings. Geospatial analyses performed with imagery include elimination of systematic geometric errors, 2D/3D georeferencing accuracy assessment, DEM and DSM generation and validation, ortho-image production, evaluation of information content, image classification, automatic feature extraction and object recognition, pan-sharpening, land use and land cover change analysis and deformation monitoring. In these applications many optical satellite images are used, i.e. ASTER, Bilsat-1, IKONOS, IRS-1C, KOMPSAT-1, KVR-1000, Landsat-3-5-7, Orbview-3, QuickBird, Pleiades, SPOT-5, TK-350, RADARSAT-1, WorldView-1-2; as well as radar data, i.e. JERS-1, Envisat ASAR, TerraSAR-X, ALOS PALSAR and SRTM. These studies were performed by the Departments of Geomatics Engineering at Bülent Ecevit University, İstanbul Technical University and Yıldız Technical University, and the Institute of Photogrammetry and GeoInformation at Leibniz University Hannover. These studies were financially supported by TÜBİTAK (Turkey), the universities, ESA, Airbus DS, ERSDAC (Japan) and Jülich Research Centre (Germany).

  4. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements are rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is

  5. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements are rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described
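The amplitude-and-wavelength statistics described above come from the 2-D Fourier transform: a radially averaged power spectrum turns a dominant horizontal spacing into a spectral peak. A generic NumPy sketch (not PySESA's actual implementation, which additionally handles detrending, windowing, and gridding of point clouds):

```python
import numpy as np

def radial_power_spectrum(z):
    """Radially averaged power spectrum of a square 2-D field.

    Returns (wavenumber, mean power) up to the Nyquist radius, so a
    dominant horizontal wavelength shows up as a spectral peak.
    """
    n = z.shape[0]
    power = np.abs(np.fft.fftshift(np.fft.fft2(z - z.mean()))) ** 2
    ky, kx = np.indices(power.shape) - n // 2
    k = np.hypot(kx, ky).round().astype(int)        # radial wavenumber bin
    nbins = n // 2                                  # stop at the Nyquist radius
    sums = np.bincount(k.ravel(), weights=power.ravel())[:nbins]
    counts = np.bincount(k.ravel())[:nbins]
    return np.arange(nbins), sums / counts

# Synthetic "roughness" surface: a sinusoid with 8 cycles across the grid
# plus noise (an illustrative stand-in for a detrended point-cloud patch).
n = 128
y, x = np.mgrid[0:n, 0:n]
z = (np.sin(2 * np.pi * 8 * x / n)
     + 0.2 * np.random.default_rng(2).normal(size=(n, n)))
k, p = radial_power_spectrum(z)
print("dominant wavenumber:", k[1:][np.argmax(p[1:])])   # 8 cycles per domain
```

Dividing the domain size by the peak wavenumber recovers the characteristic horizontal wavelength, which can then be paired with the amplitude variance to form the vertical-to-horizontal scaling ratios discussed in the abstract.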

  6. Geospatial analysis based on GIS integrated with LADAR.

    Science.gov (United States)

    Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim

    2013-10-07

    In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.
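The line-of-sight test underpinning the path-planning module can be sketched as sampling elevations along the sight line between two grid cells and comparing them against the straight line joining the (raised) endpoints. A minimal version on a toy elevation grid (planar geometry and unit cell spacing assumed; the DEM is invented, not the paper's LADAR data):

```python
import numpy as np

def line_of_sight(dem, start, end, eye_height=2.0):
    """Check visibility between two cells of an elevation grid.

    Samples terrain along the sight line and tests whether any sample
    rises above the straight line between the raised endpoints.
    """
    (r0, c0), (r1, c1) = start, end
    n = int(max(abs(r1 - r0), abs(c1 - c0))) * 4 + 1   # sample density
    t = np.linspace(0.0, 1.0, n)
    rows = r0 + t * (r1 - r0)
    cols = c0 + t * (c1 - c0)
    terrain = dem[rows.round().astype(int), cols.round().astype(int)]
    h0 = dem[r0, c0] + eye_height
    h1 = dem[r1, c1] + eye_height
    sight = h0 + t * (h1 - h0)                # elevation of the sight line
    return bool(np.all(terrain[1:-1] <= sight[1:-1]))

# Toy terrain: a flat plain with a ridge across the middle (invented DEM).
dem = np.zeros((20, 20))
dem[10, :] = 50.0
print(line_of_sight(dem, (0, 5), (19, 5)))   # ridge blocks the view: False
print(line_of_sight(dem, (0, 5), (5, 5)))    # same side of the ridge: True
```

A path planner would run this test between candidate waypoints (or invert it to stay hidden), swapping in LADAR-derived elevations and a proper sub-cell interpolation for the rounding used here.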

  7. GEOSPATIAL ANALYSIS OF WETLANDS DEGRADATION IN MAKURDI, NIGERIA

    Directory of Open Access Journals (Sweden)

    P. Anule

    2017-09-01

    Globally, the extent of wetlands has been declining due to the fragile nature of these ecosystems and unplanned land consumption practices. This has created pressure on suitable land for cultivation in most developing countries, where most of the growing food demand originates. Previous studies revealed that wetlands and agricultural land use dominated the landscape of Makurdi. However, the trend has been changing in recent times. Makurdi has undergone tremendous transformation in its land use/land cover due to rapid urbanization since 1976, when it became the capital city of Benue State. To estimate the land cover change in Makurdi, Landsat ETM, ETM+ and OLI satellite data for 1996, 2006 and 2016, respectively, were utilised. The study adapted the Kappa index for assessing the accuracy of the land use/cover maps generated from the analysis to improve the accuracy of results. An accuracy level of 80 to 91 % was achieved. The results reveal an overall significant increase in built-up area and other land uses at the expense of wetlands, which declined from 26.3 % in 1996 to 18.1 % in 2016. Further analysis includes the land consumption rate (LCR) and land absorption coefficient (LAC), which reveal the role of population expansion in the levels of wetland loss recorded in this study. The study projects a further decline of wetland cover by 33.15 km2 (or by 22.57 %) by 2026 if steps are not instituted to control the rate of decline. Suggestions are made to align with, and incorporate into policy, the strategic need to adopt the provisions of the SDGs at local levels if the massive failure recorded by the now-retired MDGs is to be averted.

  8. Geospatial Analysis of Wetlands Degradation in Makurdi, Nigeria

    Science.gov (United States)

    Anule, P.; Ujoh, F.

    2017-09-01

    Globally, the extent of wetlands has been declining due to the fragile nature of these ecosystems and unplanned land consumption practices. This has created pressure on suitable land for cultivation in most developing countries, where most of the growing food demand originates. Previous studies revealed that wetlands and agricultural land use dominated the landscape of Makurdi. However, the trend has been changing in recent times. Makurdi has undergone tremendous transformation in its land use/land cover due to rapid urbanization since 1976, when it became the capital city of Benue State. To estimate the land cover change in Makurdi, Landsat ETM, ETM+ and OLI satellite data for 1996, 2006 and 2016, respectively, were utilised. The study adapted the Kappa index for assessing the accuracy of the land use/cover maps generated from the analysis to improve the accuracy of results. An accuracy level of 80 to 91 % was achieved. The results reveal an overall significant increase in built-up area and other land uses at the expense of wetlands, which declined from 26.3 % in 1996 to 18.1 % in 2016. Further analysis includes the land consumption rate (LCR) and land absorption coefficient (LAC), which reveal the role of population expansion in the levels of wetland loss recorded in this study. The study projects a further decline of wetland cover by 33.15 km2 (or by 22.57 %) by 2026 if steps are not instituted to control the rate of decline. Suggestions are made to align with, and incorporate into policy, the strategic need to adopt the provisions of the SDGs at local levels if the massive failure recorded by the now-retired MDGs is to be averted.
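The LCR and LAC indicators cited in the abstract are commonly defined as built-up area per capita at one date and new built-up land absorbed per additional person between two dates, respectively (these are the standard formulations; the figures below are invented for illustration and are not the Makurdi study's data):

```python
def land_consumption_rate(area_km2, population):
    """LCR: built-up area per capita (km^2 per person) at one date."""
    return area_km2 / population

def land_absorption_coefficient(a1, a2, p1, p2):
    """LAC: new built-up land absorbed per additional person
    between two dates."""
    return (a2 - a1) / (p2 - p1)

# Illustrative figures only (hypothetical city, not Makurdi's data).
a1996, a2016 = 45.0, 110.0         # built-up area, km^2
p1996, p2016 = 200_000, 500_000    # population

print(f"LCR 1996:      {land_consumption_rate(a1996, p1996):.6f} km^2/person")
print(f"LCR 2016:      {land_consumption_rate(a2016, p2016):.6f} km^2/person")
print(f"LAC 1996-2016: "
      f"{land_absorption_coefficient(a1996, a2016, p1996, p2016):.6f}")
```

Comparing LCR across dates shows whether per-capita land consumption is rising, while LAC attributes the intervening land take to population growth — which is how the study links wetland loss to population expansion.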

  9. Geospatial analysis of forest fragmentation in Uttara Kannada District, India

    Directory of Open Access Journals (Sweden)

    Ramachandra T V

    2016-04-01

    Full Text Available Background: Landscapes consist of heterogeneous, interacting, dynamic elements with complex ecological, economic and cultural attributes. These complex interactions help sustain natural resources through bio-geochemical and hydrological cycling. Ecosystem functions are altered by changes in landscape structure. Fragmentation of large contiguous forests into small, isolated forest patches, whether by natural phenomena or anthropogenic activities, leads to drastic changes in forest patch size, shape, connectivity and internal heterogeneity, which restrict movement and lead to inbreeding among metapopulations and the extirpation of species. Methods: Landscape dynamics are assessed through land use analysis of remote sensing data acquired at different time periods. Forest fragmentation is assessed at the pixel level through the computation of two indicators: Pf (the ratio of forested pixels to the total non-water pixels in the window) and Pff (the proportion of all adjacent (cardinal directions only) pixel pairs that include at least one forest pixel for which both pixels are forested). Results: Uttara Kannada District has the distinction of having the highest forest cover in Karnataka State, India. This region has been experiencing changes in its forest cover and consequent alterations in the functional abilities of its ecosystem. Temporal land use analyses show a trend of deforestation, evident from the reduction of evergreen/semi-evergreen forest cover from 57.31 % (1979) to 32.08 % (2013). Forest fragmentation at the landscape level shows a decline of interior forests from 64.42 % (1979) to 25.62 % (2013) and a transition to non-forest categories such as crop land, plantations and built-up areas, now amounting to 47.29 %. PCA prioritized the geophysical and socio-economic variables responsible for changes in landscape structure at local levels. Conclusion: Terrestrial forest ecosystems in Uttara Kannada District of Central
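The Pf/Pff indicators described in the Methods above are simple moving-window statistics. As a hedged illustration (the window size, the class coding, and the toy grid below are my own assumptions for demonstration, not the study's actual implementation), a minimal numpy sketch is:

```python
import numpy as np

def fragmentation_indices(grid, i, j, size=3):
    """Compute Pf and Pff for a square window centred on pixel (i, j).

    grid values: 1 = forest, 0 = non-forest, -1 = water (assumed coding).
    Pf  = forested pixels / non-water pixels in the window.
    Pff = among cardinally adjacent pixel pairs containing at least one
          forest pixel, the share where both pixels are forest.
    """
    half = size // 2
    w = grid[max(i - half, 0):i + half + 1, max(j - half, 0):j + half + 1]
    non_water = np.count_nonzero(w != -1)
    pf = np.count_nonzero(w == 1) / non_water if non_water else 0.0
    pairs_with_forest = pairs_both_forest = 0
    rows, cols = w.shape
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # right and down neighbours only
                rr, cc = r + dr, c + dc
                if rr < rows and cc < cols:
                    a, b = w[r, c], w[rr, cc]
                    if a == 1 or b == 1:
                        pairs_with_forest += 1
                        if a == 1 and b == 1:
                            pairs_both_forest += 1
    pff = pairs_both_forest / pairs_with_forest if pairs_with_forest else 0.0
    return pf, pff

# A fully forested 3x3 window gives Pf = 1.0 and Pff = 1.0.
grid = np.ones((3, 3), dtype=int)
print(fragmentation_indices(grid, 1, 1))  # (1.0, 1.0)
```

On a real Landsat-derived forest mask the same function would be evaluated over a sliding window for every pixel to produce the per-pixel fragmentation maps the study describes.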

  10. Geospatial Analysis of Pediatric EMS Run Density and Endotracheal Intubation

    Directory of Open Access Journals (Sweden)

    Matthew Hansen

    2016-09-01

    Full Text Available Introduction: The association between geographic factors, including transport distance, and pediatric emergency medical services (EMS) run clustering on out-of-hospital pediatric endotracheal intubation is unclear. The objective of this study was to determine whether endotracheal intubation procedures are more likely to occur at greater distances from the hospital and near clusters of pediatric calls. Methods: This was a retrospective observational study including all EMS runs for patients less than 18 years of age from 2008 to 2014 in a geographically large and diverse Oregon county that includes densely populated urban areas near Portland and remote rural areas. We geocoded scene addresses using the automated address locator created in the cloud-based mapping platform ArcGIS, supplemented with manual address geocoding for the remaining cases. We then used the Getis-Ord Gi spatial statistic feature in ArcGIS to map statistically significant spatial clusters (hot spots) of pediatric EMS runs throughout the county. We then superimposed all intubation procedures performed during the study period on maps of pediatric EMS-run hot spots, pediatric population density, fire stations, and hospitals. We also performed multivariable logistic regression to determine whether distance traveled to the hospital was associated with intubation after controlling for several confounding variables. Results: We identified a total of 7,797 pediatric EMS runs during the study period and 38 endotracheal intubations. In univariate analysis we found that patients who were intubated were similar to those who were not with respect to gender and whether they were transported to a children’s hospital. Intubated patients tended to be transported shorter distances and were older than non-intubated patients. Increased distance from the hospital was associated with reduced odds of intubation after controlling for age, sex, scene location, and trauma system entry status in a multivariate logistic
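The hot-spot mapping step above relies on ArcGIS's Getis-Ord statistic. For readers without ArcGIS, a minimal numpy-only sketch of the Gi* z-score with a binary distance-band weight is shown below; the coordinates, counts, and distance band are illustrative toy assumptions, not the study's data.

```python
import numpy as np

def getis_ord_gi_star(values, coords, band):
    """Getis-Ord Gi* z-scores using a binary distance-band weight matrix.

    values: 1-D array of counts per cell (e.g. EMS runs).
    coords: (n, 2) array of cell centroids.
    band:   cells within this distance are neighbours (self included, i.e. Gi*).
    """
    values = np.asarray(values, dtype=float)
    coords = np.asarray(coords, dtype=float)
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = (d <= band).astype(float)          # binary weights, focal cell included
    xbar = values.mean()
    s = np.sqrt((values ** 2).mean() - xbar ** 2)
    wsum = w.sum(axis=1)
    num = w @ values - xbar * wsum
    den = s * np.sqrt((n * (w ** 2).sum(axis=1) - wsum ** 2) / (n - 1))
    return num / den

# Two toy clusters: high counts near the origin, low counts far away.
coords = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6]])
values = np.array([10, 12, 11, 1, 2])
z = getis_ord_gi_star(values, coords, band=1.5)
print(z.round(2))  # positive z for the high-count cluster, negative for the other
```

Large positive z-scores flag statistically significant hot spots; large negative scores flag cold spots, which is the interpretation used for the county-wide run maps.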

  11. Geospatial analysis of forest fragmentation in Uttara Kannada District, India

    Institute of Scientific and Technical Information of China (English)

    Ramachandra T V; Bharath Setturu; Subash Chandran

    2016-01-01

    Background: Landscapes consist of heterogeneous, interacting, dynamic elements with complex ecological, economic and cultural attributes. These complex interactions help sustain natural resources through bio-geochemical and hydrological cycling. Ecosystem functions are altered by changes in landscape structure. Fragmentation of large contiguous forests into small, isolated forest patches, whether by natural phenomena or anthropogenic activities, leads to drastic changes in forest patch size, shape, connectivity and internal heterogeneity, which restrict movement and lead to inbreeding among metapopulations and the extirpation of species. Methods: Landscape dynamics are assessed through land use analysis of remote sensing data acquired at different time periods. Forest fragmentation is assessed at the pixel level through the computation of two indicators: Pf (the ratio of forested pixels to the total non-water pixels in the window) and Pff (the proportion of all adjacent (cardinal directions only) pixel pairs that include at least one forest pixel for which both pixels are forested). Results: Uttara Kannada District has the distinction of having the highest forest cover in Karnataka State, India. This region has been experiencing changes in its forest cover and consequent alterations in the functional abilities of its ecosystem. Temporal land use analyses show a trend of deforestation, evident from the reduction of evergreen/semi-evergreen forest cover from 57.31 % (1979) to 32.08 % (2013). Forest fragmentation at the landscape level shows a decline of interior forests from 64.42 % (1979) to 25.62 % (2013) and a transition to non-forest categories such as crop land, plantations and built-up areas, now amounting to 47.29 %. PCA prioritized the geophysical and socio-economic variables responsible for changes in landscape structure at local levels. Conclusion: Terrestrial forest ecosystems in Uttara Kannada District of Central Western Ghats have been

  12. Collaborative Strategies for Sustainable EU Flood Risk Management: FOSS and Geospatial Tools—Challenges and Opportunities for Operative Risk Analysis

    Directory of Open Access Journals (Sweden)

    Raffaele Albano

    2015-12-01

    Full Text Available An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, flood risk is expected to continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to mitigate the impact of natural hazards on European economies and societies, improved risk assessment and management need to be pursued. With the recent transition to a more risk-based approach in European flood management policy, flood analysis models have become an important part of flood risk management (FRM). In this context, free and open-source (FOSS) geospatial models provide better and more complete information to stakeholders regarding their compliance with the Floods Directive (2007/60/EC) for effective and collaborative FRM. A geospatial model is an essential tool for addressing the European challenge of comprehensive and sustainable FRM because it allows integrated social and economic quantitative risk outcomes to be used in a spatio-temporal domain. Moreover, a FOSS model can support governance processes using an interactive, transparent and collaborative approach, providing a meaningful experience that both promotes learning and generates knowledge through a process of guided discovery regarding flood risk management. This article aims to organize the available knowledge and the characteristics of the available methods in order to give operational recommendations and principles that can support authorities, local entities, and the stakeholders involved in decision-making with regard to flood risk management in their compliance with the Floods Directive (2007/60/EC).

  13. The R package "sperrorest" : Parallelized spatial error estimation and variable importance assessment for geospatial machine learning

    Science.gov (United States)

    Schratz, Patrick; Herrmann, Tobias; Brenning, Alexander

    2017-04-01

    Computational and statistical prediction methods such as the support vector machine have gained popularity in remote-sensing applications in recent years and are often compared to more traditional approaches like maximum-likelihood classification. However, the accuracy assessment of such predictive models in a spatial context needs to account for the presence of spatial autocorrelation in geospatial data by using spatial cross-validation and bootstrap strategies instead of their now more widely used non-spatial equivalents. The R package sperrorest by A. Brenning [IEEE International Geoscience and Remote Sensing Symposium, 1, 374 (2012)] provides a generic interface for performing (spatial) cross-validation of any statistical or machine-learning technique available in R. Since spatial statistical models as well as flexible machine-learning algorithms can be computationally expensive, parallel computing strategies are required to perform cross-validation efficiently. The most recent major release of sperrorest therefore comes with two new features (aside from improved documentation). The first is a parallelized version of sperrorest(), parsperrorest(). This function features two parallel modes to greatly speed up cross-validation runs. Both parallel modes are platform independent and provide progress information. par.mode = 1 relies on the pbapply package and calls either parallel::mclapply() or parallel::parApply() in the background, depending on the platform: forking is used on Unix systems, while Windows systems use a cluster approach for parallel execution. par.mode = 2 uses the foreach package to perform parallelization; this method uses a different form of cluster parallelization than the parallel package. In summary, the robustness of parsperrorest() is increased by the implementation of two independent parallel modes. A new way of partitioning the data in sperrorest is provided by partition.factor.cv(). This function gives the user the
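sperrorest itself is an R package; as a language-neutral illustration of the core idea it implements, namely partitioning data into spatially separated folds for cross-validation, the following numpy sketch assigns folds by gridding the study area (the blocking scheme, names, and data are generic assumptions, not the package's actual algorithm):

```python
import numpy as np

def spatial_block_folds(coords, n_blocks=2):
    """Assign cross-validation folds by gridding the study area.

    Points in the same spatial block share a fold, so each test fold is
    spatially separated from its training data. This reduces the optimistic
    bias that spatial autocorrelation causes in ordinary random k-fold CV.
    """
    coords = np.asarray(coords, dtype=float)
    mins, maxs = coords.min(axis=0), coords.max(axis=0)
    # Scale each axis to [0, 1], then map to a block index per axis.
    scaled = (coords - mins) / np.where(maxs > mins, maxs - mins, 1.0)
    idx = np.minimum((scaled * n_blocks).astype(int), n_blocks - 1)
    return idx[:, 0] * n_blocks + idx[:, 1]  # flatten to a single fold id

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(20, 2))
folds = spatial_block_folds(coords, n_blocks=2)
for f in np.unique(folds):
    test = folds == f
    # ...train a model on folds != f, evaluate on fold f...
    print(f"fold {f}: {test.sum()} test points")
```

Each of the resulting folds plays the role of a held-out spatial region, which is the same accounting for autocorrelation that the abstract argues for.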

  14. Geospatial Semantics and the Semantic Web

    CERN Document Server

    Ashish, Naveen

    2011-01-01

    Geographic and geospatial information and services, especially on the open Web, have become abundant in the last several years with the proliferation of online maps, geocoding services, geospatial Web services and geospatially enabled applications. The need for geospatial reasoning has increased significantly in many everyday applications, including personal digital assistants, Web search applications, location-aware mobile services, and specialized systems for emergency response, medical triage, intelligence analysis and more. Geospatial Semantics and the Semantic Web: Foundation

  15. A cross-sectional ecological analysis of international and sub-national health inequalities in commercial geospatial resource availability.

    Science.gov (United States)

    Dotse-Gborgbortsi, Winfred; Wardrop, Nicola; Adewole, Ademola; Thomas, Mair L H; Wright, Jim

    2018-05-23

    Commercial geospatial data resources are frequently used to understand healthcare utilisation. Although there is widespread evidence of a digital divide for other digital resources and infrastructure, it is unclear how commercial geospatial data resources are distributed relative to health need. To examine this distribution, we assembled coverage and quality metrics for commercial geocoding, neighbourhood characterisation, and travel time calculation resources for 183 countries. We developed a country-level composite index of commercial geospatial data quality/availability and examined its distribution relative to age-standardised all-cause and cause-specific (for the three main causes of death) mortality using two inequality metrics, the slope index of inequality and the relative concentration index. In two sub-national case studies, we also examined geocoding success rates versus area deprivation by district in Eastern Region, Ghana and Lagos State, Nigeria. Internationally, commercial geospatial data resources were inversely related to all-cause mortality. This relationship was more pronounced when examining mortality due to communicable diseases. Commercial geospatial data resources for calculating patient travel times were more equitably distributed relative to health need than resources for characterising neighbourhoods or geocoding patient addresses. Countries such as South Africa have comparatively high commercial geospatial data availability despite high mortality, whilst countries such as South Korea have comparatively low data availability and low mortality. Sub-nationally, evidence was mixed as to whether geocoding success was lowest in more deprived districts. To our knowledge, this is the first global analysis of commercial geospatial data resources in relation to health outcomes. In countries such as South Africa where there is high mortality but also comparatively rich commercial geospatial

  16. Assessing Vulnerability to Heat: A Geospatial Analysis for the City of Philadelphia

    Directory of Open Access Journals (Sweden)

    Laura Barron

    2018-04-01

    Full Text Available The urban heat island (UHI) effect is an increasingly prominent health and environmental hazard linked to urbanization and climate change. Greening reduces the negative impacts of UHI; trees specifically are the most effective at reducing ambient temperature. This paper investigates vulnerability to heat in Philadelphia, Pennsylvania and identifies where street trees can be planted as a public intervention. We used geographic information systems (GIS) software to map a validated Heat Vulnerability Index and identify vulnerability at the block level. Using a high-low geospatial cluster analysis, we assessed where the City of Philadelphia can most effectively plant street trees to address UHI. This information was then aggregated to the neighborhood level for more effective citizen communication and policymaking. We identified that 26 of 48 (54%) neighborhoods that were vulnerable to heat also lacked street trees. Of 158 Philadelphia neighborhoods, 63 (40%) contained block groups of high vulnerability to either heat or lack of street tree infrastructure. The neighborhoods ranked highest in both classifications were two adjacent neighborhoods in West Philadelphia. By planting street trees, a city can potentially reduce the negative health impacts of UHI, and GIS can be used to identify and recommend street tree plantings to reduce urban heat.

  17. Comparative study of cocoa black ants temporal population distribution utilizing geospatial analysis

    Science.gov (United States)

    Adnan, N. A.; Bakar, S.; Mazlan, A. H.; Yusoff, Z. Mohd; Rasam, A. R. Abdul

    2018-02-01

    Cocoa plantations are also subject to disease and pest infestation. Some pests not only reduce the yield but also inhibit the growth of trees. The Malaysia Cocoa Board (MCB) has therefore explored the Cocoa Black Ant (CBA) as one of its biological control mechanisms to reduce infestation by the Cocoa Pod Borer (CPB). The CPB is capable of damaging cocoa beans, which in turn reduces the quality of dried cocoa beans. This study integrates geospatial analysis into the study of the CBA population distribution pattern to enhance its capability in controlling CPB infestation. The two objectives of the study are (i) to generate the temporal CBA distribution of a cocoa plantation for two different blocks, and (ii) to visually compare the CBA population distribution patterns with the aid of geospatial techniques. The study found that the CBA population showed a modest, low distribution pattern in February 2007, reached its highest levels in September 2007, and declined by the end of 2009 in the two blocks (i.e., 10B and 18A). The use of GIS is therefore important for explaining the CBA population pattern in the mature cocoa field. This finding might be used as an indicator to examine the optimum distribution of CBA, which is needed as a biological control agent against the CPB in the future.

  18. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    Science.gov (United States)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is an ever-increasing challenge, as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. Interrogating and analyzing these large datasets in real time is a task that pushes the boundaries of computing hardware and software. But the integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allowing easy integration of climate datasets with geospatial datasets and providing sophisticated visualization and analysis capabilities. The objective of TrikeND-iGlobe is the continued development of an open-source 4D virtual globe application, built on NASA World Wind technology, that integrates the analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis, locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. 
It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc

  19. Geospatial analysis of emergency department visits for targeting community-based responses to the opioid epidemic.

    Directory of Open Access Journals (Sweden)

    Daniel A Dworkis

    Full Text Available The opioid epidemic in the United States carries significant morbidity and mortality and requires a coordinated response among emergency providers, outpatient providers, public health departments, and communities. Anecdotally, providers across the spectrum of care at Massachusetts General Hospital (MGH) in Boston, MA have noticed that Charlestown, a community in northeast Boston, has been particularly impacted by the opioid epidemic and needs both emergency and longer-term resources. We hypothesized that geospatial analysis of the home addresses of patients presenting to the MGH emergency department (ED) with opioid-related emergencies might identify "hot spots" of opioid-related healthcare needs within Charlestown that could then be targeted for further investigation and resource deployment. Here, we present a geospatial analysis at the United States census tract level of the home addresses of all patients who presented to the MGH ED for opioid-related emergency visits between 7/1/2012 and 6/30/2015, including 191 visits from 100 addresses in Charlestown, MA. Among the six census tracts that comprise Charlestown, we find a 9.5-fold difference in opioid-related ED visits, with 45% of all opioid-related visits from Charlestown originating in tract 040401. The signal from this census tract remains strong after adjusting for population differences between census tracts, and while this tract is one of the Charlestown census tracts with the highest all-cause utilization of the MGH ED, it also has a 2.9-fold higher rate of opioid-related visits than the remainder of Charlestown. Identifying this hot spot of opioid-related emergency needs within Charlestown may help redistribute existing resources efficiently, empower community- and ED-based physicians to advocate for their patients, and serve as a catalyst for partnerships between MGH and local community groups. 
More broadly, this analysis demonstrates that EDs can use geospatial analysis to address

  20. From geospatial observations of ocean currents to causal predictors of spatio-economic activity using computer vision and machine learning

    Science.gov (United States)

    Popescu, Florin; Ayache, Stephane; Escalera, Sergio; Baró Solé, Xavier; Capponi, Cecile; Panciatici, Patrick; Guyon, Isabelle

    2016-04-01

    The big data transformation currently revolutionizing science and industry forges novel possibilities for multi-modal analysis scarcely imaginable only a decade ago. One of the important economic and industrial problems that stands to benefit from the recent expansion of data availability and computational prowess is the prediction of electricity demand and renewable energy generation. Both are correlates of human activity: spatiotemporal energy consumption patterns in society are a factor of both demand (weather dependent) and supply, which together determine cost, a relation expected to strengthen along with increasing renewable energy dependence. One of the main drivers of European weather patterns is the activity of the Atlantic Ocean, and in particular its dominant Northern Hemisphere current: the Gulf Stream. We chose this particular current as a test case in part due to the larger amount of relevant data and scientific literature available for the refinement of analysis techniques. This data richness is due not only to its economic importance but also to its size, which makes it clearly visible in radar and infrared satellite imagery and thus easier to detect using computer vision (CV). The power of CV techniques makes the basic analysis thus developed scalable to other smaller and less known, but still influential, currents, which are not just curves on a map but complex, evolving, moving branching trees in 3D projected onto a 2D image. We investigate means of extracting, from several image modalities (including recently available Copernicus radar and earlier infrared satellites), a parameterized representation of the state of the Gulf Stream and its environment that is useful as a feature space representation in a machine learning context, in this case within the EC's H2020-sponsored 'See.4C' project, in the context of which data scientists may find novel predictors of spatiotemporal energy flow. 
Although automated extractors of Gulf Stream position exist, they differ in methodology

  1. Geospatial Authentication

    Science.gov (United States)

    Lyle, Stacey D.

    2009-01-01

    A software package has been designed to authenticate mobile devices into a network wirelessly and in real time, using GPS signal structures to determine whether a rover is within a set of boundaries or a specific area before granting access to critical geospatial information. The advantage is that the system admits into the server only devices within the designated geospatial boundaries or areas.

  2. Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)

    Science.gov (United States)

    Hancher, M.

    2013-12-01

    Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.

  3. Learning transfer of geospatial technologies in secondary science and mathematics core areas

    Science.gov (United States)

    Nielsen, Curtis P.

    The purpose of this study was to investigate the transfer of geospatial technology knowledge and skills presented in a social sciences course context to other core areas of the curriculum. Specifically, this study explored the transfer of geospatial technology knowledge and skills to the STEM-related core areas of science and mathematics among ninth-grade students. Haskell's (2001) research on "levels of transfer" provided the theoretical framework for this study, which sought to demonstrate the experimental group's higher ability to transfer geospatial skills, higher mean assignment scores, higher post-test scores, higher geospatial skill application and deeper levels of transfer application relative to the control group. The participants consisted of thirty ninth-graders enrolled in U.S. History, Earth Science and Integrated Mathematics 1 courses. The primary investigator of this study had no previous classroom experience with this group of students. The participants, who were enrolled in the school's existing two-section class configuration, were assigned to experimental and control groups. The experimental group had ready access to Macintosh MacBook laptop computers, and the control group had ready access to iPads. All participants in U.S. History received instruction with, and were required to use, ArcGIS Explorer Online during a Westward Expansion project. All participants were given the ArcGIS Explorer Online content assessment following the completion of the U.S. History project. Once the project in U.S. History was completed, Earth Science and Integrated Mathematics 1 began units of instruction, beginning with a multiple-choice content pre-test created by the classroom teachers. Experimental participants received the same unit of instruction without the use or influence of ArcGIS Explorer Online. At the end of the Earth Science and Integrated Math 1 units, the same multiple-choice test was administered as the content post-test. Following the

  4. Infant and Child Mortality in India in the Last Two Decades: A Geospatial Analysis

    Science.gov (United States)

    Singh, Abhishek; Pathak, Praveen Kumar; Chauhan, Rajesh Kumar; Pan, William

    2011-01-01

    Background: Studies examining the intricate interplay between poverty, female literacy, child malnutrition, and child mortality are rare in the demographic literature. Given the recent focus on Millennium Development Goals 4 (child survival) and 5 (maternal health), we explored whether the geographic regions that were underprivileged in terms of wealth, female literacy, child nutrition, or safe delivery were also grappling with an elevated risk of child mortality; whether there were any spatial outliers; and whether these relationships have undergone any significant change over historical time periods. Methodology: The present paper investigated these critical questions using data from household surveys, namely NFHS 1992–1993, NFHS 1998–1999 and DLHS 2002–2004. For the first time, we employed geospatial techniques such as Moran's I, univariate LISA, bivariate LISA, spatial error regression, and spatiotemporal regression to address the research problem. For the geospatial analysis, we classified India into 76 natural regions based on the agro-climatic scheme proposed by Bhat and Zavier (1999), following the Census of India study, and all estimates were generated for each of the geographic regions. Results/Conclusions: This study brings out the stark intra-state and inter-regional disparities in infant and under-five mortality in India over the past two decades. It further reveals, for the first time, that geographic regions that were underprivileged in child nutrition, wealth, or female literacy were also likely to be disadvantaged in terms of infant and child survival, irrespective of the state to which they belong. While the role of economic status in explaining child malnutrition and child survival has weakened, the effect of mother's education has actually become stronger over time. PMID:22073208
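The global spatial autocorrelation statistic named in the Methodology above, Moran's I, is straightforward to compute; a minimal numpy sketch on toy data (the rook-adjacency matrix and regional values below are illustrative, not the study's data) is:

```python
import numpy as np

def morans_i(values, w):
    """Global Moran's I for regional values under a binary weight matrix w."""
    x = np.asarray(values, dtype=float)
    z = x - x.mean()                 # deviations from the mean
    w = np.asarray(w, dtype=float)
    n = len(x)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Four regions on a line with rook adjacency; a smooth gradient of values
# across neighbouring regions yields positive spatial autocorrelation.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(round(morans_i([1, 2, 3, 4], w), 3))  # → 0.333
```

Values near +1 indicate clustering of similar mortality levels in neighbouring regions, near -1 a checkerboard pattern, and near 0 spatial randomness, which is the interpretation behind the regional disparities reported in the abstract.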

  5. Geospatial Analysis of Low-frequency Radio Signals Collected During the 2017 Solar Eclipse

    Science.gov (United States)

    Liles, W. C.; Nelson, J.; Kerby, K. C.; Lukes, L.; Henry, J.; Oputa, J.; Lemaster, G.

    2017-12-01

    The total solar eclipse of 2017, with a path that crosses the continental United States, offers a unique opportunity to gather geospatially diverse data. The EclipseMob project has been designed to crowdsource these data by building a network of citizen scientists across the country. The project focuses on gathering low-frequency radio wave data before, during, and after the eclipse. WWVB, a 60 kHz transmitter in Ft. Collins, CO operated by the National Institute of Standards and Technology, will provide the transmit signal that will be observed by project participants. Participating citizen scientists are building simple antennas and receivers designed by the EclipseMob team and provided to participants in the form of "receiver kits." The EclipseMob receiver downconverts the 60 kHz signal to 18 kHz and supplies the downconverted signal to the audio jack of a smartphone. A dedicated app is used to collect data and upload them to the EclipseMob server. By studying the variations in WWVB amplitude observed during the eclipse at over 150 locations across the country, we aim to understand how the ionization of the D layer of the ionosphere is impacted by the eclipse as a function of both time and space (location). The diverse locations of the EclipseMob participants will provide data from a wide variety of propagation paths, some crossing the path of totality and some remaining on the same side of the eclipse path as the transmitter. Our initial data analysis will involve identifying characteristics that define geospatial relationships in the behavior of observed WWVB signal amplitudes.

  6. The Analysis of Geospatial Information for Validating Some Numbers of Islands in Indonesia

    Directory of Open Access Journals (Sweden)

    Sukendra - Martha

    2017-12-01

    This article compares the various reported numbers of islands in Indonesia and addresses a valid method of enumerating them. The methodology is a comparative analysis of island counts from various sources. First, the numbers of Indonesian islands reported were: (i) the Centre for Survey and Mapping of the Indonesian Armed Forces (Pussurta ABRI) recorded 17,508 islands; (ii) the Agency for Geospatial Information (BIG), previously known as the National Coordinating Agency for Surveys and Mapping (Bakosurtanal), as the national mapping authority reported 17,506 islands (after losing the islands of Sipadan and Ligitan); (iii) the Ministry of Internal Affairs published 17,504 islands, a number many parties have cited even though it is not yet supported by back-up documents; (iv) the Hydrographic Office of the Indonesian Navy has released a figure of 17,499; (v) other sources indicate still different numbers, which inevitably causes public confusion. On the other hand, the figure of 13,466 named islands is supported by a strong document (a gazetteer). Second, the total number of islands in Indonesia can be enumerated in three ways: (i) an island census through a toponymic survey, (ii) using maps, and (iii) applying remote sensing images. Third, the most valid procedure for establishing the number of islands is the remote sensing approach using high-resolution satellite images. This work implies the need for a single geospatial data source (including the total number of islands) in the form of a 'One Map Policy', which will improve Indonesian geographic data administration.

  7. From Analysis to Impact: Challenges and Outcomes from Google's Cloud-based Platforms for Analyzing and Leveraging Petapixels of Geospatial Data

    Science.gov (United States)

    Thau, D.

    2017-12-01

    For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools.

  8. Geospatial health

    DEFF Research Database (Denmark)

    Utzinger, Jürg; Rinaldi, Laura; Malone, John B.

    2011-01-01

    Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000 and the inaugural issue of its official journal was published in November 2006 with the aim to cover all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member strong editorial board.

  9. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    Science.gov (United States)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on, and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is built on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools for other domains with spatial properties.

  10. Exploring Local Level Factors Shaping the Implementation of a Blended Learning Module for Information and Geospatial Literacy in Ontario

    Directory of Open Access Journals (Sweden)

    Michelle M. Vine

    2016-12-01

    The objectives of this research study were to examine local level factors shaping the implementation of a blended pedagogical approach for geospatial and information literacy, and to understand implementer satisfaction. As such, we addressed the following research questions: What local-level factors shape the implementation of the blended learning model? and How satisfied are implementers (faculty, administrators and library instructional/support staff) with the new blended learning model for geospatial and information fluency? Focus groups (n=7) plus one interview (total n=22) were conducted with key stakeholders (e.g., staff, faculty, administrators) to better understand facilitators, barriers, and/or issues related to module development, in addition to perceptions about how the modules are utilized by teaching assistants (TAs), instructional assistants (IAs), and instructors. Participants were identified according to their status as either discipline-specific instructional staff (i.e., instructor, TA, IA) or staff who supported the development of modules (i.e., library instructional staff, library management, administrators). From an ontological standpoint that privileges an individual perspective on the nature of reality, while epistemologically seeking to understand the relationship between the "knower" and what can be known, we adopted a theory of constructivism to support this inquiry. Transcripts were imported into a qualitative analysis software package (NVivo 8.0) for organization, coding and analysis. Instructors found value in the online modules, particularly in a blended learning setting. Instructors felt that, with the material available in advance, in-class time could be better focused on interaction, assignments, and assessments, which resulted in reduced anxiety in busy lab environments. Several key themes emerged, including: (a) instructor expectations (time constraints, sustainability, and collaborative nature of development process and

  11. Artificial Intelligence in geospatial analysis: applications of self-organizing maps in the context of geographic information science.

    OpenAIRE

    Henriques, Roberto André Pereira

    2011-01-01

    A thesis submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems. The size and dimensionality of available geospatial repositories increases every day, placing additional pressure on existing analysis tools, as they are expected to extract more knowledge from these databases. Most of these tools were created in a data poor environment and thus rarely address concerns of efficiency, dimensionality and automatic exploration. In ...

  12. Integrating Free and Open Source Solutions into Geospatial Science Education

    Directory of Open Access Journals (Sweden)

    Vaclav Petras

    2015-06-01

    While free and open source software becomes increasingly important in geospatial research and industry, open science perspectives are generally less reflected in universities' educational programs. We present an example of how free and open source software can be incorporated into geospatial education to promote open and reproducible science. Since 2008, graduate students at North Carolina State University have had the opportunity to take a course on geospatial modeling and analysis that is taught with both proprietary and free and open source software. In this course, students perform geospatial tasks simultaneously in the proprietary package ArcGIS and the free and open source package GRASS GIS. By ensuring that students learn to distinguish between geospatial concepts and software specifics, the course helps them become more flexible and stronger spatial thinkers when choosing solutions for their independent work in the future. We also discuss ways to continually update and improve our publicly available teaching materials for reuse by teachers, self-learners and other members of the GIS community. Only when free and open source software is fully integrated into geospatial education will we be able to encourage a culture of openness and thus enable greater reproducibility in research and development applications.

  13. Geo-spatial multi-criteria analysis for wave energy conversion system deployment

    Energy Technology Data Exchange (ETDEWEB)

    Nobre, Ana; Pacheco, Miguel [Data Centre, Instituto Hidrografico, Portuguese Navy, Rua das Trinas 49, 1249-093 Lisboa (Portugal); Jorge, Raquel; Lopes, M.F.P.; Gato, L.M.C. [IDMEC, Instituto Superior Tecnico, Technical University of Lisbon, Avenida Rovisco Pais 1, 1049-001, Lisboa (Portugal)

    2009-01-15

    The growing requirements for renewable energy production have led to the development of a new series of systems, including wave energy conversion systems. Due to their sensitivity and the impact of the aggressive marine environment, selecting the most adequate location for these systems is a major task. Several factors, such as technological limitations, environmental conditions, and administrative and logistic conditions, have to be taken into account to support the decision on the best location. This paper describes a geo-spatial multi-criteria analysis methodology, based on geographic information systems technology, for identification of the best location to deploy a wave energy farm. This methodology is not conversion-system dependent and can therefore be easily customized for different systems and implementation conditions. Selection factors can include, for example, ocean depth, sea bottom type, existing underwater cables, marine protected areas, port locations, shoreline, power grid location, military exercise areas, and the climatology of wave significant height, period and power. A case study demonstrating this methodology is presented for an area offshore the Portuguese southwest coast. The system output allows a clear differential identification of the best spots for implementing a wave energy farm. It is not just a simple Boolean result showing valid and invalid locations, but a layer with a valued suitability for farm deployment. (author)
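    The core of such a GIS multi-criteria methodology is a weighted overlay: continuous factors are normalized, combined with weights, and then masked by hard constraints, so the output is a graded suitability surface rather than a Boolean map. A toy sketch (the rasters and weights below are invented for illustration, not taken from the paper):

```python
import numpy as np

# Hypothetical criterion rasters over a 3x3 grid of ocean cells, each
# normalized to [0, 1] (1 = most favorable).
wave_power  = np.array([[0.9, 0.8, 0.4], [0.7, 0.6, 0.3], [0.5, 0.2, 0.1]])
grid_access = np.array([[0.2, 0.5, 0.9], [0.3, 0.6, 0.8], [0.4, 0.7, 0.9]])

# Hard constraint mask: 0 where depth, cables, or protected areas exclude a cell.
depth_ok = np.array([[1, 1, 0], [1, 1, 1], [1, 0, 1]], dtype=float)

# Weighted linear combination, then masked by the Boolean constraints.
weights = {"wave_power": 0.6, "grid_access": 0.4}
suitability = (weights["wave_power"] * wave_power
               + weights["grid_access"] * grid_access) * depth_ok
```

    In practice each selection factor listed in the abstract (depth, cables, protected areas, wave climatology, and so on) would contribute its own normalized layer or exclusion mask.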

  14. Geo-Spatial Multi-criteria Analysis for Wave Energy System Deployment

    Energy Technology Data Exchange (ETDEWEB)

    Nobre, Ana; Pacheco, Miguel (Instituto Hidrografico, Rua das Trinas, 49, Lisboa (PT)); Jorge, Raquel Lopes, M. F. P.; Gato, L. M. C. (IDMEC, Instituto Superior Tecnico, Technical University of Lisbon, Av. Rovisco Pais, Lisboa (PT))

    2007-07-01

    The growing requirements for renewable energy production have led to the development of a new series of systems, including wave energy conversion systems. Due to their sensitivity and the impact of the aggressive marine environment, selecting the most adequate location for these systems is a major task. Several factors, such as technological limitations, environmental conditions, and administrative and logistic conditions, have to be taken into account to support the decision on the best location. This paper describes a geo-spatial multi-criteria analysis methodology, based on geographic information systems technology, for selection of the best location to deploy a wave energy farm. This methodology is not conversion-system dependent and can therefore be easily customized for different systems and conditions. Selection factors can include, for example, ocean depth, bottom type, underwater cables, marine protected areas, port locations, shoreline, power grid location, military exercise areas, and the climatology of wave significant height, period and direction. A case study demonstrating this methodology is presented for an area offshore the Portuguese southwest coast. The system output allows a clear identification of the best spots for a wave energy farm. It is not just a simple Boolean result showing valid and invalid locations, but a layer with a graded suitability for farm deployment.

  15. Image processing analysis of geospatial uav orthophotos for palm oil plantation monitoring

    Science.gov (United States)

    Fahmi, F.; Trianda, D.; Andayani, U.; Siregar, B.

    2018-03-01

    Unmanned Aerial Vehicles (UAVs) are one of the tools that can be used to monitor palm oil plantations remotely. With geospatial orthophotos, it is possible to identify which parts of the plantation land are fertile, where planted crops grow perfectly; which parts are less fertile, where growth is imperfect; and which parts of the plantation field are not growing at all. This information can be obtained quickly with the use of UAV photos. In this study, we utilized an image processing algorithm to process the orthophotos for more accurate and faster analysis. The resulting orthophoto images were processed using Matlab, including classification of fertile, infertile, and dead palm oil plants, by using the Gray Level Co-occurrence Matrix (GLCM) method. The GLCM method was developed based on four direction parameters with specific angles of 0°, 45°, 90°, and 135°. From the results of research conducted with 30 image samples, it was found that the accuracy of the system can be reached by using the features extracted from the matrix: Contrast, Correlation, Energy, and Homogeneity.
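    A GLCM tallies how often pairs of gray levels co-occur at a given pixel offset; the texture features named in the abstract are then simple sums over the normalized matrix. A minimal sketch for one of the four directions (0°, i.e. offset dx=1, dy=0), on an invented 4-level image and omitting the Correlation feature for brevity:

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Gray-level co-occurrence matrix for one offset (dx, dy), normalized
    so its entries sum to 1."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[img[y, x], img[y2, x2]] += 1
    return m / m.sum()

def texture_features(p):
    """Contrast, Energy, and Homogeneity of a normalized GLCM `p`."""
    i, j = np.indices(p.shape)
    contrast = ((i - j) ** 2 * p).sum()
    energy = (p ** 2).sum()
    homogeneity = (p / (1 + np.abs(i - j))).sum()
    return contrast, energy, homogeneity

# Tiny illustrative image quantized to 4 gray levels.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
p = glcm(img, dx=1, dy=0, levels=4)   # the 0-degree direction
c, e, h = texture_features(p)
```

    The paper's pipeline would compute these features for all four offsets (0°, 45°, 90°, 135°) and feed them to a classifier.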

  16. Surface deformation analysis over Vrancea seismogenic area through radar and GPS geospatial data

    Science.gov (United States)

    Zoran, Maria A.; Savastru, Roxana S.; Savastru, Dan M.; Serban, Florin S.; Teleaga, Delia M.; Mateciuc, Doru N.

    2017-10-01

    Time series analyses of GPS (Global Positioning System) and InSAR (Interferometric Synthetic Aperture Radar) data are important tools for assessing Earth's surface deformation, which can result from a wide range of geological phenomena such as earthquakes, landslides or ground water level changes. The aim of this paper was to identify several types of earthquake precursors that might be observed from geospatial data in the Vrancea seismogenic region in Romania. Data recorded between 2005 and 2012 by Romania's continuous GPS network stations and a few field campaigns revealed a relative displacement of about 5 or 6 millimeters per year in the horizontal direction, and a few millimeters per year in the vertical direction. To assess possible deformations due to earthquakes and possible slow deformations, Sentinel-1 time series satellite data available for the Vrancea zone from October 2014 to October 2016 were also used to generate two types of interferograms (short-term and medium-term). No medium or strong earthquakes were recorded during the investigated period, so the interferograms over the test area revealed only small vertical displacements (subsidence or uplift) of 5-10 millimeters per year. Based on the continuous GPS network data and the Sentinel-1 results, different possible tectonic scenarios were developed. The localization of horizontal and vertical motions, fault slip, and surface deformation of the continental blocks provides new information in support of different geodynamic models for the Vrancea tectonically active region in Romania and Europe.

  17. Geospatial Information Response Team

    Science.gov (United States)

    Witt, Emitt C.

    2010-01-01

    Extreme emergency events of national significance that include manmade and natural disasters seem to have become more frequent during the past two decades. The Nation is becoming more resilient to these emergencies through better preparedness, reduced duplication, and establishing better communications so every response and recovery effort saves lives and mitigates the long-term social and economic impacts on the Nation. The National Response Framework (NRF) (http://www.fema.gov/NRF) was developed to provide the guiding principles that enable all response partners to prepare for and provide a unified national response to disasters and emergencies. The NRF provides five key principles for better preparation, coordination, and response: 1) engaged partnerships, 2) a tiered response, 3) scalable, flexible, and adaptable operations, 4) unity of effort, and 5) readiness to act. The NRF also describes how communities, tribes, States, the Federal Government, private-sector, and non-governmental partners apply these principles for a coordinated, effective national response. The U.S. Geological Survey (USGS) has adopted the NRF doctrine by establishing several earth-science discipline-level teams to ensure that USGS science, data, and individual expertise are readily available during emergencies. The Geospatial Information Response Team (GIRT) is one of these teams. The USGS established the GIRT to facilitate the effective collection, storage, and dissemination of geospatial data information and products during an emergency. The GIRT ensures that timely geospatial data are available for use by emergency responders, land and resource managers, and for scientific analysis.
In an emergency and response capacity, the GIRT is responsible for establishing procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing coordinated products and services utilizing the USGS' exceptional pool of

  18. Landscape epidemiology and machine learning: A geospatial approach to modeling West Nile virus risk in the United States

    Science.gov (United States)

    Young, Sean Gregory

    The complex interactions between human health and the physical landscape and environment have been recognized, if not fully understood, since the ancient Greeks. Landscape epidemiology, sometimes called spatial epidemiology, is a sub-discipline of medical geography that uses environmental conditions as explanatory variables in the study of disease or other health phenomena. This theory suggests that pathogenic organisms (whether germs or larger vector and host species) are subject to environmental conditions that can be observed on the landscape, and by identifying where such organisms are likely to exist, areas at greatest risk of the disease can be derived. Machine learning is a sub-discipline of artificial intelligence that can be used to create predictive models from large and complex datasets. West Nile virus (WNV) is a relatively new infectious disease in the United States, and has a fairly well-understood transmission cycle that is believed to be highly dependent on environmental conditions. This study takes a geospatial approach to the study of WNV risk, using both landscape epidemiology and machine learning techniques. A combination of remotely sensed and in situ variables are used to predict WNV incidence with a correlation coefficient as high as 0.86. A novel method of mitigating the small numbers problem is also tested and ultimately discarded. Finally a consistent spatial pattern of model errors is identified, indicating the chosen variables are capable of predicting WNV disease risk across most of the United States, but are inadequate in the northern Great Plains region of the US.

  19. Using Cluster Analysis to Compartmentalize a Large Managed Wetland Based on Physical, Biological, and Climatic Geospatial Attributes.

    Science.gov (United States)

    Hahus, Ian; Migliaccio, Kati; Douglas-Mankin, Kyle; Klarenberg, Geraldine; Muñoz-Carpena, Rafael

    2018-04-27

    Hierarchical and partitional cluster analyses were used to compartmentalize Water Conservation Area 1, a managed wetland within the Arthur R. Marshall Loxahatchee National Wildlife Refuge in southeast Florida, USA, based on physical, biological, and climatic geospatial attributes. Single, complete, average, and Ward's linkages were tested during the hierarchical cluster analyses, with average linkage providing the best results. In general, the partitional method, partitioning around medoids, found clusters that were more evenly sized and more spatially aggregated than those resulting from the hierarchical analyses. However, hierarchical analysis appeared to be better suited to identify outlier regions that were significantly different from other areas. The clusters identified by geospatial attributes were similar to clusters developed for the interior marsh in a separate study using water quality attributes, suggesting that similar factors have influenced variations in both the set of physical, biological, and climatic attributes selected in this study and water quality parameters. However, geospatial data allowed further subdivision of several interior marsh clusters identified from the water quality data, potentially indicating zones with important differences in function. Identification of these zones can be useful to managers and modelers by informing the distribution of monitoring equipment and personnel as well as delineating regions that may respond similarly to future changes in management or climate.
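    Partitioning around medoids (PAM), the partitional method used in the study above, repeatedly assigns each unit to its nearest medoid and then re-selects, within each cluster, the member minimizing total distance to the rest. A stripped-down sketch on invented attribute vectors (a real PAM run would use the wetland's standardized physical, biological, and climatic attributes, and a smarter BUILD phase for initialization):

```python
import numpy as np

def k_medoids(X, k, iters=50):
    """Minimal PAM-style partitioning around medoids.
    Starts from the first k points for determinism."""
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # pairwise distances
    medoids = np.arange(k)
    for _ in range(iters):
        labels = np.argmin(d[:, medoids], axis=1)          # assign to nearest medoid
        new = medoids.copy()
        for c in range(k):
            members = np.flatnonzero(labels == c)
            # New medoid: the member with minimal total distance to its cluster.
            new[c] = members[d[np.ix_(members, members)].sum(axis=0).argmin()]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return labels, medoids

# Two hypothetical groups of "zones" described by 2-D attribute vectors.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
              [10.0, 10.0], [10.0, 11.0], [11.0, 10.0]])
labels, medoids = k_medoids(X, k=2)
```

    Hierarchical clustering, by contrast, merges units bottom-up under a linkage rule (average linkage worked best in the study), which is why it surfaces outlier regions that PAM's evenly sized partitions can absorb.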

  20. Integrating fire behavior models and geospatial analysis for wildland fire risk assessment and fuel management planning

    Science.gov (United States)

    Alan A. Ager; Nicole M. Vaillant; Mark A. Finney

    2011-01-01

    Wildland fire risk assessment and fuel management planning on federal lands in the US are complex problems that require state-of-the-art fire behavior modeling and intensive geospatial analyses. Fuel management is a particularly complicated process where the benefits and potential impacts of fuel treatments must be demonstrated in the context of land management goals...

  1. Thinking Critically in Space: Toward a Mixed-Methods Geospatial Approach to Education Policy Analysis

    Science.gov (United States)

    Yoon, Ee-Seul; Lubienski, Christopher

    2018-01-01

    This paper suggests that synergies can be produced by using geospatial analyses as a bridge between traditional qualitative-quantitative distinctions in education research. While mapping tools have been effective for informing education policy studies, especially in terms of educational access and choice, they have also been underutilized and…

  2. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data source; interfaces for Geospatial Semantic Web, VGI (Volunteered Geographic Information) and Geospatial Semantic Web; challenges of Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use the Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.

  3. Geospatial Services Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: To process, store, and disseminate geospatial data to the Department of Defense and other Federal agencies. DESCRIPTION: The Geospatial Services Laboratory...

  4. Geospatial Water Quality Analysis of Dilla Town, Gadeo Zone, Ethiopia - A Case Study

    Science.gov (United States)

    Pakhale, G. K.; Wakeyo, T. B.

    2015-12-01

    Dilla is a socio-economically important town in Ethiopia, established on the international highway joining the capital cities of Ethiopia and Kenya. It serves as the administrative center of the Gedeo Zone in the SNNPR region of Ethiopia, accommodating around 65,000 inhabitants, and as an important trade centre for coffee. Due to recent development and urbanization in the town and surrounding area, waste and sewage discharge into the water resources has risen significantly, and frequent rainfall in the region worsens the water quality problem. In this view, the present study aims to analyze the water quality profile of Dilla town using 12 physico-chemical parameters. Fifteen sampling stations were identified among the open wells, bore wells and surface water sources that are extensively used for drinking and other domestic purposes. A spectrophotometer was used to analyze the samples, and Gaussian process regression was used to interpolate the results in a GIS environment to represent the spatial distribution of the parameters. Based on the observed and desirable values of the parameters, a water quality index (WQI), a weighted estimate of the various parameters ranging from 1 to 100, was developed in GIS. A higher WQI value indicates better water quality, while a lower value indicates poorer quality. This geospatial analysis was carried out before and after rainfall to understand temporal variation with reference to rainfall, which facilitates identifying potential zones of drinking water. The WQI indicated that 8 out of 15 locations fall under the acceptable category, indicating the suitability of the water for human use; the remaining locations are unfit. For example, the water sample at the main_campus_ustream_1 site has a very low WQI after rainfall, making it unfit for human usage. This suggests that certain measures should be undertaken in the town to enhance the water quality. These results are useful for town authorities to take corrective measures and ameliorate the water quality for human
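    The abstract does not give the exact WQI formula, so the sketch below uses one common construction as an assumption: each parameter gets a sub-index for how far it stays below its desirable limit, and the sub-indices are combined with weights inversely proportional to those limits, yielding a 0-100 score where higher is better (matching the paper's convention). All concentrations and limits here are hypothetical:

```python
# Hypothetical observed values and desirable limits (units vary by parameter).
observed = {"pH_dev": 0.4, "turbidity": 3.0, "nitrate": 20.0}
standard = {"pH_dev": 1.5, "turbidity": 5.0, "nitrate": 45.0}

# Sub-index: 100 when a parameter is absent, 0 at (or beyond) its limit.
quality = {k: 100.0 * max(0.0, 1.0 - observed[k] / standard[k]) for k in observed}

# Stricter limits get proportionally more weight.
weights = {k: 1.0 / standard[k] for k in standard}

wqi = sum(weights[k] * quality[k] for k in quality) / sum(weights.values())
```

    Interpolating such a score across the 15 stations in a GIS is what produces the before- and after-rainfall suitability maps described above.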

  5. Open Access to Multi-Domain Collaborative Analysis of Geospatial Data Through the Internet

    Science.gov (United States)

    Turner, A.

    2009-12-01

    The internet has provided us with a high-bandwidth, low-latency, globally connected network in which to rapidly share real-time data from sensors, reports, and imagery, and this data is ever easier to obtain, consume and analyze. The internet has also made complex systems more approachable through lightweight interfaces, with additional complex services able to provide more advanced connections into data services. Yet these analyses and discussions have primarily been siloed within single domains, or kept out of the reach of amateur scientists and interested citizens. Through more open access to analytical tools and data, however, experts can collaborate with citizens to gather information, provide interfaces for experimenting and querying results, and develop improved insights and feedback for further investigation. For example, farmers in Uganda are able to use their mobile phones to query, analyze, and be alerted to banana crop disease based on agricultural and climatological data. In the U.S., local groups use online social media sharing sites to gather data on storm-water runoff and stream siltation in order to alert wardens and environmental agencies. This talk will present various web-based geospatial visualization and analysis techniques and tools, such as Google Earth and GeoCommons, that have emerged to enable collaboration between experts of various domains as well as between experts, government, and citizen scientists. Through increased communication and the sharing of data and tools, it is possible to gain broad insight and develop joint, working solutions to a variety of difficult scientific and policy-related questions.

  6. Geospatial Analysis of Near-Term Technical Potential of BECCS in the U.S.

    Science.gov (United States)

    Baik, E.; Sanchez, D.; Turner, P. A.; Mach, K. J.; Field, C. B.; Benson, S. M.

    2017-12-01

    Atmospheric carbon dioxide (CO2) removal using bioenergy with carbon capture and storage (BECCS) is crucial for achieving stringent climate change mitigation targets. To date, previous work discussing the feasibility of BECCS has largely focused on land availability and bioenergy potential, while CCS components - including capacity, injectivity, and location of potential storage sites - have not been thoroughly considered in the context of BECCS. A high-resolution geospatial analysis of both biomass production and potential geologic storage sites is conducted to consider the near-term deployment potential of BECCS in the U.S. The analysis quantifies the overlap between the biomass resource and CO2 storage locations within the context of storage capacity and injectivity. This analysis leverages county-level biomass production data from the U.S. Department of Energy's Billion Ton Report alongside potential CO2 geologic storage sites as provided by the USGS Assessment of Geologic Carbon Dioxide Storage Resources. Various types of lignocellulosic biomass (agricultural residues, dedicated energy crops, and woody biomass) result in a potential 370-400 Mt CO2/yr of negative emissions in 2020. Of that CO2, only 30-31% of the produced biomass (110-120 Mt CO2/yr) is co-located with a potential storage site. While large potential exists, there would need to be more than 250 50-MW biomass power plants fitted with CCS to capture all the co-located CO2 capacity in 2020. Neither absolute injectivity nor absolute storage capacity is likely to limit BECCS, but the results show regional capacity and injectivity constraints in the U.S. that had not been identified in previous BECCS analysis studies. The state of Illinois, the Gulf region, and western North Dakota emerge as the best locations for near-term deployment of BECCS with abundant biomass, sufficient storage capacity and injectivity, and the co-location of the two resources.
Future studies assessing BECCS potential should
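
The core overlay step behind the co-location estimate — intersecting county-level biomass supply with overlying storage sites and summing the co-located share — reduces to simple set arithmetic. A minimal sketch with invented county names and numbers (the study's 30-31% figure is not reproduced here):

```python
# Sketch of the county-level co-location overlay. All values are illustrative
# placeholders, not data from the Billion Ton Report or the USGS assessment.
biomass_mt_co2 = {          # potential negative emissions per county, Mt CO2/yr
    "county_a": 1.2, "county_b": 0.8, "county_c": 2.5, "county_d": 0.5,
}
has_storage = {             # does an assessed storage site overlie the county?
    "county_a": True, "county_b": False, "county_c": False, "county_d": True,
}

total = sum(biomass_mt_co2.values())
co_located = sum(v for k, v in biomass_mt_co2.items() if has_storage[k])
fraction = co_located / total   # share of the resource usable without CO2 transport
```

In the real analysis the co-location test is a geospatial intersection of county polygons with storage-formation footprints rather than a boolean flag per county.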

  7. GEOSPATIAL ANALYSIS OF ATMOSPHERIC HAZE EFFECT BY SOURCE AND SINK LANDSCAPE

    Directory of Open Access Journals (Sweden)

    T. Yu

    2017-09-01

    Full Text Available Based on a geospatial analysis model, this paper analyzes the relationship between source and sink landscape patterns in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and the AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on the correlation between the landscape indices and AOD. Next, to make the subsequent analysis more efficient, redundant indices are screened out according to the correlation coefficients among them. Finally, given the spatial dependence and spatial heterogeneity of the data, a spatial autoregressive model and a geographically weighted regression model are used to analyze the haze effect of source and sink landscapes at the global and local levels. The results show that the source landscape of atmospheric haze pollution is building land, and the sink landscapes are shrubland and woodland. PLAND, PD, and COHESION are suitable for describing the atmospheric haze effect of source and sink landscapes. Comparing the models, the fits of the SLM, SEM, and GWR models are significantly better than that of the OLS model, and the SLM model outperforms the SEM model. Although the GWR model fits less well than the SLM model, it expresses more clearly how the influencing factors vary across locations. From these results, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could lower aerosol optical thickness, and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze.

  8. Geospatial Analysis of Atmospheric Haze Effect by Source and Sink Landscape

    Science.gov (United States)

    Yu, T.; Xu, K.; Yuan, Z.

    2017-09-01

    Based on a geospatial analysis model, this paper analyzes the relationship between source and sink landscape patterns in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and the AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on the correlation between the landscape indices and AOD. Next, to make the subsequent analysis more efficient, redundant indices are screened out according to the correlation coefficients among them. Finally, given the spatial dependence and spatial heterogeneity of the data, a spatial autoregressive model and a geographically weighted regression model are used to analyze the haze effect of source and sink landscapes at the global and local levels. The results show that the source landscape of atmospheric haze pollution is building land, and the sink landscapes are shrubland and woodland. PLAND, PD, and COHESION are suitable for describing the atmospheric haze effect of source and sink landscapes. Comparing the models, the fits of the SLM, SEM, and GWR models are significantly better than that of the OLS model, and the SLM model outperforms the SEM model. Although the GWR model fits less well than the SLM model, it expresses more clearly how the influencing factors vary across locations. From these results, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could lower aerosol optical thickness, and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze.
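
The spatial-dependence diagnosis that motivates SLM, SEM, and GWR over OLS typically starts with a global Moran's I statistic on the gridded values. A minimal pure-Python version on a rook-adjacency grid; the AOD-like grid values are illustrative:

```python
# Global Moran's I with binary rook-adjacency weights — a minimal stand-in for
# the spatial-autocorrelation check on gridded AOD values.
def morans_i(grid):
    rows, cols = len(grid), len(grid[0])
    n = rows * cols
    mean = sum(sum(r) for r in grid) / n
    num = den = w_sum = 0.0
    for i in range(rows):
        for j in range(cols):
            zi = grid[i][j] - mean
            den += zi * zi                       # sum of squared deviations
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # rook neighbours
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    num += zi * (grid[ni][nj] - mean)
                    w_sum += 1.0                 # each ordered pair has weight 1
    return (n / w_sum) * (num / den)

# A high-value block in one corner: positively autocorrelated (I > 0).
clustered = [[1, 1, 0], [1, 1, 0], [0, 0, 0]]
```

A value of I significantly above its expectation of -1/(n-1) signals the spatial clustering that ordinary least squares cannot absorb.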

  9. A GEOSPATIAL ANALYSIS OF THE RELATIONSHIP BETWEEN ENVIRONMENTAL DRIVERS AND VECTOR-BORNE DISEASES

    Directory of Open Access Journals (Sweden)

    MARIA IOANA VLAD-ȘANDRU

    2015-10-01

    Full Text Available A Geospatial Analysis of the Relationship between Environmental Drivers and Vector-Borne Diseases. Human health is profoundly affected by weather and climate. Environmental health is becoming a major preoccupation on a worldwide scale; there is a close correlation between a population's state of health and the quality of its environment, considering that many infectious diseases are at least partly dependent on environmental factors. When we talk about the environment, we realize that it includes and affects fields of action from our daily life. Earth observation from space, with validation from in situ observations, provides a greater understanding of the environment and enables us to monitor and predict key environmental phenomena and events that can affect our livelihoods and health. Even though the use of Earth observation is growing in usefulness for a wide variety of applications, it is extremely unlikely that Earth observation will be able to detect infectious diseases directly. Instead, Earth observation can be used to detect a high NDVI index (possibly attributing high surface chlorophyll concentration to a particular disease) and to help predict the movement of the agents carrying vector-borne diseases. Many diseases need certain temperature and moisture conditions to breed. The primary objective of analyzing environmental health risks and vulnerabilities is to support the Development Regions in strengthening their capacity to assess, visualize, and analyze health risks and to incorporate the results of this analysis in a health risk map for disaster risk reduction, emergency preparedness, and response plans. At the same time, such an analysis applied to health allows starting the collection and homogenization of baseline data, information, and maps to help health authorities and decision makers take informed decisions in times of crisis. An Informational Health Platform would be used for the integration of data coming from different sources in order to
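
The NDVI mentioned above is a simple ratio of near-infrared and red reflectances. A minimal sketch; the reflectance values below are illustrative:

```python
# Normalized Difference Vegetation Index from red and near-infrared
# reflectances — the environmental proxy the abstract links to vector habitats.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

# Dense vegetation reflects strongly in NIR; bare soil much less so.
vegetated = ndvi(0.50, 0.10)
bare_soil = ndvi(0.30, 0.25)
```

NDVI ranges from -1 to 1, with higher values indicating denser green vegetation, and hence more likely breeding habitat for moisture-dependent vectors.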

  10. Geospatial analysis of long-term morphological changes in Cochin estuary, SW coast of India

    Digital Repository Service at National Institute of Oceanography (India)

    DineshKumar, P.K.; Gopinath, G.; Manimurali, R.; Muraleedharan, K.R.

    are complex, where resource and management systems often confront multiple conflicts. Estuarine functioning is sensitive to changes in environmental factors and human interventions. The morphology of estuaries is generally characterized by the strong... for the future. Synchronous environmental data would be useful in understanding the carrying capacity and the problems and potential of fisheries, tourism, and navigation. CONCLUSION In the discussion above, we examined the long-period geospatial information...

  11. Multilayer geospatial analysis of water availability for shale resources development in Mexico

    Science.gov (United States)

    Galdeano, C.; Cook, M. A.; Webber, M. E.

    2017-08-01

    Mexico's government enacted an energy reform in 2013 that aims to foster competitiveness and private investment throughout the energy sector value chain. As part of this reform, extraction of oil and gas via hydraulic fracturing is expected to increase in five shale basins (Burgos, Sabinas, Tampico, Tuxpan, and Veracruz). Because hydraulic fracturing is a water-intensive activity, it is relevant to assess the potential water availability for this activity in Mexico. This research aims to quantify the water availability for hydraulic fracturing in Mexico and to identify its spatial distribution across the five shale basins. The methodology consisted of a multilayer geospatial analysis that overlays the water availability in watersheds and aquifers with the different types of shale resource areas (oil and associated gas, wet gas and condensate, and dry gas) in the five basins. The aquifers and watersheds in Mexico are classified into four zones depending on average annual water availability. Three scenarios were examined, based on different levels of impact on watersheds and aquifers from hydraulic fracturing. For the most conservative scenario, the annual water availability overlying the shale areas could supply hydraulic fracturing wells capable of extracting between 8.15 and 70.42 quadrillion British thermal units (Quads) of energy over the wells' typical 20-30 year lifetime, with an average across estimates of around 18.05 Quads. However, geographic variation in water availability could pose a challenge for extracting the shale reserves: most of the available water is located closer to the Gulf of Mexico, while the areas with the largest recoverable shale reserves coincide with lower water availability in northern Mexico.
New water management techniques (such as recycling and re-use), more efficient fracturing methods, shifts in usage patterns, or other water sources need
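
The scenario arithmetic — annual water supply divided by per-well water demand, multiplied by per-well recoverable energy over the well lifetime — can be sketched as below. All parameter values are illustrative placeholders, not the study's inputs:

```python
# Back-of-the-envelope form of the water-to-energy scenario calculation.
# Parameter values are invented for illustration only.
def quads_from_water(water_m3_per_yr, m3_per_well, quads_per_well, years=25):
    """Energy recoverable if the annual water budget is spent drilling wells
    every year over the assumed well-field lifetime."""
    wells_per_year = water_m3_per_yr / m3_per_well
    return wells_per_year * years * quads_per_well

energy = quads_from_water(
    water_m3_per_yr=1e7,    # hypothetical annual water availability, m^3/yr
    m3_per_well=2e4,        # hypothetical water demand per fractured well
    quads_per_well=1e-4,    # hypothetical recoverable energy per well, Quads
    years=25,               # mid-range of the 20-30 year lifetime
)
```

The study's 8.15-70.42 Quad range arises from varying exactly these kinds of per-well assumptions across its three scenarios.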

  12. Rural and remote dental services shortages: filling the gaps through geo-spatial analysis evidence-based targeting.

    Science.gov (United States)

    Shiika, Yulia; Kruger, Estie; Tennant, Marc

    Australia has a significant maldistribution of its limited dental workforce. Outside the major capital cities, the distribution of accessible dental care is at best patchy. This study applied geospatial analysis technology to locate gaps in dental service accessibility for rural and remote dwelling Australians, in order to test the hypothesis that there are a few key locations in Australia where additional dental services could make a significant contribution to ameliorating the immediate shortage crisis. A total of 2,086 dental practices were located in country areas, covering a combined catchment area of 1.84 million square kilometers based on 50 km catchment zones around each clinic. Geospatial analysis was used to identify gaps in the accessibility of dental services, drawing on data extracted from an integrated, geographically aligned database. Results: Resolving the lack of dental practices in 74 townships (each with more than 500 residents) across Australia could potentially address access for 104,000 people. An examination of the socio-economic mix found that the majority of dental practices (84%) are located in areas classified as less disadvantaged. The study produced a cohesive national map identifying locations where targeting dental services could yield health improvements, and it identified potential sites for dental clinics to address the current inequity in access to dental services in rural and remote Australia.
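
The 50 km catchment test behind the gap analysis amounts to a great-circle distance check from each township to its nearest practice. A minimal sketch; the clinic and township coordinates below are invented:

```python
# Catchment-gap sketch: a township is unserved if no dental practice lies
# within the 50 km catchment radius used in the study.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres (mean Earth radius 6371 km)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def unserved(townships, practices, radius_km=50):
    """Names of townships with no practice inside the catchment radius."""
    return [name for name, lat, lon in townships
            if all(haversine_km(lat, lon, plat, plon) > radius_km
                   for plat, plon in practices)]

practices = [(-33.87, 151.21)]                          # one hypothetical clinic
townships = [("near", -33.90, 151.20), ("far", -31.00, 147.00)]
```

The study performs the same test against 2,086 practice locations, so a spatial index rather than a brute-force scan would be used in practice.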

  13. Learning about Urban Ecology through the Use of Visualization and Geospatial Technologies

    Science.gov (United States)

    Barnett, Michael; Houle, Meredith; Mark, Sheron; Strauss, Eric; Hoffman, Emily

    2010-01-01

    During the past three years we have been designing and implementing a technology enhanced urban ecology program using geographic information systems (GIS) coupled with technology. Our initial work focused on professional development for in-service teachers and implementation in K-12 classrooms. However, upon reflection and analysis of the…

  14. Ecosystem Services Provided by Agricultural Land as Modeled by Broad Scale Geospatial Analysis

    Science.gov (United States)

    Kokkinidis, Ioannis

    Agricultural ecosystems provide multiple services, including food and fiber provision, nutrient cycling, soil retention, and water regulation. The objectives of the study were to identify and quantify a selection of ecosystem services provided by agricultural land using existing geospatial tools and, preferably, free and open source data, such as the Virginia Land Use Evaluation System (VALUES), the North Carolina Realistic Yield Expectations (RYE) database, and the land cover datasets NLCD and CDL. Furthermore, I sought to model tradeoffs between provisioning and other services. First, I assessed the accuracy of agricultural land in NLCD and CDL over a four-county area in eastern Virginia using cadastral parcels, uncovering issues concerning the definition of agricultural land. The area and location of agriculture changed little over the 19 years studied, but all datasets have significant errors of omission (11.3 to 95.1%) and commission (0 to 71.3%). The location of agriculture was combined with spatial crop yield databases I created and models I adapted to calculate baseline values for plant biomass, nutrient composition and requirements, land suitability for and potential production of biofuels, and the economic impact of agriculture for the four counties. The study area was then broadened to cover 97 counties in eastern Virginia and North Carolina, investigating the potential for increased regional grain production through intensification and extensification of agriculture. Predicted yield from geospatial crop models was compared with produced yield from the NASS Survey of Agriculture. The area of most crops in CDL was similar to that in the Survey of Agriculture, but a yield gap is present in most years, partially due to weather, indicating potential for yield increases through intensification. Using simple criteria, I quantified the potential to extend agriculture onto high-yield land in other uses and modeled the changes in erosion and runoff should
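
The yield-gap comparison — model-predicted attainable yield against surveyed actual yield, per county — can be sketched as a dictionary difference. County names and yields below are invented:

```python
# Yield-gap sketch: attainable minus actual yield per county. A large positive
# gap flags room for intensification. All numbers are illustrative.
predicted = {"county_a": 9.0, "county_b": 7.5}   # t/ha, geospatial crop model
observed  = {"county_a": 7.2, "county_b": 7.4}   # t/ha, Survey of Agriculture

yield_gap = {c: predicted[c] - observed[c] for c in predicted}
# Counties whose gap exceeds a (hypothetical) 1 t/ha threshold:
intensification_candidates = [c for c, g in yield_gap.items() if g > 1.0]
```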

  15. Mapping and Analysis of Forest and Land Fire Potential Using Geospatial Technology and Mathematical Modeling

    International Nuclear Information System (INIS)

    Suliman, M D H; Mahmud, M; Reba, M N M; S, L W

    2014-01-01

    Forest and land fires can have negative implications for forest ecosystems, biodiversity, air quality, and soil structure. However, these implications can be minimized through an effective disaster management system, which can be developed through an appropriate early warning system and an efficient delivery system. This study focused on two aspects: mapping forest and land fire potential, and delivering that information to users through a WebGIS application. Geospatial technology and mathematical modeling were used to identify, classify, and map potential burn areas. The mathematical model used is the Analytic Hierarchy Process (AHP), while the geospatial technologies involved include remote sensing, geographic information systems (GIS), and digital field data collection. The entire state of Selangor was chosen as the study area based on the number of cases reported over the last two decades. The AHP model was designed to compare three main criteria: fuel, topography, and human factors. Contributions from experts directly involved in forest and land firefighting operations, comprising officials from the Fire and Rescue Department of Malaysia, were also evaluated in the model. The study found that about 32.83 square kilometers of Selangor has extreme fire potential; the extreme-potential areas identified are in Bestari Jaya and Kuala Langat High Ulu. Information on terrestrial forest fire potential is continuously displayed in a WebGIS application on the Internet. Displaying information through WebGIS applications is a better approach for supporting decision-making at a high level of confidence and under conditions approximating reality. Agencies involved in disaster management include the Jawatankuasa Pengurusan Dan Bantuan Bencana (JPBB) at the district, state, and national levels under the National Security Division, and the Fire and Rescue

  16. Mapping and Analysis of Forest and Land Fire Potential Using Geospatial Technology and Mathematical Modeling

    Science.gov (United States)

    Suliman, M. D. H.; Mahmud, M.; Reba, M. N. M.; S, L. W.

    2014-02-01

    Forest and land fires can have negative implications for forest ecosystems, biodiversity, air quality, and soil structure. However, these implications can be minimized through an effective disaster management system, which can be developed through an appropriate early warning system and an efficient delivery system. This study focused on two aspects: mapping forest and land fire potential, and delivering that information to users through a WebGIS application. Geospatial technology and mathematical modeling were used to identify, classify, and map potential burn areas. The mathematical model used is the Analytic Hierarchy Process (AHP), while the geospatial technologies involved include remote sensing, geographic information systems (GIS), and digital field data collection. The entire state of Selangor was chosen as the study area based on the number of cases reported over the last two decades. The AHP model was designed to compare three main criteria: fuel, topography, and human factors. Contributions from experts directly involved in forest and land firefighting operations, comprising officials from the Fire and Rescue Department of Malaysia, were also evaluated in the model. The study found that about 32.83 square kilometers of Selangor has extreme fire potential; the extreme-potential areas identified are in Bestari Jaya and Kuala Langat High Ulu. Information on terrestrial forest fire potential is continuously displayed in a WebGIS application on the Internet. Displaying information through WebGIS applications is a better approach for supporting decision-making at a high level of confidence and under conditions approximating reality. Agencies involved in disaster management include the Jawatankuasa Pengurusan Dan Bantuan Bencana (JPBB) at the district, state, and national levels under the National Security Division, and the Fire and Rescue
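
The AHP weighting step — turning expert pairwise judgements on fuel, topography, and human factors into criterion weights — is a principal-eigenvector computation, sketched here with power iteration. The pairwise judgements below are illustrative placeholders, not the values elicited from the Malaysian experts:

```python
# AHP criterion weights via power iteration on a pairwise-comparison matrix.
# The principal eigenvector (normalized to sum to 1) gives the weights.
def ahp_weights(m, iters=100):
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]  # m @ w
        s = sum(w)
        w = [x / s for x in w]          # renormalize each iteration
    return w

# Hypothetical reciprocal judgements: fuel vs topography vs human factors
# (e.g. fuel is judged 3x as important as topography, 5x as human factors).
pairwise = [[1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0]]
weights = ahp_weights(pairwise)
```

In a full AHP workflow the same matrix also yields a consistency ratio, which should be checked before the weights are used to combine the criterion layers in GIS.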

  17. Geospatial Analysis of Unmet Surgical Need in Uganda: An Analysis of SOSAS Survey Data.

    Science.gov (United States)

    Farber, S Harrison; Vissoci, Joao Ricardo Nickenig; Tran, Tu M; Fuller, Anthony T; Butler, Elissa K; Andrade, Luciano; Staton, Catherine; Makumbi, Fredrick; Luboga, Samuel; Muhumuza, Christine; Namanya, Didacus B; Chipman, Jeffrey G; Galukande, Moses; Haglund, Michael M

    2017-02-01

    Globally, a staggering five billion people lack access to adequate surgical care. Sub-Saharan Africa represents one of the regions of greatest need. We sought to understand how geographic factors relate to unmet surgical need (USN) in Uganda. We performed a geographic information system analysis of a nationwide survey on surgical conditions performed in 105 enumeration areas (EAs) representing the national population. At the district level, we determined the spatial autocorrelation of the following study variables: prevalence of USN, hub distance (distance from EA to the nearest surgical center), area of coverage (geographic catchment area of each center), tertiary facility transport time (average respondent-reported travel time), and care availability (rate of hospital beds by population and by district). We then used local indicators of spatial association (LISA) and spatial regression to identify any significant clustering of these study variables among the districts. The survey enumerated 4,248 individuals. The prevalence of USN varied from 2.0% to 45% and was highest in the Northern and Western Regions. Moran's I bivariate analysis indicated a positive correlation between USN and hub distance (p = 0.03), area of coverage (p = 0.02), and facility transport time (p = 0.03). These associations were consistent nationally. The LISA analysis showed a high degree of clustering among sets of districts in the Northern Sub-Region. This study demonstrates a statistically significant association between USN and the geographic variables examined. We have identified the Northern Sub-Region as the highest-priority area for financial investment to reduce this unmet surgical disease burden.
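
The LISA statistic behind the clustering result is the local Moran's I, computed per district from its own deviation and its neighbours' deviations. A minimal sketch with row-standardized weights; the district values and adjacency below are invented, not the SOSAS data:

```python
# Local Moran's I (LISA) sketch: positive values flag districts sitting in
# high-high or low-low clusters. Values and adjacency are illustrative.
def local_morans_i(values, neighbors):
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    m2 = sum(x * x for x in z) / n               # sample variance (biased)
    # row-standardized weights: average the neighbours' deviations
    return [z[i] * sum(z[j] for j in neighbors[i]) / (len(neighbors[i]) * m2)
            for i in range(n)]

usn = [40.0, 38.0, 5.0, 6.0]                     # USN prevalence (%) per district
adj = {0: [1], 1: [0], 2: [3], 3: [2]}           # two pairs of adjacent districts
lisa = local_morans_i(usn, adj)
```

Here both the high-high pair and the low-low pair produce positive local statistics; significance would be assessed by permutation in a full analysis.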

  18. 3D Geospatial Models for Visualization and Analysis of Groundwater Contamination at a Nuclear Materials Processing Facility

    Science.gov (United States)

    Stirewalt, G. L.; Shepherd, J. C.

    2003-12-01

    Analyses of hydrostratigraphy and of uranium and nitrate contamination in groundwater at a former nuclear materials processing facility in Oklahoma were undertaken employing three-dimensional (3D) geospatial modeling software. The models constructed played an important role in the regulatory decision process of the U.S. Nuclear Regulatory Commission (NRC) because they enabled visualization of temporal variations in contaminant concentrations and plume geometry. Three aquifer systems occur at the site, consisting of water-bearing fractured shales separated by indurated sandstone aquitards. The uppermost terrace groundwater system (TGWS) aquifer is composed of terrace and alluvial deposits and a basal shale. The shallow groundwater system (SGWS) aquifer is made up of three shale units and two sandstones; it is separated from the overlying TGWS and the underlying deep groundwater system (DGWS) aquifer by sandstone aquitards. Spills of nitric acid solutions containing uranium and radioactive decay products around the main processing building (MPB), leakage from storage ponds west of the MPB, and leaching of radioactive materials from discarded equipment and waste containers contaminated both the TGWS and SGWS aquifers during facility operation between 1970 and 1993. Constructing 3D geospatial property models for analysis of groundwater contamination at the site involved the use of EarthVision (EV), a 3D geospatial modeling software developed by Dynamic Graphics, Inc. of Alameda, CA. A viable 3D geohydrologic framework model was constructed first so that property data could be spatially located relative to subsurface geohydrologic units. The framework model contained three hydrostratigraphic zones, equivalent to the TGWS, SGWS, and DGWS aquifers in which groundwater samples were collected, separated by two sandstone aquitards. Groundwater data collected in the three aquifer systems since 1991 indicated high concentrations of uranium (>10,000 micrograms/liter) and nitrate (> 500 milligrams

  19. How bicycle level of traffic stress correlate with reported cyclist accidents injury severities: A geospatial and mixed logit analysis.

    Science.gov (United States)

    Chen, Chen; Anderson, Jason C; Wang, Haizhong; Wang, Yinhai; Vogt, Rachel; Hernandez, Salvador

    2017-11-01

    Transportation agencies need efficient methods to determine how to reduce bicycle accidents while promoting cycling activities and prioritizing safety improvement investments. Many studies have used standalone methods, such as level of traffic stress (LTS) and bicycle level of service (BLOS), to better understand bicycle mode share and network connectivity for a region. However, in most cases, other studies rely on crash severity models to explain what variables contribute to the severity of bicycle related crashes. This research uniquely correlates bicycle LTS with reported bicycle crash locations for four cities in New Hampshire through geospatial mapping. LTS measurements and crash locations are compared visually using a GIS framework. Next, a bicycle injury severity model, that incorporates LTS measurements, is created through a mixed logit modeling framework. Results of the visual analysis show some geospatial correlation between higher LTS roads and "Injury" type bicycle crashes. It was determined, statistically, that LTS has an effect on the severity level of bicycle crashes and high LTS can have varying effects on severity outcome. However, it is recommended that further analyses be conducted to better understand the statistical significance and effect of LTS on injury severity. As such, this research will validate the use of LTS as a proxy for safety risk regardless of the recorded bicycle crash history. This research will help identify the clustering patterns of bicycle crashes on high-risk corridors and, therefore, assist with bicycle route planning and policy making. This paper also suggests low-cost countermeasures or treatments that can be implemented to address high-risk areas. Specifically, with the goal of providing safer routes for cyclists, such countermeasures or treatments have the potential to substantially reduce the number of fatalities and severe injuries. Published by Elsevier Ltd.
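
The severity model's core mechanics can be illustrated with a plain multinomial logit, a simplified stand-in for the paper's mixed logit (which additionally lets coefficients vary randomly across observations). The severity levels, utilities, and coefficients below are invented for illustration:

```python
# Multinomial-logit sketch: probability of each crash-severity outcome as a
# softmax over linear utilities that include an LTS term. Coefficients invented.
from math import exp

def severity_probs(lts, coefs):
    """coefs: one (intercept, lts_coefficient) pair per severity level."""
    utils = [b0 + b1 * lts for b0, b1 in coefs]
    denom = sum(exp(u) for u in utils)
    return [exp(u) / denom for u in utils]

# Hypothetical (intercept, LTS coefficient) for: no injury, injury, severe/fatal
coefs = [(1.0, -0.2), (0.0, 0.3), (-2.0, 0.5)]
probs_low_stress = severity_probs(1, coefs)    # LTS 1: calm, separated route
probs_high_stress = severity_probs(4, coefs)   # LTS 4: high-stress road
```

With these illustrative coefficients the predicted share of injury and severe outcomes rises with LTS, mirroring the qualitative finding that higher-stress roads shift the severity distribution.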

  20. GEOSPATIAL ANALYSIS OF URBAN LAND USE PATTERN ANALYSIS FOR HEMORRHAGIC FEVER RISK – A REVIEW

    Directory of Open Access Journals (Sweden)

    L. N. Izzah

    2016-09-01

    Full Text Available Human modification of the natural environment continues to create habitats in which vectors of a wide variety of human and animal pathogens (such as Plasmodium, Aedes aegypti, and arenaviruses) thrive if unabated, with enormous potential to negatively affect public health. Typical examples of these modifications include impoundments, dams, irrigation systems, and landfills, which provide an enabling environment for the transmission of hemorrhagic fevers such as malaria, dengue, avian flu, and Lassa fever. Furthermore, contemporary urban dwelling patterns appear to be associated with the prevalence of hemorrhagic diseases in recent years. These observations are not peculiar to the developing world, as urban expansion also contributes significantly to mosquito and other vector habitats, which offer breeding grounds to some vector and virus populations. The key to disease control is developing an understanding of the contribution of human landscape modification to vector-borne pathogen transmission, and of how a balance may be achieved between human development, public health, and responsible urban land use. A comprehensive review of urban land use pattern analysis for hemorrhagic fever risk has been conducted in this paper. The study found that most of the available literature dwells on the impact of urban land use on malaria and dengue fevers; studies discussing the implications of urban land use for the risk of Ebola, Lassa, and other non-mosquito-borne VHFs are yet to be found. A relational model for investigating the influence of urban land use change patterns on the risk of hemorrhagic fever is proposed in this study.

  1. Bayesian spatio-temporal analysis and geospatial risk factors of human monocytic ehrlichiosis.

    Directory of Open Access Journals (Sweden)

    Ram K Raghavan

    Full Text Available Variations in the spatio-temporal patterns of Human Monocytic Ehrlichiosis (HME) infection in the state of Kansas, USA were examined, and the relationships between HME relative risk and various environmental, climatic, and socio-economic variables were evaluated. The HME data used in the study were reported to the Kansas Department of Health and Environment between 2005 and 2012, and geospatial variables representing the physical environment [National Land Cover/Land Use, NASA Moderate Resolution Imaging Spectroradiometer (MODIS)], climate [NASA MODIS, Prediction of Worldwide Renewable Energy (POWER)], and socio-economic conditions (US Census Bureau) were derived from publicly available sources. Following univariate screening of candidate variables using logistic regressions, two Bayesian hierarchical models were fit: a partial spatio-temporal model with random effects and a spatio-temporal interaction term, and a second model that included additional covariate terms. The best-fitting model revealed that spatio-temporal autocorrelation in Kansas increased steadily from 2005 to 2012, and identified poverty status, relative humidity, and an interaction factor, 'diurnal temperature range x mixed forest area', as significant county-level risk factors for HME. The identification of significant spatio-temporal patterns and new risk factors is important in the context of HME prevention, for future research on the ecology and evolution of HME, and for understanding climate change impacts on tick-borne diseases.

  2. Geospatial Education and Research Development: A Laboratory for Remote Sensing and Environmental Analysis (LaRSEA)

    Science.gov (United States)

    Allen, Thomas R., Jr.

    1999-01-01

    Old Dominion University has claimed the title "University of the 21st Century," with a bold emphasis on technology innovation and application. In keeping with this claim, the proposed work has implemented a new laboratory equipped for remote sensing, along with the curriculum and research innovations it affords for present and future faculty and students. The developments summarized in this report would not have been possible without the support of the NASA grant and significant cost-sharing by several units within the University. The grant effectively springboarded the university into major improvements in its approach to remote sensing and geospatial information technologies. The university has now committed to licensing Erdas Imagine software for the laboratory, a campus-wide ESRI geographic information system (GIS) products license, and several smaller software and hardware utilities available to faculty and students through the laboratory. Campus beneficiaries of this grant have included faculty from departments including Ocean, Earth, and Atmospheric Sciences; Political Science and Geography; Ecological Sciences; Environmental Health; and Civil and Environmental Engineering. High student interest is evident in geology, geography, ecology, urban studies, and planning. Three new courses have been added to the catalog and offered this year. Cross-cutting curriculum changes are in place, with growing enrollments in remote sensing, GIS, and a new co-taught seminar in applied coastal remote sensing. The enabling grant has also allowed project participants to attract external research funding, providing additional funds beyond the planned matching for maintenance and growth of software and hardware and for stipends for student assistants. Two undergraduate assistants and two graduate assistants have been employed on full-time assistantships as a result.
A new certificate is offered to students completing an interdisciplinary course sequence

  3. Dynamic Science Data Services for Display, Analysis and Interaction in Widely-Accessible, Web-Based Geospatial Platforms, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — TerraMetrics, Inc., proposes a Phase II R/R&D program to implement the TerraBlocksTM Server architecture that provides geospatial data authoring, storage and...

  4. The Value of Information and Geospatial Technologies for the analysis of tidal current patterns in the Guanabara Bay (Rio de Janeiro)

    Science.gov (United States)

    Isotta Cristofori, Elena; Demarchi, Alessandro; Facello, Anna; Cámaro, Walther; Hermosilla, Fernando; López, Jaime

    2016-04-01

    The study and validation of tidal current patterns rely on the combination of several data sources, such as numerical weather prediction models, hydrodynamic models, weather stations, current drifters, and remote sensing observations. Assessing the accuracy and reliability of the produced patterns and communicating the results, including an easy-to-understand visualization of the data, is crucial for a variety of stakeholders, including decision-makers. The wide diffusion of geospatial equipment such as GPS, current drifters, and aerial photogrammetry allows data to be collected in the field using mobile and portable devices with relatively limited time and economic resources. These real-time measurements are essential for validating the models, and specifically for assessing the skill of a model during critical environmental conditions. Moreover, the considerable development of remote sensing technologies, cartographic services, and GPS applications has enabled the creation of Geographic Information Systems (GIS) capable of storing, analyzing, managing, and integrating spatial or geographical information with hydro-meteorological data. This contribution of information and geospatial technologies can benefit many decision-makers, including high-level sport athletes. While the numerical approach commonly used to validate models with in-situ data is familiar to scientific users, high-level sport users are not familiar with numerical representations of data. Integrating the data collected in the field into a GIS therefore allows immediate visualization of the analyses on geographic maps. This visualization is a particularly effective way to communicate current-pattern assessment results and the uncertainty in the information, increasing confidence in the forecast. The aim of this paper is to present the methodology set up, in collaboration with the Austrian Sailing Federation, for the study of

  5. Using a Web GIS Plate Tectonics Simulation to Promote Geospatial Thinking

    Science.gov (United States)

    Bodzin, Alec M.; Anastasio, David; Sharif, Rajhida; Rutzmoser, Scott

    2016-01-01

    Learning with Web-based geographic information system (Web GIS) can promote geospatial thinking and analysis of georeferenced data. Web GIS can enable learners to analyze rich data sets to understand spatial relationships that are managed in georeferenced data visualizations. We developed a Web GIS plate tectonics simulation as a capstone learning…

  6. Learning: An Evolutionary Analysis

    Science.gov (United States)

    Swann, Joanna

    2009-01-01

    This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…

  7. National Geospatial Program

    Science.gov (United States)

    Carswell, William J.

    2011-01-01

    The National Geospatial Program (NGP; http://www.usgs.gov/ngpo/) satisfies the needs of customers by providing geospatial products and services that they incorporate into their decision-making and operational activities. These products and services provide geospatial data that are organized and maintained in cost-effective ways and developed by working with partners and organizations whose activities align with those of the program. To accomplish its mission, the NGP organizes, maintains, publishes, and disseminates the geospatial baseline of the Nation's topography, natural landscape, and manmade environment through The National Map

  8. Geospatial tool-based morphometric analysis using SRTM data in Sarabanga Watershed, Cauvery River, Salem district, Tamil Nadu, India

    Science.gov (United States)

    Arulbalaji, P.; Gurugnanam, B.

    2017-11-01

    A morphometric analysis of the Sarabanga watershed in Salem district was undertaken for the present study. Geospatial tools, such as remote sensing and GIS, were utilized for the extraction of the river basin and its drainage networks. Shuttle Radar Topographic Mission (SRTM, 30 m resolution) data were used for the morphometric analysis and for evaluating the various morphometric parameters. The morphometric parameters of the Sarabanga watershed were analyzed and evaluated using the pioneering methods of Horton and Strahler. The watershed exhibits a dendritic drainage pattern, indicating that lithology and a gentle slope category control the study area. The Sarabanga watershed covers an area of 1208 km². The slope of the watershed varies from 10 to 40% and is controlled by the lithology of the watershed. The bifurcation ratio ranges from 3 to 4.66, indicating the influence of geological structure and considerable structural disturbance. The form factor indicates the elongated shape of the study area. The total stream length and watershed area indicate that mean annual rainfall runoff is relatively moderate. The basin relief shows that the watershed has relatively high denudation rates. The drainage density of the watershed is low, indicating that infiltration is dominant. The ruggedness number suggests that peak discharges are likely to be relatively high. The present study is very useful for planning watershed management.
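    The Horton-Strahler parameters reported above follow standard textbook definitions. As an illustrative sketch (the stream counts and basin length below are hypothetical, not the paper's data; only the 1208 km² area is taken from the abstract), the key ratios can be computed as:

    ```python
    def bifurcation_ratios(stream_counts):
        """Horton's bifurcation ratio Rb = N_u / N_(u+1) for successive Strahler orders."""
        return [stream_counts[u] / stream_counts[u + 1]
                for u in range(len(stream_counts) - 1)]

    def form_factor(area_km2, basin_length_km):
        """Horton's form factor Ff = A / Lb^2; values well below ~0.78 imply an elongated basin."""
        return area_km2 / basin_length_km ** 2

    def drainage_density(total_stream_length_km, area_km2):
        """Dd = total stream length / basin area (km per km^2); low Dd suggests high infiltration."""
        return total_stream_length_km / area_km2

    # Hypothetical stream counts by Strahler order (1st..4th) for illustration
    counts = [140, 30, 8, 2]
    print([round(r, 2) for r in bifurcation_ratios(counts)])  # [4.67, 3.75, 4.0]
    print(round(form_factor(1208, 80), 3))                    # 0.189 -> elongated basin
    ```

    With these hypothetical counts the bifurcation ratios fall in the 3-4.66 range reported in the abstract, and a form factor far below 0.78 corresponds to the elongated shape the authors describe.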

  9. GEODATA: Information System Based on Geospatial for Early Warning Tracking and Analysis Agricultural Plant Diseases in Central Java

    Science.gov (United States)

    Prasetyo, S. Y. J.; Agus, Y. H.; Dewi, C.; Simanjuntak, B. H.; Hartomo, K. D.

    2017-03-01

    The Government of Indonesia currently faces food-supply problems, especially for rice, which is needed in quantities so large that it must be imported from neighboring countries. Indonesia has the capacity to produce enough rice to meet national needs, but it still faces rice pest attacks over an annually increasing area. One contributing factor is that Indonesia lies on the migration path of a major world rice pest, the brown planthopper (BPH, Nilaparvata lugens Stål), which leads to endemic status every year. One proposed strategy is an early warning system based on the main pest population in a specific region. The proposed information system is called GEODATA (Geospatial Outbreak of Disease Tracking and Analysis). The system uses the ESSA (Exponential Smoothing - Spatial Autocorrelation) library developed in previous studies at Satya Wacana Christian University. GEODATA was built to meet the surveillance qualifications required by BMKG (the Indonesian Agency of Meteorology, Climatology and Geophysics, Central Java Province), BPTPH (the Indonesian Agency of Plant Protection and Horticulture, Central Java Province), BKP-KP District Boyolali (the Indonesian Agency of Food Security and Agriculture Field Supervisor, District Boyolali, Central Java Province), and farmer groups. GIS GEODATA meets the needs of these surveillance bodies, which include: (1) mapping of the disease, (2) analysis of the dynamics of the disease, and (3) prediction of attacks/disease outbreaks in a particular region. GIS GEODATA is currently being implemented in the plant pest field observation laboratory in Central Java province, Indonesia.
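    The ESSA library itself is only named in the abstract. As a generic sketch of its temporal half, simple exponential smoothing produces the kind of one-step-ahead pest-count forecast such a system could feed into a spatial analysis (the weekly counts and the alpha value are invented for illustration):

    ```python
    def exp_smooth_forecast(series, alpha=0.5):
        """Simple exponential smoothing: the level after the last observation
        serves as the one-step-ahead forecast. Requires 0 < alpha <= 1."""
        level = float(series[0])
        for y in series[1:]:
            level = alpha * y + (1 - alpha) * level
        return level

    weekly_counts = [12, 15, 30, 42, 55]  # hypothetical weekly pest trap counts
    print(round(exp_smooth_forecast(weekly_counts, alpha=0.6), 1))  # 46.8
    ```

    A larger alpha weights recent observations more heavily, which suits rapidly escalating outbreaks; the spatial-autocorrelation half of ESSA would then relate such forecasts across neighboring regions.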

  10. Geospatial Engineering

    Science.gov (United States)

    2017-02-22

    ... following areas: hydrology, surface configuration, surface materials, vegetation, obstacles, and man-made features. ... Pipelines that carry petroleum and natural gas are an important mode of transportation, while rail, water, and road transportation are used ... Figure B-15. Example of a product showing aerial concealment. Figure B-16. Example of a surface material ...

  11. A Python Geospatial Language Toolkit

    Science.gov (United States)

    Fillmore, D.; Pletzer, A.; Galloy, M.

    2012-12-01

    The volume and scope of geospatial data archives, such as collections of satellite remote sensing or climate model products, have been rapidly increasing and will continue to do so in the near future. The recently launched (October 2011) Suomi National Polar-orbiting Partnership (NPP) satellite, for instance, is the first of a new generation of Earth observation platforms that will monitor the atmosphere, oceans, and ecosystems, and its suite of instruments will generate several terabytes each day in the form of multi-spectral images and derived datasets. Full exploitation of such data for scientific analysis and decision support applications has become a major computational challenge. Geophysical data exploration and knowledge discovery could benefit, in particular, from intelligent mechanisms for extracting and manipulating subsets of data relevant to the problem of interest. Potential developments include enhanced support for natural language queries and directives to geospatial datasets. The translation of natural language (that is, human spoken or written phrases) into complex but unambiguous objects and actions can be based on a context, or knowledge domain, that represents the underlying geospatial concepts. This poster describes a prototype Python module that maps English phrases onto basic geospatial objects and operations. This module, along with the associated computational geometry methods, enables the resolution of natural language directives that include geographic regions of arbitrary shape and complexity.
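    The abstract does not show the module's actual API, so the following is only a toy illustration of the idea: every name and the single phrase pattern here are invented, and the distance geometry is a crude equirectangular approximation rather than proper geodesy. It maps one English phrase onto a point-in-region predicate:

    ```python
    import math
    import re

    def parse_query(text):
        """Toy parser: 'within <d> km of <lat>,<lon>' -> a point-in-region predicate."""
        m = re.match(r"within (\d+(?:\.\d+)?) km of (-?\d+(?:\.\d+)?),\s*(-?\d+(?:\.\d+)?)", text)
        if not m:
            raise ValueError("unsupported phrase")
        radius_km, lat0, lon0 = map(float, m.groups())

        def contains(lat, lon):
            # Equirectangular approximation, adequate for small distances
            dy = (lat - lat0) * 111.32
            dx = (lon - lon0) * 111.32 * math.cos(math.radians(lat0))
            return math.hypot(dx, dy) <= radius_km

        return contains

    near_boulder = parse_query("within 10 km of 40.0,-105.27")
    print(near_boulder(40.02, -105.25))  # True
    print(near_boulder(41.0, -105.27))   # False (about 111 km north)
    ```

    A real system of the kind described would back such predicates with computational geometry over regions of arbitrary shape, not just circular buffers.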

  12. Examining the Effect of Enactment of a Geospatial Curriculum on Students' Geospatial Thinking and Reasoning

    Science.gov (United States)

    Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara

    2014-08-01

    A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gain in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of curriculum enactment, whereas the enactment of critical components of the curriculum and the number of years the teachers had taught the curriculum did not have significant effects on students' geospatial posttest achievement. The findings from this study provide support that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.

  13. Geospatial Analysis of Climate-Related Changes in North American Arctic Ecosystems and Implications for Terrestrial Flora and Fauna

    Science.gov (United States)

    Amirazodi, S.; Griffin, R.

    2016-12-01

    Climate change induces range shifts among many terrestrial species in Arctic regions. At best, warming often forces poleward migration if a stable environment is to be maintained. At worst, marginal ecosystems may disappear entirely without a contiguous shift allowing migratory escape to similar environs. These changing migration patterns and poleward range expansion push species into higher latitudes where ecosystems are less stable and more sensitive to change. This project focuses on ecosystem geography and interspecies relationships and interactions by analyzing seasonality and changes over time in variables including the following: temperature, precipitation, vegetation, physical boundaries, population demographics, permafrost, sea ice, and food and water availability. Publicly available data from remote sensing platforms are used throughout, and processed with both commercially available and open sourced GIS tools. This analysis describes observed range changes for selected North American species, and attempts to provide insight into the causes and effects of these phenomena. As the responses to climate change are complex and varied, the goal is to produce the aforementioned results in an easily understood set of geospatial representations to better support decision making regarding conservation prioritization and enable adaptive responses and mitigation strategies.

  14. Automating the Analysis of Spatial Grids A Practical Guide to Data Mining Geospatial Images for Human & Environmental Applications

    CERN Document Server

    Lakshmanan, Valliappa

    2012-01-01

    The ability to create automated algorithms to process gridded spatial data is increasingly important as remotely sensed datasets grow in volume and frequency. Whether in business, social science, ecology, meteorology, or urban planning, such automated applications make it possible to analyze and detect patterns in geospatial data at scale. This book provides students with a foundation in topics of digital image processing and data mining as applied to geospatial datasets. The aim is for readers to be able to devise and implement automated techniques to extract information from spatial grids such as radar, satellite, or high-resolution survey imagery.

  15. The VI-Suite: a set of environmental analysis tools with geospatial data applications

    NARCIS (Netherlands)

    Southall, Ryan; Biljecki, F.

    2017-01-01

    Background: The VI-Suite is a free and open-source addon for the 3D content creation application Blender, developed primarily as a tool for the contextual and performative analysis of buildings. Its functionality has grown from simple, static lighting analysis to fully parametric lighting,

  16. A Transient Landscape: Geospatial Analysis and Numerical Modeling of Coastal Geomorphology in the Outer Banks, North Carolina

    Science.gov (United States)

    Hardin, Eric Jon

    Coastal landscapes can be relentlessly dynamic, owing to wave energy, tidal cycles, extreme weather events, and perpetual coastal winds. In these settings, the ever-changing landscape can threaten assets and infrastructure, necessitating costly measures to mitigate associated risks and to repair or maintain the changing landscape. Mapping and monitoring of terrain change, identification of areas susceptible to dramatic change, and understanding the processes that drive landscape change are critical for the development of responsible coastal management strategies and policies. Over the past two decades, LiDAR mapping has been conducted along the U.S. east coast (including the Outer Banks, North Carolina) on a near-annual basis, generating a rich time series of topographic data with unprecedented accuracy, resolution, and extent. This time series has captured the response of the landscape to episodic storms, daily forcing of wind and waves, and anthropogenic activities. This work presents raster-based geospatial techniques developed to gain new insights into coastal geomorphology from the time series of available LiDAR. Per-cell statistical techniques derive information that is typically not obtained through the techniques traditionally employed by coastal scientists and engineers. Application of these techniques to study sites along the Outer Banks, NC, revealed substantial spatial and temporal variations in terrain change. Additionally, they identify the foredunes as being the most geomorphologically dynamic coastal features. In addition to per-cell statistical analysis, an approach is presented for the extraction of the dune ridge and dune toe (two features that are essential to standard vulnerability assessment). The approach employs a novel application of least cost path analysis and a physics-based model of an elastic sheet.
The spatially distributed nature of the approach achieves a high level of automation and repeatability that semi-automated methods and
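    The least cost path step mentioned above can be sketched with a standard Dijkstra traversal over a cost grid. This is a generic illustration of the technique, not the authors' implementation (which couples it to LiDAR-derived surfaces and an elastic-sheet model); the tiny cost grid is invented:

    ```python
    import heapq

    def least_cost_path(cost, start, goal):
        """Dijkstra over a 2-D cost grid, 4-connected; cost is paid on entering a cell."""
        rows, cols = len(cost), len(cost[0])
        dist = {start: cost[start[0]][start[1]]}
        prev = {}
        pq = [(dist[start], start)]
        while pq:
            d, (r, c) = heapq.heappop(pq)
            if (r, c) == goal:
                break
            if d > dist[(r, c)]:
                continue  # stale queue entry
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + cost[nr][nc]
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(pq, (nd, (nr, nc)))
        path, node = [goal], goal
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1], dist[goal]

    # Toy cost surface: the high-cost column (9s) is detoured around
    cost = [[1, 9, 1],
            [1, 9, 1],
            [1, 1, 1]]
    path, total = least_cost_path(cost, (0, 0), (0, 2))
    print(total)  # 7
    ```

    For ridge or toe extraction, the "cost" raster would be derived from the terrain (e.g., inverted elevation along a dune crest) so that the minimum-cost route traces the feature of interest.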

  17. Forest Inventory and Analysis in the United States: Remote sensing and geospatial activities

    Science.gov (United States)

    Mark Nelson; Gretchen Moisen; Mark Finco

    2007-01-01

    Our Nation's forests provide a wealth of ecological, social, and economic resources. These forest lands cover over 300 million hectares of the United States, or about one third of the total land area. Accurate and timely information about them is essential to their wise management and use. The mission of the Forest Service's Forest Inventory and Analysis (FIA...

  18. NCI's Distributed Geospatial Data Server

    Science.gov (United States)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach to batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments on distributed computing using interactive access to significant cloud infrastructure opens the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. 
The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under

  19. Not Just a Game … When We Play Together, We Learn Together: Interactive Virtual Environments and Gaming Engines for Geospatial Visualization

    Science.gov (United States)

    Shipman, J. S.; Anderson, J. W.

    2017-12-01

    An ideal tool for ecologists and land managers to investigate the impacts of both projected environmental changes and policy alternatives is the creation of immersive, interactive virtual landscapes. As a new frontier in visualizing and understanding geospatial data, virtual landscapes require a new toolbox for data visualization that includes traditional GIS tools as well as less common ones such as the Unity3d game engine. Game engines provide capabilities not only to explore data but to build and interact with dynamic models collaboratively. These virtual worlds can display and illustrate data in ways that are often more understandable and plausible to both stakeholders and policy makers than traditional maps. Within this context we will present funded research that has been developed utilizing virtual landscapes for geographic visualization and decision support among varied stakeholders. We will highlight the challenges and lessons learned when developing interactive virtual environments that require large multidisciplinary team efforts with varied competencies. The results will emphasize the importance of visualization and interactive virtual environments and the link with emerging research disciplines within Visual Analytics.

  20. Generating Geospatially Realistic Driving Patterns Derived From Clustering Analysis Of Real EV Driving Data

    DEFF Research Database (Denmark)

    Pedersen, Anders Bro; Aabrandt, Andreas; Østergaard, Jacob

    2014-01-01

    In order to provide a vehicle fleet that realistically represents the predicted Electric Vehicle (EV) penetration of the future, a model is required that mimics people's driving behaviour rather than simply playing back collected data. When the focus is broadened from a traditional user...... scales, which calls for a statistically correct, yet flexible model. This paper describes a method for modelling EVs, based on non-categorized data, which takes into account the plug-in locations of the vehicles. By using clustering analysis to extrapolate and classify the primary locations where......
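    The abstract says only "clustering analysis", so the following is a minimal k-means sketch of how primary plug-in locations could be classified, with deterministic farthest-point seeding and synthetic "home"/"work" coordinates standing in for real EV data (all names and numbers are invented):

    ```python
    import numpy as np

    def kmeans(points, k, iters=50):
        """Lloyd's algorithm with deterministic farthest-point initialization."""
        idx = [0]
        for _ in range(1, k):  # seed each new centroid at the point farthest from the rest
            d = np.linalg.norm(points[:, None] - points[idx][None], axis=2).min(axis=1)
            idx.append(int(d.argmax()))
        centroids = points[idx].astype(float).copy()
        for _ in range(iters):
            labels = np.linalg.norm(points[:, None] - centroids[None], axis=2).argmin(axis=1)
            for c in range(k):
                if (labels == c).any():
                    centroids[c] = points[labels == c].mean(axis=0)
        return centroids, labels

    rng = np.random.default_rng(1)
    home = rng.normal([0.0, 0.0], 0.1, (30, 2))    # synthetic overnight plug-in coordinates
    work = rng.normal([10.0, 10.0], 0.1, (30, 2))  # synthetic daytime plug-in coordinates
    cents, labels = kmeans(np.vstack([home, work]), k=2)
    print(np.round(cents))  # one centroid near (0, 0), the other near (10, 10)
    ```

    Once plug-in sessions are grouped this way, each cluster can be given a label such as "home" or "work" and a statistical charging profile, which is the kind of flexible, statistically grounded model the paper argues for.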

  1. GEOSPATIAL ANALYSIS OF ROAD DISTRESSES AND THE RELATIONSHIP WITH THE SLOPE FACTOR

    Directory of Open Access Journals (Sweden)

    NOR A. M. NASIR

    2016-05-01

    Full Text Available A road is a medium that a person uses to move from one destination to another. A good road provides comfort and safety to users; however, a poorly maintained road may endanger them. Therefore, a road maintenance management system needs to be implemented to ensure the effectiveness as well as the efficiency of road maintenance itself. In this era, many systems have been created to help with data storage and analysis. One of these systems is the Geographic Information System (GIS). Besides locating distresses, GIS can also help to classify the severity level of distresses and to correlate the distresses occurring in Universiti Kebangsaan Malaysia (UKM) with the slope gradient. Road distress data were collected using GPS applications supported by Supersurv 3 software. The study shows that the GIS method helps to produce a good spatial database. The road gradient factor is related to the level of road damage.

  2. Geospatial Analysis and Remote Sensing from Airplanes and Satellites for Cultural Resources Management

    Science.gov (United States)

    Giardino, Marco J.; Haley, Bryan S.

    2005-01-01

    Cultural resource management consists of research to identify, evaluate, document, and assess cultural resources; planning to assist in decision-making; and stewardship to implement the preservation, protection, and interpretation of these decisions and plans. One technique that may be useful in cultural resource management archaeology is remote sensing: the acquisition of data and derivative information about objects or materials (targets) located on the Earth's surface or in its atmosphere, using sensors mounted on platforms located at a distance from the targets to measure the interactions between the targets and electromagnetic radiation. Included in this definition are systems that acquire imagery by photographic methods and digital multispectral sensors. Data collected by digital multispectral sensors on aircraft and satellite platforms play a prominent role in many earth science applications, including land cover mapping, geology, soil science, agriculture, forestry, water resource management, urban and regional planning, and environmental assessments. Inherent in the analysis of remotely sensed data is the use of computer-based image processing techniques. Geographical information systems (GIS), designed for collecting, managing, and analyzing spatial information, are also useful in the analysis of remotely sensed data. A GIS can be used to integrate diverse types of spatially referenced digital data, including remotely sensed and map data. In archaeology, these tools have been used in various ways to aid cultural resource projects. For example, they have been used to predict the presence of archaeological resources using modern environmental indicators. Remote sensing techniques have also been used to directly detect the presence of unknown sites based on the impact of past occupation on the Earth's surface. Additionally, remote sensing has been used as a mapping tool aimed at delineating the boundaries of a site or mapping previously

  3. A quantitative comparison of moldic and vuggy porosity structure in karst aquifers using image and geospatial analysis

    Science.gov (United States)

    Culpepper, A. R.; Manda, A. K.

    2011-12-01

    Limestone aquifers are vital sources of groundwater for domestic and industrial use throughout the world. To sustain rising populations throughout the southeastern United States, aquifers are increasingly exploited to provide the populace with clean and reliable water resources. The moldic Castle Hayne and the vuggy Biscayne aquifer systems are two highly productive aquifers that provide critical water resources to millions of citizens in eastern North Carolina and southeastern Florida, respectively. In order to better understand karst aquifers and evaluate the potential for contaminant transport, a detailed investigation of 2D porosity and pore geometry using image and geospatial analysis was undertaken. The objective of this study is to compare and contrast the porosity structure of moldic and vuggy karst aquifers by quantifying 2D porosity and pore geometry from images of slabbed core samples and optical televiewer images. Televiewer images and images of painted core samples from the Spring Garden Member of the Castle Hayne aquifer and the Miami Limestone Formation of the Biscayne aquifer were acquired for analysis of porosity structure. The procedure for converting images of slabbed core and televiewer images to a GIS-usable format consisted of rectification, calibration, image enhancement, classification, recoding, and filtering. In GIS, raster or vector formats were used to assess pore attributes (e.g., area and perimeter) and structure. Preliminary results show that both pore area and perimeter for the Spring Garden Member of the Castle Hayne and the Miami Limestone Formation of the Biscayne aquifers can be described by exponential distributions. In both sets of slabbed core images the relatively small pores have the highest occurrence, whereas larger pores occur less frequently. However, the moldic Spring Garden Member of the Castle Hayne aquifer has larger pore sizes, derived from core images, than the vuggy Miami Limestone Formation of the Biscayne aquifer. Total porosity

  4. Bystander Cardiopulmonary Resuscitation Is Clustered and Associated With Neighborhood Socioeconomic Characteristics: A Geospatial Analysis of Kent County, Michigan.

    Science.gov (United States)

    Uber, Amy; Sadler, Richard C; Chassee, Todd; Reynolds, Joshua C

    2017-08-01

    Geographic clustering of bystander cardiopulmonary resuscitation (CPR) is associated with demographic and socioeconomic features of the community where out-of-hospital cardiac arrest (OHCA) occurred, although this association remains largely untested in rural areas. With a significant rural component and relative racial homogeneity, Kent County, Michigan, provides a unique setting to externally validate or identify new community features associated with bystander CPR. Using a large, countywide data set, we tested for geographic clustering of bystander CPR and its associations with community socioeconomic features. Secondary analysis of adult OHCA subjects (2010-2015) in the Cardiac Arrest Registry to Enhance Survival (CARES) data set for Kent County, Michigan. After linking geocoded OHCA cases to U.S. census data, we used Moran's I-test to assess for spatial autocorrelation of population-weighted cardiac arrest rate by census block group. Getis-Ord Gi statistic assessed for spatial clustering of bystander CPR and mixed-effects hierarchical logistic regression estimated adjusted associations between community features and bystander CPR. Of 1,592 subjects, 1,465 met inclusion criteria. Geospatial analysis revealed significant clustering of OHCA in more populated/urban areas. Conversely, bystander CPR was less likely in these areas (99% confidence) and more likely in suburban and rural areas (99% confidence). Adjusting for clinical, demographic, and socioeconomic covariates, bystander CPR was associated with public location (odds ratio [OR] = 1.19; 95% confidence interval [CI] = 1.03-1.39), initially shockable rhythms (OR = 1.48; 95% CI = 1.12-1.96), and those in urban neighborhoods (OR = 0.54; 95% CI = 0.38-0.77). Out-of-hospital cardiac arrest and bystander CPR are geographically clustered in Kent County, Michigan, but bystander CPR is inversely associated with urban designation. These results offer new insight into bystander CPR patterns in mixed urban and rural
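    Moran's I, the spatial autocorrelation test used in the study above, can be computed directly. This self-contained sketch uses rook (edge) adjacency on a small raster of toy rates; it is an illustration of the statistic, not a re-analysis of the CARES data:

    ```python
    import numpy as np

    def morans_i(grid):
        """Global Moran's I over a 2-D array of values, rook adjacency, binary
        weights. I > 0: similar values cluster; I < 0: dissimilar values alternate."""
        z = grid.astype(float) - grid.mean()
        rows, cols = grid.shape
        num = 0.0  # sum over neighbour pairs of z_i * z_j
        s0 = 0.0   # sum of all weights
        for i in range(rows):
            for j in range(cols):
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < rows and 0 <= nj < cols:
                        num += z[i, j] * z[ni, nj]
                        s0 += 1.0
        return (grid.size / s0) * num / (z ** 2).sum()

    clustered = np.array([[1, 1, 0, 0]] * 4)      # two homogeneous halves
    checker = np.indices((4, 4)).sum(axis=0) % 2  # alternating "checkerboard"
    print(morans_i(clustered) > 0, morans_i(checker) < 0)  # True True
    ```

    In practice the weights come from county or block-group contiguity rather than a raster, and local statistics such as Getis-Ord Gi pinpoint where the clustering occurs, but the core quantity is the same cross-product of neighbouring deviations.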

  5. Access to public drinking water fountains in Berkeley, California: a geospatial analysis.

    Science.gov (United States)

    Avery, Dylan C; Smith, Charlotte D

    2018-01-24

    In January 2015, Berkeley, California became the first city in the United States to impose a tax on sugar-sweetened beverages. The tax is intended to discourage the purchase of sugary beverages and promote consumption of healthier alternatives such as tap water. The goal of the study was to assess the condition of public drinking water fountains and determine whether there is a difference in access to clean, functioning fountains based on race or socio-economic status. A mobile GIS app was created to locate and collect data on existing drinking water fountains in Berkeley, CA. Demographic variables related to race and socio-economic status (SES) were acquired from the US Census - American Community Survey database. Disparities in access to, or condition of, drinking water fountains relative to demographics were explored using spatial analyses. Spatial statistical analysis was performed to estimate demographic characteristics of communities near the water fountains, and logistic regression was used to examine the relationship between household median income or race and the condition of the fountains. Although most fountains were classified as functioning, some were dirty, clogged, or both dirty and clogged. No spatial relationships between demographic characteristics and fountain conditions were observed. All geo-located data and a series of maps were provided to the City of Berkeley and the public. The geo-database created as an outcome of this study is useful for prioritizing maintenance of existing fountains and planning the locations of future fountains. The methodologies used for this study could be applied to a wide variety of asset inventory and assessment projects, such as clinics or pharmaceutical dispensaries, in both developed and developing countries.

  6. Geospatial Analysis of Drug Poisoning Deaths Involving Heroin in the USA, 2000-2014.

    Science.gov (United States)

    Stewart, Kathleen; Cao, Yanjia; Hsu, Margaret H; Artigiani, Eleanor; Wish, Eric

    2017-08-01

    We investigate the geographic patterns of drug poisoning deaths involving heroin by county for the USA from 2000 to 2014. The county-level patterns of mortality are examined with respect to age-adjusted rates of death for different classes of urbanization and racial and ethnic groups, while rates based on raw counts of drug poisoning deaths involving heroin are estimated for different age groups and by gender. To account for possible underestimation in these rates due to small areas or small numbers, spatial empirical Bayes estimation techniques have been used to smooth the rates of death and alleviate underestimation when analyzing spatial patterns for these different groups. The geographic pattern of poisoning deaths involving heroin has shifted from the west coast of the USA in the year 2000 to New England, the Mid-Atlantic region, and the Great Lakes and central Ohio Valley by 2014. The evolution over space and time of clusters of drug poisoning deaths involving heroin is confirmed through SaTScan analysis. For this period, White males were found to be the most impacted population group overall; however, Blacks and Hispanics are highly impacted in counties where significant populations of these two groups reside. Our results show that while 35-54-year-olds were the most highly impacted age group by county from 2000 to 2010, by 2014 the trend had changed, with an increasing number of counties experiencing higher death rates for individuals 25-34 years of age. The percentage of counties across the USA classified as large metro with deaths involving heroin is estimated to have decreased from approximately 73% in 2010 to just under 56% in 2014, with a shift to small metro and non-metro counties.
Understanding the geographic variations in impact on different population groups in the USA has become particularly necessary in light of the extreme increase in the use and misuse of street drugs including heroin and the subsequent rise in opioid-related deaths in the
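    The empirical Bayes smoothing used above can be illustrated with the simpler global (non-spatial) form of the Marshall estimator, which shrinks each county's raw rate toward the overall mean in proportion to the county's population; the spatial variant replaces the global mean with a local neighbourhood mean. The counts below are toy values, not the study's data:

    ```python
    def eb_smooth(events, pop):
        """Global empirical Bayes rate smoothing (Marshall-style shrinkage):
        rates from small populations are pulled strongly toward the overall mean."""
        k = len(events)
        rates = [o / p for o, p in zip(events, pop)]
        mu = sum(events) / sum(pop)  # overall mean rate
        pbar = sum(pop) / k
        # Method-of-moments estimate of the variance of the underlying risks,
        # truncated at zero when sampling noise dominates
        var = sum(p * (r - mu) ** 2 for p, r in zip(pop, rates)) / sum(pop) - mu / pbar
        var = max(var, 0.0)
        smoothed = []
        for r, p in zip(rates, pop):
            w = var / (var + mu / p) if var > 0 else 0.0  # shrinkage weight in [0, 1]
            smoothed.append(w * r + (1 - w) * mu)
        return smoothed

    deaths = [1, 3, 120]           # toy county death counts
    pops = [500, 2_000, 400_000]   # toy county populations
    print([round(r, 5) for r in eb_smooth(deaths, pops)])
    ```

    The effect is exactly what the abstract needs: a single death in a county of 500 no longer produces an extreme rate, while large-population counties keep rates close to their raw values.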

  7. Who serves the urban poor? A geospatial and descriptive analysis of health services in slum settlements in Dhaka, Bangladesh.

    Science.gov (United States)

    Adams, Alayne M; Islam, Rubana; Ahmed, Tanvir

    2015-03-01

    In Bangladesh, the health risks of unplanned urbanization are disproportionately shouldered by the urban poor. At the same time, affordable formal primary care services are scarce, and what exists is almost exclusively provided by non-government organizations (NGOs) working on a project basis. So where do the poor go for health care? A health facility mapping of six urban slum settlements in Dhaka was undertaken to explore the configuration of healthcare services proximate to where the poor reside. Three methods were employed: (1) social mapping and listing of all Health Service Delivery Points (HSDPs); (2) creation of a geospatial map including Global Positioning System (GPS) co-ordinates of all HSDPs in the six study areas and (3) implementation of a facility survey of all HSDPs within the six study areas. Descriptive statistics are used to examine the number, type and concentration of service provider types, as well as indicators of their accessibility in terms of location and hours of service. A total of 1041 HSDPs were mapped, of which 80% are privately operated and the rest by NGOs and the public sector. Pharmacies and non-formal or traditional doctors make up 75% of the private sector, while consultation chambers account for 20%. Most NGO and Urban Primary Health Care Project (UPHCP) static clinics are open 5-6 days/week, but close by 4-5 pm in the afternoon. Evening services are almost exclusively offered by private HSDPs; however, only 37% of private sector health staff possess some kind of formal medical qualification. This spatial analysis of health service supply in poor urban settlements emphasizes the importance of taking the informal private sector into account in efforts to increase effective coverage of quality services. Features of informal private sector service provision that have facilitated market penetration may be relevant in designing formal services that better meet the needs of the urban poor.
Published by Oxford University Press in association

  8. Geospatial Analysis Framework

    Directory of Open Access Journals (Sweden)

    Elisabeta Antonia Haller

    2010-04-01

    Full Text Available In a computerized society, the volume of data grows at an unexpected rate, making its timely processing a very difficult task. Converting data into useful information and knowledge has become a priority; in this sense, data mining is a result of technological developments. The interpretation of spatial data has been a subject of research over time, and there is now a large variety of instruments and software products for its representation and interpretation. What we need to understand, beyond the facilities offered by one system or another, whether a proprietary or an open source solution, is how they work and how they interact with spatial data.

  9. Learning Haskell data analysis

    CERN Document Server

    Church, James

    2015-01-01

    If you are a developer, analyst, or data scientist who wants to learn data analysis methods using Haskell and its libraries, then this book is for you. Prior experience with Haskell and a basic knowledge of data science will be beneficial.

  10. Influence of Topographic and Hydrographic Factors on the Spatial Distribution of Leptospirosis Disease in São Paulo County, Brazil: An Approach Using Geospatial Techniques and GIS Analysis

    Science.gov (United States)

    Ferreira, M. C.; Ferreira, M. F. M.

    2016-06-01

    Leptospirosis is a zoonosis caused by bacteria of the genus Leptospira. Rodents, especially Rattus norvegicus, are the most frequent hosts of this microorganism in cities. Human transmission occurs through contact with the urine, blood or tissues of rodents, or with water or mud contaminated by rodent urine. Spatial patterns of concentration of leptospirosis are related to multiple environmental and socioeconomic factors, such as housing near flooding areas, domestic garbage disposal sites, and high densities of people living in slums located near river channels. We used geospatial techniques and a geographical information system (GIS) to analyse the spatial relationship between the distribution of leptospirosis cases and distance from rivers, river density within the census sector, and terrain slope in São Paulo County, Brazil. To test this methodology we used a sample of 183 geocoded leptospirosis cases confirmed in 2007, ASTER GDEM2 data, and hydrography and census sector shapefiles. Our results showed that GIS and geospatial analysis techniques improved the mapping of the disease and made it possible to identify the spatial pattern of association between the location of cases and the spatial distribution of the environmental variables analyzed. The study also showed that leptospirosis cases may be more closely related to census sectors located in areas of higher river density and to households situated at shorter distances from rivers. On the other hand, it was not possible to assert that terrain slope contributes significantly to the location of leptospirosis cases.
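
    The distance-from-rivers factor in this record reduces to a point-to-polyline distance between geocoded cases and the hydrography layer. A minimal sketch in plain Python, with hypothetical projected coordinates in metres (a real workflow would compute this in GIS software such as PostGIS or GeoPandas against the actual case and river layers):

```python
import math

def point_segment_distance(px, py, ax, ay, bx, by):
    """Shortest distance from point (px, py) to the segment (ax, ay)-(bx, by)."""
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project the point onto the segment, clamping to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

# Hypothetical river reach and geocoded case locations (projected metres).
river = (0.0, 0.0, 1000.0, 0.0)
cases = [(120.0, 45.0), (500.0, 800.0)]

# Flag cases within an illustrative 100 m buffer of the river.
near_river = [point_segment_distance(x, y, *river) <= 100.0 for x, y in cases]
print(near_river)  # [True, False]
```

    Repeating such a distance query per case, and tabulating cases against river-density classes of their census sectors, is the core of the spatial association analysis the abstract describes.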

  11. Leveraging the geospatial advantage

    Science.gov (United States)

    Ben Butler; Andrew Bailey

    2013-01-01

    The Wildland Fire Decision Support System (WFDSS) web-based application leverages geospatial data to inform strategic decisions on wildland fires. A specialized data team, working within the Wildland Fire Management Research Development and Application group (WFM RD&A), assembles authoritative national-level data sets defining values to be protected. The use of...

  12. Information gathering, management and transferring for geospatial intelligence - A conceptual approach to create a spatial data infrastructure

    Science.gov (United States)

    Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena

    2017-06-01

    Since long ago, information has been a key factor for military organizations. In a military context, the success of joint and combined operations depends on accurate information and knowledge flows concerning the operational theatre: provision of resources, evolution of the environment, targets' locations, where and when an event will occur. Modern military operations cannot be conceived without maps and geospatial information. Staffs and forces in the field request large volumes of information during the planning and execution process, and horizontal and vertical integration of geospatial information is critical for the decision cycle. Information and knowledge management are fundamental to clarify an environment full of uncertainty. Geospatial information (GI) management arises as a branch of information and knowledge management, responsible for the conversion process from raw data, collected by human or electronic sensors, to knowledge. Geospatial information and intelligence systems allow us to integrate all other forms of intelligence and act as a main platform to process and display geospatially and temporally referenced events. Combining explicit knowledge with personal know-how generates a continuous learning cycle that supports real-time decisions, mitigates the influence of the fog of war, and provides knowledge supremacy. This paper presents the analysis carried out after applying a questionnaire and interviews about GI and intelligence management in a military organization. The study intended to identify the stakeholders' requirements for a military spatial data infrastructure, as well as the requirements for the future development of a software system.

  13. Capacity Building through Geospatial Education in Planning and School Curricula

    Science.gov (United States)

    Kumar, P.; Siddiqui, A.; Gupta, K.; Jain, S.; Krishna Murthy, Y. V. N.

    2014-11-01

    Geospatial technology has widespread usage in development planning and resource management. It offers pragmatic tools to help urban and regional planners realize their goals. At the request of the Ministry of Urban Development, Govt. of India, the Indian Institute of Remote Sensing (IIRS), Dehradun has taken the initiative to study the model syllabi of the All India Council for Technical Education for the planning curricula of Bachelor and Master (five disciplines) programmes. It is inferred that the geospatial content across the semesters in various planning fields needs revision. It is also realized that students pursuing planning curricula are invariably exposed to spatial mapping tools, but the popular digital drafting software has limitations for the geospatial analysis of planning phenomena. Therefore, students need exposure to geospatial technologies to understand various real-world phenomena. Inputs were given to seamlessly merge and incorporate geospatial components throughout the semesters wherever relevant. Another IIRS initiative was to enhance the understanding and essence of space and geospatial technologies among young minds at the 10+2 level. The content was proposed in a manner such that youngsters start realizing the innumerable contributions made by space and geospatial technologies in their day-to-day lives. This effort, both at school and college level, would help not only to enhance job opportunities for the younger generation but also to utilize untapped human resource potential. In an era of smart cities, higher economic growth and aspirations for a better tomorrow, the integration of geospatial technologies with conventional wisdom can no longer be ignored.

  14. Geospatial data sharing, online spatial analysis and processing of Indian Biodiversity data in Internet GIS domain - A case study for raster based online geo-processing

    Science.gov (United States)

    Karnatak, H.; Pandey, K.; Oberai, K.; Roy, A.; Joshi, D.; Singh, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.

    2014-11-01

    National Biodiversity Characterization at Landscape Level, a project jointly sponsored by the Department of Biotechnology and the Department of Space, was implemented to identify and map the potential biodiversity-rich areas in India. This project has generated spatial information at three levels, viz. satellite-based primary information (vegetation type map, spatial locations of roads & villages, fire occurrences); geospatially derived or modelled information (disturbance index, fragmentation, biological richness); and geospatially referenced field sample plots. The study provides information on areas of high disturbance and high biological richness, suggesting future management strategies and supporting the formulation of action plans. The study has generated for the first time a baseline database for India which will be a valuable input to climate change studies in the Indian subcontinent. The spatial data generated during the study are organized as a central data repository in a Geo-RDBMS environment using PostgreSQL and PostGIS. The raster and vector data are published under the OGC WMS and WFS standards for the development of a web-based geoinformation system using a Service Oriented Architecture (SOA). The WMS- and WFS-based system allows geo-visualization, online query, and map output generation based on user request and response. This is a typical mashup-architecture-based geoinformation system which allows access to remote web services such as ISRO Bhuvan, OpenStreetMap, and Google Maps, with overlay on the biodiversity data for effective study of bio-resources. Spatial queries and analysis of vector data are achieved through SQL queries on PostGIS and WFS-T operations. But the most important challenge is to develop a system for online raster-based geospatial analysis and processing based on a user-defined Area of Interest (AOI) for large raster data sets. The map data of this study comprise approximately 20 GB for each of the five data layers. An attempt has been made to develop a system using
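
    The closing challenge, online raster analysis over a user-defined AOI on layers of roughly 20 GB each, is commonly met by streaming the computation over raster tiles instead of loading whole layers into memory. A minimal sketch of such a window iterator (the raster dimensions and tile size here are illustrative; a production backend would perform windowed reads through a library such as GDAL):

```python
def iter_windows(width, height, tile):
    """Yield (col_off, row_off, win_w, win_h) windows tiling a raster."""
    for row in range(0, height, tile):
        for col in range(0, width, tile):
            yield col, row, min(tile, width - col), min(tile, height - row)

# The full layer is never held in memory: each window is read, clipped to
# the AOI, reduced (e.g. class counts or sums), and discarded.
windows = list(iter_windows(width=10, height=10, tile=4))
print(len(windows))  # 9 windows for a 10x10 raster with 4-pixel tiles
```

    Per-window partial results are then aggregated into the final AOI statistic, which keeps the memory footprint bounded by the tile size regardless of layer size.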

  15. Infrastructure for the Geospatial Web

    Science.gov (United States)

    Lake, Ron; Farley, Jim

    Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.

  16. Increasing the value of geospatial informatics with open approaches for Big Data

    Science.gov (United States)

    Percivall, G.; Bermudez, L. E.

    2017-12-01

    Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, DigitalGlobe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases: Collection and Ingest: remote-sensed data processing; data stream processing. Prepare and Structure: SQL and NoSQL databases; data linking; feature identification. Analytics and Visualization: spatial-temporal analytics; machine learning; data exploration. Modeling and Prediction: integrated environmental models; urban 4D models. Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following. Open Cloud Computing: avoid vendor lock-in through API interoperability and application portability. Open Source Extensions: implement geospatial data representations in projects from Apache, LocationTech, and OSGeo; investigate parallelization strategies for N-dimensional spatial data. Geospatial Data Representations: schemas to improve processing and analysis using geospatial concepts (Features, Coverages, DGGS); use geospatial encodings like NetCDF and GeoPackage. Big Linked Geodata: use linked data methods scaled to big geodata. Analysis Ready Data: support "download as last resort" and "analytics as a service"; promote elements common to "datacubes."

  17. A Geospatial Online Instruction Model

    OpenAIRE

    Athena OWEN-NAGEL; John C. RODGERS III; Shrinidhi AMBINAKUDIGE

    2012-01-01

    The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model’s effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based teaching effectiveness has not yet been clearly demonstrated for geospatial courses. The pedagogical model implemented in this study heavily utilizes ...

  18. Agricultural Capacity to Increase the Production of Select Fruits and Vegetables in the US: A Geospatial Modeling Analysis.

    Science.gov (United States)

    Conrad, Zach; Peters, Christian J; Chui, Kenneth; Jahns, Lisa; Griffin, Timothy S

    2017-09-23

    The capacity of US agriculture to increase the output of specific foods to accommodate increased demand is not well documented. This research uses geospatial modeling to examine the capacity of the US agricultural land base to increase the per capita availability of an example set of nutrient-dense fruits and vegetables (F&V). These fruits and vegetables were selected based on nutrient content and an increasing trend of domestic production and consumption. Geographic information system models were parameterized to identify agricultural land areas meeting crop-specific growing requirements for monthly precipitation and temperature; soil depth and type; cropland availability; and proximity to existing production centers. The results of these analyses demonstrate that crop production can be expanded by nearly 144,000 ha within existing national production centers, generating an additional 0.05 cup-equivalents of fruits and vegetables per capita per day, a 1.7% increase above current total F&V availability. Expanding the size of national crop production centers can further increase the availability of all F&V by 2.5%-5.4%, which is still less than the recommended amount. Challenges to increasing F&V production in the US include lack of labor availability, barriers to adoption among producers, and threats to crop yields from environmental concerns.
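
    The figures quoted in this abstract imply a current baseline: if 0.05 cup-equivalents per capita per day is a 1.7% increase over current availability, the baseline works out to roughly 2.9 cup-equivalents per capita per day. A quick back-of-envelope check (derived only from the two numbers stated above, not from the underlying study data):

```python
# Figures quoted in the abstract
additional = 0.05  # cup-equivalents per capita per day added by expansion
increase = 0.017   # stated relative increase (1.7%)

# Implied current baseline availability of fruits and vegetables
baseline = additional / increase
print(round(baseline, 2))  # 2.94
```
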

  19. Intentional learning: A concept analysis.

    Science.gov (United States)

    Mollman, Sarah; Candela, Lori

    2018-01-01

    To use a concept analysis to determine a clear definition of the term "intentional learning" for use in nursing. The term intentional learning has been used for years in educational, business, and even nursing literature. It has been used to denote processes leading to higher-order thinking and the ability to use knowledge in new situations, both of which are important skills to develop in nursing students. But the lack of a common, accepted definition of the term makes it difficult for nurse educators to base instruction and learning experiences on it, or to evaluate its overall effectiveness in educating students for diverse, fast-paced clinical practices. The method was a concept analysis following the eight-step approach developed by Walker and Avant (2011), drawing on empirical and descriptive literature. Five defining attributes were identified: (1) self-efficacy for learning, (2) active, effortful, and engaged learning, (3) mastery of goals where learning is the goal, (4) self-directed learning, and (5) self-regulation of learning. Through this concept analysis, nursing will have a clear definition of intentional learning. This will enable nurse educators to generate, evaluate, and test learning experiences that promote further development of intentional learning in nursing students. Nurses in practice will also be able to evaluate whether the stated benefits are demonstrated and how this impacts patient care and outcomes. © 2017 Wiley Periodicals, Inc.

  20. Submergence analysis of the proposed Ken Betwa Dam (Madhya Pradesh, India) using geospatial technology in Environmental Impact Assessments

    Directory of Open Access Journals (Sweden)

    Goparaju Laxmi

    2017-12-01

    Full Text Available This study analysed Landsat 8 OLI data (December 2016) to delineate the various land use/land cover classes of the area that will be submerged by the proposed Daudhan/Greater Gangau Dam, part of the proposed Ken Betwa River Link Project (in the Madhya Pradesh state of India), and also the area likely to be submerged in the Panna Tiger Reserve (PTR). The proposed area of submergence was computed at various full reservoir levels (FRLs): 278 m, 283 m, 288 m, 289 m and 293 m. Similarly, the area of submergence for the Panna Tiger Reserve was computed at the mentioned FRLs. It was concluded that a large part of the Panna Tiger Reserve would be submerged and the habitat of various animals and plants would be under threat. In comparison with the figures given in the Environmental Impact Assessment, certain serious discrepancies and weaknesses were detected which should have been addressed. The results were compared with the EIA-EMP report of the Ken-Betwa link project, Phase 1, prepared by Agricultural Finance Corporation Limited for the National Water Development Agency (Ministry of Water Resources, River Development and Ganga Rejuvenation, Government of India). A proper evaluation of the negative impacts would help in making relevant decisions and taking appropriate steps to ensure that the loss is kept to a minimum. Safeguarding the biodiversity of forests and wildlife habitats should be the priority, as their loss is irreplaceable. Geospatial technology helps in studying the overall spatial view of the proposed submergence area, and the visualization gives a clear picture of the likely future scenario. It would assist in decision making and mitigation measures.

  1. Geospatial Technology in Geography Education

    NARCIS (Netherlands)

    Muniz Solari, Osvaldo; Demirci, A.; van der Schee, J.A.

    2015-01-01

    The book is presented as an important starting point for new research in Geography Education (GE) related to the use and application of geospatial technologies (GSTs). For this purpose, the selection of topics was based on ideas central to GE in its relationship with GSTs. The process of geospatial

  2. A Geospatial Online Instruction Model

    Science.gov (United States)

    Rodgers, John C., III; Owen-Nagel, Athena; Ambinakudige, Shrinidhi

    2012-01-01

    The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model's effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based…

  3. FOSS Tools and Applications for Education in Geospatial Sciences

    Directory of Open Access Journals (Sweden)

    Marco Ciolli

    2017-07-01

    Full Text Available While the theory and implementation of geographic information systems (GIS have a history of more than 50 years, the development of dedicated educational tools and applications in this field is more recent. This paper presents a free and open source software (FOSS approach for education in the geospatial disciplines, which has been used over the last 20 years at two Italian universities. The motivations behind the choice of FOSS are discussed with respect to software availability and development, as well as educational material licensing. Following this philosophy, a wide range of educational tools have been developed, covering topics from numerical cartography and GIS principles to the specifics regarding different systems for the management and analysis of spatial data. Various courses have been implemented for diverse recipients, ranging from professional training workshops to PhD courses. Feedback from the students of those courses provides an invaluable assessment of the effectiveness of the approach, supplying at the same time directions for further improvement. Finally, lessons learned after 20 years are discussed, highlighting how the management of educational materials can be difficult even with a very open approach to licensing. Overall, the use of free and open source software for geospatial (FOSS4G science provides a clear advantage over other approaches, not only simplifying software and data management, but also ensuring that all of the information related to system design and implementation is available.

  4. The Future of Geospatial Standards

    Science.gov (United States)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations, with more than 6,000 people registered on the OGC communication platform, drives the development of standards that are freely available for anyone to use and that improve the sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, adapting to the particular needs of each community while ensuring interoperability through common base models and appropriate support services. Other standards address orthogonal aspects such as the handling of Big Data, crowd-sourced information, geosemantics, or containers for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP), wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium, including standards and testbeds.

  5. Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools

    Science.gov (United States)

    Zalles, Daniel R.; Manitakos, James

    2016-01-01

    Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), a 4.5-year National Science Foundation funded project, explored the strategies that stimulate teacher commitment to the project's driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate,…

  6. The Efficacy of Educative Curriculum Materials to Support Geospatial Science Pedagogical Content Knowledge

    Science.gov (United States)

    Bodzin, Alec; Peffer, Tamara; Kulo, Violet

    2012-01-01

    Teaching and learning about geospatial aspects of energy resource issues requires that science teachers apply effective science pedagogical approaches to implement geospatial technologies into classroom instruction. To address this need, we designed educative curriculum materials as an integral part of a comprehensive middle school energy…

  7. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin

    2010-03-01

    Geospatial information systems provide an abundance of information for researchers and scientists. Unfortunately this type of data can usually only be analyzed a few megapixels at a time, giving researchers a very narrow view into these voluminous data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented levels expediting analysis, interrogation, and discovery. ©2010 IEEE.

  8. INTEGRATING GEOSPATIAL TECHNOLOGIES AND SECONDARY STUDENT PROJECTS: THE GEOSPATIAL SEMESTER

    Directory of Open Access Journals (Sweden)

    Bob Kolvoord

    2012-12-01

    Full Text Available The Geospatial Semester is a geographical education activity in which students in their final year at US secondary schools acquire specific competences and skills in geographic information systems, GPS and remote sensing. Through a project-based learning methodology, students are motivated and engaged in research projects in which they analyse, and even propose solutions to, various processes, problems or questions of a spatial nature. The project is coordinated by James Madison University and has been implemented for seven years in high schools in the State of Virginia, involving more than 20 schools and 1,500 students. The Geospatial Semester's university management not only ensures proper coaching, guidance and GIS training for the schools' teachers, but has also established a system whereby students who pass this secondary-school course gain recognition of certain academic credits from the university. Key words: geographic information systems, teaching, geography education, geospatial semester.

  9. Tools for open geospatial science

    Science.gov (United States)

    Petras, V.; Petrasova, A.; Mitasova, H.

    2017-12-01

    Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, just using open source software or making the code public does not make research reproducible. Moreover, scientists face the challenge of learning new, unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and lab notebook. We will see how Git helps us not to get lost in revisions, and how Docker is used to wrap all the parts together in a single text file so that the figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.

  10. Exploratory Analysis in Learning Analytics

    Science.gov (United States)

    Gibson, David; de Freitas, Sara

    2016-01-01

    This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…

  11. Automated geospatial Web Services composition based on geodata quality requirements

    Science.gov (United States)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis within a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, represent more precisely the situations of nonconformity with geodata quality that may occur during the execution of the Web Services composition. The service compositions produced by this method are more robust, improving process reliability when working with a chain of geospatial Web Services.
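
    The rule-based quality gating this record describes can be illustrated with a toy conformity check plus a fallback branch. Everything here is hypothetical (the metadata fields, the accuracy threshold, and the "reproject" repair step); the actual work uses semantic annotations evaluated by a conditional planner:

```python
def conforms(meta, required):
    """Hypothetical geodata-quality rule: positional accuracy within
    tolerance and coordinate reference system matching the request."""
    return (meta["accuracy_m"] <= required["max_accuracy_m"]
            and meta["crs"] == required["crs"])

def next_step(candidates, required):
    """Pick the first service whose advertised output conforms; otherwise
    branch to a repair action, mimicking a conditional plan's contingency."""
    for name, meta in candidates:
        if conforms(meta, required):
            return name
    return "repair:reproject"

required = {"max_accuracy_m": 5.0, "crs": "EPSG:4326"}
candidates = [
    ("wfs-a", {"accuracy_m": 12.0, "crs": "EPSG:4326"}),  # too coarse
    ("wfs-b", {"accuracy_m": 2.5, "crs": "EPSG:4326"}),   # conforms
]
print(next_step(candidates, required))  # wfs-b
```

    The point of the conditional-planning formulation is that such nonconformity branches are represented in the plan itself, so the composition degrades gracefully instead of failing at execution time.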

  12. A FRAMEWORK FOR AN OPEN SOURCE GEOSPATIAL CERTIFICATION MODEL

    Directory of Open Access Journals (Sweden)

    T. U. R. Khan

    2016-06-01

    Full Text Available The geospatial industry is forecast to see enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arenas, as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors such as Esri, Oracle, and RedHat. These focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know", currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and

  13. a Framework for AN Open Source Geospatial Certification Model

    Science.gov (United States)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have an increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software prompted the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors such as Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and modes of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know", currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  14. Research and Practical Trends in Geospatial Sciences

    Science.gov (United States)

    Karpik, A. P.; Musikhin, I. A.

    2016-06-01

    In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and the rapid evolution of geospatial science. Almost all professional communities have been affected: communities are embracing digital techniques, modern equipment, software, and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of which trends will be in great demand in 3-7 years. This paper reviews the current scientific and practical activities of non-governmental international organizations such as the International Federation of Surveyors, the International Cartographic Association, and the International Society for Photogrammetry and Remote Sensing; analyzes and groups the most relevant topics brought up at their scientific events; forecasts the most probable research and practical trends in geospatial sciences; and identifies the leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation, and joint implementation projects.

  15. RESEARCH AND PRACTICAL TRENDS IN GEOSPATIAL SCIENCES

    Directory of Open Access Journals (Sweden)

    A. P. Karpik

    2016-06-01

    Full Text Available In recent years professional societies have been undergoing fundamental restructuring brought on by extensive technological change and the rapid evolution of geospatial science. Almost all professional communities have been affected: communities are embracing digital techniques, modern equipment, software, and new technological solutions at a staggering pace. In this situation, when planning financial investments and intellectual resource management, it is crucial to have a clear understanding of which trends will be in great demand in 3-7 years. This paper reviews the current scientific and practical activities of non-governmental international organizations such as the International Federation of Surveyors, the International Cartographic Association, and the International Society for Photogrammetry and Remote Sensing; analyzes and groups the most relevant topics brought up at their scientific events; forecasts the most probable research and practical trends in geospatial sciences; and identifies the leading countries and emerging markets for further detailed analysis of their activities, types of scientific cooperation, and joint implementation projects.

  16. Hotspots and causes of motor vehicle crashes in Baltimore, Maryland: A geospatial analysis of five years of police crash and census data.

    Science.gov (United States)

    Dezman, Zachary; de Andrade, Luciano; Vissoci, Joao Ricardo; El-Gabri, Deena; Johnson, Abree; Hirshon, Jon Mark; Staton, Catherine A

    2016-11-01

    Road traffic injuries are a leading killer of youth (aged 15-29) and are projected to be the 7th leading cause of death by 2030. To better understand road traffic crash locations and characteristics in the city of Baltimore, we used police and census data to describe the epidemiology, hotspots, and modifiable risk factors involved, in order to guide further interventions. Data on all crashes in Baltimore City from 2009 to 2013 were obtained from the Maryland Automated Accident Reporting System, and socioeconomic data from the 2010 US Census. A time series analysis was conducted using an ARIMA model. We analyzed the geographical distribution of traffic crashes and hotspots using exploratory spatial data analysis and spatial autocorrelation, and performed spatial regression to evaluate the impact of socioeconomic indicators on hotspots. In Baltimore City, between 2009 and 2013, a total of 100,110 crashes were reported, with 1% of crashes considered severe. Of all crashes, 7% involved vulnerable road users and 12% involved the elderly or youth. Reasons for crashes included distracted driving (31%), speeding (6%), and alcohol or drug use (5%). After 2010, we observed an increasing trend in all crashes, especially from March to June. Distracted driving, followed by youth and elderly drivers, were consistently the highest risk factors over time. A multivariate spatial regression model including socioeconomic indicators and controlling for age, gender, and population size explained only 20% of the road crash variability and showed no distinct predictor of crashes, indicating that crashes are not geographically explained by socioeconomic indicators alone. In Baltimore City, road traffic crashes occurred predominantly in the high-density center of the city, involved distracted driving and extremes of age, and increased from March to June. There was no association between socioeconomic variables where crashes occurred and hotspots.
    In-depth analysis of
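
    The spatial-autocorrelation step described in this abstract can be sketched with a minimal global Moran's I computation. The zone counts and the binary contiguity weights below are hypothetical illustrations, not the Baltimore data:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for spatial autocorrelation.

    values  : 1-D array of an attribute (e.g., crash counts per zone)
    weights : n x n spatial weights matrix (weights[i, j] > 0 if zones
              i and j are neighbours; diagonal is zero)
    """
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    z = x - x.mean()                    # deviations from the mean
    num = n * (z @ w @ z)               # cross-products of neighbouring deviations
    den = w.sum() * (z @ z)
    return num / den

# Hypothetical toy example: 4 zones in a row, counts rising left to right,
# so neighbouring zones have similar values and Moran's I is positive.
counts = [2, 3, 8, 9]
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(round(morans_i(counts, w), 3))
```

    Values near +1 indicate clustering of similar values (hotspots), near 0 spatial randomness; a permutation test (not shown) is normally used to assess significance.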

  17. Preparing Preservice Teachers to Incorporate Geospatial Technologies in Geography Teaching

    Science.gov (United States)

    Harte, Wendy

    2017-01-01

    This study evaluated the efficacy of geospatial technology (GT) learning experiences in two geography curriculum courses to determine their effectiveness for developing preservice teacher confidence and preparing preservice teachers to incorporate GT in their teaching practices. Surveys were used to collect data from preservice teachers at three…

  18. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional, and municipal levels around the world. It is now widely recognized that keeping these established geospatial databases up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, there are two main types of methods for geospatial database updating: direct updating with remote sensing images or field surveying materials, and indirect updating with other updated data, such as newly updated larger-scale data. The former method is fundamental, because the update data sources in both methods ultimately derive from field surveying and remote sensing; the latter is often more economical and faster. Therefore, after a larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the multi-scale geospatial database consistent. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating; this is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment where, as in China, databases at different scales are produced and maintained separately by organizations at different levels. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger scales in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, and a brief review of geospatial data updating based on digital map generalization is then given.
    Based on the requirements analysis and review, we analyze the key factors for updating geospatial data from larger scales, including technical
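
    Map generalization of the kind discussed in this abstract typically includes line simplification when a smaller-scale dataset is derived from a larger-scale one. A minimal sketch using the classic Douglas-Peucker algorithm on a hypothetical polyline; in practice the tolerance would be chosen from the target map scale:

```python
import math

def perp_dist(pt, a, b):
    """Perpendicular distance from pt to the infinite line through a and b."""
    (x, y), (x1, y1), (x2, y2) = pt, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tol):
    """Recursively drop vertices closer than `tol` to the simplified line."""
    if len(points) < 3:
        return list(points)
    # Distance of each interior vertex to the chord between the endpoints.
    dists = [perp_dist(p, points[0], points[-1]) for p in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= tol:
        return [points[0], points[-1]]       # all interior vertices are noise
    left = douglas_peucker(points[:i + 1], tol)
    right = douglas_peucker(points[i:], tol)
    return left[:-1] + right                 # avoid duplicating the split vertex

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(line, tol=1.0))
```

    Production workflows would use a library routine (e.g., a GIS simplify tool) and add topology checks so that simplified features do not self-intersect.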

  19. Geospatial Information is the Cornerstone of Effective Hazards Response

    Science.gov (United States)

    Newell, Mark

    2008-01-01

    Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders, and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS. 
The GIRT

  20. Geospatial Technologies to Improve Urban Energy Efficiency

    Directory of Open Access Journals (Sweden)

    Bharanidharan Hemachandran

    2011-07-01

    Full Text Available The HEAT (Home Energy Assessment Technologies) pilot project is a free Geoweb mapping service designed to empower the urban energy efficiency movement by allowing residents to visualize the amount and location of waste heat leaving their homes and communities as easily as clicking on their house in Google Maps. HEAT incorporates geospatial solutions for residential waste heat monitoring using Geographic Object-Based Image Analysis (GEOBIA) and Canadian-built Thermal Airborne Broadband Imager technology (TABI-320) to provide users with timely, in-depth, easy-to-use, location-specific waste-heat information, as well as opportunities to save money and reduce their greenhouse-gas emissions. We first report on the HEAT Phase I pilot project, which evaluates 368 residences in the Brentwood community of Calgary, Alberta, Canada, and describe the development and implementation of interactive waste heat maps, energy use models, a Hot Spot tool able to view the 6+ hottest locations on each home, and a new HEAT Score for inter-city waste heat comparisons. We then describe current challenges, lessons learned, and new solutions as we begin Phase II and scale from 368 to 300,000+ homes with the newly developed TABI-1800. Specifically, we introduce a new object-based mosaicing strategy, an adaptation of Emissivity Modulation to correct for emissivity differences, and a new Thermal Urban Road Normalization (TURN) technique to correct for scene-wide microclimatic variation. We also describe a new Carbon Score and opportunities to update city cadastral errors with automatically defined thermal house objects.

  1. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for

  2. Urban Image Classification: Per-Pixel Classifiers, Sub-Pixel Analysis, Object-Based Image Analysis, and Geospatial Methods. 10; Chapter

    Science.gov (United States)

    Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.

    2013-01-01

    Remote sensing methods used to generate base maps for analyzing the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within that pixel. They are used for everything from simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems that assign urban land covers. Researchers recognize, however, that even with the smallest pixel size the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process. The primary difference, though, is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post
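
    As an illustration of the per-pixel index calculations mentioned in this abstract, here is a minimal NDVI computation with a simple threshold rule. The 2x2 band values and the 0.3 vegetation cutoff are hypothetical:

```python
import numpy as np

# Hypothetical 2x2 red and near-infrared reflectance bands.
red = np.array([[0.10, 0.20],
                [0.30, 0.05]])
nir = np.array([[0.50, 0.25],
                [0.30, 0.45]])

def ndvi(nir, red, eps=1e-9):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red); eps guards divide-by-zero."""
    return (nir - red) / (nir + red + eps)

index = ndvi(nir, red)
# Simple per-pixel rule: label a pixel 'vegetation' when NDVI exceeds 0.3.
vegetation = index > 0.3
print(index.round(3))
print(vegetation)
```

    Real urban classifiers replace the single threshold with multi-class decision rules, but the per-pixel principle — each pixel labelled from its own spectral values alone — is the same.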

  3. Use of geospatial technology for delineating groundwater potential zones with an emphasis on water-table analysis in Dwarka River basin, Birbhum, India

    Science.gov (United States)

    Thapa, Raju; Gupta, Srimanta; Gupta, Arindam; Reddy, D. V.; Kaur, Harjeet

    2018-05-01

    Dwarka River basin in Birbhum, West Bengal (India), is an agriculture-dominated area where groundwater plays a crucial role. The basin experiences seasonal water stress with a scarcity of surface water. In the present study, delineation of groundwater potential zones (GWPZs) is carried out using a geospatial multi-influencing-factor technique. Geology, geomorphology, soil type, land use/land cover, rainfall, lineament and fault density, drainage density, slope, and elevation were considered for the delineation of GWPZs in the study area. About 9.3, 71.9 and 18.8% of the study area falls within good, moderate and poor groundwater potential zones, respectively. The potential groundwater yield data corroborate the outcome of the model, with maximum yield in the older floodplain and minimum yield in the hard-rock terrains of the western and south-western regions. Validation of the GWPZs using the yield of 148 wells shows very high accuracy of the model prediction, i.e., 89.1% on superimposition and 85.1 and 81.3% on success and prediction rates, respectively. Measurement of the seasonal water-table fluctuation with a multiplicative time-series model for predicting the short-term trend of the water table, followed by chi-square analysis between the predicted and observed water-table depths, indicates a trend of falling groundwater levels, with a 5% level of significance and a p-value of 0.233. The rainfall pattern for the last 3 years of the study shows a moderately positive correlation (R² = 0.308) with the average water-table depth in the study area.
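
    The multi-influencing-factor delineation described in this abstract amounts to a weighted overlay of reclassified factor rasters. A minimal sketch with hypothetical 2x2 factor scores, weights, and class breaks — the paper's actual ten-factor weights are not reproduced here:

```python
import numpy as np

# Hypothetical reclassified factor rasters (scores 1-5, higher = more
# favourable for groundwater) and illustrative weights summing to 1.
layers = {
    "geology":          np.array([[3, 4], [2, 5]]),
    "drainage_density": np.array([[4, 2], [3, 3]]),
    "slope":            np.array([[5, 3], [1, 4]]),
}
weights = {"geology": 0.5, "drainage_density": 0.3, "slope": 0.2}

# Weighted overlay: cell-wise weighted sum of the factor scores.
gwpz = sum(weights[name] * layers[name].astype(float) for name in layers)

# Classify the composite index into poor / moderate / good potential zones
# using hypothetical class breaks at 3.0 and 4.0.
zones = np.digitize(gwpz, bins=[3.0, 4.0])   # 0 = poor, 1 = moderate, 2 = good
print(gwpz)
print(zones)
```

    With real data each layer would be a co-registered raster (e.g., read via a GIS library) and the weights would come from the multi-influencing-factor scoring, but the overlay arithmetic is exactly this cell-wise weighted sum.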

  4. A COMBINATION OF GEOSPATIAL AND CLINICAL ANALYSIS IN PREDICTING DISABILITY OUTCOME AFTER ROAD TRAFFIC INJURY (RTI IN A DISTRICT IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    R. Nik Hisamuddin

    2016-09-01

    Full Text Available This was a prospective cohort study running from July 2011 until June 2013, involving all injuries related to motor vehicle crashes (MVC) attending the Emergency Departments (ED) of two tertiary centers in a district in Malaysia. Selected attributes were geospatially analyzed using ArcGIS (by ESRI) software version 10.1, licensed to the institution, and the free Google Maps software, and multiple logistic regression was performed using SPSS version 22.0. A total of 439 cases were recruited. The mean age of the MVC victims was 26.04 years (SD 15.26). Males comprised 302 (71.7%) of the cases. Motorcyclists were the commonest type of victim involved [351 (80.0%)]. Hotspot MVC locations occurred at certain intersections and on roads within the boroughs of Kenali and Binjai. The severely injured and polytrauma cases occurred mostly on the road network within a speed limit of 60 km/hour. A person with an increase in ISS of one score had 37% higher odds of disability at hospital discharge (95% CI: 1.253, 1.499; p < 0.001). The pediatric age group (less than 19 years of age) had 52.1% lower odds of disability at discharge from hospital (95% CI: 0.258, 0.889; p < 0.001), and patients who underwent operation for definitive management had 4.14 times the odds of disability at discharge from hospital (95% CI: 1.681, 10.218; p = 0.002). Overall, this study has shown that GIS combined with traditional statistical analysis remains a powerful tool in road traffic injury (RTI) research.
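
    The odds ratios reported in this abstract come from logistic-regression coefficients: for a coefficient β with standard error se, OR = exp(β) and the 95% CI is exp(β ± 1.96·se). A sketch of that conversion; the coefficient and standard error below are hypothetical values chosen merely to land near the reported ISS effect, not figures from the paper:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression coefficient and its SE."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for a one-point ISS increase.
beta, se = 0.315, 0.045
or_, lo, hi = odds_ratio_ci(beta, se)
print(f"OR = {or_:.2f} (95% CI {lo:.3f}-{hi:.3f})")
```

    An OR of about 1.37 is read as "37% higher odds per one-point ISS increase", which is how the abstract phrases its result.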

  5. Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India

    Science.gov (United States)

    Mohan, M.

    2016-06-01

    In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. Geospatial information is processed with geospatial technologies, which play an important role in the development of smart cities, particularly in developing countries of the world such as India. The study is based on the latest geospatial satellite imagery available, which is multi-date, multi-stage, multi-sensor, and multi-resolution. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems for 3-D geovisualisation, geospatial digital mapping, and geospatial analysis for developing smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time, and presentation. Such information helps in 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides a platform for information visualisation, also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in real-world scenarios, particularly to help local, regional, and state-level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for geovisualisation of smart cities in an emerging and developing country, India.

  6. GEOSPATIAL INFORMATION FROM SATELLITE IMAGERY FOR GEOVISUALISATION OF SMART CITIES IN INDIA

    Directory of Open Access Journals (Sweden)

    M. Mohan

    2016-06-01

    Full Text Available In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. Geospatial information is processed with geospatial technologies, which play an important role in the development of smart cities, particularly in developing countries of the world such as India. The study is based on the latest geospatial satellite imagery available, which is multi-date, multi-stage, multi-sensor, and multi-resolution. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems for 3-D geovisualisation, geospatial digital mapping, and geospatial analysis for developing smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time, and presentation. Such information helps in 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides a platform for information visualisation, also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in real-world scenarios, particularly to help local, regional, and state-level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for geovisualisation of smart cities in an emerging and developing country, India.

  7. Geospatial Analysis of Land Use and Land Cover Changes for Discharge at Way Kualagaruntang Watershed in Bandar Lampung

    OpenAIRE

    Yuniarti, Fieni; K, Dyah Indriana; Winarno, Dwi Joko

    2013-01-01

    Land use and land cover change in a watershed can drive impacts such as large fluctuations in discharge. The Way Kuala Garuntang watershed is one of the watersheds in Bandar Lampung that has changed significantly. This study analyzed land use and land cover change to determine how much it influences discharge fluctuations, based on a Geographic Information System. The method used in this study comprised hydrological, spatial, and sensitivity analyses. The hydrological analysis was based on daily ...

  8. The geo-spatial information infrastructure at the Centre for Control and Prevention of Zoonoses, University of Ibadan, Nigeria: an emerging sustainable One-Health pavilion.

    Science.gov (United States)

    Olugasa, B O

    2014-12-01

    The World Wide Web, as a contemporary means of information sharing, offers a platform for geospatial information dissemination to improve education about spatio-temporal patterns of disease spread at the human-animal-environment interface in developing countries of West Africa. In assessing the quality of exposure to geospatial information applications among students in five purposively selected institutions in West Africa, this study reviewed course contents and postgraduate programmes in zoonoses surveillance. Geospatial information content and associated practical exercises in zoonoses surveillance were scored. Seven criteria were used to categorize and score capability, namely: spatial data capture; thematic map design and interpretation; spatio-temporal analysis; remote sensing of data; statistical modelling; the management of spatial data profiles; and web-based map-sharing operation within an organization. These criteria were used to compute weighted exposure during training at the institutions. A categorical description of the institution with the highest computed Cumulative Exposure Point Average (CEPA) was based on an illustration with retrospective records of rabies cases, using data from humans, animals, and the environment sourced from Grand Bassa County, Liberia, to create and share maps and information with faculty, staff, students, and the neighbourhood about animal bite injury surveillance and the spatial distribution of rabies-like illness. Uniformly low CEPA values (0-1.3) were observed across academic departments. The highest (3.8) was observed at the Centre for Control and Prevention of Zoonoses (CCPZ), University of Ibadan, Nigeria, where geospatial techniques are systematically taught, and thematic and predictive maps were produced and shared online with other institutions in West Africa. In addition, a short course in zoonosis surveillance, which offers inclusive learning in geospatial applications, is taught at CCPZ. The paper

  9. GenGIS 2: geospatial analysis of traditional and genetic biodiversity, with new gradient algorithms and an extensible plugin framework.

    Directory of Open Access Journals (Sweden)

    Donovan H Parks

    Full Text Available GenGIS is free and open source software designed to integrate biodiversity data with a digital map and information about geography and habitat. While originally developed with microbial community analyses and phylogeography in mind, GenGIS has been applied to a wide range of datasets. A key feature of GenGIS is the ability to test geographic axes that can correspond to routes of migration or gradients that influence community similarity. Here we introduce GenGIS version 2, which extends the linear gradient tests introduced in the first version to allow comprehensive testing of all possible linear geographic axes. GenGIS v2 also includes a new plugin framework that supports the development and use of graphically driven analysis packages: initial plugins include implementations of linear regression and the Mantel test, calculations of alpha-diversity (e.g., the Shannon index) for all samples, and geographic visualizations of dissimilarity matrices. We have also implemented a recently published method for biomonitoring reference condition analysis (RCA), which compares observed species richness and diversity to predicted values to determine whether a given site has been impacted. The newest version of GenGIS supports vector data in addition to raster files. We demonstrate the new features of GenGIS by performing a full gradient analysis of an Australian kangaroo apple data set, by using plugins and embedded statistical commands to analyze human microbiome sample data, and by applying RCA to a set of samples from Atlantic Canada. GenGIS release versions, tutorials and documentation are freely available at http://kiwi.cs.dal.ca/GenGIS, and source code is available at https://github.com/beiko-lab/gengis.
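
    The alpha-diversity measure mentioned in this abstract, the Shannon index, is H' = -Σ p_i ln p_i over the species proportions p_i of a sample. A minimal sketch on hypothetical counts (this is the underlying formula, not a GenGIS API call):

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical abundance counts for four taxa in one sample.
sample = [40, 30, 20, 10]
print(round(shannon_index(sample), 3))
```

    H' is maximal (ln of the number of taxa) when all taxa are equally abundant, and falls toward 0 as one taxon dominates.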

  10. INTEGRATED ASSESSMENT AND GEOSPATIAL ANALYSIS OF ACCUMULATION OF PETROLEUM HYDROCARBONS IN THE SOIL COVER OF SAKHALIN ISLAND

    Directory of Open Access Journals (Sweden)

    V. V. Dmitriev

    2017-01-01

    Full Text Available The article considers an approach to the integral estimation of the accumulation of petroleum hydrocarbons (PHc) in the soil cover of Sakhalin Island. The soil map of Sakhalin, comprising 103 soil polygons, was used as the cartographic base for this work; additional information on soils was taken from the Soil Atlas of the Russian Federation. As an integral criterion for the accumulation of PHc, it is proposed to use an integral indicator calculated on the basis of five evaluation criteria, chosen on the basis of the works of Russian scientists. The evaluation criteria for each of the polygons include information on the soil texture, the total thickness of the organic and humus horizons, the content of organic carbon in these horizons and in the mineral horizons, and the presence of a gley barrier. The calculation of the integral indicator is based on the principles of the ASPID methodology. On this basis, the authors compiled a map of the potential capacity of Sakhalin soils to accumulate petroleum hydrocarbons, and, using GIS technology and the estimates of the integral indicator, performed an analysis revealing the features of the spatial differentiation of PHc accumulation in the soil cover. The analysis and assessment of the accumulation of petroleum hydrocarbons has shown that peaty and peat-boggy soils have the greatest ability to hold PHc, while the lowest capacity is typical of illuvial-ferruginous podzols (illuvial low-humic podzols), which occupy 1% of the island. In general, soils with low and very low hydrocarbon accumulation capacity occupy less than forty percent of the territory.

  11. Geospatial Absorption and Regional Effects

    Directory of Open Access Journals (Sweden)

    IOAN MAC

    2009-01-01

    Full Text Available Geospatial absorptions are characterized by a specific complexity, both in content and in their phenomenological and spatial fields of manifestation. Such processes are differentiated according to their specificity into pre-absorption, absorption and post-absorption. The mechanisms that contribute to absorption are extremely numerous: aggregation, extension, diffusion, substitution, resistivity (resilience), stratification, borrowings, etc. Frequent relations are established between these mechanisms, amplifying the process and its regional effects. The installation of the geographic osmosis phenomenon in a given territory (a place, for example) leads to a homogenization of the geospatial state and to the installation of regional homogeneity.

  12. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    Science.gov (United States)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    Landslides are among the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property as well as severe damage to natural resources. The local geology, with steep slopes coupled with high-intensity rainfall and unplanned human activities, causes many landslides in this region. The study area attracts tourists throughout the year, so it must be considered for preventive measures. Geospatial multicriteria decision analysis (MCDA) is increasingly used for landslide vulnerability and hazard zonation mapping, as it enables the integration of different data layers with different levels of uncertainty. In the present study, the analytic hierarchy process (AHP) method was used to prepare landslide hazard zones of Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road and NDVI. These factor layers were extracted from the related spatial data, evaluated, and assigned individual factor weights and class weights. The Landslide Hazard Zone Index (LHZI) was calculated using the MCDA technique, based on the weights and ratings given by the AHP method. The final cumulative map of the study area was categorized into four hazard zones, classified as zone I to IV. 
About 3.56% of the area comes under hazard zone IV, followed by 48.19% under zone III, 43.63% under zone II and 4.61% under hazard
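
    The AHP weighting step used above can be sketched with the common row geometric-mean approximation of the priority vector; the three-factor pairwise-comparison matrix below is hypothetical, not the study's:

    ```python
    import math

    def ahp_weights(M):
        """Approximate AHP priority weights from a pairwise-comparison
        matrix M (M[i][j] = how much factor i dominates factor j) using
        the row geometric-mean method."""
        n = len(M)
        gm = [math.prod(row) ** (1.0 / n) for row in M]  # row geometric means
        s = sum(gm)
        return [g / s for g in gm]                        # normalize to sum 1

    # Hypothetical comparisons for three factors: slope angle,
    # precipitation, land use/land cover.
    M = [[1.0, 3.0, 5.0],
         [1/3, 1.0, 3.0],
         [1/5, 1/3, 1.0]]
    print([round(w, 3) for w in ahp_weights(M)])
    ```

    The resulting weights would then multiply the rated factor layers when computing an index such as the LHZI.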

  13. The Geospatial Web and Local Geographical Education

    Science.gov (United States)

    Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.

    2010-01-01

    Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…

  14. Applying geo-spatial analysis in community needs assessment: Implications for planning and prioritizing based on data.

    Science.gov (United States)

    Baig, Kamran; Shaw-Ridley, Mary; Munoz, Oscar J

    2016-10-01

    Colonias are substandard, unincorporated areas located along the US-Mexico border with severely lacking infrastructure. Residents have poor health and limited availability, accessibility and/or utilization of healthcare services in the region. Using 2006-2007 community needs assessment (CNA) surveys collected by the Center for Housing and Urban Development of Texas A&M University, 410 randomly selected surveys from Hidalgo County, Texas, were analyzed. Descriptive and spatial analyses were performed and odds ratios (OR) were calculated. Of the 410 surveys, 333 were geocoded to identify areas most in need of dental and vision care. Two hospitals existed within a 5-mile radius of the mean centers of the two areas. Distance to a healthcare facility was not statistically predictive of the need for dental care, OR=0.96 (95% CI=0.855-1.078, p value=0.492), or vision care, OR=1.083 (95% CI=0.968-1.212, p value=0.164). Integrating spatial analysis and CNA enhances planning to improve service accessibility and utilization in underserved areas. Copyright © 2016 Elsevier Ltd. All rights reserved.
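
    An odds ratio with its 95% confidence interval, as reported above, can be computed from a 2x2 table using the standard log-odds standard error; the counts below are hypothetical, not the study's data:

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and 95% CI from a 2x2 table:
        a, b = exposed with / without outcome;
        c, d = unexposed with / without outcome."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of ln(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts for "far from facility" vs "needs dental care":
    print(odds_ratio_ci(20, 80, 10, 90))
    ```

    A CI that straddles 1.0, as in the study's results, indicates the exposure is not statistically predictive of the outcome.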

  15. Geospatial Analysis of Land Use and Land Cover Transitions from 1986–2014 in a Peri-Urban Ghana

    Directory of Open Access Journals (Sweden)

    Divine Odame Appiah

    2017-12-01

    Full Text Available Recently, peri-urbanisation has led to the transformation of the rural landscape, changing rural land uses into peri-urban land uses under varying driving factors. This paper analyzes the dynamic transitions among identified land use and land cover (LULC) types in the Bosomtwe district of Ghana from 1986 to 2014. An integrated approach of satellite remote sensing in the Earth Resource Data Analysis System (ERDAS Imagine 13) and ArcMap 10.2 Geographic Information System (GIS), together with Markov chain analytical techniques, was used to examine the combined forest cover transitions relative to built-up areas, recent fallows and grasslands, and to project possible factors influencing the transitions under business-as-usual and unusual situations. Statistical analyses of the classified Landsat TM, ETM+ and Landsat 8 Operational Land Imager and Thermal Infrared Sensor (OLI/TIRS) imagery indicated that over the period of 24 years the Bosomtwe district has undergone a series of land use conversions with remarkable forest losses, especially between 2002 and 2010. In 2010, dense forest cover was degraded to low forest by 4040 ha, indicating a 0.40% transition probability in the future. There was a remarkable increase of built-up/bare and concrete area, with a 380% increment in the 1986-2002 transition period. The Markov projection of future land use dynamics for the years 2018 and 2028, based on the 2014 LULC, indicated a steady decline of the dense forest to the low forest category. This is currently being driven (as at the 2017 LULC trends) by the combined effects of increasing built-up/bare and concrete surface land uses as well as expanding recent fallows and grassland. The paper concludes that the health of the ecosystem and biodiversity of Lake Bosomtwe need to be sustainably managed by the Bosomtwe district assembly.
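
    The Markov chain projection used above amounts to repeatedly multiplying the vector of class shares by a transition-probability matrix; the matrix and shares below are hypothetical illustrations, not the paper's estimates:

    ```python
    def project_lulc(shares, P, steps=1):
        """Project land-cover class shares forward with a first-order
        Markov chain: shares' = shares @ P, applied `steps` times.
        Each row of P holds the transition probabilities out of a class."""
        for _ in range(steps):
            shares = [sum(shares[i] * P[i][j] for i in range(len(P)))
                      for j in range(len(P))]
        return shares

    # Hypothetical 3-class example: dense forest, low forest, built-up.
    P = [[0.60, 0.30, 0.10],   # from dense forest
         [0.05, 0.75, 0.20],   # from low forest
         [0.00, 0.00, 1.00]]   # built-up is absorbing here
    shares = [0.5, 0.4, 0.1]
    print([round(s, 3) for s in project_lulc(shares, P, steps=2)])
    ```

    With these illustrative probabilities the dense-forest share declines at each step, mirroring the projected trend in the paper.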

  16. Habitat Suitability analysis of Koklass (Pucrasia macrolopha) Pheasant in Churdhar Wildlife Sanctuary of Himachal Pradesh, India using Geospatial Technology

    Science.gov (United States)

    Eliza, K.; Sarma, K.

    2014-12-01

    Pheasants are on the brink of destruction due to degradation of forests, environmental pollution, climatic change and extensive hunting of wild flora and fauna. The problem is more acute in developing countries, where wildlife and biodiversity conservation are often less prioritized due to the more pressing demands of food security and poverty alleviation. The Koklass Pheasant (Pucrasia macrolopha) is distributed from Afghanistan and Pakistan along the Himalayas to southeastern Tibet, western China and southeastern Mongolia. The species is grouped under endangered species in the Red Data Book of the Zoological Survey of India and classified as a least concern species in the IUCN Red List of Threatened Species. Conservation biologists and managers need a range of both classical analyses and modern tools to face the increasing threats to biodiversity. Among these tools, habitat-suitability modeling has recently emerged as a relevant technique for assessing global impacts and defining wide conservation priorities. The present study uses remote sensing satellite imagery and GIS modeling to assess the habitat suitability of the Koklass Pheasant and to identify the habitat factors influencing its distribution in Churdhar Wildlife Sanctuary, India. Effective management and conservation of wildlife populations and their habitats largely depend on our ability to understand and predict species-habitat interactions. Different thematic maps, viz. land use/cover, forest types, drainage buffer, multiple ring buffers of sighting locations and multiple ring buffers of roads, were prepared to support the objective of the study. A Weighted Overlay Analysis model was used to identify potential habitat areas for this endangered species. 
The most suitable area for the Koklass Pheasant within the Wildlife Sanctuary is found to be about 23.8 percent of the total area, which is due to favourable habitat conditions for the
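
    The Weighted Overlay Analysis step reduces to a cell-wise weighted sum of rated factor rasters; a minimal sketch with hypothetical ratings and weights (not the study's values):

    ```python
    def weighted_overlay(layers, weights):
        """Weighted overlay sketch: each layer is a grid of suitability
        ratings (e.g. 1..5); the output is the cell-wise weighted sum."""
        rows, cols = len(layers[0]), len(layers[0][0])
        out = [[0.0] * cols for _ in range(rows)]
        for layer, w in zip(layers, weights):
            for r in range(rows):
                for c in range(cols):
                    out[r][c] += w * layer[r][c]
        return out

    # Hypothetical 2x2 rasters: land-cover rating and road-buffer rating.
    lc   = [[5, 3], [2, 4]]
    road = [[1, 4], [5, 2]]
    print(weighted_overlay([lc, road], [0.6, 0.4]))
    ```

    Reclassifying the output into rating bands would yield the suitability zones reported in the study.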

  17. Climate change effects on Chikungunya transmission in Europe: geospatial analysis of vector's climatic suitability and virus' temperature requirements.

    Science.gov (United States)

    Fischer, Dominik; Thomas, Stephanie M; Suk, Jonathan E; Sudre, Bertrand; Hess, Andrea; Tjaden, Nils B; Beierkuhnlein, Carl; Semenza, Jan C

    2013-11-12

    Chikungunya was, from the European perspective, considered a travel-related tropical mosquito-borne disease prior to the first European outbreak in Northern Italy in 2007. This was followed by cases of autochthonous transmission reported in South-eastern France in 2010. Both events occurred after the introduction, establishment and expansion in Europe of the Chikungunya-competent and highly invasive disease vector Aedes albopictus (the Asian tiger mosquito). In order to assess whether these outbreaks are indicative of the beginning of a trend or are one-off events, there is a need to further examine the factors driving the potential transmission of Chikungunya in Europe. Climatic suitability, both now and in the future, is an essential starting point for such an analysis. The climatic suitability for Chikungunya outbreaks was determined using bioclimatic factors that influence both vector and pathogen. Climatic suitability for the European distribution of the vector Aedes albopictus was based upon previous correlative environmental niche models. Climatic risk classes were derived by combining climatic suitability for the vector with known temperature requirements for pathogen transmission, obtained from outbreak regions. In addition, the longest potential intra-annual season for Chikungunya transmission was estimated for regions with expected vector occurrences. In order to analyse spatio-temporal trends of risk exposure and season of transmission in Europe, climate change impacts are projected for three time-frames (2011-2040, 2041-2070 and 2071-2100) and two climate scenarios (A1B and B1) from the Intergovernmental Panel on Climate Change (IPCC). These climatic projections are based on the regional climate model COSMO-CLM, which builds on the global model ECHAM5. European areas with current and future climatic suitability for Chikungunya transmission are identified. An increase in risk is projected for Western Europe (e.g. France and the Benelux States) in the

  18. A Geospatial Scavenger Hunt

    Science.gov (United States)

    Martinez, Adriana E.; Williams, Nikki A.; Metoyer, Sandra K.; Morris, Jennifer N.; Berhane, Stephen A.

    2009-01-01

    With the use of technology such as Global Positioning System (GPS) units and Google Earth for a simple-machine scavenger hunt, you will transform a standard identification activity into an exciting learning experience that motivates students, incorporates practical skills in technology, and enhances students' spatial-thinking skills. In the…

  19. Geospatial Estimates of Road Salt Usage Across a Gradient of Urbanizing Watersheds in Southern Ontario:Thesis for Masters in Spatial Analysis (MSA)

    Science.gov (United States)

    Giberson, G. K.; Oswald, C.

    2015-12-01

    In areas affected by snow, chloride (Cl) salts are widely used as de-icing agents to improve road conditions. While the improvement in road safety is indisputable, there are environmental consequences for local aquatic ecosystems. In many waterways, Cl concentrations have been increasing since the early 1990s, often exceeding national water quality guidelines. To determine the quantity of Cl accumulating in urban and urbanizing watersheds, accurate estimates of road salt usage at the watershed scale are needed. The complex jurisdictional control over road salt application in southern Ontario lends itself to a geospatial approach for calculating Cl inputs to improve the accuracy of watershed-scale Cl mass balance estimates. This study will develop a geospatial protocol for combining information on road salt applications and road network areas to refine watershed-scale Cl inputs, as well as assess spatiotemporal patterns in road salt application across the southern Ontario study region. The overall objective of this project is to use geospatial methods (predominantly ArcGIS) to develop high-accuracy estimates of road salt usage in urbanizing watersheds in southern Ontario. Specifically, the aims are to map and summarize the types and areas ("lane-lengths") of roadways in each watershed that have road salt applied to them; to determine the most appropriate source(s) of road salt usage data for each watershed, taking into consideration multiple levels of jurisdiction (e.g. municipal, regional, provincial); to calculate and summarize sub-watershed and watershed-scale road salt usage estimates for multiple years; and to analyze intra-watershed spatiotemporal patterns of road salt usage, focusing especially on impervious surfaces. These analyses will identify areas of concern exacerbated by high levels of road salt distribution; recommendations for modifying on-the-ground operations will be the next step in helping to correct these issues.
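
    A lane-length-based usage estimate of the kind described above reduces to summing lane-kilometres times a per-class application rate; a minimal sketch, with hypothetical road classes and rates (the real protocol would pull both from jurisdictional records and the GIS road network):

    ```python
    def watershed_salt_load(segments, rates):
        """Estimate annual road-salt mass for a watershed by summing
        lane-length (lane-km) x application rate (t per lane-km per year)
        over road classes. Rates per class are assumed inputs."""
        return sum(lane_km * rates[road_class]
                   for road_class, lane_km in segments)

    # Hypothetical classes and rates (tonnes per lane-km per year):
    rates = {"municipal": 15.0, "regional": 20.0, "provincial": 25.0}
    segments = [("municipal", 120.0), ("regional", 40.0), ("provincial", 10.0)]
    print(watershed_salt_load(segments, rates))  # → 2850.0
    ```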

  20. OSGeo - Open Source Geospatial Foundation

    Directory of Open Access Journals (Sweden)

    Margherita Di Leo

    2012-09-01

    Full Text Available The need, arising towards the end of 2005, to select and organize more than 200 FOSS4G projects led in February 2006 to the creation of OSGeo (the Open Source Geospatial Foundation), an international organization whose mission is to promote the collaborative development of free software focused on geographic information (FOSS4G). The Open Source Geospatial Foundation (OSGeo) is a not-for-profit organization created in early 2006 with the aim of supporting the collaborative development of geospatial open source software and promoting its widespread use. The foundation provides financial, organizational and legal support to the broader open source geospatial community. It also serves as an independent legal entity to which community members can contribute code, funding and other resources, secure in the knowledge that their contributions will be maintained for public benefit. OSGeo also serves as an outreach and advocacy organization for the open source geospatial community, and provides a common forum and shared infrastructure for improving cross-project collaboration. The foundation's projects are all freely available and usable under an OSI-certified open source license. The Italian OSGeo local chapter is named GFOSS.it (Associazione Italiana per l'informazione Geografica Libera).

  1. Grid Enabled Geospatial Catalogue Web Service

    Science.gov (United States)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, and service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, especially query on-demand data in the virtual community and retrieve it through the data-related services which provide functions such as subsetting, reformatting, reprojection etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, and makes geospatial resources Grid enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science, and not on issues with computing ability, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  2. The role of visualization in learning from computer-based images

    Science.gov (United States)

    Piburn, Michael D.; Reynolds, Stephen J.; McAuliffe, Carla; Leedy, Debra E.; Birk, James P.; Johnson, Julia K.

    2005-05-01

    Among the sciences, the practice of geology is especially visual. To assess the role of spatial ability in learning geology, we designed an experiment using: (1) web-based versions of spatial visualization tests, (2) a geospatial test, and (3) multimedia instructional modules built around QuickTime Virtual Reality movies. Students in control and experimental sections were administered measures of spatial orientation and visualization, as well as a content-based geospatial examination. All subjects improved significantly in their scores on spatial visualization and the geospatial examination. There was no change in their scores on spatial orientation. A three-way analysis of variance, with the geospatial examination as the dependent variable, revealed significant main effects favoring the experimental group and a significant interaction between treatment and gender. These results demonstrate that spatial ability can be improved through instruction, that learning of geological content will improve as a result, and that differences in performance between the genders can be eliminated.

  3. Archetypal Analysis for Machine Learning

    DEFF Research Database (Denmark)

    Mørup, Morten; Hansen, Lars Kai

    2010-01-01

    Archetypal analysis (AA), proposed by Cutler and Breiman in [1], estimates the principal convex hull of a data set. As such, AA favors features that constitute representative 'corners' of the data, i.e. distinct aspects or archetypes. We will show that AA enjoys the interpretability of clustering ... for K-means [2]. We demonstrate that the AA model is relevant for feature extraction and dimensionality reduction for a large variety of machine learning problems taken from computer vision, neuroimaging, text mining and collaborative filtering.

  4. Hydrogeologic characteristics and geospatial analysis of water-table changes in the alluvium of the lower Arkansas River Valley, southeastern Colorado, 2002, 2008, and 2015

    Science.gov (United States)

    Holmberg, Michael J.

    2017-05-15

    The U.S. Geological Survey, in cooperation with the Lower Arkansas Valley Water Conservancy District, periodically measures groundwater levels in about 100 wells completed in the alluvial material of the Arkansas River Valley in Pueblo, Crowley, Otero, Bent, and Prowers Counties in southeastern Colorado, of which 95 are used for the analysis in this report. The purpose of this report is to provide information to water-resource administrators, managers, planners, and users about groundwater characteristics in the alluvium of the lower Arkansas Valley, extending roughly 150 miles between Pueblo Reservoir and the Colorado-Kansas State line. This report includes three map sheets showing (1) bedrock altitude at the base of the alluvium of the lower Arkansas Valley; (2) estimated spring-to-spring and fall-to-fall changes in water-table altitude between 2002, 2008, and 2015; and (3) estimated saturated thickness in the alluvium during spring and fall of 2002, 2008, and 2015, and thickness of the alluvium in the lower Arkansas Valley. Water-level changes were analyzed by geospatial interpolation methods. Available data included all water-level measurements made between January 1, 2001, and December 31, 2015; however, only data from fall and spring of 2002, 2008, and 2015 are mapped in this report. To account for the effect of John Martin Reservoir in Bent County, Colorado, lake levels at the reservoir were assigned to points along the approximate shoreline and were included in the water-level dataset. After combining the water-level measurements and lake levels, inverse distance weighting was used to interpolate between points and calculate the altitude of the water table for fall and spring of each year for comparisons. Saturated thickness was calculated by subtracting the bedrock surface from the water-table surface. 
Thickness of the alluvium was calculated by subtracting the bedrock surface from the land surface using a digital elevation model. In order to analyze the response
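
    Inverse distance weighting, as used above for the water-table interpolation, can be sketched as follows; the well coordinates and altitudes are hypothetical, not the report's measurements:

    ```python
    import math

    def idw(points, x, y, power=2):
        """Inverse distance weighting: estimate a value at (x, y) from
        measured points [(px, py, value), ...]. Closer points get
        larger weights (1 / distance**power)."""
        num = den = 0.0
        for px, py, v in points:
            d = math.hypot(x - px, y - py)
            if d == 0:
                return v  # exactly at a measurement point
            w = 1.0 / d ** power
            num += w * v
            den += w
        return num / den

    # Hypothetical wells (easting, northing, water-table altitude in feet):
    wells = [(0, 0, 4300.0), (10, 0, 4280.0), (0, 10, 4290.0)]
    print(round(idw(wells, 5, 5), 1))
    ```

    Evaluating the estimator over a grid for each season, then differencing the grids, gives the water-table-change surfaces described in the report.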

  5. The National 3-D Geospatial Information Web-Based Service of Korea

    Science.gov (United States)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis tools, be able to use all capabilities of the third dimension, and offer visualization. Currently, many human activities are moving toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, military applications, etc. To reflect this trend, the Korean government has started to construct 3D geospatial data and a service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service for people who are interested in this industry; we introduce not only the present state of the constructed 3D geospatial data but also methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data has been constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, in six metropolitan cities and Dokdo (an island belonging to Korea), level of detail (LOD) 4 photo-realistic textured 3D models, including corresponding ortho photographs, were constructed in 2012. In this paper, we present the web-based 3D map service system composition and infrastructure, and a comparison of V-world with the Google Earth service. We also present Open API based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent an invasion of privacy, we processed image blurring, elimination and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea. 
Thus, the progress of

  6. Geospatial Data Management Platform for Urban Groundwater

    Science.gov (United States)

    Gaitanaru, D.; Priceputu, A.; Gogu, C. R.

    2012-04-01

    Due to the large number of civil work projects and research studies, large quantities of geo-data are produced for urban environments. These data are usually redundant and are spread across different institutions or private companies. Time-consuming operations like data processing and information harmonisation are the main reason the re-use of data is systematically avoided. Urban groundwater data shows the same complex situation. The underground structures (subway lines, deep foundations, underground parkings, and others), the urban facility networks (sewer systems, water supply networks, heating conduits, etc.), the drainage systems, the surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. However, because these activities provide a large quantity of data, aquifer modelling and behaviour prediction can be done using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages like GML, GeoSciML, WaterML, GWML, CityML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or research centres that do not necessarily use the same database structures. For Bucharest City (Romania), an integrated platform for groundwater geospatial data management is being developed under the framework of a national research project - "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA) - financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis

  7. Python geospatial development essentials

    CERN Document Server

    Bahgat, Karim

    2015-01-01

    This book is ideal for Python programmers who are tasked with or wish to make a special-purpose GIS application. Analysts, political scientists, geographers, and GIS specialists seeking a creative platform to experiment with cutting-edge spatial analysis, but who are still only beginners in Python, will also find this book beneficial. Familiarity with Tkinter application development in Python is preferable but not mandatory.

  8. Local Government GIS and Geospatial Capabilities : Suitability for Integrated Transportation & Land Use Planning (California SB 375)

    Science.gov (United States)

    2009-11-01

    This report examines two linked phenomena in transportation planning: the geospatial analysis capabilities of local planning agencies and the increasing demands on such capabilities imposed by comprehensive planning mandates.

  9. Using the Geospatial Web to Deliver and Teach Giscience Education Programs

    Science.gov (United States)

    Veenendaal, B.

    2015-05-01

    Geographic information science (GIScience) education has undergone enormous changes over the past years. One major factor influencing this change is the role of the geospatial web in GIScience. In addition to the use of the web for enabling and enhancing GIScience education, it is also used as the infrastructure for communicating and collaborating among geospatial data and users. The web becomes both the means and the content for a geospatial education program. However, the web does not replace the traditional face-to-face environment, but rather is a means to enhance it, expand it and enable an authentic and real world learning environment. This paper outlines the use of the web in both the delivery and content of the GIScience program at Curtin University. The teaching of the geospatial web, web and cloud based mapping, and geospatial web services are key components of the program, and the use of the web and online learning are important to deliver this program. Some examples of authentic and real world learning environments are provided including joint learning activities with partner universities.

  10. Encoding and analyzing aerial imagery using geospatial semantic graphs

    Energy Technology Data Exchange (ETDEWEB)

    Watson, Jean-Paul; Strip, David R.; McLendon, William Clarence,; Parekh, Ojas D.; Diegert, Carl F.; Martin, Shawn Bryan; Rintoul, Mark Daniel

    2014-02-01

    While collection capabilities have yielded an ever-increasing volume of aerial imagery, analytic techniques for identifying patterns in and extracting relevant information from this data have seriously lagged. The vast majority of imagery is never examined, due to a combination of the limited bandwidth of human analysts and limitations of existing analysis tools. In this report, we describe an alternative, novel approach to both encoding and analyzing aerial imagery, using the concept of a geospatial semantic graph. The advantages of our approach are twofold. First, intuitive templates can be easily specified in terms of the domain language in which an analyst converses. These templates can be used to automatically and efficiently search large graph databases, for specific patterns of interest. Second, unsupervised machine learning techniques can be applied to automatically identify patterns in the graph databases, exposing recurring motifs in imagery. We illustrate our approach using real-world data for Anne Arundel County, Maryland, and compare the performance of our approach to that of an expert human analyst.
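
    Template search over a geospatial semantic graph can be illustrated with a brute-force labeled-subgraph match; the scene, node labels and relation names below are hypothetical, not the report's schema:

    ```python
    from itertools import permutations

    def find_template(graph, template):
        """Brute-force search for a labeled template in a semantic graph.
        A graph is (nodes, edges): nodes = {id: label},
        edges = set of (id, relation, id) triples. Returns every binding
        of template node ids to graph node ids that matches labels and
        relations."""
        g_nodes, g_edges = graph
        t_nodes, t_edges = template
        matches = []
        for combo in permutations(g_nodes, len(t_nodes)):
            bind = dict(zip(t_nodes, combo))
            if all(g_nodes[bind[t]] == t_nodes[t] for t in t_nodes) and \
               all((bind[a], rel, bind[b]) in g_edges for a, rel, b in t_edges):
                matches.append(bind)
        return matches

    # Hypothetical scene extracted from imagery: a school next to a park.
    graph = ({"n1": "school", "n2": "park", "n3": "road"},
             {("n1", "adjacent_to", "n2"), ("n2", "adjacent_to", "n3")})
    template = ({"a": "school", "b": "park"}, [("a", "adjacent_to", "b")])
    print(find_template(graph, template))  # → [{'a': 'n1', 'b': 'n2'}]
    ```

    A production system would replace the brute-force loop with an indexed graph database query, but the template idea is the same: analysts state patterns in domain terms and the engine finds the matching subgraphs.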

  11. Reinforcement Learning for Ramp Control: An Analysis of Learning Parameters

    Directory of Open Access Journals (Sweden)

    Chao Lu

    2016-08-01

    Full Text Available Reinforcement Learning (RL) has been proposed to deal with ramp control problems under dynamic traffic conditions; however, there is a lack of sufficient research on the behaviour and impacts of different learning parameters. This paper describes a ramp control agent based on the RL mechanism and thoroughly analyzes the influence of three learning parameters, namely the learning rate, the discount rate and the action selection parameter, on algorithm performance. Two indices for learning speed and convergence stability were used to measure algorithm performance, based on which a series of simulation-based experiments were designed and conducted using a macroscopic traffic flow model. Simulation results showed that, compared with the discount rate, the learning rate and the action selection parameter had more remarkable impacts on algorithm performance. Based on the analysis, some suggestions about how to select suitable parameter values that can achieve superior performance are provided.
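
    A minimal tabular Q-learning sketch makes the three studied parameters concrete: the learning rate (alpha), the discount rate (gamma) and epsilon for epsilon-greedy action selection. The toy two-transition environment below stands in for the macroscopic traffic model and is not the paper's setup:

    ```python
    import random

    def q_learning(env_step, n_states, n_actions, episodes=200,
                   alpha=0.1, gamma=0.9, epsilon=0.1, seed=42):
        """Tabular Q-learning with the three parameters under study:
        alpha (learning rate), gamma (discount rate) and epsilon
        (action selection parameter)."""
        rng = random.Random(seed)
        Q = [[0.0] * n_actions for _ in range(n_states)]
        for _ in range(episodes):
            s, done = 0, False
            while not done:
                if rng.random() < epsilon:                # explore
                    a = rng.randrange(n_actions)
                else:                                     # exploit
                    a = max(range(n_actions), key=lambda i: Q[s][i])
                s2, r, done = env_step(s, a)
                # temporal-difference update, scaled by alpha
                Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
                s = s2
        return Q

    # Toy chain: action 1 advances towards the terminal state, reward 1.
    def env_step(s, a):
        if a == 1:
            return s + 1, 1.0, s + 1 == 2
        return s, 0.0, False

    Q = q_learning(env_step, n_states=3, n_actions=2)
    print(Q[0][1] > Q[0][0])  # the learned values prefer advancing
    ```

    Sweeping alpha, gamma and epsilon over grids and recording convergence speed and stability reproduces, in miniature, the kind of sensitivity experiment the paper reports.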

  12. Geospatial Health: the first five years

    Directory of Open Access Journals (Sweden)

    Jürg Utzinger

    2011-11-01

    Full Text Available Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000, and the inaugural issue of its official journal was published in November 2006 with the aim of covering all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and as an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member strong editorial board. This overview takes stock of the first five years of publishing: 133 contributions have been published so far, primarily original research (79.7%), followed by reviews (7.5%), announcements (6.0%), editorials and meeting reports (3.0% each) and a preface in the first issue. A content analysis of all the original research articles and reviews reveals that three quarters of the publications focus on human health, with the remainder dealing with veterinary health. Two thirds of the papers come from Africa, Asia and Europe, with similar numbers of contributions from each continent. Studies of more than 35 different diseases, injuries and risk factors have been presented. Malaria and schistosomiasis were identified as the two most important diseases (11.2% each). Almost half the contributions were based on GIS, one third on spatial analysis, often using advanced Bayesian geostatistics (13.8%), and one quarter on remote sensing. The 120 original research articles, reviews and editorials were produced by 505 authors based at institutions and universities in 52 countries.

  13. Model My Watershed and BiG CZ Data Portal: Interactive geospatial analysis and hydrological modeling web applications that leverage the Amazon cloud for scientists, resource managers and students

    Science.gov (United States)

    Aufdenkampe, A. K.; Mayorga, E.; Tarboton, D. G.; Sazib, N. S.; Horsburgh, J. S.; Cheetham, R.

    2016-12-01

    The Model My Watershed Web app (http://wikiwatershed.org/model/) was designed to enable citizens, conservation practitioners, municipal decision-makers, educators, and students to interactively select any area of interest anywhere in the continental USA to: (1) analyze real land use and soil data for that area; (2) model stormwater runoff and water-quality outcomes; and (3) compare how different conservation or development scenarios could modify runoff and water quality. The BiG CZ Data Portal is a web application that gives scientists intuitive, high-performance map-based discovery, visualization, access and publication of diverse earth and environmental science data, while simultaneously performing geospatial analysis of selected GIS and satellite raster data for a chosen area of interest. The two web applications share a common codebase (https://github.com/WikiWatershed and https://github.com/big-cz), a high-performance geospatial analysis engine (http://geotrellis.io/ and https://github.com/geotrellis) and deployment on the Amazon Web Services (AWS) cloud cyberinfrastructure. Users can perform "on-the-fly" rapid watershed delineation over the national elevation model to select their watershed or catchment of interest. The two web applications also share the goal of enabling scientists, resource managers and students alike to share data, analyses and model results. We will present these functioning web applications and their potential to substantially lower the bar for studying and understanding our water resources. We will also present work in progress, including a prototype system that enables citizen scientists to register open-source sensor stations (http://envirodiy.org/mayfly/) and stream their data into these systems, so that the data can be re-shared using WaterOneFlow web services.
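The "on-the-fly" watershed delineation described above can be sketched on a toy D8 flow-direction grid. This is an illustrative algorithm sketch in plain Python, not the app's actual GeoTrellis-based implementation: each cell drains to one of its eight neighbours, and the watershed of an outlet is found by inverting the drainage graph and searching upstream.

```python
from collections import deque

# D8 flow-direction codes mapped to the (row, col) offset of the
# downstream neighbour each cell drains into.
D8 = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
      16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

def delineate_watershed(flowdir, outlet):
    """Return the set of cells that drain to `outlet`, by inverting the
    drainage graph and searching upstream (breadth-first)."""
    nrows, ncols = len(flowdir), len(flowdir[0])
    upstream = {}
    for r in range(nrows):
        for c in range(ncols):
            offset = D8.get(flowdir[r][c])
            if offset:
                rr, cc = r + offset[0], c + offset[1]
                if 0 <= rr < nrows and 0 <= cc < ncols:
                    upstream.setdefault((rr, cc), []).append((r, c))
    basin, queue = {outlet}, deque([outlet])
    while queue:
        for up in upstream.get(queue.popleft(), []):
            if up not in basin:
                basin.add(up)
                queue.append(up)
    return basin

# 3x3 toy grid: the two left columns drain east (code 1), the right
# column drains south (code 4) into the outlet cell at (2, 2).
grid = [[1, 1, 4],
        [1, 1, 4],
        [1, 1, 0]]
```

On this toy grid every cell eventually reaches the outlet at (2, 2), so its watershed is the full grid, while a cell with no upstream neighbours delineates only itself.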

  14. Geo-spatial technologies in urban environments policy, practice, and pixels

    CERN Document Server

    Jensen, Ryan R; McLean, Daniel

    2004-01-01

    Using Geospatial Technologies in Urban Environments simultaneously fills two gaping vacuums in the scholarly literature on urban geography. The first is the clear and straightforward application of geospatial technologies to practical urban issues. By using remote sensing and statistical techniques (correlation-regression analysis, the expansion method, factor analysis, and analysis of variance), the authors of these 12 chapters contribute significantly to our understanding of how geospatial methodologies enhance urban studies. For example, the GIS Specialty Group of the Association of American Geographers (AAG) has the largest membership of all the AAG specialty groups, followed by the Urban Geography Specialty Group. Moreover, the Urban Geography Specialty Group has the largest number of cross-memberships with the GIS Specialty Group. This book advances this important geospatial and urban link. Second, the book fills a wide void in the urban-environment literature. Although the Annals of the Association of ...

  15. Integration of Geospatial Science in Teacher Education

    Science.gov (United States)

    Hauselt, Peggy; Helzer, Jennifer

    2012-01-01

    One of the primary missions of our university is to train future primary and secondary teachers. Geospatial sciences, including GIS, have long been excluded from the teacher education curriculum. This article explains the curriculum revisions undertaken to increase the geospatial technology education of future teachers. A general education class…

  16. Geo-spatial analysis of land-water resource degradation in two economically contrasting agricultural regions adjoining national capital territory (Delhi).

    Science.gov (United States)

    Kaur, Ravinder; Minhas, P S; Jain, P C; Singh, P; Dubey, D S

    2009-07-01

    The present study was aimed at characterizing soil-water resource degradation in the rural areas of Gurgaon and Mewat districts, the two economically contrasting areas in policy zones II and III of the National Capital Region (NCR), and at assessing the impact of the study area's local conditions on the type and extent of resource degradation. This involved generation of detailed spatial information on the land use, cropping pattern, farming practices, soils and surface/ground waters of Gurgaon and Mewat districts through actual resource surveys, standard laboratory methods and GIS/remote sensing techniques. The study showed that in contrast to just 2.54% (in rabi season) to 4.87% (in kharif season) of agricultural lands in Gurgaon district, about 11.77% (in rabi season) to 24.23% (in kharif season) of agricultural lands in Mewat district were irrigated with saline to marginally saline canal water. Further, about 10.69% of agricultural lands in Gurgaon district and 42.15% in Mewat district were drain water irrigated. A large part of this surface water irrigated area, particularly in the Nuh (48.7%), Nagina (33.5%) and Punhana (24.1%) blocks of Mewat district, was either waterlogged (7.4% area with water depth) or at risk of being waterlogged (17.1% area with 2-3 m ground water depth). A local resource inventory showed the prevalence of several illegal private channels in Mewat district. These private channels divert degraded canal waters into the nearby intersecting drains and thereby increase the extent of surface irrigated agricultural lands in the district. Geo-spatial analysis showed that, owing to seepage of these degraded waters from unlined drains and canals, ground waters of about 39.6% of Mewat district were salt affected (mean EC = 7.05 dS/m and mean SAR = 7.71). Besides, sub-surface drinking waters of almost the entire Mewat district were contaminated with undesirable concentrations of chromium (Cr: 2.0-3.23 ppm) and manganese (Mn: 0

  17. Evaluation of Data Management Systems for Geospatial Big Data

    OpenAIRE

    Amirian, Pouria; Basiri, Anahid; Winstanley, Adam C.

    2014-01-01

    Big Data encompasses the collection, management, processing and analysis of huge amounts of data that vary in type and change with high frequency. The data component of Big Data often has a positional component as an important part, in various forms such as postal address, Internet Protocol (IP) address and geographical location. If the positional components in Big Data are extensively used in storage, retrieval, analysis, processing, visualization and knowledge discovery (geospatial Big Dat...

  18. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    Science.gov (United States)

    Kulo, Violet; Bodzin, Alec

    2013-02-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.

  19. Social Learning Network Analysis Model to Identify Learning Patterns Using Ontology Clustering Techniques and Meaningful Learning

    Science.gov (United States)

    Firdausiah Mansur, Andi Besse; Yusof, Norazah

    2013-01-01

    Clustering on Social Learning Networks has not yet been widely explored, especially when the network focuses on an e-learning system. Conventional methods are not well suited to e-learning data. SNA requires content analysis, which involves human intervention and needs to be carried out manually. Some of the previous clustering techniques need…

  20. Gamification and geospatial health management

    Science.gov (United States)

    Wortley, David

    2014-06-01

    Sensor and Measurement technologies are rapidly developing for many consumer applications which have the potential to make a major impact on business and society. One of the most important areas for building a sustainable future is in health management. This opportunity arises because of the growing popularity of lifestyle monitoring devices such as the Jawbone UP bracelet, Nike Fuelband and Samsung Galaxy GEAR. These devices measure physical activity and calorie consumption and, when visualised on mobile and portable devices, enable users to take more responsibility for their personal health. This presentation looks at how the process of gamification can be applied to develop important geospatial health management applications that could not only improve the health of nations but also significantly address some of the issues in global health such as the ageing society and obesity.

  1. Gamification and geospatial health management

    International Nuclear Information System (INIS)

    Wortley, David

    2014-01-01

    Sensor and Measurement technologies are rapidly developing for many consumer applications which have the potential to make a major impact on business and society. One of the most important areas for building a sustainable future is in health management. This opportunity arises because of the growing popularity of lifestyle monitoring devices such as the Jawbone UP bracelet, Nike Fuelband and Samsung Galaxy GEAR. These devices measure physical activity and calorie consumption and, when visualised on mobile and portable devices, enable users to take more responsibility for their personal health. This presentation looks at how the process of gamification can be applied to develop important geospatial health management applications that could not only improve the health of nations but also significantly address some of the issues in global health such as the ageing society and obesity

  2. Visualization and Ontology of Geospatial Intelligence

    Science.gov (United States)

    Chan, Yupo

    Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced in the prevalence of location-based services, as afforded by ubiquitous cell phone usage. It is also manifested in the popularity of internet engines such as Google Earth. As we commute to work and travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.

  3. GSKY: A scalable distributed geospatial data server on the cloud

    Science.gov (United States)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information coming from different geospatial collections is in increasing demand by the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, so such data manipulation must be performed on-the-fly using efficient, high performance techniques. Ideally this should be performed using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), has over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server, called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve large numbers of concurrent users. Typical geospatial workflow steps, such as handling different file formats and data types or blending data across coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. 
This is achieved by decoupling the data ingestion and indexing process as
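One of the on-demand transformations GSKY performs, resampling, can be sketched with a minimal nearest-neighbour resampler. The numpy code below is only an illustration of computing a derived grid on the fly rather than storing a resampled copy; it is not GSKY's actual implementation.

```python
import numpy as np

def resample_nearest(raster, out_shape):
    """Nearest-neighbour resampling of a 2-D raster to `out_shape`,
    computed on demand instead of storing a second copy of the data."""
    # Map each output row/column index back to its source index.
    rows = np.arange(out_shape[0]) * raster.shape[0] // out_shape[0]
    cols = np.arange(out_shape[1]) * raster.shape[1] // out_shape[1]
    return raster[np.ix_(rows, cols)]
```

The same index-mapping idea extends to upsampling: requesting a larger `out_shape` simply repeats source cells.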

  4. Geospatial database for heritage building conservation

    Science.gov (United States)

    Basir, W. N. F. W. A.; Setan, H.; Majid, Z.; Chong, A.

    2014-02-01

    Heritage buildings are icons from the past that still exist in the present. Through heritage architecture, we can learn about the economic issues and social activities of the past. Nowadays, heritage buildings are under threat from natural disasters, uncertain weather, pollution and other factors. In order to preserve this heritage for future generations, recording and documenting of heritage buildings is required. With the development of information systems and data collection techniques, it is possible to create a 3D digital model. This 3D information plays an important role in recording and documenting heritage buildings. 3D modeling and virtual reality techniques have demonstrated the ability to visualize the real world in 3D, and can provide a better platform for communication and understanding of heritage buildings. Combining 3D modelling with Geographic Information System (GIS) technology will create a database that supports various analyses of spatial data in the form of a 3D model. The objectives of this research are to determine the reliability of the Terrestrial Laser Scanning (TLS) technique for data acquisition of heritage buildings and to develop a geospatial database for heritage building conservation purposes. The result from data acquisition will become a guideline for 3D model development. This 3D model will be exported to the GIS format in order to develop a database for heritage building conservation. In this database, requirements for the heritage building conservation process are included. Through this research, a proper database for storing and documenting heritage building conservation data will be developed.

  5. Geospatial database for heritage building conservation

    International Nuclear Information System (INIS)

    Basir, W N F W A; Setan, H; Majid, Z; Chong, A

    2014-01-01

    Heritage buildings are icons from the past that still exist in the present. Through heritage architecture, we can learn about the economic issues and social activities of the past. Nowadays, heritage buildings are under threat from natural disasters, uncertain weather, pollution and other factors. In order to preserve this heritage for future generations, recording and documenting of heritage buildings is required. With the development of information systems and data collection techniques, it is possible to create a 3D digital model. This 3D information plays an important role in recording and documenting heritage buildings. 3D modeling and virtual reality techniques have demonstrated the ability to visualize the real world in 3D, and can provide a better platform for communication and understanding of heritage buildings. Combining 3D modelling with Geographic Information System (GIS) technology will create a database that supports various analyses of spatial data in the form of a 3D model. The objectives of this research are to determine the reliability of the Terrestrial Laser Scanning (TLS) technique for data acquisition of heritage buildings and to develop a geospatial database for heritage building conservation purposes. The result from data acquisition will become a guideline for 3D model development. This 3D model will be exported to the GIS format in order to develop a database for heritage building conservation. In this database, requirements for the heritage building conservation process are included. Through this research, a proper database for storing and documenting heritage building conservation data will be developed.

  6. ANALYSIS OF STUDENTS’ LEARNING OBSTACLES ON LEARNING INVERS FUNCTION MATERIAL

    Directory of Open Access Journals (Sweden)

    Krisna Satrio Perbowo

    2017-09-01

    This research is motivated by the presence of obstacles in learning mathematics on the topic of inverse functions. It aims to analyze these learning obstacles and to identify the types of errors students make when learning inverse functions. The study is qualitative and descriptive, with data triangulation. The subjects were 74 high school students, of whom 6 were selected as the main sample. Data on students' errors were obtained from written test results; incorrect answers were classified by type of error, and several students were then interviewed. The analysis revealed four types of errors: concept errors, procedure errors, counting errors and concluding errors. The obstacles that appear in learning inverse functions are influenced by two factors, internal and external. The internal factor is reflected in students' motivation to follow the lessons and their ability to absorb the learning material, while the external factor is reflected in the school's accelerated-class curriculum, which leaves little learning time, and in teaching materials that offer too few worked examples.

  7. Insights into lahar deposition processes in the Curah Lengkong (Semeru Volcano, Indonesia) using photogrammetry-based geospatial analysis, near-surface geophysics and CFD modelling

    Science.gov (United States)

    Gomez, C.; Lavigne, F.; Sri Hadmoko, D.; Wassmer, P.

    2018-03-01

    Semeru Volcano is an active stratovolcano located in East Java (Indonesia), where historic lava flows, occasional pyroclastic flows and vulcanian explosions (every 5 to 15 min on average) generate a stock of material that is remobilized by lahars, mostly occurring during the rainy season between October and March. Every year, several lahars flow down the Curah Lengkong Valley on the south-east flank of the volcano, where numerous lahar studies have been conducted. In the present contribution, the objective was to study the spatial distribution of boulder-size clasts and to understand how this distribution relates to the valley morphology and to the flow and deposition dynamics of lahars. To achieve this objective, the method relies on a combination of (1) aerial photogrammetry-derived geospatial data on boulder distribution, (2) ground penetrating radar data collected along a 2 km series of transects and (3) a CFD model of flow to analyse the results from the deposits. Results show that <1 m diameter boulders are evenly distributed along the channel, but that lava flow deposits visible at the surface of the river bed and SABO dams increase the concentration of clasts upstream of their position. Lateral input of boulders from collapsing lava-flow deposits can bring outsized clasts into the system that tend to become trapped at one location. Finally, the comparison between the CFD simulation and previous research using video imagery of lahars emphasizes that there is no direct link between the sedimentary units observed in the field and the flow that deposited them. Grain size, flow orientation and matrix characteristics can all be very different within the deposit of a single flow, even in confined channels like the Curah Lengkong.

  8. Learning topography with Tangible Landscape games

    Science.gov (United States)

    Petrasova, A.; Tabrizian, P.; Harmon, B. A.; Petras, V.; Millar, G.; Mitasova, H.; Meentemeyer, R. K.

    2017-12-01

    Understanding topography and its representations is crucial for correct interpretation and modeling of surface processes. However, novice earth science and landscape architecture students often find reading topographic maps challenging. As a result, many students struggle to comprehend more complex spatial concepts and processes such as flow accumulation or sediment transport. We developed and tested a new method for teaching hydrology, geomorphology, and grading using Tangible Landscape—a tangible interface for geospatial modeling. Tangible Landscape couples a physical and digital model of a landscape through a real-time cycle of hands-on modeling, 3D scanning, geospatial computation, and projection. With Tangible Landscape students can sculpt a projection-augmented topographic model of a landscape with their hands and use a variety of tangible objects to immediately see how they are changing geospatial analytics such as contours, profiles, water flow, or landform types. By feeling and manipulating the shape of the topography, while seeing projected geospatial analytics, students can intuitively learn about 3D topographic form, its representations, and how topography controls physical processes. Tangible Landscape is powered by GRASS GIS, an open source geospatial platform with extensive libraries for geospatial modeling and analysis. As such, Tangible Landscape can be used to design a wide range of learning experiences across a large number of geoscience disciplines. As part of a graduate level course that teaches grading, 16 students participated in a series of workshops, which were developed as serious games to encourage learning through structured play. These serious games included 1) diverting rain water to a specified location with minimal changes to landscape, 2) building different combinations of landforms, and 3) reconstructing landscapes based on projected contour information with feedback. In this poster, we will introduce Tangible Landscape, and
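One of the projected geospatial analytics mentioned in the record, slope derived from the scanned elevation model, can be sketched in a few lines of numpy. This is only an illustrative stand-in; Tangible Landscape delegates such analytics to GRASS GIS modules (e.g. r.slope.aspect).

```python
import numpy as np

def slope_degrees(dem, cell_size=1.0):
    """Slope in degrees from a gridded elevation model, via central
    differences (a simplified stand-in for GRASS GIS r.slope.aspect)."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    # Slope angle = arctan of the gradient magnitude.
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
```

On a plane rising one unit of elevation per cell, every interior cell reports a 45-degree slope, which makes the function easy to sanity-check.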

  9. Geospatial Information System Capability Maturity Models

    Science.gov (United States)

    2017-06-01

    To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...

  10. GIBS Geospatial Data Abstraction Library (GDAL)

    Data.gov (United States)

    National Aeronautics and Space Administration — GDAL is an open source translator library for raster geospatial data formats that presents a single abstract data model to the calling application for all supported...

  11. Learning slow features for behavior analysis

    NARCIS (Netherlands)

    Zafeiriou, Lazaros; Nicolaou, Mihalis A.; Zafeiriou, Stefanos; Nikitids, Symeon; Pantic, Maja

    2013-01-01

    A recently introduced latent feature learning technique for the analysis of time-varying dynamic phenomena is the so-called Slow Feature Analysis (SFA). SFA is a deterministic component analysis technique for multi-dimensional sequences that, by minimizing the variance of the first-order time derivative
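A minimal linear SFA sketch, assuming the standard formulation (whiten the sequence, then keep the directions in which the first-order time derivative has the least variance):

```python
import numpy as np

def linear_sfa(x, n_components=1):
    """Minimal linear Slow Feature Analysis for a (T, D) sequence.
    Whitens the data, then eigendecomposes the covariance of its time
    derivative; np.linalg.eigh sorts eigenvalues ascending, so the
    leading columns are the slowest features."""
    x = x - x.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(x, rowvar=False))
    white = (x @ vecs) / np.sqrt(vals)  # unit variance in every direction
    dvals, dvecs = np.linalg.eigh(np.cov(np.diff(white, axis=0), rowvar=False))
    return white @ dvecs[:, :n_components]
```

Mixing a slow and a fast sinusoid into two observed channels and running the function recovers the slow component (up to sign and scale), which is the defining behaviour of SFA.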

  12. Lessons learned from failure analysis

    International Nuclear Information System (INIS)

    Le May, I.

    2006-01-01

    Failure analysis can be a very useful tool to designers and operators of plant and equipment. It is not simply something that is done for lawyers and insurance companies, but a tool from which lessons can be learned and by means of which the 'breed' can be improved. In this presentation, several failure investigations that have contributed to understanding are presented, specifically the following cases: 1) a fire at a refinery that occurred in a desulphurization unit; 2) the failure of a pipeline before it was even put into operation; and 3) failures in locomotive axles that took place during winter operation. The refinery fire was initially blamed on defective Type 321 seamless stainless steel tubing, but there were conflicting views among the 'experts' involved as to the mechanism of failure, and the writer was called upon to make an in-depth study. This showed that a variety of failure mechanisms were involved, including high temperature fracture, environmentally-induced cracking and possible manufacturing defects. The unraveling of the failure sequence is described and illustrated. The failure of the oil transmission line was discovered when the line was pressure tested some months after it had been installed and before it was put into service. Repairs were made, and failure occurred in another place when the next pressure test was conducted. After several more repairs had been made, the line was abandoned and a lawsuit was commenced on the basis that the steel was defective. An investigation disclosed that the material was sensitive to embrittlement, and the causes of this were determined. As a result, changes were made in the microstructural control of the product to avoid similar problems in future. A series of axle failures occurred in diesel electric locomotives during winter. An investigation was made to determine the nature of the failures, which were not by classical fatigue, nor did they correspond to published illustrations of Cu

  13. Implementing a High School Level Geospatial Technologies and Spatial Thinking Course

    Science.gov (United States)

    Nielsen, Curtis P.; Oberle, Alex; Sugumaran, Ramanathan

    2011-01-01

    Understanding geospatial technologies (GSTs) and spatial thinking is increasingly vital to contemporary life including common activities and hobbies; learning in science, mathematics, and social science; and employment within fields as diverse as engineering, health, business, and planning. As such, there is a need for a stand-alone K-12…

  14. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    Science.gov (United States)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology poses a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run in the native Python environment. It uses functional programming and metaprogramming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks across multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arc4nix scales linearly in a distributed environment. Arc4nix is open-source software.
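The client-side code construction described in the record can be sketched as follows. The helper name and the way the source string is assembled are hypothetical illustrations of the decoupled design, not the actual arc4nix API; `arcpy.Buffer_analysis` is a real arcpy geoprocessing call, used here only as an example payload that would run on the server.

```python
import textwrap

def build_remote_task(tool, **params):
    """Generate Python source for a server-side geoprocessing call, in
    the spirit of arc4nix's client-side code construction (illustrative
    only; not the real arc4nix interface)."""
    args = ", ".join(f"{k}={v!r}" for k, v in sorted(params.items()))
    return textwrap.dedent(f"""\
        import arcpy
        result = arcpy.{tool}({args})
        print(result)
    """)
```

The client would send this generated source to the remote server for execution and retrieve the printed result, so the heavyweight ArcGIS stack never has to run on the client's (possibly Linux-based) machine.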

  15. Deep Learning in Medical Image Analysis.

    Science.gov (United States)

    Shen, Dinggang; Wu, Guorong; Suk, Heung-Il

    2017-06-21

    This review covers computer-assisted analysis of images in the field of medical imaging. Recent advances in machine learning, especially with regard to deep learning, are helping to identify, classify, and quantify patterns in medical images. At the core of these advances is the ability to exploit hierarchical feature representations learned solely from data, instead of features designed by hand according to domain-specific knowledge. Deep learning is rapidly becoming the state of the art, leading to enhanced performance in various medical applications. We introduce the fundamentals of deep learning methods and review their successes in image registration, detection of anatomical and cellular structures, tissue segmentation, computer-aided disease diagnosis and prognosis, and so on. We conclude by discussing research issues and suggesting future directions for further improvement.
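The contrast the review draws between hand-designed features and features learned from data can be illustrated with the elementary operation of a convolutional layer. In the toy sketch below the kernel weights are fixed by hand to detect vertical edges, whereas a deep network would learn such weights from training data.

```python
import numpy as np

def conv2d_relu(image, kernel):
    """One 'valid' convolution (as used in CNNs, i.e. cross-correlation)
    followed by a ReLU -- the elementary operation of a convolutional
    layer, here with hand-set rather than learned weights."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)  # ReLU non-linearity
```

Applied to a binary image with a vertical intensity step, the [-1, 1] kernel responds only at the edge, which is exactly the kind of low-level feature the first layer of a medical-image CNN tends to learn.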

  16. Graduate Ethics Curricula for Future Geospatial Technology Professionals (Invited)

    Science.gov (United States)

    Wright, D. J.; Dibiase, D.; Harvey, F.; Solem, M.

    2009-12-01

    Professionalism in today's rapidly growing, multidisciplinary geographic information science field (e.g., geographic information systems or GIS, remote sensing, cartography, quantitative spatial analysis) now involves a commitment to ethical practice as informed by a more sophisticated understanding of the ethical implications of geographic technologies. The lack of privacy introduced by mobile mapping devices, the use of GIS for military and surveillance purposes, the appropriate use of data collected using these technologies for policy decisions (especially for conservation and sustainability), and the general consequences of inequities that arise through biased access to geospatial tools and derived data all continue to be challenging issues and topics of deep concern for many. Students and professionals working with GIS and related technologies should develop a sound grasp of these issues and a thorough comprehension of the concerns impacting their use and development in today's world. However, while most people agree that ethics matters for GIS, we often have difficulty putting ethical issues into practice. An ongoing project supported by NSF seeks to bridge this gap by providing a sound basis for future ethical consideration of a variety of issues. A model seminar curriculum is under development by a team of geographic information science and technology (GIS&T) researchers and professional ethicists, along with protocols for course evaluations. In the curriculum, students first investigate the nature of professions in general and the characteristics of a GIS&T profession in particular. They hone moral reasoning skills through methodical analyses of case studies in relation to various GIS Codes of Ethics and Rules of Conduct. They learn to unveil the "moral ecologies" of a profession through actual interviews with real practitioners in the field. Assignments thus far include readings, class discussions, practitioner interviews, and preparations of original case

  17. The new geospatial tools: global transparency enhancing safeguards verification

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank Vincent [Los Alamos National Laboratory

    2010-09-16

    This paper focuses on the importance and potential role of the new, freely available, geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency' and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based Blogs and Wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D Modeling and visualization) was only made possible following the acquisition of higher resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  18. The new geospatial tools: global transparency enhancing safeguards verification

    International Nuclear Information System (INIS)

    Pabian, Frank Vincent

    2010-01-01

    This paper focuses on the importance and potential role of the new, freely available, geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency' and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as support ongoing monitoring and verification of various treaty (e.g., NPT, FMCT) relevant activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on some recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' given the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based Blogs and Wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D Modeling and visualization) was only made possible following the acquisition of higher resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  19. Analysis of Former Learning Assistants' Views on Cooperative Learning

    Science.gov (United States)

    Gray, Kara E.; Otero, Valerie K.

    2009-11-01

    The University of Colorado Learning Assistant (LA) program integrates a weekly education seminar, meetings with science faculty to review content, and a semester-long teaching experience that hires undergraduates to work with groups of students in university science courses. Following this three-pronged learning experience, some of the LAs continue into the teacher certification program. While previous research has shown that this model has more than doubled the number of science and math majors graduating with a teaching certification, the question remains whether these teachers are better prepared to teach. The analysis presented here addresses this question by comparing the views of former LAs to the views of comparable teachers on the issue of cooperative learning. Interviews were conducted with ten middle school and high school science teachers throughout their first year of teaching. Results suggest differences in former LAs' views of group work and in their purposes for using group work.

  20. GEOSPATIAL CHARACTERIZATION OF BIODIVERSITY: NEED AND CHALLENGES

    Directory of Open Access Journals (Sweden)

    P. S. Roy

    2012-08-01

    Explaining the distribution of species and understanding their abundance and spatial distribution at multiple scales, using remote sensing and ground-based observation, has been a central aspect of the COP10 meeting for achieving the CBD 2020 targets. In this respect, the Biodiversity Characterization at Landscape Level for India is a milestone in biodiversity study in the country. Satellite remote sensing has been used to derive the spatial extent and vegetation composition patterns. The sensitivity of different multi-scale landscape metrics, species composition, ecosystem uniqueness and diversity in the distribution of biological diversity is assessed through customized landscape analysis software to generate the biological richness surface. The uniqueness of the study lies in the creation of baseline geospatial data on vegetation types using multi-temporal satellite remote sensing data (IRS LISS III), the derivation of biological richness based on spatial landscape analysis, and an inventory of location-specific information about 7,964 unique plant species recorded in 20,000 sample plots in India, together with their status with respect to endemic, threatened and economic/medicinal importance. The results will serve as a baseline database for assessments of biodiversity in addressing the CBD 2020 targets.
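    As a toy illustration of the kind of landscape metric such a characterization combines, the sketch below computes the Shannon diversity index over an invented land-cover raster (the classes, grid and function name are illustrative only, not taken from the study):

```python
from collections import Counter
from math import log

def shannon_diversity(grid):
    """Shannon diversity H' = -sum(p_i * ln p_i) over land-cover class
    proportions -- a standard landscape-level diversity metric."""
    counts = Counter(cell for row in grid for cell in row)
    total = sum(counts.values())
    return -sum((n / total) * log(n / total) for n in counts.values())

# Hypothetical 3x3 land-cover raster: forest (F), grass (G), water (W).
landcover = [["F", "F", "G"],
             ["F", "G", "G"],
             ["W", "F", "G"]]
print(round(shannon_diversity(landcover), 3))  # 0.965
```

A richness surface of the kind the study describes would evaluate such metrics in a moving window over the classified satellite imagery.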

  1. A geospatial search engine for discovering multi-format geospatial data across the web

    Science.gov (United States)

    Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney

    2014-01-01

    The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease with which data can now be created. However, challenges remain in connecting individuals searching for geospatial data with the servers and websites where such data exist. The objective of this paper is to present a publicly...

  2. Nebhydro: Sharing Geospatial Data to Supportwater Management in Nebraska

    Science.gov (United States)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydrologic variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving the interoperability of GIS. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, and Linux- and Python-based scripting). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers. The system provides online access, querying, visualization, and analysis of the hydrological data from several sources.

  3. Roadside video data analysis deep learning

    CERN Document Server

    Verma, Brijesh; Stockwell, David

    2017-01-01

    This book highlights the methods and applications for roadside video data analysis, with a particular focus on the use of deep learning to solve roadside video data segmentation and classification problems. It describes system architectures and methodologies that are specifically built upon learning concepts for roadside video data processing, and offers a detailed analysis of the segmentation, feature extraction and classification processes. Lastly, it demonstrates the applications of roadside video data analysis including scene labelling, roadside vegetation classification and vegetation biomass estimation in fire risk assessment.

  4. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main tasks such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm aiming to use it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.
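    The paper's specific algorithm is not reproduced in the record above; as a generic sketch of the idea - learning a diagonal (weighted-Euclidean) metric so that informative genes dominate the distance instead of being selected outright - one might write the following, with all data, weights and function names invented:

```python
def diagonal_metric(X, y):
    """Learn per-feature weights for a diagonal metric from labelled
    expression profiles: each feature is weighted by its between-class /
    within-class variance ratio (a generic Fisher-style heuristic, not
    the algorithm of the paper)."""
    classes = sorted(set(y))
    n_feat = len(X[0])
    weights = []
    for j in range(n_feat):
        col = [row[j] for row in X]
        grand_mean = sum(col) / len(col)
        between, within = 0.0, 0.0
        for c in classes:
            vals = [row[j] for row, lab in zip(X, y) if lab == c]
            m = sum(vals) / len(vals)
            between += len(vals) * (m - grand_mean) ** 2
            within += sum((v - m) ** 2 for v in vals)
        weights.append(between / within if within > 0 else 0.0)
    return weights

def wdist(a, b, w):
    """Weighted Euclidean distance under the learned diagonal metric."""
    return sum(wi * (ai - bi) ** 2 for wi, ai, bi in zip(w, a, b)) ** 0.5

# Toy data: feature 0 separates the two classes, feature 1 is noise.
X = [[0.0, 5.0], [0.2, 1.0], [3.0, 4.8], [3.2, 1.2]]
y = ["A", "A", "B", "B"]
w = diagonal_metric(X, y)
print(w[0] > w[1])  # True: the informative gene gets the larger weight
```

Under such a metric, same-class samples end up closer than cross-class samples without discarding any gene, which is the sense in which metric learning can substitute for gene set selection.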

  5. Methods and Tools to Align Curriculum to the Skills and Competencies Needed by the Workforce - an Example from Geospatial Science and Technology

    Science.gov (United States)

    Johnson, A. B.

    2012-12-01

    Geospatial science and technology (GST) including geographic information systems, remote sensing, global positioning systems and mobile applications, are valuable tools for geoscientists and students learning to become geoscientists. GST allows the user to analyze data spatially and temporarily and then visualize the data and outcomes in multiple formats (digital, web and paper). GST has evolved rapidly and it has been difficult to create effective curriculum as few guidelines existed to help educators. In 2010, the US Department of Labor (DoL), in collaboration with the National Geospatial Center of Excellence (GeoTech Center), a National Science Foundation supported grant, approved the Geospatial Technology Competency Mode (GTCM). The GTCM was developed and vetted with industry experts and provided the structure and example competencies needed across the industry. While the GTCM was helpful, a more detailed list of skills and competencies needed to be identified in order to build appropriate curriculum. The GeoTech Center carried out multiple DACUM events to identify the skills and competencies needed by entry-level workers. DACUM (Developing a Curriculum) is a job analysis process whereby expert workers are convened to describe what they do for a specific occupation. The outcomes from multiple DACUMs were combined into a MetaDACUM and reviewed by hundreds of GST professionals. This provided a list of more than 320 skills and competencies needed by the workforce. The GeoTech Center then held multiple workshops across the U.S. where more than 100 educators knowledgeable in teaching GST parsed the list into Model Courses and a Model Certificate Program. During this process, tools were developed that helped educators define which competency should be included in a specific course and the depth of instruction for that competency. 
This presentation will provide details about the process, methodology and tools used to create the Models and suggest how they can be used

  6. Local government GIS and geospatial capabilities : suitability for integrated transportation and land use planning (California SB 375).

    Science.gov (United States)

    2009-11-01

    This report examines two linked phenomena in transportation planning: the geospatial analysis capabilities of local planning agencies and the increasing demands on such capabilities imposed by comprehensive planning mandates. The particular examples ...

  7. From Geomatics to Geospatial Intelligent Service Science

    Directory of Open Access Journals (Sweden)

    LI Deren

    2017-10-01

    Full Text Available The paper reviews the 60 years of development from traditional surveying and mapping to today's geospatial intelligent service science.The three important stages of surveying and mapping, namely analogue,analytical and digital stage are summarized.The author introduces the integration of GNSS,RS and GIS(3S,which forms the rise of geospatial informatics(Geomatics.The development of geo-spatial information science in digital earth era is analyzed,and the latest progress of geo-spatial information science towards real-time intelligent service in smart earth era is discussed.This paper focuses on the three development levels of "Internet plus" spatial information intelligent service.In the era of big data,the traditional geomatics will surely take advantage of the integration of communication,navigation,remote sensing,artificial intelligence,virtual reality and brain cognition science,and become geospatial intelligent service science,thereby making contributions to national economy,defense and people's livelihood.

  8. Entrepreneurship Learning Process by using SWOT Analysis

    OpenAIRE

    Jajat Sudrajat; Muhammad Ali Rahman; Antonius Sianturi; Vendy Vendy

    2016-01-01

    The research objective was to produce a model of entrepreneurship learning using SWOT analysis, which was currently being run with the concept of large classes and small classes. The study was expected to be useful for the Binus Entrepreneurship Center (BEC) unit in creating a development map for entrepreneurship learning. The influences generated by using SWOT analysis are wide-ranging, as are the benefits of the implementation of large classes and small classes for students...

  9. Slow feature analysis: unsupervised learning of invariances.

    Science.gov (United States)

    Wiskott, Laurenz; Sejnowski, Terrence J

    2002-04-01

    Invariant features of temporally varying signals are useful for analysis and classification. Slow feature analysis (SFA) is a new method for learning invariant or slowly varying features from a vectorial input signal. It is based on a nonlinear expansion of the input signal and application of principal component analysis to this expanded signal and its time derivative. It is guaranteed to find the optimal solution within a family of functions directly and can learn to extract a large number of decorrelated features, which are ordered by their degree of invariance. SFA can be applied hierarchically to process high-dimensional input signals and extract complex features. SFA is applied first to complex cell tuning properties based on simple cell output, including disparity and motion. Then more complicated input-output functions are learned by repeated application of SFA. Finally, a hierarchical network of SFA modules is presented as a simple model of the visual system. The same unstructured network can learn translation, size, rotation, contrast, or, to a lesser degree, illumination invariance for one-dimensional objects, depending on only the training stimulus. Surprisingly, only a few training objects suffice to achieve good generalization to new objects. The generated representation is suitable for object recognition. Performance degrades if the network is trained to learn multiple invariances simultaneously.
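    A minimal linear SFA sketch, assuming NumPy and following the recipe in the abstract (whiten the signal, then take the least-varying principal directions of its time derivative); the toy two-channel signal is invented for illustration:

```python
import numpy as np

def linear_sfa(x, n_features=1):
    """Linear slow feature analysis: find projections of the input
    signal that vary as slowly as possible over time."""
    x = x - x.mean(axis=0)                      # center
    # Whiten so the signal has (approximately) unit covariance.
    d, e = np.linalg.eigh(np.cov(x, rowvar=False))
    keep = d > 1e-10                            # drop null directions
    w = e[:, keep] / np.sqrt(d[keep])
    z = x @ w
    # PCA on the time derivative: the directions of *least* derivative
    # variance are the slowest features (eigh sorts ascending).
    dz = np.diff(z, axis=0)
    _, e2 = np.linalg.eigh(np.cov(dz, rowvar=False))
    return x @ (w @ e2[:, :n_features])

# Toy input: a slow sine mixed into two fast, invertible channels.
t = np.linspace(0, 2 * np.pi, 500)
slow = np.sin(t)
fast = np.sin(37 * t)
x = np.column_stack([slow + 0.5 * fast, 0.5 * slow - fast])
y = linear_sfa(x, n_features=1).ravel()
r = abs(np.corrcoef(y, slow)[0, 1])  # recovered feature vs slow source
print(round(r, 2))
```

The recovered feature correlates strongly with the slow source (up to sign and scale); the nonlinear expansion described in the abstract would simply apply the same procedure to an expanded version of the input.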

  10. Multidimensional (OLAP) Analysis for Designing Dynamic Learning Strategy

    Science.gov (United States)

    Rozeva, A.; Deliyska, B.

    2010-10-01

    Learning strategy in an intelligent learning system is generally elaborated on the basis of an assessment of the following factors: the learner's reaction time, the content of the learning object, the amount of learning material in a learning object, the learning object specification, the e-learning medium, and performance control. The current work proposes an architecture for dynamic learning strategy design that implements a multidimensional analysis model of learning factors. The analysis model concerns on-line analytical processing (OLAP) of learner data structured as a multidimensional cube. The main components of the architecture are an analysis agent for performing the OLAP operations on the learner data cube, an adaptation generator, and a knowledge selection agent for performing adaptive navigation in the learning object repository. The output of the analysis agent is involved in the dynamic elaboration of a learning strategy that best fits the learner's profile and behavior. As a result, an adaptive learning path for individual learners and for learner groups is generated.
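    The cube operations such an analysis agent performs can be sketched minimally; the fact table, dimension names and values below are invented, and a real OLAP engine would replace them:

```python
from collections import defaultdict

# Toy fact table: one row per learning event, with three dimensions
# (learner, topic, week) and one measure (seconds spent).
facts = [
    ("ann", "algebra",  1, 300),
    ("ann", "algebra",  2, 240),
    ("ann", "geometry", 1, 420),
    ("bob", "algebra",  1, 500),
    ("bob", "geometry", 2, 380),
]

def roll_up(facts, dims):
    """Aggregate the measure over all dimensions not listed in `dims`
    (an OLAP roll-up); `dims` are indices into the fact tuple."""
    cube = defaultdict(int)
    for row in facts:
        cube[tuple(row[d] for d in dims)] += row[-1]
    return dict(cube)

def slice_cube(facts, dim, value):
    """Fix one dimension to a single value (an OLAP slice)."""
    return [row for row in facts if row[dim] == value]

# Total time per learner (roll-up over topic and week):
print(roll_up(facts, dims=(0,)))  # {('ann',): 960, ('bob',): 880}
# All week-1 events (slice on the week dimension):
print(slice_cube(facts, dim=2, value=1))
```

An adaptation generator could then read such aggregates off the cube to pick the next learning object for a learner or group.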

  11. GEO-SPATIAL MODELING OF TRAVEL TIME TO MEDICAL FACILITIES IN MUNA BARAT DISTRICT, SOUTHEAST SULAWESI PROVINCE, INDONESIA

    Directory of Open Access Journals (Sweden)

    Nelson Sula

    2018-03-01

    Background: Health services are strongly influenced by regional topography. Road infrastructure is key to access to health services. Geographic information systems provide a tool for modeling access to health services. Objective: To analyze geospatial data on travel time to medical facilities in Muna Barat district, Southeast Sulawesi Province, Indonesia. Methods: This research used geospatial analysis with classification of raster data, overlaid with layers such as a digital elevation model (DEM), road vector data, and public health center (Puskesmas) locations. Results: The geospatial analysis showed that the travel time to Puskesmas in the Napano Kusambi and Kusambi sub-districts is between 90 and 120 minutes, and the travel time to the hospital in Kusambi sub-district exceeds 2 hours. Conclusion: The output of this geospatial analysis can be an input for local government in planning infrastructure development in Muna Barat district, Indonesia.
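    The travel-time surface described in the methods can be illustrated with a minimal cost-distance sketch (Dijkstra over a toy cost raster; the grid values, units and facility location are invented):

```python
import heapq

def travel_time(cost, start):
    """Cumulative travel time from `start` to every cell of a cost
    raster, via Dijkstra over 4-connected neighbours. Moving into a cell
    costs that cell's traversal time (minutes) -- a simplified stand-in
    for a DEM/road-derived cost surface."""
    rows, cols = len(cost), len(cost[0])
    dist = [[float("inf")] * cols for _ in range(rows)]
    dist[start[0]][start[1]] = 0
    pq = [(0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return dist

# Hypothetical 3x3 raster of per-cell traversal times: a high-cost band
# (e.g. steep terrain) separates the health centre at (0, 0).
raster = [[1, 1, 1],
          [9, 9, 1],
          [1, 1, 1]]
times = travel_time(raster, (0, 0))
print(times[2][0])  # 6: the cheapest route detours around the band
```

Real analyses run the same idea over DEM-derived slope and road-speed rasters, then classify the resulting minutes into access bands.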

  12. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    Science.gov (United States)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

    Geospatial data resources are the foundation of the construction of a geo portal designed to provide online geoinformation services for government, enterprises and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we face is data acquisition; for us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided the construction of the National Geospatial Information Service Platform of China (NGISP) with effective technical support. NGISP is China's official geo portal, providing online geoinformation services over the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data come from these nodes, and the different datasets are heterogeneous. Based on the analysis of the heterogeneous datasets, we first define the basic principles of data fusion, covering the following aspects: (1) location precision; (2) geometric representation; (3) up-to-date state; (4) attribute values; and (5) spatial relationships. The technical procedure is then developed, and a method for processing different categories of features, such as roads, railways, boundaries, rivers, settlements and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrates the applicability of the principles, procedure and method of multi-source geospatial data integration.

  13. Lessons learned in applying function analysis

    International Nuclear Information System (INIS)

    Mitchel, G.R.; Davey, E.; Basso, R.

    2001-01-01

    This paper summarizes the lessons learned in undertaking and applying function analysis based on the recent experience of utility, AECL and international design and assessment projects. Function analysis is an analytical technique that can be used to characterize and assess the functions of a system and is widely recognized as an essential component of a 'systematic' approach to design, one that integrates operational and user requirements into the standard design process. (author)

  14. Remote Sensing Technologies and Geospatial Modelling Hierarchy for Smart City Support

    Science.gov (United States)

    Popov, M.; Fedorovsky, O.; Stankevich, S.; Filipovich, V.; Khyzhniak, A.; Piestova, I.; Lubskyi, M.; Svideniuk, M.

    2017-12-01

    The approach to implementing remote sensing technologies and geospatial modelling for smart city support is presented. The hierarchical structure and basic components of the smart city information support subsystem are considered. Some already available useful practical developments are described. These include city land use planning, urban vegetation analysis, thermal condition forecasting, geohazard detection, and flooding risk assessment. A remote sensing data fusion approach for comprehensive geospatial analysis is discussed. Long-term city development forecasting with the Forrester-Graham system dynamics model is provided for the Kiev urban area.

  15. Transportation of Large Wind Components: A Review of Existing Geospatial Data

    Energy Technology Data Exchange (ETDEWEB)

    Mooney, Meghan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maclaurin, Galen [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    This report features the geospatial data component of a larger project evaluating logistical and infrastructure requirements for transporting oversized and overweight (OSOW) wind components. The goal of the larger project was to assess the status and opportunities for improving the infrastructure and regulatory practices necessary to transport wind turbine towers, blades, and nacelles from current and potential manufacturing facilities to end-use markets. The purpose of this report is to summarize existing geospatial data on wind component transportation infrastructure and to provide a data gap analysis, identifying areas for further analysis and data collection.

  16. Foreword to the theme issue on geospatial computer vision

    Science.gov (United States)

    Wegner, Jan Dirk; Tuia, Devis; Yang, Michael; Mallet, Clement

    2018-06-01

    Geospatial Computer Vision has become one of the most prevalent emerging fields of investigation in Earth Observation in the last few years. In this theme issue, we aim at showcasing a number of works at the interface between remote sensing, photogrammetry, image processing, computer vision and machine learning. In light of recent sensor developments - both from the ground and from above - an unprecedented (and ever growing) quantity of geospatial data is available for tackling challenging and urgent tasks such as environmental monitoring (deforestation, carbon sequestration, climate change mitigation), disaster management, autonomous driving or the monitoring of conflicts. The new bottleneck for serving these applications is the extraction of relevant information from such large amounts of multimodal data. This includes sources stemming from multiple sensors that differ in physical nature, quality, and spatial, spectral and temporal resolution. They are as diverse as multi-/hyperspectral satellite sensors, color cameras on drones, laser scanning devices, existing open land-cover geodatabases and social media. Such core data processing is mandatory so as to generate semantic land-cover maps, accurate detection and trajectories of objects of interest, as well as by-products of superior added value: georeferenced data, images with enhanced geometric and radiometric qualities, or Digital Surface and Elevation Models.

  17. The African Geospatial Sciences Institute (agsi): a New Approach to Geospatial Training in North Africa

    Science.gov (United States)

    Oeldenberger, S.; Khaled, K. B.

    2012-07-01

    The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate geospatial capacity development in North Africa, providing the facilities for geospatial project and management training to regional government employees, university graduates, private individuals and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training, providing actual project execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities, e.g. the ITC, and international organizations such as the ISPRS, the ICA and the OGC. Through close cooperation with African organizations such as the AARSE, the RCMRD and RECTAS, the network and exchange of ideas, experiences, technology and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will steer the AGSI operations and will ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility with approximately 30 part- and full-time general staff and lecturers in Tunis during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and a compact LIDAR instrument for local, community-scale data capture. Surveying training, the photogrammetric processing of aerial images, GIS data capture and remote sensing training will be the main components of the practical training courses.

  18. AGWA: The Automated Geospatial Watershed Assessment Tool

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment Tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  19. Geospatial Modeling of Asthma Population in Relation to Air Pollution

    Science.gov (United States)

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Young, John H.; Luvall, Jeffrey C.; Alhamdan, Mohammad

    2013-01-01

    Current observations indicate that asthma is growing every year in the United States; the specific reasons for this are not well understood. This study stems from an ongoing research effort to investigate the spatio-temporal behavior of asthma and its relatedness to air pollution. The association between environmental variables such as air quality and asthma-related health issues over the state of Mississippi is investigated using Geographic Information Systems (GIS) tools and applications. Health data concerning asthma were obtained from the Mississippi State Department of Health (MSDH) for the 9-year period 2003-2011, and air pollutant concentration data (PM2.5) were collected from USEPA web resources; these are analyzed geospatially to establish the impacts of air quality on human health, specifically related to asthma. Disease mapping using geospatial techniques provides valuable insights into the spatial nature, variability, and association of asthma to air pollution. Asthma patient hospitalization data for Mississippi have been analyzed and mapped using quantitative choropleth techniques in ArcGIS, with patients geocoded to their respective zip codes. Potential air pollutant sources such as interstate highways and industries, together with other land use data, have been integrated in a common geospatial platform to understand their adverse contribution to human health. Existing hospitals and emergency clinics are also included in the analysis to further understand their proximity and ease of access to patient locations. At the current level of analysis and understanding, the spatial distribution of asthma is observed in the populations of zip code regions on the gulf coast, along the interstates of the south, and in counties of northeast Mississippi. It is also found that asthma is prevalent in most of the urban population. This GIS-based project would be useful for making health risk assessments and providing information support to administrators and decision makers for establishing satellite clinics in the future.
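    The quantitative choropleth step mentioned above can be sketched with a simple quantile classification; the rates and the function name below are invented, not the study's data:

```python
def quantile_breaks(values, k=3):
    """Class break points for a k-class quantile choropleth: each class
    receives roughly the same number of areas. A generic sketch of the
    classification step, not the study's actual method or data."""
    s = sorted(values)
    return [s[(len(s) * i) // k] for i in range(1, k)]

# Hypothetical asthma hospitalization rates per ZIP code (per 10,000).
rates = [12.0, 3.5, 8.1, 21.4, 5.0, 9.9, 15.2, 2.2, 7.3]
print(quantile_breaks(rates, k=3))  # [7.3, 12.0]
```

Each ZIP code would then be shaded by the class its rate falls into, which is what the ArcGIS choropleth renderer does internally.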

  20. Student Focused Geospatial Curriculum Initiatives: Internships and Certificate Programs at NCCU

    Science.gov (United States)

    Vlahovic, G.; Malhotra, R.

    2009-12-01

    Service (GRITS) Center housed in the Department of Environmental, Earth and Geospatial Sciences. The GRITS center was established in 2006 with funding from the National Science Foundation to promote the learning and application of geospatial technologies. Since then GRITS has been a hub for Geographical Information Science (GIS) curriculum development, faculty and professional GIS workshops, grant writing and outreach efforts. The Center also serves as a contact point for partnerships with other universities, national organizations and businesses in the geospatial arena - and as a result, opens doors to the professional world for our graduate and undergraduate students.

  1. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    Magnetic jack type Control Rod Drive Mechanisms (CRDMs) for pressurized water reactor (PWR) plants operate control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e., the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely depending on characteristics such as plant condition and the individual plant. In the existing motion analysis, applying a single analysis technique to all plant conditions and plants raises an issue of analysis accuracy. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates the wide variation in CRDM operational data and improves analysis accuracy. (author)
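    The Random Forests analysis itself is not shown in the record; the sketch below only illustrates the quantity being estimated - the interval between a coil energizing point and the armature response point - on an invented synthetic trace, with all names and values hypothetical:

```python
def response_time(signal, trigger_idx, threshold, dt_ms=1):
    """Interval (in ms) between a coil energizing point (trigger_idx)
    and the first later sample where the trace crosses `threshold` --
    a simple threshold-crossing stand-in for detecting the armature
    closed/opened point. Returns None if no crossing is found."""
    for i in range(trigger_idx, len(signal)):
        if signal[i] >= threshold:
            return (i - trigger_idx) * dt_ms
    return None

# Synthetic coil-current trace (arbitrary units), 1 ms per sample:
# energized at sample 10, armature response crosses 0.8 at sample 35.
trace = [0.0] * 10 + [0.5] * 25 + [0.9] * 15
print(response_time(trace, trigger_idx=10, threshold=0.8))  # 25
```

A learned model of the kind the paper describes would replace the fixed threshold with a detector trained on traces from many plants and plant conditions.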

  2. Geospatial Information Relevant to the Flood Protection Available on The Mainstream Web

    Directory of Open Access Journals (Sweden)

    Kliment Tomáš

    2014-03-01

Flood protection is one of several disciplines in which geospatial data is a crucial component. Its management, processing, and sharing form the foundation for efficient use; therefore, special attention is required in the development of effective, precise, standardized, and interoperable models for the discovery and publishing of data on the Web. This paper describes the design of a methodology to discover Open Geospatial Consortium (OGC) services on the Web and collect descriptive information, i.e., metadata, in a geocatalogue. A pilot implementation of the proposed methodology - a geocatalogue of geospatial information provided by OGC services discovered on Google (hereinafter “Geocatalogue”) - was used to search for available resources relevant to the area of flood protection. The result is an analysis of the availability of resources discovered through their metadata collected from the OGC services (WMS, WFS, etc.) and the resources they provide (WMS layers, WFS objects, etc.) within the domain of flood protection.

  3. Geospatial-temporal semantic graph representations of trajectories from remote sensing and geolocation data

    Science.gov (United States)

    Perkins, David Nikolaus; Brost, Randolph; Ray, Lawrence P.

    2017-08-08

    Various technologies for facilitating analysis of large remote sensing and geolocation datasets to identify features of interest are described herein. A search query can be submitted to a computing system that executes searches over a geospatial temporal semantic (GTS) graph to identify features of interest. The GTS graph comprises nodes corresponding to objects described in the remote sensing and geolocation datasets, and edges that indicate geospatial or temporal relationships between pairs of nodes in the nodes. Trajectory information is encoded in the GTS graph by the inclusion of movable nodes to facilitate searches for features of interest in the datasets relative to moving objects such as vehicles.
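The node/edge structure described above can be illustrated with a toy adjacency list. All node names, relation labels, and timestamps below are invented for illustration and do not reflect the patented system's actual schema.

```python
# Minimal sketch of a geospatial-temporal semantic (GTS) graph: nodes for
# objects, edges carrying geospatial/temporal relationships, and a movable
# node ("vehicle_1") whose trajectory is encoded by its time-stamped edges.

nodes = {
    "vehicle_1": {"type": "vehicle", "movable": True},
    "bridge_A":  {"type": "bridge",  "movable": False},
    "depot_B":   {"type": "depot",   "movable": False},
}

# Each edge: (source, relation, target, attributes).
edges = [
    ("vehicle_1", "near", "bridge_A", {"time": "2017-08-08T10:00"}),
    ("vehicle_1", "near", "depot_B",  {"time": "2017-08-08T11:30"}),
]

def trajectory(node_id):
    """Time-ordered places a movable node was observed near."""
    hits = [(e[3]["time"], e[2]) for e in edges
            if e[0] == node_id and e[1] == "near"]
    return [place for _, place in sorted(hits)]

print(trajectory("vehicle_1"))
```

A search query over such a graph amounts to pattern matching on node types and edge relations, which is why movable nodes make trajectory-relative searches expressible.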

  4. A research on the security of wisdom campus based on geospatial big data

    Science.gov (United States)

    Wang, Haiying

    2018-05-01

A wisdom campus faces several difficulties, such as geospatial big data sharing, function expansion, data management, and the analysis and mining of geospatial big data; in particular, the inability to guarantee data security has attracted increasingly prominent attention. In this article we put forward a data-oriented software architecture, designed around the ideology of orienting data and treating data as the kernel, to solve the problems of traditional software architecture, broaden campus spatial data research, and develop wisdom campus applications.

  5. Machine Learning Methods for Production Cases Analysis

    Science.gov (United States)

    Mokrova, Nataliya V.; Mokrov, Alexander M.; Safonova, Alexandra V.; Vishnyakov, Igor V.

    2018-03-01

An approach to the analysis of events occurring during the production process is proposed. The described machine learning system is able to solve classification tasks related to production control and hazard identification at an early stage. Descriptors of the internal production network data were used for training and testing the applied models. k-Nearest Neighbors and Random Forest methods were used to illustrate and analyze the proposed solution. The quality of the developed classifiers was estimated using standard statistical metrics, such as precision, recall, and accuracy.
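The k-Nearest Neighbors classification and the metrics named above can be sketched compactly. The two-feature "event descriptors", the labels, and the `knn_predict` helper are all invented toy material, not the paper's data or implementation.

```python
# Hedged sketch: a tiny k-NN classifier over invented production-event
# descriptors, evaluated with precision, recall, and accuracy.
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Label of the majority class among the k nearest training points."""
    order = sorted(range(len(train)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], x)))
    return Counter(labels[i] for i in order[:k]).most_common(1)[0][0]

train = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = ["normal"] * 3 + ["hazard"] * 3
test_x = [(0.5, 0.5), (5.5, 5.5), (1, 1), (6, 6)]
test_y = ["normal", "hazard", "normal", "hazard"]

pred = [knn_predict(train, labels, x) for x in test_x]
tp = sum(p == t == "hazard" for p, t in zip(pred, test_y))
fp = sum(p == "hazard" and t == "normal" for p, t in zip(pred, test_y))
fn = sum(p == "normal" and t == "hazard" for p, t in zip(pred, test_y))
accuracy = sum(p == t for p, t in zip(pred, test_y)) / len(test_y)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
print(accuracy, precision, recall)
```

On this cleanly separated toy data all three metrics come out perfect; real production data would not, which is why the paper reports all three rather than accuracy alone.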

  6. Entrepreneurship Learning Process by using SWOT Analysis

    Directory of Open Access Journals (Sweden)

    Jajat Sudrajat

    2016-03-01

The research objective was to produce a model of learning entrepreneurship by using SWOT analysis, which was currently being run with the concept of large classes and small classes. This study was expected to be useful for the Binus Entrepreneurship Center (BEC) unit in creating a development map for learning entrepreneurship. The influence generated by using SWOT analysis is very wide, as are the benefits of implementing large classes and small classes for students and faculty. Participants of this study were Binus students of various majors who were taking courses EN001 and EN002. This study used research and development, examining the theoretical learning components of entrepreneurship education (the teaching and learning dimension), with six survey dimensions forming the fundamental elements in determining the framework of entrepreneurship education. The research finds at least eight strategies, based on the factor matrix, for improving the entrepreneurship learning process. One of these is a strategy to increase BEC's collaboration with family support. This strategy is supported by survey results from the three majors following EN001 and EN002, in which more than 85% of the students are willing to take an aptitude test to determine the advantages and disadvantages of self-development, and more than 54% of the students are not willing to accept their parents' wishes because these do not correspond to their ideals. Based on the above results, it is suggested that further research develop entrepreneurship research by analyzing other dimensions.

  7. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    Science.gov (United States)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from the massive and heterogeneous resources. The past decades' developments witnessed the operation of many service components to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge for the following reasons: 1) Entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats, and GUI styles to organize, present, and publish metadata. It is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and the overhead of maintaining data consistency. 3) Heterogeneous semantics complicate data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution, but integrating them with existing services is challenging due to the expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore, and analyze search results.
Furthermore, the presentation of the value
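As a baseline for the keyword-matching search the abstract critiques, term-overlap ranking can be sketched with Jaccard similarity. The query terms, record keywords, and `jaccard` helper below are all invented for illustration; GeoSearch's actual brokering and semantic ranking are more sophisticated than this.

```python
# Illustrative keyword-matching baseline: rank metadata records by the
# Jaccard similarity of their keyword sets to the query terms. This is
# the kind of matching whose precision/recall limits motivate semantic
# similarity evaluation.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

query = {"precipitation", "global", "monthly"}
records = {
    "rec1": {"precipitation", "monthly", "satellite"},
    "rec2": {"temperature", "daily"},
}
ranked = sorted(records, key=lambda r: jaccard(query, records[r]), reverse=True)
print(ranked)
```

A record using the synonym "rainfall" instead of "precipitation" would score zero here, which is exactly the heterogeneous-semantics failure mode described above.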

  8. Does Service-Learning Increase Student Learning?: A Meta-Analysis

    Science.gov (United States)

    Warren, Jami L.

    2012-01-01

    Research studies reflect mixed results on whether or not service-learning increases student learning outcomes. The current study seeks to reconcile these findings by extending a meta-analysis conducted by Novak, Markey, and Allen (2007) in which these authors examined service-learning and student learning outcomes. In the current study, 11…

  9. Geospatial analysis of land use and water interaction in the peri-urban area of Cuauhtémoc, Chihuahua. A socio-environmental study in northern Mexico

    Directory of Open Access Journals (Sweden)

    Rolando Enrique Díaz Caravantes

    2013-07-01

For decades, city growth has been considered only in terms of land availability. Cities of northern Mexico, usually located in arid or semi-arid regions, depend heavily on groundwater. For this reason, comprehensive urban planning must consider the peri-urban area not only in terms of land, but also of groundwater. Water transfer for urban use produces severe alterations to the natural environment, such as aquifer depletion and drastic changes in land use/cover. This paper presents a spatial analysis of land use and water in the peri-urban area of Ciudad Cuauhtémoc. Using geographic modeling and remote sensing, we assessed the dynamics of land use/cover. The results indicate that land change processes occur in a context of high competition for water between different users. This process is not usually considered in studies measuring urban spatial expansion, but should be taken into account to fully understand the effects of urban growth on the territory.

  10. Geospatial analysis of residential proximity to open-pit coal mining areas in relation to micronuclei frequency, particulate matter concentration, and elemental enrichment factors.

    Science.gov (United States)

    Espitia-Pérez, Lyda; Arteaga-Pertuz, Marcia; Soto, José Salvador; Espitia-Pérez, Pedro; Salcedo-Arteaga, Shirley; Pastor-Sierra, Karina; Galeano-Páez, Claudia; Brango, Hugo; da Silva, Juliana; Henriques, João A P

    2018-05-03

During coal surface mining, activities such as drilling, blasting, loading, and transport produce large quantities of particulate matter (PM) that is directly emitted into the atmosphere. Occupational exposure to this PM has been associated with an increase in DNA damage, but there is a scarcity of data examining the impact of these industrial operations on the frequency of cytogenetic endpoints and the cancer risk of potentially exposed surrounding populations. In this study, we used a Geographic Information Systems (GIS) approach and Inverse Distance Weighting (IDW) methods to perform a spatial and statistical analysis of whether exposure to PM2.5 and PM10 pollution, and additional factors, including the enrichment of the PM with inorganic elements, contribute to cytogenetic damage in residents living in proximity to an open-pit coal mining area. Results showed a spatial relationship between exposure to elevated concentrations of PM2.5 and PM10 and micronuclei frequency in binucleated (MNBN) and mononucleated (MNMONO) cells. Active pits, disposal, and storage areas could be identified as the possible emission sources of combustion elements. Mining activities were also correlated with increased atmospheric concentrations of highly enriched elements such as S, Cu, and Cr, corroborating their role in inorganic element pollution around coal mines. Elements enriched in the PM2.5 fraction contributed to increased MNBN but seem more related to increased MNMONO frequencies and DNA damage accumulated in vivo. The combined use of GIS and IDW methods could represent an important tool for monitoring potential cancer risk associated with dynamically distributed variables like PM.
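Inverse Distance Weighting, the interpolation method the study pairs with GIS, estimates a value at an unsampled location as a distance-weighted average of nearby observations. The sketch below assumes invented station coordinates and PM2.5 values; it is not the study's dataset or parameterization.

```python
# Sketch of inverse distance weighting (IDW): nearer observations get
# larger weights (1/d^power), so the estimate is dominated by the
# closest monitoring station.

def idw(points, values, x, y, power=2):
    """Estimate a value at (x, y) from scattered observations."""
    num = den = 0.0
    for (px, py), v in zip(points, values):
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0:
            return v  # exactly at an observation point
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

stations = [(0, 0), (10, 0), (0, 10)]   # invented station coordinates
pm25 = [40.0, 10.0, 10.0]               # invented PM2.5 readings
print(round(idw(stations, pm25, 1, 1), 2))  # dominated by the nearest station
```

The `power` parameter controls how quickly influence decays with distance; larger powers make the surface track the nearest station more tightly.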

  11. Decision Performance Using Spatial Decision Support Systems: A Geospatial Reasoning Ability Perspective

    Science.gov (United States)

    Erskine, Michael A.

    2013-01-01

    As many consumer and business decision makers are utilizing Spatial Decision Support Systems (SDSS), a thorough understanding of how such decisions are made is crucial for the information systems domain. This dissertation presents six chapters encompassing a comprehensive analysis of the impact of geospatial reasoning ability on…

  12. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed, and integrated. Several important research issues must be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid, as an effective distributed computing paradigm, is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer capable of high-performance computation. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid provides the utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic and industry communities. Therefore, we propose an integrated framework that incorporates OWS standards into Grids. Upon this framework, distributed geospatial queries can be performed in an interoperable, high-performance, and secure Grid environment.

  13. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    Science.gov (United States)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in a Discrete Global Grid Systems (DGGS) standard developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query, and analysis. The geometry and location of the cell are the principal aspects of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling.
During
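The hierarchical cell indexing idea can be illustrated with a planar quadtree over longitude/latitude. This is only a sketch of the indexing concept: real DGGS implementations conforming to the OGC standard use equal-area tessellations of the globe, and the `cell_index` function and its digit scheme are invented here.

```python
# Toy DGGS-style index: recursively quarter a lon/lat extent and record
# which quadrant the point falls in at each level, yielding a unique
# hierarchical index string per cell.

def cell_index(lon, lat, depth=5):
    """Return a hierarchical cell index such as '32000' for a point."""
    west, east, south, north = -180.0, 180.0, -90.0, 90.0
    digits = []
    for _ in range(depth):
        mid_lon = (west + east) / 2
        mid_lat = (south + north) / 2
        quad = 0
        if lon >= mid_lon:
            quad += 1
            west = mid_lon
        else:
            east = mid_lon
        if lat >= mid_lat:
            quad += 2
            south = mid_lat
        else:
            north = mid_lat
        digits.append(str(quad))
    return "".join(digits)

# Nearby points share index prefixes, which is what makes hierarchical
# aggregation, decomposition, and query cheap in a DGGS.
print(cell_index(2.35, 48.85), cell_index(2.36, 48.86))
```

Truncating the index to fewer digits yields the enclosing coarser cell, so aggregation across resolutions is a string-prefix operation.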

  14. Geospatial Data as a Service: Towards planetary scale real-time analytics

    Science.gov (United States)

    Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.

    2017-12-01

    analysis capabilities, for dealing with petabyte-scale geospatial data collections.

  15. Classification, (big) data analysis and statistical learning

    CERN Document Server

    Conversano, Claudio; Vichi, Maurizio

    2018-01-01

    This edited book focuses on the latest developments in classification, statistical learning, data analysis and related areas of data science, including statistical analysis of large datasets, big data analytics, time series clustering, integration of data from different sources, as well as social networks. It covers both methodological aspects as well as applications to a wide range of areas such as economics, marketing, education, social sciences, medicine, environmental sciences and the pharmaceutical industry. In addition, it describes the basic features of the software behind the data analysis results, and provides links to the corresponding codes and data sets where necessary. This book is intended for researchers and practitioners who are interested in the latest developments and applications in the field. The peer-reviewed contributions were presented at the 10th Scientific Meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in Santa Margherita di Pul...

  16. The didactic situation in geometry learning based on analysis of learning obstacles and learning trajectory

    Science.gov (United States)

    Sulistyowati, Fitria; Budiyono, Slamet, Isnandar

    2017-12-01

This study aims to design a didactic situation based on the analysis of learning obstacles and learning trajectory on prism volume. This is qualitative and quantitative research with the following steps: analyzing the learning obstacles and learning trajectory, preparing the didactic situation, applying the didactic situation in the classroom, and performing a mean-difference test of problem-solving ability with the t-test statistic. The subjects of the study were 8th-grade junior high school students in Magelang in 2016/2017, selected randomly from eight existing classes. The result of this research is a design of didactic situations that can be implemented in prism volume learning. The effectiveness of the designed didactic situations is shown by the mean-difference test: students' problem-solving ability after the application of the didactic situation was better than before. The resulting didactic situation is expected to be a consideration for teachers in designing lessons that match the character of learners, classrooms, and the teachers themselves, so that the potential thinking of learners can be optimized and the accumulation of learning obstacles avoided.
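The mean-difference test used to compare problem-solving ability before and after the didactic situation can be sketched as below. The scores are invented, and the snippet computes only Welch's t statistic; the degrees of freedom and p-value lookup (and the paper's exact t-test variant) are omitted.

```python
# Sketch of a mean-difference (t) test on invented before/after
# problem-solving scores, using Welch's unequal-variance t statistic.

def welch_t(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (mb - ma) / (va / na + vb / nb) ** 0.5

before = [60, 65, 55, 70, 62]   # invented scores
after = [72, 78, 70, 80, 75]
t = welch_t(before, after)
print(round(t, 2))  # positive t: the mean score is higher after
```

A large positive t relative to the critical value would support the study's conclusion that ability after the didactic situation exceeds ability before it.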

  17. Stakeholder Alignment and Changing Geospatial Information Capabilities

    Science.gov (United States)

    Winter, S.; Cutcher-Gershenfeld, J.; King, J. L.

    2015-12-01

Changing geospatial information capabilities can have major economic and social effects on activities such as drought monitoring, weather forecasting, agricultural productivity projections, water and air quality assessments, and the effects of forestry practices. Whose interests are served by such changes? Two common mistakes are assuming stability in the community of stakeholders and consistency in stakeholder behavior. Stakeholder communities can reconfigure dramatically as some leave the discussion, others enter, and circumstances shift, all resulting in dynamic points of alignment and misalignment. New stakeholders can bring new interests, and existing stakeholders can change their positions. Stakeholders and their interests need to be considered as geospatial information capabilities change, but this is easier said than done. New ways of thinking about stakeholder alignment in light of changes in capability are presented.

  18. Solar Maps | Geospatial Data Science | NREL

    Science.gov (United States)

These solar maps, produced by the NREL Geospatial Data Science team, provide average daily total solar resource information for the United States, including solar resource maps for individual U.S. states.

  19. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    Science.gov (United States)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, and Multicore seem to offer the necessary functionalities to solve important problems in the Earth Science domain: storing, distribution, management, processing, and security of geospatial data; execution of complex processing through task and data parallelism; etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, together with standardized and specialized tools for storing, analyzing, processing, and visualizing the geospatial data concerning this area. To achieve these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, geospatial Web services standardized by the Open Geospatial Consortium (OGC), and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyzes the integration and execution of geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project on different use cases, such as the execution of geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and Multicore architectures [3]. The current

  20. Two Contrasting Approaches to Building High School Teacher Capacity to Teach About Local Climate Change Using Powerful Geospatial Data and Visualization Technology

    Science.gov (United States)

    Zalles, D. R.

    2011-12-01

    The presentation will compare and contrast two different place-based approaches to helping high school science teachers use geospatial data visualization technology to teach about climate change in their local regions. The approaches are being used in the development, piloting, and dissemination of two projects for high school science led by the author: the NASA-funded Data-enhanced Investigations for Climate Change Education (DICCE) and the NSF funded Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE). DICCE is bringing an extensive portal of Earth observation data, the Goddard Interactive Online Visualization and Analysis Infrastructure, to high school classrooms. STORE is making available data for viewing results of a particular IPCC-sanctioned climate change model in relation to recent data about average temperatures, precipitation, and land cover for study areas in central California and western New York State. Across the two projects, partner teachers of academically and ethnically diverse students from five states are participating in professional development and pilot testing. Powerful geospatial data representation technologies are difficult to implement in high school science because of challenges that teachers and students encounter navigating data access and making sense of data characteristics and nomenclature. Hence, on DICCE, the researchers are testing the theory that by providing a scaffolded technology-supported process for instructional design, starting from fundamental questions about the content domain, teachers will make better instructional decisions. Conversely, the STORE approach is rooted in the perspective that co-design of curricular materials among researchers and teacher partners that work off of "starter" lessons covering focal skills and understandings will lead to the most effective utilizations of the technology in the classroom. The projects' goals and strategies for student

  1. Machine learning analysis of binaural rowing sounds

    DEFF Research Database (Denmark)

    Johard, Leonard; Ruffaldi, Emanuele; Hoffmann, Pablo F.

    2011-01-01

Techniques for machine hearing are increasing in potential due to new application domains. In this work we address the analysis of rowing sounds in a natural context for the purpose of supporting a training system based on virtual environments. This paper presents the acquisition methodology and the evaluation of different machine learning techniques for classifying rowing-sound data. We see that a combination of principal component analysis and shallow networks performs equally well as deep architectures, while being much faster to train.
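The shape of the pipeline, dimensionality reduction with principal component analysis followed by a shallow classifier, can be sketched on synthetic data. Everything below is invented: the 4-D "sound feature" vectors are random clusters, and a nearest-centroid rule stands in for the paper's shallow network.

```python
# Hedged sketch: PCA projection (via SVD of centered data) followed by a
# very shallow classifier (nearest centroid) on synthetic 4-D features.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 4)),   # class 0 cluster
               rng.normal(3, 1, (20, 4))])  # class 1 cluster
y = np.array([0] * 20 + [1] * 20)

# PCA: center the data, then project onto the top-2 right singular vectors.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# Shallow classifier: assign each point to the nearest class centroid.
centroids = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[:, None, :] - centroids) ** 2).sum(-1), axis=1)
print((pred == y).mean())
```

On well-separated clusters the reduced 2-D representation preserves the class structure, which mirrors the paper's finding that PCA plus a shallow model can match deeper architectures at far lower training cost.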

  2. The importance of contrastive analysis in foreign language learning ...

    African Journals Online (AJOL)

    The importance of contrastive analysis in foreign language learning with ... In the South African context, knowledge of English plays a significant part, but can ... on in the learning process should result in positive transfer of Zulu while curbing ...

  3. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  4. Economic assessment of the use value of geospatial information

    Science.gov (United States)

    Bernknopf, Richard L.; Shapiro, Carl D.

    2015-01-01

Geospatial data inform decision makers. An economic model that applies spatial and temporal scientific, technical, and economic data to decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private-public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
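The VOI definition above, the difference in present-value net benefits with versus without the information, can be worked through on toy numbers. The cash flows and discount rate below are invented purely to make the arithmetic concrete.

```python
# Worked toy example of value of information (VOI): VOI equals the net
# present value (NPV) of the decision made with the geospatial data minus
# the NPV of the same decision made without it.

def npv(cash_flows, rate):
    """Present value of a cash-flow series, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

rate = 0.05
with_info = [-100, 80, 80, 80]       # better-targeted decision outcomes
without_info = [-100, 60, 60, 60]    # same cost, worse outcomes

voi = npv(with_info, rate) - npv(without_info, rate)
print(round(voi, 2))
```

A positive VOI means the information is worth acquiring at any price below that figure; a decision the information cannot change has a VOI of zero.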

  5. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency, and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on the initially identified predictors, changes to IRB workflow and staffing procedures were instituted, and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.

  6. Analysis of Documents Published in Scopus Database on Foreign Language Learning through Mobile Learning: A Content Analysis

    Science.gov (United States)

    Uzunboylu, Huseyin; Genc, Zeynep

    2017-01-01

The purpose of this study is to determine recent trends in foreign language learning through mobile learning. The study was conducted employing document analysis and content analysis, from among the qualitative research methodologies. Through the search conducted on the Scopus database with the key words "mobile learning and foreign language…

  7. Intelligent data analysis for e-learning enhancing security and trustworthiness in online learning systems

    CERN Document Server

    Miguel, Jorge; Xhafa, Fatos

    2016-01-01

Intelligent Data Analysis for e-Learning: Enhancing Security and Trustworthiness in Online Learning Systems addresses information security within e-Learning based on trustworthiness assessment and prediction. Over the past decade, many learning management systems have appeared in the education market. Security in these systems is essential for protecting against unfair and dishonest conduct, most notably cheating; however, e-Learning services are often designed and implemented without considering security requirements. This book provides functional approaches to trustworthiness analysis, modeling, assessment, and prediction for stronger security and support in online learning, highlighting the security deficiencies found in most online collaborative learning systems. The book explores trustworthiness methodologies based on collective intelligence that can overcome these deficiencies. It examines trustworthiness analysis that utilizes the large amounts of data that learning activities generate. In addition, as proc...

  8. Learning Situations in Nursing Education: A Concept Analysis.

    Science.gov (United States)

    Shahsavari, Hooman; Zare, Zahra; Parsa-Yekta, Zohreh; Griffiths, Pauline; Vaismoradi, Mojtaba

    2018-02-01

    The nursing student requires opportunities to learn within authentic contexts so as to enable safe and competent practice. One strategy to facilitate such learning is the creation of learning situations. A lack of studies on the learning situation in nursing and other health care fields has resulted in insufficient knowledge of the characteristics of the learning situation, its antecedents, and consequences. Nurse educators need to have comprehensive and practical knowledge of the definition and characteristics of the learning situation so as to enable their students to achieve enhanced learning outcomes. The aim of this study was to clarify the concept of the learning situation as it relates to the education of nurses and improve understanding of its characteristics, antecedents, and consequences. The Bonis method of concept analysis, as derived from the Rodgers' evolutionary method, provided the framework for analysis. Data collection and analysis were undertaken in two phases: "interdisciplinary" and "intra-disciplinary." The data source was a search of the literature, encompassing nursing and allied health care professions, published from 1975 to 2016. No agreement on the conceptual phenomenon was discovered in the international literature. The concept of a learning situation was used generally in two ways and thus classified into the themes of: "formal/informal learning situation" and "biologic/nonbiologic learning situation." Antecedents to the creation of a learning situation included personal and environmental factors. The characteristics of a learning situation were described in terms of being complex, dynamic, and offering potential and effective learning opportunities. Consequences of the learning situation included enhancement of the students' learning, professionalization, and socialization into the professional role. The nurse educator, when considering the application of the concept of a learning situation in their educational planning, must

  9. Qualitative-Geospatial Methods of Exploring Person-Place Transactions in Aging Adults: A Scoping Review.

    Science.gov (United States)

    Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri

    2017-06-01

    Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data and assessed whether the geospatial and qualitative methods were supported by a specified methodology, the methods of data analysis, and the extent of integration of geospatial and qualitative methods. Fifteen studies were included and used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Online Resources to Support Professional Development for Managing and Preserving Geospatial Data

    Science.gov (United States)

    Downs, R. R.; Chen, R. S.

    2013-12-01

    Improved capabilities of information and communication technologies (ICT) enable the development of new systems and applications for collecting, managing, disseminating, and using scientific data. New knowledge, skills, and techniques are also being developed to leverage these new ICT capabilities and improve scientific data management practices throughout the entire data lifecycle. In light of these developments and in response to increasing recognition of the wider value of scientific data for society, government agencies are requiring plans for the management, stewardship, and public dissemination of data and research products that are created by government-funded studies. Recognizing that data management and dissemination have not been part of traditional science education programs, new educational programs and learning resources are being developed to prepare new and practicing scientists, data scientists, data managers, and other data professionals with skills in data science and data management. Professional development and training programs also are being developed to address the need for scientists and professionals to improve their expertise in using the tools and techniques for managing and preserving scientific data. The Geospatial Data Preservation Resource Center offers an online catalog of various open access publications, open source tools, and freely available information for the management and stewardship of geospatial data and related resources, such as maps, GIS, and remote sensing data. Containing over 500 resources that can be found by type, topic, or search query, the geopreservation.org website enables discovery of various types of resources to improve capabilities for managing and preserving geospatial data. Applications and software tools can be found for use online or for download. Online journal articles, presentations, reports, blogs, and forums are also available through the website. Available education and training materials include

  11. Providing Geospatial Education and Real World Applications of Data across the Climate Initiative Themes

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Bugbee, K.

    2015-12-01

    Various organizations, such as the Group on Earth Observations (GEO), have developed structures for general thematic areas in Earth science research; the Climate Data Initiative (CDI), however, is addressing the challenging goal of organizing datasets around core themes specifically related to climate change impacts. These thematic areas, which currently include coastal flooding, food resilience, ecosystem vulnerability, water, transportation, energy infrastructure, and human health, form the core of a new college course at the University of Alabama in Huntsville developed around real-world applications in the Earth sciences. The goal of this course is to educate students on the data available and the scope of GIS applications in Earth science across the CDI climate themes. Real-world applications and datasets serve as a pedagogical tool, providing a useful medium for instruction in scientific geospatial analysis and GIS software. With a wide range of potential research areas falling under the rubric of "Earth science," thematic foci can help structure a student's understanding of the potential uses of GIS across sub-disciplines while communicating core data-processing concepts. The learning modules and use-case scenarios for this course demonstrate the potential applications of CDI data to undergraduate and graduate Earth science students.

  12. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    Directory of Open Access Journals (Sweden)

    Shaoming Pan

    Full Text Available Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. However, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data from their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on an analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable placement solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm achieves a total parallel access probability approximately 10-15% higher than those of other algorithms, and that performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
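    The two core steps described above -- building an access correlation matrix from the historical log, then heuristically distributing correlated files across nodes -- can be sketched in Python. The log format, the greedy placement rule, and the node count below are illustrative assumptions, not the authors' exact algorithm.

```python
from collections import defaultdict
from itertools import combinations

def access_correlation_matrix(sessions):
    """Count how often each pair of image files is requested together.

    `sessions` is a list of per-request file-ID lists (an assumed log format).
    """
    corr = defaultdict(int)
    for files in sessions:
        for a, b in combinations(sorted(set(files)), 2):
            corr[(a, b)] += 1
    return corr

def assign_blocks(files, corr, n_nodes):
    """Greedy heuristic: place each file on the node where it has the least
    access correlation with the files already stored there, so frequently
    co-accessed files spread across nodes and can be fetched in parallel."""
    nodes = [[] for _ in range(n_nodes)]
    def pair_corr(a, b):
        return corr.get((min(a, b), max(a, b)), 0)
    for f in files:
        cost = [sum(pair_corr(f, g) for g in node) for node in nodes]
        nodes[cost.index(min(cost))].append(f)
    return nodes

# Example: tiles A and B are almost always co-accessed, so the heuristic
# should put them on different storage nodes.
log = [["A", "B"], ["A", "B", "C"], ["B", "A"], ["C", "D"]]
corr = access_correlation_matrix(log)
layout = assign_blocks(["A", "B", "C", "D"], corr, n_nodes=2)
```

    A copy storage strategy, as the abstract notes, could further replicate the hottest files onto additional nodes on top of this base placement.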

  13. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    Science.gov (United States)

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. However, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data from their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on an analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable placement solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm achieves a total parallel access probability approximately 10-15% higher than those of other algorithms, and that performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.

  14. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Full Text Available Social media is valuable for propagating information during disasters, given its timeliness and availability, and it assists decision making when posts are tagged with locations. Considering the ambiguity and inaccuracy of some social data, additional authoritative data are needed for verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used for disaster analysis only after the fact. Moreover, current works organize the data from the perspective of spatial location rather than from the perspective of the disaster, making it difficult to analyze a disaster dynamically; all of the disaster-related data around the affected locations need to be retrieved. To address these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework that proceeds as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and attributes were extracted from the web using natural language processing (NLP) and used in the semantic similarity matching of geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed within the GEGIS and shows that the system would be effective when typhoons occur.
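    The semantic similarity match in step (2) can be illustrated with a deliberately simplified sketch: plain keyword overlap (Jaccard similarity) stands in here for the paper's ontology-driven matching, and all event terms and catalog entries are hypothetical.

```python
def jaccard(a, b):
    """Set-overlap similarity between two keyword sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_resources(event_terms, resources, threshold=0.2):
    """Rank geospatial resources by keyword similarity to a geo-event.

    `resources` maps a resource ID to its metadata keywords; these term
    sets stand in for concepts matched through the paper's ontology.
    """
    scored = [(rid, jaccard(event_terms, terms))
              for rid, terms in resources.items()]
    return sorted(((r, s) for r, s in scored if s >= threshold),
                  key=lambda x: -x[1])

# Hypothetical typhoon event terms extracted from social media via NLP.
event = {"typhoon", "flood", "wind", "coast"}
catalog = {
    "storm_surge_map": {"typhoon", "coast", "surge"},
    "land_use_2010":   {"land", "use", "parcel"},
    "rain_gauge_feed": {"rain", "flood", "typhoon"},
}
matches = rank_resources(event, catalog)
```

    In the real system the similarity is computed against ontology concepts rather than raw strings, which lets "cyclone" and "typhoon" match even without literal overlap.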

  15. Statistical learning methods in high-energy and astrophysics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Forschungszentrum Juelich GmbH, Zentrallabor fuer Elektronik, 52425 Juelich (Germany) and Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de; Kiesling, C. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2004-11-21

    We discuss several popular statistical learning methods used in high-energy and astrophysics analysis. After a short motivation for statistical learning, we present the most popular algorithms and discuss several examples from current research in particle and astrophysics. The statistical learning methods are compared with each other and with standard methods for the respective application.
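    A minimal illustration of such a comparison, under invented assumptions: on a synthetic two-variable "signal versus background" sample, a perceptron (one of the simplest statistical learning methods) is trained and compared against a standard one-variable cut. Everything here is a toy stand-in for the detector analyses the authors discuss.

```python
import random

random.seed(1)

# Toy sample: the true boundary is the diagonal x + y > 1, which a
# single-variable cut can only partially capture.
events = [(random.random(), random.random()) for _ in range(400)]
labels = [1 if x + y > 1.0 else 0 for x, y in events]

def cut_based(x, y):
    """'Standard' selection: a simple one-variable cut."""
    return 1 if x > 0.5 else 0

def train_perceptron(events, labels, epochs=100, lr=0.1):
    """Train a perceptron on the two features; the learned linear
    boundary can tilt to follow the diagonal."""
    w0 = w1 = b = 0.0
    for _ in range(epochs):
        for (x, y), t in zip(events, labels):
            p = 1 if w0 * x + w1 * y + b > 0 else 0
            w0 += lr * (t - p) * x
            w1 += lr * (t - p) * y
            b += lr * (t - p)
    return lambda x, y: 1 if w0 * x + w1 * y + b > 0 else 0

def accuracy(classify):
    return sum(classify(x, y) == t
               for (x, y), t in zip(events, labels)) / len(events)

learned = train_perceptron(events, labels)
```

    On this sample the learned boundary outperforms the fixed cut, mirroring the kind of method-versus-standard comparison the abstract describes.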

  16. Statistical learning methods in high-energy and astrophysics analysis

    International Nuclear Information System (INIS)

    Zimmermann, J.; Kiesling, C.

    2004-01-01

    We discuss several popular statistical learning methods used in high-energy and astrophysics analysis. After a short motivation for statistical learning, we present the most popular algorithms and discuss several examples from current research in particle and astrophysics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  17. A Comparative Analysis of Three Unique Theories of Organizational Learning

    Science.gov (United States)

    Leavitt, Carol C.

    2011-01-01

    The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories -- experiential learning theory and adaptive-generative learning theory -- represent the thinking of the cognitive perspective, while…

  18. Machine Learning Interface for Medical Image Analysis.

    Science.gov (United States)

    Zhang, Yi C; Kagen, Alexander C

    2017-10-01

    TensorFlow is a second-generation open-source machine learning software library with a built-in framework for implementing neural networks in a wide variety of perceptual tasks. Although TensorFlow usage is well established with computer vision datasets, the TensorFlow interface with DICOM formats for medical imaging remains to be established. Our goal was to extend the TensorFlow API to accept raw DICOM images as input; 1513 DaTscan DICOM images were obtained from the Parkinson's Progression Markers Initiative (PPMI) database. DICOM pixel intensities were extracted and shaped into tensors, or n-dimensional arrays, to populate the training, validation, and test input datasets for machine learning. A simple neural network was constructed in TensorFlow to classify images into normal or Parkinson's disease groups. Training was executed over 1000 iterations for each cross-validation set. The gradient descent and Adagrad optimization algorithms were used to minimize cross-entropy between the predicted and ground-truth labels. Cross-validation was performed ten times to produce a mean accuracy of 0.938 ± 0.047 (95% CI 0.908-0.967). The mean sensitivity was 0.974 ± 0.043 (95% CI 0.947-1.00) and the mean specificity was 0.822 ± 0.207 (95% CI 0.694-0.950). We extended the TensorFlow API to enable DICOM compatibility in the context of DaTscan image analysis. We implemented a neural network classifier that produces diagnostic accuracies on par with excellent results from previous machine learning models. These results indicate the potential role of TensorFlow as a useful adjunct diagnostic tool in the clinical setting.
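    The training step described above can be sketched without the TensorFlow or DICOM dependencies: a logistic classifier minimizing cross-entropy with Adagrad-style per-weight learning rates, run on synthetic stand-ins for DaTscan pixel arrays. All shapes and numbers are illustrative, not the study's actual setup.

```python
import math
import random

random.seed(0)

def fake_scan(disease):
    """Stand-in for a DICOM pixel array: 4 'pixels' whose mean intensity
    differs by class (the real study used 1513 DaTscan images)."""
    base = 0.3 if disease else 0.7
    return [base + random.gauss(0, 0.1) for _ in range(4)]

data = [(fake_scan(d), d) for d in [0, 1] * 100]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_adagrad(data, epochs=30, lr=0.5, eps=1e-8):
    """Minimize cross-entropy with Adagrad: each weight keeps a running
    sum of squared gradients that scales down its learning rate."""
    w, b = [0.0] * 4, 0.0
    gw2, gb2 = [0.0] * 4, 0.0  # accumulated squared gradients
    for _ in range(epochs):
        for x, t in data:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - t  # d(cross-entropy)/d(logit)
            for i in range(4):
                gi = g * x[i]
                gw2[i] += gi * gi
                w[i] -= lr * gi / math.sqrt(gw2[i] + eps)
            gb2 += g * g
            b -= lr * g / math.sqrt(gb2 + eps)
    return w, b

w, b = train_adagrad(data)
acc = sum((sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) > 0.5) == t
          for x, t in data) / len(data)
```

    In the study this role is played by a TensorFlow graph fed with tensors built from DICOM pixel intensities; the cross-entropy objective and Adagrad update are the same in spirit.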

  19. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    Science.gov (United States)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation, and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing, and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization, and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, through scientific tools on MyGeoHub, or even from mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and in educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs, ranging from a feature-rich data management system to complex scientific tools and workflows.

  20. Geospatial Web Services in Real Estate Information System

    Science.gov (United States)

    Radulovic, Aleksandra; Sladic, Dubravka; Govedarica, Miro; Popovic, Dragana; Radovic, Jovana

    2017-12-01

    Since cadastral records are of great importance for the economic development of the country, they must be well structured and organized. Real estate records on the territory of Serbia faced many problems in previous years. To prevent such problems and to achieve efficient access, sharing, and exchange of cadastral data on the principles of interoperability, a domain model for real estate was created according to current standards in the field of spatial data. The resulting profile of the domain model for the Serbian real estate cadastre is based on current legislation and on the Land Administration Domain Model (LADM) specified in the ISO 19152 standard. On top of such organized data, and for their effective exchange, it is necessary to develop a model of the services to be provided by the institutions interested in the exchange of cadastral data. This is achieved by introducing a service-oriented architecture into the information system of the real estate cadastre, which ensures the efficiency of the system. It is necessary to develop user services for the download, review, and use of real estate data through the web. These services should be provided through e-government to all users who need access to cadastral data (natural and legal persons as well as state institutions). It is also necessary to provide search, view, and download of cadastral spatial data by specifying geospatial services. Considering that real estate records contain geometric data for parcels and buildings, it is necessary to establish a set of geospatial services that would provide information and maps for the analysis of spatial data and for forming raster data. Besides the theme Cadastral parcels, the INSPIRE directive specifies several themes involving data on buildings and land use, for which data can be provided from the real estate cadastre. In this paper, a model of geospatial services in Serbia is defined. A case study of using these services to estimate which household is at risk of
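    Search and download services of this kind are typically exposed through standard OGC interfaces. As a sketch, a client could build a WFS 2.0 GetFeature request for parcels inside a bounding box as follows; the endpoint URL and feature-type name are hypothetical, while the request parameters themselves are standard WFS.

```python
from urllib.parse import urlencode

def wfs_getfeature_url(endpoint, type_name, bbox, srs="EPSG:4326"):
    """Build an OGC WFS 2.0 GetFeature request URL for features inside a
    bounding box (the endpoint and layer name below are made up; the
    query parameters follow the WFS 2.0 KVP encoding)."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "srsName": srs,
        # WFS bbox syntax: minx,miny,maxx,maxy,crs
        "bbox": ",".join(str(c) for c in bbox) + "," + srs,
    }
    return endpoint + "?" + urlencode(params)

url = wfs_getfeature_url(
    "https://geoportal.example.rs/wfs",   # hypothetical endpoint
    "cadastre:Parcel",                    # hypothetical feature type
    (19.80, 45.23, 19.90, 45.30),         # lon/lat box around Novi Sad
)
```

    Fetching `url` would return parcel geometries (e.g., as GML or GeoJSON, depending on the server), which is the building block for the view/download services the paper calls for.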

  1. Generation of Multiple Metadata Formats from a Geospatial Data Repository

    Science.gov (United States)

    Hudspeth, W. B.; Benedict, K. K.; Scott, S.

    2012-12-01

    The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general purpose web services based upon the REST service model, and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. title, abstract, publication data, geographic extent, projection information, and database elements), but also the processing steps used to
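    The multi-format delivery described above can be illustrated with a minimal sketch: one internal record rendered into two simplified XML layouts. The element names echo FGDC CSDGM and ISO 19139 but the trees below are illustrative only and nowhere near schema-valid.

```python
import xml.etree.ElementTree as ET

record = {  # minimal internal representation of one dataset's metadata
    "title": "MODIS Land Cover, New Mexico",
    "abstract": "Annual land-cover classification.",
    "west": "-109.05", "east": "-103.00",
}

def to_fgdc_like(rec):
    """Render a simplified FGDC-CSDGM-style tree (a real export must
    follow the full schema, with many more mandatory elements)."""
    root = ET.Element("metadata")
    idinfo = ET.SubElement(root, "idinfo")
    ET.SubElement(idinfo, "title").text = rec["title"]
    ET.SubElement(idinfo, "abstract").text = rec["abstract"]
    bounding = ET.SubElement(idinfo, "bounding")
    ET.SubElement(bounding, "westbc").text = rec["west"]
    ET.SubElement(bounding, "eastbc").text = rec["east"]
    return root

def to_iso_like(rec):
    """Render a simplified ISO-19139-style tree from the same record."""
    root = ET.Element("MD_Metadata")
    ident = ET.SubElement(root, "identificationInfo")
    ET.SubElement(ident, "title").text = rec["title"]
    ET.SubElement(ident, "abstract").text = rec["abstract"]
    extent = ET.SubElement(ident, "geographicElement")
    ET.SubElement(extent, "westBoundLongitude").text = rec["west"]
    ET.SubElement(extent, "eastBoundLongitude").text = rec["east"]
    return root

fgdc_xml = ET.tostring(to_fgdc_like(record), encoding="unicode")
iso_xml = ET.tostring(to_iso_like(record), encoding="unicode")
```

    The design point is the same as Gstore's: keep one canonical record and generate each standard's serialization on demand, rather than maintaining parallel copies.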

  2. Development of Geospatial Map Based Election Portal

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.

    2014-11-01

    Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of the National Capital Territory of Delhi (GNCTD) and its organs, such as DDA, MCD, DJB, the State Election Department, and DMRC, for the benefit of all citizens. This paper describes the development of the Geospatial Map based Election Portal (GMEP) of the NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for the planning and management of the Department of the Chief Electoral Officer, and as an election-related information search tool (polling station, assembly and parliamentary constituency, etc.) for the citizens of the NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end in a Microsoft Windows environment. The GMEP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMEP include delimited precinct boundaries of the voter areas of polling stations, assembly constituencies, parliamentary constituencies, election districts, landmark locations of polling stations, and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMEP helps achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for the management of elections. It enables a faster response to changing ground realities in development planning, owing to its built-in scientific approach and open-ended design.

  3. Streamlining geospatial metadata in the Semantic Web

    Science.gov (United States)

    Fugazza, Cristiano; Pepe, Monica; Oggioni, Alessandro; Tagliolato, Paolo; Carrara, Paola

    2016-04-01

    In the geospatial realm, data annotation and discovery rely on a number of ad hoc formats and protocols. These were created to enable domain-specific use cases for which generalized search is not feasible. Metadata are at the heart of the discovery process; nevertheless, they are often neglected or encoded in formats that either are not aimed at efficient retrieval of resources or are plainly outdated. In particular, the quantum leap represented by the Linked Open Data (LOD) movement has so far not induced a consistent, interlinked baseline in the geospatial domain. In a nutshell, datasets, the scientific literature related to them, and ultimately the researchers behind these products are only loosely connected; the corresponding metadata are intelligible only to humans, duplicated on different systems, and seldom consistent. Instead, our workflow for metadata management envisages i) editing via customizable web-based forms, ii) encoding of records in any XML application profile, iii) translation into RDF (involving the semantic lift of metadata records), and finally iv) storage of the metadata as RDF and back-translation into the original XML format with added semantics-aware features. Phase iii) hinges on relating resource metadata to RDF data structures that represent keywords from code lists and controlled vocabularies, toponyms, researchers, institutes, and virtually any description one can retrieve (or directly publish) in the LOD Cloud. In the context of a distributed Spatial Data Infrastructure (SDI) built on free and open-source software, we detail phases iii) and iv) of our workflow for the semantics-aware management of geospatial metadata.
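    Phase iii), the semantic lift, can be sketched as replacing free-text keywords with vocabulary URIs while emitting RDF-style triples. The vocabulary and dataset URIs below are hypothetical placeholders for real LOD resources (in practice one would use concept URIs from a published thesaurus and an RDF library).

```python
# Hypothetical URIs standing in for published LOD vocabulary concepts.
VOCAB = {
    "hydrology": "http://vocab.example.org/concept/hydrology",
    "lake": "http://vocab.example.org/concept/lake",
}
DCT = "http://purl.org/dc/terms/"  # Dublin Core terms namespace

def semantic_lift(dataset_uri, title, keywords):
    """Turn a flat metadata record into (subject, predicate, object)
    triples, lifting free-text keywords to vocabulary URIs when known;
    unknown keywords remain plain literals."""
    triples = [(dataset_uri, DCT + "title", title)]
    for kw in keywords:
        obj = VOCAB.get(kw.lower(), kw)
        triples.append((dataset_uri, DCT + "subject", obj))
    return triples

triples = semantic_lift(
    "http://data.example.org/dataset/42",
    "Lake monitoring 2015",
    ["Hydrology", "lake", "buoy"],
)
```

    Because lifted keywords are now URIs, they interlink with any other LOD resource using the same concepts, which is precisely the baseline the abstract finds missing in the geospatial domain.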

  4. Assessing and Valuing Historical Geospatial Data for Decisions

    Science.gov (United States)

    Sylak-Glassman, E.; Gallo, J.

    2016-12-01

    We will present a method for assessing the use and valuation of historical geospatial data and information products derived from Earth observations (EO). Historical data are widely used in the establishment of baseline reference cases, time-series analysis, and Earth system modeling. Historical geospatial data are used in diverse application areas, such as risk assessment in the insurance and reinsurance industry, disaster preparedness and response planning, historical demography, land-use change analysis, and paleoclimate research, among others. Establishing the current value of previously collected data, often from EO systems that are no longer operating, is difficult since the costs associated with their preservation, maintenance, and dissemination are current, while the costs associated with their original collection are sunk. Understanding their current use and value can aid in funding decisions about the data management infrastructure and workforce allocation required to maintain their availability. Using a value-tree framework to trace the application of data from EO systems, sensors, networks, and surveys to weighted key Federal objectives, we are able to estimate the relative contribution of individual EO systems, sensors, networks, and surveys to meeting those objectives. The analysis relies on a modified Delphi method to elicit relative levels of reliance on individual EO data inputs, including historical data, from subject matter experts. This results in the identification of a representative portfolio of all EO data used to meet key Federal objectives. Because historical data are collected in conjunction with all other EO data within a weighted framework, their contribution to meeting key Federal objectives can be specifically identified and evaluated in relation to other EO data. The results of this method could be applied to better understand and project the long-term value of data from current and future EO systems.
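    The roll-up at the heart of the value-tree framework can be sketched with toy numbers: expert-elicited reliance shares are weighted by objective weights to yield each data source's relative contribution. All figures below are invented for illustration.

```python
# Illustrative weights: in practice both tables come from the modified
# Delphi elicitation described in the abstract.
objective_weights = {"disaster_response": 0.6, "climate_research": 0.4}

# reliance[objective][source]: relative reliance on each data source
# (each row sums to 1), including an archive from a retired sensor.
reliance = {
    "disaster_response": {"current_sat": 0.8, "historical_archive": 0.2},
    "climate_research":  {"current_sat": 0.2, "historical_archive": 0.8},
}

def system_contributions(weights, reliance):
    """Roll reliance scores up the value tree: a source's contribution is
    its reliance share in each objective, weighted by that objective's
    weight, summed over objectives."""
    contrib = {}
    for obj, w in weights.items():
        for source, share in reliance[obj].items():
            contrib[source] = contrib.get(source, 0.0) + w * share
    return contrib

contrib = system_contributions(objective_weights, reliance)
```

    Even though the archive comes from a non-operating system, its weighted contribution is directly comparable to that of current systems, which is what makes the framework useful for preservation-funding decisions.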

  5. Intelligence, mapping, and geospatial exploitation system (IMAGES)

    Science.gov (United States)

    Moellman, Dennis E.; Cain, Joel M.

    1998-08-01

    This paper provides further detail on one facet of the battlespace visualization concept described in last year's paper, Battlespace Situation Awareness for Force XXI. It focuses on the National Imagery and Mapping Agency (NIMA) goal to 'provide customers seamless access to tailorable imagery, imagery intelligence, and geospatial information.' This paper describes the Intelligence, Mapping, and Geospatial Exploitation System (IMAGES), an exploitation element capable of CONUS baseplant operations or field deployment to provide NIMA geospatial information collaboratively into a reconnaissance, surveillance, and target acquisition (RSTA) environment through the United States Imagery and Geospatial Information System (USIGS). In a baseplant CONUS setting, IMAGES could be used to produce foundation data to support mission planning. In the field, it could be directly associated with a tactical sensor receiver or ground station (e.g., UAV or UGV) to provide near-real-time and mission-specific RSTA to support mission execution. This paper provides the IMAGES functional-level design; describes the technologies, their interactions, and their interdependencies; and presents a notional operational scenario to illustrate the system's flexibility. Using as a system backbone an intelligent software agent technology called the Open Agent Architecture (OAA), IMAGES combines multimodal data entry, natural language understanding, and perceptual and evidential reasoning for system management. Configured to be DII COE compliant, it would utilize, to the extent possible, COTS application software for data management, processing, fusion, exploitation, and reporting. It would also be modular, scalable, and reconfigurable. This paper describes how the OAA achieves data synchronization and enables the necessary level of information to be rapidly available to various command echelons for making informed decisions. The reasoning component will provide for the best information to be developed in the timeline

  6. Researches and Analysis on Middle School Students’ English Learning Motivation

    Institute of Scientific and Technical Information of China (English)

    陈虹; 韩小乐

    2008-01-01

    This thesis discusses the relations among English learning motivation, learning strategies, and study efficiency in the Chinese context by reviewing Chinese and overseas research on English learning motivation and analyzing its explanations, characteristics, and questionnaire results. Several suggestions on how to stimulate and foster students' English learning motivation are given through an analysis of the existing problems in students' English study. I expect these suggestions will help invigorate English teaching in schools.

  7. Leveraging geospatial data, technology, and methods for improving the health of communities: priorities and strategies from an expert panel convened by the CDC.

    Science.gov (United States)

    Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L

    2010-04-01

    In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research, and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training, and education; and engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur in parallel or recur across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C of the action item plan concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.

  8. Geospatial Services in Special Libraries: A Needs Assessment Perspective

    Science.gov (United States)

    Barnes, Ilana

    2013-01-01

    Once limited to geographers and mapmakers, Geographic Information Systems (GIS) have taken an increasingly central role in information management and visualization. Geospatial services run the gamut of products and services, from Google Maps to ArcGIS servers to mobile development. Geospatial services are not new. Libraries have been writing about…

  9. Geospatial cryptography: enabling researchers to access private, spatially referenced, human subjects data for cancer control and prevention.

    Science.gov (United States)

    Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre

    2017-07-01

    As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. 
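
The de-duplication use case can be illustrated with a keyed-hash sketch. This is not the authors' protocol, only a minimal illustration of the underlying idea: each registry derives a non-reversible token from identifying fields using a shared secret key, so records can be matched across registries without exposing the identifiers themselves. The field names and key below are hypothetical.

```python
import hmac
import hashlib

def tokenize(record, key):
    """Derive a comparable, non-reversible token from identifying fields."""
    # Normalize identifying fields so trivial formatting differences do not
    # break matching, then apply a keyed hash (HMAC-SHA256).
    ident = "|".join(record[f].strip().lower() for f in ("name", "dob", "zip"))
    return hmac.new(key, ident.encode(), hashlib.sha256).hexdigest()

key = b"shared-secret-between-registries"  # hypothetical shared key

registry_a = [{"name": "Jane Doe", "dob": "1970-01-02", "zip": "48109"}]
registry_b = [{"name": "JANE DOE ", "dob": "1970-01-02", "zip": "48109"},
              {"name": "John Roe", "dob": "1965-05-06", "zip": "48104"}]

# Registries exchange only tokens, never the identifying fields.
tokens_a = {tokenize(r, key) for r in registry_a}
duplicates = [r for r in registry_b if tokenize(r, key) in tokens_a]
```

A real secure multi-party protocol adds key management and protection against dictionary attacks; the sketch only shows why keyed tokens, unlike plain hashes, cannot be recomputed by a party without the key.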

  10. The geospatial data quality REST API for primary biodiversity data.

    Science.gov (United States)

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under the GPL license at http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
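
The kind of completeness and consistency checks such a service performs can be sketched locally. The specific rules below are illustrative assumptions, not the API's actual logic; the field names follow the Darwin Core terms commonly used for biodiversity records.

```python
def geospatial_issues(record):
    """Return a list of basic completeness/consistency issues for one record."""
    issues = []
    lat = record.get("decimalLatitude")
    lon = record.get("decimalLongitude")
    if lat is None or lon is None:
        # Completeness check: coordinates must be present at all.
        issues.append("missing coordinates")
        return issues
    # Consistency checks: coordinates must fall in valid ranges.
    if not -90 <= lat <= 90:
        issues.append("latitude out of range")
    if not -180 <= lon <= 180:
        issues.append("longitude out of range")
    # (0, 0) is a common placeholder for unknown locations.
    if lat == 0 and lon == 0:
        issues.append("suspect (0, 0) coordinates")
    return issues

records = [
    {"decimalLatitude": 40.1, "decimalLongitude": -105.3},
    {"decimalLatitude": 95.0, "decimalLongitude": 10.0},
    {"decimalLatitude": 0.0, "decimalLongitude": 0.0},
]
reports = [geospatial_issues(r) for r in records]
```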

  11. Machine learning approaches in medical image analysis

    DEFF Research Database (Denmark)

    de Bruijne, Marleen

    2016-01-01

    Machine learning approaches are increasingly successful in image-based diagnosis, disease prognosis, and risk assessment. This paper highlights new research directions and discusses three main challenges related to machine learning in medical imaging: coping with variation in imaging protocols......, learning from weak labels, and interpretation and evaluation of results....

  12. Forensic Learning Disability Nursing Role Analysis

    Science.gov (United States)

    Mason, Tom; Phipps, Dianne; Melling, Kat

    2011-01-01

    This article reports on a study carried out on the role constructs of forensic and nonforensic Learning Disability Nursing in relation to six binary themes. The aims were to identify if there were differences in perceptions of forensic learning disability nurses and nonforensic learning disability nurses in relation to the six binary themes of the…

  13. Data Democracy and Decision Making: Enhancing the Use and Value of Geospatial Data and Scientific Information

    Science.gov (United States)

    Shapiro, C. D.

    2014-12-01

Data democracy is a concept that has great relevance to the use and value of geospatial data and scientific information. Data democracy describes a world in which data and information are widely and broadly accessible, understandable, and useable. The concept operationalizes the public good nature of scientific information and provides a framework for increasing benefits from its use. Data democracy encompasses efforts to increase accessibility to geospatial data and to expand participation in its collection, analysis, and application. These two pillars are analogous to demand and supply relationships. Improved accessibility, or demand, includes increased knowledge about geospatial data and low barriers to retrieval and use. Expanded participation, or supply, encompasses a broader community involved in developing geospatial data and scientific information. This pillar of data democracy is characterized by methods such as citizen science or crowd sourcing. A framework is developed for advancing the use of data democracy. This includes efforts to assess the societal benefits (economic and social) of scientific information. This knowledge is critical to continued monitoring of the effectiveness of data democracy implementation and of potential impact on the use and value of scientific information. The framework also includes an assessment of opportunities for advancing data democracy on both the supply and demand sides. These opportunities include relatively inexpensive efforts to reduce barriers to use, as well as the identification of situations in which participation in scientific efforts can be expanded to enhance the breadth of involvement and to reach non-traditional communities. This framework provides an initial perspective on ways to expand the "scientific community" of data users and providers. It also describes a way forward for enhancing the societal benefits from geospatial data and scientific information.

  14. Geospatial Image Stream Processing: Models, techniques, and applications in remote sensing change detection

    Science.gov (United States)

    Rueda-Velasquez, Carlos Alberto

    Detection of changes in environmental phenomena using remotely sensed data is a major requirement in the Earth sciences, especially in natural disaster related scenarios where real-time detection plays a crucial role in the saving of human lives and the preservation of natural resources. Although various approaches formulated to model multidimensional data can in principle be applied to the inherent complexity of remotely sensed geospatial data, there are still challenging peculiarities that demand a precise characterization in the context of change detection, particularly in scenarios of fast changes. In the same vein, geospatial image streams do not fit appropriately in the standard Data Stream Management System (DSMS) approach because these systems mainly deal with tuple-based streams. Recognizing the necessity for a systematic effort to address the above issues, the work presented in this thesis is a concrete step toward the foundation and construction of an integrated Geospatial Image Stream Processing framework, GISP. First, we present a data and metadata model for remotely sensed image streams. We introduce a precise characterization of images and image streams in the context of remotely sensed geospatial data. On this foundation, we define spatially-aware temporal operators with a consistent semantics for change analysis tasks. We address the change detection problem in settings where multiple image stream sources are available, and thus we introduce an architectural design for the processing of geospatial image streams from multiple sources. With the aim of targeting collaborative scientific environments, we construct a realization of our architecture based on Kepler, a robust and widely used scientific workflow management system, as the underlying computational support; and open data and Web interface standards, as a means to facilitate the interoperability of GISP instances with other processing infrastructures and client applications. 

  15. Geospatial Brokering - Challenges and Future Directions

    Science.gov (United States)

    White, C. E.

    2012-12-01

    An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols - e.g., OGC CSW, REST, and OpenSearch - to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight to future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.

  16. Geospatial metadata retrieval from web services

    Directory of Open Access Journals (Sweden)

    Ivanildo Barbosa

    Full Text Available Nowadays, producers of geospatial data in either raster or vector formats are able to make them available on the World Wide Web by deploying web services that enable users to access and query those contents even without specific geoprocessing software. Several providers around the world have deployed instances of WMS (Web Map Service), WFS (Web Feature Service) and WCS (Web Coverage Service), all of them specified by the Open Geospatial Consortium (OGC). In consequence, metadata about the available contents can be retrieved and compared with similar offline datasets from other sources. This paper presents a brief summary of, and describes the matching process between, the specifications for OGC web services (WMS, WFS and WCS) and the metadata specifications required by ISO 19115, adopted as the reference for several national metadata profiles, including the Brazilian one. The process focuses on retrieving metadata for the identification and data quality packages, and indicates directions for retrieving metadata related to other packages. Users are thus able to assess whether the provided contents fit their purposes.
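
The capabilities-to-metadata matching described above can be sketched by parsing a service capabilities fragment and mapping layer elements onto ISO 19115-style identification fields. The XML below is a deliberately simplified, hypothetical fragment (real WMS responses are namespaced and far larger), and the output field names are illustrative.

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical GetCapabilities fragment.
capabilities = """
<WMS_Capabilities>
  <Capability>
    <Layer>
      <Name>rivers</Name>
      <Title>Hydrography - Rivers</Title>
      <Abstract>Main river network.</Abstract>
    </Layer>
  </Capability>
</WMS_Capabilities>
"""

root = ET.fromstring(capabilities)
metadata = []
for layer in root.iter("Layer"):
    # Map WMS layer elements onto ISO 19115-like identification fields.
    metadata.append({
        "identifier": layer.findtext("Name"),
        "title": layer.findtext("Title"),
        "abstract": layer.findtext("Abstract"),
    })
```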

  17. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    Science.gov (United States)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    This paper presents the background, architectural design, and activities of GeoCloud in support of the Geospatial Platform Initiative. System security strategies and approval processes for migrating federal geospatial data, information, and applications into the cloud, together with cost estimation for cloud operations, are covered. Finally, some lessons learned from the GeoCloud project are discussed as a reference for geoscientists to consider in the adoption of cloud computing.

  18. Activity-Based Intelligence: predicting the future by observing the present with Hexagon Geospatial tools

    Directory of Open Access Journals (Sweden)

    Massimo Zotti

    2015-06-01

    Full Text Available Intelligence about human activities on the earth's surface, obtained through the analysis of earth observation data and other geospatial information, is vital for the planning and execution of any military action, peacekeeping operation or humanitarian emergency response. The success of these actions largely depends on the ability to analyze timely data from multiple sources. However, the proliferation of new intelligence sources in a geospatial big data scenario increasingly complicates the analysis of such activities by human analysts. Modern technologies address these problems by enabling Activity-Based Intelligence, a methodology that improves the efficiency and timeliness of intelligence through the analysis of historical, current and future activity, in order to identify patterns, trends and relationships hidden in large data collections from different sources.

  19. Combining Formal Logic and Machine Learning for Sentiment Analysis

    DEFF Research Database (Denmark)

    Petersen, Niklas Christoffer; Villadsen, Jørgen

    2014-01-01

    This paper presents a formal logical method for deep structural analysis of the syntactical properties of texts using machine learning techniques for efficient syntactical tagging. To evaluate the method it is used for entity level sentiment analysis as an alternative to pure machine learning...

  20. Confidence-Based Learning in Investment Analysis

    Science.gov (United States)

    Serradell-Lopez, Enric; Lara-Navarra, Pablo; Castillo-Merino, David; González-González, Inés

    The aim of this study is to determine the effectiveness of using multiple-choice tests in subjects related to business administration and management. To this end, we used a multiple-choice test with specific questions to verify the extent of knowledge gained and the confidence and trust in the answers. The tests were administered to a group of 200 students in the bachelor's degree programme in Business Administration and Management. The analysis was carried out in one subject within the scope of investment analysis, measuring the level of knowledge gained and the degree of trust and confidence in the responses at two different points in the course. The measurements took into account different levels of difficulty in the questions asked and the time spent by students completing the test. The results confirm that students are generally able to gain knowledge along the way and show increases in the degree of trust and confidence in their answers. It is also confirmed that the difficulty levels of the questions, set a priori by those responsible for the subjects, are related to the levels of confidence in the answers. It is estimated that the improvement in the skills learned is viewed favourably by businesses and is especially important for the job placement of students.

  1. Integrated Sustainable Planning for Industrial Region Using Geospatial Technology

    Science.gov (United States)

    Tiwari, Manish K.; Saxena, Aruna; Katare, Vivek

    2012-07-01

    Geospatial techniques and their scope of application have undergone an order-of-magnitude change since their advent, and they are now universally accepted as important, modern tools for mapping and monitoring natural resources as well as amenities and infrastructure. The huge, voluminous spatial databases generated by various remote sensing platforms need proper management (storage, retrieval, manipulation and analysis) to extract the desired information, which is beyond the capability of the human brain; this is where computer-aided GIS technology came into existence. A GIS with major input from remote sensing satellites for natural resource management applications must be able to handle spatiotemporal data, supporting spatiotemporal queries and other spatial operations. Software and computer-based tools are designed to make things easier for the user and to improve the efficiency and quality of information-processing tasks. Natural resources are a common heritage that we have shared with past generations, and our future generations will inherit these resources from us. Our greed for resources and our tremendous technological capacity to exploit them at a much larger scale have created a situation in which we have started withdrawing from future stocks. The Bhopal capital region attracted the attention of planners from the beginning of the five-year-plan strategy for industrial development. A number of projects were carried out in the individual districts (Bhopal, Rajgarh, Shajapur, Raisen, Sehore) and gave fruitful results, but no serious effort has been made to involve the entire region, and no use has been made of the latest geospatial techniques (remote sensing, GIS, GPS) to prepare a well-structured computerized database, without which it is very difficult to retrieve, analyze and compare data for monitoring as well as for planning future developmental activities.

  2. A Spatial Data Infrastructure Integrating Multisource Heterogeneous Geospatial Data and Time Series: A Study Case in Agriculture

    Directory of Open Access Journals (Sweden)

    Gloria Bordogna

    2016-05-01

    Full Text Available Currently, the best practice to support land planning calls for the development of Spatial Data Infrastructures (SDI capable of integrating both geospatial datasets and time series information from multiple sources, e.g., multitemporal satellite data and Volunteered Geographic Information (VGI. This paper describes an original OGC standard interoperable SDI architecture and a geospatial data and metadata workflow for creating and managing multisource heterogeneous geospatial datasets and time series, and discusses it in the framework of the Space4Agri project study case developed to support the agricultural sector in Lombardy region, Northern Italy. The main novel contributions go beyond the application domain for which the SDI has been developed and are the following: the ingestion within an a-centric SDI, potentially distributed in several nodes on the Internet to support scalability, of products derived by processing remote sensing images, authoritative data, georeferenced in-situ measurements and voluntary information (VGI created by farmers and agronomists using an original Smart App; the workflow automation for publishing sets and time series of heterogeneous multisource geospatial data and relative web services; and, finally, the project geoportal, that can ease the analysis of the geospatial datasets and time series by providing complex intelligent spatio-temporal query and answering facilities.

  3. Investigating Climate Change Issues With Web-Based Geospatial Inquiry Activities

    Science.gov (United States)

    Dempsey, C.; Bodzin, A. M.; Sahagian, D. L.; Anastasio, D. J.; Peffer, T.; Cirucci, L.

    2011-12-01

    In the Environmental Literacy and Inquiry middle school Climate Change curriculum we focus on essential climate literacy principles with an emphasis on weather and climate, Earth system energy balance, greenhouse gases, paleoclimatology, and how human activities influence climate change (http://www.ei.lehigh.edu/eli/cc/). It incorporates a framework and a related set of design principles to guide the development of the geospatial technology-integrated Earth and environmental science curriculum materials. Students use virtual globes, Web-based tools including an interactive carbon calculator and geologic timeline, and inquiry-based lab activities to investigate climate change topics. The curriculum includes educative curriculum materials that are designed to promote and support teachers' learning of important climate change content and issues, geospatial pedagogical content knowledge, and geographic spatial thinking. The curriculum includes baseline instructional guidance for teachers and provides implementation and adaptation guidance for teaching with diverse learners, including low-level readers, English language learners and students with disabilities. In the curriculum, students use geospatial technology tools including Google Earth with embedded spatial data to investigate global temperature changes, areas affected by climate change, evidence of climate change, and the effects of sea level rise on the existing landscape. We conducted a design-based research implementation study with urban middle school students. Findings showed that use of the Climate Change curriculum led to significant improvement in urban middle school students' understanding of climate change concepts.

  4. Infusion of Climate Change and Geospatial Science Concepts into Environmental and Biological Science Curriculum

    Science.gov (United States)

    Balaji Bhaskar, M. S.; Rosenzweig, J.; Shishodia, S.

    2017-12-01

    The objective of our activity is to improve students' understanding and interpretation of geospatial science and climate change concepts and their applications in the field of Environmental and Biological Sciences in the College of Science, Engineering and Technology (COEST) at Texas Southern University (TSU) in Houston, TX. The courses GIS for Environment, Ecology and Microbiology were selected for the curriculum infusion. A total of ten GIS hands-on lab modules, along with two NCAR (National Center for Atmospheric Research) lab modules on climate change, were implemented in the "GIS for Environment" course. GIS and Google Earth labs along with climate change lectures were infused into the Microbiology and Ecology courses. Critical thinking and empirical skills of the students were assessed in all the courses. The student learning outcomes of these courses include the ability of students to interpret geospatial maps and students' demonstrated knowledge of the basic principles and concepts of GIS (Geographic Information Systems) and climate change. By the end of the courses, students had developed a comprehensive understanding of geospatial data, its applications to understanding climate change, and its interpretation at local and regional scales across multiple years.

  5. Renewable electricity generation in India—A learning rate analysis

    International Nuclear Information System (INIS)

    Partridge, Ian

    2013-01-01

    The cost of electricity generation using renewable technologies is widely assumed to be higher than the cost for conventional generation technologies, but likely to fall with growing experience of the technologies concerned. This paper tests the second part of that statement using learning rate analysis, based on large samples of wind and small hydro projects in India, and projects likely changes in these costs through 2020. It is the first study of learning rates for renewable generation technologies in India, and only the second in any developing country—it provides valuable input to the development of Indian energy policy and will be relevant to policy makers in other developing countries. The paper considers some potential problems with learning rate analysis raised by Nordhaus (2009. The Perils of the Learning Model for Modeling Endogenous Technological Change. National Bureau of Economic Research Working Paper Series No. 14638). By taking account of these issues, it is possible both to improve the models used for making cost projections and to examine the potential impact of remaining forecasting problems. - Highlights: • The first learning rate analysis of wind generation costs in India. • Only the second learning rate analysis for wind in any developing country. • Reviews missing variable and related issues in learning rate analysis. • Finds a 17.7% learning rate for wind generation costs in India. • Finds no significant learning effect for small hydro
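
The single-factor learning curve underlying such an analysis relates unit cost to cumulative installed capacity, C(q) = C0 * q^(-b), where the learning rate LR = 1 - 2^(-b) is the fractional cost reduction per doubling of capacity. A short sketch using the 17.7% rate reported above for Indian wind (the projection horizon is illustrative):

```python
import math

LR = 0.177                  # learning rate reported for wind generation in India
b = -math.log2(1 - LR)      # learning exponent, from LR = 1 - 2**(-b)

def relative_cost(doublings):
    """Cost relative to today's cost after a number of capacity doublings."""
    # Each doubling multiplies cost by the progress ratio (1 - LR).
    return (1 - LR) ** doublings

# After two doublings of cumulative capacity, cost falls to about 68% of
# today's level under the single-factor model.
projection = relative_cost(2)
```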

  6. Ensemble Learning or Deep Learning? Application to Default Risk Analysis

    Directory of Open Access Journals (Sweden)

    Shigeyuki Hamori

    2018-03-01

    Full Text Available Proper credit-risk management is essential for lending institutions, as substantial losses can be incurred when borrowers default. Consequently, statistical methods that can measure and analyze credit risk objectively are becoming increasingly important. This study analyzes default payment data and compares the prediction accuracy and classification ability of three ensemble-learning methods—specifically, bagging, random forest, and boosting—with those of various neural-network methods, each of which has a different activation function. The results obtained indicate that the classification ability of boosting is superior to other machine-learning methods including neural networks. It is also found that the performance of neural-network models depends on the choice of activation function, the number of middle layers, and the inclusion of dropout.
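
Bagging, one of the ensemble methods compared above, can be sketched with the standard library alone: bootstrap-resample the training data, fit a weak learner (here a one-threshold stump) on each resample, and aggregate predictions by majority vote. This is a minimal illustrative sketch on toy one-dimensional data, not the study's implementation.

```python
import random
from collections import Counter

def train_stump(data):
    """Pick the threshold t minimizing training error for 'predict 1 if x >= t'."""
    best_t, best_err = None, None
    for t in sorted({x for x, _ in data}):
        err = sum(1 for x, y in data if (1 if x >= t else 0) != y)
        if best_err is None or err < best_err:
            best_t, best_err = t, err
    return best_t

def bagging_fit(data, n_models=25, seed=0):
    """Fit one stump per bootstrap resample of the training data."""
    rng = random.Random(seed)
    return [train_stump([rng.choice(data) for _ in data])
            for _ in range(n_models)]

def bagging_predict(stumps, x):
    """Aggregate the stumps' votes by simple majority."""
    votes = Counter(1 if x >= t else 0 for t in stumps)
    return votes.most_common(1)[0][0]

# Toy data: class 0 on the left, class 1 on the right.
data = [(1, 0), (2, 0), (3, 0), (6, 1), (7, 1), (8, 1)]
stumps = bagging_fit(data)
# Points far from the class boundary classify reliably.
left, right = bagging_predict(stumps, 0), bagging_predict(stumps, 9)
```

Random forest extends this scheme with per-split feature subsampling, while boosting reweights the data sequentially instead of resampling it independently.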

  7. A Javascript GIS Platform Based on Invocable Geospatial Web Services

    Directory of Open Access Journals (Sweden)

    Konstantinos Evangelidis

    2018-04-01

    Full Text Available Semantic Web technologies are being increasingly adopted by the geospatial community during last decade through the utilization of open standards for expressing and serving geospatial data. This was also dramatically assisted by the ever-increasing access and usage of geographic mapping and location-based services via smart devices in people’s daily activities. In this paper, we explore the developmental framework of a pure JavaScript client-side GIS platform exclusively based on invocable geospatial Web services. We also extend JavaScript utilization on the server side by deploying a node server acting as a bridge between open source WPS libraries and popular geoprocessing engines. The vehicle for such an exploration is a cross platform Web browser capable of interpreting JavaScript commands to achieve interaction with geospatial providers. The tool is a generic Web interface providing capabilities of acquiring spatial datasets, composing layouts and applying geospatial processes. In an ideal form the end-user will have to identify those services, which satisfy a geo-related need and put them in the appropriate row. The final output may act as a potential collector of freely available geospatial web services. Its server-side components may exploit geospatial processing suppliers composing that way a light-weight fully transparent open Web GIS platform.

  8. Affect and Learning : a computational analysis

    NARCIS (Netherlands)

    Broekens, Douwe Joost

    2007-01-01

    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation.

  9. Searches over graphs representing geospatial-temporal remote sensing data

    Science.gov (United States)

    Brost, Randolph; Perkins, David Nikolaus

    2018-03-06

    Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Geospatial-temporal graph searches are made computationally efficient by taking advantage of characteristics of geospatial-temporal data in remote sensing images through the application of various graph search techniques.
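
The graph construction described above can be sketched with plain dictionaries: objects become nodes keyed by (object, time), undirected edges encode spatial adjacency within one image, and directed edges link an object to its successor in the next image. The object names and the one-hop query are illustrative assumptions, not the patented search techniques.

```python
from collections import defaultdict

# Nodes are (object_id, time) pairs. Undirected spatial edges connect objects
# within one time step; directed temporal edges point to the next time step.
spatial = defaultdict(set)
temporal = defaultdict(set)

def add_spatial(a, b, t):
    """Undirected adjacency edge between objects a and b at time t."""
    spatial[(a, t)].add((b, t))
    spatial[(b, t)].add((a, t))

def add_temporal(obj, t):
    """Directed edge from an object at time t to itself at time t + 1."""
    temporal[(obj, t)].add((obj, t + 1))

# A building adjacent to a field at t=0; both objects persist to t=1.
add_spatial("building", "field", 0)
for obj in ("building", "field"):
    add_temporal(obj, 0)

def neighbors_next_step(obj, t):
    """Objects spatially adjacent to obj at time t, followed one step in time."""
    out = set()
    for nbr in spatial[(obj, t)]:
        out |= temporal[nbr]
    return out

# Query: where does the building's spatial neighborhood appear at t + 1?
result = neighbors_next_step("building", 0)
```

Searches for change patterns then become path queries mixing the two edge types, which is what makes the graph representation amenable to efficient search.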

  10. Geospatial Visualization of Scientific Data Through Keyhole Markup Language

    Science.gov (United States)

    Wernecke, J.; Bailey, J. E.

    2008-12-01

    The development of virtual globes has provided a fun and innovative tool for exploring the surface of the Earth. However, it has been the paralleling maturation of Keyhole Markup Language (KML) that has created a new medium and perspective through which to visualize scientific datasets. Originally created by Keyhole Inc., and then acquired by Google in 2004, in 2007 KML was given over to the Open Geospatial Consortium (OGC). It became an OGC international standard on 14 April 2008, and has subsequently been adopted by all major geobrowser developers (e.g., Google, Microsoft, ESRI, NASA) and many smaller ones (e.g., Earthbrowser). By making KML a standard at a relatively young stage in its evolution, developers of the language are seeking to avoid the issues that plagued the early World Wide Web and development of Hypertext Markup Language (HTML). The popularity and utility of Google Earth, in particular, has been enhanced by KML features such as the Smithsonian volcano layer and the dynamic weather layers. Through KML, users can view real-time earthquake locations (USGS), view animations of polar sea-ice coverage (NSIDC), or read about the daily activities of chimpanzees (Jane Goodall Institute). Perhaps even more powerful is the fact that any users can create, edit, and share their own KML, with no or relatively little knowledge of manipulating computer code. We present an overview of the best current scientific uses of KML and a guide to how scientists can learn to use KML themselves.
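
Part of KML's appeal is how little markup a useful document needs. The sketch below generates a single Placemark with Python's standard library; the placemark name and coordinates are arbitrary examples (note that KML orders coordinates longitude,latitude).

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)  # serialize with KML as the default namespace

kml = ET.Element(f"{{{KML_NS}}}kml")
doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
ET.SubElement(pm, f"{{{KML_NS}}}name").text = "Sample point"
point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
# KML coordinates are longitude,latitude[,altitude].
ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = "-122.08,37.42,0"

xml_str = ET.tostring(kml, encoding="unicode")
```

Saved with a .kml extension, a document like this opens directly in Google Earth and other geobrowsers that support the OGC standard.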

  11. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques. Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks.

  12. The clinical learning environment in nursing education: a concept analysis.

    Science.gov (United States)

    Flott, Elizabeth A; Linden, Lois

    2016-03-01

    The aim of this study was to report an analysis of the clinical learning environment concept. Nursing students are evaluated in clinical learning environments where skills and knowledge are applied to patient care. These environments affect achievement of learning outcomes, and have an impact on preparation for practice and student satisfaction with the nursing profession. Providing clarity of this concept for nursing education will assist in identifying antecedents, attributes and consequences affecting student transition to practice. The clinical learning environment was investigated using Walker and Avant's concept analysis method. A literature search was conducted using WorldCat, MEDLINE and CINAHL databases using the keywords clinical learning environment, clinical environment and clinical education. Articles reviewed were written in English and published in peer-reviewed journals between 1995 and 2014. All data were analysed for recurring themes and terms to determine possible antecedents, attributes and consequences of this concept. The clinical learning environment contains four attribute characteristics affecting student learning experiences. These include: (1) the physical space; (2) psychosocial and interaction factors; (3) the organizational culture and (4) teaching and learning components. These attributes often determine achievement of learning outcomes and student self-confidence. With better understanding of attributes comprising the clinical learning environment, nursing education programmes and healthcare agencies can collaborate to create meaningful clinical experiences and enhance student preparation for the professional nurse role. © 2015 John Wiley & Sons Ltd.

  13. LEARNING DIFFICULTIES: AN ANALYSIS BASED ON VIGOTSKY

    Directory of Open Access Journals (Sweden)

    Adriane Cenci

    2010-06-01

Full Text Available Throughout the text, we aim to reflect upon learning difficulties from the standpoint of Socio-Historical Theory, relating what is observed in schools to what has been discussed about learning difficulties and the theory proposed by Vygotsky in the early twentieth century. We understand that children enter school carrying experiences and knowledge from their cultural group, and that schools very often ignore such knowledge. It is in this disconnect that what we have come to call learning difficulties emerges. One must not forget to see the child as a whole: a student is a social being constituted by culture, language and specific values, to which one must be attentive.

  14. Language Learning of Gifted Individuals: A Content Analysis Study

    Science.gov (United States)

    Gokaydin, Beria; Baglama, Basak; Uzunboylu, Huseyin

    2017-01-01

    This study aims to carry out a content analysis of the studies on language learning of gifted individuals and determine the trends in this field. Articles on language learning of gifted individuals published in the Scopus database were examined based on certain criteria including type of publication, year of publication, language, research…

  15. A Survey on Deep Learning in Medical Image Analysis

    NARCIS (Netherlands)

    Litjens, G.J.; Kooi, T.; Ehteshami Bejnordi, B.; Setio, A.A.A.; Ciompi, F.; Ghafoorian, M.; Laak, J.A.W.M. van der; Ginneken, B. van; Sanchez, C.I.

    2017-01-01

    Deep learning algorithms, in particular convolutional networks, have rapidly become a methodology of choice for analyzing medical images. This paper reviews the major deep learning concepts pertinent to medical image analysis and summarizes over 300 contributions to the field, most of which appeared

  16. Explaining discontinuity in organizational learning : a process analysis

    NARCIS (Netherlands)

    Berends, J.J.; Lammers, I.S.

    2010-01-01

    This paper offers a process analysis of organizational learning as it unfolds in a social and temporal context. Building upon the 4I framework (Crossan et al. 1999), we examine organizational learning processes in a longitudinal case study of an implementation of knowledge management in an

  17. Open cyberGIS software for geospatial research and education in the big data era

    Science.gov (United States)

    Wang, Shaowen; Liu, Yan; Padmanabhan, Anand

CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new generation of GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies–open access, source, and integration–to serve various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and achieved broad and significant impacts.

  18. Open cyberGIS software for geospatial research and education in the big data era

    Directory of Open Access Journals (Sweden)

    Shaowen Wang

    2016-01-01

Full Text Available CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new generation of GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies–open access, source, and integration–to serve various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and achieved broad and significant impacts.

  19. Revelation of `Hidden' Balinese Geospatial Heritage on A Map

    Science.gov (United States)

    Soeria Atmadja, Dicky A. S.; Wikantika, Ketut; Budi Harto, Agung; Putra, Daffa Gifary M.

    2018-05-01

Bali is not just about beautiful nature. It also has a unique and interesting cultural heritage, including a `hidden' geospatial heritage. Tri Hita Karana is a Hindu concept of life consisting of the human relations to God, to other humans, and to nature (Parahiyangan, Pawongan and Palemahan). Based on it, in terms of geospatial aspects, the Balinese derived their spatial orientation, spatial planning and layout, and measurement, as well as color and typography. Introducing this particular heritage would be a very interesting contribution to Bali tourism. In response to these issues, a question arises on how to reveal this unique and highly valuable geospatial heritage on a map that can be used to introduce and disseminate it to tourists. Symbols (patterns and colors), orientation, distance, scale, layout and toponymy are well known as elements of a map. There is an opportunity to apply Balinese geospatial heritage in representing these map elements.

  20. DIGI-vis: Distributed interactive geospatial information visualization

    KAUST Repository

    Ponto, Kevin; Kuester, Falk

    2010-01-01

    data sets. We propose a distributed data gathering and visualization system that allows researchers to view these data at hundreds of megapixels simultaneously. This system allows scientists to view real-time geospatial information at unprecedented

  1. A Geospatial Decision Support System Toolkit, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and commercialize a working prototype Geospatial Decision Support Toolkit (GeoKit). GeoKit will enable scientists, agencies, and stakeholders to...

  2. 75 FR 10309 - Announcement of National Geospatial Advisory Committee Meeting

    Science.gov (United States)

    2010-03-05

    ... Geospatial Advisory Committee (NGAC) will meet on March 24-25, 2010 at the One Washington Circle Hotel, 1... implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting...

  3. FOSS geospatial libraries in scientific workflow environments: experiences and directions

    CSIR Research Space (South Africa)

    McFerren, G

    2011-07-01

Full Text Available of experiments. In the context of three sets of research (wildfire research, flood modelling and the linking of disease outbreaks to multi-scale environmental conditions), we describe our efforts to provide geospatial capability for scientific workflow software...

  4. Geospatial Google Street View with Virtual Reality: A Motivational Approach for Spatial Training Education

    Directory of Open Access Journals (Sweden)

    Carlos Carbonell-Carrera

    2017-08-01

Full Text Available Motivation is a determining factor in the learning process, and encourages the student to participate in activities that increase their performance. Learning strategies supplemented by computer technology in a scenario-based learning environment can improve students' motivation for spatial knowledge acquisition. In this sense, a workshop carried out with 43 second-year engineering students, supported by the Google Street View mobile geospatial application for location-based tasks, is presented, in which participants work in an immersive 3D urban wayfinding environment in virtual reality. Students use their own smartphones with the Google Street View application integrated into virtual reality (VR) 3D glasses, with a joystick as the locomotion interface. The tool used to analyse the motivational factor of this pedagogical approach is the multidimensional Intrinsic Motivation Inventory with six subscales: interest, perceived competence, perceived choice, effort, tension, and value, measured on a seven-point Likert scale. Scores in all subscales considered are above 4 on a scale of 7. A usability study conducted at the end of the experiment yields values above 3 on a scale of 5 in efficacy, efficiency and satisfaction. The results of the experiment indicate that the geospatial Google Street View application in virtual reality serves a motivating educational purpose in the field of spatial training.

  5. Combining forest inventory, satellite remote sensing, and geospatial data for mapping forest attributes of the conterminous United States

    Science.gov (United States)

    Mark Nelson; Greg Liknes; Charles H. Perry

    2009-01-01

    Analysis and display of forest composition, structure, and pattern provides information for a variety of assessments and management decision support. The objective of this study was to produce geospatial datasets and maps of conterminous United States forest land ownership, forest site productivity, timberland, and reserved forest land. Satellite image-based maps of...

  6. Bridging the Gap between NASA Hydrological Data and the Geospatial Community

    Science.gov (United States)

    Rui, Hualan; Teng, Bill; Vollmer, Bruce; Mocko, David M.; Beaudoing, Hiroko K.; Nigro, Joseph; Gary, Mark; Maidment, David; Hooper, Richard

    2011-01-01

There is a vast and ever-increasing amount of data on the Earth's interconnected energy and hydrological systems, available from NASA remote sensing and modeling systems, and yet one challenge persists: increasing the usefulness of these data for, and thus their use by, the geospatial communities. The Hydrology Data and Information Services Center (HDISC), part of the Goddard Earth Sciences DISC, has continually worked to better understand the hydrological data needs of geospatial end users, to be better able to bridge the gap between NASA data and the geospatial communities. This paper will cover some of the hydrological data sets available from HDISC, and the various tools and services developed for data searching, data subsetting, format conversion, online visualization and analysis, interoperable access, etc., to facilitate the integration of NASA hydrological data by end users. The NASA Goddard data analysis and visualization system, Giovanni, is described. Two case examples of user-customized data services are given, involving the EPA BASINS (Better Assessment Science Integrating point & Non-point Sources) project and the CUAHSI Hydrologic Information System, with the common requirement of on-the-fly retrieval of long-duration time series for a geographical point.

  7. Learning motivation and student achievement : description analysis and relationships both

    Directory of Open Access Journals (Sweden)

    Ari Riswanto

    2017-03-01

Full Text Available Education is very important for humans; through education, people throughout the world increasingly flourish. Within the learning process, however, not a few students have little motivation for learning activities. This results in a less than optimal learning process and, in turn, affects student achievement. This study discusses matters relating to learning motivation and student achievement, with the aim of reinforcing the importance of motivation in the learning process and clarifying its relationship with student achievement. The method used is descriptive analysis and simple correlation, applied to 97 students taking the courses Introduction to Microeconomics and Indonesian. The conclusion of this research is that students perform well when they are well motivated, and that the relationship between learning motivation and achievement differs between the two courses.

  8. Comparative Analysis of Kernel Methods for Statistical Shape Learning

    National Research Council Canada - National Science Library

    Rathi, Yogesh; Dambreville, Samuel; Tannenbaum, Allen

    2006-01-01

    .... In this work, we perform a comparative analysis of shape learning techniques such as linear PCA, kernel PCA, locally linear embedding and propose a new method, kernelized locally linear embedding...
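The record above compares shape-learning techniques built on PCA and its kernelized variants. As background only (not the authors' method), here is a minimal plain-PCA sketch on a classic 2-D toy dataset, using the closed-form eigendecomposition of the 2x2 sample covariance matrix:

```python
# Minimal plain PCA on 2-D points: compute the sample covariance matrix,
# then get its eigenvalues/principal axis in closed form (2x2 case only).
# The data are a standard toy set, not shapes from the cited paper.
from math import atan2, cos, sin, sqrt

pts = [(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2), (3.1, 3.0),
       (2.3, 2.7), (2.0, 1.6), (1.0, 1.1), (1.5, 1.6), (1.1, 0.9)]

mx = sum(p[0] for p in pts) / len(pts)
my = sum(p[1] for p in pts) / len(pts)

# sample covariance matrix [[sxx, sxy], [sxy, syy]]
m = len(pts) - 1
sxx = sum((p[0] - mx) ** 2 for p in pts) / m
syy = sum((p[1] - my) ** 2 for p in pts) / m
sxy = sum((p[0] - mx) * (p[1] - my) for p in pts) / m

# eigenvalues via the quadratic formula on the characteristic polynomial
tr, det = sxx + syy, sxx * syy - sxy ** 2
lam1 = tr / 2 + sqrt(tr ** 2 / 4 - det)  # variance along the first PC
lam2 = tr / 2 - sqrt(tr ** 2 / 4 - det)  # variance along the second PC

# unit eigenvector for lam1: direction (sxy, lam1 - sxx)
theta = atan2(lam1 - sxx, sxy)
pc1 = (cos(theta), sin(theta))

print(round(lam1, 3), round(lam2, 3))
```

Kernel PCA follows the same recipe, but eigendecomposes a centered kernel (similarity) matrix instead of the covariance matrix.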

  9. Mapping a Difference: The Power of Geospatial Visualization

    Science.gov (United States)

    Kolvoord, B.

    2015-12-01

    Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students that want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GST's to hone their spatial thinking skills and to do extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of the use of these technologies on students' spatial thinking skills, and discuss the success and challenges of developing K-12 partnerships centered on geospatial visualization.

  10. The Value of Information - Accounting for a New Geospatial Paradigm

    Science.gov (United States)

    Pearlman, J.; Coote, A. M.

    2014-12-01

A new frontier in the consideration of socio-economic benefit is valuing information as an asset, often referred to as infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet, for many of the largest corporations, such as Google and Facebook, information is clearly their principal asset. The problem is exacerbated for public sector organizations, as those that are information-centric rather than information-enabled are relatively few (statistics, archiving and mapping agencies are perhaps the only examples), so the issue is not at the top of the agenda for government. It is, however, hugely important when valuing geospatial data and information. Geospatial data allow public institutions to operate, and facilitate the provision of essential services for emergency response and national defense. In this respect, geospatial data are strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciate in the same way that physical infrastructure depreciates: they need to be maintained, otherwise their functionality and value in use decline. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.

  11. Bim and Gis: when Parametric Modeling Meets Geospatial Data

    Science.gov (United States)

    Barazzetti, L.; Banfi, F.

    2017-12-01

    Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software are able to perform advanced geospatial analyses, but they lack several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings have limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, notwithstanding research work is currently under development to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by "pure" GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  12. BIM AND GIS: WHEN PARAMETRIC MODELING MEETS GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2017-12-01

Full Text Available Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software are able to perform advanced geospatial analyses, but they lack several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings have limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, notwithstanding research work is currently under development to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by “pure” GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.

  13. Economic Assessment of the Use Value of Geospatial Information

    Directory of Open Access Journals (Sweden)

    Richard Bernknopf

    2015-07-01

Full Text Available Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
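The VOI definition in this abstract (net benefits of a decision with the information minus net benefits without it) can be made concrete with a standard decision-theoretic toy calculation. The states, actions, payoffs, and probabilities below are invented for illustration and are not figures from the cited study:

```python
# Value of information (VOI) sketch: compare the best expected payoff of a
# decision made under a prior belief with the expected payoff when the true
# state is learned before deciding. All numbers are illustrative.

def expected_net_benefit(actions, states, payoff, belief):
    """Best expected payoff over actions, given a probability belief over states."""
    return max(sum(belief[s] * payoff[a][s] for s in states) for a in actions)

states = ["quake", "no_quake"]          # uncertain world states
actions = ["retrofit", "do_nothing"]    # decision alternatives
# payoff[action][state], in arbitrary monetary units
payoff = {
    "retrofit":   {"quake": -10, "no_quake": -10},   # fixed mitigation cost
    "do_nothing": {"quake": -100, "no_quake": 0},    # loss only if quake occurs
}
prior = {"quake": 0.2, "no_quake": 0.8}

# Without information: decide once under the prior belief.
ev_without = expected_net_benefit(actions, states, payoff, prior)

# With (perfect) information: learn the state, then pick the best action;
# average those best payoffs over the prior probabilities.
ev_with = sum(prior[s] * max(payoff[a][s] for a in actions) for s in states)

voi = ev_with - ev_without
print(ev_without, ev_with, voi)
```

Real geospatial information is rarely perfect, so this computes an upper bound; the retrospective and prospective models in the article account for imperfect signals.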

  14. Language Learning of Gifted Individuals: A Content Analysis Study

    Directory of Open Access Journals (Sweden)

    Beria Gokaydin

    2017-11-01

    Full Text Available This study aims to carry out a content analysis of the studies on language learning of gifted individuals and determine the trends in this field. Articles on language learning of gifted individuals published in the Scopus database were examined based on certain criteria including type of publication, year of publication, language, research discipline, countries of research, institutions of authors, key words, and resources. Data were analyzed with the content analysis method. Results showed that the number of studies on language learning of gifted individuals has increased throughout the years. Recommendations for further research and practices are provided.

  15. Learning disabilities: analysis of 69 children

    Directory of Open Access Journals (Sweden)

    Meister Eduardo Kaehler

    2001-01-01

Full Text Available With this article we intend to demonstrate the importance of the evaluation and follow-up of children with learning disabilities by a multidisciplinary team, as well as to establish the need for intervention. We evaluated 69 children from the Aline Picheth Public School in Curitiba, attending the first or second grade of elementary school, through general and evolutionary neurological examination, a pediatric symptom checklist, and social, linguistic and psychological evaluation (WISC-III, Bender Infantile and WPPSI-figures). The incidence was higher in boys (84.1%), a family history of learning disabilities was found in 42%, and writing abnormalities in 56.5%. The most frequent diagnosis was attention deficit and hyperactivity disorder, in 39.1%. With this program, we aimed to reduce retention rates and to stress the importance of this evaluation and, if necessary, of multidisciplinary intervention in cases of learning disabilities.

  16. Recent Advances in Geospatial Visualization with the New Google Earth

    Science.gov (United States)

    Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.

    2017-12-01

    Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.

  17. Integrated web system of geospatial data services for climate research

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander

    2016-04-01

Georeferenced datasets are currently actively used for the modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies of climate and environmental change is required. An approach for the integrated analysis of georeferenced climatological data sets, based on a combination of web and GIS technologies within the spatial data infrastructure paradigm, is presented. Following this approach, a dedicated data-processing web system for the integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions, such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.

  18. Nansat: a Scientist-Orientated Python Package for Geospatial Data Processing

    Directory of Open Access Journals (Sweden)

    Anton A. Korosov

    2016-10-01

Full Text Available Nansat is a Python toolbox for analysing and processing 2-dimensional geospatial data, such as satellite imagery, output from numerical models, and gridded in-situ data. It is created with a strong focus on facilitating research and the development of algorithms and autonomous processing systems. Nansat extends the widely used Geospatial Data Abstraction Library (GDAL) by adding scientific meaning to the datasets through metadata, and by adding common functionality for data analysis and handling (e.g., exporting to various data formats). Nansat uses metadata vocabularies that follow international metadata standards, in particular the Climate and Forecast (CF) conventions, and the NASA Directory Interchange Format (DIF) and Global Change Master Directory (GCMD) keywords. Functionality that is commonly needed in scientific work, such as seamless access to local or remote geospatial data in various file formats, collocation of datasets from different sources and geometries, and visualization, is also built into Nansat. The paper presents Nansat workflows, its functional structure, and examples of typical applications.

  19. Creating the learning situation to promote student deep learning: Data analysis and application case

    Science.gov (United States)

    Guo, Yuanyuan; Wu, Shaoyan

    2017-05-01

How to lead students to deeper learning and cultivate innovative engineering talent needs to be studied in higher engineering education. In this study, through survey data analysis and theoretical research, we discuss the correlations among teaching methods, learning motivation, and learning methods. We find that students develop different motivation orientations according to their perception of teaching methods in the process of engineering education, and that this affects their choice of learning methods. As a result, creating situations is critical to leading students to deeper learning. Finally, we analyze the process of creating learning situations in the teaching of «bidding and contract management workshops». In this process, teachers use student-centered teaching to lead students to deeper study. By studying the factors influencing the deep learning process and building teaching situations for the purpose of promoting deep learning, this paper provides a meaningful reference for enhancing students' learning quality, teachers' teaching quality, and the quality of innovative talent.

  20. Learning and Development Expertise: An Australian Analysis

    Science.gov (United States)

    Hodge, Steven; Harvey, Jack

    2015-01-01

    Learning and development (L&D) practitioners draw on a distinctive range of knowledge, skills and techniques in their work. Over the years, there have been attempts to capture this range and identify typical L&D roles. The research presented here was undertaken to identify characteristic areas of expertise (AOEs) of L&D practice in…

  1. Speech Analysis and Visual Image: Language Learning

    Science.gov (United States)

    Loo, Alfred; Chung, C. W.; Lam, Alan

    2016-01-01

    Students will speak a second language with an accent if they learn the language after the age of six. It does not matter how motivated and clever they are, the accent will not go away. Only a few gifted students can speak a second language flawlessly. The exact reasons for this phenomenon are unknown. Although a large number of hypotheses have…

  2. Exploring the Peer Interaction Effects on Learning Achievement in a Social Learning Platform Based on Social Network Analysis

    Science.gov (United States)

    Lin, Yu-Tzu; Chen, Ming-Puu; Chang, Chia-Hu; Chang, Pu-Chen

    2017-01-01

    The benefits of social learning have been recognized by existing research. To explore knowledge distribution in social learning and its effects on learning achievement, we developed a social learning platform and explored students' behaviors of peer interactions by the proposed algorithms based on social network analysis. An empirical study was…

  3. Learning Methods for Dynamic Topic Modeling in Automated Behavior Analysis.

    Science.gov (United States)

    Isupova, Olga; Kuzin, Danil; Mihaylova, Lyudmila

    2017-09-27

    Semisupervised and unsupervised systems provide operators with invaluable support and can tremendously reduce the operators' load. In the light of the necessity to process large volumes of video data and provide autonomous decisions, this paper proposes new learning algorithms for activity analysis in video. The activities and behaviors are described by a dynamic topic model. Two novel learning algorithms based on the expectation maximization approach and variational Bayes inference are proposed. Theoretical derivations of the posterior estimates of model parameters are given. The designed learning algorithms are compared with the Gibbs sampling inference scheme introduced earlier in the literature. A detailed comparison of the learning algorithms is presented on real video data. We also propose an anomaly localization procedure, elegantly embedded in the topic modeling framework. It is shown that the developed learning algorithms can achieve 95% success rate. The proposed framework can be applied to a number of areas, including transportation systems, security, and surveillance.
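One of the two learning algorithms this abstract proposes builds on expectation maximization (EM). The paper's dynamic topic model for video is far more involved, but the EM idea itself can be shown on a toy problem: estimating the biases of two coins from per-session head counts when the coin used in each session is unobserved (all data below are made up):

```python
# Toy EM: two coins with unknown biases; each 10-flip session used one coin,
# but we do not know which. EM alternates soft assignment (E-step) with
# weighted re-estimation (M-step). Synthetic data, for illustration only.
from math import comb

heads = [9, 8, 2, 1, 7]  # heads observed in each 10-flip session
n = 10                   # flips per session

def likelihood(h, theta):
    """Binomial probability of h heads in n flips for a coin with bias theta."""
    return comb(n, h) * theta ** h * (1 - theta) ** (n - h)

theta_a, theta_b = 0.6, 0.5  # initial guesses
for _ in range(100):
    # E-step: posterior responsibility that coin A generated each session
    resp = []
    for h in heads:
        la, lb = likelihood(h, theta_a), likelihood(h, theta_b)
        resp.append(la / (la + lb))
    # M-step: re-estimate each bias from responsibility-weighted head counts
    theta_a = sum(r * h for r, h in zip(resp, heads)) / sum(r * n for r in resp)
    theta_b = sum((1 - r) * h for r, h in zip(resp, heads)) / sum((1 - r) * n for r in resp)

print(round(theta_a, 2), round(theta_b, 2))
```

The same alternation, with topics in place of coins and video clips in place of sessions, underlies the EM-style learning the paper describes; variational Bayes replaces the point estimates with approximate posterior distributions.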

  4. Learning templates for artistic portrait lighting analysis.

    Science.gov (United States)

    Chen, Xiaowu; Jin, Xin; Wu, Hongyu; Zhao, Qinping

    2015-02-01

Lighting is a key factor in creating impressive artistic portraits. In this paper, we propose to analyze portrait lighting by learning templates of lighting styles. Inspired by the experience of artists, we first define several novel features that describe the local contrasts in various face regions. The most informative features are then selected with a stepwise feature pursuit algorithm to derive the templates of various lighting styles. After that, the matching scores that measure the similarity between a testing portrait and those templates are calculated for lighting style classification. Furthermore, we train a regression model on the subjective scores and the feature responses of a template to predict the lighting quality score of a portrait. Based on the templates, a novel face illumination descriptor is defined to measure the difference between two portrait lightings. Experimental results show that the learned templates can well describe the lighting styles, and that the proposed approach can assess the lighting quality of artistic portraits as a human being does.

  5. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: identify types of, and characterize, non-spatial and spatial data; demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results; construct testable hypotheses that require inferential statistical analysis; process spatial data, extract explanatory variables, conduct statisti...

  6. Prospects for PV: a learning curve analysis

    International Nuclear Information System (INIS)

    Zwaan, Bob van der; Rabi, A.

    2003-01-01

    This article gives an overview of the current state of the art of photovoltaic electricity technology, and addresses its potential for cost reductions over the first few decades of the 21st century. Current PV production cost ranges are presented, both in terms of capacity installation and electricity generation, for single crystalline silicon, multi-crystalline silicon, amorphous silicon and other thin film technologies. Possible decreases in these costs are assessed, as expected according to the learning-curve methodology. We also estimate how much PV could gain if the external costs (due to environmental and health damage) of energy were internalised, for example by an energy tax. Our conclusions are that (1) mainly due to its high costs, PV electricity is unlikely to play a major role in global energy supply and carbon emissions abatement before 2020; (2) extrapolating learning curves observed in the past, one can expect its costs to decrease significantly over the coming years, so that a considerable worldwide PV electricity share could materialise after 2020; (3) niche-market applications, e.g. using stand-alone systems in remote areas, are crucial for continuing 'the ride along the learning curve'; (4) the damage costs of conventional (fossil) power sources are considerable, and their internalisation would improve the competitiveness of PV, although probably not enough to close the current cost gap. (author)
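
    The learning-curve methodology referenced here relates unit cost to cumulative production: each doubling of cumulative capacity multiplies cost by a constant progress ratio. A minimal sketch (the starting cost, the 80% progress ratio, and the capacity figures are illustrative, not the paper's):

```python
import math

def learning_curve_cost(c0, cum0, cum, progress_ratio):
    """Unit cost after cumulative production grows from cum0 to cum.

    Each doubling of cumulative capacity multiplies cost by progress_ratio.
    """
    b = -math.log2(progress_ratio)          # learning exponent
    return c0 * (cum / cum0) ** (-b)

# Illustrative numbers only: start at 4 $/Wp with 2 GWp installed;
# with an 80% progress ratio, capacity grows to 32 GWp (4 doublings).
c = learning_curve_cost(4.0, 2.0, 32.0, 0.80)   # 4 * 0.8**4 = 1.6384 $/Wp
```

Four doublings at an 80% progress ratio cut the unit cost to about 41% of its starting value, which is the kind of extrapolation behind conclusion (2).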

  7. Geospatial environmental data modelling applications using remote sensing, GIS and spatial statistics

    Energy Technology Data Exchange (ETDEWEB)

    Siljander, M.

    2010-07-01

    This thesis presents novel modelling applications for environmental geospatial data using remote sensing, GIS and statistical modelling techniques. The studies can be classified into four main themes: (i) to develop advanced geospatial databases. Paper (I) demonstrates the creation of a geospatial database for the Glanville fritillary butterfly (Melitaea cinxia) in the Aaland Islands, south-western Finland; (ii) to analyse species diversity and distribution using GIS techniques. Paper (II) presents a diversity and geographical distribution analysis for Scopulini moths at a world-wide scale; (iii) to study spatiotemporal forest cover change. Paper (III) presents a study of exotic and indigenous tree cover change detection in the Taita Hills, Kenya, using airborne imagery and GIS analysis techniques; (iv) to explore predictive modelling techniques using geospatial data. In Paper (IV) human population occurrence and abundance in the Taita Hills highlands were predicted using the generalized additive modelling (GAM) technique. Paper (V) presents techniques to enhance fire prediction and burned area estimation at a regional scale in East Caprivi, Namibia. Paper (VI) compares eight state-of-the-art predictive modelling methods to improve fire prediction, burned area estimation and fire risk mapping in East Caprivi, Namibia. The results in Paper (I) showed that geospatial data can be managed effectively using advanced relational database management systems. Metapopulation data for the Melitaea cinxia butterfly were successfully combined with GPS-delimited habitat patch information and climatic data. Using the geospatial database, spatial analyses were successfully conducted at the habitat patch level or at coarser analysis scales. Moreover, this study showed that, at a large scale, spatially correlated weather conditions are one of the primary causes of spatially correlated changes in Melitaea cinxia population sizes. 
In Paper (II) spatiotemporal characteristics

  8. Analysis of Virtual Learning Environments from a Comprehensive Semiotic Perspective

    Directory of Open Access Journals (Sweden)

    Gloria María Álvarez Cadavid

    2012-11-01

    Full Text Available Although there is a wide variety of perspectives and models for the study of online education, most of these focus on the analysis of the verbal aspects of such learning, while very few consider the relationship between speech and elements of a different nature, such as images and hypermediality. In a previous article we presented a proposal for a comprehensive semiotic analysis of virtual learning environments, which more recently has been developed and tested for the study of different online training courses without instructional intervention. In this paper we use this same proposal to analyze online learning environments in the framework of courses with instructional intervention. One of the main observations in relation to this type of analysis is that the organizational aspects of the courses are related to the way in which the input elements for the teaching and learning process are constructed.

  9. I Learn What I Need: Needs Analysis of English Learning in Taiwan

    Science.gov (United States)

    Chen, I-Ju; Chang, Yung-Hao; Chang, Wei-Huan

    2016-01-01

    The purpose of this study was to investigate the needs analysis of English learning from the viewpoints of students and the real needs of employers regarding English usage at the workplace. A questionnaire was administered to 60 participants comprising 30 senior students and 30 employers. After quantitative analysis, the results demonstrated that…

  10. Using a Bracketed Analysis as a Learning Tool.

    Science.gov (United States)

    Main, Keith

    1995-01-01

    Bracketed analysis is an examination of experiences within a defined time frame or "bracket." It assumes the ability to learn from any source: behaviors, emotions, rational and irrational thought, insights, reflections, and reactions. A bracketed analysis to determine what went wrong with a grant proposal that missed deadlines…

  11. A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs

    Science.gov (United States)

    Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav

    2015-01-01

    With the increasing abundance of technologies and smart devices, equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has now become effortless. This, in particular, is the case for photographs, with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the “right” information for the appropriate purpose. This paper describes an approach to creating semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this, we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and implementation for searching and querying the semantic geospatial metadata repository to enable a user or third-party system to find the information they are looking for. PMID:26205265
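
    The enrichment step, turning raw lat/lon metadata into semantic place metadata, can be sketched as a nearest-neighbour lookup against a gazetteer. The two-entry gazetteer and the photo record below are hypothetical stand-ins for the external geospatial data sources the paper integrates:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical mini-gazetteer standing in for the external geospatial sources.
GAZETTEER = [
    {"name": "Belfast City Hall", "lat": 54.5968, "lon": -5.9301},
    {"name": "Titanic Belfast", "lat": 54.6082, "lon": -5.9100},
]

def enrich(photo):
    """Attach the nearest named place to a photo's raw lat/lon metadata."""
    nearest = min(GAZETTEER,
                  key=lambda p: haversine_km(photo["lat"], photo["lon"],
                                             p["lat"], p["lon"]))
    return {**photo, "near": nearest["name"]}

photo = {"file": "IMG_0042.jpg", "lat": 54.607, "lon": -5.911}
print(enrich(photo)["near"])   # → Titanic Belfast
```

The enriched record can then be indexed so that a query such as "photos near Titanic Belfast" matches on the semantic field rather than raw coordinates.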

  12. BPELPower—A BPEL execution engine for geospatial web services

    Science.gov (United States)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements are especially in its capabilities in handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with enhancements required only at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high performance parallel processing and broad Web paradigms.

  13. Restful Implementation of Catalogue Service for Geospatial Data Provenance

    Science.gov (United States)

    Jiang, L. C.; Yue, P.; Lu, X. C.

    2013-10-01

    Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance could be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending ebXML Registry Information Model (ebRIM). Recent advance of the REpresentational State Transfer (REST) paradigm has shown great promise for the easy integration of distributed resources. RESTful Web Service aims to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach for provenance catalogue service could be improved by adopting the RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following RESTful architecture style. A middleware named REST Converter is added on the top of the legacy catalogue service to support a RESTful style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.

  14. Inverse analysis of turbidites by machine learning

    Science.gov (United States)

    Naruse, H.; Nakao, K.

    2017-12-01

    This study aims to propose a method to estimate the paleo-hydraulic conditions of turbidity currents from ancient turbidites by using a machine-learning technique. In this method, numerical simulation is repeated under various initial conditions, which produces a data set of characteristic features of turbidites. This data set is then used for supervised training of a deep-learning neural network (NN). Quantities of characteristic features of turbidites in the training data set are given to the input nodes of the NN, and the output nodes are expected to provide estimates of the initial condition of the turbidity current. The optimization of the weight coefficients of the NN is then conducted to reduce the root-mean-square difference between the true conditions and the output values of the NN. The empirical relationship between numerical results and initial conditions is explored in this method, and the discovered relationship is used for inversion of turbidity currents. This machine learning can potentially produce a NN that estimates paleo-hydraulic conditions from data of ancient turbidites. We produced a preliminary implementation of this methodology. A forward model based on 1D shallow-water equations with a correction for the density-stratification effect was employed. This model calculates the behavior of a surge-like turbidity current transporting mixed-size sediment, and outputs the spatial distribution of volume per unit area of each grain-size class on a uniform slope. The grain-size distribution was discretized into 3 classes. Numerical simulation was repeated 1000 times, and thus 1000 beds of turbidites were used as the training data for an NN that has 21000 input nodes and 5 output nodes with two hidden layers. After the machine learning finished, independent simulations were conducted 200 times in order to evaluate the performance of the NN. As a result of this test, the initial conditions of the validation data were successfully reconstructed by the NN. The estimated values show very small
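
    The workflow the abstract describes (repeat a forward model, then train a network to invert it) can be sketched end-to-end in NumPy. The toy forward model, network size, and learning rate below are invented for illustration and are far simpler than the authors' shallow-water model:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(c0, h0):
    """Toy stand-in for the turbidity-current forward model: maps an
    initial condition (concentration c0, thickness h0) to a 5-point
    downstream deposit-thickness profile."""
    x = np.linspace(0.0, 1.0, 5)
    return c0 * h0 * np.exp(-x / (0.2 + 0.8 * h0))

# Build the training set by repeated forward simulation.
n = 1000
theta = rng.uniform(0.1, 1.0, size=(n, 2))       # true initial conditions
X = np.array([forward(c, h) for c, h in theta])  # resulting deposit profiles

# One-hidden-layer regression network, trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (5, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)

def predict(X):
    H = np.tanh(X @ W1 + b1)
    return H @ W2 + b2, H

lr = 0.05
for _ in range(2000):
    Y, H = predict(X)
    err = Y - theta                              # (n, 2) residuals
    gW2 = H.T @ err / n; gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)             # backprop through tanh
    gW1 = X.T @ dH / n; gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

rmse = float(np.sqrt(((predict(X)[0] - theta) ** 2).mean()))
```

The trained network inverts the toy forward model to well under the spread of the sampled initial conditions; the paper's NN does the same with 21000 deposit features as inputs.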

  15. Data Quality, Provenance and IPR Management services: their role in empowering geospatial data suppliers and users

    Science.gov (United States)

    Millard, Keiran

    2015-04-01

    This paper looks at the current experiences of geospatial users and suppliers and how they have been limited by the lack of suitable frameworks for managing and communicating data quality, data provenance and intellectual property rights (IPR). Current political and technological drivers mean that increasing volumes of geospatial data are available through a plethora of different products and services, and whilst this is inherently a good thing it does create a new generation of challenges. This paper considers two examples of where these issues have been examined and looks at the challenges and possible solutions from a data user and data supplier perspective. The first example is the IQmulus project, which is researching fusion environments for big geospatial point clouds and coverages. The second example is the EU Emodnet programme, which is establishing thematic data portals for public marine and coastal data. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR and numerical simulations. These data are simply too big for routine and ad-hoc analysis, yet they could realise a myriad of disparate, and readily useable, information products with the right infrastructure in place. IQmulus is researching how to deliver this infrastructure technically, but a financially sustainable delivery depends on being able to track and manage ownership and IPR across the numerous data sets being processed. This becomes complex when the data are composed of multiple overlapping coverages; however, managing this allows users to be delivered highly bespoke products to meet their budget and technical needs. The Emodnet programme delivers harmonised marine data at the EU scale across seven thematic portals. As part of the Emodnet programme, a series of 'check points' have been initiated to examine how useful these services and other public data services actually are for solving real-world problems. 
One key finding is that users have been confused by the fact that often

  16. Feasibility study of geospatial mapping of chronic disease risk to inform public health commissioning.

    Science.gov (United States)

    Noble, Douglas; Smith, Dianna; Mathur, Rohini; Robson, John; Greenhalgh, Trisha

    2012-01-01

    To explore the feasibility of producing small-area geospatial maps of chronic disease risk for use by clinical commissioning groups and public health teams. Cross-sectional geospatial analysis using routinely collected general practitioner electronic record data. Tower Hamlets, an inner-city district of London, UK, characterised by high socioeconomic and ethnic diversity and high prevalence of non-communicable diseases. The authors used type 2 diabetes as an example. The data set was drawn from electronic general practice records on all non-diabetic individuals aged 25-79 years in the district (n=163 275). The authors used a validated instrument, QDScore, to calculate 10-year risk of developing type 2 diabetes. Using specialist mapping software (ArcGIS), the authors produced visualisations of how these data varied by lower and middle super output area across the district. The authors enhanced these maps with information on examples of locality-based social determinants of health (population density, fast food outlets and green spaces). Data were piloted as three types of geospatial map (basic, heat and ring). The authors noted practical, technical and information governance challenges involved in producing the maps. Usable data were obtained on 96.2% of all records. One in 11 adults in our cohort was at 'high risk' of developing type 2 diabetes with a 20% or more 10-year risk. Small-area geospatial mapping illustrated 'hot spots' where up to 17.3% of all adults were at high risk of developing type 2 diabetes. Ring maps allowed visualisation of high risk for type 2 diabetes by locality alongside putative social determinants in the same locality. The task of downloading, cleaning and mapping data from electronic general practice records posed some technical challenges, and judgement was required to group data at an appropriate geographical level. Information governance issues were time consuming and required local and national consultation and agreement. Producing
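
    The "hot spot" quantity mapped in the study — the share of adults in a small area whose 10-year risk meets the 20% "high risk" threshold — reduces to a simple aggregation. A sketch with made-up risk scores and hypothetical LSOA codes:

```python
# Hypothetical individual 10-year risk scores grouped by small area (LSOA).
risks = {
    "E01000001": [0.05, 0.22, 0.31, 0.08],
    "E01000002": [0.02, 0.04, 0.11, 0.09],
}

def high_risk_share(area_risks, threshold=0.20):
    """Share of adults in each area at 'high risk' (>= 20% 10-year risk),
    the quantity shaded as hot spots on the study's maps."""
    return {area: sum(r >= threshold for r in rs) / len(rs)
            for area, rs in area_risks.items()}

print(high_risk_share(risks))   # → {'E01000001': 0.5, 'E01000002': 0.0}
```

In the study this per-area share was computed from QDScore outputs and then joined to LSOA boundary geometries in ArcGIS for mapping.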

  17. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    Full Text Available The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value added information products to answer a wide range of questions. This article provides an overview of OMAR web based geospatial processing. OMAR is part of the Open Source Software Image Map project under the Open Source Geospatial Foundation. The primary contributors of OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example that open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near term development.

  18. Towards Geo-spatial Hypermedia: Concepts and Prototype Implementation

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Vestergaard, Peter Posselt; Ørbæk, Peter

    2002-01-01

    This paper combines spatial hypermedia with techniques from Geographical Information Systems and location based services. We describe the Topos 3D Spatial Hypermedia system and how it has been developed to support geo-spatial hypermedia coupling hypermedia information to model representations...... of real world buildings and landscapes. The prototype experiments are primarily aimed at supporting architects and landscape architects in their work on site. Here it is useful to be able to superimpose and add different layers of information to, e.g. a landscape depending on the task being worked on. We...... and indirect navigation. Finally, we conclude with a number of research issues which are central to the future development of geo-spatial hypermedia, including design issues in combining metaphorical and literal hypermedia space, as well as a discussion of the role of spatial parsing in a geo-spatial context....

  19. The Kinematic Learning Model using Video and Interfaces Analysis

    Science.gov (United States)

    Firdaus, T.; Setiawan, W.; Hamidah, I.

    2017-09-01

    An educator currently in demand to apply the learning to not be separated from the development of technology. Educators often experience difficulties when explaining kinematics material, this is because kinematics is one of the lessons that often relate the concept to real life. Kinematics is one of the courses of physics that explains the cause of motion of an object, Therefore it takes the thinking skills and analytical skills in understanding these symptoms. Technology is one that can bridge between conceptual relationship with real life. A framework of technology-based learning models has been developed using video and interfaces analysis on kinematics concept. By using this learning model, learners will be better able to understand the concept that is taught by the teacher. This learning model is able to improve the ability of creative thinking, analytical skills, and problem-solving skills on the concept of kinematics.

  20. Stochastic sensitivity analysis and Langevin simulation for neural network learning

    International Nuclear Information System (INIS)

    Koda, Masato

    1997-01-01

    A comprehensive theoretical framework is proposed for the learning of a class of gradient-type neural networks with an additive Gaussian white noise process. The study is based on stochastic sensitivity analysis techniques, and formal expressions are obtained for stochastic learning laws in terms of functional derivative sensitivity coefficients. The present method, based on Langevin simulation techniques, uses only the internal states of the network and ubiquitous noise to compute the learning information inherent in the stochastic correlation between noise signals and the performance functional. In particular, the method does not require the solution of adjoint equations of the back-propagation type. Thus, the present algorithm has the potential for efficiently learning network weights with significantly fewer computations. Application to an unfolded multi-layered network is described, and the results are compared with those obtained by using a back-propagation method
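
    A Langevin-type learning rule of the general kind discussed here adds Gaussian white noise to an ordinary gradient step. The sketch below applies such an update to a simple quadratic performance functional; it illustrates the noisy dynamics only, not the paper's sensitivity-coefficient learning laws, and all values are invented:

```python
import numpy as np

rng = np.random.default_rng(42)

w_star = np.array([1.0, -2.0])   # minimiser of the performance functional

def grad(w):
    # Gradient of the quadratic performance functional J(w) = |w - w*|^2
    return 2.0 * (w - w_star)

w = np.zeros(2)
eta, T = 0.05, 0.01              # learning rate and noise temperature
for _ in range(5000):
    noise = rng.normal(size=2)
    # Langevin update: deterministic descent plus Gaussian white noise.
    w = w - eta * grad(w) + np.sqrt(2 * eta * T) * noise
```

At stationarity the weights fluctuate around the minimiser with spread set by the temperature T; the paper's contribution is extracting the learning signal from the correlation between such noise and the performance functional, avoiding adjoint (back-propagation) equations.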

  1. Identification of phreatophytic groundwater dependent ecosystems using geospatial technologies

    Science.gov (United States)

    Perez Hoyos, Isabel Cristina

    The protection of groundwater dependent ecosystems (GDEs) is increasingly being recognized as an essential aspect for the sustainable management and allocation of water resources. Ecosystem services are crucial for human well-being and for a variety of flora and fauna. However, the conservation of GDEs is only possible if knowledge about their location and extent is available. Several studies have focused on the identification of GDEs at specific locations using ground-based measurements. However, recent progress in technologies such as remote sensing and their integration with geographic information systems (GIS) has provided alternative ways to map GDEs at much larger spatial extents. This study is concerned with the discovery of patterns in geospatial data sets using data mining techniques for mapping phreatophytic GDEs in the United States at 1 km spatial resolution. A methodology to identify the probability of an ecosystem to be groundwater dependent is developed. Probabilities are obtained by modeling the relationship between the known locations of GDEs and main factors influencing groundwater dependency, namely water table depth (WTD) and aridity index (AI). A methodology is proposed to predict WTD at 1 km spatial resolution using relevant geospatial data sets calibrated with WTD observations. An ensemble learning algorithm called random forest (RF) is used in order to model the distribution of groundwater in three study areas: Nevada, California, and Washington, as well as in the entire United States. RF regression performance is compared with a single regression tree (RT). The comparison is based on contrasting training error, true prediction error, and variable importance estimates of both methods. Additionally, remote sensing variables are omitted from the process of fitting the RF model to the data to evaluate the deterioration in the model performance when these variables are not used as an input. 
Research results suggest that although the prediction
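
    The RF-versus-single-tree comparison in this work rests on bagging: averaging many trees fit to bootstrap resamples reduces variance relative to one tree. The sketch below makes the same kind of comparison with regression stumps on synthetic data; all names and data are invented, and a real random forest additionally subsamples features at each split:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_stump(x, y):
    """Best single-split regression stump on one feature (minimum SSE)."""
    best = (np.inf, 0.0, y.mean(), y.mean())
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    return best[1], best[2], best[3]

def predict_stump(stump, x):
    t, lo, hi = stump
    return np.where(x <= t, lo, hi)

# Synthetic "water table depth" as a smooth function of one predictor.
x = rng.uniform(0, 1, 300)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 300)

single = fit_stump(x, y)

# A tiny bagged ensemble: the variance-reduction idea behind random forests.
ensemble = []
for _ in range(50):
    idx = rng.integers(0, len(x), len(x))      # bootstrap resample
    ensemble.append(fit_stump(x[idx], y[idx]))

xt = rng.uniform(0, 1, 300)
yt = np.sin(2 * np.pi * xt)                    # noiseless target for scoring
err_single = float(((predict_stump(single, xt) - yt) ** 2).mean())
err_bag = float(((np.mean([predict_stump(s, xt) for s in ensemble], axis=0)
                  - yt) ** 2).mean())
```

How much the ensemble helps depends on the variance of the base learner; the study's contrast of training error, true prediction error, and variable importance between RF and a single regression tree follows the same logic at full scale.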

  2. Representation of activity in images using geospatial temporal graphs

    Science.gov (United States)

    Brost, Randolph; McLendon, III, William C.; Parekh, Ojas D.; Rintoul, Mark Daniel; Watson, Jean-Paul; Strip, David R.; Diegert, Carl

    2018-05-01

    Various technologies pertaining to modeling patterns of activity observed in remote sensing images using geospatial-temporal graphs are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Activity patterns may be discerned from the graphs by coding nodes representing persistent objects like buildings differently from nodes representing ephemeral objects like vehicles, and examining the geospatial-temporal relationships of ephemeral nodes within the graph.
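
    A minimal version of such a geospatial-temporal graph can be held in plain dictionaries: nodes flagged persistent or ephemeral, undirected spatial edges, and directed temporal edges. Everything below is a hypothetical two-frame scene, not data from the described system:

```python
# Nodes carry a type flag (persistent vs ephemeral) plus coordinates and time.
nodes = {
    "bldg1": {"kind": "persistent", "xy": (10, 20)},
    "car_t0": {"kind": "ephemeral", "xy": (12, 21), "t": 0},
    "car_t1": {"kind": "ephemeral", "xy": (30, 21), "t": 1},
}
edges = [
    ("bldg1", "car_t0", {"rel": "adjacent"}),          # undirected spatial edge
    ("bldg1", "car_t1", {"rel": "distance", "d": 20.0}),
    ("car_t0", "car_t1", {"rel": "time"}),             # directed temporal edge
]

def ephemeral_track(nodes, edges):
    """Follow directed temporal edges between ephemeral nodes to recover
    an object's movement through the scene."""
    return [(a, b) for a, b, attr in edges
            if attr["rel"] == "time"
            and nodes[a]["kind"] == nodes[b]["kind"] == "ephemeral"]

print(ephemeral_track(nodes, edges))   # → [('car_t0', 'car_t1')]
```

Activity patterns such as "a vehicle stopped next to a building and later left" then become subgraph queries over these typed nodes and edges.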

  3. Technologies Connotation and Developing Characteristics of Open Geospatial Information Platform

    Directory of Open Access Journals (Sweden)

    GUO Renzhong

    2016-02-01

    Full Text Available Against the background of developments in surveying, mapping and geoinformation, and aimed at the demands of data fusion, real-time sharing, in-depth processing and personalization, this paper analyzes significant features of geospatial services in the digital city, focusing on the theory, methods and key techniques of an open cloud-computing environment, multi-path data updating, full-scale urban geocoding, multi-source spatial data integration, adaptive geo-processing and adaptive Web mapping. On this basis, the Open Geospatial information platform is developed and successfully applied in digital Shenzhen.

  4. Assessing the socioeconomic impact and value of open geospatial information

    Science.gov (United States)

    Pearlman, Francoise; Pearlman, Jay; Bernknopf, Richard; Coote, Andrew; Craglia, Massimo; Friedl, Lawrence; Gallo, Jason; Hertzfeld, Henry; Jolly, Claire; Macauley, Molly K.; Shapiro, Carl; Smart, Alan

    2016-03-10

    The production and accessibility of geospatial information including Earth observation is changing greatly both technically and in terms of human participation. Advances in technology have changed the way that geospatial data are produced and accessed, resulting in more efficient processes and greater accessibility than ever before. Improved technology has also created opportunities for increased participation in the gathering and interpretation of data through crowdsourcing and citizen science efforts. Increased accessibility has resulted in greater participation in the use of data as prices for Government-produced data have fallen and barriers to access have been reduced.

  5. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    Science.gov (United States)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic under a distributed computing environment. In order to make use of these local resources together to solve larger geospatial information processing problems that are related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a term named as Equivalent Distributed Program of global geospatial queries to solve geospatial distributed computing problems under heterogeneous GIS environments. First, a geospatial query process schema for distributed computing as well as a method for equivalent transformation from a global geospatial query to distributed local queries at SQL (Structured Query Language) level to solve the coordinating problem among heterogeneous resources are presented. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment that consists of autonomous geospatial information resources, thus to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
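
    The "Equivalent Distributed Program" idea — rewriting one global query into per-peer local SQL queries whose merged results answer the original request — can be sketched as follows; the peer directory and table names are hypothetical:

```python
# Hypothetical peer directory: each autonomous GIS holds one regional table.
PEERS = {
    "peer_north": "roads_north",
    "peer_south": "roads_south",
}

def equivalent_distributed_program(global_sql_where):
    """Rewrite one global query into an equivalent set of local SQL
    queries, one per peer, whose union answers the global request."""
    return {peer: f"SELECT * FROM {table} WHERE {global_sql_where}"
            for peer, table in PEERS.items()}

def merge(results_by_peer):
    """Combine the per-peer result sets into the global answer."""
    merged = []
    for rows in results_by_peer.values():
        merged.extend(rows)
    return merged

local = equivalent_distributed_program("road_class = 'highway'")
```

In the paper this transformation happens at the SQL level against a loosely coupled, peer-to-peer-maintained resource directory, so the set of peers can change while queries are in flight.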

  6. A Global Geospatial Database of 5000+ Historic Flood Event Extents

    Science.gov (United States)

    Tellman, B.; Sullivan, J.; Doyle, C.; Kettner, A.; Brakenridge, G. R.; Erickson, T.; Slayback, D. A.

    2017-12-01

    A key dataset that is missing for global flood model validation and understanding historic spatial flood vulnerability is a global historical geo-database of flood event extents. Decades of earth observing satellites and cloud computing now make it possible not only to detect floods in near real time, but to run these water detection algorithms back in time to capture the spatial extent of large numbers of specific events. This talk will show results from the largest global historical flood database developed to date. We use the Dartmouth Flood Observatory flood catalogue to map over 5000 floods (from 1985-2017) using MODIS, Landsat, and Sentinel-1 satellites. All events are available for public download via the Earth Engine Catalogue and via a website that allows the user to query floods by area or date, assess population exposure trends over time, and download flood extents in geospatial format. In this talk, we will highlight major trends in global flood exposure per continent, land use type, and eco-region. We will also suggest how to use this dataset in conjunction with other global data sets to i) validate global flood models, ii) assess the potential role of climatic change in flood exposure, iii) understand how urbanization and other land change processes may influence spatial flood exposure, iv) assess how innovative flood interventions (e.g. wetland restoration) influence flood patterns, v) control for event magnitude to assess the role of social vulnerability and damage assessment, and vi) aid in rapid probabilistic risk assessment to enable microinsurance markets. The authors are already using the database for the latter three applications and will show examples of wetland intervention analysis in Argentina, social vulnerability analysis in the USA, and microinsurance in India.
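
    Querying such a catalogue by area or date reduces to bounding-box intersection plus a date filter. A sketch with two invented records (not real Dartmouth Flood Observatory entries):

```python
from datetime import date

# Two hypothetical catalogue entries; bbox = (min_lon, min_lat, max_lon, max_lat).
FLOODS = [
    {"id": 1, "start": date(2010, 6, 1), "bbox": (-60.0, -35.0, -57.0, -33.0)},
    {"id": 2, "start": date(2015, 8, 9), "bbox": (72.0, 18.0, 74.0, 20.0)},
]

def bbox_intersects(a, b):
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

def query(area=None, after=None):
    """Filter the catalogue by an area of interest and/or a start date."""
    hits = FLOODS
    if area is not None:
        hits = [f for f in hits if bbox_intersects(f["bbox"], area)]
    if after is not None:
        hits = [f for f in hits if f["start"] >= after]
    return [f["id"] for f in hits]

print(query(area=(70.0, 15.0, 80.0, 25.0), after=date(2012, 1, 1)))  # → [2]
```

The real service runs the equivalent filters server-side and returns the matching flood extents as geospatial files.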

  7. GEOSPATIAL DATA INTEGRATION FOR ASSESSING LANDSLIDE HAZARD ON ENGINEERED SLOPES

    Directory of Open Access Journals (Sweden)

    P. E. Miller

    2012-07-01

    Full Text Available Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator’s hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator’s existing field-based approaches.
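
    The regression step — fitting a functional model that relates the factor of safety to the extracted slope variables — can be sketched with ordinary least squares. The variable names, ranges, and coefficients below are synthetic stand-ins for the paper's numerical-modelling outputs:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-ins for the remotely sensed slope-stability variables.
n = 200
slope_angle = rng.uniform(10, 45, n)        # degrees
vegetation = rng.uniform(0, 1, n)           # cover fraction
pore_pressure = rng.uniform(0, 50, n)       # kPa

# Pretend factor-of-safety values from the numerical stability model.
fos = (2.5 - 0.03 * slope_angle + 0.4 * vegetation
       - 0.01 * pore_pressure + rng.normal(0, 0.05, n))

# Regression step: fit the functional model FoS ~ inputs.
X = np.column_stack([np.ones(n), slope_angle, vegetation, pore_pressure])
coef, *_ = np.linalg.lstsq(X, fos, rcond=None)
```

Once such a model is fitted, the factor of safety can be estimated directly from the raster-derived variables for every cross-section, without rerunning the full numerical model.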

  8. A Smart Web-Based Geospatial Data Discovery System with Oceanographic Data as an Example

    Directory of Open Access Journals (Sweden)

    Yongyao Jiang

    2018-02-01

Full Text Available Discovering and accessing geospatial data presents a significant challenge for the Earth sciences community as massive amounts of data are being produced on a daily basis. In this article, we report a smart web-based geospatial data discovery system that mines and utilizes data relevancy from metadata user behavior. Specifically, (1) the system enables semantic query expansion and suggestion to assist users in finding more relevant data; (2) machine-learned ranking is utilized to provide the optimal search ranking based on a number of identified ranking features that can reflect users’ search preferences; (3) a hybrid recommendation module is designed to allow users to discover related data considering metadata attributes and user behavior; (4) an integrated graphic user interface design is developed to quickly and intuitively guide data consumers to the appropriate data resources. As a proof of concept, we focus on a well-defined domain, oceanography, and use oceanographic data discovery as an example. Experiments and a search example show that the proposed system can improve the scientific community’s data search experience by providing query expansion, suggestion, better search ranking, and data recommendation via a user-friendly interface.
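A minimal sketch of two of the ingredients above, query expansion and feature-based ranking. The synonym table, the sample dataset records, and the feature weights are all made up for illustration; the actual system learns its ranking from user behavior:

```python
# Hypothetical synonym table standing in for the system's semantic expansion
SYNONYMS = {"sst": ["sea surface temperature"]}

def expand(query):
    """Semantic query expansion: the raw term plus its known synonyms."""
    return [query.lower()] + SYNONYMS.get(query.lower(), [])

def score(record, terms, w_text=1.0, w_clicks=0.01):
    """Blend a text-match feature with a user-behaviour feature (clicks)."""
    text = record["metadata"].lower()
    text_hits = sum(text.count(t) for t in terms)
    return w_text * text_hits + w_clicks * record["clicks"]

datasets = [
    {"id": "A", "metadata": "Global sea surface temperature, daily", "clicks": 3},
    {"id": "B", "metadata": "Ocean salinity profiles", "clicks": 50},
    {"id": "C", "metadata": "SST anomaly monthly means", "clicks": 0},
]
terms = expand("SST")
ranked = sorted(datasets, key=lambda r: score(r, terms), reverse=True)
```

Expansion lets record A match even though it never contains the literal string "SST", while the click feature breaks ties among textual matches.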

  9. Aspect level sentiment analysis using machine learning

    Science.gov (United States)

    Shubham, D.; Mithil, P.; Shobharani, Meesala; Sumathy, S.

    2017-11-01

In the modern world, the development of the web and smartphones has increased the usage of online shopping. Overall feedback about a product is generated with the help of sentiment analysis using text processing. Opinion mining, or sentiment analysis, is used to collect and categorize product reviews. The proposed system uses aspect-level detection in which features are extracted from the datasets. The system performs pre-processing operations such as tokenization, part-of-speech tagging and lemmatization on the data to find meaningful information, which is used to detect the polarity level and assign a rating to the product. The proposed model focuses on aspects to produce accurate results by avoiding spam reviews.
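A toy sketch of aspect-level polarity detection. The lexicons and aspect list are invented for illustration, and scoring only the few words following an aspect term is a deliberate simplification of the pipeline described (which also uses part-of-speech tagging and lemmatization):

```python
import re

# Tiny hand-made lexicons; a real system would learn or import these
POSITIVE = {"great", "good", "excellent", "fast"}
NEGATIVE = {"bad", "poor", "slow", "broken"}
ASPECTS = {"battery", "screen", "price"}

def aspect_sentiment(review):
    """Score each detected aspect from the polarity of nearby words."""
    tokens = re.findall(r"[a-z]+", review.lower())  # crude tokenization
    scores = {}
    for i, tok in enumerate(tokens):
        if tok in ASPECTS:
            window = tokens[i + 1:i + 4]  # the few words after the aspect
            scores[tok] = sum((w in POSITIVE) - (w in NEGATIVE) for w in window)
    return scores
```

Per-aspect scores like these can then be aggregated across reviews into a product rating.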

  10. Acceptance on Mobile Learning via SMS: A Rasch Model Analysis

    Directory of Open Access Journals (Sweden)

    Issham Ismail

    2010-04-01

Full Text Available This study investigated whether mobile learning via Short Message Service (SMS-learning) is accepted by students enrolled in the distance learning academic programme at the Universiti Sains Malaysia. This study explored the impact of perceived usefulness, perceived ease of use and usability of the system on its acceptability. The survey was constructed using a questionnaire consisting of statements regarding the participants’ demographics, and experiences in and perceptions of using mobile learning via SMS, involving 105 students from the management and sciences disciplines. Rasch model analysis was used for measurement, corresponding to a 5-point Likert scale. Results indicated that the usability of the system contributed to its effectiveness in assisting the students with their study. Respondents agree that SMS-learning is easy, effective and useful to help them study. However, the results revealed a problem in mobile learning: less interaction with lecturers. This implies that students highly endorse this mode of communication and interaction.
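The study fits Likert responses with a Rasch model; its polytomous variant is beyond a short example, but the dichotomous core of the model is simple to state: the probability of endorsing an item is logistic in the difference between person ability and item difficulty.

```python
import math

def rasch_p(theta, delta):
    """Dichotomous Rasch model: probability that a respondent of ability
    theta endorses an item of difficulty delta."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))
```

When ability equals difficulty the endorsement probability is exactly 0.5, and it rises monotonically with ability; the 5-point Likert data in the study are handled by a rating-scale extension of this same form.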

  11. Simulated interprofessional education: an analysis of teaching and learning processes.

    Science.gov (United States)

    van Soeren, Mary; Devlin-Cop, Sandra; Macmillan, Kathleen; Baker, Lindsay; Egan-Lee, Eileen; Reeves, Scott

    2011-11-01

    Simulated learning activities are increasingly being used in health professions and interprofessional education (IPE). Specifically, IPE programs are frequently adopting role-play simulations as a key learning approach. Despite this widespread adoption, there is little empirical evidence exploring the teaching and learning processes embedded within this type of simulation. This exploratory study provides insight into the nature of these processes through the use of qualitative methods. A total of 152 clinicians, 101 students and 9 facilitators representing a range of health professions, participated in video-recorded role-plays and debrief sessions. Videotapes were analyzed to explore emerging issues and themes related to teaching and learning processes related to this type of interprofessional simulated learning experience. In addition, three focus groups were conducted with a subset of participants to explore perceptions of their educational experiences. Five key themes emerged from the data analysis: enthusiasm and motivation, professional role assignment, scenario realism, facilitator style and background and team facilitation. Our findings suggest that program developers need to be mindful of these five themes when using role-plays in an interprofessional context and point to the importance of deliberate and skilled facilitation in meeting desired learning outcomes.

  12. Mapping learning and game mechanics for serious games analysis

    NARCIS (Netherlands)

    Arnab, S.; Lim, T.; Brandao Carvalho, M.; Bellotti, F.; De Freitas, S.; Louchart, S.; Suttie, N.; Berta, R.; De Gloria, A.

    2015-01-01

    Although there is a consensus on the instructional potential of Serious Games (SGs), there is still a lack of methodologies and tools not only for design but also to support analysis and assessment. Filling this gap is one of the main aims of the Games and Learning Alliance (http://www.galanoe.eu)

  13. Conversation Analysis in Computer-Assisted Language Learning

    Science.gov (United States)

    González-Lloret, Marta

    2015-01-01

    The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…

  14. Cooperative Learning in Turkey: A Content Analysis of Theses

    Science.gov (United States)

    Dirlikli, Murat

    2016-01-01

    This study is a content analysis of theses concerning cooperative learning prepared in Turkey between the years 1993 and 2014. A total of 220 theses which were accessible online (open access) at the site of Council of Higher Education (CoHE) were analyzed. The publishing classification form used in this study was prepared analyzing similar forms…

  15. Mapping Learning and Game Mechanics for Serious Games Analysis

    Science.gov (United States)

    Arnab, Sylvester; Lim, Theodore; Carvalho, Maira B.; Bellotti, Francesco; de Freitas, Sara; Louchart, Sandy; Suttie, Neil; Berta, Riccardo; De Gloria, Alessandro

    2015-01-01

    Although there is a consensus on the instructional potential of Serious Games (SGs), there is still a lack of methodologies and tools not only for design but also to support analysis and assessment. Filling this gap is one of the main aims of the Games and Learning Alliance (http://www.galanoe.eu) European Network of Excellence on Serious Games,…

  16. A NoSQL–SQL Hybrid Organization and Management Approach for Real-Time Geospatial Data: A Case Study of Public Security Video Surveillance

    Directory of Open Access Journals (Sweden)

    Chen Wu

    2017-01-01

Full Text Available With the widespread deployment of ground, air and space sensor sources (internet of things or IoT, social networks, sensor networks), the integrated applications of real-time geospatial data from ubiquitous sensors, especially in public security and smart city domains, are becoming challenging issues. The traditional geographic information system (GIS) mostly manages time-discretized geospatial data by means of a Structured Query Language (SQL) database management system (DBMS) and emphasizes query and retrieval of massive historical geospatial data on disk. This limits its capability for on-the-fly access of real-time geospatial data for online analysis in real time. This paper proposes a hybrid database organization and management approach with SQL relational databases (RDB) and not-only-SQL (NoSQL) databases (including the main memory database, MMDB, and distributed file system, DFS). This hybrid approach makes full use of the advantages of NoSQL and SQL DBMSs for real-time access of input data and structured on-the-fly analysis results, which can meet the requirements of increased spatio-temporal big data linking analysis. The MMDB facilitates real-time access of the latest input data such as the sensor web and IoT, and supports real-time queries for online geospatial analysis. The RDB stores change information such as multi-modal features and abnormal events extracted from real-time input data. The DFS on disk manages the massive geospatial data, and the extensible storage architecture and distributed scheduling of a NoSQL database satisfy the performance requirements of incremental storage and multi-user concurrent access. A case study of geographic video (GeoVideo) surveillance of public security is presented to prove the feasibility of this hybrid organization and management approach.
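The three-tier routing described (MMDB for the latest observations, RDB for extracted events, DFS for bulk history) can be caricatured in a few lines. The class, the anomaly rule, and the sensor names below are illustrative stand-ins, not the authors' implementation:

```python
import time

class HybridStore:
    """Toy stand-in for the paper's MMDB + RDB + DFS tiers."""
    def __init__(self, anomaly_threshold=100.0):
        self.hot = {}        # MMDB tier: latest observation per sensor, in memory
        self.events = []     # RDB tier: structured change/anomaly records
        self.archive = []    # DFS tier: append-only bulk history
        self.anomaly_threshold = anomaly_threshold

    def ingest(self, sensor_id, value, ts=None):
        ts = time.time() if ts is None else ts
        self.hot[sensor_id] = (ts, value)            # real-time query path
        self.archive.append((ts, sensor_id, value))  # incremental bulk storage
        if abs(value) > self.anomaly_threshold:      # hypothetical event rule
            self.events.append((ts, sensor_id, value))

    def latest(self, sensor_id):
        """Online analysis reads the hot tier, never the disk archive."""
        return self.hot.get(sensor_id)

store = HybridStore()
store.ingest("cam-01", 12.0, ts=1.0)
store.ingest("cam-01", 150.0, ts=2.0)
```

The point of the split is that `latest` stays O(1) and memory-resident regardless of how large the on-disk archive grows.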

  17. Analysis of an Interactive Technology Supported Problem-Based Learning STEM Project Using Selected Learning Sciences Interest Areas (SLSIA)

    Science.gov (United States)

    Kumar, David Devraj

    2017-01-01

    This paper reports an analysis of an interactive technology-supported, problem-based learning (PBL) project in science, technology, engineering and mathematics (STEM) from a Learning Sciences perspective using the Selected Learning Sciences Interest Areas (SLSIA). The SLSIA was adapted from the "What kinds of topics do ISLS [International…

  18. A resource-oriented architecture for a Geospatial Web

    Science.gov (United States)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this aim, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic Distributed Computing Infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit in the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Maintaining the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web.
On the contrary

  19. MultiSpec: A Desktop and Online Geospatial Image Data Processing Tool

    Science.gov (United States)

    Biehl, L. L.; Hsu, W. K.; Maud, A. R. M.; Yeh, T. T.

    2017-12-01

    MultiSpec is an easy to learn and use, freeware image processing tool for interactively analyzing a broad spectrum of geospatial image data, with capabilities such as image display, unsupervised and supervised classification, feature extraction, feature enhancement, and several other functions. Originally developed for Macintosh and Windows desktop computers, it has a community of several thousand users worldwide, including researchers and educators, as a practical and robust solution for analyzing multispectral and hyperspectral remote sensing data in several different file formats. More recently MultiSpec was adapted to run in the HUBzero collaboration platform so that it can be used within a web browser, allowing new user communities to be engaged through science gateways. MultiSpec Online has also been extended to interoperate with other components (e.g., data management) in HUBzero through integration with the geospatial data building blocks (GABBs) project. This integration enables a user to directly launch MultiSpec Online from data that is stored and/or shared in a HUBzero gateway and to save output data from MultiSpec Online to hub storage, allowing data sharing and multi-step workflows without having to move data between different systems. MultiSpec has also been used in K-12 classes for which one example is the GLOBE program (www.globe.gov) and in outreach material such as that provided by the USGS (eros.usgs.gov/educational-activities). MultiSpec Online now provides teachers with another way to use MultiSpec without having to install the desktop tool. Recently MultiSpec Online was used in a geospatial data session with 30-35 middle school students at the Turned Onto Technology and Leadership (TOTAL) Camp in the summers of 2016 and 2017 at Purdue University. The students worked on a flood mapping exercise using Landsat 5 data to learn about land remote sensing using supervised classification techniques. Online documentation is available for Multi

  20. Crisp Clustering Algorithm for 3D Geospatial Vector Data Quantization

    DEFF Research Database (Denmark)

    Azri, Suhaibah; Anton, François; Ujang, Uznir

    2015-01-01

    In the next few years, 3D data is expected to be an intrinsic part of geospatial data. However, issues on 3D spatial data management are still in the research stage. One of the issues is performance deterioration during 3D data retrieval. Thus, a practical 3D index structure is required for effic...

  1. Geospatial Technology In Environmental Impact Assessments – Retrospective.

    Directory of Open Access Journals (Sweden)

    Goparaju Laxmi

    2015-10-01

    Full Text Available Environmental Impact Assessments are studies conducted to give us an insight into the various impacts caused by an upcoming industry or any developmental activity. It should address various social, economic and environmental issues ensuring that negative impacts are mitigated. In this context, geospatial technology has been used widely in recent times.

  2. Automated Geospatial Watershed Assessment Tool (AGWA) Poster Presentation

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA, see: www.tucson.ars.ag.gov/agwa or http://www.epa.gov/esd/land-sci/agwa/) is a GIS interface jointly developed by the USDA-Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona...

  3. Big Data analytics in the Geo-Spatial Domain

    NARCIS (Netherlands)

    R.A. Goncalves (Romulo); M.G. Ivanova (Milena); M.L. Kersten (Martin); H. Scholten; S. Zlatanova; F. Alvanaki (Foteini); P. Nourian (Pirouz); E. Dias

    2014-01-01

Big data collections in many scientific domains have inherently rich spatial and geo-spatial features. Spatial location is among the core aspects of data in Earth observation sciences, astronomy, and seismology to name a few. The goal of our project is to design an efficient data

  4. A study on state of Geospatial courses in Indian Universities

    Science.gov (United States)

    Shekhar, S.

    2014-12-01

Today the world is dominated by three technologies: nanotechnology, biotechnology and geospatial technology. This creates a huge demand for experts in each field, both to disseminate knowledge and to carry out innovative research. Therefore, the prime need is to train the existing fraternity to gain progressive knowledge in these technologies and impart the same to the student community. Geospatial technology faces a more peculiar problem than the other two technologies because of its interdisciplinary, multi-disciplinary nature. It attracts students and mid-career professionals from various disciplines including Physics, Computer Science, Engineering, Geography, Geology, Agriculture, Forestry, Town Planning and so on. Hence there is always competition to grab and stabilize their position. Students of a Master's degree in geospatial science face two types of problems. The first is the lack of a unique identity in the academic field: they are neither exempted from the National Eligibility Test for Lectureship nor given an opportunity to take the exam in geospatial science. The second is differential treatment by the industrial world: the students are either given low-grade jobs or poorly paid for their work. Thus, there is a serious issue about the future of this course in the universities and its recognition in the academic and industrial worlds. Universities should make this course more job-oriented in consultation with industry, and industries should come forward to share their demands and requirements with the universities, so that the necessary changes in the curriculum can be made to meet industrial requirements.

  5. Persistent Teaching Practices after Geospatial Technology Professional Development

    Science.gov (United States)

    Rubino-Hare, Lori A.; Whitworth, Brooke A.; Bloom, Nena E.; Claesgens, Jennifer M.; Fredrickson, Kristi M.; Sample, James C.

    2016-01-01

    This case study described teachers with varying technology skills who were implementing the use of geospatial technology (GST) within project-based instruction (PBI) at varying grade levels and contexts 1 to 2 years following professional development. The sample consisted of 10 fifth- to ninth-grade teachers. Data sources included artifacts,…

  6. Theoretical multi-tier trust framework for the geospatial domain

    CSIR Research Space (South Africa)

    Umuhoza, D

    2010-01-01

Full Text Available chain or workflow from data acquisition to knowledge discovery. The authors present work in progress on a theoretical multi-tier trust framework for the processing chain from data acquisition to knowledge discovery in the geospatial domain. Holistic trust...

  7. Sextant: Visualizing time-evolving linked geospatial data

    NARCIS (Netherlands)

    C. Nikolaou (Charalampos); K. Dogani (Kallirroi); K. Bereta (Konstantina); G. Garbis (George); M. Karpathiotakis (Manos); K. Kyzirakos (Konstantinos); M. Koubarakis (Manolis)

    2015-01-01

The linked open data cloud is constantly evolving as datasets get continuously updated with newer versions. As a result, representing, querying, and visualizing the temporal dimension of linked data is crucial. This is especially important for geospatial datasets that form the backbone

  8. Shared Geospatial Metadata Repository for Ontario University Libraries: Collaborative Approaches

    Science.gov (United States)

    Forward, Erin; Leahey, Amber; Trimble, Leanne

    2015-01-01

    Successfully providing access to special collections of digital geospatial data in academic libraries relies upon complete and accurate metadata. Creating and maintaining metadata using specialized standards is a formidable challenge for libraries. The Ontario Council of University Libraries' Scholars GeoPortal project, which created a shared…

  9. Geospatial Data Repository. Sharing Data Across the Organization and Beyond

    National Research Council Canada - National Science Library

    Ruiz, Marilyn

    2001-01-01

    .... This short Technical Note discusses a five-part approach to creating a data repository that addresses the problems of the historical organizational framework for geospatial data. Fort Hood, Texas was the site used to develop the prototype. A report documenting the complete study will be available in late Spring 2001.

  10. Learning Over Time: Using Rapid Prototyping Generative Analysis Experts and Reduction of Scope to Operationalize Design

    Science.gov (United States)

    2010-05-04

during the Vietnam Conflict. David A. Kolb, Experiential Learning: Experience as the Source of Learning and Development (Upper Saddle River, NJ)... Essentials for Military Applications. Newport Paper #10. Newport: Newport War College Press, 1996. Kolb, David A., Experiential Learning: Experience... learning over analysis. A broad review of design theory suggests that four techniques - rapid prototyping, generative analysis, use of experts, and

  11. Geospatial Modelling for Micro Zonation of Groundwater Regime in Western Assam, India

    Science.gov (United States)

    Singh, R. P.

    2016-12-01

Water, the most precious natural resource on earth, is vital to sustaining the natural system and human civilisation. The Assam state, located in the north-eastern part of India, has a relatively good source of groundwater due to its geographic and physiographic location, but deterioration of groundwater quality is causing major health problems in the area. In this study, I combined remote sensing, GIS, and chemical analysis of groundwater samples to throw light on the groundwater regime and provide information for decision makers to support sustainable water resource management. The geospatial modelling was performed by integrating hydrogeomorphic features. Geomorphology, lineament, drainage, and landuse/landcover layers were generated through visual interpretation of satellite imagery (LISS III) based on the tone, texture, shape, size, and arrangement of features. The slope layer was prepared using the SRTM DEM dataset. The LULC of the area was categorized into 6 classes: agricultural field, forest area, river, settlement, tree-clad area and wetlands. The geospatial modelling was performed through the weightage-and-rank method in GIS, depending on the influence of each feature on the groundwater regime. To assess the groundwater quality of the area, 45 groundwater samples were collected from the field, and chemical analysis was performed using standard laboratory methods. The overall groundwater quality of the area was analysed through the Water Quality Index, and it was found that about 70% of samples are not potable for drinking purposes due to higher concentrations of arsenic, fluoride and iron. It appears that all these pollutants are geologically and geomorphologically derived. An interpolated layer of the Water Quality Index and the geospatially modelled groundwater potential layer provide a holistic view of the groundwater scenario and give direction for better planning and groundwater resource management. The study will be discussed in detail
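The weightage-and-rank overlay used for the groundwater potential layer amounts to a per-cell weighted average of normalized thematic layers. A minimal sketch, with layer names and all rank/weight values invented for illustration:

```python
def weighted_overlay(layers, weights):
    """Weightage-and-rank combination of thematic layers.
    Each layer is a flat list of per-cell ranks normalized to [0, 1];
    weights reflect each theme's influence on the groundwater regime."""
    total = sum(weights)
    n_cells = len(layers[0])
    return [sum(w * layer[i] for w, layer in zip(weights, layers)) / total
            for i in range(n_cells)]

# Hypothetical ranks for two cells across three themes
geomorphology = [0.9, 0.2]
drainage = [0.6, 0.4]
slope = [0.8, 0.1]
potential = weighted_overlay([geomorphology, drainage, slope],
                             weights=[3, 2, 1])
```

In a GIS this same arithmetic runs over full rasters; here two cells are enough to show the weighting.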

  12. Learning Management System Migration: An Analysis of Stakeholder Perspectives

    Directory of Open Access Journals (Sweden)

    Tom G Ryan

    2012-01-01

    Full Text Available In this mixed methods study the authors describe the institution-level perceptions of stakeholders transitioning to a new learning management system (LMS. We address issues related to change, the institution’s administration of the transition process, problems encountered, and realized learning via online survey data collection, analysis, and interpretation. We further detail results of a faculty survey, which sought to illuminate the LMS transition experience. The summation includes suggestions for institutions as they prepare for, and move through, foreseeable LMS change and transition.

  13. Advances in independent component analysis and learning machines

    CERN Document Server

    Bingham, Ella; Laaksonen, Jorma; Lampinen, Jouko

    2015-01-01

In honour of Professor Erkki Oja, one of the pioneers of Independent Component Analysis (ICA), this book reviews key advances in the theory and application of ICA, as well as its influence on signal processing, pattern recognition, machine learning, and data mining. Examples of topics which have developed from the advances of ICA, and which are covered in the book, are: a unifying probabilistic model for PCA and ICA; optimization methods for matrix decompositions; insights into the FastICA algorithm; unsupervised deep learning; machine vision and image retrieval; a review of developments in the t

  14. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Årup; Frutiger, Sally A.

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel...... practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing. Hum. Brain Mapping 15...
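Clustering voxel time courses in the way this abstract describes can be illustrated with a plain k-means over synthetic activity curves. This is a toy version, not the authors' tool, and the four "voxel" series below are fabricated:

```python
import random

def kmeans(series, k=2, iters=20, seed=0):
    """Plain k-means over equal-length activity time courses."""
    rng = random.Random(seed)
    centroids = [list(s) for s in rng.sample(series, k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in series:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2
                                            for a, b in zip(s, centroids[j])))
            clusters[nearest].append(s)
        for j, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties
                centroids[j] = [sum(vals) / len(members)
                                for vals in zip(*members)]
    return clusters

# Synthetic time courses: two show a practice-related ramp, two a decline
series = [[0, 1, 2, 3], [0, 1, 2, 4], [3, 2, 1, 0], [4, 2, 1, 0]]
clusters = kmeans(series, k=2)
```

The extracted centroids play the role of the "representative temporal patterns" the abstract mentions, separating practice-related ramps from other time effects.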

  15. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    Science.gov (United States)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
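Any OGC-compliant client can pull imagery from a WMS server like the one described with a standard GetMap request. A sketch of building such a request; the endpoint URL and layer name below are hypothetical, while the parameter names come from the WMS 1.1.1 specification:

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, time, size=(512, 512), fmt="image/png"):
    """Build an OGC WMS 1.1.1 GetMap request URL.
    bbox is (minx, miny, maxx, maxy) in the coordinates of SRS."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "", "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": fmt,
        "TIME": time,  # the temporal dimension discussed in the abstract
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "tops:gpp",
                     bbox=(-125.0, 32.0, -114.0, 42.0), time="2008-06-01")
```

The `TIME` parameter is exactly where the abstract notes client support is weakest, which motivates the proposed time-series tools.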

  16. Geospatial decision support systems for societal decision making

    Science.gov (United States)

    Bernknopf, R.L.

    2005-01-01

    While science provides reliable information to describe and understand the earth and its natural processes, it can contribute more. There are many important societal issues in which scientific information can play a critical role. Science can add greatly to policy and management decisions to minimize loss of life and property from natural and man-made disasters, to manage water, biological, energy, and mineral resources, and in general, to enhance and protect our quality of life. However, the link between science and decision-making is often complicated and imperfect. Technical language and methods surround scientific research and the dissemination of its results. Scientific investigations often are conducted under different conditions, with different spatial boundaries, and in different timeframes than those needed to support specific policy and societal decisions. Uncertainty is not uniformly reported in scientific investigations. If society does not know that data exist, what the data mean, where to use the data, or how to include uncertainty when a decision has to be made, then science gets left out -or misused- in a decision making process. This paper is about using Geospatial Decision Support Systems (GDSS) for quantitative policy analysis. Integrated natural -social science methods and tools in a Geographic Information System that respond to decision-making needs can be used to close the gap between science and society. The GDSS has been developed so that nonscientists can pose "what if" scenarios to evaluate hypothetical outcomes of policy and management choices. In this approach decision makers can evaluate the financial and geographic distribution of potential policy options and their societal implications. Actions, based on scientific information, can be taken to mitigate hazards, protect our air and water quality, preserve the planet's biodiversity, promote balanced land use planning, and judiciously exploit natural resources. Applications using the

  17. Applying Authentic Data Analysis in Learning Earth Atmosphere

    Science.gov (United States)

    Johan, H.; Suhandi, A.; Samsudin, A.; Wulan, A. R.

    2017-09-01

The aim of this research was to develop earth science learning material, especially on the earth's atmosphere, supported by science research with authentic data analysis to enhance reasoning. Various earth and space science phenomena require reasoning. This research used an experimental design with a one-group pre-test/post-test. 23 pre-service physics teachers participated in this research. An essay test was conducted to obtain data about reasoning ability and was analyzed quantitatively. An observation sheet was used to capture phenomena during the learning process. The results showed that students' reasoning ability improved from unidentified and no reasoning to evidence-based reasoning and inductive/deductive rule-based reasoning. Authentic data were examined using the Grid Analysis Display System (GrADS). Visualization from GrADS facilitated students in correlating the concepts and brought real conditions of nature into classroom activity. It also helped students reason about phenomena related to earth and space science concepts. It can be concluded that applying authentic data analysis in the learning process can help enhance students' reasoning. This study is expected to help lecturers bring results of geoscience research into the learning process and facilitate students' understanding of concepts.

  18. Towards Geo-spatial Information Science in Big Data Era

    Directory of Open Access Journals (Sweden)

    LI Deren

    2016-04-01

Full Text Available Since the 1990s, with the advent of the worldwide information revolution and the development of the internet, geospatial information science has also come of age, which pushed forward the building of the digital Earth and cyber cities. As we entered the 21st century, with the development and integration of global information technology and industrialization, the internet of things and cloud computing came into being, and human society entered the big data era. This article covers the key features (ubiquity, multi-dimensionality and dynamics, internet+networking, full automation and real-time operation, from sensing to recognition, crowdsourcing and VGI, and service orientation) of geospatial information science in the big data era, and addresses the key technical issues (a non-linear four-dimensional Earth reference frame system, space-based enhanced GNSS, space-air-land unified network communication techniques, on-board processing techniques for multi-source image data, smart interface service techniques for space-borne information, space-based resource scheduling and network security, and the design and development of a payload-based multi-functional satellite platform) that need to be resolved to provide a new definition of geospatial information science in the big data era. Based on the discussion in this paper, the author finally proposes a new definition of geospatial information science (geomatics), i.e. geomatics is a multiple-discipline science and technology which, using a systematic approach, integrates all the means for spatio-temporal data acquisition, information extraction, networked management, knowledge discovery, spatial sensing and recognition, as well as intelligent location-based services of any physical objects and human activities around the earth and its environment. Starting from this new definition, geospatial information science will get many more chances and find many more tasks in the big data era for the generation of the smart earth and smart city. Our profession

  19. Global polar geospatial information service retrieval based on search engine and ontology reasoning

    Science.gov (United States)

    Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang

    2007-01-01

In order to improve the access precision of polar geospatial information services on the web, a new methodology for retrieving global spatial information services, based on geospatial service search and ontology reasoning, is proposed: the geospatial service search finds coarse candidate services on the web, and ontology reasoning then refines those coarse results. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include search-engine-based service discovery and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.

  20. Automatic geospatial information Web service composition based on ontology interface matching

    Science.gov (United States)

    Xu, Xianbin; Wu, Qunyong; Wang, Qinmin

    2008-10-01

With Web services technology, the functions of WebGIS can be presented as a kind of geospatial information service, helping to overcome the isolation of information in the geospatial information sharing field. Geospatial information Web service composition, which conglomerates outsourced services working in tandem to offer a value-added service, therefore plays the key role in fully exploiting geospatial information services. This paper proposes an automatic geospatial information web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among service interfaces. By matching input/output parameters and the semantic meaning of pairs of service interfaces, a geospatial information web service chain can be created from a number of candidate services. A practical implementation of the algorithm is also presented; its results show the feasibility of the algorithm and its great promise for the emerging demand for geospatial information web service composition.
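The core idea of the abstract, chaining candidate services by matching one service's output parameter against the next service's input parameter under a semantic-similarity threshold, can be sketched as follows. This is a hypothetical illustration, not the authors' algorithm: the similarity table stands in for WordNet-derived distances, and all service names and parameters are invented.

```python
# Sketch of I/O-matching service composition. The SIMILARITY table is a
# stand-in for WordNet-based semantic distances (hypothetical values).

SIMILARITY = {
    ("bbox", "extent"): 0.9,
    ("coverage", "raster"): 0.8,
}

def sim(a, b):
    """Symmetric semantic similarity between two parameter terms."""
    if a == b:
        return 1.0
    return SIMILARITY.get((a, b), SIMILARITY.get((b, a), 0.0))

def compose(request_input, request_output, services, threshold=0.6):
    """Greedily build a service chain from request_input to request_output."""
    chain, current = [], request_input
    remaining = list(services)
    while sim(current, request_output) < 1.0:
        # pick the candidate whose input best matches the current term
        best = max(remaining, key=lambda s: sim(current, s["in"]), default=None)
        if best is None or sim(current, best["in"]) < threshold:
            return None  # no admissible candidate: composition fails
        chain.append(best["name"])
        current = best["out"]
        remaining.remove(best)
    return chain

services = [
    {"name": "WCS_GetCoverage", "in": "extent", "out": "coverage"},
    {"name": "Reclassify", "in": "raster", "out": "image"},
]
print(compose("bbox", "image", services))  # ['WCS_GetCoverage', 'Reclassify']
```

A real implementation would replace the toy table with WordNet path distances and add backtracking over multiple candidate chains.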

  1. Validation of VARK learning modalities questionnaire using Rasch analysis

    Science.gov (United States)

    Fitkov-Norris, E. D.; Yeghiazarian, A.

    2015-02-01

This article discusses the application of Rasch analysis to assess the internal validity of a four-sub-scale VARK (Visual, Auditory, Read/Write and Kinaesthetic) learning styles instrument. The results show that the Rasch model fits the majority of the VARK questionnaire data, and the sample data support the internal validity of the four sub-constructs at the 1% level of significance for all but one item. While this suggests that the instrument could potentially be used as a predictor of a person's learning preference orientation, further analysis is necessary to confirm the invariance of the instrument across user groups differing in factors such as gender, age, and educational and cultural background.
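Rasch analysis rests on modelling the probability that a respondent endorses an item as a logistic function of the gap between person ability θ and item difficulty b. A minimal sketch of the generic dichotomous Rasch model (not the authors' fitted instrument; the parameter values are illustrative):

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: P(endorse) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When person ability equals item difficulty, the endorsement probability is 0.5.
print(rasch_probability(0.0, 0.0))   # 0.5
# An easier item (lower difficulty b) is endorsed with higher probability.
print(rasch_probability(0.0, -1.0))  # ~0.73
```

Fit assessment, as in the article, then compares observed response patterns against the probabilities this model predicts.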

  2. 12. Collaborative Learning – A Possible Approach of Learning in the Discipline of Study Musical Analysis

    Directory of Open Access Journals (Sweden)

    Vlahopol Gabriela

    2016-03-01

Full Text Available The musician's typology is anchored, according to the traditional perception, within the limits of an individualistic image: one who searches, develops and affirms creativity through an individual training process. Collaborative learning is one of the educational patterns less used in artistic education, being limited to the few disciplines whose specificity requires belonging to a study group (for instance chamber training, orchestra). Applying the method to theoretical disciplines often meets reservations on the part of teachers and students alike, because of the effort required for its design and implementation. This study offers a possible approach to collaborative learning within the course of study Musical Analysis, arguing for the need to develop the social component of the instrumental performance student's learning activities by involving him or her in a study group.

  3. A survey on deep learning in medical image analysis.

    Science.gov (United States)

    Litjens, Geert; Kooi, Thijs; Bejnordi, Babak Ehteshami; Setio, Arnaud Arindra Adiyoso; Ciompi, Francesco; Ghafoorian, Mohsen; van der Laak, Jeroen A W M; van Ginneken, Bram; Sánchez, Clara I

    2017-12-01

    Deep learning algorithms, in particular convolutional networks, have rapidly become a methodology of choice for analyzing medical images. This paper reviews the major deep learning concepts pertinent to medical image analysis and summarizes over 300 contributions to the field, most of which appeared in the last year. We survey the use of deep learning for image classification, object detection, segmentation, registration, and other tasks. Concise overviews are provided of studies per application area: neuro, retinal, pulmonary, digital pathology, breast, cardiac, abdominal, musculoskeletal. We end with a summary of the current state-of-the-art, a critical discussion of open challenges and directions for future research. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Geo-Spatial Support for Assessment of Anthropic Impact on Biodiversity

    Directory of Open Access Journals (Sweden)

    Marco Piragnolo

    2014-04-01

Full Text Available This paper discusses a methodology in which geo-spatial analysis tools are used to quantify the risk derived from anthropic activities on habitats and species. The method has been developed with a focus on simplification and on the quality of standard procedures for flora and fauna protected by the European Directives. In this case study, the DPSIR (Drivers, Pressures, State, Impacts, Responses) framework is applied using spatial procedures in a geographical information system (GIS). This approach can be placed in a multidimensional space, as the analysis is applied to each threat, pressure and activity, and also to each habitat and species, at both the spatial and the temporal scale. Threats, pressures and activities, stresses and indicators can be managed by means of a geo-database and analyzed using spatial analysis functions in a tested GIS workflow environment. The method applies a matrix of risk values, and the final product is a geo-spatial representation of impact indicators, which can be used as a support for decision-makers at various levels (regional, national and European).
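The central computation, applying a matrix of risk values to the pressures acting on each habitat to derive an impact indicator per spatial unit, might be sketched like this. The risk weights, habitats, and cells are hypothetical, not the study's actual matrix:

```python
# Sketch of a DPSIR-style impact computation: each spatial cell records the
# anthropic pressures acting on a habitat; a risk matrix assigns a weight to
# each (pressure, habitat) pair; the impact indicator is their sum.

RISK_MATRIX = {  # hypothetical risk weights
    ("agriculture", "wetland"): 3,
    ("roads", "wetland"): 2,
    ("agriculture", "forest"): 1,
}

def impact_indicator(habitat, pressures):
    """Sum the risk weights of all pressures acting on the habitat."""
    return sum(RISK_MATRIX.get((p, habitat), 0) for p in pressures)

cells = [
    {"id": 1, "habitat": "wetland", "pressures": ["agriculture", "roads"]},
    {"id": 2, "habitat": "forest", "pressures": ["agriculture"]},
]
impacts = {c["id"]: impact_indicator(c["habitat"], c["pressures"]) for c in cells}
print(impacts)  # {1: 5, 2: 1}
```

In a GIS workflow the per-cell indicators would then be joined back to the cell geometries to map the impact surface.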

  5. Emerging Online Learning Environments and Student Learning: An Analysis of Faculty Perceptions

    Directory of Open Access Journals (Sweden)

    Gary Brown

    2004-01-01

Full Text Available New educational technologies and online learning environments (OLEs) are infiltrating today's college classes and campuses. While research has examined many aspects of this permeation, one research gap remains: how do faculty perceive the learning experience in courses that use OLEs compared to courses that do not? One important factor that may influence faculty perceptions is their reasons for teaching with OLEs. This paper seeks to understand how faculty perceive OLEs as a function of their reasons for teaching with this educational technology. It also investigates whether faculty evaluations of OLEs differ by gender and by years of teaching. The analysis reveals several noteworthy patterns. First, it appears that favorable opinions about the learning experience in online learning environments arise not because faculty are motivated to learn about new technologies per se, but because they want to update their vitas and teaching skills. Second, the results suggest that it may be harder to convince older and more experienced faculty to use new technologies than younger and less experienced faculty. These results apply to both male and female faculty and provide practical implications for universities and support services on how to recruit and then support faculty who implement educational technologies.

  6. Evaluating the Open Source Data Containers for Handling Big Geospatial Raster Data

    Directory of Open Access Journals (Sweden)

    Fei Hu

    2018-04-01

Full Text Available Big geospatial raster data pose a grand challenge to data management technologies for effective big data query and processing. To address these challenges, various big data container solutions have been developed or enhanced to facilitate data storage, retrieval, and analysis, including containers for geospatial data: for example, Rasdaman was developed to handle raster data, and GeoSpark/SpatialHadoop were enhanced from Spark/Hadoop to handle vector data. However, few studies systematically compare and evaluate the features and performance of these popular data containers. This paper provides a comprehensive evaluation of six popular data containers (i.e., Rasdaman, SciDB, Spark, ClimateSpark, Hive, and MongoDB) for handling multi-dimensional, array-based geospatial raster datasets. Their architectures, technologies, capabilities, and performance are compared and evaluated from two perspectives: (a) system design and architecture (distributed architecture, logical data model, physical data model, and data operations); and (b) practical use experience and performance (data preprocessing, data uploading, query speed, and resource consumption). Four major conclusions are offered: (1) no data container except ClimateSpark has good support for the HDF data format used in this paper, so time- and resource-consuming data preprocessing is required to load the data; (2) SciDB, Rasdaman, and MongoDB handle small-to-moderate data query volumes well, whereas Spark and ClimateSpark can handle large volumes of data with stable resource consumption; (3) SciDB and Rasdaman provide mature array-based data operations and analytical functions, which the others lack; and (4) SciDB, Spark, and Hive have better support for user-defined functions (UDFs) to extend system capability.

  7. Creating of Central Geospatial Database of the Slovak Republic and Procedures of its Revision

    Science.gov (United States)

    Miškolci, M.; Šafář, V.; Šrámková, R.

    2016-06-01

The article describes the creation of the initial three-dimensional geodatabase, from planning and design through the determination of technological and manufacturing processes to the practical use of the Central Geospatial Database (CGD; the official name in Slovak is Centrálna Priestorová Databáza, CPD), and briefly describes the procedures for its revision. CGD ensures the proper collection, processing, storing, transferring and displaying of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis-management tasks and by the Integrated Rescue System. For military personnel, CGD runs on the MoD intranet; for other users outside the MoD it is transformed into ZbGIS (the Primary Geodatabase of the Slovak Republic) and runs on a public web site. CGD is a global set of geospatial information: a seamless vector computer model that completely covers the entire territory of Slovakia, created by digitizing the real world using photogrammetric stereoscopic methods and measurements of object properties. The basic vector model of CGD (from photogrammetric processing) is then taken into the field for inspection and additional gathering of object properties across the whole mapping area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD gives us the opportunity to get to know the territory comprehensively in all three spatial dimensions. Every entity in CGD records its time of collection, which allows the individual to assess the timeliness of the information. CGD can be utilized for geographical analysis, geo-referencing, and cartographic purposes, as well as various special-purpose mapping, and has the ambition not only to cover the needs of the MoD but to become a reference model for the national geographical infrastructure.

  8. Tsunami vertical-evacuation planning in the U.S. Pacific Northwest as a geospatial, multi-criteria decision problem

    Science.gov (United States)

    Wood, Nathan; Jones, Jeanne; Schelling, John; Schmidtlein, Mathew

    2014-01-01

    Tsunami vertical-evacuation (TVE) refuges can be effective risk-reduction options for coastal communities with local tsunami threats but no accessible high ground for evacuations. Deciding where to locate TVE refuges is a complex risk-management question, given the potential for conflicting stakeholder priorities and multiple, suitable sites. We use the coastal community of Ocean Shores (Washington, USA) and the local tsunami threat posed by Cascadia subduction zone earthquakes as a case study to explore the use of geospatial, multi-criteria decision analysis for framing the locational problem of TVE siting. We demonstrate a mixed-methods approach that uses potential TVE sites identified at community workshops, geospatial analysis to model changes in pedestrian evacuation times for TVE options, and statistical analysis to develop metrics for comparing population tradeoffs and to examine influences in decision making. Results demonstrate that no one TVE site can save all at-risk individuals in the community and each site provides varying benefits to residents, employees, customers at local stores, tourists at public venues, children at schools, and other vulnerable populations. The benefit of some proposed sites varies depending on whether or not nearby bridges will be functioning after the preceding earthquake. Relative rankings of the TVE sites are fairly stable under various criteria-weighting scenarios but do vary considerably when comparing strategies to exclusively protect tourists or residents. The proposed geospatial framework can serve as an analytical foundation for future TVE siting discussions.
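The multi-criteria step described here, scoring each candidate refuge site against several criteria and re-ranking under different weight scenarios, can be sketched as a weighted sum. The sites, criteria, scores, and weights below are hypothetical, not the Ocean Shores data:

```python
# Weighted-sum multi-criteria ranking of candidate refuge sites.
# Scores are normalised to 0-1 per criterion (hypothetical values).

sites = {
    "school_roof": {"residents": 0.9, "tourists": 0.2, "cost": 0.6},
    "beach_berm":  {"residents": 0.4, "tourists": 0.9, "cost": 0.8},
}

def rank(sites, weights):
    """Return site names ordered by descending weighted score."""
    score = lambda s: sum(weights[c] * v for c, v in s.items())
    return sorted(sites, key=lambda name: score(sites[name]), reverse=True)

# A resident-focused versus a tourist-focused weighting flips the ranking,
# mirroring the tradeoffs the abstract reports between protection strategies.
print(rank(sites, {"residents": 0.6, "tourists": 0.2, "cost": 0.2}))
print(rank(sites, {"residents": 0.2, "tourists": 0.6, "cost": 0.2}))
```

Sensitivity analysis, as in the paper, amounts to sweeping the weight vector and observing how stable the ranking remains.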

  9. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    Science.gov (United States)

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  10. Life Cycle Management Considerations of Remotely Sensed Geospatial Data and Documentation for Long Term Preservation

    Science.gov (United States)

    Khayat, Mohammad G.; Kempler, Steven J.

    2015-01-01

As geospatial missions age, one of the challenges for the usability of data is the availability of relevant, updated metadata with sufficient documentation for future generations of users to gain knowledge from the original data. Given that remote sensing data undergo many intermediate processing steps, an understanding of the exact algorithms employed and of the quality of the data produced can be key considerations for these users. As interest in global climate data increases, documentation about older data, their origins, and their provenance is valuable to first-time users attempting to perform historical climate research or comparative analyses of global change. Incomplete or missing documentation could be what stands in the way of a new researcher attempting to use the data; the preservation of documentation and related metadata is therefore sometimes just as critical as the preservation of the original observational data. The Goddard Earth Sciences Data and Information Services Center (GES DISC), a NASA Earth science Distributed Active Archive Center (DAAC) that falls under the management structure of the Earth Science Data and Information System (ESDIS), is actively pursuing the preservation of all necessary artifacts needed by future users. In this paper we detail the data custodial planning and the data lifecycle process developed for content preservation and our implementation of a Preservation System to safeguard documents and associated artifacts from legacy (older) missions, and we share lessons learned regarding access rights and confidentiality-of-information issues. We also elaborate on the key points that made our preservation effort successful, primarily the drafting of a governing baseline for historical data preservation from satellite missions and the use of that baseline as a guide for filtering which documents to preserve. The Preservation System currently archives

  11. Machine Learning for the Evolutionary Analysis of Breast Cancer

    Directory of Open Access Journals (Sweden)

    Alexander Mackenzie Rivero

    2018-02-01

Full Text Available The use of machine learning allows the creation of a predictive data model, here resulting from the analysis of a data set with 286 instances and nine attributes from the Institute of Oncology of the University Medical Centre, Ljubljana. The data are preprocessed by applying intelligent data analysis techniques to eliminate missing values, and each attribute is evaluated to optimize the results. We used several classification algorithms, including J48 trees, random forest, Bayes net, naive Bayes, and decision table, in order to obtain the one that, given the characteristics of the data, yields the best classification percentage and therefore the best confusion matrix, using 66% of the data for learning and 33% for validating the model. Using this model, a predictor with 71.134% effectiveness is obtained for estimating whether or not breast cancer will recur.
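The 66/33 holdout scheme described above generalises to any classifier: train on roughly two thirds of the data, then report accuracy on the held-out third. A self-contained sketch with synthetic data and a 1-nearest-neighbour rule (generic illustration; not the study's dataset or Weka-style classifiers):

```python
import random

def one_nn_predict(train, x):
    """1-nearest-neighbour prediction by absolute distance on one numeric feature."""
    return min(train, key=lambda t: abs(t[0] - x))[1]

# Synthetic (feature, label) pairs; label 1 stands in for "recurrence" (hypothetical).
random.seed(0)
data = [(random.gauss(0, 1), 0) for _ in range(50)] + \
       [(random.gauss(3, 1), 1) for _ in range(50)]
random.shuffle(data)

split = int(len(data) * 0.66)          # 66% for learning, 33% for validation
train, test = data[:split], data[split:]

correct = sum(one_nn_predict(train, x) == y for x, y in test)
accuracy = correct / len(test)
print(f"validation accuracy: {accuracy:.3f}")
```

In the study's setting the per-class counts on the holdout set would additionally be tabulated into a confusion matrix to compare classifiers.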

  12. Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan; Spell, Gregory; Carin, Lawrence

    2017-04-20

We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal-decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks, achieving state-of-the-art results for 2D and 3D inpainting and Caltech 101. The experiments also show that atom rank impacts both overcompleteness and sparsity.
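A Kruskal (CP) decomposition represents a tensor as a sum of rank-1 terms, each the outer product of one vector per mode; a dictionary atom here is such a sum. A minimal pure-Python sketch for a third-order tensor (illustrative only; the BPFA inference machinery of the paper is not shown):

```python
def kruskal_tensor(factors):
    """Reconstruct a 3-way tensor X[i][j][k] = sum_r a_r[i] * b_r[j] * c_r[k]
    from its rank-1 factors; `factors` is a list of (a, b, c) vector triples."""
    I, J, K = len(factors[0][0]), len(factors[0][1]), len(factors[0][2])
    X = [[[0.0] * K for _ in range(J)] for _ in range(I)]
    for a, b, c in factors:
        for i in range(I):
            for j in range(J):
                for k in range(K):
                    X[i][j][k] += a[i] * b[j] * c[k]
    return X

# A rank-1 "dictionary atom": the outer product of three mode vectors.
atom = ([1.0, 2.0], [1.0, 0.0], [3.0, 1.0])
X = kruskal_tensor([atom])
print(X[1][0][0])  # 2.0 * 1.0 * 3.0 = 6.0
```

The model's inferred tensor rank corresponds to how many such (a, b, c) triples each atom's sum uses.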

  13. High performance geospatial and climate data visualization using GeoJS

    Science.gov (United States)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    data and analysis regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking and selecting, as well as clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.

  14. Heterogeneous agent model and numerical analysis of learning

    Czech Academy of Sciences Publication Activity Database

    Vošvrda, Miloslav; Vácha, Lukáš

    2002-01-01

Vol. 9, No. 17 (2002), pp. 15-22. ISSN 1212-074X R&D Projects: GA ČR GA402/01/0034; GA ČR GA402/01/0539; GA AV ČR IAA7075202 Institutional research plan: CEZ:AV0Z1075907 Keywords: efficient markets hypothesis * technical trading rules * numerical analysis of learning Subject RIV: AH - Economics

  15. Mapping the world: cartographic and geographic visualization by the United Nations Geospatial Information Section (formerly Cartographic Section)

    Science.gov (United States)

    Kagawa, Ayako; Le Sourd, Guillaume

    2018-05-01

In support of United Nations Secretariat activities, mapping began in 1946, and by 1951 the need for maps had increased and an office with a team of cartographers was established. Since then, with the development of technologies including the internet, remote sensing, unmanned aerial systems, and relational database management and information systems, geospatial information provides an ever-increasing variety of support to the work of the Organization for the planning of operations, decision-making and the monitoring of crises. The need for maps, however, has remained intact. This presentation aims to highlight some of the cartographic representation styles used over the decades by reviewing the evolution of selected maps by the office, noting the changing cognitive and semiotic aspects of cartographic and geographic visualization required by the United Nations. Through the presentation and analysis of these maps, the changing dynamics of the Organization's information management can be seen, along with a reminder of the continuing and expanding deconstructionist role of the cartographer, now a geospatial information management expert.

  16. Learning curve analysis of mitral valve repair using telemanipulative technology.

    Science.gov (United States)

    Charland, Patrick J; Robbins, Tom; Rodriguez, Evilio; Nifong, Wiley L; Chitwood, Randolph W

    2011-08-01

To determine if the time required to perform mitral valve repairs using telemanipulation technology decreases with experience, and how that decrease is influenced by patient and procedure variables. A single-center retrospective review was conducted using perioperative and outcomes data collected contemporaneously on 458 mitral valve repair surgeries using telemanipulative technology. A regression model was constructed to assess learning with this technology and to predict total robot time using multiple predictive variables. Statistical analysis was used to determine whether the models were significantly useful, to rule out correlation between predictor variables, and to identify terms that did not contribute to the prediction of total robot time. We found a statistically significant learning curve: the learning percentage derived from total robot times for the first 458 recorded cases of mitral valve repair using telemanipulative technology is 95% (R² = .40). More than one third of the variability in total robot time can be explained through our model using the following variables: type of repair (chordal procedures, ablations, and leaflet resections), band size, use of clips alone in band implantation, and the presence of a fellow at bedside. Learning in mitral valve repair surgery using telemanipulative technology occurs at the East Carolina Heart Institute according to a logarithmic curve, with a learning percentage of 95%. From our regression output, we can make an approximate prediction of total robot time using an additive model. These metrics can be used by programs for benchmarking to manage the implementation of this new technology, as well as for capacity planning, scheduling, and capital budget analysis. Copyright © 2011 The American Association for Thoracic Surgery. All rights reserved.
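A learning percentage of 95% corresponds to the classic log-linear learning-curve model: each doubling of cumulative case volume multiplies the expected time by 0.95, i.e. T(n) = T(1) · n^(log₂ 0.95). A sketch of that relationship (the first-case time below is a hypothetical value, not taken from the study):

```python
import math

def predicted_time(t1, n, learning_pct=0.95):
    """Log-linear learning curve: expected time for the n-th case when each
    doubling of cumulative volume multiplies time by `learning_pct`."""
    return t1 * n ** math.log2(learning_pct)

t1 = 240.0  # hypothetical total robot time (minutes) for the first case
print(predicted_time(t1, 2))    # one doubling: 240 * 0.95 = 228.0
print(predicted_time(t1, 458))  # predicted time by the 458th case
```

This is how a fitted learning percentage supports the benchmarking and capacity-planning uses the abstract mentions: the curve extrapolates expected case times at any cumulative volume.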

  17. Virtual learning object and environment: a concept analysis.

    Science.gov (United States)

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

To analyze the concepts of virtual learning object and virtual learning environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with a search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed with simple descriptive statistics, and the concepts were examined through lexicographic analysis with the support of the IRAMUTEQ software. The sample comprised 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" appeared in only 15 (9.3%). A virtual learning environment brings together several different types of virtual learning objects in a common pedagogical context.

  18. Emerging trends in geospatial artificial intelligence (geoAI): potential applications for environmental epidemiology.

    Science.gov (United States)

    VoPham, Trang; Hart, Jaime E; Laden, Francine; Chiang, Yao-Yi

    2018-04-17

    Geospatial artificial intelligence (geoAI) is an emerging scientific discipline that combines innovations in spatial science, artificial intelligence methods in machine learning (e.g., deep learning), data mining, and high-performance computing to extract knowledge from spatial big data. In environmental epidemiology, exposure modeling is a commonly used approach to conduct exposure assessment to determine the distribution of exposures in study populations. geoAI technologies provide important advantages for exposure modeling in environmental epidemiology, including the ability to incorporate large amounts of big spatial and temporal data in a variety of formats; computational efficiency; flexibility in algorithms and workflows to accommodate relevant characteristics of spatial (environmental) processes including spatial nonstationarity; and scalability to model other environmental exposures across different geographic areas. The objectives of this commentary are to provide an overview of key concepts surrounding the evolving and interdisciplinary field of geoAI including spatial data science, machine learning, deep learning, and data mining; recent geoAI applications in research; and potential future directions for geoAI in environmental epidemiology.

19. New directions in valuing geospatial information - how to value geospatial information for policy and business decisions in the future

    Science.gov (United States)

    Smart, A. C.

    2014-12-01

Governments are increasingly asking for more evidence of the benefits of investing in geospatial data and infrastructure before committing funds. They are looking for a clearer articulation of the economic, environmental and social benefits than has been possible in the past. The development of techniques has accelerated in the past five years as governments and industry have become more involved in the capture and use of geospatial data; however, evaluation practitioners have struggled to answer these emerging questions. The paper explores the types of questions that decision makers are asking and discusses the different approaches and methods that have recently been used to answer them. It explores the need for better business case models. The emerging approaches are then discussed and their attributes reviewed, including methods of analysing tangible economic benefits, intangible benefits and societal benefits. The paper explores the use of value chain analysis and real options analysis to better articulate the impacts on international competitiveness and to value the potential benefits of innovations enabled by the geospatial data produced. The paper concludes by illustrating the potential for these techniques in current and future decision making.

  20. Narrative analysis: how students learn from stories of practice.

    Science.gov (United States)

    Edwards, Sharon Lorraine

    2016-01-01

To describe and recommend a variety of data analysis methods when engaging in narrative research using story as an aid to nursing students' learning. Narrative research methodology is used in many nursing research studies. However, narrative research reports are generally unspecific regarding the analysis and interpretive process. This article examines the qualitative analytical approaches of Lieblich et al.'s (1998) narrative processes of holistic content and analysis of form, incorporated as overarching theories. To support these theories and to provide a more rounded analytical process, other authors' work is included. Approaching narrative analysis from different perspectives is recommended. For each cycle of analysis, it is important to conceptualise the analysis using descriptors drawn from the initial literature review and the initial text. Rigour and transparency are foremost, and tables are generated that reflect each stage of the analysis. The final stage of analysis is to clearly report, organise and present findings to reflect the richly varied and diverse potential of stories. Engaging in narrative research and then dealing with the large quantities of data to analyse can be daunting, difficult to manage and appear complex. It is also challenging and rewarding. With clear descriptors, examining the data using multiple lenses can serve to develop a greater level of insight into understanding nursing students' learning from their clinical experiences, presented as stories, when involved in the care of individuals. There are many approaches to narrative analysis in nursing research and it can be difficult to establish the main research approach best suited to the study. There is no single way to define narrative analysis and a combination of strategies can be applied.

  1. Analysis and Visualization of Relations in eLearning

    Science.gov (United States)

    Dráždilová, Pavla; Obadi, Gamila; Slaninová, Kateřina; Martinovič, Jan; Snášel, Václav

The popularity of eLearning systems is growing rapidly, a growth enabled by continuing developments in Internet and multimedia technologies. Web-based education has become widespread in the past few years, and various types of learning management systems facilitate the development of Web-based courses. Users of these courses form social networks through the different activities they perform. This chapter focuses on searching for latent social networks in eLearning systems data. These data consist of students' activity records in which latent ties among actors are embedded. The social network studied in this chapter is represented by groups of students who have similar contacts and interact in similar social circles. Different methods of data clustering analysis can be applied to these groups, and the findings show the existence of latent ties among the group members. The second part of this chapter focuses on social network visualization. A graphical representation of a social network can describe its structure very efficiently: it can enable social network analysts to determine the network's degree of connectivity, easily identify individuals with few or many relationships, and determine the number of independent groups in a given network. Applied to the field of eLearning, data visualization simplifies the monitoring of the study activities of individuals or groups, as well as the planning of the educational curriculum, the evaluation of study processes, etc.
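The latent-network idea, that students who perform the same activities are linked and that groups emerge from the resulting graph, can be sketched with a co-activity graph and its connected components. The log records below are hypothetical, and real LMS mining would use richer clustering than connected components:

```python
from collections import defaultdict

# (student, activity) records from a hypothetical LMS activity log.
log = [("ann", "quiz1"), ("bob", "quiz1"), ("bob", "forum"),
       ("eva", "forum"), ("joe", "lab3")]

# Link students who share at least one activity.
by_activity = defaultdict(set)
for student, activity in log:
    by_activity[activity].add(student)

graph = defaultdict(set)
for members in by_activity.values():
    for s in members:
        graph[s] |= members - {s}

def components(graph):
    """Connected components = candidate latent study groups."""
    seen, groups = set(), []
    for start in graph:
        if start in seen:
            continue
        stack, group = [start], set()
        while stack:
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            group.add(node)
            stack.extend(graph[node])
        groups.append(group)
    return groups

print(sorted(sorted(g) for g in components(graph)))  # [['ann', 'bob', 'eva'], ['joe']]
```

Visualizing this same graph (nodes sized by degree, components coloured separately) gives the connectivity-at-a-glance view the chapter describes.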

  2. Semi-supervised learning for ordinal Kernel Discriminant Analysis.

    Science.gov (United States)

    Pérez-Ortiz, M; Gutiérrez, P A; Carbonero-Ruz, M; Hervás-Martínez, C

    2016-12-01

    Ordinal classification considers those classification problems where the labels of the variable to predict follow a given order. Naturally, labelled data is scarce or difficult to obtain in this type of problem because, in many cases, ordinal labels are given by a user or expert (e.g. in recommendation systems). Firstly, this paper develops a new strategy for ordinal classification where both labelled and unlabelled data are used in the model construction step (a scheme which is referred to as semi-supervised learning). More specifically, the ordinal version of kernel discriminant learning is extended to this setting by considering the neighbourhood information of unlabelled data, which is proposed to be computed in the feature space induced by the kernel function. Secondly, a new method for semi-supervised kernel learning is devised in the context of ordinal classification, which is combined with our developed classification strategy to optimise the kernel parameters. The experiments conducted compare 6 different approaches for semi-supervised learning in the context of ordinal classification on a battery of 30 datasets, showing (1) the good synergy of the ordinal version of discriminant analysis and the use of unlabelled data and (2) the advantage of computing distances in the feature space induced by the kernel function. Copyright © 2016 Elsevier Ltd. All rights reserved.
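Computing neighbourhood distances "in the feature space induced by the kernel function" rests on a standard identity: ||φ(x) − φ(y)||² = k(x,x) − 2k(x,y) + k(y,y). A minimal sketch of that identity (not the authors' code) with an RBF kernel:

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel between two feature vectors."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq)

def feature_space_dist(x, y, k=rbf):
    """Distance in the kernel-induced feature space, computed from
    kernel evaluations only: k(x,x) - 2 k(x,y) + k(y,y)."""
    return math.sqrt(max(0.0, k(x, x) - 2.0 * k(x, y) + k(y, y)))

# nearest unlabelled neighbour of a labelled point, measured in feature space
labelled = (0.0, 0.0)
unlabelled = [(1.0, 0.0), (3.0, 0.0), (0.2, 0.1)]
nearest = min(unlabelled, key=lambda u: feature_space_dist(labelled, u))
```

For the RBF kernel this distance is a monotone function of the Euclidean one, but the same formula applies to any valid kernel, which is what makes feature-space neighbourhoods well defined.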

  3. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show...
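The idea of choosing the number of clusters by cross-validated likelihood can be caricatured with a much cruder held-out score. This sketch (1-D k-means scored by held-out squared error; illustrative only, and far simpler than the paper's method) shows the generalization principle: fit on one part of the data, score the fit on data not used for fitting.

```python
import random

def kmeans_1d(data, k, iters=50, seed=0):
    """Tiny 1-D k-means (Lloyd's algorithm) with fixed random init."""
    rng = random.Random(seed)
    centers = rng.sample(data, k)
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for x in data:
            nearest = min(range(k), key=lambda j: (x - centers[j]) ** 2)
            buckets[nearest].append(x)
        centers = [sum(b) / len(b) if b else centers[j]
                   for j, b in enumerate(buckets)]
    return centers

def heldout_score(train, held, k):
    """Negative held-out SSE: higher means the clustering generalizes better."""
    centers = kmeans_1d(train, k)
    return -sum(min((x - c) ** 2 for c in centers) for x in held)

train = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]   # two well-separated groups
held  = [0.05, 5.05]                      # data withheld from fitting
scores = {k: heldout_score(train, held, k) for k in (1, 2, 3)}
```

The paper's cross-validated likelihood replaces the squared-error score with a probabilistic model evaluated across subjects, but the selection logic is the same.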

  4. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and the clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove useful in the analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
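One of the techniques named above, fuzzy c-means, can be sketched in pure Python for 1-D data (illustrative only; real ADCP work would use a library implementation and multivariate profiles). Each point gets a membership in every cluster, and centers are membership-weighted means.

```python
def fuzzy_cmeans(data, c=2, m=2.0, iters=100):
    """Minimal 1-D fuzzy c-means (requires c >= 2). Returns (centers, u),
    where u[j][i] is the membership of data[j] in cluster i."""
    lo, hi = min(data), max(data)
    centers = [lo + i * (hi - lo) / (c - 1) for i in range(c)]
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - ck) + 1e-12 for ck in centers]  # avoid /0
            u.append([1.0 / sum((d[i] / d[k]) ** (2.0 / (m - 1.0))
                                for k in range(c))
                      for i in range(c)])
        # center update: weighted mean with weights u^m
        centers = [sum(u[j][i] ** m * data[j] for j in range(len(data)))
                   / sum(u[j][i] ** m for j in range(len(data)))
                   for i in range(c)]
    return centers, u

# toy "current speed" samples forming two clusters
data = [0.0, 0.2, 0.4, 9.6, 9.8, 10.0]
centers, u = fuzzy_cmeans(data)
```

Unlike hard k-means, the membership rows sum to one, which is what cluster validity indices such as the clustering index operate on.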

  5. Zika virus infection and microcephaly: Evidence regarding geospatial associations.

    Science.gov (United States)

    Vissoci, João Ricardo Nickenig; Rocha, Thiago Augusto Hernandes; Silva, Núbia Cristina da; de Sousa Queiroz, Rejane Christine; Thomaz, Erika Bárbara Abreu Fonseca; Amaral, Pedro Vasconcelos Maia; Lein, Adriana; Branco, Maria Dos Remédios Freitas Carvalho; Aquino, José; Rodrigues, Zulimar Márita Ribeiro; da Silva, Antônio Augusto Moura; Staton, Catherine

    2018-04-01

    Although the Zika virus (ZIKV) epidemic ceased to be a public health emergency by the end of 2016, studies to improve knowledge about this emerging disease are still needed, especially those investigating a causal relationship between ZIKV in pregnant women and microcephaly in neonates. However, there are still many challenges in describing the relationship between ZIKV and microcephaly. The few studies focusing on the epidemiological profile of ZIKV and its changes over time are largely limited to systematic reviews of case reports and dispersal mapping of ZIKV spread over time without quantitative methods to analyze patterns and their covariates. Since Brazil has been at the epicenter of the ZIKV epidemic, this study examines the geospatial association between ZIKV and microcephaly in Brazil. Our study is categorized as a retrospective, ecological study based on secondary databases. Data were obtained from January to December 2016, from the following data sources: Brazilian System for Epidemiological Surveillance, Disease Notification System, System for Specialized Management Support, and Brazilian Institute of Geography and Statistics. Data were aggregated by municipality. Incidence rates were estimated per 100,000 inhabitants. Analyses consisted of mapping the aggregated incidence rates of ZIKV and microcephaly, followed by a Getis-Ord-Gi spatial cluster analysis and a Bivariate Local Moran's I analysis. The incidence of ZIKV cases is changing the virus's spatial pattern, shifting from Brazil's Northeast region to the Midwest and North regions. The number of municipalities in clusters of microcephaly incidence is also shifting from the Northeast region to the Midwest and North, after a time lag is considered. Our findings suggest an increase in microcephaly incidence in the Midwest and North regions, associated with high levels of ZIKV infection months before. The greatest burden of microcephaly shifted from the Northeast to other Brazilian regions at the
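The Getis-Ord local statistic used for the cluster analysis above has a standard closed form as a z-score. A sketch with binary contiguity weights on toy incidence values (not the study's data):

```python
import math

def getis_ord_gi_star(values, weights, i):
    """Gi* z-score for location i. weights[i][j] is the spatial weight of
    location j for focal location i (binary contiguity here, with the
    focal location included, as in the starred variant)."""
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar ** 2)
    w = weights[i]
    sw = sum(w)
    num = sum(wj * xj for wj, xj in zip(w, values)) - xbar * sw
    den = s * math.sqrt((n * sum(wj * wj for wj in w) - sw ** 2) / (n - 1))
    return num / den

# toy incidence rates along a line of seven districts, high in the middle
values = [0, 0, 10, 10, 10, 0, 0]
n = len(values)
weights = [[1 if abs(i - j) <= 1 else 0 for j in range(n)] for i in range(n)]
z_hot = getis_ord_gi_star(values, weights, 3)   # centre of the high run
z_edge = getis_ord_gi_star(values, weights, 0)  # low-incidence edge
```

A large positive z marks a hot spot (a location whose neighbourhood mean is high relative to the global mean), a negative z a cold spot; the study applies the same logic to municipal incidence rates with geographic neighbourhoods.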

  6. Geospatial distribution modeling and determining suitability of groundwater quality for irrigation purpose using geospatial methods and water quality index (WQI) in Northern Ethiopia

    Science.gov (United States)

    Gidey, Amanuel

    2018-06-01

    Determining the suitability and vulnerability of groundwater quality for irrigation use is a critical first step toward careful management of groundwater resources that diminishes the impacts on irrigation. This study was conducted to determine the overall suitability of groundwater quality for irrigation use and to generate spatial distribution maps for the Elala catchment, Northern Ethiopia. Thirty-nine groundwater samples were collected to analyze and map the water quality variables. Atomic absorption spectrophotometer, ultraviolet spectrophotometer, titration and calculation methods were used for laboratory groundwater quality analysis. ArcGIS, geospatial analysis tools, semivariogram model types and interpolation methods were used to generate geospatial distribution maps. Twelve and eight water quality variables were used to produce the weighted overlay and irrigation water quality index models, respectively. Root-mean-square error, mean square error, absolute square error, mean error, root-mean-square standardized error, and measured versus predicted values were used for cross-validation. The overall weighted overlay model result showed that 146 km2 of the area is highly suitable, 135 km2 moderately suitable and 60 km2 unsuitable for irrigation use. The irrigation water quality index result confirms 10.26% with no restriction, 23.08% with low restriction, 20.51% with moderate restriction, 15.38% with high restriction and 30.76% with severe restriction for irrigation use. GIS and the irrigation water quality index are sound methods for irrigation water resources management: they support full-yield irrigation production, improve and sustain food security over the long term, and avoid increasing environmental problems for future generations.

  7. Creating 3D models of historical buildings using geospatial data

    Science.gov (United States)

    Alionescu, Adrian; Bǎlǎ, Alina Corina; Brebu, Floarea Maria; Moscovici, Anca-Maria

    2017-07-01

    Recently, a lot of interest has been shown in understanding real-world objects by acquiring 3D images of them using laser scanning technology and panoramic images. A realistic impression of geometric 3D data can be generated by draping real colour textures simultaneously captured by a colour camera. In this context, a new concept of geospatial data acquisition, based on panoramic images, has rapidly revolutionized the method of determining the spatial position of objects. This article describes an approach that uses terrestrial laser scanning and panoramic images captured with Trimble V10 Imaging Rover technology to enhance the detail and realism of the geospatial data set, in order to obtain 3D urban plans and virtual reality applications.

  8. Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

    Science.gov (United States)

    Singh, R.; Bermudez, L. E.

    2013-12-01

    The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years, two main trends have been disrupting geospatial applications: mobile computing and context sharing. People now have more and more mobile devices supporting their work and personal lives. Mobile devices are intermittently connected to the internet and have less computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. And it builds upon a database container - SQLite - that is self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout are described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is sharing client 'contexts'. When a user is looking into an article or a product on the web
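The "set of SQLite tables with a described layout" idea behind GeoPackage can be mimicked with the Python standard library. This toy intentionally omits most of the real specification (binary geometry encoding, spatial reference system tables, required metadata columns); it only shows the registry-plus-feature-table shape, with WKT text standing in for the spec's geometry BLOBs.

```python
import sqlite3

# In-memory stand-in for a .gpkg file: a contents registry plus one
# feature table, loosely mirroring the GeoPackage layout.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE gpkg_contents (
    table_name TEXT PRIMARY KEY,
    data_type  TEXT NOT NULL,      -- 'features' or 'tiles'
    identifier TEXT
);
CREATE TABLE rivers (
    fid  INTEGER PRIMARY KEY,
    name TEXT,
    geom TEXT                      -- WKT stand-in for binary geometry
);
""")
db.execute("INSERT INTO gpkg_contents VALUES ('rivers', 'features', 'Rivers')")
db.execute("INSERT INTO rivers VALUES (1, 'Hudson', 'LINESTRING(0 0, 1 1)')")

# A client discovers layers by querying the registry, not the file system.
layers = db.execute(
    "SELECT table_name FROM gpkg_contents WHERE data_type = 'features'"
).fetchall()
```

This single-file, queryable container is exactly what makes the format attractive for intermittently connected mobile devices, in contrast to multi-file Shapefiles.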

  9. National Geospatial-Intelligence Agency Academic Research Program

    Science.gov (United States)

    Loomer, S. A.

    2004-12-01

    "Know the Earth.Show the Way." In fulfillment of its vision, the National Geospatial-Intelligence Agency (NGA) provides geospatial intelligence in all its forms and from whatever source-imagery, imagery intelligence, and geospatial data and information-to ensure the knowledge foundation for planning, decision, and action. To achieve this, NGA conducts a multi-disciplinary program of basic research in geospatial intelligence topics through grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program (NARP) are: - NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. - Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. - Director of Central Intelligence Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports and how other researchers and institutions can apply for grants under the program.

  10. NativeView: A Geospatial Curriculum for Native Nation Building

    Science.gov (United States)

    Rattling Leaf, J.

    2007-12-01

    In the spirit of collaboration and reciprocity, James Rattling Leaf of Sinte Gleska University on the Rosebud Reservation of South Dakota will present recent developments, experiences, insights and a vision for education in Indian Country. A thirty-year young institution, Sinte Gleska University was founded on a strong vision of ancestral leadership and the values of the Lakota Way of Life. Sinte Gleska University (SGU) has initiated the development of a Geospatial Education Curriculum project. NativeView: A Geospatial Curriculum for Native Nation Building is a two-year project that entails a disciplined approach towards the development of a relevant geospatial academic curriculum. This project is designed to meet the educational and land management needs of the Rosebud Lakota Tribe through the utilization of Geographic Information Systems (GIS), Remote Sensing (RS) and Global Positioning Systems (GPS). In conjunction with the strategy and progress of this academic project, a formal presentation and demonstration of the SGU-based geospatial software RezMapper will exemplify an innovative example of state-of-the-art information technology. RezMapper is an interactive CD software package focused on the 21 Lakota communities on the Rosebud Reservation that utilizes an ingenious concept of multimedia mapping and state-of-the-art data compression and presentation. This ongoing development utilizes geographic data, imagery from space, historical aerial photography and cultural features such as historic Lakota documents, language, song, video and historical photographs in a multimedia fashion. As a tangible product, RezMapper will be a project deliverable tool for use in the classroom and by a broad range of learners.

  11. Adoption of Geospatial Systems towards evolving Sustainable Himalayan Mountain Development

    Science.gov (United States)

    Murthy, M. S. R.; Bajracharya, B.; Pradhan, S.; Shestra, B.; Bajracharya, R.; Shakya, K.; Wesselmann, S.; Ali, M.; Bajracharya, S.; Pradhan, S.

    2014-11-01

    Natural resource dependence of mountain communities, rapid social and developmental changes, disaster proneness and climate change are conceived as the critical factors regulating sustainable Himalayan mountain development. The Himalayan region, with its distinctive geographic setting and diverse physical and cultural landscape, presents a formidable challenge to collecting and managing data and information and to understanding its varied socio-ecological settings. Recent advances in earth observation, near real-time data, in-situ measurements and the combination of information and communication technology have transformed the way we collect, process, and generate information and how we use such information for societal benefits. Glacier dynamics, land cover changes, disaster risk reduction systems, food security and ecosystem conservation are a few thematic areas where geospatial information and knowledge have significantly contributed to informed decision making systems over the region. The emergence and adoption of near real-time systems, unmanned aerial vehicles (UAV), broad-scale citizen science (crowd-sourcing), mobile services and mapping, and cloud computing have paved the way towards developing automated environmental monitoring systems, enhanced scientific understanding of geophysical and biophysical processes, coupled management of socio-ecological systems and community-based adaptation models tailored to the mountain-specific environment. There are differentiated capacities among the ICIMOD regional member countries with regard to utilization of earth observation and geospatial technologies. The region can greatly benefit from a coordinated and collaborative approach to capture the opportunities offered by earth observation and geospatial technologies. Regional-level data sharing, knowledge exchange, Himalayan GEO-supporting geospatial platforms, spatial data infrastructure, and unique region-specific satellite systems to address trans-boundary challenges would go a long way in

  12. SPECTRAL COLOR INDICES BASED GEOSPATIAL MODELING OF SOIL ORGANIC MATTER IN CHITWAN DISTRICT, NEPAL

    Directory of Open Access Journals (Sweden)

    U. K. Mandal

    2016-06-01

    Space technology provides a resourceful, cost-effective means to assess the soil nutrients essential for a soil management plan. Soil organic matter (SOM) is one of the valuable factors controlling crop productivity by providing nutrients in farming systems. Geospatial modeling of soil organic matter is essential where soil test laboratories are unavailable, and is supported by its strong spatial correlation. In the present analysis, soil organic matter is modeled from satellite-image-derived spectral color indices. Brightness Index (BI), Coloration Index (CI), Hue Index (HI), Redness Index (RI) and Saturation Index (SI) were calculated by converting DN values to radiance and radiance to reflectance from a Thematic Mapper image. A geospatial model was developed by regressing SOM on the color indices and producing a multiple regression model using a stepwise regression technique. The multiple regression equation between SOM and the spectral indices was significant, with R = 0.56 at the 95% confidence level. The resulting MLR equation was then used for spatial prediction over the entire study area. The Redness Index was found to be highly significant in estimating SOM. It was used to predict SOM as an auxiliary variable using the cokriging spatial interpolation technique. The approach was tested in seven VDCs of Chitwan district of Nepal using Thematic Mapper remotely sensed data. Measured SOM ranged from 0.15% to 4.75%, with a mean of 2.24%. Spectral color indices derived from remotely sensed data have potential as useful auxiliary variables for estimating SOM content to generate soil fertility management plans.
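The regression step can be illustrated with a one-predictor ordinary least squares fit. The redness-index and SOM values below are hypothetical, and the paper actually uses stepwise multiple regression over five indices; this only shows the fitting mechanics.

```python
def linreg(x, y):
    """Ordinary least squares fit of y = a + b*x via the closed-form
    slope cov(x, y) / var(x) and intercept mean(y) - b * mean(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# hypothetical redness-index readings and lab-measured SOM (%)
ri  = [0.10, 0.20, 0.30, 0.40]
som = [0.9, 1.8, 2.7, 3.6]   # constructed so som = 9 * ri exactly
a, b = linreg(ri, som)
```

Once fitted, the equation predicts SOM at unsampled pixels from the index raster; in the paper the residual spatial structure is then handled with cokriging.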

  13. Collective Sensing: Integrating Geospatial Technologies to Understand Urban Systems—An Overview

    Directory of Open Access Journals (Sweden)

    Geoffrey J. Hay

    2011-08-01

    Cities are complex systems composed of numerous interacting components that evolve over multiple spatio-temporal scales. Consequently, no single data source is sufficient to satisfy the information needs required to map, monitor, model, and ultimately understand and manage our interaction within such urban systems. Remote sensing technology provides a key data source for mapping such environments, but is not sufficient for fully understanding them. In this article we provide a condensed urban perspective of critical geospatial technologies and techniques: (i) Remote Sensing; (ii) Geographic Information Systems; (iii) object-based image analysis; and (iv) sensor webs, and recommend a holistic integration of these technologies within the language of Open Geospatial Consortium (OGC) standards in order to more fully understand urban systems. We then discuss the potential of this integration and conclude that it extends the monitoring and mapping options beyond "hard infrastructure" by addressing "humans as sensors", mobility and human-environment interactions, and future improvements to quality of life and of social infrastructures.

  14. Fast Deployment on the Cloud of Integrated Postgres, API and a Jupyter Notebook for Geospatial Collaboration

    Science.gov (United States)

    Fatland, R.; Tan, A.; Arendt, A. A.

    2016-12-01

    We describe a Python-based implementation of a PostgreSQL database accessed through an Application Programming Interface (API) hosted on the Amazon Web Services public cloud. The data is geospatial and concerns hydrological model results in the glaciated catchment basins of southcentral and southeast Alaska. This implementation, however, is intended to be generalized to other forms of geophysical data, particularly data that is intended to be shared across a collaborative team or publicly. An example (moderate-size) dataset is provided together with the code base and a complete installation tutorial on GitHub. An enthusiastic scientist with some familiarity with software installation can replicate the example system in two hours. This installation includes database, API, a test Client and a supporting Jupyter Notebook, specifically oriented towards Python 3 and markup text to comprise an executable paper. The installation 'on the cloud' often engenders discussion and consideration of cloud cost and safety. By treating the process as somewhat "cookbook" we hope to first demonstrate the feasibility of the proposition. A discussion of cost and data security is provided in this presentation and in the accompanying tutorial/documentation. This geospatial data system case study is part of a larger effort at the University of Washington to enable research teams to take advantage of the public cloud to meet challenges in data management and analysis.

  15. Designing a two-rank acceptance sampling plan for quality inspection of geospatial data products

    Science.gov (United States)

    Tong, Xiaohua; Wang, Zhenhua; Xie, Huan; Liang, Dan; Jiang, Zuoqin; Li, Jinchao; Li, Jun

    2011-10-01

    To address the disadvantages of classical sampling plans designed for traditional industrial products, we originally propose a two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first rank sampling plan is to inspect the lot consisting of map sheets, and the second is to inspect the lot consisting of features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, which covers two lot size cases. The first case is for a small lot size with nonconformities being modeled by a hypergeometric distribution function, and the second is for a larger lot size with nonconformities being modeled by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that: (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items and (2) the proposed acceptance sampling plan based on TRASP performs better than other classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard used specifically for industrial outputs, i.e., "lots with different sizes corresponding to the same sampling plan."
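The two lot-size cases described above map onto textbook acceptance probabilities: hypergeometric sampling for small lots and a Poisson model for large ones. This sketch computes P(accept) for a given plan (n, c); the paper's optimization over sample size and acceptance number is not shown.

```python
import math

def accept_prob_hypergeom(N, D, n, c):
    """Small-lot case: sample n items from a lot of N containing D
    nonconforming; accept if at most c nonconformities are observed."""
    return sum(math.comb(D, d) * math.comb(N - D, n - d) / math.comb(N, n)
               for d in range(0, min(c, D, n) + 1))

def accept_prob_poisson(n, p, c):
    """Large-lot case: nonconformities approximated as Poisson(n * p)."""
    lam = n * p
    return sum(math.exp(-lam) * lam ** d / math.factorial(d)
               for d in range(0, c + 1))

# hypothetical plan: 50 map sheets, 5 nonconforming, sample 10, accept <= 1
oc_small = accept_prob_hypergeom(50, 5, 10, 1)
oc_large = accept_prob_poisson(100, 0.01, 2)
```

Plotting these probabilities against the true nonconformity rate gives the plan's operating characteristic curve, which is what ties the design back to the acceptance quality level (AQL).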

  16. Analyzing Personal Happiness from Global Survey and Weather Data: A Geospatial Approach.

    Science.gov (United States)

    Peng, Yi-Fan; Tang, Jia-Hong; Fu, Yang-chih; Fan, I-chun; Hor, Maw-Kae; Chan, Ta-Chien

    2016-01-01

    Past studies have shown that personal subjective happiness is associated with various macro- and micro-level background factors, including environmental conditions, such as weather and the economic situation, and personal health behaviors, such as smoking and exercise. We contribute to this literature of happiness studies by using a geospatial approach to examine both macro and micro links to personal happiness. Our geospatial approach incorporates two major global datasets: representative national survey data from the International Social Survey Program (ISSP) and corresponding world weather data from the National Oceanic and Atmospheric Administration (NOAA). After processing and filtering 55,081 records of ISSP 2011 survey data from 32 countries, we extracted 5,420 records from China and 25,441 records from 28 other countries. Sensitivity analyses of different intervals for average weather variables showed that macro-level conditions, including temperature, wind speed, elevation, and GDP, are positively correlated with happiness. To distinguish the effects of weather conditions on happiness in different seasons, we also adopted climate zone and seasonal variables. The micro-level analysis indicated that better health status and eating more vegetables or fruits are highly associated with happiness. Never engaging in physical activity appears to make people less happy. The findings suggest that weather conditions, economic situations, and personal health behaviors are all correlated with levels of happiness.

  17. Surface temperatures in New York City: Geospatial data enables the accurate prediction of radiative heat transfer.

    Science.gov (United States)

    Ghandehari, Masoud; Emig, Thorsten; Aghamohamadnia, Milad

    2018-02-02

    Despite decades of research seeking to derive the urban energy budget, the dynamics of thermal exchange in the densely constructed environment are not yet well understood. Using New York City as a study site, we present a novel hybrid experimental-computational approach for a better understanding of the radiative heat transfer in complex urban environments. The aim of this work is to contribute to the calculation of the urban energy budget, particularly the stored energy. We will focus our attention on surface thermal radiation. Improved understanding of urban thermodynamics incorporating the interaction of various bodies, particularly in high-rise cities, will have implications for energy conservation at the building scale, and for human health and comfort at the urban scale. The platform presented is based on longwave hyperspectral imaging of nearly 100 blocks of Manhattan, in addition to a geospatial radiosity model that describes the collective radiative heat exchange between multiple buildings. Despite assumptions about the surface emissivity and thermal conductivity of building walls, the close agreement between temperatures derived from measurements and computations is promising. The results imply that the presented geospatial thermodynamic model of urban structures can enable accurate and high-resolution analysis of instantaneous urban surface temperatures.
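The gray-body relations underlying such a radiosity model can be shown in their simplest textbook special case: net longwave exchange between two large parallel gray surfaces. This is only a stand-in for the paper's multi-building model, which additionally needs view factors between arbitrarily oriented facades; the facade temperatures and emissivity below are hypothetical.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_longwave_parallel(t1, t2, e1, e2):
    """Net radiative flux (W/m^2) from surface 1 to surface 2 for two
    large parallel gray surfaces with emissivities e1, e2 and absolute
    temperatures t1, t2 (K): sigma*(T1^4 - T2^4) / (1/e1 + 1/e2 - 1)."""
    return SIGMA * (t1 ** 4 - t2 ** 4) / (1.0 / e1 + 1.0 / e2 - 1.0)

# hypothetical facade at 305 K facing one at 295 K, both with emissivity 0.9
q = net_longwave_parallel(305.0, 295.0, 0.9, 0.9)
```

The T^4 dependence is why small errors in assumed emissivity still yield usable temperatures, and why the comparison of measured and computed surface temperatures is a meaningful validation.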

  18. Image analysis and machine learning for detecting malaria.

    Science.gov (United States)

    Poostchi, Mahdieh; Silamut, Kamolrat; Maude, Richard J; Jaeger, Stefan; Thoma, George

    2018-04-01

    Malaria remains a major burden on global health, with roughly 200 million cases worldwide and more than 400,000 deaths per year. Besides biomedical research and political efforts, modern information technology is playing a key role in many attempts at fighting the disease. One of the barriers toward a successful mortality reduction has been inadequate malaria diagnosis in particular. To improve diagnosis, image analysis software and machine learning methods have been used to quantify parasitemia in microscopic blood slides. This article gives an overview of these techniques and discusses the current developments in image analysis and machine learning for microscopic malaria diagnosis. We organize the different approaches published in the literature according to the techniques used for imaging, image preprocessing, parasite detection and cell segmentation, feature computation, and automatic cell classification. Readers will find the different techniques listed in tables, with the relevant articles cited next to them, for both thin and thick blood smear images. We also discuss the latest developments in sections devoted to deep learning and smartphone technology for future malaria diagnosis. Published by Elsevier Inc.

  19. Data mining learning bootstrap through semantic thumbnail analysis

    Science.gov (United States)

    Battiato, Sebastiano; Farinella, Giovanni Maria; Giuffrida, Giovanni; Tribulato, Giuseppe

    2007-01-01

    The rapid increase of technological innovation in the mobile phone industry has led the research community to develop new and advanced systems to optimize the services offered by mobile phone operators (telcos), to maximize their effectiveness and improve their business. Data mining algorithms can run over data produced by mobile phone usage (e.g. image, video, text and log files) to discover users' preferences and predict the most likely (to be purchased) offer for each individual customer. One of the main challenges is reducing the learning time and cost of these automatic tasks. In this paper we discuss an experiment where a commercial offer is composed of a small picture augmented with a short text describing the offer itself. Each customer's purchase is properly logged with all relevant information. Upon arrival of new items we need to learn who the best customers (prospects) for each item are, that is, the ones most likely to be interested in purchasing that specific item. Such learning activity is time consuming and, in our specific case, is not applicable given the large number of new items arriving every day. Basically, given the current customer base, we are not able to learn on all new items. Thus, we need to select among those new items to identify the best candidates. We do so by using a joint analysis of visual features and text to estimate how good each new item could be, that is, whether or not it is worth learning on. Preliminary results show the effectiveness of the proposed approach in improving classical data mining techniques.

  20. Flash Study Analysis and the Music Learning Pro-Files Project

    Science.gov (United States)

    Cremata, Radio; Pignato, Joseph; Powell, Bryan; Smith, Gareth Dylan

    2016-01-01

    This paper introduces the Music Learning Profiles Project, and its methodological approach, flash study analysis. Flash study analysis is a method that draws heavily on extant qualitative approaches to education research, to develop broad understandings of music learning in diverse contexts. The Music Learning Profiles Project (MLPP) is an…

  1. Challenges in sharing of geospatial data by data custodians in South Africa

    Science.gov (United States)

    Kay, Sissiel E.

    2018-05-01

    As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, a key objective of which is to share geospatial data. The collection and maintenance of geospatial data is expensive and time consuming, and so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not, what the reasons for this are. Through an international comparative assessment, it then determines whether compliance with the SDI Act is too onerous on the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.

  2. An Ontology-supported Approach for Automatic Chaining of Web Services in Geospatial Knowledge Discovery

    Science.gov (United States)

    di, L.; Yue, P.; Yang, W.; Yu, G.

    2006-12-01

    Recent developments in the geospatial semantic Web have shown promise for automatic discovery, access, and use of geospatial Web services to quickly and efficiently solve particular application problems. With semantic Web technology, it is highly feasible to construct intelligent geospatial knowledge systems that can provide answers to many geospatial application questions. A key challenge in constructing such an intelligent knowledge system is to automate the creation of a chain or process workflow that involves multiple services and highly diversified data and can generate the answer to a user's specific question. This presentation discusses an approach for automating the composition of geospatial Web service chains by employing geospatial semantics described by geospatial ontologies. It shows how ontology-based geospatial semantics enable the automatic discovery, mediation, and chaining of geospatial Web services. OWL-S is used to represent the geospatial semantics of an individual Web service, the type of service it belongs to, and the types of data it can handle. The hierarchy and classification of service types are described in the service ontology; the hierarchy and classification of data types are presented in the data ontology. To answer users' geospatial questions, an Artificial Intelligence (AI) planning algorithm constructs the service chain using the service and data logic expressed in the ontologies. The chain can be expressed as a graph, with nodes representing services and connection weights representing degrees of semantic matching between nodes. The graph is a visual representation of the logical geo-processing path for answering the user's question, and it can be instantiated to a physical service workflow and executed to generate the answer. A prototype system, which includes real-world geospatial applications, has been implemented to demonstrate the concept and approach.
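
    The chaining idea can be reduced to a small sketch. This is a greatly simplified, hypothetical stand-in for the approach: each service is reduced to a single input type and a single output type (the paper uses OWL-S descriptions and an AI planner with semantic-match weights), and the chain is found by breadth-first search over data types. All service and type names below are invented.

    ```python
    from collections import deque

    # Each hypothetical service maps one input data type to one output data type.
    SERVICES = {
        "reproject": ("raw_imagery", "georeferenced_imagery"),
        "classify": ("georeferenced_imagery", "landcover_map"),
        "summarize": ("landcover_map", "landcover_statistics"),
    }

    def plan_chain(available_type, goal_type):
        """Return a list of service names turning available_type into goal_type."""
        queue = deque([(available_type, [])])
        seen = {available_type}
        while queue:
            current, chain = queue.popleft()
            if current == goal_type:
                return chain
            for name, (inp, out) in SERVICES.items():
                if inp == current and out not in seen:
                    seen.add(out)
                    queue.append((out, chain + [name]))
        return None  # no workflow answers the question
    ```

    Here `plan_chain("raw_imagery", "landcover_statistics")` yields the three-service workflow `['reproject', 'classify', 'summarize']`; a real planner would additionally rank candidate chains by the degree of semantic matching between nodes.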

  3. Analysis of Learning Tools in the study of Developmental of Interactive Multimedia Based Physic Learning Charged in Problem Solving

    Science.gov (United States)

    Manurung, Sondang; Demonta Pangabean, Deo

    2017-05-01

    The main purpose of this study is to produce a needs analysis, a literature review, and learning tools for the development of interactive multimedia-based physics learning oriented toward problem solving, in order to improve the thinking ability of prospective physics students. The first-year result of the study is a draft based on a needs analysis of conditions in the field, the existing learning conditions, and literature studies, followed by the design of devices and instruments and the development of the media. The result of the second year is a physics learning device - interactive multimedia oriented toward problem solving - in the form of textbooks and scientific publications. The learning models were first tested on a limited sample, then evaluated and revised. In addition, the product of the research has economic value because: (1) the virtual laboratory offered by this research provides an alternative to purchasing expensive physics laboratory equipment; (2) it addresses the shortage of physics teachers in remote areas, as the learning tool can be accessed offline and online; and (3) it reduces consumable materials, as tutorials can be done online. The first-year target of the research is a storyboard for learning physics, stored in web form on CD (compact disc), and interactive multimedia on the concept of the kinetic theory of gases.

  4. Investigating the Learning-Theory Foundations of Game-Based Learning: A Meta-Analysis

    Science.gov (United States)

    Wu, W-H.; Hsiao, H-C.; Wu, P-L.; Lin, C-H.; Huang, S-H.

    2012-01-01

    Past studies on the issue of learning-theory foundations in game-based learning stressed the importance of establishing learning-theory foundations and provided an exploratory examination of established learning theories. However, we found that research has seldom addressed the development of the use, or failure to use, learning-theory foundations and…

  5. Geo-Spatial Tactical Decision Aid Systems: Fuzzy Logic for Supporting Decision Making

    National Research Council Canada - National Science Library

    Grasso, Raffaele; Giannecchini, Simone

    2006-01-01

    .... This paper describes a tactical decision aid system based on fuzzy logic reasoning for data fusion and on current Open Geospatial Consortium specifications for interoperability, data dissemination...

  6. A Machine Learning Based Intrusion Impact Analysis Scheme for Clouds

    Directory of Open Access Journals (Sweden)

    Junaid Arshad

    2012-01-01

    Full Text Available Clouds represent a major paradigm shift, inspiring the contemporary approach to computing. They present fascinating opportunities to address dynamic user requirements with the provision of on-demand, expandable computing infrastructures. However, Clouds introduce novel security challenges which need to be addressed to facilitate widespread adoption. This paper is focused on one such challenge: intrusion impact analysis. In particular, we highlight the significance of intrusion impact analysis for the overall security of Clouds. Additionally, we present a machine learning based scheme to address this challenge in accordance with the specific requirements of Clouds, along with a rigorous evaluation performed to assess the effectiveness and feasibility of the proposed method. The evaluation results demonstrate a high degree of effectiveness in correctly determining the impact of an intrusion, along with a significant reduction in intrusion response time.

  7. Exploring the Relationships between Tutor Background, Tutor Training, and Student Learning: A Problem-Based Learning Meta-Analysis

    Science.gov (United States)

    Leary, Heather; Walker, Andrew; Shelton, Brett E.; Fitt, M. Harrison

    2013-01-01

    Despite years of primary research on problem-based learning and literature reviews, no systematic effort has been made to analyze the relationship between tutor characteristics and student learning outcomes. In an effort to fill that gap, the following meta-analysis coded 223 outcomes from 94 studies, with small but positive gains for PBL students…

  8. Employees' Willingness to Participate in Work-Related Learning: A Multilevel Analysis of Employees' Learning Intentions

    Science.gov (United States)

    Kyndt, Eva; Onghena, Patrick; Smet, Kelly; Dochy, Filip

    2014-01-01

    The current study focuses on employees' learning intentions, or the willingness to undertake formal work-related learning. This cross-sectional survey study included a sample of 1,243 employees nested within 21 organisations. The results of the multilevel analysis show that self-directedness in career processes, time management,…

  9. Geospatial intelligence and visual classification of environmentally observed species in the Future Internet

    Science.gov (United States)

    Arbab-Zavar, B.; Chakravarthy, A.; Sabeur, Z. A.

    2012-04-01

    The rapid development in recent years of advanced smart communication tools with good-quality, high-resolution video cameras, audio, and GPS devices will have profound impacts on the way future environmental observations are conducted and accessed by communities. The resulting large-scale interconnection of these "Future Internet Things" forms a large environmental sensing network which will generate large volumes of quality environmental observations at highly localised spatial scales. This enablement of environmental sensing at local scales will make an important contribution to the study of fauna and flora in the near future, particularly of the effect of climate change on biodiversity in various regions of Europe and beyond. The Future Internet could also potentially become the de facto information space for participative real-time sensing by communities, improving our situation awareness of the effect of climate on local environments. In the ENVIROFI (2011-2013) Usage Area project in the FP7 FI-PPP programme, a set of requirements for specific (and generic) enablers has been established, with the potential establishment of participative community observatories of the future. In particular, the specific enablement of interest concerns the building of future interoperable services for intelligently managing environmental data tagged with contextual geospatial information generated by multiple operators in communities (using smartphones). The classification of observed species in the resulting images is achieved with structured data pre-processing, semantic enrichment using contextual geospatial information, and high-level fusion with controlled uncertainty estimations. The returned identification of species is further improved using future ground-truth corrections and learning by the specific enablers.

  10. The Usability Analysis of An E-Learning Environment

    Directory of Open Access Journals (Sweden)

    Fulya TORUN

    2015-10-01

    Full Text Available In this research, an e-learning environment was developed for teacher candidates taking the course on Scientific Research Methods. The course contents were adapted to one of the constructivist approach models, referred to as 5E, and expert opinion was obtained on the compliance of this model. A usability analysis was also performed to determine the usability of the e-learning environment. The participants of the research comprised 42 teacher candidates, and a mixed method was used. Three different data collection tools were used in order to measure the three basic dimensions of usability analysis: effectiveness, efficiency, and satisfaction. Two of the data collection tools were scales developed by other researchers and were applied with their approval. The third, a usability test, was prepared by the researchers who conducted this study to determine the participants' success in completing the twelve tasks assigned to them in the e-learning environment, the seconds they spent in that environment, and the number of clicks they performed. The results of the analyses of the data obtained showed the usability of the developed e-learning environment to be high.

  11. A survey on Barrett's esophagus analysis using machine learning.

    Science.gov (United States)

    de Souza, Luis A; Palm, Christoph; Mendel, Robert; Hook, Christian; Ebigbo, Alanna; Probst, Andreas; Messmann, Helmut; Weber, Silke; Papa, João P

    2018-05-01

    This work presents a systematic review of recent studies and technologies of machine learning for Barrett's esophagus (BE) diagnosis and treatment. The use of artificial intelligence is a new and promising way to evaluate such disease. We compile works published in well-established databases, such as Science Direct, IEEEXplore, PubMed, Plos One, Multidisciplinary Digital Publishing Institute (MDPI), Association for Computing Machinery (ACM), Springer, and Hindawi Publishing Corporation. Each selected work has been analyzed to present its objective, methodology, and results. The progression of BE to dysplasia or adenocarcinoma shows a complex pattern to be detected during endoscopic surveillance; therefore, it is valuable to assist its diagnosis and automatic identification using computer analysis. The evaluation of BE dysplasia can be performed through manual or automated segmentation using machine learning techniques. Finally, in this survey, we reviewed recent studies focused on the automatic detection of the neoplastic region for classification purposes using machine learning methods. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Analysis of Learning Development With Sugeno Fuzzy Logic And Clustering

    Directory of Open Access Journals (Sweden)

    Maulana Erwin Saputra

    2017-06-01

    Full Text Available In this first paper, an attempt is made to analyze the factors that affect student achievement, which naturally vary from school to school; students are one means of achieving the goals of a successful educational organization, and their emotions and behaviors influence their learning performance. Fuzzy logic can be used in various fields, and clustering can be used for grouping, as in this analysis of learning development; both are applied to students based on existing indicators. Fuzzy logic deals with uncertainty, but its strength is its capability for linguistic reasoning, so its design does not require complicated mathematical equations. The clustering method used is K-means, in which the data are partitioned into k groups (k = 1, 2, 3, …) in order to determine the optimal number of performance groups. The result of the research is that questionnaire data entered into MATLAB produce meaningful values for generating graphs, making it easier for the school to see student performance in the learning process according to specific criteria. The system thus provides results for the decision-making required by the school.
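
    The abstract names K-means but shows no data, so the sketch below is a minimal, stdlib-only one-dimensional K-means on invented questionnaire scores, illustrating how students are grouped into k performance clusters (k = 2 here).

    ```python
    # Minimal 1-D K-means: assign each value to its nearest centroid, then
    # move each centroid to the mean of its group, and repeat.
    def kmeans_1d(values, k=2, iters=20):
        centroids = sorted(values)[::max(1, len(values) // k)][:k]
        for _ in range(iters):
            groups = [[] for _ in range(k)]
            for v in values:
                nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
                groups[nearest].append(v)
            centroids = [sum(g) / len(g) if g else centroids[i]
                         for i, g in enumerate(groups)]
        return centroids, groups

    scores = [55, 60, 58, 85, 90, 88, 62, 87]   # made-up performance scores
    centroids, groups = kmeans_1d(scores)
    # Two clear clusters emerge: a low-performing and a high-performing group.
    ```

    Choosing the optimal k (the step the paper combines with Sugeno fuzzy inference) would typically be done by re-running this for several k and comparing a cluster-quality measure.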

  13. Autonomous learning in gesture recognition by using lobe component analysis

    Science.gov (United States)

    Lu, Jian; Weng, Juyang

    2007-02-01

    Gesture recognition is a new human-machine interface method implemented by pattern recognition (PR). In order to ensure robot safety when gestures are used in robot control, the interface must be implemented reliably and accurately. As in other PR applications, two factors largely determine the performance of gesture recognition: (1) feature selection (or model establishment) and (2) training from samples. For (1), a simple model with six feature points at the shoulders, elbows, and hands is established. The gestures to be recognized are restricted to still arm gestures, and the movement of the arms is not considered. These restrictions reduce misrecognition and are not unreasonable. For (2), a new biological network method, called lobe component analysis (LCA), is used in unsupervised learning. Lobe components, corresponding to high concentrations in the probability of the neuronal input, are orientation-selective cells that follow the Hebbian rule and lateral inhibition. Because the LCA method balances learning between global and local features, a large number of samples can be used efficiently in learning.
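
    The core update can be sketched in a few lines. This is a toy illustration, not the paper's implementation: the winning neuron pulls its weight vector toward the input with a learning rate that decays as the neuron wins more often (an "amnesic average"), hard winner-take-all stands in for lateral inhibition, and all numbers are invented.

    ```python
    # One Hebbian-style update step for a bank of lobe-component neurons.
    def lca_step(weights, counts, x):
        # Winner: the neuron whose weight vector best matches the input (dot product).
        win = max(range(len(weights)),
                  key=lambda i: sum(w * xi for w, xi in zip(weights[i], x)))
        counts[win] += 1
        lr = 1.0 / counts[win]        # learning rate decays with the win count
        weights[win] = [(1 - lr) * w + lr * xi for w, xi in zip(weights[win], x)]
        return win

    weights = [[0.6, 0.4], [0.4, 0.6]]   # two neurons, two-dimensional inputs
    counts = [0, 0]
    lca_step(weights, counts, [1.0, 0.0])
    lca_step(weights, counts, [0.0, 1.0])
    # Each neuron specializes toward the input region it keeps winning.
    ```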

  14. Extreme learning machine for ranking: generalization analysis and applications.

    Science.gov (United States)

    Chen, Hong; Peng, Jiangtao; Zhou, Yicong; Li, Luoqing; Pan, Zhibin

    2014-05-01

    The extreme learning machine (ELM) has attracted increasing attention recently with its successful applications in classification and regression. In this paper, we investigate the generalization performance of ELM-based ranking. A new regularized ranking algorithm is proposed based on the combinations of activation functions in ELM. The generalization analysis is established for the ELM-based ranking (ELMRank) in terms of the covering numbers of hypothesis space. Empirical results on the benchmark datasets show the competitive performance of the ELMRank over the state-of-the-art ranking methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
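
    The ELM idea behind ELMRank - a random, untrained hidden layer followed by a least-squares solve for the output weights - can be sketched with the standard library alone. The network size, ridge term, and toy regression target below are illustrative assumptions; the paper builds a ranking-specific loss and generalization analysis on top of this basic machine.

    ```python
    import math
    import random

    def solve(A, b):
        """Gaussian elimination with partial pivoting (small dense systems)."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for c in range(n):
            p = max(range(c, n), key=lambda r: abs(M[r][c]))
            M[c], M[p] = M[p], M[c]
            for r in range(c + 1, n):
                f = M[r][c] / M[c][c]
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
        return x

    def elm_fit(X, y, hidden=6, ridge=1e-3, seed=0):
        rng = random.Random(seed)
        # Hidden layer weights are random and never trained.
        W = [[rng.uniform(-1, 1) for _ in X[0]] for _ in range(hidden)]
        b = [rng.uniform(-1, 1) for _ in range(hidden)]
        act = lambda x, j: math.tanh(sum(w * xi for w, xi in zip(W[j], x)) + b[j])
        H = [[act(x, j) for j in range(hidden)] for x in X]
        # Output weights: beta = (H'H + ridge*I)^-1 H'y  (regularized least squares).
        HtH = [[sum(row[i] * row[j] for row in H) + (ridge if i == j else 0.0)
                for j in range(hidden)] for i in range(hidden)]
        Hty = [sum(row[i] * t for row, t in zip(H, y)) for i in range(hidden)]
        beta = solve(HtH, Hty)
        return lambda x: sum(beta[j] * act(x, j) for j in range(hidden))

    X = [[i / 10] for i in range(20)]
    y = [2 * x[0] + 1 for x in X]     # simple linear target for the demo
    f = elm_fit(X, y)
    ```

    Because only the output layer is solved for, training is a single linear solve - the source of ELM's speed advantage over iteratively trained networks.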

  15. Indoor Air Quality Analysis Using Deep Learning with Sensor Data

    Directory of Open Access Journals (Sweden)

    Jaehyun Ahn

    2017-10-01

    Full Text Available Indoor air quality analysis is of interest for understanding abnormal atmospheric phenomena and the external factors that affect air quality. By recording and analyzing quality measurements, we are able to observe patterns in the measurements and predict the air quality of the near future. We designed a sensor-equipped microchip that is capable of periodically recording measurements, and proposed a model that estimates atmospheric changes using deep learning. In addition, we developed an efficient algorithm to determine the optimal observation period for accurate air quality prediction. Experimental results with real-world data demonstrate the feasibility of our approach.
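
    The optimal-observation-period idea can be reduced to a stdlib sketch: turn a sensor series into (window, next value) predictions and score candidate window sizes. A trivial moving-average predictor stands in for the paper's deep network, and the readings below are made up.

    ```python
    # Score a window size w by the mean absolute error of predicting each
    # reading from the average of the w readings before it.
    def window_error(series, w):
        errs = [abs(sum(series[i - w:i]) / w - series[i])
                for i in range(w, len(series))]
        return sum(errs) / len(errs)

    co2 = [400, 402, 405, 410, 420, 435, 445, 450, 452, 451]  # invented readings
    best = min(range(1, 5), key=lambda w: window_error(co2, w))
    ```

    For this steadily trending toy series the shortest window wins; a learned predictor over real sensor data would generally trade off window length against noise rather than trend alone.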

  16. Job task analysis: lessons learned from application in course development

    International Nuclear Information System (INIS)

    Meredith, J.B.

    1985-01-01

    Those at Public Service Electric and Gas Company are committed to a systematic approach to training known as Instructional System Design (ISD). Our performance-based training emphasizes having trainees perform the tasks of the jobs for which they are being trained whenever and wherever possible. Included is a brief description of our process for conducting and validating job analyses. The major thrust of this paper, however, is the lessons that we have learned in the design and development of training programs based upon job analysis results.

  17. Humanities data in R exploring networks, geospatial data, images, and text

    CERN Document Server

    Arnold, Taylor

    2015-01-01

    This pioneering book teaches readers to use R within four core analytical areas applicable to the Humanities: networks, text, geospatial data, and images. This book is also designed to be a bridge: between quantitative and qualitative methods, individual and collaborative work, and the humanities and social scientists. Exploring Humanities Data Types with R does not presuppose background programming experience. Early chapters take readers from R set-up to exploratory data analysis (continuous and categorical data, multivariate analysis, and advanced graphics with emphasis on aesthetics and facility). Everything is hands-on: networks are explained using U.S. Supreme Court opinions, and low-level NLP methods are applied to short stories by Sir Arthur Conan Doyle. The book’s data, code, appendix with 100 basic programming exercises and solutions, and dedicated website are valuable resources for readers. The methodology will have wide application in classrooms and self-study for the humanities, but also for use...

  18. Web Log Pre-processing and Analysis for Generation of Learning Profiles in Adaptive E-learning

    Directory of Open Access Journals (Sweden)

    Radhika M. Pai

    2016-03-01

    Full Text Available Adaptive E-learning Systems (AESs) enhance the efficiency of online courses in education by providing personalized contents and user interfaces that change according to the learner's requirements and usage patterns. This paper presents an approach to generating a learning profile for each learner, which helps to identify learning styles and provide an adaptive user interface, including adaptive learning components and learning material. The proposed method analyzes captured web usage data to identify the learning profile of each learner. The learning profiles are identified by an algorithmic approach based on the frequency of accessing the materials and the time spent on the various learning components on the portal. The captured log data is pre-processed and converted into a standard XML format to generate learner sequence data corresponding to the different sessions and time spent. The learning style model adopted in this approach is the Felder-Silverman Learning Style Model (FSLSM). This paper also presents the analysis of learners' activities, the pre-processed XML files, and the generated sequences.
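
    The profile-building step can be sketched as follows. This is an illustration only: the component names, session log, and the visual-vs-verbal rule are invented rather than taken from the paper, but the profile is built the same way, from access frequency and time spent per learning component.

    ```python
    # Hypothetical session log: (learning component accessed, seconds spent).
    session_log = [
        ("video_example", 240), ("video_example", 180),
        ("text_theory", 60), ("video_example", 300), ("text_theory", 45),
    ]

    def profile(log):
        totals = {}                    # component -> (access count, total seconds)
        for component, seconds in log:
            count, time = totals.get(component, (0, 0))
            totals[component] = (count + 1, time + seconds)
        # A made-up rule on one FSLSM-like dimension: more time on video-style
        # components suggests a visual learner, otherwise verbal.
        visual = totals.get("video_example", (0, 0))[1]
        verbal = totals.get("text_theory", (0, 0))[1]
        return ("visual" if visual > verbal else "verbal"), totals

    style, totals = profile(session_log)
    ```

    The real system derives such per-session aggregates from the pre-processed XML logs before mapping them onto the FSLSM dimensions.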

  20. ANALYSIS LEARNING MODEL OF DISCOVERY AND UNDERSTANDING THE CONCEPT PRELIMINARY TO PHYSICS LEARNING OUTCOMES SMA

    Directory of Open Access Journals (Sweden)

    Sri Rosepda Sebayang

    2015-12-01

    Full Text Available This study aims: (1) to determine whether student learning outcomes using discovery learning are better than those of conventional learning; (2) to determine whether the learning outcomes of students with a high initial understanding of concepts are better than those of students with a low initial understanding; and (3) to determine the interaction effect of discovery learning and initial concept understanding on students' learning outcomes. The sample in this study was taken by cluster random sampling of two classes: class X PIA 3 as the experimental class, taught with discovery learning, and class X PIA 2 as the control class, taught with conventional learning. The instruments used in this study were a multiple-choice test of learning outcomes and an essay-form test of initial concept understanding. The results of the research are: (1) the learning outcomes of students taught with discovery learning are better than those of students taught with conventional learning; (2) the learning outcomes of students with a high initial conceptual understanding are better than those of students with a low initial conceptual understanding; and (3) there was no interaction between discovery learning and initial concept understanding on student learning outcomes.

  1. Predicting Smoking Status Using Machine Learning Algorithms and Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Charles Frank

    2018-03-01

    Full Text Available Smoking has been proven to negatively affect health in a multitude of ways. As of 2009, smoking has been considered the leading cause of preventable morbidity and mortality in the United States, continuing to plague the country's overall health. This study aims to investigate the viability and effectiveness of some machine learning algorithms for predicting the smoking status of patients based on their blood tests and vital readings. The analysis of this study is divided into two parts. In part 1, we use one-way ANOVA analysis with the SAS tool to show the statistically significant difference in blood test readings between smokers and non-smokers. The results show that the difference in INR, which measures the effectiveness of anticoagulants, was significant in favor of non-smokers, which further confirms the health risks associated with smoking. In part 2, we use five machine learning algorithms - Naïve Bayes, MLP, logistic regression, J48, and Decision Table - to predict the smoking status of patients. To compare the effectiveness of these algorithms, we use precision, recall, F-measure, and accuracy. The results show that the logistic algorithm outperformed the four other algorithms, with precision, recall, F-measure, and accuracy of 83%, 83.4%, 83.2%, and 83.44%, respectively.
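
    The four comparison measures are standard and easy to compute from scratch; the sketch below does so on an invented toy set of true vs. predicted smoking labels (the study's real data is not shown in the abstract).

    ```python
    # Precision, recall, F-measure, and accuracy from paired label lists.
    def prf(y_true, y_pred, positive="smoker"):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        f1 = 2 * precision * recall / (precision + recall)
        accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
        return precision, recall, f1, accuracy

    y_true = ["smoker", "smoker", "non", "non", "smoker", "non"]
    y_pred = ["smoker", "non", "non", "smoker", "smoker", "non"]
    p, r, f, a = prf(y_true, y_pred)
    ```

    On this toy data each measure comes out to 2/3; the study reports these four figures for each of its five classifiers to make the comparison.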

  2. Quantitative Machine Learning Analysis of Brain MRI Morphology throughout Aging.

    Science.gov (United States)

    Shamir, Lior; Long, Joe

    2016-01-01

    While cognition is clearly affected by aging, it is unclear whether the process of brain aging is driven solely by accumulation of environmental damage or involves biological pathways. We applied quantitative image analysis to profile the alteration of brain tissues during aging. A dataset of 463 brain MRI images taken from a cohort of 416 subjects was analyzed using a large set of low-level numerical image content descriptors computed from the entire brain MRI images. The correlation between the numerical image content descriptors and age was computed, and the alterations of the brain tissues during aging were quantified and profiled using machine learning. The comprehensive set of global image content descriptors provides a high Pearson correlation of ~0.9822 with chronological age, indicating that the machine learning analysis of global features is sensitive to the age of the subjects. Profiling of the predicted age shows several periods of mild change separated by shorter periods of more rapid alteration; the most rapid changes occurred around the ages of 55 and 65. The results show that the process of brain aging is not linear and exhibits short periods of rapid aging separated by periods of milder change. These results are in agreement with patterns observed in cognitive decline, mental health status, and general human aging, suggesting that brain aging might not be driven solely by accumulation of environmental damage. Code and data used in the experiments are publicly available.
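
    The correlation step can be reproduced in stdlib form: Pearson r between one image content descriptor and chronological age, here on six invented subjects (the study reports r ≈ 0.9822 over its full descriptor set, not this toy data).

    ```python
    # Pearson correlation coefficient from first principles.
    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    ages = [20, 30, 40, 50, 60, 70]
    descriptor = [1.1, 1.4, 1.9, 2.4, 2.8, 3.5]   # made-up values that grow with age
    r = pearson(ages, descriptor)
    ```

    A descriptor whose r with age is near 1 (or -1) is informative for age prediction, which is why the study screens its large descriptor set this way before the machine learning stage.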

  3. Learning from Trending, Precursor Analysis, and System Failures

    Energy Technology Data Exchange (ETDEWEB)

    Youngblood, R. W. [Idaho National Laboratory, Idaho Falls, ID (United States); Duffey, R. B. [Idaho National Laboratory, Idaho Falls, ID (United States)

    2015-11-01

    Models of reliability growth relate current system unreliability to currently accumulated experience. But “experience” comes in different forms. Looking back after a major accident, one is sometimes able to identify previous events or measurable performance trends that were, in some sense, signaling the potential for that major accident: potential that could have been recognized and acted upon, but was not recognized until the accident occurred. This could be a previously unrecognized cause of accidents, or underestimation of the likelihood that a recognized potential cause would actually operate. Despite improvements in the state of practice of modeling of risk and reliability, operational experience still has a great deal to teach us, and work has been going on in several industries to try to do a better job of learning from experience before major accidents occur. It is not enough to say that we should review operating experience; there is too much “experience” for such general advice to be considered practical. The paper discusses the following: 1. The challenge of deciding what to focus on in analysis of operating experience. 2. Comparing what different models of learning and reliability growth imply about trending and precursor analysis.

  4. Machine learning methods for clinical forms analysis in mental health.

    Science.gov (United States)

    Strauss, John; Peguero, Arturo Martinez; Hirst, Graeme

    2013-01-01

    In preparation for a clinical information system implementation, the Centre for Addiction and Mental Health (CAMH) Clinical Information Transformation project completed multiple preparation steps. An automated process was desired to supplement the onerous task of manually analyzing clinical forms. We used natural language processing (NLP) and machine learning (ML) methods on a series of 266 separate clinical forms. For the investigation, documents were represented by feature vectors. We used four ML algorithms in our examination of the forms: cluster analysis, k-nearest neighbours (kNN), decision trees, and support vector machines (SVM). Parameters for each algorithm were optimized. SVM had the best performance, with a precision of 64.6%. Though we did not find any method sufficiently accurate for practical use, to our knowledge this approach to forms has not been used previously in mental health.
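
    A toy sketch of one of the four methods - nearest neighbour over document feature vectors - using raw term counts and cosine similarity. The two form texts and categories are invented, and the project's real pipeline used richer NLP features and optimized parameters.

    ```python
    from collections import Counter
    import math

    def cosine(a, b):
        num = sum(a[t] * b[t] for t in set(a) & set(b))
        den = (math.sqrt(sum(v * v for v in a.values()))
               * math.sqrt(sum(v * v for v in b.values())))
        return num / den if den else 0.0

    # Hypothetical labeled forms, each represented as a bag-of-words vector.
    forms = {
        "intake": "patient history substance use intake assessment",
        "consent": "consent signature treatment consent agreement",
    }
    vectors = {name: Counter(text.split()) for name, text in forms.items()}

    def nearest_form(text):
        query = Counter(text.split())
        return max(vectors, key=lambda name: cosine(query, vectors[name]))

    label = nearest_form("signed consent for treatment")
    ```

    The same vector representation feeds the other three algorithms (clustering, decision trees, SVM); only the classifier on top changes.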

  5. ANALYSIS OF VIRTUAL ENVIRONMENT BENEFIT IN E-LEARNING

    Directory of Open Access Journals (Sweden)

    NOVÁK, Martin

    2013-06-01

    Full Text Available This article analyzes the benefits of a virtual environment for improving the e-learning process. The virtual environment was created within the project ‘Virtualization’ at the Faculty of Economics and Administration, University of Pardubice. The aim of this project was to eliminate the disproportion in free access to licensed software between part-time and full-time students. The research was carried out within selected subjects of the study programme System Engineering and Informatics; the subjects were related to informatics, applied informatics, control, and decision making. Students' subject results, student feedback from an electronic questionnaire, and data from the log file of virtual server usage were compared and analysed. Based on an analysis of virtualization options, the virtual environment was implemented using Microsoft Terminal Server.

  6. What students learn in problem-based learning: a process analysis

    NARCIS (Netherlands)

    E.H.J. Yew (Elaine); H.G. Schmidt (Henk)

    2012-01-01

    This study aimed to provide an account of how learning takes place in problem-based learning (PBL), and to identify the relationships between the learning-oriented activities of students and their learning outcomes. First, the verbal interactions and computer resources studied by nine

  7. Common Mobile Learning Characteristics--An Analysis of Mobile Learning Models and Frameworks

    Science.gov (United States)

    Imtinan, Umera; Chang, Vanessa; Issa, Tomayess

    2013-01-01

    Mobile learning offers learning opportunities to learners without the limitations of time and space. Mobile learning has introduced a number of flexible options to the learners across disciplines and at different educational levels. However, designing mobile learning content is an equally challenging task for the instructional designers.…

  8. An Analysis of Learning Barriers: The Saudi Arabian Context

    Science.gov (United States)

    Khan, Intakhab A.

    2011-01-01

    Learning and teaching are closely interrelated: teaching cannot take place unless the target students learn, making teaching a bi-polar activity. Learning barriers, or causes of learning difficulties, are quite common in educational settings, but when their effects become severe, they become crucial and unavoidable. There are different kinds…

  9. Contextualizing Cave Maps as Geospatial Information: Case Study of Indonesia

    Science.gov (United States)

    Reinhart, H.

    2017-12-01

    Caves are the result of solution processes. Because they result from geochemical and tectonic activity, they can be considered geosphere phenomena. As geosphere phenomena, found especially in karst landforms, caves have spatial dimensions and aspects. The utilization and development of caves are increasing in many sectors, such as hydrology, earth science, and the tourism industry. However, the spatial aspects of caves receive little attention due to the lack of recognition of cave maps: many stakeholders do not know the significance and importance of cave maps in determining the development of a cave, and the scarcity of information can be considered the cause. It is therefore strongly necessary to put cave maps into the right context in order to make stakeholders realize their significance. Cave maps would then be officially regarded as tools related to policy, development, and conservation acts for caves, and their usage and applications would be regulated. This paper aims to contextualize cave maps with respect to a legal act, namely Act Number 4 Year 2011 About Geospatial Information. The contextualization is done by scrutinizing every article and clause related to cave maps and seeking the contextual elements of both. The results are that cave maps can be regarded as geospatial information and classified as thematic geospatial information, and that their usage can be regulated through Act Number 4 Year 2011. The regulation covers data acquisition, databases, authorities, surveyors, and the obligation to provide cave maps when planning a cave's development and the surrounding environment.

  10. Improving the Slum Planning Through Geospatial Decision Support System

    Science.gov (United States)

    Shekhar, S.

    2014-11-01

    In India, a number of schemes and programmes have been launched from time to time to promote integrated city development and to enable slum dwellers to gain access to basic services. Despite the use of geospatial technologies in planning, the local, state and central governments have been only partially successful in dealing with these problems. The study of existing policies and programmes also showed that when the government is the sole provider or mediator, GIS can become a tool of coercion rather than of participatory decision-making. It has also been observed that local-level administrators who have adopted geospatial technology for local planning continue to base decision-making on existing political processes. At this juncture, a geospatial decision support system (GSDSS) can provide a framework for integrating database management systems with analytical models, graphical display, tabular reporting capabilities and the expert knowledge of decision makers. This assists decision-makers in generating and evaluating alternative solutions to spatial problems. During this process, decision-makers undertake decision research, producing a large number of possible decision alternatives, and provide opportunities to involve the community in decision making. The objective is to help decision makers and planners find solutions through a quantitative spatial evaluation and verification process. The study investigates the options for slum development within the formal framework of RAY (Rajiv Awas Yojana), an ambitious programme of the Indian Government for slum development. The software modules for realizing the GSDSS were developed using the ArcGIS and CommunityViz software for Gulbarga city.

  11. Establishing Accurate and Sustainable Geospatial Reference Layers in Developing Countries

    Science.gov (United States)

    Seaman, V. Y.

    2017-12-01

    Accurate geospatial reference layers (settlement names & locations, administrative boundaries, and population) are not readily available for most developing countries. This critical information gap makes it challenging for governments to efficiently plan, allocate resources, and provide basic services. It also hampers international agencies' response to natural disasters, humanitarian crises, and other emergencies. The current work involves a recent successful effort, led by the Bill & Melinda Gates Foundation and the Government of Nigeria, to obtain such data. The data collection began in 2013, with local teams collecting names, coordinates, and administrative attributes for over 100,000 settlements using ODK-enabled smartphones. A settlement feature layer extracted from satellite imagery was used to ensure all settlements were included. Administrative boundaries (Ward, LGA) were created using the settlement attributes. These "new" boundary layers were much more accurate than existing shapefiles used by the government and international organizations. The resulting data sets helped Nigeria eradicate polio from all areas except in the extreme northeast, where security issues limited access and vaccination activities. In addition to the settlement and boundary layers, a GIS-based population model was developed, in partnership with Oak Ridge National Laboratory and Flowminder, that used the extracted settlement areas and characteristics, along with targeted microcensus data. This model provides population and demographics estimates independent of census or other administrative data, at a resolution of 90 meters. These robust geospatial data layers found many other uses, including establishing catchment area settlements and populations for health facilities, validating denominators for population-based surveys, and applications across a variety of government sectors.
Based on the success of the Nigeria effort, a partnership between DfID and the Bill & Melinda Gates

  12. Multi-class geospatial object detection based on a position-sensitive balancing framework for high spatial resolution remote sensing imagery

    Science.gov (United States)

    Zhong, Yanfei; Han, Xiaobing; Zhang, Liangpei

    2018-04-01

    Multi-class geospatial object detection from high spatial resolution (HSR) remote sensing imagery is attracting increasing attention in a wide range of object-related civil and engineering applications. However, the distribution of objects in HSR remote sensing imagery is location-variable and complicated, and accurately detecting those objects is a critical problem. Owing to the powerful feature extraction and representation capability of deep learning, integrated frameworks that couple deep-learning-based region proposal generation with object detection have greatly improved the performance of multi-class geospatial object detection for HSR remote sensing imagery. However, because of the translation behaviour of the convolution operation in the convolutional neural network (CNN), the performance of the classification stage is seldom affected, but the localization accuracy of the predicted bounding boxes in the detection stage is easily degraded. This dilemma between translation-invariance in the classification stage and translation-variance in the object detection stage has not been addressed for HSR remote sensing imagery, and it causes position accuracy problems for multi-class geospatial object detection with region proposal generation and object detection. To further improve the performance of this integrated framework, a position-sensitive balancing (PSB) framework is proposed in this paper for multi-class geospatial object detection from HSR remote sensing imagery. The proposed PSB framework takes full advantage of the fully convolutional network (FCN), built on a residual network, to resolve the dilemma between translation-invariance in the classification stage and translation-variance in the object detection stage. 
In addition, a pre-training mechanism is utilized to accelerate the training procedure

  13. Teaching And Learning Tectonics With Web-GIS

    Science.gov (United States)

    Anastasio, D. J.; Sahagian, D. L.; Bodzin, A.; Teletzke, A. L.; Rutzmoser, S.; Cirucci, L.; Bressler, D.; Burrows, J. E.

    2012-12-01

    Tectonics is a new curriculum enhancement consisting of six Web GIS investigations designed to augment a traditional middle school Earth science curriculum. The investigations are aligned to Disciplinary Core Ideas: Earth and Space Science from the National Research Council's (2012) Framework for K-12 Science Education and to tectonics benchmark ideas articulated in the AAAS Project 2061 (2007) Atlas of Science Literacy. The curriculum emphasizes geospatial thinking and scientific inquiry and consists of the following modules: Geohazards: which plate boundary is closest to me? How do we recognize plate boundaries? How does thermal energy move around the Earth? What happens when plates diverge? What happens when plates move sideways past each other? What happens when plates collide? The Web GIS interface uses JavaScript for simplicity, intuition, and convenience for implementation on a variety of platforms, making it easier for diverse middle school learners and their teachers to conduct authentic Earth science investigations, including multidisciplinary visualization, analysis, and synthesis of data. Instructional adaptations allow students who are English language learners, have disabilities, or are reluctant readers to perform advanced desktop GIS functions including spatial analysis, map visualization and query. The Web GIS interface integrates graphics, multimedia, and animation in addition to newly developed features, which allow users to explore and discover geospatial patterns that would not be easily visible using typical classroom instructional materials. The Tectonics curriculum uses a spatial learning design model that incorporates a related set of frameworks and design principles. 
The framework builds on the work of other successful technology-integrated curriculum projects and includes alignment of materials and assessments with learning goals, casting key ideas in real-world problems, and engaging students in scientific practices that foster the use of key

  14. The national atlas as a metaphor for improved use of a national geospatial data infrastructure

    NARCIS (Netherlands)

    Aditya Kurniawan Muhammad, T.

    2007-01-01

    Geospatial Data infrastructures have been developed worldwide. Geoportals have been created as an interface to allow users or the community to discover and use geospatial data offered by providers of these initiatives. This study focuses on the development of a web national atlas as an alternative

  15. 76 FR 15311 - Legacy Learning Systems, Inc.; Analysis of Proposed Consent Order To Aid Public Comment

    Science.gov (United States)

    2011-03-21

    ... FEDERAL TRADE COMMISSION [File No. 102 3055] Legacy Learning Systems, Inc.; Analysis of Proposed... electronically or in paper form. Comments should refer to ``Legacy Learning Systems, File No. 102 3055'' to... it. A comment filed in paper form should include the ``Legacy Learning Systems, File No. 102 3055...

  16. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques have proven to be useful for software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...

  17. Open geospatial infrastructure for data management and analytics in interdisciplinary research

    DEFF Research Database (Denmark)

    Jeppesen, Jacob Høxbroe; Ebeid, Emad Samuel Malki; Jacobsen, Rune Hylsberg

    2018-01-01

    The terms Internet of Things and Big Data are currently subject to much attention, though the specific impact of these terms on our practical lives is difficult to apprehend. Data-driven approaches do lead to new possibilities, and significant improvements within a broad range of domains can…… and the information and communications technology needed to promote the implementation of precision agriculture is limited by proprietary integrations and non-standardized data formats and connections. In this paper, an open geospatial data infrastructure is presented, based on standards defined by the Open…… software, and was complemented by open data from governmental offices along with ESA satellite imagery. Four use cases are presented, covering analysis of nearly 50 000 crop fields and providing seamless interaction with an emulated machine terminal. They act to showcase both how the infrastructure……

  18. WE-B-BRC-02: Risk Analysis and Incident Learning

    Energy Technology Data Exchange (ETDEWEB)

    Fraass, B. [Cedars Sinai Medical Center (United States)

    2016-06-15

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. Learning Objectives: Description of risk assessment methodologies used in healthcare and industry Discussion of radiation oncology

  19. WE-B-BRC-02: Risk Analysis and Incident Learning

    International Nuclear Information System (INIS)

    Fraass, B.

    2016-01-01

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. Learning Objectives: Description of risk assessment methodologies used in healthcare and industry Discussion of radiation oncology

  20. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    Directory of Open Access Journals (Sweden)

    João Cardoso

    2013-05-01

    Full Text Available The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks process each hemodynamic parameter individually and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from the APW. For this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used in the data collection from 50 subjects; (2) the acquired position and amplitude of onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of some morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase the model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) we evaluated the trained models, using the majority-voting system, against the respective calculated Augmentation Index (AIx). The classification algorithms proved to be efficient; in particular, Random Forest showed good accuracy (96.95%) and a high area under the curve (AUC) of the Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high-risk labels retrieved from the multi-parametric approach and positive AIx values was verified. This approach allows the design of new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving the understanding of the arterial pulse, especially when compared to traditional single-parameter analysis, where failure in one parameter measurement component, such as Pi, can jeopardize the whole evaluation.
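
    A minimal sketch of the majority-voting step described above, which combines the labels produced by the four trained classifiers (Random Forest, BayesNet, J48, RIPPER); the per-model predictions below are invented, not outputs from the study's models.

```python
# Majority voting over per-model class labels for one subject.
from collections import Counter

def majority_vote(predictions):
    """Return the label predicted by the largest number of models."""
    return Counter(predictions).most_common(1)[0][0]

# hypothetical labels from the four classifiers for one APW record
votes = ["high_risk", "high_risk", "low_risk", "high_risk"]
print(majority_vote(votes))  # high_risk
```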

  1. Exploring the Earth Using Deep Learning Techniques

    Science.gov (United States)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Research on deep neural networks has matured significantly in recent times, and there is now a surge of interest in applying such methods to Earth systems science and the geosciences. When combined with Big Data, we believe there are opportunities for significantly transforming a number of areas relevant to researchers and policy makers. In particular, by using a combination of data from a range of satellite Earth observations as well as computer simulations from climate models and reanalysis, we can gain new insights into the information that is locked within the data. Global geospatial datasets describe a wide range of physical and chemical parameters, which are mostly available on regular grids covering large spatial and temporal extents. This makes them perfect candidates for deep learning methods. So far, these techniques have been successfully applied to image analysis through the use of convolutional neural networks. However, this is only one field of interest, and there is potential for many more use cases to be explored. Deep learning algorithms require fast access to large amounts of data in the form of tensors and make intensive use of CPUs in order to train their models. The Australian National Computational Infrastructure (NCI) has recently augmented its Raijin 1.2 PFlop supercomputer with hardware accelerators. Together with NCI's 3000-core high-performance OpenStack cloud, these computational systems have direct access to NCI's 10+ PBytes of datasets and associated Big Data software technologies (see http://geonetwork.nci.org.au/ and http://nci.org.au/systems-services/national-facility/nerdip/). Effective use of these computing infrastructures requires that both the data and software are organised in a way that readily supports the deep learning software ecosystem. 
Deep learning software, such as the open source TensorFlow library, has allowed us to demonstrate the possibility of generating geospatial models by combining information from

  2. Categorizing natural disaster damage assessment using satellite-based geospatial techniques

    Science.gov (United States)

    Myint, S.W.; Yuan, M.; Cerveny, R.S.; Giri, C.

    2008-01-01

    Remote sensing of a natural disaster's damage offers an exciting backup and/or alternative to traditional means of on-site damage assessment. Although necessary for complete assessment of damage areas, ground-based damage surveys conducted in the aftermath of natural hazard passage can sometimes be potentially complicated due to on-site difficulties (e.g., interaction with various authorities and emergency services) and hazards (e.g., downed power lines, gas lines, etc.), the need for rapid mobilization (particularly for remote locations), and the increasing cost of rapid physical transportation of manpower and equipment. Satellite image analysis, because of its global ubiquity, its ability for repeated independent analysis, and, as we demonstrate here, its ability to verify on-site damage assessment, provides an interesting new perspective and investigative aid to researchers. Using one of the strongest tornado events in US history, the 3 May 1999 Oklahoma City Tornado, as a case example, we digitized the tornado damage path and co-registered the damage path using pre- and post-event Landsat Thematic Mapper image data to perform a damage assessment. We employed several geospatial approaches, specifically the Getis index, Geary's C, and two lacunarity approaches, to categorize damage characteristics according to the original Fujita tornado damage scale (F-scale). Our results indicate strong relationships between spatial indices computed within a local window and tornado F-scale damage categories identified through the ground survey. Consequently, linear regression models, even incorporating just a single band, appear effective in identifying F-scale damage categories using satellite imagery. This study demonstrates that satellite-based geospatial techniques can effectively add spatial perspectives to natural disaster damage assessment, and in particular, for this case study, to tornado damage.
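
    To illustrate the kind of spatial index used above, the sketch below computes Geary's C for a small raster with rook (edge-sharing) adjacency; values below 1 indicate positive spatial autocorrelation (similar neighbouring pixels), values near 1 spatial randomness. The grid values are invented, not Landsat data from the study.

```python
# Geary's C for a 2-D grid, rook contiguity, binary weights.
def gearys_c(grid):
    rows, cols = len(grid), len(grid[0])
    n = rows * cols
    values = [v for row in grid for v in row]
    mean = sum(values) / n
    denom = sum((v - mean) ** 2 for v in values)
    num = 0.0
    w_sum = 0  # total number of directed neighbour pairs
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((1, 0), (0, 1), (-1, 0), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    num += (grid[i][j] - grid[ni][nj]) ** 2
                    w_sum += 1
    return ((n - 1) * num) / (2 * w_sum * denom)

smooth = [[1, 1, 2], [1, 2, 2], [2, 2, 3]]  # similar neighbours -> C < 1
print(round(gearys_c(smooth), 3))
```

    Computing such an index in a moving window over the damage path yields a local texture measure that can then be regressed against ground-surveyed F-scale categories.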

  3. Geospatial techniques for allocating vulnerability zoning of geohazards along the Karakorum Highway, Gilgit-Baltistan-Pakistan

    Science.gov (United States)

    Khan, K. M.; Rashid, S.; Yaseen, M.; Ikram, M.

    2016-12-01

    The Karakoram Highway (KKH), the 'eighth wonder of the world', was constructed and completed by agreement between Pakistan and China in 1979 as a friendship highway. It connects Gilgit-Baltistan, a strategically prominent region of Pakistan, with the Xinjiang region of China. Due to its manifold geology and geomorphology, soil formation, steep slopes, and climate change, as well as unsustainable anthropogenic activities, the KKH remains remarkably vulnerable to natural hazards, i.e. land subsidence, landslides, erosion, rock fall, floods, debris flows, cyclical torrential rainfall and snowfall, lake outbursts, etc. The damaging effects of these geohazards frequently jeopardize life in the region. To ascertain the nature and frequency of the disasters and to delineate vulnerability zones, a rating and management (logistic) analysis was made to investigate the spatiotemporal distribution of the natural hazards. The substantial dynamics of the physiography, geology, geomorphology, soils and climate were carefully examined, while slope, aspect, elevation, profile curvature and rock hardness were calculated by different techniques. Geospatial analyses were conducted to assess the nature and intensity of the hazards, the magnitude of every factor was gauged using logistic regression, and every relevant variable was integrated in the evaluation process. Logistic regression and geospatial techniques were used to map geohazard vulnerability zoning (GVZ). The GVZ model findings were endorsed by reviews of hazards documented in recent years, and a precision of more than 88.1 % was realized. The study validated the model by highlighting the close agreement between the vulnerability mapping and past documented hazards. Assessed with a receiver operating characteristic curve, the logistic regression model produced satisfactory results. The outcomes will be useful in sustainable land-use and infrastructure planning, mainly in high-risk zones, for reducing economic damage and improving community welfare.
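
    A hedged, self-contained sketch of the logistic-regression susceptibility modelling named above. The features (normalised slope and rainfall index per highway segment), labels (1 = a hazard on record), and hyperparameters are all invented for illustration; the study's actual factor set and calibration are not reproduced here.

```python
# Tiny logistic regression fitted by stochastic gradient descent.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# segments: (normalised slope, normalised rainfall) -> hazard on record?
X = [(0.9, 0.8), (0.8, 0.9), (0.7, 0.6), (0.2, 0.3), (0.1, 0.2), (0.3, 0.1)]
y = [1, 1, 1, 0, 0, 0]
w, b = fit_logistic(X, y)

# susceptibility of an unseen steep, wet segment
prob = sigmoid(sum(wj * xj for wj, xj in zip(w, (0.85, 0.7))) + b)
print(prob > 0.5)  # classed as high-susceptibility
```

    Thresholding the fitted probabilities over all segments is what produces the vulnerability zoning map.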

  4. Visual artificial grammar learning in dyslexia: A meta-analysis.

    Science.gov (United States)

    van Witteloostuijn, Merel; Boersma, Paul; Wijnen, Frank; Rispens, Judith

    2017-11-01

    Literacy impairments in dyslexia have been hypothesized to be (partly) due to an implicit learning deficit. However, studies of implicit visual artificial grammar learning (AGL) have often yielded null results. The aim of this study is to weigh the evidence collected thus far by performing a meta-analysis of studies on implicit visual AGL in dyslexia. Thirteen studies were selected through a systematic literature search, representing data from 255 participants with dyslexia and 292 control participants (mean age range: 8.5-36.8 years old). If the 13 selected studies constitute a random sample, individuals with dyslexia perform worse on average than non-dyslexic individuals (average weighted effect size=0.46, 95% CI [0.14, 0.77], p=0.008), with a larger effect in children than in adults (p=0.041; average weighted effect sizes 0.71 [sig.] versus 0.16 [non-sig.]). However, the presence of a publication bias indicates the existence of missing studies that may well nullify the effect. While the studies under investigation demonstrate that implicit visual AGL is impaired in dyslexia (more so in children than in adults, if in adults at all), the detected publication bias suggests that the effect might in fact be zero. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
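
    The "average weighted effect size" reported above rests on inverse-variance pooling. The sketch below shows that computation under a fixed-effect model with a normal-approximation confidence interval; the per-study effect sizes and standard errors are invented, not the thirteen studies analysed in the paper.

```python
# Fixed-effect (inverse-variance) pooling of standardised effect sizes.
def pooled_effect(effects, std_errs):
    """Return (pooled estimate, 95% CI) under a fixed-effect model."""
    weights = [1.0 / se ** 2 for se in std_errs]          # inverse variance
    est = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5                      # pooled standard error
    return est, (est - 1.96 * se, est + 1.96 * se)

effects = [0.71, 0.16, 0.45]   # hypothetical Cohen's d per study
ses = [0.20, 0.25, 0.30]       # hypothetical standard errors
est, ci = pooled_effect(effects, ses)
print(round(est, 2))
```

    A random-effects model, which the meta-analysis literature usually prefers when studies vary in population (children vs. adults here), adds a between-study variance term to each weight.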

  5. SAFE: A Sentiment Analysis Framework for E-Learning

    Directory of Open Access Journals (Sweden)

    Francesco Colace

    2014-12-01

    Full Text Available The spread of social networks allows sharing opinions on different aspects of life, and millions of messages appear on the web daily. This textual information can be a rich source of data for opinion mining and sentiment analysis: the computational study of opinions, sentiments and emotions expressed in text. Its main aim is the identification of agreement or disagreement statements that deal with positive or negative feelings in comments or reviews. In this paper, we investigate the adoption, in the field of e-learning, of a probabilistic approach based on Latent Dirichlet Allocation (LDA) as a sentiment grabber. With this approach, for a set of documents belonging to the same knowledge domain, a graph, the Mixed Graph of Terms, can be automatically extracted. The paper shows how this graph contains a set of weighted word pairs, which are discriminative for sentiment classification. In this way, the system can detect the feelings of students on some topics, and teachers can better tune their teaching approach. The proposed method has been tested on datasets coming from e-learning platforms. A preliminary experimental campaign shows that the proposed approach is effective and satisfactory.
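
    The paper derives its Mixed Graph of Terms via LDA; as a rough, self-contained stand-in that conveys the "weighted word pairs" idea, the sketch below simply scores word pairs by how often they co-occur in the same student comment. The comments are invented, and this is not the paper's extraction procedure.

```python
# Weighted word pairs from comment-level co-occurrence counts.
from collections import Counter
from itertools import combinations

comments = [
    "great course clear examples",
    "clear lectures great pace",
    "confusing examples poor pace",
]

pair_weights = Counter()
for text in comments:
    words = sorted(set(text.split()))          # unique words, stable order
    for a, b in combinations(words, 2):        # every unordered pair
        pair_weights[(a, b)] += 1

# the heaviest pairs are candidate edges of the term graph
print(pair_weights.most_common(2))
```

    In the paper's pipeline the edge weights come from LDA topic-word distributions rather than raw counts, but the resulting structure, a graph of discriminative weighted word pairs, is the same.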

  6. Parallel multiple instance learning for extremely large histopathology image analysis.

    Science.gov (United States)

    Xu, Yan; Li, Yeshu; Shen, Zhengyang; Wu, Ziwei; Gao, Teng; Fan, Yubo; Lai, Maode; Chang, Eric I-Chao

    2017-08-03

    Histopathology images are critical for medical diagnosis, e.g., of cancer and its treatment. A standard histopathology slice can easily be scanned at a high resolution of, say, 200,000×200,000 pixels. Such high-resolution images can make most existing image processing tools infeasible or less effective when operated on a single machine with limited memory, disk space and computing power. In this paper, we propose an algorithm tackling this newly emerging "big data" problem utilizing parallel computing on High-Performance-Computing (HPC) clusters. Experimental results on a large-scale data set (1318 images at a scale of 10 billion pixels each) demonstrate the efficiency and effectiveness of the proposed algorithm for low-latency real-time applications. The proposed framework is an effective and efficient system for extremely large histopathology image analysis. It is based on the multiple instance learning formulation for weakly supervised image classification, segmentation and clustering. When a max-margin concept is adopted for different clusters, we obtain a further improvement in clustering performance.
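
    The core of the multiple-instance-learning formulation mentioned above is its labelling rule: a bag (e.g. a gigapixel slide split into patches) is positive if any instance in it is positive. A minimal sketch, with hypothetical patch-level classifier scores rather than the paper's learned model:

```python
# Standard MIL assumption: bag label = whether the max instance score
# crosses the decision threshold.
def bag_label(instance_scores, threshold=0.5):
    return max(instance_scores) >= threshold

slide_a = [0.1, 0.2, 0.9, 0.3]   # one suspicious patch -> positive slide
slide_b = [0.1, 0.05, 0.2]       # all patches benign -> negative slide
print(bag_label(slide_a), bag_label(slide_b))  # True False
```

    This weak-supervision setup is what lets the framework train on slide-level labels without patch-level annotation, and the per-patch scoring is what the paper parallelises across HPC nodes.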

  7. River predisposition to ice jams: a simplified geospatial model

    Directory of Open Access Journals (Sweden)

    S. De Munck

    2017-07-01

    Full Text Available Floods resulting from river ice jams pose a great risk to many riverside municipalities in Canada. The location of an ice jam is mainly influenced by channel morphology. The goal of this work was therefore to develop a simplified geospatial model to estimate the predisposition of a river channel to ice jams. Rather than predicting the timing of river ice breakup, the main question here was to predict where broken ice is susceptible to jamming based on the river's geomorphological characteristics. Six parameters referred to in the literature as potential causes of ice jams were initially selected: presence of an island, narrowing of the channel, high sinuosity, presence of a bridge, confluence of rivers, and slope break. A GIS-based tool was used to generate the aforementioned factors over regularly spaced segments along the entire channel using available geospatial data. An ice jam predisposition index (IJPI) was calculated by combining the weighted optimal factors. Three Canadian rivers (province of Québec) were chosen as test sites. The resulting maps were assessed against historical observations and local knowledge. Results show that 77 % of the observed ice jam sites on record occurred in river sections that the model considered as having high or medium predisposition, leaving 23 % false-negative errors (missed occurrences). Between 7 and 11 % of the highly predisposed river sections did not have an ice jam on record (false-positive cases). Results, limitations, and potential improvements are discussed.
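
    The index combination described above can be sketched as a weighted sum of per-segment factor flags. The factor weights, the class thresholds, and the example segment below are illustrative assumptions, not the calibrated values from the study.

```python
# Ice jam predisposition index (IJPI) as a weighted sum of binary factors.
FACTORS = ["island", "narrowing", "high_sinuosity",
           "bridge", "confluence", "slope_break"]
WEIGHTS = {"island": 0.25, "narrowing": 0.20, "high_sinuosity": 0.15,
           "bridge": 0.15, "confluence": 0.15, "slope_break": 0.10}

def ijpi(segment):
    """segment: dict factor -> 0/1 presence flag; returns index in [0, 1]."""
    return sum(WEIGHTS[f] * segment.get(f, 0) for f in FACTORS)

def predisposition_class(score):
    # hypothetical thresholds for the high/medium/low classes
    return "high" if score >= 0.5 else "medium" if score >= 0.25 else "low"

seg = {"island": 1, "narrowing": 1, "bridge": 1}   # one channel segment
score = ijpi(seg)
print(predisposition_class(score))
```

    Running this over every regularly spaced segment and colouring by class is what produces the predisposition map that was compared against the historical jam record.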

  8. Geospatial Data Quality of the Servir CORS Network

    Science.gov (United States)

    Santos, J.; Teodoro, R.; Mira, N.; Mendes, V. B.

    2015-08-01

    The SERVIR Continuous Operation Reference Stations (CORS) network was implemented in 2006 to facilitate land surveying with Global Navigation Satellite Systems (GNSS) positioning techniques. Nowadays, the network covers all of mainland Portugal. The SERVIR data are provided to many users, such as surveyors, universities (for education and research purposes) and companies that deal with geographic information. By mid-2012, there was a significant change in the network access paradigm, the most important aspect being the increased responsibility of managing the network to guarantee permanent availability and the highest quality of the geospatial data. In addition, the software used to manage the network and to compute the differential corrections was replaced by a new software package. These facts were decisive in performing quality control of the SERVIR network and evaluating positional accuracy. In order to perform such quality control, a significant number of geodetic monuments spread throughout the country were chosen. Some of these monuments are located in the worst positions with respect to the network geometry, in order to evaluate positional accuracy for the worst-case scenarios. Data collection was carried out using different GNSS positioning modes, and the results were compared against benchmark positions determined using data acquired in static mode in 3-hour sessions. We conclude that the geospatial data calculated and provided to the user community by the network are, for surveying purposes, accurate and precise and fit the needs of those users.
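
    The benchmark comparison described above reduces to a horizontal-error summary of the measured positions against the 3-hour static reference coordinates. A minimal sketch, with invented easting/northing values (metres) rather than the study's monument data:

```python
# Horizontal RMSE of measured GNSS positions against benchmark coordinates.
import math

def horizontal_rmse(measured, reference):
    """measured, reference: lists of (easting, northing) in metres."""
    sq = [(e - er) ** 2 + (n - nr) ** 2
          for (e, n), (er, nr) in zip(measured, reference)]
    return math.sqrt(sum(sq) / len(sq))

ref  = [(1000.00, 2000.00), (1500.00, 2500.00)]   # benchmark (static) fixes
meas = [(1000.03, 1999.96), (1500.01, 2500.02)]   # e.g. RTK fixes
print(round(horizontal_rmse(meas, ref), 3))       # centimetre-level error
```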

  9. Geo-Spatial Analysis of Crime in Kaduna Metropolis, Nigeria.

    African Journals Online (AJOL)

    Dr A.B.Ahmed

    2017-02-24

    Feb 24, 2017 ... Geographic Information System (GIS) as a tool can be used by relevant agencies such as ... enforcement, information about the location of a crime incident, suspect, or victim is ..... Development Report in Nigeria. Available at:.

  10. Development of Indoor Air Pollution Concentration Prediction by Geospatial Analysis

    Directory of Open Access Journals (Sweden)

    Adyati Pradini Yudison

    2015-07-01

    Full Text Available People living near busy roads are potentially exposed to traffic-induced air pollutants. The pollutants may intrude into the indoor environment, causing health risks to the occupants. Prediction of pollutant exposure is therefore of great importance for impact assessment and for policy making related to environmentally sustainable transport. This study involved the selection of spatial interpolation methods that can be used to predict indoor air quality from outdoor pollutant mapping, without indoor measurement data. The research was undertaken in the densely populated area of Karees, Bandung, Indonesia. The air pollutant NO2 was monitored in this area as a preliminary study. Nitrogen dioxide concentrations were measured with passive diffusion tubes. Outdoor NO2 concentrations were measured at 94 locations, consisting of 30 roadside and 64 outdoor locations. Residential indoor NO2 concentrations were measured at 64 locations. To obtain a spatially continuous air quality map, the spatial interpolation methods of inverse distance weighting (IDW) and Kriging were applied. The interpolation method was selected based on the smallest root mean square error (RMSE) and standard deviation (SD). The most appropriate interpolation method for outdoor NO2 concentration mapping was Kriging, with an SD value of 5.45 µg/m3 and an RMSE value of 5.45 µg/m3, while for indoor NO2 concentration mapping IDW fitted best, with an RMSE value of 5.92 µg/m3 and an SD value of 5.92 µg/m3.
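
    The IDW interpolation and the RMSE-based method selection can be sketched as below; the four sample points and concentrations are invented stand-ins for the Karees monitoring data, and a leave-one-out RMSE stands in for the study's error criterion:

    ```python
    import numpy as np

    def idw(xy_known, z_known, xy_query, power=2.0):
        """Inverse distance weighting: weights fall off as 1 / distance**power."""
        d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)          # avoid division by zero at sample points
        w = 1.0 / d ** power
        return (w @ z_known) / w.sum(axis=1)

    # Hypothetical NO2 samples (x, y in km; concentration in ug/m3).
    xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    no2 = np.array([40.0, 55.0, 35.0, 50.0])

    # Leave-one-out cross-validation RMSE for comparing interpolation methods.
    preds = np.array([
        idw(np.delete(xy, i, axis=0), np.delete(no2, i), xy[i:i + 1])[0]
        for i in range(len(no2))
    ])
    rmse = np.sqrt(np.mean((preds - no2) ** 2))
    print(f"LOO RMSE: {rmse:.2f} ug/m3")
    ```

    The same cross-validation loop, run with a Kriging predictor in place of `idw`, would reproduce the head-to-head comparison the abstract reports.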

  11. Motivation and Self-Regulated Learning: A Multivariate Multilevel Analysis

    Directory of Open Access Journals (Sweden)

    Wondimu Ahmed

    2017-09-01

    Full Text Available This study investigated the relationship between motivation and self-regulated learning (SRL) in a nationally representative sample of 5245 15-year-old students in the USA. A multivariate multilevel analysis was conducted to examine the role of three motivational variables (self-efficacy, intrinsic value & instrumental value) in predicting three SRL strategies (memorization, elaboration & control). The results showed that, compared to self-efficacy, the intrinsic and instrumental value of math were stronger predictors of memorization, elaboration and control strategies. None of the motivational variables had a stronger effect on one strategy than on another. The findings suggest that the development of self-regulatory skills in math can be greatly enhanced by helping students develop positive value beliefs about, and realistic expectancies for success in, math.

  12. Distributed Multi-interface Catalogue for Geospatial Data

    Science.gov (United States)

    Nativi, S.; Bigagli, L.; Mazzetti, P.; Mattia, U.; Boldrini, E.

    2007-12-01

    Several geosciences communities (e.g. atmospheric science, oceanography, hydrology) have developed tailored data and metadata models and service protocol specifications for enabling online data discovery, inventory, evaluation, access and download. These specifications are conceived either by profiling geospatial information standards or by extending well-accepted geosciences data models and protocols in order to capture more semantics. These artifacts have generated a set of related catalog and inventory services characterizing different communities, initiatives and projects. In fact, these geospatial data catalogs are discovery and access systems that use metadata as the target for queries on geospatial information. The indexed and searchable metadata provide a disciplined vocabulary against which intelligent geospatial search can be performed within or among communities. There is a clear need to conceive and achieve solutions that implement interoperability among geosciences communities, in the context of the more general geospatial information interoperability framework. Such solutions should provide search and access capabilities across catalogs, inventory lists and their registered resources. Thus, the development of catalog clearinghouse solutions is a near-term challenge in support of fully functional and useful infrastructures for spatial data (e.g. INSPIRE, GMES, NSDI, GEOSS). This implies the implementation of components for query distribution and virtual resource aggregation. These solutions must implement distributed discovery functionalities in a heterogeneous environment, requiring harmonization of metadata profiles as well as protocol adaptation and mediation. We present a catalog clearinghouse solution for the interoperability of several well-known cataloguing systems (e.g. OGC CSW, THREDDS catalog and data services). The solution implements consistent resource discovery and evaluation over a dynamic federation of several well-known cataloguing and

  13. Variational Bayesian Learning for Wavelet Independent Component Analysis

    Science.gov (United States)

    Roussos, E.; Roberts, S.; Daubechies, I.

    2005-11-01

    In an exploratory approach to data analysis, it is often useful to consider the observations as generated from a set of latent generators or "sources" via a generally unknown mapping. For the noisy overcomplete case, where we have more sources than observations, the problem becomes extremely ill-posed. Solutions to such inverse problems can, in many cases, be achieved by incorporating prior knowledge about the problem, captured in the form of constraints. This setting is a natural candidate for the application of the Bayesian methodology, allowing us to incorporate "soft" constraints in a natural manner. The work described in this paper is mainly driven by problems in functional magnetic resonance imaging of the brain, for the neuroscientific goal of extracting relevant "maps" from the data. This can be stated as a 'blind' source separation problem. Recent experiments in the field of neuroscience show that these maps are sparse, in some appropriate sense. The separation problem can be solved by independent component analysis (ICA), viewed as a technique for seeking sparse components, assuming appropriate distributions for the sources. We derive a hybrid wavelet-ICA model, transforming the signals into a domain where the modeling assumption of sparsity of the coefficients with respect to a dictionary is natural. We follow a graphical modeling formalism, viewing ICA as a probabilistic generative model. We use hierarchical source and mixing models and apply Bayesian inference to the problem. This allows us to perform model selection in order to infer the complexity of the representation, as well as automatic denoising. Since exact inference and learning in such a model is intractable, we follow a variational Bayesian mean-field approach in the conjugate-exponential family of distributions, for efficient unsupervised learning in multi-dimensional settings. The performance of the proposed algorithm is demonstrated on some representative experiments.
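
    The "ICA as a generative model seeking sparse sources" view can be illustrated in a much-simplified form with synthetic Laplacian sources and a plain fixed-point ICA; the mixing matrix, source distribution, and iteration count below are assumptions for the sketch, and the paper's variational Bayesian wavelet-domain machinery is not reproduced:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Generative model: sparse (Laplacian) latent sources, linear mixing.
    n_sources, n_samples = 2, 5000
    S = rng.laplace(size=(n_sources, n_samples))   # sparse latent sources
    A = np.array([[1.0, 0.6], [0.4, 1.0]])         # assumed mixing matrix
    X = A @ S                                      # observed signals

    # Whiten the observations so the unmixing matrix can be sought as a rotation.
    X = X - X.mean(axis=1, keepdims=True)
    vals, vecs = np.linalg.eigh(np.cov(X))
    Z = (vecs / np.sqrt(vals)) @ vecs.T @ X        # ZCA whitening

    # Symmetric fixed-point updates with a tanh contrast (FastICA-style).
    W = np.linalg.qr(rng.normal(size=(n_sources, n_sources)))[0]
    for _ in range(100):
        G = np.tanh(W @ Z)
        W_new = (G @ Z.T) / n_samples - np.diag((1.0 - G ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W_new)
        W = U @ Vt                                 # re-orthogonalise
    S_hat = W @ Z                                  # recovered sources (up to sign/order)
    print("recovered sources:", S_hat.shape)
    ```

    In the paper's hybrid model the same separation is posed on wavelet coefficients, where sparsity of the sources is a natural prior, and the point estimates here are replaced by variational posterior distributions.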

  14. A Functional Genomic Analysis of NF1-Associated Learning Disabilities

    National Research Council Canada - National Science Library

    Tang, Shao-Jun

    2008-01-01

    Learning disabilities severely deteriorate the life of many NF1 patients. However, the pathogenic process for NF1-associated learning disabilities has not been fully understood and an effective therapy is not available...

  15. A Functional Genomic Analysis of NF1-Associated Learning Disabilities

    National Research Council Canada - National Science Library

    Tang, Shao-Jun

    2007-01-01

    Learning disabilities severely deteriorate the life of many NF1 patients. However, the pathogenic process for NF1-associated learning disabilities has not been fully understood and an effective therapy is not available...

  16. A Functional Genomic Analysis of NF1-Associated Learning Disabilities

    National Research Council Canada - National Science Library

    Tang, Shao-Jun

    2006-01-01

    Learning disabilities severely deteriorate the life of many NF1 patients. However, the pathogenic process for NF1-associated learning disabilities has not been fully understood and an effective therapy is not available...

  17. Effects of feedback in a computer-based learning environment on students’ learning outcomes: a meta-analysis

    NARCIS (Netherlands)

    van der Kleij, Fabienne; Feskens, Remco C.W.; Eggen, Theodorus Johannes Hendrikus Maria

    2015-01-01

    In this meta-analysis, we investigated the effects of methods for providing item-based feedback in a computer-based environment on students’ learning outcomes. From 40 studies, 70 effect sizes were computed, which ranged from −0.78 to 2.29. A mixed model was used for the data analysis. The results
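
    The pooling step of such a meta-analysis can be sketched with a standard random-effects (DerSimonian-Laird) estimator; the effect sizes and variances below are invented, and the paper's 70 effect sizes and mixed-model specification are not reproduced:

    ```python
    import numpy as np

    d = np.array([0.12, 0.45, -0.30, 0.80, 0.25])   # hypothetical effect sizes
    v = np.array([0.04, 0.02, 0.05, 0.03, 0.02])    # within-study variances

    w = 1.0 / v                                      # fixed-effect weights
    d_fe = np.sum(w * d) / np.sum(w)                 # fixed-effect pooled estimate
    Q = np.sum(w * (d - d_fe) ** 2)                  # Cochran's Q heterogeneity
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(d) - 1)) / c)          # between-study variance

    w_re = 1.0 / (v + tau2)                          # random-effects weights
    d_re = np.sum(w_re * d) / np.sum(w_re)           # random-effects pooled estimate
    se = np.sqrt(1.0 / np.sum(w_re))
    print(f"pooled d = {d_re:.3f} "
          f"(95% CI {d_re - 1.96 * se:.3f} to {d_re + 1.96 * se:.3f})")
    ```

    A mixed model, as used in the paper, additionally lets moderators (e.g. feedback type) explain part of the between-study variance that tau-squared absorbs here.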

  18. Blended learning in paediatric emergency medicine: preliminary analysis of a virtual learning environment.

    Science.gov (United States)

    Spedding, Ruth; Jenner, Rachel; Potier, Katherine; Mackway-Jones, Kevin; Carley, Simon

    2013-04-01

    Paediatric emergency medicine (PEM) currently faces many competing educational challenges. Recent changes to working patterns have made the delivery of effective teaching to trainees extremely difficult. We developed a virtual learning environment, based on socioconstructivist principles, which allows learning to take place regardless of time or location. The aim was to evaluate the effectiveness of a blended e-learning approach for PEM training. We evaluated the experiences of ST3 trainees in PEM using a multimodal approach. We classified and analysed message board discussions over a 6-month period to look for evidence of practice change and learning. We conducted semistructured qualitative interviews with trainees approximately 5 months after they completed the course. Trainees embraced the virtual learning environment and had positive experiences of the blended approach to learning. Socioconstructivist learning did take place through the use of message boards on the virtual learning environment. Despite their initial unfamiliarity with the online learning system, the participants found it easy to access and use. The participants found the learning relevant, and there was an overlap between shop-floor learning and the online content. Clinical discussion on the forums was often led by trainees and was described as enjoyable and informative. A blended approach to e-learning in basic PEM is effective and enjoyable for trainees.

  19. Analysis of applications suitable for mobile learning of preschool children

    OpenAIRE

    Stoimenovski, Aleksandar; Kraleva, Radoslava; Kralev, Velin

    2016-01-01

    This article considers the use of mobile learning by young children in Bulgarian education. The most widely used mobile operating systems are analysed, and some of the most widely used existing applications suitable for mobile learning by preschool children are presented and classified. Keywords: mobile applications for preschool children, mobile learning.

  20. Learning Conditions, Members' Motivation and Satisfaction: A Multilevel Analysis

    Science.gov (United States)

    Dimas, Isabel Dórdio; Rebelo, Teresa; Lourenço, Paulo Renato

    2015-01-01

    Purpose: The purpose of this paper was to contribute to the clarification of the conditions under which teams can be successful, especially those related to team learning. To attain this goal, in the present study, the mediating role played by team members' motivation on the relationship between team learning conditions (shared learning beliefs…