WorldWideScience

Sample records for web analyzing spatial

  1. Geometry and Morphology of the Cosmic Web: Analyzing Spatial Patterns in the Universe

    NARCIS (Netherlands)

    van de Weygaert, Rien; Jones, Bernard J. T.; Platen, Erwin; Aragon-Calvo, Miguel A.; Anton, F

    2009-01-01

    We review the analysis of the Cosmic Web by means of an extensive toolset based on the use of Delaunay and Voronoi tessellations. The Cosmic Web is the salient and pervasive foamlike pattern in which matter has organized itself on scales of a few up to more than a hundred Megaparsec. The weblike spa

  2. Geometry and Morphology of the Cosmic Web: Analyzing Spatial Patterns in the Universe

    CERN Document Server

    van de Weygaert, Rien; Jones, Bernard J T; Platen, Erwin

    2009-01-01

    We review the analysis of the Cosmic Web by means of an extensive toolset based on the use of Delaunay and Voronoi tessellations. The Cosmic Web is the salient and pervasive foamlike pattern in which matter has organized itself on scales of a few up to more than a hundred Megaparsec. First, we describe the Delaunay Tessellation Field Estimator (DTFE). The DTFE formalism is shown to recover the hierarchical nature and the anisotropic morphology of the cosmic matter distribution. The Multiscale Morphology Filter (MMF) uses the DTFE density field to extract the diverse morphological elements - filaments, sheets and clusters - on the basis of a ScaleSpace analysis which searches for these morphologies over a range of scales. Subsequently, we discuss the Watershed Voidfinder (WVF), which invokes the discrete watershed transform to identify voids in the cosmic matter distribution. The WVF is able to determine the location, size and shape of the voids. The watershed transform is also a key element in the SpineWeb an...
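The DTFE step described above can be illustrated in miniature. Below is a minimal 2-D sketch of the Delaunay tessellation density estimate, written for this summary rather than taken from the authors' toolset: the density at each sample point is taken inversely proportional to the area of its contiguous Voronoi cell, i.e. the union of Delaunay triangles sharing that vertex.

```python
import numpy as np
from scipy.spatial import Delaunay

def dtfe_density_2d(points):
    """DTFE-like density estimate at each sample point (2-D sketch)."""
    tri = Delaunay(points)
    # Area of each Delaunay triangle via the shoelace formula.
    p = points[tri.simplices]                      # shape (n_tri, 3, 2)
    areas = 0.5 * np.abs(
        (p[:, 1, 0] - p[:, 0, 0]) * (p[:, 2, 1] - p[:, 0, 1])
        - (p[:, 2, 0] - p[:, 0, 0]) * (p[:, 1, 1] - p[:, 0, 1])
    )
    # Contiguous Voronoi cell area: sum of areas of triangles at each vertex.
    cell_area = np.zeros(len(points))
    for simplex, area in zip(tri.simplices, areas):
        cell_area[simplex] += area
    # In D dimensions DTFE uses rho_i = (D + 1) / cell volume; here D = 2.
    with np.errstate(divide="ignore"):
        return np.where(cell_area > 0, 3.0 / cell_area, np.nan)

rng = np.random.default_rng(0)
pts = rng.random((200, 2))
rho = dtfe_density_2d(pts)          # higher where points cluster
```

In a clustered point set the small triangles around dense knots yield small contiguous cells and hence high density, which is what lets the DTFE recover anisotropic, multiscale structure.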

  3. Analyzing Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M.-Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2007-01-01

    Web services should be dependable, because businesses rely on them. For that purpose the Service Oriented Architecture has standardized specifications at a syntactical level. In this paper, we demonstrate how such specifications are used to derive semantic models in the form of (timed) automata...

  4. Spatial data management for WebGIS

    Institute of Scientific and Technical Information of China (English)

    钟志农; 景宁; 陈荦; 吴秋云

    2004-01-01

    Spatial data management is an important factor that affects the performance of WebGIS. In this paper, spatial data management for WebGIS is discussed from two aspects: spatial data model and data storage. Current data models and storage methods are analyzed and compared. According to the requirements of spatial data management for WebGIS, a spatial data model is proposed, which organizes spatial data in four levels: geometry, feature, layer and map. Based on this model, spatial data are managed and stored in an object-relational database management system (ORDBMS).
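The four-level organization named in the abstract (geometry, feature, layer, map) can be sketched as a containment hierarchy; the class and field names below are illustrative, not taken from the paper:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Geometry:
    kind: str                                 # e.g. "point", "polyline", "polygon"
    coords: List[Tuple[float, float]]

@dataclass
class Feature:
    fid: int
    geometry: Geometry
    attributes: dict = field(default_factory=dict)

@dataclass
class Layer:
    name: str
    features: List[Feature] = field(default_factory=list)

@dataclass
class Map:
    name: str
    layers: List[Layer] = field(default_factory=list)

    def feature_count(self) -> int:
        # A map aggregates layers, which aggregate features, which own geometry.
        return sum(len(layer.features) for layer in self.layers)

road = Feature(1, Geometry("polyline", [(0.0, 0.0), (1.0, 1.0)]), {"name": "Main St"})
m = Map("city", [Layer("roads", [road])])
```

In an ORDBMS-backed design each level typically maps to a table, with the geometry level stored in a spatial column type.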

  5. Elements of a Spatial Web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2010-01-01

    and are relevant to a text argument. An important element in enabling such queries is to be able to rank spatial web objects. Another is to be able to determine the relevance of an object to a query. Yet another is to enable the efficient processing of such queries. The talk covers recent results on spatial web...

  6. Identifying and Analyzing Web Server Attacks

    Energy Technology Data Exchange (ETDEWEB)

    Seifert, Christian; Endicott-Popovsky, Barbara E.; Frincke, Deborah A.; Komisarczuk, Peter; Muschevici, Radu; Welch, Ian D.

    2008-08-29

    Abstract: Client honeypots can be used to identify malicious web servers that attack web browsers and push malware to client machines. Merely recording network traffic is insufficient to perform comprehensive forensic analyses of such attacks. Custom tools are required to access and analyze network protocol data. Moreover, specialized methods are required to perform a behavioral analysis of an attack, which helps determine exactly what transpired on the attacked system. This paper proposes a record/replay mechanism that enables forensic investigators to extract application data from recorded network streams and allows applications to interact with this data in order to conduct behavioral analyses. Implementations for the HTTP and DNS protocols are presented and their utility in network forensic investigations is demonstrated.
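The extraction step the paper motivates, pulling application data out of a recorded network stream, can be sketched for HTTP as follows (a minimal parser written for this summary, not the authors' tool):

```python
def extract_http_body(raw: bytes):
    """Split a recorded HTTP response into status line, headers and payload."""
    head, _, body = raw.partition(b"\r\n\r\n")
    status_line, *header_lines = head.split(b"\r\n")
    headers = {}
    for line in header_lines:
        k, _, v = line.partition(b": ")
        headers[k.decode().lower()] = v.decode()
    # Trim trailing capture noise using the declared payload length, if present.
    if "content-length" in headers:
        body = body[: int(headers["content-length"])]
    return status_line.decode(), headers, body

raw = (b"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n"
       b"Content-Length: 13\r\n\r\n<html></html>extra")
status, hdrs, body = extract_http_body(raw)
```

A replay component would then serve `body` back to an instrumented client so the attack's behavior on the victim application can be observed.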

  7. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same

  8. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use
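The wrapping pattern described here, an existing science routine exposed through a Python web framework, can be sketched with Flask, which the abstract names. The endpoint path, parameter names, and the toy `seasonal_mean` routine are illustrative assumptions, not CMDA's actual API:

```python
from flask import Flask, jsonify, request

def seasonal_mean(values, season_length=3):
    """Stand-in for an existing science analysis routine."""
    return [sum(values[i:i + season_length]) / season_length
            for i in range(0, len(values) - season_length + 1, season_length)]

app = Flask(__name__)

@app.route("/seasonal_mean")
def seasonal_mean_service():
    # The wrapper's job: parse HTTP parameters, call the science code,
    # and serialize the result as JSON.
    values = [float(v) for v in request.args.get("values", "").split(",")]
    return jsonify(means=seasonal_mean(values))

if __name__ == "__main__":
    app.run(port=8080)
```

A request such as `GET /seasonal_mean?values=1,2,3,4,5,6` returns the per-season means as JSON, which is the kind of interface a browser-based UI like CMDA's can call directly.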

  9. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map.
We have implemented the

  10. Data management on the spatial web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2012-01-01

    billion web queries are issued that have local intent and target spatial web objects. These are points of interest with a web presence, and they thus have locations as well as textual descriptions. This development has given prominence to spatial web data management, an area ripe with new and exciting opportunities and challenges. The research community has embarked on inventing and supporting new query functionality for the spatial web. Different kinds of spatial web queries return objects that are near a location argument and are relevant to a text argument. To support such queries, it is important … functionality enabled by the setting. Further, the talk offers insight into the data management techniques capable of supporting such functionality.
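The kind of spatial-keyword query described here can be illustrated with a toy scoring function that mixes spatial proximity and text relevance; the linear combination and the `alpha` weight are illustrative assumptions, not results from the talk:

```python
import math

def text_relevance(query_terms, description):
    """Fraction of description words that match the query terms."""
    words = description.lower().split()
    return sum(words.count(t) for t in query_terms) / max(len(words), 1)

def rank(query_terms, query_loc, objects, alpha=0.5, max_dist=10.0):
    """Rank spatial web objects by a weighted mix of proximity and relevance."""
    scored = []
    for name, loc, desc in objects:
        dist = math.dist(query_loc, loc)
        proximity = max(0.0, 1.0 - dist / max_dist)   # 1 at the query point, 0 far away
        score = alpha * proximity + (1 - alpha) * text_relevance(query_terms, desc)
        scored.append((score, name))
    return [name for score, name in sorted(scored, reverse=True)]

pois = [
    ("cafe_a", (1.0, 1.0), "coffee and cake"),
    ("cafe_b", (9.0, 9.0), "best coffee in town"),
]
order = rank(["coffee"], (0.0, 0.0), pois)
```

Efficient processing of such queries at web scale is exactly what motivates the hybrid spatial-textual index structures studied in this area.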

  11. Data management on the spatial web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2012-01-01

    Due in part to the increasing mobile use of the web and the proliferation of geo-positioning, the web is fast acquiring a significant spatial aspect. Content and users are being augmented with locations that are used increasingly by location-based services. Studies suggest that each week, several billion web queries are issued that have local intent and target spatial web objects. These are points of interest with a web presence, and they thus have locations as well as textual descriptions. This development has given prominence to spatial web data management, an area ripe with new and exciting opportunities and challenges. … functionality enabled by the setting. Further, the talk offers insight into the data management techniques capable of supporting such functionality.

  12. Analyzing the Change-Proneness of APIs and web APIs

    NARCIS (Netherlands)

    Romano, D.

    2015-01-01

    APIs and web APIs are used to expose existing business logic and, hence, to ease the reuse of functionalities across multiple software systems. Software systems can use the business logic of legacy systems by binding their APIs and web APIs. With t

  13. APPROACHES TO ANALYZE THE QUALITY OF ROMANIAN TOURISM WEB SITES

    Directory of Open Access Journals (Sweden)

    Lacurezeanu Ramona

    2013-07-01

    The purpose of our work is to analyze travel web sites, more precisely, whether the criteria used to analyze virtual stores are also adequate for the Romanian tourism product. Following the study, we concluded that Romanian online tourism web sites for the Romanian market have the features that we found listed on similar web sites in France, England, Germany, etc. In conclusion, online Romanian tourism can be considered one of the factors of economic growth.

  14. User Behavior Analysis from Web Log using Log Analyzer Tool

    Directory of Open Access Journals (Sweden)

    Brijesh Bakariya

    2013-11-01

    Full Text Available Nowadays, the internet plays the role of a huge database in which many websites, information sources and search engines are available. But due to the unstructured and semi-structured data in web pages, it has become a challenging task to extract relevant information. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because the discovered patterns contain a great deal of noise and uncertainty. In this paper, an analysis of web usage mining is carried out on web log data using the web log analyzer tool "Deep Log Analyzer" to extract abstract information from a particular server, to characterize user behavior, and to develop an ontology describing the relations among the elements of web usage mining.

  15. A WebGIS-based system for analyzing and visualizing air quality data for Shanghai Municipality

    Science.gov (United States)

    Wang, Manyi; Liu, Chaoshun; Gao, Wei

    2014-10-01

    An online visual analytical system based on Java Web and WebGIS for air quality data for Shanghai Municipality was designed and implemented to quantitatively analyze and qualitatively visualize air quality data. By analyzing the architecture of WebGIS and Java Web, we first designed the overall scheme for the system architecture, then put forward the software and hardware environment, and determined the main function modules for the system. The visual system was ultimately established with the DIV + CSS layout method combined with JSP, JavaScript, and other computer programming languages based on the Java programming environment. Moreover, the Struts, Spring, and Hibernate frameworks (SSH) were integrated into the system for the purpose of easy maintenance and expansion. To provide mapping services and spatial analysis functions, we selected ArcGIS for Server as the GIS server. We also used an Oracle database and an ESRI file geodatabase to store spatial and non-spatial data in order to ensure data security. In addition, the response data from the web server are resampled to implement rapid visualization through the browser. The experimental results indicate that this system can quickly respond to users' requests and efficiently return accurate processing results.

  16. User Behavior Analysis from Web Log using Log Analyzer Tool

    OpenAIRE

    Brijesh Bakariya; Ghanshyam Singh Thakur

    2013-01-01

    Nowadays, the internet plays the role of a huge database in which many websites, information sources and search engines are available. But due to the unstructured and semi-structured data in web pages, it has become a challenging task to extract relevant information. The main reason is that traditional knowledge-based techniques cannot utilize this knowledge efficiently, because the discovered patterns contain a great deal of noise and uncertainty. In this paper, an analysis of web usage minin...

  17. TOWARD SEMANTIC WEB INFRASTRUCTURE FOR SPATIAL FEATURES' INFORMATION

    Directory of Open Access Journals (Sweden)

    R. Arabsheibani

    2015-12-01

    Full Text Available The Web and its capabilities can be employed as a tool for data and information integration if comprehensive datasets and appropriate technologies and standards enable the web to interpret and easily align data and information. The Semantic Web, along with spatial functionalities, enables the web to deal with the huge amount of data and information. The present study investigates the advantages and limitations of the Spatial Semantic Web and compares its capabilities with relational models in order to build a spatial data infrastructure. An architecture is proposed and a set of criteria is defined for the efficiency evaluation. The results demonstrate that when using data with special characteristics such as schema dynamicity, sparse data or available relations between the features, the spatial semantic web and graph databases with spatial operations are preferable.

  18. Modeling and Analyze the Deep Web: Surfacing Hidden Value

    OpenAIRE

    Suneet Kumar; Anuj Kumar Yadav; Rakesh Bharati; Rani Choudhary

    2011-01-01

    Focused web crawlers have recently emerged as an alternative to the well-established web search engines. While the well-known focused crawlers retrieve relevant web-pages, there are various applications which target whole websites instead of single web-pages. For example, companies are represented by websites, not by individual web-pages. To answer queries targeted at Websites, web directories are an established solution. In this paper, we introduce a novel focused website crawler to employ t...

  19. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    Science.gov (United States)

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…

  20. Spatial Data Web Services Pricing Model Infrastructure

    Science.gov (United States)

    Ozmus, L.; Erkek, B.; Colak, S.; Cankurt, I.; Bakıcı, S.

    2013-08-01

    The General Directorate of Land Registry and Cadastre (TKGM), the leading agency in the field of cartography, largely continues its missions, which are: to keep and update the land registry and cadastre system of the country under the responsibility of the treasury, to perform transactions related to real estate, and to establish the Turkish national spatial information system. TKGM, a public agency, has completed many projects, such as: Continuously Operating GPS Reference Stations (TUSAGA-Aktif), Geo-Metadata Portal (HBB), Orthophoto-Base Map Production and web services, Completion of Initial Cadastre, Cadastral Renovation Project (TKMP), Land Registry and Cadastre Information System (TAKBIS), Turkish National Spatial Data Infrastructure Project (TNSDI), and Ottoman Land Registry Archive Information System (TARBIS). TKGM provides updated maps and map information not only to public institutions but also to the wider society, in the name of social responsibility principles. Turkish National Spatial Data Infrastructure activities were started by the motivation of Circular No. 2003/48, declared by the Turkish Prime Ministry in 2003 within the context of the e-Transformation of Turkey Short-term Action Plan. Action No. 47 in that action plan states that "A feasibility study shall be made in order to establish the Turkish National Spatial Data Infrastructure", a responsibility given to the General Directorate of Land Registry and Cadastre. The feasibility report of the NSDI was completed on 10 December 2010. After the decision of the Steering Committee, the feasibility report was sent to the Development Bank (formerly the State Planning Organization) for further evaluation. There are two main arrangements related to this project (feasibility report). First, there is now only one ministry, the Ministry of Environment and Urbanism, responsible for the establishment, operation and all national-level activities of the NSDI. The second arrangement concerns the institutional level.
The

  1. Spatial Statistics for Dyadic Data: Analyzing the Relationship Landscape.

    Science.gov (United States)

    Wood, Nathan D; Okhotnikov, Ilya A

    2017-01-01

    Spatial statistics has a rich tradition in the earth, economic, and epidemiological sciences and has the potential to affect the study of couples as well. When applied to couple data, spatial statistics can model within- and between-couple differences with results that are readily accessible to researchers and clinicians. This article offers a primer on using spatial statistics as a methodological tool for analyzing dyadic data. The article introduces spatial approaches and reviews the data structures required for spatial analysis, the available software, and examples of data output.
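The article itself is a primer rather than a code tutorial; as one concrete example of the kind of statistic involved, the sketch below computes Moran's I, a standard measure of spatial autocorrelation (our choice of illustration, not necessarily the authors'):

```python
import numpy as np

def morans_i(values, weights):
    """Moran's I: I = (n / W) * sum_ij w_ij z_i z_j / sum_i z_i^2,
    where z_i are mean-centered values and W is the sum of all weights."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = len(x)
    z = x - x.mean()
    num = (w * np.outer(z, z)).sum()
    return (n / w.sum()) * num / (z ** 2).sum()

# Four locations on a line; neighbours share an edge (binary contiguity weights).
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
smooth = morans_i([1, 2, 3, 4], w)     # similar neighbours -> positive I
```

Positive values indicate that nearby units resemble each other, which in a dyadic setting corresponds to the within-couple similarity the article discusses.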

  2. A WEB DATA ANALYZER COMPONENT FOR IE. CASE STUDY: FAVORITE WEB ACCESSES IN THE IBS LABORATORY, INFORMATICS ENGINEERING - ITS

    Directory of Open Access Journals (Sweden)

    Darlis Heru Murti

    2005-07-01

    …Explorer. To that end, this research designs and implements a Web Data Analyzer software component that attaches to the Internet Explorer browser to discover a user's favorite web accesses. Testing and evaluation were carried out by installing the Web Data Analyzer component on a number of workstations in the IBS Laboratory of Informatics Engineering, ITS. The results show that the Web Data Analyzer component is able to monitor and analyze users' browsing-activity data and to automate the Favorites feature of Internet Explorer based on the browsing-activity data stored in the database server. Keywords: band object, explorer bar, browser helper object (BHO), HTTP analyzer.

  3. HMR Log Analyzer: Analyze Web Application Logs Over Hadoop MapReduce

    Directory of Open Access Journals (Sweden)

    Sayalee Narkhede

    2013-07-01

    Full Text Available In today's Internet world, log file analysis is becoming a necessary task for analyzing customers' behavior in order to improve advertising and sales; for datasets in domains like the environment, medicine and banking, it is likewise important to analyze the log data to extract the required knowledge from it. Web mining is the process of discovering knowledge from web data. Log files are generated very fast, at a rate of 1-10 MB/s per machine; a single data center can generate tens of terabytes of log data in a day. These datasets are huge. In order to analyze such large datasets we need a parallel processing system and a reliable data storage mechanism. A virtual database system is an effective solution for integrating the data, but it becomes inefficient for large datasets. The Hadoop framework provides reliable data storage through the Hadoop Distributed File System and the MapReduce programming model, a parallel processing system for large datasets. The Hadoop Distributed File System breaks up input data and sends fractions of the original data to several machines in the Hadoop cluster to hold blocks of data. This mechanism helps to process log data in parallel using all the machines in the Hadoop cluster and computes the result efficiently. The dominant approach provided by Hadoop, "store first, query later", loads the data into the Hadoop Distributed File System and then executes queries written in Pig Latin. This approach reduces the response time as well as the load on the end system. This paper proposes a log analysis system using Hadoop MapReduce which provides accurate results in minimum response time.
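The map/shuffle/reduce pattern the paper applies to web logs can be sketched in pure Python (no Hadoop cluster required); the common-log-style record format below is an assumption for the illustration:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Map: emit (url, 1) for each log record."""
    parts = line.split()
    if len(parts) > 5:
        yield parts[5], 1          # the requested-URL field in this toy format

def reduce_phase(pairs):
    """Shuffle + reduce: group pairs by key and sum the counts."""
    counts = defaultdict(int)
    for url, n in pairs:
        counts[url] += n
    return dict(counts)

log_lines = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36] "GET /index.html HTTP/1.1" 200 2326',
    '1.2.3.5 - - [10/Oct/2023:13:55:40] "GET /about.html HTTP/1.1" 200 100',
    '1.2.3.4 - - [10/Oct/2023:13:56:02] "GET /index.html HTTP/1.1" 200 2326',
]
hits = reduce_phase(chain.from_iterable(map_phase(l) for l in log_lines))
```

In Hadoop the same two functions run as distributed Mapper and Reducer tasks, with HDFS supplying each mapper its block of log lines.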

  4. Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)

    Science.gov (United States)

    Bhattacharya, D.; M., M.

    2016-06-01

    Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is research into harnessing the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement of Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between the Sensor Web and the SDI, and carry out case studies such as hazard applications and urban applications. We take up cooperative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS, Metadata and the 'Sensor Observation Service' (SOS); the 'Sensor Planning Service' (SPS); the 'Sensor Alert Service' (SAS); and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS). Hence, in conclusion, it is important for geospatial studies to integrate the SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
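One concrete bridge point between an SDI and the Sensor Web is an OGC SOS request; the sketch below builds a KVP GetObservation URL (the endpoint and the offering/property names are placeholders, not from the paper):

```python
from urllib.parse import urlencode

def sos_get_observation(endpoint, offering, observed_property, version="2.0.0"):
    """Build a key-value-pair GetObservation request for an OGC SOS endpoint."""
    params = {
        "service": "SOS",
        "version": version,
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return endpoint + "?" + urlencode(params)

url = sos_get_observation("https://example.org/sos", "air_quality", "temperature")
```

An SDI client that already speaks WMS/WFS KVP requests can issue such SOS requests with the same machinery, which is the interface-level correspondence the abstract argues for.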

  5. DESIGN FOR CONNECTING SPATIAL DATA INFRASTRUCTURES WITH SENSOR WEB (SENSDI

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2016-06-01

    Full Text Available Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is research into harnessing the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement of Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between the Sensor Web and the SDI, and carry out case studies such as hazard applications and urban applications. We take up cooperative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS, Metadata and the 'Sensor Observation Service' (SOS); the 'Sensor Planning Service' (SPS); the 'Sensor Alert Service' (SAS); and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS). Hence, in conclusion, it is important for geospatial studies to integrate the SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.

  6. Analyzing Web pages visual scanpaths: between and within tasks variability.

    Science.gov (United States)

    Drusch, Gautier; Bastien, J M Christian

    2012-01-01

    In this paper, we propose a new method for comparing scanpaths in a bottom-up approach, and a test of the scanpath theory. To do so, we conducted a laboratory experiment in which 113 participants were invited to accomplish a set of tasks on two different websites. For each site, they had to perform two tasks, each of which had to be repeated once. The data were analyzed using a procedure similar to the one used by Duchowski et al. [8]. The first step was to automatically identify, then label, AOIs with the mean-shift clustering procedure [19]. Then, scanpaths were compared two by two with a modified version of the string-edit method, which takes into account the order in which AOIs are viewed [2]. Our results show that the variability of scanpaths between tasks but within participants seems to be lower than the variability within a task for a given participant. In other words, participants seem to be more consistent when they perform different tasks than when they repeat the same tasks. In addition, participants view more of the same AOIs when they perform a different task on the same web page than when they repeat the same task. These results are quite different from what the scanpath theory predicts.
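The string-edit comparison can be illustrated directly: label each fixated AOI with a letter and compute the edit (Levenshtein) distance between two viewing sequences. The similarity normalization below is a common convention, not necessarily the modified version used in the paper:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution (0 if equal)
        prev = cur
    return prev[-1]

def scanpath_similarity(a, b):
    """Normalize the edit distance to a similarity in [0, 1]."""
    return 1 - edit_distance(a, b) / max(len(a), len(b), 1)

# Two scanpaths over AOIs A-D: one substituted fixation out of four.
s = scanpath_similarity("ABCD", "ABDD")
```

Comparing such similarities within and between tasks is exactly the aggregation the study performs across its 113 participants.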

  7. Importance of the spatial data and the sensor web in the ubiquitous computing area

    Science.gov (United States)

    Akçit, Nuhcan; Tomur, Emrah; Karslıoǧlu, Mahmut O.

    2014-08-01

    Spatial data has become a critical issue in recent years. In past years, nearly three quarters of databases were related directly or indirectly to locations referring to physical features, which constitute the relevant aspects. Spatial data is necessary to identify or calculate the relationships between spatial objects when using spatial operators in programs or portals. Originally, calculations were conducted using Geographic Information System (GIS) programs on local computers. Subsequently, through the Internet, they formed a geospatial web, which is integrated into a discoverable collection of geographically related web standards and key features, and constitutes a global network of geospatial data that employs the World Wide Web to process textual data. In addition, the geospatial web is used to bring together spatial data producers, resources, and users. Standards also constitute a critical dimension in further globalizing the idea of the geospatial web. The sensor web is an example of a real-time service that the geospatial web can provide. Sensors around the world collect numerous types of data. The sensor web is a type of sensor network that is used for visualizing, calculating, and analyzing collected sensor data. Today, people use smart devices and systems more frequently because of the evolution of technology, and many have more than one mobile device. The considerable number of sensors and the different types of data positioned around the world have driven the production of interoperable and platform-independent sensor web portals. The focus of such production has been on further developing the idea of an interoperable and interdependent sensor web of all devices that share and collect information. The other pivotal idea consists of encouraging people to use and send data voluntarily for numerous purposes with some level of credibility. The principal goal is to connect mobile and non-mobile devices in the sensor web platform together to

  8. Spatial and social connectedness in web-based work collaboration

    NARCIS (Netherlands)

    Handberg, L.; Gullström, C.; Kort, J.; Nyström, J.

    2016-01-01

    The work presented here seeks an integration of spatial and social features supporting shared activities, and engages users in multiple locations to manipulate real-time video streams. Standard and easily available equipment is used together with the communication standard WebRTC. It adds a spatial

  9. Food-web structure of seagrass communities across different spatial scales and human impacts.

    Science.gov (United States)

    Coll, Marta; Schmidt, Allison; Romanuk, Tamara; Lotze, Heike K

    2011-01-01

    Seagrass beds provide important habitat for a wide range of marine species but are threatened by multiple human impacts in coastal waters. Although seagrass communities have been well-studied in the field, a quantification of their food-web structure and functioning, and how these change across space and human impacts, has been lacking. Motivated by extensive field surveys and literature information, we analyzed the structural features of food webs associated with Zostera marina across 16 study sites in 3 provinces in Atlantic Canada. Our goals were to (i) quantify differences in food-web structure across local and regional scales and human impacts, (ii) assess the robustness of seagrass webs to simulated species loss, and (iii) compare food-web structure in temperate Atlantic seagrass beds with those of other aquatic ecosystems. We constructed individual food webs for each study site and cumulative webs for each province and the entire region based on presence/absence of species, and calculated 16 structural properties for each web. Our results indicate that food-web structure was similar among low impact sites across regions. With increasing human impacts associated with eutrophication, however, food-web structure showed evidence of degradation, as indicated by fewer trophic groups, lower maximum trophic level of the highest top predator, fewer trophic links connecting top to basal species, higher fractions of herbivores and intermediate consumers, and a higher number of prey per species. These structural changes translate into functional changes, with impacted sites being less robust to simulated species loss. Temperate Atlantic seagrass webs are similar to a tropical seagrass web, yet differ from other aquatic webs, suggesting consistent food-web characteristics across seagrass ecosystems in different regions. 
Our study illustrates that food-web structure and functioning of seagrass habitats change with human impacts and that the spatial scale of food-web analysis
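
Structural properties like the ones this study computes (numbers of species and links, connectance, fraction of basal taxa) can be derived from a simple presence/absence web. Below is a minimal sketch using an invented toy seagrass web; the species names and links are illustrative assumptions, not the paper's data.

```python
def food_web_metrics(web):
    """web: dict mapping each consumer to the set of its prey."""
    species = set(web) | {p for prey in web.values() for p in prey}
    s = len(species)
    links = sum(len(prey) for prey in web.values())
    basal = [sp for sp in species if not web.get(sp)]  # no prey -> basal
    return {
        "species": s,
        "links": links,
        "connectance": links / (s * s),        # L / S^2
        "links_per_species": links / s,
        "fraction_basal": len(basal) / s,
    }

# Invented toy web (predator -> prey sets)
toy_web = {
    "fish": {"amphipod", "snail"},
    "amphipod": {"epiphytes"},
    "snail": {"epiphytes", "eelgrass"},
}
print(food_web_metrics(toy_web))
```

Impact comparisons of the kind the study reports then reduce to comparing such metric dictionaries between low- and high-impact sites.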

  10. Formalization and web-based implementation of spatial data fusion

    Science.gov (United States)

    Wiemann, Stefan

    2017-02-01

    Spatial data fusion plays an important role in spatial information retrieval from disconnected data sources and is thus a precondition for comprehensive and consistent decision making. On the Web in particular, it can help to combine spatial data from the variety of existing but distributed sources, e.g. as provided by Spatial Data Infrastructures (SDIs). However, standardized spatial data processing on the Web still lacks broad acceptance beyond the scientific domain. This article describes a formalization and service-based implementation of the spatial data fusion process. The formalization builds on a set-theoretic description of the considered domain and derives a number of possible fusion objectives. Geoprocessing patterns are used to describe commonly used sub-routines of the fusion process and thereby support workflow composition. The implementation is based on open standards and comprises a Web client, several geoprocessing services and a fusion engine to support the Web-based compilation and execution of spatial data fusion workflows in an ad hoc manner.
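
One common fusion sub-routine of the kind such geoprocessing patterns capture is matching features from two sources by spatial proximity and merging their attributes. The sketch below is illustrative only; the feature layout, attribute names, and distance threshold are assumptions, not the article's engine.

```python
import math

def fuse(features_a, features_b, max_dist=0.001):
    """Each feature: (x, y, attrs). Greedily pair each feature in A with the
    nearest unused feature in B within max_dist and merge their attributes."""
    fused, used = [], set()
    for ax, ay, attrs_a in features_a:
        best, best_d = None, max_dist
        for i, (bx, by, attrs_b) in enumerate(features_b):
            d = math.hypot(ax - bx, ay - by)
            if i not in used and d <= best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            fused.append((ax, ay, {**attrs_a, **features_b[best][2]}))
        else:
            fused.append((ax, ay, dict(attrs_a)))   # no counterpart in B
    return fused

# Invented example: two sources describing overlapping point features
a = [(13.73, 51.05, {"name": "station"}), (13.80, 51.10, {"name": "park"})]
b = [(13.7301, 51.0501, {"lines": 4})]
print(fuse(a, b))
```

A service-based implementation would wrap a routine like this behind a geoprocessing interface; the greedy matching here is the simplest possible strategy.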

  11. Geo-communication and Web-based Spatial Data Infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2006-01-01

    The purpose of geo-communication is to bridge the gap between reality and data sources on one side and decisions on the other side. This is achieved through several types of activities, where web-services and spatial data infrastructure play an important role. The introduction of web-services as index-portals based on geo-information has changed the conditions for both content and form of geo-communication. A high number of players and interactions, as well as a very high number of all kinds of information and combinations of these, characterize geo-communication carried out through web... Therefore there is a strong need for theories and models that can describe this complex web in the SDI and geo-communication, consisting of active components, passive components, users and information, in order to make it possible to handle the complexity and to give the necessary framework...

  12. Web-based visualization of spatial objects in 3DGIS

    Institute of Scientific and Technical Information of China (English)

    ZHANG LiQiang; GUO ZhiFeng; KANG ZhiZhong; ZHANG LiXin; ZHANG XingMing; YANG Ling

    2009-01-01

    Adaptively rendering large and complex spatial data has become an important research issue in 3DGIS applications. In order to transmit the data to the client efficiently, this paper proposes a node-layer data model to manage the 3D scene. Because large spatial data and limited network bandwidth are the main bottlenecks of web-based 3DGIS, a client/server architecture including progressive transmission methods and multiresolution representations, together with a spatial index, is developed to improve performance. All this makes the application quite scalable. Experimental results reveal that the application works appropriately.

  13. Collab-Analyzer: An Environment for Conducting Web-Based Collaborative Learning Activities and Analyzing Students' Information-Searching Behaviors

    Science.gov (United States)

    Wu, Chih-Hsiang; Hwang, Gwo-Jen; Kuo, Fan-Ray

    2014-01-01

    Researchers have found that students might get lost or feel frustrated while searching for information on the Internet to deal with complex problems without real-time guidance or supports. To address this issue, a web-based collaborative learning system, Collab-Analyzer, is proposed in this paper. It is not only equipped with a collaborative…

  14. An Application of Session Based Clustering to Analyze Web Pages of User Interest from Web Log Files

    Directory of Open Access Journals (Sweden)

    C. P. Sumathi

    2010-01-01

    Full Text Available Problem statement: With the continued growth and proliferation of e-commerce, Web services and Web-based information systems, the volumes of click-stream and user data collected by Web-based organizations in their daily operations have reached astronomical proportions. Analyzing such data can help these organizations optimize the functionality of web-based applications and provide more personalized content to visitors. This type of analysis involves the automatic discovery of usage interest in the web pages, which is often stored in web and application server access logs. Approach: The usage interest in the web pages in various sessions was partitioned into clusters such that sessions with “similar” interest were placed in the same cluster, using the expectation maximization clustering technique discussed in this study. Results: The approach results in the generation of usage profiles and automatic identification of user interest in each profile. Conclusion: The results will be helpful for organizations in improving web sites based on users' navigational interest and in providing recommendations for page(s) not yet visited by the user.
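
The session-clustering step described above can be sketched with a toy one-dimensional Gaussian mixture fit by expectation maximization: sessions with similar interest end up in the same component. The single feature (say, the fraction of a session spent on one page category), the data, and the two-component choice are illustrative assumptions, not the paper's setup.

```python
import math

def em_1d(xs, iters=50):
    """Two-component 1-D Gaussian mixture fit by expectation maximization."""
    mu = [min(xs), max(xs)]           # crude initialization
    var = [0.05, 0.05]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each session
        resp = []
        for x in xs:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(1e-4,
                         sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk)
    labels = [int(r[1] > r[0]) for r in resp]
    return mu, labels

# Invented sessions: three low-interest, three high-interest
sessions = [0.05, 0.10, 0.08, 0.85, 0.90, 0.80]
means, labels = em_1d(sessions)
print(labels)   # sessions with similar interest share a cluster label
```

A production system would use a multi-dimensional feature vector per session and a library implementation, but the E/M alternation is the same.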

  15. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging.

    Science.gov (United States)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver; Keasling, Jay D; Northen, Trent R; Bowen, Benjamin P

    2017-06-06

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well-suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different sized soil particles, including hydrolysis of sugars, revealing a pattern of size dependent activities. These results introduce OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.

  16. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    de Raad, Markus [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Rond, Tristan [Univ. of California, Berkeley, CA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Keasling, Jay D. [Univ. of California, Berkeley, CA (United States); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Technical Univ. of Denmark, Lyngby (Denmark); Northen, Trent R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Bowen, Benjamin P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States)

    2017-05-03

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well-suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. In this paper, the OpenMSI Arrayed Analysis Toolkit (OMAAT) is presented: a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different sized soil particles, including hydrolysis of sugars, revealing a pattern of size dependent activities. Finally, these results introduce OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.

  17. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

    processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside...... hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different sized soil particles, including hydrolysis...

  18. Reconstruction of paleoenvironments by analyzing spatial shell orientation

    Science.gov (United States)

    Lukeneder, Susanne; Lukeneder, Alexander; Weber, Gerhard W.; Exner, Ulrike

    2013-04-01

    one side of the shell (transverse axis) was measured (landmark s & c). Spatial orientation was characterized by dip and dip direction of the longitudinal axis, as well as by strike and azimuth of a plane defined by both axes. The exact spatial orientation data was determined for a sample of 699 ammonoids within the bed and statistically analyzed. The results provide hints about the geodynamic processes (paleocurrents), depositional conditions (allochthonous or autochthonous) and other general information about the ancient environment. The method can be adapted for other mass-occurring fossils and thus represents a good template for studies of topographical paleoenvironmental factors. References: Flügel, E. 2004. Microfacies of carbonate rocks. Analysis, Interpretation and Application. Springer, Berlin Heidelberg New York, p.182. Lukeneder S., Lukeneder A., Harzhauser M., Islamoglu Y., Krystyn L., Lein R. 2012. A delayed carbonate factory breakdown during the Tethyan-wide Carnian Pluvial Episode along the Cimmerian terranes (Taurus, Turkey). Facies 58: 279-296.

  19. Web-Based Spatial Training Using Handheld Touch Screen Devices

    Science.gov (United States)

    Martin-Dorta, Norena; Saorin, Jose Luis; Contero, Manuel

    2011-01-01

    This paper attempts to harness the opportunities for mobility and the new user interfaces that handheld touch screen devices offer, in a non-formal learning context, with a view to developing spatial ability. This research has addressed two objectives: first, analyzing the effects that training can have on spatial visualisation using the…

  20. Analyzing spatial data from mouse tracker methodology: An entropic approach.

    Science.gov (United States)

    Calcagnì, Antonio; Lombardi, Luigi; Sulpizio, Simone

    2017-01-11

    Mouse tracker methodology has recently been advocated to explore the motor components of the cognitive dynamics involved in experimental tasks like categorization, decision-making, and language comprehension. This methodology relies on the analysis of computer-mouse trajectories, by evaluating whether they significantly differ in terms of direction, amplitude, and location when a given experimental factor is manipulated. In this kind of study, a descriptive geometric approach is usually adopted in the analysis of raw trajectories, where they are summarized with several measures, such as maximum-deviation and area under the curve. However, using raw trajectories to extract spatial descriptors of the movements is problematic due to the noisy and irregular nature of empirical movement paths. Moreover, other significant components of the movement, such as motor pauses, are disregarded. To overcome these drawbacks, we present a novel approach (EMOT) to analyze computer-mouse trajectories that quantifies movement features in terms of entropy while modeling trajectories as composed by fast movements and motor pauses. A dedicated entropy decomposition analysis is additionally developed for the model parameters estimation. Two real case studies from categorization tasks are finally used to test and evaluate the characteristics of the new approach.
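
The general idea of scoring trajectories by entropy while separating fast movements from motor pauses can be illustrated as follows. This is a conceptual sketch, not the authors' EMOT estimator; the pause threshold, heading-angle binning, and example paths are all assumptions.

```python
import math

def trajectory_entropy(points, pause_eps=1.0, bins=8):
    """Split a cursor path into motor pauses (tiny steps) and movements,
    then score the movement part by the Shannon entropy of its headings."""
    angles, pauses = [], 0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if math.hypot(x1 - x0, y1 - y0) < pause_eps:
            pauses += 1                       # barely moving -> motor pause
        else:
            angles.append(math.atan2(y1 - y0, x1 - x0))
    if not angles:
        return 0.0, pauses
    counts = [0] * bins                       # discretize headings into bins
    for a in angles:
        counts[int((a + math.pi) / (2 * math.pi) * bins) % bins] += 1
    n = len(angles)
    h = -sum(c / n * math.log2(c / n) for c in counts if c)
    return h, pauses

straight = [(i, 0) for i in range(10)]        # one heading -> zero entropy
print(trajectory_entropy(straight))
```

A straight swipe concentrates all headings in one bin (entropy 0), while a path that keeps changing direction spreads mass across bins and scores higher, which is the intuition behind using entropy as a movement-complexity measure.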

  1. Implementing a Cost Effectiveness Analyzer for Web-Supported Academic Instruction: A Campus Wide Analysis

    Science.gov (United States)

    Cohen, Anat; Nachmias, Rafi

    2009-01-01

    This paper describes the implementation of a quantitative cost effectiveness analyzer for Web-supported academic instruction that was developed in Tel Aviv University during a long term study. The paper presents the cost effectiveness analysis of Tel Aviv University campus. Cost and benefit of 3,453 courses were analyzed, exemplifying campus-wide…

  2. Small-scale spatial pattern of web-building spiders (Araneae) in alfalfa: relationship to disturbance from cutting, prey availability, and intraguild interactions.

    Science.gov (United States)

    Birkhofer, Klaus; Scheu, Stefan; Wise, David H

    2007-08-01

    Understanding the development of spatial patterns in generalist predators will improve our ability to incorporate them into biological control programs. We studied the small-scale spatial patterns of spider webs in alfalfa by analyzing the relationship between web locations over distances ranging from 4 to 66 cm. Using a coordinate-based spatial statistic (O-ring) and assuming a heterogeneous distribution of suitable web sites, we analyzed the impact of cutting and changes in spider abundance on web distribution. We analyzed the influence of small-scale variation in prey availability by comparing web distributions to the pattern of sticky-trap captures of Aphididae and Diptera described by a count-based spatial statistic (SADIE). Cutting of alfalfa reduced the overall density of web-building spiders but had no immediate impact on the spatial distribution of their webs. Availability of aphids was highest before the alfalfa was cut and was clumped at a scale of 66 cm. Spider webs, however, were not clumped at any scale or date. In contrast, webs were regularly distributed at smaller distances. Because web-building spiders were most active during this period, we hypothesize that the development of small-scale regularity in web locations was driven by intraguild interactions. Our results suggest that intraguild interactions contribute to the development of small-scale spatial patterns of spider webs in alfalfa. Variation in prey availability may have more of an influence on web distribution in crops with a different vegetation structure or if patterns are studied at larger spatial scales.
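
The O-ring style of analysis mentioned above amounts to counting neighbors in distance rings around each point (web) and comparing against what randomness would give. Below is a toy sketch of that counting step; the coordinates and band edges are made up, and a real analysis would add edge correction and null-model simulation envelopes.

```python
import math

def ring_counts(points, edges):
    """Mean number of neighbors per point falling in each distance ring
    [edges[k], edges[k+1])."""
    counts = [0] * (len(edges) - 1)
    for i, (x0, y0) in enumerate(points):
        for j, (x1, y1) in enumerate(points):
            if i == j:
                continue
            d = math.hypot(x1 - x0, y1 - y0)
            for k in range(len(edges) - 1):
                if edges[k] <= d < edges[k + 1]:
                    counts[k] += 1
    return [c / len(points) for c in counts]

# A perfectly regular 4x4 grid of "webs" spaced 10 units apart:
# no neighbors at all in the innermost ring, many in the next one.
grid = [(x * 10.0, y * 10.0) for x in range(4) for y in range(4)]
print(ring_counts(grid, edges=[0, 5, 15]))
```

An empty inner ring relative to the random expectation is exactly the signature of the small-scale regularity the study reports.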

  3. Information disparities of Taiwan's health Web sites by spatial variation.

    Science.gov (United States)

    Hsiao, Fang-Ying; Chang, Polun; Hsu, Chiehwen Ed

    2008-11-06

    This study is based upon 40 of Taiwan's health Web sites belonging to teaching hospitals or medical centers. We divided these Web sites into northern, central, southern, and eastern Taiwan by their location. The five major research criteria were "Web site information credibility," "organization management," "tailored content," "easy surfing," and "online interaction." Based on the study, we found that, in general, Web sites located in the north had higher ratings than other Web sites.

  4. SWORS: a system for the efficient retrieval of relevant spatial web objects

    DEFF Research Database (Denmark)

    Cao, Xin; Cong, Gao; Jensen, Christian S.

    2012-01-01

    Spatial web objects that possess both a geographical location and a textual description are gaining in prevalence. This gives prominence to spatial keyword queries that exploit both location and textual arguments. Such queries are used in many web services such as yellow pages and maps services....

  6. Role of detritus in a spatial food web model with diffusion

    Science.gov (United States)

    Pekalski, Andrzej; Szwabiński, Janusz

    2014-05-01

    One of the central themes in modern ecology is the enduring debate on whether there is a relationship between the complexity of a biological community and its stability. In this paper, we focus on the role of detritus and spatial dispersion on the stability of ecosystems. Using Monte Carlo simulations we analyze two three-level models of food webs: a grazing one with the basal species (i.e., primary producers) having unlimited food resources and a detrital one in which the basal species uses detritus as a food resource. While the vast majority of theoretical studies neglects detritus, from our results it follows that the detrital food web is more stable than its grazing counterpart, because the interactions mediated by detritus damp out fluctuations in species' densities. Since the detritus model is the more complex one in terms of interaction patterns, our results provide evidence for the advocates of the complexity as one of the factors enhancing stability of ecosystems.

  7. Spatial scales of carbon flow in a river food web

    Science.gov (United States)

    Finlay, J.C.; Khandwala, S.; Power, M.E.

    2002-01-01

    Spatial extents of food webs that support stream and river consumers are largely unknown, but such information is essential for basic understanding and management of lotic ecosystems. We used predictable variation in algal δ13C with water velocity, and measurements of consumer δ13C and δ15N, to examine carbon flow and trophic structure in food webs of the South Fork Eel River in Northern California. Analyses of δ13C showed that the most abundant macroinvertebrate groups (collector-gatherers and scrapers) relied on algae from local sources within their riffle or shallow pool habitats. In contrast, filter-feeding invertebrates in riffles relied in part on algal production derived from upstream shallow pools. Riffle invertebrate predators also relied in part on consumers of pool-derived algal carbon. One abundant taxon drifting from shallow pools and riffles (baetid mayflies) relied on algal production derived from the habitats from which they dispersed. The trophic linkage from pool algae to riffle invertebrate predators was thus mediated through either predation on pool herbivores dispersing into riffles, or on filter feeders. Algal production in shallow pool habitats dominated the resource base of vertebrate predators in all habitats at the end of the summer. We could not distinguish between the trophic roles of riffle algae and terrestrial detritus, but both carbon sources appeared to play minor roles for vertebrate consumers. In shallow pools, small vertebrates, including three-spined stickleback (Gasterosteus aculeatus), roach (Hesperoleucas symmetricus), and rough-skinned newts (Taricha granulosa), relied on invertebrate prey derived from local pool habitats. During the most productive summer period, growth of all size classes of steelhead and resident rainbow trout (Oncorhynchus mykiss) in all habitats (shallow pools, riffles, and deep unproductive pools) was largely derived from algal production in shallow pools. Preliminary data suggest that the strong
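
Isotope reasoning of this kind rests on two standard formulas: a consumer's trophic level estimated from δ15N enrichment, and a two-source linear mixing model on δ13C for the carbon base of the diet. The sketch below uses the commonly assumed 3.4‰ per-trophic-step δ15N enrichment and invented example values; it is not the study's calculation.

```python
D15N_PER_LEVEL = 3.4  # assumed trophic fractionation (per mil)

def trophic_level(d15n_consumer, d15n_base, base_level=1.0):
    """Trophic level relative to a baseline organism (e.g. algae)."""
    return base_level + (d15n_consumer - d15n_base) / D15N_PER_LEVEL

def algal_fraction(d13c_consumer, d13c_algae, d13c_detritus):
    """Two-source linear mixing model: fraction of diet carbon from algae."""
    return (d13c_consumer - d13c_detritus) / (d13c_algae - d13c_detritus)

# Invented example: a fish relative to an algal baseline
print(trophic_level(d15n_consumer=10.2, d15n_base=3.4))
print(algal_fraction(d13c_consumer=-24.0,
                     d13c_algae=-20.0, d13c_detritus=-28.0))  # -> 0.5
```

Tracing carbon between habitats, as in this study, then comes down to choosing habitat-specific baselines (e.g. pool vs. riffle algae) in these formulas.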

  8. Index Structure for the Multi-scale Representation of Multi-dimensional Spatial Data in WebGIS

    Directory of Open Access Journals (Sweden)

    Zhihan LV

    2010-05-01

    Full Text Available To solve the problem that existing data structures cannot support the multi-scale representation of multi-dimensional spatial data in Web Geographic Information Systems (WebGIS), a modified data structure is put forward: (1) the main tree is derived from a deformation of the region-quadtree index structure, partitioned on the basis of the pyramid-structure rule; (2) sub-tree structures support the overlap of multi-dimensional spatial data; (3) the depth of the tree reflects changes in spatial resolution; (4) all nodes of the tree are containers of spatial objects. The necessity of generating the index is analyzed and described. The algorithm for generating the index structure, the support for multi-dimensional data, and the query process are discussed. For the same data source, comparative experiments using this structure and layers are provided, and the results show that this index method can represent and search massive multi-dimensional spatial data effectively in WebGIS. The structure has been used in the Shanghai multi-dimensional WebGIS system.
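
The region-quadtree idea underlying such indexes can be sketched in a few lines: deeper nodes cover smaller cells (finer spatial resolution), and a range query descends only into cells that intersect the query window. This is the textbook structure, not the paper's modified node-layer model.

```python
class QuadTree:
    def __init__(self, x, y, size, capacity=2):
        self.x, self.y, self.size = x, y, size    # square cell: origin + side
        self.capacity, self.points, self.kids = capacity, [], None

    def insert(self, px, py):
        if self.kids is None and len(self.points) < self.capacity:
            self.points.append((px, py))
            return
        if self.kids is None:                     # full leaf: split into 4
            h = self.size / 2
            self.kids = [QuadTree(self.x + dx * h, self.y + dy * h, h,
                                  self.capacity)
                         for dx in (0, 1) for dy in (0, 1)]
            for q in self.points:                 # push points down
                self._child(*q).insert(*q)
            self.points = []
        self._child(px, py).insert(px, py)

    def _child(self, px, py):
        h = self.size / 2
        return self.kids[2 * (px >= self.x + h) + (py >= self.y + h)]

    def query(self, x0, y0, x1, y1):
        if x1 < self.x or y1 < self.y or \
           x0 > self.x + self.size or y0 > self.y + self.size:
            return []                             # cell misses the window
        hits = [p for p in self.points
                if x0 <= p[0] <= x1 and y0 <= p[1] <= y1]
        for kid in (self.kids or []):
            hits += kid.query(x0, y0, x1, y1)
        return hits

qt = QuadTree(0, 0, 100)
for p in [(10, 10), (12, 14), (80, 80), (85, 90), (86, 91)]:
    qt.insert(*p)
print(sorted(qt.query(75, 75, 95, 95)))  # -> [(80, 80), (85, 90), (86, 91)]
```

Serving tiles at coarser zoom levels then corresponds to stopping the descent at a fixed depth, which is the pyramid-rule connection the abstract describes.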

  9. Hydrological and Biogeochemical Controls on Seasonal and Spatial Differences in Food Webs in the Everglades

    Science.gov (United States)

    Kendall, C.; Wankel, S. D.; Bemis, B. E.; Rawlik, P. S.; Krabbenhoft, D. P.; Lange, T.

    2002-05-01

    Stable isotopes can be used to determine the relative trophic positions of biota within a food web, and to improve our understanding of the biomagnification of contaminants. Plants at the base of the food web take up dissolved inorganic carbon (DIC) and nitrogen (DIN) for growth, and their tissue reflects the isotopic composition of these sources. Animals then mirror the isotopic composition of the primary producers, as modified by consumer-diet fractionations at successive trophic steps. During 1995-99, we collected algae, macrophyte, invertebrate, and fish samples from 15 USGS sites in the Everglades and analyzed them for δ13C and δ15N with the goal of characterizing seasonal and spatial differences in food web relations. Carbon isotopes effectively distinguish between two main types of food webs: ones where algae is the dominant base of the food web, which are characteristic of relatively pristine marsh sites with long hydroperiods, and ones where macrophyte debris appears to be a significant source of nutrients, which are apparently characteristic of shorter hydroperiod sites, and nutrient-impacted marshes and canals. There usually is an inverse relation between δ13C and δ15N of organisms over time, especially in more pristine environments, reflecting seasonal changes in the δ13C of DIC and the δ15N of DIN. The δ13C and δ15N of algae also show strong positive correlations with seasonal changes in water levels. This variability is substantially damped up the food chain, probably because of the longer integration times of animals vs. plants. We speculate that these seasonal shifts in water level result in changes in biogeochemical reactions and nutrient levels, with corresponding variations in the δ15N and δ13C of biota. For example, small changes in water level may change the balance of photosynthesis, bacterial respiration, and atmospheric exchange reactions that control the δ13C of DIC. Such changes will probably also affect the δ15N of dissolved inorganic N (DIN

  10. w4CSeq: software and web application to analyze 4C-seq data.

    Science.gov (United States)

    Cai, Mingyang; Gao, Fan; Lu, Wange; Wang, Kai

    2016-11-01

    Circularized Chromosome Conformation Capture followed by deep sequencing (4C-Seq) is a powerful technique to identify genome-wide partners interacting with a pre-specified genomic locus. Here, we present a computational and statistical approach to analyze 4C-Seq data generated from both enzyme digestion and sonication fragmentation-based methods. We implemented a command line software tool and a web interface called w4CSeq, which takes in the raw 4C sequencing data (FASTQ files) as input, performs automated statistical analysis and presents results in a user-friendly manner. Besides providing users with the list of candidate interacting sites/regions, w4CSeq generates figures showing genome-wide distribution of interacting regions, and sketches the enrichment of key features such as TSSs, TTSs, CpG sites and DNA replication timing around 4C sites. Users can establish their own web server by downloading source codes at https://github.com/WGLab/w4CSeq. Additionally, a demo web server is available at http://w4cseq.wglab.org. CONTACT: kaiwang@usc.edu or wangelu@usc.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  11. A Web-based System for Observing and Analyzing Computer Mediated Communications

    CERN Document Server

    May, Madeth; Prévôt, Patrick

    2007-01-01

    Tracking data of users' activities resulting from Computer Mediated Communication (CMC) tools (forum, chat, etc.) is often carried out in an ad-hoc manner, which either confines the reusability of data for different purposes or makes data exploitation difficult. Our research work addresses the methodological challenges involved in designing and developing a generic system for tracking users' activities while they interact with asynchronous communication tools such as discussion forums. We present in this paper an approach for building a Web-based system for observing and analyzing user activity on any type of discussion forum.

  12. Geo-communication, web-services, and spatial data infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2007-01-01

    The introduction of web-services as index-portals based on geo-information has changed the conditions for both content and form of geo-communication. A high number of players and interactions as well as a very high number of all kinds of information and combinations of these characterize web...... looks very complex, and it will get even more complex. Therefore, there is a strong need for theories and models that can describe this complex web in the SDI and geo-communication consisting of active components, passive components, users, and information in order to make it possible to handle......, collaboration, standards, models, specifications, web services, and finally the information. Awareness of the complexity is necessary, and structure is needed to make it possible for the geo-information community to pull together in the same direction. Modern web-based geo-communication and its infrastucture...

  13. Web-based GIS for spatial pattern detection: application to malaria incidence in Vietnam.

    Science.gov (United States)

    Bui, Thanh Quang; Pham, Hai Minh

    2016-01-01

    There is great concern about how to build an interoperable health information system spanning public health and health information technology within the development of public information and health surveillance programmes. Technically, some major issues remain regarding health data visualization, spatial processing of health data, health information dissemination, data sharing, and the access of local communities to health information. In combination with GIS, we propose a technical framework for web-based health data visualization and spatial analysis. Data were collected from open map servers and geocoded with the Open Data Kit package and data geocoding tools. The web-based system is designed on open-source frameworks and libraries, and provides web-based analysis tools for pattern detection through three spatial tests: nearest neighbour, the K function, and spatial autocorrelation. The result is a web-based GIS through which end users can detect disease patterns by selecting an area and spatial test parameters, and contribute findings to managers and decision makers. End users can be health practitioners, educators, local communities, health sector authorities, and decision makers. This web-based system allows for the improvement of health-related services for public sector users as well as citizens in a secure manner. The combination of spatial statistics and web-based GIS can be a solution that helps empower health practitioners in direct and specific intersectoral actions, thus providing for better analysis, control, and decision-making.
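
Of the three spatial tests named above, the nearest-neighbour one is the simplest to sketch: a Clark-Evans style ratio of the observed mean nearest-neighbour distance to the value expected for a random pattern of the same density, with R < 1 suggesting clustering (e.g. a disease hot spot) and R > 1 regularity. Toy coordinates only; a real test also needs edge correction and a significance value.

```python
import math

def clark_evans(points, area):
    """Clark-Evans nearest-neighbour ratio for a point pattern in `area`."""
    nn = [min(math.hypot(x - u, y - v)
              for u, v in points if (u, v) != (x, y))
          for x, y in points]
    observed = sum(nn) / len(nn)
    expected = 0.5 / math.sqrt(len(points) / area)  # 1 / (2 * sqrt(density))
    return observed / expected

# Invented case data forming two tight clusters inside a 10x10 study area
clustered = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1)]
print(clark_evans(clustered, area=100.0))   # well below 1 -> clustered
```

In a web GIS of the kind described, the user-selected area and case points would be fed to a routine like this server-side, with the R value (plus a z-score) returned to the map client.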

  14. Geo-communication, web-services, and spatial data infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2007-01-01

    The introduction of web-services as index-portals based on geo-information has changed the conditions for both content and form of geo-communication. A high number of players and interactions as well as a very high number of all kinds of information and combinations of these characterise web...... services, where maps are only a part of the whole. This chapter discusses the relations between the different components of SDI and geo-communication as well as the impact thereof. Discussed is also a model for the organization of the passive components of the infrastructure; that is, legislation......, collaboration, standards, models, specifications, web services, and finally the information. Awareness of the complexity is necessary, and structure is needed to make it possible for the geo-information community to pull together in the same direction. Modern web-based geo-communication and its infrastructure...

  15. A study on heterogeneous distributed spatial information platform based on semantic Web services

    Science.gov (United States)

    Peng, Shuang-yun; Yang, Kun; Xu, Quan-li; Huang, Bang-mei

    2008-10-01

    With the development of Semantic Web technology, spatial information services based on ontology are an effective way to share and interoperate heterogeneous information resources in a distributed network environment. This paper discusses spatial information sharing and interoperability in the Semantic Web Services architecture. By using ontologies to record spatial information in a shared knowledge system, implicit and hidden semantic information is expressed explicitly and formally, which provides the prerequisite for spatial information sharing and interoperability. Semantic Web Services technology then parses the ontologies and intelligently builds services in the network environment, forming a network of services. In order to realize practical applications of spatial information sharing and interoperation in different branches of the CDC system, a prototype system for HIV/AIDS information sharing based on geo-ontology has also been developed using the methods described above.

  16. Creating web applications for spatial epidemiological analysis and mapping in R using Rwui

    Directory of Open Access Journals (Sweden)

    Deonarine Andrew

    2011-04-01

    Full Text Available Abstract Background Creating a user-friendly web-based application which executes an R script allows physicians, epidemiologists, and others unfamiliar with the statistical language to perform powerful statistical analyses easily. The geographic mapping of data is an important tool in spatial epidemiological analysis, and the R project includes many tools for such analyses, but few for visualization. Hence, web applications that run R for epidemiological analysis need to be able to present the results in a geographic format. Results Rwui is a web application for creating web-based applications for running R scripts. We describe updates to Rwui that enable it to create web applications for R scripts which return the results of the analysis to the web page as geographic maps. Conclusions Rwui enables statisticians to create web applications for R scripts without the need to learn web programming. Creating a web application provides users access to an R-based analysis without the need to learn R. Recent updates to Rwui have increased its applicability in the field of spatial epidemiological analysis.

  17. Persistence of information on the web: Analyzing citations contained in research articles

    DEFF Research Database (Denmark)

    Lawrance, S.; Coetzee, F.; Flake, G.

    2000-01-01

    We analyze the persistence of information on the web, looking at the percentage of invalid URLs contained in academic articles within the CiteSeer (ResearchIndex) database. The number of URLs contained in the papers has increased from an average of 0.06 in 1993 to 1.6 in 1999. We found...... that a significant percentage of URLs are now invalid, ranging from 23% for 1999 articles, to 53% for 1994. We also found that for almost all of the invalid URLs, it was possible to locate the information (or highly related information) in an alternate location, primarily with the use of search engines. However......, the ability to relocate missing information varied according to search experience and effort expended. Citation practices suggest that more information may be lost in the future unless these practices are improved. We discuss persistent URL standards and their usage, and give recommendations for citing URLs...
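
    The study's URL-checking pipeline is not described in detail here; a minimal sketch of its first stage, extracting candidate URLs from article text and screening them syntactically, might look as follows. The regular expression and sample text are illustrative assumptions, and an actual persistence check would additionally issue HTTP requests for each URL:

```python
import re
from urllib.parse import urlparse

# Greedy match up to whitespace or common closing delimiters
URL_RE = re.compile(r'https?://[^\s)>\]"]+')

def extract_urls(text):
    """Pull candidate URLs out of free text, trimming trailing punctuation."""
    return [u.rstrip('.,;') for u in URL_RE.findall(text)]

def looks_valid(url):
    """Syntactic screen only; a liveness check would follow with an HTTP HEAD."""
    p = urlparse(url)
    return bool(p.scheme in ('http', 'https') and p.netloc and '.' in p.netloc)

sample = "See http://citeseer.ist.psu.edu/ and (http://example.org/paper.pdf)."
urls = extract_urls(sample)
print(urls)
print([looks_valid(u) for u in urls])
```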

  18. SambVca 2. A Web Tool for Analyzing Catalytic Pockets with Topographic Steric Maps

    KAUST Repository

    Falivene, Laura

    2016-06-27

    Developing more efficient catalysts remains one of the primary targets of organometallic chemists. To accelerate reaching this goal, effective molecular descriptors and visualization tools can represent a remarkable aid. Here, we present a Web application for analyzing the catalytic pocket of metal complexes using topographic steric maps as a general and unbiased descriptor that is suitable for every class of catalysts. To show the broad applicability of our approach, we first compared the steric map of a series of transition metal complexes presenting popular mono-, di-, and tetracoordinated ligands and three classic zirconocenes. This comparative analysis highlighted similarities and differences between totally unrelated ligands. Then, we focused on a recently developed Fe(II) catalyst that is active in the asymmetric transfer hydrogenation of ketones and imines. Finally, we expand the scope of these tools to rationalize the inversion of enantioselectivity in enzymatic catalysis, achieved by point mutation of three amino acids of mononuclear p-hydroxymandelate synthase.

  19. A Novel Web Application to Analyze and Visualize Extreme Heat Events

    Science.gov (United States)

    Li, G.; Jones, H.; Trtanj, J.

    2016-12-01

    Extreme heat is the leading cause of weather-related deaths in the United States annually and is expected to increase with our warming climate. However, most of these deaths are preventable with proper tools and services to inform the public about heat waves. In this project, we have investigated the key indicators of a heat wave, the vulnerable populations, and the data visualization strategies by which those populations most effectively absorb heat wave data. A map-based web app has been created that allows users to search and visualize historical heat waves in the United States incorporating these strategies. This app utilizes daily maximum temperature data from the NOAA Global Historical Climatology Network, which contains about 2.7 million data points from over 7,000 stations per year. The point data are spatially aggregated into county-level data using county geometry from the US Census Bureau and stored in a Postgres database with PostGIS spatial capability. GeoServer, a powerful map server, is used to serve the image and data layers (WMS and WFS). The JavaScript-based web-mapping platform Leaflet is used to display the temperature layers. A number of functions have been implemented for search and display. Users can search for extreme heat events by county or by date. The "by date" option allows a user to select a date and a Tmax threshold, which then highlights all of the areas on the map that meet those date and temperature parameters. The "by county" option allows the user to select a county on the map, which then retrieves a list of heat wave dates and daily Tmax measurements. This visualization is clean, user-friendly, and novel: while these time, space, and temperature measurements can be found by querying meteorological datasets, no existing tool neatly packages this information together in an easily accessible and non-technical manner, especially at a time when climate change urges a better understanding of heat waves.
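
    The county-level aggregation and the "by date" search described above can be sketched in a few lines. This is a simplified illustration, not the app's actual code: the station readings and FIPS codes are made up, and the real system performs the point-in-polygon aggregation with PostGIS rather than in application code:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical station readings: (station_id, county_fips, date, tmax_fahrenheit)
readings = [
    ("ST1", "48201", "2011-08-02", 104.0),
    ("ST2", "48201", "2011-08-02", 101.0),
    ("ST3", "48113", "2011-08-02", 99.0),
    ("ST1", "48201", "2011-08-03", 97.0),
]

def county_daily_tmax(rows):
    """Aggregate station Tmax to (county, date) by averaging stations."""
    buckets = defaultdict(list)
    for _sid, fips, date, tmax in rows:
        buckets[(fips, date)].append(tmax)
    return {k: mean(v) for k, v in buckets.items()}

def counties_over(rows, date, threshold):
    """The 'by date' search: counties whose aggregated Tmax meets the threshold."""
    agg = county_daily_tmax(rows)
    return sorted(f for (f, d), t in agg.items() if d == date and t >= threshold)

print(counties_over(readings, "2011-08-02", 100.0))  # → ['48201']
```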

  20. Collab-Analyzer: An Environment for Conducting Web-Based Collaborative Learning Activities and Analyzing Students' Information-Searching Behaviors

    Science.gov (United States)

    Wu, Chih-Hsiang; Hwang, Gwo-Jen; Kuo, Fan-Ray

    2014-01-01

    Researchers have found that students might get lost or feel frustrated while searching for information on the Internet to deal with complex problems without real-time guidance or supports. To address this issue, a web-based collaborative learning system, Collab-Analyzer, is proposed in this paper. It is not only equipped with a collaborative…

  1. Efficient Top-k Locality Search for Co-located Spatial Web Objects

    DEFF Research Database (Denmark)

    Qu, Qiang; Liu, Siyuan; Yang, Bin

    2014-01-01

    In step with the web being used widely by mobile users, user location is becoming an essential signal in services, including local intent search. Given a large set of spatial web objects consisting of a geographical location and a textual description (e.g., online business directory entries...... of restaurants, bars, and shops), how can we find sets of objects that are both spatially and textually relevant to a query? Most existing studies solve the problem by requiring that all query keywords are covered by the returned objects and then ranking the sets by spatial proximity. The needs for identifying...
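
    The baseline approach the authors describe, returning object sets that jointly cover the query keywords and ranking them by spatial proximity, can be sketched by brute force. The objects, keywords, and the max-distance cost function below are illustrative assumptions; the paper's actual algorithms and ranking function may differ:

```python
import math
from itertools import combinations

# Hypothetical spatial web objects: (name, (x, y), keywords)
objects = [
    ("cafe", (0.0, 0.0), {"coffee"}),
    ("bar", (1.0, 0.0), {"beer"}),
    ("bistro", (5.0, 5.0), {"coffee", "beer"}),
    ("pub", (0.5, 0.5), {"beer"}),
]

def cost(group, q):
    """Spatial cost: distance of the farthest group member from the query."""
    return max(math.hypot(x - q[0], y - q[1]) for _, (x, y), _kw in group)

def top_k(query_loc, query_kw, k=2, max_size=2):
    """Brute-force baseline: every group whose combined keywords cover
    the query keywords, ranked by ascending spatial cost."""
    results = []
    for size in range(1, max_size + 1):
        for group in combinations(objects, size):
            covered = set().union(*(kw for _, _, kw in group))
            if query_kw <= covered:
                results.append(([n for n, _, _ in group], cost(group, query_loc)))
    results.sort(key=lambda r: r[1])
    return results[:k]

print(top_k((0.0, 0.0), {"coffee", "beer"}))
```

    For a query at the origin asking for both "coffee" and "beer", the nearby pair (cafe, pub) beats the single distant bistro, which is the behaviour such queries are designed to capture.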

  2. Analyzing the Evolution of Web Services using Fine-Grained Changes

    NARCIS (Netherlands)

    Romano, D.; Pinzger, M.

    2012-01-01

    Preprint of paper published in: ICWS 2012 - IEEE 19th International Conference on Web Services, 24-29 June 2012; doi:10.1109/ICWS.2012.29 In the service-oriented paradigm web service interfaces are considered contracts between web service subscribers and providers. However, these interfaces are co

  3. Privacy for Semantic Web Mining using Advanced DSA – Spatial LBS Case Study in mining

    Directory of Open Access Journals (Sweden)

    S.Nagaprasad Sri

    2010-09-01

    Full Text Available The Web Services paradigm promises to enable rich, flexible and dynamic interoperation of highly distributed, heterogeneous network-enabled services. The idea of Web Services Mining is that it makes use of the findings in the field of data mining and applies them to the world of Web Services. The emerging concept of Semantic Web Services aims at more sophisticated Web Services technologies: on the basis of Semantic Description Frameworks, intelligent mechanisms are envisioned for the discovery, composition, and contracting of Web Services. The aim of the semantic web is not only to support access to information on the web but also to support its usage. The Geospatial Semantic Web is an augmentation of the Semantic Web that adds geospatial abstractions, as well as related reasoning, representation and query mechanisms. Web Service security represents a key requirement for today's distributed, interconnected digital world and for the new generations, Web 2.0 and the Semantic Web. To date, the problem of security has been investigated mostly in the context of standardization efforts; personal judgments are usually made based on the sensitivity of the information and the reputation of the party to which the information is to be disclosed. On the privacy front, this means that privacy invasion would net more quality and sensitive personal information. In this paper, we have implemented a case study on the integrated privacy issues of Spatial Semantic Web Services Mining. Initially we improved the privacy of the Geospatial Semantic Layer. Finally, we implemented a Location Based System and improved its digital signature capability, using advanced Digital Signature standards.

  4. Privacy for Semantic Web Mining using Advanced DSA – Spatial LBS Case Study

    Directory of Open Access Journals (Sweden)

    Dr.D.Sravan Kumar,

    2010-05-01

    Full Text Available The Web Services paradigm promises to enable rich, flexible and dynamic interoperation of highly distributed, heterogeneous network-enabled services. The idea of Web Services Mining is that it makes use of the findings in the field of data mining and applies them to the world of Web Services. The emerging concept of Semantic Web Services aims at more sophisticated Web Services technologies: on the basis of Semantic Description Frameworks, intelligent mechanisms are envisioned for the discovery, composition, and contracting of Web Services. The aim of the semantic web is not only to support access to information on the web but also to support its usage. The Geospatial Semantic Web is an augmentation of the Semantic Web that adds geospatial abstractions, as well as related reasoning, representation and query mechanisms. Web Service security represents a key requirement for today's distributed, interconnected digital world and for the new generations, Web 2.0 and the Semantic Web. To date, the problem of security has been investigated mostly in the context of standardization efforts; personal judgments are usually made based on the sensitivity of the information and the reputation of the party to which the information is to be disclosed. On the privacy front, this means that privacy invasion would net more quality and sensitive personal information. In this paper, we have implemented a case study on the integrated privacy issues of Spatial Semantic Web Services Mining. Initially we improved the privacy of the Geospatial Semantic Layer. Finally, we implemented a Location Based System and improved its digital signature capability, using advanced Digital Signature standards.

  5. [Spatial distribution pattern of Chilo suppressalis analyzed by classical method and geostatistics].

    Science.gov (United States)

    Yuan, Zheming; Fu, Wei; Li, Fangyi

    2004-04-01

    Two original samples of Chilo suppressalis and their grid, random and sequence subsamples were analyzed by the classical method and by geostatistics to characterize the spatial distribution pattern of C. suppressalis. The limitations of spatial distribution analysis with the classical method, especially its sensitivity to the original position of the grid, were summarized comprehensively. In contrast, geostatistics characterized well the spatial distribution pattern, aggregation intensity and spatial heterogeneity of C. suppressalis. According to geostatistics, the population followed a Poisson distribution at low density. Higher-density populations were aggregated, with an aggregation intensity and dependence range of 0.1056 and 193 cm, respectively. Spatial heterogeneity was also found in the higher-density population. Its spatial correlation in the line direction was closer than that in the row direction, and the dependence ranges in the line and row directions were 115 and 264 cm, respectively.
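
    The dependence ranges reported above come from variogram analysis. A minimal sketch of the classical (Matheron) empirical semivariogram, computed on synthetic autocorrelated transect data rather than the paper's C. suppressalis counts, is:

```python
import math
import random

def empirical_semivariogram(samples, lags, tol):
    """gamma(h) = mean of 0.5*(z_i - z_j)^2 over point pairs whose separation
    distance falls within tol of lag h (classical Matheron estimator)."""
    gammas = []
    for h in lags:
        sq = [0.5 * (zi - zj) ** 2
              for i, (xi, yi, zi) in enumerate(samples)
              for (xj, yj, zj) in samples[i + 1:]
              if abs(math.hypot(xi - xj, yi - yj) - h) <= tol]
        gammas.append(sum(sq) / len(sq) if sq else float("nan"))
    return gammas

# Hypothetical counts on a transect with short-range spatial dependence
random.seed(0)
samples = []
z = 5.0
for x in range(50):
    z = 0.8 * z + random.gauss(1.0, 1.0)   # autocorrelated along x
    samples.append((float(x), 0.0, z))

print([round(g, 2) for g in empirical_semivariogram(samples, [1, 5, 10], 0.5)])
```

    For spatially dependent data, semivariance rises with lag until it levels off at the sill; the lag at which it levels off is the dependence range quoted in the abstract.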

  6. A framework for efficient spatial web object retrieval

    DEFF Research Database (Denmark)

    Wu, Dinging; Cong, Gao; Jensen, Christian S.

    2012-01-01

    The conventional Internet is acquiring a geospatial dimension. Web documents are being geo-tagged and geo-referenced objects such as points of interest are being associated with descriptive text documents. The resulting fusion of geo-location and documents enables new kinds of queries that take...... of the framework demonstrate that the paper’s proposal is capable of excellent performance...

  7. Leisure Commons: Spatial History of Web 2.0

    NARCIS (Netherlands)

    P.A. Arora (Payal)

    2014-01-01

    markdownabstract__Abstract__ The Internet has matured. It is now characterized by a new generation of websites popularly termed as Web 2.0. The nature of this transformation is predominantly social versus technical in nature, and is marked by the rise of social network sites and user generated cont

  8. Voids and the Cosmic Web: cosmic depression & spatial complexity

    NARCIS (Netherlands)

    van de Weygaert, Rien; Shandarin, S.; Saar, E.; Einasto, J.

    2016-01-01

    Voids form a prominent aspect of the Megaparsec distribution of galaxies and matter. Not only do they represent a key constituent of the Cosmic Web, they also are one of the cleanest probes and measures of global cosmological parameters. The shape and evolution of voids are highly sensitive to the natu

  9. Forward, back and home again : analyzing user behavior on the web

    OpenAIRE

    2006-01-01

    In a period of less than two decades, the World Wide Web has evolved into one of the most important sources of information and services. Due to the infancy of the Web and its rapid growth, our knowledge on how users interact with the Web is limited - knowledge which is likely to provide pointers for improvements in the design of Web sites and Web browsers. In this thesis, we aim to provide an integrative overview of theoretical insights and empirical findings, and to extend this body of knowl...

  10. A Bayesian spatial random parameters Tobit model for analyzing crash rates on roadway segments.

    Science.gov (United States)

    Zeng, Qiang; Wen, Huiying; Huang, Helai; Abdel-Aty, Mohamed

    2017-03-01

    This study develops a Bayesian spatial random parameters Tobit model to analyze crash rates on road segments, in which both spatial correlation between adjacent sites and unobserved heterogeneity across observations are accounted for. Crash-rate data for a three-year period on road segments within a road network in Florida are used to compare the performance of the proposed model with that of a (fixed parameters) Tobit model and a spatial (fixed parameters) Tobit model in the Bayesian context. A significant spatial effect is found in both spatial models, and the Deviance Information Criterion (DIC) results show that the inclusion of spatial correlation in the Tobit regression considerably improves model fit, which indicates the reasonableness of considering cross-segment spatial correlation. The spatial random parameters Tobit regression has a lower DIC value than the spatial Tobit regression, suggesting that accommodating the unobserved heterogeneity further improves model fit once the spatial correlation has been considered. Moreover, the random parameters Tobit model provides a more comprehensive understanding of the effect of speed limit on crash rates than does its fixed parameters counterpart, which suggests that it could be considered a good alternative for crash rate analysis.
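
    The Tobit likelihood underlying all three models treats zero crash rates as left-censored observations. A minimal sketch of that likelihood (non-spatial, fixed parameters, with made-up data; the paper's Bayesian spatial random parameters extension adds spatial random effects and segment-specific coefficients) is:

```python
import math

def norm_logpdf(z):
    """Log density of the standard normal."""
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi)

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def tobit_loglik(y, xb, sigma):
    """Left-censored (at zero) Tobit log-likelihood: censored observations
    contribute log Phi(-xb/sigma), uncensored ones the usual normal density."""
    ll = 0.0
    for yi, mi in zip(y, xb):
        if yi <= 0.0:
            ll += math.log(norm_cdf(-mi / sigma))
        else:
            ll += norm_logpdf((yi - mi) / sigma) - math.log(sigma)
    return ll

# Hypothetical crash rates (zeros are censored segments) and linear predictors
y = [0.0, 1.2, 0.0, 2.5]
xb = [-0.5, 1.0, 0.2, 2.0]
print(round(tobit_loglik(y, xb, sigma=1.0), 3))
```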

  11. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (sensdi)

    Science.gov (United States)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after evaluating events in real time. This research formalizes the notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geo-spatial data on a popular usage platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructure to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms utilizing sensor web and spatial information achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on GeoNode, QGIS and Java, which bind most of the functionalities of the Internet, sensor web and, nowadays, the Internet of Things, superseding the Internet of Sensors as well. In a nutshell, the project delivers a generalized real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  12. Considering spatial heterogeneity in the distributed lag non-linear model when analyzing spatiotemporal data.

    Science.gov (United States)

    Chien, Lung-Chang; Guo, Yuming; Li, Xiao; Yu, Hwa-Lung

    2016-11-16

    The distributed lag non-linear model (DLNM) has been frequently used in time series environmental health research. However, its functionality for assessing spatial heterogeneity is still restricted, especially in analyzing spatiotemporal data. This study proposed a solution that takes a spatial function into account in the DLNM, and compared the influence with and without considering spatial heterogeneity in a case study. This research applied the DLNM to investigate the non-linear lag effect up to 7 days in a case study of the spatiotemporal impact of fine particulate matter (PM2.5) on preschool children's acute respiratory infection in 41 districts of northern Taiwan during 2005 to 2007. We applied two spatiotemporal methods to impute missing air pollutant data, and included Markov random fields to analyze district boundary data in the DLNM. When analyzing the original data without a spatial function, the overall PM2.5 effect accumulated from all lag-specific effects had a slight variation at smaller PM2.5 measurements, but eventually decreased to a significantly lower relative risk. When analyzing spatiotemporally imputed data without a spatial function, the overall PM2.5 effect did not decrease but increased monotonically as PM2.5 increased over 20 μg/m³. After adding a spatial function in the DLNM, the spatiotemporally imputed data produced results similar to the overall effect from the original data. Moreover, the spatial function showed a clear and uneven pattern in Taipei, revealing that preschool children living in 31 districts of Taipei were vulnerable to acute respiratory infection. Our findings suggest the necessity of including a spatial function in the DLNM to make a spatiotemporal analysis available and to conduct more reliable and explainable research. This study also revealed the analytical impact when spatial heterogeneity is ignored. Journal of Exposure Science and Environmental Epidemiology advance online publication, 16 November 2016; doi:10.1038/jes

  13. Voids and the Cosmic Web: cosmic depressions & spatial complexity

    CERN Document Server

    van de Weygaert, Rien

    2016-01-01

    Voids form a prominent aspect of the Megaparsec distribution of galaxies and matter. Not only do they represent a key constituent of the Cosmic Web, they also are one of the cleanest probes and measures of global cosmological parameters. The shape and evolution of voids are highly sensitive to the nature of dark energy, while their substructure and galaxy population provides a direct key to the nature of dark matter. Also, the pristine environment of void interiors is an important testing ground for our understanding of environmental influences on galaxy formation and evolution. In this paper, we review the key aspects of the structure and dynamics of voids, with a particular focus on the hierarchical evolution of the void population. We demonstrate how the rich structural pattern of the Cosmic Web is related to the complex evolution and buildup of voids.

  14. Voids and the Cosmic Web: cosmic depression & spatial complexity

    Science.gov (United States)

    van de Weygaert, Rien

    2016-10-01

    Voids form a prominent aspect of the Megaparsec distribution of galaxies and matter. Not only do they represent a key constituent of the Cosmic Web, they also are one of the cleanest probes and measures of global cosmological parameters. The shape and evolution of voids are highly sensitive to the nature of dark energy, while their substructure and galaxy population provides a direct key to the nature of dark matter. Also, the pristine environment of void interiors is an important testing ground for our understanding of environmental influences on galaxy formation and evolution. In this paper, we review the key aspects of the structure and dynamics of voids, with a particular focus on the hierarchical evolution of the void population. We demonstrate how the rich structural pattern of the Cosmic Web is related to the complex evolution and buildup of voids.

  15. Isotopic evidence for the spatial heterogeneity of the planktonic food webs in the transition zone between river and lake ecosystems.

    Science.gov (United States)

    Doi, Hideyuki; Zuykova, Elena I; Shikano, Shuichi; Kikuchi, Eisuke; Ota, Hiroshi; Yurlova, Natalia I; Yadrenkina, Elena

    2013-01-01

    Resources and organisms in food webs are distributed patchily. The spatial structure of food webs is therefore critical to understanding their overall structure. However, there is little available information about the small-scale spatial structure of food webs. We investigated the spatial structure of food webs in a lake ecosystem at the littoral transition zone between an inflowing river and a lake. We measured the carbon isotope ratios of zooplankton and particulate organic matter (POM; predominantly phytoplankton) in the littoral zone of a saline lake. Parallel changes in the δ¹³C values of zooplankton and their respective POM indicated spatial heterogeneity of the food web in this study area. Lake ecosystems are usually classified at the landscape level as either pelagic or littoral habitats. However, we showed small-scale spatial heterogeneity among planktonic food webs along an environmental gradient. Stable isotope data are useful for detecting spatial heterogeneity of habitats, populations, communities, and ecosystems.

  16. 3D Spatial Data Infrastructures for web-based Visualization

    OpenAIRE

    Schilling, Arne

    2014-01-01

    In this thesis, concepts for developing Spatial Data Infrastructures with an emphasis on visualizing 3D landscape and city models in distributed environments are discussed. Spatial Data Infrastructures are important for public authorities in order to perform tasks on a daily basis, and serve as research topic in geo-informatics. Joint initiatives at national and international level exist for harmonizing procedures and technologies. Interoperability is an important aspect in this context - as ...

  17. Spatial Object Aggregation Based on Data Structure, Local Triangulation and Hierarchical Analyzing Method

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper focuses on the methods and process of spatial aggregation based on the semantic and geometric characteristics of spatial objects and the relations among the objects, with the help of a spatial data structure (Formal Data Structure), Local Constrained Delaunay Triangulations and semantic hierarchy. The adjacency relation among connected and unconnected objects has been studied using the constrained triangle as the elementary processing unit in the aggregation operation. A hierarchical semantic analytical matrix is given for analyzing the similarity between object types and between objects. Several different cases of aggregation are presented in this paper.

  18. Spatial capture-recapture: a promising method for analyzing data collected using artificial cover objects

    Science.gov (United States)

    Sutherland, Chris; Munoz, David; Miller, David A.W.; Grant, Evan

    2016-01-01

    Spatial capture–recapture (SCR) is a relatively recent development in ecological statistics that provides a spatial context for estimating abundance and space use patterns, and improves inference about absolute population density. SCR has been applied to individual encounter data collected noninvasively using methods such as camera traps, hair snares, and scat surveys. Despite the widespread use of capture-based surveys to monitor amphibians and reptiles, there are few applications of SCR in the herpetological literature. We demonstrate the utility of the application of SCR for studies of reptiles and amphibians by analyzing capture–recapture data from Red-Backed Salamanders, Plethodon cinereus, collected using artificial cover boards. Using SCR to analyze spatial encounter histories of marked individuals, we found evidence that density differed little among four sites within the same forest (on average, 1.59 salamanders/m2) and that salamander detection probability peaked in early October (Julian day 278) reflecting expected surface activity patterns of the species. The spatial scale of detectability, a measure of space use, indicates that the home range size for this population of Red-Backed Salamanders in autumn was 16.89 m2. Surveying reptiles and amphibians using artificial cover boards regularly generates spatial encounter history data of known individuals, which can readily be analyzed using SCR methods, providing estimates of absolute density and inference about the spatial scale of habitat use.
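
    The core of an SCR model is a detection function that decays with distance from an individual's activity centre. A minimal sketch of the commonly used half-normal form, with hypothetical cover-board coordinates and parameter values rather than the study's estimates, is:

```python
import math

def detection_prob(activity_center, trap, g0=0.4, sigma=3.0):
    """Half-normal SCR detection: p = g0 * exp(-d^2 / (2*sigma^2)),
    where d is the distance from the activity centre to the trap (board).
    g0 is baseline detectability; sigma sets the spatial scale of space use."""
    d = math.hypot(activity_center[0] - trap[0], activity_center[1] - trap[1])
    return g0 * math.exp(-d * d / (2 * sigma * sigma))

# A cover-board array ("traps") on a 3x3 grid, spacing in metres
boards = [(x, y) for x in (0, 5, 10) for y in (0, 5, 10)]
center = (2.0, 3.0)  # hypothetical salamander activity centre

probs = [detection_prob(center, b) for b in boards]
print(round(max(probs), 3))  # detection peaks at the board nearest the centre
```

    Fitting an SCR model amounts to estimating g0, sigma, and density from the spatial encounter histories; sigma then translates into a home-range area like the 16.89 m² reported above.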

  19. Retrieving top-k prestige-based relevant spatial web objects

    DEFF Research Database (Denmark)

    Cao, Xin; Cong, Gao; Jensen, Christian S.

    2010-01-01

    of prestige-based relevance to capture both the textual relevance of an object to a query and the effects of nearby objects. Based on this, a new type of query, the Location-aware top-k Prestige-based Text retrieval (LkPT) query, is proposed that retrieves the top-k spatial web objects ranked according...... to both prestige-based relevance and location proximity. We propose two algorithms that compute LkPT queries. Empirical studies with real-world spatial data demonstrate that LkPT queries are more effective in retrieving web objects than a previous approach that does not consider the effects of nearby...

  20. Analyzing Web 2.0 Integration with Next Generation Networks for Services Rendering

    CERN Document Server

    Lakhtaria, Kamaljit I

    2010-01-01

    The Next Generation Network (NGN) aims to integrate IP-based telecom infrastructures and provide the most advanced, high-speed, emerging value-added services. NGN is capable of providing highly innovative services that integrate communication and Web services into a single platform. The IP Multimedia Subsystem (IMS), a leading NGN technology, enables a variety of NGN-compliant communications services to interoperate while being accessed through different kinds of access networks, preferably broadband. IMS-NGN services essential to both consumer and corporate users are by now accessed through the web and web-based communities and social networks. It is key for the success of IMS-based services that they be provided with efficient web access, so users can benefit from those new services by using web-based applications and user interfaces, not only NGN-IMS User Equipment and the SIP protocol. Many services are under planning which are provided only under convergence of ...

  1. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui

    Science.gov (United States)

    2012-01-01

    Background The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. Methods This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics

  2. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui

    Directory of Open Access Journals (Sweden)

    Newton Richard

    2012-09-01

    Full Text Available Abstract Background The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. Methods This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical

  3. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui.

    Science.gov (United States)

    Newton, Richard; Deonarine, Andrew; Wernisch, Lorenz

    2012-09-24

    The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an
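
    The glue between the R script and the map is generated by Rwui, but the underlying pattern is simple: the script writes tabular spatial output, and the web layer converts it into marker objects for the dynamic-map client. A minimal Python sketch of that conversion step (the column names lat/lon/label are illustrative assumptions, not Rwui's actual interface):

```python
import csv
import io
import json

def csv_to_markers(csv_text):
    """Turn an R script's CSV output into dynamic-map marker dicts.
    Column names (lat, lon, label) are illustrative assumptions."""
    markers = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        markers.append({
            "position": {"lat": float(row["lat"]), "lng": float(row["lon"])},
            "title": row["label"],
        })
    return markers

# hypothetical output written by an R script
r_output = "lat,lon,label\n51.5,-0.12,London case\n48.85,2.35,Paris case\n"
print(json.dumps(csv_to_markers(r_output)))
```

    The JSON list can then be handed to the map client to render discrete markers; continuous overlays would follow the same pattern with a raster payload.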

  4. A geospatial web portal for sharing and analyzing greenhouse gas data derived from satellite remote sensing images

    Science.gov (United States)

    Lin, Hao; Yu, Bailang; Chen, Zuoqi; Hu, Yingjie; Huang, Yan; Wu, Jianping; Wu, Bin; Ge, Rong

    2013-09-01

    Greenhouse gas data collected by different institutions throughout the world have significant scientific value for global climate change studies. Due to the diversity of data formats and the differing specifications of data access interfaces, most of these data must first be downloaded to a local machine before they can be used. To overcome this limitation, we present a geospatial web portal for sharing and analyzing greenhouse gas data derived from remote sensing images. As a proof of concept, a prototype has also been designed and implemented. The workflow of the web portal comprises four processes: data access, data analysis, results visualization, and results output. A large volume of greenhouse gas data has been collected, described, and indexed in the portal, and a variety of data analysis services, such as calculating the temporal variation of regionally averaged column CO2 values and analyzing the latitudinal variations of globally averaged column CO2 values, are integrated into the portal. With the integrated geospatial data and services, researchers can collect and analyze greenhouse gas data online, and can preview and download the analysis results directly from the web portal. The geospatial web portal has been implemented as a web application, and we also use a case study to illustrate the framework.
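
    One of the portal's analysis services, the temporal variation of regionally averaged column CO2, reduces to averaging the observations that fall inside a region for each time step. A hedged sketch of that computation (the data layout and function name are assumptions, not the portal's actual API):

```python
def regional_mean_series(observations, lat_range, lon_range):
    """observations: list of (time, lat, lon, xco2) tuples.
    Returns {time: mean XCO2} over points inside the lat/lon box."""
    sums, counts = {}, {}
    for t, lat, lon, xco2 in observations:
        if lat_range[0] <= lat <= lat_range[1] and lon_range[0] <= lon <= lon_range[1]:
            sums[t] = sums.get(t, 0.0) + xco2
            counts[t] = counts.get(t, 0) + 1
    return {t: sums[t] / counts[t] for t in sums}

# hypothetical satellite retrievals: two inside the region, one outside
obs = [("2010-01", 10, 20, 388.0), ("2010-01", 12, 22, 390.0),
       ("2010-02", 11, 21, 391.0), ("2010-01", 60, 20, 400.0)]
print(regional_mean_series(obs, (0, 30), (0, 40)))
# {'2010-01': 389.0, '2010-02': 391.0}
```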

  5. MATISSE a web-based tool to access, visualize and analyze high resolution minor bodies observation

    Science.gov (United States)

    Zinzi, Angelo; Capria, Maria Teresa; Palomba, Ernesto; Antonelli, Lucio Angelo; Giommi, Paolo

    2016-07-01

    In recent years, planetary exploration missions have acquired data from minor bodies (i.e., dwarf planets, asteroids and comets) at a level of detail never reached before. Since these objects often present very irregular shapes (as in the case of comet 67P/Churyumov-Gerasimenko, target of the ESA Rosetta mission), "classical" bidimensional projections of observations are difficult to understand. With the aim of providing the scientific community a tool to access, visualize and analyze data in a new way, the ASI Science Data Center started to develop MATISSE (Multi-purposed Advanced Tool for the Instruments for the Solar System Exploration - http://tools.asdc.asi.it/matisse.jsp) in late 2012. This tool allows 3D web-based visualization of data acquired by planetary exploration missions: the output can be either the straightforward projection of the selected observation over the shape model of the target body or the visualization of a high-order product (average/mosaic, difference, ratio, RGB) computed directly online with MATISSE. Standard outputs of the tool also comprise downloadable files to be used with GIS software (GeoTIFF and ENVI formats) and very high-resolution 3D files to be viewed with the free software Paraview. During this period the first and most frequent use of the tool has been the visualization of data acquired by the VIRTIS-M instrument onboard Rosetta observing comet 67P. The success of this task, well represented by the number of published works using images made with MATISSE, confirmed the need for a different approach to correctly visualize data coming from irregularly shaped bodies. In the near future the datasets available to MATISSE will be extended, starting with the addition of VIR-Dawn observations of both Vesta and Ceres, and standard protocols will also be used to access data stored in external repositories, such as NASA ODE and the Planetary VO.
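
    The high-order products mentioned in the abstract (average, difference, ratio) are per-pixel combinations of two co-registered observations. A small illustrative sketch, not MATISSE's actual implementation:

```python
def high_order_product(a, b, mode):
    """Combine two co-registered observation arrays pixel by pixel,
    mirroring the average/difference/ratio products computed online."""
    ops = {
        "average": lambda x, y: (x + y) / 2.0,
        "difference": lambda x, y: x - y,
        "ratio": lambda x, y: x / y if y != 0 else float("nan"),
    }
    op = ops[mode]
    return [[op(x, y) for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

# two tiny hypothetical 2x2 observations of the same surface patch
a = [[2.0, 4.0], [6.0, 8.0]]
b = [[1.0, 4.0], [3.0, 2.0]]
print(high_order_product(a, b, "ratio"))  # [[2.0, 1.0], [2.0, 4.0]]
```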

  6. Analyzing Web 2.0 Integration with Next Generation Networks for Services Rendering

    Directory of Open Access Journals (Sweden)

    Kamaljit I. Lakhtaria

    2010-08-01

    Full Text Available The Next Generation Network (NGN) aims to integrate IP-based telecom infrastructures and provide the most advanced, high-speed, emerging value-added services. NGN is capable of providing highly innovative services that integrate communication and Web services into a single platform. The IP Multimedia Subsystem (IMS), a leading NGN technology, enables a variety of NGN-compliant communications services to interoperate while being accessed through different kinds of access networks, preferably broadband. Both consumer and corporate users of IMS-NGN services are by now used to accessing services, even communications services, through the web and through web-based communities and social networks. It is key to the success of IMS-based services that they be provided with efficient web access, so that users can benefit from those new services through web-based applications and user interfaces, not only through NGN-IMS user equipment and the SIP protocol. Many services under planning can be provided only through the convergence of IMS and Web 2.0. Convergence between Web 2.0 and NGN-IMS creates and serves newly invented, innovative, entertaining and informative, as well as user-centric, services and applications. These services merge features from the WWW and communications worlds. On the one hand, interactivity, ubiquity, social orientation, user participation, content generation, etc. are relevant characteristics coming from Web 2.0 services. In parallel, IMS enables services including multimedia telephony, media sharing (video-audio), instant messaging with presence and context, online directory, etc., all of them applicable to mobile, fixed or convergent telecom networks. This paper brings out the benefits of adopting Web 2.0 technologies for telecom services; as services today are mainly driven by users' needs, it also proposes the concept of a unique customizable service interface.

  7. Retrieving top-k prestige-based relevant spatial web objects

    DEFF Research Database (Denmark)

    Cao, Xin; Cong, Gao; Jensen, Christian S.

    2010-01-01

    The location-aware keyword query returns ranked objects that are near a query location and that have textual descriptions that match query keywords. This query occurs inherently in many types of mobile and traditional web services and applications, e.g., Yellow Pages and Maps services. Previous...... of prestige-based relevance to capture both the textual relevance of an object to a query and the effects of nearby objects. Based on this, a new type of query, the Location-aware top-k Prestige-based Text retrieval (LkPT) query, is proposed that retrieves the top-k spatial web objects ranked according...... to both prestige-based relevance and location proximity. We propose two algorithms that compute LkPT queries. Empirical studies with real-world spatial data demonstrate that LkPT queries are more effective in retrieving web objects than a previous approach that does not consider the effects of nearby...
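
    The record elides the exact LkPT definition, but the core idea, combining an object's own text relevance, a distance-weighted boost from relevant neighbors (its "prestige"), and proximity to the query location, can be sketched as follows. The scoring weights and exponential decay below are illustrative assumptions, not the published model:

```python
import math

def dist(p, q):
    """Euclidean distance between two 2D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def lkpt_score(obj, query_loc, query_terms, neighbors, alpha=0.5, radius=1.0):
    """Simplified prestige-based score: an object's text relevance is boosted
    by relevant nearby objects, then blended with location proximity."""
    def text_rel(o):
        return len(set(o["terms"]) & set(query_terms)) / len(query_terms)
    prestige = text_rel(obj) + sum(
        text_rel(n) * math.exp(-dist(obj["loc"], n["loc"]) / radius)
        for n in neighbors)
    proximity = 1.0 / (1.0 + dist(obj["loc"], query_loc))
    return alpha * prestige + (1 - alpha) * proximity

query = (0.0, 0.0)
terms = ["seafood", "restaurant"]
cafe = {"terms": ["seafood", "restaurant"], "loc": (1.0, 0.0)}
good_neighbors = [{"terms": ["seafood"], "loc": (1.1, 0.0)}]
bad_neighbors = [{"terms": ["laundry"], "loc": (1.1, 0.0)}]
# the same cafe ranks higher when its neighborhood is relevant to the query
print(lkpt_score(cafe, query, terms, good_neighbors) >
      lkpt_score(cafe, query, terms, bad_neighbors))  # True
```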

  8. Analyzing spatial patterns linked to the ecology of herbivores and their natural enemies in the soil

    Directory of Open Access Journals (Sweden)

    Raquel Campos-Herrera

    2013-09-01

    Full Text Available Modern agricultural systems can benefit from the application of concepts and models from applied ecology. When understood, multitrophic interactions among plants, pests, diseases and their natural enemies can be exploited to increase crop production and reduce undesirable environmental impacts. Although the understanding of subterranean ecology is rudimentary compared to the perspective aboveground, technologies today vastly reduce traditional obstacles to studying cryptic communities. Here we emphasize advantages to integrating as much as possible the use of these methods in order to leverage the information gained from studying communities of soil organisms. PCR–based approaches to identify and quantify species (real time qPCR and new generation sequencing greatly expand the ability to investigate food web interactions because there is less need for wide taxonomic expertise within research programs. Improved methods to capture and measure volatiles in the soil atmosphere in situ make it possible to detect and study chemical cues that are critical to communication across trophic levels. The application of SADIE to directly assess rather than infer spatial patterns in belowground agroecosystems has improved the ability to characterize relationships between organisms in space and time. We review selected methodology and use of these tools and describe some of the ways they were integrated to study soil food webs in Florida citrus orchards with the goal of developing new biocontrol approaches.

  9. The Implementation of a Cost Effectiveness Analyzer for Web-Supported Academic Instruction: An Example from Life Science

    Science.gov (United States)

    Cohen, Anat; Nachmias, Rafi

    2012-01-01

    This paper describes the implementation of a quantitative cost effectiveness analyzer for Web-supported academic instruction that was developed in our university. The paper presents the cost effectiveness analysis of one exemplary academic course in the Life Science department and its introduction to the course lecturer for evaluation. The benefits and…

  10. A New Approach to Analyzing HST Spatial Scans: The Transmission Spectrum of HD 209458 b

    Science.gov (United States)

    Tsiaras, A.; Waldmann, I. P.; Rocchetto, M.; Varley, R.; Morello, G.; Damiano, M.; Tinetti, G.

    2016-12-01

    The Wide Field Camera 3 on the Hubble Space Telescope is currently one of the most widely used instruments for observing exoplanetary atmospheres, especially with the use of the spatial scanning technique. An increasing number of exoplanets have been studied using this technique as it enables the observation of bright targets without saturating the sensitive detectors. In this work, we present a new pipeline for analyzing the data obtained with the spatial scanning technique, starting from the raw data provided by the instrument. In addition to commonly used correction techniques, we take into account the geometric distortions of the instrument, the impact of which may become important when they are combined with the scanning process. Our approach can improve the photometric precision for existing data and also extend the limits of the spatial scanning technique, as it allows the analysis of even longer spatial scans. As an application of our method and pipeline, we present the results from a reanalysis of the spatially scanned transit spectrum of HD 209458 b. We calculate the transit depth per wavelength channel with an average relative uncertainty of 40 ppm. We interpret the final spectrum with T-REx, our fully Bayesian spectral retrieval code, which confirms the presence of water vapor and clouds in the atmosphere of HD 209458 b. The narrow wavelength range limits our ability to disentangle the degeneracies between the fitted atmospheric parameters. Additional data over a broader spectral range are needed to address this issue.

  11. Spatial Search Techniques for Mobile 3D Queries in Sensor Web Environments

    Directory of Open Access Journals (Sweden)

    James D. Carswell

    2013-03-01

    Full Text Available Developing mobile geo-information systems for sensor web applications involves technologies that can access linked geographical and semantically related Internet information. Additionally, in tomorrow’s Web 4.0 world, it is envisioned that trillions of inexpensive micro-sensors placed throughout the environment will also become available for discovery based on their unique geo-referenced IP address. Exploring these enormous volumes of disparate heterogeneous data on today’s location and orientation aware smartphones requires context-aware smart applications and services that can deal with “information overload”. 3DQ (Three Dimensional Query is our novel mobile spatial interaction (MSI prototype that acts as a next-generation base for human interaction within such geospatial sensor web environments/urban landscapes. It filters information using “Hidden Query Removal” functionality that intelligently refines the search space by calculating the geometry of a three dimensional visibility shape (Vista space at a user’s current location. This 3D shape then becomes the query “window” in a spatial database for retrieving information on only those objects visible within a user’s actual 3D field-of-view. 3DQ reduces information overload and serves to heighten situation awareness on constrained commercial off-the-shelf devices by providing visibility space searching as a mobile web service. The effects of variations in mobile spatial search techniques in terms of query speed vs. accuracy are evaluated and presented in this paper.
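
    The "Hidden Query Removal" idea, retrieving only objects inside the user's visibility shape, can be illustrated in 2D with a simple field-of-view wedge filter. This is a simplified stand-in for 3DQ's full 3D Vista-space geometry, with illustrative parameter names:

```python
import math

def in_field_of_view(user, heading_deg, fov_deg, max_range, point):
    """True if point lies within max_range of the user and within
    +/- fov/2 of the viewing heading (0 degrees = north)."""
    dx, dy = point[0] - user[0], point[1] - user[1]
    if math.hypot(dx, dy) > max_range:
        return False
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    diff = abs((bearing - heading_deg + 180) % 360 - 180)
    return diff <= fov_deg / 2

# hypothetical objects around a user at the origin facing north
points = [(0, 5), (0, -5), (3, 4)]
visible = [p for p in points if in_field_of_view((0, 0), 0, 90, 10, p)]
print(visible)  # [(0, 5), (3, 4)] -- the point behind the user is filtered out
```

    A spatial database would evaluate the same predicate as a geometric intersection between the visibility shape and indexed object footprints rather than point by point.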

  12. Web-Based Interactive System for Analyzing Achievement Gaps in Public Schools System

    Science.gov (United States)

    Wang, Kening; Mulvenon, Sean W.; Stegman, Charles; Xia, Yanling

    2010-01-01

    The National Office for Research on Measurement and Evaluation Systems (NORMES) at the University of Arkansas developed a web-based interactive system to provide information on state, district, and school level achievement gaps between white students and black students, socioeconomically disadvantaged students and non-disadvantaged students, male…

  13. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    Science.gov (United States)

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on tuberculosis (TB) cases registered in Barcelona during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web
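
    A spatial density map of the kind such a geoprocessing service returns can be sketched as a kernel density surface over geocoded case locations. The Gaussian kernel and grid layout here are illustrative, not the paper's exact method:

```python
import math

def density_grid(cases, xmin, ymin, cell, nx, ny, bandwidth):
    """Gaussian-kernel density surface over geocoded case locations:
    each grid cell accumulates a distance-weighted contribution from
    every case, yielding a raster suitable for a map overlay."""
    grid = [[0.0] * nx for _ in range(ny)]
    for gy in range(ny):
        for gx in range(nx):
            cx = xmin + (gx + 0.5) * cell  # cell-centre coordinates
            cy = ymin + (gy + 0.5) * cell
            for x, y in cases:
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                grid[gy][gx] += math.exp(-d2 / (2 * bandwidth ** 2))
    return grid

# two hypothetical geocoded cases on a 2x2 grid
cases = [(1.0, 1.0), (1.2, 0.9)]
g = density_grid(cases, 0, 0, 1.0, 2, 2, 0.5)
print([[round(v, 3) for v in row] for row in g])
```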

  14. SynergyFinder: a web application for analyzing drug combination dose-response matrix data.

    Science.gov (United States)

    Ianevski, Aleksandr; He, Liye; Aittokallio, Tero; Tang, Jing

    2017-08-01

    Rational design of drug combinations has become a promising strategy to tackle the drug sensitivity and resistance problem in cancer treatment. To systematically evaluate the pre-clinical significance of pairwise drug combinations, functional screening assays that probe combination effects in a dose-response matrix assay are commonly used. To facilitate the analysis of such drug combination experiments, we implemented a web application that uses key functions of R-package SynergyFinder, and provides not only the flexibility of using multiple synergy scoring models, but also a user-friendly interface for visualizing the drug combination landscapes in an interactive manner. The SynergyFinder web application is freely accessible at https://synergyfinder.fimm.fi ; The R-package and its source-code are freely available at http://bioconductor.org/packages/release/bioc/html/synergyfinder.html . jing.tang@helsinki.fi.
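
    Among the synergy scoring models SynergyFinder supports, Bliss independence is the simplest to state: for fractional inhibitions, the expected combination effect of two independent drugs is E_A + E_B - E_A*E_B, and the synergy score is the observed effect minus this expectation. A sketch for a dose-response matrix (SynergyFinder itself is an R package; this standalone Python version is only illustrative):

```python
def bliss_synergy_matrix(observed, resp_a, resp_b):
    """Bliss-independence synergy scores for a dose-response matrix.
    observed[i][j]: measured fractional inhibition (0..1) of drug A dose i
    combined with drug B dose j; resp_a/resp_b: single-agent inhibitions.
    Score > 0 suggests synergy, < 0 antagonism. (SynergyFinder also offers
    HSA, Loewe and ZIP models; this sketch covers Bliss only.)"""
    return [[observed[i][j] - (resp_a[i] + resp_b[j] - resp_a[i] * resp_b[j])
             for j in range(len(resp_b))] for i in range(len(resp_a))]

# hypothetical 2x2 screen: single-agent responses and combination responses
resp_a = [0.2, 0.5]
resp_b = [0.1, 0.4]
observed = [[0.30, 0.55], [0.60, 0.80]]
print(bliss_synergy_matrix(observed, resp_a, resp_b))
```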

  15. A web application for recording and analyzing the clinical experiences of nursing students.

    Science.gov (United States)

    Meyer, Linda; Sedlmeyer, Robert; Carlson, Cathy; Modlin, Susan

    2003-01-01

    A primary focus in nursing education is to provide students with a diverse range of clinical experiences. Historically, the collection and assessment of data from students' clinical experiences have been paper-and-pencil tasks that are arduous for both students and nursing faculty. The volume of collected information also has made it difficult to produce ad hoc statistical reports without additional intensive manual labor. To facilitate recording and analysis of these data, the Nursing and Computer Science Departments at Indiana University-Purdue University Fort Wayne have collaborated to create a Web application: Essential Clinical Behaviors. The use of the Web-accessible database represents a major change in nursing education by alteration of format used by students to record their clinical experiences in nursing courses. The application was designed to enhance nursing students' learning and to assist faculty in making student assignments, evaluating student progress, and supporting curriculum decisions. This report discusses the rationale for the development of the Web application, a description of its data entry and reporting mechanisms, an overview of the system architecture, its use in the nursing curriculum, and planned enhancements.

  16. Spatial learning affects thread tension control in orb-web spiders.

    Science.gov (United States)

    Nakata, Kensuke

    2013-08-23

    Although it is well known that spatial learning can be important in the biology of predators that actively move around in search for food, comparatively little is known about ways in which spatial learning might function in the strategies of sit-and-wait predators. In this study, Cyclosa octotuberculata, an orb-web spider that uses its legs to contract radial threads of its web to increase thread tension, was trained to capture prey in limited web sectors. After training, spiders that had captured prey in horizontal web sectors applied more tension on radial threads connected to horizontal sectors than spiders that had captured prey in vertical sectors. This result suggests that the effect of experience on C. octotuberculata's behaviour is not expressed in the way the trained spider responds to prey-derived stimuli and is instead expressed in behaviour by which the spider anticipates the likely direction from which prey will arrive in the future. This illustrates that learning can be important even when the predator remains in one location during foraging bouts.

  17. Integrated coastal management, marine spatial data infrastructures, and semantic web services

    OpenAIRE

    Cömert, Çetin; Ulutaş, Deniztan; Akıncı, Halil; Kara, Gülten

    2008-01-01

    The aim of this work was to get acquainted with semantic web services (SWS) and assess their potential for implementing the technical interoperability infrastructure of Spatial Data Infrastructures (SDIs). SDIs are a widely accepted way of enabling collaboration among various parties, allowing the sharing of each other's "data" and "services". Collaboration is indispensable given either the business model or other requirements, such as that of "Sustainable Development", of the day. SDIs can be ...

  18. Development of a Spatial Decision Support System for Analyzing Changes in Hydro-meteorological Risk

    Science.gov (United States)

    van Westen, Cees

    2013-04-01

    In the framework of the EU FP7 Marie Curie ITN Network "CHANGES: Changing Hydro-meteorological Risks, as Analyzed by a New Generation of European Scientists (http://www.changes-itn.eu)", a spatial decision support system (SDSS) is under development with the aim of analyzing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. The SDSS is one of the main outputs of the CHANGES network, which will develop an advanced understanding of how global changes, related to environmental and climate change as well as socio-economic change, may affect the temporal and spatial patterns of hydro-meteorological hazards and associated risks in Europe; how these changes can be assessed, modeled, and incorporated in sustainable risk management strategies, focusing on spatial planning, emergency preparedness and risk communication. The CHANGES network consists of 11 full partners and 6 associate partners, of which 5 are private companies, representing 10 European countries. The CHANGES network has hired 12 Early Stage Researchers (ESRs) and is currently hiring 3-6 more researchers for the implementation of the SDSS. The Spatial Decision Support System will be composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and asset maps) to quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves) resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and
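
    Risk curves of the kind produced by the Risk Assessment component relate annual exceedance probability to loss; integrating under such a curve gives an average annual loss that can be compared across risk reduction alternatives. A hedged sketch (the trapezoidal integration and data layout are illustrative, not the SDSS's actual method):

```python
def average_annual_loss(risk_curve):
    """Approximate average annual loss from a risk curve given as
    (annual exceedance probability, loss) points sorted by decreasing
    probability, using trapezoidal integration over probability."""
    aal = 0.0
    for (p1, l1), (p2, l2) in zip(risk_curve, risk_curve[1:]):
        aal += (p1 - p2) * (l1 + l2) / 2.0
    return aal

# hypothetical curve: frequent small losses, rare large ones
curve = [(0.1, 0.0), (0.01, 100.0), (0.001, 1000.0)]
print(average_annual_loss(curve))
```

    Recomputing this figure for each planning alternative's risk curve is one way such a component can rank alternatives.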

  19. A Cloud-enabled Service-oriented Spatial Web Portal for Facilitating Arctic Data Discovery, Integration, and Utilization

    Science.gov (United States)

    dias, S. B.; Yang, C.; Li, Z.; XIA, J.; Liu, K.; Gui, Z.; Li, W.

    2013-12-01

    Global climate change has become one of the biggest concerns for humankind in the 21st century due to its broad impacts on society and ecosystems across the world. The Arctic has been observed to be one of the regions most vulnerable to climate change. In order to understand the impacts of climate change on the natural environment, ecosystems, and biodiversity in the Arctic region, and thus to better support the planning and decision-making process, cross-disciplinary research is required to monitor and analyze changes in the Arctic region such as water, sea level, and biodiversity. Conducting such research demands the efficient utilization of various geospatially referenced data, web services and information related to the Arctic region. In this paper, we propose a cloud-enabled and service-oriented Spatial Web Portal (SWP) to support the discovery, integration and utilization of Arctic-related geospatial resources, serving as a building block of polar CI. This SWP leverages the following techniques: 1) a hybrid searching mechanism combining centralized local search, distributed catalogue search and specialized Internet search for effectively discovering Arctic data and web services from multiple sources; 2) a service-oriented, quality-enabled framework for seamless integration and utilization of various geospatial resources; and 3) a cloud-enabled parallel spatial index building approach to facilitate near-real-time resource indexing and searching. A proof-of-concept prototype is developed to demonstrate the feasibility of the proposed SWP, using the example of analyzing Arctic snow cover change over the past 50 years.

  20. shinyGEO: a web-based application for analyzing gene expression omnibus datasets.

    Science.gov (United States)

    Dumas, Jasmine; Gargano, Michael A; Dancik, Garrett M

    2016-12-01

    The Gene Expression Omnibus (GEO) is a public repository of gene expression data. Although GEO has its own tool, GEO2R, for data analysis, evaluation of single genes is not straightforward and survival analysis in specific GEO datasets is not possible without bioinformatics expertise. We describe a web application, shinyGEO, that allows a user to download gene expression data sets directly from GEO in order to perform differential expression and survival analysis for a gene of interest. In addition, shinyGEO supports customized graphics, sample selection, data export and R code generation so that all analyses are reproducible. The availability of shinyGEO makes GEO datasets more accessible to non-bioinformaticians, promising to lead to better understanding of biological processes and genetic diseases such as cancer. Web application and source code are available from http://gdancik.github.io/shinyGEO/ CONTACT: dancikg@easternct.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
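
    The per-gene differential expression test that shinyGEO exposes can be illustrated with a two-group comparison such as Welch's t statistic (shinyGEO itself is an R/Shiny application; this standalone Python version, with hypothetical expression values, is only a sketch of the idea):

```python
import math

def welch_t(group1, group2):
    """Welch's t statistic for one gene's expression in two sample groups,
    the kind of per-gene differential-expression comparison run per dataset."""
    def mean_var(xs):
        m = sum(xs) / len(xs)
        v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance
        return m, v
    m1, v1 = mean_var(group1)
    m2, v2 = mean_var(group2)
    return (m1 - m2) / math.sqrt(v1 / len(group1) + v2 / len(group2))

# hypothetical log2 expression of one gene in tumor vs normal samples
tumor = [8.1, 8.4, 8.0, 8.3]
normal = [6.9, 7.2, 7.0, 7.1]
print(round(welch_t(tumor, normal), 2))
```

    A p-value would then come from the t distribution with Welch-Satterthwaite degrees of freedom, which a statistics library supplies.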

  1. Remote sensing, geographical information systems, and spatial modeling for analyzing public transit services

    Science.gov (United States)

    Wu, Changshan

    Public transit service is a promising transportation mode because of its potential to address urban sustainability. Current ridership of public transit, however, is very low in most urban regions, particularly those in the United States. This woeful transit ridership can be attributed to many factors, among which poor service quality is key. Given this, there is a need for transit planning and analysis to improve service quality. Traditionally, spatially aggregate data are utilized in transit analysis and planning. Examples include data associated with the census, zip codes, states, etc. Few studies, however, address the influences of spatially aggregate data on transit planning results. In this research, previous studies in transit planning that use spatially aggregate data are reviewed. Next, problems associated with the utilization of aggregate data, the so-called modifiable areal unit problem (MAUP), are detailed and the need for fine resolution data to support public transit planning is argued. Fine resolution data is generated using intelligent interpolation techniques with the help of remote sensing imagery. In particular, impervious surface fraction, an important socio-economic indicator, is estimated through a fully constrained linear spectral mixture model using Landsat Enhanced Thematic Mapper Plus (ETM+) data within the metropolitan area of Columbus, Ohio in the United States. Four endmembers, low albedo, high albedo, vegetation, and soil are selected to model heterogeneous urban land cover. Impervious surface fraction is estimated by analyzing low and high albedo endmembers. With the derived impervious surface fraction, three spatial interpolation methods, spatial regression, dasymetric mapping, and cokriging, are developed to interpolate detailed population density. Results suggest that cokriging applied to impervious surface is a better alternative for estimating fine resolution population density. With the derived fine resolution data, a multiple
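
    The fully constrained linear spectral mixture model expresses each pixel's spectrum as a fractional combination of endmember signatures. A simplified sketch using sum-to-one constrained least squares (the study's model also enforces non-negativity; the weighted-constraint trick below is a common approximation, not the study's exact solver, and the signatures are hypothetical):

```python
import numpy as np

def unmix(pixel, endmembers, weight=100.0):
    """Sum-to-one constrained least squares: solve pixel ~= E @ f for
    endmember fractions f, with sum(f) = 1 appended as a heavily
    weighted extra equation."""
    E = np.vstack([endmembers.T, weight * np.ones(endmembers.shape[0])])
    y = np.append(pixel, weight)
    f, *_ = np.linalg.lstsq(E, y, rcond=None)
    return f

# hypothetical 2-band signatures: rows are vegetation and soil endmembers
em = np.array([[0.1, 0.6],
               [0.4, 0.3]])
px = 0.5 * em[0] + 0.5 * em[1]  # a pixel that is half vegetation, half soil
print(unmix(px, em))
```

    In the cited study four endmembers (low albedo, high albedo, vegetation, soil) play the role of `em`, and impervious surface fraction is derived from the low- and high-albedo fractions.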

  2. Spatial data integration for analyzing the dynamics of Albanian Adriatic shoreline

    Science.gov (United States)

    Arapi, Luan; Nikolli, Pal; Kovaçi, Sander

    2016-04-01

    Shoreline mapping and shoreline change detection are critical subjects for coastal resource management, coastal environmental protection and sustainable coastal development and planning. Coastal changes are attracting more focus since they are important environmental indicators that directly impact coastal economic development and land management. Changes in the shape of the shoreline may essentially affect the environment of the coastal zone. These may be caused by natural processes and human activities. This work focuses on analyzing the dynamics of the Adriatic shoreline, using spatial temporal data, by taking advantage of Geographic Information System (GIS) and Remote Sensing (RS) techniques. Shoreline mapping focuses on some specific issues, such as the mapping methods used to acquire shoreline data, the models and database design used to represent the shoreline in the spatial database, and shoreline-change analysis methods. The study area extends from the mouth of the Buna River in the north to Vlora Bay in the south, covering a total length of about 220 km. Detection and future assessment of the spatial position of the Albanian Adriatic shoreline is carried out through integration of multi-scale spatial temporal data and different processing methods. We have combined topographic maps at different scales (1:75 000, 1918; 1:50 000, 1937; 1:25 000, 1960, 1986 and 1:10 000, 1995), digital aerial photographs from 2007, Landsat TM and Landsat ETM+ satellite images, and field-observed GIS data. Generation of spatial data is carried out through a vectorization process and image processing. Monitoring the dynamics of shoreline position change requires understanding coastal processes as well as coastal mapping methods. The net rates of variation in the position of the shoreline are calculated along transects disposed perpendicularly to the baseline and spaced equally along the coast. Analysis of the relative impact of natural factors and human activities is fundamental
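The transect-based rate described above is commonly computed as an End Point Rate: the net movement of the shoreline along a transect divided by the elapsed time. A minimal sketch, with hypothetical baseline distances and dates:

```python
from datetime import date

def end_point_rate(d_old, d_new, t_old, t_new):
    """End Point Rate (m/yr) along one transect: net shoreline movement
    divided by elapsed years. Distances are measured from a fixed
    baseline; a negative rate indicates shoreline retreat."""
    years = (t_new - t_old).days / 365.25
    return (d_new - d_old) / years

# hypothetical distances (m) from the baseline to the shoreline
# on one transect, comparing the 1918 map with the 2007 photographs
rate = end_point_rate(120.0, 84.0, date(1918, 1, 1), date(2007, 1, 1))
print(f"{rate:.2f} m/yr")
```

Repeating this over equally spaced transects yields the along-coast rate profile the record refers to.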

  3. Spatial guilds in the Serengeti food web revealed by a Bayesian group model.

    Directory of Open Access Journals (Sweden)

    Edward B Baskerville

    2011-12-01

    Full Text Available Food webs, networks of feeding relationships in an ecosystem, provide fundamental insights into mechanisms that determine ecosystem stability and persistence. A standard approach in food-web analysis, and network analysis in general, has been to identify compartments, or modules, defined by many links within compartments and few links between them. This approach can identify large habitat boundaries in the network but may fail to identify other important structures. Empirical analyses of food webs have been further limited by low-resolution data for primary producers. In this paper, we present a Bayesian computational method for identifying group structure using a flexible definition that can describe both functional trophic roles and standard compartments. We apply this method to a newly compiled plant-mammal food web from the Serengeti ecosystem that includes high taxonomic resolution at the plant level, allowing a simultaneous examination of the signature of both habitat and trophic roles in network structure. We find that groups at the plant level reflect habitat structure, coupled at higher trophic levels by groups of herbivores, which are in turn coupled by carnivore groups. Thus the group structure of the Serengeti web represents a mixture of trophic guild structure and spatial pattern, in contrast to the standard compartments typically identified. The network topology supports recent ideas on spatial coupling and energy channels in ecosystems that have been proposed as important for persistence. Furthermore, our Bayesian approach provides a powerful, flexible framework for the study of network structure, and we believe it will prove instrumental in a variety of biological contexts.

  4. Comparison of GoPubMed and Analyze Results, an analysis tool of Web of Knowledge

    Institute of Scientific and Technical Information of China (English)

    张玢; 许培扬; 王敏; 马明; 栗文靖

    2009-01-01

    GoPubMed is an ontology-based biomedical literature analysis tool that can perform comprehensive statistical analysis of retrieved PubMed records. The Analyze Results feature of the Web of Knowledge platform likewise provides statistical analysis of search results. This paper compares and contrasts the characteristics of these two literature analysis tools.

  5. ATGC transcriptomics: a web-based application to integrate, explore and analyze de novo transcriptomic data.

    Science.gov (United States)

    Gonzalez, Sergio; Clavijo, Bernardo; Rivarola, Máximo; Moreno, Patricio; Fernandez, Paula; Dopazo, Joaquín; Paniego, Norma

    2017-02-22

    In recent years, applications based on massively parallelized RNA sequencing (RNA-seq) have become valuable approaches for studying non-model species, i.e., those without a fully sequenced genome. RNA-seq is a useful tool for detecting novel transcripts and genetic variations and for evaluating differential gene expression by digital measurements. The large and complex datasets resulting from functional genomic experiments represent a challenge in data processing, management, and analysis. This problem is especially significant for small research groups working with non-model species. We developed a web-based application, called ATGC transcriptomics, with a flexible and adaptable interface that allows users to work with next-generation sequencing (NGS) transcriptomic analysis results using an ontology-driven database. This new application simplifies data exploration, visualization, and integration for a better comprehension of the results. ATGC transcriptomics provides non-expert computer users and small research groups with access to a scalable storage option and simple data integration, including database administration and management. The software is freely available under the terms of the GNU public license at http://atgcinta.sourceforge.net .

  6. A user-friendly web portal for analyzing conformational changes in structures of Mycobacterium tuberculosis.

    Science.gov (United States)

    Hassan, Sameer; Thangam, Manonanthini; Vasudevan, Praveen; Kumar, G Ramesh; Unni, Rahul; Devi, P K Gayathri; Hanna, Luke Elizabeth

    2015-10-01

    Initiation of the Tuberculosis Structural Consortium has resulted in the expansion of the Mycobacterium tuberculosis (MTB) protein structural database. Currently, 969 experimentally solved structures are available for 354 MTB proteins. This includes multiple crystal structures for a given protein under different functional conditions, such as the presence of different ligands or mutations. In-depth analysis of the multiple structures reveals that subtle differences exist in the conformations of a given protein under varied conditions. Therefore, it is immensely important to understand the conformational differences between the multiple structures of a given protein in order to select the most suitable structure for molecular docking and structure-based drug design. Here, we introduce a web portal ( http://bmi.icmr.org.in/mtbsd/torsion.php ) that we developed to provide comparative data on the ensemble of available structures of MTB proteins, such as Cα root mean square deviation (RMSD), sequence identity, presence of mutations and torsion angles. Additionally, torsion angles were used to perform principal component analysis (PCA) to identify the conformational differences between the structures. Finally, we present a few case studies to demonstrate this database. Graphical Abstract Conformational changes seen in the structures of the enoyl-ACP reductase protein encoded by the Mycobacterial gene inhA.
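PCA over torsion angles, as used in this record, is usually done on the sine/cosine embedding of each angle, since dihedrals are circular quantities ("dihedral PCA"). A minimal sketch with a toy ensemble (the angle values are invented, not from the MTB database):

```python
import numpy as np

def dihedral_pca(angles_deg, n_components=2):
    """PCA over torsion angles for an ensemble of structures.

    angles_deg: (n_structures, n_angles) array of torsion angles in degrees.
    Angles are circular, so each is embedded as (sin, cos) before PCA.
    Returns projections onto the top principal components.
    """
    rad = np.deg2rad(angles_deg)
    X = np.hstack([np.sin(rad), np.cos(rad)])
    Xc = X - X.mean(axis=0)
    # principal axes via SVD of the centered data matrix
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# toy ensemble: 5 "structures", 3 torsion angles each
ensemble = np.array([[-60,  45, 180],
                     [-58,  47, 178],
                     [-61,  44, 179],
                     [ 60, -45,   0],   # a distinct conformation
                     [ 62, -44,   2]])
proj = dihedral_pca(ensemble)
print(proj.shape)
```

Structures sharing a conformation cluster together in the projected space, which is how PCA exposes the conformational differences the portal reports.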

  7. Efficient Retrieval of the Top-k Most Relevant Spatial Web Objects

    DEFF Research Database (Denmark)

    Cong, Gao; Jensen, Christian Søndergaard; Wu, Dingming

    2009-01-01

    The conventional Internet is acquiring a geo-spatial dimension. Web documents are being geo-tagged, and geo-referenced objects such as points of interest are being associated with descriptive text documents. The resulting fusion of geo-location and documents enables a new kind of top-k query...... that takes into account both location proximity and text relevancy. To our knowledge, only naive techniques exist that are capable of computing a general web information retrieval query while also taking location into account. This paper proposes a new indexing framework for location-aware top-k text...... both text relevancy and location proximity to prune the search space. Results of empirical studies with an implementation of the framework demonstrate that the paper’s proposal offers scalability and is capable of excellent performance....
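The query the record describes combines text relevancy with location proximity in a single score. The naive linear scan below illustrates the ranking function only; the paper's contribution is an indexing framework that prunes this search space. The weighting scheme, object fields and numbers are illustrative assumptions:

```python
import math
import heapq

def topk_spatial_text(objects, query_loc, k=2, alpha=0.5, d_max=10.0):
    """Rank spatial web objects by a weighted sum of text relevancy and
    location proximity. alpha balances the two components; d_max
    normalizes distance to [0, 1]. Naive O(n) scan over all objects."""
    def score(obj):
        dist = math.hypot(obj["x"] - query_loc[0], obj["y"] - query_loc[1])
        proximity = max(0.0, 1.0 - dist / d_max)
        return alpha * obj["relevancy"] + (1 - alpha) * proximity
    return heapq.nlargest(k, objects, key=score)

# hypothetical points of interest with precomputed text relevancy scores
pois = [
    {"name": "cafe A", "x": 1, "y": 1, "relevancy": 0.90},
    {"name": "cafe B", "x": 9, "y": 9, "relevancy": 0.95},
    {"name": "cafe C", "x": 2, "y": 0, "relevancy": 0.30},
]
best = topk_spatial_text(pois, query_loc=(0, 0), k=2)
print([o["name"] for o in best])
```

Note how "cafe B" loses despite the highest relevancy: it is too far from the query location, which is exactly the trade-off a location-aware top-k query captures.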

  8. Concept of a spatial data infrastructure for web-mapping, processing and service provision for geo-hazards

    Science.gov (United States)

    Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara

    2017-04-01

    for the possibility of rapid mapping. The server tier consists of Java-based web and GIS servers. Sub-services and main services are part of the service tier. Sub-services include, for example, map services, feature editing services, geometry services, geoprocessing services and metadata services. For (meta)data provision and to support data interoperability, OGC web standards and a REST interface are used. Four central main services are designed and developed: (1) a mapping service (including image segmentation and classification approaches), (2) a monitoring service to monitor changes over time, (3) a validation service to analyze landslide delineations from different sources and (4) an infrastructure service to identify affected landslides. The main services use and combine parts of the sub-services. Furthermore, a series of client applications based on new technology standards makes use of the data and services offered by the spatial data infrastructure. Next steps include extending the current spatial data infrastructure to other areas and geo-hazard types, in order to develop a spatial data infrastructure that can assist targeted mapping and monitoring of geo-hazards in a global context.

  9. Spatially Analyzing the Inequity of the Hong Kong Urban Heat Island by Socio-Demographic Characteristics.

    Science.gov (United States)

    Wong, Man Sing; Peng, Fen; Zou, Bin; Shi, Wen Zhong; Wilson, Gaines J

    2016-03-12

    Recent studies have suggested that some disadvantaged socio-demographic groups face serious environmental-related inequities in Hong Kong due to the rising ambient urban temperatures. Identifying heat-vulnerable groups and locating areas of Surface Urban Heat Island (SUHI) inequities is thus important for prioritizing interventions to mitigate death/illness rates from heat. This study addresses this problem by integrating methods of remote sensing retrieval, logistic regression modelling, and spatial autocorrelation. In this process, the SUHI effect was first estimated from the Land Surface Temperature (LST) derived from a Landsat image. With the scale assimilated to the SUHI and socio-demographic data, a logistic regression model was consequently adopted to ascertain their relationships based on Hong Kong Tertiary Planning Units (TPUs). Lastly, inequity "hotspots" were derived using spatial autocorrelation methods. Results show that disadvantaged socio-demographic groups were significantly more prone to be exposed to an intense SUHI effect: over half of 287 TPUs characterized by age groups of 60+ years, secondary and matriculation education attainment, widowed, divorced and separated, low and middle incomes, and certain occupation groups of workers, have significant Odds Ratios (ORs) larger than 1.2. It can be concluded that a clustering analysis stratified by age, income, educational attainment, marital status, and occupation is an effective way to detect the inequity hotspots of SUHI exposure. Additionally, inequities explored using income, marital status and occupation factors were more significant than the age and educational attainment in these areas. The derived maps and model can be further analyzed in urban/city planning, in order to mitigate the physical and social causes of the SUHI effect.
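The Odds Ratios this record reports come from logistic regression, but the quantity itself is easy to illustrate from a 2×2 exposure table. The counts below are hypothetical, chosen only to show how an OR larger than 1.2 and its confidence interval are computed:

```python
import math

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds ratio for a 2x2 table with a 95% Wald confidence interval.
    OR > 1 means the exposed group is more likely to fall in the
    high-SUHI class than the unexposed group."""
    or_ = (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)
    se = math.sqrt(1 / exposed_cases + 1 / exposed_noncases
                   + 1 / unexposed_cases + 1 / unexposed_noncases)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# hypothetical counts: TPUs with a high share of residents aged 60+,
# inside vs. outside the intense-SUHI zone
or_, ci = odds_ratio(90, 60, 50, 87)
print(f"OR = {or_:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

An OR whose confidence interval lies entirely above 1 would indicate a statistically significant excess exposure for that group.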

  10. Spatially Analyzing the Inequity of the Hong Kong Urban Heat Island by Socio-Demographic Characteristics

    Science.gov (United States)

    Wong, Man Sing; Peng, Fen; Zou, Bin; Shi, Wen Zhong; Wilson, Gaines J.

    2016-01-01

    Recent studies have suggested that some disadvantaged socio-demographic groups face serious environmental-related inequities in Hong Kong due to the rising ambient urban temperatures. Identifying heat-vulnerable groups and locating areas of Surface Urban Heat Island (SUHI) inequities is thus important for prioritizing interventions to mitigate death/illness rates from heat. This study addresses this problem by integrating methods of remote sensing retrieval, logistic regression modelling, and spatial autocorrelation. In this process, the SUHI effect was first estimated from the Land Surface Temperature (LST) derived from a Landsat image. With the scale assimilated to the SUHI and socio-demographic data, a logistic regression model was consequently adopted to ascertain their relationships based on Hong Kong Tertiary Planning Units (TPUs). Lastly, inequity “hotspots” were derived using spatial autocorrelation methods. Results show that disadvantaged socio-demographic groups were significantly more prone to be exposed to an intense SUHI effect: over half of 287 TPUs characterized by age groups of 60+ years, secondary and matriculation education attainment, widowed, divorced and separated, low and middle incomes, and certain occupation groups of workers, have significant Odds Ratios (ORs) larger than 1.2. It can be concluded that a clustering analysis stratified by age, income, educational attainment, marital status, and occupation is an effective way to detect the inequity hotspots of SUHI exposure. Additionally, inequities explored using income, marital status and occupation factors were more significant than the age and educational attainment in these areas. The derived maps and model can be further analyzed in urban/city planning, in order to mitigate the physical and social causes of the SUHI effect. PMID:26985899

  11. Spatially Analyzing the Inequity of the Hong Kong Urban Heat Island by Socio-Demographic Characteristics

    Directory of Open Access Journals (Sweden)

    Man Sing Wong

    2016-03-01

    Full Text Available Recent studies have suggested that some disadvantaged socio-demographic groups face serious environmental-related inequities in Hong Kong due to the rising ambient urban temperatures. Identifying heat-vulnerable groups and locating areas of Surface Urban Heat Island (SUHI) inequities is thus important for prioritizing interventions to mitigate death/illness rates from heat. This study addresses this problem by integrating methods of remote sensing retrieval, logistic regression modelling, and spatial autocorrelation. In this process, the SUHI effect was first estimated from the Land Surface Temperature (LST) derived from a Landsat image. With the scale assimilated to the SUHI and socio-demographic data, a logistic regression model was consequently adopted to ascertain their relationships based on Hong Kong Tertiary Planning Units (TPUs). Lastly, inequity “hotspots” were derived using spatial autocorrelation methods. Results show that disadvantaged socio-demographic groups were significantly more prone to be exposed to an intense SUHI effect: over half of 287 TPUs characterized by age groups of 60+ years, secondary and matriculation education attainment, widowed, divorced and separated, low and middle incomes, and certain occupation groups of workers, have significant Odds Ratios (ORs) larger than 1.2. It can be concluded that a clustering analysis stratified by age, income, educational attainment, marital status, and occupation is an effective way to detect the inequity hotspots of SUHI exposure. Additionally, inequities explored using income, marital status and occupation factors were more significant than the age and educational attainment in these areas. The derived maps and model can be further analyzed in urban/city planning, in order to mitigate the physical and social causes of the SUHI effect.

  12. Analyzing and Building Enterprise Applications Based on Web Service

    Institute of Scientific and Technical Information of China (English)

    郭清菊

    2009-01-01

    Web Service is a new platform for creating interoperable distributed applications. This paper analyzes the core technical standards of Web Service and its enterprise applications, and presents an enterprise-level application case built around Web Service, using a network conference management system as an example.

  13. Metacommunity composition of web-spiders in a fragmented neotropical forest: relative importance of environmental and spatial effects.

    Directory of Open Access Journals (Sweden)

    Ronei Baldissera

    Full Text Available The distribution of beta diversity is shaped by factors linked to environmental and spatial control. The relative importance of both processes in structuring spider metacommunities has not yet been investigated in the Atlantic Forest. The variance explained by purely environmental, spatially structured environmental, and purely spatial components was compared for a metacommunity of web spiders. The study was carried out in 16 patches of Atlantic Forest in southern Brazil. Field work was done in one landscape mosaic representing a slight gradient of urbanization. Environmental variables encompassed plot- and patch-level measurements and a climatic matrix, while principal coordinates of neighbor matrices (PCNMs) acted as spatial variables. A forward selection procedure was carried out to select environmental and spatial variables influencing web-spider beta diversity. Variation partitioning was used to estimate the contribution of pure environmental and pure spatial effects and their shared influence on beta-diversity patterns, and to estimate the relative importance of selected environmental variables. Three environmental variables (bush density, land use in the surroundings of patches, and shape of patches) and two spatial variables were selected by forward selection procedures. Variation partitioning revealed that 15% of the variation of beta diversity was explained by a combination of environmental and PCNM variables. Most of this variation (12%) corresponded to pure environmental and spatially environmental structure. The data indicated that (1) spatial legacy was not important in explaining the web-spider beta diversity; (2) environmental predictors explained a significant portion of the variation in web-spider composition; (3) one-third of environmental variation was due to a spatial structure that jointly explains variation in species distributions. We were able to detect important factors related to matrix management influencing the web

  14. Metacommunity composition of web-spiders in a fragmented neotropical forest: relative importance of environmental and spatial effects.

    Science.gov (United States)

    Baldissera, Ronei; Rodrigues, Everton N L; Hartz, Sandra M

    2012-01-01

    The distribution of beta diversity is shaped by factors linked to environmental and spatial control. The relative importance of both processes in structuring spider metacommunities has not yet been investigated in the Atlantic Forest. The variance explained by purely environmental, spatially structured environmental, and purely spatial components was compared for a metacommunity of web spiders. The study was carried out in 16 patches of Atlantic Forest in southern Brazil. Field work was done in one landscape mosaic representing a slight gradient of urbanization. Environmental variables encompassed plot- and patch-level measurements and a climatic matrix, while principal coordinates of neighbor matrices (PCNMs) acted as spatial variables. A forward selection procedure was carried out to select environmental and spatial variables influencing web-spider beta diversity. Variation partitioning was used to estimate the contribution of pure environmental and pure spatial effects and their shared influence on beta-diversity patterns, and to estimate the relative importance of selected environmental variables. Three environmental variables (bush density, land use in the surroundings of patches, and shape of patches) and two spatial variables were selected by forward selection procedures. Variation partitioning revealed that 15% of the variation of beta diversity was explained by a combination of environmental and PCNM variables. Most of this variation (12%) corresponded to pure environmental and spatially environmental structure. The data indicated that (1) spatial legacy was not important in explaining the web-spider beta diversity; (2) environmental predictors explained a significant portion of the variation in web-spider composition; (3) one-third of environmental variation was due to a spatial structure that jointly explains variation in species distributions. We were able to detect important factors related to matrix management influencing the web-spider beta
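The variation partitioning used here decomposes explained variation into pure-environment, shared, and pure-space fractions from three fitted R² values. A minimal sketch of that arithmetic; the input R² values below are chosen to be roughly consistent with the abstract's 15% total and 12% environmental figures, not taken from the paper:

```python
def variation_partition(r2_env, r2_space, r2_both):
    """Partition explained variation between two predictor sets, following
    the standard [a]/[b]/[c]/[d] decomposition used with partial RDA:
    [a] pure environment, [b] shared, [c] pure space, [d] unexplained."""
    pure_env = r2_both - r2_space          # [a]
    pure_space = r2_both - r2_env          # [c]
    shared = r2_env + r2_space - r2_both   # [b]
    return pure_env, shared, pure_space, 1.0 - r2_both

# illustrative R² values: environment explains 12% in total,
# space 8%, and the combined model 15%
a, b, c, d = variation_partition(r2_env=0.12, r2_space=0.08, r2_both=0.15)
print(a, b, c, d)
```

Here the shared fraction [b] is the "spatially structured environmental" component the abstract refers to.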

  15. Spatial variations in food web structures with alternative stable states: evidence from stable isotope analysis in a large eutrophic lake

    Science.gov (United States)

    Li, Yunkai; Zhang, Yuying; Xu, Jun; Zhang, Shuo

    2017-05-01

    Food web structures are well known to vary widely among ecosystems. Moreover, many food web studies of lakes have generally attempted to characterize the overall food web structure and have largely ignored internal spatial and environmental variations. In this study, we hypothesize that there is a high degree of spatial heterogeneity within an ecosystem and that such heterogeneity may lead to strong variations in environmental conditions and resource availability, in turn resulting in different trophic pathways. Stable carbon and nitrogen isotopes were employed for the whole food web to describe the structure of the food web in different sub-basins within Taihu Lake, a large eutrophic freshwater lake that has been intensively managed and highly influenced by human activities for more than 50 years. The results show significant isotopic differences between basins with different environmental characteristics. Such differences likely result from isotopic baseline differences combined with a shift in food web structure, both related to local spatial heterogeneity in nutrient loading in the water. Such variation should be explicitly considered in future food web studies and ecosystem-based management in this lake ecosystem.
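Stable-isotope food web studies of this kind typically estimate trophic position from δ¹⁵N relative to a basin-specific baseline (the standard Post 2002 formulation), which is why baseline differences between sub-basins matter so much. A sketch with invented isotope values:

```python
def trophic_position(d15n_consumer, d15n_baseline,
                     lambda_base=2.0, enrichment=3.4):
    """Trophic position from nitrogen stable isotopes:
    TP = lambda + (d15N_consumer - d15N_baseline) / enrichment,
    assuming ~3.4 permil enrichment per trophic level and a baseline
    organism (e.g. a primary consumer) at trophic level lambda = 2."""
    return lambda_base + (d15n_consumer - d15n_baseline) / enrichment

# hypothetical values: a fish at 12.8 permil over a baseline of 6.0 permil
tp = trophic_position(d15n_consumer=12.8, d15n_baseline=6.0)
print(round(tp, 2))
```

If two sub-basins have different δ¹⁵N baselines, the same consumer value yields different trophic positions, illustrating why the authors attribute part of the between-basin signal to baseline shifts.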

  16. A spatial web/agent-based model to support stakeholders' negotiation regarding land development.

    Science.gov (United States)

    Pooyandeh, Majeed; Marceau, Danielle J

    2013-11-15

    Decision making in land management can be greatly enhanced if the perspectives of concerned stakeholders are taken into consideration. This often implies negotiation in order to reach an agreement based on the examination of multiple alternatives. This paper describes a spatial web/agent-based modeling system that was developed to support the negotiation process of stakeholders regarding land development in southern Alberta, Canada. This system integrates a fuzzy analytic hierarchy procedure within an agent-based model in an interactive visualization environment provided through a web interface to facilitate the learning and negotiation of the stakeholders. In the pre-negotiation phase, the stakeholders compare their evaluation criteria using linguistic expressions. Due to the uncertainty and fuzzy nature of such comparisons, a fuzzy Analytic Hierarchy Process is then used to prioritize the criteria. The negotiation starts when a development plan is submitted by a user (stakeholder) through the web interface. An agent called the proposer, which represents the proposer of the plan, receives this plan and starts negotiating with all other agents. The negotiation is conducted in a step-wise manner where the agents change their attitudes by assigning a new set of weights to their criteria. If an agreement is not achieved, a new location for development is proposed by the proposer agent. This process is repeated until a location is found that satisfies all agents to a certain predefined degree. To evaluate the performance of the model, the negotiation was simulated with four agents, one of which was the proposer agent, using two hypothetical development plans. The first plan was selected randomly; the other one was chosen in an area that is of high importance to one of the agents. 
While the agents managed to achieve an agreement about the location of the land development after three rounds of negotiation in the first scenario, seven rounds were required in the second
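The step-wise negotiation loop described above can be caricatured in a few lines. This is not the paper's fuzzy-AHP model: the agents, their scoring functions, the concession rule and all numbers are invented for illustration. Each round, unsatisfied agents concede slightly (their "attitude change"); if no agreement is reached, the proposer moves to the next candidate location:

```python
def negotiate(agents, plans, threshold=0.7, max_rounds=3, concession=0.05):
    """Step-wise negotiation sketch: each round, every unsatisfied agent
    lowers its acceptance threshold by a small concession. If no plan
    satisfies all agents within max_rounds, the proposer tries the next
    candidate location. Returns the agreed plan, or None."""
    for plan in plans:
        thresh = {name: threshold for name in agents}
        for _ in range(max_rounds):
            ok = {name: fn(plan) >= thresh[name] for name, fn in agents.items()}
            if all(ok.values()):
                return plan
            for name, good in ok.items():
                if not good:
                    thresh[name] -= concession
    return None

# hypothetical agents, each scoring a plan in [0, 1] on its own criterion
agents = {
    "conservation": lambda p: 1.0 - p["env_impact"],
    "developer":    lambda p: p["profit"],
    "municipality": lambda p: p["tax_revenue"],
}
plans = [
    {"env_impact": 0.5, "profit": 0.90, "tax_revenue": 0.80},  # first proposal
    {"env_impact": 0.2, "profit": 0.75, "tax_revenue": 0.75},  # alternative site
]
chosen = negotiate(agents, plans)
print(chosen is plans[1])
```

In this toy run the first plan fails (the conservation agent never concedes far enough within three rounds), so a new location is proposed and accepted, mirroring the two-scenario behaviour reported in the abstract.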

  17. Study on Integration of Logistics Oriented Spatial Information Web Services

    Institute of Scientific and Technical Information of China (English)

    肖桂荣; 聂乔; 吴升

    2011-01-01

    Logistics essentially refers to the movement of material entities, a process with distinct spatial measurements and spatial characteristics, which calls for the integration of spatial information techniques with other modern techniques of logistics management. Extending spatial information services, combined with web services and geospatial analysis, to the area of logistics management is a new interdisciplinary research field: the concept of spatial information services is integrated into the modern logistics service system to carry out logistics-oriented access, integration and application of spatial information web services, with the key point being the analysis of logistics spatial phenomena from a geographic perspective. Based on the OGC web service framework, this paper designs and builds an architecture for logistics spatial information services, including mechanisms for service integration, efficient invocation and service composition, and a web-service-based integration model that clarifies the inherent elements and their relationships. In addition, we designed and developed a service composition mechanism based on Net-Petri and a Logistics Web service engine, which resolves the problems of dynamic access, efficient invocation and real-time integration of logistics spatial information services. This work provides a new way to further develop and apply spatial information services in the logistics area: even builders of logistics information systems without a professional GIS background can call spatial information services from their own programs. The study thus offers both techniques for integrating and applying logistics spatial information services, achieving dynamically composited and collaboratively integrated effectiveness, and practical experience for the construction of logistics spatial information service systems.

  18. Trialling a web-based spatial information management tool with Land Managers in Victoria, Australia.

    Science.gov (United States)

    Roberts, Anna M; Park, Geoff; Melland, Alice R; Miller, Ian

    2009-01-01

    A prototype web-based spatial information management tool (called eFarmer) was tested for its usability and usefulness by 46 Land Managers and 5 extension staff in Victoria, Australia. Participants had a range of enterprises (dairy, beef/sheep grazing, cropping, lifestyle land use), property sizes, and levels of computer ownership and expertise. A follow-up study was conducted with 12 dairy farmers, in which features for assessing nutrient losses from paddocks (Farm Nutrient Loss Index, FNLI) were added to eFarmer. Over 27,000 maps (including 11,000 with aerial photography) were accessed by Land Managers during a 5-month trial period. Despite limited training and support, 1350 people are registered users, and approximately 700 have actively used the tool. Reasons for the success include improved access to spatial information, the ability to measure farm features and create farm maps, a basis for decision-making about farm inputs, support for better farm- and landscape-scale action planning and production, and the ability of Land Managers to seek management advice from the extension staff who facilitated the eFarmer testing programs. For dairy farmers in the FNLI trial, awareness of off-site impacts increased and most changed management practices. Provision of on-going training and support will be at least as important as further development of the tool itself. Web-based spatial information tools have the potential to improve Land Managers' awareness of their environmental impacts and to influence their decision-making. Access to spatial information has the potential to reduce information asymmetry between Land Managers, extension staff and catchment planners in a constructive way. It will also change the role of extension staff away from being experts with answers towards facilitators enabling learning. Results have applicability in countries where there is a high level of farm computer ownership and relevant spatial information is available in GIS format

  19. How can mental maps, applied to the coast environment, help in collecting and analyzing spatial representations?

    Directory of Open Access Journals (Sweden)

    Servane Gueben-Venière

    2011-09-01

    Full Text Available After having been mainly used in urban geography, and then somewhat cast aside by geographers, mental maps are now the object of renewed interest, particularly in the field of environmental geography. Applied to the coastal environment, and used as a supplement to interviews, these maps prove to be not only a good tool for collecting spatial representations, but also a valuable aid in analyzing them. This article draws on the example of the use of mental maps in the scientific poster "Des ingénieurs de plus en plus « verts ». Évolution du regard des ingénieurs en charge de la gestion du littoral néerlandais" (Engineers are 'greener and greener': evolution of the thinking of the engineers in charge of Dutch coastal management), prize-winner of the competition organized by the Paris Doctoral School of Geography Forum in 2011.

  20. Analyzing crop change scenario with the SmartScape spatial decision support system

    NARCIS (Netherlands)

    Tayyebi, A; Tayyebi, AH; Jokar Arsanjani, J; Vaz, E; Helbich, M

    2016-01-01

    Agricultural land use is increasingly changing due to different anthropogenic activities. A combination of economic, socio-political, and cultural factors exerts a direct impact on agricultural changes. This study aims to illustrate how stakeholders and policymakers can take advantage of a web-based

  1. Utilizing mixed methods research in analyzing Iranian researchers' information search behaviour on the Web and presenting the current pattern

    Directory of Open Access Journals (Sweden)

    Maryam Asadi

    2015-12-01

    Full Text Available Using a mixed methods research design, the current study analyzed Iranian researchers’ information searching behaviour on the Web; based on the extracted concepts, a model of their information searching behaviour was then derived. Forty-four participants, including academic staff from universities and research centers, were recruited for this study through purposive sampling. Data were gathered from a questionnaire comprising ten questions and semi-structured interviews. Each participant’s memos were analyzed using grounded theory methods adapted from Strauss & Corbin (1998). Results showed that the subjects’ main objectives in using the Web were doing research, writing papers, studying, doing assignments, downloading files and acquiring general information. The most important ways of learning how to search and retrieve information were trial and error and getting help from friends. Information resources were identified through searching in information resources (e.g. search engines, references in papers, and online databases…), communication facilities and tools (e.g. contact with colleagues, seminars and workshops, social networking…), and information services (e.g. RSS, alerting, and SDI). Findings also indicated that searching with search engines, reviewing references, searching in online databases, contacting colleagues and studying the latest issues of electronic journals were the most important means of searching. The most important strategies were using search engines and scientific tools such as Google Scholar. In addition, the simple (quick) search method was the most common among subjects. Topic, keywords and paper title were the most important elements used for retrieving information. Analysis of the interviews showed that there were nine stages in the researchers’ information searching behaviour: topic selection, initiating search, formulating the search query, information retrieval, access to information

  2. Building spatial composite indicators to analyze environmental health inequalities on a regional scale

    OpenAIRE

    Saib, Mahdi-Salim; Caudeville, Julien; Beauchamp, Maxime; Carré, Florence; Ganry, Olivier; Trugeon, Alain; Cicolella, Andre

    2015-01-01

    Background Reducing health inequalities involves the identification and characterization of social and exposure factors and the way they accumulate in a given area. The areas of accumulation then allow for prioritization of interventions. The present study aims to build spatial composite indicators based on the aggregation of environmental, social and health indicators and their inter-relationships. Method Preliminary work was carried out firstly to homogenize spatial coverage, and secondly t...
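The aggregation step this abstract describes (combining environmental, social and health indicators per spatial unit into one composite) is commonly done by normalising each indicator and taking a weighted sum. A minimal sketch, with invented indicator names, values and equal weights (none of these come from the paper):

```python
# Hypothetical sketch: build a spatial composite indicator by
# min-max normalising each indicator per spatial unit, then
# aggregating with weights. All data below are illustrative.

def minmax(values):
    """Rescale a list of raw indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def composite(indicators, weights=None):
    """indicators: dict name -> list of values (one per spatial unit)."""
    names = list(indicators)
    if weights is None:  # default to equal weights
        weights = {n: 1.0 / len(names) for n in names}
    scaled = {n: minmax(indicators[n]) for n in names}
    n_units = len(next(iter(indicators.values())))
    return [sum(weights[n] * scaled[n][i] for n in names)
            for i in range(n_units)]

# Three spatial units scored on three indicator domains
idx = composite({
    "exposure": [0.2, 0.8, 0.5],
    "deprivation": [10, 40, 25],
    "mortality": [3.0, 9.0, 6.0],
})
print([round(v, 3) for v in idx])  # the middle unit accumulates the most burden
```

In practice the weighting scheme (equal, expert-derived, or statistical) is itself a methodological choice that shapes which areas appear most vulnerable.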

  3. Coordinating distributed software development : a resource relationships perspective on analyzing the spatial effects

    OpenAIRE

    Wiredu, Gamel O.

    2010-01-01

    peer-reviewed As more software development organizations are increasingly distributing their operations spatially, information systems development researchers are taking perspectives such as transactions costs and resource dependency to explain the effects of spatial distribution on coordination. This paper argues that these perspectives are limited because they do not address all the key relationships between software development resources in a unified and systemic manner. The interaction...

  4. A web-tool to find spatially explicit climate-smart solutions for the sector agriculture

    Science.gov (United States)

    Verzandvoort, Simone; Kuikman, Peter; Walvoort, Dennis

    2017-04-01

    Europe faces the challenge to produce more food and more biomass for the bio-economy, to adapt its agricultural sector to negative consequences of climate change, and to reduce greenhouse gas emissions from agriculture. Climate-smart agriculture (CSA) solutions and technologies improve agriculture's productivity and provide economic growth and stability, increase resilience, and help to reduce GHG emissions from agricultural activities. The Climate Smart Agriculture Booster (CSAb) (http://csabooster.climate-kic.org/) is a Flagship Program under Climate-KIC, aiming to facilitate the adoption of CSA solutions and technologies in the European agro-food sector. This adoption requires spatially explicit, contextual information on farming activities and risks and opportunities related to climate change in regions across Europe. Other spatial information supporting adoption includes information on where successful implementations have already been carried out, where CSA would profit from enabling policy conditions, and where markets or business opportunities for selling or purchasing technology and knowledge are located or emerging. The Spatial Solution Finder is a web-based spatial tool aiming to help agri-food companies (supply and processing), authorities or agricultural organisations find CSA solutions and technologies that fit local farmers and regions, and to demonstrate examples of successful implementations as well as expected impact at the farm and regional level. The tool is based on state of the art (geo)datasets of environmental and socio-economic conditions (partly open access, partly derived from previous research) and open source web-technology. The philosophy of the tool is that combining existing datasets with contextual information on the region of interest with personalized information entered by the user provides a suitable basis for offering a basket of options for CSA solutions and technologies.
Solutions and technologies are recommended to the user based on

  5. Development of a Web GIS Application for Visualizing and Analyzing Community Out of Hospital Cardiac Arrest Patterns

    Science.gov (United States)

    Semple, Hugh; Qin, Han; Sasson, Comilla

    2013-01-01

    Improving survival rates at the neighborhood level is increasingly seen as a priority for reducing overall rates of out-of-hospital cardiac arrest (OHCA) in the United States. Since wide disparities exist in OHCA rates at the neighborhood level, it is important for public health officials and residents to be able to quickly locate neighborhoods where people are at elevated risk for cardiac arrest and to target these areas for educational outreach and other mitigation strategies. This paper describes an OHCA web mapping application that was developed to provide users with interactive maps and data for them to quickly visualize and analyze the geographic pattern of cardiac arrest rates, bystander CPR rates, and survival rates at the neighborhood level in different U.S. cities. The data come from the CARES Registry and are provided over a period spanning several years so users can visualize trends in neighborhood out-of-hospital cardiac arrest patterns. Users can also visualize areas that are statistical hot and cold spots for cardiac arrest and compare OHCA and bystander CPR rates in the hot and cold spots. Although not designed as a public participation GIS (PPGIS), this application seeks to provide a forum around which data and maps about local patterns of OHCA can be shared, analyzed and discussed with a view to empowering local communities to take action to address the high rates of OHCA in their vicinity. PMID:23923097
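The "statistical hot and cold spots" such applications compute are typically Getis-Ord Gi* z-scores over neighborhood rates. A simplified sketch with binary spatial weights; the tract rates and adjacency lists below are made up for illustration, not CARES data:

```python
# Simplified Getis-Ord Gi* hot-spot sketch: a positive z-score marks
# a cluster of high OHCA rates, a negative one a cold spot.
# Rates and the adjacency structure are invented toy data.
import math

def gi_star(rates, neighbors, i):
    """Gi* z-score for unit i; neighbors[i] includes i itself."""
    n = len(rates)
    xbar = sum(rates) / n
    s = math.sqrt(sum(x * x for x in rates) / n - xbar * xbar)
    w = neighbors[i]                      # binary weights (list of indices)
    sw = len(w)                           # sum of weights
    local = sum(rates[j] for j in w)      # weighted local sum
    denom = s * math.sqrt((n * sw - sw * sw) / (n - 1))
    return (local - xbar * sw) / denom

# Six census tracts: OHCA rates per 10,000; tracts 0-2 adjacent, 3-5 adjacent
rates = [9.0, 8.5, 9.5, 2.0, 1.5, 2.5]
adj = {0: [0, 1, 2], 1: [0, 1, 2], 2: [0, 1, 2],
       3: [3, 4, 5], 4: [3, 4, 5], 5: [3, 4, 5]}
for i in (0, 3):
    print(f"tract {i}: Gi* = {gi_star(rates, adj, i):+.2f}")
```

Tract 0 sits in the high-rate cluster and gets a positive z-score; tract 3, in the low-rate cluster, gets the mirror-image negative one.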

  7. The PPI3D web server for searching, analyzing and modeling protein-protein interactions in the context of 3D structures.

    Science.gov (United States)

    Dapkūnas, Justas; Timinskas, Albertas; Olechnovič, Kliment; Margelevičius, Mindaugas; Dičiūnas, Rytis; Venclovas, Česlovas

    2016-12-22

    The PPI3D web server is focused on searching and analyzing the structural data on protein-protein interactions. Reducing the data redundancy by clustering and analyzing the properties of interaction interfaces using Voronoi tessellation makes this software a highly effective tool for addressing different questions related to protein interactions.
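Since the Voronoi neighbours of a 3D point set are exactly the endpoints of Delaunay edges (the two tessellations are dual), a tessellation-based interface contact in the spirit of PPI3D can be sketched by collecting inter-chain Delaunay edges. The coordinates and chain labels below are invented toy data, not PPI3D output:

```python
# Hedged sketch: find atom pairs from different chains that are
# Voronoi neighbours, via the dual Delaunay triangulation.
import itertools
import numpy as np
from scipy.spatial import Delaunay

points = np.array([
    [0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0],  # chain A
    [0.1, 0.1, 1.5], [0.9, 0.1, 1.4], [0.1, 0.9, 1.6],  # chain B
])
chain = ["A", "A", "A", "B", "B", "B"]

tri = Delaunay(points)
contacts = set()
for simplex in tri.simplices:              # each tetrahedron
    for i, j in itertools.combinations(simplex, 2):
        if chain[i] != chain[j]:           # edge crosses the interface
            contacts.add((int(min(i, j)), int(max(i, j))))
print(sorted(contacts))
```

A real implementation would additionally weight each contact by the area of the shared Voronoi face and cap neighbour distances; this sketch only records adjacency.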

  8. Analyzing Spatial and Temporal Variation in Precipitation Estimates in a Coupled Model

    Science.gov (United States)

    Tomkins, C. D.; Springer, E. P.; Costigan, K. R.

    2001-12-01

    Integrated modeling efforts at the Los Alamos National Laboratory aim to simulate the hydrologic cycle and study the impacts of climate variability and land use changes on water resources and ecosystem function at the regional scale. The integrated model couples three existing models independently responsible for addressing the atmospheric, land surface, and ground water components: the Regional Atmospheric Model System (RAMS), the Los Alamos Distributed Hydrologic System (LADHS), and the Finite Element Heat and Mass (FEHM) code. The upper Rio Grande Basin, extending 92,000 km² over northern New Mexico and southern Colorado, serves as the test site for this model. RAMS uses nested grids to simulate meteorological variables, with the smallest grid over the Rio Grande having 5-km horizontal grid spacing. As LADHS grid spacing is 100 m, a downscaling approach is needed to estimate meteorological variables from the 5-km RAMS grid for input into LADHS. This study presents daily and cumulative precipitation predictions for the month of October of water year 1993, and an approach to compare LADHS downscaled precipitation to RAMS-simulated precipitation. The downscaling algorithm is based on kriging, using topography as a covariate to distribute the precipitation and thereby incorporating the topographical resolution achieved at the 100-m grid resolution in LADHS. The results of the downscaling are analyzed in terms of the level of variance introduced into the model, mean simulated precipitation, and the correlation between the LADHS and RAMS estimates. Previous work presented a comparison of RAMS-simulated and observed precipitation recorded at COOP and SNOTEL sites. The effects of downscaling the RAMS precipitation were evaluated using Spearman and linear correlations and by examining the variance of both populations. The study focuses on determining how the downscaling changes the distribution of precipitation compared to the RAMS estimates.
Spearman correlations computed for
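The Spearman-and-linear-correlation check described above amounts to comparing the downscaled series against the coarse model estimates with rank and Pearson correlations. A sketch with fabricated precipitation series (the real comparison would be over grid cells or stations):

```python
# Compare a downscaled precipitation field against coarse model
# estimates with Spearman (rank) and Pearson (linear) correlation.
# Both series below are invented for illustration.
from scipy.stats import pearsonr, spearmanr

rams = [1.2, 0.0, 3.4, 5.1, 2.2, 0.4, 4.8, 1.9]   # 5-km grid estimates (mm)
ladhs = [1.5, 0.1, 3.0, 6.2, 2.5, 0.3, 5.6, 2.1]  # downscaled 100-m means (mm)

rho, p_rho = spearmanr(rams, ladhs)
r, p_r = pearsonr(rams, ladhs)
print(f"Spearman rho = {rho:.3f} (p = {p_rho:.3g})")
print(f"Pearson  r   = {r:.3f} (p = {p_r:.3g})")
```

Spearman correlation is the natural choice here because kriging with a topographic covariate can stretch the marginal distribution while preserving rank order, which Pearson alone would penalise.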

  9. The Effects of Web-Based Interactive Virtual Tours on the Development of Prospective Mathematics Teachers' Spatial Skills

    Science.gov (United States)

    Kurtulus, Aytac

    2013-01-01

    The aim of this study was to investigate the effects of web-based interactive virtual tours on the development of prospective mathematics teachers' spatial skills. The study was designed based on experimental method. The "one-group pre-test post-test design" of this method was taken as the research model. The study was conducted with 3rd year…

  10. Analyzing existing conventional soil information sources to be incorporated in thematic Spatial Data Infrastructures

    Science.gov (United States)

    Pascual-Aguilar, J. A.; Rubio, J. L.; Domínguez, J.; Andreu, V.

    2012-04-01

    New information technologies make possible the widespread dissemination of spatial information, at geographical scales from continental to local, by means of Spatial Data Infrastructures. Administrative awareness of the need for open-access information services has also given citizens access to this spatial information through legal instruments such as the INSPIRE Directive of the European Union, transposed into national laws as in the case of Spain. Translating the general criteria of generic Spatial Data Infrastructures (SDI) to thematic ones is a crucial step in the progress of these instruments as large-scale tools for the dissemination of information. In this case, the intrinsic criteria of digital information, such as harmonization and the disclosure of metadata, must be complemented by the specific characteristics of environmental information and the techniques employed in obtaining it. For soil inventories and mapping, existing information obtained by traditional means, prior to digital technologies, is considered a valid, and indeed unique, source for the development of thematic SDI. In this work, an evaluation is undertaken of the existing and accessible information that constitutes the basis for building a thematic SDI of soils in Spain. This information framework has features in common with other European Union states. From a set of more than 1,500 publications covering the national territory of Spain, the study examined the documents (94) found for five autonomous regions of the northern Iberian Peninsula (Asturias, Cantabria, Basque Country, Navarra and La Rioja). The analysis was performed taking into account the criteria used in soil mapping and inventories.
The results obtained show a wide variation in almost all the criteria: geographic representation (projections, scales) and geo-referencing the location of the profiles, map location of profiles integrated with edaphic

  11. Location-based Web Search

    Science.gov (United States)

    Ahlers, Dirk; Boll, Susanne

    In recent years, the relation of Web information to a physical location has gained much attention. However, Web content today often carries only an implicit relation to a location. In this chapter, we present a novel location-based search engine that automatically derives spatial context from unstructured Web resources and allows for location-based search: our focused crawler applies heuristics to crawl and analyze Web pages that have a high probability of carrying a spatial relation to a certain region or place; the location extractor identifies the actual location information from the pages; our indexer assigns a geo-context to the pages and makes them available for a later spatial Web search. We illustrate the usage of our spatial Web search for location-based applications that provide information not only right-in-time but also right-on-the-spot.
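The extractor-plus-indexer pipeline described above can be illustrated with a toy gazetteer lookup: scan page text for known place names and attach the matched geo-context to the page in a spatial index. The gazetteer entries, URL and coordinates are invented for illustration:

```python
# Toy sketch of location extraction and geo-indexing: not the
# authors' implementation, just the general technique.
import re

GAZETTEER = {  # place name -> (lat, lon); entries are illustrative
    "Oldenburg": (53.14, 8.21),
    "Bremen": (53.08, 8.80),
}

def extract_locations(text):
    """Return (name, coords) for each gazetteer name found in text."""
    hits = []
    for name, coords in GAZETTEER.items():
        if re.search(rf"\b{re.escape(name)}\b", text):
            hits.append((name, coords))
    return hits

def index_page(url, text, index):
    """Attach the page to every location it mentions."""
    for name, coords in extract_locations(text):
        index.setdefault(name, []).append((url, coords))

index = {}
index_page("http://example.org/cafe", "A new cafe opened in Oldenburg.", index)
print(index)
```

A production system would disambiguate homonymous place names and score the confidence of each match, which is where the chapter's crawling heuristics come in.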

  12. Spatially resolved spectroscopy across stellar surfaces. I. Using exoplanet transits to analyze 3D stellar atmospheres

    Science.gov (United States)

    Dravins, Dainis; Ludwig, Hans-Günter; Dahlén, Erik; Pazira, Hiva

    2017-09-01

    Context. High-precision stellar analyses require hydrodynamic modeling to interpret chemical abundances or oscillation modes. Exoplanet atmosphere studies require stellar background spectra to be known along the transit path while detection of Earth analogs require stellar microvariability to be understood. Hydrodynamic 3D models can be computed for widely different stars but have been tested in detail only for the Sun with its resolved surface features. Model predictions include spectral line shapes, asymmetries, and wavelength shifts, and their center-to-limb changes across stellar disks. Aims: We observe high-resolution spectral line profiles across spatially highly resolved stellar surfaces, which are free from the effects of spatial smearing and rotational broadening present in full-disk spectra, enabling comparisons to synthetic profiles from 3D models. Methods: During exoplanet transits, successive stellar surface portions become hidden and differential spectroscopy between various transit phases provides spectra of small surface segments temporarily hidden behind the planet. Planets cover no more than 1% of any main-sequence star, enabling high spatial resolution but demanding very precise observations. Realistically measurable quantities are identified through simulated observations of synthetic spectral lines. Results: In normal stars, line profile ratios between various transit phases may vary by 0.5%, requiring S/N ≳ 5000 for meaningful spectral reconstruction. While not yet realistic for individual spectral lines, this is achievable for cool stars by averaging over numerous lines with similar parameters. Conclusions: For bright host stars of large transiting planets, spatially resolved spectroscopy is currently practical. More observable targets are likely to be found in the near future by ongoing photometric searches.

  13. A review of the formulation and application of the spatial equilibrium models to analyze policy

    Institute of Scientific and Technical Information of China (English)

    Phan Sy Hieu; Steve Harrison

    2011-01-01

    This paper reviews alternative market equilibrium models for policy analysis. The origin of spatial equilibrium models and their application to wood and wood-processing industries are described. Three mathematical programming approaches commonly applied to solve spatial problems, namely linear programming, non-linear programming and mixed complementarity programming, are reviewed in terms of the forms of their objective functions and their constraint equalities and inequalities, and each is illustrated with numerical examples. Linear programming is usually applied only to transportation problems, to solve for the quantities transported between regions when the quantities supplied and demanded in each region are already known. It is argued that linear programming can be applied in a broader context, to transportation problems where supply and demand quantities are unknown and the underlying functions are linear. In this context, linear programming is seen as a more convenient method for modelers because it has a simpler objective function and does not impose such strict conditions as, for instance, the equal numbers of variables and equations required in mixed complementarity programming. Finally, some critical insights are provided on the interpretation of optimal solutions generated by solving spatial equilibrium models.
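The classical transportation problem the review refers to, with known supplies and demands, can be stated and solved as a small linear program. The quantities and unit costs below are invented for illustration:

```python
# Transportation LP: minimise total shipping cost between two supply
# regions and two demand regions, given known supplies and demands.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0],    # unit cost: supplier i -> market j
                 [5.0, 3.0]])
supply = [30.0, 40.0]
demand = [35.0, 35.0]           # balanced: total supply == total demand

# Decision variables x_ij flattened row-major: [x00, x01, x10, x11]
A_eq = [[1, 1, 0, 0],   # supplier 0 ships exactly its supply
        [0, 0, 1, 1],   # supplier 1
        [1, 0, 1, 0],   # market 0 receives exactly its demand
        [0, 1, 0, 1]]   # market 1
b_eq = supply + demand

res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print("shipments:", res.x.round(1), "total cost:", res.fun)
```

The optimum routes supplier 1's output mostly to its cheap market and tops up the remainder; in a spatial equilibrium model the fixed supplies and demands would instead become endogenous functions of regional prices.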

  14. A Web-based spatial decision supporting system for land management and soil conservation

    Science.gov (United States)

    Terribile, F.; Agrillo, A.; Bonfante, A.; Buscemi, G.; Colandrea, M.; D'Antonio, A.; De Mascellis, R.; De Michele, C.; Langella, G.; Manna, P.; Marotta, L.; Mileti, F. A.; Minieri, L.; Orefice, N.; Valentini, S.; Vingiani, S.; Basile, A.

    2015-07-01

    Today it is evident that there are many contrasting demands on our landscape (e.g. food security, more sustainable agriculture, higher income in rural areas, etc.) as well as many land degradation problems. It has been proved that providing operational answers to these demands and problems is extremely difficult. Here we aim to demonstrate that a spatial decision support system based on geospatial cyberinfrastructure (GCI) can address all of the above, so producing a smart system for supporting decision making for agriculture, forestry, and urban planning with respect to the landscape. In this paper, we discuss methods and results of a special kind of GCI architecture, one that is highly focused on land management and soil conservation. The system allows us to obtain dynamic, multidisciplinary, multiscale, and multifunctional answers to agriculture, forestry, and urban planning issues through the Web. The system has been applied to and tested in an area of about 20 000 ha in the south of Italy, within the framework of a European LIFE+ project (SOILCONSWEB). The paper reports - as a case study - results from two different applications dealing with agriculture (olive growth tool) and environmental protection (soil capability to protect groundwater). Developed with the help of end users, the system is starting to be adopted by local communities. The system indirectly explores a change of paradigm for soil and landscape scientists. Indeed, the potential benefit is shown of overcoming current disciplinary fragmentation over landscape issues by offering - through a smart Web-based system - truly integrated geospatial knowledge that may be directly and freely used by any end user (www.landconsultingweb.eu). This may help bridge the last very important divide between scientists working on the landscape and end users.

  15. A web based spatial decision supporting system for land management and soil conservation

    Science.gov (United States)

    Terribile, F.; Agrillo, A.; Bonfante, A.; Buscemi, G.; Colandrea, M.; D'Antonio, A.; De Mascellis, R.; De Michele, C.; Langella, G.; Manna, P.; Marotta, L.; Mileti, F. A.; Minieri, L.; Orefice, N.; Valentini, S.; Vingiani, S.; Basile, A.

    2015-02-01

    Today it is evident that there are many contrasting demands on our landscape (e.g. food security, more sustainable agriculture, higher income in rural areas, etc.) but also many land degradation problems. It has been proved that providing operational answers to these demands and problems is extremely difficult. Here we aim to demonstrate that a Spatial Decision Support System based on geospatial cyber-infrastructure (GCI) can embody all of the above, so producing a smart system for supporting decision making for agriculture, forestry and urban planning with respect to the landscape. In this paper, we discuss methods and results of a special kind of GCI architecture, one that is highly focused on soil and land conservation (SOILCONSWEB-LIFE+ project). The system allows us to obtain dynamic, multidisciplinary, multiscale, and multifunctional answers to agriculture, forestry and urban planning issues through the web. The system has been applied to and tested in an area of about 20 000 ha in the South of Italy, within the framework of a European LIFE+ project. The paper reports - as a case study - results from two different applications dealing with agriculture (olive growth tool) and environmental protection (soil capability to protect groundwater). Developed with the help of end users, the system is starting to be adopted by local communities. The system indirectly explores a change of paradigm for soil and landscape scientists. Indeed, the potential benefit is shown of overcoming current disciplinary fragmentation over landscape issues by offering - through a smart web based system - truly integrated geospatial knowledge that may be directly and freely used by any end user (http://www.landconsultingweb.eu). This may help bridge the last, very important divide between scientists working on the landscape and end users.

  16. Easier surveillance of climate-related health vulnerabilities through a Web-based spatial OLAP application

    Directory of Open Access Journals (Sweden)

    Gosselin Pierre

    2009-04-01

    Full Text Available Abstract Background Climate change has a significant impact on population health. Population vulnerabilities depend on several determinants of different types, including biological, psychological, environmental, social and economic ones. Surveillance of climate-related health vulnerabilities must take into account these different factors, their interdependence, as well as their inherent spatial and temporal aspects on several scales, for informed analyses. Currently used technology includes commercial off-the-shelf Geographic Information Systems (GIS) and Database Management Systems with spatial extensions. It has been widely recognized that such OLTP (On-Line Transaction Processing) systems were not designed to support complex, multi-temporal and multi-scale analysis as required above. On-Line Analytical Processing (OLAP) is central to the field known as BI (Business Intelligence), a key field for such decision-support systems. In the last few years, we have seen a few projects that combine OLAP and GIS to improve spatio-temporal analysis and geographic knowledge discovery. This has given rise to SOLAP (Spatial OLAP) and a new research area. This paper presents how SOLAP and climate-related health vulnerability data were investigated and combined to facilitate surveillance. Results Based on recent spatial decision-support technologies, this paper presents a spatio-temporal web-based application that goes beyond GIS applications with regard to speed, ease of use, and interactive analysis capabilities. It supports the multi-scale exploration and analysis of integrated socio-economic, health and environmental geospatial data over several periods. This project was meant to validate the potential of recent technologies to contribute to a better understanding of the interactions between public health and climate change, and to facilitate future decision-making by public health agencies and municipalities in Canada and elsewhere.
The project also aimed at

  17. Analyzing Ca(2+) dynamics in intact epithelial cells using spatially limited flash photolysis.

    Science.gov (United States)

    Almassy, Janos; Yule, David I

    2013-01-01

    The production of saliva by parotid acinar cells is stimulated by Ca(2+) activation of Cl(-) and K(+) channels located in the apical plasma membrane of these polarized cells. Here we describe a paradigm for the focal photorelease of either Ca(2+) or an inositol 1,4,5 trisphosphate (InsP(3)) analog. The protocol is designed to be useful for investigating subcellular Ca(2+) dynamics in polarized cells with minimal experimental intervention. Parotid acinar cells are loaded with cell-permeable versions of the caged precursors (NP-EGTA-AM or Ci-InsP(3)/PM). Photolysis is accomplished using a spatially limited, focused diode laser, but the experiment can be readily modified to whole-field photolysis using a xenon flash lamp.

  18. Modern tools for development of interactive web map applications for visualization spatial data on the internet

    Directory of Open Access Journals (Sweden)

    Horáková Bronislava

    2009-11-01

    Full Text Available In the last few years the development of dynamic web applications, often called Web 2.0, has begun. From this development emerged a technology called mashups. Mashups can easily combine huge amounts of data sources and the functionality of existing as well as future web applications and services. They are therefore used to develop new applications that offer new possibilities of information usage. This technology makes it possible to develop basic as well as robust web applications not only for IT or GIS specialists, but also for common users. Software companies have developed web projects for building mashup applications, also called mashup editors.

  19. MarineMap: Web-Based Technology for Coastal and Marine Spatial Planning

    Science.gov (United States)

    McClintock, W.; Ferdana, Z.; Merrifield, M.; Steinback, C.; Marinemap Consortium

    2010-12-01

    Science, technology and stakeholder engagement are at the heart of marine spatial planning (MSP). Yet, most stakeholders are not scientists or technologists. MarineMap (http://northcoast.marinemap.org) is a web-based decision support tool developed specifically for use by non-technical stakeholders in marine protected area (MPA) planning. However, MarineMap has been developed so that it may be extended to virtually any MSP project where there is a need for (a) visualization and analysis of geospatial data, (b) siting prospective use areas (e.g., for wind or wave energy sites, MPAs, transportation routes), (c) collaboration and communication amongst stakeholders, and (d) transparency of the process to the public. MarineMap is extremely well documented, is based on free and open source technologies and, therefore, may be implemented by anyone without licensing fees. Furthermore, the underlying technologies are extremely flexible and extensible, making it ideal for incorporating new models (e.g., tradeoff analyses, cumulative impacts, etc.) as they are identified for specific MSP projects. We will demonstrate how MarineMap has been developed for MPA planning in California, human impact assessment and MSP on the West Coast, energy and conservation planning in Oregon, and explain how interested parties may access MarineMap's source code and contribute to development.

  20. Scholarship 2.0: analyzing scholars’ use of Web 2.0 tools in research and teaching activity

    Directory of Open Access Journals (Sweden)

    Licia Calvi

    2013-11-01

    Full Text Available Over the past 15 years the Web has transformed the ways in which we search for information and use it. In more recent years, we have seen the emergence of a new array of innovative tools that collectively go under the name of ‘Web 2.0’, in which the information user is also increasingly an information producer (i.e., a prosumer), sharing or creating content. The success of Web 2.0 tools for personal use is only partially replicated in the professional sphere and, particularly, in the academic environment in relation to research and teaching. To date, very few studies have explored the level of adoption of Web 2.0 among academic researchers in their research and teaching activity. It is not known how and how much Web 2.0 is currently used within research communities, and we are not aware of the drivers and drawbacks of the use of Web 2.0 tools in academia, where the majority of people are focused either on research or on teaching activities. To analyse these issues, i.e. the combined adoption of Web 2.0 tools in teaching and research, the authors carried out a survey among the teaching and research staff of the University of Breda in The Netherlands. This country was chosen mainly because it is on the cutting edge as far as innovation is concerned. An important driver in choosing the Breda University academic community was the fact that one of the two authors of this survey works as a senior researcher at this university. The purpose of our survey was to explore the level of adoption of Web 2.0 tools among academic communities. We were interested in investigating how they were using these tools in the creation of scientific knowledge, both in their research and teaching activity. We were also interested in analysing differences in the level of adoption of Web 2.0 tools with regard to researchers’ position, age, gender, and research field. Finally, our study explored the issue of peer reviewing in the Web 2.0 setting

  1. Spatial analyzing system for urban land-use management based on GIS and multi-criteria assessment modeling

    Institute of Scientific and Technical Information of China (English)

    Fu Yang; Guangming Zeng; Chunyan Du; Lin Tang; Jianfei Zhou; Zhongwu Li

    2008-01-01

    Urban land management requires the integration of a wide range of data on ecological processes, environmental processes and urban planning and development processes. This paper combined land suitability modeling with remote sensing (RS), landscape ecological analysis and geographic information systems (GIS) to develop a spatial analyzing system for managing urban expansion land. The system incorporates a multi-criteria mechanism in GIS for the suitability evaluation of urban expansion land. Grey relational analysis (GRA) was combined with the analytic hierarchy process (AHP) to address the uncertainties in the evaluation process. This approach was applied to explicitly identify constraints and opportunities for future land conservation and development in Changsha City, China. Validation of the methodology showed a high degree of agreement with previous independent studies as regards ecological suitability. The methodology can be useful in environmental protection, land management and regional planning.
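The GRA-plus-AHP combination described above can be sketched as follows: grey relational coefficients measure each candidate's closeness to an ideal reference sequence, and AHP-derived weights aggregate them into a grade. The candidate scores, criteria and weights below are invented, not from the Changsha study:

```python
# Hedged sketch of grey relational analysis (GRA) with AHP-style
# weights for multi-criteria land suitability. Toy data throughout.

RHO = 0.5  # conventional distinguishing coefficient

def gra_grades(matrix, weights):
    """matrix[i][k]: candidate i scored on criterion k, already
    normalised to [0, 1] with 1 = best; reference sequence is all 1s."""
    deltas = [[abs(1.0 - x) for x in row] for row in matrix]
    dmin = min(min(r) for r in deltas)
    dmax = max(max(r) for r in deltas)
    grades = []
    for row in deltas:
        # grey relational coefficient per criterion, then weighted grade
        coeffs = [(dmin + RHO * dmax) / (d + RHO * dmax) for d in row]
        grades.append(sum(w * c for w, c in zip(weights, coeffs)))
    return grades

# Three land parcels on three criteria (e.g. slope, soil, accessibility)
scores = [[0.9, 0.8, 1.0],
          [0.4, 0.6, 0.5],
          [1.0, 0.3, 0.7]]
ahp_weights = [0.5, 0.3, 0.2]   # e.g. derived from an AHP pairwise matrix
grades = gra_grades(scores, ahp_weights)
print([round(g, 3) for g in grades])
```

The appeal of GRA here is that it ranks candidates robustly even when criterion scores are uncertain or sparse, while AHP supplies the relative importance of criteria from expert pairwise comparisons.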

  2. Taxonomical and ecological characteristics of the desmids placoderms in reservoir: analyzing the spatial and temporal distribution

    Directory of Open Access Journals (Sweden)

    Sirlene Aparecida Felisberto

    2014-12-01

    Full Text Available AIM: This study aimed to evaluate the influence of the river-dam axis and abiotic factors on the composition of Closteriaceae, Gonatozygaceae, Mesotaeniaceae and Peniaceae in a tropical reservoir. METHODS: Water samples for physical, chemical and periphyton analysis were collected in April and August 2002 in different regions along the river-dam axis of Rosana Reservoir, Paranapanema River Basin. The substrates collected, always in the littoral region, were petioles of Eichhornia azurea (Swartz) Kunth. To examine the relationship of abiotic variables with reservoir zones and with the floristic composition of desmids, we used principal component analysis (PCA) and canonical correspondence analysis (CCA). RESULTS: The first two PCA axes explained 81.3% of the total variability. On the first axis, conductivity, water temperature and pH were related to the April sampling regions, with higher values, while for August, nitrate, total phosphorus and dissolved oxygen showed higher values. We identified 20 taxa, distributed in the genera Closterium (14), Gonatozygon (4), Netrium (1) and Penium (1). Spatially, more taxa were recorded in the lacustrine region in both collection periods. The CCA summarized 62.2% of the total variability of the taxa data in the first two axes; in August, Closterium incurvum Brébisson, C. cornu Ehrenberg ex Ralfs and Gonatozygon monotaenium De Bary were related to higher values of turbidity and nitrate in the lacustrine and intermediate regions. CONCLUSION: The formation of groups thus reflected first the regions along the longitudinal axis and then the seasonal period, which must be related to the low current velocity and the higher values of temperature and water transparency, especially in late summer.
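The PCA step used above (variance explained by the first two axes of standardised limnological variables) can be sketched with a covariance eigendecomposition. The sites-by-variables matrix below is fabricated for illustration:

```python
# Illustrative PCA on standardised limnological variables:
# how much of the total variability the leading axes capture.
import numpy as np

X = np.array([  # rows: sampling sites; cols: conductivity, temp, pH, nitrate
    [120.0, 27.0, 7.8, 0.20],
    [115.0, 26.5, 7.6, 0.25],
    [ 80.0, 22.0, 6.9, 0.60],
    [ 78.0, 21.5, 7.0, 0.65],
])

Z = (X - X.mean(axis=0)) / X.std(axis=0)        # standardise each variable
cov = np.cov(Z, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)            # eigenvalues in ascending order
order = eigval.argsort()[::-1]
explained = eigval[order] / eigval.sum()        # variance-explained ratios
print("variance explained by PC1, PC2:", explained[:2].round(3))
```

With strongly correlated variables, as in the reservoir data, the first axis dominates; CCA would go one step further and constrain the ordination axes to be linear combinations of these environmental variables.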

  3. Analyzing the influence of BDNF heterozygosity on spatial memory response to 17β-estradiol.

    Science.gov (United States)

    Wu, Y W C; Du, X; van den Buuse, M; Hill, R A

    2015-01-20

    The recent use of estrogen-based therapies as adjunctive treatments for the cognitive impairments of schizophrenia has produced promising results; however, the mechanism behind estrogen-based cognitive enhancement is relatively unknown. Brain-derived neurotrophic factor (BDNF) regulates learning and memory, and its expression is highly responsive to estradiol. We recently found that estradiol modulates the expression of hippocampal parvalbumin-positive GABAergic interneurons, known to regulate neuronal synchrony and cognitive function. What is unknown is whether disruptions to the aforementioned estradiol-parvalbumin pathway alter learning and memory, and whether BDNF may mediate these events. Wild-type (WT) and BDNF heterozygous (+/-) mice were ovariectomized (OVX) at 5 weeks of age and simultaneously received empty, estradiol- or progesterone-filled implants for 7 weeks. At young adulthood, mice were tested for spatial and recognition memory in the Y-maze and novel-object recognition test, respectively. Hippocampal protein expression of BDNF and GABAergic interneuron markers, including parvalbumin, was assessed. WT OVX mice showed impaired performance on the Y-maze and novel-object recognition test. Estradiol (E2) replacement in OVX mice prevented the Y-maze impairment, a behavioral abnormality of dorsal hippocampal origin. BDNF and parvalbumin protein expression in the dorsal hippocampus and parvalbumin-positive cell number in the dorsal CA1 were significantly reduced by OVX in WT mice, while E2 replacement prevented these deficits. In contrast, BDNF(+/-) mice showed either no response or an opposite response to hormone manipulation in both behavioral and molecular indices. Our data suggest that BDNF status is an important biomarker for predicting responsiveness to estrogenic compounds, which have emerged as promising adjunctive therapeutics for schizophrenia patients.

  4. Application of geostatistics with Indicator Kriging for analyzing spatial variability of groundwater arsenic concentrations in Southwest Bangladesh.

    Science.gov (United States)

    Hassan, M Manzurul; Atkins, Peter J

    2011-01-01

    This article explores the spatial variability of groundwater arsenic (As) concentrations in Southwestern Bangladesh. Knowledge of the spatial pattern of As is important for understanding the complex processes behind As concentrations and for predicting them in unsampled areas of the study site. The As data for this study were collected from Southwest Bangladesh and analyzed with Flow Injection Hydride Generation Atomic Absorption Spectrometry (FI-HG-AAS). A geostatistical analysis with Indicator Kriging (IK) was employed to investigate the regionalized variation of As concentrations. The IK prediction map shows a highly uneven spatial pattern of arsenic concentrations. The safe zones are concentrated mainly in the north, central and south parts of the study area in a scattered manner, while the contaminated zones are concentrated in the west and northeast parts of the study area. The southwest part of the study area is contaminated in a highly irregular pattern. A Generalized Linear Model (GLM) was also used to investigate the relationship between As concentrations and aquifer depth. A negligible negative correlation between aquifer depth and arsenic concentration was found in the study area. The fitted values with 95% confidence intervals show a decreasing tendency of arsenic concentration with increasing aquifer depth. The mean-adjusted smoothed lowess curve with a bandwidth of 0.8 shows an increasing trend of arsenic concentration up to a depth of 75 m, with some erratic fluctuations and regional variations at depths between 30 m and 60 m. Borehole lithology was considered to analyze and map the pattern of As variability with aquifer depth. Overall, the study provides an investigation of the spatial pattern and variation of As concentrations.
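    The indicator approach described above can be illustrated with a minimal sketch: concentrations are first transformed to 0/1 indicators against the 50 µg/L Bangladesh drinking-water standard for arsenic, and the indicator field is then interpolated to estimate the probability of exceedance at an unsampled point. Inverse-distance weights stand in here for the kriging weights, which in true Indicator Kriging would come from a fitted indicator variogram; the well coordinates and concentrations are hypothetical.

```python
import math

AS_STANDARD = 50.0  # Bangladesh drinking-water standard for arsenic, ug/L

# Hypothetical tube-well samples: (x_km, y_km, As_ug_per_L)
wells = [(0.0, 0.0, 120.0), (1.0, 0.5, 80.0),
         (2.0, 2.0, 10.0), (0.5, 2.5, 30.0), (3.0, 1.0, 200.0)]

def indicator(conc, threshold=AS_STANDARD):
    """1 if the sample exceeds the threshold, else 0."""
    return 1.0 if conc > threshold else 0.0

def exceedance_probability(x, y, samples, power=2.0):
    """Interpolate the indicator field at (x, y).

    Inverse-distance weights are a simplified stand-in for the
    variogram-derived weights of true Indicator Kriging."""
    num = den = 0.0
    for (sx, sy, conc) in samples:
        d = math.hypot(x - sx, y - sy)
        if d < 1e-9:                  # at a sample point, return its own indicator
            return indicator(conc)
        w = 1.0 / d ** power
        num += w * indicator(conc)
        den += w
    return num / den                  # estimated P(As > 50 ug/L)

p = exceedance_probability(1.5, 1.0, wells)
print(f"P(As > 50 ug/L) at (1.5, 1.0): {p:.2f}")
```

    The weighted-average structure is the essential point: because the interpolated quantity is a 0/1 indicator, the result is directly interpretable as a probability of exceeding the threshold, which is what the IK prediction map displays.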

  5. Mining the Social Web: Analyzing Data from Facebook, Twitter, LinkedIn, and Other Social Media Sites

    CERN Document Server

    Russell, Matthew

    2011-01-01

    Want to tap the tremendous amount of valuable social data in Facebook, Twitter, LinkedIn, and Google+? This refreshed edition helps you discover who's making connections with social media, what they're talking about, and where they're located. You'll learn how to combine social web data, analysis techniques, and visualization to find what you've been looking for in the social haystack, as well as useful information you didn't know existed. Each standalone chapter introduces techniques for mining data in different areas of the social web, including blogs and email. All you need to get started

  6. Open Source Web-Based Solutions for Disseminating and Analyzing Flood Hazard Information at the Community Level

    Science.gov (United States)

    -Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.

    2017-09-01

    We discuss in this paper the development, including the features and functionalities, of an open source web-based flood hazard information dissemination and analysis system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application developed by the Caraga State University to address the needs of local disaster managers in the Caraga Region in Mindanao, Philippines in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application makes use of various free/open source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), various geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and its associated damage to infrastructure. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front-end interface consisting of web maps and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently being expanded to cover additional sites in Mindanao, Philippines through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or "Geo-SAFER Mindanao" Program.

  7. Temporal and Spatial Independent Component Analysis for fMRI Data Sets Embedded in the AnalyzeFMRI R Package

    Directory of Open Access Journals (Sweden)

    Pierre Lafaye de Micheaux

    2011-10-01

    Full Text Available For statistical analysis of functional magnetic resonance imaging (fMRI) data sets, we propose a data-driven approach based on independent component analysis (ICA), implemented in a new version of the AnalyzeFMRI R package. For fMRI data sets, the spatial dimension being much greater than the temporal dimension, spatial ICA is the computationally tractable approach generally proposed. However, for some neuroscientific applications, temporal independence of the source signals can be assumed, and temporal ICA then becomes an attractive exploratory technique. In this work, we use a classical linear algebra result ensuring the tractability of temporal ICA. We report several experiments on synthetic data and real MRI data sets that demonstrate the potential interest of our R package.

  8. Analyzing the causes and spatial pattern of the European 2003 carbon flux anomaly using seven models

    Directory of Open Access Journals (Sweden)

    M. Vetter

    2008-04-01

    Full Text Available Globally, the year 2003 is associated with one of the largest atmospheric CO2 rises on record. In the same year, Europe experienced an anomalously strong flux of CO2 from the land to the atmosphere, associated with an exceptionally dry and hot summer in Western and Central Europe. In this study we analyze the magnitude of this carbon flux anomaly and the key driving ecosystem processes using simulations of seven terrestrial ecosystem models of different complexity and types (process-oriented and diagnostic). We address the following questions: (1) how large were deviations in the net European carbon flux in 2003 relative to a short-term baseline (1998–2002) and to longer-term variations in annual fluxes (1980 to 2005), (2) which European regions exhibited the largest changes in carbon fluxes during the growing season of 2003, and (3) which ecosystem processes controlled the carbon balance anomaly.

    In most models the prominence of the 2003 anomaly in carbon fluxes declined as the reference period was lengthened from one year to 16 years. The 2003 anomaly for annual net carbon fluxes ranged between 0.35 and –0.63 Pg C for a reference period of one year and between 0.17 and –0.37 Pg C for a reference period of 16 years for Europe as a whole.

    In Western and Central Europe, the anomaly in simulated net ecosystem productivity (NEP over the growing season in 2003 was outside the 1σ variance bound of the carbon flux anomalies for 1980–2005 in all models. The estimated anomaly in net carbon flux ranged between –42 and –158 Tg C for Western Europe and between 24 and –129 Tg C for Central Europe depending on the model used. All models responded to a dipole pattern of the climate anomaly in 2003. In Western and Central Europe NEP was reduced due to heat and drought. In contrast, lower than normal temperatures and higher air humidity decreased NEP over Northeastern Europe. While models agree on the sign of changes in

  9. Spatial dependence of polycrystalline FTO’s conductance analyzed by conductive atomic force microscope (C-AFM)

    Energy Technology Data Exchange (ETDEWEB)

    Peixoto, Alexandre Pessoa; Costa, J. C. da [Department of Electrical Engineering, University of Brasília, Campus Universitário Darcy Ribeiro, Asa Norte, PO Box 4386, Brasília - DF, 70919-970 (Brazil)

    2014-05-15

    Fluorine-doped tin oxide (FTO) is a highly transparent, electrically conductive polycrystalline material frequently used as an electrode in organic solar cells and opto-electronic devices [1–2]. In this work a spatial analysis of the conductive behavior of FTO was carried out by conductive-mode atomic force microscopy (C-AFM). A sample with rare, highly oriented grains gave us the opportunity to analyze the top portion of polycrystalline FTO grains and compare it with their borders. It is shown that current flow essentially takes place at the grain boundaries.

  10. Analyzing trophic transfer of heavy metals for food webs in the newly-formed wetlands of the Yellow River Delta, China

    Energy Technology Data Exchange (ETDEWEB)

    Cui Baoshan, E-mail: cuibs@bnu.edu.cn [State Key Joint Laboratory of Environmental Simulation and Pollution Control, School of Environment, Beijing Normal University, Beijing 100875 (China); Zhang Qijun [State Key Joint Laboratory of Environmental Simulation and Pollution Control, School of Environment, Beijing Normal University, Beijing 100875 (China); Zhang Kejiang [Department of Civil Engineering, University of Calgary, Alberta, T2N 1N4 (Canada); Liu Xinhui; Zhang Honggang [State Key Joint Laboratory of Environmental Simulation and Pollution Control, School of Environment, Beijing Normal University, Beijing 100875 (China)

    2011-05-15

    Nine heavy metals sampled from water, sediments, and aquatic organisms in the newly-formed wetlands of the Yellow River Delta (YRD) of China were analyzed to evaluate their concentrations and trophic transfer in food webs. Stable carbon (δ13C) and nitrogen (δ15N) isotopes were used to investigate trophic interactions. Results show that concentrations of most heavy metals detected in water and sediments are lower than those in the Yangtze River Delta and the Pearl River Delta. The longest food web spans approximately 4 trophic levels, with birds at the highest level. The difference in heavy metal concentrations between the endangered Saunders's Gull and three other kinds of protected birds is not obvious. Cd, Zn, and Hg increased with trophic level (TL), while As, Cr, Cu, Mn, Ni and Pb showed the opposite trend; however, the biomagnification of the nine selected heavy metals in the food webs is not significant. - Highlights: > Heavy metal content in the newly-formed wetlands is lower than in similar regions. > There is a trophic level-dependent accumulation of heavy metals in food webs. > The longest food web spans approximately 4 trophic levels, with birds at the highest level. > Cd, Zn, and Hg were identified to increase with trophic level. > The difference in metal content between Saunders's Gull and other birds isn't obvious. - The newly-formed wetlands show slight heavy metal contamination and weak biomagnification through the food webs in the Yellow River Delta.

  11. Spatial heterogeneity in the structure of the planktonic food web in the North Sea

    DEFF Research Database (Denmark)

    Richardson, Kathrine; Nielsen, Torkel Gissel; Bo Pedersen, Flemming

    1998-01-01

    production as well as the greatest percentage of total water column primary production being channelled into copepods were recorded. The regions where subsurface phytoplankton peaks were predicted to form were, thus, characterised by a 'classical' food web in which energy is efficiently transferred...... into larger zooplankters. We argue that heterogeneity in the nutrient status of phytoplankton in the subsurface peak can be important in controlling the type ('classical' or 'regenerated') of planktonic food web found in the water column as a whole...

  12. Fit3D: a web application for highly accurate screening of spatial residue patterns in protein structure data.

    Science.gov (United States)

    Kaiser, Florian; Eisold, Alexander; Bittrich, Sebastian; Labudde, Dirk

    2016-03-01

    The clarification of the linkage between protein structure and function is still a demanding process and can be supported by comparison of spatial residue patterns, so-called structural motifs. However, versatile, up-to-date resources to search for local structure similarities are rare. We present Fit3D, an easily accessible web application for highly accurate screening of structural motifs in 3D protein data. The web application is accessible at https://biosciences.hs-mittweida.de/fit3d and the program sources of the command line version were released under the terms of the GNU GPLv3. Platform-independent binaries and documentation for offline usage are available at https://bitbucket.org/fkaiser/fit3d. Contact: florian.kaiser@hs-mittweida.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  13. Effects of Seasonal and Spatial Differences in Food Webs on Mercury Concentrations in Fish in the Everglades

    Science.gov (United States)

    Kendall, C.; Bemis, B. E.; Wankel, S. D.; Rawlik, P. S.; Lange, T.; Krabbenhoft, D. P.

    2002-05-01

    A clear understanding of the aquatic food web is essential for determining the entry points and subsequent biomagnification pathways of contaminants such as methylmercury (MeHg) in the Everglades. Anthropogenic changes in nutrients can significantly affect the entry points of MeHg by changing food web structure from one dominated by algal productivity to one dominated by macrophytes and associated microbial activity. These changes in the base of the food web can also influence the distribution of animals within the ecosystem, and subsequently the bioaccumulation of MeHg up the food chain. As part of several collaborations with local and other federal agencies, more than 7000 Everglades samples were collected in 1995-99 and analysed for δ13C and δ15N. Many organisms were also analysed for δ34S, gut contents, total Hg, and MeHg. Carbon isotopes effectively distinguish between two main types of food webs: ones where algae are the dominant base of the food web, characteristic of relatively pristine marsh sites with long hydroperiods, and ones where macrophyte debris appears to be a significant source of nutrients, apparently characteristic of shorter-hydroperiod sites and of nutrient-impacted marshes and canals. Many organisms show significant (5-12‰) spatial and temporal differences in δ13C and δ15N values across the Everglades. These differences may reflect site- and season-specific differences in the relative importance of algae vs. macrophyte debris to the food web. However, there is a lack of evidence that these sites otherwise differ in food chain length (as determined by δ15N values). This conclusion is generally supported by the gut contents and mercury data. Furthermore, there are no statistically significant differences between the Δδ15N (predator-algae) values at pristine marsh, nutrient-impacted marsh, or canal sites. The main conclusions from this preliminary comparison of gut contents, stable isotope, and Hg data are: (1) there is
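    The use of carbon isotopes to separate algal-based from macrophyte-based food webs rests on a standard two-source mixing model: a consumer's δ13C, corrected for trophic fractionation, is a weighted average of the two end-member values, so the algal fraction of its carbon can be solved for directly. A minimal sketch; the end-member values and the per-step fractionation are illustrative assumptions, not values from the study:

```python
def algal_fraction(d13c_consumer, d13c_algae, d13c_macrophyte,
                   trophic_fractionation=0.4):
    """Two-source d13C mixing model.

    Returns the fraction of the consumer's carbon derived from algae,
    after subtracting a small per-trophic-step 13C enrichment.
    End-member and fractionation values here are illustrative only."""
    corrected = d13c_consumer - trophic_fractionation
    f = (corrected - d13c_macrophyte) / (d13c_algae - d13c_macrophyte)
    return min(1.0, max(0.0, f))        # clamp to the physical range [0, 1]

# Hypothetical values (permil vs VPDB): periphyton vs macrophyte debris
f = algal_fraction(d13c_consumer=-26.0, d13c_algae=-30.0, d13c_macrophyte=-24.0)
print(f"algal fraction of diet: {f:.2f}")
```

    The model only resolves the two sources when their δ13C end-members are well separated, which is why distinguishable algal and macrophyte signatures matter for tracing MeHg entry points.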

  14. Spatial Analysis of Geohazards using ArcGIS--A web-based Course.

    Science.gov (United States)

    Harbert, W.; Davis, D.

    2003-12-01

    As part of the Environmental Systems Research Incorporated (ESRI) Virtual Campus program, a course was designed to present the benefits of Geographical Information Systems (GIS) based spatial analysis as applied to a variety of geohazards. We created this on-line ArcGIS 8.x-based course to aid the motivated student or professional in his or her efforts to use GIS in determining where geohazards are likely to occur and in assessing their potential impact on the human community. Our course is broadly designed for earth scientists, public sector professionals, students, and others who want to apply GIS to the study of geohazards. Participants work with ArcGIS software and diverse datasets to display, visualize and analyze a wide variety of data sets and to map a variety of geohazards including earthquakes, volcanoes, landslides, tsunamis, and floods. Following the GIS-based methodology of posing a question, decomposing the question into specific criteria, applying the criteria to spatial or tabular geodatasets and then analyzing feature relationships, the course content was designed from the beginning to enable the motivated student to answer questions: for example, to explain the relationship between earthquake location, earthquake depth, and plate boundaries; use a seismic hazard map to identify population and features at risk from an earthquake; import data from an earthquake catalog and visualize these data in 3D; explain the relationship between earthquake damage and local geology; use a flood scenario map to identify features at risk for forecast river discharges; use a tsunami inundation map to identify population and features at risk from a tsunami; use a hurricane inundation map to identify the population at risk for any given category of hurricane; estimate accumulated precipitation by integrating time-series Doppler radar data; and model a real-life landslide event.
The six on-line modules for our course are Earthquakes I, Earthquakes II, Volcanoes

  15. A Web-based Course in the Spatial Analysis of Geohazards using ArcGIS

    Science.gov (United States)

    Harbert, W.; Davis, D.

    2003-04-01

    Geologic hazards loom all around us. As population growth forces more communities to expand into areas at risk from these ominous threats, concern increases about the danger that geohazards pose to people, property, and the environment. As part of the Environmental Systems Research Incorporated (ESRI) Virtual Campus program, a course was designed to present the benefits of Geographical Information Systems (GIS) based spatial analysis as applied to a variety of geohazards. We created this on-line ArcGIS 8.2-based course to aid the motivated student or professional in his or her efforts to use GIS in determining where geohazards are likely to occur and in assessing their potential impact on the human community. Our course is broadly designed for earth scientists, public sector professionals, students, and others who want to apply GIS to the study of geohazards. Participants work with ArcGIS software and diverse datasets to display, visualize and analyze a wide variety of data sets and to map a variety of geohazards including earthquakes, volcanoes, landslides, tsunamis, and floods. Following the GIS-based methodology of posing a question, decomposing the question into specific criteria, applying the criteria to spatial or tabular geodatasets and then analyzing feature relationships, the course content was designed from the beginning to enable the motivated student to answer questions: for example, to explain the relationship between earthquake location, earthquake depth, and plate boundaries; use a seismic hazard map to identify population and features at risk from an earthquake; import data from an earthquake catalog and visualize these data in 3D; explain the relationship between earthquake damage and local geology; use a flood scenario map to identify features at risk for forecast river discharges; use a tsunami inundation map to identify population and features at risk from a tsunami; use a hurricane inundation map to identify the population at risk

  16. Food-web inferences of stable isotope spatial patterns in copepods and yellowfin tuna in the pelagic eastern Pacific Ocean

    Science.gov (United States)

    Olson, Robert J.; Popp, Brian N.; Graham, Brittany S.; López-Ibarra, Gladis A.; Galván-Magaña, Felipe; Lennert-Cody, Cleridy E.; Bocanegra-Castillo, Noemi; Wallsgrove, Natalie J.; Gier, Elizabeth; Alatorre-Ramírez, Vanessa; Ballance, Lisa T.; Fry, Brian

    2010-07-01

    Evaluating the impacts of climate and fishing on oceanic ecosystems requires an improved understanding of the trophodynamics of pelagic food webs. Our approach was to examine broad-scale spatial relationships among the stable N isotope values of copepods and yellowfin tuna (Thunnus albacares), and to quantify yellowfin tuna trophic status in the food web based on stable-isotope and stomach-contents analyses. Using a generalized additive model fitted to abundance-weighted-average δ15N values of several omnivorous copepod species, we examined isotopic spatial relationships among yellowfin tuna and copepods. We found a broad-scale, uniform gradient in δ15N values of copepods increasing from south to north in a region encompassing the eastern Pacific warm pool and parts of several current systems. Over the same region, a similar trend was observed for the δ15N values in the white muscle of yellowfin tuna caught by the purse-seine fishery, implying limited movement behavior. Assuming the omnivorous copepods represent a proxy for the δ15N values at the base of the food web, the isotopic difference between these two taxa, "ΔYFT-COP", was interpreted as a trophic-position offset. Yellowfin tuna trophic-position estimates based on their bulk δ15N values were not significantly different from independent estimates based on stomach contents, but are sensitive to errors in the trophic enrichment factor and the trophic position of copepods. An apparent inshore-offshore, east to west gradient in yellowfin tuna trophic position was corroborated using compound-specific isotope analysis of amino acids conducted on a subset of samples. The gradient was not explained by the distribution of yellowfin tuna of different sizes, by seasonal variability at the base of the food web, or by known ambit distances (i.e. movements). Yellowfin tuna stomach contents did not show a regular inshore-offshore gradient in trophic position during 2003-2005, but the trophic
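    The bulk-δ15N trophic-position estimate described above follows the standard formula: the consumer's δ15N minus the baseline δ15N, divided by a trophic enrichment factor (TEF), plus the trophic position assigned to the baseline organism. The abstract's sensitivity point falls out directly, since the estimate scales inversely with the TEF. A minimal sketch; all numbers are hypothetical, not the study's measurements:

```python
def trophic_position(d15n_consumer, d15n_baseline,
                     tef=3.4, baseline_tp=2.5):
    """Bulk-d15N trophic position relative to an omnivorous-copepod baseline.

    tef: trophic enrichment factor (permil per trophic level); 3.4 is a
    commonly used default. baseline_tp is the trophic position assumed
    for the copepods. Both are assumptions, not measured values."""
    return baseline_tp + (d15n_consumer - d15n_baseline) / tef

# Hypothetical yellowfin tuna vs copepod values (permil)
tp = trophic_position(d15n_consumer=16.0, d15n_baseline=8.0)
print(f"estimated trophic position: {tp:.2f}")

# Sensitivity to the TEF, as noted in the abstract:
print(f"with TEF=2.5: {trophic_position(16.0, 8.0, tef=2.5):.2f}")
```

    Because the TEF divides the isotopic offset, a modest error in the assumed enrichment per trophic level shifts the estimate substantially, which is why the compound-specific amino-acid analysis is used as an independent check.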

  17. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    Science.gov (United States)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper presents easily accessible, integrated web-based analysis of satellite images with plug-in based open source software. It is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox for accessing satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data with remotely available data and processing functionality without much effort. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are often a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS and remote sensing software package, comprising a complete suite of image processing, spatial analysis and digital mapping tools, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into modular, plug-in-based open source software with web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-)services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, an open initiative that advances the development of cutting edge open source geospatial software, using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the

  18. Development and Evaluation of a Web Map Mind Tool Environment with the Theory of Spatial Thinking and Project-Based Learning Strategy

    Science.gov (United States)

    Hou, Huei-Tse; Yu, Tsai-Fang; Wu, Yi-Xuan; Sung, Yao-Ting; Chang, Kuo-En

    2016-01-01

    The theory of spatial thinking is relevant to the learning and teaching of many academic domains. One promising method to facilitate learners' higher-order thinking is to utilize a web map mind tool to assist learners in applying spatial thinking to cooperative problem solving. In this study, an environment is designed based on the theory of…

  19. Spatial variability of carbon (δ13C) and nitrogen (δ15N) stable isotope ratios in an Arctic marine food web

    DEFF Research Database (Denmark)

    Hansen, Joan Holst; Hedeholm, Rasmus Berg; Sünksen, Kaj

    2012-01-01

    Stable isotopes of carbon (δ13C) and nitrogen (δ15N) were used to examine trophic structures in an arctic marine food web at small and large spatial scales. Twelve species, from primary consumers to Greenland shark, were sampled at a large spatial scale near the west and east coasts of Greenland...... of the variation to physical and biological sources. Hence, significant differences in isotopic signatures on both large and small spatial scales were less related to food web structure than to different physical and biological properties of the water masses. Accordingly, the results illustrate the importance...

  20. A web-based spatial decision supporting system (S-DSS) for grapevine quality: the viticultural tool of the SOILCONS-WEB Project

    Science.gov (United States)

    Manna, Piero; Bonfante, Antonello; Basile, Angelo; Langella, Giuliano; Agrillo, Antonietta; De Mascellis, Roberto; Florindo Mileti, Antonio; Minieri, Luciana; Orefice, Nadia; Terribile, Fabio

    2014-05-01

    The SOILCONSWEB Project aims to create a decision support system operating at the landscape scale (Spatial-DSS) for the protection and management of soils in both agricultural and environmental issues; it is a cyber-infrastructure built on remote servers, operating through the web at www.landconsultingweb.eu. It includes, among others, a series of tools specifically designed for viticulture aiming at high quality wine production. The system was realized through a collaboration between the University of Naples Federico II, CNR ISAFoM, Ariespace srl and SeSIRCA-Campania Region within a 5-year LIFE+ project funded by the European Community. The system includes tools based on modelling procedures at different levels of complexity, some of which are specifically designed for viticulture issues. One of the implemented models arises from the original desktop-based SWAP model (Kroes et al., 2008). It can be run "on the fly" through a very user-friendly web interface. This tool, thanks to the model based on the Richards equation, can produce data on vineyard water stress, simulating the soil water balances of the different soil types within an area of interest. Thanks to a specific program developed within the project activities, the Spatial-DSS acquires point weather data every day and automatically spatializes them with geostatistical approaches in order to use them as input for running the SPA (Soil-Plant-Atmosphere) model, in particular for defining the upper boundary condition (rainfall, and temperatures to estimate ET0 by the Hargreaves model). Soil hydraulic properties (47 soil profiles within the study area), also essential for the modelling simulations, were measured in the laboratory using Wind's approach or estimated through the HYPRES PTF. Water retention and hydraulic conductivity relationships were parameterized according to the van Genuchten-Mualem model. Decision makers (individuals, groups of interest and public bodies) through the DSS can have real
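    The upper-boundary-condition step above, estimating ET0 from daily temperatures, can be sketched with the standard Hargreaves-Samani (1985) equation. The extraterrestrial radiation Ra is supplied here as a hypothetical value rather than computed from latitude and day of year, as the Spatial-DSS would do; the input temperatures are likewise illustrative:

```python
import math

def hargreaves_et0(tmin_c, tmax_c, ra_mj_m2_day):
    """Daily reference evapotranspiration (mm/day), Hargreaves-Samani form.

    ra_mj_m2_day: extraterrestrial radiation; the 0.408 factor converts
    MJ m-2 day-1 to its mm/day evaporation equivalent. In an operational
    system Ra would be derived from latitude and day of year."""
    tmean = (tmin_c + tmax_c) / 2.0
    return (0.0023 * 0.408 * ra_mj_m2_day
            * (tmean + 17.8) * math.sqrt(tmax_c - tmin_c))

# Hypothetical summer day
et0 = hargreaves_et0(tmin_c=16.0, tmax_c=30.0, ra_mj_m2_day=40.0)
print(f"ET0 = {et0:.2f} mm/day")
```

    The appeal of this formulation for a web DSS is that it needs only daily temperature extremes, which are exactly the point weather data the system spatializes each day.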

  1. Facilitating Spatial Thinking in World Geography Using Web-Based GIS

    Science.gov (United States)

    Jo, Injeong; Hong, Jung Eun; Verma, Kanika

    2016-01-01

    Advocates for geographic information system (GIS) education contend that learning about GIS promotes students' spatial thinking. Empirical studies are still needed to elucidate the potential of GIS as an instructional tool to support spatial thinking in other geography courses. Using a non-equivalent control group research design, this study…

  2. MS Data Miner: a web-based software tool to analyze, compare, and share mass spectrometry protein identifications.

    Science.gov (United States)

    Dyrlund, Thomas F; Poulsen, Ebbe T; Scavenius, Carsten; Sanggaard, Kristian W; Enghild, Jan J

    2012-09-01

    Data processing and analysis of proteomics data are challenging and time consuming. In this paper, we present MS Data Miner (MDM) (http://sourceforge.net/p/msdataminer), a freely available web-based software solution aimed at minimizing the time required for the analysis, validation, data comparison, and presentation of data files generated in MS software, including Mascot (Matrix Science), Mascot Distiller (Matrix Science), and ProteinPilot (AB Sciex). The program was developed to significantly decrease the time required to process large proteomic data sets for publication. This open-source system includes a spectra validation system and an automatic screenshot generation tool for Mascot-assigned spectra. In addition, a Gene Ontology term analysis function and a tool for generating comparative Excel data reports are included. We illustrate the benefits of MDM during a proteomics study comprising more than 200 LC-MS/MS analyses recorded on an AB Sciex TripleTOF 5600, identifying more than 3000 unique proteins and 3.5 million peptides. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Analyzing HT-SELEX data with the Galaxy Project tools--A web based bioinformatics platform for biomedical research.

    Science.gov (United States)

    Thiel, William H; Giangrande, Paloma H

    2016-03-15

    The development of DNA and RNA aptamers for research as well as diagnostic and therapeutic applications is a rapidly growing field. In the past decade, the process of identifying aptamers has been revolutionized with the advent of high-throughput sequencing (HTS). However, bioinformatics tools that enable the average molecular biologist to analyze these large datasets and expedite the identification of candidate aptamer sequences have been lagging behind the HTS revolution. The Galaxy Project was developed in order to efficiently analyze genome, exome, and transcriptome HTS data, and we have now applied these tools to aptamer HTS data. The Galaxy Project's public webserver is an open source collection of bioinformatics tools that are powerful, flexible, dynamic, and user friendly. The online nature of the Galaxy webserver and its graphical interface allow users to analyze HTS data without compiling code or installing multiple programs. Herein we describe how tools within the Galaxy webserver can be adapted to pre-process, compile, filter and analyze aptamer HTS data from multiple rounds of selection.
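    The compile-and-filter step for aptamer HTS data amounts to counting unique sequences per selection round and computing enrichment between rounds; a minimal sketch with toy reads (not the Galaxy tools themselves):

```python
from collections import Counter

# Toy sequencing reads from two SELEX rounds (real data: millions of reads)
round3 = ["ACGTACGT", "ACGTACGT", "TTGGCCAA", "ACGTACGT", "GGGGCCCC"]
round6 = ["ACGTACGT"] * 6 + ["TTGGCCAA"]

def frequencies(reads):
    """Relative frequency of each unique sequence in a round."""
    counts = Counter(reads)
    total = sum(counts.values())
    return {seq: n / total for seq, n in counts.items()}

f3, f6 = frequencies(round3), frequencies(round6)

# Fold enrichment between rounds for sequences observed in both
enrichment = {seq: f6[seq] / f3[seq] for seq in f6 if seq in f3}
best = max(enrichment, key=enrichment.get)
print(best, round(enrichment[best], 2))   # ACGTACGT 1.43
```

    Candidate aptamers are typically those whose relative frequency keeps rising across rounds, exactly what the enrichment ratio captures.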

  4. The leisure commons a spatial history of web 2.0

    CERN Document Server

    Arora, Payal

    2014-01-01

    There is much excitement about Web 2.0 as an unprecedented, novel, community-building space for experiencing, producing, and consuming leisure, particularly through social network sites. What is needed is a perspective that is invested in neither a utopian nor a dystopian posture but sees historical continuity in this cyberleisure geography. This book investigates the digital public sphere by drawing parallels to another leisure space that shares its rhetoric of being open, democratic, and free for all: the urban park. It makes the case that the history and politics of public parks as an urban co

  5. Comparative Pathway Analyzer--a web server for comparative analysis, clustering and visualization of metabolic networks in multiple organisms.

    Science.gov (United States)

    Oehm, Sebastian; Gilbert, David; Tauch, Andreas; Stoye, Jens; Goesmann, Alexander

    2008-07-01

    In order to understand the phenotype of any living system, it is essential to not only investigate its genes, but also the specific metabolic pathway variant of the organism of interest, ideally in comparison with other organisms. The Comparative Pathway Analyzer, CPA, calculates and displays the differences in metabolic reaction content between two sets of organisms. Because results are highly dependent on the distribution of organisms into these two sets and the appropriate definition of these sets often is not easy, we provide hierarchical clustering methods for the identification of significant groupings. CPA also visualizes the reaction content of several organisms simultaneously allowing easy comparison. Reaction annotation data and maps for visualizing the results are taken from the KEGG database. Additionally, users can upload their own annotation data. This website is free and open to all users and there is no login requirement. It is available at https://www.cebitec.uni-bielefeld.de/groups/brf/software/cpa/index.html.
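    The grouping of organisms by reaction content can be sketched as average-linkage hierarchical clustering on Jaccard distances over a binary reaction-presence matrix (toy matrix; CPA's exact method and parameters may differ):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Binary reaction-presence matrix: rows = organisms, columns = reactions
# (toy stand-in for KEGG-derived reaction annotations)
reactions = np.array([
    [1, 1, 1, 0, 0],   # organism A
    [1, 1, 0, 0, 0],   # organism B
    [0, 0, 1, 1, 1],   # organism C
    [0, 0, 0, 1, 1],   # organism D
], dtype=bool)

dist = pdist(reactions, metric="jaccard")     # pairwise dissimilarity
tree = linkage(dist, method="average")        # UPGMA hierarchical clustering
groups = fcluster(tree, t=2, criterion="maxclust")
print(groups)   # A and B cluster together; so do C and D
```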

  6. Development of a web GIS application for emissions inventory spatial allocation based on open source software tools

    Science.gov (United States)

    Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis

    2013-03-01

    Combining emission inventory methods and geographic information systems (GIS) remains a key issue for environmental modelling and management purposes. This paper examines the development of a web GIS application as part of an emission inventory system that produces maps and files with spatially allocated emissions in a grid format. The study is not confined to the maps produced; it also presents the features and capabilities of a web application that can be used by any user, even without prior knowledge of the GIS field. The development of the application was based on open source software tools such as MapServer for the GIS functions, PostgreSQL and PostGIS for data management, and HTML, PHP and JavaScript as programming languages. In addition, background processes are used in an innovative manner to handle the time-consuming and computationally costly procedures of the application. Furthermore, a web map service was created to provide maps to other clients, such as the Google Maps API v3 that is used as part of the user interface. The output of the application includes maps in vector and raster format, maps with temporal resolution on a daily and hourly basis, grid files that can be used by air quality management systems, and grid files consistent with the European Monitoring and Evaluation Programme grid. Although the system was developed and validated for the Republic of Cyprus, covering a remarkably wide range of pollutants and emission sources, it can easily be customized for use in other countries or smaller areas, as long as geospatial and activity data are available.
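    The core spatial-allocation step, summing source emissions into grid cells, can be sketched with a weighted 2-D histogram (toy point sources; a real system also allocates line and area sources via proxies such as road or population density):

```python
import numpy as np

# Toy point sources: coordinates (km) and annual emissions (t/yr)
x = np.array([1.2, 1.8, 5.5, 5.9, 9.1])
y = np.array([2.0, 2.4, 5.0, 5.2, 8.8])
emis = np.array([10.0, 5.0, 20.0, 7.0, 3.0])

# Sum emissions into a 5x5 grid over a 10 km x 10 km domain; each cell
# receives the total of the sources that fall inside it
grid, xedges, yedges = np.histogram2d(
    x, y, bins=5, range=[[0, 10], [0, 10]], weights=emis)

print(grid.sum())   # 45.0 -- emitted mass is conserved
```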

  7. Utilization of web-based stationary rainfall data for near-real-time derivation of spatial landslide susceptibility

    Science.gov (United States)

    Canli, Ekrem; Glade, Thomas; Loigge, Bernd

    2016-04-01

    Scarcity of high-quality meteorological data is often referred to as one of the main constraints on real-time landslide forecasting. Meteorological data may be expensive, or may no longer be up to date soon after acquisition. However, the internet is a great source of freely available, high-quality real-time weather data from different providers. Web scraping has emerged as a highly valuable technique for utilizing information from public websites: it is the process of automatically gathering data from the internet, extracting those data according to the required needs, storing the selected data, and using the resulting self-generated databases for further analysis. This technique is of particular value for weather data that are released to the public regularly at short intervals, but it is applicable to any other type of continuously released data. By applying these techniques, research institutions in developing countries may be able to generate their own free data without needing to purchase expensive, ready-made weather data. Some weather data providers already offer application programming interfaces (APIs) that facilitate access to real-time weather data, but those usually have to be purchased. Here we present an approach for integrating web-based rainfall data from different sources into an automated workflow. This workflow ranges from the query of near-real-time data to spatially interpolating the rain gauge measurements into a continuous rainfall raster. Subsequently, this raster is handed over to a dynamic, physically based landslide model for generating hourly distributed landslide susceptibility maps on a regional scale. Future work involves the establishment of regional intensity-duration rainfall thresholds that are continuously evaluated against the distributed rainfall patterns derived from real-time rainfall data.
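    The interpolation stage of such a workflow can be sketched as follows, with hard-coded gauge readings standing in for scraped data and inverse-distance weighting standing in for the geostatistical method named in the abstract (a deliberate simplification):

```python
import numpy as np

# Gauge locations (km) and latest hourly rainfall (mm); in the automated
# workflow these values would be fetched/scraped, not hard-coded
gx = np.array([0.0, 10.0, 5.0])
gy = np.array([0.0, 0.0, 8.0])
rain = np.array([2.0, 6.0, 4.0])

def idw(px, py, power=2.0):
    """Inverse-distance-weighted rainfall estimate at (px, py); a simple
    stand-in for kriging or other geostatistical interpolation."""
    d = np.hypot(gx - px, gy - py)
    if np.any(d < 1e-9):              # query point sits on a gauge
        return float(rain[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * rain) / np.sum(w))

# Rasterize onto a coarse grid, ready to feed a distributed landslide model
xs, ys = np.meshgrid(np.linspace(0, 10, 6), np.linspace(0, 8, 5))
raster = np.vectorize(idw)(xs, ys)
print(raster.shape)   # (5, 6)
```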

  8. Study on the Sharing Mechanism of Coal Mine Spatial Data Based on Agent and Web Service

    Institute of Scientific and Technical Information of China (English)

    谢娟娟; 顾寄南

    2012-01-01

    With the development of GIS and computer technology, coal-mining enterprises have accumulated massive multi-source, heterogeneous spatial data. How to eliminate spatial-information islands within these enterprises and realize spatial data sharing are problems that urgently need to be solved. This paper discusses methods for integrating heterogeneous spatial data in coal-mining enterprises and, by innovatively combining Agent and Web Service technologies, realizes intelligent sharing and efficient querying of spatial data, building a coal mine spatial data integration system based on Agents and Web Services.

  9. A Marine Remote Sensing Spatial Database Engine for Web Publishing

    Institute of Scientific and Technical Information of China (English)

    陈志荣; 徐财江

    2008-01-01

    To meet the requirements of efficient management and web publishing of marine remote sensing data, a spatial database engine named MRSSDE was independently designed. The logical model, physical model and optimization methods of MRSSDE are discussed in detail. Compared with ArcSDE, the leading spatial database engine product, MRSSDE proved to be more efficient.

  10. Research on Web Services for Spatial Data Interoperability

    Institute of Scientific and Technical Information of China (English)

    王峰; 田锋

    2005-01-01

    This paper introduces the concepts, component technologies and protocols of Web Services, analyzes Web Service-based specifications for spatial data interoperability, and summarizes the good support and adaptability of Web Services for the sharing, interoperation and integration of spatial data, concluding that Web Services are the best solution for realizing spatial data interoperability.

  11. Using the World Wide Web as a Teaching Tool: Analyzing Images of Aging and the Visual Needs of an Aging Society.

    Science.gov (United States)

    Jakobi, Patricia

    1999-01-01

    Analysis of Web site images of aging to identify positive and negative representations can help teach students about social perceptions of older adults. Another learning experience involves consideration of the needs of older adults in Web site design. (SK)

  12. Spatially Referenced Educational Achievement Data Exploration: A Web-Based Interactive System Integration of GIS, PHP, and MySQL Technologies

    Science.gov (United States)

    Mulvenon, Sean W.; Wang, Kening; Mckenzie, Sarah; Anderson, Travis

    2006-01-01

    Effective exploration of spatially referenced educational achievement data can help educational researchers and policy analysts speed up gaining valuable insight into datasets. This article illustrates a demo system developed in the National Office for Research on Measurement and Evaluation Systems (NORMES) for supporting Web-based interactive…

  14. SLDs for Visualizing Multicolor Elevation Contour Lines in Geo-Spatial Web Applications

    CERN Document Server

    Kodge, B G

    2011-01-01

    This paper addresses the need for geospatial consumers (either humans or machines) to visualize multicolored elevation contour polylines with respect to their different contour intervals and to control the visual portrayal of the data with which they work. The current OpenGIS Web Map Service (WMS) specification supports the ability for an information provider to specify very basic styling options by advertising a preset collection of visual portrayals for each available data set. However, while a WMS currently can provide the user with a choice of style options, the WMS can only tell the user the name of each style; it cannot tell the user what the portrayal will look like on the map. More importantly, the user has no way of defining their own styling rules. The ability for a human or machine client to define these rules requires a styling language that the client and server can both understand. Defining this language, called the Styled Layer Descriptor (SLD), is the main focus of this paper, and it can be used to portr...

  15. An approach to analyzing environmental drivers to spatial variations in annual distribution of periphytic protozoa in coastal ecosystems.

    Science.gov (United States)

    Xu, Guangjian; Xu, Henglong

    2016-03-15

    The environmental drivers of spatial variation in the annual distribution of periphytic protozoa were studied based on an annual dataset using multivariate approaches. Samples were collected monthly at four stations within a pollution gradient in coastal waters of the Yellow Sea, northern China, during a 1-year period. The second-stage (2STAGE) analyses showed that the internal patterns of the annual distribution changed along the pollution gradient in terms of abundance. The dominant species showed different succession dynamics among the four sampling stations during the 1-year cycle. Best-matching analysis demonstrated that the spatial variations in the annual distribution of the protozoa were significantly correlated with ammonium nitrogen (NH4-N), alone or in combination with salinity and dissolved oxygen (DO). Based on these results, we suggest that nutrients, salinity and DO may be the main drivers shaping the spatial variations in the annual distribution of periphytic protozoa.

  16. Spatial differences in East scotia ridge hydrothermal vent food webs: influences of chemistry, microbiology and predation on trophodynamics.

    Directory of Open Access Journals (Sweden)

    William D K Reid

    Full Text Available The hydrothermal vents on the East Scotia Ridge are the first to be explored in the Antarctic and are dominated by large peltospiroid gastropods, stalked barnacles (Vulcanolepas sp.) and anomuran crabs (Kiwa sp.), but their food webs are unknown. Vent fluid and macroconsumer samples were collected at three vent sites (E2, E9N and E9S), at distances of tens of metres to hundreds of kilometres apart and with contrasting vent fluid chemistries, to describe trophic interactions and identify potential carbon fixation pathways using stable isotopes. δ13C of dissolved inorganic carbon from vent fluids ranged from -4.6‰ to 0.8‰ at E2 and from -4.4‰ to 1.5‰ at E9. The lowest macroconsumer δ13C was observed in peltospiroid gastropods (-30.0‰ to -31.1‰) and indicated carbon fixation via the Calvin-Benson-Bassham (CBB) cycle by endosymbiotic gamma-Proteobacteria. The highest δ13C occurred in Kiwa sp. (-19.0‰ to -10.5‰), similar to that of the epibionts sampled from their ventral setae. Kiwa sp. δ13C differed among sites, which was attributed to spatial differences in the epibiont community and the relative contribution of carbon fixed via the reductive tricarboxylic acid (rTCA) and CBB cycles assimilated by Kiwa sp. Site differences in carbon fixation pathways were traced into higher trophic levels, e.g. a stichasterid asteroid that predates on Kiwa sp. Sponges and anemones at the periphery of E2 assimilated a proportion of epipelagic photosynthetic primary production, but this was not observed at E9N. Differences in the δ13C and δ34S values of vent macroconsumers between the E2 and E9 sites suggest the relative contributions of photosynthetic and chemoautotrophic carbon fixation (rTCA vs. CBB) entering the hydrothermal vent food webs vary between the sites.

  17. Improving query services of web map by web mining

    Science.gov (United States)

    Huang, Maojun

    2007-11-01

    A web map is the hybrid of a map and the World Wide Web (the Web), usually created with WebGIS techniques. With rapid social development, public-facing web maps face pressure from increasing demands that they cannot yet satisfy. The geocoding database plays a key role in supporting query services effectively, but the traditional geocoding method is laborious and time-consuming, while much spatial information available online could serve as a supplementary source for geocoding. This paper therefore discusses how to improve query services by web mining. The improvement has three facets: first, improving location queries by discovering and extracting address information from the Web to extend the geocoding database; second, enhancing optimum-path queries for public transport and buffer queries by spatial analysis and reasoning on the extended geocoding database; third, adjusting data-collection strategies according to patterns discovered by mining web map queries. Finally, the paper presents the design of the application system and experimental results.

  18. A Brief Analysis of the Application of Web Technology in Distribution Network GIS

    Institute of Scientific and Technical Information of China (English)

    杨勤勤

    2004-01-01

    Considering the characteristics of GIS in traditional distribution management systems, this paper argues that legacy GIS systems can no longer meet the requirements of modern management information systems, whereas Web GIS has proven successful in this respect and can satisfy these requirements well. It compares Web GIS with traditional GIS, analyzes the technologies used to implement Web GIS, and uses a practical example to analyze the application of Web GIS in production work.

  19. Multi-Source Data Processing Middleware for Land Monitoring within a Web-Based Spatial Data Infrastructure for Siberia

    Directory of Open Access Journals (Sweden)

    Christiane Schmullius

    2013-06-01

    Full Text Available Land monitoring is a key issue in Earth system sciences for studying environmental changes. To generate knowledge about change, e.g., to decrease uncertainty in the results and build confidence in land change monitoring, multiple information sources are needed. Earth observation (EO) satellites and in situ measurements are available for operational monitoring of the land surface. As the availability of well-prepared geospatial time-series data for environmental research is limited, user-dependent processing steps with respect to the data source and formats pose additional challenges. In most cases, it is possible to support science with spatial data infrastructures (SDI) and services that provide such data in a processed format. A data processing middleware is proposed as a technical solution to improve interdisciplinary research using multi-source time-series data and standardized data acquisition, pre-processing, updating and analyses. This solution is being implemented within the Siberian Earth System Science Cluster (SIB-ESS-C), which combines various sources of EO data, climate data and analytical tools. The development of this SDI is based on the definition of automated and on-demand tools for data searching, ordering and processing, implemented along with standard-compliant web services. These tools, consisting of a user-friendly download, analysis and interpretation infrastructure, are available within SIB-ESS-C for operational use.

  20. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data source; interfaces for Geospatial Semantic Web, VGI (Volunteered Geographic Information) and Geospatial Semantic Web; challenges of Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use the Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.

  1. Analyzing the relationship between peak runoff discharge and land-use pattern – a spatial optimization approach

    Directory of Open Access Journals (Sweden)

    I.-Y. Yeo

    2009-04-01

    Full Text Available This paper investigates the impacts of land-use patterns on watershed hydrology and characterizes the nature of this relationship. The approach combines a spatially explicit, process-based hydrological simulation model and a land-use optimization model, the Integrated Hydrological and Land-Use Optimization (IHLUO) model, together with an extensive GIS database. Numerical experiments are conducted to assess changes in the peak discharge rate under various spatial land-use arrangements, and to delineate the optimal land distribution that minimizes the peak discharge. The area of application is a catchment of the Old Woman Creek watershed in the southwestern coastal area of Lake Erie, OH. The global optimality of the delineated land pattern at a 30-m resolution is evaluated using a combinatorial statistical method. A large number of solutions have been generated from clearly different initial solutions, and these solutions turn out to be very close to each other, strongly supporting the case for a convex relationship between peak discharge and land-use pattern. The Weibull distribution is used to generate a point estimate of the global optimal value and its confidence interval. The peak discharge function is further examined in light of the underlying physics used in the simulation model.

  2. Differentiating between spatial and temporal effects by applying modern data analyzing techniques to measured soil moisture data

    Science.gov (United States)

    Hohenbrink, Tobias L.; Lischeid, Gunnar; Schindler, Uwe

    2013-04-01

    Large data sets containing time series of soil hydrological variables exist due to extensive monitoring work in recent decades. The interplay of different processes and influencing factors causes spatial and temporal patterns which contribute to the total variance. This implies that monitoring data sets contain information about the most relevant processes, and that information can be extracted using modern data analysis techniques. Our objectives were (i) to decompose the total variance of an example data set of measured soil moisture time series into independent components and (ii) to relate these components to specific influencing factors. Soil moisture had been measured at 12 plots in an Albeluvisol located in Müncheberg, northeastern Germany, between May 1st, 2008 and July 1st, 2011. Each plot was equipped with FDR probes at 7 depths between 30 cm and 300 cm. Six plots were cultivated with winter rye and silage maize (Crop Rotation System I) and the other six with silage maize, winter rye/millet, triticale/lucerne and lucerne (Crop Rotation System II). We applied a principal component analysis to the soil moisture data set. The first component described the mean temporal behavior of all soil moisture time series. The second component reflected the impact of soil depth. Together they explained 80% of the data set's total variance. An analysis of the first two components confirmed that the measured plots showed a similar extent of signal damping at each depth. The fourth component revealed the impact of the two different crop rotation systems, which explained about 4% of the total variance and 13% of the spatial variance of the soil moisture data. That is only a minor fraction compared to the effects of small-scale soil texture heterogeneity. Principal component analysis has proven to be a useful tool for extracting less apparent signals.
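    The variance decomposition described above can be sketched as a principal component analysis via SVD on synthetic series (toy data mimicking a shared seasonal signal damped with depth; not the study's data set):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(200)

# Toy soil moisture series: one shared seasonal signal, damped with depth,
# plus noise (stand-ins for the FDR probe time series; 8 series here)
season = np.sin(2 * np.pi * t / 100)
depth_damping = np.linspace(1.0, 0.3, 8)
X = np.outer(season, depth_damping) + 0.05 * rng.standard_normal((200, 8))

Xc = X - X.mean(axis=0)                  # center each time series
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)      # variance fraction per component

print(explained[0] > 0.9)                # the shared signal dominates PC1
```

    Here the loadings in `Vt[0]` mirror the depth-dependent damping, analogous to how the study's leading components captured mean temporal behavior and depth.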

  3. Fano lineshapes of 'Peak-tracking chip' spatial profiles analyzed with correlation analysis for bioarray imaging and refractive index sensing

    KAUST Repository

    Bougot-Robin, K.

    2013-05-22

    The asymmetric Fano resonance lineshape, resulting from interference between a background and a resonant scattering contribution, is archetypal in resonant waveguide grating (RWG) reflectivity. A resonant profile shift resulting from a change of refractive index (from the fluid medium or from biomolecules at the chip surface) is classically used to perform label-free sensing. Lineshapes are sometimes sampled at discretized "detuning" values to relax instrumental demands, the highest-reflectivity element giving a coarse resonance estimate. A finer extraction, needed to increase sensor sensitivity, can be obtained using a correlation approach, correlating the sensed signal with a zero-shifted reference signal. The fabrication process leading to discrete Fano profiles is presented. Our findings are illustrated with resonance profiles from silicon nitride RWGs operated at visible wavelengths. We recently demonstrated that direct-imaging multi-assay RWG sensing may be rendered more reliable using "chirped" RWG chips, obtained by varying an RWG structure parameter. The spatial reflectivity profiles of tracks composed of RWG units with slowly varying filling factor (thus slowly varying resonance condition) are then measured under monochromatic conditions. Extracting the resonance location from the spatial Fano profiles allows multiplexed refractive-index-based sensing. Discretization and sensitivity are discussed through both simulation and experiment for different filling factor variations, here Δf=0.0222 and Δf=0.0089. This scheme based on a "Peak-tracking chip" demonstrates a new technique for bioarray imaging using a simpler set-up that maintains high performance with cheap lenses, with sensitivity down to Δn=2×10^-5 RIU for the highest sampling of the Fano lineshapes. © (2013) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
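    The correlation approach for extracting a resonance shift from discretized Fano profiles can be sketched as follows (toy lineshape parameters, not the paper's chip values; the shift is recovered to one-sample resolution by cross-correlating with a zero-shift reference):

```python
import numpy as np

def fano(eps, q=2.0):
    """Fano lineshape as a function of reduced detuning eps."""
    return (q + eps) ** 2 / (1 + eps ** 2)

# Discretized detuning axis, mimicking a track of RWG units sampled at
# small filling-factor steps (toy values, not the paper's chip parameters)
x = np.linspace(-10, 10, 201)            # sample step: 0.1
reference = fano(x)                      # zero-shift reference profile
shifted = fano(x - 0.7)                  # resonance shifted by 7 samples

# Cross-correlate mean-subtracted profiles; the lag of the correlation
# peak recovers the shift to one-sample resolution
corr = np.correlate(shifted - shifted.mean(),
                    reference - reference.mean(), mode="full")
lag = int(np.argmax(corr)) - (len(x) - 1)
print(round(lag * 0.1, 1))               # 0.7
```

    Sub-sample precision, as needed for the quoted Δn sensitivity, would additionally interpolate the correlation peak.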

  4. Analyzing key constraints to biogas production from crop residues and manure in the EU—A spatially explicit model

    Science.gov (United States)

    Persson, U. Martin

    2017-01-01

    This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates' biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures' carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to a large extent on the possibility of overcoming the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent). PMID:28141827

  5. Analyzing key constraints to biogas production from crop residues and manure in the EU-A spatially explicit model.

    Science.gov (United States)

    Einarsson, Rasmus; Persson, U Martin

    2017-01-01

    This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates' biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures' carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to a large extent on the possibility of overcoming the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent).

  6. myFX: a turn-key software for laboratory desktops to analyze spatial patterns of gene expression in Drosophila embryos.

    Science.gov (United States)

    Montiel, Ivan; Konikoff, Charlotte; Braun, Bremen; Packard, Mary; Gramates, Sian L; Sun, Qian; Ye, Jieping; Kumar, Sudhir

    2014-05-01

    Spatial patterns of gene expression are of key importance in understanding developmental networks. Using in situ hybridization, many laboratories are generating images to describe these spatial patterns and to test biological hypotheses. To facilitate such analyses, we have developed biologist-centric software (myFX) that contains computational methods to automatically process and analyze images depicting embryonic gene expression in the fruit fly Drosophila melanogaster. It facilitates creating digital descriptions of spatial patterns in images and enables measurements of pattern similarity and visualization of expression across genes and developmental stages. myFX interacts directly with the online FlyExpress database, which allows users to search thousands of existing patterns to find co-expressed genes by image comparison.
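    One simple way to measure pattern similarity between two standardized expression images is the Jaccard overlap of their binarized expression domains (illustrative masks; not necessarily the metric myFX implements):

```python
import numpy as np

# Two toy binary "expression domains" on a standardized embryo grid
# (stand-ins for processed in situ hybridization images)
a = np.zeros((8, 16), dtype=bool)
a[2:6, 2:8] = True
b = np.zeros((8, 16), dtype=bool)
b[2:6, 4:10] = True

def pattern_similarity(m1, m2):
    """Jaccard overlap of binarized expression domains: one simple
    similarity measure for co-expression screening."""
    inter = np.logical_and(m1, m2).sum()
    union = np.logical_or(m1, m2).sum()
    return inter / union

print(pattern_similarity(a, b))   # 0.5
```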

  7. Research of Web Pages Categorization

    Institute of Scientific and Technical Information of China (English)

    Zhongda Lin; Kun Deng; Yanfen Hong

    2006-01-01

    In this paper, we discuss several issues related to the automated classification of web pages, especially the text classification of web pages. We analyze feature selection and categorization algorithms for web pages and give some suggestions for web page categorization.
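    A standard baseline for web-page text categorization is multinomial Naive Bayes with Laplace smoothing; a self-contained sketch on toy pages (illustrative only, not the specific algorithms surveyed in the paper):

```python
import math
from collections import Counter

# Toy training set: (tokenized page text, category)
train = [
    ("cheap flights hotel booking travel".split(), "travel"),
    ("flight deals travel guide".split(), "travel"),
    ("python code compiler programming".split(), "tech"),
    ("software programming tutorial code".split(), "tech"),
]

vocab = {w for doc, _ in train for w in doc}
counts, totals, priors = {}, Counter(), Counter()
for doc, cat in train:
    priors[cat] += 1
    counts.setdefault(cat, Counter()).update(doc)
    totals[cat] += len(doc)

def classify(tokens):
    """Multinomial Naive Bayes with Laplace smoothing over a
    bag-of-words representation of the page."""
    best, best_lp = None, -math.inf
    for cat in priors:
        lp = math.log(priors[cat] / len(train))
        for w in tokens:
            lp += math.log((counts[cat][w] + 1) / (totals[cat] + len(vocab)))
        if lp > best_lp:
            best, best_lp = cat, lp
    return best

print(classify("travel flight booking".split()))   # travel
```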

  8. Mercury bioaccumulation in the food web of Three Gorges Reservoir (China): Tempo-spatial patterns and effect of reservoir management

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jun [College of Fisheries, Huazhong Agricultural University, Key Laboratory of Freshwater Animal Breeding, Ministry of Agriculture, Wuhan 430070 (China); Freshwater Aquaculture Collaborative Innovation Center of Hubei Province, Wuhan 430070 (China); Zhou, Qiong, E-mail: hainan@mail.hzau.edu.cn [College of Fisheries, Huazhong Agricultural University, Key Laboratory of Freshwater Animal Breeding, Ministry of Agriculture, Wuhan 430070 (China); Freshwater Aquaculture Collaborative Innovation Center of Hubei Province, Wuhan 430070 (China); Yuan, Gailing; He, Xugang [College of Fisheries, Huazhong Agricultural University, Key Laboratory of Freshwater Animal Breeding, Ministry of Agriculture, Wuhan 430070 (China); Freshwater Aquaculture Collaborative Innovation Center of Hubei Province, Wuhan 430070 (China); Xie, Ping [College of Fisheries, Huazhong Agricultural University, Key Laboratory of Freshwater Animal Breeding, Ministry of Agriculture, Wuhan 430070 (China); Donghu Experimental Station of Lake Ecosystems, State Key Laboratory of Freshwater Ecology and Biotechnology of China, Institute of Hydrobiology, Chinese Academy of Sciences, Wuhan 430072 (China)

    2015-09-15

    Tempo-spatial patterns of mercury bioaccumulation and trophodynamics, and the potential for a reservoir effect, were evaluated in the Three Gorges Reservoir (TGR, China) from 2011 to 2012, using total mercury concentrations (THg) and stable isotopes (δ{sup 13}C and δ{sup 15}N) of food web components (seston, aquatic invertebrates and fish). Hg concentrations in aquatic invertebrates and fish showed a significant temporal trend associated with the regular seasonal water-level manipulation: the water level is lowered to allow for storage of water during the wet season (summer), and a decrease of water levels from September to June provides a setting for flood storage. Hg concentrations in organisms were highest after flooding, and higher Hg concentrations in fish were observed at the location farthest from the dam. Hg concentrations in water and sediment were correlated. Compared with reservoirs of the United States and Canada, TGR had lower trophic magnification factors (0.046–0.066), which are explained primarily by organic carbon concentrations in sediment and by the effect of “growth dilution”. Based on a comparison before and after the impoundment of TGR, THg concentrations in biota did not display an obvious long-term reservoir effect, owing to (i) the short time since inundation, (ii) regular water discharge associated with water-level regulation, and/or (iii) the low organic matter content of the sediment. - Highlights: • Hg concentrations were measured in biota of the main stem of the Three Gorges Reservoir. • Fish Hg concentrations: post-flood period > pre-flood period > flood period. • Fish Hg concentrations were highest farthest from the dam. • THg in fish 2 years after inundation was the same as before impoundment. • Low biomagnification was ascribed to low DOC content in the sediment.
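
Trophic magnification factors like those above are typically derived from a log-linear regression of contaminant concentration against trophic level estimated from δ15N. A minimal sketch of that calculation, with hypothetical food-web samples and the commonly used 3.4 per-mille-per-trophic-level enrichment default:

```python
# Sketch: trophic magnification factor (TMF) from THg and delta-15N.
# Data and baseline are hypothetical; TMF = 10**slope of the regression
# of log10(THg) on trophic level.
import math

def trophic_level(d15n, d15n_baseline, enrichment=3.4, base_tl=2.0):
    """Convert a consumer's delta-15N to an estimated trophic level."""
    return base_tl + (d15n - d15n_baseline) / enrichment

def tmf(hg_ng_per_g, trophic_levels):
    """Least-squares slope of log10(THg) vs trophic level; TMF = 10**slope."""
    xs, ys = trophic_levels, [math.log10(c) for c in hg_ng_per_g]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 10 ** slope

# Hypothetical samples: (THg in ng/g, delta-15N in per mille)
samples = [(20, 6.0), (45, 9.4), (110, 12.8), (240, 16.2)]
baseline = 6.0  # delta-15N of the primary-consumer baseline
tls = [trophic_level(d, baseline) for _, d in samples]
print(tmf([hg for hg, _ in samples], tls))
```

A TMF above 1 indicates biomagnification up the food web; values near or below 1 indicate no magnification or trophic dilution.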

  9. Analyzing the spatial patterns and drivers of ecosystem services in rapidly urbanizing Taihu Lake Basin of China

    Science.gov (United States)

    Ai, Junyong; Sun, Xiang; Feng, Lan; Li, Yangfan; Zhu, Xiaodong

    2015-09-01

    Quantifying and mapping the distribution patterns of ecosystem services can help to ascertain which services should be protected and where investments should be directed to improve synergies and reduce tradeoffs. Moreover, the indicators of urbanization that affect the provision of ecosystem services must be identified to determine which approach to adopt in formulating policies related to these services. This paper presents a case study that maps the distribution of multiple ecosystem services and analyzes the ways in which they interact. The relationship between the supply of ecosystem services and the socio-economic development in the Taihu Lake Basin of eastern China is also revealed. Results show a significant negative relationship between crop production and tourism income (p < 0.005) and a positive relationship between crop production, nutrient retention, and carbon sequestration (p < 0.005). The negative effects of the urbanization process on provisioning and regulating services are also identified through a comparison of the ecosystem services in large and small cities. Regression analysis was used to compare and elucidate the relative significance of the selected urbanization factors to ecosystem services. The results indicate that urbanization level is the most substantial factor inversely correlated with crop production (R² = 0.414) and nutrient retention services (R² = 0.572). Population density is the most important factor that negatively affects carbon sequestration (R² = 0.447). The findings of this study suggest the potential relevance of ecosystem service dynamics to urbanization management and decision making.

  10. TFmiR: a web server for constructing and analyzing disease-specific transcription factor and miRNA co-regulatory networks.

    Science.gov (United States)

    Hamed, Mohamed; Spaniol, Christian; Nazarieh, Maryam; Helms, Volkhard

    2015-07-01

    TFmiR is a freely available web server for deep and integrative analysis of combinatorial regulatory interactions between transcription factors, microRNAs and target genes that are involved in disease pathogenesis. Since the inner workings of cells rely on the correct functioning of an enormously complex system of activating and repressing interactions that can be perturbed in many ways, TFmiR helps to better elucidate cellular mechanisms at the molecular level from a network perspective. The provided topological and functional analyses promote TFmiR as a reliable systems biology tool for researchers across the life science communities. TFmiR web server is accessible through the following URL: http://service.bioinformatik.uni-saarland.de/tfmir.

  11. A distributed open source web-application for spatial multi-criteria evaluation for decision support systems infrastructure

    NARCIS (Netherlands)

    Boerboom, L.G.J.; Alan, O.O.

    2013-01-01

    Spatial data availability on internet or intranet rapidly increases. Laymen use this data through applications such as Google Maps, Google Earth and Virtual Earth. Relatively new standards allow interoperable use for publication, sharing and calculation of spatial data. We discuss the opportunities

  12. Analyzing the equilibrium states of a quasi-neutral spatially inhomogeneous system of charges above a liquid dielectric film based on the first principles of quantum statistics

    Science.gov (United States)

    Lytvynenko, D. M.; Slyusarenko, Yu V.

    2017-08-01

    A theory of quasi-neutral equilibrium states of charges above a liquid dielectric surface is developed. This theory is based on the first principles of quantum statistics for systems comprising many identical particles. The proposed approach involves applying the variational principle, modified for the considered systems, and the Thomas-Fermi model. In the terms of the developed theory self-consistency equations are obtained. These equations provide the relation between the main parameters describing the system: the potential of the static electric field, the distribution function of charges and the surface profile of the liquid dielectric. The equations are used to study the phase transition in the system to a spatially periodic state. The proposed method can be applied in analyzing the properties of the phase transition in the system in relation to the spatially periodic states of wave type. Using the analytical and numerical methods, we perform a detailed study of the dependence of the critical parameters of such a phase transition on the thickness of the liquid dielectric film. Some stability criteria for the new asymmetric phase of the studied system are discussed.

  13. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    Science.gov (United States)

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.

  14. Research on Geo-spatial Web Services Classification Based on Manifold Learning

    Institute of Scientific and Technical Information of China (English)

    陈科; 王家耀; 成毅; 谢明霞

    2013-01-01

    The problems existing in traditional methods of Web services classification are analyzed; the concepts of manifold and manifold learning, and the purpose of introducing manifold learning into the Web services domain, are described. An algorithm for the visualization and classification of geo-spatial Web services (GWS) based on manifold learning is proposed. During dimension reduction, the similarity (near-neighbor) relations between GWS are preserved and the data manifold is unrolled. To improve classification precision, the mapping rule from GWS to 2D data and the initial number of clusters and cluster centers are obtained from the visualization of the 2D mapped data, raising the precision of unsupervised classification of geo-spatial Web services by cluster analysis. Experiments show that the method not only represents abstract Web services numerically but also effectively improves classification performance.
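
The pipeline shape described above (reduce service feature vectors to 2D, inspect the mapping, then cluster) can be sketched as follows. The paper uses manifold learning; plain PCA is used here only as a simple stand-in for the dimension-reduction step, and the feature matrix is hypothetical.

```python
# Toy pipeline: 5-D "service" feature vectors -> 2-D map -> k-means.
import numpy as np

rng = np.random.default_rng(0)

def reduce_to_2d(X):
    """Project rows of X onto their first two principal components."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:2].T

def kmeans(X, k, iters=50):
    """Minimal Lloyd's algorithm with evenly spaced initial centers."""
    centers = X[:: max(1, len(X) // k)][:k].copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels

# Two hypothetical, well-separated groups of service feature vectors
X = np.vstack([rng.normal(0, 0.3, (10, 5)),
               rng.normal(3, 0.3, (10, 5))])
Y = reduce_to_2d(X)        # 2-D map used for visual inspection
labels = kmeans(Y, 2)      # cluster count chosen from the 2-D view
```

In the paper's setting, the 2D visualization is what lets an analyst choose the number of clusters before the unsupervised classification step.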

  15. Quantifying Uncertainty in the Trophic Magnification Factor Related to Spatial Movements of Organisms in a Food Web

    DEFF Research Database (Denmark)

    McLeod, Anne; Arnot, Jon; Borgå, Katrine

    2015-01-01

    contamination) and variation in environmental, physiological, and ecological parameters included within the model. Finally, the model was used to explore interactions between spatial heterogeneity in water and sediment contaminant concentrations and theoretical movement profiles of different fish species...

  16. Analyzing spatial variability of soil properties in the urban park before and after reconstruction to support decision-making in landscaping

    Science.gov (United States)

    Romzaikina, Olga; Vasenev, Viacheslav; Khakimova, Rita

    2017-04-01

    On-going urbanization stresses the necessity of structurally and aesthetically organized urban landscapes to improve citizens' quality of life. Urban soils and vegetation are the main components of urban ecosystems. Urban greenery regulates the climate, controls air quality and supports biodiversity in urban areas, and soils play a key role in supporting urban greenery. However, soils of urban parks also perform other important environmental functions. Urban soils are influenced by a variety of environmental and anthropogenic factors and, as a result, are highly heterogeneous and dynamic. Reconstructions of green zones and urban parks, which regularly occur in cities, alter soil properties. Analyzing the spatial variability and dynamics of soil properties is therefore important to support decision-making in landscaping. The research aimed to analyze the spatial distribution of key soil properties (acidity, soil organic carbon (SOC) and nutrient contents) in an urban park before and after reconstruction, to support decision-making in selecting ornamental plants for landscaping. The research was conducted in the urban park named after Artyom Borovik in Moscow before (2012) and after (2014) the reconstruction. Maps of urban soil properties for both periods were created by interpolation of the field data. The observed urban soils included recreazems, urbanozems and constructozems. Before the reconstruction, soils were sampled using a uniform design (a net with a 100 m side and key plots of 50 m size). After the reconstruction, additional samples were collected at locations where the land cover and functional zones had changed as a result of the reconstruction. Samples were taken from depths of 0-30, 30-50 and 50-100 cm, and the following soil properties were measured: pH, SOC, K2O and P2O5. Maps of the analyzed properties were developed in the open-source QGIS 2.4 software using inverse distance weighting (IDW). The vegetation in the park was examined using a visual assessment scale. The results of the visual
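
Inverse distance weighting, the interpolation method the study applied in QGIS, estimates a value at an unsampled location as a distance-weighted average of nearby samples. A minimal sketch with hypothetical soil-pH points:

```python
# Minimal inverse-distance-weighting (IDW) interpolation sketch.
def idw(x, y, points, power=2.0):
    """Interpolate the value at (x, y) from (px, py, value) samples."""
    num = den = 0.0
    for px, py, v in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:
            return v          # exactly on a sample point
        w = 1.0 / d2 ** (power / 2.0)  # weight falls off as 1/d**power
        num += w * v
        den += w
    return num / den

# Hypothetical soil-pH samples on a 100 m grid
samples = [(0, 0, 6.2), (100, 0, 6.8), (0, 100, 7.1), (100, 100, 6.5)]
print(idw(50, 50, samples))   # centre point: all weights equal
print(idw(10, 10, samples))   # pulled toward the nearest sample
```

Mapping a whole raster is then just evaluating `idw` at every grid cell; the `power` parameter controls how local the interpolation is.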

  17. Implementation of OGC Web Map Service Based on Web Service

    Institute of Scientific and Technical Information of China (English)

    JIA Wenjue; CHEN Yumin; GONG Jianya

    2004-01-01

    The OGC Web Map Service is one kind of OGC Portrayal Service belonging to the OGC Web Service model, and it provides multi-platform interoperability for spatial data sets. This paper presents a method for implementing the OGC Web Map Service based on Web Service techniques and describes the process in detail.

  18. Use of a Web-based physical activity record system to analyze behavior in a large population: cross-sectional study.

    Science.gov (United States)

    Namba, Hideyuki; Yamada, Yosuke; Ishida, Mika; Takase, Hideto; Kimura, Misaka

    2015-03-19

    The use of Web-based physical activity systems has been proposed as an easy method for collecting physical activity data. We have developed a system that has exhibited high accuracy as assessed by the doubly labeled water method. The purpose of this study was to collect behavioral data from a large population using our Web-based physical activity record system and assess the physical activity of the population based on these data. In this paper, we address the difference in physical activity for each urban scale. In total, 2046 participants (aged 30-59 years; 1105 men and 941 women) participated in the study. They were asked to complete data entry before bedtime using their personal computer on 1 weekday and 1 weekend day. Their residential information was categorized as urban, urban-rural, or rural. Participant responses expressed the intensity of each activity at 15-minute increments and were recorded on a Web server. Residential areas were compared and multiple regression analysis was performed. Most participants had a metabolic equivalent (MET) ranging from 1.4 to 1.8, and the mean MET was 1.60 (SD 0.28). The median value of moderate-to-vigorous physical activity (MVPA, ≥3 MET) was 7.92 MET-hours/day. A 1-way ANCOVA showed that total physical activity differed depending on the type of residential area (F2,2027=5.19, P=.006). The urban areas (n=950) had the lowest MET-hours/day (mean 37.8, SD, 6.0), followed by urban-rural areas (n=432; mean 38.6, SD 6.5; P=.04), and rural areas (n=664; mean 38.8, SD 7.4; P=.002). Two-way ANCOVA showed a significant interaction between sex and area of residence on the urban scale (F2,2036=4.53, P=.01). Men in urban areas had the lowest MET-hours/day (MVPA, ≥3 MET) at mean 7.9 (SD 8.7); men in rural areas had a MET-hours/day (MVPA, ≥3 MET) of mean 10.8 (SD 12.1, P=.002). No significant difference was noted in women among the 3 residential areas. Multiple regression analysis showed that physical activity consisting of

  19. Acquiring geographical data with web harvesting

    Science.gov (United States)

    Dramowicz, K.

    2016-04-01

    Many websites contain attractive and up-to-date geographical information. This information can be extracted, stored, analyzed and mapped using web harvesting techniques, which transform poorly organized data from websites into a more structured format that can be stored in a database and analyzed. Almost 25% of web traffic is related to web harvesting, mostly through search engines. This paper presents how to harvest geographic information from web documents using Beautiful Soup, a free tool and one of the most commonly used Python libraries for pulling data from HTML and XML files. It is a relatively easy task to process one static HTML table; the more challenging task is to extract and save information from tables located in multiple and poorly organized websites. Legal and ethical aspects of web harvesting are discussed as well. The paper demonstrates two case studies. The first shows how to extract various types of information about the Good Country Index from multiple web pages, load it into one attribute table and map the results. The second shows how script tools and GIS can be used to extract information from one hundred thirty-six websites about Nova Scotia wines. In a little more than three minutes, a database containing one hundred and six liquor stores selling these wines is created. The availability and spatial distribution of various types of wines (by grape type, by winery, and by liquor store) are then mapped and analyzed.
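
The paper uses Beautiful Soup, but the core idea (turning an HTML table into structured rows) can be sketched with only the standard library's `HTMLParser`. The HTML snippet below is hypothetical:

```python
# Harvest rows from an HTML table using only the standard library.
from html.parser import HTMLParser

class TableHarvester(HTMLParser):
    """Collects each <tr> as a list of its <td>/<th> cell texts."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], None, False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

# Hypothetical page fragment (in practice this comes from an HTTP fetch)
html = """<table>
  <tr><th>Store</th><th>City</th></tr>
  <tr><td>Shop A</td><td>Halifax</td></tr>
  <tr><td>Shop B</td><td>Truro</td></tr>
</table>"""

h = TableHarvester()
h.feed(html)
print(h.rows)
```

With Beautiful Soup the same extraction is shorter (`soup.find_all("tr")` and so on), and the resulting rows can be loaded into a database or a GIS attribute table as the paper describes.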

  20. Analyzing Clickstreams

    DEFF Research Database (Denmark)

    Andersen, Jesper; Giversen, Anders; Jensen, Allan H.

    Extensible Markup Language (XML) is fast becoming the new standard for data representation and exchange on the World Wide Web. The rapid emergence of XML data on the web, e.g., in business-to-business (B2B) e-commerce, is making it necessary for OLAP and other data analysis tools to handle XML data as well...

  1. Spatial Keyword Querying

    DEFF Research Database (Denmark)

    Cao, Xin; Chen, Lisi; Cong, Gao;

    2012-01-01

    The web is increasingly being used by mobile users. In addition, it is increasingly becoming possible to accurately geo-position mobile users and web content. This development gives prominence to spatial web data management. Specifically, a spatial keyword query takes a user location and user-sup...... different kinds of functionality as well as the ideas underlying their definition....

  2. Mendel,MD: A user-friendly open-source web tool for analyzing WES and WGS in the diagnosis of patients with Mendelian disorders.

    Science.gov (United States)

    G C C L Cardenas, Raony; D Linhares, Natália; L Ferreira, Raquel; Pena, Sérgio D J

    2017-06-01

    Whole exome and whole genome sequencing have both become widely adopted methods for investigating and diagnosing human Mendelian disorders. As pangenomic agnostic tests, they are capable of more accurate and agile diagnosis compared to traditional sequencing methods. This article describes new software called Mendel,MD, which combines multiple types of filter options and makes use of regularly updated databases to facilitate exome and genome annotation, the filtering process and the selection of candidate genes and variants for experimental validation and possible diagnosis. This tool offers a user-friendly interface, and leads clinicians through simple steps by limiting the number of candidates to achieve a final diagnosis of a medical genetics case. A useful innovation is the "1-click" method, which enables listing all the relevant variants in genes present at OMIM for perusal by clinicians. Mendel,MD was experimentally validated using clinical cases from the literature and was tested by students at the Universidade Federal de Minas Gerais, at GENE-Núcleo de Genética Médica in Brazil and at the Children's University Hospital in Dublin, Ireland. We show in this article how it can simplify and increase the speed of identifying the culprit mutation in each of the clinical cases that were received for further investigation. Mendel,MD proved to be a reliable web-based tool, being open-source and time efficient for identifying the culprit mutation in different clinical cases of patients with Mendelian Disorders. It is also freely accessible for academic users on the following URL: https://mendelmd.org.

  3. Energy Zones Study: A Comprehensive Web-Based Mapping Tool to Identify and Analyze Clean Energy Zones in the Eastern Interconnection

    Energy Technology Data Exchange (ETDEWEB)

    Koritarov, Vladimir [Argonne National Lab. (ANL), Argonne, IL (United States); Kuiper, James [Argonne National Lab. (ANL), Argonne, IL (United States); Hlava, Kevin [Argonne National Lab. (ANL), Argonne, IL (United States); Orr, Andrew [Argonne National Lab. (ANL), Argonne, IL (United States); Rollins, Katherine [Argonne National Lab. (ANL), Argonne, IL (United States); Brunner, Donna [Argonne National Lab. (ANL), Argonne, IL (United States); Green, Herman [Argonne National Lab. (ANL), Argonne, IL (United States); Makar, Jeffrey [Argonne National Lab. (ANL), Argonne, IL (United States); Ayers, Andrew [Argonne National Lab. (ANL), Argonne, IL (United States); Holm, Michael [Argonne National Lab. (ANL), Argonne, IL (United States); Simunich, Kathy [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Jianhui [Argonne National Lab. (ANL), Argonne, IL (United States); McLamore, Michael [Argonne National Lab. (ANL), Argonne, IL (United States); Shamsuddin, Shabbir [Argonne National Lab. (ANL), Argonne, IL (United States); Kavicky, James [Argonne National Lab. (ANL), Argonne, IL (United States); Portante, Edgar [Argonne National Lab. (ANL), Argonne, IL (United States); Conzelmann, Guenter [Argonne National Lab. (ANL), Argonne, IL (United States); Molburg, John [Argonne National Lab. (ANL), Argonne, IL (United States); Clark, Corrie [Argonne National Lab. (ANL), Argonne, IL (United States); Snyder, Seth [Argonne National Lab. (ANL), Argonne, IL (United States); Darling, Seth [Argonne National Lab. (ANL), Argonne, IL (United States); Braun, Joseph [Argonne National Lab. (ANL), Argonne, IL (United States); Botterud, Audun [Argonne National Lab. (ANL), Argonne, IL (United States); Gasper, John [Argonne National Lab. (ANL), Argonne, IL (United States); Richmond, Pamela [Argonne National Lab. (ANL), Argonne, IL (United States); Beardsley, Brett [Argonne National Lab. (ANL), Argonne, IL (United States); Schlueter, Scott [Argonne National Lab. 
(ANL), Argonne, IL (United States); Augustine, Chad [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hurlbut, David J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Milbrandt, Anelia [National Renewable Energy Lab. (NREL), Golden, CO (United States); Schneider, Thomas R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hadley, Stanton W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gracia, Jose R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mays, Gary T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Belles, Randy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Omitaomu, Olufemi A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fernandez, Steven [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hadjerioua, Boualem [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Stewart, Kevin M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kodysh, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Smith, Travis [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2013-09-01

    This report describes the work conducted in support of the Eastern Interconnection States’ Planning Council (EISPC) Energy Zones Study and the development of the Energy Zones Mapping Tool performed by a team of experts from three National Laboratories. The multi-laboratory effort was led by Argonne National Laboratory (Argonne), in collaboration with the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). In June 2009, the U.S. Department of Energy (DOE) and the National Energy Technology Laboratory published Funding Opportunity Announcement FOA-0000068, which invited applications for interconnection-level analysis and planning. In December 2009, the Eastern Interconnection Planning Collaborative (EIPC) and the EISPC were selected as two award recipients for the Eastern Interconnection. Subsequently, in 2010, DOE issued Research Call RC-BM-2010 to DOE’s Federal Laboratories to provide research support and assistance to FOA-0000068 awardees on a variety of key subjects. Argonne was selected as the lead laboratory to provide support to EISPC in developing a methodology and a mapping tool for identifying potential clean energy zones in the Eastern Interconnection. In developing the EISPC Energy Zones Mapping Tool (EZ Mapping Tool), Argonne, NREL, and ORNL closely collaborated with the EISPC Energy Zones Work Group which coordinated the work on the Energy Zones Study. The main product of the Energy Zones Study is the EZ Mapping Tool, which is a web-based decision support system that allows users to locate areas with high suitability for clean power generation in the U.S. portion of the Eastern Interconnection. The mapping tool includes 9 clean (low- or no-carbon) energy resource categories and 29 types of clean energy technologies. The EZ Mapping Tool contains an extensive geographic information system database and allows the user to apply a flexible modeling approach for the identification and analysis of potential energy zones

  4. A web-based multicriteria evaluation of spatial trade-offs between environmental and economic implications from hydraulic fracturing in a shale gas region in Ohio.

    Science.gov (United States)

    Liu, X; Gorsevski, P V; Yacobucci, M M; Onasch, C M

    2016-06-01

    Planning of shale gas infrastructure and drilling sites for hydraulic fracturing has important spatial implications. The evaluation of conflicting and competing objectives requires an explicit consideration of multiple criteria as they have important environmental and economic implications. This study presents a web-based multicriteria spatial decision support system (SDSS) prototype with a flexible and user-friendly interface that could provide educational or decision-making capabilities with respect to hydraulic fracturing site selection in eastern Ohio. One of the main features of this SDSS is to emphasize potential trade-offs between important factors of environmental and economic ramifications from hydraulic fracturing activities using a weighted linear combination (WLC) method. In the prototype, the GIS-enabled analytical components allow spontaneous visualization of available alternatives on maps which provide value-added features for decision support processes and derivation of final decision maps. The SDSS prototype also facilitates nonexpert participation capabilities using a mapping module, decision-making tool, group decision module, and social media sharing tools. The logical flow of successively presented forms and standardized criteria maps is used to generate visualization of trade-off scenarios and alternative solutions tailored to individual user's preferences that are graphed for subsequent decision-making.
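
The weighted linear combination (WLC) at the heart of the SDSS standardizes each criterion map to a common scale, multiplies by user weights, and sums the results into a per-cell suitability score. A minimal sketch with tiny hypothetical criterion grids (the criterion names and values are illustrative, not from the study):

```python
# Weighted linear combination (WLC) over standardized criterion grids.
def standardize(grid):
    """Rescale a grid linearly to the 0-1 range."""
    lo = min(min(row) for row in grid)
    hi = max(max(row) for row in grid)
    return [[(v - lo) / (hi - lo) for v in row] for row in grid]

def wlc(criteria, weights):
    """Per-cell weighted sum of standardized criterion grids."""
    grids = [standardize(g) for g in criteria]
    rows, cols = len(grids[0]), len(grids[0][0])
    return [[sum(w * g[r][c] for w, g in zip(weights, grids))
             for c in range(cols)] for r in range(rows)]

# Hypothetical 2x2 criterion rasters
water_distance = [[0.0, 2.0], [4.0, 8.0]]      # km to nearest stream
pipeline_cost  = [[10.0, 30.0], [20.0, 50.0]]  # relative cost
score = wlc([water_distance, pipeline_cost], weights=[0.6, 0.4])
print(score)
```

Changing the weight vector is what produces the trade-off scenarios the prototype visualizes: a user who prioritizes environmental protection down-weights the economic criteria and immediately sees a different suitability map.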

  5. Analyzing web log files of the health on the net HONmedia search engine to define typical image search tasks for image retrieval evaluation.

    Science.gov (United States)

    Müller, Henning; Boyer, Célia; Gaudinat, Arnaud; Hersh, William; Geissbuhler, Antoine

    2007-01-01

    Medical institutions produce ever-increasing amounts of diverse information, and the digital form makes these data available for use beyond a single patient. Images are no exception. However, little is known about how medical professionals search for visual medical information and how they want to use it outside the context of a single patient. This article analyzes ten months of usage log files of the Health on the Net (HON) medical media search engine. Keywords were extracted from all queries, and the most frequent terms and subjects were identified. The dataset required considerable pre-processing; problems included national character sets, spelling errors and the use of terms in several languages. The results show that media search, particularly for images, was frequently used. The most common queries were for general concepts (e.g., heart, lung). To define realistic information needs for the ImageCLEFmed challenge evaluation (Cross Language Evaluation Forum medical image retrieval), we used frequent queries that were still specific enough to cover at least two of the three axes of modality, anatomic region, and pathology. Several research groups evaluated their image retrieval algorithms based on these defined topics.
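
The keyword-extraction step described above reduces, in essence, to pulling the query parameter out of each log line and counting term frequencies. A minimal sketch; the log format, parameter name `q`, and lines are hypothetical:

```python
# Extract and count search terms from web-server log lines.
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical search-engine log lines
log_lines = [
    "GET /media/search?q=heart+x-ray HTTP/1.1",
    "GET /media/search?q=lung HTTP/1.1",
    "GET /media/search?q=heart HTTP/1.1",
]

terms = Counter()
for line in log_lines:
    path = line.split()[1]                       # the request URL
    for q in parse_qs(urlparse(path).query).get("q", []):
        terms.update(q.lower().split())          # '+' decodes to space

print(terms.most_common(3))
```

Real logs would also need the normalization the paper mentions: decoding national character sets, correcting spellings, and mapping terms across languages before counting.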

  6. A Theoretical Approach to Analyze the Parametric Influence on Spatial Patterns of Spodoptera frugiperda (J.E. Smith) (Lepidoptera: Noctuidae) Populations.

    Science.gov (United States)

    Garcia, A G; Godoy, W A C

    2017-06-01

    Studies of the influence of biological parameters on the spatial distribution of lepidopteran insects can provide useful information for managing agricultural pests, since the larvae of many species cause serious impacts on crops. Computational models to simulate the spatial dynamics of insect populations are increasingly used, because of their efficiency in representing insect movement. In this study, we used a cellular automata model to explore different patterns of population distribution of Spodoptera frugiperda (J.E. Smith) (Lepidoptera: Noctuidae), when the values of two biological parameters that are able to influence the spatial pattern (larval viability and adult longevity) are varied. We mapped the spatial patterns observed as the parameters varied. Additionally, by using population data for S. frugiperda obtained in different hosts under laboratory conditions, we were able to describe the expected spatial patterns occurring in corn, cotton, millet, and soybean crops based on the parameters varied. The results are discussed from the perspective of insect ecology and pest management. We concluded that computational approaches can be important tools to study the relationship between the biological parameters and spatial distributions of lepidopteran insect pests.
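
A cellular automaton of the kind the study uses tracks occupancy on a grid and updates it generation by generation with a local rule. The toy sketch below uses a deterministic spread-to-neighbours rule; the rule and parameters are hypothetical simplifications of the paper's model, where larval viability and adult longevity modulate the spread:

```python
# Toy cellular automaton: insects colonize neighbouring grid cells.
def step(grid):
    """One generation: every occupied cell colonizes its 4-neighbours."""
    n = len(grid)
    new = [row[:] for row in grid]
    for r in range(n):
        for c in range(n):
            if grid[r][c]:
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < n and 0 <= cc < n:
                        new[rr][cc] = 1
    return new

def simulate(n, generations):
    """Spread from a single infested cell in the centre of an n x n grid."""
    grid = [[0] * n for _ in range(n)]
    grid[n // 2][n // 2] = 1
    for _ in range(generations):
        grid = step(grid)
    return grid

occupied = sum(map(sum, simulate(11, 3)))
print(occupied)
```

In a model like the paper's, the colonization step would fire only with a probability tied to larval viability, and occupied cells would empty again once adult longevity is exceeded, which is what produces the different spatial patterns across host crops.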

  7. Computer Programs for Obtaining and Analyzing Daily Mean Streamflow Data from the U.S. Geological Survey National Water Information System Web Site

    Science.gov (United States)

    Granato, Gregory E.

    2009-01-01

    Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. 
This report and the appendixes on the

  8. Optimal Experience of Web Activities.

    Science.gov (United States)

    Chen, Hsiang; Wigand, R. T.; Nilan, M. S.

    1999-01-01

    Reports on Web users' optimal flow experiences to examine positive aspects of Web experiences that could be linked to theory applied to other media and then incorporated into Web design. Discusses the use of content-analytic procedures to analyze open-ended questionnaires that examined Web users' perceived flow experiences. (Author/LRW)

  9. Integrating geospatial data and cropping system simulation within a geographic information system to analyze spatial seed cotton yield, water use, and irrigation requirements

    Science.gov (United States)

    The development of sensors that provide geospatial information on crop and soil conditions has been a primary success for precision agriculture. However, further developments are needed to integrate geospatial data into computer algorithms that spatially optimize crop production while considering po...

  10. 变异函数在水深场空间结构分析中的应用%Application of Semi-variogram in Analyzing Spatial Construction of Sounding Field

    Institute of Scientific and Technical Information of China (English)

    陈轶; 彭认灿; 张立华; 董箭; 李宁

    2011-01-01

The semi-variogram is a useful spatial analysis method from geostatistics. This paper introduces the method into the spatial analysis of sounding fields. Analysis of depth data from several Chinese nautical charts and harbor charts shows that the spatial correlation of soundings depends on the distance between points and on the seabed terrain, that the characteristics of spatial variation differ across scales, and that the spatial variation of the sounding field is anisotropic, differing with direction.
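The empirical semi-variogram that this kind of analysis relies on can be sketched as follows. This is an illustrative Python sketch with synthetic depth data, not code or data from the paper: it averages half the squared depth differences over point pairs in a given lag class, so semivariance grows with lag when nearby depths are correlated.

```python
import math

def empirical_semivariogram(points, values, lag, tol):
    """Empirical semi-variogram: gamma(h) is the mean of 0.5*(z_i - z_j)^2
    over all point pairs whose separation distance lies within lag +/- tol."""
    diffs = []
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.hypot(points[i][0] - points[j][0],
                           points[i][1] - points[j][1])
            if abs(d - lag) <= tol:
                diffs.append(0.5 * (values[i] - values[j]) ** 2)
    return sum(diffs) / len(diffs) if diffs else None

# Synthetic "sounding field": depth increases mainly along x, so the
# semivariance grows with lag distance (spatial correlation decays).
pts = [(x, y) for x in range(5) for y in range(5)]
depths = [x + 0.1 * y for (x, y) in pts]
print(empirical_semivariogram(pts, depths, lag=1.0, tol=0.1))
print(empirical_semivariogram(pts, depths, lag=3.0, tol=0.1))
```

Computing gamma along different directions separately (e.g., only x-aligned vs. only y-aligned pairs) would expose the anisotropy the abstract describes.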

  11. Semantic web mining and the representation, analysis, and evolution of web space

    OpenAIRE

    Berendt, Bettina; Hotho, Andreas; Stumme, Gerd

    2005-01-01

    Semantic Web Mining aims at combining the two fast-developing research areas Semantic Web and Web Mining. This survey analyzes the convergence of trends from both areas: Growing numbers of researchers work on improving the results of Web Mining by exploiting semantic structures in the Web, and they use Web Mining techniques for building the Semantic Web. Last but not least, these techniques can be used for mining the Semantic Web itself. The second aim of this paper is to...

  12. 一种网络空间数据发布与在线处理平台的设计与实现%Design and Implement of a Web Spatial Data Publish and Online Processing Platform

    Institute of Scientific and Technical Information of China (English)

    周玉科; 周成虎; 陈荣国; 张明波; 陈应东

    2011-01-01

Recent advances in Internet technologies, coupled with wide adoption of the web services paradigm and interoperability standards, have made the World Wide Web a popular vehicle for geospatial information distribution and online geoprocessing. In this paper, a new spatial computing and data service publishing platform, LreisServer, is designed and implemented. The platform complies with OGC specifications and implements the WMS, WFS, and GML standards; the extension implementation details are discussed from a cooperation perspective. On the server side, the platform uses PostGIS as its spatial database and applies its capabilities to analyze and query spatial data. On the client side, RIA technologies such as OpenLayers are combined with C# ASP.NET to display maps and support rich-client spatial operations. The geometry object model complies with the OGC Simple Features specification (point, polyline, polygon), and the platform provides map projection functions and algorithms including buffer and overlay. Spatial indexes include the quadtree and R-tree. A new map-cache mechanism is designed and developed to speed up the display of historical map data and accelerate interaction on the client side. Unit tests were run against different data sources, and alternative approaches are evaluated to assess the pros and cons of the design and implementation. The results show that: (1) owing to the ASP.NET cache tool, the platform achieves better WMS performance than an ordinary OGC WMS; (2) building on the spatial data storage and operating functions of PostGIS, LreisServer can provide spatial data services and raw data in GML format; and (3) using a load-balancing strategy, LreisServer can perform simple spatial processing and analysis on the client side.

  13. 利用开源软件开发基于Web服务的林业空间信息系统%Web service based spatial forest information system using an open source software approach

    Institute of Scientific and Technical Information of China (English)

    李世明; Joachim Saborowski; Jens Nieschulze; 李增元; 陆元昌; 陈尔学

    2007-01-01

For technical and other reasons, spatial forest information is caught in a dilemma: data providers cannot find an appropriate way to publish their data, while data users cannot access and integrate existing spatial forest information. To promote the sharing of spatial forest information, this paper proposes developing a web-service-based spatial forest information system using open source software. Within a web services framework, the system supports interoperability, can integrate web services from other application servers, and can be extended into part of a regional or national spatial forest data infrastructure; it also enables code reuse and shortens development time and cost. The growth of open source software (OSS) provides usable alternatives to proprietary software for operating systems, web servers, WebGIS applications, and database management systems, so developing a spatial forest information system with open source software promotes information sharing while saving the cost of purchasing and developing software. Using the open source packages Deegree and UMN MapServer, which follow the OGC open specifications, a prototype spatial forest information system for Xixia County, Henan Province was developed; through a standard web browser, users can access spatial forest information and tourist route information from different data servers, promoting the sharing of spatial forest information.

  14. 18 Years of Recovery: Spatial Variation and Structure of a Secondary Forest Analyzed with Airborne Lidar Data in the Brazilian Atlantic Forest

    Science.gov (United States)

    dos-Santos, M. N.; Keller, M. M.; Scaranello, M. A., Sr.; Longo, M.; Daniel, P.

    2016-12-01

Ongoing forest fragmentation in the tropics severely reduces the ability of remaining forests to store carbon and provide ecosystem services; however, secondary regeneration could offset the impacts of forest degradation. Previous plot-based forest inventory studies have shown that secondary regeneration is promoted at remnant forest edges, but this process has not been studied at the landscape scale. We used over 450 ha of lidar data to study the forest structure and spatial variation of secondary growth forest 18 years after swidden cultivation abandonment in Serra do Conduru State Park. Lidar data were acquired in December 2015 at a density of 93 points per square meter using an airborne scanning laser system (Optech Orion M-300). Serra do Conduru, a 10,000 ha state park in Bahia, was created in 1997 as part of a network of forest reserves with both old-growth and secondary forest, aiming to establish a central corridor of the Atlantic forest. The Brazilian Atlantic forest is a highly human-modified and fragmented forest landscape reduced to 12.5% of its original extent. Prior to the establishment of the state park, the area was a mosaic of forest and agricultural land. We created 10 m wide buffers from the edge of the remnant forest into the secondary forest and generated lidar metrics for each strip in order to ask: does distance from the remnant forest create a gradient effect on secondary forest structure? We cross-compared the lidar metrics of the samples. Results demonstrate that distance from old-growth forest promotes spatial variation in forest recovery and forest structure.

  15. Characterizing web heuristics

    NARCIS (Netherlands)

    de Jong, Menno D.T.; van der Geest, Thea

    2000-01-01

    This article is intended to make Web designers more aware of the qualities of heuristics by presenting a framework for analyzing the characteristics of heuristics. The framework is meant to support Web designers in choosing among alternative heuristics. We hope that better knowledge of the

  16. Dynamic Web Pages: Performance Impact on Web Servers.

    Science.gov (United States)

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)

  17. WebCom: A Model for Understanding Web Site Communication

    DEFF Research Database (Denmark)

    Godsk, Mikkel; Petersen, Anja Bechmann

    2008-01-01

    This chapter presents a model (WebCom) for understanding and analyzing Web site-mediated communication, also referred to as Web site communication. The model combines three theoretical approaches - communication, medium, and activity theory - into one generic model that benefits from each...... of the approaches' strengths. Furthermore, it is discussed and shortly demonstrated how WebCom can be used for analytical and design purposes with YouTube as an example. The chapter concludes that WebCom is able to serve as a theoretically-based model for understanding complex Web site communication situations...

  18. Evaluation Method of Web Site Based on Web Structure Mining

    Institute of Scientific and Technical Information of China (English)

Li Jun-e; Zhou Dong-ru

    2003-01-01

The structure of Web sites has become more complex than before. During the design of a Web site, the lack of models and methods results in improper structures that depend only on the designer's experience. From a software engineering point of view, every phase of the software life cycle must be evaluated before work on the next phase starts, so methods for evaluating a Web structure before the site is completed are important and essential. In this work, after studying related work on Web structure mining and analyzing the major structure mining methods (PageRank and Hub/Authority), a PageRank-based method for evaluating Web structure at the design stage is proposed. A Web structure modeling language, WSML, is designed, and implementation strategies for a Web site structure evaluation system are given. Web structure mining has previously been used mainly in search engines; this is the first time the technology is employed to evaluate a Web structure during the design period of a Web site. The work contributes to the formalization of design documents for Web sites and to improved software engineering for large-scale Web sites, and the evaluation system is a practical tool for Web site construction.
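The PageRank computation such an evaluation builds on can be sketched as a simple power iteration over a link graph. This is an illustrative Python sketch with a hypothetical site structure, not the paper's WSML tooling:

```python
def pagerank(links, damping=0.85, iters=50):
    """Iterative PageRank over a link graph {page: [outlinks]}.
    Dangling pages (no outlinks) distribute their rank uniformly."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling node: spread its rank evenly over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Hypothetical site structure: every page links back to "home",
# so "home" should accumulate the most rank.
site = {"home": ["about", "products"],
        "about": ["home"],
        "products": ["home", "contact"],
        "contact": ["home"]}
ranks = pagerank(site)
print(max(ranks, key=ranks.get))
```

Run on a candidate site structure at design time, such scores indicate which pages the link topology makes most (or least) reachable before any content is built.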

  19. Analyzing the Facebook Friendship Graph

    OpenAIRE

    Catanese, Salvatore; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo

    2010-01-01

    Online Social Networks (OSN) during last years acquired a huge and increasing popularity as one of the most important emerging Web phenomena, deeply modifying the behavior of users and contributing to build a solid substrate of connections and relationships among people using the Web. In this preliminary work paper, our purpose is to analyze Facebook, considering a significant sample of data reflecting relationships among subscribed users. Our goal is to extract, from this platform, relevant ...

  20. Teaching Tectonics to Undergraduates with Web GIS

    Science.gov (United States)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curriculum can support spatial learning. We present Web-based visualization and analysis tools developed with Javascript APIs to enhance tectonic curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploration of earthquake and volcano data sets, a subduction and elevation profile tool which facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS platform is independent and can be implemented on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation using two- and three- dimensional visualization and analytical software. Coverages which allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  1. ANGDelMut – a web-based tool for predicting and analyzing functional loss mechanisms of amyotrophic lateral sclerosis-associated angiogenin mutations [v3; ref status: indexed, http://f1000r.es/2yt

    Directory of Open Access Journals (Sweden)

    Aditya K Padhi

    2014-02-01

Full Text Available ANGDelMut is a web-based tool for predicting the functional consequences of missense mutations in the angiogenin (ANG) protein, which is associated with amyotrophic lateral sclerosis (ALS). Missense mutations in ANG cause loss of ribonucleolytic activity, of nuclear translocation activity, or of both functions, and in turn cause ALS. However, no web-based tools are available to predict whether a newly identified ANG mutation is likely to lead to ALS. More importantly, no web-implemented method is currently available to predict the mechanisms by which ANG mutants lose function. In light of this observation, we developed the ANGDelMut web-based tool, which predicts whether an ANG mutation is deleterious or benign. The user selects certain attributes from the input panel, which serve as a query to infer whether a mutant will lose ribonucleolytic activity or nuclear translocation activity, or whether its overall stability will be affected. The output states whether the mutation is deleterious or benign and, if it is deleterious, gives the possible mechanism(s) of loss of function. This web-based tool, freely available at http://bioschool.iitd.ernet.in/DelMut/, is the first of its kind to provide a platform for researchers and clinicians to infer the functional consequences of ANG mutations and correlate their possible association with ALS ahead of experimental findings.

  2. Analyzing Orientations

    Science.gov (United States)

    Ruggles, Clive L. N.

    Archaeoastronomical field survey typically involves the measurement of structural orientations (i.e., orientations along and between built structures) in relation to the visible landscape and particularly the surrounding horizon. This chapter focuses on the process of analyzing the astronomical potential of oriented structures, whether in the field or as a desktop appraisal, with the aim of establishing the archaeoastronomical "facts". It does not address questions of data selection (see instead Chap. 25, "Best Practice for Evaluating the Astronomical Significance of Archaeological Sites", 10.1007/978-1-4614-6141-8_25) or interpretation (see Chap. 24, "Nature and Analysis of Material Evidence Relevant to Archaeoastronomy", 10.1007/978-1-4614-6141-8_22). The main necessity is to determine the azimuth, horizon altitude, and declination in the direction "indicated" by any structural orientation. Normally, there are a range of possibilities, reflecting the various errors and uncertainties in estimating the intended (or, at least, the constructed) orientation, and in more formal approaches an attempt is made to assign a probability distribution extending over a spread of declinations. These probability distributions can then be cumulated in order to visualize and analyze the combined data from several orientations, so as to identify any consistent astronomical associations that can then be correlated with the declinations of particular astronomical objects or phenomena at any era in the past. The whole process raises various procedural and methodological issues and does not proceed in isolation from the consideration of corroborative data, which is essential in order to develop viable cultural interpretations.

  3. Association Rule Mining for Web Recommendation

    Directory of Open Access Journals (Sweden)

    R. Suguna

    2012-10-01

Full Text Available Web usage mining is the application of web mining to discover useful patterns from the web in order to understand and analyze the behavior of web users and web-based applications. It is an emerging research trend. It deals with web log files, which contain users' website access information, and analyzing these records to understand user behavior is of considerable interest. Web usage mining normally has three phases: 1. preprocessing, 2. pattern discovery, and 3. pattern analysis. This paper proposes association rule mining algorithms for better web recommendation and web personalization. Web recommendation systems play an important role in understanding customers' behavior, interests, and future needs, improving customer convenience, and increasing service-provider profits.
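The support/confidence rule mining used for such recommendations can be sketched as follows. This is a brute-force illustrative sketch over toy sessions with hypothetical page names, not the paper's algorithm or a full Apriori implementation: a rule {A} -> B is kept when the pair's support and the rule's confidence clear the chosen thresholds.

```python
from itertools import combinations

def association_rules(sessions, min_support=0.3, min_confidence=0.6):
    """Mine pairwise rules {A} -> B from web sessions (sets of visited pages)."""
    n = len(sessions)
    def support(itemset):
        return sum(1 for s in sessions if itemset <= s) / n
    items = set().union(*sessions)
    rules = []
    for a, b in combinations(sorted(items), 2):
        for lhs, rhs in ((a, b), (b, a)):
            sup = support({lhs, rhs})
            if sup >= min_support:
                conf = sup / support({lhs})
                if conf >= min_confidence:
                    rules.append((lhs, rhs, round(sup, 2), round(conf, 2)))
    return rules

# Hypothetical page-visit sessions extracted from a web log:
sessions = [{"home", "news", "sports"},
            {"home", "news"},
            {"home", "sports"},
            {"news", "sports"},
            {"home", "news", "weather"}]
for lhs, rhs, sup, conf in association_rules(sessions):
    print(f"{{{lhs}}} -> {rhs}  support={sup} confidence={conf}")
```

A recommender would then suggest page B to a visitor currently on page A whenever a high-confidence rule {A} -> B exists.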

  4. Analyzing Underway Replenishments Through Spatial Mapping

    Science.gov (United States)

    2012-12-01

...dimensions of 1x1 day transit time and using the Pythagorean Theorem, it would take a CLF ship at least 1.4 days of total transit time to replenish a... Business Tree Map (available from http://www.smartmoney.com, 2012): a snapshot of the stock market after closing on August 14, 2012, in which boxes represent companies nested under a listed industry and box size is directly proportional to each company's market capitalization.

  5. Dark Web

    CERN Document Server

    Chen, Hsinchun

    2012-01-01

    The University of Arizona Artificial Intelligence Lab (AI Lab) Dark Web project is a long-term scientific research program that aims to study and understand the international terrorism (Jihadist) phenomena via a computational, data-centric approach. We aim to collect "ALL" web content generated by international terrorist groups, including web sites, forums, chat rooms, blogs, social networking sites, videos, virtual world, etc. We have developed various multilingual data mining, text mining, and web mining techniques to perform link analysis, content analysis, web metrics (technical

  6. 基于HTML5、Ajax和Web Service的WebGIS研究%Research on WebGIS based on HTML5, Ajax and Web Service

    Institute of Scientific and Technical Information of China (English)

    徐卓揆

    2012-01-01

To remedy the current situation in WebGIS, in which browsers lack a standard method for supporting GIS vector data, GIS data interoperability is limited, and spatial analysis functions are weak, this paper proposes an open WebGIS model based on the new HTML5 standard, Ajax, and Web Service technologies, and develops an experimental platform. The platform supports Web Services for data sharing as well as the new OGC Web Processing Service specification; it remedies the defects of existing WebGIS and improves WebGIS interoperability and spatial analysis capability.

  7. Economic analysis of spider web airline networks

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

The distinct network organization, management, service, and operating characteristics of US Southwest Airlines are key elements of its success compared with other airlines. As a network organization type, the spider-web airline network has received increasing attention. In this paper, we analyze the relation between the spider-web airline network and the spider web, and the structure of the spider-web airline network; we build an assignment model for the spider-web airline network and investigate the associated economics.

  8. Utility of leaf-colouring information published on web sites for evaluation of spatial-temporal variability of autumn leaf phenology in Japan

    Science.gov (United States)

    Nagai, S.; Saitoh, T. M.; Suzuki, R.

    2016-12-01

    Spatio-temporal variability of leaf-colouring information published on web sites ("big data") was examined to evaluate the spatio-temporal characteristics of autumn leaf phenology in Japan. We first collected leaf-colouring information published on the meteorological service web site: "tenki.jp" (http://www.tenki.jp/) from September to December in 2015 (about 750 points); we then evaluated the relationship between spatio-temporal variability of leaf-colouring information and warmth index based on daily mean air temperature (reference air temperature was 5 degree Celsius). We also examined the relationship between leaf-colouring information and the timing of end of growing season detected by analysing the Terra and Aqua/MODIS satellite-observed daily green red vegetation index. We found that, for the peak timing of leaf-colouring, (1) changes along the horizontal (latitudinal) gradient showed non-linear relationship for the whole Japan; (2) changes along the vertical (altitudinal) gradient in the western Japan (west of E138) tended to be larger than those along the horizontal gradient; and (3) changes along vertical gradient showed linear correlation with the daily warmth index.

  9. Analyzing Math and Science Pre-Service Teachers School Experience Course Journals Shared in Web-Based Platforms [Web Destekli Ortamlarda Fen ve Matematik Öğretmen Adaylarının Paylaştıkları Öğretmenlik Uygulaması Günlüklerinin İncelenmesi

    Directory of Open Access Journals (Sweden)

    Didem İnel Ekici

    2016-08-01

Full Text Available In this study, we aimed to investigate pre-service math and science teachers during their teaching practicum course by analyzing daily journals shared in a web-based environment. Sixty-five seniors participated in the study (41 pre-service math teachers and 24 pre-service science teachers). Within a qualitative case study approach, the study used descriptive and content analysis. The findings show that pre-service math teachers provided more detail than pre-service science teachers when discussing teaching methods and emphasizing lesson preparation, whereas pre-service science teachers included more specific activity examples in their daily journal entries. Another important finding is that pre-service teachers tended to evaluate themselves and their friends positively but criticized their teachers negatively. In light of these findings, we recommend offering the teaching practicum course while pre-service teachers take their theoretical courses.

  10. Web Engineering

    Energy Technology Data Exchange (ETDEWEB)

    White, Bebo

    2003-06-23

    Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.

  11. Sensor-Web Operations Explorer

    Science.gov (United States)

    Meemong, Lee; Miller, Charles; Bowman, Kevin; Weidner, Richard

    2008-01-01

Understanding the atmospheric state and its impact on air quality requires observations of trace gases, aerosols, clouds, and physical parameters across temporal and spatial scales that range from minutes to days and from meters to more than 10,000 kilometers. Observations include continuous local monitoring for particle formation; field campaigns for emissions, local transport, and chemistry; and periodic global measurements for continental transport and chemistry. Understanding requires a global data assimilation framework capable of hierarchical coupling, dynamic integration of chemical data and atmospheric models, and feedback loops between models and observations. To observe trace gases, aerosols, clouds, and physical parameters, an integrated observation infrastructure composed of space-borne, air-borne, and in-situ sensors will be simulated based on their measurement-physics properties. To plan optimally for multiple heterogeneous sensors, sampling strategies will be explored and science impact analyzed based on comprehensive modeling of atmospheric phenomena, including convection, transport, and chemical processes. Topics include system architecture, software architecture, hardware architecture, process flow, technology infusion, challenges, and future directions.

  12. Collective spatial keyword querying

    DEFF Research Database (Denmark)

    Cao, Xin; Cong, Gao; Jensen, Christian S.;

    2011-01-01

    With the proliferation of geo-positioning and geo-tagging, spatial web objects that possess both a geographical location and a textual description are gaining in prevalence, and spatial keyword queries that exploit both location and textual description are gaining in prominence. However......, the queries studied so far generally focus on finding individual objects that each satisfy a query rather than finding groups of objects where the objects in a group collectively satisfy a query. We define the problem of retrieving a group of spatial web objects such that the group's keywords cover the query...

  13. Web Similarity

    NARCIS (Netherlands)

    Cohen, A.R.; Vitányi, P.M.B.

    2015-01-01

    Normalized web distance (NWD) is a similarity or normalized semantic distance based on the World Wide Web or any other large electronic database, for instance Wikipedia, and a search engine that returns reliable aggregate page counts. For sets of search terms the NWD gives a similarity on a scale fr
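The NWD can be computed directly from aggregate page counts: with f(x), f(y) the hit counts for each term, f(x,y) the count for pages containing both, and N the index size, NWD(x,y) = (max(log f(x), log f(y)) - log f(x,y)) / (log N - min(log f(x), log f(y))). The sketch below uses invented counts purely for illustration; the page counts and index size are hypothetical, not drawn from any real search engine.

```python
import math

def nwd(fx, fy, fxy, n):
    """Normalized web distance from aggregate page counts:
    fx, fy are hit counts for each term, fxy for both terms, n the index size."""
    lx, ly, lxy = math.log(fx), math.log(fy), math.log(fxy)
    return (max(lx, ly) - lxy) / (math.log(n) - min(lx, ly))

# Hypothetical page counts for an index of ~10^10 pages:
print(nwd(9e7, 2e8, 6e7, 1e10))   # terms that often co-occur: small distance
print(nwd(9e7, 2e8, 300, 1e10))   # terms that rarely co-occur: large distance
```

Terms that co-occur nearly as often as they occur individually score close to 0; terms that almost never co-occur score well above 1.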

  14. Understanding User-Web Interactions via Web Analytics

    CERN Document Server

    Jansen, Bernard J

    2009-01-01

    This lecture presents an overview of the Web analytics process, with a focus on providing insight and actionable outcomes from collecting and analyzing Internet data. The lecture first provides an overview of Web analytics, providing in essence, a condensed version of the entire lecture. The lecture then outlines the theoretical and methodological foundations of Web analytics in order to make obvious the strengths and shortcomings of Web analytics as an approach. These foundational elements include the psychological basis in behaviorism and methodological underpinning of trace data as an empir

  15. Borderless Geospatial Web (bolegweb)

    Science.gov (United States)

    Cetl, V.; Kliment, T.; Kliment, M.

    2016-06-01

The effective access and use of geospatial information (GI) resources is of critical importance in a modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate the discovery and use of geospatial data. This data is stored in databases located in a layer called the invisible web and is thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which are structurally different from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, to enrich its information extent. A public, global, and user-friendly portal of OGC resources available on the web ensures and enhances the use of GI within a multidisciplinary context, bridges the geospatial web from the end-user perspective, and thus opens its borders to everybody. The project "Crosswalking the layers of geospatial information resources to enable a borderless geospatial web", with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb, Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to finish by the end of 2016. This paper provides an overview of the project, its research questions and methodology, the results achieved so far, and future steps.

  16. Isolation by distance, web service

    OpenAIRE

    Bohonak Andrew J; Jensen Jeffrey L; Kelley Scott T

    2005-01-01

    Abstract Background The population genetic pattern known as "isolation by distance" results from spatially limited gene flow and is a commonly observed phenomenon in natural populations. However, few software programs exist for estimating the degree of isolation by distance among populations, and they tend not to be user-friendly. Results We have created Isolation by Distance Web Service (IBDWS) a user-friendly web interface for determining patterns of isolation by distance. Using this site, ...

  16. A Cooperative Schema between Web Server and Search Engine for Improving Freshness of Web Repository

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Because the web is huge and web pages are updated frequently, the index maintained by a search engine has to refresh web pages periodically. This is extremely resource consuming because the search engine needs to crawl the web and download pages to refresh its index. Building on current web-refreshing techniques, we present a cooperative schema between web server and search engine for maintaining the freshness of a web repository. The web server provides metadata, defined through the XML standard, to describe web sites. Before updating a web page, the crawler visits the metadata files. If the metadata indicates that the page has not been modified, the crawler does not update it, so the schema saves bandwidth. A primitive model based on the schema has been implemented, and its cost and efficiency are analyzed.
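
The cooperative refresh idea can be sketched as follows; the dictionary standing in for the server's XML site-description files is an assumption for illustration.

```python
# Sketch of the cooperative schema: before re-downloading a page, the
# crawler consults server-provided metadata (a dict here, standing in for
# the XML site description) and refetches only pages that have changed.
def pages_to_refresh(index, site_metadata):
    """index: url -> last-modified time the crawler recorded;
    site_metadata: url -> last-modified time the server advertises."""
    stale = []
    for url, crawled_at in index.items():
        current = site_metadata.get(url)
        if current is not None and current > crawled_at:
            stale.append(url)
    return stale

index = {"/a.html": 100, "/b.html": 200}
meta = {"/a.html": 150, "/b.html": 200}
print(pages_to_refresh(index, meta))  # ['/a.html'] -- only /a.html changed
```

Only the changed page is downloaded again, which is where the bandwidth saving comes from.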

  18. WebCom: A Model for Understanding Web Site Communication

    DEFF Research Database (Denmark)

    Godsk, Mikkel; Petersen, Anja Bechmann

    2008-01-01

    This chapter presents a model (WebCom) for understanding and analyzing Web site-mediated communication, also referred to as Web site communication. The model combines three theoretical approaches - communication, medium, and activity theory - into one generic model that benefits from each of the approaches' strengths. Furthermore, it is discussed and briefly demonstrated how WebCom can be used for analytical and design purposes, with YouTube as an example. The chapter concludes that WebCom is able to serve as a theoretically based model for understanding complex Web site communication situations in their entirety, and that such a thorough approach is required for successful computer-mediated communication (CMC) when communicating across cultures and contexts.

  19. Web Page Recommendation Using Web Mining

    Directory of Open Access Journals (Sweden)

    Modraj Bhavsar

    2014-07-01

    Full Text Available On the World Wide Web, various kinds of content are generated in huge amounts, so web recommendation has become an important part of web applications for giving relevant results to users. Different kinds of web recommendations are made available to users every day, including images, video, audio, query suggestions and web pages. In this paper we aim at providing a framework for web page recommendation. (1) First we describe the basics of web mining and the types of web mining; (2) we detail each web mining technique; and (3) we propose an architecture for personalized web page recommendation.

  20. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  1. Web Personalization Using Web Mining

    Directory of Open Access Journals (Sweden)

    Ms. Kavita D. Satokar

    2010-03-01

    Full Text Available The information on the web is growing dramatically. Users have to spend a lot of time on the web finding the information they are interested in. Today, traditional search engines do not give users enough personalized help but provide them with lots of irrelevant information. In this paper, we present a personalized Web search system which helps users get relevant web pages based on their selection from a domain list. Thus, users can obtain a set of interesting domains and web pages from the system. The system is based on features extracted from hyperlinks, such as anchor terms or URL tokens. Our methodology uses an innovative weighted URL Rank algorithm based on the user's interested domains and query.

  2. Semantic web services for web databases

    CERN Document Server

    Ouzzani, Mourad

    2011-01-01

    Semantic Web Services for Web Databases introduces an end-to-end framework for querying Web databases using novel Web service querying techniques. This includes a detailed framework for the query infrastructure for Web databases and services. Case studies are covered in the last section of this book. Semantic Web Services For Web Databases is designed for practitioners and researchers focused on service-oriented computing and Web databases.

  3. Sensor web

    Science.gov (United States)

    Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)

    2011-01-01

    A Sensor Web formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization of all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and responding, and then only listen during time slots corresponding to those pods which respond.
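
The two-phase synchronization described in the abstract might be sketched as below; the whole-second coarse step is an illustrative assumption, not the patented mechanism.

```python
# Assumed two-phase sync sketch: a coarse pass aligns a pod clock to the
# nearest whole second (cheap), then a fine pass removes the residual
# sub-second offset against the master clock.
def coarse_sync(pod_time, master_time):
    return pod_time + round(master_time - pod_time)   # whole-second step

def fine_sync(pod_time, master_time):
    return pod_time + (master_time - pod_time)        # exact residual

t = coarse_sync(1000.40, 1003.75)   # coarse: jump by 3 whole seconds
t = fine_sync(t, 1003.75)           # fine: remove the 0.35 s residual
print(t)
```

The point of the split is power: the coarse step can use a low-rate, low-power exchange, leaving only a small residual for the more expensive fine step.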

  4. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Full Text Available Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on a web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and taking an action based on the 'category' of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper presents experimental work on web log data taken from the "NASA" web server and analyzed with "Web Log Explorer", a web usage mining tool which plays a vital role in carrying out this work.
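
The preprocessing step on such server logs can be illustrated with a short sketch that parses Common Log Format lines (the format used by the NASA logs) and tallies successful requests per URL; the regex covers only the fields used here.

```python
# Preprocessing sketch for web usage mining: parse Common Log Format
# lines and count successful (status 200) requests per URL.
import re
from collections import Counter

LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3}) \S+')

def page_counts(lines):
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group(4) == "200":   # keep successful requests only
            counts[m.group(3)] += 1     # group 3 is the requested URL
    return counts

logs = [
    'host1 - - [01/Jul/1995:00:00:01 -0400] "GET /shuttle.html HTTP/1.0" 200 6245',
    'host2 - - [01/Jul/1995:00:00:06 -0400] "GET /shuttle.html HTTP/1.0" 200 3985',
    'host3 - - [01/Jul/1995:00:00:09 -0400] "GET /missing.gif HTTP/1.0" 404 0',
]
print(page_counts(logs).most_common(1))  # [('/shuttle.html', 2)]
```

Pattern discovery and analysis (the later stages) would then run on these cleaned counts and sessions rather than on the raw log text.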

  5. BORDERLESS GEOSPATIAL WEB (BOLEGWEB)

    Directory of Open Access Journals (Sweden)

    V. Cetl

    2016-06-01

    Full Text Available The effective access and use of geospatial information (GI) resources is of critical importance in the modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate the discovery and use of geospatial data. These data are stored in databases located in a layer called the invisible web and are thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which are structurally different from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, enriching its information extent. A public, global, user-friendly portal of OGC resources available on the web ensures and enhances the use of GI in a multidisciplinary context and bridges the geospatial web from the end-user perspective, thus opening its borders to everybody. The project "Crosswalking the layers of geospatial information resources to enable a borderless geospatial web", with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb, Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to be finished by the end of 2016. This paper provides an overview of the project, its research questions and methodology, the results achieved so far, and future steps.

  6. Study on End-to-End Web Performance

    Institute of Scientific and Technical Information of China (English)

    GAO Ke-li; DAI Li-zhong

    2004-01-01

    While there are many papers discussing one or more aspects of web performance, few treat web performance as a whole. This paper discusses in depth the factors that influence web performance and currently known web techniques. In addition, we discuss general methods of web performance measurement and explain the discrepancies between our results and those of others. Finally, we analyze the bottlenecks of the web and propose possible solutions.

  7. The WebStand Project

    CERN Document Server

    Nguyen, Benjamin; Colazzo, Dario; Vion, Antoine; Manolescu, Ioana; Senellart, Pierre

    2010-01-01

    In this paper we present the state of advancement of the French ANR WebStand project. The objective of this project is to construct a customizable XML-based warehouse platform to acquire, transform, analyze, store, query and export data from the web, in particular mailing lists, with the final intention of using this data to perform sociological studies focused on social groups of the World Wide Web, with a specific emphasis on the temporal aspects of this data. We are currently using this system to analyze the standardization process of the W3C, through its social network of standard setters.

  8. Application of Ranganathan's Laws to the Web

    Directory of Open Access Journals (Sweden)

    Fatemeh Amoohosseini

    2006-10-01

    Full Text Available This paper analyzes the Web and raises a significant question: "Does the Web save the time of the users?" This question is analyzed in the context of the Five Laws of the Web. What do these laws mean? The laws are meant to be elemental, to convey a deep understanding and capture the essential meaning of the World Wide Web. These laws may seem simplistic, but in fact they express a simple, crystal-clear vision of what the Web ought to be. Moreover, we intend to echo the simplicity of Ranganathan's Five Laws of Library Science, which inspired them.

  9. Application of Ranganathan's Laws to the Web

    Directory of Open Access Journals (Sweden)

    Alireza Noruzi

    2004-12-01

    Full Text Available This paper analyzes the Web and raises a significant question: "Does the Web save the time of the users?" This question is analyzed in the context of the Five Laws of the Web. What do these laws mean? The laws are meant to be elemental, to convey a deep understanding and capture the essential meaning of the World Wide Web. These laws may seem simplistic, but in fact they express a simple, crystal-clear vision of what the Web ought to be. Moreover, we intend to echo the simplicity of Ranganathan's Five Laws of Library Science, which inspired them.

  10. The Optimization of WebGIS Performance Based on Web Service

    Institute of Scientific and Technical Information of China (English)

    韩双旺

    2011-01-01

    Because a GIS involves not only attribute data but also geospatial data, the data volumes are relatively large, so performance must be considered when designing and implementing a WebGIS. To implement the functions of a Web Service-based WebGIS more efficiently, it is necessary to optimize its performance. This can be done through a series of measures: increasing the granularity of the Web Services, avoiding XML as the internal interface format of the WebGIS, compressing SOAP messages, invoking server-side Web Service methods asynchronously, optimizing the database, and using client-side and server-side caching. Together these measures speed up data access, reduce the network transmission load, and improve the performance of a WebGIS based on Web Service.

  11. Moving Spatial Keyword Queries

    DEFF Research Database (Denmark)

    Wu, Dingming; Yiu, Man Lung; Jensen, Christian S.

    2013-01-01

    Web users and content are increasingly being geo-positioned. This development gives prominence to spatial keyword queries, which involve both the locations and textual descriptions of content. We study the efficient processing of continuously moving top-k spatial keyword (MkSK) queries over spatial text data. State-of-the-art solutions for moving queries employ safe zones that guarantee the validity of reported results as long as the user remains within the safe zone associated with a result. However, existing safe-zone methods focus solely on spatial locations and ignore text relevancy.

  12. Analyzing the spatial-temporal variation of wet and dry spells during 1962-2007 in Guangdong province

    Institute of Scientific and Technical Information of China (English)

    陈子燊; 黄强; 刘曾美

    2013-01-01

    The multi-scale standardized precipitation and evapotranspiration index (SPEI) is calculated using the monthly precipitation and temperature data observed at 74 meteorological stations during 1962-2007 in Guangdong. The spatial-temporal variation of wet and dry spells is analyzed using the rotated empirical orthogonal function (REOF), the Mann-Kendall trend test and wavelet analysis. Results show that both the frequency and spatial extent of drought have increased over time since the 1970s. The whole of Guangdong can be divided into six wet and dry regions based on the first six REOF modes. These regions are located in the Pearl River Delta, the upper Hanjiang River and Dongjiang River basins, the Xijiang River basin and the middle and lower Beijiang River basin, the eastern coastal district, the upper Beijiang River basin, and the western coastal district. The trend of wet and dry spells in Guangdong varies significantly across the province from east to west. Significant upward trends in drought have been detected in the Leizhou Peninsula, the Xijiang River basin and the middle and lower Beijiang River basin, and the western coastal district. In addition, the temporal variation of wet and dry spells exhibits a periodic oscillation with a period of 2 to 8 years in the six regions.
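
The Mann-Kendall trend statistic used in the paper is simple to sketch; this minimal version computes only the S statistic (the full test also needs the variance of S and a normal approximation, omitted here).

```python
# Mann-Kendall S statistic sketch: compare every later observation with
# every earlier one; S > 0 suggests an upward trend, S < 0 a downward one.
def mann_kendall_s(x):
    s = 0
    n = len(x)
    for i in range(n - 1):
        for j in range(i + 1, n):
            if x[j] > x[i]:
                s += 1
            elif x[j] < x[i]:
                s -= 1
    return s

print(mann_kendall_s([1, 2, 3, 4, 5]))   # 10: strictly increasing series
print(mann_kendall_s([5, 4, 3, 2, 1]))   # -10: strictly decreasing series
```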

  13. Users’ recognition in web using web mining techniques

    Directory of Open Access Journals (Sweden)

    Hamed Ghazanfaripoor

    2013-06-01

    Full Text Available The rapid growth of the web and the lack of structure or an integrated schema create various issues for users accessing information. All user accesses to web information are saved in the related server log files, which serve as a resource for finding patterns of user behavior. Web mining is a subset of data mining and means mining the related data from the WWW; it is categorized into three parts - web content mining, web structure mining and web usage mining - based on the part of the data that is mined. A technique is needed that is capable of learning users' interests and, based on those interests, can filter out unrelated content automatically or offer related information to the user in a reasonable amount of time. Web usage mining builds profiles of users to recognize them and is directly related to web personalization. The primary objective of personalization systems is to provide what users require without asking them explicitly. In addition, formal models make it possible to model the system's behavior; Petri nets and queueing networks are examples of models that can analyze user behavior on the web. The primary objective of this paper is to present a colored Petri net that models user interactions in order to offer users a list of recommended pages. Estimating user behavior is applied in cases such as suggesting appropriate pages to continue browsing, e-commerce and targeted advertising. Preliminary results indicate that the proposed method improves the accuracy criterion by 8.3% compared with the static method.

  14. Discovering Authorities and Hubs in Different Topological Web Graph Structures.

    Science.gov (United States)

    Meghabghab, George

    2002-01-01

    Discussion of citation analysis on the Web considers Web hyperlinks as a source to analyze citations. Topics include basic graph theory applied to Web pages, including matrices, linear algebra, and Web topology; and hubs and authorities, including a search technique called HITS (Hyperlink Induced Topic Search). (Author/LRW)
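
The HITS technique mentioned above can be sketched as a short power iteration over a toy link graph; this is a minimal illustration, not the article's code.

```python
# Minimal HITS sketch: alternately update authority scores (sum of hub
# scores of pages linking in) and hub scores (sum of authority scores of
# pages linked to), normalizing each vector every iteration.
def hits(links, iters=50):
    pages = sorted({p for edge in links for p in edge})
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iters):
        auth = {p: sum(hub[u] for u, v in links if v == p) for p in pages}
        hub = {p: sum(auth[v] for u, v in links if u == p) for p in pages}
        na = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        nh = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        auth = {p: v / na for p, v in auth.items()}
        hub = {p: v / nh for p, v in hub.items()}
    return auth, hub

# 'a' and 'b' both link to 'c', so 'c' is the authority and 'a', 'b' hubs.
links = [("a", "c"), ("b", "c")]
auth, hub = hits(links)
print(max(auth, key=auth.get))  # 'c'
```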

  15. Web Classification Using DYN FP Algorithm

    Directory of Open Access Journals (Sweden)

    Bhanu Pratap Singh

    2014-01-01

    Full Text Available Web mining is the application of data mining techniques to extract knowledge from the Web. Web mining has been explored to a vast degree, and different techniques have been proposed for a variety of applications that include web search, classification and personalization. The primary goal of a web site is to provide relevant information to users. Web mining techniques are used to categorize users and pages by analyzing user behavior, the content of pages and the order of URLs accessed. This paper proposes an auto-classification algorithm for web pages using data mining techniques. It addresses the problem of discovering association rules between terms in a set of web pages belonging to a category in a search engine database, and presents an auto-classification algorithm for solving this problem that is fundamentally based on the FP-growth algorithm.
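
A full FP-growth implementation is too long to sketch here, so the following illustrates the underlying task it accelerates: finding term sets that co-occur in a category's pages above a minimum support, using naive pair counting instead of an FP-tree.

```python
# Frequent term-pair sketch (the task FP-growth speeds up): count every
# term pair per page and keep pairs meeting the minimum support.
from itertools import combinations
from collections import Counter

def frequent_pairs(pages, min_support):
    counts = Counter()
    for terms in pages:
        for pair in combinations(sorted(set(terms)), 2):
            counts[pair] += 1
    return {pair for pair, c in counts.items() if c >= min_support}

pages = [
    {"java", "web", "mining"},
    {"java", "web"},
    {"mining", "data"},
]
print(frequent_pairs(pages, min_support=2))  # {('java', 'web')}
```

FP-growth gets the same answer without enumerating every candidate pair, which matters at search-engine scale.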

  16. Evaluation Method of Web Site Structure Based on Web Structure Mining

    Institute of Scientific and Technical Information of China (English)

    Li Jun-e; Zhou Dong-ru

    2003-01-01

    The structure of Web sites has become more complex than before. During the design period of a Web site, the lack of models and methods results in an improper Web structure, which depends on the designer's experience. From the point of view of software engineering, every period in the software life cycle must be evaluated before starting the next period's work. It is very important and essential to find relevant methods for evaluating Web structure before the site is completed. In this work, after studying the related work on Web structure mining and analyzing the major structure mining methods (PageRank and Hub/Authority), a method based on PageRank for Web structure evaluation in the design stage is proposed. A Web structure modeling language, WSML, is designed, and implementation strategies for an evaluation system for Web site structure are given. Web structure mining has mainly been used in search engines before; this is the first time the technology has been employed to evaluate a Web structure during the design period of a Web site. It contributes to the formalization of design documents for Web sites and the improvement of software engineering for large-scale Web sites, and the evaluation system is a practical tool for Web site construction.
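
The PageRank measure that the proposed evaluation method builds on can be sketched as a short power iteration; the damping factor 0.85 is the conventional choice, and the toy graph (with no dangling pages) is purely illustrative.

```python
# PageRank power-iteration sketch: each page's rank is a base share plus
# a damped sum of the ranks of pages linking to it, split over their
# out-degree. Assumes every page has at least one outlink.
def pagerank(links, d=0.85, iters=100):
    pages = sorted({p for edge in links for p in edge})
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    out = {p: sum(1 for u, _ in links if u == p) for p in pages}
    for _ in range(iters):
        pr = {p: (1 - d) / n + d * sum(pr[u] / out[u] for u, v in links if v == p)
              for p in pages}
    return pr

# Both leaf pages link back to the home page, which links to each of them.
links = [("home", "a"), ("home", "b"), ("a", "home"), ("b", "home")]
pr = pagerank(links)
print(max(pr, key=pr.get))  # 'home'
```

Applied to a site model (e.g. one written in WSML), low-rank pages flag structural problems before the site is built.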

  17. Web content management system

    OpenAIRE

    García Populin, Iván

    2014-01-01

    Final-year project developed in .NET. It presents a web content management system for generating an advertising web site.

  18. Advances in Sensor Webs for NASA Earth Science Missions

    Science.gov (United States)

    Sherwood, R.; Moe, K.; Smith, S.; Prescott, G.

    2007-12-01

    The world is slowly evolving into a web of interconnected sensors. Innovations such as camera phones that upload directly to the internet, networked devices with built-in GPS chips, traffic sensors, and the wireless networks that connect these devices are transforming our society. Similar advances are occurring in science sensors at NASA. NASA developed autonomy software has demonstrated the potential for space missions to use onboard decision-making to detect, analyze, and respond to science events. This software has also enabled NASA satellites to coordinate with other satellites and ground sensors to form an autonomous sensor web. A vision for NASA sensor webs for Earth science is to enable "on-demand sensing of a broad array of environmental and ecological phenomena across a wide range of spatial and temporal scales, from a heterogeneous suite of sensors both in-situ and in orbit." Several technologies for improved autonomous science and sensor webs are being developed at NASA. Each of these technologies advances the state of the art in sensorwebs in different areas including enabling model interactions with sensorwebs, smart autonomous sensors, and sensorweb communications. Enabling model interactions in sensor webs is focused on the creation and management of new sensor web enabled information products. Specifically, the format of these data products and the sensor webs that use them must be standardized so that sensor web components can more easily communicate with each other. This standardization will allow new components such as models and simulations to be included within sensor webs. Smart sensing implies sophistication in the sensors themselves. The goal of smart sensing is to enable autonomous event detection and reconfiguration. This may include onboard processing, self-healing sensors, and self-identifying sensors. The goal of communication enhancements, especially session layer management, is to support dialog control for autonomous operations

  19. WEB 238 Courses Tutorial / indigohelp

    OpenAIRE

    2015-01-01

    WEB 238 Week 2 JavaScript Events WEB 238 Week 3 Cookies WEB 238 Week 4 Dynamic HTML WEB 238 Week 5 Web Programming Languages WEB 238 Week 1 DQs WEB 238 Week 2DQs WEB 238 Week 3DQs WEB 238 Week 4DQs WEB 238 Week 5DQs  

  20. GIS-facilitated spatial narratives

    DEFF Research Database (Denmark)

    Møller-Jensen, Lasse; Jeppesen, Henrik; Kofie, Richard Y.

    2008-01-01

    -based' exploration of sites related to the narrative and as a tool that facilitates the design of spatial narratives before implementation within portable GIS devices. The Google Earth-based visualization of the spatial narrative is created by a Python script that outputs a web-accessible KML format file. The KML...

  1. Usare WebDewey

    OpenAIRE

    Baldi, Paolo

    2016-01-01

    This presentation shows how to use the WebDewey tool. Features of WebDewey. Italian WebDewey compared with American WebDewey. Querying Italian WebDewey. Italian WebDewey and MARC21. Italian WebDewey and UNIMARC. Numbers, captions, "equivalente verbale": Dewey decimal classification in Italian catalogues. Italian WebDewey and Nuovo soggettario. Italian WebDewey and LCSH. Italian WebDewey compared with printed version of Italian Dewey Classification (22. edition): advantages and disadvantages o...

  2. The content and design of Web sites : an empirical study

    NARCIS (Netherlands)

    Huizingh, EKRE

    2000-01-01

    To support the emergence of a solid knowledge base for analyzing Web activity, we have developed a framework to analyze and categorize the capabilities of Web sites. This distinguishes content from design. Content refers to the information, features, or services that are offered in the Web site, des

  3. The content and design of Web sites : an empirical study

    NARCIS (Netherlands)

    Huizingh, EKRE

    2000-01-01

    To support the emergence of a solid knowledge base for analyzing Web activity, we have developed a framework to analyze and categorize the capabilities of Web sites. This distinguishes content from design. Content refers to the information, features, or services that are offered in the Web site,

  4. The content and design of Web sites : an empirical study

    NARCIS (Netherlands)

    Huizingh, EKRE

    2000-01-01

    To support the emergence of a solid knowledge base for analyzing Web activity, we have developed a framework to analyze and categorize the capabilities of Web sites. This distinguishes content from design. Content refers to the information, features, or services that are offered in the Web site, des

  5. The Strategy of Promoting Web-site of the Company

    OpenAIRE

    Babenko Nelya A.

    2012-01-01

    The article analyzes strategies for promoting a company's web site on the Internet. It examines the problems of developing a promotion strategy, proposes an algorithm for developing the strategy and the procedure for carrying out the promotion, and recommends methods for initially attracting visitors to the web site. A model for assessing the effectiveness of a company web site and an algorithm for its optimization are also presented.

  6. On-line Generation of Suggestions for Web Users

    OpenAIRE

    2004-01-01

    One important class of data mining applications is the so-called "Web Mining", which analyzes and extracts important and non-trivial knowledge from Web-related data. Typical applications of Web mining are personalization and recommender systems. These systems aim to extract knowledge from the analysis of historical information of a web server in order to improve the web site's expressiveness in terms of readability and content availability. Typically, these systems are made...

  7. VOSA: A VO SED Analyzer

    Science.gov (United States)

    Rodrigo, C.; Bayo, A.; Solano, E.

    2017-03-01

    VOSA (VO Sed Analyzer, http://svo2.cab.inta-csic.es/theory/vosa) is a public web-tool developed by the Spanish Virtual Observatory (http://svo.cab.inta-csic.es/) and designed to help users to (1) build Spectral Energy Distributions (SEDs) combining private photometric measurements with data available in VO services, (2) obtain relevant properties of these objects (distance, extinction, etc) from VO catalogs, (3) analyze them comparing observed photometry with synthetic photometry from different collections of theoretical models or observational templates, using different techniques (chi-square minimization, Bayesian analysis) to estimate physical parameters of the observed objects (teff, logg, metallicity, stellar radius/distance ratio, infrared excess, etc), and use these results to (4) estimate masses and ages via interpolation of collections of isochrones and evolutionary tracks from the VO. In particular, VOSA offers the advantage of deriving physical parameters using all the available photometric information instead of a restricted subset of colors. The results can be downloaded in different formats or sent to other VO tools using SAMP. We have upgraded VOSA to provide access to Gaia photometry and give a homogeneous estimation of the physical parameters of thousands of objects at a time. This upgrade has required the implementation of a new computation paradigm, including a distributed environment, the capability of submitting and processing jobs in an asynchronous way, the use of parallelized computing to speed up processes (~ ten times faster) and a new design of the web interface.
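
The chi-square minimization step VOSA performs can be sketched as below; the fluxes, errors, and the model grid are made up for illustration, not real VOSA data structures.

```python
# Chi-square fitting sketch: compare observed photometry with each
# synthetic model and keep the parameters minimizing
# chi2 = sum((obs - model)^2 / err^2).
def best_model(obs, err, grid):
    def chi2(model):
        return sum((o - m) ** 2 / e ** 2
                   for o, m, e in zip(obs, model["flux"], err))
    return min(grid, key=chi2)

obs = [1.0, 2.0, 3.0]      # observed fluxes (made-up units)
err = [0.1, 0.1, 0.1]      # per-band uncertainties
grid = [                   # toy synthetic-photometry grid
    {"teff": 5000, "flux": [1.1, 2.1, 2.9]},
    {"teff": 6000, "flux": [0.5, 1.5, 2.5]},
]
print(best_model(obs, err, grid)["teff"])  # 5000
```

The real tool fits several parameters at once (Teff, log g, metallicity, ...) over large model collections; the principle is the same grid scan.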

  8. Analyzing the Web Services and UniFrame Paradigms

    Science.gov (United States)

    2003-04-01

    B2B e-commerce provides a company with effective and efficient end-to-end process communication to buy and sell services in an economical way. HTTP is the preeminent protocol for transferring WS content and is allowed free access through firewalls.

  9. Web Page Categorization Using Artificial Neural Networks

    CERN Document Server

    Kamruzzaman, S M

    2010-01-01

    Web page categorization is one of the challenging tasks in the world of ever-increasing web technologies. There are many ways of categorizing web pages based on different approaches and features. This paper proposes a new dimension in the categorization of web pages using an artificial neural network (ANN) that extracts the features automatically. Eight major categories of web pages have been selected for categorization: business & economy, education, government, entertainment, sports, news & media, job search, and science. The whole process of the proposed system is done in three successive stages. In the first stage, the features are automatically extracted by analyzing the source of the web pages. The second stage fixes the input values of the neural network; all the values remain between 0 and 1, and variations in those values affect the output. Finally, the third stage determines the class of a given web page out of the eight predefined classes. This stage i...

  10. Sexual information seeking on web search engines.

    Science.gov (United States)

    Spink, Amanda; Koricich, Andrew; Jansen, B J; Cole, Charles

    2004-02-01

    Sexual information seeking is an important element within human information behavior. Seeking sexually related information on the Internet takes many forms and channels, including chat rooms discussions, accessing Websites or searching Web search engines for sexual materials. The study of sexual Web queries provides insight into sexually-related information-seeking behavior, of value to Web users and providers alike. We qualitatively analyzed queries from logs of 1,025,910 Alta Vista and AlltheWeb.com Web user queries from 2001. We compared the differences in sexually-related Web searching between Alta Vista and AlltheWeb.com users. Differences were found in session duration, query outcomes, and search term choices. Implications of the findings for sexual information seeking are discussed.
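
One measure reported in such transaction-log studies, session duration, can be sketched as below; the tuple layout is illustrative, not the actual Alta Vista or AlltheWeb.com log format.

```python
# Session-duration sketch for query-log analysis: duration per user is
# taken as last-query time minus first-query time (timestamps in seconds).
from collections import defaultdict

def session_durations(queries):
    times = defaultdict(list)
    for user, t, _terms in queries:
        times[user].append(t)
    return {user: max(ts) - min(ts) for user, ts in times.items()}

log = [
    ("u1", 0, "mars photos"),
    ("u1", 240, "mars rover"),
    ("u2", 10, "recipes"),
]
print(session_durations(log))  # {'u1': 240, 'u2': 0}
```

Comparing such per-user measures between two engines' logs is how differences in session duration, query outcomes and term choices are quantified.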

  11. Semantic Web

    Directory of Open Access Journals (Sweden)

    Anna Lamandini

    2011-06-01

    Full Text Available The semantic Web is a technology at the service of knowledge aimed at accessibility and the sharing of content, facilitating interoperability between different systems; as such it is one of the nine key technological pillars of TIC (technologies for information and communication) within the third theme, specific cooperation programme, of the Seventh Framework Programme for research and development (7°PQRS, 2007-2013). As a system it seeks to overcome the overload or excess of irrelevant information on the Internet, in order to facilitate specific or pertinent research. It is an extension of the existing Web in which the aim is cooperation between computers and people (the dream of Sir Tim Berners-Lee), where machines can give more support to people when integrating and elaborating data in order to obtain inferences and a global sharing of data. It is a technology able to favour the development of a "web of data", in other words the creation of a space of interconnected and shared data sets (Linked Data) which allows users to link different types of data coming from different sources. It is a technology that will have a great effect on everyday life since it will permit the planning of "intelligent applications" in various sectors such as education and training, research, the business world, public information, tourism, health, and e-government. It is an innovative technology that activates a social transformation (the socio-semantic Web) on a world level, since it redefines the cognitive universe of users and enables the sharing not only of information but of meaning (collective and connected intelligence).

  12. Three types of children’s informational web sites: an inventory of design conventions

    OpenAIRE

    Jochmann-Mannak, Hanna; Lentz, Leo; Huibers, Theo W.C.; Sanders, Ted

    2012-01-01

    "Purpose: Research on Web design conventions has an almost exclusive focus on Web design for adults. There is far less knowledge about Web design for children. For the first time, an overview is presented of the current design conventions for children's informational Web sites. Method: In this study a large corpus of 100 children's international, informational Web sites from four different domains (science, pets, arts, and health) is analyzed. The instrument for analyzing the Web sites includ...

  13. Enabling Spatial OLAP Over Environmental and Farming Data with QB4SOLAP

    DEFF Research Database (Denmark)

    Gur, Nurefsan; Hose, Katja; Pedersen, Torben Bach

    2016-01-01

    Governmental organizations and agencies have been making large amounts of spatial data available on the Semantic Web (SW). However, we still lack efficient techniques for analyzing such large amounts of data as we know them from relational database systems, e.g., multidimensional (MD) data...... warehouses and On-line Analytical Processing (OLAP). A basic prerequisite to enable such advanced analytics is a well-defined schema, which can be defined using the QB4SOLAP vocabulary that provides sufficient context for spatial OLAP (SOLAP). In this paper, we address the challenging problem of MD querying...... with SOLAP operations on the SW by applying QB4SOLAP to a non-trivial spatial use case based on real-world open governmental data sets across various spatial domains. We describe the process of combining, interpreting, and publishing disparate spatial data sets as a spatial data cube on the SW and show how...

  14. Summary of Web-Database Technologies%Web数据库技术综述

    Institute of Scientific and Technical Information of China (English)

    周军

    2000-01-01

    Web-Database is the base of many network applications such as Web information retrieval system, Web information publishing and Electronic Commerce. This article focuses on several popular Web-Database technologies such as CGI, ISAPI, IDC, ASP and Java Applet, analyzing and comparing their structure, characteristics, advantages and disadvantages. Finally, it discusses the main structure of the Web-Database technology.

  15. Evolution of the cosmic web

    CERN Document Server

    Cautun, Marius; Jones, Bernard J T; Frenk, Carlos S

    2014-01-01

    The cosmic web is the largest scale manifestation of the anisotropic gravitational collapse of matter. It represents the transitional stage between linear and non-linear structures and contains easily accessible information about the early phases of structure formation processes. Here we investigate the characteristics and the time evolution of its morphological components. Our analysis involves the application of the NEXUS Multiscale Morphology Filter (MMF) technique, predominantly its NEXUS+ version, to high resolution and large volume cosmological simulations. We quantify the cosmic web components in terms of their mass and volume content, their density distribution and halo populations. We employ new analysis techniques to determine the spatial extent of filaments and sheets, like their total length and local width. This analysis identifies clusters and filaments as the most prominent components of the web. In contrast, while voids and sheets take most of the volume, they correspond to underdense environ...

  16. Advanced Techniques in Web Intelligence-2 Web User Browsing Behaviour and Preference Analysis

    CERN Document Server

    Palade, Vasile; Jain, Lakhmi

    2013-01-01

    This research volume focuses on analyzing web user browsing behaviour and preferences in traditional web-based environments, social networks and web 2.0 applications, by using advanced techniques in data acquisition, data processing, pattern extraction and cognitive science for modeling human actions. The book is directed to graduate students, researchers/scientists and engineers interested in updating their knowledge with the recent trends in web user analysis, for developing the next generation of web-based systems and applications.

  17. C2Analyzer: Co-target-Co-function Analyzer

    Institute of Scientific and Technical Information of China (English)

    Md Aftabuddin; Chittabrata Mal; Arindam Deb; Sudip Kundu

    2014-01-01

    MicroRNAs (miRNAs) interact with their target mRNAs and regulate biological processes at the post-transcriptional level. While one miRNA can target many mRNAs, a single mRNA can also be targeted by a set of miRNAs. The targeted mRNAs may be involved in different biological processes that are described by gene ontology (GO) terms. The major challenges in analyzing this multitude of regulations include identifying the combinatorial regulation of miRNAs as well as determining the co-functionally enriched miRNA pairs. C2Analyzer: Co-target-Co-function Analyzer is a Perl-based, versatile and user-friendly web tool with online instructions. Based on hypergeometric analysis, this novel tool can determine whether given pairs of miRNAs are co-functionally enriched. For a given set of GO term(s), it can also identify the set of miRNAs whose targets are enriched in the given GO term(s). Moreover, C2Analyzer can also identify co-targeting miRNA pairs, their targets and the GO processes in which they are involved. The miRNA-miRNA co-functional relationship can also be saved as a .txt file, which can be used to further visualize the co-functional network with other software such as Cytoscape. C2Analyzer is freely available at www.bioinformatics.org/c2analyzer.
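    The upper-tail hypergeometric test underlying this kind of enrichment analysis can be sketched in a few lines. The gene and GO-term counts below are invented for illustration; this is the generic statistic, not C2Analyzer's actual Perl code:

    ```python
    from math import comb

    def hypergeom_pvalue(N, K, n, k):
        """P(X >= k) when drawing n items without replacement from a
        population of N items containing K 'successes' (upper-tail test)."""
        return sum(comb(K, i) * comb(N - K, n - i)
                   for i in range(k, min(K, n) + 1)) / comb(N, n)

    # Hypothetical numbers: of N=1000 genes, K=50 carry a given GO term;
    # a miRNA pair co-targets n=40 genes, k=8 of which carry the term.
    p = hypergeom_pvalue(1000, 50, 40, 8)
    enriched = p < 0.05  # far more annotated targets than the ~2 expected
    ```

    With these made-up counts the expected overlap is only 2 genes, so observing 8 yields a small p-value and the pair would be called co-functionally enriched.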

  18. Endnote web

    OpenAIRE

    Uezu, Denis

    2015-01-01

    A brief guide (in Russian) to working with the EndNote Web online service, hosted on the Thomson Reuters Web of Knowledge platform, is presented. EndNote Web was developed to assist researchers and students in the process of writing scientific publications. It allows users to create their own databases with personal bibliographic lists for citation in scholarly works....

  19. Engineering Web Applications

    DEFF Research Database (Denmark)

    Casteleyn, Sven; Daniel, Florian; Dolog, Peter

    Nowadays, Web applications are almost omnipresent. The Web has become a platform not only for information delivery, but also for eCommerce systems, social networks, mobile services, and distributed learning environments. Engineering Web applications involves many intrinsic challenges due...

  20. Web Interactive Campus Map

    Directory of Open Access Journals (Sweden)

    Marylene S. Eder

    2015-03-01

    Full Text Available Abstract Interactive campus map is a web-based application that can be accessed through a web browser. With the Google Map Application Programming Interface, the availability of the overlay function has been taken advantage of to create custom map functionalities. A collection of building points was gathered for routing and to create polygons which serve as representations of each building. The previous campus map provided a static visual representation of the campus, using legends, building names and corresponding building numbers to provide information. Due to its limited capabilities, the researchers decided to create an interactive campus map. Storing data about buildings, rooms and staff, university events, and a campus guide are among the primary features that this study has to offer. The Interactive Web-based Campus Information System is intended to provide a Campus Information System that is open to constant updates, user-friendly for both trained and untrained users, and capable of responding to all needs of users and carrying out analyses. Based on the data gathered through questionnaires, the researchers analyzed the results of the test survey and showed that the system is user-friendly, delivers information to users, and has the important features that students expect.

  1. Towards semantic web mining

    OpenAIRE

    Berendt, Bettina; Hotho, Andreas; Stumme, Gerd

    2002-01-01

    Semantic Web Mining aims at combining the two fast-developing research areas Semantic Web and Web Mining. The idea is to improve, on the one hand, the results of Web Mining by exploiting the new semantic structures in the Web, and on the other hand, to make use of Web Mining for building up the Semantic Web. The paper gives an overview of where the two areas meet today, and sketches ways in which a closer integration could be profitable.

  2. WEB BASED TRANSLATION OF CHINESE ORGANIZATION NAME

    Institute of Scientific and Technical Information of China (English)

    Yang Muyun; Liu Daxin; Zhao Tiejun; Qi Haoliang; Lin Kaiming

    2009-01-01

    A web-based translation method for Chinese organization names is proposed. After analyzing the structure of Chinese organization names, the methods of bilingual query formulation and maximum entropy based translation re-ranking are suggested to retrieve the English translation from the web via a public search engine. The experiments on Chinese university names demonstrate the validity of this approach.

  3. Intelligent Overload Control for Composite Web Services

    NARCIS (Netherlands)

    Meulenhoff, P.J.; Ostendorf, D.R.; Zivkovic, M.; Meeuwissen, H.B.; Gijsen, B.M.M.

    2009-01-01

    In this paper, we analyze overload control for composite web services in service oriented architectures by an orchestrating broker, and propose two practical access control rules which effectively mitigate the effects of severe overloads at some web services in the composite service. These two rules

  4. Intelligent overload control for composite web services

    NARCIS (Netherlands)

    Meulenhoff, P.J.; Ostendorf, D.R.; Živković, M.; Meeuwissen, H.B.; Gijsen, B.M.M.

    2009-01-01

    In this paper, we analyze overload control for composite web services in service oriented architectures by an orchestrating broker, and propose two practical access control rules which effectively mitigate the effects of severe overloads at some web services in the composite service. These two rules

  5. DISTANCE LEARNING ONLINE WEB 3.0

    Directory of Open Access Journals (Sweden)

    S. M. Petryk

    2015-05-01

    Full Text Available This article analyzes the existing methods of identifying information in the semantic web, outlines the main problems of their implementation, and researches the use of the Semantic Web as part of distance learning. An alternative approach to identifying information and constructing relationships between information and acquired knowledge is proposed, based on the developed method "spectrum of knowledge".

  6. Web TA Production (WebTA)

    Data.gov (United States)

    US Agency for International Development — WebTA is a web-based time and attendance system that supports USAID payroll administration functions, and is designed to capture hours worked, leave used and...

  7. Semantic Web Technologies for the Adaptive Web

    DEFF Research Database (Denmark)

    Dolog, Peter

    2007-01-01

    Ontologies and reasoning are the key terms brought into focus by the semantic web community. Formal representation of ontologies in a common data model on the web can be taken as a foundation for adaptive web technologies as well. This chapter describes how ontologies shared on the semantic web...... means for deciding which links to show, annotate, hide, generate, and reorder. The semantic web technologies provide means to formalize the domain ontologies and metadata created from them. The formalization enables reasoning for personalization decisions. This chapter describes which components...... are crucial to be formalized by the semantic web ontologies for adaptive web. We use examples from an eLearning domain to illustrate the principles which are broadly applicable to any information domain on the web....

  8. Semantic web for dummies

    CERN Document Server

    Pollock, Jeffrey T

    2009-01-01

    Semantic Web technology is already changing how we interact with data on the Web. By connecting random information on the Internet in new ways, Web 3.0, as it is sometimes called, represents an exciting online evolution. Whether you're a consumer doing research online, a business owner who wants to offer your customers the most useful Web site, or an IT manager eager to understand Semantic Web solutions, Semantic Web For Dummies is the place to start! It will help you:Know how the typical Internet user will recognize the effects of the Semantic WebExplore all the benefits the data Web offers t

  9. Innovating Web Page Classification Through Reducing Noise

    Institute of Scientific and Technical Information of China (English)

    LI Xiaoli (李晓黎); SHI Zhongzhi(史忠植)

    2002-01-01

    This paper presents a new method that eliminates noise in Web page classification. It first describes the representation of a Web page based on HTML tags. Then, through a novel distance formula, it eliminates the noise in similarity measurement. After carefully analyzing Web pages, we design an algorithm that can distinguish related hyperlinks from noisy ones. We can utilize non-noisy hyperlinks to improve the performance of Web page classification (the CAWN algorithm). For any page, we can classify it through the text and category of neighbor pages related to the page. The experimental results show that our approach improves classification accuracy.

  10. Mobile response in web panels

    NARCIS (Netherlands)

    de Bruijne, M.A.; Wijnant, A.

    2014-01-01

    This article investigates unintended mobile access to surveys in online, probability-based panels. We find that spontaneous tablet usage is drastically increasing in web surveys, while smartphone usage remains low. Further, we analyze the bias of respondent profiles using smartphones and tablets com

  11. WEB GIS: IMPLEMENTATION ISSUES

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    With the rapid expansion and development of the Internet and the WWW (World Wide Web, or Web), Web GIS (Web Geographical Information System) is becoming ever more popular, and as a result numerous sites have added GIS capability to their Web sites. In this paper, the reasons behind developing a Web GIS instead of a "traditional" GIS are first outlined. Then the current status of Web GIS is reviewed, and implementation methodologies are explored as well. The underlying technologies for developing Web GIS, such as the Web server, Web browser, CGI (Common Gateway Interface), Java, and ActiveX, are discussed, and some typical implementation tools from both the commercial and public domains are given as well. Finally, the future development direction of Web GIS is predicted.

  12. Ontology Based Qos Driven Web Service Discovery

    Directory of Open Access Journals (Sweden)

    R Suganyakala

    2011-07-01

    Full Text Available In today's scenario, web services have become the grand vision for implementing business process functionalities. With the increase in the number of similar web services, one of the essential challenges is to discover the relevant web service with regard to user specification. The relevance of web service discovery can be improved by augmenting semantics through expressive formats like OWL. QoS-based service selection plays a significant role in meeting non-functional user requirements. Hence QoS and semantics have been used as finer search constraints to discover the most relevant service. In this paper, we describe a QoS framework for ontology-based web service discovery. The QoS factors taken into consideration are execution time, response time, throughput, scalability, reputation, accessibility and availability. The behavior of each web service at various instances is observed over a period of time and its QoS-based performance is analyzed.
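    A common way to combine QoS factors like those listed above is min-max normalization followed by a weighted sum. The sketch below is an illustrative ranking scheme under that assumption, not the paper's actual framework; the service data and weights are made up:

    ```python
    # Rank candidate services by a normalized weighted QoS score.
    # For each attribute we note whether a larger value is better
    # (throughput, availability) or worse (response time).
    def normalize(values, higher_is_better=True):
        lo, hi = min(values), max(values)
        if hi == lo:
            return [1.0] * len(values)
        scaled = [(v - lo) / (hi - lo) for v in values]
        return scaled if higher_is_better else [1 - s for s in scaled]

    services = {"svcA": {"response_ms": 120, "throughput": 90, "availability": 0.99},
                "svcB": {"response_ms": 300, "throughput": 70, "availability": 0.95}}
    weights = {"response_ms": 0.4, "throughput": 0.3, "availability": 0.3}
    better = {"response_ms": False, "throughput": True, "availability": True}

    names = list(services)
    scores = {n: 0.0 for n in names}
    for attr, w in weights.items():
        norm = normalize([services[n][attr] for n in names], better[attr])
        for n, s in zip(names, norm):
            scores[n] += w * s

    best = max(scores, key=scores.get)  # service with the highest QoS score
    ```

    Here svcA dominates svcB on every attribute, so it scores highest regardless of the (hypothetical) weights; with conflicting attributes the weights encode the user's non-functional preferences.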

  13. Comparing cosmic web classifiers using information theory

    CERN Document Server

    Leclercq, Florent; Jasche, Jens; Wandelt, Benjamin

    2016-01-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-web, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  14. Comparing cosmic web classifiers using information theory

    Science.gov (United States)

    Leclercq, Florent; Lavaux, Guilhem; Jasche, Jens; Wandelt, Benjamin

    2016-08-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  15. Spatial Data Supply Chains

    Science.gov (United States)

    Varadharajulu, P.; Azeem Saqiq, M.; Yu, F.; McMeekin, D. A.; West, G.; Arnold, L.; Moncrieff, S.

    2015-06-01

    This paper describes current research into the supply of spatial data to the end user in as close to real time as possible via the World Wide Web. The Spatial Data Infrastructure paradigm has been discussed since the early 1990s. The concept has evolved significantly since then but has almost always examined data from the perspective of the supplier. It has been a supplier driven focus rather than a user driven focus. The current research being conducted is making a paradigm shift and looking at the supply of spatial data as a supply chain, similar to a manufacturing supply chain in which users play a significant part. A comprehensive consultation process took place within Australia and New Zealand incorporating a large number of stakeholders. Three research projects that have arisen from this consultation process are examining Spatial Data Supply Chains within Australia and New Zealand and are discussed within this paper.

  16. The spatial glaciological data infrastructure

    Directory of Open Access Journals (Sweden)

    T. Y. Khromova

    2014-01-01

    .mpg.igras.ru, all based on the spatial glaciological data. Another result is the digital and web-atlas «Snow and Ice of the Earth», presenting the example of open source of the spatial data on glaciology in the multi-program environment. Regional data bases created for regions of the Caucasus and the Antarctic Continent make it possible to develop various GIS models and to analyze interrelations, status and dynamics of glaciological parameters. The system of links provides easy access to distributed resources.

  17. Personalized Web Services for Web Information Extraction

    CERN Document Server

    Jarir, Zahi; Erradi, Mahammed

    2011-01-01

    The field of information extraction from the Web emerged with the growth of the Web and the multiplication of online data sources. This paper is an analysis of information extraction methods. It presents a service oriented approach for web information extraction considering both web data management and extraction services. Then we propose an SOA based architecture to enhance flexibility and on-the-fly modification of web extraction services. An implementation of the proposed architecture is proposed on the middleware level of Java Enterprise Edition (JEE) servers.

  18. Web Analytics

    OpenAIRE

    Mužík, Zbyněk

    2006-01-01

    The thesis deals with measuring indicators related to the operation of web sites and applications, and with the technological tools that serve this purpose: Web Analytics (WA). The main goal of the thesis is to test and compare selected representatives of these tools against objective criteria, and also to critically assess the possibilities of WA tools in general. The first part of the thesis describes various ways of measuring traffic on the WWW and defines the related metrics. It also provides an overview of availab...

  19. Measuring extremal dependencies in web graphs

    NARCIS (Netherlands)

    Volkovich, Y.; Litvak, Nelli; Zwart, B.

    We analyze dependencies in power law graph data (Web sample, Wikipedia sample and a preferential attachment graph) using statistical inference for multivariate regular variation. The well developed theory of regular variation is widely applied in extreme value theory, telecommunications and

  20. A Survey of Web Link Structure Information Research%Web链接结构信息研究综述

    Institute of Scientific and Technical Information of China (English)

    李剑; 金蓓弘

    2003-01-01

    As the size of the WWW grows at an incredible rate, methods that analyze only the Web pages' content have limitations. This paper presents a basic model of Web link structure. It then classifies the algorithms that analyze Web link structure information and their applications. Finally, it presents a practical approach to analyzing Web link structure information.
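    The best-known algorithm in this family of link-structure analyses is PageRank. The minimal power-iteration sketch below illustrates the idea on a made-up three-page graph; it is generic textbook PageRank, not code from the surveyed paper:

    ```python
    def pagerank(links, d=0.85, iters=50):
        """Power-iteration PageRank over a dict {page: [outlinks]}."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iters):
            new = {p: (1 - d) / n for p in pages}
            for p, outs in links.items():
                if outs:
                    share = rank[p] / len(outs)
                    for q in outs:       # distribute rank along outlinks
                        new[q] += d * share
                else:                    # dangling page: spread rank evenly
                    for q in pages:
                        new[q] += d * rank[p] / n
            rank = new
        return rank

    # Toy web graph (illustrative): C is linked to by both A and B.
    r = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
    ```

    On this toy graph the ranks sum to 1 and page C, which collects links from both A and B, ends up with the highest score.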

  1. Beyond Google: The Invisible Web in the Academic Library

    Science.gov (United States)

    Devine, Jane; Egger-Sider, Francine

    2004-01-01

    This article analyzes the concept of the Invisible Web and its implication for academic librarianship. It offers a guide to tools that can be used to mine the Invisible Web and discusses the benefits of using the Invisible Web to promote interest in library services. In addition, the article includes an expanded definition, a literature review,…

  2. The WebQuest: constructing creative learning.

    Science.gov (United States)

    Sanford, Julie; Townsend-Rocchiccioli, Judith; Trimm, Donna; Jacobs, Mike

    2010-10-01

    An exciting expansion of online educational opportunities is occurring in nursing. The use of a WebQuest as an inquiry-based learning activity can offer considerable opportunity for nurses to learn how to analyze and synthesize critical information. A WebQuest, as a constructivist, inquiry-oriented strategy, requires learners to use higher levels of thinking as a means to analyze and apply complex information, providing an exciting online teaching and learning strategy. A WebQuest is an inquiry-oriented lesson format in which most or all of the information learners work with comes from the web. This article provides an overview of the WebQuest as a teaching strategy and provides examples of its use. Copyright 2010, SLACK Incorporated.

  3. Ecological food web analysis for chemical risk assessment.

    Science.gov (United States)

    Preziosi, Damian V; Pastorok, Robert A

    2008-12-01

    Food web analysis can be a critical component of ecological risk assessment, yet it has received relatively little attention among risk assessors. Food web data are currently used in modeling bioaccumulation of toxic chemicals and, to a limited extent, in the determination of the ecological significance of risks. Achieving more realism in ecological risk assessments requires new analysis tools and models that incorporate accurate information on key receptors in a food web paradigm. Application of food web analysis in risk assessments demands consideration of: 1) different kinds of food webs; 2) definition of trophic guilds; 3) variation in food webs with habitat, space, and time; and 4) issues for basic sampling design and collection of dietary data. The different kinds of food webs include connectance webs, materials flow webs, and functional (or interaction) webs. These three kinds of webs play different roles throughout various phases of an ecological risk assessment, but risk assessors have failed to distinguish among web types. When modeling food webs, choices must be made regarding the level of complexity for the web, assignment of species to trophic guilds, selection of representative species for guilds, use of average diets, the characterization of variation among individuals or guild members within a web, and the spatial and temporal scales/dynamics of webs. Integrating exposure and effects data in ecological models for risk assessment of toxic chemicals relies on coupling food web analysis with bioaccumulation models (e.g., Gobas-type models for fish and their food webs), wildlife exposure models, dose-response models, and population dynamics models.

  4. The Creation Of Web-Based Geographical Information System Of Drugstore Distribution

    Directory of Open Access Journals (Sweden)

    Sandy Kosasi

    2016-06-01

    Full Text Available Difficulties in finding the locations of drugstores in a certain area often arise, leaving customers' demands unsatisfied; low competitiveness of the drugstores is another impact. This research aims to create a web-based geographical information system (WebGIS) of the distribution of all drugstores located in the sub-districts of Pontianak, using a web-based mapping approach. WebGIS is beneficial because integrated information (i.e. spatial and non-spatial data) can be provided. The information displays are also interactive and make it easy for customers to find the locations of the drugstores in a given view. Moreover, the system has a maptip feature that shows, for each coordinate point, the drugstore name, medical schedules, services, insurance, and address. The search column can be used to look up information on medicines and on the physical appearance of drugstore buildings. Finally, the system maps all drugstores and gives accurate information, so that customers can analyze and find the drugstores within the nearest distance.

  5. Het WEB leert begrijpen

    CERN Multimedia

    Stroeykens, Steven

    2004-01-01

    The Web could be much more useful if computers understood something of the information on Web pages. That is the goal of the "semantic Web", a project in which, among others, Tim Berners-Lee, the inventor of the original Web, takes part.

  6. Instant responsive web design

    CERN Document Server

    Simmons, Cory

    2013-01-01

    A step-by-step tutorial approach which will teach the readers what responsive web design is and how it is used in designing a responsive web page.If you are a web-designer looking to expand your skill set by learning the quickly growing industry standard of responsive web design, this book is ideal for you. Knowledge of CSS is assumed.

  7. Handbook of web surveys

    NARCIS (Netherlands)

    Bethlehem, J.; Biffignandi, S.

    2012-01-01

    Best practices to create and implement highly effective web surveys. Exclusively combining design and sampling issues, Handbook of Web Surveys presents a theoretical yet practical approach to creating and conducting web surveys. From the history of web surveys to various modes of data collection to ti

  8. Spatial planning

    OpenAIRE

    Dimitrov, Nikola; Koteski, Cane

    2016-01-01

    The professional book "Spatial Planning" contains chapters on: space; the concept and definition of space; space as a system; spatial economics; the economic essence of space; space planning; social determinants of spatial planning; spatial planning as a process; factors of development and elements in spatial planning; methodology; components and content of spatial planning; stages and types of preparation of spatial plans; spatial planning and industrialization; industrialization, urbanization and s...

  9. Spatial planning

    OpenAIRE

    Dimitrov, Nikola; Koteski, Cane

    2016-01-01

    The professional book "Spatial Planning" contains chapters on: space; the concept and definition of space; space as a system; spatial economics; the economic essence of space; space planning; social determinants of spatial planning; spatial planning as a process; factors of development and elements in spatial planning; methodology; components and content of spatial planning; stages and types of preparation of spatial plans; spatial planning and industrialization; industrialization, urbanization and s...

  10. Study on online community user motif using web usage mining

    Science.gov (United States)

    Alphy, Meera; Sharma, Ajay

    2016-04-01

    Web usage mining is the application of data mining used to extract useful information from online communities. The World Wide Web contains at least 4.73 billion pages according to Indexed Web, and at least 228.52 million pages according to Dutch Indexed Web (as of Thursday, 6 August 2015). It is difficult to get the data one needs from these billions of web pages; herein lies the importance of web usage mining. Personalizing the search engine helps web users identify the most used data in an easy way: it reduces time consumption and enables automatic site search and automatic restoration of useful sites. This study surveys the techniques used in pattern discovery and analysis in web usage mining, from the earliest to the latest, covering 1996 to 2015. Analyzing user motifs helps in the improvement of business, e-commerce, personalisation and websites.
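    A minimal flavor of the pattern-discovery step in web usage mining is counting page and page-pair support across user sessions. The clickstream below is hypothetical, and sessionizing raw server logs by user/IP and timeout is assumed to have happened already:

    ```python
    from collections import Counter
    from itertools import combinations

    # Toy sessionized clickstream (illustrative page names).
    sessions = [["home", "search", "product"],
                ["home", "product", "cart"],
                ["home", "search", "product", "cart"]]

    # Support = number of sessions containing a page (or a page pair).
    page_support = Counter(p for s in sessions for p in set(s))
    pair_support = Counter(pair for s in sessions
                           for pair in combinations(sorted(set(s)), 2))

    # Page pairs visited together in at least 2 sessions.
    frequent_pairs = [pair for pair, c in pair_support.items() if c >= 2]
    ```

    Frequent co-visited pairs like ("home", "product") are the raw material for personalization: pages often seen together in past sessions can be recommended to new visitors.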

  11. Classifying web genres in context: a case study documenting the web genres used by a software engineer

    NARCIS (Netherlands)

    Montesi, M.; Navarrete, T.

    2008-01-01

    This case study analyzes the Internet-based resources that a software engineer uses in his daily work. Methodologically, we studied the web browser history of the participant, classifying all the web pages he had seen over a period of 12 days into web genres. We interviewed him before and after the

  12. Geographic Information Systems and Web Page Development

    Science.gov (United States)

    Reynolds, Justin

    2004-01-01

    The Facilities Engineering and Architectural Branch is responsible for the design and maintenance of buildings, laboratories, and civil structures. In order to improve efficiency and quality, the FEAB has dedicated itself to establishing a data infrastructure based on Geographic Information Systems (GIS). The value of GIS was explained in an article dating back to 1980 entitled "Need for a Multipurpose Cadastre," which stated, "There is a critical need for a better land-information system in the United States to improve land-conveyance procedures, furnish a basis for equitable taxation, and provide much-needed information for resource management and environmental planning." Scientists and engineers both point to GIS as the solution. What is GIS? According to most textbooks, a Geographic Information System is a class of software that stores, manages, and analyzes mappable features on, above, or below the surface of the earth. GIS software is basically database management software applied to the management of spatial data and information. Simply put, Geographic Information Systems manage, analyze, chart, graph, and map spatial information. At the outset, I was given goals and expectations from my branch and from my mentor with regard to the further implementation of GIS. Those goals are as follows: (1) Continue the development of GIS for the underground structures. (2) Extract and export annotated data from AutoCAD drawing files and construct a database (to serve as a prototype for future work). (3) Examine existing underground record drawings to determine existing and non-existing underground tanks. Once this data was collected and analyzed, I set out on the task of creating a user-friendly database that could be accessed by all members of the branch. It was important that the database be built using programs that most employees already possess, ruling out most AutoCAD-based viewers. Therefore, I set out to create an Access database that translated onto the web using Internet

  13. Web Project Management

    OpenAIRE

    Suralkar, Sunita; Joshi, Nilambari; Meshram, B B

    2013-01-01

    This paper describes the need for Web project management and the fundamentals of project management for web projects: what it is, why projects go wrong, and what's different about web projects. We also discuss Cost Estimation Techniques based on Size Metrics. Though Web project development is similar to traditional software development applications, the special characteristics of Web application development require the adaptation of many software engineering approaches or even the development of comple...

  14. Web Project Management

    OpenAIRE

    2013-01-01

    This paper describes the need for Web project management and the fundamentals of project management for web projects: what it is, why projects go wrong, and what's different about web projects. We also discuss Cost Estimation Techniques based on Size Metrics. Though Web project development is similar to traditional software development applications, the special characteristics of Web application development require the adaptation of many software engineering approaches or even the development of comple...

  15. Web Science 2015

    OpenAIRE

    Boucher, Andy; Cameron, David; Gaver, William; Hauenstein, Mark; Jarvis, Nadine; Kerridge, Tobie; Michael, Mike; Ovalle, Liliana; Pennington, Sarah; Wilkie, Alex

    2015-01-01

    Web Science 2015 conference exhibition. Web Science is the emergent study of the people and technologies, applications, processes and practices that shape and are shaped by the World Wide Web. Web Science aims to draw together theories, methods and findings from across academic disciplines, and to collaborate with industry, business, government and civil society, to develop knowledge and understanding of the Web: the largest socio-technical infrastructure in human history.

  16. A Web-Based Information System for Field Data Management

    Science.gov (United States)

    Weng, Y. H.; Sun, F. S.

    2014-12-01

    A web-based field data management system has been designed and developed to allow field geologists to store, organize, manage, and share field data online. System requirements were analyzed and clearly defined first regarding what data are to be stored, who the potential users are, and what system functions are needed in order to deliver the right data in the right way to the right user. A 3-tiered architecture was adopted to create this secure, scalable system, which consists of a web browser at the front end, a database at the back end, and a functional logic server in the middle. Specifically, HTML, CSS, and JavaScript were used to implement the user interface in the front-end tier, an Apache web server runs PHP scripts in the middle tier, and a MySQL server is used for the back-end database. The system accepts various types of field information, including image, audio, video, numeric, and text. It allows users to select data and plot them on either Google Earth or Google Maps for the examination of spatial relations. It also makes the sharing of field data easy by converting them into XML format, which is both human-readable and machine-readable, and thus ready for reuse.
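    The XML export step described above can be sketched in a few lines of Python; the `fieldRecord` element and the field names are hypothetical illustrations, not the system's actual schema:

```python
import xml.etree.ElementTree as ET

def record_to_xml(record):
    """Serialize a field record (a flat dict) into an XML string,
    making it both human-readable and machine-readable for sharing."""
    root = ET.Element("fieldRecord")
    for key, value in record.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

# Hypothetical field observation, not real survey data:
sample = {"site": "Outcrop-7", "lat": 41.02, "lon": -81.51, "note": "folded shale"}
xml_text = record_to_xml(sample)
```

A real system would validate field types (image, audio, numeric, text) before serialization; this sketch only shows the dict-to-XML step.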

  17. WEB-GIS SOLUTIONS DEVELOPMENT FOR CITIZENS AND WATER COMPANIES

    Directory of Open Access Journals (Sweden)

    M. Şercăianu

    2013-05-01

    Full Text Available This paper describes the development of a web-GIS solution through which urban residents of Buzau City could be involved in the decision-support process of water companies, in order to reduce water losses by collecting information directly from citizens. In recent years, reducing the material and economic losses recorded across the municipal network management process has become the main focus of public companies in Romania. Due to the complexity of the problems that arise in collecting information from citizens and of the issues identified in urban areas, several analyses were required of web-GIS solutions used in areas such as local government, public utilities, environmental protection, and financial management. Another important problem is the poor development of the spatial database infrastructure found in public companies and its connection to web platforms. Developing the entire communication process between residents and municipal companies required the use of the "citizen-sensor" concept throughout the reporting process. The reported problems relate to water distribution networks, with the possibility of covering the entire public utilities infrastructure.

  18. Web-Gis Solutions Development for Citizens and Water Companies

    Science.gov (United States)

    Şercăianu, M.

    2013-05-01

    This paper describes the development of a web-GIS solution through which urban residents of Buzau City could be involved in the decision-support process of water companies, in order to reduce water losses by collecting information directly from citizens. In recent years, reducing the material and economic losses recorded across the municipal network management process has become the main focus of public companies in Romania. Due to the complexity of the problems that arise in collecting information from citizens and of the issues identified in urban areas, several analyses were required of web-GIS solutions used in areas such as local government, public utilities, environmental protection, and financial management. Another important problem is the poor development of the spatial database infrastructure found in public companies and its connection to web platforms. Developing the entire communication process between residents and municipal companies required the use of the "citizen-sensor" concept throughout the reporting process. The reported problems relate to water distribution networks, with the possibility of covering the entire public utilities infrastructure.

  19. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools on measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities, and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools, exploring their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either the IT or the marketing branch. The paper contributes by highlighting the support for management that web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  20. DERIVING USER ACCESS PATTERNS AND MINING WEB COMMUNITY WITH WEB-LOG DATA FOR PREDICTING USER SESSIONS WITH PAJEK

    Directory of Open Access Journals (Sweden)

    S. Balaji

    2012-10-01

    Full Text Available Web logs are a young and dynamic media type. Due to the intrinsic relationships among Web objects and the lack of a uniform schema for web documents, Web community mining has become a significant area of Web data management and analysis. The research on Web communities extends across a number of research domains. In this paper an ontological model is presented together with some recent studies on this topic, which cover finding relevant Web pages based on linkage information and discovering user access patterns by analyzing Web log files. A simulation has been created with data crawled from an academic website. The simulation is implemented in a JAVA and ORACLE environment. Results show that prediction of user sessions can yield plenty of vital information for business intelligence. Search engine optimization could also use these results, which are discussed in the paper in detail.
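    The user-session discovery underlying this kind of web-log analysis is commonly done by sessionizing requests with an inactivity timeout; a minimal sketch, assuming a 30-minute gap rule (the paper does not state its actual sessionization parameters):

```python
from datetime import datetime, timedelta

def sessionize(entries, timeout=timedelta(minutes=30)):
    """Group (ip, timestamp, url) log entries into per-user sessions,
    starting a new session whenever the gap between consecutive
    requests from the same IP exceeds `timeout`."""
    sessions = {}
    for ip, ts, url in sorted(entries, key=lambda e: (e[0], e[1])):
        user_sessions = sessions.setdefault(ip, [])
        if user_sessions and ts - user_sessions[-1][-1][0] <= timeout:
            user_sessions[-1].append((ts, url))   # continue current session
        else:
            user_sessions.append([(ts, url)])     # open a new session
    return sessions

# Hypothetical log entries for one visitor:
log = [
    ("10.0.0.1", datetime(2012, 10, 1, 9, 0), "/index"),
    ("10.0.0.1", datetime(2012, 10, 1, 9, 10), "/courses"),
    ("10.0.0.1", datetime(2012, 10, 1, 11, 0), "/index"),  # 110-minute gap
]
sessions = sessionize(log)
```

The resulting per-user session sequences are what a tool such as Pajek would then take as input for network analysis.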

  1. Web Information Extraction

    Institute of Scientific and Technical Information of China (English)

    李晶; 陈恩红

    2003-01-01

    With the tremendous amount of information available on the Web, the ability to quickly obtain information has become a crucial problem. It is not enough to acquire information with Web information retrieval technology alone; therefore more and more people pay attention to Web information extraction technology. This paper first introduces some concepts of information extraction technology, then introduces and analyzes several typical Web information extraction methods based on the differences in their extraction patterns.
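    Pattern-based extraction, the family of methods the paper surveys, can be illustrated with a toy wrapper; the field names and regular expressions below are invented for illustration and belong to no particular surveyed system:

```python
import re

# Each target field is tied to a regular expression over the page's HTML.
PATTERNS = {
    "title": re.compile(r"<h1>(.*?)</h1>", re.S),
    "price": re.compile(r'<span class="price">(.*?)</span>', re.S),
}

def extract(html):
    """Apply each field's pattern and collect the first match."""
    result = {}
    for field, pattern in PATTERNS.items():
        match = pattern.search(html)
        if match:
            result[field] = match.group(1).strip()
    return result

page = '<h1>USB Cable</h1><p>In stock.</p><span class="price">$3.99</span>'
record = extract(page)
```

Real extraction systems induce such patterns automatically or operate on the DOM tree rather than raw text; the sketch only shows the pattern-matching core.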

  2. Analyzing Agricultural Agglomeration in China

    Directory of Open Access Journals (Sweden)

    Erling Li

    2017-02-01

    Full Text Available There has been little scholarly research on Chinese agriculture's geographic pattern of agglomeration and its evolutionary mechanisms, which are essential to sustainable development in China. By calculating barycenter coordinates, the Gini coefficient, spatial autocorrelation, and specialization indices for 11 crops during 1981–2012, we analyze the evolutionary pattern and mechanisms of agricultural agglomeration. We argue that the degree of spatial concentration of Chinese planting has been gradually increasing and that regional specialization and diversification have progressively been strengthened. Furthermore, Chinese crop production is moving from the eastern provinces to the central and western provinces. This is in contrast to Chinese manufacturing growth, which has continued to be concentrated in the coastal and southeastern regions. In Northeast China, the Sanjiang and Songnen plains have become agricultural clustering regions, and the earlier domination of aquaculture and rice production in Southeast China has gradually decreased. In summary, this paper provides a political economy framework for understanding the regionalization of Chinese agriculture, focusing on the interaction among objectives, decision-making behavior, path dependencies, and spatial effects.
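    The Gini coefficient used here as a concentration measure can be computed directly from per-region output values; a minimal sketch with made-up numbers, not the paper's data:

```python
def gini(values):
    """Gini coefficient of non-negative values:
    0 = perfectly even spatial distribution, approaching 1 = fully concentrated."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard rank-weighted formula: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

even = gini([10, 10, 10, 10])       # output spread evenly over four provinces
concentrated = gini([0, 0, 0, 40])  # all output in a single province
```

With four regions the maximum attainable value is (n-1)/n = 0.75, which the fully concentrated case reaches.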

  3. On the Querying for Places on the Mobile Web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2011-01-01

    The web is undergoing a fundamental transformation: it is becoming mobile and is acquiring a spatial dimension. Thus, the web is increasingly being used from mobile devices, notably smartphones, that can be geo-positioned using GPS or technologies that exploit wireless communication networks. In addition, web content is being geo-tagged. This transformation calls for new, spatio-textual query functionality. The research community is hard at work enabling efficient support for such functionality.
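    A common form of spatio-textual querying ranks objects by a weighted combination of text relevance and spatial proximity; the scoring function below is a toy sketch of that idea, not a method taken from the abstract:

```python
import math

def spatio_textual_score(query_terms, query_loc, doc, alpha=0.5):
    """Rank score for one document: alpha weights text overlap against
    spatial proximity. `doc` is (set_of_terms, (x, y)). Illustrative only."""
    terms, loc = doc
    text_rel = len(set(query_terms) & set(terms)) / max(len(query_terms), 1)
    proximity = 1.0 / (1.0 + math.dist(query_loc, loc))
    return alpha * text_rel + (1 - alpha) * proximity

# Two hypothetical geo-tagged pages with identical text, different locations:
docs = {
    "near_cafe": ({"coffee", "wifi"}, (0.0, 1.0)),
    "far_cafe": ({"coffee", "wifi"}, (40.0, 3.0)),
}
scores = {name: spatio_textual_score(["coffee"], (0.0, 0.0), d)
          for name, d in docs.items()}
```

Efficient implementations combine inverted text indexes with spatial indexes (e.g. R-trees) instead of scoring every object, but the ranking principle is the same.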

  4. Watershed management and the web

    Energy Technology Data Exchange (ETDEWEB)

    Voinov, A.; Costanza, R. [Univ. of Maryland, Solomons, MD (United States). Inst. for Ecological Economics

    1999-08-01

    Watershed analysis and watershed management are developing as tools of integrated ecological and economic study. They also assist decision-making at the regional scale. The new technology and thinking offered by the advent of the Internet and the World Wide Web are highly complementary to some of the goals of watershed analysis. Services delivered by the Web are open, interactive, spatially distributed, hierarchical, and flexible. The Web offers the ability to display information creatively, to interact with that information, and to change and modify it remotely. In this way the Internet provides a much-needed opportunity to deliver scientific findings and information to stakeholders and to link stakeholders together, providing for collective decision-making. The benefits fall into two major categories: methodological and educational. Methodologically, the approach furthers the watershed management concept, offering an avenue for practical implementation of watershed management principles. For educational purposes the Web is a source of data and insight serving a variety of needs at all levels.

  5. Semantic Web Technologies for the Adaptive Web

    DEFF Research Database (Denmark)

    Dolog, Peter

    2007-01-01

    Ontologies and reasoning are the key terms brought into focus by the semantic web community. Formal representation of ontologies in a common data model on the web can be taken as a foundation for adaptive web technologies as well. This chapter describes how ontologies shared on the semantic web provide conceptualization for the links which are a main vehicle to access information on the web. The subject domain ontologies serve as constraints for generating only those links which are relevant for the domain a user is currently interested in. Furthermore, user model ontologies provide additional means for deciding which links to show, annotate, hide, generate, and reorder. The semantic web technologies provide means to formalize the domain ontologies and metadata created from them. The formalization enables reasoning for personalization decisions. This chapter describes which components...

  6. “Wrapping” X3DOM around Web Audio API

    Directory of Open Access Journals (Sweden)

    Andreas Stamoulias

    2015-12-01

    Full Text Available Spatial sound has a conceptual role in Web3D environments, due to the highly realistic scenes it can provide. Lately, efforts have concentrated on extending X3D/X3DOM with spatial sound attributes. This paper presents a novel method for the introduction of spatial sound components into the X3DOM framework, based on the X3D specification and the Web Audio API. The proposed method incorporates enhanced sound nodes for X3DOM, derived from the implementation of the X3D standard components and enriched with additional features of the Web Audio API. Moreover, several example scenarios were developed for the evaluation of our approach. The implemented examples establish the feasibility of the newly registered nodes in X3DOM for spatial sound characteristics in Web3D virtual worlds.

  7. Web geoprocessing services on GML with a fast XML database ...

    African Journals Online (AJOL)

    Web geoprocessing services on GML with a fast XML database. ... However, sometimes the users first have to process available spatial data to obtain the ... we proposed a suitable system prototype design combining the Model View Controller ...

  8. U.S. Geological Survey spatial data access

    Science.gov (United States)

    Faundeen, John L.; Kanengieter, Ronald L.; Buswell, Michael D.

    2002-01-01

    The U.S. Geological Survey (USGS) has done a progress review on improving access to its spatial data holdings over the Web. The USGS EROS Data Center has created three major Web-based interfaces to deliver spatial data to the general public; they are Earth Explorer, the Seamless Data Distribution System (SDDS), and the USGS Web Mapping Portal. Lessons were learned in developing these systems, and various resources were needed for their implementation. The USGS serves as a fact-finding agency in the U.S. Government that collects, monitors, analyzes, and provides scientific information about natural resource conditions and issues. To carry out its mission, the USGS has created and managed spatial data since its inception. Originally relying on paper maps, the USGS now uses advanced technology to produce digital representations of the Earth’s features. The spatial products of the USGS include both source and derivative data. Derivative datasets include Digital Orthophoto Quadrangles (DOQ), Digital Elevation Models, Digital Line Graphs, land-cover Digital Raster Graphics, and the seamless National Elevation Dataset. These products, created with automated processes, use aerial photographs, satellite images, or other cartographic information such as scanned paper maps as source data. With Earth Explorer, users can search multiple inventories through metadata queries and can browse satellite and DOQ imagery. They can place orders and make payment through secure credit card transactions. Some USGS spatial data can be accessed with SDDS. The SDDS uses an ArcIMS map service interface to identify the user’s areas of interest and determine the output format; it allows the user to either download the actual spatial data directly for small areas or place orders for larger areas to be delivered on media. The USGS Web Mapping Portal provides views of national and international datasets through an ArcIMS map service interface. In addition, the map portal posts news about new

  9. Research and Application of a WebGIS Open-Source Platform Based on OGC Standards

    Institute of Scientific and Technical Information of China (English)

    于艳超; 许捍卫

    2015-01-01

    The paper designs and develops a WebGIS open-source platform based on OGC standards, building on an analysis of WebGIS platforms, open-source GIS, OGC standards, and Web services, in order to satisfy the need for spatial information sharing. The way the WebGIS is realized is then elaborated in detail. The system can efficiently carry out vector data storage, visualization and browsing, spatial object attribute queries, map editing, and spatial analysis, and it provides a standard data access interface for integrated applications. Finally, the system is applied to the rainfall and water information service of the Taihu Basin and achieves good results.

  10. Sounds of Web Advertising

    DEFF Research Database (Denmark)

    Jessen, Iben Bredahl; Graakjær, Nicolai Jørgensgaard

    2010-01-01

    Sound seems to be a neglected issue in the study of web ads. Web advertising is predominantly regarded as a visual phenomenon: commercial messages, such as banner ads, that we watch, read, and eventually click on, but only rarely as something that we listen to. The present chapter presents an overview of the auditory dimensions of web advertising: Which kinds of sounds do we hear in web ads? What are the conditions and functions of sound in web ads? Moreover, the chapter proposes a theoretical framework in order to analyse the communicative functions of sound in web advertising. The main argument is that an understanding of the auditory dimensions of web advertising must include a reflection on the hypertextual settings of the web ad as well as a perspective on how users engage with web content.

  11. Web Mining and Social Networking

    DEFF Research Database (Denmark)

    Xu, Guandong; Zhang, Yanchun; Li, Lin

    This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web ...... sense of individuals or communities. The volume will benefit both academic and industry communities interested in the techniques and applications of web search, web data management, web mining and web knowledge discovery, as well as web community and social network analysis....

  12. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  13. Enabling Spatial OLAP Over Environmental and Farming Data with QB4SOLAP

    DEFF Research Database (Denmark)

    Gur, Nurefsan; Hose, Katja; Pedersen, Torben Bach

    2016-01-01

    Governmental organizations and agencies have been making large amounts of spatial data available on the Semantic Web (SW). However, we still lack efficient techniques for analyzing such large amounts of data as we know them from relational database systems, e.g., multidimensional (MD) data warehouses and On-line Analytical Processing (OLAP). A basic prerequisite to enable such advanced analytics is a well-defined schema, which can be defined using the QB4SOLAP vocabulary that provides sufficient context for spatial OLAP (SOLAP). In this paper, we address the challenging problem of MD querying...

  14. Web 2.0

    CERN Document Server

    Han, Sam

    2012-01-01

    Web 2.0 is a highly accessible introductory text examining all the crucial discussions and issues which surround the changing nature of the World Wide Web. It not only contextualises Web 2.0 within the history of the Web, but also goes on to explore its position within the broader dispositif of emerging media technologies. The book uncovers the connections between diverse media technologies including mobile smart phones, hand-held multimedia players, "netbooks" and electronic book readers such as the Amazon Kindle, all of which are made possible only by Web 2.0. In addition, Web 2.0 m

  15. Handbook of web surveys

    CERN Document Server

    Bethlehem, Jelke

    2011-01-01

    BEST PRACTICES TO CREATE AND IMPLEMENT HIGHLY EFFECTIVE WEB SURVEYS Exclusively combining design and sampling issues, Handbook of Web Surveys presents a theoretical yet practical approach to creating and conducting web surveys. From the history of web surveys to various modes of data collection to tips for detecting error, this book thoroughly introduces readers to this cutting-edge technique and offers tips for creating successful web surveys. The authors provide a history of web surveys and go on to explore the advantages and disadvantages of this mode of dat

  16. The Intermodulation Lockin Analyzer

    CERN Document Server

    Tholen, Erik A; Forchheimer, Daniel; Schuler, Vivien; Tholen, Mats O; Hutter, Carsten; Haviland, David B

    2011-01-01

    Nonlinear systems can be probed by driving them with two or more pure tones while measuring the intermodulation products of the drive tones in the response. We describe a digital lock-in analyzer which is designed explicitly for this purpose. The analyzer is implemented on a field-programmable gate array, providing speed in analysis, real-time feedback and stability in operation. The use of the analyzer is demonstrated for Intermodulation Atomic Force Microscopy. A generalization of the intermodulation spectral technique to arbitrary drive waveforms is discussed.

  17. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of "analyzing in the present" as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts... the interdependency between researcher and researched. On this basis, we advocate an explicit "open-state-of-mind" listening as a key aspect of analyzing qualitative material, often described only as a matter of reading transcribed empirical materials, reading theory, and writing. The article contributes...

  18. Citizen Science and the Modern Web

    CERN Document Server

    CERN. Geneva

    2014-01-01

    Beginning as a research project to help scientists communicate, the Web has transformed into a ubiquitous medium. As the sciences continue to transform, new techniques are needed to analyze the vast amounts of data being produced by large experiments. The advent of the Sloan Digital Sky Survey increased throughput of astronomical data, giving rise to Citizen Science projects such as Galaxy Zoo. The Web is no longer exclusively used by researchers, but rather, a place where anyone can share information, or even, partake in citizen science projects. As the Web continues to evolve, new and open technologies enable web applications to become more sophisticated. Scientific toolsets may now target the Web as a platform, opening an application to a wider audience, and potentially citizen scientists. With the latest browser technologies, scientific data may be consumed and visualized, opening the browser as a new platform for scientific analysis.

  19. Web Log Clustering Approaches – A Survey

    Directory of Open Access Journals (Sweden)

    G. Sudhamathy,

    2011-07-01

    Full Text Available As more organizations rely on the Internet and the World Wide Web to conduct business, the proposed strategies and techniques for market analysis need to be revisited in this context. We therefore present a survey of the most recent work in the field of Web usage mining, focusing on three different approaches to web log clustering. Clustering analysis is a widely used data mining technique: the process of partitioning a set of data objects into a number of clusters, where each data object shares high similarity with the other objects within the same cluster but is quite dissimilar to objects in other clusters. In this work we discuss three different approaches to web log clustering and analyze their benefits and drawbacks. We finally conclude on the most efficient algorithm based on the results of experiments conducted with various web log files.
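    One simple clustering scheme for web-log sessions, in the spirit of the approaches surveyed (though not any specific one), groups sessions by Jaccard similarity of the pages they visit:

```python
def jaccard(a, b):
    """Similarity of two sessions, each represented as a set of visited pages."""
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_sessions(sessions, threshold=0.5):
    """Greedy single-pass clustering: place a session in the first cluster
    whose founding session is similar enough, else start a new cluster.
    A toy stand-in for the algorithms the survey compares."""
    clusters = []
    for s in sessions:
        for cluster in clusters:
            if jaccard(s, cluster[0]) >= threshold:
                cluster.append(s)
                break
        else:
            clusters.append([s])
    return clusters

# Hypothetical page-visit sessions:
sessions = [{"home", "news"}, {"home", "news", "sports"}, {"cart", "checkout"}]
clusters = cluster_sessions(sessions)
```

The greedy pass is order-sensitive; production approaches (k-means variants, hierarchical clustering) iterate to refine cluster assignments.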

  20. Being, space and time in the Web

    CERN Document Server

    Vafopoulos, Michalis

    2011-01-01

    The Web emerged as the antidote to a rapidly increasing quantity of accumulated knowledge because it successfully enables massive representation and communication at minimal cost. Despite the fact that its gigantic scale and impact make it difficult to anticipate the effects on humans, we demand that it be fast, secure, reliable, all-inclusive, and trustworthy. It is time for science to compensate and provide an epistemological "antidote" to these issues. In this campaign, Philosophy should be in the front line by forming the relevant questions. We initiate the dialogue for a theory about being in the Web that will serve as a bridge between philosophical thinking and engineering. We analyze existence and spatiotemporality in the Web, as a closed techno-social system, and how it transforms the traditional conceptions of actuality. Location in the Web space is specified by a Web being's URI and the URIs of its incoming and outgoing links. The primary role of visiting durations is best approximated by Bergsonian...

  1. Analyzing binding data.

    Science.gov (United States)

    Motulsky, Harvey J; Neubig, Richard R

    2010-07-01

    Measuring the rate and extent of radioligand binding provides information on the number of binding sites and on the affinity and accessibility of these binding sites for various drugs. This unit explains how to design and analyze such experiments.
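    For one-site saturation binding, specific binding follows the standard relation B = Bmax·[L]/(Kd + [L]); a minimal sketch with illustrative values, not fitted experimental data:

```python
def specific_binding(ligand_conc, bmax, kd):
    """One-site saturation binding: B = Bmax * [L] / (Kd + [L]).
    bmax and kd below are made-up illustrative values."""
    return bmax * ligand_conc / (kd + ligand_conc)

# At [L] == Kd, binding is half-maximal by construction of the equation.
half = specific_binding(ligand_conc=2.0, bmax=100.0, kd=2.0)
```

In practice Bmax and Kd are estimated by nonlinear regression over a dilution series rather than assumed.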

  2. Analog multivariate counting analyzers

    CERN Document Server

    Nikitin, A V; Armstrong, T P

    2003-01-01

    Characterizing rates of occurrence of various features of a signal is of great importance in numerous types of physical measurements. Such signal features can be defined as certain discrete coincidence events, e.g. crossings of a signal with a given threshold, or occurrence of extrema of a certain amplitude. We describe measuring rates of such events by means of analog multivariate counting analyzers. Given a continuous scalar or multicomponent (vector) input signal, an analog counting analyzer outputs a continuous signal with the instantaneous magnitude equal to the rate of occurrence of certain coincidence events. The analog nature of the proposed analyzers allows us to reformulate many problems of the traditional counting measurements, and cast them in a form which is readily addressed by methods of differential calculus rather than by algebraic or logical means of digital signal processing. Analog counting analyzers can be easily implemented in discrete or integrated electronic circuits, do not suffer fro...
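    The coincidence events the analyzer counts, such as threshold crossings, are easy to illustrate in discrete form (the actual device operates on continuous signals in analog circuitry; this digital sketch is for illustration only):

```python
def upward_crossings(samples, threshold):
    """Count upward crossings of `threshold` in a sampled signal:
    an event fires when one sample is below the threshold and the
    next sample reaches or exceeds it."""
    count = 0
    for prev, cur in zip(samples, samples[1:]):
        if prev < threshold <= cur:
            count += 1
    return count

# Hypothetical sampled waveform:
signal = [0.0, 1.2, 0.4, 1.5, 0.9, 2.0]
crossings = upward_crossings(signal, threshold=1.0)
```

Dividing the count by the observation time gives the event rate, which is the quantity the analog analyzer outputs continuously.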

  3. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small dimension instruments able to detect with great sensitivity and accuracy the main environmental pollutants. The mass spectrometers are very suited instrument for chemical and isotopic analysis, needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, the 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filter and trap, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk, for each of them some performances being sacrificed but we must know which parameters are necessary to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  4. Analyzing Microarray Data.

    Science.gov (United States)

    Hung, Jui-Hung; Weng, Zhiping

    2017-03-01

    Because there is no widely used software for analyzing RNA-seq data that has a graphical user interface, this protocol provides an example of analyzing microarray data using Babelomics. This analysis entails performing quantile normalization and then detecting differentially expressed genes associated with the transgenesis of a human oncogene c-Myc in mice. Finally, hierarchical clustering is performed on the differentially expressed genes using the Cluster program, and the results are visualized using TreeView.
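    Quantile normalization, the first step of the protocol, can be sketched without any microarray software; this is a bare-bones stand-in for what Babelomics performs internally, on toy numbers rather than expression data:

```python
def quantile_normalize(columns):
    """Quantile-normalize equal-length sample columns: each column's i-th
    smallest value is replaced by the mean of the i-th smallest values
    across all columns, forcing every sample onto the same distribution."""
    n = len(columns[0])
    orders = [sorted(range(n), key=col.__getitem__) for col in columns]
    sorted_cols = [sorted(col) for col in columns]
    rank_means = [sum(col[i] for col in sorted_cols) / len(columns)
                  for i in range(n)]
    result = []
    for order in orders:
        out = [0.0] * n
        for rank, idx in enumerate(order):
            out[idx] = rank_means[rank]  # substitute the per-rank mean
        result.append(out)
    return result

a, b = quantile_normalize([[5.0, 2.0, 3.0], [4.0, 1.0, 6.0]])
```

After normalization both columns contain exactly the same set of values, only ordered differently, which is the defining property of the method. Ties need extra handling in a full implementation.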

  5. EVOLUTION OF THE WORLD WIDE WEB: FROM WEB 1.0 TO WEB 4.0

    Directory of Open Access Journals (Sweden)

    Sareh Aghaei

    2012-02-01

    Full Text Available The World Wide Web, as the largest information construct, has made much progress since its advent. This paper provides a background on the evolution of the web from web 1.0 to web 4.0. Web 1.0 as a web of information connections, Web 2.0 as a web of people connections, Web 3.0 as a web of knowledge connections, and Web 4.0 as a web of intelligence connections are described as the four generations of the web in the paper.

  6. EPA Web Taxonomy

    Data.gov (United States)

    U.S. Environmental Protection Agency — EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's...

  7. Chemical Search Web Utility

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Chemical Search Web Utility is an intuitive web application that allows the public to easily find the chemical that they are interested in using, and which...

  8. Practical web development

    CERN Document Server

    Wellens, Paul

    2015-01-01

    This book is perfect for beginners who want to get started and learn the web development basics, but also offers experienced developers a web development roadmap that will help them to extend their capabilities.

  9. Wordpress web application development

    CERN Document Server

    Ratnayake, Rakhitha Nimesh

    2015-01-01

    This book is intended for WordPress developers and designers who want to develop quality web applications within a limited time frame and for maximum profit. Prior knowledge of basic web development and design is assumed.

  10. Study And Implementation Of LCS Algorithm For Web Mining

    Directory of Open Access Journals (Sweden)

    Vrishali P. Sonavane

    2012-03-01

    Full Text Available The Internet is the roads and highways of the information world: the content providers are the road workers, and the visitors are the drivers. As in the real world, there can be traffic jams, wrong signs, blind alleys, and so on. The content providers, as the road workers, need information about their users to make Web site adjustments possible. Web logs store every motion on the provider's Web site, so the providers need only a tool to analyze these logs. This tool is called Web Usage Mining. Web Usage Mining is a part of Web Mining and is the foundation for Web site analysis. It employs various knowledge discovery methods to extract Web usage patterns. In this paper we use the LCS algorithm to improve the accuracy of recommendation. The experimental results show that the approach can improve the accuracy of classification in the architecture. Using the LCS algorithm we can predict users' future requests more accurately.
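    The paper's exact recommendation architecture is not reproduced here, but the underlying primitive, the longest common subsequence between two page-access sequences, can be sketched with standard dynamic programming (the page names are hypothetical):

```python
def lcs(a, b):
    """Dynamic-programming table for the longest common subsequence of a and b."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    # Backtrack to recover one common subsequence.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

past = ["home", "news", "sports", "contact"]
current = ["home", "sports", "contact"]
print(lcs(past, current))  # ['home', 'sports', 'contact']
```

    A recommender built on this idea would compare a visitor's current session against stored sessions and favor pages from the sessions with the longest common subsequence.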

  11. Survey of Web Technologies

    OpenAIRE

    Špoljar, Boris

    2011-01-01

    The World Wide Web has become an important platform for developing and running applications. A vital step while developing web applications is the choice of web technologies on which the application will be built. The developers face a dizzying array of platforms, languages, frameworks and technical artifacts to choose from, and the decision carries consequences for most other decisions in the development process. The thesis contains analysis, classification and comparison of web technologies s...

  12. Semiautomatic Web service generation

    OpenAIRE

    Fuentes, José María de; Corella, Miguel Ángel; Castells, Pablo; Rico, Mariano

    2005-01-01

    Proceedings of the IADIS International Conference WWW/Internet 2005, held in Lisbon (Portugal). The lack of a critical mass of actually deployed web services, semantic or not, is an important hurdle for the advancement and further innovation in web service technologies. In this paper we introduce Federica, a platform for semi-automatic generation and implementation of semantic web services by exploiting existing web applications published in internet. Federica generates semantical...

  13. SPATIAL STABILITY

    OpenAIRE

    Pascal Mossay

    2004-01-01

    We consider a continuous spatial economy consisting of pure exchange local economies. Agents are allowed to change their location over time as a response to spatial utility differentials. These spatial adjustments toward higher utility neighborhoods lead the spatial economy to converge to a spatially uniform allocation of resources, provided that the matrix of price effects is quasi-negative definite. Furthermore our model provides a real time interpretation of the tâtonnement story. Also, sp...

  14. Web Mining%Web 数据挖掘

    Institute of Scientific and Technical Information of China (English)

    王实; 高文; 李锦涛

    2000-01-01

    Web Mining is an important branch of Data Mining. It attracts growing research interest as the Internet develops rapidly. Web Mining includes: (1) Web Content Mining; (2) Web Usage Mining; (3) Web Structure Mining. In this paper we define Web Mining and present an overview of the various research issues, techniques and development efforts.

  15. Web semántica y servicios web semanticos

    OpenAIRE

    Marquez Solis, Santiago

    2007-01-01

    From this Final Degree Project we want to study the evolution of the current Web to the Semantic Web.

  16. WebSelF

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær; Ernst, Erik; Brabrand, Claus

    2012-01-01

    previous work on web scraping. We conducted an experiment that evaluated several qualitatively different web scraping constituents (including previous work and combinations hereof) on about 11,000 HTML pages on daily versions of 17 web sites over a period of more than one year. Our framework solves three...

  17. Evaluating Web Usability

    Science.gov (United States)

    Snider, Jean; Martin, Florence

    2012-01-01

    Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…

  18. Web Browser Programming

    OpenAIRE

    Luján Mora, Sergio

    2006-01-01

    Presentations from the course "Web Browser Programming" taught at the Université M'Hamed Bougara (Bourmerdes, Algeria) in June 2006. Project funded by the European Union: TEMPUS JEP-32102-2004, Licence Professionnelle Technologies des Applications Web (Professional License for Web Application Technologies).

  19. WebSelF

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær; Ernst, Erik; Brabrand, Claus

    2012-01-01

    We present WebSelF, a framework for web scraping which models the process of web scraping and decomposes it into four conceptually independent, reusable, and composable constituents. We have validated our framework through a full parameterized implementation that is flexible enough to capture...

  20. Instant PHP web scraping

    CERN Document Server

    Ward, Jacob

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. Short, concise recipes to learn a variety of useful web scraping techniques using PHP. This book is aimed at those new to web scraping, with little or no previous programming experience. Basic knowledge of HTML and the Web is useful, but not necessary.

  1. Web Search Engines

    OpenAIRE

    Rajashekar, TB

    1998-01-01

    The World Wide Web is emerging as an all-in-one information source. Tools for searching Web-based information include search engines, subject directories and meta search tools. We take a look at key features of these tools and suggest practical hints for effective Web searching.

  2. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges...

  3. Impact of invasive plants on food webs and pathways

    Directory of Open Access Journals (Sweden)

    Sikai Wang

    2013-05-01

    Full Text Available In natural ecosystems, energy mainly flows along food chains in food webs. Numerous studies have shown that plant invasions influence ecosystem functions by altering food webs. In recent decades, more attention has been paid to the effects of alien plants on local food webs. In this review, we analyze the influence of exotic plants on food webs and pathways, and explore the impacts of local food web characteristics on community invasibility. Invasive plants alter food webs mainly by changing basal resources and environmental conditions in different ways. First, they are consumed by native herbivores due to their high availability, and are therefore incorporated into the native food web. Second, if they show low availability to native herbivores, a new food web is generated through the introduction of new consumers or by changing the energy pathway. Third, environmental changes induced by plant invasions may alter the population density and feeding behavior of various species at different trophic levels, so alien plants will also affect communities and food web structures along non-trophic pathways. The influence of the local food web on plant invasions depends on web size and trophic connections. Issues that deserve attention in future studies are raised and discussed. Future research should extend from short-term experiments to long-term monitoring, and more quantitative research is needed to define the responses of food web parameters. In addition, the recovery of food web structure and species interactions in restored habitats is an important issue requiring future research.

  4. Efficient Web Log Mining using Doubly Linked Tree

    CERN Document Server

    Jain, Ratnesh Kumar; Jain, Dr Suresh

    2009-01-01

    World Wide Web is a huge data repository and is growing at an explosive rate of about 1 million pages a day. As the information available on the World Wide Web grows, the usage of web sites grows with it. A web log records each access of a web page, and the number of entries in the web logs is increasing rapidly. These web logs, when mined properly, can provide useful information for decision-making. The designers of web sites, analysts and management executives are interested in extracting this hidden information from web logs. The web access pattern, which is a frequently used sequence of accesses, is one of the important pieces of information that can be mined from web logs. This information can be used to gather business intelligence to improve sales and advertisement, to personalize content for a user, to analyze system performance and to improve the web site organization. There exist many techniques to mine access patterns from the web logs. This paper describes the powerful algorithm that mine...
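    The doubly linked tree structure of the paper is not reproduced here, but the core task, finding frequently recurring access sequences in a log, can be sketched as a naive contiguous-subsequence count over hypothetical sessions:

```python
from collections import Counter

def frequent_patterns(sessions, length=2, min_support=2):
    """Count contiguous page subsequences of a given length across sessions,
    keeping only those that occur at least min_support times."""
    counts = Counter()
    for session in sessions:
        for i in range(len(session) - length + 1):
            counts[tuple(session[i:i + length])] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

sessions = [
    ["home", "products", "cart"],
    ["home", "products", "about"],
    ["news", "home", "products"],
]
print(frequent_patterns(sessions))  # {('home', 'products'): 3}
```

    Tree-based algorithms such as the one in the paper compute the same kind of support counts, but share prefixes between sessions so the log is not re-scanned for every pattern length.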

  5. Invasive mutualists erode native pollination webs.

    Directory of Open Access Journals (Sweden)

    Marcelo A Aizen

    2008-02-01

    Full Text Available Plant-animal mutualisms are characterized by weak or asymmetric mutual dependences between interacting species, a feature that could increase community stability. If invasive species integrate into mutualistic webs, they may alter web structure, with consequences for species persistence. However, the effect of alien mutualists on the architecture of plant-pollinator webs remains largely unexplored. We analyzed the extent of mutual dependency between interacting species, as a measure of mutualism strength, and the connectivity of 10 paired plant-pollinator webs, eight from forests of the southern Andes and two from oceanic islands, with different incidences of alien species. Highly invaded webs exhibited weaker mutualism than less-invaded webs. This potential increase in network stability was the result of a disproportionate increase in the importance and participation of alien species in the most asymmetric interactions. The integration of alien mutualists did not alter overall network connectivity, but links were transferred from generalist native species to super-generalist alien species during invasion. Therefore, connectivity among native species declined in highly invaded webs. These modifications in the structure of pollination webs, due to dominance of alien mutualists, can leave many native species subject to novel ecological and evolutionary dynamics.

  6. A Web GIS Framework for Participatory Sensing Service: An Open Source-Based Implementation

    Directory of Open Access Journals (Sweden)

    Yu Nakayama

    2017-04-01

    Full Text Available Participatory sensing is the process in which individuals or communities collect and analyze systematic data using mobile phones and cloud services. To efficiently develop participatory sensing services, some server-side technologies have been proposed. Although they provide a good platform for participatory sensing, they are not optimized for spatial data management and processing. For the purpose of spatial data collection and management, many web GIS approaches have been studied. However, they still have not focused on the optimal framework for participatory sensing services. This paper presents a web GIS framework for participatory sensing service (FPSS. The proposed FPSS enables an integrated deployment of spatial data capture, storage, and data management functions. In various types of participatory sensing experiments, users can collect and manage spatial data in a unified manner. This feature is realized by the optimized system architecture and use case based on the general requirements for participatory sensing. We developed an open source GIS-based implementation of the proposed framework, which can overcome financial difficulties that are one of the major problems of deploying sensing experiments. We confirmed with the prototype that participatory sensing experiments can be performed efficiently with the proposed FPSS.

  7. GIS-facilitated spatial narratives

    DEFF Research Database (Denmark)

    Møller-Jensen, Lasse; Jeppesen, Henrik; Kofie, Richard Y.

    2008-01-01

    ...on the thematic and narrative linking of a set of locations within an area. A spatial narrative that describes the - largely unsuccessful - history of Danish plantations on the Gold Coast (1788-1850) is implemented through the Google Earth client. This client is seen both as a type of media in itself for ‘home-based' exploration of sites related to the narrative and as a tool that facilitates the design of spatial narratives before implementation within portable GIS devices. The Google Earth-based visualization of the spatial narrative is created by a Python script that outputs a web-accessible KML format file. The KML...

  8. Total organic carbon analyzer

    Science.gov (United States)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  9. Web Applications of Bibliometrics and Link Analysis

    Directory of Open Access Journals (Sweden)

    Faribourz Droudi

    2010-04-01

    Full Text Available The present study aims to introduce and analyze bibliometric applications within the Web and to expound on the status of link analysis, pointing out its application with respect to existing web-based information sources. Findings indicate that bibliometrics has ready application in the area of digital resources available through the Net. Link analysis is a process by which one can statistically analyze the correlation between hyperlinks and therefore understand the accuracy, veracity and efficacy of citations within a digital document. Link analysis, in effect, counts as part of the information ranking algorithms within the web environment. The number, linkage and quality of links given to a website are of utmost importance for its ranking/status in the Web. The tools applied in this topic include the page ranking strategy, link analysis algorithms, latent semantic indexing and the classical input-output model. The present study analyzes Big Web and Small Web link analysis and explains the means for utilizing web charts in order to better understand the link analysis process.
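    The best-known of the ranking algorithms alluded to above, PageRank, can be sketched in a few lines of power iteration. The three-page link graph is invented purely for illustration:

```python
def pagerank(links, damping=0.85, iters=50):
    """Power iteration over an adjacency dict {page: [outlinks]}."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:         # each outlink receives an equal share
                    new[q] += share
            else:                      # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # "c": it collects links from both "a" and "b"
```

    The ranks always sum to one, and a page linked from many well-ranked pages ends up with a high score, which is the intuition behind link-based ranking in search engines.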

  10. Analyzing radioligand binding data.

    Science.gov (United States)

    Motulsky, Harvey; Neubig, Richard

    2002-08-01

    Radioligand binding experiments are easy to perform, and provide useful data in many fields. They can be used to study receptor regulation, discover new drugs by screening for compounds that compete with high affinity for radioligand binding to a particular receptor, investigate receptor localization in different organs or regions using autoradiography, categorize receptor subtypes, and probe mechanisms of receptor signaling, via measurements of agonist binding and its regulation by ions, nucleotides, and other allosteric modulators. This unit reviews the theory of receptor binding and explains how to analyze experimental data. Since binding data are usually best analyzed using nonlinear regression, this unit also explains the principles of curve fitting with nonlinear regression.
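    The nonlinear-regression step described above can be illustrated with a toy one-site binding model, Y = Bmax·X/(Kd + X), fitted here by a simple grid search over synthetic, noise-free data. A real analysis would use proper nonlinear least-squares software; the concentrations and parameter values are invented:

```python
def one_site(x, bmax, kd):
    """Specific binding predicted by the one-site saturation model."""
    return bmax * x / (kd + x)

# Synthetic saturation data generated with Bmax = 100, Kd = 2.
conc = [0.5, 1, 2, 4, 8, 16]
bound = [one_site(x, 100.0, 2.0) for x in conc]

def fit(conc, bound):
    """Minimize summed squared error over a coarse parameter grid."""
    best = None
    for bmax in [b / 2 for b in range(100, 400)]:   # 50.0 .. 199.5
        for kd in [k / 10 for k in range(1, 100)]:  # 0.1 .. 9.9
            sse = sum((y - one_site(x, bmax, kd)) ** 2
                      for x, y in zip(conc, bound))
            if best is None or sse < best[0]:
                best = (sse, bmax, kd)
    return best[1], best[2]

bmax, kd = fit(conc, bound)
print(bmax, kd)  # recovers 100.0 and 2.0 on this noise-free data
```

    With real, noisy data the error surface no longer reaches zero, which is why iterative nonlinear-regression routines (rather than a grid) are the standard tool.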

  11. Advances in hematology analyzers.

    Science.gov (United States)

    DeNicola, Dennis B

    2011-05-01

    The complete blood count is one of the basic building blocks of the minimum database in veterinary medicine. Over the past 20 years, there has been a tremendous advancement in the technology of hematology analyzers and their availability to the general practitioner. There are 4 basic methodologies that can be used to generate data for a complete blood count: manual methods, quantitative buffy coat analysis, automated impedance analysis, and flow cytometric analysis. This article will review the principles of these methodologies, discuss some of their advantages and disadvantages, and describe some of the hematology analyzers that are available for the in-house veterinary laboratory.

  12. Web Mining: An Overview

    Directory of Open Access Journals (Sweden)

    P. V. G. S. Mudiraj B. Jabber K. David raju

    2011-12-01

    Full Text Available Web usage mining is a main research area in Web mining, focused on learning about Web users and their interactions with Web sites. The motive of mining is to find users' access models automatically and quickly from the vast Web log data, such as frequent access paths, frequent access page groups and user clustering. Through web usage mining, the server log, registration information and other relative information left by users provide a foundation for decision making in organizations. This article provides a survey and analysis of current Web usage mining systems and technologies. There are generally three tasks in Web Usage Mining: preprocessing, pattern analysis and knowledge discovery. Preprocessing cleans the server's log file by removing log entries such as errors or failures and repeated requests for the same URL from the same host. The main task of pattern analysis is to filter uninteresting information and to visualize and interpret the interesting patterns for users. The statistics collected from the log file can help to discover the knowledge. This knowledge can be used to classify users (excellent, medium, weak) and web pages (excellent, medium, weak) based on the hit counts of the pages in the web site. The design of the website can then be restructured based on user behavior or hit counts, which provides quick responses to web users, saves memory space on servers and thus reduces HTTP requests and bandwidth utilization. This paper addresses challenges in the three phases of Web Usage Mining along with Web Structure Mining. It also discusses an application of WUM, an online recommender system that dynamically generates links to pages that have not yet been visited by a user and might be of potential interest. Differently from the recommender systems proposed so far, ONLINE MINER does not make use of any off-line component and is able to manage Web sites made up of dynamically generated pages.
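    The preprocessing step described above can be sketched as a small filter over Common Log Format lines: drop failed requests and repeated requests for the same URL from the same host, then count hits per page. The sample log entries are fabricated:

```python
import re

# host, timestamp, "METHOD url HTTP/x", status  (Common Log Format, simplified)
LOG_RE = re.compile(r'(\S+) \S+ \S+ \[.*?\] "(?:GET|POST) (\S+) \S+" (\d{3})')

def preprocess(lines):
    """Keep successful requests, drop a host's repeated request for the same
    URL, and return per-page hit counts."""
    hits, last_seen = {}, {}
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        host, url, status = m.group(1), m.group(2), int(m.group(3))
        if status >= 400:                # error or failure entries
            continue
        if last_seen.get(host) == url:   # repeated request from the same host
            continue
        last_seen[host] = url
        hits[url] = hits.get(url, 0) + 1
    return hits

log = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.0" 200',
    '1.2.3.4 - - [10/Oct/2023:13:55:40 +0000] "GET /index.html HTTP/1.0" 200',
    '1.2.3.4 - - [10/Oct/2023:13:56:02 +0000] "GET /missing HTTP/1.0" 404',
    '5.6.7.8 - - [10/Oct/2023:13:57:00 +0000] "GET /index.html HTTP/1.0" 200',
]
print(preprocess(log))  # {'/index.html': 2}
```

    The resulting hit counts are exactly the raw material for the excellent/medium/weak classification of pages mentioned in the abstract.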

  13. Research on Web Press Tension Control System

    OpenAIRE

    Chen Sheng Jiang; Zhang Chun Feng; Wang Zhong You; Li Qing Lin

    2016-01-01

    Tension control is a key and difficult point of whole-machine control in a web press. Whether tension is held properly directly determines the quality of the products. Starting from the characteristics of web press tension control, this paper expounds the main factors influencing tension and the purpose of tension control, investigates the tension control principle of the web tape, analyzes the control law and control circuit of the tension control system, and illustrates the advantages of PID control law ...
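    The PID control law mentioned in the abstract can be sketched in a few lines. The plant model below (tension responding proportionally to the controller output) and all gains are invented for illustration, not taken from the paper:

```python
def pid_step(error, state, kp=0.6, ki=0.2, kd=0.1, dt=0.1):
    """One update of a discrete PID controller.
    state = (integral, previous_error)."""
    integral, prev = state
    integral += error * dt
    derivative = (error - prev) / dt
    out = kp * error + ki * integral + kd * derivative
    return out, (integral, error)

setpoint, tension = 50.0, 0.0   # target web tension vs. current tension
state = (0.0, 0.0)
for _ in range(500):
    out, state = pid_step(setpoint - tension, state)
    tension += 0.1 * out        # toy first-order plant response
print(round(tension, 1))  # → 50.0: the loop settles at the setpoint
```

    The proportional term reacts to the current tension error, the integral term removes steady-state offset, and the derivative term damps fast changes, which is why PID is the common choice for web tension loops.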

  14. Analyzing Stereotypes in Media.

    Science.gov (United States)

    Baker, Jackie

    1996-01-01

    A high school film teacher studied how students recognized messages in film, examining how film education could help students identify and analyze racial and gender stereotypes. Comparison of students' attitudes before and after the film course found that the course was successful in raising students' consciousness. (SM)

  15. Analyzing Workforce Education. Monograph.

    Science.gov (United States)

    Texas Community & Technical Coll. Workforce Education Consortium.

    This monograph examines the issue of task analysis as used in workplace literacy programs, debating the need for it and how to perform it in a rapidly changing environment. Based on experiences of community colleges in Texas, the report analyzes ways that task analysis can be done and how to implement work force education programs more quickly.…

  16. Ajax and Web Services

    CERN Document Server

    Pruett, Mark

    2006-01-01

    Ajax and web services are a perfect match for developing web applications. Ajax has built-in abilities to access and manipulate XML data, the native format for almost all REST and SOAP web services. Using numerous examples, this document explores how to fit the pieces together. Examples demonstrate how to use Ajax to access publicly available web services from Yahoo! and Google. You'll also learn how to use web proxies to access data on remote servers and how to transform XML data using XSLT.

  17. Web services foundations

    CERN Document Server

    Bouguettaya, Athman; Daniel, Florian

    2013-01-01

    Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet.Web Services Foundations is the first installment of a two-book collection coverin

  18. Web Security, Privacy & Commerce

    CERN Document Server

    Garfinkel, Simson

    2011-01-01

    Since the first edition of this classic reference was published, World Wide Web use has exploded and e-commerce has become a daily part of business and personal life. As Web use has grown, so have the threats to our security and privacy--from credit card fraud to routine invasions of privacy by marketers to web site defacements to attacks that shut down popular web sites. Web Security, Privacy & Commerce goes behind the headlines, examines the major security risks facing us today, and explains how we can minimize them. It describes risks for Windows and Unix, Microsoft Internet Exp

  19. Interfacing with the WEB

    CERN Document Server

    Dönszelmann, M

    1995-01-01

    Interfacing to the Web, that is, programming interfaces for the Web, is used to provide dynamic information for Web users. Using the Web as a transport system for information poses three constraints: namespace, statelessness and performance. To build interfaces on either the server or client side of the Web one has to meet these constraints. Several examples, currently in use in High Energy Physics experiments, are described. They range from an interface showing where buildings are located to an interface showing active values of the On-line System of the DELPHI experiment (CERN).

  20. Excavando la web

    OpenAIRE

    Ricardo, Baeza-Yates

    2004-01-01

    The web is the internet's most important phenomenon, as demonstrated by its exponential growth and diversity. Hence, due to the volume and wealth of its data, search engines have become among the web's main tools. They are useful when we know what we are looking for. However, certainly the web holds answers to questions never imagined. The process of finding relations or interesting patterns within a data set is called "data mining" and in the case of the web, "web mining". In this article...

  1. Development of web-based geocoding applications for the population-based Birth Defects Surveillance System in New York state.

    Science.gov (United States)

    Wang, Ying; Le, Linh H; Wang, Xiaohang; Tao, Zhen; Druschel, Charlotte D; Cross, Philip K; Hwang, Syni-An

    2010-01-01

    Geographic information systems (GIS) have been widely used in mapping health data and analyzing the geographic distribution of disease. Mapping and spatially analyzing data normally begins with geocoding, a process of assigning geographic coordinates to an address so that it can be displayed and analyzed on a map. The objectives of this project were to develop Web-based geocoding applications for the New York State birth defects surveillance system to geocode, both automatically and interactively, the birth defect cases of the Congenital Malformations Registry (CMR) and evaluate the geocoding results. Geocoding software, in conjunction with a Java-based development tool (J Server), was used to develop the Web-based applications on the New York State Department of Health's Health Commerce System. The Web-based geocoding applications have been developed and implemented for the New York State birth defects surveillance system. These menu-driven applications empower users to conduct geocoding activities using only a PC and a Web browser without the installation of any GIS software. These powerful tools provide automatic, real-time, street-level geocoding of the routinely collected birth defects records in the CMR. Up to 92% of the CMR records have been geocoded with addresses exactly matched to the reference addresses on house number, street name, and city or zip code.
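    The automatic address-matching step at the heart of geocoding can be illustrated with a toy exact-match geocoder: normalize an address string and look it up in a small reference table. All addresses and coordinates below are fabricated examples, not CMR records or the actual geocoding software's logic:

```python
def normalize(addr):
    """Crude normalization: uppercase, drop commas, expand a few suffixes."""
    words = addr.upper().replace(",", " ").split()
    subs = {"ST": "STREET", "ST.": "STREET", "AVE": "AVENUE", "AVE.": "AVENUE"}
    return " ".join(subs.get(w, w) for w in words)

# Tiny stand-in for a street-level reference file: address -> (lat, lon).
REFERENCE = {
    "12 MAIN STREET ALBANY": (42.6526, -73.7562),
    "80 STATE STREET ALBANY": (42.6498, -73.7527),
}

def geocode(addr):
    return REFERENCE.get(normalize(addr))  # None when there is no exact match

records = ["12 Main St, Albany", "80 State St., Albany", "1 Unknown Rd, Albany"]
coords = [geocode(r) for r in records]
matched = sum(c is not None for c in coords)
print("geocoded %d of %d records" % (matched, len(records)))  # 2 of 3
```

    Production geocoders add fuzzy matching, address-range interpolation along street segments, and fallbacks to zip-code centroids, which is how match rates like the 92% reported above are reached; unmatched records are routed to interactive review.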

  2. Historical Evolution of Spatial Abilities

    Directory of Open Access Journals (Sweden)

    A. Ardila

    1993-01-01

    Full Text Available Historical evolution and cross-cultural differences in spatial abilities are analyzed. Spatial abilities have been found to be significantly associated with the complexity of geographical conditions and survival demands. Although impaired spatial cognition is found in cases of, exclusively or predominantly, right hemisphere pathology, it is proposed that this asymmetry may depend on the degree of training in spatial abilities. It is further proposed that spatial cognition might have evolved in a parallel way with cultural evolution and environmental demands. Contemporary city humans might be using spatial abilities in some new, conceptual tasks that did not exist in prehistoric times: mathematics, reading, writing, mechanics, music, etc. Cross-cultural analysis of spatial abilities in different human groups, normalization of neuropsychological testing instruments, and clinical observations of spatial ability disturbances in people with different cultural backgrounds and various spatial requirements, are required to construct a neuropsychological theory of brain organization of spatial cognition.

  3. Spatial Premise Integration in Hindi

    Science.gov (United States)

    Mishra, Ramesh Kumar

    2007-01-01

    Spatial reasoning or locating objects in a spatial space has long been an important area of research in cognitive science because analyzing space categorically and finding objects is a fundamental act of mental perception and cognition. Premise integration in tasks of spatial reasoning has recently received considerable research attention. This is…

  5. Basis-neutral Hilbert-space analyzers

    CERN Document Server

    Martin, Lane; Kondakci, H Esat; Larson, Walker D; Shabahang, Soroush; Jahromi, Ali K; Malhotra, Tanya; Vamivakas, A Nick; Atia, George K; Abouraddy, Ayman F

    2016-01-01

    Interferometry is one of the central organizing principles of optics. Key to interferometry is the concept of optical delay, which facilitates spectral analysis in terms of time-harmonics. In contrast, when analyzing a beam in a Hilbert space spanned by spatial modes -- a critical task for spatial-mode multiplexing and quantum communication -- basis-specific principles are invoked that are altogether distinct from that of `delay.' Here, we extend the traditional concept of temporal delay to the spatial domain, thereby enabling the analysis of a beam in an arbitrary spatial-mode basis -- exemplified using Hermite-Gaussian and radial Laguerre-Gaussian modes. Such generalized delays correspond to optical implementations of fractional transforms; for example, the fractional Hankel transform is the generalized delay associated with the space of Laguerre-Gaussian modes, and an interferometer incorporating such a `delay' obtains modal weights in the associated Hilbert space. By implementing an inherently stable, rec...

  6. WebMGA: a customizable web server for fast metagenomic sequence analysis

    Directory of Open Access Journals (Sweden)

    Niu Beifang

    2011-09-01

    Full Text Available Abstract Background The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers face tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of the sequence data. Analyzing large datasets is extremely time-consuming; metagenomic annotation also involves a wide range of computational tools, which are difficult for common users to install and maintain. The tools provided by the few available web servers are also limited and have various constraints such as login requirements, long waiting times, inability to configure pipelines, etc. Results We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contamination, taxonomic analysis, functional annotation, etc. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. Conclusions WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  7. Increasing access to terrestrial ecology and remote sensing (MODIS) data through Web services and visualization tools

    Science.gov (United States)

    Santhana Vannan, S.; Cook, R. B.; Wei, Y.

    2012-12-01

In recent years, user access to data and information has increasingly been handled through tools, services, and applications, a development facilitated by standards-based services. These service-based methods of accessing data have boosted the use of data, often in increasingly complex ways. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) has taken the approach of service-based access to data and visualization for distribution and visualization of its terrestrial ecology data, including MODIS (Moderate Resolution Imaging Spectroradiometer) remote sensing data products. The MODIS data products are highly useful for field research. The spectral, spatial and temporal characteristics of MODIS products have made them an important data source for analyzing key science questions relating to Earth system processes at multiple spatial and temporal scales. However, the volume of MODIS data and the complexity of its format make it less usable in some cases. To solve this usability issue, the ORNL DAAC has developed a system that prepares and distributes subsets of selected MODIS land products at a scale and in a format useful for field researchers. Web and Web service tools provide MODIS subsets in comma-delimited text format and in GIS-compatible GeoTIFF format. Users can download and visualize MODIS subsets for a set of pre-defined locations, order MODIS subsets for any land location or automate the process of subset extraction using a SOAP-based Web service. The MODIS tools and services can be extended to support the large volume of data that would be produced by the various decadal survey missions. http://daac.ornl.gov/MODIS . The ORNL DAAC has also created a Web-based Spatial Data Access Tool (SDAT) that enables users to browse, visualize, and download a wide variety of geospatial data in various user-selected spatial/temporal extents, formats, and projections. SDAT is based on Open Geospatial Consortium (OGC) Web service standards that allows users to

  8. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way, and thereafter applies advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility...... and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net ....

  9. Magnetoresistive emulsion analyzer.

    Science.gov (United States)

    Lin, Gungun; Baraban, Larysa; Han, Luyang; Karnaushenko, Daniil; Makarov, Denys; Cuniberti, Gianaurelio; Schmidt, Oliver G

    2013-01-01

    We realize a magnetoresistive emulsion analyzer capable of detection, multiparametric analysis and sorting of ferrofluid-containing nanoliter-droplets. The operation of the device in a cytometric mode provides high throughput and quantitative information about the dimensions and magnetic content of the emulsion. Our method offers important complementarity to conventional optical approaches involving ferrofluids, and paves the way to the development of novel compact tools for diagnostics and nanomedicine including drug design and screening.

  10. IPv6 Protocol Analyzer

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

With the emergence of the next-generation Internet protocol (IPv6), it is expected to replace the current version of the Internet protocol (IPv4), whose address space will be exhausted in the near future. Besides providing adequate address space, the new 128-bit protocol includes other new features such as address auto-configuration, quality of service, simplified routing, security, mobility and multicasting. Current protocol analyzers are not able to handle IPv6 packets. This paper focuses on developing a protocol analyzer that decodes IPv6 packets. The IPv6 protocol analyzer is an application module which is able to decode an IPv6 packet and provide a detailed breakdown of its construction. It has to understand the detailed construction of IPv6 and provide a high-level abstraction of the bits and bytes of the IPv6 packet. Thus it increases network administrators' understanding of a network protocol and helps them solve protocol-related problems in an IPv6 network environment.
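A minimal sketch of the decoding step such an analyzer performs, assuming a Python implementation and the fixed 40-byte IPv6 header layout of RFC 8200 (the helper name and returned field names are illustrative, not the paper's module):

```python
import struct

def parse_ipv6_header(packet: bytes) -> dict:
    """Decode the fixed 40-byte IPv6 header (RFC 8200 layout)."""
    if len(packet) < 40:
        raise ValueError("an IPv6 header is 40 bytes")
    # First 8 bytes: version/traffic class/flow label word, payload length,
    # next header, hop limit (all network byte order).
    vtf, payload_len, next_header, hop_limit = struct.unpack("!IHBB", packet[:8])
    return {
        "version": vtf >> 28,               # top 4 bits, always 6
        "traffic_class": (vtf >> 20) & 0xFF,
        "flow_label": vtf & 0xFFFFF,        # low 20 bits
        "payload_length": payload_len,
        "next_header": next_header,         # e.g. 6 = TCP, 17 = UDP
        "hop_limit": hop_limit,
        "src": packet[8:24].hex(),          # 128-bit source address
        "dst": packet[24:40].hex(),         # 128-bit destination address
    }
```

A full analyzer would go on to dispatch on `next_header` to decode extension headers and the transport payload.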

  11. Estimating Maintenance Cost for Web Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2016-01-01

Full Text Available The current paper tackles the issue of determining a method for estimating maintenance costs for web applications. The current state of research in the field of web application maintenance is summarized and leading theories and results are highlighted. The cost of web maintenance is determined by the number of man-hours invested in maintenance tasks. Web maintenance tasks are categorized into content maintenance and technical maintenance. Research is centered on analyzing technical maintenance tasks. The research hypothesis is formulated on the assumption that the number of man-hours invested in maintenance tasks can be assessed based on the web application’s user interaction level, complexity and content update effort. Data regarding the costs of maintenance tasks is collected from 24 maintenance projects implemented by a web development company that tackles a wide area of web applications. The homogeneity and diversity of the collected data are discussed by presenting a sample of the data and depicting the overall size and comprehensive nature of the entire dataset. A set of metrics dedicated to estimating maintenance costs in web applications is defined based on conclusions formulated by analyzing the collected data and the theories and practices dominating the current state of research. The metrics are validated with regard to the initial research hypothesis. The research hypothesis is validated and conclusions are formulated on the topic of estimating the maintenance cost of web applications. The limits of the research process that formed the basis for the current paper are enunciated, and future research topics are proposed.
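The hypothesis above, that man-hours can be assessed from interaction level, complexity and content update effort, can be sketched as a simple weighted-sum estimator. The weights here are purely illustrative placeholders, not the coefficients derived in the paper:

```python
def estimate_maintenance_hours(interaction_level: float,
                               complexity: float,
                               content_update_effort: float,
                               weights: tuple = (2.0, 3.5, 1.5)) -> float:
    """Hedged sketch of the paper's hypothesis: man-hours as a weighted
    sum of the three factors it names. Weights are invented for
    illustration; a real model would fit them to the 24 collected projects."""
    w_interaction, w_complexity, w_content = weights
    return (w_interaction * interaction_level
            + w_complexity * complexity
            + w_content * content_update_effort)
```

For example, with all three factors normalized to 1, the placeholder weights give 7.0 man-hours.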

  12. Exploration behaviour and behavioural flexibility in orb-web spiders: A review

    Institute of Scientific and Technical Information of China (English)

    Thomas HESSELBERG

    2015-01-01

Orb-web spiders and their webs constitute an ideal model system in which to study behavioural flexibility and spatial cognition in invertebrates due to the easily quantifiable nature of the orb web. A large number of studies demonstrate how spiders are able to modify the geometry of their webs in response to a range of different conditions, including the ability to adapt their webs to spatial constraints. However, the mechanisms behind this impressive web-building flexibility in these cognitively limited animals remain poorly explored. One possible mechanism may be spatial learning during the spiders' exploration of their immediate surroundings. This review discusses the importance of exploration behaviour, the reliance on simple behavioural rules, and the use of already laid threads as guidelines for web-building in orb-web spiders. The focus is on the spiders' ability to detect and adapt their webs to space limitations and other spatial disruptions. I will also review the few published studies on how spatial information is gathered during the exploration phase and discuss the possibility of the use of ‘cognitive map’-like processes in spiders. Finally, the review provides suggestions for designing experimental studies to shed light on whether spiders gather metric information during the site exploration (cognitive map hypothesis) or rely on more simple binary information in combination with previously laid threads to build their webs (stigmergy hypothesis) [Current Zoology 61 (2): 313-327, 2015].

  13. Spatial Indexing for Data Searching in Mobile Sensing Environments

    Directory of Open Access Journals (Sweden)

    Yuchao Zhou

    2017-06-01

    Full Text Available Data searching and retrieval is one of the fundamental functionalities in many Web of Things applications, which need to collect, process and analyze huge amounts of sensor stream data. The problem in fact has been well studied for data generated by sensors that are installed at fixed locations; however, challenges emerge along with the popularity of opportunistic sensing applications in which mobile sensors keep reporting observation and measurement data at variable intervals and changing geographical locations. To address these challenges, we develop the Geohash-Grid Tree, a spatial indexing technique specially designed for searching data integrated from heterogeneous sources in a mobile sensing environment. Results of the experiments on a real-world dataset collected from the SmartSantander smart city testbed show that the index structure allows efficient search based on spatial distance, range and time windows in a large time series database.
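The Geohash encoding that underlies an index like the Geohash-Grid Tree can be sketched in a few lines. This is the standard bit-interleaving algorithm (alternating longitude/latitude bisections, five bits per base-32 character), not the paper's tree structure itself:

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # Geohash alphabet (no a, i, l, o)

def geohash_encode(lat: float, lon: float, precision: int = 6) -> str:
    """Standard Geohash: interleave longitude/latitude bisection bits,
    emitting one base-32 character per 5 bits. Nearby points share prefixes,
    which is what makes prefix-based spatial range search possible."""
    lat_range, lon_range = [-90.0, 90.0], [-180.0, 180.0]
    code, ch, bit, even = [], 0, 0, True
    while len(code) < precision:
        rng, val = (lon_range, lon) if even else (lat_range, lat)
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            ch = (ch << 1) | 1   # point in upper half: emit 1, shrink range up
            rng[0] = mid
        else:
            ch <<= 1             # point in lower half: emit 0, shrink range down
            rng[1] = mid
        even = not even
        bit += 1
        if bit == 5:             # 5 bits -> one character
            code.append(BASE32[ch])
            ch, bit = 0, 0
    return "".join(code)
```

A mobile-sensing index can then group observations by Geohash prefix, so a spatial range query only touches cells whose prefixes intersect the query region.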

  14. Web applications using the Google Web Toolkit

    OpenAIRE

    von Wenckstern, Michael

    2013-01-01

    This diploma thesis describes how to create or convert traditional Java programs to desktop-like rich internet applications with the Google Web Toolkit. The Google Web Toolkit is an open source development environment, which translates Java code to browser and device independent HTML and JavaScript. Most of the GWT framework parts, including the Java to JavaScript compiler as well as important security issues of websites will be introduced. The famous Agricola board game will be ...

  16. HIDDEN WEB EXTRACTOR DYNAMIC WAY TO UNCOVER THE DEEP WEB

    OpenAIRE

    DR. ANURADHA; BABITA AHUJA

    2012-01-01

    In this era of digital tsunami of information on the web, everyone is completely dependent on the WWW for information retrieval. This has posed a challenging problem in extracting relevant data. Traditional web crawlers focus only on the surface web while the deep web keeps expanding behind the scene. The web databases are hidden behind the query interfaces. In this paper, we propose a Hidden Web Extractor (HWE) that can automatically discover and download data from the Hidden Web databases. ...

  17. Online data analysis using Web GDL

    Science.gov (United States)

    Jaffey, A.; Cheung, M.; Kobashi, A.

    2008-12-01

    The ever improving capability of modern astronomical instruments to capture data at high spatial resolution and cadence is opening up unprecedented opportunities for scientific discovery. When data sets become so large that they cannot be easily transferred over the internet, the researcher must find alternative ways to perform data analysis. One strategy is to bring the data analysis code to where the data resides. We present Web GDL, an implementation of GDL (GNU Data Language, open source incremental compiler compatible with IDL) that allows users to perform interactive data analysis within a web browser.

  18. Web-based 3-D GIS and its applications for pipeline planning and construction

    Energy Technology Data Exchange (ETDEWEB)

    Tao, V.; Wang, T.Q.K. [Calgary Univ., Calgary, AB (Canada). Dept. of Geomatics Engineering

    2000-07-01

The many benefits that web-based 3D geographical information system (GIS) technology can bring to pipeline planning and construction were discussed. GIS can effectively integrate and manage a variety of data sources including geological, geographical, environmental, engineering and socioeconomic data. The third dimension of geospatial data is also very significant for pipeline planning, construction and maintenance, which explains the increased demand for the development of a 3D GIS for pipeline applications. The Internet has made it possible to integrate GIS, visualization and distributed object computing technologies for a web-based 3D GIS. While this offers many advantages, it also poses several technical challenges. The technology allows users to access, manipulate and analyze geospatial objects remotely. This has positive implications for pipeline operating companies in their collaborative decision making for large pipeline projects that cover large areas with multiple landowners and different government sections. The technology will enhance their capability and productivity by making it possible to run their operations more efficiently. The Department of Geomatics Engineering at the University of Calgary has developed a web-based 3D GIS, the GeoEye 3D prototype, using a pure Java solution. The system is based on an advanced client/server model for visualization, manipulation and analysis of spatial data such as 3D terrain, wells, linear objects such as roads or pipelines and solid objects such as buildings. The system can be linked to other databases for spatial inquiry. 7 refs., 3 figs.

  19. Integrating natural language processing and web GIS for interactive knowledge domain visualization

    Science.gov (United States)

    Du, Fangming

    Recent years have seen a powerful shift towards data-rich environments throughout society. This has extended to a change in how the artifacts and products of scientific knowledge production can be analyzed and understood. Bottom-up approaches are on the rise that combine access to huge amounts of academic publications with advanced computer graphics and data processing tools, including natural language processing. Knowledge domain visualization is one of those multi-technology approaches, with its aim of turning domain-specific human knowledge into highly visual representations in order to better understand the structure and evolution of domain knowledge. For example, network visualizations built from co-author relations contained in academic publications can provide insight on how scholars collaborate with each other in one or multiple domains, and visualizations built from the text content of articles can help us understand the topical structure of knowledge domains. These knowledge domain visualizations need to support interactive viewing and exploration by users. Such spatialization efforts are increasingly looking to geography and GIS as a source of metaphors and practical technology solutions, even when non-georeferenced information is managed, analyzed, and visualized. When it comes to deploying spatialized representations online, web mapping and web GIS can provide practical technology solutions for interactive viewing of knowledge domain visualizations, from panning and zooming to the overlay of additional information. This thesis presents a novel combination of advanced natural language processing - in the form of topic modeling - with dimensionality reduction through self-organizing maps and the deployment of web mapping/GIS technology towards intuitive, GIS-like, exploration of a knowledge domain visualization. A complete workflow is proposed and implemented that processes any corpus of input text documents into a map form and leverages a web

  20. An intelligent method for geographic Web search

    Science.gov (United States)

    Mei, Kun; Yuan, Ying

    2008-10-01

While the information electronically available on the World Wide Web is growing explosively, the difficulty of finding relevant information is also increasing for search engine users. In this paper we discuss how to constrain web queries geographically. A number of search queries are associated with geographical locations, either explicitly or implicitly. Accurately and effectively detecting the locations that search queries are truly about has a huge potential impact on increasing search relevance, bringing better targeted search results, and improving search user satisfaction. Our approach focuses both on the way geographic information is extracted from the web and, as far as we can tell, on the way it is integrated into query processing. This paper gives an overview of a spatially aware search engine for semantic querying of web documents. It also illustrates algorithms for extracting locations from web documents and query requests, using location ontologies to encode and reason about the formal semantics of geographic web search. Based on a real-world scenario of tourism guide search, the application of our approach shows that geographic information retrieval can be efficiently supported.

  1. Spatiotemporal Land Use Change Analysis Using Open-source GIS and Web Based Application

    Directory of Open Access Journals (Sweden)

    Wan Yusryzal Wan Ibrahim

    2015-05-01

Full Text Available Spatiotemporal changes are very important information for revealing the characteristics of the urbanization process. Sharing this information is beneficial for public awareness, which in turn improves public participation in adaptive management for the spatial planning process. Open-source software and web applications are freely available tools that can be the best medium for any individual or agency to share this important information. The objective of the paper is to discuss the spatiotemporal land use change in Iskandar Malaysia by using open-source GIS (Quantum GIS) and publishing the results through a web application (Mash-up). Land use maps from 1994 to 2011 were developed and analyzed to show the landscape change of the region. Subsequently, a web application was set up to distribute the findings of the study. The results show significant changes of land use in the study area, especially the decline of agricultural and natural land, which was converted to urban land uses. Residential and industrial areas largely replaced the agriculture and natural areas, particularly along the coastal zone of the region. This information is published through an interactive GIS web in order to share it with the public and stakeholders. Web applications have some limitations, but these do not outweigh the advantages of using them. The integration of open-source GIS and web applications is very helpful in sharing planning information, particularly in the study area, which experiences rapid land use and land cover change. Basic information from this study is vital for further work such as projecting future land use change and other related studies in the area.

  2. Web Video Mining: Metadata Predictive Analysis using Classification Techniques

    Directory of Open Access Journals (Sweden)

    Siddu P. Algur

    2016-02-01

Full Text Available Nowadays, data engineering is becoming an emerging trend for discovering knowledge from web audiovisual data such as YouTube videos, Yahoo Screen and Facebook videos. Different categories of web video are being shared on such social websites and are being used by billions of users all over the world. Uploaded web videos carry different kinds of metadata as attribute information of the video data. The metadata attributes conceptually define the contents and features/characteristics of the web videos. Hence, accomplishing web video mining by extracting features of web videos in terms of metadata is a challenging task. In this work, effective attempts are made to classify and predict the metadata features of web videos, such as the length of the web videos, the number of comments, ratings information and view counts, using data mining algorithms such as the J48 decision tree and naive Bayesian algorithms as part of web video mining. The results of the J48 decision tree and naive Bayesian classification models are analyzed and compared as a step in the process of knowledge discovery from web videos.
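The naive Bayesian step on categorical metadata can be sketched with a toy classifier. The features, labels and data below are invented for illustration; the paper applies Weka-style J48 and naive Bayes to real video metadata:

```python
from collections import Counter, defaultdict
import math

def train_nb(rows, labels):
    """Count class priors and per-feature conditional value counts."""
    prior = Counter(labels)
    cond = defaultdict(Counter)  # (feature index, label) -> Counter of values
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            cond[(i, y)][v] += 1
    return prior, cond, len(labels)

def predict_nb(model, row):
    """Pick the label maximizing log P(y) + sum_i log P(x_i | y),
    with add-one (Laplace) smoothing for unseen feature values."""
    prior, cond, n = model
    best, best_lp = None, -math.inf
    for y, c in prior.items():
        lp = math.log(c / n)
        for i, v in enumerate(row):
            counts = cond[(i, y)]
            lp += math.log((counts[v] + 1) / (sum(counts.values()) + len(counts) + 1))
        if lp > best_lp:
            best, best_lp = y, lp
    return best
```

Trained on a handful of (length, comment-count) tuples, such a model predicts the view-count class of a new video from its metadata alone.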

  3. The Applied Research of Web Usage Mining%Web使用挖掘的应用研究

    Institute of Scientific and Technical Information of China (English)

    刘丽珍; 宋瀚涛; 陆玉昌

    2003-01-01

Some effective and efficient knowledge patterns can be gained through searching, integrating, mining and analyzing the Web. These useful knowledge patterns can help us build efficient Web sites so that the WWW can serve people well. In this paper we point out how the Web Usage Mining process is influenced by Web site structure and content, and introduce the application of Web Usage Mining in E-commerce. Finally, an example of Web Usage Mining is given.

  4. Fuzzification of Web Objects: A Semantic Web Mining Approach

    Directory of Open Access Journals (Sweden)

    Tasawar Hussain

    2012-03-01

Full Text Available Web mining is becoming essential to support web administrators and web users in multiple ways, such as information retrieval, website performance management, web personalization, web marketing and website design. Due to the uncontrolled exponential growth in web data, knowledge base retrieval has become a very challenging task. One viable solution to the problem is the merging of conventional web mining with semantic web technologies. This merging process will be more beneficial to web users by reducing the search space and by providing information that is more relevant. Key web objects play a significant role in this process, and their extraction from a website is a challenging task. In this paper, we propose a framework which extracts the key web objects from a web log file and applies semantic web techniques to mine actionable intelligence. The proposed framework can be applied to the non-semantic web for the extraction of key web objects. We also define an objective function, named the key web object (KWO) function, to calculate key web objects from the user's perspective. The KWO function helps to fuzzify the extracted key web objects into three categories: Most Interested, Interested, and Least Interested. Fuzzification of web objects helps us to accommodate the uncertainty about whether a web object is attractive to users. We also validate the proposed scheme with the help of a case study.
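Since the abstract does not give the exact form of the KWO function, the three-way fuzzification it describes can be sketched with hypothetical thresholds on a normalized score:

```python
def fuzzify_web_object(kwo_score: float,
                       t_interested: float = 0.4,
                       t_most: float = 0.7) -> str:
    """Map a normalized KWO score in [0, 1] onto the three categories
    the paper names. The thresholds are invented placeholders; the paper
    derives category membership from its own KWO function over log data."""
    if kwo_score >= t_most:
        return "Most Interested"
    if kwo_score >= t_interested:
        return "Interested"
    return "Least Interested"
```

With these placeholder thresholds, an object scoring 0.8 is labeled Most Interested, 0.5 Interested, and 0.1 Least Interested.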

  5. Analysis of Compute Vs Retrieve Intensive Web Applications and Its Impact On The Performance Of A Web Server

    Directory of Open Access Journals (Sweden)

    Syed Mutahar Aaqib

    2012-01-01

Full Text Available The World Wide Web (WWW) has undergone remarkable change over the past few years, placing a substantially heavy load on Web servers. Today’s web servers host web applications that demand high computational resources. Some applications also require heavy database retrieval processing, making server load even more critical. In this paper, the performance of the Apache web server running compute-intensive and retrieve-intensive web workloads is analyzed. Workload files implemented in three dynamic web programming technologies, Perl, PHP and Java Servlets, are used with MySQL acting as the data source. Measurements are performed with the intent to analyze the impact of application workloads on the overall performance of the web server and to determine which web technology yields better performance on Windows and Linux platforms. Experimental results depict that for both compute-intensive and retrieve-intensive applications, PHP exhibits better performance than Perl and Java Servlets. A multiple linear regression model was also developed to predict web server performance and to validate the experimental results. This regression model showed that for compute-intensive and retrieve-intensive web applications, PHP exhibits better performance than Perl and Java Servlets.
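The regression step can be illustrated with the one-predictor case of ordinary least squares (the data below are invented; the paper fits a multiple linear model over its measured workloads):

```python
def fit_simple_ols(x, y):
    """Least-squares fit y ≈ a + b*x, the single-predictor special case of
    the multiple linear regression the paper uses (illustrative data only)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Slope = covariance(x, y) / variance(x); intercept from the means.
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b
```

With, say, request rate as the predictor and response time as the response, the fitted line predicts server performance at unmeasured load levels; the multivariate case replaces the single slope with one coefficient per workload factor.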

  6. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

If the world’s capital markets could use a harmonized accounting framework, comparisons between two or more sets of accounting standards would not be necessary. However, there is much to do before this becomes reality. This article aims to present a general overview of China’s Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  7. Mineral/Water Analyzer

    Science.gov (United States)

    1983-01-01

    An x-ray fluorescence spectrometer developed for the Viking Landers by Martin Marietta was modified for geological exploration, water quality monitoring, and aircraft engine maintenance. The aerospace system was highly miniaturized and used very little power. It irradiates the sample causing it to emit x-rays at various energies, then measures the energy levels for sample composition analysis. It was used in oceanographic applications and modified to identify element concentrations in ore samples, on site. The instrument can also analyze the chemical content of water, and detect the sudden development of excessive engine wear.

  8. University rankings: The web ranking

    Directory of Open Access Journals (Sweden)

    Isidro F. Aguillo

    2012-03-01

Full Text Available The publication in 2003 of the Ranking of Universities by Jiao Tong University of Shanghai has revolutionized not only academic studies on Higher Education, but has also had an important impact on national policies and the individual strategies of the sector. The work gathers the main characteristics of this and other global university rankings, paying special attention to their potential benefits and limitations. The Web Ranking is analyzed in depth, presenting the model on which its compound indicator is based and analyzing its different variables and principal results. DOI: 10.18870/hlrc.v2i1.56 The PDF document contains both the original in Spanish and an English translation.

  9. Spatial Databases

    Science.gov (United States)

    2007-09-19

for a city. Spatial attributes are used to define the spatial location and extent of spatial objects [35]. The spatial attributes of a spatial object ... regarding both geometry and thematic differentiation. It can be used to model 2.5D data (e.g., a digital terrain model) as well as 3D data (walkable ...). Within a city, if the coverage area of a wireless antenna is considered to be the visible area, then the union of coverage areas of all the antennas in

  10. Analyzing Aeroelasticity in Turbomachines

    Science.gov (United States)

    Reddy, T. S. R.; Srivastava, R.

    2003-01-01

ASTROP2-LE is a computer program that predicts flutter and forced responses of blades, vanes, and other components of such turbomachines as fans, compressors, and turbines. ASTROP2-LE is based on the ASTROP2 program, developed previously for analysis of stability of turbomachinery components. In developing ASTROP2-LE, ASTROP2 was modified to include a capability for modeling forced responses. The program was also modified to add a capability for analysis of aeroelasticity with mistuning and unsteady aerodynamic solutions from another program, LINFLX2D, that solves the linearized Euler equations of unsteady two-dimensional flow. Using LINFLX2D to calculate unsteady aerodynamic loads, it is possible to analyze effects of transonic flow on flutter and forced response. ASTROP2-LE can be used to analyze subsonic, transonic, and supersonic aerodynamics and structural mistuning for rotors with blades of differing structural properties. It calculates the aerodynamic damping of a blade system operating in airflow so that stability can be assessed. The code also predicts the magnitudes and frequencies of the unsteady aerodynamic forces on the airfoils of a blade row from incoming wakes. This information can be used in high-cycle fatigue analysis to predict the fatigue lives of the blades.

  11. Field Deployable DNA analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step in making an autonomous deployable instrument. We perform sample clean-up and concentration in a flow-through packed bed. For small initial samples, whole genome amplification is performed in the packed bed, resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we set out to determine whether the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof-of-principle assay.

  12. Analyzing the platelet proteome.

    Science.gov (United States)

    García, Angel; Zitzmann, Nicole; Watson, Steve P

    2004-08-01

During the last 10 years, mass spectrometry (MS) has become a key tool for protein analysis and has underpinned the emerging field of proteomics. Using high-throughput tandem MS/MS following protein separation, it is potentially possible to analyze hundreds to thousands of proteins in a sample at a time. This technology can be used to analyze the protein content (i.e., the proteome) of any cell or tissue and complements the powerful field of genomics. The technology is particularly suitable for platelets because of the absence of a nucleus. Cellular proteins can be separated by either gel-based methods such as two-dimensional gel electrophoresis or one-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis followed by liquid chromatography (LC)-MS/MS, or by multidimensional LC-MS/MS. Prefractionation techniques, such as subcellular fractionations or immunoprecipitations, can be used to improve the analysis. Each method has particular advantages and disadvantages. Proteomics can be used to compare the proteome of basal and diseased platelets, helping to reveal information on the molecular basis of the disease.

  13. Spatial recurrence plots.

    Science.gov (United States)

    Vasconcelos, D B; Lopes, S R; Viana, R L; Kurths, J

    2006-05-01

We propose an extension of the recurrence plot concept to perform quantitative analyses of roughness and disorder of spatial patterns at a fixed time. We introduce spatial recurrence plots (SRPs) as a graphical representation of the pointwise correlation matrix, in terms of a two-dimensional spatial return plot. This technique is applied to the study of complex patterns generated by coupled map lattices, which are characterized by measures of complexity based on SRPs. We show that the complexity measures we propose for SRPs provide a systematic way of investigating the distribution of spatially coherent structures, such as synchronization domains, in lattice profiles. This approach has potential for many more applications, e.g., in surface roughness analyses.
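At its core, the SRP construction reduces to thresholding a pointwise distance matrix over a spatial profile. A minimal sketch for a one-dimensional profile (the threshold, the data and the recurrence-rate measure shown are illustrative, not the paper's specific complexity measures):

```python
def spatial_recurrence_matrix(profile, eps):
    """R[i][j] = 1 when two sites of a spatial profile lie within eps of
    each other; this binary matrix is what an SRP visualizes."""
    n = len(profile)
    return [[1 if abs(profile[i] - profile[j]) < eps else 0
             for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent pairs: the simplest quantifier one can read
    off a recurrence matrix."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)
```

Blocks of 1s along the diagonal of `R` correspond to spatially coherent regions, such as the synchronization domains the abstract mentions.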

  14. The Web economy: goods, users, models and policies

    CERN Document Server

    Vafopoulos, Michalis

    2011-01-01

    The Web emerged as an antidote to the rapidly increasing quantity of accumulated knowledge and became successful because it facilitates massive participation and communication with minimum costs. Today, its enormous impact, scale and dynamism in time and space make it very difficult (and sometimes impossible) to measure and anticipate its effects on human society. In addition, we demand that the Web be fast, secure, reliable, all-inclusive and trustworthy in any transaction. The scope of the present article is to review a part of the Web economy literature that will help us to identify its major participants and their functions. The goal is to understand how the Web economy differs from the traditional setting and what implications these differences have. Secondarily, we attempt to establish a minimal common understanding about the incentives and properties of the Web economy. In this direction the concept of Web Goods and a new classification of Web Users are introduced and analyzed. This article is not,...

  15. A Survey of Web Testing (Web测试综述)

    Institute of Scientific and Technical Information of China (English)

    许蕾; 徐宝文; 陈振强

    2003-01-01

    With the extensive application of Web technology, the demands on the quality and reliability of Web applications become more and more critical, so it is crucial to test Web applications automatically, completely and thoroughly. We therefore survey Web testing methods and technologies. First, we discuss the necessity of Web testing and analyze where faults may occur based on the architecture of the Web; we then discuss various Web testing methods in detail. Next, based on object-oriented ideas, we build a model for Web testing and discuss how usage statistics can be used to target testing at particular pages. Finally, we introduce some tools for white-box testing.

  16. Basin Assessment Spatial Planning Platform

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-26

    The tool is intended to facilitate hydropower development and water resource planning by improving synthesis and interpretation of disparate spatial datasets that are considered in development actions (e.g., hydrological characteristics, environmentally and culturally sensitive areas, existing or proposed water power resources, climate-informed forecasts). The tool enables this capability by providing a unique framework for assimilating, relating, summarizing, and visualizing disparate spatial data through the use of spatial aggregation techniques, relational geodatabase platforms, and an interactive web-based Geographic Information System (GIS). Data are aggregated and related based on shared intersections with a common spatial unit; in this case, industry-standard hydrologic drainage areas for the U.S. (National Hydrography Dataset) are used as the spatial unit to associate planning data. This process is performed using all available scalar delineations of drainage areas (i.e., region, sub-region, basin, sub-basin, watershed, sub-watershed, catchment) to create spatially hierarchical relationships among planning data and drainages. These entity-relationships are stored in a relational geodatabase that provides back-end structure to the web GIS and its widgets. The full technology stack was built using all open-source software in modern programming languages. Interactive widgets that function within the viewport are also compatible with all modern browsers.

  17. Engineering Web Applications

    DEFF Research Database (Denmark)

    Casteleyn, Sven; Daniel, Florian; Dolog, Peter

    Nowadays, Web applications are almost omnipresent. The Web has become a platform not only for information delivery, but also for eCommerce systems, social networks, mobile services, and distributed learning environments. Engineering Web applications involves many intrinsic challenges due to their distributed nature, content orientation, and the requirement to make them available to a wide spectrum of users who are unknown in advance. The authors discuss these challenges in the context of well-established engineering processes, covering the whole product lifecycle from requirements engineering through...

  18. Metadata and the Web

    Directory of Open Access Journals (Sweden)

    Mehdi Safari

    2004-12-01

    The rapid increase in the number and variety of resources on the World Wide Web has made the problem of resource description and discovery central to discussions about the efficiency and evolution of this medium. The inappropriateness of traditional schemas of resource description for web resources has recently encouraged significant activity on defining web-compatible schemas, named "metadata". While conceptually old for library and information professionals, metadata has taken a more significant and paramount role than ever before and is considered the golden key for the next evolution of the web, in the form of the semantic web. This article is intended as a brief introduction to metadata and tries to present an overview of its role on the web.

  19. The RCSB Protein Data Bank: redesigned web site and web services.

    Science.gov (United States)

    Rose, Peter W; Beran, Bojan; Bi, Chunxiao; Bluhm, Wolfgang F; Dimitropoulos, Dimitris; Goodsell, David S; Prlic, Andreas; Quesada, Martha; Quinn, Gregory B; Westbrook, John D; Young, Jasmine; Yukich, Benjamin; Zardecki, Christine; Berman, Helen M; Bourne, Philip E

    2011-01-01

    The RCSB Protein Data Bank (RCSB PDB) web site (http://www.pdb.org) has been redesigned to increase usability and to cater to a larger and more diverse user base. This article describes key enhancements and new features that fall into the following categories: (i) query and analysis tools for chemical structure searching, query refinement, tabulation and export of query results; (ii) web site customization and new structure alerts; (iii) pair-wise and representative protein structure alignments; (iv) visualization of large assemblies; (v) integration of structural data with the open access literature and binding affinity data; and (vi) web services and web widgets to facilitate integration of PDB data and tools with other resources. These improvements enable a range of new possibilities to analyze and understand structure data. The next generation of the RCSB PDB web site, as described here, provides a rich resource for research and education.
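
    As an illustration of the kind of programmatic access these web services enable, the helpers below build entry URLs. The endpoints shown are RCSB's public file server and Data API as they exist today, which postdate this article; treat them as assumptions and check the live documentation before relying on them:

```python
def pdb_file_url(pdb_id):
    """Download URL for an entry's coordinate file on RCSB's public
    file server (assumed endpoint; verify against current docs)."""
    return f"https://files.rcsb.org/download/{pdb_id.upper()}.pdb"

def rest_entry_url(pdb_id):
    """REST URL for an entry's metadata on the RCSB Data API
    (assumed endpoint; verify against current docs)."""
    return f"https://data.rcsb.org/rest/v1/core/entry/{pdb_id.upper()}"
```

    For example, `pdb_file_url("4hhb")` yields the download URL for hemoglobin entry 4HHB; any HTTP client can then fetch and parse the file.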

  20. Stability of Spatial Equilibrium

    OpenAIRE

    Tabuchi, Takatoshi; Dao-Zhi, Zeng

    2000-01-01

    This paper focuses on externalities between economic agents. We consider the spatial distribution of economic activities in a multiregional dynamical system, where regions may be interpreted as clubs, social subgroups, species, or strategies. Our dynamics include gravity models and replicator dynamics as special cases. Assuming that other variables, such as prices, are solved as functions of the population distribution, we analyze both interior and corner equilibria of spatial distribution in ...

  1. Using Open Web APIs in Teaching Web Mining

    Science.gov (United States)

    Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju

    2009-01-01

    With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…

  3. Analyzing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2014-01-01

    ...financial statement. Plumlee (2003) finds for instance that such information imposes significant costs on even expert users such as analysts and fund managers and reduces their use of it. Analysts’ ability to incorporate complex information in their analyses is a decreasing function of its complexity, because the costs of processing and analyzing it exceed the benefits, indicating bounded rationality. Hutton (2002) concludes that the analyst community’s inability to raise important questions on quality of management and the viability of its business model inevitably led to the Enron debacle. There seems to be evidence of the fact that all types of corporate stakeholders, from management to employees, owners, the media and politicians, have grave difficulties in interpreting new forms of reporting. One hypothesis could be that if managements’ own understanding of value creation is disclosed to the other...

  4. Analyzing architecture articles

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In the present study, we describe the qualities, functions, and characteristics of architecture to help people comprehensively understand what architecture is. We also reveal the problems and conflicts found in population, land, water resources, pollution, energy, and the organizational systems of construction. China’s economy is transforming, and we should focus on cities, the architectural environment, energy conservation, emission reduction, and low-carbon output to achieve successful green development. We should analyze development both macroscopically and microscopically, from the natural environment to the artificial environment, and from the relationship between human beings and nature to the combination of social ecology in cities and farmlands. We must learn to develop and control them harmoniously and scientifically to provide a foundation for the methods used in architecture research.

  5. Analyzing geographic clustered response

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1991-08-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described, which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and improve the algorithm. 21 refs., 15 figs., 2 tabs.
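
    As a toy illustration of property (c), constant magnification over each polygon: equalizing density means each polygon's target area is proportional to its population share, so its magnification factor is just its density relative to the mean. This is a back-of-the-envelope consequence of the DEMP definition, not the published algorithm:

```python
def demp_magnification(areas, populations):
    """Per-polygon magnification factors for a density-equalizing map.
    Each polygon's area is rescaled so population density becomes
    uniform; a factor > 1 means the polygon grows on the transformed
    map. Toy sketch only, not the paper's constrained algorithm."""
    mean_density = sum(populations) / sum(areas)
    return [(p / a) / mean_density for a, p in zip(areas, populations)]
```

    On such a map, case locations can then be analyzed as if drawn from a spatially uniform population, which is the point of the DEMP approach.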

  6. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Background: Association mapping using abundant single nucleotide polymorphisms (SNPs) is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results: We develop the software PDA for the analysis of pooled-DNA data. PDA is originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA supports an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion: PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
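
    A common correction scheme for pooled-DNA allele frequencies can be sketched as follows. The formulas below are the widely used heterozygote-based correction for preferential amplification, assumed for illustration; PDA's exact estimators may differ:

```python
def estimate_cpa(het_height_a, het_height_b):
    """Coefficient of preferential amplification k, estimated from
    heterozygous individuals: their true allele ratio is 1:1, so any
    departure of the signal ratio from 1 is amplification bias."""
    return het_height_a / het_height_b

def pooled_allele_frequency(pool_height_a, pool_height_b, k):
    """Frequency of allele A in a DNA pool, correcting the raw signal
    ratio for preferential amplification of allele A by factor k."""
    return pool_height_a / (pool_height_a + k * pool_height_b)
```

    With k = 1 (no bias) the estimate reduces to the raw signal fraction; a k of 2 halves the weight given to allele A's signal.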

  7. Design & implementation of distributed spatial computing node based on WPS

    Science.gov (United States)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-03-01

    Current research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in grid environments, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in a grid environment, this paper systematically studies the key technologies needed to construct a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and the relevant verification work in this environment is completed.
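
    A minimal sketch of how a client might address such a node through the standard WPS key-value-pair interface. The base URL and process identifier are placeholders, not taken from the paper; only the generic OGC request parameters are assumed:

```python
from urllib.parse import urlencode

def wps_request_url(base, request, identifier=None, version="1.0.0"):
    """Build an OGC WPS key-value-pair request URL (GetCapabilities,
    DescribeProcess, or Execute). `base` is a placeholder for a real
    Spatial Computing Node endpoint."""
    params = {"service": "WPS", "version": version, "request": request}
    if identifier:
        params["identifier"] = identifier
    return base + "?" + urlencode(params)
```

    A client would first call GetCapabilities to discover the node's spatial processes, then DescribeProcess and Execute against a chosen identifier.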

  8. Web Service Development (Web Service开发)

    Institute of Scientific and Technical Information of China (English)

    张彬桥; 吴成明

    2007-01-01

    Taking a real project as an example, this article describes the complete development process of a Web Service under the Axis framework in J2EE, including how to write, install and deploy a Web Service under Axis and how to manipulate XML using JDOM, and gives reference code for invoking the Web Service from a JSP page acting as the client.

  9. Web Security Testing Cookbook

    CERN Document Server

    Hope, Paco

    2008-01-01

    Among the tests you perform on web applications, security testing is perhaps the most important, yet it's often the most neglected. The recipes in the Web Security Testing Cookbook demonstrate how developers and testers can check for the most common web security issues, while conducting unit tests, regression tests, or exploratory tests. Unlike ad hoc security assessments, these recipes are repeatable, concise, and systematic-perfect for integrating into your regular test suite.

  10. Fatigue Reliability Assessment of Correlated Welded Web-frame Joints

    Institute of Scientific and Technical Information of China (English)

    W. Huang; Y. Garbatov; C. Guedes Soares

    2014-01-01

    The objective of this work is to analyze the fatigue reliability of complex welded structures composed of multiple web-frame joints, accounting for correlation effects. A three-dimensional finite element model using 20-node solid elements is generated. A linear elastic finite element analysis was performed, hotspot stresses in a web-frame joint were analyzed, and fatigue damage was quantified employing the S-N approach. The statistical descriptors of the fatigue life of a non-correlated web-frame joint containing several critical hotspots were estimated. The fatigue reliability of a web-frame joint was modeled as a series system of correlated components using the Ditlevsen bounds. The fatigue reliability of the entire welded structure with multiple web-frame joints, modeled as a parallel system of non-correlated web-frame joints, was also calculated.
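
    The Ditlevsen bounds for a series system of correlated components can be computed directly from the component and pairwise joint failure probabilities. This is the standard formulation, assumed to match the one used in the paper:

```python
def ditlevsen_bounds(p, p2):
    """Ditlevsen (narrow) bounds on the failure probability of a series
    system. p[i]: component failure probabilities, ideally ordered by
    decreasing magnitude; p2[i][j]: joint failure probability of
    components i and j, as a lower-triangular list of lists (j < i)."""
    lower = p[0]
    upper = p[0]
    for i in range(1, len(p)):
        lower += max(0.0, p[i] - sum(p2[i][:i]))  # subtract all overlaps
        upper += p[i] - max(p2[i][:i])            # subtract largest overlap
    return lower, upper
```

    For two components the bounds coincide; correlation among hotspots enters only through the joint probabilities p2.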

  11. A Sorting Method of Meta-search Based on User Web Page Interactive Model

    Institute of Scientific and Technical Information of China (English)

    Zongli Jiang; Tengyu Zhang

    2012-01-01

    Nowadays, there is a problem in most meta-search engines that many retrieved web pages have nothing to do with users' expectations. We introduce a new user/web-page interactive model under the framework of meta-search, which analyzes users' actions to infer their interests, stores them, and updates this information with users' feedback. Meanwhile, this model analyzes user records stored on the web and attaches labels to web pages using statistics of user interest. We calculate the similarity between user and web page with the information from the model and add this similarity to the scores of web pages. The experimental results reveal that this method can improve the relevance of the information retrieval.
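
    The scoring idea, adding user/page interest similarity to the engine score, can be sketched as follows. The cosine measure and the 0.5 mixing weight are illustrative assumptions, not the paper's exact formulation:

```python
import math

def cosine(u, v):
    """Cosine similarity between two interest vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def rerank(pages, user_interest, weight=0.5):
    """Re-sort (engine_score, label_vector) pairs by engine score plus
    weighted user/page interest similarity."""
    return sorted(
        pages,
        key=lambda sv: sv[0] + weight * cosine(user_interest, sv[1]),
        reverse=True,
    )
```

    A page whose labels match the user's interest vector can thus outrank a page with a higher raw engine score.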

  12. Creating Web Pages Simplified

    CERN Document Server

    Wooldridge, Mike

    2011-01-01

    The easiest way to learn how to create a Web page for your family or organization Do you want to share photos and family lore with relatives far away? Have you been put in charge of communication for your neighborhood group or nonprofit organization? A Web page is the way to get the word out, and Creating Web Pages Simplified offers an easy, visual way to learn how to build one. Full-color illustrations and concise instructions take you through all phases of Web publishing, from laying out and formatting text to enlivening pages with graphics and animation. This easy-to-follow visual guide sho

  13. Building Social Web Applications

    CERN Document Server

    Bell, Gavin

    2009-01-01

    Building a web application that attracts and retains regular visitors is tricky enough, but creating a social application that encourages visitors to interact with one another requires careful planning. This book provides practical solutions to the tough questions you'll face when building an effective community site -- one that makes visitors feel like they've found a new home on the Web. If your company is ready to take part in the social web, this book will help you get started. Whether you're creating a new site from scratch or reworking an existing site, Building Social Web Applications

  14. An introduction to webs

    Science.gov (United States)

    White, C. D.

    2016-04-01

    Webs are sets of Feynman diagrams that contribute to the exponents of scattering amplitudes, in the kinematic limit in which emitted radiation is soft. As such, they have a number of phenomenological and formal applications, and offer tantalizing glimpses into the all-order structure of perturbative quantum field theory. This article is based on a series of lectures given to graduate students, and aims to provide a pedagogical introduction to webs. Topics covered include exponentiation in (non-)abelian gauge theories, the web mixing matrix formalism for non-abelian gauge theories, and recent progress on the calculation of web diagrams. Problems are included throughout the text, to aid understanding.
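
    Schematically, the exponentiation discussed in these lectures takes the form below. The notation follows the common web-mixing-matrix literature and is an assumption on our part, not quoted from the abstract:

```latex
\mathcal{A} \;=\; \mathcal{A}_0 \,
  \exp\!\Big[\sum_{D} \widetilde{C}_D \, \mathcal{F}_D\Big],
\qquad
\widetilde{C}_D \;=\; \sum_{D'} R_{DD'}\, C_{D'},
```

    where $\mathcal{F}_D$ and $C_D$ are the kinematic and colour factors of diagram $D$, and $R$ is the web mixing matrix that dresses each diagram in a web with a modified colour factor.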

  15. Advanced web services

    CERN Document Server

    Bouguettaya, Athman; Daniel, Florian

    2013-01-01

    Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet. This book is the second installment of a two-book collection covering the state-o

  16. Semantic Web Evaluation Challenge

    CERN Document Server

    2014-01-01

    This book constitutes the thoroughly refereed post conference proceedings of the first edition of the Semantic Web Evaluation Challenge, SemWebEval 2014, co-located with the 11th Extended Semantic Web conference, held in Anissaras, Crete, Greece, in May 2014. This book includes the descriptions of all methods and tools that competed at SemWebEval 2014, together with a detailed description of the tasks, evaluation procedures and datasets. The contributions are grouped in three areas: semantic publishing (sempub), concept-level sentiment analysis (ssa), and linked-data enabled recommender systems (recsys).

  17. An Introduction to Webs

    CERN Document Server

    White, C D

    2015-01-01

    Webs are sets of Feynman diagrams that contribute to the exponents of scattering amplitudes, in the kinematic limit in which emitted radiation is soft. As such, they have a number of phenomenological and formal applications, and offer tantalising glimpses into the all-order structure of perturbative quantum field theory. This article is based on a series of lectures given to graduate students, and aims to provide a pedagogical introduction to webs. Topics covered include exponentiation in (non-)abelian gauge theories, the web mixing matrix formalism for non-abelian gauge theories, and recent progress on the calculation of web diagrams. Problems are included throughout the text, to aid understanding.

  18. Programming the semantic web

    CERN Document Server

    Segaran, Toby; Taylor, Jamie

    2009-01-01

    With this book, the promise of the Semantic Web -- in which machines can find, share, and combine data on the Web -- is not just a technical possibility, but a practical reality. Programming the Semantic Web demonstrates several ways to implement semantic web applications, using current and emerging standards and technologies. You'll learn how to incorporate existing data sources into semantically aware applications and publish rich semantic data. Each chapter walks you through a single piece of semantic technology and explains how you can use it to solve real problems. Whether you're writing

  19. Exploring Multimedia Web Conferencing

    Directory of Open Access Journals (Sweden)

    Ana-Maria SUDUC

    2009-01-01

    The Internet has changed the perspective on meetings and also on decision-making processes. Virtualization of meetings has become a common way of collaborating among employees, customers, partners, trainees and trainers, etc. Web conferencing allows team members to collaborate to achieve common goals. Web conferencing applications permit the participation of people from different locations, without the need for travel or meeting organization. They are multimedia systems that allow various remote collaborations with multiple types of resources. The paper presents an exploratory study of multimedia web conferencing systems, their advantages and disadvantages, and also a use case meant to highlight several of this technology's benefits and problems.

  20. Web designer's idea book

    CERN Document Server

    McNeil, Patrick

    2014-01-01

    Discover the latest trends in web design! Looking for inspiration for your latest web design project? Expert Patrick McNeil, author of the popular Web Designer's Idea Book series, is back with all new examples of today's best website design. Featuring more than 650 examples of the latest trends, this fourth volume of The Web Designer's Idea Book is overflowing with visual inspiration. Arranged categorically, this fully illustrated guide puts important topics like design styles, elements, themes and responsive design at your fingertips. This new volume also includes a detailed discussion o

  1. RESTful Web Services Cookbook

    CERN Document Server

    Allamaraju, Subbu

    2010-01-01

    While the REST design philosophy has captured the imagination of web and enterprise developers alike, using this approach to develop real web services is no picnic. This cookbook includes more than 100 recipes to help you take advantage of REST, HTTP, and the infrastructure of the Web. You'll learn ways to design RESTful web services for client and server applications that meet performance, scalability, reliability, and security goals, no matter what programming language and development framework you use. Each recipe includes one or two problem statements, with easy-to-follow, step-by-step i

  2. Programming the Mobile Web

    CERN Document Server

    Firtman, Maximiliano

    2010-01-01

    Today's market for mobile apps goes beyond the iPhone to include BlackBerry, Nokia, Windows Phone, and smartphones powered by Android, webOS, and other platforms. If you're an experienced web developer, this book shows you how to build a standard app core that you can extend to work with specific devices. You'll learn the particulars and pitfalls of building mobile apps with HTML, CSS, and other standard web tools. You'll also explore platform variations, finicky mobile browsers, Ajax design patterns for mobile, and much more. Before you know it, you'll be able to create mashups using Web 2.

  3. Web Accessibility and Guidelines

    Science.gov (United States)

    Harper, Simon; Yesilada, Yeliz

    Access to, and movement around, complex online environments, of which the World Wide Web (Web) is the most popular example, has long been considered an important and major issue in the Web design and usability field. The commonly used slang phrase ‘surfing the Web’ implies rapid and free access, pointing to its importance among designers and users alike. It has also been long established that this potentially complex and difficult access is further complicated, and becomes neither rapid nor free, if the user is disabled. There are millions of people who have disabilities that affect their use of the Web. Web accessibility aims to help these people to perceive, understand, navigate, and interact with, as well as contribute to, the Web, and thereby the society in general. This accessibility is, in part, facilitated by the Web Content Accessibility Guidelines (WCAG) currently moving from version one to two. These guidelines are intended to encourage designers to make sure their sites conform to specifications, and in that conformance enable the assistive technologies of disabled users to better interact with the page content. In this way, it was hoped that accessibility could be supported. While this is in part true, guidelines do not solve all problems and the new WCAG version two guidelines are surrounded by controversy and intrigue. This chapter aims to establish the published literature related to Web accessibility and Web accessibility guidelines, and discuss limitations of the current guidelines and future directions.

  4. Human dynamics revealed through Web analytics

    Science.gov (United States)

    Gonçalves, Bruno; Ramasco, José J.

    2008-08-01

    The increasing ubiquity of Internet access and the frequency with which people interact with it raise the possibility of using the Web to better observe, understand, and monitor several aspects of human social behavior. Web sites with large numbers of frequently returning users are ideal for this task. If these sites belong to companies or universities, their usage patterns can furnish information about the working habits of entire populations. In this work, we analyze the properly anonymized logs detailing the access history to Emory University’s Web site. Emory is a medium-sized university located in Atlanta, Georgia. We find interesting structure in the activity patterns of the domain and study in a systematic way the main forces behind the dynamics of the traffic. In particular, we find that linear preferential linking, priority-based queuing, and the decay of interest for the contents of the pages are the essential ingredients to understand the way users navigate the Web.
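
    Priority-based queuing of the kind invoked here is often illustrated with a Barabási-style task list. The sketch below, with an assumed fixed list length and a deterministic highest-priority-first rule, is a toy version of that model, whose waiting times are known to be heavy-tailed in the same way as bursty web activity:

```python
import random

def priority_queue_waits(steps=10000, list_len=2, seed=1):
    """Barabasi-style priority queue: keep a fixed-length task list,
    always execute the highest-priority task, and refill the freed slot
    with a fresh task. Returns the waiting times of executed tasks."""
    rng = random.Random(seed)
    tasks = [[rng.random(), 0] for _ in range(list_len)]  # [priority, age]
    waits = []
    for _ in range(steps):
        for t in tasks:
            t[1] += 1                      # every task waits one step
        i = max(range(list_len), key=lambda k: tasks[k][0])
        waits.append(tasks[i][1])
        tasks[i] = [rng.random(), 0]       # fresh task fills the slot
    return waits
```

    Most tasks are executed almost immediately, while a few low-priority tasks wait very long, producing the bursty inter-event statistics observed in web access logs.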

  5. Web Mining and Social Networking

    DEFF Research Database (Denmark)

    Xu, Guandong; Zhang, Yanchun; Li, Lin

    This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web mining, and the issue of how to incorporate web mining into web personalization and recommendation systems, are also reviewed. Additionally, the volume explores web community mining and analysis to find the structural, organizational and temporal developments of web communities and reveal the societal sense of individuals or communities. The volume will benefit both academic and industry communities interested in the techniques and applications of web search, web data management, web mining and web knowledge discovery, as well as web community and social network analysis.

  6. Using Open data in analyzing urban growth: urban density and change detection

    Science.gov (United States)

    murgante, Beniamino; Nolè, Gabriele; Lasaponara, Rosa; Lanorte, Antonio

    2013-04-01

    In recent years great attention has been paid to the evolution and use of spatial data. Internet technologies have accelerated this process, allowing more direct access to spatial information. It is estimated that more than 600 million people have connected to the Internet at least once to display maps on the web. Consequently, there is an irreversible process which considers the geographical dimension a fundamental attribute for the management of information flows. Furthermore, the great activity produced by the open data movement leads to easier and clearer access to geospatial information. This trend concerns, in a less evident way, satellite data as well, which are increasingly accessible through the web. Spatial planning, geography and other regional sciences find it difficult to build knowledge related to spatial transformation. These problems can be significantly reduced by large data availability, producing significant opportunities to capture knowledge useful for better territorial governance. This study has been developed in a heavily anthropized area in southern Italy, the Apulia region, using free spatial data and free multispectral and multitemporal satellite data (Apulia was one of the first regions in Italy to adopt open data policies). The analysis concerns urban growth, which has shown a rapid increase in recent decades. In the first step, the evolution over time and change detection of urban areas were analyzed, paying particular attention to soil consumption. In the second step, Kernel Density Estimation (KDE) was adopted in order to assess development pressures. KDE is a technique that provides the density of a phenomenon based on point data: a moving three-dimensional surface is produced from a set of points distributed over a region of space, which weights the events within its sphere of influence depending on their distance from the point at which intensity is estimated.
It produces, considering as
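
    A minimal version of the kernel density surface described above, evaluated at a single location. The isotropic Gaussian kernel and its normalization are assumptions for illustration; the study does not specify its kernel:

```python
import math

def gaussian_kde_2d(points, x, y, bandwidth):
    """Kernel density estimate at (x, y): each event point contributes
    a Gaussian weight that decays with its distance from (x, y)."""
    h2 = bandwidth ** 2
    norm = 1.0 / (2.0 * math.pi * h2 * len(points))
    return norm * sum(
        math.exp(-((x - px) ** 2 + (y - py) ** 2) / (2.0 * h2))
        for px, py in points
    )
```

    Evaluating this over a grid of locations yields the density surface used to map development pressure; the bandwidth controls how far each event's influence extends.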

  7. Bios data analyzer.

    Science.gov (United States)

    Sabelli, H; Sugerman, A; Kovacevic, L; Kauffman, L; Carlson-Sabelli, L; Patel, M; Konecki, J

    2005-10-01

    The Bios Data Analyzer (BDA) is a set of computer programs (CD-ROM, in Sabelli et al., Bios. A Study of Creation, 2005) for new time series analyses that detect and measure creative phenomena, namely diversification, novelty, complexes, and nonrandom complexity. We define a process as creative when its time series displays these properties. They are found in heartbeat interval series, the exemplar of bios (just as turbulence is the exemplar of chaos), in many other empirical series (galactic distributions, meteorological, economic and physiological series), in biotic series generated mathematically by bipolar feedback, and in stochastic noise, but not in chaotic attractors. Differencing, consecutive recurrence and partial autocorrelation indicate nonrandom causation, thereby distinguishing chaos and bios from randomness and random walks. Embedding plots distinguish causal creative processes (e.g. bios), which include both simple and complex components of variation, from stochastic processes (e.g. Brownian noise), which include only complex components, and from chaotic processes, which decay from order to randomness as the number of dimensions is increased. Varying bin size and dimensionality show that entropy measures symmetry and variety, and that complexity is associated with asymmetry. Trigonometric transformations measure coexisting opposites in time series and demonstrate bipolar, partial, and uncorrelated opposites in empirical processes and bios, supporting the hypothesis that bios is generated by bipolar feedback, a concept which is at variance with standard concepts of polar and complementary opposites.
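
    The embedding plots mentioned above start from a time-delay embedding, which can be sketched as:

```python
def embed(series, dim, lag=1):
    """Time-delay embedding: map a scalar series into dim-dimensional
    vectors (x[i], x[i+lag], ..., x[i+(dim-1)*lag]), the representation
    on which embedding-based analyses operate."""
    n = len(series) - (dim - 1) * lag
    return [tuple(series[i + k * lag] for k in range(dim)) for i in range(n)]
```

    The BDA-style diagnostics then examine how the geometry of these embedded vectors changes as `dim` grows, which is what separates causal from purely stochastic variation.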

  8. TEAMS Model Analyzer

    Science.gov (United States)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent in the manual process that each TEAMS modeler must perform in preparing reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report covering all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  9. Analyzing Teachers' Stories

    Directory of Open Access Journals (Sweden)

    Anat Kainan

    2002-09-01

    Full Text Available This article presents an integrated socio-literal approach as a way to analyze work stories. It uses a case of teachers' stories about the administration as an example. The stories focus on grumbles about various activities of members of the management of a school in a small town. The complaints appear in descriptions of the action, the characters, and, in particular, in the way the story is presented to the audience. The stories present a situation of two opposing groups: the administration and the teachers. The presentation of the stories creates a sense of togetherness among the veterans and new teachers in the staff room, and helps the integration of the new teachers into the staff. The veterans use the stories as an opportunity to express their anger at not having been assigned responsibilities on the one hand and their hopes of such promotion on the other. The stories act as a convenient medium to express criticism without entering into open hostilities. Behind them, a common principle can be discerned: the good of the school. The stories describe the infringement of various aspects of the school's social order, and it is possible to elicit from them what general pattern the teachers want to preserve in the school.

  10. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  11. Analyzing Spacecraft Telecommunication Systems

    Science.gov (United States)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
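
The kind of link calculation a tool like MMTAT automates can be illustrated with a basic free-space link budget; the parameter names, values, and required-power threshold below are illustrative assumptions, not MMTAT's actual models:

```python
import math

def link_margin_db(tx_power_dbm, tx_gain_db, rx_gain_db,
                   freq_hz, distance_m, required_dbm):
    """Free-space link margin: received power minus the required threshold.

    A sketch of a textbook link budget, not MMTAT's parameterized models.
    """
    c = 299_792_458.0  # speed of light, m/s
    # free-space path loss in dB
    fspl = 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)
    received = tx_power_dbm + tx_gain_db + rx_gain_db - fspl
    return received - required_dbm

# a closer spacecraft has more margin, all else equal
near = link_margin_db(40, 30, 60, 8.4e9, 1e9, -150)
far = link_margin_db(40, 30, 60, 8.4e9, 4e9, -150)
print(near > far)  # → True
```

Sweeping `distance_m` or `freq_hz` over a range reproduces the kind of parameter-variation plots the record describes.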

  12. The Research on Web Service based Network Management

    Directory of Open Access Journals (Sweden)

    Wenli Dong

    2010-07-01

    Full Text Available This paper proposes Web Service based network management. The Web Service based network management system is analyzed. It consists mainly of a network management layer, a collaborative management implementation layer, and a management function layer. Complex network management tasks can be accomplished by multiple Web Services distributed across the Internet, which interchange information based on XML messages. The SNMP/XML gateway and the translation between GDMO/ASN.1 and XML/Schema are designed and implemented to integrate legacy network management systems with network management developed using Web Service technologies. The service management in Web Service based network management is discussed. Service composition/re-composition in Web Service based network management is analyzed based on the negotiation between the QoS requirements of network management and the state of the Web Services and the network: OWL-S is used to describe the network management requirements so that suitable Web Services can be discovered, and BPEL is used to describe the Web Service composition.
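
The SNMP/XML gateway idea mentioned above amounts to translating management data into XML messages. A minimal sketch, with element names that are illustrative assumptions rather than any standard gateway schema:

```python
import xml.etree.ElementTree as ET

def varbind_to_xml(oid, value):
    """Wrap one SNMP varbind (OID + value) in an XML message.

    A sketch of the gateway's translation direction only; real gateways
    map whole PDUs and MIB structure, not single varbinds.
    """
    msg = ET.Element("varbind")
    ET.SubElement(msg, "oid").text = oid
    ET.SubElement(msg, "value").text = str(value)
    return ET.tostring(msg, encoding="unicode")

print(varbind_to_xml("1.3.6.1.2.1.1.3.0", 123456))
# → <varbind><oid>1.3.6.1.2.1.1.3.0</oid><value>123456</value></varbind>
```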

  13. Spatial cognition

    Science.gov (United States)

    Kaiser, Mary Kister; Remington, Roger

    1988-01-01

    Spatial cognition is the ability to reason about geometric relationships in the real (or a metaphorical) world based on one or more internal representations of those relationships. The study of spatial cognition is concerned with the representation of spatial knowledge, and our ability to manipulate these representations to solve spatial problems. Spatial cognition is utilized most critically when direct perceptual cues are absent or impoverished. Examples are provided of how human spatial cognitive abilities impact on three areas of space station operator performance: orientation, path planning, and data base management. A videotape provides demonstrations of relevant phenomena (e.g., the importance of orientation for recognition of complex, configural forms). The presentation is represented by abstract and overhead visuals only.

  14. An Internet-Based GIS Platform Providing Data for Visualization and Spatial Analysis of Urbanization in Major Asian and African Cities

    Directory of Open Access Journals (Sweden)

    Hao Gong

    2017-08-01

    Full Text Available Rapid urbanization in developing countries has been observed to be relatively high in the last two decades, especially in the Asian and African regions. Although many researchers have made efforts to improve the understanding of the urbanization trends of various cities in Asia and Africa, the absence of platforms where local stakeholders can visualize and obtain processed urbanization data for their specific needs or analysis still remains a gap. In this paper, we present an Internet-based GIS platform called MEGA-WEB. The platform was developed in view of the urban planning and management challenges in developing countries of Asia and Africa due to the limited availability of data resources, effective tools, and proficiency in data analysis. MEGA-WEB provides online access, visualization, spatial analysis, and data sharing services following a mashup framework that combines the MEGA-WEB Geo Web Services (GWS) with third-party map services using HTML5/JavaScript techniques. Through the integration of GIS, remote sensing, geo-modelling, and Internet GIS, several indicators for analyzing urbanization are provided in MEGA-WEB to give diverse perspectives on urbanization covering not only the physical land surface condition, but also the relationships of population, energy use, and the environment. The design, architecture, system functions, and uses of MEGA-WEB are discussed in the paper. The MEGA-WEB project is aimed at contributing to sustainable urban development in developing countries of Asia and Africa.

  15. Spatial Data Analysis.

    Science.gov (United States)

    Banerjee, Sudipto

    2016-01-01

    With increasing accessibility to geographic information systems (GIS) software, statisticians and data analysts routinely encounter scientific data sets with geocoded locations. This has generated considerable interest in statistical modeling for location-referenced spatial data. In public health, spatial data routinely arise as aggregates over regions, such as counts or rates over counties, census tracts, or some other administrative delineation. Such data are often referred to as areal data. This review article provides a brief overview of statistical models that account for spatial dependence in areal data. It does so in the context of two applications: disease mapping and spatial survival analysis. Disease maps are used to highlight geographic areas with high and low prevalence, incidence, or mortality rates of a specific disease and the variability of such rates over a spatial domain. They can also be used to detect hot spots or spatial clusters that may arise owing to common environmental, demographic, or cultural effects shared by neighboring regions. Spatial survival analysis refers to the modeling and analysis for geographically referenced time-to-event data, where a subject is followed up to an event (e.g., death or onset of a disease) or is censored, whichever comes first. Spatial survival analysis is used to analyze clustered survival data when the clustering arises from geographical regions or strata. Illustrations are provided in these application domains.
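
The disease-mapping quantity described above is commonly a ratio of observed to expected counts per region; a minimal sketch with illustrative toy counts:

```python
def standardized_ratios(observed, expected):
    """Standardized morbidity ratios (observed/expected) per region.

    This is the basic quantity a disease map displays; the county names
    and counts below are illustrative, not from the review.
    """
    return {region: observed[region] / expected[region] for region in observed}

obs = {"county_a": 30, "county_b": 12, "county_c": 45}   # observed cases
exp = {"county_a": 25.0, "county_b": 20.0, "county_c": 30.0}  # age-adjusted expectation
smr = standardized_ratios(obs, exp)

# ratios above 1 flag regions with more cases than expected
hotspots = sorted(region for region, ratio in smr.items() if ratio > 1)
print(hotspots)  # → ['county_a', 'county_c']
```

The spatial models the review covers (e.g. conditional autoregressive priors) then smooth these raw ratios across neighboring regions.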

  16. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing of blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  17. An Authentication system of Web Services Based on Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    R. Joseph Manoj

    2014-01-01

    Full Text Available Authentication is a method which validates users' identity prior to permitting them to access web services. To enhance the security of web services, providers follow a variety of authentication methods to restrict malicious users from accessing the services. This paper proposes a new authentication method which verifies a user's claimed identity by analyzing web server log files, which include the details of the requesting user's IP address, username, password, date and time of request, status code, URL etc., and checks for IP address spoofing using the ingress packet filtering method. This paper also analyses the resultant data and the performance of the proposed work.
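
A minimal sketch of the log-analysis idea: parse entries, build each user's IP history, and flag a request arriving from an unseen address. The Apache-style log format and field names are assumptions, not the paper's actual layout:

```python
import re

# assumed log layout: ip, ident, user, [timestamp], "request", status
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3})'
)

def parse_entry(line):
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

def known_ips(entries, user):
    """IP addresses a user has previously made requests from."""
    return {e["ip"] for e in entries if e and e["user"] == user}

history = [parse_entry(l) for l in [
    '10.0.0.5 - alice [01/Jan/2024:10:00:00 +0000] "GET /login HTTP/1.1" 200',
    '10.0.0.5 - alice [01/Jan/2024:10:05:00 +0000] "GET /data HTTP/1.1" 200',
]]
new = parse_entry('203.0.113.9 - alice [01/Jan/2024:10:06:00 +0000] "GET /data HTTP/1.1" 200')

# flag the request if the claimed identity arrives from an unseen address
suspicious = new["ip"] not in known_ips(history, new["user"])
print(suspicious)  # an unseen IP for this user would trigger the spoofing checks
```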

  18. A Plausible Comprehensive Web Intelligent System for Investigation of Web User Behaviour Adaptable to Incremental Mining

    Directory of Open Access Journals (Sweden)

    V.V.R. Maheswara Rao

    2010-08-01

    Full Text Available With the continued increase in the usage of the World Wide Web (WWW), Web mining has been established as an important area of research. The WWW is a vast repository of unstructured information, in the form of interrelated files, distributed on numerous web servers over wide geographical regions. Web mining deals with discovering and analyzing useful information from the WWW. Web usage mining focuses on investigating the potential knowledge in users' browsing patterns and finding the correlations between pages on analysis. To proceed towards web intelligence, obviating the need for human interaction, artificial intelligence needs to be incorporated and embedded into web tools. Before applying mining techniques, the data in the web log has to be pre-processed, integrated and transformed. The data pre-processing stage is the most important phase in the process of web mining and is critical and complex for the successful extraction of useful data. The web log is non-scalable, impractical and distributed in nature, so conventional data pre-processing techniques prove unsuitable, as they assume that the data is static. Hence an intelligent system capable of pre-processing the web log efficiently is required. Due to the incremental nature of the web log, web miners need to use incremental mining techniques to extract usage patterns and study the visiting characteristics of users; hence a comprehensive algorithm that reduces the computing cost significantly is required. This paper introduces an intelligent system, IPS, for pre-processing of the web log; in addition, a learning algorithm, the IFP-tree model, is proposed for pattern recognition. The Intelligent Pre-processing System (IPS) can differentiate human user and web search engine accesses intelligently in less time, and discards search engine accesses. The present system reduces the error rate and significantly improves the learning performance of the algorithm.
The Incremental Frequent Pattern Tree
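
One pre-processing step the record describes, separating human users from search-engine accesses, can be sketched as a user-agent check; the crawler markers below are illustrative, and real systems use richer signals:

```python
# substrings that commonly identify crawlers in user-agent headers (illustrative)
CRAWLER_MARKERS = ("bot", "crawler", "spider", "slurp")

def is_search_engine(user_agent):
    """Heuristically classify a log entry's user-agent as a crawler."""
    ua = user_agent.lower()
    return any(marker in ua for marker in CRAWLER_MARKERS)

accesses = [
    "Mozilla/5.0 (Windows NT 10.0) Firefox/115.0",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
]
# discard search-engine accesses before mining usage patterns
kept = [ua for ua in accesses if not is_search_engine(ua)]
print(len(kept))  # → 1, the crawler access is discarded
```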

  19. The Technology of Extracting Content Information from Web Page Based on DOM Tree

    Science.gov (United States)

    Yuan, Dingrong; Mo, Zhuoying; Xie, Bing; Xie, Yangcai

    There are huge amounts of information on Web pages, which include content information and other useless information, such as navigation, advertisements and flash animations. To reduce the toil of Web users, we established a technique to extract the content information from a web page. First, we analyzed the semantics of web documents with the V8 engine of Google and parsed the web document into a DOM tree. Then we traversed the DOM tree and pruned it in light of the characteristics of the Web page's edit language. Finally, we extracted the content information from the Web page. Theory and experiments showed that the technique can simplify the web page, present the content information to web users and supply clean data for applicable areas such as retrieval, KDD and DM from the web.
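
The prune-then-extract idea can be sketched with Python's standard HTML parser instead of a full DOM: subtrees whose tags usually hold boilerplate are skipped, and the rest contributes content text. The tag list is an illustrative assumption, not the paper's pruning rules:

```python
from html.parser import HTMLParser

class ContentExtractor(HTMLParser):
    """Skip subtrees rooted at boilerplate tags; keep the remaining text."""
    PRUNE = {"nav", "script", "style", "aside", "footer"}

    def __init__(self):
        super().__init__()
        self.depth = 0          # > 0 while inside a pruned subtree
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.PRUNE or self.depth:
            self.depth += 1     # entering (or descending within) a pruned subtree

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if not self.depth and data.strip():
            self.chunks.append(data.strip())

page = """<html><body>
<nav><a href="/">Home</a></nav>
<p>Actual article text.</p>
<script>trackVisit()</script>
</body></html>"""
ex = ContentExtractor()
ex.feed(page)
print(" ".join(ex.chunks))  # only the content text survives pruning
```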

  20. Soft Decision Analyzer

    Science.gov (United States)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform realtime closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the content of the received data stream in relation to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence in and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths of 1 to 12 bits over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the
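
The slip-detection idea, correlating the received stream against the transmitted reference at small offsets, can be sketched as follows; the plain agreement count here stands in for the SDA's power or Massey correlation, and the streams are illustrative:

```python
def best_alignment(sent, received, max_slip=4):
    """Find the bit offset that best aligns `received` with `sent`.

    A sketch of slip detection: try every offset within the slip window
    and keep the one with the highest bit agreement.
    """
    best = (0, -1)  # (offset, agreement score)
    for offset in range(-max_slip, max_slip + 1):
        score = sum(
            1
            for i, bit in enumerate(sent)
            if 0 <= i + offset < len(received) and received[i + offset] == bit
        )
        if score > best[1]:
            best = (offset, score)
    return best[0]

reference = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]
slipped = reference[2:] + [0, 1]   # receiver dropped two leading bits
print(best_alignment(reference, slipped))  # → -2, i.e. a two-bit slip
```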

  1. Crew Activity Analyzer

    Science.gov (United States)

    Murray, James; Kirillov, Alexander

    2008-01-01

    The crew activity analyzer (CAA) is a system of electronic hardware and software for automatically identifying patterns of group activity among crew members working together in an office, cockpit, workshop, laboratory, or other enclosed space. The CAA synchronously records multiple streams of data from digital video cameras, wireless microphones, and position sensors, then plays back and processes the data to identify activity patterns specified by human analysts. The processing greatly reduces the amount of time that the analysts must spend in examining large amounts of data, enabling the analysts to concentrate on subsets of data that represent activities of interest. The CAA has potential for use in a variety of governmental and commercial applications, including planning for crews for future long space flights, designing facilities wherein humans must work in proximity for long times, improving crew training and measuring crew performance in military settings, human-factors and safety assessment, development of team procedures, and behavioral and ethnographic research. The data-acquisition hardware of the CAA (see figure) includes two video cameras: an overhead one aimed upward at a paraboloidal mirror on the ceiling and one mounted on a wall aimed in a downward slant toward the crew area. As many as four wireless microphones can be worn by crew members. The audio signals received from the microphones are digitized, then compressed in preparation for storage. Approximate locations of as many as four crew members are measured by use of a Cricket indoor location system. [The Cricket indoor location system includes ultrasonic/radio beacon and listener units. A Cricket beacon (in this case, worn by a crew member) simultaneously transmits a pulse of ultrasound and a radio signal that contains identifying information. Each Cricket listener unit measures the difference between the times of reception of the ultrasound and radio signals from an identified beacon

  2. Web Design Matters

    Science.gov (United States)

    Mathews, Brian

    2009-01-01

    The web site is a library's most important feature. Patrons use the web site for numerous functions, such as renewing materials, placing holds, requesting information, and accessing databases. The homepage is the place they turn to look up the hours, branch locations, policies, and events. Whether users are at work, at home, in a building, or on…

  3. Decoding Technology: Web Browsers

    Science.gov (United States)

    Walker, Tim; Donohue, Chip

    2007-01-01

    More than ever, early childhood administrators are relying on the Internet for information. A key to becoming an exceptional Web "surfer" is getting to know the ins and outs of the Web browser being used. There are several options available, and almost all can be downloaded for free. However, many of the functions and features they offer are very…

  4. Making WEB Meaning.

    Science.gov (United States)

    McKenzie, Jamie

    1996-01-01

    Poorly organized and dominated by amateurs, hucksters, and marketeers, the net requires efficient navigating devices. Students at Bellingham (Washington) Public Schools tackle information overload by contributing to virtual museums on school Web sites, using annotated Web curriculum lists, and conducting research in cooperative teams stressing…

  5. Uncovering the unarchived web

    NARCIS (Netherlands)

    Samar, T.; Huurdeman, H.C.; Ben-David, A.; Kamps, J.; Vries, A.P. de

    2014-01-01

    Many national and international heritage institutes realize the importance of archiving the web for future culture heritage. Web archiving is currently performed either by harvesting a national domain, or by crawling a pre-defined list of websites selected by the archiving institution. In either met

  6. Mastering Go web services

    CERN Document Server

    Kozyra, Nathan

    2015-01-01

    If you are a web programmer with experience in developing web services and have a rudimentary knowledge of using Go, then this is the book for you. Basic knowledge of Go as well as knowledge of relational databases and non-relational NoSQL datastores is assumed. Some basic concurrency knowledge is also required.

  7. CERN celebrates Web anniversary

    CERN Document Server

    2003-01-01

    "Ten years ago, CERN issued a statement declaring that a little known piece of software called the World Wide Web was in the public domain. That was on 30 April 1993, and it opened the floodgates to Web development around the world" (1 page).

  8. Web Auctions in Europe

    NARCIS (Netherlands)

    A. Pouloudi; J. Paarlberg; H.W.G.M. van Heck (Eric)

    2001-01-01

    textabstractThis paper argues that a better understanding of the business model of web auctions can be reached if we adopt a broader view and provide empirical research from different sites. In this paper the business model of web auctions is refined into four dimensions. These are auction model, mo

  10. Sign Language Web Pages

    Science.gov (United States)

    Fels, Deborah I.; Richards, Jan; Hardman, Jim; Lee, Daniel G.

    2006-01-01

    The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, Web accessing can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The…

  11. Web Page Design.

    Science.gov (United States)

    Lindsay, Lorin

    Designing a web home page involves many decisions that affect how the page will look, the kind of technology required to use the page, the links the page will provide, and kinds of patrons who can use the page. The theme of information literacy needs to be built into every web page; users need to be taught the skills of sorting and applying…

  12. Causal analyzing on regional economic disparities based on the spatial economic model: A case study of Lan-Xin railway radiation belt

    Institute of Scientific and Technical Information of China (English)

    李建豹; 白永平; 李建虎; 侯成成

    2012-01-01

    Using GIS spatial analysis, the radiation range of the Lan-Xin railway radiation belt was delimited, with county-level administrative units as the basic study units. Based on ten relative economic indices for the belt, the overall score of regional economic level was calculated with SPSS, and regional economic disparities were analyzed by means of the spatial analysis tools of ARCGIS and GeoDA. The analysis shows that, within the Lan-Xin railway radiation belt, the economic development level of the municipal districts of Lanzhou city and Urumqi city, Jiayuguan city, Hami city and Alxa Left Banner is significantly higher than elsewhere; regional economic disparities are larger in the Gansu section and comparatively smaller in the Xinjiang section; and the regional economy exhibits significant spatial agglomeration. Analyzing the causes of these disparities with spatial econometric models shows that financial revenue has an obvious negative impact on economic development, while market scale, economic structure, industrialization and the dummy variable clearly promote it; the dummy variable has the largest regression coefficient with the economic development level, indicating that rural urbanization promotes economic development the most. The regional investment level has no obvious effect on economic development.
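
The significant spatial agglomeration reported above is typically quantified with Moran's I; a minimal pure-Python sketch on an illustrative four-region adjacency (positive I for clustered values, negative for dispersed ones):

```python
def morans_i(values, weights):
    """Moran's I statistic for spatial autocorrelation.

    `weights` is a binary adjacency matrix (weights[i][j] = 1 if regions
    i and j are neighbours); region values and layout are illustrative.
    """
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j] for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(sum(row) for row in weights)
    return (n / w_sum) * (num / den)

# four regions on a line; neighbours share an edge
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
clustered = [10, 11, 2, 1]    # similar values sit next to each other
dispersed = [10, 1, 11, 2]    # high and low values alternate
print(morans_i(clustered, w) > 0 > morans_i(dispersed, w))
```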

  13. Mining topological relations from the web

    OpenAIRE

    Schockaert, Steven; Smart, Philip D.; Abdelmoty, Alia I.; Jones, Christopher B.

    2008-01-01

    Topological relations between geographic regions are of interest in many applications. When the exact boundaries of regions are not available, such relations can be established by analysing natural language information from web documents. In particular we demonstrate how redundancy-based techniques can be used to acquire containment and adjacency relations, and how fuzzy spatial reasoning can be employed to maintain the consistency of the resulting knowledge base.

  14. The Chemnitz LogAnalyzer: a tool for analyzing data from hypertext navigation research.

    Science.gov (United States)

    Brunstein, Angela; Naumann, Anja; Krems, Josef F

    2005-05-01

    Computer-based studies usually produce log files as raw data. These data cannot be analyzed adequately with conventional statistical software. The Chemnitz LogAnalyzer provides tools for quick and comfortable visualization and analysis of hypertext navigation behavior, both for individual users and for aggregated data. In addition, it supports analogous analyses of questionnaire data and reanalysis with respect to several predefined orders of nodes of the same hypertext. As an illustration of how to use the Chemnitz LogAnalyzer, we give an account of one study on learning with hypertext. Participants either searched for specific details or read a hypertext document to familiarize themselves with its content. The tool helped identify navigation strategies affected by these two processing goals and provided comparisons, for example, of processing times and visited sites. Altogether, the Chemnitz LogAnalyzer fills the gap between log files as raw data of Web-based studies and conventional statistical software.
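
The kind of aggregation such a tool performs, e.g. total processing time per visited node, can be sketched from a timestamped navigation log; the `(time, node)` record shape and the sample log are illustrative assumptions:

```python
def dwell_times(log):
    """Total time spent on each node, from a chronologically ordered
    list of (timestamp_seconds, node) visit events.

    The time on a node is the gap until the next navigation event; the
    final event has no successor and contributes nothing.
    """
    times = {}
    for (t, node), (t_next, _) in zip(log, log[1:]):
        times[node] = times.get(node, 0.0) + (t_next - t)
    return times

log = [(0.0, "intro"), (12.5, "methods"), (30.0, "intro"), (34.0, "end")]
print(dwell_times(log))  # total seconds spent on each visited node
```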

  15. Vibration Propagation in Spider Webs

    Science.gov (United States)

    Hatton, Ross; Otto, Andrew; Elias, Damian

    Due to their poor eyesight, spiders rely on web vibrations for situational awareness. Web-borne vibrations are used to determine the location of prey, predators, and potential mates. The influence of web geometry and composition on web vibrations is important for understanding spiders' behavior and ecology. Past studies on web vibrations have experimentally measured the frequency response of web geometries by removing threads from existing webs. The full influence of web structure and tension distribution on vibration transmission, however, has not been addressed in prior work. We have constructed physical artificial webs and computer models to better understand the effect of web structure on vibration transmission. These models provide insight into the propagation of vibrations through the webs, the frequency response of the bare web, and the influence of the spider's mass and stiffness on the vibration transmission patterns. Funded by NSF-1504428.
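
The influence of the spider's mass noted in the last sentence can be illustrated with the simplest possible model, a single thread-mass oscillator; the stiffness and mass values are illustrative assumptions, not measurements from the study:

```python
import math

def natural_frequency_hz(stiffness, mass):
    """Natural frequency f = sqrt(k/m) / (2*pi) of a mass on an elastic
    thread: a heavier spider lowers the web's resonant frequency."""
    return math.sqrt(stiffness / mass) / (2 * math.pi)

thread_k = 0.5             # N/m, assumed effective silk stiffness
light, heavy = 1e-4, 1e-3  # kg, a small vs. a large spider
print(natural_frequency_hz(thread_k, light) > natural_frequency_hz(thread_k, heavy))
```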

  16. Engineering Adaptive Web Applications

    DEFF Research Database (Denmark)

    Dolog, Peter

    2007-01-01

    Information and services on the web are accessible for everyone. Users of the web differ in their background, culture, political and social environment, interests and so on. Ambient intelligence was envisioned as a concept for systems which are able to adapt to user actions and needs. With the growing amount of information and services, web applications become natural candidates to adopt the concepts of ambient intelligence. Such applications can deal with diverse user intentions and actions based on the user profile and can suggest the combination of information content and services which suit the user profile the most. This paper summarizes the domain engineering framework for such adaptive web applications. The framework provides guidelines to develop adaptive web applications as members of a family. It suggests how to utilize the design artifacts as knowledge which can be used...

  17. IL web tutorials

    DEFF Research Database (Denmark)

    Hyldegård, Jette; Lund, Haakon

    2012-01-01

    The paper presents the results from a study on information literacy in a higher education (HE) context, based on a larger research project evaluating 3 Norwegian IL web tutorials at 6 universities and colleges in Norway. The aim was to evaluate how the 3 web tutorials served students' information seeking and writing process in a study context, and to identify barriers to the employment and use of the IL web tutorials, hence to the underlying information literacy intentions of the developers. Both qualitative and quantitative methods were employed. A clear mismatch was found between intention and use of the web tutorials. In addition, usability only played a minor role compared to relevance. It is concluded that the positive expectations of the IL web tutorials tend to be overrated by the developers. Suggestions for further research are presented.

  18. Webs and Posets

    CERN Document Server

    Dukes, Mark; McAslan, Heather; Scott, Darren J; White, Chris D

    2013-01-01

    The non-Abelian exponentiation theorem has recently been generalised to correlators of multiple Wilson line operators. The perturbative expansions of these correlators exponentiate in terms of sets of diagrams called webs, which together give rise to colour factors corresponding to connected graphs. The colour and kinematic degrees of freedom of individual diagrams in a web are entangled by mixing matrices of purely combinatorial origin. In this paper we relate the combinatorial study of these matrices to properties of partially ordered sets (posets), and hence obtain explicit solutions for certain families of web-mixing matrices, at arbitrary order in perturbation theory. We also provide a general expression for the rank of a general class of mixing matrices, which governs the number of independent colour factors arising from such webs. Finally, we use the poset language to examine a previously conjectured sum rule for the columns of web-mixing matrices, which governs the cancellation of the leading subdivergences.

  19. Quantifying Human Visible Color Variation from High Definition Digital Images of Orb Web Spiders

    Science.gov (United States)

    Ajuria Ibarra, Helena; Rao, Dinesh

    2016-01-01

    Digital processing and analysis of high resolution images of 30 individuals of the orb web spider Verrucosa arenata were performed to extract and quantify human visible colors present on the dorsal abdomen of this species. Color extraction was performed with minimal user intervention using an unsupervised algorithm to determine groups of colors on each individual spider, which was then analyzed in order to quantify and classify the colors obtained, both spatially and using energy and entropy measures of the digital images. Analysis shows that the colors cover a small region of the visible spectrum, are not spatially homogeneously distributed over the patterns and from an entropic point of view, colors that cover a smaller region on the whole pattern carry more information than colors covering a larger region. This study demonstrates the use of processing tools to create automatic systems to extract valuable information from digital images that are precise, efficient and helpful for the understanding of the underlying biology. PMID:27902724

  20. Spatializing Time

    DEFF Research Database (Denmark)

    Thomsen, Bodil Marie Stavning

    2011-01-01

    The article analyses some of artist Søren Lose's photographic installations, in which time, history and narration are reflected in the creation of allegoric, spatial relations.

  2. Parasites in the Wadden Sea food web

    Science.gov (United States)

    Thieltges, David W.; Engelsma, Marc Y.; Wendling, Carolin C.; Wegner, K. Mathias

    2013-09-01

    While the free-living fauna of the Wadden Sea has received much interest, little is known about the distribution and effects of parasites in the Wadden Sea food web. However, recent studies on this special type of trophic interaction indicate a high diversity of parasites in the Wadden Sea and suggest a multitude of effects on the hosts. This also includes effects on specific predator-prey relationships and the general structure of the food web. Focussing on molluscs, a major group in the Wadden Sea in terms of biomass and abundance and an important link between primary producers and predators, we review existing studies and exemplify the ecological role of parasites in the Wadden Sea food web. First, we give a brief inventory of parasites occurring in the Wadden Sea, ranging from microparasites (e.g. protozoa, bacteria) to macroparasites (e.g. helminths, parasitic copepods) and discuss the effects of spatial scale on heterogeneities in infection levels. We then demonstrate how parasites can affect host population dynamics by acting as a strong mortality factor, causing mollusc mass mortalities. In addition, we exemplify how parasites can mediate the interaction strength of predator-prey relationships and affect the topological structure of the Wadden Sea food web as a whole. Finally, we highlight some ongoing changes regarding parasitism in the Wadden Sea in the course of global change (e.g. species introduction, climate change) and identify important future research questions to disentangle the role of parasites in the Wadden Sea food web.

  3. Optimizing Web Sites for Customer Retention

    CERN Document Server

    Hahsler, Michael

    2008-01-01

    With customer relationship management (CRM), companies move away from a mainly product-centered view to a customer-centered view. Resulting from this change, effectively managing contact with customers across different channels is one of the key success factors in today's business world. In many industries, company Web sites have evolved into an extremely important channel through which customers can be attracted and retained. To analyze and optimize this channel, accurate models of how customers browse through the Web site, and of what information within the site they repeatedly view, are crucial. Typically, data mining techniques are used for this purpose. However, numerous models developed in marketing research for traditional channels could also prove valuable for understanding this new channel. In this paper we propose applying an extension of the Logarithmic Series Distribution (LSD) model to the repeat usage of Web-based information, and thus to analyze and optimize this channel.
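The LSD mentioned in this abstract can be fitted to repeat-visit counts with a simple method-of-moments estimate. The sketch below is only an illustration of the distribution itself, not Hahsler's extension; the visit counts and function names are invented for the example.

```python
import math

def lsd_pmf(k, p):
    """P(K = k) for the Logarithmic Series Distribution, k = 1, 2, ..."""
    return -p**k / (k * math.log(1.0 - p))

def lsd_mean(p):
    """Mean of the LSD: -p / ((1 - p) ln(1 - p)), monotone increasing in p."""
    return -p / ((1.0 - p) * math.log(1.0 - p))

def fit_lsd(counts):
    """Method-of-moments fit: bisect for the p whose LSD mean
    matches the sample mean of the observed repeat-usage counts."""
    sample_mean = sum(counts) / len(counts)
    lo, hi = 1e-9, 1.0 - 1e-9
    for _ in range(200):  # bisection on the monotone mean function
        mid = (lo + hi) / 2.0
        if lsd_mean(mid) < sample_mean:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# hypothetical per-visitor page-view counts (not data from the paper)
visits = [1, 1, 1, 2, 1, 3, 1, 2, 5, 1, 1, 2]
p_hat = fit_lsd(visits)
```

Because most visitors appear only once, the heavy mass of the LSD at k = 1 makes it a natural candidate for repeat-usage data.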

  4. Web Log Analysis: A Study of Instructor Evaluations Done Online

    Science.gov (United States)

    Klassen, Kenneth J.; Smith, Wayne

    2004-01-01

    This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these…

  5. Text mining of web-based medical content

    CERN Document Server

    Neustein, Amy

    2014-01-01

    Text Mining of Web-Based Medical Content examines web mining for extracting useful information that can be used for treating and monitoring the healthcare of patients. This work provides methodological approaches to designing mapping tools that exploit data found in social media postings. Specific linguistic features of medical postings are analyzed vis-a-vis available data extraction tools for culling useful information.

  6. Classical Hypermedia Virtues on the Web with Webstrates

    DEFF Research Database (Denmark)

    Bouvin, Niels Olof; Klokmose, Clemens Nylandsted

    2016-01-01

    We show and analyze herein how Webstrates can augment the Web from a classical hypermedia perspective. Webstrates turns the DOM of Web pages into persistent and collaborative objects. We demonstrate how this can be applied to realize bidirectional links, shared collaborative annotations, and in...

  7. An Analysis of Academic Library Web Pages for Faculty

    Science.gov (United States)

    Gardner, Susan J.; Juricek, John Eric; Xu, F. Grace

    2008-01-01

    Web sites are increasingly used by academic libraries to promote key services and collections to teaching faculty. This study analyzes the content, location, language, and technological features of fifty-four academic library Web pages designed especially for faculty to expose patterns in the development of these pages.

  8. Gaining insight into food webs reconstructed by the inverse method

    NARCIS (Netherlands)

    Kones, J.; Soetaert, K.E.R.; Van Oevelen, D.; Owino, J.; Mavuti, K.

    2006-01-01

    The use of the inverse method to analyze flow patterns of organic components in ecological systems has had wide application in ecological modeling. Through this approach, an infinite number of food web flows describing the food web and satisfying biological constraints are generated, from which one

  9. Programming Collective Intelligence Building Smart Web 2.0 Applications

    CERN Document Server

    Segaran, Toby

    2008-01-01

    This fascinating book demonstrates how you can build web applications to mine the enormous amount of data created by people on the Internet. With the sophisticated algorithms in this book, you can write smart programs to access interesting datasets from other web sites, collect data from users of your own applications, and analyze and understand the data once you've found it.

  10. Surveying the Commons: Current Implementation of Information Commons Web sites

    Science.gov (United States)

    Leeder, Christopher

    2009-01-01

    This study assessed the content of 72 academic library Information Commons (IC) Web sites using content analysis, quantitative assessment and qualitative surveys of site administrators to analyze current implementation by the academic library community. Results show that IC Web sites vary widely in content, design and functionality, with few…

  11. Harnessing the Deep Web: Present and Future

    CERN Document Server

    Madhavan, Jayant; Antova, Lyublena; Halevy, Alon

    2009-01-01

    Over the past few years, we have built a system that has exposed large volumes of Deep-Web content to Google.com users. The content that our system exposes contributes to more than 1000 search queries per-second and spans over 50 languages and hundreds of domains. The Deep Web has long been acknowledged to be a major source of structured data on the web, and hence accessing Deep-Web content has long been a problem of interest in the data management community. In this paper, we report on where we believe the Deep Web provides value and where it does not. We contrast two very different approaches to exposing Deep-Web content -- the surfacing approach that we used, and the virtual integration approach that has often been pursued in the data management literature. We emphasize where the values of each of the two approaches lie and caution against potential pitfalls. We outline important areas of future research and, in particular, emphasize the value that can be derived from analyzing large collections of potenti...

  12. Web users’ language utilization behaviors in China

    Institute of Scientific and Technical Information of China (English)

    LAI; Maosheng; QU; Peng; ZHAO; Kang

    2009-01-01

    The paper focuses on the language utilization habits of Chinese Web users when accessing the Web, and makes a general study of the basic nature of language phenomena in digital access. A questionnaire survey was formulated and distributed online for these research purposes, and 1,267 responses were collected. The data were analyzed with descriptive statistics, Chi-square testing and contingency table analyses. Results revealed the following findings. Tagging already plays an important role in Web 2.0 communication for Chinese Web users. Chinese users rely greatly on all kinds of taxonomies in browsing and are also aware of them in effective searching, implying that classified languages in the digital environment may serve Chinese Web users in a more satisfying manner. Highly subject-specific words, especially those from authorized tools, yielded better results in searching. Chinese users show high recognition of related terms. As to the demographic aspect, there is little difference between genders in the utilization of information retrieval languages; age may be a variable to a certain degree, and educational background has a complex effect on language utilization in searching. These findings characterize Chinese Web users’ behavior in digital information access, and can be valuable for the modeling and further refinement of digital access services.
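The Chi-square testing and contingency table analysis this abstract describes can be sketched with a hand-rolled Pearson statistic. The counts below are hypothetical, invented for illustration; they are not the survey's data.

```python
def chi_square(table):
    """Pearson chi-square statistic and degrees of freedom
    for an r x c contingency table of observed counts."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / total  # expected count under independence
            stat += (obs - exp) ** 2 / exp
    df = (len(rows) - 1) * (len(cols) - 1)
    return stat, df

# hypothetical counts: gender (rows) x preferred retrieval language (columns)
observed = [
    [120, 90, 40],
    [110, 95, 45],
]
stat, df = chi_square(observed)
```

Comparing `stat` against the chi-square distribution with `df` degrees of freedom then tells whether gender and language preference are plausibly independent, which is the kind of test the survey reports.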

  13. El web como sistema de información

    OpenAIRE

    Rodríguez Perojo, Keilyn

    2006-01-01

    The practical, theoretical, and historical antecedents necessary for the rise of a new area in information science are analyzed: information retrieval and the importance of the Web as a new space for human interaction with hypertextual information. The concepts of the surface Web and the deep Web are explained, together with some of the main search engines useful for exploring the deep Web, as well as new tools for information retrieval, such as text and data mining and the discovery of know-how...

  14. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  15. Spatial Text Visualization Using Automatic Typographic Maps.

    Science.gov (United States)

    Afzal, S; Maciejewski, R; Jang, Yun; Elmqvist, N; Ebert, D S

    2012-12-01

    We present a method for automatically building typographic maps that merge text and spatial data into a visual representation where text alone forms the graphical features. We further show how to use this approach to visualize spatial data such as traffic density, crime rate, or demographic data. The technique accepts a vector representation of a geographic map and spatializes the textual labels in the space onto polylines and polygons based on user-defined visual attributes and constraints. Our sample implementation runs as a Web service, spatializing shape files from the OpenStreetMap project into typographic maps for any region.
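The core spatialization step this abstract describes, distributing a label's characters along a map polyline, can be sketched as follows. This is a simplified illustration of the idea, not the authors' Web service; the function name and coordinates are invented.

```python
import math

def spatialize_label(polyline, text):
    """Place each character of `text` evenly along a polyline, returning
    (char, x, y, angle_in_degrees) tuples -- the basic step of rendering
    a street name as the street itself in a typographic map."""
    # precompute each segment's start distance and length
    seg = []
    total = 0.0
    for (x0, y0), (x1, y1) in zip(polyline, polyline[1:]):
        d = math.hypot(x1 - x0, y1 - y0)
        seg.append((total, d, (x0, y0), (x1, y1)))
        total += d
    placements = []
    for i, ch in enumerate(text):
        # target arc-length position of character i (evenly spaced, centered)
        t = (i + 0.5) / len(text) * total
        for start, d, (x0, y0), (x1, y1) in seg:
            if start <= t <= start + d:
                f = (t - start) / d
                x = x0 + f * (x1 - x0)
                y = y0 + f * (y1 - y0)
                angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
                placements.append((ch, x, y, angle))
                break
    return placements

# hypothetical street segment: east for 10 units, then north for 5
pts = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0)]
chars = spatialize_label(pts, "MAIN ST")
```

A full implementation would additionally honor user-defined visual attributes and constraints (font size, overlap avoidance), which this sketch omits.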

  16. Analyzing the security posture of South African websites

    CSIR Research Space (South Africa)

    Mtsweni, Jabu, S

    2015-08-12

    Full Text Available. Relevant web-based vulnerabilities and security countermeasures were selected for the analysis. The results of the study suggest that most of the 70 South African websites analyzed are vulnerable to cross-site scripting and injection vulnerabilities...

  17. Watch out for superman: first visualize, then analyze.

    Science.gov (United States)

    Kozak, Marcin

    2012-01-01

    A visit from Superman shows why data visualization should come before data analysis. The Web extra is a dataset that comprises 100 observations of the quantitative variables y and x plus the qualitative variable group. When analyzed correctly, this dataset exhibits an interesting pattern.

  18. Isolation by distance, web service

    Directory of Open Access Journals (Sweden)

    Bohonak Andrew J

    2005-03-01

    Full Text Available Abstract. Background: The population genetic pattern known as "isolation by distance" results from spatially limited gene flow and is a commonly observed phenomenon in natural populations. However, few software programs exist for estimating the degree of isolation by distance among populations, and they tend not to be user-friendly. Results: We have created the Isolation by Distance Web Service (IBDWS), a user-friendly web interface for determining patterns of isolation by distance. Using this site, population geneticists can perform a variety of powerful statistical tests, including Mantel tests and Reduced Major Axis (RMA) regression analysis, as well as calculate FST between all pairs of populations and compute basic summary statistics (e.g., heterozygosity). All statistical results, including publication-quality scatter plots in PostScript format, are returned rapidly to the user and can be easily downloaded. Conclusion: IBDWS population genetics analysis software is hosted at http://phage.sdsu.edu/~jensen/ and documentation is available at http://www.bio.sdsu.edu/pub/andy/IBD.html. The source code has been made available on SourceForge at http://sourceforge.net/projects/ibdws/.
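The Mantel test at the heart of IBDWS correlates a geographic distance matrix with a genetic distance matrix and judges significance by permutation. The pure-Python sketch below illustrates the idea only; the matrices, seed, and permutation count are invented for the example and this is not the service's implementation.

```python
import math
import random

def mantel(dist_a, dist_b, permutations=999, seed=0):
    """Mantel test: Pearson correlation between the upper triangles of two
    symmetric distance matrices, with a one-tailed p-value estimated by
    randomly permuting the rows/columns of the first matrix."""
    n = len(dist_a)
    idx = [(i, j) for i in range(n) for j in range(i + 1, n)]

    def flatten(m, order):
        return [m[order[i]][order[j]] for i, j in idx]

    def pearson(xs, ys):
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    base = list(range(n))
    a, b = flatten(dist_a, base), flatten(dist_b, base)
    r_obs = pearson(a, b)
    rng = random.Random(seed)
    hits = 0
    for _ in range(permutations):
        order = base[:]
        rng.shuffle(order)  # permute one matrix's labels, keep the other fixed
        if pearson(flatten(dist_a, order), b) >= r_obs:
            hits += 1
    p = (hits + 1) / (permutations + 1)
    return r_obs, p

# hypothetical data: five populations along a line; genetic distance
# is exactly proportional to geographic distance (perfect IBD signal)
sites = (0.0, 1.0, 3.0, 6.0, 10.0)
geo = [[abs(a - b) for b in sites] for a in sites]
gen = [[2.0 * d for d in row] for row in geo]
r, p = mantel(geo, gen)
```

Under a perfect isolation-by-distance signal the observed correlation is 1 and only permutations reproducing the original labeling can match it, so the p-value stays small.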

  19. Analyzing the user behavior towards Electronic Commerce stimuli

    Directory of Open Access Journals (Sweden)

    Carlota Lorenzo-Romero

    2016-11-01

    Full Text Available. Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as utilitarian stimulus) versus nonverbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research is to analyze the impact of these web technologies (which constitute the web atmosphere or webmosphere of a website) on shopping behavior, i.e., users’ internal states (affective, cognitive, and satisfaction) and behavioral responses (approach responses and real shopping outcomes), within the retail online store created by computer, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 (free versus hierarchical navigational structure) × 2 (on versus off music) × 2 (moving versus static images) between-subjects computer experimental design is used to test the research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As its main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to navigate the online stores freely and their experience is enriched by animate gifts and a music background. The effect caused by mediator variables relatively modifies the final shopping behavior.

  20. Analyzing the User Behavior toward Electronic Commerce Stimuli.

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-Del-Amo, María-Del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research consists in analyzing the impact of these web technologies -which constitute the web atmosphere or webmosphere of a website- on shopping human behavior (i.e., users' internal states -affective, cognitive, and satisfaction- and behavioral responses - approach responses, and real shopping outcomes-) within the retail online store created by computer, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test empirically this research. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animate gifts and music background. The effect caused by mediator variables modifies relatively the final shopping human behavior.