WorldWideScience

Sample records for web analyzing spatial

  1. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract: The goal of this work was to create a prototype analyzer for injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. The analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers, and a description and justification of the selected implementation. In the end are charact...
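
    The signature-based detection described above can be illustrated with a short, hypothetical sketch (not the thesis prototype): a few regular-expression signatures for common injection attacks are applied to web server access-log lines. The signature patterns and log format are illustrative assumptions:

      import re

      # Hypothetical injection-attack signatures (illustrative, not the prototype's rule set).
      SIGNATURES = {
          "sql_injection": re.compile(r"(union\s+select|or\s+1=1|sleep\s*\()", re.IGNORECASE),
          "xss": re.compile(r"(<script\b|javascript:)", re.IGNORECASE),
          "path_traversal": re.compile(r"\.\./"),
      }

      def scan_log_line(line):
          """Return the names of all signatures that match a single access-log line."""
          return [name for name, pattern in SIGNATURES.items() if pattern.search(line)]

      if __name__ == "__main__":
          sample = '10.0.0.5 - - [01/Mar/2013] "GET /item.php?id=1 UNION SELECT password FROM users HTTP/1.1" 200'
          print(scan_log_line(sample))   # -> ['sql_injection']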

  2. Analyzing Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M.-Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2007-01-01

    Web services should be dependable, because businesses rely on them. For that purpose the Service Oriented Architecture has standardized specifications at a syntactical level. In this paper, we demonstrate how such specifications are used to derive semantic models in the form of (timed) automata...

  3. Elements of a Spatial Web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2010-01-01

    Driven by factors such as the increasingly mobile use of the web and the proliferation of geo-positioning technologies, the web is rapidly acquiring a spatial aspect. Specifically, content and users are being geo-tagged, and services are being developed that exploit these tags. The research community is hard at work inventing means of efficiently supporting new spatial query functionality. Points of interest with a web presence, called spatial web objects, have a location as well as a textual description. Spatio-textual queries return such objects that are near a location argument and are relevant to a text argument. An important element in enabling such queries is to be able to rank spatial web objects. Another is to be able to determine the relevance of an object to a query. Yet another is to enable the efficient processing of such queries. The talk covers recent results on spatial web...

  4. Analyzing Web Behavior in Indoor Retail Spaces

    OpenAIRE

    Ren, Yongli; Tomko, Martin; Salim, Flora; Ong, Kevin; Sanderson, Mark

    2015-01-01

    We analyze 18 million rows of Wi-Fi access logs collected over a one year period from over 120,000 anonymized users at an inner-city shopping mall. The anonymized dataset gathered from an opt-in system provides users' approximate physical location, as well as Web browsing and some search history. Such data provides a unique opportunity to analyze the interaction between people's behavior in physical retail spaces and their Web behavior, serving as a proxy to their information needs. We find: ...

  5. Data management on the spatial web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2012-01-01

    Due in part to the increasing mobile use of the web and the proliferation of geo-positioning, the web is fast acquiring a significant spatial aspect. Content and users are being augmented with locations that are used increasingly by location-based services. Studies suggest that each week, several billion web queries are issued that have local intent and target spatial web objects. These are points of interest with a web presence, and they thus have locations as well as textual descriptions. This development has given prominence to spatial web data management, an area ripe with new and exciting opportunities and challenges. The research community has embarked on inventing and supporting new query functionality for the spatial web. Different kinds of spatial web queries return objects that are near a location argument and are relevant to a text argument. To support such queries, it is important...

  6. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same
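
    As a rough illustration of analysis capabilities (1) and (3) listed above, the sketch below computes a seasonal mean and the correlation between two variables from monthly time series. It uses plain NumPy and synthetic data and makes no assumptions about CMDA's actual implementation:

      import numpy as np

      rng = np.random.default_rng(0)
      months = np.arange(120) % 12 + 1          # ten years of monthly data (1..12)
      temperature = 15 + 10 * np.sin(2 * np.pi * (months - 1) / 12) + rng.normal(0, 1, 120)
      humidity = 60 - 0.8 * temperature + rng.normal(0, 2, 120)

      # (1) Seasonal mean: average over June-July-August of every year.
      jja_mean = temperature[np.isin(months, [6, 7, 8])].mean()

      # (3) Correlation between two variables over the full record.
      correlation = np.corrcoef(temperature, humidity)[0, 1]

      print(f"JJA mean temperature: {jja_mean:.2f}")
      print(f"temperature-humidity correlation: {correlation:.2f}")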

  7. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the
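
    One ingredient of the diagnostic methodology named above, the time-lagged correlation, can be sketched as follows; this is a generic NumPy illustration with synthetic series, not CMDA's own code:

      import numpy as np

      def lagged_correlation(x, y, max_lag):
          """Pearson correlation between x(t) and y(t + lag) for lags -max_lag..max_lag."""
          out = {}
          for lag in range(-max_lag, max_lag + 1):
              if lag < 0:
                  a, b = x[-lag:], y[:lag]
              elif lag > 0:
                  a, b = x[:-lag], y[lag:]
              else:
                  a, b = x, y
              out[lag] = np.corrcoef(a, b)[0, 1]
          return out

      rng = np.random.default_rng(1)
      driver = rng.normal(size=200)
      response = np.roll(driver, 3) + 0.3 * rng.normal(size=200)   # responds 3 steps later
      corrs = lagged_correlation(driver, response, max_lag=6)
      print(max(corrs, key=corrs.get))   # strongest correlation near lag = 3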

  8. APPROACHES TO ANALYZE THE QUALITY OF ROMANIAN TOURISM WEB SITES

    Directory of Open Access Journals (Sweden)

    Lacurezeanu Ramona

    2013-07-01

    The purpose of our work is to analyze travel web-sites, more precisely, to determine whether the criteria used to analyze virtual stores are also adequate for the Romanian tourism product. Following the study, we concluded that Romanian online tourism web-sites for the Romanian market have the features that we found listed on similar web-sites from France, England, Germany, etc. In conclusion, online Romanian tourism can be considered one of the factors of economic growth.

  9. A WebGIS-based system for analyzing and visualizing air quality data for Shanghai Municipality

    Science.gov (United States)

    Wang, Manyi; Liu, Chaoshun; Gao, Wei

    2014-10-01

    An online visual analytical system based on Java Web and WebGIS for air quality data for Shanghai Municipality was designed and implemented to quantitatively analyze and qualitatively visualize air quality data. By analyzing the architecture of WebGIS and Java Web, we first designed the overall scheme for the system architecture, then put forward the software and hardware environment and determined the main function modules for the system. The visual system was ultimately established with the DIV + CSS layout method combined with JSP, JavaScript, and other programming languages based on the Java programming environment. Moreover, the Struts, Spring, and Hibernate frameworks (SSH) were integrated in the system for easy maintenance and expansion. To provide mapping services and spatial analysis functions, we selected ArcGIS for Server as the GIS server. We also used an Oracle database and an ESRI file geodatabase to store spatial and non-spatial data in order to ensure data security. In addition, the response data from the Web server are resampled to enable rapid visualization in the browser. The experiments indicate that this system can quickly respond to users' requests and efficiently return accurate processing results.

  10. TOWARD SEMANTIC WEB INFRASTRUCTURE FOR SPATIAL FEATURES' INFORMATION

    Directory of Open Access Journals (Sweden)

    R. Arabsheibani

    2015-12-01

    Full Text Available The Web and its capabilities can be employed as a tool for data and information integration if comprehensive datasets, appropriate technologies, and standards enable the web to interpret and easily align data and information. The Semantic Web, along with spatial functionalities, enables the web to deal with huge amounts of data and information. The present study investigates the advantages and limitations of the Spatial Semantic Web and compares its capabilities with relational models in order to build a spatial data infrastructure. An architecture is proposed and a set of criteria is defined for the efficiency evaluation. The results demonstrate that when using data with special characteristics such as schema dynamicity, sparse data, or available relations between the features, the Spatial Semantic Web and graph databases with spatial operations are preferable.

  11. Spatial Data Web Services Pricing Model Infrastructure

    Science.gov (United States)

    Ozmus, L.; Erkek, B.; Colak, S.; Cankurt, I.; Bakıcı, S.

    2013-08-01

    The General Directorate of Land Registry and Cadastre (TKGM), the leader in the field of cartography, largely continues its missions, which are to keep and update the land registry and cadastre system of the country under the responsibility of the treasury, to perform transactions related to real estate, and to establish the Turkish national spatial information system. TKGM, a public agency, has completed many projects, such as Continuously Operating GPS Reference Stations (TUSAGA-Aktif), the Geo-Metadata Portal (HBB), Orthophoto-Base Map Production and web services, Completion of Initial Cadastre, the Cadastral Renovation Project (TKMP), the Land Registry and Cadastre Information System (TAKBIS), the Turkish National Spatial Data Infrastructure Project (TNSDI), and the Ottoman Land Registry Archive Information System (TARBIS). TKGM provides updated maps and map information not only to public institutions but also to the wider society, in the name of social responsibility principles. Turkish National Spatial Data Infrastructure activities were started under the motivation of Circular No. 2003/48, declared by the Turkish Prime Ministry in 2003 within the context of the e-Transformation of Turkey Short-term Action Plan. Action No. 47 in that action plan states that "A Feasibility Study shall be made in order to establish the Turkish National Spatial Data Infrastructure", a responsibility given to the General Directorate of Land Registry and Cadastre. The feasibility report of the NSDI was completed on 10 December 2010. After the decision of the Steering Committee, the feasibility report was sent to the Development Bank (old name State Planning Organization) for further evaluation. There are two main arrangements related to this project (feasibility report). First, there is now only one Ministry, the Ministry of Environment and Urbanism, responsible for the establishment, operation, and all national-level activities of the NSDI. The second arrangement is related to the institutional level. The

  12. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    Science.gov (United States)

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…
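
    A minimal sketch of the kind of server-log processing the article's two tiers rely on: parsing Common Log Format lines and tallying page views (a library-use measure) and status codes (a maintenance measure). The log format and counters are generic assumptions, not the article's own tooling:

      import re
      from collections import Counter

      # Common Log Format: host ident user [time] "request" status bytes
      LOG_PATTERN = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3}) \S+')

      def summarize(log_lines):
          page_views, status_codes = Counter(), Counter()
          for line in log_lines:
              match = LOG_PATTERN.match(line)
              if match:
                  page_views[match.group(1)] += 1     # tier 1: which pages are used
                  status_codes[match.group(2)] += 1   # tier 2: server health (404s, 500s)
          return page_views, status_codes

      sample = [
          '192.0.2.1 - - [01/Jan/2003:10:00:00 +0000] "GET /catalog.html HTTP/1.0" 200 5120',
          '192.0.2.2 - - [01/Jan/2003:10:01:00 +0000] "GET /missing.html HTTP/1.0" 404 312',
      ]
      views, statuses = summarize(sample)
      print(views.most_common(1), dict(statuses))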

  13. A WebQuest for Spatial Skills

    Science.gov (United States)

    Wood, Pamela L.; Quitadamo, Ian J.; DePaepe, James L.; Loverro, Ian

    2007-01-01

    The WebQuest is a four-step process integrated at appropriate points in the Animal Studies unit. Through the WebQuest, students create a series of habitat maps that build on the knowledge gained from conducting the various activities of the unit. The quest concludes with an evaluation using the WebQuest rubric and an oral presentation of a final…

  14. WEB DATA ANALYZER COMPONENT FOR IE, CASE STUDY: FAVORITE WEB ACCESSES AT THE IBS LABORATORY, INFORMATICS ENGINEERING - ITS

    Directory of Open Access Journals (Sweden)

    Darlis Heru Murti

    2005-07-01

    Explorer. To this end, this research involved designing and building a Web Data Analyzer software component that attaches to the Internet Explorer browser to find users' favorite web accesses. Testing and evaluation in this research were carried out by installing the Web Data Analyzer component on a number of workstations in the IBS Laboratory of Informatics Engineering, ITS. The test results show that the Web Data Analyzer component is able to monitor and analyze users' browsing-activity data and to automatically populate the Internet Explorer Favorites feature from the browsing-activity data stored on the database server. Keywords: band object, explorer bar, browser helper object (BHO), HTTP analyzer.

  15. Analyzing Web pages visual scanpaths: between and within tasks variability.

    Science.gov (United States)

    Drusch, Gautier; Bastien, J M Christian

    2012-01-01

    In this paper, we propose a new method for comparing scanpaths in a bottom-up approach, and a test of the scanpath theory. To do so, we conducted a laboratory experiment in which 113 participants were invited to accomplish a set of tasks on two different websites. For each site, they had to perform two tasks that had to be repeated once. The data were analyzed using a procedure similar to the one used by Duchowski et al. [8]. The first step was to automatically identify, then label, AOIs with the mean-shift clustering procedure [19]. Then, scanpaths were compared two by two with a modified version of the string-edit method, which takes into account the order in which AOIs are visited [2]. Our results show that scanpath variability between tasks but within participants seems to be lower than the variability within a task for a given participant. In other words, participants seem to be more coherent when they perform different tasks than when they repeat the same task. In addition, participants view more of the same AOIs when they perform a different task on the same Web page than when they repeat the same task. These results are quite different from what the scanpath theory predicts.
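
    The two analysis steps described above (mean-shift clustering of fixations into AOIs, then string-edit comparison of AOI sequences) can be sketched roughly as follows. This is a generic illustration with made-up fixation coordinates and a plain Levenshtein distance, not the authors' modified string-edit procedure:

      import numpy as np
      from sklearn.cluster import MeanShift

      # Step 1: label AOIs by clustering fixation coordinates (x, y) with mean shift.
      fixations = np.array([[100, 120], [105, 118], [400, 300], [398, 305], [102, 125]])
      aoi_labels = MeanShift(bandwidth=50).fit_predict(fixations)   # e.g. [0, 0, 1, 1, 0]

      # Step 2: compare two scanpaths (sequences of AOI labels) with a string-edit distance.
      def edit_distance(a, b):
          d = np.zeros((len(a) + 1, len(b) + 1), dtype=int)
          d[:, 0] = np.arange(len(a) + 1)
          d[0, :] = np.arange(len(b) + 1)
          for i in range(1, len(a) + 1):
              for j in range(1, len(b) + 1):
                  cost = 0 if a[i - 1] == b[j - 1] else 1
                  d[i, j] = min(d[i - 1, j] + 1, d[i, j - 1] + 1, d[i - 1, j - 1] + cost)
          return d[len(a), len(b)]

      scanpath_1 = "".join(str(label) for label in aoi_labels)
      scanpath_2 = "00110"
      print(edit_distance(scanpath_1, scanpath_2))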

  16. DESIGN FOR CONNECTING SPATIAL DATA INFRASTRUCTURES WITH SENSOR WEB (SENSDI)

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2016-06-01

    Full Text Available Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is about research to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open source SDI, match the APIs and functions between the Sensor Web and the SDI, and carry out case studies such as hazard applications, urban applications, etc. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS, Metadata and the 'Sensor Observation Service' (SOS); the 'Sensor Planning Service' (SPS); the 'Sensor Alert Service' (SAS); and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS). Hence, in conclusion, it is of importance to geospatial studies to integrate the SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.

  17. Importance of the spatial data and the sensor web in the ubiquitous computing area

    Science.gov (United States)

    Akçit, Nuhcan; Tomur, Emrah; Karslıoǧlu, Mahmut O.

    2014-08-01

    Spatial data has become a critical issue in recent years. In past years, more than three quarters of databases were related directly or indirectly to locations referring to physical features, which constitute the relevant aspects. Spatial data is necessary to identify or calculate the relationships between spatial objects when using spatial operators in programs or portals. Originally, calculations were conducted using Geographic Information System (GIS) programs on local computers. Subsequently, through the Internet, they formed a geospatial web, which is integrated into a discoverable collection of geographically related web standards and key features, and constitutes a global network of geospatial data that employs the World Wide Web to process textual data. In addition, the geospatial web is used to gather spatial data producers, resources, and users. Standards also constitute a critical dimension in further globalizing the idea of the geospatial web. The sensor web is an example of the real-time services that the geospatial web can provide. Sensors around the world collect numerous types of data. The sensor web is a type of sensor network that is used for visualizing, calculating, and analyzing collected sensor data. Today, people use smart devices and systems more frequently because of the evolution of technology, and often have more than one mobile device. The considerable number of sensors and the different types of data positioned around the world have driven the production of interoperable and platform-independent sensor web portals. The focus of such production has been on further developing the idea of an interoperable and interdependent sensor web of all devices that share and collect information. The other pivotal idea consists of encouraging people to use and send data voluntarily for numerous purposes, with some level of credibility. The principal goal is to connect mobile and non-mobile devices in the sensor web platform together to

  18. Spatial and social connectedness in web-based work collaboration

    NARCIS (Netherlands)

    Handberg, L.; Gullström, C.; Kort, J.; Nyström, J.

    2016-01-01

    The work presented here seeks an integration of spatial and social features supporting shared activities, and engages users in multiple locations to manipulate realtime video-streams. Standard and easily available equipment is used together with the communication standard WebRTC. It adds a spatial

  19. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

    ...processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside...

  20. Food-web structure of seagrass communities across different spatial scales and human impacts.

    Science.gov (United States)

    Coll, Marta; Schmidt, Allison; Romanuk, Tamara; Lotze, Heike K

    2011-01-01

    Seagrass beds provide important habitat for a wide range of marine species but are threatened by multiple human impacts in coastal waters. Although seagrass communities have been well-studied in the field, a quantification of their food-web structure and functioning, and how these change across space and human impacts has been lacking. Motivated by extensive field surveys and literature information, we analyzed the structural features of food webs associated with Zostera marina across 16 study sites in 3 provinces in Atlantic Canada. Our goals were to (i) quantify differences in food-web structure across local and regional scales and human impacts, (ii) assess the robustness of seagrass webs to simulated species loss, and (iii) compare food-web structure in temperate Atlantic seagrass beds with those of other aquatic ecosystems. We constructed individual food webs for each study site and cumulative webs for each province and the entire region based on presence/absence of species, and calculated 16 structural properties for each web. Our results indicate that food-web structure was similar among low impact sites across regions. With increasing human impacts associated with eutrophication, however, food-web structure shows evidence of degradation as indicated by fewer trophic groups, lower maximum trophic level of the highest top predator, fewer trophic links connecting top to basal species, higher fractions of herbivores and intermediate consumers, and higher number of prey per species. These structural changes translate into functional changes with impacted sites being less robust to simulated species loss. Temperate Atlantic seagrass webs are similar to a tropical seagrass web, yet differed from other aquatic webs, suggesting consistent food-web characteristics across seagrass ecosystems in different regions. Our study illustrates that food-web structure and functioning of seagrass habitats change with human impacts and that the spatial scale of food-web analysis
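
    A few of the structural properties and the species-loss robustness simulation mentioned above can be approximated with a small networkx sketch; the toy web and the particular metrics chosen here are illustrative and not the authors' 16-property protocol:

      import random
      import networkx as nx

      # Toy food web: edges point from prey to predator.
      web = nx.DiGraph([
          ("algae", "snail"), ("algae", "amphipod"),
          ("snail", "crab"), ("amphipod", "crab"),
          ("crab", "fish"), ("amphipod", "fish"),
      ])

      links_per_species = web.number_of_edges() / web.number_of_nodes()
      basal = [n for n in web if web.in_degree(n) == 0]      # primary producers
      top = [n for n in web if web.out_degree(n) == 0]       # top predators
      print(f"L/S = {links_per_species:.2f}, basal = {basal}, top = {top}")

      # Robustness: random species removals until the top predator loses all paths to a basal species.
      def robustness(graph, top_predator, trials=200):
          removed_counts = []
          for _ in range(trials):
              g = graph.copy()
              candidates = [n for n in g if n != top_predator]
              random.shuffle(candidates)
              removed = 0
              while any(nx.has_path(g, b, top_predator) for b in basal if b in g):
                  g.remove_node(candidates[removed])
                  removed += 1
              removed_counts.append(removed)
          return sum(removed_counts) / trials

      print(f"mean removals before collapse: {robustness(web, 'fish'):.1f}")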

  1. Food-web structure of seagrass communities across different spatial scales and human impacts.

    Directory of Open Access Journals (Sweden)

    Marta Coll

    Full Text Available Seagrass beds provide important habitat for a wide range of marine species but are threatened by multiple human impacts in coastal waters. Although seagrass communities have been well-studied in the field, a quantification of their food-web structure and functioning, and how these change across space and human impacts has been lacking. Motivated by extensive field surveys and literature information, we analyzed the structural features of food webs associated with Zostera marina across 16 study sites in 3 provinces in Atlantic Canada. Our goals were to (i) quantify differences in food-web structure across local and regional scales and human impacts, (ii) assess the robustness of seagrass webs to simulated species loss, and (iii) compare food-web structure in temperate Atlantic seagrass beds with those of other aquatic ecosystems. We constructed individual food webs for each study site and cumulative webs for each province and the entire region based on presence/absence of species, and calculated 16 structural properties for each web. Our results indicate that food-web structure was similar among low impact sites across regions. With increasing human impacts associated with eutrophication, however, food-web structure shows evidence of degradation as indicated by fewer trophic groups, lower maximum trophic level of the highest top predator, fewer trophic links connecting top to basal species, higher fractions of herbivores and intermediate consumers, and higher number of prey per species. These structural changes translate into functional changes with impacted sites being less robust to simulated species loss. Temperate Atlantic seagrass webs are similar to a tropical seagrass web, yet differed from other aquatic webs, suggesting consistent food-web characteristics across seagrass ecosystems in different regions. Our study illustrates that food-web structure and functioning of seagrass habitats change with human impacts and that the spatial scale of

  2. Analyzing the Web Services and UniFrame Paradigms

    Science.gov (United States)

    2003-04-01

    Web Service) using SOAP. 2.1.2 Business-To-Business (B2B) Solutions. The Internet has given birth to a "digital economy" [5]. In such an economy... Jersey 07458 [5] Dhingra, V., "Business-to-Business Ecommerce," http://projects.bus.lsu.edu/independent_study/vdhing1/b2b. [6] A Darwin Partners and

  3. Analyzing Spatial Factors in Crime-inducing Culture in Chaloos

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Omidi Noghlehbari

    2017-02-01

    Full Text Available Feelings of security in urban areas are one of the qualitative criteria of living space. With the rise in urbanization and increasing abnormal urban behavior, especially offenses, this issue has become of great importance. This study aims to identify and analyze the effects of spatial factors on crime-inducing culture in affluent and non-affluent neighborhoods of Chaloos. The method used in this study is analytical-descriptive, and the study is applied in terms of its objective. In order to study and understand the status of the physical-spatial structures of affluent and non-affluent neighborhoods in Chaloos, we used a field-study method. Data were collected through interviews, note-taking, and questionnaires. The results indicate that in affluent neighborhoods all criteria components of spatial differences are above the mean (mean = 3), but in non-affluent neighborhoods all components are lower than the mean. In addition, there is a significant relationship between the criteria components of spatial differences and the formation of crime-inducing culture (except for the diagnosis component) in the affluent and non-affluent neighborhoods.

  4. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    de Raad, Markus [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Rond, Tristan [Univ. of California, Berkeley, CA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Keasling, Jay D. [Univ. of California, Berkeley, CA (United States); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Technical Univ. of Denmark, Lyngby (Denmark); Northen, Trent R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Bowen, Benjamin P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States)

    2017-05-03

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well-suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. In this paper, we present the OpenMSI Arrayed Analysis Toolkit (OMAAT), a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different-sized soil particles, including hydrolysis of sugars, revealing a pattern of size-dependent activities. Finally, these results introduce OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.
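
    The arrayed-sample idea above (known spot positions on a regular grid, one intensity readout per spot) can be illustrated with a small NumPy sketch; the grid geometry and image are synthetic, and this is not OMAAT's spot-finding algorithm:

      import numpy as np

      # Synthetic 2D ion image and an assumed 4 x 6 spot array at 45-pixel pitch.
      rng = np.random.default_rng(2)
      image = rng.poisson(5, size=(250, 320)).astype(float)
      rows, cols, pitch, origin, radius = 4, 6, 45, (30, 30), 10

      def spot_means(img, rows, cols, pitch, origin, radius):
          """Mean intensity inside a square window around each expected spot centre."""
          means = np.zeros((rows, cols))
          for r in range(rows):
              for c in range(cols):
                  y, x = origin[0] + r * pitch, origin[1] + c * pitch
                  window = img[y - radius:y + radius + 1, x - radius:x + radius + 1]
                  means[r, c] = window.mean()
          return means

      print(spot_means(image, rows, cols, pitch, origin, radius).round(2))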

  5. Reconstruction of paleoenvironments by analyzing spatial shell orientation

    Science.gov (United States)

    Lukeneder, Susanne; Lukeneder, Alexander; Weber, Gerhard W.; Exner, Ulrike

    2013-04-01

    one side of the shell (transverse axis) was measured (landmark s & c). Spatial orientation was characterized by dip and dip direction of the longitudinal axis, as well as by strike and azimuth of a plane defined by both axes. The exact spatial orientation data were determined for a sample of 699 ammonoids within the bed and statistically analyzed. The results provide hints about the geodynamic processes (paleocurrents), the depositional conditions (allochthonous or autochthonous), and other general information about the ancient environment. The method can be adapted for other mass-occurring fossils and thus represents a good template for studies of topographical paleoenvironmental factors. References: Flügel, E. 2004. Microfacies of carbonate rocks. Analysis, Interpretation and Application. Springer, Berlin Heidelberg New York, p.182. Lukeneder S., Lukeneder A., Harzhauser M., Islamoglu Y., Krystyn L., Lein R. 2012. A delayed carbonate factory breakdown during the Tethyan-wide Carnian Pluvial Episode along the Cimmerian terranes (Taurus, Turkey). Facies 58: 279-296.
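
    The orientation measures used above (dip and dip direction of the longitudinal shell axis) can be computed from two 3D landmark coordinates roughly as in the sketch below; the coordinate conventions (x east, y north, z up, angles in degrees) are assumptions for illustration, not the authors' measurement protocol:

      import numpy as np

      def axis_orientation(p_apex, p_aperture):
          """Dip (plunge below horizontal) and dip direction (azimuth) of the line
          from the apex landmark to the aperture landmark. Coordinates: x east, y north, z up."""
          dx, dy, dz = np.asarray(p_aperture, float) - np.asarray(p_apex, float)
          if dz > 0:                      # always measure the downward-pointing direction
              dx, dy, dz = -dx, -dy, -dz
          horizontal = np.hypot(dx, dy)
          dip = np.degrees(np.arctan2(-dz, horizontal))            # 0 = horizontal, 90 = vertical
          dip_direction = np.degrees(np.arctan2(dx, dy)) % 360.0   # clockwise from north
          return dip, dip_direction

      print(axis_orientation((0.0, 0.0, 0.0), (1.0, 1.0, -0.5)))   # ~19.5 deg dip toward ~045 deg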

  6. Web-Based Spatial Training Using Handheld Touch Screen Devices

    Science.gov (United States)

    Martin-Dorta, Norena; Saorin, Jose Luis; Contero, Manuel

    2011-01-01

    This paper attempts to harness the opportunities for mobility and the new user interfaces that handheld touch screen devices offer, in a non-formal learning context, with a view to developing spatial ability. This research has addressed two objectives: first, analyzing the effects that training can have on spatial visualisation using the…

  7. Visual Communication in Web Design - Analyzing Visual Communication in Web Design

    Science.gov (United States)

    Thorlacius, Lisbeth

    Web sites are rapidly becoming the preferred media choice for information search, company presentation, shopping, entertainment, education, and social contacts. And along with the various forms of communication that the Web offers the aesthetic aspects have begun to play an increasingly important role. However, studies in the design and the relevance of focusing on the aesthetic aspects in planning and using Web sites have only to a smaller degree been subject of theoretical reflection. For example, Miller (2000), Thorlacius (2001, 2002, 2005), Engholm (2002, 2003), and Beaird (2007) have been contributing to set a beginning agenda that address the aesthetic aspects. On the other hand, there is a considerable amount of literature addressing the theoretical and methodological aspects focusing on the technical and functional aspects. In this context it is the aim of this article to introduce a model for analysis of visual communication on websites.

  8. Analyzing Web Server Logs to Improve a Site's Usage. The Systems Librarian

    Science.gov (United States)

    Breeding, Marshall

    2005-01-01

    This column describes ways to streamline and optimize how a Web site works in order to improve both its usability and its visibility. The author explains how to analyze logs and other system data to measure the effectiveness of the Web site design and search engine.

  9. Web Platform for Sharing Spatial Data and Manipulating Them Online

    Science.gov (United States)

    Bachelet, Dominique; Comendant, Tosha; Strittholt, Jim

    2011-04-01

    To fill the need for readily accessible conservation-relevant spatial data sets, the Conservation Biology Institute (CBI) launched in 2010 a Web-based platform called Data Basin (http://www.databasin.org). It is the first custom application of ArcGIS technology, which provides Web access to free maps and imagery using the most current version of Environmental Systems Research Institute (ESRI; http://www.esri.com/) geographic information system (GIS) software, and its core functionality is being made freely available. Data Basin includes spatial data sets (Arc format shapefiles and grids, or layer packages) that can be biological (e.g., prairie dog range), physical (e.g., average summer temperature, 1950-2000), or socioeconomic (e.g., locations of Alaska oil and gas wells); based on observations as well as on simulation results; and of local to global relevance. They can be uploaded, downloaded, or simply visualized. Maps (overlays of multiple data sets) can be created and customized (e.g., western Massachusetts protected areas, time series of the Deep Water Horizon oil spill). Galleries are folders containing data sets and maps focusing on a theme (e.g., sea level rise projections for the Pacific Northwest region from the National Wildlife Federation, soil data sets for the conterminous United States).

  10. Analyzing the simplicial decomposition of spatial protein structures

    Directory of Open Access Journals (Sweden)

    Szabadka Zoltán

    2008-02-01

    Full Text Available Abstract. Background: The fast-growing Protein Data Bank today contains the three-dimensional descriptions of more than 45,000 protein and nucleic-acid structures. The large majority of the data in the PDB were measured by X-ray crystallography by thousands of researchers over millions of work-hours. Unfortunately, numerous structural errors, bad labels, missing atoms, and falsely identified chains and groups make the automated processing of this treasury of structural biological data difficult. Results: After performing a rigorous re-structuring of the whole PDB on a graph-theoretical basis, we created the RS-PDB (Rich-Structure PDB) database. Using this cleaned and repaired database, we defined simplicial complexes on the heavy atoms of the PDB and analyzed the tetrahedra for geometric properties. Conclusion: We have found surprisingly characteristic differences between simplices with atomic vertices of different types, and between the atomic neighborhoods – described also by simplices – of different ligand atoms in proteins.
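
    The simplicial (tetrahedral) decomposition analyzed in the paper can be reproduced in outline with SciPy's Delaunay triangulation on heavy-atom coordinates; the random points below stand in for PDB atoms, and the volume formula is the standard determinant expression rather than the authors' RS-PDB pipeline:

      import numpy as np
      from scipy.spatial import Delaunay

      rng = np.random.default_rng(3)
      atoms = rng.uniform(0, 20, size=(60, 3))        # stand-in for heavy-atom coordinates (angstroms)

      tri = Delaunay(atoms)                           # tetrahedra indexed into `atoms`
      tets = atoms[tri.simplices]                     # shape (n_tetrahedra, 4, 3)

      # Volume of each tetrahedron: |det(b - a, c - a, d - a)| / 6
      edges = tets[:, 1:, :] - tets[:, :1, :]
      volumes = np.abs(np.linalg.det(edges)) / 6.0

      print(f"{len(volumes)} tetrahedra, mean volume {volumes.mean():.2f} A^3")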

  11. Bad on the net, or bipolars' lives on the web: analyzing discussion web pages for individuals with bipolar affective disorder.

    Science.gov (United States)

    Latalova, Klara; Prasko, Jan; Kamaradova, Dana; Ivanova, Katerina; Jurickova, Lubica

    2014-01-01

    The main therapeutic approach in the treatment of bipolar affective disorder is the administration of drugs. The effectiveness of this approach can be increased by specific psychotherapeutic interventions. There is not much knowledge about self-help initiatives in this field. Anonymous internet communication may be beneficial, regardless of the fact that it is non-professional. It offers a chance to confide and share symptoms with other patients, to open up for persons with feelings of shame, and to obtain relevant information without having direct contact with an expert. A qualitative analysis of web discussions used by patients with bipolar disorder in the Czech language was performed. Using the key words "diskuze" (discussion), "maniodeprese" (manic depression) and "bipolární porucha" (bipolar disorder), 8 discussions were found, but only 3 of them were anonymous and non-professional. Individual discussion entries were analyzed for basic categories or subcategories, and these were subsequently assessed so that their relationships could be better understood. A total of 436 entries from 3 discussion web pages were analyzed. Subsequently, six categories were identified (participant, diagnosis, relationships, communication, topic and treatment), each having 5-12 subcategories. These were analyzed in terms of relationships and patterns. Czech discussion web pages for people suffering from bipolar disorder are a lively community of users supporting each other, which may be characterized as a compact body open to newcomers. They seem to fulfill patients' needs that are not fully met by health care services. The community also has a "self-cleaning" ability, effectively dealing with posts that are inappropriate, provocative, critical, aggressive or meaningless.

  12. Web-based, Interactive, Nuclear Reactor Transient Analyzer using LabVIEW and RELAP5 (ATHENA)

    International Nuclear Information System (INIS)

    Kim, K. D.; Chung, B. D.; Rizwan-uddin

    2006-01-01

    In nuclear engineering, large system analysis codes such as RELAP5, TRAC-M, etc. play an important role in evaluating a reactor system behavior during a wide range of transient conditions. One limitation that restricts their use on a wider scale is that these codes often have a complicated I/O structure. This has motivated the development of GUI tools for best estimate codes, such as SNAP and ViSA, etc. In addition to a user interface, a greater degree of freedom in simulation and analyses of nuclear transient phenomena can be achieved if computer codes and their outputs are accessible from anywhere through the web. Such a web-based interactive interface can be very useful for geographically distributed groups when there is a need to share real-time data. Using mostly off-the-shelf technology, such a capability - a web-based transient analyzer based on a best-estimate code - has been developed. Specifically, the widely used best-estimate code RELAP5 is linked with a graphical interface. Moreover, a capability to web-cast is also available. This has been achieved by using the LabVIEW virtual instruments (VIs). In addition to the graphical display of the results, interactive control functions have also been added that allow operator's actions as well as, if permitted, by a distant user through the web

  13. SWORS: a system for the efficient retrieval of relevant spatial web objects

    DEFF Research Database (Denmark)

    Cao, Xin; Cong, Gao; Jensen, Christian S.

    2012-01-01

    Spatial web objects that possess both a geographical location and a textual description are gaining in prevalence. This gives prominence to spatial keyword queries that exploit both location and textual arguments. Such queries are used in many web services such as yellow pages and map services...

  14. Spatial scales of carbon flow in a river food web

    Science.gov (United States)

    Finlay, J.C.; Khandwala, S.; Power, M.E.

    2002-01-01

    Spatial extents of food webs that support stream and river consumers are largely unknown, but such information is essential for basic understanding and management of lotic ecosystems. We used predictable variation in algal δ13C with water velocity, and measurements of consumer δ13C and δ15N to examine carbon flow and trophic structure in food webs of the South Fork Eel River in Northern California. Analyses of δ13C showed that the most abundant macroinvertebrate groups (collector-gatherers and scrapers) relied on algae from local sources within their riffle or shallow pool habitats. In contrast, filter-feeding invertebrates in riffles relied in part on algal production derived from upstream shallow pools. Riffle invertebrate predators also relied in part on consumers of pool-derived algal carbon. One abundant taxon drifting from shallow pools and riffles (baetid mayflies) relied on algal production derived from the habitats from which they dispersed. The trophic linkage from pool algae to riffle invertebrate predators was thus mediated through either predation on pool herbivores dispersing into riffles, or on filter feeders. Algal production in shallow pool habitats dominated the resource base of vertebrate predators in all habitats at the end of the summer. We could not distinguish between the trophic roles of riffle algae and terrestrial detritus, but both carbon sources appeared to play minor roles for vertebrate consumers. In shallow pools, small vertebrates, including three-spined stickleback (Gasterosteus aculeatus), roach (Hesperoleucas symmetricus), and rough-skinned newts (Taricha granulosa), relied on invertebrate prey derived from local pool habitats. During the most productive summer period, growth of all size classes of steelhead and resident rainbow trout (Oncorhynchus mykiss) in all habitats (shallow pools, riffles, and deep unproductive pools) was largely derived from algal production in shallow pools. Preliminary data suggest that the strong
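
    The kind of source attribution used in studies like this can be illustrated with the standard two-end-member δ13C mixing equation; the end-member values below are made up for the example and are not the paper's measurements:

      def algal_fraction(delta_consumer, delta_algae, delta_terrestrial, trophic_shift=0.4):
          """Fraction of consumer carbon derived from algae under a two-source mixing model.
          A small per-trophic-level 13C enrichment (trophic_shift, permil) is subtracted first."""
          corrected = delta_consumer - trophic_shift
          return (corrected - delta_terrestrial) / (delta_algae - delta_terrestrial)

      # Hypothetical values (permil): pool algae -32, terrestrial detritus -27, consumer -30.6.
      print(f"algal fraction: {algal_fraction(-30.6, -32.0, -27.0):.2f}")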

  15. Hydrological and Biogeochemical Controls on Seasonal and Spatial Differences in Food Webs in the Everglades

    Science.gov (United States)

    Kendall, C.; Wankel, S. D.; Bemis, B. E.; Rawlik, P. S.; Krabbenhoft, D. P.; Lange, T.

    2002-05-01

    Stable isotopes can be used to determine the relative trophic positions of biota within a food web, and to improve our understanding of the biomagnification of contaminants. Plants at the base of the food web take up dissolved inorganic carbon (DIC) and nitrogen (DIN) for growth, and their tissue reflects the isotopic composition of these sources. Animals then mirror the isotopic composition of the primary producers, as modified by consumer-diet fractionations at successive trophic steps. During 1995-99, we collected algae, macrophyte, invertebrate, and fish samples from 15 USGS sites in the Everglades and analyzed them for δ13C and δ15N with the goal of characterizing seasonal and spatial differences in food-web relations. Carbon isotopes effectively distinguish between two main types of food webs: ones where algae are the dominant base of the food web, which are characteristic of relatively pristine marsh sites with long hydroperiods, and ones where macrophyte debris appears to be a significant source of nutrients, which are apparently characteristic of shorter-hydroperiod sites and of nutrient-impacted marshes and canals. There is usually an inverse relation between the δ13C and δ15N of organisms over time, especially in more pristine environments, reflecting seasonal changes in the δ13C of DIC and the δ15N of DIN. The δ13C and δ15N of algae also show strong positive correlations with seasonal changes in water levels. This variability is substantially damped up the food chain, probably because of the longer integration times of animals vs. plants. We speculate that these seasonal shifts in water level result in changes in biogeochemical reactions and nutrient levels, with corresponding variations in the δ15N and δ13C of biota. For example, small changes in water level may change the balance of photosynthesis, bacterial respiration, and atmospheric exchange reactions that control the δ13C of DIC. Such changes will probably also affect the δ15N of dissolved inorganic N (DIN

  16. Earth Exploration Toolbook Workshops: Helping Teachers and Students Analyze Web-based Scientific Data

    Science.gov (United States)

    McAuliffe, C.; Ledley, T.; Dahlman, L.; Haddad, N.

    2007-12-01

    One of the challenges faced by Earth science teachers, particularly in K-12 settings, is that of connecting scientific research to classroom experiences. Helping teachers and students analyze Web-based scientific data is one way to bring scientific research to the classroom. The Earth Exploration Toolbook (EET) was developed as an online resource to accomplish precisely that. The EET consists of chapters containing step-by-step instructions for accessing Web-based scientific data and for using a software analysis tool to explore issues or concepts in science, technology, and mathematics. For example, in one EET chapter, users download Earthquake data from the USGS and bring it into a geographic information system (GIS), analyzing factors affecting the distribution of earthquakes. The goal of the EET Workshops project is to provide professional development that enables teachers to incorporate Web-based scientific data and analysis tools in ways that meet their curricular needs. In the EET Workshops project, Earth science teachers participate in a pair of workshops that are conducted in a combined teleconference and Web-conference format. In the first workshop, the EET Data Analysis Workshop, participants are introduced to the National Science Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). They also walk through an Earth Exploration Toolbook (EET) chapter and discuss ways to use Earth science datasets and tools with their students. In a follow-up second workshop, the EET Implementation Workshop, teachers share how they used these materials in the classroom by describing the projects and activities that they carried out with students. The EET Workshops project offers unique and effective professional development. Participants work at their own Internet-connected computers, and dial into a toll-free group teleconference for step-by-step facilitation and interaction. They also receive support via Elluminate, a Web

  17. Geo-communication, web-services, and spatial data infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2007-01-01

    The introduction of web-services as index-portals based on geo-information has changed the conditions for both content and form of geo-communication. A high number of players and interactions as well as a very high number of all kinds of information and combinations of these characterise web ... looks very complex, and it will get even more complex. Therefore, there is a strong need for theories and models that can describe this complex web in the SDI and geo-communication, consisting of active components, passive components, users, and information, in order to make it possible to handle...

  18. SMART CITIES INTELLIGENCE SYSTEM (SMACiSYS) INTEGRATING SENSOR WEB WITH SPATIAL DATA INFRASTRUCTURES (SENSDI)

    OpenAIRE

    D. Bhattacharya; M. Painho

    2017-01-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI), utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists...

  19. Efficient Top-k Locality Search for Co-located Spatial Web Objects

    DEFF Research Database (Denmark)

    Qu, Qiang; Liu, Siyuan; Yang, Bin

    2014-01-01

    In step with the web being used widely by mobile users, user location is becoming an essential signal in services, including local intent search. Given a large set of spatial web objects consisting of a geographical location and a textual description (e.g., online business directory entries of re...

  20. Spatially resolvable optical emission spectrometer for analyzing density uniformity of semiconductor process plasma

    International Nuclear Information System (INIS)

    Oh, Changhoon; Ryoo, Hoonchul; Lee, Hyungwoo; Hahn, Jae W.; Kim, Se-Yeon; Yi, Hun-Jung

    2010-01-01

    We proposed a spatially resolved optical emission spectrometer (SROES) for analyzing the uniformity of plasma density for semiconductor processes. To enhance the spatial resolution of the SROES, we constructed a SROES system using a series of lenses, apertures, and pinholes. We calculated the spatial resolution of the SROES for the variation of pinhole size, and our calculated results were in good agreement with the measured spatial variation of the constructed SROES. The performance of the SROES was also verified by detecting the correlation between the distribution of a fluorine radical in inductively coupled plasma etch process and the etch rate of a SiO2 film on a silicon wafer.

  1. Web-based GIS for spatial pattern detection: application to malaria incidence in Vietnam.

    Science.gov (United States)

    Bui, Thanh Quang; Pham, Hai Minh

    2016-01-01

    There is great concern about how to build an interoperable health information system of public health and health information technology within the development of a public information and health surveillance programme. Technically, some major issues remain regarding health data visualization, spatial processing of health data, health information dissemination, data sharing, and the access of local communities to health information. In combination with GIS, we propose a technical framework for web-based health data visualization and spatial analysis. Data were collected from open map servers and geocoded with the Open Data Kit package and data geocoding tools. The web-based system is designed based on open-source frameworks and libraries. The system provides a web-based analysis tool for pattern detection through three spatial tests: nearest neighbour, K function, and spatial autocorrelation. The result is a web-based GIS through which end users can detect disease patterns by selecting an area and spatial test parameters, and can contribute findings to managers and decision makers. The end users can be health practitioners, educators, local communities, health sector authorities, and decision makers. This web-based system allows for the improvement of health-related services for public sector users as well as citizens in a secure manner. The combination of spatial statistics and web-based GIS can be a solution that helps empower health practitioners in direct and specific intersectional actions, thus providing for better analysis, control, and decision-making.
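
    One of the three spatial tests named above, the nearest-neighbour test, can be sketched as a Clark-Evans ratio computed from case coordinates; the coordinates and study-area size are invented for the example, and this is not the system's own implementation:

      import numpy as np
      from scipy.spatial import cKDTree

      def clark_evans(points, study_area):
          """Clark-Evans nearest-neighbour ratio R: ~1 random, <1 clustered, >1 dispersed."""
          points = np.asarray(points, float)
          tree = cKDTree(points)
          distances, _ = tree.query(points, k=2)        # k=2: nearest neighbour besides the point itself
          observed_mean = distances[:, 1].mean()
          expected_mean = 0.5 / np.sqrt(len(points) / study_area)
          return observed_mean / expected_mean

      rng = np.random.default_rng(4)
      cluster = rng.normal(loc=[5, 5], scale=0.5, size=(50, 2))    # clustered disease cases (km)
      print(f"R = {clark_evans(cluster, study_area=100.0):.2f}")   # expected well below 1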

  2. Geo-communication and Web-based Spatial Data Infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2006-01-01

    ...-services. This paper discusses the relations between the different components of SDI and geo-communication as well as the impacts thereof. Discussed is also a model for the organization of the passive components of the infrastructure; i.e. legislation, collaboration, standards, models, specifications, web ...! Therefore there is a strong need for theories and models that can describe this complex web in the SDI and geo-communication, consisting of active components, passive components, users and information, in order to make it possible to handle the complexity and to give the necessary framework.

  3. Voids and the Cosmic Web: cosmic depression & spatial complexity

    NARCIS (Netherlands)

    van de Weygaert, Rien; Shandarin, S.; Saar, E.; Einasto, J.

    2016-01-01

    Voids form a prominent aspect of the Megaparsec distribution of galaxies and matter. Not only do they represent a key constituent of the Cosmic Web, they also are one of the cleanest probes and measures of global cosmological parameters. The shape and evolution of voids are highly sensitive to the

  4. A framework for efficient spatial web object retrieval

    DEFF Research Database (Denmark)

    Wu, Dingming; Cong, Gao; Jensen, Christian S.

    2012-01-01

    The conventional Internet is acquiring a geospatial dimension. Web documents are being geo-tagged and geo-referenced objects such as points of interest are being associated with descriptive text documents. The resulting fusion of geo-location and documents enables new kinds of queries that take...

  5. When the Web meets the cell: using personalized PageRank for analyzing protein interaction networks.

    Science.gov (United States)

    Iván, Gábor; Grolmusz, Vince

    2011-02-01

    An enormous and constantly increasing quantity of biological information is represented in metabolic and protein interaction network databases. Most of these data are freely accessible through large public depositories. The robust analysis of these resources requires novel technologies, which are being developed today. Here we demonstrate a technique, originating from the PageRank computation for the World Wide Web, for analyzing large interaction networks. The method is fast, scalable, and robust, and its capabilities are demonstrated on metabolic network data of the tuberculosis bacterium and the proteomics analysis of the blood of melanoma patients. The Perl script for computing the personalized PageRank in protein networks is available for non-profit research applications (together with sample input files) at the address: http://uratim.com/pp.zip.
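
    The personalized PageRank idea can be sketched with networkx, which accepts a personalization vector; the small protein-interaction toy graph and the chosen seed protein are illustrative, not the authors' Perl implementation:

      import networkx as nx

      # Toy protein-interaction network (undirected edges between interacting proteins).
      ppi = nx.Graph([
          ("P53", "MDM2"), ("P53", "BRCA1"), ("BRCA1", "RAD51"),
          ("MDM2", "UBC"), ("RAD51", "UBC"), ("P53", "ATM"),
      ])

      # Personalized PageRank: restart the random walk at the seed protein(s) of interest.
      seed = {node: (1.0 if node == "P53" else 0.0) for node in ppi}
      scores = nx.pagerank(ppi, alpha=0.85, personalization=seed)

      for protein, score in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{protein}: {score:.3f}")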

  6. SLDAssay: A software package and web tool for analyzing limiting dilution assays.

    Science.gov (United States)

    Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G

    2017-11-01

    Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness of fit p-values, and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of using exact methods are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
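
    A rough sense of the MLE that SLDAssay computes can be given by the standard single-hit Poisson model for serial limiting dilution data, maximized numerically; this Python sketch and its made-up well counts are a conceptual stand-in, not the SLDAssay R package:

      import numpy as np
      from scipy.optimize import minimize_scalar

      # One row per dilution level: cells per well, wells tested, wells positive (hypothetical data).
      cells_per_well = np.array([1e6, 2e5, 4e4, 8e3])
      n_wells = np.array([12, 12, 12, 12])
      n_positive = np.array([12, 10, 4, 1])

      def neg_log_likelihood(log_frequency):
          """Single-hit Poisson model: P(positive well) = 1 - exp(-f * cells_per_well),
          with f the frequency of target cells per input cell."""
          f = np.exp(log_frequency)
          p = 1.0 - np.exp(-f * cells_per_well)
          p = np.clip(p, 1e-12, 1 - 1e-12)
          return -np.sum(n_positive * np.log(p) + (n_wells - n_positive) * np.log(1.0 - p))

      fit = minimize_scalar(neg_log_likelihood, bounds=(-30, 0), method="bounded")
      print(f"estimated frequency: {np.exp(fit.x):.2e} per input cell")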

  7. SambVca 2. A Web Tool for Analyzing Catalytic Pockets with Topographic Steric Maps

    KAUST Repository

    Falivene, Laura; Credendino, Raffaele; Poater, Albert; Petta, Andrea; Serra, Luigi; Oliva, Romina; Scarano, Vittorio; Cavallo, Luigi

    2016-01-01

    Developing more efficient catalysts remains one of the primary targets of organometallic chemists. To accelerate reaching this goal, effective molecular descriptors and visualization tools can represent a remarkable aid. Here, we present a Web application for analyzing the catalytic pocket of metal complexes using topographic steric maps as a general and unbiased descriptor that is suitable for every class of catalysts. To show the broad applicability of our approach, we first compared the steric map of a series of transition metal complexes presenting popular mono-, di-, and tetracoordinated ligands and three classic zirconocenes. This comparative analysis highlighted similarities and differences between totally unrelated ligands. Then, we focused on a recently developed Fe(II) catalyst that is active in the asymmetric transfer hydrogenation of ketones and imines. Finally, we expand the scope of these tools to rationalize the inversion of enantioselectivity in enzymatic catalysis, achieved by point mutation of three amino acids of mononuclear p-hydroxymandelate synthase.
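
    The topographic steric maps in SambVca rest on the buried-volume idea; the sketch below estimates a percent buried volume (%Vbur) by Monte Carlo sampling a sphere around the metal centre and testing overlap with ligand-atom van der Waals spheres. The toy coordinates, radii, and sphere settings are assumptions for illustration, not the Web tool's algorithm or parameters:

      import numpy as np

      def percent_buried_volume(atom_xyz, atom_radii, sphere_radius=3.5, n_samples=200_000, seed=5):
          """Estimate %Vbur: share of a sphere centred on the metal (at the origin)
          that falls inside any ligand atom's van der Waals sphere."""
          rng = np.random.default_rng(seed)
          # Uniform points inside the probe sphere (direction times cube-root radial sampling).
          directions = rng.normal(size=(n_samples, 3))
          directions /= np.linalg.norm(directions, axis=1, keepdims=True)
          radii = sphere_radius * rng.random(n_samples) ** (1.0 / 3.0)
          points = directions * radii[:, None]

          distances = np.linalg.norm(points[:, None, :] - atom_xyz[None, :, :], axis=2)
          buried = (distances < atom_radii[None, :]).any(axis=1)
          return 100.0 * buried.mean()

      # Toy ligand: three carbon-like atoms near the metal at the origin (angstroms).
      ligand_xyz = np.array([[1.8, 0.0, 0.5], [0.0, 1.8, 0.5], [-1.8, 0.0, 0.5]])
      ligand_radii = np.array([1.7, 1.7, 1.7])
      print(f"%Vbur ~ {percent_buried_volume(ligand_xyz, ligand_radii):.1f}")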

  8. SambVca 2. A Web Tool for Analyzing Catalytic Pockets with Topographic Steric Maps

    KAUST Repository

    Falivene, Laura

    2016-06-27

    Developing more efficient catalysts remains one of the primary targets of organometallic chemists. To accelerate reaching this goal, effective molecular descriptors and visualization tools can represent a remarkable aid. Here, we present a Web application for analyzing the catalytic pocket of metal complexes using topographic steric maps as a general and unbiased descriptor that is suitable for every class of catalysts. To show the broad applicability of our approach, we first compared the steric map of a series of transition metal complexes presenting popular mono-, di-, and tetracoordinated ligands and three classic zirconocenes. This comparative analysis highlighted similarities and differences between totally unrelated ligands. Then, we focused on a recently developed Fe(II) catalyst that is active in the asymmetric transfer hydrogenation of ketones and imines. Finally, we expand the scope of these tools to rationalize the inversion of enantioselectivity in enzymatic catalysis, achieved by point mutation of three amino acids of mononuclear p-hydroxymandelate synthase.

  9. Spatial capture-recapture: a promising method for analyzing data collected using artificial cover objects

    Science.gov (United States)

    Sutherland, Chris; Munoz, David; Miller, David A.W.; Grant, Evan H. Campbell

    2016-01-01

    Spatial capture–recapture (SCR) is a relatively recent development in ecological statistics that provides a spatial context for estimating abundance and space use patterns, and improves inference about absolute population density. SCR has been applied to individual encounter data collected noninvasively using methods such as camera traps, hair snares, and scat surveys. Despite the widespread use of capture-based surveys to monitor amphibians and reptiles, there are few applications of SCR in the herpetological literature. We demonstrate the utility of the application of SCR for studies of reptiles and amphibians by analyzing capture–recapture data from Red-Backed Salamanders, Plethodon cinereus, collected using artificial cover boards. Using SCR to analyze spatial encounter histories of marked individuals, we found evidence that density differed little among four sites within the same forest (on average, 1.59 salamanders/m2) and that salamander detection probability peaked in early October (Julian day 278) reflecting expected surface activity patterns of the species. The spatial scale of detectability, a measure of space use, indicates that the home range size for this population of Red-Backed Salamanders in autumn was 16.89 m2. Surveying reptiles and amphibians using artificial cover boards regularly generates spatial encounter history data of known individuals, which can readily be analyzed using SCR methods, providing estimates of absolute density and inference about the spatial scale of habitat use.
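
    SCR models are normally fit with dedicated statistical packages; the short Python sketch below only illustrates the half-normal encounter model commonly used in SCR, in which detection probability declines with the distance between an individual's activity centre and a cover board, and sigma governs the spatial scale of detectability mentioned above (the parameter values are illustrative, not the study's estimates).

```python
import numpy as np

def halfnormal_detection(dist, p0, sigma):
    """Half-normal encounter model: baseline detection probability p0
    decays with distance d between an activity centre and a trap."""
    return p0 * np.exp(-(dist ** 2) / (2.0 * sigma ** 2))

# Illustrative values only: baseline detection 0.4, sigma = 1.2 m.
for d in (0.0, 1.0, 2.0, 4.0):
    print(d, round(halfnormal_detection(d, p0=0.4, sigma=1.2), 3))
```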

  10. SMART CITIES INTELLIGENCE SYSTEM (SMACiSYS) INTEGRATING SENSOR WEB WITH SPATIAL DATA INFRASTRUCTURES (SENSDI)

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after event evaluation in real time. The research work formalizes a notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geo-spatial data under a popular usage platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructure to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms with sensor web and spatial information, achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on GeoNode, QGIS and Java, which bind most of the functionalities of the Internet, the sensor web and, nowadays, the Internet of Things, superseding the Internet of Sensors as well. In a nutshell, the project delivers a generalized, real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  11. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (SENSDI)

    Science.gov (United States)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after event evaluation in real time. The research work formalizes a notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geo-spatial data under a popular usage platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructure to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms with sensor web and spatial information, achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on GeoNode, QGIS and Java, which bind most of the functionalities of the Internet, the sensor web and, nowadays, the Internet of Things, superseding the Internet of Sensors as well. In a nutshell, the project delivers a generalized, real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  12. Reasoning with spatial plans on the semantic web

    NARCIS (Netherlands)

    Hoekstra, R.; Winkels, R.; Hupkes, E.

    2009-01-01

    There are several reasons why citizens, businesses and civil servants need access to regulations. Unfortunately, traditional approaches that aim to provide this access fall short, especially in the area of spatial planning. Fairly straight-forward questions such as "where will I be able to perform

  13. OLTARIS: An Efficient Web-Based Tool for Analyzing Materials Exposed to Space Radiation

    Science.gov (United States)

    Slaba, Tony; McMullen, Amelia M.; Thibeault, Sheila A.; Sandridge, Chris A.; Clowdsley, Martha S.; Blattnig, Steve R.

    2011-01-01

    The near-Earth space radiation environment includes energetic galactic cosmic rays (GCR), high intensity proton and electron belts, and the potential for solar particle events (SPE). These sources may penetrate shielding materials and deposit significant energy in sensitive electronic devices on board spacecraft and satellites. Material and design optimization methods may be used to reduce the exposure and extend the operational lifetime of individual components and systems. Since laboratory experiments are expensive and may not cover the range of particles and energies relevant for space applications, such optimization may be done computationally with efficient algorithms that include the various constraints placed on the component, system, or mission. In the present work, the web-based tool OLTARIS (On-Line Tool for the Assessment of Radiation in Space) is presented, and the applicability of the tool for rapidly analyzing exposure levels within either complicated shielding geometries or user-defined material slabs exposed to space radiation is demonstrated. An example approach for material optimization is also presented. Slabs of various advanced multifunctional materials are defined and exposed to several space radiation environments. The materials and thicknesses defining each layer in the slab are then systematically adjusted to arrive at an optimal slab configuration.

  14. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui

    Science.gov (United States)

    2012-01-01

    Background: The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web-based access to global maps and satellite imagery. We describe a method for displaying the spatial output from an R script directly on a Google dynamic map. Methods: This is achieved by creating a Java-based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java-based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically, so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results: Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks, and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions: This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly, it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics

  15. Spatial Visualization Learning in Engineering: Traditional Methods vs. a Web-Based Tool

    Science.gov (United States)

    Pedrosa, Carlos Melgosa; Barbero, Basilio Ramos; Miguel, Arturo Román

    2014-01-01

    This study compares an interactive learning manager for graphic engineering to develop spatial vision (ILMAGE_SV) to traditional methods. ILMAGE_SV is an asynchronous web-based learning tool that allows the manipulation of objects with a 3D viewer, self-evaluation, and continuous assessment. In addition, student learning may be monitored, which…

  16. Retrieving top-k prestige-based relevant spatial web objects

    DEFF Research Database (Denmark)

    Cao, Xin; Cong, Gao; Jensen, Christian S.

    2010-01-01

    The location-aware keyword query returns ranked objects that are near a query location and that have textual descriptions that match query keywords. This query occurs inherently in many types of mobile and traditional web services and applications, e.g., Yellow Pages and Maps services. Previous...... of prestige-based relevance to capture both the textual relevance of an object to a query and the effects of nearby objects. Based on this, a new type of query, the Location-aware top-k Prestige-based Text retrieval (LkPT) query, is proposed that retrieves the top-k spatial web objects ranked according...... to both prestige-based relevance and location proximity. We propose two algorithms that compute LkPT queries. Empirical studies with real-world spatial data demonstrate that LkPT queries are more effective in retrieving web objects than a previous approach that does not consider the effects of nearby...

  17. Spatial data efficient transmission in WebGIS based on IPv6

    Science.gov (United States)

    Wang, Zhen-feng; Liu, Ji-ping; Wang, Liang; Tao, Kun-wang

    2008-12-01

    The large size of spatial data and the limited bandwidth of networks restrict the transmission of spatial data in WebGIS. This paper employs IPv6 (Internet Protocol version 6), the successor to the currently deployed IPv4, to transmit spatial data efficiently. As the core of NGN (Next Generation Network), IPv6 brings many advantages for resolving the performance problems of current IPv4 network applications. Multicast, which is mandatory in IPv6 routers, allows one server to serve many clients simultaneously and efficiently, thus improving the capacity of network applications. The new anycast address type in IPv6 makes it possible for network client applications to find the nearest server, which makes data transmission between client and server fastest. The paper introduces how to apply IPv6 multicast and anycast in WebGIS to transmit data efficiently.
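
    As a hedged illustration of the multicast mechanism discussed above, the Python sketch below shows how a WebGIS client could join an IPv6 multicast group and receive datagrams pushed by a map server; the group address, port and payload handling are hypothetical and not taken from the paper.

```python
import socket
import struct

GROUP = "ff15::abcd"   # hypothetical site-local IPv6 multicast group
PORT = 9500            # hypothetical port used by the map server

sock = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the multicast group on the default interface (index 0).
mreq = socket.inet_pton(socket.AF_INET6, GROUP) + struct.pack("@I", 0)
sock.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_JOIN_GROUP, mreq)

# Each datagram would carry a chunk of spatial data pushed by the server.
data, addr = sock.recvfrom(65535)
print(f"received {len(data)} bytes of spatial data from {addr[0]}")
```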

  18. The Implementation of a Cost Effectiveness Analyzer for Web-Supported Academic Instruction: An Example from Life Science

    Science.gov (United States)

    Cohen, Anat; Nachmias, Rafi

    2012-01-01

    This paper describes the implementation of a quantitative cost effectiveness analyzer for Web-supported academic instruction that was developed in our university. The paper presents the cost effectiveness analysis of one exemplary academic course in the Life Science department and its introduction to the course lecturer for evaluation. The benefits and…

  19. Spatial Search Techniques for Mobile 3D Queries in Sensor Web Environments

    Directory of Open Access Journals (Sweden)

    James D. Carswell

    2013-03-01

    Developing mobile geo-information systems for sensor web applications involves technologies that can access linked geographical and semantically related Internet information. Additionally, in tomorrow’s Web 4.0 world, it is envisioned that trillions of inexpensive micro-sensors placed throughout the environment will also become available for discovery based on their unique geo-referenced IP address. Exploring these enormous volumes of disparate heterogeneous data on today’s location and orientation aware smartphones requires context-aware smart applications and services that can deal with “information overload”. 3DQ (Three Dimensional Query) is our novel mobile spatial interaction (MSI) prototype that acts as a next-generation base for human interaction within such geospatial sensor web environments/urban landscapes. It filters information using “Hidden Query Removal” functionality that intelligently refines the search space by calculating the geometry of a three dimensional visibility shape (Vista space) at a user’s current location. This 3D shape then becomes the query “window” in a spatial database for retrieving information on only those objects visible within a user’s actual 3D field-of-view. 3DQ reduces information overload and serves to heighten situation awareness on constrained commercial off-the-shelf devices by providing visibility space searching as a mobile web service. The effects of variations in mobile spatial search techniques in terms of query speed vs. accuracy are evaluated and presented in this paper.

  20. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    Science.gov (United States)

    2011-01-01

    Background: Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods: The data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and the geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results: The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions: In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This

  1. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    Science.gov (United States)

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. The data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web
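
    The papers above expose density-map generation as REST geoprocessing services; the minimal Python sketch below only illustrates the core computation such a service might perform, turning geocoded case points into a kernel density surface (the coordinates, grid cell size and case data are made up).

```python
import numpy as np
from scipy.stats import gaussian_kde

def density_grid(x, y, cell=100.0):
    """Kernel density surface from geocoded case locations (projected
    coordinates in metres): the core computation behind a density-map
    geoprocessing service."""
    xi = np.arange(x.min(), x.max() + cell, cell)
    yi = np.arange(y.min(), y.max() + cell, cell)
    gx, gy = np.meshgrid(xi, yi)
    kde = gaussian_kde(np.vstack([x, y]))
    return xi, yi, kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

# Hypothetical geocoded case locations (projected coordinates, metres).
rng = np.random.default_rng(0)
x = rng.normal(430000, 800, 200)
y = rng.normal(4582000, 800, 200)
xi, yi, dens = density_grid(x, y)
print(dens.shape, dens.max())
```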

  2. Development of a Spatial Decision Support System for Analyzing Changes in Hydro-meteorological Risk

    Science.gov (United States)

    van Westen, Cees

    2013-04-01

    In the framework of the EU FP7 Marie Curie ITN Network "CHANGES: Changing Hydro-meteorological Risks, as Analyzed by a New Generation of European Scientists (http://www.changes-itn.eu)", a spatial decision support system is under development with the aim of analyzing the effect of risk reduction planning alternatives on reducing the risk now and in the future, and supporting decision makers in selecting the best alternatives. The SDSS is one of the main outputs of the CHANGES network, which will develop an advanced understanding of how global changes, related to environmental and climate change as well as socio-economic change, may affect the temporal and spatial patterns of hydro-meteorological hazards and associated risks in Europe; how these changes can be assessed, modeled, and incorporated in sustainable risk management strategies, focusing on spatial planning, emergency preparedness and risk communication. The CHANGES network consists of 11 full partners and 6 associate partners, of which 5 are private companies, representing 10 European countries. The CHANGES network has hired 12 Early Stage Researchers (ESRs) and is currently hiring 3-6 more researchers for the implementation of the SDSS. The Spatial Decision Support System will be composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves) resulting in risk curves. The platform does not include a component to calculate hazard maps, and existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and

  3. Analyzing engagement in a web-based intervention platform through visualizing log-data.

    Science.gov (United States)

    Morrison, Cecily; Doherty, Gavin

    2014-11-13

    Engagement has emerged as a significant cross-cutting concern within the development of Web-based interventions. There have been calls to institute a more rigorous approach to the design of Web-based interventions, to increase both the quantity and quality of engagement. One approach would be to use log-data to better understand the process of engagement and patterns of use. However, an important challenge lies in organizing log-data for productive analysis. Our aim was to conduct an initial exploration of the use of visualizations of log-data to enhance understanding of engagement with Web-based interventions. We applied exploratory sequential data analysis to highlight sequential aspects of the log data, such as time or module number, to provide insights into engagement. After applying a number of processing steps, a range of visualizations were generated from the log-data. We then examined the usefulness of these visualizations for understanding the engagement of individual users and the engagement of cohorts of users. The visualizations created are illustrated with two datasets drawn from studies using the SilverCloud Platform: (1) a small, detailed dataset with interviews (n=19) and (2) a large dataset (n=326) with 44,838 logged events. We present four exploratory visualizations of user engagement with a Web-based intervention, including Navigation Graph, Stripe Graph, Start-Finish Graph, and Next Action Heat Map. The first represents individual usage and the last three, specific aspects of cohort usage. We provide examples of each with a discussion of salient features. Log-data analysis through data visualization is an alternative way of exploring user engagement with Web-based interventions, which can yield different insights than more commonly used summative measures. We describe how understanding the process of engagement through visualizations can support the development and evaluation of Web-based interventions. Specifically, we show how visualizations
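
    As an illustration of one of the visualizations named above, the short Python/pandas sketch below derives the transition counts behind a next-action heat map from an ordered event log; the column names and log events are hypothetical, not the SilverCloud schema.

```python
import pandas as pd

# Hypothetical log extract: one row per logged event, ordered in time.
log = pd.DataFrame({
    "user":   ["u1", "u1", "u1", "u2", "u2", "u2", "u2"],
    "action": ["login", "module_view", "exercise", "login",
               "module_view", "journal", "exercise"],
})

# Pair each action with the next action by the same user.
log["next_action"] = log.groupby("user")["action"].shift(-1)
transitions = (log.dropna(subset=["next_action"])
                  .groupby(["action", "next_action"]).size()
                  .unstack(fill_value=0))

# 'transitions' is the matrix behind a next-action heat map:
# rows = current action, columns = following action, cells = counts.
print(transitions)
```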

  4. SynergyFinder: a web application for analyzing drug combination dose-response matrix data.

    Science.gov (United States)

    Ianevski, Aleksandr; He, Liye; Aittokallio, Tero; Tang, Jing

    2017-08-01

    Rational design of drug combinations has become a promising strategy to tackle the drug sensitivity and resistance problem in cancer treatment. To systematically evaluate the pre-clinical significance of pairwise drug combinations, functional screening assays that probe combination effects in a dose-response matrix assay are commonly used. To facilitate the analysis of such drug combination experiments, we implemented a web application that uses key functions of R-package SynergyFinder, and provides not only the flexibility of using multiple synergy scoring models, but also a user-friendly interface for visualizing the drug combination landscapes in an interactive manner. The SynergyFinder web application is freely accessible at https://synergyfinder.fimm.fi ; The R-package and its source-code are freely available at http://bioconductor.org/packages/release/bioc/html/synergyfinder.html . jing.tang@helsinki.fi. © The Author(s) 2017. Published by Oxford University Press.
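
    SynergyFinder supports several synergy scoring models; as a minimal illustration of one of them, the sketch below computes the Bliss-independence excess for a single dose pair of a dose-response matrix (the effect values are made up).

```python
def bliss_excess(effect_a, effect_b, effect_combo):
    """Bliss-independence excess for one dose pair.

    Effects are fractional inhibitions in [0, 1]; the expected combination
    effect under independence is e_a + e_b - e_a * e_b, and the synergy
    score is the observed combination effect minus this expectation."""
    expected = effect_a + effect_b - effect_a * effect_b
    return effect_combo - expected

# Single cell of a hypothetical dose-response matrix:
# drug A alone inhibits 30%, drug B alone 40%, the combination 75%.
print(bliss_excess(0.30, 0.40, 0.75))   # 0.75 - 0.58 = 0.17 (synergy)
```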

  5. A spatial theory for emergent multiple predator-prey interactions in food webs.

    Science.gov (United States)

    Northfield, Tobin D; Barton, Brandon T; Schmitz, Oswald J

    2017-09-01

    Predator-prey interaction is inherently spatial because animals move through landscapes to search for and consume food resources and to avoid being consumed by other species. The spatial nature of species interactions necessitates integrating spatial processes into food web theory and evaluating how predators combine to impact their prey. Here, we present a spatial modeling approach that examines emergent multiple predator effects on prey within landscapes. The modeling is inspired by the habitat domain concept derived from empirical synthesis of spatial movement and interactions studies. Because these principles are motivated by synthesis of short-term experiments, it remains uncertain whether spatial contingency principles hold in dynamical systems. We address this uncertainty by formulating dynamical systems models, guided by core habitat domain principles, to examine long-term multiple predator-prey spatial dynamics. To describe habitat domains, we use classical niche concepts describing resource utilization distributions, and assume species interactions emerge from the degree of overlap between species. The analytical results generally align with those from empirical synthesis and present a theoretical framework capable of demonstrating multiple predator effects that does not depend on the small spatial or temporal scales typical of mesocosm experiments, and help bridge between empirical experiments and long-term dynamics in natural systems.

  6. Remote sensing, geographical information systems, and spatial modeling for analyzing public transit services

    Science.gov (United States)

    Wu, Changshan

    Public transit service is a promising transportation mode because of its potential to address urban sustainability. Current ridership of public transit, however, is very low in most urban regions, particularly those in the United States. This woeful transit ridership can be attributed to many factors, among which poor service quality is key. Given this, there is a need for transit planning and analysis to improve service quality. Traditionally, spatially aggregate data are utilized in transit analysis and planning. Examples include data associated with the census, zip codes, states, etc. Few studies, however, address the influences of spatially aggregate data on transit planning results. In this research, previous studies in transit planning that use spatially aggregate data are reviewed. Next, problems associated with the utilization of aggregate data, the so-called modifiable areal unit problem (MAUP), are detailed and the need for fine resolution data to support public transit planning is argued. Fine resolution data is generated using intelligent interpolation techniques with the help of remote sensing imagery. In particular, impervious surface fraction, an important socio-economic indicator, is estimated through a fully constrained linear spectral mixture model using Landsat Enhanced Thematic Mapper Plus (ETM+) data within the metropolitan area of Columbus, Ohio in the United States. Four endmembers, low albedo, high albedo, vegetation, and soil are selected to model heterogeneous urban land cover. Impervious surface fraction is estimated by analyzing low and high albedo endmembers. With the derived impervious surface fraction, three spatial interpolation methods, spatial regression, dasymetric mapping, and cokriging, are developed to interpolate detailed population density. Results suggest that cokriging applied to impervious surface is a better alternative for estimating fine resolution population density. With the derived fine resolution data, a multiple
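
    As a sketch of the fully constrained linear spectral mixture analysis described above, the Python code below unmixes one pixel into fractions of the four endmembers (low albedo, high albedo, vegetation, soil), enforcing non-negativity with NNLS and the sum-to-one constraint with a heavily weighted extra equation; the endmember spectra here are invented placeholders, not ETM+ values from the study.

```python
import numpy as np
from scipy.optimize import nnls

def unmix(pixel, endmembers, weight=1000.0):
    """Fully constrained linear spectral unmixing for one pixel.

    pixel      : reflectance vector (n_bands,)
    endmembers : matrix (n_bands, n_endmembers), one column per endmember
    Non-negativity comes from NNLS; the sum-to-one constraint is enforced
    softly by appending a heavily weighted row of ones."""
    A = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
    b = np.append(pixel, weight)
    fractions, _ = nnls(A, b)
    return fractions

# Hypothetical 6-band reflectances for four endmembers:
# low albedo, high albedo, vegetation, soil (columns).
E = np.array([[0.05, 0.60, 0.04, 0.20],
              [0.06, 0.62, 0.06, 0.25],
              [0.07, 0.65, 0.05, 0.30],
              [0.08, 0.66, 0.45, 0.35],
              [0.09, 0.70, 0.25, 0.40],
              [0.10, 0.72, 0.12, 0.38]])
pixel = 0.5 * E[:, 0] + 0.3 * E[:, 2] + 0.2 * E[:, 3]   # mixed pixel
print(unmix(pixel, E).round(2))    # approx [0.5, 0.0, 0.3, 0.2]
```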

  7. Comparison of the common spatial interpolation methods used to analyze potentially toxic elements surrounding mining regions.

    Science.gov (United States)

    Ding, Qian; Wang, Yong; Zhuang, Dafang

    2018-04-15

    The appropriate spatial interpolation methods must be selected to analyze the spatial distributions of Potentially Toxic Elements (PTEs), which is a precondition for evaluating PTE pollution. The accuracy and effect of different spatial interpolation methods, which include inverse distance weighting interpolation (IDW) (power = 1, 2, 3), radial basis function interpolation (RBF) (basis function: thin-plate spline (TPS), spline with tension (ST), completely regularized spline (CRS), multiquadric (MQ) and inverse multiquadric (IMQ)) and ordinary kriging interpolation (OK) (semivariogram model: spherical, exponential, gaussian and linear), were compared using 166 unevenly distributed soil PTE samples (As, Pb, Cu and Zn) in the Suxian District, Chenzhou City, Hunan Province as the study subject. The reasons for the accuracy differences of the interpolation methods and the uncertainties of the interpolation results are discussed, then several suggestions for improving the interpolation accuracy are proposed, and the direction of pollution control is determined. The results of this study are as follows: (i) RBF-ST and OK (exponential) are the optimal interpolation methods for As and Cu, and the optimal interpolation method for Pb and Zn is RBF-IMQ. (ii) The interpolation uncertainty is positively correlated with the PTE concentration, and higher uncertainties are primarily distributed around mines, which is related to the strong spatial variability of PTE concentrations caused by human interference. (iii) The interpolation accuracy can be improved by increasing the sample size around the mines, introducing auxiliary variables in the case of incomplete sampling and adopting the partition prediction method. (iv) It is necessary to strengthen the prevention and control of As and Pb pollution, particularly in the central and northern areas. The results of this study can provide an effective reference for the optimization of interpolation methods and parameters for
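
    As a minimal illustration of one of the interpolators compared above, the following Python sketch implements basic inverse distance weighting for soil PTE concentrations; the sample locations, concentrations and power parameter are illustrative only.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2):
    """Inverse distance weighting: each prediction is a weighted mean of
    sampled concentrations, with weights 1 / distance**power."""
    xy_known = np.asarray(xy_known, float)
    values = np.asarray(values, float)
    preds = []
    for q in np.atleast_2d(xy_query):
        d = np.hypot(*(xy_known - q).T)
        if np.any(d == 0):                 # query coincides with a sample
            preds.append(values[d == 0][0])
            continue
        w = 1.0 / d ** power
        preds.append(np.sum(w * values) / np.sum(w))
    return np.array(preds)

# Hypothetical soil As samples (x, y in metres; concentration in mg/kg).
samples = [(0, 0), (100, 0), (0, 100), (100, 100)]
conc = [12.0, 30.0, 8.0, 22.0]
print(idw(samples, conc, [(50, 50)], power=2))   # interpolated value
```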

  8. Supporting spatial data harmonization process with the use of ontologies and Semantic Web technologies

    Science.gov (United States)

    Strzelecki, M.; Iwaniak, A.; Łukowicz, J.; Kaczmarek, I.

    2013-10-01

    Nowadays, spatial information is used not only by professionals, but also by ordinary citizens, who use it in their daily activities. The Open Data initiative states that data should be freely and unreservedly available for all users; this also applies to spatial data. As spatial data becomes widely available, it is essential to publish it in a form which guarantees the possibility of integrating it with other, heterogeneous data sources. Interoperability is the ability to combine spatial data sets from different sources in a consistent way, as well as to provide access to them. Providing syntactic interoperability based on well-known data formats is relatively simple, unlike providing semantic interoperability, due to the multiple possible interpretations of the data. One of the issues connected with the problem of achieving interoperability is data harmonization. It is a process of providing access to spatial data in a representation that allows combining it with other harmonized data in a coherent way by using a common set of data product specifications. Spatial data harmonization is performed by creating definitions of reclassification and transformation rules (a mapping schema) for the source application schema. Creation of these rules is a very demanding task which requires wide domain knowledge and a detailed look into the application schemas. The paper focuses on proposing methods for supporting the data harmonization process by automated or supervised creation of mapping schemas with the use of ontologies, ontology matching methods and Semantic Web technologies.

  9. Isotopic evidence for the spatial heterogeneity of the planktonic food webs in the transition zone between river and lake ecosystems

    Directory of Open Access Journals (Sweden)

    Hideyuki Doi

    2013-12-01

    Resources and organisms in food webs are distributed patchily. The spatial structure of food webs is important and critical to understanding their overall structure. However, there is little available information about the small-scale spatial structure of food webs. We investigated the spatial structure of food webs in a lake ecosystem at the littoral transition zone between an inflowing river and a lake. We measured the carbon isotope ratios of zooplankton and particulate organic matter (POM; predominantly phytoplankton) in the littoral zone of a saline lake. Parallel changes in the δ13C values of zooplankton and their respective POM indicated that there is spatial heterogeneity of the food web in this study area. Lake ecosystems are usually classified at the landscape level as either pelagic or littoral habitats. However, we showed small-scale spatial heterogeneity among planktonic food webs along an environmental gradient. Stable isotope data are useful for detecting spatial heterogeneity of habitats, populations, communities, and ecosystems.

  10. A user-friendly web portal for analyzing conformational changes in structures of Mycobacterium tuberculosis.

    Science.gov (United States)

    Hassan, Sameer; Thangam, Manonanthini; Vasudevan, Praveen; Kumar, G Ramesh; Unni, Rahul; Devi, P K Gayathri; Hanna, Luke Elizabeth

    2015-10-01

    Initiation of the Tuberculosis Structural Consortium has resulted in the expansion of the Mycobacterium tuberculosis (MTB) protein structural database. Currently, 969 experimentally solved structures are available for 354 MTB proteins. This includes multiple crystal structures for a given protein under different functional conditions, such as the presence of different ligands or mutations. In-depth analysis of the multiple structures reveals that subtle differences exist in the conformations of a given protein under varied conditions. Therefore, it is immensely important to understand the conformational differences between the multiple structures of a given protein in order to select the most suitable structure for molecular docking and structure-based drug design. Here, we introduce a web portal (http://bmi.icmr.org.in/mtbsd/torsion.php) that we developed to provide comparative data on the ensemble of available structures of MTB proteins, such as Cα root mean square deviation (RMSD), sequence identity, presence of mutations and torsion angles. Additionally, torsion angles were used to perform principal component analysis (PCA) to identify the conformational differences between the structures. We also present a few case studies to demonstrate this database. Graphical Abstract: Conformational changes seen in the structures of the enoyl-ACP reductase protein encoded by the Mycobacterial gene inhA.
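
    The portal's own implementation is not shown here; the sketch below illustrates one standard way to run PCA on torsion angles, encoding each circular angle as its sine and cosine before decomposition (the torsion values are hypothetical).

```python
import numpy as np
from sklearn.decomposition import PCA

def torsion_pca(torsions_deg, n_components=2):
    """PCA on torsion angles: one row per structure, one column per angle.
    Angles are circular, so each is encoded as (sin, cos) before PCA."""
    rad = np.deg2rad(np.asarray(torsions_deg, float))
    features = np.hstack([np.sin(rad), np.cos(rad)])
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(features)
    return scores, pca.explained_variance_ratio_

# Hypothetical torsions (degrees) for five crystal structures.
torsions = [[-60, -45, 120], [-58, -47, 118], [-61, -44, 121],
            [ 55,  40, -65], [ 57,  42, -63]]
scores, evr = torsion_pca(torsions)
print(scores.round(2), evr.round(2))
```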

  11. Efficient Retrieval of the Top-k Most Relevant Spatial Web Objects

    DEFF Research Database (Denmark)

    Cong, Gao; Jensen, Christian Søndergaard; Wu, Dingming

    2009-01-01

    The conventional Internet is acquiring a geo-spatial dimension. Web documents are being geo-tagged, and geo-referenced objects such as points of interest are being associated with descriptive text documents. The resulting fusion of geo-location and documents enables a new kind of top-k query...... that takes into account both location proximity and text relevancy. To our knowledge, only naive techniques exist that are capable of computing a general web information retrieval query while also taking location into account. This paper proposes a new indexing framework for location-aware top-k text...... both text relevancy and location proximity to prune the search space. Results of empirical studies with an implementation of the framework demonstrate that the paper’s proposal offers scalability and is capable of excellent performance....
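
    The paper's contribution is an index structure; the naive Python sketch below only illustrates the ranking idea behind such queries, combining a normalised text-relevance term with a location-proximity term and returning the top-k objects (the object data, weights and scoring formula are illustrative assumptions, not the paper's method).

```python
import heapq
import math

def score(obj, q_loc, q_terms, alpha=0.5, max_dist=5000.0):
    """Combine text relevance and spatial proximity into one score.
    alpha balances the two components; both are normalised to [0, 1]."""
    text_rel = len(q_terms & obj["terms"]) / max(len(q_terms), 1)
    dist = math.hypot(obj["x"] - q_loc[0], obj["y"] - q_loc[1])
    proximity = max(0.0, 1.0 - dist / max_dist)
    return alpha * text_rel + (1 - alpha) * proximity

def top_k(objects, q_loc, q_terms, k=2):
    return heapq.nlargest(k, objects, key=lambda o: score(o, q_loc, q_terms))

# Hypothetical spatial web objects (projected coordinates in metres).
objs = [
    {"name": "cafe A",  "x": 100,  "y": 200,  "terms": {"coffee", "wifi"}},
    {"name": "cafe B",  "x": 4000, "y": 100,  "terms": {"coffee", "cake"}},
    {"name": "library", "x": 300,  "y": 250,  "terms": {"books", "wifi"}},
]
print([o["name"] for o in top_k(objs, (0, 0), {"coffee", "wifi"})])
```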

  12. Spatially Analyzing the Inequity of the Hong Kong Urban Heat Island by Socio-Demographic Characteristics

    Directory of Open Access Journals (Sweden)

    Man Sing Wong

    2016-03-01

    Recent studies have suggested that some disadvantaged socio-demographic groups face serious environmental-related inequities in Hong Kong due to the rising ambient urban temperatures. Identifying heat-vulnerable groups and locating areas of Surface Urban Heat Island (SUHI) inequities is thus important for prioritizing interventions to mitigate death/illness rates from heat. This study addresses this problem by integrating methods of remote sensing retrieval, logistic regression modelling, and spatial autocorrelation. In this process, the SUHI effect was first estimated from the Land Surface Temperature (LST) derived from a Landsat image. With the scale assimilated to the SUHI and socio-demographic data, a logistic regression model was consequently adopted to ascertain their relationships based on Hong Kong Tertiary Planning Units (TPUs). Lastly, inequity “hotspots” were derived using spatial autocorrelation methods. Results show that disadvantaged socio-demographic groups were significantly more prone to be exposed to an intense SUHI effect: over half of 287 TPUs characterized by age groups of 60+ years, secondary and matriculation education attainment, widowed, divorced and separated, low and middle incomes, and certain occupation groups of workers, have significant Odds Ratios (ORs) larger than 1.2. It can be concluded that a clustering analysis stratified by age, income, educational attainment, marital status, and occupation is an effective way to detect the inequity hotspots of SUHI exposure. Additionally, inequities explored using income, marital status and occupation factors were more significant than the age and educational attainment in these areas. The derived maps and model can be further analyzed in urban/city planning, in order to mitigate the physical and social causes of the SUHI effect.

  13. Historical Website Ecology: Analyzing Past States of the Web Using Archived Source Code

    NARCIS (Netherlands)

    Helmond, A.; Brügger, N.

    2017-01-01

    In this chapter I offer a historical perspective on the changing composition of a website over time. I propose to see the website as an ecosystem through which we can analyze the larger techno-commercial configurations that websites are embedded in. In doing so, I reconceptualize the study of

  14. Concept of a spatial data infrastructure for web-mapping, processing and service provision for geo-hazards

    Science.gov (United States)

    Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara

    2017-04-01

    for the possibility of rapid mapping. The server tier consists of Java-based web and GIS servers. Sub-services and main services are part of the service tier. Sub-services are, for example, map services, feature editing services, geometry services, geoprocessing services and metadata services. For (meta)data provision and to support data interoperability, web standards of the OGC and the REST interface are used. Four central main services are designed and developed: (1) a mapping service (including image segmentation and classification approaches), (2) a monitoring service to monitor changes over time, (3) a validation service to analyze landslide delineations from different sources and (4) an infrastructure service to identify affected landslides. The main services use and combine parts of the sub-services. Furthermore, a series of client applications based on new technology standards make use of the data and services offered by the spatial data infrastructure. Next steps include extending the current spatial data infrastructure to other areas and geo-hazard types, to develop a spatial data infrastructure that can assist targeted mapping and monitoring of geo-hazards in a global context.

  15. How can mental maps, applied to the coast environment, help in collecting and analyzing spatial representations?

    Directory of Open Access Journals (Sweden)

    Servane Gueben-Venière

    2011-09-01

    After having been mainly used in urban geography, then cast aside by geographers, mental maps are now the object of renewed interest, particularly in the field of environmental geography. Applied to the coast, and used as a supplement to the interview, these maps are not only of great assistance in collecting spatial representations, but also helpful in analyzing them. This article uses the example of the integration of mental maps in the scientific poster “Des ingénieurs de plus en plus « verts ». Évolution du regard des ingénieurs en charge de la gestion du littoral néerlandais” (Engineers are ‘greener and greener’. Evolution of the thinking of engineers in charge of Dutch coastal management), prize-winner of the competition organized by the Paris Doctoral School of Geography Forum in 2011.

  16. Monitoring, analyzing and simulating of spatial-temporal changes of landscape pattern over mining area

    Science.gov (United States)

    Liu, Pei; Han, Ruimei; Wang, Shuangting

    2014-11-01

    Given the merits of remotely sensed data in depicting regional land cover and land changes, multi-objective information processing is applied to remote sensing images to analyze and simulate land cover in mining areas. In this paper, multi-temporal remotely sensed data were selected to monitor the pattern, distribution and trend of LUCC and predict its impacts on the ecological environment and human settlement in a mining area. The monitoring, analysis and simulation of LUCC in this coal mining area are divided into five steps. These are information integration of optical and SAR data, LULC type extraction with an SVM classifier, LULC trend simulation with a CA-Markov model, landscape temporal change monitoring, and analysis with confusion matrices and landscape indices. The results demonstrate that the improved data fusion algorithm can make full use of information extracted from optical and SAR data; the SVM classifier has an efficient and stable ability to obtain land cover maps, which provides a good basis for both land cover change analysis and trend simulation; and the CA-Markov model is able to predict LULC trends with good performance, making it an effective way to integrate remotely sensed data with a spatial-temporal model for the analysis of land use/cover change and the corresponding environmental impacts in mining areas. Evaluation and analysis with confusion matrices combined with landscape indices show that there was a sustained downward trend in agricultural land and bare land, but a continuous growth trend in water bodies, forest and other land, with the built-up area showing a wave-like change, first increasing and then decreasing. The mining landscape has undergone a process of fragmentation from small to large and back to small; agricultural land is the most strongly influenced landscape type in this area, and human activities are the primary cause, so the problem should receive more attention from government and other organizations.
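
    As a minimal illustration of the Markov half of the CA-Markov approach mentioned above, the Python sketch below projects landscape composition one period ahead with a transition probability matrix; CA-Markov additionally allocates the projected amounts in space with a cellular automaton, which is not shown, and all numbers here are invented.

```python
import numpy as np

# Transition probabilities between three land cover classes over one period:
# P[i, j] = probability that class i converts to class j.
P = np.array([[0.90, 0.05, 0.05],
              [0.10, 0.85, 0.05],
              [0.02, 0.03, 0.95]])

# Current landscape composition (% of area per class: agriculture, bare, built-up).
area_now = np.array([40.0, 35.0, 25.0])

# Markov projection of composition after one period.
area_next = area_now @ P
print(area_next)   # e.g. [40.0, 32.5, 27.5], still summing to 100
```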

  17. Spatial variations in food web structures with alternative stable states: evidence from stable isotope analysis in a large eutrophic lake

    Science.gov (United States)

    Li, Yunkai; Zhang, Yuying; Xu, Jun; Zhang, Shuo

    2018-03-01

    Food web structures are well known to vary widely among ecosystems. Moreover, many food web studies of lakes have generally attempted to characterize the overall food web structure and have largely ignored internal spatial and environmental variations. In this study, we hypothesize that there is a high degree of spatial heterogeneity within an ecosystem and such heterogeneity may lead to strong variations in environmental conditions and resource availability, in turn resulting in different trophic pathways. Stable carbon and nitrogen isotopes were employed for the whole food web to describe the structure of the food web in different sub-basins within Taihu Lake. This lake is a large eutrophic freshwater lake that has been intensively managed and highly influenced by human activities for more than 50 years. The results show significant isotopic differences between basins with different environmental characteristics. Such differences likely result from isotopic baseline differences combining with a shift in food web structure. Both are related to local spatial heterogeneity in nutrient loading in waters. Such variation should be explicitly considered in future food web studies and ecosystem-based management in this lake ecosystem.

  18. Spatial variations in food web structures with alternative stable states: evidence from stable isotope analysis in a large eutrophic lake

    Science.gov (United States)

    Li, Yunkai; Zhang, Yuying; Xu, Jun; Zhang, Shuo

    2017-05-01

    Food web structures are well known to vary widely among ecosystems. Moreover, many food web studies of lakes have generally attempted to characterize the overall food web structure and have largely ignored internal spatial and environmental variations. In this study, we hypothesize that there is a high degree of spatial heterogeneity within an ecosystem and such heterogeneity may lead to strong variations in environmental conditions and resource availability, in turn resulting in different trophic pathways. Stable carbon and nitrogen isotopes were employed for the whole food web to describe the structure of the food web in different sub-basins within Taihu Lake. This lake is a large eutrophic freshwater lake that has been intensively managed and highly influenced by human activities for more than 50 years. The results show significant isotopic differences between basins with different environmental characteristics. Such differences likely result from isotopic baseline differences combining with a shift in food web structure. Both are related to local spatial heterogeneity in nutrient loading in waters. Such variation should be explicitly considered in future food web studies and ecosystem-based management in this lake ecosystem.

  19. Analyzing crop change scenario with the SmartScape spatial decision support system

    NARCIS (Netherlands)

    Tayyebi, A; Tayyebi, AH; Jokar Arsanjani, J; Vaz, E; Helbich, M

    2016-01-01

    Agricultural land use is increasingly changing due to different anthropogenic activities. A combination of economic, socio-political, and cultural factors exerts a direct impact on agricultural changes. This study aims to illustrate how stakeholders and policymakers can take advantage of a web-based

  20. Utilizing mixed methods research in analyzing Iranian researchers’ information search behaviour on the Web and presenting the current pattern

    Directory of Open Access Journals (Sweden)

    Maryam Asadi

    2015-12-01

    Using a mixed methods research design, the current study analyzed Iranian researchers’ information searching behaviour on the Web. Then, based on the extracted concepts, a model of their information searching behaviour was derived. Forty-four participants, including academic staff from universities and research centers, were recruited for this study, selected by purposive sampling. Data were gathered from a questionnaire including ten questions and semi-structured interviews. Each participant’s memos were analyzed using grounded theory methods adapted from Strauss & Corbin (1998). Results showed that the main objectives of subjects in using the Web were doing research, writing a paper, studying, doing assignments, downloading files and acquiring public information. The most important ways of learning how to search and retrieve information among the subjects were trial and error and getting help from friends. Information resources are identified by searching in information resources (e.g. search engines, references in papers, and searches in online databases), communication facilities and tools (e.g. contact with colleagues, seminars and workshops, social networking), and information services (e.g. RSS, alerting, and SDI). Findings also indicated that searching with search engines, reviewing references, searching in online databases, contacting colleagues and studying the latest issues of electronic journals were the most important approaches to searching. The most important strategies were using search engines and scientific tools such as Google Scholar. In addition, using the simple (quick) search method was the most common among subjects. Topic, keywords and paper title were the most important elements for retrieving information. Analysis of the interviews showed that there were nine stages in the researchers’ information searching behaviour: topic selection, initiating the search, formulating the search query, information retrieval, access to information

  1. A web-tool to find spatially explicit climate-smart solutions for the sector agriculture

    Science.gov (United States)

    Verzandvoort, Simone; Kuikman, Peter; Walvoort, Dennis

    2017-04-01

    Europe faces the challenge of producing more food and more biomass for the bio-economy, of adapting its agricultural sector to the negative consequences of climate change, and of reducing greenhouse gas emissions from agriculture. Climate-smart agriculture (CSA) solutions and technologies improve agriculture’s productivity and provide economic growth and stability, increase resilience, and help to reduce GHG emissions from agricultural activities. The Climate Smart Agriculture Booster (CSAb) (http://csabooster.climate-kic.org/) is a Flagship Program under Climate-KIC, aiming to facilitate the adoption of CSA solutions and technologies in the European agro-food sector. This adoption requires spatially explicit, contextual information on farming activities and on risks and opportunities related to climate change in regions across Europe. Other spatial information supporting adoption includes information on where successful implementations have already been carried out, on where CSA would profit from enabling policy conditions, and on where markets or business opportunities for selling or purchasing technology and knowledge are located or emerging. The Spatial Solution Finder is a web-based spatial tool aiming to help agri-food companies (supply and processing), authorities or agricultural organisations find CSA solutions and technologies that fit local farmers and regions, and to demonstrate examples of successful implementations as well as expected impact at the farm and regional level. The tool is based on state-of-the-art (geo)datasets of environmental and socio-economic conditions (partly open access, partly derived from previous research) and open source web technology. The philosophy of the tool is that combining existing datasets with contextual information on the region of interest and personalized information entered by the user provides a suitable basis for offering a basket of options for CSA solutions and technologies. Solutions and technologies are recommended to the user based on

  2. A spatial web/agent-based model to support stakeholders' negotiation regarding land development.

    Science.gov (United States)

    Pooyandeh, Majeed; Marceau, Danielle J

    2013-11-15

    Decision making in land management can be greatly enhanced if the perspectives of concerned stakeholders are taken into consideration. This often implies negotiation in order to reach an agreement based on the examination of multiple alternatives. This paper describes a spatial web/agent-based modeling system that was developed to support the negotiation process of stakeholders regarding land development in southern Alberta, Canada. This system integrates a fuzzy analytic hierarchy procedure within an agent-based model in an interactive visualization environment provided through a web interface to facilitate the learning and negotiation of the stakeholders. In the pre-negotiation phase, the stakeholders compare their evaluation criteria using linguistic expressions. Due to the uncertainty and fuzzy nature of such comparisons, a fuzzy Analytic Hierarchy Process is then used to prioritize the criteria. The negotiation starts with a development plan being submitted by a user (stakeholder) through the web interface. An agent called the proposer, which represents the proposer of the plan, receives this plan and starts negotiating with all other agents. The negotiation is conducted in a step-wise manner where the agents change their attitudes by assigning a new set of weights to their criteria. If an agreement is not achieved, a new location for development is proposed by the proposer agent. This process is repeated until a location is found that satisfies all agents to a certain predefined degree. To evaluate the performance of the model, the negotiation was simulated with four agents, one of which was the proposer agent, using two hypothetical development plans. The first plan was selected randomly; the other one was chosen in an area that is of high importance to one of the agents. While the agents managed to achieve an agreement about the location of the land development after three rounds of negotiation in the first scenario, seven rounds were required in the second
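
    The system described above uses a fuzzy Analytic Hierarchy Process; the sketch below shows only the crisp geometric-mean core of AHP for deriving criteria weights from a pairwise comparison matrix (the criteria and judgement values are hypothetical, and the fuzzy variant would replace each judgement with a triangular fuzzy number).

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights from a pairwise comparison matrix using the
    geometric-mean method (the crisp core of AHP)."""
    A = np.asarray(pairwise, float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[1])
    return gm / gm.sum()

# Hypothetical comparisons for three criteria (e.g. distance to water,
# land cost, proximity to roads); A[i, j] = importance of i relative to j.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
print(ahp_weights(A).round(3))   # e.g. [0.648, 0.230, 0.122]
```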

  3. Analyzing the Effects of Spatial Interaction among City Clusters on Urban Growth—Case of Wuhan Urban Agglomeration

    Directory of Open Access Journals (Sweden)

    Ronghui Tan

    2016-08-01

    For the past two decades, China’s urbanization has attracted increasing attention from scholars around the world. Numerous insightful studies have attempted to determine the socioeconomic causes of the rapid urban growth in Chinese cities. However, most of these studies regarded each city as a single entity, with few considering inter-city relationships. The present study uses a gravity-based model to measure the spatial interaction among city clusters in the Wuhan urban agglomeration (WUA), which is one of China’s most rapidly urbanizing regions. The effects of spatial interaction on urban growth area were also analyzed. Empirical results indicate that, like urban population and employment in secondary and tertiary industries, the spatial interaction among city clusters was one of the main drivers of urban growth in the WUA from 2000 to 2005. In fact, this study finds that spatial interaction was the only socioeconomic factor affecting spatial expansion from 2005 to 2010. This finding suggests that population migration and information and commodity flows had a greater influence on promoting urbanization in the WUA during this period than the socioeconomic drivers of each individual city. We thus argue that spatial interaction among city clusters should be a consideration in future regional planning.
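
    The study uses a gravity-based model; a common textbook form of such a model (assumed here for illustration, not necessarily the exact specification used) scales interaction with the product of two cities' masses and inversely with a power of the distance between them.

```python
def gravity_interaction(mass_i, mass_j, distance_km, k=1.0, beta=2.0):
    """Gravity-type spatial interaction between two cities:
    T_ij = k * M_i * M_j / d_ij**beta (a common textbook form)."""
    return k * mass_i * mass_j / distance_km ** beta

# Hypothetical city masses (e.g. population in 10k persons) and distance.
print(gravity_interaction(mass_i=850, mass_j=120, distance_km=80))
```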

  4. Easier surveillance of climate-related health vulnerabilities through a Web-based spatial OLAP application

    Directory of Open Access Journals (Sweden)

    Gosselin Pierre

    2009-04-01

    Full Text Available Abstract. Background: Climate change has a significant impact on population health. Population vulnerabilities depend on several determinants of different types, including biological, psychological, environmental, social and economic ones. Surveillance of climate-related health vulnerabilities must take into account these different factors, their interdependence, as well as their inherent spatial and temporal aspects on several scales, for informed analyses. Currently used technology includes commercial off-the-shelf Geographic Information Systems (GIS) and Database Management Systems with spatial extensions. It has been widely recognized that such OLTP (On-Line Transaction Processing) systems were not designed to support complex, multi-temporal and multi-scale analysis as required above. On-Line Analytical Processing (OLAP) is central to the field known as BI (Business Intelligence), a key field for such decision-support systems. In the last few years, we have seen a few projects that combine OLAP and GIS to improve spatio-temporal analysis and geographic knowledge discovery. This has given rise to SOLAP (Spatial OLAP) and a new research area. This paper presents how SOLAP and climate-related health vulnerability data were investigated and combined to facilitate surveillance. Results: Based on recent spatial decision-support technologies, this paper presents a spatio-temporal web-based application that goes beyond GIS applications with regard to speed, ease of use, and interactive analysis capabilities. It supports the multi-scale exploration and analysis of integrated socio-economic, health and environmental geospatial data over several periods. This project was meant to validate the potential of recent technologies to contribute to a better understanding of the interactions between public health and climate change, and to facilitate future decision-making by public health agencies and municipalities in Canada and elsewhere. The project also aimed at

  5. Easier surveillance of climate-related health vulnerabilities through a Web-based spatial OLAP application.

    Science.gov (United States)

    Bernier, Eveline; Gosselin, Pierre; Badard, Thierry; Bédard, Yvan

    2009-04-03

    Climate change has a significant impact on population health. Population vulnerabilities depend on several determinants of different types, including biological, psychological, environmental, social and economic ones. Surveillance of climate-related health vulnerabilities must take into account these different factors, their interdependence, as well as their inherent spatial and temporal aspects on several scales, for informed analyses. Currently used technology includes commercial off-the-shelf Geographic Information Systems (GIS) and Database Management Systems with spatial extensions. It has been widely recognized that such OLTP (On-Line Transaction Processing) systems were not designed to support complex, multi-temporal and multi-scale analysis as required above. On-Line Analytical Processing (OLAP) is central to the field known as BI (Business Intelligence), a key field for such decision-support systems. In the last few years, we have seen a few projects that combine OLAP and GIS to improve spatio-temporal analysis and geographic knowledge discovery. This has given rise to SOLAP (Spatial OLAP) and a new research area. This paper presents how SOLAP and climate-related health vulnerability data were investigated and combined to facilitate surveillance. Based on recent spatial decision-support technologies, this paper presents a spatio-temporal web-based application that goes beyond GIS applications with regard to speed, ease of use, and interactive analysis capabilities. It supports the multi-scale exploration and analysis of integrated socio-economic, health and environmental geospatial data over several periods. This project was meant to validate the potential of recent technologies to contribute to a better understanding of the interactions between public health and climate change, and to facilitate future decision-making by public health agencies and municipalities in Canada and elsewhere. The project also aimed at integrating an initial collection of geo

  6. Analyzing Variability in Landscape Nutrient Loading Using Spatially-Explicit Maps in the Great Lakes Basin

    Science.gov (United States)

    Hamlin, Q. F.; Kendall, A. D.; Martin, S. L.; Whitenack, H. D.; Roush, J. A.; Hannah, B. A.; Hyndman, D. W.

    2017-12-01

    Excessive loading of nitrogen and phosphorus to the landscape has caused biologically and economically damaging eutrophication and harmful algal blooms in the Great Lakes Basin (GLB) and across the world. We mapped source-specific loads of nitrogen and phosphorus to the landscape using broadly available data across the GLB. SENSMap (Spatially Explicit Nutrient Source Map) is a 30 m resolution snapshot of nutrient loads ca. 2010. We use these maps to study variable nutrient loading and provide this information to watershed managers through NOAA's GLB Tipping Points Planner. SENSMap individually maps nutrient point sources and six non-point sources: 1) atmospheric deposition, 2) septic tanks, 3) non-agricultural chemical fertilizer, 4) agricultural chemical fertilizer, 5) manure, and 6) nitrogen fixation from legumes. To model source-specific loads at high resolution, SENSMap synthesizes a wide range of remotely sensed, surveyed, and tabular data. Using these spatially explicit nutrient loading maps, we can better calibrate local land use-based water quality models and provide insight to watershed managers on how to focus nutrient reduction strategies. Here we examine differences in dominant nutrient sources across the GLB, and how those sources vary by land use. SENSMap's high resolution, source-specific approach offers a different lens to understand nutrient loading than traditional semi-distributed or land use based models.
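
    As a toy illustration of what a source-explicit loading map enables, the sketch below stacks hypothetical per-source load rasters, sums them to a total load, and labels each cell with its dominant source; the source list follows the abstract, but the array sizes and values are fabricated.

```python
import numpy as np

# Toy illustration of combining per-source nutrient load rasters and
# identifying the dominant source per cell, in the spirit of a
# source-explicit loading map. Real inputs would be 30 m rasters per source.
sources = ["atmospheric", "septic", "nonag_fertilizer",
           "ag_fertilizer", "manure", "n_fixation"]
rng = np.random.default_rng(0)
loads = rng.gamma(shape=2.0, scale=3.0, size=(len(sources), 4, 5))  # kg N/ha/yr

total_load = loads.sum(axis=0)          # total N load per cell
dominant = np.argmax(loads, axis=0)     # index of dominant source per cell
dominant_names = np.array(sources)[dominant]

print("total load (kg N/ha/yr):\n", total_load.round(1))
print("dominant source per cell:\n", dominant_names)
```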

  7. A Web-based spatial decision supporting system for land management and soil conservation

    Science.gov (United States)

    Terribile, F.; Agrillo, A.; Bonfante, A.; Buscemi, G.; Colandrea, M.; D'Antonio, A.; De Mascellis, R.; De Michele, C.; Langella, G.; Manna, P.; Marotta, L.; Mileti, F. A.; Minieri, L.; Orefice, N.; Valentini, S.; Vingiani, S.; Basile, A.

    2015-07-01

    Today it is evident that there are many contrasting demands on our landscape (e.g. food security, more sustainable agriculture, higher income in rural areas, etc.) as well as many land degradation problems. It has been proved that providing operational answers to these demands and problems is extremely difficult. Here we aim to demonstrate that a spatial decision support system based on geospatial cyberinfrastructure (GCI) can address all of the above, so producing a smart system for supporting decision making for agriculture, forestry, and urban planning with respect to the landscape. In this paper, we discuss methods and results of a special kind of GCI architecture, one that is highly focused on land management and soil conservation. The system allows us to obtain dynamic, multidisciplinary, multiscale, and multifunctional answers to agriculture, forestry, and urban planning issues through the Web. The system has been applied to and tested in an area of about 20 000 ha in the south of Italy, within the framework of a European LIFE+ project (SOILCONSWEB). The paper reports - as a case study - results from two different applications dealing with agriculture (olive growth tool) and environmental protection (soil capability to protect groundwater). Developed with the help of end users, the system is starting to be adopted by local communities. The system indirectly explores a change of paradigm for soil and landscape scientists. Indeed, the potential benefit is shown of overcoming current disciplinary fragmentation over landscape issues by offering - through a smart Web-based system - truly integrated geospatial knowledge that may be directly and freely used by any end user (www.landconsultingweb.eu). This may help bridge the last very important divide between scientists working on the landscape and end users.

  8. A web based spatial decision supporting system for land management and soil conservation

    Science.gov (United States)

    Terribile, F.; Agrillo, A.; Bonfante, A.; Buscemi, G.; Colandrea, M.; D'Antonio, A.; De Mascellis, R.; De Michele, C.; Langella, G.; Manna, P.; Marotta, L.; Mileti, F. A.; Minieri, L.; Orefice, N.; Valentini, S.; Vingiani, S.; Basile, A.

    2015-02-01

    Today it is evident that there are many contrasting demands on our landscape (e.g. food security, more sustainable agriculture, higher income in rural areas, etc.) but also many land degradation problems. It has been proved that providing operational answers to these demands and problems is extremely difficult. Here we aim to demonstrate that a Spatial Decision Support System based on geospatial cyber-infrastructure (GCI) can embody all of the above, so producing a smart system for supporting decision making for agriculture, forestry and urban planning with respect to the landscape. In this paper, we discuss methods and results of a special kind of GCI architecture, one that is highly focused on soil and land conservation (SOILCONSWEB-LIFE+ project). The system allows us to obtain dynamic, multidisciplinary, multiscale, and multifunctional answers to agriculture, forestry and urban planning issues through the web. The system has been applied to and tested in an area of about 20 000 ha in the South of Italy, within the framework of a European LIFE+ project. The paper reports - as a case study - results from two different applications dealing with agriculture (olive growth tool) and environmental protection (soil capability to protect groundwater). Developed with the help of end users, the system is starting to be adopted by local communities. The system indirectly explores a change of paradigm for soil and landscape scientists. Indeed, the potential benefit is shown of overcoming current disciplinary fragmentation over landscape issues by offering - through a smart web based system - truly integrated geospatial knowledge that may be directly and freely used by any end user (http://www.landconsultingweb.eu). This may help bridge the last, very important divide between scientists working on the landscape and end users.

  9. Taxonomical and ecological characteristics of the desmids placoderms in reservoir: analyzing the spatial and temporal distribution

    Directory of Open Access Journals (Sweden)

    Sirlene Aparecida Felisberto

    2014-12-01

    Full Text Available AIM: This study aimed to evaluate the influence of the river-dam axis and abiotic factors on the composition of Closteriaceae, Gonatozygaceae, Mesotaeniaceae and Peniaceae in a tropical reservoir. METHODS: Water samples for physical, chemical and periphyton analyses were collected in April and August 2002 in different regions along the river-dam axis of Rosana Reservoir, Paranapanema River Basin. The substrates collected, always in the littoral zone, were petioles of Eichhornia azurea (Swartz) Kunth. To examine the relationship of abiotic variables with reservoir zones and with the floristic composition of desmids, we used principal component analysis (PCA) and canonical correspondence analysis (CCA). RESULTS: The PCA explained 81.3% of the total variability in the first two axes. On the first axis, conductivity, water temperature and pH were related to the April sampling regions with higher values, while for August, nitrate, total phosphorus and dissolved oxygen showed higher values. We identified 20 taxa, distributed in the genera Closterium (14), Gonatozygon (4), Netrium (1) and Penium (1). Spatially, the highest number of taxa was recorded in the lacustrine region for both collection periods. The canonical correspondence analysis (CCA) summarized 62.2% of the total variability of the taxa data in the first two axes; in August, Closterium incurvum Brébisson, C. cornu Ehrenberg ex Ralfs and Gonatozygon monotaenium De Bary were related to higher values of turbidity and nitrate in the lacustrine and intermediate regions. CONCLUSION: The formation of groups was thus due first to the regions along the longitudinal axis and then to the seasonal period, which must be related to the low current velocity and the higher values of temperature and water transparency, especially in late summer.
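
    A minimal sketch of the ordination step described above (PCA of standardized abiotic variables across sampling sites and months) is given below using scikit-learn; the variable list follows the abstract, while the numbers are fabricated for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Minimal PCA ordination sketch: rows are sampling sites/months, columns are
# abiotic variables. Values are fabricated for illustration only.
variables = ["conductivity", "water_temp", "pH", "nitrate", "total_P", "DO"]
X = np.array([
    [120, 28.1, 7.4, 0.10, 0.02, 6.5],   # April, riverine
    [135, 27.8, 7.6, 0.12, 0.03, 6.8],   # April, intermediate
    [150, 28.5, 7.9, 0.08, 0.02, 7.0],   # April, lacustrine
    [ 90, 22.3, 7.0, 0.35, 0.06, 8.2],   # August, riverine
    [ 95, 22.0, 7.1, 0.30, 0.05, 8.5],   # August, intermediate
    [100, 21.7, 7.2, 0.28, 0.05, 8.8],   # August, lacustrine
])

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("variance explained by first two axes: "
      f"{pca.explained_variance_ratio_.sum():.1%}")
print("site scores on PC1/PC2:\n", scores.round(2))
```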

  10. Modern tools for development of interactive web map applications for visualization spatial data on the internet

    Directory of Open Access Journals (Sweden)

    Horáková Bronislava

    2009-11-01

    Full Text Available In the last few years, the development of dynamic web applications, often called Web 2.0, has begun. This development gave rise to a technology called mashups. Mashups can easily combine huge amounts of data sources and the functionalities of existing as well as future web applications and services. They are therefore used to develop new tools that offer new possibilities for using information. This technology makes it possible to develop both basic and robust web applications, not only for IT or GIS specialists but also for common users. Software companies have developed web projects for building mashup applications, also called mashup editors.

  11. Analyzing Spatial Behavior of Backcountry Skiers in Mountain Protected Areas Combining GPS Tracking and Graph Theory

    Directory of Open Access Journals (Sweden)

    Karolina Taczanowska

    2017-12-01

    Full Text Available Mountain protected areas (PAs) aim to preserve vulnerable environments and at the same time encourage numerous outdoor leisure activities. Understanding the way people use natural environments is crucial to balance the needs of visitors and site capacities. This study aims to develop an approach to evaluate the structure and use of designated skiing zones in PAs combining Global Positioning System (GPS) tracking and analytical methods based on graph theory. The study is based on empirical data (n = 609 GPS tracks) of backcountry skiers collected in Tatra National Park (TNP), Poland. The physical structure of the entire skiing zones system has been simplified into a graph structure (structural network; undirected graph). In a second step, the actual use of the area by skiers (functional network; directed graph) was analyzed using a graph-theoretic approach. Network coherence (connectivity indices: β, γ, α), movement directions at path segments, and relative importance of network nodes (node centrality measures: degree, betweenness, closeness, and proximity prestige) were calculated. The system of designated backcountry skiing zones was not evenly used by the visitors. Therefore, the calculated parameters differ significantly between the structural and the functional network. In particular, measures related to the actually used trails are of high importance from the management point of view. Information about the most important node locations can be used for planning sign-posts, on-site maps, interpretative boards, or other tourist infrastructure.
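
    The connectivity indices and centrality measures named above are standard graph-theoretic quantities. The sketch below computes them with networkx on a toy undirected "structural network"; node names and edges are invented, and the directed "functional network" analysis would follow the same pattern with nx.DiGraph.

```python
import networkx as nx

# Toy "structural network" of ski-zone nodes and path segments.
G = nx.Graph()
G.add_edges_from([
    ("trailhead", "junction_A"), ("junction_A", "pass"),
    ("junction_A", "valley"), ("pass", "summit"),
    ("valley", "summit"), ("valley", "hut"),
])

v, e = G.number_of_nodes(), G.number_of_edges()
beta = e / v                       # edges per node
gamma = e / (3 * (v - 2))          # observed vs. maximum possible edges (planar)
alpha = (e - v + 1) / (2 * v - 5)  # observed vs. maximum possible circuits

print(f"beta={beta:.2f} gamma={gamma:.2f} alpha={alpha:.2f}")
print("degree centrality:", nx.degree_centrality(G))
print("betweenness:", nx.betweenness_centrality(G))
print("closeness:", nx.closeness_centrality(G))
```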

  12. Temporal and Spatial Independent Component Analysis for fMRI Data Sets Embedded in the AnalyzeFMRI R Package

    Directory of Open Access Journals (Sweden)

    Pierre Lafaye de Micheaux

    2011-10-01

    Full Text Available For the statistical analysis of functional magnetic resonance imaging (fMRI) data sets, we propose a data-driven approach based on independent component analysis (ICA), implemented in a new version of the AnalyzeFMRI R package. Because the spatial dimension of fMRI data sets is much greater than the temporal dimension, spatial ICA is generally proposed as the computationally tractable approach. However, for some neuroscientific applications, temporal independence of source signals can be assumed, and temporal ICA then becomes an attractive exploratory technique. In this work, we use a classical linear algebra result ensuring the tractability of temporal ICA. We report several experiments on synthetic data and real MRI data sets that demonstrate the potential interest of our R package.
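
    A conceptual sketch of the spatial-versus-temporal ICA distinction is shown below, using scikit-learn's FastICA on synthetic data rather than the AnalyzeFMRI R implementation; the only difference between the two runs is whether voxels or time points play the role of samples.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic fMRI-like data matrix X of shape (time points, voxels).
rng = np.random.default_rng(1)
T, V, k = 100, 2000, 3
X = rng.standard_normal((T, k)) @ rng.standard_normal((k, V))
X += 0.01 * rng.standard_normal((T, V))

# Spatial ICA: sources are spatially independent maps, so voxels play the
# role of samples -> run ICA on X.T.
spatial_ica = FastICA(n_components=k, random_state=0)
spatial_maps = spatial_ica.fit_transform(X.T)   # (V, k): component maps
time_courses = spatial_ica.mixing_              # (T, k): associated time courses

# Temporal ICA: sources are temporally independent time courses, so time
# points play the role of samples -> run ICA on X directly. This is the
# costly direction when V >> T unless a dimension-reduction result is used.
temporal_ica = FastICA(n_components=k, random_state=0)
tc = temporal_ica.fit_transform(X)              # (T, k): time courses
maps = temporal_ica.mixing_                     # (V, k): associated maps

print(spatial_maps.shape, time_courses.shape, tc.shape, maps.shape)
```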

  13. ST Spot Detector: a web-based application for automatic spot and tissue detection for Spatial Transcriptomics image data sets.

    Science.gov (United States)

    Wong, Kim; Fernández Navarro, José; Bergenstråhle, Ludvig; Ståhl, Patrik L; Lundeberg, Joakim

    2018-01-17

    Spatial transcriptomics (ST) is a method which combines high resolution tissue imaging with high throughput transcriptome sequencing data. This data must be aligned with the images for correct visualisation, a process that involves several manual steps. Here we present ST Spot Detector, a web tool that automates and facilitates this alignment through a user friendly interface. Open source under the MIT license, available from https://github.com/SpatialTranscriptomicsResearch/st_spot_detector. jose.fernandez.navarro@scilifelab.se. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  14. Analyzing the causes and spatial pattern of the European 2003 carbon flux anomaly using seven models

    Directory of Open Access Journals (Sweden)

    M. Vetter

    2008-04-01

    Full Text Available Globally, the year 2003 is associated with one of the largest atmospheric CO2 rises on record. In the same year, Europe experienced an anomalously strong flux of CO2 from the land to the atmosphere associated with an exceptionally dry and hot summer in Western and Central Europe. In this study we analyze the magnitude of this carbon flux anomaly and key driving ecosystem processes using simulations of seven terrestrial ecosystem models of different complexity and types (process-oriented and diagnostic). We address the following questions: (1) how large were deviations in the net European carbon flux in 2003 relative to a short-term baseline (1998–2002) and to longer-term variations in annual fluxes (1980 to 2005), (2) which European regions exhibited the largest changes in carbon fluxes during the growing season 2003, and (3) which ecosystem processes controlled the carbon balance anomaly.

    In most models the prominence of the 2003 anomaly in carbon fluxes declined as the reference period was lengthened from one year to 16 years. The 2003 anomaly for annual net carbon fluxes ranged between 0.35 and –0.63 Pg C for a reference period of one year and between 0.17 and –0.37 Pg C for a reference period of 16 years for Europe as a whole.

    In Western and Central Europe, the anomaly in simulated net ecosystem productivity (NEP) over the growing season in 2003 was outside the 1σ variance bound of the carbon flux anomalies for 1980–2005 in all models. The estimated anomaly in net carbon flux ranged between –42 and –158 Tg C for Western Europe and between 24 and –129 Tg C for Central Europe depending on the model used. All models responded to a dipole pattern of the climate anomaly in 2003. In Western and Central Europe NEP was reduced due to heat and drought. In contrast, lower than normal temperatures and higher air humidity decreased NEP over Northeastern Europe. While models agree on the sign of changes in
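
    The baseline logic described above, an anomaly computed against reference periods of different lengths and compared with a 1σ interannual variability bound, reduces to simple arithmetic; the sketch below illustrates it with fabricated annual NEP values.

```python
import numpy as np

# Sketch of an anomaly calculation relative to reference periods of
# different lengths. Annual NEP values are invented (Pg C/yr); the imposed
# 2003 value mimics a strong source anomaly.
years = np.arange(1980, 2006)
rng = np.random.default_rng(2)
nep = rng.normal(loc=0.1, scale=0.15, size=years.size)
nep[years == 2003] = -0.35

def anomaly(year, ref_years):
    ref = nep[np.isin(years, ref_years)]
    return nep[years == year][0] - ref.mean()

short_ref = np.arange(1998, 2003)   # 1998-2002 baseline
long_ref = np.arange(1980, 1996)    # a 16-year baseline
print("2003 anomaly vs 1998-2002:", round(anomaly(2003, short_ref), 2), "Pg C")
print("2003 anomaly vs 16-yr ref:", round(anomaly(2003, long_ref), 2), "Pg C")
print("outside 1-sigma bound:",
      abs(anomaly(2003, long_ref)) > nep[np.isin(years, long_ref)].std())
```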

  15. Mining the Social Web Analyzing Data from Facebook, Twitter, LinkedIn, and Other Social Media Sites

    CERN Document Server

    Russell, Matthew

    2011-01-01

    Want to tap the tremendous amount of valuable social data in Facebook, Twitter, LinkedIn, and Google+? This refreshed edition helps you discover who's making connections with social media, what they're talking about, and where they're located. You'll learn how to combine social web data, analysis techniques, and visualization to find what you've been looking for in the social haystack-as well as useful information you didn't know existed. Each standalone chapter introduces techniques for mining data in different areas of the social Web, including blogs and email. All you need to get started

  16. Open Source Web-Based Solutions for Disseminating and Analyzing Flood Hazard Information at the Community Level

    Science.gov (United States)

    Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.

    2017-09-01

    We discuss in this paper the development, including the features and functionalities, of an open source web-based flood hazard information dissemination and analytical system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application that was developed by the Caraga State University to address the needs of local disaster managers in the Caraga Region in Mindanao, Philippines in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application made use of various free/open source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), various geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and its associated damages to infrastructure. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front end interface consisting of web maps and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently being expanded to cover additional sites in Mindanao, Philippines through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or the "Geo-SAFER Mindanao" Program.

  17. Analyzing spatial and temporal trends in Aboveground Biomass within the Acadian New England Forests using the complete Landsat Archive

    Science.gov (United States)

    Kilbride, J. B.; Fraver, S.; Ayrey, E.; Weiskittel, A.; Braaten, J.; Hughes, J. M.; Hayes, D. J.

    2017-12-01

    Forests within the New England states and Canadian Maritime provinces, here described as the Acadian New England (ANE) forests, have undergone substantial disturbances due to insect, fire, and anthropogenic factors. Through repeated satellite observations captured by the USGS's Landsat program, 45 years of disturbance information can be incorporated into modeling efforts to better understand the spatial and temporal trends in forest aboveground biomass (AGB). Using Google's Earth Engine, annual mosaics were developed for the ANE study area, and disturbance and recovery metrics were then derived using the temporal segmentation algorithm VeRDET. Normalization procedures were developed to incorporate the Landsat Multispectral Scanner (MSS, 1972-1985) data alongside the modern era of Landsat Thematic Mapper (TM, 1984-2013), Enhanced Thematic Mapper Plus (ETM+, 1999-present), and Operational Land Imager (OLI, 2013-present) data products. This has enabled the creation of a dataset with an unprecedented spatial and temporal view of forest landscape change. Model training was performed using the Forest Inventory and Analysis (FIA) and New Brunswick Permanent Sample Plot datasets. Modeling was performed using parametric techniques such as mixed effects models and non-parametric techniques such as k-NN imputation and generalized boosted regression. We compare the biomass estimates and model accuracy to other inventory and modeling studies produced within this study area. The spatial and temporal patterns of stock changes are analyzed against resource policy, land ownership changes, and forest management.
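
    As a hedged illustration of the non-parametric modeling step mentioned above, the sketch below fits a k-nearest-neighbour regressor (standing in for k-NN imputation) and a gradient-boosted regressor to fabricated plot-level AGB and Landsat-metric data; feature names, units and values are assumptions, not the study's inputs.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor

# Fabricated plot-level data: predictors could be years-since-disturbance,
# a spectral index, a recovery slope, and elevation; response is AGB (Mg/ha).
rng = np.random.default_rng(4)
n_plots = 300
X = rng.normal(size=(n_plots, 4))
agb = 120 + 25 * X[:, 0] - 15 * X[:, 1] + rng.normal(scale=20, size=n_plots)

for name, model in [("k-NN regression (k=7)", KNeighborsRegressor(n_neighbors=7)),
                    ("boosted regression", GradientBoostingRegressor(random_state=0))]:
    r2 = cross_val_score(model, X, agb, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {r2:.2f}")
```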

  18. The importance of landscape and spatial structure for hymenopteran-based food webs in an agro-ecosystem.

    Science.gov (United States)

    Fabian, Yvonne; Sandau, Nadine; Bruggisser, Odile T; Aebi, Alex; Kehrli, Patrik; Rohr, Rudolf P; Naisbit, Russell E; Bersier, Louis-Félix

    2013-11-01

    1. Understanding the environmental factors that structure biodiversity and food webs among communities is central to assess and mitigate the impact of landscape changes. 2. Wildflower strips are ecological compensation areas established in farmland to increase pollination services and biological control of crop pests and to conserve insect diversity. They are arranged in networks in order to favour high species richness and abundance of the fauna. 3. We describe results from experimental wildflower strips in a fragmented agricultural landscape, comparing the importance of landscape, of spatial arrangement and of vegetation on the diversity and abundance of trap-nesting bees, wasps and their enemies, and the structure of their food webs. 4. The proportion of forest cover close to the wildflower strips and the landscape heterogeneity stood out as the most influential landscape elements, resulting in a more complex trap-nest community with higher abundance and richness of hosts, and with more links between species in the food webs and a higher diversity of interactions. We disentangled the underlying mechanisms for variation in these quantitative food web metrics. 5. We conclude that in order to increase the diversity and abundance of pollinators and biological control agents and to favour a potentially stable community of cavity-nesting hymenoptera in wildflower strips, more investment is needed in the conservation and establishment of forest habitats within agro-ecosystems, as a reservoir of beneficial insect populations. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.

  19. Effects of Seasonal and Spatial Differences in Food Webs on Mercury Concentrations in Fish in the Everglades

    Science.gov (United States)

    Kendall, C.; Bemis, B. E.; Wankel, S. D.; Rawlik, P. S.; Lange, T.; Krabbenhoft, D. P.

    2002-05-01

    A clear understanding of the aquatic food web is essential for determining the entry points and subsequent biomagnification pathways of contaminants such as methyl-mercury (MeHg) in the Everglades. Anthropogenic changes in nutrients can significantly affect the entry points of MeHg by changing food web structure from one dominated by algal productivity to one dominated by macrophytes and associated microbial activity. These changes in the base of the food web can also influence the distribution of animals within the ecosystem, and subsequently the bioaccumulation of MeHg up the food chain. As part of several collaborations with local and other federal agencies, more than 7000 Everglades samples were collected in 1995-99, and analysed for δ13C and δ15N. Many organisms were also analysed for δ34S, gut contents, total Hg, and MeHg. Carbon isotopes effectively distinguish between two main types of food webs: ones where algae is the dominant base of the food web, which are characteristic of relatively pristine marsh sites with long hydroperiods, and ones where macrophyte debris appears to be a significant source of nutrients, which are apparently characteristic of shorter hydroperiod sites, and nutrient-impacted marshes and canals. Many organisms show significant (5-12%) spatial and temporal differences in δ13C and δ15N values across the Everglades. These differences may reflect site and season-specific differences in the relative importance of algae vs. macrophyte debris to the food web. However, there is a lack of evidence that these sites otherwise differ in food chain length (as determined by δ15N values). This conclusion is generally supported by gut contents and mercury data. Furthermore, there are no statistically significant differences between the Δδ15N (predator-algae) values at pristine marsh, nutrient-impacted marsh, or canal sites. The main conclusions from this preliminary comparison of gut contents, stable isotope, and Hg data are: (1) there is

  20. Food-web inferences of stable isotope spatial patterns in copepods and yellowfin tuna in the pelagic eastern Pacific Ocean

    Science.gov (United States)

    Olson, Robert J.; Popp, Brian N.; Graham, Brittany S.; López-Ibarra, Gladis A.; Galván-Magaña, Felipe; Lennert-Cody, Cleridy E.; Bocanegra-Castillo, Noemi; Wallsgrove, Natalie J.; Gier, Elizabeth; Alatorre-Ramírez, Vanessa; Ballance, Lisa T.; Fry, Brian

    2010-07-01

    Evaluating the impacts of climate and fishing on oceanic ecosystems requires an improved understanding of the trophodynamics of pelagic food webs. Our approach was to examine broad-scale spatial relationships among the stable N isotope values of copepods and yellowfin tuna ( Thunnus albacares), and to quantify yellowfin tuna trophic status in the food web based on stable-isotope and stomach-contents analyses. Using a generalized additive model fitted to abundance-weighted-average δ 15N values of several omnivorous copepod species, we examined isotopic spatial relationships among yellowfin tuna and copepods. We found a broad-scale, uniform gradient in δ 15N values of copepods increasing from south to north in a region encompassing the eastern Pacific warm pool and parts of several current systems. Over the same region, a similar trend was observed for the δ 15N values in the white muscle of yellowfin tuna caught by the purse-seine fishery, implying limited movement behavior. Assuming the omnivorous copepods represent a proxy for the δ 15N values at the base of the food web, the isotopic difference between these two taxa, “ ΔYFT-COP,” was interpreted as a trophic-position offset. Yellowfin tuna trophic-position estimates based on their bulk δ 15N values were not significantly different than independent estimates based on stomach contents, but are sensitive to errors in the trophic enrichment factor and the trophic position of copepods. An apparent inshore-offshore, east to west gradient in yellowfin tuna trophic position was corroborated using compound-specific isotope analysis of amino acids conducted on a subset of samples. The gradient was not explained by the distribution of yellowfin tuna of different sizes, by seasonal variability at the base of the food web, or by known ambit distances (i.e. movements). Yellowfin tuna stomach contents did not show a regular inshore-offshore gradient in trophic position during 2003-2005, but the trophic
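
    The trophic-position estimate implied above follows the standard bulk-isotope formula TP = TP_base + (δ15N_consumer − δ15N_base)/TEF; the sketch below codes it directly, with the trophic enrichment factor and the copepod baseline trophic position set to commonly used but assumed values rather than the study's own.

```python
# Bulk-isotope trophic position estimate with copepods as the isotopic
# baseline. TEF and the baseline trophic position are illustrative
# assumptions; the abstract notes the estimate is sensitive to both.
def trophic_position(d15n_consumer, d15n_baseline,
                     tp_baseline=2.5, tef=3.4):
    """TP = TP_base + (d15N_consumer - d15N_base) / TEF."""
    return tp_baseline + (d15n_consumer - d15n_baseline) / tef

# Example: a yellowfin tuna with d15N = 14.2 permil caught where the
# abundance-weighted copepod baseline is 8.1 permil.
print(round(trophic_position(14.2, 8.1), 2))   # ~4.3
```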

  1. Web-site of the UGKK. The core of national spatial infrastructure

    International Nuclear Information System (INIS)

    Lacena, M.; Klobusiak, M.

    2005-01-01

    The Geodetic and Cartographic Institute Bratislava (GKU), the executive organization of the Geodesy, Cartography and Cadastre Authority of the Slovak Republic (Urad geodezie, kartografie a katastra na Slovensku, UGKK SR), is the provider and administrator of the geodetic fundamentals and of the basic database of GIS reference data. It constitutes one of the most important elements of the spatial data infrastructure of the Slovak Republic. The open source software UMN MapServer was selected for creating the web application. The web site of the UGKK SR, its structure, services and perspectives are discussed

  2. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    Science.gov (United States)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper presents easily accessible, integrated web-based analysis of satellite images with plug-in-based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox for accessing satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data with remotely available data and processing functionality without much effort. Key to this integrated spatial data analysis is low-cost access to data from within a user-friendly and flexible software environment. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS and remote sensing software package, comprising a complete suite of image processing, spatial analysis and digital mapping tools; it was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into a modular, plug-in-based open source software, and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-)services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, an open initiative that advances the development of cutting-edge open source geospatial software using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the

  3. Development and Evaluation of a Web Map Mind Tool Environment with the Theory of Spatial Thinking and Project-Based Learning Strategy

    Science.gov (United States)

    Hou, Huei-Tse; Yu, Tsai-Fang; Wu, Yi-Xuan; Sung, Yao-Ting; Chang, Kuo-En

    2016-01-01

    The theory of spatial thinking is relevant to the learning and teaching of many academic domains. One promising method to facilitate learners' higher-order thinking is to utilize a web map mind tool to assist learners in applying spatial thinking to cooperative problem solving. In this study, an environment is designed based on the theory of…

  4. Facilitating Spatial Thinking in World Geography Using Web-Based GIS

    Science.gov (United States)

    Jo, Injeong; Hong, Jung Eun; Verma, Kanika

    2016-01-01

    Advocates for geographic information system (GIS) education contend that learning about GIS promotes students' spatial thinking. Empirical studies are still needed to elucidate the potential of GIS as an instructional tool to support spatial thinking in other geography courses. Using a non-equivalent control group research design, this study…

  5. Voronoi tessellations and the cosmic web : Spatial patterns and clustering across the universe

    NARCIS (Netherlands)

    van de Weygaert, Rien; Gold, CM

    2007-01-01

    The spatial cosmic matter distribution on scales of a few up to more than a hundred Megaparsec displays a salient and pervasive foamlike pattern. Voronoi tessellations are a versatile and flexible mathematical model for such weblike spatial patterns. They would be the natural result of an

  6. The leisure commons: A spatial history of web 2.0

    NARCIS (Netherlands)

    P.A. Arora (Payal)

    2014-01-01

    There is much excitement about Web 2.0 as an unprecedented, novel, community-building space for experiencing, producing, and consuming leisure, particularly through social network sites. What is needed is a perspective that is invested in neither a utopian nor a dystopian posture but sees

  7. A web-based spatial decision supporting system (S-DSS) for grapevine quality: the viticultural tool of the SOILCONS-WEB Project

    Science.gov (United States)

    Manna, Piero; Bonfante, Antonello; Basile, Angelo; Langella, Giuliano; Agrillo, Antonietta; De Mascellis, Roberto; Florindo Mileti, Antonio; Minieri, Luciana; Orefice, Nadia; Terribile, Fabio

    2014-05-01

    The SOILCONSWEB Project aims to create a decision support system operating at the landscape scale (Spatial-DSS) for the protection and the management of soils in both agricultural and environmental issues; it is a cyber-infrastructure built on remote servers operating through the web at www.landconsultingweb.eu. It includes - among others - a series of tools specifically designed for viticulture aiming at high-quality wine production. The system was realized thanks to a collaboration between the University of Naples Federico II, CNR ISAFoM, Ariespace srl and SeSIRCA-Campania Region within a 5-year LIFE+ project funded by the European Community. The system includes tools based on modelling procedures at different levels of complexity, some of which are specifically designed for viticulture issues. One of the implemented models arises from the original desktop-based SWAP model (Kroes et al, 2008). It can be run "on the fly" through a very user-friendly web interface. This tool, thanks to the model based on the Richards equation, can produce data on vineyard water stress by simulating the soil water balance of the different soil types within an area of interest. Thanks to a specific program developed within the project activities, the Spatial-DSS acquires point weather data every day and automatically spatializes them with geostatistical approaches in order to use the data as input for running the SPA (Soil Plant Atmosphere) model, in particular for defining the upper boundary condition (rainfall and temperatures to estimate ET0 by the Hargreaves model). Soil hydraulic properties (47 soil profiles within the study area), also essential for the model simulations, were measured in the laboratory using Wind's approach or estimated through the HYPRES PTF. Water retention and hydraulic conductivity relationships were parameterized according to the van Genuchten-Mualem model. Through the DSS, decision makers (individuals, groups of interest and public bodies) can have real
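
    Two of the building blocks named above, the Hargreaves reference evapotranspiration used for the upper boundary condition and the van Genuchten water retention curve used to parameterize soil hydraulics, are sketched below with illustrative (not project-calibrated) parameter values.

```python
import numpy as np

def hargreaves_et0(tmin, tmax, ra):
    """Hargreaves ET0 [mm/day]; ra is extraterrestrial radiation in mm/day."""
    tmean = (tmin + tmax) / 2.0
    return 0.0023 * ra * (tmean + 17.8) * np.sqrt(tmax - tmin)

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """van Genuchten water content at matric suction h [cm]."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * np.abs(h)) ** n) ** m

# Illustrative daily weather and a generic loamy soil parameter set.
print(round(hargreaves_et0(tmin=14.0, tmax=28.0, ra=15.0), 2), "mm/day")
h = np.array([10.0, 100.0, 1000.0, 15000.0])   # cm of suction
print(van_genuchten_theta(h, 0.05, 0.42, 0.036, 1.56).round(3))
```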

  8. The Earth Observation Monitor - Automated monitoring and alerting for spatial time-series data based on OGC web services

    Science.gov (United States)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2014-12-01

    Spatial time series data have been freely available around the globe from earth observation satellites and meteorological stations for many years now. They provide useful and important information for detecting ongoing changes of the environment, but for end users it is often too complex to extract this information from the original time series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project that provides simple access, analysis and monitoring tools for global spatial time series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Active Archive Center (LP DAAC) and Google Earth Engine as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can either use the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTIFF). Besides providing data access tools, users can also run further time series analyses such as trend calculations, breakpoint detections or the derivation of phenological parameters from vegetation time series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline usage. Automated monitoring and alerting for the time series data integrated by the user is provided by an OGC Sensor Observation Service with a coupled OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., precipitation higher than x millimeters per day, occurrence of a MODIS fire point, detection of a time series anomaly). Datasets integrated in the SOS service are

  9. Analyzing HT-SELEX data with the Galaxy Project tools--A web based bioinformatics platform for biomedical research.

    Science.gov (United States)

    Thiel, William H; Giangrande, Paloma H

    2016-03-15

    The development of DNA and RNA aptamers for research as well as diagnostic and therapeutic applications is a rapidly growing field. In the past decade, the process of identifying aptamers has been revolutionized with the advent of high-throughput sequencing (HTS). However, bioinformatics tools that enable the average molecular biologist to analyze these large datasets and expedite the identification of candidate aptamer sequences have been lagging behind the HTS revolution. The Galaxy Project was developed in order to efficiently analyze genome, exome, and transcriptome HTS data, and we have now applied these tools to aptamer HTS data. The Galaxy Project's public webserver is an open source collection of bioinformatics tools that are powerful, flexible, dynamic, and user friendly. The online nature of the Galaxy webserver and its graphical interface allow users to analyze HTS data without compiling code or installing multiple programs. Herein we describe how tools within the Galaxy webserver can be adapted to pre-process, compile, filter and analyze aptamer HTS data from multiple rounds of selection. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. SequenceCEROSENE: a computational method and web server to visualize spatial residue neighborhoods at the sequence level.

    Science.gov (United States)

    Heinke, Florian; Bittrich, Sebastian; Kaiser, Florian; Labudde, Dirk

    2016-01-01

    To understand the molecular function of biopolymers, studying their structural characteristics is of central importance. Graphics programs are often utilized to visualize these properties, but with the increasing number of structures available in databases, or of structure models produced by automated modeling frameworks, this process requires assistance from tools that allow automated structure visualization. In this paper a web server and its underlying method for generating graphical sequence representations of molecular structures are presented. The method, called SequenceCEROSENE (color encoding of residues obtained by spatial neighborhood embedding), retrieves the sequence of each amino acid or nucleotide chain in a given structure and produces a color coding for each residue based on three-dimensional structure information. From this, color-highlighted sequences are obtained, in which residue coloring represents the three-dimensional residue location in the structure. This color encoding thus provides a one-dimensional representation from which spatial interactions, proximity and relations between residues or entire chains can be deduced quickly and solely from color similarity. Furthermore, additional heteroatoms and chemical compounds bound to the structure, such as ligands or coenzymes, are processed and reported as well. To provide free access to SequenceCEROSENE, a web server has been implemented that allows generating color codings for structures deposited in the Protein Data Bank or structure models uploaded by the user. Besides retrieving visualizations in popular graphic formats, the underlying raw data can be downloaded as well. In addition, the server provides user interactivity with the generated visualizations and the three-dimensional structure in question. Color-encoded sequences generated by SequenceCEROSENE can help researchers quickly perceive the general characteristics of a structure of interest (or entire sets of complexes), thus supporting the researcher in the initial

  11. The leisure commons a spatial history of web 2.0

    CERN Document Server

    Arora, Payal

    2014-01-01

    There is much excitement about Web 2.0 as an unprecedented, novel, community-building space for experiencing, producing, and consuming leisure, particularly through social network sites. What is needed is a perspective that is invested in neither a utopian nor a dystopian posture but sees historical continuity in this cyberleisure geography. This book investigates the digital public sphere by drawing parallels to another leisure space that shares its rhetoric of being open, democratic, and free for all: the urban park. It makes the case that the history and politics of public parks as an urban co

  12. Spatial synchrony propagates through a forest food web via consumer-resource interactions

    Science.gov (United States)

    Kyle J. ​Haynes; Andrew M. Liebhold; Todd M. Fearer; Guiming Wang; Gary W. Norman; Derek M. Johnson

    2009-01-01

    In many study systems, populations fluctuate synchronously across large regions. Several mechanisms have been advanced to explain this, but their importance in nature is often uncertain. Theoretical studies suggest that spatial synchrony initiated in one species through Moran effects may propagate among trophically linked species, but evidence for this in nature is...

  13. Using a Web GIS Plate Tectonics Simulation to Promote Geospatial Thinking

    Science.gov (United States)

    Bodzin, Alec M.; Anastasio, David; Sharif, Rajhida; Rutzmoser, Scott

    2016-01-01

    Learning with Web-based geographic information system (Web GIS) can promote geospatial thinking and analysis of georeferenced data. Web GIS can enable learners to analyze rich data sets to understand spatial relationships that are managed in georeferenced data visualizations. We developed a Web GIS plate tectonics simulation as a capstone learning…

  14. Development of a web GIS application for emissions inventory spatial allocation based on open source software tools

    Science.gov (United States)

    Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis

    2013-03-01

    Combining emission inventory methods and geographic information systems (GIS) remains a key issue for environmental modelling and management purposes. This paper examines the development of a web GIS application as part of an emission inventory system that produces maps and files with spatially allocated emissions in a grid format. The study is not confined to the maps produced but also presents the features and capabilities of a web application that can be used by any user, even without prior knowledge of the GIS field. The development of the application was based on open source software tools such as MapServer for the GIS functions, PostgreSQL and PostGIS for the data management, and HTML, PHP and JavaScript as programming languages. In addition, background processes are used in an innovative manner to handle the time-consuming and computationally costly procedures of the application. Furthermore, a web map service was created to provide maps to other clients, such as the Google Maps API v3 that is used as part of the user interface. The output of the application includes maps in vector and raster format, maps with temporal resolution on a daily and hourly basis, grid files that can be used by air quality management systems, and grid files consistent with the European Monitoring and Evaluation Programme grid. Although the system was developed and validated for the Republic of Cyprus, covering a remarkably wide range of pollutants and emission sources, it can easily be customized for use in other countries or smaller areas, as long as geospatial and activity data are available.
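
    The core spatial-allocation step behind such a system can be illustrated in a few lines: a sectoral emission total is distributed over grid cells in proportion to a spatial proxy (e.g. road density for traffic or population for residential heating). The proxy values, grid and total in the sketch below are invented.

```python
import numpy as np

# Proxy-based spatial allocation of a sectoral emission total to a grid.
proxy = np.array([[0.0, 2.0, 5.0],
                  [1.0, 8.0, 3.0],
                  [0.0, 4.0, 2.0]])        # proxy intensity per grid cell
sector_total = 1200.0                      # t NOx / yr for this sector

weights = proxy / proxy.sum()              # normalised allocation weights
gridded_emissions = sector_total * weights # t NOx / yr per cell
print(gridded_emissions.round(1))
print("check mass balance:", round(gridded_emissions.sum(), 1))
```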

  15. Behind the web store: the organisational and spatial evolution of multichannel retailing in Toronto

    OpenAIRE

    Andrew Currah

    2002-01-01

    In this paper I address two issues of general relevance to contemporary debates in economic geography: first, the organisational and spatial implications of new information technologies for the economic landscape; and, second, the enduring role of place to digital capitalism. Specifically, I examine the organisational evolution of multichannel retailing in Toronto from a geographical perspective. Bricks-and-mortar retailers are increasingly pursuing a multichannel strategy by operating an Int...

  16. Spatially Referenced Educational Achievement Data Exploration: A Web-Based Interactive System Integration of GIS, PHP, and MySQL Technologies

    Science.gov (United States)

    Mulvenon, Sean W.; Wang, Kening; Mckenzie, Sarah; Anderson, Travis

    2006-01-01

    Effective exploration of spatially referenced educational achievement data can help educational researchers and policy analysts speed up gaining valuable insight into datasets. This article illustrates a demo system developed in the National Office for Research on Measurement and Evaluation Systems (NORMES) for supporting Web-based interactive…

  17. Using the World Wide Web as a Teaching Tool: Analyzing Images of Aging and the Visual Needs of an Aging Society.

    Science.gov (United States)

    Jakobi, Patricia

    1999-01-01

    Analysis of Web site images of aging to identify positive and negative representations can help teach students about social perceptions of older adults. Another learning experience involves consideration of the needs of older adults in Web site design. (SK)

  18. Fano lineshapes of 'Peak-tracking chip' spatial profiles analyzed with correlation analysis for bioarray imaging and refractive index sensing

    KAUST Repository

    Bougot-Robin, K.

    2013-05-22

    The asymmetric Fano resonance lineshape, resulting from interference between background and resonant scattering, is archetypal in resonant waveguide grating (RWG) reflectivity. A resonant profile shift resulting from a change of refractive index (from the fluid medium or from biomolecules at the chip surface) is classically used to perform label-free sensing. Lineshapes are sometimes sampled at discretized “detuning” values to relax instrumental demands, the highest-reflectivity element giving a coarse resonance estimate. A finer extraction, needed to increase sensor sensitivity, can be obtained using a correlation approach, correlating the sensed signal with a zero-shifted reference signal. The fabrication process leading to discrete Fano profiles is presented. Our findings are illustrated with resonance profiles from silicon nitride RWGs operated at visible wavelengths. We recently demonstrated that direct-imaging multi-assay RWG sensing may be rendered more reliable using “chirped” RWG chips, obtained by varying a RWG structure parameter. The spatial reflectivity profiles of tracks composed of RWG units with slowly varying filling factor (thus slowly varying resonance condition) are then measured under monochromatic conditions. Extracting the resonance location from the spatial Fano profiles allows multiplexed refractive-index-based sensing. Discretization and sensitivity are discussed both through simulation and experiment for different filling factor variations, here Δf=0.0222 and Δf=0.0089. This scheme based on a “Peak-tracking chip” demonstrates a new technique for bioarray imaging using a simpler set-up that maintains high performance with cheap lenses, with sensitivity down to Δn=2×10-5 RIU for the highest sampling of Fano lineshapes. © (2013) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.

  19. Fano lineshapes of 'Peak-tracking chip' spatial profiles analyzed with correlation analysis for bioarray imaging and refractive index sensing

    KAUST Repository

    Bougot-Robin, K.; Li, S.; Yue, W.; Chen, L. Q.; Zhang, Xixiang; Wen, W. J.; Benisty, H.

    2013-01-01

    The asymmetric Fano resonance lineshape, resulting from interference between background and resonant scattering, is archetypal in resonant waveguide grating (RWG) reflectivity. A resonant profile shift resulting from a change of refractive index (from the fluid medium or from biomolecules at the chip surface) is classically used to perform label-free sensing. Lineshapes are sometimes sampled at discretized “detuning” values to relax instrumental demands, the highest-reflectivity element giving a coarse resonance estimate. A finer extraction, needed to increase sensor sensitivity, can be obtained using a correlation approach, correlating the sensed signal with a zero-shifted reference signal. The fabrication process leading to discrete Fano profiles is presented. Our findings are illustrated with resonance profiles from silicon nitride RWGs operated at visible wavelengths. We recently demonstrated that direct-imaging multi-assay RWG sensing may be rendered more reliable using “chirped” RWG chips, obtained by varying a RWG structure parameter. The spatial reflectivity profiles of tracks composed of RWG units with slowly varying filling factor (thus slowly varying resonance condition) are then measured under monochromatic conditions. Extracting the resonance location from the spatial Fano profiles allows multiplexed refractive-index-based sensing. Discretization and sensitivity are discussed both through simulation and experiment for different filling factor variations, here Δf=0.0222 and Δf=0.0089. This scheme based on a “Peak-tracking chip” demonstrates a new technique for bioarray imaging using a simpler set-up that maintains high performance with cheap lenses, with sensitivity down to Δn=2×10-5 RIU for the highest sampling of Fano lineshapes. © (2013) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
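
    A toy version of the correlation-based resonance tracking described in these two records is sketched below: a Fano lineshape sampled at discrete detuning values is shifted, and the shift is recovered by cross-correlating against a zero-shift reference and refining the peak with a parabolic fit; the lineshape parameters and sampling are illustrative, not the chip's actual design values.

```python
import numpy as np

def fano(x, q=2.0, width=1.0, x0=0.0):
    """Fano lineshape (q + eps)^2 / (1 + eps^2) centred at x0."""
    eps = (x - x0) / (width / 2.0)
    return (q + eps) ** 2 / (1.0 + eps ** 2)

x = np.linspace(-10, 10, 201)           # discretized detuning axis
reference = fano(x)                      # zero-shift reference profile
true_shift = 0.37
measured = fano(x, x0=true_shift) \
    + 0.01 * np.random.default_rng(3).standard_normal(x.size)

# Cross-correlate, then refine the integer-lag peak with a parabolic fit.
corr = np.correlate(measured - measured.mean(),
                    reference - reference.mean(), mode="full")
peak = corr.argmax()
lag = peak - (x.size - 1)
y0, y1, y2 = corr[peak - 1], corr[peak], corr[peak + 1]
frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
dx = x[1] - x[0]
print("recovered shift:", round((lag + frac) * dx, 3), "vs true", true_shift)
```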

  20. Analyzing key constraints to biogas production from crop residues and manure in the EU—A spatially explicit model

    Science.gov (United States)

    Persson, U. Martin

    2017-01-01

    This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates’ biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures’ carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to large extent on the possibility to overcome the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent). PMID:28141827

  1. Analyzing key constraints to biogas production from crop residues and manure in the EU-A spatially explicit model.

    Science.gov (United States)

    Einarsson, Rasmus; Persson, U Martin

    2017-01-01

    This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates' biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures' carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to large extent on the possibility to overcome the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent).

  2. Analyzing key constraints to biogas production from crop residues and manure in the EU-A spatially explicit model.

    Directory of Open Access Journals (Sweden)

    Rasmus Einarsson

    Full Text Available This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates' biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures' carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to a large extent on the possibility of overcoming the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent).

  3. Computer simulation on spatial resolution of X-ray bright-field imaging by dynamical diffraction theory for a Laue-case crystal analyzer

    International Nuclear Information System (INIS)

    Suzuki, Yoshifumi; Chikaura, Yoshinori; Ando, Masami

    2011-01-01

    Recently, dark-field imaging (DFI) and bright-field imaging (BFI) have been proposed and applied to visualize X-ray refraction effects produced in biomedical objects. In order to clarify the spatial resolution due to a crystal analyzer in Laue geometry, a program based on the Takagi-Taupin equations was modified to simulate and evaluate the spatial resolution of images entering a Laue angular analyzer (LAA). The calculation was done with a perfect plane wave for the diffracted wave-fields, corresponding to BFI, under the conditions of 35 keV and a diffraction index of 440 for a 2100 μm thick LAA. As a result, the spatial resolution along the g-vector direction was approximately 37.5 μm. A 126 μm thick LAA showed a spatial resolution better than 3.1 μm under the conditions of 13.7 keV and a diffraction index of 220.

  4. Mercury bioaccumulation in the food web of Three Gorges Reservoir (China): Tempo-spatial patterns and effect of reservoir management

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jun [College of Fisheries, Huazhong Agricultural University, Key Laboratory of Freshwater Animal Breeding, Ministry of Agriculture, Wuhan 430070 (China); Freshwater Aquaculture Collaborative Innovation Center of Hubei Province, Wuhan 430070 (China); Zhou, Qiong, E-mail: hainan@mail.hzau.edu.cn [College of Fisheries, Huazhong Agricultural University, Key Laboratory of Freshwater Animal Breeding, Ministry of Agriculture, Wuhan 430070 (China); Freshwater Aquaculture Collaborative Innovation Center of Hubei Province, Wuhan 430070 (China); Yuan, Gailing; He, Xugang [College of Fisheries, Huazhong Agricultural University, Key Laboratory of Freshwater Animal Breeding, Ministry of Agriculture, Wuhan 430070 (China); Freshwater Aquaculture Collaborative Innovation Center of Hubei Province, Wuhan 430070 (China); Xie, Ping [College of Fisheries, Huazhong Agricultural University, Key Laboratory of Freshwater Animal Breeding, Ministry of Agriculture, Wuhan 430070 (China); Donghu Experimental Station of Lake Ecosystems, State Key Laboratory of Freshwater Ecology and Biotechnology of China, Institute of Hydrobiology, Chinese Academy of Sciences, Wuhan 430072 (China)

    2015-09-15

    Tempo-spatial patterns of mercury bioaccumulation and tropho-dynamics, and the potential for a reservoir effect, were evaluated in the Three Gorges Reservoir (TGR, China) from 2011 to 2012, using total mercury concentrations (THg) and stable isotopes (δ¹³C and δ¹⁵N) of food web components (seston, aquatic invertebrates and fish). Hg concentrations in aquatic invertebrates and fish indicated a significant temporal trend associated with regular seasonal water-level manipulation. This includes water-level lowering to allow for storage of water during the wet season (summer), and a decrease of water levels from September to June, providing a setting for flood storage. Hg concentrations in organisms were the highest after flooding. Higher Hg concentrations in fish were observed at the location farthest from the dam. Hg concentrations in water and sediment were correlated. Compared with reservoirs in the United States and Canada, TGR had lower trophic magnification factors (0.046–0.066), which are explained primarily by organic carbon concentrations in sediment, and the effect of “growth dilution”. Based on comparison before and after the impoundment of TGR, THg concentration in biota did not display an obvious long-term reservoir effect due to (i) short time since inundation, (ii) regular water discharge associated with water-level regulation, and/or (iii) low organic matter content in the sediment. - Highlights: • Hg concentrations were measured in biota of the main stem of 3 Gorges Reservoir. • Fish Hg concentration post-flood period > pre-flood period > flood period. • Fish Hg concentrations were the highest farthest from the dam. • THg in fish 2 years after inundation were the same as before impoundment. • Low biomagnification was ascribed to low DOC content in the sediment.
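
    For readers unfamiliar with how trophic magnification factors of the kind reported above are commonly derived, the following minimal sketch applies the standard recipe: trophic level from δ¹⁵N with an assumed enrichment factor, a regression of log10 concentration on trophic level, and a TMF of ten to the slope. The isotope and mercury values are synthetic, and the baseline and 3.4 per mil enrichment factor are conventional assumptions, not values from this study.

        # Sketch of a trophic magnification factor (TMF) calculation; data are synthetic.
        import numpy as np

        # Synthetic food-web samples: delta-15N (per mil) and total Hg (ug/g wet weight).
        d15n = np.array([6.1, 8.3, 9.0, 10.9, 12.4, 13.8, 15.2])
        thg  = np.array([0.02, 0.05, 0.06, 0.12, 0.21, 0.35, 0.60])

        BASELINE_D15N = 6.1    # assumed primary-consumer baseline
        ENRICHMENT = 3.4       # assumed per-trophic-level d15N enrichment (per mil)

        trophic_level = 2.0 + (d15n - BASELINE_D15N) / ENRICHMENT

        # Regress log10(concentration) on trophic level; the TMF is 10 to the slope.
        slope, intercept = np.polyfit(trophic_level, np.log10(thg), 1)
        tmf = 10.0 ** slope
        print(f"slope = {slope:.3f} per trophic level, TMF = {tmf:.2f}")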

  5. Estimating Air Particulate Matter Using MODIS Data and Analyzing Its Spatial and Temporal Pattern over the Yangtze Delta Region

    Directory of Open Access Journals (Sweden)

    Jianhui Xu

    2016-09-01

    Full Text Available The deteriorating air quality in the Yangtze delta region is attracting growing public concern. In this paper, seasonal estimation models of surface particulate matter (PM) were established by using aerosol optical thickness (AOT) retrievals from the moderate resolution imaging spectro-radiometer (MODIS) on board NASA’s Terra satellite. Changes in the regional distribution of the atmospheric mixed layer, relative humidity and other meteorological elements were taken into account in these models. We also used PM mass concentrations from ground measurements to evaluate the estimation accuracy of those models. The results show that model estimates of PM2.5 and PM10 mass concentrations were in good agreement with the ground-based observations of PM mass concentrations (p < 0.01); the R² values of the PM2.5 concentration model for the four seasons are 0.48, 0.62, 0.61 and 0.52, respectively, and the R² values of the PM10 concentration model for the four seasons are 0.57, 0.56, 0.64 and 0.68, respectively. At the same time, spatial and temporal variations of PM2.5 and PM10 mass concentrations were analysed over the Yangtze delta region from 2000 to 2013. The results show that PM2.5 and PM10 increased in the Yangtze delta region from 2000 to 2013 and varied periodically. The maximum mass concentration of PM2.5 and PM10 was in January–February, and the minimum was in July–August. The highest values of PM2.5 and PM10 mass concentration are in the urban agglomeration, a delta-shaped region bounded by Shanghai, Hangzhou and Nanjing, while the low values are in forested areas far from the cities. PM mass concentrations over main cities and rural areas increased gradually year by year, rising more quickly in urban areas than in rural areas.
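
    A minimal sketch of the kind of seasonal estimation model described above, assuming a hypothetical file of matched satellite and ground observations; the file name, column names and predictor set are illustrative placeholders, not the authors' actual model.

        # Fit one linear PM2.5 model per season and report its R^2 (illustrative only).
        import pandas as pd
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import r2_score

        df = pd.read_csv("matched_aot_pm_obs.csv")   # hypothetical matched satellite/ground file
        predictors = ["aot_550nm", "mixed_layer_height_m", "relative_humidity_pct"]

        for season, group in df.groupby("season"):   # e.g. DJF, MAM, JJA, SON
            x, y = group[predictors], group["pm25_ug_m3"]
            model = LinearRegression().fit(x, y)
            print(season, round(r2_score(y, model.predict(x)), 2))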

  6. Analyzing the spatial patterns and drivers of ecosystem services in rapidly urbanizing Taihu Lake Basin of China

    Science.gov (United States)

    Ai, Junyong; Sun, Xiang; Feng, Lan; Li, Yangfan; Zhu, Xiaodong

    2015-09-01

    Quantifying and mapping the distribution patterns of ecosystem services can help to ascertain which services should be protected and where investments should be directed to improve synergies and reduce tradeoffs. Moreover, the indicators of urbanization that affect the provision of ecosystem services must be identified to determine which approach to adopt in formulating policies related to these services. This paper presents a case study that maps the distribution of multiple ecosystem services and analyzes the ways in which they interact. The relationship between the supply of ecosystem services and the socio-economic development in the Taihu Lake Basin of eastern China is also revealed. Results show a significant negative relationship between crop production and tourism income (p < 0.005) and a positive relationship between crop production, nutrient retention, and carbon sequestration (p < 0.005). The negative effects of the urbanization process on providing and regulating services are also identified through a comparison of the ecosystem services in large and small cities. Regression analysis was used to compare and elucidate the relative significance of the selected urbanization factors to ecosystem services. The results indicate that urbanization level is the most substantial factor inversely correlated with crop production (R² = 0.414) and nutrient retention services (R² = 0.572). Population density is the most important factor that negatively affects carbon sequestration (R² = 0.447). The findings of this study suggest the potential relevance of ecosystem service dynamics to urbanization management and decision making.

  7. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to the Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data sources; interfaces for the Geospatial Semantic Web; VGI (Volunteered Geographic Information) and the Geospatial Semantic Web; challenges of the Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems, such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.

  8. Development of a Web-Based Geographic Information System (GIS) Application for Groundwater Use Management using PHP, Java and MySQL Spatial (Case Study in Banyumas Regency)

    Directory of Open Access Journals (Sweden)

    J Jumadi

    2009-12-01

    Full Text Available In the existing world of geographic information systems (GIS), desktop mapping has taken a critical role in managing and using spatial information for business. However, desktop-based GIS applications have limitations for users. This research was conducted to develop a web-based GIS to manage groundwater exploration and production, preventing uncontrolled exploration, using Java applets, MySQL Spatial and PHP. The system development was designed using the waterfall model of the system life cycle with the following steps: (1) system requirements, (2) software requirements, (3) analysis, (4) program design, (5) coding, (6) testing, and (7) operation, supported by reference study, observation, and peer discussion. The results show that by using Java applets, MySQL Spatial and PHP, a web-based GIS for groundwater management can be customized to create spatial modeling and well-log modeling, and is user friendly, interactive, interoperable, informative, and easy to access from a LAN/WAN-connected PC. The application is very helpful in balancing groundwater supply and production, and in monitoring groundwater levels, water quality, and groundwater users. Hopefully, the implementation of the system will support groundwater conservation for sustainable development.

  9. Web-Scale Multidimensional Visualization of Big Spatial Data to Support Earth Sciences—A Case Study with Visualizing Climate Simulation Data

    Directory of Open Access Journals (Sweden)

    Sizhe Wang

    2017-06-01

    Full Text Available The world is undergoing rapid changes in its climate, environment, and ecosystems due to increasing population growth, urbanization, and industrialization. Numerical simulation is becoming an important vehicle to enhance the understanding of these changes and their impacts, with regional and global simulation models producing vast amounts of data. Comprehending these multidimensional data and fostering collaborative scientific discovery requires the development of new visualization techniques. In this paper, we present a cyberinfrastructure solution—PolarGlobe—that enables comprehensive analysis and collaboration. PolarGlobe is implemented upon an emerging web graphics library, WebGL, and the open-source virtual globe system Cesium, which can map spatial data onto a virtual Earth. We have also integrated volume rendering techniques, value and spatial filters, and vertical profile visualization to improve rendered images and support a comprehensive exploration of multi-dimensional spatial data. In this study, the climate simulation dataset produced by the extended polar version of the well-known Weather Research and Forecasting Model (WRF) is used to test the proposed techniques. PolarGlobe is also easily extendable to enable data visualization for other Earth Science domains, such as oceanography, weather, or geology.

  10. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    Science.gov (United States)

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever-increasing volumes of highly valuable biological data. To make sense of these data, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory efficiency and speed, thus allowing for processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/.

  11. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  12. Geostatistics and Geographic Information System to Analyze the Spatial Distribution of the Diversity of Anastrepha Species (Diptera: Tephritidae): the Effect of Forest Fragments in an Urban Area.

    Science.gov (United States)

    Garcia, A G; Araujo, M R; Uramoto, K; Walder, J M M; Zucchi, R A

    2017-12-08

    Fruit flies are among the most damaging insect pests of commercial fruit in Brazil. It is important to understand the landscape elements that may favor these flies. In the present study, spatial data from surveys of species of Anastrepha Schiner (Diptera: Tephritidae) in an urban area with forest fragments were analyzed, using geostatistics and Geographic Information System (GIS) to map the diversity of insects and evaluate how the forest fragments drive the spatial patterns. The results indicated a high diversity of species associated with large fragments, and a trend toward lower diversity in the more urbanized area, as the fragment sizes decreased. We concluded that the diversity of Anastrepha species is directly and positively related to large and continuous forest fragments in urbanized areas, and that combining geostatistics and GIS is a promising method for use in insect-pest management and sampling involving fruit flies.
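
    As a rough sketch of the kind of workflow summarized above, the snippet below computes a Shannon diversity index per trap and interpolates it onto a grid with inverse-distance weighting; the trap coordinates and species counts are synthetic, and IDW stands in as a simple example of the geostatistical interpolation step rather than the study's exact method.

        # Shannon diversity per trap, then an IDW surface of diversity (synthetic data).
        import numpy as np

        xy = np.array([[100, 200], [400, 250], [250, 600], [700, 500], [600, 100]], float)   # trap coords (m)
        counts = np.array([[12, 3, 0, 1], [5, 5, 4, 2], [20, 1, 0, 0], [8, 7, 6, 5], [2, 0, 0, 0]], float)

        def shannon(row):
            p = row[row > 0] / row.sum()
            return float(-(p * np.log(p)).sum())

        h = np.array([shannon(r) for r in counts])

        def idw(grid_xy, sample_xy, values, power=2.0):
            d = np.linalg.norm(grid_xy[:, None, :] - sample_xy[None, :, :], axis=2)
            d = np.maximum(d, 1e-9)                  # avoid division by zero at sample points
            w = 1.0 / d ** power
            return (w * values).sum(axis=1) / w.sum(axis=1)

        # Interpolate onto a coarse 10 x 10 grid covering the study area.
        gx, gy = np.meshgrid(np.linspace(0, 800, 10), np.linspace(0, 700, 10))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        surface = idw(grid, xy, h).reshape(gx.shape)
        print(surface.round(2))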

  13. Participatory GIS in design of the Wroclaw University of Science and Technology campus web map and spatial analysis of campus area quality

    Science.gov (United States)

    Blachowski, Jan; Łuczak, Jakub; Zagrodnik, Paulina

    2018-01-01

    Public participation geographic information system (GIS) and participatory mapping data collection methods are means that enhance capacity in generating, managing, and communicating spatial information in various fields ranging from local planning to environmental management. In this study, these methods have been used in two ways: first, to gather information on the additional functionality of the campus web map expected by its potential users, i.e. students, staff and visitors, through a web-based survey; and second, to collect geographically referenced information on campus areas that are liked and disliked in a geo-survey carried out with the ArcGIS Online GeoForm application. The results of the first survey were used to map facilities such as bicycle infrastructure, building entrances, wheelchair-accessible infrastructure and benches. The results of the second were used to analyse the most and the least attractive parts of the campus with heat-map and hot spot analyses in GIS. In addition, the answers have been studied with regard to the visual and functional aspects of the campus area raised in the survey. The thematic layers developed from the field mapping and the geoprocessing of geo-survey data were included in the campus web map project. The paper describes the applied methodology of data collection, processing, analysis, interpretation and geovisualisation.
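
    A minimal sketch of a Getis-Ord Gi* hot-spot calculation of the sort mentioned above, applied to point-based survey scores; the coordinates, scores and the 150 m distance band are synthetic choices rather than the study's data or settings.

        # Getis-Ord Gi* with a binary distance-band weight matrix (synthetic data).
        import numpy as np

        xy = np.random.default_rng(0).uniform(0, 1000, size=(60, 2))            # survey locations (m)
        score = np.random.default_rng(1).poisson(3, size=60).astype(float)      # e.g. "liked" votes per location

        def gi_star(xy, x, band=150.0):
            n = len(x)
            d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
            w = (d <= band).astype(float)            # binary weights; w[i, i] = 1 gives the "star" form
            xbar, s = x.mean(), x.std()              # population standard deviation, as in the Gi* definition
            wx = w @ x
            sw, sw2 = w.sum(axis=1), (w ** 2).sum(axis=1)
            denom = s * np.sqrt((n * sw2 - sw ** 2) / (n - 1))
            return (wx - xbar * sw) / denom          # z-like scores; assumes the band is smaller than the study area

        z = gi_star(xy, score)
        print("hot spots (z > 1.96):", np.where(z > 1.96)[0])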

  14. Participatory GIS in design of the Wroclaw University of Science and Technology campus web map and spatial analysis of campus area quality

    Directory of Open Access Journals (Sweden)

    Blachowski Jan

    2018-01-01

    Full Text Available Public participation geographic information system (GIS) and participatory mapping data collection methods are means that enhance capacity in generating, managing, and communicating spatial information in various fields ranging from local planning to environmental management. In this study, these methods have been used in two ways: first, to gather information on the additional functionality of the campus web map expected by its potential users, i.e. students, staff and visitors, through a web-based survey; and second, to collect geographically referenced information on campus areas that are liked and disliked in a geo-survey carried out with the ArcGIS Online GeoForm application. The results of the first survey were used to map facilities such as bicycle infrastructure, building entrances, wheelchair-accessible infrastructure and benches. The results of the second were used to analyse the most and the least attractive parts of the campus with heat-map and hot spot analyses in GIS. In addition, the answers have been studied with regard to the visual and functional aspects of the campus area raised in the survey. The thematic layers developed from the field mapping and the geoprocessing of geo-survey data were included in the campus web map project. The paper describes the applied methodology of data collection, processing, analysis, interpretation and geovisualisation.

  15. Analyzing spatial variability of soil properties in the urban park before and after reconstruction to support decision-making in landscaping

    Science.gov (United States)

    Romzaikina, Olga; Vasenev, Viacheslav; Khakimova, Rita

    2017-04-01

    On-going urbanization stresses the necessity for structured and aesthetically organized urban landscapes to improve citizens' quality of life. Urban soils and vegetation are the main components of urban ecosystems. Urban greenery regulates the climate, controls air quality and supports biodiversity in urban areas. Soils play a key role in supporting urban greenery. However, soils of urban parks also perform other important environmental functions. Urban soils are influenced by a variety of environmental and anthropogenic factors and, as a result, are highly heterogeneous and dynamic. Reconstructions of green zones and urban parks, which commonly occur in cities, alter soil properties. Analyzing the spatial variability and dynamics of soil properties is important to support decision-making in landscaping. Therefore, the research aimed to analyze the spatial distribution of the key soil properties (acidity, soil organic carbon (SOC) and nutrient contents) in an urban park before and after reconstruction to support decision-making in selecting ornamental plants for landscaping. The research was conducted in the urban park named after Artyom Borovik in Moscow before (2012) and after (2014) the reconstruction. Maps of urban soil properties for both periods were created by interpolation of the field data. The observed urban soils included recreazems, urbanozems and constructozems. Before the reconstruction, soils were sampled using a uniform design (a grid with a 100 m cell size and key plots of 50 m). After the reconstruction, additional samples were collected at locations where the land cover and functional zones had changed as a result of the reconstruction. We sampled at depths of 0-30, 30-50 and 50-100 cm. The following soil properties were measured: pH, SOC, K2O and P2O5. The maps of the analyzed properties were developed in the open-source QGIS 2.4 software using IDW interpolation. The vegetation in the park was examined using a visual assessment scale. The results of the visual

  16. Analyzing Snowpack Metrics Over Large Spatial Extents Using Calibrated, Enhanced-Resolution Brightness Temperature Data and Long Short Term Memory Artificial Neural Networks

    Science.gov (United States)

    Norris, W.; J Q Farmer, C.

    2017-12-01

    Snow water equivalence (SWE) is a difficult metric to measure accurately over large spatial extents; SNOTEL sites are too localized, and traditional remotely sensed brightness temperature data are at too coarse a resolution to capture variation. The new Calibrated Enhanced-Resolution Brightness Temperature (CETB) data from the National Snow and Ice Data Center (NSIDC) offer remotely sensed brightness temperature data at an enhanced resolution of 3.125 km versus the original 25 km, which allows large spatial extents to be analyzed with reduced uncertainty compared to the 25 km product. While the 25 km brightness temperature data have proved useful in past research — one group found decreasing trends in SWE outweighed increasing trends three to one in North America; other researchers used the data to incorporate winter conditions, like snow cover, into ecological zoning criteria — with the new 3.125 km data, it is possible to derive more accurate metrics for SWE, since we have far more spatial variability in measurements. Even with higher-resolution data, using the 37 and 19 GHz frequencies to estimate SWE distorts the data during times of melt onset and accumulation onset. Past researchers employed statistical splines, while other successful attempts utilized non-parametric curve fitting to smooth out spikes distorting metrics. In this work, rather than using legacy curve-fitting techniques, a Long Short-Term Memory (LSTM) Artificial Neural Network (ANN) was trained to perform curve fitting on the data. LSTM ANNs have shown great promise in modeling time series data, and with almost 40 years of data available — 14,235 days — there is plenty of training data for the ANN. LSTMs are ideal for this type of time series analysis because they allow important trends to persist for long periods of time, but ignore short-term fluctuations; since LSTMs have poor mid- to short-term memory, they are ideal for smoothing out the large spikes generated in the melt
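
    The following is a minimal, synthetic-data sketch of LSTM-based smoothing of a noisy daily series, in the spirit of the curve-fitting step described above. In practice the training target would come from quality-controlled observations rather than a known smooth curve, and the network size, learning rate and epoch count here are arbitrary choices.

        # Train a small LSTM to map a spike-contaminated daily series to a smooth target.
        import numpy as np
        import torch
        import torch.nn as nn

        t = np.arange(365 * 4)
        smooth = 20 * np.maximum(0, np.sin(2 * np.pi * t / 365.0))          # idealized seasonal signal
        noisy = smooth + np.random.default_rng(0).normal(0, 2, t.size)
        noisy[::50] += 15                                                    # sporadic melt-like spikes

        x = torch.tensor(noisy, dtype=torch.float32).view(1, -1, 1)          # (batch, time, feature)
        y = torch.tensor(smooth, dtype=torch.float32).view(1, -1, 1)

        class Smoother(nn.Module):
            def __init__(self, hidden=32):
                super().__init__()
                self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
                self.head = nn.Linear(hidden, 1)
            def forward(self, seq):
                out, _ = self.lstm(seq)
                return self.head(out)

        model = Smoother()
        opt = torch.optim.Adam(model.parameters(), lr=1e-2)
        for epoch in range(200):                                             # tiny demo training loop
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(x), y)
            loss.backward()
            opt.step()

        fitted = model(x).detach().view(-1).numpy()                          # smoothed series
        print("final MSE:", float(nn.functional.mse_loss(model(x), y)), "first values:", fitted[:5].round(1))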

  17. Use of a Web-based physical activity record system to analyze behavior in a large population: cross-sectional study.

    Science.gov (United States)

    Namba, Hideyuki; Yamada, Yosuke; Ishida, Mika; Takase, Hideto; Kimura, Misaka

    2015-03-19

    The use of Web-based physical activity systems has been proposed as an easy method for collecting physical activity data. We have developed a system that has exhibited high accuracy as assessed by the doubly labeled water method. The purpose of this study was to collect behavioral data from a large population using our Web-based physical activity record system and assess the physical activity of the population based on these data. In this paper, we address differences in physical activity across urban scales. In total, 2046 participants (aged 30-59 years; 1105 men and 941 women) participated in the study. They were asked to complete data entry before bedtime using their personal computer on 1 weekday and 1 weekend day. Their residential information was categorized as urban, urban-rural, or rural. Participant responses expressed the intensity of each activity at 15-minute increments and were recorded on a Web server. Residential areas were compared and multiple regression analysis was performed. Most participants had a metabolic equivalent (MET) ranging from 1.4 to 1.8, and the mean MET was 1.60 (SD 0.28). The median value of moderate-to-vigorous physical activity (MVPA, ≥3 MET) was 7.92 MET-hours/day. A 1-way ANCOVA showed that total physical activity differed depending on the type of residential area (F2,2027=5.19, P=.006). The urban areas (n=950) had the lowest MET-hours/day (mean 37.8, SD 6.0), followed by urban-rural areas (n=432; mean 38.6, SD 6.5; P=.04), and rural areas (n=664; mean 38.8, SD 7.4; P=.002). Two-way ANCOVA showed a significant interaction between sex and area of residence on the urban scale (F2,2036=4.53, P=.01). Men in urban areas had the lowest MET-hours/day (MVPA, ≥3 MET) at mean 7.9 (SD 8.7); men in rural areas had MET-hours/day (MVPA, ≥3 MET) of mean 10.8 (SD 12.1, P=.002). No significant difference was noted in women among the 3 residential areas. Multiple regression analysis showed that physical activity consisting of
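
    As a small worked example of the bookkeeping behind the metrics above, the sketch below turns one hypothetical day of 15-minute MET entries into a mean MET and an MVPA (≥3 MET) MET-hours/day value; the activity values are invented, not taken from the study.

        # Convert one day of 96 quarter-hour MET records into summary metrics.
        import numpy as np

        met = np.full(96, 1.0)            # 96 quarter-hour slots, resting = 1 MET
        met[28:32] = 3.5                  # e.g. a brisk 1-hour walk (07:00-08:00)
        met[72:74] = 6.0                  # e.g. 30 minutes of sport in the evening

        mean_met = met.mean()
        mvpa_met_hours = (met[met >= 3.0] * 0.25).sum()   # each slot is 0.25 h
        print(f"mean MET = {mean_met:.2f}, MVPA = {mvpa_met_hours:.2f} MET-hours/day")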

  18. Web-based spatial data infrastructure: a solution for the sustainable management of thematic information supported by aerial orthophotography

    Directory of Open Access Journals (Sweden)

    David Hernández López

    2013-01-01

    Full Text Available Within the framework of the Spanish National Plan for Aerial Orthophotography in Castilla-La Mancha, a web-based spatial data infrastructure has been developed that enables the sustainable management of spatial and thematic information, together with efficient quality control of orthophoto production. Specifically, thematic cartographic information has been disseminated in the form of infrared imagery and its physical parameters (reflectance and radiance), which allow its exploitation in applications related to the extraction of physical parameters, the evolution of forest cover, and the agronomic analysis of plant species. Since the greatest possible transparency has been pursued throughout, a geoportal has been created that offers all information of interest on the project, based on spatial data infrastructure technologies (http://ide.jccm.es/pnoa), incorporating web map services.

  19. simSALUD - a Web-based Spatial Microsimulation Application to Support Regional Health Planning in Austria

    OpenAIRE

    Melanie Tomintz; Bernhard Kosar; Victor Garcia-Barrios

    2013-01-01

    The Austrian Federal Ministry of Health aims to improve the health of all people living in Austria and to decrease health and social inequalities. This requires careful planning and distribution of the available health care resources to meet government aims. The research project SALUD, funded by the Federal Ministry for Transport, Innovation and Technology and the Austrian Science Fund, focuses on building a Spatial Microsimulation Model for Austria by combining survey and census data to mo...

  20. CAZymes Analysis Toolkit (CAT): web service for searching and analyzing carbohydrate-active enzymes in a newly sequenced organism using CAZy database.

    Science.gov (United States)

    Park, Byung H; Karpinets, Tatiana V; Syed, Mustafa H; Leuze, Michael R; Uberbacher, Edward C

    2010-12-01

    The Carbohydrate-Active Enzyme (CAZy) database provides a rich set of manually annotated enzymes that degrade, modify, or create glycosidic bonds. Despite rich and invaluable information stored in the database, software tools utilizing this information for annotation of newly sequenced genomes by CAZy families are limited. We have employed two annotation approaches to fill the gap between manually curated high-quality protein sequences collected in the CAZy database and the growing number of other protein sequences produced by genome or metagenome sequencing projects. The first approach is based on a similarity search against the entire nonredundant sequences of the CAZy database. The second approach performs annotation using links or correspondences between the CAZy families and protein family domains. The links were discovered using the association rule learning algorithm applied to sequences from the CAZy database. The approaches complement each other and in combination achieved high specificity and sensitivity when cross-evaluated with the manually curated genomes of Clostridium thermocellum ATCC 27405 and Saccharophagus degradans 2-40. The capability of the proposed framework to predict the function of unknown protein domains and of hypothetical proteins in the genome of Neurospora crassa is demonstrated. The framework is implemented as a Web service, the CAZymes Analysis Toolkit, and is available at http://cricket.ornl.gov/cgi-bin/cat.cgi.

  1. Spatial Keyword Querying

    DEFF Research Database (Denmark)

    Cao, Xin; Chen, Lisi; Cong, Gao

    2012-01-01

    The web is increasingly being used by mobile users. In addition, it is increasingly becoming possible to accurately geo-position mobile users and web content. This development gives prominence to spatial web data management. Specifically, a spatial keyword query takes a user location and user-sup...... different kinds of functionality as well as the ideas underlying their definition....

  2. Web Caching

    Indian Academy of Sciences (India)

    leveraged through Web caching technology. Specifically, Web caching becomes an ... Web routing can improve the overall performance of the Internet. Web caching is similar to memory system caching - a Web cache stores Web resources in ...

  3. Using Web Server Logs in Evaluating Instructional Web Sites.

    Science.gov (United States)

    Ingram, Albert L.

    2000-01-01

    Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…

  4. Temporal and spatial variation in polychlorinated biphenyl chiral signatures of the Greenland shark (Somniosus microcephalus) and its arctic marine food web

    International Nuclear Information System (INIS)

    Lu, Zhe; Fisk, Aaron T.; Kovacs, Kit M.; Lydersen, Christian; McKinney, Melissa A.; Tomy, Gregg T.; Rosenburg, Bruno; McMeans, Bailey C.; Muir, Derek C.G.; Wong, Charles S.

    2014-01-01

    Polychlorinated biphenyls (PCBs) chiral signatures were measured in Greenland sharks (Somniosus microcephalus) and their potential prey in arctic marine food webs from Canada (Cumberland Sound) and Europe (Svalbard) to assess temporal and spatial variation in PCB contamination at the stereoisomer level. Marine mammals had species-specific enantiomer fractions (EFs), likely due to a combination of in vivo biotransformation and direct trophic transfer. Greenland sharks from Cumberland Sound in 2007–2008 had similar EFs to those sharks collected a decade ago in the same location (PCBs 91, 136 and 149) and also similar to their conspecifics from Svalbard for some PCB congeners (PCBs 95, 136 and 149). However, other PCB EFs in the sharks varied temporally (PCB 91) or spatially (PCB 95), suggesting a possible spatiotemporal variation in their diets, since biotransformation capacity was unlikely to have varied within this species from region to region or over the time frame studied. -- Highlights: • Chiral PCB signatures were measured in Greenland sharks and their prey. • Marine mammals accumulated non-racemic PCBs from biotransformation and their diet. • Chiral PCB signatures were similar in sharks at two different arctic locations. • Some changes in chiral PCB signatures in sharks over a decade. -- PCB chiral signatures in Greenland sharks shift over time and space, likely in parallel with dietary variation

  5. Energy Zones Study: A Comprehensive Web-Based Mapping Tool to Identify and Analyze Clean Energy Zones in the Eastern Interconnection

    Energy Technology Data Exchange (ETDEWEB)

    Koritarov, V.; Kuiper, J.; Hlava, K.; Orr, A.; Rollins, K.; Brunner, D.; Green, H.; Makar, J.; Ayers, A.; Holm, M.; Simunich, K.; Wang, J.; Augustine, C.; Heimiller, D.; Hurlbut, D. J.; Milbrandt, A.; Schneider, T. R.; et al.

    2013-09-01

    This report describes the work conducted in support of the Eastern Interconnection States’ Planning Council (EISPC) Energy Zones Study and the development of the Energy Zones Mapping Tool performed by a team of experts from three National Laboratories. The multi-laboratory effort was led by Argonne National Laboratory (Argonne), in collaboration with the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). In June 2009, the U.S. Department of Energy (DOE) and the National Energy Technology Laboratory published Funding Opportunity Announcement FOA-0000068, which invited applications for interconnection-level analysis and planning. In December 2009, the Eastern Interconnection Planning Collaborative (EIPC) and the EISPC were selected as two award recipients for the Eastern Interconnection. Subsequently, in 2010, DOE issued Research Call RC-BM-2010 to DOE’s Federal Laboratories to provide research support and assistance to FOA-0000068 awardees on a variety of key subjects. Argonne was selected as the lead laboratory to provide support to EISPC in developing a methodology and a mapping tool for identifying potential clean energy zones in the Eastern Interconnection. In developing the EISPC Energy Zones Mapping Tool (EZ Mapping Tool), Argonne, NREL, and ORNL closely collaborated with the EISPC Energy Zones Work Group which coordinated the work on the Energy Zones Study. The main product of the Energy Zones Study is the EZ Mapping Tool, which is a web-based decision support system that allows users to locate areas with high suitability for clean power generation in the U.S. portion of the Eastern Interconnection. The mapping tool includes 9 clean (low- or no-carbon) energy resource categories and 29 types of clean energy technologies. The EZ Mapping Tool contains an extensive geographic information system database and allows the user to apply a flexible modeling approach for the identification and analysis of potential energy zones

  6. Quantifying Uncertainty in the Trophic Magnification Factor Related to Spatial Movements of Organisms in a Food Web

    DEFF Research Database (Denmark)

    McLeod, Anne; Arnot, Jon; Borgå, Katrine

    2015-01-01

    included in the model. The model predictions of magnitude of TMFs conformed to empirical studies. There were differences in the relationship between the TMF and the octanol–water partitioning coefficient (KOW) depending on the modeling approach used; a parabolic relationship was predicted under...... deterministic scenarios, whereas a linear TMF–KOW relationship was predicted when the model was run stochastically. Incorporating spatial movements by fish had a major influence on the magnitude and variation of TMFs. Under conditions where organisms are collected exclusively from clean locations in highly...... heterogeneous systems, the results showed bias toward higher TMF estimates, for example the TMF for PCB 153 increased from 2.7 to 5.6 when fish movement was included. Small underestimations of TMFs were found where organisms were exclusively sampled in contaminated regions, although the model was found...

  7. How wide is a stream? Spatial extent of the potential "stream signature" in terrestrial food webs using meta-analysis.

    Science.gov (United States)

    Muehlbauer, Jeffrey D; Collins, Scott F; Doyle, Martin W; Tockner, Klement

    2014-01-01

    The magnitude of cross-ecosystem resource subsidies is increasingly well recognized; however, less is known about the distance these subsidies travel into the recipient landscape. In streams and rivers, this distance can delimit the "biological stream width," complementary to hydro-geomorphic measures (e.g., channel banks) that have typically defined stream ecosystem boundaries. In this study we used meta-analysis to define a "stream signature" on land that relates the stream-to-land subsidy to distance. The 50% stream signature, for example, identifies the point on the landscape where subsidy resources are still at half of their maximum (in- or near-stream) level. The decay curve for these data was best fit by a negative power function in which the 50% stream signature was concentrated near stream banks (1.5 m), but a non-trivial (10%) portion of the maximum subsidy level was still found > 0.5 km from the water's edge. The meta-analysis also identified explanatory variables that affect the stream signature. This improves our understanding of ecosystem conditions that permit spatially extensive subsidy transmission, such as in highly productive, middle-order streams and rivers. Resultant multivariate models from this analysis may be useful to managers implementing buffer rules and conservation strategies for stream and riparian function, as they facilitate prediction of the extent of subsidies. Our results stress that much of the subsidy remains near the stream, but also that subsidies (and aquatic organisms) are capable of long-distance dispersal into adjacent environments, and that the effective "biological stream width" of stream and river ecosystems is often much larger than has been defined by hydro-geomorphic metrics alone. Limited data available from marine and lake sources overlap well with the stream signature data, indicating that the "signature" approach may also be applicable to subsidy spatial dynamics across other ecosystems.
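
    A short sketch of the "stream signature" decay described above: fit a negative power function f(d) = a * d**(-b) to subsidy levels by least squares in log-log space, then invert it for the 50% and 10% points. The distances and subsidy levels below are synthetic, not the meta-analysis data.

        # Fit a negative power decay and solve for the distance at a given fraction of the maximum.
        import numpy as np

        distance_m = np.array([0.5, 1, 2, 5, 10, 50, 100, 500])                 # distance from the bank
        subsidy = np.array([1.2, 0.9, 0.55, 0.35, 0.22, 0.13, 0.11, 0.08])      # relative subsidy level

        # Linear least squares in log-log space: log f = log a - b log d.
        slope, log_a = np.polyfit(np.log(distance_m), np.log(subsidy), 1)
        a, b = np.exp(log_a), -slope

        def distance_at_fraction(frac, ref_level=subsidy.max()):
            # Solve a * d**(-b) = frac * ref_level for d.
            return (a / (frac * ref_level)) ** (1.0 / b)

        print(f"fit: f(d) = {a:.2f} * d^-{b:.2f}")
        print(f"50% signature at ~{distance_at_fraction(0.5):.1f} m, 10% at ~{distance_at_fraction(0.1):.0f} m")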

  8. CANGS DB: a stand-alone web-based database tool for processing, managing and analyzing 454 data in biodiversity studies

    Directory of Open Access Journals (Sweden)

    Schlötterer Christian

    2011-06-01

    Full Text Available Abstract Background Next generation sequencing (NGS) is widely used in metagenomic and transcriptomic analyses in biodiversity. The ease of data generation provided by NGS platforms has allowed researchers to perform these analyses on their particular study systems. In particular the 454 platform has become the preferred choice for PCR amplicon based biodiversity surveys because it generates the longest sequence reads. Nevertheless, the handling and organization of massive amounts of sequencing data poses a major problem for the research community, particularly when multiple researchers are involved in data acquisition and analysis. An integrated and user-friendly tool, which performs quality control, read trimming, PCR primer removal, and data organization is desperately needed, therefore, to make data interpretation fast and manageable. Findings We developed CANGS DB (Cleaning and Analyzing Next Generation Sequences DataBase), a flexible, stand-alone and user-friendly integrated database tool. CANGS DB is specifically designed to organize and manage the massive amount of sequencing data arising from various NGS projects. CANGS DB also provides an intuitive user interface for sequence trimming and quality control, taxonomy analysis and rarefaction analysis. Our database tool can be easily adapted to handle multiple sequencing projects in parallel with different sample information, amplicon sizes, primer sequences, and quality thresholds, which makes this software especially useful for non-bioinformaticians. Furthermore, CANGS DB is especially suited for projects where multiple users need to access the data. CANGS DB is available at http://code.google.com/p/cangsdb/. Conclusion CANGS DB provides a simple and user-friendly solution to process, store and analyze 454 sequencing data. Being a local database that is accessible through a user-friendly interface, CANGS DB provides the perfect tool for collaborative amplicon based biodiversity surveys.

  9. A Theoretical Approach to Analyze the Parametric Influence on Spatial Patterns of Spodoptera frugiperda (J.E. Smith) (Lepidoptera: Noctuidae) Populations.

    Science.gov (United States)

    Garcia, A G; Godoy, W A C

    2017-06-01

    Studies of the influence of biological parameters on the spatial distribution of lepidopteran insects can provide useful information for managing agricultural pests, since the larvae of many species cause serious impacts on crops. Computational models to simulate the spatial dynamics of insect populations are increasingly used, because of their efficiency in representing insect movement. In this study, we used a cellular automata model to explore different patterns of population distribution of Spodoptera frugiperda (J.E. Smith) (Lepidoptera: Noctuidae), when the values of two biological parameters that are able to influence the spatial pattern (larval viability and adult longevity) are varied. We mapped the spatial patterns observed as the parameters varied. Additionally, by using population data for S. frugiperda obtained in different hosts under laboratory conditions, we were able to describe the expected spatial patterns occurring in corn, cotton, millet, and soybean crops based on the parameters varied. The results are discussed from the perspective of insect ecology and pest management. We concluded that computational approaches can be important tools to study the relationship between the biological parameters and spatial distributions of lepidopteran insect pests.
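
    A toy sketch of a cellular-automaton model in the spirit of the approach above: larval viability sets how many individuals survive to adulthood, and adult longevity sets how many dispersal rounds adults get before reproducing. The grid size, parameter values, dispersal kernel and wrap-around boundary are hypothetical choices, not the authors' model.

        # Minimal cellular automaton for insect spread on a grid (all parameters hypothetical).
        import numpy as np

        grid = np.zeros((50, 50))
        grid[25, 25] = 100.0                 # initial infestation focus (larvae per cell)

        LARVAL_VIABILITY = 0.4               # fraction of larvae reaching adulthood
        ADULT_LONGEVITY_STEPS = 3            # dispersal rounds per generation
        FECUNDITY = 5.0                      # offspring per surviving adult

        def disperse_once(a):
            # Redistribute each cell's adults evenly among its 8 neighbours and itself
            # (np.roll gives a wrap-around boundary, acceptable for a toy example).
            out = np.zeros_like(a)
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    out += np.roll(np.roll(a, dx, axis=0), dy, axis=1) / 9.0
            return out

        for generation in range(10):
            adults = grid * LARVAL_VIABILITY
            for _ in range(ADULT_LONGEVITY_STEPS):   # longer-lived adults spread farther
                adults = disperse_once(adults)
            grid = adults * FECUNDITY

        print("occupied cells:", int((grid > 0.01).sum()), " total larvae:", round(grid.sum(), 1))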

  10. Analyzing Clickstreams

    DEFF Research Database (Denmark)

    Andersen, Jesper; Giversen, Anders; Jensen, Allan H.

    in modern enterprises. In the data warehousing pproach, selected information is extracted in advance and stored in a repository. This approach is used because of its high performance. However, in many situations a logical (rather than physical) integration of data is preferable. Previous web-based data......On-Line Analytical Processing (OLAP) enables analysts to gain insight into data through fast and interactive access to a variety of possible views on information, organized in a dimensional model. The demand for data integration is rapidly becoming larger as more and more information sources appear....... Extensible Markup Language (XML) is fast becoming the new standard for data representation and exchange on the World Wide Web. The rapid emergence of XML data on the web, e.g., business-to-business (B2B) ecommerce, is making it necessary for OLAP and other data analysis tools to handleXML data as well...

  11. Computer Programs for Obtaining and Analyzing Daily Mean Streamflow Data from the U.S. Geological Survey National Water Information System Web Site

    Science.gov (United States)

    Granato, Gregory E.

    2009-01-01

    Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. This report and the appendixes on the
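
    The report's own programs are distributed on its accompanying CD-ROM; purely as an illustration of the retrieval step, the sketch below pulls daily mean discharge from the public NWIS daily-values web service and computes simple statistics. The endpoint parameters, example site number and JSON layout are assumptions based on the publicly documented USGS water services, not code from the report.

        # Retrieve daily mean streamflow from the NWIS daily-values service (illustrative only).
        import requests
        import numpy as np

        params = {
            "format": "json",
            "sites": "01646500",          # example gauge (assumed: Potomac River near Washington, DC)
            "parameterCd": "00060",       # discharge, cubic feet per second
            "startDT": "2000-01-01",
            "endDT": "2000-12-31",
        }
        resp = requests.get("https://waterservices.usgs.gov/nwis/dv/", params=params, timeout=60)
        resp.raise_for_status()

        series = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
        flows = np.array([float(v["value"]) for v in series if float(v["value"]) >= 0])   # drop missing-value codes

        print(f"n = {flows.size}, mean = {flows.mean():.0f} cfs, "
              f"7-day low flow = {np.convolve(flows, np.ones(7) / 7, 'valid').min():.0f} cfs")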

  12. Coupling of the spatial dynamic of picoplankton and nanoflagellate grazing pressure and carbon flow of the microbial food web in the subtropical pelagic continental shelf ecosystem

    Science.gov (United States)

    Chiang, K.-P.; Tsai, A.-Y.; Tsai, P.-J.; Gong, G.-C.; Tsai, S.-F.

    2013-01-01

    In order to investigate the mechanisms behind the spatial dynamics of the picoplankton community (bacteria and Synechococcus spp.) and estimate the carbon flux of the microbial food web in the oligotrophic Taiwan Warm Current Water of the subtropical marine pelagic ecosystem, we conducted size-fractionation experiments on five cruises by the R/V Ocean Research II during the summers of 2010 and 2011 in the southern East China Sea. We carried out culture experiments using surface water which, according to a temperature-salinity (T-S) diagram, is characterized as oligotrophic Taiwan Current Warm Water. We found a negative correlation between bacteria growth rate and temperature, indicating that the active growth of heterotrophic bacteria might be induced by nutrients lifted from the deep layer by cold upwelling water. This finding suggests that the area we studied was a bottom-up controlled pelagic ecosystem. We suggest that the microbial food web of an oligotrophic ecosystem may be changed from top-down control to resource supply (bottom-up control) when a physical force brings nutrients into the oligotrophic ecosystem. Upwelling brings nutrient-rich water to the euphotic zone and promotes bacterial growth, increasing the picoplankton biomass, which increases the consumption rate of nanoflagellates. The net growth rate (growth rate minus grazing rate) becomes negative when the densities of bacteria and Synechococcus spp. are lower than the threshold values. The interaction between growth and grazing will limit the abundances of bacteria (10⁵–10⁶ cells mL⁻¹) and Synechococcus spp. (10⁴–10⁵ cells mL⁻¹) within a narrow range, forming a predator-prey eddy. Meanwhile, 62% of bacteria production and 55% of Synechococcus spp. production are transported to a higher trophic level (nanoflagellates), though the cascade effect might cause an underestimation of both percentages of transported carbon. Based on the increasing number of sizes we found in the size-fractionation experiments, we estimated that the predation

  13. The influence of nanoflagellates on the spatial variety of picoplankton and the carbon flow of the microbial food web in the oligotrophic subtropical pelagic continental shelf ecosystem

    Science.gov (United States)

    Chiang, Kuo-Ping; Tsai, An-Yi; Tsai, Pei-Jung; Gong, Gwo-Ching; Huang, Bang-Qin; Tsai, Sheng-Fang

    2014-06-01

    To investigate the mechanisms behind the spatial dynamics of the picoplankton community (bacteria and Synechococcus spp.) and to estimate the carbon flux of the microbial food web in the oligotrophic Taiwan Warm Current Water of the subtropical marine pelagic ecosystem, we conducted size-fractionation experiments during five cruises by the R/V Ocean Research II during the summers of 2010 and 2011 in the southern East China Sea. We carried out culture experiments using surface water, which according to a temperature-salinity (T-S) diagram, is characterized as oligotrophic Taiwan Current Warm Water. We found a negative correlation between bacteria growth rate and temperature, and another negative correlation between nitrate and temperature, indicating that the active growth of heterotrophic bacteria might be induced by nutrients lifted from a deep layer by cold upwelling water. This finding suggests that the area we studied was a bottom-up controlled pelagic ecosystem. Upwelling brings nutrient-rich water to the euphotic zone and promotes bacterial growth, resulting in increased picoplankton biomass, which increases the consumption rate of nanoflagellates. The net growth rate (growth rate minus grazing rate) becomes negative when the densities of bacteria and Synechococcus spp. are lower than the threshold values. The interaction between growth and grazing will limit the abundance of bacteria (10⁵–10⁶ cells ml⁻¹) and Synechococcus spp. (10⁴–10⁵ cells ml⁻¹) within a narrow range. Meanwhile, 61% of bacteria production and 54% of Synechococcus spp. production are transported to a higher trophic level (nanoflagellates), though the cascade effect might cause an underestimation of both percentages of transported carbon. Based on the successive size-fractionation experiments, we estimated that the predation values were underestimated and that the diet of nanoflagellates is composed of 64% bacteria and 36% Synechococcus spp.

  14. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high performance, extremely versatile transient analyzer is described. This sub-system was designed to be controlled through the data acquisition computer system, which allows hands-off operation. Thus it may be placed on the experiment side of the high voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample rate changing, which may be intermixed with multiple post-trigger operations with variable length blocks using normal, peak-to-peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation both at PPPL and in general

  15. Stochastic analysis of web page ranking

    NARCIS (Netherlands)

    Volkovich, Y.

    2009-01-01

    Today, the study of the World Wide Web is one of the most challenging subjects. In this work we consider the Web from a probabilistic point of view. We analyze the relations between various characteristics of the Web. In particular, we are interested in the Web properties that affect the Web page

  16. Characterizing web heuristics

    NARCIS (Netherlands)

    de Jong, Menno D.T.; van der Geest, Thea

    2000-01-01

    This article is intended to make Web designers more aware of the qualities of heuristics by presenting a framework for analyzing the characteristics of heuristics. The framework is meant to support Web designers in choosing among alternative heuristics. We hope that better knowledge of the

  17. Here be web proxies

    DEFF Research Database (Denmark)

    Weaver, Nicholas; Kreibich, Christian; Dam, Martin

    2014-01-01

    ,000 clients that include a novel proxy location technique based on traceroutes of the responses to TCP connection establishment requests, which provides additional clues regarding the purpose of the identified web proxies. Overall, we see 14% of Netalyzr-analyzed clients with results that suggest the presence...... of web proxies....

  18. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation measuring subsystems having different ratios of sensitivities to the elements of the sample and linearizing circuits having inverse function characteristics of calibration functions which correspond to the radiation measuring subsystems. A weighting adder computes a desired linear combination of the outputs of the linearizing circuits. Operators for operating between two or more different linear combinations are included

  19. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  20. Analyzing the Facebook Friendship Graph

    OpenAIRE

    Catanese, Salvatore; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo

    2010-01-01

    Online Social Networks (OSNs) have, during the last years, acquired huge and increasing popularity as one of the most important emerging Web phenomena, deeply modifying the behavior of users and contributing to build a solid substrate of connections and relationships among people using the Web. In this preliminary work, our purpose is to analyze Facebook, considering a significant sample of data reflecting relationships among subscribed users. Our goal is to extract, from this platform, relevant ...

  1. Dynamic Web Pages: Performance Impact on Web Servers.

    Science.gov (United States)

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)
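
    A minimal sketch of the kind of multivariate linear regression used to model dynamic-page performance as described above; the predictor set and all "measurements" below are synthetic, not the paper's experimental data.

        # Ordinary least squares model of response time vs. request characteristics (synthetic data).
        import numpy as np

        rng = np.random.default_rng(7)
        n = 200
        payload_kb = rng.uniform(1, 200, n)          # generated page size
        db_queries = rng.integers(0, 10, n)          # database calls per request
        concurrency = rng.integers(1, 50, n)         # simultaneous clients

        # Synthetic "measured" response time in milliseconds.
        response_ms = 5 + 0.8 * payload_kb + 12 * db_queries + 1.5 * concurrency + rng.normal(0, 8, n)

        X = np.column_stack([np.ones(n), payload_kb, db_queries, concurrency])   # design matrix with intercept
        coef, *_ = np.linalg.lstsq(X, response_ms, rcond=None)
        predicted = X @ coef
        r2 = 1 - ((response_ms - predicted) ** 2).sum() / ((response_ms - response_ms.mean()) ** 2).sum()

        print("coefficients [intercept, per kB, per query, per client]:", coef.round(2))
        print("R^2 =", round(r2, 3))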

  2. ANGDelMut – a web-based tool for predicting and analyzing functional loss mechanisms of amyotrophic lateral sclerosis-associated angiogenin mutations [v2; ref status: indexed, http://f1000r.es/2mc

    Directory of Open Access Journals (Sweden)

    Aditya K Padhi

    2013-12-01

    Full Text Available ANGDelMut is a web-based tool for predicting the functional consequences of missense mutations in the angiogenin (ANG) protein, which is associated with amyotrophic lateral sclerosis (ALS). Missense mutations in ANG result in loss of either ribonucleolytic activity or nuclear translocation activity or both of these functions, and in turn cause ALS. However, no web-based tools are available to predict whether a newly identified ANG mutation will possibly lead to ALS. More importantly, no web-implemented method is currently available to predict the mechanisms of loss-of-function(s) of ANG mutants. In light of this observation, we developed the ANGDelMut web-based tool, which predicts whether an ANG mutation is deleterious or benign. The user selects certain attributes from the input panel, which serves as a query to infer whether a mutant will exhibit loss of ribonucleolytic activity or nuclear translocation activity or whether the overall stability will be affected. The output states whether the mutation is deleterious or benign, and if it is deleterious, gives the possible mechanism(s) of loss-of-function. This web-based tool, freely available at http://bioschool.iitd.ernet.in/DelMut/, is the first of its kind to provide a platform for researchers and clinicians, to infer the functional consequences of ANG mutations and correlate their possible association with ALS ahead of experimental findings.

  3. ANGDelMut – a web-based tool for predicting and analyzing functional loss mechanisms of amyotrophic lateral sclerosis-associated angiogenin mutations [v3; ref status: indexed, http://f1000r.es/2yt

    Directory of Open Access Journals (Sweden)

    Aditya K Padhi

    2014-02-01

    Full Text Available ANGDelMut is a web-based tool for predicting the functional consequences of missense mutations in the angiogenin (ANG) protein, which is associated with amyotrophic lateral sclerosis (ALS). Missense mutations in ANG result in loss of either ribonucleolytic activity or nuclear translocation activity or both of these functions, and in turn cause ALS. However, no web-based tools are available to predict whether a newly identified ANG mutation will possibly lead to ALS. More importantly, no web-implemented method is currently available to predict the mechanisms of loss-of-function(s) of ANG mutants. In light of this observation, we developed the ANGDelMut web-based tool, which predicts whether an ANG mutation is deleterious or benign. The user selects certain attributes from the input panel, which serves as a query to infer whether a mutant will exhibit loss of ribonucleolytic activity or nuclear translocation activity or whether the overall stability will be affected. The output states whether the mutation is deleterious or benign, and if it is deleterious, gives the possible mechanism(s) of loss-of-function. This web-based tool, freely available at http://bioschool.iitd.ernet.in/DelMut/, is the first of its kind to provide a platform for researchers and clinicians, to infer the functional consequences of ANG mutations and correlate their possible association with ALS ahead of experimental findings.

  4. Web Mining

    Science.gov (United States)

    Fürnkranz, Johannes

    The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to Web data and documents. This chapter provides a brief overview of web mining techniques and research areas, most notably hypertext classification, wrapper induction, recommender systems and web usage mining.

  5. Teaching Tectonics to Undergraduates with Web GIS

    Science.gov (United States)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curriculum can support spatial learning. We present Web-based visualization and analysis tools developed with Javascript APIs to enhance tectonic curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploration of earthquake and volcano data sets, a subduction and elevation profile tool which facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS platform is independent and can be implemented on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation using two- and three- dimensional visualization and analytical software. Coverages which allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  6. The planktonic food web of the Bizerte lagoon (south-western Mediterranean) during summer: I. Spatial distribution under different anthropogenic pressures

    OpenAIRE

    Hlaili, A; Grami, B; Niquil, N; Gosselin, M; Hamel, D; Troussellier, Marc; Mabrouk, H

    2008-01-01

    The structure and the trophic interactions of the planktonic food web were investigated during summer 2004 in a coastal lagoon of southwestern Mediterranean Sea. Biomasses of planktonic components as well as bacterial and phytoplankton production and grazing by microzooplankton were quantified at four stations (MA, MB, MJ and R) inside the lagoon. Station MA was impacted by urban discharge, station MB was influenced by industrial activity, station MJ was located in a shellfish farming sector,...

  7. The planktonic food web of the Bizerte lagoon (south-western Mediterranean) during summer: I. Spatial distribution under different anthropogenic pressures

    Science.gov (United States)

    Sakka Hlaili, Asma; Grami, Boutheina; Niquil, Nathalie; Gosselin, Michel; Hamel, Dominique; Troussellier, Marc; Hadj Mabrouk, Hassine

    2008-06-01

    The structure and the trophic interactions of the planktonic food web were investigated during summer 2004 in a coastal lagoon of the south-western Mediterranean Sea. Biomasses of planktonic components as well as bacterial and phytoplankton production and grazing by microzooplankton were quantified at four stations (MA, MB, MJ and R) inside the lagoon. Station MA was impacted by urban discharge, station MB was influenced by industrial activity, station MJ was located in a shellfish farming sector, while station R represented the lagoon central area. Biomasses and production rates of bacteria (7-33 mg C m⁻³; 17.5-35 mg C m⁻³ d⁻¹) and phytoplankton (80-299 mg C m⁻³; 34-210 mg C m⁻³ d⁻¹) showed high values at station MJ, where substantial concentrations of nutrients (NO₃⁻ and Si(OH)₄) were found. Microphytoplankton, which dominated the total algal biomass and production (>82%), was characterized by the proliferation of several chain-forming diatoms. Microzooplankton was mainly composed of dinoflagellates (Torodinium, Protoperidinium and Dinophysis) and aloricate (Lohmaniellea and Strombidium) and tintinnid (Tintinnopsis, Tintinnus, Favella and Eutintinnus) ciliates. A higher biomass of these protozoa (359 mg C m⁻³) was observed at station MB, where large tintinnids were encountered. Mesozooplankton, mainly represented by Calanoida (Acartia, Temora, Calanus, Eucalanus, Paracalanus and Centropages) and Cyclopoida (Oithona) copepods, exhibited higher and lower biomasses at stations MA/MJ and MB, respectively. Bacterivory represented only 35% of bacterial production at stations MB and R, but higher fractions (65-70%) were observed at stations MA and MJ. Small heterotrophic flagellates and aloricate ciliates seemed to be the main controllers of bacteria. Pico- and nanophytoplankton represented a significant alternative carbon pool for micrograzers, whose grazing represented 67-90% of pico- and nano-algal production at all stations. Microzooplankton has, however, a

  8. The Geospatial Web and Local Geographical Education

    Science.gov (United States)

    Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.

    2010-01-01

    Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…

  9. Web archives

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2018-01-01

    This article deals with general web archives and the principles for selection of materials to be preserved. It opens with a brief overview of reasons why general web archives are needed. Sections two and three present major, long-term web archive initiatives and discuss the purposes and possible...... values of web archives and ask how to meet unknown future needs, demands and concerns. Section four analyses three main principles in contemporary web archiving strategies, topic-centric, domain-centric and time-centric archiving strategies, and section five discusses how to combine these to provide...... a broad and rich archive. Section six is concerned with inherent limitations and why web archives are always flawed. The last sections deal with the question of how web archives may fit into the rapidly expanding, but fragmented landscape of digital repositories taking care of various parts...

  10. Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. Given the exploratory nature of climate data analyses and the explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone share them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  11. Web-based geo-visualisation of spatial information to support evidence-based health policy: a case study of the development process of HealthTracks.

    Science.gov (United States)

    Jardine, Andrew; Mullan, Narelle; Gudes, Ori; Cosford, James; Moncrieff, Simon; West, Geoff; Xiao, Jianguo; Yun, Grace; Someford, Peter

    Place is of critical importance to health as it can reveal patterns of disease spread and clustering, associations with risk factors, and areas with the greatest need for, or least access to, healthcare services and promotion activities. Furthermore, in order to get a good understanding of the health status and needs of a particular area, a broad range of data is required, which can often be difficult and time consuming to obtain and collate. This process has been expedited by bringing together multiple data sources and making them available in an online geo-visualisation, HealthTracks, which consists of a mapping and a reporting component. The overall aim of the HealthTracks project is to make spatial health information more accessible to policymakers, analysts, planners and program managers to inform decision-making across the Department of Health Western Australia. Preliminary mapping and reporting applications were developed; these have been utilised to inform service planning, increase awareness of the utility of spatial information and improve efficiency in data access. The future for HealthTracks involves expanding the range of data available and developing new analytical capabilities in order to work towards providing external agencies, researchers and eventually the general public access to rich local area spatial data.

  12. Web Engineering

    Energy Technology Data Exchange (ETDEWEB)

    White, Bebo

    2003-06-23

    Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.

  13. Web 25

    DEFF Research Database (Denmark)

    the reader on an exciting time travel journey to learn more about the prehistory of the hyperlink, the birth of the Web, the spread of the early Web, and the Web’s introduction to the general public in mainstream media. Furthermore, case studies of blogs, literature, and traditional media going online...

  14. Web Page Recommendation Using Web Mining

    OpenAIRE

    Modraj Bhavsar; Mrs. P. M. Chavan

    2014-01-01

    On the World Wide Web, various kinds of content are generated in huge amounts, so web recommendation has become an important part of web applications in order to give relevant results to users. On the web, different kinds of recommendations are made available to users every day, including images, video, audio, query suggestions and web pages. In this paper we aim at providing a framework for web page recommendation. 1) First we describe the basics of web mining and the types of web mining. 2) Details of each...

  15. Understanding User-Web Interactions via Web Analytics

    CERN Document Server

    Jansen, Bernard J

    2009-01-01

    This lecture presents an overview of the Web analytics process, with a focus on providing insight and actionable outcomes from collecting and analyzing Internet data. The lecture first provides an overview of Web analytics, providing in essence, a condensed version of the entire lecture. The lecture then outlines the theoretical and methodological foundations of Web analytics in order to make obvious the strengths and shortcomings of Web analytics as an approach. These foundational elements include the psychological basis in behaviorism and methodological underpinning of trace data as an empir

  16. Sensor web

    Science.gov (United States)

    Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)

    2011-01-01

    A Sensor Web formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization to finely synchronize all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and respond, and then only listen during time slots corresponding to those pods which responded.
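
    The claim describes a two-phase synchronization (coarse, then fine) followed by neighbor pinging and slot-based listening. The toy Python sketch below only illustrates that sequence of steps under simplified assumptions; it is not the patented protocol.

```python
import random

class Pod:
    """Toy sensor pod with a local clock offset relative to the master clock."""
    def __init__(self, name):
        self.name = name
        self.offset = random.uniform(-5.0, 5.0)   # seconds off the master clock
        self.listening = random.random() < 0.9    # most pods are awake

    def coarse_sync(self, step=1.0):
        # Low-power coarse phase: remove the offset in whole steps only.
        self.offset -= round(self.offset / step) * step

    def fine_sync(self):
        # Higher-resolution fine phase: remove the remaining sub-step error.
        self.offset = 0.0

pods = [Pod(f"pod{i}") for i in range(6)]
for p in pods:
    p.coarse_sync()
    p.fine_sync()

# After synchronization, each pod pings its neighbors and only listens during
# the time slots of pods that responded.
responders = [p.name for p in pods if p.listening]
listen_slots = {p.name: [r for r in responders if r != p.name] for p in pods}
print(listen_slots["pod0"])
```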

  17. Web Service

    Science.gov (United States)

    ... topic data in XML format. Using the Web service, software developers can build applications that utilize MedlinePlus health topic information. The service accepts keyword searches as requests and returns relevant ...

  18. Assessment of spatial data infrastructures

    African Journals Online (AJOL)

    bases, networks, Web services and portals to facilitate and coordinate the availability, ... need for an SDI to support the spatial and land development planning .... inform integrated and development planning ... provincial and regional planning.

  19. Fiber webs

    Science.gov (United States)

    Roger M. Rowell; James S. Han; Von L. Byrd

    2005-01-01

    Wood fibers can be used to produce a wide variety of low-density three-dimensional webs, mats, and fiber-molded products. Short wood fibers blended with long fibers can be formed into flexible fiber mats, which can be made by physical entanglement, nonwoven needling, or thermoplastic fiber melt matrix technologies. The most common types of flexible mats are carded, air...

  20. Web Sitings.

    Science.gov (United States)

    Lo, Erika

    2001-01-01

    Presents seven mathematics games, located on the World Wide Web, for elementary students, including: Absurd Math: Pre-Algebra from Another Dimension; The Little Animals Activity Centre; MathDork Game Room (classic video games focusing on algebra); Lemonade Stand (students practice math and business skills); Math Cats (teaches the artistic beauty…

  1. Tracheal web

    International Nuclear Information System (INIS)

    Legasto, A.C.; Haller, J.O.; Giusti, R.J.

    2004-01-01

    Congenital tracheal web is a rare entity often misdiagnosed as refractory asthma. Clinical suspicion based on patient history, examination, and pulmonary function tests should lead to its consideration. Bronchoscopy combined with CT imaging and multiplanar reconstruction is an accepted, highly sensitive means of diagnosis. (orig.)

  2. Web components and the semantic web

    OpenAIRE

    Casey, Maire; Pahl, Claus

    2003-01-01

    Component-based software engineering on the Web differs from traditional component and software engineering. We investigate Web component engineering activities that are crucial for the development, composition, and deployment of components on the Web. The current Web Services and Semantic Web initiatives strongly influence our work. Focussing on Web component composition we develop description and reasoning techniques that support a component developer in the composition activities, focussing...

  3. Collective spatial keyword querying

    DEFF Research Database (Denmark)

    Cao, Xin; Cong, Gao; Jensen, Christian S.

    2011-01-01

    With the proliferation of geo-positioning and geo-tagging, spatial web objects that possess both a geographical location and a textual description are gaining in prevalence, and spatial keyword queries that exploit both location and textual description are gaining in prominence. However, the queries studied so far generally focus on finding individual objects that each satisfy a query rather than finding groups of objects where the objects in a group collectively satisfy a query. We define the problem of retrieving a group of spatial web objects such that the group's keywords cover the query's keywords and such that objects are nearest to the query location and have the lowest inter-object distances. Specifically, we study two variants of this problem, both of which are NP-complete. We devise exact solutions as well as approximate solutions with provable approximation bounds to the problems. We...
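
    The problem defined here (cover the query keywords with nearby, mutually close objects) invites a set-cover-style heuristic. The sketch below shows a generic greedy heuristic on made-up objects; it is not necessarily one of the exact or bounded-approximation algorithms devised in the paper.

```python
import math

# Toy spatial web objects: (id, (x, y), keywords). Purely illustrative data.
objects = [
    ("o1", (1.0, 1.0), {"pizza", "wifi"}),
    ("o2", (1.2, 0.8), {"coffee"}),
    ("o3", (5.0, 5.0), {"pizza", "coffee", "wifi"}),
    ("o4", (0.9, 1.1), {"parking"}),
]

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def greedy_collective(query_loc, query_keywords, objects):
    """Greedily pick nearby objects until their keywords cover the query."""
    remaining = set(query_keywords)
    group, candidates = [], list(objects)
    while remaining:
        # Prefer objects covering many remaining keywords; break ties by distance.
        best = max(
            (o for o in candidates if o[2] & remaining),
            key=lambda o: (len(o[2] & remaining), -dist(query_loc, o[1])),
            default=None,
        )
        if best is None:
            return None  # the query keywords cannot be covered
        group.append(best[0])
        remaining -= best[2]
        candidates.remove(best)
    return group

print(greedy_collective((1.0, 1.0), {"pizza", "coffee", "parking"}, objects))
```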

  4. Web Apollo: a web-based genomic annotation editing platform.

    Science.gov (United States)

    Lee, Eduardo; Helt, Gregg A; Reese, Justin T; Munoz-Torres, Monica C; Childers, Chris P; Buels, Robert M; Stein, Lincoln; Holmes, Ian H; Elsik, Christine G; Lewis, Suzanna E

    2013-08-30

    Web Apollo is the first instantaneous, collaborative genomic annotation editor available on the web. One of the natural consequences following from current advances in sequencing technology is that there are more and more researchers sequencing new genomes. These researchers require tools to describe the functional features of their newly sequenced genomes. With Web Apollo researchers can use any of the common browsers (for example, Chrome or Firefox) to jointly analyze and precisely describe the features of a genome in real time, whether they are in the same room or working from opposite sides of the world.

  5. Sistemas de Inteligencia Web basados en Redes Sociales

    Directory of Open Access Journals (Sweden)

    de la Rosa Troyano, Fco. Fernando

    2007-06-01

    Full Text Available Social Network Analysis (SNA) is an emerging area, essential in decision-making processes. Its capacity to analyze and intervene in a social network can be used to implement surveillance tasks in research centers or technology-based businesses. The aim of this work is to make a proposal for designing intelligence web systems based on social networks. The first obstacle to implementing these systems is the data-gathering process. In order to solve this problem, a methodology for extracting social networks is presented. The extraction process is carried out by analyzing search engine results. Queries are based on electronic mail addresses. From the extracted network, the spatial distribution of social relationships, the global thematic impact and the institutional relationships are also analyzed. As an example, the social structure of the REDES email distribution list is analyzed.

  6. This paper describes 14 Colombian web based “edu-communicational” projects. The aim is to analyze different types of platforms, different type of use and the elements that facilitate interaction with final users. The study sample is composed of three main

    Directory of Open Access Journals (Sweden)

    Tomás Durán Becerra

    2017-12-01

    Full Text Available This paper describes 14 Colombian web-based “edu-communicational” projects. The aim is to analyze different types of platforms, different types of use and the elements that facilitate interaction with final users. The study sample is composed of three main categories of sites: formal education sites, informal education sites and other types of sites that contain some kind of educational content. The research establishes different variables aimed at discovering educommunicative tools. Both the theoretical framework and the conceptual approach to edu-communication, as well as the methodological proposal applied, are retrieved from the works of De Oliveira (2009), Freire (2002), Barbas Coslado (2012), Pérez-Tornero (2004), Tejedor (2010), Said and Arcila (2011a) and O’Reilly (2009), among others. In conclusion, the article shows similarities and differences among the platforms that shape the online edu-communicational landscape in Colombia.

  7. ARCGIS ONLINE UTILIZATION AS MEDIA SUBMISSION OF THE SPATIAL INFORMATION IN MALANG

    Directory of Open Access Journals (Sweden)

    Akhmad Faruq Hamdani

    2017-04-01

    Full Text Available The development of technology has encouraged the delivery of information to become more interactive. One such technology for the provision of spatial information is ArcGIS Online. ArcGIS Online is a Web-based geographic information system developed by ESRI to use, create, analyze, and share maps. ArcGIS Online can be used to present the spatial data of Malang. The result of the analysis is a presentation of spatial information about Malang in the form of an interactive map that contains a general overview of Malang, its geographical conditions, and its social conditions, delivered through the story map feature in ArcGIS Online.

  8. Usare WebDewey

    OpenAIRE

    Baldi, Paolo

    2016-01-01

    This presentation shows how to use the WebDewey tool. Features of WebDewey. Italian WebDewey compared with American WebDewey. Querying Italian WebDewey. Italian WebDewey and MARC21. Italian WebDewey and UNIMARC. Numbers, captions, "equivalente verbale": Dewey decimal classification in Italian catalogues. Italian WebDewey and Nuovo soggettario. Italian WebDewey and LCSH. Italian WebDewey compared with printed version of Italian Dewey Classification (22. edition): advantages and disadvantages o...

  9. Semantic Web

    Directory of Open Access Journals (Sweden)

    Anna Lamandini

    2011-06-01

    Full Text Available The semantic Web is a technology at the service of knowledge which is aimed at accessibility and the sharing of content, facilitating interoperability between different systems, and as such is one of the nine key technological pillars of TIC (technologies for information and communication) within the third theme, programme specific cooperation, of the seventh programme framework for research and development (7°PQRS, 2007-2013). As a system it seeks to overcome the overload or excess of irrelevant information on the Internet, in order to facilitate specific or pertinent research. It is an extension of the existing Web in which the aim is cooperation between computers and people (the dream of Sir Tim Berners-Lee), where machines can give more support to people when integrating and elaborating data in order to obtain inferences and a global sharing of data. It is a technology that is able to favour the development of a “data web”, in other words the creation of a space of interconnected and shared data sets (Linked Data) which allows users to link different types of data coming from different sources. It is a technology that will have a great effect on everyday life since it will permit the planning of “intelligent applications” in various sectors such as education and training, research, the business world, public information, tourism, health, and e-government. It is an innovative technology that activates a social transformation (the socio-semantic Web) on a world level since it redefines the cognitive universe of users and enables the sharing not only of information but of significance (collective and connected intelligence).

  10. Semantic Web Requirements through Web Mining Techniques

    OpenAIRE

    Hassanzadeh, Hamed; Keyvanpour, Mohammad Reza

    2012-01-01

    In recent years, Semantic web has become a topic of active research in several fields of computer science and has applied in a wide range of domains such as bioinformatics, life sciences, and knowledge management. The two fast-developing research areas semantic web and web mining can complement each other and their different techniques can be used jointly or separately to solve the issues in both areas. In addition, since shifting from current web to semantic web mainly depends on the enhance...

  11. The content and design of Web sites : an empirical study

    NARCIS (Netherlands)

    Huizingh, EKRE

    2000-01-01

    To support the emergence of a solid knowledge base for analyzing Web activity, we have developed a framework to analyze and categorize the capabilities of Web sites. This distinguishes content from design. Content refers to the information, features, or services that are offered in the Web site,

  12. Responsive web design workflow

    OpenAIRE

    LAAK, TIMO

    2013-01-01

    Responsive Web Design Workflow is a literature review about Responsive Web Design, a web standards based modern web design paradigm. The goals of this research were to define what responsive web design is, determine its importance in building modern websites and describe a workflow for responsive web design projects. Responsive web design is a paradigm to create adaptive websites, which respond to the properties of the media that is used to render them. The three key elements of responsi...

  13. Analyzing the User Behavior toward Electronic Commerce Stimuli

    OpenAIRE

    Lorenzo-Romero, Carlota; Alarcón-del-Amo, María-del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this res...

  14. Analyzing the user behavior towards Electronic Commerce stimuli

    OpenAIRE

    Carlota Lorenzo-Romero; María-del-Carmen Alarcón-del-Amo

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e. navigational structure as utilitarian stimulus) versus nonverbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this resea...

  15. Sexual information seeking on web search engines.

    Science.gov (United States)

    Spink, Amanda; Koricich, Andrew; Jansen, B J; Cole, Charles

    2004-02-01

    Sexual information seeking is an important element within human information behavior. Seeking sexually related information on the Internet takes many forms and channels, including chat rooms discussions, accessing Websites or searching Web search engines for sexual materials. The study of sexual Web queries provides insight into sexually-related information-seeking behavior, of value to Web users and providers alike. We qualitatively analyzed queries from logs of 1,025,910 Alta Vista and AlltheWeb.com Web user queries from 2001. We compared the differences in sexually-related Web searching between Alta Vista and AlltheWeb.com users. Differences were found in session duration, query outcomes, and search term choices. Implications of the findings for sexual information seeking are discussed.

  16. Caustic Skeleton & Cosmic Web

    Science.gov (United States)

    Feldbrugge, Job; van de Weygaert, Rien; Hidding, Johan; Feldbrugge, Joost

    2018-05-01

    We present a general formalism for identifying the caustic structure of a dynamically evolving mass distribution, in an arbitrary dimensional space. The identification of caustics in fluids with Hamiltonian dynamics, viewed in Lagrangian space, corresponds to the classification of singularities in Lagrangian catastrophe theory. On the basis of this formalism we develop a theoretical framework for the dynamics of the formation of the cosmic web, and specifically those aspects that characterize its unique nature: its complex topological connectivity and multiscale spinal structure of sheetlike membranes, elongated filaments and compact cluster nodes. Given the collisionless nature of the gravitationally dominant dark matter component in the universe, the presented formalism entails an accurate description of the spatial organization of matter resulting from the gravitationally driven formation of cosmic structure. The present work represents a significant extension of the work by Arnol'd et al. [1], who classified the caustics that develop in one- and two-dimensional systems that evolve according to the Zel'dovich approximation. His seminal work established the defining role of emerging singularities in the formation of nonlinear structures in the universe. At the transition from the linear to nonlinear structure evolution, the first complex features emerge at locations where different fluid elements cross to establish multistream regions. Involving a complex folding of the 6-D sheetlike phase-space distribution, it manifests itself in the appearance of infinite density caustic features. The classification and characterization of these mass element foldings can be encapsulated in caustic conditions on the eigenvalue and eigenvector fields of the deformation tensor field. In this study we introduce an alternative and transparent proof for Lagrangian catastrophe theory. This facilitates the derivation of the caustic conditions for general Lagrangian fluids, with
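
    The caustic conditions mentioned here are stated on the eigenvalues of the deformation tensor. As a minimal numerical illustration, the sketch below evaluates a 2-D Zel'dovich-type shell-crossing condition (the largest eigenvalue of the second-derivative tensor of an assumed displacement potential reaching 1/D); it is a simplification, not the paper's general formalism.

```python
import numpy as np

# Illustrative 2-D setup: a displacement potential psi(q) on a Lagrangian grid.
n, L = 128, 10.0
q = np.linspace(0.0, L, n)
qx, qy = np.meshgrid(q, q, indexing="ij")
psi = np.sin(2 * np.pi * qx / L) * np.cos(2 * np.pi * qy / L)

dq = q[1] - q[0]
# Deformation tensor d_ij = d^2 psi / dq_i dq_j via finite differences.
psi_x, psi_y = np.gradient(psi, dq, dq)
psi_xx, psi_xy = np.gradient(psi_x, dq, dq)
psi_yx, psi_yy = np.gradient(psi_y, dq, dq)

# Largest eigenvalue of the symmetric 2x2 tensor at every grid point.
tr = psi_xx + psi_yy
det = psi_xx * psi_yy - psi_xy * psi_yx
lam_max = 0.5 * (tr + np.sqrt(np.maximum(tr**2 - 4 * det, 0.0)))

# Shell-crossing (caustic) condition for an assumed growth factor D: D * lam_max >= 1.
D = 1.5
caustic_mask = D * lam_max >= 1.0
print(f"fraction of the Lagrangian grid inside a caustic: {caustic_mask.mean():.3f}")
```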

  17. The spatial glaciological data infrastructure

    Directory of Open Access Journals (Sweden)

    T. Y. Khromova

    2014-01-01

    .mpg.igras.ru, all based on the spatial glaciological data. Another result is the digital and web-atlas «Snow and Ice of the Earth», presenting the example of open source of the spatial data on glaciology in the multi-program environment. Regional data bases created for regions of the Caucasus and the Antarctic Continent make it possible to develop various GIS models and to analyze interrelations, status and dynamics of glaciological parameters. The system of links provides easy access to distributed resources.

  18. Uncovering obfuscated web tracking

    OpenAIRE

    Espuña Buxó, Álvaro

    2016-01-01

    In this project we create a platform to automatically and dynamically detect whether a certain web page is using "canvas fingerprinting" and whether its use is being obfuscated. We also analyze the most visited pages according to Alexa and present the results obtained. In this project we develop a framework that tries to detect automatically and dynamically if a website is using canvas fingerprinting and if its usage is being obfuscated. We also analyze the top ranked...

  19. Characteristics of scientific web publications

    DEFF Research Database (Denmark)

    Thorlund Jepsen, Erik; Seiden, Piet; Ingwersen, Peter Emil Rerup

    2004-01-01

    were generated based on specifically selected domain topics that are searched for in three publicly accessible search engines (Google, AllTheWeb, and AltaVista). A sample of the retrieved hits was analyzed with regard to how various publication attributes correlated with the scientific quality...... of the content and whether this information could be employed to harvest, filter, and rank Web publications. The attributes analyzed were inlinks, outlinks, bibliographic references, file format, language, search engine overlap, structural position (according to site structure), and the occurrence of various...... types of metadata. As could be expected, the ranked output differs between the three search engines. Apparently, this is caused by differences in ranking algorithms rather than the databases themselves. In fact, because scientific Web content in this subject domain receives few inlinks, both Alta...

  20. Web TA Production (WebTA)

    Data.gov (United States)

    US Agency for International Development — WebTA is a web-based time and attendance system that supports USAID payroll administration functions, and is designed to capture hours worked, leave used and...

  1. Advanced Techniques in Web Intelligence-2 Web User Browsing Behaviour and Preference Analysis

    CERN Document Server

    Palade, Vasile; Jain, Lakhmi

    2013-01-01

    This research volume focuses on analyzing the web user browsing behaviour and preferences in traditional web-based environments, social  networks and web 2.0 applications,  by using advanced  techniques in data acquisition, data processing, pattern extraction and  cognitive science for modeling the human actions.  The book is directed to  graduate students, researchers/scientists and engineers  interested in updating their knowledge with the recent trends in web user analysis, for developing the next generation of web-based systems and applications.

  2. DISTANCE LEARNING ONLINE WEB 3 .0

    Directory of Open Access Journals (Sweden)

    S. M. Petryk

    2015-05-01

    Full Text Available This article analyzes the existing methods of identifying information in the semantic web, outlines the main problems of its implementation, and researches the use of the Semantic Web as part of distance learning. An alternative approach to identifying information and constructing relationships between information and acquired knowledge is proposed, based on the developed “spectrum of knowledge” method

  3. Intelligent Overload Control for Composite Web Services

    NARCIS (Netherlands)

    Meulenhoff, P.J.; Ostendorf, D.R.; Zivkovic, Miroslav; Meeuwissen, H.B.; Gijsen, B.M.M.

    2009-01-01

    In this paper, we analyze overload control for composite web services in service oriented architectures by an orchestrating broker, and propose two practical access control rules which effectively mitigate the effects of severe overloads at some web services in the composite service. These two rules

  4. Semantic web for dummies

    CERN Document Server

    Pollock, Jeffrey T

    2009-01-01

    Semantic Web technology is already changing how we interact with data on the Web. By connecting random information on the Internet in new ways, Web 3.0, as it is sometimes called, represents an exciting online evolution. Whether you're a consumer doing research online, a business owner who wants to offer your customers the most useful Web site, or an IT manager eager to understand Semantic Web solutions, Semantic Web For Dummies is the place to start! It will help you: know how the typical Internet user will recognize the effects of the Semantic Web; explore all the benefits the data Web offers t

  5. Mobile response in web panels

    NARCIS (Netherlands)

    de Bruijne, M.A.; Wijnant, A.

    2014-01-01

    This article investigates unintended mobile access to surveys in online, probability-based panels. We find that spontaneous tablet usage is drastically increasing in web surveys, while smartphone usage remains low. Further, we analyze the bias of respondent profiles using smartphones and tablets

  6. Web geoprocessing services on GML with a fast XML database ...

    African Journals Online (AJOL)

    Nowadays there exist quite a lot of Spatial Database Infrastructures (SDI) that facilitate the Geographic Information Systems (GIS) user community in getting access to distributed spatial data through web technology. However, sometimes the users first have to process available spatial data to obtain the needed information.

  7. Electron attachment analyzer

    International Nuclear Information System (INIS)

    Popp, P.; Grosse, H.J.; Leonhardt, J.; Mothes, S.; Oppermann, G.

    1984-01-01

    The invention concerns an electron attachment analyzer for detecting traces of electroaffine substances in electronegative gases, especially in air. The analyzer can be used for monitoring working places, e. g., in operating theatres. The analyzer consists of two electrodes inserted in a base frame of insulating material (quartz or ceramics) and a high-temperature resistant radiation source (⁸⁵Kr, ³H, or ⁶³Ni)

  8. Geographic Information Systems and Web Page Development

    Science.gov (United States)

    Reynolds, Justin

    2004-01-01

    The Facilities Engineering and Architectural Branch is responsible for the design and maintenance of buildings, laboratories, and civil structures. In order to improve efficiency and quality, the FEAB has dedicated itself to establishing a data infrastructure based on Geographic Information Systems, GIS. The value of GIS was explained in an article dating back to 1980 entitled "Need for a Multipurpose Cadastre", which stated, "There is a critical need for a better land-information system in the United States to improve land-conveyance procedures, furnish a basis for equitable taxation, and provide much-needed information for resource management and environmental planning." Scientists and engineers both point to GIS as the solution. What is GIS? According to most textbooks, a Geographic Information System is a class of software that stores, manages, and analyzes mappable features on, above, or below the surface of the earth. GIS software is basically database management software applied to the management of spatial data and information. Simply put, Geographic Information Systems manage, analyze, chart, graph, and map spatial information. At the outset, I was given goals and expectations from my branch and from my mentor with regard to the further implementation of GIS. Those goals are as follows: (1) Continue the development of GIS for the underground structures. (2) Extract and export annotated data from AutoCAD drawing files and construct a database (to serve as a prototype for future work). (3) Examine existing underground record drawings to determine existing and non-existing underground tanks. Once this data was collected and analyzed, I set out on the task of creating a user-friendly database that could be accessed by all members of the branch. It was important that the database be built using programs that most employees already possess, ruling out most AutoCAD-based viewers. Therefore, I set out to create an Access database that translated onto the web using Internet

  9. Comparing cosmic web classifiers using information theory

    International Nuclear Information System (INIS)

    Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin; Jasche, Jens

    2016-01-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  10. Comparing cosmic web classifiers using information theory

    Energy Technology Data Exchange (ETDEWEB)

    Leclercq, Florent [Institute of Cosmology and Gravitation (ICG), University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth PO1 3FX (United Kingdom); Lavaux, Guilhem; Wandelt, Benjamin [Institut d' Astrophysique de Paris (IAP), UMR 7095, CNRS – UPMC Université Paris 6, Sorbonne Universités, 98bis boulevard Arago, F-75014 Paris (France); Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: lavaux@iap.fr, E-mail: j.jasche@tum.de, E-mail: wandelt@iap.fr [Excellence Cluster Universe, Technische Universität München, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2016-08-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.
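
    The decision scheme rests on scoring classifiers by an information-theoretic utility. As a generic illustration only, the sketch below ranks two hypothetical classifiers by the mutual information between their web-type labels and a binned observable; the counts are invented, and this is not the paper's actual utility functions, classifiers or data.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information (in bits) from a joint count table over (type, observable)."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Illustrative joint counts of web type (void/sheet/filament/cluster) versus a
# two-bin observable (e.g. galaxy color), one table per hypothetical classifier.
classifier_tables = {
    "classifier_A": np.array([[30,  5], [20, 15], [10, 25], [ 2, 28]], float),
    "classifier_B": np.array([[20, 15], [18, 17], [15, 20], [10, 20]], float),
}

scores = {name: mutual_information(t) for name, t in classifier_tables.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```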

  11. Nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Stritar, A.

    1986-01-01

    The development of Nuclear Power Plant Analyzers in the USA is described. There are two different types of Analyzers under development in the USA: the first at the Idaho and Los Alamos National Laboratories, the second at Brookhaven National Laboratory. The latter is described in detail. The computer hardware and the mathematical models of the reactor vessel thermalhydraulics are described. (author)

  12. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  13. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of vari...

  14. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.

  15. Het WEB leert begrijpen

    CERN Multimedia

    Stroeykens, Steven

    2004-01-01

    The WEB could be much more useful if computers understood something of the information on Web pages. That explains the goal of the "semantic Web", a project in which, amongst others, Tim Berners-Lee, the inventor of the original WEB, takes part

  16. Instant responsive web design

    CERN Document Server

    Simmons, Cory

    2013-01-01

    A step-by-step tutorial approach which will teach the readers what responsive web design is and how it is used in designing a responsive web page.If you are a web-designer looking to expand your skill set by learning the quickly growing industry standard of responsive web design, this book is ideal for you. Knowledge of CSS is assumed.

  17. U.S. Geological Survey spatial data access

    Science.gov (United States)

    Faundeen, John L.; Kanengieter, Ronald L.; Buswell, Michael D.

    2002-01-01

    The U.S. Geological Survey (USGS) has done a progress review on improving access to its spatial data holdings over the Web. The USGS EROS Data Center has created three major Web-based interfaces to deliver spatial data to the general public; they are Earth Explorer, the Seamless Data Distribution System (SDDS), and the USGS Web Mapping Portal. Lessons were learned in developing these systems, and various resources were needed for their implementation. The USGS serves as a fact-finding agency in the U.S. Government that collects, monitors, analyzes, and provides scientific information about natural resource conditions and issues. To carry out its mission, the USGS has created and managed spatial data since its inception. Originally relying on paper maps, the USGS now uses advanced technology to produce digital representations of the Earth’s features. The spatial products of the USGS include both source and derivative data. Derivative datasets include Digital Orthophoto Quadrangles (DOQ), Digital Elevation Models, Digital Line Graphs, land-cover Digital Raster Graphics, and the seamless National Elevation Dataset. These products, created with automated processes, use aerial photographs, satellite images, or other cartographic information such as scanned paper maps as source data. With Earth Explorer, users can search multiple inventories through metadata queries and can browse satellite and DOQ imagery. They can place orders and make payment through secure credit card transactions. Some USGS spatial data can be accessed with SDDS. The SDDS uses an ArcIMS map service interface to identify the user’s areas of interest and determine the output format; it allows the user to either download the actual spatial data directly for small areas or place orders for larger areas to be delivered on media. The USGS Web Mapping Portal provides views of national and international datasets through an ArcIMS map service interface. In addition, the map portal posts news about new

  18. The WebQuest: constructing creative learning.

    Science.gov (United States)

    Sanford, Julie; Townsend-Rocchiccioli, Judith; Trimm, Donna; Jacobs, Mike

    2010-10-01

    An exciting expansion of online educational opportunities is occurring in nursing. The use of a WebQuest as an inquiry-based learning activity can offer considerable opportunity for nurses to learn how to analyze and synthesize critical information. A WebQuest, as a constructivist, inquiry-oriented strategy, requires learners to use higher levels of thinking as a means to analyze and apply complex information, providing an exciting online teaching and learning strategy. A WebQuest is an inquiry-oriented lesson format in which most or all of the information learners work with comes from the web. This article provides an overview of the WebQuest as a teaching strategy and provides examples of its use. Copyright 2010, SLACK Incorporated.

  19. Recommendations for Benchmarking Web Site Usage among Academic Libraries.

    Science.gov (United States)

    Hightower, Christy; Sih, Julie; Tilghman, Adam

    1998-01-01

    To help library directors and Web developers create a benchmarking program to compare statistics of academic Web sites, the authors analyzed the Web server log files of 14 university science and engineering libraries. Recommends a centralized voluntary reporting structure coordinated by the Association of Research Libraries (ARL) and a method for…
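
    Benchmarking Web site usage across libraries presupposes comparable counts extracted from server logs. The sketch below parses Common Log Format lines and counts successful page views per day while filtering asset requests; the log layout and filtering rules are assumptions, not the ARL reporting method recommended in the article.

```python
import re
from collections import Counter

# Minimal Common Log Format parser (an assumption about the log layout).
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] "(?:GET|POST) (?P<path>\S+)[^"]*" (?P<status>\d{3})'
)

def page_views_per_day(lines):
    """Count successful HTML page views per day, skipping image and asset hits."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m or m.group("status") != "200":
            continue
        path = m.group("path").lower()
        if path.endswith((".gif", ".jpg", ".png", ".css", ".js", ".ico")):
            continue  # asset requests would inflate cross-site comparisons
        counts[m.group("day")] += 1
    return counts

sample = [
    '1.2.3.4 - - [10/Oct/1998:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326',
    '1.2.3.4 - - [10/Oct/1998:13:55:37 -0700] "GET /logo.gif HTTP/1.0" 200 512',
    '5.6.7.8 - - [11/Oct/1998:09:01:02 -0700] "GET /catalog/search HTTP/1.0" 200 1024',
]
print(page_views_per_day(sample))
```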

  20. Virtual Web Services

    OpenAIRE

    Rykowski, Jarogniew

    2007-01-01

    In this paper we propose an application of software agents to provide Virtual Web Services. A Virtual Web Service is a linked collection of several real and/or virtual Web Services, and public and private agents, accessed by the user in the same way as a single real Web Service. A Virtual Web Service allows unrestricted comparison, information merging, pipelining, etc., of data coming from different sources and in different forms. Detailed architecture and functionality of a single Virtual We...

  1. The Semantic Web Revisited

    OpenAIRE

    Shadbolt, Nigel; Berners-Lee, Tim; Hall, Wendy

    2006-01-01

    The original Scientific American article on the Semantic Web appeared in 2001. It described the evolution of a Web that consisted largely of documents for humans to read to one that included data and information for computers to manipulate. The Semantic Web is a Web of actionable information--information derived from data through a semantic theory for interpreting the symbols.This simple idea, however, remains largely unrealized. Shopbots and auction bots abound on the Web, but these are esse...

  2. Web Project Management

    OpenAIRE

    Suralkar, Sunita; Joshi, Nilambari; Meshram, B B

    2013-01-01

    This paper describes the need for Web project management and the fundamentals of project management for web projects: what it is, why projects go wrong, and what's different about web projects. We also discuss Cost Estimation Techniques based on Size Metrics. Though Web project development is similar to traditional software development applications, the special characteristics of Web application development require the adaptation of many software engineering approaches or even the development of comple...

  3. Analyzing the spatial positioning of nuclei in polynuclear giant cells

    International Nuclear Information System (INIS)

    Stange, Maike; Hintsche, Marius; Sachse, Kirsten; Gerhardt, Matthias; Beta, Carsten; Valleriani, Angelo

    2017-01-01

    How cells establish and maintain a well-defined size is a fundamental question of cell biology. Here we investigated to what extent the microtubule cytoskeleton can set a predefined cell size, independent of an enclosing cell membrane. We used electropulse-induced cell fusion to form giant multinuclear cells of the social amoeba Dictyostelium discoideum . Based on dual-color confocal imaging of cells that expressed fluorescent markers for the cell nucleus and the microtubules, we determined the subcellular distributions of nuclei and centrosomes in the giant cells. Our two- and three-dimensional imaging results showed that the positions of nuclei in giant cells do not fall onto a regular lattice. However, a comparison with model predictions for random positioning showed that the subcellular arrangement of nuclei maintains a low but still detectable degree of ordering. This can be explained by the steric requirements of the microtubule cytoskeleton, as confirmed by the effect of a microtubule degrading drug. (paper)

  4. Analyzing the spatial positioning of nuclei in polynuclear giant cells

    Science.gov (United States)

    Stange, Maike; Hintsche, Marius; Sachse, Kirsten; Gerhardt, Matthias; Valleriani, Angelo; Beta, Carsten

    2017-11-01

    How cells establish and maintain a well-defined size is a fundamental question of cell biology. Here we investigated to what extent the microtubule cytoskeleton can set a predefined cell size, independent of an enclosing cell membrane. We used electropulse-induced cell fusion to form giant multinuclear cells of the social amoeba Dictyostelium discoideum. Based on dual-color confocal imaging of cells that expressed fluorescent markers for the cell nucleus and the microtubules, we determined the subcellular distributions of nuclei and centrosomes in the giant cells. Our two- and three-dimensional imaging results showed that the positions of nuclei in giant cells do not fall onto a regular lattice. However, a comparison with model predictions for random positioning showed that the subcellular arrangement of nuclei maintains a low but still detectable degree of ordering. This can be explained by the steric requirements of the microtubule cytoskeleton, as confirmed by the effect of a microtubule degrading drug.
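
    The comparison against a random-positioning model can be pictured with nearest-neighbor statistics. The sketch below contrasts synthetic "observed" nuclei (random placement with a minimal separation mimicking steric exclusion) with a uniform random null model; the geometry and numbers are invented, and this is not the study's analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)

def nn_distances(points):
    """Nearest-neighbor distance for each point in a 2-D point set."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1)

# Synthetic "observed" nuclei: random positions kept apart by a minimal
# separation, mimicking steric exclusion by the microtubule cytoskeleton.
box, n, r_min = 100.0, 80, 6.0
observed = []
while len(observed) < n:
    p = rng.uniform(0, box, size=2)
    if all(np.linalg.norm(p - q) > r_min for q in observed):
        observed.append(p)
observed = np.array(observed)

# Null model: completely random (Poisson) positioning in the same box.
obs_mean = nn_distances(observed).mean()
rand_means = [nn_distances(rng.uniform(0, box, size=(n, 2))).mean() for _ in range(200)]
print(f"observed mean NN distance: {obs_mean:.2f}")
print(f"random model: {np.mean(rand_means):.2f} +/- {np.std(rand_means):.2f}")
```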

  5. WEB-GIS SOLUTIONS DEVELOPMENT FOR CITIZENS AND WATER COMPANIES

    Directory of Open Access Journals (Sweden)

    M. Şercăianu

    2013-05-01

    Full Text Available This paper describes the development of a web-GIS solution through which urban residents of Buzau City can be involved in the decision-support process of water companies, in order to reduce water losses by collecting information directly from citizens. In recent years, reducing the material and economic losses recorded across the entire municipal network management process has become the main focus of public companies in Romania. Due to the complexity of the problems that arise in collecting information from citizens and the issues identified in urban areas, further analyses were required of web-GIS solutions used in areas such as local government, public utilities, environmental protection and financial management. Another important problem is the poor development of the spatial database infrastructure found in public companies, and its connection to web platforms. Developing the entire communication process between residents and municipal companies has required the use of the "citizen-sensor" concept throughout the reporting process. Reported problems relate to water distribution networks, with the possibility of covering the entire public utilities infrastructure.

  6. Web-Gis Solutions Development for Citizens and Water Companies

    Science.gov (United States)

    Şercăianu, M.

    2013-05-01

    This paper describes the development of a web-GIS solution through which urban residents of Buzau City can be involved in the decision-support process of water companies, in order to reduce water losses by collecting information directly from citizens. In recent years, reducing the material and economic losses recorded across the entire municipal network management process has become the main focus of public companies in Romania. Because of the complexity of the problems that arise in collecting information from citizens and of the issues identified in urban areas, further analyses were required of web-GIS solutions used in areas such as local government, public utilities, environmental protection or financial management. Another important problem is the poor development of the spatial database infrastructure found in public companies and of its connection to web platforms. Developing the entire communication process between residents and municipal companies required the use of the "citizen-sensor" concept throughout the reporting process. The reported problems relate to water distribution networks, with the possibility of extending coverage to the entire public utilities infrastructure.

  7. A Web-Based Information System for Field Data Management

    Science.gov (United States)

    Weng, Y. H.; Sun, F. S.

    2014-12-01

    A web-based field data management system has been designed and developed to allow field geologists to store, organize, manage, and share field data online. System requirements were analyzed and clearly defined first regarding what data are to be stored, who the potential users are, and what system functions are needed in order to deliver the right data in the right way to the right user. A 3-tiered architecture was adopted to create this secure, scalable system, which consists of a web browser at the front end, a database at the back end, and a functional logic server in the middle. Specifically, HTML, CSS, and JavaScript were used to implement the user interface in the front-end tier, the Apache web server runs PHP scripts, and a MySQL server is used for the back-end database. The system accepts various types of field information, including image, audio, video, numeric, and text. It allows users to select data and populate them on either Google Earth or Google Maps for the examination of spatial relations. It also makes the sharing of field data easy by converting them into an XML format that is both human-readable and machine-readable, and thus ready for reuse.
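
    The record mentions exporting field data to an XML format that is both human- and machine-readable. A minimal sketch of such a conversion is given below in Python with hypothetical field names; the system described above is PHP/MySQL-based, so this only illustrates the idea, not its actual export code.

```python
# Minimal sketch: serialize one field observation to XML for sharing/reuse.
# Field names ("station", "lithology", ...) are hypothetical placeholders.
import xml.etree.ElementTree as ET

record = {
    "station": "ST-042",
    "latitude": "41.5021",
    "longitude": "-81.6095",
    "lithology": "sandstone",
    "notes": "cross-bedding visible near outcrop base",
}

obs = ET.Element("observation")
for key, value in record.items():
    ET.SubElement(obs, key).text = value

# Human-readable and machine-readable output, ready for reuse or map display.
ET.indent(obs)  # pretty-print (available in Python 3.9+)
print(ET.tostring(obs, encoding="unicode"))
```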

  8. Enabling Spatial OLAP Over Environmental and Farming Data with QB4SOLAP

    DEFF Research Database (Denmark)

    Gur, Nurefsan; Hose, Katja; Pedersen, Torben Bach

    2016-01-01

    Governmental organizations and agencies have been making large amounts of spatial data available on the Semantic Web (SW). However, we still lack efficient techniques for analyzing such large amounts of data as we know them from relational database systems, e.g., multidimensional (MD) data...... warehouses and On-line Analytical Processing (OLAP). A basic prerequisite to enable such advanced analytics is a well-defined schema, which can be defined using the QB4SOLAP vocabulary that provides sufficient context for spatial OLAP (SOLAP). In this paper, we address the challenging problem of MD querying...

  9. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small dimension instruments able to detect with great sensitivity and accuracy the main environmental pollutants. The mass spectrometers are very suited instrument for chemical and isotopic analysis, needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, the 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filter and trap, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk, for each of them some performances being sacrificed but we must know which parameters are necessary to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  10. WEB STRUCTURE MINING

    Directory of Open Access Journals (Sweden)

    CLAUDIA ELENA DINUCĂ

    2011-01-01

    Full Text Available The World Wide Web became one of the most valuable resources for information retrievals and knowledge discoveries due to the permanent increasing of the amount of data available online. Taking into consideration the web dimension, the users get easily lost in the web’s rich hyper structure. Application of data mining methods is the right solution for knowledge discovery on the Web. The knowledge extracted from the Web can be used to raise the performances for Web information retrievals, question answering and Web based data warehousing. In this paper, I provide an introduction of Web mining categories and I focus on one of these categories: the Web structure mining. Web structure mining, one of three categories of web mining for data, is a tool used to identify the relationship between Web pages linked by information or direct link connection. It offers information about how different pages are linked together to form this huge web. Web Structure Mining finds hidden basic structures and uses hyperlinks for more web applications such as web search.

  11. Semantic Web Technologies for the Adaptive Web

    DEFF Research Database (Denmark)

    Dolog, Peter; Nejdl, Wolfgang

    2007-01-01

    Ontologies and reasoning are the key terms brought into focus by the semantic web community. Formal representation of ontologies in a common data model on the web can be taken as a foundation for adaptive web technologies as well. This chapter describes how ontologies shared on the semantic web...... provide conceptualization for the links which are a main vehicle to access information on the web. The subject domain ontologies serve as constraints for generating only those links which are relevant for the domain a user is currently interested in. Furthermore, user model ontologies provide additional...... means for deciding which links to show, annotate, hide, generate, and reorder. The semantic web technologies provide means to formalize the domain ontologies and metadata created from them. The formalization enables reasoning for personalization decisions. This chapter describes which components...

  12. Classifying web genres in context: a case study documenting the web genres used by a software engineer

    NARCIS (Netherlands)

    Montesi, M.; Navarrete, T.

    2008-01-01

    This case study analyzes the Internet-based resources that a software engineer uses in his daily work. Methodologically, we studied the web browser history of the participant, classifying all the web pages he had seen over a period of 12 days into web genres. We interviewed him before and after the

  13. Historical Evolution of Spatial Abilities

    Directory of Open Access Journals (Sweden)

    A. Ardila

    1993-01-01

    Full Text Available Historical evolution and cross-cultural differences in spatial abilities are analyzed. Spatial abilities have been found to be significantly associated with the complexity of geographical conditions and survival demands. Although impaired spatial cognition is found in cases of, exclusively or predominantly, right hemisphere pathology, it is proposed that this asymmetry may depend on the degree of training in spatial abilities. It is further proposed that spatial cognition might have evolved in a parallel way with cultural evolution and environmental demands. Contemporary city humans might be using spatial abilities in some new, conceptual tasks that did not exist in prehistoric times: mathematics, reading, writing, mechanics, music, etc. Cross-cultural analysis of spatial abilities in different human groups, normalization of neuropsychological testing instruments, and clinical observations of spatial ability disturbances in people with different cultural backgrounds and various spatial requirements, are required to construct a neuropsychological theory of brain organization of spatial cognition.

  14. Applying semantic web services to enterprise web

    OpenAIRE

    Hu, Y; Yang, Q P; Sun, X; Wei, P

    2008-01-01

    Enterprise Web provides a convenient, extendable, integrated platform for information sharing and knowledge management. However, it still has many drawbacks due to complexity and increasing information glut, as well as the heterogeneity of the information processed. Research in the field of Semantic Web Services has shown the possibility of adding higher level of semantic functionality onto the top of current Enterprise Web, enhancing usability and usefulness of resource, enabling decision su...

  15. Study on online community user motif using web usage mining

    Science.gov (United States)

    Alphy, Meera; Sharma, Ajay

    2016-04-01

    Web usage mining is the application of data mining that is used to extract useful information from the online community. The World Wide Web contained at least 4.73 billion pages according to the Indexed Web and at least 228.52 million pages according to the Dutch Indexed Web on Thursday, 6 August 2015. It is difficult to get the needed data from these billions of web pages on the World Wide Web; here lies the importance of web usage mining. Personalizing the search engine helps the web user to identify the most used data in an easy way. It reduces time consumption and enables automatic site search and automatic restoration of useful sites. This study surveys the techniques used in pattern discovery and analysis in web usage mining, from the earliest approaches in 1996 to the latest in 2015. Analyzing the user motif helps in the improvement of business, e-commerce, personalisation and the improvement of websites.
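
    As a minimal, concrete example of the preprocessing and frequency-analysis step that web usage mining typically starts from, the sketch below parses a Common Log Format access log and reports the most-requested pages. The log file name and format are assumptions; none of the specific techniques surveyed in the paper are implemented here.

```python
# Minimal web-usage-mining step: parse a Common Log Format file and report
# the most frequently requested pages. File path and format are assumptions.
import re
from collections import Counter

LOG_LINE = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3})'
)

page_hits = Counter()
with open("access.log", encoding="utf-8") as fh:
    for line in fh:
        m = LOG_LINE.match(line)
        if m and m.group("status").startswith("2"):  # successful requests only
            page_hits[m.group("path")] += 1

for path, hits in page_hits.most_common(10):
    print(f"{hits:6d}  {path}")
```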

  16. WebGIS based on semantic grid model and web services

    Science.gov (United States)

    Zhang, WangFei; Yue, CaiRong; Gao, JianGuo

    2009-10-01

    As the meeting point of network technology and GIS technology, WebGIS has developed rapidly in recent years. Constrained by the Web and by the characteristics of GIS, traditional WebGIS has some prominent problems: for example, it cannot achieve interoperability between heterogeneous spatial databases, and it cannot provide cross-platform data access. With the appearance of Web Services and Grid technology, the field of WebGIS has changed greatly. A Web Service provides an interface that gives sites the ability to share data and intercommunicate. The goal of Grid technology is to turn the Internet into one large supercomputer with which computing resources, storage resources, data resources, information resources, knowledge resources and expert resources can be shared efficiently. For WebGIS, however, a merely physical connection of data and information is far from enough. Because of their different understandings of the world, different professional regulations, different policies and different habits, experts in different fields reach different conclusions when observing the same geographic phenomenon, and semantic heterogeneity arises; the same concept can therefore differ greatly between fields. If WebGIS is used without considering this semantic heterogeneity, users' questions will be answered wrongly or not at all. To solve this problem, this paper puts forward and tests an effective method of combining the semantic grid and Web Services technology to develop WebGIS. In this paper, we studied how to construct ontologies and how to combine Grid technology with Web Services, and with a detailed analysis of the computing characteristics and the application model of data distribution, we designed the WebGIS query system driven by

  17. Development of a web based GIS for health facilities mapping ...

    African Journals Online (AJOL)

    Hilary Mushonga

    Key Words: Spatial Decision Support System, Web GIS, Mapping, Health geography. 1. Introduction ... Health geography is an area of medical research that incorporates geographic techniques into the study of ... street water pump. Once the ...

  18. “Wrapping” X3DOM around Web Audio API

    Directory of Open Access Journals (Sweden)

    Andreas Stamoulias

    2015-12-01

    Full Text Available Spatial sound has a conceptual role in Web3D environments, due to the highly realistic scenes it can provide. Lately, efforts have concentrated on extending X3D/X3DOM with spatial sound attributes. This paper presents a novel method for introducing spatial sound components into the X3DOM framework, based on the X3D specification and the Web Audio API. The proposed method incorporates enhanced sound nodes for X3DOM, derived from the implementation of the X3D standard components and enriched with additional features of the Web Audio API. Moreover, several example scenarios were developed for the evaluation of our approach. The implemented examples demonstrated the feasibility of registering new nodes in X3DOM for spatial sound characteristics in Web3D virtual worlds.

  19. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first takes a qualitative focus, reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study of perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either the IT or the marketing branch. The paper contributes by highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  20. Sounds of Web Advertising

    DEFF Research Database (Denmark)

    Jessen, Iben Bredahl; Graakjær, Nicolai Jørgensgaard

    2010-01-01

    Sound seems to be a neglected issue in the study of web ads. Web advertising is predominantly regarded as visual phenomena–commercial messages, as for instance banner ads that we watch, read, and eventually click on–but only rarely as something that we listen to. The present chapter presents...... an overview of the auditory dimensions in web advertising: Which kinds of sounds do we hear in web ads? What are the conditions and functions of sound in web ads? Moreover, the chapter proposes a theoretical framework in order to analyse the communicative functions of sound in web advertising. The main...... argument is that an understanding of the auditory dimensions in web advertising must include a reflection on the hypertextual settings of the web ad as well as a perspective on how users engage with web content....

  1. On the Querying for Places on the Mobile Web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2011-01-01

    The web is undergoing a fundamental transformation: it is becoming mobile and is acquiring a spatial dimension. Thus, the web is increasingly being used from mobile devices, notably smartphones, that can be geo-positioned using GPS or technologies that exploit wireless communication networks...

  2. GIS-facilitated spatial narratives

    DEFF Research Database (Denmark)

    Møller-Jensen, Lasse; Jeppesen, Henrik; Kofie, Richard Y.

    2008-01-01

    on the thematically and narrative linking of a set of locations within an area. A spatial narrative that describes the - largely unsuccessful - history of Danish plantations on the Gold Coast (1788-1850) is implemented through the Google Earth client. This client is seen both as a type of media in itself for ‘home......-based' exploration of sites related to the narrative and as a tool that facilitates the design of spatial narratives before implementation within portable GIS devices. The Google Earth-based visualization of the spatial narrative is created by a Python script that outputs a web-accessible KML format file. The KML...

  3. Extraction spectrophotometric analyzer

    International Nuclear Information System (INIS)

    Batik, J.; Vitha, F.

    1985-01-01

    Automation is discussed of extraction spectrophotometric determination of uranium in a solution. Uranium is extracted from accompanying elements in an HCl medium with a solution of tributyl phosphate in benzene. The determination is performed by measuring absorbance at 655 nm in a single-phase ethanol-water-benzene-tributyl phosphate medium. The design is described of an analyzer consisting of an analytical unit and a control unit. The analyzer performance promises increased productivity of labour, improved operating and hygiene conditions, and mainly more accurate results of analyses. (J.C.)

  4. Web-based interventions in nursing.

    Science.gov (United States)

    Im, Eun-Ok; Chang, Sun Ju

    2013-02-01

    With recent advances in computer and Internet technologies and high funding priority on technological aspects of nursing research, researchers at the field level began to develop, use, and test various types of Web-based interventions. Despite high potential impacts of Web-based interventions, little is still known about Web-based interventions in nursing. In this article, to identify strengths and weaknesses of Web-based nursing interventions, a literature review was conducted using multiple databases with combined keywords of "online," "Internet" or "Web," "intervention," and "nursing." A total of 95 articles were retrieved through the databases and sorted by research topics. These articles were then analyzed to identify strengths and weaknesses of Web-based interventions in nursing. A strength of the Web-based interventions was their coverage of various content areas. In addition, many of them were theory-driven. They had advantages in their flexibility and comfort. They could provide consistency in interventions and require less cost in the intervention implementation. However, Web-based intervention studies had selected participants. They lacked controllability and had high dropouts. They required technical expertise and high development costs. Based on these findings, directions for future Web-based intervention research were provided.

  5. Americal options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze in a discrete-time context and with a finite outcome space American options starting with the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  6. Analyzing Political Television Advertisements.

    Science.gov (United States)

    Burson, George

    1992-01-01

    Presents a lesson plan to help students understand that political advertisements often mislead, lie, or appeal to emotion. Suggests that the lesson will enable students to examine political advertisements analytically. Includes a worksheet to be used by students to analyze individual political advertisements. (DK)

  7. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifuge fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  8. Building web information systems using web services

    NARCIS (Netherlands)

    Frasincar, F.; Houben, G.J.P.M.; Barna, P.; Vasilecas, O.; Eder, J.; Caplinskas, A.

    2006-01-01

    Hera is a model-driven methodology for designing Web information systems. In the past a CASE tool for the Hera methodology was implemented. This software had different components that together form one centralized application. In this paper, we present a distributed Web service-oriented architecture

  9. World wide spatial capital.

    Science.gov (United States)

    Sen, Rijurekha; Quercia, Daniele

    2018-01-01

    In its most basic form, the spatial capital of a neighborhood entails that most aspects of daily life are located close at hand. Urban planning researchers have widely recognized its importance, not least because it can be transformed into other forms of capital, such as economic capital (e.g., house prices, retail sales) and social capital (e.g., neighborhood cohesion). Researchers have already studied spatial capital from official city data. Their work led to important planning decisions, yet it also relied on data that is costly to create and update, and produced metrics that are difficult to compare across cities. By contrast, we propose to measure spatial capital in cheap and standardized ways around the world. Hence the name of our project "World Wide Spatial Capital". Our measures are cheap as they rely on the most basic information about a city that is currently available on the Web (i.e., which amenities are available and where). They are also standardized because they can be applied in any city on the five continents (as opposed to previous metrics that were mainly applied in the USA and UK). We show that, with these metrics, one could produce insights at the core of the urban planning discipline: which areas would benefit the most from urban interventions; how to inform planning depending on whether a city's activity is mono- or poly-centric; how different cities fare against each other; and how spatial capital correlates with other urban characteristics such as mobility patterns and road network structure.
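
    A cheap, standardized measure of this kind can be sketched as a simple amenity count within walking distance of a location, computed from amenity coordinates of the sort available on the Web. The haversine distance, the 1 km radius and the coordinates below are illustrative assumptions, not the paper's actual metric.

```python
# Illustrative spatial-capital score: number of amenities within 1 km of a
# location, computed from (lat, lon) pairs gathered from the Web.
# All coordinates below are made up.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

amenities = [  # hypothetical (lat, lon) of shops, schools, clinics, ...
    (45.4642, 9.1900), (45.4700, 9.1815), (45.4581, 9.2050), (45.4905, 9.2210),
]

def spatial_capital(lat, lon, radius_km=1.0):
    """Count amenities reachable within the given radius of (lat, lon)."""
    return sum(haversine_km(lat, lon, a_lat, a_lon) <= radius_km
               for a_lat, a_lon in amenities)

print(spatial_capital(45.4640, 9.1895))  # amenities reachable on foot
```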

  10. Soft Decision Analyzer

    Science.gov (United States)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
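
    The point that soft decisions carry richer information than a hard-decision BER can be illustrated with a tiny simulation: for a BPSK link over AWGN, the sketch below reports both the BER and a moment-based SNR estimate formed from the soft-decision stream alone. This is a generic Python illustration under assumed parameters, not the SDA hardware or its algorithms.

```python
# Tiny illustration: soft decisions of a BPSK/AWGN link carry richer
# information (e.g., an SNR estimate) than the hard-decision BER alone.
import numpy as np

rng = np.random.default_rng(1)
n_bits, snr_db = 100_000, 4.0          # assumed link parameters

bits = rng.integers(0, 2, n_bits)
symbols = 2.0 * bits - 1.0             # BPSK mapping: 0 -> -1, 1 -> +1
noise_std = np.sqrt(1.0 / (2 * 10 ** (snr_db / 10.0)))
soft = symbols + rng.normal(0.0, noise_std, n_bits)   # soft decisions

hard = (soft > 0).astype(int)
ber = np.mean(hard != bits)

# Moment-based SNR estimate from the soft-decision stream alone
# (approximate; the folded-normal bias grows at low SNR).
mu = np.mean(np.abs(soft))
sigma2 = np.var(np.abs(soft))
snr_est_db = 10 * np.log10(mu ** 2 / (2 * sigma2))

print(f"hard-decision BER : {ber:.5f}")
print(f"soft-decision SNR : {snr_est_db:.2f} dB (true {snr_db} dB)")
```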

  11. KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Bennewitz, F.; Hummel, R.; Oelmann, K.

    1986-01-01

    The KWU Nuclear Plant Analyzer is a real time engineering simulator based on the KWU computer programs used in plant transient analysis and licensing. The primary goal is to promote the understanding of the technical and physical processes of a nuclear power plant at an on-site training facility. Thus the KWU Nuclear Plant Analyzer is available with comparable low costs right at the time when technical questions or training needs arise. This has been achieved by (1) application of the transient code NLOOP; (2) unrestricted operator interaction including all simulator functions; (3) using the mainframe computer Control Data Cyber 176 in the KWU computing center; (4) four color graphic displays controlled by a dedicated graphic computer, no control room equipment; and (5) coupling of computers by telecommunication via telephone

  12. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    The diffraction enhanced imaging (DEI) technique is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to their close absorption coefficients. This problem becomes more dominant when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images

  13. Wordpress web application development

    CERN Document Server

    Ratnayake, Rakhitha Nimesh

    2015-01-01

    This book is intended for WordPress developers and designers who want to develop quality web applications within a limited time frame and for maximum profit. Prior knowledge of basic web development and design is assumed.

  14. Promoting Your Web Site.

    Science.gov (United States)

    Raeder, Aggi

    1997-01-01

    Discussion of ways to promote sites on the World Wide Web focuses on how search engines work and how they retrieve and identify sites. Appropriate Web links for submitting new sites and for Internet marketing are included. (LRW)

  15. Practical web development

    CERN Document Server

    Wellens, Paul

    2015-01-01

    This book is perfect for beginners who want to get started and learn the web development basics, but also offers experienced developers a web development roadmap that will help them to extend their capabilities.

  16. EPA Web Taxonomy

    Data.gov (United States)

    U.S. Environmental Protection Agency — EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's...

  17. Private Web Browsing

    National Research Council Canada - National Science Library

    Syverson, Paul F; Reed, Michael G; Goldschlag, David M

    1997-01-01

    .... These are both kept confidential from network elements as well as external observers. Private Web browsing is achieved by unmodified Web browsers using anonymous connections by means of HTTP proxies...

  18. Chemical Search Web Utility

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Chemical Search Web Utility is an intuitive web application that allows the public to easily find the chemical that they are interested in using, and which...

  19. Emission spectrometric isotope analyzer

    International Nuclear Information System (INIS)

    Mauersberger, K.; Meier, G.; Nitschke, W.; Rose, W.; Schmidt, G.; Rahm, N.; Andrae, G.; Krieg, D.; Kuefner, W.; Tamme, G.; Wichlacz, D.

    1982-01-01

    An emission spectrometric isotope analyzer has been designed for determining relative abundances of stable isotopes in gaseous samples in discharge tubes, in liquid samples, and in flowing gaseous samples. It consists of a high-frequency generator, a device for defined positioning of discharge tubes, a grating monochromator with oscillating slit and signal converter, signal generator, window discriminator, AND connection, read-out display, oscillograph, gas dosing device and chemical conversion system with carrier gas source and vacuum pump

  20. A Web GIS Framework for Participatory Sensing Service: An Open Source-Based Implementation

    Directory of Open Access Journals (Sweden)

    Yu Nakayama

    2017-04-01

    Full Text Available Participatory sensing is the process in which individuals or communities collect and analyze systematic data using mobile phones and cloud services. To efficiently develop participatory sensing services, some server-side technologies have been proposed. Although they provide a good platform for participatory sensing, they are not optimized for spatial data management and processing. For the purpose of spatial data collection and management, many web GIS approaches have been studied. However, they still have not focused on the optimal framework for participatory sensing services. This paper presents a web GIS framework for participatory sensing service (FPSS. The proposed FPSS enables an integrated deployment of spatial data capture, storage, and data management functions. In various types of participatory sensing experiments, users can collect and manage spatial data in a unified manner. This feature is realized by the optimized system architecture and use case based on the general requirements for participatory sensing. We developed an open source GIS-based implementation of the proposed framework, which can overcome financial difficulties that are one of the major problems of deploying sensing experiments. We confirmed with the prototype that participatory sensing experiments can be performed efficiently with the proposed FPSS.
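
    A minimal sketch of the capture-and-store path in such a framework is shown below: one georeferenced observation is written into a PostGIS table from Python. The connection string, table and column names are hypothetical, and a PostGIS-enabled database plus the psycopg2 driver are assumed; the actual FPSS schema is not described in this record.

```python
# Minimal sketch: store one participatory-sensing observation as a PostGIS
# point. Connection string, table and column names are hypothetical, and the
# database is assumed to have the PostGIS extension enabled.
import psycopg2

conn = psycopg2.connect("dbname=sensing user=fpss password=secret host=localhost")
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS observations (
            id SERIAL PRIMARY KEY,
            user_id TEXT,
            observed_at TIMESTAMPTZ,
            value DOUBLE PRECISION,
            geom GEOMETRY(Point, 4326)
        )""")
    cur.execute("""
        INSERT INTO observations (user_id, observed_at, value, geom)
        VALUES (%s, now(), %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
        """, ("user-17", 23.4, 139.767, 35.681))  # lon, lat (made-up reading)
conn.close()
```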

  1. Citizen Science and the Modern Web

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Beginning as a research project to help scientists communicate, the Web has transformed into a ubiquitous medium. As the sciences continue to transform, new techniques are needed to analyze the vast amounts of data being produced by large experiments. The advent of the Sloan Digital Sky Survey increased throughput of astronomical data, giving rise to Citizen Science projects such as Galaxy Zoo. The Web is no longer exclusively used by researchers, but rather, a place where anyone can share information, or even, partake in citizen science projects. As the Web continues to evolve, new and open technologies enable web applications to become more sophisticated. Scientific toolsets may now target the Web as a platform, opening an application to a wider audience, and potentially citizen scientists. With the latest browser technologies, scientific data may be consumed and visualized, opening the browser as a new platform for scientific analysis.

  2. Web Application Vulnerabilities

    OpenAIRE

    Yadav, Bhanu

    2014-01-01

    Web application security has been a major issue in information technology since the emergence of dynamic web applications. The main objective of this project was to carry out a detailed study of the top three web application vulnerabilities, namely injection, cross-site scripting, and broken authentication and session management, to present the situations in which an application can be vulnerable to these web threats, and finally to provide preventative measures against them. ...

  3. Reactivity on the Web

    OpenAIRE

    Bailey, James; Bry, François; Eckert, Michael; Patrânjan, Paula Lavinia

    2005-01-01

    Reactivity, the ability to detect simple and composite events and respond in a timely manner, is an essential requirement in many present-day information systems. With the emergence of new, dynamic Web applications, reactivity on the Web is receiving increasing attention. Reactive Web-based systems need to detect and react not only to simple events but also to complex, real-life situations. This paper introduces XChange, a language for programming reactive behaviour on the Web,...

  4. Electromagnetic spatial coherence wavelets

    International Nuclear Information System (INIS)

    Castaneda, R.; Garcia-Sucerquia, J.

    2005-10-01

    The recently introduced concept of spatial coherence wavelets is generalized to describe the propagation of electromagnetic fields in free space. To this aim, the spatial coherence wavelet tensor is introduced as an elementary quantity, in terms of which the previously known quantities for this domain can be expressed. It allows analyzing the relationship between the spatial coherence properties and the polarization state of the electromagnetic wave. This approach is completely consistent with the recently introduced unified theory of coherence and polarization for random electromagnetic beams, but it provides further insight into the causal relationship between the polarization states at different planes along the propagation path. (author)

  5. Architecture and the Web.

    Science.gov (United States)

    Money, William H.

    Instructors should be concerned with how to incorporate the World Wide Web into an information systems (IS) curriculum organized across three areas of knowledge: information technology, organizational and management concepts, and theory and development of systems. The Web fits broadly into the information technology component. For the Web to be…

  6. Semantic Web Primer

    NARCIS (Netherlands)

    Antoniou, Grigoris; Harmelen, Frank van

    2004-01-01

    The development of the Semantic Web, with machine-readable content, has the potential to revolutionize the World Wide Web and its use. A Semantic Web Primer provides an introduction and guide to this still emerging field, describing its key ideas, languages, and technologies. Suitable for use as a

  7. Evaluating Web Usability

    Science.gov (United States)

    Snider, Jean; Martin, Florence

    2012-01-01

    Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…

  8. Web Search Engines

    OpenAIRE

    Rajashekar, TB

    1998-01-01

    The World Wide Web is emerging as an all-in-one information source. Tools for searching Web-based information include search engines, subject directories and meta search tools. We take a look at key features of these tools and suggest practical hints for effective Web searching.

  9. Semantic Web status model

    CSIR Research Space (South Africa)

    Gerber, AJ

    2006-06-01

    Full Text Available Semantic Web application areas are experiencing intensified interest due to the rapid growth in the use of the Web, together with the innovation and renovation of information content technologies. The Semantic Web is regarded as an integrator across...

  10. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges...... that call for inquiries into the theoretical foundation of bibliographic classification theory....

  11. Analyzing Population Genetics Data: A Comparison of the Software

    Science.gov (United States)

    Choosing a software program for analyzing population genetic data can be a challenge without prior knowledge of the methods used by each program. There are numerous web sites listing programs by type of data analyzed, type of analyses performed, or other criteria. Even with programs categorized in ...

  12. Spatial Indexing for Data Searching in Mobile Sensing Environments

    Directory of Open Access Journals (Sweden)

    Yuchao Zhou

    2017-06-01

    Full Text Available Data searching and retrieval is one of the fundamental functionalities in many Web of Things applications, which need to collect, process and analyze huge amounts of sensor stream data. The problem in fact has been well studied for data generated by sensors that are installed at fixed locations; however, challenges emerge along with the popularity of opportunistic sensing applications in which mobile sensors keep reporting observation and measurement data at variable intervals and changing geographical locations. To address these challenges, we develop the Geohash-Grid Tree, a spatial indexing technique specially designed for searching data integrated from heterogeneous sources in a mobile sensing environment. Results of the experiments on a real-world dataset collected from the SmartSantander smart city testbed show that the index structure allows efficient search based on spatial distance, range and time windows in a large time series database.

  13. Spatial Indexing for Data Searching in Mobile Sensing Environments.

    Science.gov (United States)

    Zhou, Yuchao; De, Suparna; Wang, Wei; Moessner, Klaus; Palaniswami, Marimuthu S

    2017-06-18

    Data searching and retrieval is one of the fundamental functionalities in many Web of Things applications, which need to collect, process and analyze huge amounts of sensor stream data. The problem in fact has been well studied for data generated by sensors that are installed at fixed locations; however, challenges emerge along with the popularity of opportunistic sensing applications in which mobile sensors keep reporting observation and measurement data at variable intervals and changing geographical locations. To address these challenges, we develop the Geohash-Grid Tree, a spatial indexing technique specially designed for searching data integrated from heterogeneous sources in a mobile sensing environment. Results of the experiments on a real-world dataset collected from the SmartSantander smart city testbed show that the index structure allows efficient search based on spatial distance, range and time windows in a large time series database.
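
    The indexing idea, bucketing mobile observations by cell so that distance, range and time-window queries only touch nearby buckets, can be sketched as follows. A plain lat/lon grid is used here as a simplified stand-in for the geohash cells of the Geohash-Grid Tree, so the snippet illustrates the principle rather than reproducing the published structure.

```python
# Simplified grid index for mobile sensor observations: observations are
# bucketed by a coarse lat/lon cell, so a range + time-window query only
# scans the cells overlapping the query box. Cell size is an illustrative
# stand-in for geohash-based cells.
from collections import defaultdict
from math import floor

CELL = 0.01  # cell size in degrees (roughly 1 km in latitude)

class GridIndex:
    def __init__(self):
        self.cells = defaultdict(list)  # (ix, iy) -> [(lat, lon, t, value)]

    def _key(self, lat, lon):
        return (floor(lat / CELL), floor(lon / CELL))

    def insert(self, lat, lon, t, value):
        self.cells[self._key(lat, lon)].append((lat, lon, t, value))

    def query(self, lat_min, lat_max, lon_min, lon_max, t_min, t_max):
        """Return observations inside the bounding box and the time window."""
        hits = []
        for ix in range(floor(lat_min / CELL), floor(lat_max / CELL) + 1):
            for iy in range(floor(lon_min / CELL), floor(lon_max / CELL) + 1):
                for lat, lon, t, value in self.cells.get((ix, iy), ()):
                    if (lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
                            and t_min <= t <= t_max):
                        hits.append((lat, lon, t, value))
        return hits

idx = GridIndex()
idx.insert(43.462, -3.810, 100, 21.5)   # made-up Santander-area readings
idx.insert(43.471, -3.802, 140, 22.1)
print(idx.query(43.45, 43.48, -3.82, -3.79, 90, 150))
```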

  14. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and applies hereafter advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility......Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites...... and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net ....

  15. Electrodynamic thermogravimetric analyzer

    International Nuclear Information System (INIS)

    Spjut, R.E.; Bar-Ziv, E.; Sarofim, A.F.; Longwell, J.P.

    1986-01-01

    The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), was examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO2 laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined

  16. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world’s capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China’s Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  17. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  18. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  19. Multiple capillary biochemical analyzer

    Science.gov (United States)

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  20. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  1. Web services foundations

    CERN Document Server

    Bouguettaya, Athman; Daniel, Florian

    2013-01-01

    Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet.Web Services Foundations is the first installment of a two-book collection coverin

  2. Web Security, Privacy & Commerce

    CERN Document Server

    Garfinkel, Simson

    2011-01-01

    Since the first edition of this classic reference was published, World Wide Web use has exploded and e-commerce has become a daily part of business and personal life. As Web use has grown, so have the threats to our security and privacy--from credit card fraud to routine invasions of privacy by marketers to web site defacements to attacks that shut down popular web sites. Web Security, Privacy & Commerce goes behind the headlines, examines the major security risks facing us today, and explains how we can minimize them. It describes risks for Windows and Unix, Microsoft Internet Exp

  3. Cooperative Mobile Web Browsing

    DEFF Research Database (Denmark)

    Perrucci, GP; Fitzek, FHP; Zhang, Qi

    2009-01-01

    This paper advocates a novel approach for mobile web browsing based on cooperation among wireless devices within close proximity operating in a cellular environment. In the actual state of the art, mobile phones can access the web using different cellular technologies. However, the supported data......-range links can then be used for cooperative mobile web browsing. By implementing the cooperative web browsing on commercial mobile phones, it will be shown that better performance is achieved in terms of increased data rate and therefore reduced access times, resulting in a significantly enhanced web...

  4. Basin Assessment Spatial Planning Platform

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-26

    The tool is intended to facilitate hydropower development and water resource planning by improving synthesis and interpretation of disparate spatial datasets that are considered in development actions (e.g., hydrological characteristics, environmentally and culturally sensitive areas, existing or proposed water power resources, climate-informed forecasts). The tool enables this capability by providing a unique framework for assimilating, relating, summarizing, and visualizing disparate spatial data through the use of spatial aggregation techniques, relational geodatabase platforms, and an interactive web-based Geographic Information Systems (GIS). Data are aggregated and related based on shared intersections with a common spatial unit; in this case, industry-standard hydrologic drainage areas for the U.S. (National Hydrography Dataset) are used as the spatial unit to associate planning data. This process is performed using all available scalar delineations of drainage areas (i.e., region, sub-region, basin, sub-basin, watershed, sub-watershed, catchment) to create spatially hierarchical relationships among planning data and drainages. These entity-relationships are stored in a relational geodatabase that provides back-end structure to the web GIS and its widgets. The full technology stack was built using all open-source software in modern programming languages. Interactive widgets that function within the viewport are also compatible with all modern browsers.
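
    The core data operation described here, relating disparate planning attributes through a shared drainage-area unit, can be sketched as an aggregation by hydrologic unit code (HUC). The column names and values below are made up, and pandas stands in for the platform's relational geodatabase.

```python
# Illustrative aggregation of planning attributes by hydrologic unit code.
# Column names and values are made up; the real platform relates national
# datasets through NHD drainage areas in a relational geodatabase.
import pandas as pd

sites = pd.DataFrame({
    "huc8":        ["06010105", "06010105", "06010201"],
    "capacity_mw": [12.5, 3.2, 48.0],
    "sensitive":   [False, True, False],   # env./cultural sensitivity flag
})

summary = (sites.groupby("huc8")
                .agg(total_capacity_mw=("capacity_mw", "sum"),
                     n_sites=("capacity_mw", "size"),
                     any_sensitive=("sensitive", "any"))
                .reset_index())

# Coarser summaries follow from the hierarchical HUC prefixes (e.g. HUC4).
summary["huc4"] = summary["huc8"].str[:4]
print(summary)
```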

  5. Trace impurity analyzer

    International Nuclear Information System (INIS)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The desirability for long-term reliability of large scale helium refrigerator systems used on superconducting accelerator magnets has necessitated detection of impurities to levels of a few ppm. An analyzer that measures trace impurity levels of condensable contaminants in concentrations of less than a ppm in 15 atm of He is described. The instrument makes use of the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. An LN2 cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to obtain the full range of contaminant possibilities. The results of this detector, which will be in use on the research and development helium refrigerator of the ISABELLE First-Cell, are described

  6. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  7. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Full Text Available Abstract Background Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results We develop the software PDA for the analysis of pooled-DNA data. PDA is originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
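
    The multipoint idea of combining single-marker p-values over a sliding window can be sketched with Fisher's method from SciPy, as below. The p-values and window size are made up, and this is a generic illustration of the concept rather than PDA's own implementation (which also handles preferential-amplification correction and allele frequency estimation).

```python
# Sketch of a sliding-window multipoint test: combine single-marker p-values
# within each window using Fisher's method. Values are made up; PDA's own
# statistics are not reproduced here.
import numpy as np
from scipy.stats import combine_pvalues

single_point_p = np.array([0.20, 0.03, 0.07, 0.55, 0.01, 0.04, 0.80, 0.15])
window = 3  # illustrative window size (number of adjacent markers)

for start in range(len(single_point_p) - window + 1):
    p_window = single_point_p[start:start + window]
    stat, p_combined = combine_pvalues(p_window, method="fisher")
    print(f"markers {start}-{start + window - 1}: combined p = {p_combined:.4f}")
```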

  8. Spatial Operations

    Directory of Open Access Journals (Sweden)

    Anda VELICANU

    2010-09-01

    Full Text Available This paper contains a brief description of the most important operations that can be performed on spatial data such as spatial queries, create, update, insert, delete operations, conversions, operations on the map or analysis on grid cells. Each operation has a graphical example and some of them have code examples in Oracle and PostgreSQL.

  9. Spatializing Time

    DEFF Research Database (Denmark)

    Thomsen, Bodil Marie Stavning

    2011-01-01

    The article analyses some of artist Søren Lose's photographic installations in which time, history and narration is reflected in the creation of allegoric, spatial relations.

  10. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels...Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized...for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  11. A neutron activation analyzer

    International Nuclear Information System (INIS)

    Westphal, G.P.; Lemmel, H.; Grass, F.; De Regge, P.P.; Burns, K.; Markowicz, A.

    2005-01-01

    Dubbed 'Analyzer' because of its simplicity, a neutron activation analysis facility for short-lived isomeric transitions is based on a low-cost rabbit system and an adaptive digital filter which are controlled by a software performing irradiation control, loss-free gamma-spectrometry, spectra evaluation, nuclide identification and calculation of concentrations in a fully automatic flow of operations. Designed for TRIGA reactors and constructed from inexpensive plastic tubing and an aluminum in-core part, the rabbit system features samples of 5 ml and 10 ml with sample separation at 150 ms and 200 ms transport time or 25 ml samples without separation at a transport time of 300 ms. By automatically adapting shaping times to pulse intervals the preloaded digital filter gives best throughput at best resolution up to input counting rates of 10^6 cps. Loss-free counting enables quantitative correction of counting losses of up to 99%. As a test of system reproducibility in sample separation geometry, K, Cl, Mn, Mg, Ca, Sc, and V have been determined in various reference materials at excellent agreement with consensus values. (author)

  12. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.
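
    As a loosely related, hedged illustration of how component fractions can be estimated from a measured spectrum (a generic non-negative least-squares unmixing sketch, not the analyzer's actual algorithm; all spectra and numbers are synthetic):

      # Generic spectral-unmixing sketch: estimate oil/water/gas fractions from a
      # measured spectrum via non-negative least squares (all data synthetic).
      import numpy as np
      from scipy.optimize import nnls

      wavelengths = np.linspace(1000, 2500, 50)              # arbitrary NIR band, nm
      oil = np.exp(-((wavelengths - 1720) / 60.0) ** 2)      # synthetic pure-component spectra
      water = np.exp(-((wavelengths - 1450) / 80.0) ** 2)
      gas = 0.2 * np.ones_like(wavelengths)

      A = np.column_stack([oil, water, gas])
      true_fractions = np.array([0.6, 0.3, 0.1])
      noise = 0.01 * np.random.default_rng(0).normal(size=len(wavelengths))
      measured = A @ true_fractions + noise

      fractions, _ = nnls(A, measured)
      print(fractions / fractions.sum())                     # close to 0.6 / 0.3 / 0.1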

  13. Analyzing Visibility Configurations.

    Science.gov (United States)

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the extracted feature vectors. Our method enables perceptually motivated level-of-detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes as well as in real applications.
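
    A hedged, much-simplified sketch of the co-occurrence idea described above: counting how often visible and occluded states co-occur on adjacent surface clusters and turning the counts into a feature vector (the adjacency list and visibility labels are made up; the paper's actual construction over triangle clusters is richer):

      # Simplified visibility co-occurrence matrix over adjacent surface clusters.
      import numpy as np

      visible = np.array([1, 1, 0, 0, 1, 0, 1, 1])           # 1 = visible, 0 = occluded
      adjacency = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)]

      cooc = np.zeros((2, 2), dtype=int)
      for i, j in adjacency:
          cooc[visible[i], visible[j]] += 1
          cooc[visible[j], visible[i]] += 1                  # symmetric counting

      # Normalised co-occurrence entries serve as a feature vector for a classifier.
      feature = (cooc / cooc.sum()).ravel()
      print(feature)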

  14. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    Science.gov (United States)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrences, and mitigate their effects. For this purpose, we augmented the web-GIS “CLIMATE” to include a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing the spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods for time-dependent statistics of extremes, quantile regression and the copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach allows one to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
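
    As a hedged illustration of one building block of such extreme-value analysis (the CLIMATE package itself is implemented in R; the data here are synthetic and the 50-year return period is an arbitrary choice), a Python sketch fitting a GEV distribution to annual maxima:

      # Fit a GEV distribution to synthetic annual temperature maxima and compute
      # a 50-year return level (illustrative only).
      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(42)
      annual_max_temp = 30 + 3 * rng.gumbel(size=40)         # 40 synthetic annual maxima

      shape, loc, scale = genextreme.fit(annual_max_temp)
      return_period = 50                                     # years
      return_level = genextreme.ppf(1 - 1 / return_period, shape, loc=loc, scale=scale)
      print(f"50-year return level: {return_level:.1f} degrees")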

  15. Estimating Maintenance Cost for Web Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2016-01-01

    Full Text Available The current paper tackles the issue of determining a method for estimating maintenance costs for web applications. The current state of research in the field of web application maintenance is summarized and leading theories and results are highlighted. The cost of web maintenance is determined by the number of man-hours invested in maintenance tasks. Web maintenance tasks are categorized into content maintenance and technical maintenance. Research is centered on analyzing technical maintenance tasks. The research hypothesis is formulated on the assumption that the number of man-hours invested in maintenance tasks can be assessed based on the web application’s user interaction level, complexity and content update effort. Data regarding the costs of maintenance tasks are collected from 24 maintenance projects implemented by a web development company that tackles a wide area of web applications. The homogeneity and diversity of the collected data are submitted for debate by presenting a sample of the data and depicting the overall size and comprehensive nature of the entire dataset. A set of metrics dedicated to estimating maintenance costs in web applications is defined based on conclusions formulated by analyzing the collected data and the theories and practices dominating the current state of research. The metrics are validated with regard to the initial research hypothesis, and conclusions are formulated on the topic of estimating the maintenance cost of web applications. The limits of the research process, which represented the basis for the current paper, are enunciated. Future research topics are submitted for debate.

  16. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  17. WebMGA: a customizable web server for fast metagenomic sequence analysis.

    Science.gov (United States)

    Wu, Sitao; Zhu, Zhengwei; Fu, Liming; Niu, Beifang; Li, Weizhong

    2011-09-07

    The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers are facing tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; metagenomic annotation also involves a wide range of computational tools, which are difficult for common users to install and maintain. The tools provided by the few available web servers are also limited and have various constraints such as login requirements, long waiting times, and the inability to configure pipelines. We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contamination, taxonomic analysis, functional annotation, and more. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  18. WebMGA: a customizable web server for fast metagenomic sequence analysis

    Directory of Open Access Journals (Sweden)

    Niu Beifang

    2011-09-01

    Full Text Available Abstract Background The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers are facing tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; metagenomic annotation also involves a wide range of computational tools, which are difficult for common users to install and maintain. The tools provided by the few available web servers are also limited and have various constraints such as login requirements, long waiting times, and the inability to configure pipelines. Results We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contamination, taxonomic analysis, functional annotation, and more. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. Conclusions WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  19. Using Open data in analyzing urban growth: urban density and change detection

    Science.gov (United States)

    murgante, Beniamino; Nolè, Gabriele; Lasaponara, Rosa; Lanorte, Antonio

    2013-04-01

    In recent years, great attention has been paid to the evolution and use of spatial data. Internet technologies have accelerated this process, allowing more direct access to spatial information. It is estimated that more than 600 million people have connected to the Internet at least once to display maps on the web. Consequently, there is an irreversible process which considers the geographical dimension a fundamental attribute for the management of information flows. Furthermore, the great activity produced by the open data movement leads to easier and clearer access to geospatial information. This trend also concerns, in a less evident way, satellite data, which are increasingly accessible through the web. Spatial planning, geography and other regional sciences find it difficult to build knowledge related to spatial transformation. These problems can be significantly reduced thanks to large data availability, producing significant opportunities to capture knowledge useful for better territorial governance. This study was developed in a heavily anthropized area in southern Italy, the Apulia region, using free spatial data and free multispectral and multitemporal satellite data (Apulia was one of the first regions in Italy to adopt open data policies). The analysis concerns urban growth, which has increased rapidly in recent decades. In a first step, the evolution in time and change detection of urban areas were analyzed, paying particular attention to soil consumption. In the second step, Kernel Density Estimation (KDE) was adopted in order to assess development pressures. KDE is a technique that provides the density of a phenomenon based on point data. A moving three-dimensional surface is produced from a set of points distributed over a region of space, which weighs the events within its sphere of influence depending on their distance from the point at which intensity is estimated. It produces, considering as
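
    As a hedged illustration of the kernel density idea described above (not the authors' actual workflow; the point coordinates, bandwidth and grid are made up), a minimal Python sketch using scikit-learn:

      # Kernel density estimate over made-up point events on a coarse grid.
      import numpy as np
      from sklearn.neighbors import KernelDensity

      points = np.array([[0, 0], [100, 50], [120, 60], [130, 40], [500, 500], [510, 490]])
      kde = KernelDensity(kernel="gaussian", bandwidth=80.0).fit(points)

      xs, ys = np.meshgrid(np.linspace(0, 600, 7), np.linspace(0, 600, 7))
      grid = np.column_stack([xs.ravel(), ys.ravel()])
      density = np.exp(kde.score_samples(grid)).reshape(xs.shape)   # density per unit area
      print(f"peak grid density: {density.max():.2e}")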

  20. Numerical analysis of beam with sinusoidally corrugated webs

    Science.gov (United States)

    Górecki, Marcin; Pieńko, Michał; Łagoda, GraŻyna

    2018-01-01

    The paper presents the results of numerical tests of a steel beam with a sinusoidally corrugated web, performed in Autodesk Algor Simulation Professional 2010. The analysis was preceded by laboratory tests, including the beam's behavior under four-point bending as well as a study of material characteristics. The significant web thickness and the tools available in the software made it possible to analyze the behavior of the plate girder as a beam and to observe the stresses occurring in its characteristic element, the corrugated web. The stress distribution observed on both surfaces of the web was analyzed.

  1. Spatiotemporal Land Use Change Analysis Using Open-source GIS and Web Based Application

    Directory of Open Access Journals (Sweden)

    Wan Yusryzal Wan Ibrahim

    2015-05-01

    Full Text Available Spatiotemporal change is very important information for revealing the characteristics of the urbanization process. Sharing this information is beneficial for public awareness, which in turn improves public participation in the adaptive management of the spatial planning process. Open-source software and web applications are freely available tools that can be the best medium for any individual or agency to share this important information. The objective of the paper is to discuss the spatiotemporal land use change in Iskandar Malaysia by using open-source GIS (Quantum GIS) and publishing the results through a web application (mash-up). Land use data from 1994 to 2011 were developed and analyzed to show the landscape change of the region. Subsequently, a web application was set up to distribute the findings of the study. The results show significant changes of land use in the study area, especially the decline of agricultural and natural land, which were converted to urban land uses. Residential and industrial areas largely replaced the agricultural and natural areas, particularly along the coastal zone of the region. This information is published through an interactive web GIS in order to share it with the public and stakeholders. There are some limitations to the web application, but they do not outweigh the advantages of using it. The integration of open-source GIS and web applications is very helpful in sharing planning information, particularly in a study area that experiences rapid land use and land cover change. Basic information from this study is vital for further work, such as projecting future land use change, and other related studies in the area.

  2. Engineering Web Applications

    DEFF Research Database (Denmark)

    Casteleyn, Sven; Daniel, Florian; Dolog, Peter

    Nowadays, Web applications are almost omnipresent. The Web has become a platform not only for information delivery, but also for eCommerce systems, social networks, mobile services, and distributed learning environments. Engineering Web applications involves many intrinsic challenges due to their distributed nature, content orientation, and the requirement to make them available to a wide spectrum of users who are unknown in advance. The authors discuss these challenges in the context of well-established engineering processes, covering the whole product lifecycle from requirements engineering through design and implementation to deployment and maintenance. They stress the importance of models in Web application development, and they compare well-known Web-specific development processes like WebML, WSDM and OOHDM to traditional software development approaches like the waterfall model and the spiral...

  3. Desarrollo de aplicaciones web

    OpenAIRE

    Luján Mora, Sergio

    2010-01-01

    Acknowledgements 1. Introduction to web applications 2. Server installation 3. Web page design 4. Structured text format: XML 5. Dynamic content 6. Database access: JDBC 7. Web services 8. Use and maintenance 9. Monitoring and analysis Bibliography GNU Free Documentation License

  4. Web Science emerges

    OpenAIRE

    Shadbolt, Nigel; Berners-Lee, Tim

    2008-01-01

    The relentless rise in Web pages and links is creating emergent properties, from social networks to virtual identity theft, that are transforming society. A new discipline, Web Science, aims to discover how Web traits arise and how they can be harnessed or held in check to benefit society. Important advances are beginning to be made; more work can solve major issues such as securing privacy and conveying trust.

  5. Programming the semantic web

    CERN Document Server

    Segaran, Toby; Taylor, Jamie

    2009-01-01

    With this book, the promise of the Semantic Web -- in which machines can find, share, and combine data on the Web -- is not just a technical possibility, but a practical reality. Programming the Semantic Web demonstrates several ways to implement semantic web applications, using current and emerging standards and technologies. You'll learn how to incorporate existing data sources into semantically aware applications and publish rich semantic data. Each chapter walks you through a single piece of semantic technology and explains how you can use it to solve real problems. Whether you're writing

  6. RESTful Web Services Cookbook

    CERN Document Server

    Allamaraju, Subbu

    2010-01-01

    While the REST design philosophy has captured the imagination of web and enterprise developers alike, using this approach to develop real web services is no picnic. This cookbook includes more than 100 recipes to help you take advantage of REST, HTTP, and the infrastructure of the Web. You'll learn ways to design RESTful web services for client and server applications that meet performance, scalability, reliability, and security goals, no matter what programming language and development framework you use. Each recipe includes one or two problem statements, with easy-to-follow, step-by-step i

  7. Programming the Mobile Web

    CERN Document Server

    Firtman, Maximiliano

    2010-01-01

    Today's market for mobile apps goes beyond the iPhone to include BlackBerry, Nokia, Windows Phone, and smartphones powered by Android, webOS, and other platforms. If you're an experienced web developer, this book shows you how to build a standard app core that you can extend to work with specific devices. You'll learn the particulars and pitfalls of building mobile apps with HTML, CSS, and other standard web tools. You'll also explore platform variations, finicky mobile browsers, Ajax design patterns for mobile, and much more. Before you know it, you'll be able to create mashups using Web 2.

  8. Advanced web services

    CERN Document Server

    Bouguettaya, Athman; Daniel, Florian

    2013-01-01

    Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet. This book is the second installment of a two-book collection covering the state-o

  9. Web-sovelluskehityksen tekniikat

    OpenAIRE

    Kettunen, Werner

    2015-01-01

    There are many different techniques, tools and program libraries used in web application development, and their approaches to web development differ somewhat from one another. This thesis examines, both in theory and through a practical example project, the techniques and libraries most commonly used in web application development. The example web application created in this work used the Laravel framework and the tools and libraries covered in the first part, such as Bootstrap and ...

  10. Creating Web Pages Simplified

    CERN Document Server

    Wooldridge, Mike

    2011-01-01

    The easiest way to learn how to create a Web page for your family or organization Do you want to share photos and family lore with relatives far away? Have you been put in charge of communication for your neighborhood group or nonprofit organization? A Web page is the way to get the word out, and Creating Web Pages Simplified offers an easy, visual way to learn how to build one. Full-color illustrations and concise instructions take you through all phases of Web publishing, from laying out and formatting text to enlivening pages with graphics and animation. This easy-to-follow visual guide sho

  11. Web Accessibility and Guidelines

    Science.gov (United States)

    Harper, Simon; Yesilada, Yeliz

    Access to, and movement around, complex online environments, of which the World Wide Web (Web) is the most popular example, has long been considered an important and major issue in the Web design and usability field. The commonly used slang phrase ‘surfing the Web’ implies rapid and free access, pointing to its importance among designers and users alike. It has also been long established that this potentially complex and difficult access is further complicated, and becomes neither rapid nor free, if the user is disabled. There are millions of people who have disabilities that affect their use of the Web. Web accessibility aims to help these people to perceive, understand, navigate, and interact with, as well as contribute to, the Web, and thereby the society in general. This accessibility is, in part, facilitated by the Web Content Accessibility Guidelines (WCAG) currently moving from version one to two. These guidelines are intended to encourage designers to make sure their sites conform to specifications, and in that conformance enable the assistive technologies of disabled users to better interact with the page content. In this way, it was hoped that accessibility could be supported. While this is in part true, guidelines do not solve all problems and the new WCAG version two guidelines are surrounded by controversy and intrigue. This chapter aims to establish the published literature related to Web accessibility and Web accessibility guidelines, and discuss limitations of the current guidelines and future directions.

  12. Building Social Web Applications

    CERN Document Server

    Bell, Gavin

    2009-01-01

    Building a web application that attracts and retains regular visitors is tricky enough, but creating a social application that encourages visitors to interact with one another requires careful planning. This book provides practical solutions to the tough questions you'll face when building an effective community site -- one that makes visitors feel like they've found a new home on the Web. If your company is ready to take part in the social web, this book will help you get started. Whether you're creating a new site from scratch or reworking an existing site, Building Social Web Applications

  13. Photography activities for developing students’ spatial orientation and spatial visualization

    Science.gov (United States)

    Hendroanto, Aan; van Galen, Frans; van Eerde, D.; Prahmana, R. C. I.; Setyawan, F.; Istiandaru, A.

    2017-12-01

    Spatial orientation and spatial visualization are the foundation of students’ spatial ability. They assist students’ performance in learning mathematics, especially geometry. Considering their importance, the present study aims to design activities to help young learners develop their spatial orientation and spatial visualization ability. Photography was chosen as the context of the activities to guide and support the students. This is a design research study consisting of three phases: 1) preparation and design, 2) teaching experiment, and 3) retrospective analysis. The data were collected through tests and interviews and analyzed qualitatively. We developed two photography activities to be tested. In the teaching experiments, 30 students of SD Laboratorium UNESA, Surabaya, were involved. The results showed that the activities supported the development of students’ spatial orientation and spatial visualization, as indicated by students’ learning progress, answers, and strategies when they solved the problems in the activities.

  14. Using Open Web APIs in Teaching Web Mining

    Science.gov (United States)

    Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju

    2009-01-01

    With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…

  15. Spatial Theography

    OpenAIRE

    van Noppen, Jean Pierre

    1995-01-01

    Descriptive theology («theography») frequently resorts to metaphorical modes of meaning. Among these metaphors, the spatial language of localization and orientation plays an important role in delineating tentative insights into the relationship between the human and the divine. These spatial metaphors are presumably based on the universal human experience of interaction between the body and its environment. It is dangerous, however, to postulate universal agreement on meanings associated with s...

  16. The RCSB Protein Data Bank: redesigned web site and web services.

    Science.gov (United States)

    Rose, Peter W; Beran, Bojan; Bi, Chunxiao; Bluhm, Wolfgang F; Dimitropoulos, Dimitris; Goodsell, David S; Prlic, Andreas; Quesada, Martha; Quinn, Gregory B; Westbrook, John D; Young, Jasmine; Yukich, Benjamin; Zardecki, Christine; Berman, Helen M; Bourne, Philip E

    2011-01-01

    The RCSB Protein Data Bank (RCSB PDB) web site (http://www.pdb.org) has been redesigned to increase usability and to cater to a larger and more diverse user base. This article describes key enhancements and new features that fall into the following categories: (i) query and analysis tools for chemical structure searching, query refinement, tabulation and export of query results; (ii) web site customization and new structure alerts; (iii) pair-wise and representative protein structure alignments; (iv) visualization of large assemblies; (v) integration of structural data with the open access literature and binding affinity data; and (vi) web services and web widgets to facilitate integration of PDB data and tools with other resources. These improvements enable a range of new possibilities to analyze and understand structure data. The next generation of the RCSB PDB web site, as described here, provides a rich resource for research and education.

  17. World wide spatial capital.

    Directory of Open Access Journals (Sweden)

    Rijurekha Sen

    Full Text Available In its most basic form, the spatial capital of a neighborhood entails that most aspects of daily life are located close at hand. Urban planning researchers have widely recognized its importance, not least because it can be transformed into other forms of capital such as economic capital (e.g., house prices, retail sales) and social capital (e.g., neighborhood cohesion). Researchers have already studied spatial capital from official city data. Their work led to important planning decisions, yet it also relied on data that is costly to create and update, and produced metrics that are difficult to compare across cities. By contrast, we propose to measure spatial capital in cheap and standardized ways around the world. Hence the name of our project, "World Wide Spatial Capital". Our measures are cheap as they rely on the most basic information about a city that is currently available on the Web (i.e., which amenities are available and where). They are also standardized because they can be applied in any city on the five continents (as opposed to previous metrics that were mainly applied in the USA and UK). We show that, upon these metrics, one could produce insights at the core of the urban planning discipline: which areas would benefit the most from urban interventions; how to inform planning depending on whether a city's activity is mono- or poly-centric; how different cities fare against each other; and how spatial capital correlates with other urban characteristics such as mobility patterns and road network structure.
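
    As a hedged sketch of the kind of amenity-based measure the record describes (not the paper's exact metric; the coordinates, the 800 m threshold and the haversine helper are all assumptions made for illustration):

      # Count amenities within walking distance of each neighbourhood centroid.
      import numpy as np

      def haversine_m(lat1, lon1, lat2, lon2):
          """Great-circle distance in metres between (lat, lon) points given in degrees."""
          r = 6371000.0
          p1, p2 = np.radians(lat1), np.radians(lat2)
          dp, dl = np.radians(lat2 - lat1), np.radians(lon2 - lon1)
          a = np.sin(dp / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2
          return 2 * r * np.arcsin(np.sqrt(a))

      neighbourhoods = np.array([[51.5074, -0.1278], [51.5155, -0.0922]])             # (lat, lon)
      amenities = np.array([[51.5080, -0.1290], [51.5069, -0.1250], [51.5200, -0.1000]])

      for lat, lon in neighbourhoods:
          d = haversine_m(lat, lon, amenities[:, 0], amenities[:, 1])
          print(f"({lat}, {lon}): {int((d < 800).sum())} amenities within 800 m")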

  18. The wireless Web and patient care.

    Science.gov (United States)

    Bergeron, B P

    2001-01-01

    Wireless computing, when integrated with the Web, is poised to revolutionize the practice and teaching of medicine. As vendors introduce wireless Web technologies in the medical community that have been used successfully in the business and consumer markets, clinicians can expect profound increases in the amount of patient data, as well as the ease with which those data are acquired, analyzed, and disseminated. The enabling technologies involved in this transformation to the wireless Web range from the new generation of wireless PDAs, eBooks, and wireless data acquisition peripherals to new wireless network protocols. The rate-limiting step in the application of this technology in medicine is not technology per se but rather how quickly clinicians and their patients come to accept and appreciate the benefits and limitations of the application of wireless Web technology.

  19. Enabling Semantic Queries Against the Spatial Database

    Directory of Open Access Journals (Sweden)

    PENG, X.

    2012-02-01

    Full Text Available The spatial database based upon the object-relational database management system (ORDBMS) has the merits of a clear data model, good operability and high query efficiency. That is why it has been widely used in spatial data organization and management. However, it cannot express the semantic relationships among geospatial objects, making it difficult for query results to meet users' requirements well. Therefore, this paper represents an attempt to combine Semantic Web technology with the spatial database so as to make up for the traditional database's disadvantages. In this way, on the one hand, users can take advantage of the ORDBMS to store and manage spatial data; on the other hand, if the spatial database is released in the form of the Semantic Web, users can describe a query more concisely, with a cognitive pattern similar to that of daily life. As a consequence, this methodology makes the benefits of both the Semantic Web and the object-relational database (ORDB) available. The paper systematically discusses the semantically enriched spatial database's architecture, key technologies and implementation. Subsequently, we demonstrate the function of spatial semantic queries via a practical prototype system. The query results indicate that the method used in this study is feasible.

  20. Charge Analyzer Responsive Local Oscillations

    Science.gov (United States)

    Krause, Linda Habash; Thornton, Gary

    2015-01-01

    The first transatlantic radio transmission, demonstrated by Marconi in December of 1901, revealed the essential role of the ionosphere for radio communications. This ionized layer of the upper atmosphere controls the amount of radio power transmitted through, reflected off of, and absorbed by the atmospheric medium. Low-frequency radio signals can propagate long distances around the globe via repeated reflections off of the ionosphere and the Earth's surface. Higher frequency radio signals can punch through the ionosphere to be received at orbiting satellites. However, any turbulence in the ionosphere can distort these signals, compromising the performance or even availability of space-based communication and navigation systems. The physics associated with this distortion effect is analogous to the situation when underwater images are distorted by convecting air bubbles. In fact, these ionospheric features are often called 'plasma bubbles' since they exhibit some of the same behavior as underwater air bubbles. These events, instigated by solar and geomagnetic storms, can cause communication and navigation outages that last for hours. To help understand and predict these outages, a world-wide community of space scientists and technologists is devoted to researching this topic. One aspect of this research is to develop instruments capable of measuring ionospheric plasma bubbles. Figure 1 shows a photo of the Charge Analyzer Responsive to Local Oscillations (CARLO), a new instrument under development at NASA Marshall Space Flight Center (MSFC). It is a frequency-domain ion spectrum analyzer designed to measure the distributions of ionospheric turbulence from 1 Hz to 10 kHz (i.e., spatial scales from a few kilometers down to a few centimeters). This frequency range is important since it focuses on turbulence scales that affect VHF/UHF satellite communications, GPS systems, and over-the-horizon radar systems. CARLO is based on the flight-proven Plasma Local

  1. Typography of web design

    OpenAIRE

    Uhlířová, Martina

    2016-01-01

    Typography is one of the most important elements of web design and marketing. Good typography makes web design more appealing, which matters to readers when evaluating titles and the quality of text. The aim of this thesis is to provide a characterization of good and bad typography. I will use this characterization to identify modern typographical trends in a digital environment.

  2. The Creative Web.

    Science.gov (United States)

    Yudess, Jo

    2003-01-01

    This article lists the Web sites of 12 international not-for-profit creativity associations designed to trigger more creative thought and research possibilities. Along with Web addresses, the entries include telephone contact information and a brief description of the organization. (CR)

  3. Mastering Go web services

    CERN Document Server

    Kozyra, Nathan

    2015-01-01

    If you are a web programmer with experience in developing web services and have a rudimentary knowledge of using Go, then this is the book for you. Basic knowledge of Go as well as knowledge of relational databases and non-relational NoSQL datastores is assumed. Some basic concurrency knowledge is also required.

  4. Web Publishing Schedule

    Science.gov (United States)

    Section 207(f)(2) of the E-Gov Act requires federal agencies to develop an inventory and establish a schedule of information to be published on their Web sites, to make those schedules available for public comment, and to post the schedules on their Web sites.

  5. Web Auctions in Europe

    NARCIS (Netherlands)

    A. Pouloudi; J. Paarlberg; H.W.G.M. van Heck (Eric)

    2001-01-01

    This paper argues that a better understanding of the business model of web auctions can be reached if we adopt a broader view and provide empirical research from different sites. In this paper the business model of web auctions is refined into four dimensions. These are auction model,

  6. EPA Web Training Classes

    Science.gov (United States)

    Scheduled webinars can help you better manage EPA web content. Class topics include Drupal basics, creating different types of pages in the WebCMS such as document pages and forms, using Google Analytics, and best practices for metadata and accessibility.

  7. Sign Language Web Pages

    Science.gov (United States)

    Fels, Deborah I.; Richards, Jan; Hardman, Jim; Lee, Daniel G.

    2006-01-01

    The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, Web accessing can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The…

  8. A reasonable Semantic Web

    NARCIS (Netherlands)

    Hitzler, Pascal; Van Harmelen, Frank

    2010-01-01

    The realization of Semantic Web reasoning is central to substantiating the Semantic Web vision. However, current mainstream research on this topic faces serious challenges, which forces us to question established lines of research and to rethink the underlying approaches. We argue that reasoning for

  9. Web Team Development

    Science.gov (United States)

    Church, Jennifer; Felker, Kyle

    2005-01-01

    The dynamic world of the Web has provided libraries with a wealth of opportunities, including new approaches to the provision of information and varied internal staffing structures. The development of self-managed Web teams, endowed with authority and resources, can create an adaptable and responsive culture within libraries. This new working team…

  10. Web Design Matters

    Science.gov (United States)

    Mathews, Brian

    2009-01-01

    The web site is a library's most important feature. Patrons use the web site for numerous functions, such as renewing materials, placing holds, requesting information, and accessing databases. The homepage is the place they turn to look up the hours, branch locations, policies, and events. Whether users are at work, at home, in a building, or on…

  11. Aspects of Data Warehouse Technologies for Complex Web Data

    OpenAIRE

    Thomsen, Christian

    2008-01-01

    This thesis is about aspects of the specification and development of data warehouse technologies for complex web data. Today, large amounts of data exist in different web resources and in different formats. But it is often hard to analyze and query the often big and complex data, or data about the data (i.e., metadata). It is therefore interesting to apply Data Warehouse (DW) technology to the data. But applying DW technology to complex web data is not straightforward, and the DW community faces new and ...

  12. A systematic framework to discover pattern for web spam classification

    OpenAIRE

    Jelodar, Hamed; Wang, Yongli; Yuan, Chi; Jiang, Xiaohui

    2017-01-01

    Web spam is a big problem for search engine users on the World Wide Web. Spam pages use deceptive techniques to achieve high rankings. Although many researchers have presented different approaches to classification and web spam detection, it is still an open issue in computer science. Analyzing and evaluating these websites can be an effective step for discovering and categorizing their features. There are several methods and algorithms for detecting those websites, such as decision t...

  13. Web Server Embedded System

    Directory of Open Access Journals (Sweden)

    Adharul Muttaqin

    2014-07-01

    Full Text Available Abstract Embedded systems are currently a particular focus in computer technology; a variety of Linux operating systems and web servers have already been prepared to support embedded systems, and one application that can run on an embedded system is a web server. The choice of web server in an embedded environment is still rarely examined, so this study focuses on two web server applications whose main features offer "lightness" in both CPU and memory consumption: Light HTTPD and Tiny HTTPD. Using thread parameters (users, ramp-up period, and loop count) in a stress test of the embedded system, this study determines which of the two web servers, Light HTTPD or Tiny HTTPD, is the better fit for an embedded system on a BeagleBoard, in terms of CPU and memory consumption. The results show that, with respect to CPU consumption on the BeagleBoard embedded system, Light HTTPD is recommended over Tiny HTTPD, since there is a very significant difference in CPU load between the two web services. Keywords: embedded system, web server

  14. An Internet-Based GIS Platform Providing Data for Visualization and Spatial Analysis of Urbanization in Major Asian and African Cities

    Directory of Open Access Journals (Sweden)

    Hao Gong

    2017-08-01

    Full Text Available Rapid urbanization in developing countries has been observed to be relatively high in the last two decades, especially in the Asian and African regions. Although many researchers have made efforts to improve the understanding of the urbanization trends of various cities in Asia and Africa, the absence of platforms where local stakeholders can visualize and obtain processed urbanization data for their specific needs or analysis still remains a gap. In this paper, we present an Internet-based GIS platform called MEGA-WEB. The platform was developed in view of the urban planning and management challenges in developing countries of Asia and Africa due to the limited availability of data resources, effective tools, and proficiency in data analysis. MEGA-WEB provides online access, visualization, spatial analysis, and data sharing services following a mashup framework of the MEGA-WEB Geo Web Services (GWS) with third-party map services, using HTML5/JavaScript techniques. Through the integration of GIS, remote sensing, geo-modelling, and Internet GIS, several indicators for analyzing urbanization are provided in MEGA-WEB to give diverse perspectives on urbanization, covering not only the physical land surface condition, but also the relationships of population, energy use, and the environment. The design, architecture, system functions, and uses of MEGA-WEB are discussed in the paper. The MEGA-WEB project is aimed at contributing to sustainable urban development in developing countries of Asia and Africa.

  15. RS-WebPredictor

    DEFF Research Database (Denmark)

    Zaretzki, J.; Bergeron, C.; Huang, T.-W.

    2013-01-01

    Regioselectivity-WebPredictor (RS-WebPredictor) is a server that predicts isozyme-specific cytochrome P450 (CYP)-mediated sites of metabolism (SOMs) on drug-like molecules. Predictions may be made for the promiscuous 2C9, 2D6 and 3A4 CYP isozymes, as well as CYPs 1A2, 2A6, 2B6, 2C8, 2C19 and 2E1. RS-WebPredictor is the first freely accessible server that predicts the regioselectivity of the last six isozymes. Server execution time is fast, taking on average 2s to encode a submitted molecule and 1s to apply a given model, allowing for high-throughput use in lead optimization projects. Availability: RS-WebPredictor is accessible for free use at http://reccr.chem.rpi.edu/Software/RS-WebPredictor

  16. Engineering Adaptive Web Applications

    DEFF Research Database (Denmark)

    Dolog, Peter

    2007-01-01

    Information and services on the web are accessible for everyone. Users of the web differ in their background, culture, political and social environment, interests and so on. Ambient intelligence was envisioned as a concept for systems which are able to adapt to user actions and needs. With the growing amount of information and services, web applications become natural candidates to adopt the concepts of ambient intelligence. Such applications can deal with diverse user intentions and actions based on the user profile and can suggest the combination of information content and services which suit the user profile the most. This paper summarizes the domain engineering framework for such adaptive web applications. The framework provides guidelines to develop adaptive web applications as members of a family. It suggests how to utilize the design artifacts as knowledge which can be used...

  17. IL web tutorials

    DEFF Research Database (Denmark)

    Hyldegård, Jette; Lund, Haakon

    2012-01-01

    The paper presents the results from a study on information literacy in a higher education (HE) context, based on a larger research project evaluating 3 Norwegian IL web tutorials at 6 universities and colleges in Norway. The aim was to evaluate how the 3 web tutorials served students’ information seeking and writing process in a study context, and to identify barriers to the employment and use of the IL web tutorials and hence to the underlying information literacy intentions of the developers. Both qualitative and quantitative methods were employed. A clear mismatch was found between the intention and the use of the web tutorials. In addition, usability only played a minor role compared to relevance. It is concluded that the positive expectations of the IL web tutorials tend to be overrated by the developers. Suggestions for further research are presented.

  18. Quantifying the web browser ecosystem.

    Science.gov (United States)

    Ferdman, Sela; Minkov, Einat; Bekkerman, Ron; Gefen, David

    2017-01-01

    Contrary to the assumption that web browsers are designed to support the user, an examination of 900,000 distinct PCs shows that web browsers comprise a complex ecosystem with millions of addons collaborating and competing with each other. It is possible for addons to "sneak in" through third party installations or to get "kicked out" by their competitors without user involvement. This study examines that ecosystem quantitatively by constructing a large-scale graph with nodes corresponding to users, addons, and words (terms) that describe addon functionality. Analyzing addon interactions at the user level using the Personalized PageRank (PPR) random walk measure shows that the graph demonstrates ecological resilience. Adapting the PPR model to analyzing the browser ecosystem at the level of addon manufacturer, the study shows that some addon companies are in symbiosis and others clash with each other, as shown by analyzing the behavior of 18 prominent addon manufacturers. Results may herald insight on how other evolving internet ecosystems may behave, and suggest a methodology for measuring this behavior. Specifically, applying such a methodology could transform the addon market.
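
    A hedged toy sketch of the Personalized PageRank measure the study applies, run on a tiny made-up user/addon graph with networkx (the node names, graph and damping factor are illustrative, not the study's data):

      # Personalized PageRank on a small user/addon graph.
      import networkx as nx

      G = nx.Graph()
      G.add_edges_from([
          ("user1", "adblocker"), ("user1", "toolbar_x"),
          ("user2", "adblocker"), ("user2", "pdf_viewer"),
          ("user3", "toolbar_x"), ("user3", "coupon_addon"),
      ])

      # Restart the random walk at one addon to rank all other nodes relative to it.
      personalization = {n: 0.0 for n in G.nodes}
      personalization["adblocker"] = 1.0

      scores = nx.pagerank(G, alpha=0.85, personalization=personalization)
      for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{node:>14s}  {score:.3f}")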

  19. Quantifying the web browser ecosystem

    Science.gov (United States)

    Ferdman, Sela; Minkov, Einat; Gefen, David

    2017-01-01

    Contrary to the assumption that web browsers are designed to support the user, an examination of 900,000 distinct PCs shows that web browsers comprise a complex ecosystem with millions of addons collaborating and competing with each other. It is possible for addons to “sneak in” through third party installations or to get “kicked out” by their competitors without user involvement. This study examines that ecosystem quantitatively by constructing a large-scale graph with nodes corresponding to users, addons, and words (terms) that describe addon functionality. Analyzing addon interactions at the user level using the Personalized PageRank (PPR) random walk measure shows that the graph demonstrates ecological resilience. Adapting the PPR model to analyzing the browser ecosystem at the level of addon manufacturer, the study shows that some addon companies are in symbiosis and others clash with each other, as shown by analyzing the behavior of 18 prominent addon manufacturers. Results may herald insight on how other evolving internet ecosystems may behave, and suggest a methodology for measuring this behavior. Specifically, applying such a methodology could transform the addon market. PMID:28644833

  20. Funnel-web spider bite

    Science.gov (United States)

    MedlinePlus encyclopedia entry (//medlineplus.gov/ency/article/002844.htm) on funnel-web spider bite, describing the effects of a bite from the funnel-web spider. Male funnel-web spiders are more poisonous ...

  1. Parasites in the Wadden Sea food web

    Science.gov (United States)

    Thieltges, David W.; Engelsma, Marc Y.; Wendling, Carolin C.; Wegner, K. Mathias

    2013-09-01

    While the free-living fauna of the Wadden Sea has received much interest, little is known about the distribution and effects of parasites in the Wadden Sea food web. However, recent studies on this special type of trophic interaction indicate a high diversity of parasites in the Wadden Sea and suggest a multitude of effects on the hosts. This also includes effects on specific predator-prey relationships and the general structure of the food web. Focussing on molluscs, a major group in the Wadden Sea in terms of biomass and abundance and an important link between primary producers and predators, we review existing studies and exemplify the ecological role of parasites in the Wadden Sea food web. First, we give a brief inventory of parasites occurring in the Wadden Sea, ranging from microparasites (e.g. protozoa, bacteria) to macroparasites (e.g. helminths, parasitic copepods) and discuss the effects of spatial scale on heterogeneities in infection levels. We then demonstrate how parasites can affect host population dynamics by acting as a strong mortality factor, causing mollusc mass mortalities. In addition, we will exemplify how parasites can mediate the interaction strength of predator-prey relationships and affect the topological structure of the Wadden Sea food web as a whole. Finally, we highlight some ongoing changes regarding parasitism in the Wadden Sea in the course of global change (e.g. species introduction, climate change) and identify important future research questions to disentangle the role of parasites in the Wadden Sea food web.

  2. Gaining insight into food webs reconstructed by the inverse method

    NARCIS (Netherlands)

    Kones, J.; Soetaert, K.E.R.; Van Oevelen, D.; Owino, J.; Mavuti, K.

    2006-01-01

    The use of the inverse method to analyze flow patterns of organic components in ecological systems has had wide application in ecological modeling. Through this approach, an infinite number of food web flows describing the food web and satisfying biological constraints are generated, from which one
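
    A hedged toy sketch of the inverse idea described above: recovering one set of non-negative food-web flows consistent with mass-balance constraints (the compartments, equations and numbers are invented; real inverse analyses explore the full space of feasible solutions rather than a single one):

      # Solve an under-determined, bounded mass-balance system for food-web flows.
      import numpy as np
      from scipy.optimize import lsq_linear

      # Unknown flows: x = [phyto->zoo, phyto->detritus, zoo->fish, zoo->detritus]
      A = np.array([
          [1.0, 0.0, -1.0, -1.0],    # zooplankton: intake minus losses (steady state)
          [0.0, 1.0,  0.0,  1.0],    # detritus: total input
      ])
      b = np.array([0.0, 5.0])       # constraints on the right-hand side

      res = lsq_linear(A, b, bounds=(0.0, np.inf))
      print(res.x.round(3))          # one feasible, non-negative flow configuration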

  3. Programming Collective Intelligence Building Smart Web 2.0 Applications

    CERN Document Server

    Segaran, Toby

    2008-01-01

    This fascinating book demonstrates how you can build web applications to mine the enormous amount of data created by people on the Internet. With the sophisticated algorithms in this book, you can write smart programs to access interesting datasets from other web sites, collect data from users of your own applications, and analyze and understand the data once you've found it.

  4. Design and Implementation of a Web Based System for Orphanage ...

    African Journals Online (AJOL)

    We analyzed and examined the public perception of having a web-based information system for orphanage management, and also designed and implemented a web-based system for the management of orphanages. The system we developed keeps track of orphanages, the orphans, the help received by the orphanages and ...

  5. A study on the personalization methods of the web | Hajighorbani ...

    African Journals Online (AJOL)

    ... methods of correct patterns and analyze them. Here we will discuss the basic concepts of web personalization and consider the three approaches of web personalization and we evaluated the methods belonging to each of them. Keywords: personalization, search engine, user preferences, data mining methods ...

  6. Text mining of web-based medical content

    CERN Document Server

    Neustein, Amy

    2014-01-01

    Text Mining of Web-Based Medical Content examines web mining for extracting useful information that can be used for treating and monitoring the healthcare of patients. This work provides methodological approaches to designing mapping tools that exploit data found in social media postings. Specific linguistic features of medical postings are analyzed vis-a-vis available data extraction tools for culling useful information.

  7. An Analysis of Academic Library Web Pages for Faculty

    Science.gov (United States)

    Gardner, Susan J.; Juricek, John Eric; Xu, F. Grace

    2008-01-01

    Web sites are increasingly used by academic libraries to promote key services and collections to teaching faculty. This study analyzes the content, location, language, and technological features of fifty-four academic library Web pages designed especially for faculty to expose patterns in the development of these pages.

  8. Classical Hypermedia Virtues on the Web with Webstrates

    DEFF Research Database (Denmark)

    Bouvin, Niels Olof; Klokmose, Clemens Nylandsted

    2016-01-01

    We show and analyze herein how Webstrates can augment the Web from a classical hypermedia perspective. Webstrates turns the DOM of Web pages into persistent and collaborative objects. We demonstrate how this can be applied to realize bidirectional links, shared collaborative annotations, and in...

  9. Ontology Versioning and Change Detection on the Web

    NARCIS (Netherlands)

    Klein, Michel; Fensel, Dieter; Kiryakov, Atanas; Ognyanov, Damyan

    2002-01-01

    To effectively use ontologies on the Web, it is essential that changes in ontologies are managed well. This paper analyzes the topic of ontology versioning in the context of the Web by looking at the characteristics of the version relation between ontologies and at the identification of online

  10. A Study of Multimedia Annotation of Web-Based Materials

    Science.gov (United States)

    Hwang, Wu-Yuin; Wang, Chin-Yu; Sharples, Mike

    2007-01-01

    Web-based learning has become an important way to enhance learning and teaching, offering many learning opportunities. A limitation of current Web-based learning is the restricted ability of students to personalize and annotate the learning materials. Providing personalized tools and analyzing some types of learning behavior, such as students'…

  11. Analyzing the User Behavior toward Electronic Commerce Stimuli.

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-Del-Amo, María-Del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research consists in analyzing the impact of these web technologies -which constitute the web atmosphere or webmosphere of a website- on shopping human behavior (i.e., users' internal states -affective, cognitive, and satisfaction- and behavioral responses - approach responses, and real shopping outcomes-) within the retail online store created by computer, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test empirically this research. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animate gifts and music background. The effect caused by mediator variables modifies relatively the final shopping human behavior.

  12. Analyzing the user behavior towards Electronic Commerce stimuli

    Directory of Open Access Journals (Sweden)

    Carlota Lorenzo-Romero

    2016-11-01

    Full Text Available Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as a utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. The objective of this research is therefore to analyze the impact of these web technologies, which together constitute the web atmosphere or "webmosphere" of a website, on human shopping behavior (i.e., users' internal states: affective, cognitive, and satisfaction; and behavioral responses: approach responses and real shopping outcomes) within a computer-generated online retail store, taking into account several mediator variables (involvement, atmospheric responsiveness, and perceived risk). A 2 (free versus hierarchical navigational structure) × 2 (music on versus off) × 2 (moving versus static images) between-subjects computer experiment is used to test the research hypotheses empirically. In addition, an integrated methodology was developed that allows the simulation, tracking, and recording of virtual user behavior within an online shopping environment. As its main conclusion, this study suggests that the positive responses of online consumers may increase when they are allowed to navigate the online stores freely and their experience is enriched by animated GIFs and background music. The mediator variables modify the final shopping behavior to some extent.

  13. Web Mining and Social Networking

    CERN Document Server

    Xu, Guandong; Li, Lin

    2011-01-01

    This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web mining, and the issue of how to incorporate web mining into web personalization and recommendation systems are also reviewed. Additionally, the volume explores web community mining and analysis to find the structural, organizational and temporal developments of web communities and reveal the societal s

  14. Spatial networks

    Science.gov (United States)

    Barthélemy, Marc

    2011-02-01

    Complex systems are very often organized in the form of networks where nodes and edges are embedded in space. Transportation and mobility networks, the Internet, mobile phone networks, power grids, social and contact networks, and neural networks are all examples where space is relevant and where topology alone does not contain all the information. Characterizing and understanding the structure and the evolution of spatial networks is thus crucial for many different fields, ranging from urbanism to epidemiology. An important consequence of space on networks is that there is a cost associated with the length of edges, which in turn has dramatic effects on the topological structure of these networks. We will thoroughly explain the current state of our understanding of how spatial constraints affect the structure and properties of these networks. We will review the most recent empirical observations and the most important models of spatial networks. We will also discuss various processes which take place on these spatial networks, such as phase transitions, random walks, synchronization, navigation, resilience, and disease spread.
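
    As a small, hedged illustration of the kind of object the review studies, the sketch below builds a random geometric graph (nodes scattered in the unit square, edges only between nearby nodes) and attaches a Euclidean length, and hence a cost, to every edge; the model choice and parameters are illustrative, not taken from the review:

    ```python
    import math
    import networkx as nx

    # Random geometric graph: 200 nodes dropped uniformly in the unit square,
    # connected whenever they lie within a given radius of each other.
    G = nx.random_geometric_graph(200, radius=0.12, seed=42)
    pos = nx.get_node_attributes(G, "pos")

    # Spatial networks attach a cost to edge length; here, Euclidean distance.
    for u, v in G.edges():
        G[u][v]["length"] = math.dist(pos[u], pos[v])

    total_length = sum(d["length"] for _, _, d in G.edges(data=True))
    print(f"nodes={G.number_of_nodes()}, edges={G.number_of_edges()}, "
          f"total edge length={total_length:.2f}")
    ```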

  15. Spatial interpolation

    NARCIS (Netherlands)

    Stein, A.

    1991-01-01

    The theory and practical application of techniques of statistical interpolation are studied in this thesis, and new developments in multivariate spatial interpolation and the design of sampling plans are discussed. Several applications to studies in soil science are
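
    The excerpt above is brief; as a generic, hedged illustration of spatial interpolation, here is a minimal inverse-distance-weighting sketch in NumPy. IDW is a deliberately simple stand-in for the kriging-style multivariate methods such a thesis typically develops, and the sample values are invented:

    ```python
    import numpy as np

    def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
        """Inverse-distance-weighted interpolation at the query points."""
        d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
        w = 1.0 / (d + eps) ** power          # closer samples get larger weights
        return (w * z_known).sum(axis=1) / w.sum(axis=1)

    # Hypothetical soil measurements at four locations and one query point.
    xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    z = np.array([2.0, 3.0, 4.0, 5.0])
    print(idw(xy, z, np.array([[0.5, 0.5]])))   # roughly the mean, 3.5
    ```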

  16. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, several CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse peak analyzer memory which contains the 32 K analyzer memory and eight AD-converters.

  17. 07051 Working Group Outcomes -- Programming Paradigms for the Web: Web Programming and Web Services

    OpenAIRE

    Hull, Richard; Thiemann, Peter; Wadler, Philip

    2007-01-01

    Participants in the seminar broke into groups on ``Patterns and Paradigms'' for web programming, ``Web Services,'' ``Data on the Web,'' ``Software Engineering'' and ``Security.'' Here we give the raw notes recorded during these sessions.

  18. Kismeth: Analyzer of plant methylation states through bisulfite sequencing

    Directory of Open Access Journals (Sweden)

    Martienssen Robert A

    2008-09-01

    Full Text Available Abstract Background There is great interest in probing the temporal and spatial patterns of cytosine methylation states in genomes of a variety of organisms. It is hoped that this will shed light on the biological roles of DNA methylation in the epigenetic control of gene expression. Bisulfite sequencing refers to the treatment of isolated DNA with sodium bisulfite to convert unmethylated cytosine to uracil, with PCR converting the uracil to thymidine, followed by sequencing of the resultant DNA to detect DNA methylation. For the study of DNA methylation, plants provide an excellent model system, since they can tolerate major changes in their DNA methylation patterns and have long been studied for the effects of DNA methylation on transposons and epimutations. However, in contrast to the situation in animals, there aren't many tools that analyze bisulfite data in plants, which can exhibit methylation of cytosines in a variety of sequence contexts (CG, CHG, and CHH). Results Kismeth (http://katahdin.mssm.edu/kismeth) is a web-based tool for bisulfite sequencing analysis. Kismeth was designed to be used with plants, since it considers potential cytosine methylation in any sequence context (CG, CHG, and CHH). It provides a tool for the design of bisulfite primers as well as several tools for the analysis of the bisulfite sequencing results. Kismeth is not limited to data from plants, as it can be used with data from any species. Conclusion Kismeth simplifies bisulfite sequencing analysis. It is the only publicly available tool for the design of bisulfite primers for plants, and one of the few tools for the analysis of methylation patterns in plants. It facilitates analysis at both global and local scales, demonstrated in the examples cited in the text, allowing dissection of the genetic pathways involved in DNA methylation. Kismeth can also be used to study methylation states in different tissues and disease cells compared to a reference sequence.
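
    A minimal sketch of the core calling logic the abstract describes: after bisulfite treatment, an unmethylated cytosine reads as T while a methylated cytosine still reads as C, and each reference cytosine is assigned a plant sequence context (CG, CHG, or CHH, with H = A, C, or T). The sequences and rules below are illustrative and are not Kismeth's actual implementation:

    ```python
    def call_methylation(reference, bisulfite_read):
        """Classify each reference cytosine by context and methylation state."""
        calls = []
        for i, ref_base in enumerate(reference):
            if ref_base != "C":
                continue
            # Plant sequence contexts: CG, CHG, CHH (H = A, C or T).
            nxt = reference[i + 1:i + 3]
            if nxt.startswith("G"):
                context = "CG"
            elif len(nxt) == 2 and nxt[1] == "G":
                context = "CHG"
            else:
                context = "CHH"
            # Unmethylated C is converted to U (read as T); methylated C stays C.
            state = "methylated" if bisulfite_read[i] == "C" else "unmethylated"
            calls.append((i, context, state))
        return calls

    ref  = "ACGTCAGCTACCTG"
    read = "ATGTCAGCTATTTG"   # hypothetical bisulfite-converted read
    for pos, ctx, state in call_methylation(ref, read):
        print(pos, ctx, state)
    ```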

  19. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  20. Applied Semantic Web Technologies

    CERN Document Server

    Sugumaran, Vijayan

    2011-01-01

    The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with

  1. Head First Web Design

    CERN Document Server

    Watrall, Ethan

    2008-01-01

    Want to know how to make your pages look beautiful, communicate your message effectively, guide visitors through your website with ease, and get everything approved by the accessibility and usability police at the same time? Head First Web Design is your ticket to mastering all of these complex topics, and understanding what's really going on in the world of web design. Whether you're building a personal blog or a corporate website, there's a lot more to web design than div's and CSS selectors, but what do you really need to know? With this book, you'll learn the secrets of designing effecti

  2. Instant Flask web development

    CERN Document Server

    DuPlain, Ron

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. The book uses a bottom-up approach to help you build applications, and is full of step-by-step instructions and practical examples to help you improve your knowledge.Instant Flask Web Development is for developers who are new to web programming, or are familiar with web programming but new to Flask. This book gives you a head start if you have some beginner experience with Python and HTML, or are willing to learn.

  3. The RNAsnp web server

    DEFF Research Database (Denmark)

    Radhakrishnan, Sabarinathan; Tafer, Hakim; Seemann, Ernst Stefan

    2013-01-01

    , are derived from extensive pre-computed tables of distributions of substitution effects as a function of gene length and GC content. Here, we present a web service that not only provides an interface for RNAsnp but also features a graphical output representation. In addition, the web server is connected...... to a local mirror of the UCSC genome browser database that enables the users to select the genomic sequences for analysis and visualize the results directly in the UCSC genome browser. The RNAsnp web server is freely available at: http://rth.dk/resources/rnasnp/....

  4. Programming NET Web Services

    CERN Document Server

    Ferrara, Alex

    2007-01-01

    Web services are poised to become a key technology for a wide range of Internet-enabled applications, spanning everything from straight B2B systems to mobile devices and proprietary in-house software. While there are several tools and platforms that can be used for building web services, developers are finding a powerful tool in Microsoft's .NET Framework and Visual Studio .NET. Designed from scratch to support the development of web services, the .NET Framework simplifies the process--programmers find that tasks that took an hour using the SOAP Toolkit take just minutes. Programming .NET

  5. Express web application development

    CERN Document Server

    Yaapa, Hage

    2013-01-01

    Express Web Application Development is a practical introduction to learning about Express. Each chapter introduces you to a different area of Express, using screenshots and examples to get you up and running as quickly as possible.If you are looking to use Express to build your next web application, ""Express Web Application Development"" will help you get started and take you right through to Express' advanced features. You will need to have an intermediate knowledge of JavaScript to get the most out of this book.

  6. Silicon web process development

    Science.gov (United States)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1981-01-01

    The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

  7. Building Web Reputation Systems

    CERN Document Server

    Farmer, Randy

    2010-01-01

    What do Amazon's product reviews, eBay's feedback score system, Slashdot's Karma System, and Xbox Live's Achievements have in common? They're all examples of successful reputation systems that enable consumer websites to manage and present user contributions most effectively. This book shows you how to design and develop reputation systems for your own sites or web applications, written by experts who have designed web communities for Yahoo! and other prominent sites. Building Web Reputation Systems helps you ask the hard questions about these underlying mechanisms, and why they're critical

  8. Chemistry WebBook

    Science.gov (United States)

    SRD 69 NIST Chemistry WebBook (Web, free access)   The NIST Chemistry WebBook contains: Thermochemical data for over 7000 organic and small inorganic compounds; thermochemistry data for over 8000 reactions; IR spectra for over 16,000 compounds; mass spectra for over 33,000 compounds; UV/Vis spectra for over 1600 compounds; electronic and vibrational spectra for over 5000 compounds; constants of diatomic molecules(spectroscopic data) for over 600 compounds; ion energetics data for over 16,000 compounds; thermophysical property data for 74 fluids.

  9. Web interface for plasma analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M. [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan)], E-mail: emo@nifs.ac.jp; Murakami, S. [Kyoto University, Yoshida-Honmachi, Sakyo-ku, Kyoto 606-8501 (Japan); Yoshida, M.; Funaba, H.; Nagayama, Y. [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan)

    2008-04-15

    There are many analysis codes that analyze various aspects of plasma physics. However, most of them are FORTRAN programs written to be run on supercomputers. On the other hand, many scientists use GUI (graphical user interface)-based operating systems. For those who are not familiar with supercomputers, running analysis codes on them is a difficult task, and such users often hesitate to use these programs to substantiate their ideas. Furthermore, these analysis codes are written for personal use, and the programmers do not expect them to be run by other users. In order to make these programs widely usable, the authors developed user-friendly Web-based interfaces. Since the Web browser is one of the most common applications, this approach is convenient for both users and developers. To realize an interactive Web interface, the AJAX technique is widely used, and the authors also adopted it. Ruby on Rails plays an important role in building such an AJAX-based Web system: since this application framework, written in Ruby, abstracts the Web interfaces necessary to implement AJAX and database functions, it enables programmers to develop Web-based applications efficiently. In this paper, the authors introduce the system and demonstrate the usefulness of this approach.

  10. Web-Based Learning Support System

    Science.gov (United States)

    Fan, Lisa

    A Web-based learning support system offers many benefits over traditional learning environments and has become very popular. The Web is a powerful environment for distributing information and delivering knowledge to an increasingly wide and diverse audience. Typical Web-based learning environments, such as WebCT and Blackboard, include course content delivery tools, quiz modules, grade reporting systems, assignment submission components, etc. They are powerful integrated learning management systems (LMS) that support a number of activities performed by teachers and students during the learning process [1]. However, students who study a course on the Internet tend to be more heterogeneously distributed than those found in a traditional classroom situation. In order to achieve optimal efficiency in the learning process, an individual learner needs his or her own personalized assistance. For a web-based, open, and dynamic learning environment, personalized support for learners becomes even more important. This chapter demonstrates how to realize personalized learning support in dynamic and heterogeneous learning environments by utilizing Adaptive Web technologies. It focuses on course personalization in terms of content and teaching materials according to each student's needs and capabilities. An example of using rough set theory to analyze students' personal information, in order to assist students with effective learning and to predict student performance, is presented.
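
    The chapter closes with a rough set example; as a hedged illustration of the basic rough set machinery (indiscernibility classes and lower/upper approximations of a decision class), here is a small sketch with invented student attributes, not the chapter's actual data or code:

    ```python
    from collections import defaultdict

    # Hypothetical student records: condition attributes -> performance.
    students = {
        "s1": {"hours": "high", "prior": "good", "performance": "pass"},
        "s2": {"hours": "high", "prior": "good", "performance": "pass"},
        "s3": {"hours": "low",  "prior": "good", "performance": "pass"},
        "s4": {"hours": "low",  "prior": "good", "performance": "fail"},
        "s5": {"hours": "low",  "prior": "poor", "performance": "fail"},
    }
    condition_attrs = ("hours", "prior")
    target = {s for s, v in students.items() if v["performance"] == "pass"}

    # Indiscernibility classes: students identical on all condition attributes.
    classes = defaultdict(set)
    for s, v in students.items():
        classes[tuple(v[a] for a in condition_attrs)].add(s)

    # Lower approximation: classes lying entirely inside the target ("certainly pass").
    lower = {s for c in classes.values() if c <= target for s in c}
    # Upper approximation: classes overlapping the target ("possibly pass").
    upper = {s for c in classes.values() if c & target for s in c}

    print("lower approximation:", sorted(lower))   # ['s1', 's2']
    print("upper approximation:", sorted(upper))   # ['s1', 's2', 's3', 's4']
    ```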

  11. Web interface for plasma analysis codes

    International Nuclear Information System (INIS)

    Emoto, M.; Murakami, S.; Yoshida, M.; Funaba, H.; Nagayama, Y.

    2008-01-01

    There are many analysis codes that analyze various aspects of plasma physics. However, most of them are FORTRAN programs written to be run on supercomputers. On the other hand, many scientists use GUI (graphical user interface)-based operating systems. For those who are not familiar with supercomputers, running analysis codes on them is a difficult task, and such users often hesitate to use these programs to substantiate their ideas. Furthermore, these analysis codes are written for personal use, and the programmers do not expect them to be run by other users. In order to make these programs widely usable, the authors developed user-friendly Web-based interfaces. Since the Web browser is one of the most common applications, this approach is convenient for both users and developers. To realize an interactive Web interface, the AJAX technique is widely used, and the authors also adopted it. Ruby on Rails plays an important role in building such an AJAX-based Web system: since this application framework, written in Ruby, abstracts the Web interfaces necessary to implement AJAX and database functions, it enables programmers to develop Web-based applications efficiently. In this paper, the authors introduce the system and demonstrate the usefulness of this approach.

  12. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    Energy Technology Data Exchange (ETDEWEB)

    Ma Xiuzeng [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States)]. E-mail: hongju@purdue.edu; Li Yingkui [Department of Geography, University of Missouri-Columbia, Columbia, MO 65211 (United States); Bourgeois, Mike [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Caffee, Marc [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Elmore, David [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Granger, Darryl [Department of Earth and Atmospheric Sciences, Purdue University, West Lafayette, IN 47907 (United States); Muzikar, Paul [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Smith, Preston [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States)

    2007-06-15

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this, it is critical to establish an effective, easily accessible and well defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on nuclide concentrations measured by accelerator mass spectrometry. WebCN for ¹⁰Be and ²⁶Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for ³⁶Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
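
    The abstract does not give WebCN's equations; as a hedged sketch of the kind of computation such a tool performs, the following assumes the standard single-nuclide model N = (P / (λ + ρε/Λ)) · (1 − exp(−(λ + ρε/Λ) t)) and solves it for the exposure age t. The parameter values are illustrative, not WebCN's calibrated production rates:

    ```python
    import math

    def exposure_age(N, P, half_life_yr, erosion_cm_yr=0.0,
                     density=2.7, attenuation=160.0):
        """Surface exposure age (yr) from a measured nuclide concentration.

        N: concentration (atoms/g), P: surface production rate (atoms/g/yr),
        density (g/cm^3), attenuation length (g/cm^2). Illustrative model only.
        """
        lam = math.log(2) / half_life_yr                   # decay constant (1/yr)
        mu = lam + density * erosion_cm_yr / attenuation   # effective loss rate
        arg = 1.0 - N * mu / P
        if arg <= 0:
            raise ValueError("concentration at or above steady state")
        return -math.log(arg) / mu

    # Hypothetical 10Be measurement: 150,000 atoms/g, production 5 atoms/g/yr.
    print(f"{exposure_age(1.5e5, 5.0, half_life_yr=1.39e6):.0f} yr")
    ```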

  13. Investigating the Cosmic Web with Topological Data Analysis

    Science.gov (United States)

    Cisewski-Kehe, Jessi; Wu, Mike; Fasy, Brittany; Hellwing, Wojciech; Lovell, Mark; Rinaldo, Alessandro; Wasserman, Larry

    2018-01-01

    Data exhibiting complicated spatial structures are common in many areas of science (e.g., cosmology, biology), but can be difficult to analyze. Persistent homology is a popular approach within the area of Topological Data Analysis (TDA) that offers a new way to represent, visualize, and interpret complex data by extracting topological features, which can be used to infer properties of the underlying structures. In particular, TDA may be useful for analyzing the large-scale structure (LSS) of the Universe, which is an intricate and spatially complex web of matter. In order to understand the physics of the Universe, theoretical and computational cosmologists develop large-scale simulations that allow for visualizing and analyzing the LSS under varying physical assumptions. Each point in the 3D data set represents a galaxy or a cluster of galaxies, and topological summaries ("persistence diagrams") can be obtained that summarize the ordered holes in the data (e.g., connected components, loops, voids). The topological summaries are interesting and informative descriptors of the Universe on their own, but hypothesis tests using the topological summaries would provide a way to make more rigorous comparisons of LSS under different theoretical models. For example, the standard cosmological model includes cold dark matter (CDM); however, while the case for CDM is strong, there are some observational inconsistencies with this theory. Another possibility is warm dark matter (WDM). It is of interest to see whether a CDM Universe and a WDM Universe produce LSS that is topologically distinct. We present several possible test statistics for two-sample hypothesis tests using the topological summaries, carry out a simulation study to investigate the suitability of the proposed test statistics using simulated data from a variation of the Voronoi foam model, and finally apply the proposed inference framework to WDM vs. CDM cosmological simulation data.
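
    The abstract proposes two-sample tests on topological summaries; as a hedged sketch of one simple option, here is a permutation test on a scalar summary statistic (for instance, the number of loops whose persistence exceeds some threshold). The per-simulation values below are invented stand-ins, not CDM/WDM results:

    ```python
    import numpy as np

    def permutation_test(a, b, n_perm=10_000, seed=0):
        """Two-sided permutation test for a difference in means between samples."""
        rng = np.random.default_rng(seed)
        observed = abs(a.mean() - b.mean())
        pooled = np.concatenate([a, b])
        count = 0
        for _ in range(n_perm):
            rng.shuffle(pooled)
            diff = abs(pooled[:len(a)].mean() - pooled[len(a):].mean())
            count += diff >= observed
        return (count + 1) / (n_perm + 1)

    # Hypothetical per-simulation summaries, e.g. counts of significant loops.
    cdm = np.array([118., 124., 121., 130., 127., 119., 125., 123.])
    wdm = np.array([101., 110., 106., 104., 112., 108., 103., 107.])
    print("p-value:", permutation_test(cdm, wdm))
    ```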

  14. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  15. Web Mining and Social Networking

    DEFF Research Database (Denmark)

    Xu, Guandong; Zhang, Yanchun; Li, Lin

    This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web mining, and the issue of how to incorporate web mining into web personalization and recommendation systems, are also reviewed. Additionally, the volume explores web community mining and analysis to find the structural, organizational and temporal developments of web communities and reveal the societal sense of individuals or communities. The volume will benefit both academic and industry communities interested in the techniques and applications of web search, web data management, web mining and web knowledge discovery, as well as web community and social network analysis.

  16. Efficient Web Harvesting Strategies for Monitoring Deep Web Content

    NARCIS (Netherlands)

    Khelghati, Mohammadreza; Hiemstra, Djoerd; van Keulen, Maurice

    2016-01-01

    Web content changes rapidly [18]. In Focused Web Harvesting [17], whose aim is to achieve a complete harvest for a given topic, this dynamic nature of the web creates problems for users who need access to the set of all web data relevant to their topics of interest. Whether you are a fan

  17. Efficient Web Harvesting Strategies for Monitoring Deep Web Content

    NARCIS (Netherlands)

    Khelghati, Mohammadreza; Hiemstra, Djoerd; van Keulen, Maurice

    2016-01-01

    Web content changes rapidly. In Focused Web Harvesting [?], which aims at achieving a complete harvest for a given topic, this dynamic nature of the web creates problems for users who need access to a complete set of web data related to their topics of interest. Whether you are a fan

  18. Web document engineering

    International Nuclear Information System (INIS)

    White, B.

    1996-05-01

    This tutorial provides an overview of several document engineering techniques which are applicable to the authoring of World Wide Web documents. It illustrates how pre-WWW hypertext research is applicable to the development of WWW information resources

  19. Fun With Food Webs

    Science.gov (United States)

    Smith, Karl D.

    1977-01-01

    Explains an upper elementary game of tag that illustrates energy flow in food webs using candy bars as food sources. A follow-up field trip to a river and five language arts projects are also suggested. (CS)

  20. Airport Status Web Service

    Data.gov (United States)

    Department of Transportation — A web service that allows end-users the ability to query the current known delays in the National Airspace System as well as the current weather from NOAA by airport...

  1. Semantic Web Development

    National Research Council Canada - National Science Library

    Berners-Lee, Tim; Swick, Ralph

    2006-01-01

    ...) project between 2002 and 2005 provided key steps in the research in the Semantic Web technology, and also played an essential role in delivering the technology to industry and government in the form...

  2. Web cache location

    Directory of Open Access Journals (Sweden)

    Boffey Brian

    2004-01-01

    Full Text Available Stress placed on network infrastructure by the popularity of the World Wide Web may be partially relieved by keeping multiple copies of Web documents at geographically dispersed locations. In particular, the use of proxy caches and replication provides a means of storing information 'nearer to end users'. This paper concentrates on the locational aspects of Web caching, giving an overview of existing research from an operational research point of view and putting forward avenues for possible further research. This area of research is in its infancy, and the emphasis will be on themes and trends rather than on algorithm construction. Finally, Web caching problems are briefly related to referral systems more generally.
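
    The review is about locational analysis rather than a specific algorithm; to fix ideas, here is a hedged sketch of a greedy, facility-location style heuristic that places k proxy caches among candidate nodes so as to minimise demand-weighted client-to-cache distance. The network, distances, and demands are invented:

    ```python
    # Hypothetical distances (e.g. network hops) from each client region to each
    # candidate cache node, and request volumes per client region.
    dist = {
        "clients_A": {"node1": 1, "node2": 4, "node3": 6},
        "clients_B": {"node1": 5, "node2": 2, "node3": 3},
        "clients_C": {"node1": 7, "node2": 3, "node3": 1},
    }
    demand = {"clients_A": 100, "clients_B": 60, "clients_C": 80}

    def total_cost(placed):
        # Each client region is served by its nearest placed cache.
        return sum(demand[c] * min(dist[c][n] for n in placed) for c in dist)

    def greedy_place(candidates, k):
        placed = []
        for _ in range(k):
            best = min((n for n in candidates if n not in placed),
                       key=lambda n: total_cost(placed + [n]))
            placed.append(best)
        return placed

    placement = greedy_place(["node1", "node2", "node3"], k=2)
    print(placement, total_cost(placement))   # ['node2', 'node1'] 460
    ```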

  3. OECD eXplorer: Making Regional Statistics Come Alive through a Geo-Visual Web-Tool

    Directory of Open Access Journals (Sweden)

    Monica Brezzi

    2011-06-01

    Full Text Available Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing highly interactive Geovisual Analytics applications for the Internet. An emerging and challenging application domain is geovisualization of regional (sub-national) statistics. Higher integration driven by institutional processes and economic globalisation is eroding national borders and creating competition along regional lines in the world market. Sound information at sub-national level and benchmark of regions across borders have gained importance in the policy agenda of many countries. In this paper, we introduce “OECD eXplorer”, an interactive tool for analyzing and communicating gained insights and discoveries about spatial-temporal and multivariate OECD regional data. This database is a potential treasure chest for policy-makers, researchers and citizens to gain a better understanding of a region’s structure and performance and to carry out analysis of territorial trends and disparities based on sound information comparable across countries. Many approaches and tools have been developed in spatial-related knowledge discovery but generally they do not scale well with dynamic visualization of larger spatial data on the Internet. In this context, we introduce a web-compliant Geovisual Analytics toolkit that supports a broad collection of functional components for analysis, hypothesis generation and validation. The same tool enables the communication of results on the basis of a snapshot mechanism that captures, re-uses and shares task-related explorative findings. Further developments underway are in the creation of a generic highly interactive web “eXplorer” platform that can be the foundation for easy customization of similar web applications using different geographical boundaries and indicators. Given this global dimension, a “generic eXplorer” will be a powerful tool to explore different territorial dimensions

  4. Forensic web watch.

    Science.gov (United States)

    Abbas, Ali; N Rutty, Guy

    2003-06-01

    When one thinks of print identification techniques, one automatically considers fingerprints. Although fingerprints have now been in use for over 100 years, there is in fact an older type of identification technique related to prints left at scenes of crime and the anatomy of human body parts: the world of ear prints. This short web review considers web sites related to ear print identification, particularly the continuing controversy as to whether or not an ear print is unique.

  5. Node web development

    CERN Document Server

    Herron, David

    2013-01-01

    Presented in a simple, step-by-step format, this book is an introduction to web development with Node. This book is for anybody looking for an alternative to the "P" languages (Perl, PHP, Python), or anyone looking for a new paradigm of server-side application development. The reader should have at least a rudimentary understanding of JavaScript and web application development.

  6. Even Faster Web Sites Performance Best Practices for Web Developers

    CERN Document Server

    Souders, Steve

    2009-01-01

    Performance is critical to the success of any web site, and yet today's web applications push browsers to their limits with increasing amounts of rich content and heavy use of Ajax. In this book, Steve Souders, web performance evangelist at Google and former Chief Performance Yahoo!, provides valuable techniques to help you optimize your site's performance. Souders' previous book, the bestselling High Performance Web Sites, shocked the web development world by revealing that 80% of the time it takes for a web page to load is on the client side. In Even Faster Web Sites, Souders and eight exp

  7. Beginning ASPNET Web Pages with WebMatrix

    CERN Document Server

    Brind, Mike

    2011-01-01

    Learn to build dynamic web sites with Microsoft WebMatrix Microsoft WebMatrix is designed to make developing dynamic ASP.NET web sites much easier. This complete Wrox guide shows you what it is, how it works, and how to get the best from it right away. It covers all the basic foundations and also introduces HTML, CSS, and Ajax using jQuery, giving beginning programmers a firm foundation for building dynamic web sites.Examines how WebMatrix is expected to become the new recommended entry-level tool for developing web sites using ASP.NETArms beginning programmers, students, and educators with al

  8. Analyzing petabytes of data with Hadoop

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  9. From Web accessibility to Web adaptability.

    Science.gov (United States)

    Kelly, Brian; Nevile, Liddy; Sloan, David; Fanou, Sotiris; Ellison, Ruth; Herrod, Lisa

    2009-07-01

    This article asserts that current approaches to enhance the accessibility of Web resources fail to provide a solid foundation for the development of a robust and future-proofed framework. In particular, they fail to take advantage of new technologies and technological practices. The article introduces a framework for Web adaptability, which encourages the development of Web-based services that can be resilient to the diversity of uses of such services, the target audience, available resources, technical innovations, organisational policies and relevant definitions of 'accessibility'. The article refers to a series of author-focussed approaches to accessibility through which the authors and others have struggled to find ways to promote accessibility for people with disabilities. These approaches depend upon the resource author's determination of the anticipated users' needs and their provision. Through approaches labelled as 1.0, 2.0 and 3.0, the authors have widened their focus to account for contexts and individual differences in target audiences. Now, the authors want to recognise the role of users in determining their engagement with resources (including services). To distinguish this new approach, the term 'adaptability' has been used to replace 'accessibility'; new definitions of accessibility have been adopted, and the authors have reviewed their previous work to clarify how it is relevant to the new approach. Accessibility 1.0 is here characterised as a technical approach in which authors are told how to construct resources for a broadly defined audience. This is known as universal design. Accessibility 2.0 was introduced to point to the need to account for the context in which resources would be used, to help overcome inadequacies identified in the purely technical approach. Accessibility 3.0 moved the focus on users from a homogenised universal definition to recognition of the idiosyncratic needs and preferences of individuals and to cater for them. All of

  10. Usability Evaluation of Public Web Mapping Sites

    Science.gov (United States)

    Wang, C.

    2014-04-01

    Web mapping sites are interactive maps accessed via web pages. With the rapid development of the Internet and the Geographic Information System (GIS) field, public web mapping sites have become familiar to most people. People now use these sites for various reasons, as a growing number of maps and related map services are freely available to end users. The growing number of users has in turn led to more usability studies. Usability Engineering (UE), for instance, is an approach for analyzing and improving the usability of websites by examining and evaluating an interface. In this research, the UE method was employed to explore the usability problems of four public web mapping sites, analyze the problems quantitatively, and provide guidelines for future design based on the test results. First, the development of usability studies is described, and several usability evaluation methods such as Usability Engineering (UE), User-Centered Design (UCD) and Human-Computer Interaction (HCI) are briefly introduced. The method and procedure of the usability test experiments are then presented in detail. In this usability evaluation experiment, four public web mapping sites (Google Maps, Bing Maps, MapQuest, Yahoo Maps) were chosen as the test websites, and 42 people of different GIS skill levels (test users or experts), genders, ages and nationalities participated in the test, completing several test tasks in different teams. The test comprised three parts: a pretest background information questionnaire, several test tasks for quantitative statistics and progress analysis, and a posttest questionnaire. The pretest and posttest questionnaires focused on gaining qualitative, verbal explanations of participants' actions, while the test tasks were designed to gather quantitative data on the errors and problems of the websites. The results, mainly from the test tasks, were then analyzed. The

  11. Spatial distribution

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Hendrichsen, Ditte Katrine; Nachman, Gøsta Støger

    2008-01-01

    Living organisms are distributed over the entire surface of the planet. The distribution of the individuals of each species is not random; on the contrary, they are strongly dependent on the biology and ecology of the species, and vary over different spatial scales. The structure of whole populations reflects the location and fragmentation pattern of the habitat types preferred by the species, and the complex dynamics of migration, colonization, and population growth taking place over the landscape. Within these, individuals are distributed among each other in regular or clumped patterns, depending on the nature of intraspecific interactions between them: while the individuals of some species repel each other and partition the available area, others form groups of varying size, determined by the fitness of each group member. The spatial distribution pattern of individuals again strongly...
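
    As a small, hedged illustration of the regular/clumped distinction the excerpt ends on, the sketch below computes the variance-to-mean ratio (index of dispersion) for counts of individuals per sampling quadrat: values near 1 suggest a random pattern, values well below 1 a regular one, and values well above 1 a clumped one. The counts are invented:

    ```python
    import numpy as np

    def dispersion_index(quadrat_counts):
        """Variance-to-mean ratio of individuals counted per sampling quadrat."""
        counts = np.asarray(quadrat_counts, dtype=float)
        return counts.var(ddof=1) / counts.mean()

    regular = [4, 5, 4, 5, 4, 5, 4, 5]        # individuals spread evenly
    clumped = [0, 0, 15, 1, 0, 14, 0, 6]      # individuals concentrated in groups

    print(f"regular pattern: {dispersion_index(regular):.2f}")   # well below 1
    print(f"clumped pattern: {dispersion_index(clumped):.2f}")   # well above 1
    ```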

  12. Spatial Culture

    DEFF Research Database (Denmark)

    Reeh, Henrik

    2012-01-01

    Spatial Culture – A Humanities Perspective Abstract of introductory essay by Henrik Reeh Secured by alliances between socio-political development and cultural practices, a new field of humanistic studies in spatial culture has developed since the 1990s. To focus on links between urban culture...... and modern society is, however, an intellectual practice which has a much longer history. Already in the 1980s, the debate on the modern and the postmodern cited Paris and Los Angeles as spatio-cultural illustrations of these major philosophical concepts. Earlier, in the history of critical studies, the work...... Foucault considered a constitutive feature of 20th-century thinking and one that continues to occupy intellectual and cultural debates in the third millennium. A conceptual framework is, nevertheless, necessary, if the humanities are to adequa-tely address city and space – themes that have long been...

  13. Webs and posets

    International Nuclear Information System (INIS)

    Dukes, M.; Gardi, E.; McAslan, H.; Scott, D.J.; White, C.D.

    2014-01-01

    The non-Abelian exponentiation theorem has recently been generalised to correlators of multiple Wilson line operators. The perturbative expansions of these correlators exponentiate in terms of sets of diagrams called webs, which together give rise to colour factors corresponding to connected graphs. The colour and kinematic degrees of freedom of individual diagrams in a web are entangled by mixing matrices of purely combinatorial origin. In this paper we relate the combinatorial study of these matrices to properties of partially ordered sets (posets), and hence obtain explicit solutions for certain families of web-mixing matrix, at arbitrary order in perturbation theory. We also provide a general expression for the rank of a general class of mixing matrices, which governs the number of independent colour factors arising from such webs. Finally, we use the poset language to examine a previously conjectured sum rule for the columns of web-mixing matrices which governs the cancellation of the leading subdivergences between diagrams in the web. Our results, when combined with parallel developments in the evaluation of kinematic integrals, offer new insights into the all-order structure of infrared singularities in non-Abelian gauge theories

  14. c-Mantic: A Cytoscape plugin for Semantic Web

    Science.gov (United States)

    Semantic Web tools can streamline the process of storing, analyzing and sharing biological information. Visualization is important for communicating such complex biological relationships. Here we use the flexibility and speed of the Cytoscape platform to interactively visualize s...

  15. GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.

    Science.gov (United States)

    Liang, Steve H L; Huang, Chih-Yuan

    2013-10-02

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amount of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with the demonstration of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  16. Metadata Schema Used in OCLC Sampled Web Pages

    Directory of Open Access Journals (Sweden)

    Fei Yu

    2005-12-01

    Full Text Available The tremendous growth of Web resources has made information organization and retrieval more and more difficult. As one approach to this problem, metadata schemas have been developed to characterize Web resources. However, many questions have been raised about the use of metadata schemas, such as: which metadata schemas have been used on the Web? How do they describe Web-accessible information? What is the distribution of these metadata schemas among Web pages? Do certain schemas dominate the others? To address these issues, this study analyzed 16,383 Web pages with meta tags extracted from 200,000 OCLC-sampled Web pages in 2000. It found that only 8.19% of the Web pages used meta tags; description tags, keyword tags, and Dublin Core tags were the only three schemas used in the Web pages. This article reveals the use of meta tags in terms of their function distribution, syntax characteristics, granularity of the Web pages, and the length and word-count distributions of both description and keywords tags.
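
    A minimal sketch of the kind of meta-tag extraction such a study relies on, using only the Python standard library; the HTML sample and the three schema categories (description, keywords, Dublin Core) mirror the abstract, but the code is illustrative rather than the study's actual tooling:

    ```python
    from collections import Counter
    from html.parser import HTMLParser

    class MetaTagCounter(HTMLParser):
        """Count description, keywords and Dublin Core meta tags in a page."""
        def __init__(self):
            super().__init__()
            self.schemas = Counter()

        def handle_starttag(self, tag, attrs):
            if tag != "meta":
                return
            name = dict(attrs).get("name", "").lower()
            if name.startswith("dc."):
                self.schemas["dublin_core"] += 1
            elif name in ("description", "keywords"):
                self.schemas[name] += 1

    html = """<html><head>
    <meta name="description" content="Sample page">
    <meta name="keywords" content="web, metadata">
    <meta name="DC.Creator" content="Jane Doe">
    </head><body></body></html>"""

    parser = MetaTagCounter()
    parser.feed(html)
    print(parser.schemas)   # Counter({'description': 1, 'keywords': 1, 'dublin_core': 1})
    ```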

  17. Easy web interfaces to IDL code for NSTX Data Analysis

    International Nuclear Information System (INIS)

    Davis, W.M.

    2012-01-01

    Highlights: ► Web interfaces to IDL code can be developed quickly. ► Dozens of Web Tools are used effectively on NSTX for Data Analysis. ► Web interfaces are easier to use than X-window applications. - Abstract: Reusing code is a well-known Software Engineering practice to substantially increase the efficiency of code production, as well as to reduce errors and debugging time. A variety of “Web Tools” for the analysis and display of raw and analyzed physics data are in use on NSTX [1], and new ones can be produced quickly from existing IDL [2] code. A Web Tool with only a few inputs, and which calls an IDL routine written in the proper style, can be created in less than an hour; more typical Web Tools with dozens of inputs, and the need for some adaptation of existing IDL code, can be working in a day or so. Efficiency is also increased for users of Web Tools because of the familiar interface of the web browser, and not needing X-windows, or accounts and passwords, when used within our firewall. Web Tools were adapted for use by PPPL physicists accessing EAST data stored in MDSplus with only a few man-weeks of effort; adapting to additional sites should now be even easier. An overview of Web Tools in use on NSTX, and a list of the most useful features, is also presented.

  18. Creation of a Web-Based GIS Server and Custom Geoprocessing Tools for Enhanced Hydrologic Applications

    Science.gov (United States)

    Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.

    2010-12-01

    Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for the assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as GIS Server. GIS provides a cost-effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets, which were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings including: (1) an extensional environment (Red Sea rift), (2) a transcurrent fault system (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS could also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g. pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, an additional set of advanced data manipulation and display tools was developed to allow for a more complete and customizable view of the area of interest. The most notable addition to the standard GIS Server tools is the set of custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster

  19. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  20. Learning from WebQuests

    Science.gov (United States)

    Gaskill, Martonia; McNulty, Anastasia; Brooks, David W.

    2006-01-01

    WebQuests are activities in which students use Web resources to learn about school topics. WebQuests are advocated as constructivist activities and ones generally well regarded by students. Two experiments were conducted in school settings to compare learning using WebQuests versus conventional instruction. Students and teachers both enjoyed…

  1. Semantic Metadata for Heterogeneous Spatial Planning Documents

    Science.gov (United States)

    Iwaniak, A.; Kaczmarek, I.; Łukowicz, J.; Strzelecki, M.; Coetzee, S.; Paluszyński, W.

    2016-09-01

    Spatial planning documents contain information about the principles and rights of land use in different zones of a local authority. They are the basis for administrative decision making in support of sustainable development. In Poland these documents are published on the Web according to a prescribed non-extendable XML schema, designed for optimum presentation to humans in HTML web pages. There is no document standard, and limited functionality exists for adding references to external resources. The text in these documents is discoverable and searchable by general-purpose web search engines, but the semantics of the content cannot be discovered or queried. The spatial information in these documents is geographically referenced but not machine-readable. Major manual efforts are required to integrate such heterogeneous spatial planning documents from various local authorities for analysis, scenario planning and decision support. This article presents results of an implementation using machine-readable semantic metadata to identify relationships among regulations in the text, spatial objects in the drawings and links to external resources. A spatial planning ontology was used to annotate different sections of spatial planning documents with semantic metadata in the Resource Description Framework in Attributes (RDFa). The semantic interpretation of the content, links between document elements and links to external resources were embedded in XHTML pages. An example and use case from the spatial planning domain in Poland is presented to evaluate its efficiency and applicability. The solution enables the automated integration of spatial planning documents from multiple local authorities to assist decision makers with understanding and interpreting spatial planning information. The approach is equally applicable to legal documents from other countries and domains, such as cultural heritage and environmental management.

  2. SEMANTIC METADATA FOR HETEROGENEOUS SPATIAL PLANNING DOCUMENTS

    Directory of Open Access Journals (Sweden)

    A. Iwaniak

    2016-09-01

    Full Text Available Spatial planning documents contain information about the principles and rights of land use in different zones of a local authority. They are the basis for administrative decision making in support of sustainable development. In Poland these documents are published on the Web according to a prescribed non-extendable XML schema, designed for optimum presentation to humans in HTML web pages. There is no document standard, and limited functionality exists for adding references to external resources. The text in these documents is discoverable and searchable by general-purpose web search engines, but the semantics of the content cannot be discovered or queried. The spatial information in these documents is geographically referenced but not machine-readable. Major manual efforts are required to integrate such heterogeneous spatial planning documents from various local authorities for analysis, scenario planning and decision support. This article presents results of an implementation using machine-readable semantic metadata to identify relationships among regulations in the text, spatial objects in the drawings and links to external resources. A spatial planning ontology was used to annotate different sections of spatial planning documents with semantic metadata in the Resource Description Framework in Attributes (RDFa). The semantic interpretation of the content, links between document elements and links to external resources were embedded in XHTML pages. An example and use case from the spatial planning domain in Poland is presented to evaluate its efficiency and applicability. The solution enables the automated integration of spatial planning documents from multiple local authorities to assist decision makers with understanding and interpreting spatial planning information. The approach is equally applicable to legal documents from other countries and domains, such as cultural heritage and environmental management.

  3. Novel Approach to Analyzing MFE of Noncoding RNA Sequences.

    Science.gov (United States)

    George, Tina P; Thomas, Tessamma

    2016-01-01

    Genomic studies have become noncoding RNA (ncRNA) centric, as the study of different genomes over the past decades has provided enormous amounts of information on ncRNA. The function of an ncRNA is determined by its secondary structure, and across organisms the secondary structure is more conserved than the sequence itself. In this study, the optimal secondary structure, or minimum free energy (MFE) structure, of ncRNA was found based on the thermodynamic nearest-neighbor model. The MFE of over 2600 ncRNA sequences was analyzed in view of its signal properties. Mathematical models linking MFE to the signal properties were found for each of the four classes of ncRNA analyzed. MFE values computed with the proposed models were in concordance with those obtained with the standard web servers. A total of 95% of the sequences analyzed had MFE deviations within ±15% relative to those obtained from standard web servers.
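
    The study builds on the thermodynamic nearest-neighbor model; as a hedged illustration of that model's core idea, the toy calculation below sums stacking free energies over adjacent base pairs of a fully paired helix plus an initiation penalty. The parameter values are illustrative placeholders, not the published nearest-neighbor tables used by the paper or by standard folding servers:

    ```python
    # Illustrative (not published) stacking free energies, kcal/mol, keyed by the
    # 5'->3' dinucleotide step on one strand of a perfectly paired helix.
    STACK_DG = {
        "GC": -3.4, "CG": -2.4, "GG": -3.3, "CC": -3.3,
        "AU": -1.1, "UA": -1.3, "AA": -0.9, "UU": -0.9,
        "GA": -2.4, "AG": -2.1, "GU": -2.2, "UG": -2.1,
        "CA": -2.1, "AC": -2.2, "CU": -2.1, "UC": -2.4,
    }
    HELIX_INIT_DG = 4.1   # illustrative helix-initiation penalty, kcal/mol

    def duplex_free_energy(strand):
        """Nearest-neighbor free energy of a perfect duplex formed by `strand`."""
        dg = HELIX_INIT_DG
        for i in range(len(strand) - 1):
            dg += STACK_DG[strand[i:i + 2]]   # add one stacking term per step
        return dg

    print(f"{duplex_free_energy('GCCAUUGC'):.1f} kcal/mol")
    ```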

  4. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Fujisawa, A.; Iguchi, H.; Nishizawa, A.; Kawasumi, Y.

    1996-11-01

    By a new modification of a parallel plate analyzer, the second-order focus is obtained at an arbitrary injection angle. This kind of analyzer, with a small injection angle, will have the advantage of a small operating voltage, compared to the Proca and Green analyzer, where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for the precise energy measurement of high-energy particles in the MeV range. (author)

  5. Metadata in Arabic Libraries' Web Sites in Egypt and Saudi Arabia : An Applied Study

    Directory of Open Access Journals (Sweden)

    Zain A.Hady

    2005-03-01

    Full Text Available An applied study aiming to analyze the metadata of Arabic libraries' Web sites in Egypt and Saudi Arabia. It begins with a methodological introduction; the study then analyzes the Web sites using Meta Tag Analyzer software. The following Web sites were included: Library of Alexandria, Egyptian Libraries, Egyptian National, King Fahd National Library, King Abdel Aziz Public Library, and Mubarak Public Library.

  6. Analyzing water/wastewater infrastructure interdependencies

    International Nuclear Information System (INIS)

    Gillette, J. L.; Fisher, R. E.; Peerenboom, J. P.; Whitfield, R. G.

    2002-01-01

    This paper describes four general categories of infrastructure interdependencies (physical, cyber, geographic, and logical) as they apply to the water/wastewater infrastructure, and provides an overview of one of the analytic approaches and tools used by Argonne National Laboratory to evaluate interdependencies. Also discussed are the dimensions of infrastructure interdependency that create spatial, temporal, and system representation complexities that make analyzing the water/wastewater infrastructure particularly challenging. An analytical model developed to incorporate the impacts of interdependencies on infrastructure repair times is briefly addressed.

  7. A demanding web-based PACS supported by web services technology

    Science.gov (United States)

    Costa, Carlos M. A.; Silva, Augusto; Oliveira, José L.; Ribeiro, Vasco G.; Ribeiro, José

    2006-03-01

    In recent years, the ubiquity of web interfaces has pushed practically all PACS suppliers to develop client applications in which clinical practitioners can receive and analyze medical images using conventional personal computers and Web browsers. However, due to security and performance issues, the use of these software packages has been restricted to intranets. Paradoxically, one of the most important advantages of digital image systems is to simplify the widespread sharing and remote access of medical data between healthcare institutions. This paper analyses the traditional PACS drawbacks that contribute to their reduced usage on the Internet and describes a PACS based on Web Services technology that supports a customized DICOM encoding syntax and a specific compression scheme, providing all historical patient data in a unique Web interface.

  8. Programming Web services with Perl

    CERN Document Server

    Ray, Randy J

    2003-01-01

    Given Perl's natural fit for web applications development, it's no surprise that Perl is also a natural choice for web services development. It's the most popular web programming language, with strong implementations of both SOAP and XML-RPC, the leading ways to distribute applications using web services. But books on web services focus on writing these applications in Java or Visual Basic, leaving Perl programmers with few resources to get them started. Programming Web Services with Perl changes that, bringing Perl users all the information they need to create web services using their favorite language.

  9. CMS offline web tools

    International Nuclear Information System (INIS)

    Metson, S; Newbold, D; Belforte, S; Kavka, C; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Tuura, L; Evans, D; Fanfani, A; Feichtinger, D; Kuznetsov, V; Lingen, F van; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and to interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline effort we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to use the CMS computing system effectively. The CMS web tools project aims to provide a consistent interface to all these tools

  10. CMS offline web tools

    Energy Technology Data Exchange (ETDEWEB)

    Metson, S; Newbold, D [H.H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Belforte, S; Kavka, C [INFN, Sezione di Trieste (Italy); Bockelman, B [University of Nebraska Lincoln, Lincoln, NE (United States); Dziedziniewicz, K [CERN, Geneva (Switzerland); Egeland, R [University of Minnesota Twin Cities, Minneapolis, MN (United States); Elmer, P [Princeton (United States); Eulisse, G; Tuura, L [Northeastern University, Boston, MA (United States); Evans, D [Fermilab MS234, Batavia, IL (United States); Fanfani, A [Universita degli Studi di Bologna (Italy); Feichtinger, D [PSI, Villigen (Switzerland); Kuznetsov, V [Cornell University, Ithaca, NY (United States); Lingen, F van [California Institute of Technology, Pasadena, CA (United States); Wakefield, S [Blackett Laboratory, Imperial College, London (United Kingdom)

    2008-07-15

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and to interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline effort we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to use the CMS computing system effectively. The CMS web tools project aims to provide a consistent interface to all these tools.

  11. Tracing the cosmic web

    Science.gov (United States)

    Libeskind, Noam I.; van de Weygaert, Rien; Cautun, Marius; Falck, Bridget; Tempel, Elmo; Abel, Tom; Alpaslan, Mehmet; Aragón-Calvo, Miguel A.; Forero-Romero, Jaime E.; Gonzalez, Roberto; Gottlöber, Stefan; Hahn, Oliver; Hellwing, Wojciech A.; Hoffman, Yehuda; Jones, Bernard J. T.; Kitaura, Francisco; Knebe, Alexander; Manti, Serena; Neyrinck, Mark; Nuza, Sebastián E.; Padilla, Nelson; Platen, Erwin; Ramachandra, Nesar; Robotham, Aaron; Saar, Enn; Shandarin, Sergei; Steinmetz, Matthias; Stoica, Radu S.; Sousbie, Thierry; Yepes, Gustavo

    2018-01-01

    The cosmic web is one of the most striking features of the distribution of galaxies and dark matter on the largest scales in the Universe. It is composed of dense regions packed full of galaxies, long filamentary bridges, flattened sheets and vast low-density voids. The study of the cosmic web has focused primarily on the identification of such features, and on understanding the environmental effects on galaxy formation and halo assembly. As such, a variety of different methods have been devised to classify the cosmic web - depending on the data at hand, be it numerical simulations, large sky surveys or other. In this paper, we bring 12 of these methods together and apply them to the same data set in order to understand how they compare. In general, these cosmic-web classifiers have been designed with different cosmological goals in mind, and to study different questions. Therefore, one would not a priori expect agreement between different techniques; however, many of these methods do converge on the identification of specific features. In this paper, we study the agreements and disparities of the different methods. For example, each method finds that knots inhabit higher density regions than filaments, etc. and that voids have the lowest densities. For a given web environment, we find a substantial overlap in the density range assigned by each web classification scheme. We also compare classifications on a halo-by-halo basis; for example, we find that 9 of 12 methods classify around a third of group-mass haloes (i.e. M_halo ∼ 10^13.5 h^-1 M⊙) as being in filaments. Lastly, so that any future cosmic-web classification scheme can be compared to the 12 methods used here, we have made all the data used in this paper public.

  12. Graph Structure in Three National Academic Webs: Power Laws with Anomalies.

    Science.gov (United States)

    Thelwall, Mike; Wilkinson, David

    2003-01-01

    Explains how the Web can be modeled as a mathematical graph and analyzes the graph structures of three national university publicly indexable Web sites from Australia, New Zealand, and the United Kingdom. Topics include commercial search engines and academic Web link research; method-analysis environment and data sets; and power laws. (LRW)

  13. Gestión documental y de contenidos web: informe de situación

    OpenAIRE

    Saorín, Tomás; Pástor-Sánchez, Juan-Antonio

    2012-01-01

    Review of major 2011 developments in the fields of bibliographic cataloguing, document management and documentary languages. The use of content management systems for web publishing, the emergence of new approaches such as web experience management, and the integration of web productivity components are analyzed.

  14. Improving the web site's effectiveness by considering each page's temporal information

    NARCIS (Netherlands)

    Li, ZG; Sun, MT; Dunham, MH; Xiao, YQ; Dong, G; Tang, C; Wang, W

    2003-01-01

    Improving the effectiveness of a web site is always one of its owner's top concerns. By focusing on analyzing web users' visiting behavior, web mining researchers have developed a variety of helpful methods, based upon association rules, clustering, prediction and so on. However, we have found

  15. Development of Spatial Database for Regional Development in Romania

    Directory of Open Access Journals (Sweden)

    Anda BELCIU (VELICANU)

    2014-01-01

    Full Text Available Geographical Information Systems are used to solve many regional-development-related problems around the world. From national programs to well-known international ones, such as the INSPIRE program, each such initiative also uses geospatial data in the process of building regional development strategies. This paper presents the main technical components of a geographical information system, namely the spatial database, the web mapping server and the APIs used to embed the maps into web applications. The development steps for a pre-alpha version of a web GIS application dedicated to regional development in Romania are also shown. The software tools integrated to develop the online application were Oracle Spatial, where the geospatial data was stored; GeoServer, an open source web mapping server used to generate the map from the data in Oracle Spatial's tables; and ASP.NET as the web framework for building the website.
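
    To give a concrete flavour of the GeoServer component mentioned above, the sketch below issues a standard OGC WMS GetMap request from Python. The server URL, layer name and bounding box are placeholders, not the actual Romanian regional-development service.

        # Sketch: fetch a map image from a WMS endpoint such as one published by
        # GeoServer.  URL, layer name and bounding box are illustrative only.
        import requests

        WMS_URL = "http://example.org/geoserver/wms"     # hypothetical endpoint

        params = {
            "service": "WMS",
            "version": "1.1.1",
            "request": "GetMap",
            "layers": "regions:development_indicators",  # hypothetical layer
            "bbox": "20.2,43.6,29.7,48.3",               # roughly Romania (lon/lat)
            "srs": "EPSG:4326",
            "width": 800,
            "height": 400,
            "format": "image/png",
        }

        resp = requests.get(WMS_URL, params=params, timeout=30)
        resp.raise_for_status()
        with open("regions.png", "wb") as fh:
            fh.write(resp.content)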

  16. Non-visual Web Browsing: Beyond Web Accessibility.

    Science.gov (United States)

    Ramakrishnan, I V; Ashok, Vikas; Billah, Syed Masum

    2017-07-01

    People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant contents. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability.

  17. WebVis: a hierarchical web homepage visualizer

    Science.gov (United States)

    Renteria, Jose C.; Lodha, Suresh K.

    2000-02-01

    WebVis, the Hierarchical Web Home Page Visualizer, is a tool for managing home web pages. The user can access this tool via the WWW and obtain a hierarchical visualization of one's home web pages. WebVis is a real-time interactive tool that supports many different queries on the statistics of internal files, such as size, age, and type. In addition, statistics on embedded information such as VRML files, Java applets, images and sound files can be extracted and queried. Results of these queries are visualized using the color, shape and size of different nodes of the hierarchy. The visualization assists the user in a variety of tasks, such as quickly finding outdated information or locating large files. WebVis is one solution to the growing web space maintenance problem. Implementation of WebVis is realized with Perl and Java. Perl pattern matching and file handling routines are used to collect and process web space linkage information and web document information. Java utilizes the collected information to produce a visualization of the web space. Java also provides WebVis with real-time interactivity while running off the WWW. Some WebVis examples of home web page visualization are presented.
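
    The collection step described above (file sizes, ages and types gathered before visualisation) can be approximated in a few lines. The sketch below uses Python rather than WebVis's Perl routines and assumes a local copy of the web space in a hypothetical public_html directory.

        # Sketch: gather per-file statistics (size, age, type) for a web space,
        # roughly the information WebVis collects before visualising it.
        import time
        from pathlib import Path

        def collect_stats(root):
            now = time.time()
            stats = []
            for path in Path(root).rglob("*"):
                if path.is_file():
                    st = path.stat()
                    stats.append({
                        "path": str(path.relative_to(root)),
                        "size_bytes": st.st_size,
                        "age_days": (now - st.st_mtime) / 86400.0,
                        "type": path.suffix.lower() or "(none)",
                    })
            return stats

        if __name__ == "__main__":
            largest = sorted(collect_stats("public_html"),
                             key=lambda r: -r["size_bytes"])[:10]
            for rec in largest:
                print(f'{rec["size_bytes"]:>10} B  {rec["age_days"]:6.1f} d  {rec["path"]}')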

  18. Two Algorithms for Web Applications Assessment

    Directory of Open Access Journals (Sweden)

    Stavros Ioannis Valsamidis

    2011-09-01

    Full Text Available The usage of web applications can be measured with the use of metrics. In an LMS, a typical web application, there are no appropriate metrics that would facilitate its qualitative and quantitative measurement. The purpose of this paper is to propose the use of existing techniques in a different way, in order to analyze the log file of a typical LMS and deduce useful conclusions. Three metrics for course usage measurement are used. The paper also describes two algorithms for course classification and for suggesting actions. The metrics and the algorithms were applied to Open eClass LMS tracking data of an academic institution. The results from 39 courses presented interesting insights. Although the case study concerns an LMS, the approach can also be applied to other web applications such as e-government, e-commerce, e-banking, blogs, etc.
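
    The abstract does not spell out the three metrics, so the following sketch is only an illustration of the general idea: parsing an access log and computing simple per-course usage figures (hits and distinct users). The log line format and field positions are assumptions, not the Open eClass format.

        # Sketch: per-course usage metrics from a simplified access log.
        # Assumed (hypothetical) log line format: "timestamp user_id course_id url"
        from collections import defaultdict

        def course_metrics(log_lines):
            hits = defaultdict(int)
            users = defaultdict(set)
            for line in log_lines:
                parts = line.split()
                if len(parts) < 4:
                    continue                    # skip malformed lines
                _, user_id, course_id, _url = parts[:4]
                hits[course_id] += 1
                users[course_id].add(user_id)
            return {c: {"hits": hits[c], "unique_users": len(users[c])} for c in hits}

        sample = [
            "2011-03-01T10:02 u17 MATH101 /modules/document/",
            "2011-03-01T10:05 u17 MATH101 /modules/exercise/",
            "2011-03-01T11:40 u23 PHYS202 /modules/document/",
        ]
        print(course_metrics(sample))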

  19. Web Extensible Display Manager

    Energy Technology Data Exchange (ETDEWEB)

    Slominski, Ryan [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Larrieu, Theodore L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2018-02-01

    Jefferson Lab's Web Extensible Display Manager (WEDM) allows staff to access EDM control system screens from a web browser in remote offices and from mobile devices. Native browser technologies are leveraged to avoid installing and managing software on remote clients, such as browser plugins, tunnel applications, or an EDM environment. Since standard network ports are used, firewall exceptions are minimized. To avoid security concerns from remote users modifying a control system, WEDM exposes read-only access, and basic web authentication can be used to further restrict access. Updates of monitored EPICS channels are delivered via a WebSocket using a web gateway. The software translates EDM description files (denoted with the edl suffix) to HTML with Scalable Vector Graphics (SVG), following EDM's edl file vector drawing rules to create faithful screen renderings. The WEDM server parses edl files and creates the HTML equivalent in real time, allowing existing screens to work without modification. Alternatively, the familiar drag-and-drop EDM screen creation tool can be used to create optimized screens sized specifically for smart phones and then rendered by WEDM.
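
    As a rough sketch of the translation step described above (edl widgets rendered as SVG), the snippet below turns a minimal, invented widget description into an SVG rectangle with a text label. Real edl files are considerably richer and the field names used here are not EDM's.

        # Sketch: render a simplified, hypothetical edl-like widget as SVG.
        # Field names are illustrative; real edl files use EDM's own syntax.
        widget = {"x": 10, "y": 20, "w": 120, "h": 30,
                  "label": "Beam Current", "fill": "#d0e0ff"}

        svg = (
            '<svg xmlns="http://www.w3.org/2000/svg" width="200" height="80">\n'
            f'  <rect x="{widget["x"]}" y="{widget["y"]}" width="{widget["w"]}" '
            f'height="{widget["h"]}" fill="{widget["fill"]}" stroke="black"/>\n'
            f'  <text x="{widget["x"] + 5}" y="{widget["y"] + 20}">{widget["label"]}</text>\n'
            '</svg>'
        )
        print(svg)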

  20. Two-wavelength spatial-heterodyne holography

    Science.gov (United States)

    Hanson, Gregory R.; Bingham, Philip R.; Simpson, John T.; Karnowski, Thomas P.; Voelkl, Edgar

    2007-12-25

    Systems and methods are described for obtaining two-wavelength differential-phase holograms. A method includes determining a difference between a filtered, analyzed, recorded first spatially-heterodyned hologram phase and a filtered, analyzed, recorded second spatially-heterodyned hologram phase.
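
    In two-wavelength interferometry generally, the differential phase of two single-wavelength phase maps corresponds to a measurement at a longer synthetic wavelength, Λ = λ1·λ2/|λ1 − λ2|. The numpy sketch below computes that difference for two made-up phase maps; it is a generic illustration, not the patented processing chain.

        # Sketch: two-wavelength differential phase and the synthetic wavelength.
        # Generic two-wavelength interferometry relations, not the patented method.
        import numpy as np

        lam1, lam2 = 632.8e-9, 532.0e-9             # metres (example wavelengths)
        synthetic = lam1 * lam2 / abs(lam1 - lam2)  # effective "beat" wavelength

        rng = np.random.default_rng(0)
        phase1 = rng.uniform(-np.pi, np.pi, (256, 256))   # placeholder phase maps
        phase2 = rng.uniform(-np.pi, np.pi, (256, 256))

        # Wrap the difference back into (-pi, pi].
        diff = np.angle(np.exp(1j * (phase1 - phase2)))

        print(f"synthetic wavelength: {synthetic * 1e6:.2f} um")
        print(f"differential phase range: [{diff.min():.2f}, {diff.max():.2f}] rad")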

  1. Web Accessibility in Romania: The Conformance of Municipal Web Sites to Web Content Accessibility Guidelines

    OpenAIRE

    Costin PRIBEANU; Ruxandra-Dora MARINESCU; Paul FOGARASSY-NESZLY; Maria GHEORGHE-MOISII

    2012-01-01

    The accessibility of public administration web sites is a key quality attribute for the successful implementation of the Information Society. The purpose of this paper is to present a second review of municipal web sites in Romania that is based on automated accessibility checking. A number of 60 web sites were evaluated against WCAG 2.0 recommendations. The analysis of results reveals a relatively low web accessibility of municipal web sites and highlights several aspects. Firstly, a slight ...

  2. 07051 Executive Summary -- Programming Paradigms for the Web: Web Programming and Web Services

    OpenAIRE

    Hull, Richard; Thiemann, Peter; Wadler, Philip

    2007-01-01

    The world-wide web raises a variety of new programming challenges. To name a few: programming at the level of the web browser, data-centric approaches, and attempts to automatically discover and compose web services. This seminar brought together researchers from the web programming and web services communities and strove to engage them in communication with each other. The seminar was held in an unusual style, in a mixture of short presentations and in-depth discussio...

  3. Spatial filtering and thermocouple spatial filter

    International Nuclear Information System (INIS)

    Han Bing; Tong Yunxian

    1989-12-01

    The design and study of a thermocouple spatial filter have been conducted for flow measurement of integrated reactor coolant. The fundamental principle of spatial filtering, mathematical descriptions, and analyses of the thermocouple spatial filter are given.

  4. TIGERweb, 2017, Series Information for the TIGERweb, Web Mapping Service and REST files

    Data.gov (United States)

    US Census Bureau, Department of Commerce — TIGERweb allows the viewing of TIGER spatial data online and for TIGER data to be streamed to your mapping application. TIGERweb consists of a web mapping service...

  5. Correct software in web applications and web services

    CERN Document Server

    Thalheim, Bernhard; Prinz, Andreas; Buchberger, Bruno

    2015-01-01

    The papers in this volume aim at obtaining a common understanding of the challenging research questions in web applications comprising web information systems, web services, and web interoperability; obtaining a common understanding of verification needs in web applications; achieving a common understanding of the available rigorous approaches to system development, and the cases in which they have succeeded; identifying how rigorous software engineering methods can be exploited to develop suitable web applications; and at developing a European-scale research agenda combining theory, methods a

  6. Distributed Web Service Repository

    Directory of Open Access Journals (Sweden)

    Piotr Nawrocki

    2015-01-01

    Full Text Available The increasing availability and popularity of computer systems has resulted in a demand for new, language- and platform-independent ways of data exchange. That demand has in turn led to a significant growth in the importance of systems based on Web services. Alongside the growing number of systems accessible via Web services came the need for specialized data repositories that could offer effective means of searching of available services. The development of mobile systems and wireless data transmission technologies has allowed the use of distributed devices and computer systems on a greater scale. The accelerating growth of distributed systems might be a good reason to consider the development of distributed Web service repositories with built-in mechanisms for data migration and synchronization.

  7. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost Multi Channel Analyzer. The performance of the developed system had a severe problem: the resulting spectrum lacked smoothness, being very noisy and full of spikes and surges, and was therefore unusable for analyzing the cement substance. This paper describes the work carried out to improve the system performance.
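
    As a generic illustration of the spike and noise problem mentioned above (and not the digital MCA algorithm the authors developed), a noisy spectrum can be cleaned by suppressing isolated spikes with a median filter and then smoothing, for example:

        # Sketch: de-spiking and smoothing a noisy 1-D spectrum.
        # Generic signal-processing illustration, not the paper's MCA algorithm.
        import numpy as np
        from scipy.signal import medfilt, savgol_filter

        channels = np.arange(1024)
        spectrum = 500.0 * np.exp(-(channels - 400) ** 2 / (2 * 15 ** 2))  # synthetic peak
        spectrum += np.random.default_rng(1).poisson(20, channels.size)    # counting noise
        spectrum[100] += 4000                                              # isolated spike

        despiked = medfilt(spectrum, kernel_size=5)                # suppress narrow spikes
        smoothed = savgol_filter(despiked, window_length=21, polyorder=3)

        print(f"peak channel after smoothing: {int(np.argmax(smoothed))}")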

  8. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  9. Zinc in an ultraoligotrophic lake food web.

    Science.gov (United States)

    Montañez, Juan Cruz; Arribére, María A; Rizzo, Andrea; Arcagni, Marina; Campbell, Linda; Ribeiro Guevara, Sergio

    2018-03-21

    Zinc (Zn) bioaccumulation and trophic transfer were analyzed in the food web of Lake Nahuel Huapi, a deep, unpolluted ultraoligotrophic system in North Patagonia. Benthic macroinvertebrates, plankton, and native and introduced fish were collected at three sites. The effect of pyroclastic inputs on Zn levels in lacustrine food webs was assessed by studying the impact of the eruption of the Puyehue-Cordón Caulle volcanic complex (PCCVC) in 2011, with three sampling campaigns performed immediately before and after the PCCVC eruption, and after 2 years of recovery of the ecosystem. Zinc trophodynamics in the L. Nahuel Huapi food web was assessed using nitrogen stable isotopes (δ15N). There was no significant increase of Zn concentrations ([Zn]) in L. Nahuel Huapi biota after the PCCVC eruption, despite evidence of a [Zn] increase in lake water that could be associated with volcanic ash leaching. The organisms studied exhibited [Zn] above the threshold level considered for dietary deficiency, regulating Zn adequately even under a catastrophic situation like the 2011 PCCVC eruption. Zinc concentrations exhibited a biodilution pattern in the lake's food web. To the best of our knowledge, the present research is the first report of Zn biodilution in lacustrine systems, and the first to study Zn transfer in a freshwater food web including both pelagic and benthic compartments.
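
    Biodilution is usually diagnosed as a negative slope of log-transformed metal concentration against trophic position derived from δ15N. A minimal sketch of that calculation is shown below; the values and the commonly assumed 3.4 per mil enrichment per trophic level are illustrative, not data from the study.

        # Sketch: test for biodilution as a negative slope of log10[Zn] versus
        # trophic level estimated from d15N.  Values and the 3.4 per mil
        # enrichment factor are illustrative assumptions, not data from the study.
        import numpy as np
        from scipy.stats import linregress

        d15n_baseline = 4.0                               # assumed baseline (per mil)
        d15n = np.array([4.0, 7.5, 10.8, 13.9])           # plankton ... top fish
        zn_ug_g = np.array([210.0, 130.0, 95.0, 70.0])    # Zn, ug/g dry weight

        trophic_level = 2.0 + (d15n - d15n_baseline) / 3.4
        fit = linregress(trophic_level, np.log10(zn_ug_g))

        print(f"slope = {fit.slope:.3f} (a negative slope suggests biodilution)")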

  10. Open-Source web-based geographical information system for health exposure assessment

    Directory of Open Access Journals (Sweden)

    Evans Barry

    2012-01-01

    Full Text Available This paper presents the design and development of an open source web-based Geographical Information System allowing users to visualise, customise and interact with spatial data within their web browser. The developed application shows that by using solely Open Source software it was possible to develop a customisable web based GIS application that provides functions necessary to convey health and environmental data to experts and non-experts alike without the requirement of proprietary software.

  11. Open-Source web-based geographical information system for health exposure assessment

    DEFF Research Database (Denmark)

    Evans, Barry; Sabel, Clive E

    2012-01-01

    This paper presents the design and development of an open source web-based Geographical Information System allowing users to visualise, customise and interact with spatial data within their web browser. The developed application shows that by using solely Open Source software it was possible to develop a customisable web based GIS application that provides functions necessary to convey health and environmental data to experts and non-experts alike without the requirement of proprietary software.

  12. The Role of Intermediation in the Governance of Sustainable Chinese Web Marketing

    OpenAIRE

    Choi, Yongrok; Gao, Di

    2014-01-01

    This paper identifies the factors necessary for the sustainable performance of two Chinese web marketing companies. The companies are Alibaba and its twin, Taobao. This research is based on the structural equation model (SEM). The paper analyzes the core governance factors of Chinese trust (Guanxi) from outperforming web marketing mix strategies to determine if Guanxi can be applied to other web community marketing strategies. The empirical tests, in general, show the web marketing mix is imp...

  13. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts... are abstracted to (timed) automata and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that include otherwise difficult to analyze fault

  14. MEAN STACK WEB DEVELOPMENT

    OpenAIRE

    Le Thanh, Nghi

    2017-01-01

    The aim of the thesis is to provide a universal website using JavaScript as the main programming language. It also shows the basic parts anyone needs to create a web application. The thesis creates a simple CMS using the MEAN stack. MEAN is a collection of JavaScript-based technologies used to develop web applications. It is an acronym for MongoDB, Express, AngularJS and Node.js. It also allows non-technical users to easily update and manage a website’s content. But the application also lets o...

  15. Caught in the Web

    Energy Technology Data Exchange (ETDEWEB)

    Gillies, James

    1995-06-15

    The World-Wide Web may have taken the Internet by storm, but many people would be surprised to learn that it owes its existence to CERN. Around half the world's particle physicists come to CERN for their experiments, and the Web is the result of their need to share information quickly and easily on a global scale. Six years after Tim Berners-Lee's inspired idea to marry hypertext to the Internet in 1989, CERN is handing over future Web development to the World-Wide Web Consortium, run by the French National Institute for Research in Computer Science and Control, INRIA, and the Laboratory for Computer Science of the Massachusetts Institute of Technology, MIT, leaving itself free to concentrate on physics. The Laboratory marked this transition with a conference designed to give a taste of what the Web can do, whilst firmly stamping it with the label "Made in CERN". Over 200 European journalists and educationalists came to CERN on 8-9 March for the World-Wide Web Days, resulting in wide media coverage. The conference was opened by UK Science Minister David Hunt, who stressed the importance of fundamental research in generating new ideas. "Who could have guessed 10 years ago", he said, "that particle physics research would lead to a communication system which would allow every school to have the biggest library in the world in a single computer?". In his introduction, the Minister also pointed out that "CERN and other basic research laboratories help to break new technological ground and sow the seeds of what will become mainstream manufacturing in the future." Learning the jargon is often the hardest part of coming to grips with any new invention, so CERN put it at the top of the agenda. Jacques Altaber, who helped introduce the Internet to CERN in the early 1980s, explained that without the Internet, the Web couldn't exist. The Internet began as a US Defense Department research project in the 1970s and has grown into a global network-of-networks linking some

  16. Renaissance of the Web

    Science.gov (United States)

    McCarty, M.

    2009-09-01

    The renaissance of the web has driven the development of many new technologies that have forever changed the way we write software. The resulting tools have been applied both to solve problems and to create new ones in a wide range of domains, ranging from monitor-and-control user interfaces to information distribution. This discussion covers which of these technologies are being used in the astronomical computing community, and how. Topics include JavaScript, Cascading Style Sheets, HTML, XML, JSON, RSS, iCalendar, Java, PHP, Python, Ruby on Rails, database technologies, and web frameworks/design patterns.

  17. Sustainable web ecosystem design

    CERN Document Server

    O'Toole, Greg

    2013-01-01

    This book is about the process of creating web-based systems (i.e., websites, content, etc.) that consider each of the parts, the modules, the organisms - binary or otherwise - that make up a balanced, sustainable web ecosystem. In the current media-rich environment, a website is more than a collection of relative html documents of text and images on a static desktop computer monitor. There is now an unlimited combination of screens, devices, platforms, browsers, locations, versions, users, and exabytes of data with which to interact. Written in a highly approachable, practical style, this boo

  18. Developing Large Web Applications

    CERN Document Server

    Loudon, Kyle

    2010-01-01

    How do you create a mission-critical site that provides exceptional performance while remaining flexible, adaptable, and reliable 24/7? Written by the manager of a UI group at Yahoo!, Developing Large Web Applications offers practical steps for building rock-solid applications that remain effective even as you add features, functions, and users. You'll learn how to develop large web applications with the extreme precision required for other types of software. Avoid common coding and maintenance headaches as small websites add more pages, more code, and more programmers. Get comprehensive soluti

  19. Web resources for myrmecologists

    DEFF Research Database (Denmark)

    Nash, David Richard

    2005-01-01

    The world wide web provides many resources that are useful to the myrmecologist. Here I provide a brief introduction to the types of information currently available, and to recent developments in data provision over the internet which are likely to become important resources for myrmecologists... in the near future. I discuss the following types of web site, and give some of the most useful examples of each: taxonomy, identification and distribution; conservation; myrmecological literature; individual species sites; news and discussion; picture galleries; personal pages; portals....

  20. Caught in the Web

    International Nuclear Information System (INIS)

    Gillies, James

    1995-01-01

    The World-Wide Web may have taken the Internet by storm, but many people would be surprised to learn that it owes its existence to CERN. Around half the world's particle physicists come to CERN for their experiments, and the Web is the result of their need to share information quickly and easily on a global scale. Six years after Tim Berners-Lee's inspired idea to marry hypertext to the Internet in 1989, CERN is handing over future Web development to the World-Wide Web Consortium, run by the French National Institute for Research in Computer Science and Control, INRIA, and the Laboratory for Computer Science of the Massachusetts Institute of Technology, MIT, leaving itself free to concentrate on physics. The Laboratory marked this transition with a conference designed to give a taste of what the Web can do, whilst firmly stamping it with the label "Made in CERN". Over 200 European journalists and educationalists came to CERN on 8-9 March for the World-Wide Web Days, resulting in wide media coverage. The conference was opened by UK Science Minister David Hunt, who stressed the importance of fundamental research in generating new ideas. "Who could have guessed 10 years ago", he said, "that particle physics research would lead to a communication system which would allow every school to have the biggest library in the world in a single computer?". In his introduction, the Minister also pointed out that "CERN and other basic research laboratories help to break new technological ground and sow the seeds of what will become mainstream manufacturing in the future." Learning the jargon is often the hardest part of coming to grips with any new invention, so CERN put it at the top of the agenda. Jacques Altaber, who helped introduce the Internet to CERN in the early 1980s, explained that without the Internet, the Web couldn't exist. The Internet began as a US Defense

  1. Sensor Webs to Constellations

    Science.gov (United States)

    Cole, M.

    2017-12-01

    Advanced technology plays a key role in enabling future Earth-observing missions needed for global monitoring and climate research. Rapid progress over the past decade, and that anticipated for the coming decades, has diminished the size of some satellites while increasing the amount of data and the required pace of integration and analysis. Sensor web developments provide correlations to constellations of smallsats. Reviewing current advances in sensor webs and requirements for constellations will improve planning, operations, and data management for future architectures of multiple satellites with a common mission goal.

  2. Learning Objects Web

    DEFF Research Database (Denmark)

    Blåbjerg, Niels Jørgen

    2005-01-01

    Learning Objects Web is a DEFF project initiated by Aalborg Universitetsbibliotek. The project builds on the results and experience gained from our earlier project, Streaming Webbased Information Modules (SWIM). We have an international network of stakeholders who give us sparring and feedback on the development concept, both with respect to the theoretical framework and to the practical application of our teaching concept. With this backing and input, we have pursued the goal of further developing SWIM in the new project, Learning Objects Web. Publication date: June...

  3. The Spatial Politics of Spatial Representation

    DEFF Research Database (Denmark)

    Olesen, Kristian; Richardson, Tim

    2011-01-01

    This paper explores the interplay between the spatial politics of new governance landscapes and innovations in the use of spatial representations in planning. The central premise is that planning experiments with new relational approaches become enmeshed in spatial politics. The case of strategic spatial planning in Denmark reveals how fuzzy spatial representations and relational spatial concepts are being used to depoliticise strategic spatial planning processes and to camouflage spatial politics. The paper concludes that, while relational geography might play an important role in building...

  4. Food-web dynamics in a large river discontinuum

    Science.gov (United States)

    Cross, Wyatt F.; Baxter, Colden V.; Rosi-Marshall, Emma J.; Hall, Robert O.; Kennedy, Theodore A.; Donner, Kevin C.; Kelly, Holly A. Wellard; Seegert, Sarah E.Z.; Behn, Kathrine E.; Yard, Michael D.

    2013-01-01

    Nearly all ecosystems have been altered by human activities, and most communities are now composed of interacting species that have not co-evolved. These changes may modify species interactions, energy and material flows, and food-web stability. Although structural changes to ecosystems have been widely reported, few studies have linked such changes to dynamic food-web attributes and patterns of energy flow. Moreover, there have been few tests of food-web stability theory in highly disturbed and intensely managed freshwater ecosystems. Such synthetic approaches are needed for predicting the future trajectory of ecosystems, including how they may respond to natural or anthropogenic perturbations. We constructed flow food webs at six locations along a 386-km segment of the Colorado River in Grand Canyon (Arizona, USA) for three years. We characterized food-web structure and production, trophic basis of production, energy efficiencies, and interaction-strength distributions across a spatial gradient of perturbation (i.e., distance from Glen Canyon Dam), as well as before and after an experimental flood. We found strong longitudinal patterns in food-web characteristics that strongly correlated with the spatial position of large tributaries. Above tributaries, food webs were dominated by nonnative New Zealand mudsnails (62% of production) and nonnative rainbow trout (100% of fish production). The simple structure of these food webs led to few dominant energy pathways (diatoms to few invertebrate taxa to rainbow trout) and large energy inefficiencies (i.e., ...). Below large tributaries, invertebrate production declined ∼18-fold, while fish production remained similar to upstream sites and comprised predominantly native taxa (80–100% of production). Sites below large tributaries had increasingly reticulate and detritus-based food webs with a higher prevalence of omnivory, as well as interaction strength distributions more typical of theoretically stable food webs (i

  5. IMPORTANCE OF TEMPERATURE IN MODELLING PCB BIOACCUMULATION IN THE LAKE MICHIGAN FOOD WEB

    Science.gov (United States)

    In most food web models, the exposure temperature of a food web is typically defined using a single spatial compartment. This essentially assumes that the predator and prey are exposed to the same temperature. However, in a large water body such as Lake Michigan, due to the spati...

  6. Web-based control application using WebSocket

    International Nuclear Information System (INIS)

    Furukawa, Y.

    2012-01-01

    The WebSocket allows asynchronous full-duplex communication between a Web-based (i.e. JavaScript-based) application and a Web server. WebSocket started as a part of the HTML5 standardization but has since been separated from HTML5 and is being developed independently. Using WebSocket, it becomes easy to develop platform-independent presentation-layer applications for accelerator and beamline control software. In addition, a Web browser is the only application program that needs to be installed on the client computer. WebSocket-based applications communicate with the WebSocket server using simple text-based messages, so WebSocket is applicable to message-based control systems like MADOCA, which was developed for the SPring-8 control system. A simple WebSocket server for the MADOCA control system and a simple motor control application were successfully made as a first trial of a WebSocket control application. Using Google Chrome (version 13.0) on Debian/Linux and Windows 7, Opera (version 11.0) on Debian/Linux, and Safari (version 5.0.3) on Mac OS X as clients, the motors could be controlled using a WebSocket-based Web application. A diffractometer control application for use in synchrotron radiation diffraction experiments was also developed. (author)
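
    A minimal text-message exchange of the kind described above can be sketched with the Python websockets package; the endpoint URL and command syntax here are placeholders, not the MADOCA message format.

        # Sketch: send a text command to a WebSocket server and print the reply.
        # URL and message syntax are hypothetical, not the MADOCA protocol.
        import asyncio
        import websockets

        async def move_motor():
            async with websockets.connect("ws://example.org:8080/control") as ws:
                await ws.send("put/motor1/position/12.5")   # illustrative command
                reply = await ws.recv()
                print("server replied:", reply)

        asyncio.run(move_motor())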

  7. A new measurement of workload in Web application reliability assessment

    Directory of Open Access Journals (Sweden)

    CUI Xia

    2015-02-01

    Full Text Available Web applications have become popular in various fields of social life, and it is increasingly important to study their reliability. In this paper, the definition of Web application failure is first introduced, followed by the definition of Web application reliability. By analyzing data in IIS server logs and selecting the corresponding usage and information-delivery failure data, the paper studies the feasibility of Web application reliability assessment from the perspective of the Web software system, based on IIS server logs. Because the usage of a Web site often has a certain regularity, a new measurement of workload in Web application reliability assessment is proposed. In this method, the unit is removed by a weighted-average technique, and the weights are assessed by setting an objective function and optimizing it. Finally, an experiment was carried out for validation. The experimental results show that the assessment of Web application reliability based on the new workload is better.
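
    The abstract gives only the outline of the weighting scheme, so the sketch below is merely one reading of it: each workload component is first made dimensionless (here by dividing by its mean) and the components are then combined with weights summing to one. How the weights themselves are optimised is not reproduced; the components and weights shown are placeholders.

        # Sketch: a dimensionless, weighted workload index per day from IIS-style
        # counts.  Components and weights are illustrative; the paper derives its
        # weights by optimisation, which is not reproduced here.
        import numpy as np

        reqs      = np.array([12000, 15500, 9800, 20100], dtype=float)   # requests/day
        sessions  = np.array([  900,  1100,  700,  1500], dtype=float)   # sessions/day
        bytes_out = np.array([2.1e9, 2.8e9, 1.6e9, 3.5e9])               # bytes/day

        components = np.vstack([reqs, sessions, bytes_out])
        dimensionless = components / components.mean(axis=1, keepdims=True)

        weights = np.array([0.5, 0.3, 0.2])        # placeholder weights, sum to 1
        workload = weights @ dimensionless

        print("workload index per day:", np.round(workload, 3))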

  8. Web Based Seismological Monitoring (wbsm)

    Science.gov (United States)

    Giudicepietro, F.; Meglio, V.; Romano, S. P.; de Cesare, W.; Ventre, G.; Martini, M.

    Over the last few decades, seismological monitoring systems have improved dramatically thanks to technological advancements and to the scientific progress of seismological studies. The most modern processing systems use network technologies to achieve high-quality performance in data transmission and remote control. Their architecture is designed to favor real-time signal analysis. This is usually realized by adopting a modular structure that allows any new calculation algorithm to be integrated easily, without affecting the other system functionalities. A further step in the evolution of seismic processing systems is the widespread use of web-based applications. Web technologies can usefully support monitoring activities, allowing the results of signal processing to be published automatically and favoring remote access to data, software systems and instrumentation. An application of web technologies to seismological monitoring has been developed at the "Osservatorio Vesuviano" monitoring center (INGV) in collaboration with the "Dipartimento di Informatica e Sistemistica" of the University of Naples. A system named Web Based Seismological Monitoring (WBSM) has been developed. Its main objective is to automatically publish the results of seismic event processing and to allow displaying, analyzing and downloading seismic data via the Internet. WBSM uses XML technology for the representation of hypocentral and picking parameters and creates a seismic events database containing parametric data and waveforms. In order to provide tools for evaluating the quality and reliability of the published locations, WBSM also supplies all the quality parameters calculated by the locating program and allows interactive display of the waveforms and related parameters. WBSM is a modular system in which the interface function to the data sources is performed by two specific modules, so as to make it work in conjunction with a

  9. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator controlled logical windows. It is implemented in CAMAC standard. A single crate contains all required modules and is controlled by the PDP-11/10 minicomputer. Configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  10. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally analyzing data happens via batch-processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: A cloud-based approach. It aims to make it a productive and interactive environment through the combination of FCC and SWAN software.

  11. Microsoft Expression Web for dummies

    CERN Document Server

    Hefferman, Linda

    2013-01-01

    Expression Web is Microsoft's newest tool for creating and maintaining dynamic Web sites. This FrontPage replacement offers all the simple "what-you-see-is-what-you-get" tools for creating a Web site along with some pumped-up new features for working with Cascading Style Sheets and other design options. Microsoft Expression Web For Dummies arrives in time for early adopters to get a feel for how to build an attractive Web site. Author Linda Hefferman teams up with longtime FrontPage For Dummies author Asha Dornfest to show the easy way for first-time Web designers, FrontPage ve

  12. Learning from WebQuests

    Science.gov (United States)

    Gaskill, Martonia; McNulty, Anastasia; Brooks, David W.

    2006-04-01

    WebQuests are activities in which students use Web resources to learn about school topics. WebQuests are advocated as constructivist activities and ones generally well regarded by students. Two experiments were conducted in school settings to compare learning using WebQuests versus conventional instruction. Students and teachers both enjoyed WebQuest instruction and spoke highly of it. In one experiment, however, conventional instruction led to significantly greater student learning. In the other, there were no significant differences in the learning outcomes between conventional versus WebQuest-based instruction.

  13. EasyFRAP-web: a web-based tool for the analysis of fluorescence recovery after photobleaching data.

    Science.gov (United States)

    Koulouras, Grigorios; Panagopoulos, Andreas; Rapsomaniki, Maria A; Giakoumakis, Nickolaos N; Taraviras, Stavros; Lygerou, Zoi

    2018-06-13

    Understanding protein dynamics is crucial in order to elucidate protein function and interactions. Advances in modern microscopy facilitate the exploration of the mobility of fluorescently tagged proteins within living cells. Fluorescence recovery after photobleaching (FRAP) is an increasingly popular functional live-cell imaging technique which enables the study of the dynamic properties of proteins at a single-cell level. As an increasing number of labs generate FRAP datasets, there is a need for fast, interactive and user-friendly applications that analyze the resulting data. Here we present easyFRAP-web, a web application that simplifies the qualitative and quantitative analysis of FRAP datasets. EasyFRAP-web permits quick analysis of FRAP datasets through an intuitive web interface with interconnected analysis steps (experimental data assessment, different types of normalization and estimation of curve-derived quantitative parameters). In addition, easyFRAP-web provides dynamic and interactive data visualization and data and figure export for further analysis after every step. We test easyFRAP-web by analyzing FRAP datasets capturing the mobility of the cell cycle regulator Cdt2 in the presence and absence of DNA damage in cultured cells. We show that easyFRAP-web yields results consistent with previous studies and highlights cell-to-cell heterogeneity in the estimated kinetic parameters. EasyFRAP-web is platform-independent and is freely accessible at: https://easyfrap.vmnet.upatras.gr/.

  14. Uncertainty visualisation in the Model Web

    Science.gov (United States)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information is included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools exist for visualising such data in a standardised way. Furthermore, they are usually realised as thick clients and lack the functionality to handle data coming from web services, as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan, zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations&Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool

  15. Web design and destruction

    International Nuclear Information System (INIS)

    Graham, J.

    1998-01-01

    This paper notes the ineffectualness of organizational World Wide Web sites that are generally supportive of nuclear science and technology, compared with those whose mission is to oppose nuclear matters and which do so by providing misinformation to the public. Specific comparisons of pro and con sites are made, and recommendations are made for improving the communication effectiveness of proponent sites. (author)

  16. Also on Web!

    Indian Academy of Sciences (India)

    Also on Web! Resonance - journal of science education. 'Resonance' is a journal of science education, published monthly since January 1996 by the Indian Academy of Sciences, Bangalore, India. It is primarily directed to students and teachers at the undergraduate level, though some material beyond this range is also included.

  17. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.; Sarfraz, M.

    2004-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  18. Clojure web development essentials

    CERN Document Server

    Baldwin, Ryan

    2015-01-01

    This book is for anyone who's worked with Clojure and wants to use it to start developing applications for the Web. Experience or familiarity with basic Clojure syntax is a must, and exposure to Leiningen (or other similar build tools such as Maven) would be helpful.

  19. The value (driven) web

    NARCIS (Netherlands)

    Baken, N.H.G.; Wiegel, V.; Van Oortmerssen, G.

    2010-01-01

    This paper presents a vision on the importance of values and ethical aspects in web science. We create(d) the Internet, but now the Internet (technology) is shaping our world increasingly: the way we experience, interact, transact, conduct business et cetera. The Internet is ubiquitous and vital to

  20. Web of Deceit.

    Science.gov (United States)

    Minkel, Walter

    2002-01-01

    Discusses the increase in online plagiarism and what school librarians can do to help. Topics include the need for school district policies on plagiarism; teaching students what plagiarism is; pertinent Web sites; teaching students proper research skills; motivation for cheating; and requiring traditional sources of information for student…

  1. Distributed Deep Web Search

    NARCIS (Netherlands)

    Tjin-Kam-Jet, Kien

    2013-01-01

    The World Wide Web contains billions of documents (and counting); hence, it is likely that some document will contain the answer or content you are searching for. While major search engines like Bing and Google often manage to return relevant results to your query, there are plenty of situations in

  2. Web corpus construction

    CERN Document Server

    Schafer, Roland

    2013-01-01

    The World Wide Web constitutes the largest existing source of texts written in a great variety of languages. A feasible and sound way of exploiting this data for linguistic research is to compile a static corpus for a given language. There are several advantages of this approach: (i) Working with such corpora obviates the problems encountered when using Internet search engines in quantitative linguistic research (such as non-transparent ranking algorithms). (ii) Creating a corpus from web data is virtually free. (iii) The size of corpora compiled from the WWW may exceed by several orders of magnitude the size of language resources offered elsewhere. (iv) The data is locally available to the user, and it can be linguistically post-processed and queried with the tools preferred by her/him. This book addresses the main practical tasks in the creation of web corpora up to giga-token size. Among these tasks are the sampling process (i.e., web crawling) and the usual cleanups including boilerplate removal and rem...

  3. Criminal Justice Web Sites.

    Science.gov (United States)

    Dodge, Timothy

    1998-01-01

    Evaluates 15 criminal justice Web sites that have been selected according to the following criteria: authority, currency, purpose, objectivity, and potential usefulness to researchers. The sites provide narrative and statistical information concerning crime, law enforcement, the judicial system, and corrections. Searching techniques are also…

  4. Progressive Web applications

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Progressive Web Applications are native-like applications running inside of a browser context. In my presentation I would like to describe their characteristics, benchmarks and building process, using a quick and simple case-study example with a focus on the Service Workers API.

  5. Towards the Semantic Web

    NARCIS (Netherlands)

    Davies, John; Fensel, Dieter; Harmelen, Frank Van

    2003-01-01

    With the current changes driven by the expansion of the World Wide Web, this book uses a different approach from other books on the market: it applies ontologies to electronically available information to improve the quality of knowledge management in large and distributed organizations. Ontologies

  6. REopt Lite Web Tool

    Energy Technology Data Exchange (ETDEWEB)

    2018-04-03

    NREL developed a free, publicly available web version of the REopt (TM) renewable energy integration and optimization platform called REopt Lite. REopt Lite recommends the optimal size and dispatch strategy for grid-connected photovoltaics (PV) and battery storage at a site. It also allows users to explore how PV and storage can increase a site's resiliency during a grid outage.

  7. Web Development Simplified

    Science.gov (United States)

    Becker, Bernd W.

    2010-01-01

    The author has discussed the Multimedia Educational Resource for Teaching and Online Learning site, MERLOT, in a recent Electronic Roundup column. In this article, he discusses an entirely new Web page development tool that MERLOT has added for its members. The new tool is called the MERLOT Content Builder and is directly integrated into the…

  8. A Web of Controversies

    DEFF Research Database (Denmark)

    Baron, Christian

    2011-01-01

    consequences for the treatment of apparently independent epistemic problems that are subject of investigation in other thought collectives. For the practicing scientist it is necessary to take this complex web of interactions into account in order to be able to navigate in such a situation. So far most studies...

  9. A Web Policy Primer.

    Science.gov (United States)

    Levine, Elliott

    2001-01-01

    Sound technology policies can spell the difference between an effective website and an online nightmare. An effective web development policy addresses six key areas: roles and responsibilities, content/educational value, privacy and safety, adherence to copyright laws, technical standards, and use of commercial sites and services. (MLH)

  10. Spider Web Pattern

    Science.gov (United States)

    2006-01-01

    A delicate pattern, like that of a spider web, appears on top of the Mars residual polar cap, after the seasonal carbon-dioxide ice slab has disappeared. Next spring, these will likely mark the sites of vents when the carbon-dioxide ice cap returns. This Mars Global Surveyor, Mars Orbiter Camera image is about 3 kilometers (2 miles) wide.

  11. Underwater Web Work

    Science.gov (United States)

    Wighting, Mervyn J.; Lucking, Robert A.; Christmann, Edwin P.

    2004-01-01

    Teachers search for ways to enhance oceanography units in the classroom. There are many online resources available to help one explore the mysteries of the deep. This article describes a collection of Web sites on this topic appropriate for middle level classrooms.

  12. The Future of Web Maps in Next Generation Textbooks

    Science.gov (United States)

    DiBiase, D.; Prasad, S.

    2014-12-01

    The reformation of the "Object Formerly Known as Textbook" (a phrase coined by the Chronicle of Higher Education) toward a digital future is underway. Emerging nextgen texts look less like electronic books ("ebooks") and more like online courseware. In addition to text and illustrations, nextgen textbooks for STEM subjects are likely to combine quizzes, grade management tools, support for social learning, and interactive media including web maps. Web maps are interactive, multi-scale, online maps that enable teachers and learners to explore, interrogate, and mash up the wide variety of map layers available in the cloud. This presentation will show how web maps coupled with interactive quizzes enable students' purposeful explorations and interpretations of spatial patterns related to humankind's interactions with the earth. Attendees will also learn about Esri's offer to donate ArcGIS Online web mapping subscriptions to every U.S. school as part of President Obama's ConnectED initiative.

  13. Designing Effective Web Forms for Older Web Users

    Science.gov (United States)

    Li, Hui; Rau, Pei-Luen Patrick; Fujimura, Kaori; Gao, Qin; Wang, Lin

    2012-01-01

    This research aims to provide insight for web form design for older users. The effects of task complexity and information structure of web forms on older users' performance were examined. Forty-eight older participants with abundant computer and web experience were recruited. The results showed significant differences in task time and error rate…

  14. WebCom: A Model for Understanding Web Site Communication

    DEFF Research Database (Denmark)

    Godsk, Mikkel; Petersen, Anja Bechmann

    2008-01-01

    of the approaches' strengths. Furthermore, it is discussed and shortly demonstrated how WebCom can be used for analytical and design purposes with YouTube as an example. The chapter concludes that WebCom is able to serve as a theoretically-based model for understanding complex Web site communication situations...

  15. Web API Fragility : How Robust is Your Web API Client

    NARCIS (Netherlands)

    Espinha, T.; Zaidman, A.; Gross, H.G.

    2014-01-01

    Web APIs provide a systematic and extensible approach for application-to-application interaction. A large number of mobile applications make use of web APIs to integrate services into apps. Each Web API's evolution pace is determined by its respective developer and mobile application developers

  16. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important for us to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe the recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, the number of tests per unit time, and analytical time and speed per test.

  17. Maintaining Web Cache Coherency

    Directory of Open Access Journals (Sweden)

    2000-01-01

    Document coherency is a challenging problem for Web caching. Once documents are cached throughout the Internet, it is often difficult to keep them coherent with the origin document without generating new traffic that could increase load on the international backbone and overload popular servers. Several solutions have been proposed to solve this problem, among which two categories have been widely discussed: strong document coherency and weak document coherency. The cost and efficiency of the two categories remain a controversial issue: in some studies strong coherency is far too expensive to be used in the Web context, while in other studies it can be maintained at a low cost. The accuracy of these analyses depends largely on how the document updating process is approximated. In this study, we compare some of the coherence methods proposed for Web caching. Among other points, we study the side effects of these methods on Internet traffic. The ultimate goal is to study cache behavior under several conditions, covering some of the factors that play an important role in Web cache performance evaluation and quantifying their impact on simulation accuracy. The results presented in this study show differences in the outcome of the simulation of a Web cache depending on the workload being used and the probability distribution used to approximate updates on the cached documents. Each experiment shows two case studies that outline the impact of the considered parameter on the performance of the cache.
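
    To make the weak/strong distinction concrete, here is a minimal TypeScript sketch of a client-side cache that either serves a possibly stale copy within a TTL (weak coherency) or revalidates with the origin via a conditional request (strong coherency). The TTL value and the use of ETag/If-None-Match headers are illustrative assumptions, not the mechanisms studied in the paper.

      // One cached entry per URL; weak mode trusts the copy for TTL_MS, strong mode revalidates.
      interface Entry { body: string; etag: string | null; fetchedAt: number; }
      const cache = new Map<string, Entry>();
      const TTL_MS = 60_000; // assumed freshness window for weak coherency

      async function getDocument(url: string, strong = false): Promise<string> {
        const hit = cache.get(url);
        if (hit && !strong && Date.now() - hit.fetchedAt < TTL_MS) {
          return hit.body;                                   // weak: possibly stale, no network
        }
        const headers: Record<string, string> = {};
        if (hit?.etag) headers['If-None-Match'] = hit.etag;  // strong: conditional revalidation
        const res = await fetch(url, { headers });
        if (res.status === 304 && hit) {                     // origin confirms the copy is fresh
          hit.fetchedAt = Date.now();
          return hit.body;
        }
        const body = await res.text();
        cache.set(url, { body, etag: res.headers.get('ETag'), fetchedAt: Date.now() });
        return body;
      }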

  18. Web-based interventions for menopause: A systematic integrated literature review.

    Science.gov (United States)

    Im, Eun-Ok; Lee, Yaelim; Chee, Eunice; Chee, Wonshik

    2017-01-01

    Advances in computer and Internet technologies have allowed health care providers to develop, use, and test various types of Web-based interventions for their practice and research. Indeed, an increasing number of Web-based interventions have recently been developed and tested in health care fields. Despite the great potential for Web-based interventions to improve practice and research, little is known about the current status of Web-based interventions, especially those related to menopause. To identify the current status of Web-based interventions used in the field of menopause, a literature review was conducted using multiple databases, with the keywords "online," "Internet," "Web," "intervention," and "menopause." Using these keywords, a total of 18 eligible articles were analyzed to identify the current status of Web-based interventions for menopause. Six themes reflecting the current status of Web-based interventions for menopause were identified: (a) there existed few Web-based intervention studies on menopause; (b) Web-based decision support systems were mainly used; (c) there was a lack of detail on the interventions; (d) there was a lack of guidance on the use of Web-based interventions; (e) counselling was frequently combined with Web-based interventions; and (f) the pros and cons were similar to those of Web-based methods in general. Based on these findings, directions for future Web-based interventions for menopause are provided. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. Code AI Personal Web Pages

    Science.gov (United States)

    Garcia, Joseph A.; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The document consists of a publicly available web site (george.arc.nasa.gov) for Joseph A. Garcia's personal web pages in the AI division. Only general information will be posted and no technical material. All the information is unclassified.

  20. Analyzing the User Behavior toward Electronic Commerce Stimuli

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-del-Amo, María-del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as a utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. Therefore, the objective of this research is to analyze the impact of these web technologies, which constitute the web atmosphere or webmosphere of a website, on shopping behavior (i.e., users' internal states: affective, cognitive, and satisfaction; and behavioral responses: approach responses and real shopping outcomes) within a computer-generated online retail store, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As its main conclusion, this study suggests that the positive responses of online consumers may increase when they are allowed to navigate the online store freely and their experience is enriched by animated GIFs and background music. The effects of the mediator variables somewhat modify the final shopping behavior. PMID:27965549

  1. DEMorphy, German Language Morphological Analyzer

    OpenAIRE

    Altinok, Duygu

    2018-01-01

    DEMorphy is a morphological analyzer for German. It is built on large, compactified lexicons from the German Morphological Dictionary. A guesser based on German declension suffixes is also provided. For German, we provide a state-of-the-art morphological analyzer. DEMorphy is implemented in Python with an emphasis on ease of use and accompanying documentation. The package is suitable for both academic and commercial purposes with a permissive licence.

  2. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on its syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input

  3. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  4. Landscape variation influences trophic cascades in dengue vector food webs.

    Science.gov (United States)

    Weterings, Robbie; Umponstira, Chanin; Buckley, Hannah L

    2018-02-01

    The epidemiology of vector-borne diseases is governed by a structured array of correlative and causative factors, including landscape (for example, rural versus urban), abiotic (for example, weather), and biotic (for example, food web) factors. Studies of mosquito-borne diseases rarely address these multiple factors at large spatial scales, which limits insights into how human alterations of landscapes and food webs alter mosquito abundance. We used structural equation modeling to identify the relative magnitude and direction of landscape, abiotic, and food web factors on Aedes larvae and adults across 70 sites in northern Thailand. Food web factors were modeled as mosquito-predator trophic cascades. Landscape context affected mosquito-predator communities in aquatic and terrestrial environments via cascading food web interactions. Several mosquito predators within these food webs showed potential as biocontrol agents in mosquito population control, but their potentials for control were landscape-dependent. In terrestrial food webs, the habitat-sensitive tokay gecko structured mosquito-predator communities, indicating that a conservation approach to vector control could be a useful addition to existing control efforts.

  5. Benchmark of Client and Server-Side Catchment Delineation Approaches on Web-Based Systems

    Science.gov (United States)

    Demir, I.; Sermet, M. Y.; Sit, M. A.

    2016-12-01

    Recent advances in internet and cyberinfrastructure technologies have provided the capability to acquire large-scale spatial data from various gauges and sensor networks. The collection of environmental data has increased demand for applications that are capable of managing and processing large-scale, high-resolution data sets. With the amount and resolution of the data sets provided, one of the challenging tasks in organizing and customizing hydrological data sets is delineation of watersheds on demand. Watershed delineation is a process for creating a boundary that represents the contributing area for a specific control point or water outlet, with the intent of characterizing and analyzing portions of a study area. Although many GIS tools and software packages for watershed analysis are available on desktop systems, there is a need for web-based and client-side techniques that create a dynamic and interactive environment for exploring hydrological data. In this project, we demonstrated several watershed delineation techniques on the web, implemented on the client side using JavaScript and WebGL and on the server side using Python and C++. We also developed a client-side GPGPU (General Purpose Graphical Processing Unit) algorithm to analyze high-resolution terrain data for watershed delineation, which allows parallelization on the GPU. Web-based real-time analysis of watershed segmentation can be helpful for decision-makers and interested stakeholders while eliminating the need to install complex software packages and deal with large-scale data sets. Utilization of client-side hardware resources also reduces the need for servers due to the approach's crowdsourcing nature. Our goal for future work is to improve other hydrologic analysis methods, such as rain flow tracking, by adapting the presented approaches.
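
    As a minimal illustration of the delineation step (not the paper's WebGL/GPGPU implementation), the TypeScript sketch below collects every grid cell that drains to a chosen outlet, given a precomputed flow-direction grid in which each cell stores the index of its downstream neighbour; this grid encoding is an assumption made for the example.

      // Collect all cells draining to `outlet`, given a D8-style flow-direction grid
      // where downstream[i] is the index of cell i's downstream neighbour (-1 = sink).
      function delineateWatershed(downstream: Int32Array, outlet: number): Set<number> {
        // Invert the flow directions: which cells flow directly into each cell?
        const upstream = new Map<number, number[]>();
        downstream.forEach((d, i) => {
          if (d >= 0) {
            const ups = upstream.get(d) ?? [];
            ups.push(i);
            upstream.set(d, ups);
          }
        });
        // Breadth-first search upstream from the outlet cell.
        const basin = new Set<number>([outlet]);
        const queue: number[] = [outlet];
        while (queue.length > 0) {
          const cell = queue.shift()!;
          for (const up of upstream.get(cell) ?? []) {
            if (!basin.has(up)) { basin.add(up); queue.push(up); }
          }
        }
        return basin;
      }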

  6. Standardized acquisition, storing and provision of 3D enabled spatial data

    Science.gov (United States)

    Wagner, B.; Maier, S.; Peinsipp-Byma, E.

    2017-05-01

    When working with spatial data, in addition to classic two-dimensional geometric data (maps, aerial images, etc.), the need for three-dimensional spatial data (city models, digital elevation models, etc.) is increasing. Due to this increased demand, the acquisition, storage and provision of 3D-enabled spatial data in Geographic Information Systems (GIS) is more and more important. Existing proprietary solutions quickly reach their limits during data exchange and data delivery to other systems. They generate a large workload, which is very costly. However, it is noticeable that these expenses and costs can generally be significantly reduced by using standards. The aim of this research is therefore to develop a concept, in the field of three-dimensional spatial data, that builds on existing standards whenever possible. In this research, military image analysts are the preferred user group of the system. To achieve the objective of the widest possible use of standards for 3D spatial data, existing standards, proprietary interfaces and standards under discussion have been analyzed. Since the GIS of the Fraunhofer IOSB used here already supports OGC (Open Geospatial Consortium) and NATO STANAG (NATO Standardization Agreement) standards for the most part, particular attention was paid to these standards. The most promising standard is the OGC standard 3DPS (3D Portrayal Service) with its occurrences W3DS (Web 3D Service) and WVS (Web View Service). A demo system was created using a standardized workflow from data acquisition and storage to provision, showing the benefit of our approach.

  7. Safety and efficacy of aneurysm treatment with WEB

    DEFF Research Database (Denmark)

    Pierot, Laurent; Costalat, Vincent; Moret, Jacques

    2016-01-01

    OBJECT WEB is an innovative intrasaccular treatment for intracranial aneurysms. Preliminary series have shown good safety and efficacy. The WEB Clinical Assessment of Intrasaccular Aneurysm Therapy (WEBCAST) trial is a prospective European trial evaluating the safety and efficacy of WEB in wide-neck bifurcation aneurysms. METHODS Patients with wide-neck bifurcation aneurysms for which WEB treatment was indicated were included in this multicenter good clinical practice study. Clinical data including adverse events and clinical status at 1 and 6 months were collected and independently analyzed by a medical... RESULTS Ten European neurointerventional centers enrolled 51 patients with 51 aneurysms. Treatment with WEB was achieved in 48 of 51 aneurysms (94.1%). Adjunctive implants (coils/stents) were used in 4 of 48 aneurysms (8.3%). Thromboembolic events were observed in 9 of 51 patients (17.6%), resulting...

  8. Tobacco-prevention messages online: social marketing via the Web.

    Science.gov (United States)

    Lin, Carolyn A; Hullman, Gwen A

    2005-01-01

    Antitobacco groups have joined millions of other commercial or noncommercial entities in developing a presence on the Web. These groups primarily represent the following different sponsorship categories: grassroots, medical, government, and corporate. To obtain a better understanding of the strengths and weaknesses in the message design of antitobacco Web sites, this project analyzed 100 antitobacco Web sites ranging across these four sponsorship categories. The results show that the tobacco industry sites posted just enough antismoking information to appease the antismoking publics. Medical organizations designed their Web sites as specialty sites and offered mostly scientific information. While the government sites resembled a clearinghouse for antitobacco related information, the grassroots sites represented the true advocacy outlets. In general, the industry sites provided the weakest persuasive messages and medical sites fared only slightly better. Government and grassroots sites rated most highly in presenting their antitobacco campaign messages on the Web.

  9. RESTful web services with Dropwizard

    CERN Document Server

    Dallas, Alexandros

    2014-01-01

    A hands-on focused step-by-step tutorial to help you create Web Service applications using Dropwizard. If you are a software engineer or a web developer and want to learn more about building your own Web Service application, then this is the book for you. Basic knowledge of Java and RESTful Web Service concepts is assumed and familiarity with SQL/MySQL and command-line scripting would be helpful.

  10. Exploring the academic invisible web

    OpenAIRE

    Lewandowski, Dirk; Mayr, Philipp

    2006-01-01

    Purpose: To provide a critical review of Bergman’s 2001 study on the Deep Web. In addition, we bring a new concept into the discussion, the Academic Invisible Web (AIW). We define the Academic Invisible Web as consisting of all databases and collections relevant to academia but not searchable by the general-purpose internet search engines. Indexing this part of the Invisible Web is central to scientific search engines. We provide an overview of approaches followed thus far. Design/methodol...

  11. Anonymous Web Browsing and Hosting

    OpenAIRE

    MANOJ KUMAR; ANUJ RANI

    2013-01-01

    In today's high-tech environment, every organization and individual computer user relies on the internet to access web data. To maintain high confidentiality and security of that data, secure web solutions are required. In this paper we describe dedicated anonymous web browsing solutions which make browsing faster and more secure. Web applications, which play an important role in transferring secret information such as email, need more and more attention to security. This paper also describes ho...

  12. Design and implementation of distributed spatial computing node based on WPS

    International Nuclear Information System (INIS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-01-01

    Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in grid environments, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in a grid environment, this paper systematically researches the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification by OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype of the Spatial Computing Node is implemented and the relevant verification work in this environment is completed.
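
    For orientation, the TypeScript sketch below issues the standard WPS 1.0.0 key-value-pair requests a client would use to discover and inspect the processes offered by such a node. The endpoint URL and the process identifier are hypothetical placeholders, not taken from the paper.

      // Hypothetical WPS endpoint of a Spatial Computing Node.
      const WPS = 'https://example.org/wps';

      // List the processes the node offers (returns a capabilities XML document).
      async function getCapabilities(): Promise<string> {
        const url = `${WPS}?service=WPS&version=1.0.0&request=GetCapabilities`;
        return (await fetch(url)).text();
      }

      // Describe the inputs and outputs of one process (identifier is a placeholder).
      async function describeProcess(id: string): Promise<string> {
        const url = `${WPS}?service=WPS&version=1.0.0&request=DescribeProcess` +
          `&identifier=${encodeURIComponent(id)}`;
        return (await fetch(url)).text();
      }

      getCapabilities().then(() => describeProcess('buffer'));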

  13. Designing a WebQuest

    Science.gov (United States)

    Salsovic, Annette R.

    2009-01-01

    A WebQuest is an inquiry-based lesson plan that uses the Internet. This article explains what a WebQuest is, shows how to create one, and provides an example. When engaged in a WebQuest, students use technology to experience cooperative learning and discovery learning while honing their research, writing, and presentation skills. It has been found…

  14. Sensor system for web inspection

    Science.gov (United States)

    Sleefe, Gerard E.; Rudnick, Thomas J.; Novak, James L.

    2002-01-01

    A system for electrically measuring variations over a flexible web has a capacitive sensor including spaced electrically conductive, transmit and receive electrodes mounted on a flexible substrate. The sensor is held against a flexible web with sufficient force to deflect the path of the web, which moves relative to the sensor.

  15. The Semantic Web in Education

    Science.gov (United States)

    Ohler, Jason

    2008-01-01

    The semantic web or Web 3.0 makes information more meaningful to people by making it more understandable to machines. In this article, the author examines the implications of Web 3.0 for education. The author considers three areas of impact: knowledge construction, personal learning network maintenance, and personal educational administration.…

  16. The Evolution of Web Searching.

    Science.gov (United States)

    Green, David

    2000-01-01

    Explores the interrelation between Web publishing and information retrieval technologies and lists new approaches to Web indexing and searching. Highlights include Web directories; search engines; portalisation; Internet service providers; browser providers; meta search engines; popularity based analysis; natural language searching; links-based…

  17. XML and Better Web Searching.

    Science.gov (United States)

    Jackson, Joe; Gilstrap, Donald L.

    1999-01-01

    Addresses the implications of the new Web metalanguage XML for searching on the World Wide Web and considers the future of XML on the Web. Compared to HTML, XML is more concerned with structure of data than documents, and these data structures should prove conducive to precise, context rich searching. (Author/LRW)

  18. Information Diversity in Web Search

    Science.gov (United States)

    Liu, Jiahui

    2009-01-01

    The web is a rich and diverse information source with incredible amounts of information about all kinds of subjects in various forms. This information source affords great opportunity to build systems that support users in their work and everyday lives. To help users explore information on the web, web search systems should find information that…

  19. WebSelF: A Web Scraping Framework

    DEFF Research Database (Denmark)

    Thomsen, Jakob; Ernst, Erik; Brabrand, Claus

    2012-01-01

    We present WebSelF, a framework for web scraping which models the process of web scraping and decomposes it into four conceptually independent, reusable, and composable constituents. We have validated our framework through a full parameterized implementation that is flexible enough to capture previous work on web scraping. We have experimentally evaluated our framework and implementation in an experiment that evaluated several qualitatively different web scraping constituents (including previous work and combinations hereof) on about 11,000 HTML pages on daily versions of 17 web sites over a period of more than one year. Our framework solves three concrete problems with current web scraping, and our experimental results indicate that composition of previous and our new techniques achieves a higher degree of accuracy, precision and specificity than existing techniques alone.
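
    The abstract does not show WebSelF's own API, so the following is only a generic TypeScript sketch of the core scraping constituent it models: fetch a page and extract content matching a selector, using standard fetch and DOM APIs. The URL and selector are illustrative.

      // Generic single-page scraping step: fetch HTML and extract text by CSS selector.
      // This is NOT WebSelF's API; names and URL are illustrative placeholders.
      async function scrape(url: string, selector: string): Promise<string[]> {
        const html = await (await fetch(url)).text();
        const doc = new DOMParser().parseFromString(html, 'text/html');
        return Array.from(doc.querySelectorAll(selector))
          .map((el) => el.textContent?.trim() ?? '');
      }

      // Example: headline texts from a hypothetical news page.
      scrape('https://example.org/news', 'h2.headline').then(console.log);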

  20. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis, containing hands-on problem sets that can be worked out in MS Excel or ArcGIS, as well as detailed illustrations and numerous case studies. The book enables readers to: identify types of and characterize non-spatial and spatial data; demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results; construct testable hypotheses that require inferential statistical analysis; process spatial data, extract explanatory variables, conduct statisti...