WorldWideScience

Sample records for series web surveys web

  1. Is the web and TV series SKAM a series about shame? [Er web- og tv-serien SKAM en serie om skam?]

    DEFF Research Database (Denmark)

    Christensen, Christa Lykke

    2017-01-01

    In this article, I discuss whether the Norwegian teen web series drama SKAM (NRK, 2015-2017) is really about shame and to what extent the fictional characters of the series feel ashamed. The theoretical framework is based on a social psychological conceptualization of shame, supplemented by a micro...

  2. Should indications for WEB aneurysm treatment be enlarged? Report of a series of 20 patients with aneurysms in "atypical" locations for WEB treatment.

    Science.gov (United States)

    Pierot, L; Biondi, A; Narata, A-P; Mihalea, C; Januel, A-C; Metaxas, G; Bibi, R; Caroff, J; Soize, S; Cognard, C; Spelle, L; Herbreteau, D

    2017-06-01

    Flow disruption with the WEB device is an innovative technique for the endovascular treatment of wide-neck bifurcation aneurysms. Good clinical practice trials have shown this treatment to be highly safe, with good efficacy. Technical developments (single-layer devices and smaller microcatheters) facilitate the treatment, potentially leading to an enlargement of indications. This series collects aneurysms in "atypical" locations for WEB treatment and analyzes the safety and efficacy of this treatment. In each participating center, patients with aneurysms treated with WEB were prospectively included in a local database. Patients treated for aneurysms in "atypical" locations were extracted. Patient and aneurysm characteristics, intraoperative complications, and anatomical results at the end of the procedure and at last follow-up were collected and analyzed. Five French neurointerventional centers included 20 patients with 20 aneurysms in "atypical" locations treated with WEB. Aneurysm locations were ICA carotid-ophthalmic in 9 aneurysms (45.0%), ICA posterior communicating in 4 (20.0%), pericallosal artery in 5 (25.0%), and basilar artery between P1 and the superior cerebellar artery in 2 (10.0%). There were no complications (thromboembolic or intraoperative rupture) in this series. At follow-up (mean: 7.4 months), adequate occlusion was obtained in 100.0% of aneurysms. This series confirms that it is possible to enlarge the indications for WEB treatment to "atypical" locations with good safety and efficacy. These data have to be confirmed in large prospective series. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  3. Between butch/femme: On the performance of race, gender, and sexuality in a YouTube web series.

    Science.gov (United States)

    Day, Faithe

    2017-11-27

    Drawing on a legacy of Black television and film production, Black web series remediate earlier media forms in order to usher in a twenty-first-century revival of indie Black cultural production. Specifically, video sharing and social media platforms operate as a sphere in which content creators and users are afforded unique opportunities to engage with video content and each other on a variety of levels. Focusing on the YouTube media sphere, one can also observe the myriad ways in which the performance of race, gender, and sexuality influences the types of discourse that circulate within these sites. In watching and analyzing Black queer web series on YouTube, I examine how the performance of gender and sexuality by Black queer women, within and outside of web series, is policed and protected by both community insiders and outsiders. Utilizing an ethnographic framework, which includes a critical discourse analysis of the YouTube comments for the series Between Women as well as a textual analysis of series content, this project draws conclusions about the role that the politics of pleasure, performance, and the public sphere play in the recognition and/or refusal of queer sexuality within Black communities.

  4. Advanced data extraction infrastructure: Web based system for management of time series data

    Energy Technology Data Exchange (ETDEWEB)

    Chilingaryan, S; Beglarian, A [Forschungszentrum Karlsruhe, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Kopmann, A; Voecking, S, E-mail: Suren.Chilingaryan@kit.ed [University of Muenster, Institut fuer Kernphysik, Wilhelm-Klemm-Strasse 9, 48149 Münster (Germany)

    2010-04-01

    During the operation of high energy physics experiments, a large amount of slow control data is recorded. It is necessary to examine all collected data to check the integrity and validity of measurements. With the growing maturity of AJAX technologies, it has become possible to construct sophisticated interfaces using web technologies only. Our solution for handling time series, generally slow control data, has a modular architecture: a backend system for data analysis and preparation, a web service interface for data access, and a fast AJAX web display. In order to provide fast interactive access, the time series are aggregated over time slices of a few predefined lengths. The aggregated values are stored in a temporary caching database and are then used to create generalizing data plots. These plots may include an indication of data quality and are generated within a few hundred milliseconds, even if very high data rates are involved. The extensible export subsystem provides data in multiple formats including CSV, Excel, ROOT, and TDMS. The search engine can be used to find periods of time where the indications of selected sensors fall into specified ranges. Utilization of the caching database allows most such lookups to be performed within a second. Based on this functionality, a web interface facilitating fast (Google Maps style) navigation through the data has been implemented. The solution is currently used by several slow control systems at the Test Facility for Fusion Magnets (TOSKA) and the Karlsruhe Tritium Neutrino experiment (KATRIN).
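
    The slice-wise aggregation and caching scheme described above can be illustrated with a minimal sketch (the slice lengths, sample data, and cache layout here are illustrative assumptions, not the system's actual backend):

    ```python
    from statistics import mean

    def aggregate(samples, slice_seconds):
        """Reduce (timestamp, value) samples into per-slice (min, mean, max)
        tuples keyed by slice start time -- the form a caching database can
        serve quickly when plotting at a given zoom level."""
        slices = {}
        for t, v in samples:
            key = int(t // slice_seconds) * slice_seconds
            slices.setdefault(key, []).append(v)
        return {s: (min(vs), mean(vs), max(vs)) for s, vs in sorted(slices.items())}

    # One aggregate per predefined slice length (e.g. minute and hour), computed
    # once and cached so interactive plots never rescan the raw samples.
    samples = [(0, 1.0), (30, 1.4), (90, 0.9), (4000, 2.0)]
    for length in (60, 3600):
        print(length, aggregate(samples, length))
    ```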

  5. Advanced data extraction infrastructure: Web based system for management of time series data

    International Nuclear Information System (INIS)

    Chilingaryan, S; Beglarian, A; Kopmann, A; Voecking, S

    2010-01-01

    During the operation of high energy physics experiments, a large amount of slow control data is recorded. It is necessary to examine all collected data to check the integrity and validity of measurements. With the growing maturity of AJAX technologies, it has become possible to construct sophisticated interfaces using web technologies only. Our solution for handling time series, generally slow control data, has a modular architecture: a backend system for data analysis and preparation, a web service interface for data access, and a fast AJAX web display. In order to provide fast interactive access, the time series are aggregated over time slices of a few predefined lengths. The aggregated values are stored in a temporary caching database and are then used to create generalizing data plots. These plots may include an indication of data quality and are generated within a few hundred milliseconds, even if very high data rates are involved. The extensible export subsystem provides data in multiple formats including CSV, Excel, ROOT, and TDMS. The search engine can be used to find periods of time where the indications of selected sensors fall into specified ranges. Utilization of the caching database allows most such lookups to be performed within a second. Based on this functionality, a web interface facilitating fast (Google Maps style) navigation through the data has been implemented. The solution is currently used by several slow control systems at the Test Facility for Fusion Magnets (TOSKA) and the Karlsruhe Tritium Neutrino experiment (KATRIN).

  6. The Earth Observation Monitor - Automated monitoring and alerting for spatial time-series data based on OGC web services

    Science.gov (United States)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2014-12-01

    Spatial time series data have been freely available around the globe from earth observation satellites and meteorological stations for many years. They provide useful and important information for detecting ongoing changes of the environment, but for end users it is often too complex to extract this information from the original time series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project providing simple access, analysis, and monitoring tools for global spatial time series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Archive Center (LP DAAC) and Google Earth Engine, as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can use either the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTIFF). Beyond data access tools, users can also run further time series analyses such as trend calculation, breakpoint detection, or the derivation of phenological parameters from vegetation time series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline usage. Automated monitoring and alerting of the time series data integrated by the user is provided by an OGC Sensor Observation Service coupled with an OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., precipitation value higher than x millimeters per day, occurrence of a MODIS fire point, detection of a time series anomaly). Datasets integrated in the SOS service are
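
    The filter-expression monitoring described above amounts to evaluating simple threshold rules against incoming time series; a minimal sketch follows (the expression format and data are assumptions for illustration, not EOM's actual SOS/WNS wiring):

    ```python
    import operator

    OPS = {">": operator.gt, "<": operator.lt, ">=": operator.ge, "<=": operator.le}

    def check_alerts(series, expression):
        """Return the timestamps at which a {timestamp: value} series
        triggers a filter expression such as ('precip_mm', '>', 50.0)."""
        _, op, threshold = expression
        return [t for t, v in series.items() if OPS[op](v, threshold)]

    # Toy daily precipitation series; an alert fires for the one wet day.
    daily_precip = {"2014-07-01": 12.0, "2014-07-02": 63.5, "2014-07-03": 8.1}
    print(check_alerts(daily_precip, ("precip_mm", ">", 50.0)))  # ['2014-07-02']
    ```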

  7. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    Science.gov (United States)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of a physical process, i.e., time series data. These data lend themselves to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of the repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g., interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
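
    The pre-processing steps the abstract names, interpolation and aggregation, look roughly like the following pandas sketch (the column name, values, and frequencies are hypothetical, not the Toolbox's actual template):

    ```python
    import pandas as pd

    # Stand-in for a dataset uploaded in the "standard template":
    # a datetime index plus one observed value column.
    df = pd.DataFrame(
        {"stage_ft": [3.1, 3.4, 2.9, 3.8]},
        index=pd.to_datetime(["2016-01-02", "2016-01-03",
                              "2016-01-07", "2016-01-08"]),
    )

    daily = df["stage_ft"].resample("D").mean()  # regularize to a daily index
    filled = daily.interpolate(limit=3)          # interpolate short gaps only
    monthly_max = filled.resample("MS").max()    # aggregate for trend analysis
    print(monthly_max)
    ```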

  8. Webseries, Original Series and Digital Series: The Forms of Serial Fiction on the Web

    Directory of Open Access Journals (Sweden)

    Mirko Lino

    2016-06-01

    Full Text Available The aim of this essay is to provide, from a historical-formal perspective, an analysis of the morphology of seriality on the web, from its origins to the most recent linguistic and formal developments. Its subject is the web series: a type of web-native fiction that, while showing several formal correspondences with the TV series model, at the same time differs from it considerably. Starting from the appropriation of the models and forms of television seriality, the web series today sits at the center of the transformations affecting how stories are communicated, told, and consumed in the era of convergence (Jenkins, 2007), of a post-medial condition (Eugeni, 2015), and of a culture that increasingly rests on the production of hybrid forms (Arcagni, 2016). It is therefore not simply a model for serial storytelling, but a format that hybridizes entertainment venues, blurring the difference between offline and online series ever further, as can be seen in the serial products of players such as Amazon and Netflix, and it is becoming one of the formats for experimenting with the languages and narrative forms of the latest generation of immersive media. The essay traces the new forms of web seriality in relation to the communicative development of social networks and the technological development of wearable media such as Virtual Reality headsets, in order to outline possible directions for understanding the near horizon of digital storytelling. To do so, it proposes a terminological and conceptual substitution: adopting the term digital series in place of the now traditional and insufficient web series, or original series.

  9. Interactive Web-based Visualization of Atomic Position-time Series Data

    Science.gov (United States)

    Thapa, S.; Karki, B. B.

    2017-12-01

    Extracting and interpreting the information contained in large sets of time-varying three-dimensional positional data for the constituent atoms of a simulated material is a challenging task. We have recently implemented a web-based visualization system to analyze position-time series data extracted from local or remote hosts. It involves a pre-processing step for data reduction, which skips uninteresting parts of the data uniformly (at the full atomic configuration level) or non-uniformly (at the atomic species level or individual atom level). An atomic configuration snapshot is rendered using the ball-stick representation and can be animated by rendering successive configurations. The entire atomic dynamics can be captured as trajectories by rendering the atomic positions at all time steps together as points. The trajectories can be manipulated at both the species and atomic levels so that we can focus on one or more trajectories of interest, and they can also be superimposed with the instantaneous atomic structure. The implementation was done using WebGL and Three.js for graphical rendering, HTML5 and JavaScript for the GUI, and Elasticsearch and JSON for data storage and retrieval within the Grails Framework. We have applied our visualization system to simulation datasets for proton-bearing forsterite (Mg2SiO4) - an abundant mineral of Earth's upper mantle. Visualization reveals that protons (hydrogen ions) incorporated as interstitials are much more mobile than protons substituting at the host Mg and Si cation sites. The proton diffusion appears to be anisotropic, with high mobility along the x-direction, showing limited discrete jumps in the other two directions.

  10. Semantic Web Services Challenge, Results from the First Year. Series: Semantic Web And Beyond, Volume 8.

    Science.gov (United States)

    Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.

    Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces a software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of its potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings is necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open source initiative that provides a public evaluation and certification of multiple frameworks on common, industrially relevant problem sets. This edited volume reports on the first results in developing a common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for its potential practical use. The book is also suitable for advanced-level students in computer science.

  11. Does self-directed and web-based support for parents enhance the effects of viewing a reality television series based on the Triple P-Positive Parenting Programme?

    Science.gov (United States)

    Sanders, Matthew; Calam, Rachel; Durand, Marianne; Liversidge, Tom; Carmont, Sue Ann

    2008-09-01

    This study investigated whether providing self-directed and web-based support for parents enhanced the effects of viewing a reality television series based on the Triple P-Positive Parenting Programme. Parents with a child aged 2 to 9 (N = 454) were randomly assigned to either a standard or an enhanced intervention condition. In the standard television-alone viewing condition, parents watched the six-episode weekly television series 'Driving Mum and Dad Mad'. Parents in the enhanced viewing condition received a self-help workbook and extra web support involving downloadable parenting tip sheets, audio and video streaming of positive parenting messages, and email support, in addition to viewing the television series. Parents in both conditions reported significant improvements in their child's disruptive behaviour and improvements in dysfunctional parenting practices. Effects were greater for the enhanced condition on the ECBI, on two of the three parenting indicators, and in overall programme satisfaction. However, no significant differences were seen on other measures, including parent affect indicators. The level of improvement was related to the number of episodes watched, with the greatest changes occurring in families who watched every episode. Improvements achieved at post-intervention by parents in both groups were maintained at six-month follow-up. Online tip sheets were frequently accessed; uptake of web-based resources was highest early in the series. The value of combining self-help approaches, technology and media as part of a comprehensive public health approach to providing parenting support is discussed.

  12. Deconstructing Dexter: Microanalysis and reinterpretation of a television series on the web [Deconstruyendo a Dexter. Microanálisis y reinterpretación de una serie televisiva en la web]

    Directory of Open Access Journals (Sweden)

    Rafael Marfil Carmona

    2011-04-01

    Full Text Available This article summarizes a methodological approach for analyzing audiovisual productions, providing a concrete example of the possibilities the web offers for completing that process through creative expression. An integrative methodology is proposed and applied, in the case presented, to a microanalysis exercise on the opening sequence of the television series Dexter (M. Cuesta, T. Goldwyn and R. Lieberman; Showtime television, 2006), finally showing an interesting example of reinterpretation on the web: the remake entitled Malviviendo (malviviendo.com), an example of the use of audiovisual language and the new media of dissemination (YouTube and internet TV) as the final phase of a conceptual-creative process. The analytical path starts from a conceptual and critical reflection and ends in expression through the audiovisual media themselves.

  13. Exploring default mode and information flow on the web.

    Science.gov (United States)

    Oka, Mizuki; Ikegami, Takashi

    2013-01-01

    Social networking services (e.g., Twitter, Facebook) are now major sources of World Wide Web (called "Web") dynamics, together with Web search services (e.g., Google). These two types of Web services mutually influence each other but generate different dynamics. In this paper, we distinguish two modes of Web dynamics: the reactive mode and the default mode. It is assumed that Twitter messages (called "tweets") and Google search queries react to significant social movements and events, but they also demonstrate signs of becoming self-activated, thereby forming a baseline Web activity. We define the former as the reactive mode and the latter as the default mode of the Web. In this paper, we investigate these reactive and default modes of the Web's dynamics using transfer entropy (TE). The amount of information transferred between the time series of 1,000 frequent keywords in Twitter and the same keywords in Google queries is investigated across an 11-month time period. Study of the information flow on Google and Twitter revealed that information is generally transferred from Twitter to Google, indicating that Twitter time series carry some preceding information about Google time series. We also studied the information flow among the time series of different Twitter keywords by taking keywords as nodes and flow directions as edges of a network. An analysis of this network revealed that frequent keywords tend to become an information source and infrequent keywords tend to become a sink for other keywords. Based on these findings, we hypothesize that frequent keywords form the Web's default mode, which becomes an information source for infrequent keywords that generally form the Web's reactive mode. We also found that the Web consists of different time resolutions with respect to TE among Twitter keywords, which is another focal point of this paper.
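
    Transfer entropy between two keyword time series can be estimated with a simple histogram-based estimator, as in the sketch below (the bin count, lag, and toy data are assumptions; the paper's exact estimator is not reproduced):

    ```python
    import numpy as np

    def transfer_entropy(source, target, bins=4, lag=1):
        """Histogram-based TE(source -> target):
        sum over p(t+, t, s) * log2[ p(t+ | t, s) / p(t+ | t) ]."""
        s = np.digitize(source, np.histogram_bin_edges(source, bins)[1:-1])
        t = np.digitize(target, np.histogram_bin_edges(target, bins)[1:-1])
        t_next, t_now, s_now = t[lag:], t[:-lag], s[:-lag]
        te = 0.0
        for tn1 in np.unique(t_next):
            for tn in np.unique(t_now):
                for sn in np.unique(s_now):
                    joint = (t_next == tn1) & (t_now == tn) & (s_now == sn)
                    p_joint = joint.mean()
                    if p_joint == 0.0:
                        continue
                    p_full = joint.sum() / ((t_now == tn) & (s_now == sn)).sum()
                    p_hist = ((t_next == tn1) & (t_now == tn)).sum() / (t_now == tn).sum()
                    te += p_joint * np.log2(p_full / p_hist)
        return te

    # Toy check: y lags x by one step, so TE(x -> y) should exceed TE(y -> x).
    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)
    y = np.roll(x, 1) + 0.5 * rng.normal(size=1000)
    print(transfer_entropy(x, y), transfer_entropy(y, x))
    ```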

  14. A WebQuest for Spatial Skills

    Science.gov (United States)

    Wood, Pamela L.; Quitadamo, Ian J.; DePaepe, James L.; Loverro, Ian

    2007-01-01

    The WebQuest is a four-step process integrated at appropriate points in the Animal Studies unit. Through the WebQuest, students create a series of habitat maps that build on the knowledge gained from conducting the various activities of the unit. The quest concludes with an evaluation using the WebQuest rubric and an oral presentation of a final…

  15. Process-aware web programming with Jolie

    DEFF Research Database (Denmark)

    Montesi, F.

    2016-01-01

    We extend the Jolie programming language to capture the native modelling of process-aware web information systems, i.e., web information systems based upon the execution of business processes. Our main contribution is to offer a unifying approach for the programming of distributed architectures on the web, which can capture web servers, stateful process execution, and the composition of services via mediation. We discuss applications of this approach through a series of examples that cover, e.g., static content serving, multiparty sessions, and the evolution of web systems. Finally, we present a performance evaluation that includes a comparison of Jolie-based web systems to other frameworks and a measurement of its scalability. © 2016 Elsevier B.V.

  16. Web-ethics from the Perspective of a Series of Social Research Projects

    OpenAIRE

    CRUZ, HERNANDO; Lecturer, Department of Information Science - Pontificia Universidad Javeriana; Bogotá

    2009-01-01

    This article puts forth the perspective of an ethics for the web, or web-ethics, which the author has identified while doing research in Colombia. The research work has dealt with education, management, design, communication, and the use and retrieval of information on the web from 1998 to 2007, particularly the theoretical revision and critical analyses of a specific corpus of research work. These analyses have in turn led to new questions and challenges related to the balance which must be foun...

  17. Is there life out there ? - A new series for the ESA's Web TV

    Science.gov (United States)

    Clervoy, J. F.; Coliolo, F.

    2012-09-01

    The European Space Agency, ESA, is preparing a new outreach project: a series of short videos for ESA's Web TV dedicated to the search for life in the Universe. The rationale behind this pilot project is to use stunning images to attract attention, together with scientific content accessible to people of varying ages, education levels and cultural outlooks. We intend to work with scientists across Europe in order to take the public on a journey from the boundaries of the Cosmos to the core of the Earth, looking for the ingredients necessary for life to form and evolve. Our main objectives are: to share discovery, curiosity and a sense of adventure in order to make the public a player in the quest for knowledge about who we are and where we come from; to educate and engage different target audiences about European space science and exploration activities; and to encourage international partnerships. I will present the first trailer that we have produced with two scientists: André Brack, Astrobiologist, Honorary Director of Research at the CNRS, Orleans, France, and Gian Gabriele Ori, Research Professor in Geology and Director of the IRSPS, International Research School of Planetary Science, Pescara, Italy. This first presentation gives an overview of the "exobiological" places beyond the Earth and highlights the importance of comparative planetology for better understanding our planet. We would like to share ideas and advice on how to produce and distribute this series in the most efficient way.

  18. KeyPathwayMinerWeb

    DEFF Research Database (Denmark)

    List, Markus; Alcaraz, Nicolas; Dissing-Hansen, Martin

    2016-01-01

    We present KeyPathwayMinerWeb, the first online platform for de novo pathway enrichment analysis directly in the browser. Given a biological interaction network (e.g. protein-protein interactions) and a series of molecular profiles derived from one or multiple OMICS studies (gene expression, for instance), KeyPathwayMiner extracts connected sub-networks containing a high number of active or differentially regulated genes (proteins, metabolites) in the molecular profiles. The web interface at (http://keypathwayminer.compbio.sdu.dk) implements all core functionalities of the KeyPathwayMiner tool set such as data integration, input of background knowledge, batch runs for parameter optimization and visualization of extracted pathways. In addition to an intuitive web interface, we also implemented a RESTful API that now enables other online developers to integrate network enrichment as a web service...
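
    The record mentions a RESTful API for network enrichment as a web service; a hypothetical sketch of such a call follows (the endpoint path, parameter names, and payload shape are assumptions for illustration only; consult the service documentation for the real interface):

    ```python
    import requests

    # Toy inputs: a protein-protein interaction network and a small
    # active/inactive gene matrix, in made-up textual formats.
    payload = {
        "network": "A pp B\nB pp C",
        "profiles": "A 1\nB 1\nC 0",
        "K": 1,  # allowed gene exceptions (KeyPathwayMiner-style parameter)
        "L": 0,  # allowed case exceptions per gene
    }
    resp = requests.post(
        "https://keypathwayminer.compbio.sdu.dk/api/run",  # hypothetical endpoint
        json=payload,
    )
    resp.raise_for_status()
    print(resp.json())  # extracted sub-networks, in the service's own schema
    ```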

  19. Overview of the TREC 2013 Federated Web Search Track

    NARCIS (Netherlands)

    Demeester, Thomas; Trieschnigg, Rudolf Berend; Nguyen, Dong-Phuong; Hiemstra, Djoerd

    2014-01-01

    The TREC Federated Web Search track is intended to promote research related to federated search in a realistic web setting, and hereto provides a large data collection gathered from a series of online search engines. This overview paper discusses the results of the first edition of the track, FedWeb 2013.

  20. MR imaging of carotid webs

    International Nuclear Information System (INIS)

    Boesen, Mari E.; Eswaradass, Prasanna Venkatesan; Singh, Dilip; Mitha, Alim P.; Menon, Bijoy K.; Goyal, Mayank; Frayne, Richard

    2017-01-01

    We propose a magnetic resonance (MR) imaging protocol for the characterization of carotid web morphology, composition, and vessel wall dynamics. The purpose of this case series was to determine the feasibility of imaging carotid webs with MR imaging. Five patients diagnosed with carotid web on CT angiography were recruited to undergo a 30-min MR imaging session. MR angiography (MRA) images of the carotid artery bifurcation were acquired. Multi-contrast fast spin echo (FSE) images were acquired axially about the level of the carotid web. Two types of cardiac phase resolved sequences (cineFSE and cine phase contrast) were acquired to visualize the elasticity of the vessel wall affected by the web. Carotid webs were identified on MRA in 5/5 (100%) patients. Multi-contrast FSE revealed vessel wall thickening and cineFSE demonstrated regional changes in distensibility surrounding the webs in these patients. Our MR imaging protocol enables an in-depth evaluation of patients with carotid webs: morphology (by MRA), composition (by multi-contrast FSE), and wall dynamics (by cineFSE). (orig.)

  21. MR imaging of carotid webs

    Energy Technology Data Exchange (ETDEWEB)

    Boesen, Mari E. [University of Calgary, Department of Biomedical Engineering, Calgary (Canada); Foothills Medical Centre, Seaman Family MR Research Centre, Calgary (Canada); Eswaradass, Prasanna Venkatesan; Singh, Dilip; Mitha, Alim P.; Menon, Bijoy K. [University of Calgary, Department of Clinical Neurosciences, Calgary (Canada); Foothills Medical Centre, Calgary Stroke Program, Calgary (Canada); Goyal, Mayank [Foothills Medical Centre, Calgary Stroke Program, Calgary (Canada); University of Calgary, Department of Radiology, Calgary (Canada); Frayne, Richard [Foothills Medical Centre, Seaman Family MR Research Centre, Calgary (Canada); University of Calgary, Hotchkiss Brain Institute, Calgary (Canada)

    2017-04-15

    We propose a magnetic resonance (MR) imaging protocol for the characterization of carotid web morphology, composition, and vessel wall dynamics. The purpose of this case series was to determine the feasibility of imaging carotid webs with MR imaging. Five patients diagnosed with carotid web on CT angiography were recruited to undergo a 30-min MR imaging session. MR angiography (MRA) images of the carotid artery bifurcation were acquired. Multi-contrast fast spin echo (FSE) images were acquired axially about the level of the carotid web. Two types of cardiac phase resolved sequences (cineFSE and cine phase contrast) were acquired to visualize the elasticity of the vessel wall affected by the web. Carotid webs were identified on MRA in 5/5 (100%) patients. Multi-contrast FSE revealed vessel wall thickening and cineFSE demonstrated regional changes in distensibility surrounding the webs in these patients. Our MR imaging protocol enables an in-depth evaluation of patients with carotid webs: morphology (by MRA), composition (by multi-contrast FSE), and wall dynamics (by cineFSE). (orig.)

  22. Sustainable Materials Management Web Academy

    Science.gov (United States)

    The Sustainable Materials Management (SMM) Web Academy series is a free resource for SMM challenge participants, stakeholders, and anyone else interested in learning more about SMM principles from experts in the field.

  23. Technical Evaluation Report 41: WebCT: A major shift of emphasis

    Directory of Open Access Journals (Sweden)

    Kristine Thibeault

    2004-11-01

    Full Text Available The evaluation reports in this series usually feature several products at once. The current review, however, comes at a time when one of the most widely used (and expensive) online learning management systems is undergoing a major change in its marketing strategy and corporate focus. WebCT is currently evolving to a new version (WebCT Vista), with much attendant discussion among distance education (DE) users. The current review, like the others in this series, adds the DE student's perspective to this discussion. The review compares the existing WebCT Campus Edition with the new WebCT Vista, and examines some of the problems associated with the migration to Vista at the institutional level. A response to the report by the WebCT company is appended.

  24. Adolescents on the Net: Reproductive and Sexual Health. Web Resources, Series One.

    Science.gov (United States)

    United Nations Educational, Scientific and Cultural Organization, Bangkok (Thailand). Principal Regional Office for Asia and the Pacific.

    This document announces web sites that address adolescent reproductive and sexual health. The web sites are arranged alphabetically by name, and refer to the owner of the site rather than the title. The profile of each site consists of basic information such as the address of the organization or owner, fax number, telephone number, e-mail address,…

  25. History of the CERN Web Software Public Releases

    CERN Document Server

    Fluckiger, Francois; CERN. Geneva. IT Department

    2016-01-01

    This note is an extended version of the article “Licencing the Web” (http://home.web.cern.ch/topics/birthweb/licensing-web) published by CERN, Nov 2013, in the “Birth of the Web” series of articles (http://home.cern/topics/birth-web). It describes the successive steps of the public release of the CERN Web software, from public domain to open source, and explains their rationale. It provides in annexes historical documents including release announcement and texts of the licences used by CERN and MIT in public software distributions.

  26. The Quantitative Web Quality Index as an Objective Instrument for Measuring the Quality of Corporate Websites [EL ÍNDICE CUANTITATIVO DE CALIDAD WEB COMO INSTRUMENTO OBJETIVO DE MEDICIÓN DE LA CALIDAD DE SITIOS WEB CORPORATIVOS]

    Directory of Open Access Journals (Sweden)

    González López, Óscar R.

    2013-01-01

    Full Text Available The new environment marked by the economic crisis makes it essential to know how efficient the online actions carried out by a company are. This paper proposes a series of indicators for evaluating the websites of companies around the world. To this end, the research was designed around the manual and automatic evaluation of a series of objective variables, followed by the application of a factor analysis to construct the indicators. The quantitative web quality index (ICCW) is a versatile tool that allows any type of organization to be compared and the strengths and weaknesses of a company's website to be detected.

  27. Learning in a sheltered Internet environment: The use of WebQuests

    NARCIS (Netherlands)

    Segers, P.C.J.; Verhoeven, L.T.W.

    2009-01-01

    The present study investigated the effects on learning in a sheltered Internet environment using so-called WebQuests in elementary school classrooms in the Netherlands. A WebQuest is an assignment presented together with a series of web pages to help guide children's learning. The learning gains and

  28. Web-Enhanced Instruction and Learning: Findings of a Short- and Long-Term Impact Study and Teacher Use of NASA Web Resources

    Science.gov (United States)

    McCarthy, Marianne C.; Grabowski, Barbara L.; Koszalka, Tiffany

    2003-01-01

    Over a three-year period, researchers and educators from the Pennsylvania State University (PSU), University Park, Pennsylvania, and the NASA Dryden Flight Research Center (DFRC), Edwards, California, worked together to analyze, develop, implement and evaluate materials and tools that enable teachers to use NASA Web resources effectively for teaching science, mathematics, technology and geography. Two conference publications and one technical paper have already been published as part of this educational research series on Web-based instruction and learning. This technical paper, Web-Enhanced Instruction and Learning: Findings of a Short- and Long-Term Impact Study, is the culminating report in this educational research series and is based on the final report submitted to NASA. This report describes the broad spectrum of data gathered from teachers about their experiences using NASA Web resources in the classroom. It also describes participating teachers' responses and feedback about the use of the NASA Web-Enhanced Learning Environment Strategies reflection tool on their teaching practices. The reflection tool was designed to help teachers merge the vast array of NASA resources with the best teaching methods, taking into consideration grade levels, subject areas and teaching preferences. The teachers described their attitudes toward technology and innovation in the classroom and their experiences and perceptions as they attempted to integrate Web resources into science, mathematics, technology and geography instruction.

  29. Ensemble learned vaccination uptake prediction using web search queries

    DEFF Research Database (Denmark)

    Hansen, Niels Dalum; Lioma, Christina; Mølbak, Kåre

    2016-01-01

    We present a method that uses ensemble learning to combine clinical and web-mined time-series data in order to predict future vaccination uptake. The clinical data is official vaccination registries, and the web data is query frequencies collected from Google Trends. Experiments with official vaccine records show that our method predicts vaccination uptake effectively (4.7 Root Mean Squared Error). Whereas performance is best when combining clinical and web data, using solely web data yields comparable performance. To our knowledge, this is the first study to predict vaccination uptake...
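
    A minimal sketch of the combining idea follows; a plain linear combiner over synthetic stand-ins for the registry and Google Trends signals stands in for the paper's ensemble learner (all data and model choices below are assumptions):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error

    # Synthetic stand-ins: monthly uptake (%), last month's registry value,
    # and a noisy web-query signal correlated with uptake.
    rng = np.random.default_rng(42)
    months = 60
    uptake = 70 + np.cumsum(rng.normal(0, 0.5, months))
    clinical_lag = np.roll(uptake, 1)
    web_queries = uptake + rng.normal(0, 2, months)

    X = np.column_stack([clinical_lag, web_queries])[1:]  # drop undefined lag
    y = uptake[1:]
    split = int(0.8 * len(y))

    model = LinearRegression().fit(X[:split], y[:split])
    rmse = mean_squared_error(y[split:], model.predict(X[split:])) ** 0.5
    print(f"RMSE: {rmse:.2f}")  # cf. the 4.7 RMSE reported above
    ```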

  30. Learning in a Sheltered Internet Environment: The Use of WebQuests

    Science.gov (United States)

    Segers, Eliane; Verhoeven, Ludo

    2009-01-01

    The present study investigated the effects on learning in a sheltered Internet environment using so-called WebQuests in elementary school classrooms in the Netherlands. A WebQuest is an assignment presented together with a series of web pages to help guide children's learning. The learning gains and quality of the work of 229 sixth graders…

  31. Teaching with technology: automatically receiving information from the internet and web.

    Science.gov (United States)

    Wink, Diane M

    2010-01-01

    In this bimonthly series, the author examines how nurse educators can use the Internet and Web-based computer technologies such as search, communication, and collaborative writing tools, social networking and social bookmarking sites, virtual worlds, and Web-based teaching and learning programs. This article presents information and tools related to automatically receiving information from the Internet and Web.

  32. Teaching with technology: free Web resources for teaching and learning.

    Science.gov (United States)

    Wink, Diane M; Smith-Stoner, Marilyn

    2011-01-01

    In this bimonthly series, the department editor examines how nurse educators can use Internet and Web-based computer technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. In this article, the department editor and her coauthor describe free Web-based resources that can be used to support teaching and learning.

  33. A Method for Transforming Existing Web Service Descriptions into an Enhanced Semantic Web Service Framework

    Science.gov (United States)

    Du, Xiaofeng; Song, William; Munro, Malcolm

    Web Services, as a new distributed system technology, have been widely adopted by industry in areas such as enterprise application integration (EAI), business process management (BPM), and virtual organisation (VO). However, the lack of semantics in the current Web Service standards has been a major barrier to service discovery and composition. In this chapter, we propose an enhanced context-based semantic service description framework (CbSSDF+) that tackles the problem and improves the flexibility of service discovery and the correctness of generated composite services. We also provide an agile transformation method to demonstrate how the various formats of Web Service descriptions on the Web can be managed and renovated step by step into CbSSDF+ based service descriptions without a large amount of engineering work. At the end of the chapter, we evaluate the applicability of the transformation method and the effectiveness of CbSSDF+ through a series of experiments.

  34. Web Usage Mining, Pattern Discovery and Log Files

    OpenAIRE

    Tri Suratno; Toni Prahasto; Adian Fatchur Rochim

    2014-01-01

    Analysis of server access data can provide significant and useful information for performance improvement, restructuring, and improving the effectiveness of a web site. Data mining is one of the most effective ways to detect a series of patterns of information in large amounts of data. The application of data mining to Internet usage, called web mining, is a set of data mining techniques used for the web. Web mining technologies and data mining is a combination o...

  35. Biomedical semantics in the Semantic Web.

    Science.gov (United States)

    Splendiani, Andrea; Burger, Albert; Paschke, Adrian; Romano, Paolo; Marshall, M Scott

    2011-03-07

    The Semantic Web offers an ideal platform for representing and linking biomedical information, which is a prerequisite for the development and application of analytical tools to address problems in data-intensive areas such as systems biology and translational medicine. As for any new paradigm, the adoption of the Semantic Web offers opportunities and poses questions and challenges to the life sciences scientific community: which technologies in the Semantic Web stack will be more beneficial for the life sciences? Is biomedical information too complex to benefit from simple interlinked representations? What are the implications of adopting a new paradigm for knowledge representation? What are the incentives for the adoption of the Semantic Web, and who are the facilitators? Is there going to be a Semantic Web revolution in the life sciences? We report here a few reflections on these questions, following discussions at the SWAT4LS (Semantic Web Applications and Tools for Life Sciences) workshop series, of which this Journal of Biomedical Semantics special issue presents selected papers from the 2009 edition, held in Amsterdam on November 20th.

  36. Web Environments for Group-Based Project Work in Higher Education

    NARCIS (Netherlands)

    Andernach, J.A.; van Diepen, N.M.; Collis, Betty; Andernach, Toine

    1997-01-01

    We discuss problems confronting the use of group-based project work as an instructional strategy in higher education and describe two courses in which course-specific World Wide Web (Web) environments have evolved over a series of course sequences and are used both as tool environments for

  37. Utilizing Web 2.0 Technologies for Library Web Tutorials: An Examination of Instruction on Community College Libraries' Websites Serving Large Student Bodies

    Science.gov (United States)

    Blummer, Barbara; Kenton, Jeffrey M.

    2015-01-01

    This is the second part of a series on Web 2.0 tools available from community college libraries' Websites. The first article appeared in an earlier volume of this journal and it illustrated the wide variety of Web 2.0 tools on community college libraries' Websites serving large student bodies (Blummer and Kenton 2014). The research found many of…

  38. Enabling Web-Based Analysis of CUAHSI HIS Hydrologic Data Using R and Web Processing Services

    Science.gov (United States)

    Ames, D. P.; Kadlec, J.; Bayles, M.; Seul, M.; Hooper, R. P.; Cummings, B.

    2015-12-01

    The CUAHSI Hydrologic Information System (CUAHSI HIS) provides open access to a large number of observed and modeled hydrological time series from many parts of the world. Several software tools have been designed to simplify searching and access to the CUAHSI HIS datasets. These tools include desktop client software (HydroDesktop, HydroExcel), developer libraries (WaterML R Package, OWSLib, ulmo), and the new interactive search website, http://data.cuahsi.org. An issue with using the time series data from CUAHSI HIS for further analysis by hydrologists (for example, for verification of hydrological and snowpack models) is the large heterogeneity of the time series data. The time series may be regular or irregular, contain missing data, have different time support, and be recorded in different units. R is a widely used computational environment for statistical analysis of time series and spatio-temporal data that can be used to assess fitness and perform scientific analyses on observation data. R includes the ability to record a data analysis in the form of a reusable script. The R script, together with the input time series dataset, can be shared with other users, making the analysis more reproducible. The major goal of this study is to examine the use of R as a Web Processing Service for transforming time series data from the CUAHSI HIS and sharing the results on the Internet within HydroShare. HydroShare is an online data repository and social network for sharing large hydrological data sets such as time series, raster datasets, and multi-dimensional data. It can be used as a permanent cloud storage space for saving the time series analysis results. We examine the issues associated with running R scripts online, including code validation, saving of outputs, reporting progress, and provenance management. An explicit goal is that a script run locally should produce exactly the same results as the script run on the Internet. Our design can
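
    The heterogeneity issues listed above (irregular spacing, gaps, differing time support and units) are exactly what a short harmonization script addresses; a pandas sketch under assumed data follows (the R/WaterML workflow itself is not reproduced here):

    ```python
    import numpy as np
    import pandas as pd

    # Toy irregular discharge series in cubic feet per second.
    obs = pd.Series(
        [12.0, 14.5, np.nan, 13.2],
        index=pd.to_datetime(["2015-06-01 00:00", "2015-06-01 00:40",
                              "2015-06-01 01:05", "2015-06-01 02:10"]),
    )

    harmonized = (
        obs.resample("30min").mean()  # impose a common, regular time support
           .interpolate("time")       # fill missing-data gaps
           .mul(0.0283168)            # convert ft^3/s to m^3/s
    )
    print(harmonized)
    ```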

  39. Overview of the TREC 2013 federated web search track

    OpenAIRE

    Demeester, Thomas; Trieschnigg, D; Nguyen, D; Hiemstra, D

    2013-01-01

    The TREC Federated Web Search track is intended to promote research related to federated search in a realistic web setting, and hereto provides a large data collection gathered from a series of online search engines. This overview paper discusses the results of the first edition of the track, FedWeb 2013. The focus was on basic challenges in federated search: (1) resource selection, and (2) results merging. After an overview of the provided data collection and the relevance judgments for the ...

  40. Safety and efficacy of aneurysm treatment with WEB

    DEFF Research Database (Denmark)

    Pierot, Laurent; Costalat, Vincent; Moret, Jacques

    2016-01-01

    OBJECT WEB is an innovative intrasaccular treatment for intracranial aneurysms. Preliminary series have shown good safety and efficacy. The WEB Clinical Assessment of Intrasaccular Aneurysm Therapy (WEBCAST) trial is a prospective European trial evaluating the safety and efficacy of WEB in wide-neck bifurcation aneurysms. METHODS Patients with wide-neck bifurcation aneurysms for which WEB treatment was indicated were included in this multicenter good clinical practice study. Clinical data including adverse events and clinical status at 1 and 6 months were collected and independently analyzed by a medical... RESULTS Ten European neurointerventional centers enrolled 51 patients with 51 aneurysms. Treatment with WEB was achieved in 48 of 51 aneurysms (94.1%). Adjunctive implants (coils/stents) were used in 4 of 48 aneurysms (8.3%). Thromboembolic events were observed in 9 of 51 patients (17.6%), resulting...

  41. Noise and Vibration Risk Prevention Virtual Web for Ubiquitous Training

    Science.gov (United States)

    Redel-Macías, María Dolores; Cubero-Atienza, Antonio J.; Martínez-Valle, José Miguel; Pedrós-Pérez, Gerardo; del Pilar Martínez-Jiménez, María

    2015-01-01

    This paper describes a new Web portal offering experimental labs for ubiquitous training of university engineering students in work-related risk prevention. The Web-accessible computer program simulates the noise and machine vibrations met in the work environment, in a series of virtual laboratories that mimic an actual laboratory and provide the…

  42. Update of the FANTOM web resource

    DEFF Research Database (Denmark)

    Lizio, Marina; Harshbarger, Jayson; Abugessaisa, Imad

    2017-01-01

    Upon the first publication of the fifth iteration of the Functional Annotation of Mammalian Genomes collaborative project, FANTOM5, we gathered a series of primary data and database systems into the FANTOM web resource (http://fantom.gsc.riken.jp) to facilitate researchers to explore transcriptional regulation and cellular states. In the course of the collaboration, primary data and analysis results have been expanded, and functionalities of the database systems enhanced. We believe that our data and web systems are invaluable resources, and we think the scientific community will benefit from this recent update to deepen their understanding of mammalian cellular organization. We introduce the contents of FANTOM5 here, report recent updates in the web resource and provide future perspectives.

  43. Multigraph: Interactive Data Graphs on the Web

    Science.gov (United States)

    Phillips, M. B.

    2010-12-01

    Many aspects of geophysical science involve time-dependent data that are often presented in the form of a graph. Considering that the web has become a primary means of communication, there are surprisingly few good tools and techniques available for presenting time-series data on the web. The most common solution is to use a desktop tool such as Excel or Matlab to create a graph which is saved as an image and then included in a web page like any other image. This technique is straightforward, but it limits the user to one particular view of the data, and disconnects the graph from the data in a way that makes updating a graph with new data an often cumbersome manual process. This situation is somewhat analogous to the state of mapping before the advent of GIS. Maps existed only in printed form, and creating a map was a laborious process. In the last several years, however, the world of mapping has experienced a revolution in the form of web-based and other interactive computer technologies, so that it is now commonplace for anyone to easily browse through gigabytes of geographic data. Multigraph seeks to bring a similar ease of access to time series data. Multigraph is a program for displaying interactive time-series data graphs in web pages that includes a simple way of configuring the appearance of the graph and the data to be included. It allows multiple data sources to be combined into a single graph, and allows the user to explore the data interactively. Multigraph lets users explore and visualize "data space" in the same way that interactive mapping applications such as Google Maps facilitate exploring and visualizing geography. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file and requires no programming. Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf

  44. Web-based access to near real-time and archived high-density time-series data: cyber infrastructure challenges & developments in the open-source Waveform Server

    Science.gov (United States)

    Reyes, J. C.; Vernon, F. L.; Newman, R. L.; Steidl, J. H.

    2010-12-01

    The Waveform Server is an interactive web-based interface to multi-station, multi-sensor and multi-channel high-density time-series data stored in Center for Seismic Studies (CSS) 3.0 schema relational databases (Newman et al., 2009). In the last twelve months, based on expanded specifications and current user feedback, both the server-side infrastructure and the client-side interface have been extensively rewritten. The Python Twisted server-side code base has been fundamentally modified and now presents waveform data stored in cluster-based databases using a multi-threaded architecture, in addition to supporting the pre-existing single-database model. This allows interactive web-based access to high-density (broadband @ 40 Hz to strong motion @ 200 Hz) waveform data that can span multiple years, the common lifetime of broadband seismic networks. The client-side interface expands on its use of simple JSON-based AJAX queries to now incorporate a variety of User Interface (UI) improvements, including standardized calendars for defining time ranges, applying on-the-fly data calibration to display SI-unit data, and increased rendering speed. This presentation will outline the various cyber infrastructure challenges we have faced while developing this application, the use-cases currently in existence, and the limitations of web-based application development.
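
    The on-the-fly calibration mentioned above reduces, at its core, to scaling raw digitizer counts by per-channel metadata; a sketch with an invented gain follows (the Waveform Server's actual calibration path is not shown):

    ```python
    import numpy as np

    def counts_to_si(raw_counts: np.ndarray, calib: float) -> np.ndarray:
        """Apply a per-channel calibration factor (SI units per count),
        the kind of metadata a CSS 3.0-style schema stores per waveform."""
        return raw_counts * calib

    # Illustrative broadband samples and a made-up gain in (nm/s)/count.
    raw = np.array([1200.0, 1185.0, 1230.0])
    print(counts_to_si(raw, calib=1.6e-6))
    ```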

  45. Time-Series Adaptive Estimation of Vaccination Uptake Using Web Search Queries

    DEFF Research Database (Denmark)

    Dalum Hansen, Niels; Mølbak, Kåre; Cox, Ingemar J.

    2017-01-01

    Estimating vaccination uptake is an integral part of ensuring public health. It was recently shown that vaccination uptake can be estimated automatically from web data, instead of slowly collected clinical records or population surveys [2]. All prior work in this area assumes that features of vac...

  46. Mining the inner structure of the Web graph

    International Nuclear Information System (INIS)

    Donato, Debora; Leonardi, Stefano; Millozzi, Stefano; Tsaparas, Panayiotis

    2008-01-01

    Despite being the sum of decentralized and uncoordinated efforts by heterogeneous groups and individuals, the World Wide Web exhibits a well-defined structure, characterized by several interesting properties. This structure was clearly revealed by Broder et al (2000 Graph structure in the web Comput. Netw. 33 309), who presented the evocative bow-tie picture of the Web. Although the bow-tie structure is a relatively clear abstraction of the macroscopic picture of the Web, it is quite uninformative with respect to the finer details of the Web graph. In this paper, we mine the inner structure of the Web graph. We present a series of measurements on the Web which offer a better understanding of the individual components of the bow-tie. In the process, we develop algorithmic techniques for performing these measurements. We discover that the scale-free properties permeate all the components of the bow-tie, which exhibit the same macroscopic properties as the Web graph itself. However, close inspection reveals that their inner structure is quite distinct. We show that the Web graph does not exhibit self-similarity within its components, and we propose a possible alternative picture for the Web graph, as it emerges from our experiments
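
    The bow-tie components the paper measures can be computed for any directed graph along these lines (networkx stands in here for the authors' own algorithmic techniques, which the abstract does not detail):

    ```python
    import networkx as nx

    def bow_tie(g: nx.DiGraph):
        """Split a directed graph into the classic bow-tie parts:
        CORE (largest SCC), IN (reaches CORE), OUT (reachable from CORE)."""
        core = max(nx.strongly_connected_components(g), key=len)
        seed = next(iter(core))
        out_part = nx.descendants(g, seed) - core
        in_part = nx.ancestors(g, seed) - core
        rest = set(g) - core - in_part - out_part  # tendrils, tubes, disconnected
        return core, in_part, out_part, rest

    g = nx.DiGraph([(0, 1), (1, 2), (2, 3), (3, 2), (3, 4), (5, 6)])
    print(bow_tie(g))  # ({2, 3}, {0, 1}, {4}, {5, 6})
    ```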

  47. From Web accessibility to Web adaptability.

    Science.gov (United States)

    Kelly, Brian; Nevile, Liddy; Sloan, David; Fanou, Sotiris; Ellison, Ruth; Herrod, Lisa

    2009-07-01

    This article asserts that current approaches to enhance the accessibility of Web resources fail to provide a solid foundation for the development of a robust and future-proofed framework. In particular, they fail to take advantage of new technologies and technological practices. The article introduces a framework for Web adaptability, which encourages the development of Web-based services that can be resilient to the diversity of uses of such services, the target audience, available resources, technical innovations, organisational policies and relevant definitions of 'accessibility'. The article refers to a series of author-focussed approaches to accessibility through which the authors and others have struggled to find ways to promote accessibility for people with disabilities. These approaches depend upon the resource author's determination of the anticipated users' needs and their provision. Through approaches labelled as 1.0, 2.0 and 3.0, the authors have widened their focus to account for contexts and individual differences in target audiences. Now, the authors want to recognise the role of users in determining their engagement with resources (including services). To distinguish this new approach, the term 'adaptability' has been used to replace 'accessibility'; new definitions of accessibility have been adopted, and the authors have reviewed their previous work to clarify how it is relevant to the new approach. Accessibility 1.0 is here characterised as a technical approach in which authors are told how to construct resources for a broadly defined audience. This is known as universal design. Accessibility 2.0 was introduced to point to the need to account for the context in which resources would be used, to help overcome inadequacies identified in the purely technical approach. Accessibility 3.0 moved the focus on users from a homogenised universal definition to recognition of the idiosyncratic needs and preferences of individuals and to cater for them. All of

  48. WebDMS: A Web-Based Data Management System for Environmental Data

    Science.gov (United States)

    Ekstrand, A. L.; Haderman, M.; Chan, A.; Dye, T.; White, J. E.; Parajon, G.

    2015-12-01

    DMS is an environmental Data Management System to manage, quality-control (QC), summarize, document chain-of-custody, and disseminate data from networks ranging in size from a few sites to thousands of sites, instruments, and sensors. The server-client desktop version of DMS is used by local and regional air quality agencies (including the Bay Area Air Quality Management District, the South Coast Air Quality Management District, and the California Air Resources Board), the EPA's AirNow Program, and the EPA's AirNow-International (AirNow-I) program, which offers countries the ability to run an AirNow-like system. As AirNow's core data processing engine, DMS ingests, QCs, and stores real-time data from over 30,000 active sensors at over 5,280 air quality and meteorological sites from over 130 air quality agencies across the United States. As part of the AirNow-I program, several instances of DMS are deployed in China, Mexico, and Taiwan. The U.S. Department of State's StateAir Program also uses DMS for five regions in China and plans to expand to other countries in the future. Recent development has begun to migrate DMS from an onsite desktop application to WebDMS, a web-based application designed to take advantage of cloud hosting and computing services to increase scalability and lower costs. WebDMS will continue to provide easy-to-use data analysis tools, such as time-series graphs, scatterplots, and wind- or pollution-rose diagrams, as well as allowing data to be exported to external systems such as the EPA's Air Quality System (AQS). WebDMS will also provide new GIS analysis features and a suite of web services through a RESTful web API. These changes will better meet air agency needs and allow for broader national and international use (for example, by the AirNow-I partners). We will talk about the challenges and advantages of migrating DMS to the web, modernizing the DMS user interface, and making it more cost-effective to enhance and maintain over time.
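    As a rough illustration of how a client might consume the RESTful web API that WebDMS exposes, consider the hedged sketch below; the host, path and parameter names are hypothetical, since the abstract does not document the actual interface.

    ```python
    # Hypothetical client for a WebDMS-style REST API; the endpoint and
    # parameter names below are illustrative, not the documented ones.
    import requests

    BASE = "https://webdms.example.org/api"   # placeholder host

    resp = requests.get(
        f"{BASE}/timeseries",
        params={
            "site": "SITE_001",               # hypothetical site identifier
            "parameter": "PM2.5",             # pollutant of interest
            "start": "2015-01-01T00:00",
            "end": "2015-01-02T00:00",
            "format": "json",
        },
        timeout=30,
    )
    resp.raise_for_status()
    for record in resp.json():                # e.g. one QC'd value per interval
        print(record)
    ```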

  9. Web Services and Data Enhancements at the Northern California Earthquake Data Center

    Science.gov (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.

    2013-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provide channel gains, poles, and zeros in SAC format, (2) resp: provide channel response information in RESP format, (3) dataless: provide station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System). We are currently adding waveforms to these older event gathers with time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that the waveforms for each channel in the event gathers have the highest
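    The FDSN web services named above follow a published specification, so a dataselect request can be sketched directly; verify the host and the example network/station codes against the NCEDC documentation before relying on them.

    ```python
    # Fetch MiniSEED waveforms from an FDSN-compliant dataselect service.
    # URL pattern per the FDSN web service specification; host assumed.
    import requests

    url = "https://service.ncedc.org/fdsnws/dataselect/1/query"
    params = {
        "net": "BK", "sta": "CMB", "loc": "00", "cha": "BHZ",  # example codes
        "starttime": "2013-01-01T00:00:00",
        "endtime": "2013-01-01T00:10:00",
    }
    r = requests.get(url, params=params, timeout=60)
    r.raise_for_status()
    with open("waveform.mseed", "wb") as f:
        f.write(r.content)        # MiniSEED bytes, as the service returns them
    ```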

  10. Customisable Scientific Web Portal for Fusion Research

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G; Kim, E; Schissel, D; Flannagan, S [General Atomics, San Diego (United States)

    2009-07-01

    The Web browser has become one of the major application interfaces for remotely participating in magnetic fusion experiments. Web portals are used to present very diverse sources of information in a unified way. While a web portal has several benefits over other software interfaces, such as providing a single point of access for multiple computational services and eliminating the need for client software installation, its design and development pose unique challenges. One challenge is that a web portal needs to be fast and interactive despite the high volume of tools and information it presents. Another is that the visual output on a web portal is often overwhelming due to the high volume of data generated by complex scientific instruments and experiments; the applications and information should therefore be customizable depending on the needs of users. An appropriate software architecture and web technologies can address these problems. A web portal has been designed to support the experimental activities of DIII-D researchers worldwide. It utilizes a multi-tier software architecture and web 2.0 technologies, such as AJAX, Django, and Memcached, to provide a highly interactive and customizable user interface. It offers a customizable interface with personalized page layouts and a list of services for users to select. Customizable services include real-time experiment status monitoring, diagnostic data access, and interactive data visualization. The web portal also supports interactive collaborations by providing a collaborative logbook, shared visualization and online instant messaging services. Furthermore, the web portal will provide a mechanism to allow users to create their own applications on the web portal, as well as bridging capabilities to external applications such as Twitter and other social networks. In this series of slides, we describe the software architecture of this scientific web portal and our experiences in utilizing web 2.0 technologies. A
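    A typical pattern behind such a portal is memoizing expensive backend queries in Memcached so that frequently refreshed pages stay responsive. A minimal sketch follows, assuming a local memcached server and the pymemcache package; the key scheme and the backend fetch callable are illustrative only.

    ```python
    # Memoize a slow status query in Memcached with a short TTL.
    import json
    from pymemcache.client.base import Client

    cache = Client(("localhost", 11211))      # assumes a running memcached

    def experiment_status(shot_number, fetch):
        """Return cached status for a shot, recomputing via `fetch` on a miss."""
        key = f"status:{shot_number}"         # hypothetical key scheme
        raw = cache.get(key)
        if raw is not None:
            return json.loads(raw)            # cache hit: no backend call
        status = fetch(shot_number)           # slow backend call on a miss
        cache.set(key, json.dumps(status), expire=5)  # short TTL keeps it fresh
        return status

    print(experiment_status(123456, lambda n: {"shot": n, "state": "running"}))
    ```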

  11. TIGERweb, 2017, Series Information for the TIGERweb, Web Mapping Service and REST files

    Data.gov (United States)

    US Census Bureau, Department of Commerce — TIGERweb allows TIGER spatial data to be viewed online and streamed to your mapping application. TIGERweb consists of a web mapping service...

  12. Web Page Recommendation Using Web Mining

    OpenAIRE

    Modraj Bhavsar; Mrs. P. M. Chavan

    2014-01-01

    On the World Wide Web, content of various kinds is generated in huge amounts, so web recommendation has become an important part of web applications for delivering relevant results to users. Different kinds of web recommendations are made available to users every day, including images, video, audio, query suggestions and web pages. In this paper we aim at providing a framework for web page recommendation. 1) First we describe the basics of web mining and the types of web mining. 2) Details of each...

  13. Self-test web-based pure-tone audiometry: validity evaluation and measurement error analysis.

    Science.gov (United States)

    Masalski, Marcin; Kręcicki, Tomasz

    2013-04-12

    Potential methods of application of self-administered Web-based pure-tone audiometry conducted at home on a PC with a sound card and ordinary headphones depend on the value of measurement error in such tests. The aim of this research was to determine the measurement error of the hearing threshold determined in the way described above and to identify and analyze factors influencing its value. The evaluation of the hearing threshold was made in three series: (1) tests on a clinical audiometer, (2) self-tests done on a specially calibrated computer under the supervision of an audiologist, and (3) self-tests conducted at home. The research was carried out on a group of 51 participants selected from patients of an audiology outpatient clinic. From the group of 51 patients examined in the first two series, the third series was self-administered at home by 37 subjects (73%). The average difference between the value of the hearing threshold determined in series 1 and in series 2 was -1.54 dB with a standard deviation of 7.88 dB and a Pearson correlation coefficient of .90. Between the first and third series, these values were -1.35 dB ± 10.66 dB and .84, respectively. In series 3, the standard deviation was most influenced by the error connected with the procedure of hearing threshold identification (6.64 dB), the calibration error (6.19 dB), and additionally, at the frequency of 250 Hz, by the frequency nonlinearity error (7.28 dB). The obtained results confirm the possibility of applying Web-based pure-tone audiometry in screening tests. In the future, modifications of the method leading to a decrease in measurement error could broaden the scope of application of Web-based pure-tone audiometry.
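    Assuming the reported error components are independent and combine in quadrature (an illustrative model, not necessarily the authors' exact analysis), their magnitudes are consistent with the observed spread:

    ```python
    # Combine independent error components in quadrature (illustrative).
    from math import sqrt

    procedure = 6.64     # dB, threshold-identification error (series 3)
    calibration = 6.19   # dB, home calibration error
    nonlinearity = 7.28  # dB, frequency nonlinearity error at 250 Hz

    print(round(sqrt(procedure**2 + calibration**2), 2))                    # ~9.08 dB
    print(round(sqrt(procedure**2 + calibration**2 + nonlinearity**2), 2))  # ~11.64 dB
    # Both are comparable to the observed 10.66 dB standard deviation.
    ```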

  14. 'TIME': A Web Application for Obtaining Insights into Microbial Ecology Using Longitudinal Microbiome Data.

    Science.gov (United States)

    Baksi, Krishanu D; Kuntal, Bhusan K; Mande, Sharmila S

    2018-01-01

    Realization of the importance of microbiome studies, coupled with the decreasing sequencing cost, has led to the exponential growth of microbiome data. A number of these microbiome studies have focused on understanding changes in the microbial community over time. Such longitudinal microbiome studies have the potential to offer unique insights pertaining to the microbial social networks as well as their responses to perturbations. In this communication, we introduce a web-based framework called 'TIME' ('Temporal Insights into Microbial Ecology'), developed specifically to obtain meaningful insights from microbiome time series data. The TIME web-server is designed to accept a wide range of popular formats as input, with options to preprocess and filter the data. Multiple samples, defined by a series of longitudinal time points along with their metadata information, can be compared in order to interactively visualize the temporal variations. In addition to standard microbiome data analytics, the web server implements popular time series analysis methods such as dynamic time warping, Granger causality and the Dickey-Fuller test to generate interactive layouts that facilitate easy biological inferences. Apart from this, a new metric for comparing metagenomic time series data has been introduced to effectively visualize the similarities/differences in the trends of the resident microbial groups. The visualizations, augmented with stationarity information pertaining to the microbial groups, are utilized to predict microbial competition as well as community structure. Additionally, the 'causality graph analysis' module incorporated in TIME allows predicting taxa that might have a higher influence on community structure in different conditions. TIME also allows users to easily identify potential taxonomic markers from a longitudinal microbiome analysis. We illustrate the utility of the web-server features on a few published time series microbiome data and demonstrate the
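    The stationarity and causality tests that TIME exposes are standard time series methods; a minimal statsmodels sketch on synthetic abundance series might look like this.

    ```python
    # Dickey-Fuller and Granger causality tests on synthetic abundances.
    import numpy as np
    from statsmodels.tsa.stattools import adfuller, grangercausalitytests

    rng = np.random.default_rng(0)
    taxon_a = rng.normal(size=200).cumsum()               # random walk: non-stationary
    taxon_b = np.roll(taxon_a, 2) + rng.normal(size=200)  # lagged follower of taxon_a

    adf_stat, pvalue = adfuller(taxon_a)[:2]              # (augmented) Dickey-Fuller
    print(f"ADF p-value: {pvalue:.3f}")                   # high p => non-stationary

    # Does taxon_a help predict taxon_b? Test Granger causality up to lag 3.
    grangercausalitytests(np.column_stack([taxon_b, taxon_a]), maxlag=3)
    ```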

  15. a Web Api and Web Application Development for Dissemination of Air Quality Information

    Science.gov (United States)

    Şahin, K.; Işıkdağ, U.

    2017-11-01

    Various studies have been carried out since 2005 under the leadership of the Ministry of Environment and Urbanism of Turkey in order to observe the quality of air in Turkey, to develop new policies and to develop a sustainable air quality management strategy. For this reason, a national air quality monitoring network has been developed, providing air quality indices. Through this network, the quality of the air is continuously monitored, and an important information system has been constructed in order to take precautions to prevent dangerous situations. The biggest handicap of the network is data access: because of its proprietary structure, instant and time-series data acquisition and processing are difficult. Currently, there is no service offered by the current air quality monitoring system for exchanging information with third-party applications. Within the context of this work, a web service has been developed to enable location-based querying of current and past air quality data in Turkey. This web service is built with up-to-date and widely preferred technologies; in other words, an architecture was chosen with which applications can easily integrate. In the second phase of the study, a web-based application was developed to test the web service; this testing application can perform location-based acquisition of air quality data. This makes it possible to easily carry out operations such as screening and examination of an area in a given time frame, which cannot be done with the national monitoring network.
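    A location-based query endpoint of the kind described can be sketched in a few lines; the route, fields and nearest-station lookup below are purely illustrative and are not the actual service.

    ```python
    # Toy location-based air quality query service (Flask).
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # In-memory stand-in for the monitoring network: (lat, lon) -> record
    STATIONS = {
        (41.0, 29.0): {"station": "Istanbul-1", "aqi": 42, "pm10": 31.0},
        (39.9, 32.8): {"station": "Ankara-1", "aqi": 55, "pm10": 48.5},
    }

    @app.get("/airquality")
    def airquality():
        lat, lon = float(request.args["lat"]), float(request.args["lon"])
        # Nearest station by squared distance (good enough for a demo)
        key = min(STATIONS, key=lambda s: (s[0] - lat) ** 2 + (s[1] - lon) ** 2)
        return jsonify(STATIONS[key])

    if __name__ == "__main__":
        app.run()   # try GET /airquality?lat=41.01&lon=28.98
    ```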

  16. Development of web tools to disseminate space geodesy data-related products

    Science.gov (United States)

    Soudarin, Laurent; Ferrage, Pascale; Mezerette, Adrien

    2015-04-01

    In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented, on the web site of the International DORIS Service (IDS), a set of plot tools to interactively build and display time series of site positions, orbit residuals and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites and to access their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow comparing the time evolution of coordinates for collocated DORIS and GNSS stations, thanks to a collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). A database was created to improve the robustness and efficiency of the tools, with the objective of offering a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). The possibility of visualizing and comparing position time series from the four main space geodetic techniques (DORIS, GNSS, SLR and VLBI) is already under way at the French level: a dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we will describe the functionalities of these tools, and we will address some aspects of the time series (content, format).
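    The position time series plots these tools build interactively can be approximated offline. A hedged sketch follows, assuming a CSV export with epoch and north/east/up offsets; the file name and column names are invented.

    ```python
    # Plot a station-position time series (illustrative reconstruction).
    import pandas as pd
    import matplotlib.pyplot as plt

    # Assumed CSV layout: epoch plus north/east/up offsets in millimetres
    df = pd.read_csv("doris_station_positions.csv", parse_dates=["epoch"])

    fig, axes = plt.subplots(3, 1, sharex=True, figsize=(8, 6))
    for ax, comp in zip(axes, ["north_mm", "east_mm", "up_mm"]):
        ax.plot(df["epoch"], df[comp], ".", markersize=3)
        ax.set_ylabel(comp)
    axes[-1].set_xlabel("epoch")
    fig.suptitle("Station position time series (illustrative)")
    plt.show()
    ```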

  17. Aaron Swartz's the programmable web an unfinished work

    CERN Document Server

    Swartz, Aaron

    2013-01-01

    This short work is the first draft of a book manuscript by Aaron Swartz written for the series "Synthesis Lectures on the Semantic Web" at the invitation of its editor, James Hendler. Unfortunately, the book wasn't completed before Aaron's death in January 2013. As a tribute, the editor and publisher are publishing the work digitally without cost. From the author's introduction: ". . . we will begin by trying to understand the architecture of the Web -- what it got right and, occasionally, what it got wrong, but most importantly why it is the way it is. We will learn how it allows both users

  18. Planktonic food web structure at a coastal time-series site: I. Partitioning of microbial abundances and carbon biomass

    Science.gov (United States)

    Caron, David A.; Connell, Paige E.; Schaffner, Rebecca A.; Schnetzer, Astrid; Fuhrman, Jed A.; Countway, Peter D.; Kim, Diane Y.

    2017-03-01

    Biogeochemistry in marine plankton communities is strongly influenced by the activities of microbial species. Understanding the composition and dynamics of these assemblages is essential for modeling emergent community-level processes, yet few studies have examined all of the biological assemblages present in the plankton, and benchmark data of this sort from time-series studies are rare. Abundance and biomass of the entire microbial assemblage and mesozooplankton (>200 μm) were determined vertically, monthly and seasonally over a 3-year period at a coastal time-series station in the San Pedro Basin off the southwestern coast of the USA. All compartments of the planktonic community were enumerated (viruses in the femtoplankton size range [0.02-0.2 μm], bacteria + archaea and cyanobacteria in the picoplankton size range [0.2-2.0 μm], phototrophic and heterotrophic protists in the nanoplanktonic [2-20 μm] and microplanktonic [20-200 μm] size ranges, and mesozooplankton [>200 μm]). Carbon biomass of each category was estimated using standard conversion factors. Plankton abundances varied over seven orders of magnitude across all categories, and total carbon biomass averaged approximately 60 μg C l-1 in surface waters of the 890 m water column over the study period. Bacteria + archaea comprised the single largest component of biomass (>1/3 of the total), with the sum of phototrophic protistan biomass making up a similar proportion. Temporal variability at this subtropical station was not dramatic. Monthly depth-specific and depth-integrated biomass varied 2-fold at the station, while seasonal variances were generally web structure and function at this coastal observatory.

  19. Safety and efficacy of aneurysm treatment with WEB in the cumulative population of three prospective, multicenter series

    DEFF Research Database (Denmark)

    Pierot, Laurent; Moret, Jacques; Barreau, Xavier

    2018-01-01

    BACKGROUND: Flow disruption with the WEB is an innovative endovascular approach for treatment of wide-neck bifurcation aneurysms. Initial studies have shown a low complication rate with good efficacy. PURPOSE: To report clinical and anatomical results of the WEB treatment in the cumulative popula...

  20. Web components and the semantic web

    OpenAIRE

    Casey, Maire; Pahl, Claus

    2003-01-01

    Component-based software engineering on the Web differs from traditional component and software engineering. We investigate Web component engineering activities that are crucial for the development, composition, and deployment of components on the Web. The current Web Services and Semantic Web initiatives strongly influence our work. Focussing on Web component composition we develop description and reasoning techniques that support a component developer in the composition activities, focussing...

  1. ‘TIME’: A Web Application for Obtaining Insights into Microbial Ecology Using Longitudinal Microbiome Data

    Directory of Open Access Journals (Sweden)

    Krishanu D. Baksi

    2018-01-01

    Full Text Available Realization of the importance of microbiome studies, coupled with the decreasing sequencing cost, has led to the exponential growth of microbiome data. A number of these microbiome studies have focused on understanding changes in the microbial community over time. Such longitudinal microbiome studies have the potential to offer unique insights pertaining to the microbial social networks as well as their responses to perturbations. In this communication, we introduce a web-based framework called ‘TIME’ (‘Temporal Insights into Microbial Ecology’), developed specifically to obtain meaningful insights from microbiome time series data. The TIME web-server is designed to accept a wide range of popular formats as input, with options to preprocess and filter the data. Multiple samples, defined by a series of longitudinal time points along with their metadata information, can be compared in order to interactively visualize the temporal variations. In addition to standard microbiome data analytics, the web server implements popular time series analysis methods such as dynamic time warping, Granger causality and the Dickey-Fuller test to generate interactive layouts that facilitate easy biological inferences. Apart from this, a new metric for comparing metagenomic time series data has been introduced to effectively visualize the similarities/differences in the trends of the resident microbial groups. The visualizations, augmented with stationarity information pertaining to the microbial groups, are utilized to predict microbial competition as well as community structure. Additionally, the ‘causality graph analysis’ module incorporated in TIME allows predicting taxa that might have a higher influence on community structure in different conditions. TIME also allows users to easily identify potential taxonomic markers from a longitudinal microbiome analysis. We illustrate the utility of the web-server features on a few published time series microbiome

  2. WebVis: a hierarchical web homepage visualizer

    Science.gov (United States)

    Renteria, Jose C.; Lodha, Suresh K.

    2000-02-01

    WebVis, the Hierarchical Web Home Page Visualizer, is a tool for managing home web pages. The user can access this tool via the WWW and obtain a hierarchical visualization of their home web pages. WebVis is a real-time interactive tool that supports many different queries on the statistics of internal files, such as size, age, and type. In addition, statistics on embedded information such as VRML files, Java applets, images and sound files can be extracted and queried. The results of these queries are visualized using the color, shape and size of different nodes of the hierarchy. The visualization assists the user in a variety of tasks, such as quickly finding outdated information or locating large files. WebVis is one solution to the growing web space maintenance problem. The implementation of WebVis is realized with Perl and Java. Perl pattern matching and file handling routines are used to collect and process web space linkage information and web document information. Java utilizes the collected information to produce a visualization of the web space. Java also provides WebVis with real-time interactivity while running off the WWW. Some WebVis examples of home web page visualization are presented.

  3. A WEB API AND WEB APPLICATION DEVELOPMENT FOR DISSEMINATION OF AIR QUALITY INFORMATION

    Directory of Open Access Journals (Sweden)

    K. Şahin

    2017-11-01

    Full Text Available Various studies have been carried out since 2005 under the leadership of the Ministry of Environment and Urbanism of Turkey in order to observe the quality of air in Turkey, to develop new policies and to develop a sustainable air quality management strategy. For this reason, a national air quality monitoring network has been developed, providing air quality indices. Through this network, the quality of the air is continuously monitored, and an important information system has been constructed in order to take precautions to prevent dangerous situations. The biggest handicap of the network is data access: because of its proprietary structure, instant and time-series data acquisition and processing are difficult. Currently, there is no service offered by the current air quality monitoring system for exchanging information with third-party applications. Within the context of this work, a web service has been developed to enable location-based querying of current and past air quality data in Turkey. This web service is built with up-to-date and widely preferred technologies; in other words, an architecture was chosen with which applications can easily integrate. In the second phase of the study, a web-based application was developed to test the web service; this testing application can perform location-based acquisition of air quality data. This makes it possible to easily carry out operations such as screening and examination of an area in a given time frame, which cannot be done with the national monitoring network.

  4. Accelerated Creep Testing of High Strength Aramid Webbing

    Science.gov (United States)

    Jones, Thomas C.; Doggett, William R.; Stnfield, Clarence E.; Valverde, Omar

    2012-01-01

    A series of preliminary accelerated creep tests were performed on four variants of 12K and 24K lbf rated Vectran webbing to help develop an accelerated creep test methodology and analysis capability for high strength aramid webbings. The variants included pristine, aged, folded and stitched samples. This class of webbings is used in the restraint layer of habitable, inflatable space structures, for which the lifetime properties are currently not well characterized. The Stepped Isothermal Method was used to accelerate the creep life of the webbings and a novel stereo photogrammetry system was used to measure the full-field strains. A custom MATLAB code is described, and used to reduce the strain data to produce master creep curves for the test samples. Initial results show good correlation between replicates; however, it is clear that a larger number of samples are needed to build confidence in the consistency of the results. It is noted that local fiber breaks affect the creep response in a similar manner to increasing the load, thus raising the creep rate and reducing the time to creep failure. The stitched webbings produced the highest variance between replicates, due to the combination of higher local stresses and thread-on-fiber damage. Large variability in the strength of the webbings is also shown to have an impact on the range of predicted creep life.
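    The core of the Stepped Isothermal Method reduction is shifting the creep segments measured at successive temperature steps along the log-time axis until they overlap into a single master curve. The sketch below uses synthetic data and a deliberately simple alignment rule (matching strain levels); it stands in for, and does not reproduce, the authors' MATLAB reduction.

    ```python
    # Schematic SIM reduction: shift each creep segment in log-time so its
    # starting strain lines up with the master curve built so far.
    import numpy as np

    def build_master_curve(segments):
        """segments: list of (time_s, strain) arrays, one per temperature step."""
        logt, strain = np.log10(segments[0][0]), segments[0][1].copy()
        for t_seg, e_seg in segments[1:]:
            # Creep strain is monotonic, so invert strain -> log-time by interpolation
            target_logt = np.interp(e_seg[0], strain, logt)
            shift = target_logt - np.log10(t_seg[0])   # horizontal shift factor
            logt = np.concatenate([logt, np.log10(t_seg) + shift])
            strain = np.concatenate([strain, e_seg])
        return logt, strain

    # Two toy segments: the hotter step creeps faster from a higher strain.
    t = np.linspace(10, 1000, 50)
    seg1 = (t, 0.010 + 0.002 * np.log10(t))
    seg2 = (t, seg1[1][-1] + 0.004 * np.log10(t / t[0]))
    logt, strain = build_master_curve([seg1, seg2])
    print(f"master curve spans 10^{logt[0]:.1f} to 10^{logt[-1]:.1f} seconds")
    ```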

  5. Business use of the World Wide Web: a report on further investigations

    Directory of Open Access Journals (Sweden)

    Hooi-Im Ng

    1998-01-01

    Full Text Available As a continuation of a previous study, this paper reports on a series of studies into business use of the World Wide Web and, more generally, the Internet. The use of the World Wide Web as a business tool has increased rapidly over the past three years; the benefits of the World Wide Web to business and customers are discussed, together with the barriers that hold back the future development of electronic commerce. As with the previous study, we report on a desk survey of 300 randomly selected business Web sites and on the results of an electronic mail questionnaire sent to the sample companies. An extended version of this paper has been submitted to the International Journal of Information Management

  6. Strategies to address participant misrepresentation for eligibility in Web-based research.

    Science.gov (United States)

    Kramer, Jessica; Rubin, Amy; Coster, Wendy; Helmuth, Eric; Hermos, John; Rosenbloom, David; Moed, Rich; Dooley, Meghan; Kao, Ying-Chia; Liljenquist, Kendra; Brief, Deborah; Enggasser, Justin; Keane, Terence; Roy, Monica; Lachowicz, Mark

    2014-03-01

    Emerging methodological research suggests that the World Wide Web ("Web") is an appropriate venue for survey data collection, and a promising area for delivering behavioral intervention. However, the use of the Web for research raises concerns regarding sample validity, particularly when the Web is used for recruitment and enrollment. The purpose of this paper is to describe the challenges experienced in two different Web-based studies in which participant misrepresentation threatened sample validity: a survey study and an online intervention study. The lessons learned from these experiences generated three types of strategies researchers can use to reduce the likelihood of participant misrepresentation for eligibility in Web-based research. Examples of procedural/design strategies, technical/software strategies and data analytic strategies are provided along with the methodological strengths and limitations of specific strategies. The discussion includes a series of considerations to guide researchers in the selection of strategies that may be most appropriate given the aims, resources and target population of their studies. Copyright © 2014 John Wiley & Sons, Ltd.

  7. Python 3 Web Development Beginner's Guide

    CERN Document Server

    Anders, Michel

    2011-01-01

    Part of Packt's Beginner's Guide Series, this book follows a sample application, with lots of screenshots, to help you get to grips with the techniques as quickly as possible. Moderately experienced Python programmers who want to learn how to create fairly complex, database-driven, cross browser compatible web apps that are maintainable and look good will find this book of most use. All key technologies except for Python 3 are explained in detail.

  8. [Improving vaccination social marketing by monitoring the web].

    Science.gov (United States)

    Ferro, A; Bonanni, P; Castiglia, P; Montante, A; Colucci, M; Miotto, S; Siddu, A; Murrone, L; Baldo, V

    2014-01-01

    Immunisation is one of the most important and cost-effective interventions in Public Health because of its significant positive impact on population health. However, since Jenner's discovery there has always been a lively debate between supporters and opponents of vaccination. Today the anti-vaccination movement spreads its message mostly on the web, disseminating inaccurate data through blogs and forums and increasing vaccine rejection. In this context, the Società Italiana di Igiene (SItI) created a web project in order to fight misinformation on the web regarding vaccinations through a series of information tools, including scientific articles, educational information, and video and multimedia presentations. The web portal (http://www.vaccinarsi.org) was published in May 2013, and over one hundred web pages related to vaccinations are now available. Recently a forum, a periodic newsletter and a Twitter page have been created. There has been an average of 10,000 hits per month. Currently our users are mostly healthcare professionals. The visibility of the site is very good: it currently ranks first in Google's search engine when typing the word "vaccinarsi". The results of the first four months of activity are extremely encouraging and show the importance of this project; furthermore, an application for quality certification by independent international organizations has been submitted.

  9. Web-based control application using WebSocket

    International Nuclear Information System (INIS)

    Furukawa, Y.

    2012-01-01

    The WebSocket allows asynchronous full-duplex communication between a Web-based (i.e. JavaScript-based) application and a Web server. WebSocket started as a part of the HTML5 standardization but has since been separated from HTML5 and developed independently. Using WebSocket, it becomes easy to develop platform-independent presentation-layer applications for accelerator and beamline control software. In addition, a Web browser is the only application program that needs to be installed on the client computer. WebSocket-based applications communicate with the WebSocket server using simple text-based messages, so WebSocket is applicable to message-based control systems like MADOCA, which was developed for the SPring-8 control system. A simple WebSocket server for the MADOCA control system and a simple motor control application were successfully made as a first trial of a WebSocket control application. Using Google Chrome (version 13.0) on Debian/Linux and Windows 7, Opera (version 11.0) on Debian/Linux and Safari (version 5.0.3) on Mac OS X as clients, the motors could be controlled using a WebSocket-based Web application. A diffractometer control application for use in synchrotron radiation diffraction experiments was also developed. (author)
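    On the client side, such an application reduces to a few lines of asynchronous WebSocket code. A minimal Python sketch follows; the URI and the text command format are invented here, since the abstract does not reproduce MADOCA's actual message syntax.

    ```python
    # Minimal WebSocket client in the spirit of the motor-control trial.
    import asyncio
    import websockets   # pip install websockets

    async def move_motor():
        uri = "ws://localhost:8765/control"        # placeholder server
        async with websockets.connect(uri) as ws:
            await ws.send("motor1 move_abs 12.5")  # hypothetical command text
            reply = await ws.recv()                # full duplex: server pushes result
            print("server replied:", reply)

    asyncio.run(move_motor())
    ```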

  10. Beginning ASPNET Web Pages with WebMatrix

    CERN Document Server

    Brind, Mike

    2011-01-01

    Learn to build dynamic web sites with Microsoft WebMatrix Microsoft WebMatrix is designed to make developing dynamic ASP.NET web sites much easier. This complete Wrox guide shows you what it is, how it works, and how to get the best from it right away. It covers all the basic foundations and also introduces HTML, CSS, and Ajax using jQuery, giving beginning programmers a firm foundation for building dynamic web sites.Examines how WebMatrix is expected to become the new recommended entry-level tool for developing web sites using ASP.NETArms beginning programmers, students, and educators with al

  11. Non-visual Web Browsing: Beyond Web Accessibility.

    Science.gov (United States)

    Ramakrishnan, I V; Ashok, Vikas; Billah, Syed Masum

    2017-07-01

    People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant contents. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability.

  12. The generation of large networks from web of science data

    NARCIS (Netherlands)

    Leydesdorff, L.; Khan, G.F.; Bornmann, L.

    2014-01-01

    During the 1990s, one of us developed a series of freeware routines (http://www.leydesdorff.net/indicators) that enable the user to organize downloads from the Web of Science (Thomson Reuters) into a relational database, and then to export matrices for further analysis in various formats (for

  13. NutriSonic web expert system for meal management and nutrition counseling with nutrient time-series analysis, e-food exchange and easy data transition.

    Science.gov (United States)

    Hong, Soon-Myung; Cho, Jee-Ye; Lee, Jin-Hee; Kim, Gon; Kim, Min-Chan

    2008-01-01

    This study was conducted to develop the NutriSonic Web Expert System for meal management and nutrition counseling, with analysis of the user's nutritive changes over selected days, food exchange information, and easy data transition. This program manipulates the food, menu, meal and search databases that have been developed. The system also provides a function to check the user's nutritive changes over selected days. Users can select recommended general and therapeutic menus using this system. NutriSonic can analyze nutrients and e-food exchange ("e" refers to the food exchange database calculated by a computer program) in menus and meals. The expert can insert and store a meal database and generate synthetic information on age, sex and the therapeutic purpose of a disease. Following investigation and analysis of users' needs, the meal planning program on the internet has been continuously developed. Users are able to follow their nutritive changes with nutrient information and the ratio of the 3 major energy nutrients. Users can also download data in other formats, such as Excel files (.xls), for analysis and to verify their nutrient time-series analysis. The results of analysis are presented quickly and accurately. Therefore it can be used not only by the general public, but also by dietitians and nutritionists in charge of menu planning, and by experts in the field of food and nutrition. It is expected that the NutriSonic Web Expert System can be useful for nutrition education, nutrition counseling and expert meal management.

  14. Web Caching

    Indian Academy of Sciences (India)

    leveraged through Web caching technology. Specifically, Web caching becomes an ... Web routing can improve the overall performance of the Internet. Web caching is similar to memory system caching - a Web cache stores Web resources in ...
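    The memory-cache analogy can be made concrete with a toy least-recently-used (LRU) web cache; the sketch below is illustrative only.

    ```python
    # Toy LRU Web cache: keep recently requested resources, evict the
    # least recently used one when capacity is exceeded.
    from collections import OrderedDict

    class WebCache:
        def __init__(self, capacity=3):
            self.capacity = capacity
            self.store = OrderedDict()          # url -> response body

        def get(self, url, fetch):
            if url in self.store:
                self.store.move_to_end(url)     # mark as recently used
                return self.store[url]          # hit: no network round trip
            body = fetch(url)                   # miss: fetch from origin server
            self.store[url] = body
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)  # evict least recently used
            return body

    cache = WebCache()
    print(cache.get("http://example.org/a", lambda u: f"<body of {u}>"))
    print(cache.get("http://example.org/a", lambda u: "never called on a hit"))
    ```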

  15. Análisis de estructura de sitios web: el caso de las bibliotecas universitarias andaluzas

    OpenAIRE

    Arellano Pardo, María del Carmen; Rodríguez Mateos, David; Nogales Flores, Tomás; Hernández Pérez, Antonio

    1999-01-01

    An analysis of the structure of the web sites of the university libraries of the eight Andalusian universities is presented. A series of variables and indicators needed to measure various aspects of these web sites is defined, among them indices of hypertextuality, interconnection and openness, which relate the number and type of links to the number of pages. For the exploration of the various web servers on which the pages of these librar...

  16. A web implementation: the good and the not-so-good.

    Science.gov (United States)

    Bergsneider, C; Piraino, D; Fuerst, M

    2001-06-01

    E-commerce, e-mail, e-greeting, e-this, and e-that: everywhere you turn there is a new "e" word for an internet or Web application. We, at the Cleveland Clinic Foundation, have been "e-nlightened" and will discuss in this report the implementation of a web-based radiology information system (RIS) in our radiology division, or "e-radiology" division. The application, IDXRad Version 10.0 from IDX Corp, Burlington, VT, is in use at the Cleveland Clinic Foundation and has both intranet (for use in Radiology) and internet (referring physician viewing) modules. We will concentrate on the features of using a web browser for the application's front-end, including easy prototyping for screen review, easier mock-ups of demonstrations by vendors and developers, and easier training as more people become web-addicted. Project communication can be facilitated with an internal project web page, and use of the web browser can accommodate quicker turnaround of software upgrades, as the software code is centrally located. Compared with other technologies, including client/server, there is a smaller rollout cost when using a standard web browser. However, the new technology requires change, and changes are never implemented without challenges. A seasoned technologist using a legacy system can enter data more quickly using function keys than using a graphical user interface and pointing and clicking through a series of pop-up windows. Also, effective use of a web browser depends on intuitive design for it to be easily implemented and accepted by the user. Some software packages will not work on both of the popular web browsers, and some are tailored to specific release levels. As computer-based patient records become standard, patient confidentiality must be enforced. The technical design and application security features that support the web-based software package will be discussed. Also, web technologies have their own implementation issues.

  17. WebSelF: A Web Scraping Framework

    DEFF Research Database (Denmark)

    Thomsen, Jakob; Ernst, Erik; Brabrand, Claus

    2012-01-01

    We present WebSelF, a framework for web scraping which models the process of web scraping and decomposes it into four conceptually independent, reusable, and composable constituents. We have validated our framework through a full parameterized implementation that is flexible enough to capture previous work on web scraping. We have experimentally evaluated our framework and implementation in an experiment that evaluated several qualitatively different web scraping constituents (including previous work and combinations hereof) on about 11,000 HTML pages on daily versions of 17 web sites over a period of more than one year. Our framework solves three concrete problems with current web scraping and our experimental results indicate that composition of previous and our new techniques achieve a higher degree of accuracy, precision and specificity than existing techniques alone.

  18. Analysis and visualization of Arabidopsis thaliana GWAS using web 2.0 technologies.

    Science.gov (United States)

    Huang, Yu S; Horton, Matthew; Vilhjálmsson, Bjarni J; Seren, Umit; Meng, Dazhe; Meyer, Christopher; Ali Amer, Muhammad; Borevitz, Justin O; Bergelson, Joy; Nordborg, Magnus

    2011-01-01

    With large-scale genomic data becoming the norm in biological studies, the storing, integrating, viewing and searching of such data have become a major challenge. In this article, we describe the development of an Arabidopsis thaliana database that hosts the geographic information and genetic polymorphism data for over 6000 accessions and genome-wide association study (GWAS) results for 107 phenotypes, representing the largest collection of Arabidopsis polymorphism data and GWAS results to date. Taking advantage of a series of the latest web 2.0 technologies, such as Ajax (Asynchronous JavaScript and XML), GWT (Google Web Toolkit), the MVC (Model-View-Controller) web framework and an object-relational mapper, we have created a web-based application (web app) for the database that offers an integrated and dynamic view of geographic information, genetic polymorphism and GWAS results. Essential search functionalities are incorporated into the web app to aid reverse genetics research. The database and its web app have proven to be a valuable resource to the Arabidopsis community. The whole framework serves as an example of how biological data, especially GWAS, can be presented and accessed through the web. In the end, we illustrate the potential to gain new insights through the web app with two examples, showcasing how it can be used to facilitate forward and reverse genetics research. Database URL: http://arabidopsis.usc.edu/

  19. 07051 Executive Summary -- Programming Paradigms for the Web: Web Programming and Web Services

    OpenAIRE

    Hull, Richard; Thiemann, Peter; Wadler, Philip

    2007-01-01

    The world-wide web raises a variety of new programming challenges. To name a few: programming at the level of the web browser, data-centric approaches, and attempts to automatically discover and compose web services. This seminar brought together researchers from the web programming and web services communities and strove to engage them in communication with each other. The seminar was held in an unusual style, in a mixture of short presentations and in-depth discussio...

  20. Semantic Web Requirements through Web Mining Techniques

    OpenAIRE

    Hassanzadeh, Hamed; Keyvanpour, Mohammad Reza

    2012-01-01

    In recent years, Semantic web has become a topic of active research in several fields of computer science and has applied in a wide range of domains such as bioinformatics, life sciences, and knowledge management. The two fast-developing research areas semantic web and web mining can complement each other and their different techniques can be used jointly or separately to solve the issues in both areas. In addition, since shifting from current web to semantic web mainly depends on the enhance...

  1. Prey interception drives web invasion and spider size determines successful web takeover in nocturnal orb-web spiders.

    Science.gov (United States)

    Gan, Wenjin; Liu, Shengjie; Yang, Xiaodong; Li, Daiqin; Lei, Chaoliang

    2015-09-24

    A striking feature of web-building spiders is the use of silk to make webs, mainly for prey capture. However, building a web is energetically expensive and increases the risk of predation. To reduce such costs and still have access to abundant prey, some web-building spiders have evolved web invasion behaviour. In general, no consistent patterns of web invasion have emerged and the factors determining web invasion remain largely unexplored. Here we report web invasion among conspecifics in seven nocturnal species of orb-web spiders, and examined the factors determining the probability of webs that could be invaded and taken over by conspecifics. About 36% of webs were invaded by conspecifics, and 25% of invaded webs were taken over by the invaders. A web that was built higher and intercepted more prey was more likely to be invaded. Once a web was invaded, the smaller the size of the resident spider, the more likely its web would be taken over by the invader. This study suggests that web invasion, as a possible way of reducing costs, may be widespread in nocturnal orb-web spiders. © 2015. Published by The Company of Biologists Ltd.

  2. Prey interception drives web invasion and spider size determines successful web takeover in nocturnal orb-web spiders

    Directory of Open Access Journals (Sweden)

    Wenjin Gan

    2015-10-01

    Full Text Available A striking feature of web-building spiders is the use of silk to make webs, mainly for prey capture. However, building a web is energetically expensive and increases the risk of predation. To reduce such costs and still have access to abundant prey, some web-building spiders have evolved web invasion behaviour. In general, no consistent patterns of web invasion have emerged and the factors determining web invasion remain largely unexplored. Here we report web invasion among conspecifics in seven nocturnal species of orb-web spiders, and examined the factors determining the probability of webs that could be invaded and taken over by conspecifics. About 36% of webs were invaded by conspecifics, and 25% of invaded webs were taken over by the invaders. A web that was built higher and intercepted more prey was more likely to be invaded. Once a web was invaded, the smaller the size of the resident spider, the more likely its web would be taken over by the invader. This study suggests that web invasion, as a possible way of reducing costs, may be widespread in nocturnal orb-web spiders.

  3. Using Open Web APIs in Teaching Web Mining

    Science.gov (United States)

    Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju

    2009-01-01

    With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…

  4. Correct software in web applications and web services

    CERN Document Server

    Thalheim, Bernhard; Prinz, Andreas; Buchberger, Bruno

    2015-01-01

    The papers in this volume aim at obtaining a common understanding of the challenging research questions in web applications comprising web information systems, web services, and web interoperability; obtaining a common understanding of verification needs in web applications; achieving a common understanding of the available rigorous approaches to system development, and the cases in which they have succeeded; identifying how rigorous software engineering methods can be exploited to develop suitable web applications; and at developing a European-scale research agenda combining theory, methods a

  5. A web service system supporting three-dimensional post-processing of medical images based on WADO protocol.

    Science.gov (United States)

    He, Longjun; Xu, Lang; Ming, Xing; Liu, Qian

    2015-02-01

    Three-dimensional post-processing operations on the volume data generated by a series of CT or MR images have important significance for image reading and diagnosis. As a part of the DICOM standard, the WADO service defines how to access DICOM objects on the Web, but it does not cover three-dimensional post-processing operations on the series images. This paper analyzed the technical features of three-dimensional post-processing operations on volume data, and then designed and implemented a web service system for three-dimensional post-processing of medical images based on the WADO protocol. In order to improve the scalability of the proposed system, the business tasks and calculation operations were separated into two modules. As a result, it was shown that the proposed system could provide three-dimensional post-processing services of medical images to multiple clients at the same time, meeting the demand for accessing three-dimensional post-processing operations on volume data over the web.
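    For context, retrieving one object of a series via a WADO request looks roughly as follows; the parameter names follow the WADO-URI specification (DICOM PS3.18), while the host and UIDs are placeholders.

    ```python
    # Fetch a single DICOM object via a WADO-URI request.
    import requests

    resp = requests.get(
        "https://pacs.example.org/wado",    # placeholder WADO endpoint
        params={
            "requestType": "WADO",
            "studyUID": "1.2.840.0000.1",   # placeholder study instance UID
            "seriesUID": "1.2.840.0000.2",  # placeholder series instance UID
            "objectUID": "1.2.840.0000.3",  # placeholder SOP instance UID
            "contentType": "application/dicom",
        },
        timeout=60,
    )
    resp.raise_for_status()
    with open("slice.dcm", "wb") as f:
        f.write(resp.content)               # one slice of the series to post-process
    ```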

  6. Laboratorio Web SCORM de Control PID con Integración Avanzada

    Directory of Open Access Journals (Sweden)

    Ildefonso Ruano Ruano

    2016-10-01

    Full Text Available Abstract: Web laboratories (WebLabs) are increasingly used resources in technical university degrees. When they are presented integrated in a learning management system (LMS), a series of advantages for students and teachers is obtained, notably the fact of being shown in a familiar environment and the possibility of personalizing the experience thanks to the user identification performed by the LMS. This work presents a WebLab on Proportional-Integral-Derivative (PID) control, a fundamental topic of automatic control subjects found in all Industrial Engineering degrees. This WebLab has been developed with an innovative methodology that yields an effective learning resource based on a SCORM (Shared Content Object Reference Model) package. SCORM is the most widely used e-learning content standard and is compatible with most LMSs on the market, which allows the WebLab to be easily reused in different LMS environments. The WebLab contains a learning plan that includes a series of teaching resources, such as PID control theory, assessment tests, a virtual laboratory for PID control of a DC motor, and experiments personalized for each student whose results are stored in the LMS. This WebLab was offered in the institutional LMS of the University of Jaén to 340 students of the subject "Automática Industrial" in the 2014-15 academic year. The usage data have allowed several evaluations which show that the students who completed it achieved excellent performance in the WebLab itself, obtained results far better than those of the other students in the final evaluation of the subject, and rated it very positively. The reusability of the WebLab in different LMSs has also been demonstrated
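    As a worked illustration of the control content the WebLab teaches, here is a minimal discrete PID loop driving a toy first-order DC motor model; the gains and plant constants are invented and are not the laboratory's actual values.

    ```python
    # Discrete PID control of a toy first-order DC motor model.
    dt, k_p, k_i, k_d = 0.01, 2.0, 1.0, 0.05
    tau, gain = 0.5, 1.0          # plant: tau * dw/dt + w = gain * u

    setpoint, speed = 100.0, 0.0  # target and current speed (rad/s)
    integral, prev_error = 0.0, setpoint - speed

    for _ in range(500):          # simulate 5 seconds
        error = setpoint - speed
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = k_p * error + k_i * integral + k_d * derivative  # PID control law
        speed += dt * (gain * u - speed) / tau               # Euler plant update
        prev_error = error

    print(f"speed after 5 s: {speed:.1f} rad/s (setpoint {setpoint})")
    ```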

  7. Semantic Web Technologies for the Adaptive Web

    DEFF Research Database (Denmark)

    Dolog, Peter; Nejdl, Wolfgang

    2007-01-01

    Ontologies and reasoning are the key terms brought into focus by the semantic web community. Formal representation of ontologies in a common data model on the web can be taken as a foundation for adaptive web technologies as well. This chapter describes how ontologies shared on the semantic web...... provide conceptualization for the links which are a main vehicle to access information on the web. The subject domain ontologies serve as constraints for generating only those links which are relevant for the domain a user is currently interested in. Furthermore, user model ontologies provide additional...... means for deciding which links to show, annotate, hide, generate, and reorder. The semantic web technologies provide means to formalize the domain ontologies and metadata created from them. The formalization enables reasoning for personalization decisions. This chapter describes which components...

  8. Usare WebDewey

    OpenAIRE

    Baldi, Paolo

    2016-01-01

    This presentation shows how to use the WebDewey tool. Features of WebDewey. Italian WebDewey compared with American WebDewey. Querying Italian WebDewey. Italian WebDewey and MARC21. Italian WebDewey and UNIMARC. Numbers, captions, "equivalente verbale": Dewey decimal classification in Italian catalogues. Italian WebDewey and Nuovo soggettario. Italian WebDewey and LCSH. Italian WebDewey compared with printed version of Italian Dewey Classification (22. edition): advantages and disadvantages o...

  9. Web TA Production (WebTA)

    Data.gov (United States)

    US Agency for International Development — WebTA is a web-based time and attendance system that supports USAID payroll administration functions, and is designed to capture hours worked, leave used and...

  10. Politiken, Alt om Ikast Brande (web), Lemvig Folkeblad (Web), Politiken (web), Dagbladet Ringkjøbing Skjern (web)

    DEFF Research Database (Denmark)

    Lauritsen, Jens

    2014-01-01

    Politiken 01.01.2014 14:16 The Danes saw in the New Year with a bang, but for a few people things went wrong when the New Year's fireworks were lit. Emergency departments treated 73 people for fireworks injuries between 6 p.m. last night and 6 a.m. this morning. This is shown by a count made by Politiken on the basis of figures from the Accident Analysis Group (Ulykkes Analyse Gruppen) at Odense University Hospital. The article also appeared in: Alt om Ikast Brande (web), Lemvig Folkeblad (web), Politiken (web), Dagbladet Ringkjøbing Skjern (web).

  11. Use of a web site to enhance criticality safety training

    International Nuclear Information System (INIS)

    Huang, Song T.; Morman, James A.

    2003-01-01

    Establishment of the NCSP (Nuclear Criticality Safety Program) website represents one attempt by the NCS (Nuclear Criticality Safety) community to meet the need to enhance communication and disseminate NCS information to a wider audience. With the aging work force in this important technical field, there is a common recognition of the need to capture the corporate knowledge of these people and provide an easily accessible, web-based training opportunity to those people just entering the field of criticality safety. A multimedia-based site can provide a wide range of possibilities for criticality safety training. Training modules could range from simple text-based material, similar to the NCSET (Nuclear Criticality Safety Engineer Training) modules, to interactive web-based training classes, to video lecture series. For example, the Los Alamos National Laboratory video series of interviews with pioneers of criticality safety could easily be incorporated into training modules. Obviously, the development of such a program depends largely upon the need and participation of experts who share the same vision and enthusiasm of training the next generation of criticality safety engineers. The NCSP website is just one example of the potential benefits that web-based training can offer. You are encouraged to browse the NCSP website at http://ncsp.llnl.gov. We solicit your ideas in the training of future NCS engineers and welcome your participation with us in developing future multimedia training modules. (author)

  12. Using Web Server Logs in Evaluating Instructional Web Sites.

    Science.gov (United States)

    Ingram, Albert L.

    2000-01-01

    Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…
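    As a concrete example of the kind of analysis described, the following sketch parses an access log in the NCSA Common Log Format and counts the most requested pages of an instructional site; the log file location is assumed.

    ```python
    # Count successful page requests in an Apache/NCSA Common Log Format log.
    import re
    from collections import Counter

    # host ident user [time] "METHOD path HTTP/x" status bytes
    LOG_RE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(\w+) (\S+) [^"]*" (\d{3}) \S+')

    hits = Counter()
    with open("access.log") as f:          # assumed log file location
        for line in f:
            m = LOG_RE.match(line)
            if m and m.group(3) == "200":  # successful requests only
                hits[m.group(2)] += 1

    for path, n in hits.most_common(10):
        print(f"{n:6d}  {path}")
    ```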

  13. WEB STRUCTURE MINING

    Directory of Open Access Journals (Sweden)

    CLAUDIA ELENA DINUCĂ

    2011-01-01

    Full Text Available The World Wide Web has become one of the most valuable resources for information retrieval and knowledge discovery due to the permanent increase in the amount of data available online. Given the dimensions of the web, users easily get lost in its rich hyper structure. The application of data mining methods is the right solution for knowledge discovery on the Web. The knowledge extracted from the Web can be used to improve the performance of Web information retrieval, question answering and Web-based data warehousing. In this paper, I provide an introduction to the categories of Web mining and I focus on one of them: Web structure mining. Web structure mining, one of the three categories of Web mining, is a tool used to identify the relationships between Web pages linked by information or direct link connections. It offers information about how different pages are linked together to form this huge web. Web structure mining finds hidden basic structures and uses hyperlinks for further web applications such as web search.
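    The hyperlink analyses that web structure mining builds on, such as PageRank for page importance and HITS for hubs and authorities, can be run in a few lines with networkx on a toy link graph:

    ```python
    # PageRank and HITS on a toy link graph.
    import networkx as nx

    links = nx.DiGraph([("home", "news"), ("home", "about"),
                        ("news", "home"), ("about", "home"),
                        ("blog", "home")])

    print(nx.pagerank(links, alpha=0.85))  # importance from link structure alone
    hubs, authorities = nx.hits(links)     # hubs point to good authorities
    print(authorities)
    ```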

  14. Web Engineering

    Energy Technology Data Exchange (ETDEWEB)

    White, Bebo

    2003-06-23

    Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.

  15. Web Accessibility in Romania: The Conformance of Municipal Web Sites to Web Content Accessibility Guidelines

    OpenAIRE

    Costin PRIBEANU; Ruxandra-Dora MARINESCU; Paul FOGARASSY-NESZLY; Maria GHEORGHE-MOISII

    2012-01-01

    The accessibility of public administration web sites is a key quality attribute for the successful implementation of the Information Society. The purpose of this paper is to present a second review of municipal web sites in Romania, based on automated accessibility checking. A total of 60 web sites were evaluated against the WCAG 2.0 recommendations. The analysis of results reveals a relatively low web accessibility of municipal web sites and highlights several aspects. Firstly, a slight ...

  16. 07051 Working Group Outcomes -- Programming Paradigms for the Web: Web Programming and Web Services

    OpenAIRE

    Hull, Richard; Thiemann, Peter; Wadler, Philip

    2007-01-01

    Participants in the seminar broke into groups on "Patterns and Paradigms" for web programming, "Web Services," "Data on the Web," "Software Engineering" and "Security." Here we give the raw notes recorded during these sessions.

  17. Web-ADARE: A Web-Aided Data Repairing System

    KAUST Repository

    Gu, Binbin; Li, Zhixu; Yang, Qiang; Xie, Qing; Liu, An; Liu, Guanfeng; Zheng, Kai; Zhang, Xiangliang

    2017-03-08

    Data repairing aims at discovering and correcting erroneous data in databases. In this paper, we develop Web-ADARE, an end-to-end web-aided data repairing system, to provide a feasible way to involve the vast data sources on the Web in data repairing. Our main attention in developing Web-ADARE is paid to the interaction problem between web-aided repairing and rule-based repairing, in order to minimize the Web consultation cost while reaching predefined quality requirements. The same interaction problem also exists in crowd-based methods but has not yet been formally defined and addressed. We first prove in theory that the optimal interaction scheme is not feasible to achieve, and then propose an algorithm that identifies an efficient interaction scheme by investigating the inconsistencies and the dependencies between values in the repairing process. Extensive experiments on three data collections demonstrate the high repairing precision and recall of Web-ADARE, and the efficiency of the generated interaction scheme over several baseline ones.

  18. Web Mining

    Science.gov (United States)

    Fürnkranz, Johannes

    The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to Web data and documents. This chapter provides a brief overview of web mining techniques and research areas, most notably hypertext classification, wrapper induction, recommender systems and web usage mining.

  19. La evaluación de las aplicaciones de lectura web: un paso más en el proceso de editorialización de la web

    Directory of Open Access Journals (Sweden)

    Noelia Patón Rodríguez

    2016-11-01

    Full Text Available The development of digital reading brings together a series of factors, among them the technological evolution of devices and the development of numerous applications that facilitate reading, while also causing changes in reading habits, as various reports demonstrate. One of these technological developments concerns the design of applications specifically for reading on the computer screen, the so-called web reading applications. In this article, after defining what web reading applications are, we explain the objectives and the process followed for the selection, definition and systematization of the parameters that will serve to evaluate web reading applications.

  1. Understanding User-Web Interactions via Web Analytics

    CERN Document Server

    Jansen, Bernard J

    2009-01-01

    This lecture presents an overview of the Web analytics process, with a focus on providing insight and actionable outcomes from collecting and analyzing Internet data. The lecture first provides an overview of Web analytics, providing in essence, a condensed version of the entire lecture. The lecture then outlines the theoretical and methodological foundations of Web analytics in order to make obvious the strengths and shortcomings of Web analytics as an approach. These foundational elements include the psychological basis in behaviorism and methodological underpinning of trace data as an empir

  2. The design and implementation of web mining in web sites security

    Science.gov (United States)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    Backdoors or information leaks of Web servers can be detected by using Web mining techniques on abnormal Web log and Web application log data. The security of Web servers can thus be enhanced and the damage of illegal access avoided. Firstly, a system for discovering the patterns of information leakage in CGI scripts from Web log data was proposed. Secondly, those patterns were provided to system administrators so that they can modify their code and enhance their Web site security. The following aspects were described: one is to combine the web application log with the web log to extract more information, so that web data mining can be used to mine the web log and discover information that a firewall and an Intrusion Detection System cannot find. Another is to propose an operation module for the web site to enhance Web site security. In the cluster server session, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
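
    As a concrete illustration (not the authors' system), the sketch below scans already-parsed log entries, combining the request path from the web log with the response status, and flags hosts whose requests match patterns commonly associated with probing of CGI scripts. The patterns and thresholds are invented for the sketch.

      # Sketch: flag hosts whose requests look like probing attempts.
      import re
      from collections import Counter

      SUSPICIOUS = [r"\.\./", r"/etc/passwd", r"%00", r"<script", r"cmd\.exe"]
      PATTERN = re.compile("|".join(SUSPICIOUS), re.IGNORECASE)

      def scan(entries):
          """entries: iterable of (host, request_path, status) tuples."""
          pattern_hits, error_hits = Counter(), Counter()
          for host, path, status in entries:
              if PATTERN.search(path):
                  pattern_hits[host] += 1
              if status >= 400:
                  error_hits[host] += 1
          # Illustrative thresholds: repeated suspicious paths or many errors.
          return [h for h in set(pattern_hits) | set(error_hits)
                  if pattern_hits[h] >= 3 or error_hits[h] >= 20]

      sample = [("10.0.0.5", "/cgi-bin/view.cgi?file=../../etc/passwd", 404)] * 3
      print(scan(sample))  # -> ['10.0.0.5']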

  3. FluDetWeb: an interactive web-based system for the early detection of the onset of influenza epidemics.

    Science.gov (United States)

    Conesa, David; López-Quílez, Antonio; Martínez-Beneito, Miguel Angel; Miralles, María Teresa; Verdejo, Francisco

    2009-07-29

    The early identification of influenza outbreaks has become a priority in public health practice. A large variety of statistical algorithms for the automated monitoring of influenza surveillance have been proposed, but most of them require not only a lot of computational effort but also operation of sometimes not-so-friendly software. In this paper, we introduce FluDetWeb, an implementation of a prospective influenza surveillance methodology based on a client-server architecture with a thin (web-based) client application design. Users can introduce and edit their own data consisting of a series of weekly influenza incidence rates. The system returns the probability of being in an epidemic phase (via e-mail if desired). When the probability is greater than 0.5, it also returns the probability of an increase in the incidence rate during the following week. The system also provides two complementary graphs. This system has been implemented using statistical free-software (R and WinBUGS), a web server environment for Java code (Tomcat) and a software module created by us (Rdp) responsible for managing internal tasks; the software package MySQL has been used to construct the database management system. The implementation is available on-line from: http://www.geeitema.org/meviepi/fludetweb/. The ease of use of FluDetWeb and its on-line availability can make it a valuable tool for public health practitioners who want to obtain information about the probability that their system is in an epidemic phase. Moreover, the architecture described can also be useful for developers of systems based on computationally intensive methods.
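
    FluDetWeb's actual model is Bayesian and implemented in R and WinBUGS; as a deliberately simplified stand-in, the sketch below computes a probability of being in an epidemic phase for a single weekly rate under an assumed two-component Gaussian model. All parameter values are invented for illustration.

      # Simplified stand-in: P(epidemic) for one weekly incidence rate
      # under assumed Gaussian phase distributions and a fixed prior.
      from math import exp, pi, sqrt

      def normal_pdf(x, mean, sd):
          return exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * sqrt(2 * pi))

      def p_epidemic(rate, mean0=50.0, sd0=20.0, mean1=250.0, sd1=80.0,
                     prior=0.1):
          non_epi = normal_pdf(rate, mean0, sd0) * (1 - prior)
          epi = normal_pdf(rate, mean1, sd1) * prior
          return epi / (non_epi + epi)

      for rate in (40, 120, 300):  # cases per 100,000, invented values
          print(rate, round(p_epidemic(rate), 3))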

  4. Normas de usabilidad de sitios web. Pautas de aplicación

    NARCIS (Netherlands)

    Burgos, Daniel; Ruiz-Mezcua, Belén

    2005-01-01

    In this article we present a series of basic rules to follow, and the related application guidelines, with the goal of effective and efficient use of web sites. The rules finally selected are the result of the study and research work carried out on more than one

  5. A tool for NDVI time series extraction from wide-swath remotely sensed images

    Science.gov (United States)

    Li, Zhishan; Shi, Runhe; Zhou, Cong

    2015-09-01

    Normalized Difference Vegetation Index (NDVI) is one of the most widely used indicators for monitoring vegetation coverage on the land surface. The time series features of NDVI are capable of reflecting dynamic changes of various ecosystems. Calculating NDVI from Moderate Resolution Imaging Spectrometer (MODIS) and other wide-swath remotely sensed images provides an important way to monitor the spatial and temporal characteristics of large-scale NDVI. However, difficulties still exist for ecologists in extracting such information correctly and efficiently, because of the problems in several professional processing steps on the original remote sensing images, including radiometric calibration, geometric correction, multiple data composition and curve smoothing. In this study, we developed an efficient and convenient online toolbox with a friendly graphical user interface for non-remote-sensing professionals who want to extract NDVI time series. Technically, it is based on Java Web and WebGIS. Moreover, the Struts, Spring and Hibernate frameworks (SSH) are integrated in the system for easy maintenance and expansion. Latitude, longitude and time period are the key inputs that users need to provide, and the NDVI time series are calculated automatically.
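
    The index itself is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red), applied per date once calibration and correction are done. Below is a minimal sketch over paired band time series; the reflectance values are invented for illustration.

      # NDVI time series from paired red and near-infrared reflectances.
      def ndvi_series(red, nir):
          values = []
          for r, n in zip(red, nir):
              total = n + r
              values.append((n - r) / total if total else float("nan"))
          return values

      red = [0.08, 0.07, 0.05, 0.04]  # red reflectance per date (invented)
      nir = [0.30, 0.35, 0.42, 0.45]  # near-infrared reflectance per date
      print([round(v, 3) for v in ndvi_series(red, nir)])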

  6. Web archives

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2018-01-01

    This article deals with general web archives and the principles for selection of materials to be preserved. It opens with a brief overview of reasons why general web archives are needed. Sections two and three present major, long-term web archive initiatives, discuss the purposes and possible values of web archives, and ask how to meet unknown future needs, demands and concerns. Section four analyses three main principles in contemporary web archiving strategies (topic-centric, domain-centric and time-centric archiving strategies) and section five discusses how to combine these to provide a broad and rich archive. Section six is concerned with inherent limitations and why web archives are always flawed. The last sections deal with the question of how web archives may fit into the rapidly expanding, but fragmented, landscape of digital repositories taking care of various parts...

  7. Time-Dependent Behavior of High-Strength Kevlar and Vectran Webbing

    Science.gov (United States)

    Jones, Thomas C.; Doggett, William R.

    2014-01-01

    High-strength Kevlar and Vectran webbings are currently being used by both NASA and industry as the primary load-bearing structure in inflatable space habitation modules. The time-dependent behavior of high-strength webbing architectures is a vital area of research that is providing critical material data to guide a more robust design process for this class of structures. This paper details the results of a series of time-dependent tests on 1-inch-wide webbing, including an initial set of comparative tests between specimens that underwent real-time and accelerated creep at 65 and 70% of their ultimate tensile strength. Variability in the ultimate tensile strength of the webbings is investigated and compared with variability in the creep life response. Additional testing studied the effects of load and displacement rate, specimen length and the time-dependent effects of preconditioning the webbings. The creep test facilities, instrumentation and test procedures are also detailed. The accelerated creep tests display consistently longer times to failure than their real-time counterparts; however, several factors were identified that may contribute to the observed disparity. Test setup and instrumentation, grip type, loading scheme, thermal environment and accelerated test postprocessing, along with material variability, are among these factors. Their effects are discussed and future work is detailed for the exploration and elimination of some of these factors in order to achieve a higher fidelity comparison.

  8. Even Faster Web Sites Performance Best Practices for Web Developers

    CERN Document Server

    Souders, Steve

    2009-01-01

    Performance is critical to the success of any web site, and yet today's web applications push browsers to their limits with increasing amounts of rich content and heavy use of Ajax. In this book, Steve Souders, web performance evangelist at Google and former Chief Performance Yahoo!, provides valuable techniques to help you optimize your site's performance. Souders' previous book, the bestselling High Performance Web Sites, shocked the web development world by revealing that 80% of the time it takes for a web page to load is on the client side. In Even Faster Web Sites, Souders and eight exp

  9. Efficient Web Harvesting Strategies for Monitoring Deep Web Content

    NARCIS (Netherlands)

    Khelghati, Mohammadreza; Hiemstra, Djoerd; van Keulen, Maurice

    2016-01-01

    The change of web content is rapid. In Focused Web Harvesting [17], which aims at achieving a complete harvest for a given topic, this dynamic nature of the web creates problems for users who need to access a complete set of web data related to their topics of interest. Whether you are a fan

  10. Web2Quests: Updating a Popular Web-Based Inquiry-Oriented Activity

    Science.gov (United States)

    Kurt, Serhat

    2009-01-01

    WebQuest is a popular inquiry-oriented activity in which learners use Web resources. Since the creation of the innovation, almost 15 years ago, the Web has changed significantly, while the WebQuest technique has changed little. This article examines possible applications of new Web trends on WebQuest instructional strategy. Some possible…

  11. Integration of Web mining and web crawler: Relevance and State of Art

    OpenAIRE

    Subhendu kumar pani; Deepak Mohapatra,; Bikram Keshari Ratha

    2010-01-01

    This study presents the role of the web crawler in the web mining environment. As the growth of the World Wide Web has exceeded all expectations, research on Web mining is growing more and more. Web mining is a research topic which combines two active research areas: data mining and the World Wide Web. So, the World Wide Web is a very advanced area for data mining research. Search engines based on a web crawling framework are also used in web mining to find the interacted web pages. This paper discu...

  12. Efficient Web Harvesting Strategies for Monitoring Deep Web Content

    NARCIS (Netherlands)

    Khelghati, Mohammadreza; Hiemstra, Djoerd; van Keulen, Maurice

    2016-01-01

    Web content changes rapidly [18]. In Focused Web Harvesting [17], whose aim is to achieve a complete harvest for a given topic, this dynamic nature of the web creates problems for users who need to access the set of all web data relevant to their topics of interest. Whether you are a fan

  13. Measuring consistency of web page design and its effects on performance and satisfaction.

    Science.gov (United States)

    Ozok, A A; Salvendy, G

    2000-04-01

    This study examines methods for measuring the consistency levels of web pages and the effect of consistency on the performance and satisfaction of world-wide web (WWW) users. For clarification, a home page is referred to as a single page that is the default page of a web site on the WWW. A web page refers to a single screen that indicates a specific address on the WWW. This study tested a series of web pages that were mostly hyperlinked. Therefore, the term 'web page' has been adopted for the nomenclature while referring to the objects whose features were tested. It was hypothesized that participants would perform better and be more satisfied using web pages that have consistent rather than inconsistent interface design; that the overall consistency level of an interface design would significantly correlate with the three elements of consistency: physical, communicational and conceptual consistency; and that physical and communicational consistencies would interact with each other. The hypotheses were tested in a four-group, between-subject design, with 10 participants in each group. The results partially support the hypothesis regarding error rate, but not regarding satisfaction and performance time. The results also support the hypothesis that each of the three elements of consistency significantly contributes to the overall consistency of a web page, and that physical and communicational consistencies interact with each other, while conceptual consistency does not interact with them.

  14. VirtualBot, un entorn de programació de robots en format web

    OpenAIRE

    Moreu Reñé, Xavi

    2016-01-01

    VirtualBot is a website where a series of robots are programmed in a virtual language to move around a 3D environment. More than one robot can be programmed in order to experiment with cooperation. The program allows editing of the environment, i.e. the walls and the beacons; a robot can pick up a beacon and place it on a switch.

  15. Dynamic Web Pages: Performance Impact on Web Servers.

    Science.gov (United States)

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)
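
    As an illustration of the regression step, the sketch below fits a multivariate linear model with ordinary least squares to predict response time from request features. The features (payload size, database queries) and the measurements are synthetic assumptions, not the paper's data.

      # Sketch: least-squares fit of response time vs. request features.
      import numpy as np

      # Columns: intercept, payload (kB), database queries (synthetic data).
      X = np.array([[1, 2, 0], [1, 5, 1], [1, 8, 2],
                    [1, 12, 3], [1, 20, 5]], dtype=float)
      y = np.array([11.0, 19.0, 29.0, 41.0, 66.0])  # response time (ms)

      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("coefficients:", np.round(beta, 2))
      print("predicted ms for 10 kB, 2 queries:", round(beta @ [1, 10, 2], 1))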

  16. Applying semantic web services to enterprise web

    OpenAIRE

    Hu, Y; Yang, Q P; Sun, X; Wei, P

    2008-01-01

    Enterprise Web provides a convenient, extendable, integrated platform for information sharing and knowledge management. However, it still has many drawbacks due to complexity, increasing information glut, and the heterogeneity of the information processed. Research in the field of Semantic Web Services has shown the possibility of adding a higher level of semantic functionality on top of the current Enterprise Web, enhancing the usability and usefulness of resources, enabling decision su...

  17. WebVR: an interactive web browser for virtual environments

    Science.gov (United States)

    Barsoum, Emad; Kuester, Falko

    2005-03-01

    The pervasive nature of web-based content has led to the development of applications and user interfaces that port between a broad range of operating systems and databases, while providing intuitive access to static and time-varying information. However, the integration of this vast resource into virtual environments has remained elusive. In this paper we present an implementation of a 3D Web Browser (WebVR) that enables the user to search the internet for arbitrary information and to seamlessly augment this information into virtual environments. WebVR provides access to the standard data input and query mechanisms offered by conventional web browsers, with the difference that it generates active texture-skins of the web contents that can be mapped onto arbitrary surfaces within the environment. Once mapped, the corresponding texture functions as a fully integrated web-browser that will respond to traditional events such as the selection of links or text input. As a result, any surface within the environment can be turned into a web-enabled resource that provides access to user-definable data. In order to leverage from the continuous advancement of browser technology and to support both static as well as streamed content, WebVR uses ActiveX controls to extract the desired texture skin from industry strength browsers, providing a unique mechanism for data fusion and extensibility.

  18. La investigación en la formacion "Web-Learning"

    Directory of Open Access Journals (Sweden)

    Guillermo Vázquez

    2006-12-01

    Full Text Available Medical education has in Web Learning a tool that allows universal access to training and practice. Web Learning also implies a refocusing and resizing of education in all its dimensions, including the technological one. All this, together with the substantial investments that implementing Web Learning entails, makes it advisable that research to identify the optimal process of implementation and development become one of its reference pillars. This article sets out a series of questions on the various aspects of Web Learning that, in the authors' view, should be answered through research, together with the scientific methodologies most appropriate to the questions posed. A review of the literature and of the proceedings and communications of national congresses reveals an almost total absence of research in this field of medicine, aggravated by the lack of grants to fund such studies. Only a global strategy can overcome this situation and promote education based on the best evidence.

  19. FluDetWeb: an interactive web-based system for the early detection of the onset of influenza epidemics

    Directory of Open Access Journals (Sweden)

    Miralles María

    2009-07-01

    Full Text Available Abstract Background The early identification of influenza outbreaks has become a priority in public health practice. A large variety of statistical algorithms for the automated monitoring of influenza surveillance have been proposed, but most of them require not only a lot of computational effort but also operation of sometimes not-so-friendly software. Results In this paper, we introduce FluDetWeb, an implementation of a prospective influenza surveillance methodology based on a client-server architecture with a thin (web-based) client application design. Users can introduce and edit their own data consisting of a series of weekly influenza incidence rates. The system returns the probability of being in an epidemic phase (via e-mail if desired). When the probability is greater than 0.5, it also returns the probability of an increase in the incidence rate during the following week. The system also provides two complementary graphs. This system has been implemented using statistical free software (R and WinBUGS), a web server environment for Java code (Tomcat) and a software module created by us (Rdp) responsible for managing internal tasks; the software package MySQL has been used to construct the database management system. The implementation is available on-line from: http://www.geeitema.org/meviepi/fludetweb/. Conclusion The ease of use of FluDetWeb and its on-line availability can make it a valuable tool for public health practitioners who want to obtain information about the probability that their system is in an epidemic phase. Moreover, the architecture described can also be useful for developers of systems based on computationally intensive methods.

  1. Designing Effective Web Forms for Older Web Users

    Science.gov (United States)

    Li, Hui; Rau, Pei-Luen Patrick; Fujimura, Kaori; Gao, Qin; Wang, Lin

    2012-01-01

    This research aims to provide insight for web form design for older users. The effects of task complexity and information structure of web forms on older users' performance were examined. Forty-eight older participants with abundant computer and web experience were recruited. The results showed significant differences in task time and error rate…

  2. Creation of web applications by Rich Internet Application Adobe Flex

    OpenAIRE

    PEKA, Karel

    2011-01-01

    This bachelor thesis focuses on explaining the functions and the development of interactive applications in Adobe Flex RIA, also in comparison to similar web technologies such as AJAX, Microsoft Silverlight or Adobe Flash. It explains the difference between "ordinary" sites and Rich Internet Applications (RIA); this difference is shown in a series of demonstration examples processed in Adobe Flash Builder (the environment for building Flex applications). A large-scale application will also be created for comprehensive ...

  3. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    Science.gov (United States)

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.
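
    A minimal sketch of the function-chain idea: simple building-block operators composed left-to-right into a larger data operator over an ordered series. The operators and the sample series are invented for illustration.

      # Sketch: compose simple operators into a function chain.
      def chain(series, *operators):
          for op in operators:
              series = op(series)
          return series

      def drop_missing(s):
          return [x for x in s if x is not None]

      def moving_average3(s):
          return [sum(s[i:i + 3]) / 3 for i in range(len(s) - 2)]

      def above(level):
          return lambda s: [x for x in s if x > level]

      heart_rate = [98, None, 102, 110, 95, None, 130, 125]  # invented data
      print(chain(heart_rate, drop_missing, moving_average3, above(105)))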

  4. 75 FR 27986 - Electronic Filing System-Web (EFS-Web) Contingency Option

    Science.gov (United States)

    2010-05-19

    ...] Electronic Filing System--Web (EFS-Web) Contingency Option AGENCY: United States Patent and Trademark Office... contingency option when the primary portal to EFS-Web has an unscheduled outage. Previously, the entire EFS-Web system was not available to users during such an outage. The contingency option in EFS-Web will...

  5. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data source; interfaces for Geospatial Semantic Web, VGI (Volunteered Geographic Information) and Geospatial Semantic Web; challenges of Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use the Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.

  6. 3D Web-based HMI with WebGL Rendering Performance

    Directory of Open Access Journals (Sweden)

    Muennoi Atitayaporn

    2016-01-01

    Full Text Available An HMI, or Human-Machine Interface, is software allowing users to communicate with a machine or automation system. It usually serves as the display section in a SCADA (Supervisory Control and Data Acquisition) system for device monitoring and control. In this paper, a 3D Web-based HMI with WebGL (Web Graphics Library) rendering performance is presented. The main purpose of this work is to attempt to reduce the limitations of traditional 3D web HMIs using the advantages of WebGL. To evaluate the performance, frame rate and frame time metrics were used. The results showed that the 3D Web-based HMI can maintain a frame rate of 60 FPS for #cube=0.5K/0.8K and 30 FPS for #cube=1.1K/1.6K when run on Internet Explorer and Chrome respectively. Moreover, the study found that the 3D Web-based HMI using WebGL maintains a similar frame time in each frame even when the number of cubes is up to 5K. This indicates that stuttering occurred less in the proposed 3D Web-based HMI compared to the chosen commercial HMI product.
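
    A small sketch of the two metrics named above, frame rate and frame time, computed from per-frame timestamps; in a real evaluation the timestamps would be logged inside the WebGL render loop, and the values below are synthetic.

      # Frame rate and frame time metrics from per-frame timestamps (ms).
      def frame_metrics(timestamps_ms):
          deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
          avg_frame_time = sum(deltas) / len(deltas)
          fps = 1000.0 / avg_frame_time
          jitter = max(deltas) - min(deltas)  # crude stutter indicator
          return fps, avg_frame_time, jitter

      stamps = [0.0, 16.7, 33.3, 50.1, 66.8, 83.4]  # ~60 FPS loop (synthetic)
      fps, frame_time, jitter = frame_metrics(stamps)
      print(f"{fps:.1f} FPS, {frame_time:.2f} ms/frame, jitter {jitter:.2f} ms")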

  7. 07051 Abstracts Collection -- Programming Paradigms for the Web: Web Programming and Web Services

    OpenAIRE

    Hull, Richard; Thiemann, Peter; Wadler, Philip

    2007-01-01

    From 28.01. to 02.02.2007, the Dagstuhl Seminar 07051 "Programming Paradigms for the Web: Web Programming and Web Services" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The firs...

  8. Flow Webs: Mechanism and Architecture for the Implementation of Sensor Webs

    Science.gov (United States)

    Gorlick, M. M.; Peng, G. S.; Gasster, S. D.; McAtee, M. D.

    2006-12-01

    The sensor web is a distributed, federated infrastructure much like its predecessors, the internet and the world wide web. It will be a federation of many sensor webs, large and small, under many distinct spans of control, that loosely cooperate and share information for many purposes. Realistically, it will grow piecemeal as distinct, individual systems are developed and deployed, some expressly built for a sensor web while many others were created for other purposes. Therefore, the architecture of the sensor web is of fundamental import, and architectural strictures that inhibit innovation, experimentation, sharing or scaling may prove fatal. Drawing upon the architectural lessons of the world wide web, we offer a novel system architecture, the flow web, that elevates flows, sequences of messages over a domain of interest and constrained in both time and space, to a position of primacy as a dynamic, real-time medium of information exchange for computational services. The flow web captures, in a single, uniform architectural style, the conflicting demands of the sensor web, including dynamic adaptation to changing conditions, ease of experimentation, rapid recovery from the failures of sensors and models, automated command and control, incremental development and deployment, and integration at multiple levels, in many cases at different times. Our conception of sensor webs (dynamic amalgamations of sensor webs, each constructed within a flow web infrastructure) holds substantial promise for earth science missions in general, and for weather, air quality, and disaster management in particular. Flow webs are, by philosophy, design and implementation, a dynamic infrastructure that permits massive adaptation in real-time. Flows may be attached to and detached from services at will, even while information is in transit through the flow. This concept, flow mobility, permits dynamic integration of earth science products and modeling resources in response to real

  9. SolarSoft Web Services

    Science.gov (United States)

    Freeland, S.; Hurlburt, N.

    2005-12-01

    The SolarSoft system (SSW) is a set of integrated software libraries, databases, and system utilities which provide a common programming and data analysis environment for solar physics. The system includes contributions from a large community base, representing the efforts of many NASA PI MO&DA teams, spanning many years and multiple NASA and international orbital and ground based missions. The SSW general use libraries include many hundreds of utilities which are instrument and mission independent. A large subset are also solar independent, such as time conversions, digital detector cleanup, time series analysis, mathematics, image display, WWW server communications and the like. PI teams may draw on these general purpose libraries for analysis and application development while concentrating efforts on instrument specific calibration issues rather than reinvention of general use software. By the same token, PI teams are encouraged to contribute new applications or enhancements to existing utilities which may have more general interest. Recent areas of intense evolution include space weather applications, automated distributed data access and analysis, interfaces with the ongoing Virtual Solar Observatory efforts, and externalization of SolarSoft power through Web Services. We will discuss the current status of SSW web services and demonstrate how this facilitates accessing the underlying power of SolarSoft in more abstract terms. In this context, we will describe the use of SSW services within the Collaborative Sun Earth Connector environment.

  10. The Semantic Web: opportunities and challenges for next-generation Web applications

    Directory of Open Access Journals (Sweden)

    2002-01-01

    Full Text Available Recently there has been a growing interest in the investigation and development of the next generation web, the Semantic Web. While most current forms of web content are designed to be presented to humans and are barely understandable by computers, the content of the Semantic Web is structured in a semantic way so that it is meaningful to computers as well as to humans. In this paper, we report a survey of recent research on the Semantic Web. In particular, we present the opportunities that this revolution will bring to us: web services, agent-based distributed computing, semantics-based web search engines, and semantics-based digital libraries. We also discuss the technical and cultural challenges of realizing the Semantic Web: the development of ontologies, formal semantics of Semantic Web languages, and trust and proof models. We hope that this will shed some light on the direction of future work in this field.

  11. Working with WebQuests: Making the Web Accessible to Students with Disabilities.

    Science.gov (United States)

    Kelly, Rebecca

    2000-01-01

    This article describes how students with disabilities in regular classes are using the WebQuest lesson format to access the Internet. It explains essential WebQuest principles, creating a draft Web page, and WebQuest components. It offers an example of a WebQuest about salvaging the sunken ships, Titanic and Lusitania. A WebQuest planning form is…

  12. WebGIS based on semantic grid model and web services

    Science.gov (United States)

    Zhang, WangFei; Yue, CaiRong; Gao, JianGuo

    2009-10-01

    As the meeting point of network technology and GIS technology, WebGIS has developed rapidly in recent years. Constrained by the Web and by the characteristics of GIS, traditional WebGIS has some prominent problems in development: for example, it cannot accomplish interoperability between heterogeneous spatial databases, and it cannot accomplish cross-platform data access. With the appearance of Web Services and Grid technology, great changes have taken place in the field of WebGIS. Web Services provide an interface that gives the information of different sites the ability to share data and intercommunicate. The goal of Grid technology is to turn the internet into a large supercomputer with which we can efficiently implement the overall sharing of computing resources, storage resources, data resources, information resources, knowledge resources and expert resources. But for WebGIS, this only implements the physical connection of data and information, and that is far from enough. Because of different understandings of the world, and following different professional regulations, policies and habits, experts in different fields will reach different conclusions when they observe the same geographic phenomenon, and semantic heterogeneity is produced. Hence there are large differences in the same concept between different fields. If we use WebGIS without considering this semantic heterogeneity, we will answer the questions users pose wrongly, or we will not be able to answer them at all. To solve this problem, this paper puts forward and tests an effective method of combining Semantic Grid and Web Services technology to develop WebGIS. In this paper, we studied the method to construct ontologies and the method to combine Grid technology and Web Services, and with a detailed analysis of the computing characteristics and application model in the distribution of data, we designed the WebGIS query system driven by

  13. Web Project Management

    OpenAIRE

    Suralkar, Sunita; Joshi, Nilambari; Meshram, B B

    2013-01-01

    This paper describes the need for Web project management and the fundamentals of project management for web projects: what it is, why projects go wrong, and what's different about web projects. We also discuss cost estimation techniques based on size metrics. Though Web project development is similar to traditional software development, the special characteristics of Web application development require the adaptation of many software engineering approaches or even the development of comple...

  14. The Plant Information Center (PIC): A Web-Based Learning Center for Botanical Study.

    Science.gov (United States)

    Greenberg, J.; Daniel, E.; Massey, J.; White, P.

    The Plant Information Center (PIC) is a project funded under the Institute of Museum and Library Services that aims to provide global access to both primary and secondary botanical resources via the World Wide Web. Central to the project is the development and employment of a series of applications that facilitate resource discovery, interactive…

  15. The experience of web technologies’ implementation into cartography of the protected areas

    Directory of Open Access Journals (Sweden)

    Іван Олійников

    2017-09-01

    Full Text Available Geoportals and web services containing information about protected territories of different countries of the world are considered; in particular, their content, purpose, features of operation, data formats, and the information that can be obtained with their help are analyzed. The active development of web technologies means that cartographic web services can be divided into several types: statistical layered maps; web maps with the ability to generate queries; collectively filled web maps; map services; and cartographic software shells. Around the world, the first three types of cartographic web services are used most often when mapping objects of the nature reserve fund. Statistical layered maps are most characteristic of the United States, which was the first country to integrate web technologies and approaches into classical cartography for mapping protected areas. For Canada, the experience of regional mapping of protected areas is more widespread than that at the national and local levels. In Australia, particular attention is paid to the mapping of protected areas located on ocean shores and islands. The second category, web maps with the ability to create queries, is the most widespread, not only in web mapping of protected areas but in this field in general. The development of the volunteer movement has contributed to the significant spread of services whose thematic content is filled in by ordinary people, the collectively filled web maps. Among these mapping services are the Royal Society for the Protection of Birds (UK) project and global mapping projects that are not specific to a particular country or macroregion (a series of interactive maps for disseminating information on rare species of animals, and the Global Forest Watch geoportal designed to track the dynamics of changes in forest areas). Web mapping in Ukraine is still at the stage of formation. In addition to the Atlas of the Natural

  16. "May the journey continue": Earth 2 fan fiction, or Filling in gaps to revive a canceled series

    Directory of Open Access Journals (Sweden)

    Francesca Musiani

    2010-09-01

    Full Text Available This essay explores writing practices in a fan community having to give life to a story deprived of an "official" version: the television series Earth 2. I argue that fan fiction writing for this prematurely canceled series exhibits peculiar features in comparison to fan writing for established series: for example, temporality, choice of protagonists, character pairings, and challenges to the original conception(s) of the series. Writing fan fiction for a canceled series is not about creating alternatives to an existing story, but about filling in gaps; it brings to light the ways in which fan fiction deals with closure. I take as a case study Earth 2, a series aired by NBC in the United States in 1994–95, whose first and only season ended in a cliffhanger episode hinting that a mysterious ailment had struck the main and most popular character. Shortly afterward, a significant number of Earth 2 Web sites, online conventions, and especially fan stories started developing; they explored what could have happened next and bore nostalgic but combative mottoes and titles such as "May the Journey Continue." I explore the specific features of Earth 2 fan fiction production and sharing by analyzing the main Earth 2 fan fiction archives on the Web and the responses to my email interviews of fan writers. Exemplars of the Earth 2 case are compared to those of other science fiction TV series, both prematurely canceled (Firefly, Space: Above and Beyond) and long-lived (Babylon 5, Star Trek: Deep Space 9).

  17. Promoting Teachers' Positive Attitude towards Web Use: A Study in Web Site Development

    Science.gov (United States)

    Akpinar, Yavuz; Bayramoglu, Yusuf

    2008-01-01

    The purpose of the study was to examine effects of a compact training for developing web sites on teachers' web attitude, as composed of: web self efficacy, perceived web enjoyment, perceived web usefulness and behavioral intention to use the web. To measure the related constructs, the Web Attitude Scale was adapted into Turkish and tested with a…

  18. The ATLAS Public Web Pages: Online Management of HEP External Communication Content

    CERN Document Server

    Goldfarb, Steven; Phoboo, Abha Eli; Shaw, Kate

    2015-01-01

    The ATLAS Education and Outreach Group is in the process of migrating its public online content to a professionally designed set of web pages built on the Drupal content management system. Development of the front-end design passed through several key stages, including audience surveys, stakeholder interviews, usage analytics, and a series of fast design iterations, called sprints. Implementation of the web site involves application of the html design using Drupal templates, refined development iterations, and the overall population of the site with content. We present the design and development processes and share the lessons learned along the way, including the results of the data-driven discovery studies. We also demonstrate the advantages of selecting a back-end supported by content management, with a focus on workflow. Finally, we discuss usage of the new public web pages to implement outreach strategy through implementation of clearly presented themes, consistent audience targeting and messaging, and th...

  19. Web wisdom how to evaluate and create information quality on the Web

    CERN Document Server

    Alexander, Janet E

    1999-01-01

    Web Wisdom is an essential reference for anyone needing to evaluate or establish information quality on the World Wide Web. The book includes easy to use checklists for step-by-step quality evaluations of virtually any Web page. The checklists can also be used by Web authors to help them ensure quality information on their pages. In addition, Web Wisdom addresses other important issues, such as understanding the ways that advertising and sponsorship may affect the quality of Web information. It features: * a detailed discussion of the items involved in evaluating Web information; * checklists

  20. Professional WebGL Programming Developing 3D Graphics for the Web

    CERN Document Server

    Anyuru, Andreas

    2012-01-01

    Everything you need to know about developing hardware-accelerated 3D graphics with WebGL! As the newest technology for creating 3D graphics on the web, in games and applications as well as on regular websites, WebGL gives web developers the capability to produce eye-popping graphics. This book teaches you how to use WebGL to create stunning cross-platform apps. The book features several detailed examples that show you how to develop 3D graphics with WebGL, including explanations of code snippets that help you understand the why behind the how. You will also develop a stronger understanding of W

  1. WebCom: A Model for Understanding Web Site Communication

    DEFF Research Database (Denmark)

    Godsk, Mikkel; Petersen, Anja Bechmann

    2008-01-01

    of the approaches' strengths. Furthermore, it is discussed and briefly demonstrated how WebCom can be used for analytical and design purposes, with YouTube as an example. The chapter concludes that WebCom is able to serve as a theoretically-based model for understanding complex Web site communication situations...

  2. SELECTION OF ONTOLOGY FOR WEB SERVICE DESCRIPTION LANGUAGE TO ONTOLOGY WEB LANGUAGE CONVERSION

    OpenAIRE

    J. Mannar Mannan; M. Sundarambal; S. Raghul

    2014-01-01

    The Semantic Web extends the current human-readable web by encoding some of the semantics of resources in a machine-processable form. As a Semantic Web component, Semantic Web Services (SWS) use a mark-up that makes data machine-readable in a detailed and sophisticated way. One such language is the Ontology Web Language (OWL). An existing conventional web service annotation can be changed to a semantic web service by mapping the Web Service Description Language (WSDL) with the semantic annotation of O...

  3. AsteriX: a Web server to automatically extract ligand coordinates from figures in PDF articles.

    Science.gov (United States)

    Lounnas, V; Vriend, G

    2012-02-27

    Coordinates describing the chemical structures of small molecules that are potential ligands for pharmaceutical targets are used at many stages of the drug design process. The coordinates of the vast majority of ligands can be obtained from either publicly accessible or commercial databases. However, interesting ligands sometimes are only available from the scientific literature, in which case their coordinates need to be reconstructed manually--a process that consists of a series of time-consuming steps. We present a Web server that helps reconstruct the three-dimensional (3D) coordinates of ligands for which a two-dimensional (2D) picture is available in a PDF file. The software, called AsteriX, analyses every picture contained in the PDF file and attempts to determine automatically whether or not it contains ligands. Areas in pictures that may contain molecular structures are processed to extract connectivity and atom type information that allow coordinates to be subsequently reconstructed. The AsteriX Web server was tested on a series of articles containing a large diversity in graphical representations. In total, 88% of 3249 ligand structures present in the test set were identified as chemical diagrams. Of these, about half were interpreted correctly as 3D structures, and a further one-third required only minor manual corrections. It is principally impossible to always correctly reconstruct 3D coordinates from pictures because there are many different protocols for drawing a 2D image of a ligand, but more importantly a wide variety of semantic annotations are possible. The AsteriX Web server therefore includes facilities that allow the users to augment partial or partially correct 3D reconstructions. All 3D reconstructions are submitted, checked, and corrected by the users at the server and are freely available for everybody. The coordinates of the reconstructed ligands are made available in a series of formats commonly used in drug design research. The

  4. Virtual Web Services

    OpenAIRE

    Rykowski, Jarogniew

    2007-01-01

    In this paper we propose an application of software agents to provide Virtual Web Services. A Virtual Web Service is a linked collection of several real and/or virtual Web Services, and public and private agents, accessed by the user in the same way as a single real Web Service. A Virtual Web Service allows unrestricted comparison, information merging, pipelining, etc., of data coming from different sources and in different forms. Detailed architecture and functionality of a single Virtual We...

  5. A web service for controlling the quality of measurements of global solar irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Geiger, M.; Menard, L.; Wald, L. [Ecole des Mines, Paris (France). Centre d' Energetique; Diabate, L. [UFAE/GCMI, Bamako (Mali)

    2002-12-01

    The control of the quality of irradiation data is often a prerequisite to their further processing. Though data are usually controlled by meteorological offices, the sources are so numerous that the user often faces time-series of measurements containing questionable values. As customers of irradiation data, we established our own procedures to screen time-series of measurements. Since this problem of quality control is of concern to many researchers and engineers and since it is often a lengthy and tedious task, we decided to make this screening procedure available to everyone as a web service. This service is the purpose of this paper. The objective is not to perform a precise and fine control, an objective out of reach without details on the site and instruments, but to perform a likelihood control of the data and to check their plausibility. This is achieved by comparing observations with some expectations based upon the extraterrestrial irradiation and a simulation of the irradiation for clear skies. This service is available to everyone on the Web site www.helioclim.net. It offers a very convenient means to check time-series of irradiation: data are input in a HTML page by a copy and paste procedure and the return is also a HTML page that can be analyzed in detail for the data flagged as suspicious. (author)
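
    A minimal sketch of such a plausibility screen: each measured hourly value is checked against simple physical bounds derived from the extraterrestrial irradiation. The margin factor and the flag labels are assumptions for the sketch, not the authors' procedure.

      # Sketch: flag implausible hourly global irradiation values (Wh/m2).
      def screen(samples):
          flags = []
          for measured, extraterrestrial in samples:
              if measured < 0:
                  flags.append("negative")
              elif extraterrestrial == 0:  # sun below horizon
                  flags.append("ok" if measured == 0 else "night-nonzero")
              elif measured > 1.1 * extraterrestrial:  # assumed 10% margin
                  flags.append("above-extraterrestrial")
              else:
                  flags.append("ok")
          return flags

      print(screen([(450, 800), (900, 800), (5, 0), (-2, 0)]))
      # -> ['ok', 'above-extraterrestrial', 'night-nonzero', 'negative']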

  6. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    Science.gov (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These
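
    Because the services are stated to be compatible with the equivalent IRIS DMC web services, a client can be as simple as an HTTP GET against an FDSN-style endpoint. The endpoint URL and parameters below are assumptions to be checked against the NCEDC documentation before use.

      # Sketch: query an FDSN-style station web service over HTTP.
      from urllib.parse import urlencode
      from urllib.request import urlopen

      BASE = "https://service.ncedc.org/fdsnws/station/1/query"  # assumed URL

      def fetch_stations(network, level="station", fmt="text"):
          query = urlencode({"net": network, "level": level, "format": fmt})
          with urlopen(f"{BASE}?{query}", timeout=30) as response:
              return response.read().decode("utf-8", errors="replace")

      print(fetch_stations("BK")[:500])  # Berkeley Digital Seismic Network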

  7. WebScore: An Effective Page Scoring Approach for Uncertain Web Social Networks

    Directory of Open Access Journals (Sweden)

    Shaojie Qiao

    2011-10-01

    Full Text Available To effectively score pages with uncertainty in web social networks, we first propose a new concept called the transition probability matrix and formally define the uncertainty in web social networks. Second, we propose a hybrid page scoring algorithm, called WebScore, based on the PageRank algorithm and three centrality measures including degree, betweenness, and closeness. In particular, WebScore takes full consideration of the uncertainty of web social networks by computing the transition probability from one page to another. The basic idea of WebScore is to: (1) integrate uncertainty into PageRank in order to accurately rank pages, and (2) apply the centrality measures to calculate the importance of pages in web social networks. In order to verify the performance of WebScore, we developed a web social network analysis system which can partition web pages into distinct groups and score them in an effective fashion. Finally, we conducted extensive experiments on real data and the results show that WebScore is effective at scoring uncertain pages with less time deficiency than PageRank and centrality-measure-based page scoring algorithms.
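
    A minimal sketch in the spirit of WebScore: a PageRank-style power iteration driven by a transition probability matrix, blended with one centrality measure. The matrix, the blend weight, and the use of in-degree centrality are illustrative assumptions, not the authors' exact formulation.

      # Sketch: blend a transition-matrix rank with in-degree centrality.
      import numpy as np

      P = np.array([[0.0, 0.7, 0.3],   # row i holds the transition
                    [1.0, 0.0, 0.0],   # probabilities from page i
                    [0.5, 0.5, 0.0]])  # to every page j

      def hybrid_score(P, damping=0.85, iterations=100, weight=0.7):
          n = len(P)
          rank = np.full(n, 1.0 / n)
          for _ in range(iterations):  # PageRank-style power iteration
              rank = (1 - damping) / n + damping * (P.T @ rank)
          degree = (P > 0).sum(axis=0) / (n - 1)  # in-degree centrality
          return weight * rank / rank.sum() + (1 - weight) * degree / degree.sum()

      print(np.round(hybrid_score(P), 3))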

  8. Applying Sensor Web Technology to Marine Sensor Data

    Science.gov (United States)

    Jirka, Simon; del Rio, Joaquin; Mihai Toma, Daniel; Nüst, Daniel; Stasch, Christoph; Delory, Eric

    2015-04-01

    In this contribution we present two activities illustrating how Sensor Web technology helps to enable a flexible and interoperable sharing of marine observation data based on standards. An important foundation is the Sensor Web Architecture developed by the European FP7 project NeXOS (Next generation Low-Cost Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management). This architecture relies on the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) framework. It is an exemplary solution for facilitating the interoperable exchange of marine observation data within and between (research) organisations. The architecture addresses a series of functional and non-functional requirements which are fulfilled through different types of OGC SWE components. The diverse functionalities offered by the NeXOS Sensor Web architecture are shown in the following overview: - Pull-based observation data download: This is achieved through the OGC Sensor Observation Service (SOS) 2.0 interface standard. - Push-based delivery of observation data to allow users the subscription to new measurements that are relevant for them: For this purpose there are currently several specification activities under evaluation (e.g. OGC Sensor Event Service, OGC Publish/Subscribe Standards Working Group). - (Web-based) visualisation of marine observation data: Implemented through SOS client applications. - Configuration and controlling of sensor devices: This is ensured through the OGC Sensor Planning Service 2.0 interface. - Bridging between sensors/data loggers and Sensor Web components: For this purpose several components such as the "Smart Electronic Interface for Sensor Interoperability" (SEISI) concept are developed; this is complemented by a more lightweight SOS extension (e.g. based on the W3C Efficient XML Interchange (EXI) format). To further advance this architecture, there is on-going work to develop dedicated profiles of selected OGC
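    For the pull-based download path listed above, an SOS 2.0 GetObservation call is just an HTTP request. The sketch below uses the standard KVP binding; the endpoint, offering, and observed-property identifiers are placeholders, not a real NeXOS service.

```python
# Hedged sketch of a pull-based SOS 2.0 GetObservation request (KVP binding).
from urllib.parse import urlencode

params = urlencode({
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "sea_surface_temperature",               # hypothetical
    "observedProperty": "urn:ogc:def:property:temperature",  # hypothetical
})
url = "https://example.org/sos/kvp?" + params   # placeholder endpoint
print(url)  # fetching this URL with any HTTP client returns O&M-encoded data
```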

  9. Incorporating Quality Control Information in the Sensor Web

    Science.gov (United States)

    Devaraju, Anusuriya; Kunkel, Ralf; Bogena, Heye

    2013-04-01

    The rapid development of sensing technologies has led to the creation of large amounts of heterogeneous environmental observations. The Sensor Web provides wider access to sensors and observations via common protocols and specifications. Observations typically go through several levels of quality control and aggregation before they are made available to end-users. Raw data are usually inspected, and related quality flags are assigned. Data are gap-filled, and errors are removed. New data series may also be derived from one or more corrected data sets. Until now, it has been unclear how these kinds of information can be captured in the Sensor Web Enablement (SWE) framework. Apart from the quality measures (e.g., accuracy, precision, tolerance, or confidence), the levels of observational series, the changes applied, and the methods involved must be specified. It is important that this kind of quality control information is well described and communicated to end-users to allow for a better usage and interpretation of data products. In this paper, we describe how quality control information can be incorporated into the SWE framework. To this end, we first introduce TERENO (TERrestrial ENvironmental Observatories), an initiative funded by the large research infrastructure program of the Helmholtz Association in Germany. The main goal of the initiative is to facilitate the study of long-term effects of climate and land use changes. The TERENO Online Data RepOsitORry (TEODOOR) is a software infrastructure that supports acquisition, provision, and management of observations within TERENO via SWE specifications and several other OGC web services. Next, we specify changes made to the existing observational data model to incorporate quality control information. Here, we describe the underlying TERENO data policy in terms of provision and maintenance issues. We present data levels, and their implementation within TEODOOR. The data levels are adapted from those used by

  10. Responsive web design workflow

    OpenAIRE

    LAAK, TIMO

    2013-01-01

    Responsive Web Design Workflow is a literature review about responsive web design, a web-standards-based modern web design paradigm. The goals of this research were to define what responsive web design is, to determine its importance in building modern websites, and to describe a workflow for responsive web design projects. Responsive web design is a paradigm to create adaptive websites, which respond to the properties of the media that is used to render them. The three key elements of responsi...

  11. Web-based tool for subjective observer ranking of compressed medical images

    Science.gov (United States)

    Langer, Steven G.; Stewart, Brent K.; Andrew, Rex K.

    1999-05-01

    In the course of evaluating various compression schemes for ultrasound teleradiology applications, it became obvious that paper-based methods of data collection were time consuming and error prone. A method was sought which allowed participating radiologists to view the ultrasound video clips (compressed to varying degrees) at their desks. Furthermore, the method should allow observers to enter their evaluations and, when finished, automatically submit the data to our statistical analysis engine. We found that the World Wide Web offered a ready solution. A web page was constructed that contains 18 embedded AVI video clips. The 18 clips represent 6 distinct anatomical areas, compressed by various methods and amounts, and then randomly distributed through the web page. To the right of each video, a series of questions is presented asking the observer to rank (1 - 5) his/her ability to answer diagnostically relevant questions. When completed, the observer presses 'Submit' and a file of tab-delimited text is created which can then be imported into an Excel workbook. Kappa analysis is then performed and the resulting plots demonstrate observer preferences.
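    The kappa analysis mentioned at the end is straightforward to reproduce. Below is a minimal two-rater Cohen's kappa over 1-5 rankings; the rating data are invented for illustration, not taken from the study.

```python
# Cohen's kappa for two raters over ordinal categories 1..k (sketch).
import numpy as np

def cohen_kappa(r1, r2, k=5):
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                                   # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c)             # chance agreement
             for c in range(1, k + 1))
    return (po - pe) / (1 - pe)

# Invented rankings from two observers of the same six clips.
print(cohen_kappa([5, 4, 4, 3, 5, 2], [5, 4, 3, 3, 5, 2]))  # ~0.78
```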

  12. Social Software and Academic Practice: Postgraduate Students as Co-Designers of Web 2.0 Tools

    Science.gov (United States)

    Carmichael, Patrick; Burchmore, Helen

    2010-01-01

    In order to develop potentially transformative Web 2.0 tools in higher education, the complexity of existing academic practices, including current patterns of technology use, must be recognised. This paper describes how a series of participatory design activities allowed postgraduate students in education, social sciences and computer sciences to…

  13. A web-based application for initial screening of living kidney donors: development, implementation and evaluation.

    Science.gov (United States)

    Moore, D R; Feurer, I D; Zavala, E Y; Shaffer, D; Karp, S; Hoy, H; Moore, D E

    2013-02-01

    Most centers utilize phone or written surveys to screen candidates who self-refer to be living kidney donors. To increase efficiency and reduce resource utilization, we developed a web-based application to screen kidney donor candidates. The aim of this study was to evaluate the use of this web-based application. Method and time of referral were tabulated, and descriptive statistics summarized demographic characteristics. Time series analyses evaluated use over time. Between January 1, 2011 and March 31, 2012, 1200 candidates self-referred to be living kidney donors at our center. Eight hundred one candidates (67%) completed the web-based survey and 399 (33%) completed a phone survey. Thirty-nine percent of donors accessed the application on nights and weekends. Post-implementation of the web-based application, there was a statistically significant increase in the number of candidates who used the web-based application as opposed to telephone contact. Also, there was a significant increase (p = 0.025) in the total number of self-referrals post-implementation, from 61 to 116 per month. An interactive web-based application is an effective strategy for the initial screening of donor candidates. The web-based application increased the ability to interface with donors, process them efficiently, and ultimately increased donor self-referral at our center. © Copyright 2012 The American Society of Transplantation and the American Society of Transplant Surgeons.
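    The time-series analyses referred to above are typically interrupted time-series (segmented regression) models. The hedged sketch below fits a level-change term at the launch month; the monthly counts are invented stand-ins, not the study's data.

```python
# Interrupted time-series sketch: trend plus post-launch level change.
import numpy as np
import statsmodels.api as sm

months = np.arange(12)
post = (months >= 6).astype(int)      # web application assumed live from month 6
referrals = np.array([60, 62, 59, 63, 61, 64, 95, 102, 110, 108, 115, 118])

X = sm.add_constant(np.column_stack([months, post]))
fit = sm.OLS(referrals, X).fit()
print(fit.params)   # intercept, underlying trend, post-launch level change
```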

  14. Web services foundations

    CERN Document Server

    Bouguettaya, Athman; Daniel, Florian

    2013-01-01

    Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet.Web Services Foundations is the first installment of a two-book collection coverin

  15. Semantic web for dummies

    CERN Document Server

    Pollock, Jeffrey T

    2009-01-01

    Semantic Web technology is already changing how we interact with data on the Web. By connecting random information on the Internet in new ways, Web 3.0, as it is sometimes called, represents an exciting online evolution. Whether you're a consumer doing research online, a business owner who wants to offer your customers the most useful Web site, or an IT manager eager to understand Semantic Web solutions, Semantic Web For Dummies is the place to start! It will help you:Know how the typical Internet user will recognize the effects of the Semantic WebExplore all the benefits the data Web offers t

  16. Building web information systems using web services

    NARCIS (Netherlands)

    Frasincar, F.; Houben, G.J.P.M.; Barna, P.; Vasilecas, O.; Eder, J.; Caplinskas, A.

    2006-01-01

    Hera is a model-driven methodology for designing Web information systems. In the past a CASE tool for the Hera methodology was implemented. This software had different components that together form one centralized application. In this paper, we present a distributed Web service-oriented architecture

  17. Development and Evaluation of an Interactive WebQuest Environment: "Web Macerasi"

    Science.gov (United States)

    Gulbahar, Yasemin; Madran, R. Orcun; Kalelioglu, Filiz

    2010-01-01

    This study was conducted to develop a web-based interactive system, Web Macerasi, for teaching-learning and evaluation purposes, and to find out the possible effects of this system. The study has two stages. In the first stage, a WebQuest site was designed as an interactive system in which various Internet and web technologies were used for…

  18. Web Analytics: A Picture of the Academic Library Web Site User

    Science.gov (United States)

    Black, Elizabeth L.

    2009-01-01

    This article describes the usefulness of Web analytics for understanding the users of an academic library Web site. Using a case study, the analysis describes how Web analytics can answer questions about Web site user behavior, including when visitors come, the duration of the visit, how they get there, the technology they use, and the most…

  19. Effects of organizational scheme and labeling on task performance in product-centered and user-centered retail Web sites.

    Science.gov (United States)

    Resnick, Marc L; Sanchez, Julian

    2004-01-01

    As companies increase the quantity of information they provide through their Web sites, it is critical that content is structured with an appropriate architecture. However, resource constraints often limit the ability of companies to apply all Web design principles completely. This study quantifies the effects of two major information architecture principles in a controlled study that isolates the incremental effects of organizational scheme and labeling on user performance and satisfaction. Sixty participants with a wide range of Internet and on-line shopping experience were recruited to complete a series of shopping tasks on a prototype retail shopping Web site. User-centered labels provided a significant benefit in performance and satisfaction over labels obtained through company-centered methods. User-centered organization did not result in improved performance except when the label quality was poor. Significant interactions suggest specific guidelines for allocating resources in Web site design. Applications of this research include the design of Web sites for any commercial application, particularly E-commerce.

  20. The detector control web system of the ATLAS hadronic calorimeter

    International Nuclear Information System (INIS)

    Maidantchik, Carmen; Ferreira, Fernando G.; Marroquim, Fernando

    2011-01-01

    Full text: The hadronic calorimeter (TileCal) of the ATLAS experiment is a sampling device for measuring the energy of particles that cross the detector and is composed of thousands of electronic channels operating at a high rate of acquired events. A complex sourcing mechanism, responsible for powering each channel, comprises low-voltage (3 V to 15 V) and high-voltage (around 800 V) power supplies and a water-based cooling system. The Detector Control System (DCS) is responsible for monitoring and controlling these mechanisms. The correct operation of the power supplies is essential for detector data acquisition. A misbehaving power supply can affect the electronic systems or, in the worst scenario, turn a whole section of the detector off, which would lead to missing events. The DCS Web System was developed to monitor the stability of power supply operation by providing a daily or monthly summary of voltages, currents and temperatures. The summary comprises the mean and standard deviation of the monitored parameters as well as time plots. The obtained statistics are compared to preset thresholds, and the system interface highlights the cases to which the collaboration should pay attention. The web system also displays voltage trips, an undesired power-cut that can happen from time to time in some power supplies during their operation. As future steps, the group is developing prediction capabilities based on the analysis of the time series of the monitored parameters. It will therefore be possible to indicate which power sources should be replaced during the annual maintenance period, helping to keep a high number of live channels during data acquisition. This paper describes the DCS Web System and its functionalities, presenting preliminary results from the time series analysis. (author)
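    The summary-and-threshold step described above reduces to a small computation per channel. The sketch below is illustrative only, with assumed threshold values; it is not TileCal code.

```python
# Illustrative daily summary of one monitored DCS parameter (sketch).
import statistics

def summarize(samples, low, high):
    """Mean and standard deviation of, e.g., a high-voltage channel, plus a
    flag when the mean drifts outside preset thresholds."""
    mean = statistics.mean(samples)
    std = statistics.stdev(samples)
    return {"mean": mean, "std": std, "attention": not (low <= mean <= high)}

# Assumed 800 V channel with +/- 10 V thresholds.
print(summarize([798.2, 801.5, 799.9, 802.3], low=790.0, high=810.0))
```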

  1. Web Mining and Social Networking

    DEFF Research Database (Denmark)

    Xu, Guandong; Zhang, Yanchun; Li, Lin

    This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web mining, and the issue of how to incorporate web mining into web personalization and recommendation systems, are also reviewed. Additionally, the volume explores web community mining and analysis to find the structural, organizational and temporal developments of web communities and reveal the societal sense of individuals or communities. The volume will benefit both academic and industry communities interested in the techniques and applications of web search, web data management, web mining and web knowledge discovery, as well as web community and social network analysis.

  2. Web-based (HTML5) interactive graphics for fusion research and collaboration

    International Nuclear Information System (INIS)

    Kim, E.N.; Schissel, D.P.; Abla, G.; Flanagan, S.; Lee, X.

    2012-01-01

    Highlights: ► Interactive data visualization is supported via the Web without a browser plugin and provides users easy, real-time access to data of different types from various locations. ► Crosshair, zoom, pan as well as toggling dimensionality and a slice bar for multi-dimensional data are available. ► Data with PHP API can be applied: MDSplus and SQL have been tested. ► Modular in design, this has been deployed to support both the experimental and the simulation research arenas. - Abstract: With the continuing development of web technologies, it is becoming feasible for websites to operate a lot like a scientific desktop application. This has opened up more possibilities for utilizing the web browser for interactive scientific research and providing new means of on-line communication and collaboration. This paper describes the research and deployment for utilizing these enhanced web graphics capabilities on the fusion research tools which has led to a general toolkit that can be deployed as required. It allows users to dynamically create, interact with and share with others, the large sets of data generated by the fusion experiments and simulations. Hypertext Preprocessor (PHP), a general-purpose scripting language for the Web, is used to process a series of inputs, and determine the data source types and locations to fetch and organize the data. Protovis, a Javascript and SVG based web graphics package, then quickly draws the interactive graphs and makes it available to the worldwide audience. This toolkit has been deployed to both the simulation and experimental arenas. The deployed applications will be presented as well as the architecture and technologies used in producing the general graphics toolkit.

  3. Web API Fragility : How Robust is Your Web API Client

    NARCIS (Netherlands)

    Espinha, T.; Zaidman, A.; Gross, H.G.

    2014-01-01

    Web APIs provide a systematic and extensible approach for application-to-application interaction. A large number of mobile applications make use of web APIs to integrate services into apps. Each Web API's evolution pace is determined by its respective developer and mobile application developers

  4. Sounds of Web Advertising

    DEFF Research Database (Denmark)

    Jessen, Iben Bredahl; Graakjær, Nicolai Jørgensgaard

    2010-01-01

    Sound seems to be a neglected issue in the study of web ads. Web advertising is predominantly regarded as a visual phenomenon: commercial messages, such as banner ads, that we watch, read, and eventually click on, but only rarely something that we listen to. The present chapter presents an overview of the auditory dimensions of web advertising: Which kinds of sounds do we hear in web ads? What are the conditions and functions of sound in web ads? Moreover, the chapter proposes a theoretical framework for analysing the communicative functions of sound in web advertising. The main argument is that an understanding of the auditory dimensions of web advertising must include a reflection on the hypertextual settings of the web ad as well as a perspective on how users engage with web content.

  5. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first has a qualitative focus, reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section has a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either the IT or the marketing branch. The paper contributes to highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  6. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. The analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...

  7. Engineering Web Applications

    DEFF Research Database (Denmark)

    Casteleyn, Sven; Daniel, Florian; Dolog, Peter

    Nowadays, Web applications are almost omnipresent. The Web has become a platform not only for information delivery, but also for eCommerce systems, social networks, mobile services, and distributed learning environments. Engineering Web applications involves many intrinsic challenges due to their distributed nature, content orientation, and the requirement to make them available to a wide spectrum of users who are unknown in advance. The authors discuss these challenges in the context of well-established engineering processes, covering the whole product lifecycle from requirements engineering through design and implementation to deployment and maintenance. They stress the importance of models in Web application development, and they compare well-known Web-specific development processes like WebML, WSDM and OOHDM to traditional software development approaches like the waterfall model and the spiral model.

  8. Dynamics of a macroscopic model characterizing mutualism of search engines and web sites

    Science.gov (United States)

    Wang, Yuanshi; Wu, Hong

    2006-05-01

    We present a model to describe the mutualism relationship between search engines and web sites. In the model, search engines and web sites benefit from each other while the search engines are derived products of the web sites and cannot survive independently. Our goal is to show strategies for the search engines to survive in the internet market. From mathematical analysis of the model, we show that mutualism does not always result in survival. We show various conditions under which the search engines would tend to extinction, persist or grow explosively. Then by the conditions, we deduce a series of strategies for the search engines to survive in the internet market. We present conditions under which the initial number of consumers of the search engines has little contribution to their persistence, which is in agreement with the results in previous works. Furthermore, we show novel conditions under which the initial value plays an important role in the persistence of the search engines and deduce new strategies. We also give suggestions for the web sites to cooperate with the search engines in order to form a win-win situation.
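    The abstract does not reproduce the model's equations. For orientation, a generic mutualism system with the stated asymmetry, where search engines are derived products that cannot survive alone, might look like the hedged sketch below; here x denotes web sites, y search engines, and all coefficients are positive. The paper's actual equations may differ.

```latex
% Hedged sketch of a mutualism system of the kind the abstract describes.
\begin{align}
  \frac{dx}{dt} &= x\,(r - a x + b y),\\
  \frac{dy}{dt} &= y\,(-d + c x - e y).
\end{align}
% The -d term makes the search engines decline in the absence of web sites
% (dy/dt < 0 when x = 0), capturing their status as derived products;
% persistence then hinges on whether the benefit c x can offset the loss d.
```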

  9. Technical Evaluation Report 61: The World-Wide Inaccessible Web, Part 2: Internet routes

    Directory of Open Access Journals (Sweden)

    Jim Klaas

    2007-06-01

    Full Text Available In the previous report in this series, Web browser loading times were measured in 12 Asian countries, and were found to be up to four times slower than commonly prescribed as acceptable. Failure of webpages to load at all was frequent. The current follow-up study compares these loading times with the complexity of the Internet routes linking the Web users and the Web servers hosting them. The study was conducted in the same 12 Asian countries, with the assistance of members of the International Development Research Centre’s PANdora distance education research network. The data were generated by network members in Bhutan, Cambodia, India, Indonesia, Laos, Mongolia, the Philippines, Sri Lanka, Pakistan, Singapore, Thailand, and Vietnam. Additional data for the follow-up study were collected in China. Using a ‘traceroute’ routine, the study indicates that webpage loading time is linked to the complexity of the Internet routes between Web users and the host server. It is indicated that distance educators can apply such information in the design of improved online delivery and mirror sites, notably in areas of the developing world which currently lack an effective infrastructure for online education.
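    The route-complexity measurement can be approximated by counting the hops a traceroute reports. The rough sketch below assumes a Unix-like system with a traceroute binary on the PATH; the parsing is simplified and the study's actual tooling is not reproduced here.

```python
# Rough hop-count sketch using the system traceroute (Unix-like assumed).
import subprocess

def hop_count(host):
    """Approximate number of hops traceroute reports to `host`."""
    out = subprocess.run(["traceroute", "-m", "30", host],
                         capture_output=True, text=True, timeout=120).stdout
    return len(out.splitlines()) - 1   # first line is the header

print(hop_count("example.org"))
```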

  10. Web Application Vulnerabilities

    OpenAIRE

    Yadav, Bhanu

    2014-01-01

    Web application security has been a major issue in information technology since the advent of dynamic web applications. The main objective of this project was to carry out a detailed study of the top three web application vulnerabilities: injection; cross-site scripting; and broken authentication and session management. The project presents the situations in which an application can be vulnerable to these web threats and finally provides preventative measures against them.

  11. WebMGA: a customizable web server for fast metagenomic sequence analysis.

    Science.gov (United States)

    Wu, Sitao; Zhu, Zhengwei; Fu, Liming; Niu, Beifang; Li, Weizhong

    2011-09-07

    The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers are facing tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; metagenomic annotation also involves a wide range of computational tools, which are difficult for common users to install and maintain. The tools provided by the few available web servers are also limited and have various constraints such as login requirements, long waiting times, and the inability to configure pipelines. We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contamination, taxonomic analysis, functional annotation, etc. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  12. WebMGA: a customizable web server for fast metagenomic sequence analysis

    Directory of Open Access Journals (Sweden)

    Niu Beifang

    2011-09-01

    Full Text Available Abstract Background The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers are facing tremendous challenges in metagenomic data analysis due to the huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; metagenomic annotation also involves a wide range of computational tools, which are difficult for common users to install and maintain. The tools provided by the few available web servers are also limited and have various constraints such as login requirements, long waiting times, and the inability to configure pipelines. Results We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools for ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contamination, taxonomic analysis, functional annotation, etc. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analyses or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. Conclusions WebMGA offers researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  13. Web party effect: a cocktail party effect in the web environment.

    Science.gov (United States)

    Rigutti, Sara; Fantoni, Carlo; Gerbino, Walter

    2015-01-01

    In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of variable complexity, in order to reach an information goal located two clicks away from the starting home page. The time needed to reach the goal was accounted for by a novel measure of home page complexity based on a part of (not all) web options: the number of links embedded within web navigation elements weighted by the number and type of embedding elements. Our measure fully mediated the effect of several standard complexity metrics (the overall number of links, words, images, graphical regions, the JPEG file size of home page screenshots) on information seeking time and usability ratings. Furthermore, it predicted the cognitive demand of web navigation, as revealed by the duration judgment ratio (i.e., the ratio of subjective to objective duration of information search). Results demonstrate that focusing on relevant links while ignoring other web objects optimizes the deployment of attentional resources necessary to navigation. This is in line with a web party effect (i.e., a cocktail party effect in the web environment): users tune into web elements that are relevant for the achievement of their navigation goals and tune out all others.
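    The weighted link measure can be made concrete with a toy parser: count links only inside navigation containers, weighted by how deeply they are embedded. The container tag set and the depth weighting below are assumptions, not the paper's exact metric.

```python
# Toy version of a weighted navigation-link count (sketch, assumed weights).
from html.parser import HTMLParser

NAV_TAGS = {"nav", "ul", "ol", "menu"}   # assumed navigation containers

class NavLinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0        # number of enclosing navigation elements
        self.score = 0.0

    def handle_starttag(self, tag, attrs):
        if tag in NAV_TAGS:
            self.depth += 1
        elif tag == "a" and self.depth > 0:
            self.score += self.depth    # weight each link by nesting depth

    def handle_endtag(self, tag):
        if tag in NAV_TAGS and self.depth > 0:
            self.depth -= 1

p = NavLinkCounter()
p.feed("<nav><ul><li><a href='/a'>A</a></li><li><a href='/b'>B</a></li></ul></nav>")
print(p.score)   # 2 links, each at depth 2 -> 4.0
```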

  14. WEB LOG EXPLORER – CONTROL OF MULTIDIMENSIONAL DYNAMICS OF WEB PAGES

    Directory of Open Access Journals (Sweden)

    Mislav Šimunić

    2012-07-01

    Full Text Available Demand markets dictate and pose increasingly more requirements to the supply market that are not easily satisfied. The supply market, presenting its web pages to the demand market, should find the best and quickest ways to respond promptly to the changes dictated by the demand market. The question is how to do that in the most efficient and quickest way. The data on the usage of web pages on a specific web site are recorded in a log file. The data in a log file are stochastic and unordered and require systematic monitoring, categorization, analysis, and weighing. From the data processed in this way, it is necessary to single out and sort the data by their importance, as a basis for a continuous generation of dynamics/changes to the web site pages in line with the criterion chosen. To perform those tasks successfully, a new software solution is required. For that purpose, the authors have developed the first version of the WLE (WebLogExplorer) software solution, which is a realization of web page multidimensionality and of the web site as a whole. WebLogExplorer enables statistical and semantic analysis of a log file and, on the basis thereof, multidimensional control of the web page dynamics. The experimental part of the work was done within the web site of HTZ (Croatian National Tourist Board), the main portal of the global tourist supply in the Republic of Croatia (on average, the daily log consists of c. 600,000 sets, the average size of a log file is 127 Mb, and there are c. 7000-8000 daily visitors on the web site).
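    The first step such a tool automates, turning a stochastic, unordered log into ranked usage data, fits in a few lines. The sketch below parses a Common Log Format file and ranks pages by hits; it is an illustration of the idea, not WebLogExplorer itself.

```python
# Minimal access-log screening: rank requested pages by hit count (sketch).
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

def top_pages(path, n=10):
    hits = Counter()
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LINE.search(line)
            if m:
                hits[m.group(1)] += 1     # count each requested URL
    return hits.most_common(n)

# Usage: top_pages("access.log") -> [("/index.html", 1234), ...]
```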

  15. Web-Based Course Management and Web Services

    Science.gov (United States)

    Mandal, Chittaranjan; Sinha, Vijay Luxmi; Reade, Christopher M. P.

    2004-01-01

    The architecture of a web-based course management tool that has been developed at IIT [Indian Institute of Technology], Kharagpur and which manages the submission of assignments is discussed. Both the distributed architecture used for data storage and the client-server architecture supporting the web interface are described. Further developments…

  16. Web-page Prediction for Domain Specific Web-search using Boolean Bit Mask

    OpenAIRE

    Sinha, Sukanta; Duttagupta, Rana; Mukhopadhyay, Debajyoti

    2012-01-01

    A search engine is a Web-page retrieval tool. Nowadays, Web searchers make the best use of their time by using an efficient search engine. To improve the performance of the search engine, we introduce a unique mechanism which will give Web searchers more relevant search results. In this paper, we discuss a domain-specific Web search prototype which generates the predicted Web-page list for a user-given search string using a Boolean bit mask.
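    The core idea of Boolean bit-mask matching can be sketched briefly: each page stores a bitmask of the domain terms it contains, and a query matches when all of its term bits are present. The toy term table below is an assumption for illustration; the paper's actual masking scheme may differ.

```python
# Toy Boolean bit-mask matching for a domain-specific index (sketch).
TERMS = {"hotel": 1 << 0, "flight": 1 << 1, "visa": 1 << 2}   # toy term table

def mask(words):
    m = 0
    for w in words:
        m |= TERMS.get(w, 0)   # set the bit for each known domain term
    return m

pages = {"page1": mask(["hotel", "flight"]), "page2": mask(["visa"])}
query = mask(["hotel"])
# A page matches when every query bit is set in its mask.
matches = [p for p, m in pages.items() if m & query == query]
print(matches)   # ['page1']
```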

  17. Web X-Ray: Developing and Adopting Web Best Practices in Enterprises

    Directory of Open Access Journals (Sweden)

    Reinaldo Ferreira

    2016-12-01

    Full Text Available The adoption of Semantic Web technologies constitutes a promising approach to data structuring and integration, both for public and private usage. While these technologies have been around for some time, their adoption is behind overall expectations, particularly in the case of enterprises. With that in mind, we developed a Semantic Web Implementation Model that measures and facilitates the implementation of the technology. The advantages of using the proposed model are two-fold: the model serves as a guide for driving the implementation of the Semantic Web, and it helps to evaluate the impact of the introduction of the technology. The model was adopted by 19 enterprises in an Action Research intervention of one year, with promising results: according to the model's scale, on average, all enterprises evolved from a 6% evaluation to 46% during that period. Furthermore, practical implementation recommendations, a typical consulting tool, were developed and adopted during the project by all enterprises, providing important guidelines for the identification of a development path that may be adopted on a larger scale. Meanwhile, the project also outlined that most enterprises were interested in an even broader scope of the Implementation Model, and the ambition of an "All Web Technologies" approach arose: one model that could embrace the observable overlapping of different Web generations, namely the Web of Documents, the Social Web, the Web of Data and, ultimately, the Web of Context; one model that could combine the evaluation and guidance for all enterprises to follow. That is the goal of the ongoing "Project Web X-ray", which aims to involve 200 enterprises in the adoption of best practices that may lead to their business development based on Web technologies. This paper presents a case of how Action Research promoted the simultaneous advancement of academic research and enterprise development, and introduces the framework and opportunities.

  18. Update of the FANTOM web resource: high resolution transcriptome of diverse cell types in mammals.

    Science.gov (United States)

    Lizio, Marina; Harshbarger, Jayson; Abugessaisa, Imad; Noguchi, Shuei; Kondo, Atsushi; Severin, Jessica; Mungall, Chris; Arenillas, David; Mathelier, Anthony; Medvedeva, Yulia A; Lennartsson, Andreas; Drabløs, Finn; Ramilowski, Jordan A; Rackham, Owen; Gough, Julian; Andersson, Robin; Sandelin, Albin; Ienasescu, Hans; Ono, Hiromasa; Bono, Hidemasa; Hayashizaki, Yoshihide; Carninci, Piero; Forrest, Alistair R R; Kasukawa, Takeya; Kawaji, Hideya

    2017-01-04

    Upon the first publication of the fifth iteration of the Functional Annotation of Mammalian Genomes collaborative project, FANTOM5, we gathered a series of primary data and database systems into the FANTOM web resource (http://fantom.gsc.riken.jp) to help researchers explore transcriptional regulation and cellular states. In the course of the collaboration, primary data and analysis results have been expanded, and the functionalities of the database systems enhanced. We believe that our data and web systems are invaluable resources, and we think the scientific community will benefit from this recent update in deepening its understanding of mammalian cellular organization. We introduce the contents of FANTOM5 here, report recent updates to the web resource and provide future perspectives. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. Global Web Accessibility Analysis of National Government Portals and Ministry Web Sites

    DEFF Research Database (Denmark)

    Goodwin, Morten; Susar, Deniz; Nietzio, Annika

    2011-01-01

    Equal access to public information and services for all is an essential part of the United Nations (UN) Declaration of Human Rights. Today, the Web plays an important role in providing information and services to citizens. Unfortunately, many government Web sites are poorly designed and have accessibility barriers that prevent people with disabilities from using them. This article combines current Web accessibility benchmarking methodologies with a sound strategy for comparing Web accessibility among countries and continents. Furthermore, the article presents the first global analysis of the Web accessibility of 192 United Nations Member States, made publicly available. The article also identifies common properties of Member States that have accessible and inaccessible Web sites, and shows that implementing anti-disability-discrimination laws is highly beneficial for the accessibility of Web sites.

  20. Writing for the web composing, coding, and constructing web sites

    CERN Document Server

    Applen, JD

    2013-01-01

    Writing for the Web unites theory, technology, and practice to explore writing and hypertext for website creation. It integrates such key topics as XHTML/CSS coding, writing (prose) for the Web, the rhetorical needs of the audience, theories of hypertext, usability and architecture, and the basics of web site design and technology. Presenting information in digestible parts, this text enables students to write and construct realistic and manageable Web sites with a strong theoretical understanding of how online texts communicate to audiences. Key features of the book

  1. The Role of the Web Server in a Capstone Web Application Course

    Science.gov (United States)

    Umapathy, Karthikeyan; Wallace, F. Layne

    2010-01-01

    Web applications have become commonplace in the Information Systems curriculum. Much of the discussion about Web development for capstone courses has centered on the scripting tools. Very little has been discussed about different ways to incorporate the Web server into Web application development courses. In this paper, three different ways of…

  2. Web party effect: a cocktail party effect in the web environment

    Directory of Open Access Journals (Sweden)

    Sara Rigutti

    2015-03-01

    Full Text Available In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of variable complexity, in order to reach an information goal located two clicks away from the starting home page. The time needed to reach the goal was accounted for by a novel measure of home page complexity based on a part of (not all) web options: the number of links embedded within web navigation elements weighted by the number and type of embedding elements. Our measure fully mediated the effect of several standard complexity metrics (the overall number of links, words, images, graphical regions, the JPEG file size of home page screenshots) on information seeking time and usability ratings. Furthermore, it predicted the cognitive demand of web navigation, as revealed by the duration judgment ratio (i.e., the ratio of subjective to objective duration of information search). Results demonstrate that focusing on relevant links while ignoring other web objects optimizes the deployment of attentional resources necessary to navigation. This is in line with a web party effect (i.e., a cocktail party effect in the web environment): users tune into web elements that are relevant for the achievement of their navigation goals and tune out all others.

  3. Web party effect: a cocktail party effect in the web environment

    Science.gov (United States)

    Gerbino, Walter

    2015-01-01

    In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of variable complexity, in order to reach an information goal located two clicks away from the starting home page. The time needed to reach the goal was accounted for by a novel measure of home page complexity based on a part of (not all) web options: the number of links embedded within web navigation elements weighted by the number and type of embedding elements. Our measure fully mediated the effect of several standard complexity metrics (the overall number of links, words, images, graphical regions, the JPEG file size of home page screenshots) on information seeking time and usability ratings. Furthermore, it predicted the cognitive demand of web navigation, as revealed by the duration judgment ratio (i.e., the ratio of subjective to objective duration of information search). Results demonstrate that focusing on relevant links while ignoring other web objects optimizes the deployment of attentional resources necessary to navigation. This is in line with a web party effect (i.e., a cocktail party effect in the web environment): users tune into web elements that are relevant for the achievement of their navigation goals and tune out all others. PMID:25802803

  4. Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems

    Science.gov (United States)

    Ponyik, Joseph G.; York, David W.

    2002-01-01

    Embedded Systems have traditionally been developed in a highly customized manner. The user interface hardware and software, along with the interface to the embedded system, are typically unique to the system for which they are built, resulting in extra cost to the system in terms of development time and maintenance effort. World Wide Web standards have been developed in the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms, but the World Wide Web standards allow them to interface without knowing the details of the system at the other end of the interface. Embedded Web Technology is the merging of Embedded Systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an Embedded System's internal network.
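    The core idea, an embedded system exposing its state over HTTP so that any browser can act as the user interface, can be illustrated in miniature with the standard library. The port and the placeholder sensor reading below are assumptions for illustration.

```python
# Minimal illustration of the Embedded Web Technology idea (sketch).
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # In a real device this would read actual instrument state.
        body = b"<html><body>Sensor reading: 42</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Any standard browser pointed at port 8080 now serves as the UI.
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```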

  5. Web service composition: a semantic web and automated planning technique application

    Directory of Open Access Journals (Sweden)

    Jaime Alberto Guzmán Luna

    2008-09-01

    Full Text Available This article proposes applying semantic web and artificial intelligence planning techniques to a web services composition model dealing with problems of ambiguity in web service descriptions and the handling of incomplete web information. The model uses OWL-S service descriptions and implements a planning technique which handles open-world semantics in its reasoning process to resolve these problems. This resulted in a web services composition system incorporating a module for interpreting OWL-S services and converting them into a planning problem in PDDL, a planning module handling incomplete information, and an execution service module concurrently interacting with the planner for executing each composition plan service.

  6. Learning from WebQuests

    Science.gov (United States)

    Gaskill, Martonia; McNulty, Anastasia; Brooks, David W.

    2006-04-01

    WebQuests are activities in which students use Web resources to learn about school topics. WebQuests are advocated as constructivist activities and ones generally well regarded by students. Two experiments were conducted in school settings to compare learning using WebQuests versus conventional instruction. Students and teachers both enjoyed WebQuest instruction and spoke highly of it. In one experiment, however, conventional instruction led to significantly greater student learning. In the other, there were no significant differences in the learning outcomes between conventional versus WebQuest-based instruction.

  7. WebEase: Development of a Web-Based Epilepsy Self-Management Intervention

    OpenAIRE

    DiIorio, Colleen; Escoffery, Cam; Yeager, Katherine A.; Koganti, Archana; Reisinger, Elizabeth; McCarty, Frances; Henry, Thomas R.; Robinson, Elise; Kobau, Rosemarie; Price, Patricia

    2008-01-01

    People with epilepsy must adopt many self-management behaviors, especially regarding medication adherence, stress management, and sleep quality. In response to the need for theory-based self-management programs that people with epilepsy can easily access, the WebEase Web site was created and tested for feasibility, acceptability, and usability. This article discusses the theoretical background and developmental phases of WebEase and lessons learned throughout the development process. The WebE...

  8. A Web System Trace Model and Its Application to Web Design

    OpenAIRE

    Kong, Xiaoying; Liu, Li; Lowe, David

    2007-01-01

    Traceability analysis is crucial to the development of web-centric systems, particularly those with frequent system changes, fine-grained evolution and maintenance, and high level of requirements uncertainty. A trace model at the level of the web system architecture is presented in this paper to address the specific challenges of developing web-centric systems. The trace model separates the concerns of different stakeholders in the web development life cycle into viewpoints; and c...

  9. The effectiveness of web-based, multimedia tutorials for teaching methods of human body composition analysis.

    Science.gov (United States)

    Buzzell, Paul R; Chamberlain, Valerie M; Pintauro, Stephen J

    2002-12-01

    This study examined the effectiveness of a series of Web-based, multimedia tutorials on methods of human body composition analysis. Tutorials were developed around four body composition topics: hydrodensitometry (underwater weighing), dual-energy X-ray absorptiometry, bioelectrical impedance analysis, and total body electrical conductivity. Thirty-two students enrolled in the course were randomly assigned to learn the material through either the Web-based tutorials only ("Computer"), a traditional lecture format ("Lecture"), or lectures supplemented with Web-based tutorials ("Both"). All students were administered a validated pretest before randomization and an identical posttest at the completion of the course. The reliability of the test was 0.84. The mean score changes from pretest to posttest were not significantly different among the groups (65.4 ± 17.31, 78.82 ± 21.50, and 76 ± 21.22 for the Computer, Both, and Lecture groups, respectively). Additionally, a Likert-type assessment found equally positive attitudes toward all three formats. The results indicate that Web-based tutorials are as effective as the traditional lecture format for teaching these topics.
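    The group comparison reported above corresponds to a one-way ANOVA on the pre/post score changes. The sketch below shows the shape of that test; the score arrays are invented stand-ins, not the study's data.

```python
# One-way ANOVA on score changes across three instruction groups (sketch).
from scipy.stats import f_oneway

computer = [60, 70, 48, 83, 66]   # invented pre-to-post score changes
both     = [75, 92, 61, 88, 78]
lecture  = [70, 95, 58, 85, 72]

stat, p = f_oneway(computer, both, lecture)
print(f"F = {stat:.2f}, p = {p:.3f}")   # p > 0.05 -> no significant difference
```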

  10. Take control of iWeb

    CERN Document Server

    Sande, Steve

    2009-01-01

    Learn how to make useful, attractive Web sites with iWeb! Apple's iWeb aims to help you build an attractive Web site quickly and easily, but not all of iWeb's features are fully explained. If you want step-by-step instructions and plenty of time-saving tips, Web pro Steve Sande can help. In Take Control of iWeb, Steve walks you through all the steps for building an iWeb site and uploading it to .Mac or to another Web host. You can look over his shoulder as he enhances iWeb's templates with a designer's eye, using tools like masks, reflections, and Instant Alpha.Steve teaches you the best ways

  11. Het WEB leert begrijpen [The Web learns to understand]

    CERN Multimedia

    Stroeykens, Steven

    2004-01-01

    The Web could be much more useful if computers understood some of the information on Web pages. That is the goal of the "Semantic Web", a project in which, among others, Tim Berners-Lee, the inventor of the original Web, takes part.

  12. WebFTS: File Transfer Web Interface for FTS3

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    WebFTS is a web-delivered file transfer and management solution which allows users to invoke reliable, managed data transfers on distributed infrastructures. The fully open source solution offers a simple graphical interface through which the power of the FTS3 service can be accessed without the installation of any special grid tools. Created following simplicity and efficiency criteria, WebFTS allows the user to access and interact with multiple grid and cloud storage. The “transfer engine” used is FTS3, the service responsible for distributing the majority of LHC data across WLCG infrastructure. This provides WebFTS with reliable, multi-protocol, adaptively optimised data transfers.The talk will focus on the recent development which allows transfers from/to Dropbox and CERNBox (CERN ownCloud deployment)

  13. Web Search Engines

    OpenAIRE

    Rajashekar, TB

    1998-01-01

    The World Wide Web is emerging as an all-in-one information source. Tools for searching Web-based information include search engines, subject directories and meta search tools. We take a look at key features of these tools and suggest practical hints for effective Web searching.

  14. Web-based (HTML5) interactive graphics for fusion research and collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Kim, E.N., E-mail: kimny@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA (United States); Schissel, D.P.; Abla, G.; Flanagan, S.; Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA (United States)

    2012-12-15

    Highlights: ► Interactive data visualization is supported via the Web without a browser plugin and provides users easy, real-time access to data of different types from various locations. ► Crosshair, zoom, pan as well as toggling dimensionality and a slice bar for multi-dimensional data are available. ► Data with PHP API can be applied: MDSplus and SQL have been tested. ► Modular in design, this has been deployed to support both the experimental and the simulation research arenas. - Abstract: With the continuing development of web technologies, it is becoming feasible for websites to operate a lot like a scientific desktop application. This has opened up more possibilities for utilizing the web browser for interactive scientific research and providing new means of on-line communication and collaboration. This paper describes the research and deployment for utilizing these enhanced web graphics capabilities on the fusion research tools which has led to a general toolkit that can be deployed as required. It allows users to dynamically create, interact with and share with others, the large sets of data generated by the fusion experiments and simulations. Hypertext Preprocessor (PHP), a general-purpose scripting language for the Web, is used to process a series of inputs, and determine the data source types and locations to fetch and organize the data. Protovis, a Javascript and SVG based web graphics package, then quickly draws the interactive graphs and makes it available to the worldwide audience. This toolkit has been deployed to both the simulation and experimental arenas. The deployed applications will be presented as well as the architecture and technologies used in producing the general graphics toolkit.

  15. WebSpy: An Architecture for Monitoring Web Server Availability in a Multi-Platform Environment

    Directory of Open Access Journals (Sweden)

    Madhan Mohan Thirukonda

    2002-01-01

    Full Text Available For an electronic business (e-business), customer satisfaction can be the difference between long-term success and short-term failure. Customer satisfaction is highly impacted by Web server availability, as customers expect a Web site to be available twenty-four hours a day and seven days a week. Unfortunately, unscheduled Web server downtime is often beyond the control of the organization. What is needed is an effective means of identifying and recovering from Web server downtime in order to minimize the negative impact on the customer. An automated architecture, called WebSpy, has been developed to notify administrators and to take immediate action when Web server downtime is detected. This paper describes the WebSpy architecture and differentiates it from other popular Web monitoring tools. The results of a case study are presented as a means of demonstrating WebSpy's effectiveness in monitoring Web server availability.
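    The detection-and-notification loop at the heart of such an architecture can be sketched briefly. The URL, polling interval, and alert action below are placeholders; a production monitor like WebSpy would add escalation, logging, and recovery actions.

```python
# Availability-monitor sketch: poll a server and alert on downtime.
import time
import urllib.request

def monitor(url, interval=60):
    while True:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                ok = resp.status == 200
        except OSError:             # covers network and HTTP-level failures
            ok = False
        if not ok:
            print(f"ALERT: {url} unreachable")   # replace with a real notification
        time.sleep(interval)

# Usage (runs until interrupted): monitor("https://example.org/")
```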

  16. Maximum Spanning Tree Model on Personalized Web Based Collaborative Learning in Web 3.0

    OpenAIRE

    Padma, S.; Seshasaayee, Ananthi

    2012-01-01

    Web 3.0 is an evolving extension of the current web environment. Information in Web 3.0 can be collaborated on and communicated when queried. The Web 3.0 architecture provides an excellent learning experience to students. Web 3.0 is 3D, media-centric and semantic. Web-based learning has been on the rise in recent years. Web 3.0 has intelligent agents as tutors to collect and disseminate the answers to the queries by the students. Completely interactive learner queries determine the customization of...

  17. Sustainable Materials Management (SMM) Web Academy Webinar: Compost from Food Waste: Understanding Soil Chemistry and Soil Biology on a College/University Campus

    Science.gov (United States)

    This page contains information about the Sustainable Materials Management (SMM) Web Academy Webinar Series titled Compost from Food Waste: Understanding Soil Chemistry and Soil Biology on a College/University Campus

  18. Implementasi Metode Fuzzy Time Series Cheng untuk Prediksi Konsentrasi Gas NO2 di Udara [Implementation of the Cheng Fuzzy Time Series Method for Predicting NO2 Gas Concentration in the Air]

    Directory of Open Access Journals (Sweden)

    M Yoka Fathoni

    2017-05-01

    Full Text Available The forecasting process is essential for determining air quality when monitoring NO2 gas in the air. This research aims to develop a prediction information system for NO2 gas in the air, using the Cheng Fuzzy Time Series method. The acquisition of NO2 gas data is integrated with a multichannel, multistation setup: data are acquired over a Wireless Sensor Network and sent via broadband internet to an online database on the web server. The recorded data serve as input for prediction. The stored measurements are fuzzified by partitioning the universe of discourse into intervals and re-dividing the first partition, and the Fuzzy Logical Relationships and Fuzzy Logical Relationship Groups are determined from the historical data, from which the predicted NO2 gas concentration is obtained. Using a sample of 36 NO2 gas measurements, a root mean squared error of 2.08% is obtained. This result indicates that the Cheng Fuzzy Time Series method is good enough to be used in predicting NO2 gas concentrations.
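    The pipeline above condenses into three steps: interval partitioning of the universe of discourse, fuzzification, and grouping of the fuzzy logical relationships. The sketch below shows those steps in simplified form; a full Cheng implementation also weights recurrent relations, which is omitted here.

```python
# Condensed fuzzy time-series steps (sketch; not the paper's full method).
import numpy as np

def fuzzify(series, n_intervals=7):
    """Assign each value an interval label over the universe of discourse."""
    lo, hi = series.min(), series.max()
    edges = np.linspace(lo, hi, n_intervals + 1)
    return np.digitize(series, edges[1:-1])       # labels 0..n_intervals-1

def flrg(labels):
    """Group Fuzzy Logical Relationships A_i -> A_j by antecedent."""
    groups = {}
    for a, b in zip(labels[:-1], labels[1:]):
        groups.setdefault(int(a), []).append(int(b))
    return groups

def rmse(actual, predicted):
    return float(np.sqrt(np.mean((np.asarray(actual) - np.asarray(predicted)) ** 2)))

labels = fuzzify(np.array([30.1, 32.4, 31.0, 35.2, 33.8]))
print(flrg(labels))
```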

  19. Quick Way to Port Existing C/C++ Chemoinformatics Toolkits to the Web Using Emscripten.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi

    2017-10-23

    Emscripten is a special open source compiler that compiles C and C++ code into JavaScript. By utilizing this compiler, some typical C/C++ chemoinformatics toolkits and libraries are quickly ported to the web. The compiled JavaScript files have sizes similar to the native programs, and from a series of constructed benchmarks, the performance of the compiled JavaScript code is also close to that of the native code and better than that of handwritten JavaScript. Therefore, we believe that Emscripten is a feasible and practical tool for reusing existing C/C++ code on the web, and many other chemoinformatics or molecular calculation software tools can also be easily ported with Emscripten.
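
    The workflow the abstract describes (compile C with emcc, then call the result from script) can be sketched as follows; the C function, the build flags shown in the comments, and the createModule factory name are illustrative assumptions, not details from the paper.

```typescript
// Illustrative Emscripten workflow, not one of the paper's toolkits.
//
//   // add.c
//   int add(int a, int b) { return a + b; }
//
//   emcc add.c -o add.js -sMODULARIZE -sEXPORTED_FUNCTIONS=_add \
//        -sEXPORTED_RUNTIME_METHODS=cwrap
//
// With -sMODULARIZE, the generated add.js exports a factory that resolves
// to the Emscripten module; it is declared here and provided at runtime.
declare function createModule(): Promise<{
  cwrap: (name: string, ret: string, args: string[]) => (...xs: number[]) => number;
}>;

async function main(): Promise<void> {
  const Module = await createModule();
  // cwrap binds the exported C symbol to an ordinary JavaScript function.
  const add = Module.cwrap("add", "number", ["number", "number"]);
  console.log(add(2, 3)); // -> 5, computed by the compiled C code
}
main();
```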

  20. Web Mining and Social Networking

    CERN Document Server

    Xu, Guandong; Li, Lin

    2011-01-01

    This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web mining, and the issue of how to incorporate web mining into web personalization and recommendation systems are also reviewed. Additionally, the volume explores web community mining and analysis to find the structural, organizational and temporal developments of web communities and reveal the societal s

  1. Semantic Web Primer

    NARCIS (Netherlands)

    Antoniou, Grigoris; Harmelen, Frank van

    2004-01-01

    The development of the Semantic Web, with machine-readable content, has the potential to revolutionize the World Wide Web and its use. A Semantic Web Primer provides an introduction and guide to this still emerging field, describing its key ideas, languages, and technologies. Suitable for use as a

  2. Uncertainty visualisation in the Model Web

    Science.gov (United States)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools for visualising such data in a standardised way exist. Furthermore, they are usually realised as thick clients and lack functionality for handling data coming from web services, as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered in the tool comprises probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool
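
    As a rough illustration of the kind of OpenLayers-based client described here, a minimal map set-up with the current ol package might look as follows; the original UncertWeb tool used the older OpenLayers 2 API, and none of its uncertainty plotting is reproduced.

```typescript
// Not the UncertWeb client itself: just a minimal OpenLayers map of the
// kind the abstract describes, using the current "ol" npm package.
import Map from "ol/Map";
import View from "ol/View";
import TileLayer from "ol/layer/Tile";
import OSM from "ol/source/OSM";

const map = new Map({
  target: "map", // id of a <div> in the host page
  layers: [new TileLayer({ source: new OSM() })],
  view: new View({ center: [0, 0], zoom: 2 }),
});

// An uncertainty layer (e.g. a WMS rendering of per-pixel standard
// deviations) would be added as a further layer on top of the base map.
console.log(map.getLayers().getLength()); // -> 1
```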

  3. CSAR-web: a web server of contig scaffolding using algebraic rearrangements.

    Science.gov (United States)

    Chen, Kun-Tze; Lu, Chin Lung

    2018-05-04

    CSAR-web is a web-based tool that allows the users to efficiently and accurately scaffold (i.e. order and orient) the contigs of a target draft genome based on a complete or incomplete reference genome from a related organism. It takes as input a target genome in multi-FASTA format and a reference genome in FASTA or multi-FASTA format, depending on whether the reference genome is complete or incomplete, respectively. In addition, it requires the users to choose either 'NUCmer on nucleotides' or 'PROmer on translated amino acids' for CSAR-web to identify conserved genomic markers (i.e. matched sequence regions) between the target and reference genomes, which are used by the rearrangement-based scaffolding algorithm in CSAR-web to order and orient the contigs of the target genome based on the reference genome. In the output page, CSAR-web displays its scaffolding result in a graphical mode (i.e. scalable dotplot) allowing the users to visually validate the correctness of scaffolded contigs and in a tabular mode allowing the users to view the details of scaffolds. CSAR-web is available online at http://genome.cs.nthu.edu.tw/CSAR-web.

  4. Going, going, still there: using the WebCite service to permanently archive cited web pages.

    Science.gov (United States)

    Eysenbach, Gunther; Trudel, Mathieu

    2005-12-30

    Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full-text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. In addition, WebCite can process publisher-submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, retrospectively caching the references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research

  5. WebGL and web audio software lightweight components for multimedia education

    Science.gov (United States)

    Chang, Xin; Yuksel, Kivanc; Skarbek, Władysław

    2017-08-01

    The paper presents the results of our recent work on the development of the contemporary computing platform DC2 for multimedia education using WebGL and Web Audio, the W3C standards. Using the literate programming paradigm, the WEBSA educational tools were developed. They offer the user (student) access to an expandable collection of WebGL shaders and Web Audio scripts. The unique feature of DC2 is the option of literate programming, offered to both the author and the reader in order to improve the interactivity of lightweight WebGL and Web Audio components. For instance, users can define: source audio nodes, including synthetic sources; destination audio nodes; and nodes for audio processing such as sound wave shaping, spectral band filtering, convolution-based modification, etc. In the case of WebGL, besides classic graphics effects based on mesh and fractal definitions, novel image processing and analysis by shaders is offered, such as nonlinear filtering, histograms of gradients, and Bayesian classifiers.
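
    The Web Audio node graphs the abstract mentions (synthetic sources, wave shaping, band filtering) follow a standard pattern. Below is a minimal sketch using only the stock browser API; the WEBSA/DC2 component wrappers themselves are not shown.

```typescript
// A small Web Audio node graph: synthetic source -> band filter ->
// wave shaper -> destination. Standard browser API only.
const ctx = new AudioContext();

const osc = ctx.createOscillator();      // synthetic source node
osc.type = "sawtooth";
osc.frequency.value = 220;               // Hz

const filter = ctx.createBiquadFilter(); // spectral band filtering
filter.type = "bandpass";
filter.frequency.value = 440;

const shaper = ctx.createWaveShaper();   // sound wave shaping
shaper.curve = Float32Array.from({ length: 256 },
  (_, i) => Math.tanh(4 * (i / 128 - 1)));

osc.connect(filter).connect(shaper).connect(ctx.destination);
osc.start();
osc.stop(ctx.currentTime + 2);           // play for two seconds
```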

  6. Applying Web usage mining for personalizing hyperlinks in Web-based adaptive educational systems

    NARCIS (Netherlands)

    Romero, C.; Ventura, S.; Zafra, A.; Bra, de P.M.E.

    2009-01-01

    Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender

  7. The Semantic Web Revisited

    OpenAIRE

    Shadbolt, Nigel; Berners-Lee, Tim; Hall, Wendy

    2006-01-01

    The original Scientific American article on the Semantic Web appeared in 2001. It described the evolution of a Web that consisted largely of documents for humans to read to one that included data and information for computers to manipulate. The Semantic Web is a Web of actionable information--information derived from data through a semantic theory for interpreting the symbols.This simple idea, however, remains largely unrealized. Shopbots and auction bots abound on the Web, but these are esse...

  8. Soil-Web: An online soil survey for California, Arizona, and Nevada

    Science.gov (United States)

    Beaudette, D. E.; O'Geen, A. T.

    2009-10-01

    Digital soil survey products represent one of the largest and most comprehensive inventories of soils information currently available. The complex structure of these databases, intensive use of codes and scientific jargon make it difficult for non-specialists to utilize digital soil survey resources. A project was initiated to construct a web-based interface to digital soil survey products (STATSGO and SSURGO) for California, Arizona, and Nevada that would be accessible to the general public. A collection of mature, open source applications (including Mapserver, PostGIS and Apache Web Server) were used as a framework to support data storage, querying, map composition, data presentation, and contextual links to related materials. Application logic was written in the PHP language to "glue" together the many components of an online soil survey. A comprehensive website ( http://casoilresource.lawr.ucdavis.edu/map) was created to facilitate access to digital soil survey databases through several interfaces including: interactive map, Google Earth and HTTP-based application programming interface (API). Each soil polygon is linked to a map unit summary page, which includes links to soil component summary pages. The most commonly used soil properties, land interpretations and ratings are presented. Graphical and tabular summaries of soil profile information are dynamically created, and aid with rapid assessment of key soil properties. Quick links to official series descriptions (OSD) and other such information are presented. All terminology is linked back to the USDA-NRCS Soil Survey Handbook which contains extended definitions. The Google Earth interface to Soil-Web can be used to explore soils information in three dimensions. A flexible web API was implemented to allow advanced users of soils information to access our website via simple web page requests. Soil-Web has been successfully used in soil science curriculum, outreach activities, and current research projects
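
    A hedged sketch of the "simple web page request" access pattern follows; the endpoint path and parameter names below are invented for illustration and are not the documented Soil-Web API.

```typescript
// Point query against a Soil-Web-style HTTP API. The /api path and the
// lon/lat parameter names are hypothetical; consult the Soil-Web site
// for the real interface.
async function querySoilAtPoint(lon: number, lat: number): Promise<string> {
  const url =
    `https://casoilresource.lawr.ucdavis.edu/api?lon=${lon}&lat=${lat}`; // hypothetical
  const res = await fetch(url);
  return res.text(); // e.g. map unit and component summary for the point
}

querySoilAtPoint(-121.74, 38.54).then(console.log);
```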

  9. Advanced web services

    CERN Document Server

    Bouguettaya, Athman; Daniel, Florian

    2013-01-01

    Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet. This book is the second installment of a two-book collection covering the state-o

  10. Web-sovelluskehityksen tekniikat (Web Application Development Techniques)

    OpenAIRE

    Kettunen, Werner

    2015-01-01

    There are many different techniques, tools and program libraries used for web application development, and their approaches to web application development differ somewhat from one another. The thesis examines, both in theory and by means of a practical example project, the techniques and libraries most commonly used in web application development. The example web application created in the thesis used the Laravel framework and the tools and libraries covered in the first part, such as Bootstrap and ...

  11. Learning from WebQuests

    Science.gov (United States)

    Gaskill, Martonia; McNulty, Anastasia; Brooks, David W.

    2006-01-01

    WebQuests are activities in which students use Web resources to learn about school topics. WebQuests are advocated as constructivist activities and ones generally well regarded by students. Two experiments were conducted in school settings to compare learning using WebQuests versus conventional instruction. Students and teachers both enjoyed…

  12. Evaluating Web Usability

    Science.gov (United States)

    Snider, Jean; Martin, Florence

    2012-01-01

    Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…

  13. FedWeb Greatest Hits: Presenting the New Test Collection for Federated Web Search

    NARCIS (Netherlands)

    Demeester, Thomas; Trieschnigg, Rudolf Berend; Zhou, Ke; Nguyen, Dong-Phuong; Hiemstra, Djoerd

    This paper presents 'FedWeb Greatest Hits', a large new test collection for research in web information retrieval. As a combination and extension of the datasets used in the TREC Federated Web Search Track, this collection opens up new research possibilities on federated web search challenges, as

  14. Web technology in the separation of strontium and cesium from INEL-ICPP radioactive acid waste (WM-185)

    International Nuclear Information System (INIS)

    Bray, L.A.; Brown, G.N.

    1995-01-01

    Strontium and cesium were successfully removed from radioactive acidic waste (WM-185) at the Idaho National Engineering Laboratory, Idaho Chemical Processing Plant (ICPP), with web technology from 3M and IBC Advanced Technologies, Inc. (IBC). A technical team from Pacific Northwest Laboratory, ICPP, 3M and IBC conducted a very successful series of experiments from August 15 through 18, 1994. The ICPP, Remote Analytical Laboratory, Idaho Falls, Idaho, provided the hot cell facilities and staff to complete these milestone experiments. The actual waste experiments duplicated the initial 'cold' simulated waste results and confirmed the selective removal provided by ligand-particle web technology

  15. The use of web ontology languages and other semantic web tools in drug discovery.

    Science.gov (United States)

    Chen, Huajun; Xie, Guotong

    2010-05-01

    To optimize drug development processes, pharmaceutical companies require principled approaches to integrate disparate data on a unified infrastructure, such as the web. The semantic web, built on web technology, provides a common, open framework capable of harmonizing diversified resources to enable networked and collaborative drug discovery. We survey the state of the art of utilizing web ontologies and other semantic web technologies to interlink both data and people to support integrated drug discovery across domains and multiple disciplines. Particularly, the survey covers three major application categories including: i) semantic integration and open data linking; ii) semantic web service and scientific collaboration and iii) semantic data mining and integrative network analysis. The reader will gain: i) basic knowledge of the semantic web technologies; ii) an overview of the web ontology landscape for drug discovery and iii) a basic understanding of the values and benefits of utilizing the web ontologies in drug discovery. i) The semantic web enables a network effect for linking open data for integrated drug discovery; ii) The semantic web service technology can support instant ad hoc collaboration to improve pipeline productivity and iii) The semantic web encourages publishing data in a semantic way such as resource description framework attributes and thus helps move away from a reliance on pure textual content analysis toward more efficient semantic data mining.

  16. Automatically exposing OpenLifeData via SADI semantic Web Services.

    Science.gov (United States)

    González, Alejandro Rodríguez; Callahan, Alison; Cruz-Toledo, José; Garcia, Adrian; Egaña Aranguren, Mikel; Dumontier, Michel; Wilkinson, Mark D

    2014-01-01

    Two distinct trends are emerging with respect to how data is shared, collected, and analyzed within the bioinformatics community. First, Linked Data, exposed as SPARQL endpoints, promises to make data easier to collect and integrate by moving towards the harmonization of data syntax, descriptive vocabularies, and identifiers, as well as providing a standardized mechanism for data access. Second, Web Services, often linked together into workflows, normalize data access and create transparent, reproducible scientific methodologies that can, in principle, be re-used and customized to suit new scientific questions. Constructing queries that traverse semantically-rich Linked Data requires substantial expertise, yet traditional RESTful or SOAP Web Services cannot adequately describe the content of a SPARQL endpoint. We propose that content-driven Semantic Web Services can enable facile discovery of Linked Data, independent of their location. We use a well-curated Linked Dataset - OpenLifeData - and utilize its descriptive metadata to automatically configure a series of more than 22,000 Semantic Web Services that expose all of its content via the SADI set of design principles. The OpenLifeData SADI services are discoverable via queries to the SHARE registry and easy to integrate into new or existing bioinformatics workflows and analytical pipelines. We demonstrate the utility of this system through comparison of Web Service-mediated data access with traditional SPARQL, and note that this approach not only simplifies data retrieval, but simultaneously provides protection against resource-intensive queries. We show, through a variety of different clients and examples of varying complexity, that data from the myriad OpenLifeData can be recovered without any need for prior-knowledge of the content or structure of the SPARQL endpoints. We also demonstrate that, via clients such as SHARE, the complexity of federated SPARQL queries is dramatically reduced.
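
    For comparison, the plain SPARQL access the authors benchmark against can be sketched as follows; the endpoint URL is a placeholder, but the query parameter and the JSON results format are the standard SPARQL 1.1 protocol.

```typescript
// Query a SPARQL endpoint over HTTP and print the bindings.
const ENDPOINT = "https://example.org/sparql"; // hypothetical endpoint

const query = `
  SELECT ?s ?p ?o
  WHERE { ?s ?p ?o }
  LIMIT 5
`;

async function runQuery(): Promise<void> {
  const res = await fetch(`${ENDPOINT}?query=${encodeURIComponent(query)}`, {
    headers: { Accept: "application/sparql-results+json" },
  });
  const json = await res.json();
  // SPARQL JSON results: bindings is an array of variable -> value maps.
  for (const b of json.results.bindings) {
    console.log(b.s.value, b.p.value, b.o.value);
  }
}

runQuery();
```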

  17. Design, Development and Testing of Web Services for Multi-Sensor Snow Cover Mapping

    Science.gov (United States)

    Kadlec, Jiri

    This dissertation presents the design, development and validation of new data integration methods for mapping the extent of snow cover based on open access ground station measurements, remote sensing images, volunteer observer snow reports, and cross country ski track recordings from location-enabled mobile devices. The first step of the data integration procedure includes data discovery, data retrieval, and data quality control of snow observations at ground stations. The WaterML R package developed in this work enables hydrologists to retrieve and analyze data from multiple organizations that are listed in the Consortium of Universities for the Advancement of Hydrologic Sciences Inc (CUAHSI) Water Data Center catalog directly within the R statistical software environment. Use of the WaterML R package is demonstrated by running an energy balance snowpack model in R with data inputs from CUAHSI, and by automating uploads of real time sensor observations to a CUAHSI HydroServer. The second step of the procedure requires efficient access to multi-temporal remote sensing snow images. The Snow Inspector web application developed in this research enables the users to retrieve a time series of fractional snow cover from the Moderate Resolution Imaging Spectroradiometer (MODIS) for any point on Earth. The time series retrieval method is based on automated data extraction from tile images provided by a Web Map Tile Service (WMTS). The average required time for retrieving 100 days of data using this technique is 5.4 seconds, which is significantly faster than other methods that require the download of large satellite image files. The presented data extraction technique and space-time visualization user interface can be used as a model for working with other multi-temporal hydrologic or climate data WMTS services. The third, final step of the data integration procedure is generating continuous daily snow cover maps. A custom inverse distance weighting method has been developed
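
    The core of the tile-based extraction is mapping a geographic point to a tile index and an in-tile pixel position. The sketch below assumes a standard Web Mercator XYZ scheme with 256-pixel tiles; the actual Snow Inspector service layout and MODIS layer names are not reproduced.

```typescript
// Convert a lat/lon to tile indices and an in-tile pixel position for a
// standard Web Mercator (slippy map) tiling scheme with 256x256 tiles.
function tilePixelForPoint(latDeg: number, lonDeg: number, zoom: number) {
  const lat = (latDeg * Math.PI) / 180;
  const n = 2 ** zoom;                   // tiles per axis at this zoom
  const xf = ((lonDeg + 180) / 360) * n; // fractional tile x
  const yf =
    ((1 - Math.log(Math.tan(lat) + 1 / Math.cos(lat)) / Math.PI) / 2) * n;
  return {
    tileX: Math.floor(xf),
    tileY: Math.floor(yf),
    pixelX: Math.floor((xf % 1) * 256),  // column within the 256px tile
    pixelY: Math.floor((yf % 1) * 256),  // row within the 256px tile
  };
}

// Reading this one pixel from each date's tile yields the time series,
// avoiding the download of full satellite granules.
console.log(tilePixelForPoint(60.17, 24.94, 8));
```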

  18. Web 3.0 Emerging

    Energy Technology Data Exchange (ETDEWEB)

    Hendler, James [Rensselaer Polytechnic Institute

    2012-02-22

    As more and more data and information becomes available on the Web, new technologies that use explicit semantics for information organization are becoming desirable. New terms such as Linked Data, Semantic Web and Web 3.0 are used more and more, although there is increasing confusion as to what each means. In this talk, I will describe how different sorts of models can be used to link data in different ways. I will particularly explore different kinds of Web applications, from Enterprise Data Integration to Web 3.0 startups, government data release, the different needs of Web 2.0 and 3.0, the growing interest in “semantic search”, and the underlying technologies that power these new approaches.

  19. Cooperation and contagion in web-based, networked public goods experiments.

    Directory of Open Access Journals (Sweden)

    Siddharth Suri

    Full Text Available A longstanding idea in the literature on human cooperation is that cooperation should be reinforced when conditional cooperators are more likely to interact. In the context of social networks, this idea implies that cooperation should fare better in highly clustered networks such as cliques than in networks with low clustering such as random networks. To test this hypothesis, we conducted a series of web-based experiments, in which 24 individuals played a local public goods game arranged on one of five network topologies that varied between disconnected cliques and a random regular graph. In contrast with previous theoretical work, we found that network topology had no significant effect on average contributions. This result implies either that individuals are not conditional cooperators, or else that cooperation does not benefit from positive reinforcement between connected neighbors. We then tested both of these possibilities in two subsequent series of experiments in which artificial seed players were introduced, making either full or zero contributions. First, we found that although players did generally behave like conditional cooperators, they were as likely to decrease their contributions in response to low contributing neighbors as they were to increase their contributions in response to high contributing neighbors. Second, we found that positive effects of cooperation were contagious only to direct neighbors in the network. In total we report on 113 human subjects experiments, highlighting the speed, flexibility, and cost-effectiveness of web-based experiments over those conducted in physical labs.

  20. Carotid Web (Intimal Fibromuscular Dysplasia) Has High Stroke Recurrence Risk and Is Amenable to Stenting.

    Science.gov (United States)

    Haussen, Diogo C; Grossberg, Jonathan A; Bouslama, Mehdi; Pradilla, Gustavo; Belagaje, Samir; Bianchi, Nicolas; Allen, Jason W; Frankel, Michael; Nogueira, Raul G

    2017-11-01

    Carotid webs have been increasingly recognized as a cause of recurrent stroke, but evidence remains scarce. We aim to report the clinical outcomes and the first series of carotid stenting in a cohort of patients with strokes from symptomatic carotid webs. Prospective and consecutive patient data were analyzed; a carotid web was defined by a shelf-like/linear filling defect in the posterior internal carotid artery bulb on computed tomographic angiography. Twenty-four patients were identified (91.6% strokes/8.4% transient ischemic attacks [TIAs]). Median age was 46 (41-59) years, 61% were female, and 75% were black. Median National Institutes of Health Stroke Scale score was 10.5 (3.0-16.0) and ASPECTS (Alberta Stroke Program Early CT Score) was 8 (7-8). There were no parenchymal hemorrhages, and 96% of patients were independent at 3 months. In patients with bilateral webs (58%), median ipsilateral web length was larger than contralateral (3.1 [3.0-4.5] mm versus 2.6 [1.85-2.9] mm; P=0.01). Twenty-nine percent of patients had thrombus superimposed on the symptomatic carotid web. A recurrent stroke/TIA involving the territory of the previously symptomatic web occurred in 7 (32%; 6 strokes/1 TIA) patients during follow-up. Two recurrences occurred on dual antiplatelet therapy, 3 on antiplatelet monotherapy, 1 within 24 hours of thrombolysis, and 1 off antithrombotics. Median follow-up was 12.2 (8.0-18.0) months. Sixteen (66%) patients were stented at a median of 12.2 (7.0-18.7) days after stroke with no periprocedural complications. No recurrent strokes/TIAs occurred in stented individuals (median follow-up of 4 [2.4-12.0] months). Carotid web is associated with a high recurrent stroke/TIA risk, despite antithrombotic use, and is amenable to carotid stenting. © 2017 American Heart Association, Inc.

  1. Web Server Embedded System

    Directory of Open Access Journals (Sweden)

    Adharul Muttaqin

    2014-07-01

    Full Text Available Embedded systems currently receive particular attention in computer technology; various Linux operating systems and web servers have been prepared to support embedded systems, and one application that can run on an embedded system is a web server. The choice of a web server for an embedded environment is still rarely examined; this research therefore focuses on two web server applications whose main feature is "lightness" in CPU and memory consumption, Light HTTPD and Tiny HTTPD. Using the thread parameters (users, ramp-up periods, and loop count) in a stress test of the embedded system, this study determines which of Light HTTPD and Tiny HTTPD is better suited to embedded use on a BeagleBoard in terms of CPU and memory consumption. The results show that, with respect to CPU consumption on the BeagleBoard embedded system, Light HTTPD is recommended over Tiny HTTPD because there is a very significant difference in CPU load between the two web services. Keywords: embedded system, web server

  2. Applying Web Usage Mining for Personalizing Hyperlinks in Web-Based Adaptive Educational Systems

    Science.gov (United States)

    Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul

    2009-01-01

    Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…

  3. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm.

    Science.gov (United States)

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web services have become the technology of choice for service-oriented computing to meet the interoperability demands of web applications. In the Internet era, the exponential addition of web services makes "quality of service" an essential parameter for discriminating among web services. In this paper, a user preference based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and the QoS aspects of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify local and global constraints for composite web services, which improves flexibility. The UPWSR algorithm identifies the best-fit services for each task in the user request and, by choosing the number of candidate services for each task, reduces the time to generate the composition plans. To tackle the problem of web service composition, the QoS-aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework allows the user to provide feedback about the composite service, which improves the reputation of the services.
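
    A generic preference-weighted QoS scoring routine in the spirit of (but not reproducing) the UPWSR algorithm might look like the sketch below; the attribute names, weights, and services are illustrative.

```typescript
// Rank candidate services by a user-preference-weighted QoS score.
// Attributes where lower is better (e.g. latency, cost) are inverted
// after min-max normalisation across the candidate set.
interface Service { name: string; qos: Record<string, number>; }

function rank(
  services: Service[],
  weights: Record<string, number>,       // user preferences, sum ~ 1
  lowerIsBetter: Set<string>,
): Service[] {
  const attrs = Object.keys(weights);
  const score = (s: Service) => attrs.reduce((acc, a) => {
    const vals = services.map((x) => x.qos[a]);
    const [min, max] = [Math.min(...vals), Math.max(...vals)];
    let v = max === min ? 1 : (s.qos[a] - min) / (max - min);
    if (lowerIsBetter.has(a)) v = 1 - v;  // invert cost-type attributes
    return acc + weights[a] * v;
  }, 0);
  return [...services].sort((x, y) => score(y) - score(x));
}

const ranked = rank(
  [
    { name: "svcA", qos: { latencyMs: 120, availability: 0.99 } },
    { name: "svcB", qos: { latencyMs: 80, availability: 0.95 } },
  ],
  { latencyMs: 0.4, availability: 0.6 },
  new Set(["latencyMs"]),
);
console.log(ranked.map((s) => s.name));
```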

  4. Plankton food web and its seasonal dynamics in a large monsoonal estuary (Cochin backwaters, India)-significance of mesohaline region

    Digital Repository Service at National Institute of Oceanography (India)

    Sooria, P.M.; Jyothibabu, R; Anjusha, A.; Vineetha, G.; Vinita, J.; Lallu, K.R; Paul, M.; Jagadeesan, L.

    The paper presents the ecology and dynamics of plankton food web in the Cochin backwaters (CBW), the largest monsoonal estuary along the west coast of India. The data source is a time series measurement carried out in the CBW during the Spring...

  5. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    Science.gov (United States)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
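
    Requests to such an OGC-compliant server follow the standard WMS GetMap form. In the sketch below the host and layer name are placeholders, while the query parameters are the standard WMS 1.1.1 ones.

```typescript
// Build a standard WMS 1.1.1 GetMap request URL.
const params = new URLSearchParams({
  service: "WMS",
  version: "1.1.1",
  request: "GetMap",
  layers: "tops:gpp",      // hypothetical layer name
  styles: "",
  srs: "EPSG:4326",
  bbox: "-125,32,-113,42", // minx,miny,maxx,maxy
  width: "512",
  height: "427",
  format: "image/png",
  time: "2008-07-01",      // WMS TIME dimension selects the date
});

const url = `https://example.org/wms?${params}`;
console.log(url); // fetch(url) returns the rendered PNG map
```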

  6. RS-WebPredictor

    DEFF Research Database (Denmark)

    Zaretzki, J.; Bergeron, C.; Huang, T.-W.

    2013-01-01

    Regioselectivity-WebPredictor (RS-WebPredictor) is a server that predicts isozyme-specific cytochrome P450 (CYP)-mediated sites of metabolism (SOMs) on drug-like molecules. Predictions may be made for the promiscuous 2C9, 2D6 and 3A4 CYP isozymes, as well as CYPs 1A2, 2A6, 2B6, 2C8, 2C19 and 2E1. RS-WebPredictor is the first freely accessible server that predicts the regioselectivity of the last six isozymes. Server execution time is fast, taking on average 2s to encode a submitted molecule and 1s to apply a given model, allowing for high-throughput use in lead optimization projects. Availability: RS-WebPredictor is accessible for free use at http://reccr.chem.rpi.edu/Software/RS-WebPredictor

  7. IL web tutorials

    DEFF Research Database (Denmark)

    Hyldegård, Jette; Lund, Haakon

    2012-01-01

    The paper presents the results from a study on information literacy in a higher education (HE) context, based on a larger research project evaluating 3 Norwegian IL web tutorials at 6 universities and colleges in Norway. The aim was to evaluate how the 3 web tutorials served students' information seeking and writing process in a study context and to identify barriers to the employment and use of the IL web tutorials, and hence to the underlying information literacy intentions of the developers. Both qualitative and quantitative methods were employed. A clear mismatch was found between the intention and use of the web tutorials. In addition, usability only played a minor role compared to relevance. It is concluded that the positive expectations of the IL web tutorials tend to be overrated by the developers. Suggestions for further research are presented.

  8. A development process meta-model for Web based expert systems: The Web engineering point of view

    DEFF Research Database (Denmark)

    Dokas, I.M.; Alapetite, Alexandre

    2006-01-01

    Similar to many legacy computer systems, expert systems can be accessed via the Web, forming a set of Web applications known as Web based expert systems. The tough Web competition, the way people and organizations rely on Web applications and the increasing user requirements for better services have raised their complexity. Unfortunately, there is so far no clear answer to the question: How may the methods and experience of Web engineering and expert systems be combined and applied in order to develop effective and successful Web based expert systems? In an attempt to answer this question, a development process meta-model for Web based expert systems will be presented. The idea behind the presentation of the accessibility evaluation and its conclusions is to show to Web based expert system developers, who typically have little Web engineering background, that Web engineering issues must be considered when developing Web based expert systems.

  9. NGNP Data Management and Analysis System Analysis and Web Delivery Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Cynthia D. Gentillon

    2011-09-01

    Projects for the Very High Temperature Reactor (VHTR) Technology Development Office provide data in support of Nuclear Regulatory Commission licensing of the very high temperature reactor. Fuel and materials to be used in the reactor are tested and characterized to quantify performance in high-temperature and high-fluence environments. The NGNP Data Management and Analysis System (NDMAS) at the Idaho National Laboratory has been established to ensure that VHTR data are (1) qualified for use, (2) stored in a readily accessible electronic form, and (3) analyzed to extract useful results. This document focuses on the third NDMAS objective. It describes capabilities for displaying the data in meaningful ways and for data analysis to identify useful relationships among the measured quantities. The capabilities are described from the perspective of NDMAS users, starting with those who just view experimental data and analytical results on the INL NDMAS web portal. Web display and delivery capabilities are described in detail. Also the current web pages that show Advanced Gas Reactor, Advanced Graphite Capsule, and High Temperature Materials test results are itemized. Capabilities available to NDMAS developers are more extensive, and are described using a second series of examples. Much of the data analysis efforts focus on understanding how thermocouple measurements relate to simulated temperatures and other experimental parameters. Statistical control charts and correlation monitoring provide an ongoing assessment of instrument accuracy. Data analysis capabilities are virtually unlimited for those who use the NDMAS web data download capabilities and the analysis software of their choice. Overall, the NDMAS provides convenient data analysis and web delivery capabilities for studying a very large and rapidly increasing database of well-documented, pedigreed data.

  10. Programming Web services with Perl

    CERN Document Server

    Ray, Randy J

    2003-01-01

    Given Perl's natural fit for web applications development, it's no surprise that Perl is also a natural choice for web services development. It's the most popular web programming language, with strong implementations of both SOAP and XML-RPC, the leading ways to distribute applications using web services. But books on web services focus on writing these applications in Java or Visual Basic, leaving Perl programmers with few resources to get them started. Programming Web Services with Perl changes that, bringing Perl users all the information they need to create web services using their favori

  11. Cooperative Mobile Web Browsing

    DEFF Research Database (Denmark)

    Perrucci, GP; Fitzek, FHP; Zhang, Qi

    2009-01-01

    This paper advocates a novel approach for mobile web browsing based on cooperation among wireless devices within close proximity operating in a cellular environment. In the current state of the art, mobile phones can access the web using different cellular technologies, but the supported data rates are limited. Short-range links among nearby devices can then be used for cooperative mobile web browsing. By implementing cooperative web browsing on commercial mobile phones, it will be shown that better performance is achieved in terms of increased data rate and therefore reduced access times, resulting in a significantly enhanced web

  12. Web Portal Design, Execution and Sustainability for Naval Websites and Web Services

    National Research Council Canada - National Science Library

    Amsden, Saundra

    2003-01-01

    .... The newest Web Service is the development of Web Portals. Portals allow Web Services to be designed in such a way that users can define their own needs and create a home of their own within a site...

  13. Instant responsive web design

    CERN Document Server

    Simmons, Cory

    2013-01-01

    A step-by-step tutorial approach that teaches readers what responsive web design is and how it is used in designing a responsive web page. If you are a web designer looking to expand your skill set by learning the quickly growing industry standard of responsive web design, this book is ideal for you. Knowledge of CSS is assumed.

  14. SVG-Based Web Publishing

    Science.gov (United States)

    Gao, Jerry Z.; Zhu, Eugene; Shim, Simon

    2003-01-01

    With the increasing applications of the Web in e-commerce, advertising, and publication, new technologies are needed to overcome the limitations of current Web graphics technology. The SVG (Scalable Vector Graphics) technology is a revolutionary solution to the existing problems in current web technology. It provides precise, high-resolution web graphics using plain-text format commands. It sets a new standard for web graphics formats, allowing complicated graphics to be presented with rich text fonts and colors, high printing quality, and dynamic layout capabilities. This paper provides a tutorial overview of SVG technology and its essential features, capabilities, and advantages, and reports comparison studies between SVG and other web graphics technologies.

  15. Sensor web

    Science.gov (United States)

    Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)

    2011-01-01

    A Sensor Web formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization of all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and responding, and then listen only during time slots corresponding to those pods which respond.

  16. A Sensor Web and Web Service-Based Approach for Active Hydrological Disaster Monitoring

    Directory of Open Access Journals (Sweden)

    Xi Zhai

    2016-09-01

    Full Text Available Rapid advancements in Earth-observing sensor systems have led to the generation of large amounts of remote sensing data that can be used for the dynamic monitoring and analysis of hydrological disasters. The management and analysis of these data could take advantage of distributed information infrastructure technologies such as Web service and Sensor Web technologies, which have shown great potential in facilitating the use of observed big data in an interoperable, flexible and on-demand way. However, it remains a challenge to achieve timely response to hydrological disaster events and to automate the geoprocessing of hydrological disaster observations. This article proposes a Sensor Web and Web service-based approach to support active hydrological disaster monitoring. This approach integrates an event-driven mechanism, Web services, and a Sensor Web and coordinates them using workflow technologies to facilitate the Web-based sharing and processing of hydrological hazard information. The design and implementation of hydrological Web services for conducting various hydrological analysis tasks on the Web using dynamically updating sensor observation data are presented. An application example is provided to demonstrate the benefits of the proposed approach over the traditional approach. The results confirm the effectiveness and practicality of the proposed approach in cases of hydrological disaster.

  17. WebRTC using JSON via XMLHttpRequest and SIP over WebSocket: initial signalling overhead findings

    CSIR Research Space (South Africa)

    Adeyeye, M

    2013-08-01

    Full Text Available Web Real-Time Communication (WebRTC) introduces real-time multimedia communication as a native capability of Web browsers. With the adoption of WebRTC, Web browsers will be able to communicate with one another (peer...
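
    The first signalling variant the paper measures, an SDP offer serialized as JSON and shipped over HTTP, can be sketched with standard browser APIs; the /signal endpoint is a hypothetical application-provided channel, since WebRTC deliberately leaves signalling to the application.

```typescript
// Create an offer and relay it to the peer as JSON over HTTP
// (XMLHttpRequest in the paper, fetch here).
const pc = new RTCPeerConnection();
pc.createDataChannel("chat"); // gives the offer something to negotiate

async function sendOffer(): Promise<void> {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  await fetch("/signal", { // hypothetical signalling endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ type: offer.type, sdp: offer.sdp }),
  });
  // The answering side would POST back an SDP answer, applied with
  // pc.setRemoteDescription(answer).
}

sendOffer();
```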

  18. Webs and posets

    International Nuclear Information System (INIS)

    Dukes, M.; Gardi, E.; McAslan, H.; Scott, D.J.; White, C.D.

    2014-01-01

    The non-Abelian exponentiation theorem has recently been generalised to correlators of multiple Wilson line operators. The perturbative expansions of these correlators exponentiate in terms of sets of diagrams called webs, which together give rise to colour factors corresponding to connected graphs. The colour and kinematic degrees of freedom of individual diagrams in a web are entangled by mixing matrices of purely combinatorial origin. In this paper we relate the combinatorial study of these matrices to properties of partially ordered sets (posets), and hence obtain explicit solutions for certain families of web-mixing matrix, at arbitrary order in perturbation theory. We also provide a general expression for the rank of a general class of mixing matrices, which governs the number of independent colour factors arising from such webs. Finally, we use the poset language to examine a previously conjectured sum rule for the columns of web-mixing matrices which governs the cancellation of the leading subdivergences between diagrams in the web. Our results, when combined with parallel developments in the evaluation of kinematic integrals, offer new insights into the all-order structure of infrared singularities in non-Abelian gauge theories

  19. Advanced Techniques in Web Intelligence-2 Web User Browsing Behaviour and Preference Analysis

    CERN Document Server

    Palade, Vasile; Jain, Lakhmi

    2013-01-01

    This research volume focuses on analyzing web user browsing behaviour and preferences in traditional web-based environments, social networks and web 2.0 applications, by using advanced techniques in data acquisition, data processing, pattern extraction and cognitive science for modeling human actions. The book is directed to graduate students, researchers/scientists and engineers interested in updating their knowledge with the recent trends in web user analysis, for developing the next generation of web-based systems and applications.

  20. Moving toward a universally accessible web: Web accessibility and education.

    Science.gov (United States)

    Kurt, Serhat

    2017-12-08

    The World Wide Web is an extremely powerful source of information, inspiration, ideas, and opportunities. As such, it has become an integral part of daily life for a great majority of people. Yet, for a significant number of others, the internet offers only limited value due to the existence of barriers which make accessing the Web difficult, if not impossible. This article illustrates some of the reasons that achieving equality of access to the online world of education is so critical, explores the current status of Web accessibility, discusses evaluative tools and methods that can help identify accessibility issues in educational websites, and provides practical recommendations and guidelines for resolving some of the obstacles that currently hinder the achievability of the goal of universal Web access.

  1. A demanding web-based PACS supported by web services technology

    Science.gov (United States)

    Costa, Carlos M. A.; Silva, Augusto; Oliveira, José L.; Ribeiro, Vasco G.; Ribeiro, José

    2006-03-01

    During recent years, the ubiquity of web interfaces has pushed practically all PACS suppliers to develop client applications in which clinical practitioners can receive and analyze medical images using conventional personal computers and Web browsers. However, due to security and performance issues, the utilization of these software packages has been restricted to Intranets. Paradoxically, one of the most important advantages of digital image systems is to simplify the widespread sharing and remote access of medical data between healthcare institutions. This paper analyses the traditional PACS drawbacks that contribute to their reduced usage on the Internet and describes a PACS based on Web Services technology that supports a customized DICOM encoding syntax and a specific compression scheme, providing all historical patient data in a unique Web interface.

  2. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    Science.gov (United States)

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications on HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on simple computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requisitions, and automatically creates a web page that lists the registered applications and clients. BOWS-registered applications can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run on HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
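
    The submit-and-poll interaction between a client and the BOWS front-end service can be sketched as follows; the endpoint paths and JSON field names are invented for illustration and are not the actual BOWS interface.

```typescript
// Generic submit-and-poll pattern of the kind BOWS wraps in its
// generated Java clients.
async function submitAndWait(tool: string, input: string): Promise<string> {
  // Front-end service: submit a new process and receive a job id.
  const submit = await fetch("https://example.org/bows/submit", { // hypothetical
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ tool, input }),
  });
  const { jobId } = await submit.json();

  // Poll until the HPC-side back-end has posted the result.
  for (;;) {
    const res = await fetch(`https://example.org/bows/result/${jobId}`); // hypothetical
    if (res.status === 200) return res.text();      // finished
    await new Promise((r) => setTimeout(r, 5_000)); // wait 5 s and retry
  }
}

submitAndWait("blast", ">seq1\nACGT").then(console.log);
```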

  3. A resource-oriented architecture for a Geospatial Web

    Science.gov (United States)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    point b) (manipulation of resources through representations), the Geospatial Web poses several issues. In fact, while the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured with several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and format(s). This is actually what the Web designers did, choosing to define a common format for hypermedia (HTML) while keeping the underlying protocol generic. Concerning point c), self-descriptive messages, the exchanged messages should describe themselves and their content. This would not actually be a major issue, considering the effort put in recent years into geospatial metadata models and specifications. Point d), hypermedia as the engine of application state, is actually where the Geospatial Web would mainly differ from existing geospatial information sharing systems. In fact the existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. On the other hand, in the Geospatial Web, applications should be built following the path between interconnected resources. The link between resources should be made explicit as hyperlinks. The adoption of Semantic Web solutions would allow defining not only the existence of a link between two resources but also its nature. The implementation of a Geospatial Web would allow building an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm. This would lower the barrier to accessing geospatial applications for non-specialists (e.g. the success of Google Maps and other Web mapping

  4. Web Security, Privacy & Commerce

    CERN Document Server

    Garfinkel, Simson

    2011-01-01

    Since the first edition of this classic reference was published, World Wide Web use has exploded and e-commerce has become a daily part of business and personal life. As Web use has grown, so have the threats to our security and privacy--from credit card fraud to routine invasions of privacy by marketers to web site defacements to attacks that shut down popular web sites. Web Security, Privacy & Commerce goes behind the headlines, examines the major security risks facing us today, and explains how we can minimize them. It describes risks for Windows and Unix, Microsoft Internet Exp

  5. Funnel-web spider bite

    Science.gov (United States)

    ... page: //medlineplus.gov/ency/article/002844.htm Funnel-web spider bite ... the effects of a bite from the funnel-web spider. Male funnel-web spiders are more poisonous ...

  6. THE IMPORTANCE OF WEB DESIGN: VISUAL DESIGN EVALUATION OF DESTINATION WEB SITES

    OpenAIRE

    Fırlar, Belma; Okat Özdem, Özen

    2013-01-01

    As in the literature, research on web site efficiency is mostly about site content, and analyses of function are mostly superficial, whereas examining every part of a web site is necessary to demonstrate its efficiency. In this context, the visual design criteria that play an important role in the perception of and response to web sites are put under the lens, and the destination web sites are evaluated by the heuristic evaluation method. The research focus of this s...

  7. Web Apollo: a web-based genomic annotation editing platform.

    Science.gov (United States)

    Lee, Eduardo; Helt, Gregg A; Reese, Justin T; Munoz-Torres, Monica C; Childers, Chris P; Buels, Robert M; Stein, Lincoln; Holmes, Ian H; Elsik, Christine G; Lewis, Suzanna E

    2013-08-30

    Web Apollo is the first instantaneous, collaborative genomic annotation editor available on the web. One of the natural consequences following from current advances in sequencing technology is that there are more and more researchers sequencing new genomes. These researchers require tools to describe the functional features of their newly sequenced genomes. With Web Apollo researchers can use any of the common browsers (for example, Chrome or Firefox) to jointly analyze and precisely describe the features of a genome in real time, whether they are in the same room or working from opposite sides of the world.

  8. Formación de usuarios de la información mediante aplicaciones Web 2.0 (Information User Education through Web 2.0 Applications)

    Directory of Open Access Journals (Sweden)

    Eder Ávila Barrientos

    2015-04-01

    Full Text Available Objective: To analyze Web 2.0 applications in order to assess their use in information user education programs. Method: Documentary analysis and discourse hermeneutics applied to printed and digital documents that address the subject of study. Results: A series of questions related to the integration of Web 2.0 applications into the design of information user education programs is answered. Regarding the selection of Web 2.0 applications, an attribute analysis technique based on didactic-use variables was employed. Conclusions: Web 2.0 technologies and their integration into information user education programs represent a niche of opportunity for librarians: an opportunity to reach the future and break old paradigms of user education.

  9. Teaching Web 2.0 technologies using Web 2.0 technologies.

    Science.gov (United States)

    Rethlefsen, Melissa L; Piorun, Mary; Prince, J Dale

    2009-10-01

    The research evaluated participant satisfaction with the content and format of the "Web 2.0 101: Introduction to Second Generation Web Tools" course and measured the impact of the course on participants' self-evaluated knowledge of Web 2.0 tools. The "Web 2.0 101" online course was based loosely on the Learning 2.0 model. Content was provided through a course blog and covered a wide range of Web 2.0 tools. All Medical Library Association members were invited to participate. Participants were asked to complete a post-course survey. Respondents who completed the entire course or who completed part of the course self-evaluated their knowledge of nine social software tools and concepts prior to and after the course using a Likert scale. Additional qualitative information about course strengths and weaknesses was also gathered. Respondents' self-ratings showed a significant change in perceived knowledge for each tool, using a matched pair Wilcoxon signed rank analysis (P<0.0001 for each tool/concept). Overall satisfaction with the course appeared high. Hands-on exercises were the most frequently identified strength of the course; the length and time-consuming nature of the course were considered weaknesses by some. Learning 2.0-style courses, though demanding time and self-motivation from participants, can increase knowledge of Web 2.0 tools.

  10. Uses and Gratifications of the World Wide Web: From Couch Potato to Web Potato.

    Science.gov (United States)

    Kaye, Barbara K.

    1998-01-01

    Investigates uses and gratifications of the World Wide Web and its impact on traditional mass media, especially television. Identifies six Web use motivations: entertainment, social interaction, passing of time, escape, information, and Web site preference. Examines relationships between each use motivation and Web affinity, perceived realism, and…

  11. The RCSB Protein Data Bank: redesigned web site and web services.

    Science.gov (United States)

    Rose, Peter W; Beran, Bojan; Bi, Chunxiao; Bluhm, Wolfgang F; Dimitropoulos, Dimitris; Goodsell, David S; Prlic, Andreas; Quesada, Martha; Quinn, Gregory B; Westbrook, John D; Young, Jasmine; Yukich, Benjamin; Zardecki, Christine; Berman, Helen M; Bourne, Philip E

    2011-01-01

    The RCSB Protein Data Bank (RCSB PDB) web site (http://www.pdb.org) has been redesigned to increase usability and to cater to a larger and more diverse user base. This article describes key enhancements and new features that fall into the following categories: (i) query and analysis tools for chemical structure searching, query refinement, tabulation and export of query results; (ii) web site customization and new structure alerts; (iii) pair-wise and representative protein structure alignments; (iv) visualization of large assemblies; (v) integration of structural data with the open access literature and binding affinity data; and (vi) web services and web widgets to facilitate integration of PDB data and tools with other resources. These improvements enable a range of new possibilities to analyze and understand structure data. The next generation of the RCSB PDB web site, as described here, provides a rich resource for research and education.

  12. Semantic Web meets Integrative Biology: a survey.

    Science.gov (United States)

    Chen, Huajun; Yu, Tong; Chen, Jake Y

    2013-01-01

    Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargons, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to any computer. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine, to highlight the technical features and benefits of SW applications in IB.
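
    To make the data-integration point concrete, here is a minimal sketch of Semantic Web style merging with rdflib; the vocabulary, URIs and biological claims are invented placeholders, not content from the article.

    ```python
    # Minimal sketch of Semantic Web data integration with rdflib
    # (hypothetical vocabulary and identifiers, not from the article).
    from rdflib import Graph

    genes = """
    @prefix ex: <http://example.org/> .
    ex:TP53 ex:encodes ex:p53_protein .
    """

    diseases = """
    @prefix ex: <http://example.org/> .
    ex:p53_protein ex:associatedWith ex:LiFraumeniSyndrome .
    """

    g = Graph()
    g.parse(data=genes, format="turtle")     # one "domain" dataset
    g.parse(data=diseases, format="turtle")  # another, sharing URIs

    # Because both sources use the same URI for the protein, one query can
    # traverse gene -> protein -> disease across the merged graph.
    q = """
    SELECT ?gene ?disease WHERE {
      ?gene <http://example.org/encodes> ?prot .
      ?prot <http://example.org/associatedWith> ?disease .
    }"""
    for row in g.query(q):
        print(row.gene, "->", row.disease)
    ```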

  13. WebQuests in special primary education: Learning in a web-based environment

    NARCIS (Netherlands)

    Kleemans, M.A.J.; Segers, P.C.J.; Droop, W.; Wentink, W.M.J.

    2011-01-01

    The present study investigated the differences in learning gain when performing a WebQuest with a well-defined versus an ill-defined assignment. Twenty boys and twenty girls (mean age 11; 10), attending a special primary education school, performed two WebQuests. In each WebQuest, they performed

  14. Web Science emerges

    OpenAIRE

    Shadbolt, Nigel; Berners-Lee, Tim

    2008-01-01

    The relentless rise in Web pages and links is creating emergent properties, from social networks to virtual identity theft, that are transforming society. A new discipline, Web Science, aims to discover how Web traits arise and how they can be harnessed or held in check to benefit society. Important advances are beginning to be made; more work can solve major issues such as securing privacy and conveying trust.

  15. Web Accessibility and Guidelines

    Science.gov (United States)

    Harper, Simon; Yesilada, Yeliz

    Access to, and movement around, complex online environments, of which the World Wide Web (Web) is the most popular example, has long been considered an important and major issue in the Web design and usability field. The commonly used slang phrase ‘surfing the Web’ implies rapid and free access, pointing to its importance among designers and users alike. It has also been long established that this potentially complex and difficult access is further complicated, and becomes neither rapid nor free, if the user is disabled. There are millions of people who have disabilities that affect their use of the Web. Web accessibility aims to help these people to perceive, understand, navigate, and interact with, as well as contribute to, the Web, and thereby the society in general. This accessibility is, in part, facilitated by the Web Content Accessibility Guidelines (WCAG) currently moving from version one to two. These guidelines are intended to encourage designers to make sure their sites conform to specifications, and in that conformance enable the assistive technologies of disabled users to better interact with the page content. In this way, it was hoped that accessibility could be supported. While this is in part true, guidelines do not solve all problems and the new WCAG version two guidelines are surrounded by controversy and intrigue. This chapter aims to establish the published literature related to Web accessibility and Web accessibility guidelines, and discuss limitations of the current guidelines and future directions.

  16. WebViz: A web browser based application for collaborative analysis of 3D data

    Science.gov (United States)

    Ruegg, C. S.

    2011-12-01

    In the age of high-speed Internet, where people can interact instantly, scientific tools have lacked technology that incorporates this style of communication on the web. To address this, a web application for geological studies has been created, tentatively titled WebViz. This web application uses tools provided by the Google Web Toolkit to create an AJAX web application capable of features found in non-web-based software. With these tools, a web application can act as a piece of software usable from anywhere in the globe over a reasonably fast Internet connection. An application of this technology can be seen with data on the tsunami caused by the major Japan earthquakes. After preparing the data for a computer rendering package called HVR, WebViz can request images of the tsunami data and display them to anyone who has access to the application. This convenience alone makes WebViz a viable solution, but the option to interact with this data with others around the world makes WebViz a serious computational tool. WebViz also runs on any JavaScript-enabled browser, such as those found on modern tablets and smartphones over a fast wireless connection. Because WebViz is currently built using the Google Web Toolkit, the application is highly portable. Though many developers have been involved with the project, each person has contributed to increasing the usability and speed of the application. The project's most recent form brings a dramatic speed increase as well as a more efficient user interface. The speed increase has been informally observed in recent uses of the application in China and Australia, with the hosting server located at the University of Minnesota. The user interface has been improved not only visually but also functionally. Major functions of the application are rotating the 3D object using buttons

  17. Web thickness determines the therapeutic effect of endoscopic keel placement on anterior glottic web.

    Science.gov (United States)

    Chen, Jian; Shi, Fang; Chen, Min; Yang, Yue; Cheng, Lei; Wu, Haitao

    2017-10-01

    This work is a retrospective analysis investigating the critical risk factor for the therapeutic effect of endoscopic keel placement on anterior glottic web. Altogether, 36 patients with anterior glottic web undergoing endoscopic lysis and silicone keel placement were enrolled. Their voice qualities were evaluated using the voice handicap index-10 (VHI-10) questionnaire and improved significantly 3 months after surgery (21.53 ± 3.89 vs 9.81 ± 6.68). However, 10 patients suffered web recurrence during the at least 1-year follow-up. Patients were therefore classified according to the Cohen classification or web thickness, and the recurrence rates were compared. The recurrence rates for Cohen types 1-4 were 28.6, 16.7, 33.3, and 40%, respectively; the difference was not statistically significant (P = 0.461). When classified by web thickness, only 2 of 27 (7.41%) thin-type cases relapsed, whereas 8 of 9 (88.9%) cases in the thick group reformed webs, indicating that recurrence depends on web thickness rather than the Cohen grade. Endoscopic lysis and keel placement is only effective for cases with thin glottic webs. Patients with thick webs should be treated by other means.
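
    The thin- vs thick-web contrast reported above (2/27 vs 8/9 recurrences) can be checked with Fisher's exact test; the abstract does not state which test the authors used, so the sketch below is an illustrative re-analysis rather than their method.

    ```python
    # Re-analyzing the recurrence contrast reported above with Fisher's
    # exact test (illustrative; the authors' own test is not stated).
    from scipy.stats import fisher_exact

    #            recurred  no recurrence
    table = [[2, 25],   # thin webs  (n = 27)
             [8, 1]]    # thick webs (n = 9)

    odds_ratio, p = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.4f}, p = {p:.2e}")  # p far below 0.05
    ```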

  18. Web cache location

    Directory of Open Access Journals (Sweden)

    Boffey Brian

    2004-01-01

    Full Text Available Stress placed on network infrastructure by the popularity of the World Wide Web may be partially relieved by keeping multiple copies of Web documents at geographically dispersed locations. In particular, use of proxy caches and replication provide a means of storing information 'nearer to end users'. This paper concentrates on the locational aspects of Web caching giving both an overview, from an operational research point of view, of existing research and putting forward avenues for possible further research. This area of research is in its infancy and the emphasis will be on themes and trends rather than on algorithm construction. Finally, Web caching problems are briefly related to referral systems more generally.

  19. Reactivity on the Web

    OpenAIRE

    Bailey, James; Bry, François; Eckert, Michael; Patrânjan, Paula Lavinia

    2005-01-01

    Reactivity, the ability to detect simple and composite events and respond in a timely manner, is an essential requirement in many present-day information systems. With the emergence of new, dynamic Web applications, reactivity on the Web is receiving increasing attention. Reactive Web-based systems need to detect and react not only to simple events but also to complex, real-life situations. This paper introduces XChange, a language for programming reactive behaviour on the Web,...

  20. Semantic Web status model

    CSIR Research Space (South Africa)

    Gerber, AJ

    2006-06-01

    Full Text Available Semantic Web application areas are experiencing intensified interest due to the rapid growth in the use of the Web, together with the innovation and renovation of information content technologies. The Semantic Web is regarded as an integrator across...

  1. Your Life in Web Apps

    CERN Document Server

    Turnbull, Giles

    2008-01-01

    What is a web app? It's software that you use right in your web browser. Rather than installing an application on your computer, you visit a web site and sign up as a new user of its software. Instead of storing your files on your own hard disk, the web app stores them for you, online. Is it possible to switch entirely to web apps? To run nothing but a browser for an entire day? In this PDF we'll take you through one day in the life of a web apps-only user and chronicle the pros and cons of living by browser. And if the idea of switching, fully or partially, to web apps sounds appealing to

  2. Web 25

    DEFF Research Database (Denmark)

    the reader on an exciting time travel journey to learn more about the prehistory of the hyperlink, the birth of the Web, the spread of the early Web, and the Web’s introduction to the general public in mainstream media. Furthermore, case studies of blogs, literature, and traditional media going online...

  3. Survey of Techniques for Deep Web Source Selection and Surfacing the Hidden Web Content

    OpenAIRE

    Khushboo Khurana; M.B. Chandak

    2016-01-01

    Large and continuously growing dynamic web content has created new opportunities for large-scale data analysis in recent years. There is a huge amount of information that traditional web crawlers cannot access, since they use link-analysis techniques through which only the surface web can be reached. Traditional search engine crawlers require web pages to be linked to other pages via hyperlinks, causing a large amount of web data to be hidden from the crawlers. Enormous data is available in...

  4. Web-GIS approach for integrated analysis of heterogeneous georeferenced data

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Shulgina, Tamara

    2014-05-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales [1]. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required [2]. A dedicated information-computational system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is presented. It is based on a combination of Web and GIS technologies according to Open Geospatial Consortium (OGC) standards, and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library (http://www.geoext.org), the ExtJS framework (http://www.sencha.com/products/extjs) and OpenLayers software (http://openlayers.org). The main advantage of the system lies in its capability to perform integrated analysis of time series of georeferenced data obtained from different sources (in-situ observations, model results, remote sensing data) and to combine the results in a single map [3, 4] as WMS and WFS layers in a web-GIS application. Analysis results are also available for downloading as binary files from the graphical user interface or can be directly accessed through web mapping (WMS) and web feature (WFS) services for further processing by the user. Data processing is performed on a geographically distributed computational cluster comprising data storage systems and corresponding computational nodes. Several geophysical datasets represented by NCEP/NCAR Reanalysis II, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre's data, GMAO Modern Era-Retrospective analysis for Research and Applications, reanalysis of Monitoring
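
    For readers unfamiliar with the OGC services mentioned above, the sketch below builds a standard WMS 1.3.0 GetMap request of the kind such a web-GIS client issues; the endpoint and layer name are hypothetical placeholders, not the system's actual services.

    ```python
    # Sketch of an OGC WMS GetMap request, as a web-GIS client might issue;
    # the endpoint and layer name are hypothetical placeholders.
    from urllib.parse import urlencode

    base = "https://example.org/geoserver/wms"  # hypothetical WMS endpoint
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "reanalysis:air_temperature",  # hypothetical layer
        "CRS": "EPSG:4326",
        "BBOX": "40,60,80,120",   # lat/lon bounds (axis order per WMS 1.3.0)
        "WIDTH": 800,
        "HEIGHT": 400,
        "FORMAT": "image/png",
    }
    print(f"{base}?{urlencode(params)}")
    ```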

  5. Web 1.0 to Web 3.0 Evolution: Reviewing the Impacts on Tourism Development and Opportunities

    Science.gov (United States)

    Eftekhari, M. Hossein; Barzegar, Zeynab; Isaai, M. T.

    The most important event following the establishment of the Internet was the Web, introduced by Tim Berners-Lee. Websites give their owners sharing features with which they can publish their content to users and visitors. In the last 5 years, we have seen changes in the use of the web: users want to participate in content sharing and they like to interact with each other. This is known as Web 2.0. In the last year, Web 2.0 has reached maturity and now we need a smart web, which will accordingly be called Web 3.0. Web 3.0 is based on the semantic web definition. Changing the way the web is used has had a clear impact on E-Tourism and its development and also on business models. In this paper, we review the definitions and describe the impacts of web evolution on E-Tourism.

  6. Web 2.0 as a dystopia in the recent internet

    Directory of Open Access Journals (Sweden)

    Antonio Cambra

    2008-05-01

    Full Text Available The term Web 2.0 has recently come to form part of the vocabulary associated with the internet. Semantically imprecise, it looks to capture a moment in the development of the internet where the user becomes the central catalyst with a greater capacity for expression, interaction and participation provided by certain recently developed technologies. Despite this, certain figures and experts are critical of the current course, which they accuse of leading to a whole series of effects that, far from being desirable, bring into question the ideal nature of the evolution of the internet. Expressions such as cultural levelling, cult of the amateur or collective intelligence (used pejoratively have formed around the term Web 2.0 producing an aureole of dystopic resonance.This article looks to relativise some of the arguments that form part of the dystopic view of the Web 2.0, reflecting on the internet from an alternative perspective that helps understand the phenomenon without being overcome by the pessimism that it seems to be subject to in terms of the views of certain experts. Far from presupposing a telos dictating its evolution, the article defends an idea of the internet as a pragmatic field for experimentation that is legitimate as a "path" rather than a "destination" in terms of an implicit or expected development.

  7. Working without a Crystal Ball: Predicting Web Trends for Web Services Librarians

    Science.gov (United States)

    Ovadia, Steven

    2008-01-01

    User-centered design is a principle stating that electronic resources, like library Web sites, should be built around the needs of the users. This article interviews Web developers of library and non-library-related Web sites, determining how they assess user needs and how they decide to adapt certain technologies for users. According to the…

  8. Private Web Browsing

    National Research Council Canada - National Science Library

    Syverson, Paul F; Reed, Michael G; Goldschlag, David M

    1997-01-01

    .... These are both kept confidential from network elements as well as external observers. Private Web browsing is achieved by unmodified Web browsers using anonymous connections by means of HTTP proxies...

  9. Interobserver variability in the assessment of aneurysm occlusion with the WEB aneurysm embolization system.

    Science.gov (United States)

    Fiorella, David; Arthur, Adam; Byrne, James; Pierot, Laurent; Molyneux, Andy; Duckwiler, Gary; McCarthy, Thomas; Strother, Charles

    2015-08-01

    The WEB (WEB aneurysm embolization system, Sequent Medical, Aliso Viejo, California, USA) is a self-expanding, nitinol, mesh device designed to achieve aneurysm occlusion after endosaccular deployment. The WEB Occlusion Scale (WOS) is a standardized angiographic assessment scale for reporting aneurysm occlusion achieved with intrasaccular mesh implants. This study was performed to assess the interobserver variability of the WOS. Seven experienced neurovascular specialists were trained to apply the WOS. These physicians independently reviewed angiographic image sets from 30 patients treated with the WEB under blinded conditions. No additional clinical information was provided. Raters graded each image according to the WOS (complete occlusion, residual neck or residual aneurysm). Final statistics were calculated using the dichotomous outcomes of complete occlusion or incomplete occlusion. The interobserver agreement was measured by the generalized κ statistic. In this series of 30 test case aneurysms, observers rated 12-17 as completely occluded, 3-9 as nearly completely occluded, and 9-11 as demonstrating residual aneurysm filling. Agreement was perfect across all seven observers for the presence or absence of complete occlusion in 22 of 30 cases. Overall, interobserver agreement was substantial (κ statistic 0.779 with a 95% CI of 0.700 to 0.857). The WOS allows a consistent means of reporting angiographic occlusion for aneurysms treated with the WEB device.
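
    A multi-rater agreement statistic of this kind can be computed with Fleiss' kappa, a common "generalized kappa"; the article does not specify its exact estimator, and the 30 x 7 rating matrix below is invented, so with random data the result will sit near 0 rather than the study's 0.779.

    ```python
    # Illustrative multi-rater agreement via Fleiss' kappa on an invented
    # 30 aneurysms x 7 raters matrix (1 = complete occlusion, 0 = not).
    import numpy as np
    from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

    rng = np.random.default_rng(0)
    ratings = rng.integers(0, 2, size=(30, 7))  # invented; near-chance agreement

    table, _ = aggregate_raters(ratings)  # per-case counts of each category
    print(f"kappa = {fleiss_kappa(table):.3f}")
    ```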

  10. Microsoft Expression Web for dummies

    CERN Document Server

    Hefferman, Linda

    2013-01-01

    Expression Web is Microsoft's newest tool for creating and maintaining dynamic Web sites. This FrontPage replacement offers all the simple "what-you-see-is-what-you-get" tools for creating a Web site along with some pumped up new features for working with Cascading Style Sheets and other design options. Microsoft Expression Web For Dummies arrives in time for early adopters to get a feel for how to build an attractive Web site. Author Linda Hefferman teams up with longtime FrontPage For Dummies author Asha Dornfest to show the easy way for first-time Web designers, FrontPage ve

  11. Providing Web Interfaces to the NSF EarthScope USArray Transportable Array

    Science.gov (United States)

    Vernon, Frank; Newman, Robert; Lindquist, Kent

    2010-05-01

    Since April 2004 the EarthScope USArray seismic network has grown to over 850 broadband stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. Providing secure, yet open, access to real-time and archived data for a broad range of audiences is best served by a series of platform agnostic low-latency web-based applications. We present a framework of tools that mediate between the world wide web and Boulder Real Time Technologies Antelope Environmental Monitoring System data acquisition and archival software. These tools provide comprehensive information to audiences ranging from network operators and geoscience researchers, to funding agencies and the general public. This ranges from network-wide to station-specific metadata, state-of-health metrics, event detection rates, archival data and dynamic report generation over a station's two year life span. Leveraging open source web-site development frameworks for both the server side (Perl, Python and PHP) and client-side (Flickr, Google Maps/Earth and jQuery) facilitates the development of a robust extensible architecture that can be tailored on a per-user basis, with rapid prototyping and development that adheres to web-standards. Typical seismic data warehouses allow online users to query and download data collected from regional networks, without the scientist directly visually assessing data coverage and/or quality. Using a suite of web-based protocols, we have recently developed an online seismic waveform interface that directly queries and displays data from a relational database through a web-browser. Using the Python interface to Datascope and the Python-based Twisted network package on the server side, and the jQuery Javascript framework on the client side to send and receive asynchronous waveform queries, we display broadband seismic data using the HTML Canvas element that is globally accessible by anyone using a modern web-browser. We are currently creating
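
    As a toy stand-in for the kind of server-side endpoint described above, the sketch below answers a browser's asynchronous waveform query with JSON samples; it uses only the Python standard library and is not the Antelope/Twisted stack the article describes.

    ```python
    # Toy endpoint returning waveform samples as JSON to a browser client.
    # Standard library only; not the article's Antelope/Twisted implementation.
    import json
    import math
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class WaveformHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Fake waveform: a decaying sinusoid standing in for real samples.
            samples = [math.sin(i / 5.0) * math.exp(-i / 200.0)
                       for i in range(1000)]
            body = json.dumps({"sta": "DEMO", "chan": "BHZ",
                               "samples": samples}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), WaveformHandler).serve_forever()
    ```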

  12. SEMANTIC WEB MINING: ISSUES AND CHALLENGES

    OpenAIRE

    Karan Singh*, Anil kumar, Arun Kumar Yadav

    2016-01-01

    The combination of the two fast-evolving scientific research areas "Semantic Web" and "Web Mining" is known as "Semantic Web Mining" in computer science. Together these two areas pave the way for mining related and meaningful information from the web, thereby giving rise to the term "Semantic Web Mining". The "Semantic Web" makes mining easy, and "Web Mining" can construct new structures of the Web. Web Mining applies Data Mining techniques to web content, structure, and usage. This paper gi...

  13. WebQuests in Special Primary Education: Learning in a Web-Based Environment

    Science.gov (United States)

    Kleemans, Tijs; Segers, Eliane; Droop, Mienke; Wentink, Hanneke

    2011-01-01

    The present study investigated the differences in learning gain when performing a WebQuest with a well-defined versus an ill-defined assignment. Twenty boys and twenty girls (mean age 11; 10), attending a special primary education school, performed two WebQuests. In each WebQuest, they performed either a well-defined or an ill-defined assignment.…

  14. TMFoldWeb: a web server for predicting transmembrane protein fold class.

    Science.gov (United States)

    Kozma, Dániel; Tusnády, Gábor E

    2015-09-17

    Here we present TMFoldWeb, the web server implementation of TMFoldRec, a transmembrane protein fold recognition algorithm. TMFoldRec uses statistical potentials and utilizes topology filtering and a gapless threading algorithm. It ranks template structures and selects the most likely candidates and estimates the reliability of the obtained lowest energy model. The statistical potential was developed in a maximum likelihood framework on a representative set of the PDBTM database. According to the benchmark test the performance of TMFoldRec is about 77 % in correctly predicting fold class for a given transmembrane protein sequence. An intuitive web interface has been developed for the recently published TMFoldRec algorithm. The query sequence goes through a pipeline of topology prediction and a systematic sequence to structure alignment (threading). Resulting templates are ordered by energy and reliability values and are colored according to their significance level. Besides the graphical interface, a programmatic access is available as well, via a direct interface for developers or for submitting genome-wide data sets. The TMFoldWeb web server is unique and currently the only web server that is able to predict the fold class of transmembrane proteins while assigning reliability scores for the prediction. This method is prepared for genome-wide analysis with its easy-to-use interface, informative result page and programmatic access. Considering the info-communication evolution in the last few years, the developed web server, as well as the molecule viewer, is responsive and fully compatible with the prevalent tablets and mobile devices.

  15. Measurment of Web Usability: Web Page of Hacettepe University Department of Information Management

    OpenAIRE

    Nazan Özenç Uçak; Tolga Çakmak

    2009-01-01

    Today, information is increasingly produced in electronic form and retrieved via web pages. As the number of web pages has risen, many of them seem to offer similar content but different designs. In this respect, presenting information on web pages according to user expectations and specifications is important for the effective use of information. This study provides insight into web usability studies that are executed for measuring...

  16. Web-based CERES Clouds QC Property Viewing Tool

    Science.gov (United States)

    Smith, R. A.; Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Minnis, P.

    2014-12-01

    This presentation will display the capabilities of a web-based CERES cloud property viewer. Terra data will be chosen for examples. It will demonstrate viewing of cloud properties in gridded global maps, histograms, time series displays, latitudinal zonal images, binned data charts, data frequency graphs, and ISCCP plots. Images can be manipulated by the user to narrow boundaries of the map as well as color bars and value ranges, compare datasets, view data values, and more. Other atmospheric studies groups will be encouraged to put their data into the underlying NetCDF data format and view their data with the tool. A laptop will hopefully be available to allow conference attendees to try navigating the tool.

  17. EPA Web Taxonomy

    Data.gov (United States)

    U.S. Environmental Protection Agency — EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's...

  18. Toward a Model of Sources of Influence in Online Education: Cognitive Learning and the Effects of Web 2.0

    Science.gov (United States)

    Carr, Caleb T.; Zube, Paul; Dickens, Eric; Hayter, Carolyn A.; Barterian, Justin A.

    2013-01-01

    To explore the integration of education processes into social media, we tested an initial model of student learning via interactive web tools and theorized three sources of influence: interpersonal, intrapersonal, and masspersonal. Three-hundred thirty-seven students observed an online lecture and then completed a series of scales. Structural…

  19. WebTag: Web browsing into sensor tags over NFC.

    Science.gov (United States)

    Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Alvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio

    2012-01-01

    Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as the design of smarter embedded processors, the improvement of network architectures, the development of efficient communication protocols and the maximization of life cycle autonomy. This work aims to improve the communication link used for data transmission in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked by the proprietary protocols used for the wireless link (like ZigBee, Bluetooth, RFID, etc.). This work presents a novel solution (WebTag) for direct IP-based access to a sensor tag over the Near Field Communication (NFC) technology for secure applications. WebTag allows direct web access to the sensor tag by means of a standard web browser: it reads the sensor data, configures the sampling rate and implements IP-based security policies. It is, definitely, a new step towards the evolution of the Internet of Things paradigm.

  20. Difusión en web de materiales de género

    OpenAIRE

    Egido Pacheca, Juan María

    2011-01-01

    This project arose from the initial idea of publicizing and giving visibility to a book entitled "Mujeres en la Ciencia" ("Women in Science"). The project aims to go further, defining the methodology needed to disseminate gender-related material in Web environments, in particular a series of booklets published by different organizations whose common thread is the figure of women in a specific field (science, letters, architecture, etc.). The idea is therefore that a ...

  1. Web-based interventions in nursing.

    Science.gov (United States)

    Im, Eun-Ok; Chang, Sun Ju

    2013-02-01

    With recent advances in computer and Internet technologies and a high funding priority on technological aspects of nursing research, researchers in the field began to develop, use, and test various types of Web-based interventions. Despite the high potential impact of Web-based interventions, little is known about them in nursing. In this article, to identify strengths and weaknesses of Web-based nursing interventions, a literature review was conducted using multiple databases with the combined keywords "online," "Internet" or "Web," "intervention," and "nursing." A total of 95 articles were retrieved through the databases and sorted by research topic. These articles were then analyzed to identify strengths and weaknesses of Web-based interventions in nursing. A strength of the Web-based interventions was their coverage of various content areas; in addition, many of them were theory-driven. They had advantages in their flexibility and comfort, could provide consistency in interventions, and required less cost in intervention implementation. However, Web-based intervention studies relied on self-selected participants, lacked controllability, and had high dropout rates. They also required technical expertise and high development costs. Based on these findings, directions for future Web-based intervention research are provided.

  2. UrbanWeb: a Platform for Mobile, Context-aware Web Services

    DEFF Research Database (Denmark)

    Hansen, Frank Allan; Grønbæk, Kaj

    2011-01-01

    Faster Internet connections on the mobile Internet and new advanced mobile terminals make it possible to use Web 2.0 applications and services beyond the desktop, wherever and whenever you want. However, even though some services may scale in their current form to the mobile Internet, others will benefit greatly from being informed about the user's context and tailored to the user's location or the activities the user is engaged in. In this article we focus on the definition of context and context-awareness for mobile Web 2.0 services and present a framework, UrbanWeb, which has been designed for this purpose. We discuss how to capture the user's context from sensors in today's mobile phones, ranging from GPS data to 2D visual barcodes and manual entry of context information, and how to utilize this information in Web applications. Finally, a number of applications built with the framework are presented.

  3. Web party effect: a cocktail party effect in the web environment

    OpenAIRE

    Sara Rigutti; Carlo Fantoni; Walter Gerbino

    2015-01-01

    In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of ...

  4. Designing usable web forms- Empirical evaluation of web form improvement guidelines

    DEFF Research Database (Denmark)

    Seckler, Mirjam; Heinz, Silvia; Bargas-Avila, Javier A.

    2014-01-01

    This study reports a controlled eye tracking experiment (N = 65) that shows the combined effectiveness of 20 guidelines to improve interactive online forms when applied to forms found on real company websites. Results indicate that improved web forms lead to faster completion times, fewer form submission trials, and fewer eye movements. Data from subjective questionnaires and interviews further show increased user satisfaction. Overall, our findings highlight the importance for web designers to improve their web forms using UX guidelines.

  5. River food webs: an integrative approach to bottom-up flow webs, top-down impact webs, and trophic position.

    Science.gov (United States)

    Benke, Arthur C

    2018-03-31

    The majority of food web studies are based on connectivity, top-down impacts, bottom-up flows, or trophic position (TP), and ecologists have argued for decades over which is best. Rarely have any two been considered simultaneously. The present study uses a procedure that integrates the last three approaches based on taxon-specific secondary production and gut analyses. Ingestion flows are quantified to create a flow web, and the same data are used to quantify TP for all taxa. An individual predator's impacts also are estimated using the ratio of its ingestion (I) of each prey to prey production (P) to create an I/P web. This procedure was applied to 41 invertebrate taxa inhabiting submerged woody habitat in a southeastern U.S. river. A complex flow web starting with five basal food resources had 462 flows >1 mg·m⁻²·yr⁻¹, providing far more information than a connectivity web. Total flows from basal resources to primary consumers/omnivores were dominated by allochthonous amorphous detritus and ranged from 1 to >50,000 mg·m⁻²·yr⁻¹. Most predator-prey flows were much lower (<1,000 mg·m⁻²·yr⁻¹). The I/P web showed that 83% of individual predator impacts were weak (90%). Quantitative estimates of TP ranged from 2 to 3.7, contrasting sharply with seven integer-based trophic levels based on the longest feeding chain. Traditional omnivores (TP = 2.4-2.9) played an important role by consuming more prey and exerting higher impacts on primary consumers than strict predators (TP ≥ 3). This study illustrates how simultaneous quantification of flow pathways, predator impacts, and TP together provides an integrated characterization of natural food webs.

  6. Practical web development

    CERN Document Server

    Wellens, Paul

    2015-01-01

    This book is perfect for beginners who want to get started and learn the web development basics, but also offers experienced developers a web development roadmap that will help them to extend their capabilities.

  7. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling

    Science.gov (United States)

    Devi, R. Suganya; Manjula, D.; Siddharth, R. K.

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and is aptly associated with the substantial development of the World Wide Web. Web search engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. Recently, however, Web Crawling has focused solely on obtaining the links of the corresponding documents. Today there exist various algorithms and software which are used to crawl links from the web, and these links then have to be further processed for future use, thereby increasing the overload on the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of the Depth First Search algorithm, which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and their metadata, such as title, keywords, and description, are extracted. This content is very essential for any type of analyser work to be carried out on the Big Data obtained as a result of Web Crawling. PMID:26137592
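
    One possible rendering of that approach is sketched below: a depth-limited DFS crawl that follows hyperlinks and extracts title, keywords and description. It uses the requests and BeautifulSoup libraries and a placeholder start URL; it is not the authors' implementation.

    ```python
    # Depth-limited DFS crawl with title/keywords/description extraction.
    # Illustrative sketch, not the paper's modified-DFS implementation.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    def crawl(url, depth=2, seen=None):
        seen = seen if seen is not None else set()
        if depth < 0 or url in seen:
            return
        seen.add(url)
        try:
            html = requests.get(url, timeout=5).text
        except requests.RequestException:
            return
        soup = BeautifulSoup(html, "html.parser")
        meta = {m.get("name", "").lower(): m.get("content", "")
                for m in soup.find_all("meta")}
        print(url, "|", soup.title.string if soup.title else "",
              "|", meta.get("keywords", ""), "|", meta.get("description", ""))
        for a in soup.find_all("a", href=True):   # recurse depth-first
            crawl(urljoin(url, a["href"]), depth - 1, seen)

    crawl("https://example.org/")  # placeholder start URL
    ```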

  8. Architecture and the Web.

    Science.gov (United States)

    Money, William H.

    Instructors should be concerned with how to incorporate the World Wide Web into an information systems (IS) curriculum organized across three areas of knowledge: information technology, organizational and management concepts, and theory and development of systems. The Web fits broadly into the information technology component. For the Web to be…

  9. SaaS ve web designu

    OpenAIRE

    Míka, Filip

    2011-01-01

    This thesis aims to evaluate whether the current SaaS market is able to meet the functional requirements of web design in order to appropriately support web design activities. The theoretical part introduces the web design model which describes web design's functional requirements. The next section presents a research concept that describes model assessment (i.e. solutions delivered as SaaS that support web design) and the evaluation process. The results show that the current SaaS market is able to...

  10. Programming the semantic web

    CERN Document Server

    Segaran, Toby; Taylor, Jamie

    2009-01-01

    With this book, the promise of the Semantic Web -- in which machines can find, share, and combine data on the Web -- is not just a technical possibility, but a practical reality. Programming the Semantic Web demonstrates several ways to implement semantic web applications, using current and emerging standards and technologies. You'll learn how to incorporate existing data sources into semantically aware applications and publish rich semantic data. Each chapter walks you through a single piece of semantic technology and explains how you can use it to solve real problems. Whether you're writing

  11. RESTful Web Services Cookbook

    CERN Document Server

    Allamaraju, Subbu

    2010-01-01

    While the REST design philosophy has captured the imagination of web and enterprise developers alike, using this approach to develop real web services is no picnic. This cookbook includes more than 100 recipes to help you take advantage of REST, HTTP, and the infrastructure of the Web. You'll learn ways to design RESTful web services for client and server applications that meet performance, scalability, reliability, and security goals, no matter what programming language and development framework you use. Each recipe includes one or two problem statements, with easy-to-follow, step-by-step i

  12. Using entropy to cut complex time series

    Science.gov (United States)

    Mertens, David; Poncela Casasnovas, Julia; Spring, Bonnie; Amaral, L. A. N.

    2013-03-01

    Using techniques from statistical physics, physicists have modeled and analyzed human phenomena varying from academic citation rates to disease spreading to vehicular traffic jams. The last decade's explosion of digital information and the growing ubiquity of smartphones has led to a wealth of human self-reported data. This wealth of data comes at a cost, including non-uniform sampling and statistically significant but physically insignificant correlations. In this talk I present our work using entropy to identify stationary sub-sequences of self-reported human weight from a weight management web site. Our entropic approach-inspired by the infomap network community detection algorithm-is far less biased by rare fluctuations than more traditional time series segmentation techniques. Supported by the Howard Hughes Medical Institute
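
    A toy version of entropy-based segmentation is sketched below: choose the cut point that minimizes the length-weighted Shannon entropy of the two sub-sequences. This is a deliberate simplification on synthetic data, not the infomap-inspired method of the abstract.

    ```python
    # Toy entropy-based change-point search on a synthetic two-regime series.
    import numpy as np

    rng = np.random.default_rng(1)
    # Two stationary regimes glued together (e.g., self-reported weight
    # before and after a behavior change).
    series = np.concatenate([rng.normal(80, 0.5, 150),
                             rng.normal(75, 0.5, 150)])

    edges = np.histogram_bin_edges(series, bins=20)  # shared bins throughout

    def entropy(x):
        counts, _ = np.histogram(x, bins=edges)
        p = counts[counts > 0] / counts.sum()
        return -(p * np.log2(p)).sum()

    n = len(series)
    cuts = np.arange(20, n - 20)  # keep both segments reasonably long
    cost = [(i * entropy(series[:i]) + (n - i) * entropy(series[i:])) / n
            for i in cuts]
    print("estimated change point near index", cuts[int(np.argmin(cost))])
    ```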

  13. SPARQLGraph: a web-based platform for graphically querying biological Semantic Web databases.

    Science.gov (United States)

    Schweiger, Dominik; Trajanoski, Zlatko; Pabinger, Stephan

    2014-08-15

    Semantic Web has established itself as a framework for using and sharing data across applications and database boundaries. Here, we present a web-based platform for querying biological Semantic Web databases in a graphical way. SPARQLGraph offers an intuitive drag & drop query builder, which converts the visual graph into a query and executes it on a public endpoint. The tool integrates several publicly available Semantic Web databases, including the databases of the just recently released EBI RDF platform. Furthermore, it provides several predefined template queries for answering biological questions. Users can easily create and save new query graphs, which can also be shared with other researchers. This new graphical way of creating queries for biological Semantic Web databases considerably facilitates usability as it removes the requirement of knowing specific query languages and database structures. The system is freely available at http://sparqlgraph.i-med.ac.at.
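
    The programmatic equivalent of what SPARQLGraph builds visually is a plain SPARQL query against a public endpoint, as sketched below with SPARQLWrapper; the endpoint and query are illustrative, and endpoint availability is not guaranteed.

    ```python
    # Sending a SPARQL query to a public biological endpoint (illustrative).
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://sparql.uniprot.org/sparql")
    sparql.setQuery("""
    PREFIX up: <http://purl.uniprot.org/core/>
    SELECT ?protein WHERE { ?protein a up:Protein . } LIMIT 5
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for binding in results["results"]["bindings"]:
        print(binding["protein"]["value"])
    ```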

  14. Chemistry WebBook

    Science.gov (United States)

    SRD 69 NIST Chemistry WebBook (Web, free access)   The NIST Chemistry WebBook contains: Thermochemical data for over 7000 organic and small inorganic compounds; thermochemistry data for over 8000 reactions; IR spectra for over 16,000 compounds; mass spectra for over 33,000 compounds; UV/Vis spectra for over 1600 compounds; electronic and vibrational spectra for over 5000 compounds; constants of diatomic molecules(spectroscopic data) for over 600 compounds; ion energetics data for over 16,000 compounds; thermophysical property data for 74 fluids.

  15. PERBANDINGAN ANTARA “BIG” WEB SERVICE DENGAN RESTFUL WEB SERVICE UNTUK INTEGRASI DATA BERFORMAT GML

    Directory of Open Access Journals (Sweden)

    Adi Nugroho

    2012-01-01

    Full Text Available Web Services with Java, namely SOAP (JAX-WS, the Java API for XML Web Services) and RESTful web services (JAX-RS, the Java API for RESTful Web Services), are now technologies competing with each other for integrating data residing in different systems. Both Web Service technologies, of course, have advantages and disadvantages. In this paper, we compare the two Java Web Service technologies in relation to the development of a GIS (Geographic Information System) application that integrates GML-formatted (Geography Markup Language) data stored in an XML (eXtensible Markup Language) database system.

  16. Review of Worcestershire On-line Fabric Type Series website

    Directory of Open Access Journals (Sweden)

    Beverley Nenk

    2003-06-01

    Full Text Available The study of archaeological ceramics is advanced through the creation and development of regional and national pottery type-series, which contain samples of each type of pottery identified from a particular area or region. Pottery researchers working in any period, from prehistoric to post-medieval, require access to such type-series, and to their associated data, in order to be able to advance the identification of all types of pottery, not only those types produced in the local area, but those produced in surrounding regions, as well as those imported from abroad. The publication of such type-series, as well as their accessibility to researchers, is essential if the information they contain is to be disseminated. The development of the Worcestershire On-Line Fabric Type Series is the first stage in a remarkable project designed to make the complete fabric and form type series for Worcestershire ceramics accessible on the internet. As part of the Historic Environment Record for Worcestershire, formerly the Sites and Monuments Record, it is designed to improve access to finds and environmental data, with the aim of encouraging and facilitating research. Funded by Worcestershire County Council as part of its commitment to e-government, it is being developed by Worcestershire County Council Archaeology Service with OxfordArchDigital. It is one of a proposed series of on-line specialist resources (covering, for example, clay pipes, environmental archaeology, flint tools, and historic buildings), which are also designed to stand alone as research tools. The ceramics website is the first part of Pottery in Perspective, a web-based project to provide information on the pottery used and made in Worcestershire from prehistory to c. 1900 AD.

  17. A survey on web modeling approaches for ubiquitous web applications

    NARCIS (Netherlands)

    Schwinger, W.; Retschitzegger, W.; Schauerhuber, A.; Kappel, G.; Wimmer, M.; Pröll, B.; Cachero Castro, C.; Casteleyn, S.; De Troyer, O.; Fraternali, P.; Garrigos, I.; Garzotto, F.; Ginige, A.; Houben, G.J.P.M.; Koch, N.; Moreno, N.; Pastor, O.; Paolini, P.; Pelechano Ferragud, V.; Rossi, G.; Schwabe, D.; Tisi, M.; Vallecillo, A.; Sluijs, van der K.A.M.; Zhang, G.

    2008-01-01

    Purpose – Ubiquitous web applications (UWA) are a new type of web applications which are accessed in various contexts, i.e. through different devices, by users with various interests, at anytime from anyplace around the globe. For such full-fledged, complex software systems, a methodologically sound

  18. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges that call for inquiries into the theoretical foundation of bibliographic classification theory.

  19. Maintenance-Ready Web Application Development

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2016-01-01

    Full Text Available The current paper tackles the subject of developing maintenance-ready web applications. Maintenance is presented as a core stage in a web application's lifecycle. The concept of maintenance-readiness is defined in the context of web application development. Types of web application maintenance tasks are enumerated, and suitable task types are identified for further analysis. The research hypothesis is formulated based on a direct link between tackling maintenance in the development stage and reducing overall maintenance costs. A live maintenance-ready web application is presented and maintenance-related aspects are highlighted. The web application's features that render it maintenance-ready are emphasized. The costs of designing and building the web application to be maintenance-ready are disclosed, as are the savings in maintenance development effort facilitated by maintenance-ready features. Maintenance data were collected from 40 projects implemented by a web development company; the homogeneity and diversity of the collected data are evaluated. A data sample is presented and the size and comprehensive nature of the entire dataset are depicted. The research hypothesis is validated and conclusions are formulated on the topic of developing maintenance-ready web applications. The limits of the research process which formed the basis for the current paper are stated. Future research topics are submitted for debate.

  20. A multidisciplinary database for geophysical time series management

    Science.gov (United States)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization provides the ability to perform operations, such as querying and visualization, over many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, providing the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.
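
    A minimal relational layout in the spirit of (but not identical to) the TSDSystem design might look like the sketch below: one table describing each standardized channel and one holding timestamped samples on a common time scale.

    ```python
    # Minimal relational layout for multi-source time series (illustrative).
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE channel (
        id      INTEGER PRIMARY KEY,
        source  TEXT NOT NULL,       -- e.g. 'ASCII loader', 'ODBC', 'web'
        name    TEXT NOT NULL,       -- e.g. 'tremor_amplitude'
        unit    TEXT
    );
    CREATE TABLE sample (
        channel_id INTEGER REFERENCES channel(id),
        t          TEXT NOT NULL,    -- ISO-8601 timestamp: common time scale
        value      REAL,
        PRIMARY KEY (channel_id, t)
    );
    """)
    con.execute("INSERT INTO channel VALUES (1, 'ASCII loader', 'temperature', 'C')")
    con.executemany("INSERT INTO sample VALUES (1, ?, ?)",
                    [("2013-07-01T00:00:00", 24.1),
                     ("2013-07-01T01:00:00", 24.4)])

    # Range query on the shared time scale, as a visualization module would run.
    for row in con.execute("""SELECT t, value FROM sample
                              WHERE channel_id = 1 AND t BETWEEN ? AND ?
                              ORDER BY t""",
                           ("2013-07-01T00:00:00", "2013-07-01T12:00:00")):
        print(row)
    ```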

  1. Personalized Metaheuristic Clustering Onto Web Documents

    Institute of Scientific and Technical Information of China (English)

    Wookey Lee

    2004-01-01

    Optimal clustering of web documents is known to be a complicated combinatorial optimization problem, and it is hard to develop a generally applicable optimal algorithm. An accelerated simulated annealing algorithm is developed for automatic web document classification. The web document classification problem is addressed as the problem of best describing a match between a web query and a hypothesized web object. The normalized term frequency and inverse document frequency coefficient is used as a measure of the match. Test beds are generated on-line during the search by transforming model web sites. As a result, web sites can be clustered optimally in terms of keyword vectors of corresponding web documents.
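
    The sketch below shows plain simulated-annealing clustering over TF-IDF vectors in the spirit of that approach; the toy documents, cooling schedule and cost function are illustrative choices, not the authors' accelerated algorithm.

    ```python
    # Simulated-annealing clustering of TF-IDF document vectors (illustrative).
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = ["web page design layout", "page layout css design",
            "protein gene genome", "genome sequence gene", "web css html"]
    X = TfidfVectorizer().fit_transform(docs).toarray()

    rng = np.random.default_rng(0)
    k = 2
    labels = rng.integers(0, k, len(docs))

    def cost(lbl):
        # Sum of squared distances to cluster centroids (lower = tighter).
        return sum(((X[lbl == c] - X[lbl == c].mean(axis=0)) ** 2).sum()
                   for c in range(k) if (lbl == c).any())

    T = 1.0
    while T > 1e-3:
        cand = labels.copy()
        cand[rng.integers(len(docs))] = rng.integers(k)  # perturb one doc
        delta = cost(cand) - cost(labels)
        if delta < 0 or rng.random() < np.exp(-delta / T):  # Metropolis rule
            labels = cand
        T *= 0.99                                           # geometric cooling

    print(labels)  # web docs and genome docs should separate into two clusters
    ```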

  2. Designing a WebQuest

    Science.gov (United States)

    Salsovic, Annette R.

    2009-01-01

    A WebQuest is an inquiry-based lesson plan that uses the Internet. This article explains what a WebQuest is, shows how to create one, and provides an example. When engaged in a WebQuest, students use technology to experience cooperative learning and discovery learning while honing their research, writing, and presentation skills. It has been found…

  3. Web-Based Inquiry Learning: Facilitating Thoughtful Literacy with WebQuests

    Science.gov (United States)

    Ikpeze, Chinwe H.; Boyd, Fenice B.

    2007-01-01

    An action research study investigated how the multiple tasks found in WebQuests facilitate fifth-grade students' literacy skills and higher order thinking. Findings indicate that WebQuests are most successful when activities are carefully selected and systematically delivered. Implications for teaching include the necessity for adequate planning,…

  4. Work of the Web Weavers: Web Development in Academic Libraries

    Science.gov (United States)

    Bundza, Maira; Vander Meer, Patricia Fravel; Perez-Stable, Maria A.

    2009-01-01

    Although the library's Web site has become a standard tool for seeking information and conducting research in academic institutions, there are a variety of ways libraries approach the often challenging--and sometimes daunting--process of Web site development and maintenance. Three librarians at Western Michigan University explored issues related…

  5. Interoperable web applications for sharing data and products of the International DORIS Service

    Science.gov (United States)

    Soudarin, L.; Ferrage, P.

    2017-12-01

    The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth rotation and Reference systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with the users. Metadata and interoperable web applications are proposed to explore, visualize and download the key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their website. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables or to get information about the tracking ground stations and the tracked satellites. We discuss the future plans for IDS to meet the recommendations of GGOS. The presentation also addresses the needs for the IAG Services to adopt common metadata thesaurus to describe data and products, and interoperability standards to share them.

  6. Web 2.0 i undervisningen

    DEFF Research Database (Denmark)

    Liburd, Janne J.; Christensen, Inger-Marie F.

    2011-01-01

    A thematic booklet on Web 2.0 that conveys knowledge about, and inspiration for, working with Web 2.0 technologies in higher education. The booklet introduces social media and Web 2.0, accounts for theoretically grounded learning processes with Web 2.0, and shows how these can be incorporated into courses. It also presents a method for designing learning activities with Web 2.0 and gives a number of examples of concrete course designs.

  7. Historical Quantitative Reasoning on the Web

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Ashkpour, A.

    2016-01-01

    The Semantic Web is an extension of the Web through standards by the World Wide Web Consortium (W3C) [4]. These standards promote common data formats and exchange protocols on the Web, most fundamentally the Resource Description Framework (RDF). Its ultimate goal is to make the Web a suitable data

  8. Building Social Web Applications

    CERN Document Server

    Bell, Gavin

    2009-01-01

    Building a web application that attracts and retains regular visitors is tricky enough, but creating a social application that encourages visitors to interact with one another requires careful planning. This book provides practical solutions to the tough questions you'll face when building an effective community site -- one that makes visitors feel like they've found a new home on the Web. If your company is ready to take part in the social web, this book will help you get started. Whether you're creating a new site from scratch or reworking an existing site, Building Social Web Applications

  9. The Creative Web.

    Science.gov (United States)

    Yudess, Jo

    2003-01-01

    This article lists the Web sites of 12 international not-for-profit creativity associations designed to trigger more creative thought and research possibilities. Along with Web addresses, the entries include telephone contact information and a brief description of the organization. (CR)

  10. Engineering Adaptive Web Applications

    DEFF Research Database (Denmark)

    Dolog, Peter

    2007-01-01

Information and services on the web are accessible for everyone. Users of the web differ in their background, culture, political and social environment, interests and so on. Ambient intelligence was envisioned as a concept for systems which are able to adapt to user actions and needs. With the growing amount of information and services, web applications become natural candidates to adopt the concepts of ambient intelligence. Such applications can deal with diverse user intentions and actions based on the user profile and can suggest the combination of information content and services which suit the user profile the most. This paper summarizes the domain engineering framework for such adaptive web applications. The framework provides guidelines to develop adaptive web applications as members of a family. It suggests how to utilize the design artifacts as knowledge which can be used…

  11. Inter-annual cascade effect on marine food web: A benthic pathway lagging nutrient supply to pelagic fish stock.

    Directory of Open Access Journals (Sweden)

    Lohengrin Dias de Almeida Fernandes

Full Text Available Currently, spatial and temporal changes in nutrient availability, marine planktonic communities, and fish communities are best described on a shorter-than-inter-annual (seasonal) scale, primarily because the simultaneous year-to-year variations in physical, chemical, and biological parameters are very complex. The limited availability of time series datasets furnishing simultaneous evaluations of temperature, nutrients, plankton, and fish has limited our ability to describe and to predict variability related to short-term processes, such as species-specific phenology and environmental seasonality. In the present study, we combine a computational time series analysis of a 15-year (1995-2009) weekly-sampled time series (a high-resolution long-term time series, 780 weeks) with an Autoregressive Distributed Lag Model to track non-seasonal changes in 10 potentially related parameters: sea surface temperature, nutrient concentrations (NO₂, NO₃, NH₄ and PO₄), phytoplankton biomass (as in situ chlorophyll a biomass), meroplankton (barnacle and mussel larvae), and fish abundance (Mugil liza and Caranx latus). Our data demonstrate for the first time that highly intense and frequent upwelling years initiate a huge energy flux that is not fully transmitted through the classical size-structured food web by bottom-up stimulus, but through additional ontogenetic steps. A delayed inter-annual sequential effect from phytoplankton up to top predators such as carnivorous fishes is expected if most of the energy is trapped in benthic filter-feeding organisms and their larval forms. These sequential events can explain major changes in the ecosystem food web that were not predicted by previous short-term models.
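    To make the modelling step above concrete, a minimal sketch of an autoregressive distributed lag fit in Python follows; the DataFrame and column names (e.g. "chla", "fish_abundance") are hypothetical stand-ins for the study's weekly series, not the authors' actual data or code.

        import pandas as pd
        import statsmodels.api as sm

        def fit_adl(df, target, driver, n_lags=4):
            """Regress `target` on its own lags and on lagged values of `driver`."""
            X = pd.DataFrame(index=df.index)
            for k in range(1, n_lags + 1):
                X[f"{target}_lag{k}"] = df[target].shift(k)   # autoregressive terms
                X[f"{driver}_lag{k}"] = df[driver].shift(k)   # distributed-lag terms
            X = sm.add_constant(X)
            keep = X.dropna().index              # drop rows lost to lagging
            return sm.OLS(df[target].loc[keep], X.loc[keep]).fit()

        # Hypothetical usage on a weekly-sampled frame `df`:
        # model = fit_adl(df, target="fish_abundance", driver="chla", n_lags=52)
        # print(model.summary())   # significant lag coefficients indicate delayed effects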

  12. Physician training protocol within the WEB Intrasaccular Therapy (WEB-IT) study.

    Science.gov (United States)

    Arthur, Adam; Hoit, Daniel; Coon, Alexander; Delgado Almandoz, Josser E; Elijovich, Lucas; Cekirge, Saruhan; Fiorella, David

    2018-05-01

    The WEB Intra-saccular Therapy (WEB-IT) trial is an investigational device exemption study to demonstrate the safety and effectiveness of the WEB device for the treatment of wide-neck bifurcation aneurysms. The neurovascular replicator (Vascular Simulations, Stony Brook, New York, USA) creates a physical environment that replicates patient-specific neurovascular anatomy and hemodynamic physiology, and allows devices to be implanted under fluoroscopic guidance. To report the results of a unique neurovascular replicator-based training program, which was incorporated into the WEB-IT study to optimize technical performance and patient safety. US investigators participated in a new training program that incorporated full surgical rehearsals on a neurovascular replicator. No roll-in cases were permitted within the trial. Custom replicas of patient-specific neurovascular anatomy were created for the initial cases treated at each center, as well as for cases expected to be challenging. On-site surgical rehearsals were performed before these procedures. A total of 48 participating investigators at 25 US centers trained using the replicator. Sessions included centralized introductory training, on-site training, and patient-specific full surgical rehearsal. Fluoroscopy and procedure times in the WEB-IT study were not significantly different from those seen in two European trials where participating physicians had significant WEB procedure experience before study initiation. A new program of neurovascular-replicator-based physician training was employed within the WEB-IT study. This represents a new methodology for education and training that may be an effective means to optimize technical success and patient safety during the introduction of a new technology. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Stochastic analysis of web page ranking

    NARCIS (Netherlands)

    Volkovich, Y.

    2009-01-01

    Today, the study of the World Wide Web is one of the most challenging subjects. In this work we consider the Web from a probabilistic point of view. We analyze the relations between various characteristics of the Web. In particular, we are interested in the Web properties that affect the Web page
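    Since the thesis concerns probabilistic models of web page ranking, a minimal power-iteration PageRank sketch may help fix ideas; it is a textbook baseline, not the analysis performed in the work itself.

        import numpy as np

        def pagerank(adj, damping=0.85, tol=1e-10):
            """adj[i][j] = 1 if page i links to page j; returns the rank vector."""
            A = np.asarray(adj, dtype=float)
            n = A.shape[0]
            out = A.sum(axis=1)
            # Row-stochastic transition matrix; dangling pages jump uniformly.
            P = np.where(out[:, None] > 0, A / np.maximum(out, 1.0)[:, None], 1.0 / n)
            r = np.full(n, 1.0 / n)
            while True:
                r_next = damping * P.T @ r + (1.0 - damping) / n
                if np.abs(r_next - r).sum() < tol:
                    return r_next
                r = r_next

        # A 3-page cycle ranks all pages equally:
        # print(pagerank([[0, 1, 0], [0, 0, 1], [1, 0, 0]]))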

  14. Life Cycle Project Plan Outline: Web Sites and Web-based Applications

    Science.gov (United States)

    This tool is a guideline for planning and checking for 508 compliance on web sites and web based applications. Determine which EIT components are covered or excepted, which 508 standards and requirements apply, and how to implement them.

  15. Pembuatan Aplikasi Web Manajemen Laundry dan Integrasi Data dengan Web Service

    Directory of Open Access Journals (Sweden)

    Refika Khoirunnisa

    2016-01-01

Full Text Available Until now, many companies in the laundry service business have still kept records manually, for example in books, so their data are not integrated in real time. Research was therefore needed to design a computerized system that simplifies the recording and processing of a laundry's financial data. The Laundry Management Web Application was built with the PHP, HTML, CSS and JavaScript programming languages and a MySQL database for data storage. The application is integrated with another application through a technology called a web service. It was developed using the RAD (Rapid Application Development) method, which consists of requirements planning, design, implementation and testing stages. From the results it can be concluded that the Laundry Management Web Application provides features that simplify accurate, real-time data recording and processing. Its main features include the processing of transaction data, expenses, and profit/loss reports; to support the main features, it also provides supporting features for managing customer data and application user data. Based on black-box testing, every menu function in the web application works as required.

  16. Developing Long-Term Computing Skills among Low-Achieving Students via Web-Enabled Problem-Based Learning and Self-Regulated Learning

    Science.gov (United States)

    Tsai, Chia-Wen; Lee, Tsang-Hsiung; Shen, Pei-Di

    2013-01-01

    Many private vocational schools in Taiwan have taken to enrolling students with lower levels of academic achievement. The authors re-designed a course and conducted a series of quasi-experiments to develop students' long-term computing skills, and examined the longitudinal effects of web-enabled, problem-based learning (PBL) and self-regulated…

  17. The "Carbon Data Explorer": Web-Based Space-Time Visualization of Modeled Carbon Fluxes

    Science.gov (United States)

    Billmire, M.; Endsley, K. A.

    2014-12-01

The visualization of and scientific "sense-making" from large datasets varying in both space and time is a challenge; one that is still being addressed in a number of different fields. The approaches taken thus far are often specific to a given academic field due to the unique questions that arise in different disciplines, however, basic approaches such as geographic maps and time series plots are still widely useful. The proliferation of model estimates of increasing size and resolution further complicates what ought to be a simple workflow: Model some geophysical phenomen(on), obtain results and measure uncertainty, organize and display the data, make comparisons across trials, and share findings. A new tool is in development that is intended to help scientists with the latter parts of that workflow. The tentatively-titled "Carbon Data Explorer" (http://spatial.mtri.org/flux-client/) enables users to access carbon science and related spatio-temporal science datasets over the web. All that is required to access multiple interactive visualizations of carbon science datasets is a compatible web browser and an internet connection. While the application targets atmospheric and climate science datasets, particularly spatio-temporal model estimates of carbon products, the software architecture takes an agnostic approach to the data to be visualized. Any atmospheric, biophysical, or geophysical quantity that varies in space and time, including one or more measures of uncertainty, can be visualized within the application. Within the web application, users have seamless control over a flexible and consistent symbology for map-based visualizations and plots. Where time series data are represented by one or more data "frames" (e.g. a map), users can animate the data. In the "coordinated view," users can make direct comparisons between different frames and different models or model runs, facilitating intermodal comparisons and assessments of spatio-temporal variability. Map

  18. Web Literacy, Web Literacies or Just Literacies on the Web? Reflections from a Study of Personal Homepages.

    Science.gov (United States)

    Karlsson, Anna-Malin

    2002-01-01

    Discusses the question of whether there is such a thing as web literacy. Perspectives from media studies, literacy studies, and the study of multimodal texts are used to find the main contextual parameters involved in what might be classed as web literacy. The parameters suggested are material conditions, domain, power or ideology, and semiotic…

  19. Classification of time series patterns from complex dynamic systems

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.
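    As one concrete instance of the classification task described above, the following is a minimal nearest-neighbor baseline for equal-length numerical time series; it is an illustrative sketch, not a technique claimed by the report.

        import numpy as np

        def classify_1nn(train_series, train_labels, query):
            """Label `query` with the label of its nearest training series
            under Euclidean distance (all series assumed equal length)."""
            q = np.asarray(query, dtype=float)
            dists = [np.linalg.norm(np.asarray(s, dtype=float) - q) for s in train_series]
            return train_labels[int(np.argmin(dists))]

        # Toy example with two labeled patterns:
        # train = [[0, 1, 2, 3], [3, 2, 1, 0]]
        # print(classify_1nn(train, ["rising", "falling"], [0.1, 0.9, 2.1, 2.9]))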

  20. Analytical Web Tool for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year-long data set proves invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies can offer to both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address the above. Displaying an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of the data field to identify outliers, useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e. a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with the on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, the execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products and the seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the
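    The "Anomaly" function described above reduces to a simple array operation; a minimal numpy sketch follows, with a hypothetical array of monthly fields standing in for CERES data.

        import numpy as np

        def monthly_anomaly(fields, year_idx, month_idx):
            """fields: (n_years, 12, n_lat, n_lon) array of monthly means.
            Returns the current month minus the climatological monthly mean."""
            climatology = fields[:, month_idx].mean(axis=0)    # mean over all years
            return fields[year_idx, month_idx] - climatology   # regional differences

        # fields = np.random.rand(11, 12, 180, 360)   # e.g. an 11-year record
        # anomaly_map = monthly_anomaly(fields, year_idx=10, month_idx=6)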

  1. Finding Web-Based Anxiety Interventions on the World Wide Web: A Scoping Review.

    Science.gov (United States)

    Ashford, Miriam Thiel; Olander, Ellinor K; Ayers, Susan

    2016-06-01

    One relatively new and increasingly popular approach of increasing access to treatment is Web-based intervention programs. The advantage of Web-based approaches is the accessibility, affordability, and anonymity of potentially evidence-based treatment. Despite much research evidence on the effectiveness of Web-based interventions for anxiety found in the literature, little is known about what is publically available for potential consumers on the Web. Our aim was to explore what a consumer searching the Web for Web-based intervention options for anxiety-related issues might find. The objectives were to identify currently publically available Web-based intervention programs for anxiety and to synthesize and review these in terms of (1) website characteristics such as credibility and accessibility; (2) intervention program characteristics such as intervention focus, design, and presentation modes; (3) therapeutic elements employed; and (4) published evidence of efficacy. Web keyword searches were carried out on three major search engines (Google, Bing, and Yahoo-UK platforms). For each search, the first 25 hyperlinks were screened for eligible programs. Included were programs that were designed for anxiety symptoms, currently publically accessible on the Web, had an online component, a structured treatment plan, and were available in English. Data were extracted for website characteristics, program characteristics, therapeutic characteristics, as well as empirical evidence. Programs were also evaluated using a 16-point rating tool. The search resulted in 34 programs that were eligible for review. A wide variety of programs for anxiety, including specific anxiety disorders, and anxiety in combination with stress, depression, or anger were identified and based predominantly on cognitive behavioral therapy techniques. The majority of websites were rated as credible, secure, and free of advertisement. The majority required users to register and/or to pay a program access

  2. Web Dynpro ABAP for practitioners

    CERN Document Server

    Gellert, Ulrich

    2013-01-01

    Web Dynpro ABAP, a NetWeaver web application user interface tool from SAP, enables web programming connected to SAP Systems. The authors' main focus was to create a book based on their own practical experience. Each chapter includes examples which lead through the content step-by-step and enable the reader to gradually explore and grasp the Web Dynpro ABAP process. The authors explain in particular how to design Web Dynpro components, the data binding and interface methods, and the view controller methods. They also describe the other SAP NetWeaver Elements (ABAP Dictionary, Authorization) and

  3. Introduction to the world wide web.

    Science.gov (United States)

    Downes, P K

    2007-05-12

    The World Wide Web used to be nicknamed the 'World Wide Wait'. Now, thanks to high speed broadband connections, browsing the web has become a much more enjoyable and productive activity. Computers need to know where web pages are stored on the Internet, in just the same way as we need to know where someone lives in order to post them a letter. This section explains how the World Wide Web works and how web pages can be viewed using a web browser.

  4. Web traffic and firm performance

    DEFF Research Database (Denmark)

    Farooq, Omar; Aguenaou, Samir

    2013-01-01

Does the traffic generated by websites of firms signal anything to stock market participants? Does higher web-traffic translate into availability of more information and therefore lower agency problems? And if the answers to the above questions are affirmative, does higher web-traffic translate into better firm performance? This paper aims to answer these questions by documenting a positive relationship between the extent of web-traffic and firm performance in the MENA region during 2010. We argue that higher web-traffic lowers the agency problems in firms by disseminating more information to stock market participants. Consequently, lower agency problems translate into better performance. Furthermore, we also show that the agency-reducing role of web-traffic is more pronounced in regimes where the information environment is already bad. For example, our results show a stronger impact of web…

  5. ANÁLISIS DEL NIVEL DE PRESENCIA DE LOS ESTABLECIMIENTOS HOTELEROS DE LA REGIÓN DE MURCIA EN LA WEB 2.0

    Directory of Open Access Journals (Sweden)

    Soledad María Martínez María-Dolores

    2013-01-01

Full Text Available For some years now the phenomenon known as web 2.0 has reached a very significant dimension, above all owing to the popularization of social networks. This concept, under which users are participants rather than mere spectators, means that companies and brands, driven by the user figures of web 2.0 tools, are beginning to act and trying to position themselves. The tourism sector, and more specifically the hotel sector, can be expected to occupy a prominent position in this regard, given its close interrelation with the Internet. The proposed analysis studies the level of presence of the hotel establishments of the Region of Murcia on the web 2.0, examining a series of factors related to their activity on the web and social networks.

  6. Characterizing web heuristics

    NARCIS (Netherlands)

    de Jong, Menno D.T.; van der Geest, Thea

    2000-01-01

    This article is intended to make Web designers more aware of the qualities of heuristics by presenting a framework for analyzing the characteristics of heuristics. The framework is meant to support Web designers in choosing among alternative heuristics. We hope that better knowledge of the

  7. Web Publishing Schedule

    Science.gov (United States)

Section 207(f)(2) of the E-Gov Act requires federal agencies to develop an inventory of information to be published on their Web sites, establish a schedule for publishing that information, make the schedule available for public comment, and post the schedule on the Web site.

  8. WebPASS Explorer (HR Personnel Management)

    Data.gov (United States)

    US Agency for International Development — WebPass Explorer (WebPASS Framework): USAID is partnering with DoS in the implementation of their WebPass Post Personnel (PS) Module. WebPassPS does not replace...

  9. Programming the Mobile Web

    CERN Document Server

    Firtman, Maximiliano

    2010-01-01

    Today's market for mobile apps goes beyond the iPhone to include BlackBerry, Nokia, Windows Phone, and smartphones powered by Android, webOS, and other platforms. If you're an experienced web developer, this book shows you how to build a standard app core that you can extend to work with specific devices. You'll learn the particulars and pitfalls of building mobile apps with HTML, CSS, and other standard web tools. You'll also explore platform variations, finicky mobile browsers, Ajax design patterns for mobile, and much more. Before you know it, you'll be able to create mashups using Web 2.

  10. Creating Web Pages Simplified

    CERN Document Server

    Wooldridge, Mike

    2011-01-01

    The easiest way to learn how to create a Web page for your family or organization Do you want to share photos and family lore with relatives far away? Have you been put in charge of communication for your neighborhood group or nonprofit organization? A Web page is the way to get the word out, and Creating Web Pages Simplified offers an easy, visual way to learn how to build one. Full-color illustrations and concise instructions take you through all phases of Web publishing, from laying out and formatting text to enlivening pages with graphics and animation. This easy-to-follow visual guide sho

  11. SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services.

    Science.gov (United States)

    Gessler, Damian D G; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T

    2009-09-23

    SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at http://sswap.info (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at http://sswap.info/protocol.jsp, developer tools at http://sswap.info/developer.jsp, and a portal to third-party ontologies at http://sswapmeet.sswap.info (a "swap meet"). SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the
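    Since SSWAP resources are OWL DL documents published on the web, a generic RDF client suffices to fetch and inspect one; the sketch below uses the rdflib package, and the resource URL is a hypothetical placeholder rather than a real SSWAP endpoint.

        from rdflib import Graph

        g = Graph()
        # Hypothetical resource description document (RDF/XML assumed):
        g.parse("https://example.org/sswap/resource.rdf", format="xml")
        for subj, pred, obj in list(g)[:10]:
            print(subj, pred, obj)    # inspect the first few RDF triples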

  12. Web 2.1 : Toward a large and qualitative participation on the Web

    Directory of Open Access Journals (Sweden)

    Boubker Sbihi

    2009-06-01

Full Text Available This article presents the results of research done on Web 2.0 within the School of Information Sciences (ESI). It studies the behavior of different academic actors who deal with information (teachers, master's students and information science students in Morocco) with regard to Web 2.0 services. Firstly, it evaluates the use and production of information in the context of Web 2.0. It then assesses those rates and identifies and analyzes the causes of the problems and obstacles that academic actors face. In fact, we intend to understand why information actors in the academic world often use Web 2.0 services but rarely produce qualitative content. To achieve the objectives set, we used an on-site survey method based on an electronic questionnaire administered directly to our respondents via the Internet. We chose the electronic version of the questionnaire in order to make optimal use of new technologies, to save time and to reduce cost. Then, in order to deepen the understanding of the data collected, we complemented the questionnaire with ongoing discussions with the actors. Finally, to overcome the problems already identified, we propose the elements of a new version of the Web, called Web 2.1, offering new concepts in order to encourage users to produce information of quality and make the Web more open to a larger community. This version maintains the current contents of Web 2.0 and adds more value to it. Indeed, the content will be monitored, evaluated and validated before being published. In order to target valuable information, the new version of Web 2.1 proposes to categorize users into three groups: users who just use the contents, producers who use and produce content, and validators who validate the content in order to target information that is validated and of good

  13. Desarrollo de aplicaciones web

    OpenAIRE

    Luján Mora, Sergio

    2010-01-01

Acknowledgements 1. Introduction to web applications 2. Server installation 3. Web page design 4. Structured text format: XML 5. Dynamic content 6. Database access: JDBC 7. Web services 8. Use and maintenance 9. Monitoring and analysis Bibliography GNU Free Documentation License

  14. Evaluation of WebEase: An Epilepsy Self-Management Web Site

    Science.gov (United States)

    DiIorio, Colleen; Escoffery, Cam; McCarty, Frances; Yeager, Katherine A.; Henry, Thomas R.; Koganti, Archana; Reisinger, Elizabeth L.; Wexler, Bethany

    2009-01-01

    People with epilepsy have various education needs and must adopt many self-management behaviors in order to control their condition. This study evaluates WebEase, an Internet-based, theory-driven, self-management program for adults with epilepsy. Thirty-five participants took part in a 6-week pilot implementation of WebEase. The main components of…

  15. RESTful web services with Dropwizard

    CERN Document Server

    Dallas, Alexandros

    2014-01-01

    A hands-on focused step-by-step tutorial to help you create Web Service applications using Dropwizard. If you are a software engineer or a web developer and want to learn more about building your own Web Service application, then this is the book for you. Basic knowledge of Java and RESTful Web Service concepts is assumed and familiarity with SQL/MySQL and command-line scripting would be helpful.

  16. Segmenting The Web 2.0 Market: Behavioural And Usage Patterns Of Social Web Consumers

    NARCIS (Netherlands)

    Lorenzo Romero, Carlota; Constantinides, Efthymios; Alarcon-del-Amo, Maria-del-Carmen

    2010-01-01

    The evolution of the commercial Internet to the current phase, commonly called Web 2.0 (or Social Web) has firmly positioned the web not only as a commercial but also as a social communication platform: an online environment facilitating peer-to-peer interaction, socialization, co-operation and

  17. System configuration on Web with mashup.

    OpenAIRE

    清水, 宏泰; SHIMIZU, Hiroyasu

    2014-01-01

Mashups have become a trend for creating Web services due to the popularization of cloud services. A mashup is a method for creating a Web service from several Web services and APIs. Mashups have a few problems; one of them is differences in data formats and labels, which the Semantic Web can solve. This paper proposes a method of building a system on the Web with mashups using the Semantic Web. A mashup system configuration can be expressed as a URL, so editing the URL of a mashup edits the system configuration. And any device can use this system on ...

  18. Trust estimation of the semantic web using semantic web clustering

    Science.gov (United States)

    Shirgahi, Hossein; Mohsenzadeh, Mehran; Haj Seyyed Javadi, Hamid

    2017-05-01

Development of the semantic web and social networks is undeniable in today's Internet world. The widespread nature of the semantic web has made assessing trust in this field very challenging. In recent years, extensive research has been done to estimate the trust of the semantic web. Since trust of the semantic web is a multidimensional problem, in this paper we used parameters of social network authority, the authority value of page links, and semantic authority to assess trust. Due to the large space of the semantic network, we restricted the problem scope to clusters of semantic subnetworks, obtained the trust of each cluster's elements locally, and calculated the trust of outside resources according to their local trusts and the trust of clusters in each other. According to the experimental results, the proposed method shows an F-score of more than 79%, which is on average about 11.9% higher than the Eigen, Tidal and centralised trust methods. The mean error of the proposed method is 12.936, which is on average 9.75% less than the Eigen and Tidal trust methods.

  19. Towards Semantic Web Services on Large, Multi-Dimensional Coverages

    Science.gov (United States)

    Baumann, P.

    2009-04-01

Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open GeoSpatial Consortium (OGC) and ISO. Among such data are 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known to be advantageous from databases: declarativeness (describe results rather than algorithms), safety in evaluation (no request can keep a server busy infinitely), and optimizability (enabling the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This makes it possible to webify any program, but does not allow for semantic interoperability: a function is identified only by its function name and parameters while the semantics is encoded in the (only human readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning a la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, which has been adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages. It is abstract in that it
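    As a flavor of the request style, a hedged sketch of issuing a WCPS query over HTTP follows; the endpoint, coverage name, axis label and parameter name are illustrative assumptions, though the query itself follows the general WCPS pattern of server-side subsetting plus encoding.

        import requests

        # Ask the server to subset a coverage along time and encode it as TIFF.
        query = 'for c in (ocean_temp) return encode(c[t("2008-12")], "tiff")'
        resp = requests.get("https://example.org/wcps", params={"query": query})
        with open("slice.tif", "wb") as f:
            f.write(resp.content)     # the server-evaluated, encoded result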

  20. MedlinePlus Connect: Web Service

    Science.gov (United States)

MedlinePlus Connect Web service documentation: https://medlineplus.gov/connect/service.html (the page lists the old and new Web service URLs, e.g. https://apps.nlm.nih.gov/medlineplus/services/mpconnect.cfm).

  1. MedlinePlus Connect: Web Application

    Science.gov (United States)

MedlinePlus Connect Web application documentation: https://medlineplus.gov/connect/application.html (the page lists the old and new Web application URLs, e.g. https://apps.nlm.nih.gov/medlineplus/services/mpconnect.cfm).

  2. SPADOCK: Adaptive Pipeline Technology for Web System using WebSocket

    Directory of Open Access Journals (Sweden)

    Aries RICHI

    2013-01-01

Full Text Available As information technology grows into the era of IoT (Internet of Things) and cloud computing, the performance of the web applications and web services that act as the information gateway becomes an issue. Horizontal quality-of-service improvement through system performance escalation is an issue pursued by engineers and scientists, and it gave birth to the BigPipe pipeline technology developed by Facebook. We built SPADOCK, an adaptive pipeline system constructed under a distributed system architecture with the utilization of HTML5 WebSocket, and then measured its performance. Parameters used for the measurement include latency, workload, and bandwidth. The results show that SPADOCK reduces serving latency by 68.28% compared with the conventional web, and that it is 20.63% faster than BigPipe.
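    To illustrate the kind of persistent, low-latency channel SPADOCK builds on, here is a minimal asyncio WebSocket client using the `websockets` package; the endpoint and message format are hypothetical, not SPADOCK's actual API.

        import asyncio
        import websockets

        async def fetch_pagelets(uri):
            async with websockets.connect(uri) as ws:
                await ws.send("GET /page")       # request a pipelined page
                for _ in range(3):               # pagelets arrive as they are rendered
                    print("received:", await ws.recv())

        # asyncio.run(fetch_pagelets("ws://example.org/spadock"))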

  3. PRIS-WEDAS. User’s Manual to the Web Enabled Data Acquisition System for PRIS

    International Nuclear Information System (INIS)

    2015-01-01

The user manual for the Web Enabled Data Acquisition System (WEDAS), a system that supports the Power Reactor Information System (PRIS), provides instructions, guidelines and detailed definitions for each of the data items required for PRIS. The purpose of this manual is to ensure that PRIS performance data are collected consistently and to the required quality. This PRIS-WEDAS user's manual replaces the reporting instructions published in IAEA Technical Reports Series No. 428.

  4. WebQuest y anotaciones semánticas WebQuest and semantic annotations

    Directory of Open Access Journals (Sweden)

    Santiago Blanco Suárez

    2007-03-01

Full Text Available This paper presents a system for searching and retrieving metadata on educational activities that follow the WebQuest model. It is a relational database, accessible through the web, complemented by a module for semantic annotations, whose aim is to capture and enrich knowledge about the use of these exercises by the community of teachers who experiment with them, and to document resources or websites of didactic interest in order to build a repository of quality educational links.

  5. Historical Network Analysis of the Web

    DEFF Research Database (Denmark)

    Brügger, Niels

    2013-01-01

This article discusses some of the fundamental methodological challenges related to doing historical network analyses of the web based on material in web archives. Since the late 1990s many countries have established extensive national web archives, and software-supported network analysis of the online web has for a number of years gained currency within Internet studies. However, the combination of these two phenomena—historical network analysis of material in web archives—can at best be characterized as an emerging new area of study. Most of the methodological challenges within this new area revolve around the specific nature of archived web material. On the basis of an introduction to the processes involved in web archiving as well as of the characteristics of archived web material, the article outlines and scrutinizes some of the major challenges which may arise when doing network analysis…

  6. The iMars WebGIS - Spatio-Temporal Data Queries and Single Image Map Web Services

    Science.gov (United States)

    Walter, Sebastian; Steikert, Ralf; Schreiner, Bjoern; Muller, Jan-Peter; van Gasselt, Stephan; Sidiropoulos, Panagiotis; Lanz-Kroechert, Julia

    2017-04-01

Introduction: Web-based planetary image dissemination platforms usually show outline coverages of the data and offer querying for metadata as well as preview and download, e.g. the HRSC Mapserver (Walter & van Gasselt, 2014). Here we introduce a new approach for a system dedicated to change detection by simultaneous visualisation of single-image time series in a multi-temporal context. While the usual form of presenting multi-orbit datasets is to merge the data into a larger mosaic, we want to stay with the single image as an important snapshot of the planetary surface at a specific time. In the context of the EU FP-7 iMars project we process and ingest vast amounts of automatically co-registered (ACRO) images. The basis of the co-registration is the high-precision HRSC multi-orbit quadrangle image mosaics, which are based on bundle-block-adjusted multi-orbit HRSC DTMs. Additionally we make use of the existing bundle-adjusted HRSC single images available at the PDS archives. A prototype demonstrating the presented features is available at http://imars.planet.fu-berlin.de. Multi-temporal database: In order to locate multiple image coverage and select images based on spatio-temporal queries, we merge available coverage catalogs for various NASA imaging missions into a relational database management system with geometry support. We harvest available metadata entries during our processing pipeline using the Integrated Software for Imagers and Spectrometers (ISIS) software. Currently, this database contains image outlines from the MGS/MOC, MRO/CTX and MO/THEMIS instruments with imaging dates ranging from 1996 to the present. For the MEx/HRSC data, we already maintain a database which we automatically update with custom software based on the VICAR environment. Web Map Service with time support: The MapServer software is connected to the database and provides Web Map Services (WMS) with time support based on the START_TIME image attribute. It allows temporal
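    A WMS GetMap request with the TIME dimension, as supported by the MapServer-backed service described above, can be issued with plain HTTP; in this sketch the endpoint path, layer name and extent are placeholders, while TIME carries the temporal query window.

        import requests

        params = {
            "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
            "LAYERS": "ctx_single_images",        # hypothetical layer name
            "CRS": "EPSG:4326", "BBOX": "-5,130,5,140",
            "WIDTH": 512, "HEIGHT": 512, "FORMAT": "image/png",
            "TIME": "2008-01-01/2008-12-31",      # temporal filter on START_TIME
        }
        resp = requests.get("https://example.org/imars/wms", params=params)
        with open("map.png", "wb") as f:
            f.write(resp.content)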

  7. Web corpus construction

    CERN Document Server

    Schafer, Roland

    2013-01-01

The World Wide Web constitutes the largest existing source of texts written in a great variety of languages. A feasible and sound way of exploiting this data for linguistic research is to compile a static corpus for a given language. There are several advantages of this approach: (i) Working with such corpora obviates the problems encountered when using Internet search engines in quantitative linguistic research (such as non-transparent ranking algorithms). (ii) Creating a corpus from web data is virtually free. (iii) The size of corpora compiled from the WWW may exceed by several orders of magnitude the size of language resources offered elsewhere. (iv) The data is locally available to the user, and it can be linguistically post-processed and queried with the tools preferred by her/him. This book addresses the main practical tasks in the creation of web corpora up to giga-token size. Among these tasks are the sampling process (i.e., web crawling) and the usual cleanups including boilerplate removal and rem...
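    A toy version of the cleanup step mentioned at the end of the abstract can be written in a few lines; real pipelines use far more robust boilerplate detection, so the heuristic below (keep only long text lines) is purely illustrative.

        import re
        from urllib.request import urlopen

        html = urlopen("https://example.org/").read().decode("utf-8", "replace")
        text = re.sub(r"<[^>]+>", " ", html)                 # crudely strip tags
        lines = (l.strip() for l in text.splitlines())
        content = [l for l in lines if len(l.split()) > 10]  # drop short boilerplate lines
        print("\n".join(content))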

  8. Hera : Development of semantic web information systems

    NARCIS (Netherlands)

    Houben, G.J.P.M.; Barna, P.; Frasincar, F.; Vdovják, R.; Cuella Lovelle, J.M.; et al., xx

    2003-01-01

    As a consequence of the success of the Web, methodologies for information system development need to consider systems that use the Web paradigm. These Web Information Systems (WIS) use Web technologies to retrieve information from the Web and to deliver information in a Web presentation to the

  9. Answering the Call of the Web: UVA Crafts an Innovative Web Certification Program for Its Staff.

    Science.gov (United States)

    Lee, Sandra T.

    2000-01-01

    Describes the development of a Web Certification Program at the University of Virginia. This program offers certificates at three levels: Web Basics, Web Designer, and Web Master. The paper focuses on: determination of criteria for awarding certificates; program status; program evaluation and program effectiveness; and future plans for the Web…

  10. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    Energy Technology Data Exchange (ETDEWEB)

    Ma Xiuzeng [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States)]. E-mail: hongju@purdue.edu; Li Yingkui [Department of Geography, University of Missouri-Columbia, Columbia, MO 65211 (United States); Bourgeois, Mike [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Caffee, Marc [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Elmore, David [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Granger, Darryl [Department of Earth and Atmospheric Sciences, Purdue University, West Lafayette, IN 47907 (United States); Muzikar, Paul [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Smith, Preston [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States)

    2007-06-15

Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on the nuclide concentrations measured by accelerator mass spectrometry. WebCN for ¹⁰Be and ²⁶Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for ³⁶Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for the database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
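    The simplest exposure-age relation such tools evaluate, neglecting erosion and burial, is N = (P/λ)(1 − e^(−λt)); solved for t this gives the sketch below, where the numerical values are illustrative and not WebCN's calibration.

        import math

        def exposure_age(N, P, half_life_yr):
            """N: nuclide concentration [atoms/g]; P: surface production rate
            [atoms/g/yr]. Valid only below saturation (N * lam < P)."""
            lam = math.log(2) / half_life_yr
            return -math.log(1.0 - N * lam / P) / lam

        # Illustrative 10Be numbers (~1.39 Myr half-life, a few atoms/g/yr):
        # print(exposure_age(N=5e4, P=4.0, half_life_yr=1.39e6))  # age in years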

  11. Web 3.0: implicaciones educativas

    OpenAIRE

    Grupo TACE. Tecnologías Aplicadas a las Ciencias de la Educación

    2012-01-01

Web 3.0 is considered the stage that follows Web 2.0, or the social Web. It is not yet an unambiguous term and usually appears linked to the semantic web. It is an extension of the World Wide Web that makes it possible to express natural language, and also to use a language that can be understood, interpreted and used by software agents, allowing information to be found, shared and integrated more easily. It is a new cycle in which artificial intelligence combines with the capacity of ...

  12. Arrangement for selectively irradiating webs

    International Nuclear Information System (INIS)

    Ihme, B.

    1975-01-01

    The arrangement for selectively irradiating a web includes a perforated band of a radiation impermeable substance which is guided in an endless path via a pair of guide rollers and has two juxtaposed runs in this path. A take-up roller conveys a web of material past one of the runs at a side thereof remote from the other run, the direction of movement of the web being other than parallel to that of the band and, preferably, normal thereto. An electron accelerator is provided at the far side of the run remote from the web and is effective for directing a radiation beam at the web through the perforations

  13. Determining the signalling overhead of two common WebRTC methods:JSON via XMLHttpRequest and SIP over WebSocket

    CSIR Research Space (South Africa)

    Adeyeye, M

    2013-09-01

Full Text Available Web Real-Time Communication (WebRTC) introduces real-time multimedia communication as a native capability of Web browsers. With the adoption of WebRTC, Web browsers will be able to communicate with one another (peer...
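    The two signalling encodings compared in the paper can be contrasted with a toy measurement: the byte counts of a JSON payload (as sent via XMLHttpRequest) versus a SIP INVITE carried over WebSocket. Both messages below are fabricated examples, so only these particular strings are compared, not the paper's results.

        import json

        json_offer = json.dumps({"type": "offer", "sdp": "v=0 o=- 0 0 IN IP4 0.0.0.0"})
        sip_invite = (
            "INVITE sip:bob@example.org SIP/2.0\r\n"
            "Via: SIP/2.0/WS client.invalid;branch=z9hG4bK1\r\n"
            "Content-Type: application/sdp\r\n\r\n"
            "v=0 o=- 0 0 IN IP4 0.0.0.0"
        )
        print(len(json_offer.encode()), "bytes as JSON via XMLHttpRequest")
        print(len(sip_invite.encode()), "bytes as SIP over WebSocket")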

  14. Anonymous Web Browsing and Hosting

    OpenAIRE

    MANOJ KUMAR; ANUJ RANI

    2013-01-01

In today’s high-tech environment every organization and individual computer user uses the internet for accessing web data. To maintain high confidentiality and security of the data, secure web solutions are required. In this paper we describe dedicated anonymous web browsing solutions which make our browsing faster and more secure. Web applications which play an important role in transferring our secret information, such as email, need more and more security attention. This paper also describes ho...

  15. Sistem Informasi Manajemen Terpadu Tanggap Darurat Bencana Berbasis Web

    Directory of Open Access Journals (Sweden)

    Isnardi ,

    2015-07-01

Full Text Available A series of disasters in Indonesia has caused the deaths of thousands and even hundreds of thousands of lives, such as the tsunami in Aceh and Nias and the recent earthquake in the Padang region of West Sumatra. In such situations, information about the list of needs and the distribution of aid is desperately needed by the whole community. Through this research it is expected that such information will be available and up to date, in order to achieve a more efficient and effective aid distribution system, especially shortly after a disaster. With the research title Integrated Management Information System for Web-Based Disaster Response, the following information can be presented: (a) the impact of disasters in specific regions, displayed visually over the web in the form of pictures or photos, videos and maps; (b) the urgent basic needs of disaster victims in each disaster; (c) the amount of aid available at the disaster relief center; (d) the amount of aid that has been distributed to the affected areas; and (e) the data and number of victims affected in each area. This research produced a web site that acts as an information center for emergency response, able to accommodate all of the community's information on the kinds of help needed and the amount of aid successfully distributed, preventing the build-up of assistance in certain areas so that aid is distributed equitably to all affected areas. Keyword: Disaster, Web, System, Distribution, Help

  16. EuroGOV: Engineering a Multilingual Web Corpus

    NARCIS (Netherlands)

    Sigurbjörnsson, B.; Kamps, J.; de Rijke, M.

    2005-01-01

    EuroGOV is a multilingual web corpus that was created to serve as the document collection for WebCLEF, the CLEF 2005 web retrieval task. EuroGOV is a collection of web pages crawled from the European Union portal, European Union member state governmental web sites, and Russian government web sites.

  17. Data management on the spatial web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2012-01-01

Due in part to the increasing mobile use of the web and the proliferation of geo-positioning, the web is fast acquiring a significant spatial aspect. Content and users are being augmented with locations that are used increasingly by location-based services. Studies suggest that each week, several billion web queries are issued that have local intent and target spatial web objects. These are points of interest with a web presence, and they thus have locations as well as textual descriptions. This development has given prominence to spatial web data management, an area ripe with new and exciting opportunities and challenges. The research community has embarked on inventing and supporting new query functionality for the spatial web. Different kinds of spatial web queries return objects that are near a location argument and are relevant to a text argument. To support such queries, it is important…

  18. APFEL Web a web-based application for the graphical visualization of parton distribution functions

    CERN Document Server

    Carrazza, Stefano; Palazzo, Daniele; Rojo, Juan

    2015-01-01

    We present APFEL Web, a web-based application designed to provide a flexible user-friendly tool for the graphical visualization of parton distribution functions (PDFs). In this note we describe the technical design of the APFEL Web application, motivating the choices and the framework used for the development of this project. We document the basic usage of APFEL Web and show how it can be used to provide useful input for a variety of collider phenomenological studies. Finally we provide some examples showing the output generated by the application.

  19. APFEL Web: a web-based application for the graphical visualization of parton distribution functions

    International Nuclear Information System (INIS)

    Carrazza, Stefano; Ferrara, Alfio; Palazzo, Daniele; Rojo, Juan

    2015-01-01

    We present APFEL Web, a Web-based application designed to provide a flexible user-friendly tool for the graphical visualization of parton distribution functions. In this note we describe the technical design of the APFEL Web application, motivating the choices and the framework used for the development of this project. We document the basic usage of APFEL Web and show how it can be used to provide useful input for a variety of collider phenomenological studies. Finally we provide some examples showing the output generated by the application. (note)

  20. Teaching and Learning Ecological Modeling over the Web: a Collaborative Approach

    Directory of Open Access Journals (Sweden)

    Alexey Voinov

    2002-06-01

    Full Text Available A framework for web-based collaborative teaching has been created. This framework is implemented as an ecological modeling course (http://iee.umces.edu/AV/Simmod.html, but should be flexible enough to apply to other disciplines. I have developed a series of tools to facilitate interactive communication between students and instructors, and among students taking the course. The course content consists of reading materials that describe the theory of systems analysis and modeling, guidelines on how models can be built, and numerous examples and illustrations. The interactive part includes exercises that can be discussed with and evaluated by the instructor, and provides a means to mimic class discussions. To what extent this approach can replace conventional in-class tutoring has yet to be tested, but the preliminary applications show great promise. I offer this course format as a framework and a prototype for collaborative "open-source" approaches to education, in which the web provides the means to communicate knowledge and skills asynchronously between geographically dispersed educators and students.

  1. The Evolution of Web Searching.

    Science.gov (United States)

    Green, David

    2000-01-01

    Explores the interrelation between Web publishing and information retrieval technologies and lists new approaches to Web indexing and searching. Highlights include Web directories; search engines; portalisation; Internet service providers; browser providers; meta search engines; popularity based analysis; natural language searching; links-based…

  2. WebPresent: a World Wide Web-based telepresentation tool for physicians

    Science.gov (United States)

    Sampath-Kumar, Srihari; Banerjea, Anindo; Moshfeghi, Mehran

    1997-05-01

In this paper, we present the design architecture and the implementation status of WebPresent - a world wide web based tele-presentation tool. This tool allows a physician to use a conference server workstation and make a presentation of patient cases to a geographically distributed audience. The audience consists of other physicians collaborating on patients' health care management and physicians participating in continuing medical education. These physicians are at several locations with networks of different bandwidth and capabilities connecting them. Audiences also receive the patient case information on different computers ranging from high-end display workstations to laptops with low-resolution displays. WebPresent is a scalable networked multimedia tool which supports the presentation of hypertext, images, audio, video, and a white-board to remote physicians with hospital Intranet access. WebPresent allows the audience to receive customized information. The data received can differ in resolution and bandwidth, depending on the availability of resources such as display resolution and network bandwidth.

  3. Elements of a Spatial Web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2010-01-01

Driven by factors such as the increasingly mobile use of the web and the proliferation of geo-positioning technologies, the web is rapidly acquiring a spatial aspect. Specifically, content and users are being geo-tagged, and services are being developed that exploit these tags. The research community is hard at work inventing means of efficiently supporting new spatial query functionality. Points of interest with a web presence, called spatial web objects, have a location as well as a textual description. Spatio-textual queries return such objects that are near a location argument and are relevant to a text argument. An important element in enabling such queries is to be able to rank spatial web objects. Another is to be able to determine the relevance of an object to a query. Yet another is to enable the efficient processing of such queries. The talk covers recent results on spatial web…
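    One simple way to rank spatial web objects, in the spirit of the spatio-textual queries described above, is a linear blend of spatial proximity and text relevance; the weighting and the crude term-overlap measure below are illustrative assumptions, not a method from the talk.

        import math

        def score(obj, q_loc, q_terms, alpha=0.5, max_dist_km=10.0):
            """Blend of spatial proximity and textual relevance in [0, 1]."""
            dist_km = math.hypot(obj["lat"] - q_loc[0], obj["lon"] - q_loc[1]) * 111.0
            spatial = max(0.0, 1.0 - dist_km / max_dist_km)   # nearer is better
            words = obj["text"].lower().split()
            textual = sum(t in words for t in q_terms) / len(q_terms)
            return alpha * spatial + (1.0 - alpha) * textual

        cafe = {"lat": 55.68, "lon": 12.57, "text": "organic coffee and cake"}
        print(score(cafe, q_loc=(55.67, 12.56), q_terms=["coffee", "cake"]))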

  4. WEB-IS2: Next Generation Web Services Using Amira Visualization Package

    Science.gov (United States)

    Yang, X.; Wang, Y.; Bollig, E. F.; Kadlec, B. J.; Garbow, Z. A.; Yuen, D. A.; Erlebacher, G.

    2003-12-01

    Amira (www.amiravis.com) is a powerful 3-D visualization package and has been employed recently by the science and engineering communities to gain insight into their data. We present a new web-based interface to Amira, packaged in a Java applet. We have developed a module called WEB-IS/Amira (WEB-IS2), which provides web-based access to Amira. This tool allows earth scientists to manipulate Amira controls remotely and to analyze, render and view large datasets over the internet, without regard for time or location. This could have important ramifications for GRID computing. The design of our implementation will soon allow multiple users to visually collaborate by manipulating a single dataset through a variety of client devices. These clients will only require a browser capable of displaying Java applets. As the deluge of data continues, innovative solutions that maximize ease of use without sacrificing efficiency or flexibility will continue to gain in importance, particularly in the Earth sciences. Major initiatives, such as Earthscope (http://www.earthscope.org), which will generate at least a terabyte of data daily, stand to profit enormously from a system such as WEB-IS/Amira (WEB-IS2). We discuss our use of SOAP (Livingston, D., Advanced SOAP for Web development, Prentice Hall, 2002), a novel 2-way communication protocol, as a means of providing remote commands, and efficient point-to-point transfer of binary image data. We will present our initial experiences with the use of Naradabrokering (www.naradabrokering.org) as a means to decouple clients and servers. Information is submitted to the system as a published item, while it is retrieved through a subscription mechanism, via what is known as "topics". These topic headers, their contents, and the list of subscribers are automatically tracked by Naradabrokering. This novel approach promises a high degree of fault tolerance, flexibility with respect to client diversity, and language independence for the

  5. Security scanning of Web sites at CERN

    CERN Multimedia

    IT Department

    2010-01-01

    As of early 2010, the CERN Computer Security Team will start regular scanning of all Web sites and Web applications at CERN, visible on the Internet, or on the General Purpose Network (office network). The goal of this scanning is to improve the quality of CERN Web sites. All deficits found will be reported by e-mail to the relevant Web site owners, and must be fixed in a timely manner. Web site owners may also request one-off scans of their Web site or Web application, by sending an e-mail to Computer.Security@cern.ch. These Web scans are designed to limit the impact on the scanned Web sites. Nevertheless, in very rare cases scans may cause undesired side-effects, e.g. generate a large number of log entries, or cause particularly badly designed or less robust Web applications to crash. If a Web site is affected by these security scans, it will also be susceptible to any more aggressive scan that can be performed any time by a malicious attacker. Such Web applications should be fixed, and also additionally...

  6. DIANA-microT web server v5.0: service integration into miRNA functional analysis workflows.

    Science.gov (United States)

    Paraskevopoulou, Maria D; Georgakilas, Georgios; Kostoulas, Nikos; Vlachos, Ioannis S; Vergoulis, Thanasis; Reczko, Martin; Filippidis, Christos; Dalamagas, Theodore; Hatzigeorgiou, A G

    2013-07-01

    MicroRNAs (miRNAs) are small endogenous RNA molecules that regulate gene expression through mRNA degradation and/or translation repression, affecting many biological processes. DIANA-microT web server (http://www.microrna.gr/webServer) is dedicated to miRNA target prediction/functional analysis, and it has been widely used by the scientific community since its initial launch in 2009. DIANA-microT v5.0, the new version of the microT server, has been significantly enhanced with an improved target prediction algorithm, DIANA-microT-CDS. It has been updated to incorporate miRBase version 18 and Ensembl version 69. The in silico-predicted miRNA-gene interactions in Homo sapiens, Mus musculus, Drosophila melanogaster and Caenorhabditis elegans exceed 11 million in total. The web server was completely redesigned, to host a series of sophisticated workflows, which can be used directly from the on-line web interface, enabling users without the necessary bioinformatics infrastructure to perform advanced multi-step functional miRNA analyses. For instance, one available pipeline performs miRNA target prediction using different thresholds and meta-analysis statistics, followed by pathway enrichment analysis. DIANA-microT web server v5.0 also supports a complete integration with the Taverna Workflow Management System (WMS), using the in-house developed DIANA-Taverna Plug-in. This plug-in provides ready-to-use modules for miRNA target prediction and functional analysis, which can be used to form advanced high-throughput analysis pipelines.

  7. XML databases and the semantic web

    CERN Document Server

    Thuraisingham, Bhavani

    2002-01-01

    Efficient access to data, sharing data, extracting information from data, and making use of the information have become urgent needs for today's corporations. With so much data on the Web, managing it with conventional tools is becoming almost impossible. New tools and techniques are necessary to provide interoperability as well as warehousing between multiple data sources and systems, and to extract information from the databases. XML Databases and the Semantic Web focuses on critical and new Web technologies needed for organizations to carry out transactions on the Web, to understand how to use the Web effectively, and to exchange complex documents on the Web. This reference for database administrators, database designers, and Web designers working in tandem with database technologists covers three emerging technologies of significant impact for electronic business: Extensible Markup Language (XML), semi-structured databases, and the semantic Web. The first two parts of the book explore these emerging techn...

  8. Fragment-based docking: development of the CHARMMing Web user interface as a platform for computer-aided drug design.

    Science.gov (United States)

    Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee

    2014-09-22

    Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.

  9. Anatomy of the ICDS series: A bibliometric analysis

    International Nuclear Information System (INIS)

    Cardona, Manuel; Marx, Werner

    2007-01-01

    In this article, the proceedings of the International Conferences on Defects in Semiconductors (ICDS) have been analyzed by bibliometric methods. The papers of these conferences have been published as articles in regular journals or special proceedings journals and in books with diverse publishers. The conference name/title changed several times. Many of the proceedings did not appear in the so-called 'source journals' covered by the Thomson/ISI citation databases, in particular by the Science Citation Index (SCI). But the number of citations within these source journals can be determined using the Cited Reference Search mode under the Web of Science (WoS) and the SCI offered by the host STN International. The search functions of both systems were needed to select the papers published as different document types and to cover the full time span of the series. The most cited ICDS papers were identified, and the overall numbers of citations as well as the time-dependent impact of these papers, of single conferences, and of the complete series, were established. The complete set of citing papers was analyzed with respect to the countries of the citing authors, the citing journals, and the ISI subject categories

  10. Exploring the academic invisible web

    OpenAIRE

    Lewandowski, Dirk; Mayr, Philipp

    2006-01-01

    Purpose: To provide a critical review of Bergman’s 2001 study on the Deep Web. In addition, we bring a new concept into the discussion, the Academic Invisible Web (AIW). We define the Academic Invisible Web as consisting of all databases and collections relevant to academia but not searchable by the general-purpose internet search engines. Indexing this part of the Invisible Web is central to scientific search engines. We provide an overview of approaches followed thus far. Design/methodol...

  11. Web Team Development

    Science.gov (United States)

    Church, Jennifer; Felker, Kyle

    2005-01-01

    The dynamic world of the Web has provided libraries with a wealth of opportunities, including new approaches to the provision of information and varied internal staffing structures. The development of self-managed Web teams, endowed with authority and resources, can create an adaptable and responsive culture within libraries. This new working team…

  12. Augmenting the Web through Open Hypermedia

    DEFF Research Database (Denmark)

    Bouvin, N.O.

    2003-01-01

    Based on an overview of Web augmentation and detailing the three basic approaches to extend the hypermedia functionality of the Web, the author presents a general open hypermedia framework (the Arakne framework) to augment the Web. The aim is to provide users with the ability to link, annotate, and otherwise structure Web pages, as they see fit. The paper further discusses the possibilities of the concept through the description of various experiments performed with an implementation of the framework, the Arakne Environment.

  13. Personalization of Rule-based Web Services.

    Science.gov (United States)

    Choi, Okkyung; Han, Sang Yong

    2008-04-04

    Nowadays Web users have clearly expressed their wish to receive personalized services directly. Personalization is the way to tailor services directly to the immediate requirements of the user. However, the current Web Services System does not provide features supporting this, such as personalization of services and intelligent matchmaking. In this research, a flexible, personalized Rule-based Web Services System is proposed to address these problems and to enable efficient search, discovery and construction across general Web documents and Semantic Web documents. This system utilizes matchmaking among service requesters', service providers' and users' preferences using a Rule-based Search Method, and subsequently ranks search results. A prototype of efficient Web Services search and construction for the suggested system is developed based on the current work.

  14. Interrupted time series analysis in drug utilization research is increasing: systematic review and recommendations.

    Science.gov (United States)

    Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M

    2015-08-01

    To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
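
    Segmented regression, the method most of the reviewed studies applied, can be sketched in a few lines. The following Python example is illustrative only (simulated monthly data, invented variable names), not taken from the review: it estimates an immediate level change and a slope change at a known intervention point, with HAC standard errors as a simple guard against autocorrelation.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n, break_point = 48, 24                        # 48 months, policy change at month 24
    t = np.arange(n)
    level = (t >= break_point).astype(int)         # 1 after the intervention
    trend = level * (t - break_point)              # months elapsed since the intervention
    y = 10 + 0.2 * t - 2.0 * level - 0.1 * trend + rng.normal(0, 0.5, n)

    df = pd.DataFrame({"y": y, "t": t, "level": level, "trend": trend})
    model = smf.ols("y ~ t + level + trend", data=df).fit(
        cov_type="HAC", cov_kwds={"maxlags": 3})   # Newey-West standard errors
    print(model.params)  # 'level' = immediate change, 'trend' = slope change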

  15. Web Services--A Buzz Word with Potentials

    Science.gov (United States)

    János T. Füstös

    2006-01-01

    The simplest definition of a web service is an application that provides a web API. The web API exposes the functionality of the solution to other applications. The web API relies on other Internet-based technologies to manage communications. The resulting web services are pervasive, vendor-independent, language-neutral, and very low-cost. The main purpose of a web API...

  16. Web-based Surveys: Changing the Survey Process

    OpenAIRE

    Gunn, Holly

    2002-01-01

    Web-based surveys are having a profound influence on the survey process. Unlike other types of surveys, Web page design skills and computer programming expertise play a significant role in the design of Web-based surveys. Survey respondents face new and different challenges in completing a Web-based survey. This paper examines the different types of Web-based surveys, the advantages and challenges of using Web-based surveys, the design of Web-based surveys, and the issues of validity, error, ...

  17. A Runtime System for Interactive Web Services

    DEFF Research Database (Denmark)

    Brabrand, Claus; Møller, Anders; Sandholm, Anders

    1999-01-01

    Interactive web services are increasingly replacing traditional static web pages. Producing web services seems to require a tremendous amount of laborious low-level coding due to the primitive nature of CGI programming. We present ideas for an improved runtime system for interactive web services built on top of CGI, running on virtually every combination of browser and HTTP/CGI server. The runtime system has been implemented and used extensively in <bigwig>, a tool for producing interactive web services.

  18. Extracting Macroscopic Information from Web Links.

    Science.gov (United States)

    Thelwall, Mike

    2001-01-01

    Discussion of Web-based link analysis focuses on an evaluation of Ingwersen's proposed external Web Impact Factor for the original use of the Web, namely the interlinking of academic research. Studies relationships between academic hyperlinks and research activities for British universities and discusses the use of search engines for Web link…

  19. The Semantic Web in Teacher Education

    Science.gov (United States)

    Czerkawski, Betül Özkan

    2014-01-01

    The Semantic Web enables increased collaboration among computers and people by organizing unstructured data on the World Wide Web. Rather than a separate body, the Semantic Web is a functional extension of the current Web made possible by defining relationships among websites and other online content. When explicitly defined, these relationships…

  20. Hacking web intelligence open source intelligence and web reconnaissance concepts and techniques

    CERN Document Server

    Chauhan, Sudhanshu

    2015-01-01

    Open source intelligence (OSINT) and web reconnaissance are rich topics for infosec professionals looking for the best ways to sift through the abundance of information widely available online. In many cases, the first stage of any security assessment-that is, reconnaissance-is not given enough attention by security professionals, hackers, and penetration testers. Often, the information openly present is as critical as the confidential data. Hacking Web Intelligence shows you how to dig into the Web and uncover the information many don't even know exists. The book takes a holistic approach

  1. Designing a responsive web site

    OpenAIRE

    Fejzić, Diana

    2016-01-01

    Due to the increasing prevalence of smartphones and tablet computers, responsive design has become a crucial part of web design. For the user, responsive web design delivers the best experience regardless of whether the site is visited via a mobile phone, a tablet or a computer. This thesis covers the process of planning, designing and developing a responsive web site, for a fictitious company named “Creative Design d.o.o.”, with the help of web technologies. In the initial part of the thesis, w...

  2. WebBio, a web-based management and analysis system for patient data of biological products in hospital.

    Science.gov (United States)

    Lu, Ying-Hao; Kuo, Chen-Chun; Huang, Yaw-Bin

    2011-08-01

    We selected HTML, PHP and JavaScript as the programming languages to build "WebBio", a web-based system for patient data of biological products, and used MySQL as the database. WebBio is based on the PHP-MySQL suite and is run by an Apache server on a Linux machine. WebBio provides the functions of data management, searching and data analysis for 20 kinds of biological products (plasma expanders, human immunoglobulin and hematological products). There are two particular features in WebBio: (1) pharmacists can rapidly find out whose patients used contaminated products, for medication safety, and (2) the statistics charts for a specific product can be automatically generated to reduce pharmacists' workload. WebBio has successfully turned traditional paperwork into web-based data management.

  3. A Semantically Automated Protocol Adapter for Mapping SOAP Web Services to RESTful HTTP Format to Enable the Web Infrastructure, Enhance Web Service Interoperability and Ease Web Service Migration

    Directory of Open Access Journals (Sweden)

    Frank Doheny

    2012-04-01

    Full Text Available Semantic Web Services (SWS) are Web Service (WS) descriptions augmented with semantic information. SWS enable intelligent reasoning and automation in areas such as service discovery, composition, mediation, ranking and invocation. This paper applies SWS to a previous protocol adapter which, operating within clearly defined constraints, maps SOAP Web Services to RESTful HTTP format. However, in the previous adapter, the configuration element is manual and the latency implications are locally based. This paper applies SWS technologies to automate the configuration element, and the latency tests are conducted in a more realistic Internet-based setting.
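
    The mapping itself can be pictured with a hand-written Python sketch (the adapter described above automates this step semantically). Everything here is hypothetical: the order-lookup operation, element names and URL convention are invented for illustration.

    import xml.etree.ElementTree as ET

    SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

    def soap_to_rest(envelope_xml: str) -> tuple[str, dict]:
        """Map a one-operation SOAP request onto a RESTful GET URL plus params."""
        root = ET.fromstring(envelope_xml)
        body = root.find(f"{{{SOAP_NS}}}Body")
        operation = list(body)[0]                    # e.g. {urn:demo}GetOrder
        op_name = operation.tag.split("}")[-1]
        params = {c.tag.split("}")[-1]: c.text for c in operation}
        # Convention assumed here: GetOrder -> GET /orders?orderId=...
        resource = op_name.removeprefix("Get").lower() + "s"
        return f"https://api.example.org/{resource}", params

    envelope = f"""<s:Envelope xmlns:s="{SOAP_NS}"><s:Body>
      <GetOrder xmlns="urn:demo"><orderId>42</orderId></GetOrder>
    </s:Body></s:Envelope>"""
    print(soap_to_rest(envelope))  # ('https://api.example.org/orders', {'orderId': '42'})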

  4. USING WEB MINING IN E-COMMERCE APPLICATIONS

    Directory of Open Access Journals (Sweden)

    Claudia Elena Dinucă

    2011-09-01

    Full Text Available Nowadays, the web is an important part of our daily life and has become the best medium for doing business. Large companies rethink their business strategies, using the web to improve business. A web presence offers potential customers and partners a place where a company's products and specific business can be found, and it has several advantages over a physical office, as it breaks the barriers of time and space. To stand out in the Internet economy, winning companies have realized that e-commerce is more than just buying and selling; appropriate strategies are key to improving competitive power. One effective technique used for this purpose is data mining. Data mining is the process of extracting interesting knowledge from data. Web mining is the use of data mining techniques to extract information from web data. This article presents the three components of web mining: web usage mining, web structure mining and web content mining.

  5. A Typology for Web 2.0

    DEFF Research Database (Denmark)

    Dalsgaard, Christian; Sorensen, Elsebeth Korsgaard

    2008-01-01

    Web 2.0 is a term used to describe recent developments on the World Wide Web. The term is often used to describe the increased use of the web for user-generated content, collaboration, and social networking. However, Web 2.0 is a weakly defined concept, and it is unclear exactly what kind... of a learning environment: 1) organizing communicative processes and 2) organizing resources. Organizing communicative processes is supported by Web 2.0’s ability to provide a range of communicative tools that can be organized flexibly by students. Web 2.0 provides opportunities for communities and groups to organize their own communicative processes. Further, Web 2.0 supports organization of resources by empowering students to create, construct, manage and share content themselves. However, the main potential lies within collaborative creation and sharing in networks. Potentially, networking tools...

  6. An evaluation on the effectiveness of Web 2.0 Startpages (Netvibes & Pageflakes) within NHS libraries.

    Science.gov (United States)

    McCormick, Carol; Pickard, Alison Jane

    2013-06-01

    Carol McCormick was Learning Resources Advisor in the library at James Cook University Hospital, South Teesside when she completed her BSc (Hons) Librarianship (Work Based Learning) degree at Northumbria University. She gained a 1st Class Honours and is now Learning Resources Librarian. Carol's dissertation formed part of a wider action research project into the provision of current awareness services at James Cook University Hospital. This article reports on the evaluation which was conducted after a Web 2.0 Startpage, or portal, had been introduced to improve access to current awareness information for all staff within the Trust. It is the second article in the Dissertations into practice series to examine the use of web-based tools to improve access to information for NHS staff. AM. © 2013 The authors. Health Information and Libraries Journal © 2013 Health Libraries Group.

  7. Sensor system for web inspection

    Science.gov (United States)

    Sleefe, Gerard E.; Rudnick, Thomas J.; Novak, James L.

    2002-01-01

    A system for electrically measuring variations over a flexible web has a capacitive sensor including spaced electrically conductive, transmit and receive electrodes mounted on a flexible substrate. The sensor is held against a flexible web with sufficient force to deflect the path of the web, which moves relative to the sensor.

  8. Chemical Search Web Utility

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Chemical Search Web Utility is an intuitive web application that allows the public to easily find the chemical that they are interested in using, and which...

  9. Nuclear expert web search and crawler algorithm

    International Nuclear Information System (INIS)

    Reis, Thiago; Barroso, Antonio C.O.; Baptista, Benedito Filho D.

    2013-01-01

    In this paper we present preliminary research on a web search and crawling algorithm applied specifically to nuclear-related web information. We designed a web-based nuclear-oriented expert system guided by a web crawler algorithm and a neural network, able to search and retrieve nuclear-related hypertextual web information in an autonomous and massive fashion. Preliminary experimental results show a retrieval precision of 80% for web pages related to any nuclear theme and a retrieval precision of 72% for web pages related only to the nuclear power theme. (author)

  10. Nuclear expert web search and crawler algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Reis, Thiago; Barroso, Antonio C.O.; Baptista, Benedito Filho D., E-mail: thiagoreis@usp.br, E-mail: barroso@ipen.br, E-mail: bdbfilho@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this paper we present preliminary research on a web search and crawling algorithm applied specifically to nuclear-related web information. We designed a web-based nuclear-oriented expert system guided by a web crawler algorithm and a neural network, able to search and retrieve nuclear-related hypertextual web information in an autonomous and massive fashion. Preliminary experimental results show a retrieval precision of 80% for web pages related to any nuclear theme and a retrieval precision of 72% for web pages related only to the nuclear power theme. (author)
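
    The crawl loop such a system runs can be sketched compactly. In this Python sketch the paper's neural-network relevance model is replaced by a naive keyword score, and the topic terms are illustrative; it shows the shape of the approach, not the authors' implementation.

    import urllib.request
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    KEYWORDS = ("nuclear", "reactor", "radiation")   # illustrative topic terms

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links.extend(v for k, v in attrs if k == "href" and v)

    def relevance(text: str) -> float:
        text = text.lower()
        return sum(text.count(k) for k in KEYWORDS) / max(len(text.split()), 1)

    def crawl(seed: str, max_pages: int = 20, threshold: float = 0.001):
        frontier, seen, hits = deque([seed]), {seed}, []
        while frontier and len(hits) < max_pages:
            url = frontier.popleft()
            try:
                html = urllib.request.urlopen(url, timeout=5).read().decode("utf-8", "ignore")
            except Exception:
                continue
            if relevance(html) < threshold:          # expand relevant pages only
                continue
            hits.append(url)
            parser = LinkParser()
            parser.feed(html)
            for link in parser.links:
                absolute = urljoin(url, link)
                if absolute.startswith("http") and absolute not in seen:
                    seen.add(absolute)
                    frontier.append(absolute)
        return hits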

  11. Collaborative web hosting challenges and research directions

    CERN Document Server

    Ahmed, Reaz

    2014-01-01

    This brief presents a peer-to-peer (P2P) web-hosting infrastructure (named pWeb) that can transform networked, home-entertainment devices into lightweight collaborating Web servers for persistently storing and serving multimedia and web content. The issues addressed include ensuring content availability, Plexus routing and indexing, naming schemes, web ID, collaborative web search, network architecture and content indexing. In pWeb, user-generated voluminous multimedia content is proactively uploaded to a nearby network location (preferably within the same LAN or at least, within the same ISP)

  12. Web Design Matters

    Science.gov (United States)

    Mathews, Brian

    2009-01-01

    The web site is a library's most important feature. Patrons use the web site for numerous functions, such as renewing materials, placing holds, requesting information, and accessing databases. The homepage is the place they turn to look up the hours, branch locations, policies, and events. Whether users are at work, at home, in a building, or on…

  13. Editorial for the special issue on "The Semantic Web for all" of the Semantic Web Journal (SWJ)

    NARCIS (Netherlands)

    Guéret, Christophe; Boyera, Stephane; Powell, Mike; Murillo, Martin

    2014-01-01

    Over the past few years Semantic Web technologies have brought significant changes in the way structured data is published, shared and consumed on the Web. Emerging online applications based on the Web of Objects or Linked Open Data can use the Web as a platform to exchange and reason over

  14. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...

  15. SproutCore web application development

    CERN Document Server

    Keating, Tyler

    2013-01-01

    Written as a practical, step-by-step tutorial, Creating HTML5 Apps with SproutCore is full of engaging examples to help you learn in a practical context.This book is for any person looking to write software for the Web or already writing software for the Web. Whether your background is in web development or in software development, Creating HTML5 Apps with SproutCore will help you expand your skills so that you will be ready to apply the software development principles in the web development space.

  16. Vibration transmission through sheet webs of hobo spiders (Eratigena agrestis) and tangle webs of western black widow spiders (Latrodectus hesperus).

    Science.gov (United States)

    Vibert, Samantha; Scott, Catherine; Gries, Gerhard

    2016-11-01

    Web-building spiders construct their own vibratory signaling environments. Web architecture should affect signal design, and vice versa, such that vibratory signals are transmitted with a minimum of attenuation and degradation. However, the web is the medium through which a spider senses both vibratory signals from courting males and cues produced by captured prey. Moreover, webs function not only in vibration transmission, but also in defense from predators and the elements. These multiple functions may impose conflicting selection pressures on web design. We investigated vibration transmission efficiency and accuracy through two web types with contrasting architectures: sheet webs of Eratigena agrestis (Agelenidae) and tangle webs of Latrodectus hesperus (Theridiidae). We measured vibration transmission efficiencies by playing frequency sweeps through webs with a piezoelectric vibrator and a loudspeaker, recording the resulting web vibrations at several locations on each web using a laser Doppler vibrometer. Transmission efficiencies through both web types were highly variable, with within-web variation greater than among-web variation. There was little difference in transmission efficiencies of longitudinal and transverse vibrations. The inconsistent transmission of specific frequencies through webs suggests that parameters other than frequency are most important in allowing these spiders to distinguish between vibrations of prey and courting males.

  17. Delivering Electronic Resources with Web OPACs and Other Web-based Tools: Needs of Reference Librarians.

    Science.gov (United States)

    Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.

    2000-01-01

    Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…

  18. WebViz:A Web-based Collaborative Interactive Visualization System for large-Scale Data Sets

    Science.gov (United States)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota’s Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built up over the last 3 1/2 years. The motivation behind WebViz lies primarily with the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data ‘on the fly’, wherever he or she may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE’s custom hierarchical volume rendering software provides high resolution visualizations on the order of 15 million pixels and has been employed for visualizing data primarily from simulations in astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web and javascript-enabled cell phones. Features in the current version include the ability for users to (1) securely log in, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface
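
    As a rough illustration of HTTP "server push", the minimal Python server below streams Server-Sent Events to a browser client. WebViz's actual GWT-based implementation differs, and the five "frame ready" events are invented for the example.

    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class PushHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/event-stream")
            self.send_header("Cache-Control", "no-cache")
            self.end_headers()
            for frame in range(5):                   # pretend render updates
                self.wfile.write(f"data: frame {frame} ready\n\n".encode())
                self.wfile.flush()
                time.sleep(1)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), PushHandler).serve_forever()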

  19. Emergent web intelligence advanced information retrieval

    CERN Document Server

    Badr, Youakim; Abraham, Ajith; Hassanien, Aboul-Ella

    2010-01-01

    Web Intelligence explores the impact of artificial intelligence and advanced information technologies representing the next generation of Web-based systems, services, and environments, and designing hybrid web systems that serve wired and wireless users more efficiently. Multimedia and XML-based data are produced regularly and in increasing quantities in our daily digital activities, and their retrieval must be explored and studied in this emergent web-based era. 'Emergent Web Intelligence: Advanced Information Retrieval' provides reviews of the related cutting-edge technologies and insights. It is v

  20. Readability of the web: a study on 1 billion web pages

    NARCIS (Netherlands)

    de Heus, Marije; Hiemstra, Djoerd

    We have performed a readability study on more than 1 billion web pages. The Automated Readability Index was used to determine the average grade level required to easily comprehend a website. Some of the results are that a 16-year-old can easily understand 50% of the web and an 18-year-old can easily
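
    For reference, the Automated Readability Index is a simple character/word/sentence ratio. The Python sketch below uses a deliberately crude tokenizer (the study's text extraction from HTML is certainly more involved), and the sample sentence is invented.

    import re

    def automated_readability_index(text: str) -> float:
        words = text.split()
        n_words = max(len(words), 1)
        n_sentences = max(len(re.findall(r"[.!?]+", text)), 1)
        n_chars = sum(len(w.strip(".,!?;:\"'()")) for w in words)
        return 4.71 * n_chars / n_words + 0.5 * n_words / n_sentences - 21.43

    sample = "The web is large. Half of it reads easily. The rest does not."
    print(automated_readability_index(sample))   # very simple text scores low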

  1. web cellHTS2: A web-application for the analysis of high-throughput screening data

    Directory of Open Access Journals (Sweden)

    Boutros Michael

    2010-04-01

    Full Text Available Abstract Background The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. Results The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. Conclusions The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
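
    As a flavour of the standardization options such a pipeline offers, here is a per-plate robust z-score in NumPy. This is an illustrative sketch, not the cellHTS2 implementation; the plate dimensions and hit cutoff are invented.

    import numpy as np

    def plate_zscore(raw: np.ndarray) -> np.ndarray:
        """raw: (plates, rows, cols) array of raw well intensities."""
        med = np.median(raw, axis=(1, 2), keepdims=True)
        mad = np.median(np.abs(raw - med), axis=(1, 2), keepdims=True)
        return (raw - med) / (1.4826 * mad)          # robust z-score per plate

    plates = np.random.default_rng(1).normal(100, 10, size=(3, 16, 24))
    scores = plate_zscore(plates)
    hits = np.argwhere(scores < -3)                  # candidate knockdown hits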

  2. The Semantic Web in Education

    Science.gov (United States)

    Ohler, Jason

    2008-01-01

    The semantic web or Web 3.0 makes information more meaningful to people by making it more understandable to machines. In this article, the author examines the implications of Web 3.0 for education. The author considers three areas of impact: knowledge construction, personal learning network maintenance, and personal educational administration.…

  3. XML and Better Web Searching.

    Science.gov (United States)

    Jackson, Joe; Gilstrap, Donald L.

    1999-01-01

    Addresses the implications of the new Web metalanguage XML for searching on the World Wide Web and considers the future of XML on the Web. Compared to HTML, XML is more concerned with structure of data than documents, and these data structures should prove conducive to precise, context rich searching. (Author/LRW)

  4. CAMINO HACIA LA WEB SEMÁNTICA

    Directory of Open Access Journals (Sweden)

    Jorge Alejandro Castillo Morales

    2006-01-01

    Full Text Available The rapid growth of the World Wide Web makes it increasingly difficult to search for, extract, interpret and process information from the Web. As an alternative to this problem, the Semantic Web is being developed - a new technology that makes Web content more meaningful to software applications. In the Semantic Web, annotations that express the meaning of the data in Web pages are added. For these annotations to be useful, a shared understanding (between their creators and their users) of precisely defined annotations is necessary. Ontologies - definitions of the important concepts in a knowledge domain and of the properties of each concept - are used for this purpose. Ontologies make it possible to define terminologies and to express semantic properties. As a result, the Semantic Web promises to provide a level of automation and integration that is impossible for the current Web. Likewise, the Semantic Web will be able to execute advanced queries that require supporting knowledge for their resolution.

  5. An open annotation ontology for science on web 3.0.

    Science.gov (United States)

    Ciccarese, Paolo; Ocana, Marco; Garcia Castro, Leyla Jael; Das, Sudeshna; Clark, Tim

    2011-05-17

    There is currently a gap between the rich and expressive collection of published biomedical ontologies, and the natural language expression of biomedical papers consumed on a daily basis by scientific researchers. The purpose of this paper is to provide an open, shareable structure for dynamic integration of biomedical domain ontologies with the scientific document, in the form of an Annotation Ontology (AO), thus closing this gap and enabling application of formal biomedical ontologies directly to the literature as it emerges. Initial requirements for AO were elicited by analysis of integration needs between biomedical web communities, and of needs for representing and integrating results of biomedical text mining. Analysis of strengths and weaknesses of previous efforts in this area was also performed. A series of increasingly refined annotation tools was then developed along with a metadata model in OWL, and the ontology was deployed to users at a major pharmaceutical company and a major academic center for feedback and additional requirements. Further requirements and critiques of the model were also elicited through discussions with many colleagues and incorporated into the work. This paper presents Annotation Ontology (AO), an open ontology in OWL-DL for annotating scientific documents on the web. AO supports both human and algorithmic content annotation. It enables "stand-off" or independent metadata anchored to specific positions in a web document by any one of several methods. In AO, the document may be annotated but is not required to be under update control of the annotator. AO contains a provenance model to support versioning, and a set model for specifying groups and containers of annotation. AO is freely available under open source license at http://purl.org/ao/, and extensive documentation including screencasts is available on AO's Google Code page: http://code.google.com/p/annotation-ontology/. The Annotation Ontology meets critical requirements for

  6. Public health and Web 2.0.

    Science.gov (United States)

    Hardey, Michael

    2008-07-01

    This article examines the nature and role of Web 2.0 resources and their impact on health information made available though the Internet. The transition of the Web from version one to Web 2.0 is described and the main features of the new Web examined. Two characteristic Web 2.0 resources are explored and the implications for the public and practitioners examined. First, what are known as 'user reviews' or 'user testimonials', which allow people to comment on the health services delivered to them, are described. Second, new mapping applications that take advantage of the interactive potential of Web 2.0 and provide tools to visualize complex data are examined. Following a discussion of the potential of Web 2.0, it is concluded that it offers considerable opportunities for disseminating health information and creating new sources of data, as well as generating new questions and dilemmas.

  7. Engineering semantic web information systems in Hera

    NARCIS (Netherlands)

    Vdovják, R.; Frasincar, F.; Houben, G.J.P.M.; Barna, P.

    2003-01-01

    The success of the World Wide Web has caused the concept of information system to change. Web Information Systems (WIS) take from the Web its paradigm and technologies in order to retrieve information from sources on the Web, and to present the information in terms of a Web or hypermedia

  8. Adaptive web data extraction policies

    Directory of Open Access Journals (Sweden)

    Provetti, Alessandro

    2008-12-01

    Full Text Available Web data extraction is concerned, among other things, with routine data accessing and downloading from continuously-updated dynamic Web pages. There is a significant trade-off between the rate at which the external Web sites are accessed and the computational burden on the accessing client. We address the problem by proposing a predictive model, typical of the Operating Systems literature, of the rate-of-update of each Web source. The presented model has been implemented into a new version of the Dynamo project: a middleware that assists in generating informative RSS feeds out of traditional HTML Web sites. To be effective (i.e., to make RSS feeds timely and informative) and to be scalable, Dynamo needs careful tuning and customization of its polling policies, which are described in detail.
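
    A toy version of such a policy, assuming nothing about Dynamo's actual predictor: shrink the polling gap when a content hash changes and back off multiplicatively when it does not, bounded at both ends to protect client and server alike. All names and constants below are invented.

    import hashlib
    import time
    import urllib.request

    def poll_adaptively(url: str, max_polls: int = 100,
                        min_gap: float = 60.0, max_gap: float = 3600.0):
        """Poll url, shrinking the gap after a change and backing off otherwise."""
        interval, last_digest = min_gap, None
        for _ in range(max_polls):
            content = urllib.request.urlopen(url, timeout=10).read()
            digest = hashlib.sha256(content).hexdigest()
            if digest != last_digest:                # source changed: poll sooner
                interval = max(min_gap, interval / 2)
                last_digest = digest
            else:                                    # unchanged: back off
                interval = min(max_gap, interval * 1.5)
            time.sleep(interval)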

  9. The EMBRACE web service collection

    DEFF Research Database (Denmark)

    Pettifer, S.; Ison, J.; Kalas, M.

    2010-01-01

    The EMBRACE (European Model for Bioinformatics Research and Community Education) web service collection is the culmination of a 5-year project that set out to investigate issues involved in developing and deploying web services for use in the life sciences. The project concluded that in order for web services to achieve widespread adoption, standards must be defined for the choice of web service technology, for semantically annotating both service function and the data exchanged, and a mechanism for discovering services must be provided. Building on this, the project developed: EDAM, an ontology for describing life science web services; BioXSD, a schema for exchanging data between services; and a centralized registry (http://www.embraceregistry.net) that collects together around 1000 services developed by the consortium partners. This article presents the current status of the collection...

  10. WEBnm@: a web application for normal mode analyses of proteins

    Directory of Open Access Journals (Sweden)

    Reuter Nathalie

    2005-03-01

    Full Text Available Abstract Background Normal mode analysis (NMA) has become the method of choice to investigate the slowest motions in macromolecular systems. NMA is especially useful for large biomolecular assemblies, such as transmembrane channels or virus capsids. NMA relies on the hypothesis that the vibrational normal modes having the lowest frequencies (also named soft modes) describe the largest movements in a protein and are the ones that are functionally relevant. Results We developed a web-based server to perform normal modes calculations and different types of analyses. Starting from a structure file provided by the user in the PDB format, the server calculates the normal modes and subsequently offers the user a series of automated calculations; normalized squared atomic displacements, vector field representation and animation of the first six vibrational modes. Each analysis is performed independently from the others and results can be visualized using only a web browser. No additional plug-in or software is required. For users who would like to analyze the results with their favorite software, raw results can also be downloaded. The application is available on http://www.bioinfo.no/tools/normalmodes. We present here the underlying theory, the application architecture and an illustration of its features using a large transmembrane protein as an example. Conclusion We built an efficient and modular web application for normal mode analysis of proteins. Non specialists can easily and rapidly evaluate the degree of flexibility of multi-domain protein assemblies and characterize the large amplitude movements of their domains.
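
    The computational core of NMA is an eigendecomposition of a (mass-weighted) Hessian: eigenvalues correspond to squared vibrational frequencies, and the low-frequency eigenvectors are the soft modes. The NumPy toy below uses a one-dimensional chain of springs, not WEBnm@'s coarse-grained protein force field.

    import numpy as np

    n, k = 8, 1.0                              # 8 beads joined by unit springs
    hessian = np.zeros((n, n))
    for i in range(n - 1):                     # assemble the harmonic Hessian
        hessian[i, i] += k
        hessian[i + 1, i + 1] += k
        hessian[i, i + 1] -= k
        hessian[i + 1, i] -= k

    eigvals, modes = np.linalg.eigh(hessian)   # eigenvalues ~ squared frequencies
    soft_modes = modes[:, 1:4]                 # skip the zero (rigid-body) mode
    displacements = soft_modes ** 2            # squared atomic displacements per mode
    print(eigvals[:4])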

  11. The IVTANTHERMO-Online database for thermodynamic properties of individual substances with web interface

    Science.gov (United States)

    Belov, G. V.; Dyachkov, S. A.; Levashov, P. R.; Lomonosov, I. V.; Minakov, D. V.; Morozov, I. V.; Sineva, M. A.; Smirnov, V. N.

    2018-01-01

    The database structure, main features and user interface of an IVTANTHERMO-Online system are reviewed. This system continues the series of the IVTANTHERMO packages developed in JIHT RAS. It includes the database for thermodynamic properties of individual substances and related software for analysis of experimental results, data fitting, calculation and estimation of thermodynamical functions and thermochemistry quantities. In contrast to the previous IVTANTHERMO versions it has a new extensible database design, the client-server architecture, a user-friendly web interface with a number of new features for online and offline data processing.

  12. Comparison: Mediation Solutions of WSMOLX and WebML/WebRatio

    Science.gov (United States)

    Zaremba, Maciej; Zaharia, Raluca; Turati, Andrea; Brambilla, Marco; Vitvar, Tomas; Ceri, Stefano

    In this chapter we compare the WSMO/WSML/WSMX and WebML/WebRatio approaches to the SWS-Challenge workshop mediation scenario in terms of the utilized underlying technologies and delivered solutions. In the mediation scenario one partner uses RosettaNet to define its B2B protocol while the other one operates on a proprietary solution. Both teams showed how these partners could be semantically integrated.

  13. Technical Note: On The Usage and Development of the AWAKE Web Server and Web Applications

    CERN Document Server

    Berger, Dillon Tanner

    2017-01-01

    The purpose of this technical note is to give a brief explanation of the AWAKE Web Server, the current web applications it serves, and how to edit, maintain, and update the source code. The majority of this paper is dedicated to the development of the server and its web applications.

  14. Information Diversity in Web Search

    Science.gov (United States)

    Liu, Jiahui

    2009-01-01

    The web is a rich and diverse information source with incredible amounts of information about all kinds of subjects in various forms. This information source affords great opportunity to build systems that support users in their work and everyday lives. To help users explore information on the web, web search systems should find information that…

  15. Web Service Architecture for e-Learning

    Directory of Open Access Journals (Sweden)

    Xiaohong Qiu

    2005-10-01

    Full Text Available Message-based Web Service architecture provides a unified approach to applications and Web Services that incorporates the flexibility of messaging and distributed components. We propose SMMV and MMMV collaboration as the general architecture of collaboration based on a Web service model, which accommodates both instructor-led learning and participatory learning. This approach derives from our message-based Model-View-Controller (M-MVC architecture of Web applications, comprises an event-driven Publish/Subscribe scheme, and provides effective collaboration with high interactivity of rich Web content for diverse clients over heterogeneous network environments.
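
    A publish/subscribe hub of the kind this architecture relies on reduces, at its smallest, to a topic-to-handlers map. The Python sketch below is illustrative only; the "whiteboard" topic and classroom views are invented, and real deployments add message queues and network transport.

    from collections import defaultdict
    from typing import Callable

    class Broker:
        """Toy in-process event broker: topics map to subscriber callbacks."""
        def __init__(self):
            self.subscribers: dict[str, list[Callable]] = defaultdict(list)
        def subscribe(self, topic: str, handler: Callable) -> None:
            self.subscribers[topic].append(handler)
        def publish(self, topic: str, message: dict) -> None:
            for handler in self.subscribers[topic]:  # push to every view
                handler(message)

    broker = Broker()
    broker.subscribe("whiteboard", lambda m: print("student view:", m))
    broker.subscribe("whiteboard", lambda m: print("instructor view:", m))
    broker.publish("whiteboard", {"stroke": [(0, 0), (10, 12)]})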

  16. Wordpress web application development

    CERN Document Server

    Ratnayake, Rakhitha Nimesh

    2015-01-01

    This book is intended for WordPress developers and designers who want to develop quality web applications within a limited time frame and for maximum profit. Prior knowledge of basic web development and design is assumed.

  17. Using EMBL-EBI Services via Web Interface and Programmatically via Web Services.

    Science.gov (United States)

    Lopez, Rodrigo; Cowley, Andrew; Li, Weizhong; McWilliam, Hamish

    2014-12-12

    The European Bioinformatics Institute (EMBL-EBI) provides access to a wide range of databases and analysis tools that are of key importance in bioinformatics. As well as providing Web interfaces to these resources, Web Services are available using SOAP and REST protocols that enable programmatic access to our resources and allow their integration into other applications and analytical workflows. This unit describes the various options available to a typical researcher or bioinformatician who wishes to use our resources via Web interface or programmatically via a range of programming languages. Copyright © 2014 John Wiley & Sons, Inc.
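
    Programmatic access of the kind described can be as small as one HTTP request. The Python sketch below queries the EMBL-EBI dbfetch service for a UniProtKB entry; the endpoint and parameters reflect the service as documented around the time of this unit and should be checked against current EMBL-EBI documentation before use.

    import urllib.parse
    import urllib.request

    params = urllib.parse.urlencode({
        "db": "uniprotkb", "id": "P12345", "format": "fasta", "style": "raw",
    })
    url = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch?" + params
    with urllib.request.urlopen(url, timeout=30) as response:
        print(response.read().decode())   # FASTA record for UniProtKB P12345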

  18. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  19. Express web application development

    CERN Document Server

    Yaapa, Hage

    2013-01-01

    Express Web Application Development is a practical introduction to learning about Express. Each chapter introduces you to a different area of Express, using screenshots and examples to get you up and running as quickly as possible.If you are looking to use Express to build your next web application, ""Express Web Application Development"" will help you get started and take you right through to Express' advanced features. You will need to have an intermediate knowledge of JavaScript to get the most out of this book.

  20. Integrated Visualization of Multi-sensor Ocean Data across the Web

    Science.gov (United States)

    Platt, F.; Thompson, C. K.; Roberts, J. T.; Tsontos, V. M.; Hin Lam, C.; Arms, S. C.; Quach, N.

    2017-12-01

    Whether for research or operational decision support, oceanographic applications rely on the visualization of multivariate in situ and remote sensing data as an integral part of analysis workflows. However, given their inherently 3D-spatial and temporally dynamic nature, the visual representation of marine in situ data in particular poses a challenge. The Oceanographic In situ data Interoperability Project (OIIP) is a collaborative project funded under the NASA/ACCESS program that seeks to leverage and enhance higher TRL (technology readiness level) informatics technologies to address key data interoperability and integration issues associated with in situ ocean data, including the dearth of effective web-based visualization solutions. Existing web tools for the visualization of key in situ data types - point, profile, trajectory series - are limited in their support for integrated, dynamic and coordinated views of the spatiotemporal characteristics of the data. Via the extension of the JPL Common Mapping Client (CMC) software framework, OIIP seeks to provide improved visualization support for oceanographic in situ data sets. More specifically, this entails improved representation of both horizontal and vertical aspects of these data, which inherently are depth resolved and time referenced, as well as the visual synchronization with relevant remotely-sensed gridded data products, such as sea surface temperature and salinity. Electronic tagging datasets, which are a focal use case for OIIP, provide a representative, if somewhat complex, visualization challenge in this regard. Critical to the achievement of these development objectives has been compilation of a well-rounded set of visualization use cases and requirements based on a series of end-user consultations aimed at understanding their satellite-in situ visualization needs. Here we summarize progress on aspects of the technical work and our approach.

  1. Fourier series

    CERN Document Server

    Tolstov, Georgi P

    1962-01-01

    Richard A. Silverman's series of translations of outstanding Russian textbooks and monographs is well-known to people in the fields of mathematics, physics, and engineering. The present book is another excellent text from this series, a valuable addition to the English-language literature on Fourier series.This edition is organized into nine well-defined chapters: Trigonometric Fourier Series, Orthogonal Systems, Convergence of Trigonometric Fourier Series, Trigonometric Series with Decreasing Coefficients, Operations on Fourier Series, Summation of Trigonometric Fourier Series, Double Fourie
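
    For reference, the trigonometric Fourier series developed in the book's opening chapters: for a $2\pi$-periodic integrable function $f$,

    f(x) \sim \frac{a_0}{2} + \sum_{n=1}^{\infty} \left( a_n \cos nx + b_n \sin nx \right), \qquad
    a_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos nx \, dx, \qquad
    b_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin nx \, dx .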

  2. Web Services Integration on the Fly

    National Research Council Canada - National Science Library

    Leong, Hoe W

    2008-01-01

    .... Given data, software agents and supporting software infrastructure, web services integration on the fly means that human coding is not required to integrate web services into a Web Service Architecture...

  3. Node web development

    CERN Document Server

    Herron, David

    2013-01-01

    Presented in a simple, step-by-step format, this book is an introduction to web development with Node. This book is for anybody looking for an alternative to the "P" languages (Perl, PHP, Python), or anyone looking for a new paradigm of server-side application development. The reader should have at least a rudimentary understanding of JavaScript and web application development.

  4. Unit 148 - World Wide Web Basics

    OpenAIRE

    148, CC in GIScience; Yeung, Albert K.

    2000-01-01

    This unit explains the characteristics and the working principles of the World Wide Web as the most important protocol of the Internet. Topics covered in this unit include characteristics of the World Wide Web; using the World Wide Web for the dissemination of information on the Internet; and using the World Wide Web for the retrieval of information from the Internet.

  5. Pharyngo-oesophageal webs in dysphageal patients

    International Nuclear Information System (INIS)

    Ekberg, O.; Malmquist, J.; Lindgren, S.

    1986-01-01

    Among 1134 patients examined cineradiologically because of dysphagia, 85 (7.5%) had webs in the pharyngo-oesophageal segment. Webs were more common in women (10%) than in men (5%). Radiologic characteristics of the webs, such as precise location, multiplicity, circumferential extension, thickness, accompanying streamline phenomenon and encroachment on the lumen, were compared with the presence of concomitant anaemia, thyroid disease and neoplasm, as well as the age and sex of the patients. Webs were consistently deeper in women than in men. Patients with iron deficiency anaemia had thicker webs than patients without such anaemia. No other radiologic characteristics were found that could be used to distinguish these potentially more significant webs from those in patients without such concomitant diseases. (orig.)

  6. The CLIMB Geoportal - A web-based dissemination and documentation platform for hydrological modelling data

    Science.gov (United States)

    Blaschek, Michael; Gerken, Daniel; Ludwig, Ralf; Duttmann, Rainer

    2015-04-01

    Geoportals are important elements of spatial data infrastructures (SDIs) that are strongly based on GIS-related web services. These services are basically meant for distributing, documenting and visualizing (spatial) data in a standardized manner; an important but challenging task, especially in large scientific projects with a high number of data suppliers and producers from various countries. This presentation focuses on introducing the free and open-source geoportal solution developed within the research project CLIMB (Climate Induced Changes on the Hydrology of Mediterranean Basins, www.climb-fp7.eu), which serves as the central platform for interchanging project-related spatial data and information. In this collaboration, financed by the EU-FP7 framework and coordinated at the LMU Munich, 21 partner institutions from nine European and non-European countries were involved. The CLIMB Geoportal (lgi-climbsrv.geographie.uni-kiel.de) stores and provides spatially distributed data about the current state and future changes of the hydrological conditions within the seven CLIMB test sites around the Mediterranean. Hydrological modelling outcomes - validated by the CLIMB partners - are offered to the public in the form of Web Map Services (WMS), whereas downloading the underlying data itself through Web Coverage Services (WCS) is possible for registered users only. A selection of common indicators, such as discharge and a drought index, as well as uncertainty measures, including their changes over time, is provided at different spatial resolutions. Besides map information, the portal enables the graphical display of time series of selected variables calculated by the individual models applied within the CLIMB project. The CLIMB Geoportal implementation is based on version 2.0c5 of the open-source geospatial content management system GeoNode. It includes a GeoServer instance for providing the OGC-compliant web services and comes with a metadata catalog (pycsw) as well

  7. Human Trafficking in the United States. Part II. Survey of U.S. Government Web Resources for Publications and Data

    Science.gov (United States)

    Panigabutra-Roberts, Anchalee

    2012-01-01

    This second part of a two-part series is a survey of U.S. government web resources on human trafficking in the United States, particularly of the online publications and data included on agencies' websites. Overall, the goal is to provide an introduction, an overview, and a guide on this topic for library staff to use in their research and…

  8. Increasing Student Performance through the Use of Web Services in Introductory Programming Classrooms: Results from a Series of Quasi-Experiments

    Science.gov (United States)

    Hosack, Bryan; Lim, Billy; Vogt, W. Paul

    2012-01-01

    An introduction to programming course can be a challenge for both students and instructors. This paper describes a study that introduced Web services (WS) and Service-Oriented Architecture in Information Systems 1 (IS 1) and Computer Science 1 (CS 1) programming courses over a two-year period. WS were used as an instruction tool based on their…

  9. WIDM'12 : the proceedings of the twelfth ACM international workshop on Web information and data management, November 2, 2012, Maui, Hawaii, USA

    NARCIS (Netherlands)

    Fletcher, G.H.L.; Mitra, P.

    2012-01-01

    We give an overview of WIDM 2012, held in conjunction with CIKM 2012 in Maui, Hawaii. WIDM 2012 is the twelfth in a series of international workshops on Web Information and Data Management held in conjunction with CIKM since 1998. The objective of the workshop is to bring together researchers and

  10. Recommender Systems for the Social Web

    CERN Document Server

    Pazos Arias, José J; Díaz Redondo, Rebeca P

    2012-01-01

    The recommendation of products, content and services cannot be considered newly born, although its widespread application is still in full swing. Alongside its growing success in numerous sectors, the progress of the Social Web has revolutionized the architecture of participation and relationship in the Web, making it necessary to restate recommendation and reconcile it with Collaborative Tagging, as the popularization of authoring in the Web, and Social Networking, as the translation of personal relationships to the Web. Precisely, the convergence of recommendation with the above Social Web pillars is what motivates this book, which has collected contributions from well-known experts in academia and industry to provide a broader view of the problems that Social Recommenders might face. If recommender systems have proven their key role in facilitating user access to resources on the Web, when sharing resources has become social, it is natural for recommendation strategies in the Social Web...

  11. WAPTT - Web Application Penetration Testing Tool

    Directory of Open Access Journals (Sweden)

    DURIC, Z.

    2014-02-01

    Full Text Available Web application vulnerabilities allow attackers to perform malicious actions that range from gaining unauthorized account access to obtaining sensitive data. The number of reported web application vulnerabilities has increased dramatically in the last decade. Most vulnerabilities result from improper input validation and sanitization. The most important vulnerabilities based on improper input validation and sanitization are SQL injection (SQLI), Cross-Site Scripting (XSS) and Buffer Overflow (BOF). In order to address these vulnerabilities we designed and developed WAPTT (Web Application Penetration Testing Tool). Unlike other web application penetration testing tools, this tool is modular, and can be easily extended by the end user. In order to improve the efficiency of SQLI vulnerability detection, WAPTT uses an efficient algorithm for page similarity detection. The proposed tool showed promising results as compared to six well-known web application scanners in detecting various web application vulnerabilities.
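
    As a rough illustration of the page-similarity idea (the fetch helper, the quote payload and the 0.9 threshold below are assumptions for illustration, not WAPTT's actual algorithm), a scanner can compare the page returned for a benign parameter value against the page returned for an injected value and flag parameters where the response changes drastically:

      # Minimal sketch of page-similarity-based SQLI probing. Illustrative only.
      import difflib
      import urllib.request

      def fetch(url: str) -> str:
          # Return the body of a GET request as text.
          with urllib.request.urlopen(url) as resp:
              return resp.read().decode("utf-8", errors="replace")

      def looks_injectable(base_url: str, param: str) -> bool:
          baseline = fetch(f"{base_url}?{param}=1")    # benign value
          probed = fetch(f"{base_url}?{param}=1'")     # quote-injected value
          # A low similarity ratio suggests the injected quote altered the
          # generated SQL and thus the rendered page (e.g. an error page).
          ratio = difflib.SequenceMatcher(None, baseline, probed).ratio()
          return ratio < 0.9

      # Example against a hypothetical target:
      # print(looks_injectable("http://testsite.example/item", "id"))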

  12. A Typology for Web 2.0

    DEFF Research Database (Denmark)

    Dalsgaard, Christian; Sorensen, Elsebeth Korsgaard

    2008-01-01

    Web 2.0 is a term used to describe recent developments on the World Wide Web. The term is often used to describe the increased use of the web for user-generated content, collaboration, and social networking. However, Web 2.0 is a weakly defined concept, and it is unclear exactly what kind of technologies it covers. The objective of the paper is to develop a typology that can be used to categorize Web 2.0 technologies. Further, the paper will discuss which of these technologies are unique to Web 2.0. Often, Web 2.0 is described by way of different kinds of software; for instance, blogs, wikis, podcasts, RSS, and social networking sites. The problem with this type of description is that it fails to distinguish between different types or categories of technologies. As an alternative, the typology developed in the paper distinguishes between technologies on the basis of how - and in which contexts...

  13. CMS offline web tools

    International Nuclear Information System (INIS)

    Metson, S; Newbold, D; Belforte, S; Kavka, C; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Tuura, L; Evans, D; Fanfani, A; Feichtinger, D; Kuznetsov, V; Lingen, F van; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools

  14. CMS offline web tools

    Energy Technology Data Exchange (ETDEWEB)

    Metson, S; Newbold, D [H.H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Belforte, S; Kavka, C [INFN, Sezione di Trieste (Italy); Bockelman, B [University of Nebraska Lincoln, Lincoln, NE (United States); Dziedziniewicz, K [CERN, Geneva (Switzerland); Egeland, R [University of Minnesota Twin Cities, Minneapolis, MN (United States); Elmer, P [Princeton (United States); Eulisse, G; Tuura, L [Northeastern University, Boston, MA (United States); Evans, D [Fermilab MS234, Batavia, IL (United States); Fanfani, A [Universita degli Studi di Bologna (Italy); Feichtinger, D [PSI, Villigen (Switzerland); Kuznetsov, V [Cornell University, Ithaca, NY (United States); Lingen, F van [California Institute of Technology, Pasedena, CA (United States); Wakefield, S [Blackett Laboratory, Imperial College, London (United Kingdom)

    2008-07-15

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools.

  15. Tracing the cosmic web

    Science.gov (United States)

    Libeskind, Noam I.; van de Weygaert, Rien; Cautun, Marius; Falck, Bridget; Tempel, Elmo; Abel, Tom; Alpaslan, Mehmet; Aragón-Calvo, Miguel A.; Forero-Romero, Jaime E.; Gonzalez, Roberto; Gottlöber, Stefan; Hahn, Oliver; Hellwing, Wojciech A.; Hoffman, Yehuda; Jones, Bernard J. T.; Kitaura, Francisco; Knebe, Alexander; Manti, Serena; Neyrinck, Mark; Nuza, Sebastián E.; Padilla, Nelson; Platen, Erwin; Ramachandra, Nesar; Robotham, Aaron; Saar, Enn; Shandarin, Sergei; Steinmetz, Matthias; Stoica, Radu S.; Sousbie, Thierry; Yepes, Gustavo

    2018-01-01

    The cosmic web is one of the most striking features of the distribution of galaxies and dark matter on the largest scales in the Universe. It is composed of dense regions packed full of galaxies, long filamentary bridges, flattened sheets and vast low-density voids. The study of the cosmic web has focused primarily on the identification of such features, and on understanding the environmental effects on galaxy formation and halo assembly. As such, a variety of different methods have been devised to classify the cosmic web - depending on the data at hand, be it numerical simulations, large sky surveys or other. In this paper, we bring 12 of these methods together and apply them to the same data set in order to understand how they compare. In general, these cosmic-web classifiers have been designed with different cosmological goals in mind, and to study different questions. Therefore, one would not a priori expect agreement between different techniques; however, many of these methods do converge on the identification of specific features. In this paper, we study the agreements and disparities of the different methods. For example, each method finds that knots inhabit higher density regions than filaments, etc. and that voids have the lowest densities. For a given web environment, we find a substantial overlap in the density range assigned by each web classification scheme. We also compare classifications on a halo-by-halo basis; for example, we find that 9 of 12 methods classify around a third of group-mass haloes (i.e. Mhalo ∼ 1013.5 h-1 M⊙) as being in filaments. Lastly, so that any future cosmic-web classification scheme can be compared to the 12 methods used here, we have made all the data used in this paper public.

  16. BioAssay templates for the semantic web

    Directory of Open Access Journals (Sweden)

    Alex M. Clark

    2016-05-01

    Full Text Available Annotation of bioassay protocols using semantic web vocabulary is a way to make experiment descriptions machine-readable. Protocols are communicated using concise scientific English, which precludes most kinds of analysis by software algorithms. Given the availability of a sufficiently expressive ontology, some or all of the pertinent information can be captured by asserting a series of facts, expressed as semantic web triples (subject, predicate, object). With appropriate annotation, assays can be searched, clustered, tagged and evaluated in a multitude of ways, analogous to other segments of drug discovery informatics. The BioAssay Ontology (BAO) has been previously designed for this express purpose, and provides a layered hierarchy of meaningful terms which can be linked to. Currently the biggest challenge is the issue of content creation: scientists cannot be expected to use the BAO effectively without having access to software tools that make it straightforward to use the vocabulary in a canonical way. We have sought to remove this barrier by: (1) defining a BioAssay Template (BAT) data model; (2) creating a software tool for experts to create or modify templates to suit their needs; and (3) designing a common assay template (CAT) to leverage the most value from the BAO terms. The CAT was carefully assembled by biologists in order to find a balance between the maximum amount of information captured vs. low degrees of freedom in order to keep the user experience as simple as possible. The data format that we use for describing templates and corresponding annotations is the native format of the semantic web (RDF triples), and we demonstrate some of the ways that generated content can be meaningfully queried using the SPARQL language. We have made all of these materials available as open source (http://github.com/cdd/bioassay-template), in order to encourage community input and use within diverse projects, including but not limited to our own
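
    To give a flavour of triple-based annotation and SPARQL querying of the kind described above, here is a minimal Python sketch using the rdflib library; the namespace and predicate names are invented for illustration and are not actual BAO terms:

      # Assert assay annotations as RDF triples and query them with SPARQL.
      # The vocabulary below is hypothetical, not the real BAO.
      from rdflib import Graph, Literal, Namespace

      EX = Namespace("http://example.org/assay/")  # made-up namespace

      g = Graph()
      assay = EX["assay42"]
      g.add((assay, EX.hasDetectionMethod, Literal("fluorescence")))
      g.add((assay, EX.hasTargetClass, Literal("kinase")))

      # Find all assays annotated with a fluorescence detection method.
      results = g.query("""
          PREFIX ex: <http://example.org/assay/>
          SELECT ?assay WHERE { ?assay ex:hasDetectionMethod "fluorescence" . }
      """)
      for row in results:
          print(row.assay)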

  17. Promoting Your Web Site.

    Science.gov (United States)

    Raeder, Aggi

    1997-01-01

    Discussion of ways to promote sites on the World Wide Web focuses on how search engines work and how they retrieve and identify sites. Appropriate Web links for submitting new sites and for Internet marketing are included. (LRW)

  18. Web Extensible Display Manager

    Energy Technology Data Exchange (ETDEWEB)

    Slominski, Ryan [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Larrieu, Theodore L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2018-02-01

    Jefferson Lab's Web Extensible Display Manager (WEDM) allows staff to access EDM control system screens from a web browser in remote offices and from mobile devices. Native browser technologies are leveraged to avoid installing and managing software on remote clients, such as browser plugins, tunnel applications, or an EDM environment. Since standard network ports are used, firewall exceptions are minimized. To avoid security concerns from remote users modifying a control system, WEDM exposes read-only access, and basic web authentication can be used to further restrict access. Updates of monitored EPICS channels are delivered via a WebSocket using a web gateway. The software translates EDM description files (denoted with the edl suffix) to HTML with Scalable Vector Graphics (SVG), following EDM's edl file vector drawing rules to create faithful screen renderings. The WEDM server parses edl files and creates the HTML equivalent in real time, allowing existing screens to work without modification. Alternatively, the familiar drag-and-drop EDM screen creation tool can be used to create optimized screens sized specifically for smartphones, which are then rendered by WEDM.
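
    The screen-translation idea can be sketched as mapping parsed display widgets to SVG elements. The widget dictionaries below are hypothetical stand-ins: the real server parses EDM's edl file format, which is not reproduced here.

      # Toy sketch: render parsed display widgets as SVG (illustrative only).
      def widget_to_svg(widget: dict) -> str:
          if widget["type"] == "rectangle":
              return (f'<rect x="{widget["x"]}" y="{widget["y"]}" '
                      f'width="{widget["w"]}" height="{widget["h"]}" '
                      f'fill="{widget.get("fill", "none")}" stroke="black"/>')
          if widget["type"] == "text":
              return (f'<text x="{widget["x"]}" y="{widget["y"]}">'
                      f'{widget["value"]}</text>')
          raise ValueError(f'unsupported widget type: {widget["type"]}')

      svg_body = "\n".join(widget_to_svg(w) for w in [
          {"type": "rectangle", "x": 10, "y": 10, "w": 120, "h": 40, "fill": "#eee"},
          {"type": "text", "x": 20, "y": 35, "value": "Beam current"},
      ])
      print(f'<svg xmlns="http://www.w3.org/2000/svg">{svg_body}</svg>')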

  19. Process-oriented semantic web search

    CERN Document Server

    Tran, DT

    2011-01-01

    The book is composed of two main parts. The first part is a general study of Semantic Web Search. The second part specifically focuses on the use of semantics throughout the search process, compiling a big picture of Process-oriented Semantic Web Search from different pieces of work that target specific aspects of the process. In particular, this book provides a rigorous account of the concepts and technologies proposed for searching resources and semantic data on the Semantic Web. To collate the various approaches and to better understand what the notion of Semantic Web Search entails, this bo

  20. Bringing the Web to America

    CERN Multimedia

    Kunz, P F

    1999-01-01

    On 12 December 1991, Dr. Kunz installed the first Web server outside of Europe at the Stanford Linear Accelerator Center. Today, if you do not have access to the Web you are considered disadvantaged. Before it made sense for Tim Berners-Lee to invent the Web at CERN, a number of ingredients had to be in place. Dr. Kunz will present a history of how these ingredients developed and the role the academic research community had in forming them. In particular, the role that big science, such as high energy physics, played in giving us the Web we have today...

  1. ASH External Web Portal (External Portal) -

    Data.gov (United States)

    Department of Transportation — The ASH External Web Portal is a web-based portal that provides single sign-on functionality, making the web portal a single location from which to be authenticated...

  2. Space Physics Data Facility Web Services

    Science.gov (United States)

    Candey, Robert M.; Harris, Bernard T.; Chimiak, Reine A.

    2005-01-01

    The Space Physics Data Facility (SPDF) Web services provides a distributed programming interface to a portion of the SPDF software. (A general description of Web services is available at http://www.w3.org/ and in many current software-engineering texts and articles focused on distributed programming.) The SPDF Web services distributed programming interface enables additional collaboration and integration of the SPDF software system with other software systems, in furtherance of the SPDF mission to lead collaborative efforts in the collection and utilization of space physics data and mathematical models. This programming interface conforms to all applicable Web services specifications of the World Wide Web Consortium. The interface is specified by a Web Services Description Language (WSDL) file. The SPDF Web services software consists of the following components: 1) A server program for implementation of the Web services; and 2) A software developer's kit that consists of a WSDL file, a less formal description of the interface, a Java class library (which further eases development of Java-based client software), and Java source code for an example client program that illustrates the use of the interface.
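
    Because the interface is specified by a WSDL file, any standard SOAP toolkit can generate a client for it. The sketch below uses the third-party Python library zeep; the WSDL URL and operation name are placeholders, so consult the actual SPDF developer's kit for the real endpoint and operations.

      # Minimal sketch of a WSDL-driven SOAP client using the Python "zeep"
      # library (pip install zeep). URL and operation are placeholders.
      from zeep import Client

      client = Client("https://example.gov/spdf/services?wsdl")  # hypothetical WSDL
      # Operations declared in the WSDL become callable methods on
      # client.service, e.g. (hypothetical operation name):
      # observatories = client.service.getObservatories()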

  3. Development, implementation and pilot evaluation of a Web-based Virtual Patient Case Simulation environment--Web-SP.

    Science.gov (United States)

    Zary, Nabil; Johnson, Gunilla; Boberg, Jonas; Fors, Uno G H

    2006-02-21

    The Web-based Simulation of Patients (Web-SP) project was initiated in order to facilitate the use of realistic and interactive virtual patients (VP) in medicine and healthcare education. Web-SP focuses on moving beyond the technology-savvy teachers when integrating simulation-based education into health sciences curricula, by making the creation and use of virtual patients easier. The project strives to provide a common generic platform for the design/creation, management, evaluation and sharing of web-based virtual patients. The aim of this study was to evaluate whether it was possible to develop a web-based virtual patient case simulation environment where the entire case authoring process might be handled by teachers and which would be flexible enough to be used in different healthcare disciplines. The Web-SP system was constructed to support easy authoring, management and presentation of virtual patient cases. The case authoring environment was found to enable teachers to create full-fledged patient cases without the assistance of computer specialists. Web-SP was successfully implemented at several universities by taking into account key factors such as cost, access, security, scalability and flexibility. Pilot evaluations in medical, dentistry and pharmacy courses show that students regarded Web-SP as easy to use, engaging and of educational value. Cases adapted for all three disciplines were judged to be of significant educational value by the course leaders. The Web-SP system seems to fulfil the aim of providing a common generic platform for creation, management and evaluation of web-based virtual patient cases. The responses regarding the authoring environment indicated that the system might be user-friendly enough to appeal to a majority of the academic staff. In terms of implementation strengths, Web-SP seems to fulfil most needs of course directors and teachers from various educational institutions and disciplines. The system is currently in

  4. WebGeoOlap: Multidimensional geographic data visualization for web environments

    Directory of Open Access Journals (Sweden)

    SILVA, R. O. L.

    2008-06-01

    Full Text Available As the web continues to mature as a platform, a growing number of technologies are bringing Geographic Information Systems (GIS) and analytical (OLAP) systems closer to being developed for the web environment. The great challenge of this work is to integrate these technologies, OLAP and GIS, in a single application, using Ajax, so that the user can perform analysis of analytical data and view geographic data on the map, using Google Maps.

  5. Instant Flask web development

    CERN Document Server

    DuPlain, Ron

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. The book uses a bottom-up approach to help you build applications, and is full of step-by-step instructions and practical examples to help you improve your knowledge.Instant Flask Web Development is for developers who are new to web programming, or are familiar with web programming but new to Flask. This book gives you a head start if you have some beginner experience with Python and HTML, or are willing to learn.

  6. Portal Web 2.0

    OpenAIRE

    Barba Hidalgo, José Manuel

    2008-01-01

    The topic addressed in this project revolves around the Web 2.0 concept. After an introduction discussing the main characteristics that define the set of applications grouped around this philosophy, different web application development environments are analysed, with the aim of creating a portal that follows Web 2.0 principles. The outcome of the study presents Ruby on Rails as a strong candidate, which leads to a study of...

  7. Programming NET Web Services

    CERN Document Server

    Ferrara, Alex

    2007-01-01

    Web services are poised to become a key technology for a wide range of Internet-enabled applications, spanning everything from straight B2B systems to mobile devices and proprietary in-house software. While there are several tools and platforms that can be used for building web services, developers are finding a powerful tool in Microsoft's .NET Framework and Visual Studio .NET. Designed from scratch to support the development of web services, the .NET Framework simplifies the process--programmers find that tasks that took an hour using the SOAP Toolkit take just minutes. Programming .NET

  8. Applied Semantic Web Technologies

    CERN Document Server

    Sugumaran, Vijayan

    2011-01-01

    The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with

  9. The RNAsnp web server

    DEFF Research Database (Denmark)

    Radhakrishnan, Sabarinathan; Tafer, Hakim; Seemann, Ernst Stefan

    2013-01-01

    ... are derived from extensive pre-computed tables of distributions of substitution effects as a function of gene length and GC content. Here, we present a web service that not only provides an interface for RNAsnp but also features a graphical output representation. In addition, the web server is connected to a local mirror of the UCSC genome browser database that enables users to select the genomic sequences for analysis and visualize the results directly in the UCSC genome browser. The RNAsnp web server is freely available at: http://rth.dk/resources/rnasnp/.

  10. Silicon web process development

    Science.gov (United States)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1981-01-01

    The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

  11. Building Web Reputation Systems

    CERN Document Server

    Farmer, Randy

    2010-01-01

    What do Amazon's product reviews, eBay's feedback score system, Slashdot's Karma System, and Xbox Live's Achievements have in common? They're all examples of successful reputation systems that enable consumer websites to manage and present user contributions most effectively. This book shows you how to design and develop reputation systems for your own sites or web applications, written by experts who have designed web communities for Yahoo! and other prominent sites. Building Web Reputation Systems helps you ask the hard questions about these underlying mechanisms, and why they're critical

  12. Doctors and the Web. Help your patients surf the Net safely.

    Science.gov (United States)

    Grandinetti, D A

    2000-03-06

    The Internet promises to touch every aspect of a physician's professional life, from patient relations to access to clinical studies, from billing to patient records, from marketing to e-mail. To help you make sense of what may be the most profound force in medical practice today, we're kicking off a new series with this article on helping patients navigate the Internet. Future installments, which will run in our first issue of every month, will look at such topics as online patient charts; Web-based electronic medical records; services that electronically connect doctors with health plans, hospitals, and other providers; and online supply purchasing.

  13. Development of a laboratory niche Web site.

    Science.gov (United States)

    Dimenstein, Izak B; Dimenstein, Simon I

    2013-10-01

    This technical note presents the development of a methodological laboratory niche Web site. The "Grossing Technology in Surgical Pathology" (www.grossing-technology.com) Web site is used as an example. Although common steps in creation of most Web sites are followed, there are particular requirements for structuring the template's menu on methodological laboratory Web sites. The "nested doll principle," in which one object is placed inside another, most adequately describes the methodological approach to laboratory Web site design. Fragmentation in presenting the Web site's material highlights the discrete parts of the laboratory procedure. An optimally minimal triad of components can be recommended for the creation of a laboratory niche Web site: a main set of media, a blog, and an ancillary component (host, contact, and links). The inclusion of a blog makes the Web site a dynamic forum for professional communication. By forming links and portals, cloud computing opens opportunities for connecting a niche Web site with other Web sites and professional organizations. As an additional source of information exchange, methodological laboratory niche Web sites are destined to parallel both traditional and new forms, such as books, journals, seminars, webinars, and internal educational materials. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. The definitive guide to HTML5 WebSocket

    CERN Document Server

    Wang, Vanessa; Moskovits, Peter

    2013-01-01

    The Definitive Guide to HTML5 WebSocket is the ultimate insider's WebSocket resource. This revolutionary new web technology enables you to harness the power of true real-time connectivity and build responsive, modern web applications.   This book contains everything web developers and architects need to know about WebSocket. It discusses how WebSocket-based architectures provide a dramatic reduction in unnecessary network overhead and latency compared to older HTTP (Ajax) architectures, how to layer widely used protocols such as XMPP and STOMP on top of WebSocket, and how to secure WebSocket c
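
    For readers new to the API, a minimal WebSocket client looks like the following sketch, written in Python using the third-party websockets library rather than the book's own examples; the server URL is a placeholder for a real endpoint.

      # Minimal WebSocket client sketch (pip install websockets).
      import asyncio
      import websockets

      async def main() -> None:
          # Placeholder URL; point this at a running WebSocket server.
          async with websockets.connect("ws://localhost:8765") as ws:
              await ws.send("hello")      # full-duplex: client pushes a frame
              reply = await ws.recv()     # ...and awaits a server-pushed frame
              print(reply)

      asyncio.run(main())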

  15. The real business of web design

    CERN Document Server

    Waters, John

    2004-01-01

    Written by a veteran Web designer, The Real Business of Web Design goes beyond the usual philosophy of simply creating a better customer experience online. Instead, it provides an array of visual design practices and tested business principles for clarifying and simplifying the Web development process and making a Website more customer friendly. Filled with anecdotes from the author's own experiences in the web design trenches, this guide shows readers how to use the Web in crucial ways to streamline communications, speed up transactions, boost profits, and much more. Anyone who wants to use t

  16. Creating Web Sites The Missing Manual

    CERN Document Server

    MacDonald, Matthew

    2006-01-01

    Think you have to be a technical wizard to build a great web site? Think again. For anyone who wants to create an engaging web site--for either personal or business purposes--Creating Web Sites: The Missing Manual demystifies the process and provides tools, techniques, and expert guidance for developing a professional and reliable web presence. Like every Missing Manual, you can count on Creating Web Sites: The Missing Manual to be entertaining and insightful and complete with all the vital information, clear-headed advice, and detailed instructions you need to master the task at hand. Autho

  17. Caching web service for TICF project

    International Nuclear Information System (INIS)

    Pais, V.F.; Stancalie, V.

    2008-01-01

    A caching web service was developed to allow caching of any object in a network cache, presented in the form of a web service. This application was used to increase the speed of previously implemented web services and to support new ones. Various tests were conducted to determine the impact of using this caching web service in the existing network environment and where it should be placed in order to achieve the greatest increase in performance. Since the cache is presented to applications as a web service, it can also be used for remote access to stored data and data sharing between applications
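
    The core idea - a shared put/get store with expiry that applications consult before recomputing or re-fetching - can be sketched in a few lines of Python. The TTL policy and the in-process dictionary are assumptions for illustration; the actual TICF cache is exposed over the network as a web service.

      # Illustrative put/get cache with time-to-live expiry (not the TICF code).
      import time
      from typing import Any, Optional

      class TTLCache:
          def __init__(self, ttl_seconds: float = 60.0) -> None:
              self.ttl = ttl_seconds
              self._store = {}  # key -> (expiry time, value)

          def put(self, key: str, value: Any) -> None:
              self._store[key] = (time.monotonic() + self.ttl, value)

          def get(self, key: str) -> Optional[Any]:
              entry = self._store.get(key)
              if entry is None:
                  return None
              expires_at, value = entry
              if time.monotonic() > expires_at:  # expired: evict and miss
                  del self._store[key]
                  return None
              return value

      cache = TTLCache(ttl_seconds=5.0)
      cache.put("result:42", {"answer": 42})
      print(cache.get("result:42"))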

  18. Office 2010 Web Apps For Dummies

    CERN Document Server

    Weverka, Peter

    2010-01-01

    Enhance your Microsoft Office 2010 experience with Office 2010 Web Apps! Office Web Apps complement Office, making it easy to access and edit files from anywhere. It also simplifies collaboration with those who don't have Microsoft Office on their computers. This helpful book shows you the optimum ways you can use Office Web Apps to save time and streamline your work. Veteran For Dummies author Peter Weverka begins with an introduction to Office Web Apps and then goes on to clearly explain how Office Web Apps provide you with easier, faster, more flexible ways to get things done. Walks you t

  19. Building Web Apps for Google TV

    CERN Document Server

    Ferrate, Andres; Lee, Daniels; Ohye, Maile; Carff, Paul; Shen, Shawn; Hines, Steven

    2011-01-01

    By integrating the Web with traditional TV, Google TV offers developers an important new channel for content. But creating apps for Google TV requires learning some new skills-in fact, what you may already know about mobile or desktop web apps isn't entirely applicable. Building Web Apps for Google TV will help you make the transition to Google TV as you learn the tools and techniques necessary to build sophisticated web apps for this platform. This book shows you how Google TV works, how it fits into the web ecosystem, and what the opportunities are for delivering rich content to millions o

  20. EPA Web Training Classes

    Science.gov (United States)

    Scheduled webinars can help you better manage EPA web content. Class topics include Drupal basics, creating different types of pages in the WebCMS such as document pages and forms, using Google Analytics, and best practices for metadata and accessibility.

  1. WebGimm: An integrated web-based platform for cluster analysis, functional analysis, and interactive visualization of results.

    Science.gov (United States)

    Joshi, Vineet K; Freudenberg, Johannes M; Hu, Zhen; Medvedovic, Mario

    2011-01-17

    Cluster analysis methods have been extensively researched, but the adoption of new methods is often hindered by technical barriers in their implementation and use. WebGimm is a free cluster analysis web service, and an open-source general-purpose clustering web-server infrastructure designed to facilitate easy deployment of integrated cluster analysis servers based on clustering and functional annotation algorithms implemented in R. Integrated functional analyses and interactive browsing of both clustering structure and functional annotations provide a complete analytical environment for cluster analysis and interpretation of results. The Java Web Start client-based interface is modeled after the familiar cluster/treeview packages, making its use intuitive to a wide array of biomedical researchers. For biomedical researchers, WebGimm provides an avenue to access state-of-the-art clustering procedures. For bioinformatics methods developers, WebGimm offers a convenient avenue to deploy their newly developed clustering methods. The WebGimm server, software and manuals can be freely accessed at http://ClusterAnalysis.org/.
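
    WebGimm's own clustering and annotation algorithms are implemented in R; purely to illustrate the kind of cluster analysis being served, here is a minimal hierarchical-clustering sketch in Python using SciPy on toy expression-like data.

      # Hierarchical clustering of toy data (illustration only; not WebGimm's
      # R-based algorithms).
      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      rng = np.random.default_rng(0)
      data = rng.normal(size=(20, 5))   # 20 genes x 5 conditions (toy data)

      tree = linkage(data, method="average", metric="euclidean")
      labels = fcluster(tree, t=3, criterion="maxclust")  # cut into 3 clusters
      print(labels)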

  2. Pedagogy for teaching and learning cooperatively on the Web: a Web-based pharmacology course.

    Science.gov (United States)

    Tse, Mimi M Y; Pun, Sandra P Y; Chan, Moon Fai

    2007-02-01

    The Internet is becoming a preferred place to find information. Millions of people go online in search of health and medical information, and the demand for Web-based courses is likewise growing. This article presents the development, utilization and evaluation of a web-based pharmacology course for nursing students. The course was developed based on 150 commonly used drugs. A total of 110 year 1 nursing students took part in the course. After attending six hours of face-to-face pharmacology lectures over three weeks, students were invited to complete a questionnaire (pre-test) about learning pharmacology. The course materials were then uploaded to WebCT for students' self-directed learning, with two scheduled online quizzes to pass. At the end of the semester, students were given the same questionnaire (post-test). There was a significant increase in understanding, as compared with memorizing, the subject content, in the development of problem-solving ability in learning pharmacology, and in becoming an independent learner (p < 0.05). Online quizzes yielded satisfactory results. In the focus group interview, students appreciated the time flexibility and convenience associated with web-based learning, and they made good suggestions for enhancing it. The web-based approach is promising for teaching and learning pharmacology for nurses and other health-care professionals.

  3. Materializing the web of linked data

    CERN Document Server

    Konstantinou, Nikolaos

    2015-01-01

    This book explains the Linked Data domain by adopting a bottom-up approach: it introduces the fundamental Semantic Web technologies and building blocks, which are then combined into methodologies and end-to-end examples for publishing datasets as Linked Data, and use cases that harness scholarly information and sensor data. It presents how Linked Data is used for web-scale data integration, information management and search. Special emphasis is given to the publication of Linked Data from relational databases as well as from real-time sensor data streams. The authors also trace the transformation from the document-based World Wide Web into a Web of Data. Materializing the Web of Linked Data is addressed to researchers and professionals studying software technologies, tools and approaches that drive the Linked Data ecosystem, and the Web in general.

  4. EasyFRAP-web: a web-based tool for the analysis of fluorescence recovery after photobleaching data.

    Science.gov (United States)

    Koulouras, Grigorios; Panagopoulos, Andreas; Rapsomaniki, Maria A; Giakoumakis, Nickolaos N; Taraviras, Stavros; Lygerou, Zoi

    2018-06-13

    Understanding protein dynamics is crucial in order to elucidate protein function and interactions. Advances in modern microscopy facilitate the exploration of the mobility of fluorescently tagged proteins within living cells. Fluorescence recovery after photobleaching (FRAP) is an increasingly popular functional live-cell imaging technique which enables the study of the dynamic properties of proteins at a single-cell level. As an increasing number of labs generate FRAP datasets, there is a need for fast, interactive and user-friendly applications that analyze the resulting data. Here we present easyFRAP-web, a web application that simplifies the qualitative and quantitative analysis of FRAP datasets. EasyFRAP-web permits quick analysis of FRAP datasets through an intuitive web interface with interconnected analysis steps (experimental data assessment, different types of normalization and estimation of curve-derived quantitative parameters). In addition, easyFRAP-web provides dynamic and interactive data visualization and data and figure export for further analysis after every step. We test easyFRAP-web by analyzing FRAP datasets capturing the mobility of the cell cycle regulator Cdt2 in the presence and absence of DNA damage in cultured cells. We show that easyFRAP-web yields results consistent with previous studies and highlights cell-to-cell heterogeneity in the estimated kinetic parameters. EasyFRAP-web is platform-independent and is freely accessible at: https://easyfrap.vmnet.upatras.gr/.
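
    As an illustration of the curve-derived parameters such tools report, the following Python sketch fits a single-exponential recovery model to a synthetic normalized FRAP curve; the model choice and parameter names are common conventions in FRAP analysis, not necessarily easyFRAP-web's exact implementation.

      # Fit a single-exponential recovery model to a normalized FRAP curve.
      import numpy as np
      from scipy.optimize import curve_fit

      def recovery(t, mobile_fraction, k):
          # I(t) = A * (1 - exp(-k t)) for a normalized post-bleach curve.
          return mobile_fraction * (1.0 - np.exp(-k * t))

      t = np.linspace(0, 30, 61)  # seconds after bleach (synthetic data)
      intensity = recovery(t, 0.8, 0.25) \
          + np.random.default_rng(1).normal(0, 0.02, t.size)

      (mobile, k), _ = curve_fit(recovery, t, intensity, p0=(0.5, 0.1))
      half_time = np.log(2) / k   # half-time of recovery: t_1/2 = ln(2)/k
      print(f"mobile fraction ~ {mobile:.2f}, half-time ~ {half_time:.1f} s")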

  5. Theoretical Foundations of the Web: Cognition, Communication, and Co-Operation. Towards an Understanding of Web 1.0, 2.0, 3.0

    Directory of Open Access Journals (Sweden)

    Robert Bichler

    2010-02-01

    Full Text Available Currently, there is much talk of Web 2.0 and Social Software. A common understanding of these notions does not yet exist. The question of what makes Social Software social has thus far also remained unaddressed. In this paper we provide a theoretical understanding of these notions by outlining a model of the Web as a techno-social system that enhances human cognition towards communication and co-operation. According to this understanding, we identify three qualities of the Web, namely Web 1.0 as a Web of cognition, Web 2.0 as a Web of human communication, and Web 3.0 as a Web of co-operation. We use the terms Web 1.0, Web 2.0, Web 3.0 not in a technical sense, but for describing and characterizing the social dynamics and information processes that are part of the Internet.

  6. OneWeb: plataforma de adaptación de contenidos web basada en las recomendaciones del W3C Mobile Web Initiative

    Directory of Open Access Journals (Sweden)

    Francisco O. Martínez P.

    2011-01-01

    Full Text Available Limitations in navigability and ease of use are the main obstacles the mobile web faces in achieving broad worldwide acceptance. The W3C has recently developed an initiative known as the Mobile Web Initiative (MWI), which defines a set of guidelines for the proper design and presentation of web interfaces aimed at mobile devices. This article describes the main features and functional modules of OneWeb, a content-adaptation platform based on the MWI recommendations, developed by the Interest Group for the Development of Mobile Device Applications - W@PColombia, part of the Telematics Engineering Group of the Universidad del Cauca. Performance measurements and a comparison with currently operating content-adaptation systems are also presented. Experiments showed satisfactory response times for mobile web environments, and compliance with the MWI recommendations was achieved over a set of twenty test pages.

  7. Association and Sequence Mining in Web Usage

    Directory of Open Access Journals (Sweden)

    Claudia Elena DINUCA

    2011-06-01

    Full Text Available Web servers worldwide generate a vast amount of information on web users' browsing activities. Several researchers have studied these so-called clickstream or web access log data to better understand and characterize web users. Clickstream data can be enriched with information about the content of visited pages and the origin (e.g., geographic, organizational) of the requests. The goal of this project is to analyse user behaviour by mining enriched web access log data. With the continued growth and proliferation of e-commerce, Web services, and Web-based information systems, the volumes of clickstream and user data collected by Web-based organizations in their daily operations have reached astronomical proportions. This information can be exploited in various ways, such as enhancing the effectiveness of websites or developing directed web marketing campaigns. The discovered patterns are usually represented as collections of pages, objects, or resources that are frequently accessed by groups of users with common needs or interests. The focus of this paper is to provide an overview of how to use frequent pattern techniques for discovering different types of patterns in a Web log database. In this paper we focus on finding associations as a data mining technique to extract potentially useful knowledge from web usage data. Using the NetBeans IDE, I implemented a Java program for identifying page associations from sessions. For exemplification, we used the log files from a commercial web site.
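
    The flavour of frequent-pattern discovery over sessionized log data can be conveyed with a short sketch (in Python rather than the paper's Java): count page pairs that co-occur within a session and keep those above a minimum support. The sessions and threshold below are made up for illustration.

      # Toy frequent-pair mining over web sessions (illustration only).
      from collections import Counter
      from itertools import combinations

      sessions = [                      # hypothetical sessions of visited pages
          ["/home", "/products", "/cart"],
          ["/home", "/products"],
          ["/home", "/about"],
      ]

      pair_counts = Counter()
      for session in sessions:
          for pair in combinations(sorted(set(session)), 2):
              pair_counts[pair] += 1

      min_support = 2                   # minimum number of supporting sessions
      for pair, count in pair_counts.items():
          if count >= min_support:
              print(pair, count)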

  8. Sexual information seeking on web search engines.

    Science.gov (United States)

    Spink, Amanda; Koricich, Andrew; Jansen, B J; Cole, Charles

    2004-02-01

    Sexual information seeking is an important element within human information behavior. Seeking sexually related information on the Internet takes many forms and channels, including chat rooms discussions, accessing Websites or searching Web search engines for sexual materials. The study of sexual Web queries provides insight into sexually-related information-seeking behavior, of value to Web users and providers alike. We qualitatively analyzed queries from logs of 1,025,910 Alta Vista and AlltheWeb.com Web user queries from 2001. We compared the differences in sexually-related Web searching between Alta Vista and AlltheWeb.com users. Differences were found in session duration, query outcomes, and search term choices. Implications of the findings for sexual information seeking are discussed.

  9. Web-enabling technologies for the factory floor: a web-enabling strategy for emanufacturing

    Science.gov (United States)

    Velez, Ricardo; Lastra, Jose L. M.; Tuokko, Reijo O.

    2001-10-01

    This paper addresses the different technologies available for Web-enabling the factory floor. It gives an overview of the importance of Web-enabling the factory floor for applying the concepts of flexible and intelligent manufacturing in conjunction with e-commerce. The last section defines a Web-enabling strategy for application in eManufacturing. All of this is done within the scope of the electronics manufacturing industry, so every application, technology or related matter is presented under that scope.

  10. Using Web 2.0 for health promotion and social marketing efforts: lessons learned from Web 2.0 experts.

    Science.gov (United States)

    Dooley, Jennifer Allyson; Jones, Sandra C; Iverson, Don

    2014-01-01

    Web 2.0 experts working in social marketing participated in qualitative in-depth interviews. The research aimed to document the current state of Web 2.0 practice. Perceived strengths (such as the viral nature of Web 2.0) and weaknesses (such as the time consuming effort it took to learn new Web 2.0 platforms) existed when using Web 2.0 platforms for campaigns. Lessons learned were identified--namely, suggestions for engaging in specific types of content creation strategies (such as plain language and transparent communication practices). Findings present originality and value to practitioners working in social marketing who want to effectively use Web 2.0.

  11. WebAL Comes of Age: A review of the first 21 years of Artificial Life on the Web

    DEFF Research Database (Denmark)

    Taylor, Tim; Auerbach, Joshua E; Bongard, Josh

    2016-01-01

    We present a survey of the first 21 years of web-based artificial life (WebAL) research and applications, broadly construed to include the many different ways in which artificial life and web technologies might intersect. Our survey covers the period from 1994—when the first WebAL work appeared...

  12. Influence of water mixing and food web status on the response of planktonic communities to enhanced ultraviolet-B radiation

    Science.gov (United States)

    Mostajir, B.; Uvbr Team

    2003-04-01

    Two series of mesocosm experiments were carried out in 1996 and 1997 using natural planktonic assemblages to study the effects of ultraviolet-B radiation (UVBR: 280-320 nm) at the community level. The water used in the first experiment was rich in nitrate (ca. 8-10 μM) and phytoplankton biomass (5 μg Chlorophyll a L-1: Chl a), conditions typical of a eutrophic coastal zone with herbivorous food web characteristics. In contrast, the water used in the second experiment was poor in nitrate, with the characteristics of a microbial food web. Furthermore, to understand the influence of vertical mixing on the effects of UVBR on the planktonic community, two mixing regimes (fast and slow) were tested during the mesocosm experiments of 1997. The results showed that the mixing regime can moderate the effects of UVBR on the planktonic community and can also modify the species composition in the mesocosms far more than UVBR does. Comparison of the impact of UVBR on the planktonic communities in these two experiments suggested that regenerated production-based systems (e.g. microbial food webs) tolerate the effects of UVBR more efficiently than do new production-based systems (herbivorous food webs). Results regarding the potential effects of UVBR in different marine systems (coastal versus oceanic), where different physical regimes dominate (fast versus slow mixing) and consequently the development of different food webs is favored (herbivorous versus microbial), will be discussed.

  13. Deep Web and Dark Web: Deep World of the Internet

    OpenAIRE

    Çelik, Emine

    2018-01-01

    The Internet is undoubtedly still a revolutionary breakthrough in the history of humanity. Many people use the internet for communication, social media, shopping, political and social agendas, and more. The Deep Web and Dark Web concepts are addressed not only by computer and software engineers but also by social scientists, because of the role the internet plays for states in the international arena, for public institutions, and in human life. Given the very important role of the internet for social s...

  14. Gender and web design software

    Directory of Open Access Journals (Sweden)

    Gabor Horvath

    2007-12-01

    Full Text Available There are several studies dealing with the differences between sites originated by men and women. However, these references are mainly related to the "output", the final web site. In our research we examined the input side of web design. We thoroughly analysed a number of randomly selected web design software packages to see whether, and to what extent, the templates they offer determine the final look of an individual's website. We found that most of them offer typically masculine templates, which makes it difficult for any woman to design a feminine-looking website. This can be one of the reasons for the masculine hegemony of websites on the web.

  15. Beyond Web 2.0 … and Beyond the Semantic Web

    Science.gov (United States)

    Bénel, Aurélien; Zhou, Chao; Cahier, Jean-Pierre

    Tim O'Reilly, the famous technology book publisher, changed the lives of many of us when he coined the name "Web 2.0" (O'Reilly 2005). Our research topics suddenly became subjects for open discussion in various cultural formats such as radio and TV, while at the same time they became part of an inappropriate marketing discourse, according to several scientific reviewers. Indeed, Tim O'Reilly's initial thoughts were about economic consequences, since the term was about the resurrection of the Web after the bursting of the dot-com bubble. Some opponents of the concept think the term should not be used at all, since it is underpinned by no technological revolution. In contrast, we think that there was a paradigm shift when several sites based on user-generated content became some of the most visited Web sites, and massive adoption of that kind is worthy of researchers' attention.

  16. Web-Beagle: a web server for the alignment of RNA secondary structures.

    Science.gov (United States)

    Mattei, Eugenio; Pietrosanto, Marco; Ferrè, Fabrizio; Helmer-Citterich, Manuela

    2015-07-01

    Web-Beagle (http://beagle.bio.uniroma2.it) is a web server for the pairwise global or local alignment of RNA secondary structures. The server exploits a new encoding for RNA secondary structure and a substitution matrix of RNA structural elements to perform RNA structural alignments. The web server allows the user to compute up to 10,000 alignments in a single run, taking as input sets of RNA sequences and structures or primary sequences alone. In the latter case, the server computes the secondary structure prediction for the RNAs on the fly using RNAfold (free energy minimization). The user can also compare a set of input RNAs to one of five pre-compiled RNA datasets, including lncRNAs and 3' UTRs. All types of comparison produce as output the pairwise alignments along with structural similarity and statistical significance measures for each resulting alignment. A graphical color-coded representation of the alignments allows the user to easily identify structural similarities between RNAs. Web-Beagle can be used for finding structurally related regions in two or more RNAs, for the identification of homologous regions or for functional annotation. Benchmark tests show that Web-Beagle has lower computational complexity and running time, and better performance, than other available methods. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
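
    To make the notion of a pairwise structural alignment concrete, here is a toy global-alignment scorer over dot-bracket strings with a trivial match/mismatch scheme; Web-Beagle itself uses a richer structural encoding and a dedicated substitution matrix, neither of which is reproduced here.

      # Toy Needleman-Wunsch scoring over dot-bracket structures (illustration).
      def align_score(a: str, b: str, match=1, mismatch=-1, gap=-1) -> int:
          rows, cols = len(a) + 1, len(b) + 1
          dp = [[0] * cols for _ in range(rows)]
          for i in range(1, rows):          # gap-only prefixes
              dp[i][0] = i * gap
          for j in range(1, cols):
              dp[0][j] = j * gap
          for i in range(1, rows):
              for j in range(1, cols):
                  diag = dp[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                  dp[i][j] = max(diag, dp[i-1][j] + gap, dp[i][j-1] + gap)
          return dp[-1][-1]

      print(align_score("((..))", "((...))"))  # similar hairpins score highly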

  17. The DIRAC Web Portal 2.0

    Science.gov (United States)

    Mathe, Z.; Casajus Ramo, A.; Lazovsky, N.; Stagni, F.

    2015-12-01

    For many years the DIRAC interware (Distributed Infrastructure with Remote Agent Control) has had a web interface, allowing users to monitor DIRAC activities and also interact with the system. Since then, many new web technologies have emerged; therefore a redesign and a new implementation of the DIRAC Web portal were necessary, taking into account the lessons learnt using the old portal. These new technologies made it possible to build a more compact, robust and responsive web interface that gives users better control over the whole system while keeping the interface simple. The web framework provides a large set of “applications”, each of which can be used for interacting with various parts of the system. Communities can also create their own sets of personalised web applications, and can easily extend already existing ones with minimal effort. Each user can configure and personalise the view for each application and save it using the DIRAC User Profile service as a RESTful state provider, instead of using cookies. The owner of a view can share it with other users or within a user community. Compatibility between different browsers is assured, as well as with mobile versions. In this paper, we present the new DIRAC Web framework as well as the LHCb extension of the DIRAC Web portal.

  18. Ten years for the public Web

    CERN Multimedia

    2003-01-01

    Ten years ago, CERN issued a statement declaring that a little known piece of software called the World Wide Web was in the public domain. Nowadays, the Web is an indispensable part of modern communications. The idea for the Web goes back to March 1989, when CERN computer scientist Tim Berners-Lee wrote a proposal for a 'Distributed Information Management System' for the high-energy physics community. The Web was originally conceived and developed to meet the demand for information sharing between scientists working all over the world. There were many obstacles in the 1980s to the effective exchange of information. There was, for example, a great variety of computer and network systems, with hardly any common features. The main purpose of the Web was to allow scientists to access information from any source in a consistent and simple way. By Christmas 1990, Berners-Lee's idea had become the World Wide Web, with its first server and browser running at CERN. Through 1991, the Web spread to other particle physics ...

  19. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers would behave as agents of humans on the World Wide Web, where they would retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by presenting techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how … the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  20. Free web-based modelling platform for managed aquifer recharge (MAR) applications

    Science.gov (United States)

    Stefan, Catalin; Junghanns, Ralf; Glaß, Jana; Sallwey, Jana; Fatkhutdinov, Aybulat; Fichtner, Thomas; Barquero, Felix; Moreno, Miguel; Bonilla, José; Kwoyiga, Lydia

    2017-04-01

    … Besides the simulation tools, a web-based database is under development in which geospatial and time series data can be stored, managed, and processed. Furthermore, a web-based information system containing user guides for the various tools and applications developed, as well as basic information on MAR and related topics, has been published and will be regularly expanded as new tools are implemented. The INOWAS-DSS, including its simulation tools, database and information system, provides an extensive framework to manage, plan and optimize MAR facilities. As the INOWAS-DSS is open-source software accessible via the Internet using standard web browsers, it offers new ways of data sharing and collaboration among various partners and decision makers.
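
    As a generic illustration of the kind of time-series processing such a database enables (not the INOWAS-DSS API itself, which the abstract does not detail), the sketch below resamples hypothetical daily infiltration readings to monthly means with pandas.

        import pandas as pd

        # Hypothetical daily infiltration readings for a recharge basin.
        days = pd.date_range("2016-01-01", periods=90, freq="D")
        infiltration = pd.Series(range(90), index=days,
                                 name="infiltration_m3_per_day")

        # Aggregate to monthly means, a typical step when managing MAR
        # monitoring data.
        monthly = infiltration.resample("MS").mean()
        print(monthly)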

  1. `Indoor` series vending machines; `Indoor` series jido hanbaiki

    Energy Technology Data Exchange (ETDEWEB)

    Gensui, T.; Kida, A. [Fuji Electric Co. Ltd., Tokyo (Japan)]; Okumura, H. [Fuji Denki Reiki Co. Ltd., Tokyo (Japan)]

    1996-07-10

    This paper introduces three series of vending machines designed to match the interior of an office building: vending machines for cups, paper packs, cans, and tobacco. The `Interior` series has a symmetrical design finished in a grain pattern. Its interior is given a laser satin finish to convey a sense of superior quality and refined style. The push-button used for product selection is hot-stamped onto the plastic surface to give a hair-line luster. The `Interior Phase II` series has a bay-window design conveying superior quality and lightness; its interior is also given a laser satin finish. The `Interior 21` series is integrated into the wall, except for the sales operation panel. Its upper and lower dress panels can be detached and attached, its door lock is a wire-type structure with good operability, and its operation block is finished in titanium color. The dimensions of the three series are standardized. 6 figs., 1 tab.

  2. Pro JavaScript for web apps

    CERN Document Server

    Freeman, Adam

    2012-01-01

    JavaScript is the engine behind every web app, and a solid knowledge of it is essential for all modern web developers. Pro JavaScript for Web Apps gives you all of the information that you need to create professional, optimized, and efficient JavaScript applications that will run across all devices. It takes you through all aspects of modern JavaScript application creation, showing you how to combine JavaScript with the new features of HTML5 and CSS3 to make the most of the new web technologies. The focus of the book is on creating professional web applications, ensuring that your app provides ...

  3. 8th Chinese Conference on The Semantic Web and Web Science

    CERN Document Server

    Du, Jianfeng; Wang, Haofen; Wang, Peng; Ji, Donghong; Pan, Jeff Z; CSWS 2014

    2014-01-01

    This book constitutes the thoroughly refereed papers of the 8th Chinese Conference on The Semantic Web and Web Science, CSWS 2014, held in Wuhan, China, in August 2014. The 22 research papers presented were carefully reviewed and selected from 61 submissions. The papers are organized in topical sections such as ontology reasoning and learning; semantic data generation and management; and semantic technology and applications.

  4. Web 2.0: A Strategy Guide

    CERN Document Server

    Shuen, Amy

    2008-01-01

    Web 2.0 makes headlines, but how does it make money? This concise guide explains what's different about Web 2.0 and how those differences can improve the bottom line. Whether you're an executive, a small business owner, or an entrepreneur, Web 2.0: A Strategy Guide illustrates through real life examples how various businesses are creating new opportunities on today's Web. This book is about strategy rather than the technology itself.

  5. Spider-web amphiphiles as artificial lipid clusters: design, synthesis, and accommodation of lipid components at the air-water interface.

    Science.gov (United States)

    Ariga, Katsuhiko; Urakawa, Toshihiro; Michiue, Atsuo; Kikuchi, Jun-ichi

    2004-08-03

    As a novel category of two-dimensional lipid clusters, dendrimers having an amphiphilic structure in every unit were synthesized and labeled "spider-web amphiphiles". Amphiphilic units based on a Lys-Lys-Glu tripeptide with hydrophobic tails at the C-terminal and a polar head at the N-terminal are dendrically connected through stepwise peptide coupling. This structural design allowed us to separately introduce the polar head and hydrophobic tails. Accordingly, we demonstrated the synthesis of the spider-web amphiphile series in three combinations: acetyl head/C16 chain, acetyl head/C18 chain, and ammonium head/C16 chain. All the spider-web amphiphiles were synthesized in satisfactory yields, and characterized by 1H NMR, MALDI-TOFMS, GPC, and elemental analyses. Surface pressure (π)-molecular area (A) isotherms showed the formation of expanded monolayers except for the C18-chain amphiphile at 10 °C, for which the molecular area in the condensed phase is consistent with the cross-sectional area assigned for all the alkyl chains. In all the spider-web amphiphiles, the molecular areas at a given pressure in the expanded phase increased in proportion to the number of units, indicating that alkyl chains freely fill the inner space of the dendritic core. The mixing of octadecanoic acid with the spider-web amphiphiles at the air-water interface induced condensation of the molecular area. From the molecular area analysis, the inclusion of the octadecanoic acid bears a stoichiometric characteristic; i.e., the number of captured octadecanoic acids in the spider-web amphiphile roughly agrees with the number of branching points in the spider-web amphiphile.

  6. A reasonable Semantic Web

    NARCIS (Netherlands)

    Hitzler, Pascal; Van Harmelen, Frank

    2010-01-01

    The realization of Semantic Web reasoning is central to substantiating the Semantic Web vision. However, current mainstream research on this topic faces serious challenges, which forces us to question established lines of research and to rethink the underlying approaches. We argue that reasoning for ...

  7. Head First Web Design

    CERN Document Server

    Watrall, Ethan

    2008-01-01

    Want to know how to make your pages look beautiful, communicate your message effectively, guide visitors through your website with ease, and get everything approved by the accessibility and usability police at the same time? Head First Web Design is your ticket to mastering all of these complex topics, and understanding what's really going on in the world of web design. Whether you're building a personal blog or a corporate website, there's a lot more to web design than divs and CSS selectors, but what do you really need to know? With this book, you'll learn the secrets of designing effective ...

  8. Models and methods for building web recommendation systems

    OpenAIRE

    Stekh, Yu.; Artsibasov, V.

    2012-01-01

    The modern World Wide Web contains a large number of Web sites, each with many pages. Web recommendation systems (recommendation systems for web pages) are typically implemented on web servers and use data obtained from collections of viewed web templates (implicit data) or from user registration data (explicit data). This article considers the methods and algorithms of web recommendation systems based on data mining technology (web mining).
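
    As a minimal sketch of the implicit-data approach described above, a generic user-based collaborative filter rather than the authors' specific algorithm, consider the following Python fragment; the page-view counts are invented.

        import math

        # Toy implicit data: page-view counts per user (invented).
        views = {
            "alice": {"/home": 5, "/docs": 2},
            "bob":   {"/home": 3, "/docs": 4, "/blog": 2},
            "carol": {"/blog": 6, "/news": 1},
        }

        def cosine(u, v):
            # Cosine similarity between two sparse view-count vectors.
            num = sum(u[p] * v[p] for p in set(u) & set(v))
            den = (math.sqrt(sum(x * x for x in u.values()))
                   * math.sqrt(sum(x * x for x in v.values())))
            return num / den if den else 0.0

        def recommend(user, k=2):
            # Rank other users by similarity, then suggest pages they
            # viewed that `user` has not seen yet.
            sims = sorted(((cosine(views[user], views[o]), o)
                           for o in views if o != user), reverse=True)
            seen, scores = set(views[user]), {}
            for sim, other in sims[:k]:
                if sim <= 0:
                    continue
                for page, count in views[other].items():
                    if page not in seen:
                        scores[page] = scores.get(page, 0.0) + sim * count
            return sorted(scores, key=scores.get, reverse=True)

        print(recommend("alice"))  # -> ['/blog']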

  9. A Web Service Framework for Economic Applications

    Directory of Open Access Journals (Sweden)

    Dan BENTA

    2010-01-01

    Full Text Available The Internet offers multiple solutions for linking companies with their partners, customers or suppliers using IT solutions, with a special focus on Web services. Web services are able to solve the problems related to the exchange of data between business partners, markets that can use each other's services, and incompatibility between IT applications. As web services are programs described, discovered and accessed on the basis of XML vocabularies and Web protocols, web services represent Web-based technology solutions for small and medium-sized enterprises (SMEs). This paper presents a web service framework for economic applications. A prototype of this IT solution using web services is also presented; it was implemented in a few companies from the IT, commerce and consulting fields, measuring the impact of the solution on the development of the business environment.
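
    A minimal sketch of the XML-over-HTTP data exchange such a framework builds on, using only Python's standard library; the service name, XML elements and port are hypothetical, not taken from the paper.

        from http.server import BaseHTTPRequestHandler, HTTPServer

        class QuoteService(BaseHTTPRequestHandler):
            # Hypothetical price-quote service returning XML to a business partner.
            def do_GET(self):
                body = (b"<?xml version='1.0'?>"
                        b"<quote><sku>A-100</sku>"
                        b"<price currency='EUR'>19.90</price></quote>")
                self.send_response(200)
                self.send_header("Content-Type", "application/xml")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            # Serve on a hypothetical local port; a partner application
            # would fetch and parse the XML response.
            HTTPServer(("localhost", 8080), QuoteService).serve_forever()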

  10. Configuration of Web services as parametric design

    NARCIS (Netherlands)

    Ten Teije, Annette; Van Harmelen, Frank; Wielinga, Bob

    2004-01-01

    The configuration of Web services is particularly hard given the heterogeneous, unreliable and open nature of the Web. Furthermore, such composite Web services are likely to be complex services that will require adaptation for each specific use. Current approaches to Web service configuration are ...

  11. Web-based tailored intervention for preparation of parents and children for outpatient surgery (WebTIPS): development.

    Science.gov (United States)

    Kain, Zeev N; Fortier, Michelle A; Chorney, Jill MacLaren; Mayes, Linda

    2015-04-01

    As a result of cost-containment efforts, preparation programs for outpatient surgery are currently not available to the majority of children and parents. The recent dramatic growth of the Internet presents a unique opportunity to transform how children and their parents are prepared for surgery. In this article, we describe the development of a Web-based Tailored Intervention for Preparation of parents and children undergoing Surgery (WebTIPS). A multidisciplinary taskforce agreed that a Web-based tailored intervention consisting of intake, matrix, and output modules was the preferred approach. Next, the content of the various intake variables, the matrix logic, and the output content was developed. The output product has a parent component and a child component and is described at http://surgerywebtips.com/about.php. The child component makes use of preparation strategies such as information provision, modeling, play, and coping skills training. The parent component of WebTIPS includes strategies such as information provision, coping skills training, and relaxation and distraction techniques. A reputable animation and Web design company developed a secure Web-based product based on the above description. The resulting Web-based tailored preoperative preparation program can be accessed by children and parents multiple times before and after surgery. A follow-up article in this issue of Anesthesia & Analgesia describes formative evaluation and preliminary efficacy testing of this program.

  12. Creating a web site the missing manual

    CERN Document Server

    MacDonald, Matthew

    2008-01-01

    Think you have to be a technical wizard to build a great web site? Think again. If you want to create an engaging web site, this thoroughly revised, completely updated edition of Creating a Web Site: The Missing Manual demystifies the process and provides tools, techniques, and expert guidance for developing a professional and reliable web presence. Whether you want to build a personal web site, an e-commerce site, a blog, or a web site for a specific occasion or promotion, this book gives you detailed instructions and clear-headed advice for: Everything from planning to launching. From pi ...

  13. Creativity and audiovisual production on the Web: the case of the Andalusian web series

    Directory of Open Access Journals (Sweden)

    Jiménez Marín, Gloria

    2012-01-01

    Full Text Available Web 2.0 has made it possible for young creators to generate audiovisual content and distribute it through social media, without needing to go through the conventional distribution channels that until now were indispensable. On the other side of the computer or mobile device wait audiences eager to consume video, an activity to which we devote ever more hours, with one fundamental difference: we have stopped watching television in favour of consuming more audiovisual content online on other kinds of systems and devices. In this context, Andalusia has witnessed the birth of web series, already considered cult works, which have demonstrated the potential of our young creators. However, producing episodes on a regular basis entails a financial effort that most of them cannot afford. Meanwhile, communication agencies are confronting the phenomenon of online advertising: executives face audiences that crave experiences and content, and these are the keys to advertising 2.0. From here, the pieces begin to fit together: some have the ideas and the content; others, the funding. Such is the case of Niña Repelente, one of the online series with the greatest impact in recent years, and its sponsorship agreement with the leading telephone company in Spain, signed in 2010, which led to the distribution of the series on Tuenti, the home-grown social network most used among young Spaniards. This case motivates the present article, which analyses the keys to these transformations in the different phases of audiovisual creation (creation, distribution, consumption), studies the critical variables, and raises questions for the world of communication in light of the latest trends in web applications.

  14. Capturing Trust in Social Web Applications

    Science.gov (United States)

    O'Donovan, John

    The Social Web constitutes a shift in information flow from the traditional Web. Previously, content was provided by the owners of a website, for consumption by the end-user. Nowadays, these websites are being replaced by Social Web applications, which are frameworks for the publication of user-provided content. Traditionally, Web content could be 'trusted' to some extent based on the site it originated from. Algorithms such as Google's PageRank were (and still are) used to compute the importance of a website, based on analysis of the underlying link topology. In the Social Web, analysis of link topology merely tells us about the importance of the information framework which hosts the content. Consumers of information still need to know about the importance/reliability of the content they are reading, and therefore about the reliability of the producers of that content. Research into trust and reputation of the producers of information in the Social Web is still very much in its infancy. Every day, people are forced to make trusting decisions about strangers on the Web based on a very limited amount of information: for example, purchasing a product from an eBay seller with a 'reputation' of 99%, downloading a file from a peer-to-peer application such as BitTorrent, or allowing Amazon.com to tell you what products you will like. Even something as simple as reading comments on a weblog requires the consumer to make a trusting decision about the quality of that information. In all of these example cases, and indeed throughout the Social Web, there is a pressing demand for increased information upon which we can make trusting decisions. This chapter examines the diversity of sources from which trust information can be harnessed within Social Web applications and discusses a high-level classification of those sources. Three different techniques for harnessing and using trust from a range of sources are presented. These techniques are deployed in two sample Social Web ...
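
    The link-topology analysis mentioned above can be made concrete with a toy power-iteration PageRank, a generic sketch rather than Google's production algorithm; the four-page link graph is invented.

        # Toy power-iteration PageRank over an invented link graph,
        # illustrating importance computed purely from link topology.
        links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}

        def pagerank(links, damping=0.85, iterations=50):
            pages = list(links)
            rank = {p: 1.0 / len(pages) for p in pages}
            for _ in range(iterations):
                # Each page starts with the teleportation share ...
                new = {p: (1 - damping) / len(pages) for p in pages}
                # ... and receives a damped share of rank from each inlink.
                for page, outs in links.items():
                    share = rank[page] / len(outs) if outs else 0.0
                    for out in outs:
                        new[out] += damping * share
                rank = new
            return rank

        print(pagerank(links))  # page "c", with the most inlinks, ranks highest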

  15. LAST CHANCE TO HELP PLAN FOR THE 2001-02 LECTURE SERIES

    CERN Multimedia

    Academic Training; Tel. 73127

    2001-01-01

    Please note that you still have a chance to contribute to improved planning for next year's Academic Training Lecture Series. At the web site http://wwwinfo/support/survey/academic-training/ you will find questionnaires covering the following categories: high energy physics, applied physics, science and society, and post-graduate student lectures. Answering the questionnaire will help ensure that the selected topics are as close as possible to your interests. In particular, requests and comments from students will be much appreciated. To encourage your contribution, the AT Committee will reward one lucky winner with a small prize: a 50 CHF coupon for a book purchase at the CERN bookshop.

  16. Web-Based Distributed XML Query Processing

    NARCIS (Netherlands)

    Smiljanic, M.; Feng, L.; Jonker, Willem; Blanken, Henk; Grabs, T.; Schek, H-J.; Schenkel, R.; Weikum, G.

    2003-01-01

    Web-based distributed XML query processing has gained in importance in recent years due to the widespread popularity of XML on the Web. Unlike centralized and tightly coupled distributed systems, Web-based distributed database systems are highly unpredictable and uncontrollable, with a rather ...
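
    As a small, centralized illustration of the kind of XML query such distributed systems must evaluate (element names are hypothetical; only Python's standard library is used):

        import xml.etree.ElementTree as ET

        # Hypothetical XML fragment standing in for a remote data source.
        doc = ET.fromstring("""
        <catalog>
          <book id="1"><title>XML on the Web</title><price>25</price></book>
          <book id="2"><title>Query Processing</title><price>40</price></book>
        </catalog>
        """)

        # ElementTree supports a subset of XPath; a distributed engine would
        # route such queries to the nodes holding the relevant fragments.
        for book in doc.findall("./book[price='40']"):
            print(book.findtext("title"))  # -> Query Processing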

  17. Link invariant and $G_2$ web space

    OpenAIRE

    Sakamoto, Takuro; Yonezawa, Yasuyoshi

    2017-01-01

    In this paper, we reconstruct Kuperberg’s $G_2$ web space [5, 6]. We introduce a new web diagram (a trivalent graph with only double edges) and new relations between Kuperberg’s web diagrams and the new web diagram. Using the web diagrams, we give crossing formulas for the $R$-matrices associated to some irreducible representations of $U_q(G_2)$ and calculate $G_2$ quantum link invariants for generalized twist links.

  18. Geospatial Semantics and the Semantic Web

    CERN Document Server

    Ashish, Naveen

    2011-01-01

    Geographic and geospatial information and services, especially on the open Web, have become abundant in the last several years with the proliferation of online maps, geo-coding services, geospatial Web services and geospatially enabled applications. The need for geospatial reasoning has significantly increased in many everyday applications, including personal digital assistants, Web search applications, location-aware mobile services, specialized systems for emergency response, medical triaging, intelligence analysis and more. Geospatial Semantics and the Semantic Web: Foundation ...

  19. Responsive web design with jQuery

    CERN Document Server

    Carlos, Gilberto

    2013-01-01

    Responsive Web Design with jQuery follows a standard tutorial-based approach, covering various aspects of responsive web design by building a comprehensive website. "Responsive Web Design with jQuery" is aimed at web designers who are interested in building device-agnostic websites. You should have a grasp of standard HTML, CSS, and JavaScript development, and have a familiarity with graphic design. Some exposure to jQuery and HTML5 will be beneficial but isn't essential.

  20. From ‘Gads’ to ‘Apps’: the key challenges of post-web internet era

    Directory of Open Access Journals (Sweden)

    Alan César Belo Angeluci

    2013-12-01

    Full Text Available This paper aims to discuss the main changes that devices' applications have undergone over the past decades, in order to show how internet connectivity has deeply influenced the way people communicate and interact with this medium. From the Internet of Things (IoT) perspective, the Hannibal series and its second-screen app are briefly presented as a case supporting the key challenges of this post-web internet era: (1) protection of sensitive and private user data in a ubiquitous environment, (2) interoperability, (3) communication, and (4) proper language.