WorldWideScience

Sample records for inaccessible web part

  1. Technical Evaluation Report 60: The World-Wide Inaccessible Web, Part 1: Browsing

    Science.gov (United States)

    Baggaley, Jon; Batpurev, Batchuluun

    2007-01-01

    Two studies are reported, comparing the browser loading times of webpages created using common Web development techniques. The loading speeds were estimated in 12 Asian countries by members of the "PANdora" network, funded by the International Development Research Centre (IDRC) to conduct collaborative research in the development of…

  2. Technical Evaluation Report 61: The World-Wide Inaccessible Web, Part 2: Internet routes

    Directory of Open Access Journals (Sweden)

    Jim Klaas

    2007-06-01

    In the previous report in this series, Web browser loading times were measured in 12 Asian countries and were found to be up to four times slower than commonly prescribed as acceptable. Failure of webpages to load at all was frequent. The current follow-up study compares these loading times with the complexity of the Internet routes linking the Web users and the servers hosting the pages. The study was conducted in the same 12 Asian countries, with the assistance of members of the International Development Research Centre’s PANdora distance education research network. The data were generated by network members in Bhutan, Cambodia, India, Indonesia, Laos, Mongolia, the Philippines, Sri Lanka, Pakistan, Singapore, Thailand, and Vietnam. Additional data for the follow-up study were collected in China. Using a ‘traceroute’ routine, the study indicates that webpage loading time is linked to the complexity of the Internet routes between Web users and the host server. The findings suggest that distance educators can apply such information in the design of improved online delivery and mirror sites, notably in areas of the developing world which currently lack an effective infrastructure for online education.
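
    As an illustration of the kind of measurement the two reports describe (timing a page download and relating it to the number of Internet hops to the server), the following is a minimal Python sketch. It is not the PANdora test procedure; the test URL is a placeholder, and it assumes a Unix-like system with the traceroute utility installed.

      # Hedged sketch: time one page download and count traceroute hops to its host.
      import subprocess
      import time
      import urllib.request
      from urllib.parse import urlparse

      def page_load_seconds(url):
          """Download the page body once and return the elapsed wall-clock time."""
          start = time.monotonic()
          with urllib.request.urlopen(url, timeout=60) as resp:
              resp.read()
          return time.monotonic() - start

      def hop_count(host):
          """Count the hops reported by the system 'traceroute' utility."""
          out = subprocess.run(["traceroute", "-n", host],
                               capture_output=True, text=True, timeout=120)
          return max(len(out.stdout.splitlines()) - 1, 0)  # first line is a header

      if __name__ == "__main__":
          url = "http://example.org/"          # placeholder test page
          host = urlparse(url).hostname
          print(url, round(page_load_seconds(url), 2), "s,", hop_count(host), "hops")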

  3. Global Web Accessibility Analysis of National Government Portals and Ministry Web Sites

    DEFF Research Database (Denmark)

    Goodwin, Morten; Susar, Deniz; Nietzio, Annika

    2011-01-01

    Equal access to public information and services for all is an essential part of the United Nations (UN) Declaration of Human Rights. Today, the Web plays an important role in providing information and services to citizens. Unfortunately, many government Web sites are poorly designed and have accessibility barriers that prevent people with disabilities from using them. This article combines current Web accessibility benchmarking methodologies with a sound strategy for comparing Web accessibility among countries and continents. Furthermore, the article presents the first global analysis of the Web accessibility of 192 United Nations Member States made publicly available. The article also identifies common properties of Member States that have accessible and inaccessible Web sites and shows that implementing antidisability discrimination laws is highly beneficial for the accessibility of Web sites, while…

  4. Keeping Luxury Inaccessible

    OpenAIRE

    Ward, David; Chiari, Claudia

    2008-01-01

    This paper sets out to explain and decipher luxury, and especially inaccessible luxury, with the intent of providing enterprises with three new analytical tools to ensure they stay ‘in front of the pack’. The paper starts by assessing what luxury was and is today, and how and why it has evolved so far. It looks at Mass and Intermediate luxuries and then discusses three models for assessing Inaccessible luxury as well. The three models specifically developed by the authors are: 1. The Tangibility of Luxury,...

  5. Emperor penguins nesting on Inaccessible Island

    Science.gov (United States)

    Jonkel, G.M.; Llano, G.A.

    1975-01-01

    Emperor penguins were observed nesting on Inaccessible I. during the 1973 winter. This is the southernmost nesting of emperor penguins thus far recorded; it also could be the first record of emperors attempting to start a new rookery. This site, however, may have been used by emperors in the past. The closest reported nesting of these penguins to Inaccessible I. is on the Ross Ice Shelf east of Cape Crozier. With the exception of the Inaccessible I. record, there is little evidence that emperor penguins breed in McMurdo Sound proper.

  6. 77 FR 45297 - Children's Toys and Child Care Articles Containing Phthalates; Proposed Guidance on Inaccessible...

    Science.gov (United States)

    2012-07-31

    ... CONSUMER PRODUCT SAFETY COMMISSION 16 CFR Part 1199 [Docket No. CPSC-2012-0040] Children's Toys... containing phthalates does not apply to any component part of children's toys or child care articles that is... guidance on inaccessible component parts in children's toys or child care articles subject to section 108...

  7. The Aalborg Survey / Part 1 - Web Based Survey

    DEFF Research Database (Denmark)

    Harder, Henrik; Christensen, Cecilie Breinholm

    Background and purpose: The Aalborg Survey consists of four independent parts: a web-, GPS- and interview-based survey and a literature study, which together form a consistent investigation and research into use of urban space, and specifically into young people’s use of urban space: what young …) and the research focus within the cluster of Mobility and Tracking Technologies (MoTT), AAU. Summary / Part 1 Web Based Survey: The 1st part of the research project Diverse Urban Spaces (DUS) has been carried out during the period from December 1st 2007 to February 1st 2008 as a Web Based Survey of the 27.040 gross… [statistikbanken.dk, a] young people aged 14-23 living in Aalborg Municipality in 2008. The web based questionnaire has been distributed among the group of young people studying at upper secondary schools in Aalborg, i.e. 7.680 young people [statistikbanken.dk, b]. The resulting data from those respondents who…

  8. Reviewing the design of DAML+OIL : An ontology language for the Semantic Web

    NARCIS (Netherlands)

    Horrocks, Ian; Patel-Schneider, Peter F.; Van Harmelen, Frank

    2002-01-01

    In the current "Syntactic Web", uninterpreted syntactic constructs are given meaning only by private off-line agreements that are inaccessible to computers. In the Semantic Web vision, this is replaced by a web where both data and its semantic definition are accessible and manipulable by computer

  9. Vegetation and checklist of Inaccessible Island, central South Atlantic Ocean, with notes on Nightingale Island

    Directory of Open Access Journals (Sweden)

    J. P. Roux

    1992-10-01

    The physiography and climate of Inaccessible and Nightingale Islands are briefly discussed. The vegetation and the major plant associations are described. Notes are given on the ecology and distribution of each taxon. Taxa newly recorded for Inaccessible Island include Agrostis goughensis, A. holgateana, A. wacei, Calamagrostis deschampsiiformis, Carex thouarsii var. recurvata, Conyza albida, Elaphoglossum campylolepium and Uncinia meridensis. One species, C. albida, is alien to the Tristan group. Two native ferns, Asplenium platybasis var. subnudum and Blechnum australe, were found on Nightingale Island for the first time, and the presence of introduced Malus domestica orchards was recorded. Two unidentified taxa were found that may represent new species: Elaphoglossum sp. at Inaccessible Island and Apium sp. at both Inaccessible and Nightingale Islands. The total number of vascular plant species recorded at Inaccessible and Nightingale Islands now stands at 98 and 43, respectively, of which 26 (28%) and seven (16%) are introduced species. Only Atriplex plebeja and two species of Cotula occur at Nightingale Island but are absent from Inaccessible Island.

  10. The Mathematical Microscope - Making the inaccessible accessible

    DEFF Research Database (Denmark)

    Ottesen, Johnny T.

    2011-01-01

    In this chapter we introduce a new term, the "Mathematical Microscope", as a method of using mathematics in accessing information about reality when this information is otherwise inaccessible. Furthermore, we discuss how models and experiments are related: none of which are important without th… of mathematical modeling is discussed for type 1 and type 2 diabetes, depression, cardiovascular diseases and the interactions between the combinations of these, the so-called gray triangle in the metabolic syndrome.

  11. A Fire Detector for Monitoring Inaccessible Areas in Aircrafts, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — En'Urga Inc. will evaluate the feasibility of utilizing reflected, multi-wavelength, near infrared radiation for detecting fires in inaccessible areas within...

  12. Vegetation and checklist of Inaccessible Island, central South Atlantic Ocean, with notes on Nightingale Island

    OpenAIRE

    J. P. Roux; P. G. Ryan; S. J. Milton; C. L. Moloney

    1992-01-01

    The physiography and climate of Inaccessible and Nightingale Islands are briefly discussed. The vegetation and the major plant associations are described. Notes are given on the ecology and distribution of each taxon. Taxa newly recorded for Inaccessible Island include Agrostis goughensis, A. holgateana, A. wacei, Calamagrostis deschampsiiformis, Carex thouarsii var. recurvata, Conyza albida, Elaphoglossum campylolepium and Uncinia meridensis. One species, C. albida, is alien to the Tristan...

  13. Alteration mineral mapping in inaccessible regions using target detection algorithms to ASTER data

    International Nuclear Information System (INIS)

    Pour, A B; Hashim, M; Park, Y

    2017-01-01

    In this study, the application of target detection algorithms such as Constrained Energy Minimization (CEM), Orthogonal Subspace Projection (OSP) and Adaptive Coherence Estimator (ACE) to shortwave infrared bands of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data was investigated to extract geological information for alteration mineral mapping in poorly exposed lithologies in inaccessible domains. The Oscar II Coast area, north-eastern Graham Land, Antarctic Peninsula (AP), was selected for this study to develop a satellite-based remote sensing mapping technique. It is an inaccessible region due to the remoteness of many rock exposures and the necessity of travelling over severe mountainous and glacier-covered terrain for geological field mapping and sample collection. Fractional abundances of alteration minerals such as muscovite, kaolinite, illite, montmorillonite, epidote, chlorite and biotite were identified in alteration zones using the CEM, OSP and ACE algorithms in poorly mapped and unmapped zones at district scale for the Oscar II Coast area. The results of this investigation demonstrate the applicability of ASTER shortwave infrared spectral data for lithological and alteration mineral mapping in poorly exposed lithologies and inaccessible regions, particularly using image processing algorithms that are capable of detecting sub-pixel targets in remotely sensed images where no prior information is available. (paper)
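
    Of the three detectors named above, Constrained Energy Minimization has a particularly compact closed form: the filter is w = R^-1 d / (d^T R^-1 d), where R is the sample correlation matrix of the pixel spectra and d is the target signature. The Python/NumPy sketch below is illustrative only; the ASTER SWIR cube and the mineral reference spectrum are random placeholders, not data from the study.

      # Minimal sketch of Constrained Energy Minimization (CEM) on placeholder data.
      import numpy as np

      def cem(pixels, target):
          """pixels: (n_pixels, n_bands) reflectance matrix; target: (n_bands,) signature.
          Returns a per-pixel detection score with unit response to the target spectrum."""
          n = pixels.shape[0]
          R = pixels.T @ pixels / n                 # sample correlation matrix
          Rinv_d = np.linalg.solve(R, target)       # R^-1 d without an explicit inverse
          w = Rinv_d / (target @ Rinv_d)            # CEM filter vector
          return pixels @ w                         # detection scores

      rng = np.random.default_rng(0)
      cube = rng.random((10_000, 6))                # e.g. 6 SWIR bands, cube flattened to pixels
      muscovite = rng.random(6)                     # placeholder reference spectrum
      scores = cem(cube, muscovite)
      print(scores.shape, float(scores.mean()))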

  14. A Virtual Learning Environment for Part-Time MASW Students: An Evaluation of the WebCT

    Science.gov (United States)

    Chan, Charles C.; Tsui, Ming-sum; Chan, Mandy Y. C.; Hong, Joe H.

    2008-01-01

    This study aims to evaluate the perception of a cohort of social workers studying for a part-time master's program in social work in using the popular Web-based learning platform, World Wide Web Course Tools (WebCT), as a complementary method of teaching and learning. It was noted that the social work profession began incorporating computer technology…

  15. An Energy-Based State Observer for Dynamical Subsystems with Inaccessible State Variables

    NARCIS (Netherlands)

    Khalil, I.S.M.; Sabanovic, Asif; Misra, Sarthak

    2012-01-01

    This work presents an energy-based state estimation formalism for a class of dynamical systems with inaccessible/unknown outputs, and for systems in which sensor utilization is impractical or measurements cannot be taken. The power-conserving physical interconnections among most of the dynamical

  16. Development of Web Accessibility: Policies, Theories and Approaches

    Institute of Scientific and Technical Information of China (English)

    Xiaoming Zeng

    2006-01-01

    The article is intended to introduce readers to the concept and background of Web accessibility in the United States. I will first discuss different definitions of Web accessibility. The beneficiaries of an accessible Web, or the sufferers from an inaccessible one, will be discussed based on the type of disability. The importance of Web accessibility will be introduced from ethical, demographic, legal, and financial perspectives. Web accessibility related standards and legislation will be discussed in great detail. Previous research on evaluating Web accessibility will be presented. Lastly, a system for automated Web accessibility transformation will be introduced as an alternative approach for enhancing Web accessibility.

  17. Probing the environment of an inaccessible system by a qubit ancilla

    International Nuclear Information System (INIS)

    Campbell, S.; Paternostro, M.; Kim, M. S.; Bose, S.

    2010-01-01

    We study the conditions for probing the environment affecting an inaccessible system by means of continuous interaction and measurements performed only on a probe. The scheme exploits the statistical properties of the probe at its steady state and simple data postprocessing. Our results, highlighting the roles played by interaction and entanglement in this process, are both pragmatically relevant and fundamentally interesting.

  18. Web Page Recommendation Using Web Mining

    OpenAIRE

    Modraj Bhavsar; Mrs. P. M. Chavan

    2014-01-01

    On the World Wide Web, various kinds of content are generated in huge amounts, so giving relevant results to users through web recommendation has become an important part of web applications. Different kinds of web recommendations are made available to users every day, including images, video, audio, query suggestions and web pages. In this paper we aim to provide a framework for web page recommendation. 1) First we describe the basics of web mining and the types of web mining. 2) Details of each…

  19. Percutaneous Transhepatic Drainage of Inaccessible Abdominal Abscesses Following Abdominal Surgery Under Real-Time CT-Fluoroscopic Guidance

    International Nuclear Information System (INIS)

    Yamakado, Koichiro; Takaki, Haruyuki; Nakatsuka, Atsuhiro; Kashima, Masataka; Uraki, Junji; Yamanaka, Takashi; Takeda, Kan

    2010-01-01

    This study retrospectively evaluated the safety, feasibility, and clinical utility of transhepatic drainage of inaccessible abdominal abscesses under real-time computed tomographic (CT) guidance. Twelve consecutive patients received percutaneous transhepatic drainage for abdominal abscesses. The abscesses were considered inaccessible by the usual access route because they were surrounded by the liver and other organs. The maximum diameters of the abscesses were 4.6-9.5 cm (mean, 6.7 ± 1.4 cm). An 8-Fr catheter was advanced into the abscess cavity through the liver parenchyma using real-time CT fluoroscopic guidance. Safety, feasibility, procedure time, and clinical utility were evaluated. Drainage catheters were placed through the liver parenchyma into the abscess cavities with no complications in all patients. The mean procedure time was 18.8 ± 9.2 min (range, 12-41 min). All abscesses were drained, and they shrank immediately after catheter placement. In conclusion, this transhepatic approach under real-time CT fluoroscopic guidance is a safe, feasible, and useful technique for drainage of inaccessible abdominal abscesses.

  20. Cytoskeleton reorganization/disorganization is a key feature of induced inaccessibility for defence to successive pathogen attacks.

    Science.gov (United States)

    Moral, Juan; Montilla-Bascón, Gracia; Canales, Francisco J; Rubiales, Diego; Prats, Elena

    2017-06-01

    In this work, we investigated the involvement of the long-term dynamics of cytoskeletal reorganization on the induced inaccessibility phenomenon by which cells that successfully defend against a previous fungal attack become highly resistant to subsequent attacks. This was performed on pea through double inoculation experiments using inappropriate (Blumeria graminis f. sp. avenae, Bga) and appropriate (Erysiphe pisi, Ep) powdery mildew fungi. Pea leaves previously inoculated with Bga showed a significant reduction of later Ep infection relative to leaves inoculated only with Ep, indicating that cells had developed induced inaccessibility. This reduction in Ep infection was higher when the time interval between Bga and Ep inoculation ranged between 18 and 24 h, although increased penetration resistance in co-infected cells was observed even with time intervals of 24 days between inoculations. Interestingly, this increase in resistance to Ep following successful defence to the inappropriate Bga was associated with an increase in actin microfilament density that reached a maximum at 18-24 h after Bga inoculation and very slowly decreased afterwards. The putative role of cytoskeleton reorganization/disorganization leading to inaccessibility is supported by the suppression of the induced resistance mediated by specific actin (cytochalasin D, latrunculin B) or general protein (cycloheximide) inhibitors. © 2016 BSPP AND JOHN WILEY & SONS LTD.

  1. Enabling web users and developers to script accessibility with Accessmonkey.

    Science.gov (United States)

    Bigham, Jeffrey P; Brudvik, Jeremy T; Leung, Jessica O; Ladner, Richard E

    2009-07-01

    Efficient web access remains elusive for blind computer users. Previous efforts to improve web accessibility have focused on developer awareness, automated improvement, and legislation, but these approaches have left remaining concerns. First, while many tools can help produce accessible content, most are difficult to integrate into existing developer workflows and rarely offer specific suggestions that developers can implement. Second, tools that automatically improve web content for users generally solve specific problems and are difficult to combine and use on a diversity of existing assistive technology. Finally, although blind web users have proven adept at overcoming the shortcomings of the web and existing tools, they have been only marginally involved in improving the accessibility of their own web experience. In a step toward addressing these concerns, we have developed Accessmonkey, a common scripting framework that web users, web developers and web researchers can use to collaboratively improve accessibility. This framework advances the idea that Javascript and dynamic web content can be used to improve inaccessible content instead of being a cause of it. Using Accessmonkey, web users and developers on different platforms and with potentially different goals can collaboratively make the web more accessible. In this article, we first present the design of the Accessmonkey framework and offer several example scripts that demonstrate the utility of our approach. We conclude by discussing possible future extensions that will provide easy access to scripts as users browse the web and enable non-technical blind users to independently create and share improvements.
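
    Accessmonkey itself is a JavaScript framework, so the following Python sketch is only a rough analogue of one transformation such scripts perform: patching images that lack alt text so a screen reader has something to announce. The helper name and the file-name fallback are illustrative assumptions, not part of Accessmonkey.

      # Rough analogue (not Accessmonkey): add placeholder alt text to images lacking it.
      from bs4 import BeautifulSoup  # pip install beautifulsoup4

      def add_placeholder_alt(html):
          soup = BeautifulSoup(html, "html.parser")
          for img in soup.find_all("img"):
              if not img.get("alt"):
                  # Fall back to the file name; a real tool would ask the author or an
                  # image-description service for meaningful text instead.
                  img["alt"] = (img.get("src") or "image").rsplit("/", 1)[-1]
          return str(soup)

      print(add_placeholder_alt('<p><img src="charts/fig1.png"></p>'))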

  2. Lipids: Part of the tangled web

    Energy Technology Data Exchange (ETDEWEB)

    Krauss, R.M.

    1992-08-01

    Analysis of LDL subclasses by non-denaturing gradient gel electrophoresis has led to the identification of a subclass pattern characterized by predominance of small LDL, designated LDL subclass pattern B. The prevalence of pattern B in the general population is approximately 25%, but varies as a function of age and gender, being relatively uncommon in children and in premenopausal women. The remainder of the population has a predominance of larger LDL (pattern A) or an intermediate pattern. Our findings indicate that LDL subclass pattern B is an integral part of the "tangled web" of interrelated coronary disease risk factors associated with insulin resistance. It may be that the pathologic features of this lipoprotein profile, including the relative atherogenicity of small, dense LDL and IDL, contribute importantly to the increased risk of cardiovascular disease in subjects with insulin resistance and hypertension. Furthermore, pattern B serves as a marker for a common genetic trait which may underlie a substantial portion of the familial predisposition to coronary artery disease in the general population. Studies of hormonal, dietary, and pharmacologic influences on expression of this atherogenic phenotype should lead to more effective identification and management of high-risk individuals, and improved approaches to disease prevention in high-risk families.

  3. Lipids: Part of the tangled web

    Energy Technology Data Exchange (ETDEWEB)

    Krauss, R.M.

    1992-08-01

    Analysis of LDL subclasses by non-denaturing gradient gel electrophoresis has led to the identification of a subclass pattern characterized by predominance of small LDL, designated LDL subclass pattern B. The prevalence of pattern B in the general population is approximately 25%, but varies as a function of age and gender, being relatively uncommon in children and in premenopausal women. The remainder of the population has a predominance of larger LDL (pattern A) or an intermediate pattern. Our findings indicate that LDL subclass pattern B is an integral part of the "tangled web" of interrelated coronary disease risk factors associated with insulin resistance. It may be that the pathologic features of this lipoprotein profile, including the relative atherogenicity of small, dense LDL and IDL, contribute importantly to the increased risk of cardiovascular disease in subjects with insulin resistance and hypertension. Furthermore, pattern B serves as a marker for a common genetic trait which may underlie a substantial portion of the familial predisposition to coronary artery disease in the general population. Studies of hormonal, dietary, and pharmacologic influences on expression of this atherogenic phenotype should lead to more effective identification and management of high-risk individuals, and improved approaches to disease prevention in high-risk families.

  4. A University Web Portal redesign applying accessibility patterns. Breaking Down Barriers for Visually Impaired Users

    Directory of Open Access Journals (Sweden)

    Hernán Sosa

    2015-08-01

    Undoubtedly, the WWW and ICTs have become the preferred media for the interaction between society and its citizens, and public and private organizations today have the possibility of deploying their activities through the Web. In particular, university education is a domain where the benefits of these technological resources can strongly contribute to caring for students. However, most university Web portals are inaccessible to their user community (students, professors, and non-teaching staff, among others), since these portals do not take into account the needs of people with different capabilities. In this work, we propose an accessibility-pattern-driven process for the redesign of university Web portals, aiming to break down barriers for visually impaired users. The approach is applied to a real case study: the Web portal of Universidad Nacional de la Patagonia Austral (UNPA). The results come from applying accessibility recommendations and evaluation tools (automatic and manual) from internationally recognized organizations to both versions of the Web portal: the original and the redesigned one.

  5. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part 1: Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    This is the first part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.
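
    As a hedged illustration of the data elements Part I draws from web server logs, the sketch below parses the common Apache/NCSA "combined" log format in Python and tallies requested paths and status codes. The sample log line is invented and the format assumption may not match every server configuration.

      # Minimal sketch: extract host, path, status, referrer and user agent from
      # Apache "combined" format log lines and summarize page and status counts.
      import re
      from collections import Counter

      LINE = re.compile(
          r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
          r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
          r'(?: "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)")?'
      )

      def summarize(log_lines):
          pages, statuses = Counter(), Counter()
          for line in log_lines:
              m = LINE.match(line)
              if not m:
                  continue  # skip malformed lines
              pages[m["path"]] += 1
              statuses[m["status"]] += 1
          return pages.most_common(10), statuses

      sample = ['127.0.0.1 - - [10/Mar/2007:13:55:36 -0700] '
                '"GET /index.html HTTP/1.0" 200 2326 "-" "Mozilla/5.0"']
      print(summarize(sample))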

  6. Reactor units for power supply of remote and inaccessible regions: Selection issue

    Directory of Open Access Journals (Sweden)

    Melnikov N.N.

    2015-06-01

    The paper briefly presents problem aspects of power supply for the remote and inaccessible regions of Russia. Reactor units of different types and installed electric capacities have been considered in relation to the issue of power supply during mineral deposit development in the Chukotka autonomous region, Yakutia and the Irkutsk region. A preliminary assessment of the possible options for the use of small nuclear power plants in various sectors of energy consumption has been carried out based on the analysis of different scenarios for economic development of the regions considered

  7. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    Science.gov (United States)

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…

  8. Het WEB leert begrijpen (The Web learns to understand)

    CERN Multimedia

    Stroeykens, Steven

    2004-01-01

    The Web could be much more useful if computers understood something of the information on Web pages. That is the goal of the "Semantic Web", a project in which, among others, Tim Berners-Lee, the inventor of the original Web, takes part

  9. Investigating flood susceptible areas in inaccessible regions using remote sensing and geographic information systems.

    Science.gov (United States)

    Lim, Joongbin; Lee, Kyoo-Seock

    2017-03-01

    Every summer, North Korea (NK) suffers from floods, resulting in decreased agricultural production and huge economic losses. Besides meteorological causes, several factors can accelerate flood damage. Environmental studies of NK are difficult because the country is inaccessible due to the division of Korea. Remote sensing (RS) can be used to delineate flood-inundated areas in inaccessible regions such as NK. The objective of this study was to investigate the spatial characteristics of flood susceptible areas (FSAs) using multi-temporal RS data and digital elevation model data. Such a study will provide basic information for restoring FSAs after reunification. Defining FSAs at the study site revealed that rice paddies with low elevation and low slope were the areas most susceptible to flooding in NK. Numerous sediments from upper streams, especially streams passing through crop fields on steeply sloped hills, might have been transported and deposited into stream channels, thus disturbing water flow. In conclusion, NK floods may have occurred not only due to meteorological factors but also due to inappropriate land use for flood management. In order to mitigate NK flood damage, reforestation of terraced crop fields is needed. In addition, the drainage capacity of middle stream channels near rice paddies should be improved.
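
    A minimal sketch of how the factors the study associates with flood susceptibility (paddy land use, low elevation, gentle slope) could be combined into a raster mask is shown below. The arrays, class coding and thresholds are hypothetical placeholders, not the study's actual criteria.

      # Illustrative sketch only: flag cells that are paddies with low elevation and low slope.
      import numpy as np

      rng = np.random.default_rng(1)
      elevation = rng.uniform(0, 500, (100, 100))   # metres, e.g. from a DEM
      slope = rng.uniform(0, 30, (100, 100))        # degrees, derived from the DEM
      landcover = rng.integers(0, 4, (100, 100))    # 0 = rice paddy in this toy coding

      PADDY, MAX_ELEV_M, MAX_SLOPE_DEG = 0, 50.0, 3.0  # hypothetical thresholds
      susceptible = ((landcover == PADDY)
                     & (elevation < MAX_ELEV_M)
                     & (slope < MAX_SLOPE_DEG))
      print(f"{susceptible.mean():.1%} of cells flagged as flood susceptible")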

  10. Web-based control application using WebSocket

    International Nuclear Information System (INIS)

    Furukawa, Y.

    2012-01-01

    WebSocket allows asynchronous full-duplex communication between a Web-based (i.e. JavaScript-based) application and a Web server. WebSocket started as a part of HTML5 standardization but has since been separated from HTML5 and developed independently. Using WebSocket, it becomes easy to develop platform-independent presentation-layer applications for accelerator and beamline control software. In addition, a Web browser is the only application program that needs to be installed on the client computer. WebSocket-based applications communicate with the WebSocket server using simple text-based messages, so WebSocket is applicable to message-based control systems like MADOCA, which was developed for the SPring-8 control system. A simple WebSocket server for the MADOCA control system and a simple motor control application were successfully made as a first trial of a WebSocket control application. Using Google Chrome (version 13.0) on Debian/Linux and Windows 7, Opera (version 11.0) on Debian/Linux, and Safari (version 5.0.3) on Mac OS X as clients, the motors could be controlled using a WebSocket-based Web application. A diffractometer control application for use in synchrotron radiation diffraction experiments was also developed. (author)
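
    The sketch below shows the text-message style of WebSocket control the paper describes, using Python's third-party websockets package rather than a browser client. The server URL and command string are invented placeholders and do not reflect the MADOCA message format.

      # Hedged sketch: send a text command over WebSocket and read the reply.
      # Requires: pip install websockets
      import asyncio
      import websockets

      async def move_motor(uri, motor, position):
          async with websockets.connect(uri) as ws:
              await ws.send(f"put/{motor}/target_position {position}")  # made-up command
              return await ws.recv()                                     # server's text reply

      if __name__ == "__main__":
          reply = asyncio.run(move_motor("ws://localhost:8765/control", "motor1", 1500))
          print("server replied:", reply)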

  11. Plugging inaccessible leaks in cooling water pipework in nuclear power plants

    International Nuclear Information System (INIS)

    Powell, A.B.; May, R.; Down, M.G.

    1988-01-01

    The manifestation of initially small leaks in ancillary reactor cooling water systems is not an unusual event. Often these leaks are in virtually inaccessible locations - for example, buried in thick concrete shielding or situated in cramped and highly radioactive vaults. Such leaks may ultimately prejudice the availability of the entire nuclear system. Continued operation without repair can result in the leak becoming larger, and the leaking water can cause further corrosion problems and interfere with instrumentation. In addition, the water may increase the volume of radwaste. In short, initially trivial leaks may cause significant operating problems. This paper describes the sealing of such leaks in the biological shield cooling system of Ontario Hydro's Pickering nuclear generating station CANDU reactors

  12. Process-oriented semantic web search

    CERN Document Server

    Tran, DT

    2011-01-01

    The book is composed of two main parts. The first part is a general study of Semantic Web Search. The second part specifically focuses on the use of semantics throughout the search process, compiling a big picture of Process-oriented Semantic Web Search from different pieces of work that target specific aspects of the process. In particular, this book provides a rigorous account of the concepts and technologies proposed for searching resources and semantic data on the Semantic Web. To collate the various approaches and to better understand what the notion of Semantic Web Search entails, this bo

  13. Practical tool to assess reliability of web-based medicines information.

    Science.gov (United States)

    Lebanova, Hristina; Getov, Ilko; Grigorov, Evgeni

    2014-02-01

    Information disseminated by medicines information systems is not always easy to apply. Nowadays the internet provides access to an enormous volume and range of health information that was previously inaccessible both to medical specialists and to consumers. The aim of this study is to assess the internet as a source of drug- and health-related information and to create a test methodology to evaluate the 10 most visited health-related websites in Bulgaria. Using existing scientific methodologies for the evaluation of web sources, a new three-step algorithm consisting of score-card validation of the drug-related information on the 10 most visited Bulgarian health-related websites was created. In many cases the drug information on these sites contained errors and discrepancies. Some of the published materials were not validated; they were out of date and could cause confusion for consumers. The quality of online health information is a source of considerable information noise and a threat to patients' safety and rational drug use. There is a need to monitor the drug information available online in order to prevent patient misinformation and confusion that could lead to medication errors and abuse.

  14. Increasing value and reducing waste: addressing inaccessible research.

    Science.gov (United States)

    Chan, An-Wen; Song, Fujian; Vickers, Andrew; Jefferson, Tom; Dickersin, Kay; Gøtzsche, Peter C; Krumholz, Harlan M; Ghersi, Davina; van der Worp, H Bart

    2014-01-18

    The methods and results of health research are documented in study protocols, full study reports (detailing all analyses), journal reports, and participant-level datasets. However, protocols, full study reports, and participant-level datasets are rarely available, and journal reports are available for only half of all studies and are plagued by selective reporting of methods and results. Furthermore, information provided in study protocols and reports varies in quality and is often incomplete. When full information about studies is inaccessible, billions of dollars in investment are wasted, bias is introduced, and research and care of patients are detrimentally affected. To help to improve this situation at a systemic level, three main actions are warranted. First, academic institutions and funders should reward investigators who fully disseminate their research protocols, reports, and participant-level datasets. Second, standards for the content of protocols and full study reports and for data sharing practices should be rigorously developed and adopted for all types of health research. Finally, journals, funders, sponsors, research ethics committees, regulators, and legislators should endorse and enforce policies supporting study registration and wide availability of journal reports, full study reports, and participant-level datasets. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Creating a web-based digital photographic archive: one hospital library's experience.

    Science.gov (United States)

    Marshall, Caroline; Hobbs, Janet

    2017-04-01

    Cedars-Sinai Medical Center is a nonprofit community hospital based in Los Angeles. Its history spans over 100 years, and its growth and development from the merging of 2 Jewish hospitals, Mount Sinai and Cedars of Lebanon, is also part of the history of Los Angeles. The medical library collects and maintains the hospital's photographic archive, to which retiring physicians, nurses, and an active Community Relations Department have donated photographs over the years. The collection was growing rapidly, it was impossible to display all the materials, and much of the collection was inaccessible to patrons. The authors decided to make the photographic collection more accessible to medical staff and researchers by purchasing a web-based digital archival package, Omeka. We decided what material should be digitized by analyzing archival reference requests and considering the institution's plan to create a Timeline Wall documenting and celebrating the history of Cedars-Sinai. Within 8 months, we digitized and indexed over 500 photographs. The digital archive now allows patrons and researchers to access the history of the hospital and enables the library to process archival references more efficiently.

  16. Web archives

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2018-01-01

    This article deals with general web archives and the principles for selection of materials to be preserved. It opens with a brief overview of reasons why general web archives are needed. Sections two and three present major, long-term web archive initiatives, discuss the purposes and possible values of web archives, and ask how to meet unknown future needs, demands and concerns. Section four analyses three main principles in contemporary web archiving strategies - topic-centric, domain-centric and time-centric archiving strategies - and section five discusses how to combine these to provide a broad and rich archive. Section six is concerned with inherent limitations and why web archives are always flawed. The last sections deal with the question of how web archives may fit into the rapidly expanding, but fragmented, landscape of digital repositories taking care of various parts…

  17. Designing a responsive web site

    OpenAIRE

    Fejzić , Diana

    2016-01-01

    Due to the increasing prevalence of smartphones and tablet computers, responsive design has become a crucial part of web design. For a user, responsive web design enables the best user experience, regardless of whether the user is visiting the site via a mobile phone, a tablet or a computer. This thesis covers the process of planning, designing and developing a responsive web site, for a fictitious company named "Creative Design d.o.o.", with the help of web technologies. In the initial part of the thesis, w...

  18. Defrosting the digital library: bibliographic tools for the next generation web.

    Science.gov (United States)

    Hull, Duncan; Pettifer, Steve R; Kell, Douglas B

    2008-10-01

    Many scientists now manage the bulk of their bibliographic information electronically, thereby organizing their publications and citation material from digital libraries. However, a library has been described as "thought in cold storage," and unfortunately many digital libraries can be cold, impersonal, isolated, and inaccessible places. In this Review, we discuss the current chilly state of digital libraries for the computational biologist, including PubMed, IEEE Xplore, the ACM digital library, ISI Web of Knowledge, Scopus, Citeseer, arXiv, DBLP, and Google Scholar. We illustrate the current process of using these libraries with a typical workflow, and highlight problems with managing data and metadata using URIs. We then examine a range of new applications such as Zotero, Mendeley, Mekentosj Papers, MyNCBI, CiteULike, Connotea, and HubMed that exploit the Web to make these digital libraries more personal, sociable, integrated, and accessible places. We conclude with how these applications may begin to help achieve a digital defrost, and discuss some of the issues that will help or hinder this in terms of making libraries on the Web warmer places in the future, becoming resources that are considerably more useful to both humans and machines.

  19. Defrosting the digital library: bibliographic tools for the next generation web.

    Directory of Open Access Journals (Sweden)

    Duncan Hull

    2008-10-01

    Many scientists now manage the bulk of their bibliographic information electronically, thereby organizing their publications and citation material from digital libraries. However, a library has been described as "thought in cold storage," and unfortunately many digital libraries can be cold, impersonal, isolated, and inaccessible places. In this Review, we discuss the current chilly state of digital libraries for the computational biologist, including PubMed, IEEE Xplore, the ACM digital library, ISI Web of Knowledge, Scopus, Citeseer, arXiv, DBLP, and Google Scholar. We illustrate the current process of using these libraries with a typical workflow, and highlight problems with managing data and metadata using URIs. We then examine a range of new applications such as Zotero, Mendeley, Mekentosj Papers, MyNCBI, CiteULike, Connotea, and HubMed that exploit the Web to make these digital libraries more personal, sociable, integrated, and accessible places. We conclude with how these applications may begin to help achieve a digital defrost, and discuss some of the issues that will help or hinder this in terms of making libraries on the Web warmer places in the future, becoming resources that are considerably more useful to both humans and machines.

  20. Writing for the web composing, coding, and constructing web sites

    CERN Document Server

    Applen, JD

    2013-01-01

    Writing for the Web unites theory, technology, and practice to explore writing and hypertext for website creation. It integrates such key topics as XHTML/CSS coding, writing (prose) for the Web, the rhetorical needs of the audience, theories of hypertext, usability and architecture, and the basics of web site design and technology. Presenting information in digestible parts, this text enables students to write and construct realistic and manageable Web sites with a strong theoretical understanding of how online texts communicate to audiences. Key features of the book

  1. Inaccessibility of reinforcement increases persistence and signaling behavior in the fox squirrel (Sciurus niger).

    Science.gov (United States)

    Delgado, Mikel M; Jacobs, Lucia F

    2016-05-01

    Under natural conditions, wild animals encounter situations where previously rewarded actions do not lead to reinforcement. In the laboratory, a surprising omission of reinforcement induces behavioral and emotional responses described as frustration. Frustration can lead to aggressive behaviors and to the persistence of noneffective responses, but it may also lead to new behavioral responses to a problem, a potential adaptation. We assessed the responses to inaccessible reinforcement in free-ranging fox squirrels (Sciurus niger). We trained squirrels to open a box to obtain food reinforcement, a piece of walnut. After 9 training trials, squirrels were tested in 1 of 4 conditions: a control condition with the expected reward, an alternative reinforcement (a piece of dried corn), an empty box, or a locked box. We measured the presence of signals suggesting arousal (e.g., tail flags and tail twitches) and found that squirrels performed fewer of these behaviors in the control condition and increased certain behaviors (tail flags, biting box) in the locked box condition, compared to other experimental conditions. When faced with nonreinforcement, that is, frustration, squirrels increased the number of interactions with the apparatus and spent more time interacting with the apparatus. This study of frustration responses in a free-ranging animal extends the conclusions of captive studies to the field and demonstrates that fox squirrels show short-term negatively valenced responses to the inaccessibility, omission, and change of reinforcement. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. Recent Advances in Immersive Visualization of Ocean Data: Virtual Reality Through the Web on Your Laptop Computer

    Science.gov (United States)

    Hermann, A. J.; Moore, C.; Soreide, N. N.

    2002-12-01

    Ocean circulation is irrefutably three dimensional, and powerful new measurement technologies and numerical models promise to expand our three-dimensional knowledge of the dynamics further each year. Yet, most ocean data and model output is still viewed using two-dimensional maps. Immersive visualization techniques allow the investigator to view their data as a three dimensional world of surfaces and vectors which evolves through time. The experience is not unlike holding a part of the ocean basin in one's hand, turning and examining it from different angles. While immersive, three dimensional visualization has been possible for at least a decade, the technology was until recently inaccessible (both physically and financially) for most researchers. It is not yet fully appreciated by practicing oceanographers how new, inexpensive computing hardware and software (e.g. graphics cards and controllers designed for the huge PC gaming market) can be employed for immersive, three dimensional, color visualization of their increasingly huge datasets and model output. In fact, the latest developments allow immersive visualization through web servers, giving scientists the ability to "fly through" three-dimensional data stored half a world away. Here we explore what additional insight is gained through immersive visualization, describe how scientists of very modest means can easily avail themselves of the latest technology, and demonstrate its implementation on a web server for Pacific Ocean model output.

  3. Pulmonary balloon angioplasty of chronic thromboembolic pulmonary hypertension (CTEPH) in surgically inaccessible cases

    International Nuclear Information System (INIS)

    Pitton, M.B.; Herber, S.; Thelen, M.; Mayer, E.

    2003-01-01

    The clinical course of patients suffering from chronic thromboembolic pulmonary hypertension (CTEPH) depends on the distribution pattern of the thromboembolic material. In patients with thromboembolic findings in the central pulmonary segments, pulmonary thrombendarterectomy (PTE) has excellent results and acceptable operative risk. This paper presents two surgically inaccessible cases that were successfully treated with balloon pulmonary angioplasty. Balloon angioplasty improved parenchymal perfusion, increased cardiac index (ΔCI + 19.2% [Case 1] and + 15.4% [Case 2]), reduced pulmonary vascular resistance during follow-up (ΔPVRI - 25.0% [1] and - 15.9% [2]), and is discussed as an alternative treatment option for cases not suited for surgery. (orig.)

  4. The Tip-of-the-Tongue Heuristic: How Tip-of-the-Tongue States Confer Perceptibility on Inaccessible Words

    Science.gov (United States)

    Cleary, Anne M.; Claxton, Alexander B.

    2015-01-01

    This study shows that the presence of a tip-of-the-tongue (TOT) state--the sense that a word is in memory when its retrieval fails--is used as a heuristic for inferring that an inaccessible word has characteristics that are consistent with greater word perceptibility. When reporting a TOT state, people judged an unretrieved word as more likely to…

  5. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    Science.gov (United States)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on modern web and GIS technologies. The paper describes the Web GIS for complex processing and visualization of geospatial (mainly NetCDF and PostGIS format) datasets as an integral part of the dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, which handles interaction with the computational core backend and the WMS/WFS/WPS cartographic services, and implements an open API for browser-based client software. Being the secondary of the two, this part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part: a Web GIS client developed as a "single page application" based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to that of popular desktop GIS applications such as uDig and QuantumGIS. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development. In line with general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map
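
    As a hedged illustration of how a client can consume the WMS services such a geoportal exposes, the sketch below issues a GetMap request with the OWSLib Python library. The service URL and layer name are placeholders, not the project's actual endpoints.

      # Illustrative WMS client sketch using OWSLib (pip install OWSLib).
      from owslib.wms import WebMapService

      # Placeholder endpoint; a real deployment would advertise its own WMS URL.
      wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
      print(list(wms.contents))                     # layers advertised by the server

      img = wms.getmap(layers=["climate:air_temperature_anomaly"],  # hypothetical layer
                       styles=[""],
                       srs="EPSG:4326",
                       bbox=(60.0, 50.0, 120.0, 80.0),              # lon/lat extent
                       size=(800, 600),
                       format="image/png",
                       transparent=True)
      with open("anomaly_map.png", "wb") as f:
          f.write(img.read())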

  6. Human Trafficking in the United States. Part II. Survey of U.S. Government Web Resources for Publications and Data

    Science.gov (United States)

    Panigabutra-Roberts, Anchalee

    2012-01-01

    This second part of a two-part series is a survey of U.S. government web resources on human trafficking in the United States, particularly of the online publications and data included on agencies' websites. Overall, the goal is to provide an introduction, an overview, and a guide on this topic for library staff to use in their research and…

  7. THE IMPORTANCE OF WEB DESIGN: VISUAL DESIGN EVALUATION OF DESTINATION WEB SITES

    OpenAIRE

    Fırlar, Belma; Okat Özdem, Özen

    2013-01-01

    As reflected in the literature, research on web site efficiency is mostly about site context, and analyses of function are mostly superficial, whereas controlling every part of a web site is necessary to demonstrate its efficiency. In this context, the visual design criteria that play an important role in users' perception of and response to web sites are put under the lens, and the web sites are evaluated by the heuristic evaluation method. The research focus of this s...

  8. Evaluation of WebEase: An Epilepsy Self-Management Web Site

    Science.gov (United States)

    DiIorio, Colleen; Escoffery, Cam; McCarty, Frances; Yeager, Katherine A.; Henry, Thomas R.; Koganti, Archana; Reisinger, Elizabeth L.; Wexler, Bethany

    2009-01-01

    People with epilepsy have various education needs and must adopt many self-management behaviors in order to control their condition. This study evaluates WebEase, an Internet-based, theory-driven, self-management program for adults with epilepsy. Thirty-five participants took part in a 6-week pilot implementation of WebEase. The main components of…

  9. Web Accessibility and Guidelines

    Science.gov (United States)

    Harper, Simon; Yesilada, Yeliz

    Access to, and movement around, complex online environments, of which the World Wide Web (Web) is the most popular example, has long been considered an important and major issue in the Web design and usability field. The commonly used slang phrase ‘surfing the Web’ implies rapid and free access, pointing to its importance among designers and users alike. It has also been long established that this potentially complex and difficult access is further complicated, and becomes neither rapid nor free, if the user is disabled. There are millions of people who have disabilities that affect their use of the Web. Web accessibility aims to help these people to perceive, understand, navigate, and interact with, as well as contribute to, the Web, and thereby the society in general. This accessibility is, in part, facilitated by the Web Content Accessibility Guidelines (WCAG) currently moving from version one to two. These guidelines are intended to encourage designers to make sure their sites conform to specifications, and in that conformance enable the assistive technologies of disabled users to better interact with the page content. In this way, it was hoped that accessibility could be supported. While this is in part true, guidelines do not solve all problems and the new WCAG version two guidelines are surrounded by controversy and intrigue. This chapter aims to establish the published literature related to Web accessibility and Web accessibility guidelines, and discuss limitations of the current guidelines and future directions.

  10. Primer on client-side web security

    CERN Document Server

    De Ryck, Philippe; Piessens, Frank; Johns, Martin

    2014-01-01

    This volume illustrates the continuous arms race between attackers and defenders of the Web ecosystem by discussing a wide variety of attacks. In the first part of the book, the foundation of the Web ecosystem is briefly recapped and discussed. Based on this model, the assets of the Web ecosystem are identified, and the set of capabilities an attacker may have are enumerated. In the second part, an overview of the web security vulnerability landscape is constructed. Included are selections of the most representative attack techniques reported in great detail. In addition to descriptions of the

  11. Web Security, Privacy & Commerce

    CERN Document Server

    Garfinkel, Simson

    2011-01-01

    Since the first edition of this classic reference was published, World Wide Web use has exploded and e-commerce has become a daily part of business and personal life. As Web use has grown, so have the threats to our security and privacy--from credit card fraud to routine invasions of privacy by marketers to web site defacements to attacks that shut down popular web sites. Web Security, Privacy & Commerce goes behind the headlines, examines the major security risks facing us today, and explains how we can minimize them. It describes risks for Windows and Unix, Microsoft Internet Exp

  12. SaaS ve web designu

    OpenAIRE

    Míka, Filip

    2011-01-01

    This thesis aims to evaluate whether the current SaaS market is able to meet the functional requirements of web design in order to appropriately support web design activities. The theoretical part introduces the web design model which describes web design's functional requirements. The next section presents a research concept that describes the model assessment (i.e. solutions delivered as SaaS that support web design) and the evaluation process. The results show that the current SaaS market is able to...

  13. It's Time to Use a Wiki as Part of Your Web Site

    Science.gov (United States)

    Ribaric, Tim

    2007-01-01

    Without a doubt, the term "wiki" has leaked into almost every discussion concerning Web 2.0. The real question becomes: Is there a place for a wiki on every library Web site? The answer should be an emphatic "yes." People often praise the wiki because it offers simple page creation and provides instant gratification for amateur Web developers.…

  14. Moving toward a universally accessible web: Web accessibility and education.

    Science.gov (United States)

    Kurt, Serhat

    2017-12-08

    The World Wide Web is an extremely powerful source of information, inspiration, ideas, and opportunities. As such, it has become an integral part of daily life for a great majority of people. Yet, for a significant number of others, the internet offers only limited value due to the existence of barriers which make accessing the Web difficult, if not impossible. This article illustrates some of the reasons that achieving equality of access to the online world of education is so critical, explores the current status of Web accessibility, discusses evaluative tools and methods that can help identify accessibility issues in educational websites, and provides practical recommendations and guidelines for resolving some of the obstacles that currently hinder the achievability of the goal of universal Web access.

  15. The Role of Web Interviews as Part of a National Travel Survey

    DEFF Research Database (Denmark)

    Christensen, Linda

    2013-01-01

    Purpose — The paper analyses the effect of adding a web survey to a traditional telephone-based national travel survey by asking the respondents to check in on the web and answer the questions there (Computer Assisted Web Interview, CAWI). If they do not participate by web they are, as usual, called by telephone (Computer Assisted Telephone Interview, CATI). Design/methodology/approach — Multivariate regression analyses are used to analyse the difference in response rates between the two media and whether respondents answering by the two media have different travel patterns. Findings — The analyses show that web interviews save money, even though more intensive post-processing is necessary. The analyses also suggest that the CAWI results in more careful answering, which yields more reported trips. The CAWI increases the participation of children in the survey…

  16. Building Social Web Applications

    CERN Document Server

    Bell, Gavin

    2009-01-01

    Building a web application that attracts and retains regular visitors is tricky enough, but creating a social application that encourages visitors to interact with one another requires careful planning. This book provides practical solutions to the tough questions you'll face when building an effective community site -- one that makes visitors feel like they've found a new home on the Web. If your company is ready to take part in the social web, this book will help you get started. Whether you're creating a new site from scratch or reworking an existing site, Building Social Web Applications

  17. Exploring the academic invisible web

    OpenAIRE

    Lewandowski, Dirk; Mayr, Philipp

    2006-01-01

    Purpose: To provide a critical review of Bergman’s 2001 study on the Deep Web. In addition, we bring a new concept into the discussion, the Academic Invisible Web (AIW). We define the Academic Invisible Web as consisting of all databases and collections relevant to academia but not searchable by the general-purpose internet search engines. Indexing this part of the Invisible Web is central to scientific search engines. We provide an overview of approaches followed thus far. Design/methodol...

  18. A HYPERSPECTRAL BASED METHOD TO DETECT CANNABIS PLANTATION IN INACCESSIBLE AREAS

    Directory of Open Access Journals (Sweden)

    M. Houmi

    2018-04-01

    Full Text Available The increase in drug use worldwide has led to sophisticated illegal planting methods. Most countries depend on helicopters, and local knowledge to identify such illegal plantations. However, remote sensing techniques can provide special advantages for monitoring the extent of illegal drug production. This paper sought to assess the ability of the Satellite remote sensing to detect Cannabis plantations. This was achieved in two stages: 1- Preprocessing of Hyperspectral data EO-1, and testing the capability to collect the spectral signature of Cannabis in different sites of the study area (Morocco) from well-known Cannabis plantation fields. 2- Applying the method of Spectral Angle Mapper (SAM) based on a specific angle threshold on Hyperion data EO-1 in well-known Cannabis plantation sites, and other sites with negative Cannabis plantation in another study area (Algeria), to avoid any false Cannabis detection using these spectra. This study emphasizes the benefits of using hyperspectral remote sensing data as an effective detection tool for illegal Cannabis plantation in inaccessible areas based on SAM classification method with a maximum angle (radians) less than 0.03.

  19. a Hyperspectral Based Method to Detect Cannabis Plantation in Inaccessible Areas

    Science.gov (United States)

    Houmi, M.; Mohamadi, B.; Balz, T.

    2018-04-01

    The increase in drug use worldwide has led to sophisticated illegal planting methods. Most countries depend on helicopters, and local knowledge to identify such illegal plantations. However, remote sensing techniques can provide special advantages for monitoring the extent of illegal drug production. This paper sought to assess the ability of the Satellite remote sensing to detect Cannabis plantations. This was achieved in two stages: 1- Preprocessing of Hyperspectral data EO-1, and testing the capability to collect the spectral signature of Cannabis in different sites of the study area (Morocco) from well-known Cannabis plantation fields. 2- Applying the method of Spectral Angle Mapper (SAM) based on a specific angle threshold on Hyperion data EO-1 in well-known Cannabis plantation sites, and other sites with negative Cannabis plantation in another study area (Algeria), to avoid any false Cannabis detection using these spectra. This study emphasizes the benefits of using hyperspectral remote sensing data as an effective detection tool for illegal Cannabis plantation in inaccessible areas based on SAM classification method with a maximum angle (radians) less than 0.03.
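    To make the classification rule concrete, the following sketch applies the spectral angle test with the 0.03 rad threshold quoted above to a hyperspectral cube. The array shapes, band ordering and NumPy implementation are assumptions for illustration; this is not the authors' processing chain for Hyperion EO-1 data.

    ```python
    import numpy as np

    def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
        """Spectral angle (radians) between one pixel spectrum and a reference spectrum."""
        cos_theta = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
        return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

    def sam_classify(cube: np.ndarray, reference: np.ndarray, max_angle: float = 0.03) -> np.ndarray:
        """Classify a (rows, cols, bands) cube: True where the angle to the reference
        spectrum falls below the threshold (0.03 rad, as in the abstract)."""
        rows, cols, bands = cube.shape
        flat = cube.reshape(-1, bands)
        norms = np.linalg.norm(flat, axis=1) * np.linalg.norm(reference)
        cos_theta = flat @ reference / np.where(norms == 0, 1e-12, norms)
        angles = np.arccos(np.clip(cos_theta, -1.0, 1.0))
        return (angles < max_angle).reshape(rows, cols)
    ```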

  20. Citations to Web pages in scientific articles: the permanence of archived references.

    Science.gov (United States)

    Thorp, Andrea W; Schriger, David L

    2011-02-01

    We validate the use of archiving Internet references by comparing the accessibility of published uniform resource locators (URLs) with corresponding archived URLs over time. We scanned the "Articles in Press" section in Annals of Emergency Medicine from March 2009 through June 2010 for Internet references in research articles. If an Internet reference produced the authors' expected content, the Web page was archived with WebCite (http://www.webcitation.org). Because the archived Web page does not change, we compared it with the original URL to determine whether the original Web page had changed. We attempted to access each original URL and archived Web site URL at 3-month intervals from the time of online publication during an 18-month study period. Once a URL no longer existed or failed to contain the original authors' expected content, it was excluded from further study. The number of original URLs and archived URLs that remained accessible over time was totaled and compared. A total of 121 articles were reviewed and 144 Internet references were found within 55 articles. Of the original URLs, 15% (21/144; 95% confidence interval [CI] 9% to 21%) were inaccessible at publication. During the 18-month observation period, there was no loss of archived URLs (apart from the 4% [5/123; 95% CI 2% to 9%] that could not be archived), whereas 35% (49/139) of the original URLs were lost (46% loss; 95% CI 33% to 61% by the Kaplan-Meier method; difference between curves P…). Archiving the Web page at publication can help preserve the authors' expected information. Copyright © 2010 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.
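    A minimal sketch of the kind of periodic accessibility check described above, assuming Python with the requests library. It only tests whether a URL still responds; the study additionally verified that the content still matched the archived copy.

    ```python
    import requests

    def is_accessible(url: str, timeout: float = 10.0) -> bool:
        """True if the URL responds with a non-error status (a rough proxy for
        'still serves content'; real checks would also compare the page body)."""
        try:
            resp = requests.head(url, allow_redirects=True, timeout=timeout)
            return resp.status_code < 400
        except requests.RequestException:
            return False

    def survival_counts(pairs):
        """pairs: list of (original_url, archived_url); count how many of each remain reachable."""
        pairs = list(pairs)
        orig_alive = sum(is_accessible(original) for original, _ in pairs)
        arch_alive = sum(is_accessible(archived) for _, archived in pairs if archived)
        return orig_alive, arch_alive
    ```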

  1. Geo-communication and web-based infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2005-01-01

    The role of geo-information and the distribution of geo-information have changed dramatically since the introduction of web-services on the Internet. In the framework of web-services maps should be seen as an index to further geo-information. Maps are no longer an aim in themselves. In this context...... web-services perform the function as index-portals on the basis of geoinformation. The introduction of web-services as index-portals based on geoinformation has changed the conditions for both content and form of geocommunication. A high number of players and interactions (as well as a very high...... number of all kinds of information and combinations of these) characterize web-services, where maps are only a part of the whole. These new conditions demand new ways of modelling the processes leading to geo-communication. One new aspect is the fact that the service providers have become a part...

  2. Web party effect: a cocktail party effect in the web environment.

    Science.gov (United States)

    Rigutti, Sara; Fantoni, Carlo; Gerbino, Walter

    2015-01-01

    In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of variable complexity, in order to reach an information goal located two clicks away from the starting home page. The time needed to reach the goal was accounted for by a novel measure of home page complexity based on a part of (not all) web options: the number of links embedded within web navigation elements weighted by the number and type of embedding elements. Our measure fully mediated the effect of several standard complexity metrics (the overall number of links, words, images, graphical regions, the JPEG file size of home page screenshots) on information seeking time and usability ratings. Furthermore, it predicted the cognitive demand of web navigation, as revealed by the duration judgment ratio (i.e., the ratio of subjective to objective duration of information search). Results demonstrate that focusing on relevant links while ignoring other web objects optimizes the deployment of attentional resources necessary to navigation. This is in line with a web party effect (i.e., a cocktail party effect in the web environment): users tune into web elements that are relevant for the achievement of their navigation goals and tune out all others.
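    The abstract does not give the exact formula, so the sketch below is only one plausible reading of a "weighted count of links embedded within navigation elements": the element types and weights in NAV_WEIGHTS are assumptions, and BeautifulSoup is used for parsing. A link nested in several navigation containers is counted once per container, echoing the idea of weighting by the number and type of embedding elements.

    ```python
    from bs4 import BeautifulSoup

    # Hypothetical weights per type of embedding element (not the paper's values).
    NAV_WEIGHTS = {"nav": 1.0, "menu": 1.0, "header": 0.75, "ul": 0.5}

    def navigation_complexity(html: str) -> float:
        """Weighted count of links embedded within navigation-like elements."""
        soup = BeautifulSoup(html, "html.parser")
        score = 0.0
        for tag_name, weight in NAV_WEIGHTS.items():
            for element in soup.find_all(tag_name):
                score += weight * len(element.find_all("a"))
        return score
    ```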

  3. WEB LOG EXPLORER – CONTROL OF MULTIDIMENSIONAL DYNAMICS OF WEB PAGES

    Directory of Open Access Journals (Sweden)

    Mislav Šimunić

    2012-07-01

    Full Text Available Demand markets dictate and pose increasingly more requirements to the supply market that are not easily satisfied. The supply market presenting its web pages to the demand market should find the best and quickest ways to respond promptly to the changes dictated by the demand market. The question is how to do that in the most efficient and quickest way. The data on the usage of web pages on a specific web site are recorded in a log file. The data in a log file are stochastic and unordered and require systematic monitoring, categorization, analyses, and weighing. From the data processed in this way, it is necessary to single out and sort the data by their importance, which would be a basis for a continuous generation of dynamics/changes to the web site pages in line with the criterion chosen. To perform those tasks successfully, a new software solution is required. For that purpose, the authors have developed the first version of the WLE (WebLogExplorer) software solution, which is actually a realization of web page multidimensionality and the web site as a whole. The WebLogExplorer enables statistical and semantic analysis of a log file and, on the basis thereof, multidimensional control of the web page dynamics. The experimental part of the work was done within the web site of HTZ (Croatian National Tourist Board), the main portal of the global tourist supply in the Republic of Croatia (on average, the daily "log" consists of c. 600,000 sets, the average size of a log file is 127 Mb, and there are c. 7,000-8,000 daily visitors on the web site).
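    As a small illustration of the statistical side of such log analysis, the sketch below counts successful requests per page from a Common Log Format file. It is a starting point only; WebLogExplorer's own categorization and semantic analysis are not reproduced here, and the log format is an assumption.

    ```python
    import re
    from collections import Counter

    # Matches lines such as:
    # 127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326
    LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

    def page_popularity(log_path: str) -> Counter:
        """Count successful (2xx) requests per page from a web server log file."""
        hits = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as handle:
            for line in handle:
                match = LOG_LINE.search(line)
                if match and match.group("status").startswith("2"):
                    hits[match.group("path")] += 1
        return hits
    ```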

  4. Forensic web watch.

    Science.gov (United States)

    Abbas, Ali; N Rutty, Guy

    2003-06-01

    When one thinks of print identification techniques one automatically considers fingerprints. Although finger prints have been in use now for over 100 years there is in fact an older type of identification technique related to prints left at scenes of crime and the anatomy of human body parts. This is the world of ear prints. This short web review considers web sites related to ear print identification particularly the continuing controversy as to whether or not an ear print is unique.

  5. Aspectos de seguridad en Web 2.0 y redes sociales

    OpenAIRE

    Caballero Velasco, María Ángeles

    2011-01-01

    This final-year university project focuses on security in Web 2.0 and social networks. The document is divided into two parts. The first part defines the main vision of the 2.0 world, how it came about, and what implications it has in today's society. The characteristics of Web 2.0 and its history are defined. The differences between Web 2.0 and Web 1.0 are analysed, moving on to Web 3.0. The types of Web 2.0 are classified by use and by application, and an analysis is made of all the...

  6. Aesthetics and function in web design

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2004-01-01

    Since the origin of the web site in the first part of the 90’s there has been discussions regarding the relative weighting of function and aesthetics. A renewed discussion is needed, however, to clarify what exactly is meant by aesthetics in web design. Moreover the balance between aesthetics...... and function ought to be considered more in respect to the target group and the genre of web site....

  7. Web party effect: a cocktail party effect in the web environment

    Directory of Open Access Journals (Sweden)

    Sara Rigutti

    2015-03-01

    Full Text Available In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of variable complexity, in order to reach an information goal located two clicks away from the starting home page. The time needed to reach the goal was accounted for by a novel measure of home page complexity based on a part of (not all) web options: the number of links embedded within web navigation elements weighted by the number and type of embedding elements. Our measure fully mediated the effect of several standard complexity metrics (the overall number of links, words, images, graphical regions, the JPEG file size of home page screenshots) on information seeking time and usability ratings. Furthermore, it predicted the cognitive demand of web navigation, as revealed by the duration judgment ratio (i.e., the ratio of subjective to objective duration of information search). Results demonstrate that focusing on relevant links while ignoring other web objects optimizes the deployment of attentional resources necessary to navigation. This is in line with a web party effect (i.e., a cocktail party effect in the web environment): users tune into web elements that are relevant for the achievement of their navigation goals and tune out all others.

  8. Web party effect: a cocktail party effect in the web environment

    Science.gov (United States)

    Gerbino, Walter

    2015-01-01

    In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of variable complexity, in order to reach an information goal located two clicks away from the starting home page. The time needed to reach the goal was accounted for by a novel measure of home page complexity based on a part of (not all) web options: the number of links embedded within web navigation elements weighted by the number and type of embedding elements. Our measure fully mediated the effect of several standard complexity metrics (the overall number of links, words, images, graphical regions, the JPEG file size of home page screenshots) on information seeking time and usability ratings. Furthermore, it predicted the cognitive demand of web navigation, as revealed by the duration judgment ratio (i.e., the ratio of subjective to objective duration of information search). Results demonstrate that focusing on relevant links while ignoring other web objects optimizes the deployment of attentional resources necessary to navigation. This is in line with a web party effect (i.e., a cocktail party effect in the web environment): users tune into web elements that are relevant for the achievement of their navigation goals and tune out all others. PMID:25802803

  9. Data management on the spatial web

    DEFF Research Database (Denmark)

    Jensen, Christian S.

    2012-01-01

    Due in part to the increasing mobile use of the web and the proliferation of geo-positioning, the web is fast acquiring a significant spatial aspect. Content and users are being augmented with locations that are used increasingly by location-based services. Studies suggest that each week, several...... billion web queries are issued that have local intent and target spatial web objects. These are points of interest with a web presence, and they thus have locations as well as textual descriptions. This development has given prominence to spatial web data management, an area ripe with new and exciting...... opportunities and challenges. The research community has embarked on inventing and supporting new query functionality for the spatial web. Different kinds of spatial web queries return objects that are near a location argument and are relevant to a text argument. To support such queries, it is important...

  10. An Algebraic Specification of the Semantic Web

    OpenAIRE

    Ksystra, Katerina; Triantafyllou, Nikolaos; Stefaneas, Petros; Frangos, Panayiotis

    2011-01-01

    We present a formal specification of the Semantic Web, as an extension of the World Wide Web, using the well-known algebraic specification language CafeOBJ. Our approach allows the description of the key elements of the Semantic Web technologies, in order to give a better understanding of the system, without getting involved with their implementation details that might not yet be standardized. This specification is part of our work in progress concerning the modeling of the Social Semantic Web.

  11. Persistent Web References – Best Practices and New Suggestions

    DEFF Research Database (Denmark)

    Zierau, Eld; Nyvang, Caroline; Kromann, Thomas Hvid

    In this paper, we suggest adjustments to best practices for persistent web referencing; adjustments that aim at preservation and long time accessibility of web referenced resources in general, but with focus on web references in web archives. Web referencing is highly relevant and crucial...... refer to archive URLs which depends on the web archives access implementations. A major part of the suggested adjustments is a new web reference standard for archived web references (called wPID), which is a supplement to the current practices. The purpose of the standard is to support general, global...

  12. Teaching Web 2.0 technologies using Web 2.0 technologies.

    Science.gov (United States)

    Rethlefsen, Melissa L; Piorun, Mary; Prince, J Dale

    2009-10-01

    The research evaluated participant satisfaction with the content and format of the "Web 2.0 101: Introduction to Second Generation Web Tools" course and measured the impact of the course on participants' self-evaluated knowledge of Web 2.0 tools. The "Web 2.0 101" online course was based loosely on the Learning 2.0 model. Content was provided through a course blog and covered a wide range of Web 2.0 tools. All Medical Library Association members were invited to participate. Participants were asked to complete a post-course survey. Respondents who completed the entire course or who completed part of the course self-evaluated their knowledge of nine social software tools and concepts prior to and after the course using a Likert scale. Additional qualitative information about course strengths and weaknesses was also gathered. Respondents' self-ratings showed a significant change in perceived knowledge for each tool, using a matched pair Wilcoxon signed rank analysis (P<0.0001 for each tool/concept). Overall satisfaction with the course appeared high. Hands-on exercises were the most frequently identified strength of the course; the length and time-consuming nature of the course were considered weaknesses by some. Learning 2.0-style courses, though demanding time and self-motivation from participants, can increase knowledge of Web 2.0 tools.
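    The matched-pair Wilcoxon signed-rank comparison of pre- and post-course self-ratings can be reproduced in outline as follows; the Likert ratings shown are hypothetical, not the study's data, and SciPy is assumed to be available.

    ```python
    from scipy.stats import wilcoxon

    # Hypothetical pre/post self-ratings (1-5 Likert) for one Web 2.0 tool.
    pre = [2, 1, 3, 2, 2, 1, 3, 2, 4, 2]
    post = [4, 3, 4, 3, 4, 3, 5, 4, 5, 4]

    # Paired (matched) Wilcoxon signed-rank test on the pre/post differences.
    statistic, p_value = wilcoxon(pre, post)
    print(f"Wilcoxon W={statistic}, p={p_value:.4f}")
    ```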

  13. XML databases and the semantic web

    CERN Document Server

    Thuraisingham, Bhavani

    2002-01-01

    Efficient access to data, sharing data, extracting information from data, and making use of the information have become urgent needs for today's corporations. With so much data on the Web, managing it with conventional tools is becoming almost impossible. New tools and techniques are necessary to provide interoperability as well as warehousing between multiple data sources and systems, and to extract information from the databases. XML Databases and the Semantic Web focuses on critical and new Web technologies needed for organizations to carry out transactions on the Web, to understand how to use the Web effectively, and to exchange complex documents on the Web.This reference for database administrators, database designers, and Web designers working in tandem with database technologists covers three emerging technologies of significant impact for electronic business: Extensible Markup Language (XML), semi-structured databases, and the semantic Web. The first two parts of the book explore these emerging techn...

  14. WebQuest and semantic annotations

    Directory of Open Access Journals (Sweden)

    Santiago Blanco Suárez

    2007-03-01

    Full Text Available This paper presents a system for searching and retrieving metadata of educational activities that follow the WebQuest model. It is a relational database, accessible through the web, complemented with a module that allows semantic annotations to be made, aimed at capturing and enriching the knowledge about the use of these exercises by the community of teachers who experiment with them, as well as documenting resources or websites of didactic interest in order to build a repository of quality educational links.

  15. Web 2.0. Nettet holder liv i grusomme borgerkrige

    DEFF Research Database (Denmark)

    Schmidt, Søren

    2017-01-01

    The article explains the reasons, timing, and character of the on-going civil wars in the Middle East. Web 2.0 explains many parts of this.

  16. Manufacture of plastic parts by radiation molding

    International Nuclear Information System (INIS)

    Leszyk, G.M.; Morrison, E.D.; Williams, R.F. Jr.

    1977-01-01

    Thin plastic parts which can have precise tolerances and can be of complex shape are prepared by casting a viscous radiation-curable composition onto a support, such as a moving web of polymeric material, in the shape of the desired part and then irradiating, for example with ultraviolet radiation or high energy electrons, to cause curing of the composition to a solid plastic. The radiation-curable composition is formulated with viscosity and flow characteristics that allow it to be cast in the exact shape of the part desired yet retain this shape during curing while supported only by the surface on which it has been cast. Plastic parts made by this method can be formed entirely of the radiation-curable composition by casting onto a web having a release surface from which the part can be stripped subsequent to curing, or can be formed partially from a web material and partially from the radiation-curable composition by casting onto a web to which the composition will bond and subsequently cutting the web into discrete portions which include the cured composition

  17. Investigation of Flood Risk Assessment in Inaccessible Regions using Multiple Remote Sensing and Geographic Information Systems

    Science.gov (United States)

    Lim, J.; Lee, K. S.

    2017-12-01

    Flooding is extremely dangerous when a river overflows to inundate an urban area. From 1995 to 2016, North Korea (NK) experienced extensive damage to life and property almost every year due to levee breaches resulting from typhoons and heavy rainfall during the summer monsoon season. Recently, Hoeryeong City (2016) experienced heavy rainfall during typhoon Lionrock, and the resulting flood killed and injured many people (68,900) and destroyed numerous buildings and settlements (11,600). The NK state media described it as the biggest national disaster since 1945. Thus, the near-annual recurrence of floods in NK has had a serious impact, which makes it necessary to determine the extent of flooding in order to restore the damaged environment. In addition, a traditional hydrological model is impractical for delineating Flood Damaged Areas (FDAs) in NK due to the inaccessibility of the region. Under such a situation, multiple optical Remote Sensing (RS) and radar RS along with a Geographic Information System (GIS)-based spatial analysis were utilized in this study (1) to develop an FDA delineation model using multiple RS and GIS methods and (2) to conduct a flood risk assessment in NK. Interpretation of high-resolution web-based satellite imagery was also used to confirm the results of the study. From the study results, it was found that (1) on August 30th, 2016, an area of 117.2 km2 (8.6%) of Hoeryeong City was inundated. Most floods occurred in flat areas with lower and middle stream orders. (2) In the binary logistic regression model applied in this study, the distance from the nearest stream and the landform map variables are important factors for delineating FDAs because these two factors reflect NK's heterogeneous mountainous topography. (3) The total annual flood risk of the study area is estimated to be ₩454.13 million NKW ($504,417.24 USD and ₩576.53 million SKW). The risk at the confluence of the Tumen River and Hoeryeong stream appears to be the highest. (4) High resolution
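    A minimal sketch of the kind of binary logistic regression used to delineate flood-damaged areas. The predictor columns (distance to the nearest stream, slope, landform class) and the toy values are assumptions loosely based on the abstract, not the study's dataset.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical training pixels:
    # columns = distance to nearest stream (m), slope (deg), landform class code
    X = np.array([[50, 1.0, 1], [400, 5.0, 3], [30, 0.5, 1], [800, 12.0, 4]])
    y = np.array([1, 0, 1, 0])  # 1 = flooded, 0 = not flooded

    model = LogisticRegression().fit(X, y)

    # Probability of flooding for a new pixel 100 m from the stream on gentle, low-lying terrain.
    print(model.predict_proba([[100, 2.0, 1]])[:, 1])
    ```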

  18. Theoretical Foundations of the Web: Cognition, Communication, and Co-Operation. Towards an Understanding of Web 1.0, 2.0, 3.0

    Directory of Open Access Journals (Sweden)

    Robert Bichler

    2010-02-01

    Full Text Available Currently, there is much talk of Web 2.0 and Social Software. A common understanding of these notions is not yet in existence. The question of what makes Social Software social has thus far also remained unacknowledged. In this paper we provide a theoretical understanding of these notions by outlining a model of the Web as a techno-social system that enhances human cognition towards communication and co-operation. According to this understanding, we identify three qualities of the Web, namely Web 1.0 as a Web of cognition, Web 2.0 as a Web of human communication, and Web 3.0 as a Web of co-operation. We use the terms Web 1.0, Web 2.0, Web 3.0 not in a technical sense, but for describing and characterizing the social dynamics and information processes that are part of the Internet.

  19. USING WEB MINING IN E-COMMERCE APPLICATIONS

    Directory of Open Access Journals (Sweden)

    Claudia Elena Dinucă

    2011-09-01

    Full Text Available Nowadays, the web is an important part of our daily life. The web is now the best medium for doing business. Large companies rethink their business strategies, using the web to improve business. Business carried out on the Web gives potential customers or partners a place where a company's products and specific business information can be found. A business presence through a company web site has several advantages, as it breaks the barriers of time and space compared with the existence of a physical office. To differentiate themselves in the Internet economy, winning companies have realized that e-commerce transactions are more than just buying/selling; appropriate strategies are key to improving competitive power. One effective technique used for this purpose is data mining. Data mining is the process of extracting interesting knowledge from data. Web mining is the use of data mining techniques to extract information from web data. This article presents the three components of web mining: web usage mining, web structure mining and web content mining.

  20. Beyond Web 2.0 … and Beyond the Semantic Web

    Science.gov (United States)

    Bénel, Aurélien; Zhou, Chao; Cahier, Jean-Pierre

    Tim O'Reilly, the famous technology book publisher, changed the life of many of us when he coined the name "Web 2.0" (O'Reilly 2005). Our research topics suddenly became subjects for open discussion in various cultural formats such as radio and TV, while at the same time they became part of an inappropriate marketing discourse according to several scientific reviewers. Indeed, Tim O'Reilly's initial thoughts were about economic consequences, since it was about the resurrection of the Web after the bursting of the dot-com bubble. Some opponents of the concept do not think the term should be used at all, since it is underpinned by no technological revolution. In contrast, we think that there was a paradigm shift when several sites based on user-generated content became some of the most visited Web sites, and massive adoption of that kind is worthy of researchers' attention.

  1. DISTANCE LEARNING ONLINE WEB 3 .0

    Directory of Open Access Journals (Sweden)

    S. M. Petryk

    2015-05-01

    Full Text Available This article analyzes the existing methods of identifying information in the Semantic Web, outlines the main problems of their implementation, and investigates the use of the Semantic Web as part of distance learning. An alternative approach to the identification of information and acquired knowledge, and to the construction of relationships between them, is proposed, based on the developed method "spectrum of knowledge".

  2. Introduction to Webometrics Quantitative Web Research for the Social Sciences

    CERN Document Server

    Thelwall, Michael

    2009-01-01

    Webometrics is concerned with measuring aspects of the web: web sites, web pages, parts of web pages, words in web pages, hyperlinks, web search engine results. The importance of the web itself as a communication medium and for hosting an increasingly wide array of documents, from journal articles to holiday brochures, needs no introduction. Given this huge and easily accessible source of information, there are limitless possibilities for measuring or counting on a huge scale (e.g., the number of web sites, the number of web pages, the number of blogs) or on a smaller scale (e.g., the number o

  3. A Web Server for MACCS Magnetometer Data

    Science.gov (United States)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  4. Towards a Pattern Language for Adaptive Web-Based Educational Systems

    NARCIS (Netherlands)

    Avgeriou, P.; Vogiatzis, D.; Tzanavari, A.; Retalis, S.

    2004-01-01

    Adaptive Web-based Educational Systems represent an emerging technology that provides a unique advantage over traditional Web-based Educational Systems; that is the ability to adapt to the user's needs, goals, preferences etc. Adaptive Web-based Educational Systems are increasingly becoming part of

  5. Design Patterns in Adaptive Web-Based Educational Systems : An Overview

    NARCIS (Netherlands)

    Avgeriou, Paris; Vogiatzis, Dimitrios; Tzanavari, Aimilia; Retalis, Symeon

    2004-01-01

    Adaptive Web-based Educational Systems represent an emerging technology that provides a unique advantage over traditional Web-based Educational Systems; that is the ability to adapt to the user's needs, goals, preferences etc. Adaptive Web-based Educational Systems are increasingly becoming part of

  6. Web User Profiling Based on Browsing Behavior Analysis

    OpenAIRE

    Fan , Xiao-Xi; Chow , Kam-Pui; Xu , Fei

    2014-01-01

    Part 1: Internet Crime Investigations; Determining the source of criminal activity requires a reliable means to estimate a criminal’s identity. One way to do this is to use web browsing history to build a profile of an anonymous user. Since an individual’s web use is unique, matching the web use profile to known samples provides a means to identify an unknown user. This paper describes a model for web user profiling and identification. Two aspects of browsing behavior ...
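    Since the abstract is truncated, the sketch below shows only one plausible way to compare an anonymous browsing profile against a known sample, using normalised domain-visit frequencies and cosine similarity; it is not the paper's model, and the domains are hypothetical.

    ```python
    import math
    from collections import Counter

    def profile(visited_domains):
        """Browsing profile as normalised visit frequencies per domain."""
        counts = Counter(visited_domains)
        total = sum(counts.values())
        return {domain: count / total for domain, count in counts.items()}

    def cosine_similarity(p, q):
        """Cosine similarity between two sparse frequency profiles."""
        dot = sum(p[d] * q.get(d, 0.0) for d in p)
        norm = math.sqrt(sum(v * v for v in p.values())) * math.sqrt(sum(v * v for v in q.values()))
        return dot / norm if norm else 0.0

    unknown = profile(["news.example", "mail.example", "news.example"])
    suspect = profile(["news.example", "news.example", "mail.example", "shop.example"])
    print(cosine_similarity(unknown, suspect))
    ```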

  7. Semantic Web Services Challenge, Results from the First Year. Series: Semantic Web And Beyond, Volume 8.

    Science.gov (United States)

    Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.

    Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of the potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part of this potential rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings, is necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open source initiative that provides a public evaluation and certification of multiple frameworks on common industrially-relevant problem sets. This edited volume reports on the first results in developing common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for their potential practical use. The book is also suitable for advanced-level students in computer science.

  8. Ten years for the public Web

    CERN Multimedia

    2003-01-01

    Ten years ago, CERN issued a statement declaring that a little known piece of software called the World Wide Web was in the public domain. Nowadays, the Web is an indispensable part of modern communications. The idea for the Web goes back to March 1989, when CERN computer scientist Tim Berners-Lee wrote a proposal for a 'Distributed Information Management System' for the high-energy physics community. The Web was originally conceived and developed to meet the demand for information sharing between scientists working all over the world. There were many obstacles in the 1980s to the effective exchange of information. There was, for example, a great variety of computer and network systems, with hardly any common features. The main purpose of the web was to allow scientists to access information from any source in a consistent and simple way. By Christmas 1990, Berners-Lee's idea had become the World Wide Web, with its first server and browser running at CERN. Through 1991, the Web spread to other particle physics ...

  9. A rare association between dextrogastria, duodenal web, and ...

    African Journals Online (AJOL)

    ... the radiologic investigation for bilious vomiting and feeding intolerance, revealing congenital duodenal stenosis and dextrogastria. During surgery, the association of the dextrogastria with the duodenal web situated in the second part of the duodenum was established. Keywords: dextrogastria, duodenal web, malrotation ...

  10. Lost but Not Forgotten: Finding Pages on the Unarchived Web

    NARCIS (Netherlands)

    Huurdeman, H.C.; Kamps, J.; Samar, T.; de Vries, A.P.; Ben-David, A.; Rogers, R.A.

    2015-01-01

    Web archives attempt to preserve the fast changing web, yet they will always be incomplete. Due to restrictions in crawling depth, crawling frequency, and restrictive selection policies, large parts of the Web are unarchived and, therefore, lost to posterity. In this paper, we propose an approach to

  11. Lost but not forgotten: finding pages on the unarchived web

    NARCIS (Netherlands)

    H.C. Huurdeman; J. Kamps; T. Samar (Thaer); A.P. de Vries (Arjen); A. Ben-David; R.A. Rogers (Richard)

    2015-01-01

    Web archives attempt to preserve the fast changing web, yet they will always be incomplete. Due to restrictions in crawling depth, crawling frequency, and restrictive selection policies, large parts of the Web are unarchived and, therefore, lost to posterity. In this paper, we propose an

  12. Induction of antibodies against epitopes inaccessible on the HIV type 1 envelope oligomer by immunization with recombinant monomeric glycoprotein 120

    DEFF Research Database (Denmark)

    Schønning, Kristian; Bolmstedt, A; Novotny, J

    1998-01-01

    An N-glycan (N306) at the base of the V3 loop of HIV-BRU gp120 is shielding a linear neutralization epitope at the tip of the V3 loop on oligomeric Env. In contrast, this epitope is readily antigenic on monomeric gp120. Immunization with recombinant monomeric HIV-BRU gp120 may thus be expected...... immunogenic structures inaccessible on the envelope oligomer. The limited ability of recombinant gp120 vaccines to induce neutralizing antibodies against primary isolates may thus not exclusively reflect genetic variation....

  13. MEAN STACK WEB DEVELOPMENT

    OpenAIRE

    Le Thanh, Nghi

    2017-01-01

    The aim of the thesis is to provide a universal website using JavaScript as the main programming language. It also shows the basic parts anyone needs to create a web application. The thesis creates a simple CMS using the MEAN stack. MEAN is a collection of JavaScript-based technologies used to develop web applications. It is an acronym for MongoDB, Express, AngularJS and Node.js. It also allows non-technical users to easily update and manage a website’s content. But the application also lets o...

  14. Sustainable web ecosystem design

    CERN Document Server

    O'Toole, Greg

    2013-01-01

    This book is about the process of creating web-based systems (i.e., websites, content, etc.) that consider each of the parts, the modules, the organisms - binary or otherwise - that make up a balanced, sustainable web ecosystem. In the current media-rich environment, a website is more than a collection of relative html documents of text and images on a static desktop computer monitor. There is now an unlimited combination of screens, devices, platforms, browsers, locations, versions, users, and exabytes of data with which to interact. Written in a highly approachable, practical style, this boo

  15. Classroom Web Pages: A "How-To" Guide for Educators.

    Science.gov (United States)

    Fehling, Eric E.

    This manual provides teachers, with very little or no technology experience, with a step-by-step guide for developing the necessary skills for creating a class Web Page. The first part of the manual is devoted to the thought processes preceding the actual creation of the Web Page. These include looking at other Web Pages, deciding what should be…

  16. MACHINE LEARNING IMPLEMENTATION FOR THE CLASSIFICATION OF ATTACKS ON WEB SYSTEMS. PART 2

    Directory of Open Access Journals (Sweden)

    K. Smirnova

    2017-11-01

    Full Text Available The possibility of applying machine learning to the classification of malicious requests to a Web application is considered. This approach excludes the use of deterministic analysis systems (for example, expert systems) and is based on the application of a cascade of neural networks or perceptrons as an approximate model of the real human brain. The main idea of the work is to make it possible to describe complex attack vectors consisting of feature sets, abstract terms for compiling a training sample, controlling the quality of recognition, and classifying each of the layers (networks) participating in the work, with the ability to adjust not the entire network but only the small part of it into whose training a mistake or inaccuracy crept. The design of the developed network can be described as a cascaded, scalable neural network. When using neural networks to detect attacks on web systems, the issue of vectorization and normalization of features is acute. The most commonly used methods for solving these problems are not designed for the case of deliberate distortion of the signs of an attack. The proposed approach makes it possible to obtain a neural network that has been trained in more detail on small features, and also to eliminate the normalization issues in order to avoid deliberate bypassing of the intrusion detection system. By isolating one more group of neurons in the network and training it on samples containing various variants of circumvention of the attack classification, the developed intrusion detection system remains able to classify any types of attacks as well as their aggregates, putting forward more stringent measures to counteract attacks. This allows the life cycle of the attack to be followed in more detail: from the initial trial attack to deliberate, sophisticated attempts to bypass the system, and allows more decisive measures to be introduced to actively counteract the attack, eliminating the chances of a false alarm.
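    The sketch below is a generic two-stage cascade for classifying web requests, intended only to illustrate the idea of training and adjusting one stage at a time; the feature extraction (character n-gram TF-IDF), the scikit-learn MLP stand-in for the perceptron layers, and the labels are all assumptions, not the authors' network design.

    ```python
    # Stage 1 separates benign from malicious requests; stage 2, trained only on
    # malicious ones, assigns an attack class, so either stage can be retrained alone.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    def build_stage():
        # character n-grams tolerate obfuscated payloads better than word tokens
        return make_pipeline(TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
                             MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0))

    # toy labelled requests, illustrative only
    samples = ["GET /index.html", "GET /item?id=1' OR '1'='1", "GET /q=<script>alert(1)</script>"]
    stage1 = build_stage().fit(samples, ["benign", "malicious", "malicious"])
    stage2 = build_stage().fit(samples[1:], ["sqli", "xss"])

    def classify(request_text: str) -> str:
        if stage1.predict([request_text])[0] == "malicious":
            return stage2.predict([request_text])[0]
        return "benign"

    print(classify("GET /item?id=2' OR '2'='2"))
    ```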

  17. Human Activity in the Web

    OpenAIRE

    Radicchi, Filippo

    2009-01-01

    The recent information technology revolution has enabled the analysis and processing of large-scale datasets describing human activities. The main source of data is represented by the Web, where humans generally spend a relevant part of their day. Here we study three large datasets containing information about human activities on the Web in different contexts. We study in detail inter-event and waiting time statistics. In both cases, the number of subsequent operations which differ by ta...

  18. Digital libraries and World Wide Web sites and page persistence.

    Directory of Open Access Journals (Sweden)

    Wallace Koehler

    1999-01-01

    Full Text Available Web pages and Web sites, some argue, can either be collected as elements of digital or hybrid libraries, or, as others would have it, the WWW is itself a library. We begin with the assumption that Web pages and Web sites can be collected and categorized. The paper explores the proposition that the WWW constitutes a library. We conclude that the Web is not a digital library. However, its component parts can be aggregated and included as parts of digital library collections. These, in turn, can be incorporated into "hybrid libraries." These are libraries with both traditional and digital collections. Material on the Web can be organized and managed. Native documents can be collected in situ, disseminated, distributed, catalogued, indexed, controlled, in traditional library fashion. The Web therefore is not a library, but material for library collections is selected from the Web. That said, the Web and its component parts are dynamic. Web documents undergo two kinds of change. The first type, the type addressed in this paper, is "persistence" or the existence or disappearance of Web pages and sites, or in a word the lifecycle of Web documents. "Intermittence" is a variant of persistence, and is defined as the disappearance but reappearance of Web documents. At any given time, about five percent of Web pages are intermittent, which is to say they are gone but will return. Over time a Web collection erodes. Based on a 120-week longitudinal study of a sample of Web documents, it appears that the half-life of a Web page is somewhat less than two years and the half-life of a Web site is somewhat more than two years. That is to say, an unweeded Web document collection created two years ago would contain the same number of URLs, but only half of those URLs point to content. The second type of change Web documents experience is change in Web page or Web site content. Again based on the Web document samples, very nearly all Web pages and sites undergo some
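    To connect the reported numbers, the snippet below backs out an approximate half-life from a single survival fraction under an assumed exponential decay model; the exponential assumption and the 45% surviving fraction are ours, used only to show that roughly half of a collection surviving after about 120 weeks corresponds to a half-life of around two years.

    ```python
    import math

    def half_life_weeks(weeks_observed: float, fraction_surviving: float) -> float:
        """Half-life implied by one survival observation, assuming S(t) = exp(-lambda * t)."""
        decay = -math.log(fraction_surviving) / weeks_observed
        return math.log(2) / decay

    # Hypothetical: 45% of original URLs still point to content after 120 weeks.
    print(half_life_weeks(120, 0.45))  # ~104 weeks, i.e. about two years
    ```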

  19. Measuring Dynamic and Kinetic Information in the Previously Inaccessible Supra-tc Window of Nanoseconds to Microseconds by Solution NMR Spectroscopy

    Directory of Open Access Journals (Sweden)

    Donghan Lee

    2013-09-01

    Full Text Available Nuclear Magnetic Resonance (NMR) spectroscopy is a powerful tool that has enabled experimentalists to characterize molecular dynamics and kinetics spanning a wide range of time-scales from picoseconds to days. This review focuses on addressing the previously inaccessible supra-τc window (defined as τc < supra-τc < 40 μs, in which τc is the overall tumbling time of a molecule) from the perspective of local inter-nuclear vector dynamics extracted from residual dipolar couplings (RDCs) and from the perspective of conformational exchange captured by relaxation dispersion measurements (RD). The goal of the first section is to present a detailed analysis of how to extract protein dynamics encoded in RDCs and how to relate this information to protein functionality within the previously inaccessible supra-τc window. In the second section, the current state of the art for RD is analyzed, as well as the considerable progress toward pushing the sensitivity of RD further into the supra-τc scale by up to a factor of two (motion up to 25 ms). From the data obtained with these techniques and methodology, the importance of the supra-τc scale for protein function and molecular recognition is becoming increasingly clearer as the connection between motion on the supra-τc scale and protein functionality from the experimental side is further strengthened with results from molecular dynamics simulations.

  20. From Field to the Web: Management and Publication of Geoscience Samples in CSIRO Mineral Resources

    Science.gov (United States)

    Devaraju, A.; Klump, J. F.; Tey, V.; Fraser, R.; Reid, N.; Brown, A.; Golodoniuc, P.

    2016-12-01

    Inaccessible samples are an obstacle to the reproducibility of research and may cause waste of time and resources through duplication of sample collection and management. Within the Commonwealth Scientific and Industrial Research Organisation (CSIRO) Mineral Resources there are various research communities who collect or generate physical samples as part of their field studies and analytical processes. Materials can be varied and could be rock, soil, plant materials, water, and even synthetic materials. Given the wide range of applications in CSIRO, each researcher or project may follow their own method of collecting, curating and documenting samples. In many cases samples and their documentation are often only available to the sample collector. For example, the Australian Resources Research Centre stores rock samples and research collections dating as far back as the 1970s. Collecting these samples again would be prohibitively expensive and in some cases impossible because the site has been mined out. These samples would not be easily discoverable by others without an online sample catalog. We identify some of the organizational and technical challenges to provide unambiguous and systematic access to geoscience samples, and present their solutions (e.g., workflow, persistent identifier and tools). We present the workflow starting from field sampling to sample publication on the Web, and describe how the International Geo Sample Number (IGSN) can be applied to identify samples along the process. In our test case geoscientific samples are collected as part of the Capricorn Distal Footprints project, a collaboration project between the CSIRO, the Geological Survey of Western Australia, academic institutions and industry partners. We conclude by summarizing the values of our solutions in terms of sample management and publication.
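    As a small illustration of how an IGSN makes a sample citable on the Web, the snippet below builds a resolver URL and checks that it redirects to a landing page. The sample number is hypothetical, and the igsn.org resolver pattern is the commonly used one rather than anything specific to the CSIRO workflow described above.

    ```python
    import requests

    def igsn_landing_url(igsn: str) -> str:
        """Resolver URL for an IGSN; igsn.org redirects to the sample's landing page."""
        return f"http://igsn.org/{igsn.upper()}"

    def resolves(igsn: str) -> bool:
        """True if the IGSN resolver answers with a non-error status."""
        try:
            resp = requests.head(igsn_landing_url(igsn), allow_redirects=True, timeout=10)
            return resp.status_code < 400
        except requests.RequestException:
            return False

    print(igsn_landing_url("CSRWA001"))  # hypothetical sample number
    ```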

  1. Monitoring small reservoirs' storage with satellite remote sensing in inaccessible areas

    Science.gov (United States)

    Avisse, Nicolas; Tilmant, Amaury; François Müller, Marc; Zhang, Hua

    2017-12-01

    In river basins with water storage facilities, the availability of regularly updated information on reservoir level and capacity is of paramount importance for the effective management of those systems. However, for the vast majority of reservoirs around the world, storage levels are either not measured or not readily available due to financial, political, or legal considerations. This paper proposes a novel approach using Landsat imagery and digital elevation models (DEMs) to retrieve information on storage variations in any inaccessible region. Unlike existing approaches, the method does not require any in situ measurement and is appropriate for monitoring small, and often undocumented, irrigation reservoirs. It consists of three recovery steps: (i) a 2-D dynamic classification of Landsat spectral band information to quantify the surface area of water, (ii) a statistical correction of DEM data to characterize the topography of each reservoir, and (iii) a 3-D reconstruction algorithm to correct for clouds and Landsat 7 Scan Line Corrector failure. The method is applied to quantify reservoir storage in the Yarmouk basin in southern Syria, where ground monitoring is impeded by the ongoing civil war. It is validated against available in situ measurements in neighbouring Jordanian reservoirs. Coefficients of determination range from 0.69 to 0.84, and the normalized root-mean-square error from 10 to 16 % for storage estimations on six Jordanian reservoirs with maximal water surface areas ranging from 0.59 to 3.79 km2.
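    The sketch below illustrates the spirit of steps (i) and (ii): an NDWI-style water mask from Landsat bands and a storage lookup from a stage-area-storage curve derived from the corrected DEM. The band combination, threshold and curve format are assumptions, not the authors' exact algorithm.

    ```python
    import numpy as np

    def water_mask(green: np.ndarray, nir: np.ndarray, threshold: float = 0.0) -> np.ndarray:
        """NDWI-style water mask from Landsat green and near-infrared bands."""
        ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)
        return ndwi > threshold

    def storage_from_area(area_km2: float, stage_area_curve) -> float:
        """Interpolate reservoir storage (million m3) from water surface area using a
        stage-area-storage curve [(area_km2, volume_Mm3), ...] sorted by area."""
        areas, volumes = zip(*stage_area_curve)
        return float(np.interp(area_km2, areas, volumes))

    # Usage: a 1.2 km2 water surface on a reservoir with a simple three-point curve.
    print(storage_from_area(1.2, [(0.5, 1.0), (1.0, 3.0), (2.0, 8.0)]))
    ```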

  2. Commonsense in parts: Mining part-whole relations from the web and image tags

    NARCIS (Netherlands)

    Tandon, Niket; Hariman, Charles; Urbani, Jacopo; Rohrbach, Anna; Rohrbach, Marcus; Weikum, Gerhard

    2016-01-01

    Commonsense knowledge about part-whole relations (e.g., screen partOf notebook) is important for interpreting user input in web search and question answering, or for object detection in images. Prior work on knowledge base construction has compiled part-whole assertions, but with substantial

  3. Development of a laboratory niche Web site.

    Science.gov (United States)

    Dimenstein, Izak B; Dimenstein, Simon I

    2013-10-01

    This technical note presents the development of a methodological laboratory niche Web site. The "Grossing Technology in Surgical Pathology" (www.grossing-technology.com) Web site is used as an example. Although common steps in creation of most Web sites are followed, there are particular requirements for structuring the template's menu on methodological laboratory Web sites. The "nested doll principle," in which one object is placed inside another, most adequately describes the methodological approach to laboratory Web site design. Fragmentation in presenting the Web site's material highlights the discrete parts of the laboratory procedure. An optimally minimal triad of components can be recommended for the creation of a laboratory niche Web site: a main set of media, a blog, and an ancillary component (host, contact, and links). The inclusion of a blog makes the Web site a dynamic forum for professional communication. By forming links and portals, cloud computing opens opportunities for connecting a niche Web site with other Web sites and professional organizations. As an additional source of information exchange, methodological laboratory niche Web sites are destined to parallel both traditional and new forms, such as books, journals, seminars, webinars, and internal educational materials. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Pedagogy for teaching and learning cooperatively on the Web: a Web-based pharmacology course.

    Science.gov (United States)

    Tse, Mimi M Y; Pun, Sandra P Y; Chan, Moon Fai

    2007-02-01

    The Internet is becoming a preferred place to find information. Millions of people go online in search of health and medical information. Likewise, the demand for Web-based courses grows. This article presents the development, utilization and evaluation of a web-based pharmacology course for nursing students. The course was developed based on 150 commonly used drugs. A total of 110 year-one nursing students took part in the course. After attending six hours of face-to-face pharmacology lectures over three weeks, students were invited to complete a questionnaire (pre-test) about learning pharmacology. The course materials were then uploaded to WebCT for students' self-directed learning and attempts to pass two scheduled online quizzes. At the end of the semester, students were given the same questionnaire (post-test). There was a significant increase in understanding (as opposed to memorizing) the subject content, in the development of problem-solving ability in learning pharmacology, and in becoming an independent learner (p < 0.05). Online quizzes yielded satisfactory results. In the focused group interview, students appreciated the time flexibility and convenience associated with web-based learning, and they made good suggestions for enhancing web-based learning. The Web-based approach is promising for teaching and learning pharmacology for nurses and other health-care professionals.

  5. Deep Web : acceso, seguridad y análisis de tráfico

    OpenAIRE

    Cagiga Vila, Ignacio

    2017-01-01

    ABSTRACT: This work aims to carry out a technical analysis of the Deep Web in the field of networks and Internet technologies. The main part of the project can be divided in two: access to the Deep Web as a client, and the implementation of a relay of the Tor Network. Implementing a Tor Network relay makes it possible to understand how the anonymity and security of users who try to access the Deep Web through this network are ensured. The laboratory part of...

  6. Usability Evaluation of Public Web Mapping Sites

    Science.gov (United States)

    Wang, C.

    2014-04-01

    Web mapping sites are interactive maps that are accessed via web pages. With the rapid development of the Internet and the Geographic Information System (GIS) field, public web mapping sites are no longer foreign to people. Nowadays, people use these web mapping sites for various reasons, as a growing number of maps and related map services are freely available to end users. The increased number of users of web mapping sites has in turn led to more usability studies. Usability Engineering (UE), for instance, is an approach for analyzing and improving the usability of websites through examining and evaluating an interface. In this research, the UE method was employed to explore usability problems of four public web mapping sites, analyze the problems quantitatively and provide guidelines for future design based on the test results. Firstly, the development of usability studies is described, and several usability evaluation methods such as Usability Engineering (UE), User-Centered Design (UCD) and Human-Computer Interaction (HCI) are briefly introduced. Then the method and procedure of the experiments for the usability test are presented in detail. In this usability evaluation experiment, four public web mapping sites (Google Maps, Bing Maps, MapQuest, Yahoo Maps) were chosen as the test websites. And 42 people with different GIS skills (test users or experts), gender (male or female), age and nationality participated in the test to complete several test tasks in different teams. The test comprised three parts: a pre-test background information questionnaire, several test tasks for quantitative statistics and progress analysis, and a post-test questionnaire. The pre-test and post-test questionnaires focused on gaining qualitative, verbal explanations of the participants' actions. The design of the test tasks targeted gathering quantitative data on the errors and problems of the websites. Then, the results mainly from the test part were analyzed. The

  7. Geo-communication and web-based geospatial infrastructure

    DEFF Research Database (Denmark)

    Brodersen, Lars; Nielsen, Anders

    2005-01-01

    The introduction of web-services as index-portals based on geoinformation has changed the conditions for both content and form of geocommunication. A high number of players and interactions (as well as a very high number of all kinds of information and combinations of these) characterize web-services......, where maps are only a part of the whole. These new conditions demand new ways of modelling the processes leading to geo-communication. One new aspect is the fact that the service providers have become a part of the geo-communication process with influence on the content. Another aspect...

  8. Mechanistic pathways of recognition of a solvent-inaccessible cavity of protein by a ligand

    Science.gov (United States)

    Mondal, Jagannath; Pandit, Subhendu; Dandekar, Bhupendra; Vallurupalli, Pramodh

    One of the puzzling questions in the realm of protein-ligand recognition is how a solvent-inaccessible hydrophobic cavity of a protein gets recognized by a ligand. We address the topic by simulating, for the first time, the complete binding process of benzene from aqueous media to the well-known buried cavity of L99A T4 Lysozyme at atomistic resolution. Our multiple unbiased microsecond-long trajectories, which were completely blind to the location of the target binding site, unequivocally identify the kinetic pathways along which the benzene molecule meanders across the solvent and protein and ultimately, spontaneously and with high precision, recognizes the deeply buried cavity of L99A T4 Lysozyme. Our simulation, combined with analysis based on a Markov state model and free energy calculations, reveals that there is more than one distinct ligand-binding pathway. Intriguingly, each of the identified pathways involves the transient opening of a channel of the protein prior to ligand binding. The work also deciphers rich mechanistic details of the unbinding kinetics of the ligand, obtained from enhanced sampling techniques.
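    A minimal illustration of the Markov-state-model style of analysis mentioned above (not the authors' actual workflow) is sketched below: a transition matrix is estimated from a discretized trajectory and its stationary distribution is computed. The three states and the lag time are toy assumptions; production analyses rely on dedicated packages and careful state definition and validation.

```python
# Minimal sketch of a Markov state model (MSM) estimate from a discretized
# trajectory, in the spirit of the analysis described above. The state labels
# and lag time are hypothetical toy choices.
import numpy as np

def estimate_msm(dtraj, n_states, lag):
    """Row-stochastic transition matrix from a discrete trajectory."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i, j] += 1
    counts += 1e-8                      # avoid division by zero for unvisited states
    return counts / counts.sum(axis=1, keepdims=True)

def stationary_distribution(T):
    """Stationary distribution = left eigenvector of T for eigenvalue ~1."""
    evals, evecs = np.linalg.eig(T.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    return pi / pi.sum()

# Toy trajectory: 0 = ligand in solvent, 1 = at protein surface, 2 = bound in cavity
dtraj = np.array([0, 0, 1, 0, 1, 1, 2, 2, 2, 1, 2, 2, 0, 1, 2, 2])
T = estimate_msm(dtraj, n_states=3, lag=1)
print("Transition matrix:\n", T)
print("Stationary distribution:", stationary_distribution(T))
```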

  9. The DIRAC Web Portal 2.0

    Science.gov (United States)

    Mathe, Z.; Casajus Ramo, A.; Lazovsky, N.; Stagni, F.

    2015-12-01

    For many years the DIRAC interware (Distributed Infrastructure with Remote Agent Control) has had a web interface, allowing the users to monitor DIRAC activities and also interact with the system. Since then many new web technologies have emerged; therefore, a redesign and a new implementation of the DIRAC Web portal were necessary, taking into account the lessons learnt using the old portal. These new technologies made it possible to build a more compact, robust and responsive web interface that enables users to have better control over the whole system while keeping a simple interface. The web framework provides a large set of “applications”, each of which can be used for interacting with various parts of the system. Communities can also create their own sets of personalised web applications, and can easily extend already existing ones with minimal effort. Each user can configure and personalise the view for each application and save it using the DIRAC User Profile service as a RESTful state provider, instead of using cookies. The owner of a view can share it with other users or within a user community. Compatibility between different browsers is assured, as well as with mobile versions. In this paper, we present the new DIRAC Web framework as well as the LHCb extension of the DIRAC Web portal.

  10. Molecular structure input on the web

    Directory of Open Access Journals (Sweden)

    Ertl Peter

    2010-02-01

    Full Text Available Abstract A molecule editor, that is, a program for the input and editing of molecules, is an indispensable part of every cheminformatics or molecular processing system. This review focuses on a special type of molecule editors, namely those that are used for molecule structure input on the web. Scientific computing is now moving more and more in the direction of web services and cloud computing, with servers scattered all around the Internet. Thus a web browser has become the universal scientific user interface, and a tool to edit molecules directly within the web browser is essential. The review covers a history of web-based structure input, starting with simple text entry boxes and early molecule editors based on clickable maps, before moving to the current situation dominated by Java applets. One typical example - the popular JME Molecule Editor - will be described in more detail. Modern Ajax server-side molecule editors are also presented. Finally, the possible future direction of web-based molecule editing, based on technologies like JavaScript and Flash, is discussed.

  11. Web application security analysis using the Kali Linux operating system

    OpenAIRE

    BABINCEV IVAN M.; VULETIC DEJAN V.

    2016-01-01

    The Kali Linux operating system is described, as well as its purpose and capabilities. The groups of tools that Kali Linux provides are listed together with their methods of functioning, as well as the possibility of installing and using tools that are not an integral part of Kali. The final part shows a practical testing of web applications using tools from the Kali Linux operating system. The paper thus shows a part of the possibilities of this operating system in analysing web applications ...

  12. Untangling Web 2.0: Charting Web 2.0 Tools, the NCSS Guidelines for Effective Use of Technology, and Bloom's Taxonomy

    Science.gov (United States)

    Diacopoulos, Mark M.

    2015-01-01

    The potential for social studies to embrace instructional technology and Web 2.0 applications has become a growing trend in recent social studies research. As part of an ongoing process of collaborative enquiry between an instructional specialist and social studies teachers in a Professional Learning Community, a table of Web 2.0 applications was…

  13. Interactive Web-based e-learning for Studying Flexible Manipulator Systems

    Directory of Open Access Journals (Sweden)

    Abul K. M. Azad

    2008-03-01

    Full Text Available Abstract— This paper presents a web-based e-learning facility for simulation, modeling, and control of flexible manipulator systems. The simulation and modeling part includes finite difference and finite element simulations along with neural network and genetic algorithm based modeling strategies for flexible manipulator systems. The controller part constitutes a number of open-loop and closed-loop designs. Closed-loop control designs include classical, adaptive, and neuro-model based strategies. The Matlab software package and its associated toolboxes are used to implement these. The Matlab web server is used as the gateway between the facility and web access. ASP.NET technology and an SQL database are utilized to develop web applications for access control, user account and password maintenance, administrative management, and facility utilization monitoring. The reported facility provides a flexible yet effective approach to web-based interactive e-learning for an engineering system, and it can be extended to incorporate additional engineering systems within the e-learning framework.

  14. Everyday inclusive Web design: an activity perspective

    Directory of Open Access Journals (Sweden)

    Shaun K. Kane

    2007-01-01

    Full Text Available Introduction. The accessibility of Websites to people with disabilities is a problem that affects millions of people. Current accessibility initiatives generally target large government or commercial sites. A rapidly growing segment of online content is created by non-professionals. This content is often inaccessible, thereby excluding users with disabilities. Method. Activity theory is used to provide a model of the activities of non-professional, 'end-user' designers. Drawing from the author's experiences with technology learners, a holistic model of end-user Web design is produced. Analysis. The activity model is divided into three components. The activities of end-user designers, tool designers and Website consumers are examined. Potential barriers to the adoption of accessibility practices are identified. Results. Barriers to accessibility can occur within individual activity systems, or may be caused by interactions between systems. The accessibility of this content cannot be addressed by a single party, but requires collaboration between the designer and toolmaker. End-user designers work within a complex social environment and may face uncertainty regarding their roles as designers that affects their awareness of accessibility. Conclusion. Increasing the accessibility of user-produced content may require a holistic approach. An activity model may be helpful in producing tools and educational materials.

  15. Exploiting Multimedia in Creating and Analysing Multimedia Web Archives

    Directory of Open Access Journals (Sweden)

    Jonathon S. Hare

    2014-04-01

    Full Text Available The data contained on the web and the social web are inherently multimedia and consist of a mixture of textual, visual and audio modalities. Community memories embodied on the web and social web contain a rich mixture of data from these modalities. In many ways, the web is the greatest resource ever created by humankind. However, due to the dynamic and distributed nature of the web, its content changes, appears and disappears on a daily basis. Web archiving provides a way of capturing snapshots of (parts of) the web for preservation and future analysis. This paper provides an overview of techniques we have developed within the context of the EU funded ARCOMEM (ARchiving COmmunity MEMories) project to allow multimedia web content to be leveraged during the archival process and for post-archival analysis. Through a set of use cases, we explore several practical applications of multimedia analytics within the realm of web archiving, web archive analysis and multimedia data on the web in general.

  16. Mobile Web: the democratisation of an essential tool

    CERN Multimedia

    Laëtitia Pedroso

    2011-01-01

    For many of us, using the Web is a natural and even indispensable part of our daily lives. But only 20% of the world’s population have access to it. Tim Berners-Lee, the Web's inventor, created the Web Foundation in 2007 with the aim of accelerating access to the Web for the rest of the world's population. Showcased at the Sharing Knowledge conference, the Mobile Web is one of the Web Foundation’s projects in which members of CERN are involved.   Virtually no access to the Web but a very extensive GSM network: that's the situation that many developing countries especially in Africa find themselves in. “Owing to its size, its unstable soils and its limited infrastructure, it is technically very difficult to bring optic fibres for Internet connections to all regions of Africa. The idea of the Mobile Web project is therefore to be able to use the GSM network to access the Web,” explains Silvano de Gennaro, a member of the video team within CERN's Communication Gro...

  17. LHCb : The DIRAC Web Portal 2.0

    CERN Multimedia

    Mathe, Zoltan; Lazovsky, N; Stagni, Federico

    2015-01-01

    For many years the DIRAC interware (Distributed Infrastructure with Remote Agent Control) has had a web interface, allowing the users to monitor DIRAC activities and also interact with the system. Since then many new web technologies have emerged; therefore, a redesign and a new implementation of the DIRAC Web portal were necessary, taking into account the lessons learnt using the old portal. These new technologies made it possible to build a more compact and more responsive web interface that is robust and that enables users to have more control over the whole system while keeping a simple interface. The framework provides a large set of "applications", each of which can be used for interacting with various parts of the system. Communities can also create their own set of personalised web applications, and can easily extend already existing web applications with minimal effort. Each user can configure and personalise the view for each application and save it using the DIRAC User Profile service as RESTful state prov...

  18. Endoscopic ultrasound-guided pancreaticobiliary intervention in patients with surgically altered anatomy and inaccessible papillae: A review of current literature

    Science.gov (United States)

    Martin, Aaron; Kistler, Charles Andrew; Wrobel, Piotr; Yang, Juliana F.; Siddiqui, Ali A.

    2016-01-01

    The management of pancreaticobiliary disease in patients with surgically altered anatomy is a growing problem for gastroenterologists today. Over the years, endoscopic ultrasound (EUS) has emerged as an important diagnostic and therapeutic modality in the treatment of pancreaticobiliary disease. Patient anatomy has become increasingly complex due to advances in surgical resection of pancreaticobiliary disease, and EUS has emerged as the therapy of choice when endoscopic retrograde cholangiopancreatography fails to achieve cannulation or when the papilla is inaccessible, such as in gastric or duodenal obstruction. The current article gives a comprehensive review of the current literature for EUS-guided intervention of the pancreaticobiliary tract in patients with altered surgical anatomy. PMID:27386471

  19. Monitoring small reservoirs' storage with satellite remote sensing in inaccessible areas

    Directory of Open Access Journals (Sweden)

    N. Avisse

    2017-12-01

    Full Text Available In river basins with water storage facilities, the availability of regularly updated information on reservoir level and capacity is of paramount importance for the effective management of those systems. However, for the vast majority of reservoirs around the world, storage levels are either not measured or not readily available due to financial, political, or legal considerations. This paper proposes a novel approach using Landsat imagery and digital elevation models (DEMs) to retrieve information on storage variations in any inaccessible region. Unlike existing approaches, the method does not require any in situ measurement and is appropriate for monitoring small, and often undocumented, irrigation reservoirs. It consists of three recovery steps: (i) a 2-D dynamic classification of Landsat spectral band information to quantify the surface area of water, (ii) a statistical correction of DEM data to characterize the topography of each reservoir, and (iii) a 3-D reconstruction algorithm to correct for clouds and Landsat 7 Scan Line Corrector failure. The method is applied to quantify reservoir storage in the Yarmouk basin in southern Syria, where ground monitoring is impeded by the ongoing civil war. It is validated against available in situ measurements in neighbouring Jordanian reservoirs. Coefficients of determination range from 0.69 to 0.84, and the normalized root-mean-square error from 10 to 16 % for storage estimations on six Jordanian reservoirs with maximal water surface areas ranging from 0.59 to 3.79 km2.
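    The first recovery step, quantifying the water surface area from Landsat spectral bands, can be illustrated with a much simpler proxy than the paper's 2-D dynamic classification: thresholding a normalized difference water index (NDWI) computed from the green and near-infrared bands. The sketch below uses synthetic band arrays and an assumed 30 m pixel size purely for illustration.

```python
# Simplified illustration of step (i): estimating water surface area from
# Landsat bands. The paper uses a 2-D dynamic classification; here a basic
# NDWI threshold (green vs. NIR) stands in, with synthetic band arrays.
import numpy as np

def water_surface_area(green, nir, pixel_size_m=30.0, threshold=0.0):
    """Return estimated water area in km^2 from green/NIR reflectance arrays."""
    ndwi = (green - nir) / (green + nir + 1e-9)   # normalized difference water index
    water_mask = ndwi > threshold                 # positive NDWI ~ open water
    n_water_pixels = int(water_mask.sum())
    return n_water_pixels * (pixel_size_m ** 2) / 1e6

# Synthetic 100x100 scene: a "reservoir" patch with high green / low NIR reflectance
green = np.full((100, 100), 0.10)
nir = np.full((100, 100), 0.30)
green[40:60, 40:70] = 0.25
nir[40:60, 40:70] = 0.05
print("Estimated water area: %.2f km^2" % water_surface_area(green, nir))
```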

  20. Web components and the semantic web

    OpenAIRE

    Casey, Maire; Pahl, Claus

    2003-01-01

    Component-based software engineering on the Web differs from traditional component and software engineering. We investigate Web component engineering activities that are crucial for the development, composition, and deployment of components on the Web. The current Web Services and Semantic Web initiatives strongly influence our work. Focussing on Web component composition, we develop description and reasoning techniques that support a component developer in the composition activities, focussing...

  1. A teen's guide to creating web pages and blogs

    CERN Document Server

    Selfridge, Peter; Osburn, Jennifer

    2008-01-01

    Whether using a social networking site like MySpace or Facebook or building a Web page from scratch, millions of teens are actively creating a vibrant part of the Internet. This is the definitive teen's guide to publishing exciting web pages and blogs on the Web. This easy-to-follow guide shows teenagers how to: create great MySpace and Facebook pages; build their own unique, personalized Web site; share the latest news with exciting blogging ideas; and protect themselves online with cyber-safety tips. Written by a teenager for other teens, this book leads readers step-by-step through the basics of web and blog design. In this book, teens learn to go beyond clicking through web sites to learning winning strategies for web design and great ideas for writing blogs that attract attention and readership.

  2. WebVis: a hierarchical web homepage visualizer

    Science.gov (United States)

    Renteria, Jose C.; Lodha, Suresh K.

    2000-02-01

    WebVis, the Hierarchical Web Home Page Visualizer, is a tool for managing home web pages. The user can access this tool via the WWW and obtain a hierarchical visualization of their home web pages. WebVis is a real-time interactive tool that supports many different queries on the statistics of internal files such as size, age, and type. In addition, statistics on embedded information such as VRML files, Java applets, images and sound files can be extracted and queried. Results of these queries are visualized using the color, shape and size of different nodes of the hierarchy. The visualization assists the user in a variety of tasks, such as quickly finding outdated information or locating large files. WebVis is one solution to the growing web space maintenance problem. Implementation of WebVis is realized with Perl and Java. Perl pattern matching and file handling routines are used to collect and process web space linkage information and web document information. Java utilizes the collected information to produce a visualization of the web space. Java also provides WebVis with real-time interactivity, while running off the WWW. Some WebVis examples of home web page visualization are presented.

  3. PHP The Good Parts

    CERN Document Server

    MacIntyre, Peter

    2010-01-01

    Get past all the hype about PHP and dig into the real power of this language. This book explores the most useful features of PHP and how they can speed up the web development process, and explains why the most commonly used PHP elements are often misused or misapplied. You'll learn which parts add strength to object-oriented programming, and how to use certain features to integrate your application with databases. Written by a longtime member of the PHP community, PHP: The Good Parts is ideal for new PHP programmers, as well as web developers switching from other languages. Become familiar w

  4. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    International Nuclear Information System (INIS)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-01-01

    This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, the MySQL DBMS, and the Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.
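    As a rough illustration of the kind of backend step described (turning a web submission into the files needed for a cluster job), the sketch below generates a hypothetical batch script. The directory layout, scheduler directives, and the ScalaBLAST command line are invented assumptions; the actual SWA backend is written in PHP and uses its own conventions.

```python
# Hypothetical sketch of generating a cluster job script from web-form inputs.
# File names, paths, and the ScalaBLAST invocation are illustrative only.
from pathlib import Path

def write_job_script(job_id, query_fasta, database, n_procs=64, workdir="swa_jobs"):
    job_dir = Path(workdir) / job_id
    job_dir.mkdir(parents=True, exist_ok=True)
    script = job_dir / "run_job.sh"
    script.write_text(
        "#!/bin/bash\n"
        f"#PBS -N swa_{job_id}\n"
        f"#PBS -l nodes={n_procs // 8}:ppn=8\n"
        f"cd {job_dir}\n"
        # Hypothetical ScalaBLAST invocation; the real flags differ.
        f"mpirun -np {n_procs} scalablast -i {query_fasta} -d {database} "
        f"-o {job_dir}/results.out\n"
    )
    return script

print(write_job_script("demo001", "query.fasta", "nr"))
```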

  5. Realization of a system to demand and acquire information via world wide web; Realizzazione di un sistema per l'accesso e l'acquisizione di informazioni via web in un ente complesso

    Energy Technology Data Exchange (ETDEWEB)

    Bongiovanni, G [Rome Univ. La Sapienza, Rome (Italy); Di Marco, R A [ENEA, Sede Centrale, Rome (Italy). Funzione Centrale Informatica; Cappitelli, A

    1999-07-01

    The project realized within this thesis concerns the interactive web and shows how particular functionalities can be obtained with appropriately chosen tools. The project consists in the realization of a system which allows users with a suitable account to request and acquire information, and which allows whoever holds administration tasks to manage these users and part of the information system. The developed software contains an experimental part and a section dedicated to public-key cryptography, which has been employed to carry out secure transactions via the web. [Translated from Italian] The report describes a project for building an interactive web system and shows how particular functionalities can be obtained by the appropriate use of specific tools such as Java and the CGI specification. The system allows users, through a suitable account, to request and acquire information. The developed software includes an experimental part and a part dedicated to public-key cryptography, used to carry out secure transactions via the web.

  6. Standard biological parts knowledgebase.

    Directory of Open Access Journals (Sweden)

    Michal Galdzicki

    2011-02-01

    Full Text Available We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publically accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate "promoter" parts that are known to be both negatively and positively regulated. This method provides new web based data access to perform searches for parts that are not currently possible.
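    The SPARQL retrieval described in the abstract can be illustrated with a small rdflib sketch that selects parts typed as promoters and annotated with both regulation senses. The namespace and property names below are placeholders, not the actual SBOL-semantic vocabulary, and the triples are toy data.

```python
# Minimal sketch of the kind of SPARQL retrieval described above, using rdflib.
# The ontology namespace and properties are illustrative placeholders.
from rdflib import Graph, Namespace, Literal, RDF

SBOL = Namespace("http://example.org/sbol-semantic#")   # placeholder namespace

g = Graph()
# Toy data: two parts, one annotated with both regulation types.
g.add((SBOL.BBa_R0051, RDF.type, SBOL.Promoter))
g.add((SBOL.BBa_R0051, SBOL.regulation, Literal("negative")))
g.add((SBOL.BBa_R0051, SBOL.regulation, Literal("positive")))
g.add((SBOL.BBa_J23100, RDF.type, SBOL.Promoter))
g.add((SBOL.BBa_J23100, SBOL.regulation, Literal("positive")))

query = """
PREFIX sbol: <http://example.org/sbol-semantic#>
SELECT ?part WHERE {
    ?part a sbol:Promoter ;
          sbol:regulation "negative" ;
          sbol:regulation "positive" .
}
"""
for row in g.query(query):
    print("Dual-regulated promoter:", row.part)
```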

  7. Standard Biological Parts Knowledgebase

    Science.gov (United States)

    Galdzicki, Michal; Rodriguez, Cesar; Chandran, Deepak; Sauro, Herbert M.; Gennari, John H.

    2011-01-01

    We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publically accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate “promoter” parts that are known to be both negatively and positively regulated. This method provides new web based data access to perform searches for parts that are not currently possible. PMID:21390321

  8. Standard biological parts knowledgebase.

    Science.gov (United States)

    Galdzicki, Michal; Rodriguez, Cesar; Chandran, Deepak; Sauro, Herbert M; Gennari, John H

    2011-02-24

    We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publically accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate "promoter" parts that are known to be both negatively and positively regulated. This method provides new web based data access to perform searches for parts that are not currently possible.

  9. WebDMS: A Web-Based Data Management System for Environmental Data

    Science.gov (United States)

    Ekstrand, A. L.; Haderman, M.; Chan, A.; Dye, T.; White, J. E.; Parajon, G.

    2015-12-01

    DMS is an environmental Data Management System to manage, quality-control (QC), summarize, document chain-of-custody, and disseminate data from networks ranging in size from a few sites to thousands of sites, instruments, and sensors. The server-client desktop version of DMS is used by local and regional air quality agencies (including the Bay Area Air Quality Management District, the South Coast Air Quality Management District, and the California Air Resources Board), the EPA's AirNow Program, and the EPA's AirNow-International (AirNow-I) program, which offers countries the ability to run an AirNow-like system. As AirNow's core data processing engine, DMS ingests, QCs, and stores real-time data from over 30,000 active sensors at over 5,280 air quality and meteorological sites from over 130 air quality agencies across the United States. As part of the AirNow-I program, several instances of DMS are deployed in China, Mexico, and Taiwan. The U.S. Department of State's StateAir Program also uses DMS for five regions in China and plans to expand to other countries in the future. Recent development has begun to migrate DMS from an onsite desktop application to WebDMS, a web-based application designed to take advantage of cloud hosting and computing services to increase scalability and lower costs. WebDMS will continue to provide easy-to-use data analysis tools, such as time-series graphs, scatterplots, and wind- or pollution-rose diagrams, as well as allowing data to be exported to external systems such as the EPA's Air Quality System (AQS). WebDMS will also provide new GIS analysis features and a suite of web services through a RESTful web API. These changes will better meet air agency needs and allow for broader national and international use (for example, by the AirNow-I partners). We will talk about the challenges and advantages of migrating DMS to the web, modernizing the DMS user interface, and making it more cost-effective to enhance and maintain over time.
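    The RESTful web API mentioned above could be consumed roughly as in the sketch below. The base URL, endpoint path, parameter names, and response fields are invented for illustration and are not the actual WebDMS API.

```python
# Hypothetical client sketch for a RESTful data API of the kind WebDMS is
# described as providing. URL, parameters, and fields are placeholders.
import requests

BASE_URL = "https://webdms.example.org/api/v1"   # placeholder host

def get_observations(site_id, parameter, start, end):
    resp = requests.get(
        f"{BASE_URL}/observations",
        params={"site": site_id, "parameter": parameter, "start": start, "end": end},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()   # assume a JSON list of {timestamp, value, qc_flag}

if __name__ == "__main__":
    data = get_observations("060670006", "PM2.5", "2015-08-01", "2015-08-02")
    for rec in data:
        print(rec["timestamp"], rec["value"], rec.get("qc_flag"))
```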

  10. A Collaborative Writing Project Using the Worldwide Web.

    Science.gov (United States)

    Sylvester, Allen; Essex, Christopher

    A student in a distance education course, as part of a midterm project, set out to build a Web site that had written communication as its main focus. The Web site, "The Global Campfire," was modeled on the old Appalachian tradition of the "Story Tree," where a storyteller begins a story and allows group members to add to it.…

  11. Functional webs for freeform architecture

    KAUST Repository

    Deng, Bailin

    2011-08-01

    Rationalization and construction-aware design dominate the issue of realizability of freeform architecture. The former means the decomposition of an intended shape into parts which are sufficiently simple and efficient to manufacture; the latter refers to a design procedure which already incorporates rationalization. Recent contributions to this topic have been concerned mostly with small-scale parts, for instance with planar faces of meshes. The present paper deals with another important aspect, namely long-range parts and supporting structures. It turns out that from the pure geometry viewpoint this means studying families of curves which cover surfaces in certain well-defined ways. Depending on the application one has in mind, different combinatorial arrangements of curves are required. We here restrict ourselves to so-called hexagonal webs which correspond to a triangular or tri-hex decomposition of a surface. The individual curve may have certain special properties, like being planar, being a geodesic, or being part of a circle. Each of these properties is motivated by manufacturability considerations and imposes constraints on the shape of the surface. We investigate the available degrees of freedom, show numerical methods of optimization, and demonstrate the effectivity of our approach and the variability of construction solutions derived from webs by means of actual architectural designs.

  12. Septipyridines as conformationally controlled substitutes for inaccessible bis(terpyridine)-derived oligopyridines in two-dimensional self-assembly

    Directory of Open Access Journals (Sweden)

    Daniel Caterbow

    2011-07-01

    Full Text Available The position of the peripheral nitrogen atoms in bis(terpyridine)-derived oligopyridines (BTPs) has a strong impact on their self-assembly behavior at the liquid/HOPG (highly oriented pyrolytic graphite) interface. The intermolecular hydrogen bonding interactions in these peripheral pyridine units show specific 2D structures for each BTP isomer. From nine possible constitutional isomers only four have been described in the literature. The synthesis and self-assembling behavior of an additional isomer is presented here, but the remaining four members of the series are synthetically inaccessible. The self-assembling properties of three of the missing four BTP isomers can be mimicked by making use of the energetically preferred N–C–C–N transoid conformation between 2,2'-bipyridine subunits in a new class of so-called septipyridines. The structures are investigated by scanning tunneling microscopy (STM) and a combination of force-field and first-principles electronic structure calculations.

  13. Utilizing Web 2.0 Technologies for Library Web Tutorials: An Examination of Instruction on Community College Libraries' Websites Serving Large Student Bodies

    Science.gov (United States)

    Blummer, Barbara; Kenton, Jeffrey M.

    2015-01-01

    This is the second part of a series on Web 2.0 tools available from community college libraries' Websites. The first article appeared in an earlier volume of this journal and it illustrated the wide variety of Web 2.0 tools on community college libraries' Websites serving large student bodies (Blummer and Kenton 2014). The research found many of…

  14. Web buckling behavior under in-plane compression and shear loads for web reinforced composite sandwich core

    Science.gov (United States)

    Toubia, Elias Anis

    Sandwich construction is one of the most functional forms of composite structures developed by the composite industry. Due to the increasing demand of web-reinforced core for composite sandwich construction, a research study is needed to investigate the web plate instability under shear, compression, and combined loading. If the web, which is an integral part of the three dimensional web core sandwich structure, happens to be slender with respect to one or two of its spatial dimensions, then buckling phenomena become an issue in that it must be quantified as part of a comprehensive strength model for a fiber reinforced core. In order to understand the thresholds of thickness, web weight, foam type, and whether buckling will occur before material yielding, a thorough investigation needs to be conducted, and buckling design equations need to be developed. Often in conducting a parametric study, a special purpose analysis is preferred over a general purpose analysis code, such as a finite element code, due to the cost and effort usually involved in generating a large number of results. A suitable methodology based on an energy method is presented to solve the stability of symmetrical and specially orthotropic laminated plates on an elastic foundation. Design buckling equations were developed for the web modeled as a laminated plate resting on elastic foundations. The proposed equations allow for parametric studies without limitation regarding foam stiffness, geometric dimensions, or mechanical properties. General behavioral trends of orthotropic and symmetrical anisotropic plates show pronounced contribution of the elastic foundation and fiber orientations on the buckling resistance of the plate. The effects of flexural anisotropy on the buckling behavior of long rectangular plates when subjected to pure shear loading are well represented in the model. The reliability of the buckling equations as a design tool is confirmed by comparison with experimental results
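    To make the kind of design buckling equation described above concrete, the classical closed-form special case of a simply supported, specially orthotropic plate on a Winkler foundation under uniaxial compression is shown below. This is a textbook result given for orientation only, not the paper's equations, and the Winkler modulus k is an assumed idealization of the foam support.

```latex
% Simply supported, specially orthotropic plate (a x b, bending stiffnesses
% D11, D12, D22, D66) on a Winkler foundation of modulus k, under uniaxial
% compression N_x. With w = sin(m*pi*x/a) sin(n*pi*y/b), the energy (Rayleigh-
% Ritz) condition gives the critical load
\[
N_{x,\mathrm{cr}} = \min_{m,n}
\frac{D_{11}\!\left(\tfrac{m\pi}{a}\right)^{4}
    + 2\,(D_{12} + 2D_{66})\!\left(\tfrac{m\pi}{a}\right)^{2}\!\left(\tfrac{n\pi}{b}\right)^{2}
    + D_{22}\!\left(\tfrac{n\pi}{b}\right)^{4}
    + k}
   {\left(\tfrac{m\pi}{a}\right)^{2}}
\]
% The foundation term k raises the critical load, reflecting the pronounced
% contribution of the elastic foundation noted in the abstract.
```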

  15. ANALISIS DEL COMPORTAMIENTO DEL USUARIO WEB

    OpenAIRE

    ROMAN ASENJO, PABLO ENRIQUE

    2011-01-01

    Since the origins of the Web at CERN, there has been a recurring question among researchers and developers: what are the right structure and content for a Web site to attract and/or retain its visitors? In part, the answer to this question is strongly related to a better understanding of the motivations a user has when visiting a site. Traditionally, data mining (machine learning) algorithms have been used to extract...

  16. ANALISIS DEL COMPORTAMIENTO DEL USUARIO WEB

    OpenAIRE

    ROMAN ASENJO, PABLO ENRIQUE

    2011-01-01

    Since the origins of the Web at CERN, there has been a recurring question among researchers and developers: what are the right structure and content for a Web site to attract and/or retain its visitors? In part, the answer to this question is strongly related to a better understanding of the motivations a user has when visiting a site. Traditionally, data mining (machine learning) algorithms have been used to extract...

  17. Inaccessible Built Environments in Ghana’s Universities: The Bane of a Weak Legal and Regulatory Framework for Persons with Disabilities 1

    Directory of Open Access Journals (Sweden)

    John Tiah Bugri

    2017-05-01

    Full Text Available This is a qualitative study of the role of the legal and regulatory framework in making built environments accessible to Persons with Disabilities in six universities in Ghana. It revealed that the local component of legislation dealing with accessible environments was fragile and fraught with compliance challenges, administrative laxity and the lack of a time-conscious approach to issues, thereby resulting in inaccessible built environments. In effect, the study gives credence to the proposition of the social model that disability is a creation of humankind and recommends an amendment of Ghana’s Persons with Disability Act.

  18. The Web 2.0 as Marketing Tool: Opportunities for SMEs

    NARCIS (Netherlands)

    Constantinides, Efthymios

    2008-01-01

    The new generation of Internet applications widely known as Social Media or Web 2.0 offers corporations a whole range of opportunities for improving their marketing efficiency and internal operations. Web 2.0 applications have already become part of the daily life of an increasing number of

  19. Realization of a system to demand and acquire information via world wide web; Realizzazione di un sistema per l'accesso e l'acquisizione di informazioni via web in un ente complesso

    Energy Technology Data Exchange (ETDEWEB)

    Bongiovanni, G. [Rome Univ. La Sapienza, Rome (Italy); Di Marco, R.A. [ENEA, Sede Centrale, Rome (Italy). Funzione Centrale Informatica; Cappitelli, A.

    1999-07-01

    The project realized within this thesis concerns the interactive web and shows how particular functionalities can be obtained with appropriately chosen tools. The project consists in the realization of a system which allows users with a suitable account to request and acquire information, and which allows whoever holds administration tasks to manage these users and part of the information system. The developed software contains an experimental part and a section dedicated to public-key cryptography, which has been employed to carry out secure transactions via the web. [Translated from Italian] The report describes a project for building an interactive web system and shows how particular functionalities can be obtained by the appropriate use of specific tools such as Java and the CGI specification. The system allows users, through a suitable account, to request and acquire information. The developed software includes an experimental part and a part dedicated to public-key cryptography, used to carry out secure transactions via the web.

  20. Beginning ASP.NET Web Pages with WebMatrix

    CERN Document Server

    Brind, Mike

    2011-01-01

    Learn to build dynamic web sites with Microsoft WebMatrix. Microsoft WebMatrix is designed to make developing dynamic ASP.NET web sites much easier. This complete Wrox guide shows you what it is, how it works, and how to get the best from it right away. It covers all the basic foundations and also introduces HTML, CSS, and Ajax using jQuery, giving beginning programmers a firm foundation for building dynamic web sites. Examines how WebMatrix is expected to become the new recommended entry-level tool for developing web sites using ASP.NET. Arms beginning programmers, students, and educators with al

  1. The Web 2.0 as Marketing Tool: Opportunities for SMEs

    OpenAIRE

    Constantinides, Efthymios

    2008-01-01

    The new generation of Internet applications widely known as Social Media or Web 2.0 offers corporations a whole range of opportunities for improving their marketing efficiency and internal operations. Web 2.0 applications have already become part of the daily life of an increasing number of consumers who regard them as prime channels of communication, information exchange, sharing of expertise, dissemination of individual creativity and entertainment. Web logs, podcasts, online forums and soc...

  2. Non-visual Web Browsing: Beyond Web Accessibility.

    Science.gov (United States)

    Ramakrishnan, I V; Ashok, Vikas; Billah, Syed Masum

    2017-07-01

    People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant contents. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability.

  3. A midas plugin to enable construction of reproducible web-based image processing pipelines.

    Science.gov (United States)

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  4. A Midas Plugin to Enable Construction of Reproducible Web-based Image Processing Pipelines

    Directory of Open Access Journals (Sweden)

    Michael eGrauer

    2013-12-01

    Full Text Available Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based UI, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  5. Using crowdsourced web content for informing water systems operations in snow-dominated catchments

    Science.gov (United States)

    Giuliani, Matteo; Castelletti, Andrea; Fedorov, Roman; Fraternali, Piero

    2016-12-01

    Snow is a key component of the hydrologic cycle in many regions of the world. Despite recent advances in environmental monitoring that are making a wide range of data available, continuous snow monitoring systems that can collect data at high spatial and temporal resolution are not well established yet, especially in inaccessible high-latitude or mountainous regions. The unprecedented availability of user-generated data on the web is opening new opportunities for enhancing real-time monitoring and modeling of environmental systems based on data that are public, low-cost, and spatiotemporally dense. In this paper, we contribute a novel crowdsourcing procedure for extracting snow-related information from public web images, either produced by users or generated by touristic webcams. A fully automated process fetches mountain images from multiple sources, identifies the peaks present therein, and estimates virtual snow indexes representing a proxy of the snow-covered area. Our procedure has the potential for complementing traditional snow-related information, minimizing costs and efforts for obtaining the virtual snow indexes and, at the same time, maximizing the portability of the procedure to several locations where such public images are available. The operational value of the obtained virtual snow indexes is assessed for a real-world water-management problem, the regulation of Lake Como, where we use these indexes for informing the daily operations of the lake. Numerical results show that such information is effective in extending the anticipation capacity of the lake operations, ultimately improving the system performance.
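    The virtual snow index acts as a proxy for the snow-covered area visible in a public image. A deliberately simplified sketch is shown below: it counts bright, low-chroma (snow-like) pixels inside a mountain mask. The mask, thresholds, and image source are assumptions; the actual procedure aligns images with known peak outlines and is considerably more involved.

```python
# Toy sketch of a "virtual snow index": the fraction of snow-like (bright,
# low-chroma) pixels within an assumed mountain region of a webcam image.
import numpy as np
from PIL import Image

def virtual_snow_index(image_path, mountain_mask, brightness_thr=200, chroma_thr=30):
    img = np.asarray(Image.open(image_path).convert("RGB")).astype(int)
    brightness = img.mean(axis=2)
    chroma = img.max(axis=2) - img.min(axis=2)          # low chroma ~ white/grey pixels
    snow = (brightness > brightness_thr) & (chroma < chroma_thr) & mountain_mask
    return snow.sum() / max(int(mountain_mask.sum()), 1)

# Example usage with a hypothetical webcam frame and a rectangular mountain mask:
# mask = np.zeros((1080, 1920), dtype=bool); mask[200:700, :] = True
# print(virtual_snow_index("webcam_frame.jpg", mask))
```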

  6. Towards Development of Web-based Assessment System Based on Semantic Web Technology

    Directory of Open Access Journals (Sweden)

    Hosam Farouk El-Sofany

    2011-01-01

    Full Text Available The assessment process in an educational system is an important and primordial part of its success, as it assures the correct transmission of knowledge and ensures that students are working correctly and succeed in acquiring the needed knowledge. In this study, we aim to include Semantic Web technologies in the e-learning process as new components. We use the Semantic Web (SW) to: 1) support the evaluation of open questions in e-learning courses, 2) support the automatic creation of questions and exams, and 3) support the evaluation of exams created by the system. These components should allow for measuring academic performance, providing feedback mechanisms, and improving participative and collaborative ideas. Our goal is to use Semantic Web and wireless technologies to design and implement an assessment system that allows students to take web-based tutorials, quizzes, free exercises, and exams; to download course reviews, previous exams and their model answers; and to access the system through mobile devices to take quick quizzes and exercises. The system facilitates the automatic generation of balanced and varied exam sheets that contain different types of questions covering the entire curriculum and that progress gradually from easy to difficult. The system provides teachers and administrators with several services, such as storing different types of questions, generating exams with specific criteria, and uploading course assignments, exams, and reviews.
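    The exam-generation service described above (balanced topic coverage, ordered from easy to difficult) can be sketched as a simple selection-and-sort step. The question-bank structure used below is an assumption for illustration, not the system's actual data model.

```python
# Small sketch of automatic exam assembly: pick a balanced set of questions
# covering every topic, then order them from easy to difficult.
import random

def generate_exam(question_bank, per_topic=2, seed=None):
    rng = random.Random(seed)
    exam = []
    topics = sorted({q["topic"] for q in question_bank})
    for topic in topics:
        pool = [q for q in question_bank if q["topic"] == topic]
        exam.extend(rng.sample(pool, min(per_topic, len(pool))))
    # Display gradually from easy to difficult, as the abstract specifies.
    return sorted(exam, key=lambda q: q["difficulty"])

bank = [
    {"id": 1, "topic": "RDF",    "difficulty": 1, "text": "Define an RDF triple."},
    {"id": 2, "topic": "RDF",    "difficulty": 3, "text": "Model a course as RDF."},
    {"id": 3, "topic": "SPARQL", "difficulty": 2, "text": "Write a SELECT query."},
    {"id": 4, "topic": "SPARQL", "difficulty": 4, "text": "Optimize this query."},
]
for q in generate_exam(bank, per_topic=1, seed=42):
    print(q["difficulty"], q["topic"], q["text"])
```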

  7. The role of ciliates within the microbial food web in the eutrophicated part of Kaštela Bay (Middle Adriatic Sea)

    Directory of Open Access Journals (Sweden)

    Natalia Bojanic

    2006-09-01

    Full Text Available Interactions among phytoplankton, bacterioplankton, heterotrophic nanoflagellates (HNF), ciliated protozoa and copepod nauplii were studied in the eutrophicated part of Kaštela Bay from May 1998 to November 1999. Special emphasis was placed on relationships between size categories of nonloricate ciliates (NLC) and other microbial food web components. Biomasses of phytoplankton and bacteria were primarily influenced by abiotic parameters. Temperature indirectly controlled variation in HNF biomass through the changes in biomass of bacteria and the smaller phytoplankton fraction. Besides HNF, bacterial biomass was affected by the NLC

  8. That obscure object of desire: multimedia metadata on the Web, part 2

    NARCIS (Netherlands)

    F.-M. Nack (Frank); J.R. van Ossenbruggen (Jacco); L. Hardman (Lynda)

    2003-01-01

    This article discusses the state of the art in metadata for audio-visual media in large semantic networks, such as the Semantic Web. Our discussion is predominantly motivated by the two most widely known approaches towards machine-processable and semantic-based content description,

  9. That obscure object of desire: multimedia metadata on the Web, part 1

    NARCIS (Netherlands)

    F.-M. Nack (Frank); J.R. van Ossenbruggen (Jacco); L. Hardman (Lynda)

    2003-01-01

    This article discusses the state of the art in metadata for audio-visual media in large semantic networks, such as the Semantic Web. Our discussion is predominantly motivated by the two most widely known approaches towards machine-processable and semantic-based content description,

  10. Web server for priority ordered multimedia services

    Science.gov (United States)

    Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund

    2001-10-01

    In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions of the CM services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for improved disk access and higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority-ordered buffering of the retrieved Web pages and CM data streams that are fed into an autoregressive moving average (ARMA) based traffic-shaping circuitry before being transmitted through the network.
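    The stated priority order can be captured with a small heap-based request queue, as sketched below. The numeric priority levels and the request representation are illustrative assumptions; the actual SM/IDS/ITS pipeline involves much more than this queue.

```python
# Minimal sketch of the priority ordering described above, using a heap-based
# queue. Lower number = served first; a counter keeps FIFO order within a level.
import heapq
import itertools

PRIORITY = {
    "admin_read_write": 0,
    "hot_cm_web_multicast": 1,
    "cm_read": 2,
    "web_read": 3,
    "cm_write": 4,
    "web_write": 5,
}

class RequestQueue:
    def __init__(self):
        self._heap, self._counter = [], itertools.count()

    def submit(self, kind, payload):
        heapq.heappush(self._heap, (PRIORITY[kind], next(self._counter), kind, payload))

    def next_request(self):
        _, _, kind, payload = heapq.heappop(self._heap)
        return kind, payload

q = RequestQueue()
q.submit("web_write", "upload page.html")
q.submit("cm_read", "stream movie.mp4")
q.submit("admin_read_write", "update config")
while True:
    try:
        print(q.next_request())   # admin first, then cm_read, then web_write
    except IndexError:
        break
```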

  11. Web Caching

    Indian Academy of Sciences (India)

    leveraged through Web caching technology. Specifically, Web caching becomes an ... Web routing can improve the overall performance of the Internet. Web caching is similar to memory system caching - a Web cache stores Web resources in ...

  12. Balloon Blocking Technique (BBT) for Superselective Catheterization of Inaccessible Arteries with Conventional and Modified Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Morishita, Hiroyuki, E-mail: hmorif@koto.kpu-m.ac.jp, E-mail: mori-h33@xa2.so-net.ne.jp [Japan Red Cross Kyoto Daiichi Hospital, Department of Diagnostic Radiology (Japan); Takeuchi, Yoshito, E-mail: yotake62@qg8.so-net.ne.jp [Kyoto Prefectural University of Medicine, Department of Radiology, North Medical Center (Japan); Ito, Takaaki, E-mail: takaaki@koto.kpu-m.ac.jp [Japan Red Cross Kyoto Daiichi Hospital, Department of Diagnostic Radiology (Japan); Hayashi, Natsuko, E-mail: hayashin@koto.kpu-m.ac.jp [Kyoto Prefectural University of Medicine, Department of Radiology, Graduate School of Medical Science (Japan); Sato, Osamu, E-mail: osamu-sato@kyoto1-jrc.org [Japan Red Cross Kyoto Daiichi Hospital, Department of Diagnostic Radiology (Japan)

    2016-06-15

    Purpose: The purpose of the study was to retrospectively evaluate the efficacy and safety of the balloon blocking technique (BBT). Materials and Methods: The BBT was performed in six patients (all males, mean 73.5 years) in whom superselective catheterization for transcatheter arterial embolization by the conventional microcatheter techniques had failed due to anatomical difficulty, including targeted arteries originating steeply or hooked from parent arteries. All BBT procedures were performed using Seldinger’s transfemoral method. Occlusive balloons were deployed and inflated at the distal side of the target artery branching site in the parent artery via transfemoral access. A microcatheter was delivered from a 5-F catheter via another femoral access and was advanced over the microguidewire into the target artery, under balloon blockage of advancement of the microguidewire into non-target branches. After the balloon catheter was deflated and withdrawn, optimal interventions were performed through the microcatheter. Results: After success of accessing the targeted artery by BBT, optimal interventions were accomplished in all patients with no complications other than vasovagal hypotension, which responded to nominal therapy. Conclusion: The BBT may be useful in superselective catheterization of inaccessible arteries due to anatomical difficulties.

  13. WebSelF: A Web Scraping Framework

    DEFF Research Database (Denmark)

    Thomsen, Jakob; Ernst, Erik; Brabrand, Claus

    2012-01-01

    We present WebSelF, a framework for web scraping which models the process of web scraping and decomposes it into four conceptually independent, reusable, and composable constituents. We have validated our framework through a full parameterized implementation that is flexible enough to capture... previous work on web scraping. We have experimentally evaluated our framework and implementation in an experiment that evaluated several qualitatively different web scraping constituents (including previous work and combinations hereof) on about 11,000 HTML pages on daily versions of 17 web sites over... a period of more than one year. Our framework solves three concrete problems with current web scraping and our experimental results indicate that composition of previous and our new techniques achieves a higher degree of accuracy, precision and specificity than existing techniques alone....
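    The general idea of composable scraping constituents, where one constituent selects a fragment from a page and can fall back to an alternative when a site's layout changes, can be sketched as below. This mirrors the spirit of the framework only; it is not WebSelF's actual API or implementation.

```python
# Sketch of composable selection functions for scraping: a primary selector
# extracts a fragment, and a fallback takes over when it stops matching.
import re
from typing import Callable, Optional

Selector = Callable[[str], Optional[str]]

def regex_selector(pattern: str) -> Selector:
    rx = re.compile(pattern, re.DOTALL)
    return lambda html: (m.group(1) if (m := rx.search(html)) else None)

def with_fallback(primary: Selector, fallback: Selector) -> Selector:
    return lambda html: primary(html) if primary(html) is not None else fallback(html)

page = "<html><body><div id='price'>42 DKK</div></body></html>"
select_price = with_fallback(
    regex_selector(r"<span class='price'>(.*?)</span>"),   # old page layout
    regex_selector(r"<div id='price'>(.*?)</div>"),         # current layout
)
print(select_price(page))   # -> 42 DKK
```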

  14. Web 2.0: Inherent tensions and evident challenges for education

    DEFF Research Database (Denmark)

    Dohn, Nina Bonderup

    2009-01-01

    In upper tertiary educational programmes around the world, the new Web-mediated communication practices termed Web 2.0 are introduced as learning activities with the goal of facilitating learning through collaborative knowledge construction. The aim of this paper is to point to discrepancies...... in the views of learning, knowledge, and the goals of the practice implicit in Web 2.0 and educational practices and to argue that these discrepancies lead to theoretical tensions and practical challenges when Web 2.0 practices are utilized for educational purposes. The article is structured into four main...... parts: First, Web 2.0 is characterized from a practice perspective. Second, some conceptual discrepancies between the "practice logics" of Web 2.0 and educational practices are identified. Third, the question of transcending the discrepancies is raised through a discussion of related pedagogical...

  15. Using Technology of .Net Web Services in the Area of Automation

    Directory of Open Access Journals (Sweden)

    Martin Hnik

    2009-12-01

    Full Text Available This work deals with a technology for data exchange XML Web Services and its application to specific tasks. One of the applications created allows you to monitor and control the real thermal process through a number of client devices, independent of the operating system, the type or their location. The thermal process can be controlled, for example, by another process, a website or a mobile phone. The system is designed from its base and contains three main parts. The hardware part consists from a measuring card, actuators and temperature sensors. The core application is a server that is running the XML Web Service, Windows Service and SQL Server. Client software for mobile phones and web sites was also created.

  16. Faculty Recommendations for Web Tools: Implications for Course Management Systems

    Science.gov (United States)

    Oliver, Kevin; Moore, John

    2008-01-01

    A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…

  17. 07051 Executive Summary -- Programming Paradigms for the Web: Web Programming and Web Services

    OpenAIRE

    Hull, Richard; Thiemann, Peter; Wadler, Philip

    2007-01-01

    The world-wide web raises a variety of new programming challenges. To name a few: programming at the level of the web browser, data-centric approaches, and attempts to automatically discover and compose web services. This seminar brought together researchers from the web programming and web services communities and strove to engage them in communication with each other. The seminar was held in an unusual style, in a mixture of short presentations and in-depth discussio...

  18. Research and implementation of a Web-based remote desktop image monitoring system

    International Nuclear Information System (INIS)

    Ren Weijuan; Li Luofeng; Wang Chunhong

    2010-01-01

    This work studied and implemented a Web-based ISS (Image Snapshot Server) system using Java Web technology. The ISS system consisted of a client web browser and a server. The server part could be divided into three modules: the screenshot software, the web server and the Oracle database. The screenshot software captured the desktop environment of the remotely monitored PC and sent these pictures to a Tomcat web server for display on the web in real time. At the same time, these pictures were also saved in an Oracle database. Through the web browser, the monitoring person can view the real-time and historical desktop pictures of the monitored PC during a given period. It is very convenient for any user to monitor the desktop image of a remotely monitored PC. (authors)
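    As a very rough sketch of the client half of such a system (screenshot capture plus upload to a web server), the following Python snippet uses Pillow and requests; the upload URL, snapshot interval, and form field name are illustrative assumptions, and the original system was built with Java, Tomcat and Oracle rather than Python.

```python
import io
import time
import requests
from PIL import ImageGrab  # desktop capture; works on Windows and macOS

UPLOAD_URL = "http://monitor.example.com/iss/upload"  # hypothetical ISS endpoint

def capture_and_upload():
    # Grab the current desktop and encode it as PNG in memory.
    image = ImageGrab.grab()
    buffer = io.BytesIO()
    image.save(buffer, format="PNG")
    buffer.seek(0)
    # Post the snapshot; the server side would display and archive it.
    requests.post(UPLOAD_URL,
                  files={"snapshot": ("desktop.png", buffer, "image/png")},
                  timeout=10)

if __name__ == "__main__":
    while True:
        capture_and_upload()
        time.sleep(30)  # snapshot interval (assumed)
```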

  19. Examination of the 'web mode effect'

    DEFF Research Database (Denmark)

    Clement, Sanne Lund; Shamshiri-Petersen, Ditte

    Declining response rates are one of the most significant challenges for survey-based research today. Seen in isolation, traditional interviewer-based data collection methods are still the most effective but also the most expensive, especially when the greater difficulty in gaining responses is taken...... into account. As a solution, mixed-mode designs have been employed as a way to achieve higher response rates, while keeping the overall costs low. In particular, the use of web-based surveys has expanded considerably during the last few years, both as a single data collection method and as a component in mixed...... with telephone surveys, not enabling determination of a “web mode effect”. In this case, differences might as well be due to differences between self-administered and interviewer-administered collection methods. Other parts of the literature on mixed-mode designs including a web option use stratified sampling...

  20. Semantic Web Requirements through Web Mining Techniques

    OpenAIRE

    Hassanzadeh, Hamed; Keyvanpour, Mohammad Reza

    2012-01-01

    In recent years, Semantic web has become a topic of active research in several fields of computer science and has applied in a wide range of domains such as bioinformatics, life sciences, and knowledge management. The two fast-developing research areas semantic web and web mining can complement each other and their different techniques can be used jointly or separately to solve the issues in both areas. In addition, since shifting from current web to semantic web mainly depends on the enhance...

  1. Prey interception drives web invasion and spider size determines successful web takeover in nocturnal orb-web spiders.

    Science.gov (United States)

    Gan, Wenjin; Liu, Shengjie; Yang, Xiaodong; Li, Daiqin; Lei, Chaoliang

    2015-09-24

    A striking feature of web-building spiders is the use of silk to make webs, mainly for prey capture. However, building a web is energetically expensive and increases the risk of predation. To reduce such costs and still have access to abundant prey, some web-building spiders have evolved web invasion behaviour. In general, no consistent patterns of web invasion have emerged and the factors determining web invasion remain largely unexplored. Here we report web invasion among conspecifics in seven nocturnal species of orb-web spiders, and examined the factors determining the probability of webs that could be invaded and taken over by conspecifics. About 36% of webs were invaded by conspecifics, and 25% of invaded webs were taken over by the invaders. A web that was built higher and intercepted more prey was more likely to be invaded. Once a web was invaded, the smaller the size of the resident spider, the more likely its web would be taken over by the invader. This study suggests that web invasion, as a possible way of reducing costs, may be widespread in nocturnal orb-web spiders. © 2015. Published by The Company of Biologists Ltd.

  2. Prey interception drives web invasion and spider size determines successful web takeover in nocturnal orb-web spiders

    Directory of Open Access Journals (Sweden)

    Wenjin Gan

    2015-10-01

    Full Text Available A striking feature of web-building spiders is the use of silk to make webs, mainly for prey capture. However, building a web is energetically expensive and increases the risk of predation. To reduce such costs and still have access to abundant prey, some web-building spiders have evolved web invasion behaviour. In general, no consistent patterns of web invasion have emerged and the factors determining web invasion remain largely unexplored. Here we report web invasion among conspecifics in seven nocturnal species of orb-web spiders, and examined the factors determining the probability of webs that could be invaded and taken over by conspecifics. About 36% of webs were invaded by conspecifics, and 25% of invaded webs were taken over by the invaders. A web that was built higher and intercepted more prey was more likely to be invaded. Once a web was invaded, the smaller the size of the resident spider, the more likely its web would be taken over by the invader. This study suggests that web invasion, as a possible way of reducing costs, may be widespread in nocturnal orb-web spiders.

  3. Using Open Web APIs in Teaching Web Mining

    Science.gov (United States)

    Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju

    2009-01-01

    With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…

  4. Using a WebCT to Develop a Research Skills Module

    OpenAIRE

    Bellew Martin, Kelli; Lee, Jennifer

    2003-01-01

    At the start of every academic year, the University of Calgary Library welcomes 1,000 first-year biology students to basic library research skills sessions. These sessions are traditionally taught in lecture format with a PowerPoint presentation and students following along on computers. As part of a pilot project in the Fall of 2002, 200 first-year biology students received the session via WebCT. WebCT is the web-based course management system utilized by the University of Calgary; it d...

  5. Preenchimento automático de formulários na web oculta

    OpenAIRE

    Gustavo Zanini Kantorski

    2014-01-01

    Much of the information available on the Web is stored in online databases and is accessible only after a user submits a query through a search interface. This information is located in a part of the Web known as the Hidden Web or Deep Web and is generally inaccessible to traditional search engines. Since the data in the Hidden Web are accessed through query submissions, many works have focused on how to automatically fill...

  6. Correct software in web applications and web services

    CERN Document Server

    Thalheim, Bernhard; Prinz, Andreas; Buchberger, Bruno

    2015-01-01

    The papers in this volume aim at obtaining a common understanding of the challenging research questions in web applications comprising web information systems, web services, and web interoperability; obtaining a common understanding of verification needs in web applications; achieving a common understanding of the available rigorous approaches to system development, and the cases in which they have succeeded; identifying how rigorous software engineering methods can be exploited to develop suitable web applications; and at developing a European-scale research agenda combining theory, methods a

  7. Hybrid Exploration Agent Platform and Sensor Web System

    Science.gov (United States)

    Stoffel, A. William; VanSteenberg, Michael E.

    2004-01-01

    A sensor web to collect the scientific data needed to further exploration is a major and efficient asset to any exploration effort. This is true not only for lunar and planetary environments, but also for interplanetary and liquid environments. Such a system would also have myriad direct commercial spin-off applications. The Hybrid Exploration Agent Platform and Sensor Web, or HEAP-SW, like the ANTS concept, is a Sensor Web concept. The HEAP-SW is conceptually and practically a very different system. HEAP-SW is applicable to any environment and a huge range of exploration tasks. It is a very robust, low-cost, high-return solution to a complex problem. All of the technology for initial development and implementation is currently available. The HEAP Sensor Web, or HEAP-SW, consists of three major parts: the Hybrid Exploration Agent Platforms or HEAP, the Sensor Web or SW, and the immobile Data collection and Uplink units or DU. The HEAP-SW as a whole refers to any group of mobile agents or robots where each robot is a mobile data collection unit that spends most of its time acting in concert with all other robots, the DUs in the web, and the HEAP-SW's overall Command and Control (CC) system. Each DU and robot is, however, capable of acting independently. The three parts of the HEAP-SW system are discussed in this paper. The goals of the HEAP-SW system are: 1) to maximize the amount of exploration-enhancing science data collected; 2) to minimize data loss due to system malfunctions; 3) to minimize or, possibly, eliminate the risk of total system failure; 4) to minimize the size, weight, and power requirements of each HEAP robot; 5) to minimize HEAP-SW system costs. The rest of this paper discusses how these goals are attained.

  8. Web site development: applying aesthetics to promote breast health education and awareness.

    Science.gov (United States)

    Thomas, Barbara; Goldsmith, Susan B; Forrest, Anne; Marshall, Renée

    2002-01-01

    This article describes the process of establishing a Web site as part of a collaborative project using visual art to promote breast health education. The need for a more "user-friendly" comprehensive breast health Web site that is aesthetically rewarding was identified after an analysis of current Web sites available through the World Wide Web. Two predetermined sets of criteria, accountability and aesthetics, were used to analyze these sites and to generate ideas for creating a breast health education Web site using visual art. Results of the analyses conducted are included as well as the factors to consider for incorporating into a Web site. The process specified is thorough and can be applied to establish a Web site that is aesthetically rewarding and informative for a variety of educational purposes.

  9. Research of web application based on B/S structure testing

    International Nuclear Information System (INIS)

    Ou Ge; Zhang Hongmei; Song Liming

    2007-01-01

    Software testing is a very important method used to assure the quality of Web applications. With the fast development of Web applications, the old testing techniques can no longer satisfy the requirements. Because of this, people have begun to classify the different parts of an application, identify the content that can be tested by test tools, and study the structure of testing to enhance its efficiency. This paper analyses testing based on the features of Web applications, sums up the testing methods and gives some improvements to them. (authors)
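    The record stays at the level of methodology, but a minimal example of black-box testing a Web application over HTTP might look like the Python sketch below; the base URL, routes, and expected strings are placeholders, not details from the paper.

```python
import requests

BASE_URL = "http://localhost:8080"  # application under test (placeholder)

def test_homepage_loads():
    response = requests.get(f"{BASE_URL}/", timeout=5)
    assert response.status_code == 200
    assert "Welcome" in response.text  # expected content (assumed)

def test_login_rejects_bad_credentials():
    response = requests.post(f"{BASE_URL}/login",
                             data={"user": "nobody", "password": "wrong"},
                             timeout=5)
    assert response.status_code in (401, 403)

if __name__ == "__main__":
    test_homepage_loads()
    test_login_rejects_bad_credentials()
    print("All checks passed.")
```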

  10. Semantic Web Technologies for the Adaptive Web

    DEFF Research Database (Denmark)

    Dolog, Peter; Nejdl, Wolfgang

    2007-01-01

    Ontologies and reasoning are the key terms brought into focus by the semantic web community. Formal representation of ontologies in a common data model on the web can be taken as a foundation for adaptive web technologies as well. This chapter describes how ontologies shared on the semantic web...... provide conceptualization for the links which are a main vehicle to access information on the web. The subject domain ontologies serve as constraints for generating only those links which are relevant for the domain a user is currently interested in. Furthermore, user model ontologies provide additional...... means for deciding which links to show, annotate, hide, generate, and reorder. The semantic web technologies provide means to formalize the domain ontologies and metadata created from them. The formalization enables reasoning for personalization decisions. This chapter describes which components...

  11. A Type System for Dynamic Web Documents

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Sandholm, Anders

    2000-01-01

    Many interactive Web services use the CGI interface for communication with clients. They will dynamically create HTML documents that are presented to the client who then resumes the interaction by submitting data through incorporated form fields. This protocol is difficult to statically type-chec...... system is based on a flow analysis of which we prove soundness. We present an efficient runtime implementation that respects the semantics of only well-typed programs. This work is fully implemented as part of the system for defining interactive Web services.

  12. Usare WebDewey

    OpenAIRE

    Baldi, Paolo

    2016-01-01

    This presentation shows how to use the WebDewey tool. Features of WebDewey. Italian WebDewey compared with American WebDewey. Querying Italian WebDewey. Italian WebDewey and MARC21. Italian WebDewey and UNIMARC. Numbers, captions, "equivalente verbale": Dewey decimal classification in Italian catalogues. Italian WebDewey and Nuovo soggettario. Italian WebDewey and LCSH. Italian WebDewey compared with printed version of Italian Dewey Classification (22. edition): advantages and disadvantages o...

  13. Web TA Production (WebTA)

    Data.gov (United States)

    US Agency for International Development — WebTA is a web-based time and attendance system that supports USAID payroll administration functions, and is designed to capture hours worked, leave used and...

  14. Web Services as new phenomenon in the PHP environment

    Directory of Open Access Journals (Sweden)

    Pavel Horovčák

    2006-06-01

    Full Text Available The support of development and exploitation of Web Services (WS) is gradually becoming an integral part of current development environments. Beside the standard environments connected with the emergence of WS (Java or .NET), the support is presently realized also in a widely-used environment for web application development – PHP, in its updated version 5. This contribution is oriented towards the development and utilization of WS within the framework of PHP 5. It deals with the development of standard WS (calculation mode) as well as WS in database mode (using MySQL, SQLite). It compares the structured and the object-oriented approach (which is preferred) to the server part of the service development.

  15. Web-based tools from AHRQ's National Resource Center.

    Science.gov (United States)

    Cusack, Caitlin M; Shah, Sapna

    2008-11-06

    The Agency for Healthcare Research and Quality (AHRQ) has made an investment of over $216 million in research around health information technology (health IT). As part of their investment, AHRQ has developed the National Resource Center for Health IT (NRC) which includes a public domain Web site. New content for the web site, such as white papers, toolkits, lessons from the health IT portfolio and web-based tools, is developed as needs are identified. Among the tools developed by the NRC are the Compendium of Surveys and the Clinical Decision Support (CDS) Resources. The Compendium of Surveys is a searchable repository of health IT evaluation surveys made available for public use. The CDS Resources contains content which may be used to develop clinical decision support tools, such as rules, reminders and templates. This live demonstration will show the access, use, and content of both these freely available web-based tools.

  16. Caught in the Web

    Energy Technology Data Exchange (ETDEWEB)

    Gillies, James

    1995-06-15

    The World-Wide Web may have taken the Internet by storm, but many people would be surprised to learn that it owes its existence to CERN. Around half the world's particle physicists come to CERN for their experiments, and the Web is the result of their need to share information quickly and easily on a global scale. Six years after Tim Berners-Lee's inspired idea to marry hypertext to the Internet in 1989, CERN is handing over future Web development to the World-Wide Web Consortium, run by the French National Institute for Research in Computer Science and Control, INRIA, and the Laboratory for Computer Science of the Massachusetts Institute of Technology, MIT, leaving itself free to concentrate on physics. The Laboratory marked this transition with a conference designed to give a taste of what the Web can do, whilst firmly stamping it with the label "Made in CERN". Over 200 European journalists and educationalists came to CERN on 8 - 9 March for the World-Wide Web Days, resulting in wide media coverage. The conference was opened by UK Science Minister David Hunt who stressed the importance of fundamental research in generating new ideas. "Who could have guessed 10 years ago", he said, "that particle physics research would lead to a communication system which would allow every school to have the biggest library in the world in a single computer?". In his introduction, the Minister also pointed out that "CERN and other basic research laboratories help to break new technological ground and sow the seeds of what will become mainstream manufacturing in the future." Learning the jargon is often the hardest part of coming to grips with any new invention, so CERN put it at the top of the agenda. Jacques Altaber, who helped introduce the Internet to CERN in the early 1980s, explained that without the Internet, the Web couldn't exist. The Internet began as a US Defense Department research project in the 1970s and has grown into a global network-of-networks linking some

  17. Evaluation of a metal shear web selectively reinforced with filamentary composites for space shuttle application. Phase 1 summary report: Shear web design development

    Science.gov (United States)

    Laakso, J. H.; Zimmerman, D. K.

    1972-01-01

    An advanced composite shear web design concept was developed for the Space Shuttle orbiter main engine thrust beam structure. Various web concepts were synthesized by a computer-aided adaptive random search procedure. A practical concept is identified having a titanium-clad + or - 45 deg boron/epoxy web plate with vertical boron/epoxy reinforced aluminum stiffeners. The boron-epoxy laminate contributes to the strength and stiffness efficiency of the basic web section. The titanium-cladding functions to protect the polymeric laminate parts from damaging environments and is chem-milled to provide reinforcement in selected areas. Detailed design drawings are presented for both boron/epoxy reinforced and all-metal shear webs. The weight saving offered is 24% relative to all-metal construction at an attractive cost per pound of weight saved, based on the detailed designs. Small scale element tests substantiate the boron/epoxy reinforced design details in critical areas. The results show that the titanium-cladding reliably reinforces the web laminate in critical edge load transfer and stiffener fastener hole areas.

  18. Development and Validation of WebQuests in Teaching Epics

    Directory of Open Access Journals (Sweden)

    Ronald Candy Santos Lasaten

    2017-05-01

    Full Text Available Using the Research and Development (R&D) methodology, the study aimed to develop and validate WebQuests which can be used in literature subjects, particularly at the tertiary level, to address the need of literature teachers for pedagogy in the teaching of epics. The development of the WebQuests was anchored on the Theory of Constructivism. Two groups of experts validated the WebQuests – the literature experts and the ICT experts. The Content Validation Checklist, used by the literature experts, was utilized to evaluate the content of the WebQuests. Meanwhile, the Rubric for Evaluating WebQuests, used by the ICT experts, was utilized to evaluate the design characteristics of the WebQuests. Computed weighted means using range intervals of point scores were employed to treat the data gathered from the evaluation conducted by both groups of experts. The WebQuests developed contain five major parts: 1) introduction; 2) task; 3) process; 4) evaluation; and 5) conclusion. Based on the findings, the content of the WebQuests developed is valid in terms of objectives, activities and instructional characteristics. Likewise, the design characteristics of the WebQuests are excellent in terms of introductions, tasks, processes, resources, evaluations, conclusions and overall designs. Thus, the WebQuests developed are acceptable and can be utilized as instructional materials by literature teachers in the teaching of epics.

  19. Social web artifacts for boosting recommenders theory and implementation

    CERN Document Server

    Ziegler, Cai-Nicolas

    2013-01-01

    Recommender systems, software programs that learn from human behavior and make predictions of what products we are expected to appreciate and purchase, have become an integral part of our everyday life. They proliferate across electronic commerce around the globe and exist for virtually all sorts of consumable goods, such as books, movies, music, or clothes. At the same time, a new evolution on the Web has started to take shape, commonly known as the “Web 2.0” or the “Social Web”: Consumer-generated media has become rife, social networks have emerged and are pulling significant shares of Web traffic. In line with these developments, novel information and knowledge artifacts have become readily available on the Web, created by the collective effort of millions of people. This textbook presents approaches to exploit the new Social Web fountain of knowledge, zeroing in first and foremost on two of those information artifacts, namely classification taxonomies and trust networks. These two are used to impr...

  20. Politiken, Alt om Ikast Brande (web), Lemvig Folkeblad (Web), Politiken (web), Dabladet Ringkjøbing Skjern (web)

    DEFF Research Database (Denmark)

    Lauritsen, Jens

    2014-01-01

    Politiken 01.01.2014 14:16 The Danes rang in the new year with a bang, but for a few of them things went wrong when the New Year's fireworks were lit. Emergency rooms treated 73 people for fireworks injuries between 6 p.m. yesterday evening and 6 a.m. this morning. This is shown by a count made by Politiken on the basis of figures from the Accident Analysis Group (Ulykkes Analyse Gruppen) at Odense University Hospital. The article also appeared in: Alt om Ikast Brande (web), Lemvig Folkeblad (web), Politiken (web), Dagbladet Ringkjøbing Skjern (web).

  1. E-commerce Systems and E-shop Web Sites Security

    OpenAIRE

    Suchánek, Petr

    2009-01-01

    The fruitfulness of contemporary companies rests on new business model development, the elimination of communication obstacles, the simplification of industrial processes, the possibility of responding in real time and, above all, meeting floating customer needs. Quite a number of company activities and transactions are realized within the framework of e-business. Business transactions are supported by e-commerce systems. One part of an e-commerce system is the web interface (web sites). The present trend is putti...

  2. Migrant life stories and the Web

    DEFF Research Database (Denmark)

    Marselis, Randi

    2013-01-01

    The life stories of migrants are increasingly being told, as part of the work of cultural organizations, and websites are well suited to making such life story projects accessible to the public. However, by using the lives of real people as raw material in a public forum, Web projects raise...

  3. Using Web Server Logs in Evaluating Instructional Web Sites.

    Science.gov (United States)

    Ingram, Albert L.

    2000-01-01

    Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…
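    As a small illustration of the kind of analysis the article discusses, the Python sketch below tallies successful page requests from a log in the common Apache format; the log file name and format are assumptions about a typical server, not details from the article.

```python
import re
from collections import Counter

# Common Log Format: host ident user [time] "METHOD path HTTP/x.x" status bytes
LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3}) \S+')

def count_page_hits(log_path):
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_PATTERN.match(line)
            # Count only successful GET requests.
            if match and match.group(2) == "GET" and match.group(4) == "200":
                hits[match.group(3)] += 1
    return hits

if __name__ == "__main__":
    for path, count in count_page_hits("access.log").most_common(10):
        print(f"{count:6d}  {path}")
```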

  4. WEB STRUCTURE MINING

    Directory of Open Access Journals (Sweden)

    CLAUDIA ELENA DINUCĂ

    2011-01-01

    Full Text Available The World Wide Web has become one of the most valuable resources for information retrieval and knowledge discovery due to the permanent increase in the amount of data available online. Given the dimension of the web, users easily get lost in its rich hyper structure. The application of data mining methods is the right solution for knowledge discovery on the Web. The knowledge extracted from the Web can be used to improve the performance of Web information retrieval, question answering and Web-based data warehousing. In this paper, I provide an introduction to the categories of Web mining and I focus on one of these categories: Web structure mining. Web structure mining, one of three categories of web mining for data, is a tool used to identify the relationship between Web pages linked by information or direct link connection. It offers information about how different pages are linked together to form this huge web. Web structure mining finds hidden basic structures and uses hyperlinks for more web applications such as web search.
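    To make the idea of mining the link structure concrete, here is a toy Python sketch that extracts hyperlinks from a handful of pages and reports in-degree; the seed URLs are placeholders and the snippet is only a minimal stand-in for the techniques surveyed in the paper.

```python
from collections import Counter
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

SEED_PAGES = ["https://example.com/", "https://example.com/about"]  # placeholders

def build_link_graph(pages):
    # Map each fetched page to the set of pages it links to.
    graph = {}
    for url in pages:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        graph[url] = {urljoin(url, a["href"]) for a in soup.find_all("a", href=True)}
    return graph

def in_degree(graph):
    counts = Counter()
    for targets in graph.values():
        counts.update(targets)
    return counts

if __name__ == "__main__":
    graph = build_link_graph(SEED_PAGES)
    for target, degree in in_degree(graph).most_common(5):
        print(f"{degree:3d} inbound links -> {target}")
```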

  5. Services for Graduate Students: A Review of Academic Library Web Sites

    Science.gov (United States)

    Rempel, Hannah Gascho

    2010-01-01

    A library's Web site is well recognized as the gateway to the library for the vast majority of users. Choosing the most user-friendly Web architecture to reflect the many services libraries offer is a complex process, and librarians are still experimenting to find what works best for their users. As part of a redesign of the Oregon State…

  6. Web Engineering

    Energy Technology Data Exchange (ETDEWEB)

    White, Bebo

    2003-06-23

    Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.

  7. Web Accessibility in Romania: The Conformance of Municipal Web Sites to Web Content Accessibility Guidelines

    OpenAIRE

    Costin PRIBEANU; Ruxandra-Dora MARINESCU; Paul FOGARASSY-NESZLY; Maria GHEORGHE-MOISII

    2012-01-01

    The accessibility of public administration web sites is a key quality attribute for the successful implementation of the Information Society. The purpose of this paper is to present a second review of municipal web sites in Romania that is based on automated accessibility checking. A number of 60 web sites were evaluated against WCAG 2.0 recommendations. The analysis of results reveals a relatively low web accessibility of municipal web sites and highlights several aspects. Firstly, a slight ...

  8. Making attributes from the Linked Open Data (LOD) cloud a part of ...

    African Journals Online (AJOL)

    This research contributed to bridging the gap between linked data, SDI and web thematic maps and further showed how existing web mapping and OGC technologies can benefit from the Semantic Web. First, the design of a geospatial web service (representing the visible part of an SDI) that accesses attribute data from the ...

  9. An Introduction to XML and Web Technologies

    DEFF Research Database (Denmark)

    Møller, Anders; Schwartzbach, Michael Ignatieff

    , building on top of the early foundations. This book offers a comprehensive introduction to the area. There are two main threads of development, corresponding to the two parts of this book. XML technologies generalize the notion of data on the Web from hypertext documents to arbitrary data, including those...... that have traditionally been the realm of databases. In this book we cover the basic XML technology and the supporting technologies of XPath, DTD, XML Schema, DSD2, RELAX NG, XSLT, XQuery, DOM, JDOM, JAXB, SAX, STX, SDuce, and XACT. Web technologies build on top of the HTTP protocol to provide richer...

  10. Letting go of the words writing web content that works

    CERN Document Server

    Redish, Janice (Ginny)

    2012-01-01

    Web site design and development continues to become more sophisticated; an important part of this maturity originates with well laid out and well written content. Ginny Redish is a world-renowned expert on information design and how to produce clear writing in plain language for the web. All of the invaluable information that she shared in the first edition is included, with numerous new examples. New information on content strategy for web sites, search engine optimization (SEO), and social media will enhance the book's content, making it once again the only book you need to own to o

  11. 07051 Working Group Outcomes -- Programming Paradigms for the Web: Web Programming and Web Services

    OpenAIRE

    Hull, Richard; Thiemann, Peter; Wadler, Philip

    2007-01-01

    Participants in the seminar broke into groups on ``Patterns and Paradigms'' for web programming, ``Web Services,'' ``Data on the Web,'' ``Software Engineering'' and ``Security.'' Here we give the raw notes recorded during these sessions.

  12. What will Web 3.0 bring to education?

    Directory of Open Access Journals (Sweden)

    Dan Jiang

    2014-08-01

    Full Text Available Every era of technology has, to some extent, formed education in its own image. It is believed that there is a mutually productive convergence between the main technological influences on a culture and the contemporary educational theories and practices. As we step into the era of Web 3.0, it is no surprise that the network has become part of our daily life and one of the most important places for us to learn, work, entertain and socialize, especially for digital natives. For many people, Web 3.0 may still be a new word in education or a future trend, but in fact Web 3.0 technology has already been applied and it keeps subtly changing the culture, theory and practice of education. The article aims to discuss the impacts of Web 3.0 on education and tries to view these impacts in terms of culture, philosophy and sociology.

  13. Web-ADARE: A Web-Aided Data Repairing System

    KAUST Repository

    Gu, Binbin

    2017-03-08

    Data repairing aims at discovering and correcting erroneous data in databases. In this paper, we develop Web-ADARE, an end-to-end web-aided data repairing system, to provide a feasible way to involve the vast data sources on the Web in data repairing. Our main attention in developing Web-ADARE is paid on the interaction problem between web-aided repairing and rule-based repairing, in order to minimize the Web consultation cost while reaching predefined quality requirements. The same interaction problem also exists in crowd-based methods but this is not yet formally defined and addressed. We first prove in theory that the optimal interaction scheme is not feasible to be achieved, and then propose an algorithm to identify a scheme for efficient interaction by investigating the inconsistencies and the dependencies between values in the repairing process. Extensive experiments on three data collections demonstrate the high repairing precision and recall of Web-ADARE, and the efficiency of the generated interaction scheme over several baseline ones.

  14. Web-ADARE: A Web-Aided Data Repairing System

    KAUST Repository

    Gu, Binbin; Li, Zhixu; Yang, Qiang; Xie, Qing; Liu, An; Liu, Guanfeng; Zheng, Kai; Zhang, Xiangliang

    2017-01-01

    Data repairing aims at discovering and correcting erroneous data in databases. In this paper, we develop Web-ADARE, an end-to-end web-aided data repairing system, to provide a feasible way to involve the vast data sources on the Web in data repairing. Our main attention in developing Web-ADARE is paid on the interaction problem between web-aided repairing and rule-based repairing, in order to minimize the Web consultation cost while reaching predefined quality requirements. The same interaction problem also exists in crowd-based methods but this is not yet formally defined and addressed. We first prove in theory that the optimal interaction scheme is not feasible to be achieved, and then propose an algorithm to identify a scheme for efficient interaction by investigating the inconsistencies and the dependencies between values in the repairing process. Extensive experiments on three data collections demonstrate the high repairing precision and recall of Web-ADARE, and the efficiency of the generated interaction scheme over several baseline ones.

  15. Web Mining

    Science.gov (United States)

    Fürnkranz, Johannes

    The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to Web data and documents. This chapter provides a brief overview of web mining techniques and research areas, most notably hypertext classification, wrapper induction, recommender systems and web usage mining.

  16. The Web and Information Literacy: Scaffolding the use of Web Sources in a Project-Based Curriculum

    Science.gov (United States)

    Walton, Marion; Archer, Arlene

    2004-01-01

    In this article we describe and discuss a three-year case study of a course in web literacy, part of the academic literacy curriculum for first-year engineering students at the University of Cape Town (UCT). Because they are seen as practical knowledge, not theoretical, information skills tend to be devalued at university and rendered invisible to…

  17. Using Web Services and XML Harvesting to Achieve a Dynamic Web Site. Computers in Small Libraries

    Science.gov (United States)

    Roberts, Gary

    2005-01-01

    Exploiting and contextualizing free information is a natural part of library culture. In this column, Gary Roberts, the information systems and reference librarian at Herrick Library, Alfred University in Alfred, NY, describes how to use XML content on a Web site to link to hundreds of free and useful resources. He gives a general overview of the…
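    The column is about pulling free XML content into a library site; a stripped-down version of that idea in Python, fetching an RSS feed and emitting a list of links, could look like the sketch below. The feed URL is a placeholder and the element names assume ordinary RSS 2.0, not the specific feeds discussed in the column.

```python
import xml.etree.ElementTree as ET
import requests

FEED_URL = "https://example.com/resources.xml"  # placeholder RSS feed

def harvest_links(feed_url):
    root = ET.fromstring(requests.get(feed_url, timeout=10).text)
    # Standard RSS 2.0 layout: <rss><channel><item><title/><link/></item>...
    for item in root.iter("item"):
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="")
        yield title, link

if __name__ == "__main__":
    for title, link in harvest_links(FEED_URL):
        print(f'<li><a href="{link}">{title}</a></li>')  # fragment for a web page
```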

  18. Design and Development of a Web Based User Interface

    OpenAIRE

    László, Magda

    2014-01-01

    The first objective of the thesis is to study the technological background of application design and more specifically the Unified Modeling Language (hereinafter UML). Due to this, the research provides deeper understanding of technical aspects of the practical part of the thesis work. The second and third objectives of this thesis are to design and develop a web application and more specifically a Web Based User Interface for Multimodal Observation and Analysis System for Social Interactions...

  19. Understanding User-Web Interactions via Web Analytics

    CERN Document Server

    Jansen, Bernard J

    2009-01-01

    This lecture presents an overview of the Web analytics process, with a focus on providing insight and actionable outcomes from collecting and analyzing Internet data. The lecture first provides an overview of Web analytics, providing in essence, a condensed version of the entire lecture. The lecture then outlines the theoretical and methodological foundations of Web analytics in order to make obvious the strengths and shortcomings of Web analytics as an approach. These foundational elements include the psychological basis in behaviorism and methodological underpinning of trace data as an empir

  20. Web based machine status display for INDUS-1 And INDUS-2

    International Nuclear Information System (INIS)

    Srivastava, B.S.K.; Fatnani, P.

    2003-01-01

    The web based machine status display for Indus-1 and Indus-2 is designed to provide the on-line status of Indus-1 and Indus-2 to clients located at various places on the CAT premises. Presently, this system provides the Indus-1 machine status (e.g. beam current, integrated current, beam life-time etc.) to users working in the Indus-1 building, but using web browsers the same information can be accessed throughout the CAT network. This system is basically a part of the Indus-1 Control System Web Site, which is under construction (partially constructed). (author)
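    A web-based status page of the kind described (beam current, integrated current, lifetime) could, in its simplest form, be served by something like the Flask sketch below; the parameter names and values are placeholders, and the record does not say that the Indus system is built this way.

```python
from flask import Flask, jsonify

app = Flask(__name__)

def read_machine_status():
    # Placeholder values; a real system would read these from the control system.
    return {"beam_current_mA": 98.2,
            "integrated_current_mAh": 412.7,
            "beam_lifetime_h": 14.5}

@app.route("/status")
def status():
    # Clients anywhere on the network can poll this endpoint from a browser or script.
    return jsonify(read_machine_status())

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```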

  1. The design and implementation of web mining in web sites security

    Science.gov (United States)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    The backdoor or information leak of Web servers can be detected by using Web Mining techniques on abnormal Web log and Web application log data. The security of Web servers can thus be enhanced and the damage of illegal access can be avoided. Firstly, a system for discovering the patterns of information leakages in CGI scripts from Web log data was proposed. Secondly, those patterns were provided for system administrators to modify their code and enhance their Web site security. The following aspects were described: one is to combine the web application log with the web log to extract more information, so that web data mining can be used to mine the web log to discover information that the firewall and Information Detection System cannot find. Another approach is to propose an operation module of the web site to enhance Web site security. In the cluster server session, the Density-Based Clustering technique is used to reduce resource cost and obtain better efficiency.
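    The paper mentions Density-Based Clustering over web log data; as a generic illustration of that step (not the authors' implementation), the Python sketch below clusters simple per-session features with scikit-learn's DBSCAN, using invented feature values.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

# Each row is one web session: [requests per minute, distinct URLs, error responses].
# The numbers are invented purely to illustrate the clustering step.
sessions = np.array([
    [2, 5, 0], [3, 6, 0], [2, 4, 1],      # ordinary browsing
    [40, 120, 2], [45, 110, 3],           # crawler-like behaviour
    [15, 3, 30],                          # many errors on few URLs: suspicious
])

features = StandardScaler().fit_transform(sessions)
labels = DBSCAN(eps=0.9, min_samples=2).fit_predict(features)

for session, label in zip(sessions, labels):
    tag = "noise/outlier" if label == -1 else f"cluster {label}"
    print(session, "->", tag)
```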

  2. Personality in cyberspace: personal Web sites as media for personality expressions and impressions.

    Science.gov (United States)

    Marcus, Bernd; Machilek, Franz; Schütz, Astrid

    2006-06-01

    This research examined the personality of owners of personal Web sites based on self-reports, visitors' ratings, and the content of the Web sites. The authors compared a large sample of Web site owners with population-wide samples on the Big Five dimensions of personality. Controlling for demographic differences, the average Web site owner reported being slightly less extraverted and more open to experience. Compared with various other samples, Web site owners did not generally differ on narcissism, self-monitoring, or self-esteem, but gender differences on these traits were often smaller in Web site owners. Self-other agreement was highest with Openness to Experience, but valid judgments of all Big Five dimensions were derived from Web sites providing rich information. Visitors made use of quantifiable features of the Web site to infer personality, and the cues they utilized partly corresponded to self-reported traits. Copyright 2006 APA, all rights reserved.

  3. Web-based research publications on Sub-Saharan Africa's prized ...

    African Journals Online (AJOL)

    The study confirms Africa's deep interest in the grasscutter which is not shared by other parts of the world. We recommend increased publication of research on cane rats in web-based journals to quickly spread the food value of this prized meat rodent to other parts of the world and so attract research interest and funding.

  4. Web 2.0 and competitiveness improvement

    Directory of Open Access Journals (Sweden)

    Bauerová, Danuše

    2011-12-01

    Full Text Available This article shows the implementation of a Model of Learning Powered by Technology in a university environment. The development of a Positive Digital Identity on the Web 2.0 cloud is presented. People create their own portfolios – a Personal Learning Portfolio and a Personal Credit Portfolio. As part of this, the proposals generalize methodologies that are useful for improving the competitiveness of students, pedagogues and scientists, but also of institutions, as well as of anyone engaged in lifelong learning activities.

  5. Development and preliminary evaluation of culturally specific web-based intervention for parents of adolescents.

    Science.gov (United States)

    Choi, H; Kim, S; Ko, H; Kim, Y; Park, C G

    2016-10-01

    WHAT IS KNOWN ON THE SUBJECT?: Problematic parent-child relationships have been identified as one of the main predictors of adolescents' mental health problems, but there are few existing interventions that address this issue. The format and delivery method of existing interventions for parents are relatively inaccessible for parents with full-time jobs and families living in rural areas. WHAT DOES THIS PAPER ADD TO EXISTING KNOWLEDGE?: The newly developed 'Stepping Stone' culturally specific web-based intervention, which is intended to help Korean parents of adolescents to acquire both knowledge and communication and conflict management skills, was found to be feasible and well-accepted by parents. This study enabled us to identify areas for improvement in the content and format of the intervention and strategies. This will potentially increase effect sizes for the outcome variables of parents' perception and behaviours. WHAT ARE THE IMPLICATIONS FOR PRACTICE?: This web-based intervention could be delivered across diverse settings, such as schools and community mental health centers, to increase parents' knowledge of adolescent's mental health and allow for early detection of mental health problems. Mental health nurses working in schools may spend a significant amount of time addressing students' mental health issues; thus, this web-based intervention could be a useful resource to share with parents and children. In this way, the mental health nurses could facilitate parental engagement in the intervention and then help them to continue to apply and practice the knowledge and skills obtained through the program. Introduction There is a need for accessible, culturally specific web-based interventions to address parent-child relationships and adolescents' mental health. Aims This study developed and conducted a preliminary evaluation of a 4-week web-based intervention for parents of adolescents aged 11 to 16 years in Korea. Methods We used a two-group, repeated

  6. Interactive web-based programs to teach functional anatomy: the pterygopalatine fossa.

    Science.gov (United States)

    Sinav, Ahmet; Ambron, Richard

    2004-07-01

    Certain areas of the body contain structures that are difficult to envision in their proper spatial orientations and whose functions are complex and difficult to grasp. This is especially true in the head, where many structures are relatively small and inaccessible. To address this problem, we are designing Web-based programs that consist of high-resolution interactive bitmap illustrations, prepared using Adobe Photoshop, and vector-based animations, prepared via Macromedia Flash. Flash action script language is used for the animations. We have used this approach to prepare a program on the pterygopalatine fossa, an important neurovascular junction in the deep face that is especially difficult to approach by dissection and to depict in static images in an atlas. The program can be viewed online at http://cds.osr.columbia.edu/anatomy/ppfossa/. A table of contents simplifies navigation through the program and a menu enables the user to identify each of the vascular and neuronal components and either to insert or to remove each from its position in the fossa. The functional anatomy of the nerves in the fossa is animated. For example, users can activate and subsequently follow action potentials as they course along axons to their targets. This high degree of interactivity helps promote learning.

  7. Caught in the Web

    International Nuclear Information System (INIS)

    Gillies, James

    1995-01-01

    The World-Wide Web may have taken the Internet by storm, but many people would be surprised to learn that it owes its existence to CERN. Around half the world's particle physicists come to CERN for their experiments, and the Web is the result of their need to share information quickly and easily on a global scale. Six years after Tim Berners-Lee's inspired idea to marry hypertext to the Internet in 1989, CERN is handing over future Web development to the World-Wide Web Consortium, run by the French National Institute for Research in Computer Science and Control, INRIA, and the Laboratory for Computer Science of the Massachusetts Institute of Technology, MIT, leaving itself free to concentrate on physics. The Laboratory marked this transition with a conference designed to give a taste of what the Web can do, whilst firmly stamping it with the label "Made in CERN". Over 200 European journalists and educationalists came to CERN on 8 - 9 March for the World-Wide Web Days, resulting in wide media coverage. The conference was opened by UK Science Minister David Hunt who stressed the importance of fundamental research in generating new ideas. "Who could have guessed 10 years ago", he said, "that particle physics research would lead to a communication system which would allow every school to have the biggest library in the world in a single computer?". In his introduction, the Minister also pointed out that "CERN and other basic research laboratories help to break new technological ground and sow the seeds of what will become mainstream manufacturing in the future." Learning the jargon is often the hardest part of coming to grips with any new invention, so CERN put it at the top of the agenda. Jacques Altaber, who helped introduce the Internet to CERN in the early 1980s, explained that without the Internet, the Web couldn't exist. The Internet began as a US Defense

  8. Even Faster Web Sites Performance Best Practices for Web Developers

    CERN Document Server

    Souders, Steve

    2009-01-01

    Performance is critical to the success of any web site, and yet today's web applications push browsers to their limits with increasing amounts of rich content and heavy use of Ajax. In this book, Steve Souders, web performance evangelist at Google and former Chief Performance Yahoo!, provides valuable techniques to help you optimize your site's performance. Souders' previous book, the bestselling High Performance Web Sites, shocked the web development world by revealing that 80% of the time it takes for a web page to load is on the client side. In Even Faster Web Sites, Souders and eight exp

  9. The Canon, the Web, and the Long Tail

    DEFF Research Database (Denmark)

    Sanderhoff, Merete

    2017-01-01

    This article argues that releasing images of artworks into the public domain creates a new possibility for the public to challenge the canon or create their own, based on access to previously inaccessible images. Through the dissemination of openly licensed artworks across the Internet, museums c...

  10. Modelo de web semántica para universidades

    Directory of Open Access Journals (Sweden)

    Karla Abad

    2015-12-01

    Full Text Available Following a study of the current state of microsites and repositories at the Universidad Estatal Península de Santa Elena (UPSE), it was found that their information lacked optimal and appropriate semantics. Under these circumstances, the need arises to create a semantic web structure model for universities, which was subsequently applied to UPSE's microsites and digital repository as a test case. Part of this project includes the installation of software modules with their respective configurations and the use of metadata standards such as DUBLIN CORE to improve SEO (search engine optimization); this has made it possible to generate standardized metadata and to create policies for uploading information. The use of metadata transforms simple data into well-organized structures that provide information and knowledge to generate results in web search engines. On completing the implementation of the semantic web model, it can be said that the university has improved its presence and visibility on the web through the indexing of information in different search engines and its position in the Webometrics categorization of universities and repositories (a ranking that provides a classification of universities around the world).
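    Since the record singles out DUBLIN CORE metadata as the vehicle for improving SEO, here is a small Python sketch that inspects a page for DC meta tags; the URL is a placeholder and the tag naming follows the usual "DC.*" convention rather than anything specific to UPSE's implementation.

```python
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://repository.example.edu/item/123"  # placeholder repository page

def dublin_core_fields(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    fields = {}
    # Dublin Core elements are conventionally exposed as <meta name="DC.xxx" content="...">.
    for meta in soup.find_all("meta"):
        name = meta.get("name", "")
        if name.lower().startswith("dc."):
            fields.setdefault(name, []).append(meta.get("content", ""))
    return fields

if __name__ == "__main__":
    for name, values in dublin_core_fields(PAGE_URL).items():
        print(name, "=", "; ".join(values))
```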

  11. Efficient Web Harvesting Strategies for Monitoring Deep Web Content

    NARCIS (Netherlands)

    Khelghati, Mohammadreza; Hiemstra, Djoerd; van Keulen, Maurice

    2016-01-01

    Web content changes rapidly. In Focused Web Harvesting [?], which aims at achieving a complete harvest for a given topic, this dynamic nature of the web creates problems for users who need to access a complete set of web data related to their topics of interest. Whether you are a fan

  12. The Future of Web Maps in Next Generation Textbooks

    Science.gov (United States)

    DiBiase, D.; Prasad, S.

    2014-12-01

    The reformation of the "Object Formerly Known as Textbook" (coined by the Chronicle of Higher Education) toward a digital future is underway. Emerging nextgen texts look less like electronic books ("ebooks") and more like online courseware. In addition to text and illustrations, nextgen textbooks for STEM subjects are likely to combine quizzes, grade management tools, support for social learning, and interactive media including web maps. Web maps are interactive, multi-scale, online maps that enable teachers and learners to explore, interrogate, and mash up the wide variety of map layers available in the cloud. This presentation will show how web maps coupled with interactive quizzes enable students' purposeful explorations and interpretations of spatial patterns related to humankind's interactions with the earth. Attendees will also learn about Esri's offer to donate ArcGIS Online web mapping subscriptions to every U.S. school as part of the President Obama's ConnectED initiative.

  13. Enabling Problem Based Learning through Web 2.0 Technologies

    DEFF Research Database (Denmark)

    Tambouris, Efthimios; Panopoulou, Eleni; Tarabanis, Konstantinos

    2012-01-01

    Advances in Information and Communications Technology (ICT), particularly the so-called Web 2.0, are affecting all aspects of our life: how we communicate, how we shop, how we socialise, and how we learn. Facilitating learning through the use of ICT, also known as eLearning, is a vital part of modern educational systems. Established pedagogical strategies, such as Problem Based Learning (PBL), are being adapted for online use in conjunction with modern Web 2.0 technologies and tools. However, even though Web 2.0 and progressive social-networking technologies are automatically associated with ideals such as collaboration, sharing, and active learning, it is also possible to use them in a very conservative, teacher-centred way limiting thus their impact. In this paper, we present a PBL 2.0 framework, i.e., a framework combining PBL practices with Web 2.0 technologies. More specifically, we (a...

  14. Web2Quests: Updating a Popular Web-Based Inquiry-Oriented Activity

    Science.gov (United States)

    Kurt, Serhat

    2009-01-01

    WebQuest is a popular inquiry-oriented activity in which learners use Web resources. Since the creation of the innovation, almost 15 years ago, the Web has changed significantly, while the WebQuest technique has changed little. This article examines possible applications of new Web trends on WebQuest instructional strategy. Some possible…

  15. Integration of Web mining and web crawler: Relevance and State of Art

    OpenAIRE

    Subhendu kumar pani; Deepak Mohapatra,; Bikram Keshari Ratha

    2010-01-01

    This study presents the role of the web crawler in the web mining environment. As the growth of the World Wide Web exceeded all expectations, research on Web mining is growing more and more. Web mining is a research topic which combines two active research areas: Data Mining and the World Wide Web. So, the World Wide Web is a very advanced area for data mining research. Search engines that are based on a web crawling framework are also used in web mining to find the interacted web pages. This paper discu...

  16. Efficient Web Harvesting Strategies for Monitoring Deep Web Content

    NARCIS (Netherlands)

    Khelghati, Mohammadreza; Hiemstra, Djoerd; van Keulen, Maurice

    2016-01-01

    Web content changes rapidly [18]. In Focused Web Harvesting [17], which aims to achieve a complete harvest for a given topic, this dynamic nature of the web creates problems for users who need to access the set of all web data relevant to their topics of interest. Whether you are a fan

  17. Law on the web a guide for students and practitioners

    CERN Document Server

    Stein, Stuart

    2014-01-01

    Law on the Web is ideal for anyone who wants to access Law Internet resources quickly and efficiently without becoming an IT expert. The emphasis throughout is on the location of high quality law Internet resources for learning, teaching and research, from among the billions of publicly accessible Web pages. The book is structured so that it will be found useful by both beginners and intermediate level users, and be of continuing use over the course of higher education studies. In addition to extensive coverage on locating files and Web sites, Part III provides a substantial and annotated list of high quality resources for law students.

  18. Teaching Hypertext and Hypermedia through the Web.

    Science.gov (United States)

    de Bra, Paul M. E.

    This paper describes a World Wide Web-based introductory course titled "Hypermedia Structures and Systems," offered as an optional part of the curriculum in computing science at the Eindhoven University of Technology (Netherlands). The technical environment for the current (1996) edition of the course is presented, which features…

  19. Teaching hypertext and hypermedia through the web

    NARCIS (Netherlands)

    De Bra, P.M.E.

    1996-01-01

    Since early 1994 the introductory course 2L670, "Hypermedia Structures and Systems", has been available on World Wide Web, and is an optional part of the curriculum in computing science at the Eindhoven University of Technology. The course has since been completed by more than 200 students from

  20. Usage Of Asp.Net Ajax for Binus School Serpong Web Applications

    Directory of Open Access Journals (Sweden)

    Karto Iskandar

    2016-03-01

    Today web applications have become a necessity and many companies use them as a communication tool to keep in touch with their customers. The usage of web applications increases as the number of internet users rises. For Rich Internet Applications, desktop application developers have moved to web application development with AJAX technology. BINUS School Serpong is a Cambridge Curriculum based international school that uses a web application to access all information about the school. By using AJAX, the performance of the web application should be improved and bandwidth usage decreased. The problem at BINUS School Serpong is that not all parts of the web application use AJAX. This paper introduces the usage of AJAX in ASP.NET with the C# programming language in the BINUS School Serpong web application. It is expected that by using ASP.NET AJAX, BINUS School Serpong website performance will be faster because of reduced web page reloads. The methodology used in this paper is literature study. The results of this study prove that ASP.NET AJAX can be used easily and improves BINUS School Serpong website performance. The conclusion of this paper is that the implementation of ASP.NET AJAX improves the performance of the web application at BINUS School Serpong.

  1. Accesibilidad web en el espacio universitario público argentino

    Directory of Open Access Journals (Sweden)

    Laitano, María Inés

    2015-03-01

    The study presents a first web accessibility diagnosis carried out in 2012 on a sample of pages from the Argentine public university space. The evaluation checks compliance with the Web Content Accessibility Guidelines (WCAG) 2.0, taking into account the methodological recommendations of the World Wide Web Consortium (W3C). The results suggest that the web accessibility barriers encountered are for the most part serious (level A). The most frequent are related to markup language syntax, content presentation, non-text content and visual readability of text. It is likewise shown that by addressing these barriers, certain groups of people could benefit specifically.

  2. Web-Enhanced Instruction and Learning: Findings of a Short- and Long-Term Impact Study and Teacher Use of NASA Web Resources

    Science.gov (United States)

    McCarthy, Marianne C.; Grabowski, Barbara L.; Koszalka, Tiffany

    2003-01-01

    Over a three-year period, researchers and educators from the Pennsylvania State University (PSU), University Park, Pennsylvania, and the NASA Dryden Flight Research Center (DFRC), Edwards, California, worked together to analyze, develop, implement and evaluate materials and tools that enable teachers to use NASA Web resources effectively for teaching science, mathematics, technology and geography. Two conference publications and one technical paper have already been published as part of this educational research series on Web-based instruction and learning. This technical paper, Web-Enhanced Instruction and Learning: Findings of a Short- and Long-Term Impact Study, is the culminating report in this educational research series and is based on the final report submitted to NASA. This report describes the broad spectrum of data gathered from teachers about their experiences using NASA Web resources in the classroom. It also describes participating teachers' responses and feedback about the use of the NASA Web-Enhanced Learning Environment Strategies reflection tool on their teaching practices. The reflection tool was designed to help teachers merge the vast array of NASA resources with the best teaching methods, taking into consideration grade levels, subject areas and teaching preferences. The teachers described their attitudes toward technology and innovation in the classroom and their experiences and perceptions as they attempted to integrate Web resources into science, mathematics, technology and geography instruction.

  3. IS 37 FORM ON EDH WEB

    CERN Multimedia

    2000-01-01

    To Staff Members in charge of the execution of works The “Issuers” are reminded to fill in - if necessary - the form attached to Safety Instruction 37 when disabling all or part of the system generating a level 3 alarm. Reminder: The request must be completed by the issuer and authorised by the TSO/GLIMOS responsible for the building or area. After completion of the works, the TSO/GLIMOS make sure that the system is recommissioned. Please note that the computerized version of this form is available on the web. The icon can be found on the EDH Web Desktop Homepage. The paper version is still in use. If you have any questions, please contact A. Chouvelon/TIS, tel. 74229.

  4. Dynamic Web Pages: Performance Impact on Web Servers.

    Science.gov (United States)

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)

  5. PERANCANGAN WEB BASED LEARNING SEBAGAI MEDIA PEMBELAJARAN BERBASIS ICT

    OpenAIRE

    Ricky Firmansyah; Iis Saidah

    2016-01-01

    Media are a very important component of the communication process. The effectiveness of the medium strongly influences the extent to which a message is received by the audience quickly and precisely, or not. E-Learning is an ICT-based learning medium that allows students and teachers to interact in different places. Web Based Learning (WBL) is used as one part of E-Learning. This study focuses on designing web-based ICT as a learning medium that is used for ...

  6. Applying semantic web services to enterprise web

    OpenAIRE

    Hu, Y; Yang, Q P; Sun, X; Wei, P

    2008-01-01

    Enterprise Web provides a convenient, extendable, integrated platform for information sharing and knowledge management. However, it still has many drawbacks due to complexity and increasing information glut, as well as the heterogeneity of the information processed. Research in the field of Semantic Web Services has shown the possibility of adding a higher level of semantic functionality on top of the current Enterprise Web, enhancing the usability and usefulness of resources and enabling decision su...

  7. WebVR: an interactive web browser for virtual environments

    Science.gov (United States)

    Barsoum, Emad; Kuester, Falko

    2005-03-01

    The pervasive nature of web-based content has led to the development of applications and user interfaces that port between a broad range of operating systems and databases, while providing intuitive access to static and time-varying information. However, the integration of this vast resource into virtual environments has remained elusive. In this paper we present an implementation of a 3D Web Browser (WebVR) that enables the user to search the internet for arbitrary information and to seamlessly augment this information into virtual environments. WebVR provides access to the standard data input and query mechanisms offered by conventional web browsers, with the difference that it generates active texture-skins of the web contents that can be mapped onto arbitrary surfaces within the environment. Once mapped, the corresponding texture functions as a fully integrated web-browser that will respond to traditional events such as the selection of links or text input. As a result, any surface within the environment can be turned into a web-enabled resource that provides access to user-definable data. In order to leverage the continuous advancement of browser technology and to support both static as well as streamed content, WebVR uses ActiveX controls to extract the desired texture skin from industry strength browsers, providing a unique mechanism for data fusion and extensibility.

  8. Food Web Assembly Rules for Generalized Lotka-Volterra Equations.

    Directory of Open Access Journals (Sweden)

    Jan O Haerter

    2016-02-01

    In food webs, many interacting species coexist despite the restrictions imposed by the competitive exclusion principle and apparent competition. For the generalized Lotka-Volterra equations, sustainable coexistence necessitates nonzero determinant of the interaction matrix. Here we show that this requirement is equivalent to demanding that each species be part of a non-overlapping pairing, which substantially constrains the food web structure. We demonstrate that a stable food web can always be obtained if a non-overlapping pairing exists. If it does not, the matrix rank can be used to quantify the lack of niches, corresponding to unpaired species. For the species richness at each trophic level, we derive the food web assembly rules, which specify sustainable combinations. In neighboring levels, these rules allow the higher level to avert competitive exclusion at the lower, thereby incorporating apparent competition. In agreement with data, the assembly rules predict high species numbers at intermediate levels and thinning at the top and bottom. Using comprehensive food web data, we demonstrate how omnivores or parasites with hosts at multiple trophic levels can loosen the constraints and help obtain coexistence in food webs. Hence, omnivory may be the glue that keeps communities intact even under extinction or ecological release of species.

  9. Food Web Assembly Rules for Generalized Lotka-Volterra Equations.

    Science.gov (United States)

    Haerter, Jan O; Mitarai, Namiko; Sneppen, Kim

    2016-02-01

    In food webs, many interacting species coexist despite the restrictions imposed by the competitive exclusion principle and apparent competition. For the generalized Lotka-Volterra equations, sustainable coexistence necessitates nonzero determinant of the interaction matrix. Here we show that this requirement is equivalent to demanding that each species be part of a non-overlapping pairing, which substantially constrains the food web structure. We demonstrate that a stable food web can always be obtained if a non-overlapping pairing exists. If it does not, the matrix rank can be used to quantify the lack of niches, corresponding to unpaired species. For the species richness at each trophic level, we derive the food web assembly rules, which specify sustainable combinations. In neighboring levels, these rules allow the higher level to avert competitive exclusion at the lower, thereby incorporating apparent competition. In agreement with data, the assembly rules predict high species numbers at intermediate levels and thinning at the top and bottom. Using comprehensive food web data, we demonstrate how omnivores or parasites with hosts at multiple trophic levels can loosen the constraints and help obtain coexistence in food webs. Hence, omnivory may be the glue that keeps communities intact even under extinction or ecological release of species.
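
    The pairing criterion described in this record can be checked with a few lines of linear algebra. The sketch below is illustrative only and is not code from the paper; the 4-species interaction matrix, its numerical values, and the use of numpy are assumptions made for the example.

```python
import numpy as np

# Hypothetical 4-species web: 0 = resource 1, 1 = consumer, 2 = resource 2, 3 = top predator.
# A[i, j] != 0 means species j affects the growth rate of species i
# (positive = benefit, negative = cost), as in generalized Lotka-Volterra models.
A = np.array([
    [ 0.0, -0.3,  0.0,  0.0],   # resource 1 is grazed by the consumer
    [ 0.4,  0.0,  0.0, -0.2],   # consumer eats resource 1, is eaten by the predator
    [ 0.0,  0.0,  0.0, -0.5],   # resource 2 is eaten by the predator
    [ 0.0,  0.3,  0.5,  0.0],   # predator eats the consumer and resource 2
])

det = np.linalg.det(A)
rank = np.linalg.matrix_rank(A)
n = A.shape[0]

print(f"det(A) = {det:.3f}, rank = {rank} of {n}")
if abs(det) > 1e-9:
    print("Nonzero determinant: a non-overlapping pairing exists, so coexistence is possible.")
else:
    print(f"Singular matrix: roughly {n - rank} unpaired species, i.e. missing niches.")
```

    In this toy web the only non-overlapping pairing is (resource 1, consumer) and (resource 2, predator), which is exactly the permutation that makes the determinant nonzero.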

  10. Deep Web: aproximaciones a la ciber irresponsabilidad

    Directory of Open Access Journals (Sweden)

    Dulce María Bautista Luzardo

    2015-01-01

    The Deep Web or Hard Web is a gigantic part of the undetectable virtual platforms where cyber-actions take place whose precondition is the concealment of the user's identity. It has given rise to the distortion of the concept of the person and to the use of the web in an irresponsible way, in some cases to cause unrest, to persecute, or sometimes to hack banks, institutions and private accounts. This is a reflection article that analyses the scope of the practice of hiding actions on the Internet and of changing one's face in contemporary cyber-society. The reflection aims to draw attention to the responsibility we bear when entering the world of the Internet, and it analyses the dangers that these practices entail.

  11. Designing Effective Web Forms for Older Web Users

    Science.gov (United States)

    Li, Hui; Rau, Pei-Luen Patrick; Fujimura, Kaori; Gao, Qin; Wang, Lin

    2012-01-01

    This research aims to provide insight for web form design for older users. The effects of task complexity and information structure of web forms on older users' performance were examined. Forty-eight older participants with abundant computer and web experience were recruited. The results showed significant differences in task time and error rate…

  12. 75 FR 27986 - Electronic Filing System-Web (EFS-Web) Contingency Option

    Science.gov (United States)

    2010-05-19

    ...] Electronic Filing System--Web (EFS-Web) Contingency Option AGENCY: United States Patent and Trademark Office... contingency option when the primary portal to EFS-Web has an unscheduled outage. Previously, the entire EFS-Web system was not available to users during such an outage. The contingency option in EFS-Web will...

  13. Fermilab joins in global live Web cast

    CERN Multimedia

    Polansek, Tom

    2005-01-01

    From 2 to 3:30 p.m., Lederman, who won the Nobel Prize for physics in 1988, will host his own wacky, science-centered talk show at Fermi National Accelerator Laboratory as part of a live, 12-hour, international Web cast celebrating Albert Einstein and the World Year of Physics (2/3 page)

  14. Geospatial semantic web

    CERN Document Server

    Zhang, Chuanrong; Li, Weidong

    2015-01-01

    This book covers key issues related to Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data source; interfaces for Geospatial Semantic Web, VGI (Volunteered Geographic Information) and Geospatial Semantic Web; challenges of Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use the Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.

  15. 3D Web-based HMI with WebGL Rendering Performance

    Directory of Open Access Journals (Sweden)

    Muennoi Atitayaporn

    2016-01-01

    An HMI, or Human-Machine Interface, is software allowing users to communicate with a machine or automation system. It usually serves as the display section of a SCADA (Supervisory Control and Data Acquisition) system for device monitoring and control. In this paper, a 3D Web-based HMI with WebGL (Web Graphics Library) rendering performance is presented. The main purpose of this work is to attempt to reduce the limitations of traditional 3D web HMIs using the advantages of WebGL. To evaluate the performance, frame rate and frame time metrics were used. The results showed that the 3D Web-based HMI can maintain a frame rate of 60 FPS for #cube=0.5K/0.8K and 30 FPS for #cube=1.1K/1.6K when run on Internet Explorer and Chrome respectively. Moreover, the study found that the 3D Web-based HMI using WebGL shows similar frame times from frame to frame even when the number of cubes is up to 5K. This indicates that stuttering occurred less in the proposed 3D Web-based HMI compared to the chosen commercial HMI product.
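
    The frame rate and frame time metrics used in this record are straightforward to derive from per-frame timestamps. The following sketch is a generic illustration rather than the authors' measurement code; the timestamp list and the use of Python's statistics module are assumptions.

```python
from statistics import mean, pstdev

def frame_metrics(timestamps_ms):
    """Given per-frame timestamps in milliseconds, return average FPS,
    mean frame time and frame-time jitter (a simple stutter indicator)."""
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_frame_time = mean(frame_times)   # ms per frame
    fps = 1000.0 / avg_frame_time        # frames per second
    jitter = pstdev(frame_times)         # low jitter -> evenly spaced frames
    return fps, avg_frame_time, jitter

# Hypothetical capture: roughly 60 FPS with one long (stuttering) frame.
stamps = [0, 16, 33, 50, 66, 116, 133, 150]
fps, ft, jitter = frame_metrics(stamps)
print(f"avg FPS = {fps:.1f}, mean frame time = {ft:.1f} ms, jitter = {jitter:.1f} ms")
```

    A low jitter value indicates evenly spaced frames, which matches the record's observation that similar frame times mean less stuttering.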

  16. 07051 Abstracts Collection -- Programming Paradigms for the Web: Web Programming and Web Services

    OpenAIRE

    Hull, Richard; Thiemann, Peter; Wadler, Philip

    2007-01-01

    From 28.01. to 02.02.2007, the Dagstuhl Seminar 07051 ``Programming Paradigms for the Web: Web Programming and Web Services'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The firs...

  17. THE DIFFERENCE BETWEEN DEVELOPING SINGLE PAGE APPLICATION AND TRADITIONAL WEB APPLICATION BASED ON MECHATRONICS ROBOT LABORATORY ONAFT APPLICATION

    Directory of Open Access Journals (Sweden)

    V. Solovei

    2018-04-01

    Today most desktop and mobile applications have analogues in the form of web-based applications. With the evolution of development and web technologies, web applications have grown in functionality to match desktop applications. A web application consists of two parts: the client part and the server part. The client part is responsible for providing the user with visual information through the browser. The server part is responsible for processing and storing data. MPAs appeared simultaneously with the Internet. Multiple-page applications work in a "traditional" way: every change, e.g. displaying data or submitting data, sends a request back to the server. With the advent of AJAX, MPAs learned to load not the whole page but only a part of it, which eventually led to the appearance of the SPA. SPA is a development principle in which only one page is transferred to the client part, and content is loaded only into a certain part of the page without reloading it, which speeds up the application and simplifies the user experience to the level of desktop applications. Based on the SPA approach, the Mechatronics Robot Laboratory ONAFT application was designed to automate the management process. The application implements a client-server architecture. The server part consists of a RESTful API, which provides unified access to the application functionality, and a database for storing information. Since the client part is a SPA, this reduces the load on the connection to the server and improves the user experience.
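
    The record describes a server part that exposes a RESTful API consumed by a single page application. The sketch below is a minimal illustration of such an endpoint using Flask, which is an assumption; the paper does not name an implementation stack, and the /api/robots resource and its data are hypothetical.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory data; a real server part would query a database.
ROBOTS = [
    {"id": 1, "name": "Delta arm", "status": "idle"},
    {"id": 2, "name": "Line follower", "status": "running"},
]

@app.route("/api/robots")
def list_robots():
    # The SPA fetches this JSON asynchronously and re-renders only part of the
    # page, instead of requesting a whole new HTML document (the MPA approach).
    return jsonify(ROBOTS)

if __name__ == "__main__":
    app.run(port=5000)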

  18. Developing web map application based on user centered design

    Directory of Open Access Journals (Sweden)

    Petr Voldan

    2012-03-01

    User centred design is an approach to the development of any kind of product for people in which the main idea is to create the product for the end user. This article presents the user centred design method in developing web mapping services. The method can be split into four main phases – user research, creation of concepts, development with usability research, and launch of the product. The article describes each of these phases with the aim of providing guidelines for developers and, primarily, of improving the usability of web mapping services.

  19. BaBar - A Community Web Site in an Organizational Setting

    Energy Technology Data Exchange (ETDEWEB)

    White, Bebo

    2003-07-10

    The BABAR Web site was established in 1993 at the Stanford Linear Accelerator Center (SLAC) to support the BABAR experiment, to report its results, and to facilitate communication among its scientific and engineering collaborators, currently numbering about 600 individuals from 75 collaborating institutions in 10 countries. The BABAR Web site is, therefore, a community Web site. At the same time it is hosted at SLAC and funded by agencies that demand adherence to policies decided under different priorities. Additionally, the BABAR Web administrators deal with the problems that arise during the course of managing users, content, policies, standards, and changing technologies. Desired solutions to some of these problems may be incompatible with the overall administration of the SLAC Web sites and/or the SLAC policies and concerns. There are thus different perspectives of the same Web site and differing expectations in segments of the SLAC population which act as constraints and challenges in any review or re-engineering activities. Web Engineering, which post-dates the BABAR Web, has aimed to provide a comprehensive understanding of all aspects of Web development. This paper reports on the first part of a recent review of application of Web Engineering methods to the BABAR Web site, which has led to explicit user and information models of the BABAR community and how SLAC and the BABAR community relate and react to each other. The paper identifies the issues of a community Web site in a hierarchical, semi-governmental sector and formulates a strategy for periodic reviews of BABAR and similar sites. A separate paper reports on the findings of a user survey and selected interviews with users, along with their implications and recommendations for future.

  20. BaBar - A Community Web Site in an Organizational Setting

    International Nuclear Information System (INIS)

    White, Bebo

    2003-01-01

    The BABAR Web site was established in 1993 at the Stanford Linear Accelerator Center (SLAC) to support the BABAR experiment, to report its results, and to facilitate communication among its scientific and engineering collaborators, currently numbering about 600 individuals from 75 collaborating institutions in 10 countries. The BABAR Web site is, therefore, a community Web site. At the same time it is hosted at SLAC and funded by agencies that demand adherence to policies decided under different priorities. Additionally, the BABAR Web administrators deal with the problems that arise during the course of managing users, content, policies, standards, and changing technologies. Desired solutions to some of these problems may be incompatible with the overall administration of the SLAC Web sites and/or the SLAC policies and concerns. There are thus different perspectives of the same Web site and differing expectations in segments of the SLAC population which act as constraints and challenges in any review or re-engineering activities. Web Engineering, which post-dates the BABAR Web, has aimed to provide a comprehensive understanding of all aspects of Web development. This paper reports on the first part of a recent review of application of Web Engineering methods to the BABAR Web site, which has led to explicit user and information models of the BABAR community and how SLAC and the BABAR community relate and react to each other. The paper identifies the issues of a community Web site in a hierarchical, semi-governmental sector and formulates a strategy for periodic reviews of BABAR and similar sites. A separate paper reports on the findings of a user survey and selected interviews with users, along with their implications and recommendations for future

  1. Flow Webs: Mechanism and Architecture for the Implementation of Sensor Webs

    Science.gov (United States)

    Gorlick, M. M.; Peng, G. S.; Gasster, S. D.; McAtee, M. D.

    2006-12-01

    The sensor web is a distributed, federated infrastructure much like its predecessors, the internet and the world wide web. It will be a federation of many sensor webs, large and small, under many distinct spans of control, that loosely cooperates and shares information for many purposes. Realistically, it will grow piecemeal as distinct, individual systems are developed and deployed, some expressly built for a sensor web while many others were created for other purposes. Therefore, the architecture of the sensor web is of fundamental import, and architectural strictures that inhibit innovation, experimentation, sharing or scaling may prove fatal. Drawing upon the architectural lessons of the world wide web, we offer a novel system architecture, the flow web, that elevates flows, sequences of messages over a domain of interest and constrained in both time and space, to a position of primacy as a dynamic, real-time medium of information exchange for computational services. The flow web captures, in a single, uniform architectural style, the conflicting demands of the sensor web, including dynamic adaptations to changing conditions, ease of experimentation, rapid recovery from the failures of sensors and models, automated command and control, incremental development and deployment, and integration at multiple levels, in many cases at different times. Our conception of sensor webs (dynamic amalgamations of sensor webs, each constructed within a flow web infrastructure) holds substantial promise for earth science missions in general, and for weather, air quality, and disaster management in particular. Flow webs are, by philosophy, design and implementation, a dynamic infrastructure that permits massive adaptation in real time. Flows may be attached to and detached from services at will, even while information is in transit through the flow. This concept, flow mobility, permits dynamic integration of earth science products and modeling resources in response to real

  2. Information security threats in web-portals on the open journal systems platform

    Directory of Open Access Journals (Sweden)

    Anton A. Abramov

    2018-05-01

    Full Text Available This article addresses the problem of security threats while working with web portals built on the Open Journal Systems platform. The Open Journal Systems (OJS platform was originally developed as part of the Public Knowledge Project and it is one of the most popular open-source platforms for web journals today. Based on the data available in the Public Knowledge Project, there were more than 10,000 active journals using the open journal systems platform by the end of 2016. A migration of a journal to such advanced and complex platform helps to handle the entire workflow over a single web portal. Therefore it is an important move and only peer-reviewed journals that are part of Russian and Worldwide citation systems go for it. At the same time the problem of keeping privacy for a manuscript before it is published is very important for these journals and for authors who submit it to the journal. The paper describes the most common threats for the web portals on the OJS platform as well as a particular model of the security threats, and suggests the measures that could help to neutralize these threats.

  3. Search, Read and Write: An Inquiry into Web Accessibility for People with Dyslexia.

    Science.gov (United States)

    Berget, Gerd; Herstad, Jo; Sandnes, Frode Eika

    2016-01-01

    Universal design in context of digitalisation has become an integrated part of international conventions and national legislations. A goal is to make the Web accessible for people of different genders, ages, backgrounds, cultures and physical, sensory and cognitive abilities. Political demands for universally designed solutions have raised questions about how it is achieved in practice. Developers, designers and legislators have looked towards the Web Content Accessibility Guidelines (WCAG) for answers. WCAG 2.0 has become the de facto standard for universal design on the Web. Some of the guidelines are directed at the general population, while others are targeted at more specific user groups, such as the visually impaired or hearing impaired. Issues related to cognitive impairments such as dyslexia receive less attention, although dyslexia is prevalent in at least 5-10% of the population. Navigation and search are two common ways of using the Web. However, while navigation has received a fair amount of attention, search systems are not explicitly included, although search has become an important part of people's daily routines. This paper discusses WCAG in the context of dyslexia for the Web in general and search user interfaces specifically. Although certain guidelines address topics that affect dyslexia, WCAG does not seem to fully accommodate users with dyslexia.

  4. The Semantic Web: opportunities and challenges for next-generation Web applications

    Directory of Open Access Journals (Sweden)

    2002-01-01

    Recently there has been a growing interest in the investigation and development of the next generation web - the Semantic Web. While most current forms of web content are designed to be presented to humans and are barely understandable by computers, the content of the Semantic Web is structured in a semantic way so that it is meaningful to computers as well as to humans. In this paper, we report a survey of recent research on the Semantic Web. In particular, we present the opportunities that this revolution will bring to us: web-services, agent-based distributed computing, semantics-based web search engines, and semantics-based digital libraries. We also discuss the technical and cultural challenges of realizing the Semantic Web: the development of ontologies, formal semantics of Semantic Web languages, and trust and proof models. We hope that this will shed some light on the direction of future work in this field.

  5. Design, implementation and practice of JBEI-ICE: an open source biological part registry platform and tools.

    Science.gov (United States)

    Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D

    2012-10-01

    The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICEs) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allows automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. As a web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.

  6. NEW WEB-BASED ACCESS TO NUCLEAR STRUCTURE DATASETS.

    Energy Technology Data Exchange (ETDEWEB)

    WINCHELL,D.F.

    2004-09-26

    As part of an effort to migrate the National Nuclear Data Center (NNDC) databases to a relational platform, a new web interface has been developed for the dissemination of the nuclear structure datasets stored in the Evaluated Nuclear Structure Data File and Experimental Unevaluated Nuclear Data List.

  7. Working with WebQuests: Making the Web Accessible to Students with Disabilities.

    Science.gov (United States)

    Kelly, Rebecca

    2000-01-01

    This article describes how students with disabilities in regular classes are using the WebQuest lesson format to access the Internet. It explains essential WebQuest principles, creating a draft Web page, and WebQuest components. It offers an example of a WebQuest about salvaging the sunken ships, Titanic and Lusitania. A WebQuest planning form is…

  8. WebGIS based on semantic grid model and web services

    Science.gov (United States)

    Zhang, WangFei; Yue, CaiRong; Gao, JianGuo

    2009-10-01

    As the combination point of network technology and GIS technology, WebGIS has developed rapidly in recent years. Because of the restrictions of the Web and the characteristics of GIS, traditional WebGIS has some prominent problems in development. For example, it cannot achieve interoperability between heterogeneous spatial databases, and it cannot provide cross-platform data access. With the appearance of Web Service and Grid technology, great changes have taken place in the field of WebGIS. Web Services provide an interface which gives sites holding different information the ability to share data and intercommunicate. The goal of Grid technology is to turn the internet into a large supercomputer with which we can efficiently share computing resources, storage resources, data resources, information resources, knowledge resources and expert resources. For WebGIS, however, this only achieves the physical connection of data and information, and that is far from enough. Because experts in different fields understand the world differently and follow different professional regulations, policies and habits, they reach different conclusions when they observe the same geographic phenomenon, and semantic heterogeneity is produced: the same concept can differ greatly between fields. If we use WebGIS without considering this semantic heterogeneity, we will answer the questions users pose wrongly, or we will not be able to answer them at all. To solve this problem, this paper puts forward and tests an effective method of combining the semantic grid and Web Services technology to develop WebGIS. In this paper, we studied the method of constructing an ontology and the method of combining Grid technology and Web Services, and with a detailed analysis of the computing characteristics and application model of data distribution, we designed the WebGIS query system driven by
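
    To make the ontology-driven querying described here concrete, the sketch below uses rdflib, which is an assumption rather than the toolkit used by the authors; the http://example.org/geo# namespace, the Lake and locatedIn terms, and the facts are all hypothetical.

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/geo#")   # hypothetical geospatial ontology

g = Graph()
# Two toy facts: a feature typed as a Lake and its administrative region.
g.add((EX.TaiHu, RDF.type, EX.Lake))
g.add((EX.TaiHu, EX.locatedIn, EX.Jiangsu))

# An ontology-aware query: "find all lakes located in Jiangsu",
# independent of how each source database labels its tables or columns.
results = g.query("""
    PREFIX ex: <http://example.org/geo#>
    SELECT ?feature WHERE { ?feature a ex:Lake ; ex:locatedIn ex:Jiangsu . }
""")
for row in results:
    print(row.feature)
```

    The point of the ontology layer is that the SPARQL query is written against shared concepts, so two heterogeneous spatial databases mapped to the same ontology can both answer it.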

  9. Web accessibility: a longitudinal study of college and university home pages in the northwestern United States.

    Science.gov (United States)

    Thompson, Terrill; Burgstahler, Sheryl; Moore, Elizabeth J

    2010-01-01

    outreach and education may have a positive effect on these measures. However, the results also reveal negative trends in accessibility, and outreach and education may not be strong enough to counter the factors that motivate institutions to deploy inaccessible emerging technologies. Further research is warranted toward identifying the motivational factors that are associated with increased and decreased web accessibility, and much additional work is needed to ensure that higher education web pages are accessible to individuals with disabilities.

  10. How to open & operate a financially successful web site design business

    CERN Document Server

    Evans, Charlotte

    2009-01-01

    The Pricing & Ethical Guidelines Handbook published by the Graphic Arts Guild reports that the average cost of designing a Web site for a small corporation can range from 7,750 to 15,000. It is incredibly easy to see the enormous profit potential. Web design businesses can be run part- or full-time and can easily be started in your own home. As such, they are one of the fastest growing segments of the Internet economy. Here is the manual you need to cash in on this highly profitable segment of the industry. This book is a comprehensive and detailed study of the business side of Web site des

  11. A BPMN solution for chaining OGC services to quality assure location-based crowdsourced data

    Science.gov (United States)

    Meek, Sam; Jackson, Mike; Leibovici, Didier G.

    2016-02-01

    The Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard enables access to a centralized repository of processes and services from compliant clients. A crucial part of the standard includes the provision to chain disparate processes and services to form a reusable workflow. To date this has been realized by methods such as embedding XML requests, using Business Process Execution Language (BPEL) engines and other external orchestration engines. Although these allow the user to define tasks and data artifacts as web services, they are often considered inflexible and complicated, often due to vendor specific solutions and inaccessible documentation. This paper introduces a new method of flexible service chaining using the standard Business Process Markup Notation (BPMN). A prototype system has been developed upon an existing open source BPMN suite to illustrate the advantages of the approach. The motivation for the software design is qualification of crowdsourced data for use in policy-making. The software is tested as part of a project that seeks to qualify, assure, and add value to crowdsourced data in a biological monitoring use case.
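
    As a rough illustration of what one step of such a WPS chain looks like at the HTTP level, the sketch below issues WPS 1.0.0 key-value-pair Execute requests in sequence with Python's requests library. The endpoint URL, the CleanObservations and ScoreQuality process identifiers, and the input names are assumptions; a real BPMN engine would parse each ExecuteResponse and pass output references rather than raw response bodies.

```python
import requests

WPS = "https://example.org/wps"   # hypothetical WPS endpoint

def execute(identifier, data_inputs):
    """Issue a WPS 1.0.0 KVP Execute request and return the raw response body."""
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "datainputs": ";".join(f"{k}={v}" for k, v in data_inputs.items()),
    }
    resp = requests.get(WPS, params=params, timeout=60)
    resp.raise_for_status()
    return resp.text

# A two-step chain: clean the crowdsourced observations, then score their quality.
# For simplicity the raw body of step one is passed on; an orchestration engine
# would instead extract an output reference from the ExecuteResponse document.
cleaned = execute("CleanObservations", {"observations": "https://example.org/obs.json"})
scored = execute("ScoreQuality", {"observations": cleaned})
print(scored[:200])
```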

  12. Web Project Management

    OpenAIRE

    Suralkar, Sunita; Joshi, Nilambari; Meshram, B B

    2013-01-01

    This paper describes the need for Web project management and the fundamentals of project management for web projects: what it is, why projects go wrong, and what's different about web projects. We also discuss cost estimation techniques based on size metrics. Though Web project development is similar to traditional software development applications, the special characteristics of Web Application development require the adaptation of many software engineering approaches or even the development of comple...

  13. Promoting Teachers' Positive Attitude towards Web Use: A Study in Web Site Development

    Science.gov (United States)

    Akpinar, Yavuz; Bayramoglu, Yusuf

    2008-01-01

    The purpose of the study was to examine effects of a compact training for developing web sites on teachers' web attitude, as composed of: web self efficacy, perceived web enjoyment, perceived web usefulness and behavioral intention to use the web. To measure the related constructs, the Web Attitude Scale was adapted into Turkish and tested with a…

  14. A Network of Automatic Control Web-Based Laboratories

    Science.gov (United States)

    Vargas, Hector; Sanchez Moreno, J.; Jara, Carlos A.; Candelas, F. A.; Torres, Fernando; Dormido, Sebastian

    2011-01-01

    This article presents an innovative project in the context of remote experimentation applied to control engineering education. Specifically, the authors describe their experience regarding the analysis, design, development, and exploitation of web-based technologies within the scope of automatic control. This work is part of an inter-university…

  15. What and how children search on the web

    NARCIS (Netherlands)

    Duarte Torres, Sergio; Weber, Ingmar

    2011-01-01

    The Internet has become an important part of the daily life of children as a source of information and leisure activities. Nonetheless, given that most of the content available on the web is aimed at the general public, children are constantly exposed to inappropriate content, either because the

  16. Web wisdom how to evaluate and create information quality on the Web

    CERN Document Server

    Alexander, Janet E

    1999-01-01

    Web Wisdom is an essential reference for anyone needing to evaluate or establish information quality on the World Wide Web. The book includes easy to use checklists for step-by-step quality evaluations of virtually any Web page. The checklists can also be used by Web authors to help them ensure quality information on their pages. In addition, Web Wisdom addresses other important issues, such as understanding the ways that advertising and sponsorship may affect the quality of Web information. It features: * a detailed discussion of the items involved in evaluating Web information; * checklists

  17. Professional WebGL Programming Developing 3D Graphics for the Web

    CERN Document Server

    Anyuru, Andreas

    2012-01-01

    Everything you need to know about developing hardware-accelerated 3D graphics with WebGL! As the newest technology for creating 3D graphics on the web, in both games, applications, and on regular websites, WebGL gives web developers the capability to produce eye-popping graphics. This book teaches you how to use WebGL to create stunning cross-platform apps. The book features several detailed examples that show you how to develop 3D graphics with WebGL, including explanations of code snippets that help you understand the why behind the how. You will also develop a stronger understanding of W

  18. WebCom: A Model for Understanding Web Site Communication

    DEFF Research Database (Denmark)

    Godsk, Mikkel; Petersen, Anja Bechmann

    2008-01-01

    of the approaches' strengths. Furthermore, it is discussed and shortly demonstrated how WebCom can be used for analytical and design purposes with YouTube as an example. The chapter concludes that WebCom is able to serve as a theoretically-based model for understanding complex Web site communication situations...

  19. SELECTION OF ONTOLOGY FOR WEB SERVICE DESCRIPTION LANGUAGE TO ONTOLOGY WEB LANGUAGE CONVERSION

    OpenAIRE

    J. Mannar Mannan; M. Sundarambal; S. Raghul

    2014-01-01

    The Semantic Web aims to extend the current human-readable web by encoding some of the semantics of resources in a machine-processable form. As a Semantic Web component, Semantic Web Services (SWS) use a mark-up that makes data machine readable in a detailed and sophisticated way. One such language is Ontology Web Language (OWL). An existing conventional web service annotation can be changed to a semantic web service by mapping the Web Service Description Language (WSDL) with the semantic annotation of O...

  20. World Wide Web Metaphors for Search Mission Data

    Science.gov (United States)

    Norris, Jeffrey S.; Wallick, Michael N.; Joswig, Joseph C.; Powell, Mark W.; Torres, Recaredo J.; Mittman, David S.; Abramyan, Lucy; Crockett, Thomas M.; Shams, Khawaja S.; Fox, Jason M.

    2010-01-01

    A software program that searches and browses mission data emulates a Web browser, containing standard metaphors for Web browsing. By taking advantage of back-end URLs, users may save and share search states. Also, since a Web interface is familiar to users, training time is reduced. Familiar back and forward buttons move through a local search history. A refresh/reload button regenerates a query, and loads in any new data. URLs can be constructed to save search results. Adding context to the current search is also handled through a familiar Web metaphor. The query is constructed by clicking on hyperlinks that represent new components to the search query. The selection of a link appears to the user as a page change; the choice of links changes to represent the updated search and the results are filtered by the new criteria. Selecting a navigation link changes the current query and also the URL that is associated with it. The back button can be used to return to the previous search state. This software is part of the MSLICE release, which was written in Java. It will run on any current Windows, Macintosh, or Linux system.
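
    The back-end URL idea in this record amounts to serializing the search state into query parameters so that it can be bookmarked, shared and revisited with back/forward navigation. The sketch below shows the pattern with Python's standard urllib; the base URL and the instrument/sol/keyword parameters are hypothetical and are not taken from the MSLICE software.

```python
from urllib.parse import urlencode, urlsplit, parse_qs

BASE = "https://example.org/mslice/search"   # hypothetical search endpoint

def save_state(state):
    """Serialize the current search state into a shareable URL."""
    return f"{BASE}?{urlencode(state, doseq=True)}"

def load_state(url):
    """Recover the search state from a previously saved URL."""
    return parse_qs(urlsplit(url).query)

state = {"instrument": "MastCam", "sol": "120", "keyword": ["dust", "crater"]}
url = save_state(state)
print(url)
print(load_state(url))   # back/refresh simply re-runs the query encoded here
```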

  1. Lossy compression for Animated Web Visualisation

    Science.gov (United States)

    Prudden, R.; Tomlinson, J.; Robinson, N.; Arribas, A.

    2017-12-01

    This talk will discuss a technique for lossy data compression specialised for web animation. We set ourselves the challenge of visualising a full forecast weather field as an animated 3D web page visualisation. This data is richly spatiotemporal; however, it is routinely communicated to the public as a 2D map, and scientists are largely limited to visualising data via static 2D maps or 1D scatter plots. We wanted to present Met Office weather forecasts in a way that represents all the generated data. Our approach was to repurpose the technology used to stream high definition videos. This enabled us to achieve high rates of compression while being compatible with both web browsers and GPU processing. Since lossy compression necessarily involves discarding information, evaluating the results is an important and difficult problem. This is essentially a problem of forecast verification. The difficulty lies in deciding what it means for two weather fields to be "similar", as simple definitions such as mean squared error often lead to undesirable results. In the second part of the talk, I will briefly discuss some ideas for alternative measures of similarity.
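
    The talk's point about similarity measures can be made concrete with the usual mean squared error and peak signal-to-noise ratio definitions. The sketch below applies them to a synthetic field with numpy; the fields and the noise level are made up for illustration.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two equally shaped fields."""
    return float(np.mean((a - b) ** 2))

def psnr(a, b, peak=None):
    """Peak signal-to-noise ratio in dB; higher means more similar."""
    peak = peak if peak is not None else float(np.max(np.abs(a)))
    return 10.0 * np.log10(peak ** 2 / mse(a, b))

rng = np.random.default_rng(0)
original = rng.normal(size=(64, 64))                             # toy "weather field"
compressed = original + rng.normal(scale=0.05, size=(64, 64))    # lossy reconstruction

print(f"MSE  = {mse(original, compressed):.4f}")
print(f"PSNR = {psnr(original, compressed):.1f} dB")
```

    A field whose features are merely shifted in space can score poorly under MSE even though a forecaster would call it similar, which is why the talk frames evaluation as a forecast verification problem.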

  2. Web 2.0 Solutions to Wicked Climate Change Problems

    Directory of Open Access Journals (Sweden)

    Alanah Kazlauskas

    2010-01-01

    One of the most pressing 'wicked problems' facing humankind is climate change together with its many interrelated environmental concerns. The complexity of this set of problems can be overwhelming, as there is such diversity among both the interpretations of the scientific evidence and the viability of possible solutions. Among the social technologies associated with the second generation of the Internet known as Web 2.0, there are tools that allow people to communicate, coordinate and collaborate in ways that reduce their carbon footprint, and there is a potential for users to become part of the climate change solution. However, the way forward is not obvious or easy, as Web 2.0, while readily accepted in the chaotic social world, is often treated with suspicion in the more ordered world of business and government. This paper applies a holistic theoretical sense-making framework to research and practice on potential Web 2.0 solutions to climate change problems. The suite of issues, activities and tools involved are viewed as an ecosystem where all elements are dynamic and inter-related. Through such innovative thinking the Information Systems community can make a valuable contribution to a critical global problem and hence find a new relevance as part of the solution.

  3. The Power and Peril of Web 3.0: It's More than Just Semantics

    Science.gov (United States)

    Ohler, Jason

    2010-01-01

    The Information Age has been built, in part, on the belief that more information is always better. True to that sentiment, people have found ways to make a lot of information available to the masses--perhaps more than anyone ever imagined. The goal of the Semantic Web, often called Web 3.0, is for users to spend less time looking for information…

  4. Virtual Web Services

    OpenAIRE

    Rykowski, Jarogniew

    2007-01-01

    In this paper we propose an application of software agents to provide Virtual Web Services. A Virtual Web Service is a linked collection of several real and/or virtual Web Services, and public and private agents, accessed by the user in the same way as a single real Web Service. A Virtual Web Service allows unrestricted comparison, information merging, pipelining, etc., of data coming from different sources and in different forms. Detailed architecture and functionality of a single Virtual We...

  5. WebScore: An Effective Page Scoring Approach for Uncertain Web Social Networks

    Directory of Open Access Journals (Sweden)

    Shaojie Qiao

    2011-10-01

    To effectively score pages with uncertainty in web social networks, we first propose a new concept called the transition probability matrix and formally define the uncertainty in web social networks. Second, we propose a hybrid page scoring algorithm, called WebScore, based on the PageRank algorithm and three centrality measures including degree, betweenness, and closeness. In particular, WebScore takes full consideration of the uncertainty of web social networks by computing the transition probability from one page to another. The basic idea of WebScore is to: (1) integrate uncertainty into PageRank in order to accurately rank pages, and (2) apply the centrality measures to calculate the importance of pages in web social networks. In order to verify the performance of WebScore, we developed a web social network analysis system which can partition web pages into distinct groups and score them in an effective fashion. Finally, we conducted extensive experiments on real data, and the results show that WebScore is effective at scoring uncertain pages with less time cost than PageRank and centrality measure based page scoring algorithms.
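
    A generic version of the PageRank component of WebScore can be written as a power iteration over a row-stochastic transition probability matrix. The sketch below is not the authors' implementation; the damping factor, the 4-page matrix and its probabilities are assumptions.

```python
import numpy as np

def pagerank(P, damping=0.85, tol=1e-10, max_iter=100):
    """Power iteration for PageRank. P[i, j] is the transition probability
    from page i to page j; each row must sum to 1."""
    n = P.shape[0]
    r = np.full(n, 1.0 / n)
    teleport = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = damping * r @ P + (1.0 - damping) * teleport
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new
    return r

# Hypothetical 4-page web with uncertain links expressed as probabilities.
P = np.array([
    [0.0, 0.7, 0.3, 0.0],
    [0.2, 0.0, 0.4, 0.4],
    [0.5, 0.0, 0.0, 0.5],
    [0.0, 1.0, 0.0, 0.0],
])
print(pagerank(P).round(3))
```

    Centrality measures such as degree, betweenness and closeness could then be computed on the same graph and combined with these scores, which is the hybrid idea the record describes.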

  6. The structure of the pelagic food web in relation to water column structure in the Skagerrak

    DEFF Research Database (Denmark)

    Kiørboe, Thomas; Kaas, H.; Kruse, B.

    1990-01-01

    by a doming of the pycnocline, with a deep mixed layer along the periphery and a very shallow pycnocline in central parts. Average phytoplankton size increased with the depth of the upper mixed layer, and the central stratified area was characterized by small flagellates while large and chain-forming diatoms...... on particle surface area rather than particle volume or chl a, and showed a distributional pattern that was nearly the inverse of the distribution of copepod activity. That is, peak bacterial growth rates occurred in central, stratified parts and lower rates were found along the margin with a deep mixed layer....... Thus a 'microbial loop' type of food web seemed to be evolving in the central, strongly stratified parts of the Skagerrak, while a shorter 'classical' type of food web appeared to dominate along the margin. The relation between food web structure and vertical mixing processes observed on oceanwide...

  7. Responsive web design workflow

    OpenAIRE

    LAAK, TIMO

    2013-01-01

    Responsive Web Design Workflow is a literature review about Responsive Web Design, a web standards based modern web design paradigm. The goals of this research were to define what responsive web design is, determine its importance in building modern websites and describe a workflow for responsive web design projects. Responsive web design is a paradigm to create adaptive websites, which respond to the properties of the media that is used to render them. The three key elements of responsi...

  8. Learning about the Human Genome. Part 2: Resources for Science Educators. ERIC Digest.

    Science.gov (United States)

    Haury, David L.

    This ERIC Digest identifies how the human genome project fits into the "National Science Education Standards" and lists Human Genome Project Web sites found on the World Wide Web. It is a resource companion to "Learning about the Human Genome. Part 1: Challenge to Science Educators" (Haury 2001). The Web resources and…

  9. The tip-of-the-tongue heuristic: How tip-of-the-tongue states confer perceptibility on inaccessible words.

    Science.gov (United States)

    Cleary, Anne M; Claxton, Alexander B

    2015-09-01

    This study shows that the presence of a tip-of-the-tongue (TOT) state--the sense that a word is in memory when its retrieval fails--is used as a heuristic for inferring that an inaccessible word has characteristics that are consistent with greater word perceptibility. When reporting a TOT state, people judged an unretrieved word as more likely to have previously appeared darker and clearer (Experiment 1a), and larger (Experiment 1b). They also judged an unretrieved word as more likely to be a high frequency word (Experiment 2). This was not because greater fluency or word perceptibility at encoding led to later TOT states: Increased fluency or perceptibility of a word at encoding did not increase the likelihood of a TOT state for it when its retrieval later failed; moreover, the TOT state was not diagnostic of an unretrieved word's fluency or perceptibility when it was last seen. Results instead suggest that TOT states themselves are used as a heuristic for inferring the likely characteristics of unretrieved words. During the uncertainty of retrieval failure, TOT states are a source of information on which people rely in reasoning about the likely characteristics of the unretrieved information, choosing characteristics that are consistent with greater fluency of processing. (c) 2015 APA, all rights reserved).

  10. Reliable execution based on CPN and skyline optimization for Web service composition.

    Science.gov (United States)

    Chen, Liping; Ha, Weitao; Zhang, Guojun

    2013-01-01

    With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web service composition is widely used in business environments. Because of the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial for selecting the web services that take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which involves the transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of a transactional composition Web service (TCWS) is formalized by CPN properties. To identify the best services with respect to QoS properties from the candidate service sets formed in the TCSW-CPN, we use skyline computation to retrieve dominant Web services. This overcomes the significant information loss caused by reducing individual scores to an overall similarity. We evaluate our approach experimentally using both real and synthetically generated datasets.
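
    The skyline step described here keeps every candidate service that no other candidate dominates on all QoS attributes. The sketch below is a minimal quadratic-time illustration; the service names and QoS figures are made up, and the attributes are arranged so that smaller values are better.

```python
def dominates(a, b):
    """True if service a is at least as good as b on every QoS attribute
    (smaller is better here) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a["qos"], b["qos"])) and any(
        x < y for x, y in zip(a["qos"], b["qos"])
    )

def skyline(services):
    """Return the services that no other service dominates."""
    return [s for s in services
            if not any(dominates(t, s) for t in services if t is not s)]

# Hypothetical candidates: QoS = (response time in ms, cost, 1 - availability).
candidates = [
    {"name": "WS-A", "qos": (120, 0.05, 0.02)},
    {"name": "WS-B", "qos": (200, 0.02, 0.02)},
    {"name": "WS-C", "qos": (150, 0.06, 0.03)},   # dominated by WS-A
]
print([s["name"] for s in skyline(candidates)])   # -> ['WS-A', 'WS-B']
```

    Keeping the whole non-dominated set, rather than collapsing the attributes into one weighted score, is what avoids the information loss mentioned in the abstract.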

  11. Student participation in World Wide Web-based curriculum development of general chemistry

    Science.gov (United States)

    Hunter, William John Forbes

    1998-12-01

    This thesis describes an action research investigation of improvements to instruction in General Chemistry at Purdue University. Specifically, the study was conducted to guide continuous reform of curriculum materials delivered via the World Wide Web by involving students, instructors, and curriculum designers. The theoretical framework for this study was based upon constructivist learning theory, and knowledge claims were developed using an inductive analysis procedure. The results of this study are assertions made in three domains: learning chemistry content via the World Wide Web, learning about learning via the World Wide Web, and learning about participation in an action research project. In the chemistry content domain, students were able to learn chemical concepts that utilized 3-dimensional visualizations, but not textual and graphical information delivered via the Web. In the learning via the Web domain, the use of feedback, the placement of supplementary aids, navigation, and the perception of conceptual novelty were all important to students' use of the Web. In the participation in action research domain, students learned about the complexity of curriculum development, and valued their empowerment as part of the process.

  12. Research on Web-Based Networked Virtual Instrument System

    International Nuclear Information System (INIS)

    Tang, B P; Xu, C; He, Q Y; Lu, D

    2006-01-01

    The web-based networked virtual instrument (NVI) system is designed using the object oriented methodology (OOM). The architecture of the NVI system consists of two major parts: client-web server interaction and instrument server-virtual instrument (VI) communication. The web server communicates with the instrument server and the clients connected to it over the Internet, and it handles identifying the user's name, managing the connection between the user and the instrument server, and adding, removing and configuring VI information. The instrument server handles setting the parameters of a VI, confirming the condition of a VI and saving the VI's condition information into the database. The NVI system is required to be a general-purpose measurement system that is easy to maintain, adapt and extend. Virtual instruments are connected to the instrument server, and clients can remotely configure and operate these virtual instruments. An application of the NVI system is given at the end of the paper.

  13. Accelerating cancer systems biology research through Semantic Web technology.

    Science.gov (United States)

    Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S

    2013-01-01

    Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute's caBIG, so users can interact with the DMR not only through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers' intellectual property. Copyright © 2012 Wiley Periodicals, Inc.

  14. Web services foundations

    CERN Document Server

    Bouguettaya, Athman; Daniel, Florian

    2013-01-01

    Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet.Web Services Foundations is the first installment of a two-book collection coverin

  15. Integrating thematic web portal capabilities into the NASA Earthdata Web Infrastructure

    Science.gov (United States)

    Wong, M. M.; McLaughlin, B. D.; Huang, T.; Baynes, K.

    2015-12-01

    The National Aeronautics and Space Administration (NASA) acquires and distributes an abundance of Earth science data on a daily basis to a diverse user community worldwide. To assist the scientific community and general public in achieving a greater understanding of the interdisciplinary nature of Earth science and of key environmental and climate change topics, the NASA Earthdata web infrastructure is integrating new methods of presenting and providing access to Earth science information, data, research and results. This poster will present the process of integrating thematic web portal capabilities into the NASA Earthdata web infrastructure, with examples from the Sea Level Change Portal. The Sea Level Change Portal will be a source of current NASA research, data and information regarding sea level change. The portal will provide sea level change information through articles, graphics, videos and animations, an interactive tool to view and access sea level change data and a dashboard showing sea level change indicators. Earthdata is a part of the Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA's Earth science data from various sources - satellites, aircraft, field measurements, and various other programs. It is comprised of twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), data discovery and service access client (Reverb and Earthdata Search), dataset directory (Global Change Master Directory - GCMD), near real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), Global Imagery Browse Services, the Earthdata Code Collaborative and a host of other discipline specific data discovery, data access, data subsetting and visualization tools.

  16. Semantic web for dummies

    CERN Document Server

    Pollock, Jeffrey T

    2009-01-01

    Semantic Web technology is already changing how we interact with data on the Web. By connecting random information on the Internet in new ways, Web 3.0, as it is sometimes called, represents an exciting online evolution. Whether you're a consumer doing research online, a business owner who wants to offer your customers the most useful Web site, or an IT manager eager to understand Semantic Web solutions, Semantic Web For Dummies is the place to start! It will help you:Know how the typical Internet user will recognize the effects of the Semantic WebExplore all the benefits the data Web offers t

  17. Building web information systems using web services

    NARCIS (Netherlands)

    Frasincar, F.; Houben, G.J.P.M.; Barna, P.; Vasilecas, O.; Eder, J.; Caplinskas, A.

    2006-01-01

    Hera is a model-driven methodology for designing Web information systems. In the past a CASE tool for the Hera methodology was implemented. This software had different components that together form one centralized application. In this paper, we present a distributed Web service-oriented architecture

  18. Affordances of students' using the World Wide Web as a publishing medium in project-based learning environments

    Science.gov (United States)

    Bos, Nathan Daniel

    This dissertation investigates the emerging affordance of the World Wide Web as a place for high school students to become authors and publishers of information. Two empirical studies lay groundwork for student publishing by examining learning issues related to audience adaptation in writing, motivation and engagement with hypermedia, design, problem-solving, and critical evaluation. Two models of student publishing on the World Wide Web were investigated over the course of two 11th grade project-based science curricula. In the first curricular model, students worked in pairs to design informative hypermedia projects about infectious diseases that were published on the Web. Four case studies were written, drawing on both product- and process-related data sources. Four theoretically important findings are illustrated through these cases: (1) multimedia, especially graphics, seemed to catalyze some students' design processes by affecting the sequence of their design process and by providing a connection between the science content and their personal interest areas, (2) hypermedia design can demand high levels of analysis and synthesis of science content, (3) students can learn to think about science content representation through engagement with challenging design tasks, and (4) students' consideration of an outside audience can be facilitated by teacher-given design principles. The second Web-publishing model examines how students critically evaluate scientific resources on the Web, and how students can contribute to the Web's organization and usability by publishing critical reviews. Students critically evaluated Web resources using a four-part scheme: summarization of content, evaluation of credibility, evaluation of organizational structure, and evaluation of appearance. Content analyses comparing students' reviews and reviewed Web documents showed that students were proficient at summarizing content of Web documents, identifying their publishing

  19. Desarrollo de herramientas inteligentes para la web semántica

    OpenAIRE

    Simari, Guillermo Ricardo; García, Alejandro Javier; Tohmé, Fernando Abel; Delladio, Telma; Martínez, Diego C.; Gómez, Sergio Alejandro; Estévez, Elsa Clara; Fillottrani, Pablo Rubén; Castro, Silvia Mabel; Martig, Sergio R.; Ardenghi, Jorge Raúl; Echaiz, Javier

    2004-01-01

    The general objective of this project, within the framework of the recommendations of the World Wide Web Consortium (W3C), is the development of the tools needed to create the cognitive-support infrastructure and to enable the exploitation, by autonomous agents, of the knowledge stored on the Web. This includes knowledge representation languages; tools for the creation, maintenance and interactive visualization of ontologies; and specialized inference engines that…

  20. Changing Academic Teaching with Web 2.0 Technologies

    Science.gov (United States)

    Newland, Barbara; Byles, Linda

    2014-01-01

    Academic teaching can change with the use of Web 2.0 technologies, such as blogs and wikis, as these enable a different pedagogical approach through collaborative learning and the social construction of knowledge. Student expectations of their university learning experience have changed as they expect e-learning to be part of the learning…

  1. Development and Evaluation of an Interactive WebQuest Environment: "Web Macerasi"

    Science.gov (United States)

    Gulbahar, Yasemin; Madran, R. Orcun; Kalelioglu, Filiz

    2010-01-01

    This study was conducted to develop a web-based interactive system, Web Macerasi, for teaching-learning and evaluation purposes, and to find out the possible effects of this system. The study has two stages. In the first stage, a WebQuest site was designed as an interactive system in which various Internet and web technologies were used for…

  2. Web Analytics: A Picture of the Academic Library Web Site User

    Science.gov (United States)

    Black, Elizabeth L.

    2009-01-01

    This article describes the usefulness of Web analytics for understanding the users of an academic library Web site. Using a case study, the analysis describes how Web analytics can answer questions about Web site user behavior, including when visitors come, the duration of the visit, how they get there, the technology they use, and the most…

  3. The effect of reminders in a web-based intervention study

    DEFF Research Database (Denmark)

    Svensson, M.; Svensson, T.; Hansen, A. W.

    2012-01-01

    Knowledge on effective strategies to encourage participation in epidemiological web-based research is scant. We studied the effects of reminders on overall participation. 3,876 employees were e-mailed a baseline web-based lifestyle questionnaire. Nine months later, a follow-up questionnaire was sent. To encourage study participation, 4-5 and 11 e-mail reminders were sent at baseline and follow-up, respectively. Additional reminders (media articles, flyers, SMS etc.) were also administered. Reminders (e-mails + additional) were given in low (≤6 reminders…) About 29 % responded before any e-mail reminder, followed by 26 and 45 % after 1 and ≥2 e-mail reminders, respectively. Participant characteristics were not related to when the participants responded. The 4-5 e-mail reminders increased total response rate by 15 %, the eleven by 21

  4. Web Mining and Social Networking

    DEFF Research Database (Denmark)

    Xu, Guandong; Zhang, Yanchun; Li, Lin

    This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web mining, and the issue of how to incorporate web mining into web personalization and recommendation systems, are also reviewed. Additionally, the volume explores web community mining and analysis to find the structural, organizational and temporal developments of web communities and reveal the societal sense of individuals or communities. The volume will benefit both academic and industry communities interested in the techniques and applications of web search, web data management, web mining and web knowledge discovery, as well as web community and social network analysis.

  5. Process development for green part printing using binder jetting additive manufacturing

    Science.gov (United States)

    Miyanaji, Hadi; Orth, Morgan; Akbar, Junaid Muhammad; Yang, Li

    2018-05-01

    Originally developed decades ago, the binder jetting additive manufacturing (BJ-AM) process possesses various advantages compared to other additive manufacturing (AM) technologies, such as broad material compatibility and technological expandability. However, the adoption of BJ-AM has been limited by the lack of fundamental understanding of the process principles and characteristics, as well as the relatively few systematic design guidelines that are available. In this work, the process design considerations for BJ-AM in green part fabrication were discussed in detail in order to provide a comprehensive perspective of design for additive manufacturing for this process. Various process factors, including binder saturation, in-process drying, powder spreading, powder feedstock characteristics, binder characteristics and post-process curing, could significantly affect the printing quality of the green parts, such as geometrical accuracy and part integrity. For powder feedstock with low flowability, even though process parameters could be optimized to partially offset the printing feasibility issue, the quality of the green parts will be intrinsically limited due to the existence of large internal voids that are inaccessible to the binder. In addition, during process development, the balanced combination of saturation level and in-process drying is of critical importance in the quality control of the green parts.

  6. Web API Fragility : How Robust is Your Web API Client

    NARCIS (Netherlands)

    Espinha, T.; Zaidman, A.; Gross, H.G.

    2014-01-01

    Web APIs provide a systematic and extensible approach for application-to-application interaction. A large number of mobile applications makes use of web APIs to integrate services into apps. Each Web API’s evolution pace is determined by their respective developer and mobile application developers

  7. Sounds of Web Advertising

    DEFF Research Database (Denmark)

    Jessen, Iben Bredahl; Graakjær, Nicolai Jørgensgaard

    2010-01-01

    Sound seems to be a neglected issue in the study of web ads. Web advertising is predominantly regarded as a visual phenomenon–commercial messages, as for instance banner ads that we watch, read, and eventually click on–but only rarely as something that we listen to. The present chapter presents an overview of the auditory dimensions in web advertising: Which kinds of sounds do we hear in web ads? What are the conditions and functions of sound in web ads? Moreover, the chapter proposes a theoretical framework in order to analyse the communicative functions of sound in web advertising. The main argument is that an understanding of the auditory dimensions in web advertising must include a reflection on the hypertextual settings of the web ad as well as a perspective on how users engage with web content.

  8. Processing biological literature with customizable Web services supporting interoperable formats.

    Science.gov (United States)

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.

  9. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either an IT or a marketing branch. The paper contributes by highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  10. Carnegie Science Academy Web Site

    Science.gov (United States)

    Kotwicki, John; Atzinger, Joe; Turso, Denise

    1997-11-01

    The Carnegie Science Academy is a professional society "For Teens...By Teens" at the Carnegie Science Center in Pittsburgh. The CSA Web Site [ http://csa.clpgh.org ] is designed for teens who have an interest in science and technology. This online or virtual science academy provides resources for teens in high school science classes. The Web site also allows students around the world to participate and communicate with other students, discuss current events in science, share opinions, find answers to questions, or make online friends. Visitors can enjoy the main components of the site or sign up for a free membership which allows access to our chat room for monthly meeting, online newsletter, members forum, and much more. Main components to the site include a spot for cool links and downloads, available for any visitor to download or view. Online exhibits are created by students to examine and publish an area of study and also allow teachers to easily post classroom activities as exhibits by submitting pictures and text. Random Access, the interactive part of the academy, allows users to share ideas and opinions. Planet CSA focuses on current events in science and the academy. In the future the CSA Web site will become a major resource for teens and science teachers providing materials that will allow students to further enhance their interest and experiences in science.

  11. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...
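
    Below is a minimal sketch of the signature-based detection idea described in this record: configurable regular expressions are matched against request strings taken from a web server log. It is not the thesis implementation; the signature set, signature names, and the sample request are illustrative assumptions.

```python
"""Signature-based classification of suspicious web requests.
The signatures below are illustrative, not a complete rule set."""

import re
from urllib.parse import unquote

# Configurable signatures: name -> regular expression (case-insensitive where useful).
SIGNATURES = {
    "sql_injection": re.compile(r"(\bunion\b.+\bselect\b|\bor\b\s+1=1|--\s|;--)", re.I),
    "xss": re.compile(r"(<script\b|javascript:|onerror\s*=)", re.I),
    "path_traversal": re.compile(r"(\.\./|\.\.\\)"),
}

def classify_request(raw_request: str):
    """Return the list of signature names that match the URL-decoded request."""
    decoded = unquote(raw_request)
    return [name for name, pattern in SIGNATURES.items() if pattern.search(decoded)]

if __name__ == "__main__":
    line = "GET /item?id=1%20UNION%20SELECT%20password%20FROM%20users HTTP/1.1"
    print(classify_request(line))  # ['sql_injection']
```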

  12. A resource-oriented architecture for a Geospatial Web

    Science.gov (United States)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    … systems using the same Web technologies and specifications but according to a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant Web characteristics, then, in order to build a Geospatial Web, it is necessary that its architecture satisfies all the REST constraints. One of them is of particular importance: the adoption of a Uniform Interface. It prescribes that all the geospatial resources must be accessed through the same interface; moreover, according to the REST style, this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proved flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). Restricting the scope to a subset of resources, it would be possible to identify other generic actions which are meaningful for all of them. For example, for geospatial resources, subsetting, resampling, interpolation and coordinate reference system transformation functionalities are candidates for a uniform interface. However, an investigation is needed to clarify the semantics of those actions for different resources, and consequently whether they can really ascend to the role of generic interface operations. Concerning point a) (identification of resources), it is required that every resource addressable in the Geospatial Web has its own identifier (e.g. a URI). This allows citation and re-use of resources simply by providing the URI. OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for it. Concerning
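
    The sketch below illustrates, under our own assumptions, the "uniform interface" idea discussed above: every geospatial resource is addressed by a URI and manipulated only through a small set of generic operations (the CRUD verbs plus an illustrative subsetting action). It is an in-memory toy, not an OGC, OPeNDAP, or KVP implementation; the class, URI, and data layout are hypothetical.

```python
"""A uniform CRUD-plus-subset interface over URI-identified geospatial resources."""

from typing import Any, Dict

class GeoResourceStore:
    def __init__(self):
        self._resources: Dict[str, Dict[str, Any]] = {}

    # Uniform CRUD operations, keyed only by the resource identifier (URI).
    def create(self, uri: str, representation: Dict[str, Any]) -> None:
        self._resources[uri] = representation

    def retrieve(self, uri: str) -> Dict[str, Any]:
        return self._resources[uri]

    def update(self, uri: str, representation: Dict[str, Any]) -> None:
        self._resources[uri].update(representation)

    def delete(self, uri: str) -> None:
        del self._resources[uri]

    # A candidate generic geospatial operation: spatial subsetting by bounding box.
    def subset(self, uri: str, bbox) -> Dict[str, Any]:
        res = self.retrieve(uri)
        min_lon, min_lat, max_lon, max_lat = bbox
        points = [(lon, lat, v) for lon, lat, v in res["points"]
                  if min_lon <= lon <= max_lon and min_lat <= lat <= max_lat]
        return {"crs": res["crs"], "points": points}

store = GeoResourceStore()
store.create("https://example.org/obs/sst", {
    "crs": "EPSG:4326",
    "points": [(-10.0, 35.0, 18.2), (5.0, 40.0, 16.9), (20.0, 55.0, 9.4)],
})
print(store.subset("https://example.org/obs/sst", bbox=(-15, 30, 10, 45)))
```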

  13. Engineering Web Applications

    DEFF Research Database (Denmark)

    Casteleyn, Sven; Daniel, Florian; Dolog, Peter

    Nowadays, Web applications are almost omnipresent. The Web has become a platform not only for information delivery, but also for eCommerce systems, social networks, mobile services, and distributed learning environments. Engineering Web applications involves many intrinsic challenges due to their distributed nature, content orientation, and the requirement to make them available to a wide spectrum of users who are unknown in advance. The authors discuss these challenges in the context of well-established engineering processes, covering the whole product lifecycle from requirements engineering through design and implementation to deployment and maintenance. They stress the importance of models in Web application development, and they compare well-known Web-specific development processes like WebML, WSDM and OOHDM to traditional software development approaches like the waterfall model and the spiral model.

  14. Measuring participant rurality in Web-based interventions

    Directory of Open Access Journals (Sweden)

    McKay H Garth

    2007-08-01

    Full Text Available Abstract Background Web-based health behavior change programs can reach large groups of disparate participants and thus hold promise of becoming important public health tools. Data on participant rurality can complement other demographic measures to deepen our understanding of the success of these programs. Specifically, analysis of participant rurality can inform recruitment and social marketing efforts, and facilitate the targeting and tailoring of program content. Rurality analysis can also help evaluate the effectiveness of interventions across population groupings. Methods We describe how the RUCA (Rural-Urban Commuting Area Codes) methodology can be used to examine results from two Randomized Controlled Trials of Web-based tobacco cessation programs: the ChewFree.com project for smokeless tobacco cessation and the Smokers' Health Improvement Program (SHIP) project for smoking cessation. Results Using the RUCA methodology helped to highlight the extent to which both Web-based interventions reached a substantial percentage of rural participants. The ChewFree program was found to have more rural participation, which is consistent with the greater prevalence of smokeless tobacco use in rural settings as well as ChewFree's multifaceted recruitment program that specifically targeted rural settings. Conclusion Researchers of Web-based health behavior change programs targeted to the US should routinely include RUCAs as a part of analyzing participant demographics. Researchers in other countries should examine rurality indices germane to their country.
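
    A hypothetical sketch of the kind of rurality tabulation described above: participants' ZIP codes are mapped to RUCA primary codes and then dichotomized. The ZIP-to-RUCA lookup table and the rural cut-point used here (primary codes 4-10 treated as "rural") are illustrative assumptions, not the study's actual data or definition; real lookup tables are published by USDA/WWAMI.

```python
"""Classify Web intervention participants as rural/urban from assumed RUCA codes."""

from collections import Counter

# Hypothetical ZIP -> RUCA primary code lookup.
ZIP_TO_RUCA = {"97031": 4, "97201": 1, "59301": 10, "97401": 2}

def rurality(zip_code: str) -> str:
    code = ZIP_TO_RUCA.get(zip_code)
    if code is None:
        return "unknown"
    return "rural" if code >= 4 else "urban"   # assumed dichotomy, for illustration only

participants = ["97031", "97201", "59301", "97401", "00000"]
print(Counter(rurality(z) for z in participants))
# e.g. Counter({'rural': 2, 'urban': 2, 'unknown': 1})
```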

  15. MACHINE LEARNING IMPLEMENTATION FOR THE CLASSIFICATION OF ATTACKS ON WEB SYSTEMS. PART 1

    Directory of Open Access Journals (Sweden)

    K. Smirnova

    2017-08-01

    Full Text Available The possibility of applying machine learning to the classification of malicious requests to a Web application is considered. This approach excludes the use of deterministic analysis systems (for example, expert systems) and is instead based on a cascade of neural networks or perceptrons, an approximate model of the real human brain. The main idea of the work is to make it possible to describe complex attack vectors consisting of feature sets and abstract terms, to compile a training sample, and to control the quality of recognition and classification for each of the participating layers (networks), with the ability to adjust not the entire network but only the small part of it in whose training a mistake or inaccuracy crept in. The design of the developed network can be described as a cascaded, scalable neural network. The developed intrusion detection system uses a three-layer neural network. Layers can be built independently of each other in cascades. In the first layer there is a corresponding network for each attack class to be recognized, and correctness is checked on that network. To train this layer, we have chosen classes that can be classified unambiguously as yes or no, that is, classes that are linearly separable. Thus, a layer is obtained not just of neurons but of microsets of neurons, which can best determine whether a given data class is present in the query or not. The following layers are not trained to recognize the attacks themselves; they are trained to recognize that a set of attacks creates certain threats. This makes it possible to recognize more accurately an attacker's attempts to bypass the defense system, and to classify the target of the attack, not just its occurrence. Simple layering makes it possible to minimize the percentage of false positives.
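
    The following is an illustrative sketch, not the authors' system, of the first-layer idea: one small binary classifier per attack class, each deciding whether that class of attack is present in a request. The token list, the toy training data, and the from-scratch perceptron are assumptions made for the example.

```python
"""Per-class binary perceptrons forming the first layer of a cascaded classifier."""

from typing import Dict, List

TOKENS = ["select", "union", "<script", "../", "or 1=1", "onerror"]

def features(request: str) -> List[int]:
    """Toy bag-of-tokens feature extraction over the raw request string."""
    r = request.lower()
    return [1 if t in r else 0 for t in TOKENS]

class Perceptron:
    def __init__(self, n: int, lr: float = 0.1):
        self.w = [0.0] * n
        self.b = 0.0
        self.lr = lr

    def predict(self, x: List[int]) -> int:
        return 1 if sum(wi * xi for wi, xi in zip(self.w, x)) + self.b > 0 else 0

    def train(self, data, epochs: int = 20):
        for _ in range(epochs):
            for x, y in data:
                err = y - self.predict(x)
                if err:
                    self.w = [wi + self.lr * err * xi for wi, xi in zip(self.w, x)]
                    self.b += self.lr * err

# One binary classifier per attack class (the "microset" layer).
training: Dict[str, list] = {
    "sqli": [(features("id=1 or 1=1"), 1), (features("id=42"), 0),
             (features("union select *"), 1), (features("name=alice"), 0)],
    "xss":  [(features("<script>alert(1)</script>"), 1), (features("q=books"), 0),
             (features("img onerror=..."), 1), (features("page=2"), 0)],
}

layer = {}
for cls, data in training.items():
    p = Perceptron(len(TOKENS))
    p.train(data)
    layer[cls] = p

request = "GET /search?q=1 UNION SELECT password"
print({cls: p.predict(features(request)) for cls, p in layer.items()})
# e.g. {'sqli': 1, 'xss': 0}
```

    In the article's scheme, later layers would consume these per-class decisions to reason about combined threats; the sketch stops at the first layer.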

  16. Baggy paper webs : Effect of uneven moisture and grammage profiles in different process steps

    OpenAIRE

    Land, Cecilia

    2010-01-01

    One of the problems encountered in paper converting is caused by the occurrence of "baggy webs", which essentially is when the tension profile of the paper web is uneven. In an area with low tension the paper is longer, which results in bagginess. The baggy parts can not usually be stretched to even out the tension of the paper web in a converting machine, with the result that runnability problems are likely to occur. The aim of the work described in this thesis was to investigate three parti...

  17. Web Application Vulnerabilities

    OpenAIRE

    Yadav, Bhanu

    2014-01-01

    Web application security has been a major issue in information technology since the emergence of dynamic web applications. The main objective of this project was to carry out a detailed study of the top three web application vulnerabilities, namely injection, cross-site scripting, and broken authentication and session management, to present the situations in which an application can be vulnerable to these web threats, and finally to provide preventative measures against them.
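
    As a brief sketch of two standard preventative measures of the kind the project surveys, the example below shows parameterized SQL queries against injection and output escaping against cross-site scripting. It uses only the Python standard library; the schema and inputs are illustrative, and it is not taken from the thesis itself.

```python
"""Two common web-application defences: bound SQL parameters and HTML escaping."""

import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# 1. Injection defence: never interpolate user input into SQL text;
#    pass it as a bound parameter instead.
user_input = "alice' OR '1'='1"
rows = conn.execute("SELECT role FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)  # [] -- the malicious input is treated as a literal name, not SQL

# 2. XSS defence: escape user-controlled data before embedding it in HTML.
comment = "<script>alert('xss')</script>"
print(f"<p>{html.escape(comment)}</p>")  # the tag is rendered as inert text
```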

  18. Python 3 Web Development Beginner's Guide

    CERN Document Server

    Anders, Michel

    2011-01-01

    Part of Packt's Beginner's Guide Series, this book follows a sample application, with lots of screenshots, to help you get to grips with the techniques as quickly as possible. Moderately experienced Python programmers who want to learn how to create fairly complex, database-driven, cross browser compatible web apps that are maintainable and look good will find this book of most use. All key technologies except for Python 3 are explained in detail.

  19. WebMGA: a customizable web server for fast metagenomic sequence analysis.

    Science.gov (United States)

    Wu, Sitao; Zhu, Zhengwei; Fu, Liming; Niu, Beifang; Li, Weizhong

    2011-09-07

    The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers are facing tremendous challenges in metagenomic data analysis due to huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; also metagenomic annotation involves a wide range of computational tools, which are difficult to be installed and maintained by common users. The tools provided by the few available web servers are also limited and have various constraints such as login requirement, long waiting time, inability to configure pipelines etc. We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools such as ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contaminations, taxonomic analysis, functional annotation etc. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analysis or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. WebMGA offers to researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  20. WebMGA: a customizable web server for fast metagenomic sequence analysis

    Directory of Open Access Journals (Sweden)

    Niu Beifang

    2011-09-01

    Full Text Available Abstract Background The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers are facing tremendous challenges in metagenomic data analysis due to huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; also metagenomic annotation involves a wide range of computational tools, which are difficult to be installed and maintained by common users. The tools provided by the few available web servers are also limited and have various constraints such as login requirement, long waiting time, inability to configure pipelines etc. Results We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools such as ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contaminations, taxonomic analysis, functional annotation etc. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analysis or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. Conclusions WebMGA offers to researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  1. Web-Based Course Management and Web Services

    Science.gov (United States)

    Mandal, Chittaranjan; Sinha, Vijay Luxmi; Reade, Christopher M. P.

    2004-01-01

    The architecture of a web-based course management tool that has been developed at IIT [Indian Institute of Technology], Kharagpur and which manages the submission of assignments is discussed. Both the distributed architecture used for data storage and the client-server architecture supporting the web interface are described. Further developments…

  2. Web-page Prediction for Domain Specific Web-search using Boolean Bit Mask

    OpenAIRE

    Sinha, Sukanta; Duttagupta, Rana; Mukhopadhyay, Debajyoti

    2012-01-01

    Search Engine is a Web-page retrieval tool. Nowadays Web searchers save time by using an efficient search engine. To improve the performance of the search engine, we introduce a unique mechanism which gives Web searchers more prominent search results. In this paper, we discuss a domain-specific Web search prototype which generates the predicted Web-page list for a user-given search string using a Boolean bit mask.
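
    The sketch below illustrates, in general terms rather than as the authors' prototype, how a Boolean bit mask can support domain-specific page prediction: each page and each query is encoded as a bit vector over a fixed domain keyword list, and a bitwise AND decides which pages are predicted as relevant. The keyword list and pages are hypothetical.

```python
"""Boolean bit-mask filtering of candidate pages for a domain-specific search."""

DOMAIN_KEYWORDS = ["cricket", "batsman", "bowler", "wicket", "stadium"]

def to_mask(text: str) -> int:
    """Encode a text as a bit mask over the domain keyword list."""
    mask = 0
    for i, kw in enumerate(DOMAIN_KEYWORDS):
        if kw in text.lower():
            mask |= 1 << i
    return mask

PAGES = {
    "page-a": "Profile of a famous batsman and his wicket record",
    "page-b": "Stadium renovation news",
    "page-c": "Recipes for winter soups",
}

def predict(query: str):
    q = to_mask(query)
    # A page is predicted if it shares at least one set bit with the query mask.
    return [name for name, text in PAGES.items() if to_mask(text) & q]

print(predict("best batsman at the new stadium"))  # ['page-a', 'page-b']
```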

  3. Web X-Ray: Developing and Adopting Web Best Practices in Enterprises

    Directory of Open Access Journals (Sweden)

    Reinaldo Ferreira

    2016-12-01

    Full Text Available The adoption of Semantic Web technologies constitutes a promising approach to data structuring and integration, both for public and private usage. While these technologies have been around for some time, their adoption is behind overall expectations, particularly in the case of enterprises. With that in mind, we developed a Semantic Web Implementation Model that measures and facilitates the implementation of the technology. The advantages of using the proposed model are two-fold: the model serves as a guide for driving the implementation of the Semantic Web, and it helps to evaluate the impact of introducing the technology. The model was adopted by 19 enterprises in an Action Research intervention of one year with promising results: according to the model's scale, on average, the enterprises evolved from a 6% evaluation to 46% during that period. Furthermore, practical implementation recommendations, a typical consulting tool, were developed and adopted during the project by all enterprises, providing important guidelines for the identification of a development path that may be adopted on a larger scale. Meanwhile, the project also showed that most enterprises were interested in an even broader scope of the Implementation Model, and the ambition of an "All Web Technologies" approach arose: one model that could embrace the observable overlap of different Web generations, namely the Web of Documents, the Social Web, the Web of Data and, ultimately, the Web of Context, and that could combine evaluation and guidance for all enterprises to follow. That is the goal of the ongoing "Project Web X-ray", which aims to involve 200 enterprises in the adoption of best practices that may lead to their business development based on Web technologies. This paper presents a case of how Action Research promoted the simultaneous advancement of academic research and enterprise development, and introduces the framework and opportunities

  4. A Web-based e-learning course: integration of pathophysiology into pharmacology.

    Science.gov (United States)

    Tse, Mimi M Y; Lo, Lisa W L

    2008-11-01

    The Internet is becoming the preferred place to find information. Millions of people go online in search of health and medical information. Likewise, the demand for Web-based courses is growing. This paper presents the development, utilization, and evaluation of a Web-based e-learning course for nursing students, entitled Integration of Pathophysiology into Pharmacology. The pathophysiology component included cardiovascular, respiratory, central nervous and immune system diseases, while the pharmacology component was developed based on 150 commonly used drugs. One hundred and nineteen Year 1 nursing students took part in the course. The Web-based e-learning course materials were uploaded to a WebCT for students' self-directed learning and attempts to pass two scheduled online quizzes. At the end of the semester, students were given a questionnaire to measure the e-learning experience. Their experience in the e-learning course was a positive one. Students stated that they were able to understand rather than memorize the subject content, and develop their problem solving and critical thinking abilities. Online quizzes yielded satisfactory results. In the focus group interview, students indicated that they appreciated the time flexibility and convenience associated with Web-based learning, and also made good suggestions for enhancing Web-based learning. The Web-based approach is promising for teaching and learning pathophysiology and pharmacology for nurses and other healthcare professionals.

  5. Mapping Commercial Web 2.0 Worlds: Towards a New Critical Ontogenesis

    Directory of Open Access Journals (Sweden)

    Kenneth Werbin

    2009-01-01

    Full Text Available At the 2007 International Communication Association Conference, Web 2.0 was highlighted as an emergent topic of research with a keynote panel entitled 'What's so Significant about Social Networking? Web 2.0 and its Critical Potentials'. One of the thought-provoking moments during the panel was the juxtaposition of two very different and at first, contradictory theoretical approaches to the relationships between Web 2.0 and user-generated content. While Henry Jenkins focused on the democratic potential of online participatory culture as enabling new modes of knowledge production, Titziana Terranova argued for a post-Marxist perspective on Web 2.0 as a site of cultural colonization and expansion of new forms of capitalization on culture, affect and knowledge. The juxtaposition of these two very different critical approaches did not simply rehash the old divide between cultural theory, particularly active audience theory, and post-Marxist critical theory; rather, this debate over Web 2.0 suggested new possibilities for the synthesis and continued development of both sets of critiques. In other words, the event reinforced our belief that corporate colonization arguments do not provide an entirely adequate model for understanding Web 2.0. After all, commercial Web 2.0 spaces such as Facebook, YouTube and MySpace are important sites of cultural exchange and political discussion, in part because they almost entirely rely on user-generated content to exist.

  6. An Open-Source Web-Based Tool for Resource-Agnostic Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Daniel Torregrosa

    2014-09-01

    Full Text Available We present a web-based open-source tool for interactive translation prediction (ITP and describe its underlying architecture. ITP systems assist human translators by making context-based computer-generated suggestions as they type. Most of the ITP systems in literature are strongly coupled with a statistical machine translation system that is conveniently adapted to provide the suggestions. Our system, however, follows a resource-agnostic approach and suggestions are obtained from any unmodified black-box bilingual resource. This paper reviews our ITP method and describes the architecture of Forecat, a web tool, partly based on the recent technology of web components, that eases the use of our ITP approach in any web application requiring this kind of translation assistance. We also evaluate the performance of our method when using an unmodified Moses-based statistical machine translation system as the bilingual resource.
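
    As a toy sketch of resource-agnostic interactive translation prediction (not the Forecat system described above), the example below completes the translator's typed prefix with candidates obtained from an unmodified bilingual resource; a tiny glossary stands in for any black-box dictionary or translation memory.

```python
"""Prefix-based target-side suggestions drawn from a black-box bilingual resource."""

BILINGUAL_RESOURCE = {           # source phrase -> target phrase (illustrative)
    "world wide web": "red informática mundial",
    "web page": "página web",
    "search engine": "motor de búsqueda",
}

def suggest(source_sentence: str, typed_prefix: str, max_suggestions: int = 3):
    """Offer target-side completions whose source side appears in the sentence
    and whose target side extends what the translator has already typed."""
    candidates = []
    for src, tgt in BILINGUAL_RESOURCE.items():
        if src in source_sentence.lower() and tgt.startswith(typed_prefix.lower()):
            candidates.append(tgt)
    return candidates[:max_suggestions]

print(suggest("The World Wide Web changed publishing", typed_prefix="red"))
# ['red informática mundial']
```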

  7. ONTOLOGY BASED MEANINGFUL SEARCH USING SEMANTIC WEB AND NATURAL LANGUAGE PROCESSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    K. Palaniammal

    2013-10-01

    Full Text Available The semantic web extends the current World Wide Web by adding facilities for the machine-understood description of meaning. The ontology-based search model is used to enhance the efficiency and accuracy of information retrieval. Ontology is the core technology for the semantic web and provides a mechanism for representing formal and shared domain descriptions. In this paper, we propose ontology-based meaningful search using semantic web and Natural Language Processing (NLP) techniques in the educational domain. First we build the educational ontology, then we present the semantic search system. The search model consists of three parts: embedded spell-check, finding synonyms using the WordNet API, and querying the ontology using the SPARQL language. The results are sensitive both to spelling correction and to synonymous context. This paper provides more accurate results and the complete details for the selected field in a single page.
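
    The sketch below illustrates the three-step query pipeline described above: spell-correct the user term, expand it with synonyms, and build a SPARQL query over an educational ontology. It is not the paper's system; the vocabulary and synonym tables are hypothetical stand-ins (WordNet supplies the synonyms in the paper), and the query pattern is a generic label search.

```python
"""Spell-check, synonym expansion, and SPARQL query construction for ontology search."""

import difflib

VOCABULARY = ["course", "lecture", "examination", "laboratory"]
SYNONYMS = {"course": ["class", "module"], "examination": ["exam", "test"]}

def spell_check(term: str) -> str:
    match = difflib.get_close_matches(term.lower(), VOCABULARY, n=1)
    return match[0] if match else term.lower()

def expand(term: str) -> list:
    return [term] + SYNONYMS.get(term, [])

def build_sparql(terms: list) -> str:
    # Match any label containing one of the expanded terms (case-insensitive).
    pattern = "|".join(terms)
    return f"""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?subject ?label WHERE {{
        ?subject rdfs:label ?label .
        FILTER regex(?label, "{pattern}", "i")
    }}"""

user_query = "cuorse"                 # misspelled input
term = spell_check(user_query)        # -> "course"
print(build_sparql(expand(term)))     # query matching "course", "class", "module"
```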

  8. WebQuest como recurso para aprender história no IFAC

    Directory of Open Access Journals (Sweden)

    Uthant Benicio Paiva

    2017-12-01

    Full Text Available This case study presents a methodological tool called WebQuest as a facilitator for the teaching of history and as an attempt to reduce the distance between history teachers and their students. We used this tool as the methodology for the elaboration of the work. The teaching of history, and education as a whole, is undergoing behavioral and structural changes arising from access to the internet. The advances and ease of information access offered by the web are undeniable. In the practical work carried out with a 3rd grade integrated class at the Federal Institute of Acre (IFAC), WebQuest proved to be an innovative tool for teaching, arousing enthusiasm and interest on the part of students in the history discipline.

  9. Honours level Quantum Mechanics on the Web

    International Nuclear Information System (INIS)

    McKellar, B.H.J.; Thomson, M.J.

    1996-01-01

    The authors report on a pilot project to employ the World Wide Web (WWW) as an integral part of teaching Quantum Mechanics at the fourth year honours level. Although the project is still very much under development, the authors discuss what they have learnt about how the WWW can be used as a teaching resource and the difficulties encountered in doing so. 7 refs

  10. SEE: structured representation of scientific evidence in the biomedical domain using Semantic Web techniques.

    Science.gov (United States)

    Bölling, Christian; Weidlich, Michael; Holzhütter, Hermann-Georg

    2014-01-01

    Accounts of evidence are vital to evaluate and reproduce scientific findings and integrate data on an informed basis. Currently, such accounts are often inadequate, unstandardized and inaccessible for computational knowledge engineering even though computational technologies, among them those of the semantic web, are ever more employed to represent, disseminate and integrate biomedical data and knowledge. We present SEE (Semantic EvidencE), an RDF/OWL based approach for detailed representation of evidence in terms of the argumentative structure of the supporting background for claims even in complex settings. We derive design principles and identify minimal components for the representation of evidence. We specify the Reasoning and Discourse Ontology (RDO), an OWL representation of the model of scientific claims, their subjects, their provenance and their argumentative relations underlying the SEE approach. We demonstrate the application of SEE and illustrate its design patterns in a case study by providing an expressive account of the evidence for certain claims regarding the isolation of the enzyme glutamine synthetase. SEE is suited to provide coherent and computationally accessible representations of evidence-related information such as the materials, methods, assumptions, reasoning and information sources used to establish a scientific finding by adopting a consistently claim-based perspective on scientific results and their evidence. SEE allows for extensible evidence representations, in which the level of detail can be adjusted and which can be extended as needed. It supports representation of arbitrary many consecutive layers of interpretation and attribution and different evaluations of the same data. SEE and its underlying model could be a valuable component in a variety of use cases that require careful representation or examination of evidence for data presented on the semantic web or in other formats.
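
    A minimal sketch, assuming rdflib (version 6 or later) is installed, of the claim-centred representation style described above: a scientific claim, its provenance, and an argumentative support relation are expressed as RDF triples. The ex: class and property names are illustrative placeholders and do not reproduce the actual Reasoning and Discourse Ontology (RDO) vocabulary.

```python
"""RDF triples for a claim, the experiment supporting it, and its source publication."""

from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/evidence#")

g = Graph()
g.bind("ex", EX)

claim = EX["claim-1"]
experiment = EX["experiment-1"]
paper = EX["publication-1952"]

g.add((claim, RDF.type, EX.ScientificClaim))
g.add((claim, EX.statesThat, Literal("Glutamine synthetase was isolated from sheep brain.")))
g.add((experiment, RDF.type, EX.Experiment))
g.add((experiment, EX.supports, claim))   # argumentative relation
g.add((claim, EX.reportedIn, paper))      # provenance

print(g.serialize(format="turtle"))
```

    Because each statement is reified around an explicit claim resource, further layers of interpretation or conflicting evaluations can be attached to the same claim without rewriting the underlying data.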

  11. The Role of the Web Server in a Capstone Web Application Course

    Science.gov (United States)

    Umapathy, Karthikeyan; Wallace, F. Layne

    2010-01-01

    Web applications have become commonplace in the Information Systems curriculum. Much of the discussion about Web development for capstone courses has centered on the scripting tools. Very little has been discussed about different ways to incorporate the Web server into Web application development courses. In this paper, three different ways of…

  12. Measuring dynamic and kinetic information in the previously inaccessible supra-τ(c) window of nanoseconds to microseconds by solution NMR spectroscopy.

    Science.gov (United States)

    Ban, David; Sabo, T Michael; Griesinger, Christian; Lee, Donghan

    2013-09-26

    Nuclear Magnetic Resonance (NMR) spectroscopy is a powerful tool that has enabled experimentalists to characterize molecular dynamics and kinetics spanning a wide range of time-scales from picoseconds to days. This review focuses on addressing the previously inaccessible supra-τ(c) window (defined as τ(c) < supra-τ(c) < tens of microseconds). The first section reviews the experimental approaches, in particular relaxation dispersion (RD), available for probing motions in the supra-τ(c) window. In the second section, the current state of the art for RD is analyzed, as well as the considerable progress toward pushing the sensitivity of RD further into the supra-τ(c) scale by up to a factor of two (motion up to 25 μs). From the data obtained with these techniques and methodology, the importance of the supra-τ(c) scale for protein function and molecular recognition is becoming increasingly clearer, as the connection between motion on the supra-τ(c) scale and protein functionality from the experimental side is further strengthened with results from molecular dynamics simulations.

  13. A Web-based Examination System Based on PHP+MySQL.

    Science.gov (United States)

    Wen, Ji; Zhang, Yang; Yan, Yong; Xia, Shunren

    2005-01-01

    The design and implementation of a web-based examination system constructed with PHP and MySQL is presented in this paper. Three primary parts, for students, teachers and administrators, are introduced and analyzed in detail. Initial application has demonstrated the system's feasibility and practicality.

  14. Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems

    Science.gov (United States)

    Ponyik, Joseph G.; York, David W.

    2002-01-01

    Embedded Systems have traditionally been developed in a highly customized manner. The user interface hardware and software, along with the interface to the embedded system, are typically unique to the system for which they are built, resulting in extra cost to the system in terms of development time and maintenance effort. World Wide Web standards have been developed over the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms, but the World Wide Web standards allow them to interface without knowing the details of the system at the other end of the interface. Embedded Web Technology is the merging of Embedded Systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an Embedded System's internal network.
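
    As a minimal sketch, under our own assumptions, of the embedded-web idea described above, the example below has the embedded system expose its state over plain HTTP so that any standard browser can act as the user interface. It uses only the Python standard library, and the telemetry values are simulated placeholders for real device readings.

```python
"""A tiny HTTP status endpoint standing in for an embedded device's web interface."""

import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

def read_telemetry():
    # Placeholder for reading real sensors/registers on the embedded device.
    return {"temperature_c": round(20 + random.random() * 5, 2), "status": "nominal"}

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(read_telemetry()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Any browser pointed at http://<device>:8080/ now acts as the user interface.
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```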

  15. Web service composition: a semantic web and automated planning technique application

    Directory of Open Access Journals (Sweden)

    Jaime Alberto Guzmán Luna

    2008-09-01

    Full Text Available This article proposes applying semantic web and artificial intelligence planning techniques to a web services composition model, dealing with problems of ambiguity in web service descriptions and handling incomplete web information. The model uses OWL-S service descriptions and implements a planning technique which handles open-world semantics in its reasoning process to resolve these problems. This resulted in a web services composition system incorporating a module for interpreting OWL-S services and converting them into a planning problem in PDDL, a planning module handling incomplete information, and an execution service module that interacts concurrently with the planner to execute each service in the composition plan.

  16. Learning from WebQuests

    Science.gov (United States)

    Gaskill, Martonia; McNulty, Anastasia; Brooks, David W.

    2006-04-01

    WebQuests are activities in which students use Web resources to learn about school topics. WebQuests are advocated as constructivist activities and ones generally well regarded by students. Two experiments were conducted in school settings to compare learning using WebQuests versus conventional instruction. Students and teachers both enjoyed WebQuest instruction and spoke highly of it. In one experiment, however, conventional instruction led to significantly greater student learning. In the other, there were no significant differences in the learning outcomes between conventional versus WebQuest-based instruction.

  17. WebEase: Development of a Web-Based Epilepsy Self-Management Intervention

    OpenAIRE

    DiIorio, Colleen; Escoffery, Cam; Yeager, Katherine A.; Koganti, Archana; Reisinger, Elizabeth; Koganti, Archana; McCarty, Frances; Henry, Thomas R.; Robinson, Elise; Kobau, Rosemarie; Price, Patricia

    2008-01-01

    People with epilepsy must adopt many self-management behaviors, especially regarding medication adherence, stress management, and sleep quality. In response to the need for theory-based self-management programs that people with epilepsy can easily access, the WebEase Web site was created and tested for feasibility, acceptability, and usability. This article discusses the theoretical background and developmental phases of WebEase and lessons learned throughout the development process. The WebE...

  18. A Web System Trace Model and Its Application to Web Design

    OpenAIRE

    Kong, Xiaoying; Liu, Li; Lowe, David

    2007-01-01

    Traceability analysis is crucial to the development of web-centric systems, particularly those with frequent system changes, fine-grained evolution and maintenance, and high level of requirements uncertainty. A trace model at the level of the web system architecture is presented in this paper to address the specific challenges of developing web-centric systems. The trace model separates the concerns of different stakeholders in the web development life cycle into viewpoints; and c...

  19. Take control of iWeb

    CERN Document Server

    Sande, Steve

    2009-01-01

    Learn how to make useful, attractive Web sites with iWeb! Apple's iWeb aims to help you build an attractive Web site quickly and easily, but not all of iWeb's features are fully explained. If you want step-by-step instructions and plenty of time-saving tips, Web pro Steve Sande can help. In Take Control of iWeb, Steve walks you through all the steps for building an iWeb site and uploading it to .Mac or to another Web host. You can look over his shoulder as he enhances iWeb's templates with a designer's eye, using tools like masks, reflections, and Instant Alpha.Steve teaches you the best ways

  20. Evaluating company growth potential using AI and web media data

    DEFF Research Database (Denmark)

    Droll, Andrew; Khan, Shahzad; Tanev, Stoyan

    2017-01-01

    The article focuses on adapting and validating the use of an existing web search and analytics engine to evaluate the growth and competitive potential of new technology start-ups and existing firms in the newly emerging precision medicine sector. The results are based on two different search samples, the second of which includes new technology firms in the same sector. The firms in the second sample were used as test cases in examining whether their growth-related web search scores would relate to the degree of their innovativeness. The second part of the study applied the same methodology to the real-time monitoring of firms

  1. WebFTS: File Transfer Web Interface for FTS3

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    WebFTS is a web-delivered file transfer and management solution which allows users to invoke reliable, managed data transfers on distributed infrastructures. The fully open source solution offers a simple graphical interface through which the power of the FTS3 service can be accessed without the installation of any special grid tools. Created following simplicity and efficiency criteria, WebFTS allows the user to access and interact with multiple grid and cloud storage. The “transfer engine” used is FTS3, the service responsible for distributing the majority of LHC data across WLCG infrastructure. This provides WebFTS with reliable, multi-protocol, adaptively optimised data transfers.The talk will focus on the recent development which allows transfers from/to Dropbox and CERNBox (CERN ownCloud deployment)

  2. Web Search Engines

    OpenAIRE

    Rajashekar, TB

    1998-01-01

    The World Wide Web is emerging as an all-in-one information source. Tools for searching Web-based information include search engines, subject directories and meta search tools. We take a look at key features of these tools and suggest practical hints for effective Web searching.

  3. WebSpy: An Architecture for Monitoring Web Server Availability in a Multi-Platform Environment

    Directory of Open Access Journals (Sweden)

    Madhan Mohan Thirukonda

    2002-01-01

    Full Text Available For an electronic business (e-business), customer satisfaction can be the difference between long-term success and short-term failure. Customer satisfaction is highly impacted by Web server availability, as customers expect a Web site to be available twenty-four hours a day and seven days a week. Unfortunately, unscheduled Web server downtime is often beyond the control of the organization. What is needed is an effective means of identifying and recovering from Web server downtime in order to minimize the negative impact on the customer. An automated architecture, called WebSpy, has been developed to notify administrators and to take immediate action when Web server downtime is detected. This paper describes the WebSpy architecture and differentiates it from other popular Web monitoring tools. The results of a case study are presented as a means of demonstrating WebSpy's effectiveness in monitoring Web server availability.
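
    The sketch below illustrates the core idea of such an availability monitor (it is not the WebSpy implementation): periodically probe a set of web servers and trigger a notification action as soon as one stops responding. The URLs, the polling interval, and the notification hook are placeholders; a real deployment would e-mail or page an administrator.

```python
"""Periodic HTTP availability checks with a pluggable notification hook."""

import time
import urllib.request

SITES = ["https://example.org", "https://example.com/shop"]
CHECK_INTERVAL_SECONDS = 60

def is_up(url: str, timeout: float = 5.0) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return 200 <= response.status < 400
    except Exception:
        # Treat any network or HTTP failure as "down" for monitoring purposes.
        return False

def notify(url: str) -> None:
    # Placeholder for e-mail/SMS/pager integration.
    print(f"ALERT: {url} appears to be down")

def monitor_once() -> None:
    for url in SITES:
        if not is_up(url):
            notify(url)

if __name__ == "__main__":
    while True:
        monitor_once()
        time.sleep(CHECK_INTERVAL_SECONDS)
```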

  4. Maximum Spanning Tree Model on Personalized Web Based Collaborative Learning in Web 3.0

    OpenAIRE

    Padma, S.; Seshasaayee, Ananthi

    2012-01-01

    Web 3.0 is an evolving extension of the current web environment. Information in web 3.0 can be collaborated and communicated when queried. Web 3.0 architecture provides an excellent learning experience to the students. Web 3.0 is 3D, media centric and semantic. Web based learning has been on high in recent days. Web 3.0 has intelligent agents as tutors to collect and disseminate the answers to the queries by the students. Completely Interactive learner's query determine the customization of...

  5. Web-Based Cognitive Behavioral Therapy for Female Patients With Eating Disorders: Randomized Controlled Trial.

    Science.gov (United States)

    ter Huurne, Elke D; de Haan, Hein A; Postel, Marloes G; van der Palen, Job; VanDerNagel, Joanne E L; DeJong, Cornelis A J

    2015-06-18

    Many patients with eating disorders do not receive help for their symptoms, even though these disorders have severe morbidity. The Internet may offer alternative low-threshold treatment interventions. This study evaluated the effects of a Web-based cognitive behavioral therapy (CBT) intervention using intensive asynchronous therapeutic support to improve eating disorder psychopathology, and to reduce body dissatisfaction and related health problems among patients with eating disorders. A two-arm open randomized controlled trial comparing a Web-based CBT intervention to a waiting list control condition (WL) was carried out among female patients with bulimia nervosa (BN), binge eating disorder (BED), and eating disorders not otherwise specified (EDNOS). The eating disorder diagnosis was in accordance with the Diagnostic and Statistical Manual of Mental Disorders, 4th edition, and was established based on participants' self-report. Participants were recruited from an open-access website, and the intervention consisted of a structured two-part program within a secure Web-based application. The aim of the first part was to analyze participant's eating attitudes and behaviors, while the second part focused on behavioral change. Participants had asynchronous contact with a personal therapist twice a week, solely via the Internet. Self-report measures of eating disorder psychopathology (primary outcome), body dissatisfaction, physical health, mental health, self-esteem, quality of life, and social functioning were completed at baseline and posttest. A total of 214 participants were randomized to either the Web-based CBT group (n=108) or to the WL group (n=106) stratified by type of eating disorder (BN: n=44; BED: n=85; EDNOS: n=85). Study attrition was low with 94% of the participants completing the posttest assignment. Overall, Web-based CBT showed a significant improvement over time for eating disorder psychopathology (F97=63.07, P ...). Web-based CBT participants in all three

  6. Programmation Web Typée

    OpenAIRE

    Canou , Benjamin

    2011-01-01

    The goal of this thesis is to contribute to make Web programming safer and more flexible than it is in the solutions prevalent today. To achieve this goal, we propose a solution based on the ML language family, which brings freedom to the programmer by its multi-paradigm aspect, while providing an important level of safety thanks to static typing. In the first part, we show that it is possible to program the browser without sticking to the style of JavaScript. Our solution is OBrowser, an OCa...

  7. University Presentation to Potential Students Using Web 2.0 Environments

    Directory of Open Access Journals (Sweden)

    Andrius Eidimtas

    2013-02-01

    Full Text Available For school graduates, choosing what to study is a compound and multi-stage process (Chapman, 1981; Hossler et al., 1999; Brennan, 2001; Shankle, 2009). In the information retrieval stage, future students have to gather and assimilate up-to-date information and form a list of possible higher education institutions. Modern internet technologies enable universities to create conditions for attractive and interactive information retrieval, and the user-friendliness and accessibility of Web 2.0-based environments attract more young people to search for information on the web. Western universities noticed the great potential of Web 2.0 for information dissemination back in 2007, whereas Lithuanian universities began using Web 2.0 to assemble virtual communities only in 2010 (Valinevičienė, 2010). Purpose—to disclose possibilities for presenting universities to school graduates in Web 2.0 environments. Design/methodology/approach—a case study strategy using methods of scientific literature analysis, observation and quantitative content analysis. Findings—referring to the information retrieval types and the particularities of information retrieval by school graduates identified in the analysis of the scientific literature, 76 per cent of Lithuanian universities were found to apply at least one website created on the basis of Web 2.0 technology for their official presentation. The variety of Web 2.0 tools used ranges only from 1 to 6 different tools, while the scientific literature points to more possibilities for applying Web 2.0 environments. Research limitations/implications—the empirical part of the case study is contextualized for Lithuania; however, the theoretical construct of possibilities for presenting universities in Web 2.0 environments can be used to analyse the presentation of foreign universities in Web 2.0 environments. Practical implications—the work can serve as a recommendation for developing possibilities for Lithuanian

  9. Web Mining and Social Networking

    CERN Document Server

    Xu, Guandong; Li, Lin

    2011-01-01

    This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web mining, and the issue of how to incorporate web mining into web personalization and recommendation systems are also reviewed. Additionally, the volume explores web community mining and analysis to find the structural, organizational and temporal developments of web communities and reveal the societal s

  10. Semantic Web Primer

    NARCIS (Netherlands)

    Antoniou, Grigoris; Harmelen, Frank van

    2004-01-01

    The development of the Semantic Web, with machine-readable content, has the potential to revolutionize the World Wide Web and its use. A Semantic Web Primer provides an introduction and guide to this still emerging field, describing its key ideas, languages, and technologies. Suitable for use as a

  11. Classroom Assessment in Web-Based Instructional Environment: Instructors' Experience

    Directory of Open Access Journals (Sweden)

    Xin Liang

    2004-03-01

    Full Text Available While a great deal has been written on the advantages and benefits of online teaching, little is known about how assessment is implemented in online classrooms to monitor and inform performance and progress. The purpose of this study is to investigate the dynamics of WebCT classroom assessment by analyzing the perceptions and experience of the instructors. A grounded theory method was employed to generate a 'process theory'. The study included 10 faculty members who taught WebCT classes, and 216 students in the College of Education at an urban university in the Midwest. Interviews and classroom observations were undertaken online. The findings indicated that performance-based assessment, writing skills, interactive assessment and learner autonomy were the major assessment aspects used to inform teaching and enhance learning. If one of the major roles of online instruction is to increase self-directed learning, then, as part of the pedagogical mechanism, web-based classroom assessment should be designed and practiced to support learner autonomy.

  12. Interactive effects of fire and large herbivores on web-building spiders.

    Science.gov (United States)

    Foster, C N; Barton, P S; Wood, J T; Lindenmayer, D B

    2015-09-01

    Altered disturbance regimes are a major driver of biodiversity loss worldwide. Maintaining or re-creating natural disturbance regimes is therefore the focus of many conservation programmes. A key challenge, however, is to understand how co-occurring disturbances interact to affect biodiversity. We experimentally tested for the interactive effects of prescribed fire and large macropod herbivores on the web-building spider assemblage of a eucalypt forest understorey and investigated the role of vegetation in mediating these effects using path analysis. Fire had strong negative effects on the density of web-building spiders, which were partly mediated by effects on vegetation structure, while negative effects of large herbivores on web density were not related to changes in vegetation. Fire amplified the effects of large herbivores on spiders, both via vegetation-mediated pathways and by increasing herbivore activity. The importance of vegetation-mediated pathways and fire-herbivore interactions differed for web density and richness and also differed between web types. Our results demonstrate that for some groups of web-building spiders, the effects of co-occurring disturbance drivers may be mostly additive, whereas for other groups, interactions between drivers can amplify disturbance effects. In our study system, the use of prescribed fire in the presence of high densities of herbivores could lead to reduced densities and altered composition of web-building spiders, with potential cascading effects through the arthropod food web. Our study highlights the importance of considering both the independent and interactive effects of disturbances, as well as the mechanisms driving their effects, in the management of disturbance regimes.

  13. Informatics in radiology: radiology gamuts ontology: differential diagnosis for the Semantic Web.

    Science.gov (United States)

    Budovec, Joseph J; Lam, Cesar A; Kahn, Charles E

    2014-01-01

    The Semantic Web is an effort to add semantics, or "meaning," to empower automated searching and processing of Web-based information. The overarching goal of the Semantic Web is to enable users to more easily find, share, and combine information. Critical to this vision are knowledge models called ontologies, which define a set of concepts and formalize the relations between them. Ontologies have been developed to manage and exploit the large and rapidly growing volume of information in biomedical domains. In diagnostic radiology, lists of differential diagnoses of imaging observations, called gamuts, provide an important source of knowledge. The Radiology Gamuts Ontology (RGO) is a formal knowledge model of differential diagnoses in radiology that includes 1674 differential diagnoses, 19,017 terms, and 52,976 links between terms. Its knowledge is used to provide an interactive, freely available online reference of radiology gamuts ( www.gamuts.net ). A Web service allows its content to be discovered and consumed by other information systems. The RGO integrates radiologic knowledge with other biomedical ontologies as part of the Semantic Web. © RSNA, 2014.

  14. CSAR-web: a web server of contig scaffolding using algebraic rearrangements.

    Science.gov (United States)

    Chen, Kun-Tze; Lu, Chin Lung

    2018-05-04

    CSAR-web is a web-based tool that allows the users to efficiently and accurately scaffold (i.e. order and orient) the contigs of a target draft genome based on a complete or incomplete reference genome from a related organism. It takes as input a target genome in multi-FASTA format and a reference genome in FASTA or multi-FASTA format, depending on whether the reference genome is complete or incomplete, respectively. In addition, it requires the users to choose either 'NUCmer on nucleotides' or 'PROmer on translated amino acids' for CSAR-web to identify conserved genomic markers (i.e. matched sequence regions) between the target and reference genomes, which are used by the rearrangement-based scaffolding algorithm in CSAR-web to order and orient the contigs of the target genome based on the reference genome. In the output page, CSAR-web displays its scaffolding result in a graphical mode (i.e. scalable dotplot) allowing the users to visually validate the correctness of scaffolded contigs and in a tabular mode allowing the users to view the details of scaffolds. CSAR-web is available online at http://genome.cs.nthu.edu.tw/CSAR-web.

  15. La salud de las web universitarias españolas

    Directory of Open Access Journals (Sweden)

    Thelwall, Mike

    2003-09-01

    ...specialized, with the resulting data analyzed afterwards. University web sites vary greatly in size, and they receive external links from other sites roughly in proportion to their size. Although the Spanish academic Web ranked behind the four countries with which it was compared, most of the links it receives come from national domains and from international domains of non-Spanish-speaking countries, which indicates a broad international impact and a high degree of multilingualism on the part of the web authors. The most heavily linked pages are mostly those that attract automatically generated links, with the surprising inclusion of some government bodies.

  16. Going, going, still there: using the WebCite service to permanently archive cited web pages.

    Science.gov (United States)

    Eysenbach, Gunther; Trudel, Mathieu

    2005-12-30

    Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information including the WebCite link on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. Finally, WebCite can process publisher submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, caching retrospectively references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research

  17. WebGL and web audio software lightweight components for multimedia education

    Science.gov (United States)

    Chang, Xin; Yuksel, Kivanc; Skarbek, Władysław

    2017-08-01

    The paper presents the results of our recent work on the development of the contemporary computing platform DC2 for multimedia education using WebGL and Web Audio, the W3C standards. Using the literate programming paradigm, the WEBSA educational tools were developed. They offer the user (student) access to an expandable collection of WebGL shaders and Web Audio scripts. The unique feature of DC2 is the option of literate programming, offered to both the author and the reader in order to improve the interactivity of lightweight WebGL and Web Audio components. For instance, users can define source audio nodes (including synthetic sources), destination audio nodes, and nodes for audio processing such as sound wave shaping, spectral band filtering, and convolution-based modification. In the case of WebGL, besides classic graphics effects based on mesh and fractal definitions, novel shader-based image processing and analysis is offered, such as nonlinear filtering, histograms of gradients, and Bayesian classifiers.
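
    As a small illustration of the kind of lightweight Web Audio component described above, the sketch below wires a synthetic source node through a band-pass filter and a gain envelope to the audio destination. It uses only the standard Web Audio API and is not taken from the DC2/WEBSA tools.

```typescript
// Illustrative Web Audio node graph: synthetic source -> band-pass filter -> output.
// Standard Web Audio API only; unrelated to the DC2/WEBSA platform described above.
function playFilteredTone(frequencyHz: number, durationSec: number): void {
  const ctx = new AudioContext();

  const osc = ctx.createOscillator();      // synthetic source node
  osc.type = "sawtooth";
  osc.frequency.value = frequencyHz;

  const filter = ctx.createBiquadFilter(); // spectral band filtering node
  filter.type = "bandpass";
  filter.frequency.value = frequencyHz;
  filter.Q.value = 5;

  const gain = ctx.createGain();           // simple amplitude envelope
  gain.gain.setValueAtTime(0.5, ctx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + durationSec);

  osc.connect(filter).connect(gain).connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + durationSec);
}

// playFilteredTone(440, 2); // call from a user gesture handler in the browser
```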

  18. Applying Web usage mining for personalizing hyperlinks in Web-based adaptive educational systems

    NARCIS (Netherlands)

    Romero, C.; Ventura, S.; Zafra, A.; Bra, de P.M.E.

    2009-01-01

    Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender

  19. The Semantic Web Revisited

    OpenAIRE

    Shadbolt, Nigel; Berners-Lee, Tim; Hall, Wendy

    2006-01-01

    The original Scientific American article on the Semantic Web appeared in 2001. It described the evolution of a Web that consisted largely of documents for humans to read to one that included data and information for computers to manipulate. The Semantic Web is a Web of actionable information--information derived from data through a semantic theory for interpreting the symbols.This simple idea, however, remains largely unrealized. Shopbots and auction bots abound on the Web, but these are esse...

  20. Assembling Webs of Support: Child Domestic Workers in India

    Science.gov (United States)

    Wasiuzzaman, Shaziah; Wells, Karen

    2010-01-01

    This paper uses ethnographic and qualitative interview data with Muslim child domestic workers, their families and employers to investigate the social ties between young workers and their employers. Our analysis shows that working-class families use children's domestic work with middle-class families as part of a web of resources to protect them…

  1. Advanced web services

    CERN Document Server

    Bouguettaya, Athman; Daniel, Florian

    2013-01-01

    Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet. This book is the second installment of a two-book collection covering the state-o

  2. Web-sovelluskehityksen tekniikat

    OpenAIRE

    Kettunen, Werner

    2015-01-01

    There are many different techniques, tools and program libraries used for web application development, and their approaches to web development differ somewhat from one another. The thesis examines, both in theory and through a practical example project, the techniques and libraries most commonly used in web application development. The example web application built in the thesis used the Laravel framework together with the tools and libraries covered in the first part, such as Bootstrap and ...

  3. Web social y alfabetización informacional: experiencia en la Universidad de Puerto Rico

    Directory of Open Access Journals (Sweden)

    Liz M. Pagán

    2017-01-01

    Full Text Available This study explores and describes the adoption of the social web for information literacy by the libraries of the University of Puerto Rico (UPR). The objectives of the study are: to identify the social web technologies used by librarians for information literacy; to identify and examine the extent of use of these technologies; to evaluate librarians' attitudes towards the application of the social web; and to examine the connection they establish with the standards of the Association of College and Research Libraries (ACRL). The methodological approach of the study is qualitative, using a questionnaire and interviews as data collection techniques. The results show that 82% of the participants in this study use the social web for user information literacy. The most widely used technologies are blogs, social networks, media sharing and mashups. Media sharing (Flickr, YouTube, Instagram, Pinterest), blogs and social networks were used most frequently by the librarians. Participants showed an accepting attitude towards the application of the social web for information literacy. Most of them described the connection they establish between the ACRL standards and the use of the social web in their teaching. Based on the results of this study, recommendations are presented regarding the current application of social web tools for information literacy.

  4. Learning from WebQuests

    Science.gov (United States)

    Gaskill, Martonia; McNulty, Anastasia; Brooks, David W.

    2006-01-01

    WebQuests are activities in which students use Web resources to learn about school topics. WebQuests are advocated as constructivist activities and ones generally well regarded by students. Two experiments were conducted in school settings to compare learning using WebQuests versus conventional instruction. Students and teachers both enjoyed…

  5. Evaluating Web Usability

    Science.gov (United States)

    Snider, Jean; Martin, Florence

    2012-01-01

    Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…

  6. FedWeb Greatest Hits: Presenting the New Test Collection for Federated Web Search

    NARCIS (Netherlands)

    Demeester, Thomas; Trieschnigg, Rudolf Berend; Zhou, Ke; Nguyen, Dong-Phuong; Hiemstra, Djoerd

    This paper presents 'FedWeb Greatest Hits', a large new test collection for research in web information retrieval. As a combination and extension of the datasets used in the TREC Federated Web Search Track, this collection opens up new research possibilities on federated web search challenges, as

  7. A simple method for serving Web hypermaps with dynamic database drill-down

    Directory of Open Access Journals (Sweden)

    Carson Ewart R

    2002-08-01

    Full Text Available Abstract Background HealthCyberMap http://healthcybermap.semanticweb.org aims at mapping parts of health information cyberspace in novel ways to deliver a semantically superior user experience. This is achieved through "intelligent" categorisation and interactive hypermedia visualisation of health resources using metadata, clinical codes and GIS. HealthCyberMap is an ArcView 3.1 project. WebView, the Internet extension to ArcView, publishes HealthCyberMap ArcView Views as Web client-side imagemaps. The basic WebView set-up does not support any GIS database connection, and published Web maps become disconnected from the original project. A dedicated Internet map server would be the best way to serve HealthCyberMap database-driven interactive Web maps, but is an expensive and complex solution to acquire, run and maintain. This paper describes HealthCyberMap simple, low-cost method for "patching" WebView to serve hypermaps with dynamic database drill-down functionality on the Web. Results The proposed solution is currently used for publishing HealthCyberMap GIS-generated navigational information maps on the Web while maintaining their links with the underlying resource metadata base. Conclusion The authors believe their map serving approach as adopted in HealthCyberMap has been very successful, especially in cases when only map attribute data change without a corresponding effect on map appearance. It should be also possible to use the same solution to publish other interactive GIS-driven maps on the Web, e.g., maps of real world health problems.

  8. Perceptions of Web Site Design Characteristics: A Malaysian/Australian Comparison.

    Science.gov (United States)

    Fink, Dieter; Laupase, Ricky

    2000-01-01

    Compares the perceptions of Malaysians and Australians for four Web site design characteristics--atmospherics, news stories, signs, and products and services--as part of the integrated Internet marketing model. Hypothesizes that the predominant culture is not generalized to another culture, discusses validity and reliability, and suggest further…

  9. Virological Sampling of Inaccessible Wildlife with Drones.

    Science.gov (United States)

    Geoghegan, Jemma L; Pirotta, Vanessa; Harvey, Erin; Smith, Alastair; Buchmann, Jan P; Ostrowski, Martin; Eden, John-Sebastian; Harcourt, Robert; Holmes, Edward C

    2018-06-02

    There is growing interest in characterizing the viromes of diverse mammalian species, particularly in the context of disease emergence. However, little is known about virome diversity in aquatic mammals, in part due to difficulties in sampling. We characterized the virome of the exhaled breath (or blow) of the Eastern Australian humpback whale ( Megaptera novaeangliae ). To achieve an unbiased survey of virome diversity, a meta-transcriptomic analysis was performed on 19 pooled whale blow samples collected via a purpose-built Unmanned Aerial Vehicle (UAV, or drone) approximately 3 km off the coast of Sydney, Australia during the 2017 winter annual northward migration from Antarctica to northern Australia. To our knowledge, this is the first time that UAVs have been used to sample viruses. Despite the relatively small number of animals surveyed in this initial study, we identified six novel virus species from five viral families. This work demonstrates the potential of UAVs in studies of virus disease, diversity, and evolution.

  10. The use of web ontology languages and other semantic web tools in drug discovery.

    Science.gov (United States)

    Chen, Huajun; Xie, Guotong

    2010-05-01

    To optimize drug development processes, pharmaceutical companies require principled approaches to integrate disparate data on a unified infrastructure, such as the web. The semantic web, developed on the web technology, provides a common, open framework capable of harmonizing diversified resources to enable networked and collaborative drug discovery. We survey the state of art of utilizing web ontologies and other semantic web technologies to interlink both data and people to support integrated drug discovery across domains and multiple disciplines. Particularly, the survey covers three major application categories including: i) semantic integration and open data linking; ii) semantic web service and scientific collaboration and iii) semantic data mining and integrative network analysis. The reader will gain: i) basic knowledge of the semantic web technologies; ii) an overview of the web ontology landscape for drug discovery and iii) a basic understanding of the values and benefits of utilizing the web ontologies in drug discovery. i) The semantic web enables a network effect for linking open data for integrated drug discovery; ii) The semantic web service technology can support instant ad hoc collaboration to improve pipeline productivity and iii) The semantic web encourages publishing data in a semantic way such as resource description framework attributes and thus helps move away from a reliance on pure textual content analysis toward more efficient semantic data mining.
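
    The 'network effect' of linking open data can be conveyed with a toy example: the sketch below represents RDF-style triples from two hypothetical sources and joins them on a shared compound identifier to answer a cross-source question. Real systems would use an RDF store and SPARQL rather than in-memory arrays; all names and data here are invented for illustration.

```typescript
// Toy illustration of linking open data: join RDF-style triples from two sources
// on a shared compound identifier. Identifiers and data are hypothetical.
type Triple = { subject: string; predicate: string; object: string };

const chemSource: Triple[] = [
  { subject: "compound:123", predicate: "hasTargetProtein", object: "protein:EGFR" },
];
const trialSource: Triple[] = [
  { subject: "compound:123", predicate: "evaluatedInTrial", object: "trial:NCT000" },
];

// Merge the two graphs and ask: which trials evaluate compounds that target EGFR?
const graph = [...chemSource, ...trialSource];
const compoundsForTarget = graph
  .filter(t => t.predicate === "hasTargetProtein" && t.object === "protein:EGFR")
  .map(t => t.subject);
const trials = graph
  .filter(t => t.predicate === "evaluatedInTrial" && compoundsForTarget.includes(t.subject))
  .map(t => t.object);

console.log(trials); // ["trial:NCT000"]
```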

  11. Web 3.0 Emerging

    Energy Technology Data Exchange (ETDEWEB)

    Hendler, James [Rensselaer Polytechnic Institute

    2012-02-22

    As more and more data and information becomes available on the Web, new technologies that use explicit semantics for information organization are becoming desirable. New terms such as Linked Data, Semantic Web and Web 3.0 are used more and more, although there is increasing confusion as to what each means. In this talk, I will describe how different sorts of models can be used to link data in different ways. I will particularly explore different kinds of Web applications, from Enterprise Data Integration to Web 3.0 startups, government data release, the different needs of Web 2.0 and 3.0, the growing interest in “semantic search”, and the underlying technologies that power these new approaches.

  12. Collecting behavioural data using the world wide web: considerations for researchers.

    Science.gov (United States)

    Rhodes, S D; Bowie, D A; Hergenrather, K C

    2003-01-01

    To identify and describe advantages, challenges, and ethical considerations of web based behavioural data collection. This discussion is based on the authors' experiences in survey development and study design, respondent recruitment, and internet research, and on the experiences of others as found in the literature. The advantages of using the world wide web to collect behavioural data include rapid access to numerous potential respondents and previously hidden populations, respondent openness and full participation, opportunities for student research, and reduced research costs. Challenges identified include issues related to sampling and sample representativeness, competition for the attention of respondents, and potential limitations resulting from the much cited "digital divide", literacy, and disability. Ethical considerations include anonymity and privacy, providing and substantiating informed consent, and potential risks of malfeasance. Computer mediated communications, including electronic mail, the world wide web, and interactive programs will play an ever increasing part in the future of behavioural science research. Justifiable concerns regarding the use of the world wide web in research exist, but as access to, and use of, the internet becomes more widely and representatively distributed globally, the world wide web will become more applicable. In fact, the world wide web may be the only research tool able to reach some previously hidden population subgroups. Furthermore, many of the criticisms of online data collection are common to other survey research methodologies.

  13. Web Server Embedded System

    Directory of Open Access Journals (Sweden)

    Adharul Muttaqin

    2014-07-01

    Full Text Available Embedded systems are currently of particular concern in computer technology; several Linux operating systems and a variety of web servers have been prepared to support embedded systems, and one application that can run on an embedded system is a web server. The choice of web server for an embedded environment is still rarely examined, so this study focuses on two web server applications whose main feature is "lightness" in CPU and memory consumption: Light HTTPD and Tiny HTTPD. Using the thread parameters (users, ramp-up period, and loop count) of a stress test on the embedded system, the study determines which of Light HTTPD and Tiny HTTPD is the better fit for an embedded system running on a BeagleBoard, judged by CPU and memory consumption. The results show that, in terms of CPU consumption on the BeagleBoard embedded system, Light HTTPD is recommended over Tiny HTTPD because there is a very significant difference in CPU load between the two web services. Keywords: embedded system, web server
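
    The stress-test parameters named in the abstract (simulated users, ramp-up period and loop count) can be mimicked with a small script; the sketch below is a hypothetical load generator written for illustration and is not the tool used in the study.

```typescript
// Hypothetical load-test sketch using the parameters named above:
// number of simulated users, ramp-up period and loop count per user.
async function loadTest(url: string, users: number, rampUpMs: number, loops: number) {
  const latencies: number[] = [];

  const oneUser = async (userIndex: number) => {
    // Stagger user start times evenly across the ramp-up period.
    await new Promise(r => setTimeout(r, (rampUpMs / users) * userIndex));
    for (let i = 0; i < loops; i++) {
      const start = Date.now();
      try {
        await fetch(url);
      } catch {
        continue; // count failures as missing samples in this toy sketch
      }
      latencies.push(Date.now() - start);
    }
  };

  await Promise.all(Array.from({ length: users }, (_, i) => oneUser(i)));
  const mean = latencies.reduce((a, b) => a + b, 0) / Math.max(latencies.length, 1);
  console.log(`${latencies.length} samples, mean latency ${mean.toFixed(1)} ms`);
}

// loadTest("http://beagleboard.local/", 10, 5000, 20);
```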

  14. Available, intuitive and free! Building e-learning modules using web 2.0 services.

    Science.gov (United States)

    Tam, Chun Wah Michael; Eastwood, Anne

    2012-01-01

    E-learning is part of the mainstream in medical education and often provides the most efficient and effective means of engaging learners in a particular topic. However, translating design and content ideas into a useable product can be technically challenging, especially in the absence of information technology (IT) support. There is little published literature on the use of web 2.0 services to build e-learning activities. To describe the web 2.0 tools and solutions employed to build the GP Synergy evidence-based medicine and critical appraisal online course. We used and integrated a number of free web 2.0 services including: Prezi, a web-based presentation platform; YouTube, a video sharing service; Google Docs, a online document platform; Tiny.cc, a URL shortening service; and Wordpress, a blogging platform. The course consisting of five multimedia-rich, tutorial-like modules was built without IT specialist assistance or specialised software. The web 2.0 services used were free. The course can be accessed with a modern web browser. Modern web 2.0 services remove many of the technical barriers for creating and sharing content on the internet. When used synergistically, these services can be a flexible and low-cost platform for building e-learning activities. They were a pragmatic solution in our context.

  15. Applying Web Usage Mining for Personalizing Hyperlinks in Web-Based Adaptive Educational Systems

    Science.gov (United States)

    Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul

    2009-01-01

    Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…
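
    A much-reduced illustration of how usage logs can drive hyperlink personalization: the sketch below counts page co-occurrence within past sessions and recommends the pages most often visited together with the current one. This is a generic association heuristic, not the AHA!-integrated recommender described in the paper.

```typescript
// Toy usage-mining recommender: suggest pages that most often co-occur
// with the current page in past sessions. Illustrative only.
function recommend(sessions: string[][], currentPage: string, topN = 3): string[] {
  const counts = new Map<string, number>();
  for (const session of sessions) {
    if (!session.includes(currentPage)) continue;
    for (const page of session) {
      if (page === currentPage) continue;
      counts.set(page, (counts.get(page) ?? 0) + 1);
    }
  }
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, topN)
    .map(([page]) => page);
}

// Example with hypothetical course pages:
const logs = [
  ["intro", "loops", "arrays"],
  ["intro", "loops", "functions"],
  ["loops", "arrays"],
];
console.log(recommend(logs, "loops")); // e.g. ["intro", "arrays", "functions"]
```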

  16. The MAJORANA Parts Tracking Database

    Science.gov (United States)

    Abgrall, N.; Aguayo, E.; Avignone, F. T.; Barabash, A. S.; Bertrand, F. E.; Brudanin, V.; Busch, M.; Byram, D.; Caldwell, A. S.; Chan, Y.-D.; Christofferson, C. D.; Combs, D. C.; Cuesta, C.; Detwiler, J. A.; Doe, P. J.; Efremenko, Yu.; Egorov, V.; Ejiri, H.; Elliott, S. R.; Esterline, J.; Fast, J. E.; Finnerty, P.; Fraenkle, F. M.; Galindo-Uribarri, A.; Giovanetti, G. K.; Goett, J.; Green, M. P.; Gruszko, J.; Guiseppe, V. E.; Gusev, K.; Hallin, A. L.; Hazama, R.; Hegai, A.; Henning, R.; Hoppe, E. W.; Howard, S.; Howe, M. A.; Keeter, K. J.; Kidd, M. F.; Kochetov, O.; Konovalov, S. I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J. Diaz; Leviner, L. E.; Loach, J. C.; MacMullin, J.; Martin, R. D.; Meijer, S. J.; Mertens, S.; Miller, M. L.; Mizouni, L.; Nomachi, M.; Orrell, J. L.; O`Shaughnessy, C.; Overman, N. R.; Petersburg, R.; Phillips, D. G.; Poon, A. W. P.; Pushkin, K.; Radford, D. C.; Rager, J.; Rielage, K.; Robertson, R. G. H.; Romero-Romero, E.; Ronquest, M. C.; Shanks, B.; Shima, T.; Shirchenko, M.; Snavely, K. J.; Snyder, N.; Soin, A.; Suriano, A. M.; Tedeschi, D.; Thompson, J.; Timkin, V.; Tornow, W.; Trimble, J. E.; Varner, R. L.; Vasilyev, S.; Vetter, K.; Vorren, K.; White, B. R.; Wilkerson, J. F.; Wiseman, C.; Xu, W.; Yakushev, E.; Young, A. R.; Yu, C.-H.; Yumatov, V.; Zhitnikov, I.

    2015-04-01

    The MAJORANA DEMONSTRATOR is an ultra-low background physics experiment searching for the neutrinoless double beta decay of 76Ge. The MAJORANA Parts Tracking Database is used to record the history of components used in the construction of the DEMONSTRATOR. The tracking implementation takes a novel approach based on the schema-free database technology CouchDB. Transportation, storage, and processes undergone by parts such as machining or cleaning are linked to part records. Tracking parts provide a great logistics benefit and an important quality assurance reference during construction. In addition, the location history of parts provides an estimate of their exposure to cosmic radiation. A web application for data entry and a radiation exposure calculator have been developed as tools for achieving the extreme radio-purity required for this rare decay search.

  17. The MAJORANA Parts Tracking Database

    Energy Technology Data Exchange (ETDEWEB)

    Abgrall, N. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Aguayo, E. [Pacific Northwest National Laboratory, Richland, WA (United States); Avignone, F.T. [Department of Physics and Astronomy, University of South Carolina, Columbia, SC (United States); Oak Ridge National Laboratory, Oak Ridge, TN (United States); Barabash, A.S. [Institute for Theoretical and Experimental Physics, Moscow (Russian Federation); Bertrand, F.E. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Brudanin, V. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Busch, M. [Department of Physics, Duke University, Durham, NC (United States); Triangle Universities Nuclear Laboratory, Durham, NC (United States); Byram, D. [Department of Physics, University of South Dakota, Vermillion, SD (United States); Caldwell, A.S. [South Dakota School of Mines and Technology, Rapid City, SD (United States); Chan, Y-D. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Christofferson, C.D. [South Dakota School of Mines and Technology, Rapid City, SD (United States); Combs, D.C. [Department of Physics, North Carolina State University, Raleigh, NC (United States); Triangle Universities Nuclear Laboratory, Durham, NC (United States); Cuesta, C.; Detwiler, J.A.; Doe, P.J. [Center for Experimental Nuclear Physics and Astrophysics, and Department of Physics, University of Washington, Seattle, WA (United States); Efremenko, Yu. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN (United States); Egorov, V. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Ejiri, H. [Research Center for Nuclear Physics and Department of Physics, Osaka University, Ibaraki, Osaka (Japan); Elliott, S.R. [Los Alamos National Laboratory, Los Alamos, NM (United States); and others

    2015-04-11

    The MAJORANA DEMONSTRATOR is an ultra-low background physics experiment searching for the neutrinoless double beta decay of {sup 76}Ge. The MAJORANA Parts Tracking Database is used to record the history of components used in the construction of the DEMONSTRATOR. The tracking implementation takes a novel approach based on the schema-free database technology CouchDB. Transportation, storage, and processes undergone by parts such as machining or cleaning are linked to part records. Tracking parts provide a great logistics benefit and an important quality assurance reference during construction. In addition, the location history of parts provides an estimate of their exposure to cosmic radiation. A web application for data entry and a radiation exposure calculator have been developed as tools for achieving the extreme radio-purity required for this rare decay search.
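
    CouchDB stores schema-free JSON documents over plain HTTP, which is what makes it convenient for this kind of parts tracking. The sketch below shows, as an illustrative assumption rather than the actual MAJORANA implementation, how a part record with a processing history might be written to a CouchDB database.

```typescript
// Illustrative part record stored in CouchDB via its plain HTTP/JSON interface.
// Database name, document fields and values are hypothetical, not the actual
// MAJORANA Parts Tracking Database schema.
interface PartRecord {
  _id: string;
  material: string;
  location: string;
  history: { date: string; process: string }[]; // machining, cleaning, shipping...
  surfaceExposureDays: number; // used to estimate cosmic-ray exposure
}

async function savePart(couchUrl: string, db: string, part: PartRecord): Promise<void> {
  const res = await fetch(`${couchUrl}/${db}/${encodeURIComponent(part._id)}`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(part),
  });
  if (!res.ok) throw new Error(`CouchDB write failed: ${res.status}`);
}

// savePart("http://localhost:5984", "parts", {
//   _id: "Cu-plate-0042",
//   material: "electroformed copper",
//   location: "underground lab",
//   history: [{ date: "2014-05-01", process: "machining" }],
//   surfaceExposureDays: 12,
// });
```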

  18. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm.

    Science.gov (United States)

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web service has become the technology of choice for service oriented computing to meet the interoperability demands in web applications. In the Internet era, the exponential addition of web services nominates the "quality of service" as essential parameter in discriminating the web services. In this paper, a user preference based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and QoS aspect of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify the local and global constraints for composite web services which improves flexibility. UPWSR algorithm identifies best fit services for each task in the user request and, by choosing the number of candidate services for each task, reduces the time to generate the composition plans. To tackle the problem of web service composition, QoS aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework allows user to provide feedback about the composite service which improves the reputation of the services.
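
    The core of a QoS-aware ranking step can be expressed as a weighted score over normalized QoS attributes. The sketch below is a generic illustration of that idea under assumed attribute names and weights; it is not the UPWSR or QAWSC algorithm from the paper.

```typescript
// Generic QoS-weighted ranking sketch (not the UPWSR/QAWSC algorithms).
// Attribute names and user-preference weights are hypothetical.
interface ServiceQoS {
  name: string;
  responseTimeMs: number; // lower is better
  availability: number;   // 0..1, higher is better
  reputation: number;     // 0..1, higher is better
}

function rankServices(services: ServiceQoS[], weights = { rt: 0.4, av: 0.3, rep: 0.3 }) {
  const maxRt = Math.max(...services.map(s => s.responseTimeMs));
  return services
    .map(s => ({
      name: s.name,
      // Normalize response time so that lower latency scores higher.
      score: weights.rt * (1 - s.responseTimeMs / maxRt)
           + weights.av * s.availability
           + weights.rep * s.reputation,
    }))
    .sort((a, b) => b.score - a.score);
}

console.log(rankServices([
  { name: "svcA", responseTimeMs: 120, availability: 0.99, reputation: 0.8 },
  { name: "svcB", responseTimeMs: 300, availability: 0.95, reputation: 0.9 },
]));
```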

  19. RS-WebPredictor

    DEFF Research Database (Denmark)

    Zaretzki, J.; Bergeron, C.; Huang, T.-W.

    2013-01-01

    Regioselectivity-WebPredictor (RS-WebPredictor) is a server that predicts isozyme-specific cytochrome P450 (CYP)-mediated sites of metabolism (SOMs) on drug-like molecules. Predictions may be made for the promiscuous 2C9, 2D6 and 3A4 CYP isozymes, as well as CYPs 1A2, 2A6, 2B6, 2C8, 2C19 and 2E1....... RS-WebPredictor is the first freely accessible server that predicts the regioselectivity of the last six isozymes. Server execution time is fast, taking on average 2s to encode a submitted molecule and 1s to apply a given model, allowing for high-throughput use in lead optimization projects....... Availability: RS-WebPredictor is accessible for free use at http://reccr.chem.rpi.edu/Software/RS-WebPredictor....

  20. IL web tutorials

    DEFF Research Database (Denmark)

    Hyldegård, Jette; Lund, Haakon

    2012-01-01

    The paper presents the results from a study on information literacy in a higher education (HE) context, based on a larger research project evaluating 3 Norwegian IL web tutorials at 6 universities and colleges in Norway. The aim was to evaluate how the 3 web tutorials served students’ information...... seeking and writing process in a study context and to identify barriers to the employment and use of the IL web tutorials, hence to the underlying information literacy intentions by the developer. Both qualitative and quantitative methods were employed. A clear mismatch was found between intention...... and use of the web tutorials. In addition, usability only played a minor role compared to relevance. It is concluded that the positive expectations of the IL web tutorials tend to be overrated by the developers. Suggestions for further research are presented....

  1. A development process meta-model for Web based expert systems: The Web engineering point of view

    DEFF Research Database (Denmark)

    Dokas, I.M.; Alapetite, Alexandre

    2006-01-01

    raised their complexity. Unfortunately, there is so far no clear answer to the question: How may the methods and experience of Web engineering and expert systems be combined and applied in order todevelop effective and successful Web based expert systems? In an attempt to answer this question...... on Web based expert systems – will be presented. The idea behind the presentation of theaccessibility evaluation and its conclusions is to show to Web based expert system developers, who typically have little Web engineering background, that Web engineering issues must be considered when developing Web......Similar to many legacy computer systems, expert systems can be accessed via the Web, forming a set of Web applications known as Web based expert systems. The tough Web competition, the way people and organizations rely on Web applications and theincreasing user requirements for better services have...

  2. Programming Web services with Perl

    CERN Document Server

    Ray, Randy J

    2003-01-01

    Given Perl's natural fit for web applications development, it's no surprise that Perl is also a natural choice for web services development. It's the most popular web programming language, with strong implementations of both SOAP and XML-RPC, the leading ways to distribute applications using web services. But books on web services focus on writing these applications in Java or Visual Basic, leaving Perl programmers with few resources to get them started. Programming Web Services with Perl changes that, bringing Perl users all the information they need to create web services using their favori

  3. Cooperative Mobile Web Browsing

    DEFF Research Database (Denmark)

    Perrucci, GP; Fitzek, FHP; Zhang, Qi

    2009-01-01

    This paper advocates a novel approach for mobile web browsing based on cooperation among wireless devices within close proximity operating in a cellular environment. In the actual state of the art, mobile phones can access the web using different cellular technologies. However, the supported data......-range links can then be used for cooperative mobile web browsing. By implementing the cooperative web browsing on commercial mobile phones, it will be shown that better performance is achieved in terms of increased data rate and therefore reduced access times, resulting in a significantly enhanced web...

  4. Web Portal Design, Execution and Sustainability for Naval Websites and Web Services

    National Research Council Canada - National Science Library

    Amsden, Saundra

    2003-01-01

    .... The newest Web Service is the development of Web Portals. Portals allow the design of Web Services in such a way as to allow users to define their own needs and create a home of their own within a site...

  5. Amino Acid Interaction (INTAA) web server.

    Science.gov (United States)

    Galgonek, Jakub; Vymetal, Jirí; Jakubec, David; Vondrášek, Jirí

    2017-07-03

    Large biomolecules (proteins and nucleic acids) are composed of building blocks which define their identity, properties and binding capabilities. In order to shed light on the energetic side of interactions of amino acids between themselves and with deoxyribonucleotides, we present the Amino Acid Interaction web server (http://bioinfo.uochb.cas.cz/INTAA/). INTAA offers the calculation of the residue Interaction Energy Matrix for any protein structure (deposited in Protein Data Bank or submitted by the user) and a comprehensive analysis of the interfaces in protein-DNA complexes. The Interaction Energy Matrix web application aims to identify key residues within protein structures which contribute significantly to the stability of the protein. The application provides an interactive user interface enhanced by 3D structure viewer for efficient visualization of pairwise and net interaction energies of individual amino acids, side chains and backbones. The protein-DNA interaction analysis part of the web server allows the user to view the relative abundance of various configurations of amino acid-deoxyribonucleotide pairs found at the protein-DNA interface and the interaction energies corresponding to these configurations calculated using a molecular mechanical force field. The effects of the sugar-phosphate moiety and of the dielectric properties of the solvent on the interaction energies can be studied for the various configurations. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Web 2.0 as a dystopia in the recent internet

    Directory of Open Access Journals (Sweden)

    Antonio Cambra

    2008-05-01

    Full Text Available The term Web 2.0 has recently come to form part of the vocabulary associated with the internet. Semantically imprecise, it seeks to capture a moment in the development of the internet where the user becomes the central catalyst, with a greater capacity for expression, interaction and participation provided by certain recently developed technologies. Despite this, certain figures and experts are critical of the current course, which they accuse of leading to a whole series of effects that, far from being desirable, bring into question the ideal nature of the evolution of the internet. Expressions such as cultural levelling, cult of the amateur or collective intelligence (used pejoratively) have formed around the term Web 2.0, producing an aureole of dystopic resonance. This article seeks to relativise some of the arguments that form part of the dystopic view of Web 2.0, reflecting on the internet from an alternative perspective that helps understand the phenomenon without being overcome by the pessimism that it seems to be subject to in the views of certain experts. Far from presupposing a telos dictating its evolution, the article defends an idea of the internet as a pragmatic field for experimentation that is legitimate as a "path" rather than a "destination" in terms of an implicit or expected development.

  7. Potential influence of Web 2.0 usage and security practices of online users on information management

    Directory of Open Access Journals (Sweden)

    R.J. Rudman

    2009-02-01

    Full Text Available The proliferation of Web 2.0 applications was the impetus for this survey-based research into practices that online users currently employ when using Web 2.0 sites. As part of the study, the popularity of Web 2.0 technologies and sites among online users at a university was investigated to determine the extent of the potential threat to corporate security, arising from Web 2.0 use and access. The results of this study indicate that the use of Web 2.0 sites is very popular among students, as a proxy for the potential future business users, and that users are not necessarily aware of the risks associated with these sites. The respondents indicated that they regularly visit Web 2.0 sites, and that they post personal information on these sites. This is of concern in protecting arguably the most valuable asset of a business.

  8. Instant responsive web design

    CERN Document Server

    Simmons, Cory

    2013-01-01

    A step-by-step tutorial approach which will teach the readers what responsive web design is and how it is used in designing a responsive web page.If you are a web-designer looking to expand your skill set by learning the quickly growing industry standard of responsive web design, this book is ideal for you. Knowledge of CSS is assumed.

  9. SVG-Based Web Publishing

    Science.gov (United States)

    Gao, Jerry Z.; Zhu, Eugene; Shim, Simon

    2003-01-01

    With the increasing applications of the Web in e-commerce, advertising, and publication, new technologies are needed to improve Web graphics due to the limitations of current technology. The SVG (Scalable Vector Graphics) technology is a revolutionary solution to overcome the existing problems in current web technology. It provides precise, high-resolution web graphics using plain-text format commands. It sets a new standard for web graphics formats, allowing complicated graphics to be presented with rich text fonts and colors, high printing quality, and dynamic layout capabilities. This paper provides a tutorial overview of SVG technology and its essential features, capabilities, and advantages, and reports comparison studies between SVG and other web graphics technologies.

  10. Sensor web

    Science.gov (United States)

    Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)

    2011-01-01

    A Sensor Web formed of a number of different sensor pods. Each of the sensor pods include a clock which is synchronized with a master clock so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first using a coarse synchronization which takes less power, and subsequently carrying out a fine synchronization to make a fine sync of all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and responded, and then only listen during time slots corresponding to those pods which respond.
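
    The two-stage synchronization described above (a cheap coarse pass followed by a more expensive fine pass) can be conveyed with a simple offset-estimation sketch. The interfaces and numbers below are hypothetical and only illustrate the idea of refining a clock offset in two rounds.

```typescript
// Hypothetical two-stage clock synchronization sketch: a coarse offset estimate
// from a single timestamp exchange, refined by averaging several fine exchanges.
interface TimeSource {
  now(): number; // returns the remote (master) clock reading in ms
}

function coarseOffset(localNow: () => number, master: TimeSource): number {
  // Single exchange: cheap but noisy (ignores network/processing delay).
  return master.now() - localNow();
}

function fineOffset(localNow: () => number, master: TimeSource, rounds = 8): number {
  // Average several round-trip estimates to reduce jitter (costs more energy/traffic).
  let sum = 0;
  for (let i = 0; i < rounds; i++) {
    const t0 = localNow();
    const remote = master.now();
    const t1 = localNow();
    sum += remote - (t0 + t1) / 2; // midpoint compensates for symmetric delay
  }
  return sum / rounds;
}

// A pod would first apply coarseOffset(), then schedule fineOffset() before
// agreeing on listening time slots with its neighbours.
```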

  11. A Sensor Web and Web Service-Based Approach for Active Hydrological Disaster Monitoring

    Directory of Open Access Journals (Sweden)

    Xi Zhai

    2016-09-01

    Full Text Available Rapid advancements in Earth-observing sensor systems have led to the generation of large amounts of remote sensing data that can be used for the dynamic monitoring and analysis of hydrological disasters. The management and analysis of these data could take advantage of distributed information infrastructure technologies such as Web service and Sensor Web technologies, which have shown great potential in facilitating the use of observed big data in an interoperable, flexible and on-demand way. However, it remains a challenge to achieve timely response to hydrological disaster events and to automate the geoprocessing of hydrological disaster observations. This article proposes a Sensor Web and Web service-based approach to support active hydrological disaster monitoring. This approach integrates an event-driven mechanism, Web services, and a Sensor Web and coordinates them using workflow technologies to facilitate the Web-based sharing and processing of hydrological hazard information. The design and implementation of hydrological Web services for conducting various hydrological analysis tasks on the Web using dynamically updating sensor observation data are presented. An application example is provided to demonstrate the benefits of the proposed approach over the traditional approach. The results confirm the effectiveness and practicality of the proposed approach in cases of hydrological disaster.

  12. WebRTC using JSON via XMLHttpRequest and SIP over WebSocket: initial signalling overhead findings

    CSIR Research Space (South Africa)

    Adeyeye, M

    2013-08-01

    Full Text Available Web Real-Time Communication (WebRTC) introduces real-time multimedia communication as a native capability of Web browsers. With the adoption of WebRTC, Web browsers will be able to communicate with one another (peer...
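    The report measures signalling overhead rather than publishing code; as a rough sketch of the JSON-over-HTTP variant, the snippet below posts a dummy SDP offer to a hypothetical signalling endpoint (the URL and message fields are invented for illustration).

```python
# Post a JSON-encoded SDP offer to a (hypothetical) signalling server and
# read back the answer, mirroring the XMLHttpRequest exchange a browser makes.
import json
import urllib.request

offer = {
    "type": "offer",
    "from": "alice",
    "to": "bob",
    "sdp": "v=0\r\no=- 0 0 IN IP4 127.0.0.1\r\ns=-\r\n...",  # placeholder SDP body
}

req = urllib.request.Request(
    "https://signalling.example.org/offer",  # hypothetical endpoint
    data=json.dumps(offer).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    answer = json.loads(resp.read().decode("utf-8"))
print(answer.get("type"))  # the remote peer is expected to reply with an "answer"
```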

  13. Treatment dropout in web-based cognitive behavioral therapy for patients with eating disorders.

    Science.gov (United States)

    Ter Huurne, Elke D; Postel, Marloes G; de Haan, Hein A; van der Palen, Job; DeJong, Cor A J

    2017-01-01

    Treatment dropout is an important concern in eating disorder treatments as it has negative implications for patients' outcome, clinicians' motivation, and research studies. Our main objective was to conduct an exploratory study on treatment dropout in a two-part web-based cognitive behavioral therapy with asynchronous therapeutic support. The analysis included 205 female patients with eating disorders. Reasons for dropout, treatment experiences, and predictors of dropout were analyzed. Overall treatment dropout was 37.6%, with 18.5% early dropout (before or during treatment part 1) and 19.0% late dropout (after part 1 or during part 2). Almost half of the participants identified personal circumstances as reason for dropout. The other participants mostly reported reasons related to the online delivery or treatment protocol. Predictors of early dropout included reporting less vigor and smoking at baseline and a longer average duration per completed treatment module of part 1. Late dropout was predicted by reporting less vigor at baseline and uncertainty about recommendation of the treatment to others after completion of treatment part 1. Generally, the web-based treatment and online therapeutic support were evaluated positively, although dropouts rated the treatment as significantly less helpful and effective than completers did. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Webs and posets

    International Nuclear Information System (INIS)

    Dukes, M.; Gardi, E.; McAslan, H.; Scott, D.J.; White, C.D.

    2014-01-01

    The non-Abelian exponentiation theorem has recently been generalised to correlators of multiple Wilson line operators. The perturbative expansions of these correlators exponentiate in terms of sets of diagrams called webs, which together give rise to colour factors corresponding to connected graphs. The colour and kinematic degrees of freedom of individual diagrams in a web are entangled by mixing matrices of purely combinatorial origin. In this paper we relate the combinatorial study of these matrices to properties of partially ordered sets (posets), and hence obtain explicit solutions for certain families of web-mixing matrix, at arbitrary order in perturbation theory. We also provide a general expression for the rank of a general class of mixing matrices, which governs the number of independent colour factors arising from such webs. Finally, we use the poset language to examine a previously conjectured sum rule for the columns of web-mixing matrices which governs the cancellation of the leading subdivergences between diagrams in the web. Our results, when combined with parallel developments in the evaluation of kinematic integrals, offer new insights into the all-order structure of infrared singularities in non-Abelian gauge theories

  15. Development of STEP-NC Adaptor for Advanced Web Manufacturing System

    Science.gov (United States)

    Ajay Konapala, Mr.; Koona, Ramji, Dr.

    2017-08-01

    Information systems play a key role in the modern era of Information Technology. Rapid developments in IT and global competition call for many changes in the basic CAD/CAM/CAPP/CNC manufacturing chain of operations. 'STEP-NC', an enhancement to STEP for operating CNC machines, creates new opportunities for collaborative, concurrent, and adaptive work across the manufacturing chain. Schemas and data models defined by ISO 14649, in liaison with the ISO 10303 standards, make the STEP-NC file rich with feature-based information rather than the mere point-to-point information of the G/M-code format. However, a suitable information system is needed to understand and modify these files. Various STEP-NC information systems are reviewed to assess the suitability of STEP-NC for web manufacturing. The present work also deals with the development of an adaptor which imports a STEP-NC file, organizes its information, allows modifications to entity values, and finally generates a new STEP-NC file for export. The system is designed to run on the web, both to gain the additional benefits the web offers and to become part of a proposed 'Web based STEP-NC manufacturing platform', which is under development and described as future scope.
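    The adaptor itself is not published with the abstract. As a hedged sketch, STEP-NC physical files follow the ISO 10303-21 ("Part 21") exchange structure, so a minimal importer can at least enumerate entities such as MACHINING_WORKINGSTEP before any modification and re-export; the parser below is a simplification that assumes one entity instance per line.

```python
# Minimal Part 21 entity scanner (one entity instance per line assumed).
# Lines in the DATA section look like: #23=MACHINING_WORKINGSTEP('WS1',#18,#22,#24,$);
import re

ENTITY_RE = re.compile(r"#(\d+)\s*=\s*([A-Z0-9_]+)\s*\((.*)\)\s*;")

def parse_step_entities(text):
    """Return {entity_id: (entity_type, raw_parameter_string)}."""
    entities = {}
    for match in ENTITY_RE.finditer(text):
        eid, etype, params = match.groups()
        entities[int(eid)] = (etype, params.strip())
    return entities

sample = """
DATA;
#18=PLANE('clamping plane',$);
#23=MACHINING_WORKINGSTEP('WS1',#18,#22,#24,$);
ENDSEC;
"""

for eid, (etype, params) in parse_step_entities(sample).items():
    print(eid, etype, params)
```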

  16. Advanced Techniques in Web Intelligence-2 Web User Browsing Behaviour and Preference Analysis

    CERN Document Server

    Palade, Vasile; Jain, Lakhmi

    2013-01-01

    This research volume focuses on analyzing the web user browsing behaviour and preferences in traditional web-based environments, social networks and web 2.0 applications, by using advanced techniques in data acquisition, data processing, pattern extraction and cognitive science for modeling the human actions. The book is directed to graduate students, researchers/scientists and engineers interested in updating their knowledge with the recent trends in web user analysis, for developing the next generation of web-based systems and applications.

  17. A demanding web-based PACS supported by web services technology

    Science.gov (United States)

    Costa, Carlos M. A.; Silva, Augusto; Oliveira, José L.; Ribeiro, Vasco G.; Ribeiro, José

    2006-03-01

    During the last years, the ubiquity of web interfaces has pushed practically all PACS suppliers to develop client applications in which clinical practitioners can receive and analyze medical images using conventional personal computers and Web browsers. However, due to security and performance issues, the utilization of these software packages has been restricted to intranets. Paradoxically, one of the most important advantages of digital image systems is to simplify the widespread sharing of and remote access to medical data between healthcare institutions. This paper analyses the traditional PACS drawbacks that contribute to their limited usage on the Internet and describes a PACS based on Web Services technology that supports a customized DICOM encoding syntax and a specific compression scheme, providing all historical patient data in a single Web interface.

  18. Educating for ethical leadership through web-based coaching.

    Science.gov (United States)

    Eide, Tom; Dulmen, Sandra van; Eide, Hilde

    2016-12-01

    Ethical leadership is important for developing ethical healthcare practice. However, there is little research-based knowledge on how to stimulate and educate for ethical leadership. The aim was to develop and investigate the feasibility of a 6-week web-based, ethical leadership educational programme and learn from participants' experience. Training programme and research design: A training programme was developed consisting of (1) a practice part, where the participating middle managers developed and ran an ethics project in their own departments aiming at enhancing the ethical mindfulness of the organizational culture, and (2) a web-based reflection part, including online reflections and coaching while executing the ethics project. Focus group interviews were used to explore the participants' experiences with and the feasibility of the training. Participants and research context: Nine middle managers were recruited from a part-time master's programme in leadership in Oslo, Norway. The research context was the participating leaders' work situation during the 6 weeks of training. Ethical considerations: Participation was voluntary, data anonymized and the confidentiality of the participating leaders/students and their institutions maintained. No patient or medical information was involved. Eight of the nine recruited leaders completed the programme. They evaluated the training programme as efficient and supportive, with the written, situational feedback/coaching as the most important element, enhancing reflection and motivation, counteracting a feeling of loneliness and promoting the execution of change. The findings seem consistent with the basic assumptions behind the educational design, based partly on e-health research, feedback studies and organizational ethics methodology, partly on theories on workplace learning, reflection, recognition and motivation. The training programme seems feasible. It should be adjusted according to participants' proposals and tested

  19. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    Science.gov (United States)

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties to get locally-installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS intermediates the access to registered tools by providing front-end and back-end web services. Programmers can install applications in HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. Applications registered with Bioinformatics open web services can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run in HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
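    The abstract describes a submit-and-poll interaction with the front-end service but gives no endpoint details; the sketch below illustrates that flow against a hypothetical REST interface (the base URL, paths and JSON fields are assumptions, not the real BOWS API).

```python
# Submit a job to a hypothetical front-end REST service and poll until it finishes.
import time
import requests

BASE = "http://bows.example.org/frontend"  # hypothetical server

def submit_job(tool, params):
    resp = requests.post(f"{BASE}/jobs", json={"tool": tool, "params": params})
    resp.raise_for_status()
    return resp.json()["job_id"]

def wait_for_result(job_id, poll_seconds=10):
    while True:
        resp = requests.get(f"{BASE}/jobs/{job_id}")
        resp.raise_for_status()
        body = resp.json()
        if body["status"] in ("finished", "failed"):
            return body
        time.sleep(poll_seconds)

job_id = submit_job("blast", {"query": "ATGGCC...", "db": "nr"})
print(wait_for_result(job_id)["status"])
```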

  20. From people to entities new semantic search paradigms for the web

    CERN Document Server

    Demartini, G

    2014-01-01

    The exponential growth of digital information available in companies and on the Web creates the need for search tools that can respond to the most sophisticated information needs. Many user tasks would be simplified if Search Engines would support typed search, and return entities instead of just Web documents. For example, an executive who tries to solve a problem needs to find people in the company who are knowledgeable about a certain topic.In the first part of the book, we propose a model for expert finding based on the well-consolidated vector space model for Information Retrieval and inv

  1. Funnel-web spider bite

    Science.gov (United States)

    ... page: //medlineplus.gov/ency/article/002844.htm Funnel-web spider bite ... the effects of a bite from the funnel-web spider. Male funnel-web spiders are more poisonous ...

  2. Web Apollo: a web-based genomic annotation editing platform.

    Science.gov (United States)

    Lee, Eduardo; Helt, Gregg A; Reese, Justin T; Munoz-Torres, Monica C; Childers, Chris P; Buels, Robert M; Stein, Lincoln; Holmes, Ian H; Elsik, Christine G; Lewis, Suzanna E

    2013-08-30

    Web Apollo is the first instantaneous, collaborative genomic annotation editor available on the web. One of the natural consequences following from current advances in sequencing technology is that there are more and more researchers sequencing new genomes. These researchers require tools to describe the functional features of their newly sequenced genomes. With Web Apollo researchers can use any of the common browsers (for example, Chrome or Firefox) to jointly analyze and precisely describe the features of a genome in real time, whether they are in the same room or working from opposite sides of the world.

  3. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    2010-01-01

    Ed). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...

  4. Security Assessment of Web Based Distributed Applications

    Directory of Open Access Journals (Sweden)

    Catalin BOJA

    2010-01-01

    Full Text Available This paper presents an overview of the evaluation of risks and vulnerabilities in a web-based distributed application, emphasizing aspects of the security assessment process with regard to the audit field. In the audit process, an important activity is the measurement of the characteristics considered for evaluation. From this point of view, the quality of the audit process depends on the quality of the assessment methods and techniques. By reviewing the fields involved in the research process, the approach aims to reflect the main concerns affecting web-based distributed applications, using exploratory research techniques. The results show that there are many aspects which must be handled carefully across a distributed system, and that they can be revealed through an in-depth analysis of the information flow and the internal processes that are part of the system. The paper highlights the absence of a unified security risk assessment model that could prevent the risks and vulnerabilities discussed. Based on such standardized models, secure web-based distributed applications could be audited more easily, and many vulnerabilities which can appear due to the lack of access to information could be avoided.

  5. Uses and Gratifications of the World Wide Web: From Couch Potato to Web Potato.

    Science.gov (United States)

    Kaye, Barbara K.

    1998-01-01

    Investigates uses and gratifications of the World Wide Web and its impact on traditional mass media, especially television. Identifies six Web use motivations: entertainment, social interaction, passing of time, escape, information, and Web site preference. Examines relationships between each use motivation and Web affinity, perceived realism, and…

  6. The RCSB Protein Data Bank: redesigned web site and web services.

    Science.gov (United States)

    Rose, Peter W; Beran, Bojan; Bi, Chunxiao; Bluhm, Wolfgang F; Dimitropoulos, Dimitris; Goodsell, David S; Prlic, Andreas; Quesada, Martha; Quinn, Gregory B; Westbrook, John D; Young, Jasmine; Yukich, Benjamin; Zardecki, Christine; Berman, Helen M; Bourne, Philip E

    2011-01-01

    The RCSB Protein Data Bank (RCSB PDB) web site (http://www.pdb.org) has been redesigned to increase usability and to cater to a larger and more diverse user base. This article describes key enhancements and new features that fall into the following categories: (i) query and analysis tools for chemical structure searching, query refinement, tabulation and export of query results; (ii) web site customization and new structure alerts; (iii) pair-wise and representative protein structure alignments; (iv) visualization of large assemblies; (v) integration of structural data with the open access literature and binding affinity data; and (vi) web services and web widgets to facilitate integration of PDB data and tools with other resources. These improvements enable a range of new possibilities to analyze and understand structure data. The next generation of the RCSB PDB web site, as described here, provides a rich resource for research and education.

  7. WebQuests in special primary education: Learning in a web-based environment

    NARCIS (Netherlands)

    Kleemans, M.A.J.; Segers, P.C.J.; Droop, W.; Wentink, W.M.J.

    2011-01-01

    The present study investigated the differences in learning gain when performing a WebQuest with a well-defined versus an ill-defined assignment. Twenty boys and twenty girls (mean age 11; 10), attending a special primary education school, performed two WebQuests. In each WebQuest, they performed either a well-defined or an ill-defined assignment.

  8. Web Science emerges

    OpenAIRE

    Shadbolt, Nigel; Berners-Lee, Tim

    2008-01-01

    The relentless rise in Web pages and links is creating emergent properties, from social networks to virtual identity theft, that are transforming society. A new discipline, Web Science, aims to discover how Web traits arise and how they can be harnessed or held in check to benefit society. Important advances are beginning to be made; more work can solve major issues such as securing privacy and conveying trust.

  9. Hardening cookies in web-based systems for better system integrity

    International Nuclear Information System (INIS)

    Mohamad Safuan Sulaiman; Mohd Dzul Aiman Aslan; Saaidi Ismail; Abdul Aziz Mohd Ramli; Abdul Muin Abdul Rahman; Siti Nurbahyah Hamdan; Norlelawati Hashimuddin; Sufian Norazam Mohamed Aris

    2012-01-01

    IT Center (ITC), as the technical support and provider for most of the web-based systems in Nuclear Malaysia, has conducted a study to investigate cookie vulnerability in a system for better integrity. Part of the results showed that cookies in a web-based system in Nuclear Malaysia can be easily manipulated. The main objective of the study is to harden this cookie vulnerability. Two levels of security procedures have been used and enforced, consisting of 1) a penetration test (pen test) and 2) a hardening procedure. In one of the systems, the study found that 121 attempted threats were detected after the hardening enforcement, from 23 March till 20 September 2012. At this stage, it can be concluded that the cookie vulnerability in the system has been hardened and integrity has been assured after the enforcement. This paper describes in detail the penetration testing and hardening process for cookie vulnerability, to better support web-based systems in Nuclear Malaysia. (author)
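    The paper does not list the specific hardening settings that were applied; purely as a generic illustration of cookie hardening in a web application (not the actual procedure used at Nuclear Malaysia), the sketch below sets the usual Secure, HttpOnly and SameSite attributes with Flask.

```python
# Generic cookie-hardening example: the session cookie is HTTPS-only,
# hidden from JavaScript, restricted to same-site requests and short-lived.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/login")
def login():
    resp = make_response("logged in")
    resp.set_cookie(
        "session_id",
        "opaque-random-token",  # store only an opaque token, never user data
        secure=True,            # transmitted over HTTPS only
        httponly=True,          # not readable from JavaScript
        samesite="Strict",      # not sent on cross-site requests
        max_age=1800,           # expires after 30 minutes
    )
    return resp

if __name__ == "__main__":
    app.run()  # in production, serve behind HTTPS so the Secure flag is honoured
```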

  10. WebViz: A web browser based application for collaborative analysis of 3D data

    Science.gov (United States)

    Ruegg, C. S.

    2011-12-01

    In the age of high-speed Internet, where people can interact instantly, scientific tools have lacked the technology to incorporate this concept of communication using the web. To solve this issue, a web application for geological studies has been created, tentatively titled WebViz. This web application utilizes tools provided by the Google Web Toolkit to create an AJAX web application capable of features found in non-web-based software. Using these tools, a web application can be created that acts as a piece of software accessible from anywhere in the globe with a reasonably speedy Internet connection. An application of this technology can be seen with data regarding the recent tsunami from the major Japan earthquakes. After preparing the data for a rendering program called HVR, WebViz can request images of the tsunami data and display them to anyone who has access to the application. This convenience alone makes WebViz a viable solution, but the option to interact with these data with others around the world establishes WebViz as a serious computational tool. WebViz can also be used on any JavaScript-enabled browser, such as those found on modern tablets and smartphones, over a fast wireless connection. Because WebViz is currently built using the Google Web Toolkit, the application is highly portable. Though many developers have been involved with the project, each person has contributed to increasing the usability and speed of the application. In the project's most recent form, a dramatic speed increase has been implemented as well as a more efficient user interface. The speed increase has been informally noticed in recent uses of the application in China and Australia, with the hosting server located at the University of Minnesota. The user interface has been improved not only in appearance but also in functionality. Major functions of the application are rotating the 3D object using buttons

  11. Web thickness determines the therapeutic effect of endoscopic keel placement on anterior glottic web.

    Science.gov (United States)

    Chen, Jian; Shi, Fang; Chen, Min; Yang, Yue; Cheng, Lei; Wu, Haitao

    2017-10-01

    This work is a retrospective analysis to investigate the critical risk factor for the therapeutic effect of endoscopic keel placement on anterior glottic web. Altogether, 36 patients with anterior glottic web undergoing endoscopic lysis and silicone keel placement were enrolled. Their voice qualities were evaluated using the voice handicap index-10 (VHI-10) questionnaire and improved significantly 3 months after surgery (21.53 ± 3.89 vs 9.81 ± 6.68, P < …). Some patients, however, experienced web recurrence during the at least 1-year follow-up. Therefore, patients were classified according to the Cohen classification or web thickness, and the recurrence rates were compared. The recurrence rates for Cohen types 1-4 were 28.6, 16.7, 33.3, and 40%, respectively; the difference was not statistically significant (P = 0.461). When classified by web thickness, only 2 of 27 (7.41%) thin-type cases relapsed, whereas 8 of 9 (88.9%) cases in the thick group reformed webs (P < …), suggesting that recurrence depends on web thickness rather than the Cohen grade. Endoscopic lysis and keel placement is only effective for cases with thin glottic webs. Patients with thick webs should be treated by other means.

  12. Web cache location

    Directory of Open Access Journals (Sweden)

    Boffey Brian

    2004-01-01

    Full Text Available Stress placed on network infrastructure by the popularity of the World Wide Web may be partially relieved by keeping multiple copies of Web documents at geographically dispersed locations. In particular, use of proxy caches and replication provide a means of storing information 'nearer to end users'. This paper concentrates on the locational aspects of Web caching giving both an overview, from an operational research point of view, of existing research and putting forward avenues for possible further research. This area of research is in its infancy and the emphasis will be on themes and trends rather than on algorithm construction. Finally, Web caching problems are briefly related to referral systems more generally.

  13. Multigraph: Interactive Data Graphs on the Web

    Science.gov (United States)

    Phillips, M. B.

    2010-12-01

    " through large data sets, downloading only those the parts of the data that are needed for display. Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org. Interactive Graph of Global Temperature Anomalies from ClimateWatch Magazine (http://www.climatewatch.noaa.gov/2009/articles/climate-change-global-temperature)

  14. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    Virtual observatories mature from their original domain and become common practice for earth observation research and policy building. The term Virtual Observatory originally came from the astronomical research community. Here, virtual observatories provide universal access to the available astronomical data archives of space and ground-based observatories. Further on, as those virtual observatories aim at integrating heterogeneous resources provided by a number of participating organizations, the virtual observatory acts as a coordinating entity that strives for common data analysis techniques and tools based on common standards. The Sensor Web is on its way to become one of the major virtual observatories outside of the astronomical research community. Like the original observatory that consists of a number of telescopes, each observing a specific part of the wave spectrum and with a collection of astronomical instruments, the Sensor Web provides a multi-eyes perspective on the current, past, as well as future situation of our planet and its surrounding spheres. The current view of the Sensor Web is that of a single worldwide collaborative, coherent, consistent and consolidated sensor data collection, fusion and distribution system. The Sensor Web can perform as an extensive monitoring and sensing system that provides timely, comprehensive, continuous and multi-mode observations. This technology is key to monitoring and understanding our natural environment, including key areas such as climate change, biodiversity, or natural disasters on local, regional, and global scales. The Sensor Web concept has been well established with ongoing global research and deployment of Sensor Web middleware and standards and represents the foundation layer of systems like the Global Earth Observation System of Systems (GEOSS). The Sensor Web consists of a huge variety of physical and virtual sensors as well as observational data, made available on the Internet at standardized

  15. Usability Testing for e-Resource Discovery: How Students Find and Choose e-Resources Using Library Web Sites

    Science.gov (United States)

    Fry, Amy; Rich, Linda

    2011-01-01

    In early 2010, library staff at Bowling Green State University (BGSU) in Ohio designed and conducted a usability study of key parts of the library web site, focusing on the web pages generated by the library's electronic resources management system (ERM) that list and describe the library's databases. The goal was to discover how users find and…

  16. Reactivity on the Web

    OpenAIRE

    Bailey, James; Bry, François; Eckert, Michael; Patrânjan, Paula Lavinia

    2005-01-01

    Reactivity, the ability to detect simple and composite events and respond in a timely manner, is an essential requirement in many present-day information systems. With the emergence of new, dynamic Web applications, reactivity on the Web is receiving increasing attention. Reactive Web-based systems need to detect and react not only to simple events but also to complex, real-life situations. This paper introduces XChange, a language for programming reactive behaviour on the Web,...

  17. Semantic Web status model

    CSIR Research Space (South Africa)

    Gerber, AJ

    2006-06-01

    Full Text Available Semantic Web application areas are experiencing intensified interest due to the rapid growth in the use of the Web, together with the innovation and renovation of information content technologies. The Semantic Web is regarded as an integrator across...

  18. Your Life in Web Apps

    CERN Document Server

    Turnbull, Giles

    2008-01-01

    What is a web app? It's software that you use right in your web browser. Rather than installing an application on your computer, you visit a web site and sign up as a new user of its software. Instead of storing your files on your own hard disk, the web app stores them for you, online. Is it possible to switch entirely to web apps? To run nothing but a browser for an entire day? In this PDF we'll take you through one day in the life of a web apps-only user and chronicle the pros and cons of living by browser. And if the idea of switching, fully or partially, to web apps sounds appealing to

  19. Web 25

    DEFF Research Database (Denmark)

    the reader on an exciting time travel journey to learn more about the prehistory of the hyperlink, the birth of the Web, the spread of the early Web, and the Web’s introduction to the general public in mainstream media. Fur- thermore, case studies of blogs, literature, and traditional media going online...

  20. Web Based Client/Server Interface for Part Task Training

    National Research Council Canada - National Science Library

    Blemel, Peter

    2000-01-01

    .... The project focused on developing concepts for ways to use the Internet to provide individual and cooperative Distance Part Task Training using virtual or real training equipment. The Phase I goal was to define a commercially viable multi-media virtual training environment for providing realistic training wherever and whenever needed.

  1. Implementation of MINI-PACS using the DICOM converter on the Web

    International Nuclear Information System (INIS)

    Ji, Youn Sang

    2000-01-01

    In recent years, medical procedures have become more complex, while financial pressures for shortened hospital stays and increased efficiency in patient care have grown. As a result, several shortcomings of present film-based systems for managing medical images have become apparent. Maintaining a film archive is labor intensive and consumes valuable space, and because only single copies of radiological examinations exist, they are prone to being lost or misplaced, consuming additional valuable time and expense. In this paper, a MINI-PACS for image archiving, transmission, and viewing offers a solution to these problems. The proposed MINI-PACS consists mainly of four parts: a Web Module, a Client-Server Module, an Internal Module, and an Acquisition Module. In addition, the MINI-PACS includes a DICOM Converter that converts non-DICOM file formats into the standard DICOM file format. In the case of the Client-Server Module, the proposed system combines both an SCU (Service Class User: client) part and an SCP (Service Class Provider: server) part, and therefore provides high-resolution image processing techniques on the Windows platform. This matters because a general full-PACS is too expensive for medium and small hospitals to install and operate. We also constructed the Web Module for database connection through the WWW.
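    The paper gives no implementation details for the DICOM converter; the following sketch shows one way such a conversion could look, assuming the pydicom and numpy libraries (neither is mentioned in the paper) and wrapping a plain 8-bit grayscale image in a Secondary Capture DICOM object.

```python
# Wrap a raw 8-bit grayscale image array in a minimal Secondary Capture DICOM file.
import numpy as np
from pydicom.dataset import FileDataset, FileMetaDataset
from pydicom.uid import ExplicitVRLittleEndian, generate_uid

def to_dicom(pixels: np.ndarray, patient_name: str, out_path: str) -> None:
    meta = FileMetaDataset()
    meta.MediaStorageSOPClassUID = "1.2.840.10008.5.1.4.1.1.7"  # Secondary Capture
    meta.MediaStorageSOPInstanceUID = generate_uid()
    meta.TransferSyntaxUID = ExplicitVRLittleEndian

    ds = FileDataset(out_path, {}, file_meta=meta, preamble=b"\x00" * 128,
                     is_implicit_VR=False, is_little_endian=True)
    ds.SOPClassUID = meta.MediaStorageSOPClassUID
    ds.SOPInstanceUID = meta.MediaStorageSOPInstanceUID
    ds.PatientName = patient_name
    ds.Modality = "OT"                      # "other" modality for converted images
    ds.Rows, ds.Columns = pixels.shape
    ds.SamplesPerPixel = 1
    ds.PhotometricInterpretation = "MONOCHROME2"
    ds.BitsAllocated = 8
    ds.BitsStored = 8
    ds.HighBit = 7
    ds.PixelRepresentation = 0
    ds.PixelData = pixels.astype(np.uint8).tobytes()
    ds.save_as(out_path)

to_dicom(np.zeros((64, 64), dtype=np.uint8), "TEST^PATIENT", "converted.dcm")
```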

  2. Survey of Techniques for Deep Web Source Selection and Surfacing the Hidden Web Content

    OpenAIRE

    Khushboo Khurana; M.B. Chandak

    2016-01-01

    Large and continuously growing dynamic web content has created new opportunities for large-scale data analysis in the recent years. There is huge amount of information that the traditional web crawlers cannot access, since they use link analysis technique by which only the surface web can be accessed. Traditional search engine crawlers require the web pages to be linked to other pages via hyperlinks causing large amount of web data to be hidden from the crawlers. Enormous data is available in...

  3. Time-Series Adaptive Estimation of Vaccination Uptake Using Web Search Queries

    DEFF Research Database (Denmark)

    Dalum Hansen, Niels; Mølbak, Kåre; Cox, Ingemar J.

    2017-01-01

    Estimating vaccination uptake is an integral part of ensuring public health. It was recently shown that vaccination uptake can be estimated automatically from web data, instead of slowly collected clinical records or population surveys [2]. All prior work in this area assumes that features of vac...

  4. Information Waste on the World Wide Web and Combating the Clutter

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; Wijnhoven, Alphonsus B.J.M.; Beckers, David

    2015-01-01

    The Internet has become a critical part of the infrastructure supporting modern life. The high degree of openness and autonomy of information providers determines the access to a vast amount of information on the Internet. However, this makes the web vulnerable to inaccurate, misleading, or outdated

  5. Deploying and sharing U-Compare workflows as web services.

    Science.gov (United States)

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia

    2013-02-18

    U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform.
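    The abstract notes that exported workflows are exposed through open REST and SOAP interfaces but gives no WSDL or operation names; assuming a hypothetical WSDL with a single "annotate" operation, a SOAP call with the zeep library would look roughly as follows.

```python
# Call a hypothetical SOAP workflow service; the WSDL URL and the
# "annotate" operation are assumptions made only for this sketch.
from zeep import Client

client = Client("http://ucompare.example.org/workflow?wsdl")  # hypothetical WSDL
result = client.service.annotate(text="p53 regulates apoptosis in tumour cells.")
print(result)
```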

  6. Web 1.0 to Web 3.0 Evolution: Reviewing the Impacts on Tourism Development and Opportunities

    Science.gov (United States)

    Eftekhari, M. Hossein; Barzegar, Zeynab; Isaai, M. T.

    The most important event following the establishment of the Internet network was the Web, introduced by Tim Berners-Lee. Websites give their owners features with which they can publish and share their content with users and visitors. In the last 5 years, we have seen some changes in the use of the web. Users want to participate in content sharing and they like to interact with each other. This is known as Web 2.0. In the last year, Web 2.0 has reached maturity and now we need a smart web, which will accordingly be called Web 3.0. Web 3.0 is based on the semantic web definition. The changing way of using the web has had a clear impact on E-Tourism and its development, and also on business models. In this paper, we review the definitions and describe the impacts of web evolution on E-Tourism.

  7. Working without a Crystal Ball: Predicting Web Trends for Web Services Librarians

    Science.gov (United States)

    Ovadia, Steven

    2008-01-01

    User-centered design is a principle stating that electronic resources, like library Web sites, should be built around the needs of the users. This article interviews Web developers of library and non-library-related Web sites, determining how they assess user needs and how they decide to adapt certain technologies for users. According to the…

  8. Uncovering Web search strategies in South African higher education

    Directory of Open Access Journals (Sweden)

    Surika Civilcharran

    2016-11-01

    Full Text Available Background: In spite of the enormous amount of information available on the Web and the fact that search engines are continuously evolving to enhance the search experience, students are nevertheless faced with the difficulty of effectively retrieving information. It is, therefore, imperative for the interaction between students and search tools to be understood and search strategies to be identified, in order to promote successful information retrieval. Objectives: This study identifies the Web search strategies used by postgraduate students and forms part of a wider study into information retrieval strategies used by postgraduate students at the University of KwaZulu-Natal (UKZN), Pietermaritzburg campus, South Africa. Method: Largely underpinned by Thatcher’s cognitive search strategies, the mixed-methods approach was utilised for this study, in which questionnaires were employed in Phase 1 and structured interviews in Phase 2. This article reports and reflects on the findings of Phase 2, which focus on identifying the Web search strategies employed by postgraduate students. The Phase 1 results were reported in Civilcharran, Hughes and Maharaj (2015). Results: Findings reveal the Web search strategies used for academic information retrieval. In spite of easy access to the invisible Web and the advent of meta-search engines, the use of Web search engines still remains the preferred search tool. The UKZN online library databases and especially the UKZN online library, Online Public Access Catalogue system, are being underutilised. Conclusion: Being ranked in the top three percent of the world’s universities, UKZN is investing in search tools that are not being used to their full potential. This evidence suggests an urgent need for students to be trained in Web searching and to have a greater exposure to a variety of search tools. This article is intended to further contribute to the design of undergraduate training programmes in order to deal

  9. Private Web Browsing

    National Research Council Canada - National Science Library

    Syverson, Paul F; Reed, Michael G; Goldschlag, David M

    1997-01-01

    .... These are both kept confidential from network elements as well as external observers. Private Web browsing is achieved by unmodified Web browsers using anonymous connections by means of HTTP proxies...

  10. Microsoft Expression Web for dummies

    CERN Document Server

    Hefferman, Linda

    2013-01-01

    Expression Web is Microsoft's newest tool for creating and maintaining dynamic Web sites. This FrontPage replacement offers all the simple "what-you-see-is-what-you-get" tools for creating a Web site along with some pumped up new features for working with Cascading Style Sheets and other design options. Microsoft Expression Web For Dummies arrives in time for early adopters to get a feel for how to build an attractive Web site. Author Linda Hefferman teams up with longtime FrontPage For Dummies author Asha Dornfest to show the easy way for first-time Web designers, FrontPage ve

  11. The BiSciCol Triplifier: bringing biodiversity data to the Semantic Web.

    Science.gov (United States)

    Stucky, Brian J; Deck, John; Conlin, Tom; Ziemba, Lukasz; Cellinese, Nico; Guralnick, Robert

    2014-07-29

    Recent years have brought great progress in efforts to digitize the world's biodiversity data, but integrating data from many different providers, and across research domains, remains challenging. Semantic Web technologies have been widely recognized by biodiversity scientists for their potential to help solve this problem, yet these technologies have so far seen little use for biodiversity data. Such slow uptake has been due, in part, to the relative complexity of Semantic Web technologies along with a lack of domain-specific software tools to help non-experts publish their data to the Semantic Web. The BiSciCol Triplifier is new software that greatly simplifies the process of converting biodiversity data in standard, tabular formats, such as Darwin Core-Archives, into Semantic Web-ready Resource Description Framework (RDF) representations. The Triplifier uses a vocabulary based on the popular Darwin Core standard, includes both Web-based and command-line interfaces, and is fully open-source software. Unlike most other RDF conversion tools, the Triplifier does not require detailed familiarity with core Semantic Web technologies, and it is tailored to a widely popular biodiversity data format and vocabulary standard. As a result, the Triplifier can often fully automate the conversion of biodiversity data to RDF, thereby making the Semantic Web much more accessible to biodiversity scientists who might otherwise have relatively little knowledge of Semantic Web technologies. Easy availability of biodiversity data as RDF will allow researchers to combine data from disparate sources and analyze them with powerful linked data querying tools. However, before software like the Triplifier, and Semantic Web technologies in general, can reach their full potential for biodiversity science, the biodiversity informatics community must address several critical challenges, such as the widespread failure to use robust, globally unique identifiers for biodiversity data.
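    The Triplifier itself is a separate tool; purely as an illustration of the kind of Darwin Core-flavoured RDF it targets, the sketch below converts one tabular occurrence record into triples with the rdflib library (the subject URI and record values are made up for the example).

```python
# Turn one tabular Darwin Core occurrence record into RDF triples and print Turtle.
from rdflib import Graph, Literal, Namespace, RDF

DWC = Namespace("http://rs.tdwg.org/dwc/terms/")   # Darwin Core terms
EX = Namespace("http://example.org/occurrence/")   # made-up subject namespace

record = {"occurrenceID": "occ-0001",
          "scientificName": "Puma concolor",
          "eventDate": "2013-06-21"}

g = Graph()
g.bind("dwc", DWC)
subject = EX[record["occurrenceID"]]
g.add((subject, RDF.type, DWC.Occurrence))
g.add((subject, DWC.scientificName, Literal(record["scientificName"])))
g.add((subject, DWC.eventDate, Literal(record["eventDate"])))

print(g.serialize(format="turtle"))
```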

  12. EarthServer - 3D Visualization on the Web

    Science.gov (United States)

    Wagner, Sebastian; Herzig, Pasquale; Bockholt, Ulrich; Jung, Yvonne; Behr, Johannes

    2013-04-01

    EarthServer (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, is a project to enable the management, access and exploration of massive, multi-dimensional datasets using Open Geospatial Consortium (OGC) query and processing language standards like WCS 2.0 and WCPS. To this end, a server/client architecture designed to handle Petabyte/Exabyte volumes of multi-dimensional data is being developed and deployed. As an important part of the EarthServer project, six Lighthouse Applications, major scientific data exploitation initiatives, are being established to make cross-domain, Earth Sciences related data repositories available in an open and unified manner, as service endpoints based on solutions and infrastructure developed within the project. Client technology developed and deployed in EarthServer ranges from mobile and web clients to immersive virtual reality systems, all designed to interact with a physically and logically distributed server infrastructure using exclusively OGC standards. In this contribution, we would like to present our work on a web-based 3D visualization and interaction client for Earth Sciences data using only technology found in standard web browsers without requiring the user to install plugins or addons. Additionally, we are able to run the earth data visualization client on a wide range of different platforms with very different soft- and hardware requirements such as smart phones (e.g. iOS, Android), different desktop systems etc. High-quality, hardware-accelerated visualization of 3D and 4D content in standard web browsers can be realized now and we believe it will become more and more common to use this fast, lightweight and ubiquitous platform to provide insights into big datasets without requiring the user to set up a specialized client first. With that in mind, we will also point out some of the limitations we encountered using current web technologies. Underlying the EarthServer web client
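    The abstract names WCS 2.0 and WCPS as the access interfaces; the sketch below shows the kind of key/value GetCoverage request such a client could issue. The endpoint URL and coverage name are placeholders; only the parameter names follow the WCS 2.0 convention.

```python
# Fetch a spatial subset of a coverage via a WCS 2.0 key/value GetCoverage request.
import requests

endpoint = "https://earthserver.example.org/ows"  # hypothetical service endpoint
params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "mean_summer_airtemp",            # placeholder coverage name
    "subset": ["Lat(-40,-10)", "Long(110,155)"],    # trim to a bounding box
    "format": "image/tiff",
}

resp = requests.get(endpoint, params=params, timeout=60)
resp.raise_for_status()
with open("coverage.tif", "wb") as f:
    f.write(resp.content)
```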

  13. SEMANTIC WEB MINING: ISSUES AND CHALLENGES

    OpenAIRE

    Karan Singh*, Anil kumar, Arun Kumar Yadav

    2016-01-01

    The combination of the two fast evolving scientific research areas “Semantic Web” and “Web Mining” are well-known as “Semantic Web Mining” in computer science. These two areas cover way for the mining of related and meaningful information from the web, by this means giving growth to the term “Semantic Web Mining”. The “Semantic Web” makes mining easy and “Web Mining” can construct new structure of Web. Web Mining applies Data Mining technique on web content, Structure and Usage. This paper gi...

  14. WebQuests in Special Primary Education: Learning in a Web-Based Environment

    Science.gov (United States)

    Kleemans, Tijs; Segers, Eliane; Droop, Mienke; Wentink, Hanneke

    2011-01-01

    The present study investigated the differences in learning gain when performing a WebQuest with a well-defined versus an ill-defined assignment. Twenty boys and twenty girls (mean age 11; 10), attending a special primary education school, performed two WebQuests. In each WebQuest, they performed either a well-defined or an ill-defined assignment.…

  15. A Web-Based Adaptive Tutor to Teach PCR Primer Design

    Science.gov (United States)

    van Seters, Janneke R.; Wellink, Joan; Tramper, Johannes; Goedhart, Martin J.; Ossevoort, Miriam A.

    2012-01-01

    When students have varying prior knowledge, personalized instruction is desirable. One way to personalize instruction is by using adaptive e-learning to offer training of varying complexity. In this study, we developed a web-based adaptive tutor to teach PCR primer design: the PCR Tutor. We used part of the Taxonomy of Educational Objectives (the…

  16. Medical Students' Experiences with Addicted Patients: A Web-Based Survey

    Science.gov (United States)

    Midmer, Deana; Kahan, Meldon; Wilson, Lynn

    2008-01-01

    Project CREATE was an initiative to strengthen undergraduate medical education in addictions. As part of a needs assessment, forty-six medical students at Ontario's five medical schools completed a bi-weekly, interactive web-based survey about addiction-related learning events. In all, 704 unique events were recorded, for an average of 16.7…

  17. TMFoldWeb: a web server for predicting transmembrane protein fold class.

    Science.gov (United States)

    Kozma, Dániel; Tusnády, Gábor E

    2015-09-17

    Here we present TMFoldWeb, the web server implementation of TMFoldRec, a transmembrane protein fold recognition algorithm. TMFoldRec uses statistical potentials and utilizes topology filtering and a gapless threading algorithm. It ranks template structures and selects the most likely candidates and estimates the reliability of the obtained lowest energy model. The statistical potential was developed in a maximum likelihood framework on a representative set of the PDBTM database. According to the benchmark test the performance of TMFoldRec is about 77 % in correctly predicting fold class for a given transmembrane protein sequence. An intuitive web interface has been developed for the recently published TMFoldRec algorithm. The query sequence goes through a pipeline of topology prediction and a systematic sequence to structure alignment (threading). Resulting templates are ordered by energy and reliability values and are colored according to their significance level. Besides the graphical interface, a programmatic access is available as well, via a direct interface for developers or for submitting genome-wide data sets. The TMFoldWeb web server is unique and currently the only web server that is able to predict the fold class of transmembrane proteins while assigning reliability scores for the prediction. This method is prepared for genome-wide analysis with its easy-to-use interface, informative result page and programmatic access. Considering the info-communication evolution in the last few years, the developed web server, as well as the molecule viewer, is responsive and fully compatible with the prevalent tablets and mobile devices.

  18. Measurment of Web Usability: Web Page of Hacettepe University Department of Information Management

    OpenAIRE

    Nazan Özenç Uçak; Tolga Çakmak

    2009-01-01

    Today, information is produced increasingly in electronic form and retrieval of information is provided via web pages. As a result of the rise of the number of web pages, many of them seem to comprise similar contents but different designs. In this respect, presenting information over the web pages according to user expectations and specifications is important in terms of effective usage of information. This study provides an insight about web usability studies that are executed for measuring...

  19. EPA Web Taxonomy

    Data.gov (United States)

    U.S. Environmental Protection Agency — EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's...

  20. minepath.org: a free interactive pathway analysis web server.

    Science.gov (United States)

    Koumakis, Lefteris; Roussos, Panos; Potamias, George

    2017-07-03

    MinePath (www.minepath.org) is a web-based platform that elaborates on, and radically extends, the identification of differentially expressed sub-paths in molecular pathways. Besides the network topology, the underlying MinePath algorithmic processes exploit exact gene-gene molecular relationships (e.g. activation, inhibition) and are able to identify differentially expressed pathway parts. Each pathway is decomposed into all its constituent sub-paths, which in turn are matched with corresponding gene expression profiles. The highly ranked, phenotype-inclined sub-paths are kept. Apart from the pathway analysis algorithm, the fundamental innovation of the MinePath web-server concerns its advanced visualization and interactive capabilities. To our knowledge, this is the first pathway analysis server that introduces and offers visualization of the underlying and active pathway regulatory mechanisms instead of genes. Other features include live interaction, immediate visualization of functional sub-paths per phenotype and dynamic linked annotations for the engaged genes and molecular relations. The user can download not only the results but also the corresponding web viewer framework of the performed analysis. This feature provides the flexibility to immediately publish results without publishing source/expression data, and get all the functionality of a web based pathway analysis viewer. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. Integration of distributed plant lifecycle data using ISO 15926 and Web services

    International Nuclear Information System (INIS)

    Kim, Byung Chul; Teijgeler, Hans; Mun, Duhwan; Han, Soonhung

    2011-01-01

    Highlights: → The ISO 15926 parts that provide implementation methods are under development. → A prototype of an ISO 15926-based data repository called a facade was implemented. → The prototype facade has the advantages of data interoperability and integration. → These are obtained through the features of ISO 15926 and Web services. - Abstract: Considering the financial, safety, and environmental risks related to industrial installations, it is of paramount importance that all relevant lifecycle information is readily available. Parts of this lifecycle information are stored in a plethora of computer systems, often scattered around the world and in many native formats and languages. These parts can create a complete, holistic set of lifecycle data only when they are integrated together. At present, no software is available that can integrate these parts into one coherent, distributed, and up-to-date set. The ISO 15926 standard has been developed, and in part is still under development, to overcome this problem. In this paper, the authors discuss a prototype of an ISO 15926-based data repository called a facade, and its Web services are implemented for storing the equipment data of a nuclear power plant and servicing the data to interested organizations. This prototype is for a proof-of-concept study regarding the ISO 15926 parts that are currently under development and that are expected to provide implementation methods for the integration of distributed plant systems.

  2. WebTag: Web browsing into sensor tags over NFC.

    Science.gov (United States)

    Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Alvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio

    2012-01-01

    Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as, for example, the design of smarter embedded processors, the improvement of network architectures, the development of efficient communication protocols or the maximization of life-cycle autonomy. This work aims to improve the communication link used for data transmission in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked with the proprietary protocols used for the wireless link (like ZigBee, Bluetooth, RFID, etc.). This work presents a novel solution (WebTag) for direct IP-based access to a sensor tag over Near Field Communication (NFC) technology for secure applications. WebTag allows direct web access to the sensor tag by means of a standard web browser: it reads the sensor data, configures the sampling rate and implements IP-based security policies. It is definitely a new step in the evolution of the Internet of Things paradigm.

  3. Assessing autobiographical memory: the web-based autobiographical Implicit Association Test.

    Science.gov (United States)

    Verschuere, Bruno; Kleinberg, Bennett

    2017-04-01

    By assessing the association strength with TRUE and FALSE, the autobiographical Implicit Association Test (aIAT) [Sartori, G., Agosta, S., Zogmaister, C., Ferrara, S. D., & Castiello, U. (2008). How to accurately detect autobiographical events. Psychological Science, 19, 772-780. doi: 10.1111/j.1467-9280.2008.02156.x ] aims to determine which of two contrasting statements is true. To efficiently run well-powered aIAT experiments, we propose a web-based aIAT (web-aIAT). Experiment 1 (n = 522) is a web-based replication study of the first published aIAT study [Sartori, G., Agosta, S., Zogmaister, C., Ferrara, S. D., & Castiello, U. (2008). How to accurately detect autobiographical events. Psychological Science, 19, 772-780. doi: 10.1111/j.1467-9280.2008.02156.x ; Experiment 1]. We conclude that the replication was successful as the web-based aIAT could accurately detect which of two playing cards participants chose (AUC = .88; Hit rate = 81%). In Experiment 2 (n = 424), we investigated whether the use of affirmative versus negative sentences may partly explain the variability in aIAT accuracy findings. The aIAT could detect the chosen card when using affirmative (AUC = .90; Hit rate = 81%), but not when using negative sentences (AUC = .60; Hit rate = 53%). The web-based aIAT seems to be a valuable tool to facilitate aIAT research and may help to further identify moderators of the test's accuracy.

  4. From Web accessibility to Web adaptability.

    Science.gov (United States)

    Kelly, Brian; Nevile, Liddy; Sloan, David; Fanou, Sotiris; Ellison, Ruth; Herrod, Lisa

    2009-07-01

    This article asserts that current approaches to enhance the accessibility of Web resources fail to provide a solid foundation for the development of a robust and future-proofed framework. In particular, they fail to take advantage of new technologies and technological practices. The article introduces a framework for Web adaptability, which encourages the development of Web-based services that can be resilient to the diversity of uses of such services, the target audience, available resources, technical innovations, organisational policies and relevant definitions of 'accessibility'. The article refers to a series of author-focussed approaches to accessibility through which the authors and others have struggled to find ways to promote accessibility for people with disabilities. These approaches depend upon the resource author's determination of the anticipated users' needs and their provision. Through approaches labelled as 1.0, 2.0 and 3.0, the authors have widened their focus to account for contexts and individual differences in target audiences. Now, the authors want to recognise the role of users in determining their engagement with resources (including services). To distinguish this new approach, the term 'adaptability' has been used to replace 'accessibility'; new definitions of accessibility have been adopted, and the authors have reviewed their previous work to clarify how it is relevant to the new approach. Accessibility 1.0 is here characterised as a technical approach in which authors are told how to construct resources for a broadly defined audience. This is known as universal design. Accessibility 2.0 was introduced to point to the need to account for the context in which resources would be used, to help overcome inadequacies identified in the purely technical approach. Accessibility 3.0 moved the focus on users from a homogenised universal definition to recognition of the idiosyncratic needs and preferences of individuals and to cater for them. All of

  5. Web-based interventions in nursing.

    Science.gov (United States)

    Im, Eun-Ok; Chang, Sun Ju

    2013-02-01

    With recent advances in computer and Internet technologies and high funding priority on technological aspects of nursing research, researchers at the field level began to develop, use, and test various types of Web-based interventions. Despite high potential impacts of Web-based interventions, little is still known about Web-based interventions in nursing. In this article, to identify strengths and weaknesses of Web-based nursing interventions, a literature review was conducted using multiple databases with combined keywords of "online," "Internet" or "Web," "intervention," and "nursing." A total of 95 articles were retrieved through the databases and sorted by research topics. These articles were then analyzed to identify strengths and weaknesses of Web-based interventions in nursing. A strength of the Web-based interventions was their coverage of various content areas. In addition, many of them were theory-driven. They had advantages in their flexibility and comfort. They could provide consistency in interventions and require less cost in the intervention implementation. However, Web-based intervention studies had selected participants. They lacked controllability and had high dropouts. They required technical expertise and high development costs. Based on these findings, directions for future Web-based intervention research were provided.

  6. Las bibliotecas nacionales iberoamericanas en la web 2.0

    Directory of Open Access Journals (Sweden)

    Jorge Moisés Kroll do Prado

    2014-04-01

    Full Text Available http://dx.doi.org/10.5007/1518-2924.2014v19n39p133 Social networks were born out of the Web 2.0 environment and are part of our daily lives. All professional areas, especially those related to information and communication (journalism, advertising, audiovisual communication, library and information science…), design strategies aimed at becoming part of this conglomerate in which senders, receivers and messages converge. This research presents the results of a study that analyzed the presence in the Web 2.0 of the Ibero-American national libraries belonging to ABINIA. The aim is to understand how these libraries are using the most relevant social networks, in terms of users and penetration in the Ibero-American countries. The study analyzes the profiles that the national libraries have on Facebook, Twitter, Flickr and Youtube, as well as the content they disseminate through them and the participation they encourage among their users.

  7. UrbanWeb: a Platform for Mobile, Context-aware Web Services

    DEFF Research Database (Denmark)

    Hansen, Frank Allan; Grønbæk, Kaj

    2011-01-01

    Faster Internet connections on the mobile Internet and new advanced mobile terminals make it possible to use Web 2.0 applications and services beyond the desktop wherever and whenever you want. However, even though some services may scale in their current form to the mobile Internet, others will very much benefit from being informed about the user’s context and tailored to the user’s location or the activities the user is engaged in. In this article we focus on the definition of context and context-awareness for mobile Web 2.0 services and we present a framework, UrbanWeb, which has been designed for this purpose. We describe how to derive the user’s context from sensors in today’s mobile phones, ranging from GPS data to 2D visual barcodes and manual entry of context information, and how to utilize this information in Web applications. Finally, a number of applications built with the framework are presented.

  8. Web-Based Live Speech-Driven Lip-Sync

    OpenAIRE

    Llorach, Gerard; Evans, Alun; Blat, Josep; Grimm, Giso; Hohmann, Volker

    2016-01-01

    Virtual characters are an integral part of many games and virtual worlds. The ability to accurately synchronize lip movement to audio speech is an important aspect in the believability of the character. In this paper we propose a simple rule-based lip-syncing algorithm for virtual agents using the web browser. It works in real-time with live input, unlike most current lip-syncing proposals, which may require considerable amounts of computation, expertise and time to set up. Our method gen...

  9. Web party effect: a cocktail party effect in the web environment

    OpenAIRE

    Sara Rigutti; Carlo Fantoni; Walter Gerbino

    2015-01-01

    In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of ...

  10. Designing usable web forms- Empirical evaluation of web form improvement guidelines

    DEFF Research Database (Denmark)

    Seckler, Mirjam; Heinz, Silvia; Bargas-Avila, Javier A.

    2014-01-01

    This study reports a controlled eye tracking experiment (N = 65) that shows the combined effectiveness of 20 guidelines to improve interactive online forms when applied to forms found on real company websites. Results indicate that improved web forms lead to faster completion times, fewer form submission trials, and fewer eye movements. Data from subjective questionnaires and interviews further show increased user satisfaction. Overall, our findings highlight the importance for web designers to improve their web forms using UX guidelines....

  11. SIP: A Web-Based Astronomical Image Processing Program

    Science.gov (United States)

    Simonetti, J. H.

    1999-12-01

    I have written an astronomical image processing and analysis program designed to run over the internet in a Java-compatible web browser. The program, Sky Image Processor (SIP), is accessible at the SIP webpage (http://www.phys.vt.edu/SIP). Since nothing is installed on the user's machine, there is no need to download upgrades; the latest version of the program is always instantly available. Furthermore, the Java programming language is designed to work on any computer platform (any machine and operating system). The program could be used with students in web-based instruction or in a computer laboratory setting; it may also be of use in some research or outreach applications. While SIP is similar to other image processing programs, it is unique in some important respects. For example, SIP can load images from the user's machine or from the Web. An instructor can put images on a web server for students to load and analyze on their own personal computer. Or, the instructor can inform the students of images to load from any other web server. Furthermore, since SIP was written with students in mind, the philosophy is to present the user with the most basic tools necessary to process and analyze astronomical images. Images can be combined (by addition, subtraction, multiplication, or division), multiplied by a constant, smoothed, cropped, flipped, rotated, and so on. Statistics can be gathered for pixels within a box drawn by the user. Basic tools are available for gathering data from an image which can be used for performing simple differential photometry, or astrometry. Therefore, students can learn how astronomical image processing works. Since SIP is not part of a commercial CCD camera package, the program is written to handle the most common denominator image file, the FITS format.
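
    The operations listed above (image combination, scaling by a constant, cropping, flipping, and box statistics on FITS data) are straightforward to sketch outside the applet. The snippet below is only an illustration of that arithmetic using astropy and numpy, not part of SIP itself, and the file names are hypothetical.

        import numpy as np
        from astropy.io import fits

        # Load two frames of the same field (hypothetical FITS files).
        image = fits.getdata("target.fits").astype(float)
        dark = fits.getdata("dark.fits").astype(float)

        calibrated = (image - dark) * 1.5        # subtraction, then scaling by a constant
        flipped = np.flipud(calibrated)          # vertical flip
        box = calibrated[100:200, 150:250]       # crop to a user-drawn box

        # Statistics for the pixels inside the box, a basis for simple differential photometry.
        print(box.mean(), box.std(), box.max())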

  12. River food webs: an integrative approach to bottom-up flow webs, top-down impact webs, and trophic position.

    Science.gov (United States)

    Benke, Arthur C

    2018-03-31

    The majority of food web studies are based on connectivity, top-down impacts, bottom-up flows, or trophic position (TP), and ecologists have argued for decades which is best. Rarely have any two been considered simultaneously. The present study uses a procedure that integrates the last three approaches based on taxon-specific secondary production and gut analyses. Ingestion flows are quantified to create a flow web and the same data are used to quantify TP for all taxa. An individual predator's impacts also are estimated using the ratio of its ingestion (I) of each prey to prey production (P) to create an I/P web. This procedure was applied to 41 invertebrate taxa inhabiting submerged woody habitat in a southeastern U.S. river. A complex flow web starting with five basal food resources had 462 flows >1 mg·m⁻²·yr⁻¹, providing far more information than a connectivity web. Total flows from basal resources to primary consumers/omnivores were dominated by allochthonous amorphous detritus and ranged from 1 to >50,000 mg·m⁻²·yr⁻¹. Most predator-prey flows were much lower (<1,000 mg·m⁻²·yr⁻¹). The I/P web showed that 83% of individual predator impacts were weak (90%). Quantitative estimates of TP ranged from 2 to 3.7, contrasting sharply with seven integer-based trophic levels based on longest feeding chain. Traditional omnivores (TP = 2.4-2.9) played an important role by consuming more prey and exerting higher impacts on primary consumers than strict predators (TP ≥ 3). This study illustrates how simultaneous quantification of flow pathways, predator impacts, and TP together provide an integrated characterization of natural food webs. © 2018 by the Ecological Society of America.
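
    The trophic-position part of this integration can be made concrete with a small worked example. Under the usual definition, TP_i = 1 + sum_j p_ij * TP_j, where p_ij is the fraction of consumer i's total ingestion supplied by food j and basal resources are fixed at TP = 1. The sketch below solves that linear system for an invented five-node flow matrix; the taxa and flow values are illustrative, not the study's data.

        import numpy as np

        foods = ["amorphous detritus", "diatoms", "chironomid", "caddisfly", "hellgrammite"]

        # flows[i, j] = ingestion of food j by consumer i (mg per m^2 per yr); basal rows are zero.
        flows = np.array([
            [0,     0,   0,   0, 0],
            [0,     0,   0,   0, 0],
            [4000, 500,   0,   0, 0],
            [1500, 800,  60,   0, 0],
            [0,      0, 300, 120, 0],
        ], dtype=float)

        # Diet proportions p_ij; rows with no ingestion (basal resources) stay all zero.
        row_totals = flows.sum(axis=1, keepdims=True)
        diet = np.divide(flows, row_totals, out=np.zeros_like(flows), where=row_totals > 0)

        # Solve (I - diet) * TP = 1, which gives TP = 1 for basal taxa and
        # TP_i = 1 + sum_j p_ij * TP_j for consumers.
        tp = np.linalg.solve(np.eye(len(foods)) - diet, np.ones(len(foods)))
        for name, value in zip(foods, tp):
            print(f"{name}: TP = {value:.2f}")

    With these toy flows the omnivorous caddisfly lands near TP = 2.0 while the strict predator reaches about TP = 3.0, mirroring the continuous (non-integer) trophic positions described above.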

  13. Practical web development

    CERN Document Server

    Wellens, Paul

    2015-01-01

    This book is perfect for beginners who want to get started and learn the web development basics, but also offers experienced developers a web development roadmap that will help them to extend their capabilities.

  14. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling

    Science.gov (United States)

    Devi, R. Suganya; Manjula, D.; Siddharth, R. K.

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web which has to be further processed for future use, thereby increasing the overload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of Depth First Search Algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and its metadata such as title, keywords, and description are extracted. This content is very essential for any type of analyser work to be carried on the Big Data obtained as a result of Web Crawling. PMID:26137592
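
    The two steps described in the abstract, a depth-limited depth-first traversal of hyperlinks followed by extraction of the title, keywords, and description metadata, can be sketched with the Python standard library alone. This is not the authors' implementation; the seed URL, depth limit, and timeout are assumptions for illustration.

        from html.parser import HTMLParser
        from urllib.parse import urljoin
        from urllib.request import urlopen

        class PageParser(HTMLParser):
            """Collects hyperlinks plus title/keywords/description metadata from one page."""
            def __init__(self):
                super().__init__()
                self.links, self.meta, self.title = [], {}, ""
                self._in_title = False

            def handle_starttag(self, tag, attrs):
                attrs = dict(attrs)
                if tag == "a" and attrs.get("href"):
                    self.links.append(attrs["href"])
                elif tag == "meta" and attrs.get("name") in ("keywords", "description"):
                    self.meta[attrs["name"]] = attrs.get("content", "")
                elif tag == "title":
                    self._in_title = True

            def handle_data(self, data):
                if self._in_title:
                    self.title += data

            def handle_endtag(self, tag):
                if tag == "title":
                    self._in_title = False

        def crawl(url, depth=2, seen=None):
            """Depth-first crawl that records each visited page's metadata."""
            seen = seen if seen is not None else set()
            if depth == 0 or url in seen:
                return {}
            seen.add(url)
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
            except OSError:
                return {}
            parser = PageParser()
            parser.feed(html)
            results = {url: {"title": parser.title.strip(), **parser.meta}}
            for link in parser.links:
                results.update(crawl(urljoin(url, link), depth - 1, seen))
            return results

        # Example with a hypothetical seed: crawl("https://example.org/", depth=2)

    A production crawler would additionally respect robots.txt and rate-limit its requests; those concerns are omitted here to keep the traversal and metadata extraction visible.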

  15. Architecture and the Web.

    Science.gov (United States)

    Money, William H.

    Instructors should be concerned with how to incorporate the World Wide Web into an information systems (IS) curriculum organized across three areas of knowledge: information technology, organizational and management concepts, and theory and development of systems. The Web fits broadly into the information technology component. For the Web to be…

  16. La calidad electrónica en sitios web corportivos. Propuesta de medición

    OpenAIRE

    González López, Óscar Rodrigo

    2015-01-01

    Today, the corporate Web site is one of the fundamental pillars of a company's communication strategy, since this instrument concentrates the core of the dissemination of information about the organisation and the products it sells. Corporate Web sites have established themselves as indispensable channels for disseminating information to all kinds of audiences: customers, suppliers, investors, partners and employees, and have grown substantially…

  17. Programming the semantic web

    CERN Document Server

    Segaran, Toby; Taylor, Jamie

    2009-01-01

    With this book, the promise of the Semantic Web -- in which machines can find, share, and combine data on the Web -- is not just a technical possibility, but a practical reality. Programming the Semantic Web demonstrates several ways to implement semantic web applications, using current and emerging standards and technologies. You'll learn how to incorporate existing data sources into semantically aware applications and publish rich semantic data. Each chapter walks you through a single piece of semantic technology and explains how you can use it to solve real problems. Whether you're writing

  18. RESTful Web Services Cookbook

    CERN Document Server

    Allamaraju, Subbu

    2010-01-01

    While the REST design philosophy has captured the imagination of web and enterprise developers alike, using this approach to develop real web services is no picnic. This cookbook includes more than 100 recipes to help you take advantage of REST, HTTP, and the infrastructure of the Web. You'll learn ways to design RESTful web services for client and server applications that meet performance, scalability, reliability, and security goals, no matter what programming language and development framework you use. Each recipe includes one or two problem statements, with easy-to-follow, step-by-step i

  19. SPARQLGraph: a web-based platform for graphically querying biological Semantic Web databases.

    Science.gov (United States)

    Schweiger, Dominik; Trajanoski, Zlatko; Pabinger, Stephan

    2014-08-15

    Semantic Web has established itself as a framework for using and sharing data across applications and database boundaries. Here, we present a web-based platform for querying biological Semantic Web databases in a graphical way. SPARQLGraph offers an intuitive drag & drop query builder, which converts the visual graph into a query and executes it on a public endpoint. The tool integrates several publicly available Semantic Web databases, including the databases of the just recently released EBI RDF platform. Furthermore, it provides several predefined template queries for answering biological questions. Users can easily create and save new query graphs, which can also be shared with other researchers. This new graphical way of creating queries for biological Semantic Web databases considerably facilitates usability as it removes the requirement of knowing specific query languages and database structures. The system is freely available at http://sparqlgraph.i-med.ac.at.
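
    The queries that SPARQLGraph builds graphically are ordinary SPARQL and can also be issued directly from a script. A minimal sketch with the SPARQLWrapper package follows; the endpoint URL is a placeholder and the UniProt-style predicates are only an example of the kind of biological vocabulary such endpoints expose, not the tool's own configuration.

        from SPARQLWrapper import SPARQLWrapper, JSON

        endpoint = SPARQLWrapper("https://example.org/sparql")   # hypothetical public endpoint
        endpoint.setQuery("""
            PREFIX up: <http://purl.uniprot.org/core/>
            SELECT ?protein ?name WHERE {
                ?protein a up:Protein ;
                         up:recommendedName/up:fullName ?name .
            } LIMIT 10
        """)
        endpoint.setReturnFormat(JSON)

        # Print each binding returned by the endpoint.
        for row in endpoint.query().convert()["results"]["bindings"]:
            print(row["protein"]["value"], row["name"]["value"])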

  20. Drexel at TREC 2014 Federated Web Search Track

    Science.gov (United States)

    2014-11-01

    of its input RS results. Federated Web Search is the task of searching multiple search engines simultaneously and combining their … or distributed properly [5]. The goal of RS (resource selection) is then, for a given query, to select only the most promising search engines from all those available. Most … result pages of 149 search engines. 4000 queries are used in building the sample set. As a part of the Vertical Selection task, search engines are

  1. Chemistry WebBook

    Science.gov (United States)

    SRD 69 NIST Chemistry WebBook (Web, free access)   The NIST Chemistry WebBook contains: Thermochemical data for over 7000 organic and small inorganic compounds; thermochemistry data for over 8000 reactions; IR spectra for over 16,000 compounds; mass spectra for over 33,000 compounds; UV/Vis spectra for over 1600 compounds; electronic and vibrational spectra for over 5000 compounds; constants of diatomic molecules(spectroscopic data) for over 600 compounds; ion energetics data for over 16,000 compounds; thermophysical property data for 74 fluids.

  2. PERBANDINGAN ANTARA “BIG” WEB SERVICE DENGAN RESTFUL WEB SERVICE UNTUK INTEGRASI DATA BERFORMAT GML

    Directory of Open Access Journals (Sweden)

    Adi Nugroho

    2012-01-01

    Full Text Available Web services with Java, namely SOAP (JAX-WS, the Java API for XML Web Services) and RESTful web services (JAX-RS, the Java API for RESTful Web Services), are now technologies competing with each other for use in integrating data residing in different systems. Both web service technologies, of course, have advantages and disadvantages. In this paper, we compare the two Java web service technologies in relation to the development of a GIS (Geographic Information System) application that integrates data formatted in GML (Geography Markup Language) and stored in an XML (eXtensible Markup Language) database system.
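
    On the RESTful (JAX-RS) side of this comparison the client contract is simply an HTTP GET that returns a GML payload, which any consumer can parse as XML. The sketch below shows that client view in Python with the standard library; the service URL and the use of gml:coordinates elements are illustrative assumptions, not details from the paper.

        import urllib.request
        import xml.etree.ElementTree as ET

        GML_NS = {"gml": "http://www.opengis.net/gml"}

        # Hypothetical resource exposed by a RESTful GIS service.
        with urllib.request.urlopen("https://example.org/api/parcels/42.gml") as resp:
            tree = ET.parse(resp)

        # Extract the coordinate strings carried by gml:coordinates elements.
        for coords in tree.getroot().findall(".//gml:coordinates", GML_NS):
            print(coords.text)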

  3. Using Technology of .Net Web Services in the Area of Automation

    OpenAIRE

    Martin Hnik; Marek Babiuch

    2009-01-01

    This work deals with a technology for data exchange XML Web Services and its application to specific tasks. One of the applications created allows you to monitor and control the real thermal process through a number of client devices, independent of the operating system, the type or their location. The thermal process can be controlled, for example, by another process, a website or a mobile phone. The system is designed from its base and contains three main parts. The hardware part consists f...

  4. Creating adaptive web recommendation system based on user behavior

    Science.gov (United States)

    Walek, Bogdan

    2018-01-01

    The paper proposes an adaptive web recommendation system based on user behavior. The proposed system uses an expert system to evaluate and recommend suitable items of content. Relevant items are subsequently evaluated and filtered based on the history of visited items and the user's preferred categories of items. The main parts of the proposed system are presented and described. The proposed recommendation system is verified on a specific example.
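
    A minimal sketch of the filtering step described above: candidate items are scored by a small rule base and then reordered using the visit history and the user's preferred categories. The items, categories, and scoring rules are invented for illustration and do not reproduce the paper's expert system.

        from dataclasses import dataclass

        @dataclass
        class Item:
            title: str
            category: str
            popularity: float   # 0..1

        def recommend(items, visited, preferred_categories, limit=3):
            def score(item):
                s = item.popularity
                if item.category in preferred_categories:
                    s += 0.5                      # rule: boost preferred categories
                if item.title in visited:
                    s -= 1.0                      # rule: demote already-visited items
                return s
            return sorted(items, key=score, reverse=True)[:limit]

        catalogue = [
            Item("Intro to SPARQL", "semantic-web", 0.7),
            Item("CSS Grid basics", "front-end", 0.9),
            Item("OWL reasoning", "semantic-web", 0.4),
            Item("HTTP caching", "back-end", 0.6),
        ]
        print(recommend(catalogue, visited={"CSS Grid basics"},
                        preferred_categories={"semantic-web"}))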

  5. MR imaging of carotid webs

    International Nuclear Information System (INIS)

    Boesen, Mari E.; Eswaradass, Prasanna Venkatesan; Singh, Dilip; Mitha, Alim P.; Menon, Bijoy K.; Goyal, Mayank; Frayne, Richard

    2017-01-01

    We propose a magnetic resonance (MR) imaging protocol for the characterization of carotid web morphology, composition, and vessel wall dynamics. The purpose of this case series was to determine the feasibility of imaging carotid webs with MR imaging. Five patients diagnosed with carotid web on CT angiography were recruited to undergo a 30-min MR imaging session. MR angiography (MRA) images of the carotid artery bifurcation were acquired. Multi-contrast fast spin echo (FSE) images were acquired axially about the level of the carotid web. Two types of cardiac phase resolved sequences (cineFSE and cine phase contrast) were acquired to visualize the elasticity of the vessel wall affected by the web. Carotid webs were identified on MRA in 5/5 (100%) patients. Multi-contrast FSE revealed vessel wall thickening and cineFSE demonstrated regional changes in distensibility surrounding the webs in these patients. Our MR imaging protocol enables an in-depth evaluation of patients with carotid webs: morphology (by MRA), composition (by multi-contrast FSE), and wall dynamics (by cineFSE). (orig.)

  6. MR imaging of carotid webs

    Energy Technology Data Exchange (ETDEWEB)

    Boesen, Mari E. [University of Calgary, Department of Biomedical Engineering, Calgary (Canada); Foothills Medical Centre, Seaman Family MR Research Centre, Calgary (Canada); Eswaradass, Prasanna Venkatesan; Singh, Dilip; Mitha, Alim P.; Menon, Bijoy K. [University of Calgary, Department of Clinical Neurosciences, Calgary (Canada); Foothills Medical Centre, Calgary Stroke Program, Calgary (Canada); Goyal, Mayank [Foothills Medical Centre, Calgary Stroke Program, Calgary (Canada); University of Calgary, Department of Radiology, Calgary (Canada); Frayne, Richard [Foothills Medical Centre, Seaman Family MR Research Centre, Calgary (Canada); University of Calgary, Hotchkiss Brain Institute, Calgary (Canada)

    2017-04-15

    We propose a magnetic resonance (MR) imaging protocol for the characterization of carotid web morphology, composition, and vessel wall dynamics. The purpose of this case series was to determine the feasibility of imaging carotid webs with MR imaging. Five patients diagnosed with carotid web on CT angiography were recruited to undergo a 30-min MR imaging session. MR angiography (MRA) images of the carotid artery bifurcation were acquired. Multi-contrast fast spin echo (FSE) images were acquired axially about the level of the carotid web. Two types of cardiac phase resolved sequences (cineFSE and cine phase contrast) were acquired to visualize the elasticity of the vessel wall affected by the web. Carotid webs were identified on MRA in 5/5 (100%) patients. Multi-contrast FSE revealed vessel wall thickening and cineFSE demonstrated regional changes in distensibility surrounding the webs in these patients. Our MR imaging protocol enables an in-depth evaluation of patients with carotid webs: morphology (by MRA), composition (by multi-contrast FSE), and wall dynamics (by cineFSE). (orig.)

  7. A survey on web modeling approaches for ubiquitous web applications

    NARCIS (Netherlands)

    Schwinger, W.; Retschitzegger, W.; Schauerhuber, A.; Kappel, G.; Wimmer, M.; Pröll, B.; Cachero Castro, C.; Casteleyn, S.; De Troyer, O.; Fraternali, P.; Garrigos, I.; Garzotto, F.; Ginige, A.; Houben, G.J.P.M.; Koch, N.; Moreno, N.; Pastor, O.; Paolini, P.; Pelechano Ferragud, V.; Rossi, G.; Schwabe, D.; Tisi, M.; Vallecillo, A.; Sluijs, van der K.A.M.; Zhang, G.

    2008-01-01

    Purpose – Ubiquitous web applications (UWA) are a new type of web applications which are accessed in various contexts, i.e. through different devices, by users with various interests, at anytime from anyplace around the globe. For such full-fledged, complex software systems, a methodologically sound

  8. La comunicación web de la RSC- El caso de las empresas cárnicas catalanas

    Directory of Open Access Journals (Sweden)

    Alejandra Aramayo García

    2014-10-01

    Full Text Available In recent years, good CSR (Corporate Social Responsibility) practices focused on corporate sustainability have become increasingly relevant. Communication is an intrinsic part of CSR and enables dialogue with stakeholders, but it is not always efficient. The objective of this study is to measure and evaluate the management of web-based CSR communication by companies in the Catalan meat sector, by applying the significant variables of web analysis to the CSR information available on corporate websites. CSR Communication in the Web – The Catalan meat companies case. Abstract: In recent years, good CSR practices focused on corporate sustainability have become increasingly important. Communication is an intrinsic part of CSR, enabling dialogue with stakeholders, but it is not always efficient. The objective of this study is to measure and evaluate the web communication management of CSR performed by Catalan meat companies, by analyzing the significant web variables applicable to the CSR information available on the corporate websites. Keywords: Corporate communication; Corporate Social Responsibility; Meat Industry; Catalonia; Website; stakeholders.

  9. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges...... that call for inquiries into the theoretical foundation of bibliographic classification theory....

  10. Maintenance-Ready Web Application Development

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2016-01-01

    Full Text Available The current paper tackles the subject of developing maintenance-ready web applications. Maintenance is presented as a core stage in a web application’s lifecycle. The concept of maintenance-ready is defined in the context of web application development. Web application maintenance task types are enunciated and suitable task types are identified for further analysis. The research hypothesis is formulated based on a direct link between tackling maintenance in the development stage and reducing overall maintenance costs. A live maintenance-ready web application is presented and maintenance-related aspects are highlighted. The web application’s features that render it maintenance-ready are emphasized. The costs of designing and building the web application to be maintenance-ready are disclosed. The savings in maintenance development effort facilitated by maintenance-ready features are also disclosed. Maintenance data is collected from 40 projects implemented by a web development company. The homogeneity and diversity of the collected data are evaluated. A data sample is presented and the size and comprehensive nature of the entire dataset is depicted. The research hypotheses are validated and conclusions are formulated on the topic of developing maintenance-ready web applications. The limits of the research process which represented the basis for the current paper are enunciated. Future research topics are submitted for debate.

  11. Induction of cellular accessibility and inaccessibility and suppression and potentiation of cell death in oat attacked by ¤Blumeria graminis¤ f.sp. ¤avenae¤

    DEFF Research Database (Denmark)

    Carver, T.L.W.; Lyngkjær, M.F.; Neyron, L.

    1999-01-01

    First-formed (seedling) and later-formed leaves of oat cvs Selma (susceptible) and Maldwyn (adult plant resistance under complex genetic control) were subjected to a double inoculation procedure ('inducer' followed by 'challenger') with conidia of Blumeria graminis (DC.) Speer (Syn. Erysiphe graminis DC.). Successful penetration and haustorium formation by the inducer rendered living epidermal cells highly accessible to later challenge attack, as judged by increased frequency of challenge penetration success compared to controls. Conversely, where inducer attack on living epidermal cells failed … , suggesting that induced changes in (in)accessibility may be a common consequence of B. graminis attack in cereals. As expected, in Maldwyn, cell death was a consistent but infrequent response to attack (5-20% of attacks caused cell death in controls). Here, the successful formation of an inducer haustorium …

  12. Personalized Metaheuristic Clustering Onto Web Documents

    Institute of Scientific and Technical Information of China (English)

    Wookey Lee

    2004-01-01

    Optimal clustering for web documents is known to be a complicated combinatorial optimization problem and it is hard to develop a generally applicable optimal algorithm. An accelerated simulated annealing algorithm is developed for automatic web document classification. The web document classification problem is addressed as the problem of best describing a match between a web query and a hypothesized web object. The normalized term frequency and inverse document frequency coefficient is used as a measure of the match. Test beds are generated on-line during the search by transforming model web sites. As a result, web sites can be clustered optimally in terms of keyword vectors of corresponding web documents.
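
    A bare-bones sketch of simulated annealing applied to clustering documents represented as TF-IDF-style keyword vectors, in the spirit of the approach described above; it is plain simulated annealing, not the accelerated variant, and the vectors are toy data.

        import math
        import random

        docs = {  # toy keyword vectors (already TF-IDF weighted, hypothetical values)
            "d1": [0.9, 0.1, 0.0], "d2": [0.8, 0.2, 0.1],
            "d3": [0.1, 0.9, 0.2], "d4": [0.0, 0.8, 0.3],
        }
        K = 2  # number of clusters

        def cost(assign):
            """Sum of squared distances of each document to its cluster centroid."""
            total = 0.0
            for k in range(K):
                members = [docs[d] for d, c in assign.items() if c == k]
                if not members:
                    continue
                centroid = [sum(col) / len(members) for col in zip(*members)]
                total += sum(sum((x - m) ** 2 for x, m in zip(v, centroid)) for v in members)
            return total

        random.seed(0)
        assign = {d: random.randrange(K) for d in docs}
        temperature = 1.0
        while temperature > 1e-3:
            candidate = dict(assign)
            candidate[random.choice(list(docs))] = random.randrange(K)
            delta = cost(candidate) - cost(assign)
            if delta < 0 or random.random() < math.exp(-delta / temperature):
                assign = candidate                 # accept better or, occasionally, worse moves
            temperature *= 0.95                    # geometric cooling schedule
        print(assign)

    The occasional acceptance of worse assignments is what lets the search escape local optima; the cooling schedule controls how quickly that tolerance is withdrawn.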

  13. Designing a WebQuest

    Science.gov (United States)

    Salsovic, Annette R.

    2009-01-01

    A WebQuest is an inquiry-based lesson plan that uses the Internet. This article explains what a WebQuest is, shows how to create one, and provides an example. When engaged in a WebQuest, students use technology to experience cooperative learning and discovery learning while honing their research, writing, and presentation skills. It has been found…

  14. Web-Based Inquiry Learning: Facilitating Thoughtful Literacy with WebQuests

    Science.gov (United States)

    Ikpeze, Chinwe H.; Boyd, Fenice B.

    2007-01-01

    An action research study investigated how the multiple tasks found in WebQuests facilitate fifth-grade students' literacy skills and higher order thinking. Findings indicate that WebQuests are most successful when activities are carefully selected and systematically delivered. Implications for teaching include the necessity for adequate planning,…

  15. Work of the Web Weavers: Web Development in Academic Libraries

    Science.gov (United States)

    Bundza, Maira; Vander Meer, Patricia Fravel; Perez-Stable, Maria A.

    2009-01-01

    Although the library's Web site has become a standard tool for seeking information and conducting research in academic institutions, there are a variety of ways libraries approach the often challenging--and sometimes daunting--process of Web site development and maintenance. Three librarians at Western Michigan University explored issues related…

  16. Web 2.0 i undervisningen

    DEFF Research Database (Denmark)

    Liburd, Janne J.; Christensen, Inger-Marie F.

    2011-01-01

    A thematic booklet on Web 2.0 that conveys knowledge about, and inspiration for, working with Web 2.0 technologies in higher education. The booklet introduces social media and Web 2.0, and accounts for theoretically grounded learning processes with Web 2.0 and how these can be incorporated into teaching programmes… The booklet also presents a method for designing learning activities with Web 2.0, and gives a number of examples of concrete courses…

  17. Historical Quantitative Reasoning on the Web

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Ashkpour, A.

    2016-01-01

    The Semantic Web is an extension of the Web through standards by the World Wide Web Consortium (W3C) [4]. These standards promote common data formats and exchange protocols on the Web, most fundamentally the Resource Description Framework (RDF). Its ultimate goal is to make the Web a suitable data

  18. The Creative Web.

    Science.gov (United States)

    Yudess, Jo

    2003-01-01

    This article lists the Web sites of 12 international not-for-profit creativity associations designed to trigger more creative thought and research possibilities. Along with Web addresses, the entries include telephone contact information and a brief description of the organization. (CR)

  19. Engineering Adaptive Web Applications

    DEFF Research Database (Denmark)

    Dolog, Peter

    2007-01-01

    Information and services on the web are accessible for everyone. Users of the web differ in their background, culture, political and social environment, interests and so on. Ambient intelligence was envisioned as a concept for systems which are able to adapt to user actions and needs. With the growing amount of information and services, web applications become natural candidates to adopt the concepts of ambient intelligence. Such applications can deal with diverse user intentions and actions based on the user profile and can suggest the combination of information content and services which suit the user profile the most. This paper summarizes the domain engineering framework for such adaptive web applications. The framework provides guidelines to develop adaptive web applications as members of a family. It suggests how to utilize the design artifacts as knowledge which can be used …

  20. Selection and Cataloging of Adult Pornography Web Sites for Academic Libraries

    Science.gov (United States)

    Dilevko, Juris; Gottlieb, Lisa

    2004-01-01

    Pornography has become part of mainstream culture. As such, it has become a subject of academic research, and this, in turn, has implications for university libraries. Focusing on adult Internet pornography, this study suggests that academic libraries should provide access to adult pornographic Web sites by including them in their online catalogs.

  1. Physician training protocol within the WEB Intrasaccular Therapy (WEB-IT) study.

    Science.gov (United States)

    Arthur, Adam; Hoit, Daniel; Coon, Alexander; Delgado Almandoz, Josser E; Elijovich, Lucas; Cekirge, Saruhan; Fiorella, David

    2018-05-01

    The WEB Intra-saccular Therapy (WEB-IT) trial is an investigational device exemption study to demonstrate the safety and effectiveness of the WEB device for the treatment of wide-neck bifurcation aneurysms. The neurovascular replicator (Vascular Simulations, Stony Brook, New York, USA) creates a physical environment that replicates patient-specific neurovascular anatomy and hemodynamic physiology, and allows devices to be implanted under fluoroscopic guidance. To report the results of a unique neurovascular replicator-based training program, which was incorporated into the WEB-IT study to optimize technical performance and patient safety. US investigators participated in a new training program that incorporated full surgical rehearsals on a neurovascular replicator. No roll-in cases were permitted within the trial. Custom replicas of patient-specific neurovascular anatomy were created for the initial cases treated at each center, as well as for cases expected to be challenging. On-site surgical rehearsals were performed before these procedures. A total of 48 participating investigators at 25 US centers trained using the replicator. Sessions included centralized introductory training, on-site training, and patient-specific full surgical rehearsal. Fluoroscopy and procedure times in the WEB-IT study were not significantly different from those seen in two European trials where participating physicians had significant WEB procedure experience before study initiation. A new program of neurovascular-replicator-based physician training was employed within the WEB-IT study. This represents a new methodology for education and training that may be an effective means to optimize technical success and patient safety during the introduction of a new technology. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. Stochastic analysis of web page ranking

    NARCIS (Netherlands)

    Volkovich, Y.

    2009-01-01

    Today, the study of the World Wide Web is one of the most challenging subjects. In this work we consider the Web from a probabilistic point of view. We analyze the relations between various characteristics of the Web. In particular, we are interested in the Web properties that affect the Web page

  3. Life Cycle Project Plan Outline: Web Sites and Web-based Applications

    Science.gov (United States)

    This tool is a guideline for planning and checking for 508 compliance on web sites and web based applications. Determine which EIT components are covered or excepted, which 508 standards and requirements apply, and how to implement them.

  4. Pembuatan Aplikasi Web Manajemen Laundry dan Integrasi Data dengan Web Service

    Directory of Open Access Journals (Sweden)

    Refika Khoirunnisa

    2016-01-01

    Full Text Available Until now, many companies in the laundry service business have still recorded transactions manually, for example in books, so the data are not integrated in real time. This study therefore set out to design a computerized system that simplifies the recording and processing of a laundry's financial data. The Laundry Management Web Application was built with the PHP, HTML, CSS and JavaScript programming languages and a MySQL database for data storage. The application is integrated with another application through a technology called a web service. The Laundry Management Web Application was developed using the RAD (Rapid Application Development) method, which consists of requirements planning, design, implementation and testing phases. The results show that the Laundry Management Web Application provides features for accurate, real-time data recording and processing. Its main features include the processing of transaction data, expenses, and profit/loss reports. To support the main features, there are supporting features for managing customer data and application user data. Based on black-box testing, all menu functions of the web application worked successfully and as required.

  5. Web Literacy, Web Literacies or Just Literacies on the Web? Reflections from a Study of Personal Homepages.

    Science.gov (United States)

    Karlsson, Anna-Malin

    2002-01-01

    Discusses the question of whether there is such a thing as web literacy. Perspectives from media studies, literacy studies, and the study of multimodal texts are used to find the main contextual parameters involved in what might be classed as web literacy. The parameters suggested are material conditions, domain, power or ideology, and semiotic…

  6. Dependence of GAMA galaxy halo masses on the cosmic web environment from 100 deg2 of KiDS weak lensing data

    NARCIS (Netherlands)

    Brouwer, Margot M.; Cacciato, Marcello; Dvornik, Andrej; Eardley, Lizzie; Heymans, Catherine; Hoekstra, Henk; Kuijken, Konrad; McNaught-Roberts, Tamsyn; Sifón, Cristóbal; Viola, Massimo; Alpaslan, Mehmet; Bilicki, Maciej; Bland-Hawthorn, Joss; Brough, Sarah; Choi, Ami; Driver, Simon P.; Erben, Thomas; Grado, Aniello; Hildebrandt, Hendrik; Holwerda, Benne W.; Hopkins, Andrew M.; de Jong, Jelte T. A.; Liske, Jochen; Mc Farland, John; Nakajima, Reiko; Napolitano, Nicola R.; Norberg, Peder; Peacock, John A.; Radovich, Mario; Robotham, Aaron S. G.; Schneider, Peter; Sikkema, Gert; van Uitert, Edo; Verdoes Kleijn, Gijs; Valentijn, Edwin A.

    2016-01-01

    Galaxies and their dark matter haloes are part of a complex network of mass structures, collectively called the cosmic web. Using the tidal tensor prescription these structures can be classified into four cosmic environments: voids, sheets, filaments and knots. As the cosmic web may influence the

  7. Finding Web-Based Anxiety Interventions on the World Wide Web: A Scoping Review.

    Science.gov (United States)

    Ashford, Miriam Thiel; Olander, Ellinor K; Ayers, Susan

    2016-06-01

    One relatively new and increasingly popular approach of increasing access to treatment is Web-based intervention programs. The advantage of Web-based approaches is the accessibility, affordability, and anonymity of potentially evidence-based treatment. Despite much research evidence on the effectiveness of Web-based interventions for anxiety found in the literature, little is known about what is publically available for potential consumers on the Web. Our aim was to explore what a consumer searching the Web for Web-based intervention options for anxiety-related issues might find. The objectives were to identify currently publically available Web-based intervention programs for anxiety and to synthesize and review these in terms of (1) website characteristics such as credibility and accessibility; (2) intervention program characteristics such as intervention focus, design, and presentation modes; (3) therapeutic elements employed; and (4) published evidence of efficacy. Web keyword searches were carried out on three major search engines (Google, Bing, and Yahoo-UK platforms). For each search, the first 25 hyperlinks were screened for eligible programs. Included were programs that were designed for anxiety symptoms, currently publically accessible on the Web, had an online component, a structured treatment plan, and were available in English. Data were extracted for website characteristics, program characteristics, therapeutic characteristics, as well as empirical evidence. Programs were also evaluated using a 16-point rating tool. The search resulted in 34 programs that were eligible for review. A wide variety of programs for anxiety, including specific anxiety disorders, and anxiety in combination with stress, depression, or anger were identified and based predominantly on cognitive behavioral therapy techniques. The majority of websites were rated as credible, secure, and free of advertisement. The majority required users to register and/or to pay a program access

  8. Web Dynpro ABAP for practitioners

    CERN Document Server

    Gellert, Ulrich

    2013-01-01

    Web Dynpro ABAP, a NetWeaver web application user interface tool from SAP, enables web programming connected to SAP Systems. The authors' main focus was to create a book based on their own practical experience. Each chapter includes examples which lead through the content step-by-step and enable the reader to gradually explore and grasp the Web Dynpro ABAP process. The authors explain in particular how to design Web Dynpro components, the data binding and interface methods, and the view controller methods. They also describe the other SAP NetWeaver Elements (ABAP Dictionary, Authorization) and

  9. Introduction to the world wide web.

    Science.gov (United States)

    Downes, P K

    2007-05-12

    The World Wide Web used to be nicknamed the 'World Wide Wait'. Now, thanks to high speed broadband connections, browsing the web has become a much more enjoyable and productive activity. Computers need to know where web pages are stored on the Internet, in just the same way as we need to know where someone lives in order to post them a letter. This section explains how the World Wide Web works and how web pages can be viewed using a web browser.

  10. Web traffic and firm performance

    DEFF Research Database (Denmark)

    Farooq, Omar; Aguenaou, Samir

    2013-01-01

    Does the traffic generated by websites of firms signal anything to stock market participants? Does higher web-traffic translate into availability of more information and therefore lower agency problems? And if the answers to the above questions are in the affirmative, does higher web-traffic translate into better firm performance? This paper aims to answer these questions by documenting a positive relationship between the extent of web-traffic and firm performance in the MENA region during 2010. We argue that higher web-traffic lowers the agency problems in firms by disseminating more information to stock market participants. Consequently, lower agency problems translate into better performance. Furthermore, we also show that the agency-reducing role of web-traffic is more pronounced in regimes where the information environment is already bad. For example, our results show stronger impact of web

  11. Students' Satisfaction and Perceived Learning with a Web-based Course

    Directory of Open Access Journals (Sweden)

    Derek Holton

    2003-01-01

    Full Text Available This paper describes a study, which explored students' responses and reactions to a Web-based tertiary statistics course supporting problem-based learning. The study was undertaken among postgraduate students in a Malaysian university. The findings revealed that the majority of the students were satisfied with their learning experience and achieved comparable learning outcomes to students in the face-to-face version of the course. Students appreciated the flexibility of anytime, anywhere learning. The majority of the students were motivated to learn and had adequate technical support to complete the course. Improvement in computer skills was an incidental learning outcome from the course. The student-student and student-teacher communication was satisfactory, but a few students felt isolated learning in the Web environment. These students expressed a need for some face-to-face lectures. While the majority of the students saw value in learning in a problem-based setting, around a third of the students expressed no opinion on, or were dissatisfied with, the problem-based environment. They were satisfied with the group facilitators and learning materials but were unhappy with the group dynamics. Some of the students felt unable to contribute to or learn from the asynchronous Web-based conferences using the problem-based approach. Some of the students were not punctual and were not prepared to take part in the Web-based conferences. The findings suggest a need to explicitly design an organising strategy in asynchronous Web-based conferences using the problem-based approach to aid students in completing the problem-based learning process.

  12. WebEQ: a web-GIS System to collect, display and query data for the management of the earthquake emergency in Central Italy

    Science.gov (United States)

    Carbone, Gianluca; Cosentino, Giuseppe; Pennica, Francesco; Moscatelli, Massimiliano; Stigliano, Francesco

    2017-04-01

    After the strong earthquakes that hit central Italy in recent months, the Center for Seismic Microzonation and its applications (CentroMS) was commissioned by the Italian Department of Civil Protection to conduct the study of seismic microzonation of the territories affected by the earthquake of August 24, 2016. As part of the microzonation activities, IGAG CNR has created WebEQ, a management tool for the data that have been acquired by all participants (i.e., more than twenty research institutes and university departments). The data collection was organized and divided into sub-areas, assigned to working groups with multidisciplinary expertise in geology, geophysics and engineering. WebEQ is a web-GIS system that helps all the subjects involved in the data collection activities, through tools aimed at data uploading and validation, and with a simple GIS interface to display, query and download geographic data. WebEQ is contributing to the creation of a large database containing geographical data, both vector and raster, from various sources and types: regional technical maps; geological and geomorphological maps; data location maps; maps of microzones homogeneous in seismic perspective and seismic microzonation maps; and national strong motion network locations. Data loading is done through simple input masks that ensure consistency with the database structure, avoiding possible errors and helping users to interact with the map through user-friendly tools. All the data are thematized through standardized symbologies and colors (Gruppo di lavoro MS 2008), in order to allow easy interpretation by all users. The data download tools allow data exchange between working groups and enable the scientific community to benefit from the activities. The seismic microzonation activities are still ongoing. WebEQ is enabling easy management of large amounts of data and will form a basis for the development of tools for the management of upcoming seismic emergencies.

  13. Characterizing web heuristics

    NARCIS (Netherlands)

    de Jong, Menno D.T.; van der Geest, Thea

    2000-01-01

    This article is intended to make Web designers more aware of the qualities of heuristics by presenting a framework for analyzing the characteristics of heuristics. The framework is meant to support Web designers in choosing among alternative heuristics. We hope that better knowledge of the

  14. Web Publishing Schedule

    Science.gov (United States)

    Section 207(f)(2) of the E-Gov Act requires federal agencies to develop an inventory of information to be published on their Web sites, establish a schedule for publishing it, make those schedules available for public comment, and post the schedules on the Web site.

  15. Treatment Dropout in Web-Based Cognitive Behavioral Therapy for Patients with Eating Disorders

    NARCIS (Netherlands)

    ter Huurne, E.D.; Postel, Marloes Gerda; de Haan, H.A.; van der Palen, Jacobus Adrianus Maria; de Jong, C.A.

    2017-01-01

    Treatment dropout is an important concern in eating disorder treatments as it has negative implications for patients’ outcome, clinicians’ motivation, and research studies. Our main objective was to conduct an exploratory study on treatment dropout in a two-part web-based cognitive behavioral

  16. WebPASS Explorer (HR Personnel Management)

    Data.gov (United States)

    US Agency for International Development — WebPass Explorer (WebPASS Framework): USAID is partnering with DoS in the implementation of their WebPass Post Personnel (PS) Module. WebPassPS does not replace...

  17. Interactive WebGL-based 3D visualizations for EAST experiment

    International Nuclear Information System (INIS)

    Xia, J.Y.; Xiao, B.J.; Li, Dan; Wang, K.R.

    2016-01-01

    Highlights: • Developing a user-friendly interface to visualize the EAST experimental data and the device is important to scientists and engineers. • The Web3D visualization system is based on HTML5 and WebGL, which runs without the need for plug-ins or third party components. • The interactive WebGL-based 3D visualization system is a web-portal integrating EAST 3D models, experimental data and plasma videos. • The original CAD model was discretized into different layers with different simplification to enable realistic rendering and improve performance. - Abstract: In recent years EAST (Experimental Advanced Superconducting Tokamak) experimental data are being shared and analyzed by an increasing number of international collaborators. Developing a user-friendly interface to visualize the data, meta data and the relevant parts of the device is becoming more and more important to aid scientists and engineers. Compared with the previous virtual EAST system based on VRML/Java3D [1] (Li et al., 2014), a new technology is being adopted to create a 3D visualization system based on HTML5 and WebGL, which runs without the need for plug-ins or third party components. The interactive WebGL-based 3D visualization system is a web-portal integrating EAST 3D models, experimental data and plasma videos. It offers a highly interactive interface allowing scientists to roam inside EAST device and view the complex 3-D structure of the machine. It includes technical details of the device and various diagnostic components, and provides visualization of diagnostic metadata with a direct link to each signal name and its stored data. In order for the quick access to the device 3D model, the original CAD model was discretized into different layers with different simplification. It allows users to search for plasma videos in any experiment and analyze the video frame by frame. In this paper, we present the implementation details to enable realistic rendering and improve performance.

  18. Interactive WebGL-based 3D visualizations for EAST experiment

    Energy Technology Data Exchange (ETDEWEB)

    Xia, J.Y., E-mail: jyxia@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); University of Science and Technology of China, Hefei, Anhui (China); Xiao, B.J. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); University of Science and Technology of China, Hefei, Anhui (China); Li, Dan [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); Wang, K.R. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); University of Science and Technology of China, Hefei, Anhui (China)

    2016-11-15

    Highlights: • Developing a user-friendly interface to visualize the EAST experimental data and the device is important to scientists and engineers. • The Web3D visualization system is based on HTML5 and WebGL, which runs without the need for plug-ins or third party components. • The interactive WebGL-based 3D visualization system is a web-portal integrating EAST 3D models, experimental data and plasma videos. • The original CAD model was discretized into different layers with different simplification to enable realistic rendering and improve performance. - Abstract: In recent years EAST (Experimental Advanced Superconducting Tokamak) experimental data are being shared and analyzed by an increasing number of international collaborators. Developing a user-friendly interface to visualize the data, meta data and the relevant parts of the device is becoming more and more important to aid scientists and engineers. Compared with the previous virtual EAST system based on VRML/Java3D [1] (Li et al., 2014), a new technology is being adopted to create a 3D visualization system based on HTML5 and WebGL, which runs without the need for plug-ins or third party components. The interactive WebGL-based 3D visualization system is a web-portal integrating EAST 3D models, experimental data and plasma videos. It offers a highly interactive interface allowing scientists to roam inside EAST device and view the complex 3-D structure of the machine. It includes technical details of the device and various diagnostic components, and provides visualization of diagnostic metadata with a direct link to each signal name and its stored data. In order for the quick access to the device 3D model, the original CAD model was discretized into different layers with different simplification. It allows users to search for plasma videos in any experiment and analyze the video frame by frame. In this paper, we present the implementation details to enable realistic rendering and improve performance.

  19. Programming the Mobile Web

    CERN Document Server

    Firtman, Maximiliano

    2010-01-01

    Today's market for mobile apps goes beyond the iPhone to include BlackBerry, Nokia, Windows Phone, and smartphones powered by Android, webOS, and other platforms. If you're an experienced web developer, this book shows you how to build a standard app core that you can extend to work with specific devices. You'll learn the particulars and pitfalls of building mobile apps with HTML, CSS, and other standard web tools. You'll also explore platform variations, finicky mobile browsers, Ajax design patterns for mobile, and much more. Before you know it, you'll be able to create mashups using Web 2.

  20. Creating Web Pages Simplified

    CERN Document Server

    Wooldridge, Mike

    2011-01-01

    The easiest way to learn how to create a Web page for your family or organization Do you want to share photos and family lore with relatives far away? Have you been put in charge of communication for your neighborhood group or nonprofit organization? A Web page is the way to get the word out, and Creating Web Pages Simplified offers an easy, visual way to learn how to build one. Full-color illustrations and concise instructions take you through all phases of Web publishing, from laying out and formatting text to enlivening pages with graphics and animation. This easy-to-follow visual guide sho

  1. SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services.

    Science.gov (United States)

    Gessler, Damian D G; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T

    2009-09-23

    SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at http://sswap.info (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at http://sswap.info/protocol.jsp, developer tools at http://sswap.info/developer.jsp, and a portal to third-party ontologies at http://sswapmeet.sswap.info (a "swap meet"). SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the
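
    Resource descriptions of the kind SSWAP publishes are OWL/RDF documents that can be produced with any RDF toolkit. The sketch below builds a tiny description with rdflib; the namespace, class, and property names are illustrative placeholders rather than the actual SSWAP protocol vocabulary.

        from rdflib import Graph, Literal, Namespace, RDF, RDFS

        EX = Namespace("http://example.org/sswap-demo#")   # placeholder vocabulary
        g = Graph()
        g.bind("ex", EX)

        service = EX.QtlLookupService
        g.add((service, RDF.type, EX.Resource))
        g.add((service, RDFS.label, Literal("QTL lookup service")))
        g.add((service, EX.inputType, EX.MarkerName))      # what the service consumes
        g.add((service, EX.outputType, EX.QtlRecord))      # what the service returns

        # Serialize the description as Turtle, ready to be published and reasoned over.
        print(g.serialize(format="turtle"))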

  2. Web 2.1 : Toward a large and qualitative participation on the Web

    Directory of Open Access Journals (Sweden)

    Boubker Sbihi

    2009-06-01

    Full Text Available This article presents the results of research on Web 2.0 conducted within the School of Information Sciences (ESI). It studies how different academic actors who work with information in Morocco, including teachers, master's students, and information science students, behave with respect to Web 2.0 services. It first evaluates the use and production of information in the context of Web 2.0, then assesses those rates and identifies and analyzes the causes of the problems and obstacles that academic actors face. In particular, we want to understand why information actors in the academic world often use Web 2.0 services but rarely produce quality content. To achieve these objectives, we used an on-site survey based on an electronic questionnaire administered directly to our respondents via the Internet; the electronic format made optimal use of new technologies, saved time, and reduced cost. To deepen our understanding of the data collected, we complemented the questionnaire with ongoing discussions with the actors. Finally, to overcome the problems identified, we propose the elements of a new version of the Web, called Web 2.1, offering new concepts intended to encourage users to produce quality information and to make the Web more open to a larger community. This version keeps the current content of Web 2.0 and adds value to it: content will be monitored, evaluated and validated before being published. To target valuable information, Web 2.1 proposes to categorize users into three groups: users who simply consume content, producers who use and produce content, and validators who validate the content in order to target information that is validated and of good
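
    To make the proposed three-role model concrete, here is a minimal sketch of the publish-after-validation workflow the article describes, in Python; the class and method names are illustrative assumptions rather than anything specified by the article.

    # Minimal sketch of the Web 2.1 workflow: content submitted by a producer is
    # only published after a validator has evaluated and approved it.
    from dataclasses import dataclass, field

    @dataclass
    class Contribution:
        author: str          # the producer who wrote the content
        text: str
        validated: bool = False

    @dataclass
    class Web21Platform:
        pending: list = field(default_factory=list)
        published: list = field(default_factory=list)

        def submit(self, author: str, text: str) -> None:
            """Producers submit content; it is held until validation."""
            self.pending.append(Contribution(author, text))

        def validate(self, approve: bool) -> None:
            """A validator reviews the oldest pending item and approves or rejects it."""
            item = self.pending.pop(0)
            if approve:
                item.validated = True
                self.published.append(item)

    platform = Web21Platform()
    platform.submit("producer_1", "Draft article on Web 2.1")
    platform.validate(approve=True)
    print([c.text for c in platform.published])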

  3. PERANCANGAN SISTEM PEMESANAN BARANG BERBASIS WEB DI TOKO ZENITH KOMPUTER DI PEKANBARU

    Directory of Open Access Journals (Sweden)

    Fery Wongso Johan Wyanaputra

    2016-03-01

    Full Text Available Abstract: The web-based ordering information system is part of a marketing information system developed to collect and process data so that the data can be retrieved and distributed as useful information. The concrete outcome of developing this web-based ordering information system is a computer application that represents the information system designed as a whole. The resulting ordering application can manage ordering data in an organized way and produce complete, accurate, and always up-to-date reports for every level of management. The system was designed with PHP (Personal Home Page) and its database with XAMPP Server. The results of the design show that computer applications play a very important role in information systems as a support for improving the quality of ordering and service activities at the Zenith Komputer store.

  4. Desarrollo de aplicaciones web

    OpenAIRE

    Luján Mora, Sergio

    2010-01-01

    Acknowledgements 1. Introduction to web applications 2. Installing the server 3. Web page design 4. Structured text format: XML 5. Dynamic content 6. Database access: JDBC 7. Web services 8. Use and maintenance 9. Monitoring and analysis Bibliography GNU Free Documentation License

  5. RESTful web services with Dropwizard

    CERN Document Server

    Dallas, Alexandros

    2014-01-01

    A hands-on focused step-by-step tutorial to help you create Web Service applications using Dropwizard. If you are a software engineer or a web developer and want to learn more about building your own Web Service application, then this is the book for you. Basic knowledge of Java and RESTful Web Service concepts is assumed and familiarity with SQL/MySQL and command-line scripting would be helpful.

  6. Segmenting The Web 2.0 Market: Behavioural And Usage Patterns Of Social Web Consumers

    NARCIS (Netherlands)

    Lorenzo Romero, Carlota; Constantinides, Efthymios; Alarcon-del-Amo, Maria-del-Carmen

    2010-01-01

    The evolution of the commercial Internet to the current phase, commonly called Web 2.0 (or Social Web) has firmly positioned the web not only as a commercial but also as a social communication platform: an online environment facilitating peer-to-peer interaction, socialization, co-operation and

  7. System configuration on Web with mashup.

    OpenAIRE

    清水, 宏泰; SHIMIZU, Hiroyasu

    2014-01-01

    Mashups have become a popular way of creating Web services as cloud services have spread. A mashup is a method of building a Web service from several existing Web services and APIs. Mashups have a few problems, one of which is differences in data formats and labels; the Semantic Web can solve this. This paper proposes a method of building a system on the Web with mashups using the Semantic Web. A mashup system configuration can be expressed as a URL, so editing the URL of a mashup means editing the system configuration, and any device can use this system on ...
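
    The idea that a mashup's system configuration can be expressed, and therefore edited, as a URL can be sketched as follows; the parameter names and component identifiers are hypothetical and chosen only to illustrate the encoding.

    # Minimal sketch: encode a mashup configuration as URL query parameters and
    # decode it back, so that editing the URL edits the system configuration.
    from urllib.parse import urlencode, urlparse, parse_qs

    def config_to_url(base: str, services: list, output: str) -> str:
        query = urlencode({"services": ",".join(services), "output": output})
        return f"{base}?{query}"

    def url_to_config(url: str) -> dict:
        params = parse_qs(urlparse(url).query)
        return {"services": params["services"][0].split(","),
                "output": params["output"][0]}

    url = config_to_url("http://example.org/mashup",
                        ["weatherAPI", "mapAPI"], "json")  # hypothetical components
    print(url)
    print(url_to_config(url))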

  8. Using the Cognitive Apprenticeship Web-Based Argumentation System to Improve Argumentation Instruction

    Science.gov (United States)

    Tsai, Chun-Yen; Jack, Brady Michael; Huang, Tai-Chu; Yang, Jin-Tan

    2012-01-01

    This study investigated how the instruction of argumentation skills could be promoted by using an online argumentation system. This system entitled "Cognitive Apprenticeship Web-based Argumentation" (CAWA) system was based on cognitive apprenticeship model. One hundred eighty-nine fifth grade students took part in this study. A quasi-experimental…

  9. A Formal Analysis of the Web Services Atomic Transaction Protocol with UPPAAL

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2010-01-01

    We present a formal analysis of the Web Services Atomic Transaction (WS-AT) protocol. WS-AT is a part of the WS-Coordination framework and describes an algorithm for reaching agreement on the outcome of a distributed transaction. The protocol is modelled and verified using the model checker UPPAAL...
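
    WS-AT standardizes a two-phase-commit style agreement on a transaction's outcome; as a rough illustration of that pattern (not the WS-AT state machine verified in the paper), a coordinator can be sketched as committing only when every participant votes prepared.

    # Minimal sketch of two-phase-commit style outcome agreement: commit only if
    # every participant voted "prepared", otherwise roll back. A simplification of
    # the coordination that WS-AT standardizes, not the verified UPPAAL model.
    from enum import Enum

    class Vote(Enum):
        PREPARED = "prepared"
        ABORTED = "aborted"

    def coordinate(participant_votes: list) -> str:
        # Phase 1 has collected the votes; phase 2 decides the shared outcome.
        if all(vote is Vote.PREPARED for vote in participant_votes):
            return "commit"
        return "rollback"

    print(coordinate([Vote.PREPARED, Vote.PREPARED]))  # commit
    print(coordinate([Vote.PREPARED, Vote.ABORTED]))   # rollback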

  10. Trust estimation of the semantic web using semantic web clustering

    Science.gov (United States)

    Shirgahi, Hossein; Mohsenzadeh, Mehran; Haj Seyyed Javadi, Hamid

    2017-05-01

    Development of the semantic web and of social networks is an undeniable part of today's Internet. The widespread, distributed nature of the semantic web makes assessing trust in this field very challenging, and in recent years extensive research has been done on estimating it. Since trust on the semantic web is a multidimensional problem, in this paper we use social network authority, page link authority values, and semantic authority as parameters to assess trust. Because of the large size of the semantic network, we restrict the problem scope to clusters of semantic subnetworks, obtain the trust of each cluster's elements locally, and calculate the trust of outside resources from their local trust values and the trust between clusters. According to the experimental results, the proposed method achieves an F-score of more than 79%, which is on average about 11.9% higher than the Eigen, Tidal and centralised trust methods. Its mean error is 12.936, which is on average 9.75% lower than the Eigen and Tidal trust methods.
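
    The way local (within-cluster) trust is combined with trust between clusters can be illustrated with a small sketch; the multiplicative weighting used here is an assumption for illustration and not the exact formula from the paper.

    # Minimal sketch: the trust in an outside resource is its local trust inside
    # its own cluster, scaled by how much our cluster trusts that cluster.
    local_trust = {                       # trust of resources within their cluster
        "cluster_A": {"r1": 0.9, "r2": 0.6},
        "cluster_B": {"r3": 0.8},
    }
    cluster_trust = {                     # trust between clusters
        ("cluster_A", "cluster_B"): 0.7,
    }

    def trust_from(my_cluster: str, other_cluster: str, resource: str) -> float:
        if my_cluster == other_cluster:
            return local_trust[my_cluster][resource]
        weight = cluster_trust[(my_cluster, other_cluster)]
        return weight * local_trust[other_cluster][resource]

    print(trust_from("cluster_A", "cluster_B", "r3"))  # 0.7 * 0.8 = 0.56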

  11. First in the web, but where are the pieces

    Energy Technology Data Exchange (ETDEWEB)

    Deken, J.M.

    1998-04-01

    The World Wide Web (WWW) does matter to the SLAC Archives and History Office for two very important, and related, reasons. The first reason is that the early Web at SLAC is historically significant: it was the first of its kind on this continent, and it achieved new and important things. The second reason is that the Web at SLAC, in its present and future forms, is a large and changing collection of official documents of the organization, many of which exist in no other form or environment. As of the first week of August 1997, SLAC had 8,940 administratively-accounted-for web pages, and an estimated 2,000 to 4,000 additional pages that are hard to track administratively because they either reside on the main server in users' directories several levels below their top-level pages, or they reside on one of the more than 60 non-main servers at the Center. A very small sampling of the information that SLAC WWW pages convey includes: information for the general public about programs and activities at SLAC; pages which allow physics experiment collaborators to monitor data, arrange work schedules and analyze results; pages that convey information to staff and visiting scientists about seminar and activity schedules, publication procedures, and ongoing experiments; and pages that allow staff and outside users to access databases maintained at SLAC. So, when SLAC's Archives and History Office begins to approach collecting the documents of its WWW presence, what is it collecting, and how is it to go about the process of collecting it? In this paper, the author discusses the effort to archive SLAC's Web in two parts, concentrating on the first task that has been undertaken: the initial effort to identify and gather into the archives evidence and documentation of the early days of the SLAC Web. The second task, the effort to collect present and future web pages at SLAC, is also covered, although in less detail, since it is an effort that is only

  12. MedlinePlus Connect: Web Service

    Science.gov (United States)

    MedlinePlus Connect: Web Service. URL of this page: https://medlineplus.gov/connect/service.html. The page documents the web service and lists old and new request URLs (old Web Application URL: https://apps.nlm.nih.gov/medlineplus/services/mpconnect.cfm? ...).
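
    For readers who want to try the service, a minimal request sketch follows; the endpoint and parameter names reflect the publicly documented Infobutton-style interface of MedlinePlus Connect, but they are assumptions here and should be verified against the page above before use.

    # Minimal sketch of a MedlinePlus Connect web service request (verify the
    # endpoint and parameter names against https://medlineplus.gov/connect/service.html).
    import json
    import urllib.parse
    import urllib.request

    BASE_URL = "https://connect.medlineplus.gov/service"      # assumed endpoint

    params = {
        "mainSearchCriteria.v.cs": "2.16.840.1.113883.6.90",  # ICD-10-CM code system
        "mainSearchCriteria.v.c": "E11.9",                    # example diagnosis code
        "knowledgeResponseType": "application/json",
    }

    with urllib.request.urlopen(f"{BASE_URL}?{urllib.parse.urlencode(params)}") as resp:
        data = json.load(resp)

    print(json.dumps(data, indent=2)[:500])  # show the start of the response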

  13. MedlinePlus Connect: Web Application

    Science.gov (United States)

    MedlinePlus Connect: Web Application. URL of this page: https://medlineplus.gov/connect/application.html. The page documents the web application and lists old and new request URLs (old Web Application URL: https://apps.nlm.nih.gov/medlineplus/services/mpconnect.cfm? ...).

  14. Flexible and Affordable Foreign Language Learning Environment based on Web 2.0 Technologies

    Directory of Open Access Journals (Sweden)

    Christian Guetl

    2013-05-01

    Full Text Available Web technologies and educational platforms have greatly evolved over the past decade. One of the most significant factors contributing to education on the Internet has been the development of Web 2.0 technologies. These technologies, socially interactive in nature, have much to contribute to the area of Computer Assisted Language Learning. Unfortunately, Web 2.0 technologies have for the most part been used in an ad hoc manner, permitting language learners to acquire knowledge through interaction, but not in a more structured manner, as these technologies were not developed to help learn languages as such. The goal of our work is to research and develop an environment that combines Web 2.0 technologies with online language learning tools to provide a more integrated language learning environment. This paper explores the technologies and describes how the tools can be better integrated to provide a more productive working environment for language learners. A first working proof of concept based on the approach introduced here shows promise in supporting modern language learning requirements; first findings and areas for improvement are discussed.

  15. SPADOCK: Adaptive Pipeline Technology for Web System using WebSocket

    Directory of Open Access Journals (Sweden)

    Aries RICHI

    2013-01-01

    Full Text Available As information technology moves into the era of IoT (Internet of Things) and cloud computing, the performance of the web applications and web services that act as the information gateway becomes an issue. Horizontal quality-of-service improvement through system performance escalation is a goal pursued by engineers and scientists, giving birth to the BigPipe pipeline technology developed by Facebook. We built SPADOCK, an adaptive pipeline system with a distributed system architecture that utilizes HTML5 WebSocket, and then measured its performance. Parameters used for the measurement include latency, workload, and bandwidth. The results show that SPADOCK reduces serving latency by 68.28% compared with the conventional web, and that it is 20.63% faster than BigPipe.
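
    The abstract gives no implementation detail, but the general pattern of pipelined delivery over a WebSocket, pushing each page fragment as soon as it is ready instead of waiting for the whole page, can be sketched as follows with the third-party websockets package; the fragment names and delays are purely illustrative.

    # Minimal sketch of BigPipe-style pipelined delivery over a WebSocket:
    # each page fragment ("pagelet") is pushed to the client as soon as it is ready.
    import asyncio
    import json
    import websockets  # third-party package

    async def render_pagelet(name: str, delay: float) -> dict:
        await asyncio.sleep(delay)        # simulate backend work of varying cost
        return {"pagelet": name, "html": f"<div id='{name}'>...</div>"}

    async def handler(websocket, path=None):
        jobs = [render_pagelet("header", 0.1),
                render_pagelet("feed", 0.5),
                render_pagelet("sidebar", 0.3)]
        # Send each fragment the moment it completes, not after all have finished.
        for finished in asyncio.as_completed(jobs):
            await websocket.send(json.dumps(await finished))

    async def main():
        async with websockets.serve(handler, "localhost", 8765):
            await asyncio.Future()        # serve until cancelled

    if __name__ == "__main__":
        asyncio.run(main())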

  16. INTERNET: a web of metaphors

    Directory of Open Access Journals (Sweden)

    Roger Pérez Brufau

    2007-05-01

    Full Text Available We propose to analyse the principal metaphors that we as users use to refer to the internet and to the activities, tools and people related to the web, within the framework of the theory upheld by George Lakoff and Mark Johnson. The first part of the work analyses these authors' conceptual metaphor theory, while the second examines the various metaphors that users employ to refer to the conceptual world of the web on the basis of what has been set out in the first part. This work should not only serve for us to take note of the ontological-structural-orientational continuum represented by the use of metaphors that relate the web, progressively, with a space, with a space that is above, with a space where there are things, with a space that normally takes the form of the sea, a home or a text; it should also serve to make us aware that the reasons why we use these metaphors and not others are rooted, successively, as Lakoff and Johnson affirm in their texts, in our body and in our interaction with the things of the world and with others in a culturally defined context.

  17. Historical Network Analysis of the Web

    DEFF Research Database (Denmark)

    Brügger, Niels

    2013-01-01

    This article discusses some of the fundamental methodological challenges related to doing historical network analyses of the web based on material in web archives. Since the late 1990s many countries have established extensive national web archives, and software-supported network analysis of the online web has for a number of years gained currency within Internet studies. However, the combination of these two phenomena, historical network analysis of material in web archives, can at best be characterized as an emerging new area of study. Most of the methodological challenges within this new area revolve around the specific nature of archived web material. On the basis of an introduction to the processes involved in web archiving as well as of the characteristics of archived web material, the article outlines and scrutinizes some of the major challenges which may arise when doing network analysis...
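
    As a small illustration of what such a network analysis can look like in practice, the sketch below builds a directed link graph from a handful of archived pages; the archive data structure and the use of the networkx package are assumptions for illustration, not part of the article.

    # Minimal sketch: build a directed hyperlink network from archived web pages
    # and rank pages by in-degree. Requires the networkx package.
    import networkx as nx

    # Hypothetical archive snapshot: page URL -> outgoing links at crawl time.
    archived_pages = {
        "http://example.org/a": ["http://example.org/b", "http://example.org/c"],
        "http://example.org/b": ["http://example.org/c"],
        "http://example.org/c": [],
    }

    graph = nx.DiGraph()
    for page, outlinks in archived_pages.items():
        graph.add_node(page)
        for target in outlinks:
            graph.add_edge(page, target)

    # In-degree as a crude indicator of how central a page was in the archived web.
    for page, indegree in sorted(graph.in_degree, key=lambda item: -item[1]):
        print(indegree, page)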

  18. Web corpus construction

    CERN Document Server

    Schafer, Roland

    2013-01-01

    The World Wide Web constitutes the largest existing source of texts written in a great variety of languages. A feasible and sound way of exploiting this data for linguistic research is to compile a static corpus for a given language. There are several advantages to this approach: (i) Working with such corpora obviates the problems encountered when using Internet search engines in quantitative linguistic research (such as non-transparent ranking algorithms). (ii) Creating a corpus from web data is virtually free. (iii) The size of corpora compiled from the WWW may exceed by several orders of magnitude the size of language resources offered elsewhere. (iv) The data is locally available to the user, and it can be linguistically post-processed and queried with the tools the user prefers. This book addresses the main practical tasks in the creation of web corpora up to giga-token size. Among these tasks are the sampling process (i.e., web crawling) and the usual cleanups including boilerplate removal and rem...
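
    As a rough illustration of the pipeline the book covers (crawling followed by cleanup such as boilerplate removal), here is a standard-library-only sketch; the seed URL, the tag stripping, and the crude text-density filter are illustrative assumptions and far simpler than the tools discussed in the book.

    # Minimal sketch of one web-corpus step: fetch a page, strip markup, and keep
    # only text-dense fragments as a crude form of boilerplate removal.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class TextExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.skip = False            # True while inside <script> or <style>
            self.chunks = []

        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self.skip = True

        def handle_endtag(self, tag):
            if tag in ("script", "style"):
                self.skip = False

        def handle_data(self, data):
            if not self.skip and data.strip():
                self.chunks.append(data.strip())

    def fetch_clean_text(url: str, min_words: int = 5) -> str:
        html = urlopen(url).read().decode("utf-8", errors="replace")
        parser = TextExtractor()
        parser.feed(html)
        # Crude boilerplate filter: drop very short fragments (menus, buttons, ...).
        return "\n".join(c for c in parser.chunks if len(c.split()) >= min_words)

    if __name__ == "__main__":
        print(fetch_clean_text("http://example.org/"))  # hypothetical seed URL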

  19. Hera : Development of semantic web information systems

    NARCIS (Netherlands)

    Houben, G.J.P.M.; Barna, P.; Frasincar, F.; Vdovják, R.; Cuella Lovelle, J.M.; et al., xx

    2003-01-01

    As a consequence of the success of the Web, methodologies for information system development need to consider systems that use the Web paradigm. These Web Information Systems (WIS) use Web technologies to retrieve information from the Web and to deliver information in a Web presentation to the

  20. Answering the Call of the Web: UVA Crafts an Innovative Web Certification Program for Its Staff.

    Science.gov (United States)

    Lee, Sandra T.

    2000-01-01

    Describes the development of a Web Certification Program at the University of Virginia. This program offers certificates at three levels: Web Basics, Web Designer, and Web Master. The paper focuses on: determination of criteria for awarding certificates; program status; program evaluation and program effectiveness; and future plans for the Web…