WorldWideScience

Sample records for wide web edition

  1. World Wide Web voted most wonderful wonder by web-wide world

    CERN Multimedia

    2007-01-01

The results are in, and the winner is...the World Wide Web! An online survey conducted by the CNN news group ranks the World Wide Web, invented at CERN, as the most wonderful of the seven modern wonders of the world. (See Bulletin No. 49/2006.) There is currently no speculation about whether they would have had the same results had they distributed the survey by post. The World Wide Web won with a whopping 50 per cent of the votes (3,665 votes). The runner-up was CERN again, with 16 per cent of voters (1,130 votes) casting their ballots in favour of the CERN particle accelerator. Stepping into place behind CERN and CERN is 'None of the Above' with 8 per cent of the votes (611 votes), followed by the development of Dubai (7%), the bionic arm (7%), China's Three Gorges Dam (5%), the Channel Tunnel (4%), and France's Millau viaduct (3%). Thanks to everyone from CERN who voted. You can view the results at http://edition.cnn.com/SPECIALS/2006/modern.wonders/

  2. Unit 148 - World Wide Web Basics

    OpenAIRE

    148, CC in GIScience; Yeung, Albert K.

    2000-01-01

    This unit explains the characteristics and the working principles of the World Wide Web as the most important protocol of the Internet. Topics covered in this unit include characteristics of the World Wide Web; using the World Wide Web for the dissemination of information on the Internet; and using the World Wide Web for the retrieval of information from the Internet.

  3. Introduction to the world wide web.

    Science.gov (United States)

    Downes, P K

    2007-05-12

    The World Wide Web used to be nicknamed the 'World Wide Wait'. Now, thanks to high speed broadband connections, browsing the web has become a much more enjoyable and productive activity. Computers need to know where web pages are stored on the Internet, in just the same way as we need to know where someone lives in order to post them a letter. This section explains how the World Wide Web works and how web pages can be viewed using a web browser.
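
    The mechanism the article describes, a browser first locating the server that holds a page and then requesting the page from it, can be pictured in a few lines of Python. This is only an illustrative sketch of the general URL-parse, DNS-lookup and HTTP-GET sequence, not code from the article, and the URL used is a placeholder.

    ```python
    # Minimal sketch (not from the article) of what a browser does behind the
    # scenes: resolve the server's address, then request the page over HTTP.
    # The URL below is only an illustrative placeholder.
    import socket
    from urllib.parse import urlparse
    from urllib.request import urlopen

    url = "http://example.org/index.html"          # hypothetical page
    parts = urlparse(url)                           # split into scheme, host, path
    address = socket.gethostbyname(parts.hostname)  # DNS: name -> IP address
    print(f"{parts.hostname} lives at {address}")

    with urlopen(url) as response:                  # HTTP GET, like a browser
        html = response.read().decode("utf-8", errors="replace")
    print(html[:200])                               # first part of the page source
    ```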

  4. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable... and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision...

  5. Management van World-Wide Web Servers

    NARCIS (Netherlands)

    van Hengstum, F.P.H.; Pras, Aiko

    1996-01-01

The World Wide Web is a popular Internet application that makes it possible to offer documents to arbitrary Internet users. Because no provisions had yet been made for this, until recently it was not really feasible to manage the World Wide Web remotely. The University

  6. Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems

    Science.gov (United States)

    Ponyik, Joseph G.; York, David W.

    2002-01-01

Embedded Systems have traditionally been developed in a highly customized manner. The user interface hardware and software along with the interface to the embedded system are typically unique to the system for which they are built, resulting in extra cost to the system in terms of development time and maintenance effort. World Wide Web standards have been developed in the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms but the World Wide Web standards allow them to interface without knowing about the details of the system at the other end of the interface. Embedded Web Technology is the merging of Embedded Systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an Embedded System's internal network.
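
    As a rough illustration of the approach described, serving an embedded device's user interface to any standard web browser, the following sketch assumes a hypothetical device exposing a small HTML status page over HTTP. It is a minimal Python example of the general idea, not the NASA implementation.

    ```python
    # Minimal sketch, assuming a hypothetical embedded device exposing a status
    # page over HTTP so any standard browser can act as its user interface.
    # This is an illustration of the general idea, not NASA's implementation.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def read_sensor():
        # Placeholder for a real hardware read on the embedded system.
        return {"temperature_c": 21.5, "uptime_s": 12345}

    class StatusHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            data = read_sensor()
            body = (
                "<html><body><h1>Device status</h1>"
                f"<p>Temperature: {data['temperature_c']} C</p>"
                f"<p>Uptime: {data['uptime_s']} s</p>"
                "</body></html>"
            ).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
    ```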

  7. World Wide Web of Your Wide Web? Juridische aspecten van zoekmachine-personalisatie

    NARCIS (Netherlands)

    Oostveen, M.

    2012-01-01

The world wide web is an enormous source of information. Every internet user uses search engines to find that information. Many users, however, do not know that the search results returned for a given search term are not the same for everyone. This personalization of

  8. GenomeVx: simple web-based creation of editable circular chromosome maps.

    Science.gov (United States)

    Conant, Gavin C; Wolfe, Kenneth H

    2008-03-15

We describe GenomeVx, a web-based tool for making editable, publication-quality maps of mitochondrial and chloroplast genomes and of large plasmids. These maps show the location of genes and chromosomal features as well as a position scale. The program takes as input either raw feature positions or GenBank records. In the latter case, features are automatically extracted and colored, an example of which is given. Output is in the Adobe Portable Document Format (PDF) and can be edited by programs such as Adobe Illustrator. GenomeVx is available at http://wolfe.gen.tcd.ie/GenomeVx
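
    Since GenomeVx accepts GenBank records and extracts features from them automatically, the kind of preprocessing involved can be illustrated with Biopython. This is a hedged sketch of GenBank feature extraction in general, not GenomeVx's own code; the input file name is a placeholder.

    ```python
    # Hedged illustration of extracting feature positions from a GenBank record,
    # the kind of input GenomeVx parses automatically. Requires Biopython; the
    # file name is a placeholder.
    from Bio import SeqIO

    record = SeqIO.read("mitochondrion.gb", "genbank")   # hypothetical input file
    for feature in record.features:
        if feature.type in ("gene", "tRNA", "rRNA"):
            name = feature.qualifiers.get("gene", ["?"])[0]
            start = int(feature.location.start)           # 0-based start
            end = int(feature.location.end)
            strand = "+" if feature.location.strand == 1 else "-"
            print(f"{name}\t{start}\t{end}\t{strand}")
    ```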

  9. The World Wide Web Revisited

    Science.gov (United States)

    Owston, Ron

    2007-01-01

Nearly a decade ago the author wrote, in Educational Researcher, one of the first widely cited academic articles about the educational role of the web. He argued that educators must be able to demonstrate that the web (1) can increase access to learning, (2) must not result in higher costs for learning, and (3) can lead to improved learning. These…

  10. Tim Berners-Lee, World Wide Web inventor

    CERN Multimedia

    1998-01-01

    The "Internet, Web, What's next?" conference on 26 June 1998 at CERN: Tim Berners-Lee, inventor of the World Wide Web and Director of the W3C, explains how the Web came to be and gave his views on the future.

  11. Happy 20th Birthday, World Wide Web!

    CERN Multimedia

    2009-01-01

On 13 March CERN celebrated the 20th anniversary of the World Wide Web. Check out the video interview with Web creator Tim Berners-Lee and find out more about both the history and future of the Web. To celebrate, CERN also launched a brand new website, CERNland, for kids.

  12. Use of World Wide Web and NCSA Mosaic at Langley

    Science.gov (United States)

    Nelson, Michael

    1994-01-01

    A brief history of the use of the World Wide Web at Langley Research Center is presented along with architecture of the Langley Web. Benefits derived from the Web and some Langley projects that have employed the World Wide Web are discussed.

  13. Utilization of the world wide web

    International Nuclear Information System (INIS)

    Mohr, P.; Mallard, G.; Ralchenko, U.; Schultz, D.

    1998-01-01

Two aspects of utilization of the World Wide Web are examined: (i) the communication of technical data through web sites that provide repositories of atomic and molecular data accessible through searchable databases; and (ii) the communication about issues of mutual concern among data producers, data compilers and evaluators, and data users. copyright 1998 American Institute of Physics

  14. Uses and Gratifications of the World Wide Web: From Couch Potato to Web Potato.

    Science.gov (United States)

    Kaye, Barbara K.

    1998-01-01

    Investigates uses and gratifications of the World Wide Web and its impact on traditional mass media, especially television. Identifies six Web use motivations: entertainment, social interaction, passing of time, escape, information, and Web site preference. Examines relationships between each use motivation and Web affinity, perceived realism, and…

  15. Web Security, Privacy & Commerce

    CERN Document Server

    Garfinkel, Simson

    2011-01-01

    Since the first edition of this classic reference was published, World Wide Web use has exploded and e-commerce has become a daily part of business and personal life. As Web use has grown, so have the threats to our security and privacy--from credit card fraud to routine invasions of privacy by marketers to web site defacements to attacks that shut down popular web sites. Web Security, Privacy & Commerce goes behind the headlines, examines the major security risks facing us today, and explains how we can minimize them. It describes risks for Windows and Unix, Microsoft Internet Exp

  16. GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.

    Science.gov (United States)

    Liang, Steve H L; Huang, Chih-Yuan

    2013-10-02

The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with a demonstration of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  17. World-Wide Web: The Information Universe.

    Science.gov (United States)

    Berners-Lee, Tim; And Others

    1992-01-01

    Describes the World-Wide Web (W3) project, which is designed to create a global information universe using techniques of hypertext, information retrieval, and wide area networking. Discussion covers the W3 data model, W3 architecture, the document naming scheme, protocols, document formats, comparison with other systems, experience with the W3…

  18. Cpf1-Database: web-based genome-wide guide RNA library design for gene knockout screens using CRISPR-Cpf1.

    Science.gov (United States)

    Park, Jeongbin; Bae, Sangsu

    2018-03-15

Following the type II CRISPR-Cas9 system, type V CRISPR-Cpf1 endonucleases have been found to be applicable for genome editing in various organisms in vivo. However, there are as yet no web-based tools capable of optimally selecting guide RNAs (gRNAs) among all possible genome-wide target sites. Here, we present Cpf1-Database, a genome-wide gRNA library design tool for LbCpf1 and AsCpf1, which have DNA recognition sequences of 5'-TTTN-3' at the 5' ends of target sites. Cpf1-Database provides a sophisticated but simple way to design gRNAs for AsCpf1 nucleases on the genome scale. One can easily access the data using a straightforward web interface, and using the powerful collections feature one can design gRNAs for thousands of genes in a short time. Free access at http://www.rgenome.net/cpf1-database/. sangsubae@hanyang.ac.kr.
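
    The core selection rule mentioned in the abstract, target sites preceded by a 5'-TTTN-3' recognition sequence, can be illustrated with a short script. The guide length and the single-strand scan below are simplifying assumptions for illustration, not the actual Cpf1-Database algorithm.

    ```python
    # Hedged sketch of the underlying idea: scan a sequence for 5'-TTTN-3' PAM
    # sites (as recognised by AsCpf1/LbCpf1) and report candidate target sites
    # downstream of each PAM. Guide length and scanning of only one strand are
    # simplifying assumptions, not the Cpf1-Database algorithm itself.
    GUIDE_LEN = 23  # assumed protospacer length for illustration

    def find_cpf1_targets(seq: str, guide_len: int = GUIDE_LEN):
        seq = seq.upper()
        targets = []
        for i in range(len(seq) - 4 - guide_len + 1):
            pam = seq[i:i + 4]
            if pam.startswith("TTT"):                  # 5'-TTTN-3' PAM
                protospacer = seq[i + 4:i + 4 + guide_len]
                targets.append((i, pam, protospacer))
        return targets

    example = "ACGTTTTACGTAGCTAGCTAGGATCCATGCATGCATGCATGCA"
    for pos, pam, guide in find_cpf1_targets(example):
        print(f"PAM {pam} at {pos}, candidate target {guide}")
    ```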

  19. U.S. Geological Survey World Wide Web Information

    Science.gov (United States)

    ,

    2003-01-01

    The U.S. Geological Survey (USGS) invites you to explore an earth science virtual library of digital information, publications, and data. The USGS World Wide Web sites offer an array of information that reflects scientific research and monitoring programs conducted in the areas of natural hazards, environmental resources, and cartography. This list provides gateways to access a cross section of the digital information on the USGS World Wide Web sites.

  20. Tim Berners-Lee, World Wide Web inventor

    CERN Multimedia

    1994-01-01

A former physicist, Tim Berners-Lee invented the World Wide Web as an essential tool for high-energy physics at CERN from 1989 to 1994. Together with a small team he conceived HTML, HTTP and URLs, and put up the first server and the first 'what you see is what you get' browser and HTML editor. Tim is now Director of the Web Consortium W3C, the international Web standards body based at INRIA, MIT and Keio University.

  1. The World Wide Web and the Television Generation.

    Science.gov (United States)

    Maddux, Cleborne D.

    1996-01-01

    The hypermedia nature of the World Wide Web may represent a true paradigm shift in telecommunications, but barriers exist to the Web having similar impact on education. Some of today's college students compare the Web with "bad TV"--lengthy pauses, links that result in error messages, and animation and sound clips that are too brief.…

  2. Playing with the internet through world wide web

    International Nuclear Information System (INIS)

    Kim, Seon Tae; Jang, Jin Seok

    1995-07-01

This book describes how to use the internet through the world wide web. It is divided into six chapters: let's go to the internet ocean; the internet in the information superhighway era; connecting to the world over a telephone wire, covering links via internet cable and telephone modem, internet service providers, text-mode connection, and domain names and IP addresses; the principles and uses of the world wide web in business, music, fashion, movies and photos, internet news and e-mail; making an internet map with web languages; and the installation and use of basic programs such as TCP/IP, SLIP/PPP, 3270 Emulator, Finger and NCSA Mosaic.

  3. GeoCENS: A Geospatial Cyberinfrastructure for the World-Wide Sensor Web

    Directory of Open Access Journals (Sweden)

    Steve H.L. Liang

    2013-10-01

The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with a demonstration of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  4. ExpEdit: a webserver to explore human RNA editing in RNA-Seq experiments.

    Science.gov (United States)

    Picardi, Ernesto; D'Antonio, Mattia; Carrabino, Danilo; Castrignanò, Tiziana; Pesole, Graziano

    2011-05-01

ExpEdit is a web application for assessing RNA editing in humans at known or user-specified sites supported by transcript data obtained from RNA-Seq experiments. Mapping data (in SAM/BAM format) or sequence reads [in FASTQ/short read archive (SRA) format] can be provided directly as input to carry out a comparative analysis against a large collection of known editing sites collected in the DARNED database as well as other user-provided potentially edited positions. Results are shown as dynamic tables containing University of California, Santa Cruz (UCSC) links for a quick examination of the genomic context. ExpEdit is freely available on the web at http://www.caspur.it/ExpEdit/.
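
    The comparison ExpEdit performs can be pictured with a toy example: given base counts at genomic positions derived from mapped RNA-Seq reads, report the apparent A-to-G editing level at known sites. The positions and counts below are made-up placeholders, and the code is an illustration of the idea rather than ExpEdit's implementation.

    ```python
    # Toy sketch of the kind of comparison ExpEdit performs: given per-position
    # base counts already derived from mapped RNA-Seq reads, report the apparent
    # A-to-G editing level at known editing sites (e.g. positions from DARNED).
    # The data below are made-up placeholders, not ExpEdit's code or data.
    known_sites = {("chr1", 154554300), ("chr2", 74284500)}   # hypothetical sites

    # (chrom, 1-based position) -> counts of each base observed in the reads
    pileup_counts = {
        ("chr1", 154554300): {"A": 12, "G": 28, "C": 0, "T": 1},
        ("chr2", 74284500):  {"A": 40, "G": 2,  "C": 0, "T": 0},
    }

    for site in sorted(known_sites):
        counts = pileup_counts.get(site)
        if not counts:
            continue
        depth = sum(counts.values())
        editing_level = counts["G"] / depth if depth else 0.0
        chrom, pos = site
        print(f"{chrom}:{pos}\tdepth={depth}\tA->G editing={editing_level:.2f}")
    ```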

  5. REDIdb: the RNA editing database.

    Science.gov (United States)

    Picardi, Ernesto; Regina, Teresa Maria Rosaria; Brennicke, Axel; Quagliariello, Carla

    2007-01-01

The RNA Editing Database (REDIdb) is an interactive, web-based database created and designed with the aim to allocate RNA editing events such as substitutions, insertions and deletions occurring in a wide range of organisms. The database contains both fully and partially sequenced DNA molecules for which editing information is available either by experimental inspection (in vitro) or by computational detection (in silico). Each record of REDIdb is organized in a specific flat-file containing a description of the main characteristics of the entry, a feature table with the editing events and related details, and a sequence zone with both the genomic sequence and the corresponding edited transcript. REDIdb is a relational database in which the browsing and identification of editing sites has been simplified by means of two facilities to either graphically display genomic or cDNA sequences or to show the corresponding alignment. In both cases, all editing sites are highlighted in colour and their relative positions are detailed by mousing over. New editing positions can be directly submitted to REDIdb after a user-specific registration to obtain authorized secure access. This first version of the REDIdb database stores 9964 editing events and can be freely queried at http://biologia.unical.it/py_script/search.html.

  6. Surfing the World Wide Web to Education Hot-Spots.

    Science.gov (United States)

    Dyrli, Odvard Egil

    1995-01-01

    Provides a brief explanation of Web browsers and their use, as well as technical information for those considering access to the WWW (World Wide Web). Curriculum resources and addresses to useful Web sites are included. Sidebars show sample searches using Yahoo and Lycos search engines, and a list of recommended Web resources. (JKP)

  7. The World Wide Web of War

    National Research Council Canada - National Science Library

    Smith, Craig A

    2006-01-01

    Modern communications, combined with the near instantaneous publication of information on the World Wide Web, are providing the means to dramatically affect the pursuit, conduct, and public opinion of war on both sides...

  8. An information filtering system prototype for world wide web; Prototipo di sistema di information filtering per world wide web

    Energy Technology Data Exchange (ETDEWEB)

    Bordoni, L [ENEA Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). Funzione Centrale Studi

    1999-07-01

In this report the architecture of an information filtering system for the world wide web, developed by the Third University of Rome (Italy) for ENEA (National Agency for New Technology, Energy and the Environment), is described. This prototype allows for selecting documents in text/HTML format from the web according to the interests of users. A user modeling shell allows a model of the user's interests, obtained during the interaction, to be built. The experimental results support the choice of embedding user modeling methods in this kind of application. [Italian] This report describes the architecture of an adaptive information filtering system for the world wide web, developed by the Third University of Rome in collaboration with ENEA. The prototype described is able to select documents in text/HTML format, collected from the web, in accordance with the characteristics and interests of users. A user modelling shell makes it possible to build a model of the user's interests, obtained over the course of the interaction. The experimental results reinforce the choice of using user modelling methods for this kind of application.

  9. World Wide Web Homepage Design.

    Science.gov (United States)

    Tillman, Michael L.

    This paper examines hypermedia design and draws conclusions about how educational research and theory applies to various aspects of World Wide Web (WWW) homepage design. "Hypermedia" is defined as any collection of information which may be textual, graphical, visual, or auditory in nature and which may be accessed via a nonlinear route.…

  10. Using the World Wide Web To Teach Francophone Culture.

    Science.gov (United States)

    Beyer, Deborah Berg; Van Ells, Paula Hartwig

    2002-01-01

    Examined use of the World Wide Web to teach Francophone culture. Suggests that bolstering reading comprehension in the foreign language and increased proficiency in navigating the Web are potential secondary benefits gained from the cultural Web-based activities proposed in the study.(Author/VWL)

  12. Collecting behavioural data using the world wide web: considerations for researchers.

    Science.gov (United States)

    Rhodes, S D; Bowie, D A; Hergenrather, K C

    2003-01-01

To identify and describe advantages, challenges, and ethical considerations of web-based behavioural data collection. This discussion is based on the authors' experiences in survey development and study design, respondent recruitment, and internet research, and on the experiences of others as found in the literature. The advantages of using the world wide web to collect behavioural data include rapid access to numerous potential respondents and previously hidden populations, respondent openness and full participation, opportunities for student research, and reduced research costs. Challenges identified include issues related to sampling and sample representativeness, competition for the attention of respondents, and potential limitations resulting from the much-cited "digital divide", literacy, and disability. Ethical considerations include anonymity and privacy, providing and substantiating informed consent, and potential risks of malfeasance. Computer-mediated communications, including electronic mail, the world wide web, and interactive programs, will play an ever-increasing part in the future of behavioural science research. Justifiable concerns regarding the use of the world wide web in research exist, but as access to, and use of, the internet becomes more widely and representatively distributed globally, the world wide web will become more applicable. In fact, the world wide web may be the only research tool able to reach some previously hidden population subgroups. Furthermore, many of the criticisms of online data collection are common to other survey research methodologies.

  13. Business use of the World-Wide Web

    Directory of Open Access Journals (Sweden)

    C. Cockburn

    1995-01-01

Two methods were employed in this study of the use of the World Wide Web by business: first, a sample of 300 businesses with Web sites, across a wide range of industry types, was examined, by selecting (rather than sampling) companies from the Yahoo! directory. The sites were investigated in relation to several areas - the purpose of the Web site, the use being made of electronic mail and the extent to which multimedia was being utilised. In addition, any other aspects of the site which were designed to make it more interesting to potential customers were also noted. Secondly, an electronic-mail questionnaire was sent to 222 of the 300 companies surveyed: that is, those that provided an e-mail address for contact. 14 were returned immediately due to unknown addresses or technical problems. Of the remaining 208, 102 replies were received, five of which were of no relevance, leaving 97 completed questionnaires to examine; a response rate of 47%, which is surprisingly good for a survey of this kind.

  14. Histories of Public Service Broadcasters on the Web

    DEFF Research Database (Denmark)

This edited volume details multiple and dynamic histories of relations between public service broadcasters and the World Wide Web. What does it mean to be a national broadcaster in a global communications environment? What are the commercial and public service pressures that were brought to bear... when public service broadcasters implemented web services? How did "one-to-many" broadcasters adapt to the "many-to-many" medium of the internet? The thematic organisation of this collection addresses such major issues, while each chapter offers a particular historical account of relations between... public service broadcasters and the World Wide Web....

  15. Re-Framing the World Wide Web

    Science.gov (United States)

    Black, August

    2011-01-01

    The research presented in this dissertation studies and describes how technical standards, protocols, and application programming interfaces (APIs) shape the aesthetic, functional, and affective nature of our most dominant mode of online communication, the World Wide Web (WWW). I examine the politically charged and contentious battle over browser…

  16. Process Support for Cooperative Work on the World Wide Web

    NARCIS (Netherlands)

    Sikkel, Nicolaas; Neumann, Olaf; Sachweh, Sabine

    The World Wide Web is becoming a dominating factor in information technology. Consequently, computer supported cooperative work on the Web has recently drawn a lot of attention. Process Support for Cooperative Work (PSCW) is a Web based system supporting both structured and unstructured forms of

  17. WorldWide Web: Hypertext from CERN.

    Science.gov (United States)

    Nickerson, Gord

    1992-01-01

    Discussion of software tools for accessing information on the Internet focuses on the WorldWideWeb (WWW) system, which was developed at the European Particle Physics Laboratory (CERN) in Switzerland to build a worldwide network of hypertext links using available networking technology. Its potential for use with multimedia documents is also…

  18. WebPresent: a World Wide Web-based telepresentation tool for physicians

    Science.gov (United States)

    Sampath-Kumar, Srihari; Banerjea, Anindo; Moshfeghi, Mehran

    1997-05-01

In this paper, we present the design architecture and the implementation status of WebPresent - a World Wide Web-based telepresentation tool. This tool allows a physician to use a conference server workstation and make a presentation of patient cases to a geographically distributed audience. The audience consists of other physicians collaborating on patients' health care management and physicians participating in continuing medical education. These physicians are at several locations with networks of different bandwidth and capabilities connecting them. Audiences also receive the patient case information on different computers ranging from high-end display workstations to laptops with low-resolution displays. WebPresent is a scalable networked multimedia tool which supports the presentation of hypertext, images, audio, video, and a whiteboard to remote physicians with hospital Intranet access. WebPresent allows the audience to receive customized information. The data received can differ in resolution and bandwidth, depending on the availability of resources such as display resolution and network bandwidth.

  19. Internet and The World Wide Web

    Indian Academy of Sciences (India)

Internet and The World Wide Web. Neelima Shrikhande. General Article, Resonance – Journal of Science Education, Volume 2, Issue 2, February 1997, pp. 64-74. Full-text PDF available at the permanent link: https://www.ias.ac.in/article/fulltext/reso/002/02/0064-0074

  20. A World Wide Web Region-Based Image Search Engine

    DEFF Research Database (Denmark)

    Kompatsiaris, Ioannis; Triantafyllou, Evangelia; Strintzis, Michael G.

    2001-01-01

In this paper the development of an intelligent image content-based search engine for the World Wide Web is presented. This system will offer a new form of media representation and access to content available on the WWW. Information Web Crawlers continuously traverse the Internet and collect images...

  1. World-Wide Web the information universe

    CERN Document Server

    Berners-Lee, Tim; Groff, Jean-Francois; Pollermann, Bernd

    1992-01-01

Purpose - The World-Wide Web (W-3) initiative is a practical project designed to bring a global information universe into existence using available technology. This paper seeks to describe the aims, data model, and protocols needed to implement the "web" and to compare them with various contemporary systems. Design/methodology/approach - Since Vannevar Bush's article, men have dreamed of extending their intellect by making their collective knowledge available to each individual by using machines. Computers provide us with two practical techniques for the human-knowledge interface. One is hypertext, in which links between pieces of text (or other media) mimic human association of ideas. The other is text retrieval, which allows associations to be deduced from the content of text. The W-3 ideal world allows both operations and provides access from any browsing platform. Findings - Various server gateways to other information systems have been produced, and the total amount of information available on the web is...

  2. Teaching Critical Evaluation Skills for World Wide Web Resources.

    Science.gov (United States)

    Tate, Marsha; Alexander, Jan

    1996-01-01

    Outlines a lesson plan used by an academic library to evaluate the quality of World Wide Web information. Discusses the traditional evaluation criteria of accuracy, authority, objectivity, currency, and coverage as it applies to the unique characteristics of Web pages: their marketing orientation, variety of information, and instability. The…

  3. Information about liver transplantation on the World Wide Web.

    Science.gov (United States)

    Hanif, F; Sivaprakasam, R; Butler, A; Huguet, E; Pettigrew, G J; Michael, E D A; Praseedom, R K; Jamieson, N V; Bradley, J A; Gibbs, P

    2006-09-01

Orthotopic liver transplant (OLTx) has evolved into a successful form of surgical management for end-stage liver disease. Awareness of and information about OLTx are important tools in assisting OLTx recipients and the people supporting them, including non-transplant clinicians. The study aimed to investigate the nature and quality of liver transplant-related patient information on the World Wide Web. Four common search engines were used to explore the Internet using the key words 'Liver transplant'. The URL (uniform resource locator) of the top 50 returns was chosen as it was judged unlikely that the average user would search beyond the first 50 sites returned by a given search. Each Web site was assessed on the following categories: origin, language, accessibility and extent of the information. A weighted Information Score (IS) was created to assess the clinical and educational value of each Web site and was scored independently by three transplant clinicians. The Internet search performed with the aid of the four search engines yielded a total of 2,255,244 Web sites. Of the 200 possible sites, only 58 Web sites were assessed because of repetition of the same Web sites and non-accessible links. The overall median weighted IS was 22 (IQR = 1-42). Of the 58 Web sites analysed, 45 (77%) belonged to the USA, six (10%) were European, and seven (12%) were from the rest of the world. The median weighted IS of publications originating from Europe and the USA was 40 (IQR = 22-60) and 23 (IQR = 6-38), respectively. Although European Web sites produced a higher weighted IS [40 (IQR = 22-60)] as compared with the USA publications [23 (IQR = 6-38)], this was not statistically significant (p = 0.07). Web sites belonging to academic institutions and professional organizations scored significantly higher, with median weighted IS of 28 (IQR = 16-44) and 24 (IQR = 12-35), respectively, as compared with commercial Web sites (median = 6, IQR = 0-14; p = 0.001). There

  4. The Business Information Services: Old-Line Online Moves to the Web.

    Science.gov (United States)

    O'Leary, Mick

    1997-01-01

    Although the availability of free information on the World Wide Web has placed traditional, fee-based proprietary online services on the defensive, most major online business services are now on the Web. Highlights several business information providers: Profound, NewsNet and ProQuest Direct, Dow Jones and Wall Street Journal Interactive Edition,…

  5. Sources of Militaria on the World Wide Web | Walker | Scientia ...

    African Journals Online (AJOL)

    Having an interest in military-type topics is one thing, finding information on the web to quench your thirst for knowledge is another. The World Wide Web (WWW) is a universal electronic library that contains millions of web pages. As well as being fun, it is an addictive tool on which to search for information. To prevent hours ...

  6. Integrating Temporal Media and Open Hypermedia on the World Wide Web

    DEFF Research Database (Denmark)

    Bouvin, Niels Olof; Schade, René

    1999-01-01

    The World Wide Web has since its beginning provided linking to and from text documents encoded in HTML. The Web has evolved and most Web browsers now support a rich set of media types either by default or by the use of specialised content handlers, known as plug-ins. The limitations of the Web...

  7. So Wide a Web, So Little Time.

    Science.gov (United States)

    McConville, David; And Others

    1996-01-01

    Discusses new trends in the World Wide Web. Highlights include multimedia; digitized audio-visual files; compression technology; telephony; virtual reality modeling language (VRML); open architecture; and advantages of Java, an object-oriented programming language, including platform independence, distributed development, and pay-per-use software.…

  8. Business use of the World Wide Web: a report on further investigations

    Directory of Open Access Journals (Sweden)

    Hooi-Im Ng

    1998-01-01

As a continuation of a previous study, this paper reports on a series of studies into business use of the World Wide Web and, more generally, the Internet. The use of the World Wide Web as a business tool has increased rapidly over the past three years, and the benefits of the World Wide Web to businesses and customers are discussed, together with the barriers that hold back the future development of electronic commerce. As with the previous study, we report on a desk survey of 300 randomly selected business Web sites and on the results of an electronic mail questionnaire sent to the sample companies. An extended version of this paper has been submitted to the International Journal of Information Management.

  9. News Resources on the World Wide Web.

    Science.gov (United States)

    Notess, Greg R.

    1996-01-01

    Describes up-to-date news sources that are presently available on the Internet and World Wide Web. Highlights include electronic newspapers; AP (Associated Press) sources and Reuters; sports news; stock market information; New York Times; multimedia capabilities, including CNN Interactive; and local and regional news. (LRW)

  10. Teaching Hypertext and Hypermedia through the Web.

    Science.gov (United States)

    de Bra, Paul M. E.

    This paper describes a World Wide Web-based introductory course titled "Hypermedia Structures and Systems," offered as an optional part of the curriculum in computing science at the Eindhoven University of Technology (Netherlands). The technical environment for the current (1996) edition of the course is presented, which features…

  11. Golden Jubilee Photos: World Wide Web

    CERN Multimedia

    2004-01-01

    At the end of the 1980s, the Internet was already a valuable tool to scientists, allowing them to exchange e-mails and to access powerful computers remotely. A more simple means of sharing information was needed, however, and CERN, with its long tradition of informatics and networking, was the ideal place to find it. Moreover, hundreds of scientists from all over the world were starting to work together on preparations for the experiments at the Large Electron-Positron (LEP) collider. In 1989, Tim Berners-Lee (see photo), a young scientist working at CERN, drafted a proposal for an information-management system combining the internet, personal computers and computer-aided document consultation, known as hypertext. In 1990 he was joined by Robert Cailliau and the weaving of the World Wide Web began in earnest, even though only two CERN computers were allocated to the task at the time. The Web subsequently underwent a steady expansion to include the world's main particle physics institutes. The Web was not the...

  12. Judging nursing information on the world wide web.

    Science.gov (United States)

    Cader, Raffik

    2013-02-01

    The World Wide Web is increasingly becoming an important source of information for healthcare professionals. However, finding reliable information from unauthoritative Web sites to inform healthcare can pose a challenge to nurses. A study, using grounded theory, was undertaken in two phases to understand how qualified nurses judge the quality of Web nursing information. Data were collected using semistructured interviews and focus groups. An explanatory framework that emerged from the data showed that the judgment process involved the application of forms of knowing and modes of cognition to a range of evaluative tasks and depended on the nurses' critical skills, the time available, and the level of Web information cues. This article mainly focuses on the six evaluative tasks relating to assessing user-friendliness, outlook and authority of Web pages, and relationship to nursing practice; appraising the nature of evidence; and applying cross-checking strategies. The implications of these findings to nurse practitioners and publishers of nursing information are significant.

  13. International Markedsføring på World Wide Web

    DEFF Research Database (Denmark)

    Rask, Morten; Buch, Niels Jakob

    1999-01-01

This article takes as its starting point a group of Danish companies' use of the World Wide Web for international marketing in the period from 1996 to 1998. Three interaction types are identified for the companies' Web profiles, namely the Brochure, the Handbook and the Marketplace. Reflections are offered... on the demands that each interaction type might impose with regard to automation, formalisation, integration and evaluation. The conclusion is that the three interaction types reflect the challenges and opportunities in using the Web for marketing, primarily in an international perspective..., but they can also be used as input for national Web marketing activities....

  14. Student participation in World Wide Web-based curriculum development of general chemistry

    Science.gov (United States)

    Hunter, William John Forbes

    1998-12-01

This thesis describes an action research investigation of improvements to instruction in General Chemistry at Purdue University. Specifically, the study was conducted to guide continuous reform of curriculum materials delivered via the World Wide Web by involving students, instructors, and curriculum designers. The theoretical framework for this study was based upon constructivist learning theory, and knowledge claims were developed using an inductive analysis procedure. The results of this study are assertions made in three domains: learning chemistry content via the World Wide Web, learning about learning via the World Wide Web, and learning about participation in an action research project. In the chemistry content domain, students were able to learn chemical concepts that utilized 3-dimensional visualizations, but not textual and graphical information delivered via the Web. In the learning via the Web domain, the use of feedback, the placement of supplementary aids, navigation, and the perception of conceptual novelty were all important to students' use of the Web. In the participation in action research domain, students learned about the complexity of curriculum development and valued their empowerment as part of the process.

  15. Exploring Geology on the World-Wide Web--Volcanoes and Volcanism.

    Science.gov (United States)

    Schimmrich, Steven Henry; Gore, Pamela J. W.

    1996-01-01

    Focuses on sites on the World Wide Web that offer information about volcanoes. Web sites are classified into areas of Global Volcano Information, Volcanoes in Hawaii, Volcanoes in Alaska, Volcanoes in the Cascades, European and Icelandic Volcanoes, Extraterrestrial Volcanism, Volcanic Ash and Weather, and Volcano Resource Directories. Suggestions…

  16. Collaborative Design of World Wide Web Pages: A Case Study.

    Science.gov (United States)

    Andrew, Paige G; Musser, Linda R.

    1997-01-01

    This case study of the collaborative design of an earth science World Wide Web page at Pennsylvania State University highlights the role of librarians. Discusses the original Web site and links, planning, the intended audience, and redesign and recommended changes; and considers the potential contributions of librarians. (LRW)

  17. Role of Librarian in Internet and World Wide Web Environment

    OpenAIRE

    K. Nageswara Rao; KH Babu

    2001-01-01

The transition of traditional library collections to digital or virtual collections presented the librarian with new opportunities. The Internet, the Web environment and associated sophisticated tools have given the librarian a new dynamic role to play and serve the new information-based society in better ways than hitherto. Because of the powerful features of the Web, i.e. its distributed, heterogeneous, collaborative, multimedia, multi-protocol, hypermedia-oriented architecture, the World Wide Web has re...

  18. Interactivity, Information Processing, and Learning on the World Wide Web.

    Science.gov (United States)

    Tremayne, Mark; Dunwoody, Sharon

    2001-01-01

    Examines the role of interactivity in the presentation of science news on the World Wide Web. Proposes and tests a model of interactive information processing that suggests that characteristics of users and Web sites influence interactivity, which influences knowledge acquisition. Describes use of a think-aloud method to study participants' mental…

  19. Increasing efficiency of information dissemination and collection through the World Wide Web

    Science.gov (United States)

    Daniel P. Huebner; Malchus B. Baker; Peter F. Ffolliott

    2000-01-01

    Researchers, managers, and educators have access to revolutionary technology for information transfer through the World Wide Web (Web). Using the Web to effectively gather and distribute information is addressed in this paper. Tools, tips, and strategies are discussed. Companion Web sites are provided to guide users in selecting the most appropriate tool for searching...

  20. Introduction to the World Wide Web and Mosaic

    Science.gov (United States)

    Youngblood, Jim

    1994-01-01

    This tutorial provides an introduction to some of the terminology related to the use of the World Wide Web and Mosaic. It is assumed that the user has some prior computer experience. References are included to other sources of additional information.

  1. Grid-optimized Web 3D applications on wide area network

    Science.gov (United States)

    Wang, Frank; Helian, Na; Meng, Lingkui; Wu, Sining; Zhang, Wen; Guo, Yike; Parker, Michael Andrew

    2008-08-01

Geographical information systems have now entered the era of Web services. In this paper, Web3D applications have been developed based on our GridJet platform, which provides a more effective solution for massive 3D geo-dataset sharing in distributed environments. Web3D services enable web users to access the services as 3D scenes, virtual geographical environments and so on. However, Web3D services should be shared by thousands of users that are inherently distributed across different geographic locations. Large 3D geo-datasets need to be transferred to distributed clients via conventional HTTP, NFS and FTP protocols, which often encounters long waits and frustration in distributed wide area network environments. GridJet was used as the underlying engine between the Web3D application node and the geo-data server; it utilizes a wide range of technologies, including parallelized remote file access, which is a WAN/Grid-optimized protocol and provides "local-like" access to remote 3D geo-datasets. No change in the way of using software is required, since the multi-streamed GridJet protocol remains fully compatible with existing IP infrastructures. Our recent progress includes a real-world test in which Web3D applications such as Google Earth running over the GridJet protocol beat those running over the classic protocols by a factor of 2-7 where the transfer distance is over 10,000 km.
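
    The "parallelized remote file access" idea mentioned in the abstract can be illustrated, in spirit, by fetching a large remote file in several HTTP byte ranges concurrently. This sketch is not the GridJet protocol; the URL is a placeholder, and it assumes a server that honours HTTP Range requests and the third-party 'requests' package.

    ```python
    # Hedged sketch of the general idea of parallelising remote file access over
    # a wide-area link: fetch a large remote file in several byte ranges at once.
    # This illustrates the concept only; it is not the GridJet protocol, and the
    # URL is a placeholder. Requires the 'requests' package and a server that
    # supports HTTP Range requests.
    from concurrent.futures import ThreadPoolExecutor
    import requests

    URL = "https://example.org/large-geodata.bin"   # hypothetical 3D geo-dataset
    STREAMS = 4

    def fetch_range(args):
        start, end = args
        headers = {"Range": f"bytes={start}-{end}"}
        return requests.get(URL, headers=headers, timeout=60).content

    size = int(requests.head(URL, timeout=60).headers["Content-Length"])
    chunk = size // STREAMS
    ranges = [(i * chunk, size - 1 if i == STREAMS - 1 else (i + 1) * chunk - 1)
              for i in range(STREAMS)]

    with ThreadPoolExecutor(max_workers=STREAMS) as pool:
        data = b"".join(pool.map(fetch_range, ranges))
    print(f"downloaded {len(data)} bytes in {STREAMS} parallel streams")
    ```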

  2. Service Learning and Building Community with the World Wide Web

    Science.gov (United States)

    Longan, Michael W.

    2007-01-01

    The geography education literature touts the World Wide Web (Web) as a revolutionary educational tool, yet most accounts ignore its uses for public communication and creative expression. This article argues that students can be producers of content that is of service to local audiences. Drawing inspiration from the community networking movement,…

  3. Perspectives for Electronic Books in the World Wide Web Age.

    Science.gov (United States)

    Bry, Francois; Kraus, Michael

    2002-01-01

    Discusses the rapid growth of the World Wide Web and the lack of use of electronic books and suggests that specialized contents and device independence can make Web-based books compete with print. Topics include enhancing the hypertext model of XML; client-side adaptation, including browsers and navigation; and semantic modeling. (Author/LRW)

  4. Remote sensing education and Internet/World Wide Web technology

    Science.gov (United States)

    Griffith, J.A.; Egbert, S.L.

    2001-01-01

    Remote sensing education is increasingly in demand across academic and professional disciplines. Meanwhile, Internet technology and the World Wide Web (WWW) are being more frequently employed as teaching tools in remote sensing and other disciplines. The current wealth of information on the Internet and World Wide Web must be distilled, nonetheless, to be useful in remote sensing education. An extensive literature base is developing on the WWW as a tool in education and in teaching remote sensing. This literature reveals benefits and limitations of the WWW, and can guide its implementation. Among the most beneficial aspects of the Web are increased access to remote sensing expertise regardless of geographic location, increased access to current material, and access to extensive archives of satellite imagery and aerial photography. As with other teaching innovations, using the WWW/Internet may well mean more work, not less, for teachers, at least at the stage of early adoption. Also, information posted on Web sites is not always accurate. Development stages of this technology range from on-line posting of syllabi and lecture notes to on-line laboratory exercises and animated landscape flyovers and on-line image processing. The advantages of WWW/Internet technology may likely outweigh the costs of implementing it as a teaching tool.

  5. Radiation protection and environmental radioactivity. A voyage to the World Wide Web for beginners; Strahlenschutz und Umweltradioaktivitaet im Internet. Eine Reise in das World Wide Web fuer Anfaenger

    Energy Technology Data Exchange (ETDEWEB)

Weimer, S [Landesanstalt fuer Umweltschutz Baden-Wuerttemberg, Referat 'Umweltradioaktivitaet, Strahlenschutz' (Germany)

    1998-07-01

With the enormous growth of the Internet service 'World Wide Web' there has also been strong growth in the number of web sites related to radiation protection. An introduction is given to some practical basics of the WWW. The structure of WWW addresses and navigating through the web with hyperlinks are explained. Furthermore, some search engines are presented. The paper lists a number of WWW addresses of interesting sites with radiological protection information. (orig.) [German] With the rapid growth of the Internet service 'World Wide Web', the range of web pages in the field of radiation protection has also grown strongly. An introduction is given to the most important practical basics of the WWW. The structure of WWW addresses is explained, as is navigating with hyperlinks. In addition, some search engines are presented. The contribution provides a larger number of WWW addresses of interesting pages with radiation protection information. (orig.)

  6. World wide developments in shortwall and wide web mining techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pollard, T

    1975-11-01

    The paper describes the progress to date with continuous pillar extraction, and how the typical longwall powered support has been modified to be both strong enough and stable enough to provide roof support for very wide webs. It also describes the operating systems which have been specially designed. The next stages of development are discussed, particularly the provision of continuous conveyor haulage in place of the present-day shuttle car. The author suggests that marrying American coal-getting technology and British roof support technology might increase productivity.

  7. The world wide web: exploring a new advertising environment.

    Science.gov (United States)

    Johnson, C R; Neath, I

    1999-01-01

    The World Wide Web currently boasts millions of users in the United States alone and is likely to continue to expand both as a marketplace and as an advertising environment. Three experiments explored advertising in the Web environment, in particular memory for ads as they appear in everyday use across the Web. Experiments 1 and 2 examined the effect of advertising repetition on the retention of familiar and less familiar brand names, respectively. Experiment 1 demonstrated that repetition of a banner ad within multiple web pages can improve recall of familiar brand names, and Experiment 2 demonstrated that repetition can improve recognition of less familiar brand names. Experiment 3 directly compared the retention of familiar and less familiar brand names that were promoted by static and dynamic ads and demonstrated that the use of dynamic advertising can increase brand name recall, though only for familiar brand names. This study also demonstrated that, in the Web environment, much as in other advertising environments, familiar brand names possess a mnemonic advantage not possessed by less familiar brand names. Finally, data regarding Web usage gathered from all experiments confirm reports that Web usage among males tends to exceed that among females.

  8. Information on infantile colic on the World Wide Web.

    Science.gov (United States)

    Bailey, Shana D; D'Auria, Jennifer P; Haushalter, Jamie P

    2013-01-01

    The purpose of this study was to explore and describe the type and quality of information on infantile colic that a parent might access on the World Wide Web. Two checklists were used to evaluate the quality indicators of 24 Web sites and the colic-specific content. Fifteen health information Web sites met more of the quality parameters than the nine commercial sites. Eight Web sites included information about colic and infant abuse, with six being health information sites. The colic-specific content on 24 Web sites reflected current issues and controversies; however, the completeness of the information in light of current evidence varied among the Web sites. Strategies to avoid complications of parental stress or infant abuse were not commonly found on the Web sites. Pediatric professionals must guide parents to reliable colic resources that also include emotional support and understanding of infant crying. A best evidence guideline for the United States would eliminate confusion and uncertainty about which colic therapies are safe and effective for parents and professionals. Copyright © 2013 National Association of Pediatric Nurse Practitioners. Published by Mosby, Inc. All rights reserved.

  9. Medical mentoring via the evolving world wide web.

    Science.gov (United States)

    Jaffer, Usman; Vaughan-Huxley, Eyston; Standfield, Nigel; John, Nigel W

    2013-01-01

Mentoring, for physicians and surgeons in training, is advocated as an essential adjunct in work-based learning, providing support in career and non-career related issues. The World Wide Web (WWW) has evolved, as a technology, to become more interactive and person-centric, tailoring itself to the individual needs of the user. This changing technology may open new avenues to foster mentoring in medicine. DESIGN, SYSTEMATIC REVIEW, MAIN OUTCOME MEASURES: A search of the MEDLINE database from 1950 to 2012 using the PubMed interface, combined with manual cross-referencing, was performed using the following strategy: ("mentors"[MeSH Terms] OR "mentors"[All Fields] OR "mentor"[All Fields]) AND ("internet"[MeSH Terms] OR "internet"[All Fields]) AND ("medicine"[MeSH Terms] OR "medicine"[All Fields]) AND ("humans"[MeSH Terms] AND English[lang]). Abstracts were screened for relevance (UJ) to the topic; eligibility for inclusion was based simply on screening for relevance to online mentoring and web-based technologies. Forty-five papers were found, of which 16 were relevant. All studies were observational in nature. To date, all medical mentoring applications utilizing the World Wide Web have enjoyed some success, limited by Web 1.0 and 2.0 technologies. With the evolution of the WWW through 1.0, 2.0 and 3.0 generations, the potential for meaningful tele- and distance mentoring has greatly improved. Some engagement has been made with these technological advancements; however, further work is required to fully realize the potential of these technologies. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. Integrating Mathematics, Science, and Language Arts Instruction Using the World Wide Web.

    Science.gov (United States)

    Clark, Kenneth; Hosticka, Alice; Kent, Judi; Browne, Ron

    1998-01-01

    Addresses issues of access to World Wide Web sites, mathematics and science content-resources available on the Web, and methods for integrating mathematics, science, and language arts instruction. (Author/ASK)

  11. Basic support for cooperative work on the World Wide Web

    NARCIS (Netherlands)

    Bentley, R.; Appelt, W.; Busbach, U.; Hinrichs, E.; Kerr, D.; Sikkel, Nicolaas; Trevor, J.; Woetzel, G.

    The emergence and widespread adoption of the World Wide Web offers a great deal of potential in supporting cross-platform cooperative work within widely dispersed working groups. The Basic Support for Cooperative Work (BSCW) project at GMD is attempting to realize this potential through development

  12. Advanced use of World-Wide Web in the online system of DELPHI

    International Nuclear Information System (INIS)

    Doenszelmann, M.; Carvalho, D.; Du, S.; Tennebo, F.

    1996-01-01

    World-Wide Web technologies are used by the DELPHI experiment at CERN to provide easy access to information from the On-line System. WWW technology on both the client and server side is used in five different projects. The World-Wide Web has its advantages concerning the network technology, the practical user interface and its scalability. It does, however, also demand a stateless protocol and format negotiation. (author)

  13. Meeting the challenge of finding resources for ophthalmic nurses on the World Wide Web.

    Science.gov (United States)

    Duffel, P G

    1998-12-01

    The World Wide Web ("the Web") is a macrocosm of resources that can be overwhelming. Often the sheer volume of material available causes one to give up in despair before finding information of any use. The Web is such a popular resource that it cannot be ignored. Two of the biggest challenges to finding good information on the Web are knowing where to start and judging whether the information gathered is pertinent and credible. This article addresses these two challenges and introduces the reader to a variety of ophthalmology and vision science resources on the World Wide Web.

  14. How Commercial Banks Use the World Wide Web: A Content Analysis.

    Science.gov (United States)

    Leovic, Lydia K.

    New telecommunications vehicles expand the possible ways that business is conducted. The hypermedia portion of the Internet, the World Wide Web, is such a telecommunications device. The Web is presently one of the most flexible and dynamic methods for electronic information dissemination. The level of technological sophistication necessary to…

  15. System configuration on Web with mashup.

    OpenAIRE

    清水, 宏泰; SHIMIZU, Hiroyasu

    2014-01-01

    Mashups have become a popular way to create Web services as cloud services have spread. A mashup is a method of creating a Web service from several existing Web services and APIs. Mashups have a few problems; one of them is the difference in data formats and labels, which the Semantic Web can resolve. This paper proposes a method of building a system on the Web with mashups using Semantic Web technologies. A mashup system configuration can be expressed as a URL, so editing the URL of a mashup means editing the system configuration. And any device can use this system on ...
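
    Since the abstract's central idea is that a mashup's configuration can be expressed as a URL, here is a minimal sketch of that idea (my own illustration; the service endpoints are invented):

        # Sketch only: a mashup configuration expressed as a URL, so editing the
        # URL edits which services are composed (endpoints below are invented).
        from urllib.parse import urlencode, urlparse, parse_qs

        config = {
            "map": "https://maps.example.org/api",
            "weather": "https://weather.example.org/api",
            "format": "json",
        }
        mashup_url = "https://mashup.example.org/run?" + urlencode(config)
        print(mashup_url)

        # Any device can reconstruct the same system configuration from the URL alone.
        restored = {k: v[0] for k, v in parse_qs(urlparse(mashup_url).query).items()}
        assert restored == config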

  16. 40 CFR 63.825 - Standards: Product and packaging rotogravure and wide-web flexographic printing.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment; National Emission Standards for Hazardous Air Pollutants for Source Categories (Continued); National Emission Standards for the Printing and Publishing Industry; § 63.825 Standards: Product and packaging rotogravure and wide-web flexographic printing.

  17. Lithuanian on-line periodicals on the World Wide Web

    Directory of Open Access Journals (Sweden)

    Lina Sarlauskiene

    2001-01-01

    Full Text Available Deals with Lithuanian full-text electronic periodicals distributed through the World Wide Web. An electronic periodical is usually defined as a regular publication on some particular topic distributed in digital form, chiefly through the Web, but also by electronic mail or digital disk. The author has surveyed 106 publications. Thirty-four are distributed only on the Web, and 72 have printed versions. The number of analysed publications is not very big, but four years of electronic publishing and the variety of periodicals enables us to establish the causes of this phenomenon, the main features of development, and some perspectives. Electronic periodicals were analysed according to their type, purpose, contents, publisher, regularity, language, starting date and place of publication, and other features.

  18. Tim Berners-Lee: inventor de la World Wide Web

    OpenAIRE

    Universidad de Granada. Biblioteca

    2015-01-01

    This catalogue contains the exhibition organized by the Library of the ETSIIT of the Universidad de Granada during the months of November-December 2015, entitled "Tim Berners-Lee: inventor de la World Wide Web".

  19. CRISPR-FOCUS: A web server for designing focused CRISPR screening experiments

    OpenAIRE

    Cao, Qingyi; Ma, Jian; Chen, Chen-Hao; Xu, Han; Chen, Zhi; Li, Wei; Liu, X. Shirley

    2017-01-01

    The recently developed CRISPR screen technology, based on the CRISPR/Cas9 genome editing system, enables genome-wide interrogation of gene functions in an efficient and cost-effective manner. Although many computational algorithms and web servers have been developed to design single-guide RNAs (sgRNAs) with high specificity and efficiency, algorithms specifically designed for conducting CRISPR screens are still lacking. Here we present CRISPR-FOCUS, a web-based platform to search and prioriti...

  20. Usare WebDewey

    OpenAIRE

    Baldi, Paolo

    2016-01-01

    This presentation shows how to use the WebDewey tool. Features of WebDewey. Italian WebDewey compared with American WebDewey. Querying Italian WebDewey. Italian WebDewey and MARC21. Italian WebDewey and UNIMARC. Numbers, captions, "equivalente verbale": Dewey decimal classification in Italian catalogues. Italian WebDewey and Nuovo soggettario. Italian WebDewey and LCSH. Italian WebDewey compared with printed version of Italian Dewey Classification (22. edition): advantages and disadvantages o...

  1. Consécration pour les Inventeurs du World-Wide Web

    CERN Multimedia

    CERN Press Office. Geneva

    1996-01-01

    Nearly seven years after it was invented at CERN, the World-Wide Web has woven its way into every corner of the Internet. On Saturday, 17 February, the inventors of the Web, Tim Berners-Lee, now at Massachusetts Institute of Technology (MIT), and Robert Cailliau of CERN's Electronics and Computing for Physics (ECP) Division, will be honoured with one of computing's highest distinctions: the Association for Computing (ACM) Software System Award 1995.

  2. Educational use of World Wide Web pages on CD-ROM.

    Science.gov (United States)

    Engel, Thomas P; Smith, Michael

    2002-01-01

    The World Wide Web is increasingly important for medical education. Internet served pages may also be used on a local hard disk or CD-ROM without a network or server. This allows authors to reuse existing content and provide access to users without a network connection. CD-ROM offers several advantages over network delivery of Web pages for several applications. However, creating Web pages for CD-ROM requires careful planning. Issues include file names, relative links, directory names, default pages, server created content, image maps, other file types and embedded programming. With care, it is possible to create server based pages that can be copied directly to CD-ROM. In addition, Web pages on CD-ROM may reference Internet served pages to provide the best features of both methods.

  3. WEB-DL endovascular treatment of wide-neck bifurcation aneurysms

    DEFF Research Database (Denmark)

    Lubicz, B; Klisch, J; Gauvrit, J-Y

    2014-01-01

    BACKGROUND AND PURPOSE: Flow disruption with the WEB-DL device has been used safely for the treatment of wide-neck bifurcation aneurysms, but the stability of aneurysm occlusion after this treatment is unknown. This retrospective multicenter European study analyzed short- and midterm data in patients treated with WEB-DL. MATERIALS AND METHODS: Twelve European neurointerventional centers participated in the study. Clinical data and pre- and postoperative short- and midterm images were collected. An experienced interventional neuroradiologist independently analyzed the images. Aneurysm occlusion was classified into 4 grades: complete occlusion, opacification of the proximal recess of the device, neck remnant, and aneurysm remnant. RESULTS: Forty-five patients (34 women and 11 men) 35-74 years of age (mean, 56.3 ± 9.6 years) with 45 aneurysms treated with the WEB device were included. Aneurysm locations...

  4. Role of Librarian in Internet and World Wide Web Environment

    Directory of Open Access Journals (Sweden)

    K. Nageswara Rao

    2001-01-01

    Full Text Available The transition of traditional library collections to digital or virtual collections presented the librarian with new opportunities. The Internet, Web environment and associated sophisticated tools have given the librarian a new dynamic role to play and serve the new information based society in better ways than hitherto. Because of the powerful features of Web i.e. distributed, heterogeneous, collaborative, multimedia, multi-protocol, hypermedia-oriented architecture, World Wide Web has revolutionized the way people access information, and has opened up new possibilities in areas such as digital libraries, virtual libraries, scientific information retrieval and dissemination. Not only the world is becoming interconnected, but also the use of Internet and Web has changed the fundamental roles, paradigms, and organizational culture of libraries and librarians as well. The article describes the limitless scope of Internet and Web, the existence of the librarian in the changing environment, parallelism between information science and information technology, librarians and intelligent agents, working of intelligent agents, strengths, weaknesses, threats and opportunities involved in the relationship between librarians and the Web. The role of librarian in Internet and Web environment especially as intermediary, facilitator, end-user trainer, Web site builder, researcher, interface designer, knowledge manager and sifter of information resources is also described.

  5. Multi-dimensional effects of color on the world wide web

    Science.gov (United States)

    Morton, Jill

    2002-06-01

    Color is the most powerful building material of visual imagery on the World Wide Web. It must function successfully as it has done historically in traditional two-dimensional media, as well as address new challenges presented by this electronic medium. The psychological, physiological, technical and aesthetic effects of color have been redefined by the unique requirements of the electronic transmission of text and images on the Web. Color simultaneously addresses each of these dimensions in this electronic medium.

  6. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMM and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both HMMEditor software and web service are freely available.
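
    To make the Viterbi-path feature concrete, here is a toy sketch (independent of HMMEditor's actual implementation) that computes a Viterbi state path for a miniature two-state HMM over DNA symbols; a real profile HMM has match, insert and delete states per column, but the dynamic-programming recursion is the same in spirit:

        # Toy Viterbi decoding for a two-state HMM; illustrative only.
        import math

        states = ["match", "insert"]
        start = {"match": 0.5, "insert": 0.5}
        trans = {"match": {"match": 0.9, "insert": 0.1},
                 "insert": {"match": 0.4, "insert": 0.6}}
        emit = {"match": {"A": 0.4, "C": 0.1, "G": 0.1, "T": 0.4},
                "insert": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}}

        def viterbi(seq):
            # v[i][s] = best log-probability of emitting seq[:i+1] and ending in state s
            v = [{s: math.log(start[s] * emit[s][seq[0]]) for s in states}]
            back = []
            for x in seq[1:]:
                row, ptr = {}, {}
                for s in states:
                    prev, score = max(((p, v[-1][p] + math.log(trans[p][s])) for p in states),
                                      key=lambda t: t[1])
                    row[s] = score + math.log(emit[s][x])
                    ptr[s] = prev
                v.append(row)
                back.append(ptr)
            last = max(states, key=lambda s: v[-1][s])
            path = [last]
            for ptr in reversed(back):
                path.append(ptr[path[-1]])
            return list(reversed(path))

        print(viterbi("ACGTTA"))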

  7. Accessing NASA Technology with the World Wide Web

    Science.gov (United States)

    Nelson, Michael L.; Bianco, David J.

    1995-01-01

    NASA Langley Research Center (LaRC) began using the World Wide Web (WWW) in the summer of 1993, becoming the first NASA installation to provide a Center-wide home page. This coincided with a reorganization of LaRC to provide a more concentrated focus on technology transfer to both aerospace and non-aerospace industry. Use of WWW and NCSA Mosaic not only provides automated information dissemination, but also allows for the implementation, evolution and integration of many technology transfer and technology awareness applications. This paper describes several of these innovative applications, including the on-line presentation of the entire Technology OPportunities Showcase (TOPS), an industrial partnering showcase that exists on the Web long after the actual 3-day event ended. The NASA Technical Report Server (NTRS) provides uniform access to many logically similar, yet physically distributed NASA report servers. WWW is also the foundation of the Langley Software Server (LSS), an experimental software distribution system which will distribute LaRC-developed software. In addition to the more formal technology distribution projects, WWW has been successful in connecting people with technologies and people with other people.

  8. PENYEBARAN INFORMASI MENGGUNAKAN WWW (WORLD WIDE WEB

    Directory of Open Access Journals (Sweden)

    Ika Atman Satya

    2011-12-01

    Full Text Available Traditionally, we have known information media in the form of newspapers, television, radio and reference books. Distributing information through these media requires supporting infrastructure so that the information can be disseminated broadly. Besides these traditional media, the dissemination of information over the Internet computer network is also growing. One means of disseminating information is the WWW (World Wide Web) application, which can combine images, text and sound interactively. This paper discusses the capabilities, use and development of a WWW server.

  9. Finding Web-Based Anxiety Interventions on the World Wide Web: A Scoping Review.

    Science.gov (United States)

    Ashford, Miriam Thiel; Olander, Ellinor K; Ayers, Susan

    2016-06-01

    One relatively new and increasingly popular approach of increasing access to treatment is Web-based intervention programs. The advantage of Web-based approaches is the accessibility, affordability, and anonymity of potentially evidence-based treatment. Despite much research evidence on the effectiveness of Web-based interventions for anxiety found in the literature, little is known about what is publically available for potential consumers on the Web. Our aim was to explore what a consumer searching the Web for Web-based intervention options for anxiety-related issues might find. The objectives were to identify currently publically available Web-based intervention programs for anxiety and to synthesize and review these in terms of (1) website characteristics such as credibility and accessibility; (2) intervention program characteristics such as intervention focus, design, and presentation modes; (3) therapeutic elements employed; and (4) published evidence of efficacy. Web keyword searches were carried out on three major search engines (Google, Bing, and Yahoo-UK platforms). For each search, the first 25 hyperlinks were screened for eligible programs. Included were programs that were designed for anxiety symptoms, currently publically accessible on the Web, had an online component, a structured treatment plan, and were available in English. Data were extracted for website characteristics, program characteristics, therapeutic characteristics, as well as empirical evidence. Programs were also evaluated using a 16-point rating tool. The search resulted in 34 programs that were eligible for review. A wide variety of programs for anxiety, including specific anxiety disorders, and anxiety in combination with stress, depression, or anger were identified and based predominantly on cognitive behavioral therapy techniques. The majority of websites were rated as credible, secure, and free of advertisement. The majority required users to register and/or to pay a program access

  10. White Supremacists, Oppositional Culture and the World Wide Web

    Science.gov (United States)

    Adams, Josh; Roscigno, Vincent J.

    2005-01-01

    Over the previous decade, white supremacist organizations have tapped into the ever emerging possibilities offered by the World Wide Web. Drawing from prior sociological work that has examined this medium and its uses by white supremacist organizations, this article advances the understanding of recruitment, identity and action by providing a…

  11. Web Apollo: a web-based genomic annotation editing platform.

    Science.gov (United States)

    Lee, Eduardo; Helt, Gregg A; Reese, Justin T; Munoz-Torres, Monica C; Childers, Chris P; Buels, Robert M; Stein, Lincoln; Holmes, Ian H; Elsik, Christine G; Lewis, Suzanna E

    2013-08-30

    Web Apollo is the first instantaneous, collaborative genomic annotation editor available on the web. One of the natural consequences following from current advances in sequencing technology is that there are more and more researchers sequencing new genomes. These researchers require tools to describe the functional features of their newly sequenced genomes. With Web Apollo researchers can use any of the common browsers (for example, Chrome or Firefox) to jointly analyze and precisely describe the features of a genome in real time, whether they are in the same room or working from opposite sides of the world.

  12. Growth and structure of the World Wide Web: Towards realistic modeling

    Science.gov (United States)

    Tadić, Bosiljka

    2002-08-01

    We simulate evolution of the World Wide Web from the dynamic rules incorporating growth, bias attachment, and rewiring. We show that the emergent double-hierarchical structure with distinct distributions of out- and in-links is comparable with the observed empirical data when the control parameter (average graph flexibility β) is kept in the range β=3-4. We then explore the Web graph by simulating (a) Web crawling to determine size and depth of connected components, and (b) a random walker that discovers the structure of connected subgraphs with dominant attractor and promoter nodes. A random walker that adapts its move strategy to mimic local node linking preferences is shown to have a short access time to "important" nodes on the Web graph.
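
    A rough sketch (my own illustration, not the paper's code or parameterization) of the kind of growth-with-preferential-attachment-and-rewiring dynamics the abstract describes:

        # Illustrative only: directed-graph growth with in-degree-biased attachment
        # and occasional rewiring of existing in-links.
        import random

        def grow_web(n_nodes, out_links=2, rewire_prob=0.2, seed=1):
            random.seed(seed)
            in_degree = {0: 0}
            endpoints = []              # one entry per received in-link
            for new in range(1, n_nodes):
                for _ in range(out_links):
                    # preferential attachment: proportional to in-degree, with +1 smoothing
                    dest = random.choice(endpoints + list(in_degree))
                    in_degree[dest] += 1
                    endpoints.append(dest)
                in_degree[new] = 0
                # crude rewiring: move one existing in-link to a different random node
                if endpoints and random.random() < rewire_prob:
                    i = random.randrange(len(endpoints))
                    in_degree[endpoints[i]] -= 1
                    endpoints[i] = random.choice(list(in_degree))
                    in_degree[endpoints[i]] += 1
            return in_degree

        deg = grow_web(2000)
        print("largest in-degrees:", sorted(deg.values(), reverse=True)[:5])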

  13. REDIdb 3.0: A Comprehensive Collection of RNA Editing Events in Plant Organellar Genomes.

    Science.gov (United States)

    Lo Giudice, Claudio; Pesole, Graziano; Picardi, Ernesto

    2018-01-01

    RNA editing is an important epigenetic mechanism by which genome-encoded transcripts are modified by substitutions, insertions and/or deletions. It was first discovered in kinetoplastid protozoa followed by its reporting in a wide range of organisms. In plants, RNA editing occurs mostly by cytidine (C) to uridine (U) conversion in translated regions of organelle mRNAs and tends to modify affected codons restoring evolutionary conserved aminoacid residues. RNA editing has also been described in non-protein coding regions such as group II introns and structural RNAs. Despite its impact on organellar transcriptome and proteome complexity, current primary databases still do not provide a specific field for RNA editing events. To overcome these limitations, we developed REDIdb a specialized database for RNA editing modifications in plant organelles. Hereafter we describe its third release containing more than 26,000 events in a completely novel web interface to accommodate RNA editing in its genomics, biological and evolutionary context through whole genome maps and multiple sequence alignments. REDIdb is freely available at http://srv00.recas.ba.infn.it/redidb/index.html.
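
    As a minimal illustration (not drawn from REDIdb itself) of how a single C-to-U edit can restore a conserved residue, consider one codon before and after editing:

        # Illustrative only: a C-to-U edit at codon position 2 turns a serine codon
        # into the leucine codon conserved in related species.
        CODON = {"UCA": "Ser", "UUA": "Leu"}

        def edit_c_to_u(transcript, position):
            assert transcript[position] == "C", "editing site must be a cytidine"
            return transcript[:position] + "U" + transcript[position + 1:]

        pre = "UCA"                  # genome-encoded codon
        post = edit_c_to_u(pre, 1)   # edited transcript codon
        print(pre, CODON[pre], "->", post, CODON[post])   # UCA Ser -> UUA Leu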

  14. REDIdb 3.0: A Comprehensive Collection of RNA Editing Events in Plant Organellar Genomes

    Directory of Open Access Journals (Sweden)

    Claudio Lo Giudice

    2018-04-01

    Full Text Available RNA editing is an important epigenetic mechanism by which genome-encoded transcripts are modified by substitutions, insertions and/or deletions. It was first discovered in kinetoplastid protozoa followed by its reporting in a wide range of organisms. In plants, RNA editing occurs mostly by cytidine (C) to uridine (U) conversion in translated regions of organelle mRNAs and tends to modify affected codons restoring evolutionary conserved aminoacid residues. RNA editing has also been described in non-protein coding regions such as group II introns and structural RNAs. Despite its impact on organellar transcriptome and proteome complexity, current primary databases still do not provide a specific field for RNA editing events. To overcome these limitations, we developed REDIdb a specialized database for RNA editing modifications in plant organelles. Hereafter we describe its third release containing more than 26,000 events in a completely novel web interface to accommodate RNA editing in its genomics, biological and evolutionary context through whole genome maps and multiple sequence alignments. REDIdb is freely available at http://srv00.recas.ba.infn.it/redidb/index.html

  15. Touring the Campus Library from the World Wide Web.

    Science.gov (United States)

    Mosley, Pixey Anne; Xiao, Daniel

    1996-01-01

    The philosophy, design, implementation and evaluation of a World Wide Web-accessible Virtual Library Tour of Texas A & M University's Evans Library is presented. Its design combined technical computer issues and library instruction expertise. The tour can be used to simulate a typical walking tour through the library or heading directly to a…

  16. Distributing Congestion Management System Information Using the World Wide Web

    Science.gov (United States)

    1997-01-01

    The Internet is a unique medium for the distribution of information, and it provides a tremendous opportunity to take advantage of people's innate interest in transportation issues as they relate to their own lives. In particular, the World Wide Web (...

  17. Molecular structure input on the web

    Directory of Open Access Journals (Sweden)

    Ertl Peter

    2010-02-01

    Full Text Available Abstract A molecule editor, that is, a program for input and editing of molecules, is an indispensable part of every cheminformatics or molecular processing system. This review focuses on a special type of molecule editor, namely those that are used for molecule structure input on the web. Scientific computing is now moving more and more in the direction of web services and cloud computing, with servers scattered all around the Internet. Thus a web browser has become the universal scientific user interface, and a tool to edit molecules directly within the web browser is essential. The review covers a history of web-based structure input, starting with simple text entry boxes and early molecule editors based on clickable maps, before moving to the current situation dominated by Java applets. One typical example - the popular JME Molecule Editor - will be described in more detail. Modern Ajax server-side molecule editors are also presented. And finally, the possible future direction of web-based molecule editing, based on technologies like JavaScript and Flash, is discussed.
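
    As a hedged server-side sketch of what happens after a web molecule editor submits a structure (assuming, for illustration only, that the editor exports a SMILES string and that the RDKit toolkit is available on the server):

        # Illustrative only: validating and canonicalising a SMILES string received
        # from a web-based molecule editor (RDKit is an assumed dependency).
        from rdkit import Chem

        def validate_smiles(smiles):
            mol = Chem.MolFromSmiles(smiles)   # returns None for an invalid string
            if mol is None:
                return None
            return Chem.MolToSmiles(mol)       # canonical form for storage and lookup

        print(validate_smiles("c1ccccc1O"))      # phenol, canonicalised
        print(validate_smiles("not-a-smiles"))   # None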

  18. The World-Wide Web past present and future, and its application to medicine

    CERN Document Server

    Sendall, D M

    1997-01-01

    The World-Wide Web was first developed as a tool for collaboration in the high energy physics community. From there it spread rapidly to other fields, and grew to its present impressive size. As an easy way to access information, it has been a great success, and a huge number of medical applications have taken advantage of it. But there is another side to the Web, its potential as a tool for collaboration between people. Medical examples include telemedicine and teaching. New technical developments offer still greater potential in medical and other fields. This paper gives some background to the early development of the World-Wide Web, a brief overview of its present state with some examples relevant to medicine, and a look at the future.

  19. Business, Government, and Law on the Internet. A Hands-On Second Edition. Workshop. Internet Workshop Series Number 3.

    Science.gov (United States)

    Peete, Gary R.

    This "workshop-in-a-book" is a much-expanded second edition designed for the businessperson, legal researcher, information specialist, consumer, student, or scholar wanting to discover information in three overlapping areas: business, government, and law. The book is divided into two modules: (1) "The World Wide Web: Your Entree to…

  20. Wikinews interviews World Wide Web co-inventor Robert Cailliau

    CERN Multimedia

    2007-01-01

    "The name Robert Caillau may not ring a bell to the general pbulic, but his invention is the reason why you are reading this: Dr. Cailliau together with his colleague Sir Tim Berners-Lee invented the World Wide Web, making the internet accessible so it could grow from an academic tool to a mass communication medium." (9 pages)

  1. Collaborative Information Agents on the World Wide Web

    Science.gov (United States)

    Chen, James R.; Mathe, Nathalie; Wolfe, Shawn; Koga, Dennis J. (Technical Monitor)

    1998-01-01

    In this paper, we present DIAMS, a system of distributed, collaborative information agents which help users access, collect, organize, and exchange information on the World Wide Web. Personal agents provide their owners dynamic displays of well organized information collections, as well as friendly information management utilities. Personal agents exchange information with one another. They also work with other types of information agents such as matchmakers and knowledge experts to facilitate collaboration and communication.

  2. The Land of Confusion? High School Students and Their Use of the World Wide Web for Research.

    Science.gov (United States)

    Lorenzen, Michael

    2002-01-01

    Examines high school students' use of the World Wide Web to complete assignments. Findings showed the students used a good variety of resources, including libraries and the World Wide Web, to find information for assignments. However, students were weak at determining the quality of the information found on web sites. Students did poorly at…

  3. Wood Utilization Research Dissemination on the World Wide Web: A Case Study

    Science.gov (United States)

    Daniel L. Schmoldt; Matthew F. Winn; Philip A. Araman

    1997-01-01

    Because many research products are informational rather than tangible, emerging information technologies, such as the multi-media format of the World Wide Web, provide an open and easily accessible mechanism for transferring research to user groups. We have found steady, increasing use of our Web site over the first 6-1/2 months of operation; almost one-third of the...

  4. Statistical Analysis with Webstat, a Java applet for the World Wide Web

    Directory of Open Access Journals (Sweden)

    Webster West

    1997-09-01

    Full Text Available The Java programming language has added a new tool for delivering computing applications over the World Wide Web (WWW). WebStat is a new computing environment for basic statistical analysis which is delivered in the form of a Java applet. Anyone with WWW access and a Java capable browser can access this new analysis environment. Along with an overall introduction of the environment, the main features of this package are illustrated, and the prospect of using basic WebStat components for more advanced applications is discussed.

  5. A genome-wide map of hyper-edited RNA reveals numerous new sites

    Science.gov (United States)

    Porath, Hagit T.; Carmi, Shai; Levanon, Erez Y.

    2014-01-01

    Adenosine-to-inosine editing is one of the most frequent post-transcriptional modifications, manifested as A-to-G mismatches when comparing RNA sequences with their source DNA. Recently, a number of RNA-seq data sets have been screened for the presence of A-to-G editing, and hundreds of thousands of editing sites identified. Here we show that existing screens missed the majority of sites by ignoring reads with excessive (‘hyper’) editing that do not easily align to the genome. We show that careful alignment and examination of the unmapped reads in RNA-seq studies reveal numerous new sites, usually many more than originally discovered, and in precisely those regions that are most heavily edited. Specifically, we discover 327,096 new editing sites in the heavily studied Illumina Human BodyMap data and more than double the number of detected sites in several published screens. We also identify thousands of new sites in mouse, rat, opossum and fly. Our results establish that hyper-editing events account for the majority of editing sites. PMID:25158696
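
    A simplified sketch (assumptions only; the published pipeline involves transforming and realigning the unmapped reads) of the core test: flag a read as hyper-edited when it carries several mismatches against the genome and nearly all of them are A-to-G:

        # Illustrative only: a read is called hyper-edited if it has enough
        # mismatches and almost all of them are A (genome) -> G (read).
        def is_hyper_edited(read, reference, min_mismatches=5, min_ag_fraction=0.9):
            assert len(read) == len(reference)
            mismatches = [(g, r) for g, r in zip(reference, read) if g != r]
            if len(mismatches) < min_mismatches:
                return False
            a_to_g = sum(1 for g, r in mismatches if g == "A" and r == "G")
            return a_to_g / len(mismatches) >= min_ag_fraction

        ref  = "TTACAGATAACCGATAATGA"
        read = "TTGCAGGTAGCCGGTAGTGA"   # every mismatch here is A -> G
        print(is_hyper_edited(read, ref))   # True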

  6. Histology on the World Wide Web: A Digest of Resources for Students and Teachers.

    Science.gov (United States)

    Cotter, John R.

    1997-01-01

    Provides a list of 37 World Wide Web sites that are devoted to instruction in histology and include electronic manuals, syllabi, atlases, image galleries, and quizzes. Reviews the topics, content, and highlights of these Web sites. (DDR)

  7. Gender Equity in Advertising on the World-Wide Web: Can it be Found?

    Science.gov (United States)

    Kramer, Kevin M.; Knupfer, Nancy Nelson

    Recent attention to gender equity in computer environments, as well as in print-based and televised advertising for technological products, suggests that gender bias in the computer environment continues. This study examined gender messages within World Wide Web advertisements, specifically the type and number of visual images used in Web banner…

  8. CRISPR-RT: A web service for designing CRISPR-C2c2 crRNA with improved target specificity

    OpenAIRE

    Zhu, Houxiang; Richmond, Emily; Liang, Chun

    2017-01-01

    CRISPR-Cas systems have been successfully applied in genome editing. Recently, the CRISPR-C2c2 system has been reported as a tool for RNA editing. Here we describe CRISPR-RT (CRISPR RNA-Targeting), the first web service to help biologists design the crRNA with improved target specificity for the CRISPR-C2c2 system. CRISPR-RT allows users to set up a wide range of parameters, making it highly flexible for current and future research in CRISPR-based RNA editing. CRISPR-RT covers major model org...
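
    Purely as an illustration of the underlying search problem (not CRISPR-RT's actual algorithm or parameters), a sketch that enumerates candidate spacer sequences along a target transcript and discards any with an exact match in other transcripts:

        # Illustrative only: naive crRNA spacer enumeration with an exact-match
        # off-target filter; real tools also score mismatches and accessibility.
        def candidate_spacers(target, off_targets, spacer_len=28):
            keep = []
            for i in range(len(target) - spacer_len + 1):
                spacer = target[i:i + spacer_len]
                if not any(spacer in other for other in off_targets):
                    keep.append((i, spacer))
            return keep

        target = "AUGGCUAGCUAGGCUUACGGAUCCGAUUAGCGGAUCCUUAGC"
        others = ["AUGGCUAGCUAGGCUUACGG", "CCGGAUUAGCGG"]
        for pos, spacer in candidate_spacers(target, others)[:3]:
            print(pos, spacer)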

  9. The online discourse on the Demjanjuk trial. New memory practices on the World Wide Web?

    Directory of Open Access Journals (Sweden)

    Vivien SOMMER

    2012-01-01

    Full Text Available In this article I want to discuss whether and how the World Wide Web changes social memory practices. Therefore I examine the relationship between the World Wide Web, social memory practices and public discourses. To discuss mediated memory processes, I focus on the online discourse about the trial against the former concentration camp guard John Demjanjuk.

  10. Infant Gastroesophageal Reflux Information on the World Wide Web.

    Science.gov (United States)

    Balgowan, Regina; Greer, Leah C; D'Auria, Jennifer P

    2016-01-01

    The purpose of this study was to describe the type and quality of health information about infant gastroesophageal reflux (GER) that a parent may find on the World Wide Web. The data collection tool included evaluation of Web site quality and infant GER-specific content on the 30 sites that met the inclusion criteria. The most commonly found content categories in order of frequency were management strategies, when to call a primary care provider, definition, and clinical features. The most frequently mentioned strategies included feeding changes, infant positioning, and medications. Thirteen of the 30 Web sites included information on both GER and gastroesophageal reflux disease. Mention of the use of medication to lessen infant symptoms was found on 15 of the 30 sites. Only 10 of the 30 sites included information about parent support and coping strategies. Pediatric nurse practitioners (PNPs) should utilize well-child visits to address the normalcy of physiologic infant GER and clarify any misperceptions parents may have about diagnosis and the role of medication from information they may have found on the Internet. It is critical for PNPs to assist in the development of Web sites with accurate content, advise parents on how to identify safe and reliable information, and provide examples of high-quality Web sites about child health topics such as infant GER. Copyright © 2016 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.

  11. Increasing public understanding of transgenic crops through the World Wide Web.

    Science.gov (United States)

    Byrne, Patrick F; Namuth, Deana M; Harrington, Judy; Ward, Sarah M; Lee, Donald J; Hain, Patricia

    2002-07-01

    Transgenic crops are among the most controversial "science and society" issues of recent years. Because of the complex techniques involved in creating these crops and the polarized debate over their risks and benefits, a critical need has arisen for accessible and balanced information on this technology. World Wide Web sites offer several advantages for disseminating information on a fast-changing technical topic, including their global accessibility and their ability to update information frequently, incorporate multimedia formats, and link to networks of other sites. An alliance between two complementary web sites at Colorado State University and the University of Nebraska-Lincoln takes advantage of the web environment to help fill the need for public information on crop genetic engineering. This article describes the objectives and features of each site. Viewership data and other feedback have shown these web sites to be effective means of reaching public audiences on a complex scientific topic.

  12. Technical Evaluation Report 41: WebCT: A major shift of emphasis

    Directory of Open Access Journals (Sweden)

    Kristine Thibeault

    2004-11-01

    Full Text Available The evaluation reports in this series usually feature several products at once. The current review, however, comes at a time when one of the most widely used (and expensive) online learning management systems is undergoing a major change in its marketing strategy and corporate focus. WebCT is currently evolving to a new version (WebCT Vista), with much attendant discussion by distance education (DE) users. The current review, as the others in this series, adds the DE student's perspective to this discussion. The review compares the existing WebCT Campus Edition with the new WebCT Vista, and examines some of the problems associated with the migration to Vista at the institutional level. A response to the report by the WebCT company is appended.

  13. The Relationship of the World Wide Web to Thinking Skills.

    Science.gov (United States)

    Bradshaw, Amy C.; Bishop, Jeanne L.; Gens, Linda S.; Miller, Sharla L.; Rogers, Martha A.

    2002-01-01

    Discusses use of the World Wide Web in education and its possibilities for developing higher order critical thinking skills to successfully deal with the demands of the future information society. Suggests that teachers need to provide learning environments that are learner-centered, authentic, problem-based, and collaborative. (Contains 61…

  14. Exploratory Analysis of the Effect of Consultants on the Use of World Wide Web Sites in SMEs

    Directory of Open Access Journals (Sweden)

    Sigi Goode

    2002-11-01

    Full Text Available There is little published research on the role of consultants in technology adoption. Given the increasing popularity of the World Wide Web in commercial environments and the number of consultants now offering web development services, some analysis into the effects of their engagement would be of benefit. In an extension of an ongoing study, an existing sample of 113 World Wide Web adopters was used to examine the nature of World Wide Web site use with respect to consultant and Internet Service Provider (ISP engagement. Analysis was also conducted into the use of consultants and ISPs as developers and maintainers of these sites. This preliminary research finds a number of interesting outcomes. No significant relationship is found between consultant or ISP engagement and World Wide Web site use, regardless of whether the consultant was engaged as site developer or site maintainer. The study raises a number of additional findings that are of interest but are not directly related to this study. These findings merit further research.

  15. El creador de World Wide Web gana premio Millennium de tecnologia

    CERN Multimedia

    Galan, J

    2004-01-01

    "El creador de la World Wide Web (WWW), el fisico britanico Tim Berners-Lee, gano hoy la primera edicion del Millennium Technology Prize, un galardon internacional creado por una fundacion finlandesa y dotado con un millon de euros" (1/2 page)

  16. Remote monitoring using technologies from the Internet and World Wide Web

    International Nuclear Information System (INIS)

    Puckett, J.M.; Burczyk, L.

    1997-01-01

    Recent developments in Internet technologies are changing and enhancing how one processes and exchanges information. These developments include software and hardware in support of multimedia applications on the World Wide Web. In this paper the authors describe these technologies as they have applied them to remote monitoring and show how they will allow the International Atomic Energy Agency to efficiently review and analyze remote monitoring data for verification of material movements. The authors have developed demonstration software that illustrates several safeguards data systems using the resources of the Internet and Web to access and review data. This Web demo allows the user to directly observe sensor data, to analyze simulated safeguards data, and to view simulated on-line inventory data. Future activities include addressing the technical and security issues associated with using the Web to interface with existing and planned monitoring systems at nuclear facilities. Some of these issues are authentication, encryption, transmission of large quantities of data, and data compression

  17. Annotation of toponyms in TEI digital literary editions and linking to the web of data

    Directory of Open Access Journals (Sweden)

    Frontini, Francesca

    2016-07-01

    Full Text Available This paper aims to discuss the challenges and benefits of the annotation of place names in literary texts and literary criticism. We shall first highlight the problems of encoding spatial information in digital editions using the TEI format by means of two manual annotation experiments and the discussion of various cases. This will lead to the question of how to use existing semantic web resources to complement and enrich toponym mark-up, in particular to provide mentions with precise georeferencing. Finally the automatic annotation of a large corpus will show the potential of visualizing places from texts, by illustrating an analysis of the evolution of literary life from the spatial and geographical point of view.
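
    As a small hedged sketch of the mark-up the paper discusses (the element and attribute names follow the TEI convention for place names; the gazetteer URI is just an example), generated here with Python's standard XML library:

        # Illustrative only: wrapping a toponym in a TEI <placeName> element and
        # linking it to a web-of-data gazetteer entry via the ref attribute.
        import xml.etree.ElementTree as ET

        TEI_NS = "http://www.tei-c.org/ns/1.0"
        ET.register_namespace("", TEI_NS)

        def tei_place_name(text, gazetteer_uri):
            el = ET.Element(f"{{{TEI_NS}}}placeName", attrib={"ref": gazetteer_uri})
            el.text = text
            return el

        el = tei_place_name("Paris", "http://www.geonames.org/2988507")
        print(ET.tostring(el, encoding="unicode"))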

  18. Health information seeking and the World Wide Web: an uncertainty management perspective.

    Science.gov (United States)

    Rains, Stephen A

    2014-01-01

    Uncertainty management theory was applied in the present study to offer one theoretical explanation for how individuals use the World Wide Web to acquire health information and to help better understand the implications of the Web for information seeking. The diversity of information sources available on the Web and potential to exert some control over the depth and breadth of one's information-acquisition effort is argued to facilitate uncertainty management. A total of 538 respondents completed a questionnaire about their uncertainty related to cancer prevention and information-seeking behavior. Consistent with study predictions, use of the Web for information seeking interacted with respondents' desired level of uncertainty to predict their actual level of uncertainty about cancer prevention. The results offer evidence that respondents who used the Web to search for cancer information were better able than were respondents who did not seek information to achieve a level of uncertainty commensurate with the level of uncertainty they desired.

  19. The poor quality of information about laparoscopy on the World Wide Web as indexed by popular search engines.

    Science.gov (United States)

    Allen, J W; Finch, R J; Coleman, M G; Nathanson, L K; O'Rourke, N A; Fielding, G A

    2002-01-01

    This study was undertaken to determine the quality of information on the Internet regarding laparoscopy. Four popular World Wide Web search engines were used with the key word "laparoscopy." Advertisements, patient- or physician-directed information, and controversial material were noted. A total of 14,030 Web pages were found, but only 104 were unique Web sites. The majority of the sites were duplicate pages, subpages within a main Web page, or dead links. Twenty-eight of the 104 pages had a medical product for sale, 26 were patient-directed, 23 were written by a physician or group of physicians, and six represented corporations. The remaining 21 were "miscellaneous." The 46 pages containing educational material were critically reviewed. At least one of the senior authors found that 32 of the pages contained controversial or misleading statements. All of the three senior authors (LKN, NAO, GAF) independently agreed that 17 of the 46 pages contained controversial information. The World Wide Web is not a reliable source for patient or physician information about laparoscopy. Authenticating medical information on the World Wide Web is a difficult task, and no government or surgical society has taken the lead in regulating what is presented as fact on the World Wide Web.

  20. World Wide Web Homepages: An Examination of Content and Audience.

    Science.gov (United States)

    Reynolds, Betty; And Others

    This paper shows how the content of a World Wide Web page is selected and how an examination of the intended audience influences content. Examples from the New Mexico Tech (NMT) Library homepage show what sources are selected and what level of detail is appropriate for the intended audience. Six fundamental functions of libraries and information…

  1. Contemporary Approaches to Critical Thinking and the World Wide Web

    Science.gov (United States)

    Buffington, Melanie L.

    2007-01-01

    Teaching critical thinking skills is often endorsed as a means to help students develop their abilities to navigate the complex world in which people live and, in addition, as a way to help students succeed in school. Over the past few years, this author explored the idea of teaching critical thinking using the World Wide Web (WWW). She began…

  2. Marketing and Selling CD-ROM Products on the World-Wide Web.

    Science.gov (United States)

    Walker, Becki

    1995-01-01

    Describes three companies' approaches to marketing and selling CD-ROM products on the World Wide Web. Benefits include low overhead for Internet-based sales, allowance for creativity, and ability to let customers preview products online. Discusses advertising, information delivery, content, information services, and security. (AEF)

  3. Radiation protection and environmental radioactivity. A voyage to the World Wide Web for beginners

    International Nuclear Information System (INIS)

    Weimer, S.

    1998-01-01

    In line with the enormous growth of the Internet service 'World Wide Web', there has also been strong growth in the number of web sites connected with radiation protection. An introduction is given to some practical basics of the WWW. The structure of WWW addresses and navigating through the web with hyperlinks are explained. Furthermore, some search engines are presented. The paper lists a number of WWW addresses of interesting sites with radiological protection information. (orig.) [de]

  4. Quality analysis of patient information about knee arthroscopy on the World Wide Web.

    Science.gov (United States)

    Sambandam, Senthil Nathan; Ramasamy, Vijayaraj; Priyanka, Priyanka; Ilango, Balakrishnan

    2007-05-01

    This study was designed to ascertain the quality of patient information available on the World Wide Web on the topic of knee arthroscopy. For the purpose of quality analysis, we used a pool of 232 search results obtained from 7 different search engines. We used a modified assessment questionnaire to assess the quality of these Web sites. This questionnaire was developed based on similar studies evaluating Web site quality and includes items on illustrations, accessibility, availability, accountability, and content of the Web site. We also compared results obtained with different search engines and tried to establish the best possible search strategy to attain the most relevant, authentic, and adequate information with minimum time consumption. For this purpose, we first compared 100 search results from the single most commonly used search engine (AltaVista) with the pooled sample containing 20 search results from each of the 7 different search engines. The search engines used were metasearch (Copernic and Mamma), general search (Google, AltaVista, and Yahoo), and health topic-related search engines (MedHunt and Healthfinder). The phrase "knee arthroscopy" was used as the search terminology. Excluding the repetitions, there were 117 Web sites available for quality analysis. These sites were analyzed for accessibility, relevance, authenticity, adequacy, and accountability by use of a specially designed questionnaire. Our analysis showed that most of the sites providing patient information on knee arthroscopy contained outdated information, were inadequate, and were not accountable. Only 16 sites were found to be providing reasonably good patient information and hence can be recommended to patients. Understandably, most of these sites were from nonprofit organizations and educational institutions. Furthermore, our study revealed that using multiple search engines increases patients' chances of obtaining more relevant information rather than using a single search

  5. Tesauros e a World Wide Web

    OpenAIRE

    Murakami, Tiago R. M.

    2005-01-01

    Thesauri are tools of growing importance in the Web context. For this reason, it is necessary to adapt thesauri to Web technologies and functionalities. The present work is an exploratory study that aims to identify how documentary thesauri are being used and/or incorporated for the management of information on the Web.

  6. Radar Images of the Earth and the World Wide Web

    Science.gov (United States)

    Chapman, B.; Freeman, A.

    1995-01-01

    A perspective of NASA's Jet Propulsion Laboratory as a center of planetary exploration, and its involvement in studying the earth from space is given. Remote sensing, radar maps, land topography, snow cover properties, vegetation type, biomass content, moisture levels, and ocean data are items discussed related to earth orbiting satellite imaging radar. World Wide Web viewing of this content is discussed.

  7. Technical Note: On The Usage and Development of the AWAKE Web Server and Web Applications

    CERN Document Server

    Berger, Dillon Tanner

    2017-01-01

    The purpose of this technical note is to give a brief explanation of the AWAKE Web Server, the current web applications it serves, and how to edit, maintain, and update the source code. The majority of this paper is dedicated to the development of the server and its web applications.

  8. Documenting historical data and accessing it on the World Wide Web

    Science.gov (United States)

    Malchus B. Baker; Daniel P. Huebner; Peter F. Ffolliott

    2000-01-01

    New computer technologies facilitate the storage, retrieval, and summarization of watershed-based data sets on the World Wide Web. These data sets are used by researchers when testing and validating predictive models, managers when planning and implementing watershed management practices, educators when learning about hydrologic processes, and decisionmakers when...

  9. Wired World-Wide Web Interactive Remote Event Display

    Energy Technology Data Exchange (ETDEWEB)

    De Groot, Nicolo

    2003-05-07

    WIRED (World-Wide Web Interactive Remote Event Display) is a framework, written in the Java™ language, for building High Energy Physics event displays. An event display based on the WIRED framework enables users of a HEP collaboration to visualize and analyze events remotely using ordinary WWW browsers, on any type of machine. In addition, event displays using WIRED may provide the general public with access to the research of high energy physics. The recent introduction of the object-oriented Java™ language enables the transfer of machine independent code across the Internet, to be safely executed by a Java enhanced WWW browser. We have employed this technology to create a remote event display in WWW. The combined Java-WWW technology hence assures a world wide availability of such an event display, an always up-to-date program and a platform independent implementation, which is easy to use and to install.

  10. Outreach to International Students and Scholars Using the World Wide Web.

    Science.gov (United States)

    Wei, Wei

    1998-01-01

    Describes the creation of a World Wide Web site for the Science Library International Outreach Program at the University of California, Santa Cruz. Discusses design elements, content, and promotion of the site. Copies of the home page and the page containing the outreach program's statement of purpose are included. (AEF)

  11. Alaskan Auroral All-Sky Images on the World Wide Web

    Science.gov (United States)

    Stenbaek-Nielsen, H. C.

    1997-01-01

    In response to a 1995 NASA SPDS announcement of support for preservation and distribution of important data sets online, the Geophysical Institute, University of Alaska Fairbanks, Alaska, proposed to provide World Wide Web access to the Poker Flat Auroral All-sky Camera images in real time. The Poker auroral all-sky camera is located in the Davis Science Operation Center at Poker Flat Rocket Range about 30 miles north-east of Fairbanks, Alaska, and is connected, through a microwave link, with the Geophysical Institute where we maintain the data base linked to the Web. To protect the low light-level all-sky TV camera from damage due to excessive light, we only operate during the winter season when the moon is down. The camera and data acquisition is now fully computer controlled. Digital images are transmitted each minute to the Web linked data base where the data are available in a number of different presentations: (1) Individual JPEG compressed images (1 minute resolution); (2) Time lapse MPEG movie of the stored images; and (3) A meridional plot of the entire night activity.

  12. Where to find nutritional science journals on the World Wide Web.

    Science.gov (United States)

    Brown, C M

    1997-08-01

    The World Wide Web (WWW) is a burgeoning information resource that can be utilized for current awareness and assistance in manuscript preparation and submission. The ever changing and expanding nature of the WWW allows it to provide up to the minute information, but this inherent changeability often makes information access difficult. To assist nutrition scientists in locating useful information about nutritional science journals on the WWW, this article critically reviews and describes the WWW sites for seventeen highly ranked nutrition and dietetics journals. Included in each annotation are the site's title, web address or Universal Resource Locator (URL), journal ranking and site authorship. Also listed is whether or not the site makes available the guidelines for authors, tables of contents, abstracts, online ordering, as well as information about the editorial board. This critical survey illustrates that the information on the web, regardless of its authority, is not of equal quality.

  13. World wide web and virtual reality in developing and using environmental models

    International Nuclear Information System (INIS)

    Guariso, G.

    2001-01-01

    The application of the World Wide Web as an active component of environmental decision support systems is still largely unexplored. Environmental problems are distributed in nature, both from the physical and from the social point of view; the Web is thus an ideal tool to share concepts and decisions among multiple interested parties. The same holds for Virtual Reality (VR), which has not found, up to now, a large application in the development and teaching of environmental models. The paper shows some recent applications that highlight the potential of these tools [it]

  14. A review of images of nurses and smoking on the World Wide Web.

    Science.gov (United States)

    Sarna, Linda; Bialous, Stella Aguinaga

    2012-01-01

    With the advent of the World Wide Web, historic images previously having limited distributions are now widely available. As tobacco use has evolved, so have images of nurses related to smoking. Using a systematic search, the purpose of this article is to describe types of images of nurses and smoking available on the World Wide Web. Approximately 10,000 images of nurses and smoking published over the past century were identified through search engines and digital archives. Seven major themes were identified: nurses smoking, cigarette advertisements, helping patients smoke, "naughty" nurse, teaching women to smoke, smoking in and outside of health care facilities, and antitobacco images. The use of nursing images to market cigarettes was known but the extent of the use of these images has not been reported previously. Digital archives can be used to explore the past, provide a perspective for understanding the present, and suggest directions for the future in confronting negative images of nursing. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. Efficacy of the World Wide Web in K-12 environmental education

    Science.gov (United States)

    York, Kimberly Jane

    1998-11-01

    Despite support by teachers, students, and the American public in general, environmental education is not a priority in U.S. schools. Teachers face many barriers to integrating environmental education into K--12 curricula. The focus of this research is teachers' lack of access to environmental education resources. New educational reforms combined with emerging mass communication technologies such as the Internet and World Wide Web present new opportunities for the infusion of environmental content into the curriculum. New technologies can connect teachers and students to a wealth of resources previously unavailable to them. However, significant barriers to using technologies exist that must be overcome to make this promise a reality. Web-based environmental education is a new field and research is urgently needed. If teachers are to use the Web meaningfully in their classrooms, it is essential that their attitudes and perceptions about using this new technology be brought to light. Therefore, this exploratory research investigates teachers' attitudes toward using the Web to share environmental education resources. Both qualitative and quantitative methods were used to investigate this problem. Two surveys were conducted---a self-administered mail survey and a Web-based online survey---to elicit teachers' perceptions and comments about environmental education and the Web. Preliminary statistical procedures including frequencies, percentages and correlational measures were performed to interpret the data. In-depth interviews and participant-observation methods were used during an extended environmental education curriculum development project with two practicing teachers to gain insights into the process of creating curricula and placing it online. Findings from both the mail survey and the Web-based survey suggest that teachers are interested in environmental education---97% of respondents for each survey agreed that environmental education should be taught in K

  16. The World-Wide Web: An Interface between Research and Teaching in Bioinformatics

    Directory of Open Access Journals (Sweden)

    James F. Aiton

    1994-01-01

    Full Text Available The rapid expansion occurring in World-Wide Web activity is beginning to make the concepts of ‘global hypermedia’ and ‘universal document readership’ realistic objectives of the new revolution in information technology. One consequence of this increase in usage is that educators and students are becoming more aware of the diversity of the knowledge base which can be accessed via the Internet. Although computerised databases and information services have long played a key role in bioinformatics, these same resources can also be used to provide core materials for teaching and learning. The large datasets and archives that have been compiled for biomedical research can be enhanced with the addition of a variety of multimedia elements (images, digital videos, animation, etc.). The use of this digitally stored information in structured and self-directed learning environments is likely to increase as activity across the World-Wide Web increases.

  17. Quality of information available on the World Wide Web for patients undergoing thyroidectomy: review.

    Science.gov (United States)

    Muthukumarasamy, S; Osmani, Z; Sharpe, A; England, R J A

    2012-02-01

    This study aimed to assess the quality of information available on the World Wide Web for patients undergoing thyroidectomy. The first 50 web-links generated by internet searches using the five most popular search engines and the key word 'thyroidectomy' were evaluated using the Lida website validation instrument (assessing accessibility, usability and reliability) and the Flesch Reading Ease Score. We evaluated 103 of a possible 250 websites. Mean scores (ranges) were: Lida accessibility, 48/63 (27-59); Lida usability, 36/54 (21-50); Lida reliability, 21/51 (4-38); and Flesch Reading Ease, 43.9 (2.6-77.6). The quality of internet health information regarding thyroidectomy is variable. High ranking and popularity are not good indicators of website quality. Overall, none of the websites assessed achieved high Lida scores. In order to prevent the dissemination of inaccurate or commercially motivated information, we recommend independent labelling of medical information available on the World Wide Web.

  18. Environmental Reporting for Global Higher Education Institutions using the World Wide Web.

    Science.gov (United States)

    Walton, J.; Alabaster, T.; Richardson, S.; Harrison, R.

    1997-01-01

    Proposes the value of voluntary environmental reporting by higher education institutions as an aid to implementing environmental policies. Suggests that the World Wide Web can provide a fast, up-to-date, flexible, participatory, multidimensional medium for information exchange and management. Contains 29 references. (PVD)

  19. WWW.Cell Biology Education: Using the World Wide Web to Develop a New Teaching Topic

    Science.gov (United States)

    Blystone, Robert V.; MacAlpine, Barbara

    2005-01-01

    "Cell Biology Education" calls attention each quarter to several Web sites of educational interest to the biology community. The Internet provides access to an enormous array of potential teaching materials. In this article, the authors describe one approach for using the World Wide Web to develop a new college biology laboratory exercise. As a…

  20. Distributing flight dynamics products via the World Wide Web

    Science.gov (United States)

    Woodard, Mark; Matusow, David

    1996-01-01

    The NASA Flight Dynamics Products Center (FDPC), which makes available selected operations products via the World Wide Web, is reported on. The FDPC can be accessed from any host machine connected to the Internet. It is a multi-mission service which provides Internet users with unrestricted access to the following standard products: antenna contact predictions; ground tracks; orbit ephemerides; mean and osculating orbital elements; earth sensor sun and moon interference predictions; space flight tracking data network summaries; and Shuttle transport system predictions. Several scientific data bases are available through the service.

  1. [Preface for genome editing special issue].

    Science.gov (United States)

    Gu, Feng; Gao, Caixia

    2017-10-25

    Genome editing technology, as an innovative biotechnology, has been widely used for editing the genomes of model organisms, animals, plants and microbes. CRISPR/Cas9-based genome editing technology shows great value and potential in the dissection of functional genomics, improved breeding and genetic disease treatment. In the present special issue, the principles and applications of genome editing techniques are summarized. The advantages and disadvantages of current genome editing technology, as well as future prospects, are also highlighted.

  2. The World Wide Web: A Web Even a Fly Would Love

    Science.gov (United States)

    Bryson, E.

    Ever since my introduction to the World Wide Web (WWW), it's been love at first byte. Searching on the WWW is similar to being able to go to a public library and allow yourself to be transported to any other book or library around the world by looking at a reference or index and clicking your heels together like Dorothy did in "The Wizard of Oz", only the clicking is done with a computer mouse. During this presentation, we will explore the WWW protocols which allow clients and servers to communicate on the Internet. We will demonstrate the ease with which users can navigate the virtual tidal wave of information available with a mere click of a button. In addition, the workshop will discuss the revolutionary aspects of this network information system and how it's impacting our libraries as a primary mechanism for rapid dissemination of knowledge.

  3. Expert knowledge in palliative care on the World Wide Web: palliativedrugs.org.

    Science.gov (United States)

    Gavrin, Jonathan

    2009-01-01

    In my last Internet-related article, I speculated that social networking would be the coming wave in the effort to share knowledge among experts in various disciplines. At the time I did not know that a palliative care site on the World Wide Web (WWW), palliativedrugs.com, already provided the infrastructure for sharing expert knowledge in the field. The Web site is an excellent traditional formulary but it is primarily devoted to "unlicensed" ("off-label") use of medications in palliative care, something we in the specialty often do with little to support our interventions except shared knowledge and experience. There is nothing fancy about this Web site. In a good way, its format is a throwback to Web sites of the 1990s. In only the loosest sense can one describe it as "multimedia." Yet, it provides the perfect forum for expert knowledge and is a "must see" resource. Its existing content is voluminous and reliable, filtered and reviewed by renowned clinicians and educators in the field. Although its origin and structure were not specifically designed for social or professional networking, the Web site's format makes it a natural way for practitioners around the world to contribute to an ever-growing body of expertise in palliative care.

  4. Landscaping climate change: a mapping technique for understanding science and technology debates on the world wide web

    NARCIS (Netherlands)

    Rogers, R.; Marres, N.

    2000-01-01

    New World Wide Web (web) mapping techniques may inform and ultimately facilitate meaningful participation in current science and technology debates. The technique described here "landscapes" a debate by displaying key "webby" relationships between organizations. "Debate-scaping" plots two

  5. Securing the anonymity of content providers in the World Wide Web

    Science.gov (United States)

    Demuth, Thomas; Rieke, Andreas

    1999-04-01

    Nowadays the World Wide Web (WWW) is an established service used by people all over the world. Most of them do not recognize the fact that they reveal plenty of information about themselves or their affiliation and computer equipment to the providers of the web pages they connect to. As a result, a number of services offer users the ability to access web pages unrecognized, without the risk of being traced back. This kind of anonymity is called user or client anonymity. On the other hand, an equivalent protection for content providers does not exist, although this feature is desirable for many situations in which the identity of a publisher or content provider is to be hidden. We call this property server anonymity. We introduce the first system whose primary goal is to offer anonymity for providers of information in the WWW. Besides this property, it also provides client anonymity. Based on David Chaum's idea of mixes and in the context of the WWW, we explain the term 'server anonymity', motivating the system JANUS, which offers both client and server anonymity.

  6. Pre-Service Teachers Critically Evaluate Scientific Information on the World-Wide Web: What Makes Information Believable?

    Science.gov (United States)

    Iding, Marie; Klemm, E. Barbara

    2005-01-01

    The present study addresses the need for teachers to critically evaluate the credibility, validity, and cognitive load associated with scientific information on Web sites, in order to effectively teach students to evaluate scientific information on the World Wide Web. A line of prior research investigating high school and university students'…

  7. Le world wide web: l'hypermedià sur internet | Houmel | Revue d ...

    African Journals Online (AJOL)

    Telecommunication network technology linked to the electronic document has profoundly changed the working methods of information specialists. The Internet has played a large part in these changes, especially since the integration of the World Wide Web, a highly distributed hypermedia information system. In Algeria lots ...

  8. Network dynamics: The World Wide Web

    Science.gov (United States)

    Adamic, Lada Ariana

    Despite its rapidly growing and dynamic nature, the Web displays a number of strong regularities which can be understood by drawing on methods of statistical physics. This thesis finds power-law distributions in website sizes, traffic, and links, and more importantly, develops a stochastic theory which explains them. Power-law link distributions are shown to lead to network characteristics which are especially suitable for scalable localized search. It is also demonstrated that the Web is a "small world": to reach one site from any other takes an average of only 4 hops, while most related sites cluster together. Additional dynamical properties of the Web graph are extracted from diffusion processes.
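
    The power-law regularities described above can be checked against link-count data with the standard maximum-likelihood estimator for the exponent. The sketch below is illustrative only (hypothetical in-link counts, not the thesis data), assuming the continuous MLE alpha = 1 + n / sum(ln(k_i / k_min)).

        import math

        def powerlaw_alpha(counts, k_min=1.0):
            """MLE (Hill-type) estimate of the power-law exponent for values >= k_min."""
            tail = [k for k in counts if k >= k_min]
            return 1.0 + len(tail) / sum(math.log(k / k_min) for k in tail)

        inlinks = [1, 1, 2, 3, 5, 8, 13, 40, 120, 900]  # hypothetical site in-link counts
        print(f"estimated exponent: {powerlaw_alpha(inlinks):.2f}")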

  9. THE NEW “UNIVERSAL TRUTH” OF THE WORLD WIDE WEB

    OpenAIRE

    Alexandru Tăbușcă

    2011-01-01

    We all see that the world wide web is permanently evolving and developing. New websites are created continuously and push the limits of the old HTML specs in all respects. HTML4 has been the de facto standard for almost 10 years, and developers are starting to look for new and improved technologies to help them provide greater functionality. In order to give authors flexibility and interoperability and to enable much more interactive and innovative websites and applications, HTML5 introduces and enh...

  10. The World Wide Web as a Medium of Instruction: What Works and What Doesn't

    Science.gov (United States)

    McCarthy, Marianne; Grabowski, Barbara; Hernandez, Angel; Koszalka, Tiffany; Duke, Lee

    1997-01-01

    A conference was held on March 18-20, 1997 to investigate the lessons learned by the Aeronautics Cooperative Agreement Projects with regard to the most effective strategies for developing instruction for the World Wide Web. The conference was a collaboration among the NASA Aeronautics and Space Transportation Technology Centers (Ames, Dryden, Langley, and Lewis), NASA Headquarters, the University of Idaho and The Pennsylvania State University. The conference consisted of presentations by the Aeronautics Cooperative Agreement Teams, the University of Idaho, and working sessions in which the participants addressed teacher training and support, technology, evaluation and pedagogy. The conference was also undertaken as part of the Dryden Learning Technologies Project which is a collaboration between the Dryden Education Office and The Pennsylvania State University. The DFRC Learning Technology Project goals relevant to the conference are as follows: conducting an analysis of current teacher needs, classroom infrastructure and exemplary instructional World Wide Web sites, and developing models for Web-enhanced learning environments that optimize teaching practices and student learning.

  11. A systematic review of patient inflammatory bowel disease information resources on the World Wide Web.

    Science.gov (United States)

    Bernard, André; Langille, Morgan; Hughes, Stephanie; Rose, Caren; Leddin, Desmond; Veldhuyzen van Zanten, Sander

    2007-09-01

    The Internet is a widely used information resource for patients with inflammatory bowel disease, but there is variation in the quality of Web sites that have patient information regarding Crohn's disease and ulcerative colitis. The purpose of the current study is to systematically evaluate the quality of these Web sites. The top 50 Web sites appearing in Google using the terms "Crohn's disease" or "ulcerative colitis" were included in the study. Web sites were evaluated using (a) a Quality Evaluation Instrument (QEI) that awarded Web sites points (0-107) for specific information on various aspects of inflammatory bowel disease, (b) a five-point Global Quality Score (GQS), (c) two reading grade level scores, and (d) a six-point integrity score. Thirty-four Web sites met the inclusion criteria; 16 Web sites were excluded because they were portals or non-IBD oriented. The median QEI score was 57, with five Web sites scoring higher than 75 points. The median Global Quality Score was 2.0, with five Web sites achieving scores of 4 or 5. The average reading grade level score was 11.2. The median integrity score was 3.0. There is marked variation in the quality of the Web sites containing information on Crohn's disease and ulcerative colitis. Many Web sites suffered from poor quality, but there were five high-scoring Web sites.

  12. The Web Application Hacker's Handbook Finding and Exploiting Security Flaws

    CERN Document Server

    Stuttard, Dafydd

    2011-01-01

    The highly successful security book returns with a new edition, completely updated. Web applications are the front door to most organizations, exposing them to attacks that may disclose personal information, execute fraudulent transactions, or compromise ordinary users. This practical book has been completely updated and revised to discuss the latest step-by-step techniques for attacking and defending the range of ever-evolving web applications. You'll explore the various new technologies employed in web applications that have appeared since the first edition and review the new attack technique

  13. WEBSLIDE: A "Virtual" Slide Projector Based on World Wide Web

    Science.gov (United States)

    Barra, Maria; Ferrandino, Salvatore; Scarano, Vittorio

    1999-03-01

    We present here the design key concepts of WEBSLIDE, a software project whose objective is to provide a simple, cheap and efficient solution for showing slides during lessons in computer labs. In fact, WEBSLIDE allows the video monitors of several client machines (the "STUDENTS") to be synchronously updated by the actions of a particular client machine, called the "INSTRUCTOR." The system is based on the World Wide Web, and the software components of WEBSLIDE consist mainly of a WWW server, browsers and small CGI-Bin scripts. What makes WEBSLIDE particularly appealing for small educational institutions is that WEBSLIDE is built with "off the shelf" products: it does not involve using a specifically designed program; any Netscape browser, one of the most popular browsers available on the market, is sufficient. Another possibility is to use the system to implement "guided automatic tours" through several pages or intranet internal news bulletins: the company Web server can broadcast relevant information to all employees on their browsers.
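
    The record above describes browser screens kept in step by a WWW server and small CGI scripts. As a rough illustration of that synchronization idea (not the original WEBSLIDE code; the endpoints and polling scheme are assumptions), an "instructor" can POST the current slide path to a tiny server while "student" pages periodically GET it and reload when it changes:

        from http.server import BaseHTTPRequestHandler, HTTPServer

        current_slide = b"/slides/slide01.html"

        class SlideSync(BaseHTTPRequestHandler):
            def do_GET(self):
                # students poll this endpoint to learn which slide to display
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(current_slide)

            def do_POST(self):
                # the instructor advances the show by posting a new slide path
                global current_slide
                length = int(self.headers.get("Content-Length", 0))
                current_slide = self.rfile.read(length)
                self.send_response(204)
                self.end_headers()

        if __name__ == "__main__":
            HTTPServer(("", 8000), SlideSync).serve_forever()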

  14. Do We Need to Impose More Regulation Upon the World Wide Web? -A Metasystem Analysis

    Directory of Open Access Journals (Sweden)

    John P. van Gigch

    2000-01-01

    Full Text Available Every day a new problem attributable to the World Wide Web's lack of formal structure and/or organization is made public. What arguably could be represented as one of its main strengths is rapidly turning out to be one of its most flagrant weaknesses. The intent of this article is to show the need to establish a more formal organization than presently exists over the World Wide Web. (This article will use the terms the Internet and Cyberspace interchangeably.) It is proposed that this formal organization take the form of a metacontrol system--to be explained--and rely, at least in part, on self-regulation for this control. The so-called metasystem would be responsible for preventing some of the unanticipated situations that take place in cyberspace and that, due to the web's lack of maturity, have not been encountered heretofore. Some activities, such as the denial-of-service (DoS) attacks, may well be illicit. Others, like the question of establishing a world-wide democratic board to administer the Internet's address system, are so new that there are no technical, legal or political precedents to ensure its design will succeed. What is needed is a formal, over-arching control system, i.e. a "metasystem," to arbitrate over controversies, decide on the legality of new policies and, in general, act as a metalevel controller over the activities of the virtual community called Cyberspace. The World Wide Web Consortium has emerged as a possible candidate for this role. This paper uses control theory to define both the problem and the proposed solution. Cyberspace lacks a metacontroller that can be used to resolve the many problems that arise when a new organizational configuration, such as the Internet, is created and when questions surface about the extent to which new activities interfere with individual or corporate freedoms.

  15. Reading on the World Wide Web: Dealing with conflicting information from multiple sources

    NARCIS (Netherlands)

    Van Strien, Johan; Brand-Gruwel, Saskia; Boshuizen, Els

    2011-01-01

    Van Strien, J. L. H., Brand-Gruwel, S., & Boshuizen, H. P. A. (2011, August). Reading on the World Wide Web: Dealing with conflicting information from multiple sources. Poster session presented at the biannual conference of the European Association for Research on Learning and Instruction, Exeter,

  16. Delivering an Alternative Medicine Resource to the User's Desktop via World Wide Web.

    Science.gov (United States)

    Li, Jie; Wu, Gang; Marks, Ellen; Fan, Weiyu

    1998-01-01

    Discusses the design and implementation of a World Wide Web-based alternative medicine virtual resource. This homepage integrates regional, national, and international resources and delivers library services to the user's desktop. Goals, structure, and organizational schemes of the system are detailed, and design issues for building such a…

  17. The use of the World Wide Web by medical journals in 2003 and 2005: an observational study.

    Science.gov (United States)

    Schriger, David L; Ouk, Sripha; Altman, Douglas G

    2007-01-01

    The 2- to 6-page print journal article has been the standard for 200 years, yet this format severely limits the amount of detailed information that can be conveyed. The World Wide Web provides a low-cost option for posting extended text and supplementary information. It also can enhance the experience of journal editors, reviewers, readers, and authors through added functionality (e.g., online submission and peer review, postpublication critique, and e-mail notification of tables of contents). Our aim was to characterize ways that journals were using the World Wide Web in 2005 and note changes since 2003. We analyzed the Web sites of 138 high-impact print journals in 3 ways. First, we compared the print and Web versions of March 2003 and 2005 issues of 28 journals (20 of which were randomly selected from the 138) to determine how often articles were published Web only and how often print articles were augmented by Web-only supplements. Second, we examined what functions were offered by each journal Web site. Third, for journals that offered Web pages for reader commentary about each article, we analyzed the number of comments and characterized these comments. Fifty-six articles (7%) in 5 journals were Web only. Thirteen of the 28 journals had no supplementary online content. By 2005, several journals were including Web-only supplements in >20% of their papers. Supplementary methods, tables, and figures predominated. The use of supplementary material increased by 5 percentage points, from 2% to 7%, in the 20-journal random sample from 2003 to 2005. Web sites had similar functionality with an emphasis on linking each article to related material and e-mailing readers about activity related to each article. There was little evidence of journals using the Web to provide readers an interactive experience with the data or with each other. Seventeen of the 138 journals offered rapid-response pages. Only 18% of eligible articles had any comments after 5 months. Journal Web sites offer similar

  18. World Wide Web Usage Mining Systems and Technologies

    Directory of Open Access Journals (Sweden)

    Wen-Chen Hu

    2003-08-01

    Full Text Available Web usage mining is used to discover interesting user navigation patterns and can be applied to many real-world problems, such as improving Web sites/pages, making additional topic or product recommendations, user/customer behavior studies, etc. This article provides a survey and analysis of current Web usage mining systems and technologies. A Web usage mining system performs five major tasks: (i) data gathering, (ii) data preparation, (iii) navigation pattern discovery, (iv) pattern analysis and visualization, and (v) pattern applications. Each task is explained in detail and its related technologies are introduced. A list of major research systems and projects concerning Web usage mining is also presented, and a summary of Web usage mining is given in the last section.
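
    A minimal sketch of the first three tasks in this pipeline is given below, assuming a simplified access log of (visitor, timestamp, url) tuples; the 30-minute session gap and the field layout are illustrative assumptions, not details from the article.

        from collections import Counter, defaultdict

        def sessionize(records, gap=1800):
            """Data preparation: group requests into per-visitor sessions (30-minute idle gap)."""
            sessions, last_seen = defaultdict(list), {}
            for visitor, ts, url in sorted(records):
                if visitor not in sessions or not sessions[visitor] or ts - last_seen[visitor] > gap:
                    sessions[visitor].append([])  # start a new session
                sessions[visitor][-1].append(url)
                last_seen[visitor] = ts
            return [s for visits in sessions.values() for s in visits]

        def transition_counts(sessions):
            """Navigation pattern discovery: frequency of page-to-page transitions."""
            pairs = Counter()
            for pages in sessions:
                pairs.update(zip(pages, pages[1:]))
            return pairs

        log = [("a", 0, "/home"), ("a", 40, "/products"), ("a", 90, "/cart"),
               ("b", 10, "/home"), ("b", 50, "/products")]
        print(transition_counts(sessionize(log)).most_common(3))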

  19. The PEP-II/BaBar Project-Wide Database using World Wide Web and Oracle*Case

    International Nuclear Information System (INIS)

    Chan, A.; Crane, G.; MacGregor, I.; Meyer, S.

    1995-12-01

    The PEP-II/BaBar Project Database is a tool for monitoring the technical and documentation aspects of the accelerator and detector construction. It holds the PEP-II/BaBar design specifications, fabrication and installation data in one integrated system. Key pieces of the database include the machine parameter list, components fabrication and calibration data, survey and alignment data, property control, CAD drawings, publications and documentation. This central Oracle database on a UNIX server is built using Oracle*Case tools. Users at the collaborating laboratories mainly access the data using World Wide Web (WWW). The Project Database is being extended to link to legacy databases required for the operations phase

  20. An Ontology of Quality Initiatives and a Model for Decentralized, Collaborative Quality Management on the (Semantic) World Wide Web

    Science.gov (United States)

    2001-01-01

    This editorial provides a model of how quality initiatives concerned with health information on the World Wide Web may in the future interact with each other. This vision fits into the evolving "Semantic Web" architecture - i.e., the prospect that the World Wide Web may evolve from a mess of unstructured, human-readable information sources into a global knowledge base with an additional layer providing richer and more meaningful relationships between resources. One first prerequisite for forming such a "Semantic Web" or "web of trust" among the players active in quality management of health information is that these initiatives make statements about themselves and about each other in a machine-processable language. I present a concrete model of how this collaboration could look, and provide some recommendations on what the role of the World Health Organization (WHO) and other policy makers in this framework could be. PMID:11772549

  1. The World Wide Web and Technology Transfer at NASA Langley Research Center

    Science.gov (United States)

    Nelson, Michael L.; Bianco, David J.

    1994-01-01

    NASA Langley Research Center (LaRC) began using the World Wide Web (WWW) in the summer of 1993, becoming the first NASA installation to provide a Center-wide home page. This coincided with a reorganization of LaRC to provide a more concentrated focus on technology transfer to both aerospace and non-aerospace industry. Use of the WWW and NCSA Mosaic not only provides automated information dissemination, but also allows for the implementation, evolution and integration of many technology transfer applications. This paper describes several of these innovative applications, including the on-line presentation of the entire Technology Opportunities Showcase (TOPS), an industrial partnering showcase that exists on the Web long after the actual 3-day event ended. During its first year on the Web, LaRC also developed several WWW-based information repositories. The Langley Technical Report Server (LTRS), a technical paper delivery system with integrated searching and retrieval, has proved to be quite popular. The NASA Technical Report Server (NTRS), an outgrowth of LTRS, provides uniform access to many logically similar, yet physically distributed NASA report servers. WWW is also the foundation of the Langley Software Server (LSS), an experimental software distribution system which will distribute LaRC-developed software with the possible phase-out of NASA's COSMIC program. In addition to the more formal technology distribution projects, WWW has been successful in connecting people with technologies and people with other people. With the completion of the LaRC reorganization, the Technology Applications Group, charged with interfacing with non-aerospace companies, opened for business with a popular home page.

  2. Flow Webs: Mechanism and Architecture for the Implementation of Sensor Webs

    Science.gov (United States)

    Gorlick, M. M.; Peng, G. S.; Gasster, S. D.; McAtee, M. D.

    2006-12-01

    The sensor web is a distributed, federated infrastructure much like its predecessors, the internet and the world wide web. It will be a federation of many sensor webs, large and small, under many distinct spans of control, that loosely cooperate and share information for many purposes. Realistically, it will grow piecemeal as distinct, individual systems are developed and deployed, some expressly built for a sensor web while many others were created for other purposes. Therefore, the architecture of the sensor web is of fundamental import, and architectural strictures that inhibit innovation, experimentation, sharing or scaling may prove fatal. Drawing upon the architectural lessons of the world wide web, we offer a novel system architecture, the flow web, that elevates flows, sequences of messages over a domain of interest and constrained in both time and space, to a position of primacy as a dynamic, real-time medium of information exchange for computational services. The flow web captures, in a single, uniform architectural style, the conflicting demands of the sensor web, including dynamic adaptations to changing conditions, ease of experimentation, rapid recovery from the failures of sensors and models, automated command and control, incremental development and deployment, and integration at multiple levels—in many cases, at different times. Our conception of sensor webs—dynamic amalgamations of sensor webs each constructed within a flow web infrastructure—holds substantial promise for earth science missions in general, and for weather, air quality, and disaster management in particular. Flow webs are, by philosophy, design and implementation, a dynamic infrastructure that permits massive adaptation in real-time. Flows may be attached to and detached from services at will, even while information is in transit through the flow. This concept, flow mobility, permits dynamic integration of earth science products and modeling resources in response to real

  3. Network Formation and the Structure of the Commercial World Wide Web

    OpenAIRE

    Zsolt Katona; Miklos Sarvary

    2008-01-01

    We model the commercial World Wide Web as a directed graph that emerges as the equilibrium of a game in which utility maximizing websites purchase (advertising) in-links from each other while also setting the price of these links. In equilibrium, higher content sites tend to purchase more advertising links (mirroring the Dorfman-Steiner rule) while selling less advertising links themselves. As such, there seems to be specialization across sites in revenue models: high content sites tend to ea...

  4. Application of World Wide Web (W3) Technologies in Payload Operations

    Science.gov (United States)

    Sun, Charles; Windrem, May; Picinich, Lou

    1996-01-01

    World Wide Web (W3) technologies are considered in relation to their application to space missions. It is considered that such technologies, including the hypertext transfer protocol and the Java object-oriented language, offer a powerful and relatively inexpensive framework for distributed application software development. The suitability of these technologies for payload monitoring systems development is discussed, and the experience gained from the development of an insect habitat monitoring system based on W3 technologies is reported.

  5. Googling DNA sequences on the World Wide Web.

    Science.gov (United States)

    Hajibabaei, Mehrdad; Singer, Gregory A C

    2009-11-10

    New web-based technologies provide an excellent opportunity for sharing and accessing information and using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm and implemented it for searching species-specific genomic sequences, DNA barcodes, by using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and the query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages. We developed pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data. It provides a robust and efficient solution for sequence search on the web. The integration of our search method for large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
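
    The word-based idea can be illustrated with a short sketch: split each sequence into fixed-length "words" so that a conventional keyword engine can index and match them. The 10-base word size, the non-overlapping split and the example fragment are assumptions for illustration; the published tool's parameters and indexing details may differ.

        def to_words(seq, word_size=10):
            """Split a DNA sequence into non-overlapping fixed-length words."""
            seq = seq.upper().replace("\n", "")
            return [seq[i:i + word_size] for i in range(0, len(seq) - word_size + 1, word_size)]

        def word_overlap(query, reference, word_size=10):
            """Alignment-free similarity: fraction of query words found in the reference."""
            q, r = to_words(query, word_size), set(to_words(reference, word_size))
            return sum(w in r for w in q) / len(q) if q else 0.0

        barcode = "CCTATACCTAATTTTTGGTGCATGAGCAGGAATAGTAGG"  # hypothetical COI-like fragment
        print(to_words(barcode))
        print(word_overlap(barcode, barcode))  # 1.0 for an identical sequence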

  6. Software Project Management and Measurement on the World-Wide-Web (WWW)

    Science.gov (United States)

    Callahan, John; Ramakrishnan, Sudhaka

    1996-01-01

    We briefly describe a system for forms-based, work-flow management that helps members of a software development team overcome geographical barriers to collaboration. Our system, called the Web Integrated Software Environment (WISE), is implemented as a World-Wide-Web service that allows for management and measurement of software development projects based on dynamic analysis of change activity in the workflow. WISE tracks issues in a software development process, provides informal communication between the users with different roles, supports to-do lists, and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis by providing implicit delivery of messages between users based on the content of project documents. The use of a database in WISE is hidden from the users who view WISE as maintaining a personal 'to-do list' of tasks related to the many projects on which they may play different roles.

  7. Tapping the Resources of the World Wide Web for Inquiry in Middle Schools.

    Science.gov (United States)

    Windschitl, Mark; Irby, Janet

    1999-01-01

    Argues for the cautiously expanded use of the World Wide Web for inquiry across the middle school curriculum, noting how the Internet can be used in schools. Describes the Internet and appraises its distractions and academic utility, identifying features that support student inquiry in science, mathematics, social studies, and language arts. (JPB)

  8. Revising and editing for translators

    CERN Document Server

    Mossop, Brian

    2014-01-01

    Revising and Editing for Translators provides guidance and learning materials for translation students learning to edit texts written by others, and professional translators wishing to improve their self-revision ability or learning to revise the work of others. Editing is understood as making corrections and improvements to texts, with particular attention to tailoring them to the given readership. Revising is this same task applied to draft translations. The linguistic work of editors and revisers is related to the professional situations in which they work. Mossop offers in-depth coverage of a wide range of topics, including copyediting, style editing, structural editing, checking for consistency, revising procedures and principles, and translation quality assessment. This third edition provides extended coverage of computer aids for revisers, and of the different degrees of revision suited to different texts. The inclusion of suggested activities and exercises, numerous real-world examples, a proposed gra...

  9. Spiders and Worms and Crawlers, Oh My: Searching on the World Wide Web.

    Science.gov (United States)

    Eagan, Ann; Bender, Laura

    Searching on the world wide web can be confusing. A myriad of search engines exist, often with little or no documentation, and many of these search engines work differently from the standard search engines people are accustomed to using. Intended for librarians, this paper defines search engines, directories, spiders, and robots, and covers basics…

  10. Integration of Web mining and web crawler: Relevance and State of Art

    OpenAIRE

    Subhendu kumar pani; Deepak Mohapatra,; Bikram Keshari Ratha

    2010-01-01

    This study presents the role of the web crawler in a web mining environment. As the growth of the World Wide Web has exceeded all expectations, research on Web mining is growing more and more. Web mining is a research topic which combines two active research areas: Data Mining and the World Wide Web. So the World Wide Web is a very advanced area for data mining research. Search engines that are based on a web crawling framework are also used in web mining to find the interacted web pages. This paper discu...
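
    As a rough illustration of the crawler's role in the mining pipeline, the sketch below fetches pages breadth-first from a seed URL and hands the HTML to a later mining stage. It is illustrative only: a production crawler must respect robots.txt, rate limits and politeness policies, and the seed URL is a placeholder.

        import re
        from collections import deque
        from urllib.parse import urljoin
        from urllib.request import urlopen

        def crawl(seed, max_pages=10):
            """Breadth-first fetch of up to max_pages pages reachable from the seed."""
            seen, queue, pages = {seed}, deque([seed]), {}
            while queue and len(pages) < max_pages:
                url = queue.popleft()
                try:
                    html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
                except Exception:
                    continue
                pages[url] = html  # hand the page to the mining stage
                for link in re.findall(r'href="([^"#]+)"', html):
                    absolute = urljoin(url, link)
                    if absolute.startswith("http") and absolute not in seen:
                        seen.add(absolute)
                        queue.append(absolute)
            return pages

        print(len(crawl("https://example.org/")), "pages fetched")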

  11. Overview of the TREC 2013 Federated Web Search Track

    NARCIS (Netherlands)

    Demeester, Thomas; Trieschnigg, Rudolf Berend; Nguyen, Dong-Phuong; Hiemstra, Djoerd

    2014-01-01

    The TREC Federated Web Search track is intended to promote research related to federated search in a realistic web setting, and hereto provides a large data collection gathered from a series of online search engines. This overview paper discusses the results of the first edition of the track, FedWeb

  12. Information consumerism on the World Wide Web: implications for dermatologists and patients.

    Science.gov (United States)

    Travers, Robin L

    2002-09-01

    The World Wide Web (WWW) is continuing to grow exponentially both in terms of numbers of users and numbers of web pages. There is a trend toward the increasing use of the WWW for medical educational purposes, among physicians and patients alike. The multimedia capabilities of this evolving medium are particularly relevant to visual medical specialties such as dermatology. The origins of information consumerism on the WWW are examined, and the public health issues surrounding dermatologic information and misinformation, and how consumers navigate through the WWW, are reviewed. The economic realities of medical information as a "capital good," and the impact this has on dermatologic information sources on the WWW, are also discussed. Finally, strategies for guiding consumers and ourselves toward credible medical information sources on the WWW are outlined.

  13. Office 2010 Web Apps For Dummies

    CERN Document Server

    Weverka, Peter

    2010-01-01

    Enhance your Microsoft Office 2010 experience with Office 2010 Web Apps!. Office Web Apps complement Office, making it easy to access and edit files from anywhere. It also simplifies collaboration with those who don't have Microsoft Office on their computers. This helpful book shows you the optimum ways you can use Office Web Apps to save time and streamline your work. Veteran For Dummies author Peter Weverka begins with an introduction to Office Web Apps and then goes on to clearly explain how Office Web Apps provide you with easier, faster, more flexible ways to get things done.: Walks you t

  14. Compact Optical Discs and the World Wide Web: Two Mediums in Digitized Information Delivery Services

    Directory of Open Access Journals (Sweden)

    Ziyu Lin

    1999-10-01

    Full Text Available

    Pages: 40-52

    Compact optical discs (CDs) and the World Wide Web (the Web) are two mechanisms that contemporary libraries extensively use for digitized information storage, dissemination, and retrieval. The Web features unparalleled global accessibility free from many previously known temporal and spatial restrictions. Its real-time update capability is impossible for CDs. Web-based information delivery can reduce the cost of hardware and software ownership and management for a local library, and provide one-to-one customization to better serve a library's clients. The current limitations of the Web include inadequate speed in data transmission, particularly for multimedia applications, and insufficient reliability, search capabilities, and security. In comparison, speed, quality, portability, and reliability are the current advantages of CDs over the Web. These features, together with trends in the PC industry and market, suggest that CDs will continue to exist and develop. CD/Web hybrids can combine the best of both developing mechanisms and offer optimal results. Through a comparison of CDs and the Web, it is argued that the functionality and unique features of a technology determine its future.

  15. GLIDERS - A web-based search engine for genome-wide linkage disequilibrium between HapMap SNPs

    Directory of Open Access Journals (Sweden)

    Broxholme John

    2009-10-01

    Full Text Available Abstract Background A number of tools for the examination of linkage disequilibrium (LD) patterns between nearby alleles exist, but none are available for quickly and easily investigating LD at longer ranges (>500 kb). We have developed a web-based query tool (GLIDERS: Genome-wide LInkage DisEquilibrium Repository and Search engine) that enables the retrieval of pairwise associations with r2 ≥ 0.3 across the human genome for any SNP genotyped within HapMap phase 2 and 3, regardless of distance between the markers. Description GLIDERS is an easy to use web tool that only requires the user to enter rs numbers of SNPs they want to retrieve genome-wide LD for (both nearby and long-range). The intuitive web interface handles both manual entry of SNP IDs as well as allowing users to upload files of SNP IDs. The user can limit the resulting inter-SNP associations with easy to use menu options. These include MAF limit (5-45%), distance limits between SNPs (minimum and maximum), r2 (0.3 to 1), HapMap population sample (CEU, YRI and JPT+CHB combined) and HapMap build/release. All resulting genome-wide inter-SNP associations are displayed on a single output page, which has a link to a downloadable tab delimited text file. Conclusion GLIDERS is a quick and easy way to retrieve genome-wide inter-SNP associations and to explore LD patterns for any number of SNPs of interest. GLIDERS can be useful in identifying SNPs with long-range LD. This can highlight mis-mapping or other potential association signal localisation problems.
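
    The r2 statistic behind the 0.3 threshold mentioned above can be computed directly from phased haplotype counts; the sketch below uses made-up counts rather than HapMap data.

        def r_squared(n_AB, n_Ab, n_aB, n_ab):
            """r^2 between two biallelic loci from phased haplotype counts."""
            n = n_AB + n_Ab + n_aB + n_ab
            p_A, p_B = (n_AB + n_Ab) / n, (n_AB + n_aB) / n
            D = n_AB / n - p_A * p_B
            return D * D / (p_A * (1 - p_A) * p_B * (1 - p_B))

        print(round(r_squared(40, 10, 10, 40), 3))  # 0.36, so this pair would pass the r2 >= 0.3 filter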

  16. Comparison of student outcomes and preferences in a traditional vs. World Wide Web-based baccalaureate nursing research course.

    Science.gov (United States)

    Leasure, A R; Davis, L; Thievon, S L

    2000-04-01

    The purpose of this project was to compare student outcomes in an undergraduate research course taught using both World Wide Web-based distance learning technology and traditional pedagogy. Reasons given for enrolling in the traditional classroom section included the perception of increased opportunity for interaction, decreased opportunity to procrastinate, immediate feedback, and more meaningful learning activities. Reasons for selecting the Web-based section included cost, convenience, and flexibility. Overall, there was no significant difference in examination scores between the two groups on the three multiple-choice examinations or for the course grades (t = -.96, P = .343). Students who reported that they were self-directed and had the ability to maintain their own pace and avoid procrastination were most suited to Web-based courses. Web-based classes can help provide opportunities for methods of communication that are not traditionally nurtured in classroom settings. Secondary benefits of the World Wide Web-based course were increased student confidence with the computer and an introduction to skills and opportunities the students would not have had in the classroom. Additionally, over time and with practice, students' writing skills improved.

  17. Book Review: Astronomy: A Self-Teaching Guide, 6th Edition

    Science.gov (United States)

    Marigza, R. N., Jr.

    2009-03-01

    The sixth edition of Moche's book is up-to-date with the latest in astronomy. It contains accurate astronomical data on stars and constellations. The topics are accompanied by web site addresses where the reader can expand his/her knowledge and see high-resolution images of the celestial targets. This edition incorporates new discoveries and suggestions made since the earlier editions. Among the new developments are twenty-first-century research into black holes, active galaxies and quasars, searches for life in space, the origin and structure of our universe, and the latest in ground and space telescopes.

  18. Beyond Piñatas, Fortune Cookies, and Wooden Shoes: Using the World Wide Web to Help Children Explore the Whole Wide World

    Science.gov (United States)

    Kirkwood, Donna; Shulsky, Debra; Willis, Jana

    2014-01-01

    The advent of technology and access to the internet through the World Wide Web have stretched the traditional ways of teaching social studies beyond classroom boundaries. This article explores how teachers can create authentic and contextualized cultural studies experiences for young children by integrating social studies and technology. To…

  19. E-Learning and Role of World Wide Web in E-Learning

    OpenAIRE

    Jahankhani, Hossein

    2012-01-01

    This paper reviews some aspects of e-learning through the World Wide Web. The e-revolution, as a new phenomenon, has influenced society through its means and strategies. E-learning is one of the by-products of the e-revolution, aimed at making learning more convenient and effective. Over time the Internet became a source of information, and people started to learn through the Internet instead of from books. It gives the flexibility of remote access at any time. The working people and the students are inspired by th...

  20. Creating a web site the missing manual

    CERN Document Server

    MacDonald, Matthew

    2008-01-01

    Think you have to be a technical wizard to build a great web site? Think again. If you want to create an engaging web site, this thoroughly revised, completely updated edition of Creating a Web Site: The Missing Manual demystifies the process and provides tools, techniques, and expert guidance for developing a professional and reliable web presence. Whether you want to build a personal web site, an e-commerce site, a blog, or a web site for a specific occasion or promotion, this book gives you detailed instructions and clear-headed advice for: Everything from planning to launching. From pi

  1. Precision genome editing

    DEFF Research Database (Denmark)

    Steentoft, Catharina; Bennett, Eric P; Schjoldager, Katrine Ter-Borch Gram

    2014-01-01

    Precise and stable gene editing in mammalian cell lines has until recently been hampered by the lack of efficient targeting methods. While different gene silencing strategies have had tremendous impact on many biological fields, they have generally not been applied with wide success in the field of glycobiology, primarily due to their low efficiencies, with resultant failure to impose substantial phenotypic consequences upon the final glycosylation products. Here, we review novel nuclease-based precision genome editing techniques enabling efficient and stable gene editing, including gene disruption by introducing single or double-stranded breaks at a defined genomic sequence. We here compare and contrast the different techniques and summarize their current applications, highlighting cases from the field of glycobiology as well as pointing to future opportunities. The emerging potential of precision gene...

  2. REDIdb: an upgraded bioinformatics resource for organellar RNA editing sites.

    Science.gov (United States)

    Picardi, Ernesto; Regina, Teresa M R; Verbitskiy, Daniil; Brennicke, Axel; Quagliariello, Carla

    2011-03-01

    RNA editing is a post-transcriptional molecular process whereby the information in a genetic message is modified from that in the corresponding DNA template by means of nucleotide substitutions, insertions and/or deletions. It occurs mostly in organelles by clade-specific diverse and unrelated biochemical mechanisms. RNA editing events have been annotated in primary databases as GenBank and at more sophisticated level in the specialized databases REDIdb, dbRES and EdRNA. At present, REDIdb is the only freely available database that focuses on the organellar RNA editing process and annotates each editing modification in its biological context. Here we present an updated and upgraded release of REDIdb with a web-interface refurbished with graphical and computational facilities that improve RNA editing investigations. Details of the REDIdb features and novelties are illustrated and compared to other RNA editing databases. REDIdb is freely queried at http://biologia.unical.it/py_script/REDIdb/. Copyright © 2010 Elsevier B.V. and Mitochondria Research Society. All rights reserved.

  3. Development of Content Management System-based Web Applications

    OpenAIRE

    Souer, J.

    2012-01-01

    Web engineering is the application of systematic and quantifiable approaches (concepts, methods, techniques, tools) to cost-effective requirements analysis, design, implementation, testing, operation, and maintenance of high quality web applications. Over the past years, Content Management Systems (CMS) have emerged as an important foundation for the web engineering process. CMS can be defined as a tool for the creation, editing and management of web information in an integral way. A CMS appe...

  4. Trends in the wide web converting markets for UV curing

    International Nuclear Information System (INIS)

    Fisher, R.

    1999-01-01

    As we prepare to enter a new decade, the use of ultraviolet (UV) energy to initiate the polymerization of coatings in the wide web segment of the Converting industry continues to increase. As is typical in the Converting industry, while many of the significant advances in technology have been developed around the world, they have been driven initially by the Western European markets. This was true with regards to the introduction of water-borne Pressure Sensitive Adhesives and thermal curing 100% solids silicone release coatings during the late 1970s and early 1980s, but this trend has changed with regards to the current state-of-the-art in UV curing

  5. Affordances of students' using the World Wide Web as a publishing medium in project-based learning environments

    Science.gov (United States)

    Bos, Nathan Daniel

    This dissertation investigates the emerging affordance of the World Wide Web as a place for high school students to become authors and publishers of information. Two empirical studies lay groundwork for student publishing by examining learning issues related to audience adaptation in writing, motivation and engagement with hypermedia, design, problem-solving, and critical evaluation. Two models of student publishing on the World Wide Web were investigated over the course of two 11th-grade project-based science curricula. In the first curricular model, students worked in pairs to design informative hypermedia projects about infectious diseases that were published on the Web. Four case studies were written, drawing on both product- and process-related data sources. Four theoretically important findings are illustrated through these cases: (1) multimedia, especially graphics, seemed to catalyze some students' design processes by affecting the sequence of their design process and by providing a connection between the science content and their personal interest areas, (2) hypermedia design can demand high levels of analysis and synthesis of science content, (3) students can learn to think about science content representation through engagement with challenging design tasks, and (4) students' consideration of an outside audience can be facilitated by teacher-given design principles. The second Web-publishing model examines how students critically evaluate scientific resources on the Web, and how students can contribute to the Web's organization and usability by publishing critical reviews. Students critically evaluated Web resources using a four-part scheme: summarization of content, evaluation of credibility, evaluation of organizational structure, and evaluation of appearance. Content analyses comparing students' reviews and reviewed Web documents showed that students were proficient at summarizing content of Web documents, identifying their publishing

  6. How Students Evaluate Information and Sources when Searching the World Wide Web for Information

    Science.gov (United States)

    Walraven, Amber; Brand-Gruwel, Saskia; Boshuizen, Henny P. A.

    2009-01-01

    The World Wide Web (WWW) has become the biggest information source for students while solving information problems for school projects. Since anyone can post anything on the WWW, information is often unreliable or incomplete, and it is important to evaluate sources and information before using them. Earlier research has shown that students have…

  7. INTERNET and information about nuclear sciences. The world wide web virtual library: nuclear sciences

    International Nuclear Information System (INIS)

    Kuruc, J.

    1999-01-01

    In this work the author proposes to constitute a new virtual library which should centralize information from the nuclear disciplines on the INTERNET, in order to give users, first and foremost, connections to the most important links in the nuclear sciences. The author has entitled this new virtual library The World Wide Web Library: Nuclear Sciences. In constituting this virtual library, the following basic principles were chosen: home pages of international organizations important from the point of view of the nuclear disciplines; home pages of the National Nuclear Commissions and governments; home pages of nuclear scientific societies; web pages specialized in nuclear topics in general; periodic tables of elements and isotopes; web pages devoted to the Chernobyl accident and its consequences; and web pages with an antinuclear aim. The links then continue, grouped on web pages according to individual nuclear areas: nuclear arsenals; nuclear astrophysics; nuclear aspects of biology (radiobiology); nuclear chemistry; nuclear companies; nuclear data centres; nuclear energy; nuclear energy, environmental aspects of (radioecology); nuclear energy info centres; nuclear engineering; nuclear industries; nuclear magnetic resonance; nuclear material monitoring; nuclear medicine and radiology; nuclear physics; nuclear power (plants); nuclear reactors; nuclear risk; nuclear technologies and defence; nuclear testing; nuclear tourism; nuclear wastes. Within these individual areas, web links are concentrated into the following groups: virtual libraries and specialized servers; science; nuclear societies; nuclear departments of the academic institutes; nuclear research institutes and laboratories; centres, info links

  8. World Wide Webs: Crossing the Digital Divide through Promotion of Public Access

    Science.gov (United States)

    Coetzee, Liezl

    “As Bill Gates and Steve Case proclaim the global omnipresence of the Internet, the majority of non-Western nations and 97 per cent of the world's population remain unconnected to the net for lack of money, access, or knowledge. This exclusion of so vast a share of the global population from the Internet sharply contradicts the claims of those who posit the World Wide Web as a ‘universal' medium of egalitarian communication.” (Trend 2001:2)

  9. Digital libraries and World Wide Web sites and page persistence.

    Directory of Open Access Journals (Sweden)

    Wallace Koehler

    1999-01-01

    Full Text Available Web pages and Web sites, some argue, can either be collected as elements of digital or hybrid libraries, or, as others would have it, the WWW is itself a library. We begin with the assumption that Web pages and Web sites can be collected and categorized. The paper explores the proposition that the WWW constitutes a library. We conclude that the Web is not a digital library. However, its component parts can be aggregated and included as parts of digital library collections. These, in turn, can be incorporated into "hybrid libraries." These are libraries with both traditional and digital collections. Material on the Web can be organized and managed. Native documents can be collected in situ, disseminated, distributed, catalogued, indexed, controlled, in traditional library fashion. The Web therefore is not a library, but material for library collections is selected from the Web. That said, the Web and its component parts are dynamic. Web documents undergo two kinds of change. The first type, the type addressed in this paper, is "persistence" or the existence or disappearance of Web pages and sites, or in a word the lifecycle of Web documents. "Intermittence" is a variant of persistence, and is defined as the disappearance and subsequent reappearance of Web documents. At any given time, about five percent of Web pages are intermittent, which is to say they are gone but will return. Over time a Web collection erodes. Based on a 120-week longitudinal study of a sample of Web documents, it appears that the half-life of a Web page is somewhat less than two years and the half-life of a Web site is somewhat more than two years. That is to say, an unweeded Web document collection created two years ago would contain the same number of URLs, but only half of those URLs point to content. The second type of change Web documents experience is change in Web page or Web site content. Again based on the Web document samples, very nearly all Web pages and sites undergo some
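
    The half-life figures translate into a simple survival estimate: with half-life T (in years), the expected fraction of an unweeded collection still resolving after t years is 0.5 ** (t / T). The sketch below only restates that arithmetic with the paper's approximate two-year figure.

        def surviving_fraction(t_years, half_life_years=2.0):
            """Expected fraction of URLs still pointing to content after t years."""
            return 0.5 ** (t_years / half_life_years)

        for t in (1, 2, 4):
            print(t, "years:", round(surviving_fraction(t), 2))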

  10. Web Page Recommendation Using Web Mining

    OpenAIRE

    Modraj Bhavsar; Mrs. P. M. Chavan

    2014-01-01

    On the World Wide Web, various kinds of content are generated in huge amounts, so giving relevant results to users has made web recommendation an important part of web applications. On the web, different kinds of recommendations are made available to users every day, including images, video, audio, query suggestions and web pages. In this paper we aim at providing a framework for web page recommendation. 1) First we describe the basics of web mining and the types of web mining. 2) Details of each...

  11. Professional JavaScript for Web Developers

    CERN Document Server

    Zakas, Nicholas C

    2011-01-01

    A significant update to a bestselling JavaScript book As the key scripting language for the web, JavaScript is supported by every modern web browser and allows developers to create client-side scripts that take advantage of features such as animating the canvas tag and enabling client-side storage and application caches. After an in-depth introduction to the JavaScript language, this updated edition of a bestseller progresses to break down how JavaScript is applied for web development using the latest web development technologies. Veteran author and JavaScript guru Nicholas Zakas shows how Jav

  12. From theater to the world wide web--a new online era for surgical education.

    LENUS (Irish Health Repository)

    O'Leary, D Peter

    2012-07-01

    Traditionally, surgical education has been confined to operating and lecture theaters. Access to the World Wide Web and services, such as YouTube and iTunes has expanded enormously. Each week throughout Ireland, nonconsultant hospital doctors work hard to create presentations for surgical teaching. Once presented, these valuable presentations are often never used again.

  13. Enhancement of shear strength and ductility for reinforced concrete wide beams due to web reinforcement

    Directory of Open Access Journals (Sweden)

    M. Said

    2013-12-01

    Full Text Available The shear behavior of reinforced concrete wide beams was investigated. The experimental program consisted of nine beams of 29 MPa concrete strength tested with a shear span-depth ratio equal to 3.0. One of the tested beams had no web reinforcement as a control specimen. The flexure mode of failure was secured for all of the specimens to allow for shear mode of failure. The key parameters covered in this investigation are the effect of the existence, spacing, amount and yield stress of the vertical stirrups on the shear capacity and ductility of the tested wide beams. The study shows that the contribution of web reinforcement to the shear capacity is significant and directly proportional to the amount and spacing of the shear reinforcement. The increase in the shear capacity ranged from 32% to 132% for the range of the tested beams compared with the control beam. High grade steel was more effective in the contribution of the shear strength of wide beams. Also, test results demonstrate that the shear reinforcement significantly enhances the ductility of the wide beams. In addition, shear resistances at failure recorded in this study are compared to the analytical strengths calculated according to the current Egyptian Code and the available international codes. The current study highlights the need to include the contribution of shear reinforcement in the Egyptian Code requirements for shear capacity of wide beams.

  14. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.; Sarfraz, M.

    2004-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  15. [Genome editing of industrial microorganism].

    Science.gov (United States)

    Zhu, Linjiang; Li, Qi

    2015-03-01

    Genome editing is defined as highly effective and precise modification of a cellular genome on a large scale. In recent years, such genome-editing methods have been rapidly developed in the field of industrial strain improvement. These quickly evolving methods have thoroughly changed the old mode of inefficient genetic modification, which was "one modification, one selection marker, and one target site". Highly effective modification modes in genome editing have been developed, including simultaneous modification of multiple genes; highly effective insertion, replacement, and deletion of target genes at the genome scale; and cut-and-paste of large DNA fragments. These new tools for microbial genome editing will certainly be applied widely, increasing the efficiency of industrial strain improvement and promoting both a revolution in the traditional fermentation industry and the rapid development of novel industrial biotechnology such as biofuel and biomaterial production. The technological principles of these genome-editing methods and their applications are summarized in this review, which can benefit the engineering and construction of industrial microorganisms.

  16. Web-Based Instruction: A Guide for Libraries, Third Edition

    Science.gov (United States)

    Smith, Susan Sharpless

    2010-01-01

    Expanding on the popular, practical how-to guide for public, academic, school, and special libraries, technology expert Susan Sharpless Smith offers library instructors the confidence to take Web-based instruction into their own hands. Smith has thoroughly updated "Web-Based Instruction: A Guide for Libraries" to include new tools and trends,…

  17. Two virtual astro refresher courses on the world-wide-web

    International Nuclear Information System (INIS)

    Goldwein, Joel W.

    1997-01-01

    Purpose/Objective: The Internet offers a novel venue for providing educational material to radiation oncologists. This exhibit demonstrates its utility for providing the complete content of two past ASTRO refresher courses. Materials and Methods: The audio recording, handout and slides from the 1995 ASTRO refresher course entitled 'Radiation Therapy for Pediatric Brain Tumors; Standards of Care, Current Clinical Trials and New Directions' and the 1996 ASTRO refresher course entitled 'Internet-based communications in Radiation Oncology' were digitized and placed on an Internet World-Wide-Web site. The Web address was posted on the refresher course handout and in the meeting book ('http://goldwein1.xrt.upenn.edu/brain95.html' and 'http://goldwein1.xrt.upenn.edu/astro96/'). The computer distributing this material is an Intel-based 486 DEC50 personal computer with a 50 MHz processor running Windows NT 3.51 workstation. Software utilized to distribute the material is in the public domain and includes EWMAC's 'httpd', and Progressive Network's 'RealAudio Server' and 'Encoder'. The University's dedicated Internet connection is used to 'serve' this material. Results: The two approximately 100 minute lectures have been encoded into several 'RealAudio' files totaling 10 Megabytes in size. These files are accessible with moderate to excellent quality and speed utilizing as little as a 14.4k modem connection to the Internet. Use of 'streaming' technology provides a means for playing the audio files over the Internet after downloading only a small portion of the files. The time required to digitize the material has been approximately 40 hours, with most time related to digitizing slides from a PowerPoint presentation. Not all slides have been digitized as of this time. To date, approximately 400 accesses to this resource have been logged on the system. Seven electronic comment forms for the second course have all rated it as 'superior'. Pitfalls include the difficulty

  18. Navigational Structure on the World Wide Web: Usability Concerns, User Preferences, and "Browsing Behavior."

    Science.gov (United States)

    Frick, Theodore; Monson, John A.; Xaver, Richard F.; Kilic, Gulsen; Conley, Aaron T.; Wamey, Beatrice

    There are several approaches a World Wide Web site designer considers in developing a menu structure. One consideration is the content of the menus (what choices are available to the user). Another consideration is the physical layout of the menu structure. The physical layout of a menu may be described as being one of at least three different…

  19. A World Wide Web Human Dimensions Framework and Database for Wildlife and Forest Planning

    Science.gov (United States)

    Michael A. Tarrant; Alan D. Bright; H. Ken Cordell

    1999-01-01

    The paper describes a human dimensions framework (HDF) for application in wildlife and forest planning. The HDF is delivered via the World Wide Web and retrieves data on-line from the Social, Economic, Environmental, Leisure, and Attitudes (SEELA) database. The proposed HDF is guided by ten fundamental HD principles, and is applied to wildlife and forest planning using...

  20. DICOM to print, 35-mm slides, web, and video projector: tutorial using Adobe Photoshop.

    Science.gov (United States)

    Gurney, Jud W

    2002-10-01

    Preparing images for publication has traditionally dealt with film and the photographic process. With picture archiving and communications systems, many departments will no longer produce film. This will change how images are produced for publication. DICOM, the file format for radiographic images, has to be converted and then prepared for traditional publication, 35-mm slides, the newest techniques of video projection, and the World Wide Web. Tagged Image File Format (TIFF) is the common format for traditional print publication, whereas Joint Photographic Experts Group (JPEG) is the current file format for the World Wide Web. Each medium has specific requirements that can be met with a common image-editing program such as Adobe Photoshop (Adobe Systems, San Jose, CA). High-resolution images are required for print, a process that requires interpolation. However, the Internet requires images with a small file size for rapid transmission. The resolution of each output differs and the image resolution must be optimized to match the output of the publishing medium.
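    The same conversion can also be done programmatically. The sketch below reads a DICOM file and writes a high-resolution TIFF for print plus a downsampled JPEG for the Web; pydicom and Pillow are assumptions of this sketch (the article's workflow uses Adobe Photoshop), and the windowing is deliberately naive.

```python
# Programmatic analogue of the Photoshop workflow described above: read a DICOM
# file, scale it to 8-bit, save a lossless TIFF for print and a small JPEG for
# the Web. pydicom and Pillow are assumptions; the input file name is invented.
import numpy as np
import pydicom
from PIL import Image

ds = pydicom.dcmread("chest.dcm")            # hypothetical input file
pixels = ds.pixel_array.astype(np.float64)

# Naive window: stretch the full dynamic range into 0..255 (real workflows
# would apply the stored window centre/width first).
pixels -= pixels.min()
pixels *= 255.0 / max(pixels.max(), 1.0)
img = Image.fromarray(pixels.astype(np.uint8))

# Print: keep full resolution, lossless TIFF at 300 dpi.
img.save("figure_print.tif", dpi=(300, 300))

# Web: downsample for fast transmission, lossy JPEG.
img.resize((img.width // 4, img.height // 4), Image.LANCZOS).save(
    "figure_web.jpg", quality=85
)
```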

  1. PHP Solutions Dynamic Web Design Made Easy

    CERN Document Server

    Powers, David

    2010-01-01

    This is the second edition of David Powers's highly respected PHP Solutions: Dynamic Web Design Made Easy. This new edition has been updated by David to incorporate changes to PHP since the first edition and to offer the latest techniques - a classic guide modernized for 21st century PHP techniques, innovations, and best practices. You want to make your websites more dynamic by adding a feedback form, creating a private area where members can upload images that are automatically resized, or perhaps storing all your content in a database. The problem is, you're not a programmer and the thought o

  2. Rendimiento de los sistemas de recuperación en la world wide web: revisión metodológica.

    Directory of Open Access Journals (Sweden)

    Olvera Lobo, María Dolores

    2000-03-01

    Full Text Available This study is an attempt to establish a methodology for the evaluation of information retrieval with search engines on the World Wide Web. The method, which is explained in detail, adapts traditional evaluation techniques to the peculiarities of the Web and makes use of precision and recall scores based on the relevance of the first 20 results retrieved. This method has been successfully applied to the evaluation of ten different search engines.

    This study aims to contribute to establishing a methodology for evaluating the information retrieval performance of search tools in the World Wide Web environment. The method designed (and successfully applied) to evaluate search results is described in detail; it adapts traditional evaluation techniques to the particularities of the Web and employs relevance-based precision and recall measures over the first 20 results retrieved.
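    A small sketch of the relevance-based measures the methodology relies on, precision and recall computed over the first 20 results returned by an engine; the ranked list and the relevance judgments are invented for illustration.

```python
# Precision and recall over the top-k (here k = 20) results of a search engine.
# The ranked output and the judged-relevant set below are invented.

def precision_recall_at_k(retrieved: list, relevant: set, k: int = 20):
    top_k = retrieved[:k]
    hits = sum(1 for doc in top_k if doc in relevant)
    precision = hits / len(top_k) if top_k else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

retrieved = [f"doc{i}" for i in range(1, 31)]          # ranked engine output
relevant = {"doc1", "doc4", "doc9", "doc25", "doc40"}  # judged relevant set
print(precision_recall_at_k(retrieved, relevant))      # (0.15, 0.6)
```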

  3. An Image Retrieval and Processing Expert System for the World Wide Web

    Science.gov (United States)

    Rodriguez, Ricardo; Rondon, Angelica; Bruno, Maria I.; Vasquez, Ramon

    1998-01-01

    This paper presents a system that is being developed in the Laboratory of Applied Remote Sensing and Image Processing at the University of P.R. at Mayaguez. It describes the components that constitute its architecture. The main elements are: a Data Warehouse, an Image Processing Engine, and an Expert System. Together, they provide a complete solution to researchers from different fields that make use of images in their investigations. Also, since it is available to the World Wide Web, it provides remote access and processing of images.

  4. From theater to the world wide web--a new online era for surgical education.

    Science.gov (United States)

    O'Leary, D Peter; Corrigan, Mark A; McHugh, Seamus M; Hill, A D; Redmond, H Paul

    2012-01-01

    Traditionally, surgical education has been confined to operating and lecture theaters. Access to the World Wide Web and services, such as YouTube and iTunes, has expanded enormously. Each week throughout Ireland, nonconsultant hospital doctors work hard to create presentations for surgical teaching. Once presented, these valuable presentations are often never used again. We aimed to compile surgical presentations online and establish a new online surgical education tool. We also sought to measure the effect of this educational tool on surgical presentation quality. Surgical presentations from Cork University Hospital and Beaumont Hospital presented between January 2010 and April 2011 were uploaded to http://www.pilgrimshospital.com/presentations. A YouTube channel and iTunes application were created. Web site hits were monitored. Quality of presentations was assessed by 4 independent senior surgical judges using a validated PowerPoint assessment form. Judges were randomly given 6 presentations; 3 presentations were pre-web site setup and 3 were post-web site setup. Once uploading commenced, presenters were informed. A total of 89 presentations have been uploaded to date. This includes 55 cases, 17 journal club, and 17 short bullet presentations. This has been associated with 46,037 web site page views. Establishment of the web site was associated with a significant improvement in the quality of presentations. Mean scores for the pre- and post-web site groups were 6.2 vs 7.7 out of 9, respectively (p = 0.037). This novel educational tool provides a unique method to enable surgical education to become more accessible to trainees, while also improving the overall quality of surgical teaching PowerPoint presentations. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  5. Validity and client use of information from the World Wide Web regarding veterinary anesthesia in dogs.

    Science.gov (United States)

    Hofmeister, Erik H; Watson, Victoria; Snyder, Lindsey B C; Love, Emma J

    2008-12-15

    To determine the validity of the information on the World Wide Web concerning veterinary anesthesia in dogs and to determine the methods dog owners use to obtain that information. Web-based search and client survey. 73 Web sites and 92 clients. Web sites were scored on a 5-point scale for completeness and accuracy of information about veterinary anesthesia by 3 board-certified anesthesiologists. A search for anesthetic information regarding 49 specific breeds of dogs was also performed. A survey was distributed to the clients who visited the University of Georgia Veterinary Teaching Hospital during a 4-month period to solicit data about sources used by clients to obtain veterinary medical information and the manner in which information obtained from Web sites was used. The general search identified 73 Web sites that included information on veterinary anesthesia; these sites received a mean score of 3.4 for accuracy and 2.5 for completeness. Of 178 Web sites identified through the breed-specific search, 57 (32%) indicated that a particular breed was sensitive to anesthesia. Of 83 usable, completed surveys, 72 (87%) indicated the client used the Web for veterinary medical information. Fifteen clients (18%) indicated they believed their animal was sensitive to anesthesia because of its breed. Information available on the internet regarding anesthesia in dogs is generally not complete and may be misleading with respect to risks to specific breeds. Consequently, veterinarians should appropriately educate clients regarding anesthetic risk to their particular dog.

  6. Genome Editing: A New Approach to Human Therapeutics.

    Science.gov (United States)

    Porteus, Matthew

    2016-01-01

    The ability to manipulate the genome with precise spatial and nucleotide resolution (genome editing) has been a powerful research tool. In the past decade, the tools and expertise for using genome editing in human somatic cells and pluripotent cells have increased to such an extent that the approach is now being developed widely as a strategy to treat human disease. The fundamental process depends on creating a site-specific DNA double-strand break (DSB) in the genome and then allowing the cell's endogenous DSB repair machinery to fix the break such that precise nucleotide changes are made to the DNA sequence. With the development and discovery of several different nuclease platforms and increasing knowledge of the parameters affecting different genome editing outcomes, genome editing frequencies now reach therapeutic relevance for a wide variety of diseases. Moreover, there is a series of complementary approaches to assessing the safety and toxicity of any genome editing process, irrespective of the underlying nuclease used. Finally, the development of genome editing has raised the issue of whether it should be used to engineer the human germline. Although such an approach could clearly prevent the birth of people with devastating and destructive genetic diseases, questions remain about whether human society is morally responsible enough to use this tool.

  7. The readability of pediatric patient education materials on the World Wide Web.

    Science.gov (United States)

    D'Alessandro, D M; Kingsley, P; Johnson-West, J

    2001-07-01

    Literacy is a national and international problem. Studies have shown the readability of adult and pediatric patient education materials to be too high for average adults. Materials should be written at the 8th-grade level or lower. To determine the general readability of pediatric patient education materials designed for adults on the World Wide Web (WWW). GeneralPediatrics.com (http://www.generalpediatrics.com) is a digital library serving the medical information needs of pediatric health care providers, patients, and families. Documents from 100 different authoritative Web sites designed for laypersons were evaluated using a built-in computer software readability formula (Flesch Reading Ease and Flesch-Kincaid reading levels) and hand calculation methods (Fry Formula and SMOG methods). Analysis of variance and paired t tests determined significance. Eighty-nine documents constituted the final sample; they covered a wide spectrum of pediatric topics. The overall Flesch Reading Ease score was 57.0. The overall mean Fry Formula was 12.0 (12th grade, 0 months of schooling) and SMOG was 12.2. The overall Flesch-Kincaid grade level was significantly lower. Patient education materials on the WWW are not written at an appropriate reading level for the average adult. We propose that a practical reading level and how it was determined be included on all patient education materials on the WWW for general guidance in material selection. We discuss suggestions for improved readability of patient education materials.
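    The two Flesch measures named above follow published formulas: Reading Ease = 206.835 - 1.015 * (words per sentence) - 84.6 * (syllables per word), and Flesch-Kincaid Grade = 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59. The sketch below applies them with a crude vowel-group syllable heuristic, so its scores are only approximate; the sample text is invented.

```python
# Approximate Flesch Reading Ease and Flesch-Kincaid grade level. The syllable
# counter is a crude vowel-group heuristic, so scores differ slightly from
# dictionary-based tools.
import re

def count_syllables(word: str) -> int:
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_scores(text: str):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences          # words per sentence
    spw = syllables / max(1, len(words))  # syllables per word
    reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return reading_ease, grade_level

ease, grade = flesch_scores("Give your child plenty of fluids. Call the doctor if the fever persists.")
print(f"Flesch Reading Ease: {ease:.1f}, Flesch-Kincaid grade: {grade:.1f}")
```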

  8. World wide web implementation of the Langley technical report server

    Science.gov (United States)

    Nelson, Michael L.; Gottlich, Gretchen L.; Bianco, David J.

    1994-01-01

    On January 14, 1993, NASA Langley Research Center (LaRC) made approximately 130 formal, 'unclassified, unlimited' technical reports available via the anonymous FTP Langley Technical Report Server (LTRS). LaRC was the first organization to provide a significant number of aerospace technical reports for open electronic dissemination. LTRS has been successful in its first 18 months of operation, with over 11,000 reports distributed and has helped lay the foundation for electronic document distribution for NASA. The availability of World Wide Web (WWW) technology has revolutionized the Internet-based information community. This paper describes the transition of LTRS from a centralized FTP site to a distributed data model using the WWW, and suggests how the general model for LTRS can be applied to other similar systems.

  9. Studying Acute Coronary Syndrome Through the World Wide Web: Experiences and Lessons.

    Science.gov (United States)

    Alonzo, Angelo A

    2017-10-13

    This study details my viewpoint on the experiences, lessons, and assessments of conducting a national study on care-seeking behavior for heart attack in the United States utilizing the World Wide Web. The Yale Heart Study (YHS) was funded by the National Heart, Lung, and Blood Institute (NHLBI) of the National Institutes of Health (NIH). Grounded on two prior studies, the YHS combined a Web-based interview survey instrument; ads placed on the Internet; flyers and posters in public libraries, senior centers, and rehabilitation centers; information on chat rooms; a viral marketing strategy; and print ads to attract potential participants to share their heart attack experiences. Along the way, the grant was transferred from Ohio State University (OSU) to Yale University, and significant administrative, information technology, and personnel challenges ensued that materially delayed the study's execution. Overall, the use of the Internet to collect data on care-seeking behavior is very time consuming and emergent. The cost of using the Web was approximately 31% less expensive than that of face-to-face interviews. However, the quality of the data may have suffered because of the absence of some data compared with interviewing participants. Yet the representativeness of the 1154 usable surveys appears good, with the exception of a dearth of African American participants. ©Angelo A Alonzo. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 13.10.2017.

  10. Developing as new search engine and browser for libraries to search and organize the World Wide Web library resources

    OpenAIRE

    Sreenivasulu, V.

    2000-01-01

    Internet Granthalaya urges worldwide advocates and targets the task of creating a new search engine and dedicated browser. Internet Granthalaya may be the ultimate search engine exclusively dedicated for every library to use to search and organize the World Wide Web library resources

  11. Traitor: associating concepts using the world wide web

    NARCIS (Netherlands)

    Drijfhout, Wanno; Oliver, J.; Oliver, Jundt; Wevers, L.; Hiemstra, Djoerd

    We use Common Crawl's 25TB data set of web pages to construct a database of associated concepts using Hadoop. The database can be queried through a web application with two query interfaces. A textual interface allows searching for similarities and differences between multiple concepts using a query

  12. Intelligent System for Data Tracking in Image Editing Company

    Directory of Open Access Journals (Sweden)

    Kimlong Ngin

    2017-11-01

    Full Text Available The success of data transactions in a company largely depends on the intelligence system used in its database and application system. The complex and heterogeneous data in the log file make it more difficult for users to manage data effectively. Therefore, this paper creates an application system that can manage data from the log file. A sample was collected from an image editing company in Cambodia by interviewing five customers and seven operators, who worked on the data files for 300 images. This paper found two results: first, the agent script was used for retrieving data from the log file, classifying data, and inserting data into a database; and second, the web interface was used by the users to view the results. The intelligence capabilities of our application, together with a friendly web-based and window-based experience, allow the users to easily acquire, manage, and access the data in an image editing company.
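    The agent's three steps described above (retrieve records from the log file, classify them, insert them into a database) can be pictured with a small sketch; the log format, the classification rule, and the table schema are invented for illustration and are not the company's actual system.

```python
# Minimal log-processing agent: parse a log file, classify each record, and
# insert it into a database. Log format, categories, and schema are invented.
import re
import sqlite3

LINE = re.compile(r"(?P<ts>\S+) (?P<operator>\w+) edited (?P<image>\S+) in (?P<seconds>\d+)s")

def classify(seconds: int) -> str:
    return "rush" if seconds < 60 else "standard"

conn = sqlite3.connect("editing_jobs.db")
conn.execute("CREATE TABLE IF NOT EXISTS jobs (ts TEXT, operator TEXT, image TEXT, seconds INT, category TEXT)")

with open("editing.log") as log:            # hypothetical log file
    for line in log:
        m = LINE.match(line)
        if not m:
            continue                        # skip malformed lines
        secs = int(m["seconds"])
        conn.execute(
            "INSERT INTO jobs VALUES (?, ?, ?, ?, ?)",
            (m["ts"], m["operator"], m["image"], secs, classify(secs)),
        )
conn.commit()
```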

  13. Finding Emotional-Laden Resources on the World Wide Web

    Directory of Open Access Journals (Sweden)

    Diane Rasmussen Neal

    2011-03-01

    Full Text Available Some content in multimedia resources can depict or evoke certain emotions in users. The aim of Emotional Information Retrieval (EmIR) and of our research is to identify knowledge about emotional-laden documents and to use these findings in a new kind of World Wide Web information service that allows users to search and browse by emotion. Our prototype, called Media EMOtion SEarch (MEMOSE), is largely based on the results of research regarding emotive music pieces, images and videos. In order to index both evoked and depicted emotions in these three media types and to make them searchable, we work with a controlled vocabulary, slide controls to adjust the emotions' intensities, and broad folksonomies to identify and separate the correct resource-specific emotions. This separation of so-called power tags is based on a tag distribution which follows either an inverse power law (only one emotion was recognized) or an inverse-logistical shape (two or three emotions were recognized). Both distributions are well known in information science. MEMOSE consists of a tool for tagging basic emotions with the help of slide controls, a processing device to separate power tags, a retrieval component consisting of a search interface (for any topic in combination with one or more emotions) and a results screen. The latter shows two separately ranked lists of items for each media type (depicted and felt emotions), displaying thumbnails of resources, ranked by the mean values of intensity. In the evaluation of the MEMOSE prototype, study participants described our EmIR system as an enjoyable Web 2.0 service.

  14. Enhancing Student Performance in First-Semester General Chemistry Using Active Feedback through the World Wide Web

    Science.gov (United States)

    Chambers, Kent A.; Blake, Bob

    2007-01-01

    A new interactive feedback system was recently launched on the World Wide Web for instructors, so that they can gain a better understanding of their students and their problems. The feedback, in combination with tailored lectures, is expected to enhance student performance in the first semester of general chemistry.

  15. Development of a world wide web-based interactive education program to improve detectability of pulmonary nodules on chest radiographs

    International Nuclear Information System (INIS)

    Ohm, Joon Young; Kim, Jin Hwan; Kim, Sung Soo; Han, Ki Tae; Ahn, Young Seob; Shin, Byung Seok; Bae, Kyongtae T.

    2007-01-01

    To design and develop a World Wide Web-based education program that will allow trainees to interactively learn and improve the diagnostic capability of detecting pulmonary nodules on chest radiographs. Chest radiographs with known diagnosis were retrieved and selected from our institutional clinical archives. A database was constructed by sorting radiographs into three groups: normal, nodule, and false positive (i.e., nodule-like focal opacity). Each nodule was assigned a degree of detectability: easy, intermediate, difficult, or likely missed. Nodules were characterized by their morphology (well-defined, ill-defined, irregular, faint) and by other associated pathologies or potentially obscuring structures. The Web site was organized into four sections: study, test, record and information. The Web site allowed a user to interactively undergo the training section appropriate to the user's diagnostic capability. The training was enhanced by means of clinical and other pertinent radiological findings included in the database. The outcome of the training was tested with clinical test radiographs that presented nodules or false positives with varying diagnostic difficulties. A World Wide Web-based education program is a promising technique that would allow trainees to interactively learn and improve the diagnostic capability of detecting and characterizing pulmonary nodules.

  16. Use of World Wide Web-based directories for tracing subjects in epidemiologic studies.

    Science.gov (United States)

    Koo, M M; Rohan, T E

    2000-11-01

    The recent availability of World Wide Web-based directories has opened up a new approach for tracing subjects in epidemiologic studies. The completeness of two World Wide Web-based directories (Canada411 and InfoSpace Canada) for subject tracing was evaluated by using a randomized crossover design for 346 adults randomly selected from respondents in an ongoing cohort study. About half (56.4%) of the subjects were successfully located by using either Canada411 or InfoSpace. Of the 43.6% of the subjects who could not be located using either directory, the majority (73.5%) were female. Overall, there was no clear advantage of one directory over the other. Although Canada411 could find significantly more subjects than InfoSpace, the number of potential matches returned by Canada411 was also higher, which meant that a longer list of potential matches had to be examined before a true match could be found. One strategy to minimize the number of potential matches per true match is to first search by InfoSpace with the last name and first name, then by Canada411 with the last name and first name, and finally by InfoSpace with the last name and first initial. Internet-based searches represent a potentially useful approach to tracing subjects in epidemiologic studies.
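    The lookup order recommended above (InfoSpace with the full name, then Canada411 with the full name, then InfoSpace with the last name and first initial) can be written as a small cascade. The directory queries below are stubs, since the real services are queried through their own Web forms; this sketch only encodes the ordering.

```python
# Cascade of directory lookups in the order the authors suggest, stopping at
# the first step that returns candidate matches. The lookup itself is a stub.

def search_directory(directory: str, last: str, first: str) -> list:
    """Placeholder for a query against a Web directory; returns candidate matches."""
    return []   # stub: a real implementation would query the service

def trace_subject(last: str, first: str) -> list:
    steps = [
        ("InfoSpace", last, first),        # 1) full name: fewer false matches
        ("Canada411", last, first),        # 2) full name: wider coverage
        ("InfoSpace", last, first[:1]),    # 3) last name plus first initial
    ]
    for directory, l, f in steps:
        matches = search_directory(directory, l, f)
        if matches:
            return matches
    return []

print(trace_subject("Smith", "Jane"))      # [] with the stub lookups
```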

  17. Using Open Web APIs in Teaching Web Mining

    Science.gov (United States)

    Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju

    2009-01-01

    With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…

  18. Migrating the facility profile information management system into the world wide web

    Energy Technology Data Exchange (ETDEWEB)

    Kero, R.E.; Swietlik, C.E.

    1994-09-01

    The Department of Energy - Office of Special Projects and Argonne National Laboratory (ANL), along with the Department of Energy - Office of Scientific and Technical Information, have previously designed and implemented the Environment, Safety and Health Facility Profile Information Management System (FPIMS) to facilitate greater efficiency in searching, analyzing and disseminating information found within environment, safety and health oversight documents. This information retrieval based system serves as a central repository for full-text electronic oversight documents, as well as a management planning and decision making tool that can assist in trend and root cause analyses. Continuous improvement of environment, safety and health programs is currently aided through this personal computer-based system by providing a means for the open communication of lessons learned across the department. Overall benefits have included reductions in costs and improvements in past information management capabilities. Access to the FPIMS has historically been possible through a headquarters-based local area network equipped with modems. Continued demand for greater accessibility of the system by remote DOE field offices and sites, in conjunction with the Secretary of Energy's call for greater public accessibility to Department of Energy (DOE) information resources, has been the impetus to expand access through the use of Internet technologies. Therefore, the following paper will discuss reasons for migrating the FPIMS system into the World Wide Web (Web), various lessons learned from the FPIMS migration effort, as well as future plans for enhancing the Web-based FPIMS.

  19. World Wide Web Metaphors for Search Mission Data

    Science.gov (United States)

    Norris, Jeffrey S.; Wallick, Michael N.; Joswig, Joseph C.; Powell, Mark W.; Torres, Recaredo J.; Mittman, David S.; Abramyan, Lucy; Crockett, Thomas M.; Shams, Khawaja S.; Fox, Jason M.; hide

    2010-01-01

    A software program that searches and browses mission data emulates a Web browser, containing standard metaphors for Web browsing. By taking advantage of back-end URLs, users may save and share search states. Also, since a Web interface is familiar to users, training time is reduced. Familiar back and forward buttons move through a local search history. A refresh/reload button regenerates a query, and loads in any new data. URLs can be constructed to save search results. Adding context to the current search is also handled through a familiar Web metaphor. The query is constructed by clicking on hyperlinks that represent new components to the search query. The selection of a link appears to the user as a page change; the choice of links changes to represent the updated search and the results are filtered by the new criteria. Selecting a navigation link changes the current query and also the URL that is associated with it. The back button can be used to return to the previous search state. This software is part of the MSLICE release, which was written in Java.
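    A minimal sketch of the URL-as-search-state metaphor described above: every change to the query changes the URL, so bookmarking, sharing, and the back button come for free. The base URL and parameter names below are invented, not MSLICE's actual scheme.

```python
# Round-trip a search state through a URL: serialize the query into the query
# string and parse it back out. Base URL and parameters are invented.
from urllib.parse import parse_qs, urlencode, urlparse

BASE = "http://mission-data.example/search"

def state_to_url(state: dict) -> str:
    return f"{BASE}?{urlencode(state, doseq=True)}"

def url_to_state(url: str) -> dict:
    return parse_qs(urlparse(url).query)

url = state_to_url({"sol": 123, "instrument": ["navcam", "hazcam"], "type": "image"})
print(url)                 # ...?sol=123&instrument=navcam&instrument=hazcam&type=image
print(url_to_state(url))   # back to a dict; adding context means adding a parameter
```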

  20. Using the World-Wide Web to Facilitate Communications of Non-Destructive Evaluation

    Science.gov (United States)

    McBurney, Sean

    1995-01-01

    The high reliability required for aeronautical components is a major reason for extensive Nondestructive Testing and Evaluation. Here at Langley Research Center (LaRC), there are highly trained and certified personnel to conduct such testing to prevent hazards from occurring in the workplace and on the research projects for the National Aeronautics and Space Administration (NASA). The purpose of my studies was to develop a communication source to educate others about the services and equipment offered here. This was accomplished by creating documents that are accessible to all in the industry via the World Wide Web.

  1. User Interface on the World Wide Web: How to Implement a Multi-Level Program Online

    Science.gov (United States)

    Cranford, Jonathan W.

    1995-01-01

    The objective of this Langley Aerospace Research Summer Scholars (LARSS) research project was to write a user interface that utilizes current World Wide Web (WWW) technologies for an existing computer program written in C, entitled LaRCRisk. The project entailed researching data presentation and script execution on the WWW and then writing input/output procedures for the database management portion of LaRCRisk.

  2. Creation and utilization of a World Wide Web based space radiation effects code: SIREST

    Science.gov (United States)

    Singleterry, R. C. Jr; Wilson, J. W.; Shinn, J. L.; Tripathi, R. K.; Thibeault, S. A.; Noor, A. K.; Cucinotta, F. A.; Badavi, F. F.; Chang, C. K.; Qualls, G. D.; hide

    2001-01-01

    In order for humans and electronics to fully and safely operate in the space environment, codes like HZETRN (High Charge and Energy Transport) must be included in any designer's toolbox for design evaluation with respect to radiation damage. Currently, spacecraft designers do not have easy access to accurate radiation codes like HZETRN to evaluate their design for radiation effects on humans and electronics. Today, the World Wide Web is sophisticated enough to support the entire HZETRN code and all of the associated pre and post processing tools. This package is called SIREST (Space Ionizing Radiation Effects and Shielding Tools). There are many advantages to SIREST. The most important advantage is the instant update capability of the web. Another major advantage is the modularity that the web imposes on the code. Right now, the major disadvantage of SIREST will be its modularity inside the designer's system. This mostly comes from the fact that a consistent interface between the designer and the computer system to evaluate the design is incomplete. This, however, is to be solved in the Intelligent Synthesis Environment (ISE) program currently being funded by NASA.

  3. ePlant and the 3D data display initiative: integrative systems biology on the world wide web.

    Science.gov (United States)

    Fucile, Geoffrey; Di Biase, David; Nahal, Hardeep; La, Garon; Khodabandeh, Shokoufeh; Chen, Yani; Easley, Kante; Christendat, Dinesh; Kelley, Lawrence; Provart, Nicholas J

    2011-01-10

    Visualization tools for biological data are often limited in their ability to interactively integrate data at multiple scales. These computational tools are also typically limited by two-dimensional displays and programmatic implementations that require separate configurations for each of the user's computing devices and recompilation for functional expansion. Towards overcoming these limitations we have developed "ePlant" (http://bar.utoronto.ca/eplant) - a suite of open-source world wide web-based tools for the visualization of large-scale data sets from the model organism Arabidopsis thaliana. These tools display data spanning multiple biological scales on interactive three-dimensional models. Currently, ePlant consists of the following modules: a sequence conservation explorer that includes homology relationships and single nucleotide polymorphism data, a protein structure model explorer, a molecular interaction network explorer, a gene product subcellular localization explorer, and a gene expression pattern explorer. The ePlant's protein structure explorer module represents experimentally determined and theoretical structures covering >70% of the Arabidopsis proteome. The ePlant framework is accessed entirely through a web browser, and is therefore platform-independent. It can be applied to any model organism. To facilitate the development of three-dimensional displays of biological data on the world wide web we have established the "3D Data Display Initiative" (http://3ddi.org).

  4. OrthoVenn: a web server for genome wide comparison and annotation of orthologous clusters across multiple species

    Science.gov (United States)

    Genome wide analysis of orthologous clusters is an important component of comparative genomics studies. Identifying the overlap among orthologous clusters can enable us to elucidate the function and evolution of proteins across multiple species. Here, we report a web platform named OrthoVenn that i...

  5. 07051 Executive Summary -- Programming Paradigms for the Web: Web Programming and Web Services

    OpenAIRE

    Hull, Richard; Thiemann, Peter; Wadler, Philip

    2007-01-01

    The world-wide web raises a variety of new programming challenges. To name a few: programming at the level of the web browser, data-centric approaches, and attempts to automatically discover and compose web services. This seminar brought together researchers from the web programming and web services communities and strove to engage them in communication with each other. The seminar was held in an unusual style, in a mixture of short presentations and in-depth discussio...

  6. Health and medication information resources on the World Wide Web.

    Science.gov (United States)

    Grossman, Sara; Zerilli, Tina

    2013-04-01

    Health care practitioners have increasingly used the Internet to obtain health and medication information. The vast number of Internet Web sites providing such information and concerns with their reliability makes it essential for users to carefully select and evaluate Web sites prior to use. To this end, this article reviews the general principles to consider in this process. Moreover, as cost may limit access to subscription-based health and medication information resources with established reputability, freely accessible online resources that may serve as an invaluable addition to one's reference collection are highlighted. These include government- and organization-sponsored resources (eg, US Food and Drug Administration Web site and the American Society of Health-System Pharmacists' Drug Shortage Resource Center Web site, respectively) as well as commercial Web sites (eg, Medscape, Google Scholar). Familiarity with such online resources can assist health care professionals in their ability to efficiently navigate the Web and may potentially expedite the information gathering and decision-making process, thereby improving patient care.

  7. Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review.

    Science.gov (United States)

    Eysenbach, Gunther; Powell, John; Kuss, Oliver; Sa, Eun-Ryoung

    The quality of consumer health information on the World Wide Web is an important issue for medicine, but to date no systematic and comprehensive synthesis of the methods and evidence has been performed. To establish a methodological framework on how quality on the Web is evaluated in practice, to determine the heterogeneity of the results and conclusions, and to compare the methodological rigor of these studies, to determine to what extent the conclusions depend on the methodology used, and to suggest future directions for research. We searched MEDLINE and PREMEDLINE (1966 through September 2001), Science Citation Index (1997 through September 2001), Social Sciences Citation Index (1997 through September 2001), Arts and Humanities Citation Index (1997 through September 2001), LISA (1969 through July 2001), CINAHL (1982 through July 2001), PsychINFO (1988 through September 2001), EMBASE (1988 through June 2001), and SIGLE (1980 through June 2001). We also conducted hand searches, general Internet searches, and a personal bibliographic database search. We included published and unpublished empirical studies in any language in which investigators searched the Web systematically for specific health information, evaluated the quality of Web sites or pages, and reported quantitative results. We screened 7830 citations and retrieved 170 potentially eligible full articles. A total of 79 distinct studies met the inclusion criteria, evaluating 5941 health Web sites and 1329 Web pages, and reporting 408 evaluation results for 86 different quality criteria. Two reviewers independently extracted study characteristics, medical domains, search strategies used, methods and criteria of quality assessment, results (percentage of sites or pages rated as inadequate pertaining to a quality criterion), and quality and rigor of study methods and reporting. Most frequently used quality criteria used include accuracy, completeness, readability, design, disclosures, and references provided

  8. Architecture for biomedical multimedia information delivery on the World Wide Web

    Science.gov (United States)

    Long, L. Rodney; Goh, Gin-Hua; Neve, Leif; Thoma, George R.

    1997-10-01

    Research engineers at the National Library of Medicine are building a prototype system for the delivery of multimedia biomedical information on the World Wide Web. This paper discusses the architecture and design considerations for the system, which will be used initially to make images and text from the third National Health and Nutrition Examination Survey (NHANES) publicly available. We categorized our analysis as follows: (1) fundamental software tools: we analyzed trade-offs among use of conventional HTML/CGI, X Window Broadway, and Java; (2) image delivery: we examined the use of unconventional TCP transmission methods; (3) database manager and database design: we discuss the capabilities and planned use of the Informix object-relational database manager and the planned schema for the NHANES database; (4) storage requirements for our Sun server; (5) user interface considerations; (6) the compatibility of the system with other standard research and analysis tools; (7) image display: we discuss considerations for consistent image display for end users. Finally, we discuss the scalability of the system in terms of incorporating larger or more databases of similar data, and the extendibility of the system for supporting content-based retrieval of biomedical images. The system prototype is called the Web-based Medical Information Retrieval System. An early version was built as a Java applet and tested on Unix, PC, and Macintosh platforms. This prototype used the MiniSQL database manager to do text queries on a small database of records of participants in the second NHANES survey. The full records and associated x-ray images were retrievable and displayable on a standard Web browser. A second version has now been built, also a Java applet, using the MySQL database manager.

  9. Distributed nuclear medicine applications using World Wide Web and Java technology

    International Nuclear Information System (INIS)

    Knoll, P.; Hoell, K.; Koriska, K.; Mirzaei, S.; Koehn, H.

    2000-01-01

    At present, medical applications applying World Wide Web (WWW) technology are mainly used to view static images and to retrieve some information. The Java platform is a relatively new way of computing, especially designed for network computing and distributed applications, which enables interactive connection between user and information via the WWW. The Java 2 Software Development Kit (SDK), including the Java2D API, Java Remote Method Invocation (RMI) technology, Object Serialization and the Java Advanced Imaging (JAI) extension, was used to achieve a robust, platform independent and network centric solution. Medical image processing software based on this technology is presented and adequate performance capability of Java is demonstrated by an iterative reconstruction algorithm for single photon emission computerized tomography (SPECT). (orig.)
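    The abstract does not spell out which reconstruction algorithm was used, so the sketch below uses one standard iterative scheme (MLEM) on a toy system matrix purely to illustrate what iterative SPECT reconstruction involves; it is not the authors' Java implementation, and the problem sizes are invented.

```python
# Toy MLEM iteration for emission tomography: repeatedly correct the current
# image estimate by back-projecting the ratio of measured to predicted counts.
# System matrix, counts, and iteration count are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((32, 16))             # toy system matrix: 32 detector bins, 16 voxels
x_true = rng.random(16)
y = rng.poisson(A @ x_true * 100)    # simulated noisy projection counts

x = np.ones(16)                      # flat initial estimate
sensitivity = A.sum(axis=0)
for _ in range(50):                  # MLEM update: x <- x * A^T(y / Ax) / A^T 1
    projection = A @ x
    x *= (A.T @ (y / np.maximum(projection, 1e-12))) / sensitivity

scale = 100.0                        # y was generated from x_true * 100
print("relative error:", np.linalg.norm(x - x_true * scale) / np.linalg.norm(x_true * scale))
```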

  10. Semantic Web Requirements through Web Mining Techniques

    OpenAIRE

    Hassanzadeh, Hamed; Keyvanpour, Mohammad Reza

    2012-01-01

    In recent years, the Semantic Web has become a topic of active research in several fields of computer science and has been applied in a wide range of domains such as bioinformatics, life sciences, and knowledge management. The two fast-developing research areas, the Semantic Web and web mining, can complement each other and their different techniques can be used jointly or separately to solve the issues in both areas. In addition, since shifting from the current web to the Semantic Web mainly depends on the enhance...

  11. Genome Editing in Penicillium chrysogenum Using Cas9 Ribonucleoprotein Particles

    NARCIS (Netherlands)

    Pohl, Carsten; Mózsik, László; Driessen, Arnold J M; Bovenberg, Roel A L; Nygård, Yvonne I; Braman, Jeffrey Carl

    Several CRISPR/Cas9 tools have been recently established for precise genome editing in a wide range of filamentous fungi. This genome editing platform offers high flexibility in target selection and the possibility of introducing genetic deletions without the introduction of transgenic sequences.

  12. Overview of the TREC 2013 federated web search track

    OpenAIRE

    Demeester, Thomas; Trieschnigg, D; Nguyen, D; Hiemstra, D

    2013-01-01

    The TREC Federated Web Search track is intended to promote research related to federated search in a realistic web setting, and hereto provides a large data collection gathered from a series of online search engines. This overview paper discusses the results of the first edition of the track, FedWeb 2013. The focus was on basic challenges in federated search: (1) resource selection, and (2) results merging. After an overview of the provided data collection and the relevance judgments for the ...

  13. Assessing the quality of infertility resources on the World Wide Web: tools to guide clients through the maze of fact and fiction.

    Science.gov (United States)

    Okamura, Kyoko; Bernstein, Judith; Fidler, Anne T

    2002-01-01

    The Internet has become a major source of health information for women, but information placed on the World Wide Web does not routinely undergo a peer review process before dissemination. In this study, we present an analysis of 197 infertility-related Web sites for quality and accountability, using JAMA's minimal core standards for responsible print. Only 2% of the web sites analyzed met all four recommended standards, and 50.8% failed to report any of the four. Commercial web sites were more likely to fail to meet minimum standards (71.2%) than those with educational (46.8%) or supportive (29.8%) elements. Web sites with educational and informational components were most common (70.6%), followed by commercial sites (52.8%) and sites that offered a forum for infertility support and activism (28.9%). Internet resources available to infertile patients are at best variable. The current state of infertility-related materials on the World Wide Web offers unprecedented opportunities to improve services to a growing number of e-health users. Because of variations in quality of site content, women's health clinicians must assume responsibility for a new role as information monitor. This study provides assessment tools clinicians can apply and share with clients.

  14. Der Wandel in der Benutzung des World Wide Webs

    NARCIS (Netherlands)

    Weinreich, H.; Heinecke, A.; Obendorf, H.; Paul, H.; Mayer, M.; Herder, E.

    2006-01-01

    This contribution presents selected results from a long-term study of Web usage with 25 participants. A comparison with the results of the most recent comparable studies reveals a clear change in users' navigation behaviour. New offerings and services of the Web

  15. Community food webs data and theory

    CERN Document Server

    Cohen, Joel E; Newman, Charles M

    1990-01-01

    Food webs hold a central place in ecology. They describe which organisms feed on which others in natural habitats. This book describes recently discovered empirical regularities in real food webs: it proposes a novel theory unifying many of these regularities, as well as extensive empirical data. After a general introduction, reviewing the empirical and theoretical discoveries about food webs, the second portion of the book shows that community food webs obey several striking phenomenological regularities. Some of these unify, regardless of habitat. Others differentiate, showing that habitat significantly influences structure. The third portion of the book presents a theoretical analysis of some of the unifying empirical regularities. The fourth portion of the book presents 13 community food webs. Collected from scattered sources and carefully edited, they are the empirical basis for the results in the volume. The largest available set of data on community food webs provides a valuable foundation for future s...

  16. Treatment of Wide-Neck Bifurcation Aneurysm Using "WEB Device Waffle Cone Technique".

    Science.gov (United States)

    Mihalea, Cristian; Caroff, Jildaz; Rouchaud, Aymeric; Pescariu, Sorin; Moret, Jacques; Spelle, Laurent

    2018-05-01

    The endovascular treatment of wide-neck bifurcation aneurysms can be challenging and often requires the use of adjunctive techniques and devices. We report our first experience of using a waffle-cone technique adapted to the Woven Endoluminal Bridge (WEB) device in a large-neck basilar tip aneurysm, suitable in cases where the use of Y stenting or other techniques is limited due to anatomic restrictions. The procedure was complete, and angiographic occlusion of the aneurysm was achieved 24 hours post treatment, as confirmed by digital subtraction angiography. No complications occurred. The case reported here was not suitable for Y stenting or deployment of the WEB device alone, due to the small caliber of both posterior cerebral arteries and their origin at the neck level. The main advantage of this technique is that both devices have a controlled detachment system and are fully independent. To our knowledge, this technique has not been reported previously and this modality of treatment has never been described in the literature. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Editing Audio with Audacity

    Directory of Open Access Journals (Sweden)

    Brandon Walsh

    2016-08-01

    Full Text Available For those interested in audio, basic sound editing skills go a long way. Being able to handle and manipulate the materials can help you take control of your object of study: you can zoom in and extract particular moments to analyze, process the audio, and upload the materials to a server to complement a blog post on the topic. On a more practical level, these skills could also allow you to record and package recordings of yourself or others for distribution. That guest lecture taking place in your department? Record it and edit it yourself! Doing so is a lightweight way to distribute resources among various institutions, and it also helps make the materials more accessible for readers and listeners with a wide variety of learning needs. In this lesson you will learn how to use Audacity to load, record, edit, mix, and export audio files. Sound editing platforms are often expensive and offer extensive capabilities that can be overwhelming to the first-time user, but Audacity is a free and open source alternative that offers powerful capabilities for sound editing with a low barrier for entry. For this lesson we will work with two audio files: a recording of Bach's Goldberg Variations available from MusOpen and another recording of your own voice that will be made in the course of the lesson. This tutorial uses Audacity 2.1.2, released January 2016.

  18. Use of World Wide Web Server and Browser Software To Support a First-Year Medical Physiology Course.

    Science.gov (United States)

    Davis, Michael J.; And Others

    1997-01-01

    Describes the use of a World Wide Web server to support a team-taught physiology course for first-year medical students. The students' evaluations indicate that computer use in class made lecture material more interesting, while the online documents helped reinforce lecture materials and textbooks. Lists factors which contribute to the…

  19. Advancements in web-database applications for rabies surveillance

    Directory of Open Access Journals (Sweden)

    Bélanger Denise

    2011-08-01

    Full Text Available Abstract Background Protection of public health from rabies is informed by the analysis of surveillance data from human and animal populations. In Canada, public health, agricultural and wildlife agencies at the provincial and federal level are responsible for rabies disease control, and this has led to multiple agency-specific data repositories. Aggregation of agency-specific data into one database application would enable more comprehensive data analyses and effective communication among participating agencies. In Québec, RageDB was developed to house surveillance data for the raccoon rabies variant, representing the next generation in web-based database applications that provide a key resource for the protection of public health. Results RageDB incorporates data from, and grants access to, all agencies responsible for the surveillance of raccoon rabies in Québec. Technological advancements of RageDB to rabies surveillance databases include (1) automatic integration of multi-agency data and diagnostic results on a daily basis; (2) a web-based data editing interface that enables authorized users to add, edit and extract data; and (3) an interactive dashboard to help visualize data simply and efficiently, in table, chart, and cartographic formats. Furthermore, RageDB stores data from citizens who voluntarily report sightings of rabies suspect animals. We also discuss how sightings data can indicate public perception of the risk of raccoon rabies and thus aid in directing the allocation of disease control resources for protecting public health. Conclusions RageDB provides an example in the evolution of spatio-temporal database applications for the storage, analysis and communication of disease surveillance data. The database was fast and inexpensive to develop by using open-source technologies, simple and efficient design strategies, and shared web hosting. The database increases communication among agencies collaborating to protect human health from
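    The web-based editing interface in point (2) above can be pictured with a small sketch: an authorized user posts a new sighting record from the browser and it is stored centrally. Flask, SQLite, and the table layout are assumptions of this illustration; the abstract does not describe RageDB's actual stack.

```python
# Hedged sketch of a web-based editing endpoint of the kind point (2) describes.
# Flask, SQLite, and the schema are assumptions, not RageDB's real implementation.
import sqlite3
from flask import Flask, request

app = Flask(__name__)
DB = "rabies_surveillance.db"

with sqlite3.connect(DB) as conn:
    conn.execute("CREATE TABLE IF NOT EXISTS sightings (species TEXT, municipality TEXT, date TEXT)")

@app.route("/sightings", methods=["POST"])
def add_sighting():
    row = request.get_json()   # e.g. {"species": "raccoon", "municipality": "Laval", "date": "2011-06-01"}
    with sqlite3.connect(DB) as conn:
        conn.execute(
            "INSERT INTO sightings (species, municipality, date) VALUES (?, ?, ?)",
            (row["species"], row["municipality"], row["date"]),
        )
    return {"status": "saved"}, 201

if __name__ == "__main__":
    app.run()
```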

  20. Autonomous Satellite Command and Control through the World Wide Web: Phase 3

    Science.gov (United States)

    Cantwell, Brian; Twiggs, Robert

    1998-01-01

    NASA's New Millenium Program (NMP) has identified a variety of revolutionary technologies that will support orders of magnitude improvements in the capabilities of spacecraft missions. This program's Autonomy team has focused on science and engineering automation technologies. In doing so, it has established a clear development roadmap specifying the experiments and demonstrations required to mature these technologies. The primary developmental thrusts of this roadmap are in the areas of remote agents, PI/operator interface, planning/scheduling fault management, and smart execution architectures. Phases 1 and 2 of the ASSET Project (previously known as the WebSat project) have focused on establishing World Wide Web-based commanding and telemetry services as an advanced means of interfacing a spacecraft system with the PI and operators. Current automated capabilities include Web-based command submission, limited contact scheduling, command list generation and transfer to the ground station, spacecraft support for demonstrations experiments, data transfer from the ground station back to the ASSET system, data archiving, and Web-based telemetry distribution. Phase 2 was finished in December 1996. During January-December 1997 work was commenced on Phase 3 of the ASSET Project. Phase 3 is the subject of this report. This phase permitted SSDL and its project partners to expand the ASSET system in a variety of ways. These added capabilities included the advancement of ground station capabilities, the adaptation of spacecraft on-board software, and the expansion of capabilities of the ASSET management algorithms. Specific goals of Phase 3 were: (1) Extend Web-based goal-level commanding for both the payload PI and the spacecraft engineer; (2) Support prioritized handling of multiple PIs as well as associated payload experimenters; (3) Expand the number and types of experiments supported by the ASSET system and its associated spacecraft; (4) Implement more advanced resource

  1. Web Mining

    Science.gov (United States)

    Fürnkranz, Johannes

    The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to Web data and documents. This chapter provides a brief overview of web mining techniques and research areas, most notably hypertext classification, wrapper induction, recommender systems and web usage mining.

  2. Development of a New Web Portal for the Database on Demand Service

    CERN Document Server

    Altinigne, Can Yilmaz

    2017-01-01

    The Database on Demand service allows members of CERN communities to provision and manage database instances of different flavours (MySQL, Oracle, PostgreSQL and InfluxDB). Users can create and edit these instances using the web interface of DB On Demand. This web front end is currently built on Java technologies and the ZK web framework, for which it is generally difficult to find experienced developers and which has fallen behind more modern web stacks in capabilities and usability.

  3. Tokamaks (Second Edition)

    Energy Technology Data Exchange (ETDEWEB)

    Stott, Peter [JET, UK (United Kingdom)

    1998-10-01

    The first edition of John Wesson's book on tokamaks, published in 1987, established itself as essential reading for researchers in the field of magnetic confinement fusion: it was an excellent introduction for students to tokamak physics and also a valuable reference work for the more experienced. The second edition, published in 1997, has been completely rewritten and substantially enlarged (680 pages compared with 300). The new edition maintains the aim of providing a simple introduction to basic tokamak physics, but also includes discussion of the substantial advances in fusion research during the past decade. The new book, like its predecessor, is well written and commendable for its clarity and accuracy. In fact many of the chapters are written by a series of co-authors bringing the benefits of a wide range of expertise but, by careful editing, Wesson has maintained a uniformity of style and presentation. The chapter headings and coverage for the most part remain the same - but are expanded considerably and brought up to date. The most substantial change is that the single concluding chapter in the first edition on 'Experiments' has been replaced by three chapters: 'Tokamak experiments' which deals with some of the earlier key experiments plus a selection of recent small and medium-sized devices, 'Large experiments' which gives an excellent summary of the main results from the four large tokamaks - TFTR, JET, JT60/JT60U and DIII-D, and 'The future' which gives a very short (possibly too short in my opinion) account of reactors and ITER. This is an excellent book, which I strongly recommend should have a place - on the desk rather than in the bookshelf - of researchers in magnetic confinement fusion. (book review)

  4. Tokamaks (Second Edition)

    International Nuclear Information System (INIS)

    Stott, Peter

    1998-01-01

    The first edition of John Wesson's book on tokamaks, published in 1987, established itself as essential reading for researchers in the field of magnetic confinement fusion: it was an excellent introduction for students to tokamak physics and also a valuable reference work for the more experienced. The second edition, published in 1997, has been completely rewritten and substantially enlarged (680 pages compared with 300). The new edition maintains the aim of providing a simple introduction to basic tokamak physics, but also includes discussion of the substantial advances in fusion research during the past decade. The new book, like its predecessor, is well written and commendable for its clarity and accuracy. In fact many of the chapters are written by a series of co-authors bringing the benefits of a wide range of expertise but, by careful editing, Wesson has maintained a uniformity of style and presentation. The chapter headings and coverage for the most part remain the same - but are expanded considerably and brought up to date. The most substantial change is that the single concluding chapter in the first edition on 'Experiments' has been replaced by three chapters: 'Tokamak experiments' which deals with some of the earlier key experiments plus a selection of recent small and medium-sized devices, 'Large experiments' which gives an excellent summary of the main results from the four large tokamaks - TFTR, JET, JT60/JT60U and DIII-D, and 'The future' which gives a very short (possibly too short in my opinion) account of reactors and ITER. This is an excellent book, which I strongly recommend should have a place - on the desk rather than in the bookshelf - of researchers in magnetic confinement fusion. (book review)

  5. Taking risks on the world wide web: The impact of families and societies on adolescents' risky online behavior

    NARCIS (Netherlands)

    Notten, N.J.W.R.; Hof, S. van der; Berg, B. van den; Schermer, B.W.

    2014-01-01

    Children’s engagement in risky online behavior—such as providing personal information or agreeing to meet with a stranger—is an important predictor of whether they will encounter harmful content on the World Wide Web or be confronted with situations such as sexual harassment and privacy violations.

  6. A Subdivision-Based Representation for Vector Image Editing.

    Science.gov (United States)

    Liao, Zicheng; Hoppe, Hugues; Forsyth, David; Yu, Yizhou

    2012-11-01

    Vector graphics has been employed in a wide variety of applications due to its scalability and editability. Editability is a high priority for artists and designers who wish to produce vector-based graphical content with user interaction. In this paper, we introduce a new vector image representation based on piecewise smooth subdivision surfaces, which is a simple, unified and flexible framework that supports a variety of operations, including shape editing, color editing, image stylization, and vector image processing. These operations effectively create novel vector graphics by reusing and altering existing image vectorization results. Because image vectorization yields an abstraction of the original raster image, controlling the level of detail of this abstraction is highly desirable. To this end, we design a feature-oriented vector image pyramid that offers multiple levels of abstraction simultaneously. Our new vector image representation can be rasterized efficiently using GPU-accelerated subdivision. Experiments indicate that our vector image representation achieves high visual quality and better supports editing operations than existing representations.

  7. Java2 Enterprise Edition 14 (J2EE 14) Bible

    CERN Document Server

    McGovern, James; Fain, Yakov; Gordon, Jason; Henry, Ethan; Hurst, Walter; Jain, Ashish; Little, Mark; Nagarajan, Vaidyanathan; Oak, Harshad; Phillips, Lee Anne

    2011-01-01

    Java 2 Enterprise Edition (J2EE) is the specification that all enterprise Java developers need to build multi-tier applications, and also the basis for BEA's WebLogic Application Server and IBM's WebSphere. Revised to be current with the significant J2EE 1.4 update that will drive substantial developer interest. Written by a top-selling team of eleven experts who provide unique and substantial business examples in a vendor-neutral format, making the information applicable to various application servers. Covers patterns, J2EE application servers, frameworks, Ant, and continuous availability. Includes e

  8. Real-Time Payload Control and Monitoring on the World Wide Web

    Science.gov (United States)

    Sun, Charles; Windrem, May; Givens, John J. (Technical Monitor)

    1998-01-01

    World Wide Web (W3) technologies such as the Hypertext Transfer Protocol (HTTP) and the Java object-oriented programming environment offer a powerful, yet relatively inexpensive, framework for distributed application software development. This paper describes the design of a real-time payload control and monitoring system that was developed with W3 technologies at NASA Ames Research Center. Based on Java Development Toolkit (JDK) 1.1, the system uses an event-driven "publish and subscribe" approach to inter-process communication and graphical user-interface construction. A C Language Integrated Production System (CLIPS) compatible inference engine provides the back-end intelligent data processing capability, while Oracle Relational Database Management System (RDBMS) provides the data management function. Preliminary evaluation shows acceptable performance for some classes of payloads, with Java's portability and multimedia support identified as the most significant benefit.
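
    The event-driven "publish and subscribe" approach mentioned in this abstract decouples telemetry producers from user-interface consumers. The sketch below is only an illustration of that pattern in Python; the EventBus class, topic strings and telemetry payload are hypothetical stand-ins, not the JDK 1.1/CLIPS system the authors describe.

```python
# Minimal publish/subscribe event bus illustrating the pattern only;
# class and topic names are hypothetical, not the actual implementation.
from collections import defaultdict
from typing import Callable, Dict, List


class EventBus:
    """Routes named telemetry events to subscribed callbacks."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: dict) -> None:
        for callback in self._subscribers[topic]:
            callback(message)


if __name__ == "__main__":
    bus = EventBus()
    # A display widget subscribes to temperature telemetry...
    bus.subscribe("payload/temperature", lambda msg: print("UI update:", msg))
    # ...and the telemetry reader publishes decoded frames.
    bus.publish("payload/temperature", {"sensor": "A1", "celsius": 21.4})
```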

  9. Biomedical semantics in the Semantic Web.

    Science.gov (United States)

    Splendiani, Andrea; Burger, Albert; Paschke, Adrian; Romano, Paolo; Marshall, M Scott

    2011-03-07

    The Semantic Web offers an ideal platform for representing and linking biomedical information, which is a prerequisite for the development and application of analytical tools to address problems in data-intensive areas such as systems biology and translational medicine. As for any new paradigm, the adoption of the Semantic Web offers opportunities and poses questions and challenges to the life sciences scientific community: which technologies in the Semantic Web stack will be more beneficial for the life sciences? Is biomedical information too complex to benefit from simple interlinked representations? What are the implications of adopting a new paradigm for knowledge representation? What are the incentives for the adoption of the Semantic Web, and who are the facilitators? Is there going to be a Semantic Web revolution in the life sciences? We report here a few reflections on these questions, following discussions at the SWAT4LS (Semantic Web Applications and Tools for Life Sciences) workshop series, of which this Journal of Biomedical Semantics special issue presents selected papers from the 2009 edition, held in Amsterdam on November 20th.

  10. Technobabble: Photoshop 6 Converges Web, Print Photograph-Editing Capabilities.

    Science.gov (United States)

    Communication: Journalism Education Today, 2001

    2001-01-01

    Discusses the newly-released Adobe Photoshop 6, and its use in student publications. Notes its refined text-handling capabilities, a more user-friendly interface, integrated vector functions, easier preparation of Web images, and new and more powerful layer functions. (SR)

  11. Glue ear: how good is the information on the World Wide Web?

    Science.gov (United States)

    Ritchie, L; Tornari, C; Patel, P M; Lakhani, R

    2016-02-01

    This paper objectively evaluates current information available to the general public related to glue ear on the World Wide Web. The term 'glue ear' was typed into the 3 most frequently used internet search engines - Google, Bing and Yahoo - and the first 20 links were analysed. The first 400 words of each page were used to calculate the Flesch-Kincaid readability score. Each website was subsequently graded using the Discern instrument, which gauges quality and content of literature. The websites Webmd.boots.com, Bupa.co.uk and Patient.co.uk received the highest overall scores. These reflected top scores in either readability or Discern instrument assessment, but not both. Readability and Discern scores increased with the presence of a marketing or advertising incentive. The Patient.co.uk website had the highest Discern score and third highest readability score. There is huge variation in the quality of information available to patients on the internet. Some websites may be accessible to a wide range of reading ages but have poor quality content, and vice versa. Clinicians should be aware of indicators of quality, and use validated instruments to assess and recommend literature.
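
    The Flesch-Kincaid score used in studies like this one is computed from sentence, word and syllable counts over a text sample (here, the first 400 words of each page). The sketch below shows the standard grade-level formula; the vowel-group syllable counter is an assumed simplification, and published readability tools use more careful heuristics.

```python
import re


def count_syllables(word: str) -> int:
    # Crude vowel-group heuristic; real readability tools use better rules.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def flesch_kincaid_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula.
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59


sample = "Glue ear is fluid behind the eardrum. It is common in young children."
print(round(flesch_kincaid_grade(sample), 1))
```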

  12. Pilot using World Wide Web to prevent diabetes in adolescents.

    Science.gov (United States)

    Long, Joann D; Armstrong, Myrna L; Amos, Elizabeth; Shriver, Brent; Roman-Shriver, Carmen; Feng, Du; Harrison, Lanell; Luker, Scott; Nash, Anita; Blevins, Monica Witcher

    2006-02-01

    This pilot study tested the effects of an interactive nutrition education Web site on fruit, vegetable, and fat consumption in minority adolescents genetically at risk for Type 2 diabetes. A one-group nonexperimental pretest, posttest focus group design was used. Twenty-one sixth-grade to eighth-grade junior high adolescents who were minorities volunteered to participate. Participants received 5 hours of Web-based nutrition education over 3 weeks. A significant difference in fat consumption was supported from the computerized dietary assessment. No difference was found in fruit or vegetable consumption. Comparative data indicated a rise in body mass index (BMI) percentile from 88.03 (1999) to 88.40 (2002; boys) and 88.25 (1999) to 91.2 (2002; girls). Focus group responses supported the satisfaction of adolescents in the study with the use of the Web-based intervention for nutrition education. Healthy eating interventions using Web-based nutrition education should be further investigated with adolescents.

  13. Interactive fluka: a world wide web version for a simulation code in proton therapy

    International Nuclear Information System (INIS)

    Garelli, S.; Giordano, S.; Piemontese, G.; Squarcia, S.

    1998-01-01

    We considered the possibility of using the simulation code FLUKA in the framework of TERA. We provided a World Wide Web interface in which an interactive version of the code is available. The user can find installation instructions, an on-line FLUKA manual and interactive windows for inserting, in a very simple way, all the data required by the configuration running file. The choice of a database allows more versatile use for data verification and updating, recall of old simulations and comparison with selected examples. A completely new tool for geometry drawing under Java has also been developed. (authors)

  14. Web Search Engines

    OpenAIRE

    Rajashekar, TB

    1998-01-01

    The World Wide Web is emerging as an all-in-one information source. Tools for searching Web-based information include search engines, subject directories and meta search tools. We take a look at key features of these tools and suggest practical hints for effective Web searching.

  15. Semantic Web Primer

    NARCIS (Netherlands)

    Antoniou, Grigoris; Harmelen, Frank van

    2004-01-01

    The development of the Semantic Web, with machine-readable content, has the potential to revolutionize the World Wide Web and its use. A Semantic Web Primer provides an introduction and guide to this still emerging field, describing its key ideas, languages, and technologies. Suitable for use as a

  16. Educational Applications on the World Wide Web: An Example Using Amphion

    Science.gov (United States)

    Friedman, Jane

    1998-01-01

    There is a great deal of excitement about using the internet and the World Wide Web in education. There are such exciting possibilities and there is a wealth and variety of material up on the web. There are, however, many problems: problems of access and resources, problems of quality -- for every excellent resource there are many poor ones, and there are insufficiently explored problems of teacher training and motivation. For example, Wiesenmayer and Meadows report on a study of 347 West Virginia science teachers. These teachers were enrolled in a week-long summer workshop to introduce them to the internet and its educational potential. The teachers were asked to review science sites as to overall quality and then about their usefulness in their own classrooms. The teachers were enthusiastic about the web, and gave two-thirds of the sites high ratings, and essentially all the rest average ratings. But alarmingly, over 80% of these sites were viewed as having no direct applicability in the teacher's own classroom. This summer I was assigned to work on the Amphion project in the Automated Software Engineering Group under the leadership of Michael Lowry. I wished to find educational applications of the Amphion system, which in its current implementation can be used to create Fortran programs and animations using the SPICE libraries created by the NAIF group at JPL. I wished to find an application which provided real added educational value, which was in line with educational curriculum standards and which would serve a documented need of the educational community. The application selected was teaching about the causes of the seasons -- at approximately the fourth, fifth, sixth grade level. This topic was chosen because it is in line with national curriculum standards. The fourth, fifth, sixth grade level was selected to coincide with the grade level served by the Ames Aerospace Encounter, which services 10,000 children a year on field trips. The hope is that

  17. Cytological analysis of atypical squamous epithelial cells of undetermined significance using the world wide web.

    Science.gov (United States)

    Washiya, Kiyotada; Abe, Ichinosuke; Ambo, Junichi; Iwai, Muneo; Okusawa, Estuko; Asanuma, Kyousuke; Watanabe, Jun

    2011-01-01

    The low-level consistency of the cytodiagnosis of uterine cervical atypical squamous epithelial cells of undetermined significance (ASC-US) employing the Bethesda System has been reported, suggesting the necessity of a wide survey. We presented cases judged as ASC-US on the Web and analyzed the voting results to investigate ASC-US cytologically. Cytology samples from 129 patients diagnosed with ASC-US were used. Images of several atypical cells observed in these cases were presented on the Web. The study, based on the voting results, was presented and opinions were exchanged at the meeting of the Japanese Society of Clinical Cytology. The final diagnosis of ASC-US was benign lesions in 76 cases and low- and high-grade squamous intraepithelial lesions in 44, but no definite diagnosis could be made for the remaining 9. The total number of votes was 17,884 with a 36.5% consistency of cases judged as ASC-US. Benign cases were divided into 6 categories. Four categories not corresponding to the features of koilocytosis and small abnormal keratinized cells were judged as negative for an intraepithelial lesion or malignancy at a high rate. A Web-based survey would be useful which could be viewed at any time and thereby facilitate the sharing of cases to increase consistency. Copyright © 2011 S. Karger AG, Basel.

  18. TOGA COARE Satellite data summaries available on the World Wide Web

    Science.gov (United States)

    Chen, S. S.; Houze, R. A., Jr.; Mapes, B. E.; Brodzick, S. R.; Yutler, S. E.

    1995-01-01

    Satellite data summary images and analysis plots from the Tropical Ocean Global Atmosphere Coupled Ocean-Atmosphere Response Experiment (TOGA COARE), which were initially prepared in the field at the Honiara Operations Center, are now available on the Internet via World Wide Web browsers such as Mosaic. These satellite data summaries consist of products derived from the Japanese Geosynchronous Meteorological Satellite IR data: a time-size series of the distribution of contiguous cold cloudiness areas, weekly percent high cloudiness (PHC) maps, and a five-month time-longitudinal diagram illustrating the zonal motion of large areas of cold cloudiness. The weekly PHC maps are overlaid with weekly mean 850-hPa wind calculated from the European Centre for Medium-Range Weather Forecasts (ECMWF) global analysis field and can be viewed as an animation loop. These satellite summaries provide an overview of spatial and temporal variabilities of the cloud population and a large-scale context for studies concerning specific processes of various components of TOGA COARE.

  19. Fan edits and the legacy of The Phantom Edit

    Directory of Open Access Journals (Sweden)

    Joshua Wille

    2014-09-01

    Full Text Available A fan edit can generally be defined as an alternative version of a film or television text created by a fan. It offers a different viewing experience, much as a song remix offers a different listening experience. The contemporary wave of fan edits has emerged during the remix zeitgeist of digital media and at a time when digital video editing technology has become more affordable and popular. The increasing number of alternative versions of films and the works of revisionist Hollywood filmmakers such as George Lucas have contributed to a greater public understanding of cinema as a fluid medium instead of one that exists in a fixed form. The Phantom Edit (2000), a seminal fan edit based on Lucas's Star Wars Episode I: The Phantom Menace (1999), inspired new ranks of fan editors. However, critics have misunderstood fan edits as merely the work of disgruntled fans. In order to provide a critical and historical basis for studies in fan editing as a creative practice, I examine previous interpretations of fan edits in the context of relevant contemporary works, and I use an annotated chronology of The Phantom Edit to trace its influence on subsequent fan editing communities and uncover their relationship with intellectual property disputes.

  20. Towards a critical edition of Fibonacci’s Liber Abaci

    Directory of Open Access Journals (Sweden)

    Giuseppe Germano

    2013-11-01

    Full Text Available A research group working at the University of Naples Federico II aims to offer a modern, scientific and widely accessible edition of Fibonacci's treatise. By combining linguistic-philological, historical-mathematical and computational approaches, the group has pointed out the value of, and the need for, multidisciplinary research in order to make this edition adequately available to the scientific community.

  1. Country Nuclear Power Profiles - 2013 Edition

    International Nuclear Information System (INIS)

    2013-08-01

    The Country Nuclear Power Profiles compile background information on the status and development of nuclear power programmes in Member States. The CNPP summarizes organizational and industrial aspects of nuclear power programs and provides information about the relevant legislative, regulatory, and international framework in each country. Its descriptive and statistical overview of the overall economic, energy, and electricity situation in each country and its nuclear power framework is intended to serve as an integrated source of key background information about nuclear power programs in the world. This 2013 edition, issued on CD-ROM and Web pages, contains updated country information for 51 countries

  2. A brief history of the World Wide Web Where it as invented, how it's used, and where it's headed

    CERN Document Server

    Kyrnin, Jennifer

    2005-01-01

    The World Wide Web has its historical roots in things such as the creation of the telegraph, the launch of Sputnik and more, but it really all started in March 1989, when Tim Berners-Lee, a computer scientist at CERN in Geneva, wrote a paper called Information Management: A Proposal

  3. Effects of Learning Style and Training Method on Computer Attitude and Performance in World Wide Web Page Design Training.

    Science.gov (United States)

    Chou, Huey-Wen; Wang, Yu-Fang

    1999-01-01

    Compares the effects of two training methods on computer attitude and performance in a World Wide Web page design program in a field experiment with high school students in Taiwan. Discusses individual differences, Kolb's Experiential Learning Theory and Learning Style Inventory, Computer Attitude Scale, and results of statistical analyses.…

  4. Optimized gene editing technology for Drosophila melanogaster using germ line-specific Cas9.

    Science.gov (United States)

    Ren, Xingjie; Sun, Jin; Housden, Benjamin E; Hu, Yanhui; Roesel, Charles; Lin, Shuailiang; Liu, Lu-Ping; Yang, Zhihao; Mao, Decai; Sun, Lingzhu; Wu, Qujie; Ji, Jun-Yuan; Xi, Jianzhong; Mohr, Stephanie E; Xu, Jiang; Perrimon, Norbert; Ni, Jian-Quan

    2013-11-19

    The ability to engineer genomes in a specific, systematic, and cost-effective way is critical for functional genomic studies. Recent advances using the CRISPR-associated single-guide RNA system (Cas9/sgRNA) illustrate the potential of this simple system for genome engineering in a number of organisms. Here we report an effective and inexpensive method for genome DNA editing in Drosophila melanogaster whereby plasmid DNAs encoding short sgRNAs under the control of the U6b promoter are injected into transgenic flies in which Cas9 is specifically expressed in the germ line via the nanos promoter. We evaluate the off-targets associated with the method and establish a Web-based resource, along with a searchable, genome-wide database of predicted sgRNAs appropriate for genome engineering in flies. Finally, we discuss the advantages of our method in comparison with other recently published approaches.
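
    As an illustration only, the sketch below scans a DNA sequence for 20-nt protospacers followed by an NGG PAM, the kind of genome-wide search that underlies sgRNA design resources such as the one described here. It is a hypothetical simplification (plus strand only, no off-target scoring), not the authors' pipeline.

```python
import re


def find_sgrna_targets(seq: str, spacer_len: int = 20):
    """Return (position, spacer, PAM) for sites followed by an NGG PAM.

    Plus strand only; a real design tool also scans the reverse complement
    and scores every candidate for predicted off-target activity.
    """
    pattern = r"(?=([ACGT]{%d})([ACGT]GG))" % spacer_len
    return [(m.start(), m.group(1), m.group(2))
            for m in re.finditer(pattern, seq.upper())]


example = "TTGACCTGAATGGAAGCTTACCGGTAGCTAGCTAGGCTTACCGG"
for pos, spacer, pam in find_sgrna_targets(example):
    print(pos, spacer, pam)
```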

  5. Web-Based Media Contents Editor for UCC Websites

    Science.gov (United States)

    Kim, Seoksoo

    The purpose of this research is to "design web-based media contents editor for establishing UCC (User Created Contents)-based websites." The web-based editor features user-oriented interfaces and increased convenience, significantly different from previous off-line editors. It allows users to edit media contents online and can be effectively used for online promotion activities of enterprises and organizations. In addition to development of the editor, the research aims to support the entry of enterprises and public agencies to the online market by combining the technology with various UCC items.

  6. Letting go of the words writing web content that works

    CERN Document Server

    Redish, Janice (Ginny)

    2012-01-01

    Web site design and development continues to become more sophisticated; an important part of this maturity originates with well-laid-out and well-written content. Ginny Redish is a world-renowned expert on information design and how to produce clear writing in plain language for the web. All of the invaluable information that she shared in the first edition is included with numerous new examples. New information on content strategy for web sites, search engine optimization (SEO), and social media will enhance the book's content, making it once again the only book you need to own to o

  7. Don't make me think: a common sense approach to Web usability

    National Research Council Canada - National Science Library

    Krug, Steve

    2006-01-01

    .... The second edition of this classic adds three new chapters that explain why people really leave Web sites, how to make sites usable and accessible, and the art of surviving executive design whims...

  8. News from the Library: The 8th edition Karlsruhe nuclide chart has been released

    CERN Multimedia

    CERN Library

    2012-01-01

    The 8th edition of the Karlsruhe Nuclide Chart contains new data not found in the 7th edition.   Since 1958, the well-known Karlsruhe Nuclide Chart has provided scientists with structured, valuable information on the half-lives, decay modes and energies of radioactive nuclides. The chart is used in many disciplines in physics (health physics, radiation protection, nuclear and radiochemistry, astrophysics, etc.) but also in the life and earth sciences (biology, medicine, agriculture, geology, etc.). The 8th edition of the Karlsruhe Nuclide Chart contains new data on 737 nuclides not found in the 7th edition. In total, nuclear data on 3847 experimentally observed ground states and isomers are presented. A new web-based version of this chart is in the final stages of development for use within the Nucleonica Nuclear Science Portal - a portal for which CERN has an institutional license. The chart is also available in paper format.   If you want to buy a paper version of the chart, ple...

  9. INES: The International Nuclear and Radiological Event Scale User's Manual. 2008 Edition (Spanish Edition)

    International Nuclear Information System (INIS)

    2010-11-01

    INES, the International Nuclear and Radiological Event Scale, was developed in 1990 by experts convened by the IAEA and the OECD Nuclear Energy Agency with the aim of communicating the safety significance of events. This edition of the INES User's Manual is designed to facilitate the task of those who are required to rate the safety significance of events using the scale. It includes additional guidance and clarifications, and provides examples and comments on the continued use of INES. With this new edition, it is anticipated that INES will be widely used by Member States and become the worldwide scale for putting into proper perspective the safety significance of any event associated with the transport, storage and use of radioactive material and radiation sources, whether or not the event occurs at a facility.

  10. INES: The International Nuclear and Radiological Event Scale User's Manual. 2008 Edition (Chinese Edition)

    International Nuclear Information System (INIS)

    2012-01-01

    INES, the International Nuclear and Radiological Event Scale, was developed in 1990 by experts convened by the IAEA and the OECD Nuclear Energy Agency with the aim of communicating the safety significance of events. This edition of the INES User's Manual is designed to facilitate the task of those who are required to rate the safety significance of events using the scale. It includes additional guidance and clarifications, and provides examples and comments on the continued use of INES. With this new edition, it is anticipated that INES will be widely used by Member States and become the worldwide scale for putting into proper perspective the safety significance of any event associated with the transport, storage and use of radioactive material and radiation sources, whether or not the event occurs at a facility.

  11. DEVELOPMENT OF A WEB-BASED PROXIMITY BASED MEDIA SHARING APPLICATION

    OpenAIRE

    Erol Ozan

    2016-01-01

    This article reports the development of Vissou, which is a location based web application that enables media recording and sharing among users who are in close proximity to each other. The application facilitates the automated hand-over of the recorded media files from one user to another. There are many social networking applications and web sites that provide digital media sharing and editing functionalities. What differentiates Vissou from other similar applications is the functions and us...

  12. Implementation of a World Wide Web server for the oil and gas industry

    International Nuclear Information System (INIS)

    Blaylock, R.E.; Martin, F.D.; Emery, R.

    1996-01-01

    The Gas and Oil Technology Exchange and Communication Highway (GO-TECH) provides an electronic information system for the petroleum community for exchanging ideas, data, and technology. The PC-based system fosters communication and discussion by linking the oil and gas producers with resource centers, government agencies, consulting firms, service companies, national laboratories, academic research groups, and universities throughout the world. The oil and gas producers can access the GO-TECH World Wide Web (WWW) home page through modem links, as well as through the Internet. Future GO-TECH applications will include the establishment of virtual corporations consisting of consortia of small companies, consultants, and service companies linked by electronic information systems. These virtual corporations will have the resources and expertise previously found only in major corporations

  13. Web wisdom how to evaluate and create information quality on the Web

    CERN Document Server

    Alexander, Janet E

    1999-01-01

    Web Wisdom is an essential reference for anyone needing to evaluate or establish information quality on the World Wide Web. The book includes easy to use checklists for step-by-step quality evaluations of virtually any Web page. The checklists can also be used by Web authors to help them ensure quality information on their pages. In addition, Web Wisdom addresses other important issues, such as understanding the ways that advertising and sponsorship may affect the quality of Web information. It features: * a detailed discussion of the items involved in evaluating Web information; * checklists

  14. Presentation of klystron history and statistics by World-Wide Web

    International Nuclear Information System (INIS)

    Kamikubota, N.; Furukawa, K.

    2000-01-01

    A web-based system for browsing klystron histories and statistics has been developed for the KEKB e-/e+ linac. This system enables linac staff to investigate various klystron histories, such as recent trends of ES (down frequency/reflection/high voltage), from any convenient PC/Mac/console where a web browser is available. This system started in January 2000, and has now become an indispensable tool for the linac staff. (author)

  15. Students' Perceptions of the Effectiveness of the World Wide Web as a Research and Teaching Tool in Science Learning.

    Science.gov (United States)

    Ng, Wan; Gunstone, Richard

    2002-01-01

    Investigates the use of the World Wide Web (WWW) as a research and teaching tool in promoting self-directed learning groups of 15-year-old students. Discusses the perceptions of students of the effectiveness of the WWW in assisting them with the construction of knowledge on photosynthesis and respiration. (Contains 33 references.) (Author/YDS)

  16. "Così abbiamo creato il World Wide Web"

    CERN Multimedia

    Sigiani, GianLuca

    2002-01-01

    Meeting with Robert Cailliau, scientist and pioneer of the web, who, in a book, tells how his team at CERN in Geneva transformed the Internet (an instrument originally used for military purposes) into one of the most revolutionary mass media tools ever (1 page)

  17. An Exploratory Survey of Digital Libraries on the World Wide Web: Art and Literature of the Early Italian Renaissance.

    Science.gov (United States)

    McKibben, Suzanne J.

    This study assessed the ongoing development of digital libraries (DLs) on the World Wide Web. DLs of art and literature were surveyed for selected works from the early Italian Renaissance in order to gain insight into the current trends prevalent throughout the larger population of DLs. The following artists and authors were selected for study:…

  18. Integrating Streaming Media to Web-based Learning: A Modular Approach.

    Science.gov (United States)

    Miltenoff, Plamen

    2000-01-01

    Explains streaming technology and discusses how to integrate it into Web-based instruction based on experiences at St. Cloud State University (Minnesota). Topics include a modular approach, including editing, copyright concerns, digitizing, maintenance, and continuing education needs; the role of the library; and how streaming can enhance…

  19. Testing Quantum Models of Conjunction Fallacy on the World Wide Web

    Science.gov (United States)

    Aerts, Diederik; Arguëlles, Jonito Aerts; Beltran, Lester; Beltran, Lyneth; de Bianchi, Massimiliano Sassoli; Sozzo, Sandro; Veloz, Tomas

    2017-12-01

    The `conjunction fallacy' has been extensively debated by scholars in cognitive science and, in recent times, the discussion has been enriched by the proposal of modeling the fallacy using the quantum formalism. Two major quantum approaches have been put forward: the first assumes that respondents use a two-step sequential reasoning and that the fallacy results from the presence of `question order effects'; the second assumes that respondents evaluate the cognitive situation as a whole and that the fallacy results from the `emergence of new meanings', as an `effect of overextension' in the conceptual conjunction. Thus, the question arises as to determine whether and to what extent conjunction fallacies would result from `order effects' or, instead, from `emergence effects'. To help clarify this situation, we propose to use the World Wide Web as an `information space' that can be interrogated both in a sequential and non-sequential way, to test these two quantum approaches. We find that `emergence effects', and not `order effects', should be considered the main cognitive mechanism producing the observed conjunction fallacies.
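
    In classical probability the conjunction can never be more probable than either conjunct, so a conjunction fallacy shows up as a violation of that bound. The sketch below illustrates that check on hypothetical co-occurrence counts; it is not the corpus-querying protocol used in the paper.

```python
def conjunction_check(count_a: int, count_b: int, count_ab: int, total: int):
    """Classically P(A and B) <= min(P(A), P(B)); report whether the
    estimated relative frequencies violate that bound ('overextension')."""
    p_a, p_b, p_ab = count_a / total, count_b / total, count_ab / total
    return {"P(A)": p_a, "P(B)": p_b, "P(A and B)": p_ab,
            "fallacy": p_ab > min(p_a, p_b)}


# Hypothetical document counts for concept A, concept B and the conjunction.
print(conjunction_check(count_a=120_000, count_b=450_000,
                        count_ab=130_000, total=10_000_000))
```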

  20. The Art of Electronics - 2nd Edition

    Science.gov (United States)

    Horowitz, Paul; Hill, Winfield

    1989-09-01

    This is the thoroughly revised and updated second edition of the hugely successful The Art of Electronics. Widely accepted as the single authoritative text and reference on electronic circuit design, both analog and digital, the original edition sold over 125,000 copies worldwide and was translated into eight languages. The book revolutionized the teaching of electronics by emphasizing the methods actually used by circuit designers - a combination of some basic laws, rules of thumb, and a largely nonmathematical treatment that encourages simplified estimation of circuit values and performance. The new Art of Electronics retains the feeling of informality and easy access that helped make the first edition so successful and popular. It is an ideal first textbook on electronics for scientists and engineers and an indispensable reference for anyone, professional or amateur, who works with electronic circuits. The best self-teaching book and reference book in electronics. Simply indispensable, packed with essential information for all scientists and engineers who build electronic circuits. Totally rewritten chapters on microcomputers and microprocessors. The first edition of this book sold over 100,000 copies in seven years; it has a market in virtually all research centres where electronics is important.

  1. A World Wide Web-based antimicrobial stewardship program improves efficiency, communication, and user satisfaction and reduces cost in a tertiary care pediatric medical center.

    Science.gov (United States)

    Agwu, Allison L; Lee, Carlton K K; Jain, Sanjay K; Murray, Kara L; Topolski, Jason; Miller, Robert E; Townsend, Timothy; Lehmann, Christoph U

    2008-09-15

    Antimicrobial stewardship programs aim to reduce inappropriate hospital antimicrobial use. At the Johns Hopkins Children's Medical and Surgical Center (Baltimore, MD), we implemented a World Wide Web-based antimicrobial restriction program to address problems with the existing restriction program. A user survey identified opportunities for improvement of an existing antimicrobial restriction program and resulted in subsequent design, implementation, and evaluation of a World Wide Web-based antimicrobial restriction program at a 175-bed, tertiary care pediatric teaching hospital. The program provided automated clinical decision support, facilitated approval, and enhanced real-time communication among prescribers, pharmacists, and pediatric infectious diseases fellows. Approval status, duration, and rationale; missing request notifications; and expiring approvals were stored in a database that is accessible via a secure Intranet site. Before and after implementation of the program, user satisfaction, reports of missed and/or delayed doses, antimicrobial dispensing times, and cost were evaluated. After implementation of the program, there was a $370,069 reduction in projected annual cost associated with restricted antimicrobial use and an 11.6% reduction in the number of dispensed doses. User satisfaction increased from 22% to 68% and from 13% to 69% among prescribers and pharmacists, respectively. There were 21% and 32% reductions in the number of prescriber reports of missed and delayed doses, respectively, and there was a 37% reduction in the number of pharmacist reports of delayed approvals; measured dispensing times were unchanged (P = .24). In addition, 40% fewer restricted antimicrobial-related phone calls were noted by the pharmacy. The World Wide Web-based antimicrobial approval program led to improved communication, more-efficient antimicrobial administration, increased user satisfaction, and significant cost savings. Integrated tools, such as this World

  2. Architecture and the Web.

    Science.gov (United States)

    Money, William H.

    Instructors should be concerned with how to incorporate the World Wide Web into an information systems (IS) curriculum organized across three areas of knowledge: information technology, organizational and management concepts, and theory and development of systems. The Web fits broadly into the information technology component. For the Web to be…

  3. Abundant off-target edits from site-directed RNA editing can be reduced by nuclear localization of the editing enzyme.

    Science.gov (United States)

    Vallecillo-Viejo, Isabel C; Liscovitch-Brauer, Noa; Montiel-Gonzalez, Maria Fernanda; Eisenberg, Eli; Rosenthal, Joshua J C

    2018-01-02

    Site-directed RNA editing (SDRE) is a general strategy for making targeted base changes in RNA molecules. Although the approach is relatively new, several groups, including our own, have been working on its development. The basic strategy has been to couple the catalytic domain of an adenosine (A) to inosine (I) RNA editing enzyme to a guide RNA that is used for targeting. Although highly efficient on-target editing has been reported, off-target events have not been rigorously quantified. In this report we target premature termination codons (PTCs) in messages encoding both a fluorescent reporter protein and the Cystic Fibrosis Transmembrane Conductance Regulator (CFTR) protein transiently transfected into human epithelial cells. We demonstrate that while on-target editing is efficient, off-target editing is extensive, both within the targeted message and across the entire transcriptome of the transfected cells. By redirecting the editing enzymes from the cytoplasm to the nucleus, off-target editing is reduced without compromising the on-target editing efficiency. The addition of the E488Q mutation to the editing enzymes, a common strategy for increasing on-target editing efficiency, causes a tremendous increase in off-target editing. These results underscore the need to reduce promiscuity in current approaches to SDRE.

  4. Medical knowledge packages and their integration into health-care information systems and the World Wide Web.

    Science.gov (United States)

    Adlassnig, Klaus-Peter; Rappelsberger, Andrea

    2008-01-01

    Software-based medical knowledge packages (MKPs) are packages of highly structured medical knowledge that can be integrated into various health-care information systems or the World Wide Web. They have been established to provide different forms of clinical decision support such as textual interpretation of combinations of laboratory test results, generating diagnostic hypotheses as well as confirmed and excluded diagnoses to support differential diagnosis in internal medicine, or for early identification and automatic monitoring of hospital-acquired infections. Technically, an MKP may consist of a number of interconnected Arden Medical Logic Modules. Several MKPs have been integrated thus far into hospital, laboratory, and departmental information systems. This has resulted in useful and widely accepted software-based clinical decision support for the benefit of the patient, the physician, and the organization funding the health care system.

  5. WEB STRUCTURE MINING

    Directory of Open Access Journals (Sweden)

    CLAUDIA ELENA DINUCĂ

    2011-01-01

    Full Text Available The World Wide Web has become one of the most valuable resources for information retrieval and knowledge discovery due to the continual increase in the amount of data available online. Given the web's dimensions, users easily get lost in its rich hyperstructure. Applying data mining methods is the right solution for knowledge discovery on the Web. The knowledge extracted from the Web can be used to improve the performance of Web information retrieval, question answering and Web-based data warehousing. In this paper, I provide an introduction to the categories of Web mining and focus on one of them: Web structure mining. Web structure mining, one of the three categories of Web mining, is a tool used to identify the relationships between Web pages linked by information or by direct link connections. It offers information about how different pages are linked together to form this huge web. Web structure mining finds hidden basic structures and exploits hyperlinks for further web applications such as web search.
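
    A first step in Web structure mining is recovering the hyperlink graph itself. The sketch below builds a tiny directed link graph from raw HTML using only the Python standard library; the URLs and pages are hypothetical, and a real crawler would also normalize and filter links.

```python
from collections import defaultdict
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def add_page(graph, url, html):
    """Add one page's outgoing links to the directed link graph."""
    parser = LinkExtractor()
    parser.feed(html)
    graph[url].extend(parser.links)


graph = defaultdict(list)
add_page(graph, "http://example.org/a",
         '<p>See <a href="http://example.org/b">B</a> and '
         '<a href="http://example.org/c">C</a>.</p>')
print(dict(graph))
```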

  6. Spectral properties of the Google matrix of the World Wide Web and other directed networks.

    Science.gov (United States)

    Georgeot, Bertrand; Giraud, Olivier; Shepelyansky, Dima L

    2010-05-01

    We study numerically the spectrum and eigenstate properties of the Google matrix of various examples of directed networks such as vocabulary networks of dictionaries and university World Wide Web networks. The spectra have gapless structure in the vicinity of the maximal eigenvalue for Google damping parameter α equal to unity. The vocabulary networks have relatively homogeneous spectral density, while university networks have pronounced spectral structures which change from one university to another, reflecting specific properties of the networks. We also determine specific properties of eigenstates of the Google matrix, including the PageRank. The fidelity of the PageRank is proposed as a characterization of its stability.
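
    The Google matrix studied here has the standard form G = alpha * S + (1 - alpha) / N, where S is the column-stochastic link matrix (dangling pages replaced by uniform columns) and alpha is the damping parameter. Below is a small NumPy sketch of that construction and the resulting PageRank vector, applied to a toy three-page graph rather than the networks analysed in the paper.

```python
import numpy as np


def google_matrix(adjacency: np.ndarray, alpha: float = 0.85) -> np.ndarray:
    """G = alpha * S + (1 - alpha) / N, with S the column-stochastic link
    matrix (columns of dangling pages replaced by uniform columns)."""
    n = adjacency.shape[0]
    S = adjacency.astype(float).T.copy()      # column j holds links out of page j
    for j in range(n):
        out = S[:, j].sum()
        S[:, j] = S[:, j] / out if out > 0 else 1.0 / n
    return alpha * S + (1.0 - alpha) / n


def pagerank(G: np.ndarray, tol: float = 1e-12) -> np.ndarray:
    """Power iteration on the column-stochastic Google matrix."""
    p = np.full(G.shape[0], 1.0 / G.shape[0])
    while True:
        p_next = G @ p
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next


# Toy 3-page graph: page 0 links to 1 and 2, page 1 to 2, page 2 to 0.
A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]])
print(pagerank(google_matrix(A, alpha=0.85)).round(3))
```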

  7. Historical Quantitative Reasoning on the Web

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Ashkpour, A.

    2016-01-01

    The Semantic Web is an extension of the Web through standards by the World Wide Web Consortium (W3C) [4]. These standards promote common data formats and exchange protocols on the Web, most fundamentally the Resource Description Framework (RDF). Its ultimate goal is to make the Web a suitable data

  8. Building interactive simulations in a Web page design program.

    Science.gov (United States)

    Kootsey, J Mailen; Siriphongs, Daniel; McAuley, Grant

    2004-01-01

    A new Web software architecture, NumberLinX (NLX), has been integrated into a commercial Web design program to produce a drag-and-drop environment for building interactive simulations. NLX is a library of reusable objects written in Java, including input, output, calculation, and control objects. The NLX objects were added to the palette of available objects in the Web design program to be selected and dropped on a page. Inserting an object in a Web page is accomplished by adding a template block of HTML code to the page file. HTML parameters in the block must be set to user-supplied values, so the HTML code is generated dynamically, based on user entries in a popup form. Implementing the object inspector for each object permits the user to edit object attributes in a form window. Except for model definition, the combination of the NLX architecture and the Web design program permits construction of interactive simulation pages without writing or inspecting code.
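
    The abstract describes inserting an object by adding a template block of HTML whose parameters are filled from a popup form. A hedged sketch of that kind of dynamic generation follows; the tag names, parameter names and the nlx.SliderInput class are hypothetical stand-ins, not the actual NLX object library.

```python
from string import Template

# Hypothetical template block for a "slider input" simulation object;
# the real NLX object library defines its own tags and parameters.
SLIDER_BLOCK = Template(
    '<applet code="$applet_class" width="$width" height="$height">\n'
    '  <param name="variable" value="$variable">\n'
    '  <param name="min" value="$min_value">\n'
    '  <param name="max" value="$max_value">\n'
    "</applet>"
)


def render_object(form_values: dict) -> str:
    """Fill the template block with values collected from a popup form."""
    return SLIDER_BLOCK.substitute(form_values)


print(render_object({
    "applet_class": "nlx.SliderInput",   # hypothetical class name
    "width": 300, "height": 40,
    "variable": "heart_rate", "min_value": 40, "max_value": 180,
}))
```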

  9. SED-ML web tools: generate, modify and export standard-compliant simulation studies.

    Science.gov (United States)

    Bergmann, Frank T; Nickerson, David; Waltemath, Dagmar; Scharm, Martin

    2017-04-15

    The Simulation Experiment Description Markup Language (SED-ML) is a standardized format for exchanging simulation studies independently of software tools. We present the SED-ML Web Tools, an online application for creating, editing, simulating and validating SED-ML documents. The Web Tools implement all current SED-ML specifications and, thus, support complex modifications and co-simulation of models in SBML and CellML formats. Ultimately, the Web Tools lower the bar on working with SED-ML documents and help users create valid simulation descriptions. http://sysbioapps.dyndns.org/SED-ML_Web_Tools/ . fbergman@caltech.edu . © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  10. Stochastic analysis of web page ranking

    NARCIS (Netherlands)

    Volkovich, Y.

    2009-01-01

    Today, the study of the World Wide Web is one of the most challenging subjects. In this work we consider the Web from a probabilistic point of view. We analyze the relations between various characteristics of the Web. In particular, we are interested in the Web properties that affect the Web page

  11. Personalizing Web Search based on User Profile

    OpenAIRE

    Utage, Sharyu; Ahire, Vijaya

    2016-01-01

    Web search engines are the most widely used tools for information retrieval from the World Wide Web. These search engines help users find the most useful information. When different users search for the same information, the search engine provides the same results without understanding who submitted the query. Personalized web search is a search technique for providing more useful results. This paper models the preferences of users as hierarchical user profiles. A framework called UPS is proposed. It generalizes profiles and m...

  12. RNA Editing During Sexual Development Occurs in Distantly Related Filamentous Ascomycetes.

    Science.gov (United States)

    Teichert, Ines; Dahlmann, Tim A; Kück, Ulrich; Nowrousian, Minou

    2017-04-01

    RNA editing is a post-transcriptional process that modifies RNA molecules leading to transcript sequences that differ from their template DNA. A-to-I editing was found to be widely distributed in nuclear transcripts of metazoa, but was detected in fungi only recently in a study of the filamentous ascomycete Fusarium graminearum that revealed extensive A-to-I editing of mRNAs in sexual structures (fruiting bodies). Here, we searched for putative RNA editing events in RNA-seq data from Sordaria macrospora and Pyronema confluens, two distantly related filamentous ascomycetes, and in data from the Taphrinomycete Schizosaccharomyces pombe. Like F. graminearum, S. macrospora is a member of the Sordariomycetes, whereas P. confluens belongs to the early-diverging group of Pezizomycetes. We found extensive A-to-I editing in RNA-seq data from sexual mycelium from both filamentous ascomycetes, but not in vegetative structures. A-to-I editing was not detected in different stages of meiosis of S. pombe. A comparison of A-to-I editing in S. macrospora with F. graminearum and P. confluens, respectively, revealed little conservation of individual editing sites. An analysis of RNA-seq data from two sterile developmental mutants of S. macrospora showed that A-to-I editing is strongly reduced in these strains. Sequencing of cDNA fragments containing more than one editing site from P. confluens showed that at the beginning of sexual development, transcripts were incompletely edited or unedited, whereas in later stages transcripts were more extensively edited. Taken together, these data suggest that A-to-I RNA editing is an evolutionary conserved feature during fruiting body development in filamentous ascomycetes. © The Author(s) 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
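
    A-to-I edits appear in RNA-seq data as A-to-G mismatches against the genomic reference. The toy sketch below counts such mismatches per position from pre-aligned read strings; it is a deliberately simplified illustration (real pipelines start from BAM alignments and filter SNPs, sequencing errors and strand artefacts), not the workflow used in this study.

```python
from collections import Counter


def candidate_editing_sites(reference, aligned_reads, min_edited_reads=3):
    """Report positions where a reference A is read as G (A-to-I signature).

    Toy model: reads are pre-aligned strings spanning the full reference,
    with '.' marking uncovered bases.
    """
    sites = []
    for pos, ref_base in enumerate(reference):
        if ref_base != "A":
            continue
        calls = Counter(read[pos] for read in aligned_reads
                        if pos < len(read) and read[pos] != ".")
        if calls.get("G", 0) >= min_edited_reads:
            sites.append((pos, calls["G"] / sum(calls.values())))
    return sites


ref = "GGAATCAAGT"
reads = ["GGAGTCAAGT", "GGAGTCAAGT", "GGAGTCGAGT", "GGAATCAAGT"]
print(candidate_editing_sites(ref, reads))   # [(3, 0.75)]
```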

  13. Residential and Light Commercial HVAC. Teacher Edition and Student Edition. Second Edition.

    Science.gov (United States)

    Stephenson, David

    This package contains teacher and student editions of a residential and light commercial heating, ventilation, and air conditioning (HVAC) course of study. The teacher edition contains information on the following: using the publication; national competencies; competency profile; related academic and workplace skills list; tools, equipment, and…

  14. Regulation of Gene Expression by DNA Methylation and RNA Editing in Animals

    DEFF Research Database (Denmark)

    Li, Qiye

    , there has been growing interest in exploring the modifications occurring at the RNA level, which can impact the fate and function of mRNA. One fascinating type of such modifications is RNA editing, which alters specific nucleotides in transcribed RNA and thus can produce transcripts that are not encoded...... (Heterocephalus glaber), a eusocial mammal living in cooperative colonies. Finally, I introduce a software package that I developed that is specifically designed for the genome-wide identification of RNA-editing sites in animals, with the ultimate aim of promoting the evolutionary and functional study of RNA...... editing in different species....

  15. jORCA: easily integrating bioinformatics Web Services.

    Science.gov (United States)

    Martín-Requena, Victoria; Ríos, Javier; García, Maximiliano; Ramírez, Sergio; Trelles, Oswaldo

    2010-02-15

    Web services technology is becoming the option of choice to deploy bioinformatics tools that are universally available. One of the major strengths of this approach is that it supports machine-to-machine interoperability over a network. However, a weakness of this approach is that various Web Services differ in their definition and invocation protocols, as well as their communication and data formats - and this presents a barrier to service interoperability. jORCA is a desktop client aimed at facilitating seamless integration of Web Services. It does so by making a uniform representation of the different web resources, supporting scalable service discovery, and automatic composition of workflows. Usability is at the top of the jORCA agenda; thus it is a highly customizable and extensible application that accommodates a broad range of user skills, featuring double-click invocation of services in conjunction with advanced execution control, on-the-fly data standardization, extensibility of viewer plug-ins, drag-and-drop editing capabilities, plus a file-based browsing style and organization of favourite tools. The integration of bioinformatics Web Services is made easier to support a wider range of users.

  16. Promoting Your Web Site.

    Science.gov (United States)

    Raeder, Aggi

    1997-01-01

    Discussion of ways to promote sites on the World Wide Web focuses on how search engines work and how they retrieve and identify sites. Appropriate Web links for submitting new sites and for Internet marketing are included. (LRW)

  17. Using the World Wide Web to Connect Research and Professional Practice: Towards Evidence-Based Practice

    Directory of Open Access Journals (Sweden)

    Daniel L. Moody

    2003-01-01

    Full Text Available In most professional (applied) disciplines, research findings take a long time to filter into practice, if they ever do at all. The result of this is under-utilisation of research results and sub-optimal practices. There are a number of reasons for the lack of knowledge transfer. On the "demand side", people working in professional practice have little time available to keep up with the latest research in their field. In addition, the volume of research published each year means that the average practitioner would not have time to read all the research articles in their area of interest even if they devoted all their time to it. From the "supply side", academic research is primarily focused on the production rather than distribution of knowledge. While researchers have highly developed mechanisms for transferring knowledge among themselves, there is little investment in the distribution of research results beyond research communities. The World Wide Web provides a potential solution to this problem, as it provides a global information infrastructure for connecting those who produce knowledge (researchers) and those who need to apply this knowledge (practitioners). This paper describes two projects which use the World Wide Web to make research results directly available to support decision making in the workplace. The first is a successful knowledge management project in a health department which provides medical staff with on-line access to the latest medical research at the point of care. The second is a project currently in progress to implement a similar system to support decision making in IS practice. Finally, we draw some general lessons about how to improve the transfer of knowledge from research to practice, which could be applied in any discipline.

  18. AstroWeb -- Internet Resources for Astronomers

    Science.gov (United States)

    Jackson, R. E.; Adorf, H.-M.; Egret, D.; Heck, A.; Koekemoer, A.; Murtagh, F.; Wells, D. C.

    AstroWeb is a World Wide Web (WWW) interface to a collection of Internet accessible resources aimed at the astronomical community. The collection currently contains more than 1000 WWW, Gopher, Wide Area Information System (WAIS), Telnet, and Anonymous FTP resources, and it is still growing. AstroWeb provides the following value-added services: categorization of each resource; descriptive paragraphs for some resources; a searchable index of all resource information; and a thrice-daily search for "dead" or "unreliable" resources.

  19. Obtaining Streamflow Statistics for Massachusetts Streams on the World Wide Web

    Science.gov (United States)

    Ries, Kernell G.; Steeves, Peter A.; Freeman, Aleda; Singh, Raj

    2000-01-01

    A World Wide Web application has been developed to make it easy to obtain streamflow statistics for user-selected locations on Massachusetts streams. The Web application, named STREAMSTATS (available at http://water.usgs.gov/osw/streamstats/massachusetts.html ), can provide peak-flow frequency, low-flow frequency, and flow-duration statistics for most streams in Massachusetts. These statistics describe the magnitude (how much), frequency (how often), and duration (how long) of flow in a stream. The U.S. Geological Survey (USGS) has published streamflow statistics, such as the 100-year peak flow, the 7-day, 10-year low flow, and flow-duration statistics, for its data-collection stations in numerous reports. Federal, State, and local agencies need these statistics to plan and manage use of water resources and to regulate activities in and around streams. Engineering and environmental consulting firms, utilities, industry, and others use the statistics to design and operate water-supply systems, hydropower facilities, industrial facilities, wastewater treatment facilities, and roads, bridges, and other structures. Until now, streamflow statistics for data-collection stations have often been difficult to obtain because they are scattered among many reports, some of which are not readily available to the public. In addition, streamflow statistics are often needed for locations where no data are available. STREAMSTATS helps solve these problems. STREAMSTATS was developed jointly by the USGS and MassGIS, the State Geographic Information Systems (GIS) agency, in cooperation with the Massachusetts Departments of Environmental Management and Environmental Protection. The application consists of three major components: (1) a user interface that displays maps and allows users to select stream locations for which they want streamflow statistics (fig. 1), (2) a data base of previously published streamflow statistics and descriptive information for 725 USGS data

  20. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    Science.gov (United States)

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases, and clustered regularly interspaced short palindromic repeat/CRISPR-associated nucleases) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it also describes tools that have been developed to analyse post-genome editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. The Semantic Web in Teacher Education

    Science.gov (United States)

    Czerkawski, Betül Özkan

    2014-01-01

    The Semantic Web enables increased collaboration among computers and people by organizing unstructured data on the World Wide Web. Rather than a separate body, the Semantic Web is a functional extension of the current Web made possible by defining relationships among websites and other online content. When explicitly defined, these relationships…

  2. Dynamic Interactive Educational Diabetes Simulations Using the World Wide Web: An Experience of More Than 15 Years with AIDA Online.

    Science.gov (United States)

    Lehmann, Eldon D; Dewolf, Dennis K; Novotny, Christopher A; Reed, Karen; Gotwals, Robert R

    2014-01-01

    Background. AIDA is a widely available downloadable educational simulator of glucose-insulin interaction in diabetes. Methods. A web-based version of AIDA was developed that utilises a server-based architecture with HTML FORM commands to submit numerical data from a web-browser client to a remote web server. AIDA online, located on a remote server, passes the received data through Perl scripts which interactively produce 24 hr insulin and glucose simulations. Results. AIDA online allows users to modify the insulin regimen and diet of 40 different prestored "virtual diabetic patients" on the internet or create new "patients" with user-generated regimens. Multiple simulations can be run, with graphical results viewed via a standard web-browser window. To date, over 637,500 diabetes simulations have been run at AIDA online, from all over the world. Conclusions. AIDA online's functionality is similar to the downloadable AIDA program, but the mode of implementation and usage is different. An advantage to utilising a server-based application is the flexibility that can be offered. New modules can be added quickly to the online simulator. This has facilitated the development of refinements to AIDA online, which have instantaneously become available around the world, with no further local downloads or installations being required.
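
    The record above describes a plain form-based client-server exchange: numerical parameters are submitted as an HTML FORM and server-side scripts return the rendered simulation. The following Python sketch only mirrors that request pattern; the URL and field names are hypothetical placeholders, not the actual AIDA online interface.

    # Minimal sketch of the client-server pattern described above: numerical
    # parameters are sent as an HTML-form-style POST and the server returns a
    # generated page.  The URL and field names below are hypothetical, not the
    # actual AIDA online interface (which is driven by Perl scripts server-side).
    from urllib.parse import urlencode
    from urllib.request import urlopen

    def run_simulation(insulin_dose_units, carbohydrate_grams):
        form_fields = {
            "insulin_dose": insulin_dose_units,   # hypothetical field name
            "carbohydrate": carbohydrate_grams,   # hypothetical field name
        }
        body = urlencode(form_fields).encode("ascii")
        # A form POST, as a browser would submit it; the remote script renders
        # the 24 hr glucose-insulin simulation as an HTML page.
        with urlopen("https://example.org/aida/simulate", data=body) as response:
            return response.read().decode("utf-8", errors="replace")

    if __name__ == "__main__":
        print(run_simulation(10, 60)[:200])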

  3. Dynamic Interactive Educational Diabetes Simulations Using the World Wide Web: An Experience of More Than 15 Years with AIDA Online

    Science.gov (United States)

    Lehmann, Eldon D.; DeWolf, Dennis K.; Novotny, Christopher A.; Reed, Karen; Gotwals, Robert R.

    2014-01-01

    Background. AIDA is a widely available downloadable educational simulator of glucose-insulin interaction in diabetes. Methods. A web-based version of AIDA was developed that utilises a server-based architecture with HTML FORM commands to submit numerical data from a web-browser client to a remote web server. AIDA online, located on a remote server, passes the received data through Perl scripts which interactively produce 24 hr insulin and glucose simulations. Results. AIDA online allows users to modify the insulin regimen and diet of 40 different prestored “virtual diabetic patients” on the internet or create new “patients” with user-generated regimens. Multiple simulations can be run, with graphical results viewed via a standard web-browser window. To date, over 637,500 diabetes simulations have been run at AIDA online, from all over the world. Conclusions. AIDA online's functionality is similar to the downloadable AIDA program, but the mode of implementation and usage is different. An advantage to utilising a server-based application is the flexibility that can be offered. New modules can be added quickly to the online simulator. This has facilitated the development of refinements to AIDA online, which have instantaneously become available around the world, with no further local downloads or installations being required. PMID:24511312

  4. Astronomy: A Self-Teaching Guide, 6th Edition

    Science.gov (United States)

    Moché, Dinah L.

    2004-02-01

    "A lively, up-to-date account of the basic principles of astronomy and exciting current field of research."-Science Digest For a quarter of a century, Astronomy: A Self-Teaching Guide has been making students and amateur stargazers alike feel at home among the stars. From stars, planets and galaxies, to black holes, the Big Bang and life in space, this title has been making it easy for beginners to quickly grasp the basic concepts of astronomy for over 25 years. Updated with the latest discoveries in astronomy and astrophysics, this newest edition of Dinah Moché's classic guide now includes many Web site addresses for spectacular images and news. And like all previous editions, it is packed with valuable tables, charts, star and moon maps and features simple activities that reinforce readers' grasp of basic concepts at their own pace, as well as objectives, reviews, and self-tests to monitor their progress. Dinah L. Moché, PhD (Rye, NY), is an award-winning author, educator, and lecturer. Her books have sold over nine million copies in seven languages.

  5. Engineering Web Applications

    DEFF Research Database (Denmark)

    Casteleyn, Sven; Daniel, Florian; Dolog, Peter

    Nowadays, Web applications are almost omnipresent. The Web has become a platform not only for information delivery, but also for eCommerce systems, social networks, mobile services, and distributed learning environments. Engineering Web applications involves many intrinsic challenges due to their distributed nature, content orientation, and the requirement to make them available to a wide spectrum of users who are unknown in advance. The authors discuss these challenges in the context of well-established engineering processes, covering the whole product lifecycle from requirements engineering through design and implementation to deployment and maintenance. They stress the importance of models in Web application development, and they compare well-known Web-specific development processes like WebML, WSDM and OOHDM to traditional software development approaches like the waterfall model and the spiral model.

  6. Engineering semantic web information systems in Hera

    NARCIS (Netherlands)

    Vdovják, R.; Frasincar, F.; Houben, G.J.P.M.; Barna, P.

    2003-01-01

    The success of the World Wide Web has caused the concept of information system to change. Web Information Systems (WIS) adopt the paradigm and technologies of the Web in order to retrieve information from sources on the Web, and to present the information in terms of a Web or hypermedia

  7. Women, pharmacy and the World Wide Web: could they be the answer to the obesity epidemic?

    Science.gov (United States)

    Fakih, Souhiela; Hussainy, Safeera; Marriott, Jennifer

    2014-04-01

    The objective of this article is to explore how giving women access to evidence-based information in weight management through pharmacies, and by utilising the World Wide Web, is a much needed step towards dealing with the obesity crisis. Women's needs should be considered when developing evidence-based information on weight. Excess weight places them at high risk of diabetes and cardiovascular disease, infertility and complications following pregnancy and giving birth. Women are also an important population group because they influence decision-making around meal choices for their families and are the biggest consumers of weight-loss products, many of which can be purchased in pharmacies. Pharmacies are readily accessible primary healthcare locations and given the pharmacist's expertise in being able to recognise underlying causes of obesity (e.g. medications, certain disease states), pharmacies are an ideal location to provide women with evidence-based information on all facets of weight management. Considering the exponential rise in the use of the World Wide Web, this information could be delivered as an online educational resource supported by other flexible formats. The time has come for the development of an online, evidence-based educational resource on weight management, which is combined with other flexible formats and targeted at women in general and according to different phases of their lives (pregnancy, post-partum, menopause). By empowering women with this knowledge it will allow them and their families to take better control of their health and wellbeing, and it may just be the much needed answer to complement already existing resources to help curb the obesity epidemic. © 2013 Royal Pharmaceutical Society.

  8. Global Bathymetry: Machine Learning for Data Editing

    Science.gov (United States)

    Sandwell, D. T.; Tea, B.; Freund, Y.

    2017-12-01

    The accuracy of global bathymetry depends primarily on the coverage and accuracy of the sounding data and secondarily on the depth predicted from gravity. A main focus of our research is to add newly-available data to the global compilation. Most data sources have 1-12% of erroneous soundings caused by a wide array of blunders and measurement errors. Over the years we have hand-edited this data using undergraduate employees at UCSD (440 million soundings at 500 m resolution). We are developing a machine learning approach to refine the flagging of the older soundings and provide automated editing of newly-acquired soundings. The approach has three main steps: 1) Combine the sounding data with additional information that may inform the machine learning algorithm. The additional parameters include: depth predicted from gravity; distance to the nearest sounding from other cruises; seafloor age; spreading rate; sediment thickness; and vertical gravity gradient. 2) Use available edit decisions as training data sets for a boosted tree algorithm with a binary logistic objective function and L2 regularization. Initial results with poor quality single beam soundings show that the automated algorithm matches the hand-edited data 89% of the time. The results show that most of the information for detecting outliers comes from predicted depth, with secondary contributions from distance to the nearest sounding and longitude. A similar analysis using very high quality multibeam data shows that the automated algorithm matches the hand-edited data 93% of the time. Again, most of the information for detecting outliers comes from predicted depth, with secondary contributions from distance to the nearest sounding and longitude. 3) Use the machine learning parameters, derived from the training data, to edit 12 million newly acquired single beam soundings provided by the National Geospatial-Intelligence Agency. The output of the learning algorithm will be
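
    Step 2 of the record above, flagging bad soundings with a boosted tree trained on hand-edit decisions under a binary logistic objective with L2 regularization, can be sketched as follows. The sketch uses scikit-learn's HistGradientBoostingClassifier as one readily available boosted-tree implementation and synthetic stand-ins for the predictors named in the abstract; it is not the authors' code.

    # Sketch of the outlier-flagging step described above: a boosted-tree
    # classifier (default objective for binary targets is the logistic loss)
    # with L2 regularization, trained on hand-edit decisions.  The synthetic
    # data below stands in for the real predictors; this is illustrative only.
    import numpy as np
    from sklearn.ensemble import HistGradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    depth_anomaly = rng.normal(0.0, 1.0, n)   # sounding minus gravity-predicted depth
    nearest_dist = rng.exponential(5.0, n)    # distance to nearest sounding (other cruises)
    longitude = rng.uniform(-180.0, 180.0, n)
    X = np.column_stack([depth_anomaly, nearest_dist, longitude])
    # In this toy data, soundings with a large depth anomaly tend to be flagged bad.
    flagged_bad = (np.abs(depth_anomaly) + 0.1 * rng.normal(size=n)) > 2.0

    X_train, X_test, y_train, y_test = train_test_split(X, flagged_bad, random_state=0)
    model = HistGradientBoostingClassifier(l2_regularization=1.0)
    model.fit(X_train, y_train)
    print(f"agreement with held-out edit decisions: {model.score(X_test, y_test):.2%}")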

  9. XML and Better Web Searching.

    Science.gov (United States)

    Jackson, Joe; Gilstrap, Donald L.

    1999-01-01

    Addresses the implications of the new Web metalanguage XML for searching on the World Wide Web and considers the future of XML on the Web. Compared to HTML, XML is more concerned with structure of data than documents, and these data structures should prove conducive to precise, context rich searching. (Author/LRW)

  10. Fundamentals of Welding. Teacher Edition [and] Student Edition [and] Student Workbook. Second Edition.

    Science.gov (United States)

    Fortney, Clarence; Gregory, Mike; New, Larry

    Teacher and student editions and a student workbook for fundamentals of welding comprise the first of six in a series of competency-based instructional materials for welding programs. Introductory pages in the teacher edition are training and competency profile, instructional/task analysis, basic skills icons and classifications, basic skills…

  11. 60. The World-Wide Inaccessible Web, Part 1: Browsing

    Science.gov (United States)

    Baggaley, Jon; Batpurev, Batchuluun

    2007-01-01

    Two studies are reported, comparing the browser loading times of webpages created using common Web development techniques. The loading speeds were estimated in 12 Asian countries by members of the "PANdora" network, funded by the International Development Research Centre (IDRC) to conduct collaborative research in the development of…

  12. Handbook of radioactivity analysis. Second edition

    International Nuclear Information System (INIS)

    L'Annunziata, M.

    2003-07-01

    This updated and much expanded Second Edition is an authoritative handbook providing the principles, practical techniques, and procedures for the accurate measurement of radioactivity from the very low levels encountered in the environment to higher levels measured in radioisotope research, clinical laboratories, biological sciences, radionuclide standardization, nuclear medicine, nuclear power, fuel cycle facilities, and in the implementation of nuclear safeguards. The book describes the preparation of samples from a wide variety of matrices, assists the investigator or technician in the selection and use of appropriate radiation detectors, and presents state-of-the-art methods of analysis. Fundamentals of radioactivity properties, radionuclide decay, the calculations involved, and methods of detection provide the basis for a thorough understanding of the analytical procedures. The Handbook of Radioactivity Analysis, Second Edition is suitable as a teaching text for university and professional training courses

  13. [CRISPR/Cas system for genome editing in pluripotent stem cells].

    Science.gov (United States)

    Vasil'eva, E A; Melino, D; Barlev, N A

    2015-01-01

    Genome editing systems based on site-specific nucleases have become very popular in modern bioengineering. Human pluripotent stem cells provide a unique platform for gene function studies, disease modeling, and drug testing. Consequently, a technology for fast, accurate and well-controlled genome manipulation is required. The CRISPR/Cas (clustered regularly interspaced short palindromic repeat/CRISPR-associated) system can be employed for these purposes. This system is based on the site-specific programmable nuclease Cas9. The numerous advantages of the CRISPR/Cas system and its successful application to human stem cells provide wide opportunities for gene therapy and regenerative medicine. In this publication, we describe and compare the main genome editing systems based on site-specific programmable nucleases and discuss the opportunities and perspectives of the CRISPR/Cas system for application to pluripotent stem cells.

  14. Capataz: a framework for distributing algorithms via the World Wide Web

    Directory of Open Access Journals (Sweden)

    Gonzalo J. Martínez

    2015-08-01

    Full Text Available In recent years, some scientists have embraced the distributed computing paradigm. As experiments and simulations demand ever more computing power, coordinating the efforts of many different processors is often the only reasonable resort. We developed an open-source distributed computing framework based on web technologies, and named it Capataz. Acting as an HTTP server, it allows web browsers running on many different devices to connect and contribute to the execution of distributed algorithms written in Javascript. Capataz takes advantage of architectures with many cores using web workers. This paper presents an improvement in Capataz's usability and why it was needed. In previous experiments the total time of distributed algorithms proved to be susceptible to changes in the execution time of the jobs. The system now adapts by bundling jobs together if they are too simple. The computational experiment to test the solution is a brute force estimation of pi. The benchmark results show that by bundling jobs, the overall performance is greatly increased.
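
    The job-bundling idea and the pi benchmark mentioned above can be illustrated with a small single-process sketch. Capataz itself is a Javascript, browser-based framework; the Python below only mirrors the scheme of grouping many small jobs into larger bundles before dispatch.

    # Toy illustration of the benchmark and the job-bundling idea described above:
    # Monte Carlo estimation of pi, with many small jobs grouped into larger
    # bundles before "dispatch".  This single-process sketch only mirrors the scheme.
    import random

    def monte_carlo_job(samples):
        """One job: count random points falling inside the unit quarter-circle."""
        hits = 0
        for _ in range(samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                hits += 1
        return hits

    def bundle(jobs, bundle_size):
        """Group small jobs so each dispatched unit of work is worth the overhead."""
        for i in range(0, len(jobs), bundle_size):
            yield jobs[i:i + bundle_size]

    jobs = [1000] * 200                 # 200 small jobs of 1000 samples each
    total_hits = 0
    for job_bundle in bundle(jobs, bundle_size=20):
        total_hits += sum(monte_carlo_job(n) for n in job_bundle)

    total_samples = sum(jobs)
    print("pi estimate:", 4.0 * total_hits / total_samples)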

  15. New web interface for Personal dosimetry VF, a.s

    International Nuclear Information System (INIS)

    Studeny, J.

    2014-01-01

    The lecture introduces the new functions and graphic design of WebSOD, the web interface of the Personal Dosimetry Service of VF, a.s., which will be updated in November 2014. The new interface will have a new graphic design and an intuitive control system, and will provide a range of new functions: Personal doses - display of personal doses from personal, extremity and neutron dosimeters, including graphs and annual and electronic listings of doses; Collective doses - display of group doses for selected periods of time; Reference levels - setting and display of three reference levels; Evidence - administration of monitored individuals, i.e. beginning or ending of monitoring, or editing the data of monitored persons and centers. (author)

  16. The Atlas of Chinese World Wide Web Ecosystem Shaped by the Collective Attention Flows.

    Science.gov (United States)

    Lou, Xiaodan; Li, Yong; Gu, Weiwei; Zhang, Jiang

    2016-01-01

    The web can be regarded as an ecosystem of digital resources connected and shaped by the collective successive behaviors of users. Knowing how people allocate limited attention to different resources is of great importance. To answer this, we embed the most popular Chinese web sites into a high-dimensional Euclidean space based on the open flow network model of a large number of Chinese users' collective attention flows, which considers both the connection topology of hyperlinks between the sites and the collective behaviors of the users. With these tools, we rank the web sites and compare their centralities based on flow distances with other metrics. We also study the patterns of attention flow allocation, and find that a large number of web sites concentrate in the central area of the embedding space, and only a small fraction of web sites disperse in the periphery. The entire embedding space can be separated into 3 regions (core, interim, and periphery). The sites in the core (1%) occupy a majority of the attention flows (40%), the sites in the interim (34%) attract 40%, whereas the other sites (65%) take only 20% of the flows. What's more, we clustered the web sites into 4 groups according to their positions in the space, and found that web sites similar in content and topic are grouped together. In short, by incorporating the open flow network model, we can clearly see how collective attention is allocated and flows across different web sites, and how web sites connect to each other.
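
    As a rough, purely illustrative sketch of the core/interim/periphery split reported above, sites can be ranked by the attention flow they receive and partitioned by cumulative share. This toy stand-in does not implement the paper's open flow network model, flow distances, or embedding; the site names and flow counts below are made up.

    # Toy illustration of a core/interim/periphery split: sites ranked by the
    # attention flow they receive and partitioned by cumulative flow share,
    # loosely echoing the 40%/40%/20% allocation reported in the record.
    # This is a simplified stand-in, not the paper's open flow network model.
    attention_flow = {                      # hypothetical site -> received flow
        "site_a": 5200, "site_b": 3100, "site_c": 800, "site_d": 400,
        "site_e": 250,  "site_f": 150,  "site_g": 70,  "site_h": 30,
    }

    total = sum(attention_flow.values())
    ranked = sorted(attention_flow.items(), key=lambda kv: kv[1], reverse=True)

    regions, cumulative = {}, 0.0
    for site, flow in ranked:
        if cumulative < 0.40:               # flow accumulated before this site
            regions[site] = "core"
        elif cumulative < 0.80:
            regions[site] = "interim"
        else:
            regions[site] = "periphery"
        cumulative += flow / total

    for site, flow in ranked:
        print(f"{site}: {flow:5d} ({flow/total:5.1%}) -> {regions[site]}")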

  17. The Atlas of Chinese World Wide Web Ecosystem Shaped by the Collective Attention Flows

    Science.gov (United States)

    Lou, Xiaodan; Li, Yong; Gu, Weiwei; Zhang, Jiang

    2016-01-01

    The web can be regarded as an ecosystem of digital resources connected and shaped by the collective successive behaviors of users. Knowing how people allocate limited attention to different resources is of great importance. To answer this, we embed the most popular Chinese web sites into a high-dimensional Euclidean space based on the open flow network model of a large number of Chinese users' collective attention flows, which considers both the connection topology of hyperlinks between the sites and the collective behaviors of the users. With these tools, we rank the web sites and compare their centralities based on flow distances with other metrics. We also study the patterns of attention flow allocation, and find that a large number of web sites concentrate in the central area of the embedding space, and only a small fraction of web sites disperse in the periphery. The entire embedding space can be separated into 3 regions (core, interim, and periphery). The sites in the core (1%) occupy a majority of the attention flows (40%), the sites in the interim (34%) attract 40%, whereas the other sites (65%) take only 20% of the flows. What's more, we clustered the web sites into 4 groups according to their positions in the space, and found that web sites similar in content and topic are grouped together. In short, by incorporating the open flow network model, we can clearly see how collective attention is allocated and flows across different web sites, and how web sites connect to each other. PMID:27812133

  18. Object Lessons: Material Culture on the World Wide Web.

    Science.gov (United States)

    Mires, Charlene

    2001-01-01

    Describes the content of a course on material culture for undergraduate students that was separated into two sections: (1) first students read books and analyzed artifacts; and (2) then the class explored the Centennial Exhibition held in Philadelphia (Pennsylvania) in 1876, applying material culture methods and constructing a Web site from their…

  19. Interactive Editing and Cataloging Interfaces for Modern Digital Library Systems

    CERN Document Server

    Raae, L C; Helstrup, H

    2009-01-01

    The next-generation High Energy Physics information system, INSPIRE, is being built by combining the content from the successful SPIRES database of bibliographic information with the CDS Invenio software being developed at CERN, an open-source platform for large digital library systems. The project is a collaboration between four major particle physics laboratories in Europe and the U.S. New tools are being developed to enable the global cooperation between catalogers at these labs. The BibEdit module will provide a central interface for the editing, enrichment, correction and verification of a record on its way into the system, by processing and presenting data from several supporting modules to the cataloger. The objective is to minimize the time and actions needed by the cataloger to process the record. To create a fast and powerful web application we make use of modern AJAX technology to create a dynamic and responsive user interface, where server communication happens in the background without delaying t...

  20. Archaeology 2.0? Review of Archaeology 2.0: New Approaches to Communication and Collaboration [Web Book

    Directory of Open Access Journals (Sweden)

    Michael Shanks

    2012-10-01

    Full Text Available The Cotsen Institute in Los Angeles has launched a new publishing initiative in 'Digital Archaeology'. Its first book, Archaeology 2.0: New Approaches to Communication and Collaboration, edited by Eric C. Kansa, Sarah Whitcher Kansa and Ethan Watrall, makes a grand claim, if only in its title, that archaeology has undergone, or is about to undergo, changes that bring about a completely new version or kind of archaeology. The analogy is with the World Wide Web. Just as the IT world embraced radical changes of software design and web delivery nearly ten years ago and announced that this was Web version 2.0, so too archaeology is changed, the authors claim, and enough to warrant the designation version 2.0. We disagree and argue that the claim is not well supported. Moreover, we hold that the book misunderstands the implications of Web 2.0 and its aftermath. The well-meaning authors do make a valuable contribution to debates about uses of information technology in archaeology, and particularly data management. But their perspective is hopelessly narrow, looking back to the circumscribed concerns of professional field archaeologists with their data, its dissemination, use and survival. The authors focus mainly upon their own projects, expressing little interest in the scope of contemporary archaeology, digitally enabled as it all is, through heritage and everything to do with the representation of the material past in the present, an interest surely begged by the overt reference to the global changes associated with the notion of Web 2.0.

  1. TMFoldWeb: a web server for predicting transmembrane protein fold class.

    Science.gov (United States)

    Kozma, Dániel; Tusnády, Gábor E

    2015-09-17

    Here we present TMFoldWeb, the web server implementation of TMFoldRec, a transmembrane protein fold recognition algorithm. TMFoldRec uses statistical potentials and utilises topology filtering and a gapless threading algorithm. It ranks template structures, selects the most likely candidates, and estimates the reliability of the lowest-energy model obtained. The statistical potential was developed in a maximum likelihood framework on a representative set of the PDBTM database. According to benchmark tests, TMFoldRec correctly predicts the fold class of a given transmembrane protein sequence in about 77 % of cases. An intuitive web interface has been developed for the recently published TMFoldRec algorithm. The query sequence goes through a pipeline of topology prediction and a systematic sequence-to-structure alignment (threading). Resulting templates are ordered by energy and reliability values and are colored according to their significance level. Besides the graphical interface, programmatic access is available as well, via a direct interface for developers or for submitting genome-wide data sets. The TMFoldWeb web server is unique and currently the only web server that is able to predict the fold class of transmembrane proteins while assigning reliability scores to the prediction. This method is prepared for genome-wide analysis with its easy-to-use interface, informative result page and programmatic access. The web server, as well as the molecule viewer, is responsive and fully compatible with current tablets and mobile devices.

  2. General edition program

    International Nuclear Information System (INIS)

    Vaturi, Sylvain

    1969-01-01

    Computerized edition is essential for data processing exploitation. When a more or less complex edition program is required for each task, the need for a general edition program becomes obvious. The aim of this study is to create a general edition program. Universal programs are capable of executing numerous and varied tasks. For a more specific processing task whose execution is frequently required, the use of a specialized program is preferable because, in contrast to the universal program, it goes straight to the point [fr]

  3. CRISPR/Cas9 based genome editing of Penicillium chrysogenum

    NARCIS (Netherlands)

    Pohl, Carsten; Kiel, Jan A K W; Driessen, Arnold J M; Bovenberg, Roel A L; Nygård, Yvonne

    2016-01-01

    CRISPR/Cas9 based systems have emerged as versatile platforms for precision genome editing in a wide range of organisms. Here we have developed powerful CRISPR/Cas9 tools for marker-based and marker-free genome modifications in Penicillium chrysogenum, a model filamentous fungus and industrially

  4. The Evaluation of Web pages of State Universities’ Usability via Content Analysis

    Directory of Open Access Journals (Sweden)

    Ezgi CEVHER

    2015-12-01

    Full Text Available Within the scope of the e-transformation project in Turkey, the "Preparation of Guideline for State Institutions' Web Pages" action has been carried out to ensure minimum cohesiveness among government institutions' and organizations' Web pages in terms of design and content. As a result of those efforts, the first edition of the "Guideline for State Institutions' Web Pages" was prepared in 2006. The second edition of this guideline was published in 2009, in a simpler form, under the name "Guideline and Suggestions for the Standards of Governmental Institutions' Web Pages". It became compulsory for local and central government institutions and organizations to follow the procedures and principles stated in the Guideline. Through this Guideline, preparing governmental institutions' websites in harmony with the stated standards, and updating them in parallel with changing conditions and requirements, have come onto the agenda, especially in recent years. In this study, the web pages of state universities have been assessed through content analysis with respect to the characteristics stated in the Guideline. Considering that university web pages receive hundreds of visitors daily, effective, productive and comfortable usability must be ensured. For this reason, the web pages of universities have been analyzed to determine to what extent the state universities implement the compulsory principles stated in the Guideline, assessing them from the aspects of compliance with standards, usability, and accessibility.

  5. Ten years for the public Web

    CERN Multimedia

    2003-01-01

    Ten years ago, CERN issued a statement declaring that a little known piece of software called the World Wide Web was in the public domain. Nowadays, the Web is an indispensable part of modern communications. The idea for the Web goes back to March 1989, when CERN computer scientist Tim Berners-Lee wrote a proposal for a 'Distributed Information Management System' for the high-energy physics community. The Web was originally conceived and developed to meet the demand for information sharing between scientists working all over the world. There were many obstacles in the 1980s to the effective exchange of information. There was, for example, a great variety of computer and network systems, with hardly any common features. The main purpose of the Web was to allow scientists to access information from any source in a consistent and simple way. By Christmas 1990, Berners-Lee's idea had become the World Wide Web, with its first server and browser running at CERN. Through 1991, the Web spread to other particle physics ...

  6. Caught in the Web

    Energy Technology Data Exchange (ETDEWEB)

    Gillies, James

    1995-06-15

    The World-Wide Web may have taken the Internet by storm, but many people would be surprised to learn that it owes its existence to CERN. Around half the world's particle physicists come to CERN for their experiments, and the Web is the result of their need to share information quickly and easily on a global scale. Six years after Tim Berners-Lee's inspired idea to marry hypertext to the Internet in 1989, CERN is handing over future Web development to the World-Wide Web Consortium, run by the French National Institute for Research in Computer Science and Control, INRIA, and the Laboratory for Computer Science of the Massachusetts Institute of Technology, MIT, leaving itself free to concentrate on physics. The Laboratory marked this transition with a conference designed to give a taste of what the Web can do, whilst firmly stamping it with the label "Made in CERN". Over 200 European journalists and educationalists came to CERN on 8-9 March for the World-Wide Web Days, resulting in wide media coverage. The conference was opened by UK Science Minister David Hunt who stressed the importance of fundamental research in generating new ideas. "Who could have guessed 10 years ago", he said, "that particle physics research would lead to a communication system which would allow every school to have the biggest library in the world in a single computer?". In his introduction, the Minister also pointed out that "CERN and other basic research laboratories help to break new technological ground and sow the seeds of what will become mainstream manufacturing in the future." Learning the jargon is often the hardest part of coming to grips with any new invention, so CERN put it at the top of the agenda. Jacques Altaber, who helped introduce the Internet to CERN in the early 1980s, explained that without the Internet, the Web couldn't exist. The Internet began as a US Defense Department research project in the 1970s and has grown into a global network-of-networks linking some

  7. A hybrid BCI web browser based on EEG and EOG signals.

    Science.gov (United States)

    Shenghong He; Tianyou Yu; Zhenghui Gu; Yuanqing Li

    2017-07-01

    In this study, we propose a new web browser based on a hybrid brain-computer interface (BCI) combining electroencephalography (EEG) and electrooculography (EOG) signals. Specifically, the user can control the horizontal movement of the mouse by imagining left/right hand motion, and control the vertical movement of the mouse, select/reject a target, or input text in an edit box by blinking eyes in synchrony with the flashes of the corresponding buttons on the GUI. Based on mouse control, target selection and text input, the user can open a web page of interest, select an intended target in the web and read the page content. An online experiment was conducted involving five healthy subjects. The experimental results demonstrated the effectiveness of the proposed method.

  8. The Development of Interactive World Wide Web Based Teaching Material in Forensic Science.

    Science.gov (United States)

    Daeid, Niamh Nic

    2001-01-01

    Describes the development of a Web-based tutorial in the forensic science teaching program at the University of Strathclyde (Scotland). Highlights include the theoretical basis for course development; objectives; Web site design; student feedback; and staff feedback. (LRW)

  9. Storage Manager and File Transfer Web Services

    International Nuclear Information System (INIS)

    William A Watson III; Ying Chen; Jie Chen; Walt Akers

    2002-01-01

    Web services are emerging as an interesting mechanism for a wide range of grid services, particularly those focused upon information services and control. When coupled with efficient data transfer services, they provide a powerful mechanism for building a flexible, open, extensible data grid for science applications. In this paper we present our prototype work on a Java Storage Resource Manager (JSRM) web service and a Java Reliable File Transfer (JRFT) web service. A Java client (Grid File Manager) built on top of JSRM was developed to demonstrate the capabilities of these web services. The purpose of this work is to show the extent to which SOAP-based web services are an appropriate direction for building a grid-wide data management system, and eventually grid-based portals
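
    The services described above are SOAP-based, meaning each call is an XML envelope POSTed over HTTP. The sketch below shows that generic mechanism only; the endpoint, namespace, operation name and path are hypothetical and are not the actual JSRM/JRFT interfaces.

    # Generic illustration of a SOAP 1.1 call at the HTTP level, the style of
    # web service the record describes.  The endpoint, namespace, operation and
    # path below are hypothetical; they are not the actual JSRM/JRFT interfaces.
    from urllib.request import Request, urlopen

    SOAP_ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <listDirectory xmlns="urn:example:storage">   <!-- hypothetical operation -->
          <path>/grid/data/run42</path>                <!-- hypothetical path -->
        </listDirectory>
      </soap:Body>
    </soap:Envelope>"""

    def call_soap(endpoint: str) -> str:
        request = Request(
            endpoint,
            data=SOAP_ENVELOPE.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8",
                     "SOAPAction": "urn:example:storage#listDirectory"},
        )
        with urlopen(request) as response:
            return response.read().decode("utf-8")

    if __name__ == "__main__":
        print(call_soap("https://example.org/jsrm"))  # hypothetical endpoint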

  10. Designing Content for A Web-Based Application Used in Blended Composition Classes: Things to Consider in The EFL/ESL Context

    Directory of Open Access Journals (Sweden)

    Irfan Rifai

    2014-10-01

    Full Text Available In the world of composition teaching, teachers of writing play a vital role. Their tasks are demonstrating, motivating, supporting, responding, and evaluating. The days of these teachers are often filled with editing and additional feedback sessions. Thus, in order to have a web application especially designed as a tool for learning to write in ESL, the five tasks mentioned by Harmer should be treated as important points to consider (teachers' preferences). The content of such a web application should also be based on careful considerations that include factors like students' preferences (user experience), to make sure that the application being created meets the preferences of its users. With these considerations in mind, two groups of students were gathered in a study involving two writing classes in which online technology was used as a platform for students and the instructor to exchange ideas, review and edit drafts, share links to writing tips, and leave comments on each other's pieces of writing. Students' online activities were observed, and their feedback during group discussion was used as the basis for constructing the content of the web application.

  11. Web cache location

    Directory of Open Access Journals (Sweden)

    Boffey Brian

    2004-01-01

    Full Text Available Stress placed on network infrastructure by the popularity of the World Wide Web may be partially relieved by keeping multiple copies of Web documents at geographically dispersed locations. In particular, use of proxy caches and replication provide a means of storing information 'nearer to end users'. This paper concentrates on the locational aspects of Web caching giving both an overview, from an operational research point of view, of existing research and putting forward avenues for possible further research. This area of research is in its infancy and the emphasis will be on themes and trends rather than on algorithm construction. Finally, Web caching problems are briefly related to referral systems more generally.
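
    One classical way to formalize the locational question raised in this record is as a facility-location (k-median style) problem: choose k cache sites so that the total demand-weighted distance from user groups to their nearest cache is minimized. The greedy heuristic and toy data below are illustrative only and are not taken from the paper.

    # One classical formalization of the cache-location question: pick k cache
    # sites minimizing total demand-weighted distance from user groups to their
    # nearest cache (a k-median-style problem).  Greedy heuristic, toy data.
    def total_cost(demands, distance, open_sites):
        return sum(w * min(distance[u][s] for s in open_sites)
                   for u, w in demands.items())

    def greedy_cache_placement(demands, distance, candidate_sites, k):
        chosen = []
        for _ in range(k):
            best = min((s for s in candidate_sites if s not in chosen),
                       key=lambda s: total_cost(demands, distance, chosen + [s]))
            chosen.append(best)
        return chosen

    # Toy instance: demand weight per user region and distances to candidate sites.
    demands = {"eu": 40, "us": 35, "asia": 25}
    distance = {
        "eu":   {"london": 1, "newyork": 6, "tokyo": 9},
        "us":   {"london": 6, "newyork": 1, "tokyo": 8},
        "asia": {"london": 9, "newyork": 8, "tokyo": 1},
    }
    sites = greedy_cache_placement(demands, distance, ["london", "newyork", "tokyo"], k=2)
    print("open caches:", sites, "cost:", total_cost(demands, distance, sites))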

  12. [Web-based analysis of Stilling's color plates].

    Science.gov (United States)

    Kuchenbecker, J

    2014-12-01

    Color vision tests with pseudoisochromatic plates currently represent the most common procedure for the screening of congenital color vision deficiencies. By means of a web-based color vision test, new and old color plates can be tested for diagnostic quality without major effort. A total of 16 digitized Stilling's color plates of the 11th edition from 1907 were included in a web-based color vision test (http://www.farbsehtest.de). The χ²-test was used to check whether the Stilling color plates showed results similar to those of the nine previously evaluated Ishihara color plates. A total of 518 subjects, including 101 (19.5%) female subjects, with a mean age of 34.6 ± 17 years, took the web-based test with the 25 plates. For all participants, the proportion of correct recognition ranged between 5.2% (n = 27) and 97.7% (n = 506) for the Stilling color plates and between 64.9% (n = 336) and 100% (n = 518) for the Ishihara color plates. For participants with more than 5 errors (n = 247), the proportion of correct recognition ranged between 2.0% (n = 5) and 98.0% (n = 242) for the Stilling plates and between 42.5% (n = 105) and 100% (n = 247) for the Ishihara plates. Taking all color plates and all participants into account, there was a significantly higher incidence of erroneous recognition of the Stilling color plates (3038 false and 5250 true answers) compared to the Ishihara color plates (1511 false and 3151 true answers). Suitable Stilling plates could be used for the test edition of the Velhagen/Broschmann/Kuchenbecker color plates from 2014. Overall, the Stilling color plates were recognized with a higher incidence of error by all participants in the web-based test compared to the Ishihara color plates used, which in most cases was attributable to ambiguity of some symbols.
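
    The chi-squared comparison reported above can be re-run directly from the answer counts quoted in the abstract (Stilling: 3038 false / 5250 true; Ishihara: 1511 false / 3151 true), assuming a standard 2x2 contingency test was meant.

    # Quick re-computation of the chi-squared comparison from the answer counts
    # quoted in the record (assuming a standard 2x2 contingency test).
    from scipy.stats import chi2_contingency

    #                 false answers, true answers
    table = [[3038, 5250],   # Stilling plates
             [1511, 3151]]   # Ishihara plates

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.2e}")
    # A small p is consistent with the significantly higher error rate reported
    # for the Stilling plates.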

  13. Moving toward a universally accessible web: Web accessibility and education.

    Science.gov (United States)

    Kurt, Serhat

    2017-12-08

    The World Wide Web is an extremely powerful source of information, inspiration, ideas, and opportunities. As such, it has become an integral part of daily life for a great majority of people. Yet, for a significant number of others, the internet offers only limited value due to the existence of barriers which make accessing the Web difficult, if not impossible. This article illustrates some of the reasons that achieving equality of access to the online world of education is so critical, explores the current status of Web accessibility, discusses evaluative tools and methods that can help identify accessibility issues in educational websites, and provides practical recommendations and guidelines for resolving some of the obstacles that currently hinder the achievability of the goal of universal Web access.

  14. [Current advances and future prospects of genome editing technology in the field of biomedicine].

    Science.gov (United States)

    Sakuma, Tetsushi

    Genome editing technology can alter the genomic sequence at will, contributing to the creation of cellular and animal models of human diseases, including hereditary disorders and cancers, and to the generation of mutation-corrected human induced pluripotent stem cells for ex vivo regenerative medicine. In addition, novel approaches have been pursued, such as drug development using genome-wide CRISPR screening and cancer suppression using epigenome editing technology, which can change epigenetic modifications in a site-specific manner. In this article, I summarize the current advances and future prospects of genome editing technology in the field of biomedicine.

  15. Semantic Web Services Challenge, Results from the First Year. Series: Semantic Web And Beyond, Volume 8.

    Science.gov (United States)

    Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.

    Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of the potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part of this potential rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings, is necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open source initiative that provides a public evaluation and certification of multiple frameworks on common industrially-relevant problem sets. This edited volume reports on the first results in developing common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for their potential practical use. The book is also suitable for advanced-level students in computer science.

  16. Multiplex editing system

    DEFF Research Database (Denmark)

    2015-01-01

    The present invention relates to a multiplex editing system. The system allows multiple edits of nucleic acid sequences such as genomic sequences, for example knockins of genes of interest in a genome, knockouts of genomic sequences and/or allele replacement. Also provided herein are a method for editing nucleic acids and a cell comprising a stably integrated endonuclease.

  17. Simple Genome Editing of Rodent Intact Embryos by Electroporation.

    Directory of Open Access Journals (Sweden)

    Takehito Kaneko

    Full Text Available The clustered regularly interspaced short palindromic repeat (CRISPR)/CRISPR-associated (Cas) system is a powerful tool for genome editing in animals. Recently, new technology has been developed to genetically modify animals without using highly skilled techniques, such as pronuclear microinjection of endonucleases. The Technique for Animal Knockout system by Electroporation (TAKE) method is a simple and effective technology that produces knockout rats by introducing endonuclease mRNAs into intact embryos using electroporation. Using the TAKE method and the CRISPR/Cas system, the present study successfully produced knockout and knock-in mice and rats. The mice and rats derived from embryos electroporated with Cas9 mRNA, gRNA and single-stranded oligodeoxynucleotide (ssODN) carried the edited target gene as a knockout (67% of mice and 88% of rats) or knock-in (both 33%). The TAKE method could be widely used as a powerful tool to produce genetically modified animals by genome editing.

  18. Happy birthday WWW: the web is now old enough to drive

    CERN Document Server

    Gilbertson, Scott

    2007-01-01

    "The World Wide Web can now drive. Sixteen years ago yeterday, in a short post to the alt.hypertext newsgroup, tim Berners-Lee revealed the first public web pages summarizing his World Wide Web project." (1/4 page)

  19. A design method for an intuitive web site

    Energy Technology Data Exchange (ETDEWEB)

    Quinniey, M.L.; Diegert, K.V.; Baca, B.G.; Forsythe, J.C.; Grose, E.

    1999-11-03

    The paper describes a methodology for designing a web site for human factors engineers that is also applicable to designing a web site for any group of people. Many web pages on the World Wide Web are not organized in a format that allows a user to find information efficiently. Often the information and hypertext links on web pages are not organized into intuitive groups. Intuition implies that a person is able to use their knowledge of a paradigm to solve a problem. Intuitive groups are categories that allow web page users to find information by using their intuition or mental models of categories. In order to improve human factors engineers' efficiency in finding information on the World Wide Web, research was performed to develop a web site that serves as a tool for finding information effectively. The paper describes a methodology for designing a web site for a group of people who perform similar tasks in an organization.

  20. Bioprocess-Engineering Education with Web Technology

    NARCIS (Netherlands)

    Sessink, O.

    2006-01-01

    Development of learning material that is distributed through and accessible via the World Wide Web. Various options from web technology are exploited to improve the quality and efficiency of learning material.

  1. International use of an academic nephrology World Wide Web site: from medical information resource to business tool.

    Science.gov (United States)

    Abbott, Kevin C; Oliver, David K; Boal, Thomas R; Gadiyak, Grigorii; Boocks, Carl; Yuan, Christina M; Welch, Paul G; Poropatich, Ronald K

    2002-04-01

    Studies of the use of the World Wide Web to obtain medical knowledge have largely focused on patients. In particular, neither the international use of academic nephrology World Wide Web sites (websites) as primary information sources nor the use of search engines (and search strategies) to obtain medical information have been described. Visits ("hits") to the Walter Reed Army Medical Center (WRAMC) Nephrology Service website from April 30, 2000, to March 14, 2001, were analyzed for the location of originating source using Webtrends, and search engines (Google, Lycos, etc.) were analyzed manually for search strategies used. From April 30, 2000 to March 14, 2001, the WRAMC Nephrology Service website received 1,007,103 hits and 12,175 visits. These visits were from 33 different countries, and the most frequent regions were Western Europe, Asia, Australia, the Middle East, Pacific Islands, and South America. The most frequent organization using the site was the military Internet system, followed by America Online and automated search programs of online search engines, most commonly Google. The online lecture series was the most frequently visited section of the website. Search strategies used in search engines were extremely technical. The use of "robots" by standard Internet search engines to locate websites, which may be blocked by mandatory registration, has allowed users worldwide to access the WRAMC Nephrology Service website to answer very technical questions. This suggests that it is being used as an alternative to other primary sources of medical information and that the use of mandatory registration may hinder users from finding valuable sites. With current Internet technology, even a single service can become a worldwide information resource without sacrificing its primary customers.

  2. A comprehensive and cost-effective preparticipation exam implemented on the World Wide Web.

    Science.gov (United States)

    Peltz, J E; Haskell, W L; Matheson, G O

    1999-12-01

    Mandatory preparticipation examinations (PPE) are labor intensive, offer little routine health maintenance and are poor predictors of future injury or illness. Our objective was to develop a new PPE for the Stanford University varsity athletes that improved both quality of primary and preventive care and physician time efficiency. This PPE is based on the annual submission, by each athlete, of a comprehensive medical history questionnaire that is then summarized in a two-page report for the examining physician. The questionnaire was developed through a search of MEDLINE from 1966 to 1997, review of PPE from 11 other institutions, and discussion with two experts from each of seven main content areas: medical and musculoskeletal history, eating, menstrual and sleep disorders, stress and health risk behaviors. Content validity was assessed by 10 sports medicine physicians and four epidemiologists. It was then programmed for the World Wide Web (http://www.stanford.edu/dept/sportsmed/). The questionnaire demonstrated a 97 +/- 2% sensitivity in detecting positive responses requiring physician attention. Sixteen physicians administered the 1997/98 PPE; using the summary reports, 15 found improvement in their ability to provide overall medical care including health issues beyond clearance; 13 noted a decrease in time needed for each athlete exam. Over 90% of athletes who used the web site found it "easy" or "moderately easy" to access and complete. Initial assessment of this new PPE format shows good athlete compliance, improved exam efficiency and a strong increase in subjective physician satisfaction with the quality of screening and medical care provided. The data indicate a need for improvement of routine health maintenance in this population. The database offers opportunities to study trends, risk factors, and results of interventions.

  3. Cardiac Resynchronization Therapy Online: What Patients Find when Searching the World Wide Web.

    Science.gov (United States)

    Modi, Minal; Laskar, Nabila; Modi, Bhavik N

    2016-06-01

    To objectively assess the quality of information available on the World Wide Web on cardiac resynchronization therapy (CRT). Patients frequently search the internet regarding their healthcare issues. It has been shown that patients seeking information can help or hinder their healthcare outcomes depending on the quality of information consulted. On the internet, this information can be produced and published by anyone, resulting in the risk of patients accessing inaccurate and misleading information. The search term "Cardiac Resynchronisation Therapy" was entered into the three most popular search engines and the first 50 pages on each were pooled and analyzed, after excluding websites inappropriate for objective review. The "LIDA" instrument (a validated tool for assessing quality of healthcare information websites) was used to generate scores on Accessibility, Reliability, and Usability. Readability was assessed using the Flesch Reading Ease Score (FRES). Of the 150 web-links, 41 sites met the eligibility criteria. The sites were assessed using the LIDA instrument and the FRES. A mean total LIDA score for all the websites assessed was 123.5 of a possible 165 (74.8%). The average Accessibility of the sites assessed was 50.1 of 60 (84.3%), on Usability 41.4 of 54 (76.6%), on Reliability 31.5 of 51 (61.7%), and 41.8 on FRES. There was a significant variability among sites and interestingly, there was no correlation between the sites' search engine ranking and their scores. This study has illustrated the variable quality of online material on the topic of CRT. Furthermore, there was also no apparent correlation between highly ranked, popular websites and their quality. Healthcare professionals should be encouraged to guide their patients toward the online material that contains reliable information. © 2016 Wiley Periodicals, Inc.

  4. [Preliminary construction of three-dimensional visual educational system for clinical dentistry based on world wide web webpage].

    Science.gov (United States)

    Hu, Jian; Xu, Xiang-yang; Song, En-min; Tan, Hong-bao; Wang, Yi-ning

    2009-09-01

    To establish a new virtual reality-based visual educational system for clinical dentistry built on World Wide Web (WWW) webpages, in order to provide more three-dimensional multimedia resources to dental students and an online three-dimensional consulting system for patients. Based on computer graphics and three-dimensional webpage technologies, the software packages 3Dsmax and Webmax were adopted for system development. In the Windows environment, the architecture of the whole system was established step by step, including three-dimensional model construction, three-dimensional scene setup, transplanting the three-dimensional scene into the webpage, re-editing the virtual scene, realization of interactions within the webpage, initial testing, and necessary adjustment. Five three-dimensional interactive webpages for clinical dentistry were completed. The three-dimensional interactive webpages could be accessed through a web browser on a personal computer, and users could interact with the webpage by rotating, panning and zooming the virtual scene. It is technically feasible to implement a virtual reality-based visual educational system for clinical dentistry on WWW webpages. Information related to clinical dentistry can be transmitted properly, visually and interactively through three-dimensional webpages.

  5. Tracing agents and other automatic sampling procedures for the World Wide Web

    OpenAIRE

    Aguillo, Isidro F.

    1999-01-01

    Many of the search engines and recovery tools are not suitable for drawing samples of web resources for quantitative analysis. The increasing size of the web and its hypertextual nature offer opportunities for a novel approach. A new generation of recovery tools that trace hypertext links from selected sites is very promising, offering capabilities to automate tasks; to extract large samples of high pertinence, ready to use in standard database formats; and to select additional resour...

  6. Oxyacetylene Welding and Oxyfuel Cutting. Third Edition. Teacher Edition [and] Student Edition [and] Student Workbook.

    Science.gov (United States)

    Knapp, John; Harper, Eddie

    This Oklahoma curriculum guide, which includes a teacher edition, a student edition, and a student workbook, provides three units for a course on oxyacetylene welding, oxyfuel cutting, and cutting done with alternative fuels such as MAPP, propane, and natural gas. The three units are: "Oxyacetylene Welding"; "Oxyfuel Cutting";…

  7. Blueprint of a Cross-Lingual Web Retrieval Collection

    NARCIS (Netherlands)

    Sigurbjörnsson, B.; Kamps, J.; de Rijke, M.; van Zwol, R.

    2005-01-01

    The world wide web is a natural setting for cross-lingual information retrieval; web content is essentially multilingual, and web searchers are often polyglots. Even though English has emerged as the lingua franca of the web, planning for a business trip or holiday usually involves digesting pages

  8. Semantic Advertising for Web 3.0

    Science.gov (United States)

    Thomas, Edward; Pan, Jeff Z.; Taylor, Stuart; Ren, Yuan; Jekjantuk, Nophadol; Zhao, Yuting

    Advertising on the World Wide Web is based around automatically matching web pages with appropriate advertisements, in the form of banner ads, interactive adverts, or text links. Traditionally this has been done by manual classification of pages, or more recently using information retrieval techniques to find the most important keywords from the page, and match these to keywords being used by adverts. In this paper, we propose a new model for online advertising, based around lightweight embedded semantics. This will improve the relevancy of adverts on the World Wide Web and help to kick-start the use of RDFa as a mechanism for adding lightweight semantic attributes to the Web. Furthermore, we propose a system architecture for the proposed new model, based on our scalable ontology reasoning infrastructure TrOWL.
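
    The lightweight embedded semantics the authors propose can be pictured with a small sketch: harvest RDFa-style property/content pairs from a page and hand them to the ad matcher instead of extracted keywords. The Python fragment below is an illustration only (the HTML snippet and the schema.org-style vocabulary are invented); it is not the TrOWL-based system described in the paper.

      # A minimal sketch (not the authors' system) of how lightweight RDFa
      # annotations embedded in a page could be harvested for ad matching.
      # The HTML snippet and vocabulary below are illustrative assumptions.
      from html.parser import HTMLParser

      class RDFaHarvester(HTMLParser):
          """Collect (property, content) pairs from RDFa-style attributes."""
          def __init__(self):
              super().__init__()
              self.pairs = []

          def handle_starttag(self, tag, attrs):
              a = dict(attrs)
              if "property" in a and "content" in a:
                  self.pairs.append((a["property"], a["content"]))

      html = """
      <div vocab="http://schema.org/" typeof="Product">
        <span property="name" content="Trail running shoes"></span>
        <span property="category" content="sports/footwear"></span>
      </div>
      """

      harvester = RDFaHarvester()
      harvester.feed(html)
      # Adverts tagged with matching properties could now be ranked against these pairs.
      print(harvester.pairs)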

  9. Web document clustering using hyperlink structures

    Energy Technology Data Exchange (ETDEWEB)

    He, Xiaofeng; Zha, Hongyuan; Ding, Chris H.Q; Simon, Horst D.

    2001-05-07

    With the exponential growth of information on the World Wide Web, there is great demand for developing efficient and effective methods for organizing and retrieving the information available. Document clustering plays an important role in information retrieval and taxonomy management for the World Wide Web and remains an interesting and challenging problem in the field of web computing. In this paper we consider document clustering methods exploring textual information, hyperlink structure and co-citation relations. In particular, we apply the normalized cut clustering method developed in computer vision to the task of hyperdocument clustering. We also explore some theoretical connections of the normalized-cut method to the K-means method. We then experiment with the normalized-cut method in the context of clustering query result sets for web search engines.
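
    A rough two-way illustration of the normalized-cut idea applied to a hyperlink graph is sketched below: documents are partitioned by the sign of the second eigenvector of the symmetrically normalized graph Laplacian. The six-document link matrix is invented for the example, and this is a textbook spectral approximation rather than the authors' implementation.

      # Toy sketch: split a small hyperlink graph into two clusters using the
      # sign of the second eigenvector of the normalized Laplacian.
      import numpy as np

      # Symmetric co-citation/link weights between 6 hypothetical documents.
      W = np.array([
          [0, 3, 2, 0, 0, 0],
          [3, 0, 3, 0, 0, 0],
          [2, 3, 0, 1, 0, 0],
          [0, 0, 1, 0, 2, 3],
          [0, 0, 0, 2, 0, 2],
          [0, 0, 0, 3, 2, 0],
      ], dtype=float)

      d = W.sum(axis=1)
      D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
      L_sym = np.eye(len(d)) - D_inv_sqrt @ W @ D_inv_sqrt   # normalized Laplacian

      eigvals, eigvecs = np.linalg.eigh(L_sym)
      fiedler = eigvecs[:, 1]                 # second-smallest eigenvector
      clusters = (fiedler > 0).astype(int)    # 2-way partition approximating the normalized cut
      print(clusters)                         # e.g. [0 0 0 1 1 1]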

  10. Multimedia radiology self-learning course on the world wide web

    International Nuclear Information System (INIS)

    Sim, Jung Suk; Kim, Jong Hyo; Kim, Tae Kyoung; Han, Joon Koo; Kang, Heung Sik; Yeon, Kyung Mo; Han, Man Chung

    1997-01-01

    The creation and maintenance of radiology teaching materials is both laborious and very time-consuming, but is important at a teaching hospital. Through the use of the technology offered by today's world wide web, this problem can be efficiently solved, and on this basis we devised a multimedia radiology self-learning course for abdominal ultrasound and CT. A combination of video and audio tapes has been used as teaching material; the authors digitized and converted these to Hypertext Mark-up Language (HTML) format. Films were digitized with a digital camera and compressed to Joint Photographic Experts Group (JPEG) format, while audio tapes were digitized with a sound recorder and compressed to RealAudio format. Multimedia on the world wide web will facilitate easy management and maintenance of a self-learning course. To make this more suitable for practical use, continual upgrading on the basis of experience is needed. (author). 3 refs., 4 figs

  11. Mojo Hand, a TALEN design tool for genome editing applications.

    Science.gov (United States)

    Neff, Kevin L; Argue, David P; Ma, Alvin C; Lee, Han B; Clark, Karl J; Ekker, Stephen C

    2013-01-16

    Recent studies of transcription activator-like (TAL) effector domains fused to nucleases (TALENs) demonstrate enormous potential for genome editing. Effective design of TALENs requires a combination of selecting appropriate genetic features, finding pairs of binding sites based on a consensus sequence, and, in some cases, identifying endogenous restriction sites for downstream molecular genetic applications. We present the web-based program Mojo Hand for designing TAL and TALEN constructs for genome editing applications (http://www.talendesign.org). We describe the algorithm and its implementation. The features of Mojo Hand include (1) automatic download of genomic data from the National Center for Biotechnology Information, (2) analysis of any DNA sequence to reveal pairs of binding sites based on a user-defined template, (3) selection of restriction-enzyme recognition sites in the spacer between the TAL monomer binding sites, including options for the selection of restriction enzyme suppliers, and (4) output files designed for subsequent TALEN construction using the Golden Gate assembly method. Mojo Hand enables the rapid identification of TAL binding sites for use in TALEN design. The assembly of TALEN constructs is also simplified by using the TAL-site prediction program in conjunction with a spreadsheet aid for managing reagent concentrations and TALEN formulation. Mojo Hand enables scientists to more rapidly deploy TALENs for genome editing applications.
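
    The binding-site pairing that Mojo Hand automates can be illustrated with a deliberately simplified sketch (not the Mojo Hand algorithm): each monomer site is assumed to be preceded by a 5' T, the two sites flank a spacer of roughly 14-18 bp, and the spacer is scanned for a restriction site usable in downstream screening. The site length, spacer range and choice of EcoRI are illustrative assumptions.

      # Simplified, hypothetical TALEN pair finding in the spirit of the tool above.
      def revcomp(seq):
          return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

      def find_talen_pairs(seq, site_len=15, spacer_range=(14, 19), enzyme="GAATTC"):
          pairs = []
          for i in range(len(seq) - site_len):
              if seq[i] != "T":                      # left monomer needs a preceding T
                  continue
              left = seq[i + 1:i + 1 + site_len]
              for spacer_len in range(*spacer_range):
                  j = i + 1 + site_len + spacer_len  # start of right site on the sense strand
                  if j + site_len + 1 > len(seq):
                      break
                  spacer = seq[i + 1 + site_len:j]
                  right_plus = seq[j:j + site_len + 1]
                  if right_plus[-1] != "A":          # preceding T on the reverse strand
                      continue
                  right = revcomp(right_plus[:-1])   # right monomer target, bottom strand
                  if enzyme in spacer:               # restriction site available for screening
                      pairs.append((left, spacer, right))
          return pairs

      demo = "GG" + "T" + "ACGTACGTACGTACG" + "AAGAATTCTTAAGG" + "CCGGTTAACCGGTTA" + "A" + "GG"
      print(find_talen_pairs(demo))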

  12. Mojo Hand, a TALEN design tool for genome editing applications

    Directory of Open Access Journals (Sweden)

    Neff Kevin L

    2013-01-01

    Full Text Available Abstract Background Recent studies of transcription activator-like (TAL) effector domains fused to nucleases (TALENs) demonstrate enormous potential for genome editing. Effective design of TALENs requires a combination of selecting appropriate genetic features, finding pairs of binding sites based on a consensus sequence, and, in some cases, identifying endogenous restriction sites for downstream molecular genetic applications. Results We present the web-based program Mojo Hand for designing TAL and TALEN constructs for genome editing applications (http://www.talendesign.org). We describe the algorithm and its implementation. The features of Mojo Hand include (1) automatic download of genomic data from the National Center for Biotechnology Information, (2) analysis of any DNA sequence to reveal pairs of binding sites based on a user-defined template, (3) selection of restriction-enzyme recognition sites in the spacer between the TAL monomer binding sites, including options for the selection of restriction enzyme suppliers, and (4) output files designed for subsequent TALEN construction using the Golden Gate assembly method. Conclusions Mojo Hand enables the rapid identification of TAL binding sites for use in TALEN design. The assembly of TALEN constructs is also simplified by using the TAL-site prediction program in conjunction with a spreadsheet aid for managing reagent concentrations and TALEN formulation. Mojo Hand enables scientists to more rapidly deploy TALENs for genome editing applications.

  13. Sign Language Web Pages

    Science.gov (United States)

    Fels, Deborah I.; Richards, Jan; Hardman, Jim; Lee, Daniel G.

    2006-01-01

    The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, accessing the Web can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The…

  14. Hacking web intelligence open source intelligence and web reconnaissance concepts and techniques

    CERN Document Server

    Chauhan, Sudhanshu

    2015-01-01

    Open source intelligence (OSINT) and web reconnaissance are rich topics for infosec professionals looking for the best ways to sift through the abundance of information widely available online. In many cases, the first stage of any security assessment, that is, reconnaissance, is not given enough attention by security professionals, hackers, and penetration testers. Often, the information openly present is as critical as the confidential data. Hacking Web Intelligence shows you how to dig into the Web and uncover the information many don't even know exists. The book takes a holistic approach

  15. WebQuests: Are They Developmentally Appropriate?

    Science.gov (United States)

    Maddux, Cleborne D.; Cummings, Rhoda

    2007-01-01

    A topic that currently is receiving a great deal of attention by educators is the nature and use of WebQuests--computer-based activities that guide student learning through use of the World Wide Web (Sharp 2004). Despite their popularity, questions remain about the effectiveness with which WebQuests are being used with students. This article…

  16. A Survey On Various Web Template Detection And Extraction Methods

    Directory of Open Access Journals (Sweden)

    Neethu Mary Varghese

    2015-03-01

    Full Text Available Abstract In today's digital world, reliance on the World Wide Web as a source of information is extensive. Users increasingly rely on web-based search engines to provide accurate search results on a wide range of topics that interest them. The search engines in turn parse the vast repository of web pages, searching for relevant information. However, the majority of web portals are built using web templates, which are designed to provide a consistent look and feel to end users. The presence of these templates can influence search results, leading to inaccurate results being delivered to the users. Therefore, to improve the accuracy and reliability of search results, identification and removal of web templates from the actual content is essential. A wide range of approaches is commonly employed to achieve this, and this paper focuses on the study of the various approaches to template detection and extraction that can be applied across homogeneous as well as heterogeneous web pages.
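
    One of the simplest ideas in this family of approaches treats blocks of text that recur on most pages of a site as template and the remainder as content. The sketch below is an illustrative frequency heuristic built on that assumption, with invented example pages; it does not reproduce any specific method surveyed in the paper.

      # Frequency-based template detection sketch: blocks shared by most pages
      # of a site are flagged as template, the rest is treated as content.
      from collections import Counter

      def split_blocks(page):
          """Very naive block segmentation: one block per non-empty line."""
          return [line.strip() for line in page.splitlines() if line.strip()]

      def detect_template(pages, threshold=0.8):
          counts = Counter(block for page in pages for block in set(split_blocks(page)))
          cutoff = threshold * len(pages)
          return {block for block, n in counts.items() if n >= cutoff}

      pages = [
          "Acme Books - menu\nBestsellers of 2015\nfooter: contact us",
          "Acme Books - menu\nNew arrivals in March\nfooter: contact us",
          "Acme Books - menu\nAuthor interview: J. Doe\nfooter: contact us",
      ]
      template = detect_template(pages)
      content = [b for b in split_blocks(pages[0]) if b not in template]
      print(template)   # the shared menu and footer lines
      print(content)    # ['Bestsellers of 2015']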

  17. Minimalist instruction for learning to search the World Wide Web

    NARCIS (Netherlands)

    Lazonder, Adrianus W.

    2001-01-01

    This study examined the efficacy of minimalist instruction to develop self-regulatory skills involved in Web searching. Two versions of minimalist self-regulatory skill instruction were compared to a control group that was merely taught procedural skills to operate the search engine. Acquired skills

  18. A Technique to Speedup Access to Web Contents

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 7; Issue 7. Web Caching - A Technique to Speedup Access to Web Contents. Harsha Srinath Shiva Shankar Ramanna. General Article Volume 7 Issue 7 July 2002 pp 54-62 ... Keywords. World wide web; data caching; internet traffic; web page access.

  19. World-Wide Web Tools for Locating Planetary Images

    Science.gov (United States)

    Kanefsky, Bob; Deiss, Ron (Technical Monitor)

    1995-01-01

    The explosive growth of the World-Wide Web (WWW) in the past year has made it feasible to provide interactive graphical tools to assist scientists in locating planetary images. The highest available resolution images of any site of interest can be quickly found on a map or plot, and, if online, displayed immediately on nearly any computer equipped with a color screen, an Internet connection, and any of the free WWW browsers. The same tools may also be of interest to educators, students, and the general public. Image finding tools have been implemented covering most of the solar system: Earth, Mars, and the moons and planets imaged by Voyager. The Mars image-finder, which plots the footprints of all the high-resolution Viking Orbiter images and can be used to display any that are available online, also contains a complete scrollable atlas and hypertext gazetteer to help locate areas. The Earth image-finder is linked to thousands of Shuttle images stored at NASA/JSC, and displays them as red dots on a globe. The Voyager image-finder plots images as dots, by longitude and apparent target size, linked to online images. The locator (URL) for the top-level page is http://ic-www.arc.nasa.gov/ic/projects/bayes-group/Atlas/. Through the efforts of the Planetary Data System and other organizations, hundreds of thousands of planetary images are now available on CD-ROM, and many of these have been made available on the WWW. However, locating images of a desired site is still problematic in practice. For example, many scientists studying Mars use digital image maps, which are one third the resolution of Viking Orbiter survey images. When they do use Viking Orbiter images, they often work with photographically printed hardcopies, which lack the flexibility of digital images: magnification, contrast stretching, and other basic image-processing techniques offered by off-the-shelf software. From the perspective of someone working on an experimental image processing technique for

  20. Spinning the web of knowledge

    CERN Multimedia

    Knight, Matthew

    2007-01-01

    "On August 6, 1991, Tim Berners-Lee posted the World Wide Web's first Web site. Fifteen years on there are estimated to be over 100 million. The space of growth has happened at a bewildering rate and its success has even confounded its inventor." (1/2 page)

  1. Caught in the Web

    International Nuclear Information System (INIS)

    Gillies, James

    1995-01-01

    The World-Wide Web may have taken the Internet by storm, but many people would be surprised to learn that it owes its existence to CERN. Around half the world's particle physicists come to CERN for their experiments, and the Web is the result of their need to share information quickly and easily on a global scale. Six years after Tim Berners-Lee's inspired idea to marry hypertext to the Internet in 1989, CERN is handing over future Web development to the World-Wide Web Consortium, run by the French National Institute for Research in Computer Science and Control, INRIA, and the Laboratory for Computer Science of the Massachusetts Institute of Technology, MIT, leaving itself free to concentrate on physics. The Laboratory marked this transition with a conference designed to give a taste of what the Web can do, whilst firmly stamping it with the label ''Made in CERN''. Over 200 European journalists and educationalists came to CERN on 8 - 9 March for the World-Wide Web Days, resulting in wide media coverage. The conference was opened by UK Science Minister David Hunt who stressed the importance of fundamental research in generating new ideas. ''Who could have guessed 10 years ago'', he said, ''that particle physics research would lead to a communication system which would allow every school to have the biggest library in the world in a single computer?''. In his introduction, the Minister also pointed out that ''CERN and other basic research laboratories help to break new technological ground and sow the seeds of what will become mainstream manufacturing in the future.'' Learning the jargon is often the hardest part of coming to grips with any new invention, so CERN put it at the top of the agenda. Jacques Altaber, who helped introduce the Internet to CERN in the early 1980s, explained that without the Internet, the Web couldn't exist. The Internet began as a US Defense

  2. THE REACH AND RICHNESS OF WIKINOMICS: IS THE FREE WEB-BASED ENCYCLOPEDIA WIKIPEDIA ONLY FOR RICH COUNTRIES?

    DEFF Research Database (Denmark)

    Rask, Morten

    2007-01-01

    In this article, a model of the patterns of correlation in Wikipedia, reach and richness, lays the foundation for studying whether the free Web-based encyclopedia Wikipedia is only for developed countries. Based on data from 12 different Wikipedia language editions, the author finds that the central structural effect is on the level of human development in the current country. In other words, Wikipedia is in general more for rich countries than for less developed countries. It is suggested that policy makers make investments in increasing the general level of literacy, education, and standard

  3. Web 3.0: implicaciones educativas

    OpenAIRE

    Grupo TACE. Tecnologías Aplicadas a las Ciencias de la Educación

    2012-01-01

    Web 3.0 is considered the stage that follows Web 2.0, or the social Web. It is not yet an unambiguous term and usually appears linked to the semantic web. It is an extension of the World Wide Web that makes it possible to express natural language and also to use a language that can be understood, interpreted and used by software agents, allowing information to be found, shared and integrated more easily. It is a new cycle in which artificial intelligence is combined with the capacity of...

  4. Re-editing the paradigm of Cytidine (C) to Uridine (U) RNA editing.

    Science.gov (United States)

    Fossat, Nicolas; Tam, Patrick P L

    2014-01-01

    Cytidine (C) to Uridine (U) RNA editing is a post-transcriptional modification that until recently was known to only affect Apolipoprotein b (Apob) RNA and to minimally require 2 components of the C to U editosome, the deaminase APOBEC1 and the RNA-binding protein A1CF. Our latest work has identified a novel RNA-binding protein, RBM47, as a core component of the editosome, which can substitute for A1CF in the editing of Apob mRNA. In addition, new RNA species that are subjected to C to U editing have been identified. Here, we highlight these recent discoveries and discuss how they change our view of the composition of the C to U editing machinery and expand our knowledge of the functional attributes of C to U RNA editing.

  5. Efficient Oligo nucleotide mediated CRISPR-Cas9 Gene Editing in Aspergilli

    DEFF Research Database (Denmark)

    Nødvig, Christina Spuur; Hoof, Jakob Blæsbjerg; Kogle, Martin Engelhard

    2018-01-01

    CRISPR-Cas9 technologies are revolutionizing fungal gene editing. Here we show that survival of specific Cas9/sgRNA mediated DNA double strand breaks (DSBs) depends on the non-homologous end-joining (NHEJ) DNA repair pathway, and we use this observation to develop a tool to assess protospacer .... niger, and in A. oryzae, indicating that this type of repair may be widespread in filamentous fungi. Importantly, we demonstrate that by using single-stranded oligonucleotides for CRISPR-Cas9 mediated gene editing it is possible to introduce specific point mutations as well as gene deletions...

  6. Turkish University Students’ Perceptions of the World Wide Web as a Learning Tool: An Investigation Based on Gender, Socio-Economic Background, and Web Experience

    Directory of Open Access Journals (Sweden)

    Erkan Tekinarslan

    2009-04-01

    Full Text Available The main purpose of the study is to investigate Turkish undergraduate students' perceptions of the Web as a learning tool and to analyze whether their perceptions differ significantly based on gender, socio-economic background, and Web experience. Data obtained from 722 undergraduate students (331 males and 391 females) were used in the analyses. The findings indicated significant differences based on gender, socio-economic background, and Web experience. The students from higher socio-economic backgrounds indicated significantly higher attitude scores on the self-efficacy subscale of the Web attitude scale. Similarly, the male students indicated significantly higher scores on the self-efficacy subscale than the females. Also, the students with higher Web experience in terms of usage frequency indicated higher scores on all subscales (i.e., self-efficacy, affective, usefulness, Web-based learning). Moreover, the two-way ANOVA results indicated that the students' PC ownership has significant main effects on their Web attitudes and on the usefulness, self-efficacy, and affective subscales.

  7. Ebola virus RNA editing depends on the primary editing site sequence and an upstream secondary structure.

    Directory of Open Access Journals (Sweden)

    Masfique Mehedi

    Full Text Available Ebolavirus (EBOV), the causative agent of a severe hemorrhagic fever and a biosafety level 4 pathogen, increases its genome coding capacity by producing multiple transcripts encoding for structural and nonstructural glycoproteins from a single gene. This is achieved through RNA editing, during which non-template adenosine residues are incorporated into the EBOV mRNAs at an editing site encoding for 7 adenosine residues. However, the mechanism of EBOV RNA editing is currently not understood. In this study, we report for the first time that minigenomes containing the glycoprotein gene editing site can undergo RNA editing, thereby eliminating the requirement for a biosafety level 4 laboratory to study EBOV RNA editing. Using a newly developed dual-reporter minigenome, we have characterized the mechanism of EBOV RNA editing, and have identified cis-acting sequences that are required for editing, located between 9 nt upstream and 9 nt downstream of the editing site. Moreover, we show that a secondary structure in the upstream cis-acting sequence plays an important role in RNA editing. EBOV RNA editing is glycoprotein gene-specific, as a stretch encoding for 7 adenosine residues located in the viral polymerase gene did not serve as an editing site, most likely due to an absence of the necessary cis-acting sequences. Finally, the EBOV protein VP30 was identified as a trans-acting factor for RNA editing, constituting a novel function for this protein. Overall, our results provide novel insights into the RNA editing mechanism of EBOV, further understanding of which might result in novel intervention strategies against this viral pathogen.

  8. Ebola virus RNA editing depends on the primary editing site sequence and an upstream secondary structure.

    Science.gov (United States)

    Mehedi, Masfique; Hoenen, Thomas; Robertson, Shelly; Ricklefs, Stacy; Dolan, Michael A; Taylor, Travis; Falzarano, Darryl; Ebihara, Hideki; Porcella, Stephen F; Feldmann, Heinz

    2013-01-01

    Ebolavirus (EBOV), the causative agent of a severe hemorrhagic fever and a biosafety level 4 pathogen, increases its genome coding capacity by producing multiple transcripts encoding for structural and nonstructural glycoproteins from a single gene. This is achieved through RNA editing, during which non-template adenosine residues are incorporated into the EBOV mRNAs at an editing site encoding for 7 adenosine residues. However, the mechanism of EBOV RNA editing is currently not understood. In this study, we report for the first time that minigenomes containing the glycoprotein gene editing site can undergo RNA editing, thereby eliminating the requirement for a biosafety level 4 laboratory to study EBOV RNA editing. Using a newly developed dual-reporter minigenome, we have characterized the mechanism of EBOV RNA editing, and have identified cis-acting sequences that are required for editing, located between 9 nt upstream and 9 nt downstream of the editing site. Moreover, we show that a secondary structure in the upstream cis-acting sequence plays an important role in RNA editing. EBOV RNA editing is glycoprotein gene-specific, as a stretch encoding for 7 adenosine residues located in the viral polymerase gene did not serve as an editing site, most likely due to an absence of the necessary cis-acting sequences. Finally, the EBOV protein VP30 was identified as a trans-acting factor for RNA editing, constituting a novel function for this protein. Overall, our results provide novel insights into the RNA editing mechanism of EBOV, further understanding of which might result in novel intervention strategies against this viral pathogen.

  9. Nuclease Target Site Selection for Maximizing On-target Activity and Minimizing Off-target Effects in Genome Editing

    Science.gov (United States)

    Lee, Ciaran M; Cradick, Thomas J; Fine, Eli J; Bao, Gang

    2016-01-01

    The rapid advancement in targeted genome editing using engineered nucleases such as ZFNs, TALENs, and CRISPR/Cas9 systems has resulted in a suite of powerful methods that allows researchers to target any genomic locus of interest. A complementary set of design tools has been developed to aid researchers with nuclease design, target site selection, and experimental validation. Here, we review the various tools available for target selection in designing engineered nucleases, and for quantifying nuclease activity and specificity, including web-based search tools and experimental methods. We also elucidate challenges in target selection, especially in predicting off-target effects, and discuss future directions in precision genome editing and its applications. PMID:26750397
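
    One ingredient of the target selection such tools perform can be sketched concretely: enumerate candidate 20-nt protospacers followed by an NGG PAM on the forward strand, as SpCas9 requires. The toy example below does only that; real design tools like those reviewed here additionally score on-target activity and search the genome for off-target matches, neither of which is attempted.

      # Minimal sketch of CRISPR/Cas9 target enumeration on the forward strand.
      import re

      def find_cas9_targets(seq, guide_len=20):
          targets = []
          for m in re.finditer(r"(?=([ACGT]GG))", seq):   # every NGG, allowing overlaps
              pam_start = m.start(1)
              if pam_start >= guide_len:
                  protospacer = seq[pam_start - guide_len:pam_start]
                  targets.append((protospacer, seq[pam_start:pam_start + 3]))
          return targets

      demo = "TTGACCTGAAACTGGGTCCATACGATCGGAATTACGGCTA"
      for protospacer, pam in find_cas9_targets(demo):
          print(protospacer, pam)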

  10. The Semantic Web and Educational Technology

    Science.gov (United States)

    Maddux, Cleborne D., Ed.

    2008-01-01

    The "Semantic Web" is an idea proposed by Tim Berners-Lee, the inventor of the "World Wide Web." The topic has been generating a great deal of interest and enthusiasm, and there is a rapidly growing body of literature dealing with it. This article attempts to explain how the Semantic Web would work, and explores short-term and long-term…

  11. Requirements of a security framework for the semantic web

    CSIR Research Space (South Africa)

    Mbaya, IR

    2009-02-01

    Full Text Available The vision of the Semantic Web is to provide the World Wide Web with the ability to automate, interoperate and reason about resources and services on the Web. However, the autonomous, dynamic, open, distributed and heterogeneous nature of the Semantic Web...

  12. FlaME: Flash Molecular Editor - a 2D structure input tool for the web

    Directory of Open Access Journals (Sweden)

    Dallakian Pavel

    2011-02-01

    Full Text Available Abstract Background So far, there have been no Flash-based web tools available for chemical structure input. The authors herein present a feasibility study, aiming at the development of a compact and easy-to-use 2D structure editor, using Adobe's Flash technology and its programming language, ActionScript. As a reference model application from the Java world, we selected the Java Molecular Editor (JME). In this feasibility study, we made an attempt to realize a subset of JME's functionality in the Flash Molecular Editor (FlaME) utility. These basic capabilities are: structure input, editing and depiction of single molecules, data import and export in molfile format. Implementation The result of molecular diagram sketching in FlaME is accessible in V2000 molfile format. By integrating the molecular editor into a web page, its communication with the HTML elements on this page is established using the two JavaScript functions, getMol() and setMol(). In addition, structures can be copied to the system clipboard. Conclusion A first attempt was made to create a compact single-file application for 2D molecular structure input/editing on the web, based on Flash technology. With the application examples presented in this article, it could be demonstrated that the Flash methods are principally well-suited to provide the requisite communication between the Flash object (application) and the HTML elements on a web page, using JavaScript functions.

  13. Programming NET Web Services

    CERN Document Server

    Ferrara, Alex

    2007-01-01

    Web services are poised to become a key technology for a wide range of Internet-enabled applications, spanning everything from straight B2B systems to mobile devices and proprietary in-house software. While there are several tools and platforms that can be used for building web services, developers are finding a powerful tool in Microsoft's .NET Framework and Visual Studio .NET. Designed from scratch to support the development of web services, the .NET Framework simplifies the process--programmers find that tasks that took an hour using the SOAP Toolkit take just minutes. Programming .NET

  14. Understanding Editing Behaviors in Multilingual Wikipedia.

    Science.gov (United States)

    Kim, Suin; Park, Sungjoon; Hale, Scott A; Kim, Sooyoung; Byun, Jeongmin; Oh, Alice H

    2016-01-01

    Multilingualism is common offline, but we have a more limited understanding of the ways multilingualism is displayed online and the roles that multilinguals play in the spread of content between speakers of different languages. We take a computational approach to studying multilingualism using one of the largest user-generated content platforms, Wikipedia. We study multilingualism by collecting and analyzing a large dataset of the content written by multilingual editors of the English, German, and Spanish editions of Wikipedia. This dataset contains over two million paragraphs edited by over 15,000 multilingual users from July 8 to August 9, 2013. We analyze these multilingual editors in terms of their engagement, interests, and language proficiency in their primary and non-primary (secondary) languages and find that the English edition of Wikipedia displays different dynamics from the Spanish and German editions. Users primarily editing the Spanish and German editions make more complex edits than users who edit these editions as a second language. In contrast, users editing the English edition as a second language make edits that are just as complex as the edits by users who primarily edit the English edition. In this way, English serves a special role bringing together content written by multilinguals from many language editions. Nonetheless, language remains a formidable hurdle to the spread of content: we find evidence for a complexity barrier whereby editors are less likely to edit complex content in a second language. In addition, we find that multilinguals are less engaged and show lower levels of language proficiency in their second languages. We also examine the topical interests of multilingual editors and find that there is no significant difference between primary and non-primary editors in each language.

  15. Understanding Editing Behaviors in Multilingual Wikipedia.

    Directory of Open Access Journals (Sweden)

    Suin Kim

    Full Text Available Multilingualism is common offline, but we have a more limited understanding of the ways multilingualism is displayed online and the roles that multilinguals play in the spread of content between speakers of different languages. We take a computational approach to studying multilingualism using one of the largest user-generated content platforms, Wikipedia. We study multilingualism by collecting and analyzing a large dataset of the content written by multilingual editors of the English, German, and Spanish editions of Wikipedia. This dataset contains over two million paragraphs edited by over 15,000 multilingual users from July 8 to August 9, 2013. We analyze these multilingual editors in terms of their engagement, interests, and language proficiency in their primary and non-primary (secondary) languages and find that the English edition of Wikipedia displays different dynamics from the Spanish and German editions. Users primarily editing the Spanish and German editions make more complex edits than users who edit these editions as a second language. In contrast, users editing the English edition as a second language make edits that are just as complex as the edits by users who primarily edit the English edition. In this way, English serves a special role bringing together content written by multilinguals from many language editions. Nonetheless, language remains a formidable hurdle to the spread of content: we find evidence for a complexity barrier whereby editors are less likely to edit complex content in a second language. In addition, we find that multilinguals are less engaged and show lower levels of language proficiency in their second languages. We also examine the topical interests of multilingual editors and find that there is no significant difference between primary and non-primary editors in each language.

  16. Discovering More Accurate Frequent Web Usage Patterns

    OpenAIRE

    Bayir, Murat Ali; Toroslu, Ismail Hakki; Cosar, Ahmet; Fidan, Guven

    2008-01-01

    Web usage mining is a type of web mining, which exploits data mining techniques to discover valuable information from navigation behavior of World Wide Web users. As in classical data mining, data preparation and pattern discovery are the main issues in web usage mining. The first phase of web usage mining is the data processing phase, which includes the session reconstruction operation from server logs. Session reconstruction success directly affects the quality of the frequent patterns disc...
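
    Session reconstruction, the preprocessing step the abstract highlights, is commonly approximated with an inactivity timeout. The sketch below is a minimal illustration under that assumption (a 30-minute timeout and an invented log-record layout); it is not the reconstruction procedure evaluated in the paper.

      # Timeout-based session reconstruction from (user, timestamp, url) records.
      from datetime import datetime, timedelta

      TIMEOUT = timedelta(minutes=30)

      def reconstruct_sessions(records):
          """records: iterable of (user_id, timestamp, url) sorted by timestamp."""
          sessions = {}          # user_id -> list of sessions, each a list of urls
          last_seen = {}         # user_id -> timestamp of previous request
          for user, ts, url in records:
              if user not in sessions or ts - last_seen[user] > TIMEOUT:
                  sessions.setdefault(user, []).append([])   # open a new session
              sessions[user][-1].append(url)
              last_seen[user] = ts
          return sessions

      log = [
          ("10.0.0.7", datetime(2008, 5, 1, 9, 0),  "/index.html"),
          ("10.0.0.7", datetime(2008, 5, 1, 9, 10), "/products.html"),
          ("10.0.0.7", datetime(2008, 5, 1, 11, 0), "/index.html"),   # new session after a gap
      ]
      print(reconstruct_sessions(log))
      # {'10.0.0.7': [['/index.html', '/products.html'], ['/index.html']]}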

  17. Using EMBL-EBI Services via Web Interface and Programmatically via Web Services.

    Science.gov (United States)

    Lopez, Rodrigo; Cowley, Andrew; Li, Weizhong; McWilliam, Hamish

    2014-12-12

    The European Bioinformatics Institute (EMBL-EBI) provides access to a wide range of databases and analysis tools that are of key importance in bioinformatics. As well as providing Web interfaces to these resources, Web Services are available using SOAP and REST protocols that enable programmatic access to our resources and allow their integration into other applications and analytical workflows. This unit describes the various options available to a typical researcher or bioinformatician who wishes to use our resources via Web interface or programmatically via a range of programming languages. Copyright © 2014 John Wiley & Sons, Inc.
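
    The programmatic access pattern the unit covers, submitting a job over REST, polling its status and fetching the result, is sketched generically below. The base URL, parameter names and status strings are placeholders rather than the actual EMBL-EBI endpoints; the real services and their parameters are documented on the EMBL-EBI site.

      # Generic submit/poll/fetch sketch for a job-based REST service.
      # The URL and fields are hypothetical; consult the service documentation.
      import time
      import urllib.parse
      import urllib.request

      BASE = "https://example.org/tools/rest/some_tool"   # placeholder service

      def submit(sequence, email):
          data = urllib.parse.urlencode({"sequence": sequence, "email": email}).encode()
          with urllib.request.urlopen(f"{BASE}/run", data=data) as r:
              return r.read().decode().strip()             # job identifier

      def wait_and_fetch(job_id, poll_seconds=5, max_polls=60):
          for _ in range(max_polls):
              with urllib.request.urlopen(f"{BASE}/status/{job_id}") as r:
                  status = r.read().decode().strip()
              if status == "FINISHED":
                  with urllib.request.urlopen(f"{BASE}/result/{job_id}/out") as r:
                      return r.read().decode()
              time.sleep(poll_seconds)
          raise TimeoutError(f"job {job_id} did not finish")

      # job_id = submit("MKTAYIAKQR", "user@example.org")
      # print(wait_and_fetch(job_id))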

  18. Technical Evaluation Report 61: The World-Wide Inaccessible Web, Part 2: Internet routes

    Directory of Open Access Journals (Sweden)

    Jim Klaas

    2007-06-01

    Full Text Available In the previous report in this series, Web browser loading times were measured in 12 Asian countries, and were found to be up to four times slower than commonly prescribed as acceptable. Failure of webpages to load at all was frequent. The current follow-up study compares these loading times with the complexity of the Internet routes linking the Web users and the Web servers hosting them. The study was conducted in the same 12 Asian countries, with the assistance of members of the International Development Research Centre’s PANdora distance education research network. The data were generated by network members in Bhutan, Cambodia, India, Indonesia, Laos, Mongolia, the Philippines, Sri Lanka, Pakistan, Singapore, Thailand, and Vietnam. Additional data for the follow-up study were collected in China. Using a ‘traceroute’ routine, the study indicates that webpage loading time is linked to the complexity of the Internet routes between Web users and the host server. It is indicated that distance educators can apply such information in the design of improved online delivery and mirror sites, notably in areas of the developing world which currently lack an effective infrastructure for online education.
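
    The paired measurement underlying the study, page loading time versus the number of hops a traceroute to the hosting server reports, can be approximated as sketched below. The traceroute flags and output format vary by platform, and example.org stands in for a real course page, so this is an illustration rather than the instrument used in the study.

      # Pair a page fetch time with the hop count of a traceroute to its host.
      import subprocess
      import time
      import urllib.parse
      import urllib.request

      def page_load_seconds(url):
          start = time.monotonic()
          with urllib.request.urlopen(url) as response:
              response.read()
          return time.monotonic() - start

      def hop_count(host):
          out = subprocess.run(["traceroute", "-n", host],
                               capture_output=True, text=True, timeout=120).stdout
          # Count numbered hop lines, skipping the header line.
          return sum(1 for line in out.splitlines() if line.strip()[:2].strip().isdigit())

      url = "http://example.org/"
      host = urllib.parse.urlparse(url).hostname
      print(page_load_seconds(url), hop_count(host))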

  19. Small RNA and A-to-I Editing in Autism Spectrum Disorders

    Science.gov (United States)

    Eran, Alal

    One in every 88 children is diagnosed with Autism Spectrum Disorders (ASDs), a set of neurodevelopmental conditions characterized by social impairments, communication deficits, and repetitive behavior. ASDs have a substantial genetic component, but the specific cause of most cases remains unknown. Understanding gene-environment interactions underlying ASD is essential for improving early diagnosis and identifying critical targets for intervention and prevention. Towards this goal, we surveyed adenosine-to-inosine (A-to-I) RNA editing in autistic brains. A-to-I editing is an epigenetic mechanism that fine-tunes synaptic function in response to environmental stimuli, shown to modulate complex behavior in animals. We used ultradeep sequencing to quantify A-to-I recoding of candidate synaptic genes in postmortem cerebella from individuals with ASD and neurotypical controls. We found unexpectedly wide distributions of human A-to-I editing levels, whose extremes were consistently populated by individuals with ASD. We correlated A-to-I editing with isoform usage, identified clusters of correlated sites, and examined differential editing patterns. Importantly, we found that individuals with ASD commonly use a dysfunctional form of the editing enzyme ADARB1. We next profiled small RNAs thought to regulate A-to-I editing, which originate from one of the most commonly altered loci in ASD, 15q11. Deep targeted sequencing of SNORD115 and SNORD116 transcripts enabled their high-resolution detection in human brains, and revealed a strong gender bias underlying their expression. The consistent 2-fold upregulation of 15q11 small RNAs in male vs. female cerebella could be important in delineating the role of this locus in ASD, a male-dominant disorder. Overall, these studies provide an accurate population-level view of small RNA and A-to-I editing in human cerebella, and suggest that A-to-I editing of synaptic genes may be informative for assessing the epigenetic risk for autism

  20. Safety and efficacy of aneurysm treatment with WEB

    DEFF Research Database (Denmark)

    Pierot, Laurent; Costalat, Vincent; Moret, Jacques

    2016-01-01

    OBJECT WEB is an innovative intrasaccular treatment for intracranial aneurysms. Preliminary series have shown good safety and efficacy. The WEB Clinical Assessment of Intrasaccular Aneurysm Therapy (WEBCAST) trial is a prospective European trial evaluating the safety and efficacy of WEB in wide-neck bifurcation aneurysms. METHODS Patients with wide-neck bifurcation aneurysms for which WEB treatment was indicated were included in this multicenter good clinical practices study. Clinical data including adverse events and clinical status at 1 and 6 months were collected and independently analyzed by a medical .... RESULTS Ten European neurointerventional centers enrolled 51 patients with 51 aneurysms. Treatment with WEB was achieved in 48 of 51 aneurysms (94.1%). Adjunctive implants (coils/stents) were used in 4 of 48 aneurysms (8.3%). Thromboembolic events were observed in 9 of 51 patients (17.6%), resulting

  1. The definitive guide to HTML5 WebSocket

    CERN Document Server

    Wang, Vanessa; Moskovits, Peter

    2013-01-01

    The Definitive Guide to HTML5 WebSocket is the ultimate insider's WebSocket resource. This revolutionary new web technology enables you to harness the power of true real-time connectivity and build responsive, modern web applications.   This book contains everything web developers and architects need to know about WebSocket. It discusses how WebSocket-based architectures provide a dramatic reduction in unnecessary network overhead and latency compared to older HTTP (Ajax) architectures, how to layer widely used protocols such as XMPP and STOMP on top of WebSocket, and how to secure WebSocket c

  2. Study on online community user motif using web usage mining

    Science.gov (United States)

    Alphy, Meera; Sharma, Ajay

    2016-04-01

    Web usage mining is the application of data mining used to extract useful information from the online community. The World Wide Web contained at least 4.73 billion pages according to the Indexed Web, and at least 228.52 million pages according to the Dutch Indexed Web, on Thursday 6 August 2015. It is difficult to get the data one needs from these billions of web pages on the World Wide Web; this is where web usage mining becomes important. Personalizing the search engine helps web users identify the most used data in an easy way: it reduces time consumption and supports automatic site search and automatic restoring of useful sites. This study surveys the techniques used for pattern discovery and analysis in web usage mining, from the earliest to the latest, covering 1996 to 2015. Analyzing user motifs helps in the improvement of business, e-commerce, personalisation and improvement of websites.

  3. WebMGA: a customizable web server for fast metagenomic sequence analysis.

    Science.gov (United States)

    Wu, Sitao; Zhu, Zhengwei; Fu, Liming; Niu, Beifang; Li, Weizhong

    2011-09-07

    The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers are facing tremendous challenges in metagenomic data analysis due to huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; also metagenomic annotation involves a wide range of computational tools, which are difficult to be installed and maintained by common users. The tools provided by the few available web servers are also limited and have various constraints such as login requirement, long waiting time, inability to configure pipelines etc. We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools such as ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contaminations, taxonomic analysis, functional annotation etc. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analysis or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. WebMGA offers to researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  4. WebMGA: a customizable web server for fast metagenomic sequence analysis

    Directory of Open Access Journals (Sweden)

    Niu Beifang

    2011-09-01

    Full Text Available Abstract Background The new field of metagenomics studies microorganism communities by culture-independent sequencing. With the advances in next-generation sequencing techniques, researchers are facing tremendous challenges in metagenomic data analysis due to huge quantity and high complexity of sequence data. Analyzing large datasets is extremely time-consuming; also metagenomic annotation involves a wide range of computational tools, which are difficult to be installed and maintained by common users. The tools provided by the few available web servers are also limited and have various constraints such as login requirement, long waiting time, inability to configure pipelines etc. Results We developed WebMGA, a customizable web server for fast metagenomic analysis. WebMGA includes over 20 commonly used tools such as ORF calling, sequence clustering, quality control of raw reads, removal of sequencing artifacts and contaminations, taxonomic analysis, functional annotation etc. WebMGA provides users with rapid metagenomic data analysis using fast and effective tools, which have been implemented to run in parallel on our local computer cluster. Users can access WebMGA through web browsers or programming scripts to perform individual analysis or to configure and run customized pipelines. WebMGA is freely available at http://weizhongli-lab.org/metagenomic-analysis. Conclusions WebMGA offers to researchers many fast and unique tools and great flexibility for complex metagenomic data analysis.

  5. CRISPR-Cas9 Toolkit for Actinomycete Genome Editing

    DEFF Research Database (Denmark)

    Tong, Yaojun; Robertsen, Helene Lunde; Blin, Kai

    2018-01-01

    engineering approaches for boosting known and discovering novel natural products. In order to facilitate genome editing for actinomycetes, we developed a highly efficient CRISPR-Cas9 toolkit for actinomycete genome editing. This basic toolkit includes software for spacer (sgRNA) identification, a system for in-frame gene/gene cluster knockout, a system for gene loss-of-function study, a system for generating a random-size deletion library, and a system for gene knockdown. For the latter, a uracil-specific excision reagent (USER) cloning technology was adapted to simplify the CRISPR vector construction process. The application of this toolkit was successfully demonstrated by perturbation of genomes of Streptomyces coelicolor A3(2) and Streptomyces collinus Tü 365. The CRISPR-Cas9 toolkit and related protocol described here can be widely used for metabolic engineering of actinomycetes.

  6. Progress of CRISPR-Cas Based Genome Editing in Photosynthetic Microbes.

    Science.gov (United States)

    Naduthodi, Mihris Ibnu Saleem; Barbosa, Maria J; van der Oost, John

    2018-02-03

    The carbon footprint caused by unsustainable development and its environmental and economic impact has become a major concern in the past few decades. Photosynthetic microbes such as microalgae and cyanobacteria are capable of accumulating value-added compounds from carbon dioxide, and have been regarded as environmentally friendly alternatives to reduce the usage of fossil fuels, thereby contributing to reducing the carbon footprint. This light-driven generation of green chemicals and biofuels has triggered research into the metabolic engineering of these photosynthetic microbes. CRISPR-Cas systems have been successfully implemented across a wide range of prokaryotic and eukaryotic species for efficient genome editing. However, the adoption of this genome editing tool in microalgal and cyanobacterial species took off rather slowly due to various complications. In this review, we elaborate on the established CRISPR-Cas based genome editing in various microalgal and cyanobacterial species. The complications associated with CRISPR-Cas based genome editing in these species are addressed along with possible strategies to overcome these issues. It is anticipated that in the near future this will result in improving and expanding the microalgal and cyanobacterial genome engineering toolbox. © 2018 The Authors. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  7. Integrating Web Services into Map Image Applications

    National Research Council Canada - National Science Library

    Tu, Shengru

    2003-01-01

    Web services have been opening a wide avenue for software integration. In this paper, we have reported our experiments with three applications that are built by utilizing and providing web services for Geographic Information Systems (GIS...

  8. Cloud Properties of CERES-MODIS Edition 4 and CERES-VIIRS Edition 1

    Science.gov (United States)

    Sun-Mack, Sunny; Minnis, Patrick; Chang, Fu-Lung; Hong, Gang; Arduini, Robert; Chen, Yan; Trepte, Qing; Yost, Chris; Smith, Rita; Brown, Ricky; hide

    2015-01-01

    The Clouds and Earth's Radiant Energy System (CERES) analyzes MODerate-resolution Imaging Spectroradiometer (MODIS) and Visible Infrared Imaging Radiometer Suite (VIIRS) data to derive cloud properties that are combined with aerosol and CERES broadband flux data to create a multi-parameter data set for climate study. CERES has produced over 15 years of data from Terra and over 13 years of data from Aqua using the CERES-MODIS Edition-2 cloud retrieval algorithm. A recently revised algorithm, CERES-MODIS Edition 4, has been developed and is now generating enhanced cloud data for climate research (over 10 years for Terra and 8 years for Aqua). New multispectral retrievals of properties are included along with a multilayer cloud retrieval system. Cloud microphysical properties are reported at 3 wavelengths, 0.65, 1.24, and 2.1 microns, to enable better estimates of the vertical profiles of cloud water contents. Cloud properties over snow are retrieved using the 1.24-micron channel. A new CERES-VIIRS cloud retrieval package was developed for the VIIRS spectral complement and is currently producing the CERES-VIIRS Edition 1 cloud dataset. The results from CERES-MODIS Edition 4 and CERES-VIIRS Edition 1 are presented and compared with each other and with other datasets, including CALIPSO, CloudSat and the CERES-MODIS Edition-2 results.

  9. THE IMAGE OF INVESTMENT AND FINANCIAL SERVICES COMPANIES IN WWW LANDSCAPE (WORLD WIDE WEB

    Directory of Open Access Journals (Sweden)

    Iancu Ioana Ancuta

    2011-07-01

    Full Text Available In a world where the internet and its image are becoming more and more important, this study examines the importance of Investment and Financial Services Companies' web sites. Market competition creates the need for studies focused on assessing and analyzing the websites of companies active in this sector. Our study aims to answer several questions related to Romanian Investment and Financial Services Companies' web sites through four dimensions: content, layout, handling and interactivity. Which web sites are best, and from what point of view? Where should financial services companies direct their investments to differentiate themselves and their sites? In short, we want to rank the 58 Investment and Financial Services Companies' web sites based on 127 criteria. There are numerous methods for evaluating web pages. The evaluation methods are similar from the structural point of view and the most popular are: Serqual, Sitequal, Webqual / Equal EtailQ, Ewam, e-Serqual, WebQEM (Badulescu, 2008:58). In the paper "Assessment of Romanian Banks E-Image: A Marketing Perspective" (Catana, Catana and Constantinescu, 2006: 4), the authors point out that there are at least four complex variables: accessibility, functionality, performance and usability. Each of these can be decomposed into simple ones. We used the same method, and we examined, from the utility point of view, 58 web sites of Investment and Financial Services Companies based on 127 criteria, following a procedure developed by Institut fur ProfNet Internet Marketing, Munster (Germany). The data collection period was 1-30 September 2010. The results show that there are very large differences between corporate sites; their creators concentrate on the information required by law and on aesthetics, neglecting other aspects such as communication and online service. In the future we want to extend this study to the international level, by applying the same methods of research in 5 countries from

  10. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    Science.gov (United States)

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  11. Critical Reading of the Web

    Science.gov (United States)

    Griffin, Teresa; Cohen, Deb

    2012-01-01

    The ubiquity and familiarity of the world wide web means that students regularly turn to it as a source of information. In doing so, they "are said to rely heavily on simple search engines, such as Google to find what they want." Researchers have also investigated how students use search engines, concluding that "the young web users tended to…

  12. WebNet 99 : proceedings of WebNet 99 - World Conference on the WWW and Internet, Honolulu, Hawaii, October 24-30, 1999

    NARCIS (Netherlands)

    De Bra, P.M.E.; Leggett, J.

    1999-01-01

    The 1999 WebNet conference addressed research, new developments, and experiences related to the Internet and World Wide Web. The 394 contributions of WebNet 99 contained in this proceedings comprise the full and short papers accepted for presentation at the conference. Major topics covered include:

  13. World wide web for database of Japanese translation on international nuclear event scale reports

    International Nuclear Information System (INIS)

    Watanabe, Norio; Hirano, Masashi

    1999-01-01

    The International Nuclear Event Scale (INES) is a means designed for providing prompt, clear and consistent information related to nuclear events that occur at nuclear facilities, and for facilitating communication between the nuclear community, the media and the public. The INES is jointly operated by the IAEA and the OECD-NEA. Nuclear events reported are rated by the 'Scale', a consistent safety significance indicator. The scale runs from level 0, for events with no safety significance, to level 7 for a major accident with widespread health and environmental effects. The Japan Atomic Energy Research Institute (JAERI) has been promptly translating the INES reports into Japanese and developing a world-wide-web database for the Japanese translation, aiming at more efficient utilization of the INES information inside Japan. The present paper briefly introduces the definitions of the INES rating levels and the scope of the Scale, and describes the outline of the database (the information stored in the database, its functions and how to use it). As well, technical use of the INES reports and the availability/effectiveness of the database are discussed. (author)

  14. Specification of application logic in web information systems

    NARCIS (Netherlands)

    Barna, P.

    2007-01-01

    The importance of the World Wide Web has grown tremendously over the past decade (or decade and a half). With a quickly growing amount of information published on the Web and its rapidly growing audience, requirements put on Web-based Information Systems (WIS), their developers and maintainers have

  15. Web document engineering

    International Nuclear Information System (INIS)

    White, B.

    1996-05-01

    This tutorial provides an overview of several document engineering techniques which are applicable to the authoring of World Wide Web documents. It illustrates how pre-WWW hypertext research is applicable to the development of WWW information resources

  16. The Rise and Fall of Text on the Web: A Quantitative Study of Web Archives

    Science.gov (United States)

    Cocciolo, Anthony

    2015-01-01

    Introduction: This study addresses the following research question: is the use of text on the World Wide Web declining? If so, when did it start declining, and by how much has it declined? Method: Web pages are downloaded from the Internet Archive for the years 1999, 2002, 2005, 2008, 2011 and 2014, producing 600 captures of 100 prominent and…

  17. An Algebraic Specification of the Semantic Web

    OpenAIRE

    Ksystra, Katerina; Triantafyllou, Nikolaos; Stefaneas, Petros; Frangos, Panayiotis

    2011-01-01

    We present a formal specification of the Semantic Web, as an extension of the World Wide Web, using the well known algebraic specification language CafeOBJ. Our approach allows the description of the key elements of the Semantic Web technologies, in order to give a better understanding of the system, without getting involved with their implementation details, which might not yet be standardized. This specification is part of our work in progress concerning the modeling of the Social Semantic Web.

  18. Online Access to Weather Satellite Imagery Through the World Wide Web

    Science.gov (United States)

    Emery, W.; Baldwin, D.

    1998-01-01

    Both global area coverage (GAC) and high-resolution picture transmission (HRPT) data from the Advanced Very High Resolution Radiometer (AVHRR) are made available to Internet users through an online data access system. Older GOES-7 data are also available. Created as a "testbed" data system for NASA's future Earth Observing System Data and Information System (EOSDIS), this testbed provides an opportunity to test both the technical requirements of an online data system and the different ways in which the general user community would employ such a system. Initiated in December 1991, the basic data system experienced five major evolutionary changes in response to user requests and requirements. Features added with these changes were the addition of online browse, user subsetting, dynamic image processing/navigation, a stand-alone data storage system, and movement from an X-windows graphical user interface (GUI) to a World Wide Web (WWW) interface. Over its lifetime, the system has had as many as 2500 registered users. The system on the WWW has had over 2500 hits since October 1995. Many of these hits are by casual users that only take the GIF images directly from the interface screens and do not specifically order digital data. Still, there is a consistent stream of users ordering the navigated image data and related products (maps and so forth). We have recently added a real-time, seven-day, northwestern United States normalized difference vegetation index (NDVI) composite that has generated considerable interest. Index Terms-Data system, earth science, online access, satellite data.
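
    The NDVI composite mentioned at the end is based on the standard ratio NDVI = (NIR - Red) / (NIR + Red), with AVHRR channel 2 supplying the near-infrared reflectance and channel 1 the visible red. The short sketch below applies the formula to two invented reflectance arrays; it illustrates the computation only, not the system's actual processing chain.

      # NDVI = (NIR - Red) / (NIR + Red) on made-up AVHRR-like reflectances.
      import numpy as np

      red = np.array([[0.08, 0.10], [0.12, 0.30]])   # channel 1 reflectance (visible red)
      nir = np.array([[0.45, 0.40], [0.35, 0.32]])   # channel 2 reflectance (near-infrared)

      ndvi = (nir - red) / (nir + red)
      print(np.round(ndvi, 2))   # dense vegetation approaches 1, bare ground approaches 0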

  19. Using National Instruments LabVIEW[TM] Education Edition in Schools

    Science.gov (United States)

    Butlin, Chris A.

    2011-01-01

    With the development of LabVIEW[TM] Education Edition schools can now provide experience of using this widely used software. Here, a few of the many applications that students aged around 11 years and over could develop are outlined in the resulting front panel screen displays and block diagrams showing the associated graphical programmes, plus a…

  20. Library OPACs on the Web: Finding and Describing Directories.

    Science.gov (United States)

    Henry, Marcia

    1997-01-01

    Provides current descriptions of some of the major directories that link to library catalogs on the World Wide Web. Highlights include LibWeb; Hytelnet; WebCats; WWW Library Directory; and techniques for finding new library OPAC (online public access catalog) directories. (LRW)

  1. An Empirical Comparison of Navigation Effect of Pull-Down Menu Style on The World Wide Web.

    Science.gov (United States)

    Yu, Byeong-Min; Han, Sungwook

    Effective navigation is becoming more and more critical to the success of electronic commerce (E-commerce). It remains a challenge for educational technologists and Web designers to develop Web systems that can help customers find products or services without experiencing disorientation problems and cognitive overload. Many E-commerce Web sites…

  2. PROTOTIPE PEMESANAN BAHAN PUSTAKA MELALUI WEB MENGGUNAKAN ACTIVE SERVER PAGE (ASP

    Directory of Open Access Journals (Sweden)

    Djoni Haryadi Setiabudi

    2002-01-01

    Full Text Available Electronic commerce is one of the fastest growing components of the Internet worldwide. In this research, a prototype was developed for a library service that offers ordering of library collection items, especially books and articles, through the World Wide Web. To allow interaction between seller and buyer, a dynamic web site is needed, which in turn requires supporting technology and software. One suitable programming language is Active Server Pages (ASP), which is combined with a database system to store data. The other component, acting as an interface between the application and the database, is ActiveX Data Objects (ADO). ASP has advantages in its scripting method and is easy to configure with a database. The application consists of two major parts: administrator and user. The prototype provides facilities for editing, searching, and browsing ordering information online. Users can also download articles as part of the searching and ordering process. The payment method in this e-commerce system is essential because in Indonesia not everybody has a credit card. As a solution, the prototype provides a form for users who do not have a credit card; once the bill has been paid, the user can complete the transaction online. Here one of ASP's advantages, the "session", is used: data being processed is not lost as long as the user remains in that session. Sessions are used in both the user area and the admin area, where users and the administrator can carry out various processes. Abstract in Bahasa Indonesia (translated): Electronic commerce is a part of the Internet that is currently growing rapidly worldwide. In this research a prototype application was built for developing library services, in particular the ordering of articles and books through the World Wide Web. Building a web-based application requires technology and software that support the creation of dynamic web sites so that there is interaction between buyer and seller.

  3. Test Review: Wilkinson, G. S., & Robertson, G. J. (2006). Wide Range Achievement Test--Fourth Edition. Lutz, FL: Psychological Assessment Resources. WRAT4 Introductory Kit (Includes Manual, 25 Test/Response Forms [Blue and Green], and Accompanying Test Materials): $243.00

    Science.gov (United States)

    Dell, Cindy Ann; Harrold, Barbara; Dell, Thomas

    2008-01-01

    The Wide Range Achievement Test-Fourth Edition (WRAT4) is designed to provide "a quick, simple, psychometrically sound assessment of academic skills". The test was first published in 1946 by Joseph F. Jastak, with the purpose of augmenting the cognitive performance measures of the Wechsler-Bellevue Scales, developed by David Wechsler.…

  4. Web Sitings.

    Science.gov (United States)

    Lo, Erika

    2001-01-01

    Presents seven mathematics games, located on the World Wide Web, for elementary students, including: Absurd Math: Pre-Algebra from Another Dimension; The Little Animals Activity Centre; MathDork Game Room (classic video games focusing on algebra); Lemonade Stand (students practice math and business skills); Math Cats (teaches the artistic beauty…

  5. Deep iCrawl: An Intelligent Vision-Based Deep Web Crawler

    OpenAIRE

    R.Anita; V.Ganga Bharani; N.Nityanandam; Pradeep Kumar Sahoo

    2011-01-01

    The explosive growth of World Wide Web has posed a challenging problem in extracting relevant data. Traditional web crawlers focus only on the surface web while the deep web keeps expanding behind the scene. Deep web pages are created dynamically as a result of queries posed to specific web databases. The structure of the deep web pages makes it impossible for traditional web crawlers to access deep web contents. This paper, Deep iCrawl, gives a novel and vision-based app...

  6. The emergent discipline of health web science.

    Science.gov (United States)

    Luciano, Joanne S; Cumming, Grant P; Wilkinson, Mark D; Kahana, Eva

    2013-08-22

    The transformative power of the Internet on all aspects of daily life, including health care, has been widely recognized both in the scientific literature and in public discourse. Viewed through the various lenses of diverse academic disciplines, these transformations reveal opportunities realized, the promise of future advances, and even potential problems created by the penetration of the World Wide Web for both individuals and for society at large. Discussions about the clinical and health research implications of the widespread adoption of information technologies, including the Internet, have been subsumed under the disciplinary label of Medicine 2.0. More recently, however, multi-disciplinary research has emerged that is focused on the achievement and promise of the Web itself, as it relates to healthcare issues. In this paper, we explore and interrogate the contributions of the burgeoning field of Web Science in relation to health maintenance, health care, and health policy. From this, we introduce Health Web Science as a subdiscipline of Web Science, distinct from but overlapping with Medicine 2.0. This paper builds on the presentations and subsequent interdisciplinary dialogue that developed among Web-oriented investigators present at the 2012 Medicine 2.0 Conference in Boston, Massachusetts.

  7. Understanding the Web from an Economic Perspective: The Evolution of Business Models and the Web

    Directory of Open Access Journals (Sweden)

    Louis Rinfret

    2014-08-01

    Full Text Available The advent of the World Wide Web is arguably amongst the most important changes that have occurred since the 1990s in the business landscape. It has fueled the rise of new industries, supported the convergence and reshaping of existing ones and enabled the development of new business models. During this time the web has evolved tremendously from a relatively static page-display tool to a massive network of user-generated content, collective intelligence, applications and hypermedia. As technical standards continue to evolve, business models catch up to the new capabilities. New ways of creating value, distributing it and profiting from it emerge more rapidly than ever. In this paper we explore how the World Wide Web and business models evolve and we identify avenues for future research in light of the web's ever-evolving nature and its influence on business models.

  8. Models and methods for building web recommendation systems

    OpenAIRE

    Stekh, Yu.; Artsibasov, V.

    2012-01-01

    Modern Word Wide Web contains a large number of Web sites and pages in each Web site. Web recommendation system (recommendation system for web pages) are typically implemented on web servers and use the data obtained from the collection viewed web templates (implicit data) or user registration data (explicit data). In article considering methods and algorithms of web recommendation system based on the technology of data mining (web mining). Сучасна мережа Інтернет містить велику кількість веб...

  9. Prototyping Tool for Web-Based Multiuser Online Role-Playing Game

    Science.gov (United States)

    Okamoto, Shusuke; Kamada, Masaru; Yonekura, Tatsuhiro

    This letter proposes a prototyping tool for Web-based Multiuser Online Role-Playing Games (MORPG). The design goal is to make this tool simple and powerful. The tool is comprised of a GUI editor, a translator and a runtime environment. The GUI editor is used to edit state-transition diagrams, each of which defines the behavior of a fictional character. The state-transition diagrams are translated into C program code, which plays the role of a game engine in the RPG system. The runtime environment includes PHP, JavaScript with Ajax and HTML, so the prototype system can be played in a standard Web browser such as Firefox, Safari and IE. When a player clicks or presses a key, the Web browser sends the event to the Web server so that its consequence is reflected on the screens the other players are looking at. Prospective users of this tool include programming novices and schoolchildren. Knowledge of or skill in any specific programming language is not required to create state-transition diagrams. Their structure is not only suitable for the definition of character behavior but also intuitive enough to help novices understand. Therefore, users can easily create a Web-based MORPG system with the tool.
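
    The core idea, a character's behavior defined as a state-transition diagram that is then translated into engine code, can be illustrated with a small sketch. The Python snippet below is only a hypothetical illustration of that concept (the tool described above generates C and runs on PHP/JavaScript); the states and events are invented for the example.

```python
# Minimal state-machine sketch: a fictional character's behavior expressed as a
# transition table, the kind of structure a state-transition diagram editor
# could generate. States and events here are invented for illustration only.
TRANSITIONS = {
    ("idle", "player_nearby"): "greet",
    ("greet", "dialog_done"): "idle",
    ("idle", "attacked"): "flee",
    ("flee", "safe"): "idle",
}

def step(state: str, event: str) -> str:
    """Return the character's next state; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

if __name__ == "__main__":
    state = "idle"
    for event in ["player_nearby", "dialog_done", "attacked", "safe"]:
        state = step(state, event)
        print(event, "->", state)
```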

  10. Safety and Efficacy of Aneurysm Treatment with the WEB

    DEFF Research Database (Denmark)

    Pierot, L; Gubucz, I; Buhk, J H

    2017-01-01

    BACKGROUND AND PURPOSE: Flow disruption with the Woven EndoBridge (WEB) device is an innovative technique for the endovascular treatment of wide-neck bifurcation aneurysms. The initial version of the device (WEB Double-Layer) was evaluated in the WEB Clinical Assessment of IntraSaccular Aneurysm ...

  11. CRISPy-web

    DEFF Research Database (Denmark)

    Blin, Kai; Pedersen, Lasse Ebdrup; Weber, Tilmann

    2016-01-01

    CRISPR/Cas9-based genome editing has been one of the major achievements of molecular biology, allowing the targeted engineering of a wide range of genomes. The system originally evolved in prokaryotes as an adaptive immune system against bacteriophage infections. It now sees widespread application...... in genome engineering workflows, especially using the Streptococcus pyogenes endonuclease Cas9. To utilize Cas9, so-called single guide RNAs (sgRNAs) need to be designed for each target gene. While there are many tools available to design sgRNAs for the popular model organisms, only few tools that allow...
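
    As a point of reference for what sgRNA design involves at its simplest, the sketch below scans a DNA sequence for 20-nt protospacers followed by the NGG PAM required by the S. pyogenes Cas9. It is a generic illustration, not the CRISPy-web algorithm, and the example sequence is invented.

```python
import re

def find_sgrna_candidates(seq: str, spacer_len: int = 20):
    """Yield (position, spacer, PAM) for sites followed by an NGG PAM on the + strand."""
    seq = seq.upper()
    # Lookahead keeps overlapping sites; S. pyogenes Cas9 requires NGG directly 3' of the spacer.
    for m in re.finditer(r"(?=([ACGT]{%d})([ACGT]GG))" % spacer_len, seq):
        yield m.start(), m.group(1), m.group(2)

if __name__ == "__main__":
    example = "ATGCGTACGTTAGCCTAGGCTAGCTAGGCTAACGGTAGCTAGCTAGGGTACCTAGG"  # invented sequence
    for pos, spacer, pam in find_sgrna_candidates(example):
        print(f"pos={pos:3d} spacer={spacer} PAM={pam}")
```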

  12. Connectivity editing for quad-dominant meshes

    KAUST Repository

    Peng, Chihan

    2013-08-01

    We propose a connectivity editing framework for quad-dominant meshes. In our framework, the user can edit the mesh connectivity to control the location, type, and number of irregular vertices (with more or fewer than four neighbors) and irregular faces (non-quads). We provide a theoretical analysis of the problem, discuss what edits are possible and impossible, and describe how to implement an editing framework that realizes all possible editing operations. In the results, we show example edits and illustrate the advantages and disadvantages of different strategies for quad-dominant mesh design. © 2013 The Author(s) Computer Graphics Forum © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  13. Information Waste on the World Wide Web and Combating the Clutter

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; Wijnhoven, Alphonsus B.J.M.; Beckers, David

    2015-01-01

    The Internet has become a critical part of the infrastructure supporting modern life. The high degree of openness and autonomy of information providers determines the access to a vast amount of information on the Internet. However, this makes the web vulnerable to inaccurate, misleading, or outdated

  14. Processing biological literature with customizable Web services supporting interoperable formats.

    Science.gov (United States)

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.

  15. Fast 3D Net Expeditions: Tools for Effective Scientific Collaboration on the World Wide Web

    Science.gov (United States)

    Watson, Val; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D (three dimensional), high resolution, dynamic, interactive viewing of scientific data. The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG (Motion Picture Expert Group) movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewers local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over using video format are: (1) The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video format transmitted over the network. (2) The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). (3) A rich variety of guided expeditions through the data can be included easily. (4) A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of

  16. A Typology for Web 2.0

    DEFF Research Database (Denmark)

    Dalsgaard, Christian; Sorensen, Elsebeth Korsgaard

    2008-01-01

    Web 2.0 is a term used to describe recent developments on the World Wide Web. The term is often used to describe the increased use of the web for user-generated content, collaboration, and social networking. However, Web 2.0 is a weakly defined concept, and it is unclear exactly what kind…… of a learning environment: 1) organizing communicative processes and 2) organizing resources. Organizing communicative processes is supported by Web 2.0’s ability to provide a range of communicative tools that can be organized flexibly by students. Web 2.0 provides opportunities for communities and groups to organize their own communicative processes. Further, Web 2.0 supports organization of resources by empowering students to create, construct, manage and share content themselves. However, the main potential lies within collaborative creation and sharing in networks. Potentially, networking tools……

  17. Web corpus construction

    CERN Document Server

    Schafer, Roland

    2013-01-01

    The World Wide Web constitutes the largest existing source of texts written in a great variety of languages. A feasible and sound way of exploiting this data for linguistic research is to compile a static corpus for a given language. There are several advantages of this approach: (i) Working with such corpora obviates the problems encountered when using Internet search engines in quantitative linguistic research (such as non-transparent ranking algorithms). (ii) Creating a corpus from web data is virtually free. (iii) The size of corpora compiled from the WWW may exceed by several orders of magnitude the size of language resources offered elsewhere. (iv) The data is locally available to the user, and it can be linguistically post-processed and queried with the tools preferred by her/him. This book addresses the main practical tasks in the creation of web corpora up to giga-token size. Among these tasks are the sampling process (i.e., web crawling) and the usual cleanups including boilerplate removal and rem...
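
    One of the cleanup steps mentioned above, boilerplate removal, is often approximated with simple text-density heuristics: blocks that contain little running text relative to their markup (navigation bars, link lists) are dropped. The sketch below illustrates that idea on raw HTML; it is a toy heuristic for orientation, not the procedure described in the book, and the thresholds are assumptions.

```python
import re

def strip_boilerplate(html, min_words=15, max_link_ratio=0.5):
    """Keep only text blocks that look like running prose.

    A block is kept when it has at least `min_words` words and when less than
    `max_link_ratio` of its characters came from anchor (link) text.
    """
    blocks = re.split(r"(?is)</?(?:p|div|td|li|section|article)[^>]*>", html)
    kept = []
    for block in blocks:
        link_text = "".join(re.findall(r"(?is)<a[^>]*>(.*?)</a>", block))
        text = re.sub(r"(?s)<[^>]+>", " ", block)  # drop remaining tags
        words = text.split()
        if not words:
            continue
        link_ratio = len(link_text) / max(len(text), 1)
        if len(words) >= min_words and link_ratio < max_link_ratio:
            kept.append(" ".join(words))
    return "\n".join(kept)

if __name__ == "__main__":
    page = ("<div><a href='/'>Home</a> <a href='/news'>News</a></div>"
            "<p>This paragraph is long enough to look like genuine running text, "
            "so a simple density heuristic keeps it while dropping the link bar above.</p>")
    print(strip_boilerplate(page))
```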

  18. pedigreejs: a web-based graphical pedigree editor.

    Science.gov (United States)

    Carver, Tim; Cunningham, Alex P; Babb de Villiers, Chantal; Lee, Andrew; Hartley, Simon; Tischkowitz, Marc; Walter, Fiona M; Easton, Douglas F; Antoniou, Antonis C

    2018-03-15

    The collection, management and visualization of clinical pedigree (family history) data is a core activity in clinical genetics centres. However, clinical pedigree datasets can be difficult to manage, as they are time consuming to capture, and can be difficult to build, manipulate and visualize graphically. Several standalone graphical pedigree editors and drawing applications exist but there are no freely available lightweight graphical pedigree editors that can be easily configured and incorporated into web applications. We developed 'pedigreejs', an interactive graphical pedigree editor written in JavaScript, which uses standard pedigree nomenclature. Pedigreejs provides an easily configurable, extensible and lightweight pedigree editor. It makes use of an open-source Javascript library to define a hierarchical layout and to produce images in scalable vector graphics (SVG) format that can be viewed and edited in web browsers. The software is freely available under GPL licence (https://ccge-boadicea.github.io/pedigreejs/). tjc29@cam.ac.uk. Supplementary data are available at Bioinformatics online.

  19. Primer on client-side web security

    CERN Document Server

    De Ryck, Philippe; Piessens, Frank; Johns, Martin

    2014-01-01

    This volume illustrates the continuous arms race between attackers and defenders of the Web ecosystem by discussing a wide variety of attacks. In the first part of the book, the foundation of the Web ecosystem is briefly recapped and discussed. Based on this model, the assets of the Web ecosystem are identified, and the set of capabilities an attacker may have are enumerated. In the second part, an overview of the web security vulnerability landscape is constructed. Included are selections of the most representative attack techniques reported in great detail. In addition to descriptions of the

  20. Country Nuclear Power Profiles - 2009 Edition

    International Nuclear Information System (INIS)

    2009-08-01

    The Country Nuclear Power Profiles compiles background information on the status and development of nuclear power programs in Member States. It consists of organizational and industrial aspects of nuclear power programs and provides information about the relevant legislative, regulatory, and international framework in each country. Its descriptive and statistical overview of the overall economic, energy, and electricity situation in each country, and its nuclear power framework, is intended to serve as an integrated source of key background information about nuclear power programs in the world. The preparation of Country Nuclear Power Profiles (CNPP) was initiated in the 1990s. It responded to a need for a database and a technical publication containing a description of the energy and economic situation, the energy and electricity sector, and the primary organizations involved in nuclear power in IAEA Member States. This is the 2009 edition issued on CD-ROM and Web pages. It updates the country information for 44 countries. The CNPP is updated based on information voluntarily provided by participating IAEA Member States. Participants include the 30 countries that have operating nuclear power plants, as well as 14 countries having past or planned nuclear power programmes (Bangladesh, Egypt, Ghana, Indonesia, the Islamic Republic of Iran, Italy, Kazakhstan, Nigeria, Philippines, Poland, Thailand, Tunisia, Turkey and Vietnam). For the 2009 edition, 26 countries provided updated or new profiles. For the other countries, the IAEA updated the profile statistical tables on nuclear power, energy development, and economic indicators based on information from IAEA and World Bank databases.

  1. Country Nuclear Power Profiles - 2011 Edition

    International Nuclear Information System (INIS)

    2011-08-01

    The Country Nuclear Power Profiles compiles background information on the status and development of nuclear power programs in Member States. It consists of organizational and industrial aspects of nuclear power programs and provides information about the relevant legislative, regulatory, and international framework in each country. Its descriptive and statistical overview of the overall economic, energy, and electricity situation in each country, and its nuclear power framework, is intended to serve as an integrated source of key background information about nuclear power programs in the world. The preparation of Country Nuclear Power Profiles (CNPP) was initiated in the 1990s. It responded to a need for a database and a technical publication containing a description of the energy and economic situation, the energy and electricity sector, and the primary organizations involved in nuclear power in IAEA Member States. This is the 2011 edition issued on CD-ROM and Web pages. It updates the country information for 50 countries. The CNPP is updated based on information voluntarily provided by participating IAEA Member States. Participants include the 29 countries that have operating nuclear power plants, as well as 21 countries having past or planned nuclear power programmes (Bangladesh, Belarus, Chile, Egypt, Ghana, Indonesia, the Islamic Republic of Iran, Italy, Jordan, Kazakhstan, Kuwait, Lithuania, Morocco, Nigeria, Philippines, Poland, Syrian Arab Republic, Thailand, Tunisia, Turkey and Vietnam). For the 2011 edition, 23 countries provided updated or new profiles. For the other countries, the IAEA updated the profile statistical tables on nuclear power, energy development, and economic indicators based on information from IAEA and World Bank databases.

  2. Functional Reconstitution of a Fungal Natural Product Gene Cluster by Advanced Genome Editing.

    Science.gov (United States)

    Weber, Jakob; Valiante, Vito; Nødvig, Christina S; Mattern, Derek J; Slotkowski, Rebecca A; Mortensen, Uffe H; Brakhage, Axel A

    2017-01-20

    Filamentous fungi produce a variety of natural products, even in a strain-dependent manner. However, the genetic basis of chemical speciation between strains is still widely unknown. One example is trypacidin, a natural product of the opportunistic human pathogen Aspergillus fumigatus that is not produced by all isolates. Combining computational analysis with targeted gene editing, we could link a single-nucleotide insertion in the polyketide synthase of the trypacidin biosynthetic pathway to the loss of production and reconstitute its production in a nonproducing strain. Thus, we present a CRISPR/Cas9-based tool for advanced molecular genetic studies in filamentous fungi, exploiting selectable markers separated from the edited locus.

  3. Edit distance for marked point processes revisited: An implementation by binary integer programming

    Energy Technology Data Exchange (ETDEWEB)

    Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2015-12-15

    We implement the edit distance for marked point processes [Suzuki et al., Int. J. Bifurcation Chaos 20, 3699–3708 (2010)] as a binary integer program. Compared with the previous implementation using minimum cost perfect matching, the proposed implementation has two advantages: first, by using the proposed implementation, we can apply a wide variety of software and hardware, even spin glasses and coherent Ising machines, to calculate the edit distance for marked point processes; second, the proposed implementation runs faster than the previous implementation when the difference between the numbers of events in two time windows for a marked point process is large.
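
    For orientation, the sketch below computes a Victor-Purpura-style edit distance between two (unmarked) event-time sequences by dynamic programming; this is the classical formulation that the matching-based and integer-programming implementations discussed above generalize to marked point processes. The cost parameter q and the example spike times are assumptions of the illustration.

```python
def vp_edit_distance(times_a, times_b, q=1.0):
    """Victor-Purpura-style edit distance between two sorted event-time lists.

    Deleting or inserting an event costs 1; shifting an event by dt costs q * |dt|.
    """
    n, m = len(times_a), len(times_b)
    # dp[i][j]: distance between the first i events of a and the first j events of b.
    dp = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = float(i)
    for j in range(1, m + 1):
        dp[0][j] = float(j)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            shift = q * abs(times_a[i - 1] - times_b[j - 1])
            dp[i][j] = min(dp[i - 1][j] + 1,          # delete an event of a
                           dp[i][j - 1] + 1,          # insert an event of b
                           dp[i - 1][j - 1] + shift)  # shift one event onto the other
    return dp[n][m]

if __name__ == "__main__":
    print(vp_edit_distance([0.1, 0.5, 0.9], [0.15, 0.8], q=2.0))
```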

  4. Conducting Web-based Surveys.

    OpenAIRE

    David J. Solomon

    2001-01-01

    Web-based surveying is becoming widely used in social science and educational research. The Web offers significant advantages over more traditional survey techniques; however, there are still serious methodological challenges with using this approach. Currently, coverage bias (the fact that significant numbers of people do not have access to, or choose not to use, the Internet) is of most concern to researchers. Survey researchers also have much to learn concerning the most effective ways to conduct s...

  5. Preservation of the Digital Culture: Archiving the World Wide Web [Sayısal (Dijital) Kültürün Korunması: Web Arşivleme]

    Directory of Open Access Journals (Sweden)

    Ahmet Aldemir

    2006-09-01

    Full Text Available Information growth on the web has made it necessary to archive this information so that it can be transmitted to future generations. Web archiving is a versatile undertaking with technical, legal and organizational dimensions. Every stage in the life cycle of digital information is critically important for information in the web environment. All over the world, many countries have started web archiving efforts under the leadership of their national libraries and have attempted to place these initiatives on a legal basis. In the light of these developments, this paper examines the necessity of web archiving and its major techniques, and also discusses national and international web archiving projects. Abstract in Turkish (translated): The growth of information in the web environment has brought with it the need to archive this information so that it can be passed on to future generations. Archiving the web is a multi-faceted undertaking with technical, legal and organizational dimensions. Each stage in the life cycle of digitally produced information is of vital importance for information on the web. Many countries around the world have launched web archiving efforts under the leadership of their national libraries, and the necessary steps have been taken to move these initiatives onto a legal footing. In the light of these developments, this study discusses why the web should be archived, reviews the principal approaches used in web archiving, and presents national and international web archiving projects.

  6. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling

    Science.gov (United States)

    Devi, R. Suganya; Manjula, D.; Siddharth, R. K.

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web which have to be further processed for future use, thereby increasing the workload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of the Depth First Search Algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and their metadata such as title, keywords, and description are extracted. This content is very essential for any type of analyser work to be carried out on the Big Data obtained as a result of Web Crawling. PMID:26137592
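
    A minimal sketch of the general approach described above, depth-first traversal of hyperlinks from a seed URL plus extraction of the title and meta keywords/description, is shown below using only the Python standard library. It is an illustration of the technique, not the authors' implementation, and the seed URL is a placeholder.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageParser(HTMLParser):
    """Collects hyperlinks plus the title and meta keywords/description of one page."""
    def __init__(self):
        super().__init__()
        self.links, self.meta, self._in_title, self.title = [], {}, False, ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])
        elif tag == "meta" and attrs.get("name") in ("keywords", "description"):
            self.meta[attrs["name"]] = attrs.get("content", "")
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def crawl_dfs(seed, max_pages=10):
    """Depth-first crawl from a seed URL, returning {url: metadata}."""
    stack, seen, results = [seed], set(), {}
    while stack and len(results) < max_pages:
        url = stack.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
        except Exception:
            continue
        parser = PageParser()
        parser.feed(html)
        results[url] = {"title": parser.title.strip(), **parser.meta}
        # Depth-first: newly found links are pushed and explored before siblings.
        stack.extend(urljoin(url, link) for link in parser.links)
    return results

if __name__ == "__main__":
    for url, meta in crawl_dfs("https://example.com").items():  # placeholder seed URL
        print(url, meta)
```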

  7. Web interface for plasma analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M. [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan)], E-mail: emo@nifs.ac.jp; Murakami, S. [Kyoto University, Yoshida-Honmachi, Sakyo-ku, Kyoto 606-8501 (Japan); Yoshida, M.; Funaba, H.; Nagayama, Y. [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan)

    2008-04-15

    There are many analysis codes that analyze various aspects of plasma physics. However, most of them are FORTRAN programs written to be run on supercomputers. On the other hand, many scientists use GUI (graphical user interface)-based operating systems. For those who are not familiar with supercomputers, it is a difficult task to run analysis codes on them, and they often hesitate to use these programs to substantiate their ideas. Furthermore, these analysis codes were written for personal use, and the programmers did not expect them to be run by other users. In order to make these programs widely usable, the authors developed user-friendly Web interfaces. Since the Web browser is one of the most common applications, this is useful for both users and developers. To realize an interactive Web interface, the AJAX technique is widely used, and the authors also adopted AJAX. In building such an AJAX-based Web system, Ruby on Rails plays an important role. Since this application framework, which is written in Ruby, abstracts the Web interfaces necessary to implement AJAX and database functions, it enables programmers to efficiently develop Web-based applications. In this paper, the authors will introduce the system and demonstrate the usefulness of this approach.

  8. Web interface for plasma analysis codes

    International Nuclear Information System (INIS)

    Emoto, M.; Murakami, S.; Yoshida, M.; Funaba, H.; Nagayama, Y.

    2008-01-01

    There are many analysis codes that analyze various aspects of plasma physics. However, most of them are FORTRAN programs written to be run on supercomputers. On the other hand, many scientists use GUI (graphical user interface)-based operating systems. For those who are not familiar with supercomputers, it is a difficult task to run analysis codes on them, and they often hesitate to use these programs to substantiate their ideas. Furthermore, these analysis codes were written for personal use, and the programmers did not expect them to be run by other users. In order to make these programs widely usable, the authors developed user-friendly Web interfaces. Since the Web browser is one of the most common applications, this is useful for both users and developers. To realize an interactive Web interface, the AJAX technique is widely used, and the authors also adopted AJAX. In building such an AJAX-based Web system, Ruby on Rails plays an important role. Since this application framework, which is written in Ruby, abstracts the Web interfaces necessary to implement AJAX and database functions, it enables programmers to efficiently develop Web-based applications. In this paper, the authors will introduce the system and demonstrate the usefulness of this approach.
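
    The pattern described above, a browser front end that submits a job and asynchronously polls for results while a batch code runs elsewhere, can be sketched in a few lines. The authors' system is built on Ruby on Rails with AJAX; the Python/Flask sketch below is only a hedged, generic illustration of the same wrap-a-batch-code-behind-HTTP idea, with invented endpoint names and a stand-in command in place of the real analysis executable.

```python
import subprocess
import sys
import uuid

from flask import Flask, jsonify, request

app = Flask(__name__)
jobs = {}  # job_id -> running subprocess handle

@app.route("/jobs", methods=["POST"])
def submit():
    """Start a batch analysis run and return a job id the browser can poll."""
    job_id = str(uuid.uuid4())
    params = request.get_json(silent=True) or {}
    # Stand-in for launching the compiled analysis code with the user's parameters.
    jobs[job_id] = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(10)"])
    return jsonify({"job_id": job_id, "parameters": params}), 202

@app.route("/jobs/<job_id>", methods=["GET"])
def status(job_id):
    """Report whether the analysis has finished; a page can poll this via AJAX."""
    proc = jobs.get(job_id)
    if proc is None:
        return jsonify({"error": "unknown job"}), 404
    state = "running" if proc.poll() is None else "finished"
    return jsonify({"job_id": job_id, "state": state})

if __name__ == "__main__":
    app.run(port=8080)
```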

  9. Materializing the web of linked data

    CERN Document Server

    Konstantinou, Nikolaos

    2015-01-01

    This book explains the Linked Data domain by adopting a bottom-up approach: it introduces the fundamental Semantic Web technologies and building blocks, which are then combined into methodologies and end-to-end examples for publishing datasets as Linked Data, and use cases that harness scholarly information and sensor data. It presents how Linked Data is used for web-scale data integration, information management and search. Special emphasis is given to the publication of Linked Data from relational databases as well as from real-time sensor data streams. The authors also trace the transformation from the document-based World Wide Web into a Web of Data. Materializing the Web of Linked Data is addressed to researchers and professionals studying software technologies, tools and approaches that drive the Linked Data ecosystem, and the Web in general.

  10. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    Science.gov (United States)

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties to get locally-installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS intermediates the access to registered tools by providing front-end and back-end web services. Programmers can install applications in HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running in simple computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requisitions, and automatically creates a web page that disposes the registered applications and clients. Bioinformatics open web services registered applications can be accessed from virtually any programming language through web services, or using standard java clients. The back-end can run in HPC clusters, allowing bioinformaticians to remotely run high-processing demand applications directly from their machines.
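
    The back-end pattern described above, applications on an HPC cluster polling a front-end web service for new jobs and posting the results back, might look roughly like the following sketch. The endpoint URLs, payload fields and the stand-in tool are assumptions made for illustration, not the actual BOWS API.

```python
import time

import requests  # third-party HTTP client, used here for brevity

FRONTEND = "https://bows.example.org/api"  # hypothetical base URL, not the real BOWS endpoint

def run_tool(params):
    """Stand-in for the locally installed bioinformatics tool."""
    return {"answer": sum(params.get("values", []))}

def worker_loop(poll_seconds=30):
    """Poll the front end for pending jobs, run them locally, post the results back."""
    while True:
        pending = requests.get(f"{FRONTEND}/jobs?status=pending", timeout=30).json()
        for job in pending:
            result = run_tool(job.get("parameters", {}))
            requests.post(f"{FRONTEND}/jobs/{job['id']}/result", json=result, timeout=30)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    worker_loop()
```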

  11. Toward a Critique of Surveillance in the Age of the Internet: A Reflection on the “Internet and Surveillance” Volume Edited by Fuchs, Boersma, Albrechtslund, and Sandoval

    Directory of Open Access Journals (Sweden)

    Jernej Prodnik

    2012-02-01

    Full Text Available This article is a reflection on the following book (edited volume): Fuchs, Christian, Kees Boersma, Anders Albrechtslund, and Marisol Sandoval, eds. 2012. Internet and Surveillance: The Challenges of Web 2.0 and Social Media. New York: Routledge.

  12. Geoinformation web-system for processing and visualization of large archives of geo-referenced data

    Science.gov (United States)

    Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.

    2010-12-01

    A working model of an information-computational system aimed at scientific research in the area of climate change is presented. The system will allow processing and analysis of large archives of geophysical data obtained both from observations and from modeling. Accumulated experience in developing information-computational web-systems providing computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al, 2007; Okladnikov et al, 2008; Titov et al, 2009). The functional capabilities of the system comprise a set of procedures for mathematical and statistical analysis, processing and visualization of data. At present five archives of data are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES XX Century Global Reanalysis Version I. To provide data processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. Currently a set of computational modules for climate change indices approved by WMO is available. A special module providing visualization of results and output to Encapsulated PostScript, GeoTIFF and ESRI shape files was also developed. As a technological basis for the representation of cartographical information on the Internet, the GeoServer software conforming to OpenGIS standards is used. Integration of GIS functionality with web-portal software has been performed to provide a basis for the web portal's development as part of the geoinformation web-system. Such a geoinformation web-system is a next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms. It will allow a wide range of researchers to work with geophysical data without specific programming
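
    As a concrete example of the kind of WMO-approved climate-change index such a computational module evaluates, the sketch below computes the ETCCDI "frost days" index (annual count of days with daily minimum temperature below 0 °C) from a NumPy array of daily minima; the data here are synthetic and the module structure is not that of the system described above.

```python
import numpy as np

def frost_days(daily_tmin_celsius):
    """ETCCDI FD index: number of days with daily minimum temperature below 0 degC."""
    tmin = np.asarray(daily_tmin_celsius, dtype=float)
    return int(np.sum(tmin < 0.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic year of daily minimum temperatures for a mid-latitude site.
    tmin = 5.0 + 12.0 * np.sin(np.linspace(0, 2 * np.pi, 365)) + rng.normal(0, 3, 365)
    print("Frost days:", frost_days(tmin))
```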

  13. Teaching Web 2.0 technologies using Web 2.0 technologies.

    Science.gov (United States)

    Rethlefsen, Melissa L; Piorun, Mary; Prince, J Dale

    2009-10-01

    The research evaluated participant satisfaction with the content and format of the "Web 2.0 101: Introduction to Second Generation Web Tools" course and measured the impact of the course on participants' self-evaluated knowledge of Web 2.0 tools. The "Web 2.0 101" online course was based loosely on the Learning 2.0 model. Content was provided through a course blog and covered a wide range of Web 2.0 tools. All Medical Library Association members were invited to participate. Participants were asked to complete a post-course survey. Respondents who completed the entire course or who completed part of the course self-evaluated their knowledge of nine social software tools and concepts prior to and after the course using a Likert scale. Additional qualitative information about course strengths and weaknesses was also gathered. Respondents' self-ratings showed a significant change in perceived knowledge for each tool, using a matched pair Wilcoxon signed rank analysis (P<0.0001 for each tool/concept). Overall satisfaction with the course appeared high. Hands-on exercises were the most frequently identified strength of the course; the length and time-consuming nature of the course were considered weaknesses by some. Learning 2.0-style courses, though demanding time and self-motivation from participants, can increase knowledge of Web 2.0 tools.
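
    The statistical comparison reported above, a matched-pair Wilcoxon signed-rank test on self-rated knowledge before and after the course, can be reproduced in outline as follows; the Likert ratings in the example are invented, not the study's data.

```python
from scipy.stats import wilcoxon

# Invented example: self-rated knowledge of one Web 2.0 tool (1-5 Likert scale)
# for the same respondents before and after the course.
pre =  [2, 1, 3, 2, 2, 1, 3, 2, 1, 2, 3, 2]
post = [4, 3, 4, 4, 3, 2, 5, 4, 3, 4, 4, 3]

# Matched-pair Wilcoxon signed-rank test on the paired differences.
statistic, p_value = wilcoxon(pre, post)
print(f"W = {statistic}, p = {p_value:.4g}")
```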

  14. Genome editing for crop improvement: Challenges and opportunities.

    Science.gov (United States)

    Abdallah, Naglaa A; Prakash, Channapatna S; McHughen, Alan G

    2015-01-01

    Genome or gene editing includes several new techniques that help scientists precisely modify genome sequences. The techniques also enable us to alter the regulation of gene expression patterns in a pre-determined region and facilitate novel insights into the functional genomics of an organism. The emergence of genome editing has brought considerable excitement, especially among agricultural scientists, because of its simplicity, precision and power: it offers new opportunities to develop improved crop varieties with the clear-cut addition of valuable traits or removal of undesirable traits. Research is underway to improve crop varieties with higher yields, to strengthen stress tolerance and disease and pest resistance, to decrease input costs, and to increase nutritional value. Genome editing encompasses a wide variety of tools using either a site-specific recombinase (SSR) or a site-specific nuclease (SSN) system. Both systems require recognition of a known sequence. The SSN system generates single- or double-strand DNA breaks and activates endogenous DNA repair pathways. SSR technology, such as the Cre/loxP and Flp/FRT mediated systems, is able to knock down or knock in genes in the genome of eukaryotes, depending on the orientation of the specific sites (loxP, FRT, etc.) flanking the target site. There are 4 main classes of SSN developed to cleave genomic sequences: meganucleases (homing endonucleases), zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the CRISPR/Cas nuclease system (clustered regularly interspaced short palindromic repeats/CRISPR-associated protein). Recombinase-mediated genome engineering depends on the recombinase (sub-)family and target site and induces high frequencies of homologous recombination. Improving crops with gene editing provides a range of options: altering only a few nucleotides from the billions found in the genomes of living cells, altering a full allele, or inserting a new gene in a targeted region of

  15. Marketing your medical practice with an effective web presence.

    Science.gov (United States)

    Finch, Tammy

    2004-01-01

    The proliferation of the World Wide Web has provided an opportunity for medical practices to sell themselves through low-cost marketing on the Internet. A Web site is a quick and effective way to provide patients with up-to-date treatment and procedure information. This article provides suggestions on what to include on a medical practice's Web site, how the Web can assist office staff and physicians, and cost options for your Web site. The article also discusses design tips, such as Web-site optimization.

  16. A Framework for Dynamic Web Services Composition

    NARCIS (Netherlands)

    Lécué, F.; Goncalves da Silva, Eduardo; Ferreira Pires, Luis

    2007-01-01

    Dynamic composition of web services is a promising approach and at the same time a challenging research area for the dissemination of service-oriented applications. It is widely recognised that service semantics is a key element for the dynamic composition of Web services, since it allows the

  17. Statistical Physics Approaches to RNA Editing

    Science.gov (United States)

    Bundschuh, Ralf

    2012-02-01

    The central dogma of molecular biology states that DNA is transcribed base by base into RNA, which is in turn translated into proteins. However, some organisms edit their RNA before translation by inserting, deleting, or substituting individual bases or short stretches of bases. In many instances the mechanisms by which an organism recognizes the positions at which to edit, or by which it performs the actual editing, are unknown. One model system that stands out by its very high rate of, on average, one out of 25 bases being edited are the Myxomycetes, a class of slime molds. In this talk we will show how computational methods and concepts from statistical physics can be used to analyze DNA and protein sequence data to predict editing sites in these slime molds and to guide experiments that identified previously unknown types of editing as well as the complete set of editing events in the slime mold Physarum polycephalum.

  18. OPTIMALISASI EDITING GREEN SCREEN MENGGUNAKAN TEKNIK LIGHTING PADA CHROMA KEY

    Directory of Open Access Journals (Sweden)

    Arin Yuli Astuti

    2016-10-01

    Full Text Available In the film world, green screens and blue screens are already widely used as background technology. This technology is very valuable because filmmakers can change the background to any desired setting or atmosphere without having to capture images on location. Chroma key is a technique for combining two images or video frames, in which a chosen background colour in one image is removed (made transparent) and replaced by another image behind it. A weakness of chroma key is that it is difficult to key cleanly around hair during editing. Here the authors investigate how to minimize this weakness, removing artefacts around the edges of the hair when editing green-screen footage, by optimizing the lighting used with the chroma key.
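
    For readers unfamiliar with the mechanics, the sketch below shows the basic chroma-key operation on arrays of RGB pixels: pixels whose green channel dominates are treated as background and replaced by the corresponding pixels of another image. This is a simplified illustration with synthetic images and an assumed dominance threshold, not the lighting-optimization procedure studied in the article.

```python
import numpy as np

def chroma_key(foreground, background, dominance=1.3):
    """Replace green-dominant pixels of `foreground` with pixels from `background`.

    Both images are uint8 arrays of shape (H, W, 3). A pixel counts as green screen
    when its green channel exceeds `dominance` times both the red and blue channels.
    """
    fg = foreground.astype(float)
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    mask = (g > dominance * r) & (g > dominance * b)  # True where the screen shows through
    out = foreground.copy()
    out[mask] = background[mask]
    return out

if __name__ == "__main__":
    h, w = 4, 4
    fg = np.zeros((h, w, 3), dtype=np.uint8)
    fg[...] = (20, 200, 30)          # mostly green screen
    fg[1, 1] = (180, 120, 100)       # one "subject" pixel that should be kept
    bg = np.full((h, w, 3), (10, 10, 120), dtype=np.uint8)  # blue replacement background
    print(chroma_key(fg, bg)[:, :, 1])  # green channel after keying
```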

  19. SFO-Project: The New Generation of Sharable, Editable and Open-Access CFD Tutorials

    Science.gov (United States)

    Javaherchi, Teymour; Javaherchi, Ardeshir; Aliseda, Alberto

    2016-11-01

    One of the most common approaches to developing a Computational Fluid Dynamics (CFD) simulation for a new case study of interest is to search for the most similar, previously developed and validated CFD simulation among other works. A simple search results in a pool of written/visual tutorials. However, users must spend a significant amount of time and effort to find the most correct, compatible and valid tutorial in this pool and further modify it toward their simulation of interest. SFO is an open-source project with the core idea of saving the above-mentioned time and effort. This is done via documenting/sharing scientific and methodological approaches to developing CFD simulations for a wide spectrum of fundamental and industrial case studies in three different CFD solvers: STAR-CCM+, FLUENT and OpenFOAM (SFO). All of the steps and required files of these tutorials are accessible and editable under the common roof of GitHub (a web-based Git repository hosting service). In this presentation we will present the current library of 20+ developed CFD tutorials, discuss the idea and benefit of using them and their educational value, and explain how the next generation of open-access and live CFD tutorial resources can be built further hand-in-hand within our community.

  20. Using the World Wide Web for GIDEP Problem Data Processing at Marshall Space Flight Center

    Science.gov (United States)

    McPherson, John W.; Haraway, Sandra W.; Whirley, J. Don

    1999-01-01

    Since April 1997, Marshall Space Flight Center has been using electronic transfer and the web to support our processing of the Government-Industry Data Exchange Program (GIDEP) and NASA ALERT information. Specific aspects include: (1) Extraction of ASCII text information from GIDEP for loading into Word documents for e-mail to ALERT actionees; (2) Downloading of GIDEP form image formats in Adobe Acrobat (.pdf) for internal storage and display on the MSFC ALERT web page; (3) Linkage of stored GIDEP problem forms with summary information for access from the MSFC ALERT Distribution Summary Chart or from an html table of released MSFC ALERTs; (4) Archival of historic ALERTs for reference by GIDEP ID, MSFC ID, or MSFC release date; (5) On-line tracking of ALERT response status using a Microsoft Access database and the web; (6) On-line response to ALERTs from MSFC actionees through interactive web forms. The technique, benefits, effort, coordination, and lessons learned for each aspect are covered herein.

  1. A Methodology for Integrating Tools in a Web-Based Environment

    National Research Council Canada - National Science Library

    Arslan, Musa

    2000-01-01

    … The Internet and the World Wide Web are getting more important and bigger than ever. Because of the increase in the importance of the Internet and the Web, migrating old applications and tools to a web-based environment is becoming more important...

  2. DelPhi web server v2: incorporating atomic-style geometrical figures into the computational protocol.

    Science.gov (United States)

    Smith, Nicholas; Witham, Shawn; Sarkar, Subhra; Zhang, Jie; Li, Lin; Li, Chuan; Alexov, Emil

    2012-06-15

    A new edition of the DelPhi web server, DelPhi web server v2, has been released to include atomic-style presentation of geometrical figures. These geometrical objects can be used to model nano-sized objects together with real biological macromolecules. The position and size of the objects can be manipulated by the user in real time until the desired results are achieved. The server fixes structural defects, adds hydrogen atoms and calculates electrostatic energies and the corresponding electrostatic potential and ionic distributions. The web server follows a client-server architecture built on PHP and HTML and utilizes the DelPhi software. The computation is carried out on a supercomputer cluster and results are returned to the user via the http protocol, including the ability to visualize the structure and corresponding electrostatic potential via a Jmol implementation. The DelPhi web server is available from http://compbio.clemson.edu/delphi_webserver.

  3. Genome editing in plants: Advancing crop transformation and overview of tools.

    Science.gov (United States)

    Shah, Tariq; Andleeb, Tayyaba; Lateef, Sadia; Noor, Mehmood Ali

    2018-05-07

    Genome manipulation technology is an emerging field that is bringing a real revolution to genetic engineering and biotechnology. Targeted editing of genomes paves the way toward a wide range of goals, not only improving the quality and productivity of crops but also permitting investigation of the fundamental roots of biological systems. These goals include the creation of plants with valued compositional properties and with characters that confer resistance to numerous biotic and abiotic stresses. Numerous novel genome editing systems have been introduced during the past few years; these comprise zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and clustered regularly interspaced short palindromic repeats/Cas9 (CRISPR/Cas9). Genome editing is a promising route to improving average yields to meet the demands of a growing world food shortage and to establishing a feasible and environmentally safe agricultural scheme that is more specific, productive, cost-effective and eco-friendly. These exciting novel methods, concisely reviewed herein, have proven themselves to be efficient and reliable tools for the genetic improvement of plants. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  4. Factors influencing the adoption of the World Wide Web for job-seeking in South Africa

    Directory of Open Access Journals (Sweden)

    Fernando Pavon

    2010-10-01

    Full Text Available In the past decade, the use of the World Wide Web (WWW) as a tool for job-seeking and recruitment has increased globally, changing the dynamics for job-seekers and recruitment organisations. The purpose of this study was to gain greater insight into the factors that influence the adoption of the Internet (WWW) for job-seeking within a South African context. The impact of Internet (WWW) and newspaper-reading habits on the adoption process was of specific interest. Data was gathered by survey through telephonic interviews with 228 job seekers applying for information technology (IT) work in Cape Town, South Africa. The findings show that the income of a job-seeker influences the favourability of the internet facilitating conditions they encounter. Facilitating conditions in turn influence Internet (WWW) usage habits. Such habits influence performance expectancy, effort expectancy and intentions to use the Internet (WWW) for job-seeking. The actual extent of Internet (WWW) usage for job-seeking is positively influenced by these usage intentions and negatively influenced by newspaper-reading habits. These and other findings are discussed and implications drawn.

  5. Factors influencing the adoption of the World Wide Web for job-seeking in South Africa

    Directory of Open Access Journals (Sweden)

    Fernando Pavon

    2010-08-01

    Full Text Available In the past decade, the use of the World Wide Web (WWW) as a tool for job-seeking and recruitment has increased globally, changing the dynamics for job-seekers and recruitment organisations. The purpose of this study was to gain greater insight into the factors that influence the adoption of the Internet (WWW) for job-seeking within a South African context. The impact of Internet (WWW) and newspaper-reading habits on the adoption process was of specific interest. Data was gathered by survey through telephonic interviews with 228 job seekers applying for information technology (IT) work in Cape Town, South Africa. The findings show that the income of a job-seeker influences the favourability of the internet facilitating conditions they encounter. Facilitating conditions in turn influence Internet (WWW) usage habits. Such habits influence performance expectancy, effort expectancy and intentions to use the Internet (WWW) for job-seeking. The actual extent of Internet (WWW) usage for job-seeking is positively influenced by these usage intentions and negatively influenced by newspaper-reading habits. These and other findings are discussed and implications drawn.

  6. Advancing translational research with the Semantic Web

    Science.gov (United States)

    Ruttenberg, Alan; Clark, Tim; Bug, William; Samwald, Matthias; Bodenreider, Olivier; Chen, Helen; Doherty, Donald; Forsberg, Kerstin; Gao, Yong; Kashyap, Vipul; Kinoshita, June; Luciano, Joanne; Marshall, M Scott; Ogbuji, Chimezie; Rees, Jonathan; Stephens, Susie; Wong, Gwendolyn T; Wu, Elizabeth; Zaccagnini, Davide; Hongsermeier, Tonya; Neumann, Eric; Herman, Ivan; Cheung, Kei-Hoi

    2007-01-01

    Background A fundamental goal of the U.S. National Institute of Health (NIH) "Roadmap" is to strengthen Translational Research, defined as the movement of discoveries in basic research to application at the clinical level. A significant barrier to translational research is the lack of uniformly structured data across related biomedical domains. The Semantic Web is an extension of the current Web that enables navigation and meaningful use of digital resources by automatic processes. It is based on common formats that support aggregation and integration of data drawn from diverse sources. A variety of technologies have been built on this foundation that, together, support identifying, representing, and reasoning across a wide range of biomedical data. The Semantic Web Health Care and Life Sciences Interest Group (HCLSIG), set up within the framework of the World Wide Web Consortium, was launched to explore the application of these technologies in a variety of areas. Subgroups focus on making biomedical data available in RDF, working with biomedical ontologies, prototyping clinical decision support systems, working on drug safety and efficacy communication, and supporting disease researchers navigating and annotating the large amount of potentially relevant literature. Results We present a scenario that shows the value of the information environment the Semantic Web can support for aiding neuroscience researchers. We then report on several projects by members of the HCLSIG, in the process illustrating the range of Semantic Web technologies that have applications in areas of biomedicine. Conclusion Semantic Web technologies present both promise and challenges. Current tools and standards are already adequate to implement components of the bench-to-bedside vision. On the other hand, these technologies are young. Gaps in standards and implementations still exist and adoption is limited by typical problems with early technology, such as the need for a critical mass of
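
    To make the "making biomedical data available in RDF" step concrete, the sketch below builds a two-triple RDF graph with the rdflib library and serializes it as Turtle; the namespace and resource names are invented for the example and do not come from the HCLSIG projects.

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef

# Invented example namespace; not an HCLSIG vocabulary.
EX = Namespace("http://example.org/biomed#")

g = Graph()
g.bind("ex", EX)

gene = URIRef(EX["BRCA1"])
g.add((gene, RDF.type, EX.Gene))
g.add((gene, EX.associatedWith, Literal("breast cancer susceptibility")))

# Serialize the graph as Turtle, one of the common RDF exchange syntaxes.
print(g.serialize(format="turtle"))
```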

  7. Advancing translational research with the Semantic Web.

    Science.gov (United States)

    Ruttenberg, Alan; Clark, Tim; Bug, William; Samwald, Matthias; Bodenreider, Olivier; Chen, Helen; Doherty, Donald; Forsberg, Kerstin; Gao, Yong; Kashyap, Vipul; Kinoshita, June; Luciano, Joanne; Marshall, M Scott; Ogbuji, Chimezie; Rees, Jonathan; Stephens, Susie; Wong, Gwendolyn T; Wu, Elizabeth; Zaccagnini, Davide; Hongsermeier, Tonya; Neumann, Eric; Herman, Ivan; Cheung, Kei-Hoi

    2007-05-09

    A fundamental goal of the U.S. National Institute of Health (NIH) "Roadmap" is to strengthen Translational Research, defined as the movement of discoveries in basic research to application at the clinical level. A significant barrier to translational research is the lack of uniformly structured data across related biomedical domains. The Semantic Web is an extension of the current Web that enables navigation and meaningful use of digital resources by automatic processes. It is based on common formats that support aggregation and integration of data drawn from diverse sources. A variety of technologies have been built on this foundation that, together, support identifying, representing, and reasoning across a wide range of biomedical data. The Semantic Web Health Care and Life Sciences Interest Group (HCLSIG), set up within the framework of the World Wide Web Consortium, was launched to explore the application of these technologies in a variety of areas. Subgroups focus on making biomedical data available in RDF, working with biomedical ontologies, prototyping clinical decision support systems, working on drug safety and efficacy communication, and supporting disease researchers navigating and annotating the large amount of potentially relevant literature. We present a scenario that shows the value of the information environment the Semantic Web can support for aiding neuroscience researchers. We then report on several projects by members of the HCLSIG, in the process illustrating the range of Semantic Web technologies that have applications in areas of biomedicine. Semantic Web technologies present both promise and challenges. Current tools and standards are already adequate to implement components of the bench-to-bedside vision. On the other hand, these technologies are young. Gaps in standards and implementations still exist and adoption is limited by typical problems with early technology, such as the need for a critical mass of practitioners and installed base

  8. Advancing translational research with the Semantic Web

    Directory of Open Access Journals (Sweden)

    Marshall M Scott

    2007-05-01

    Full Text Available Abstract Background A fundamental goal of the U.S. National Institute of Health (NIH) "Roadmap" is to strengthen Translational Research, defined as the movement of discoveries in basic research to application at the clinical level. A significant barrier to translational research is the lack of uniformly structured data across related biomedical domains. The Semantic Web is an extension of the current Web that enables navigation and meaningful use of digital resources by automatic processes. It is based on common formats that support aggregation and integration of data drawn from diverse sources. A variety of technologies have been built on this foundation that, together, support identifying, representing, and reasoning across a wide range of biomedical data. The Semantic Web Health Care and Life Sciences Interest Group (HCLSIG), set up within the framework of the World Wide Web Consortium, was launched to explore the application of these technologies in a variety of areas. Subgroups focus on making biomedical data available in RDF, working with biomedical ontologies, prototyping clinical decision support systems, working on drug safety and efficacy communication, and supporting disease researchers navigating and annotating the large amount of potentially relevant literature. Results We present a scenario that shows the value of the information environment the Semantic Web can support for aiding neuroscience researchers. We then report on several projects by members of the HCLSIG, in the process illustrating the range of Semantic Web technologies that have applications in areas of biomedicine. Conclusion Semantic Web technologies present both promise and challenges. Current tools and standards are already adequate to implement components of the bench-to-bedside vision. On the other hand, these technologies are young. Gaps in standards and implementations still exist and adoption is limited by typical problems with early technology, such as the need

  9. Teachers' Attitudes Toward WebQuests as a Method of Teaching

    Science.gov (United States)

    Perkins, Robert; McKnight, Margaret L.

    2005-01-01

    One of the latest uses of technology gaining popular status in education is the WebQuest, a process that involves students using the World Wide Web to solve a problem. The goals of this project are to: (a) determine if teachers are using WebQuests in their classrooms; (b) ascertain whether teachers feel WebQuests are effective for teaching…

  10. POTENSI DAN KAIDAH PERANCANGAN SITUS-WEB SEBAGAI MEDIA KOMUNIKASI VISUAL

    Directory of Open Access Journals (Sweden)

    Freddy H. Istanto

    2001-01-01

    Full Text Available The world has been swept by a new model of communication: the Internet. One of its best-known forms is the website. As a medium for delivering information through visual communication, the website holds great potential and offers a broad field of work for designers. The rules of website design do not differ greatly from the basic rules of visual communication design. The essential elements of visual communication design, such as typography, illustration, symbolism, and photography, are the keys to designing the appearance of a website. Keywords: website, potential, design criteria, visual communication design

  11. Introduction to Webometrics Quantitative Web Research for the Social Sciences

    CERN Document Server

    Thelwall, Michael

    2009-01-01

    Webometrics is concerned with measuring aspects of the web: web sites, web pages, parts of web pages, words in web pages, hyperlinks, web search engine results. The importance of the web itself as a communication medium and for hosting an increasingly wide array of documents, from journal articles to holiday brochures, needs no introduction. Given this huge and easily accessible source of information, there are limitless possibilities for measuring or counting on a huge scale (e.g., the number of web sites, the number of web pages, the number of blogs) or on a smaller scale (e.g., the number o

  12. Canonical A-to-I and C-to-U RNA editing is enriched at 3'UTRs and microRNA target sites in multiple mouse tissues.

    Directory of Open Access Journals (Sweden)

    Tongjun Gu

    Full Text Available RNA editing is a process that modifies RNA nucleotides and changes the efficiency and fidelity of the central dogma. Enzymes that catalyze RNA editing are required for life, and defects in RNA editing are associated with many diseases. Recent advances in sequencing have enabled the genome-wide identification of RNA editing sites in mammalian transcriptomes. Here, we demonstrate that canonical RNA editing (A-to-I and C-to-U) occurs in liver, white adipose, and bone tissues of the laboratory mouse, and we show that apparent non-canonical editing (all other possible base substitutions) is an artifact of current high-throughput sequencing technology. Further, we report that high-confidence canonical RNA editing sites can cause non-synonymous amino acid changes and are significantly enriched in 3' UTRs, specifically at microRNA target sites, suggesting both regulatory and functional consequences for RNA editing.

  13. Human Genome Editing and Ethical Considerations.

    Science.gov (United States)

    Krishan, Kewal; Kanchan, Tanuj; Singh, Bahadur

    2016-04-01

    Editing human germline genes may act as a boon in some genetic and other disorders. Recent editing of the genome of a human embryo with the CRISPR/Cas9 editing tool generated a debate amongst top scientists of the world regarding the ethical implications of its effect on future generations. It remains to be seen what transformation human gene editing will bring to humankind in the times to come.

  14. Web Accessibility and Guidelines

    Science.gov (United States)

    Harper, Simon; Yesilada, Yeliz

    Access to, and movement around, complex online environments, of which the World Wide Web (Web) is the most popular example, has long been considered an important and major issue in the Web design and usability field. The commonly used slang phrase ‘surfing the Web’ implies rapid and free access, pointing to its importance among designers and users alike. It has also been long established that this potentially complex and difficult access is further complicated, and becomes neither rapid nor free, if the user is disabled. There are millions of people who have disabilities that affect their use of the Web. Web accessibility aims to help these people to perceive, understand, navigate, and interact with, as well as contribute to, the Web, and thereby the society in general. This accessibility is, in part, facilitated by the Web Content Accessibility Guidelines (WCAG) currently moving from version one to two. These guidelines are intended to encourage designers to make sure their sites conform to specifications, and in that conformance enable the assistive technologies of disabled users to better interact with the page content. In this way, it was hoped that accessibility could be supported. While this is in part true, guidelines do not solve all problems and the new WCAG version two guidelines are surrounded by controversy and intrigue. This chapter aims to establish the published literature related to Web accessibility and Web accessibility guidelines, and discuss limitations of the current guidelines and future directions.

  15. Effective gene editing by high-fidelity base editor 2 in mouse zygotes

    Directory of Open Access Journals (Sweden)

    Puping Liang

    2017-06-01

    Full Text Available ABSTRACT Targeted point mutagenesis through homologous recombination has been widely used in genetic studies and holds considerable promise for repairing disease-causing mutations in patients. However, problems such as mosaicism and low mutagenesis efficiency continue to pose challenges to clinical application of such approaches. Recently, a base editor (BE) system built on cytidine (C) deaminase and CRISPR/Cas9 technology was developed as an alternative method for targeted point mutagenesis in plant, yeast, and human cells. Base editors convert C in the deamination window to thymidine (T) efficiently; however, it remains unclear whether targeted base editing in mouse embryos is feasible. In this report, we generated a modified high-fidelity version of base editor 2 (HF2-BE2), and investigated its base editing efficacy in mouse embryos. We found that HF2-BE2 could convert C to T efficiently, with up to 100% biallelic mutation efficiency in mouse embryos. Unlike BE3, HF2-BE2 could convert C to T on both the target and non-target strand, expanding the editing scope of base editors. Surprisingly, we found HF2-BE2 could also deaminate C that was proximal to the gRNA-binding region. Taken together, our work demonstrates the feasibility of generating point mutations in mouse by base editing, and underscores the need to carefully optimize base editing systems in order to eliminate proximal-site deamination.

  16. Development of a CRISPR/Cas9 genome editing toolbox for Corynebacterium glutamicum.

    Science.gov (United States)

    Liu, Jiao; Wang, Yu; Lu, Yujiao; Zheng, Ping; Sun, Jibin; Ma, Yanhe

    2017-11-16

    Corynebacterium glutamicum is an important industrial workhorse and advanced genetic engineering tools are urgently demanded. Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) and their CRISPR-associated proteins (Cas) have revolutionized the field of genome engineering. The CRISPR/Cas9 system that utilizes NGG as protospacer adjacent motif (PAM) and has good targeting specificity can be developed into a powerful tool for efficient and precise genome editing of C. glutamicum. Herein, we developed a versatile CRISPR/Cas9 genome editing toolbox for C. glutamicum. Cas9 and gRNA expression cassettes were reconstituted to combat Cas9 toxicity and facilitate effective termination of gRNA transcription. Co-transformation of Cas9 and gRNA expression plasmids was exploited to overcome high-frequency mutation of cas9, allowing not only highly efficient gene deletion and insertion with plasmid-borne editing templates (efficiencies up to 60.0 and 62.5%, respectively) but also simple and time-saving operation. Furthermore, CRISPR/Cas9-mediated ssDNA recombineering was developed to precisely introduce small modifications and single-nucleotide changes into the genome of C. glutamicum with efficiencies over 80.0%. Notably, double-locus editing was also achieved in C. glutamicum. This toolbox works well in several C. glutamicum strains including the widely-used strains ATCC 13032 and ATCC 13869. In this study, we developed a CRISPR/Cas9 toolbox that could facilitate markerless gene deletion, gene insertion, precise base editing, and double-locus editing in C. glutamicum. The CRISPR/Cas9 toolbox holds promise for accelerating the engineering of C. glutamicum and advancing its application in the production of biochemicals and biofuels.

  17. A Typology for Web 2.0

    DEFF Research Database (Denmark)

    Dalsgaard, Christian; Sorensen, Elsebeth Korsgaard

    2008-01-01

    Web 2.0 is a term used to describe recent developments on the World Wide Web. The term is often used to describe the increased use of the web for user-generated content, collaboration, and social networking. However, Web 2.0 is a weakly defined concept, and it is unclear exactly what kind of technologies it covers. The objective of the paper is to develop a typology that can be used to categorize Web 2.0 technologies. Further, the paper will discuss which of these technologies are unique to Web 2.0. Often, Web 2.0 is described by way of different kinds of software; for instance, blogs, wikis, podcasts, RSS, and social networking sites. The problem with this type of description is that it fails to distinguish between different types or categories of technologies. As an alternative, the typology developed in the paper distinguishes between technologies on the basis of how - and in which contexts - they are used.

  18. Cluster editing

    DEFF Research Database (Denmark)

    Böcker, S.; Baumbach, Jan

    2013-01-01

    The Cluster Editing problem asks to transform a graph into a disjoint union of cliques using a minimum number of edge modifications. Although the problem has been proven NP-complete several times, it has nevertheless attracted much research both from the theoretical and the applied side. The problem has been the inspiration for numerous algorithms in bioinformatics, aiming at clustering entities such as genes, proteins, phenotypes, or patients. In this paper, we review exact and heuristic methods that have been proposed for the Cluster Editing problem, and also applications.

  19. Web Inventor Berners-Lee starts a Blog

    CERN Multimedia

    Olson, Parmy

    2005-01-01

    Berners-Lee created what is known today as the World Wide Web. Now, just in time for the Web's 15th anniversary and after taking his proverbial stroll around the global dwelling of cyberspace, Berners-Lee is chatting with the rest of us about what he thinks with a blog

  20. Surfing the web and parkinson's law.

    Science.gov (United States)

    Baldwin, F D

    1996-05-01

    The World Wide Web accounts for much of the popular interest in the Internet and offers a rich and variegated source of medical information. It's where you'll find online attractions ranging from "The Visible Human" to collections of lawyer jokes, as well as guides to clinical materials. Here's a basic introduction to the Web, its features, and its vocabulary.

  1. Web resources for myrmecologists

    DEFF Research Database (Denmark)

    Nash, David Richard

    2005-01-01

    The world wide web provides many resources that are useful to the myrmecologist. Here I provide a brief introduction to the types of information currently available, and to recent developments in data provision over the internet which are likely to become important resources for myrmecologists in the near future. I discuss the following types of web site, and give some of the most useful examples of each: taxonomy, identification and distribution; conservation; myrmecological literature; individual species sites; news and discussion; picture galleries; personal pages; portals.

  2. The Little Engines That Could: Modeling the Performance of World Wide Web Search Engines

    OpenAIRE

    Eric T. Bradlow; David C. Schmittlein

    2000-01-01

    This research examines the ability of six popular Web search engines, individually and collectively, to locate Web pages containing common marketing/management phrases. We propose and validate a model for search engine performance that is able to represent key patterns of coverage and overlap among the engines. The model enables us to estimate the typical additional benefit of using multiple search engines, depending on the particular set of engines being considered. It also provides an estim...

  3. Making Web 2.0 Work--From "Librarian Habilis" to "Librarian Sapiens"

    Science.gov (United States)

    Cvetkovic, Milica

    2009-01-01

    When people look back at the World Wide Web of 1996, there can be no doubt that today's web is better and more useful. Hyperlinking webpages and bookmarking were two of the most important aspects of the Web 1.0 world. Soon, though, usability and sharing became very high priorities, and Web 2.0 evolved. Information published in the Web 2.0 world…

  4. ADAR RNA editing in human disease; more to it than meets the I.

    Science.gov (United States)

    Gallo, Angela; Vukic, Dragana; Michalík, David; O'Connell, Mary A; Keegan, Liam P

    2017-09-01

    We review the structures and functions of ADARs and their involvement in human diseases. ADAR1 is widely expressed, particularly in the myeloid component of the blood system, and plays a prominent role in promiscuous editing of long dsRNA. Missense mutations that change ADAR1 residues and reduce RNA editing activity cause Aicardi-Goutières Syndrome, a childhood encephalitis and interferonopathy that mimics viral infection and resembles an extreme form of Systemic Lupus Erythematosus (SLE). In Adar1 mouse mutant models, aberrant interferon expression is prevented by eliminating interferon activation signaling from cytoplasmic dsRNA sensors, indicating that unedited cytoplasmic dsRNA drives the immune induction. On the other hand, upregulation of ADAR1 with widespread promiscuous RNA editing is a prominent feature of many cancers, and particular site-specific RNA editing events are also affected. ADAR2 is most highly expressed in brain and is primarily required for site-specific editing of CNS transcripts; recent findings indicate that ADAR2 editing is regulated by neuronal excitation for synaptic scaling of glutamate receptors. ADAR2 is also linked to the circadian clock and to sleep. Mutations in ADAR2 could contribute to excitability syndromes such as epilepsy, to seizures, to diseases involving neuronal plasticity defects, such as autism and Fragile-X Syndrome, to neurodegenerations such as ALS, or to astrocytomas or glioblastomas in which reduced ADAR2 activity is required for oncogenic cell behavior. The range of human disease associated with ADAR1 mutations may extend further to include other inflammatory conditions, while ADAR2 mutations may affect psychiatric conditions.

  5. Mining the Social Web Analyzing Data from Facebook, Twitter, LinkedIn, and Other Social Media Sites

    CERN Document Server

    Russell, Matthew

    2011-01-01

    Want to tap the tremendous amount of valuable social data in Facebook, Twitter, LinkedIn, and Google+? This refreshed edition helps you discover who's making connections with social media, what they're talking about, and where they're located. You'll learn how to combine social web data, analysis techniques, and visualization to find what you've been looking for in the social haystack-as well as useful information you didn't know existed. Each standalone chapter introduces techniques for mining data in different areas of the social Web, including blogs and email. All you need to get started

  6. WebGimm: An integrated web-based platform for cluster analysis, functional analysis, and interactive visualization of results.

    Science.gov (United States)

    Joshi, Vineet K; Freudenberg, Johannes M; Hu, Zhen; Medvedovic, Mario

    2011-01-17

    Cluster analysis methods have been extensively researched, but the adoption of new methods is often hindered by technical barriers in their implementation and use. WebGimm is a free cluster analysis web-service, and an open source general purpose clustering web-server infrastructure designed to facilitate easy deployment of integrated cluster analysis servers based on clustering and functional annotation algorithms implemented in R. Integrated functional analyses and interactive browsing of both clustering structure and functional annotations provide a complete analytical environment for cluster analysis and interpretation of results. The Java Web Start client-based interface is modeled after the familiar cluster/treeview packages, making its use intuitive to a wide array of biomedical researchers. For biomedical researchers, WebGimm provides an avenue to access state-of-the-art clustering procedures. For bioinformatics methods developers, WebGimm offers a convenient avenue to deploy their newly developed clustering methods. The WebGimm server, software, and manuals can be freely accessed at http://ClusterAnalysis.org/.
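
    As a rough illustration of the kind of hierarchical cluster analysis such a web service wraps (not WebGimm's actual API, which is R-based and accessed through its Java Web Start client), the following Python sketch clusters a small expression-like matrix; all data, names, and parameter choices are made up for the example.

```python
# Illustrative only: a generic hierarchical clustering of a small
# expression-like matrix, the kind of analysis a clustering web
# service such as WebGimm exposes. This is NOT WebGimm's API.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
data = rng.normal(size=(12, 6))        # 12 "genes" x 6 "samples" of toy data

dist = pdist(data, metric="correlation")   # correlation-based distance
tree = linkage(dist, method="average")     # average-linkage dendrogram

# Cut the tree into 3 flat clusters and report membership
labels = fcluster(tree, t=3, criterion="maxclust")
for gene, cluster in enumerate(labels, start=1):
    print(f"gene_{gene:02d} -> cluster {cluster}")
```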

  7. World directory of crystallographers and of other scientists employing crystallographic methods

    CERN Document Server

    Filippini, G; Hashizume, H; Torriani, I; Duax, W

    1995-01-01

    The 9th edition of the World Directory of Crystallographers and of Other Scientists Employing Crystallographic Methods, which contains 7907 entries embracing 72 countries, differs considerably from the 8th edition, published in 1990. The content has been updated, and the methods used to acquire the information presented and to produce this new edition of the Directory have involved the latest advances in technology. The Directory is now also available as a regularly updated electronic database, accessible via e-mail, Telnet, Gopher, World-Wide Web, and Mosaic. Full details are given in an Appendix to the printed edition.

  8. Decimal Classification Editions

    OpenAIRE

    Zenovia Niculescu

    2009-01-01

    The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as valuing the auxiliary notations.

  9. The genome editing revolution

    DEFF Research Database (Denmark)

    Stella, Stefano; Montoya, Guillermo

    2016-01-01

    In the last 10 years, we have witnessed a blooming of targeted genome editing systems and applications. The area was revolutionized by the discovery and characterization of the transcription activator-like effector proteins, which are easier to engineer to target new DNA sequences than ... sequence). This ribonucleoprotein complex protects bacteria from invading DNAs, and it was adapted to be used in genome editing. The CRISPR ribonucleic acid (RNA) molecule guides the Cas9 nuclease to the specific DNA site to cleave the DNA target. Two years and more than 1000 publications later, the CRISPR-Cas system has become the main tool for genome editing in many laboratories. Currently the targeted genome editing technology has been used in many fields and may be a possible approach for human gene therapy. Furthermore, it can also be used to modify the genomes of model organisms for studying human diseases.

  10. Fiber webs

    Science.gov (United States)

    Roger M. Rowell; James S. Han; Von L. Byrd

    2005-01-01

    Wood fibers can be used to produce a wide variety of low-density three-dimensional webs, mats, and fiber-molded products. Short wood fibers blended with long fibers can be formed into flexible fiber mats, which can be made by physical entanglement, nonwoven needling, or thermoplastic fiber melt matrix technologies. The most common types of flexible mats are carded, air...

  11. Human Germline Genome Editing.

    Science.gov (United States)

    Ormond, Kelly E; Mortlock, Douglas P; Scholes, Derek T; Bombard, Yvonne; Brody, Lawrence C; Faucett, W Andrew; Garrison, Nanibaa' A; Hercher, Laura; Isasi, Rosario; Middleton, Anna; Musunuru, Kiran; Shriner, Daniel; Virani, Alice; Young, Caroline E

    2017-08-03

    With CRISPR/Cas9 and other genome-editing technologies, successful somatic and germline genome editing are becoming feasible. To respond, an American Society of Human Genetics (ASHG) workgroup developed this position statement, which was approved by the ASHG Board in March 2017. The workgroup included representatives from the UK Association of Genetic Nurses and Counsellors, Canadian Association of Genetic Counsellors, International Genetic Epidemiology Society, and US National Society of Genetic Counselors. These groups, as well as the American Society for Reproductive Medicine, Asia Pacific Society of Human Genetics, British Society for Genetic Medicine, Human Genetics Society of Australasia, Professional Society of Genetic Counselors in Asia, and Southern African Society for Human Genetics, endorsed the final statement. The statement includes the following positions. (1) At this time, given the nature and number of unanswered scientific, ethical, and policy questions, it is inappropriate to perform germline gene editing that culminates in human pregnancy. (2) Currently, there is no reason to prohibit in vitro germline genome editing on human embryos and gametes, with appropriate oversight and consent from donors, to facilitate research on the possible future clinical applications of gene editing. There should be no prohibition on making public funds available to support this research. (3) Future clinical application of human germline genome editing should not proceed unless, at a minimum, there is (a) a compelling medical rationale, (b) an evidence base that supports its clinical use, (c) an ethical justification, and (d) a transparent public process to solicit and incorporate stakeholder input. Copyright © 2017 American Society of Human Genetics. All rights reserved.

  12. Social web and knowledge management

    DEFF Research Database (Denmark)

    Dolog, Peter; Kroetz, Markus; Schaffert, Sebastian

    2009-01-01

    Knowledge Management is the study and practice of representing, communicating, organizing, and applying knowledge in organizations. Moreover, being used by organizations, it is inherently social. The Web, as a medium, enables new forms of communications and interactions and requires new ways to represent knowledge assets. It is therefore obvious that the Web will influence and change Knowledge Management, but it is very unclear what the impact of these changes will be. This chapter raises questions and discusses visions in the area that connects the Social Web and Knowledge Management – an area of research that is only just emerging. The World Wide Web conference 2008 in Beijing hosted a workshop on that question, bringing together researchers and practitioners to gain first insights toward answering questions of that area.

  13. University of Glasgow at WebCLEF 2005

    DEFF Research Database (Denmark)

    Macdonald, C.; Plachouras, V.; He, B.

    2006-01-01

    We participated in the WebCLEF 2005 monolingual task. In this task, a search system aims to retrieve relevant documents from a multilingual corpus of Web documents from Web sites of European governments. Both the documents and the queries are written in a wide range of European languages. Each document is represented by three fields, namely content, title, and anchor text of incoming hyperlinks. We use a technique called per-field normalisation, which extends the Divergence From Randomness (DFR) framework, to normalise the term frequencies, and to combine them across the three fields. We also employ the length of the URL path of Web

  14. Frontiers in ICT towards web 3.0

    CERN Document Server

    Levnajic, Zoran

    2014-01-01

    Life without the World Wide Web has become unthinkable, much like life without electricity or water supply. We rely on the web to check public transport schedules, buy a ticket for a concert or exchange photos with friends. However, many everyday tasks cannot be accomplished by the computer itself, since the websites are designed to be read by people, not machines. In addition, the online information is often unstructured and poorly organized, leaving the user with tedious work of searching and filtering. This book takes us to the frontiers of the emerging Web 3.0 or Semantic Web - a new gener

  15. Integrating WebQuests in Preservice Teacher Education

    Science.gov (United States)

    Wang, Feng; Hannafin, Michael J.

    2008-01-01

    During the past decade, WebQuests have been widely used by teachers to integrate technology into teaching and learning. Recently, teacher educators have applied the WebQuest model with preservice teachers in order to develop technology integration skills akin to those used in everyday schools. Scaffolding, used to support the gradual acquisition…

  16. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged
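
    The record describes a layered REST interface (wmt-db, wmt-api, wmt-exe) that exchanges JSON-encoded metadata. The Python sketch below shows what a client of such a service could look like; the base URL, endpoint paths, payload fields, and component names are illustrative assumptions, not the documented CSDMS WMT API.

```python
# Minimal sketch of talking to a WMT-style REST service. The base URL
# and endpoint paths below are illustrative assumptions, not the
# documented CSDMS WMT API.
import json
import requests

BASE = "https://example.org/wmt-api"  # hypothetical endpoint

# Fetch JSON-encoded metadata for the available components
resp = requests.get(f"{BASE}/components", timeout=30)
resp.raise_for_status()
components = resp.json()
print("available components:", [c.get("name") for c in components])

# Post a (hypothetical) coupled-model description for later execution
model = {
    "name": "toy-coupled-model",
    "components": ["hydrotrend", "cem"],
    "parameters": {"run_duration": 365},
}
resp = requests.post(f"{BASE}/models",
                     data=json.dumps(model),
                     headers={"Content-Type": "application/json"},
                     timeout=30)
resp.raise_for_status()
print("saved model id:", resp.json().get("id"))
```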

  17. Connectivity editing for quadrilateral meshes

    KAUST Repository

    Peng, Chihan

    2011-12-12

    We propose new connectivity editing operations for quadrilateral meshes with the unique ability to explicitly control the location, orientation, type, and number of the irregular vertices (valence not equal to four) in the mesh while preserving sharp edges. We provide theoretical analysis on what editing operations are possible and impossible and introduce three fundamental operations to move and re-orient a pair of irregular vertices. We argue that our editing operations are fundamental, because they only change the quad mesh in the smallest possible region and involve the fewest irregular vertices (i.e., two). The irregular vertex movement operations are supplemented by operations for the splitting, merging, canceling, and aligning of irregular vertices. We explain how the proposed high-level operations are realized through graph-level editing operations such as quad collapses, edge flips, and edge splits. The utility of these mesh editing operations is demonstrated by improving the connectivity of quad meshes generated from state-of-the-art quadrangulation techniques. © 2011 ACM.

  18. Connectivity editing for quadrilateral meshes

    KAUST Repository

    Peng, Chihan; Zhang, Eugene; Kobayashi, Yoshihiro; Wonka, Peter

    2011-01-01

    We propose new connectivity editing operations for quadrilateral meshes with the unique ability to explicitly control the location, orientation, type, and number of the irregular vertices (valence not equal to four) in the mesh while preserving sharp edges. We provide theoretical analysis on what editing operations are possible and impossible and introduce three fundamental operations to move and re-orient a pair of irregular vertices. We argue that our editing operations are fundamental, because they only change the quad mesh in the smallest possible region and involve the fewest irregular vertices (i.e., two). The irregular vertex movement operations are supplemented by operations for the splitting, merging, canceling, and aligning of irregular vertices. We explain how the proposed high-level operations are realized through graph-level editing operations such as quad collapses, edge flips, and edge splits. The utility of these mesh editing operations is demonstrated by improving the connectivity of quad meshes generated from state-of-the-art quadrangulation techniques. © 2011 ACM.

  19. Ontology-Based Information Visualization: Toward Semantic Web Applications

    NARCIS (Netherlands)

    Fluit, Christiaan; Sabou, Marta; Harmelen, Frank van

    2006-01-01

    The Semantic Web is an extension of the current World Wide Web, based on the idea of exchanging information with explicit, formal, and machine-accessible descriptions of meaning. Providing information with such semantics will enable the construction of applications that have an increased awareness

  20. Harnessing the web information ecosystem with wiki-based visualization dashboards.

    Science.gov (United States)

    McKeon, Matt

    2009-01-01

    We describe the design and deployment of Dashiki, a public website where users may collaboratively build visualization dashboards through a combination of a wiki-like syntax and interactive editors. Our goals are to extend existing research on social data analysis into presentation and organization of data from multiple sources, explore new metaphors for these activities, and participate more fully in the web's information ecology by providing tighter integration with real-time data. To support these goals, our design includes novel and low-barrier mechanisms for editing and layout of dashboard pages and visualizations, connection to data sources, and coordinating interaction between visualizations. In addition to describing these technologies, we provide a preliminary report on the public launch of a prototype based on this design, including a description of the activities of our users derived from observation and interviews.

  1. Common Data Format (CDF) and Coordinated Data Analysis Web (CDAWeb)

    Science.gov (United States)

    Candey, Robert M.

    2010-01-01

    The Coordinated Data Analysis Web (CDAWeb) data browsing system provides plotting, listing, and open access via FTP, HTTP, and web services (REST, SOAP, OPeNDAP) for data from most NASA Heliophysics missions and is heavily used by the community. Combining data from many instruments and missions enables broad research analysis and correlation and coordination with other experiments and missions. Crucial to its effectiveness is the use of a standard self-describing data format, in this case the Common Data Format (CDF), also developed at the Space Physics Data Facility, and the use of metadata standards (easily edited with SKTeditor). CDAWeb is based on a set of IDL routines, CDAWlib. The CDF project also maintains software and services for translating between many standard formats (CDF, netCDF, HDF, FITS, XML).
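
    Since the record centers on the self-describing Common Data Format, a brief sketch of reading a CDF file may help. It assumes the third-party Python package cdflib is installed; the file name, variable name, and attribute name are placeholders rather than a specific CDAWeb product.

```python
# Minimal sketch, assuming the third-party `cdflib` package for reading
# Common Data Format (CDF) files. File name, variable name, and attribute
# name are placeholders, not a specific CDAWeb dataset.
import cdflib

cdf = cdflib.CDF("example_dataset.cdf")   # hypothetical local file

values = cdf.varget("B_field")            # placeholder variable name
attrs = cdf.varattsget("B_field")         # per-variable metadata attributes

print("data shape:", getattr(values, "shape", None))
print("units:", attrs.get("UNITS"))       # UNITS is a common, but assumed, attribute
```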

  2. Deletions in cox2 mRNA result in loss of splicing and RNA editing and gain of novel RNA editing sites.

    Directory of Open Access Journals (Sweden)

    Stefanie Grüttner

    Full Text Available As previously demonstrated, the maize cox2 RNA is fully edited in cauliflower mitochondria. Use of constructs with a deleted cox2 intron, however, led to a loss of RNA editing at almost all editing sites, with only a few sites still partially edited. Likewise, one deletion in exon 1 and three in exon 2 abolish RNA editing at all cox2 sites analyzed. Furthermore, intron splicing is abolished using these deletions. Mutation of a cytosine residue, which is normally edited and localized directly adjacent to the intron, to thymidine did not result in restoration of splicing, indicating that the loss of splicing was not due to loss of RNA editing. One deletion in exon 2 did not lead to loss of splicing. Instead, most editing sites were found to be edited, only three were not edited. Unexpectedly, we observed additional RNA editing events at new sites. Thus it appears that deletions in the cox2 RNA sequence can have a strong effect on RNA processing, leading to loss of splicing, loss of editing at all sites, or even to a gain of new editing sites. As these effects are not limited to the vicinity of the respective deletions, but appear to be widespread or even affect all editing sites, they may not be explained by the loss of PPR binding sites. Instead, it appears that several parts of the cox2 transcript are required for proper RNA processing. This indicates the roles of the RNA sequence and structural elements in the recognition of the editing sites.

  3. Decimal Classification Editions

    Directory of Open Access Journals (Sweden)

    Zenovia Niculescu

    2009-01-01

    Full Text Available The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as valuing the auxiliary notations.

  4. RNA Editing and Drug Discovery for Cancer Therapy

    Directory of Open Access Journals (Sweden)

    Wei-Hsuan Huang

    2013-01-01

    Full Text Available RNA editing is vital for providing the RNA and protein complexity that regulates gene expression. Correct RNA editing maintains cell function and organism development. Imbalance of the RNA editing machinery may lead to diseases and cancers. Recently, RNA editing has been recognized as a target for drug discovery, although few studies targeting RNA editing for disease and cancer therapy have been reported in the field of natural products. Therefore, RNA editing may be a potential target for therapeutic natural products. In this review, we provide a literature overview of the biological functions of RNA editing in gene expression, diseases, cancers, and drugs. Bioinformatics resources for RNA editing are also summarized.

  5. Renaissance of the Web

    Science.gov (United States)

    McCarty, M.

    2009-09-01

    The renaissance of the web has driven development of many new technologies that have forever changed the way we write software. The resulting tools have been applied both to solve problems and to create new ones in a wide range of domains, from monitor-and-control user interfaces to information distribution. This discussion covers which of these technologies are being used in the astronomical computing community, and how. Topics include JavaScript, Cascading Style Sheets, HTML, XML, JSON, RSS, iCalendar, Java, PHP, Python, Ruby on Rails, database technologies, and web frameworks/design patterns.

  6. FluDetWeb: an interactive web-based system for the early detection of the onset of influenza epidemics.

    Science.gov (United States)

    Conesa, David; López-Quílez, Antonio; Martínez-Beneito, Miguel Angel; Miralles, María Teresa; Verdejo, Francisco

    2009-07-29

    The early identification of influenza outbreaks has become a priority in public health practice. A large variety of statistical algorithms for the automated monitoring of influenza surveillance have been proposed, but most of them require not only a lot of computational effort but also operation of sometimes not-so-friendly software. In this paper, we introduce FluDetWeb, an implementation of a prospective influenza surveillance methodology based on a client-server architecture with a thin (web-based) client application design. Users can introduce and edit their own data consisting of a series of weekly influenza incidence rates. The system returns the probability of being in an epidemic phase (via e-mail if desired). When the probability is greater than 0.5, it also returns the probability of an increase in the incidence rate during the following week. The system also provides two complementary graphs. This system has been implemented using free statistical software (R and WinBUGS), a web server environment for Java code (Tomcat) and a software module created by us (Rdp) responsible for managing internal tasks; the software package MySQL has been used to construct the database management system. The implementation is available on-line from: http://www.geeitema.org/meviepi/fludetweb/. The ease of use of FluDetWeb and its on-line availability can make it a valuable tool for public health practitioners who want to obtain information about the probability that their system is in an epidemic phase. Moreover, the architecture described can also be useful for developers of systems based on computationally intensive methods.
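
    FluDetWeb itself estimates the probability of being in an epidemic phase with a Bayesian model implemented in R and WinBUGS. As a much simpler, purely illustrative stand-in for that idea, the Python sketch below flags weeks whose incidence rate exceeds a threshold derived from recent baseline weeks; this is not the FluDetWeb algorithm, and all data are invented.

```python
# Illustrative only: a naive threshold rule for flagging the likely onset of
# an epidemic phase from weekly incidence rates. FluDetWeb uses a Bayesian
# model in R/WinBUGS; this sketch is NOT that algorithm.
from statistics import mean, stdev

def epidemic_flags(rates, baseline_weeks=10, k=2.0):
    """Return True for weeks whose rate exceeds baseline mean + k * sd."""
    flags = []
    for i, rate in enumerate(rates):
        history = rates[max(0, i - baseline_weeks):i]
        if len(history) < 3:          # not enough history to set a baseline
            flags.append(False)
            continue
        threshold = mean(history) + k * stdev(history)
        flags.append(rate > threshold)
    return flags

weekly_rates = [12, 14, 11, 13, 15, 12, 14, 16, 30, 55, 80, 60]  # toy data
print(epidemic_flags(weekly_rates))
```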

  7. CRISPR Editing in Biological and Biomedical Investigation.

    Science.gov (United States)

    Ju, Xing-Da; Xu, Jing; Sun, Zhong Sheng

    2018-01-01

    The CRISPR (clustered regularly interspaced short palindromic repeat)-Cas (CRISPR-associated protein) system, a prokaryotic RNA-based adaptive immune system against viral infection, is emerging as a powerful genome editing tool in broad research areas. To further improve and expand its functionality, various CRISPR delivery strategies have been tested and optimized, and key CRISPR system components such as Cas protein have been engineered with different purposes. Benefiting from more in-depth understanding and further development of CRISPR, versatile CRISPR-based platforms for genome editing have been rapidly developed to advance investigations in biology and biomedicine. In biological research area, CRISPR has been widely adopted in both fundamental and applied research fields, such as genomic and epigenomic modification, genome-wide screening, cell and animal research, agriculture transforming, livestock breeding, food manufacture, industrial biotechnology, and gene drives in disease agents control. In biomedical research area, CRISPR has also shown its extensive applicability in the establishment of animal models for genetic disorders, generation of tissue donors, implementation of antimicrobial and antiviral studies, identification and assessment of new drugs, and even treatment for clinical diseases. However, there are still several problems to consider, and the biggest concerns are the off-target effects and ethical issues of this technology. In this prospect article, after highlighting recent development of CRISPR systems, we outline different applications and current limitations of CRISPR in biological and biomedical investigation. Finally, we provide a perspective on future development and potential risks of this multifunctional technology. J. Cell. Biochem. 119: 52-61, 2018. © 2017 Wiley Periodicals, Inc.

  8. Quality of Web-Based Information on Cannabis Addiction

    Science.gov (United States)

    Khazaal, Yasser; Chatton, Anne; Cochand, Sophie; Zullino, Daniele

    2008-01-01

    This study evaluated the quality of Web-based information on cannabis use and addiction and investigated particular content quality indicators. Three keywords ("cannabis addiction," "cannabis dependence," and "cannabis abuse") were entered into two popular World Wide Web search engines. Websites were assessed with a standardized proforma designed…

  9. Strategies of Qualitative Inquiry. Third Edition

    Science.gov (United States)

    Denzin, Norman K., Ed.; Lincoln, Yvonna S., Ed.

    2007-01-01

    "Strategies of Qualitative Inquiry, Third Edition," the second volume in the paperback version of "The SAGE Handbook of Qualitative Research, 3rd Edition," consists of Part III of the handbook ("Strategies of Inquiry"). "Strategies of Qualitative Inquiry, Third Edition" presents the major tactics--historically, the research methods--that…

  10. SAMP: Application Messaging for Desktop and Web Applications

    Science.gov (United States)

    Taylor, M. B.; Boch, T.; Fay, J.; Fitzpatrick, M.; Paioro, L.

    2012-09-01

    SAMP, the Simple Application Messaging Protocol, is a technology which allows tools to communicate. It is deployed in a number of desktop astronomy applications including ds9, Aladin, TOPCAT, World Wide Telescope and numerous others, and makes it straightforward for a user to treat a selection of these tools as a loosely-integrated suite, combining the most powerful features of each. It has been widely used within Virtual Observatory contexts, but is equally suitable for non-VO use. Enabling SAMP communication from web-based content has long been desirable. An obvious use case is arranging for a click on a web page link to deliver an image, table or spectrum to a desktop viewer, but more sophisticated two-way interaction with rich internet applications would also be possible. Use from the web however presents some problems related to browser sandboxing. We explain how the SAMP Web Profile, introduced in version 1.3 of the SAMP protocol, addresses these issues, and discuss the resulting security implications.
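
    Because the record describes SAMP as the protocol that lets desktop tools exchange messages such as "load this table", a small client example may be useful. The sketch below assumes the astropy.samp module and a SAMP hub already running on the desktop (for example, one started by TOPCAT or Aladin); the table URL and name are placeholders.

```python
# Minimal sketch, assuming astropy's SAMP client and a running SAMP hub
# (e.g. one started by TOPCAT or Aladin). The table URL is a placeholder.
from astropy.samp import SAMPIntegratedClient

client = SAMPIntegratedClient(name="example-sender")
client.connect()                       # register with the running hub
try:
    message = {
        "samp.mtype": "table.load.votable",           # standard SAMP message type
        "samp.params": {
            "url": "http://example.org/catalog.vot",  # placeholder table URL
            "name": "example catalog",
        },
    }
    client.notify_all(message)         # broadcast to all subscribed tools
finally:
    client.disconnect()
```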

  11. CRISPR/Cas9 Based Genome Editing of Penicillium chrysogenum.

    Science.gov (United States)

    Pohl, C; Kiel, J A K W; Driessen, A J M; Bovenberg, R A L; Nygård, Y

    2016-07-15

    CRISPR/Cas9 based systems have emerged as versatile platforms for precision genome editing in a wide range of organisms. Here we have developed powerful CRISPR/Cas9 tools for marker-based and marker-free genome modifications in Penicillium chrysogenum, a model filamentous fungus and industrially relevant cell factory. The developed CRISPR/Cas9 toolbox is highly flexible and allows editing of new targets with minimal cloning efforts. The Cas9 protein and the sgRNA can be either delivered during transformation, as preassembled CRISPR-Cas9 ribonucleoproteins (RNPs) or expressed from an AMA1 based plasmid within the cell. The direct delivery of the Cas9 protein with in vitro synthesized sgRNA to the cells allows for a transient method for genome engineering that may rapidly be applicable for other filamentous fungi. The expression of Cas9 from an AMA1 based vector was shown to be highly efficient for marker-free gene deletions.

  12. Searching for information on the World Wide Web with a search engine: a pilot study on cognitive flexibility in younger and older users.

    Science.gov (United States)

    Dommes, Aurelie; Chevalier, Aline; Rossetti, Marilyne

    2010-04-01

    This pilot study investigated the age-related differences in searching for information on the World Wide Web with a search engine. 11 older adults (6 men, 5 women; M age=59 yr., SD=2.76, range=55-65 yr.) and 12 younger adults (2 men, 10 women; M=23.7 yr., SD=1.07, range=22-25 yr.) had to conduct six searches differing in complexity, and for which a search method was or was not induced. The results showed that the younger and older participants provided with an induced search method were less flexible than the others and produced fewer new keywords. Moreover, older participants took longer than the younger adults, especially in the complex searches. The younger participants were flexible in the first request and spontaneously produced new keywords (spontaneous flexibility), whereas the older participants only produced new keywords when confronted by impasses (reactive flexibility). Aging may influence web searches, especially the nature of keywords used.

  13. Phylemon 2.0: a suite of web-tools for molecular evolution, phylogenetics, phylogenomics and hypotheses testing.

    Science.gov (United States)

    Sánchez, Rubén; Serra, François; Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Pulido, Luis; de María, Alejandro; Capella-Gutíerrez, Salvador; Huerta-Cepas, Jaime; Gabaldón, Toni; Dopazo, Joaquín; Dopazo, Hernán

    2011-07-01

    Phylemon 2.0 is a new release of the suite of web tools for molecular evolution, phylogenetics, phylogenomics and hypotheses testing. It has been designed as a response to the increasing demand for molecular sequence analyses from expert and non-expert users. Phylemon 2.0 has several unique features that differentiate it from other similar web resources: (i) it offers an integrated environment that enables evolutionary analyses, format conversion, file storage and edition of results; (ii) it suggests further analyses, thereby guiding the users through the web server; and (iii) it allows users to design and save phylogenetic pipelines to be used over multiple genes (phylogenomics). Altogether, Phylemon 2.0 integrates a suite of 30 tools covering sequence alignment reconstruction and trimming; tree reconstruction, visualization and manipulation; and evolutionary hypotheses testing.

  14. Phylemon 2.0: a suite of web-tools for molecular evolution, phylogenetics, phylogenomics and hypotheses testing

    Science.gov (United States)

    Sánchez, Rubén; Serra, François; Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Pulido, Luis; de María, Alejandro; Capella-Gutíerrez, Salvador; Huerta-Cepas, Jaime; Gabaldón, Toni; Dopazo, Joaquín; Dopazo, Hernán

    2011-01-01

    Phylemon 2.0 is a new release of the suite of web tools for molecular evolution, phylogenetics, phylogenomics and hypotheses testing. It has been designed as a response to the increasing demand for molecular sequence analyses from expert and non-expert users. Phylemon 2.0 has several unique features that differentiate it from other similar web resources: (i) it offers an integrated environment that enables evolutionary analyses, format conversion, file storage and edition of results; (ii) it suggests further analyses, thereby guiding the users through the web server; and (iii) it allows users to design and save phylogenetic pipelines to be used over multiple genes (phylogenomics). Altogether, Phylemon 2.0 integrates a suite of 30 tools covering sequence alignment reconstruction and trimming; tree reconstruction, visualization and manipulation; and evolutionary hypotheses testing. PMID:21646336

  15. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    Science.gov (United States)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and, committing resources from managing time to populate the database and training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals, lessons learned in the Web-to-database process (including setting up Database Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Standard Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
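
    The record describes driving web pages from a Microsoft Access database reached through an ODBC Data Source Name (DSN). As a hedged, modern-language illustration of the same query-over-a-DSN idea (not the IDA library's actual ASP code), the Python sketch below assumes the third-party pyodbc package and an already-configured DSN; the DSN, table, and column names are placeholders.

```python
# Minimal sketch, assuming the third-party `pyodbc` package and an ODBC
# Data Source Name already configured for the Access database. The DSN,
# table, and column names are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=library_catalog")   # hypothetical DSN
cursor = conn.cursor()

# Parameterized query, the safe equivalent of building SQL inside an ASP page
subject = "web design"
cursor.execute("SELECT title, url FROM resources WHERE subject = ?", subject)
for title, url in cursor.fetchall():
    print(title, url)

conn.close()
```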

  16. Abstracts of the Standard Edition of the Complete Psychological Works of Sigmund Freud.

    Science.gov (United States)

    Rothgeb, Carrie Lee, Ed.

    In order to make mental health-related knowledge available widely and in a form to encourage its use, the National Institute of Mental Health collaborated with the American Psychoanalytic Association in this pioneer effort to abstract the 23 volumes of the "Standard Edition of Freud." The volume is a comprehensive compilation of…

  17. Play Therapy with Children in Crisis: Individual, Group, and Family Treatment. Third Edition

    Science.gov (United States)

    Webb, Nancy Boyd, Ed.

    2007-01-01

    Now in a completely revised and updated third edition, this widely adopted casebook and text presents effective, creative approaches to helping children who have experienced such stressful situations as parental death or divorce, abuse and neglect, violence in the school or community, and natural disasters. While the book retains the focus on…

  18. Space Physics Data Facility Web Services

    Science.gov (United States)

    Candey, Robert M.; Harris, Bernard T.; Chimiak, Reine A.

    2005-01-01

    The Space Physics Data Facility (SPDF) Web services provides a distributed programming interface to a portion of the SPDF software. (A general description of Web services is available at http://www.w3.org/ and in many current software-engineering texts and articles focused on distributed programming.) The SPDF Web services distributed programming interface enables additional collaboration and integration of the SPDF software system with other software systems, in furtherance of the SPDF mission to lead collaborative efforts in the collection and utilization of space physics data and mathematical models. This programming interface conforms to all applicable Web services specifications of the World Wide Web Consortium. The interface is specified by a Web Services Description Language (WSDL) file. The SPDF Web services software consists of the following components: 1) A server program for implementation of the Web services; and 2) A software developer's kit that consists of a WSDL file, a less formal description of the interface, a Java class library (which further eases development of Java-based client software), and Java source code for an example client program that illustrates the use of the interface.
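
    Since the interface is specified by a WSDL file, a short sketch of consuming a WSDL-described SOAP service may clarify what that means in practice. It uses the third-party Python library zeep as a stand-in for the Java class library mentioned above; the WSDL URL and the operation name are illustrative assumptions, not the actual SPDF service address or interface.

```python
# Minimal sketch, assuming the third-party `zeep` SOAP library. The WSDL URL
# and operation name are placeholders, not the actual SPDF interface.
from zeep import Client

WSDL_URL = "https://example.org/spdf/service?wsdl"   # hypothetical WSDL location
client = Client(WSDL_URL)                             # parses the WSDL

# Operations described in the WSDL become methods on client.service;
# the operation name used here is purely illustrative.
observatories = client.service.getObservatories()
print(observatories)
```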

  19. Issues to Consider in Designing WebQuests: A Literature Review

    Science.gov (United States)

    Kurt, Serhat

    2012-01-01

    A WebQuest is an inquiry-based online learning technique. This technique has been widely adopted in K-16 education. Therefore, it is important that conditions of effective WebQuest design are defined. Through this article the author presents techniques for improving WebQuest design based on current research. More specifically, the author analyzes…

  20. Towards the Semantic Web

    NARCIS (Netherlands)

    Davies, John; Fensel, Dieter; Harmelen, Frank Van

    2003-01-01

    With the current changes driven by the expansion of the World Wide Web, this book uses a different approach from other books on the market: it applies ontologies to electronically available information to improve the quality of knowledge management in large and distributed organizations. Ontologies