Taylor, Edward N.; Franx, Marijn; Quadri, Ryan F.; Damen, Maaike; Hildebrandt, Hendrik; Van Dokkum, Pieter G.; Herrera, David; Gawiser, Eric; Bell, Eric F.; Barrientos, L. Felipe; Blanc, Guillermo A.; Castander, Francisco J.; Gonzalez-Perez, Violeta; Hall, Patrick B.; Kriek, Mariska; Labbe, Ivo; Lira, Paulina; Maza, Jose; Rudnick, Gregory; Treister, Ezequiel
We present a new, K-selected, optical-to-near-infrared photometric catalog of the Extended Chandra Deep Field South (ECDFS), making it publicly available to the astronomical community. Imaging and spectroscopy data and catalogs are freely available through the MUSYC Public Data Release webpage: http://www.astro.yale.edu/MUSYC/. The data set is founded on publicly available imaging, supplemented by original z'JK imaging data collected as part of the MUltiwavelength Survey by Yale-Chile (MUSYC). The final photometric catalog consists of photometry derived from UU38BVRIz'JK imaging covering the full ½° × ½° of the ECDFS, plus H-band photometry for approximately 80% of the field. The 5σ flux limit for point sources is K(AB,tot) = 22.0. This is also the nominal completeness and reliability limit of the catalog: the empirical completeness for 21.75 < K < 22.0 is ≳85%. We have verified the quality of the catalog through both internal consistency checks and comparisons to other existing and publicly available catalogs. As well as the photometric catalog, we also present catalogs of photometric redshifts and rest-frame photometry derived from the 10-band photometry. We have collected robust spectroscopic redshift determinations from published sources for 1966 galaxies in the catalog. Based on these sources, we have achieved a (1σ) photometric redshift accuracy of Δz/(1 + z) = 0.036, with an outlier fraction of 7.8%. Most of these outliers are X-ray sources. Finally, we describe and release a utility for interpolating rest-frame photometry from observed spectral energy distributions, dubbed InterRest. InterRest is available via http://www.strw.leidenuniv.nl/~ent/InterRest. Documentation and a complete walkthrough can be found at the same address.
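The quoted Δz/(1 + z) scatter and outlier fraction are standard photometric-redshift quality metrics. As an illustration (not the paper's actual code), a common convention is to measure the scatter with the normalized median absolute deviation (NMAD) of Δz/(1 + z) and to count outliers above a fixed cut; the 0.15 cut and the sample redshift pairs below are assumptions for illustration only:

```python
from statistics import median

def photoz_quality(z_spec, z_phot, outlier_cut=0.15):
    """Compute photo-z scatter and outlier fraction.

    The scatter uses the normalized median absolute deviation (NMAD)
    of dz/(1+z), a robust stand-in for the 1-sigma width. The outlier
    cut of |dz|/(1+z) > 0.15 is a common convention, assumed here;
    the catalog paper may use a different definition.
    """
    dz = [(zp - zs) / (1.0 + zs) for zp, zs in zip(z_phot, z_spec)]
    med = median(dz)
    sigma_nmad = 1.48 * median(abs(d - med) for d in dz)
    outlier_frac = sum(abs(d) > outlier_cut for d in dz) / len(dz)
    return sigma_nmad, outlier_frac

# Hypothetical spectroscopic/photometric redshift pairs:
z_spec = [0.30, 0.55, 1.10, 0.80, 2.00]
z_phot = [0.32, 0.50, 1.15, 0.78, 1.20]
sigma, f_out = photoz_quality(z_spec, z_phot)
```

With a real sample, `z_spec` would come from the published spectroscopic compilations the catalog cross-matches against.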
Cardamone, Carolin N.; Van Dokkum, Pieter G.; Urry, C. Megan; Brammer, Gabriel; Taniguchi, Yoshi; Gawiser, Eric; Bond, Nicholas; Taylor, Edward; Damen, Maaike; Treister, Ezequiel; Cobb, Bethany E.; Schawinski, Kevin; Lira, Paulina; Murayama, Takashi; Saito, Tomoki; Sumikawa, Kentaro
We present deep optical 18-medium-band photometry from the Subaru telescope over the ∼30' × 30' Extended Chandra Deep Field-South, as part of the Multiwavelength Survey by Yale-Chile (MUSYC). This field has a wealth of ground- and space-based ancillary data, and contains the GOODS-South field and the Hubble Ultra Deep Field. We combine the Subaru imaging with existing UBVRIzJHK and Spitzer IRAC images to create a uniform catalog. Detecting sources in the MUSYC 'BVR' image, we find ∼40,000 galaxies with R(AB) ≲ 25.3. For 0.1 < z < 1.2, we find a 1σ scatter in Δz/(1 + z) of 0.007, similar to results obtained with a similar filter set in the COSMOS field. As a demonstration of the data quality, we show that the red sequence and blue cloud can be cleanly identified in rest-frame color-magnitude diagrams at 0.1 < z < 1.2. We find that ∼20% of the red sequence galaxies show evidence of dust emission at longer rest-frame wavelengths. The reduced images, photometric catalog, and photometric redshifts are provided through the public MUSYC Web site.
Thorlund Jepsen, Erik; Seiden, Piet; Ingwersen, Peter Emil Rerup
Queries were generated based on specifically selected domain topics and searched for in three publicly accessible search engines (Google, AllTheWeb, and AltaVista). A sample of the retrieved hits was analyzed with regard to how various publication attributes correlated with the scientific quality of the content, and whether this information could be employed to harvest, filter, and rank Web publications. The attributes analyzed were inlinks, outlinks, bibliographic references, file format, language, search engine overlap, structural position (according to site structure), and the occurrence of various types of metadata. As could be expected, the ranked output differs between the three search engines. Apparently, this is caused by differences in ranking algorithms rather than in the databases themselves. In fact, because scientific Web content in this subject domain receives few inlinks, both AltaVista…
This article examines the nature and role of Web 2.0 resources and their impact on health information made available through the Internet. The transition of the Web from version one to Web 2.0 is described and the main features of the new Web are examined. Two characteristic Web 2.0 resources are explored, along with their implications for the public and practitioners. First, what are known as 'user reviews' or 'user testimonials', which allow people to comment on the health services delivered to them, are described. Second, new mapping applications that take advantage of the interactive potential of Web 2.0 and provide tools to visualize complex data are examined. Following a discussion of the potential of Web 2.0, it is concluded that it offers considerable opportunities for disseminating health information and creating new sources of data, as well as generating new questions and dilemmas.
Park, Moon Su; Lee, Young Wook; Kang, Chang Sun
Public acceptance has been a key factor in the nuclear industry, as in other fields, and there are many ways to obtain it. Public participation in policy making is one good tool for this purpose, and participation via the internet may be an excellent way to increase voluntary participation. In this paper, levels of electronic public participation are defined, and how easily and deeply the lay public can participate electronically is assessed for several organizations' web sites.
Libraries should offer their patrons web sites that establish the unmistakable concept of the (public) library, a concept that cannot be confused with other information brokers and services available on the Internet, but that, within this framework, shows a diversity which directs patrons to other (public) libraries. This can be achieved through reliability, quality of information and services, and safety of use. When this is achieved, patrons regard library web sites as important reference sources deserving continuous use for obtaining relevant information. Libraries justify investment in the development and maintenance of their web sites by the number of visits and by patron satisfaction. The presented research, based on a sample of Slovene public libraries' web sites, examines how the libraries fulfil their purpose and role, as well as the given professional recommendations in web site design. The results reveal the libraries' striving to modernise their functions: major attention is directed to the presentation of classic libraries and their activities, and less to the expansion of available content and electronic sources. This is significant because the diversity on offer is not a result of patrons' needs, but rather the consequence of improvisation and of too little attention to the selection, availability, organisation, and presentation of different kinds of information and services on the web sites. Based on the analysis of a common concept of the public library web site, the paper presents activities for improving the existing state of affairs.
Ten years ago, CERN issued a statement declaring that a little known piece of software called the World Wide Web was in the public domain. Nowadays, the Web is an indispensable part of modern communications. The idea for the Web goes back to March 1989, when CERN computer scientist Tim Berners-Lee wrote a proposal for a 'Distributed Information Management System' for the high-energy physics community. The Web was originally conceived and developed to meet the demand for information sharing between scientists working all over the world. There were many obstacles in the 1980s to the effective exchange of information. There was, for example, a great variety of computer and network systems, with hardly any common features. The main purpose of the Web was to allow scientists to access information from any source in a consistent and simple way. By Christmas 1990, Berners-Lee's idea had become the World Wide Web, with its first server and browser running at CERN. Through 1991, the Web spread to other particle physics ...
Luis Alejandro Casasola Balsells
This paper describes an analysis conducted in 2015 to evaluate the accessibility of content on Andalusian public university websites. In order to determine whether these websites are accessible, an assessment was carried out to check conformance with the latest Web Content Accessibility Guidelines (WCAG) 2.0 established by the World Wide Web Consortium (W3C). For this purpose, we designed a methodology for analysis that combines the use of three automatic tools (eXaminator, the MINHAP web accessibility tool, and TAW) with a manual analysis, to provide greater reliability and validity of the results. Although the results are acceptable overall, a detailed analysis shows that more is still needed to achieve full accessibility for the entire university community. In this respect, we suggest several corrections to common accessibility errors to facilitate the design of university web portals.
Web mapping sites are interactive maps accessed via Web pages. With the rapid development of the Internet and the Geographic Information System (GIS) field, public web mapping sites are no longer foreign to people. People now use these sites for various reasons, as increasing numbers of maps and related map services are freely available to end users. This growth in users has in turn led to more usability studies. Usability Engineering (UE), for instance, is an approach for analyzing and improving the usability of websites by examining and evaluating an interface. In this research, the UE method was employed to explore usability problems of four public web mapping sites, analyze the problems quantitatively, and provide guidelines for future design based on the test results. First, the development of usability studies is described, and several usability evaluation approaches, such as Usability Engineering (UE), User-Centered Design (UCD), and Human-Computer Interaction (HCI), are introduced. The method and procedure of the usability test are then presented in detail. In this experiment, four public web mapping sites (Google Maps, Bing Maps, MapQuest, Yahoo Maps) were chosen as the test websites, and 42 people of different GIS skill levels (test users or experts), genders, ages, and nationalities participated, completing several test tasks in different teams. The test comprised three parts: a pretest background questionnaire, several test tasks for quantitative statistics and process analysis, and a posttest questionnaire. The pretest and posttest questionnaires focused on gaining qualitative, verbal explanations of the participants' actions, while the test tasks were designed to gather quantitative data on the errors and problems of the websites. The results, mainly from the test tasks, were then analyzed.
Fluckiger, Francois; CERN. Geneva. IT Department
This note is an extended version of the article "Licensing the Web" (http://home.web.cern.ch/topics/birthweb/licensing-web), published by CERN in November 2013 in the "Birth of the Web" series of articles (http://home.cern/topics/birth-web). It describes the successive steps of the public release of the CERN Web software, from public domain to open source, and explains their rationale. It provides, in annexes, historical documents including the release announcement and the texts of the licences used by CERN and MIT in public software distributions.
This edited volume details multiple and dynamic histories of relations between public service broadcasters and the World Wide Web. What does it mean to be a national broadcaster in a global communications environment? What are the commercial and public service pressures that were brought to bear when public service broadcasters implemented web services? How did "one-to-many" broadcasters adapt to the "many-to-many" medium of the internet? The thematic organisation of this collection addresses such major issues, while each chapter offers a particular historical account of relations between public service broadcasters and the World Wide Web.
Smith, Kerry J.
Offers advice to librarians for marketing their Web sites on Internet search engines. Advises against relying solely on spiders and recommends adding metadata to the source code and delivering that information directly to the search engines. Gives an overview of metadata and typical coding for meta tags. Includes Web addresses for a number of…
Reed, Rajika E.; Bodzin, Alec M.
An interdisciplinary curriculum unit that used Web GIS mapping to investigate malaria disease patterns and spread in relation to the environment for a high school Advanced Placement Environmental Science course was developed. A feasibility study was conducted to investigate the efficacy of the unit to promote geospatial thinking and reasoning…
Dolamic, Ljiljana; Boyer, Célia
This paper describes and evaluates a public health web page classification model based on key-phrase extraction and matching. Easily extensible, both in terms of new classes and new languages, the method proves to be a good solution for text classification in the face of a total lack of training data. To evaluate the proposed solution, we used a small collection of public-health-related web pages created by double-blind manual classification. Our experiments have shown that by choosing an adequate threshold value, the desired value for either precision or recall can be achieved.
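As a sketch of how such key-phrase matching with a tunable threshold works (the class names, phrase lists, and threshold below are invented for illustration, not the authors' actual vocabulary or scoring rule):

```python
def classify(text, class_phrases, threshold=0.2):
    """Score a page against each class's key-phrase list and return the
    classes whose score clears the threshold.

    Score = fraction of a class's phrases found in the text. Raising
    the threshold favours precision; lowering it favours recall.
    """
    text_lower = text.lower()
    labels = []
    for label, phrases in class_phrases.items():
        hits = sum(1 for p in phrases if p.lower() in text_lower)
        score = hits / len(phrases)
        if score >= threshold:
            labels.append((label, score))
    # Best-matching class first.
    return sorted(labels, key=lambda x: -x[1])

# Illustrative classes with hypothetical phrase lists:
classes = {
    "vaccination": ["vaccine", "immunization", "booster dose"],
    "nutrition": ["dietary", "vitamin", "calorie intake"],
}
page = "New vaccine guidance recommends a booster dose for adults."
result = classify(page, classes, threshold=0.3)
```

Because no training data is needed, extending the classifier to a new class or language amounts to supplying a new phrase list.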
Pasachoff, Jay M.
As part of alerting the general public to the subtly spectacular transit of Venus as an intellectual marvel not available to us from Earth until AD 2117/2125, in addition to our scientific plans (Pasachoff et al., this meeting), I provided: (1) an article in the children's magazine Odyssey (May/June 2011); (2) a discussion in National Geographic Society's BreakingOrbit blog (March 1, 2011); (3) a year's advance notice as "June 5: Transit of Venus," 365daysofastronomy.org. (4) Nantes DPS: I participated in "Transits of Venus in Public Education and Contemporary Research" (http://transitofvenus.nl/wp/2011/10/16/four-giants-talk-about-transits). (5) 22-minute lecture on the Phi Beta Kappa website: http://www.pbk.org/home/playpodcast.aspx?id=772. (6) E/PO summary at Historical Astronomy Division News, #79, October. Closer to the event, I had (7) a Comment in Nature ("Transit of Venus: Last Chance to See," Nature 485, 303-304) and (8, 9) articles in Physics World, 25, 36-41, and Scientific American (http://www.scientificamerican.com/article.cfm?id=transit-venus-june-5). The day before the transit, (10) I had a radio/podcast Academic Minute (http://www.wamc.org/post/dr-jay-pasachoff-williams-college). (11) On transit day, I had an Op-Ed piece in The New York Times ("Learning from Celestial Beauty," http://www.nytimes.com/2012/06/05/opinion/learning-from-celestial-beauty.html) that was seen largely by a non-scientific audience. Subsequently, (12) I gave a Keck-Observatory-sponsored Waimea general-public lecture (http://keckobservatory.org/news/video_venus_transits_past_present_future) and (13) an invited public lecture at the AAS meeting in Anchorage (http://aas.org/meetings/aas220/video_session_127). (14) I had a podcast on 365daysofastronomy.org (June 29). (15) My article for Sky & Telescope appeared in its October issue. (16) My editorial "Syzygy x 3" will be in RASC Observer's Handbook 2013. (17) These efforts as well as links to history and science of transits
Proposed Enhancements to the Motor Carrier Safety Measurement System (SMS) Public Web Site [Federal Register notice, excerpted]. The agency announces proposed enhancements to the display of information on its Safety Measurement System (SMS) public Web site. On December 6, 2013, Advocates…
Trinity Adaptive Management Working Group; Public Meeting, Teleconference and Web-Based Meeting [Federal Register notice, excerpted]. The TAMWG affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River restoration to the Trinity Management Council (TMC). DATES: Public meeting, teleconference, and web-based meeting: TAMWG and…
Trinity Adaptive Management Working Group; Public Teleconference/Web-Based Meeting [Federal Register notice, excerpted]. The TAMWG affords stakeholders the opportunity to give policy, management, and technical input; the Fish and Wildlife Service announces a public teleconference/web-based meeting of the TAMWG.
Trinity Adaptive Management Working Group; Public Meeting, Teleconference and Web-Based Meeting [Federal Register notice, excerpted]. The TAMWG affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River restoration; the Fish and Wildlife Service announces a public meeting, teleconference and web-based meeting of the Trinity Adaptive Management Working Group.
Trinity Adaptive Management Working Group; Public Meeting, Teleconference and Web-Based Meeting [Federal Register notice, excerpted]. The TAMWG affords stakeholders the opportunity to give policy, management, and technical input concerning Trinity River (California) restoration; the Fish and Wildlife Service announces a public meeting, teleconference, and web-based meeting of the Trinity Adaptive Management Working Group.
Scotch, Matthew; Yip, Kevin Y; Cheung, Kei-Hoi
Development of public health informatics applications often requires the integration of multiple data sources. This process can be challenging due to issues such as different file formats, schemas, and naming systems, and the need to scrape the content of web pages. A potential solution to these system development challenges is the use of Web 2.0 technologies. In general, Web 2.0 technologies are new internet services that encourage and value information sharing and collaboration among individuals. In this case report, we describe the development and use of Web 2.0 technologies, including Yahoo! Pipes, within a public health application that integrates animal, human, and temperature data to assess the risk of West Nile Virus (WNV) outbreaks. The results of development and testing suggest that while Web 2.0 applications are reasonable environments for rapid prototyping, they are not mature enough for large-scale public health data applications. The application, in fact a "system of systems," often failed due to varied timeouts for application response across web sites and services, internal caching errors, and software added to web sites by administrators to manage the load on their servers. In spite of these concerns, the results of this study demonstrate the potential value of grid computing and Web 2.0 approaches in public health informatics.
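The core integration task described here, merging heterogeneous animal, human, and weather feeds on a common key, can be sketched locally as follows. All feed contents, field names, and the risk rule below are invented for illustration and carry no epidemiological meaning:

```python
import csv
import io
import json

# Hypothetical feeds: dead-bird reports as CSV, human cases as JSON,
# temperatures as a plain dict.
bird_csv = "county,dead_birds\nMiddlesex,12\nNew Haven,3\n"
human_json = '[{"county": "Middlesex", "cases": 2}, {"county": "New Haven", "cases": 0}]'
temps = {"Middlesex": 29.5, "New Haven": 27.0}

def integrate(bird_csv, human_json, temps):
    """Merge the three sources on county and flag elevated WNV risk
    with a toy rule (thresholds are illustrative, not epidemiological)."""
    merged = {}
    for row in csv.DictReader(io.StringIO(bird_csv)):
        merged[row["county"]] = {"dead_birds": int(row["dead_birds"])}
    for rec in json.loads(human_json):
        merged.setdefault(rec["county"], {})["cases"] = rec["cases"]
    for county, t in temps.items():
        merged.setdefault(county, {})["temp_c"] = t
    for county, rec in merged.items():
        rec["elevated_risk"] = (
            rec.get("dead_birds", 0) >= 10 and rec.get("temp_c", 0) >= 28
        )
    return merged

surveillance = integrate(bird_csv, human_json, temps)
```

A mashup service such as Yahoo! Pipes performed essentially this kind of fetch-parse-join pipeline, but remotely, which is where the timeout and caching failures described above arose.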
Khabsa, Madian; Giles, C. Lee
The number of scholarly documents available on the web is estimated using capture/recapture methods by studying the coverage of two major academic search engines: Google Scholar and Microsoft Academic Search. Our estimates show that at least 114 million English-language scholarly documents are accessible on the web, of which Google Scholar has nearly 100 million. Of these, we estimate that at least 27 million (24%) are freely available since they do not require a subscription or payment of any kind. In addition, at a finer scale, we also estimate the number of scholarly documents on the web for fifteen fields: Agricultural Science, Arts and Humanities, Biology, Chemistry, Computer Science, Economics and Business, Engineering, Environmental Sciences, Geosciences, Material Science, Mathematics, Medicine, Physics, Social Sciences, and Multidisciplinary, as defined by Microsoft Academic Search. In addition, we show that among these fields the percentage of documents defined as freely available varies significantly, i.e., from 12 to 50%. PMID:24817403
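The capture/recapture approach mentioned in the abstract follows the classic two-sample Lincoln-Petersen logic: if two independent samples of sizes n1 and n2 (here, the sets of documents indexed by two search engines) share m items, the population size is estimated as N ≈ n1·n2/m. A minimal sketch with made-up counts (not the paper's data):

```python
def lincoln_petersen(n1, n2, overlap):
    """Classic two-sample capture/recapture estimate of population size.

    n1, n2: sizes of the two samples (e.g., documents indexed by each
    academic search engine); overlap: documents found in both. Assumes
    independent, equal-probability 'capture', a simplification of the
    paper's fuller methodology.
    """
    if overlap == 0:
        raise ValueError("no overlap: estimate undefined")
    return n1 * n2 / overlap

# Hypothetical counts (NOT the paper's data): engine A indexes 80M
# documents, engine B indexes 50M, and 40M appear in both.
estimate = lincoln_petersen(80e6, 50e6, 40e6)
```

The intuition: the fraction of engine B's documents that also appear in engine A estimates A's coverage of the whole population, so the smaller the overlap, the larger the inferred total.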
When I undertook my first library website redesign a few years ago, I stumbled upon an ongoing culture clash in web-based industries between the developer and the designer. Developers are programmers – they have coding skills and speak languages like PHP, jQuery, and AJAX. For them, Cake isn’t something you eat – it’s a development [...
Web 2.0 technology is a hot topic at the moment, and public librarians in particular are beginning to feel pressure to apply these tools. Indeed, Web 2.0 has the potential to transform library services, but only if the policy and strategy for those services are ready to be transformed. The author not only reviews these tools and provides practical advice and case studies on how they can be applied in the public library setting, but also recommends the policies and business cases that begin to create a new strategy for public libraries. The book is particularly geared to the public library setting.
Jordan, Melissa; DuClos, Chris; Folsom, John; Thomas, Rebecca
As smartphone and tablet devices continue to proliferate, it is becoming increasingly important to tailor information delivery to the mobile device. The Florida Environmental Public Health Tracking Program recognized that the mobile device user needs Web content formatted to smaller screen sizes, simplified data displays, and reduced textual information. The Florida Environmental Public Health Tracking Program developed a smartphone-friendly version of the state Web portal for easier access by mobile device users. The resulting smartphone-friendly portal combines calculated data measures such as inpatient hospitalizations and emergency department visits and presents them grouped by county, along with temporal trend graphs. An abbreviated version of the public health messaging provided on the traditional Web portal is also provided, along with social media connections. As a result of these efforts, the percentage of Web site visitors using an iPhone tripled in just 1 year.
Molin, E.J.E.; Timmermans, H.J.P.
Web-enabled public transport (PT) information systems that combine information on different PT modes, different PT companies and different geographical regions, can be built to improve the accessibility of public transportation. As the potential list of information aspects that can be included in
Rosenkrantz, Andrew B; Doshi, Ankur M
To assess information regarding radiology practices on public transparency Web sites. Eight Web sites comparing radiology centers' price and quality were identified. Web site content was assessed. Six of eight Web sites reported examination prices. Other reported information included hours of operation (4/8), patient satisfaction (2/8), American College of Radiology (ACR) accreditation (3/8), on-site radiologists (2/8), as well as parking, accessibility, waiting area amenities, same/next-day reports, mammography follow-up rates, examination appropriateness, radiation dose, fellowship-trained radiologists, and advanced technologies (1/8 each). Transparency Web sites had a preponderance of price (and to a lesser extent service quality) information, risking fostering price-based competition at the expense of clinical quality. Copyright © 2016 Elsevier Inc. All rights reserved.
Byrne, Patrick F; Namuth, Deana M; Harrington, Judy; Ward, Sarah M; Lee, Donald J; Hain, Patricia
Transgenic crops are among the most controversial "science and society" issues of recent years. Because of the complex techniques involved in creating these crops and the polarized debate over their risks and benefits, a critical need has arisen for accessible and balanced information on this technology. World Wide Web sites offer several advantages for disseminating information on a fast-changing technical topic, including their global accessibility and their ability to update information frequently, incorporate multimedia formats, and link to networks of other sites. An alliance between two complementary web sites at Colorado State University and the University of Nebraska-Lincoln takes advantage of the web environment to help fill the need for public information on crop genetic engineering. This article describes the objectives and features of each site. Viewership data and other feedback have shown these web sites to be effective means of reaching public audiences on a complex scientific topic.
R&Tserve is a publications system based on 'commercial, off-the-shelf' (COTS) software that provides a persistent, collaborative workspace for authors and editors to support the entire publication development process from initial submission, through iterative editing in a hierarchical approval structure, and on to 'publication' on the WWW. It requires no specific knowledge of the WWW (beyond basic use) or HyperText Markup Language (HTML). Graphics and URLs are automatically supported. The system includes a transaction archive, a comments utility, help functionality, automated graphics conversion, automated table generation, and an email-based notification system. It may be configured and administered via the WWW and can support publications ranging from single page documents to multiple-volume 'tomes'.
Colineau, Nathalie; Paris, Cécile; Vander Linden, Keith
Public administration organizations commonly produce citizen-focused, informational materials describing public programs and the conditions under which citizens or citizen groups are eligible for these programs. The organizations write these materials for generic audiences because of the excessive human resource costs that would be required to produce personalized materials for everyone. Unfortunately, generic materials tend to be longer and harder to understand than materials tailored for particular citizens. Our work explores the feasibility and effectiveness of automatically producing tailored materials. We have developed an adaptive hypermedia application system that automatically produces tailored informational materials and have evaluated it in a series of studies. The studies demonstrate that: (1) subjects prefer tailored materials over generic materials, even if the tailoring requires answering a set of demographic questions first; (2) tailored materials are more effective at supporting subjects in their task of learning about public programs; and (3) the time required to specify the demographic information on which the tailoring is based does not significantly slow down the subjects in their information seeking task.
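A minimal sketch of the rule-based tailoring idea the study evaluates, selecting only the program descriptions whose eligibility conditions match a citizen's demographic profile; the programs, profile fields, and eligibility rules below are invented for illustration:

```python
# Hypothetical program descriptions, each paired with an eligibility
# predicate over a demographic profile.
PROGRAMS = [
    {"name": "Child Benefit",
     "eligible": lambda p: p["num_children"] > 0,
     "text": "You may claim a monthly allowance per child."},
    {"name": "Senior Transport Pass",
     "eligible": lambda p: p["age"] >= 65,
     "text": "Discounted public transport for seniors."},
    {"name": "Rent Assistance",
     "eligible": lambda p: p["income"] < 30000,
     "text": "Subsidy available for low-income renters."},
]

def tailor(profile):
    """Return only the material relevant to this profile. Dropping the
    inapplicable sections is why tailored documents come out shorter
    and easier to read than generic ones."""
    return [p["text"] for p in PROGRAMS if p["eligible"](profile)]

# The demographic questionnaire mentioned in the studies would
# populate a profile like this:
profile = {"age": 68, "income": 25000, "num_children": 0}
selected = tailor(profile)
```

In a full adaptive hypermedia system the predicates would be attached to content fragments in a document model rather than hard-coded, but the selection step is the same.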
Vacca, John R
OVERVIEW OF PKI TECHNOLOGY: Public Key Infrastructures (PKIs): What Are They?; Types of Certificate Authority (CA) Services; PKI Standards; Types of Vendor and Third-Party CA Systems; Protecting Private Keys; CA System Attacks; Stolen Private Keys: What Can Be Done?; Certificate Practice Statements; PKI Readiness. ANALYZING AND DESIGNING PUBLIC KEY INFRASTRUCTURES: PKI Design Issues; Cost Justification and Consideration; PKI Standards Design Issues; PKI Architectural Design Considerations. IMPLEMENTING PKI: Requirements; Implementation Schedule; Implementation Costs; PKI Performance. MANAGING PKI: Requesting a Certificate; Obtaining a…
Background: Emerging and re-emerging infectious diseases are a significant public health concern, and early detection and immediate response are crucial for disease control. These challenges have led to the need for new approaches and technologies to reinforce the capacity of traditional surveillance systems for detecting emerging infectious diseases. In the last few years, the availability of novel web-based data sources has contributed substantially to infectious disease surveillance. This study explores the burgeoning field of web-based infectious disease surveillance systems by examining their current status, importance, and potential challenges. Methods: A systematic review framework was applied to the search, screening, and analysis of web-based infectious disease surveillance systems. We searched the PubMed, Web of Science, and Embase databases to extensively review the English literature published between 2000 and 2015. Eleven surveillance systems were chosen for evaluation according to their high frequency of application. Relevant terms, including newly coined terms, the development and classification of the surveillance systems, and various characteristics associated with the systems were studied. Results: Based on a detailed and informative review of the 11 web-based infectious disease surveillance systems, it was evident that these systems exhibited clear strengths compared to traditional surveillance systems, but with some limitations yet to be overcome. The major strengths of the newly emerging surveillance systems are that they are intuitive, adaptable, low-cost, and operated in real time, all of which are necessary features of an effective public health tool. The most apparent potential challenges of the web-based systems are inaccurate interpretation and prediction of health status, and privacy issues based on an individual's internet activity. Conclusion: Despite being in a nascent stage with further modification…
Goldfarb, Steven; Phoboo, Abha Eli; Shaw, Kate
The ATLAS Education and Outreach Group is in the process of migrating its public online content to a professionally designed set of web pages built on the Drupal content management system. Development of the front-end design passed through several key stages, including audience surveys, stakeholder interviews, usage analytics, and a series of fast design iterations, called sprints. Implementation of the web site involves applying the HTML design using Drupal templates, refined development iterations, and the overall population of the site with content. We present the design and development processes and share the lessons learned along the way, including the results of the data-driven discovery studies. We also demonstrate the advantages of selecting a back-end supported by content management, with a focus on workflow. Finally, we discuss usage of the new public web pages to implement outreach strategy through implementation of clearly presented themes, consistent audience targeting and messaging, and th...
Purpose: The purpose of this paper is to demonstrate the work undertaken by Vancouver Public Library (VPL) in an effort to convert its website into a true virtual branch, both through the functionality of the website itself and by extending its web presence onto external social networking sites. Design/methodology/approach: VPL worked with its…
The Online Public Access Catalog (OPAC) is an online catalog system that uses computer and Internet technology as its medium for accessing and storing data. A catalog usually provides information about the collections held in a digital library. In this study, a prototype search application was built for the online catalog of the Universitas Binadarma Palembang library, based on semantic web technology and applying simple natural language proc...
The study confirms Africa's deep interest in the grasscutter which is not shared by other parts of the world. We recommend increased publication of research on cane rats in web-based journals to quickly spread the food value of this prized meat rodent to other parts of the world and so attract research interest and funding.
Gielissen, T.; Marx, M.
The development of PoliDocs.nl, a Web Information System for the disclosure of Dutch parliamentary publications, is an effort to improve the disclosure of parliamentary publications in The Netherlands. The data is distributed over three sources and is available through different Web Information
Boulos, Maged N Kamel; Scotch, Matthew; Cheung, Kei-Hoi; Burden, David
'Mashup' was originally used to describe the mixing together of musical tracks to create a new piece of music. The term now refers to Web sites or services that weave data from different sources into a new data source or service. Using a musical metaphor that builds on the origin of the word 'mashup', this paper presents a demonstration "playlist" of four geo-mashup vignettes that make use of a range of Web 2.0, Semantic Web, and 3-D Internet methods, with outputs/end-user interfaces spanning the flat Web (two-dimensional - 2-D maps), a three-dimensional - 3-D mirror world (Google Earth) and a 3-D virtual world (Second Life). The four geo-mashup "songs" in this "playlist" are: 'Web 2.0 and GIS (Geographic Information Systems) for infectious disease surveillance', 'Web 2.0 and GIS for molecular epidemiology', 'Semantic Web for GIS mashup', and 'From Yahoo! Pipes to 3-D, avatar-inhabited geo-mashups'. It is hoped that this showcase of examples and ideas, and the pointers we are providing to the many online tools that are freely available today for creating, sharing and reusing geo-mashups with minimal or no coding, will ultimately spark the imagination of many public health practitioners and stimulate them to start exploring the use of these methods and tools in their day-to-day practice. The paper also discusses how today's Web is rapidly evolving into a much more intensely immersive, mixed-reality and ubiquitous socio-experiential Metaverse that is heavily interconnected through various kinds of user-created mashups.
Some of you may have noticed a change of format on the CERN public homepage recently. The Bulletin catches up with Dan Noyes, CERN web content manager, to find out what is happening. With over 7000 websites in the cern.ch domain, CERN’s web landscape is a challenging one to manage. Dan Noyes, who joined CERN a year ago, is the web content manager within the Communication group, which is mandated to develop the public and user websites as well as developing standards and guidelines for the wider CERN web. The recent changes made to the public homepage were the first small step towards some quite major changes proposed for CERN’s websites over the next couple of years. Currently, one of the problems of CERN’s websites is that the quantity and the diversity of the information in them make them difficult to manage, if one wants to avoid duplication and to keep information updated and easy to find. This is also aggravated by the lack of a standard design philosophy a...
The risk of adverse pregnancy outcomes can be minimized through the adoption of healthy lifestyles before pregnancy by women of childbearing age. Initiatives for the promotion of preconception health may be difficult to implement. The Internet can be used to build tailored health interventions through identification of the public's information needs. To this aim, we developed a semi-automatic web-based system for monitoring Google searches, web pages, and activity on social networks regarding preconception health. Based on the American College of Obstetricians and Gynecologists (ACOG) guidelines and on the actual search behaviors of Italian Internet users, we defined a set of keywords targeting preconception care topics. Using these keywords, we analyzed the usage of the Google search engine and identified web pages containing preconception care recommendations. We also monitored how the selected web pages were shared on social networks. We analyzed discrepancies between searched and published information and the sharing pattern of the topics. We identified 1,807 Google search queries which generated a total of 1,995,030 searches during the study period. Less than 10% of the reviewed pages contained preconception care information, and in 42.8% the information was consistent with ACOG guidelines. Facebook was the most used social network for sharing. Nutrition, Chronic Diseases and Infectious Diseases were the most published and searched topics. Regarding Genetic Risk and Folic Acid, a high search volume was not associated with a high web page production, while Medication pages were more frequently published than searched. Vaccinations elicited high sharing although web page production was low; this effect was quite variable in time. Our study represents a resource to prioritize communication on specific topics on the web, to address misconceptions, and to tailor interventions to specific populations.
Tian, Hao; Brimmer, Dana J; Lin, Jin-Mann S; Tumpey, Abbigail J; Reeves, William C
The Internet is increasingly utilized by researchers, health care providers, and the public to seek medical information. The Internet also provides a powerful tool for public health messaging. Understanding the needs of the intended audience and how they use websites is critical for website developers to provide better services to the intended users. The aim of the study was to examine the utilization of the chronic fatigue syndrome (CFS) website at the Centers for Disease Control and Prevention (CDC). We evaluated (1) CFS website utilization, (2) outcomes of a CDC CFS public awareness campaign, and (3) user behavior related to public awareness campaign materials and CFS continuing medical education courses. To describe and evaluate Web utilization, we collected Web usage data over an 18-month period and extracted page views, visits, referring domains, and geographic locations. We used page views as the primary measure for the CFS awareness outreach effort. We utilized market basket analysis and Markov chain model techniques to describe user behavior related to utilization of campaign materials and continuing medical education courses. The CDC CFS website received 3,647,736 views from more than 50 countries over the 18-month period and was the 33rd most popular CDC website. States with formal CFS programs had higher visiting density, such as Washington, DC; Georgia; and New Jersey. Most visits (71%) were from Web search engines, with 16% from non-search-engine sites and 12% from visitors who had bookmarked the site. The public awareness campaign was associated with a sharp increase and subsequent quick drop in Web traffic. Following the campaign, user interest shifted from information targeting consumer basic knowledge to information for health care professionals. The market basket analysis showed that visitors preferred the 60-second radio clip public service announcement over the 30-second one. Markov chain model results revealed that most visitors took the
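The Markov chain technique mentioned above can be sketched with a toy example. The page names and visit logs below are hypothetical, not CDC data, and since the abstract does not specify the model order, a first-order chain is assumed:

```python
from collections import defaultdict

def transition_probabilities(sessions):
    """Estimate first-order Markov transition probabilities from
    page-view sequences (one ordered list of page names per visit)."""
    counts = defaultdict(lambda: defaultdict(int))
    for pages in sessions:
        for src, dst in zip(pages, pages[1:]):
            counts[src][dst] += 1
    probs = {}
    for src, dsts in counts.items():
        total = sum(dsts.values())
        probs[src] = {dst: n / total for dst, n in dsts.items()}
    return probs

# Hypothetical visit logs: each inner list is one visitor's page views.
sessions = [
    ["home", "basics", "treatment"],
    ["home", "basics", "cme"],
    ["home", "cme"],
]
probs = transition_probabilities(sessions)
```

From such a table one can read off the most likely next page for each page, which is the kind of navigation pattern the study reports.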
Kiefer, Richard C; Freimuth, Robert R; Chute, Christopher G; Pathak, Jyotishman
Gene Wiki Plus (GeneWiki+) and the Online Mendelian Inheritance in Man (OMIM) are publicly available resources for sharing information about disease-gene and gene-SNP associations in humans. While immensely useful to the scientific community, both resources are manually curated, thereby making the data entry and publication process time-consuming and, to some degree, error-prone. To this end, this study investigates Semantic Web technologies to validate existing, and potentially discover new, genotype-phenotype associations in GWP and OMIM. In particular, we demonstrate the applicability of SPARQL queries for identifying associations not explicitly stated for commonly occurring chronic diseases in GWP and OMIM, and report our preliminary findings for coverage, completeness, and validity of the associations. Our results highlight the benefits of Semantic Web querying technology to validate existing disease-gene associations as well as identify novel associations, although further evaluation and analysis are required before such information can be applied and used effectively.
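The abstract does not reproduce the study's SPARQL queries; as a simplified stand-in, plain set arithmetic over disease-to-gene maps illustrates the coverage/validity comparison between two curated resources. The association sets below are illustrative examples, not the actual curated contents of GWP or OMIM:

```python
def association_overlap(resource_a, resource_b):
    """Compare disease->gene association sets from two curated resources,
    reporting shared and resource-specific associations per disease."""
    report = {}
    for disease in set(resource_a) | set(resource_b):
        a = resource_a.get(disease, set())
        b = resource_b.get(disease, set())
        report[disease] = {
            "shared": a & b,   # associations both resources agree on
            "only_a": a - b,   # candidates to validate against resource B
            "only_b": b - a,   # candidates to validate against resource A
        }
    return report

# Illustrative (simplified) association maps.
gwp = {"type 2 diabetes": {"TCF7L2", "PPARG"}}
omim = {"type 2 diabetes": {"TCF7L2", "KCNJ11"}}
report = association_overlap(gwp, omim)
```

In the study, the same intersection/difference logic is expressed declaratively as SPARQL over RDF-encoded resources rather than as Python sets.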
In order to remain relevant in society, archivists should promote collections and records that are kept in the archives. Through public programmes, archives interact with customers and various public actors and create the institutional image. This paper is concerned with the role of public programmes in the process of modernization of the archival practice, with the emphasis on the Croatian state archives. The aim of the paper is to identify what kind of information is offered to users and public in general on the web sites of the Croatian state archives. Public programmes involve two important components of archival practice: archives and users. Therefore, public programmes ensure good relations with the public. Croatian archivists still question the need for public relations in archives, while American and European archives have already integrated public relations into the basic archival functions. The key components needed for successful planning and implementation of public programs are the source of financing, compliance with the annual work plan, clear goals, defined target audience, cooperation and support from the local community, and the evaluation of results.
Ahmed, M. Imran; Maruf Hassan, Md; Bhuyian, Touhid
Almost all public-sector organisations in Bangladesh now offer online services through web applications, along with the existing channels, in their endeavour to realise the dream of a 'Digital Bangladesh'. Nations across the world have joined the online environment thanks to training and awareness initiatives by their governments. File sharing and downloading activities using web applications have now become very common, not only ensuring the easy distribution of different types of files and documents but also enormously reducing the time and effort of users. Although the online services that are being used frequently have made users' lives easier, they have increased the risk of exploitation of local file disclosure (LFD) vulnerability in the web applications of different public-sector organisations due to insecure design and careless coding. This paper analyses the root cause of LFD vulnerability, its exploitation techniques, and its impact on 129 public-sector websites in Bangladesh, using a manual black-box testing approach.
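LFD typically arises when a user-supplied file name reaches the filesystem unchecked. A minimal defensive sketch of the download path check (the directory path and function name are ours for illustration, not from the paper):

```python
import os

PUBLIC_ROOT = "/var/www/app/public/files"  # hypothetical download directory

def safe_download_path(requested):
    """Reject local-file-disclosure attempts: resolve the requested name
    against the public directory and refuse anything that escapes it."""
    candidate = os.path.normpath(os.path.join(PUBLIC_ROOT, requested))
    # After normalization, a legitimate path still lies under PUBLIC_ROOT;
    # a traversal payload like "../../../etc/passwd" does not.
    if os.path.commonpath([candidate, PUBLIC_ROOT]) != PUBLIC_ROOT:
        raise ValueError("path traversal attempt: " + requested)
    return candidate
```

Black-box testing for LFD does the inverse: it submits traversal payloads to a file-serving parameter and checks whether file contents outside the web root come back.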
Introduction. This paper reports preliminary research in a primarily experimental study of how the general public search for information on the Web. The focus is on the query transformation patterns that characterise searching. Method. In this work, we have used transaction logs from the Excite search engine to develop methods for analysing query transformations that should aid the analysis of our ongoing experimental work. Our methods involve the use of similarity techniques to link queries with the most similar previous query in a train. The resulting query transformations are represented as a list of codes representing a whole search. Analysis. It is shown how query transformation sequences can be represented as graphical networks and some basic statistical results are shown. A correlation analysis is performed to examine the co-occurrence of Boolean and quotation mark changes with the syntactic changes. Results. A frequency analysis of the occurrence of query transformation codes is presented. The connectivity of graphs obtained from the query transformation is investigated and found to follow an exponential scaling law. The correlation analysis reveals a number of patterns that provide some interesting insights into Web searching by the general public. Conclusion. We have developed analytical methods based on query similarity that can be applied to our current experimental work with volunteer subjects. The results of these will form part of a database with the aim of developing an improved understanding of how the public search the Web.
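The similarity-linking step described above can be sketched as follows. The Jaccard measure and the single-letter codes (A = add terms, D = delete terms, R = replace terms, N = new unrelated query) are our assumptions for illustration; the abstract does not list the paper's actual coding scheme:

```python
def jaccard(a, b):
    """Term-set similarity between two query strings."""
    a, b = set(a.split()), set(b.split())
    return len(a & b) / len(a | b) if a | b else 0.0

def code_transformations(train):
    """Link each query to the most similar previous query in the train
    and code the change as Add, Delete, Replace, or New."""
    codes = []
    for i, query in enumerate(train[1:], start=1):
        prev = max(train[:i], key=lambda q: jaccard(q, query))
        if jaccard(prev, query) == 0.0:
            codes.append("N")   # no terms shared with any earlier query
            continue
        p, q = set(prev.split()), set(query.split())
        if p < q:
            codes.append("A")   # terms were added
        elif q < p:
            codes.append("D")   # terms were deleted
        else:
            codes.append("R")   # some terms were swapped
    return codes

codes = code_transformations([
    "web searching",
    "web searching public",
    "web searching general public",
    "image searching",
])
```

The resulting code list for a whole search can then be treated as a path in a transformation graph, as the analysis section describes.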
Zhou Yangping; Yoshikawa, Hidekazu; Liu Jingquan; Ouyang, Jun; Lu Daogang
Public acceptance now plays a central role in nuclear energy. Public concerns about the safety and sustainability of nuclear energy have brought nuclear power in many countries and territories to a standstill, or even a decline. In this study, an Internet-based e-learning framework is proposed for public education, in order to improve public perception of nuclear energy, which in turn affects public acceptance of it. This study aims at investigating public perception and acceptance of nuclear energy in a continuous and accurate manner. In addition, the e-learning framework can promote public perception of nuclear energy by using teaching material with a graphical hierarchy of knowledge about nuclear energy. The web-based e-learning framework mainly consists of two components: (1) an e-learning support module, which continuously investigates public perception of and acceptance toward nuclear energy and teaches the public about nuclear energy; and (2) an updating module, which may improve the education materials by analyzing the effect of the education or by approving materials submitted by visitors through Wiki pages. Advantages and future work of this study are also generally described. (author)
Yoshiura, Vinicius Tohoru; de Azevedo-Marques, João Mazzoncini; Rzewuska, Magdalena; Vinci, André Luiz Teixeira; Sasso, Ariane Morassi; Miyoshi, Newton Shydeo Brandão; Furegato, Antonia Regina Ferreira; Rijo, Rui Pedro Charters Lopes; Del-Ben, Cristina Marta; Alves, Domingos
Regional networking between services that provide mental health care in Brazil's decentralized public health system is challenging, partly due to the simultaneous existence of services managed by municipal and state authorities and a lack of efficient and transparent mechanisms for continuous and updated communication between them. Since 2011, the Ribeirao Preto Medical School and the XIII Regional Health Department of the Sao Paulo state, Brazil, have been developing and implementing a web-based information system to facilitate integrated care throughout a public regional mental health care network. After a thorough on-site analysis, the structure of the network was identified, and a web-based information system for psychiatric admissions and discharges was developed and implemented using a socio-technical approach. An information technology team liaised with mental health professionals, health-service managers, municipal and state health secretariats and judicial authorities. Primary care, specialized community services, and general emergency and psychiatric ward services, which comprise the regional mental healthcare network, were identified and the system flow was delineated. The web-based system overcame the fragmentation of the healthcare system and addressed service-specific needs, enabling: detailed patient information sharing; active coordination of the processes of psychiatric admissions and discharges; real-time monitoring; patient status reports; and the evaluation of the performance of each service and the whole network. During a 2-year period of operation, it registered 137 services, 480 health care professionals and 4271 patients, with a mean number of 2835 accesses per month. To date the system is successfully operating and further expanding. We have successfully developed and implemented an acceptable, useful and transparent web-based information system for a regional mental healthcare service network in a medium-income country with a decentralized
Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat
At present, coding sequences (CDS) are being discovered, and ever larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), and our own locally deployed web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrapping replicates. The workflow performs tree inference with the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, selected on the basis of our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two ways, using Soaplab2 and Apache Axis2; the SOAP and Java Web Service (JWS) deployments provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.
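The paper's MSA similarity score is not defined in the abstract; one plausible minimal proxy, shown purely for illustration, is the fraction of fully conserved alignment columns:

```python
def msa_similarity(aligned):
    """Fraction of alignment columns in which all sequences agree:
    a simple stand-in for an MSA similarity score (the actual score
    used by the workflow is not given in the abstract)."""
    length = len(aligned[0])  # all aligned sequences share this length
    identical = sum(1 for col in zip(*aligned) if len(set(col)) == 1)
    return identical / length

# Toy alignment of three short CDS fragments (gaps as '-').
score = msa_similarity(["ATGC-A", "ATGC-A", "ATGACA"])
```

A workflow could then, for example, route highly similar alignments to a fast distance method and divergent ones to maximum likelihood; that routing rule is our assumption, not the paper's.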
DEPARTMENT OF HEALTH AND HUMAN SERVICES, Food and Drug Administration [Docket No. FDA-2012-N-1021]. Medical Device User Fee and Modernization Act; Notice to Public of Web Site Location of Fiscal Year 2013... The Food and Drug Administration (FDA) is announcing the Web site location where the Agency will post two lists...
DEPARTMENT OF HEALTH AND HUMAN SERVICES, Food and Drug Administration [Docket No. FDA-2007-N-0270; formerly Docket No. 2007N-0357]. Medical Device User Fee and Modernization Act; Notice to Public of Web Site... ACTION: Notice. SUMMARY: The Food and Drug Administration (FDA) is announcing the Web site location where...
DEPARTMENT OF HEALTH AND HUMAN SERVICES, Food and Drug Administration [Docket No. FDA-2012-N-1021]. Medical Device User Fee and Modernization Act; Notice to Public of Web Site Location of Fiscal Year 2014... The Food and Drug Administration (FDA or the Agency) is announcing the Web site location where the Agency will...
Lin, Chi-Shiou; Eschenfelder, Kristin R.
This paper reports on a study of librarian-initiated publications discovery (LIPD) in U.S. state digital depository programs using the OCLC Digital Archive to preserve web-based government publications for permanent public access. This paper describes a model of LIPD processes based on empirical investigations of four OCLC DA-based digital…
Cha, YoonKyung; Stow, Craig A.
We explore how the analysis of web-based data, such as Twitter and Google Trends, can be used to assess the social relevance of an environmental accident. The concept and methods are applied to the shutdown of the drinking water supply in the city of Toledo, Ohio, USA. Toledo's notice, which persisted from August 1 to 4, 2014, was a high-profile event that directly affected approximately half a million people and received wide recognition. The notice was given when excessive levels of microcystin, a byproduct of cyanobacteria blooms, were discovered at the drinking water treatment plant on Lake Erie. Twitter mining results illustrated the instant response to the Toledo incident, the associated collective knowledge, and public perception. The results from Google Trends, on the other hand, revealed how the Toledo event raised public attention on the associated environmental issue, harmful algal blooms, in a long-term context. Thus, when jointly applied, Twitter and Google Trends analysis results offer complementary perspectives. Web content aggregated through mining approaches provides a social standpoint, such as public perception and interest, and offers context for establishing and evaluating environmental management policies.
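The first step of such Twitter mining can be sketched as a keyword-mention count per day, from which the instant response to the event appears as a spike. The tweets below are invented for illustration, not real Toledo data:

```python
from collections import Counter
from datetime import date

def daily_mentions(tweets, keywords):
    """Count tweets per day that mention any keyword: the first step
    of a spike (event-response) analysis on a tweet stream."""
    counts = Counter()
    for day, text in tweets:
        if any(k in text.lower() for k in keywords):
            counts[day] += 1
    return counts

# Hypothetical (date, text) pairs around the advisory window.
tweets = [
    (date(2014, 7, 31), "beautiful day on Lake Erie"),
    (date(2014, 8, 1), "Do not drink the water in Toledo! #microcystin"),
    (date(2014, 8, 1), "toledo water advisory, stores out of bottled water"),
    (date(2014, 8, 2), "Toledo water still unsafe"),
]
counts = daily_mentions(tweets, ["toledo", "microcystin"])
```

Google Trends plays the complementary long-term role: instead of a per-day tweet count, one would compare monthly search interest in "harmful algal blooms" before and after the event.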
Beril T. Arik
There are several indicators that distinguish an academic discipline, including journals, conferences, and graduate programs. One of them is the presence of academic publications in well-regarded citation indices such as the Web of Science (WoS). This study explored the bibliometric characteristics of publications on "second language writing" (SLW) covered in the Social Sciences Citation Index and the Arts & Humanities Citation Index of WoS. We found a total of 266 SLW publications, mostly in the linguistics research area (92%), in the WoS between 1900 and 2013; the first appeared in 1992, with a steady increase in recent years. The publications included articles, book reviews, and bibliographies written by 1.64 authors per publication, suggesting a low level of collaboration among SLW scholars. They cited 31.44 publications and received citations from 5.90 publications on average. An average SLW title had 2.49 different words and a total of 10.85 words, with an abstract of about five sentences and about six keywords, and diverse topics including second language writing, writing, academic writing, error correction, and plagiarism. Our findings will be of value to second language writing scholars, graduate students, and practitioners for examining the status of their field.
Bi, T. P.; Gao, D. Y.; Zhong, X. Y.
In order to realize the social sharing and service of emergency-response information for sudden pollution accidents, so that the public can access risk source information, hazardous goods control technology, and related services, the SQL Server and ArcSDE software are used to establish a spatial database storing all kinds of information, including risk sources, hazardous chemicals, and handling methods in case of accidents. In line with Chinese atmospheric environmental assessment standards, the SCREEN3 atmospheric dispersion model and a one-dimensional liquid diffusion model are implemented to support queries of related information and display of the diffusion results under a B/S structure. Based on WebGIS technology, the C#.Net language is used to develop the sudden environmental pollution public service platform. As a result, the public service platform can make risk assessments and provide the best emergency processing services.
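A screening-level Gaussian plume calculation in the spirit of SCREEN3 can be sketched as below; note that the power-law dispersion coefficients are a rough assumption for illustration, not SCREEN3's actual stability-class parameterization:

```python
import math

def plume_concentration(q, u, x, y, z, h):
    """Ground-reflected Gaussian plume concentration at receptor (x, y, z)
    downwind of a point source with emission rate q (g/s), wind speed
    u (m/s), and effective stack height h (m). The sigma power laws
    below are assumed, simplified dispersion coefficients."""
    sigma_y = 0.08 * x ** 0.9   # lateral spread (m), assumed
    sigma_z = 0.06 * x ** 0.9   # vertical spread (m), assumed
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    # Vertical term includes the image source reflected at the ground.
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Concentration falls off as the receptor moves off the plume centerline.
c_center = plume_concentration(q=100.0, u=3.0, x=500.0, y=0.0, z=0.0, h=20.0)
c_offset = plume_concentration(q=100.0, u=3.0, x=500.0, y=50.0, z=0.0, h=20.0)
```

A platform like the one described would evaluate such a formula on a grid of receptors and render the resulting isopleths over the WebGIS map.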
In some ways, discussion of the political implications of Web 2.0 reinvigorates a debate about the democratising nature of the Internet that began in the 1990s. The concept of participation is at the heart of many current debates about politics and technology. There are two main reasons for saying this. On the one hand is an ongoing and increasing concern about public participation, or lack of it, in modern (predominantly Western) democracies. This participatory deficit is to be seen in falling voter turnout at elections, public apathy on key political issues, and scorn or indifference for elected political representatives. On the other hand, there is a wave of optimism concerning the potential of new technologies, particularly the web, to enable new forms of participation in economic and public life, to transform political debate and citizenship, and to renew the ailing (or perceived to be ailing) institutions of democracy. This optimism around participation and politics, while it has played a role in utopian visions of the internet more or less since its inception, has been reinvigorated recently by the discussion around the so-called Web 2.0. This article argues for a much more critical or sceptical approach to the political promise of Web 2.0. Focusing particularly on Yochai Benkler's The Wealth of Networks, it argues that current accounts of the participatory aspects of web culture tend to take a rather narrow view of what such participation might mean. However, aspects of the work of Bernard Stiegler, and that of others in the Ars Industrialis group co-founded by Stiegler, can help inform a more nuanced account of the relationship between politics and participation. It looks specifically at the arguments in Marc Crépon and Bernard Stiegler's book De la démocratie participative, written during the recent French presidential campaign, and will examine how the idea of participation articulates with key themes in Stiegler's philosophy of technics
Ryu, Seewon; Park, Minsu; Lee, Jaegook; Kim, Sung-Soo; Han, Bum Soo; Mo, Kyoung Chun; Lee, Hyung Seok
The Web-based integrated public healthcare information system (PHIS) of Korea was planned and developed from 2005 to 2010, and it is being used in 3,501 regional health organizations. This paper introduces and discusses the development and performance of the system. We reviewed and examined documents about the development process and performance of the newly integrated PHIS. The resources we analyzed were the national plan for public healthcare, the information strategy for PHIS, and usage and performance reports of the system. The integrated PHIS included 19 functional business areas, 47 detailed health programs, and 48 inter-organizational tasks. The new PHIS improved the efficiency and effectiveness of the business process and inter-organizational business, and enhanced user satisfaction. Economic benefits were obtained from five categories: labor, health education and monitoring, clinical information management, administration and civil service, and system maintenance. The system was certified by a patent from the Korean Intellectual Property Office and accredited under ISO 9001. It also received preliminary comments about its originality, advancement, and business applicability under the Patent Cooperation Treaty. It has been found to enhance the quality of policy decision-making about regional healthcare at the self-governing local government level. PHIS, a Web-based integrated system, has contributed to the improvement of regional healthcare services in Korea. However, for the system to evolve appropriately, the needs and changing environments of community-level healthcare services and IT infrastructure should be analyzed properly in advance.
Ballew, Paula; Castro, Sarah; Claus, Julie; Kittur, Nupur; Brennan, Laura; Brownson, Ross C.
During a time when governmental funding, resources and staff are decreasing and travel restrictions are increasing, attention to efficient methods of public health workforce training is essential. A literature review was conducted to inform the development and delivery of web-based trainings for public health practitioners. Literature was gathered…
Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.
This work presents the ScalaBLAST Web Application (SWA), a web based application implemented using the PHP script language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster
The Web 2.0, which includes Facebook, Twitter, YouTube and other social media, is considered to be one of the strongest communication tools of the early 21st century. The Web's evolution has deeply changed the way public relations agents operate. In 2009, Charest and Bédard showed that the Web 2.0 was in fact a reclaiming by internet users of the Web as it was first imagined by Tim Berners-Lee in November 1993: a tool to exchange and share information. The Web's first generation was instead used by administrators for dissemination and promotion. Today, in order to appropriate these new media, PR agents have to find new business models, even new ways to communicate. Résumé (translated from French): Web 2.0, encompassing Facebook, Twitter, YouTube and other social media, is considered one of the most powerful communication tools of the early 21st century. We study the stakes of the evolving uses of Web 2.0 through the lens of the changes it induces in the practices of public relations professionals. Charest and Bédard showed in 2009 that Web 2.0 was the revenge of internet users, who are trying to reappropriate the Web as it was conceived by Tim Berners-Lee in November 1993, namely as a tool for exchanging and sharing information. It has been clearly shown that the first generation of the Web was instead used by managers for dissemination and promotion purposes. The appropriation of these new media by PR practitioners necessarily requires new business models, even new ways of communicating.
Savel, Craig; Mierzwa, Stan; Gorbach, Pamina M; Souidi, Samir; Lally, Michelle; Zimet, Gregory; Interventions, Aids
This paper reports on a specific Web-based self-report data collection system that was developed for a public health research study in the United States. Our focus is on technical outcome results and lessons learned that may be useful to other projects requiring such a solution. The system was accessible from any device that had a browser that supported HTML5. Report findings include: which hardware devices, Web browsers, and operating systems were used; the rate of survey completion; and key considerations for employing Web-based surveys in a clinical trial setting.
Foster, S. Q.; Carbone, L.; Gardiner, L.; Johnson, R.; Russell, R.; Advisory Committee, S.; Ammann, C.; Lu, G.; Richmond, A.; Maute, A.; Haller, D.; Conery, C.; Bintner, G.
lessons and ancillary exhibit interactives and visualizations for the final Teachers' Guide unit about 'Climate Future.' Units developed so far are available in downloadable format on the NCAR EO and Windows to the Universe web sites for dissemination to educators and the general public. Those web sites are, respectively, (http://eo.ucar.edu/educators/ClimateDiscovery) and (http://www.windows.ucar.edu). Encouragement from funding agencies to integrate and relate resources, and growing pressure to implement efficiencies in educational programs, have created excellent opportunities, which will be described from the viewpoints of EO staff and scientists. Challenges related to public and student perceptions about climate and global change, the scientific endeavor, and how to establish successful dialogues between educators and scientists will also be discussed.
A longstanding idea in the literature on human cooperation is that cooperation should be reinforced when conditional cooperators are more likely to interact. In the context of social networks, this idea implies that cooperation should fare better in highly clustered networks such as cliques than in networks with low clustering such as random networks. To test this hypothesis, we conducted a series of web-based experiments in which 24 individuals played a local public goods game arranged on one of five network topologies that varied between disconnected cliques and a random regular graph. In contrast with previous theoretical work, we found that network topology had no significant effect on average contributions. This result implies either that individuals are not conditional cooperators, or else that cooperation does not benefit from positive reinforcement between connected neighbors. We then tested both of these possibilities in two subsequent series of experiments in which artificial seed players were introduced, making either full or zero contributions. First, we found that although players did generally behave like conditional cooperators, they were as likely to decrease their contributions in response to low-contributing neighbors as they were to increase their contributions in response to high-contributing neighbors. Second, we found that positive effects of cooperation were contagious only to direct neighbors in the network. In total we report on 113 human-subjects experiments, highlighting the speed, flexibility, and cost-effectiveness of web-based experiments over those conducted in physical labs.
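A local public goods game of the kind described can be sketched in a few lines. This is a minimal illustration under assumed parameters (endowment, multiplier, and equal sharing within each player's closed neighborhood), not the exact payoff scheme of the study:

```python
# Minimal sketch of a local public goods game on a network: each
# player's contribution is multiplied and shared equally among the
# contributor and its direct neighbors. Parameter values are assumptions.

def payoffs(adj, contrib, endowment=10, multiplier=2.0):
    """adj: {node: set of neighbors}; contrib: {node: contribution}."""
    result = {}
    for i, neigh in adj.items():
        group = neigh | {i}                      # closed neighborhood of i
        share = sum(multiplier * contrib[j] / (len(adj[j]) + 1)
                    for j in group)              # i's share of each local pool
        result[i] = endowment - contrib[i] + share
    return result

# A 3-clique in which everyone contributes the full endowment:
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
p = payoffs(adj, {0: 10, 1: 10, 2: 10})
print(round(p[0], 6))   # → 20.0
```

Full cooperation in the clique doubles everyone's endowment, while a unilateral defector would keep its endowment and still collect the neighbors' shares, which is the tension the experiments probe.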
Lipman, Len J; Barnier, Valérie M; de Balogh, Katalin K
The expanding field of Veterinary Public Health places new demands on the knowledge and skills of veterinarians. Veterinary curricula must therefore adapt to this new profile. Through the introduction of case studies dealing with up-to-date issues, students are being trained to solve (real-life) problems and come up with realistic solutions. At the Department of Public Health and Food Safety of the Veterinary Faculty at the University of Utrecht in the Netherlands, positive experiences have resulted from the new opportunities offered by the use of information and communication technology (ICT) in education. The possibility of creating a virtual classroom on the Internet through the use of WebCT software has enabled teachers and students to tackle emerging issues by working together with students in other countries and across disciplines. This article presents some of these experiences, through which international exchange of ideas and realities were stimulated, in addition to consolidating relations between universities in different countries. Long-distance education methodologies provide an important tool to achieve the increasing need for international cooperation in Veterinary Public Health curricula.
... check and a National Sex Offender Public Web site check on an individual in a covered position? 2540.203... National Sex Offender Public Web site check on an individual in a covered position? (a) The State criminal... enrolls in, or is hired by, your program on or after October 1, 2009. (b) The National Sex Offender Public...
Friedman, A.; Pizarro, O.; Williams, S. B.
With the advances in high-bandwidth communications and the proliferation of social media tools, education and outreach activities have become commonplace on ocean-bound research cruises. In parallel, advances in underwater robotics and other data-collecting platforms have made it possible to collect copious amounts of oceanographic data. This data then typically undergoes laborious, manual processing to transform it into quantitative information, which normally occurs post-cruise, resulting in significant lags between collecting data and using it for scientific discovery. This presentation discusses how appropriately designed software systems can be used to fulfill multiple objectives and leverage public engagement to complement science goals. We will present two software platforms: the first is a web-browser-based tool that was developed for real-time tracking of multiple underwater robots and ships. It was designed to allow anyone on board to view or control it on any device with a web browser. It opens up the possibility of remote teleoperation and engagement, and was easily adapted to enable live streaming over the internet for public outreach. While the tracking system provided context and engaged people in real time, it also directed interested participants to Squidle, another online system. Developed for scientists, Squidle supports data management, exploration and analysis, and enables direct access to survey data, reducing the lag in data processing. It provides a user-friendly, streamlined interface that integrates advanced data management and online annotation tools. This system was adapted to provide a simplified user interface, tutorial instructions, and a gamified ranking system to encourage "citizen science" participation. These examples show that through a flexible design approach, it is possible to leverage the development effort of creating science tools to facilitate outreach goals, opening up the possibility of acquiring large volumes of
Paioro, L.; Garilli, B.; Le Brun, V.; Franzetti, P.; Fumana, M.; Scodeggio, M.
Cosmological surveys (like VVDS, GOODS, DEEP2, COSMOS, etc.) aim at providing a complete census of the universe over a broad redshift range. Often different kinds of information are gathered with different instruments (e.g., spectrographs, HST, X-ray telescopes, etc.), and it is only by correctly assembling and easily manipulating such wide sets of data that astronomers can attempt to describe the universe; many different scientific goals can be tackled by grouping and filtering the different data sets. When dealing with the huge databases resulting from public cosmological surveys, what is needed is: (a) a versatile system of queries, to allow searches by different parameters (like redshift, magnitude, colors, etc.) according to the specific scientific goal to be tackled; (b) a cross-matching system to verify or redefine the identification of the sources; and (c) a data products retrieval system to download data-related images and spectra. The Virtual Observatory Alliance defines a set of services which can satisfy the needs described above, exploiting Web Services technology. Having in mind the exploitation of cosmological surveys, we have implemented what we consider the most fundamental VO Web Services for our scientific interests: ConeSearch (retrieves physical data values from a cone centered on one point in the sky - the simplest query), SkyNode (allows filtering on the physical quantities in the database in order to select a well-defined data subset), SIAP (retrieves all the images contained in a sky region of interest), and SSAP (retrieves 1D spectra). Our testing bench is the VVDS-CDFS data set, made public in 2004, which contains photometric and spectroscopic information for 1599 sources (Le Fèvre et al., 2004, A&A, 428, 1043). On this data set, we have implemented and published on the US NVO registry the first three services mentioned above, to demonstrate the viability of this approach and its usefulness to the astronomical community. Implementation of SSAP
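The "simplest query" mentioned above is little more than an HTTP GET: an IVOA Simple Cone Search takes a position (RA, DEC, in degrees) and a search radius (SR) as URL parameters and returns matching sources as a VOTable. A minimal sketch of composing such a request (the base URL is a placeholder, not the actual VVDS-CDFS service endpoint):

```python
# Hedged sketch of an IVOA Simple Cone Search request: position and
# radius go in as query parameters; the service replies with a VOTable.
from urllib.parse import urlencode

def cone_search_url(base, ra_deg, dec_deg, radius_deg):
    """Compose a cone-search GET URL around (ra_deg, dec_deg)."""
    return base + "?" + urlencode({"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg})

url = cone_search_url("http://example.org/conesearch", 53.1, -27.8, 0.05)
print(url)   # → http://example.org/conesearch?RA=53.1&DEC=-27.8&SR=0.05
```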
Shiro, B.; Palaia, J.; Ferrone, K.
Recent advances in social media and internet communications have revolutionized the ways people interact and disseminate information. Astronauts are already starting to take advantage of these tools by blogging and tweeting from space, and almost all NASA missions now have presences on the major social networking sites. One priority for future human explorers on Mars will be communicating their experiences to the people back on Earth. During July 2009, a six-member crew of volunteers carried out a simulated Mars mission at the Flashline Mars Arctic Research Station (FMARS) on Devon Island in the Canadian Arctic. Living in a habitat, conducting EVAs wearing spacesuits, and observing communication delays with “Earth,” the crew endured restrictions similar to those that will be faced by future human Mars explorers. Throughout the expedition, crewmembers posted regular blog entries, reports, photos, videos, and updates to their website and social media outlets Twitter, Facebook, YouTube, and Picasa Web Albums. During the sixteen EVAs of their field science research campaign, FMARS crewmembers collected GPS track information and took geotagged photos using GPS-enabled cameras. They combined their traverse GPS tracks with photo location information into KML/KMZ files that website visitors can view in Google Maps or Google Earth. Although the crew observed a strict 20-minute communication delay with “Earth” to simulate a real Mars mission, they broke this rule to conduct four very successful live webcasts with student groups using Skype since education and public outreach were important objectives of the endeavor. This presentation will highlight the use of Web 2.0 technologies for public outreach during the simulated Mars expedition and the implications for other remote scientific journeys. The author embarks on a "rover" to carry out an EVA near the FMARS Habitat. The satellite dish to the right of the structure was used for all communications with the remote
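The KML files the crew produced from their GPS tracks can be illustrated with a tiny sketch: a traverse becomes a `<LineString>` of lon,lat,altitude coordinates that Google Earth or Google Maps can render. The coordinates below are illustrative, not from the actual FMARS EVAs:

```python
# Minimal sketch of a KML document for a GPS traverse. Real KMZ files
# are just zipped KML, optionally bundling the geotagged photos.

def track_to_kml(name, points):
    """points: [(lon, lat), ...] -> a tiny standalone KML document."""
    coords = " ".join(f"{lon},{lat},0" for lon, lat in points)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        f"<Placemark><name>{name}</name>"
        f"<LineString><coordinates>{coords}</coordinates></LineString>"
        "</Placemark></Document></kml>"
    )

kml = track_to_kml("EVA-7", [(-89.41, 75.43), (-89.40, 75.44)])
print("<name>EVA-7</name>" in kml)   # → True
```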
Aletéia de Moura Carpes
Internationalization is an alternative for business growth, exposing firms to international standards of products, technologies and management methods and generating significant returns for domestic transactions (STAL, 2010). The evolution of commercial transactions between countries has raised many issues to be understood, concerning the effects of international activity on the individuals, companies and nations experiencing this context of globalization. This article was developed from the perspective of bibliometric research, aiming to increase awareness in the study area of International Business and to determine which topics studied by the administration field on this subject are being investigated further and which are most relevant (hot topics). Data analysis relied on both qualitative and quantitative approaches. Qualitatively, we analyzed the issues addressed in the surveyed publications regarding content, keywords and relevance of topics. Quantitatively, we investigated the following variables: total number of publications, authors, subject areas, types of documents, source titles, year of publication, institutions, funding agencies, languages, countries, and the number of times each publication was cited, via the hb index and the m index. According to Hirsch (2005), the total number of articles published measures the productivity of an author, but does not measure the importance and/or impact of their publications. The impact of publications, in turn, is measured by the number of citations each one receives, which can be summarized by the h-index. The survey of publications housed in the Web of Science with the ISI Citation Indexes from 1997 to 2010 (14 years) resulted in 5,355 works related to international business, filed especially in the areas of business and management, and the studies analyzed showed a
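The two Hirsch-style indicators named above are simple to compute: the h-index is the largest h such that h of an author's papers have at least h citations each, and the m quotient divides h by career length in years. A short sketch with invented citation counts:

```python
# Hirsch's h-index and the m quotient, on made-up citation counts.

def h_index(citations):
    """Largest h such that h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    # Once a count drops below its rank it stays below, so counting
    # ranks that satisfy the condition yields h.
    return sum(1 for rank, c in enumerate(cites, start=1) if c >= rank)

def m_quotient(citations, career_years):
    """h-index normalized by years since first publication."""
    return h_index(citations) / career_years

papers = [10, 8, 5, 4, 3]       # citation counts per paper
print(h_index(papers))          # → 4
print(m_quotient(papers, 8))    # → 0.5
```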
Eduardo Luís Hepper
Brazil is going through a time of reflection about the preservation of natural resources, an issue that is increasingly on its agenda. The search for balance between environmental, social and economic aspects has been a challenge for business survival over the years and has led companies to adopt initiatives focused on sustainability. The objective of this article is to analyse how the international scientific production addresses sustainable practices and initiatives and their relationship with organizational performance. Considering this scope, a bibliometric study of the publications located on the Web of Science - Social Sciences Citation Index (WoS-SSCI) was developed. There were 33 articles identified and selected on the subject. The journals that stand out in quantity of articles and number of citations are the Journal of Cleaner Production and the Strategic Management Journal, respectively. Analysing the results, a growing concern about this issue and an increase in publications were noticed after the 2000s. The results found, in general, associate sustainable practices with positive organizational performance, such as increased profit on the product sold, quality improvement, improved reputation, and waste reduction, among other gains identified.
This thesis aims to contribute to the construction of a theoretical and methodological framework for the analysis of symbolic mediations which occur in the public sphere during public debates. Firstly, we discuss the epistemological conditions of a search for ideological forms shaped by the circulation of discourses. Secondly, we show that conversations about civil nuclear power among internet users on comment boards of online news web sites are structured by a limited number of frames of intelligibility that we call 'modes of apprehension'. These modes of apprehension never occur in their canonic form: they only appear by fragments in the speech of individuals. Hence, an argumentative analysis of discourse can be used to rebuild them by reordering the multiple 'topoi' in consistent and coherent universes of meaning. Bringing out these modes of apprehension, forged and perpetuated by the circulation of discourses, has three main interests: we highlight some of the symbolic mediations of the social communication about civil nuclear power after Fukushima; we underline some of the main political and philosophical issues of the question; and we examine some of the dominant ideological sedimentations of our modernity. (author)
Abstract Background Surveillance data allow for analysis, providing public health officials and policy-makers with a basis for long-term priorities and timely information on possible outbreaks for rapid response (data for action). In this article we describe the considerations and technology behind a newly introduced public web tool in Sweden for easy retrieval of county and national surveillance data on communicable diseases. Methods The web service was designed to automatically present updated surveillance statistics of some 50 statutorily notifiable diseases notified to the Swedish Institute for Infectious Disease Control (SMI). The surveillance data are based on clinical notifications from the physician who treated the patient and laboratory notifications, merged into cases using the unique personal identification number issued to all Swedish residents. The web service uses notification data from 1997 onwards, stored in a relational database at the SMI. Results The web service presents surveillance data to the user in various ways: tabulated data containing yearly and monthly disease data per county, age and sex distributions, interactive maps illustrating the total number of cases and the incidence per county and time period, graphs showing the total number of cases per week, and graphs illustrating trends in the disease data. The system design encompasses the database (storing the data), the web server (holding the web service), and an in-the-middle computer (to ensure good security standards). Conclusions The web service has provided the health community, the media, and the public with easy access to both timely and detailed surveillance data presented in various forms. Since it was introduced in May 2003, the system has been accessed more than 1,000,000 times by more than 10,000 different viewers (over 12,600 unique IP numbers).
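The core statistic such a tool tabulates, incidence per 100,000 inhabitants per county and period, is a one-line computation over notified case counts. County names and populations below are made up for illustration:

```python
# Incidence per 100,000 inhabitants from notified case counts,
# the per-county statistic the surveillance web tool presents.

def incidence_per_100k(cases, population):
    return 100_000 * cases / population

# {county: (notified cases in period, population)} — illustrative values
counties = {"CountyA": (42, 280_000), "CountyB": (12, 60_000)}
for name, (cases, pop) in counties.items():
    print(name, incidence_per_100k(cases, pop))
# → CountyA 15.0
# → CountyB 20.0
```

Note how the raw counts (42 vs 12) and the incidences (15 vs 20) rank the counties differently, which is why the tool shows both totals and incidence maps.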
With the migration of the written record from paper to digital format, archivists and historians must urgently consider how web content should be conserved, retrieved and analysed. The British Library has recently acquired a large number of UK domain websites, captured 1996–2010, which is colloquially termed the Dark Domain Archive while technical issues surrounding user access are resolved. This article reports the results of an invited pilot project that explores methodological issues surrounding use of this archive. It asks how the relationship between UK public health and local government was represented on the web, drawing on the ‘declinist’ historiography to frame its questions. It points up some difficulties in developing an aggregate picture of web content due to duplication of sites. It also highlights their potential for thematic and discourse analysis, using both text and image, illustrated through an argument about the contradictory rationale for public health policy under New Labour. PMID:26217072
This second part of a two-part series is a survey of U.S. government web resources on human trafficking in the United States, particularly of the online publications and data included on agencies' websites. Overall, the goal is to provide an introduction, an overview, and a guide on this topic for library staff to use in their research and…
Ballew, Paula; Castro, Sarah; Claus, Julie; Kittur, Nupur; Brennan, Laura; Brownson, Ross C
During a time when governmental funding, resources and staff are decreasing and travel restrictions are increasing, attention to efficient methods of public health workforce training is essential. A literature review was conducted to inform the development and delivery of web-based trainings for public health practitioners. Literature was gathered and summarized from five disciplines: Information Technology, Health, Education, Business and Communications, following five research themes: benefits, barriers, retention, promotion and evaluation. As a result, a total of 138 articles relevant to web-based training design and implementation were identified. Key recommendations emerged, including the need to conduct formative research and evaluation, provide clear design and layout, concise content, interactivity, technical support, marketing and promotion and incentives. We conclude that there is limited application of web-based training in public health. This review offers an opportunity to learn from other disciplines. Web-based training methods may prove to be a key training strategy for reaching our public health workforce in the environment of limited resources.
Perianes-Rodriguez, A.; Ruiz-Castillo, J.
In this paper we propose a new criterion for choosing between a pair of classification systems of science that assign publications (or journals) to a set of scientific fields. Consider the standard normalization procedure in which field mean citations are used as normalization factors. We recommend system A over system B whenever the standard normalization procedure performs better when it is based on A than when it is based on B. Since the evaluation can be made in terms of either system, the performance assessment requires a double test. In addition, since the assessment of two normalization procedures would generally be biased in favor of the one based on the classification system used for evaluation purposes, ideally a pair of classification systems must be compared using a third, independent classification system for evaluation purposes. We illustrate this strategy by comparing a Web of Science journal-level classification system, consisting of 236 journal subject categories, with two publication-level, algorithmically constructed classification systems consisting of 1,363 (G6) and 5,119 (G8) clusters. There are two main findings. (1) The G8 system is found to dominate the G6 system. Therefore, when we have a choice between two classification systems at different granularity levels, we should use the system at the higher level because it typically exhibits better standard normalization performance. (2) The G8 system and the Web of Science (WoS) journal-level system are found to be non-comparable. Nevertheless, the G8-normalization procedure performs better using the WoS system for evaluation purposes than the WoS-normalization procedure using the G8 system for evaluation purposes. Furthermore, when we use the G6 system for evaluation purposes, the G8-normalization procedure performs better than the WoS-normalization procedure. We conclude that algorithmically constructed classification systems constitute a credible alternative to the WoS system and, by extension, to
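The standard normalization procedure the criterion is built on can be shown in miniature: a paper's normalized score is its citation count divided by the mean citation rate of the field that a given classification system assigns it to. Field labels and counts below are illustrative, not from the G6/G8/WoS systems:

```python
# Field-mean citation normalization under a given classification system.
from collections import defaultdict
from statistics import mean

def normalized_scores(papers, field_of):
    """papers: {id: citations}; field_of: {id: field label under some system}."""
    by_field = defaultdict(list)
    for pid, cites in papers.items():
        by_field[field_of[pid]].append(cites)
    # normalization factor = mean citations of the assigned field
    field_mean = {f: mean(v) for f, v in by_field.items()}
    return {pid: cites / field_mean[field_of[pid]]
            for pid, cites in papers.items()}

papers = {"p1": 30, "p2": 10, "p3": 4}
fields = {"p1": "astro", "p2": "astro", "p3": "bio"}
print(normalized_scores(papers, fields)["p1"])   # → 1.5
```

Swapping in a different `field_of` mapping reproduces the paper's setup: the same publications get different normalized scores under system A than under system B, and the double test asks which assignment flattens field differences better.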
Li, Ping; Cunningham, Krystal
The APA Style Converter is a Web-based tool with which authors may prepare their articles in APA style according to the APA Publication Manual (5th ed.). The Converter provides a user-friendly interface that allows authors to copy and paste text and upload figures through the Web, and it automatically converts all texts, references, and figures to a structured article in APA style. The output is saved in PDF or RTF format, ready for either electronic submission or hardcopy printing.
Cosmin Catalin Olteanu
The main purpose of this paper is to illustrate how a public institution's web site can be improved by using social plugins and optimization for mobile devices. The general idea is to increase the number of visitors through viral messages and to let users access a special template of the web site from their devices. I will present in this paper how you can increase visits to your sites by using Facebook and by providing mobile layouts to users. Google Analytics is one tool that shows which devices are commonly used.
Ferrone, Kristine; Shiro, Brian; Palaia, Joseph E., IV
Recent advances in social media and internet communications have revolutionized the ways people interact and disseminate information. Astronauts are already taking advantage of these tools by blogging and tweeting from space, and almost all NASA missions now have presences on the major social networking sites. One priority for future human explorers on Mars will be communicating their experiences to the people back on Earth. During July 2009, a six-member crew of volunteers carried out a simulated Mars mission at the Flashline Mars Arctic Research Station (FMARS). The Mars Society built the mock Mars habitat in 2000-01 to help develop key knowledge and inspire the public for human Mars exploration. It is located on Devon Island, about 1600 km from the North Pole within the Arctic Circle. The structure is situated on the rim of Haughton Crater in an environment geologically and biologically analogous to Mars. Living in a habitat, conducting EVAs wearing spacesuits, and observing communication delays with "Earth," the crew endured restrictions similar to those that will be faced by future human Mars explorers. Throughout the expedition, crewmembers posted daily blog entries, reports, photos, videos, and updates to their website and social media outlets Twitter, Facebook, YouTube, and Picasa Web Albums. During the sixteen EVAs of their field science research campaign, FMARS crewmembers collected GPS track information and took geotagged photos using GPS-enabled cameras. They combined their traverse GPS tracks with photo location information into KML/KMZ files that website visitors can view in Google Earth.
Devaraju, A.; Klump, J. F.; Tey, V.; Fraser, R.; Reid, N.; Brown, A.; Golodoniuc, P.
Inaccessible samples are an obstacle to the reproducibility of research and may cause waste of time and resources through duplication of sample collection and management. Within the Commonwealth Scientific and Industrial Research Organisation (CSIRO) Mineral Resources there are various research communities who collect or generate physical samples as part of their field studies and analytical processes. Materials can be varied and could be rock, soil, plant materials, water, and even synthetic materials. Given the wide range of applications in CSIRO, each researcher or project may follow their own method of collecting, curating and documenting samples. In many cases samples and their documentation are often only available to the sample collector. For example, the Australian Resources Research Centre stores rock samples and research collections dating as far back as the 1970s. Collecting these samples again would be prohibitively expensive and in some cases impossible because the site has been mined out. These samples would not be easily discoverable by others without an online sample catalog. We identify some of the organizational and technical challenges to provide unambiguous and systematic access to geoscience samples, and present their solutions (e.g., workflow, persistent identifier and tools). We present the workflow starting from field sampling to sample publication on the Web, and describe how the International Geo Sample Number (IGSN) can be applied to identify samples along the process. In our test case geoscientific samples are collected as part of the Capricorn Distal Footprints project, a collaboration project between the CSIRO, the Geological Survey of Western Australia, academic institutions and industry partners. We conclude by summarizing the values of our solutions in terms of sample management and publication.
Riley-Huff, Debra A.
This article is an analysis of academic library organizational culture, patterns, and processes as they relate to Web services. Data gathered in a research survey is examined in an attempt to reveal current departmental and administrative attitudes, practices, and support for Web services in the library research environment. (Contains 10 tables.)
Rojas-Sola, J. I.; de San-Antonio-Gómez, C.
In this paper, the publications from Spanish institutions listed in the journals of the Construction & Building Technology subject category of the Web of Science database for the period 1997-2008 are analyzed. The number of journals in which they were published is 35, and the number of articles was 760 (Article or Review). A bibliometric assessment has also been done, and we propose two new parameters: the Weighted Impact Factor and the Relative Impact Factor; the analysis also includes the number of citations and the number of documents ...
Tam, Greta; Liu, Sida
Background Web-based public health courses are becoming increasingly popular. “Public Health Principles in Disaster and Medical Humanitarian Response” is a unique Web-based course in Hong Kong. This course aimed to fill a public health training gap by reaching out to postgraduates who are unable to access face-to-face learning. Objective The aim of this paper was to use a structured framework to objectively evaluate the effectiveness of a Web-based course according to Greenhalgh et al’s quality framework and the Donabedian model to make recommendations for program improvement. Methods An interim evaluation of the first cohort of students in 2014 was conducted according to the Donabedian model and a quality framework by Greenhalgh et al using objective and self-reported data. Results Students who registered for the first cohort (n=1152) from June 16, 2014 to December 15, 2014 (6 months) were surveyed. Two tutors and the course director were interviewed. The Web-based course was effective in using technology to deliver suitable course materials and assessment and to enhance student communication, support, and learning. Of the total number of students registered, 59.00% (680/1152) were nonlocal, originating from 6 continents, and 72.50% (835/1152) possessed a bachelor’s or postgraduate degree. The completion rate was 20.00% (230/1152). The chi-square test comparing students who completed the course with dropouts showed no significant difference in gender (P=.40), age (P=.98), occupation (P=.43), or qualification (P=.17). The cost (HK $272 per student) was lower than that of conducting a face-to-face course (HK $4000 per student). Conclusions The Web-based course was effective in using technology to deliver a suitable course and reaching an intended audience. It had a higher completion rate than other Web-based courses. However, sustainable sources of funding may be needed to maintain the free Web-based course. PMID:29374007
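The completers-versus-dropouts comparisons reported above are chi-square tests of independence on contingency tables. A sketch of the statistic for a 2x2 table (e.g. gender by completion), with invented counts; a real analysis would also look up the p-value at 1 degree of freedom:

```python
# Pearson chi-square statistic for a 2x2 table of observed counts.

def chi_square_2x2(table):
    """table: [[a, b], [c, d]]; returns sum of (O - E)^2 / E."""
    row = [sum(r) for r in table]
    col = [table[0][j] + table[1][j] for j in range(2)]
    n = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n   # under independence
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# [completed, dropped] for two groups — illustrative counts only
print(round(chi_square_2x2([[60, 200], [55, 185]]), 3))   # → 0.002
```

A statistic this small is far below the 1-df critical value (3.84 at P=.05), matching the paper's finding of no significant difference between completers and dropouts.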
Ostlund, Neil [Chemical Semantics, Inc., Gainesville, FL (United States)
This research showed the feasibility of applying the concepts of the Semantic Web to computational chemistry. We have created the first web portal (www.chemsem.com) that allows data created in quantum chemistry calculations, and other such chemistry calculations, to be placed on the web in a way that makes the data accessible to scientists in a semantic form never before possible. The semantic web nature of the portal allows data to be searched, found, and used, as an advance over the usual approach of a relational database. The semantic data on our portal has the nature of a Giant Global Graph (GGG) that can be easily merged with related data and searched globally via the SPARQL Protocol and RDF Query Language (SPARQL), which makes global searches for data easier than with traditional methods. Our Semantic Web portal requires that the data be understood by a computer and hence defined by an ontology (vocabulary). This ontology is used by the computer in understanding the data. We have created such an ontology for computational chemistry (purl.org/gc) that encapsulates a broad knowledge of the field of computational chemistry. We refer to this ontology as the Gainesville Core. While it is perhaps the first ontology for computational chemistry and is used by our portal, it is only the start of what must be a long, multi-partner effort to define computational chemistry. In conjunction with the above efforts, we have defined a new potential file standard (Common Standard for eXchange - CSX) for computational chemistry data. This CSX file is the precursor of data in the Resource Description Framework (RDF) form that the semantic web requires. Our portal translates CSX files (as well as other computational chemistry data files) into RDF files that are part of the graph database that the semantic web employs. We propose the CSX file as a convenient way to encapsulate computational chemistry data.
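Why RDF triples make merged data easy to query can be shown with a toy example: a SPARQL-style triple pattern matched against an in-memory triple list. The predicate names below are invented for illustration, not taken from the Gainesville Core ontology:

```python
# Toy SPARQL-style pattern matching over subject-predicate-object
# triples. Predicate and subject names are made up.

triples = [
    ("calc1", "usedMethod",  "B3LYP"),
    ("calc1", "totalEnergy", "-76.4"),
    ("calc2", "usedMethod",  "MP2"),
]

def match(pattern):
    """pattern: (s, p, o) with None as a wildcard, like a SPARQL variable."""
    return [t for t in triples
            if all(q is None or q == v for q, v in zip(pattern, t))]

# Analogue of: SELECT ?s WHERE { ?s :usedMethod "B3LYP" }
print([s for s, _, _ in match((None, "usedMethod", "B3LYP"))])   # → ['calc1']
```

A real SPARQL endpoint generalizes this same pattern matching over a graph store, so triples merged from different sources become queryable with no schema migration, which is the advantage over a relational database that the abstract claims.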
Kapon, Shulamit; Ganiel, Uri; Eylon, Bat Sheva
This paper describes a teaching experiment designed to examine the learning (i.e., retention of content and conceptual development) that takes place when public scientific web lectures delivered by scientists are utilized to present advanced ideas in physics to students with a high school background in physics. The students watched an exemplary public physics web lecture that was followed by a collaborative generic activity session. The collaborative session involved a guided critical reconstruction of the main arguments in the lecture, and a processing of the key analogical explanations. Then the students watched another exemplary web lecture on a different topic. The participants (N=14) were divided into two groups differing only in the order in which the lectures were presented. The students’ discussions during the activities show that they were able to reason and demonstrate conceptual progress, although the physics ideas in the lectures were far beyond their level in physics. The discussions during the collaborative session contributed significantly to the students’ understanding. We illustrate this point through an analysis of one of these discussions between two students on an analogical explanation of the Aharonov-Bohm effect that was presented in one of the lectures. The results from the tests that were administered to the participants several times during the intervention further support this contention.
Powell, Kimberly R; Peterson, Shenita R
Web of Science and Scopus are the leading databases of scholarly impact. Recent studies outside the field of nursing report differences in journal coverage and quality. We conducted a comparative analysis of the coverage and reported impact of nursing publications. Journal coverage by each database for the field of nursing was compared. Additionally, publications by 2014 nursing faculty were collected in both databases and compared for overall coverage and reported quality, as modeled by SCImago Journal Rank, peer review status, and MEDLINE inclusion. Individual author impact, modeled by the h-index, was calculated by each database for comparison. Scopus offered significantly higher journal coverage. For 2014 faculty publications, 100% of journals were found in Scopus; Web of Science offered 82%. No significant difference was found in the quality of reported journals. Author h-index was found to be higher in Scopus. When reporting faculty publications and scholarly impact, academic nursing programs may be better represented by Scopus, without compromising journal quality. Programs with strong interdisciplinary work should examine all areas of strength to ensure appropriate coverage. Copyright © 2017 Elsevier Inc. All rights reserved.
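The h-index that the two databases report differs only because each counts citations over its own journal coverage. Given a list of per-paper citation counts, the index itself is simple to compute; the citation counts below are made up to show how narrower coverage lowers the score:

```python
def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical author: the second list is the subset of the same papers
# (with citing papers) indexed by a database with narrower coverage.
scopus_counts = [21, 14, 9, 6, 5, 3, 1]
wos_counts = [18, 11, 7, 4, 2]
```

With these invented counts the broader coverage yields h = 5 and the narrower yields h = 4, mirroring the study's finding that Scopus reports higher author h-indices.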
Ketter, T.; Kanari, M.; Tibor, G.
Recent offshore discoveries and regulation in the Israel Exclusive Economic Zone (EEZ) are the driving forces behind increasing marine research and development initiatives such as infrastructure development, environmental protection and decision making, among many others. All marine operations rely on existing seabed information, while some also generate new data. We aim to create a single-platform knowledge base to enable access to existing information in a comprehensive, publicly accessible web-based interface. The Israel EEZ covers approximately 26,000 km2 and has been surveyed continuously with various geophysical instruments over the past decades, including 10,000 km of multibeam survey lines, 8,000 km of sub-bottom seismic lines, and hundreds of sediment sampling stations. Our database consists of vector and raster datasets from multiple sources compiled into a repository of geophysical data and metadata, acquired nation-wide by several research institutes and universities. The repository will enable public access via a web portal based on a GIS platform, including datasets from multibeam, sub-bottom profiling, single- and multi-channel seismic surveys and sediment sampling analysis. Respective data products will also be available, e.g. bathymetry, substrate type, granulometry and geological structure. Operating a web-GIS based repository allows retrieval of pre-existing data, helping potential users plan future activities, e.g. conducting marine surveys, constructing marine infrastructure and other private or public projects. The user interface is based on map-oriented spatial selection, which reveals any relevant data for designated areas of interest. Querying the database allows the user to obtain information about the data owner and to address them for data retrieval as required. Wide and free public access to existing data and metadata can save time and funds for academia, government and commercial sectors, while aiding in cooperation
Thamer A Alrawashdeh
With the development of information technology, organizations have applied e-learning systems to train their employees in order to enhance their performance. In this respect, applying web-based training enables an organization to train its employees quickly, efficiently and effectively anywhere at any time. This research aims to extend the Unified Theory of Acceptance and Use of Technology (UTAUT) using factors such as the flexibility of the web-based training system, system interactivity and system enjoyment, in order to explain the employees
Marin, F.; Rohatgi, A.; Charlot, S.
In this contribution, we present WebPlotDigitizer, a versatile, free software tool developed to facilitate easy and accurate data extraction from a variety of plot types. We describe the numerous features of this numerical tool and present its relevance when applied to astrophysical archival research. We exploit WebPlotDigitizer to extract ultraviolet spectropolarimetric spectra from old publications that used the Hubble Space Telescope, the Lick Observatory 3 m Shane telescope and the Astro-2 mission to observe the Seyfert-2 AGN NGC 1068. By doing so, we compile all the existing ultraviolet polarimetric data on NGC 1068 to prepare the ground for further investigations with the future high-resolution spectropolarimeter POLLUX on board the proposed Large UV/Optical/Infrared Surveyor (LUVOIR) NASA mission.
Levay, Paul; Ainsworth, Nicola; Kettle, Rachel; Morgan, Antony
To examine how effectively forwards citation searching with Web of Science (WOS) or Google Scholar (GS) identified evidence to support public health guidance published by the National Institute for Health and Care Excellence. Forwards citation searching was performed using GS on a base set of 46 publications and replicated using WOS. WOS and GS were compared in terms of recall; precision; number needed to read (NNR); administrative time and costs; and screening time and costs. Outcomes for all publications were compared with those for a subset of highly important publications. The searches identified 43 relevant publications. The WOS process had 86.05% recall and 1.58% precision. The GS process had 90.7% recall and 1.62% precision. The NNR to identify one relevant publication was 63.3 with WOS and 61.72 with GS. There were nine highly important publications. WOS had 100% recall, 0.38% precision and NNR of 260.22. GS had 88.89% recall, 0.33% precision and NNR of 300.88. Administering the WOS results took 4 h and cost £88-£136, compared with 75 h and £1650-£2550 with GS. WOS is recommended over GS, as citation searching was more effective, while the administrative and screening times and costs were lower. Copyright © 2015 John Wiley & Sons, Ltd.
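Recall, precision, and NNR relate as follows: recall = relevant found / all relevant, precision = relevant found / records screened, and NNR is the reciprocal of precision. A sketch using the WOS figures above; note the 37 relevant records found and the result-set size of roughly 2342 are back-calculated from the reported 86.05% recall and 1.58% precision, not stated in the study:

```python
def search_metrics(relevant_found, records_screened, relevant_total):
    """Recall, precision, and number needed to read for a citation search."""
    recall = relevant_found / relevant_total
    precision = relevant_found / records_screened
    nnr = records_screened / relevant_found  # number needed to read = 1/precision
    return recall, precision, nnr

# WOS process: 37 of the 43 relevant publications found among ~2342 screened
recall, precision, nnr = search_metrics(37, 2342, 43)
```

This reproduces the reported figures: recall 86.05%, precision 1.58%, and an NNR of about 63.3 records screened per relevant publication found.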
the reader on an exciting time travel journey to learn more about the prehistory of the hyperlink, the birth of the Web, the spread of the early Web, and the Web’s introduction to the general public in mainstream media. Fur- thermore, case studies of blogs, literature, and traditional media going online...
Cowie, B. R.; Lim, D. S.; Pendery, R.; Laval, B.; Slater, G. F.; Brady, A. L.; Dearing, W. L.; Downs, M.; Forrest, A.; Lees, D. S.; Lind, R. A.; Marinova, M.; Reid, D.; Seibert, M. A.; Shepard, R.; Williams, D.
The Pavilion Lake Research Project (PLRP) is an international multi-disciplinary science and exploration effort to explain the origin and preservation potential of freshwater microbialites in Pavilion Lake, British Columbia, Canada. Using multiple exploration platforms including one-person DeepWorker submersibles, Autonomous Underwater Vehicles, and SCUBA divers, the PLRP acts as an analogue research site for conducting science in extreme environments, such as the Moon or Mars. In 2009, the PLRP integrated several Web 2.0 technologies to provide a pilot-scale Education and Public Outreach (EPO) program targeting the internet-savvy generation. The seamless integration of multiple technologies including Google Earth, Wordpress, Youtube, Twitter and Facebook facilitated the rapid distribution of exciting and accessible science and exploration information over multiple channels. Field updates, science reports, and multimedia including videos, interactive maps, and immersive visualization were rapidly available through multiple social media channels, partly due to the ease of integration of these technologies. Additionally, the successful application of videoconferencing via a readily available technology (Skype) has greatly increased the capacity of our team to conduct real-time education and public outreach from remote locations. The improved communication afforded by Web 2.0 has increased the quality of EPO provided by the PLRP, and has enabled a higher level of interaction between the science team and the community at large. Feedback from these online interactions suggests that remote communication via Web 2.0 technologies was an effective tool for increasing public discourse and awareness of the science and exploration activity at Pavilion Lake.
Geum Hee Jeong
The aim of this study was to analyze the bibliometric characteristics of publications from North Korea indexed in the Web of Science Core Collection from 1988 to 2016. We hypothesized that the main research area would be the physical sciences, and that the number of articles would continually increase over time. The Web of Science Core Collection was searched using the terms “North Korea” OR “Democratic People’s Republic of Korea” OR “DPRK” in the address field of the basic search on February 2, 2017. The country of the co-authors, affiliations, journals, annual number of publications, and research fields were analyzed. Additionally, the articles by North Korean authors only were analyzed for the same parameters. A total of 318 articles from North Korea were found. The most frequent countries of collaboration were China, Germany, and Australia. Kim Il Sung University produced the most articles. The main research fields were physics, mathematics, and materials science. The categories of the journal titles corresponded to the research fields. The rapid increase in the number of articles in 2015 and 2016 was remarkable, although this increase started from a very small baseline number of publications. The results of the analysis of the 46 articles published by North Korean authors only were equivalent to the results for the 318 articles presented above. Our hypotheses were confirmed. The surge of articles in 2015 and 2016 may represent the recent efforts by the North Korean government to emphasize scientific research and development. It is anticipated that the productivity of North Korean researchers in terms of publications in international journals will increase dramatically based on the above trends, although the publication baseline is very low.
“As Bill Gates and Steve Case proclaim the global omnipresence of the Internet, the majority of non-Western nations and 97 per cent of the world's population remain unconnected to the net for lack of money, access, or knowledge. This exclusion of so vast a share of the global population from the Internet sharply contradicts the claims of those who posit the World Wide Web as a ‘universal' medium of egalitarian communication.” (Trend 2001:2)
Chang, Hsiao-Ting; Lin, Ming-Hwai; Chen, Chun-Ku; Hwang, Shinn-Jang; Hwang, I-Hsuan; Chen, Yu-Chun
Academic publications are important for developing a medical specialty or discipline and improvements of quality of care. As hospice palliative care medicine is a rapidly growing medical specialty in Taiwan, this study aimed to analyze the hospice palliative care-related publications from 1993 through 2013 both worldwide and in Taiwan, by using the Web of Science database. Academic articles published with topics including "hospice", "palliative care", "end of life care", and "terminal care" were retrieved and analyzed from the Web of Science database, which includes documents published in Science Citation Index-Expanded and Social Science Citation Indexed journals from 1993 to 2013. Compound annual growth rates (CAGRs) were calculated to evaluate the trends of publications. There were a total of 27,788 documents published worldwide during the years 1993 to 2013. The top five most prolific countries/areas with published documents were the United States (11,419 documents, 41.09%), England (3620 documents, 13.03%), Canada (2428 documents, 8.74%), Germany (1598 documents, 5.75%), and Australia (1580 documents, 5.69%). Three hundred and ten documents (1.12%) were published from Taiwan, which ranks second among Asian countries (after Japan, with 594 documents, 2.14%) and 16th in the world. During this 21-year period, the number of hospice palliative care-related article publications increased rapidly. The worldwide CAGR for hospice palliative care publications during 1993 through 2013 was 12.9%. As for Taiwan, the CAGR for publications during 1999 through 2013 was 19.4%. The majority of these documents were submitted from universities or hospitals affiliated to universities. The number of hospice palliative care-related publications increased rapidly from 1993 to 2013 in the world and in Taiwan; however, the number of publications from Taiwan is still far below those published in several other countries. Further research is needed to identify and try to reduce the
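The compound annual growth rate (CAGR) used above is the constant yearly growth rate that carries the first-year count to the last-year count: CAGR = (end/start)^(1/years) - 1. A sketch with invented counts, since the study's per-year publication numbers are not given in the abstract:

```python
def cagr(start_count, end_count, years):
    """Compound annual growth rate between two counts `years` apart."""
    return (end_count / start_count) ** (1.0 / years) - 1.0

# Hypothetical: 120 documents in 1993 growing to 1370 in 2013 (20 yearly steps)
growth = cagr(120, 1370, 2013 - 1993)
```

With these invented endpoints the rate comes out near 13% per year, i.e. roughly the order of the worldwide 12.9% CAGR reported above.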
Tania FERNÁNDEZ LOMBAO
Corporate Social Responsibility (CSR) is a concept that defines a model of corporate governance based on responsible, horizontal and interactive accountability, as opposed to closed and rigid control systems. This type of management was initially associated with private enterprise in the context of globalization, but it is gradually being implemented in the public sector, and consequently in the state-owned broadcasting corporations of the European Union. The first three corporations to introduce CSR in their management were the BBC in the UK, RTÉ in Ireland, and ZDF in Germany. They develop their strategies in the fields of governance, working conditions, human rights, consumers, good practices, the environment and community involvement. Annually these three corporations publish reports to evaluate the success or failure of their CSR activities, in order to provide detailed information to their stakeholders or interest groups: managers, suppliers, employees, partners, and local and international communities. The purpose of this paper is to analyze the way in which the three corporations use Web 2.0, through their corporate websites, to disseminate their CSR activities. We detail the peculiarities and possibilities offered by each of these 2.0 spaces and how each encourages interaction, understood as a pillar of 'social media' as against the excessive elite control prevailing in the traditional media. We also check whether the three public broadcasting corporations use Web 2.0 to share CSR as a management philosophy or whether, on the contrary, they do not go beyond simple social marketing. To do this, we identify the spaces dedicated to Corporate Social Responsibility, specify the category in which it is included and the importance given to it among the other content on the corporate websites. Overall, we aim to find out whether Web 2.0 is the method of choice for corporations to communicate their CSR
Yi, Fengyun; Yang, Pin; Sheng, Huifeng
Ebola virus disease (hereafter EVD or Ebola) has a high fatality rate. The devastating effects of the current epidemic of Ebola in West Africa have put the global health response in acute focus. In response, the World Health Organization (WHO) has declared the Ebola outbreak in West Africa as a "Public Health Emergency of International Concern". A small proportion of scientific literature is dedicated to Ebola research. To identify global research trends in Ebola research, the Institute for Scientific Information (ISI) Web of Science™ database was used to search for data, which encompassed original articles published from 1900 to 2013. The keyword "Ebola" was used to identify articles for the purposes of this review. In order to include all published items, the database was searched using the Basic Search method. The earliest record of literature about Ebola indexed in the Web of Science is from 1977. A total of 2477 publications on Ebola, published between 1977 and 2014 (with the number of publications increasing annually), were retrieved from the database. Original research articles (n = 1623, 65.5%) were the most common type of publication. Almost all (96.5%) of the literature in this field was in English. The USA had the highest scientific output and greatest number of funding agencies. Journal of Virology published 239 papers on Ebola, followed by Journal of Infectious Diseases and Virology, which published 113 and 99 papers, respectively. A total of 1911 papers on Ebola were cited 61,477 times. This analysis identified the current state of research and trends in studies about Ebola between 1977 and 2014. Our bibliometric analysis provides a historical perspective on the progress in Ebola research.
Hemmingson, Kaitlyn; Lucchesi, Roxanne; Droke, Elizabeth; Kattelmann, Kendra K.
Objective: High levels of obesity-related health disparities are common among US American Indian (AI) populations. AI public university students often face unique challenges that may contribute to weight gain and related consequences. Few weight maintenance interventions have been developed that meet the needs of AI public university students. The…
Hussey, K.; Doronila, P.; Kulikov, A.; Lane, K.; Upchurch, P.; Howard, J.; Harvey, S.; Woodmansee, L.
With the recent releases of both Google's "Sky" and Microsoft's "WorldWide Telescope" and the large and increasing popularity of video games, the time is now for using these tools, and those crafted at NASA's Jet Propulsion Laboratory, to engage the public in astronomy like never before. This presentation will use the "Cassini at Saturn Interactive Explorer" (CASSIE) to demonstrate the power of web-based video-game engine technology in providing the public a "first-person" look at space exploration. The concept of virtual space exploration is to allow the public to "see" objects in space as if they were either riding aboard or "flying" next to an ESA/NASA spacecraft. Using this technology, people are able to immediately "look" in any direction from their virtual location in space and "zoom in" at will. Users can position themselves near Saturn's moons and observe the Cassini spacecraft's "encounters" as they happened. Whenever real data for their "view" exists, it is incorporated into the scene. Where data is missing, a high-fidelity simulation of the view is generated to fill in the scene. The observer can also change the time of observation into the past or future. Our approach is to utilize and extend the Unity 3d game development tool, currently in use by the computer gaming industry, along with JPL mission-specific telemetry and instrument data to build our virtual explorer. The potential of applying game technology to the development of educational curricula and public engagement is huge. We believe this technology can revolutionize the way the general public and the planetary science community view ESA/NASA missions, and it provides an educational context that is attractive to the younger generation. This technology is currently under development and application at JPL to assist our missions in viewing their data, communicating with the public and visualizing future mission plans. Real-time demonstrations of CASSIE and other applications in development
As a consequence of the development of Information and Communication Technologies (ICT), nowadays almost all governments around the world, including the Indonesian government, have official websites to provide information and services for their citizens. The second-period administration of President Susilo Bambang Yudhoyono has thirty-two ministries, and each ministry has an official website. However, the implementation of the ministry websites has not yet been measured on the usability aspect. The objective of this research is to examine the usability of the ministry websites of the Indonesian Government. Eleven websites were taken as a sample in this study. Respondents were 128 Internet users with competency for assessing web usability. Usability of the websites was measured by several indicators adapted from the E-Government Toolkit for Developing Countries prepared by the National Informatics Centre and UNESCO. The main indicators consist of navigation architecture, layout design, and content.
Public Relations: The Route to Success and Influence. Public Relations for Your Library: A Tool for Effective Communications; Tooting Your Own Horn: Web-Based Public Relations for the School Media Specialist; Bookmarks as a Teaching Tool; Customers and Culture: The Who and What of Library Public Relations Efforts; Strategies for Successful Job Transition.
Lyon, Linda; Silverstein, Roberta; Fisher, Julieta Dias; Hill, Ann; Hegel, Claudette; Miller, Donna; Moyer, Mary
This special section includes five articles that discuss public relations strategies for school librarians. Highlights include effective communication, including measuring and evaluating the success of public relations efforts; Web-based public relations; giving bookmarks to students; customers and cultural contexts; and successful job…
Capuzzi, Stephen J; Thornton, Thomas E; Liu, Kammy; Baker, Nancy; Lam, Wai In; O'Banion, Colin P; Muratov, Eugene N; Pozefsky, Diane; Tropsha, Alexander
Elucidation of the mechanistic relationships between drugs, their targets, and diseases is at the core of modern drug discovery research. Thousands of studies relevant to the drug-target-disease (DTD) triangle have been published and annotated in the Medline/PubMed database. Mining this database affords rapid identification of all published studies that confirm connections between vertices of this triangle or enable new inferences of such connections. To this end, we describe the development of Chemotext, a publicly available Web server that mines the entire compendium of published literature in PubMed annotated by Medline Subject Heading (MeSH) terms. The goal of Chemotext is to identify all known DTD relationships and infer missing links between vertices of the DTD triangle. As a proof-of-concept, we show that Chemotext could be instrumental in generating new drug repurposing hypotheses or annotating clinical outcomes pathways for known drugs. The Chemotext Web server is freely available at http://chemotext.mml.unc.edu .
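Mining annotated literature for drug-target-disease links boils down to counting which terms are co-annotated on the same articles and inferring the missing edge through a shared neighbor. A toy sketch of that idea; the term names and "articles" below are invented for illustration and are not Chemotext's actual data model:

```python
from collections import Counter
from itertools import combinations

# Toy co-annotation mining: each "article" is the set of MeSH-like terms
# annotating it (all terms below are invented for illustration).
articles = [
    {"drugA", "targetX", "diseaseZ"},
    {"drugA", "targetX"},
    {"targetX", "diseaseZ"},
    {"drugB", "diseaseZ"},
]

pair_counts = Counter()
for terms in articles:
    for pair in combinations(sorted(terms), 2):
        pair_counts[pair] += 1
```

A drug-disease pair that never co-occurs directly but is connected through a well-supported shared target (here drugA - targetX - diseaseZ) is the kind of inferred link that can seed a repurposing hypothesis.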
Meneveau, Charles; Yang, Yunke; Perlman, Eric; Wan, Minpin; Burns, Randal; Szalay, Alex; Chen, Shiyi; Eyink, Gregory
A public database system archiving a direct numerical simulation (DNS) data set of isotropic, forced turbulence is used for studying basic turbulence dynamics. The data set consists of the DNS output on 1024-cubed spatial points and 1024 time-samples spanning about one large-scale turn-over timescale. This complete space-time history of turbulence is accessible to users remotely through an interface that is based on the Web-services model (see http://turbulence.pha.jhu.edu). Users may write and execute analysis programs on their host computers, while the programs make subroutine-like calls that request desired parts of the data over the network. The architecture of the database is briefly explained, as are some of the new functions such as Lagrangian particle tracking and spatial box-filtering. These tools are used to evaluate and compare subgrid stresses and models.
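The spatial box-filtering mentioned above is, in essence, a local top-hat average of the field. A minimal 1-D, pure-Python sketch with periodic boundaries; the database's actual filtering runs server-side on the 3-D data, so this only conveys the idea:

```python
def box_filter(field, width):
    """Top-hat (moving-average) filter of odd `width` with periodic boundaries."""
    assert width % 2 == 1, "use an odd filter width"
    n, half = len(field), width // 2
    return [
        sum(field[(i + j) % n] for j in range(-half, half + 1)) / width
        for i in range(n)
    ]

# Filtering removes small scales: a sharp spike is spread over `width` points,
# while the mean of the field is preserved.
u = [0.0, 0.0, 1.0, 0.0, 0.0]
u_bar = box_filter(u, 3)
```

Subgrid stresses are then built from differences between filtered products and products of filtered fields, which is why a fast filtering primitive is worth exposing as a database function.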
Kung, Yen-Ying; Hwang, Shinn-Jang; Li, Tsai-Feng; Ko, Seong-Gyu; Huang, Ching-Wen; Chen, Fang-Pey
Acupuncture is a rapidly growing medical specialty worldwide. This study aimed to analyze acupuncture publications from 1988 to 2015 by using the Web of Science (WoS) database. Familiarity with the trend of acupuncture publications will facilitate a better understanding of existing academic research in acupuncture and its applications. Academic articles focusing on acupuncture were retrieved and analyzed from the WoS database, which includes articles published in Science Citation Index-Expanded and Social Science Citation Index journals from 1988 to 2015. A total of 7450 articles were published in the field of acupuncture during the period of 1988-2015. Annual article publications increased from 109 in 1988 to 670 in 2015. The People's Republic of China (2076 articles, 27.9%), the USA (1638 articles, 22.0%) and South Korea (707 articles, 9.5%) were the most prolific countries. According to the WoS subject categories, 2591 articles (34.8%) were published in the category of Integrative and Complementary Medicine, followed by Neurosciences (1147 articles, 15.4%) and General Internal Medicine (918 articles, 12.3%). Kyung Hee University (South Korea) was the most prolific source of acupuncture publications (365 articles, 4.9%). Fields within acupuncture with the most-cited articles included mechanisms, clinical trials, epidemiology, and new research methods of acupuncture. Publications associated with acupuncture increased rapidly from 1988 to 2015. The different applications of acupuncture were extensive in multiple fields of medicine. It is important to maintain and even nourish a certain quantity and quality of published acupuncture papers, which can play an important role in developing acupuncture as a medical discipline. Copyright © 2017. Published by Elsevier Taiwan LLC.
Welle Donker, F.M.
Geo-information (GI) is increasingly having a bigger impact on socio-economic benefits. Over the last decade, use of GI has shifted from a specialised GIS niche market to serving as a direct input to planning and decision-making, public policy, environmental management, readiness to deal with
Vedro, Steven R.
Digital convergence--the merging of television and computing--challenges localized monopolies of public television and continuing education. Continuing educators can reposition themselves in the electronic marketplace by serving as an educational portal, bringing their strengths of "brand recognition," local customer base, and access to…
Campagna, Michele; Arleth, Mette
This paper presents the results of an ongoing comparative study on the accessibility of Geographic Information at public authorities’ websites in Denmark and Italy. Qualitative and quantitative mappings of the level of accessibility to GI in the two countries are made and the results are compared...
Jake-Schoffman, Danielle E; Wilcox, Sara; Kaczynski, Andrew T; Turner-McGrievy, Gabrielle; Friedman, Daniela B; West, Delia S
As social media (eg, Twitter) continues to gain widespread popularity, health research and practice organizations may consider combining it with other electronic media (e-media) channels (eg, Web sites, e-newsletters) within their communication plans. However, little is known about added benefits of using social media when trying to reach public health audiences about physical activity. Learn about current use and preference for e-media communication channels among physical activity researchers and practitioners. A Web-based survey was used, open for responses from August 20, 2015, through January 5, 2016. Survey participation was voluntary and anonymous. The survey was advertised through multiple channels targeting physical activity researchers and practitioners, including announcements on professional listservs and in e-newsletters, Twitter, and posts on Facebook pages of public health organizations. A total of 284 survey respondents had complete data. Typical use of e-media to receive, seek out, and share information about physical activity and health and what appeals to researchers and practitioners for professional use. Most respondents preferred non-social media channels to social media and these preferences did not differ widely when examining subgroups such as researchers versus practitioners or social media users versus nonusers. There were few differences by respondent demographics, though younger respondents reported using social media more than older respondents. However, limiting analyses to respondents who identified as social media users, only about 1% of respondents ranked social media sources as their preferred channels for information; thus, most people would continue to be reached if communication remained largely via non-social media e-media channels. The present study supports growing evidence that careful surveying of a target audience should be undertaken when considering new communication channels, as preference and use may not support the
Maidantchik, Carmen; Moraes, Laura O.F.; Karam, K.; Grael, Felipe F.; Evora, Luiz Henrique R.A.
In 2010, the LHC continually produced 7 TeV proton-proton and heavy-ion collisions. This allowed ATLAS to collect and analyze a huge amount of data and perform several studies. Physicists are now publishing papers and conference notes announcing results and achievements. During the past year, 37 papers were published, and for 2011 there are already 39 papers in preparation. Managing a paper publication involves several aspects, such as keeping track of the status of analysis results, following the procedure step by step, promoting communication among collaborators, improving the paper's initial version, and coordinating between the Authorship Committee and the Publication Committee to produce the final author list. The UFRJ group developed the Glance system, a retrieval mechanism that performs data manipulation and operations in distinct, geographically spread repositories. Using Glance as the main application to access data, the Analysis Follow-Up and Analysis CONF Notes systems allow users to manage and update information related to all ATLAS papers and conference notes. Both systems support the process of revision, approval and publication of the analysis outcome. The first step in publishing a paper or note is to define an Editorial Board, whose main responsibility is to improve the initial versions. The whole process is supervised by the Publication Committee, which signs off the final decisions and submits the paper for publication. Presentations and paper versions are elaborated by the conveners of the different ATLAS physics groups. The systems also support the registration of meetings, tracking of the paper through the official references (such as CDS, arXiv, DOI and the published journal), and insertion of comments about the successive versions. As all steps are traced, automatic e-mails warn the person responsible for the next step to take action. A search engine allows any user to follow an analysis publication stage. The access privileges are based on
Speed, Ewen; Davison, Charlie; Gunnell, Caroline
The UK National Health Service (NHS) has long espoused patient and public engagement. Recent years have seen increasing use of internet-based methods of collecting feedback about patient experience and about public and staff views on NHS services and priorities. Often hailed as a means of facilitating participative, democratic patient engagement, these processes raise a number of complex issues. A key aspect of these processes is the opportunity for comments to be made anonymously. Our research reveals an anonymity paradox: patients clearly perceive anonymity as a prerequisite for effective use of these feedback processes, whereas professionals perceive patient anonymity as a barrier to effective use. The risks of anonymity are constructed very differently by patients and professionals. Patient concerns around anonymity were motivated not by a general concern about loss of privacy, but by the worry that a positive identification might compromise future care. For professionals, concerns were voiced more around the risk of reputational damage to specific practitioners or practices (in that anyone could say anything), and around the fact that this anonymous feedback was publicly available and might go against the professional's medical opinion. These concerns point to important differences in perceptions of patient and professional vulnerability. The key finding of the qualitative analysis that follows is that while anonymity makes service users feel less vulnerable, it can have the opposite effect on managers and clinical staff. This has important implications for the use and utility of internet-based methods of collecting patient feedback. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Wong, Michelle; Wolff, Craig; Collins, Natalie; Guo, Liang; Meltzer, Dan; English, Paul
Significant illness is associated with biological contaminants in drinking water, but little is known about health effects from low levels of chemical contamination in drinking water. To examine these effects in epidemiological studies, the sources of drinking water of study populations need to be known. The California Environmental Health Tracking Program developed an online application that would collect data on the geographic location of public water system (PWS) customer service areas in California, which then could be linked to demographic and drinking water quality data. We deployed the Water Boundary Tool (WBT), a Web-based geospatial crowdsourcing application that can manage customer service boundary data for each PWS in California and can track changes over time. We also conducted a needs assessment for expansion to other states. The WBT was designed for water system operators, local and state regulatory agencies, and government entities. Since its public launch in 2012, the WBT has collected service area boundaries for about 2300 individual PWS, serving more than 90% of the California population. Results of the needs assessment suggest interest and utility for deploying such a tool among states lacking statewide PWS service area boundary data. Although the WBT data set is incomplete, it has already been used for a variety of applications, including fulfilling legislatively mandated reporting requirements and linking customer service areas to drinking water quality data to better understand local water quality issues. Development of this tool holds promise to assist with outbreak investigations and prevention, environmental health monitoring, and emergency preparedness and response.
With the migration of the written record from paper to digital format, archivists and historians must urgently consider how web content should be conserved, retrieved and analysed. The British Library has recently acquired a large number of UK domain websites, captured 1996-2010, colloquially termed the Dark Domain Archive while technical issues surrounding user access are resolved. This article reports the results of an invited pilot project that explores methodological issues surrounding use of this archive. It asks how the relationship between UK public health and local government was represented on the web, drawing on the 'declinist' historiography to frame its questions. It points up some difficulties in developing an aggregate picture of web content due to duplication of sites. It also highlights the archived sites' potential for thematic and discourse analysis, using both text and image, illustrated through an argument about the contradictory rationale for public health policy under New Labour.
McDonough, Brianna; Felter, Elizabeth; Downes, Amia; Trauth, Jeanette
Pregnant and postpartum women have special needs during public health emergencies but often have inadequate levels of disaster preparedness. Thus, improving maternal emergency preparedness is a public health priority. More research is needed to identify the strengths and weaknesses of various approaches to communicating preparedness information to these women. A sample of web pages from the Centers for Disease Control and Prevention intended to address the preparedness needs of pregnant and postpartum populations was examined for suitability for this audience. Five of the 7 web pages examined were considered adequate. One web page was considered not suitable, and for one the raters were split between not suitable and adequate. None of the resources examined was considered superior. If these resources are among the best available to pregnant and postpartum women, more work is needed to improve the suitability of educational resources, especially for audiences with low literacy and low incomes.
This is a tentative interim report on the worries and questions posted to the Japan Health Physics Society (JHPS) website, which formally started on 25 March 2011. It had its origins in voluntary activity by JHPS members from 16 March and in a subsequent request from the Japanese MEXT on 18 March concerning the Fukushima Daiichi Nuclear Power Plant accident (Fukushima I, 12 March). Replies about health risk on the web essentially stood on the stance that risk always exists, that uncertainty is unavoidable, and that the risk concerns the public, not individuals. Two major problems that arose were resolved by answering under the manager's name rather than that of the individual expert responsible, and by giving careful, detailed (long) explanations of the safety dose standards. Questioners were thought to be mostly women aged between 20 and 40 years. Where given, their addresses were in the Tokyo, Chiba and Fukushima areas, among others, although most were unwritten. The number of questions reached 709 (about 10 per day) by 1 July; when classified into 48 items, the three major topics were comparisons with nuclear weapons tests, the Chernobyl accident, and the nuclides released from Fukushima I; and the major keywords included child (39 questions), health hazard and concern (25), and food and water (16). It has become clear that great care is necessary when telling non-experts, who scarcely have fundamental, systematic and mechanistic knowledge, about the assessment of radiation health risk, where quantitativeness is highly important. The author hopes to publish the answers not only as a documentary record of the accident but also as a system of the knowledge concerned. (T.T.)
U.S. Environmental Protection Agency — The Chemical Search Web Utility is an intuitive web application that allows the public to easily find the chemical that they are interested in using, and which...
The new interface of the Web of Science (Thomson Reuters) enables users to retrieve sets larger than 100,000 documents in a single search. This makes it possible to compare publication trends for China, the USA, EU-27, and smaller countries with the data in the Scopus (Elsevier) database. China no
Wildgaard, Lorna Elizabeth
were calculated for 512 researchers in Astronomy, Environmental Science, Philosophy and Public Health. Indicator scores and scholar rankings calculated in Web of Science (WoS) and Google Scholar (GS) were analyzed. The indexing policies of WoS and GS were found to have a direct effect on the amount...
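Author-level indicators of the kind computed from WoS and GS citation data can be illustrated with the h-index, whose computation from a list of per-paper citation counts is straightforward. This is a generic sketch of the standard definition, not the study's actual indicator set:

```python
def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A scholar whose papers are cited 10, 8, 5, 4 and 3 times has h = 4:
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Because WoS and GS index different document sets, feeding each database's citation counts into the same function can yield different scores for the same scholar, which is the kind of indexing-policy effect the study analyses.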
Vélez-Cuartas, G.; Lucio-Arias, D.; Leydesdorff, L.
In this article the authors compare the visibility of Latin American and Caribbean (LAC) publications in the Core Collection indexes of the Web of Science (WoS), including Science Citation Index Expanded, Social Sciences Citation Index, and Arts & Humanities Citation Index, and the SciELO Citation
In this paper we propose an application of software agents to provide Virtual Web Services. A Virtual Web Service is a linked collection of several real and/or virtual Web Services, and public and private agents, accessed by the user in the same way as a single real Web Service. A Virtual Web Service allows unrestricted comparison, information merging, pipelining, etc., of data coming from different sources and in different forms. Detailed architecture and functionality of a single Virtual We...
leveraged through Web caching technology. Specifically, Web caching becomes an ... Web routing can improve the overall performance of the Internet. Web caching is similar to memory system caching - a Web cache stores Web resources in ...
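The analogy in this fragment, a Web cache storing Web resources the way a memory-system cache stores data, can be made concrete with a tiny LRU (least-recently-used) cache. All names and sizes below are illustrative:

```python
from collections import OrderedDict

class WebCache:
    """Minimal LRU cache mapping URLs to response bodies (illustrative only)."""
    def __init__(self, capacity=2):
        self.capacity = capacity
        self.store = OrderedDict()  # insertion order doubles as recency order

    def get(self, url):
        if url not in self.store:
            return None  # cache miss: caller would fetch from the origin server
        self.store.move_to_end(url)  # mark as most recently used
        return self.store[url]

    def put(self, url, body):
        self.store[url] = body
        self.store.move_to_end(url)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used

cache = WebCache(capacity=2)
cache.put("http://a.example/", "<html>A</html>")
cache.put("http://b.example/", "<html>B</html>")
cache.get("http://a.example/")                     # touch A
cache.put("http://c.example/", "<html>C</html>")   # evicts B, not A
print(cache.get("http://b.example/"))              # -> None (evicted)
```

As in a memory hierarchy, hits are served from the nearby cache while misses fall through to the slower origin, which is why cache placement and routing affect overall Internet performance.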
Rojas-Sola, J. I.
Full Text Available In this paper, the publications from Spanish institutions appearing in journals of the Construction & Building Technology subject category of the Web of Science database for the period 1997-2008 are analyzed. The number of journals involved is 35, and the number of articles published was 760 (Article or Review). A bibliometric assessment has also been carried out, and we propose two new parameters: the Weighted Impact Factor and the Relative Impact Factor; the number of citations and the number of documents at the institutional level are also included. Among the institutions with the greatest scientific production stands, as expected, the Eduardo Torroja Institute of Construction Sciences (CSIC), while by Weighted Impact Factor the University of Vigo ranks first. On the other hand, just two journals, Cement and Concrete Materials and Materiales de Construcción, account for 45.26% of the Spanish scientific production published in the Construction & Building Technology category, with 172 papers each. Regarding international cooperation, countries such as England, Mexico, the United States, Italy, Argentina and France stand out.
In this work, the publications from Spanish institutions included in the journals of the Construction & Building Technology category of the Web of Science database for the period 1997-2008 are analyzed. The number of journals included is 35 and the number of articles published was 760 (Article or Review). A bibliometric evaluation has been carried out with two new parameters: the Weighted Impact Factor and the Relative Impact Factor; the number of citations and the number of documents at the institutional level are also included. Among the centres with the greatest scientific production stands out, as expected, the Eduardo Torroja Institute of Construction Sciences (CSIC), while by Weighted Impact Factor the University of Vigo ranks first. On the other hand, only two
Chen, I Ju; Yang, Kuei-Feng; Tang, Fu-In; Huang, Chun-Hsia; Yu, Shu
In the era of the knowledge economy, public health nurses (PHNs) need to update their knowledge to ensure quality of care. In the pre-implementation stage, policy makers and educators should understand PHNs' behavioural intentions (BI) toward web-based learning, because intention is the most important determinant of actual behaviour. The aims were to understand PHNs' BI toward web-based learning and to identify the factors influencing that BI, based on the technology acceptance model (TAM), in the pre-implementation stage. A nationwide cross-sectional research design was used. The setting comprised 369 health centres in Taiwan. A randomly selected sample of 202 PHNs participated in this study. Data were collected by mailed questionnaire. The majority of PHNs (91.6%, n=185) showed an affirmative BI toward web-based learning. PHNs reported moderate values of perceived usefulness (U), perceived ease of use (EOU) and attitude toward web-based learning (A). Multiple regression analyses indicated that only U had a significant direct influence on BI. U and EOU had significant direct relationships with A; however, no significant relationship existed between A and BI. Additionally, EOU and an individual's computer competence showed significant relationships with U, and Internet access at the workplace showed a significant relationship with EOU. In the pre-implementation stage, PHNs perceived a high likelihood of adopting web-based learning as their way of continuing education, and perceived usefulness, rather than attitude, was the most important factor for BI. Perceived EOU, an individual's computer competence, and Internet access at the workplace had indirect effects on BI. Therefore, increasing U, EOU, computer competence, and Internet access at the workplace will help increase PHNs' BI. Moreover, we suggest that future studies focus on clarifying problems in different stages of implementation to build a more complete
Full Text Available Abstract Background: The "place-consciousness" of public health professionals is on the rise as spatial analyses and Geographic Information Systems (GIS) are rapidly becoming key components of their toolbox. However, "place" is most useful at its most precise, granular scale, which increases identification risks, thereby clashing with privacy issues. This paper describes the views and requirements of public health professionals in Canada and the UK on privacy issues and spatial data, as collected through a web-based survey. Methods: Perceptions of the impact of privacy were collected through a web-based survey administered between November 2006 and January 2007. The survey targeted government, non-government and academic GIS labs and research groups involved in public health, as well as public health units (Canada), ministries, and observatories (UK). Potential participants were invited to participate through personally addressed, standardised emails. Results: Of 112 invitees in Canada and 75 in the UK, 66 and 28 participated in the survey, respectively. The completion proportion for Canada was 91%, and 86% for the UK. No response differences were observed between the two countries. Ninety-three percent of participants indicated a requirement for personally identifiable data (PID) in their public health activities, including geographic information. Privacy was identified as an obstacle to public health practice by 71% of respondents. The overall self-rated median score for knowledge of privacy legislation and policies was 7 out of 10. Those who rated their knowledge of privacy as high (at the median or above) also rated it as a significantly more severe obstacle to research (P < ...). Conclusion: The clash between PID requirements, including granular geography, and the limitations imposed by privacy and its associated bureaucracy requires immediate attention and solutions, particularly given the increasing utilisation of GIS in public health. Solutions
Wagner, Michael M; Levander, John D; Brown, Shawn; Hogan, William R; Millett, Nicholas; Hanna, Josh
This paper describes the Apollo Web Services and Apollo-SV, its related ontology. The Apollo Web Services give an end-user application a single point of access to multiple epidemic simulators. An end user can specify an analytic problem (which we define as a configuration and a query of results) exactly once and submit it to multiple epidemic simulators. The end user represents the analytic problem using a standard syntax and vocabulary, not the native languages of the simulators. We have demonstrated the feasibility of this design by implementing a set of Apollo services that provide access to two epidemic simulators and two visualizer services.
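The single-point-of-access design, one analytic problem (configuration plus query) submitted once and fanned out to several simulators, can be sketched as a loop over a common interface. Everything below (class names, fields, the toy epidemic formulas) is a hypothetical illustration, not the actual Apollo API:

```python
from dataclasses import dataclass

@dataclass
class AnalyticProblem:
    # A configuration plus a query of results, in a simulator-neutral vocabulary.
    disease: str
    population: int
    r0: float
    query: str  # e.g. "peak_infected"

class ToySimulatorA:
    name = "simA"
    def run(self, p):
        # Stand-in for a real epidemic simulator; formula is purely illustrative.
        return {"peak_infected": int(p.population * min(1.0, p.r0 / 10))}

class ToySimulatorB:
    name = "simB"
    def run(self, p):
        return {"peak_infected": int(p.population * min(1.0, p.r0 / 8))}

def submit_to_all(problem, simulators):
    """Specify the problem once, run it on every registered simulator."""
    return {s.name: s.run(problem)[problem.query] for s in simulators}

problem = AnalyticProblem("influenza", population=10000, r0=2.0, query="peak_infected")
print(submit_to_all(problem, [ToySimulatorA(), ToySimulatorB()]))
# -> {'simA': 2000, 'simB': 2500}
```

The point of the design is visible in the output: one problem statement, several simulators' answers side by side, with no per-simulator input language.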
Kamel Boulos, Maged N
Abstract 'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world.
Kamel Boulos Maged N
Full Text Available Abstract 'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world.
Kamel Boulos, Maged N; Resch, Bernd; Crowley, David N; Breslin, John G; Sohn, Gunho; Burtner, Russ; Pike, William A; Jezierski, Eduardo; Chuang, Kuo-Yu Slayer
'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world.
Rocío Ortiz Galindo
Full Text Available This work aims to advance the theoretical analysis of the field of communication and social movements. The article studies the new changes that have emerged in the social Web era. We review some of the principal works in the literature to observe the communicative strategies of these collectives, which are able to influence social change. We differentiate the area of interpersonal communication (the informal networks) from that of public communication (the repertoires of collective action), and we use this classification to analyze the communicative strategies born in social cybermovements in the ICT age.
Conclusion: The number of hospice palliative care-related publications increased rapidly from 1993 to 2013 in the world and in Taiwan; however, the number of publications from Taiwan is still far below those published in several other countries. Further research is needed to identify and try to reduce the barriers to hospice palliative care research and publication in Taiwan.
Garcia, Joseph A.; Smith, Charles A. (Technical Monitor)
The document consists of a publicly available web site (george.arc.nasa.gov) for Joseph A. Garcia's personal web pages in the AI division. Only general information will be posted and no technical material. All the information is unclassified.
Costin PRIBEANU; Ruxandra-Dora MARINESCU; Paul FOGARASSY-NESZLY; Maria GHEORGHE-MOISII
The accessibility of public administration web sites is a key quality attribute for the successful implementation of the Information Society. The purpose of this paper is to present a second review of municipal web sites in Romania, based on automated accessibility checking. A total of 60 web sites were evaluated against the WCAG 2.0 recommendations. The analysis of the results reveals relatively low accessibility of municipal web sites and highlights several aspects. Firstly, a slight ...
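Automated checking of the kind used in that review can be illustrated with a minimal example: scanning HTML for images that lack `alt` text, which violates the WCAG 2.0 requirement for text alternatives (Success Criterion 1.1.1). This sketch uses Python's standard-library parser and is not the tool the study used:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Counts <img> elements without an alt attribute (WCAG 2.0 SC 1.1.1)."""
    def __init__(self):
        super().__init__()
        self.violations = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes.
        if tag == "img" and "alt" not in dict(attrs):
            self.violations += 1

page = '<html><body><img src="logo.png"><img src="map.png" alt="City map"></body></html>'
checker = MissingAltChecker()
checker.feed(page)
print(checker.violations)  # -> 1 (the logo image has no alt text)
```

A real checker such as those used for WCAG conformance reviews covers many more success criteria (contrast, labels, keyboard access), but each automated rule follows this same pattern: parse the page, match a structural condition, count violations.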
Section 207(f)(2) of the E-Gov Act requires federal agencies to develop an inventory of information to be published on their Web sites, establish a schedule for publishing that information, make those schedules available for public comment, and post the schedules on their Web sites.
Kamel Boulos, Maged; Resch, Bernd; Crowley, David N.; Breslin, John G.; Sohn, Gunho; Burtner, Edwin R.; Pike, William A.; Jeziersk, Eduardo; Slayer Chuang, Kuo Yu
The PIE Activity Awareness Environment is designed to be an adaptive data triage and decision support tool that allows role- and activity-based situation awareness through a dynamic, trainable filtering system. This paper discusses the process and methodology involved in the application as well as some of its capabilities.
Sanchez, Arturo; The ATLAS collaboration
The integration of the ROOT data analysis framework with the Jupyter Notebook technology presents great potential for the enhancement and expansion of educational and training programs: starting from university students in their early years, passing to new ATLAS PhD students and postdoctoral researchers, and reaching senior analysers and professors who want to renew their contact with data analysis or to include a friendlier yet very powerful open-source tool in the classroom. Such tools have already been tested in several environments, and a fully web-based integration together with Open Access Data repositories brings the possibility of going a step forward in ATLAS's search for integration between several CERN projects in the field of education and training, developing new computing solutions along the way.
AUTHOR|(INSPIRE)INSPIRE-00237353; The ATLAS collaboration
Integration of the ROOT data analysis framework with the Jupyter Notebook technology presents the potential for enhancement and expansion of educational and training programs. It can benefit university students in their early years, new PhD students and post-doctoral researchers, as well as senior researchers and teachers who want to refresh their data analysis skills or to introduce a friendlier and yet very powerful open-source tool in the classroom. Such tools have already been tested in several environments. A fully web-based integration of the tools and the Open Access Data repositories brings the possibility of going a step forward in ATLAS's quest to make use of several CERN projects in the field of education and training, developing new computing solutions along the way.
The project worked on the development of a physics analysis and its software under the ROOT framework and Jupyter notebooks for the ATLAS Outreach and Naples teams. The analysis was created in the context of the release of data and Monte Carlo samples by the ATLAS collaboration. The project focuses on the enhancement of the recent opendata.atlas.cern web platform, to be used as an educational resource for university students and new researchers. The generated analysis structure and tutorials will be used to extend the participation of students from other locations around the world. We conclude the project with the creation of a complete notebook implementing the so-called W analysis in the C++ language for the mentioned platform.
Terkamo-Moisio, Anja; Kvist, Tarja; Laitila, Teuvo; Kangasniemi, Mari; Ryynänen, Olli-Pekka; Pietilä, Anna-Maija
The debate about euthanasia is ongoing in several countries, including Finland. However, there is a lack of information on current attitudes toward euthanasia among the general Finnish public. The traditional model for predicting individuals' attitudes toward euthanasia is based on their age, gender, educational level, and religiosity. However, a new evaluation of religiosity is needed due to the limited operationalization of this factor in previous studies. This study explores the connections between the factors of the traditional model and attitudes toward euthanasia among the general public in the Finnish context. The Finnish public's attitudes toward euthanasia have become remarkably more positive over the last decade. Further research is needed on the factors that predict euthanasia attitudes. We suggest two different explanatory models for consideration: one that emphasizes the value of individual autonomy and another that approaches euthanasia from the perspective of fears of death or of the process of dying.
Lim, Kheng Seang; Hills, Michael D; Choo, Wan Yuen; Wong, Mee Hoo; Wu, Cathie; Tan, Chong Tin
Students' attitudes toward epilepsy have been studied in several countries, but none of the studies used a quantitative scale. We aimed to determine the validity and reliability of the Public Attitudes Toward Epilepsy (PATE) scale in a homogeneous population consisting of secondary and tertiary students in Malaysia and to quantify their attitudes toward epilepsy, using a web-based survey. A total of 227 respondents with a mean age of 19.6±2.07 years, predominantly Chinese (85%), female (62%), and at a pre-university education level (71%), completed the web-based survey. Psychometric testing showed that the PATE is a valid and reliable scale for application in a homogeneous population. The mean score in the personal domain was significantly higher than that in the general domain (2.73±0.61 vs. 2.12±0.60, respectively; p < ...). Compared with the general population (Lim et al., 2012), the mean score in the general domain was significantly lower (p < ...), indicating that students' attitudes are more positive than those of the general population in the general domain but not in the personal domain. Copyright © 2012 Elsevier Inc. All rights reserved.
Tetko, Igor V; Maran, Uko; Tropsha, Alexander
Thousands of (Quantitative) Structure-Activity Relationship [(Q)SAR] models have been described in peer-reviewed publications; however, this way of sharing seldom makes models available for use by the research community outside of the developer's laboratory. Conversely, on-line models allow broad dissemination and application, representing the most effective way of sharing scientific knowledge. Approaches for sharing and providing on-line access to models range from web services created by individual users and laboratories to integrated modeling environments and model repositories. This emerging transition from the descriptive and informative, but "static" and for the most part non-executable, print format to interactive, transparent and functional delivery of "living" models is expected to have a transformative effect on modern experimental research in areas of scientific and regulatory use of (Q)SAR models.
Eamon, Mary Keegan; Wu, Chi-Fang; Moroney, Gabriela; Cundari, Melissa
Research suggests that social work students and practitioners are not particularly sensitive to assessing clients' economic hardship or, when needed, to assisting clients in accessing relevant resources such as public benefits. To enhance students' understanding of the importance of engaging in these activities, this article provides…
Banda, Tea; CERN. Geneva. EP Department
The project consists of the initial development of ROOT notebooks for a Z boson analysis in the C++ programming language that will allow students and researchers to perform fast and very useful data analysis, using ATLAS public data and Monte Carlo simulations. Several tools are considered: the ROOT Data Analysis Framework, Jupyter Notebook technology, and the CERN ROOT-based computing service SWAN.
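The core computation such a dilepton analysis builds toward is compact enough to sketch. The following Python sketch (the abstract's notebooks are in C++; kinematic values here are invented for illustration) reconstructs the invariant mass of a lepton pair from transverse momentum, pseudorapidity, and azimuthal angle, the standard quantity used to select Z boson candidates:

```python
import math

def invariant_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass (GeV) of two leptons, treated as massless.

    Uses the standard collider formula:
    m^2 = 2 * pt1 * pt2 * (cosh(eta1 - eta2) - cos(phi1 - phi2))
    """
    m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(m2)

# Two back-to-back 45.6 GeV leptons reconstruct to ~91.2 GeV, the Z mass.
m = invariant_mass(45.6, 0.0, 0.0, 45.6, 0.0, math.pi)
```

A Z selection in the notebooks would then keep events whose dilepton mass falls in a window around 91 GeV.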
Nekaris, K. Anne-Isola; Campbell, Nicola; Coggins, Tim G.; Rode, E. Johanna; Nijman, Vincent
Background: The internet is gaining importance in the global wildlife trade and is changing perceptions of threatened species. There are few data available to examine the impact that popular Web 2.0 sites have on public perceptions of threatened species. YouTube videos portraying wildlife allow us to quantify these perceptions. Methodology/Principal Findings: Focussing on a group of threatened and globally protected primates, the slow lorises, we quantify public attitudes towards wildlife conservation by analysing 12,411 comments and associated data posted on a viral YouTube video, ‘tickling slow loris’, over a 33-month period. In the initial months a quarter of commentators indicated wanting a loris as a pet, but as facts about their conservation and ecology became more prevalent this dropped significantly. Endorsements, where people were directed to the site by celebrities, resulted mostly in numerous neutral responses with few links to conservation or awareness. Two conservation-related events, linked to Wikipedia and the airing of a television documentary, led to an increase in awareness, and ultimately to the removal of the analysed video. Conclusions/Significance: Slow loris videos that have gone viral have introduced these primates to a large cross-section of society that would not normally come into contact with them. Analyses of webometric data posted on the internet allow us quickly to gauge societal sentiments. We showed a clear temporal change in some views expressed, but without an apparent increase in knowledge about the conservation plight of the species or the illegal nature of the slow loris trade. Celebrity endorsement of videos showing protected wildlife increases visits to such sites but does not educate about conservation issues. The strong desire of commentators to express their wish for one as a pet demonstrates the need for Web 2.0 sites to provide a mechanism by which illegal animal material can be identified and policed. PMID:23894432
Yokochi, Masashi; Kobayashi, Naohiro; Ulrich, Eldon L; Kinjo, Akira R; Iwata, Takeshi; Ioannidis, Yannis E; Livny, Miron; Markley, John L; Nakamura, Haruki; Kojima, Chojiro; Fujiwara, Toshimichi
The nuclear magnetic resonance (NMR) spectroscopic data for biological macromolecules archived at the BioMagResBank (BMRB) provide a rich resource of biophysical information at atomic resolution. The NMR data archived in NMR-STAR ASCII format have been implemented in a relational database. However, it is still fairly difficult for users to retrieve data from the NMR-STAR files or the relational database in association with data from other biological databases. To enhance the interoperability of the BMRB database, we present a full conversion of BMRB entries to two standard structured data formats, XML and RDF, as common open representations of the NMR-STAR data. Moreover, a SPARQL endpoint has been deployed. The described case study demonstrates that a simple query of the SPARQL endpoints of the BMRB, UniProt, and Online Mendelian Inheritance in Man (OMIM), can be used in NMR and structure-based analysis of proteins combined with information of single nucleotide polymorphisms (SNPs) and their phenotypes. We have developed BMRB/XML and BMRB/RDF and demonstrate their use in performing a federated SPARQL query linking the BMRB to other databases through standard semantic web technologies. This will facilitate data exchange across diverse information resources.
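The federated query described in the case study can be sketched as SPARQL text. In this Python sketch, the UniProt endpoint URL and the up: vocabulary path are real, but the triple pattern linking a BMRB entry to a UniProt IRI is an assumption made for illustration (the actual BMRB/RDF graph structure may differ):

```python
def build_federated_query(uniprot_id: str) -> str:
    """Build a federated SPARQL query that finds entries referencing a
    UniProt protein and pulls its recommended name from UniProt via
    a SERVICE clause. The local triple pattern (?entry ?p <...>) is a
    deliberately loose, illustrative join against the BMRB/RDF graph."""
    protein_iri = f"<http://purl.uniprot.org/uniprot/{uniprot_id}>"
    return (
        "PREFIX up: <http://purl.uniprot.org/core/>\n"
        "SELECT ?entry ?proteinName WHERE {\n"
        f"  ?entry ?p {protein_iri} .\n"
        "  SERVICE <https://sparql.uniprot.org/sparql> {\n"
        f"    {protein_iri} up:recommendedName/up:fullName ?proteinName .\n"
        "  }\n"
        "}\n"
    )

query = build_federated_query("P01308")  # P01308: human insulin
```

The string would then be POSTed to a BMRB SPARQL endpoint, which evaluates the SERVICE block remotely at UniProt, which is the mechanism that lets one query span both databases.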
Tania FERNÁNDEZ LOMBAO
Corporate Social Responsibility is a concept that defines a model of corporate governance based on responsible, horizontal, and interactive accountability, as opposed to closed, top-down control systems. This type of management was initially associated with private enterprise in the context of globalization, but it is gradually being implemented in the public sector, and consequently in the state-owned broadcasting corporations of the European Union. The first three corporations that ...
Bardach, Naomi S; Hibbard, Judith H; Greaves, Felix; Dudley, R Adams
In the context of the Affordable Care Act, there is extensive emphasis on making provider quality transparent and publicly available. Online public reports of quality exist, but little is known about how visitors find reports or about their purpose in visiting. To address this gap, we gathered website analytics data from a national group of online public reports of hospital or physician quality and surveyed real-time visitors to those websites. Analytics data were gathered from each website: the number of unique visitors, the method of arrival for each unique visitor, and the search terms resulting in visits. Depending on the website, a survey invitation was launched for unique visitors on landing pages or on pages with quality information. Survey topics included type of respondent (eg, consumer, health care professional), purpose of visit, areas of interest, website experience, and demographics. There were 116,657 unique visitors to the 18 participating websites (1440 unique visitors/month per website), with most unique visitors arriving through search (63.95%, 74,606/116,657). Websites with a higher percentage of traffic from search engines garnered more unique visitors (P=.001). The most common search terms were names of individual hospitals (23.25%, 27,122/74,606) and website names (19.43%, 22,672/74,606); medical condition terms were uncommon (0.81%, 605/74,606). The survey view rate was 42.48% (49,560/116,657 invited), resulting in 1755 respondents (participation rate=3.6%). There were substantial proportions of consumer (48.43%, 850/1755) and health care professional respondents (31.39%, 551/1755). Across websites, the proportions of consumer (21%-71%) and health care professional respondents (16%-48%) varied. Consumers were frequently interested in using the information to choose providers or assess the quality of their provider (52.7%, 225/427); the majority of those choosing a…
The semantic Web is a technology at the service of knowledge, aimed at accessibility and the sharing of content and at facilitating interoperability between different systems; as such, it is one of the nine key technological pillars of ICT (technologies for information and communication) within the third theme, the specific cooperation programme, of the seventh framework programme for research and development (7°PQRS, 2007-2013). As a system it seeks to overcome the overload or excess of irrelevant information on the Internet, in order to facilitate specific or pertinent searches. It is an extension of the existing Web in which the aim is cooperation between computers and people (the dream of Sir Tim Berners-Lee), where machines can give more support to people in integrating and elaborating data in order to obtain inferences and a global sharing of data. It is a technology able to favour the development of a "data web", in other words the creation of a space of interconnected and shared data sets (Linked Data), which allows users to link different types of data coming from different sources. It is a technology that will have a great effect on everyday life, since it will permit the planning of "intelligent applications" in various sectors such as education and training, research, the business world, public information, tourism, health, and e-government. It is an innovative technology that activates a social transformation (the socio-semantic Web) on a world level, since it redefines the cognitive universe of users and enables the sharing not only of information but of meaning (collective and connected intelligence).
Badenhorst, Anna; Mansoori, Parisa; Chan, Kit Yee
The past two decades have seen a large increase in investment in global public health research. There is a need for increased coordination and accountability, particularly in understanding where funding is being allocated and who has the capacity to perform research. In this paper, we aim to assess global, regional, national and sub-national capacity for public health research and how it is changing over time in different parts of the world. To allow comparisons of regions, countries and universities/research institutes over time, we relied on the Web of Science™ database and used the Hirsch (h) index based on 5-year periods (h5). We defined articles relevant to public health research with 98% specificity using a combination of search terms relevant to public health, epidemiology or meta-analysis. Based on those selected papers, we computed h5 for each country of the world and their main universities/research institutes for the 5-year time periods 1996-2000, 2001-2005 and 2006-2010. We computed h5 with a 3-year window after each time period, to allow citations from more recent years to accumulate. Among the papers contributing to the h5-core, we explored the topic/disease under investigation; the "instrument" of health research used (eg, descriptive, discovery, development or delivery research); and the universities/research institutes contributing to the h5-core. Globally, the majority of public health research has been conducted in North America and Europe, but other regions (particularly the Eastern Mediterranean and South-East Asia) are showing a greater rate of improvement and are rapidly gaining capacity. Moreover, several African nations performed particularly well when their research output is adjusted for gross domestic product (GDP). In the regions gaining capacity, universities are contributing more substantially to the h5-core publications than other research institutions. In all regions of the world, the topics of articles in the h5-core are shifting from communicable to non-communicable diseases.
The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to Web data and documents. This chapter provides a brief overview of web mining techniques and research areas, most notably hypertext classification, wrapper induction, recommender systems and web usage mining.
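Hypertext classification, one of the web mining techniques surveyed, can be illustrated with a toy sketch. The following Python example (all class names, centroid vocabularies, and page texts are invented) scores a page's bag of words against per-class centroids by cosine similarity, a deliberately minimal stand-in for a real trained text classifier:

```python
import math
from collections import Counter

def tokens(text):
    """Crude whitespace tokenizer; real systems would also use
    hyperlink and HTML-structure features, not just body text."""
    return [t.lower() for t in text.split()]

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(page: str, centroids: dict) -> str:
    """Assign the class whose centroid is most similar to the page."""
    bag = Counter(tokens(page))
    return max(centroids, key=lambda c: cosine(bag, centroids[c]))

# Invented class centroids standing in for averaged training documents.
centroids = {
    "sports": Counter(tokens("match goal league score team")),
    "finance": Counter(tokens("stock market price shares earnings")),
}
```

In practice the centroids would be learned from labeled crawled pages, and the same similarity machinery underlies recommender systems mentioned in the chapter.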
Publicity for preschool cooperatives is described. Publicity helps produce financial support for preschool cooperatives. It may take the form of posters, brochures, newsletters, open house, newspaper coverage, and radio and television. Word of mouth and general good will in the community are the best avenues of publicity that a cooperative nursery…
Web-based public health geographic information systems for resources-constrained environment using scalable vector graphics technology: a proof of concept applied to the expanded program on immunization data.
Kamadjeu, Raoul; Tolentino, Herman
Geographic Information Systems (GIS) are powerful communication tools for public health. However, using GIS requires considerable skill and, for this reason, is sometimes limited to experts. Web-based GIS has emerged as a solution to allow a wider audience access to geospatial information. Unfortunately, the cost of implementing proprietary solutions may be a limiting factor in the adoption of a public health GIS in a resource-constrained environment. Scalable Vector Graphics (SVG) is used to define vector-based graphics for the internet using XML (eXtensible Markup Language); it is an open, platform-independent standard maintained by the World Wide Web Consortium (W3C) since 2003. In this paper, we summarize our methodology and demonstrate the potential of this free and open standard to contribute to the dissemination of Expanded Program on Immunization (EPI) information by providing interactive maps to a wider audience through the Internet. We used SVG to develop a database-driven web-based GIS applied to EPI data from three countries of WHO AFRO (World Health Organization - African Region). The system generates interactive district-level country immunization coverage maps and graphs. The approach we describe can be expanded to cover other demanding public health GIS activities, including the design of disease atlases in a resource-constrained environment. Our system contributes to the accumulating evidence demonstrating the potential of SVG technology for developing web-based public health GIS in resource-constrained settings.
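The core idea, serving interactive coverage graphics as server-generated SVG rather than through a proprietary GIS, can be sketched in a few lines. This Python sketch (district names, coverage values, colors, and the 80% threshold are all invented for illustration; the real system rendered district polygons from a database) emits an SVG bar map with hover tooltips via SVG's native `<title>` element:

```python
def coverage_svg(coverage: dict) -> str:
    """Render immunization coverage per district as an inline SVG bar map.

    Each district becomes a <rect> whose width encodes the coverage
    percentage; a <title> child gives the browser-native tooltip.
    """
    bars = []
    for i, (district, pct) in enumerate(sorted(coverage.items())):
        color = "#2c7" if pct >= 80 else "#e74"  # flag low-coverage districts
        bars.append(
            f'<rect x="10" y="{10 + i * 30}" width="{pct * 3}" height="20" '
            f'fill="{color}"><title>{district}: {pct}%</title></rect>'
        )
    height = len(coverage) * 30 + 20
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" width="340" height="{height}">'
        + "".join(bars) + "</svg>"
    )
```

Because the output is plain XML text, a low-cost server can generate it from database queries on the fly and any standards-compliant browser can display it without plugins, which is the resource-constrained-setting argument of the paper.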
User interfaces of web-based online public access catalogs (OPACs) of university, special, public, and national libraries in the Mercosur countries (Argentina, Brazil, Paraguay, Uruguay) are analyzed to produce a diagnosis of the situation regarding bibliographic description, subject analysis, user help messages, and bibliographic data display. A quali-quantitative methodology is adopted; the checklist of system functionalities provided by Hildreth (1982) is updated and used as the data-collection instrument, yielding a form of 38 closed questions that records the frequency of appearance of basic functionalities in four areas: Area I, operation control; Area II, search formulation control and access points; Area III, output control; and Area IV, user assistance: information and instruction. Information corresponding to 297 units is analyzed, stratified by software type, library type, and country. Chi-square, odds ratio, and multinomial logistic regression tests are applied to the results. The analysis corroborates the existence of significant differences in each of the strata and verifies that most of the OPACs surveyed offer only minimal features.
Yashashri Chandrakant Shetty
Objective: The study surveyed the availability of the intranet on campus and the knowledge related to drug spectrum, an intranet publication. Materials and Methods: Institutional ethics committee permission was obtained. Verbal consent was taken from the faculty and resident doctors of departments where all the facilities were available. The universal sampling method was used for recruitment. Pre-validated questionnaires were given to approximately 100 faculty and 500 resident doctors in the year 2012-2013. The questionnaire contained 15 items, and content analysis was done. The questionnaire focused on obtaining participants' feedback on the use of the intranet, evaluating the intranet as a source of knowledge, and the relevance of drug spectrum in the context of their subject. Responses were collected after giving the participants sufficient time. Data were entered into an Excel 2003 spreadsheet and analyzed using descriptive statistics. Results: A total of 134 respondents, including faculty and residents from various departments, participated in our study. A total of 117 (89.66%) respondents stated that their departments have access to the internet, and 103 (76.29%) departments had access to the intranet. 67 (49.62%) respondents had accessed it, 67 (49.62%) did not have the time to visit the intranet site, whereas 67 (49.62%) had not accessed the intranet. 89 (65.92%) respondents were not aware of drug spectrum, while 101 (74.81%) felt that drug spectrum is a useful activity on the intranet. 45 (33.33%) knew about the intranet periodical drug spectrum, but most respondents explained the meaning of the term according to their own understanding and never knew about the online intranet journal drug spectrum. Conclusion: The study found that the intranet is available on campus, but it is not being utilized. The awareness and knowledge regarding drug spectrum is lacking, but…
Passow, M. J.; Kastens, K. A.; Goodwillie, A. M.; Brenner, C.
The Lamont-Doherty Earth Observatory of Columbia University (LDEO) continues its long history of contributions to public understanding of science. Highlights of current efforts are described in paired posters. Part 2 focuses on web-based activities that foster access to LDEO cutting-edge research for worldwide audiences. "Geoscience Data Puzzles" are activities that purposefully present a high ratio of insight to effort for students. Each Puzzle uses selected authentic data to illuminate fundamental Earth processes typically taught in Earth Science curricula. Data may be in the form of a graph, table, map, image, or a combination of these. Some Puzzles involve downloading a simple Excel file, but most can be worked from paper copies. Questions guide students through the process of data interpretation. Most Puzzles involve calculations, with emphasis on the too-seldom-taught skill of figuring out what mathematical process is useful to answer an unfamiliar question or solve a problem. Every Puzzle offers "Aha" insights, when the connection between data and process or data and problem becomes clear in a rewarding burst of illumination. The time needed to solve a Puzzle is between 15 minutes and an hour. "GeoMapApp" is a free, map-based data exploration and visualization application from the LDEO Marine Geoscience Data System group. GeoMapApp provides direct access to hundreds of data sets useful to geoscience educators, including continuously updated Global Multi-Resolution Topography compilations that incorporate high-resolution bathymetry in the oceans and Space Shuttle elevations over land. A new User Guide, multimedia tutorials, and a webinar offer follow-along help and examples. "Virtual Ocean" integrates GeoMapApp functionality with NASA World Wind code to provide a powerful new 3-D platform for interdisciplinary geoscience research and education. Both GeoMapApp and Virtual Ocean foster scientific understanding and provide training in new data visualization…
The present study analyzed the scientific publications indexed in the Web of Science database as information management records and visualized the structure of science in this field during 1988-2009. The research method was scientometric. During the study period, 1120 records in the field of information management were published. These records were extracted as plain text files and stored on a PC, then analyzed with the ISI.exe and HistCite software. The authors' collaboration coefficient (CC) grew from zero in 1988 to 0.33 in 2009; the average collaboration coefficient was 0.22, confirming low collaboration among authors in this area. The records were published in 63 languages, among which English, at 93.8%, had the highest proportion. City University London and the University of Sheffield in England had the most joint publications in the information management field. Based on the number of published records, T.D. Wilson, with 13 records and 13 citations, ranked first. The average number of global citations to 112 documents was 8.78. Despite the participation of different countries in the production of documents, more than 28.9% of records were produced in the United States; according to the results, 10 countries published more than 72.4% of the records. Fifteen journals published 564 records (50.4% of the total production). Finally, HistCite was used to draw clustered maps, and the influential authors, articles, and four key subject areas were identified.
Finnemann, Niels Ole
This article deals with general web archives and the principles for selecting materials to be preserved. It opens with a brief overview of the reasons why general web archives are needed. Sections two and three present major, long-term web archive initiatives, discuss the purposes and possible values of web archives, and ask how to meet unknown future needs, demands, and concerns. Section four analyses three main principles in contemporary web archiving strategies, topic-centric, domain-centric, and time-centric archiving, and section five discusses how to combine these to provide a broad and rich archive. Section six is concerned with inherent limitations and why web archives are always flawed. The last sections deal with the question of how web archives may fit into the rapidly expanding, but fragmented, landscape of digital repositories taking care of various parts…
Tagiew, Rustam; Ignatov, Dmitry I.; Amroush, Fadi
This paper offers a step towards a research infrastructure that makes data from experimental economics efficiently usable for the analysis of web data. We believe that regularities of human behavior found in experimental data also emerge in real-world web data. A format for data from experiments is suggested, which enables its publication as open data. Once standardized datasets of experiments are available on-line, web mining can take advantage of this data. Further, the questions about the o...
Thai, Quan Ke; Chung, Dung Anh; Tran, Hoang-Dung
Canine and wolf mitochondrial DNA haplotypes, which can be used for forensic or phylogenetic analyses, have been defined in various schemes depending on the region analyzed. In recent studies, the 582 bp fragment of the HV1 region is most commonly used, and 317 different canine HV1 haplotypes have been reported in the rapidly growing public database GenBank. These reported haplotypes contain several inconsistencies in their haplotype information. To overcome this issue, we have developed a Canis mtDNA HV1 database. This database collects data on the HV1 582 bp region of dog mitochondrial DNA from GenBank to screen and correct the inconsistencies. It also supports users in detecting novel mutation profiles and assigning new haplotypes. The Canis mtDNA HV1 database (CHD) contains 5567 nucleotide entries originating from 15 subspecies of the species Canis lupus. Of these entries, 3646 were haplotypes, grouped into 804 distinct sequences. 319 sequences were recognized as previously assigned haplotypes, while the remaining 485 sequences had new mutation profiles and were marked as new haplotype candidates awaiting further analysis for haplotype assignment. Of the 3646 nucleotide entries, only 414 were annotated with correct haplotype information, while 3232 had insufficient or missing haplotype information and were corrected or modified before being stored in the CHD. The CHD can be accessed at http://chd.vnbiology.com . It provides sequences, haplotype information, and a web-based tool for mtDNA HV1 haplotyping. The CHD is updated monthly and makes all data available for download. In summary, the Canis mtDNA HV1 database contains canine mitochondrial DNA HV1 sequences with reconciled annotation. It serves as a tool for detecting inconsistencies in GenBank and helps identify new HV1 haplotypes, thereby supporting the scientific community in naming new HV1 haplotypes and in reconciling existing annotation of HV1 582 bp sequences.
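The haplotyping logic behind such a database can be sketched simply: derive a mutation profile relative to a reference fragment, then look it up in a table of assigned haplotypes; unmatched profiles become new haplotype candidates. In this Python sketch, the reference sequence, the haplotype names, and the mutation profiles are all invented toy stand-ins for the real 582 bp HV1 data:

```python
# Toy 8 bp stand-in for the 582 bp HV1 reference fragment (invented).
REF = "ACGTACGT"

def mutation_profile(seq: str, ref: str = REF) -> tuple:
    """1-based positions and observed bases where seq differs from ref,
    e.g. ('3T',) means position 3 carries T instead of the reference base."""
    return tuple(
        f"{i + 1}{base}"
        for i, (r, base) in enumerate(zip(ref, seq))
        if r != base
    )

# Invented haplotype table mapping mutation profiles to haplotype names.
HAPLOTYPES = {(): "A0", ("3T",): "A1", ("3T", "7A"): "A2"}

def assign(seq: str) -> str:
    """Assign a known haplotype, or flag the profile as a new candidate."""
    return HAPLOTYPES.get(mutation_profile(seq), "new haplotype candidate")
```

A real implementation would also align the query against the reference and handle insertions/deletions, which is where most of the annotation inconsistencies the CHD corrects tend to arise.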
Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.
Babu, B. Ramesh; O'Brien, Ann
Discussion of Web-based online public access catalogs (OPACs) focuses on a review of six Web OPAC interfaces in use in academic libraries in the United Kingdom. Presents a checklist and guidelines of important features and functions that are currently available, including search strategies, access points, display, links, and layout. (Author/LRW)
This paper notes the ineffectualness of organizational World Wide Web sites that are generally supportive of nuclear science and technology, compared with those whose mission is to oppose nuclear matters and which do so by providing misinformation to the public. Specific comparisons of pro and con sites are made, and recommendations are offered for improving the communication effectiveness of proponent sites. (author)
NREL developed a free, publicly available web version of the REopt (TM) renewable energy integration and optimization platform called REopt Lite. REopt Lite recommends the optimal size and dispatch strategy for grid-connected photovoltaics (PV) and battery storage at a site. It also allows users to explore how PV and storage can increase a site's resiliency during a grid outage.
Kamel Boulos, Maged N; Sanfilippo, Antonio P; Corley, Courtney D; Wheeler, Steve
This paper explores Technosocial Predictive Analytics (TPA) and related methods for Web "data mining" in which users' posts and queries are garnered from Social Web ("Web 2.0") tools such as blogs, micro-blogging and social networking sites to form coherent representations of real-time health events. The paper includes a brief introduction to commonly used Social Web tools such as mashups and aggregators, and maps their exponential growth as an open architecture of participation for the masses and an emerging way to gain insight into the collective health status of whole populations. Several health-related tool examples are described and demonstrated as practical means through which health professionals might create clear, location-specific pictures of epidemiological data such as flu outbreaks. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
As an increasing part of everyday life becomes connected with the web in many areas of the globe, the question of how the web mediates political processes becomes still more urgent. Several scholars have started to address this question by thinking about the web in terms of a public space. In this paper, we aim to make a twofold contribution towards the development of the concept of publics in web science. First, we propose that although the notion of publics raises a variety of issues, two major concerns continue to be user privacy and democratic citizenship on the web. Well-known arguments hold…; this paper points towards an alternative way to think about publics by proposing a pragmatist reorientation of the public/private distinction in web science, away from seeing two spheres that need to be kept separate, towards seeing the public and the private as something that is continuously connected...
Modraj Bhavsar; Mrs. P. M. Chavan
On the World Wide Web, a huge amount of content of various kinds is generated, so web recommendation has become an important part of web applications for delivering relevant results to users. Different kinds of web recommendations are made available to users every day, including images, video, audio, query suggestions, and web pages. In this paper we aim at providing a framework for web page recommendation. 1) First we describe the basics of web mining and the types of web mining. 2) Details of each...
Web 2.0 and social networks: public relations tools of exhibition centres in Spain, Portugal and Latin America
Xosé Manuel Baamonde Silva
Exhibition centres, such as trade fairs, are spaces of communication. They are not only business centres and, therefore, they can be studied from a communicative perspective. This investigation analyses the use of Web 2.0 and social networks as public relations tools. It finds that the trade fair organizations of the Iberian Peninsula and Latin America do not use an interactive, multidirectional model. Their websites are oriented towards satisfying the information needs of the media, but are not prepared to engage in direct dialogue with public opinion. Most of the Internet spaces of Spanish, Portuguese, and Latin American trade fair organizations are not taking advantage of the possibilities that Web 2.0 and social networks offer for conversing with their publics. The management of public relations has to include the use of social networks: it is no longer enough to be on the Internet; it is necessary to participate in the life of the network.
This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.
How the “Understanding Research Evidence” Web-Based Video Series From the National Collaborating Centre for Methods and Tools Contributes to Public Health Capacity to Practice Evidence-Informed Decision Making: Mixed-Methods Evaluation
Chan, Linda; Mackintosh, Jeannie
Background The National Collaborating Centre for Methods and Tools (NCCMT) offers workshops and webinars to build public health capacity for evidence-informed decision-making. Despite positive feedback for NCCMT workshops and resources, NCCMT users found key terms used in research papers difficult to understand. The Understanding Research Evidence (URE) videos use plain language, cartoon visuals, and public health examples to explain complex research concepts. The videos are posted on the NCCMT website and YouTube channel. Objective The first four videos in the URE web-based video series, which explained odds ratios (ORs), confidence intervals (CIs), clinical significance, and forest plots, were evaluated. The evaluation examined how the videos affected public health professionals’ practice. A mixed-methods approach was used to examine the delivery mode and the content of the videos. Specifically, the evaluation explored (1) whether the videos were effective at increasing knowledge on the four video topics, (2) whether public health professionals were satisfied with the videos, and (3) how public health professionals applied the knowledge gained from the videos in their work. Methods A three-part evaluation was conducted to determine the effectiveness of the first four URE videos. The evaluation included a Web-based survey, telephone interviews, and pretests and posttests, which evaluated public health professionals’ experience with the videos and how the videos affected their public health work. Participants were invited to participate in this evaluation through various open-access public health email lists, through informational flyers and posters at the Canadian Public Health Association (CPHA) conference, and through targeted recruitment to NCCMT’s network. Results In the Web-based surveys (n=46), participants achieved higher scores on the knowledge assessment questions from watching the OR (P=.04), CI (P=.04), and clinical significance (P=.05) videos but not the forest plot (P=.12) video.
Gao, Jerry Z.; Zhu, Eugene; Shim, Simon
With the increasing applications of the Web in e-commerce, advertising, and publication, new technologies are needed to improve Web graphics beyond the limitations of current technology. The SVG (Scalable Vector Graphics) technology is a revolutionary solution to the existing problems in current web technology. It provides precise, high-resolution web graphics using plain-text format commands. It sets a new standard for web graphic formats, allowing us to present complicated graphics with rich text fonts and colors, high printing quality, and dynamic layout capabilities. This paper provides a tutorial overview of SVG technology and its essential features, capabilities, and advantages. The paper also reports a comparison study between SVG and other web graphics technologies.
Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)
A Sensor Web formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock so that all of the sensor pods in the Web have synchronized clocks. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization of all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and responding, and then listen only during time slots corresponding to those pods which respond.
EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.
... topic data in XML format. Using the Web service, software developers can build applications that utilize MedlinePlus health topic information. The service accepts keyword searches as requests and returns relevant ...
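The keyword-search request described in this record can be sketched in a few lines. This is a minimal sketch, assuming the NLM `wsearch` endpoint and its `db`/`term` parameter names; consult the current MedlinePlus Web service documentation before relying on them.

```python
from urllib.parse import urlencode

# Assumed endpoint for the MedlinePlus health-topic search Web service.
BASE_URL = "https://wsearch.nlm.nih.gov/ws/query"

def build_search_url(term: str, db: str = "healthTopics") -> str:
    """Build a keyword-search request URL; the service replies with XML
    describing relevant health topics."""
    return BASE_URL + "?" + urlencode({"db": db, "term": term})

print(build_search_url("diabetes"))
# https://wsearch.nlm.nih.gov/ws/query?db=healthTopics&term=diabetes
```

An application would fetch this URL with any HTTP client and parse the returned XML, e.g. with `xml.etree.ElementTree`.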
Schwartz Communications, LLC, executes a successful PR campaign to position Subimo, a provider of online healthcare decision tools, as a leader in the industry that touts names such as WebMD.com and HealthGrades.com. Through a three-pronged media relations strategy, Schwartz and Subimo together branded the company as an industry thought-leader.
Libeskind, Noam I.; van de Weygaert, Rien; Cautun, Marius; Falck, Bridget; Tempel, Elmo; Abel, Tom; Alpaslan, Mehmet; Aragón-Calvo, Miguel A.; Forero-Romero, Jaime E.; Gonzalez, Roberto; Gottlöber, Stefan; Hahn, Oliver; Hellwing, Wojciech A.; Hoffman, Yehuda; Jones, Bernard J. T.; Kitaura, Francisco; Knebe, Alexander; Manti, Serena; Neyrinck, Mark; Nuza, Sebastián E.; Padilla, Nelson; Platen, Erwin; Ramachandra, Nesar; Robotham, Aaron; Saar, Enn; Shandarin, Sergei; Steinmetz, Matthias; Stoica, Radu S.; Sousbie, Thierry; Yepes, Gustavo
The cosmic web is one of the most striking features of the distribution of galaxies and dark matter on the largest scales in the Universe. It is composed of dense regions packed full of galaxies, long filamentary bridges, flattened sheets and vast low-density voids. The study of the cosmic web has focused primarily on the identification of such features, and on understanding the environmental effects on galaxy formation and halo assembly. As such, a variety of different methods have been devised to classify the cosmic web, depending on the data at hand, be it numerical simulations, large sky surveys or otherwise. In this paper, we bring 12 of these methods together and apply them to the same data set in order to understand how they compare. In general, these cosmic-web classifiers have been designed with different cosmological goals in mind, and to study different questions. Therefore, one would not a priori expect agreement between different techniques; however, many of these methods do converge on the identification of specific features. In this paper, we study the agreements and disparities of the different methods. For example, each method finds that knots inhabit higher density regions than filaments, and that voids have the lowest densities. For a given web environment, we find a substantial overlap in the density range assigned by each web classification scheme. We also compare classifications on a halo-by-halo basis; for example, we find that 9 of 12 methods classify around a third of group-mass haloes (i.e. Mhalo ∼ 1013.5 h-1 M⊙) as being in filaments. Lastly, so that any future cosmic-web classification scheme can be compared to the 12 methods used here, we have made all the data used in this paper public.
Smith, Craig A
Modern communications, combined with the near instantaneous publication of information on the World Wide Web, are providing the means to dramatically affect the pursuit, conduct, and public opinion of war on both sides...
"MedTRIS" (Medical Triage and Registration Informatics System): A Web-based Client Server System for the Registration of Patients Being Treated in First Aid Posts at Public Events and Mass Gatherings.
Gogaert, Stefan; Vande Veegaete, Axel; Scholliers, Annelies; Vandekerckhove, Philippe
First aid (FA) services are provisioned on-site as a preventive measure at most public events. In Flanders, Belgium, the Belgian Red Cross-Flanders (BRCF) is the major provider of these FA services with volunteers being deployed at approximately 10,000 public events annually. The BRCF has systematically registered information on the patients being treated in FA posts at major events and mass gatherings during the last 10 years. This information has been collected in a web-based client server system called "MedTRIS" (Medical Triage and Registration Informatics System). MedTRIS contains data on more than 200,000 patients at 335 mass events. This report describes the MedTRIS architecture, the data collected, and how the system operates in the field. This database consolidates different types of information with regards to FA interventions in a standardized way for a variety of public events. MedTRIS allows close monitoring in "real time" of the situation at mass gatherings and immediate intervention, when necessary; allows more accurate prediction of resources needed; allows to validate conceptual and predictive models for medical resources at (mass) public events; and can contribute to the definition of a standardized minimum data set (MDS) for mass-gathering health research and evaluation. Gogaert S , Vande veegaete A , Scholliers A , Vandekerckhove P . "MedTRIS" (Medical Triage and Registration Informatics System): a web-based client server system for the registration of patients being treated in first aid posts at public events and mass gatherings. Prehosp Disaster Med. 2016;31(5):557-562.
The Internet is undoubtedly still a revolutionary breakthrough in the history of humanity. Many people use the Internet for communication, social media, shopping, political and social agendas, and more. The concepts of the Deep Web and Dark Web are addressed not only by computer and software engineers but also by social scientists, because of the role the Internet plays for states in the international arena, for public institutions, and in human life. Starting from the very important role of the Internet for social s...
Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.
Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…
Roger M. Rowell; James S. Han; Von L. Byrd
Wood fibers can be used to produce a wide variety of low-density three-dimensional webs, mats, and fiber-molded products. Short wood fibers blended with long fibers can be formed into flexible fiber mats, which can be made by physical entanglement, nonwoven needling, or thermoplastic fiber melt matrix technologies. The most common types of flexible mats are carded, air...
Presents seven mathematics games, located on the World Wide Web, for elementary students, including: Absurd Math: Pre-Algebra from Another Dimension; The Little Animals Activity Centre; MathDork Game Room (classic video games focusing on algebra); Lemonade Stand (students practice math and business skills); Math Cats (teaches the artistic beauty…
Legasto, A.C.; Haller, J.O.; Giusti, R.J.
Congenital tracheal web is a rare entity often misdiagnosed as refractory asthma. Clinical suspicion based on patient history, examination, and pulmonary function tests should lead to its consideration. Bronchoscopy combined with CT imaging and multiplanar reconstruction is an accepted, highly sensitive means of diagnosis. (orig.)
Casey, Maire; Pahl, Claus
Component-based software engineering on the Web differs from traditional component and software engineering. We investigate Web component engineering activities that are crucial for the development, composition, and deployment of components on the Web. The current Web Services and Semantic Web initiatives strongly influence our work. Focussing on Web component composition, we develop description and reasoning techniques that support a component developer in the composition activities, focussing...
Sherif Kamel Shaheen
The research aims at evaluating Arabic libraries’ web-based catalogues in the light of the principles and recommendations published in IFLA’s Guidelines for OPAC Displays (September 30, 2003, Draft for Worldwide Review). The 38 recommendations were categorized under three main titles: user needs (12 recommendations), content and arrangement (25 recommendations), and standardization (one recommendation). That number increased to 88 elements when the recommendations were formulated as evaluative criteria and included in the study’s checklist.
How the "Understanding Research Evidence" Web-Based Video Series From the National Collaborating Centre for Methods and Tools Contributes to Public Health Capacity to Practice Evidence-Informed Decision Making: Mixed-Methods Evaluation.
Chan, Linda; Mackintosh, Jeannie; Dobbins, Maureen
The National Collaborating Centre for Methods and Tools (NCCMT) offers workshops and webinars to build public health capacity for evidence-informed decision-making. Despite positive feedback for NCCMT workshops and resources, NCCMT users found key terms used in research papers difficult to understand. The Understanding Research Evidence (URE) videos use plain language, cartoon visuals, and public health examples to explain complex research concepts. The videos are posted on the NCCMT website and YouTube channel. The first four videos in the URE web-based video series, which explained odds ratios (ORs), confidence intervals (CIs), clinical significance, and forest plots, were evaluated. The evaluation examined how the videos affected public health professionals' practice. A mixed-methods approach was used to examine the delivery mode and the content of the videos. Specifically, the evaluation explored (1) whether the videos were effective at increasing knowledge on the four video topics, (2) whether public health professionals were satisfied with the videos, and (3) how public health professionals applied the knowledge gained from the videos in their work. A three-part evaluation was conducted to determine the effectiveness of the first four URE videos. The evaluation included a Web-based survey, telephone interviews, and pretests and posttests, which evaluated public health professionals' experience with the videos and how the videos affected their public health work. Participants were invited to participate in this evaluation through various open-access public health email lists, through informational flyers and posters at the Canadian Public Health Association (CPHA) conference, and through targeted recruitment to NCCMT's network. In the Web-based surveys (n=46), participants achieved higher scores on the knowledge assessment questions from watching the OR (P=.04), CI (P=.04), and clinical significance (P=.05) videos but not the forest plot (P=.12) video.
.... The Policy requires heads of DoD Components to establish a process to identify appropriate information for posting to Web sites and to review all information placed on publicly accessible Web sites...
Usage of health-themed public service announcements as a social marketing communication tool: a content analysis of public service announcements on the Republic of Turkey Ministry of Health’s web site
Burcu İnci; Oya Sancar; Seda H. Bostancı
Public service announcements are informative short films made with the purpose of increasing the awareness of society and/or creating behavioral changes. They are also communication tools used within the context of social marketing. One of the main themes of public service announcements that may have a substantial impact on the masses is the health theme. Tobacco, blood donation, breast milk, obesity, and diabetes themed public service announcements which aimed to protect and improve...
This presentation shows how to use the WebDewey tool. Features of WebDewey. Italian WebDewey compared with American WebDewey. Querying Italian WebDewey. Italian WebDewey and MARC21. Italian WebDewey and UNIMARC. Numbers, captions, "equivalente verbale": Dewey decimal classification in Italian catalogues. Italian WebDewey and Nuovo soggettario. Italian WebDewey and LCSH. Italian WebDewey compared with printed version of Italian Dewey Classification (22. edition): advantages and disadvantages o...
Goodwin, Morten; Susar, Deniz; Nietzio, Annika
Equal access to public information and services for all is an essential part of the United Nations (UN) Declaration of Human Rights. Today, the Web plays an important role in providing information and services to citizens. Unfortunately, many government Web sites are poorly designed and have accessibility barriers that prevent people with disabilities from using them. This article combines current Web accessibility benchmarking methodologies with a sound strategy for comparing Web accessibility among countries and continents. Furthermore, the article presents the first global analysis of the Web accessibility of the 192 United Nations Member States, made publicly available. The article also identifies common properties of Member States that have accessible and inaccessible Web sites, and shows that implementing anti-disability discrimination laws is highly beneficial for the accessibility of Web sites, while...
Hassanzadeh, Hamed; Keyvanpour, Mohammad Reza
In recent years, the Semantic Web has become a topic of active research in several fields of computer science and has been applied in a wide range of domains such as bioinformatics, the life sciences, and knowledge management. The two fast-developing research areas of the Semantic Web and web mining can complement each other, and their different techniques can be used jointly or separately to solve issues in both areas. In addition, since shifting from the current web to the Semantic Web mainly depends on the enhance...
Freeman, B; Chapman, S
Many nations have banned or curtailed advertising of potentially harmful products to protect public health, particularly in the area of chronic disease control. The growth in Internet-based marketing techniques is subverting these advertising regulations. Explosive rises in use of social networking and user-generated content websites is further fuelling product promotion through electronic media. In contrast, there is a very limited body of public health research on these "new media" advertising methods. This paper provides an overview of these advertising methods and details examples relevant to chronic disease control. There is a vast untapped potential for health practitioners and researchers to exploit these same media for health promotion.
Responsive Web Design Workflow is a literature review about responsive web design, a web-standards-based modern web design paradigm. The goals of this research were to define what responsive web design is, determine its importance in building modern websites, and describe a workflow for responsive web design projects. Responsive web design is a paradigm for creating adaptive websites, which respond to the properties of the media used to render them. The three key elements of responsi...
Eminovic, Nina; Wyatt, Jeremy C.; Tarpey, Aideen M.; Murray, Gerard; Ingrams, Grant J.
Background: NHS Direct is a telephone triage service used by the UK public to contact a nurse for any kind of health problem. NHS Direct Online (NHSDO) extends NHS Direct, allowing the telephone to be replaced by the Internet, and introducing new opportunities for informing patients about their
Online public access catalogs of Mercosur in a web environment: characteristics of the UBACYT F054 Project
Elsa E. Barber
The theoretical-methodological aspects of the research project UBACYT F054 (Universidad de Buenos Aires Technical and Scientific Program, 2004-2007) are outlined. Online public access catalogs (OPACs) available in a web environment in national, academic, public, and special libraries in countries belonging to Mercosur are analyzed. Aspects related to operational control, search formulation, access points, output control, and user assistance are studied. The project aims, through both a quantitative and a qualitative approach, to make a situation diagnosis valid for the catalogs of the region. It also offers a comparative study in order to identify existing tendencies on the subject in similar libraries in Argentina, Brazil, Paraguay, and Uruguay.
van der Waaij, B.D.; Sprenkels, Ron; van Beijnum, Bernhard J.F.; Pras, Aiko
This paper discusses the design of a public-domain, web-based ATM PVC management tool for the Dutch SURFnet research ATM network. The aim of this tool is to assist in the creation and deletion of PVCs through local and remote ATM network domains. The tool includes security mechanisms to restrict the
Wing, Louise; Massoud, Tarik F
Quantitative, qualitative, and innovative application of bibliometric research performance indicators to anatomy and radiology research and education can enhance cross-fertilization between the two disciplines. We aim to use these indicators to identify long-term trends in dissemination of publications in neuroimaging anatomy (including both productivity and citation rates), which has subjectively waned in prestige during recent years. We examined publications over the last 40 years in two neuroradiological journals, AJNR and Neuroradiology, and selected and categorized all neuroimaging anatomy research articles according to theme and type. We studied trends in their citation activity over time, and mathematically analyzed these trends for 1977, 1987, and 1997 publications. We created a novel metric, "citation half-life at 10 years postpublication" (CHL-10), and used this to examine trends in the skew of citation numbers for anatomy articles each year. We identified 367 anatomy articles amongst a total of 18,110 in these journals: 74.2% were original articles, with study of normal anatomy being the commonest theme (46.7%). We recorded a mean of 18.03 citations for each anatomy article, 35% higher than for general neuroradiology articles. Graphs summarizing the rise (upslope) in citation rates after publication revealed similar trends spanning two decades. CHL-10 trends demonstrated that more recently published anatomy articles were likely to take longer to reach peak citation rate. Bibliometric analysis suggests that anatomical research in neuroradiology is not languishing. This novel analytical approach can be applied to other aspects of neuroimaging research, and within other subspecialties in radiology and anatomy, and also to foster anatomical education. © 2014 Wiley Periodicals, Inc.
This summer I assisted the RPT Program Office in developing a design plan to update their existing website to current NASA web standards. The finished website is intended for the general public, specifically potential customers interested in learning about NASA's chemical rocket test facility capabilities and test assignment process. The goal of the website is to give the public insight about the purpose and function of the RPT Program. Working on this project gave me the opportunity to learn skills necessary for effective project management. The RPT Program Office manages numerous facilities so they are required to travel often to other sites for meetings throughout the year. Maneuvering around the travel schedule of the office and the workload priority of the IT Department proved to be quite the challenge. I overcame the travel schedule of the office by frequently communicating and checking in with my mentor via email and telephone.
US Agency for International Development — WebTA is a web-based time and attendance system that supports USAID payroll administration functions, and is designed to capture hours worked, leave used and...
Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...
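The approach described above (configurable signatures defined by regular expressions) can be sketched in a few lines. The rule names and patterns below are illustrative assumptions, not the paper's actual signature set:

```python
import re

# Hypothetical signature rules, each a configurable regular expression.
# These patterns are simplified illustrations, not a production rule set.
SIGNATURES = {
    "sql_injection": re.compile(r"(\bUNION\b.+\bSELECT\b|'\s*OR\s+'1'\s*=\s*'1)", re.IGNORECASE),
    "xss": re.compile(r"<\s*script\b", re.IGNORECASE),
    "path_traversal": re.compile(r"\.\./"),
}

def classify_request(uri: str) -> list[str]:
    """Return the names of all signatures matched by a request URI."""
    return [name for name, pattern in SIGNATURES.items() if pattern.search(uri)]

# A log analyzer would apply this to each request line; a web application
# firewall would apply it before the request reaches the server.
hits = classify_request("/item.php?id=1' OR '1'='1")
```

A real analyzer would additionally decode URL escapes before matching, since trivial encoding defeats literal patterns like these.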
Pollock, Jeffrey T
Semantic Web technology is already changing how we interact with data on the Web. By connecting random information on the Internet in new ways, Web 3.0, as it is sometimes called, represents an exciting online evolution. Whether you're a consumer doing research online, a business owner who wants to offer your customers the most useful Web site, or an IT manager eager to understand Semantic Web solutions, Semantic Web For Dummies is the place to start! It will help you: know how the typical Internet user will recognize the effects of the Semantic Web; explore all the benefits the data Web offers t...
Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin; Jasche, Jens
We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.
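The abstract above describes choosing between web-type classifiers on information-theoretic grounds. One minimal, illustrative version of such a decision rule compares how concentrated each classifier's posterior over the four structure types is; the classifier names and probabilities below are invented for illustration and do not reproduce the paper's actual utility functions:

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Hypothetical posterior probabilities over the four web-types
# (void, sheet, filament, cluster) at one location, for two made-up
# classifiers; the numbers are purely illustrative.
posteriors = {
    "classifier_A": [0.70, 0.20, 0.05, 0.05],
    "classifier_B": [0.25, 0.25, 0.25, 0.25],
}

# One simple decision rule: prefer the classifier whose posterior is
# most concentrated, i.e. has the lowest entropy (most information).
best = min(posteriors, key=lambda name: entropy(posteriors[name]))
```

The paper's framework is richer (it tailors the utility to parameter inference, model selection, or prediction), but the entropy comparison captures the flavor of scoring classifiers by the information their output carries.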
This book explains the Linked Data domain by adopting a bottom-up approach: it introduces the fundamental Semantic Web technologies and building blocks, which are then combined into methodologies and end-to-end examples for publishing datasets as Linked Data, and use cases that harness scholarly information and sensor data. It presents how Linked Data is used for web-scale data integration, information management and search. Special emphasis is given to the publication of Linked Data from relational databases as well as from real-time sensor data streams. The authors also trace the transformation from the document-based World Wide Web into a Web of Data. Materializing the Web of Linked Data is addressed to researchers and professionals studying software technologies, tools and approaches that drive the Linked Data ecosystem, and the Web in general.
Leclercq, Florent [Institute of Cosmology and Gravitation (ICG), University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth PO1 3FX (United Kingdom); Lavaux, Guilhem; Wandelt, Benjamin [Institut d' Astrophysique de Paris (IAP), UMR 7095, CNRS – UPMC Université Paris 6, Sorbonne Universités, 98bis boulevard Arago, F-75014 Paris (France); Jasche, Jens, E-mail: firstname.lastname@example.org, E-mail: email@example.com, E-mail: firstname.lastname@example.org, E-mail: email@example.com [Excellence Cluster Universe, Technische Universität München, Boltzmannstrasse 2, D-85748 Garching (Germany)
Qutab, Saima; Mahmood, Khalid
Purpose: The purpose of this paper is to investigate library web sites in Pakistan, to analyse their content and navigational strengths and weaknesses and to give recommendations for developing better web sites and quality assessment studies. Design/methodology/approach: Survey of web sites of 52 academic, special, public and national libraries in…
Nguyen, Dong-Phuong; Demeester, Thomas; Trieschnigg, Rudolf Berend; Hiemstra, Djoerd
A publicly available dataset for federated search reflecting a real web environment has long been absent, making it difficult for researchers to test the validity of their federated search algorithms for the web setting. We present several experiments and analyses on resource selection on the web.
Provides current descriptions of some of the major directories that link to library catalogs on the World Wide Web. Highlights include LibWeb; Hytelnet; WebCats; WWW Library Directory; and techniques for finding new library OPAC (online public access catalog) directories. (LRW)
... 32 National Defense 5 2010-07-01 2010-07-01 false Privacy and the web. 701.119 Section 701.119... THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.119 Privacy and the web. DON activities shall consult SECNAVINST 5720.47B for guidance on what may be posted on a Navy Web site. ...
... 32 National Defense 6 2010-07-01 2010-07-01 false Privacy and the Web. 806b.51 Section 806b.51... PROGRAM Disclosing Records to Third Parties § 806b.51 Privacy and the Web. Do not post personal information on publicly accessible DoD web sites unless clearly authorized by law and implementing regulation...
Geeson, Nichola; Brandt, Jane; Quaranta, Giovanni; Salvia, Rosanna
Until around 1995 it was challenging to make the scientific results of research projects publicly available except through presentations at meetings or conferences, or as papers in academic journals. Then it began to be clear that the Internet could become the main medium to publish and share new information with a much wider audience. The DESIRE Project (desertification mitigation and remediation of land-a global approach for local solutions) has built on expertise gained in previous projects to develop an innovative online 'Harmonized Information System' (HIS). This documents the context, delivery and evaluation of all tasks in the DESIRE Project using non-scientific terminology, with much of it also available in the local languages of the study sites. The DESIRE-HIS makes use of new possibilities for communication, including video clips, interactive tools, and links to social media networks such as Twitter. Dissemination of research results using this approach has required careful planning and design. This paper sets out the steps that have culminated in a complete online Information System about local solutions to global land management problems in desertification-affected areas, including many practical guidelines for responsible land management. As many of those who are affected by desertification do not have Internet access, printable dissemination materials are also available on the DESIRE-HIS.
The Web could be much more useful if computers understood something of the information on Web pages. That is the goal of the "Semantic Web", a project in which, amongst others, Tim Berners-Lee, the inventor of the original Web, takes part.
A step-by-step tutorial approach that teaches readers what responsive web design is and how it is used in designing a responsive web page. If you are a web designer looking to expand your skill set by learning the quickly growing industry standard of responsive web design, this book is ideal for you. Knowledge of CSS is assumed.
Zhang, Chuanrong; Li, Weidong
This book covers key issues related to Geospatial Semantic Web, including geospatial web services for spatial data interoperability; geospatial ontology for semantic interoperability; ontology creation, sharing, and integration; querying knowledge and information from heterogeneous data source; interfaces for Geospatial Semantic Web, VGI (Volunteered Geographic Information) and Geospatial Semantic Web; challenges of Geospatial Semantic Web; and development of Geospatial Semantic Web applications. This book also describes state-of-the-art technologies that attempt to solve these problems such as WFS, WMS, RDF, OWL, and GeoSPARQL, and demonstrates how to use the Geospatial Semantic Web technologies to solve practical real-world problems such as spatial data interoperability.
Scientist Participation in Education and Public Outreach - Using Web Tools to Communicate the Scientific Process and Engage an Audience in Understanding Planetary Science: Examples with Lunar Reconnaissance Orbiter (LRO) Data (Invited)
Petro, N. E.
Scientists often speak to the public about their science and the current state of understanding of their field. While many talks (including those by this author) typically feature static plots, figures, diagrams, and the odd movie/animation/visualization (when technology allows), it is now possible, using the web, to guide an audience through the thought process of how a scientist tackles certain questions. The presentation will highlight examples of web tools that effectively illustrate how datasets are used to address questions of lunar science. Why would a scientist use precious time during a talk to interact with data, in real time? Why not just show the results and move on? Through experience it is evident that illustrating how data is analyzed, even in a simple form, engages an audience and demonstrates the thought process when interacting with data. While it is clear that scientists are unlikely to use such a tool to conduct science, it illustrates how a member of the public can engage with mission data. An example is discussed below. When discussing the geology of the Moon, there is an enormous volume of data that can be used to explain what we know (or think we know) and how we know it. For example, the QuickMap interface (http://www.actgate.com/home/quickmap.htm) enables interaction with a set of data (images, spectral data, topography, radar data) across the entire Moon (http://target.lroc.asu.edu/q3/). This webtool gives a speaker the opportunity (given adequate web connectivity) to talk about features, such as a crater, and show them from multiple perspectives (e.g., plan view, oblique, topographically exaggerated) in a logical flow. The tool enables illustration of topographic profiles, 3-D perspectives, and data overlays. Now, one might ask why doing this demonstration in real time is valuable over a set of static slides. In some cases static slides are best, and doing any real-time demo is unfeasible. However, guiding an engaged audience through
Shadbolt, Nigel; Berners-Lee, Tim; Hall, Wendy
The original Scientific American article on the Semantic Web appeared in 2001. It described the evolution of a Web that consisted largely of documents for humans to read to one that included data and information for computers to manipulate. The Semantic Web is a Web of actionable information--information derived from data through a semantic theory for interpreting the symbols. This simple idea, however, remains largely unrealized. Shopbots and auction bots abound on the Web, but these are esse...
Suralkar, Sunita; Joshi, Nilambari; Meshram, B B
This paper describes the need for Web project management and the fundamentals of project management for web projects: what it is, why projects go wrong, and what's different about web projects. We also discuss cost estimation techniques based on size metrics. Though Web project development is similar to traditional software development, the special characteristics of Web application development require the adaptation of many software engineering approaches or even the development of comple...
Eysenbach, Gunther; Trudel, Mathieu
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full-text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. In addition, WebCite can process publisher-submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, caching retrospectively references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research
Since February 2002, the French nuclear safety authority (ASN) has published on its web site (http://www.asn.gouv.fr) the letters addressed to the operators of nuclear facilities at the end of its inspections. The ASN carries out about 700 inspections every year, which concern the French nuclear facilities, the central services of nuclear operators or of their suppliers, and the transports of nuclear materials. Each inspection is followed by a follow-up letter which mentions all anomalies noticed during the inspection and, where necessary, asks for remedial actions or for complementary information. This document brings together the letters published between February and May 2002 concerning the on-site inspections of nuclear facilities (EdF nuclear power plants, CEA centers, Cogema facilities, other sites) and the off-site inspections (Andra, transports of nuclear materials). (J.S.)
Laura Cristina Simões Viana
The increasing adoption of information and communication technology (ICT) in public administration has changed the way governments make the purchases of products and service procurement, activities that are essential for the rendering of services in quantities and quality appropriate to meet the needs of the population. The present project suggests the use of a type of ICT aiming at good governance of scientific research and technological development in inputs for health at the Fundação Oswaldo Cruz: Web semantics and ontologies. From a theoretical point of view, this project is in tune with one of the emerging approaches for the understanding and outlining of current policies for research and technological development in health - an innovation systems approach. Despite the economic advantages of the adoption of electronic methods in governmental purchasing policy, it is necessary to keep in mind that it is a long term change process, since many administrative stages are being transferred to the electronic environment, requiring a new work flow design, as well as integration of electronic purchasing and management and administration systems such as, for instance, orders, purchase orders, logistics, finance and accounting. We propose the sharing of ontologies so as to allow interoperability between systems used in the purchasing process as well as in other key institutional management and administration systems.
CLAUDIA ELENA DINUCĂ
The World Wide Web has become one of the most valuable resources for information retrieval and knowledge discovery due to the permanent increase in the amount of data available online. Given the Web's dimensions, users easily get lost in its rich hyperstructure. Applying data mining methods is the right solution for knowledge discovery on the Web. The knowledge extracted from the Web can be used to raise the performance of Web information retrieval, question answering and Web-based data warehousing. In this paper, I provide an introduction to the categories of Web mining and focus on one of them: Web structure mining. Web structure mining, one of the three categories of Web mining, is a tool used to identify the relationships between Web pages linked by information or direct link connection. It offers information about how different pages are linked together to form this huge web. Web structure mining finds hidden basic structures and uses hyperlinks for further web applications such as web search.
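The core idea of web structure mining — scoring pages purely from the hyperlink graph — can be sketched with a bare-bones PageRank-style iteration. The four-page graph below is invented for illustration:

```python
# Hypothetical link graph: each page maps to the pages it links to.
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
    "orphan": ["home"],  # nothing links here, so it should rank lowest
}

def pagerank(graph, damping=0.85, iterations=50):
    """Plain power-iteration PageRank over a dict-of-lists link graph.

    Assumes every page has at least one outlink (no dangling nodes),
    which keeps the sketch short; real implementations handle dangling
    pages and convergence tests.
    """
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            share = rank[page] / len(outlinks)  # split rank over outlinks
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

scores = pagerank(links)
```

Pages with many inlinks ("home") end up with high scores and unlinked pages ("orphan") with low ones — exactly the "hidden basic structure" that link-based search exploits.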
Dolog, Peter; Nejdl, Wolfgang
Ontologies and reasoning are the key terms brought into focus by the semantic web community. Formal representation of ontologies in a common data model on the web can be taken as a foundation for adaptive web technologies as well. This chapter describes how ontologies shared on the semantic web provide conceptualization for the links which are a main vehicle to access information on the web. The subject domain ontologies serve as constraints for generating only those links which are relevant for the domain a user is currently interested in. Furthermore, user model ontologies provide additional means for deciding which links to show, annotate, hide, generate, and reorder. The semantic web technologies provide means to formalize the domain ontologies and metadata created from them. The formalization enables reasoning for personalization decisions. This chapter describes which components...
Hu, Y; Yang, Q P; Sun, X; Wei, P
Enterprise Web provides a convenient, extendable, integrated platform for information sharing and knowledge management. However, it still has many drawbacks due to complexity and increasing information glut, as well as the heterogeneity of the information processed. Research in the field of Semantic Web Services has shown the possibility of adding higher level of semantic functionality onto the top of current Enterprise Web, enhancing usability and usefulness of resource, enabling decision su...
Jessen, Iben Bredahl; Graakjær, Nicolai Jørgensgaard
Sound seems to be a neglected issue in the study of web ads. Web advertising is predominantly regarded as a visual phenomenon–commercial messages, such as banner ads, that we watch, read, and eventually click on–but only rarely as something that we listen to. The present chapter presents an overview of the auditory dimensions of web advertising: Which kinds of sounds do we hear in web ads? What are the conditions and functions of sound in web ads? Moreover, the chapter proposes a theoretical framework for analysing the communicative functions of sound in web advertising. The main argument is that an understanding of the auditory dimensions of web advertising must include a reflection on the hypertextual settings of the web ad as well as a perspective on how users engage with web content.
The life stories of migrants are increasingly being told, as part of the work of cultural organizations, and websites are well suited to making such life story projects accessible to the public. However, by using the lives of real people as raw material in a public forum, Web projects raise...
Schweiger, Dominik; Trajanoski, Zlatko; Pabinger, Stephan
Semantic Web has established itself as a framework for using and sharing data across applications and database boundaries. Here, we present a web-based platform for querying biological Semantic Web databases in a graphical way. SPARQLGraph offers an intuitive drag & drop query builder, which converts the visual graph into a query and executes it on a public endpoint. The tool integrates several publicly available Semantic Web databases, including the databases of the just recently released EBI RDF platform. Furthermore, it provides several predefined template queries for answering biological questions. Users can easily create and save new query graphs, which can also be shared with other researchers. This new graphical way of creating queries for biological Semantic Web databases considerably facilitates usability as it removes the requirement of knowing specific query languages and database structures. The system is freely available at http://sparqlgraph.i-med.ac.at.
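The core of a graphical query builder like the one described above is the translation of a visual graph into SPARQL text. A minimal sketch of that step follows; the function name, triple patterns, and prefixes are illustrative assumptions, not SPARQLGraph's actual internal representation:

```python
# Sketch: serialize a "visual graph" (variables plus subject/predicate/object
# triples) into a SPARQL SELECT query string. All names here are invented
# for illustration.
def graph_to_sparql(variables, triples, limit=10):
    patterns = " .\n  ".join(f"{s} {p} {o}" for s, p, o in triples)
    return (
        f"SELECT {' '.join(variables)}\n"
        f"WHERE {{\n  {patterns} .\n}}\n"
        f"LIMIT {limit}"
    )

# Two nodes and two edges, as a user might draw them: a gene node typed
# ex:Gene, with its rdfs:label attached.
query = graph_to_sparql(
    ["?gene", "?label"],
    [("?gene", "rdf:type", "ex:Gene"),
     ("?gene", "rdfs:label", "?label")],
)
```

The generated string would then be POSTed to a public SPARQL endpoint; the value of the graphical approach is that users never have to write this text themselves.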
Frasincar, F.; Houben, G.J.P.M.; Barna, P.; Vasilecas, O.; Eder, J.; Caplinskas, A.
Hera is a model-driven methodology for designing Web information systems. In the past a CASE tool for the Hera methodology was implemented. This software had different components that together form one centralized application. In this paper, we present a distributed Web service-oriented architecture
The Integrated Propulsion Data System's (IPDS) focus is to provide technologically-advanced philosophies of doing business at SSC that will enhance the existing operations, engineering and management strategies and provide insight and metrics to assess their daily impacts, especially as related to the Propulsion Test Directorate testing scenarios for the 21st Century.
Web 2.0 has brought a change to how we communicate and disseminate information with the use of Twitter, Facebook, YouTube, instant messaging and blogging. This technology is beginning to be used in the health field for public awareness campaigns, emergency health alerts, medical education and remote healthcare services. Australian Health Information Managers will be called upon to reconcile their organisations' policies and procedures regarding the use of Web 2.0 technologies within the existing legal framework of privacy, confidentiality and consent. This article explores various applications of Web 2.0, their benefits and some of their potential legal and ethical implications when implemented in Australia.
Ratnayake, Rakhitha Nimesh
This book is intended for WordPress developers and designers who want to develop quality web applications within a limited time frame and for maximum profit. Prior knowledge of basic web development and design is assumed.
Discussion of ways to promote sites on the World Wide Web focuses on how search engines work and how they retrieve and identify sites. Appropriate Web links for submitting new sites and for Internet marketing are included. (LRW)
This book is perfect for beginners who want to get started and learn the web development basics, but also offers experienced developers a web development roadmap that will help them to extend their capabilities.
U.S. Environmental Protection Agency — EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's...
Syverson, Paul F; Reed, Michael G; Goldschlag, David M
.... These are both kept confidential from network elements as well as external observers. Private Web browsing is achieved by unmodified Web browsers using anonymous connections by means of HTTP proxies...
Web application security has been a major issue in information technology since the advent of dynamic web applications. The main objective of this project was to carry out a detailed study of the top three web application vulnerabilities: injection; cross-site scripting; and broken authentication and session management. We present situations in which an application can be vulnerable to these web threats and finally provide preventative measures against them. ...
Bailey, James; Bry, François; Eckert, Michael; Patrânjan, Paula Lavinia
Reactivity, the ability to detect simple and composite events and respond in a timely manner, is an essential requirement in many present-day information systems. With the emergence of new, dynamic Web applications, reactivity on the Web is receiving increasing attention. Reactive Web-based systems need to detect and react not only to simple events but also to complex, real-life situations. This paper introduces XChange, a language for programming reactive behaviour on the Web,...
Money, William H.
Instructors should be concerned with how to incorporate the World Wide Web into an information systems (IS) curriculum organized across three areas of knowledge: information technology, organizational and management concepts, and theory and development of systems. The Web fits broadly into the information technology component. For the Web to be…
Antoniou, Grigoris; Harmelen, Frank van
The development of the Semantic Web, with machine-readable content, has the potential to revolutionize the World Wide Web and its use. A Semantic Web Primer provides an introduction and guide to this still emerging field, describing its key ideas, languages, and technologies. Suitable for use as a
Snider, Jean; Martin, Florence
Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…
The World Wide Web is emerging as an all-in-one information source. Tools for searching Web-based information include search engines, subject directories and meta search tools. We take a look at key features of these tools and suggest practical hints for effective Web searching.
Semantic Web application areas are experiencing intensified interest due to the rapid growth in the use of the Web, together with the innovation and renovation of information content technologies. The Semantic Web is regarded as an integrator across...
Mai, Jens Erik
This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges that call for inquiries into the theoretical foundation of bibliographic classification theory.
Göçmen, Z. Asligül
Web-based geographic information system (GIS) technology, or web-based GIS, offers many opportunities for public planners and Extension educators who have limited GIS backgrounds or resources. However, investigation of its use in planning has been limited. The study described here examined the use of web-based GIS by public planning agencies. A…
The web and the social web play an increasingly important role as an information source for Members of Parliament and their assistants, journalists, political analysts and researchers. They provide crucial background information, such as reactions to political events and comments made by the general public. The case study presented in this paper is driven by two European parliaments (the Greek and the Austrian parliament) and targets an effective exploration of political web archives. In this paper, we describe semantic technologies deployed to ease the exploration of archived web and social web content and present evaluation results.
Feldbrugge, Job; van de Weygaert, Rien; Hidding, Johan; Feldbrugge, Joost
arbitrary dynamics. Most important in the present context is that it allows us to follow and describe the full three-dimensional geometric and topological complexity of the purely gravitationally evolving nonlinear cosmic matter field. While generic and statistical results can be based on the eigenvalue characteristics, one of our key findings is that of the significance of the eigenvector field of the deformation field for outlining the entire spatial structure of the caustic skeleton emerging from a primordial density field. In this paper we explicitly consider the caustic conditions for the three-dimensional Zel'dovich approximation, extending earlier work on those for one- and two-dimensional fluids towards the full spatial richness of the cosmic web. In an accompanying publication, we apply this towards a full three-dimensional study of caustics in the formation of the cosmic web and evaluate in how far it manages to outline and identify the intricate skeletal features in the corresponding N-body simulations.
Bouguettaya, Athman; Daniel, Florian
Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet.Web Services Foundations is the first installment of a two-book collection coverin
Since the first edition of this classic reference was published, World Wide Web use has exploded and e-commerce has become a daily part of business and personal life. As Web use has grown, so have the threats to our security and privacy--from credit card fraud to routine invasions of privacy by marketers to web site defacements to attacks that shut down popular web sites. Web Security, Privacy & Commerce goes behind the headlines, examines the major security risks facing us today, and explains how we can minimize them. It describes risks for Windows and Unix, Microsoft Internet Exp
Perrucci, GP; Fitzek, FHP; Zhang, Qi
This paper advocates a novel approach for mobile web browsing based on cooperation among wireless devices within close proximity operating in a cellular environment. In the current state of the art, mobile phones can access the web using different cellular technologies. However, the supported data … short-range links can then be used for cooperative mobile web browsing. By implementing the cooperative web browsing on commercial mobile phones, it will be shown that better performance is achieved in terms of increased data rate and therefore reduced access times, resulting in a significantly enhanced web …
Davis, Derrick D.; Armstrong, Curtis D.
During my 2015 internship at Stennis Space Center (SSC) I supported the development of a web based tool to enable user interaction with a low-cost environmental monitoring buoy called the DRIFTER. DRIFTERs are designed by SSC's Applied Science and Technology Projects branch and are used to measure parameters such as water temperature and salinity. Data collected by the buoys help verify measurements by NASA satellites, which contributes to NASA's mission to advance understanding of the Earth by developing technologies to improve the quality of life on our home planet. My main objective during this internship was to support the development of the DRIFTER by writing web-based software that allows the public to view and access data collected by the buoys. In addition, this software would enable DRIFTER owners to configure and control the devices.
This article critically interrogates key assumptions in popular web discourse by revisiting an early example of web 'participation.' Against the claim that Web 2.0 technologies ushered in a new paradigm of participatory media, I turn to the history of HotWired, Wired magazine's ambitious web-only publication launched in 1994. The case shows how debates about the value of amateur participation vis-à-vis editorial control have long been fundamental to the imagination of the web's difference from …
種市, 淳子; 逸村, 裕; TANEICHI, Junko; ITSUMURA, Hiroshi
In this study, we discussed information seeking behavior on the Web. First, the current Web-searching studies are reviewed from the perspective of: (1) Web-searching characteristics; (2) the process model for how users evaluate Web resources. Secondly, we investigated information seeking processes using the Web search engine and online public access catalogue (OPAC) system by undergraduate students, through an experiment and its protocol analysis. The results indicate that: (1) Web-searching processes …
Antunes, A. K.
We suggest improvements to citation standards and creation of remuneration opportunities to encourage career scientist contributions to Web2.0 and social media science channels. At present, agencies want to accomplish better outreach and engagement with no funding, while scientists sacrifice their personal time to contribute to web and social media sites. Securing active participation by scientists requires career recognition of the value scientists provide to web knowledge bases and to the general public. One primary mechanism to encourage participation is citation standards, which let a contributor improve their reputation in a quantifiable way. But such standards must be recognized by their scientific and workplace communities. Using case studies such as the acceptance of the web in the workplace and the growth of open access journals, we examine what agencies and individuals can do, as well as the time scales needed to secure increased active contribution by scientists. We also discuss ways to jumpstart this process.
Guruvadoo, Eranna K.
In order to promote a NASA-wide educational outreach program to educate and inform the public of space exploration, NASA, at Kennedy Space Center, is seeking efficient ways to add more content to the web by streaming audio/video files. This project proposes a high level overview of a framework for the creation, management, and scheduling of audio/video assets over the web. To support short-term goals, the prototype of a web-based tool is designed and demonstrated to automate the process of streaming audio/video files. The tool provides web-enabled user interfaces to manage video assets, create publishable schedules of video assets for streaming, and schedule the streaming events. These operations are performed on user-defined and system-derived metadata of audio/video assets stored in a relational database while the assets reside in a separate repository. The prototype tool is designed using ColdFusion 5.0.
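The metadata-driven design described above (asset records in a relational database, media files in a separate repository) might be sketched as follows. This is a hypothetical illustration, not the actual ColdFusion tool: the table and column names are assumptions, and sqlite3 stands in for whatever database the prototype used.

```python
# Illustrative sketch: asset metadata in a relational database, with the
# streaming schedule built by joining schedule slots to asset records.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE assets (
    id INTEGER PRIMARY KEY,
    title TEXT,
    repo_path TEXT,          -- location in the external media repository
    duration_sec INTEGER)""")
con.execute("""CREATE TABLE schedule (
    asset_id INTEGER REFERENCES assets(id),
    start_time TEXT)""")

con.executemany("INSERT INTO assets VALUES (?, ?, ?, ?)", [
    (1, "Launch highlights", "/repo/launch.mp4", 300),
    (2, "Station tour", "/repo/tour.mp4", 600),
])
con.execute("INSERT INTO schedule VALUES (1, '2004-07-01T14:00:00')")

# Build the publishable streaming schedule from metadata alone; the media
# files themselves never enter the database.
rows = con.execute("""SELECT s.start_time, a.title, a.repo_path
                      FROM schedule s JOIN assets a ON a.id = s.asset_id
                      ORDER BY s.start_time""").fetchall()
print(rows)  # [('2004-07-01T14:00:00', 'Launch highlights', '/repo/launch.mp4')]
```

The key design point is the separation of concerns: the scheduler only ever queries metadata, so the repository holding the large media files can be moved or replaced without touching the scheduling logic.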
Casteleyn, Sven; Daniel, Florian; Dolog, Peter
Nowadays, Web applications are almost omnipresent. The Web has become a platform not only for information delivery, but also for eCommerce systems, social networks, mobile services, and distributed learning environments. Engineering Web applications involves many intrinsic challenges due to their distributed nature, content orientation, and the requirement to make them available to a wide spectrum of users who are unknown in advance. The authors discuss these challenges in the context of well-established engineering processes, covering the whole product lifecycle from requirements engineering through design and implementation to deployment and maintenance. They stress the importance of models in Web application development, and they compare well-known Web-specific development processes like WebML, WSDM and OOHDM to traditional software development approaches like the waterfall model and the spiral model …
Luján Mora, Sergio
Acknowledgments 1. Introduction to web applications 2. Server installation 3. Web page design 4. Structured text format: XML 5. Dynamic content 6. Database access: JDBC 7. Web services 8. Use and maintenance 9. Monitoring and analysis Bibliography GNU Free Documentation License
Shadbolt, Nigel; Berners-Lee, Tim
The relentless rise in Web pages and links is creating emergent properties, from social networks to virtual identity theft, that are transforming society. A new discipline, Web Science, aims to discover how Web traits arise and how they can be harnessed or held in check to benefit society. Important advances are beginning to be made; more work can solve major issues such as securing privacy and conveying trust.
Segaran, Toby; Taylor, Jamie
With this book, the promise of the Semantic Web -- in which machines can find, share, and combine data on the Web -- is not just a technical possibility, but a practical reality. Programming the Semantic Web demonstrates several ways to implement semantic web applications, using current and emerging standards and technologies. You'll learn how to incorporate existing data sources into semantically aware applications and publish rich semantic data. Each chapter walks you through a single piece of semantic technology and explains how you can use it to solve real problems. Whether you're writing …
While the REST design philosophy has captured the imagination of web and enterprise developers alike, using this approach to develop real web services is no picnic. This cookbook includes more than 100 recipes to help you take advantage of REST, HTTP, and the infrastructure of the Web. You'll learn ways to design RESTful web services for client and server applications that meet performance, scalability, reliability, and security goals, no matter what programming language and development framework you use. Each recipe includes one or two problem statements, with easy-to-follow, step-by-step instructions …
Today's market for mobile apps goes beyond the iPhone to include BlackBerry, Nokia, Windows Phone, and smartphones powered by Android, webOS, and other platforms. If you're an experienced web developer, this book shows you how to build a standard app core that you can extend to work with specific devices. You'll learn the particulars and pitfalls of building mobile apps with HTML, CSS, and other standard web tools. You'll also explore platform variations, finicky mobile browsers, Ajax design patterns for mobile, and much more. Before you know it, you'll be able to create mashups using Web 2.0 …
Bouguettaya, Athman; Daniel, Florian
Web services and Service-Oriented Computing (SOC) have become thriving areas of academic research, joint university/industry research projects, and novel IT products on the market. SOC is the computing paradigm that uses Web services as building blocks for the engineering of composite, distributed applications out of the reusable application logic encapsulated by Web services. Web services could be considered the best-known and most standardized technology in use today for distributed computing over the Internet. This book is the second installment of a two-book collection covering the state-of-the-art …
There are many different techniques, tools, and program libraries used for web application development, and their approaches to web development differ somewhat from one another. This thesis examines, both in theory and through a practical example project, the techniques and libraries most commonly used in web application development. The example web application created in this work used the Laravel framework together with the tools and libraries covered in the first part, such as Bootstrap and …
The easiest way to learn how to create a Web page for your family or organization Do you want to share photos and family lore with relatives far away? Have you been put in charge of communication for your neighborhood group or nonprofit organization? A Web page is the way to get the word out, and Creating Web Pages Simplified offers an easy, visual way to learn how to build one. Full-color illustrations and concise instructions take you through all phases of Web publishing, from laying out and formatting text to enlivening pages with graphics and animation. This easy-to-follow visual guide shows …
Harper, Simon; Yesilada, Yeliz
Access to, and movement around, complex online environments, of which the World Wide Web (Web) is the most popular example, has long been considered an important and major issue in the Web design and usability field. The commonly used slang phrase ‘surfing the Web’ implies rapid and free access, pointing to its importance among designers and users alike. It has also been long established that this potentially complex and difficult access is further complicated, and becomes neither rapid nor free, if the user is disabled. There are millions of people who have disabilities that affect their use of the Web. Web accessibility aims to help these people to perceive, understand, navigate, and interact with, as well as contribute to, the Web, and thereby society in general. This accessibility is, in part, facilitated by the Web Content Accessibility Guidelines (WCAG) currently moving from version one to two. These guidelines are intended to encourage designers to make sure their sites conform to specifications, and in that conformance enable the assistive technologies of disabled users to better interact with the page content. In this way, it was hoped that accessibility could be supported. While this is in part true, guidelines do not solve all problems and the new WCAG version two guidelines are surrounded by controversy and intrigue. This chapter aims to establish the published literature related to Web accessibility and Web accessibility guidelines, and discuss limitations of the current guidelines and future directions.
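One machine-checkable point of the WCAG guidance discussed above is that images must carry text alternatives. A minimal sketch of such a check, using only Python's standard library, might look like the following; this is an illustrative fragment, not a WCAG conformance checker, and the sample HTML is invented.

```python
# Sketch: flag <img> tags that lack a non-empty alt attribute, one of the
# simplest automatable accessibility checks.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images without usable alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not (attrs.get("alt") or "").strip():
                self.missing.append(attrs.get("src", "?"))

def images_missing_alt(html):
    checker = AltChecker()
    checker.feed(html)
    return checker.missing

doc = '<p><img src="logo.png" alt="Site logo"><img src="chart.png"></p>'
print(images_missing_alt(doc))  # ['chart.png']
```

Checks like this cover only a sliver of the guidelines, which is precisely the chapter's point: much of accessibility cannot be verified mechanically and still depends on designer judgment.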
Building a web application that attracts and retains regular visitors is tricky enough, but creating a social application that encourages visitors to interact with one another requires careful planning. This book provides practical solutions to the tough questions you'll face when building an effective community site -- one that makes visitors feel like they've found a new home on the Web. If your company is ready to take part in the social web, this book will help you get started. Whether you're creating a new site from scratch or reworking an existing site, Building Social Web Applications
Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju
With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…
Ingram, Albert L.
Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…
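The kind of log analysis described above can be sketched in a few lines, assuming the server writes the Apache Common Log Format; the sample lines and the choice to count only successful GET requests are illustrative assumptions, not taken from the article.

```python
# Sketch: tally page hits from Common Log Format access-log lines.
import re
from collections import Counter

LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def page_hits(lines):
    """Count successful GET requests per path."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m["method"] == "GET" and m["status"].startswith("2"):
            hits[m["path"]] += 1
    return hits

sample = [
    '10.0.0.1 - - [01/Feb/2005:10:00:00 +0000] "GET /syllabus.html HTTP/1.0" 200 512',
    '10.0.0.2 - - [01/Feb/2005:10:00:05 +0000] "GET /syllabus.html HTTP/1.0" 200 512',
    '10.0.0.2 - - [01/Feb/2005:10:00:09 +0000] "GET /missing.html HTTP/1.0" 404 0',
]
print(page_hits(sample).most_common(1))  # most requested page first
```

Grouping the same records by host or by timestamp, rather than by path, yields the other analyses the article mentions, such as who uses the site and when.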
We analyzed and examined the public perception of having a web based information system for orphanage management and also designed and implemented a web based system for management of orphanages. The system we developed keeps track of orphanages, the orphans, the help received by the orphanages, and …
Kohtz, Cindy; Gowda, Connie; Stockert, Patricia; White, Jane; Kennel, Lynn
Although many publications laud the potential benefits of using Web 2.0 technologies in nursing education, little has been published on the extent of their use. This descriptive study examined the personal and academic use of Web 2.0 technologies among nursing students enrolled in 3 different baccalaureate programs.
Longan, Michael W.
The geography education literature touts the World Wide Web (Web) as a revolutionary educational tool, yet most accounts ignore its uses for public communication and creative expression. This article argues that students can be producers of content that is of service to local audiences. Drawing inspiration from the community networking movement,…
Smith, Susan Sharpless
Expanding on the popular, practical how-to guide for public, academic, school, and special libraries, technology expert Susan Sharpless Smith offers library instructors the confidence to take Web-based instruction into their own hands. Smith has thoroughly updated "Web-Based Instruction: A Guide for Libraries" to include new tools and trends,…
Bertot, John Carlo
Web-based surveys are not new to the library environment. Although such surveys began as extensions of print surveys, the Web-based environment offers a number of approaches to conducting a survey that the print environment cannot duplicate easily. Since 1994, the author and others have conducted national surveys of public library Internet…
Sørensen, Jannick Kirk
Between 2006 and 2011, a number of European public service broadcasting (PSB) organisations offered their website users the opportunity to create their own PSB homepage. The web customisation was conceived by the editors as a response to developments in commercial web services, particularly social...
This article gives tips on how to avoid having content stolen by plagiarists. Suggestions include: using a Web search service such as Google to search for unique strings of text at the individual's site to uncover other sites with the same content; buying an infringement-detection program; or hiring a public relations firm to do the work. There are…
Kieran, Shirley; Sauve, Diane
Provides an overview of the Quebec National Library (Bibliotheque Nationale du Quebec, or BNQ) Web site. Highlights include issues related to content, design, and technology; IRIS, the BNQ online public access catalog; development of the multimedia catalog; software; digitization of documents; links to bibliographic records; and future…
The purpose of this paper is to discuss the creation and intended evolution of the Rice University mobile online public access catalog (OPAC). The focus of the article is on how SirsiDynix’s Symphony Web Services can be used to create a mobile OPAC.
Typography is one of the most important elements of web design and marketing. Good typography makes web design more appealing, which is important for readers in evaluating titles and the quality of text. The aim of this thesis is to provide a characterization of good and bad typography. I will use this characterization to identify modern typographical trends in a digital context …
This article lists the Web sites of 12 international not-for-profit creativity associations designed to trigger more creative thought and research possibilities. Along with Web addresses, the entries include telephone contact information and a brief description of the organization. (CR)
If you are a web programmer with experience in developing web services and have a rudimentary knowledge of using Go, then this is the book for you. Basic knowledge of Go as well as knowledge of relational databases and non-relational NoSQL datastores is assumed. Some basic concurrency knowledge is also required.
de Jong, Menno D.T.; van der Geest, Thea
This article is intended to make Web designers more aware of the qualities of heuristics by presenting a framework for analyzing the characteristics of heuristics. The framework is meant to support Web designers in choosing among alternative heuristics. We hope that better knowledge of the …
A. Pouloudi; J. Paarlberg; H.W.G.M. van Heck (Eric)
This paper argues that a better understanding of the business model of web auctions can be reached if we adopt a broader view and provide empirical research from different sites. In this paper the business model of web auctions is refined into four dimensions. These are auction model, …
Scheduled webinars can help you better manage EPA web content. Class topics include Drupal basics, creating different types of pages in the WebCMS such as document pages and forms, using Google Analytics, and best practices for metadata and accessibility.
Fels, Deborah I.; Richards, Jan; Hardman, Jim; Lee, Daniel G.
The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, Web accessing can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The…
Hitzler, Pascal; Van Harmelen, Frank
The realization of Semantic Web reasoning is central to substantiating the Semantic Web vision. However, current mainstream research on this topic faces serious challenges, which forces us to question established lines of research and to rethink the underlying approaches. We argue that reasoning for …
Church, Jennifer; Felker, Kyle
The dynamic world of the Web has provided libraries with a wealth of opportunities, including new approaches to the provision of information and varied internal staffing structures. The development of self-managed Web teams, endowed with authority and resources, can create an adaptable and responsive culture within libraries. This new working team…
Weaver, Nicholas; Kreibich, Christian; Dam, Martin
…,000 clients that include a novel proxy location technique based on traceroutes of the responses to TCP connection establishment requests, which provides additional clues regarding the purpose of the identified web proxies. Overall, we see 14% of Netalyzr-analyzed clients with results that suggest the presence of web proxies.
The web site is a library's most important feature. Patrons use the web site for numerous functions, such as renewing materials, placing holds, requesting information, and accessing databases. The homepage is the place they turn to look up the hours, branch locations, policies, and events. Whether users are at work, at home, in a building, or on…
In 1993, CERN launched the World Wide Web. The first browser, Mosaic, was developed in the United States. Since then there have been many battles for dominance over the browser market. Now CERN is working on a new revolution, the Grid (2 pages)
Public approval hinges not only on delivering the information the public wants but on providing tangible evidence that we are listening to public concerns. We must respond. Public acceptance depends on making real change which speaks to people's concerns. The message that the public wants to hear is that governments are listening and acting on what they hear. In Canada, the nuclear regulator is increasingly active in the public arena. We held cross-country consultations as we prepared Canada's strong new Act and regulations. We have developed information vehicles such as the Radiation Index and our web site. We continue to extensively involve the public in our licensing process. All licensing hearings are open to the public. Nothing is harder to capture than public trust. This conference marks a substantial investment in learning and in our common future. We can work to build our credibility as regulators who act on public concerns. (N.C.)
Embedded systems are currently a particular focus in computer technology; various Linux operating systems and web servers have been prepared to support embedded systems, and one application that can run in an embedded setting is a web server. The choice of web server for an embedded environment is still rarely studied, so this research concentrates on two web server applications whose main features offer "lightness" in CPU and memory consumption: Light HTTPD and Tiny HTTPD. Using the thread parameters (users, ramp-up periods, and loop count) in a stress test of the embedded system, this study determines which of Light HTTPD and Tiny HTTPD is the better fit for embedded use on a BeagleBoard, judged by CPU and memory consumption. The results show that, with respect to CPU consumption on the BeagleBoard embedded system, Light HTTPD is recommended over Tiny HTTPD, because there is a very significant difference in CPU load between the two web services. Keywords: embedded system, web server
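The users/loop-count style of stress test described above can be sketched with Python's standard library alone. This is an illustrative harness, not the study's actual setup: a throwaway local HTTP server stands in for the BeagleBoard-hosted Light HTTPD or Tiny HTTPD, and only per-request latency is measured (not CPU or memory load).

```python
# Sketch: `users` client threads each issue `loop_count` GET requests
# against a local throwaway server, collecting per-request latencies.
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class OkHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # suppress per-request logging so timing output stays clean

def stress_test(users=4, loop_count=5):
    server = ThreadingHTTPServer(("127.0.0.1", 0), OkHandler)  # port 0: any free port
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()

    latencies, lock = [], threading.Lock()

    def client():
        for _ in range(loop_count):
            start = time.perf_counter()
            with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
                resp.read()
            with lock:
                latencies.append(time.perf_counter() - start)

    threads = [threading.Thread(target=client) for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    server.shutdown()
    return latencies

latencies = stress_test()
print(f"{len(latencies)} requests, worst latency {max(latencies):.4f}s")
```

A ramp-up period, as in the study's parameters, would simply stagger the thread start times instead of launching all clients at once.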
Zaretzki, J.; Bergeron, C.; Huang, T.-W.
Regioselectivity-WebPredictor (RS-WebPredictor) is a server that predicts isozyme-specific cytochrome P450 (CYP)-mediated sites of metabolism (SOMs) on drug-like molecules. Predictions may be made for the promiscuous 2C9, 2D6 and 3A4 CYP isozymes, as well as CYPs 1A2, 2A6, 2B6, 2C8, 2C19 and 2E1. RS-WebPredictor is the first freely accessible server that predicts the regioselectivity of the last six isozymes. Server execution time is fast, taking on average 2s to encode a submitted molecule and 1s to apply a given model, allowing for high-throughput use in lead optimization projects. Availability: RS-WebPredictor is accessible for free use at http://reccr.chem.rpi.edu/Software/RS-WebPredictor
Information and services on the web are accessible for everyone. Users of the web differ in their background, culture, political and social environment, interests and so on. Ambient intelligence was envisioned as a concept for systems which are able to adapt to user actions and needs. With the growing amount of information and services, the web applications become natural candidates to adopt the concepts of ambient intelligence. Such applications can deal with diverse user intentions and actions based on the user profile and can suggest the combination of information content and services which suit the user profile the most. This paper summarizes the domain engineering framework for such adaptive web applications. The framework provides guidelines to develop adaptive web applications as members of a family. It suggests how to utilize the design artifacts as knowledge which can be used …
Hyldegård, Jette; Lund, Haakon
The paper presents the results from a study on information literacy in a higher education (HE) context, based on a larger research project evaluating 3 Norwegian IL web tutorials at 6 universities and colleges in Norway. The aim was to evaluate how the 3 web tutorials served students’ information seeking and writing process in a study context, and to identify barriers to the employment and use of the IL web tutorials, hence to the underlying information literacy intentions of the developer. Both qualitative and quantitative methods were employed. A clear mismatch was found between intention and use of the web tutorials. In addition, usability only played a minor role compared to relevance. It is concluded that the positive expectations of the IL web tutorials tend to be overrated by the developers. Suggestions for further research are presented.
Rights and Justice and the Social Web Movement (Latin America) ... mounted to raise public awareness of the importance of privacy as a human right on the Internet. ... conference of McGill's Institute for the Study of International Development.
Ashford, Miriam Thiel; Olander, Ellinor K; Ayers, Susan
One relatively new and increasingly popular approach of increasing access to treatment is Web-based intervention programs. The advantage of Web-based approaches is the accessibility, affordability, and anonymity of potentially evidence-based treatment. Despite much research evidence on the effectiveness of Web-based interventions for anxiety found in the literature, little is known about what is publicly available for potential consumers on the Web. Our aim was to explore what a consumer searching the Web for Web-based intervention options for anxiety-related issues might find. The objectives were to identify currently publicly available Web-based intervention programs for anxiety and to synthesize and review these in terms of (1) website characteristics such as credibility and accessibility; (2) intervention program characteristics such as intervention focus, design, and presentation modes; (3) therapeutic elements employed; and (4) published evidence of efficacy. Web keyword searches were carried out on three major search engines (Google, Bing, and Yahoo-UK platforms). For each search, the first 25 hyperlinks were screened for eligible programs. Included were programs that were designed for anxiety symptoms, currently publicly accessible on the Web, had an online component, a structured treatment plan, and were available in English. Data were extracted for website characteristics, program characteristics, therapeutic characteristics, as well as empirical evidence. Programs were also evaluated using a 16-point rating tool. The search resulted in 34 programs that were eligible for review. A wide variety of programs for anxiety, including specific anxiety disorders, and anxiety in combination with stress, depression, or anger were identified and based predominantly on cognitive behavioral therapy techniques. The majority of websites were rated as credible, secure, and free of advertisement. The majority required users to register and/or to pay a program access fee …
The Social Web constitutes a shift in information flow from the traditional Web. Previously, content was provided by the owners of a website, for consumption by the end-user. Nowadays, these websites are being replaced by Social Web applications which are frameworks for the publication of user-provided content. Traditionally, Web content could be `trusted' to some extent based on the site it originated from. Algorithms such as Google's PageRank were (and still are) used to compute the importance of a website, based on analysis of underlying link topology. In the Social Web, analysis of link topology merely tells us about the importance of the information framework which hosts the content. Consumers of information still need to know about the importance/reliability of the content they are reading, and therefore about the reliability of the producers of that content. Research into trust and reputation of the producers of information in the Social Web is still very much in its infancy. Every day, people are forced to make trusting decisions about strangers on the Web based on a very limited amount of information. For example, purchasing a product from an eBay seller with a `reputation' of 99%, downloading a file from a peer-to-peer application such as BitTorrent, or allowing Amazon.com to tell you what products you will like. Even something as simple as reading comments on a weblog requires the consumer to make a trusting decision about the quality of that information. In all of these example cases, and indeed throughout the Social Web, there is a pressing demand for increased information upon which we can make trusting decisions. This chapter examines the diversity of sources from which trust information can be harnessed within Social Web applications and discusses a high level classification of those sources. Three different techniques for harnessing and using trust from a range of sources are presented. These techniques are deployed in two sample Social Web applications …
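The link-topology analysis referenced above, PageRank, can be illustrated with a small power iteration. This is a toy sketch under standard textbook assumptions (damping factor 0.85, a three-page graph invented for illustration), not Google's production algorithm.

```python
# Sketch: PageRank by power iteration over a tiny link graph.
def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        # Every page keeps a (1 - damping) teleportation share...
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if outs:
                # ...and passes the damped remainder of its rank to its outlinks.
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
        rank = new
    return rank

graph = {"a": ["c"], "b": ["c"], "c": ["a"]}
r = pagerank(graph)
print(max(r, key=r.get))  # the most linked-to page ranks highest: c
```

As the chapter notes, in the Social Web such a score measures the hosting framework, not the trustworthiness of the individual contributor, which is exactly the gap the surveyed trust techniques try to fill.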
Life without the World Wide Web has become unthinkable, much like life without electricity or water supply. We rely on the web to check public transport schedules, buy a ticket for a concert or exchange photos with friends. However, many everyday tasks cannot be accomplished by the computer itself, since the websites are designed to be read by people, not machines. In addition, the online information is often unstructured and poorly organized, leaving the user with the tedious work of searching and filtering. This book takes us to the frontiers of the emerging Web 3.0 or Semantic Web, a new generation …
Luciano, Joanne S; Cumming, Grant P; Wilkinson, Mark D; Kahana, Eva
The transformative power of the Internet on all aspects of daily life, including health care, has been widely recognized both in the scientific literature and in public discourse. Viewed through the various lenses of diverse academic disciplines, these transformations reveal opportunities realized, the promise of future advances, and even potential problems created by the penetration of the World Wide Web for both individuals and for society at large. Discussions about the clinical and health research implications of the widespread adoption of information technologies, including the Internet, have been subsumed under the disciplinary label of Medicine 2.0. More recently, however, multi-disciplinary research has emerged that is focused on the achievement and promise of the Web itself, as it relates to healthcare issues. In this paper, we explore and interrogate the contributions of the burgeoning field of Web Science in relation to health maintenance, health care, and health policy. From this, we introduce Health Web Science as a subdiscipline of Web Science, distinct from but overlapping with Medicine 2.0. This paper builds on the presentations and subsequent interdisciplinary dialogue that developed among Web-oriented investigators present at the 2012 Medicine 2.0 Conference in Boston, Massachusetts.
… page: //medlineplus.gov/ency/article/002844.htm. Funnel-web spider bite. … the effects of a bite from the funnel-web spider. Male funnel-web spiders are more poisonous …
Xu, Guandong; Li, Lin
This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web mining, and the issue of how to incorporate web mining into web personalization and recommendation systems are also reviewed. Additionally, the volume explores web community mining and analysis to find the structural, organizational and temporal developments of web communities and reveal the societal s
Full Text Available The adoption of Semantic Web technologies constitutes a promising approach to data structuring and integration, both for public and private usage. While these technologies have been around for some time, their adoption is behind overall expectations, particularly in the case of enterprises. With that in mind, we developed a Semantic Web Implementation Model that measures and facilitates the implementation of the technology. The advantages of using the proposed model are two-fold: the model serves as a guide for driving the implementation of the Semantic Web, and it helps to evaluate the impact of introducing the technology. The model was adopted by 19 enterprises in an Action Research intervention of one year with promising results: according to the model's scale, on average, all enterprises evolved from a 6% evaluation to 46% during that period. Furthermore, practical implementation recommendations, a typical consulting tool, were developed and adopted during the project by all enterprises, providing important guidelines for the identification of a development path that may be adopted on a larger scale. Meanwhile, the project also outlined that most enterprises were interested in an even broader scope of the Implementation Model, and the ambition of an "All Web Technologies" approach arose: one model that could embrace the observable overlapping of different Web generations, namely the Web of Documents, the Social Web, the Web of Data and, ultimately, the Web of Context; one model that could combine the evaluation and guidance for all enterprises to follow. That is the goal of the ongoing "Project Web X-ray", which aims to involve 200 enterprises in the adoption of best practices that may lead to their business development based on Web technologies. This paper presents a case of how Action Research promoted the simultaneous advancement of academic research and enterprise development and introduces the framework and opportunities
Full Text Available to set up in time for scenarios which require real time information. This may force communications to utilise public infrastructure. Securing communications for military mobile and Web based systems over public networks poses a greater challenge compared...
Hull, Richard; Thiemann, Peter; Wadler, Philip
Participants in the seminar broke into groups on ``Patterns and Paradigms'' for web programming, ``Web Services,'' ``Data on the Web,'' ``Software Engineering'' and ``Security.'' Here we give the raw notes recorded during these sessions.
This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other. The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:
The rapid advancement of semantic web technologies, along with the fact that they are at various levels of maturity, has left many practitioners confused about the current state of these technologies. Focusing on the most mature technologies, Applied Semantic Web Technologies integrates theory with case studies to illustrate the history, current state, and future direction of the semantic web. It maintains an emphasis on real-world applications and examines the technical and practical issues related to the use of semantic technologies in intelligent information management. The book starts with
Want to know how to make your pages look beautiful, communicate your message effectively, guide visitors through your website with ease, and get everything approved by the accessibility and usability police at the same time? Head First Web Design is your ticket to mastering all of these complex topics, and understanding what's really going on in the world of web design. Whether you're building a personal blog or a corporate website, there's a lot more to web design than div's and CSS selectors, but what do you really need to know? With this book, you'll learn the secrets of designing effecti
Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. The book uses a bottom-up approach to help you build applications, and is full of step-by-step instructions and practical examples to help you improve your knowledge. Instant Flask Web Development is for developers who are new to web programming, or are familiar with web programming but new to Flask. This book gives you a head start if you have some beginner experience with Python and HTML, or are willing to learn.
Radhakrishnan, Sabarinathan; Tafer, Hakim; Seemann, Ernst Stefan
, are derived from extensive pre-computed tables of distributions of substitution effects as a function of gene length and GC content. Here, we present a web service that not only provides an interface for RNAsnp but also features a graphical output representation. In addition, the web server is connected to a local mirror of the UCSC genome browser database that enables the users to select the genomic sequences for analysis and visualize the results directly in the UCSC genome browser. The RNAsnp web server is freely available at: http://rth.dk/resources/rnasnp/.
Web services are poised to become a key technology for a wide range of Internet-enabled applications, spanning everything from straight B2B systems to mobile devices and proprietary in-house software. While there are several tools and platforms that can be used for building web services, developers are finding a powerful tool in Microsoft's .NET Framework and Visual Studio .NET. Designed from scratch to support the development of web services, the .NET Framework simplifies the process--programmers find that tasks that took an hour using the SOAP Toolkit take just minutes. Programming .NET
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.
The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.
What do Amazon's product reviews, eBay's feedback score system, Slashdot's Karma System, and Xbox Live's Achievements have in common? They're all examples of successful reputation systems that enable consumer websites to manage and present user contributions most effectively. This book shows you how to design and develop reputation systems for your own sites or web applications, written by experts who have designed web communities for Yahoo! and other prominent sites. Building Web Reputation Systems helps you ask the hard questions about these underlying mechanisms, and why they're critical
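One pattern common to the rating systems the book covers is damping sparse ratings toward a neutral prior, so a handful of perfect scores cannot outrank a large body of strong ones. A minimal sketch of that idea (the function name, prior value, and weight are illustrative assumptions, not taken from the book):

```python
def reputation(ratings, prior=3.0, prior_weight=5):
    """Weighted average pulling sparse ratings toward a neutral prior.

    A seller with two 5-star ratings should not outrank one with two
    hundred 4.8-star ratings; the prior damps small samples.
    """
    n = len(ratings)
    total = sum(ratings) + prior * prior_weight
    return total / (n + prior_weight)

few = reputation([5, 5])        # pulled strongly toward the 3.0 prior
many = reputation([5] * 200)    # with many ratings the prior barely matters
print(round(few, 2), round(many, 2))  # → 3.57 4.95
```

The same damping idea underlies "Bayesian average" rankings used on many consumer sites.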
SRD 69 NIST Chemistry WebBook (Web, free access) The NIST Chemistry WebBook contains: Thermochemical data for over 7000 organic and small inorganic compounds; thermochemistry data for over 8000 reactions; IR spectra for over 16,000 compounds; mass spectra for over 33,000 compounds; UV/Vis spectra for over 1600 compounds; electronic and vibrational spectra for over 5000 compounds; constants of diatomic molecules (spectroscopic data) for over 600 compounds; ion energetics data for over 16,000 compounds; thermophysical property data for 74 fluids.
Thelwall, Mike; Wilkinson, David
Explains how the Web can be modeled as a mathematical graph and analyzes the graph structures of three national university publicly indexable Web sites from Australia, New Zealand, and the United Kingdom. Topics include commercial search engines and academic Web link research; method-analysis environment and data sets; and power laws. (LRW)
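The graph model described above treats pages as nodes and hyperlinks as directed edges; the in-degree distribution is the quantity typically examined for power-law behaviour. A toy sketch with a hypothetical miniature site (stdlib only; the link data is invented for illustration):

```python
from collections import Counter

# Hypothetical miniature web: page -> pages it links to (directed edges).
links = {
    "home": ["about", "news", "contact"],
    "about": ["home"],
    "news": ["home", "about"],
    "contact": ["home"],
}

# In-degree: how many pages link *to* each page.
in_degree = Counter()
for src, targets in links.items():
    for tgt in targets:
        in_degree[tgt] += 1

# Distribution of in-degrees: the histogram inspected for power laws.
degree_counts = Counter(in_degree.values())

print(in_degree["home"])  # 3 pages link to "home"
print(degree_counts)
```

On real crawls of university sites, `degree_counts` is what gets fitted against a power law.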
"The World Wide Web can now drive. Sixteen years ago yeterday, in a short post to the alt.hypertext newsgroup, tim Berners-Lee revealed the first public web pages summarizing his World Wide Web project." (1/4 page)
Comunicação pública, transparência e políticas públicas: avaliação de informações em portais brasileiros de governo / Public communications, transparency, and public policy: assessing information on Brazilian government web portals
technologies for the dissemination of information on public management and the potential creation of dialogue between government and citizens. Despite this trend, there is still a need for specific knowledge about normative aspects of public communication generated by governments on the internet and its role in the fulfillment of the right to information. This article offers a contribution to fill the gap of guidelines and standards for professional performance. It describes the results of empirical research which identified the potential contribution of government web portals of the main cities of São Paulo, in southeastern Brazil, to the strengthening of citizenship, considered in its dimension of exercising the right to information about public policies, particularly those which have an impact on education. The depth and breadth of information were investigated according to twelve categories of evaluation: history; diagnoses; goals; goals; resources and current actions; planned resources and actions; efficiency; effectiveness; impact; cost-effectiveness; user satisfaction; and equity. The data found on the analyzed portals correspond on average to 11% of what was considered, under the theoretical-methodological framework of the research, as the information necessary to fully characterize a public policy in relation to the categories of assessment. Opportunities to improve government web portals were detected, for which we suggest communication management strategies.
Discussion of public libraries, the Internet, and the World Wide Web focuses on development of a Web site in Washington. Highlights include access to the Internet through online public access catalogs; partnerships between various types of libraries; hardware and software; HTML training; content design; graphics design; marketing; evaluation; and…
Xu, Guandong; Zhang, Yanchun; Li, Lin
This book examines the techniques and applications involved in the Web Mining, Web Personalization and Recommendation and Web Community Analysis domains, including a detailed presentation of the principles, developed algorithms, and systems of the research in these areas. The applications of web mining, and the issue of how to incorporate web mining into web personalization and recommendation systems are also reviewed. Additionally, the volume explores web community mining and analysis to find the structural, organizational and temporal developments of web communities and reveal the societal sense of individuals or communities. The volume will benefit both academic and industry communities interested in the techniques and applications of web search, web data management, web mining and web knowledge discovery, as well as web community and social network analysis.
Khelghati, Mohammadreza; Hiemstra, Djoerd; van Keulen, Maurice
Web content changes rapidly. In Focused Web Harvesting, which aims to achieve a complete harvest for a given topic, this dynamic nature of the web creates problems for users who need access to the complete set of web data relevant to their topics of interest. Whether you are a fan
Khelghati, Mohammadreza; Hiemstra, Djoerd; van Keulen, Maurice
Web content changes rapidly. In Focused Web Harvesting [?], which aims at achieving a complete harvest for a given topic, this dynamic nature of the web creates problems for users who need access to a complete set of web data related to their topics of interest. Whether you are a fan
This tutorial provides an overview of several document engineering techniques that are applicable to the authoring of World Wide Web documents. It illustrates how pre-WWW hypertext research applies to the development of WWW information resources.
Smith, Karl D.
Explains an upper elementary game of tag that illustrates energy flow in food webs using candy bars as food sources. A follow-up field trip to a river and five language arts projects are also suggested. (CS)
Department of Transportation — A web service that allows end-users the ability to query the current known delays in the National Airspace System as well as the current weather from NOAA by airport...
Berners-Lee, Tim; Swick, Ralph
...) project between 2002 and 2005 provided key steps in the research in the Semantic Web technology, and also played an essential role in delivering the technology to industry and government in the form...
Full Text Available Stress placed on network infrastructure by the popularity of the World Wide Web may be partially relieved by keeping multiple copies of Web documents at geographically dispersed locations. In particular, the use of proxy caches and replication provides a means of storing information 'nearer to end users'. This paper concentrates on the locational aspects of Web caching, giving both an overview, from an operational research point of view, of existing research and putting forward avenues for possible further research. This area of research is in its infancy and the emphasis will be on themes and trends rather than on algorithm construction. Finally, Web caching problems are briefly related to referral systems more generally.
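The proxy-caching idea analysed above can be modelled as a fixed-capacity store placed "near" end users that evicts its least recently used document when full. The sketch below is a toy model under that assumption, not the paper's operational-research formulation; class and variable names are illustrative:

```python
from collections import OrderedDict

class ProxyCache:
    """Toy LRU cache standing in for a geographically placed Web proxy."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # url -> document, oldest first
        self.hits = self.misses = 0

    def fetch(self, url, origin):
        """Return the document, serving from the cache when possible."""
        if url in self.store:
            self.store.move_to_end(url)      # mark as recently used
            self.hits += 1
        else:
            self.misses += 1
            self.store[url] = origin(url)    # fall back to the origin server
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)  # evict least recently used
        return self.store[url]

cache = ProxyCache(capacity=2)
origin = lambda url: f"<html>{url}</html>"   # stand-in origin server
for url in ["/a", "/b", "/a", "/c", "/b"]:
    cache.fetch(url, origin)
print(cache.hits, cache.misses)  # → 1 4  (/a hits once; /b is evicted before reuse)
```

Where such caches are placed relative to user populations is exactly the locational question the paper studies.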
Abbas, Ali; Rutty, Guy N
When one thinks of print identification techniques, one automatically considers fingerprints. Although fingerprints have been in use for over 100 years, there is in fact an older type of identification technique related to prints left at scenes of crime and the anatomy of human body parts. This is the world of ear prints. This short web review considers web sites related to ear print identification, particularly the continuing controversy as to whether or not an ear print is unique.
Performance is critical to the success of any web site, and yet today's web applications push browsers to their limits with increasing amounts of rich content and heavy use of Ajax. In this book, Steve Souders, web performance evangelist at Google and former Chief Performance Yahoo!, provides valuable techniques to help you optimize your site's performance. Souders' previous book, the bestselling High Performance Web Sites, shocked the web development world by revealing that 80% of the time it takes for a web page to load is on the client side. In Even Faster Web Sites, Souders and eight exp
Learn to build dynamic web sites with Microsoft WebMatrix. Microsoft WebMatrix is designed to make developing dynamic ASP.NET web sites much easier. This complete Wrox guide shows you what it is, how it works, and how to get the best from it right away. It covers all the basic foundations and also introduces HTML, CSS, and Ajax using jQuery, giving beginning programmers a firm foundation for building dynamic web sites. Examines how WebMatrix is expected to become the new recommended entry-level tool for developing web sites using ASP.NET. Arms beginning programmers, students, and educators with al
Kelly, Brian; Nevile, Liddy; Sloan, David; Fanou, Sotiris; Ellison, Ruth; Herrod, Lisa
This article asserts that current approaches to enhance the accessibility of Web resources fail to provide a solid foundation for the development of a robust and future-proofed framework. In particular, they fail to take advantage of new technologies and technological practices. The article introduces a framework for Web adaptability, which encourages the development of Web-based services that can be resilient to the diversity of uses of such services, the target audience, available resources, technical innovations, organisational policies and relevant definitions of 'accessibility'. The article refers to a series of author-focussed approaches to accessibility through which the authors and others have struggled to find ways to promote accessibility for people with disabilities. These approaches depend upon the resource author's determination of the anticipated users' needs and their provision. Through approaches labelled as 1.0, 2.0 and 3.0, the authors have widened their focus to account for contexts and individual differences in target audiences. Now, the authors want to recognise the role of users in determining their engagement with resources (including services). To distinguish this new approach, the term 'adaptability' has been used to replace 'accessibility'; new definitions of accessibility have been adopted, and the authors have reviewed their previous work to clarify how it is relevant to the new approach. Accessibility 1.0 is here characterised as a technical approach in which authors are told how to construct resources for a broadly defined audience. This is known as universal design. Accessibility 2.0 was introduced to point to the need to account for the context in which resources would be used, to help overcome inadequacies identified in the purely technical approach. Accessibility 3.0 moved the focus on users from a homogenised universal definition to recognition of the idiosyncratic needs and preferences of individuals and to cater for them. All of
Dukes, M.; Gardi, E.; McAslan, H.; Scott, D.J.; White, C.D.
The non-Abelian exponentiation theorem has recently been generalised to correlators of multiple Wilson line operators. The perturbative expansions of these correlators exponentiate in terms of sets of diagrams called webs, which together give rise to colour factors corresponding to connected graphs. The colour and kinematic degrees of freedom of individual diagrams in a web are entangled by mixing matrices of purely combinatorial origin. In this paper we relate the combinatorial study of these matrices to properties of partially ordered sets (posets), and hence obtain explicit solutions for certain families of web-mixing matrices, at arbitrary order in perturbation theory. We also provide a general expression for the rank of a general class of mixing matrices, which governs the number of independent colour factors arising from such webs. Finally, we use the poset language to examine a previously conjectured sum rule for the columns of web-mixing matrices which governs the cancellation of the leading subdivergences between diagrams in the web. Our results, when combined with parallel developments in the evaluation of kinematic integrals, offer new insights into the all-order structure of infrared singularities in non-Abelian gauge theories.
Sohn, Jae Min; Kang, Young Hwan
Following the footsteps for internationalization and an information-oriented society, we need to open the HANARO to the public, and to serve more detailed, accurate, and varied information rapidly through the internet to enhance the HANARO utilization efficiency. The following items are described to develop the HANARO Web, which functions as an information platform for research reactors: user requirements, conceptual design, development plan (method and schedule), maintenance and management. The conceptual design, development method, schedule and functions are proposed in developing the HANARO Web. The data of the HANARO should be processed and organized systematically for better utilization of HANARO. A supplementation of the functions is needed, and the HANARO Web should be operated practically with maximum efficiency and used to advertise its activities locally and internationally.
Lizio, Marina; Harshbarger, Jayson; Abugessaisa, Imad
Upon the first publication of the fifth iteration of the Functional Annotation of Mammalian Genomes collaborative project, FANTOM5, we gathered a series of primary data and database systems into the FANTOM web resource (http://fantom.gsc.riken.jp) to facilitate researchers in exploring transcriptional regulation and cellular states. In the course of the collaboration, primary data and analysis results have been expanded, and functionalities of the database systems enhanced. We believe that our data and web systems are invaluable resources, and we think the scientific community will benefit from this recent update to deepen their understanding of mammalian cellular organization. We introduce the contents of FANTOM5 here, report recent updates in the web resource and provide future perspectives.
The multi-tier implementation of DCMP Web site is discussed. It is based upon newly developed PHP technology. The technology allows for creating dynamic content and scalable solutions for Web site capabilities. There are several aspects as to what type of information is to be on the site. First, it should serve the immediate needs of the researchers in the field, namely, conferences, journals, news, funds, etc. This is currently available on the site, but can be extended and improved if needed. Second, the site will reflect the connection between Condensed matter physics and the technological breakthroughs that drive the economy. Third, the site will carry an educational mission helping educate the general public, and on the other hand, help young people to start their careers in the field. The content of the DCMP Web site is under active development. It depends upon wide involvement of DCMP members.
Gaskill, Martonia; McNulty, Anastasia; Brooks, David W.
WebQuests are activities in which students use Web resources to learn about school topics. WebQuests are advocated as constructivist activities and ones generally well regarded by students. Two experiments were conducted in school settings to compare learning using WebQuests versus conventional instruction. Students and teachers both enjoyed…
Full Text Available Is there work on the web? Can we distinguish "workers" among web users? What innovative forms does digital activity take? Is it possible to identify typical professional profiles and trades on the web? Can the extreme accessibility and impersonality of the web create genuine employment relationships and support production processes? Is there a way to apply labour rules and protections to experiences that intentionally take advantage of the extraterritoriality, autarky and polycentric dimension of the Internet? The questions raised by the Author underline that the web, rather than simply providing means and support to the movement of information in favour of the labour market's traditional players (companies, workers, private agencies, public employment services), has instead developed its own potential autonomously. On one hand, the web has become a professional intermediary, using the capabilities of computing devices and search engines to offer global employment services 2.0; on the other hand, it has caused "dis-intermediation" towards institutional operators (public and private) through the dissemination of informal circuits, social recruiting sites, and civic networks aimed at intercepting interstitial job opportunities. Moreover, the web tends increasingly to exchange or combine the role of the intermediary and that of the employer in the labour market, making available - at least potentially - a virtual, but global, space for outsourcing. The Author reflects, in particular, on the legal characterization of digital work through crowdsourcing and on the application of the rules of contracts and agency work. Although work on the web is difficult to recognize, measure and regulate in legal terms, it has been observed that the use of technological devices and digital services marks the rebirth of the "trade" - in its original, ancient meaning of the practice of an art or the expression of a talent - in contrast to the "profession" established in the high twentieth
Ray, Randy J
Given Perl's natural fit for web applications development, it's no surprise that Perl is also a natural choice for web services development. It's the most popular web programming language, with strong implementations of both SOAP and XML-RPC, the leading ways to distribute applications using web services. But books on web services focus on writing these applications in Java or Visual Basic, leaving Perl programmers with few resources to get them started. Programming Web Services with Perl changes that, bringing Perl users all the information they need to create web services using their favori
Metson, S; Newbold, D; Belforte, S; Kavka, C; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Tuura, L; Evans, D; Fanfani, A; Feichtinger, D; Kuznetsov, V; Lingen, F van; Wakefield, S
We describe a relatively new effort within CMS to converge on a set of web based tools, using state of the art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described along side current planned developments. The CMS collaboration comprises of nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over globe and are accessed via the LHC grid to run analysis, large scale production and data transfer tasks. Due to the distributed nature of collaborators effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally web interfaces have been added in HEP experiments as an afterthought. In the CMS offline we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools
This Publications Catalogue lists all sales publications of the IAEA issued and forthcoming for the period Autumn 2003 - early 2004. Most Agency publications are issued in English, though some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books
This Publications Catalogue lists all sales publications of the IAEA issued and forthcoming during the period Spring 2003. Most Agency publications are issued in English, though some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books
This Publications Catalogue lists all sales publications of the IAEA published in 2002, 2003 and forthcoming in early 2004. Most IAEA publications are issued in English, though some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. A complete listing of all IAEA priced publications is available on the IAEA's web site: http://www.iaea.org/books
Dissemination of scientific knowledge is one of the most important functions of a scientific publication, and the importance of computer networks in this respect is well known. Towards this end, the journal Investigación Clínica established its own website more than a year ago, with free access to all the issues published by the journal over a little more than 50 years. Our web site has been visited a total of 2,759 times, with an average of 3.04 published articles reviewed per visit and 61.92% new visits. As expected, Venezuela was the most frequent country of origin of the visits, mainly from the cities of Maracaibo (700), Caracas (352), Mérida (150), Maracay (47) and Valencia (32). Visits from other countries included Mexico (462), Spain (141), Argentina (122), United States (105), Brazil (81), Colombia (74), Peru (32), Chile (25) and Italy (13). The remaining countries each had a frequency close to ten visits. The aim of the editors of "Investigación Clínica" is to achieve an even better scientific quality and scope for the journal, and we consider that its presence on the web will successfully assist in this purpose.
Kent, Michael L.
Discusses approaches to teaching a mediated public relations course, emphasizing the World Wide Web. Outlines five course objectives, assignments and activities, evaluation, texts, and lecture topics. Argues that students mastering these course objectives will understand ethical issues relating to media use, using mediated technology in public…
Ramakrishnan, I V; Ashok, Vikas; Billah, Syed Masum
People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant content. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability.
Renteria, Jose C.; Lodha, Suresh K.
WebVis, the Hierarchical Web Home Page Visualizer, is a tool for managing home web pages. The user can access this tool via the WWW and obtain a hierarchical visualization of one's home web pages. WebVis is a real-time interactive tool that supports many different queries on the statistics of internal files, such as size, age, and type. In addition, statistics on embedded information such as VRML files, Java applets, images and sound files can be extracted and queried. Results of these queries are visualized using the color, shape and size of different nodes of the hierarchy. The visualization assists the user in a variety of tasks, such as quickly finding outdated information or locating large files. WebVis is one solution to the growing web space maintenance problem. Implementation of WebVis is realized with Perl and Java. Perl pattern matching and file handling routines are used to collect and process web space linkage information and web document information. Java utilizes the collected information to produce visualizations of the web space. Java also provides WebVis with real-time interactivity while running off the WWW. Some WebVis examples of home web page visualization are presented.
Francisco Javier García Gómez
Some Spanish public libraries maintain web sites in the new digital work environment, and these libraries already deliver some services through their virtual branches. We are interested in analyzing user education on their web sites. We reviewed and tested digital resources and services for user education offered by public libraries on the World Wide Web. The level of development achieved in this area of library work is presented in the conclusions. Likewise, we contribute some references on the design of public library web sites focused on user education and library instruction.
Slominski, Ryan [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Larrieu, Theodore L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)
Jefferson Lab's Web Extensible Display Manager (WEDM) allows staff to access EDM control system screens from a web browser in remote offices and from mobile devices. Native browser technologies are leveraged to avoid installing and managing software on remote clients such as browser plugins, tunnel applications, or an EDM environment. Since standard network ports are used firewall exceptions are minimized. To avoid security concerns from remote users modifying a control system, WEDM exposes read-only access and basic web authentication can be used to further restrict access. Updates of monitored EPICS channels are delivered via a Web Socket using a web gateway. The software translates EDM description files (denoted with the edl suffix) to HTML with Scalable Vector Graphics (SVG) following the EDM's edl file vector drawing rules to create faithful screen renderings. The WEDM server parses edl files and creates the HTML equivalent in real-time allowing existing screens to work without modification. Alternatively, the familiar drag and drop EDM screen creation tool can be used to create optimized screens sized specifically for smart phones and then rendered by WEDM.
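The edl-to-SVG translation described above can be illustrated with a minimal sketch; the `rect_to_svg` helper and its dict-based widget layout are hypothetical simplifications for illustration, not the actual edl syntax or the WEDM implementation:

```python
def rect_to_svg(widget: dict) -> str:
    """Translate one widget description into an SVG element.

    The dict-based rectangle layout used here is a hypothetical
    stand-in for an edl widget description; the real edl file
    syntax is not reproduced.
    """
    return (
        f'<rect x="{widget["x"]}" y="{widget["y"]}" '
        f'width="{widget["w"]}" height="{widget["h"]}" '
        f'fill="{widget.get("fill", "none")}" stroke="black"/>'
    )
```

A server following this pattern would emit one SVG element per parsed widget, preserving the vector drawing semantics of the original screen.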
The paper defines linked data as a set of best practices used to publish machine-readable data on the web; the technology (or mode of realization) of linked data is associated with the concept of the semantic web. It is the area of the semantic web, or web of data, as defined by Tim Berners-Lee: "A web of things in the world, described by data on the web". The paper highlights the continuities and differences between the semantic web and the traditional web, or web of documents. The analysis of linked data takes place within the world of libraries, archives and museums, traditionally committed to high standards for structuring and sharing data. The data, in fact, assume the role of generating quality information for the network. The production of linked data requires compliance with rules and the use of specific technologies and languages, especially in the case of publication of linked data in open mode. The production cycle of linked data may serve as a track, or a guideline, for institutions that wish to join projects to publish their data. Data quality is assessed through a rating system designed by Tim Berners-Lee.
An applied study that analyzes the metadata of Arabic libraries' web sites in Egypt and Saudi Arabia. It begins with a methodological introduction, then analyzes the web sites using Meta Tag Analyzer software. The study covers the following web sites: the Library of Alexandria, Egyptian Libraries, the Egyptian National Library, King Fahd National Library, King Abdel Aziz Public Library, and Mubarak Public Library.
Sivarajah, U; Weerakkody, V; Irani, Z
Public administration has endured significant transformation over the last decade, enabled largely through Information and Communication Technology. In recent times, second generation web technologies (Web 2.0) such as social media and networking sites are increasingly being used by governments for their digital activities, ranging from public relations to knowledge management. However, as Web 2.0 technologies are more interactive than the traditional models of information provision or crea-...
Hull, Richard; Thiemann, Peter; Wadler, Philip
The world-wide web raises a variety of new programming challenges. To name a few: programming at the level of the web browser, data-centric approaches, and attempts to automatically discover and compose web services. This seminar brought together researchers from the web programming and web services communities and strove to engage them in communication with each other. The seminar was held in an unusual style, in a mixture of short presentations and in-depth discussio...
Fosha, Charles E.
This paper addresses approaches to using publicity and public relations to meet the goals of the NASA Space Grant College. Methods universities and colleges can use to publicize space activities are presented.
Thalheim, Bernhard; Prinz, Andreas; Buchberger, Bruno
The papers in this volume aim at obtaining a common understanding of the challenging research questions in web applications comprising web information systems, web services, and web interoperability; obtaining a common understanding of verification needs in web applications; achieving a common understanding of the available rigorous approaches to system development, and the cases in which they have succeeded; identifying how rigorous software engineering methods can be exploited to develop suitable web applications; and at developing a European-scale research agenda combining theory, methods a
Shabbeer, Amina; Ozcaglar, Cagri; Yener, Bülent; Bennett, Kristin P
In this study we explore publicly available web tools designed to use molecular epidemiological data to extract information that can be employed for the effective tracking and control of tuberculosis (TB). The application of molecular methods for the epidemiology of TB complement traditional approaches used in public health. DNA fingerprinting methods are now routinely employed in TB surveillance programs and are primarily used to detect recent transmissions and in outbreak investigations. Here we present web tools that facilitate systematic analysis of Mycobacterium tuberculosis complex (MTBC) genotype information and provide a view of the genetic diversity in the MTBC population. These tools help answer questions about the characteristics of MTBC strains, such as their pathogenicity, virulence, immunogenicity, transmissibility, drug-resistance profiles and host-pathogen associativity. They provide an integrated platform for researchers to use molecular epidemiological data to address current challenges in the understanding of TB dynamics and the characteristics of MTBC. Copyright © 2011. Published by Elsevier B.V.
The increasing availability and popularity of computer systems has resulted in a demand for new, language- and platform-independent ways of data exchange. That demand has in turn led to a significant growth in the importance of systems based on Web services. Alongside the growing number of systems accessible via Web services came the need for specialized data repositories that could offer effective means of searching of available services. The development of mobile systems and wireless data transmission technologies has allowed the use of distributed devices and computer systems on a greater scale. The accelerating growth of distributed systems might be a good reason to consider the development of distributed Web service repositories with built-in mechanisms for data migration and synchronization.
Deals with Lithuanian full-text electronic periodicals distributed through the World Wide Web. An electronic periodical is usually defined as a regular publication on some particular topic distributed in digital form, chiefly through the Web, but also by electronic mail or digital disk. The author has surveyed 106 publications. Thirty-four are distributed only on the Web, and 72 have printed versions. The number of analysed publications is not very big, but four years of electronic publishing and the variety of periodicals enables us to establish the causes of this phenomenon, the main features of development, and some perspectives. Electronic periodicals were analysed according to their type, purpose, contents, publisher, regularity, language, starting date and place of publication, and other features.
Le Thanh, Nghi
The World-Wide Web may have taken the Internet by storm, but many people would be surprised to learn that it owes its existence to CERN. Around half the world's particle physicists come to CERN for their experiments, and the Web is the result of their need to share information quickly and easily on a global scale. Six years after Tim Berners-Lee's inspired idea to marry hypertext to the Internet in 1989, CERN is handing over future Web development to the World-Wide Web Consortium, run by the French National Institute for Research in Computer Science and Control, INRIA, and the Laboratory for Computer Science of the Massachusetts Institute of Technology, MIT, leaving itself free to concentrate on physics. The Laboratory marked this transition with a conference designed to give a taste of what the Web can do, whilst firmly stamping it with the label ''Made in CERN''. Over 200 European journalists and educationalists came to CERN on 8 - 9 March for the World-Wide Web Days, resulting in wide media coverage. The conference was opened by UK Science Minister David Hunt who stressed the importance of fundamental research in generating new ideas. ''Who could have guessed 10 years ago'', he said, ''that particle physics research would lead to a communication system which would allow every school to have the biggest library in the world in a single computer?''. In his introduction, the Minister also pointed out that ''CERN and other basic research laboratories help to break new technological ground and sow the seeds of what will become mainstream manufacturing in the future.'' Learning the jargon is often the hardest part of coming to grips with any new invention, so CERN put it at the top of the agenda. Jacques Altaber, who helped introduce the Internet to CERN in the early 1980s, explained that without the Internet, the Web couldn't exist. The Internet began as a US Defense Department research project in the 1970s and has grown into a global network-of-networks linking some
This book is about the process of creating web-based systems (i.e., websites, content, etc.) that consider each of the parts, the modules, the organisms - binary or otherwise - that make up a balanced, sustainable web ecosystem. In the current media-rich environment, a website is more than a collection of relative html documents of text and images on a static desktop computer monitor. There is now an unlimited combination of screens, devices, platforms, browsers, locations, versions, users, and exabytes of data with which to interact. Written in a highly approachable, practical style, this boo
How do you create a mission-critical site that provides exceptional performance while remaining flexible, adaptable, and reliable 24/7? Written by the manager of a UI group at Yahoo!, Developing Large Web Applications offers practical steps for building rock-solid applications that remain effective even as you add features, functions, and users. You'll learn how to develop large web applications with the extreme precision required for other types of software. Avoid common coding and maintenance headaches as small websites add more pages, more code, and more programmers. Get comprehensive soluti
Nash, David Richard
The world wide web provides many resources that are useful to the myrmecologist. Here I provide a brief introduction to the types of information currently available, and to recent developments in data provision over the internet which are likely to become important resources for myrmecologists in the near future. I discuss the following types of web site, and give some of the most useful examples of each: taxonomy, identification and distribution; conservation; myrmecological literature; individual species sites; news and discussion; picture galleries; personal pages; portals.
Advanced technology plays a key role in enabling future Earth-observing missions needed for global monitoring and climate research. Rapid progress over the past decade, with more anticipated in the coming decades, has diminished the size of some satellites while increasing the amount of data and the required pace of integration and analysis. Sensor web developments provide correlations to constellations of smallsats. Reviewing current advances in sensor webs and requirements for constellations will improve planning, operations, and data management for future architectures of multiple satellites with a common mission goal.
Blåbjerg, Niels Jørgen
Learning Objects Web is a DEFF project initiated by Aalborg University Library. The project builds on the results and experience gained from our earlier project, Streaming Web-based Information Modules (SWIM). We have an international network of stakeholders who give us sparring and feedback on the development concept, both regarding the theoretical framework and the practical application of our teaching concept. With this backing and input, we have pursued the goal of further developing SWIM in the new project Learning Objects Web. Publication date: June...
The WebSocket allows asynchronous full-duplex communication between a Web-based (i.e. JavaScript-based) application and a Web server. WebSocket started as a part of HTML5 standardization but has since been separated from HTML5 and developed independently. Using WebSocket, it becomes easy to develop platform-independent presentation layer applications for accelerator and beamline control software. In addition, a Web browser is the only application program that needs to be installed on the client computer. WebSocket-based applications communicate with the WebSocket server using simple text-based messages, so WebSocket is applicable to message-based control systems like MADOCA, which was developed for the SPring-8 control system. A simple WebSocket server for the MADOCA control system and a simple motor control application were successfully made as a first trial of WebSocket control applications. Using Google Chrome (version 13.0) on Debian/Linux and Windows 7, Opera (version 11.0) on Debian/Linux, and Safari (version 5.0.3) on Mac OS X as clients, the motors can be controlled using a WebSocket-based Web application. A diffractometer control application for use in synchrotron radiation diffraction experiments was also developed. (author)
Lin, Carolyn A; Hullman, Gwen A
Antitobacco groups have joined millions of other commercial or noncommercial entities in developing a presence on the Web. These groups primarily represent the following different sponsorship categories: grassroots, medical, government, and corporate. To obtain a better understanding of the strengths and weaknesses in the message design of antitobacco Web sites, this project analyzed 100 antitobacco Web sites ranging across these four sponsorship categories. The results show that the tobacco industry sites posted just enough antismoking information to appease the antismoking publics. Medical organizations designed their Web sites as specialty sites and offered mostly scientific information. While the government sites resembled a clearinghouse for antitobacco related information, the grassroots sites represented the true advocacy outlets. In general, the industry sites provided the weakest persuasive messages and medical sites fared only slightly better. Government and grassroots sites rated most highly in presenting their antitobacco campaign messages on the Web.
Expression Web is Microsoft's newest tool for creating and maintaining dynamic Web sites. This FrontPage replacement offers all the simple "what-you-see-is-what-you-get" tools for creating a Web site along with some pumped up new features for working with Cascading Style Sheets and other design options. Microsoft Expression Web For Dummies arrives in time for early adopters to get a feel for how to build an attractive Web site. Author Linda Hefferman teams up with longtime FrontPage For Dummies author Asha Dornfest to show the easy way for first-time Web designers, FrontPage ve
Gaskill, Martonia; McNulty, Anastasia; Brooks, David W.
WebQuests are activities in which students use Web resources to learn about school topics. WebQuests are advocated as constructivist activities and ones generally well regarded by students. Two experiments were conducted in school settings to compare learning using WebQuests versus conventional instruction. Students and teachers both enjoyed WebQuest instruction and spoke highly of it. In one experiment, however, conventional instruction led to significantly greater student learning. In the other, there were no significant differences in the learning outcomes between conventional versus WebQuest-based instruction.
Cusack, Caitlin M; Shah, Sapna
The Agency for Healthcare Research and Quality (AHRQ) has made an investment of over $216 million in research around health information technology (health IT). As part of their investment, AHRQ has developed the National Resource Center for Health IT (NRC) which includes a public domain Web site. New content for the web site, such as white papers, toolkits, lessons from the health IT portfolio and web-based tools, is developed as needs are identified. Among the tools developed by the NRC are the Compendium of Surveys and the Clinical Decision Support (CDS) Resources. The Compendium of Surveys is a searchable repository of health IT evaluation surveys made available for public use. The CDS Resources contains content which may be used to develop clinical decision support tools, such as rules, reminders and templates. This live demonstration will show the access, use, and content of both these freely available web-based tools.
Cambronero, M.-Emilia; Okika, Joseph C.; Ravn, Anders Peter
Web services should be dependable, because businesses rely on them. For that purpose the Service Oriented Architecture has standardized specifications at a syntactical level. In this paper, we demonstrate how such specifications are used to derive semantic models in the form of (timed) automata...
Also on the Web! 'Resonance' is a journal of science education, published monthly since January 1996 by the Indian Academy of Sciences, Bangalore, India. It is primarily directed to students and teachers at the undergraduate level, though some material beyond this range is also included.
Kiss, S.; Sarfraz, M.
Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling
This book is for anyone who's worked with Clojure and wants to use it to start developing applications for the Web. Experience or familiarity with basic Clojure syntax is a must, and exposure to Leiningen (or other similar build tools such as Maven) would be helpful.
Baken, N.H.G.; Wiegel, V.; Van Oortmerssen, G.
This paper presents a vision on the importance of values and ethical aspects in web science. We create(d) the Internet, but now the Internet (technology) is shaping our world increasingly: the way we experience, interact, transact, conduct business et cetera. The Internet is ubiquitous and vital to
Discusses the increase in online plagiarism and what school librarians can do to help. Topics include the need for school district policies on plagiarism; teaching students what plagiarism is; pertinent Web sites; teaching students proper research skills; motivation for cheating; and requiring traditional sources of information for student…
The World Wide Web contains billions of documents (and counting); hence, it is likely that some document will contain the answer or content you are searching for. While major search engines like Bing and Google often manage to return relevant results to your query, there are plenty of situations in
The World Wide Web constitutes the largest existing source of texts written in a great variety of languages. A feasible and sound way of exploiting this data for linguistic research is to compile a static corpus for a given language. There are several advantages to this approach: (i) Working with such corpora obviates the problems encountered when using Internet search engines in quantitative linguistic research (such as non-transparent ranking algorithms). (ii) Creating a corpus from web data is virtually free. (iii) The size of corpora compiled from the WWW may exceed by several orders of magnitude the size of language resources offered elsewhere. (iv) The data is locally available to the user, and it can be linguistically post-processed and queried with the user's preferred tools. This book addresses the main practical tasks in the creation of web corpora up to giga-token size. Among these tasks are the sampling process (i.e., web crawling) and the usual cleanups, including boilerplate removal and rem...
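The boilerplate-removal cleanup mentioned above can be illustrated with a toy heuristic; `is_boilerplate` and its link-density threshold are assumptions for illustration, not the method described in the book:

```python
import re

def strip_tags(html: str) -> str:
    """Remove markup, leaving visible text (a crude approximation)."""
    return re.sub(r"<[^>]+>", " ", html)

def is_boilerplate(block: str, min_words: int = 10) -> bool:
    """Toy heuristic: short, link-dense blocks are likely navigation
    or footer boilerplate rather than running text.

    The word-count and link-density thresholds are arbitrary
    illustrative choices, not values from the book.
    """
    links = len(re.findall(r"<a\s", block, flags=re.I))
    words = len(strip_tags(block).split())
    return words < min_words or (words > 0 and links / words > 0.5)
```

Production systems use far more robust signals (DOM structure, text density across the whole page, trained classifiers), but the link-to-text ratio captures the basic intuition behind many of them.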
Evaluates 15 criminal justice Web sites that have been selected according to the following criteria: authority, currency, purpose, objectivity, and potential usefulness to researchers. The sites provide narrative and statistical information concerning crime, law enforcement, the judicial system, and corrections. Searching techniques are also…
Progressive Web Applications are native-like applications running inside a browser context. In my presentation I would like to describe their characteristics, benchmarks and building process, using a quick and simple case study example with a focus on the Service Workers API.
Davies, John; Fensel, Dieter; Harmelen, Frank Van
With the current changes driven by the expansion of the World Wide Web, this book uses a different approach from other books on the market: it applies ontologies to electronically available information to improve the quality of knowledge management in large and distributed organizations. Ontologies
Becker, Bernd W.
The author has discussed the Multimedia Educational Resource for Teaching and Online Learning site, MERLOT, in a recent Electronic Roundup column. In this article, he discusses an entirely new Web page development tool that MERLOT has added for its members. The new tool is called the MERLOT Content Builder and is directly integrated into the…
consequences for the treatment of apparently independent epistemic problems that are subject of investigation in other thought collectives. For the practicing scientist it is necessary to take this complex web of interactions into account in order to be able to navigate in such a situation. So far most studies...
Sound technology policies can spell the difference between an effective website and an online nightmare. An effective web development policy addresses six key areas: roles and responsibilities, content/educational value, privacy and safety, adherence to copyright laws, technical standards, and use of commercial sites and services. (MLH)
A delicate pattern, like that of a spider web, appears on top of the Mars residual polar cap, after the seasonal carbon-dioxide ice slab has disappeared. Next spring, these will likely mark the sites of vents when the carbon-dioxide ice cap returns. This Mars Global Surveyor, Mars Orbiter Camera image is about 3-kilometers wide (2-miles).
Wighting, Mervyn J.; Lucking, Robert A.; Christmann, Edwin P.
Teachers search for ways to enhance oceanography units in the classroom. There are many online resources available to help one explore the mysteries of the deep. This article describes a collection of Web sites on this topic appropriate for middle level classrooms.
Hansen, Henning Sten
The protection and enhancement of the environment is the main aim of most environmental planning, and the use of geographic information as well as public participation can improve the quality of both the processes and the decisions. The current paper describes the role of web-based geographic information in environmental planning and gives an overview of the various approaches to public participation. The current advances in Web-based GIS in many countries contain great possibilities for supporting good governance based on information and knowledge on the one hand and active involvement of the citizens on the other hand. One important precondition for success in this field is a well-informed population with access to the Internet. The overall purpose of this paper is to give an overview of how to utilise geographic information and public participation as natural components in environmental...
Customers want high-quality products at low prices, and they want them now. The message is clear: the public compares perceived alternatives. The communication problem of the nuclear industry is the same as that of any other nonmonopoly provider of products or services, i.e., to show the public that nuclear electricity is superior even though nuclear electricity itself is indistinguishable from any other electricity. The following topics are discussed in this paper: (1) what the public needs in general; (2) what the public wants of information delivered; (3) the nuclear information that the public wants; (4) the ANS public information web page; and (5) wider use of issues information.
Kothari, Bhupesh; Claypool, Mark
Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)
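A multivariate linear regression of the kind mentioned above can be sketched as follows; the load measurements and the choice of predictors (request rate and dynamic-page size) are invented for illustration and are not the paper's data or model:

```python
import numpy as np

# Hypothetical measurements: columns are request rate (req/s) and
# dynamic-page size (KB); y is observed response time (ms).
X = np.array([[10, 5], [20, 5], [10, 10], [30, 15], [40, 10]], dtype=float)
y = np.array([12.0, 18.0, 15.0, 30.0, 33.0])

# Add an intercept column and fit by ordinary least squares,
# the standard way to fit a multivariate linear model.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(rate: float, size_kb: float) -> float:
    """Predict response time (ms) for a given load point."""
    return float(coef[0] + coef[1] * rate + coef[2] * size_kb)
```

With the model fitted to measured CGI, FastCGI, or Servlet workloads, `predict` interpolates expected server performance at load points that were not directly benchmarked.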
Li, Hui; Rau, Pei-Luen Patrick; Fujimura, Kaori; Gao, Qin; Wang, Lin
This research aims to provide insight for web form design for older users. The effects of task complexity and information structure of web forms on older users' performance were examined. Forty-eight older participants with abundant computer and web experience were recruited. The results showed significant differences in task time and error rate…
Godsk, Mikkel; Petersen, Anja Bechmann
of the approaches' strengths. Furthermore, it is discussed and briefly demonstrated how WebCom can be used for analytical and design purposes, with YouTube as an example. The chapter concludes that WebCom is able to serve as a theoretically based model for understanding complex web site communication situations...
Espinha, T.; Zaidman, A.; Gross, H.G.
Web APIs provide a systematic and extensible approach for application-to-application interaction. A large number of mobile applications make use of web APIs to integrate services into apps. Each web API's evolution pace is determined by its respective developer, and mobile application developers
Document coherency is a challenging problem for web caching. Once documents are cached throughout the Internet, it is often difficult to keep them coherent with the origin document without generating new traffic that could increase the load on the international backbone and overload popular servers. Several solutions have been proposed to solve this problem; two categories have been widely discussed: strong document coherency and weak document coherency. The cost and efficiency of the two categories are still a controversial issue: while in some studies strong coherency is far too expensive to be used in the web context, in other studies it could be maintained at low cost. The accuracy of these analyses depends very much on how the document updating process is approximated. In this study, we compare some of the coherence methods proposed for web caching. Among other points, we study the side effects of these methods on Internet traffic. The ultimate goal is to study cache behavior under several conditions, covering some of the factors that play an important role in web cache performance evaluation, and to quantify their impact on simulation accuracy. The results presented in this study indeed show differences in the outcome of simulating a web cache depending on the workload being used and on the probability distribution used to approximate updates to the cached documents. Each experiment shows two case studies that outline the impact of the considered parameter on the performance of the cache.
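As a concrete instance of weak coherency, many caches use an adaptive TTL heuristic: a document that has gone unchanged for a long time is assumed to stay fresh for a fraction of its age. A minimal sketch, with the factor and cap chosen as illustrative assumptions rather than taken from the study:

```python
class CacheEntry:
    """Weak-coherency cache entry with an adaptive TTL: the freshness
    lifetime is lm_factor times the document's age at fetch time,
    capped at max_ttl seconds. Timestamps are plain epoch seconds."""

    def __init__(self, body, last_modified, fetched_at,
                 lm_factor=0.1, max_ttl=86400):
        self.body = body
        self.last_modified = last_modified
        self.fetched_at = fetched_at
        self.lm_factor = lm_factor
        self.max_ttl = max_ttl

    def ttl(self):
        # Age of the document when we fetched it drives the lifetime.
        age_at_fetch = self.fetched_at - self.last_modified
        return min(self.lm_factor * age_at_fetch, self.max_ttl)

    def is_fresh(self, now):
        # Serve from cache while within the TTL; revalidate afterwards.
        return (now - self.fetched_at) < self.ttl()
```

Strong coherency, by contrast, would revalidate (or be invalidated by the origin) on every access rather than trusting such a heuristic lifetime.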
The proposed "CERN WebRadio Club" is looking for you!!! A unique and cool educational web radio about science, tech, music and fun for the general public, made at CERN by CERN people! We are looking for skilled IT volunteers with creativity and passion to develop our website and mobile apps and to maintain our systems, in an environment made to express your ideas. Come and join us, you will not regret it!
The U.S. Geological Survey (USGS) invites you to explore an earth science virtual library of digital information, publications, and data. The USGS World Wide Web sites offer an array of information that reflects scientific research and monitoring programs conducted in the areas of natural hazards, environmental resources, and cartography. This list provides gateways to access a cross section of the digital information on the USGS World Wide Web sites.
Jannick Kirk Sørensen
Between 2006 and 2011, a number of European public service broadcasting (PSB) organisations offered their website users the opportunity to create their own PSB homepage. The web customisation was conceived by the editors as a response to developments in commercial web services, particularly social networking and content aggregation services, but the customisation projects revealed tensions between the ideals of customer sovereignty and the editorial agenda-setting. This paper presents an overview...
A hands-on focused step-by-step tutorial to help you create Web Service applications using Dropwizard. If you are a software engineer or a web developer and want to learn more about building your own Web Service application, then this is the book for you. Basic knowledge of Java and RESTful Web Service concepts is assumed and familiarity with SQL/MySQL and command-line scripting would be helpful.
Lewandowski, Dirk; Mayr, Philipp
Purpose: To provide a critical review of Bergman’s 2001 study on the Deep Web. In addition, we bring a new concept into the discussion, the Academic Invisible Web (AIW). We define the Academic Invisible Web as consisting of all databases and collections relevant to academia but not searchable by the general-purpose internet search engines. Indexing this part of the Invisible Web is central to scientific search engines. We provide an overview of approaches followed thus far. Design/methodology/approach...
Kumar, Manoj; Rani, Anuj
In today’s high-tech environment, every organization and individual computer user uses the internet to access web data. Maintaining high confidentiality and security of that data requires secure web solutions. In this paper we describe dedicated anonymous web browsing solutions which make browsing faster and more secure. Web applications that play an important role in transferring secret information, such as email, need more and more attention to security. This paper also describes how...
Elleby, Anita; Ingwersen, Peter
The paper presents comparative analyses of two publication point systems, the Norwegian system and the in-house system of the interdisciplinary Danish Institute of International Studies (DIIS), used as cases in the study for publications published in 2006, and compares central citation-based indicators with novel publication point indicators (PPIs) that are formalized and exemplified: the Cumulated Publication Point Indicator (CPPI), which graphically illustrates the cumulated gain of obtained vs. ideal points, both seen as vectors; and the normalized Cumulated Publication Point Index (nCPPI), which represents the cumulated gain of publication success as index values, either graphically... Two diachronic citation windows are applied: 2006-07 and 2006-08. Web of Science (WoS) as well as Google Scholar (GS) are applied to observe the cite delay and citedness for the different document types published by DIIS...
Hajibabaei, Mehrdad; Singer, Gregory A C
New web-based technologies provide an excellent opportunity for sharing and accessing information and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed and implemented a novel algorithm for searching species-specific genomic sequences, DNA barcodes, using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and a query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages, and developed pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data and provides a robust and efficient solution for sequence search on the web. The integration of our search method with large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
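The word-based, alignment-free idea can be sketched as follows; the word length and the overlap-based scoring are illustrative assumptions, not the paper's exact parameters:

```python
def to_words(seq, w=4):
    """Break a DNA sequence into overlapping fixed-length words,
    mirroring the idea of turning sequences into 'words' that a
    conventional text search engine could index."""
    seq = seq.upper()
    return {seq[i:i + w] for i in range(len(seq) - w + 1)}

def build_index(library, w=4):
    """library: dict of barcode_id -> sequence."""
    return {bid: to_words(s, w) for bid, s in library.items()}

def search(query, index, w=4):
    """Rank library entries by the fraction of the query's words they
    share: a simple alignment-independent similarity measure."""
    qwords = to_words(query, w)
    scores = {bid: len(qwords & words) / max(len(qwords), 1)
              for bid, words in index.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

In the paper's setting, the actual word lookup is delegated to an off-the-shelf desktop search tool rather than an in-memory dictionary, but the decomposition into words is the same.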
Salsovic, Annette R.
A WebQuest is an inquiry-based lesson plan that uses the Internet. This article explains what a WebQuest is, shows how to create one, and provides an example. When engaged in a WebQuest, students use technology to experience cooperative learning and discovery learning while honing their research, writing, and presentation skills. It has been found…
Sleefe, Gerard E.; Rudnick, Thomas J.; Novak, James L.
A system for electrically measuring variations over a flexible web has a capacitive sensor including spaced electrically conductive, transmit and receive electrodes mounted on a flexible substrate. The sensor is held against a flexible web with sufficient force to deflect the path of the web, which moves relative to the sensor.
The semantic web or Web 3.0 makes information more meaningful to people by making it more understandable to machines. In this article, the author examines the implications of Web 3.0 for education. The author considers three areas of impact: knowledge construction, personal learning network maintenance, and personal educational administration.…
Explores the interrelation between Web publishing and information retrieval technologies and lists new approaches to Web indexing and searching. Highlights include Web directories; search engines; portalisation; Internet service providers; browser providers; meta search engines; popularity based analysis; natural language searching; links-based…
Jackson, Joe; Gilstrap, Donald L.
Addresses the implications of the new Web metalanguage XML for searching on the World Wide Web and considers the future of XML on the Web. Compared to HTML, XML is more concerned with structure of data than documents, and these data structures should prove conducive to precise, context rich searching. (Author/LRW)
The web is a rich and diverse information source with incredible amounts of information about all kinds of subjects in various forms. This information source affords great opportunity to build systems that support users in their work and everyday lives. To help users explore information on the web, web search systems should find information that…
Nagayama, Yoshio, E-mail: firstname.lastname@example.org [National Institute for Fusion Science, 322-6 Oroshi, Toki 509-5292 (Japan); Emoto, Masahiko; Yoshida, Masanobu; Iwata, Chie [National Institute for Fusion Science, 322-6 Oroshi, Toki 509-5292 (Japan)
The large helical device (LHD), the world's largest helical confinement system, is a national project serving the Japanese fusion community. In LHD, experiments of 7000 shots are carried out for 250 proposals every year. Efficient experiment arrangement is required in order to carry out many collaborators' proposals. Sometimes collaborators who are not familiar with LHD stay at NIFS for only a few days to join the experiment. The issues are as follows: how to reduce collaborators' effort; how to reduce managers' effort to optimize the schedule; and how to publicize the plan and results of the experiment. We have developed web systems for virtual printing, experimental proposals and scheduling using Ruby on Rails (RoR), which encapsulates a relational database (RDB) and AJAX. The RDB makes it possible to build tables by searching and sorting data with keywords. The web servers are hosted in a virtual computer system in order to minimize maintenance effort and cost. The LHD web portal has also been developed in order to provide collaborators with an efficient and intuitive interface to access the above systems, to obtain LHD information, and to use tools for LHD data analysis. The web systems have reduced collaborators' and managers' efforts significantly.
Reisinger, Florian; del-Toro, Noemi; Ternent, Tobias; Hermjakob, Henning; Vizcaíno, Juan Antonio
The PRIDE (PRoteomics IDEntifications) database is one of the world-leading public repositories of mass spectrometry (MS)-based proteomics data and it is a founding member of the ProteomeXchange Consortium of proteomics resources. In the original PRIDE database system, users could access data programmatically by accessing the web services provided by the PRIDE BioMart interface. New REST (REpresentational State Transfer) web services have been developed to serve the most popular functionality provided by BioMart (now discontinued due to data scalability issues) and address the data access requirements of the newly developed PRIDE Archive. Using the API (Application Programming Interface) it is now possible to programmatically query for and retrieve peptide and protein identifications, project and assay metadata and the originally submitted files. Searching and filtering is also possible by metadata information, such as sample details (e.g. species and tissues), instrumentation (mass spectrometer), keywords and other provided annotations. The PRIDE Archive web services were first made available in April 2014. The API has already been adopted by a few applications and standalone tools such as PeptideShaker, PRIDE Inspector, the Unipept web application and the Python-based BioServices package. This application is free and open to all users with no login requirement and can be accessed at http://www.ebi.ac.uk/pride/ws/archive/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
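Programmatic access of the kind described boils down to HTTP GET requests against the base URL above. A small sketch that only constructs query URLs; the endpoint and parameter names are illustrative assumptions, so consult the service's own documentation for the actual resource paths:

```python
from urllib.parse import urlencode

# Base URL as given in the abstract.
BASE = "http://www.ebi.ac.uk/pride/ws/archive"

def archive_url(endpoint, **params):
    """Build a query URL for a REST endpoint under the PRIDE Archive
    web services base. Parameters are sorted so URLs are stable."""
    url = "{}/{}".format(BASE, endpoint.strip("/"))
    if params:
        url += "?" + urlencode(sorted(params.items()))
    return url
```

A client would pass such a URL to any HTTP library and parse the JSON response; since the API is open with no login requirement, no authentication step is needed.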
Thomsen, Jakob; Ernst, Erik; Brabrand, Claus
We present WebSelF, a framework for web scraping which models the process of web scraping and decomposes it into four conceptually independent, reusable, and composable constituents. We have validated our framework through a full parameterized implementation that is flexible enough to capture previous work on web scraping. We have experimentally evaluated our framework and implementation in an experiment that evaluated several qualitatively different web scraping constituents (including previous work and combinations hereof) on about 11,000 HTML pages on daily versions of 17 web sites over a period of more than one year. Our framework solves three concrete problems with current web scraping, and our experimental results indicate that composition of previous techniques and our new techniques achieves a higher degree of accuracy, precision and specificity than existing techniques alone.
Edward C Klatt
Dykstra, D.; Blomer, J.; Blumenfeld, B.; De Salvo, A.; Dewhurst, A.; Verguilov, V.
All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS and CMS each have their own method for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home) which they direct to the nearest publicly accessible web proxy servers. The responses...
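A WPAD server of this kind maps the requesting node's IP address to its site's registered squids and renders a PAC file (which is itself a small JavaScript function). The sketch below is a toy Python rendering with invented site names and address ranges, not the WLCG implementation:

```python
import ipaddress

# Illustrative registry of site IP ranges -> that site's squid proxies.
# In the real system this comes from AGIS/SITECONF, cross-checked
# against GOCDB and OIM, as described above.
REGISTRY = {
    "192.168.0.0/16": ["http://squid1.site-a.example:3128",
                       "http://squid2.site-a.example:3128"],
    "10.0.0.0/8": ["http://squid.site-b.example:3128"],
}

def proxies_for(client_ip):
    """Return the proxy list for the site whose range contains client_ip."""
    ip = ipaddress.ip_address(client_ip)
    for cidr, proxies in REGISTRY.items():
        if ip in ipaddress.ip_network(cidr):
            return proxies
    return []

def pac_for(client_ip):
    """Render a minimal PAC file: try the site's squids in order,
    then fall back to a direct connection."""
    hops = ["PROXY " + p.split("//", 1)[1] for p in proxies_for(client_ip)]
    route = "; ".join(hops + ["DIRECT"])
    return ('function FindProxyForURL(url, host) {\n'
            '  return "%s";\n}' % route)
```

Clients that implement the PAC standard (as both Frontier and CVMFS do) evaluate the returned function to pick their proxy chain.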
Gao, Sheng; Mioc, Darka; Boley, Harold
Web-based GIS is increasingly used in health applications. It has the potential to provide critical information in a timely manner, support health care policy development, and educate decision makers and the general public. This paper describes the trends and recent development of health applications using a web-based GIS. Recent progress on database storage and geospatial web services has advanced the use of web-based GIS for health applications, with various proprietary software packages, open source software, and Application Programming Interfaces (APIs) available. Current challenges in applying ... care planning, and public health participation...
US Agency for International Development — WebPass Explorer (WebPASS Framework): USAID is partnering with DoS in the implementation of their WebPass Post Personnel (PS) Module. WebPassPS does not replace...
The unique characteristic of web applications is that they are supposed to be used by a much bigger and more diverse set of users and stakeholders. Example application areas are e-Learning and business-to-business interaction. In an e-Learning environment, various users with different backgrounds use the e-Learning system to study a discipline. In business-to-business interaction, different requirements and parameters of exchanged business requests might be served by different services from third parties. Such applications require certain intelligence and a slightly different approach to design. Adaptive web-based applications aim to leave some of their features open at the design stage in the form of variables which depend on several criteria. The resolution of the variables is called adaptation and can be seen from two perspectives: adaptation by humans to the changed requirements of stakeholders, and dynamic system...
A Note from the Author and from O'Reilly Media about what this book does--and doesn't--do: Palm webOS is a brand new platform and represents a very different type of operating system where the web runtime is used as the basis for the UI and application model. Palm and O'Reilly felt that it was important to have a book available to help developers get a basic understanding of the new Palm platform at the time that the SDK was released; this timing played a major role in the content and structure of the book. Ideally this book would have been a complete reference of the new platform but that wasn't
Callegaro, Mario; Vehovar, Asja
Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.
Gross, D.; Eisert, J.
We discuss the notion of quantum computational webs: These are quantum states universal for measurement-based computation, which can be built up from a collection of simple primitives. The primitive elements--reminiscent of building blocks in a construction kit--are (i) one-dimensional states (computational quantum wires) with the power to process one logical qubit and (ii) suitable couplings, which connect the wires to a computationally universal web. All elements are preparable by nearest-neighbor interactions in a single pass, of the kind accessible in a number of physical architectures. We provide a complete classification of qubit wires, a physically well-motivated class of universal resources that can be fully understood. Finally, we sketch possible realizations in superlattices and explore the power of coupling mechanisms based on Ising or exchange interactions.
When you do a simple Web search on a topic, the results that pop up aren't the whole story. The Internet contains a vast trove of information... important to you, to listening to your favorite music online. Because encryption is essential in protecting the authenticity of personal information... technological trends. Darknet tactics vary, making it hard to identify darknet users or data hosts. For example, Dr. Gareth Owen, University of...
Search engines play a major role in today's WWW in making the existing mass of information more accessible. Using a purpose-built search engine, this thesis demonstrates how specific information can be collected automatically from the WWW and evaluated semi-automatically against given criteria. It examines the architecture of web crawlers as well as ranking and clustering methods for documents in general. Also developed was a learning...
This BSc project was performed during a study stay at Coventry University, UK. The goal of the project is to enhance the accessibility and usability of an existing company presentation located at http://www.hcc.cz, boost the site's traffic and so increase the company's revenues. The project follows these steps to accomplish this: (a) a partial refactoring of the back-end (PHP scripts); (b) transformation of the website contents according to the recommendations of the World Wide Web Consortium...
Web Dynpro ABAP, a NetWeaver web application user interface tool from SAP, enables web programming connected to SAP Systems. The authors' main focus was to create a book based on their own practical experience. Each chapter includes examples which lead through the content step-by-step and enable the reader to gradually explore and grasp the Web Dynpro ABAP process. The authors explain in particular how to design Web Dynpro components, the data binding and interface methods, and the view controller methods. They also describe the other SAP NetWeaver Elements (ABAP Dictionary, Authorization) and
The arrangement for selectively irradiating a web includes a perforated band of a radiation impermeable substance which is guided in an endless path via a pair of guide rollers and has two juxtaposed runs in this path. A take-up roller conveys a web of material past one of the runs at a side thereof remote from the other run, the direction of movement of the web being other than parallel to that of the band and, preferably, normal thereto. An electron accelerator is provided at the far side of the run remote from the web and is effective for directing a radiation beam at the web through the perforations
Computer Security Team
You’re about to launch a new website? Cool!! With today’s web programming languages like PHP, Java, Python or Perl, complex websites can be created, easily fulfilling all your use cases. But hold on. Did you ever think about how easily this can be abused? Attackers today are already using automatic tools which can quickly and easily find and exploit vulnerable web applications. Web applications often suffer from security vulnerabilities, i.e. design flaws or programming bugs that remained undetected during the whole software development cycle. In production these vulnerabilities become security holes, providing an opportunity for exploitation, and can pose immense security risks (and there is no reason to believe that CERN is immune to this). The costs associated with eliminating these bugs could be loosely described by the "1:10:100 rule", i.e. the relative costs of fixing them in the programming, testing and production phases are 1:10:100. Thus, the...
What is a web app? It's software that you use right in your web browser. Rather than installing an application on your computer, you visit a web site and sign up as a new user of its software. Instead of storing your files on your own hard disk, the web app stores them for you, online. Is it possible to switch entirely to web apps? To run nothing but a browser for an entire day? In this PDF we'll take you through one day in the life of a web apps-only user and chronicle the pros and cons of living by browser. And if the idea of switching, fully or partially, to web apps sounds appealing to
Houben, G.J.P.M.; Barna, P.; Frasincar, F.; Vdovják, R.; Cuella Lovelle, J.M.; et al., xx
As a consequence of the success of the Web, methodologies for information system development need to consider systems that use the Web paradigm. These Web Information Systems (WIS) use Web technologies to retrieve information from the Web and to deliver information in a Web presentation to the
Schultheiss, Sebastian J; Münch, Marc-Christian; Andreeva, Gergana D; Rätsch, Gunnar
We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses, only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality only for 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. 78% of all respondents indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository.
... to give adequate consideration to issues relating to public health, safety and welfare, or protection... Web site), mailed by first class mail to the parties who, to the Commission's knowledge, will...
Akpinar, Yavuz; Bayramoglu, Yusuf
The purpose of the study was to examine the effects of compact training for developing web sites on teachers' web attitude, composed of web self-efficacy, perceived web enjoyment, perceived web usefulness, and behavioral intention to use the web. To measure the related constructs, the Web Attitude Scale was adapted into Turkish and tested with a…
This publications catalogue lists all sales publications of the IAEA published in 2016–2017 and those forthcoming in 2017–2018. Most IAEA publications are issued in English; some are also available in Arabic, Chinese, French, Russian or Spanish. This is indicated at the bottom of the book entry. Most publications are issued in softcover. A complete listing of all IAEA priced publications is available on the IAEA’s web site: www.iaea.org/books
Lopes, Pedro; Oliveira, José Luís
As the "omics" revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter's complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a "semantic web in a box" approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/.
Doderer, Mark S; Burkhardt, Cory; Robbins, Kay A
Many bioinformatics algorithms and data sets are deployed using web services so that the results can be explored via the Internet and easily integrated into other tools and services. These services often include data from other sites that is accessed either dynamically or through file downloads. Developers of these services face several problems because of the dynamic nature of the information from the upstream services. Many publicly available repositories of bioinformatics data frequently update their information. When such an update occurs, the developers of the downstream service may also need to update. For file downloads, this process is typically performed manually followed by web service restart. Requests for information obtained by dynamic access of upstream sources is sometimes subject to rate restrictions. SideCache provides a framework for deploying web services that integrate information extracted from other databases and from web sources that are periodically updated. This situation occurs frequently in biotechnology where new information is being continuously generated and the latest information is important. SideCache provides several types of services including proxy access and rate control, local caching, and automatic web service updating. We have used the SideCache framework to automate the deployment and updating of a number of bioinformatics web services and tools that extract information from remote primary sources such as NCBI, NCIBI, and Ensembl. The SideCache framework also has been used to share research results through the use of a SideCache derived web service.
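The cache-plus-rate-control pattern that SideCache automates can be sketched generically; all names and parameters below are illustrative assumptions, not SideCache's actual API:

```python
import time

class SideCacheStyleProxy:
    """Toy sketch of a caching proxy for upstream data sources:
    serve from a local cache while an entry is fresh, refresh it
    when stale, and never hit the upstream more often than
    min_interval seconds (serving a stale copy instead).
    'fetch' is any callable that retrieves fresh data for a key."""

    def __init__(self, fetch, ttl=3600, min_interval=1.0,
                 clock=time.monotonic):
        self.fetch = fetch
        self.ttl = ttl
        self.min_interval = min_interval
        self.clock = clock
        self.cache = {}                  # key -> (timestamp, value)
        self.last_upstream = float("-inf")

    def get(self, key):
        now = self.clock()
        hit = self.cache.get(key)
        if hit and now - hit[0] < self.ttl:
            return hit[1]                # fresh local copy
        if hit and now - self.last_upstream < self.min_interval:
            return hit[1]                # rate-limited: serve stale copy
        value = self.fetch(key)          # refresh from the upstream source
        self.last_upstream = self.clock()
        self.cache[key] = (self.clock(), value)
        return value
```

Passing an injectable clock keeps the sketch testable; a real deployment would persist the cache and refresh entries on a schedule rather than on demand.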
Writing for the Web unites theory, technology, and practice to explore writing and hypertext for website creation. It integrates such key topics as XHTML/CSS coding, writing (prose) for the Web, the rhetorical needs of the audience, theories of hypertext, usability and architecture, and the basics of web site design and technology. Presenting information in digestible parts, this text enables students to write and construct realistic and manageable Web sites with a strong theoretical understanding of how online texts communicate to audiences. Key features of the book
Barsoum, Emad; Kuester, Falko
The pervasive nature of web-based content has led to the development of applications and user interfaces that port between a broad range of operating systems and databases, while providing intuitive access to static and time-varying information. However, the integration of this vast resource into virtual environments has remained elusive. In this paper we present an implementation of a 3D Web Browser (WebVR) that enables the user to search the internet for arbitrary information and to seamlessly augment this information into virtual environments. WebVR provides access to the standard data input and query mechanisms offered by conventional web browsers, with the difference that it generates active texture-skins of the web contents that can be mapped onto arbitrary surfaces within the environment. Once mapped, the corresponding texture functions as a fully integrated web browser that responds to traditional events such as the selection of links or text input. As a result, any surface within the environment can be turned into a web-enabled resource that provides access to user-definable data. In order to leverage the continuous advancement of browser technology and to support both static and streamed content, WebVR uses ActiveX controls to extract the desired texture skin from industry-strength browsers, providing a unique mechanism for data fusion and extensibility.
Jansen, Bernard J
This lecture presents an overview of the Web analytics process, with a focus on providing insight and actionable outcomes from collecting and analyzing Internet data. The lecture first provides an overview of Web analytics, providing, in essence, a condensed version of the entire lecture. The lecture then outlines the theoretical and methodological foundations of Web analytics in order to make obvious the strengths and shortcomings of Web analytics as an approach. These foundational elements include the psychological basis in behaviorism and the methodological underpinning of trace data as an empir
Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A practical, step-by-step guide featuring recipes that will get you up and running quickly with Nancy. If you are a .NET-oriented web developer who is curious to find out what lies beyond the Microsoft-provided frameworks, then this book is for you! It is beneficial to have a good knowledge of C# and .NET, as well as a basic working knowledge of HTTP. If testability is important to you
The aim of the thesis was to develop a web portal for chess players, intended to be a place for quality online chess play and chess education. This need arises because existing social networks and websites for playing chess offer very little in the way of quality play and education. As a result of the work, the chess portal was created. In the thesis we present the development of the chess portal, which has all the features the chess players are i...
Having taught and carried out research in LSP and business communication for many years, I have come across, again and again, the problems arising from the inferior status of language in the business environment. Being convinced that it does not have to be so, instead of going on trying to convince non-linguistically trained colleagues of the importance of language via the usual arguments, I suggest that we let them experience the problems arising from the non-recognition of the importance of language via a Web communication crash course, inspired by a course taught to BA students...
Li Sheng; Piao Yunsong; Liu Yang
In a given path with multiple branches, one can in principle expect fork points, where one branch bifurcates into different branches, or various branches converge into one or several branches. In this paper, it is shown that if there is a web formed by such branches in a given field space, in which each branch can be responsible for a period of slow-roll inflation, a multiverse separated by a domain wall network will come into being, parts of which might correspond to our observable universe. We discuss this scenario and show possible observations of a given observer at late times.
Moi, Morten Benestad
This thesis studies the methods needed to create a web-based application to remotely customize a CAD model. This includes using a graphical user interface to remotely control the inputs to and outputs from the model in NX, and to send the result back to the user. Using CAD systems such as NX requires intensive training, is often a slow process, and leaves a lot of room for error. An intuitive, simple user interface will eliminate the need for CAD trai...
Espuña Buxó, Álvaro
In this project we develop a framework that tries to detect automatically and dynamically if a website is using canvas fingerprinting and if its usage is being obfuscated. We also analyze the most visited pages according to the Alexa ranking and present the results obtained.
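The core signal a canvas-fingerprinting detector looks for is a script that both draws to a canvas and reads the pixels back. The framework in this project does this dynamically in the browser; the following is only a static, heuristic sketch over JavaScript source, with invented regex patterns:

```python
import re

# Heuristic detector: a script that both draws text/shapes to a canvas
# and reads the rendered pixels back matches the basic fingerprinting
# pattern. Real detectors (like the one described above) instrument the
# browser at runtime and also handle obfuscated calls; this is a sketch.
DRAW_CALLS = re.compile(r"\b(fillText|strokeText|fillRect)\s*\(")
READ_CALLS = re.compile(r"\b(toDataURL|getImageData)\s*\(")

def looks_like_canvas_fingerprinting(js_source: str) -> bool:
    """Flag scripts containing both a canvas draw call and a pixel read-back."""
    return bool(DRAW_CALLS.search(js_source)) and bool(READ_CALLS.search(js_source))
```

A script that only draws (e.g. a charting library) is not flagged, since it never reads the pixels back.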
Gardner, Lyza; Grigsby, Jason
Ulltveit-Moe, Nils; Olsen, Morten Goodwin; Pillai, Anand B.
The European Internet Accessibility Observatory project (EIAO) has developed an Observatory for performing large-scale automatic web accessibility evaluations of public sector web sites in Europe. The architecture includes a distributed web crawler that crawls web sites for links until either a given budget of web pages has been identified or the web site has been crawled exhaustively. Subsequently, a uniform random subset of the crawled web pages is sampled and sent for accessibility evaluation, and the evaluation results are stored in a Resource Description Format (RDF) database that is later loaded… The paper discusses challenges that the project faced and the solutions developed towards building a system capable of regular large-scale accessibility evaluations with sufficient capacity and stability. It also outlines some possible future architectural improvements.
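The two-stage procedure the abstract describes (crawl to a page budget, then draw a uniform random sample for evaluation) can be sketched as follows. The `get_links` callable stands in for a real fetcher and is an assumption here, as are the function and parameter names:

```python
import random
from collections import deque

def crawl_and_sample(start_url, get_links, budget, sample_size, seed=0):
    """Sketch of the EIAO two-stage procedure: breadth-first crawl until
    `budget` pages are identified (or the site is exhausted), then draw
    a uniform random subset for accessibility evaluation."""
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) < budget:
        page = queue.popleft()
        for link in get_links(page):          # fetch page, extract links
            if link not in seen and len(seen) < budget:
                seen.add(link)
                queue.append(link)
    rng = random.Random(seed)                 # fixed seed: reproducible sample
    k = min(sample_size, len(seen))
    return rng.sample(sorted(seen), k)        # uniform random subset
```

Because the sample is drawn uniformly over the identified pages, accessibility statistics computed on it are unbiased estimates for the crawled portion of the site.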
The use of the internet as a medium for disseminating information to stakeholders is increasingly gaining ground. This study extends the existing literature on web disclosures by investigating the characteristics that predict the extent of web-based disclosures. In this study, corporate websites of top Nigerian firms are used as sources of data, while a regression analysis is employed to examine the extent of prediction. Results indicate that firm size and industry type are significant determinants of web disclosures. However, other firm traits such as ownership dispersion and financial performance do not significantly explain the extent of internet disclosures. The study recommends that a regulatory template for corporate web disclosures be put in place by government, regardless of the size or industry classification of the firm, with a view to considerably reducing agency conflicts arising from information asymmetry in publicly listed firms in Nigeria.
The World Wide Web was born at CERN in 1989. However, although historic paper documents from over 50 years ago survive in the CERN Archive, it is by no means certain that we will be able to consult today's web pages 50 years from now. The Internet Archive's Wayback Machine includes an impressive collection of archived CERN web pages from 1996 onwards. However, their coverage is not complete - they aim for broad coverage of the whole Internet, rather than in-depth coverage of particular organisations. To try to fill this gap, the CERN Archive has entered into a partnership agreement with the Internet Memory Foundation. Harvesting of CERN's publicly available web pages is now being carried out on a regular basis, and the results are available here.
Jensen, Christian S.
Driven by factors such as the increasingly mobile use of the web and the proliferation of geo-positioning technologies, the web is rapidly acquiring a spatial aspect. Specifically, content and users are being geo-tagged, and services are being developed that exploit these tags. The research community is hard at work inventing means of efficiently supporting new spatial query functionality. Points of interest with a web presence, called spatial web objects, have a location as well as a textual description. Spatio-textual queries return such objects that are near a location argument and are relevant to a text argument. An important element in enabling such queries is to be able to rank spatial web objects. Another is to be able to determine the relevance of an object to a query. Yet another is to enable the efficient processing of such queries. The talk covers recent results on spatial web...
Hendler, James [Rensselaer Polytechnic Institute
As more and more data and information becomes available on the Web, new technologies that use explicit semantics for information organization are becoming desirable. New terms such as Linked Data, Semantic Web and Web 3.0 are used more and more, although there is increasing confusion as to what each means. In this talk, I will describe how different sorts of models can be used to link data in different ways. I will particularly explore different kinds of Web applications, from Enterprise Data Integration to Web 3.0 startups, government data release, the different needs of Web 2.0 and 3.0, the growing interest in “semantic search”, and the underlying technologies that power these new approaches.
Farooq, Omar; Aguenaou, Samir
Does the traffic generated by websites of firms signal anything to stock market participants? Does higher web traffic translate into the availability of more information and therefore lower agency problems? And if the answers to the above questions are affirmative, does higher web traffic translate into better firm performance? This paper aims to answer these questions by documenting a positive relationship between the extent of web traffic and firm performance in the MENA region during 2010. We argue that higher web traffic lowers agency problems in firms by disseminating more information to stock market participants. Consequently, lower agency problems translate into better performance. Furthermore, we also show that the agency-reducing role of web traffic is more pronounced in regimes where the information environment is already bad. For example, our results show a stronger impact of web...
Domenico, B.; Weber, W. J.
A few years back, the authors presented examples of online documents that allowed the reader to interact directly with datasets, but there were limitations that restricted the interaction to specific desktop analysis and display tools that were not generally available to all readers of the documents. Recent advances in web service technology and related standards are making it possible to develop systems for publishing online documents that enable readers to access, analyze, and display the data discussed in the publication from the perspective and in the manner from which the author wants it to be represented. By clicking on embedded links, the reader accesses not only the usual textual information in a publication, but also data residing on a local or remote web server as well as a set of processing tools for analyzing and displaying the data. With the option of having the analysis and display processing provided on the server, there are now a broader set of possibilities on the client side where the reader can interact with the data via a thin web client, a rich desktop application, or a mobile platform "app." The presentation will outline the architecture of data interactive publications along with illustrative examples.
Domenico, B.; Weber, J.
For some years now, the authors have developed examples of online documents that allowed the reader to interact directly with datasets, but there were limitations that restricted the interaction to specific desktop analysis and display tools that were not generally available to all readers of the documents. Recent advances in web service technology and related standards are making it possible to develop systems for publishing online documents that enable readers to access, analyze, and display the data discussed in the publication from the perspective and in the manner from which the author wants it to be represented. By clicking on embedded links, the reader accesses not only the usual textual information in a publication, but also data residing on a local or remote web server as well as a set of processing tools for analyzing and displaying the data. With the option of having the analysis and display processing provided on the server (or in the cloud), there are now a broader set of possibilities on the client side where the reader can interact with the data via a thin web client, a rich desktop application, or a mobile platform "app." The presentation will outline the architecture of data interactive publications along with illustrative examples.
The development of next-generation sequencing (NGS) platforms has spawned an enormous volume of data. This explosion in data has unearthed new scalability challenges for existing bioinformatics tools. The analysis of metagenomic sequences using bioinformatics pipelines is complicated by the substantial complexity of these data. In this article, we review several commonly used online tools for metagenomics data analysis with respect to their quality and detail of analysis using simulated metagenomics data. There are at least a dozen such software tools presently available in the public domain. Among them, MG-RAST, IMG/M, and METAVIR are the best-known tools according to the number of citations by peer-reviewed scientific media up to mid-2015. Here, we describe 12 online tools with respect to their web link, annotation pipelines, clustering methods, online user support, and availability of data storage. We have also rated each tool to identify the most promising ones, and evaluated the five best tools using a synthetic metagenome. The article comprehensively deals with the contemporary problems and prospects of metagenomics from a bioinformatics viewpoint.
Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.
Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces a software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of its potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings are necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open source initiative that provides a public evaluation and certification of multiple frameworks on common, industrially relevant problem sets. This edited volume reports on the first results in developing a common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for its potential practical use. The book is also suitable for advanced-level students in computer science.
Geraci, Filippo; Martinelli, Maurizio; Pellegrini, Marco; Serrecchia, Michela
The paper presents a framework aimed at assessing the capacity of Public Administration bodies (PA) to offer good-quality information and services on their web portals. Our framework is based on the extraction of ".it" domain names registered by Italian public institutions and the subsequent analysis of their websites. The analysis involves automatic gathering of the web pages of PA portals by means of web crawling and an assessment of the quality of their online information s...
The Invisible Web is often discussed in the academic context, where its contents (mainly in the form of databases) are of great importance. But this discussion is mainly based on seminal research done by Sherman and Price (2001) and by Bergman (2001). We focus on the types of Invisible Web content relevant for academics and the improvements made by search engines to deal with these content types. In addition, we question the volume of the Invisible Web as stated by Bergman. Ou...
Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.
NASA envisions the development of smart sensor webs: intelligent and integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models, with event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iii) development of autonomous model interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework) that is being extended to create VSICS.
Hoppen dos Santos, Joni; Ferreira Pires, Luis; Goncalves da Silva, Eduardo; Iacob, Maria Eugenia; Janssen, M.; Macintosh, A.; Scholl, H.J.; Tambouris, E.; Wimmer, M.A.; de Bruijn, H.; Tan, Y.H
In recent years, public organisations have been challenged to offer electronic services. This has caused a proliferation of disconnected web sites or web portals, often reflecting the internal structures (departments or sections) of these organizations. This paper shows that electronic public
Recent developments on the World-Wide Web provide an unparalleled opportunity to revolutionise scientific, technical and medical publication. The technology exists for the scientific world to use primary publication to create a knowledge base, or Semantic Web, with a potential greatly beyond the paper archives and electronic databases of today.
Fejzić , Diana
Due to the increasing prevalence of smartphones and tablet computers, responsive design has become a crucial part of web design. For a user, responsive web design enables the best user experience, regardless of whether the user is visiting the site via a mobile phone, a tablet or a computer. This thesis covers the process of planning, designing and developing a responsive web site for a fictitious company named “Creative Design d.o.o.”, with the help of web technologies. In the initial part of the thesis, w...
Politiken, 01.01.2014 14:16. The Danes rang in the new year with a bang, but for a few people things went wrong when the New Year's fireworks were lit. Emergency rooms treated 73 people for fireworks injuries between 6 p.m. last night and 6 a.m. this morning. This is shown by a count made by Politiken based on figures from the Accident Analysis Group (Ulykkes Analyse Gruppen) at Odense University Hospital. The article was also published in: Alt om Ikast Brande (web), Lemvig Folkeblad (web), Politiken (web), Dagbladet Ringkjøbing Skjern (web).
Kunz, P F
On 12 December 1991, Dr. Kunz installed the first Web server outside of Europe at the Stanford Linear Accelerator Center. Today, if you do not have access to the Web you are considered disadvantaged. Before it made sense for Tim Berners-Lee to invent the Web at CERN, a number of ingredients had to be in place. Dr. Kunz will present a history of how these ingredients developed and the role the academic research community had in forming them. In particular, the role that big science, such as high energy physics, played in giving us the Web we have today...
There are several studies dealing with the differences between sites created by men and women. However, these references mainly relate to the "output", the final web site. In our research we examined the input side of web design. We thoroughly analysed a number of randomly selected web design software packages to see whether, and to what extent, the templates they offer determine the final look of an individual's website. We found that most of them are typically masculine templates, which makes it difficult for a woman to design a feminine-looking website. This may be one of the reasons for the masculine hegemony of websites on the web.
Little, Ruth Gaskins; Greer, Annette; Clay, Maria; McFadden, Cheryl
Public health leaders play pivotal roles in ensuring the population health of our nation. Since 2000, the number of schools of public health has almost doubled. The scholarly credentials of leaders of public health in academia and practice are important, as they make decisions that shape the future public health workforce and important public health policies. This research brief describes the educational degrees of deans of schools of public health and state health directors, as well as their demographic profiles, providing important information for future public health leadership planning. Data were extracted from a database containing information obtained from multiple Web sites, including academic institution Web sites and state government Web sites. Variables describe 2 sets of public health leaders: academic deans of schools of public health and state health directors. Deans of schools of public health were 73% male and 27% female; the PhD degree was held by 40% of deans, and the MD degree by 33%. Seventy percent of deans obtained their terminal degree more than 35 years ago. State health directors were 60% male and 40% female. Sixty percent of state health directors had an MD degree, 4% a PhD degree, and 26% no terminal degree at all. Sixty-four percent of state health directors received their terminal degree more than 25 years ago. In addition to terminal degrees, 56% of deans and 40% of state health directors held MPH degrees. The findings call into question the competencies needed by future public health professionals and leadership, and the need to further clarify the level of public health training and degree type that should be required for leadership qualifications in public health.
Pritychenko, B.; Sonzogni, A.A.
We present Sigma Web interface which provides user-friendly access for online analysis and plotting of the evaluated and experimental nuclear reaction data stored in the ENDF-6 and EXFOR formats. The interface includes advanced browsing and search capabilities, interactive plots of cross sections, angular distributions and spectra, nubars, comparisons between evaluated and experimental data, computations for cross section data sets, pre-calculated integral quantities, neutron cross section uncertainties plots and visualization of covariance matrices. Sigma is publicly available at the National Nuclear Data Center website at http://www.nndc.bnl.gov/sigma.
... the CSB Web site (http://www.csb.gov). Comments may also be sent to CSB Headquarters (see address...) ... public review either at CSB Headquarters or by following directions posted on the CSB Web site. By..., operating ...
Mandal, Chittaranjan; Sinha, Vijay Luxmi; Reade, Christopher M. P.
The architecture of a web-based course management tool that has been developed at IIT [Indian Institute of Technology], Kharagpur and which manages the submission of assignments is discussed. Both the distributed architecture used for data storage and the client-server architecture supporting the web interface are described. Further developments…
Schwinger, W.; Retschitzegger, W.; Schauerhuber, A.; Kappel, G.; Wimmer, M.; Pröll, B.; Cachero Castro, C.; Casteleyn, S.; De Troyer, O.; Fraternali, P.; Garrigos, I.; Garzotto, F.; Ginige, A.; Houben, G.J.P.M.; Koch, N.; Moreno, N.; Pastor, O.; Paolini, P.; Pelechano Ferragud, V.; Rossi, G.; Schwabe, D.; Tisi, M.; Vallecillo, A.; Sluijs, van der K.A.M.; Zhang, G.
Purpose – Ubiquitous web applications (UWA) are a new type of web applications which are accessed in various contexts, i.e. through different devices, by users with various interests, at anytime from anyplace around the globe. For such full-fledged, complex software systems, a methodologically sound
Bundza, Maira; Vander Meer, Patricia Fravel; Perez-Stable, Maria A.
Although the library's Web site has become a standard tool for seeking information and conducting research in academic institutions, there are a variety of ways libraries approach the often challenging--and sometimes daunting--process of Web site development and maintenance. Three librarians at Western Michigan University explored issues related…
In this article I want to discuss whether and how the World Wide Web changes social memory practices. I therefore examine the relationship between the World Wide Web, social memory practices and public discourses. To discuss mediated memory processes, I focus on the online discourse about the trial against the former concentration camp guard John Demjanjuk.
East Jefferson General Hospital in Metairie, La., launched a new Web site in October 2001. Its user-friendly home page offers links to hospital services, medical staff, and employer information. Its jobline is a powerful tool for recruitment. The site was awarded the 2002 Pelican Award for Best Consumer Web site by the Louisiana Society for Hospital Public Relations & Marketing.
Cherry, Joan M.
Evaluation of data from assessments of full bibliographic displays in academic library OPACs (online public access catalogs) and World Wide Web catalogs against a checklist of desirable features found that OPAC displays scored 58% and Web displays scored 60%. Discusses weaknesses, focusing on those found in the majority of the displays…
Hull, Richard; Thiemann, Peter; Wadler, Philip
From 28.01. to 02.02.2007, the Dagstuhl Seminar 07051 "Programming Paradigms for the Web: Web Programming and Web Services" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The firs...
This article describes a learning system constructed to facilitate teaching and learning by creating a functional web-based contact between schools and organisations which, in cooperation with the school, contribute to pupils’/students’ cognitive development. Examples of such organisations include science centres, museums, art and music workshops and teacher education internships. With the support of the “Web Coherence Learning” IT application (abbreviated in Swedish to Webbhang) developed by the University of Kalmar, the aim is to reinforce learning processes in the encounter with organisations outside school. In close cooperation with potential users, a system was developed which can be described as consisting of three modules. The first module, “the organisation page”, supports the organisation in simply setting up a homepage, where overarching information on organisation operations can be published and where functions like calendar, guestbook, registration and newsletter can be included. In the second module, “the activity page”, the activities offered by the organisation are described. Here pictures and information may prepare and inspire pupils/students to their own activities before future visits. The third part, “the participant page”, is a communication module linked to the activity page, enabling school classes to introduce themselves and their work as well as documenting the work and communicating with the educators responsible for external activities. When the project is finished, the work will be available to further school classes, parents and other interested parties. System development and testing have been performed in a small pilot study where two creativity educators at an art museum have worked together with pupils and teachers from a compulsory school class. The system was used to establish, prior to the visit of the class, a deeper contact and to maintain a more qualitative continuous dialogue during and after
Aoki-Kinoshita, Kiyoko F; Bolleman, Jerven; Campbell, Matthew P; Kawano, Shin; Kim, Jin-Dong; Lütteke, Thomas; Matsubara, Masaaki; Okuda, Shujiro; Ranzinger, Rene; Sawaki, Hiromichi; Shikanai, Toshihide; Shinmachi, Daisuke; Suzuki, Yoshinori; Toukach, Philip; Yamada, Issaku; Packer, Nicolle H; Narimatsu, Hisashi
Glycoscience is a research field focusing on complex carbohydrates (otherwise known as glycans), which can, for example, serve as "switches" that toggle between different functions of a glycoprotein or glycolipid. Due to the advancement of the glycomics technologies that are used to characterize glycan structures, many glycomics databases are now publicly available and provide useful information for glycoscience research. However, these databases have almost no links to other life science databases. In order to implement support for the Semantic Web most efficiently for glycomics research, the developers of the major glycomics databases agreed on a minimal standard for representing glycan structure and annotation information using RDF (Resource Description Framework). Moreover, all of the participants implemented this standard prototype and generated preliminary RDF versions of their data. To test the utility of the converted data, all of the data sets were uploaded into a Virtuoso triple store, and several SPARQL queries were tested as "proofs of concept" to illustrate the utility of the Semantic Web in querying across databases, which was originally difficult to implement. We were able to successfully retrieve information by linking UniCarbKB, GlycomeDB and JCGGDB in a single SPARQL query to obtain our target information. We also tested queries linking UniProt with GlycoEpitope, as well as lectin data with GlycomeDB through PDB. As a result, we have been able to link proteomics data with glycomics data through the implementation of Semantic Web technologies, allowing for more flexible queries across these domains.
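The cross-database joins these SPARQL queries perform become possible once datasets share glycan identifiers. The toy records below are invented purely for illustration (real queries run over UniCarbKB, GlycomeDB, JCGGDB, etc. via SPARQL over a triple store), but the join logic is the same:

```python
# Invented toy data: a structure database keyed by glycan ID, and an
# attachment database linking glycans to proteins via the same IDs.
glycome_db = {"G001": {"structure": "GlcNAc(b1-4)GlcNAc"}}
unicarb_kb = [
    {"glycan": "G001", "protein": "P12345", "site": "Asn-88"},
]

def glycans_with_proteins(structure_db, attachment_db):
    """Join glycan structures to the proteins they decorate -- the kind
    of query a single SPARQL statement can express once both datasets
    use shared glycan URIs, as in the RDF prototype described above."""
    return [
        {"structure": structure_db[rec["glycan"]]["structure"],
         "protein": rec["protein"],
         "site": rec["site"]}
        for rec in attachment_db
        if rec["glycan"] in structure_db   # only join on shared identifiers
    ]
```

Without the agreed-upon shared identifiers, each record would name its glycans differently and the join condition would never match, which is precisely the situation the minimal RDF standard was designed to fix.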
Prudden, R.; Tomlinson, J.; Robinson, N.; Arribas, A.
This talk will discuss a technique for lossy data compression specialised for web animation. We set ourselves the challenge of visualising a full forecast weather field as an animated 3D web page. These data are richly spatiotemporal; however, they are routinely communicated to the public as a 2D map, and scientists are largely limited to visualising data via static 2D maps or 1D scatter plots. We wanted to present Met Office weather forecasts in a way that represents all the generated data. Our approach was to repurpose the technology used to stream high-definition videos. This enabled us to achieve high rates of compression while remaining compatible with both web browsers and GPU processing. Since lossy compression necessarily involves discarding information, evaluating the results is an important and difficult problem. This is essentially a problem of forecast verification. The difficulty lies in deciding what it means for two weather fields to be "similar", as simple definitions such as mean squared error often lead to undesirable results. In the second part of the talk, I will briefly discuss some ideas for alternative measures of similarity.
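The claim that mean squared error "often leads to undesirable results" can be made concrete with the classic double-penalty effect: a forecast that places a feature one cell off is penalised both where the feature is missed and where it falsely appears. A small sketch with synthetic values (not actual forecast data):

```python
# Why mean squared error can mislead for weather fields: a field shifted
# by one grid cell can score worse than a flat field that weakens the
# feature but leaves it in place. All values below are synthetic.

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

truth    = [0, 0, 5, 5, 0, 0]   # a narrow band of rain
shifted  = [0, 0, 0, 5, 5, 0]   # same band, one cell to the right
weakened = [0, 0, 2, 2, 0, 0]   # weaker band, but in the right place

# The shifted field is penalised twice: once for missing the band,
# once for a "false alarm" next to it.
print(mse(truth, shifted))   # ≈ 8.33
print(mse(truth, weakened))  # 3.0
```

By MSE the spatially accurate but displaced forecast looks worse than the one that smears the feature away, which is exactly the behaviour that motivates alternative similarity measures.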
The IAEA has taken on the obligation to organize the continued availability of literature in the field of nuclear science and technology for peaceful applications. In the International Nuclear Information System (INIS), millions of scientific citations and the full texts of hundreds of thousands of pieces of non-conventional literature (NCL) have been collected worldwide and assembled into the INIS database of citations and the associated collection of NCL full texts. The next step in the IAEA's endeavour is to secure continued access to the scientific and technical literature in the nuclear field which is now available on the Internet, both for its staff and for Member States. The IAEA is currently conducting pilot projects under the heading NuArch that could eventually become the seed of a comprehensive archive of electronic documents in the nuclear field. A pilot project was started in the IAEA for the period 2004-2005 and continues for the period 2006-2007. This publication provides information and examples based upon experience in a variety of Member States. It presents technical aspects of web harvesting in the context of knowledge preservation in the nuclear field, and surveys contemporary activities in the domains of web harvesting, document archiving and Internet access technology in order to give a contemporary technology overview. Several aspects of possible web harvesting methodologies are presented in some detail in this document, which can also serve as a basis for establishing future co-operation
Computer Security Team
Publish or perish. Given the large variety of information which needs to be published, you have the freedom at CERN to deploy your own web-server and put your data online on the Internet. Web content management systems like Joomla! or WordPress together with dedicated add-ons and modules make it easy to quickly create a posh look-and-feel. But hold on. With this freedom also comes responsibility! Your responsibility does not stop once you have been granted Internet connectivity. No: It falls to you to ensure that your web server is continually secured. Only information which is meant to be public should be put online. Proper access protections must be put in place to secure other data, preferably using the CERN Single Sign-On portal and definitely using the HTTPS (secure HTTP) protocol when transmitting sensitive information like passwords. “Securing” also implies that the operating system as well as the content management system must be updated regularly. If ...
The final edition (Nos 51-52/2009 and 1-2/2010) of the last Weekly Bulletin of the year will be published on Friday 11 December and will cover events at CERN from 14 December 2009 to 8 January 2010. Announcements for publication in this issue should reach the Publication Section (Communications group) or the Staff Association, as appropriate, by noon on Tuesday 8 December. Bulletin publication 2010 The table below lists the 2010 publication dates for the paper version of the Bulletin and the corresponding deadlines for the submission of announcements. Please note that all announcements must be submitted by 12.00 midday on Tuesdays at the latest. Bulletin No. Week number Submission of announcements (before 12.00 midday) Bulletin Web version Bulletin Paper version 2-3 Tuesday 5 January Friday 8 and 15 January Wednesday 13 J...
Elleby, Anita; Ingwersen, Peter
The paper presents comparative analyses of two publication point systems, the Norwegian one and the in-house system from the interdisciplinary Danish Institute for International Studies (DIIS), used as a case in the study for publications published in 2006, and compares central citation-based indicators … with novel publication point indicators (PPIs) that are formalized and exemplified. Two diachronic citation windows are applied: 2006-07 and 2006-08. Web of Science (WoS) as well as Google Scholar (GS) are applied to observe the citation delay and citedness for the different document types published by DIIS … for all document types. Statistically significant correlations were only found between WoS and GS and between the two publication point systems, respectively. The study demonstrates how the nCPPI can be applied to institutions as an evaluation tool supplementary to JCI in various combinations …
Alcohol continues to be a major contributor to morbidity and mortality globally. Despite scientific advances, problems related to alcohol use continue to pose a major challenge to medicine and public health. The Internet offers a new mode of providing health care interventions. Web-based interventions (WBIs) provide the health ...
Morgan, Joanne Cardin
Leadership profile pages on organizational websites have become staged opportunities for impression management. This research uses content analysis to examine the strategies of assertive impression management used to construct the leadership Web presence of the 70 presidents of national public universities, as identified in the "US News and…
Communication: Journalism Education Today, 2001
Discusses the newly-released Adobe Photoshop 6, and its use in student publications. Notes its refined text-handling capabilities, a more user-friendly interface, integrated vector functions, easier preparation of Web images, and new and more powerful layer functions. (SR)
Duarte Torres, Sergio; Weber, Ingmar
The Internet has become an important part of the daily life of children as a source of information and leisure activities. Nonetheless, given that most of the content available on the web is aimed at the general public, children are constantly exposed to inappropriate content, either because the
Tian, Hao; Lin, Jin-Mann S; Reeves, William C
To evaluate the web structure of two web-based continuing education courses, identify problems and assess the effects of web site modifications. Markov chain models were built from 2008 web usage data to evaluate the courses' web structure and navigation patterns. The web site was then modified to resolve identified design issues and the improvement in user activity over the subsequent 12 months was quantitatively evaluated. Web navigation paths were collected between 2008 and 2010. The probability of navigating from one web page to another was analyzed. The continuing education courses' sequential structure design was clearly reflected in the resulting actual web usage models, and none of the skip transitions provided was heavily used. The web navigation patterns of the two different continuing education courses were similar. Two possible design flaws were identified and fixed in only one of the two courses. Over the following 12 months, the drop-out rate in the modified course significantly decreased from 41% to 35%, but remained unchanged in the unmodified course. The web improvement effects were further verified via a second-order Markov chain model. The results imply that differences in web content have less impact than web structure design on how learners navigate through continuing education courses. Evaluation of user navigation can help identify web design flaws and guide modifications. This study showed that Markov chain models provide a valuable tool to evaluate web-based education courses. Both the results and techniques in this study would be very useful for public health education and research specialists.
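The core of the method above is estimating, from recorded navigation paths, the probability of moving from one page to the next. A minimal sketch with invented paths and page names (not the study's data):

```python
# First-order Markov chain estimated from web navigation paths:
# count page-to-page transitions, then normalize each row so that
# transitions[src][dst] approximates P(next page = dst | current = src).
# The paths and page names below are synthetic.
from collections import Counter, defaultdict

paths = [
    ["intro", "module1", "module2", "quiz"],
    ["intro", "module1", "quiz"],            # a "skip" transition
    ["intro", "module1", "module2", "exit"],
]

counts = defaultdict(Counter)
for path in paths:
    for src, dst in zip(path, path[1:]):
        counts[src][dst] += 1

transitions = {
    src: {dst: n / sum(dsts.values()) for dst, n in dsts.items()}
    for src, dsts in counts.items()
}

print(transitions["module1"])  # module2 ~2/3, quiz ~1/3
```

A second-order model, as used in the study's verification step, would condition on the previous two pages instead of one, i.e. key the counts by the pair `(prev, current)`.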
Paulsworth, Ashley [Sunvestment Group, Frederick, MD (United States); Kurtz, Jim [Sunvestment Group, Frederick, MD (United States); Brun de Pontet, Stephanie [Sunvestment Group, Frederick, MD (United States)
Sunvestment Energy Group (previously called Sunvestment Group) was established to create a web application that brings together site hosts, those who will obtain the energy from the solar array, with project developers and funders, including affinity investors. Sunvestment Energy Group (SEG) uses a community-based model that engages with investors who have some affinity with the site host organization. In addition to a financial return, these investors receive non-financial value from their investments and are therefore willing to offer lower cost capital. This enables the site host to enjoy more savings from solar through these less expensive Community Power Purchase Agreements (CPPAs). The purpose of this award was to develop an online platform to bring site hosts and investors together virtually.
Ukraine Virtual Observatory (UkrVO) has been a member of the International Virtual Observatory Alliance (IVOA) since 2011. The virtual observatory (VO) is not a magic solution to all problems of data storing and processing, but it provides certain standards for building the infrastructure of an astronomical data center. The astronomical databases help data mining and offer users easy access to observation metadata, images within the celestial sphere and results of image processing. The astronomical web services (AWS) of UkrVO give users handy tools for data selection from large astronomical catalogues for a relatively small region of interest in the sky. Examples of AWS usage are shown.
Meroño-Peñuela, A.; Ashkpour, A.
The Semantic Web is an extension of the Web through standards by the World Wide Web Consortium (W3C). These standards promote common data formats and exchange protocols on the Web, most fundamentally the Resource Description Framework (RDF). Its ultimate goal is to make the Web a suitable data
Today, the study of the World Wide Web is one of the most challenging subjects. In this work we consider the Web from a probabilistic point of view. We analyze the relations between various characteristics of the Web. In particular, we are interested in the Web properties that affect the Web page
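One standard probabilistic view of the Web, in the spirit of the abstract above, is the "random surfer" model underlying PageRank: a walker follows outgoing links at random and occasionally teleports to a uniformly chosen page. A self-contained sketch on an invented three-page link graph (the graph and damping value are illustrative assumptions, not taken from this work):

```python
# Random-surfer (PageRank-style) model on a tiny, invented link graph.
# rank[p] converges to the stationary probability of the surfer being
# at page p, with damping factor 0.85 for following links vs. teleporting.

links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85
pages = list(links)
rank = {p: 1 / len(pages) for p in pages}

# Power iteration: repeatedly redistribute rank along outgoing links.
for _ in range(50):
    new = {p: (1 - damping) / len(pages) for p in pages}
    for src, outs in links.items():
        for dst in outs:
            new[dst] += damping * rank[src] / len(outs)
    rank = new

print(max(rank, key=rank.get))  # "C": it collects links from both A and B
```

Characteristics of the Web graph, such as its in-degree distribution, directly shape the resulting stationary distribution, which is one way such structural properties "affect" individual pages.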
Vdovják, R.; Frasincar, F.; Houben, G.J.P.M.; Barna, P.
The success of the World Wide Web has caused the concept of information system to change. Web Information Systems (WIS) borrow from the Web its paradigm and technologies in order to retrieve information from sources on the Web, and to present the information in terms of a Web or hypermedia