WorldWideScience

Sample records for survey methodology page

  1. A Survey on PageRank Computing

    OpenAIRE

    Berkhin, Pavel

    2005-01-01

    This survey reviews the research related to PageRank computing. Components of a PageRank vector serve as authority weights for web pages independent of their textual content, solely based on the hyperlink structure of the web. PageRank is typically used as a web search ranking component. This defines the importance of the model and the data structures that underlie PageRank processing. Computing even a single PageRank is a difficult computational task. Computing many PageRanks is a much mor...
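
    As a reminder of what "computing a PageRank vector" from the hyperlink structure involves, here is a minimal power-iteration sketch; the damping factor (0.85), the toy link graph and the convergence tolerance are illustrative assumptions and are not taken from the survey.

        # Minimal PageRank power iteration on a toy link graph.
        # The graph, damping factor and tolerance are illustrative assumptions;
        # every page must appear as a key of the dictionary.

        def pagerank(links, damping=0.85, tol=1e-10, max_iter=1000):
            """links: dict mapping page -> list of pages it links to."""
            pages = sorted(links)
            n = len(pages)
            rank = {p: 1.0 / n for p in pages}
            for _ in range(max_iter):
                new_rank = {p: (1.0 - damping) / n for p in pages}
                for p in pages:
                    out = links[p]
                    if out:                      # distribute rank along out-links
                        share = damping * rank[p] / len(out)
                        for q in out:
                            new_rank[q] += share
                    else:                        # dangling node: spread uniformly
                        for q in pages:
                            new_rank[q] += damping * rank[p] / n
                if sum(abs(new_rank[p] - rank[p]) for p in pages) < tol:
                    return new_rank
                rank = new_rank
            return rank

        if __name__ == "__main__":
            toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
            print(pagerank(toy_web))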

  2. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Vasja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  3. Methodology for performing surveys for fixed contamination

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1994-10-01

    This report describes a methodology for performing instrument surveys for fixed contamination that can be used to support the release of material from radiological areas, including release to controlled areas and release from radiological control. The methodology, which is based on a fast scan survey and a series of statistical, fixed measurements, meets the requirements of the U.S. Department of Energy Radiological Control Manual (RadCon Manual) (DOE 1994) and DOE Order 5400.5 (DOE 1990) for surveys for fixed contamination and requires less time than a conventional scan survey. The confidence interval associated with the new methodology conforms to the draft national standard for surveys. The methodology that is presented applies only to surveys for fixed contamination. Surveys for removable contamination are not discussed, and the new methodology does not affect surveys for removable contamination.

  4. A quality evaluation methodology of health web-pages for non-professionals.

    Science.gov (United States)

    Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro

    2004-06-01

    This study proposes an evaluation methodology for determining the quality of healthcare web sites that disseminate medical information to non-professionals. Three (macro) factors are considered for the quality evaluation: medical contents, accountability of the authors, and usability of the web site. Starting from two results in the literature, the problem of whether or not to introduce a weighting function was investigated. The methodology was validated on a specialized information content, i.e., sore throats, a topic of large interest to the target users. The World Wide Web was accessed using a meta-search system merging several search engines, and a statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub-factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions derived from the literature allowed us to verify the suitability of the method. The proposed methodology offers a simple approach which can quickly award an overall quality score to medical web sites oriented to non-professionals.
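
    To make the aggregation step that the study compares (non-weighted averages versus a weighting function) concrete, the sketch below scores a hypothetical page on the three macro factors; the item scores, weights and 0-1 scale are invented for illustration and are not the instrument published by the authors.

        # Hypothetical illustration of aggregating item scores into macro-factor
        # scores and an overall quality score, with and without weights.

        ITEM_SCORES = {                      # invented 0-1 scores for one web page
            "medical_contents": [0.8, 0.9, 0.7],
            "accountability":   [1.0, 0.6],
            "usability":        [0.9, 0.8, 0.8, 0.7],
        }
        MACRO_WEIGHTS = {"medical_contents": 0.5, "accountability": 0.3, "usability": 0.2}

        def macro_scores(items):
            return {k: sum(v) / len(v) for k, v in items.items()}

        def overall(macros, weights=None):
            if weights is None:                       # non-weighted average
                return sum(macros.values()) / len(macros)
            return sum(macros[k] * w for k, w in weights.items())  # weighted average

        m = macro_scores(ITEM_SCORES)
        print("macro factors:", m)
        print("unweighted overall:", round(overall(m), 3))
        print("weighted overall:  ", round(overall(m, MACRO_WEIGHTS), 3))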

  5. Effects of various methodologic strategies: survey response rates among Canadian physicians and physicians-in-training.

    Science.gov (United States)

    Grava-Gubins, Inese; Scott, Sarah

    2008-10-01

    To increase the overall 2007 response rate of the National Physician Survey (NPS) above the survey's 2004 response rate through the implementation of various methodologic strategies. Physicians were stratified to receive either a long version (12 pages) or a short version (6 pages) of the survey (38% and 62%, respectively). Mixed modes of contact were used (58% were contacted by e-mail and 42% by regular mail), with multiple modes of contact attempted for nonrespondents. The self-administered, confidential surveys were distributed in either English or French. Medical residents and students received e-mail surveys only and were offered a substantial monetary lottery incentive for completing their surveys. A professional communications firm assisted in marketing the survey and delivered advance notification of its impending distribution. Canada. A total of 62 441 practising physicians, 2627 second-year medical residents, and 9162 medical students in Canada. Of the practising physicians group, 60 811 participants were eligible and 19 239 replied, for an overall 2007 study response rate of 31.64% (compared with 35.85% in 2004). No difference in rate of response was found between the longer and shorter versions of the survey. If contacted by regular mail, the response rate was 34.1%; the e-mail group had a response rate of 29.9%. Medical student and resident response rates were 30.8% and 27.9%, respectively (compared with 31.2% and 35.6% in 2004). Despite shortening the questionnaires, contacting more physicians by e-mail, and enhancing marketing and follow-up, the 2007 NPS response rate for practising physicians did not surpass the 2004 NPS response rate. Offering a monetary lottery incentive to medical residents and students was also unsuccessful in increasing their response rates. The role of surveys in gathering information from physicians and physicians-in-training remains problematic. Researchers need to investigate alternative strategies for achieving higher rates of response.

  6. Customer satisfaction surveys: Methodological recommendations for financial service providers

    Directory of Open Access Journals (Sweden)

    Đorđić Marko

    2010-01-01

    This methodological article investigates practical challenges that emerge when conducting customer satisfaction surveys (CSS) for financial service providers such as banks, insurance or leasing companies, and so forth. It presents methodological recommendations with reference to: (a) survey design, (b) sampling, (c) survey method, (d) questionnaire design, and (e) data acquisition. The article explains how the use of a two-stage survey design, the SRS method, large samples, and rigorous fieldwork preparation can enhance the overall quality of CSS in financial services. The proposed methodological recommendations apply primarily to primary quantitative marketing research in retail financial services; however, most of them can also be applied successfully when conducting primary quantitative marketing research in corporate financial services.
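
    The recommendation to combine simple random sampling (SRS) with large samples can be illustrated with the standard sample-size formula for estimating a proportion; the confidence level, margin of error, expected satisfaction share and customer-base size below are illustrative assumptions, not figures from the article.

        # Sample size for estimating a proportion under simple random sampling (SRS):
        # n = z^2 * p * (1 - p) / e^2, optionally corrected for a finite population.
        import math

        def srs_sample_size(p=0.5, margin=0.03, z=1.96, population=None):
            n = (z ** 2) * p * (1 - p) / (margin ** 2)
            if population:                      # finite population correction
                n = n / (1 + (n - 1) / population)
            return math.ceil(n)

        # Illustrative: bank with 200 000 retail customers, +/-3% margin, 95% confidence.
        print(srs_sample_size(p=0.5, margin=0.03, population=200_000))  # -> 1062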

  7. SNE's methodological basis - web-based software in entrepreneurial surveys

    DEFF Research Database (Denmark)

    Madsen, Henning

    This overhead-based paper gives an introduction to the research methodology applied in the surveys carried out in the SNE-project.

  8. Methodological Issues in Survey Research: A Historical Review

    NARCIS (Netherlands)

    de Heer, W.; de Leeuw, E.D.; van der Zouwen, J.

    1999-01-01

    In this paper, we present a historical overview of social surveys and describe the historical development of scientific survey methodology and survey statistics. The origins of survey research can be traced back to the early 19th century and the first scientific survey was conducted in England in

  9. SurveyWiz and factorWiz: JavaScript Web pages that make HTML forms for research on the Internet.

    Science.gov (United States)

    Birnbaum, M H

    2000-05-01

    SurveyWiz and factorWiz are Web pages that act as wizards to create HTML forms that enable one to collect data via the Web. SurveyWiz allows the user to enter survey questions or personality test items with a mixture of text boxes and scales of radio buttons. One can add demographic questions of age, sex, education, and nationality with the push of a button. FactorWiz creates the HTML for within-subjects, two-factor designs as large as 9 x 9, or higher order factorial designs up to 81 cells. The user enters levels of the row and column factors, which can be text, images, or other multimedia. FactorWiz generates the stimulus combinations, randomizes their order, and creates the page. In both programs HTML is displayed in a window, and the user copies it to a text editor to save it. When uploaded to a Web server and supported by a CGI script, the created Web pages allow data to be collected, coded, and saved on the server. These programs are intended to assist researchers and students in quickly creating studies that can be administered via the Web.
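
    The record describes wizards that emit HTML forms with radio-button rating scales. The sketch below illustrates the same idea generically in Python (SurveyWiz itself is written in JavaScript, and its actual output format is not reproduced here); the question texts, scale labels and form action URL are placeholders.

        # Generic sketch: emit an HTML form with radio-button rating scales,
        # in the spirit of wizard tools such as SurveyWiz (not its actual output).

        QUESTIONS = ["The survey was easy to complete.",
                     "The instructions were clear."]
        SCALE = ["1", "2", "3", "4", "5"]          # placeholder 5-point scale

        def radio_item(index, text, scale):
            buttons = "".join(
                f'<label><input type="radio" name="q{index}" value="{v}"> {v}</label> '
                for v in scale)
            return f"<p>{text}<br>{buttons}</p>"

        def build_form(questions, scale, action="/cgi-bin/save.cgi"):  # placeholder URL
            items = "\n".join(radio_item(i, q, scale) for i, q in enumerate(questions, 1))
            return (f'<form method="post" action="{action}">\n{items}\n'
                    '<input type="submit" value="Submit">\n</form>')

        print(build_form(QUESTIONS, SCALE))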

  10. Training Activity Summary Page (TASP) Campus

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Training Activity Summary Page (formerly the Training Exit Survey Cover Page) dataset contains data about each training event. This dataset includes information...

  11. Heuristic evaluation of paper-based Web pages: a simplified inspection usability methodology.

    Science.gov (United States)

    Allen, Mureen; Currie, Leanne M; Bakken, Suzanne; Patel, Vimla L; Cimino, James J

    2006-08-01

    Online medical information, when presented to clinicians, must be well-organized and intuitive to use, so that the clinicians can conduct their daily work efficiently and without error. It is essential to actively seek to produce good user interfaces that are acceptable to the user. This paper describes the methodology used to develop a simplified heuristic evaluation (HE) suitable for the evaluation of screen shots of Web pages, the development of an HE instrument used to conduct the evaluation, and the results of the evaluation of the aforementioned screen shots. In addition, this paper presents examples of the process of categorizing problems identified by the HE and the technological solutions identified to resolve these problems. Four usability experts reviewed 18 paper-based screen shots and made a total of 108 comments. Each expert completed the task in about an hour. We were able to implement solutions to approximately 70% of the violations. Our study found that a heuristic evaluation using paper-based screen shots of a user interface was expeditious, inexpensive, and straightforward to implement.

  12. Challenges in dental statistics: survey methodology topics

    OpenAIRE

    Pizzo, Giuseppe; Milani, Silvano; Spada, Elena; Ottolenghi, Livia

    2013-01-01

    This paper gathers some contributions concerning survey methodology in dental research, as discussed during the first Workshop of the SISMEC STATDENT working group on statistical methods and applications in dentistry, held in Ancona on the 28th September 2011. The first contribution deals with the European Global Oral Health Indicators Development (EGOHID) Project which proposed a comprehensive and standardized system of epidemiological tools (questionnaires and clinical forms) for national da...

  13. TESTS AND METHODOLOGIES FOR THE SURVEY OF NARROW SPACES

    Directory of Open Access Journals (Sweden)

    L. Perfetti

    2017-02-01

    The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today’s era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in the field of view if compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.

  14. Training Activity Summary Page (TASP) State and Tribe

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Training Activity Summary Page (formerly the Training Exit Survey Cover Page) dataset contains data about each training event. This dataset includes information...

  15. Methodological design of the National Health and Nutrition Survey 2016

    OpenAIRE

    Martín Romero-Martínez; Teresa Shamah-Levy; Lucia Cuevas-Nasu; Ignacio Méndez Gómez-Humarán; Elsa Berenice Gaona-Pineda; Luz María Gómez-Acosta; Juan Ángel Rivera-Dommarco; Mauricio Hernández-Ávila

    2017-01-01

    Objective. Describe the design methodology of the halfway health and nutrition national survey (Ensanut-MC) 2016. Materials and methods. The Ensanut-MC is a national probabilistic survey whose objective population are the in­habitants of private households in Mexico. The sample size was determined to make inferences on the urban and rural areas in four regions. Describes main design elements: target population, topics of study, sampling procedure, measurement procedure and logistics organizat...

  16. Cochrane Rehabilitation Methodology Committee: an international survey of priorities for future work.

    Science.gov (United States)

    Levack, William M; Meyer, Thorsten; Negrini, Stefano; Malmivaara, Antti

    2017-10-01

    Cochrane Rehabilitation aims to improve the application of evidence-based practice in rehabilitation. It also aims to support Cochrane in the production of reliable, clinically meaningful syntheses of evidence related to the practice of rehabilitation, while accommodating the many methodological challenges facing the field. To this end, Cochrane Rehabilitation established a Methodology Committee to examine, explore and find solutions for the methodological challenges related to evidence synthesis and knowledge translation in rehabilitation. We conducted an international online survey via Cochrane Rehabilitation networks to canvass opinions regarding the future work priorities for this committee and to seek information on people's current capabilities to assist with this work. The survey findings indicated strongest interest in work on how reviewers have interpreted and applied Cochrane methods in reviews on rehabilitation topics in the past, and on gathering a collection of existing publications on review methods for undertaking systematic reviews relevant to rehabilitation. Many people are already interested in contributing to the work of the Methodology Committee and there is a large amount of expertise for this work in the extended Cochrane Rehabilitation network already.

  17. Methodology of the National School-based Health Survey in Malaysia, 2012.

    Science.gov (United States)

    Yusoff, Fadhli; Saari, Riyanti; Naidu, Balkish M; Ahmad, Noor Ani; Omar, Azahadi; Aris, Tahir

    2014-09-01

    The National School-Based Health Survey 2012 was a nationwide school health survey of students in Standard 4 to Form 5 (10-17 years of age), who were schooling in government schools in Malaysia during the period of data collection. The survey comprised 3 subsurveys: the Global School Health Survey (GSHS), the Mental Health Survey, and the National School-Based Nutrition Survey. The aim of the survey was to provide data on the health status of adolescents in Malaysia toward strengthening the adolescent health program in the country. The design of the survey was created to fulfill the requirements of the 3 subsurveys. A 2-stage stratified sampling method was adopted in the sampling. The methods for data collection were via questionnaire and physical examination. The National School-Based Health Survey 2012 adopted an appropriate methodology for a school-based survey to ensure valid and reliable findings. © 2014 APJPH.

  18. A survey of dynamic methodologies for probabilistic safety assessment of nuclear power plants

    International Nuclear Information System (INIS)

    Aldemir, Tunc

    2013-01-01

    Highlights: Dynamic methodologies for probabilistic safety assessment (PSA) are surveyed; these methodologies overcome the limitations of the traditional approach to PSA; they are suitable for PSA using a best estimate plus uncertainty approach; they are highly computation intensive and produce a very large number of scenarios; the use of scenario clustering can assist the analysis of the results. Abstract: Dynamic methodologies for probabilistic safety assessment (PSA) are defined as those which use a time-dependent phenomenological model of system evolution along with its stochastic behavior to account for possible dependencies between failure events. Over the past 30 years, numerous concerns have been raised in the literature regarding the capability of the traditional static modeling approaches such as the event-tree/fault-tree methodology to adequately account for the impact of process/hardware/software/firmware/human interactions on the stochastic system behavior. A survey of the types of dynamic PSA methodologies proposed to date is presented, as well as a brief summary of an example application for the PSA modeling of a digital feedwater control system of an operating pressurized water reactor. The use of dynamic methodologies for PSA modeling of passive components and phenomenological uncertainties is also discussed.

  19. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    Science.gov (United States)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in the field of view if compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.
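
    The paper derives a general GSD formulation valid for every optical projection; the toy sketch below only illustrates the underlying idea for two idealised projection models (rectilinear and equidistant fisheye), with an invented focal length, pixel pitch and object distance, and is not the formulation published by the authors.

        # Simplified sketch: per-pixel angular footprint and GSD at slant distance D
        # for two idealised projections (rectilinear r = f*tan(theta), equidistant
        # fisheye r = f*theta). All numerical values are illustrative assumptions.
        import math

        def angular_step(theta, focal_mm, pixel_mm, projection):
            """Angle (rad) spanned by one pixel at off-axis angle theta."""
            if projection == "rectilinear":      # dr/dtheta = f / cos^2(theta)
                return pixel_mm * math.cos(theta) ** 2 / focal_mm
            if projection == "equidistant":      # dr/dtheta = f (constant)
                return pixel_mm / focal_mm
            raise ValueError(projection)

        def gsd(distance_m, theta_deg, focal_mm, pixel_um, projection):
            theta = math.radians(theta_deg)
            step = angular_step(theta, focal_mm, pixel_um / 1000.0, projection)
            return distance_m * step             # metres per pixel across the ray

        # Illustrative numbers only: 8 mm lens, 4.5 um pixels, 0.7 m narrow staircase, 60 deg off-axis.
        for proj in ("rectilinear", "equidistant"):
            print(proj, round(gsd(0.7, 60, 8.0, 4.5, proj) * 1000, 3), "mm/px")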

  20. International conference on research methodology for roadside surveys of drinking-driving : alcohol countermeasures workshop

    Science.gov (United States)

    1974-09-01

    The basic purpose [of the conference] was to encourage more roadside surveys by furthering the research methodology and recommendations for conducting roadside surveys developed by a special group of the Organization for Economic Cooperation and Deve...

  1. Musculoskeletal impairment survey in Rwanda: Design of survey tool, survey methodology, and results of the pilot study (a cross sectional survey)

    Directory of Open Access Journals (Sweden)

    Simms Victoria

    2007-03-01

    Background: Musculoskeletal impairment (MSI) is an important cause of morbidity and mortality worldwide, especially in developing countries. Prevalence studies for MSI in the developing world have used varying methodologies and are seldom directly comparable. This study aimed to develop a new tool to screen for and diagnose MSI and to pilot test the methodology for a national survey in Rwanda. Methods: A 7-question screening tool to identify cases of MSI was developed through literature review and discussions with healthcare professionals. To validate the tool, trained rehabilitation technicians screened 93 previously identified gold standard 'cases' and 86 'non-cases'. Sensitivity, specificity and positive predictive value were calculated. A standardised examination protocol was developed to determine the aetiology and diagnosis of MSI for those who fail the screening test. For the national survey in Rwanda, multistage cluster random sampling, with probability proportional to size procedures, will be used for selection of a cross-sectional, nationally representative sample of the population. Households to be surveyed will be chosen through compact segment sampling and all individuals within chosen households will be screened. A pilot survey of 680 individuals was conducted using the protocol. Results: The screening tool demonstrated 99% sensitivity and 97% specificity for MSI, and a positive predictive value of 98%. During the pilot study 468 out of 680 eligible subjects (69%) were screened. 45 diagnoses were identified in 38 persons who were cases of MSI. The subjects were grouped into categories based on diagnostic subgroups of congenital (1), traumatic (17), infective (2), neurological (6) and other acquired (19). They were also separated into mild (42.1%), moderate (42.1%) and severe (15.8%) cases, using an operational definition derived from the World Health Organisation's International Classification of Functioning, Disability and Health
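
    The validation figures quoted above follow from a standard 2x2 screening table of the 93 cases and 86 non-cases. The sketch below shows the arithmetic; the individual cell counts are hypothetical (the abstract only reports the resulting percentages), so the printed values are close to, but not exactly, the published 99% / 97% / 98%.

        # Screening-test validation metrics from a 2x2 table.
        # The cell counts below are hypothetical, for illustration only.

        def screening_metrics(tp, fn, fp, tn):
            sensitivity = tp / (tp + fn)          # cases correctly screened positive
            specificity = tn / (tn + fp)          # non-cases correctly screened negative
            ppv = tp / (tp + fp)                  # probability a positive screen is a true case
            return sensitivity, specificity, ppv

        sens, spec, ppv = screening_metrics(tp=92, fn=1, fp=3, tn=83)
        print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%}")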

  2. 3-D SURVEY APPLIED TO INDUSTRIAL ARCHAEOLOGY BY TLS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    M. Monego

    2017-05-01

    This work describes the three-dimensional survey of the “Ex Stazione Frigorifera Specializzata”: initially used for agricultural storage, over the years the building was given different uses until it fell into complete neglect. Its historical relevance and the architectural heritage it represents have led to the start of a recent renovation and functional restoration project. In this regard, a global 3-D survey was necessary, based on the application and integration of different geomatic methodologies (mainly terrestrial laser scanning, classical topography, and GNSS). The acquisition of point clouds was performed using different laser scanners, with time-of-flight (TOF) and phase-shift technologies for the distance measurements. The topographic reference network, needed to align the scans in the same system, was measured with a total station. For the complete survey of the building, 122 scans were acquired and 346 targets were measured from 79 vertices of the reference network. Moreover, 3 vertices were measured with GNSS methodology in order to georeference the network. For the detailed survey of the machine room, 14 scans with 23 targets were acquired. The global 3-D model of the building has an alignment error of less than one centimetre (for the machine room the alignment error is not greater than 6 mm) and was used to extract products such as longitudinal and transversal sections, plans, architectural perspectives, and virtual scans. The processed data provide a complete spatial knowledge of the building, supplying basic information for the restoration project, structural analysis, and the valorization of industrial and architectural heritage.

  3. Challenges in dental statistics: survey methodology topics

    Directory of Open Access Journals (Sweden)

    Giuseppe Pizzo

    2013-12-01

    This paper gathers some contributions concerning survey methodology in dental research, as discussed during the first Workshop of the SISMEC STATDENT working group on statistical methods and applications in dentistry, held in Ancona on the 28th September 2011. The first contribution deals with the European Global Oral Health Indicators Development (EGOHID) Project, which proposed a comprehensive and standardized system of epidemiological tools (questionnaires and clinical forms) for national data collection on oral health in Europe. The second contribution regards the design and conduct of trials to evaluate the clinical efficacy and safety of toothbrushes and mouthrinses. Finally, a flexible and effective tool used to trace dental age reference charts tailored to Italian children is presented.

  4. A review of methodology and analysis of nutrition and mortality surveys conducted in humanitarian emergencies from October 1993 to April 2004

    Directory of Open Access Journals (Sweden)

    Spiegel Paul B

    2007-06-01

    Background: Malnutrition prevalence and mortality rates are increasingly used as essential indicators to assess the severity of a crisis, to follow trends, and to guide decision-making, including allocation of funds. Although consensus has slowly developed on the methodology to accurately measure these indicators, errors in the application of the survey methodology and analysis have persisted. The aim of this study was to identify common methodological weaknesses in nutrition and mortality surveys and to provide practical recommendations for improvement. Methods: Nutrition (N = 368) and crude mortality rate (CMR; N = 158) surveys conducted by 33 non-governmental organisations and United Nations agencies in 17 countries from October 1993 to April 2004 were analysed for sampling validity, precision, quality of measurement and calculation according to several criteria. Results: One hundred and thirty (35.3%) nutrition surveys and 5 (3.2%) CMR surveys met the criteria for quality. Quality of surveys varied significantly depending on the agency. The proportion of nutrition surveys that met criteria for quality rose significantly from 1993 to 2004; there was no improvement for mortality surveys during this period. Conclusion: Significant errors and imprecision in the methodology and reporting of nutrition and mortality surveys were identified. While there was an improvement in the quality of nutrition surveys over the years, the quality of mortality surveys remained poor. Recent initiatives aimed at standardising nutrition and mortality survey quality should be strengthened. There are still a number of methodological issues in nutrition and mortality surveys in humanitarian emergencies that need further study.

  5. Digital marketing in travel industry. Case: Hotel landing page optimization

    OpenAIRE

    Bitkulova, Renata

    2017-01-01

    Landing page optimization is the implementation of the principles of digital service design to improve a website's user experience. Well-done landing page optimization can have a significant positive effect on the usability and profitability of the website. The objective of the study was to optimize the Russian language version of the Vuokatti landing page in order to increase conversion, defined as the number of clicks on the accommodation search button. A literature survey was made to determine ...

  6. Design of a Web Page as a complement of educative innovation through MOODLE

    Science.gov (United States)

    Mendiola Ubillos, M. A.; Aguado Cortijo, Pedro L.

    2010-05-01

    In the context of using information technology to impart knowledge, and of establishing the MOODLE system as a support and complementary tool to on-site educational methodology (b-learning), a Web page was designed for Agronomic and Food Industry Crops (Plantas de interés Agroalimentario) during the 2006-07 course. This web page was inserted in the Technical University of Madrid (Universidad Politécnica de Madrid) computer system to give the students a first contact with the contents of this subject. The page shows the objectives and methodology, personal work planning, the subject programme and the activities. At another web site, the evaluation criteria and recommended bibliography are located. The objective of this web page has been to make the information needed in the learning process more transparent and accessible, presenting it in a more attractive frame. The page has been updated and modified in each academic course offered since its first implementation, and in some cases new specific links have been added to increase its usefulness. At the end of each course a questionnaire is given to the students who take this subject, asking which elements they would like to modify, delete or add to the web page. In this way the direct users give their point of view and help to improve the web page each course.

  7. Improving Interdisciplinary Provider Communication Through a Unified Paging System.

    Science.gov (United States)

    Heidemann, Lauren; Petrilli, Christopher; Gupta, Ashwin; Campbell, Ian; Thompson, Maureen; Cinti, Sandro; Stewart, David A

    2016-06-01

    Interdisciplinary communication at a Veterans Affairs (VA) academic teaching hospital is largely dependent on alphanumeric paging, which has limitations as a result of one-way communication and lack of reliable physician identification. Adverse patient outcomes related to difficulty contacting the correct consulting provider in a timely manner have been reported. House officers were surveyed on the level of satisfaction with the current VA communication system and the rate of perceived adverse patient outcomes caused by potential delays within this system. Respondents were then asked to identify the ideal paging system. These results were used to develop and deploy a new Web site. A postimplementation survey was repeated 1 year later. This study was conducted as a quality improvement project. House officer satisfaction with the preintervention system was 3%. The majority used more than four modalities to identify consultants, with 59% stating that word of mouth was a typical source. The preferred mode of paging was the university hospital paging system, a Web-based program that is used at the partnering academic institution. Following integration of VA consulting services within the university hospital paging system, the level of satisfaction improved to 87%. Significant decreases were seen in perceived adverse patient outcomes (from 16% to 2%), delays in patient care (from 90% to 16%), and extended hospitalizations (from 46% to 4%). Our study demonstrates significant improvement in physician satisfaction with a newly implemented paging system that was associated with a decreased perceived number of adverse patient events and delays in care.

  8. From Study to Work: Methodological Challenges of a Graduate Destination Survey in the Western Cape, South Africa

    Science.gov (United States)

    du Toit, Jacques; Kraak, Andre; Favish, Judy; Fletcher, Lizelle

    2014-01-01

    Current literature proposes several strategies for improving response rates to student evaluation surveys. Graduate destination surveys pose the difficulty of tracing graduates years later when their contact details may have changed. This article discusses the methodology of one such survey to maximise response rates. Compiling a sample frame…

  9. Web-based Surveys: Changing the Survey Process

    OpenAIRE

    Gunn, Holly

    2002-01-01

    Web-based surveys are having a profound influence on the survey process. Unlike other types of surveys, Web page design skills and computer programming expertise play a significant role in the design of Web-based surveys. Survey respondents face new and different challenges in completing a Web-based survey. This paper examines the different types of Web-based surveys, the advantages and challenges of using Web-based surveys, the design of Web-based surveys, and the issues of validity, error, ...

  10. [Methodological design of the National Health and Nutrition Survey 2016].

    Science.gov (United States)

    Romero-Martínez, Martín; Shamah-Levy, Teresa; Cuevas-Nasu, Lucía; Gómez-Humarán, Ignacio Méndez; Gaona-Pineda, Elsa Berenice; Gómez-Acosta, Luz María; Rivera-Dommarco, Juan Ángel; Hernández-Ávila, Mauricio

    2017-01-01

    Describe the design methodology of the halfway health and nutrition national survey (Ensanut-MC) 2016. The Ensanut-MC is a national probabilistic survey whose objective population are the inhabitants of private households in Mexico. The sample size was determined to make inferences on the urban and rural areas in four regions. The main design elements are described: target population, topics of study, sampling procedure, measurement procedure and logistics organization. A final sample of 9 479 completed household interviews was obtained, together with a sample of 16 591 individual interviews. The response rate for households was 77.9%, and the response rate for individuals was 91.9%. The Ensanut-MC probabilistic design allows valid statistical inferences about interest parameters for Mexico's public health and nutrition, specifically on overweight, obesity and diabetes mellitus. Updated information also supports the monitoring, updating and formulation of new policies and priority programs.

  11. Watershed Data Management (WDM) database for West Branch DuPage River streamflow simulation, DuPage County, Illinois, January 1, 2007, through September 30, 2013

    Science.gov (United States)

    Bera, Maitreyee

    2017-10-16

    The U.S. Geological Survey (USGS), in cooperation with the DuPage County Stormwater Management Department, maintains a database of hourly meteorological and hydrologic data for use in a near real-time streamflow simulation system. This system is used in the management and operation of reservoirs and other flood-control structures in the West Branch DuPage River watershed in DuPage County, Illinois. The majority of the precipitation data are collected from a tipping-bucket rain-gage network located in and near DuPage County. The other meteorological data (air temperature, dewpoint temperature, wind speed, and solar radiation) are collected at Argonne National Laboratory in Argonne, Ill. Potential evapotranspiration is computed from the meteorological data using the computer program LXPET (Lamoreux Potential Evapotranspiration). The hydrologic data (water-surface elevation [stage] and discharge) are collected at U.S. Geological Survey streamflow-gaging stations in and around DuPage County. These data are stored in a Watershed Data Management (WDM) database. This report describes a version of the WDM database that is quality-assured and quality-controlled annually to ensure datasets are complete and accurate. This database is named WBDR13.WDM. It contains data from January 1, 2007, through September 30, 2013. Each precipitation dataset may have time periods of inaccurate data. This report describes the methods used to estimate the data for the periods of missing, erroneous, or snowfall-affected data and thereby improve the accuracy of these data. The other meteorological datasets are described in detail in Over and others (2010), and the hydrologic datasets in the database are fully described in the online USGS annual water data reports for Illinois (U.S. Geological Survey, 2016) and, therefore, are described in less detail than the precipitation datasets in this report.

  12. Resource selection for an interdisciplinary field: a methodology.

    Science.gov (United States)

    Jacoby, Beth E; Murray, Jane; Alterman, Ina; Welbourne, Penny

    2002-10-01

    The Health Sciences and Human Services Library of the University of Maryland developed and implemented a methodology to evaluate print and digital resources for social work. Although this methodology was devised for the interdisciplinary field of social work, the authors believe it may lend itself to resource selection in other interdisciplinary fields. The methodology was developed in response to the results of two separate surveys conducted in late 1999, which indicated improvement was needed in the library's graduate-level social work collections. Library liaisons evaluated the print collection by identifying forty-five locally relevant Library of Congress subject headings and then using these subjects or synonymous terms to compare the library's titles to collections of peer institutions, publisher catalogs, and Amazon.com. The collection also was compared to social work association bibliographies, ISI Journal Citation Reports, and major social work citation databases. An approval plan for social work books was set up to assist in identifying newly published titles. The library acquired new print and digital social work resources as a result of the evaluation, thus improving both print and digital collections for its social work constituents. Visibility of digital resources was increased by cataloging individual titles in aggregated electronic journal packages and listing each title on the library Web page.

  13. Methodology of Global Adult Tobacco Survey (GATS), Malaysia, 2011.

    Science.gov (United States)

    Omar, Azahadi; Yusoff, Muhammad Fadhli Mohd; Hiong, Tee Guat; Aris, Tahir; Morton, Jeremy; Pujari, Sameer

    Malaysia participated in the second phase of the Global Adult Tobacco Survey (GATS) in 2011. GATS, a new component of the Global Tobacco Surveillance System, is a nationally representative household survey of adults 15 years old or above. The objectives of GATS Malaysia were to (i) systematically monitor tobacco use among adults and track key indicators of tobacco control and (ii) track the implementation of some of the Framework Convention on Tobacco Control (FCTC)-recommended demand-related policies. GATS Malaysia 2011 was a nationwide cross-sectional survey using multistage stratified sampling to select 5112 nationally representative households. One individual aged 15 years or older was randomly chosen from each selected household and interviewed using a handheld device. The GATS Core Questionnaire with optional questions was pre-tested and uploaded into handheld devices after repeated quality control processes. Data collectors were trained through a centralized training. Manuals and a picture book were prepared to aid in the training of data collectors and during data collection. Field-level data were aggregated on a daily basis and analysed twice a week. Quality controls were instituted to ensure collection of high quality data. Sample weighting and analysis were conducted with the assistance of researchers from the Centers for Disease Control and Prevention, Atlanta, USA. GATS Malaysia received a total response rate of 85.3% from 5112 adults surveyed. The majority of respondents were 25-44 years old and Malays. The robust methodology used in GATS Malaysia provides national estimates for tobacco use classified by socio-demographic characteristics and reliable data on various dimensions of tobacco control.
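
    The abstract notes that sample weighting was conducted with CDC assistance. As a generic reminder of how design weights are typically built for a multistage stratified household survey of this kind, a base weight is the inverse of the product of the stage-wise selection probabilities, adjusted for non-response; the sketch below is not the actual GATS Malaysia weighting specification, and all numbers in it are invented.

        # Generic design-weight sketch for a multistage stratified household survey:
        # base weight = 1 / (P(select PSU) * P(select household) * P(select adult)),
        # then adjusted for unit non-response. Not the actual GATS Malaysia weights.

        def base_weight(p_psu, p_household, n_adults_in_household):
            p_individual = 1.0 / n_adults_in_household   # one adult chosen at random
            return 1.0 / (p_psu * p_household * p_individual)

        def nonresponse_adjusted(weight, response_rate):
            return weight / response_rate

        # Illustrative respondent: PSU sampled with prob 0.02, household with prob 0.05,
        # 3 eligible adults, stratum response rate 85.3%.
        w = nonresponse_adjusted(base_weight(0.02, 0.05, 3), 0.853)
        print(round(w, 1))   # final weight: number of population adults represented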

  14. Methodological studies for long range environmental gamma rate survey in Brazil

    International Nuclear Information System (INIS)

    Souza, Elder M.; Wasserman, Maria Angelica V.; Rochedo, Elaine R. R.

    2009-01-01

    The objective of this work is to support the establishment of a methodology for gamma radiation surveys over large areas in order to estimate public exposure to natural background radiation in Brazil. In a first stage, two different sites close to large water bodies were chosen, Guanabara Bay, RJ, and the Amazon River close to Santarem, PA. Early results for over-water surveys were similar regardless of the type of water body. Dose rates over land are higher than those over water, due to the natural radioactivity in soil, pavements and other building materials. In this study the focus was on the variability of measurements performed in the same area and the variability for different types of area, including roads and urbanized environments. Measurements were performed over several areas, including roads and towns in Para, Bahia, Rio de Janeiro and Minas Gerais. Measurements were done by car and on boats, using an AT6101C Spectral Radiation Scanner. Differences were detected among areas, with roads generally presenting lower dose rates than highly urbanized areas. Also, for roads close to granite rocks and mountains, dose rates are higher than those at both coastal areas and inland lowlands. Large towns present large variability, with individual measurements close to average dose rates from anomalous uranium sites. The results will be used to derive a methodology for assessing background radiation exposure for the Brazilian population. It can be concluded that surveys are to be based on population distribution grids rather than on a simple area-based grid distribution, due to both the uneven population distribution and the variability in external dose rates throughout the Brazilian territory. (author)
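
    The conclusion that surveys should be based on population-distribution grids rather than simple area grids amounts to weighting the measured dose rates by the number of people in each grid cell. The sketch below contrasts the two averages; the grid-cell dose rates and populations are invented examples, not measurements from the study.

        # Area-based vs population-weighted average of external gamma dose rates.
        # Grid-cell values (dose rate in nGy/h, population) are invented examples.

        CELLS = [  # (dose_rate_nGy_per_h, population)
            (55.0, 1_200_000),   # large coastal city
            (140.0, 40_000),     # town near granite outcrops
            (70.0, 5_000),       # inland lowland
            (95.0, 300),         # sparsely populated highland road
        ]

        area_average = sum(d for d, _ in CELLS) / len(CELLS)
        pop_average = sum(d * p for d, p in CELLS) / sum(p for _, p in CELLS)

        print(f"simple area-grid average:    {area_average:.1f} nGy/h")
        print(f"population-weighted average: {pop_average:.1f} nGy/h")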

  15. Bibliographic survey on methodologies for development of health database of the population in case of cancer occurrences

    International Nuclear Information System (INIS)

    Cavinato, Christianne C.; Andrade, Delvonei A. de; Sabundjian, Gaiane; Diz, Maria Del Pilar E.

    2014-01-01

    The objective is to survey existing methodologies for the development of a public health database, focusing on the health (fatal and nonfatal cancer) of the population surrounding a nuclear facility, for the purpose of calculating its environmental cost. From the methodologies found for developing this type of database, a methodology will be derived and applied to the internal public of IPEN/CNEN-SP, Brazil, as a pre-test for the acquisition of the desired health information.

  16. U.S. Geological Survey Methodology Development for Ecological Carbon Assessment and Monitoring

    Science.gov (United States)

    Zhu, Zhi-Liang; Stackpoole, S.M.

    2009-01-01

    Ecological carbon sequestration refers to transfer and storage of atmospheric carbon in vegetation, soils, and aquatic environments to help offset the net increase from carbon emissions. Understanding capacities, associated opportunities, and risks of vegetated ecosystems to sequester carbon provides science information to support formulation of policies governing climate change mitigation, adaptation, and land-management strategies. Section 712 of the Energy Independence and Security Act (EISA) of 2007 mandates the Department of the Interior to develop a methodology and assess the capacity of our nation's ecosystems for ecological carbon sequestration and greenhouse gas (GHG) flux mitigation. The U.S. Geological Survey (USGS) LandCarbon Project is responding to the Department of Interior's request to develop a methodology that meets specific EISA requirements.

  17. Methodological design of the National Health and Nutrition Survey 2016

    Directory of Open Access Journals (Sweden)

    Martín Romero-Martínez

    2017-05-01

    Objective. Describe the design methodology of the halfway health and nutrition national survey (Ensanut-MC) 2016. Materials and methods. The Ensanut-MC is a national probabilistic survey whose objective population are the inhabitants of private households in Mexico. The sample size was determined to make inferences on the urban and rural areas in four regions. Describes main design elements: target population, topics of study, sampling procedure, measurement procedure and logistics organization. Results. A final sample of 9 479 completed household interviews, and a sample of 16 591 individual interviews. The response rate for households was 77.9%, and the response rate for individuals was 91.9%. Conclusions. The Ensanut-MC probabilistic design allows valid statistical inferences about interest parameters for Mexico's public health and nutrition, specifically on overweight, obesity and diabetes mellitus. Updated information also supports the monitoring, updating and formulation of new policies and priority programs.

  18. Synthesizing exoplanet demographics from radial velocity and microlensing surveys. I. Methodology

    International Nuclear Information System (INIS)

    Clanton, Christian; Gaudi, B. Scott

    2014-01-01

    Motivated by the order of magnitude difference in the frequency of giant planets orbiting M dwarfs inferred by microlensing and radial velocity (RV) surveys, we present a method for comparing the statistical constraints on exoplanet demographics inferred from these methods. We first derive the mapping from the observable parameters of a microlensing-detected planet to those of an analogous planet orbiting an RV-monitored star. Using this mapping, we predict the distribution of RV observables for the planet population inferred from microlensing surveys, taking care to adopt reasonable priors for, and properly marginalize over, the unknown physical parameters of microlensing-detected systems. Finally, we use simple estimates of the detection limits for a fiducial RV survey to predict the number and properties of analogs of the microlensing planet population such an RV survey should detect. We find that RV and microlensing surveys have some overlap, specifically for super-Jupiter mass planets (m_p ≳ 1 M_Jup) with periods between ∼3-10 yr. However, the steeply falling planetary mass function inferred from microlensing implies that, in this region of overlap, RV surveys should infer a much smaller frequency than the overall giant planet frequency (m_p ≳ 0.1 M_Jup) inferred by microlensing. Our analysis demonstrates that it is possible to statistically compare and synthesize data sets from multiple exoplanet detection techniques in order to infer exoplanet demographics over wider regions of parameter space than are accessible to individual methods. In a companion paper, we apply our methodology to several representative microlensing and RV surveys to derive the frequency of planets around M dwarfs with orbits of ≲ 30 yr.
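
    Part of the mapping described here is predicting the RV observables of a microlensing-detected planet; the dominant one is the velocity semi-amplitude K, which follows from the planet mass, period, stellar mass, inclination and eccentricity. The sketch below evaluates only the textbook expression for K with illustrative M-dwarf parameters, not the full statistical machinery of the paper.

        # Radial-velocity semi-amplitude K for a planet of mass m_p and period P
        # orbiting a star of mass M_star (textbook formula; illustrative values only):
        # K = (2*pi*G/P)^(1/3) * m_p*sin(i) / ((M_star + m_p)^(2/3) * sqrt(1 - e^2))
        import math

        G = 6.674e-11          # m^3 kg^-1 s^-2
        M_SUN = 1.989e30       # kg
        M_JUP = 1.898e27       # kg
        YEAR = 3.156e7         # s

        def rv_semi_amplitude(m_p_mjup, period_yr, m_star_msun, sin_i=1.0, ecc=0.0):
            m_p = m_p_mjup * M_JUP
            m_star = m_star_msun * M_SUN
            period = period_yr * YEAR
            return ((2.0 * math.pi * G / period) ** (1.0 / 3.0)
                    * m_p * sin_i
                    / ((m_star + m_p) ** (2.0 / 3.0) * math.sqrt(1.0 - ecc ** 2)))

        # Illustrative super-Jupiter around an M dwarf, in the region of overlap.
        print(round(rv_semi_amplitude(m_p_mjup=1.0, period_yr=5.0, m_star_msun=0.5), 1), "m/s")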

  19. Terrorism and Politics Predominate on the Front Pages of the Basque Press. Content and Area Analysis of the Front Pages of the Regional Newspapers

    Directory of Open Access Journals (Sweden)

    Dr. Jesús A. Pérez Dasilva

    2010-01-01

    This paper offers the results of research project 08/20 of the University of the Basque Country on the news published on the front pages of the Basque press during the years 1996, 2001 and 2006. The researchers analyse the front pages of the Basque press to determine if their content matches the demand and interests of their readers. The study shows which topics are most relevant for these newspapers. The research involved a detailed analysis of 2,448 front pages of the five main Basque newspapers, with a total of 19,156 news items. A specific methodology was developed for this work, enabling both a quantitative and qualitative analysis of the news stories to be made. The data shown in this paper are a summary of the more detailed results that emerged in the different fields of the research.

  20. Delirium diagnosis methodology used in research: a survey-based study.

    Science.gov (United States)

    Neufeld, Karin J; Nelliot, Archana; Inouye, Sharon K; Ely, E Wesley; Bienvenu, O Joseph; Lee, Hochang Benjamin; Needham, Dale M

    2014-12-01

    To describe methodology used to diagnose delirium in research studies evaluating delirium detection tools. The authors used a survey to address reference rater methodology for delirium diagnosis, including rater characteristics, sources of patient information, and diagnostic process, completed via web or telephone interview according to respondent preference. Participants were authors of 39 studies included in three recent systematic reviews of delirium detection instruments in hospitalized patients. Authors from 85% (N = 33) of the 39 eligible studies responded to the survey. The median number of raters per study was 2.5 (interquartile range: 2-3); 79% were physicians. The raters' median duration of clinical experience with delirium diagnosis was 7 years (interquartile range: 4-10), with 5% having no prior clinical experience. Inter-rater reliability was evaluated in 70% of studies. Cognitive tests and delirium detection tools were used in the delirium reference rating process in 61% (N = 21) and 45% (N = 15) of studies, respectively, with 33% (N = 11) using both and 27% (N = 9) using neither. When patients were too drowsy or declined to participate in delirium evaluation, 70% of studies (N = 23) used all available information for delirium diagnosis, whereas 15% excluded such patients. Significant variability exists in reference standard methods for delirium diagnosis in published research. Increasing standardization by documenting inter-rater reliability, using standardized cognitive and delirium detection tools, incorporating diagnostic expert consensus panels, and using all available information in patients declining or unable to participate with formal testing may help advance delirium research by increasing consistency of case detection and improving generalizability of research results. Copyright © 2014 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
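
    Inter-rater reliability, which the survey found was evaluated in only 70% of studies, is commonly summarized for two delirium raters with Cohen's kappa; a minimal computation from paired binary ratings is sketched below (the example ratings are invented, not study data).

        # Cohen's kappa for agreement between two raters on a binary outcome
        # (delirium present / absent). The paired ratings below are invented.

        def cohens_kappa(rater_a, rater_b):
            n = len(rater_a)
            categories = set(rater_a) | set(rater_b)
            observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            expected = sum(
                (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
            return (observed - expected) / (1.0 - expected)

        a = [1, 1, 0, 0, 1, 0, 0, 0, 1, 1]   # rater A: 1 = delirium, 0 = no delirium
        b = [1, 1, 0, 0, 1, 0, 0, 1, 1, 1]   # rater B
        print(round(cohens_kappa(a, b), 2))  # -> 0.8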

  1. Exploring the use of a Facebook page in anatomy education.

    Science.gov (United States)

    Jaffar, Akram Abood

    2014-01-01

    Facebook is the most popular social media site visited by university students on a daily basis. Consequently, Facebook is the logical place to start with for integrating social media technologies into education. This study explores how a faculty-administered Facebook Page can be used to supplement anatomy education beyond the traditional classroom. Observations were made on students' perceptions and effectiveness of using the Page, potential benefits and challenges of such use, and which Insights metrics best reflect user's engagement. The Human Anatomy Education Page was launched on Facebook and incorporated into anatomy resources for 157 medical students during two academic years. Students' use of Facebook and their perceptions of the Page were surveyed. Facebook's "Insights" tool was also used to evaluate Page performance during a period of 600 days. The majority of in-class students had a Facebook account which they adopted in education. Most students perceived Human Anatomy Education Page as effective in contributing to learning and favored "self-assessment" posts. The majority of students agreed that Facebook could be a suitable learning environment. The "Insights" tool revealed globally distributed fans with considerable Page interactions. The use of a faculty-administered Facebook Page provided a venue to enhance classroom teaching without intruding into students' social life. A wider educational use of Facebook should be adopted not only because students are embracing its use, but for its inherent potentials in boosting learning. The "Insights" metrics analyzed in this study might be helpful when establishing and evaluating the performance of education-oriented Facebook Pages. © 2013 American Association of Anatomists.

  2. PAGING IN COMMUNICATIONS

    DEFF Research Database (Denmark)

    2016-01-01

    A method and an apparatus are disclosed for managing paging in a communications system. The method may include, based on a received set of physical resources, determining, in a terminal apparatus, an original paging pattern defining potential time instants for paging, wherein the potential time instants for paging include a subset of a total amount of resources available at a network node for paging.

  3. Systematic investigation of gastrointestinal diseases in China (SILC): validation of survey methodology.

    Science.gov (United States)

    Yan, Xiaoyan; Wang, Rui; Zhao, Yanfang; Ma, Xiuqiang; Fang, Jiqian; Yan, Hong; Kang, Xiaoping; Yin, Ping; Hao, Yuantao; Li, Qiang; Dent, John; Sung, Joseph; Zou, Duowu; Johansson, Saga; Halling, Katarina; Liu, Wenbin; He, Jia

    2009-11-19

    Symptom-based surveys suggest that the prevalence of gastrointestinal diseases is lower in China than in Western countries. The aim of this study was to validate a methodology for the epidemiological investigation of gastrointestinal symptoms and endoscopic findings in China. A randomized, stratified, multi-stage sampling methodology was used to select 18,000 adults aged 18-80 years from Shanghai, Beijing, Xi'an, Wuhan and Guangzhou. Participants from Shanghai were invited to provide blood samples and undergo upper gastrointestinal endoscopy. All participants completed Chinese versions of the Reflux Disease Questionnaire (RDQ) and the modified Rome II questionnaire; 20% were also invited to complete the 36-item Short Form Health Survey (SF-36) and Epworth Sleepiness Scale (ESS). The psychometric properties of the questionnaires were evaluated statistically. The study was completed by 16,091 individuals (response rate: 89.4%), with 3219 (89.4% of those invited) completing the SF-36 and ESS. All 3153 participants in Shanghai provided blood samples and 1030 (32.7%) underwent endoscopy. Cronbach's alpha coefficients were 0.89, 0.89, 0.80 and 0.91, respectively, for the RDQ, modified Rome II questionnaire, ESS and SF-36, supporting internal consistency. Factor analysis supported construct validity of all questionnaire dimensions except SF-36 psychosocial dimensions. This population-based study has great potential to characterize the relationship between gastrointestinal symptoms and endoscopic findings in China.
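
    Internal consistency in the study is reported as Cronbach's alpha for each questionnaire, i.e. alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The small sketch below applies that formula to an invented response matrix, not to the SILC data.

        # Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total score)).
        # The 5-respondent, 4-item response matrix below is invented, not SILC data.

        def variance(xs):
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)   # sample variance

        def cronbach_alpha(responses):
            """responses: list of rows, one per respondent, one column per item."""
            k = len(responses[0])
            items = list(zip(*responses))                 # columns = items
            totals = [sum(row) for row in responses]      # total score per respondent
            return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

        data = [[4, 3, 4, 4],
                [2, 2, 3, 2],
                [5, 4, 4, 5],
                [3, 3, 2, 3],
                [4, 4, 5, 4]]
        print(round(cronbach_alpha(data), 2))   # -> 0.92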

  4. Evaluating the usability of web pages: a case study

    NARCIS (Netherlands)

    Lautenbach, M.A.E.; Schegget, I.E. ter; Schoute, A.E.; Witteman, C.L.M.

    1999-01-01

    An evaluation of the Utrecht University website was carried out with 240 students. New criteria were drawn from the literature and operationalized for the study. These criteria are surveyability and findability. Web pages can be said to satisfy a usability criterion if their efficiency and

  5. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities are emerging for improved PSA by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take advantage of the technological developments mentioned above is dynamic PSA, in which conventional ET/FT models can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it seems less interesting from an industrial and regulatory viewpoint. The authors expect this work can contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET seems to be a backbone for most methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering
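
    DDET here refers to the discrete dynamic event tree, which couples a deterministic plant model with branchings at discretized times. The toy sketch below only illustrates that coupling with an invented tank-level model, per-step pump failure probability and failure criterion; it is not any of the surveyed methodologies.

        # Toy discrete dynamic event tree (DDET): at each time step the scenario
        # branches on a stochastic event (pump keeps running vs. fails), while a
        # deterministic model updates the plant state. Model and numbers are invented.

        P_FAIL_PER_STEP = 0.02     # pump failure probability per step (invented)
        STEPS = 5
        INFLOW, OUTFLOW = 1.0, 1.2 # level change per step (invented units)
        LIMIT = 3.0                # failure criterion on tank level (invented)

        def expand(level, pump_on, prob, step, scenarios):
            if step == STEPS or level >= LIMIT:
                scenarios.append((prob, level, level >= LIMIT))
                return
            if pump_on:
                # branch 1: pump keeps running (level drains); branch 2: pump fails
                expand(level + INFLOW - OUTFLOW, True, prob * (1 - P_FAIL_PER_STEP),
                       step + 1, scenarios)
                expand(level + INFLOW, False, prob * P_FAIL_PER_STEP, step + 1, scenarios)
            else:
                expand(level + INFLOW, False, prob, step + 1, scenarios)

        scenarios = []
        expand(level=0.0, pump_on=True, prob=1.0, step=0, scenarios=scenarios)
        print("scenarios generated:", len(scenarios))
        print("probability of exceeding the level limit:",
              round(sum(p for p, _, failed in scenarios if failed), 5))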

  6. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a core methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can learn and model with it easily, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging thanks to the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has improved greatly compared to the past, so risk analysis can now be conducted with the large amounts of data actually available. One method that can take advantage of these technologies is dynamic PSA, in which conventional ET/FT models acquire time- and condition-dependent behavior in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it has attracted less interest from the industrial and regulatory viewpoint. The authors expect this survey to contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In this paper, an overview of dynamic PSA is given. Most of the methodologies share similar concepts; among them, DDET appears to be the backbone of most approaches, since it can be applied to large problems. The common characteristics shared by DDET-based methods are as follows: • both deterministic and stochastic approaches; • improved identification of PSA success criteria; • limiting the detrimental effects of sequence binning (normally adopted in PSA); • avoiding the definition of non-optimal success criteria that may distort the risk; • a framework for comprehensively considering

  7. UAV-Borne photogrammetry: a low cost 3D surveying methodology for cartographic update

    Directory of Open Access Journals (Sweden)

    Caroti Gabriella

    2017-01-01

    Full Text Available Territorial management requires the most up-to-date mapping support of the status quo that is possible. The regional-scale cartography update cycle is in the medium term (10 to 15 years); therefore, in the intervening time between updates the relevant authorities must provide timely updates for new works or territorial changes. The required surveys can exploit several technologies: ground-based GPS, Terrestrial Laser Scanning (TLS), traditional topography, or, in the case of wider areas, airborne photogrammetry or laser scanning. In recent years UAV-based photogrammetry has become increasingly widespread as a versatile, low-cost surveying system for small to medium areas. This surveying methodology was used to generate, in order, a dense point cloud, a high-resolution Digital Surface Model (DSM) and an orthophotograph of a newly built marina by the mouth of the Arno river in Pisa, Italy, which is not yet included in cartography. Surveying activities took place while the construction site was in operation. Issues that surfaced in the course of the case study survey are presented and discussed, suggesting ‘good practice’ rules which, if followed in the survey planning step, can lessen unwanted effects due to criticalities. In addition, results of the quality analysis of orthophotographs generated from UAV-borne images are presented. These results are discussed in view of a possible use of orthophotographs in updating medium- to large-scale cartography and are checked against existing blueprints.

  8. Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches

    Science.gov (United States)

    Farassat, Fereidoun; Casper, Jay H.

    2006-01-01

    In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need to develop further for this method to become useful. Nonetheless, the authors propose that the method based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.

  9. Practice Audit in Gastroenterology (PAGE) program: A novel approach to continuing professional development

    Science.gov (United States)

    Armstrong, David; Hollingworth, Roger; Gardiner, Tara; Klassen, Michael; Smith, Wendy; Hunt, Richard H; Barkun, Alan; Gould, Michael; Leddin, Desmond

    2006-01-01

    BACKGROUND: Practice audit is an important component of continuing professional development that may more readily be undertaken if it were less complex. This qualitative study assessed the use of personal digital assistants to facilitate data collection and review. METHODS: Personal digital assistants programmed with standard questionnaires related to upper gastrointestinal endoscopies (Practice Audit in Gastroenterology-Endoscopy [‘PAGE-Endo’]) and colonoscopies (PAGE-Colonoscopy [‘PAGE-Colo’]) were provided to Canadian gastroenterologists, surgeons and internists. Over a three-week audit period, participants recorded indications, and the expected (E) and reported (R) findings for each procedure. Thereafter, participants recorded compliance with reporting, the ease of use and value of the PAGE program, and their willingness to perform another audit. RESULTS: Over 15 to 18 months, 173 participants completed PAGE-Endo (6168 procedures) and 111 completed PAGE-Colo (4776 procedures). Most respondents noted that PAGE was easy to use (99%), beneficial (88% to 95%), and that they were willing to undertake another audit (92% to 95%). In PAGE-Endo, alarm features were prevalent (55%), but major reported findings were less common than expected: esophagitis (E 29.9%, R 14.8%), esophageal stricture (E 8.3%, R 3.6%), gastric ulcer (E 17.0%, R 4.7%), gastric cancer (E 4.3%, R 1.0%) and duodenal ulcer (E 11.5%, R 5.7%). In PAGE-Colo, more colonoscopies were performed for symptom investigation (55%) than for screening (25%) or surveillance (20%). There were marked interprovincial variations with respect to sedation, biopsies and technical aspects of colonoscopy. CONCLUSION: Secure, real-time data entry with review of aggregate and individual data in the PAGE program provided an acceptable, straightforward methodology for accredited practice audit activities. PAGE has considerable potential for continuing professional development in gastroenterology and other specialties.

  10. Facebook's personal page modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we attempt to define the utility of Facebook's Personal Page marketing method. This tool provided by Facebook is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them, and uses the design science research methodology for the proof of concept of the models and modelling processes. The model has been developed for a social media marketing agent/company, is oriented to the Facebook platform, and has been tested in real circumstances. It was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making were confirmed by the management of the company. Facebook's Personal Page method can be adjusted, depending on the situation, in order to maximize the total profit of the company, which means bringing in new customers, keeping the interest of existing customers, and delivering traffic to its website.

  11. 77 FR 10958 - International Services Surveys: BE-150, Quarterly Survey of Cross-Border Credit, Debit, and...

    Science.gov (United States)

    2012-02-24

    ... Emond, Chief, Special Surveys Branch, Balance of Payments Division (BE-50), Bureau of Economic Analysis... the International Investment and Trade in Services Survey Act, 22 U.S.C. 3101- [[Page 10959

  12. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

    8-page document + 5-slide presentation. -- Contributed to: 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, Mar 1-5, 2010). Recently, biometrics has been used in many security systems, and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is at present no specific methodology for testing this influence...

  13. Entertainment Pages.

    Science.gov (United States)

    Druce, Mike

    1981-01-01

    Notes that the planning of an effective entertainment page in a school newspaper must begin by establishing its purpose. Examines all the elements that contribute to the makeup of a good entertainment page. (RL)

  14. Sources and Coverage of Medical News on Front Pages of US Newspapers

    Science.gov (United States)

    Lai, William Y. Y.; Lane, Trevor; Jones, Alison

    2009-01-01

    Background Medical news that appears on newspaper front pages is intended to reach a wide audience, but how this type of medical news is prepared and distributed has not been systematically researched. We thus quantified the level of visibility achieved by front-page medical stories in the United States and analyzed their news sources. Methodology Using the online resource Newseum, we investigated front-page newspaper coverage of four prominent medical stories, and a high-profile non-medical news story as a control, reported in the US in 2007. Two characteristics were quantified by two raters: which newspaper titles carried each target front-page story (interrater agreement, >96%; kappa, >0.92) and the news sources of each target story (interrater agreement, >94%; kappa, >0.91). National rankings of the top 200 US newspapers by audited circulation were used to quantify the extent of coverage as the proportion of the total circulation of ranked newspapers in Newseum. Findings In total, 1630 front pages were searched. Each medical story appeared on the front pages of 85 to 117 (67.5%–78.7%) ranked newspaper titles that had a cumulative daily circulation of 23.1 to 33.4 million, or 61.8% to 88.4% of all newspapers. In contrast, the non-medical story achieved front-page coverage in 152 (99.3%) newspaper titles with a total circulation of 41.0 million, or 99.8% of all newspapers. Front-page medical stories varied in their sources, but the Washington Post, Los Angeles Times, New York Times and the Associated Press together supplied 61.7% of the total coverage of target front-page medical stories. Conclusion Front-page coverage of medical news from different sources is more accurately revealed by analysis of circulation counts rather than of newspaper titles. Journals wishing to widen knowledge of research news and organizations with important health announcements should target at least the four dominant media organizations identified in this study. PMID:19724643
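
    The distinction drawn in the findings, between measuring coverage by the number of newspaper titles and by their combined circulation, can be illustrated with a few lines of arithmetic. The figures below are invented placeholders, not the study's data:

```python
# Each tuple: (did the newspaper carry the story on its front page?, daily circulation)
papers = [
    (True, 2_000_000),
    (False, 900_000),
    (True, 450_000),
    (True, 300_000),
    (False, 150_000),
]

titles_covered = sum(carried for carried, _ in papers)
share_by_titles = titles_covered / len(papers)

total_circulation = sum(circ for _, circ in papers)
covered_circulation = sum(circ for carried, circ in papers if carried)
share_by_circulation = covered_circulation / total_circulation

print(f"Coverage by titles:      {share_by_titles:.1%}")
print(f"Coverage by circulation: {share_by_circulation:.1%}")
```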

  15. Comparison Of Irms Delhi Methodology With Who Methodology On Immunization Coverage

    Directory of Open Access Journals (Sweden)

    Singh Padam

    1996-01-01

    Full Text Available Research question: What are the merits of the IRMS model over the WHO model for coverage evaluation surveys? Which method is superior and appropriate for coverage evaluation surveys of immunization in our setting? Objective: To compare the IRMS Delhi methodology with the WHO methodology on immunization coverage. Study design: Cross-sectional. Setting: Both urban and rural. Participants: Mothers and children. Sample size: 300 children aged 1-2 years and 300 mothers in rural areas, and 75 children and 75 mothers in urban areas. Study variables: Rural, urban, caste group, size of the stratum, literacy, sex and cost effectiveness. Outcome variables: Coverage level of immunization. Analysis: Routine statistical analysis. Results: The IRMS-developed methodology scores a better rating than the WHO methodology, especially when coverage evaluation is attempted in medium-sized villages with socio-economic segregation, which remains a main characteristic of Indian villages.

  16. Full page insight

    DEFF Research Database (Denmark)

    Cortsen, Rikke Platz

    2014-01-01

    Alan Moore and his collaborating artists often manipulate time and space by drawing upon the formal elements of comics and making alternative constellations. This article looks at an element that is used frequently in comics of all kinds – the full page – and discusses how it helps shape spatio......, something that it shares with the full page in comics. Through an analysis of several full pages from Moore titles like Swamp Thing, From Hell, Watchmen and Promethea, it is made clear why the full page provides an apt vehicle for an apocalypse in comics....

  17. Alcohol- and Drug-Involved Driving in the United States: Methodology for the 2007 National Roadside Survey

    Science.gov (United States)

    Lacey, John H.; Kelley-Baker, Tara; Voas, Robert B.; Romano, Eduardo; Furr-Holden, C. Debra; Torres, Pedro; Berning, Amy

    2011-01-01

    This article describes the methodology used in the 2007 U.S. National Roadside Survey to estimate the prevalence of alcohol- and drug-impaired driving and alcohol- and drug-involved driving. This study involved randomly stopping drivers at 300 locations across the 48 continental U.S. states at sites selected through a stratified random sampling…

  18. Methodology of Layout (DIV+CSS) Applied to the Design of Web Pages for the China Geological Survey Website (DIV+CSS技术在地调局网站页面设计中的应用)

    Institute of Scientific and Technical Information of China (English)

    漆海霞

    2011-01-01

    By introducing DIV+CSS technology, this paper points out the significance of adopting this technique for the web page design of the China Geological Survey website. It presents the design methods and layout structure applied to the website's pages, and notes several issues that require attention when using the DIV+CSS technique.

  19. Intention to continue using Facebook fan pages from the perspective of social capital theory.

    Science.gov (United States)

    Lin, Kuan-Yu; Lu, Hsi-Peng

    2011-10-01

    Social network sites enable users to express themselves, establish ties, and develop and maintain social relationships. Recently, many companies have begun using social media identity (e.g., Facebook fan pages) to enhance brand attractiveness, and social network sites have evolved into social utility networks, thereby creating a number of promising business opportunities. To this end, the operators of fan pages need to be aware of the factors motivating users to continue their patronization of such pages. This study set out to identify these motivating factors from the point of view of social capital. This study employed structural equation modeling to investigate a research model based on a survey of 327 fan pages users. This study discovered that ties related to social interaction (structural dimension), shared values (cognitive dimension), and trust (relational dimension) play important roles in users' continued intention to use Facebook fan pages. Finally, this study discusses the implications of these findings and offers directions for future research.

  20. Users page feedback

    CERN Multimedia

    2010-01-01

    In October last year the Communication Group proposed an interim redesign of the users’ web pages in order to improve the visibility of key news items, events and announcements to the CERN community. [Image: the proposed update to the users’ page (right), and the current version (left, behind).] This proposed redesign was seen as a small step on the way to much wider reforms of the CERN web landscape proposed in the group’s web communication plan. The results are available here. Some of the key points: the balance between news/events/announcements and access to links on the users’ pages was not right; many people asked to see a reversal of the order so that links appeared first and news/events/announcements last; many people felt that we should keep the primary function of the users’ pages as an index to other CERN websites; many people found the sections of the front page to be poorly delineated; people do not like scrolling; there were performance...

  1. Using a statewide survey methodology to prioritize pediatric cardiology core content.

    Science.gov (United States)

    Neal, Ashley E; Lehto, Elizabeth; Miller, Karen Hughes; Ziegler, Craig; Davis, Erin

    2018-01-01

    Although pediatrician-reported relevance of Canadian cardiology-specific objectives has been studied, similar data are not available for the 2016 American Board of Pediatrics (ABP) cardiology-specific objectives. This study asked Kentucky trainees, pediatricians, and pediatric cardiologists to identify "most important" content within these objectives. This cross-sectional study used an original, online survey instrument based on the 2016 ABP cardiology-specific objectives. We collected quantitative data (numerical indications of importance) and qualitative data (open-ended replies regarding missing content and difficulty in teaching and learning). Respondents indicated the top two choices of most important items within eight content areas. Descriptive statistics (frequencies and percentages) and chi-square analysis were calculated. Content within categories was organized using naturally occurring "clusters" and "gaps" in scores. Common themes among open-ended qualitative responses were identified using Pandit's version of Glaser and Strauss Grounded theory (constant comparison). Of the 136 respondents, 23 (17%) were residents, 15 (11%) fellows, 85 (62%) pediatricians, and 13 (10%) pediatric cardiologists. Of attendings, 80% reported faculty/gratis faculty status. Naturally occurring clusters in respondent-designated importance resulted in ≤3 "most selected" objectives per content area. Objectives in "most selected" content pertained to initial diagnosis (recognition of abnormality/disease) (n = 16), possible emergent/urgent intervention required (n = 14), building a differential (n = 8), and planning a workup (n = 4). Conversely, themes for "least selected" content included comanagement with subspecialist (n = 15), knowledge useful in patient-family communication (n = 9), knowledge that can be referenced (as needed) (n = 7), and longitudinal/follow-up concerns (n = 5). This study demonstrated the utility of an online survey

  2. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution

    International Nuclear Information System (INIS)

    Tregidgo, Daniel J.; West, Sarah E.; Ashmore, Mike R.

    2013-01-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. -- Highlights: • We investigated the validity of a simplified citizen science methodology. • Lichen abundance data were used to indicate nitrogenous air pollution. • Significant changes were detected beside busy roads with low background pollution. • The methodology detected major, but not subtle, contrasts in pollution. • Sensitivity of citizen science methods to environmental change must be evaluated. -- A simplified lichen biomonitoring method used for citizen science can detect the impact of nitrogenous air pollution from local roads

  3. Discontinuation of the Bulletin's menu page

    CERN Document Server

    Publications Section

    2005-01-01

    The menus of the various CERN restaurants will no longer be published in the Bulletin as of Monday 4 April (issue No. 14/2005). The menu pages are being discontinued both as a savings measure and due to the low level of interest in this section of the Bulletin. The most recent survey of Bulletin readers showed that only 13% of the people questioned regularly read the menu section, compared to between 40% and 85% in the case of the other sections. Publications Section SG/CO Tel. 79971

  4. Discontinuation of the Bulletin's menu page

    CERN Multimedia

    Publications Section

    2005-01-01

    The menus of the various CERN restaurants will no longer be published in the Bulletin as of Monday 4 April (issue No. 14/2005). The menu pages are being discontinued both as a savings measure and due to the low level of interest in this section of the Bulletin. The most recent survey of Bulletin readers showed that only 13% of the people questioned regularly read the menu section, compared to between 40% and 85% in the case of the other sections. Publications Section DSU-CO Tel. 79971

  5. Sleep Apnea Information Page

    Science.gov (United States)

    What research is being done? ... Institutes of Health (NIH) conduct research related to sleep apnea in laboratories at the NIH, and also ...

  6. ARM User Survey Report: Data Access, Quality, and Delivery

    Energy Technology Data Exchange (ETDEWEB)

    Mather, JH; Roeder, LR; Sivaraman, C

    2012-06-28

    The objective of this survey was to obtain user feedback to determine how users of the Atmospheric Radiation Measurement (ARM) Climate Research Facility Data Archive interact with the more than 2000 available types of datastreams. The survey also gathered information about data discovery and data quality. The Market and Competitive Analysis group at Pacific Northwest National Laboratory worked with web administrators to develop a landing page from which users could access the survey. A survey invitation was sent by ARM via email to about 6100 users on February 22, 2012. The invitation was also posted on the ARM website and Facebook page. Reminders were sent via e-mail and posted on Facebook while the survey was open, February 22-March 23, 2012.

  7. Development of assessment methodology for plant configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Kim, Yoon Ik; Yang, Hui Chang; Huh, Byeong Gill; Lee, Dong Won; Ahn, Gwan Won [Seoul National Univ., Seoul (Korea, Republic of)

    2001-03-15

    The purpose of this study is the development of an effective and comprehensive assessment methodology that reflects plant characteristics for the surveillance, maintenance, repair and operation of nuclear power plants. In this study, recent research is surveyed, and concept definitions, procedures, current PSA methodologies and the implementation of various models are evaluated. Through this survey, a systematic assessment methodology is suggested. The configuration control assessment methodology proposed in this study, developed to reflect the characteristics of Korean NPPs, can be used to supplement current PSA methodologies.

  8. Processing of next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data for the DuPage County streamflow simulation system

    Science.gov (United States)

    Bera, Maitreyee; Ortel, Terry W.

    2018-01-12

    The U.S. Geological Survey, in cooperation with DuPage County Stormwater Management Department, is testing a near real-time streamflow simulation system that assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek and West Branch DuPage River drainage basins in DuPage County, Illinois. As part of this effort, the U.S. Geological Survey maintains a database of hourly meteorological and hydrologic data for use in this near real-time streamflow simulation system. Among these data are next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data, which are retrieved from the North Central River Forecasting Center of the National Weather Service. The DuPage County streamflow simulation system uses these quantitative precipitation forecast data to create streamflow predictions for the two simulated drainage basins. This report discusses in detail how these data are processed for inclusion in the Watershed Data Management files used in the streamflow simulation system for the Salt Creek and West Branch DuPage River drainage basins.

  9. Development of assessment methodology for plant configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Cheong, Chang Hyeon; Yu, Yeong Woo; Cho, Jae Seon; Kim, Ju Yeol; Kim, Yun Ik; Yang, Hui Chang; Park, Gang Min; Hur, Byeong Gil [Seoul National Univ., Seoul (Korea, Republic of)

    1999-03-15

    The purpose of this study is the development of an effective and comprehensive assessment methodology that reflects plant characteristics for the surveillance, maintenance, repair and operation of nuclear power plants. The development of this methodology can contribute to enhanced safety. In the first year of this study, recent research is surveyed, and concept definitions, procedures, current PSA methodologies and the implementation of various models are evaluated. Through this survey, a systematic assessment methodology is suggested.

  10. In-Degree and PageRank of web pages: why do they follow similar power laws?

    NARCIS (Netherlands)

    Litvak, Nelli; Scheinhardt, Willem R.W.; Volkovich, Y.

    2009-01-01

    PageRank is a popularity measure designed by Google to rank Web pages. Experiments confirm that PageRank values obey a power law with the same exponent as In-Degree values. This paper presents a novel mathematical model that explains this phenomenon. The relation between PageRank and In-Degree is
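
    A quick way to see the phenomenon the abstract refers to is to compute PageRank by power iteration on a synthetic graph with heavy-tailed in-degrees and compare the two quantities node by node. The sketch below is a generic illustration under assumed parameters (graph size, damping factor, attachment rule), not the mathematical model proposed in the paper:

```python
import random
from collections import defaultdict

random.seed(0)

# Grow a directed graph whose in-degrees are heavy-tailed: each new node links
# to M targets, chosen preferentially by current in-degree most of the time.
N, M, DAMPING = 2000, 3, 0.85
out_links = defaultdict(list)
targets = [0]                                # node ids repeated once per received link
for node in range(1, N):
    for _ in range(M):
        dest = random.choice(targets) if random.random() < 0.7 else random.randrange(node)
        out_links[node].append(dest)
        targets.append(dest)
    targets.append(node)                     # every node can be linked to later

def pagerank(out_links, n, d=DAMPING, iters=50):
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - d) / n] * n            # teleportation term
        for src in range(n):
            outs = out_links.get(src)
            if outs:
                share = d * rank[src] / len(outs)
                for dst in outs:
                    new[dst] += share
            else:                            # dangling node: spread its mass uniformly
                for v in range(n):
                    new[v] += d * rank[src] / n
        rank = new
    return rank

in_degree = defaultdict(int)
for outs in out_links.values():
    for dst in outs:
        in_degree[dst] += 1

pr = pagerank(out_links, N)
for v in sorted(range(N), key=lambda v: pr[v], reverse=True)[:10]:
    print(f"node {v:4d}  PageRank {pr[v]:.5f}  in-degree {in_degree[v]}")
```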

  11. In-degree and pageRank of web pages: Why do they follow similar power laws?

    NARCIS (Netherlands)

    Litvak, Nelli; Scheinhardt, Willem R.W.; Volkovich, Y.

    The PageRank is a popularity measure designed by Google to rank Web pages. Experiments confirm that the PageRank obeys a 'power law' with the same exponent as the In-Degree. This paper presents a novel mathematical model that explains this phenomenon. The relation between the PageRank and In-Degree

  12. Expectations for methodology and translation of animal research: a survey of health care workers.

    Science.gov (United States)

    Joffe, Ari R; Bara, Meredith; Anton, Natalie; Nobis, Nathan

    2015-05-07

    Health care workers (HCW) often perform, promote, and advocate the use of public funds for animal research (AR); therefore, an awareness of the empirical costs and benefits of animal research is an important issue for HCW. We aim to determine what health care workers consider should be acceptable standards of AR methodology and translation rate to humans. After development and validation, an e-mail survey was sent to all pediatricians and pediatric intensive care unit nurses and respiratory therapists (RTs) affiliated with a Canadian university. We presented questions about demographics, methodology of AR, and expectations from AR. Responses of pediatricians and nurses/RTs were compared using chi-square tests. Regarding methodological quality, most respondents expect that: AR is done to high quality; costs and difficulty are not acceptable justifications for low quality; findings should be reproducible between laboratories and strains of the same species; and guidelines for AR funded with public money should be consistent with these expectations. Asked about benefits of AR, most thought that there are sometimes/often large benefits to humans from AR, and disagreed that "AR rarely produces benefit to humans." Asked about expectations of translation to humans (of toxicity, carcinogenicity, teratogenicity, and treatment findings), most expect translation more than 40% of the time, and thought that misleading AR results should occur only rarely. Overall, respondents have high expectations of the methodological quality of, and the translation rate to humans of, findings from AR. These expectations are higher than the empirical data show having been achieved. Unless these areas of AR significantly improve, HCW support of AR may be tenuous.

  13. The impact of methodology in innovation measurement

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, L.; Bugge, M.; Solberg, E.

    2016-07-01

    Innovation surveys and rankings such as the Community Innovation Survey (CIS) and Innovation Union Scoreboard (IUS) have developed into influential diagnostic tools that are often used to categorize countries according to their innovation performance and to legitimise innovation policies. Although a number of ongoing processes are seeking to improve existing frameworks for measuring innovation, there are large methodological differences across countries in the way innovation is measured. This causes great uncertainty regarding a) the coherence between data from innovation surveys, b) actual innovativeness of the economy, and c) the validity of research based on innovation data. Against this background we explore empirically how different survey methods for measuring innovation affect reported innovation performance. The analysis is based on a statistical exercise comparing the results from three different methodological versions of the same survey for measuring innovation in the business enterprise sector in Norway. We find striking differences in reported innovation performance depending on how the surveys are carried out methodologically. The paper concludes that reported innovation performance is highly sensitive to and strongly conditioned by methodological context. This represents a need for increased caution and awareness around data collection and research based on innovation data, and not least in terms of aggregation of data and cross-country comparison. (Author)

  14. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    WSTAT). In the early stages of the V&V for development risk, it was discovered that the original risk rating and methodology did not actually... WSTA has opened trade space exploration by allowing the tool to evaluate trillions of potential system configurations to then return a handful of...

  15. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution.

    Science.gov (United States)

    Tregidgo, Daniel J; West, Sarah E; Ashmore, Mike R

    2013-11-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. 2009 Survey of Gulf of Mexico Dockside Seafood Dealers

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This survey employed a two-page, self-administered mail survey structured to collect economic and financial information from dockside seafood dealers who operated...

  17. Creating Web Pages Simplified

    CERN Document Server

    Wooldridge, Mike

    2011-01-01

    The easiest way to learn how to create a Web page for your family or organization Do you want to share photos and family lore with relatives far away? Have you been put in charge of communication for your neighborhood group or nonprofit organization? A Web page is the way to get the word out, and Creating Web Pages Simplified offers an easy, visual way to learn how to build one. Full-color illustrations and concise instructions take you through all phases of Web publishing, from laying out and formatting text to enlivening pages with graphics and animation. This easy-to-follow visual guide sho

  18. Flash-Aware Page Replacement Algorithm

    Directory of Open Access Journals (Sweden)

    Guangxia Xu

    2014-01-01

    Full Text Available Due to the limited main memory resource of consumer electronics equipped with NAND flash memory as a storage device, an efficient page replacement algorithm called FAPRA is proposed for NAND flash memory in light of its inherent characteristics. FAPRA introduces an efficient victim page selection scheme that takes into account the benefit-to-cost ratio of evicting each victim page candidate, the combined recency and frequency value, and the erase count of the block to which each page belongs. Since a dirty victim page often contains clean data that exist in both the main memory and the NAND flash memory based storage device, FAPRA only writes the dirty data within the victim page back to the NAND flash memory based storage device in order to reduce redundant write operations. We conducted a series of trace-driven simulations, and the experimental results show that our proposed FAPRA algorithm outperforms the state-of-the-art algorithms in terms of page hit ratio, the number of write operations, runtime, and the degree of wear leveling.
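
    The abstract lists the ingredients of FAPRA's victim selection: a benefit-to-cost ratio for eviction, a combined recency-and-frequency value, and the erase count of the block holding the page. The exact weighting is not given here, so the sketch below is only a schematic scoring function with invented weights, not the published algorithm:

```python
from dataclasses import dataclass

@dataclass
class Page:
    page_id: int
    dirty: bool            # dirty pages must be written back to flash on eviction
    recency: float         # higher = accessed more recently (assumed 0..1 scale)
    frequency: int         # access count
    block_erase_count: int # wear of the flash block holding this page

def eviction_score(p: Page, rec_weight=0.6, freq_weight=0.4, wear_penalty=0.01):
    """Lower score = better victim candidate (illustrative weights only)."""
    crf = rec_weight * p.recency + freq_weight * min(p.frequency, 100) / 100.0
    eviction_cost = 2.0 if p.dirty else 1.0   # write-back roughly doubles the cost
    benefit_to_cost = (1.0 - crf) / eviction_cost
    # Penalise evicting pages on heavily erased blocks to help wear levelling.
    return -benefit_to_cost + wear_penalty * p.block_erase_count

def choose_victim(pages):
    return min(pages, key=eviction_score)

pages = [
    Page(1, dirty=True,  recency=0.9, frequency=40, block_erase_count=120),
    Page(2, dirty=False, recency=0.2, frequency=3,  block_erase_count=300),
    Page(3, dirty=True,  recency=0.1, frequency=1,  block_erase_count=50),
]
print("Victim page:", choose_victim(pages).page_id)
```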

  19. The changing pages of comics : Page layouts across eight decades of American superhero comics

    NARCIS (Netherlands)

    Pederson, Kaitlin; Cohn, Neil

    2016-01-01

    Page layouts are one of the most overt features of comics’ structure. We hypothesized that American superhero comics have changed in their page layout over eight decades, and investigated this using a corpus analysis of 40 comics from 1940 through 2014. On the whole, we found that comics pages

  20. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    International Nuclear Information System (INIS)

    2013-01-01

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results

  1. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-12-15

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results.

  2. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    We survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  3. College of DuPage District 502 Educational Needs Assessment (Total Sample Summary).

    Science.gov (United States)

    Rice, Gary

    In spring 1987, the Gallup Organization conducted a telephone survey for the College of DuPage (COD) in Illinois to determine community perceptions of the college's role in the community; community awareness of COD and its competitors; perceptions of current college services; importance of factors in selecting an educational program; interest in…

  4. Methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006.

    Science.gov (United States)

    Rodríguez-Ramírez, Sonia; Mundo-Rosas, Verónica; Jiménez-Aguilar, Alejandra; Shamah-Levy, Teresa

    2009-01-01

    To describe the methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006 (ENSANUT 2006) carried out in Mexico. Dietary data from the population who participated in the ENSANUT 2006 were collected through a 7-day food-frequency questionnaire. Energy and nutrient intake of each food consumed and adequacy percentage by day were also estimated. Intakes and adequacy percentages > 5 SDs from the energy and nutrient general distribution and observations with energy adequacy percentages < 25% were excluded from the analysis. Valid dietary data were obtained from 3552 children aged 1 to 4 years, 8716 children aged 5 to 11 years, 8442 adolescents, 15951 adults, and 3357 older adults. It is important to detail the methodology for the analysis of dietary data to standardize data cleaning criteria and to be able to compare the results of different studies.
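
    The cleaning rules described above (dropping intakes or adequacy percentages more than 5 SDs above the overall distribution, and records with energy adequacy below 25%) can be expressed in a few lines. The sketch below uses pandas with hypothetical column names, as an illustration rather than the survey's actual processing code:

```python
import pandas as pd

def clean_dietary_records(df: pd.DataFrame) -> pd.DataFrame:
    """Apply ENSANUT-2006-style exclusions (hypothetical column names)."""
    out = df.copy()
    for col in ["energy_kcal", "energy_adequacy_pct"]:
        cutoff = out[col].mean() + 5 * out[col].std()
        out = out[out[col] <= cutoff]              # drop values > 5 SD above the mean
    return out[out["energy_adequacy_pct"] >= 25]   # drop implausibly low adequacy

# Tiny demo; with only a handful of rows the 5 SD rule rarely triggers,
# it matters at survey scale with tens of thousands of records.
demo = pd.DataFrame({
    "energy_kcal":         [1800, 2200, 2600, 1600, 2500],
    "energy_adequacy_pct": [90,   110,  130,  20,   120],
})
print(clean_dietary_records(demo))
```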

  5. 2012 National Immunization Survey Data

    Science.gov (United States)

    This website is archived for historical purposes and is no longer being maintained or ... 12, 2013: Content on this page kept for historical reasons. National Immunization Survey (NIS) – Children (19-35 ...

  6. Realistic page-turning of electronic books

    Science.gov (United States)

    Fan, Chaoran; Li, Haisheng; Bai, Yannan

    2014-01-01

    Booming electronic books (e-books), as an extension of the paper book, are popular with readers. Recently, much effort has been put into realistic page-turning simulation for e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach, which employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between the time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto cylinders with different radii and positions. Compared to previous approaches, our method is able to imitate various effects efficiently and obtains a more natural animation of the turning page.
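
    The core geometric idea mentioned in the abstract, wrapping the flat page onto a cylinder whose radius and position change over time, can be sketched as a simple mapping from page coordinates to 3D points. The parameterisation below is a generic textbook-style cylinder wrap with an assumed axis placement, not the authors' piecewise formulation:

```python
import math

def wrap_on_cylinder(x, y, radius, x_axis):
    """
    Map a point (x, y) of a flat page (spine at x = 0) onto a cylinder of the
    given radius whose axis is vertical, at x = x_axis and height z = radius.
    Points left of the axis stay flat; points beyond it bend around the cylinder.
    """
    if x <= x_axis:
        return (x, y, 0.0)                      # still lying flat on the table
    theta = (x - x_axis) / radius               # arc length is preserved on the cylinder
    return (x_axis + radius * math.sin(theta),  # bent x coordinate
            y,                                  # height along the spine is unchanged
            radius * (1.0 - math.cos(theta)))   # lift off the table

# One frame of a toy animation: a 21 x 5 grid of page points, cylinder near the free edge.
page_w, page_h, radius, x_axis = 1.0, 1.4, 0.25, 0.6
frame = [wrap_on_cylinder(i * page_w / 20, j * page_h / 4, radius, x_axis)
         for j in range(5) for i in range(21)]
print(frame[:3])
```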

  7. On Page Rank

    NARCIS (Netherlands)

    Hoede, C.

    In this paper the concept of page rank for the world wide web is discussed. The possibility of describing the distribution of page rank by an exponential law is considered. It is shown that the concept is essentially equal to that of status score, a centrality measure discussed already in 1953 by

  8. PageRank of integers

    International Nuclear Information System (INIS)

    Frahm, K M; Shepelyansky, D L; Chepelianskii, A D

    2012-01-01

    We set up a directed network tracing links from a given integer to its divisors and analyze the properties of the Google matrix of this network. The PageRank vector of this matrix is computed numerically, and it is shown that its probability is approximately inversely proportional to the PageRank index, thus resembling Zipf's law and the dependence established for the World Wide Web. The spectrum of the Google matrix of integers is characterized by a large gap and a relatively small number of nonzero eigenvalues. A simple semi-analytical expression for the PageRank of integers is derived that allows us to find this vector for matrices of billion size. This network provides a new PageRank order of the integers. (paper)
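
    The construction described above is concrete enough to reproduce in miniature: link every integer n in {2, ..., N} to its proper divisors, build the damped Google matrix, and run power iteration. The sketch below does this for a small N as an illustration; the semi-analytical expression derived in the paper is not reproduced here:

```python
import numpy as np

N, DAMPING, ITERS = 200, 0.85, 100

# Column-stochastic link matrix of the divisor network on {1, ..., N}:
# integer n points to each of its proper divisors d (d | n, d < n).
S = np.zeros((N, N))
for n in range(2, N + 1):
    divisors = [d for d in range(1, n) if n % d == 0]
    for d in divisors:
        S[d - 1, n - 1] = 1.0 / len(divisors)
S[:, 0] = 1.0 / N          # 1 has no proper divisors: treat it as a dangling node

G = DAMPING * S + (1.0 - DAMPING) / N * np.ones((N, N))   # Google matrix

p = np.full(N, 1.0 / N)
for _ in range(ITERS):     # power iteration
    p = G @ p

order = np.argsort(-p)
print("Top integers by PageRank:", [int(i) + 1 for i in order[:10]])
print("Their probabilities:     ", np.round(p[order[:10]], 4))
```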

  9. The Faculty Web Page: Contrivance or Continuation?

    Science.gov (United States)

    Lennex, Lesia

    2007-01-01

    In an age of Internet education, what does it mean for a tenure/tenure-track faculty to have a web page? How many professors have web pages? If they have a page, what does it look like? Do they really need a web page at all? Many universities have faculty web pages. What do those collective pages look like? In what way do they represent the…

  10. A methodology for enhancing implementation science proposals: comparison of face-to-face versus virtual workshops.

    Science.gov (United States)

    Marriott, Brigid R; Rodriguez, Allison L; Landes, Sara J; Lewis, Cara C; Comtois, Katherine A

    2016-05-06

    With the current funding climate and need for advancements in implementation science, there is a growing demand for grantsmanship workshops to increase the quality and rigor of proposals. A group-based implementation science-focused grantsmanship workshop, the Implementation Development Workshop (IDW), is one methodology to address this need. This manuscript provides an overview of the IDW structure, format, and findings regarding its utility. The IDW methodology allows researchers to vet projects in the proposal stage in a structured format with a facilitator and two types of expert participants: presenters and attendees. The presenter uses a one-page handout and verbal presentation to present their proposal and questions. The facilitator elicits feedback from attendees using a format designed to maximize the number of unique points made. After each IDW, participants completed an anonymous survey assessing perceptions of the IDW. Presenters completed a funding survey measuring grant submission and funding success. Qualitative interviews were conducted with a subset of participants who participated in both delivery formats. Mixed method analyses were performed to evaluate the effectiveness and acceptability of the IDW and compare the delivery formats. Of those who participated in an IDW (N = 72), 40 participated in face-to-face only, 16 in virtual only, and 16 in both formats. Thirty-eight (face-to-face n = 12, 35 % response rate; virtual n = 26, 66.7 % response rate) responded to the surveys and seven (15.3 % response rate), who had attended both formats, completed an interview. Of 36 total presenters, 17 (face-to-face n = 12, 42.9 % response rate; virtual n = 5, 62.9 % response rate) responded to the funding survey. Mixed method analyses indicated that the IDW was effective for collaboration and growth, effective for enhancing success in obtaining grants, and acceptable. A third (35.3 %) of presenters ultimately received funding for their proposal, and more than

  11. Which Methodology Works Better? English Language Teachers' Awareness of the Innovative Language Learning Methodologies

    Science.gov (United States)

    Kurt, Mustafa

    2015-01-01

    The present study investigated whether English language teachers were aware of the innovative language learning methodologies in language learning, how they made use of these methodologies and the learners' reactions to them. The descriptive survey method was employed to disclose the frequencies and percentages of 175 English language teachers'…

  12. Research of Subgraph Estimation Page Rank Algorithm for Web Page Rank

    Directory of Open Access Journals (Sweden)

    LI Lan-yin

    2017-04-01

    Full Text Available The traditional PageRank algorithm cannot efficiently handle the ranking of large-scale web page data. This paper proposes an accelerated algorithm named topK-Rank, which is based on PageRank on the MapReduce platform. It can find the top k nodes of a given graph efficiently without sacrificing accuracy. In order to identify the top k nodes, the topK-Rank algorithm prunes unnecessary nodes and edges in each iteration to dynamically construct subgraphs, and iteratively estimates lower/upper bounds of PageRank scores through these subgraphs. Theoretical analysis shows that this method guarantees result exactness. Experiments show that the topK-Rank algorithm can find the top k nodes much faster than existing approaches.

  13. Page sample size in web accessibility testing: how many pages is enough?

    NARCIS (Netherlands)

    Velleman, Eric Martin; van der Geest, Thea

    2013-01-01

    Various countries and organizations use a different sampling approach and sample size of web pages in accessibility conformance tests. We are conducting a systematic analysis to determine how many pages is enough for testing whether a website is compliant with standard accessibility guidelines. This

  14. SuML: A Survey Markup Language for Generalized Survey Encoding

    Science.gov (United States)

    Barclay, MW; Lober, WB; Karras, BT

    2002-01-01

    There is a need in clinical and research settings for a sophisticated, generalized, web based survey tool that supports complex logic, separation of content and presentation, and computable guidelines. There are many commercial and open source survey packages available that provide simple logic; few provide sophistication beyond “goto” statements; none support the use of guidelines. These tools are driven by databases, static web pages, and structured documents using markup languages such as eXtensible Markup Language (XML). We propose a generalized, guideline aware language and an implementation architecture using open source standards.

  15. Survey research.

    Science.gov (United States)

    Alderman, Amy K; Salem, Barbara

    2010-10-01

    Survey research is a unique methodology that can provide insight into individuals' perspectives and experiences and can be collected on a large population-based sample. Specifically, in plastic surgery, survey research can provide patients and providers with accurate and reproducible information to assist with medical decision-making. When using survey methods in research, researchers should develop a conceptual model that explains the relationships of the independent and dependent variables. The items of the survey are of primary importance. Collected data are only useful if they accurately measure the concepts of interest. In addition, administration of the survey must follow basic principles to ensure an adequate response rate and representation of the intended target sample. In this article, the authors review some general concepts important for successful survey research and discuss the many advantages this methodology has for obtaining limitless amounts of valuable information.

  16. Nationwide survey of radon levels in indoor workplaces in Mexico using Nuclear Track Methodology

    International Nuclear Information System (INIS)

    Espinosa, G.; Golzarri, J.I.; Angeles, A.; Griffith, R.V.

    2009-01-01

    This report presents the preliminary results of an indoor workplace radon survey conducted during 2006-2007. Monitoring was carried out in 24 of the 32 federal entities of Mexico, incorporating 26 cities and 288 locations. The area monitored was divided into 8 regions for the purposes of the study: Chihuahua (a state with uranium mines), North-Central, South-Central, Southeast, South, Northeast, Northwest, and West. These regions differ in terms of geographic and geological characteristics, climate, altitude, and building materials and architectonic styles. Nuclear Track Methodology (NTM) was employed for the survey, using a passive closed-end cup device with Poly Allyl Diglycol Carbonate (PADC), known by its trade name CR-39 (Lantrack), as the detector material. Well-established protocols for making continuous indoor radon measurements were followed, including one-step chemical etching in a 6.25 M KOH solution at 60 ± 1 deg. C with an etching time of 18 h. The track densities were determined with an automatic digital system at the Instituto de Fisica de la Universidad Nacional Autonoma de Mexico (IFUNAM) (Physics Institute of the National Autonomous University of Mexico), and calibrated in facilities at the Oak Ridge National Laboratory (ORNL). The importance of this survey lies in the fact that it represents the first time a nationwide survey of radon levels in indoor workplaces has been carried out in Mexico. Mean indoor radon levels from continuous measurements taken during and after working hours ranged from 13 Bq m-3 (the lower limit of detection) to 196 Bq m-3. Analogous official controls or regulations for radon levels in indoor workplaces do not exist in Mexico. The survey described here contributes to knowledge of the natural radiological environment in workplaces, and will aid the relevant authorities in establishing appropriate regulations. The survey was made possible by the efforts of both a private institution and the Dosimeter Application Project

  17. Exploiting link structure for web page genre identification

    KAUST Repository

    Zhu, Jia

    2015-07-07

    As the World Wide Web develops at an unprecedented pace, identifying web page genre has recently attracted increasing attention because of its importance in web search. A common approach for identifying genre is to use textual features that can be extracted directly from a web page, that is, On-Page features. The extracted features are subsequently inputted into a machine learning algorithm that will perform classification. However, these approaches may be ineffective when the web page contains limited textual information (e.g., the page is full of images). In this study, we address genre identification of web pages under the aforementioned situation. We propose a framework that uses On-Page features while simultaneously considering information in neighboring pages, that is, the pages that are connected to the original page by backward and forward links. We first introduce a graph-based model called GenreSim, which selects an appropriate set of neighboring pages. We then construct a multiple classifier combination module that utilizes information from the selected neighboring pages and On-Page features to improve performance in genre identification. Experiments are conducted on well-known corpora, and favorable results indicate that our proposed framework is effective, particularly in identifying web pages with limited textual information. © 2015 The Author(s)
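
    The key idea, backing up weak On-Page evidence with evidence from linked neighboring pages, can be sketched with a toy classifier combination. The feature extraction, neighbor selection, and voting weights below are all invented placeholders; GenreSim itself scores and selects neighboring pages in a more principled way:

```python
from collections import Counter

# Toy genre lexicons standing in for a trained on-page text classifier (assumed).
LEXICON = {
    "shop": {"price", "cart", "buy", "shipping"},
    "news": {"reported", "today", "editor", "breaking"},
    "blog": {"posted", "comments", "thoughts", "personal"},
}

def classify_text(tokens):
    """Return (genre, confidence) from simple keyword overlap; (None, 0.0) if no evidence."""
    scores = {g: len(set(tokens) & words) for g, words in LEXICON.items()}
    best = max(scores, key=scores.get)
    total = sum(scores.values())
    return (best, scores[best] / total) if total else (None, 0.0)

def classify_page(page_tokens, neighbour_token_lists, neighbour_weight=0.5):
    """Combine the on-page vote with weighted votes from linked neighbouring pages."""
    votes = Counter()
    genre, conf = classify_text(page_tokens)
    if genre:
        votes[genre] += conf
    for tokens in neighbour_token_lists:
        g, c = classify_text(tokens)
        if g:
            votes[g] += neighbour_weight * c
    return votes.most_common(1)[0][0] if votes else "unknown"

# An image-heavy page with almost no text, but text-rich neighbours.
page = ["gallery"]
neighbours = [["price", "cart", "shipping", "buy"], ["buy", "price", "today"]]
print(classify_page(page, neighbours))
```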

  18. Exploiting link structure for web page genre identification

    KAUST Repository

    Zhu, Jia; Xie, Qing; Yu, Shoou I.; Wong, Wai Hung

    2015-01-01

    As the World Wide Web develops at an unprecedented pace, identifying web page genre has recently attracted increasing attention because of its importance in web search. A common approach for identifying genre is to use textual features that can be extracted directly from a web page, that is, On-Page features. The extracted features are subsequently inputted into a machine learning algorithm that will perform classification. However, these approaches may be ineffective when the web page contains limited textual information (e.g., the page is full of images). In this study, we address genre identification of web pages under the aforementioned situation. We propose a framework that uses On-Page features while simultaneously considering information in neighboring pages, that is, the pages that are connected to the original page by backward and forward links. We first introduce a graph-based model called GenreSim, which selects an appropriate set of neighboring pages. We then construct a multiple classifier combination module that utilizes information from the selected neighboring pages and On-Page features to improve performance in genre identification. Experiments are conducted on well-known corpora, and favorable results indicate that our proposed framework is effective, particularly in identifying web pages with limited textual information. © 2015 The Author(s)

  19. Methodology for performing measurements to release material from radiological control

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1993-09-01

    This report describes the existing and proposed methodologies for performing measurements of contamination prior to releasing material for uncontrolled use at the Hanford Site. The technical basis for the proposed methodology, a modification to the existing contamination survey protocol, is also described. The modified methodology, which includes a large-area swipe followed by a statistical survey, can be used to survey material that is unlikely to be contaminated for release to controlled and uncontrolled areas. The material evaluation procedure that is used to determine the likelihood of contamination is also described

  20. Methodological decisions and the evaluation of possible effects of different family structures on children: The new family structures survey (NFSS).

    Science.gov (United States)

    Schumm, Walter R

    2012-11-01

    Every social science researcher must make a number of methodological decisions when planning and implementing research projects. Each such decision carries with it both advantages and limitations. The decisions faced and made by Regnerus (2012) are discussed here in the wider context of social science literature regarding same-sex parenting. Even though the apparent outcomes of Regnerus's study were unpopular, the methodological decisions he made in the design and implementation of the New Family Structures Survey were not uncommon among social scientists, including many progressive, gay and lesbian scholars. These decisions and the research they produced deserve considerable and continued discussion, but criticisms of the underlying ethics and professionalism are misplaced because nearly every methodological decision that was made has ample precedents in research published by many other credible and distinguished scholars. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Insights into Facebook Pages: an early adolescent health research study page targeted at parents.

    Science.gov (United States)

    Amon, Krestina L; Paxton, Karen; Klineberg, Emily; Riley, Lisa; Hawke, Catherine; Steinbeck, Katharine

    2016-02-01

    Facebook has been used in health research, but there is a lack of literature regarding how Facebook may be used to recruit younger adolescents. A Facebook Page was created for an adolescent cohort study on the effects of puberty hormones on well-being and behaviour in early adolescence. Used as a communication tool with existing participants, it also aimed to alert potential participants to the study. The purpose of this paper is to provide a detailed description of the development of the study Facebook Page and present the fan response to the types of posts made on the Page using the Facebook-generated Insights data. Two types of posts were made on the study Facebook Page. The first type was study-related update posts and events. The second was relevant adolescent and family research and current news posts. Observations on the use of and response to the Page were made over 1 year across three phases (phase 1, very low Facebook use; phase 2, high Facebook use; phase 3, low Facebook use). Most Page fans were female (88.6%), with the largest group of fans aged between 35 and 44 years. Study-related update posts with photographs were the most popular. This paper provides a model on which other researchers could base Facebook communication and potential recruitment in the absence of established guidelines.

  2. Utility Estimation for Pediatric Vesicoureteral Reflux: Methodological Considerations Using an Online Survey Platform.

    Science.gov (United States)

    Tejwani, Rohit; Wang, Hsin-Hsiao S; Lloyd, Jessica C; Kokorowski, Paul J; Nelson, Caleb P; Routh, Jonathan C

    2017-03-01

    The advent of online task distribution has opened a new avenue for efficiently gathering community perspectives needed for utility estimation. Methodological consensus for estimating pediatric utilities is lacking, with disagreement over whom to sample, what perspective to use (patient vs parent) and whether instrument induced anchoring bias is significant. We evaluated what methodological factors potentially impact utility estimates for vesicoureteral reflux. Cross-sectional surveys using a time trade-off instrument were conducted via the Amazon Mechanical Turk® (https://www.mturk.com) online interface. Respondents were randomized to answer questions from child, parent or dyad perspectives on the utility of a vesicoureteral reflux health state and 1 of 3 "warm-up" scenarios (paralysis, common cold, none) before a vesicoureteral reflux scenario. Utility estimates and potential predictors were fitted to a generalized linear model to determine what factors most impacted utilities. A total of 1,627 responses were obtained. Mean respondent age was 34.9 years. Of the respondents 48% were female, 38% were married and 44% had children. Utility values were uninfluenced by child/personal vesicoureteral reflux/urinary tract infection history, income or race. Utilities were affected by perspective and were higher in the child group (34% lower in parent vs child, p pediatric conditions. Copyright © 2017 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  3. Basic Project Management Methodologies for Survey Researchers.

    Science.gov (United States)

    Beach, Robert H.

    To be effective, project management requires a heavy dependence on the document, list, and computational capability of a computerized environment. Now that microcomputers are readily available, only the rediscovery of classic project management methodology is required for improved resource allocation in small research projects. This paper provides…

  4. Finding Specification Pages from the Web

    Science.gov (United States)

    Yoshinaga, Naoki; Torisawa, Kentaro

    This paper presents a method of finding a specification page on the Web for a given object (e.g., ``Ch. d'Yquem'') and its class label (e.g., ``wine''). A specification page for an object is a Web page which gives concise attribute-value information about the object (e.g., ``county''-``Sauternes'') in well formatted structures. A simple unsupervised method using layout and symbolic decoration cues was applied to a large number of the Web pages to acquire candidate attributes for each class (e.g., ``county'' for a class ``wine''). We then filter out irrelevant words from the putative attributes through an author-aware scoring function that we called site frequency. We used the acquired attributes to select a representative specification page for a given object from the Web pages retrieved by a normal search engine. Experimental results revealed that our system greatly outperformed the normal search engine in terms of this specification retrieval.

  5. Monitoring HIV Testing in the United States: Consequences of Methodology Changes to National Surveys.

    Directory of Open Access Journals (Sweden)

    Michelle M Van Handel

    Full Text Available In 2011, the National Health Interview Survey (NHIS), an in-person household interview, revised the human immunodeficiency virus (HIV) section of the survey and the Behavioral Risk Factor Surveillance System (BRFSS), a telephone-based survey, added cellphone numbers to its sampling frame. We sought to determine how these changes might affect assessment of HIV testing trends. We used linear regression with pairwise contrasts with 2003-2013 data from NHIS and BRFSS to compare percentages of persons aged 18-64 years who reported HIV testing in landline versus cellphone-only households before and after 2011, when NHIS revised its in-person questionnaire and BRFSS added cellphone numbers to its telephone-based sample. In NHIS, the percentage of persons in cellphone-only households increased 13-fold from 2003 to 2013. The percentage ever tested for HIV was 6%-10% higher among persons in cellphone-only than landline households. The percentage ever tested for HIV increased significantly from 40.2% in 2003 to 45.0% in 2010, but was significantly lower in 2011 (40.6%) and 2012 (39.7%). In BRFSS, the percentage ever tested decreased significantly from 45.9% in 2003 to 40.2% in 2010, but increased to 42.9% in 2011 and 43.5% in 2013. HIV testing estimates were lower after NHIS questionnaire changes but higher after BRFSS methodology changes. Data before and after 2011 are not comparable, complicating assessment of trends.

  6. Classifying web pages with visual features

    NARCIS (Netherlands)

    de Boer, V.; van Someren, M.; Lupascu, T.; Filipe, J.; Cordeiro, J.

    2010-01-01

    To automatically classify and process web pages, current systems use the textual content of those pages, including both the displayed content and the underlying (HTML) code. However, a very important feature of a web page is its visual appearance. In this paper, we show that using generic visual

  7. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    Science.gov (United States)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals, lessons learned in the Web-to-database process (including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.

  8. Methodology of Young Minds Matter: The second Australian Child and Adolescent Survey of Mental Health and Wellbeing.

    Science.gov (United States)

    Hafekost, Jennifer; Lawrence, David; Boterhoven de Haan, Katrina; Johnson, Sarah E; Saw, Suzy; Buckingham, William J; Sawyer, Michael G; Ainley, John; Zubrick, Stephen R

    2016-09-01

    To describe the study design of Young Minds Matter: The second Australian Child and Adolescent Survey of Mental Health and Wellbeing. The aims of the study, sample design, development of survey content, field procedures and final questionnaires are detailed. During 2013-2014, a national household survey of the mental health and wellbeing of young people was conducted involving a sample of 6310 families selected at random from across Australia. The survey included a face-to-face diagnostic interview with parents/carers of 4- to 17-year-olds and a self-report questionnaire completed by young people aged 11-17 years. The overall response rate to the survey was 55% with 6310 parents/carers of eligible households participating in the survey. In addition, 2967 or 89% of young people aged 11-17 years in these participating households completed a questionnaire. The survey sample was found to be broadly representative of the Australian population on major demographic characteristics when compared with data from the Census of Population and Housing. However, adjustments were made for an over-representation of younger children aged 4 to 7 years and also families with more than one eligible child in the household. Young Minds Matter provides updated national prevalence estimates of common child and adolescent mental disorders, describes patterns of service use and will help to guide future decisions in the development of policy and provision of mental health services for children and adolescents. Advancements in interviewing methodology, addition of a data linkage component and informed content development contributed to improved breadth and quality of the data collected. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  9. The 2014 Survey on Living with Chronic Diseases in Canada on Mood and Anxiety Disorders: a methodological overview

    Directory of Open Access Journals (Sweden)

    S. O’Donnell

    2016-12-01

    Full Text Available Introduction: There is a paucity of information about the impact of mood and anxiety disorders on Canadians and the approaches used to manage them. To address this gap, the 2014 Survey on Living with Chronic Diseases in Canada–Mood and Anxiety Disorders Component (SLCDC-MA) was developed. The purpose of this paper is to describe the methodology of the 2014 SLCDC-MA and examine the sociodemographic characteristics of the final sample. Methods: The 2014 SLCDC-MA is a cross-sectional follow-up survey that includes Canadians from the 10 provinces aged 18 years and older with mood and/or anxiety disorders diagnosed by a health professional that are expected to last, or have already lasted, six months or more. The survey was developed by the Public Health Agency of Canada (PHAC) through an iterative, consultative process with Statistics Canada and external experts. Statistics Canada performed content testing, designed the sampling frame and strategies and collected and processed the data. PHAC used descriptive analyses to describe the respondents' sociodemographic characteristics, produced nationally representative estimates using survey weights provided by Statistics Canada, and generated variance estimates using bootstrap methodology. Results: The final 2014 SLCDC-MA sample consists of a total of 3361 respondents (68.9% response rate). Among Canadian adults with mood and/or anxiety disorders, close to two-thirds (64%) were female, over half (56%) were married/in a common-law relationship and 60% obtained a post-secondary education. Most were young or middle-aged (85%), Canadian born (88%), of non-Aboriginal status (95%), and resided in an urban setting (82%). Household income was fairly evenly distributed between the adequacy quintiles; however, individuals were more likely to report a household income adequacy within the lowest (23%) versus highest (17%) quintile. Forty-five percent reported having a mood disorder only, 24% an anxiety disorder only and 31

  10. Understanding pathways of exposure using site-specific habits surveys, particularly new pathways and methodologies

    International Nuclear Information System (INIS)

    Grzechnik, M.; McTaggart, K.; Clyne, F.

    2006-01-01

    Full text of publication follows: UK policy on the control of radiation exposure via routine discharges from nuclear licensed sites has long been based on ICRP recommendations that embody the principles of justification of practices, optimisation of protection, and dose limitation. Radiological protection of the public is based on the concept of a critical group of individuals. This group is defined as those people who, as a result of the area in which they reside and their habits, receive the highest radiation dose due to the operations of a site. Therefore, if the dose to this critical group is acceptable in relation to relevant dose limits and constraints, then other members of the public will receive lower doses. Thus, the principle of critical groups provides overall protection for the public. Surveys to determine local habits involve an integrated methodology, whereby the potential radioactive exposure pathways from liquid and gaseous discharges and direct radiation from the site are investigated. Surveys to identify these habits must be undertaken rigorously for consistency, and have been known to reveal unexpected pathways of radiation exposure. Pathways typically include consumption of local foodstuffs and external exposure. Furthermore, a number of critical groups may be identified within a single survey area if the habits of one group do not adequately describe those of the other inhabitants of the area. Survey preparation involves the initial identification of high producers and consumers of local foods in a geographically defined area surrounding the nuclear facility. Pathways can be broken down into three general groups, which include exposure arising from: 1) Terrestrial (gaseous) discharges surveyed within 5 km of the site 2) Direct radiation surveyed within 1 km of the site 3) Aquatic (liquid) discharges surveyed within local areas affected by the discharges, including seas, rivers and sewage works. The survey fieldwork involves interviewing members of the

  11. Degradable Systems: A Survey of Multistate System Theory.

    Science.gov (United States)

    1982-08-01

    Degradable Systems: A Survey of Multistate System Theory. Technical report by Emad El-Neweihi and Frank Proschan.

  12. The effect of new links on Google PageRank

    NARCIS (Netherlands)

    Avrachenkov, Konstatin; Litvak, Nelli

    2004-01-01

    PageRank is one of the principle criteria according to which Google ranks Web pages. PageRank can be interpreted as a frequency of visiting a Web page by a random surfer and thus it reflects the popularity of a Web page. We study the effect of newly created links on Google PageRank. We discuss to

  13. Development of residential-conservation-survey methodology for the US Air Force. Interim report. Task two

    Energy Technology Data Exchange (ETDEWEB)

    Abrams, D. W.; Hartman, T. L.; Lau, A. S.

    1981-11-13

    A US Air Force (USAF) Residential Energy Conservation Methodology was developed to compare USAF needs and available data to the procedures of the Residential Conservation Service (RCS) program as developed for general use by utility companies serving civilian customers. Attention was given to the data implications related to group housing, climatic data requirements, life-cycle cost analysis, energy saving modifications beyond those covered by RCS, and methods for utilizing existing energy consumption data in approaching the USAF survey program. Detailed information and summaries are given on the five subtasks of the program. Energy conservation alternatives are listed and the basic analysis techniques to be used in evaluating their thermal performance are described. (MCW)

  14. Importance of intrinsic and non-network contribution in PageRank centrality and its effect on PageRank localization

    OpenAIRE

    Deyasi, Krishanu

    2016-01-01

    PageRank centrality is used by Google for ranking web-pages to present search results for a user query. Here, we have shown that the PageRank value of a vertex also depends on its intrinsic, non-network contribution. If the intrinsic, non-network contributions of the vertices are proportional to their degrees or zero, then their PageRank centralities become proportional to their degrees. Some simulations and empirical data are used to support our study. In addition, we have shown that localization ...

  15. Analytical Chemistry as Methodology in Modern Pure and Applied Chemistry

    OpenAIRE

    Honjo, Takaharu

    2001-01-01

    Analytical chemistry is an indispensable methodology in pure and applied chemistry, which is often compared to a foundation stone of architecture. On the home page of jsac, it is said that analytical chemistry is a basic science concerned with the development of methods for obtaining useful chemical information about materials by means of detection, separation, and characterization. Analytical chemistry has recently developed into analytical sciences, which treats not only analysis ...

  16. Journal Articles Applying National Aquatic Resource Survey Data

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) data are being used and applied above and beyond the regional and national assessments. This page includes a list of recent journal articles that reference NARS data.

  17. Universal emergence of PageRank

    Energy Technology Data Exchange (ETDEWEB)

    Frahm, K M; Georgeot, B; Shepelyansky, D L, E-mail: frahm@irsamc.ups-tlse.fr, E-mail: georgeot@irsamc.ups-tlse.fr, E-mail: dima@irsamc.ups-tlse.fr [Laboratoire de Physique Theorique du CNRS, IRSAMC, Universite de Toulouse, UPS, 31062 Toulouse (France)

    2011-11-18

    The PageRank algorithm enables us to rank the nodes of a network through a specific eigenvector of the Google matrix, using a damping parameter α ∈ ]0, 1[. Using extensive numerical simulations of large web networks, with a special accent on British University networks, we determine numerically and analytically the universal features of the PageRank vector at its emergence when α → 1. The whole network can be divided into a core part and a group of invariant subspaces. For α → 1, PageRank converges to a universal power-law distribution on the invariant subspaces whose size distribution also follows a universal power law. The convergence of PageRank at α → 1 is controlled by eigenvalues of the core part of the Google matrix, which are extremely close to unity, leading to large relaxation times as, for example, in spin glasses. (paper)

  18. Universal emergence of PageRank

    International Nuclear Information System (INIS)

    Frahm, K M; Georgeot, B; Shepelyansky, D L

    2011-01-01

    The PageRank algorithm enables us to rank the nodes of a network through a specific eigenvector of the Google matrix, using a damping parameter α ∈ ]0, 1[. Using extensive numerical simulations of large web networks, with a special accent on British University networks, we determine numerically and analytically the universal features of the PageRank vector at its emergence when α → 1. The whole network can be divided into a core part and a group of invariant subspaces. For α → 1, PageRank converges to a universal power-law distribution on the invariant subspaces whose size distribution also follows a universal power law. The convergence of PageRank at α → 1 is controlled by eigenvalues of the core part of the Google matrix, which are extremely close to unity, leading to large relaxation times as, for example, in spin glasses. (paper)
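    For readers less familiar with the notation used in the two records above, the standard Google matrix construction that they build on can be written as follows; this is the textbook definition rather than a result of these papers:

        \( G = \alpha S + (1-\alpha)\,\frac{1}{N}\,e\,e^{T}, \qquad G\,\pi = \pi, \qquad \pi_i \ge 0, \quad \sum_i \pi_i = 1 \)

    where S is the column-stochastic hyperlink matrix (with dangling-node columns replaced by 1/N), e the vector of ones, N the number of nodes, α ∈ ]0, 1[ the damping parameter, and π the PageRank vector. The papers above study the behaviour of π in the limit α → 1.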

  19. Data Extraction Based on Page Structure Analysis

    Directory of Open Access Journals (Sweden)

    Ren Yichao

    2017-01-01

    Full Text Available The information we need is often dispersed and organized in differing structures. In addition, because of the existence of unstructured data such as natural language and images, extracting local content pages is extremely difficult. In light of the problems above, this article applies a method that combines a page structure analysis algorithm with a page data extraction algorithm to accomplish the gathering of network data. In this way, the problem that traditional, complex extraction models behave poorly when dealing with large-scale data is solved, and page data extraction efficiency is boosted to a new level. In the meantime, the article also compares the pages and content of different types extracted by the DOM-structure-based method with those obtained from the HTML regularities of distribution. From this comparison, a more efficient extraction method can be identified.

  20. A thorough spring-clean for CERN's Web pages

    CERN Multimedia

    2001-01-01

    This coming Tuesday will see the unveiling of CERN's new user pages on the Web. Their simplified layout and design will make everybody's lives a whole lot easier. Stand by for Tuesday 17 April when, as announced in the Weekly Bulletin of 2 April (n°14/2001), the newly-designed users' welcome page will be hitting our screens as the default CERN home page. But don't worry, if you've got the blues for the good old blue-green home page it's still in service and, to ensure a smooth transition, will be maintained in parallel until 25 May. But in all likelihood you'll be quickly won over by the new-look pages, which are so much simpler to use. Welcome to the new Web! The aim of this revamp, led by the WPE (Web Public Education) group, is to simplify and introduce a more logical hierarchy into the menus and welcome pages on CERN's Intranet. In a second stage, the 'General Public' pages will get a similar makeover. The fact is that the number of links on the user pages, and in particular the welcome page...

  1. How compliant are dental practice Facebook pages with Australian health care advertising regulations? A Netnographic review.

    Science.gov (United States)

    Holden, Acl; Spallek, H

    2018-03-01

    The National Law that regulates the dental and other health care professions in Australia sets out regulations that dictate how dental practices are to advertise. This study examines the extent to which the profession complies with these regulations and the potential impact that advertising may have upon professionalism. A Facebook search of 38 local government areas in Sydney, New South Wales, was carried out to identify dental practices that had pages on this social media site. A framework for assessment of compliance was developed using the regulatory guidelines and was used to conduct a netnographic review. Two hundred and sixty-six practice pages were identified from across the 38 regions. Of these pages, 71.05% were in breach of the National Law in their use of testimonials, 5.26% displayed misleading or false information, 4.14% displayed offers that had no clear terms and conditions or had inexact pricing, 19.55% had pictures or text that was likely to create unrealistic expectations of treatment benefit and 16.92% encouraged the indiscriminate and unnecessary utilization of health services. This study found that compliance with the National Law by the Facebook pages surveyed was poor. © 2017 Australian Dental Association.

  2. 5 CFR Appendix C to Subpart B of... - Appropriated Fund Wage and Survey Areas

    Science.gov (United States)

    2010-01-01

    ... of Columbia, Washington, DC Survey Area District of Columbia: Washington, DC Maryland: Charles... Illinois: Cook Du Page Kane Lake McHenry Will Area of Application. Survey area plus: Illinois: Boone De...: Harrison Jennings Scott Washington Louisiana Lake Charles-Alexandria Survey Area Louisiana: Allen...

  3. Page 5

    African Journals Online (AJOL)

    ezra

    Page 5. Stress Management By Library And Information Science Professionals In Nigerian University Libraries. BY ... relationships, and other considerations that can be ... Building a dynamic ... and maintaining current awareness of emerging.

  4. Probabilistic relation between In-Degree and PageRank

    NARCIS (Netherlands)

    Litvak, Nelli; Scheinhardt, Willem R.W.; Volkovich, Y.

    2008-01-01

    This paper presents a novel stochastic model that explains the relation between power laws of In-Degree and PageRank. PageRank is a popularity measure designed by Google to rank Web pages. We model the relation between PageRank and In-Degree through a stochastic equation, which is inspired by the

  5. A cross-sectional survey of 5-year-old children with non-syndromic unilateral cleft lip and palate: the Cleft Care UK study. Part 1: background and methodology.

    Science.gov (United States)

    Persson, M; Sandy, J R; Waylen, A; Wills, A K; Al-Ghatam, R; Ireland, A J; Hall, A J; Hollingworth, W; Jones, T; Peters, T J; Preston, R; Sell, D; Smallridge, J; Worthington, H; Ness, A R

    2015-11-01

    We describe the methodology for a major study investigating the impact of reconfigured cleft care in the United Kingdom (UK) 15 years after an initial survey, detailed in the Clinical Standards Advisory Group (CSAG) report in 1998, had informed government recommendations on centralization. This is a UK multicentre cross-sectional study of 5-year-olds born with non-syndromic unilateral cleft lip and palate. Children born between 1 April 2005 and 31 March 2007 were seen in cleft centre audit clinics. Consent was obtained for the collection of routine clinical measures (speech recordings, hearing, photographs, models, oral health, psychosocial factors) and anthropometric measures (height, weight, head circumference). The methodology for each clinical measure followed those of the earlier survey as closely as possible. We identified 359 eligible children and recruited 268 (74.7%) to the study. Eleven separate records for each child were collected at the audit clinics. In total, 2666 (90.4%) were collected from a potential 2948 records. The response rates for the self-reported questionnaires, completed at home, were 52.6% for the Health and Lifestyle Questionnaire and 52.2% for the Satisfaction with Service Questionnaire. Response rates and measures were similar to those achieved in the previous survey. There are practical, administrative and methodological challenges in repeating cross-sectional surveys 15 years apart and producing comparable data. © 2015 The Authors. Orthodontics & Craniofacial Research Published by John Wiley & Sons Ltd.

  6. A cross-sectional survey of 5-year-old children with non-syndromic unilateral cleft lip and palate: the Cleft Care UK study. Part 1: background and methodology

    Science.gov (United States)

    Persson, M; Sandy, J R; Waylen, A; Wills, A K; Al-Ghatam, R; Ireland, A J; Hall, A J; Hollingworth, W; Jones, T; Peters, T J; Preston, R; Sell, D; Smallridge, J; Worthington, H; Ness, A R

    2015-01-01

    Structured Abstract. Objectives: We describe the methodology for a major study investigating the impact of reconfigured cleft care in the United Kingdom (UK) 15 years after an initial survey, detailed in the Clinical Standards Advisory Group (CSAG) report in 1998, had informed government recommendations on centralization. Setting and Sample Population: This is a UK multicentre cross-sectional study of 5-year-olds born with non-syndromic unilateral cleft lip and palate. Children born between 1 April 2005 and 31 March 2007 were seen in cleft centre audit clinics. Materials and Methods: Consent was obtained for the collection of routine clinical measures (speech recordings, hearing, photographs, models, oral health, psychosocial factors) and anthropometric measures (height, weight, head circumference). The methodology for each clinical measure followed those of the earlier survey as closely as possible. Results: We identified 359 eligible children and recruited 268 (74.7%) to the study. Eleven separate records for each child were collected at the audit clinics. In total, 2666 (90.4%) were collected from a potential 2948 records. The response rates for the self-reported questionnaires, completed at home, were 52.6% for the Health and Lifestyle Questionnaire and 52.2% for the Satisfaction with Service Questionnaire. Conclusions: Response rates and measures were similar to those achieved in the previous survey. There are practical, administrative and methodological challenges in repeating cross-sectional surveys 15 years apart and producing comparable data. PMID:26567851

  7. 76 FR 33342 - Eastern States; Filing of Plats of Survey

    Science.gov (United States)

    2011-06-08

    ...] Eastern States; Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... published in the Federal Register, Volume 75, Number 174, on page 54910 a notice entitled ``Eastern States..., 2011 and the plat of survey accepted June 22, 2010, was officially filed in Eastern States Office...

  8. Personal and Public Start Pages in a library setting

    NARCIS (Netherlands)

    Kieft-Wondergem, Dorine

    Personal and Public Start Pages are web-based resources. With these kinds of tools it is possible to make your own free start page. A Start Page allows you to put all your web resources into one page, including blogs, email, podcasts, RSS feeds. It is possible to share the content of the page with

  9. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  10. Monte Carlo methods of PageRank computation

    NARCIS (Netherlands)

    Litvak, Nelli

    2004-01-01

    We describe and analyze an on-line Monte Carlo method of PageRank computation. The PageRank is estimated based on the results of a large number of short independent simulation runs initiated from each page that contains outgoing hyperlinks. The method does not require any storage of the hyperlink
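    The record above is truncated, but the general end-point Monte Carlo idea it refers to can be illustrated with a short sketch. This is a generic illustration rather than the authors' exact estimator: from every page we launch a number of independent random walks that continue along a random out-link with probability alpha and stop otherwise, and PageRank is estimated from where the walks end. The toy graph, alpha, the number of runs, and the choice to stop at dangling nodes are all assumptions made for the example.

        import random
        from collections import Counter

        def monte_carlo_pagerank(graph, alpha=0.85, runs_per_page=100, seed=0):
            """Estimate PageRank by recording where short random walks stop.

            graph: dict mapping each node to a list of its out-neighbours.
            From every node we start runs_per_page walks; each walk follows a
            random out-link with probability alpha and stops otherwise (a
            dangling node also stops the walk in this simplified version).
            """
            rng = random.Random(seed)
            nodes = list(graph)
            end_counts = Counter()
            total_walks = 0
            for start in nodes:
                for _ in range(runs_per_page):
                    node = start
                    while graph[node] and rng.random() < alpha:
                        node = rng.choice(graph[node])
                    end_counts[node] += 1
                    total_walks += 1
            # The fraction of walks ending at a node estimates its PageRank.
            return {n: end_counts[n] / total_walks for n in nodes}

        if __name__ == "__main__":
            toy_graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
            print(monte_carlo_pagerank(toy_graph))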

  11. Dark Energy Survey Year 1 Results: Methodology and Projections for Joint Analysis of Galaxy Clustering, Galaxy Lensing, and CMB Lensing Two-point Functions

    Energy Technology Data Exchange (ETDEWEB)

    Giannantonio, T.; et al.

    2018-02-14

    Optical imaging surveys measure both the galaxy density and the gravitational lensing-induced shear fields across the sky. Recently, the Dark Energy Survey (DES) collaboration used a joint fit to two-point correlations between these observables to place tight constraints on cosmology (DES Collaboration et al. 2017). In this work, we develop the methodology to extend the DES Collaboration et al. (2017) analysis to include cross-correlations of the optical survey observables with gravitational lensing of the cosmic microwave background (CMB) as measured by the South Pole Telescope (SPT) and Planck. Using simulated analyses, we show how the resulting set of five two-point functions increases the robustness of the cosmological constraints to systematic errors in galaxy lensing shear calibration. Additionally, we show that contamination of the SPT+Planck CMB lensing map by the thermal Sunyaev-Zel'dovich effect is a potentially large source of systematic error for two-point function analyses, but show that it can be reduced to acceptable levels in our analysis by masking clusters of galaxies and imposing angular scale cuts on the two-point functions. The methodology developed here will be applied to the analysis of data from the DES, the SPT, and Planck in a companion work.

  12. PageRank in scale-free random graphs

    NARCIS (Netherlands)

    Chen, Ningyuan; Litvak, Nelli; Olvera-Cravioto, Mariana; Bonata, Anthony; Chung, Fan; Pralat, Paweł

    2014-01-01

    We analyze the distribution of PageRank on a directed configuration model and show that as the size of the graph grows to infinity, the PageRank of a randomly chosen node can be closely approximated by the PageRank of the root node of an appropriately constructed tree. This tree approximation is in

  13. Banner Pages on the New Printing Infrastructure

    CERN Multimedia

    2006-01-01

    Changes to the printing service were announced in CERN Bulletin No. 37-38/2006. In the new infrastructure, the printing of the banner page has been disabled in order to reduce paper consumption. Statistics show that the average print job size is small and the paper savings by not printing the banner page could be up to 20 %. When each printer is moved onto the new infrastructure banner page printing will be disabled. In the case of corridor printers which are shared by several users, the Helpdesk can re-enable banner page printing upon request. We hope ultimately to arrive at a situation where banner page printing is enabled on fewer than 10% of printers registered on the network. You can still print banner pages on printers where it has been centrally disabled by using Linux. Simply add it to your print job on the client side by adding the -o job-sheets option to your lpr command. Detailed documentation is available on each SLC3/4 under the following link: http://localhost:631/sum.html#4_2 Please bea...

  14. Women's Pages or People's Pages: The Production of News for Women in the "Washington Post" in the 1950s.

    Science.gov (United States)

    Yang, Mei-ling

    1996-01-01

    Examines the women's pages of the "Washington Post" in the 1950s that were edited by Marie Sauer. States that the newspaper turned down Sauer's request in 1952 to change from traditional women's pages to a unisex "lifestyle" section. Analyzes how women's pages were shaped by factors such as advertising, professional values, and…

  15. Alcohol- and Drug-Involved Driving in the United States: Methodology for the 2007 National Roadside Survey

    Science.gov (United States)

    Lacey, John H.; Kelley-Baker, Tara; Voas, Robert B.; Romano, Eduardo; Furr-Holden, C. Debra; Torres, Pedro; Berning, Amy

    2013-01-01

    This article describes the methodology used in the 2007 U.S. National Roadside Survey to estimate the prevalence of alcohol- and drug-impaired driving and alcohol- and drug-involved driving. This study involved randomly stopping drivers at 300 locations across the 48 continental U.S. states at sites selected through a stratified random sampling procedure. Data were collected during a 2-hour Friday daytime session at 60 locations and during 2-hour nighttime weekend periods at 240 locations. Both self-report and biological measures were taken. Biological measures included breath alcohol measurements from 9,413 respondents, oral fluid samples from 7,719 respondents, and blood samples from 3,276 respondents. PMID:21997324

  16. Using Power-Law Degree Distribution to Accelerate PageRank

    Directory of Open Access Journals (Sweden)

    Zhaoyan Jin

    2012-12-01

    Full Text Available The PageRank vector of a network is very important, for it can reflect the importance of a Web page in the World Wide Web, or of a person in a social network. However, with the growth of the World Wide Web and of social networks, computing the PageRank vector of a network takes more and more time. In many real-world applications, the degree and PageRank distributions of these complex networks conform to the Power-Law distribution. This paper utilizes the degree distribution of a network to initialize its PageRank vector, and presents a power-law degree distribution algorithm for accelerating PageRank computation. Experiments on four real-world datasets show that the proposed algorithm converges more quickly than the original PageRank algorithm.
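    As a rough sketch of the acceleration idea described above (not the authors' implementation), the code below starts the PageRank power iteration from the normalized degree distribution instead of the uniform vector; when degrees and PageRank follow similar power laws, this starting point is already close to the fixed point and fewer iterations are needed. The graph format, the use of out-degree as the initial weight, alpha and the tolerance are assumptions of the example.

        def pagerank_degree_init(graph, alpha=0.85, tol=1e-8, max_iter=200):
            """Power iteration for PageRank, started from the degree distribution.

            graph: dict mapping each node to a list of its out-neighbours.
            """
            nodes = list(graph)
            n = len(nodes)
            # Start from the normalized (out-)degree distribution instead of 1/n.
            degree = {v: max(len(graph[v]), 1) for v in nodes}
            total_degree = sum(degree.values())
            rank = {v: degree[v] / total_degree for v in nodes}

            for _ in range(max_iter):
                new_rank = {v: (1.0 - alpha) / n for v in nodes}
                for v in nodes:
                    out = graph[v]
                    if out:
                        share = alpha * rank[v] / len(out)
                        for w in out:
                            new_rank[w] += share
                    else:
                        # Dangling node: spread its mass uniformly over all nodes.
                        share = alpha * rank[v] / n
                        for w in nodes:
                            new_rank[w] += share
                change = sum(abs(new_rank[v] - rank[v]) for v in nodes)
                rank = new_rank
                if change < tol:
                    break
            return rank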

  17. Comparing classical and quantum PageRanks

    Science.gov (United States)

    Loke, T.; Tang, J. W.; Rodriguez, J.; Small, M.; Wang, J. B.

    2017-01-01

    Following recent developments in quantum PageRanking, we present a comparative analysis of discrete-time and continuous-time quantum-walk-based PageRank algorithms. Relative to classical PageRank and to different extents, the quantum measures better highlight secondary hubs and resolve ranking degeneracy among peripheral nodes for all networks we studied in this paper. For the discrete-time case, we investigated the periodic nature of the walker's probability distribution for a wide range of networks and found that the dominant period does not grow with the size of these networks. Based on this observation, we introduce a new quantum measure using the maximum probabilities of the associated walker during the first couple of periods. This is particularly important, since it leads to a quantum PageRanking scheme that is scalable with respect to network size.

  18. The ATLAS Public Web Pages: Online Management of HEP External Communication Content

    CERN Document Server

    Goldfarb, Steven; Phoboo, Abha Eli; Shaw, Kate

    2015-01-01

    The ATLAS Education and Outreach Group is in the process of migrating its public online content to a professionally designed set of web pages built on the Drupal content management system. Development of the front-end design passed through several key stages, including audience surveys, stakeholder interviews, usage analytics, and a series of fast design iterations, called sprints. Implementation of the web site involves application of the html design using Drupal templates, refined development iterations, and the overall population of the site with content. We present the design and development processes and share the lessons learned along the way, including the results of the data-driven discovery studies. We also demonstrate the advantages of selecting a back-end supported by content management, with a focus on workflow. Finally, we discuss usage of the new public web pages to implement outreach strategy through implementation of clearly presented themes, consistent audience targeting and messaging, and th...

  19. Optimizing TLB entries for mixed page size storage in contiguous memory

    Science.gov (United States)

    Chen, Dong; Gara, Alan; Giampapa, Mark E.; Heidelberger, Philip; Kriegel, Jon K.; Ohmacht, Martin; Steinmacher-Burow, Burkhard

    2013-04-30

    A system and method for accessing memory are provided. The system comprises a lookup buffer for storing one or more page table entries, wherein each of the one or more page table entries comprises at least a virtual page number and a physical page number; a logic circuit for receiving a virtual address from said processor, said logic circuit for matching the virtual address to the virtual page number in one of the page table entries to select the physical page number in the same page table entry, said page table entry having one or more bits set to exclude a memory range from a page.
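    A concrete translation example may make the patent-style description above easier to follow. The sketch below is a purely illustrative software model of a TLB-style lookup supporting mixed page sizes (4 KiB and 1 MiB pages chosen arbitrarily); it does not reproduce the patented hardware, and the range-exclusion bits mentioned in the abstract are omitted.

        from dataclasses import dataclass

        @dataclass
        class TlbEntry:
            virtual_page: int   # virtual page number
            physical_page: int  # physical page number
            page_bits: int      # log2(page size): 12 for 4 KiB, 20 for 1 MiB

        def translate(tlb, virtual_address):
            """Translate a virtual address using a list of TLB entries.

            Entries may describe different page sizes; the offset width of each
            entry is page_bits.  Returns the physical address, or raises
            KeyError on a TLB miss.
            """
            for entry in tlb:
                if virtual_address >> entry.page_bits == entry.virtual_page:
                    offset = virtual_address & ((1 << entry.page_bits) - 1)
                    return (entry.physical_page << entry.page_bits) | offset
            raise KeyError("TLB miss for address 0x%x" % virtual_address)

        if __name__ == "__main__":
            tlb = [TlbEntry(virtual_page=0x12345, physical_page=0x00abc, page_bits=12),
                   TlbEntry(virtual_page=0x00007, physical_page=0x00042, page_bits=20)]
            print(hex(translate(tlb, 0x12345678)))  # falls in the 4 KiB page
            print(hex(translate(tlb, 0x00712345)))  # falls in the 1 MiB page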

  20. The Importance of Prior Probabilities for Entry Page Search

    NARCIS (Netherlands)

    Kraaij, W.; Westerveld, T.H.W.; Hiemstra, Djoerd

    An important class of searches on the world-wide-web has the goal to find an entry page (homepage) of an organisation. Entry page search is quite different from Ad Hoc search. Indeed a plain Ad Hoc system performs disappointingly. We explored three non-content features of web pages: page length,

  1. Coming to Life: A Review of Movie Comics: Page to Screen/Screen to Page

    OpenAIRE

    Labarre, Nicolas

    2017-01-01

    This book review provides an overview of 'Movie Comics: Page to Screen/Screen to Page' by Blair Davis (Rutgers University Press, 2017), a book which examines the reciprocal adaptations of film into comics and comics into films from 1930 to 1960. This review argues that 'Movie Comics' provides a useful and finely-textured cultural history of that phenomenon, which helps contextualize scholarly studies of contemporary adaptations and transmedia constructions.

  2. An Improved Approach to the PageRank Problems

    Directory of Open Access Journals (Sweden)

    Yue Xie

    2013-01-01

    Full Text Available We introduce a partition of the web pages particularly suited to the PageRank problems in which the web link graph has a nested block structure. Based on this partition of the web pages into dangling nodes, common nodes, and general nodes, the hyperlink matrix can be reordered into a simpler block structure. Then, based on a parallel computation method, we propose an algorithm for the PageRank problems. In this algorithm, the dimension of the linear system becomes smaller, and the vector for general nodes in each block can be calculated separately in every iteration. Numerical experiments show that this approach speeds up the computation of PageRank.

  3. JERHRE's New Web Pages.

    Science.gov (United States)

    2006-06-01

    JERHRE'S WEBSITE, www.csueastbay.edu/JERHRE/ has two new pages. One of those pages is devoted to curriculum that may be used to educate students, investigators and ethics committee members about issues in the ethics of human subjects research, and to evaluate their learning. It appears at www.csueastbay.edu/JERHRE/cur.html. The other is devoted to emailed letters from readers. Appropriate letters will be posted as soon as they are received by the editor. Letters from readers appear at www.csueastbay.edu/JERHRE/let.html.

  4. Socially responsible ethnobotanical surveys in the Cape Floristic Region: ethical principles, methodology and quantification of data

    Directory of Open Access Journals (Sweden)

    Ben-Erik Van Wyk

    2012-03-01

    Full Text Available A broad overview of published and unpublished ethnobotanical surveys in the Cape Floristic Region (the traditional home of the San and Khoi communities) shows that the data is incomplete. There is an urgent need to record the rich indigenous knowledge about plants in a systematic and socially responsible manner in order to preserve this cultural and scientific heritage for future generations. Improved methods for quantifying data are introduced, with special reference to the simplicity and benefits of the new Matrix Method. This methodology prevents or reduces the number of false negatives, and also ensures the participation of elderly people who might be immobile. It also makes it possible to compare plant uses in different local communities. This method enables the researcher to quantify the knowledge on plant use that was preserved in a community, and to determine the relative importance of a specific plant in a more objective way. Ethical considerations for such ethnobotanical surveys are discussed, through the lens of current ethical codes and international conventions. This is an accessible approach, which can also be used in the life sciences classroom.

  5. Google Analytics: Single Page Traffic Reports

    Science.gov (United States)

    These are pages that live outside of Google Analytics (GA) but allow you to view GA data for any individual page on either the public EPA web or EPA intranet. You do need to log in to Google Analytics to view them.

  6. Uniform Page Migration Problem in Euclidean Space

    Directory of Open Access Journals (Sweden)

    Amanj Khorramian

    2016-08-01

    Full Text Available The page migration problem in Euclidean space is revisited. In this problem, online requests occur at any location to access a single page located at a server. Every request must be served, and the server has the choice to migrate from its current location to a new location in space. Each service costs the Euclidean distance between the server and request. A migration costs the distance between the former and the new server location, multiplied by the page size. We study the problem in the uniform model, in which the page has size D = 1. All request locations are not known in advance; however, they are sequentially presented in an online fashion. We design a 2.75-competitive online algorithm that improves the current best upper bound for the problem with the unit page size. We also provide a lower bound of 2.732 for our algorithm. It was already known that 2.5 is a lower bound for this problem.
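    To make the cost model above concrete, the sketch below simulates an online strategy in the uniform model (page size D = 1): each request costs the Euclidean distance from the server to the request, and a migration costs the distance moved times the page size. The rule of moving a fixed fraction toward each request is only an illustration of how costs accumulate; it is not the 2.75-competitive algorithm designed in the paper.

        import math

        def serve_requests(requests, start=(0.0, 0.0), move_fraction=0.5, page_size=1.0):
            """Accumulate service and migration costs for a page in the plane.

            Each request costs the Euclidean distance between the server and the
            request point; a migration costs page_size times the distance moved.
            Moving a fixed fraction toward each request is only an illustration
            of the cost model, not a competitive algorithm.
            """
            def dist(p, q):
                return math.hypot(p[0] - q[0], p[1] - q[1])

            server = start
            total_cost = 0.0
            for r in requests:
                total_cost += dist(server, r)  # service cost
                new_server = (server[0] + move_fraction * (r[0] - server[0]),
                              server[1] + move_fraction * (r[1] - server[1]))
                total_cost += page_size * dist(server, new_server)  # migration cost
                server = new_server
            return total_cost

        if __name__ == "__main__":
            print(serve_requests([(1.0, 0.0), (1.0, 1.0), (0.0, 0.0)]))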

  7. USGS Methodology for Assessing Continuous Petroleum Resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy A.

    2011-01-01

    The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.

  8. The Health Behaviour in School-aged Children (HBSC) study: methodological developments and current tensions

    DEFF Research Database (Denmark)

    Roberts, Chris; Freeman, John; Samdal, Oddrun

    2009-01-01

    OBJECTIVES: To describe the methodological development of the HBSC survey since its inception and explore methodological tensions that need to be addressed in the ongoing work on this and other large-scale cross-national surveys. METHODS: Using archival data and conversations with members of the network, we collaboratively analysed our joint understandings of the survey's methodology. RESULTS: We identified four tensions that are likely to be present in upcoming survey cycles: (1) maintaining quality standards against a background of rapid growth, (2) continuous improvement with limited financial... in working through such challenges renders it likely that HBSC can provide a model for other similar studies facing these tensions....

  9. Web page classification on child suitability

    NARCIS (Netherlands)

    C. Eickhoff (Carsten); P. Serdyukov; A.P. de Vries (Arjen)

    2010-01-01

    Children spend significant amounts of time on the Internet. Recent studies showed that during these periods they are often not under adult supervision. This work presents an automatic approach to identifying suitable web pages for children based on topical and non-topical web page

  10. Coming to Life: A Review of Movie Comics: Page to Screen/Screen to Page

    Directory of Open Access Journals (Sweden)

    Nicolas Labarre

    2017-03-01

    Full Text Available This book review provides an overview of 'Movie Comics: Page to Screen/Screen to Page' by Blair Davis (Rutgers University Press, 2017), a book which examines the reciprocal adaptations of film into comics and comics into films from 1930 to 1960. This review argues that 'Movie Comics' provides a useful and finely-textured cultural history of that phenomenon, which helps contextualize scholarly studies of contemporary adaptations and transmedia constructions.

  11. The ICAP (Interactive Course Assignment Pages Publishing System

    Directory of Open Access Journals (Sweden)

    Kim Griggs

    2008-03-01

    Full Text Available The ICAP publishing system is an open source custom content management system that enables librarians to easily and quickly create and manage library help pages for course assignments (ICAPs), without requiring knowledge of HTML or other web technologies. The system's unique features include an emphasis on collaboration and content reuse and an easy-to-use interface that includes in-line help, simple forms and drag and drop functionality. The system generates dynamic, attractive course assignment pages that blend Web 2.0 features with traditional library resources, and makes the pages easier to find by providing a central web page for the course assignment pages. As of December 2007, the code is available as free, open-source software under the GNU General Public License.

  12. Discovering author impact: A PageRank perspective

    OpenAIRE

    Yan, Erjia; Ding, Ying

    2010-01-01

    This article provides an alternative perspective for measuring author impact by applying PageRank algorithm to a coauthorship network. A weighted PageRank algorithm considering citation and coauthorship network topology is proposed. We test this algorithm under different damping factors by evaluating author impact in the informetrics research community. In addition, we also compare this weighted PageRank with the h-index, citation, and program committee (PC) membership of the International So...

  13. Comparing econometric and survey-based methodologies in measuring offshoring

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    such as the national or regional level. Most macro analyses are based on proxies and trade statistics with limitations. Drawing on unique Danish survey data, this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of employment...

  14. Page 28

    African Journals Online (AJOL)

    ezra

    7 (2)2007. Page 28. Serials Management In Polytechnic Libraries in Nigeria: A Comparative ... Despite the strategic position of serials publications amongst the materials .... the formulation of routines and procedures for ..... other professional librarian in the section classify ... probably due to dwindling finance given to both.

  15. Improved USGS methodology for assessing continuous petroleum resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy A.

    2010-01-01

    This report presents an improved methodology for estimating volumes of continuous (unconventional) oil and gas resources within the United States and around the world. The methodology is based on previously developed U.S. Geological Survey methodologies that rely on well-scale production data. Improvements were made primarily to how the uncertainty about estimated ultimate recoveries is incorporated in the estimates. This is particularly important when assessing areas with sparse or no production data, because the new methodology allows better use of analog data from areas with significant discovery histories.

  16. The (Untold) Drama of the Turning Page: The Role of Page Breaks in Understanding Picture Books

    Science.gov (United States)

    Jacobs, Katrina Emily Bartow

    2016-01-01

    While scholars have recognized the importance of page breaks in both the construction and comprehension of narrative within picture books, there has previously been limited research that focused directly on how children discuss and make sense of these spaces in the text. Yet, because of their nature as dramatic gaps in the narrative, page breaks…

  17. Metadata Schema Used in OCLC Sampled Web Pages

    Directory of Open Access Journals (Sweden)

    Fei Yu

    2005-12-01

    Full Text Available The tremendous growth of Web resources has made information organization and retrieval more and more difficult. As one approach to this problem, metadata schemas have been developed to characterize Web resources. However, many questions have been raised about the use of metadata schemas, such as: which metadata schemas have been used on the Web? How did they describe Web accessible information? What is the distribution of these metadata schemas among Web pages? Do certain schemas dominate the others? To address these issues, this study analyzed 16,383 Web pages with meta tags extracted from 200,000 OCLC sampled Web pages in 2000. It found that only 8.19% of Web pages used meta tags; description tags, keyword tags, and Dublin Core tags were the only three schemas used in the Web pages. This article revealed the use of meta tags in terms of their function distribution, syntax characteristics, granularity of the Web pages, and the length distribution and word number distribution of both description and keyword tags.

  18. HIV / AIDS prevalence testing - merits, methodology and outcomes ...

    African Journals Online (AJOL)

    HIV / AIDS prevalence testing - merits, methodology and outcomes of a survey conducted at a large mining organisation in South Africa. ... These baseline prevalence data also provide an opportunity for monitoring of proposed interventions using cross-sectional surveys at designated intervals in the future. South African ...

  19. 40 CFR 1502.7 - Page limits.

    Science.gov (United States)

    2010-07-01

    40 CFR § 1502.7, Page limits (Protection of Environment; Council on Environmental Quality; Environmental Impact Statement): The text of final environmental impact statements (e.g., paragraphs (d) through (g) of § 1502.10...

  20. Page Recognition: Quantum Leap In Recognition Technology

    Science.gov (United States)

    Miller, Larry

    1989-07-01

    No milestone has proven as elusive as the always-approaching "year of the LAN," but the "year of the scanner" might claim the silver medal. Desktop scanners have been around almost as long as personal computers. And everyone thinks they are used for obvious desktop-publishing and business tasks like scanning business documents, magazine articles and other pages, and translating those words into files your computer understands. But, until now, the reality fell far short of the promise. Because it's true that scanners deliver an accurate image of the page to your computer, but the software to recognize this text has been woefully disappointing. Old optical-character recognition (OCR) software recognized such a limited range of pages as to be virtually useless to real users. (For example, one OCR vendor specified 12-point Courier font from an IBM Selectric typewriter: the same font in 10-point, or from a Diablo printer, was unrecognizable!) Computer dealers have told me the chasm between OCR expectations and reality is so broad and deep that nine out of ten prospects leave their stores in disgust when they learn the limitations. And this is a very important, very unfortunate gap. Because the promise of recognition -- what people want it to do -- carries with it tremendous improvements in our productivity and ability to get tons of written documents into our computers where we can do real work with it. The good news is that a revolutionary new development effort has led to the new technology of "page recognition," which actually does deliver the promise we've always wanted from OCR. I'm sure every reader appreciates the breakthrough represented by the laser printer and page-makeup software, a combination so powerful it created new reasons for buying a computer. A similar breakthrough is happening right now in page recognition: the Macintosh (and, I must admit, other personal computers) equipped with a moderately priced scanner and OmniPage software (from Caere

  1. Instant PageSpeed optimization

    CERN Document Server

    Jaiswal, Sanjeev

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. Instant PageSpeed Optimization is a hands-on guide that provides a number of clear, step-by-step exercises for optimizing your websites for better performance and improving their efficiency.Instant PageSpeed Optimization is aimed at website developers and administrators who wish to make their websites load faster without any errors and consume less bandwidth. It's assumed that you will have some experience in basic web technologies like HTML, CSS3, JavaScript, and the basics of netw

  2. An Efficient PageRank Approach for Urban Traffic Optimization

    Directory of Open Access Journals (Sweden)

    Florin Pop

    2012-01-01

    The approach determines optimal decisions for each traffic light based on the solution given by Larry Page for page ranking in the Web environment (Page et al., 1999). Our approach is similar to the work presented by Sheng-Chung et al. (2009) and Yousef et al. (2010). We consider that the traffic lights are controlled by servers; a score for each road is computed with an efficient PageRank approach and is used in a cost function to determine optimal decisions. We demonstrate that the cumulative contribution of each car in the traffic respects the main constraint of the PageRank approach, preserving all the properties of the matrix considered in our model.
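    The road-scoring step rests on the standard PageRank recursion. The following power-iteration sketch shows that recursion on a toy graph; the road network, damping factor and iteration count are illustrative assumptions, not the authors' traffic-control model.

        # Minimal power-iteration PageRank sketch over a toy "road" graph
        # (illustrative assumptions; not the paper's traffic model).
        def pagerank(links, damping=0.85, iterations=100):
            nodes = list(links)
            n = len(nodes)
            rank = {u: 1.0 / n for u in nodes}
            for _ in range(iterations):
                new_rank = {u: (1.0 - damping) / n for u in nodes}
                for u in nodes:
                    out = links[u]
                    if not out:                          # dangling node: spread uniformly
                        for v in nodes:
                            new_rank[v] += damping * rank[u] / n
                    else:
                        for v in out:
                            new_rank[v] += damping * rank[u] / len(out)
                rank = new_rank
            return rank

        roads = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
        print(pagerank(roads))   # a higher score marks a more "important" road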

  3. Educational use of World Wide Web pages on CD-ROM.

    Science.gov (United States)

    Engel, Thomas P; Smith, Michael

    2002-01-01

    The World Wide Web is increasingly important for medical education. Internet served pages may also be used on a local hard disk or CD-ROM without a network or server. This allows authors to reuse existing content and provide access to users without a network connection. CD-ROM offers several advantages over network delivery of Web pages for several applications. However, creating Web pages for CD-ROM requires careful planning. Issues include file names, relative links, directory names, default pages, server created content, image maps, other file types and embedded programming. With care, it is possible to create server based pages that can be copied directly to CD-ROM. In addition, Web pages on CD-ROM may reference Internet served pages to provide the best features of both methods.

  4. Survey of Transmission Cost Allocation Methodologies for Regional Transmission Organizations

    Energy Technology Data Exchange (ETDEWEB)

    Fink, S.; Porter, K.; Mudd, C.; Rogers, J.

    2011-02-01

    The report presents transmission cost allocation methodologies for reliability transmission projects, generation interconnection, and economic transmission projects for all Regional Transmission Organizations.

  5. Soil-Web: An online soil survey for California, Arizona, and Nevada

    Science.gov (United States)

    Beaudette, D. E.; O'Geen, A. T.

    2009-10-01

    Digital soil survey products represent one of the largest and most comprehensive inventories of soils information currently available. The complex structure of these databases, intensive use of codes and scientific jargon make it difficult for non-specialists to utilize digital soil survey resources. A project was initiated to construct a web-based interface to digital soil survey products (STATSGO and SSURGO) for California, Arizona, and Nevada that would be accessible to the general public. A collection of mature, open source applications (including Mapserver, PostGIS and Apache Web Server) was used as a framework to support data storage, querying, map composition, data presentation, and contextual links to related materials. Application logic was written in the PHP language to "glue" together the many components of an online soil survey. A comprehensive website (http://casoilresource.lawr.ucdavis.edu/map) was created to facilitate access to digital soil survey databases through several interfaces including: interactive map, Google Earth and HTTP-based application programming interface (API). Each soil polygon is linked to a map unit summary page, which includes links to soil component summary pages. The most commonly used soil properties, land interpretations and ratings are presented. Graphical and tabular summaries of soil profile information are dynamically created, and aid with rapid assessment of key soil properties. Quick links to official series descriptions (OSD) and other such information are presented. All terminology is linked back to the USDA-NRCS Soil Survey Handbook which contains extended definitions. The Google Earth interface to Soil-Web can be used to explore soils information in three dimensions. A flexible web API was implemented to allow advanced users of soils information to access our website via simple web page requests. Soil-Web has been successfully used in soil science curriculum, outreach activities, and current research projects.

  6. Mobile Technology Use by People Experiencing Multiple Sclerosis Fatigue: Survey Methodology.

    Science.gov (United States)

    Van Kessel, Kirsten; Babbage, Duncan R; Reay, Nicholas; Miner-Williams, Warren M; Kersten, Paula

    2017-02-28

    Fatigue is one of the most commonly reported symptoms of multiple sclerosis (MS). It has a profound impact on all spheres of life, for people with MS and their relatives. It is one of the key precipitants of early retirement. Individual, group, and Internet cognitive behavioral therapy-based approaches to supporting people with MS to manage their fatigue have been shown to be effective. The aim of this project was to (1) survey the types of mobile devices and level of Internet access people with MS use or would consider using for a health intervention and (2) characterize the levels of fatigue severity and their impact experienced by the people in our sample to provide an estimate of fatigue severity of people with MS in New Zealand. The ultimate goal of this work was to support the future development of a mobile intervention for the management of fatigue for people with MS. Survey methodology using an online questionnaire was used to assess people with MS. A total of 51 people with MS participated. The average age was 48.5 years, and the large majority of the sample (77%) was female. Participants reported significant levels of fatigue as measured with the summary score of the Neurological Fatigue Index (mean 31.4 [SD 5.3]). Most (84%) respondents scored on average more than 3 on the fatigue severity questions, reflecting significant fatigue. Mobile phone usage was high with 86% of respondents reporting having a mobile phone; apps were used by 75% of respondents. Most participants (92%) accessed the Internet from home. New Zealand respondents with MS experienced high levels of both fatigue severity and fatigue impact. The majority of participants have a mobile device and access to the Internet. These findings, along with limited access to face-to-face cognitive behavioral therapy-based interventions, create an opportunity to develop a mobile technology platform for delivering a cognitive behavioral therapy-based intervention to decrease the severity and impact of

  7. (Per)Forming Archival Research Methodologies

    Science.gov (United States)

    Gaillet, Lynee Lewis

    2012-01-01

    This article raises multiple issues associated with archival research methodologies and methods. Based on a survey of recent scholarship and interviews with experienced archival researchers, this overview of the current status of archival research both complicates traditional conceptions of archival investigation and encourages scholars to adopt…

  8. A survey of psychiatrists' attitudes toward treatment guidelines.

    Science.gov (United States)

    Healy, Daniel J; Goldman, Mona; Florence, Timothy; Milner, Karen K

    2004-04-01

    We developed a survey to look at psychiatrists' attitudes toward psychotropic prescribing guidelines, specifically the Texas Medication Algorithm Project (TMAP) algorithms. The 22-page survey was distributed to 24 psychiatrists working in 4 CMHCs; 13 completed the survey. 90% agreed that guidelines should be general and flexible. The majority also agreed that guidelines should define how to measure response to a specific agent; fewer agreed that guidelines should specify dosage, side effect management, or augmentation strategies. Psychiatrists were familiar with TMAP; none referred to it in their practice. In spite of this, psychiatrists' medication preferences were similar to those suggested by guidelines.

  9. Migrating Multi-page Web Applications to Single-page AJAX Interfaces

    NARCIS (Netherlands)

    Mesbah, A.; Van Deursen, A.

    2006-01-01

    Recently, a new web development technique for creating interactive web applications, dubbed AJAX, has emerged. In this new model, the single-page web interface is composed of individual components which can be updated/replaced independently. With the rise of AJAX web applications classical

  10. Improving Training in Methodology Enriches the Science of Psychology

    Science.gov (United States)

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2009-01-01

    Replies to the comment "Ramifications of increased training in quantitative methodology" by Herbert Zimiles on the current authors' original article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America". The…

  11. Making a Historical Survey of a State's Nuclear Ambitions. Impact of Historical Developments of a State's National Nuclear Non-Proliferation Policy on Additional Protocol Implementation

    International Nuclear Information System (INIS)

    Jonter, Thomas

    2003-03-01

    and training researchers and officials about to start nationally based surveys. In chapter 4 the method is described. Two training courses have already been conducted; based on the experiences from these courses, the methodology will be explained and discussed. The fourth objective of this report is to make a competence profile of prospective co-operative partners in Sweden and in other countries, who can either be used to develop training programs or assist in carrying out the historical surveys. This task is dealt with in chapter 5. Lastly, the fifth objective is to compile a list of databases, literature and home pages dealing with reviews of certain States' nuclear energy and nuclear weapons research. The compilation concentrates on databases, literature and home pages that specifically concern survey activities in a comprehensive perspective. This task is dealt with in chapter 6.

  12. Assessing the quantified impact of a hybrid POGIL methodology on student averages in a forensic science survey course

    Science.gov (United States)

    Meeks, Tyna L.

    A causal-comparative/quasi-experimental study examined the effect of incorporating a hybrid teaching methodology that blended lecture with Process Oriented Guided Inquiry Lessons (POGILs) on the overall academic achievement of a diverse student body in a large lecture setting. Additional considerations included student gender, ethnicity, declared major (STEM or non-STEM), and SAT scores. An evaluation of the effect that these characteristics had on student achievement, due to the differing importance placed on the use of POGILs as a learning tool, was included. This study used data obtained from a longitudinal examination of eight years of student data from an introductory forensic science survey course offered at an R1 northeastern university. This study addressed the effectiveness of applying a prescribed active learning methodology, one proposed as effective in collegiate education, to a new environment, forensic science. The methodology employed combined fourteen POGILs, created specifically for the chosen course, with didactic lecture during the entire semester of a forensic science survey course. This quasi-experimental design examined the effect of manipulating the independent variable, the use of a hybrid lecture instead of exclusively traditional didactic lectures, on the students' academic achievement on exams given during the course. Participants in this study (N=1436) were undergraduate students enrolled in the single-semester introductory science course. A longitudinal study that incorporated eight years of data was completed, 4 years pre-intervention (2007-2010) and 4 years post-intervention (2011-2014). The forensic science survey course, taught by only one professor during the eight-year period, was a science discipline that had yet to integrate an active learning educational model. Findings indicate four variables significantly contributed to explaining nearly 28% of the variation seen in the student class averages earned during the eight-year period: the

  13. Monte Carlo methods in PageRank computation: When one iteration is sufficient

    NARCIS (Netherlands)

    Avrachenkov, K.; Litvak, Nelli; Nemirovsky, D.; Osipova, N.

    2005-01-01

    PageRank is one of the principal criteria according to which Google ranks Web pages. PageRank can be interpreted as a frequency of visiting a Web page by a random surfer and thus it reflects the popularity of a Web page. Google computes the PageRank using the power iteration method which requires
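    The Monte Carlo idea can be sketched with a random surfer that follows a random out-link with probability alpha and otherwise stops, estimating PageRank from visit frequencies. The graph and parameters below are illustrative assumptions and this is only one simple variant, not the exact estimators analysed in the paper.

        # Minimal Monte Carlo PageRank sketch: run several walks from every page,
        # count all visited pages, and normalise (illustrative; dangling nodes
        # simply terminate the walk here).
        import random
        from collections import Counter

        def mc_pagerank(links, alpha=0.85, runs_per_page=200, seed=0):
            random.seed(seed)
            visits, total = Counter(), 0
            for start in links:
                for _ in range(runs_per_page):
                    page = start
                    while True:
                        visits[page] += 1
                        total += 1
                        out = links[page]
                        if not out or random.random() > alpha:
                            break                # surfer stops / teleports
                        page = random.choice(out)
            return {p: visits[p] / total for p in links}

        web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
        print(mc_pagerank(web))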

  14. Monte Carlo methods in PageRank computation: When one iteration is sufficient

    NARCIS (Netherlands)

    Avrachenkov, K.; Litvak, Nelli; Nemirovsky, D.; Osipova, N.

    PageRank is one of the principal criteria according to which Google ranks Web pages. PageRank can be interpreted as a frequency of visiting a Web page by a random surfer, and thus it reflects the popularity of a Web page. Google computes the PageRank using the power iteration method, which requires

  15. Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Krause, E.; et al.

    2017-06-28

    We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt DES Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood $\\Delta \\chi^2 \\le 0.045$ with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering (8 Mpc$~h^{-1}$) and galaxy-galaxy lensing (12 Mpc$~h^{-1}$) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.

  16. Cross-Continental Comparison of National Food Consumption Survey Methods—A Narrative Review

    Directory of Open Access Journals (Sweden)

    Willem De Keyzer

    2015-05-01

    Food consumption surveys are performed in many countries. Comparison of results from those surveys across nations is difficult because of differences in methodological approaches. While consensus about the preferred methodology associated with national food consumption surveys is increasing, no inventory of methodological aspects across continents is available. The aims of the present review are (1) to develop a framework of key methodological elements related to national food consumption surveys, (2) to create an inventory of these properties for surveys performed on the continents of North America, South America, Asia and Australasia, and (3) to discuss and compare these methodological properties cross-continentally. A literature search was performed using a fixed set of search terms in different databases. The inventory was completed with all accessible information from all retrieved publications, and corresponding authors were requested to provide additional information where missing. Surveys from ten individual countries, originating from four continents, are listed in the inventory. The results are presented according to six major aspects of food consumption surveys. The most common dietary intake assessment method used in food consumption surveys worldwide is the 24-HDR (24 h dietary recall), occasionally administered repeatedly, mostly using interview software. Only three countries have incorporated their national food consumption surveys into continuous national health and nutrition examination surveys.

  17. A Note on the PageRank of Undirected Graphs

    OpenAIRE

    Grolmusz, Vince

    2012-01-01

    The PageRank is a widely used scoring function of networks in general and of the World Wide Web graph in particular. The PageRank is defined for directed graphs, but in some special cases applications for undirected graphs occur. In the literature it is widely noted that the PageRank for undirected graphs is proportional to the degrees of the vertices of the graph. We prove that statement for a particular personalization vector in the definition of the PageRank, and we also show that in gene...
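    The proportionality statement can be verified directly for one particular personalization vector. With adjacency matrix A, degrees d_i, total degree 2m and damping factor alpha (notation assumed here, not quoted from the paper), choosing v_i = d_i/2m makes the degree distribution an exact fixed point of the PageRank equation:

        \[
        p_i = (1-\alpha)\,v_i + \alpha \sum_j \frac{A_{ji}}{d_j}\,p_j ,
        \qquad v_i = \frac{d_i}{2m},\qquad 2m = \sum_i d_i .
        \]
        Substituting the candidate $p_j = d_j/2m$ and using $A_{ji}=A_{ij}$ gives
        \[
        (1-\alpha)\frac{d_i}{2m} + \alpha \sum_j \frac{A_{ij}}{d_j}\cdot\frac{d_j}{2m}
        = (1-\alpha)\frac{d_i}{2m} + \alpha\,\frac{d_i}{2m} = \frac{d_i}{2m},
        \]
        so the PageRank is exactly proportional to the vertex degrees for this choice of personalization vector.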

  18. Digital Ethnography: Library Web Page Redesign among Digital Natives

    Science.gov (United States)

    Klare, Diane; Hobbs, Kendall

    2011-01-01

    Presented with an opportunity to improve Wesleyan University's dated library home page, a team of librarians employed ethnographic techniques to explore how its users interacted with Wesleyan's current library home page and web pages in general. Based on the data that emerged, a group of library staff and members of the campus' information…

  19. Web page sorting algorithm based on query keyword distance relation

    Science.gov (United States)

    Yang, Han; Cui, Hong Gang; Tang, Hao

    2017-08-01

    To improve page sorting, the paper proposes clustering query keywords according to the distance relationships between their occurrences in a web page, and converting this into a measure of how strongly the search keywords aggregate within the page. Based on the PageRank algorithm, this clustering-degree factor for the query keywords is added so that it can participate in the quantitative calculation. The paper thus proposes an improved PageRank algorithm based on the distance relation between search keywords. Experimental results show the feasibility and effectiveness of the method.
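    The abstract does not spell out the clustering-degree factor, so the sketch below is only a hypothetical formulation of the general idea: score each page by how tightly the query keywords co-occur and multiply that factor into a precomputed PageRank value.

        # Hypothetical keyword-proximity re-ranking step (the paper's exact
        # clustering-degree factor is not given in the abstract).
        def proximity_factor(tokens, keywords):
            """Higher when the query keywords occur close together."""
            positions = [i for i, t in enumerate(tokens) if t.lower() in keywords]
            if len(positions) < 2:
                return 1.0 if positions else 0.0
            span = max(positions) - min(positions)      # spread of keyword occurrences
            return len(positions) / (1.0 + span)

        def rerank(pages, pagerank, keywords):
            keywords = {k.lower() for k in keywords}
            scores = {url: pagerank.get(url, 0.0) * proximity_factor(text.split(), keywords)
                      for url, text in pages.items()}
            return sorted(scores, key=scores.get, reverse=True)

        pages = {"p1": "survey methodology guide for web survey design",
                 "p2": "a survey of unrelated topics with methodology mentioned later"}
        print(rerank(pages, {"p1": 0.4, "p2": 0.3}, ["survey", "methodology"]))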

  20. Bulk Fuel Pricing: DOD Needs to Take Additional Actions to Establish a More Reliable Methodology

    Science.gov (United States)

    2015-11-19

    GAO-16-78R, Bulk Fuel Pricing: DOD Needs to Take Additional Actions to Establish a More Reliable Methodology (letter to the Honorable Ashton Carter, Secretary of Defense, November 19, 2015). Each fiscal year, the Office of the Under Secretary of Defense (Comptroller), in coordination with the Defense Logistics Agency, sets a standard price per barrel

  1. CrazyEgg Reports for Single Page Analysis

    Science.gov (United States)

    CrazyEgg provides an in depth look at visitor behavior on one page. While you can use GA to do trend analysis of your web area, CrazyEgg helps diagnose the design of a single Web page by visually displaying all visitor clicks during a specified time.

  2. An Adaptive Reordered Method for Computing PageRank

    Directory of Open Access Journals (Sweden)

    Yi-Ming Bu

    2013-01-01

    We propose an adaptive reordered method to deal with the PageRank problem. It has been shown that one can reorder the hyperlink matrix of the PageRank problem to calculate a reduced system and obtain the full PageRank vector through forward substitutions. This method can provide a speedup for calculating the PageRank vector. We observe that in the existing reordered method, the cost of the recursive reordering procedure can offset the computational reduction brought by minimizing the dimension of the linear system. With this observation, we introduce an adaptive reordered method to accelerate the total calculation, in which we terminate the reordering procedure appropriately instead of reordering to the end. Numerical experiments show the effectiveness of this adaptive reordered method.
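    The reordering idea can be written compactly in the linear-system view of PageRank. With the pages permuted so that dangling pages (rows of zeros in the hyperlink matrix H) come last, the system splits into a smaller solve plus a substitution step; this is a sketch of the standard dangling-node reduction and does not reproduce the paper's adaptive stopping rule.

        \[
        \pi^{T} = (1-\alpha)\,v^{T} + \alpha\,\pi^{T}H,
        \qquad
        H = \begin{pmatrix} H_{11} & H_{12} \\ 0 & 0 \end{pmatrix},\quad
        \pi = \begin{pmatrix} \pi_{1} \\ \pi_{2} \end{pmatrix},\quad
        v = \begin{pmatrix} v_{1} \\ v_{2} \end{pmatrix},
        \]
        \[
        \pi_{1}^{T}\,(I - \alpha H_{11}) = (1-\alpha)\,v_{1}^{T}
        \;\;\text{(reduced system)},
        \qquad
        \pi_{2}^{T} = (1-\alpha)\,v_{2}^{T} + \alpha\,\pi_{1}^{T}H_{12}
        \;\;\text{(forward substitution)}.
        \]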

  3. Using shadow page cache to improve isolated drivers performance.

    Science.gov (United States)

    Zheng, Hao; Dong, Xiaoshe; Wang, Endong; Chen, Baoke; Zhu, Zhengdong; Liu, Chengzhe

    2015-01-01

    With the advantage of the reusability property of virtualization technology, users can reuse various types and versions of existing operating systems and drivers in a virtual machine, so as to customize their application environment. In order to prevent users' virtualization environments from being impacted by driver faults in the virtual machine, Chariot examines the correctness of a driver's write operations by combining capture of the driver's write operations with a driver's private access control table. However, this method needs to keep the write permission of the shadow page table as read-only, so as to capture the isolated driver's write operations through page faults, which adversely affects the performance of the driver. By delaying the setting of frequently used shadow pages' write permissions to read-only, this paper proposes an algorithm that uses a shadow page cache to improve the performance of isolated drivers, and carefully studies the relationship between the performance of drivers and the size of the shadow page cache. Experimental results show that, through the shadow page cache, the performance of isolated drivers can be greatly improved without impacting Chariot's reliability too much.

  4. stage/page/play

    DEFF Research Database (Denmark)

    context. Contributors: Per Brask, Dario Fo, Jette Barnholdt Hansen, Pil Hansen, Sven Åke Heed, Ulla Kallenbach, Sofie Kluge, Annelis Kuhlmann, Kela Kvam, Anna Lawaetz, Bent Flemming Nielsen, Franco Perrelli, Magnus Tessing Schneider, Antonio Scuderi. stage/page/play is published as a festschrift...

  5. Give your feedback on the new Users’ page

    CERN Multimedia

    CERN Bulletin

    If you haven't already done so, visit the new Users’ page and provide the Communications group with your feedback. You can do this quickly and easily via an online form. A dedicated web steering group will design the future page on the basis of your comments. As a first step towards reforming the CERN website, the Communications group is proposing a ‘beta’ version of the Users’ pages. The primary aim of this version is to improve the visibility of key news items, events and announcements to the CERN community. The beta version is very much work in progress: your input is needed to make sure that the final site meets the needs of CERN’s wide and mixed community. The Communications group will read all your comments and suggestions, and will establish a web steering group that will make sure that the future CERN web pages match the needs of the community. More information on this process, including the gradual 'retirement' of the grey Users' pages we are a...

  6. Classroom Web Pages: A "How-To" Guide for Educators.

    Science.gov (United States)

    Fehling, Eric E.

    This manual provides teachers who have very little or no technology experience with a step-by-step guide for developing the necessary skills for creating a class Web Page. The first part of the manual is devoted to the thought processes preceding the actual creation of the Web Page. These include looking at other Web Pages, deciding what should be…

  7. OnlineMin: A Fast Strongly Competitive Randomized Paging Algorithm

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Moruz, Gabriel; Negoescu, Andrei

    2012-01-01

    approach that both has optimal competitiveness and selects victim pages in subquadratic time. In fact, if k pages fit in internal memory, the best previous solution required O(k^2) time per request and O(k) space, whereas our approach also takes O(k) space, but only O(log k) time in the worst case per page...

  8. Functional Multiplex PageRank

    Science.gov (United States)

    Iacovacci, Jacopo; Rahmede, Christoph; Arenas, Alex; Bianconi, Ginestra

    2016-10-01

    Recently it has been recognized that many complex social, technological and biological networks have a multilayer nature and can be described by multiplex networks. Multiplex networks are formed by a set of nodes connected by links having different connotations forming the different layers of the multiplex. Characterizing the centrality of the nodes in a multiplex network is a challenging task since the centrality of a node naturally depends on the importance associated to links of a certain type. Here we propose to assign to each node of a multiplex network a centrality called Functional Multiplex PageRank that is a function of the weights given to every different pattern of connections (multilinks) existing in the multiplex network between any two nodes. Since multilinks distinguish all the possible ways in which the links in different layers can overlap, the Functional Multiplex PageRank can describe important non-linear effects when large relevance or small relevance is assigned to multilinks with overlap. Here we apply the Functional Multiplex PageRank to multiplex airport networks, to the neuronal network of the nematode C. elegans, and to social collaboration and citation networks between scientists. This analysis reveals important differences existing between the most central nodes of these networks, and the correlations between their so-called pattern to success.

  9. [The methodology and sample description of the National Survey on Addiction Problems in Hungary 2015 (NSAPH 2015)].

    Science.gov (United States)

    Paksi, Borbala; Demetrovics, Zsolt; Magi, Anna; Felvinczi, Katalin

    2017-06-01

    This paper introduces the methods and methodological findings of the National Survey on Addiction Problems in Hungary (NSAPH 2015). Use patterns of smoking, alcohol use and other psychoactive substances were measured as well as those of certain behavioural addictions (problematic gambling - PGSI, DSM-V, eating disorders - SCOFF, problematic internet use - PIUQ, problematic on-line gaming - POGO, problematic social media use - FAS, exercise addictions - EAI-HU, work addiction - BWAS, compulsive buying - CBS). The paper describes the applied measurement techniques, sample selection, recruitment of respondents and the data collection strategy as well. Methodological results of the survey including reliability and validity of the measures are reported. The NSAPH 2015 research was carried out on a nationally representative sample of the Hungarian adult population aged 16-64 yrs (gross sample 2477, net sample 2274 persons) with the age group of 18-34 being overrepresented. Statistical analysis of the weight-distribution suggests that weighting did not create any artificial distortion in the database, leaving the representativeness of the sample unaffected. The size of the weighted sample of the 18-64-year-old adult population is 1490 persons. The extent of the theoretical margin of error in the weighted sample is ±2.5%, at a reliability level of 95%, which is in line with the original data collection plans. Based on the analysis of reliability and the extent of errors beyond sampling within the context of the database, we conclude that inconsistencies create relatively minor distortions in cumulative prevalence rates; consequently the database makes possible the reliable estimation of risk factors related to different substance use behaviours. The reliability indexes of measurements used for prevalence estimates of behavioural addictions proved to be appropriate, though the psychometric features in some cases suggest the presence of redundant items. The comparison of

  10. Recognition of pornographic web pages by classifying texts and images.

    Science.gov (United States)

    Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve

    2007-06-01

    With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, the Bayes theory is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those by either of the individual classifiers, and our framework can be adapted to different categories of Web pages.
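    For the discrete-text component of such a framework, a naive Bayes text classifier is the standard baseline. The sketch below uses scikit-learn with made-up training snippets; it illustrates only the text classifier and is not the authors' feature set or text-image fusion system.

        # Minimal naive Bayes text-classification sketch (illustrative data only).
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        train_texts = ["family recipes and cooking tips",
                       "explicit adult content example text",
                       "science news and research articles",
                       "adult explicit example phrases"]
        train_labels = [0, 1, 0, 1]            # 0 = benign page text, 1 = objectionable

        model = make_pipeline(CountVectorizer(), MultinomialNB())
        model.fit(train_texts, train_labels)

        page_text = "research articles about cooking science"
        print(model.predict_proba([page_text]))   # [P(benign), P(objectionable)]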

  11. Disordered Gambling Prevalence: Methodological Innovations in a General Danish Population Survey.

    Science.gov (United States)

    Harrison, Glenn W; Jessen, Lasse J; Lau, Morten I; Ross, Don

    2018-03-01

    We study Danish adult gambling behavior with an emphasis on discovering patterns relevant to public health forecasting and economic welfare assessment of policy. Methodological innovations include measurement of formative in addition to reflective constructs, estimation of prospective risk for developing gambling disorder rather than risk of being falsely negatively diagnosed, analysis with attention to sample weights and correction for sample selection bias, estimation of the impact of trigger questions on prevalence estimates and sample characteristics, and distinguishing between total and marginal effects of risk-indicating factors. The most significant novelty in our design is that nobody was excluded on the basis of their response to a 'trigger' or 'gateway' question about previous gambling history. Our sample consists of 8405 adult Danes. We administered the Focal Adult Gambling Screen to all subjects and estimate prospective risk for disordered gambling. We find that 87.6% of the population is indicated for no detectable risk, 5.4% is indicated for early risk, 1.7% is indicated for intermediate risk, 2.6% is indicated for advanced risk, and 2.6% is indicated for disordered gambling. Correcting for sample weights and controlling for sample selection has a significant effect on prevalence rates. Although these estimates of the 'at risk' fraction of the population are significantly higher than conventionally reported, we infer a significant decrease in overall prevalence rates of detectable risk with these corrections, since gambling behavior is positively correlated with the decision to participate in gambling surveys. We also find that imposing a threshold gambling history leads to underestimation of the prevalence of gambling problems.

  12. 618-11 Burial Ground USRADS radiological surveys

    International Nuclear Information System (INIS)

    Wendling, M.A.

    1994-01-01

    This report summarizes and documents the results of the radiological surveys conducted from February 4 through February 10, 1993 over the 618-11 Burial Ground, Hanford Site, Richland, Washington. In addition, this report explains the survey methodology using the Ultrasonic Ranging and Data System (USRADS). The 618-11 Burial Ground radiological survey field task consisted of two activities: characterization of the specific background conditions and the radiological survey of the area. The radiological survey of the 618-11 Burial Ground, along with the background study, were conducted by Site Investigative Surveys Environmental Restoration Health Physics Organization of the Westinghouse Hanford Company. The survey methodology was based on utilization of the Ultrasonic Ranging and Data System (USRADS) for automated recording of the gross gamma radiation levels at or near six (6) inches and at three (3) feet from the surface soil

  13. Variations in PET/CT methodology for oncologic imaging at U.S. academic medical centers: an imaging response assessment team survey.

    Science.gov (United States)

    Graham, Michael M; Badawi, Ramsey D; Wahl, Richard L

    2011-02-01

    In 2005, 8 Imaging Response Assessment Teams (IRATs) were funded by the National Cancer Institute (NCI) as supplemental grants to existing NCI Cancer Centers. After discussion among the IRATs regarding the need for increased standardization of clinical and research PET/CT methodology, it became apparent that data acquisition and processing approaches differ considerably among centers. To determine the variability in detail, a survey of IRAT sites and IRAT affiliates was performed. A 34-question instrument evaluating patient preparation, scanner type, performance approach, display, and analysis was developed. Fifteen institutions, including the 8 original IRATs and 7 institutions that had developed affiliate IRATs, were surveyed. The major areas of variation were (18)F-FDG dose (259-740 MBq [7-20 mCi]), uptake time (45-90 min), sedation (never to frequently), handling of diabetic patients, imaging time (2-7 min/bed position), performance of diagnostic CT scans as a part of PET/CT, type of acquisition (2-dimensional vs. 3-dimensional), CT technique, duration of fasting (4 or 6 h), and (varying widely) acquisition, processing, display, and PACS software--with 4 sites stating that poor-quality images appear on PACS. There is considerable variability in the way PET/CT scans are performed at academic institutions that are part of the IRAT network. This variability likely makes it difficult to quantitatively compare studies performed at different centers. These data suggest that additional standardization in methodology will be required so that PET/CT studies, especially those performed quantitatively, are more comparable across sites.

  14. Cross-continental comparison of national food consumption survey methods--a narrative review

    Science.gov (United States)

    Food consumption surveys are performed in many countries. Comparison of results from those surveys across nations is difficult because of differences in methodological approaches. While consensus about the preferred methodology associated with national food consumption surveys is increasing, no in...

  15. WebScore: An Effective Page Scoring Approach for Uncertain Web Social Networks

    Directory of Open Access Journals (Sweden)

    Shaojie Qiao

    2011-10-01

    To effectively score pages with uncertainty in web social networks, we first proposed a new concept called the transition probability matrix and formally defined the uncertainty in web social networks. Second, we proposed a hybrid page scoring algorithm, called WebScore, based on the PageRank algorithm and three centrality measures including degree, betweenness, and closeness. In particular, WebScore takes full account of the uncertainty of web social networks by computing the transition probability from one page to another. The basic idea of WebScore is to: (1) integrate uncertainty into PageRank in order to accurately rank pages, and (2) apply the centrality measures to calculate the importance of pages in web social networks. In order to verify the performance of WebScore, we developed a web social network analysis system which can partition web pages into distinct groups and score them in an effective fashion. Finally, we conducted extensive experiments on real data and the results show that WebScore is effective at scoring uncertain pages and requires less time than PageRank and centrality-measure-based page scoring algorithms.
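    The abstract does not give the exact weighting, but all four ingredients (PageRank plus degree, betweenness and closeness centrality) are available in networkx. The sketch below simply averages them after min-max scaling; that combination rule and the example graph are assumptions, not the published WebScore formula or its uncertainty model.

        # Hypothetical PageRank-plus-centrality page score (not the WebScore formula).
        import networkx as nx

        def combined_page_score(G):
            measures = [nx.pagerank(G, alpha=0.85),
                        nx.degree_centrality(G),
                        nx.betweenness_centrality(G),
                        nx.closeness_centrality(G)]

            def scaled(m):                      # min-max scale each measure to [0, 1]
                lo, hi = min(m.values()), max(m.values())
                return {n: (m[n] - lo) / (hi - lo) if hi > lo else 0.0 for n in m}

            measures = [scaled(m) for m in measures]
            return {n: sum(m[n] for m in measures) / len(measures) for n in G}

        G = nx.DiGraph([("a", "b"), ("b", "c"), ("c", "a"), ("a", "c"), ("d", "a")])
        print(sorted(combined_page_score(G).items(), key=lambda kv: -kv[1]))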

  16. Buying a Constant Competitive Ratio for Paging

    NARCIS (Netherlands)

    Csirik, János; Imreh, Csanád; Noga, John; Seiden, Steve S.; Woeginger, Gerhard; Meyer auf der Heide, Friedhelm

    2001-01-01

    We consider a variant of the online paging problem where the online algorithm may buy additional cache slots at a certain cost. The overall cost incurred equals the total cost for the cache plus the number of page faults. This problem and our results are a generalization of both, the classical
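    The cost model behind this variant is easy to simulate: for a fixed request sequence, the total cost is the price of the slots bought plus the number of page faults. The sketch below runs LRU for each candidate cache size and picks the cheapest; it is an offline illustration of the trade-off under an assumed slot price, not the online algorithm analysed in the paper.

        # Offline illustration: total cost = slot_price * k + LRU page faults with k slots.
        def lru_faults(requests, k):
            cache, faults = [], 0
            for page in requests:
                if page in cache:
                    cache.remove(page)
                else:
                    faults += 1
                    if len(cache) == k:
                        cache.pop(0)             # evict the least recently used page
                cache.append(page)               # mark as most recently used
            return faults

        def cheapest_cache_size(requests, slot_price, max_k=10):
            costs = {k: slot_price * k + lru_faults(requests, k) for k in range(1, max_k + 1)}
            return min(costs, key=costs.get), costs

        requests = [1, 2, 3, 1, 2, 4, 1, 2, 3, 4] * 5
        print(cheapest_cache_size(requests, slot_price=3))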

  17. Buying a constant competitive ratio for paging

    NARCIS (Netherlands)

    Csirik, J.; Imreh, Cs.; Noga, J.; Seiden, S.S.; Woeginger, G.J.; Meyer auf der Heide, F.

    2001-01-01

    We consider a variant of the online paging problem where the online algorithm may buy additional cache slots at a certain cost. The overall cost incurred equals the total cost for the cache plus the number of page faults. This problem and our results are a generalization of both, the classical

  18. The EuroPrevall-INCO surveys on the prevalence of food allergies in children from China, India and Russia: the study methodology.

    Science.gov (United States)

    Wong, G W K; Mahesh, P A; Ogorodova, L; Leung, T F; Fedorova, O; Holla, A D; Fernandez-Rivas, M; Clare Mills, E N; Kummeling, I; van Ree, R; Yazdanbakhsh, M; Burney, P

    2010-03-01

    Very little is known regarding the global variations in the prevalence of food allergies. The EuroPrevall-INCO project has been developed to evaluate the prevalence of food allergies in China, India and Russia using the standardized methodology of the EuroPrevall protocol used for studies in the European Union. The epidemiological surveys of the project were designed to estimate variations in the prevalence of food allergy and exposure to known or suspected risk factors for food allergy and to compare the data with different European countries. Random samples of primary schoolchildren were recruited from urban and rural regions of China, Russia and India for screening to ascertain possible adverse reactions to foods. Cases and controls were then selected to answer a detailed questionnaire designed to evaluate the possible risk factors of food allergies. Objective evidence of sensitisation including skin-prick test and serum specific IgE measurement was also collected. More than 37 000 children from the three participating countries have been screened. The response rates for the screening phase ranged from 83% to 95%. More than 3000 cases and controls were studied in the second phase of the study. Further confirmation of food allergies by double blind food challenge was conducted. This will be the first comparative study of the epidemiology of food allergies in China, India, and Russia using the same standardized methodology. The findings of these surveys will complement the data obtained from Europe and provide insights into the development of food allergy.

  19. Use of Facebook by Hospitals in Taiwan: A Nationwide Survey

    Directory of Open Access Journals (Sweden)

    Po-Chin Yang

    2018-06-01

    Background: Social media advertising has become increasingly influential in recent years. Because Facebook has the most active users worldwide, many hospitals in Taiwan have created official Facebook fan pages. Our study was to present an overview of official Facebook fan pages of hospitals in Taiwan. Methods: All 417 hospitals were surveyed about their use of Facebook fan pages in December 2017. The last update time, posts in the past 30 days, number of “Likes”, and other features were analyzed and stratified according to the accreditation statuses of the hospitals. Results: In Taiwan, only 51.1% (n = 213) of the hospitals had an official Facebook fan page. Among these hospitals, 71.8% (n = 153) had updated their pages in the past 30 days, although 89.2% (n = 190) provided online interactions. Academic medical centers tended to have more “Likes” than regional and local community hospitals (on average 5947.4, 2644.8, and 1548.0, respectively). Conclusions: In spite of the popularity of Facebook among the general population, most hospitals in Taiwan do not seem to make good use of this kind of social media. The reasons for the use and nonuse of Facebook on the part of both hospitals and patients deserve further investigation.

  20. Use of Facebook by Hospitals in Taiwan: A Nationwide Survey.

    Science.gov (United States)

    Yang, Po-Chin; Lee, Wui-Chiang; Liu, Hao-Yen; Shih, Mei-Ju; Chen, Tzeng-Ji; Chou, Li-Fang; Hwang, Shinn-Jang

    2018-06-06

    Background: Social media advertising has become increasingly influential in recent years. Because Facebook has the most active users worldwide, many hospitals in Taiwan have created official Facebook fan pages. Our study was to present an overview of official Facebook fan pages of hospitals in Taiwan. Methods: All 417 hospitals were surveyed about their use of Facebook fan pages in December 2017. The last update time, posts in the past 30 days, number of “Likes”, and other features were analyzed and stratified according to the accreditation statuses of the hospitals. Results: In Taiwan, only 51.1% (n = 213) of the hospitals had an official Facebook fan page. Among these hospitals, 71.8% (n = 153) had updated their pages in the past 30 days, although 89.2% (n = 190) provided online interactions. Academic medical centers tended to have more “Likes” than regional and local community hospitals (on average 5947.4, 2644.8, and 1548.0, respectively). Conclusions: In spite of the popularity of Facebook among the general population, most hospitals in Taiwan do not seem to make good use of this kind of social media. The reasons for the use and nonuse of Facebook on the part of both hospitals and patients deserve further investigation.

  1. Positive Community Norm Survey 2011 : Methodology and Results

    Science.gov (United States)

    2012-09-01

    This survey established a baseline understanding of the positive norms that exist in Idaho and revealed the gaps in knowledge and perceived norms with regard to impaired driving. These gaps will indicate the most effective opportunities for future co...

  2. Assessing importance and satisfaction judgments of intermodal work commuters with electronic survey methodology.

    Science.gov (United States)

    2013-09-01

    Recent advances in multivariate methodology provide an opportunity to further the assessment of service offerings in public transportation for work commuting. We offer methodologies that are alternatives to direct rating scales and have advantages in t...

  3. Siting Study Framework and Survey Methodology for Marine and Hydrokinetic Energy Project in Offshore Southeast Florida

    Energy Technology Data Exchange (ETDEWEB)

    Vinick, Charles; Riccobono, Antonino, MS; Messing, Charles G., Ph.D.; Walker, Brian K., Ph.D.; Reed, John K., Ph.D.

    2012-02-28

    Dehlsen Associates, LLC was awarded a grant by the United States Department of Energy (DOE) Golden Field Office for a project titled 'Siting Study Framework and Survey Methodology for Marine and Hydrokinetic Energy Project in Offshore Southeast Florida,' corresponding to DOE Grant Award Number DE-EE0002655 resulting from DOE funding Opportunity Announcement Number DE-FOA-0000069 for Topic Area 2, and it is referred to herein as 'the project.' The purpose of the project was to enhance the certainty of the survey requirements and regulatory review processes for the purpose of reducing the time, efforts, and costs associated with initial siting efforts of marine and hydrokinetic energy conversion facilities that may be proposed in the Atlantic Ocean offshore Southeast Florida. To secure early input from agencies, protocols were developed for collecting baseline geophysical information and benthic habitat data that can be used by project developers and regulators to make decisions early in the process of determining project location (i.e., the siting process) that avoid or minimize adverse impacts to sensitive marine benthic habitat. It is presumed that such an approach will help facilitate the licensing process for hydrokinetic and other ocean renewable energy projects within the study area and will assist in clarifying the baseline environmental data requirements described in the U.S. Department of the Interior Bureau of Ocean Energy Management, Regulation and Enforcement (formerly Minerals Management Service) final regulations on offshore renewable energy (30 Code of Federal Regulations 285, published April 29, 2009). Because projects generally seek to avoid or minimize impacts to sensitive marine habitats, it was not the intent of this project to investigate areas that did not appear suitable for the siting of ocean renewable energy projects. Rather, a two-tiered approach was designed with the first step consisting of gaining overall insight

  4. Three results on the PageRank vector: eigenstructure, sensitivity, and the derivative

    OpenAIRE

    Gleich, David; Glynn, Peter; Golub, Gene; Greif, Chen

    2007-01-01

    The three results on the PageRank vector are preliminary but shed light on the eigenstructure of a PageRank modified Markov chain and what happens when changing the teleportation parameter in the PageRank model. Computations with the derivative of the PageRank vector with respect to the teleportation parameter show predictive ability and identify an interesting set of pages from Wikipedia.
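    The derivative mentioned above has a closed form that follows from the linear-system formulation of PageRank; the notation here (column-stochastic matrix P, teleportation vector v, parameter alpha) is assumed, not quoted from the paper.

        \[
        (I - \alpha P)\,x(\alpha) = (1-\alpha)\,v
        \quad\Longrightarrow\quad
        \frac{dx}{d\alpha} = (I - \alpha P)^{-1}\bigl(P\,x(\alpha) - v\bigr),
        \]
        obtained by differentiating both sides with respect to $\alpha$ and solving for $dx/d\alpha$.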

  5. Review and evaluation of paleohydrologic methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.; Thorne, P.D.

    1982-12-01

    A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites.

  6. Review and evaluation of paleohydrologic methodologies

    International Nuclear Information System (INIS)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.; Thorne, P.D.

    1982-12-01

    A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites

  7. Analytical methods for proteome data obtained from SDS-PAGE multi-dimensional separation and mass spectrometry

    Directory of Open Access Journals (Sweden)

    Gun Wook Park

    2010-03-01

    For proteome analysis, various experimental protocols using mass spectrometry have been developed over the last decade. The different protocols have differing performances and degrees of accuracy. Furthermore, the “best” protocol for a proteomic analysis of a sample depends on the purpose of the analysis, especially in connection with disease proteomics, including biomarker discovery and therapeutics analyses of human serum or plasma. The protein complexity and the wide dynamic range of blood samples require high-dimensional separation technology. In this article, we review proteome analysis protocols in which both Sodium Dodecyl Sulfate-Polyacrylamide Gel Electrophoresis (SDS-PAGE) and liquid chromatography are used for peptide and protein separations. Multidimensional separation technology supplies a high-quality dataset of tandem mass spectra and reveals signals from low-abundance proteins, although it can be time-consuming and laborious work. We survey shotgun proteomics protocols using SDS-PAGE and liquid chromatography and introduce bioinformatics tools for the analysis of proteomics data. We also review efforts toward the biological interpretation of the proteome.

  8. PageRank tracker: from ranking to tracking.

    Science.gov (United States)

    Gong, Chen; Fu, Keren; Loza, Artur; Wu, Qiang; Liu, Jia; Yang, Jie

    2014-06-01

    Video object tracking is widely used in many real-world applications, and it has been extensively studied for over two decades. However, tracking robustness is still an issue in most existing methods, due to the difficulties with adaptation to environmental or target changes. In order to improve adaptability, this paper formulates the tracking process as a ranking problem, and the PageRank algorithm, which is a well-known webpage ranking algorithm used by Google, is applied. Labeled and unlabeled samples in tracking application are analogous to query webpages and the webpages to be ranked, respectively. Therefore, determining the target is equivalent to finding the unlabeled sample that is the most associated with existing labeled set. We modify the conventional PageRank algorithm in three aspects for tracking application, including graph construction, PageRank vector acquisition and target filtering. Our simulations with the use of various challenging public-domain video sequences reveal that the proposed PageRank tracker outperforms mean-shift tracker, co-tracker, semiboosting and beyond semiboosting trackers in terms of accuracy, robustness and stability.

  9. Upgrade of CERN OP Webtools IRRAD Page

    CERN Document Server

    Vik, Magnus Bjerke

    2017-01-01

    CERN Beams Department maintains a website with various tools for the Operations Group, one of them being specific to the Proton Irradiation Facility (IRRAD). The IRRAD team uses the tool to follow up and optimize the operation of the facility. The original version of the tool was difficult to maintain, and adding new features to the page was challenging. This summer student project therefore aimed to upgrade the web page by rewriting it with maintainability and flexibility in mind. The new application uses a server-client architecture with a REST API on the back end, which is used by the front end to request data for visualization. PHP is used on the back end to implement the APIs and Swagger is used to document them. Vue, Semantic UI, Webpack, Node and ECMAScript 5 are used on the front end to visualize and administer the data. The result is a new IRRAD operations web application with extended functionality, improved structure and an improved user interface. It includes a new Status Panel page th...

  10. No Hawking-Page phase transition in three dimensions

    International Nuclear Information System (INIS)

    Myung, Y.S.

    2005-01-01

    We investigate whether or not the Hawking-Page phase transition can occur in three dimensions. Starting with the simplest class of Lanczos-Lovelock action, the thermodynamic behavior of all AdS-type black holes without charge falls into two classes: Schwarzschild-AdS black holes in even dimensions and Chern-Simons black holes in odd dimensions. The former class can provide the Hawking-Page transition between Schwarzschild-AdS black holes and thermal AdS space. On the other hand, the latter class is exceptional, and thus the Hawking-Page transition is difficult to realize. In three dimensions, a second-order phase transition might occur between the non-rotating BTZ black hole and the massless BTZ black hole (thermal AdS space), instead of the first-order Hawking-Page transition between the non-rotating BTZ black hole and thermal AdS space.

  11. The impact of Arizona Highways Magazine's facebook page.

    Science.gov (United States)

    2014-02-01

    This project examined the relationship between use of the Arizona Highways magazine (AHM) Facebook Page and the decision to : travel to or within Arizona. Key purposes were to: (1) provide a thorough understanding of AHM Facebook Page users, includin...

  12. Identifying e-cigarette vape stores: description of an online search methodology.

    Science.gov (United States)

    Kim, Annice E; Loomis, Brett; Rhodes, Bryan; Eggers, Matthew E; Liedtke, Christopher; Porter, Lauren

    2016-04-01

    Although the overall impact of Electronic Nicotine Delivery Systems (ENDS) on public health is unclear, awareness, use, and marketing of the products have increased markedly in recent years. Identifying the increasing number of 'vape stores' that specialise in selling ENDS can be challenging given the lack of regulatory policies and licensing. This study assesses the utility of online search methods in identifying ENDS vape stores. We conducted online searches in Google Maps, Yelp, and YellowPages to identify listings of ENDS vape stores in Florida, and used a crowdsourcing platform to call and verify stores that primarily sold ENDS to consumers. We compared store listings generated from the online search and crowdsourcing methodology to the list of licensed tobacco and ENDS retailers from the Florida Department of Business and Professional Regulation. The combined results from all three online sources yielded a total of 403 ENDS vape stores. Nearly 32.5% of these stores were on the state tobacco licensure list, while 67.5% were not. Accuracy of online results was highest for Yelp (77.6%), followed by YellowPages (77.1%) and Google (53.0%). Using the online search methodology we identified more ENDS vape stores than were on the state tobacco licensure list. This approach may be a promising strategy to identify and track the growth of ENDS vape stores over time, especially in states without a systematic licensing requirement for such stores.

  13. Customisation of Indico pages - Layout and Menus

    CERN Multimedia

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial you are going to learn how to customize the layout of your Indico pages (for example you can change the color of the background images or change the logo) and the menus on your Indico pages  (for example you can add or hide certain blocks, or change their name and order).  

  14. Microfluidic device having an immobilized pH gradient and PAGE gels for protein separation and analysis

    Science.gov (United States)

    Sommer, Gregory J.; Hatch, Anson V.; Singh, Anup K.; Wang, Ying-Chih

    2012-12-11

    Disclosed is a novel microfluidic device enabling on-chip implementation of a two-dimensional separation methodology. Previously disclosed microscale immobilized pH gradients (IPG) are combined with perpendicular polyacrylamide gel electrophoresis (PAGE) microchannels to achieve orthogonal separations of biological samples. Device modifications enable inclusion of sodium dodecyl sulfate (SDS) in the second dimension. The device can be fabricated to use either continuous IPG gels, or the microscale isoelectric fractionation membranes we have also previously disclosed, for the first dimension. The invention represents the first all-gel two-dimensional separation microdevice, with significantly higher resolution power over existing devices.

  15. A Delphi Method Analysis to Create an Emergency Medicine Educational Patient Satisfaction Survey

    Directory of Open Access Journals (Sweden)

    Kory S. London

    2015-12-01

    Introduction: Feedback on patient satisfaction (PS) as a means to monitor and improve performance in patient communication is lacking in residency training. A physician's promotion, compensation and job satisfaction may be affected by his or her individual PS scores once in practice. Many communication and satisfaction surveys exist, but none focus on the emergency department setting for educational purposes. The goal of this project was to create an emergency medicine-based educational PS survey with strong evidence for content validity. Methods: We used the Delphi Method (DM) to obtain expert opinion via an iterative process of surveying. Questions were mined from four PS surveys as well as from group suggestion. The DM analysis determined the structure, content and appropriate use of the tool. The group used four-point Likert-type scales and Lynn's criteria for content validity to determine relevant questions from the stated goals. Results: Twelve recruited experts participated in a series of seven surveys to achieve consensus. A 10-question, single-page survey with an additional page of qualitative and demographic questions was selected. Thirty-one questions were judged to be relevant from an original 48-question list; of these, the final 10 questions were chosen. The response rate for individual survey items was 99.5%. Conclusion: The DM produced a consensus survey with content validity evidence. Future work will be needed to obtain evidence for response process, internal structure and construct validity.

  16. EDUCATIONAL PAGES IN FACEBOOK - A STUDY

    OpenAIRE

    Dr. N. Ramakrishnan; Mrs. R. Prasitha Indhumathy

    2017-01-01

    Facebook Pages are a great resource for educational technology professionals to find companies, thought leaders, groups and organizations to share ideas and experiences with peers while expanding industry knowledge and increasing connections. Like most Facebook users, many educators use Facebook to connect with friends new and old, but the Internet's most popular site can also be a great learning and teaching tool. There are many Facebook pages that have been created as a resource to collect,...

  17. Required Discussion Web Pages in Psychology Courses and Student Outcomes

    Science.gov (United States)

    Pettijohn, Terry F., II; Pettijohn, Terry F.

    2007-01-01

    We conducted 2 studies that investigated student outcomes when using discussion Web pages in psychology classes. In Study 1, we assigned 213 students enrolled in Introduction to Psychology courses to either a mandatory or an optional Web page discussion condition. Students used the discussion Web page significantly more often and performed…

  18. UAV-Based Photogrammetry and Integrated Technologies for Architectural Applications—Methodological Strategies for the After-Quake Survey of Vertical Structures in Mantua (Italy)

    Directory of Open Access Journals (Sweden)

    Cristiana Achille

    2015-06-01

    This paper examines the survey of tall buildings in an emergency context like in the case of post-seismic events. The after-earthquake survey has to guarantee time-savings, high precision and security during the operational stages. The main goal is to optimize the application of methodologies based on acquisition and automatic elaborations of photogrammetric data even with the use of Unmanned Aerial Vehicle (UAV) systems in order to provide fast and low cost operations. The suggested methods integrate new technologies with commonly used technologies like TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case, based on the comparison of acquisition, calibration and 3D modeling results in case of use of a laser scanner, metric camera and amateur reflex camera. The test would help us to demonstrate the efficiency of image based methods in the acquisition of complex architecture. The case study is Santa Barbara Bell tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of the photogrammetry using UAV for the survey of vertical structures, complex buildings and difficult accessible architectural parts, providing high precision results.

  19. UAV-Based Photogrammetry and Integrated Technologies for Architectural Applications—Methodological Strategies for the After-Quake Survey of Vertical Structures in Mantua (Italy)

    Science.gov (United States)

    Achille, Cristiana; Adami, Andrea; Chiarini, Silvia; Cremonesi, Stefano; Fassi, Francesco; Fregonese, Luigi; Taffurelli, Laura

    2015-01-01

    This paper examines the survey of tall buildings in an emergency context like in the case of post-seismic events. The after-earthquake survey has to guarantee time-savings, high precision and security during the operational stages. The main goal is to optimize the application of methodologies based on acquisition and automatic elaborations of photogrammetric data even with the use of Unmanned Aerial Vehicle (UAV) systems in order to provide fast and low cost operations. The suggested methods integrate new technologies with commonly used technologies like TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case, based on the comparison of acquisition, calibration and 3D modeling results in case of use of a laser scanner, metric camera and amateur reflex camera. The test would help us to demonstrate the efficiency of image based methods in the acquisition of complex architecture. The case study is Santa Barbara Bell tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of the photogrammetry using UAV for the survey of vertical structures, complex buildings and difficult accessible architectural parts, providing high precision results. PMID:26134108

  20. UAV-Based Photogrammetry and Integrated Technologies for Architectural Applications--Methodological Strategies for the After-Quake Survey of Vertical Structures in Mantua (Italy).

    Science.gov (United States)

    Achille, Cristiana; Adami, Andrea; Chiarini, Silvia; Cremonesi, Stefano; Fassi, Francesco; Fregonese, Luigi; Taffurelli, Laura

    2015-06-30

    This paper examines the survey of tall buildings in an emergency context like in the case of post-seismic events. The after-earthquake survey has to guarantee time-savings, high precision and security during the operational stages. The main goal is to optimize the application of methodologies based on acquisition and automatic elaborations of photogrammetric data even with the use of Unmanned Aerial Vehicle (UAV) systems in order to provide fast and low cost operations. The suggested methods integrate new technologies with commonly used technologies like TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case, based on the comparison of acquisition, calibration and 3D modeling results in case of use of a laser scanner, metric camera and amateur reflex camera. The test would help us to demonstrate the efficiency of image based methods in the acquisition of complex architecture. The case study is Santa Barbara Bell tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of the photogrammetry using UAV for the survey of vertical structures, complex buildings and difficult accessible architectural parts, providing high precision results.

  1. 200-UP-2 operable unit radiological surveys

    International Nuclear Information System (INIS)

    Wendling, M.A.

    1994-01-01

    This report summarizes and documents the results of the radiological surveys conducted from August 17 through December 16, 1993 over a partial area of the 200-UP-2 Operable Unit, 200-W Area, Hanford Site, Richland, Washington. In addition, this report explains the survey methodology of the Mobile Surface Contamination Monitor II (MSCM-II) and the Ultra Sonic Ranging And Data System (USRADS). The radiological survey of the 200-UP-2 Operable Unit was conducted by the Site Investigative Surveys/Environmental Restoration Health Physics Organization of the Westinghouse Hanford Company. The survey methodology for the majority of the area was based on utilization of the MSCM-II or the USRADS for automated recording of the gross beta/gamma radiation levels at or near six (6) inches from the surface soil.

  2. Automatically annotating web pages using Google Rich Snippets

    NARCIS (Netherlands)

    Hogenboom, F.P.; Frasincar, F.; Vandic, D.; Meer, van der J.; Boon, F.; Kaymak, U.

    2011-01-01

    We propose the Automatic Review Recognition and annOtation of Web pages (ARROW) framework, a framework for Web page review identification and annotation using RDFa Google Rich Snippets. The ARROW framework consists of four steps: hotspot identification, subjectivity analysis, information

  3. 75 FR 54910 - Eastern States: Filing of Plats of Survey

    Science.gov (United States)

    2010-09-09

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLES956000-L14200000-BJ0000-LXSITRST0000] Eastern States: Filing of Plats of Survey AGENCY: Bureau of Land Management, Interior. ACTION: Notice of... Federal Register, Volume 75, Number 131, on page 39579 a notice entitled ``Eastern States: Filing of Plats...

  4. Web Page Recommendation Using Web Mining

    OpenAIRE

    Modraj Bhavsar; Mrs. P. M. Chavan

    2014-01-01

    On the World Wide Web, various kinds of content are generated in huge amounts, so to give relevant results to users, web recommendation has become an important part of web applications. On the web, different kinds of recommendations are made available to users every day, including images, video, audio, query suggestions and web pages. In this paper we aim at providing a framework for web page recommendation. 1) First we describe the basics of web mining and the types of web mining. 2) Details of each...

  5. The Carnegie-Spitzer-IMACS redshift survey of galaxy evolution since z = 1.5. I. Description and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kelson, Daniel D.; Williams, Rik J.; Dressler, Alan; McCarthy, Patrick J.; Shectman, Stephen A.; Mulchaey, John S.; Villanueva, Edward V.; Crane, Jeffrey D.; Quadri, Ryan F. [The Observatories of the Carnegie Institution for Science, 813 Santa Barbara Street, Pasadena, CA 91101 (United States)

    2014-03-10

    We describe the Carnegie-Spitzer-IMACS (CSI) Survey, a wide-field, near-IR selected spectrophotometric redshift survey with the Inamori Magellan Areal Camera and Spectrograph (IMACS) on Magellan-Baade. By defining a flux-limited sample of galaxies in Spitzer Infrared Array Camera 3.6 μm imaging of SWIRE fields, the CSI Survey efficiently traces the stellar mass of average galaxies to z ∼ 1.5. This first paper provides an overview of the survey selection, observations, processing of the photometry and spectrophotometry. We also describe the processing of the data: new methods of fitting synthetic templates of spectral energy distributions are used to derive redshifts, stellar masses, emission line luminosities, and coarse information on recent star formation. Our unique methodology for analyzing low-dispersion spectra taken with multilayer prisms in IMACS, combined with panchromatic photometry from the ultraviolet to the IR, has yielded high-quality redshifts for 43,347 galaxies in our first 5.3 deg² of the SWIRE XMM-LSS field. We use three different approaches to estimate our redshift errors and find robust agreement. Over the full range of 3.6 μm fluxes of our selection, we find typical redshift uncertainties of σ_z/(1 + z) ≲ 0.015. In comparisons with previously published spectroscopic redshifts we find scatters of σ_z/(1 + z) = 0.011 for galaxies at 0.7 ≤ z ≤ 0.9, and σ_z/(1 + z) = 0.014 for galaxies at 0.9 ≤ z ≤ 1.2. For galaxies brighter and fainter than i = 23 mag, we find σ_z/(1 + z) = 0.008 and σ_z/(1 + z) = 0.022, respectively. Notably, our low-dispersion spectroscopy and analysis yields comparable redshift uncertainties and success rates for both red and blue galaxies, largely eliminating color-based systematics that can seriously bias observed dependencies of galaxy evolution on environment.

  6. SPAX - PAX with Super-Pages

    Science.gov (United States)

    Bößwetter, Daniel

    Much has been written about the pros and cons of column-orientation as a means to speed up read-mostly analytic workloads in relational databases. In this paper we try to dissect the primitive mechanisms of a database that help express the coherence of tuples and present a novel way of organizing relational data in order to exploit the advantages of both the row-oriented and the column-oriented world. As we go, we break with yet another bad habit of databases, namely the equal granularity of reads and writes, which leads us to the introduction of consecutive clusters of disk pages called super-pages.

  7. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
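
    As a standard textbook illustration of the kind of calculation such a methodology covers (not a formula quoted from this particular book), the sample size needed to estimate a population proportion p with margin of error e, using the normal quantile z for the desired confidence level, is commonly written as:

        n \;=\; \frac{z^{2}\, p\,(1-p)}{e^{2}},
        \qquad\text{e.g. } p=0.5,\; e=0.05,\; z=1.96 \;\Rightarrow\; n \approx 385 .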

  8. Decomposition of the Google PageRank and Optimal Linking Strategy

    NARCIS (Netherlands)

    Avrachenkov, Konstatin; Litvak, Nelli

    We provide the analysis of the Google PageRank from the perspective of the Markov Chain Theory. First we study the Google PageRank for a Web that can be decomposed into several connected components which do not have any links to each other. We show that in order to determine the Google PageRank for

  9. Enhancing the Ranking of a Web Page in the Ocean of Data

    Directory of Open Access Journals (Sweden)

    Hitesh KUMAR SHARMA

    2013-10-01

    In today's world, the web is considered an ocean of data and information (text, videos, multimedia, etc.) consisting of millions and millions of web pages in which web pages are linked with each other like a tree. It is often argued that, especially considering the dynamics of the internet, too much time has passed since the scientific work on PageRank for it still to be the basis of the ranking methods of the Google search engine. There is no doubt that within the past years many changes, adjustments and modifications regarding the ranking methods of Google have taken place, but PageRank was absolutely crucial for Google's success, so at least the fundamental concept behind PageRank should still be constitutive. This paper describes the components which affect the ranking of web pages and help in increasing the popularity of a web site; by adapting these factors, website developers can increase their site's page rank. Within the PageRank concept, the rank of a document is given by the rank of those documents which link to it. Their rank again is given by the rank of documents which link to them. The PageRank of a document is thus always determined recursively by the PageRank of other documents.
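
    The recursive definition above is usually computed by power iteration with a damping factor (0.85 below) and uniform teleportation. The following is a minimal textbook sketch of that standard computation, not code from the paper.

        # Hedged sketch: basic PageRank by power iteration on an adjacency list.
        def pagerank(links, damping=0.85, tol=1e-10, max_iter=200):
            """links: dict mapping each page to the list of pages it links to."""
            nodes = set(links) | {v for outs in links.values() for v in outs}
            n = len(nodes)
            rank = {u: 1.0 / n for u in nodes}
            for _ in range(max_iter):
                new = {u: (1.0 - damping) / n for u in nodes}
                for u in nodes:
                    outs = links.get(u, [])
                    if outs:                      # distribute rank along out-links
                        share = damping * rank[u] / len(outs)
                        for v in outs:
                            new[v] += share
                    else:                         # dangling page: spread its rank uniformly
                        for v in nodes:
                            new[v] += damping * rank[u] / n
                if max(abs(new[u] - rank[u]) for u in nodes) < tol:
                    return new
                rank = new
            return rank

        print(pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]}))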

  10. Networked Feminist Movement: “Lugar de Mulher” blog and Facebook page analysis

    Directory of Open Access Journals (Sweden)

    Daniele Ferreira Seridório

    2015-12-01

    The concept of Networked Social Movements explains the appropriation of communication and information technologies by social movements. The digital network is a space to spread information, to collectively construct claims, and even to demonstrate. This paper aims to understand and analyse how the feminist movement uses these communication tools. To achieve that, we used a quantitative methodology to analyse the blog "Lugar de Mulher" and the Facebook page of this blog during five days. Our results showed that, even though this website could be used as a digital tool for the movement, the blog "Lugar de Mulher" does not construct actions in the offline space and has limited interactions with users who participate in the debate that occurs in the digital space.

  11. Automatic Hidden-Web Table Interpretation by Sibling Page Comparison

    Science.gov (United States)

    Tao, Cui; Embley, David W.

    The longstanding problem of automatic table interpretation still eludes us. Its solution would not only be an aid to table processing applications such as large volume table conversion, but would also be an aid in solving related problems such as information extraction and semi-structured data management. In this paper, we offer a conceptual modeling solution for the common special case in which so-called sibling pages are available. The sibling pages we consider are pages on the hidden web, commonly generated from underlying databases. We compare them to identify and connect nonvarying components (category labels) and varying components (data values). We tested our solution using more than 2,000 tables in source pages from three different domains—car advertisements, molecular biology, and geopolitical information. Experimental results show that the system can successfully identify sibling tables, generate structure patterns, interpret tables using the generated patterns, and automatically adjust the structure patterns, if necessary, as it processes a sequence of hidden-web pages. For these activities, the system was able to achieve an overall F-measure of 94.5%.

  12. Multiplex PageRank.

    Directory of Open Access Journals (Sweden)

    Arda Halu

    Many complex systems can be described as multiplex networks in which the same nodes can interact with one another in different layers, thus forming a set of interacting and co-evolving networks. Examples of such multiplex systems are social networks where people are involved in different types of relationships and interact through various forms of communication media. The ranking of nodes in multiplex networks is one of the most pressing and challenging tasks that research on complex networks is currently facing. When pairs of nodes can be connected through multiple links and in multiple layers, the ranking of nodes should necessarily reflect the importance of nodes in one layer as well as their importance in other interdependent layers. In this paper, we draw on the idea of biased random walks to define the Multiplex PageRank centrality measure in which the effects of the interplay between networks on the centrality of nodes are directly taken into account. In particular, depending on the intensity of the interaction between layers, we define the Additive, Multiplicative, Combined, and Neutral versions of Multiplex PageRank, and show how each version reflects the extent to which the importance of a node in one layer affects the importance the node can gain in another layer. We discuss these measures and apply them to an online multiplex social network. Findings indicate that taking the multiplex nature of the network into account helps uncover the emergence of rankings of nodes that differ from the rankings obtained from one single layer. Results provide support in favor of the salience of multiplex centrality measures, like Multiplex PageRank, for assessing the prominence of nodes embedded in multiple interacting networks, and for shedding a new light on structural properties that would otherwise remain undetected if each of the interacting networks were analyzed in isolation.

  13. Multiplex PageRank.

    Science.gov (United States)

    Halu, Arda; Mondragón, Raúl J; Panzarasa, Pietro; Bianconi, Ginestra

    2013-01-01

    Many complex systems can be described as multiplex networks in which the same nodes can interact with one another in different layers, thus forming a set of interacting and co-evolving networks. Examples of such multiplex systems are social networks where people are involved in different types of relationships and interact through various forms of communication media. The ranking of nodes in multiplex networks is one of the most pressing and challenging tasks that research on complex networks is currently facing. When pairs of nodes can be connected through multiple links and in multiple layers, the ranking of nodes should necessarily reflect the importance of nodes in one layer as well as their importance in other interdependent layers. In this paper, we draw on the idea of biased random walks to define the Multiplex PageRank centrality measure in which the effects of the interplay between networks on the centrality of nodes are directly taken into account. In particular, depending on the intensity of the interaction between layers, we define the Additive, Multiplicative, Combined, and Neutral versions of Multiplex PageRank, and show how each version reflects the extent to which the importance of a node in one layer affects the importance the node can gain in another layer. We discuss these measures and apply them to an online multiplex social network. Findings indicate that taking the multiplex nature of the network into account helps uncover the emergence of rankings of nodes that differ from the rankings obtained from one single layer. Results provide support in favor of the salience of multiplex centrality measures, like Multiplex PageRank, for assessing the prominence of nodes embedded in multiple interacting networks, and for shedding a new light on structural properties that would otherwise remain undetected if each of the interacting networks were analyzed in isolation.
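
    The exact Additive, Multiplicative, Combined and Neutral formulas are given in the paper itself; as a rough illustration of the underlying idea (centrality earned in one layer biasing the walk in another), the sketch below uses the PageRank computed on layer A as the teleportation weights when ranking the same nodes in layer B. This coupling is illustrative only and is not claimed to reproduce the authors' definitions.

        # Hedged sketch: a two-layer coupling in the spirit of Multiplex PageRank.
        def personalized_pagerank(links, teleport, damping=0.85, iters=200):
            nodes = set(links) | {v for outs in links.values() for v in outs} | set(teleport)
            z = sum(teleport.values()) or 1.0
            t = {u: teleport.get(u, 0.0) / z for u in nodes}      # normalised teleport vector
            rank = dict(t)
            for _ in range(iters):
                new = {u: (1.0 - damping) * t[u] for u in nodes}
                for u in nodes:
                    outs = links.get(u, [])
                    if outs:
                        share = damping * rank[u] / len(outs)
                        for v in outs:
                            new[v] += share
                    else:
                        for v in nodes:
                            new[v] += damping * rank[u] * t[v]    # dangling mass follows teleport
                rank = new
            return rank

        layer_a = {"x": ["y"], "y": ["z"], "z": ["x"]}
        layer_b = {"x": ["z"], "z": ["y"], "y": ["x"]}
        uniform = {u: 1.0 for u in ("x", "y", "z")}
        rank_a = personalized_pagerank(layer_a, uniform)          # ordinary PageRank in layer A
        rank_ab = personalized_pagerank(layer_b, rank_a)          # layer B biased by layer A
        print(rank_ab)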

  14. Page: a program for gamma spectra analysis in PC microcomputers

    International Nuclear Information System (INIS)

    Goncalves, M.A.; Yamaura, M.; Costa, G.J.C.; Carvalho, E.I. de; Matsuda, H.T.; Araujo, B.F. de.

    1991-04-01

    PAGE is a software package, written in the BASIC language, to perform gamma spectra analysis. It was developed to be used in a system composed of a high-purity intrinsic germanium detector, a multichannel analyser and a PC microcomputer. The analysis program of the PAGE package accomplishes the following functions: peak location; gamma nuclide identification; activity determination. Standard nuclide sources were used to calibrate the system. To perform the efficiency × energy calibration, a logarithmic fit was applied. Analysis of nuclides with overlapping peaks is allowed by the PAGE program. PAGE has additional auxiliary programs for: building and listing of isotopic nuclear data libraries; data acquisition from the multichannel analyser; spectrum display with automatic area and FWHM determinations. This software is to be applied in analytical process control where time response is a very important parameter. PAGE takes ca. 1.5 minutes to analyse a complex spectrum from a 4096-channel MCA. (author)
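
    The efficiency × energy calibration mentioned above is conventionally done by fitting a low-order polynomial in log-log space to efficiencies measured with standard sources. The sketch below shows that generic fit; the numbers are invented, and since PAGE itself is written in BASIC, this Python fragment only illustrates the method, not the program's code.

        # Hedged sketch: logarithmic efficiency vs. energy calibration (illustrative data).
        import numpy as np

        energy_kev = np.array([122.0, 344.0, 662.0, 1173.0, 1332.0])   # standard-source lines
        efficiency = np.array([0.052, 0.027, 0.016, 0.010, 0.009])     # measured peak efficiencies

        # Fit log(eff) as a polynomial in log(E); degree 1 or 2 is typical.
        coeffs = np.polyfit(np.log(energy_kev), np.log(efficiency), deg=2)

        def eff_at(e_kev):
            """Interpolated detection efficiency at an arbitrary energy."""
            return float(np.exp(np.polyval(coeffs, np.log(e_kev))))

        print(f"efficiency at 511 keV is approximately {eff_at(511.0):.4f}")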

  15. PageRank, HITS and a unified framework for link analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Chris; He, Xiaofeng; Husbands, Parry; Zha, Hongyuan; Simon, Horst

    2001-10-01

    Two popular webpage ranking algorithms are HITS and PageRank. HITS emphasizes mutual reinforcement between authority and hub webpages, while PageRank emphasizes hyperlink weight normalization and web surfing based on random walk models. We systematically generalize/combine these concepts into a unified framework. The ranking framework contains a large algorithm space; HITS and PageRank are two extreme ends in this space. We study several normalized ranking algorithms which are intermediate between HITS and PageRank, and obtain closed-form solutions. We show that, to a first-order approximation, all ranking algorithms in this framework, including PageRank and HITS, lead to the same ranking, which is highly correlated with ranking by indegree. These results support the notion that in web resource ranking indegree and outdegree are of fundamental importance. Rankings of webgraphs of different sizes and queries are presented to illustrate our analysis.
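
    For comparison with the PageRank sketch given earlier in this list, the mutual reinforcement at the heart of HITS fits in a few lines: authority scores are refreshed from the hub scores of pages that link in, hub scores from the authority scores of pages linked to, with a normalization step in between. This is the textbook algorithm, not the generalized framework of the paper.

        # Hedged sketch: textbook HITS (hub/authority) iteration on an adjacency list.
        def hits(links, iters=100):
            nodes = set(links) | {v for outs in links.values() for v in outs}
            hub = {u: 1.0 for u in nodes}
            auth = {u: 1.0 for u in nodes}
            for _ in range(iters):
                # authority score: sum of hub scores of pages linking to the node
                auth = {u: 0.0 for u in nodes}
                for u, outs in links.items():
                    for v in outs:
                        auth[v] += hub[u]
                # hub score: sum of authority scores of pages the node links to
                hub = {u: sum(auth[v] for v in links.get(u, [])) for u in nodes}
                # L2 normalisation keeps the iteration from blowing up
                for vec in (auth, hub):
                    norm = sum(x * x for x in vec.values()) ** 0.5 or 1.0
                    for u in vec:
                        vec[u] /= norm
            return hub, auth

        hubs, auths = hits({"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]})
        print(sorted(auths, key=auths.get, reverse=True))   # ranking by authority score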

  16. Gotta survey somebody : Methodological challenges in population studies of older people

    OpenAIRE

    Kelfve, Susanne

    2015-01-01

    Conducting representative surveys of older people is challenging. This thesis aims to analyze a) the characteristics of individuals at risk of being underrepresented in surveys of older people, b) the systematic errors likely to occur as a result of these selections, and c) whether these systematic errors can be minimized by weighting adjustments.   In Study I, we investigated a) who would be missing from a survey that excluded those living in institutions and that did not use indirect interv...

  17. Dynamic Web Pages: Performance Impact on Web Servers.

    Science.gov (United States)

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)

  18. The relative worst order ratio applied to paging

    DEFF Research Database (Denmark)

    Boyar, Joan; Favrholdt, Lene Monrad; Larsen, Kim Skak

    2007-01-01

    The relative worst order ratio, a new measure for the quality of on-line algorithms, was recently defined and applied to two bin packing problems. Here, we apply it to the paging problem and obtain the following results: We devise a new deterministic paging algorithm, Retrospective-LRU, and show...

  19. #NoMorePage3

    DEFF Research Database (Denmark)

    Glozer, Sarah; McCarthy, Lauren; Whelan, Glen

    2015-01-01

    Fourth wave feminists are currently seeking to bring an end to The Sun’s Page 3, a British institution infamous for featuring a topless female model daily. This paper investigates the No More Page 3 (NMP3) campaign through which feminist activists have sought to disrupt the institutionalized...... the institutional work and political corporate social responsibility literatures, we document the manner in which feminist activists have used The Co- operative’s social media site to publicly disrupt entrenched gender norms. Through identifying symbiotic yet competing discourses we discover themes of disruption...... and maintenance amongst and between interlocutors, facilitated by The Co-operative’s arena of citizenship and its notion of suspended discourse. Our analysis contributes to the institutional work literature by demonstrating the mutual need for disruption to ‘meet’ or contest maintenance work in corporate practice...

  20. The mediating role of facebook fan pages.

    Science.gov (United States)

    Chih, Wen-Hai; Hsu, Li-Chun; Wang, Kai-Yu; Lin, Kuan-Yu

    2014-01-01

    Using the dual mediation hypothesis, this study investigates the role of interestingness (the power of attracting or holding one's attention) attitude towards the news, in the formation of Facebook Fan Page users' electronic word-of-mouth intentions. A total of 599 Facebook fan page users in Taiwan were recruited and structural equation modeling (SEM) was used to test the research hypotheses. The results show that both perceived news entertainment and informativeness positively influence interestingness attitude towards the news. Interestingness attitude towards the news subsequently influences hedonism and utilitarianism attitudes towards the Fan Page, which then influence eWOM intentions. Interestingness attitude towards the news plays a more important role than hedonism and utilitarianism attitudes in generating electronic word-of-mouth intentions. Based on the findings, the implications and future research suggestions are provided.

  1. An EPRI methodology for determining and monitoring simulator operating limits

    International Nuclear Information System (INIS)

    Eichelberg, R.; Pellechi, M.; Wolf, B.; Colley, R.

    1989-01-01

    Of paramount concern to nuclear utilities today is whether their plant-referenced simulator(s) comply with ANSI/ANS 3.5-1985. Of special interest is Section 4.3 of the Standard which requires, in part, that a means be provided to alert the instructor when certain parameters approach values indicative of events beyond the implemented model or known plant behavior. EPRI established Research Project 2054-2 to develop a comprehensive plan for determining, monitoring, and implementing simulator operating limits. As part of the project, a survey was conducted to identify the current/anticipated approach each of the sampled utilities was using to meet the requirements of Section 4.3. A preliminary methodology was drafted and host utilities interviewed. The interview process led to redefining the methodology. This paper covers the objectives of the EPRI project, survey responses, overview of the methodology, resource requirements and conclusions

  2. Measurement of environmental impacts of telework adoption amidst change in complex organizations. AT and T survey methodology and results

    Energy Technology Data Exchange (ETDEWEB)

    Atkyns, Robert; Blazek, Michele; Roitz, Joseph [AT and T, 179 Bothin Road, 94930 Fairfax, CA (United States)

    2002-10-01

    Telecommuting practices and their environmental and organizational performance impacts have stimulated research across academic disciplines. Although telecommuting trends and impact projections are reported, few true longitudinal studies involving large organizations have been conducted. Published studies typically lack the research design elements to control a major confounding variable: rapid and widespread organizational change. Yet social science 'Best Practices' and market research industry quality control procedures exist that can help manage organizational change effects and other common sources of measurement error. In 1992, AT and T established a formal, corporate-wide telecommuting policy. A research and statistical modeling initiative was implemented to measure how flexible work arrangements reduce automotive emissions. Annual employee surveys were begun in 1994. As telecommuting benefits have been increasingly recognized within AT and T, the essential construct has been redefined as 'telework.' The survey's scope has expanded to address broader organization issues and provide guidance to multiple internal constituencies. This paper focuses upon the procedures used to reliably measure the adoption of telework practices and model their environmental impact, and contrasts those procedures with other, less reliable methodologies.

  3. Conducting Web-based Surveys.

    OpenAIRE

    David J. Solomon

    2001-01-01

    Web-based surveying is becoming widely used in social science and educational research. The Web offers significant advantages over more traditional survey techniques; however, there are still serious methodological challenges with using this approach. Currently, coverage bias (the fact that significant numbers of people do not have access to, or choose not to use, the Internet) is of most concern to researchers. Survey researchers also have much to learn concerning the most effective ways to conduct s...

  4. Identify Web-page Content meaning using Knowledge based System for Dual Meaning Words

    OpenAIRE

    Sinha, Sukanta; Dattagupta, Rana; Mukhopadhyay, Debajyoti

    2012-01-01

    The meaning of Web-page content plays a big role in producing a search result from a search engine. In most cases the Web-page meaning is stored in the title or meta-tag area, but those meanings do not always match the Web-page content. To overcome this situation we need to go through the Web-page content to identify the Web-page meaning. In cases where the Web-page content holds dual-meaning words, it is really difficult to identify the meaning of the Web-page. In this paper, we are introdu...

  5. Mathematicians' Views on Current Publishing Issues: A Survey of Researchers

    Science.gov (United States)

    Fowler, Kristine K.

    2011-01-01

    This article reports research mathematicians' attitudes about and activity in specific scholarly communication areas, as captured in a 2010 survey of more than 600 randomly-selected mathematicians worldwide. Key findings include: (1) Most mathematicians have papers in the arXiv, but posting to their own web pages remains more common; (2) A third…

  6. Patients With Thumb Carpometacarpal Arthritis Have Quantifiable Characteristic Expectations That Can Be Measured With a Survey.

    Science.gov (United States)

    Kang, Lana; Hashmi, Sohaib Z; Nguyen, Joseph; Lee, Steve K; Weiland, Andrew J; Mancuso, Carol A

    2016-01-01

    Although patient expectations associated with major orthopaedic conditions have shown clinically relevant and variable effects on outcomes, expectations associated with thumb carpometacarpal (CMC) arthritis have not been identified, described, or analyzed before, to our knowledge. We asked: (1) Do patients with thumb CMC arthritis express characteristic expectations that are quantifiable and have measurable frequency? (2) Can a survey on expectations developed from patient-derived data quantitate expectations in patients with thumb CMC arthritis? The study was a prospective cohort study. The first phase was a 12-month period involving interviews of 42 patients with thumb CMC arthritis to define their expectations of treatment. The interview process used techniques and principles of qualitative methodology, including open-ended interview questions, unrestricted time, and a study size determined by data saturation. Verbatim responses provided content for the draft survey. The second phase was a 12-month period assessing the survey for test-retest reliability with the recruitment of 36 participants who completed the survey twice. The survey was finalized from clinically relevant content, frequency of endorsement, weighted kappa values for concordance of responses, and the intraclass correlation coefficient and Cronbach's alpha for interrater reliability and internal consistency. Thirty-two patients volunteered 256 characteristic expectations, which consisted of 21 discrete categories. Expectations with similar concepts were combined by eliminating redundancy while maintaining original terminology. These were reduced to 19 items that comprised a one-page survey. This survey showed high concordance, interrater reliability, and internal consistency, with weighted kappa values between 0.58 and 0.78 (95% CI, 0.39-0.78). Patients with thumb CMC arthritis volunteer a characteristic and quantifiable set of expectations. Using responses recorded verbatim from patient interviews, a clinically
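
    Internal consistency of a fixed-length survey such as this one is typically summarised with Cronbach's alpha, computed from the respondent-by-item score matrix. The sketch below applies the standard formula to made-up Likert-type data; it is not the study's dataset.

        # Hedged sketch: Cronbach's alpha for a respondents x items score matrix.
        import numpy as np

        def cronbach_alpha(scores):
            """scores: 2-D array, rows = respondents, columns = survey items."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]                          # number of items
            item_vars = scores.var(axis=0, ddof=1)       # variance of each item
            total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
            return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

        # Illustrative 4-point Likert responses from 6 respondents on 5 items.
        demo = [[4, 3, 4, 4, 3],
                [2, 2, 3, 2, 2],
                [3, 3, 3, 4, 3],
                [4, 4, 4, 4, 4],
                [1, 2, 1, 2, 2],
                [3, 2, 3, 3, 3]]
        print(f"Cronbach's alpha = {cronbach_alpha(demo):.2f}")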

  7. Who should be undertaking population-based surveys in humanitarian emergencies?

    Directory of Open Access Journals (Sweden)

    Spiegel Paul B

    2007-06-01

    Background: Timely and accurate data are necessary to prioritise and effectively respond to humanitarian emergencies. 30-by-30 cluster surveys are commonly used in humanitarian emergencies because of their purported simplicity and reasonable validity and precision. Agencies have increasingly used 30-by-30 cluster surveys to undertake measurements beyond immunisation coverage and nutritional status. Methodological errors in cluster surveys have likely occurred for decades in humanitarian emergencies, often with unknown or unevaluated consequences. Discussion: Most surveys in humanitarian emergencies are done by non-governmental organisations (NGOs). Some undertake good quality surveys while others have an already overburdened staff with limited epidemiological skills. Manuals explaining cluster survey methodology are available and in use. However, it is debatable whether using standardised, 'cookbook' survey methodologies is appropriate. Coordination of surveys is often lacking. If a coordinating body is established, as recommended, it is questionable whether it should have sole authority to release surveys due to insufficient independence. Donors should provide sufficient funding for personnel, training, and survey implementation, and not solely for direct programme implementation. Summary: A dedicated corps of trained epidemiologists needs to be identified and made available to undertake surveys in humanitarian emergencies. NGOs in the field may need to form an alliance with certain specialised agencies or pool technically capable personnel. If NGOs continue to do surveys by themselves, a simple training manual with sample survey questionnaires, methodology, standardised files for data entry and analysis, and a manual for interpretation should be developed and modified locally for each situation. At the beginning of an emergency, a central coordinating body should be established that has sufficient authority to set survey standards
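
    The "30-by-30" design referred to above is classically implemented as two-stage sampling: 30 clusters are drawn with probability proportional to size (systematic PPS on cumulative population), then 30 households are selected within each cluster. The sketch below illustrates only the first stage, with invented population figures.

        # Hedged sketch: systematic PPS selection of 30 clusters (illustrative populations).
        import random

        def select_clusters_pps(populations, n_clusters=30, seed=None):
            """populations: dict cluster_name -> population size."""
            rng = random.Random(seed)
            total = sum(populations.values())
            interval = total / n_clusters
            start = rng.uniform(0, interval)
            targets = [start + i * interval for i in range(n_clusters)]
            chosen, cumulative, idx = [], 0.0, 0
            for name, size in populations.items():
                cumulative += size
                while idx < n_clusters and targets[idx] < cumulative:
                    chosen.append(name)          # large clusters can be selected more than once
                    idx += 1
            return chosen

        villages = {f"village_{i:02d}": pop for i, pop in
                    enumerate(random.Random(1).choices(range(200, 5000), k=80))}
        print(select_clusters_pps(villages, seed=42)[:10])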

  8. AngularJS Performance: A Survey Study

    OpenAIRE

    Ramos, Miguel; Valente, Marco Tulio; Terra, Ricardo

    2017-01-01

    AngularJS is a popular JavaScript MVC-based framework to construct single-page web applications. In this paper, we report the results of a survey with 95 professional developers about performance issues of AngularJS applications. We report common practices followed by developers to avoid performance problems (e.g., use of third-party or custom components), the general causes of performance problems in AngularJS applications (e.g., inadequate architecture decisions taken by AngularJS users), a...

  9. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    Science.gov (United States)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

    We use the methods of quantum chaos and Random Matrix Theory for the analysis of statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by the logarithm of the PageRank probability at a given node. After the standard energy-level unfolding procedure we establish that the nearest-spacing distribution of PageRank probabilities is described by the Poisson law typical of integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French and German. We argue that due to the absence of level repulsion the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that the nearby PageRank probabilities fluctuate as random independent variables.
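
    The procedure described above (treat the logarithm of each PageRank probability as an effective energy level, unfold the spectrum, then examine nearest-neighbour spacings) can be sketched as follows. A random Dirichlet vector stands in for a real PageRank vector, and rescaling the spacings to unit mean is only a crude stand-in for the standard unfolding procedure, so this illustrates the shape of the analysis rather than the paper's exact pipeline.

        # Hedged sketch: nearest-spacing statistics of log-PageRank "energy levels".
        import numpy as np

        def nearest_spacings(pagerank_values):
            """Nearest-neighbour spacings of sorted log-PageRank levels, rescaled to unit mean."""
            energies = np.sort(np.log(np.asarray(pagerank_values, dtype=float)))
            spacings = np.diff(energies)
            return spacings / spacings.mean()      # compare the histogram with P(s) = exp(-s)

        ranks = np.random.default_rng(0).dirichlet(np.ones(5000))   # stand-in PageRank vector
        s = nearest_spacings(ranks)
        hist, _ = np.histogram(s, bins=np.linspace(0.0, 4.0, 21), density=True)
        print("empirical P(s) in the first bins:", np.round(hist[:5], 2))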

  10. Using Shadow Page Cache to Improve Isolated Drivers Performance

    Directory of Open Access Journals (Sweden)

    Hao Zheng

    2015-01-01

    With the advantage of the reusability property of virtualization technology, users can reuse various types and versions of existing operating systems and drivers in a virtual machine, so as to customize their application environment. In order to prevent users' virtualization environments from being impacted by driver faults in the virtual machine, Chariot examines the correctness of a driver's write operations by combining capture of the driver's write operations with a driver's private access control table. However, this method needs to keep the write permission of the shadow page table as read-only, so as to capture the isolated driver's write operations through page faults, which adversely affects the performance of the driver. Based on delaying setting frequently used shadow pages' write permissions to read-only, this paper proposes an algorithm using a shadow page cache to improve the performance of isolated drivers and carefully studies the relationship between the performance of drivers and the size of the shadow page cache. Experimental results show that, through the shadow page cache, the performance of isolated drivers can be greatly improved without impacting Chariot's reliability too much.

  11. Automatic comic page image understanding based on edge segment analysis

    Science.gov (United States)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.
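
    The first two stages (edge extraction and line-segment detection) have widely used off-the-shelf counterparts. The sketch below uses OpenCV's Canny detector and probabilistic Hough transform as stand-ins for the paper's own edge-point chaining and top-down segment detection, just to show the kind of intermediate output the storyboard-detection stage works from; the file path is a placeholder.

        # Hedged sketch: edge and line-segment extraction from a scanned comic page
        # (OpenCV stand-ins for the paper's custom chaining/segment detection).
        import cv2
        import numpy as np

        def extract_line_segments(image_path):
            gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
            if gray is None:
                raise FileNotFoundError(image_path)
            edges = cv2.Canny(gray, 50, 150)                     # Canny edge map
            segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                                       threshold=80, minLineLength=60, maxLineGap=5)
            return [] if segments is None else [tuple(s[0]) for s in segments]

        # Long, mostly horizontal/vertical segments are candidate storyboard borders.
        # segs = extract_line_segments("comic_page.png")   # placeholder path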

  12. Lessons Learned from the Administration of a Web-Based Survey.

    Science.gov (United States)

    Mertler, Craig A.

    This paper describes the methodology used in a research study involving the collection of data through a Web-based survey, focusing on the advantages and limitations of the methodology. The Teacher motivation and Job Satisfaction Survey was administered to K-12 teachers. Many of the difficulties occurred during the planning phase, as opposed to…

  13. A survey of decision tree classifier methodology

    Science.gov (United States)

    Safavian, S. R.; Landgrebe, David

    1991-01-01

    Decision tree classifiers (DTCs) are used successfully in many diverse areas such as radar signal classification, character recognition, remote sensing, medical diagnosis, expert systems, and speech recognition. Perhaps the most important feature of DTCs is their capability to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution which is often easier to interpret. A survey of current methods for DTC design and of the various existing issues is presented. After considering the potential advantages of DTCs over single-stage classifiers, the subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed.
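
    A minimal example of the kind of classifier the survey covers, using a generic library implementation on a toy dataset rather than any of the application domains discussed in the article:

        # Hedged sketch: fitting and reading back a small decision tree classifier.
        from sklearn.datasets import load_iris
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier, export_text

        X, y = load_iris(return_X_y=True)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
        print(f"held-out accuracy: {tree.score(X_te, y_te):.2f}")
        # the tree breaks the decision into a readable sequence of simple threshold tests
        print(export_text(tree, feature_names=load_iris().feature_names))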

  14. Calculating PageRank in a changing network with added or removed edges

    Science.gov (United States)

    Engström, Christopher; Silvestrov, Sergei

    2017-01-01

    PageRank was initially developed by S. Brin and L. Page in 1998 to rank homepages on the Internet using the stationary distribution of a Markov chain created from the web graph. Due to the large size of the web graph and of many other real-world networks, fast methods to calculate PageRank are needed, and even though the original way of calculating PageRank using power iteration is rather fast, many other approaches have been made to improve the speed further. In this paper we consider the problem of recalculating the PageRank of a changing network where the PageRank of a previous version of the network is known. In particular, we consider the special case of adding or removing edges to a single vertex in the graph or graph component.
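
    One simple way to exploit a known PageRank vector when a few edges change, as considered above, is to restart the ordinary power iteration from the old vector instead of from the uniform distribution; far fewer iterations are then typically needed. The self-contained sketch below only illustrates that warm start, not the component-based method developed in the paper.

        # Hedged sketch: recompute PageRank after an edge change, warm-started from
        # the previous vector (illustrative; not the paper's method).
        def pagerank_from(links, init=None, damping=0.85, tol=1e-12, max_iter=1000):
            nodes = set(links) | {v for outs in links.values() for v in outs}
            n = len(nodes)
            rank = ({u: init.get(u, 1.0 / n) for u in nodes} if init
                    else {u: 1.0 / n for u in nodes})
            for it in range(1, max_iter + 1):
                new = {u: (1.0 - damping) / n for u in nodes}
                for u in nodes:
                    outs = links.get(u, []) or list(nodes)   # dangling nodes spread uniformly
                    share = damping * rank[u] / len(outs)
                    for v in outs:
                        new[v] += share
                if max(abs(new[u] - rank[u]) for u in nodes) < tol:
                    return new, it
                rank = new
            return rank, max_iter

        graph = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["A"]}
        old_rank, cold_iters = pagerank_from(graph)
        graph["D"] = ["A", "C"]                              # add one edge at vertex D
        _, warm_iters = pagerank_from(graph, init=old_rank)
        print(f"cold start: {cold_iters} iterations, warm start: {warm_iters} iterations")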

  15. Block models and personalized PageRank.

    Science.gov (United States)

    Kloumann, Isabel M; Ugander, Johan; Kleinberg, Jon

    2017-01-03

    Methods for ranking the importance of nodes in a network have a rich history in machine learning and across domains that analyze structured data. Recent work has evaluated these methods through the "seed set expansion problem": given a subset of nodes from a community of interest in an underlying graph, can we reliably identify the rest of the community? We start from the observation that the most widely used techniques for this problem, personalized PageRank and heat kernel methods, operate in the space of "landing probabilities" of a random walk rooted at the seed set, ranking nodes according to weighted sums of landing probabilities of different length walks. Both schemes, however, lack an a priori relationship to the seed set objective. In this work, we develop a principled framework for evaluating ranking methods by studying seed set expansion applied to the stochastic block model. We derive the optimal gradient for separating the landing probabilities of two classes in a stochastic block model and find, surprisingly, that under reasonable assumptions the gradient is asymptotically equivalent to personalized PageRank for a specific choice of the PageRank parameter that depends on the block model parameters. This connection provides a formal motivation for the success of personalized PageRank in seed set expansion and node ranking generally. We use this connection to propose more advanced techniques incorporating higher moments of landing probabilities; our advanced methods exhibit greatly improved performance, despite being simple linear classification rules, and are even competitive with belief propagation.
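
    In generic notation (the record above has lost its inline formulas, so the symbols here are ours, not the authors'), the personalized PageRank vector for a seed distribution s is the geometrically weighted sum of landing probabilities of walks of increasing length rooted at the seed set:

        \pi_{S} \;=\; \alpha \sum_{k=0}^{\infty} (1-\alpha)^{k}\, s\, P^{k}
        \;=\; \alpha\, s\,\bigl(I - (1-\alpha)P\bigr)^{-1},

    where P is the random-walk transition matrix and α the teleportation (restart) probability; the result summarised above is that, for a suitable choice of that parameter tied to the block-model parameters, this ranking is asymptotically equivalent to the optimal linear rule for separating the two classes.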

  16. On Survey Data Analysis in Corporate Finance

    OpenAIRE

    Serita, Toshio

    2008-01-01

    Recently, survey data analysis has emerged as a new method for testing hypotheses and for clarifying the relative importance of different factors in corporate finance decisions. This paper investigates the advantages and drawbacks of survey data analysis, the methodology of survey data analysis such as questionnaire design, and analytical methods for survey data, in comparison with traditional large sample analysis. We show that survey data analysis does not replace traditional large sample analysi...

  17. Navigation To and From a Page: Which Links Get Clicked From Where

    Science.gov (United States)

    Use Google Analytics navigation summary data to find out what page users most frequently click your Contact Us link from (Previous Page Path), or which links on your homepage are popular or unpopular (Next Page Path).

  18. INTEGRATED SURVEY FOR ARCHITECTURAL RESTORATION: A METHODOLOGICAL COMPARISON OF TWO CASE STUDIES

    Directory of Open Access Journals (Sweden)

    G. Bianchi

    2016-06-01

    A preliminary survey campaign is essential in projects of restoration, urban renewal, rebuilding or promotion of architectural heritage. Today several survey techniques allow full 3D object restitution and modelling that provides a richer description than simple 2D representations. However, the amount of data to collect increases dramatically, and a trade-off must be found between efficiency and productivity on one side and accuracy and completeness of the results on the other. Depending on the extent and the complexity of the task, a single technique or a combination of several ones might be employed. Especially when documentation at different scales and with different levels of detail is foreseen, the latter will likely be necessary. The paper describes two architectural surveys in Italy: the old village of Navelli (AQ), affected by the earthquake in 2009, and the two most relevant remains in Codiponte (MS), damaged by the earthquake in 2013, both in the context of a project of restoration and conservation. In both sites, a 3D survey was necessary to represent the objects effectively. An integrated survey campaign was performed in both cases, consisting of a GPS network as support for georeferencing, an aerial survey and a field survey made by laser scanner and close-range photogrammetry. The two case studies, thanks to their peculiarities, can be taken as exemplars to ask whether the integration of different surveying techniques is today still mandatory or, considering the technical advances of each technology, is in fact just optional.

  19. Development Methodology for an Integrated Legal Cadastre

    NARCIS (Netherlands)

    Hespanha, J.P.

    2012-01-01

    This Thesis describes the research process followed in order to achieve a development methodology applicable to the reform of cadastral systems with a legal basis. It was motivated by the author’s participation in one of the first surveying and mapping operations for a digital cadastre in Portugal,

  20. Is Domain Highlighting Actually Helpful in Identifying Phishing Web Pages?

    Science.gov (United States)

    Xiong, Aiping; Proctor, Robert W; Yang, Weining; Li, Ninghui

    2017-06-01

    To evaluate the effectiveness of domain highlighting in helping users identify whether Web pages are legitimate or spurious. As a component of the URL, a domain name can be overlooked. Consequently, browsers highlight the domain name to help users identify which Web site they are visiting. Nevertheless, few studies have assessed the effectiveness of domain highlighting, and the only formal study confounded highlighting with instructions to look at the address bar. We conducted two phishing detection experiments. Experiment 1 was run online: Participants judged the legitimacy of Web pages in two phases. In Phase 1, participants were to judge the legitimacy based on any information on the Web page, whereas in Phase 2, they were to focus on the address bar. Whether the domain was highlighted was also varied. Experiment 2 was conducted similarly but with participants in a laboratory setting, which allowed tracking of fixations. Participants differentiated the legitimate and fraudulent Web pages better than chance. There was some benefit of attending to the address bar, but domain highlighting did not provide effective protection against phishing attacks. Analysis of eye-gaze fixation measures was in agreement with the task performance, but heat-map results revealed that participants' visual attention was attracted by the highlighted domains. Failure to detect many fraudulent Web pages even when the domain was highlighted implies that users lacked knowledge of Web page security cues or how to use those cues. Potential applications include development of phishing prevention training incorporating domain highlighting with other methods to help users identify phishing Web pages.

  1. PageRank (II): Mathematics

    African Journals Online (AJOL)

    maths/stats

    ... GAUSS-SEIDEL'S NUMERICAL ALGORITHMS IN PAGERANK ANALYSIS. ... The convergence is guaranteed if the absolute value of the largest eigen ... improved Gauss-Seidel iteration algorithm, based on the decomposition M = D + L + U. ... This corresponds to determining the eigenvector of T with eigenvalue 1.
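
    As a generic illustration of the approach named in this record (not necessarily the article's exact algorithm), PageRank can be written as the linear system x = (1-α)v + αPᵀx and solved by Gauss-Seidel sweeps, which use each freshly updated component immediately and usually converge in fewer sweeps than plain power iteration.

        # Hedged sketch: Gauss-Seidel sweeps for the PageRank linear system
        # x = (1 - alpha) * v + alpha * P^T x   (dangling nodes ignored for brevity).
        def pagerank_gauss_seidel(links, alpha=0.85, tol=1e-12, max_sweeps=200):
            nodes = sorted(set(links) | {v for outs in links.values() for v in outs})
            n = len(nodes)
            in_links = {u: [] for u in nodes}                # who links to u, with out-degree
            for u, outs in links.items():
                for v in outs:
                    in_links[v].append((u, len(outs)))
            x = {u: 1.0 / n for u in nodes}
            for sweep in range(1, max_sweeps + 1):
                delta = 0.0
                for u in nodes:                              # in-place (Gauss-Seidel) update
                    new_xu = (1.0 - alpha) / n + alpha * sum(x[w] / deg for w, deg in in_links[u])
                    delta = max(delta, abs(new_xu - x[u]))
                    x[u] = new_xu
                if delta < tol:
                    break
            return x, sweep

        ranks, sweeps = pagerank_gauss_seidel({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
        print(sweeps, {k: round(v, 4) for k, v in ranks.items()})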

  2. Mode Equivalence of Health Indicators Between Data Collection Modes and Mixed-Mode Survey Designs in Population-Based Health Interview Surveys for Children and Adolescents: Methodological Study

    Science.gov (United States)

    Hoffmann, Robert; Houben, Robin; Krause, Laura; Kamtsiuris, Panagiotis; Gößwald, Antje

    2018-01-01

    Background The implementation of an Internet option in an existing public health interview survey using a mixed-mode design is attractive because of lower costs and faster data availability. Additionally, mixed-mode surveys can increase response rates and improve sample composition. However, mixed-mode designs can increase the risk of measurement error (mode effects). Objective This study aimed to determine whether the prevalence rates or mean values of self- and parent-reported health indicators for children and adolescents aged 0-17 years differ between self-administered paper-based questionnaires (SAQ-paper) and self-administered Web-based questionnaires (SAQ-Web), as well as between a single-mode control group and different mixed-mode groups. Methods Data were collected for a methodological pilot of the third wave of the "German Health Interview and Examination Survey for Children and Adolescents". Questionnaires were completed by parents or adolescents. A population-based sample of 11,140 children and adolescents aged 0-17 years was randomly allocated to 4 survey designs—a single-mode control group with paper-and-pencil questionnaires only (n=970 parents, n=343 adolescents)—and 3 mixed-mode designs, all of which offered Web-based questionnaire options. In the concurrent mixed-mode design, both questionnaires were offered at the same time (n=946 parents, n=290 adolescents); in the sequential mixed-mode design, the SAQ-Web was sent first, followed by the paper questionnaire along with a reminder (n=854 parents, n=269 adolescents); and in the preselect mixed-mode design, both options were offered and the respondents were asked to request the desired type of questionnaire (n=698 parents, n=292 adolescents). In total, 3468 questionnaires of parents of children aged 0-17 years (SAQ-Web: n=708; SAQ-paper: n=2760) and 1194 questionnaires of adolescents aged 11-17 years (SAQ-Web: n=299; SAQ-paper: n=895) were analyzed. Sociodemographic characteristics and a broad

  3. Benchmarking survey for recycling.

    Energy Technology Data Exchange (ETDEWEB)

    Marley, Margie Charlotte; Mizner, Jack Harry

    2005-06-01

    This report describes the methodology, analysis and conclusions of a comparison survey of recycling programs at ten Department of Energy sites including Sandia National Laboratories/New Mexico (SNL/NM). The goal of the survey was to compare SNL/NM's recycling performance with that of other federal facilities, and to identify activities and programs that could be implemented at SNL/NM to improve recycling performance.

  4. Home Page of Richard Talaga

    Science.gov (United States)

    Personal home page with sections on current experiments (the MINOS neutrino experiment and Argonne's efforts to join this existing experiment), Talaga's presentations, and hobbies and photos, including dive photos; a link to one of his favorite hobbies is planned at the bottom of the page.

  5. Quality of drug information on the World Wide Web and strategies to improve pages with poor information quality. An intervention study on pages about sildenafil.

    Science.gov (United States)

    Martin-Facklam, Meret; Kostrzewa, Michael; Martin, Peter; Haefeli, Walter E

    2004-01-01

    The generally poor quality of health information on the World Wide Web (WWW) has caused preventable adverse outcomes. Quality management of information on the internet is therefore critical given its widespread use. In order to develop strategies for the safe use of drugs, we scored the general and content quality of pages about sildenafil and performed an intervention to improve their quality. The internet was searched with Yahoo and AltaVista for pages about sildenafil, and 303 pages were included. For the assessment of content quality, a score based on accuracy and completeness of essential drug information was assigned. For the assessment of general quality, four criteria were evaluated and their association with high content quality was determined by multivariate logistic regression analysis. The pages were randomly allocated to either the control or the intervention group. Evaluation took place before, as well as 7 and 22 weeks after, an intervention which consisted of two letters with individualized feedback information on the respective page, sent electronically to the address mentioned on the page. Providing references to scientific publications or prescribing information was significantly associated with high content quality (odds ratio: 8.2, 95% CI 3.2, 20.5). The intervention had no influence on general or content quality. To prevent adverse outcomes caused by misinformation on the WWW, individualized feedback to the address mentioned on the page was ineffective. Currently, the most straightforward approach is probably to inform lay persons about indicators of high information quality, i.e., the provision of references.

  6. Network and User-Perceived Performance of Web Page Retrievals

    Science.gov (United States)

    Kruse, Hans; Allman, Mark; Mallasch, Paul

    1998-01-01

    The development of the HTTP protocol has been driven by the need to improve the network performance of the protocol by allowing the efficient retrieval of multiple parts of a web page without the need for multiple simultaneous TCP connections between a client and a server. We suggest that the retrieval of multiple page elements sequentially over a single TCP connection may result in a degradation of the perceived performance experienced by the user. We attempt to quantify this perceived degradation through the use of a model which combines a web retrieval simulation and an analytical model of TCP operation. Starting with the current HTTP/1.1 specification, we first suggest a client-side heuristic to improve the perceived transfer performance. We show that the perceived speed of the page retrieval can be increased without sacrificing data transfer efficiency. We then propose a new client/server extension to the HTTP/1.1 protocol to allow for the interleaving of page element retrievals. We finally address the issue of the display of advertisements on web pages, and in particular suggest a number of mechanisms which can make efficient use of IP multicast to send advertisements to a number of clients within the same network.
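
    To make the perceived-performance point above concrete, here is a back-of-the-envelope illustration, not the authors' combined simulation and analytical TCP model: with invented per-element transfer times, it compares when each page element becomes available if elements are fetched one after another versus shared fairly ("interleaved") over a single connection.

    ```python
    # Toy comparison of when each page element becomes usable if elements are
    # retrieved sequentially versus interleaved over one connection.
    # Transfer times are invented; real behaviour depends on TCP dynamics.

    element_times = {"banner_image": 1.0, "stylesheet": 0.2, "logo": 0.4}  # seconds at full bandwidth

    # Sequential: each element finishes after all earlier elements plus itself.
    finish_seq, elapsed = {}, 0.0
    for name, t in element_times.items():
        elapsed += t
        finish_seq[name] = elapsed

    # Idealized interleaving (processor sharing): bandwidth is split evenly among
    # the elements still in transfer, so small elements become usable early.
    finish_int, remaining, clock = {}, dict(element_times), 0.0
    while remaining:
        name = min(remaining, key=remaining.get)   # smallest remaining work finishes first
        clock += remaining[name] * len(remaining)  # wall-clock time at shared bandwidth
        done = remaining.pop(name)
        finish_int[name] = clock
        remaining = {k: v - done for k, v in remaining.items()}

    print("sequential :", {k: round(v, 2) for k, v in finish_seq.items()})  # banner 1.0, stylesheet 1.2, logo 1.6
    print("interleaved:", {k: round(v, 2) for k, v in finish_int.items()})  # stylesheet 0.6, logo 1.0, banner 1.6
    ```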

  7. Hiding in Plain Sight: The Anatomy of Malicious Facebook Pages

    OpenAIRE

    Dewan, Prateek; Kumaraguru, Ponnurangam

    2015-01-01

    Facebook is the world's largest Online Social Network, having more than 1 billion users. Like most other social networks, Facebook is home to various categories of hostile entities who abuse the platform by posting malicious content. In this paper, we identify and characterize Facebook pages that engage in spreading URLs pointing to malicious domains. We used the Web of Trust API to determine domain reputations of URLs published by pages, and identified 627 pages publishing untrustworthy info...

  8. The survey of academic libraries

    CERN Document Server

    2014-01-01

    The Survey of Academic Libraries, 2014-15 Edition looks closely at key benchmarks for academic libraries in areas such as spending for books and e-books, deployment and pay rates for student workers, use of tablet computers, cloud computing and other new technologies, database licensing practices, and much more. The study includes detailed data on overall budgets, capital budgets, salaries and materials spending, and much more of interest to academic librarians and their suppliers. Data in this 200+ page report is broken out by size and type of library for easy benchmarking.

  9. THE NEW PURCHASING SERVICE PAGE NOW ON THE WEB!

    CERN Multimedia

    SPL Division

    2000-01-01

    Users of CERN's Purchasing Service are encouraged to visit the new Purchasing Service web page, accessible from the CERN homepage or directly at: http://spl-purchasing.web.cern.ch/spl-purchasing/ There, you will find answers to questions such as: Who are the buyers? What do I need to know before creating a DAI? How many offers do I need? Where shall I send the offer I received? I know the amount of my future requirement, how do I proceed? How are contracts adjudicated at CERN? Which exhibitions and visits of Member State companies are foreseen in the future? A company I know is interested in making a presentation at CERN, who should they contact? Additionally, you will find information concerning: The Purchasing procedures Market Surveys and Invitations to Tender The Industrial Liaison Officers appointed in each Member State The Purchasing Broker at CERN

  10. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    Science.gov (United States)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

    This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large number of building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrate the relationship between the seismic capacity of buildings and seismic action. The estimated result is compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
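
    The expected value of the seismic capacity index described above is, in essence, a probability-weighted average over damage states. The sketch below shows that arithmetic with invented damage-state probabilities and index values; it is a hedged illustration, not the paper's calibrated model.

    ```python
    # Expected seismic capacity index: SCev = sum over damage states of
    # P(state) * capacity_index(state). All numbers below are illustrative only.

    damage_probs = {"none": 0.40, "slight": 0.30, "moderate": 0.20, "extensive": 0.08, "complete": 0.02}
    capacity_index = {"none": 1.00, "slight": 0.85, "moderate": 0.60, "extensive": 0.30, "complete": 0.05}

    sc_ev = sum(damage_probs[state] * capacity_index[state] for state in damage_probs)
    print(round(sc_ev, 3))  # 0.40*1.00 + 0.30*0.85 + 0.20*0.60 + 0.08*0.30 + 0.02*0.05 = 0.8
    ```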

  11. [An evaluation of the quality of health web pages using a validated questionnaire].

    Science.gov (United States)

    Conesa Fuentes, Maria del Carmen; Aguinaga Ontoso, Enrique; Hernández Morante, Juan José

    2011-01-01

    The objective of the present study was to evaluate the quality of general health information on Spanish-language web pages, and of the official web pages of the Regional Services of the different Autonomous Regions. It is a cross-sectional study. We used a previously validated questionnaire to study the present state of health information on the Internet from a lay-user point of view. By means of PageRank (Google®), we obtained a group of websites, including a total of 65 health web pages. We applied some exclusion criteria and finally obtained a total of 36 websites. We also analyzed the official web pages of the different Health Services in Spain (19 websites), making a total of 54 health web pages. In the light of our data, we observed that the quality of the general health information web pages was generally rather low, especially regarding information quality. Not one page reached the maximum score (19 points). The mean score of the web pages was 9.8±2.8. In conclusion, to avoid the problems arising from this lack of quality, health professionals should design advertising campaigns and other media to teach lay users how to evaluate information quality. Copyright © 2009 Elsevier España, S.L. All rights reserved.

  12. Bibliographic survey on methodologies for development of health database of the population in case of cancer occurrences; Levantamento bibliografico sobre metodologias para elaboracao de um banco de dados da saude da populacao em casos de ocorrencias de cancer

    Energy Technology Data Exchange (ETDEWEB)

    Cavinato, Christianne C.; Andrade, Delvonei A. de; Sabundjian, Gaiane, E-mail: christiannecobellocavinato@gmail.com, E-mail: delvonei@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Diz, Maria Del Pilar E., E-mail: maria.pilar@icesp.org.br [Instituto do Cancer do Estado de Sao Paulo (ICESP), Sao Paulo, SP (Brazil)

    2014-07-01

    The objective is to survey existing methodologies for the development of a public health database, focusing on the health (fatal and nonfatal cancer occurrences) of the population surrounding a nuclear facility, for the purpose of calculating its environmental cost. From the methodologies found for developing this type of database, a methodology will be developed and applied to the internal public of IPEN/CNEN-SP, Brazil, as a pre-test for acquiring the desired health information.

  13. An Analysis of Academic Library Web Pages for Faculty

    Science.gov (United States)

    Gardner, Susan J.; Juricek, John Eric; Xu, F. Grace

    2008-01-01

    Web sites are increasingly used by academic libraries to promote key services and collections to teaching faculty. This study analyzes the content, location, language, and technological features of fifty-four academic library Web pages designed especially for faculty to expose patterns in the development of these pages.

  14. Online Public Access Catalog User Studies: A Review of Research Methodologies, March 1986-November 1989.

    Science.gov (United States)

    Seymour, Sharon

    1991-01-01

    Review of research methodologies used in studies of online public access catalog (OPAC) users finds that a variety of research methodologies--e.g., surveys, transaction log analysis, interviews--have been used with varying degrees of expertise. It is concluded that poor research methodology resulting from limited training and resources limits the…

  15. Folding worlds between pages

    CERN Multimedia

    Meier, Matthias

    2010-01-01

    "We all remember pop-up books form our childhood. As fascinated as we were back then, we probably never imagined how much engineering know-how went into these books. Pop-up engineer Anton Radevsky has even managed to fold a 27-kilometre particle accelerator into a book" (4 pages)

  16. Building interactive simulations in a Web page design program.

    Science.gov (United States)

    Kootsey, J Mailen; Siriphongs, Daniel; McAuley, Grant

    2004-01-01

    A new Web software architecture, NumberLinX (NLX), has been integrated into a commercial Web design program to produce a drag-and-drop environment for building interactive simulations. NLX is a library of reusable objects written in Java, including input, output, calculation, and control objects. The NLX objects were added to the palette of available objects in the Web design program to be selected and dropped on a page. Inserting an object in a Web page is accomplished by adding a template block of HTML code to the page file. HTML parameters in the block must be set to user-supplied values, so the HTML code is generated dynamically, based on user entries in a popup form. Implementing the object inspector for each object permits the user to edit object attributes in a form window. Except for model definition, the combination of the NLX architecture and the Web design program permits construction of interactive simulation pages without writing or inspecting code.

  17. Ranking nodes in growing networks: When PageRank fails.

    Science.gov (United States)

    Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng

    2015-11-10

    PageRank is arguably the most popular ranking algorithm which is being applied in real systems ranging from information to biological and infrastructure networks. Despite its outstanding popularity and broad use in different areas of science, the relation between the algorithm's efficacy and properties of the network on which it acts has not yet been fully understood. We study here PageRank's performance on a network model supported by real data, and show that realistic temporal effects make PageRank fail in individuating the most valuable nodes for a broad range of model parameters. Results on real data are in qualitative agreement with our model-based findings. This failure of PageRank reveals that the static approach to information filtering is inappropriate for a broad class of growing systems, and suggests that time-dependent algorithms that are based on the temporal linking patterns of these systems are needed to better rank the nodes.

  18. Ranking nodes in growing networks: When PageRank fails

    Science.gov (United States)

    Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng

    2015-11-01

    PageRank is arguably the most popular ranking algorithm which is being applied in real systems ranging from information to biological and infrastructure networks. Despite its outstanding popularity and broad use in different areas of science, the relation between the algorithm’s efficacy and properties of the network on which it acts has not yet been fully understood. We study here PageRank’s performance on a network model supported by real data, and show that realistic temporal effects make PageRank fail in individuating the most valuable nodes for a broad range of model parameters. Results on real data are in qualitative agreement with our model-based findings. This failure of PageRank reveals that the static approach to information filtering is inappropriate for a broad class of growing systems, and suggests that time-dependent algorithms that are based on the temporal linking patterns of these systems are needed to better rank the nodes.
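
    For readers who want to see the baseline algorithm that the two records above critique, here is a minimal sketch of the standard PageRank power iteration on a toy directed graph. The damping factor, dangling-node handling, and the graph itself are illustrative assumptions; the time-aware alternatives the authors call for are not shown.

    ```python
    # Minimal PageRank power iteration on a toy directed graph (adjacency lists).
    # Damping factor 0.85; dangling-node mass is spread uniformly (one common convention).

    def pagerank(links, d=0.85, iters=100, tol=1e-10):
        nodes = sorted(set(links) | {v for outs in links.values() for v in outs})
        n = len(nodes)
        rank = {u: 1.0 / n for u in nodes}
        for _ in range(iters):
            dangling = sum(rank[u] for u in nodes if not links.get(u))
            new = {}
            for u in nodes:
                incoming = sum(rank[v] / len(links[v]) for v in nodes if u in links.get(v, ()))
                new[u] = (1 - d) / n + d * (incoming + dangling / n)
            if sum(abs(new[u] - rank[u]) for u in nodes) < tol:
                return new
            rank = new
        return rank

    if __name__ == "__main__":
        toy_graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
        for node, score in sorted(pagerank(toy_graph).items(), key=lambda kv: -kv[1]):
            print(node, round(score, 4))
    ```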

  19. Mapping neighborhood scale survey responses with uncertainty metrics

    Directory of Open Access Journals (Sweden)

    Charles Robert Ehlschlaeger

    2016-12-01

    This paper presents a methodology of mapping population-centric social, infrastructural, and environmental metrics at neighborhood scale. This methodology extends traditional survey analysis methods to create cartographic products useful in agent-based modeling and geographic information analysis. It utilizes and synthesizes survey microdata, sub-upazila attributes, land use information, and ground truth locations of attributes to create neighborhood scale multi-attribute maps. Monte Carlo methods are employed to combine any number of survey responses to stochastically weight survey cases and to simulate survey cases' locations in a study area. Through such Monte Carlo methods, known errors from each of the input sources can be retained. By keeping individual survey cases as the atomic unit of data representation, this methodology ensures that important covariates are retained and that ecological inference fallacy is eliminated. These techniques are demonstrated with a case study from the Chittagong Division in Bangladesh. The results provide a population-centric understanding of many social, infrastructural, and environmental metrics desired in humanitarian aid and disaster relief planning and operations wherever long term familiarity is lacking. Of critical importance is that the resulting products have easy to use explicit representation of the errors and uncertainties of each of the input sources via the automatically generated summary statistics created at the application's geographic scale.
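
    A highly simplified sketch of the Monte Carlo idea follows: each survey case is repeatedly assigned a plausible location, and per-neighborhood summary statistics (mean and spread across realizations) retain the placement uncertainty. The cases, neighborhoods, probabilities, and metric are invented placeholders, not data from the Bangladesh case study.

    ```python
    # Monte Carlo placement of survey cases: simulate case locations many times,
    # then summarize a metric per neighborhood with a mean and standard deviation.
    import random
    import statistics

    CASES = [  # each case: a metric value and candidate neighborhoods with placement probabilities
        {"metric": 0.8, "candidates": {"north": 0.7, "south": 0.3}},
        {"metric": 0.4, "candidates": {"north": 0.2, "south": 0.8}},
        {"metric": 0.6, "candidates": {"north": 0.5, "south": 0.5}},
    ]

    def one_realization():
        totals, counts = {}, {}
        for case in CASES:
            hoods, probs = zip(*case["candidates"].items())
            hood = random.choices(hoods, weights=probs)[0]
            totals[hood] = totals.get(hood, 0.0) + case["metric"]
            counts[hood] = counts.get(hood, 0) + 1
        return {h: totals[h] / counts[h] for h in totals}

    def summarize(n_runs=1000):
        samples = {}
        for _ in range(n_runs):
            for hood, value in one_realization().items():
                samples.setdefault(hood, []).append(value)
        return {h: (statistics.mean(v), statistics.stdev(v)) for h, v in samples.items()}

    if __name__ == "__main__":
        for hood, (mean, spread) in sorted(summarize().items()):
            print(hood, round(mean, 3), round(spread, 3))
    ```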

  20. Survey and selection of assessment methodologies for GAVE options

    International Nuclear Information System (INIS)

    Weterings, R.

    1999-05-01

    The Dutch government is interested in the possibilities for a market introduction of new gaseous and liquid energy carriers. To this purpose the GAVE-programme was recently set up. This study is carried out within the framework of the GAVE-programme and aims at the selection of methodologies for assessing the technological, economic, ecological and social perspectives of these new energy options (so-called GAVE-options). Based on the results of these assessments the Dutch ministries of Housing, Planning and Environment (VROM) and Economic Affairs (EZ) will decide at the end of 1999 about starting demonstration projects of promising energy carriers

  1. Determining factors behind the PageRank log-log plot

    NARCIS (Netherlands)

    Volkovich, Y.; Litvak, Nelli; Donato, D.

    We study the relation between PageRank and other parameters of information networks such as in-degree, out-degree, and the fraction of dangling nodes. We model this relation through a stochastic equation inspired by the original definition of PageRank. Further, we use the theory of regular variation

  2. Project Management - Development of course materiale as WEB pages

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe; Bjergø, Søren

    1997-01-01

    Development of Internet pages with lesson plans, slideshows, links, a conference system, and an interactive student section for communication between students and with the teacher.

  3. Design methodology of Dutch banknotes

    Science.gov (United States)

    de Heij, Hans A. M.

    2000-04-01

    Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  4. Utilizing the Total Design Method in medicine: maximizing response rates in long, non-incentivized, personal questionnaire postal surveys.

    Science.gov (United States)

    Kazzazi, Fawz; Haggie, Rebecca; Forouhi, Parto; Kazzazi, Nazar; Malata, Charles M

    2018-01-01

    Maximizing response rates in questionnaires can improve their validity and quality by reducing non-response bias. A comprehensive analysis is essential for producing reasonable conclusions in patient-reported outcome research particularly for topics of a sensitive nature. This often makes long (≥7 pages) questionnaires necessary but these have been shown to reduce response rates in mail surveys. Our work adapted the "Total Design Method," initially produced for commercial markets, to raise response rates in a long (total: 11 pages, 116 questions), non-incentivized, very personal postal survey sent to almost 350 women. A total of 346 women who had undergone mastectomy and immediate breast reconstruction from 2008-2014 (inclusive) at Addenbrooke's University Hospital were sent our study pack (Breast-Q satisfaction questionnaire and support documents) using our modified "Total Design Method." Participants were sent packs and reminders according to our designed schedule. Of the 346 participants, we received 258 responses, an overall response rate of 74.5% with a useable response rate of 72.3%. One hundred and six responses were received before the week 1 reminder (30.6%), 120 before week 3 (34.6%), 225 before the week 7 reminder (64.6%) and the remainder within 3 weeks of the final pack being sent. The median age of patients that the survey was sent to, and the median age of the respondents, was 54 years. In this study, we have demonstrated the successful implementation of a novel approach to postal surveys. Despite the length of the questionnaire (nine pages, 116 questions) and limitations of expenses to mail a survey to ~350 women, we were able to attain a response rate of 74.6%.

  5. AUTOMATIC TAGGING OF PERSIAN WEB PAGES BASED ON N-GRAM LANGUAGE MODELS USING MAPREDUCE

    Directory of Open Access Journals (Sweden)

    Saeed Shahrivari

    2015-07-01

    Page tagging is one of the most important facilities for increasing the accuracy of information retrieval in the web. Tags are simple pieces of data that usually consist of one or several words, and briefly describe a page. Tags provide useful information about a page and can be used for boosting the accuracy of searching, document clustering, and result grouping. The most accurate solution to page tagging is using human experts. However, when the number of pages is large, humans cannot be used, and some automatic solutions should be used instead. We propose a solution called PerTag which can automatically tag a set of Persian web pages. PerTag is based on n-gram models and uses the tf-idf method plus some effective Persian language rules to select proper tags for each web page. Since our target is huge sets of web pages, PerTag is built on top of the MapReduce distributed computing framework. We used a set of more than 500 million Persian web pages during our experiments, and extracted tags for each page using a cluster of 40 machines. The experimental results show that PerTag is both fast and accurate.
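
    As a rough illustration of the tf-idf step mentioned above (not the PerTag system itself, which additionally uses n-gram language models, Persian-specific rules, and MapReduce), the sketch below scores the terms of one page against a tiny corpus and keeps the top-scoring terms as tags. The corpus, tokenization, and parameter choices are assumptions.

    ```python
    # Toy tf-idf tag selection: score each term of a target page against a small
    # corpus and keep the top-k terms as tags. Tokenization is naive whitespace splitting.
    import math
    from collections import Counter

    def top_tags(target, corpus, k=3):
        docs = [doc.lower().split() for doc in corpus]
        tokens = target.lower().split()
        tf = Counter(tokens)
        n_docs = len(docs) + 1  # count the target page itself
        scores = {}
        for term, count in tf.items():
            df = 1 + sum(1 for doc in docs if term in doc)  # +1 for the target page
            idf = math.log(n_docs / df)
            scores[term] = (count / len(tokens)) * idf
        return [term for term, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:k]]

    if __name__ == "__main__":
        corpus = ["web search ranking with link analysis for the web",
                  "survey methodology for health research with questionnaires"]
        page = "web page tagging with tf-idf for web search"
        print(top_tags(page, corpus))  # terms unique to the page score highest
    ```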

  6. CT paging arteriography with a multidetector-row CT. Advantages in splanchnic arterial imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, Seiji [Keio Univ., Tokyo (Japan). School of Medicine

    1999-11-01

    The purpose of this study is to assess the utility of CT paging arteriography with a multidetector-row CT as a replacement for conventional angiography in the evaluation of splanchnic arterial anomalies. Sixty-three patients underwent CT paging arteriography with a multidetector-row CT. In the 56 patients with conventional angiographic correlation, there was only one minor disagreement with CT paging arteriography. In the 7 patients who underwent IVDSA (intravenous digital subtraction angiography), CT paging arteriography defined four hepatic arterial anomalies which could not be depicted by IVDSA. In conclusion, CT paging arteriography provides a noninvasive means to identify splanchnic arterial anomalies. (author)

  7. Appalachian National Scenic Trail pilot survey

    Science.gov (United States)

    Stan Zarnoch; Michael Bowker; Ken Cordell; Matt Owens; Gary T. Green; Allison Ginn

    2011-01-01

    Visitation statistics on the Appalachian National Scenic Trail (AT) are important for management and Federal Government reporting purposes. However, no survey methodology has been developed to obtain accurate trailwide estimates over linear trails that traverse many hundreds of back-country miles. This research develops a stratified random survey design which utilizes...

  8. NOAA History - Main Page

    Science.gov (United States)

    Main page of the NOAA History website ('NOAA: America's Science and Service; NOAA Legacy, 1807-2007'), with links to the site index, information about the site, contacts, and further NOAA History features from around the nation.

  9. Cloaked Facebook pages: Exploring fake Islamist propaganda in social media

    DEFF Research Database (Denmark)

    Farkas, Johan Dam; Schou, Jannick; Neumayer, Christina

    2017-01-01

    This research analyses cloaked Facebook pages that are created to spread political propaganda by cloaking a user profile and imitating the identity of a political opponent in order to spark hateful and aggressive reactions. This inquiry is pursued through a multi-sited online ethnographic case study of Danish Facebook pages disguised as radical Islamist pages, which provoked racist and anti-Muslim reactions as well as negative sentiments towards refugees and immigrants in Denmark in general. Drawing on Jessie Daniels’ critical insights into cloaked websites, this research furthermore analyses...

  10. JavaScript: Convenient Interactivity for the Class Web Page.

    Science.gov (United States)

    Gray, Patricia

    This paper shows how JavaScript can be used within HTML pages to add interactive review sessions and quizzes incorporating graphics and sound files. JavaScript has the advantage of providing basic interactive functions without the use of separate software applications and players. Because it can be part of a standard HTML page, it is…

  11. 78 FR 9729 - Eastern States: Filing of Plat of Survey, North Carolina

    Science.gov (United States)

    2013-02-11

    ... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLES956000-L14200000-BJ0000] Eastern States..., on pages 318 through 319 a notice entitled ``Eastern States: Filing of Plats of Survey''. In said... Boundary, lands held in trust for the Eastern Band of Cherokee Indians, Swain County, in the State of North...

  12. Measuring consistency of web page design and its effects on performance and satisfaction.

    Science.gov (United States)

    Ozok, A A; Salvendy, G

    2000-04-01

    This study examines the methods for measuring the consistency levels of web pages and the effect of consistency on the performance and satisfaction of the world-wide web (WWW) user. For clarification, a home page is referred to as a single page that is the default page of a web site on the WWW. A web page refers to a single screen that indicates a specific address on the WWW. This study has tested a series of web pages that were mostly hyperlinked. Therefore, the term 'web page' has been adopted for the nomenclature while referring to the objects of which the features were tested. It was hypothesized that participants would perform better and be more satisfied using web pages that have consistent rather than inconsistent interface design; that the overall consistency level of an interface design would significantly correlate with the three elements of consistency, physical, communicational and conceptual consistency; and that physical and communicational consistencies would interact with each other. The hypotheses were tested in a four-group, between-subject design, with 10 participants in each group. The results partially support the hypothesis regarding error rate, but not regarding satisfaction and performance time. The results also support the hypothesis that each of the three elements of consistency significantly contribute to the overall consistency of a web page, and that physical and communicational consistencies interact with each other, while conceptual consistency does not interact with them.

  13. Code AI Personal Web Pages

    Science.gov (United States)

    Garcia, Joseph A.; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The document consists of a publicly available web site (george.arc.nasa.gov) for Joseph A. Garcia's personal web pages in the AI division. Only general information will be posted and no technical material. All the information is unclassified.

  14. Reporting on post-menopausal hormone therapy: an analysis of gynaecologists' web pages.

    Science.gov (United States)

    Bucksch, Jens; Kolip, Petra; Deitermann, Bernhilde

    2004-01-01

    The present study was designed to analyse Web pages of German gynaecologists with regard to postmenopausal hormone therapy (HT). There is a growing body of evidence, that the overall health risks of HT exceed the benefits. Making one's own informed choice has become a central concern for menopausal women. The Internet is an important source of health information, but the quality is often dubious. The study focused on the analysis of basic criteria such as last modification date and quality of the HT information content. The results of the Women's Health Initiative Study (WHI) were used as a benchmark. We searched for relevant Web pages by entering a combination of key words (9 x 13 = 117) into the search engine www.google.de. Each Web page was analysed using a standardized questionnaire. The basic criteria and the quality of content on each Web page were separately categorized by two evaluators. Disagreements were resolved by discussion. Of the 97 websites identified, basic criteria were not met by the majority. For example, the modification date was displayed by only 23 (23.7%) Web pages. The quality of content of most Web pages regarding HT was inaccurate and incomplete. Whilst only nine (9.3%) took up a balanced position, 66 (68%) recommended HT without any restrictions. In 22 cases the recommendation was indistinct and none of the sites refused HT. With regard to basic criteria, there was no difference between HT-recommending Web pages and sites with balanced position. Evidence-based information resulting from the WHI trial was insufficiently represented on gynaecologists' Web pages. Because of the growing number of consumers looking online for health information, the danger of obtaining harmful information has to be minimized. Web pages of gynaecologists do not appear to be recommendable for women because they do not provide recent evidence-based findings about HT.

  15. [Use of internet and electronic resources among Spanish intensivist physicians. First national survey].

    Science.gov (United States)

    Gómez-Tello, V; Latour-Pérez, J; Añón Elizalde, J M; Palencia-Herrejón, E; Díaz-Alersi, R; De Lucas-García, N

    2006-01-01

    To estimate the knowledge and use habits of different electronic resources in a sample of Spanish intensivists: Internet, e-mail, distribution lists, and portable electronic devices. Self-applied questionnaire. A 50-question questionnaire was distributed among Spanish intensivists through the hospital marketing delegates of a pharmaceutical company and through electronic forums. A total of 682 questionnaires were analyzed (participation: 74%). Ninety-six percent of those surveyed used the Internet individually; 67% admitted a training gap. The Internet was the second most used source for clinical consultations (61%), slightly behind consultation with colleagues (65%). The pages consulted most were bibliographic databases (65%) and electronic professional journals (63%), with limited use of evidence-based medicine pages (19%). Ninety percent of those surveyed used e-mail regularly in the practice of their profession, although 25% admitted that they were not aware of its possibilities. The use of e-mail decreased significantly with increasing age. A total of 62% of the intensivists used distribution lists. Of the rest, 42% were not aware of their existence and 32% admitted they had insufficient training to handle them. Twenty percent of those surveyed had portable electronic devices and 64% considered them useful, basically because of their rapid consultation at the bedside. Female gender was a negative predictive factor of their use (OR 0.35; 95% CI 0.2-0.63; p=0.0002). A large majority of the Spanish intensivists use the Internet and e-mail. E-mail lists and portable devices are still underused resources. There are important gaps in training and infrequent use of essential pages. There are specific groups that require targeted educational policies.

  16. Draft report: a selection methodology for LWR safety R and D programs and proposals

    International Nuclear Information System (INIS)

    Husseiny, A.A.; Ritzman, R.L.

    1980-03-01

    The results of work done to develop a methodology for selecting LWR safety R and D programs and proposals are described. A critical survey of relevant decision analysis methods is provided, including the specifics of multiattribute utility theory. This latter method forms the basis of the developed selection methodology. Details of the methodology and its use are provided, along with a sample illustration of its application.

  17. Draft report: a selection methodology for LWR safety R and D programs and proposals

    Energy Technology Data Exchange (ETDEWEB)

    Husseiny, A. A.; Ritzman, R. L.

    1980-03-01

    The results of work done to develop a methodology for selecting LWR safety R and D programs and proposals are described. A critical survey of relevant decision analysis methods is provided, including the specifics of multiattribute utility theory. This latter method forms the basis of the developed selection methodology. Details of the methodology and its use are provided, along with a sample illustration of its application.

  18. Social media for physiotherapy clinics: considerations in creating a Facebook page

    OpenAIRE

    Ahmed, Osman; Claydon, L.S.; Ribeiro, D.C.; Arumugam, A.; Higgs, C.; Baxter, G.D.

    2013-01-01

    Social media websites play a prominent role in modern society, and the most popular of these websites is Facebook. Increasingly, physiotherapy clinics have begun to utilize Facebook in order to create pages to publicize their services. There are many factors to consider in the planning, implementing, and maintenance of Facebook pages for physiotherapy clinics, including ethical and privacy issues. The primary purpose of creating a page must be clearly defined, with dedicated clinicians given ...

  19. Mode Equivalence of Health Indicators Between Data Collection Modes and Mixed-Mode Survey Designs in Population-Based Health Interview Surveys for Children and Adolescents: Methodological Study.

    Science.gov (United States)

    Mauz, Elvira; Hoffmann, Robert; Houben, Robin; Krause, Laura; Kamtsiuris, Panagiotis; Gößwald, Antje

    2018-03-05

    The implementation of an Internet option in an existing public health interview survey using a mixed-mode design is attractive because of lower costs and faster data availability. Additionally, mixed-mode surveys can increase response rates and improve sample composition. However, mixed-mode designs can increase the risk of measurement error (mode effects). This study aimed to determine whether the prevalence rates or mean values of self- and parent-reported health indicators for children and adolescents aged 0-17 years differ between self-administered paper-based questionnaires (SAQ-paper) and self-administered Web-based questionnaires (SAQ-Web), as well as between a single-mode control group and different mixed-mode groups. Data were collected for a methodological pilot of the third wave of the "German Health Interview and Examination Survey for Children and Adolescents". Questionnaires were completed by parents or adolescents. A population-based sample of 11,140 children and adolescents aged 0-17 years was randomly allocated to 4 survey designs: a single-mode control group with paper-and-pencil questionnaires only (n=970 parents, n=343 adolescents), and 3 mixed-mode designs, all of which offered Web-based questionnaire options. In the concurrent mixed-mode design, both questionnaires were offered at the same time (n=946 parents, n=290 adolescents); in the sequential mixed-mode design, the SAQ-Web was sent first, followed by the paper questionnaire along with a reminder (n=854 parents, n=269 adolescents); and in the preselect mixed-mode design, both options were offered and the respondents were asked to request the desired type of questionnaire (n=698 parents, n=292 adolescents). In total, 3468 questionnaires of parents of children aged 0-17 years (SAQ-Web: n=708; SAQ-paper: n=2760) and 1194 questionnaires of adolescents aged 11-17 years (SAQ-Web: n=299; SAQ-paper: n=895) were analyzed. Sociodemographic characteristics and a broad range of health indicators for

  20. Creating a Facebook Page for the Seismological Society of America

    Science.gov (United States)

    Newman, S. B.

    2009-12-01

    In August 2009, I created a Facebook “fan” page for the Seismological Society of America. We had been exploring cost-effective options for providing forums for two-way communication for some months. We knew that a number of larger technical societies had invested significant sums of money to create customized social networking sites but that a small society would need to use existing low-cost software options. The first thing I discovered when I began to set up the fan page was that an unofficial SSA Facebook group already existed, established by Steven J. Gibbons, a member in Norway. Steven had done an excellent job of posting material about SSA. Partly because of the existing group, the official SSA fan page gained fans rapidly. We began by posting information about our own activities and then added links to activities in the broader geoscience community. While much of this material also appeared on our website and in our publication, Seismological Research Letters (SRL), the tone on the FB page is different. It is less formal with more emphasis on photos and links to other sites, including our own. Fans who are active on FB see the posts as part of their social network and do not need to take the initiative to go to the SSA site. Although the goal was to provide a forum for two-way communication, our initial experience was that people were clearly reading the page but not contributing content. This appears to be the case with fan pages of sister geoscience societies. FB offers some demographic information to fan site administrators. In an initial review of the demographics it appeared that fans were younger than the overall demographics of the Society. It appeared that a few of the fans are not members or even scientists. Open questions are: what content will be most useful to fans? How will the existence of the page benefit the membership as a whole? Will the page ultimately encourage two-way communication as hoped? Web 2.0 is generating a series of new

  1. Near-Duplicate Web Page Detection: An Efficient Approach Using Clustering, Sentence Feature and Fingerprinting

    Directory of Open Access Journals (Sweden)

    J. Prasanna Kumar

    2013-02-01

    Duplicate and near-duplicate web pages are the chief concerns for web search engines. In reality, they incur enormous space to store the indexes, ultimately slowing down and increasing the cost of serving results. A variety of techniques have been developed to identify pairs of web pages that are "similar" to each other. The problem of finding near-duplicate web pages has been a subject of research in the database and web-search communities for some years. In order to identify the near duplicate web pages, we make use of sentence level features along with fingerprinting method. When a large number of web documents are in consideration for the detection of web pages, then at first, we use K-mode clustering and subsequently sentence feature and fingerprint comparison is used. Using these steps, we exactly identify the near duplicate web pages in an efficient manner. The experimentation is carried out on the web page collections and the results ensured the efficiency of the proposed approach in detecting the near duplicate web pages.
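
    The fingerprint-comparison idea can be pictured with the minimal sketch below, which hashes word-level shingles and flags page pairs whose Jaccard similarity exceeds a threshold. The shingle size and threshold are illustrative assumptions, and the K-mode clustering and sentence-level features of the approach above are omitted.

    ```python
    # Near-duplicate detection via hashed word shingles and Jaccard similarity.
    # Shingle size and threshold are illustrative choices, not values from the paper.

    def fingerprint(text, shingle_size=3):
        words = text.lower().split()
        shingles = {" ".join(words[i:i + shingle_size])
                    for i in range(max(1, len(words) - shingle_size + 1))}
        return {hash(s) for s in shingles}

    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0

    def near_duplicates(pages, threshold=0.8):
        prints = {pid: fingerprint(text) for pid, text in pages.items()}
        ids = sorted(prints)
        return [(p, q) for i, p in enumerate(ids) for q in ids[i + 1:]
                if jaccard(prints[p], prints[q]) >= threshold]

    if __name__ == "__main__":
        pages = {
            "p1": "the quick brown fox jumps over the lazy dog",
            "p2": "the quick brown fox jumps over the lazy dog today",
            "p3": "completely unrelated content about survey methodology",
        }
        print(near_duplicates(pages))  # expected: [('p1', 'p2')]
    ```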

  2. The impact of visual layout factors on performance in Web pages: a cross-language study.

    Science.gov (United States)

    Parush, Avi; Shwarts, Yonit; Shtub, Avy; Chandra, M Jeya

    2005-01-01

    Visual layout has a strong impact on performance and is a critical factor in the design of graphical user interfaces (GUIs) and Web pages. Many design guidelines employed in Web page design were inherited from human performance literature and GUI design studies and practices. However, few studies have investigated the more specific patterns of performance with Web pages that may reflect some differences between Web page and GUI design. We investigated interactions among four visual layout factors in Web page design (quantity of links, alignment, grouping indications, and density) in two experiments: one with pages in Hebrew, entailing right-to-left reading, and the other with English pages, entailing left-to-right reading. Some performance patterns (measured by search times and eye movements) were similar between languages. Performance was particularly poor in pages with many links and variable densities, but it improved with the presence of uniform density. Alignment was not shown to be a performance-enhancing factor. The findings are discussed in terms of the similarities and differences in the impact of layout factors between GUIs and Web pages. Actual or potential applications of this research include specific guidelines for Web page design.

  3. A replication and methodological critique of the study "Evaluating drug trafficking on the Tor Network".

    Science.gov (United States)

    Munksgaard, Rasmus; Demant, Jakob; Branwen, Gwern

    2016-09-01

    The development of cryptomarkets has gained increasing attention from academics, including a growing scientific literature on the distribution of illegal goods using cryptomarkets. Dolliver's 2015 article "Evaluating drug trafficking on the Tor Network: Silk Road 2, the Sequel" addresses this theme by evaluating drug trafficking on one of the most well-known cryptomarkets, Silk Road 2.0. The research on cryptomarkets in general, particularly in Dolliver's article, poses a number of new questions for methodologies. This commentary is structured around a replication of Dolliver's original study. The replication study is not based on Dolliver's original dataset, but on a second dataset collected by applying the same methodology. We have found that the results produced by Dolliver differ greatly from those of our replicated study. While a margin of error is to be expected, the inconsistencies we found are too great to attribute to anything other than methodological issues. The analysis and conclusions drawn from studies using these methods are promising and insightful. However, based on the replication of Dolliver's study, we suggest that researchers using these methodologies make their datasets available to other researchers, and that methodology and dataset metrics (e.g. number of downloaded pages, error logs) be described thoroughly in the context of web-o-metrics and web crawling. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Surveillance System for Risk and Protective Factors for Chronic Diseases by Telephone Survey (Vigitel): changes in weighting methodology.

    Science.gov (United States)

    Bernal, Regina Tomie Ivata; Iser, Betine Pinto Moehlecke; Malta, Deborah Carvalho; Claro, Rafael Moreira

    2017-01-01

    To introduce the methodology used to calculate post-stratification weights of the 2012 Surveillance System for Risk and Protective Factors for Chronic Diseases by Telephone Survey (Vigitel) and to compare the trends of indicators estimated by the cell-by-cell weighting and raking methods. In this panel of cross-sectional studies, the prevalences of smokers, overweight, and intake of fruits and vegetables from 2006 to 2012 were estimated using the cell-by-cell weighting and raking methods. There were no differences in the time trends of the indicators estimated by the two methods, but the prevalence of smokers estimated by the raking method was lower than that estimated by cell-by-cell weighting, whilst the prevalence of fruit and vegetable intake was higher; for overweight, there was no difference between the methods. The raking method presented higher accuracy of the estimates compared to the cell-by-cell weighting method, proving to be more convenient, although it is subject to register coverage bias.
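
    For readers unfamiliar with raking, the sketch below shows the basic iterative proportional fitting of case weights to known marginal totals. The categories, margins, and convergence settings are invented for illustration and are not Vigitel's actual specification.

    ```python
    # Raking (iterative proportional fitting): adjust case weights so that weighted
    # marginal totals match known population margins for several variables.

    def rake(cases, margins, iters=50, tol=1e-8):
        weights = [case.get("weight", 1.0) for case in cases]
        for _ in range(iters):
            max_shift = 0.0
            for var, targets in margins.items():
                for category, target_total in targets.items():
                    idx = [i for i, case in enumerate(cases) if case[var] == category]
                    current = sum(weights[i] for i in idx)
                    if current == 0:
                        continue  # no sampled case in this category; real systems need collapsing rules
                    factor = target_total / current
                    max_shift = max(max_shift, abs(factor - 1.0))
                    for i in idx:
                        weights[i] *= factor
            if max_shift < tol:
                break
        return weights

    if __name__ == "__main__":
        cases = [{"sex": "f", "age": "young"}, {"sex": "f", "age": "old"},
                 {"sex": "m", "age": "young"}, {"sex": "m", "age": "old"}]
        margins = {"sex": {"f": 520, "m": 480}, "age": {"young": 300, "old": 700}}
        print([round(w, 1) for w in rake(cases, margins)])
    ```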

  5. Some Findings Concerning Requirements in Agile Methodologies

    Science.gov (United States)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases within the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most contentious issues in agile approaches, and a better understanding of this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and management of requirements dependencies.

  6. The study of evaluation methodology of the aging and degradation researches

    International Nuclear Information System (INIS)

    Cho, C. J.; Park, Z. H.; Jeong, I. S.

    2001-01-01

    To judge the usefulness of aging-related research, such as PLIM (Plant Lifetime Management) and aging-related degradation studies in PSR (Periodic Safety Review), the evaluation methodologies for R and D that have been proposed up to now are reviewed. The infometric methodology is considered to be the optimum method for the evaluation of nuclear-related research. Finally, to increase the objectiveness and reliability of the infometric methodology for aging and degradation research, indexes of safety, technology and economics are introduced. This study finds that the infometric methodology has an advantage over other methodologies for the practical engineering evaluation of nuclear-related research; however, further research requires the effective construction of a database and a survey of the various statistics in technical reports and papers.

  7. 76 FR 70481 - Notice of Filing of Plats of Survey; South Dakota

    Science.gov (United States)

    2011-11-14

    ...] Notice of Filing of Plats of Survey; South Dakota AGENCY: Bureau of Land Management, Interior. [[Page...: 5th Principal Meridian, South Dakota T. 124 N., R. 53 W. The plat, in two sheets, representing the... Principal Meridian, South Dakota, was accepted October 28, 2011. We will place a copy of the plat, in two...

  8. A study on assessment methodology of surveillance test interval and allowed outage time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol

    1996-07-01

    The objective of this study is the development of a methodology for assessing optimized Surveillance Test Intervals (STI) and Allowed Outage Times (AOT) using PSA methods that can supplement the current deterministic methods, and thereby the improvement of the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results produced by domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system demonstrates the feasibility of the method.

  9. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol [Seoul National Univ., Seoul (Korea, Republic of)] (and others)

    1996-07-15

    The objective of this study is the development of a methodology for assessing optimized Surveillance Test Intervals (STI) and Allowed Outage Times (AOT) using PSA methods that can supplement the current deterministic methods, and thereby the improvement of the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results produced by domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system demonstrates the feasibility of the method.

  10. 77 FR 61473 - Proposed Collection; Comment Request for Voluntary Customer Surveys To Implement E.O. 12862...

    Science.gov (United States)

    2012-10-09

    ... Voluntary Customer Surveys To Implement E.O. 12862 Coordinated by the Corporate Planning and Performance... required by the Paperwork Reduction Act of 1995, Public Law 104-13 (44 U.S.C. 3506(c)(2)(A)). Currently... Coordinated by [[Page 61474

  11. Are consumer surveys valuable as a service improvement tool in health services? A critical appraisal.

    Science.gov (United States)

    Patwardhan, Anjali; Patwardhan, Prakash

    2009-01-01

    In the recent climate of consumerism and consumer-focused care, health and social care needs to be more responsive than ever before. Consumer needs and preferences can be elicited with accepted validity and reliability only by strict methodological control, customerisation of the questionnaire and skilled interpretation. Constructing, conducting and interpreting surveys, and implementing improved service provision, require a trained workforce and infrastructure. This article aims to appraise various aspects of consumer surveys and to assess their value as effective service improvement tools. The customer is the sole reason organisations exist. Consumer surveys are used worldwide as service and quality of care improvement tools by all types of service providers, including health service providers. The article critically appraises the value of consumer surveys as service improvement tools in health services and their future applications. No one type of survey is the best or ideal. The key is the selection of the correct survey methodology, unique and customised for the particular type/aspect of care being evaluated. The method used should reflect the importance of the information required. Methodological rigor is essential for the effectiveness of consumer surveys as service improvement tools. Unfortunately, so far there is no universal consensus on the superiority of one particular methodology over another, or any benefit of one specific methodology in a given situation. More training and some dedicated resource allocation are required to develop consumer surveys. More research is needed to develop specific survey methodology and evaluation techniques for improved validity and reliability of the surveys as service improvement tools. Measurement of consumer preferences/priorities, evaluation of services and key performance scores is not easy. Consumer surveys seem impressive tools as they provide the customer with a voice for change or modification. However, from a scientific point

  12. The VLA Sky Survey

    Science.gov (United States)

    Lacy, Mark; VLASS Survey Team, VLASS Survey Science Group

    2018-01-01

    The VLA Sky Survey (VLASS), which began in September 2017, is a seven year project to image the entire sky north of Declination -40 degrees in three epochs. The survey is being carried out in I,Q and U polarization at a frequency of 2-4GHz, and a resolution of 2.5 arcseconds, with each epoch being separated by 32 months. Raw data from the survey, along with basic "quicklook" images are made freely available shortly after observation. Within a few months, NRAO will begin making available further basic data products, including refined images and source lists. In this talk I shall describe the science goals and methodology of the survey, the current survey status, and some early results, along with plans for collaborations with external groups to produce enhanced, high level data products.

  13. Analysis and Testing of Ajax-based Single-page Web Applications

    NARCIS (Netherlands)

    Mesbah, A.

    2009-01-01

    This dissertation has focused on better understanding the shifting web paradigm and the consequences of moving from the classical multi-page model to an Ajax-based single-page style. Specifically to that end, this work has examined this new class of software from three main software engineering

  14. Demonstrating the Potential for Web-Based Survey Methodology with a Case Study.

    Science.gov (United States)

    Mertler, Craig

    2002-01-01

    Describes personal experience with using the Internet to administer a teacher-motivation and job-satisfaction survey to elementary and secondary teachers. Concludes that advantages of Web-base surveys, such as cost savings and efficiency of data collection, outweigh disadvantages, such as the limitations of listservs. (Contains 10 references.)…

  15. An Optimization Model for Product Placement on Product Listing Pages

    Directory of Open Access Journals (Sweden)

    Yan-Kwang Chen

    2014-01-01

    The design of product listing pages is a key component of Website design because it has significant influence on the sales volume on a Website. This study focuses on product placement in designing product listing pages. Product placement concerns how vendors of online stores place their products over the product listing pages for maximization of profit. This problem is very similar to the offline shelf management problem. Since product information sources on a Web page are typically communicated through text and images, visual stimuli such as color, shape, size, and spatial arrangement often have an effect on the visual attention of online shoppers and, in turn, influence their eventual purchase decisions. In view of the above, this study synthesizes the visual attention literature and the theory of shelf-space allocation to develop a mathematical programming model with genetic algorithms for finding optimal solutions to the focused issue. The validity of the model is illustrated with example problems.
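
    As a simplified picture of the general approach above (a genetic algorithm searching over product-to-slot assignments under position-dependent attention), the sketch below evolves an ordering that maximizes a toy profit function. The margins, slot weights, genetic operators, and parameters are assumptions, not the paper's model.

    ```python
    # Toy genetic algorithm assigning products to listing-page slots.
    # Fitness = sum of (slot attention weight * product margin); all numbers are invented.
    import random

    PRODUCTS = {"A": 5.0, "B": 3.0, "C": 8.0, "D": 2.0}   # profit margin per product
    SLOT_WEIGHTS = [1.0, 0.7, 0.5, 0.3]                   # shopper attention by slot position

    def fitness(order):
        return sum(w * PRODUCTS[p] for w, p in zip(SLOT_WEIGHTS, order))

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return list(a[:cut]) + [p for p in b if p not in a[:cut]]

    def mutate(order, rate=0.2):
        order = list(order)
        if random.random() < rate:
            i, j = random.sample(range(len(order)), 2)
            order[i], order[j] = order[j], order[i]
        return order

    def evolve(generations=100, pop_size=20):
        pop = [random.sample(list(PRODUCTS), len(PRODUCTS)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(pop_size - len(parents))]
            pop = parents + children
        return max(pop, key=fitness)

    if __name__ == "__main__":
        best = evolve()
        print(best, round(fitness(best), 2))  # expect the highest-margin product in the top slot
    ```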

  16. Teachers' Attitude towards Implementation of Learner-Centered Methodology in Science Education in Kenya

    Science.gov (United States)

    Ndirangu, Caroline

    2017-01-01

    This study aims to evaluate teachers' attitude towards implementation of learner-centered methodology in science education in Kenya. The study used a survey design methodology, adopting the purposive, stratified random and simple random sampling procedures and hypothesised that there was no significant relationship between the head teachers'…

  17. Fiches « En une page » | CRDI - Centre de recherches pour le ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    10 Nov. 2010 ... MCP « En une page » brief no. 1: Small-scale FDI, stagnant growth: can Mercosur do better? MCP « En une page » brief no. 2: Taxes, levies and gender equity: codes, behaviours and consequences, intended or not. MCP « En une page » brief no. 3.

  18. Variability in 18F-FDG PET/CT methodology of acquisition, reconstruction and analysis for oncologic imaging: state survey

    International Nuclear Information System (INIS)

    Fischer, Andreia C.F. da S.; Druzian, Aline C.; Bacelar, Alexandre; Pianta, Diego B.; Silva, Ana M. Marques da

    2016-01-01

    The SUV in 18F-FDG PET/CT oncological imaging is useful for cancer diagnosis, staging and treatment assessment. There are, however, several factors that can give rise to bias in SUV measurements. When using SUV as a diagnostic tool, one needs to minimize the variability in this measurement by standardization of patient preparation, acquisition and reconstruction parameters. The aim of this study is to evaluate the methodological variability in PET/CT acquisition in Rio Grande do Sul State. For that, in each department, a questionnaire was applied to survey technical information from PET/CT systems and about the acquisitions and analysis methods utilized. All departments implement quality assurance programs consistent with (inter)national recommendations. However, the acquisition and reconstruction methods of acquired PET data differ. The implementation of a harmonized strategy for quantifying the SUV is suggested, in order to obtain greater reproducibility and repeatability. (author)

  19. Student Library Pages: Valuable Resource for the Library Media Center.

    Science.gov (United States)

    Crowther, Eleanor

    1993-01-01

    Describes the use of students as library pages at the Loudoun Country Day School (Virginia). Highlights include student selection procedures, including interviews; parental consent form; library page duties; benefits to students; benefits to the library; and parent attitudes. Copies of the student interview form and parental consent form are…

  20. An algorithm to assess methodological quality of nutrition and mortality cross-sectional surveys: development and application to surveys conducted in Darfur, Sudan.

    Science.gov (United States)

    Prudhon, Claudine; de Radiguès, Xavier; Dale, Nancy; Checchi, Francesco

    2011-11-09

    Nutrition and mortality surveys are the main tools whereby evidence on the health status of populations affected by disasters and armed conflict is quantified and monitored over time. Several reviews have consistently revealed a lack of rigor in many surveys. We describe an algorithm for analyzing nutritional and mortality survey reports to identify a comprehensive range of errors that may result in sampling, response, or measurement biases and score quality. We apply the algorithm to surveys conducted in Darfur, Sudan. We developed an algorithm based on internationally agreed upon methods and best practices. Penalties are attributed for a list of errors, and an overall score is built from the summation of penalties accrued by the survey as a whole. To test the algorithm reproducibility, it was independently applied by three raters on 30 randomly selected survey reports. The algorithm was further applied to more than 100 surveys conducted in Darfur, Sudan. The Intra Class Correlation coefficient was 0.79 for mortality surveys and 0.78 for nutrition surveys. The overall median quality score and range of about 100 surveys conducted in Darfur were 0.60 (0.12-0.93) and 0.675 (0.23-0.86) for mortality and nutrition surveys, respectively. They varied between the organizations conducting the surveys, with no major trend over time. Our study suggests that it is possible to systematically assess quality of surveys and reveals considerable problems with the quality of nutritional and particularly mortality surveys conducted in the Darfur crisis.
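
    The scoring logic described above can be pictured with a short sketch in which each detected error carries a penalty and the overall score is the complement of the accumulated penalties. The error list, penalty values, and normalization below are invented placeholders, not the published algorithm.

    ```python
    # Toy penalty-based quality score for a survey report: start from 1.0 and
    # subtract a penalty for each detected error; penalties here are illustrative.

    PENALTIES = {
        "no_sampling_frame_described": 0.15,
        "sample_size_not_justified": 0.10,
        "recall_period_unclear": 0.10,
        "no_confidence_intervals": 0.10,
        "non_response_not_reported": 0.05,
    }

    def quality_score(detected_errors):
        penalty = sum(PENALTIES.get(error, 0.0) for error in detected_errors)
        return max(0.0, 1.0 - penalty)

    if __name__ == "__main__":
        report_errors = ["sample_size_not_justified", "no_confidence_intervals"]
        print(round(quality_score(report_errors), 2))  # 0.8
    ```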

  1. PREFACE: PAGES 1st Young Scientists Meeting (YSM) - 'Retrospective views on our planet's future'

    Science.gov (United States)

    Margrethe Basse, Ellen

    2010-03-01

    'Retrospective views on our planet's future' - This was the theme of a tandem of meetings held by Past Global Changes (PAGES; http://www.pages-igbp.org), a project of the International Geosphere-Biosphere Programme (IGBP). It reflects the philosophy of PAGES and its community of scientists that the past holds the key to better projections of the future. Climatic and environmental evidence from the past can be used to sharpen future projections of global change, thereby informing political and societal decisions on mitigation and adaptation. Young scientists are critical to the future of this endeavour, which we call 'paleoscience'. Their scientific knowledge, interdisciplinarity, international collaboration, and leadership skills will be required if this field is to continue to thrive. Meanwhile, it is also important to remember that science develops not only by applying new strategies and new tools to make new observations, but also by building upon existing knowledge. Modern research in paleoscience began around fifty years ago, and one could say that the third generation of researchers is now emerging. It is a wise investment to ensure that existing skills and knowledge are transferred to this generation. This will enable them to lead the science towards new accomplishments, and to make important contributions towards the wider field of global change science. Motivated by such considerations, PAGES organized its first Young Scientists Meeting (YSM), held in Corvallis (Oregon, USA) in July 2009 (http://www.pages-osm.org/ysm/index.html). The meeting took place immediately before the much larger 3rd PAGES Open Science Meeting (OSM; http://www.pages-osm.org/osm/index.html). The YSM brought together 91 early-career scientists from 21 different nations. During the two-day meeting, PhD students, postdoctoral researchers, and new faculty met to present their work and build networks across geographical and disciplinary borders. Several experienced and well

  2. Who sends the email? Using electronic surveys in violence research.

    Science.gov (United States)

    Sutherland, Melissa A; Amar, Angela F; Laughon, Kathryn

    2013-08-01

    Students aged 16-24 years are at greatest risk for interpersonal violence and the resulting short and long-term health consequences. Electronic survey methodology is well suited for research related to interpersonal violence. Yet methodological questions remain about best practices in using electronic surveys. While researchers often indicate that potential participants receive multiple emails as reminders to complete the survey, little mention is made of the sender of the recruitment email. The purpose of this analysis is to describe the response rates from three violence-focused research studies when the recruitment emails are sent from a campus office, researcher or survey sampling firm. Three violence-focused studies were conducted about interpersonal violence among college students in the United States. Seven universities and a survey sampling firm were used to recruit potential participants to complete an electronic survey. The sender of the recruitment emails varied within and across each of the studies depending on institutional review boards and university protocols. An overall response rate of 30% was noted for the 3 studies. Universities in which researcher-initiated recruitment emails were used had higher response rates compared to universities where campus officials sent the recruitment emails. Researchers found lower response rates to electronic surveys at Historically Black Colleges or Universities and that other methods were needed to improve response rates. The sender of recruitment emails for electronic surveys may be an important factor in response rates for violence-focused research. For researchers, identification of best practices for survey methodology is needed to promote accurate disclosure and increase response rates.

  3. METHODOLOGICAL PROPOSAL FOR COMPILING THE ILO UNEMPLOYMENT WITH MONTHLY PERIODICITY

    Directory of Open Access Journals (Sweden)

    Silvia PISICĂ

    2011-08-01

    Full Text Available Development of a methodology for deriving monthly unemployment statistics directly from the quarterly Labour Force Survey (LFS) results by econometric modeling meets the requirement of ensuring the short-term information needed for employment policies, aiming to achieve the objectives of Europe 2020. Monthly data series estimated according to this methodology allow assessment of short-term trends in unemployment measured according to the criteria of the International Labour Organisation (ILO), in terms of comparability with European statistics.

  4. Research Methodologies in Supply Chain Management

    DEFF Research Database (Denmark)

    Kotzab, Herbert

    While supply chain management has risen to great prominence in recent years, there have hardly been related developments in research methodologies. Yet, as supply chains cover more than one company, one central issue is how to collect and analyse data along the whole or relevant part of the supply chain. Within the 36 chapters, 70 authors bring together a rich selection of theoretical and practical examples of how research methodologies are applied in supply chain management. The book contains papers on theoretical implications as well as papers on a range of key methods, such as modelling, surveys, case studies or action research. It will be of great interest to researchers in the area of supply chain management and logistics, but also to neighbouring fields, such as network management or global operations.

  5. Detection of spam web page using content and link-based techniques

    Indian Academy of Sciences (India)

    Spam pages are generally insufficient and inappropriate results for users. ... kinds of Web spamming techniques: Content spam and Link spam. 1. Content spam: The ... of the spam pages are machine generated and hence technique of ...

  6. An algorithm to assess methodological quality of nutrition and mortality cross-sectional surveys: development and application to surveys conducted in Darfur, Sudan

    Directory of Open Access Journals (Sweden)

    Prudhon Claudine

    2011-11-01

    Full Text Available Abstract Background Nutrition and mortality surveys are the main tools whereby evidence on the health status of populations affected by disasters and armed conflict is quantified and monitored over time. Several reviews have consistently revealed a lack of rigor in many surveys. We describe an algorithm for analyzing nutritional and mortality survey reports to identify a comprehensive range of errors that may result in sampling, response, or measurement biases and score quality. We apply the algorithm to surveys conducted in Darfur, Sudan. Methods We developed an algorithm based on internationally agreed upon methods and best practices. Penalties are attributed for a list of errors, and an overall score is built from the summation of penalties accrued by the survey as a whole. To test the algorithm reproducibility, it was independently applied by three raters on 30 randomly selected survey reports. The algorithm was further applied to more than 100 surveys conducted in Darfur, Sudan. Results The Intra Class Correlation coefficient was 0.79 for mortality surveys and 0.78 for nutrition surveys. The overall median quality score and range of about 100 surveys conducted in Darfur were 0.60 (0.12-0.93) and 0.675 (0.23-0.86) for mortality and nutrition surveys, respectively. They varied between the organizations conducting the surveys, with no major trend over time. Conclusion Our study suggests that it is possible to systematically assess quality of surveys and reveals considerable problems with the quality of nutritional and particularly mortality surveys conducted in the Darfur crisis.

  7. Assessing the Ecological Footprint of Ecotourism Packages: A Methodological Proposition

    Directory of Open Access Journals (Sweden)

    Maria Serena Mancini

    2018-06-01

    Full Text Available Tourism represents a key economic sector worldwide, constituting great leverage for local economic development but also putting noticeable environmental pressures on local natural resources. Ecotourism may be a viable alternative to mass tourism to minimize impacts on ecosystems, but it needs shared sustainability standards and monitoring tools to evaluate impacts. This paper presents a first methodological proposition to calculate the environmental impact of ecotourism packages through the use of an ad-hoc, customized version of the Ecological Footprint methodology. It follows a participatory, bottom-up approach to collecting input data for the four main services (Accommodation, Food & Drinks, Activity & Service, and Mobility & Transfer) provided to tourists, through the use of surveys and stakeholder engagement. The outcome of our approach materializes in an Excel-based ecotourism workbook capable of processing input data collected through surveys and returning Ecological Footprint values for specific ecotourism packages. Although applied to ecotourism in Mediterranean Protected Areas within the context of the DestiMED project, we believe that the methodology and approach presented here can constitute a blueprint and a benchmark for future studies dealing with the impact of ecotourism packages.

  8. Characterizing Microseismicity at the Newberry Volcano Geothermal Site using PageRank

    Science.gov (United States)

    Aguiar, A. C.; Myers, S. C.

    2015-12-01

    The Newberry Volcano, within the Deschutes National Forest in Oregon, has been designated as a candidate site for the Department of Energy's Frontier Observatory for Research in Geothermal Energy (FORGE) program. This site was stimulated using high-pressure fluid injection during the fall of 2012, which generated several hundred microseismic events. Exploring the spatial and temporal development of microseismicity is key to understanding how subsurface stimulation modifies stress, fractures rock, and increases permeability. We analyze Newberry seismicity using both surface and borehole seismometers from the AltaRock and LLNL seismic networks. For our analysis we adapt PageRank, Google's initial search algorithm, to evaluate microseismicity during the 2012 stimulation. PageRank is a measure of connectivity, where higher ranking represents highly connected windows. In seismic applications connectivity is measured by the cross correlation of 2 time windows recorded on a common seismic station and channel. Aguiar and Beroza (2014) used PageRank based on cross correlation to detect low-frequency earthquakes, which are highly repetitive but difficult to detect. We expand on this application by using PageRank to define signal-correlation topology for micro-earthquakes, including the identification of signals that are connected to the largest number of other signals. We then use this information to create signal families and compare PageRank families to the spatial and temporal proximity of associated earthquakes. Studying signal PageRank will potentially allow us to efficiently group earthquakes with similar physical characteristics, such as focal mechanisms and stress drop. Our ultimate goal is to determine whether changes in the state of stress and/or changes in the generation of subsurface fracture networks can be detected using PageRank topology. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under
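    A minimal sketch of the core idea (treating time windows whose waveforms correlate strongly as linked nodes and ranking them with PageRank) is given below. It assumes zero-lag correlation of equal-length, pre-aligned windows and an arbitrary correlation threshold; the authors' actual processing (lag searches, station and channel handling, detection logic) is not reproduced here.

```python
# Sketch of PageRank over a signal-correlation graph; thresholds, windowing
# and the correlation routine are simplified assumptions, not the authors'
# pipeline.
import numpy as np
import networkx as nx

def correlation_pagerank(windows, cc_threshold=0.7):
    """Rank waveform windows by connectivity in a cross-correlation graph.

    windows: array of shape (n_windows, n_samples), one row per time window.
    """
    demeaned = windows - windows.mean(axis=1, keepdims=True)
    unit = demeaned / np.linalg.norm(demeaned, axis=1, keepdims=True)
    cc = unit @ unit.T                          # zero-lag normalised cross-correlation
    g = nx.Graph()
    g.add_nodes_from(range(len(windows)))
    for i in range(len(windows)):
        for j in range(i + 1, len(windows)):
            if cc[i, j] >= cc_threshold:
                g.add_edge(i, j, weight=float(cc[i, j]))
    # Highly connected windows (e.g. repeating micro-earthquakes) rank highest
    return nx.pagerank(g, weight="weight")

rng = np.random.default_rng(0)
demo = rng.standard_normal((20, 500))           # stand-in for real seismograms
ranks = correlation_pagerank(demo, cc_threshold=0.1)
print(sorted(ranks, key=ranks.get, reverse=True)[:3])
```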

  9. Demonstrating the use of web analytics and an online survey to understand user groups of a national network of river level data

    Science.gov (United States)

    Macleod, Christopher Kit; Braga, Joao; Arts, Koen; Ioris, Antonio; Han, Xiwu; Sripada, Yaji; van der Wal, Rene

    2016-04-01

    The number of local, national and international networks of online environmental sensors are rapidly increasing. Where environmental data are made available online for public consumption, there is a need to advance our understanding of the relationships between the supply of and the different demands for such information. Understanding how individuals and groups of users are using online information resources may provide valuable insights into their activities and decision making. As part of the 'dot.rural wikiRivers' project we investigated the potential of web analytics and an online survey to generate insights into the use of a national network of river level data from across Scotland. These sources of online information were collected alongside phone interviews with volunteers sampled from the online survey, and interviews with providers of online river level data; as part of a larger project that set out to help improve the communication of Scotland's online river data. Our web analytics analysis was based on over 100 online sensors which are maintained by the Scottish Environmental Protection Agency (SEPA). Through use of Google Analytics data accessed via the R Ganalytics package we assessed: if the quality of data provided by Google Analytics free service is good enough for research purposes; if we could demonstrate what sensors were being used, when and where; how the nature and pattern of sensor data may affect web traffic; and whether we can identify and profile these users based on information from traffic sources. Web analytics data consists of a series of quantitative metrics which capture and summarize various dimensions of the traffic to a certain web page or set of pages. Examples of commonly used metrics include the number of total visits to a site and the number of total page views. Our analyses of the traffic sources from 2009 to 2011 identified several different major user groups. To improve our understanding of how the use of this national

  10. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  11. PageRank as a method to rank biomedical literature by importance.

    Science.gov (United States)

    Yates, Elliot J; Dixon, Louise C

    2015-01-01

    Optimal ranking of literature importance is vital in overcoming article overload. Existing ranking methods are typically based on raw citation counts, giving a sum of 'inbound' links with no consideration of citation importance. PageRank, an algorithm originally developed for ranking webpages at the search engine, Google, could potentially be adapted to bibliometrics to quantify the relative importance weightings of a citation network. This article seeks to validate such an approach on the freely available, PubMed Central open access subset (PMC-OAS) of biomedical literature. On-demand cloud computing infrastructure was used to extract a citation network from over 600,000 full-text PMC-OAS articles. PageRanks and citation counts were calculated for each node in this network. PageRank is highly correlated with citation count (R = 0.905). PageRank can be trivially computed on commodity cluster hardware and is linearly correlated with citation count. Given its putative benefits in quantifying relative importance, we suggest it may enrich the citation network, thereby overcoming the existing inadequacy of citation counts alone. We thus suggest PageRank as a feasible supplement to, or replacement of, existing bibliometric ranking methods.
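    The two quantities being compared (PageRank and raw citation count) are easy to reproduce on a toy citation graph; the sketch below uses a five-article network and a plain power iteration in place of the PMC-OAS data and cloud infrastructure described above.

```python
# Minimal sketch: PageRank by power iteration on a toy citation matrix, then
# correlation with raw citation counts. The 5-article network is illustrative,
# standing in for the ~600,000-article PMC-OAS graph used in the study.
import numpy as np

# adj[i, j] = 1 if article i cites article j
adj = np.array([[0, 1, 1, 0, 0],
                [0, 0, 1, 0, 0],
                [0, 0, 0, 0, 0],
                [0, 1, 1, 0, 0],
                [1, 0, 0, 0, 0]], dtype=float)

def pagerank(adj, d=0.85, tol=1e-10):
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    # Dangling nodes (no outbound citations) link uniformly to every article
    trans = np.where(out > 0, adj / np.where(out == 0, 1, out), 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * trans.T @ r
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

pr = pagerank(adj)
cites = adj.sum(axis=0)                    # raw inbound citation counts
corr = np.corrcoef(pr, cites)[0, 1]
print(np.round(pr, 3), "correlation with citation count:", round(corr, 3))
```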

  12. A Generalizable Methodology for Quantifying User Satisfaction

    Science.gov (United States)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i. e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
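    As a rough illustration of the survival-analysis framing (session length as the time to event, quality-of-service factors as covariates), the sketch below fits a Cox proportional hazards model with the lifelines library to synthetic data. The factor names, the generated relationships, and the choice of lifelines are assumptions made for illustration, not details taken from the paper.

```python
# Hedged sketch of survival analysis on session times; data and covariates are
# synthetic placeholders, not the ShenZhou Online or Skype measurements.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
network_delay = rng.uniform(20, 300, n)            # ms, hypothetical QoS factor
packet_loss = rng.uniform(0, 0.05, n)              # fraction of packets lost
# Worse QoS -> shorter sessions (synthetic relationship for illustration only)
session_min = rng.exponential(60 / (1 + 0.01 * network_delay + 20 * packet_loss))
observed = (rng.uniform(size=n) > 0.1).astype(int)  # some sessions are censored

df = pd.DataFrame({"session_min": session_min, "observed": observed,
                   "network_delay": network_delay, "packet_loss": packet_loss})

cph = CoxPHFitter()
cph.fit(df, duration_col="session_min", event_col="observed")
cph.print_summary()   # hazard ratios show which factors shorten sessions
```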

  13. What’s New? Deploying a Library New Titles Page with Minimal Programming

    Directory of Open Access Journals (Sweden)

    John Meyerhofer

    2017-01-01

    Full Text Available With a new titles web page, a library has a place to show faculty, students, and staff the items they are purchasing for their community. However, many times heavy programming knowledge and/or a LAMP stack (Linux, Apache, MySQL, PHP) or APIs separate a library’s data from making a new titles web page a reality. Without IT staff, a new titles page can become nearly impossible or not worth the effort. Here we will demonstrate how a small liberal arts college took its acquisition data and combined it with a Google Sheet, HTML, and a little JavaScript to create a new titles web page that was dynamic and engaging to its users.

  14. Content and Design Features of Academic Health Sciences Libraries' Home Pages.

    Science.gov (United States)

    McConnaughy, Rozalynd P; Wilson, Steven P

    2018-01-01

    The goal of this content analysis was to identify commonly used content and design features of academic health sciences library home pages. After developing a checklist, data were collected from 135 academic health sciences library home pages. The core components of these library home pages included a contact phone number, a contact email address, an Ask-a-Librarian feature, the physical address listed, a feedback/suggestions link, subject guides, a discovery tool or database-specific search box, multimedia, social media, a site search option, a responsive web design, and a copyright year or update date.

  15. A PROMPT METHODOLOGY TO GEOREFERENCE COMPLEX HYPOGEA ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    S. Troisi

    2017-02-01

    Full Text Available Currently, complex underground structures and facilities occupy a wide space in our cities, and most of them are unsurveyed; cable ducts and drainage systems are no exception. Furthermore, several inspection operations are performed in critical air conditions that do not allow, or make more difficult, a conventional survey. In this scenario a prompt methodology to survey and georeference such facilities is often indispensable. A visual-based approach is proposed in this paper; the methodology provides a 3D model of the environment and the path followed by the camera using conventional photogrammetric/Structure from Motion software tools. The key role is played by the camera lens; indeed, a fisheye system was employed to obtain a very wide field of view (FOV) and therefore high overlap among the frames. The camera geometry corresponds to a forward motion along the camera axis. Consequently, to avoid instability of the bundle adjustment algorithm, a preliminary calibration of the camera was carried out. A specific case study is reported together with the accuracy achieved.

  16. a Prompt Methodology to Georeference Complex Hypogea Environments

    Science.gov (United States)

    Troisi, S.; Baiocchi, V.; Del Pizzo, S.; Giannone, F.

    2017-02-01

    Currently, complex underground structures and facilities occupy a wide space in our cities, and most of them are unsurveyed; cable ducts and drainage systems are no exception. Furthermore, several inspection operations are performed in critical air conditions that do not allow, or make more difficult, a conventional survey. In this scenario a prompt methodology to survey and georeference such facilities is often indispensable. A visual-based approach is proposed in this paper; the methodology provides a 3D model of the environment and the path followed by the camera using conventional photogrammetric/Structure from Motion software tools. The key role is played by the camera lens; indeed, a fisheye system was employed to obtain a very wide field of view (FOV) and therefore high overlap among the frames. The camera geometry corresponds to a forward motion along the camera axis. Consequently, to avoid instability of the bundle adjustment algorithm, a preliminary calibration of the camera was carried out. A specific case study is reported together with the accuracy achieved.

  18. Combining users' activity survey and simulators to evaluate human activity recognition systems.

    Science.gov (United States)

    Azkune, Gorka; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2015-04-08

    Evaluating human activity recognition systems usually implies following expensive and time-consuming methodologies, where experiments with humans are run with the consequent ethical and legal issues. We propose a novel evaluation methodology to overcome the enumerated problems, which is based on surveys for users and a synthetic dataset generator tool. Surveys allow capturing how different users perform activities of daily living, while the synthetic dataset generator is used to create properly labelled activity datasets modelled with the information extracted from surveys. Important aspects, such as sensor noise, varying time lapses and user erratic behaviour, can also be simulated using the tool. The proposed methodology is shown to have very important advantages that allow researchers to carry out their work more efficiently. To evaluate the approach, a synthetic dataset generated following the proposed methodology is compared to a real dataset computing the similarity between sensor occurrence frequencies. It is concluded that the similarity between both datasets is more than significant.
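    The final comparison step (checking that sensors fire with similar relative frequencies in the synthetic and the real dataset) can be sketched as below; the sensor names are invented and the similarity measure shown (Spearman correlation of occurrence counts) is an illustrative choice rather than the exact statistic used in the paper.

```python
# Illustrative comparison of sensor occurrence frequencies between a real and
# a synthetic activity dataset; event lists and the similarity metric are
# placeholders for the paper's data and statistic.
from collections import Counter
from scipy.stats import spearmanr

real_events = ["kitchen_pir", "fridge_door", "kitchen_pir", "tap", "cooker", "tap"]
synthetic_events = ["kitchen_pir", "tap", "fridge_door", "kitchen_pir", "cooker", "cooker"]

sensors = sorted(set(real_events) | set(synthetic_events))
real_freq = Counter(real_events)
synth_freq = Counter(synthetic_events)

rho, p = spearmanr([real_freq[s] for s in sensors],
                   [synth_freq[s] for s in sensors])
print(f"Spearman similarity of sensor occurrence frequencies: {rho:.2f}")
```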

  19. Combining Users’ Activity Survey and Simulators to Evaluate Human Activity Recognition Systems

    Directory of Open Access Journals (Sweden)

    Gorka Azkune

    2015-04-01

    Full Text Available Evaluating human activity recognition systems usually implies following expensive and time-consuming methodologies, where experiments with humans are run with the consequent ethical and legal issues. We propose a novel evaluation methodology to overcome the enumerated problems, which is based on surveys for users and a synthetic dataset generator tool. Surveys allow capturing how different users perform activities of daily living, while the synthetic dataset generator is used to create properly labelled activity datasets modelled with the information extracted from surveys. Important aspects, such as sensor noise, varying time lapses and user erratic behaviour, can also be simulated using the tool. The proposed methodology is shown to have very important advantages that allow researchers to carry out their work more efficiently. To evaluate the approach, a synthetic dataset generated following the proposed methodology is compared to a real dataset computing the similarity between sensor occurrence frequencies. It is concluded that the similarity between both datasets is more than significant.

  20. Combining Users' Activity Survey and Simulators to Evaluate Human Activity Recognition Systems

    Science.gov (United States)

    Azkune, Gorka; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2015-01-01

    Evaluating human activity recognition systems usually implies following expensive and time-consuming methodologies, where experiments with humans are run with the consequent ethical and legal issues. We propose a novel evaluation methodology to overcome the enumerated problems, which is based on surveys for users and a synthetic dataset generator tool. Surveys allow capturing how different users perform activities of daily living, while the synthetic dataset generator is used to create properly labelled activity datasets modelled with the information extracted from surveys. Important aspects, such as sensor noise, varying time lapses and user erratic behaviour, can also be simulated using the tool. The proposed methodology is shown to have very important advantages that allow researchers to carry out their work more efficiently. To evaluate the approach, a synthetic dataset generated following the proposed methodology is compared to a real dataset computing the similarity between sensor occurrence frequencies. It is concluded that the similarity between both datasets is more than significant. PMID:25856329

  1. Survey of the prevalence and methodology of quality assurance for B-mode ultrasound image quality among veterinary sonographers.

    Science.gov (United States)

    Hoscheit, Larry P; Heng, Hock Gan; Lim, Chee Kin; Weng, Hsin-Yi

    2018-05-01

    Image quality in B-mode ultrasound is important as it reflects the diagnostic accuracy and diagnostic information provided during clinical scanning. Quality assurance programs for B-mode ultrasound systems/components are comprised of initial quality acceptance testing and subsequent regularly scheduled quality control testing. The importance of quality assurance programs for B-mode ultrasound image quality using ultrasound phantoms is well documented in the human medical and medical physics literature. The purpose of this prospective, cross-sectional, survey study was to determine the prevalence and methodology of quality acceptance testing and quality control testing of image quality for ultrasound system/components among veterinary sonographers. An online electronic survey was sent to 1497 members of veterinary imaging organizations: the American College of Veterinary Radiology, the Veterinary Ultrasound Society, and the European Association of Veterinary Diagnostic Imaging, and a total of 167 responses were received. The results showed that the percentages of veterinary sonographers performing quality acceptance testing and quality control testing are 42% (64/151; 95% confidence interval 34-52%) and 26% (40/156: 95% confidence interval 19-33%) respectively. Of the respondents who claimed to have quality acceptance testing or quality control testing of image quality in place for their ultrasound system/components, 0% have performed quality acceptance testing or quality control testing correctly (quality acceptance testing 95% confidence interval: 0-6%, quality control testing 95% confidence interval: 0-11%). Further education and guidelines are recommended for veterinary sonographers in the area of quality acceptance testing and quality control testing for B-mode ultrasound equipment/components. © 2018 American College of Veterinary Radiology.

  2. "I didn't know her, but…": parasocial mourning of mediated deaths on Facebook RIP pages

    Science.gov (United States)

    Klastrup, Lisbeth

    2015-04-01

    This article examines the use of six Danish "Rest in Peace" (RIP) memorial pages. The article focuses on the relation between news media and RIP page use, in relation to general communicative practices on these pages. Based on an analysis of press coverage of the deaths of six young people and a close analysis of 1,015 comments extracted from the RIP pages created to memorialize them, it is shown that their deaths attracted considerable media attention, as did the RIP pages themselves. Comment activity seems to reflect the news stories in the way the commenters refer to the context of death and the emotional distress they experience, but comments on the RIP pages are mainly conventional expressions of sympathy and "RIP" wishes. The article concludes that public RIP pages might be understood as virtual spontaneous shrines, affording an emerging practice of "RIP-ing."

  3. A biplex approach to PageRank centrality: From classic to multiplex networks.

    Science.gov (United States)

    Pedroche, Francisco; Romance, Miguel; Criado, Regino

    2016-06-01

    In this paper, we present a new view of the PageRank algorithm inspired by multiplex networks. This new approach allows us to introduce a new centrality measure for classic complex networks and a new proposal to extend the usual PageRank algorithm to multiplex networks. We give some analytical relations between these new approaches and the classic PageRank centrality measure, and we illustrate the new parameters presented by computing them on real underground networks.

  4. A biplex approach to PageRank centrality: From classic to multiplex networks

    Science.gov (United States)

    Pedroche, Francisco; Romance, Miguel; Criado, Regino

    2016-06-01

    In this paper, we present a new view of the PageRank algorithm inspired by multiplex networks. This new approach allows us to introduce a new centrality measure for classic complex networks and a new proposal to extend the usual PageRank algorithm to multiplex networks. We give some analytical relations between these new approaches and the classic PageRank centrality measure, and we illustrate the new parameters presented by computing them on real underground networks.

  5. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2015-10-15

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. Using the AHP to weigh organizational factors and the SLIM to rate those factors, such a methodology is introduced in this study. Safety issues related to nuclear safety culture have been occurring increasingly often, and a quantification tool has to be developed in order to include organizational factors in Probabilistic Safety Assessments. In this study, the state of the art in organizational evaluation methodologies has been surveyed. This study includes research on organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology using the Analytic Hierarchy Process and the Success Likelihood Index Methodology. The purpose of this study is to develop a methodology to incorporate the safety culture into PSA in order to obtain a more objective risk estimate than before. The organizational factors considered in nuclear safety culture might affect the potential risk of human error and hardware failure. The safety culture impact index, used to monitor the plant safety culture, can be assessed by applying the developed methodology to a nuclear power plant.
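    The two named ingredients, AHP weighting of organizational factors and a SLIM-style index built from weighted ratings, can be sketched as follows. The factor set, the pairwise comparison matrix, the ratings and the calibration constants are all hypothetical; only the mechanics (principal-eigenvector weights, a Success Likelihood Index as a weighted sum, and a log-linear mapping from SLI to an error probability) follow the standard AHP/SLIM formulations.

```python
# Illustrative AHP + SLIM mechanics; all numbers are hypothetical examples,
# not the factors or calibration used by the study's authors.
import numpy as np

# AHP pairwise comparisons between three organizational factors
# (e.g. management commitment, training, procedures) on Saaty's 1-9 scale
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()                      # AHP factor weights (sum to 1)

ratings = np.array([0.8, 0.5, 0.6])        # SLIM ratings of each factor (0-1)
sli = float(weights @ ratings)             # Success Likelihood Index

# SLIM calibration: log10(HEP) = a * SLI + b, with a, b fitted to anchor tasks
a, b = -3.0, -1.0                          # hypothetical calibration constants
hep = 10 ** (a * sli + b)                  # resulting human error probability
print(f"weights={np.round(weights, 3)}, SLI={sli:.3f}, HEP={hep:.2e}")
```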

  6. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    International Nuclear Information System (INIS)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung

    2015-01-01

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. Using the AHP to weigh organizational factors and the SLIM to rate those factors, such a methodology is introduced in this study. Safety issues related to nuclear safety culture have been occurring increasingly often, and a quantification tool has to be developed in order to include organizational factors in Probabilistic Safety Assessments. In this study, the state of the art in organizational evaluation methodologies has been surveyed. This study includes research on organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology using the Analytic Hierarchy Process and the Success Likelihood Index Methodology. The purpose of this study is to develop a methodology to incorporate the safety culture into PSA in order to obtain a more objective risk estimate than before. The organizational factors considered in nuclear safety culture might affect the potential risk of human error and hardware failure. The safety culture impact index, used to monitor the plant safety culture, can be assessed by applying the developed methodology to a nuclear power plant.

  7. Asking about Sex in General Health Surveys: Comparing the Methods and Findings of the 2010 Health Survey for England with Those of the Third National Survey of Sexual Attitudes and Lifestyles.

    Directory of Open Access Journals (Sweden)

    Philip Prah

    Full Text Available Including questions about sexual health in the annual Health Survey for England (HSE) provides opportunities for regular measurement of key public health indicators, augmenting Britain's decennial National Survey of Sexual Attitudes and Lifestyles (Natsal). However, contextual and methodological differences may limit comparability of the findings. We examine the extent of these differences between HSE 2010 and Natsal-3 and investigate their impact on parameter estimates. Complex survey analyses of data from men and women in the 2010 HSE (n = 2,782 men and 3,588 women) and Natsal-3 undertaken 2010-2012 (n = 4,882 men and 6,869 women), aged 16-69y and resident in England, both using probability sampling, compared their characteristics, the amount of non-response to, and estimates from, sexual health questions. Both surveys used self-completion for the sexual behaviour questions but this was via computer-assisted self-interview (CASI) in Natsal-3 and a pen-and-paper questionnaire in HSE 2010. The surveys achieved similar response rates, both around 60%, and demographic profiles largely consistent with the census, although HSE participants tended to be less educated, and reported worse general health, than Natsal-3 participants. Item non-response to the sexual health questions was typically higher in HSE 2010 (range: 9-18%) relative to Natsal-3 (all <5%). Prevalence estimates for sexual risk behaviours and STI-related indicators were generally slightly lower in HSE 2010 than Natsal-3. While a relatively high response to sexual health questions in HSE 2010 demonstrates the feasibility of asking such questions in a general health survey, differences with Natsal-3 do exist. These are likely due to the HSE's context as a general health survey and methodological limitations such as its current use of pen-and-paper questionnaires. Methodological developments to the HSE should be considered so that its data can be interpreted in combination with those from dedicated

  8. Page | 133 LEGISLATIVE APPROVAL OF EXECUTIVE ...

    African Journals Online (AJOL)

    Fr. Ikenga

    NAUJILJ 9 (2) 2018. ... Keywords: Executive appointments, Legislative approval, National Assembly, Constitutional duty. ... Representatives is led by a Speaker. The election of the leadership of the Senate is entirely the affair of.

  9. A teen's guide to creating web pages and blogs

    CERN Document Server

    Selfridge, Peter; Osburn, Jennifer

    2008-01-01

    Whether using a social networking site like MySpace or Facebook or building a Web page from scratch, millions of teens are actively creating a vibrant part of the Internet. This is the definitive teen's guide to publishing exciting web pages and blogs on the Web. This easy-to-follow guide shows teenagers how to: create great MySpace and Facebook pages; build their own unique, personalized Web site; share the latest news with exciting blogging ideas; and protect themselves online with cyber-safety tips. Written by a teenager for other teens, this book leads readers step-by-step through the basics of web and blog design. In this book, teens learn to go beyond clicking through web sites to learning winning strategies for web design and great ideas for writing blogs that attract attention and readership.

  10. The use of social media in dental hygiene programs: a survey of program directors.

    Science.gov (United States)

    Henry, Rachel K; Pieren, Jennifer A

    2014-08-01

    The use of social media and social networking sites has become increasingly common by the current generation of students. Colleges and universities are using social media and social networking sites to advertise, engage and recruit prospective students. The purpose of this study was to evaluate how social media is being used in dental hygiene program admissions and policy. Researchers developed a survey instrument investigating the use of social media. The survey included questions about demographic information, personal use of social media, program use of social media, social media use in admissions and social media policies. An email was sent to 321 dental hygiene program directors asking them to complete the survey. All participants were provided 4 weeks to complete the survey, and 2 reminder emails were sent. A total of 155 responses were received (48.3% response rate). While 84% of respondents indicated their program had a web page, only 20% had an official Facebook page for the program and 2% had a Twitter page. Thirty-five percent had a program policy specifically addressing the use of social media and 31% indicated that their university or institution had a policy. Only 4% of programs evaluate a potential student's Internet presence, mostly by searching on Facebook. Statistically significant differences (p≤0.05) were noted between those respondents with more personal social media accounts and those with fewer accounts, as those with more accounts were more likely to evaluate a potential student's Internet presence. Open ended responses included concern about social media issues, but some uncertainty on how to handle social media in the program. The concern for social media and professionalism was evident and more research and discussion in this area is warranted. Social media is currently being used in a variety of ways in dental hygiene programs, but not in the area of admissions. There is some uncertainty about the role social media should play in a

  11. Who Sends the Email? Using Electronic Surveys in Violence Research

    Directory of Open Access Journals (Sweden)

    Melissa A Sutherland

    2013-08-01

    Full Text Available Introduction: Students aged 16–24 years are at greatest risk for interpersonal violence and the resulting short and long-term health consequences. Electronic survey methodology is well suited for research related to interpersonal violence. Yet methodological questions remain about best practices in using electronic surveys. While researchers often indicate that potential participants receive multiple emails as reminders to complete the survey, little mention is made of the sender of the recruitment email. The purpose of this analysis is to describe the response rates from three violence-focused research studies when the recruitment emails are sent from a campus office, researcher or survey sampling firm. Methods: Three violence-focused studies were conducted about interpersonal violence among college students in the United States. Seven universities and a survey sampling firm were used to recruit potential participants to complete an electronic survey. The sender of the recruitment emails varied within and across each of the studies depending on institutional review boards and university protocols. Results: An overall response rate of 30% was noted for the 3 studies. Universities in which researcher-initiated recruitment emails were used had higher response rates compared to universities where campus officials sent the recruitment emails. Researchers found lower response rates to electronic surveys at Historically Black Colleges or Universities and that other methods were needed to improve response rates. Conclusion: The sender of recruitment emails for electronic surveys may be an important factor in response rates for violence-focused research. For researchers, identification of best practices for survey methodology is needed to promote accurate disclosure and increase response rates. [West J Emerg Med. 2013;14(4):363–369.]

  12. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as Regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated earlier in 2009 to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  13. Lifting the veil: a typological survey of the methodological features of Islamic ethical reasoning on biomedical issues.

    Science.gov (United States)

    Abdur-Rashid, Khalil; Furber, Steven Woodward; Abdul-Basser, Taha

    2013-04-01

    We survey the meta-ethical tools and institutional processes that traditional Islamic ethicists apply when deliberating on bioethical issues. We present a typology of these methodological elements, giving particular attention to the meta-ethical techniques and devices that traditional Islamic ethicists employ in the absence of decisive or univocal authoritative texts or in the absence of established transmitted cases. In describing how traditional Islamic ethicists work, we demonstrate that these experts possess a variety of discursive tools. We find that the ethical responsa-i.e., the products of the application of the tools that we describe-are generally characterized by internal consistency. We also conclude that Islamic ethical reasoning on bioethical issues, while clearly scripture-based, is also characterized by strong consequentialist elements and possesses clear principles-based characteristics. The paper contributes to the study of bioethics by familiarizing non-specialists in Islamic ethics with the role, scope, and applicability of key Islamic ethical concepts, such as "aims" (maqāṣid), "universals" (kulliyyāt), "interest" (maṣlaḥa), "maxims" (qawā`id), "controls" (ḍawābit), "differentiators" (furūq), "preponderization" (tarjīḥ), and "extension" (tafrī`).

  14. What impact do questionnaire length and monetary incentives have on mailed health psychology survey response?

    Science.gov (United States)

    Robb, Kathryn A; Gatting, Lauren; Wardle, Jane

    2017-11-01

    Response rates to health-related surveys are declining. This study tested two strategies to improve the response rate to a health psychology survey mailed through English general practices: (1) sending a shortened questionnaire and (2) offering a monetary incentive to return a completed questionnaire. Randomized controlled trial. Adults (n = 4,241) aged 45-59 years, from four General Practices in South-East England, were mailed a survey on attitudes towards bowel cancer screening. Using a 2 × 4 factorial design, participants were randomized to receive a 'short' (four A4 pages) or a 'long' (seven A4 pages) questionnaire, and one of four monetary incentives to return a completed questionnaire - (1) no monetary incentive, (2) £2.50 shop voucher, (3) £5.00 shop voucher, and (4) inclusion in a £250 shop voucher prize draw. Age, gender, and area-level deprivation were obtained from the General Practices. The overall response rate was 41% (n = 1,589). Response to the 'short' questionnaire (42%) was not significantly different from the 'long' questionnaire (40%). The £2.50 incentive (43%) significantly improved response rates in univariate analyses, and remained significant after controlling for age, gender, area-level deprivation, and questionnaire length. The £5.00 (42%) and £250 prize draw (41%) incentives had no significant impact on response rates compared to no incentive (38%). A small monetary incentive (£2.50) may slightly increase response to a mailed health psychology survey. The length of the questionnaire (four pages vs. seven pages) did not influence response. Although frequently used, entry into a prize draw did not increase response. Achieving representative samples remains a challenge for health psychology. Statement of contribution What is already known on this subject Response rates to mailed questionnaires continue to decline, threatening the representativeness of data. Prize draw incentives are frequently used but there is little evidence
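    For readers unfamiliar with how such a 2 x 4 factorial trial is typically analysed, the sketch below fits a logistic regression of response on incentive arm and questionnaire length with covariate adjustment, on synthetic data; the variable names, effect sizes and the use of statsmodels are illustrative assumptions, not the authors' analysis code.

```python
# Hedged sketch of an adjusted response-rate analysis on synthetic data;
# arm labels and effects loosely mirror the reported pattern for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 4000
df = pd.DataFrame({
    "incentive": rng.choice(["none", "gbp2.50", "gbp5.00", "prize_draw"], n),
    "short_questionnaire": rng.integers(0, 2, n),
    "age": rng.integers(45, 60, n),
    "female": rng.integers(0, 2, n),
})
# Synthetic response probabilities: small boost for the £2.50 voucher arm
base = 0.38 + 0.05 * (df["incentive"] == "gbp2.50")
df["responded"] = (rng.uniform(size=n) < base).astype(int)

model = smf.logit("responded ~ C(incentive, Treatment('none')) + "
                  "short_questionnaire + age + female", data=df).fit(disp=0)
print(model.summary().tables[1])   # coefficients are log-odds; exponentiate for odds ratios
```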

  15. Earthquake Scenarios Based Upon the Data and Methodologies of the U.S. Geological Survey's National Seismic Hazard Mapping Project

    Science.gov (United States)

    Rukstales, K. S.; Petersen, M. D.; Frankel, A. D.; Harmsen, S. C.; Wald, D. J.; Quitoriano, V. R.; Haller, K. M.

    2011-12-01

    The U.S. Geological Survey's (USGS) National Seismic Hazard Mapping Project (NSHMP) utilizes a database of over 500 faults across the conterminous United States to constrain earthquake source models for probabilistic seismic hazard maps. Additionally, the fault database is now being used to produce a suite of deterministic ground motions for earthquake scenarios that are based on the same fault source parameters and empirical ground motion prediction equations used for the probabilistic hazard maps. Unlike the calculated hazard map ground motions, local soil amplification is applied to the scenario calculations based on the best available Vs30 (average shear-wave velocity down to 30 meters) mapping, or in some cases using topographic slope as a proxy. Systematic outputs include all standard USGS ShakeMap products, including GIS, KML, XML, and HAZUS input files. These data are available from the ShakeMap web pages with a searchable archive. The scenarios are being produced within the framework of a geographic information system (GIS) so that alternative scenarios can readily be produced by altering fault source parameters, Vs30 soil amplification, as well as the weighting of ground motion prediction equations used in the calculations. The alternative scenarios can then be used for sensitivity analysis studies to better characterize uncertainty in the source model and convey this information to decision makers. By providing a comprehensive collection of earthquake scenarios based upon the established data and methods of the USGS NSHMP, we hope to provide a well-documented source of data which can be used for visualization, planning, mitigation, loss estimation, and research purposes.

  16. Implied Volatility Surface: Construction Methodologies and Characteristics

    OpenAIRE

    Cristian Homescu

    2011-01-01

    The implied volatility surface (IVS) is a fundamental building block in computational finance. We provide a survey of methodologies for constructing such surfaces. We also discuss various topics which can influence the successful construction of IVS in practice: arbitrage-free conditions in both strike and time, how to perform extrapolation outside the core region, choice of calibrating functional and selection of numerical optimization algorithms, volatility surface dynamics and asymptotics.
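    As one concrete example of the parametric route such surveys discuss, the sketch below fits Gatheral's SVI form for total implied variance, w(k) = a + b(rho(k - m) + sqrt((k - m)^2 + sigma^2)), to a single synthetic expiry slice. The data are synthetic and no arbitrage-free constraints are enforced, so this illustrates the mechanics only, not a production IVS construction.

```python
# Hedged sketch: fit the raw SVI smile to one synthetic expiry slice.
import numpy as np
from scipy.optimize import curve_fit

def svi_total_variance(k, a, b, rho, m, s):
    """Raw SVI parameterization of total implied variance in log-moneyness k."""
    return a + b * (rho * (k - m) + np.sqrt((k - m) ** 2 + s ** 2))

k = np.linspace(-0.4, 0.4, 17)                       # log-moneyness grid
true = svi_total_variance(k, 0.02, 0.4, -0.3, 0.0, 0.2)
observed = true + np.random.default_rng(3).normal(0, 5e-4, k.size)

params, _ = curve_fit(svi_total_variance, k, observed,
                      p0=[0.02, 0.3, 0.0, 0.0, 0.1],
                      bounds=([0, 0, -1, -1, 1e-4], [1, 2, 1, 1, 2]))
T = 0.5                                              # time to expiry in years
implied_vol = np.sqrt(svi_total_variance(k, *params) / T)
print(np.round(params, 3), np.round(implied_vol[:3], 4))
```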

  17. Research report of fiscal 1997. Survey on information networks for environment technology transfer; 1997 nendo hokokusho. Kankyo gijutsu iten joho network chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    ICETT is one of the member organizations of the APEC environment technology virtual center established in Osaka in 1996 for active environment technology exchange between APEC countries, based on a survey result obtained in fiscal 1995. ICETT began its information service in April 1997 and completed useful prototype home pages, which include a database of practical information on the environment preservation technology of domestic factories and offices, as well as training information and local governments' approaches to pollution control. In fiscal 1997, a further survey was made, and simulation software for a model process was also developed for training. Clean technology and energy-saving technology strongly demanded by developing countries were surveyed to prepare a database of information useful for developing countries. Basic design software for bio-treatment of waste water was also prepared. The total number of home page accesses exceeded the predicted number for the first half year. 5 refs., 25 figs.

  18. PageFocus: Using paradata to detect and prevent cheating on online achievement tests.

    Science.gov (United States)

    Diedenhofen, Birk; Musch, Jochen

    2017-08-01

    Cheating threatens the validity of unproctored online achievement tests. To address this problem, we developed PageFocus, a JavaScript that detects when participants abandon test pages by switching to another window or browser tab. In a first study, we aimed at testing whether PageFocus could detect and prevent cheating. We asked 115 lab and 186 online participants to complete a knowledge test comprising items that were difficult to answer but easy to look up on the Internet. Half of the participants were invited to look up the solutions, which significantly increased their test scores. The PageFocus script detected test takers who abandoned the test page with very high sensitivity and specificity, and successfully reduced cheating by generating a popup message that asked participants not to cheat. In a second study, 510 online participants completed a knowledge test comprising items that could easily be looked up and a reasoning task involving matrices that were impossible to look up. In a first group, a performance-related monetary reward was promised to the top scorers; in a second group, participants took part in a lottery that provided performance-unrelated rewards; and in a third group, no incentive was offered. PageFocus revealed that participants cheated more when performance-related incentives were offered. As expected, however, this effect was limited to items that could easily be looked up. We recommend that PageFocus be routinely employed to detect and prevent cheating on online achievement tests.

  19. Social Responsibility and Corporate Web Pages: Self-Presentation or Agenda-Setting?

    Science.gov (United States)

    Esrock, Stuart L.; Leichty, Greg B.

    1998-01-01

    Examines how corporate entities use the Web to present themselves as socially responsible citizens and to advance policy positions. Samples randomly "Fortune 500" companies, revealing that, although 90% had Web pages and 82% of the sites addressed a corporate social responsibility issue, few corporations used their pages to monitor…

  20. Microphysics evolution and methodology

    International Nuclear Information System (INIS)

    Dionisio, J.S.

    1990-01-01

    A few general features of the evolution of microphysics and their relationship with microphysics methodology are briefly surveyed. Several pluri-disciplinary and interdisciplinary aspects of microphysics research are also discussed in the present scientific context. The need for an equilibrium between individual tendencies and the collective constraints required by team work, already formulated thirty years ago by Frederic Joliot, is particularly stressed in the present conjuncture of nuclear research, which favours very large team projects and discourages individual initiatives. The increasing importance of the science of science (due to its multiple social, economical and ecological aspects) and the stronger competition between national and international tendencies of scientific (and technical) cooperation are also discussed. (author)

  1. Paged GIRS (Graph Information Retrieval System) Users Manual.

    Science.gov (United States)

    1981-05-01

    1) Place a triple on the list: G NODE1 LINK1 valuei. 2) Add integer I to the end of the list: G NODE1 LINK1 "I". 3) Place valuei in the third location from the bottom of the list: G NODE1 LINK1 .-3 valuei. 4) Replace the second integer value from the top of the list with the integer 10: G NODE1 LINK1 -..2 "10". 5) Assign a random number to valuei (for the current page) and place the triple on page 5, continuant 0: NODE1 = 5, LVREQP(2) = 0, VALUEI = 0, G NODE1 LINK1 VALUEI PRINT.

  2. The importance of social media for patients and families affected by congenital anomalies: A Facebook cross-sectional analysis and user survey.

    Science.gov (United States)

    Jacobs, Robyn; Boyd, Leanne; Brennan, Kirsty; Sinha, C K; Giuliani, Stefano

    2016-11-01

    We aimed to define characteristics and needs of Facebook users in relation to congenital anomalies. Cross-sectional analysis of Facebook related to four congenital anomalies: anorectal malformation (ARM), congenital diaphragmatic hernia (CDH), congenital heart disease (CHD) and hypospadias/epispadias (HS/ES). A keyword search was performed to identify relevant Groups/Pages. An anonymous survey was posted to obtain quantitative/qualitative data on users and their healthcare needs. 54 Groups and 24 Pages were identified (ARM: 10 Groups; CDH: 9 Groups, 7 Pages; CHD: 32 Groups, 17 Pages; HS/ES: 3 Groups), with 16,191 Group members and 48,766 Page likes. 868/1103 (79%) of respondents were parents. Male:female ratio was 1:10.9. 65% of the users were 26-40years old. Common reasons for joining these Groups/Pages included: seeking support, education, making friends, and providing support to others. 932/1103 (84%) would like healthcare professionals (HCPs) to actively participate in their Group. 31% of the respondents felt that they did not receive enough support from their healthcare system. 97% of the respondents would like to join a Group linked to their primary hospital. Facebook Groups/Pages related to congenital anomalies are highly populated and active. There is a need for HCPs and policy makers to better understand and participate in social media to support families and improve patient care. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Yellow pages advertising by physicians. Are doctors providing the information consumers want most?

    Science.gov (United States)

    Butler, D D; Abernethy, A M

    1996-01-01

    Yellow pages listings are the most widely used form of physician advertising. Every month, approximately 21.6 million adults in the United States refer to the yellow pages before obtaining medical care. Mobile consumers--approximately 17% of the U.S. population who move each year--are heavy users of the yellow pages. Consumers desire information on a physician's experience, but it is included in less than 1% of all physician display ads.

  4. Reconfigurable Full-Page Braille Displays

    Science.gov (United States)

    Garner, H. Douglas

    1994-01-01

    Electrically actuated braille display cells of proposed type arrayed together to form full-page braille displays. Like other braille display cells, these provide changeable patterns of bumps driven by digitally recorded text stored on magnetic tapes or in solid-state electronic memories. Proposed cells contain electrorheological fluid. Viscosity of such fluid increases in strong electrostatic field.

  5. Pro single page application development using Backbone.js and ASP.NET

    CERN Document Server

    Fink, Gil

    2014-01-01

    One of the most important and exciting trends in web development in recent years is the move towards single page applications, or SPAs. Instead of clicking through hyperlinks and waiting for each page to load, the user loads a site once and all the interactivity is handled fluidly by a rich JavaScript front end. If you come from a background in ASP.NET development, you'll be used to handling most interactions on the server side. Pro Single Page Application Development will guide you through your transition to this powerful new application type. The book starts in Part I by laying the groundwork

  6. A methodology for digital soil mapping in poorly-accessible areas

    NARCIS (Netherlands)

    Cambule, A.; Rossiter, D.G.; Stoorvogel, J.J.

    2013-01-01

    Effective soil management requires knowledge of the spatial patterns of soil variation within the landscape to enable wise land use decisions. This is typically obtained through time-consuming and costly surveys. The aim of this study was to develop a cost-efficient methodology for digital soil

  7. Microseismic Event Relocation and Focal Mechanism Estimation Based on PageRank Linkage

    Science.gov (United States)

    Aguiar, A. C.; Myers, S. C.

    2017-12-01

    Microseismicity associated with enhanced geothermal systems (EGS) is key to understanding how subsurface stimulation can modify stress, fracture rock, and increase permeability. Large numbers of microseismic events are commonly associated with hydroshearing an EGS, making data mining methods useful in their analysis. We focus on PageRank, originally developed for Google's search engine and subsequently adapted for use in seismology to detect low-frequency earthquakes by linking events directly and indirectly through cross-correlation (Aguiar and Beroza, 2014). We expand on this application by using PageRank to define signal-correlation topology for micro-earthquakes from the Newberry Volcano EGS in Central Oregon, which has been stimulated twice using high-pressure fluid injection. We create PageRank signal families from both data sets and compare these to the spatial and temporal proximity of associated earthquakes. PageRank families are relocated using differential travel times measured by waveform cross-correlation (CC) and the Bayesloc approach (Myers et al., 2007). Prior to relocation, events are loosely clustered, with some events at a distance from the cluster. After relocation, event families are found to be tightly clustered. Indirect linkage of signals using PageRank is a reliable way to increase the number of events confidently determined to be similar, suggesting an efficient and effective grouping of earthquakes with similar physical characteristics (i.e., location, focal mechanism, stress drop). We further explore the possibility of using PageRank families to identify events with similar relative phase polarities and estimate focal mechanisms following the Shelly et al. (2016) method, where CC measurements are used to determine individual polarities within event clusters. Given a positive result, PageRank might be a useful tool in adaptive approaches to enhance production at well-instrumented geothermal sites. Prepared by LLNL under Contract DE-AC52-07NA27344
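
    As an illustration of the linkage idea sketched in this abstract, the following minimal Python example ranks events by PageRank over a cross-correlation similarity graph; the networkx library, the correlation threshold and the toy random matrix are assumptions for illustration, not the authors' implementation.

        # Sketch: rank events by PageRank over a waveform cross-correlation (CC) graph,
        # then keep the top-ranked events as a candidate "signal family".
        import numpy as np
        import networkx as nx

        def pagerank_families(cc_matrix, cc_threshold=0.7, top_fraction=0.2):
            """cc_matrix: symmetric array of waveform cross-correlation coefficients."""
            n = cc_matrix.shape[0]
            g = nx.Graph()
            g.add_nodes_from(range(n))
            for i in range(n):
                for j in range(i + 1, n):
                    if cc_matrix[i, j] >= cc_threshold:   # direct link between similar events
                        g.add_edge(i, j, weight=cc_matrix[i, j])
            # PageRank also credits indirect linkage: events connected through intermediaries.
            scores = nx.pagerank(g, alpha=0.85, weight="weight")
            ranked = sorted(scores, key=scores.get, reverse=True)
            return ranked[: max(1, int(top_fraction * n))], scores

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            cc = rng.uniform(0.3, 1.0, size=(20, 20))
            cc = (cc + cc.T) / 2                          # symmetrize the toy matrix
            np.fill_diagonal(cc, 1.0)
            family, scores = pagerank_families(cc)
            print("candidate family members:", family)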

  8. Methodological exploratory study applied to occupational epidemiology

    Energy Technology Data Exchange (ETDEWEB)

    Carneiro, Janete C.G. Gaburo; Vasques, Monica Heloisa B.; Fontinele, Ricardo S.; Sordi, Gian Maria A. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mail: janetegc@ipen.br

    2007-07-01

    The utilization of epidemiologic methods and techniques has been the object of practical experimentation and theoretical-methodological reflection in the health planning and programming process. Occupational Epidemiology is the study of the causes and prevention of diseases and injuries arising from exposures and risks in the work environment. In this context, there is no intention to exhaust such a complex theme but to deal with basic concepts of Occupational Epidemiology, presenting the main characteristics of the analysis methods used in epidemiology, as well as to investigate the possible determinants of exposure (chemical, physical and biological agents). For this study, the socio-demographic profile of the IPEN-CNEN/SP work force was used. The knowledge of the composition of this reference population is based on sex, age, educational level, marital status and different occupations, with the aim of establishing the relation between health aggravating factors and these variables. The methodology refers to a non-experimental research based on a theoretical-methodological practice. The work performed has an exploratory character, aiming at a later survey of indicators in the health area in order to analyze possible correlations related to epidemiologic issues. (author)

  9. Methodological exploratory study applied to occupational epidemiology

    International Nuclear Information System (INIS)

    Carneiro, Janete C.G. Gaburo; Vasques, Monica Heloisa B.; Fontinele, Ricardo S.; Sordi, Gian Maria A.

    2007-01-01

    The utilization of epidemiologic methods and techniques has been the object of practical experimentation and theoretical-methodological reflection in the health planning and programming process. Occupational Epidemiology is the study of the causes and prevention of diseases and injuries arising from exposures and risks in the work environment. In this context, there is no intention to exhaust such a complex theme but to deal with basic concepts of Occupational Epidemiology, presenting the main characteristics of the analysis methods used in epidemiology, as well as to investigate the possible determinants of exposure (chemical, physical and biological agents). For this study, the socio-demographic profile of the IPEN-CNEN/SP work force was used. The knowledge of the composition of this reference population is based on sex, age, educational level, marital status and different occupations, with the aim of establishing the relation between health aggravating factors and these variables. The methodology refers to a non-experimental research based on a theoretical-methodological practice. The work performed has an exploratory character, aiming at a later survey of indicators in the health area in order to analyze possible correlations related to epidemiologic issues. (author)

  10. Measuring our Universe from Galaxy Redshift Surveys.

    Science.gov (United States)

    Lahav, Ofer; Suto, Yasushi

    2004-01-01

    Galaxy redshift surveys have achieved significant progress over the last couple of decades. Those surveys tell us in the most straightforward way what our local Universe looks like. While the galaxy distribution traces the bright side of the Universe, detailed quantitative analyses of the data have even revealed the dark side of the Universe dominated by non-baryonic dark matter as well as more mysterious dark energy (or Einstein's cosmological constant). We describe several methodologies of using galaxy redshift surveys as cosmological probes, and then summarize the recent results from the existing surveys. Finally we present our views on the future of redshift surveys in the era of precision cosmology.

  11. Implementing database system for LHCb publications page

    CERN Document Server

    Abdullayev, Fakhriddin

    2017-01-01

    LHCb is one of the main detectors at the Large Hadron Collider, where physicists and scientists work together on high-precision measurements of matter-antimatter asymmetries and searches for rare and forbidden decays, with the aim of discovering new and unexpected forces. The work consists not only of analyzing data collected from experiments but also of publishing the results of those analyses. The LHCb publications are gathered on the LHCb publications page to maximize their availability both to LHCb members and to the high energy community. In this project a new database system was implemented for the LHCb publications page. This will help to improve access to research papers for scientists and provide better integration with the current CERN library website and others.

  12. Objectives and methodology of Romanian SEPHAR II Survey. Project for comparing the prevalence and control of cardiovascular risk factors in two East-European countries: Romania and Poland.

    Science.gov (United States)

    Dorobantu, Maria; Tautu, Oana-Florentina; Darabont, Roxana; Ghiorghe, Silviu; Badila, Elisabeta; Dana, Minca; Dobreanu, Minodora; Baila, Ilarie; Rutkowski, Marcin; Zdrojewski, Tomasz

    2015-08-12

    Comparing the results of representative surveys conducted in different East-European countries could contribute to a better understanding and management of cardiovascular risk factors, offering grounds for the development of health policies addressing the special needs of this high cardiovascular risk region of Europe. The aim of this paper was to describe the methodology on which the comparison between the Romanian survey SEPHAR II and the Polish survey NATPOL 2011 results is based. SEPHAR II, like NATPOL 2011, is a cross-sectional survey conducted on a representative sample of the adult Romanian population (18 to 80 years) and encompasses two visits with the following components: completing the study questionnaire, blood pressure and anthropometric measurements, and collection of blood and urine samples. From a total of 2223 subjects found at 2860 visited addresses, 2044 subjects gave written consent but only 1975 subjects had eligible data for the analysis, accounting for a response rate of 69.06%. Additionally, we excluded 11 subjects who were 80 years of age (NATPOL 2011 included adult subjects only up to 79 years). Therefore, the sample size included in the statistical analysis is 1964. It has an age-group and gender structure similar to that of the Romanian population aged 18-79 years from the last census available at the moment of conducting the survey (weight adjustments for epidemiological analyses range from 0.48 to 8.7). Sharing many similarities, the results of the SEPHAR II and NATPOL 2011 surveys can be compared by a proper statistical method, offering crucial information regarding cardiovascular risk factors in a high-cardiovascular-risk European region.

  13. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang [Seoul National Univ., Seoul (Korea, Republic of)

    1997-07-15

    The objective of this study is the development of a methodology for assessing the optimization of the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and thereby improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and the application of the new methodology to an example system confirms its feasibility. In the second year of this study, sensitivity analyses of the component failure factors are performed on the basis of the assessment methodologies of the first year, and the interaction modeling of the STI and AOT is quantified. The reliability assessment methodology for the diesel generator is also reviewed and applied in the PSA code.

  14. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang

    1997-07-01

    The objective of this study is the development of a methodology for assessing the optimization of the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and thereby improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and the application of the new methodology to an example system confirms its feasibility. In the second year of this study, sensitivity analyses of the component failure factors are performed on the basis of the assessment methodologies of the first year, and the interaction modeling of the STI and AOT is quantified. The reliability assessment methodology for the diesel generator is also reviewed and applied in the PSA code.

  15. Discovering urban mobility patterns with PageRank based traffic modeling and prediction

    Science.gov (United States)

    Wang, Minjie; Yang, Su; Sun, Yi; Gao, Jun

    2017-11-01

    An urban transportation system can be viewed as a complex network in which time-varying traffic flows act as links connecting adjacent regions, the networked nodes. By computing urban traffic evolution on such a temporal complex network with PageRank, it is found that, for most regions, there exists a linear relation between the traffic congestion measure at the present time and the PageRank value at the previous time. Since the PageRank measure of a region results from the mutual interactions of the whole network, this implies that the traffic state of a local region does not evolve independently but is affected by the evolution of the whole network. As a result, PageRank values can act as signatures for predicting upcoming traffic congestion. We observe the aforementioned laws experimentally based on the trajectory data of 12,000 taxis in Beijing over one month.
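
    A minimal sketch of the reported relation, assuming region-level congestion measures and region-to-region flow matrices are available; the synthetic data, the networkx PageRank call and the least-squares fit are illustrative assumptions rather than the authors' pipeline.

        # Sketch: PageRank of regions on a flow-weighted directed graph at time t-1,
        # then a least-squares fit of congestion(t) against pagerank(t-1) for one region.
        import numpy as np
        import networkx as nx

        def region_pagerank(flow_matrix):
            """flow_matrix[i, j]: traffic flow from region i to region j at one time step."""
            g = nx.from_numpy_array(flow_matrix, create_using=nx.DiGraph)
            return nx.pagerank(g, alpha=0.85, weight="weight")

        rng = np.random.default_rng(1)
        n_regions, n_steps = 8, 48
        flows = rng.uniform(0.0, 10.0, size=(n_steps, n_regions, n_regions))
        congestion = rng.uniform(0.0, 1.0, size=(n_steps, n_regions))  # placeholder congestion measure

        region = 3
        x = np.array([region_pagerank(flows[t])[region] for t in range(n_steps - 1)])
        y = congestion[1:, region]                     # congestion at the next time step
        a, b = np.polyfit(x, y, 1)                     # linear relation reported in the abstract
        print(f"congestion(t) ~ {a:.2f} * pagerank(t-1) + {b:.2f}")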

  16. Progress on water data integration and distribution: a summary of select U.S. Geological Survey data systems

    Science.gov (United States)

    Blodgett, David L.; Lucido, Jessica M.; Kreft, James M.

    2016-01-01

    Critical water-resources issues ranging from flood response to water scarcity make access to integrated water information, services, tools, and models essential. Since 1995 when the first water data web pages went online, the U.S. Geological Survey has been at the forefront of water data distribution and integration. Today, real-time and historical streamflow observations are available via web pages and a variety of web service interfaces. The Survey has built partnerships with Federal and State agencies to integrate hydrologic data providing continuous observations of surface and groundwater, temporally discrete water quality data, groundwater well logs, aquatic biology data, water availability and use information, and tools to help characterize the landscape for modeling. In this paper, we summarize the status and design patterns implemented for selected data systems. We describe how these systems contribute to a U.S. Federal Open Water Data Initiative and present some gaps and lessons learned that apply to global hydroinformatics data infrastructure.

  17. Review Pages: Cities, Energy and Mobility

    Directory of Open Access Journals (Sweden)

    Gennaro Angiello

    2015-12-01

    Full Text Available Starting from the relationship between urban planning and mobility management, TeMA has gradually expanded the range of topics it covers, always remaining in the groove of rigorous scientific in-depth analysis. During the last two years particular attention has been paid to the Smart Cities theme and to the different meanings that come with it. The last section of the journal is formed by the Review Pages. They have different aims: to inform on problems, trends and evolutionary processes; to investigate paths by highlighting the advanced relationships among apparently distant disciplinary fields; to explore areas of interaction, experiences and potential applications; to underline interactions and disciplinary developments but also, if present, defeats and setbacks. Inside the journal the Review Pages have the task of stimulating as much as possible the circulation of ideas and the discovery of new points of view. For this reason the section is founded on a series of basic references, required for the identification of new and more advanced interactions. These references are the research, the planning acts, the actions and the applications, analysed and investigated both for their ability to give a systematic response to questions concerning urban and territorial planning, and for their attention to aspects such as environmental sustainability and innovation in practice. For this purpose the Review Pages are formed by five sections (Web Resources; Books; Laws; Urban Practices; News and Events), each of which examines a specific aspect of the broader information storage of interest for TeMA.

  18. La double-page chez Hirohiko Araki : l’ubris faite norme

    Directory of Open Access Journals (Sweden)

    Aurélien Pigeat

    2011-03-01

    Full Text Available Araki's style is characterized by a massive use of the double-page spread, far from the occasional use usually found in shonen manga. It testifies to a form of hubris that becomes a norm of composition; that is, it turns the principle of overflow into a rule for structuring the action and the pages. It is thus the mark of the excess of an author who, under the appearance of popular manga, and through the codes of the genre, builds a work of rare power and sophistication.

    To grasp this phenomenon, we must first observe concretely how Hirohiko Araki uses the double-page spread, the places in the narrative where it appears, the frequency of its occurrences, and the way it characterizes the different parts of the saga Jojo's Bizarre Adventure. The hubris of the double-page spread then throws into particular relief two central elements of Hirohiko Araki's poetics: the body and time. The double-page spread appears as the material response to bodies that overflow the classical page, that burst apart or scatter in such a way that the author must extend the limits of the frame in order to represent them. Time enters the scene through detail: the double-page spread becomes the reign of a detail over which the reader passes too quickly and to which the hero gives full attention. Finally, the double-page spread has, in Hirohiko Araki's work, a properly aesthetic dimension, clearly revealed by the almost absolute use made of it in the seventh part of Jojo's Bizarre Adventure, Steel Ball Run, and in the volume Rohan au Louvre, even if it can already be glimpsed earlier. The double-page spread is a moment when the enigma presents itself to the hero, is set in motion, or is elucidated. The reign of hubris asserts itself there, in these double-page spreads which are ultimately no longer double pages, in this

  19. Personal Web home pages of adolescents with cancer: self-presentation, information dissemination, and interpersonal connection.

    Science.gov (United States)

    Suzuki, Lalita K; Beale, Ivan L

    2006-01-01

    The content of personal Web home pages created by adolescents with cancer is a new source of information about this population, of potential benefit to oncology nurses and psychologists. Individual Internet elements found on 21 home pages created by youths with cancer (14-22 years old) were rated for cancer-related self-presentation, information dissemination, and interpersonal connection. Examples of adolescents' online narratives were also recorded. Adolescents with cancer used various Internet elements on their home pages for cancer-related self-presentation (e.g., welcome messages, essays, personal history and diary pages, news articles, and poetry), information dissemination (e.g., through personal interest pages, multimedia presentations, lists, charts, and hyperlinks), and interpersonal connection (e.g., guestbook entries). Results suggest that various elements found on personal home pages are being used by a limited number of young patients with cancer for self-expression, information access, and contact with peers.

  20. Improving Web Page Retrieval using Search Context from Clicked Domain Names

    NARCIS (Netherlands)

    Li, R.

    Search context is a crucial factor that helps to understand a user’s information need in ad-hoc Web page retrieval. A query log of a search engine contains rich information on issued queries and their corresponding clicked Web pages. The clicked data implies its relevance to the query and can be

  1. Global and Local Page Replacement Algorithms on Virtual Memory Systems for Image Processing

    OpenAIRE

    WADA, Ben Tsutom

    1985-01-01

    Three virtual memory systems for image processing, differing from one another in frame allocation and page replacement algorithms, were examined experimentally with respect to their page-fault characteristics. The hypothesis that global page replacement algorithms are susceptible to thrashing held in the raster-scan experiment, while it did not in another, non-raster-scan experiment. The results of the experiments may also be useful in making parallel image processors more efficient, while they a...
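
    A minimal sketch contrasting local and global frame allocation under an LRU replacement policy; the toy reference strings and frame counts are assumptions for illustration and do not reproduce the paper's image-processing experiments.

        # Sketch: count LRU page faults under (a) local allocation, where each process
        # replaces pages only within its own fixed frames, and (b) global allocation,
        # where both processes compete for one shared pool of frames.
        from collections import OrderedDict

        def lru_faults(reference_string, n_frames):
            frames, faults = OrderedDict(), 0
            for page in reference_string:
                if page in frames:
                    frames.move_to_end(page)           # mark as most recently used
                else:
                    faults += 1
                    if len(frames) >= n_frames:
                        frames.popitem(last=False)     # evict the least recently used page
                    frames[page] = True
            return faults

        # Two toy processes: a sequential, raster-scan-like access and a small loop.
        proc_a = [("A", p) for p in list(range(12)) * 2]
        proc_b = [("B", p) for p in [0, 1, 2, 3] * 6]
        interleaved = [page for pair in zip(proc_a, proc_b) for page in pair]

        local_faults = lru_faults(proc_a, 4) + lru_faults(proc_b, 4)
        global_faults = lru_faults(interleaved, 8)
        print("local allocation faults:", local_faults, "| global allocation faults:", global_faults)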

  2. Accounting Programs' Home Pages: What's Happening.

    Science.gov (United States)

    Peek, Lucia E.; Roxas, Maria L.

    2002-01-01

    Content analysis of 62 accounting programs' websites indicated the following: 53% include mission statements; 62.9% list accreditation; many faculty biographies and personal pages used inconsistent formats; provision of information on financial aid, student organizations, career services, and certified public accountant requirements varied. Many…

  3. Designing Surveys for Language Programs.

    Science.gov (United States)

    Brown, James Dean

    A discussion of survey methodology for investigating second language programs and instruction examines two methods: oral interviews and written questionnaires. Each method is defined, and variations are explored. For interviews, this includes individual, group, and telephone interviews. For questionnaires, this includes self-administered and…

  4. Proteomic study of muscle sarcoplasmic proteins using AUT-PAGE/SDS-PAGE as two-dimensional gel electrophoresis.

    Science.gov (United States)

    Picariello, Gianluca; De Martino, Alessandra; Mamone, Gianfranco; Ferranti, Pasquale; Addeo, Francesco; Faccia, Michele; Spagnamusso, Salvatore; Di Luccia, Aldo

    2006-03-20

    In the present study, an alternative procedure for two-dimensional (2D) electrophoretic analysis in proteomic investigation of the most represented basic muscle water-soluble proteins is suggested. Our method consists of Acetic acid-Urea-Triton polyacrylamide gel (AUT-PAGE) analysis in the first dimension and standard sodium dodecyl sulphate polyacrylamide gel (SDS-PAGE) analysis in the second dimension. Although standard two-dimensional Immobilized pH Gradient-Sodium Dodecyl Sulphate (2D IPG-SDS) gel electrophoresis has been successfully used to study these proteins, most of the water-soluble proteins are spread over the alkaline part of the 2D map and are poorly focused. Furthermore, the similarity in their molecular weights impairs the resolution of the classical approach. The addition of Triton X-100, a non-ionic detergent, to the gel induces a differential electrophoretic mobility of proteins as a result of the formation of mixed micelles between the detergent and the hydrophobic moieties of polypeptides, separating basic proteins by a criterion similar to reversed-phase chromatography, based on their hydrophobicity. The acid pH induces positive net charges, increasing with the isoelectric point of proteins, thus allowing enhanced resolution in the separation. By using the 2D AUT-PAGE/SDS electrophoresis approach to separate water-soluble proteins from fresh pork and from dry-cured products, we could spread proteins over a greater area, achieving a greater resolution than that obtained by IPG in the pH ranges 3-10 and 6-11. Sarcoplasmic proteins undergoing proteolysis during the ripening of products were identified by Matrix Assisted Laser Desorption/Ionization-Time of Flight (MALDI-ToF) mass spectrometry peptide mass fingerprinting in an easier and more effective way. Two-dimensional AUT-PAGE/SDS electrophoresis has made it possible to simplify the separation of sarcoplasmic protein mixtures, making this technique suitable for defining the quality of dry-cured pork products by immediate

  5. Learning lessons from field surveys in humanitarian contexts: a case study of field surveys conducted in North Kivu, DRC 2006-2008

    Directory of Open Access Journals (Sweden)

    Grellety Emmanuel

    2009-09-01

    Full Text Available Abstract Survey estimates of mortality and malnutrition are commonly used to guide humanitarian decision-making. Currently, different methods of conducting field surveys are the subject of debate among epidemiologists. Beyond the technical arguments, decision makers may find it difficult to conceptualize what the estimates actually mean. For instance, what makes this particular situation an emergency? And how should the operational response be adapted accordingly? This brings into question not only the quality of the survey methodology, but also the difficulties epidemiologists face in interpreting results and selecting the most important information to guide operations. As a case study, we reviewed mortality and nutritional surveys conducted in North Kivu, Democratic Republic of Congo (DRC) and published from January 2006 to January 2009. We performed a PubMed/Medline search for published articles and scanned publicly available humanitarian databases and clearinghouses for grey literature. To evaluate the surveys, we developed minimum reporting criteria based on available guidelines and selected peer-reviewed articles. We identified 38 reports through our search strategy; three surveys met our inclusion criteria. The surveys varied in methodological quality. Reporting against the minimum criteria was generally good, but presentation of ethical procedures, raw data and survey limitations was missing in all surveys. All surveys also failed to consider contextual factors important for data interpretation. From this review, we conclude that mechanisms to ensure sound survey design and conduct must be implemented by operational organisations to improve data quality and reporting. Training in data interpretation would also be useful. Novel survey methods should be trialled and prospective data gathering (surveillance) employed wherever feasible.

  6. Enriching the trustworthiness of health-related web pages.

    Science.gov (United States)

    Gaudinat, Arnaud; Cruchet, Sarah; Boyer, Celia; Chrawdhry, Pravir

    2011-06-01

    We present an experimental mechanism for enriching web content with quality metadata. This mechanism is based on a simple and well-known initiative in the field of the health-related web, the HONcode. The Resource Description Framework (RDF) format and the Dublin Core Metadata Element Set were used to formalize these metadata. The model of trust proposed is based on a quality model for health-related web pages that has been tested in practice over a period of thirteen years. Our model has been explored in the context of a project to develop a research tool that automatically detects the occurrence of quality criteria in health-related web pages.
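
    A minimal sketch of attaching Dublin Core and quality metadata to a health-related page in RDF, in the spirit of the mechanism described above; the URIs, the quality property names and the use of the rdflib library are illustrative assumptions, not the project's actual schema.

        # Sketch: describe a health-related page with Dublin Core elements plus
        # hypothetical quality annotations inspired by HONcode-style criteria.
        from rdflib import Graph, Literal, Namespace, URIRef
        from rdflib.namespace import DC, RDF

        QUALITY = Namespace("http://example.org/quality#")    # hypothetical vocabulary

        g = Graph()
        page = URIRef("http://example.org/asthma-advice")      # placeholder page URI

        g.add((page, RDF.type, QUALITY.HealthWebPage))
        g.add((page, DC.title, Literal("Managing asthma at home")))
        g.add((page, DC.creator, Literal("Example Health Foundation")))
        g.add((page, DC.date, Literal("2011-06-01")))
        g.add((page, QUALITY.authorshipDisclosed, Literal(True)))
        g.add((page, QUALITY.sourcesCited, Literal(True)))

        print(g.serialize(format="turtle"))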

  7. Identification of Malicious Web Pages by Inductive Learning

    Science.gov (United States)

    Liu, Peishun; Wang, Xuefang

    Malicious web pages have become an increasing threat to computer systems in recent years. Traditional anti-virus techniques typically focus on detection of the static signatures of malware and are ineffective against these new threats because they cannot deal with zero-day attacks. In this paper, a novel classification method for detecting malicious web pages is presented. This method performs generalization and specialization of attack patterns based on inductive learning, which can be used to update and expand the knowledge database. The attack pattern is established from an example and generalized by inductive learning, so that it can be used to detect unknown attacks whose behavior is similar to the example.

  8. Measurment of Web Usability: Web Page of Hacettepe University Department of Information Management

    OpenAIRE

    Nazan Özenç Uçak; Tolga Çakmak

    2009-01-01

    Today, information is increasingly produced in electronic form and retrieval of information is provided via web pages. As a result of the rising number of web pages, many of them seem to comprise similar content but different designs. In this respect, presenting information on web pages according to user expectations and specifications is important for effective use of information. This study provides insight into web usability studies that are carried out for measuring...

  9. Search Engine Ranking, Quality, and Content of Web Pages That Are Critical Versus Noncritical of Human Papillomavirus Vaccine.

    Science.gov (United States)

    Fu, Linda Y; Zook, Kathleen; Spoehr-Labutta, Zachary; Hu, Pamela; Joseph, Jill G

    2016-01-01

    Online information can influence attitudes toward vaccination. The aim of the present study was to provide a systematic evaluation of the search engine ranking, quality, and content of Web pages that are critical versus noncritical of human papillomavirus (HPV) vaccination. We identified HPV vaccine-related Web pages with the Google search engine by entering 20 terms. We then assessed each Web page for critical versus noncritical bias and for the following quality indicators: authorship disclosure, source disclosure, attribution of at least one reference, currency, exclusion of testimonial accounts, and a readability level less than ninth grade. We also determined Web page comprehensiveness in terms of mention of 14 HPV vaccine-relevant topics. Twenty searches yielded 116 unique Web pages. HPV vaccine-critical Web pages comprised roughly a third of the top-ranking, top 5-ranking and top 10-ranking Web pages. The prevalence of HPV vaccine-critical Web pages was higher for queries that included term modifiers in addition to root terms. Web pages critical of HPV vaccine overall had a lower quality score than those with a noncritical bias, yet they ranked prominently in search engine queries despite being of lower quality and less comprehensive than noncritical Web pages. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  10. Are parental leaves considered as work interruptions by survey respondents? A methodological note

    Directory of Open Access Journals (Sweden)

    Céline Le Bourdais

    2012-01-01

    Full Text Available Parental leaves and family-related work interruptions are linked to a variety of issues, such as children’s well-being or women’s work trajectories. Yet the measurement of periods of absence from the labour market might be imprecise, especially in retrospective surveys. To evaluate the quality of the collected information, we examine whether women who reported taking a parental leave longer than six months also mentioned a corresponding work interruption, using the 2008 Living in Canada Survey (LCS – Pilot). Our analysis shows that nearly half of the women failed to do so. We investigate the sources of the discrepancy and suggest possible avenues of change for future surveys.

  11. Referencing web pages and e-journals.

    Science.gov (United States)

    Bryson, David

    2013-12-01

    One of the areas that can confuse students and authors alike is how to reference web pages and electronic journals (e-journals). The aim of this professional development article is to go back to first principles for referencing and to show, with examples, how these sources should be referenced.

  12. Web pages: What can you see in a single fixation?

    Science.gov (United States)

    Jahanian, Ali; Keshvari, Shaiyan; Rosenholtz, Ruth

    2018-01-01

    Research in human vision suggests that in a single fixation, humans can extract a significant amount of information from a natural scene, e.g. the semantic category, spatial layout, and object identities. This ability is useful, for example, for quickly determining location, navigating around obstacles, detecting threats, and guiding eye movements to gather more information. In this paper, we ask a new question: What can we see at a glance at a web page - an artificial yet complex "real world" stimulus? Is it possible to notice the type of website, or where the relevant elements are, with only a glimpse? We find that observers, fixating at the center of a web page shown for only 120 milliseconds, are well above chance at classifying the page into one of ten categories. Furthermore, this ability is supported in part by text that they can read at a glance. Users can also understand the spatial layout well enough to reliably localize the menu bar and to detect ads, even though the latter are often camouflaged among other graphical elements. We discuss the parallels between web page gist and scene gist, and the implications of our findings for both vision science and human-computer interaction.

  13. A Survey on Chinese Scholars' Adoption of Mixed Methods

    Science.gov (United States)

    Zhou, Yuchun

    2018-01-01

    Since the 1980s, when mixed methods emerged as "the third research methodology", it has been widely adopted in Western countries. However, little literature has revealed how this methodology has been received by scholars in Asian countries, such as China. Therefore, this paper used a quantitative survey to investigate Chinese scholars' perceptions…

  14. Does content affect whether users remember that Web pages were hyperlinked?

    Science.gov (United States)

    Jones, Keith S; Ballew, Timothy V; Probst, C Adam

    2008-10-01

    We determined whether memory for hyperlinks improved when they represented relations between the contents of the Web pages. J. S. Farris (2003) found that memory for hyperlinks improved when they represented relations between the contents of the Web pages. However, Farris's (2003) participants could have used their knowledge of site content to answer questions about relations that were instantiated via the site's content and its hyperlinks. In Experiment 1, users navigated a Web site and then answered questions about relations that were instantiated only via content, only via hyperlinks, and via content and hyperlinks. Unlike Farris (2003), we split the latter into two sets. One asked whether certain content elements were related, and the other asked whether certain Web pages were hyperlinked. Experiment 2 replicated Experiment 1 with one modification: The questions that were asked about relations instantiated via content and hyperlinks were changed so that each question's wrong answer was also related to the question's target. Memory for hyperlinks improved when they represented relations instantiated within the content of the Web pages. This was true when (a) questions about content and hyperlinks were separated (Experiment 1) and (b) each question's wrong answer was also related to the question's target (Experiment 2). The accuracy of users' mental representations of local architecture depended on whether hyperlinks were related to the site's content. Designers who want users to remember hyperlinks should associate those hyperlinks with content that reflects the relation between the contents on the Web pages.

  15. EuropeaN Energy balance Research to prevent excessive weight Gain among Youth (ENERGY project: Design and methodology of the ENERGY cross-sectional survey

    Directory of Open Access Journals (Sweden)

    Moreno Luis

    2011-01-01

    Full Text Available Abstract Background Obesity treatment is by and large ineffective in the long term, and more emphasis on the prevention of excessive weight gain in childhood and adolescence is warranted. To inform energy balance related behaviour (EBRB) change interventions, insight into the potential personal, family and school environmental correlates of these behaviours is needed. Studies on such multilevel correlates of EBRB among schoolchildren in Europe are lacking. The ENERGY survey aims to (1) provide up-to-date prevalence rates of measured overweight, obesity, self-reported engagement in EBRBs, objective accelerometer-based assessment of physical activity and sedentary behaviour, and blood-sample biomarkers of metabolic function in countries in different regions of Europe, and (2) identify personal, family and school environmental correlates of these EBRBs. This paper describes the design, methodology and protocol of the survey. Method/Design A school-based cross-sectional survey was carried out in 2010 in seven different European countries: Belgium, Greece, Hungary, the Netherlands, Norway, Slovenia, and Spain. The survey included measurements of anthropometrics, child, parent and school-staff questionnaires, and school observations to measure and assess outcomes (i.e. height, weight, and waist circumference), EBRBs and potential personal, family and school environmental correlates of these behaviours, including social-cultural, physical, political, and economic environmental factors. In addition, a selection of countries conducted accelerometer measurements to objectively assess physical activity and sedentary behaviour, and collected blood samples to assess several biomarkers of metabolic function. Discussion The ENERGY survey is a comprehensive cross-sectional study measuring anthropometrics and biomarkers as well as assessing a range of EBRBs and their potential correlates at the personal, family and school level, among 10-12 year old children in seven

  16. The Concept of Post-Non-Classic al Methodological Strategy by Professor Olexander Oguy

    Directory of Open Access Journals (Sweden)

    Olha Chervinska

    2017-12-01

    Full Text Available The article under discussion surveys how the literary concept of a prominent Ukrainian German Studies expert, Olexandr Dmytrovych Oguy, developed on the pages of the scientific journal “Problems of Literary Criticism” from 1993 to 2013. The article traces the stages of progress of the scientist’s methodological strategy concerning the so-called concept of post-non-classicism, beginning with his first article in “The Issues”, where he presented an effective example of invoking timeless linguistic universals for the purpose of comparing typologically similar genre forms. In this way, he managed to argue for the authenticity of “The Tale of Igor’s Campaign” through historical-typological comparison with medieval epic works (Issue 1, 1993). After a 15-year break, O. Oguy’s articles “A Three-Dimensional Poetic Space of ‘Lanzelet’ by Ulrich von Zatzikhoven” (review, 2010), “The Historical-Social Genres in Post-Non-Classical Methodology: the Principles of Classification” (2012), and “The Principles of Formation of the Middle Age German Literature: Overcoming the Crisis Stages through New Genres” (2013) were published in the journal. Each of these works contains a distinct conceptual program. The review was not restricted to a mere critical analysis of the work by the German researcher Kai Lorenz. It also highlighted the difference between the methodological fundamentals of European and Ukrainian scientific text analyses – “different paradigms of foreign and Ukrainian Medievalist Studies”. This scientific text is a topical “transplantation” of the analytic methodology of K. Lorenz into Ukrainian philological practice. It was introduced into scientific circulation as the so-called “trichotomic model” of functional types of interacting spatial structures. The article “The Historical-Social Genres in Post-Non-Classical Methodology: the Principles of Classification” (Issue 86, 2012) is a vivid example of

  17. Around power law for PageRank components in Buckley-Osthus model of web graph

    OpenAIRE

    Gasnikov, Alexander; Zhukovskii, Maxim; Kim, Sergey; Noskov, Fedor; Plaunov, Stepan; Smirnov, Daniil

    2017-01-01

    In this paper we investigate the power law for PageRank components in the Buckley-Osthus model of the web graph. We compare different numerical methods for PageRank calculation. With the best method we carry out a large number of numerical experiments. These experiments confirm the power-law hypothesis. At the end we discuss a real model of web ranking based on the classical PageRank approach.
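
    A minimal sketch of this kind of experiment: compute PageRank by power iteration on a preferential-attachment graph and estimate the tail exponent of the components on a log-log scale; the Barabási-Albert generator stands in for the Buckley-Osthus model, which is an assumption made purely for illustration.

        # Sketch: power-iteration PageRank on a preferential-attachment graph, followed
        # by a crude log-log slope estimate for the tail of the PageRank components.
        import numpy as np
        import networkx as nx

        def pagerank_power_iteration(adj, alpha=0.85, tol=1e-10, max_iter=200):
            n = adj.shape[0]
            out_deg = adj.sum(axis=1)
            p = np.full(n, 1.0 / n)
            for _ in range(max_iter):
                # distribute rank along out-links; dangling nodes spread uniformly
                spread = np.where(out_deg > 0, p / np.maximum(out_deg, 1.0), 0.0)
                new_p = alpha * (adj.T @ spread + p[out_deg == 0].sum() / n) + (1 - alpha) / n
                if np.abs(new_p - p).sum() < tol:
                    return new_p
                p = new_p
            return p

        g = nx.barabasi_albert_graph(2000, 3, seed=0)     # stand-in for the Buckley-Osthus model
        pr = pagerank_power_iteration(nx.to_numpy_array(g))

        tail = np.sort(pr)[::-1][:200]                    # largest PageRank components
        slope, _ = np.polyfit(np.log(np.arange(1, tail.size + 1)), np.log(tail), 1)
        print("approximate power-law exponent of the tail:", -slope)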

  18. Current State of Agile Methodologies Worldwide and in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Martin Tománek

    2015-06-01

    Full Text Available The objective of this research paper is to compare the current state of agile methodologies worldwide and in the Czech Republic. The comparison is executed as a comparative analysis of two publicly available surveys conducted in 2013 and published in 2014. The comparison is further enriched by the results of an unpublished survey conducted in 2013 in a global logistics company. The potential trend for agile methodologies in the Czech Republic is also discussed with regard to the worldwide trend.

  19. Measuring our Universe from Galaxy Redshift Surveys

    Directory of Open Access Journals (Sweden)

    Lahav Ofer

    2004-07-01

    Full Text Available Galaxy redshift surveys have achieved significant progress over the last couple of decades. Those surveys tell us in the most straightforward way what our local Universe looks like. While the galaxy distribution traces the bright side of the Universe, detailed quantitative analyses of the data have even revealed the dark side of the Universe dominated by non-baryonic dark matter as well as more mysterious dark energy (or Einstein's cosmological constant). We describe several methodologies of using galaxy redshift surveys as cosmological probes, and then summarize the recent results from the existing surveys. Finally we present our views on the future of redshift surveys in the era of precision cosmology.

  20. Digital libraries and World Wide Web sites and page persistence.

    Directory of Open Access Journals (Sweden)

    Wallace Koehler

    1999-01-01

    Full Text Available Web pages and Web sites, some argue, can either be collected as elements of digital or hybrid libraries, or, as others would have it, the WWW is itself a library. We begin with the assumption that Web pages and Web sites can be collected and categorized. The paper explores the proposition that the WWW constitutes a library. We conclude that the Web is not a digital library. However, its component parts can be aggregated and included as parts of digital library collections. These, in turn, can be incorporated into "hybrid libraries." These are libraries with both traditional and digital collections. Material on the Web can be organized and managed. Native documents can be collected in situ, disseminated, distributed, catalogued, indexed, controlled, in traditional library fashion. The Web therefore is not a library, but material for library collections is selected from the Web. That said, the Web and its component parts are dynamic. Web documents undergo two kinds of change. The first type, the type addressed in this paper, is "persistence" or the existence or disappearance of Web pages and sites, or in a word the lifecycle of Web documents. "Intermittence" is a variant of persistence, and is defined as the disappearance and subsequent reappearance of Web documents. At any given time, about five percent of Web pages are intermittent, which is to say they are gone but will return. Over time a Web collection erodes. Based on a 120-week longitudinal study of a sample of Web documents, it appears that the half-life of a Web page is somewhat less than two years and the half-life of a Web site is somewhat more than two years. That is to say, an unweeded Web document collection created two years ago would contain the same number of URLs, but only half of those URLs would point to content. The second type of change Web documents experience is change in Web page or Web site content. Again based on the Web document samples, very nearly all Web pages and sites undergo some

  1. Functionality screen of streptavidin mutants by non-denaturing SDS-PAGE using biotin-4-fluorescein.

    Science.gov (United States)

    Humbert, Nicolas; Ward, Thomas R

    2008-01-01

    Site-directed mutagenesis or directed evolution of proteins often leads to the production of inactive mutants. For streptavidin and related proteins, mutations may lead to the loss of their biotin-binding properties. With high-throughput screening methodologies in mind, it is imperative to detect, prior to high-density protein production, the bacteria that produce non-functional streptavidin isoforms. Based on the incorporation of biotin-4-fluorescein into streptavidin mutants present in Escherichia coli bacterial extracts, we detail a functional screen that allows the identification of biotin-binding streptavidin variants. Bacteria are cultivated in a small volume, followed by a rapid treatment of the cells; biotin-4-fluorescein is added to the bacterial extract and loaded on a Sodium Dodecyl Sulfate Poly-Acrylamide Gel Electrophoresis (SDS-PAGE) gel under non-denaturing conditions. Visualization is performed using a UV transilluminator. This screen is thus easy to implement, cheap and requires only readily available equipment.

  2. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Cho, Jae Seon; Huh, Chang Wook; Kim, Do Hyoung; Kim, Ju Youl; Kim, Yoon Ik; Yang, Hui Chang; Park, Kang Min [Seoul National Univ., Seoul (Korea, Republic of)

    1998-03-15

    The objective of this study is the development of a methodology for assessing the optimization of the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and thereby improve the safety of Korean nuclear power plants. In this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and the application of the new methodology to an example system confirms its feasibility. Sensitivity analyses of the component failure factors are performed on the basis of the assessment methodologies, and the interaction modeling of the STI and AOT is quantified. The reliability assessment methodology for the diesel generator is reviewed and applied in the PSA code. A qualitative assessment of the STI/AOT of the RPS/ESFAS, the most important safety systems in the nuclear power plant, is also performed.

  3. “I didn't know her, but…”: parasocial mourning of mediated deaths on Facebook RIP pages

    DEFF Research Database (Denmark)

    Klastrup, Lisbeth

    2015-01-01

    This article examines the use of six Danish “Rest in Peace” (RIP) memorial pages. The article focuses on the relation between news media and RIP page use, in relation to general communicative practices on these pages. Based on an analysis of press coverage of the deaths of six young people and a close analysis of 1,015 comments extracted from the RIP pages created to memorialize them, it is shown that their deaths attracted considerable media attention, as did the RIP pages themselves. Comment activity seems to reflect the news stories in the way the commenters refer to the context of death and the emotional distress they experience, but mainly the comments on the RIP pages are conventional expressions of sympathy and “RIP” wishes. The article concludes that public RIP pages might be understood as virtual spontaneous shrines, affording an emerging practice of “RIP-ing.”

  4. Environmental sample banking-research and methodology

    International Nuclear Information System (INIS)

    Becker, D.A.

    1976-01-01

    The National Bureau of Standards (NBS), in cooperation with the Environmental Protection Agency and the National Science Foundation, is engaged in a research program establishing methodology for environmental sample banking. This program is aimed at evaluating the feasibility of a National Environmental Specimen Bank (NESB). The capability for retrospective chemical analyses to evaluate changes in our environment would provide useful information. Much of this information could not be obtained using data from previously analyzed samples. However, to assure validity for these stored samples, they must be sampled, processed and stored under rigorously evaluated, controlled and documented conditions. The program currently under way in the NBS Analytical Chemistry Division has three main components. The first is an extensive survey of the available literature concerning problems of contamination, losses and storage. The components of interest include trace elements, pesticides, other trace organics (PCBs, plasticizers, etc.), radionuclides and microbiological species. The second component is an experimental evaluation of contamination and losses during sampling and sample handling. Of particular interest here is research into container cleaning methodology for trace elements, with respect to adsorption, desorption, leaching and partial dissolution by various sample matrices. The third component of this program is an evaluation of existing methodology for long-term sample storage

  5. Document representations for classification of short web-page descriptions

    Directory of Open Access Journals (Sweden)

    Radovanović Miloš

    2008-01-01

    Full Text Available Motivated by applying Text Categorization to classification of Web search results, this paper describes an extensive experimental study of the impact of bag-of-words document representations on the performance of five major classifiers - Naïve Bayes, SVM, Voted Perceptron, kNN and C4.5. The texts, representing short Web-page descriptions sorted into a large hierarchy of topics, are taken from the dmoz Open Directory Web-page ontology, and classifiers are trained to automatically determine the topics which may be relevant to a previously unseen Web-page. Different transformations of input data: stemming, normalization, logtf and idf, together with dimensionality reduction, are found to have a statistically significant improving or degrading effect on classification performance measured by classical metrics - accuracy, precision, recall, F1 and F2. The emphasis of the study is not on determining the best document representation which corresponds to each classifier, but rather on describing the effects of every individual transformation on classification, together with their mutual relationships.
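
    A minimal sketch of the bag-of-words pipeline discussed above (raw counts versus log-tf with idf weighting, scored with Naïve Bayes), using scikit-learn on a tiny placeholder corpus; the corpus and parameter choices are illustrative assumptions, not the paper's experimental setup.

        # Sketch: compare a raw-count bag-of-words representation with a log-tf/idf
        # representation for Naive Bayes classification of short page descriptions.
        from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import MultinomialNB

        docs = [
            "open source database server for web applications",
            "relational database tuning and indexing tips",
            "hiking trails and mountain camping guide",
            "camping gear reviews for mountain trips",
            "javascript framework for single page applications",
            "guide to alpine hiking routes and weather",
        ]
        labels = ["computers", "computers", "outdoors", "outdoors", "computers", "outdoors"]

        for name, vectorizer in [
            ("raw counts", CountVectorizer()),
            ("log-tf + idf", TfidfVectorizer(sublinear_tf=True)),  # sublinear_tf applies 1 + log(tf)
        ]:
            X = vectorizer.fit_transform(docs)
            score = cross_val_score(MultinomialNB(), X, labels, cv=3).mean()
            print(f"{name}: mean cross-validated accuracy {score:.2f}")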

  6. ARL Physics Web Pages: An Evaluation by Established, Transitional and Emerging Benchmarks.

    Science.gov (United States)

    Duffy, Jane C.

    2002-01-01

    Provides an overview of characteristics among Association of Research Libraries (ARL) physics Web pages. Examines current academic Web literature and from that develops six benchmarks to measure physics Web pages: ease of navigation; logic of presentation; representation of all forms of information; engagement of the discipline; interactivity of…

  7. Use of cesium-137 methodology in the evaluation of superficial erosive processes

    International Nuclear Information System (INIS)

    Andrello, Avacir Casanova; Appoloni, Carlos Roberto; Guimaraes, Maria de Fatima; Nascimento Filho, Virgilio Franco do

    2003-01-01

    Superficial erosion is one of the main soil degradation agents, and erosion rate estimation for different edaphoclimatic conditions with conventional models, such as USLE and RUSLE, is expensive and time-consuming. The use of the anthropogenic radionuclide cesium-137 is a newer methodology that has been much studied, and its application in the evaluation of soil erosion has grown in countries such as the USA, UK, Australia and others. A brief account of this methodology is presented, as well as the development of the equations used to quantify erosion rates from cesium-137 measurements. Two watersheds studied in Brazil showed that the cesium-137 methodology is practicable and consistent with field surveys for applications in erosion studies. (author)

  8. An ant colony optimization based feature selection for web page classification.

    Science.gov (United States)

    Saraç, Esra; Özel, Selma Ayşe

    2014-01-01

    The increased popularity of the web has caused the inclusion of huge amount of information to the web, and as a result of this explosive information growth, automated web page classification systems are needed to improve search engines' performance. Web pages have a large number of features such as HTML/XML tags, URLs, hyperlinks, and text contents that should be considered during an automated classification process. The aim of this study is to reduce the number of features to be used to improve runtime and accuracy of the classification of web pages. In this study, we used an ant colony optimization (ACO) algorithm to select the best features, and then we applied the well-known C4.5, naive Bayes, and k nearest neighbor classifiers to assign class labels to web pages. We used the WebKB and Conference datasets in our experiments, and we showed that using the ACO for feature selection improves both accuracy and runtime performance of classification. We also showed that the proposed ACO based algorithm can select better features with respect to the well-known information gain and chi square feature selection methods.
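
    A minimal sketch of pheromone-guided feature selection in the spirit of the ACO approach described above; the subset size, the pheromone update rule and the Naive Bayes scorer are simplifying assumptions rather than the authors' algorithm.

        # Sketch: ants sample feature subsets with probability proportional to pheromone,
        # score each subset with a classifier, and reinforce features from good subsets.
        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import MultinomialNB

        def aco_feature_selection(X, y, n_select=20, n_ants=8, n_iter=15, evaporation=0.1, seed=0):
            rng = np.random.default_rng(seed)
            pheromone = np.ones(X.shape[1])
            best_subset, best_score = None, -np.inf
            for _ in range(n_iter):
                for _ in range(n_ants):
                    probs = pheromone / pheromone.sum()
                    subset = rng.choice(X.shape[1], size=n_select, replace=False, p=probs)
                    score = cross_val_score(MultinomialNB(), X[:, subset], y, cv=3).mean()
                    if score > best_score:
                        best_subset, best_score = subset, score
                    pheromone[subset] += score          # deposit pheromone on the features used
                pheromone *= (1.0 - evaporation)        # evaporation keeps exploration alive
            return np.sort(best_subset), best_score

        rng = np.random.default_rng(42)
        X = rng.poisson(1.0, size=(200, 300))           # stand-in for term-count features
        informative = rng.choice(300, size=10, replace=False)
        y = (X[:, informative].sum(axis=1) > X[:, informative].sum(axis=1).mean()).astype(int)

        subset, score = aco_feature_selection(X, y)
        print("selected features:", subset, "| cross-validated accuracy:", round(score, 2))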

  9. Improving Standard Poststratification Techniques For Random-Digit-Dialing Telephone Surveys

    Directory of Open Access Journals (Sweden)

    Michael P. Battaglia

    2008-03-01

    Full Text Available Random-digit-dialing surveys in the United States such as the Behavioral Risk Factor Surveillance System (BRFSS) typically poststratify on age, gender and race/ethnicity using control totals from an appropriate source such as the 2000 Census, the Current Population Survey, or the American Community Survey. Using logistic regression and interaction detection software we identified key "main effect" socio-demographic variables and important two-factor interactions associated with several health risk factor outcomes measured in the BRFSS, one of the largest annual RDD surveys in the United States. A procedure was developed to construct control totals, which were consistent with estimates of age, gender, and race/ethnicity obtained from a commercial source and distributions of other demographic variables from the Current Population Survey. Raking was used to incorporate main effects and two-factor interaction margins into the weighting of the BRFSS survey data. The resulting risk factor estimates were then compared with those based on the current BRFSS weighting methodology and mean squared error estimates were developed. The research demonstrates that by identifying socio-demographic variables associated with key outcome variables and including these variables in the weighting methodology, nonresponse bias can be substantially reduced.
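
    A minimal sketch of raking (iterative proportional fitting) to match weighted margins to control totals, as in the weighting approach above; the two margins, the toy sample and the hypothetical population totals are illustrative assumptions, not the BRFSS production procedure.

        # Sketch: rake survey weights so that weighted margins match external control totals
        # for two variables (an age-group margin and a gender margin).
        import numpy as np

        def rake(weights, categories, control_totals, n_iter=50, tol=1e-8):
            """categories: one array per margin giving each respondent's category;
            control_totals: one dict per margin mapping category -> population total."""
            w = weights.astype(float).copy()
            for _ in range(n_iter):
                max_change = 0.0
                for cats, totals in zip(categories, control_totals):
                    for cat, target in totals.items():
                        mask = cats == cat
                        current = w[mask].sum()
                        if current > 0:
                            factor = target / current
                            w[mask] *= factor
                            max_change = max(max_change, abs(factor - 1.0))
                if max_change < tol:
                    break
            return w

        rng = np.random.default_rng(0)
        n = 1000
        age = rng.choice(["18-34", "35-54", "55+"], size=n, p=[0.5, 0.3, 0.2])  # sample skewed young
        sex = rng.choice(["F", "M"], size=n, p=[0.6, 0.4])

        controls = [
            {"18-34": 300_000, "35-54": 350_000, "55+": 350_000},  # hypothetical population totals
            {"F": 510_000, "M": 490_000},
        ]
        w = rake(np.ones(n), [age, sex], controls)
        print("weighted age margin:", {k: round(w[age == k].sum()) for k in controls[0]})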

  10. Measuring medicine prices in Peru: validation of key aspects of WHO/HAI survey methodology.

    Science.gov (United States)

    Madden, Jeanne M; Meza, Edson; Ewen, Margaret; Laing, Richard O; Stephens, Peter; Ross-Degnan, Dennis

    2010-04-01

    To assess the possibility of bias due to the limited target list and geographic sampling of the World Health Organization (WHO)/Health Action International (HAI) Medicine Prices and Availability survey used in more than 70 rapid sample surveys since 2001. A survey was conducted in Peru in 2005 using an expanded sample of medicine outlets, including remote areas. Comprehensive data were gathered on medicines in three therapeutic classes to assess the adequacy of WHO/HAI's target medicines list and the focus on only two product versions. WHO/HAI median retail prices were compared with average wholesale prices from global pharmaceutical sales data supplier IMS Health. No significant differences were found in overall availability or prices of target list medicines by retail location. The comprehensive survey of angiotensin-converting enzyme inhibitor, anti-diabetic, and anti-ulcer products revealed that some treatments not on the target list were costlier for patients and more likely to be unavailable, particularly in remote areas. WHO/HAI retail prices and IMS wholesale prices were strongly correlated for higher priced products, and weakly correlated for lower priced products (which had higher estimated retailer markups). The WHO/HAI survey approach strikes an appropriate balance between modest research costs and optimal information for policy. Focusing on commonly used medicines yields sufficient and valid results. Surveyors elsewhere should consider the limits of the survey data as well as any local circumstances, such as scarcity, that may call for extra field efforts.

  11. Developing a web page: bringing clinics online.

    Science.gov (United States)

    Peterson, Ronnie; Berns, Susan

    2004-01-01

    Introducing clinical staff education, along with new policies and procedures, to over 50 different clinical sites can be a challenge. As any staff educator will confess, getting people to attend an educational inservice session can be difficult. Clinical staff request training, but no one has time to attend training sessions. Putting the training along with the policies and other information into "neat" concise packages via the computer and over the company's intranet was the way to go. However, how do you bring the clinics online when some of the clinical staff may still be reluctant to turn on their computers for anything other than to gather laboratory results? Developing an easy, fun, and accessible Web page was the answer. This article outlines the development of the first training Web page at the University of Wisconsin Medical Foundation, Madison, WI.

  12. Future Trends in Children's Web Pages: Probing Hidden Biases for Information Quality

    Science.gov (United States)

    Kurubacak, Gulsun

    2007-01-01

    As global digital communication continues to flourish, Children's Web pages become more critical for children to realize not only the surface but also breadth and deeper meanings in presenting these milieus. These pages not only are very diverse and complex but also enable intense communication across social, cultural and political restrictions…

  13. Environment: General; Grammar & Usage; Money Management; Music History; Web Page Creation & Design.

    Science.gov (United States)

    Web Feet, 2001

    2001-01-01

    Describes Web site resources for elementary and secondary education in the topics of: environment, grammar, money management, music history, and Web page creation and design. Each entry includes an illustration of a sample page on the site and an indication of the grade levels for which it is appropriate. (AEF)

  14. WEB LOG EXPLORER – CONTROL OF MULTIDIMENSIONAL DYNAMICS OF WEB PAGES

    Directory of Open Access Journals (Sweden)

    Mislav Šimunić

    2012-07-01

    Full Text Available Demand markets dictate and pose increasingly more requirements to the supply market that are not easily satisfied. The supply market presenting its web pages to the demand market should find the best and quickest ways to respond promptly to the changes dictated by the demand market. The question is how to do that in the most efficient and quickest way. The data on the usage of web pages on a specific web site are recorded in a log file. The data in a log file are stochastic and unordered and require systematic monitoring, categorization, analysis, and weighting. From the data processed in this way, it is necessary to single out and sort the data by their importance as a basis for a continuous generation of dynamics/changes to the web site pages in line with the chosen criterion. To perform those tasks successfully, a new software solution is required. For that purpose, the authors have developed the first version of the WLE (WebLogExplorer) software solution, which is in effect a realization of web page multidimensionality and of the web site as a whole. The WebLogExplorer enables statistical and semantic analysis of a log file and, on that basis, multidimensional control of the web page dynamics. The experimental part of the work was done within the web site of HTZ (Croatian National Tourist Board), the main portal of the global tourist supply in the Republic of Croatia (on average, the daily log consists of c. 600,000 sets, the average log file size is 127 MB, and the site receives c. 7,000-8,000 daily visitors).
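
    To give a flavour of the log-file processing the abstract describes, the sketch below parses an NCSA/Apache combined-format access log and tallies successful page requests per URL. It is a simplification for illustration only; the regular expression, the status filtering and the file name in the example are assumptions, not the WLE implementation.

```python
import re
from collections import Counter

# host ident user [time] "METHOD path HTTP/x.y" status bytes "referer" "agent"
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def page_hit_counts(log_path):
    """Count successful (2xx) requests per URL path in an access log."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_RE.match(line)
            if m and m.group("status").startswith("2"):
                counts[m.group("path")] += 1
    return counts

# Example (hypothetical file name): ten most requested pages
# for path, hits in page_hit_counts("access.log").most_common(10):
#     print(hits, path)
```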

  15. Stochastic analysis of web page ranking

    NARCIS (Netherlands)

    Volkovich, Y.

    2009-01-01

    Today, the study of the World Wide Web is one of the most challenging subjects. In this work we consider the Web from a probabilistic point of view. We analyze the relations between various characteristics of the Web. In particular, we are interested in the Web properties that affect the Web page

  16. 16 CFR 436.3 - Cover page.

    Science.gov (United States)

    2010-01-01

    ...) Buying a franchise is a complex investment. The information in this disclosure document can help you make up your mind. More information on franchising, such as “A Consumer's Guide to Buying a Franchise... with a cover page, in the order and form as follows: (a) The title “FRANCHISE DISCLOSURE DOCUMENT” in...

  17. Key-phrase based classification of public health web pages.

    Science.gov (United States)

    Dolamic, Ljiljana; Boyer, Célia

    2013-01-01

    This paper describes and evaluates a public health web page classification model based on key-phrase extraction and matching. Easily extendible both to new classes and to new languages, this method proves to be a good solution for text classification faced with a total lack of training data. To evaluate the proposed solution we used a small collection of public health related web pages created by a double-blind manual classification. Our experiments show that by choosing an adequate threshold value the desired value for either precision or recall can be achieved.
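
    A minimal sketch of this kind of key-phrase matching classifier is shown below: each class is defined by a hand-curated phrase list and a page receives every label whose match score reaches a threshold. The phrase lists, scoring rule and threshold are illustrative assumptions, not the authors' model.

```python
def classify_by_key_phrases(text, class_phrases, threshold=2):
    """Return all class labels whose key phrases occur at least `threshold` times in `text`."""
    text_lower = text.lower()
    labels = []
    for label, phrases in class_phrases.items():
        score = sum(text_lower.count(p.lower()) for p in phrases)
        if score >= threshold:
            labels.append(label)
    return labels


# Hypothetical phrase lists for two public health classes
classes = {
    "tobacco": ["smoking", "tobacco", "nicotine"],
    "nutrition": ["diet", "obesity", "healthy eating"],
}
print(classify_by_key_phrases("Smoking cessation reduces nicotine dependence.", classes))
```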

  18. Parental compliance - an emerging problem in Liverpool community child health surveys 1991-2006

    Directory of Open Access Journals (Sweden)

    Koshy Gibby

    2012-04-01

    Full Text Available Abstract Background Compliance is a critical issue for parental questionnaires in school based epidemiological surveys and high compliance is difficult to achieve. The objective of this study was to determine trends and factors associated with parental questionnaire compliance during respiratory health surveys of school children in Merseyside between 1991 and 2006. Methods Four cross-sectional respiratory health surveys employing a core questionnaire and methodology were conducted in 1991, 1993, 1998 and 2006 among 5-11 year old children in the same 10 schools in Bootle and 5 schools in Wallasey, Merseyside. Parental compliance fell sequentially in consecutive surveys. This analysis aimed to determine the association of questionnaire compliance with variation in response rates to specific questions across surveys, and the demographic profiles for parents of children attending participant schools. Results Parental questionnaire compliance was 92% (1872/2035 in 1991, 87.4% (3746/4288 in 1993, 78.1% (1964/2514 in 1998 and 30.3% (1074/3540 in 2006. The trend to lower compliance in later surveys was consistent across all surveyed schools. Townsend score estimations of socio-economic status did not differ between schools with high or low questionnaire compliance and were comparable across the four surveys with only small differences between responders and non-responders to specific core questions. Respiratory symptom questions were mostly well answered with fewer than 15% of non-responders across all surveys. There were significant differences between mean child age, maternal and paternal smoking prevalence, and maternal employment between the four surveys (all p Conclusion Methodological differences or changes in socio-economic status of respondents between surveys were unlikely to explain compliance differences. Changes in maternal employment patterns may have been contributory. This analysis demonstrates a major shift in community parental

  19. 2015 Survey of Non-Starch Ethanol and Renewable Hydrocarbon Biofuels Producers

    Energy Technology Data Exchange (ETDEWEB)

    Schwab, Amy [National Renewable Energy Lab. (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lewis, John [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-22

    In order to understand the anticipated status of the industry for non-starch ethanol and renewable hydrocarbon biofuels as of the end of calendar year 2015, the National Renewable Energy Laboratory (NREL) conducted its first annual survey update of U.S. non-starch ethanol and renewable hydrocarbon biofuels producers. This report presents the results of this survey, describes the survey methodology, and documents important changes since the 2013 survey.

  20. EFL Teaching Methodological Practices in Cali

    Directory of Open Access Journals (Sweden)

    Chaves Orlando

    2013-04-01

    Full Text Available In this article we aim at showing partial results of a study about the profiles of English as a Foreign Language (EFL) teachers in both public and private primary and secondary strata 1-4 schools in Cali, Colombia. Teachers’ methodological approaches and practices are described and analyzed from a sample of 220 teachers. Information was gathered from surveys, interviews and institutional documents. The quantitative information was processed with the Statistical Package for the Social Sciences and Excel, while the qualitative information (from a survey and focal interviews) was analyzed hermeneutically. An analysis grid was used for the examination of institutional documents (area planning, syllabi, and didactic materials). Teachers’ methodology (approaches/methods, lessons, activities, objectives, curricula, syllabi and evaluation) is analyzed in the light of literature in the field. Finally, we discuss the implications of methodological approaches. En este artículo se presentan los resultados parciales de una investigación sobre los perfiles de los profesores de inglés como lengua extranjera que enseñan en colegios de educación básica primaria y secundaria, públicos y privados, de estratos 1 a 4 en Cali, Colombia. Se describen y analizan sus enfoques y prácticas metodológicas a partir de una muestra de 220 docentes. Se obtuvo información cualitativa y cuantitativa por medio de encuestas, entrevistas y documentos institucionales. La información cuantitativa se procesó con el software Statistical Package for Social Sciences y Excel, mientras que la información cualitativa se analizó hermenéuticamente. Se usó una rejilla de análisis para el examen de los documentos institucionales (planes de área, programas, y materiales didácticos). La metodología (enfoques/métodos, clases, actividades, objetivos, currículo, programas y evaluación) se analizan a partir de la literatura especializada en el campo. Finalmente, se

  1. The Extrapolation-Accelerated Multilevel Aggregation Method in PageRank Computation

    Directory of Open Access Journals (Sweden)

    Bing-Yuan Pu

    2013-01-01

    Full Text Available An accelerated multilevel aggregation method is presented for calculating the stationary probability vector of an irreducible stochastic matrix in PageRank computation, where the vector extrapolation method is its accelerator. We show how to periodically combine the extrapolation method together with the multilevel aggregation method on the finest level for speeding up the PageRank computation. Detailed numerical results are given to illustrate the behavior of this method, and comparisons with the typical methods are also made.
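
    The record combines a vector extrapolation accelerator with multilevel aggregation; the multilevel part is beyond a short sketch, but the basic pattern of periodically applying an extrapolation step during PageRank power iteration can be illustrated as follows, here with componentwise Aitken delta-squared extrapolation on a small dense matrix (an illustrative stand-in, not the paper's method).

```python
import numpy as np

def pagerank_extrapolated(A, alpha=0.85, tol=1e-10, extrap_every=10, max_iter=10000):
    """Power iteration for PageRank, periodically accelerated by Aitken extrapolation.

    A : column-stochastic link matrix as a dense NumPy array (illustration only).
    """
    n = A.shape[0]
    v = np.full(n, 1.0 / n)                  # teleportation distribution
    x = v.copy()
    prev = None                              # iterate preceding x
    for k in range(1, max_iter + 1):
        x_new = alpha * (A @ x) + (1.0 - alpha) * v
        if prev is not None and k % extrap_every == 0:
            # Aitken delta-squared on three consecutive iterates (prev, x, x_new)
            denom = x_new - 2.0 * x + prev
            safe = np.abs(denom) > 1e-15
            x_acc = x_new.copy()
            x_acc[safe] -= (x_new[safe] - x[safe]) ** 2 / denom[safe]
            x_acc = np.clip(x_acc, 0.0, None)
            x_new = x_acc / x_acc.sum()      # renormalize to a probability vector
        if np.abs(x_new - x).sum() < tol:
            return x_new
        prev, x = x, x_new
    return x
```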

  2. Search Results | Page 788 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 7871 - 7880 of 8490 ... ... changing economic landscape. Research in Action. Private sector development Trade and investment. Changing the rules for businesses. Research in Action. Economic and social development POVERTY ALLEVIATION Poverty Gender. Managing opium: Policy choices for Afghanistan. Pages.

  3. The methodology of population surveys of headache prevalence, burden and cost: Principles and recommendations from the Global Campaign against Headache

    Science.gov (United States)

    2014-01-01

    The global burden of headache is very large, but knowledge of it is far from complete and needs still to be gathered. Published population-based studies have used variable methodology, which has influenced findings and made comparisons difficult. Among the initiatives of the Global Campaign against Headache to improve and standardize methods in use for cross-sectional studies, the most important is the production of consensus-based methodological guidelines. This report describes the development of detailed principles and recommendations. For this purpose we brought together an expert consensus group to include experience and competence in headache epidemiology and/or epidemiology in general and drawn from all six WHO world regions. The recommendations presented are for anyone, of whatever background, with interests in designing, performing, understanding or assessing studies that measure or describe the burden of headache in populations. While aimed principally at researchers whose main interests are in the field of headache, they should also be useful, at least in parts, to those who are expert in public health or epidemiology and wish to extend their interest into the field of headache disorders. Most of all, these recommendations seek to encourage collaborations between specialists in headache disorders and epidemiologists. The focus is on migraine, tension-type headache and medication-overuse headache, but they are not intended to be exclusive to these. The burdens arising from secondary headaches are, in the majority of cases, more correctly attributed to the underlying disorders. Nevertheless, the principles outlined here are relevant for epidemiological studies on secondary headaches, provided that adequate definitions can be not only given but also applied in questionnaires or other survey instruments. PMID:24467862

  4. To give or not to give, that's the question: How methodology is destiny in Dutch giving data

    NARCIS (Netherlands)

    Bekkers, R.H.F.P.; Wiepking, P.

    2006-01-01

    In research on giving, methodology is destiny. The volume of donations estimated from sample surveys strongly depends on the length of the questionnaire used to measure giving. By comparing two giving surveys from the Netherlands, the authors show that a short questionnaire on giving not only

  5. Perceived learning effectiveness of a course Facebook page: teacher-led versus student-led approach

    Directory of Open Access Journals (Sweden)

    Tugba Orten Tugrul

    2017-01-01

    Full Text Available This research aims to compare the perceived effectiveness of teacher -led and student-led content management approaches embraced in a course Facebook page designed to enhance traditional classroom learning. Eighty-five undergraduate marketing course students voluntarily completed a questionnaire composed of two parts; a depiction of a course Facebook page where both teacher and students can share instructional contents, and questions about perceived learning effectiveness. The findings indicate that students have more favorable evaluations of a student-led approach in sharing instructional contents on a course Facebook Page than a teacher-led approach. Additionally, it is shown that instructional contents posted by both teacher and students enhance the overall learning effectiveness of a course Facebook page incorporated into a traditional classroom teaching.

  6. Collective frequency variation in network synchronization and reverse PageRank.

    Science.gov (United States)

    Skardal, Per Sebastian; Taylor, Dane; Sun, Jie; Arenas, Alex

    2016-04-01

    A wide range of natural and engineered phenomena rely on large networks of interacting units to reach a dynamical consensus state where the system collectively operates. Here we study the dynamics of self-organizing systems and show that for generic directed networks the collective frequency of the ensemble is not the same as the mean of the individuals' natural frequencies. Specifically, we show that the collective frequency equals a weighted average of the natural frequencies, where the weights are given by an outflow centrality measure that is equivalent to a reverse PageRank centrality. Our findings uncover an intricate dependence of the collective frequency on both the structural directedness and dynamical heterogeneity of the network, and also reveal an unexplored connection between synchronization and PageRank, which opens the possibility of applying PageRank optimization to synchronization. Finally, we demonstrate the presence of collective frequency variation in real-world networks by considering the UK and Scandinavian power grids.
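
    A rough illustration of the stated result is sketched below: the collective frequency is approximated as a weighted mean of the natural frequencies, with the weights taken from PageRank computed on the reversed graph. Using networkx's damped PageRank as a stand-in for the paper's outflow centrality is an assumption made only to keep the example short.

```python
import networkx as nx
import numpy as np

def collective_frequency(G, omega, alpha=0.85):
    """Weighted mean of natural frequencies, weights = PageRank of the reversed graph.

    G     : networkx.DiGraph of oscillator couplings
    omega : dict mapping node -> natural frequency
    """
    ranks = nx.pagerank(G.reverse(copy=True), alpha=alpha)
    nodes = list(G.nodes())
    w = np.array([ranks[u] for u in nodes])
    f = np.array([omega[u] for u in nodes])
    return float(np.sum(w * f) / np.sum(w))


# Toy directed network with hypothetical natural frequencies
G = nx.DiGraph([(0, 1), (1, 2), (2, 0), (0, 2)])
print(collective_frequency(G, omega={0: 1.0, 1: -0.5, 2: 0.2}))
```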

  8. Ultra-processed food product brands on Facebook pages: highly accessed by Brazilians through their marketing techniques.

    Science.gov (United States)

    Horta, Paula M; Rodrigues, Fernanda T; Dos Santos, Luana C

    2018-06-01

    To analyse the content and extent of marketing of ultra-processed food products (UPP) and their brand pages on Facebook, which are highly accessed by Brazilians. Descriptive. Sixteen UPP brand pages on Facebook were selected from 250 pages that were the most liked by Brazilians in October 2015. We analysed the frequency of 'likes' and members 'talking about' each one of the pages, in addition to fifteen marketing techniques used in the previous year (September 2014 to October 2015). The number of posts, likes, 'shares' and 'commentaries', and the mean number of likes, shares and commentaries per post, were collected for one month, from 23 September to 23 October 2015. The two most liked pages were: Coke® (93 673 979 likes) and McDonald's® (59 749 819 likes). Regarding the number of people talking about the pages, McDonald's led with 555 891 commentaries, followed by Coke (287 274), Burger King® (246 148) and Kibon® (244 523). All pages used marketing techniques, which included photos, user conversations, presence of brand elements and links. Videos were observed on 93·8 % of the pages; promotions on 68·8 %; and celebrities on 62·5 %. In one month, Garoto®, Outback® and Coke were brands that published more than one post per day. Kibon achieved the highest ratio of likes per post (285 845·50) and Burger King had the highest mean shares per post (10 083·93), including commentaries per post (7958·13). UPP marketing is extensively used on Facebook pages and is highly accessed by Brazilians, with UPP companies employing a diversity of marketing strategies.

  9. Use of IMMPACT domains in clinical trials of acupuncture for chronic pain: a protocol for a methodological survey.

    Science.gov (United States)

    Mazzei, Lauren Giustti; Bergamaschi, Cristiane de Cássia; Silva, Marcus Tolentino; Lopes, Luciane Cruz

    2017-09-27

    Pain is one of the most common and most debilitating complaints among patients. It affects the individual, their relationship with friends and family, their ability to function at work, and their sociability. Acupuncture is one of the therapeutic resources for managing chronic pain. Given the variability of outcome measures in controlled randomised clinical trials on non-oncologic chronic pain (CRCT-NOCP), the Initiative on Methods, Measurement, and Pain Assessment in Clinical Trials (IMMPACT) recommends six domains to be covered in evaluating the effectiveness of treatments for chronic pain. The objective is to check whether outcome reporting in published trials has followed IMMPACT recommendations in measuring CRCT-NOCP outcomes when acupuncture was used as a treatment. This is a methodological study. We will systematically search for eligible studies in specific databases with a defined strategy, using the MeSH terms 'acupuncture', 'chronic pain' and similar terms, without language restrictions. Eligible studies will be those which are randomised, chose NOCP patients to be treated with acupuncture or a control (sham acupuncture or no acupuncture), recruited after September 2004, and enrolled ≥100 patients. The measured outcomes will be the presence of outcome domains recommended by IMMPACT, domains reported by the patient or clinician, tools used to measure such domains, and other features of the studies. We shall conduct a regression analysis to explore factors that may be associated with the presence of outcome domains according to IMMPACT recommendations. This survey will be submitted for presentation at congresses and for publication in a scientific journal. The findings will allow us to measure the quality of the evidence and provide greater transparency in decisions regarding the use of acupuncture as a viable alternative for managing chronic pain.

  10. PageRank for low frequency earthquake detection

    Science.gov (United States)

    Aguiar, A. C.; Beroza, G. C.

    2013-12-01

    We have analyzed Hi-Net seismic waveform data during the April 2006 tremor episode in the Nankai Trough in SW Japan using the autocorrelation approach of Brown et al. (2008), which detects low frequency earthquakes (LFEs) based on pair-wise waveform matching. We have generalized this to exploit the fact that waveforms may repeat multiple times, on more than just a pair-wise basis. We are working towards developing a sound statistical basis for event detection, but that is complicated by two factors. First, the statistical behavior of the autocorrelations varies between stations. Analyzing one station at a time assures that the detection threshold will only depend on the station being analyzed. Second, the positive detections do not satisfy "closure." That is, if window A correlates with window B, and window B correlates with window C, then window A and window C do not necessarily correlate with one another. We want to evaluate whether or not a linked set of windows are correlated due to chance. To do this, we map our problem on to one that has previously been solved for web search, and apply Google's PageRank algorithm. PageRank is the probability of a 'random surfer' to visit a particular web page; it assigns a ranking for a webpage based on the amount of links associated with that page. For windows of seismic data instead of webpages, the windows with high probabilities suggest likely LFE signals. Once identified, we stack the matched windows to improve the snr and use these stacks as template signals to find other LFEs within continuous data. We compare the results among stations and declare a detection if they are found in a statistically significant number of stations, based on multinomial statistics. We compare our detections using the single-station method to detections found by Shelly et al. (2007) for the April 2006 tremor sequence in Shikoku, Japan. We find strong similarity between the results, as well as many new detections that were not found using
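
    In outline, the approach treats waveform windows as nodes and above-threshold correlations as links, then ranks windows by PageRank to flag likely LFE signals. The sketch below shows that pipeline on a precomputed single-station correlation matrix; the threshold value and the use of networkx are illustrative choices, not the authors' implementation.

```python
import networkx as nx

def rank_windows_by_pagerank(corr, threshold=0.7, alpha=0.85, top_k=20):
    """Rank waveform windows by PageRank over their correlation graph.

    corr : (n_windows, n_windows) NumPy array of pairwise correlation
           coefficients between candidate windows at one station.
    Returns indices of the top_k windows, most "central" first.
    """
    n = corr.shape[0]
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if corr[i, j] >= threshold:
                G.add_edge(i, j)     # windows that match each other are linked
    pr = nx.pagerank(G, alpha=alpha)
    return sorted(pr, key=pr.get, reverse=True)[:top_k]
```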

  11. Building Interactive Simulations in Web Pages without Programming.

    Science.gov (United States)

    Mailen Kootsey, J; McAuley, Grant; Bernal, Julie

    2005-01-01

    A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications are possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.

  12. Using Quality Tools and Methodologies to Improve a Hospital's Quality Position.

    Science.gov (United States)

    Branco, Daniel; Wicks, Angela M; Visich, John K

    2017-01-01

    The authors identify the quality tools and methodologies most frequently used by quality-positioned hospitals versus nonquality hospitals. Northeastern U.S. hospitals in both groups received a brief, 12-question survey. The authors found that 93.75% of the quality hospitals and 81.25% of the nonquality hospitals used some form of process improvement methodologies. However, there were significant differences between the groups regarding the impact of quality improvement initiatives on patients. The findings indicate that in quality hospitals the use of quality improvement initiatives had a significantly greater positive impact on patient satisfaction and patient outcomes when compared to nonquality hospitals.

  13. Quantum control theory and applications: A survey

    OpenAIRE

    Dong, Daoyi; Petersen, Ian R

    2009-01-01

    This paper presents a survey on quantum control theory and applications from a control systems perspective. Some of the basic concepts and main developments (including open-loop control and closed-loop control) in quantum control theory are reviewed. In the area of open-loop quantum control, the paper surveys the notion of controllability for quantum systems and presents several control design strategies including optimal control, Lyapunov-based methodologies, variable structure control and q...

  14. Search | Page 4 | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Search | Page 7 | IDRC - International Development Research Centre. Hammou Lammrani has been working for IDRC in the Middle East and North Africa since 2007. Specialising in agriculture, water, and knowledge management, .

  15. SPRINT RA 230: Methodology for knowledge based developments

    International Nuclear Information System (INIS)

    Wallsgrove, R.; Munro, F.

    1991-01-01

    SPRINT RA 230: A Methodology for Knowledge Based Developments, funded by the European Commission, was set up to investigate the use of KBS in the engineering industry. Its aim was to find out how KBS were currently used and what people's conceptions of them were, to disseminate current knowledge and to recommend further research into this area. A survey (by post and face-to-face interviews) was carried out under SPRINT RA 230 to investigate requirements for more intelligent software. In the survey we looked both at how people think about Knowledge Based Systems (KBS), what they find useful and what is not useful, and at what current expertise problems or limitations of conventional software might suggest KBS solutions. (orig./DG)

  16. From the Users' Perspective-The UCSD Libraries User Survey Project.

    Science.gov (United States)

    Talbot, Dawn E.; Lowell, Gerald R.; Martin, Kerry

    1998-01-01

    Discussion of a user-driven survey conducted at the University of California, San Diego libraries focuses on the methodology that resulted in a high response rate. Highlights goals for the survey, including acceptance of data by groups outside the library and for benchmarking data; planning; user population; and questionnaire development. (LRW)

  17. Children as respondents in survey research: Cognitive development and response quality

    NARCIS (Netherlands)

    Borgers, N.; Leeuw, E.D. de; Hox, J.J.

    2000-01-01

    Although children are no longer a neglected minority in official statistics and surveys, methodological knowledge on how to survey children is still scarce. Researchers have to rely mainly on ad-hoc knowledge from such diverse fields as child psychiatry and educational testing, or extrapolate

  18. Generic radiological characterization protocol for surveys conducted for DOE remedial action programs

    International Nuclear Information System (INIS)

    Berven, B.A.; Cottrell, W.D.; Leggett, R.W.; Little, C.A.; Myrick, T.E.; Goldsmith, W.A.; Haywood, F.F.

    1986-05-01

    This report describes goals and methodology that can be used by radiological survey contractors in surveys at properties associated with the Department of Energy's remedial action programs. The description includes: (1) a general discussion of the history of the remedial action programs; (2) the types of surveys that may be employed by the Radiological Survey Activities (RASA) contractor; (3) generic survey methods that may be used during radiological surveys; and (4) a format for presenting information and data in a survey report. 9 refs

  19. Readability and quality of wikipedia pages on neurosurgical topics.

    Science.gov (United States)

    Modiri, Omeed; Guha, Daipayan; Alotaibi, Naif M; Ibrahim, George M; Lipsman, Nir; Fallah, Aria

    2018-03-01

    Wikipedia is the largest online encyclopedia, with over 40 million articles and generating 500 million visits per month. The aim of this study is to assess the readability and quality of Wikipedia pages on neurosurgery-related topics. We selected the neurosurgery-related Wikipedia pages based on the series of online patient information articles published by the American Association of Neurological Surgeons (AANS). We assessed the readability of Wikipedia pages using five different readability scales (Flesch Reading Ease, Flesch-Kincaid Grade Level, Gunning Fog Index, SMOG Grade Level, and Coleman-Liau Index). We used the Centers for Disease Control and Prevention (CDC) Clear Communication Index as well as the DISCERN instrument to evaluate the quality of each Wikipedia article. We identified a total of fifty-five Wikipedia articles that corresponded with patient information articles published by the AANS, which constitutes 77.46% of the AANS topics. The mean Flesch Reading Ease score for all of the Wikipedia articles we analyzed is 31.10, which indicates that a college-level education is necessary to understand them. In comparison to the readability analysis of the AANS articles, the Wikipedia articles were more difficult to read on every scale. None of the Wikipedia articles meets the CDC criterion for clear communication. Our analyses demonstrate that Wikipedia articles related to neurosurgical topics require higher reading grade levels and fall below the expected levels of clear communication for patients. Collaborative efforts from the neurosurgical community are needed to enhance the readability and quality of Wikipedia pages related to neurosurgery. Copyright © 2018 Elsevier B.V. All rights reserved.
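
    For reference, two of the readability scales mentioned can be computed from simple word, sentence and syllable counts, as in the sketch below; the syllable heuristic is naive, so the scores will only approximate those produced by dedicated readability tools.

```python
import re

def _syllables(word):
    """Very rough syllable count: number of vowel groups, minimum 1."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level) for `text`."""
    n_sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_syll = sum(_syllables(w) for w in words)
    wps = n_words / n_sentences          # words per sentence
    spw = n_syll / n_words               # syllables per word
    ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade = 0.39 * wps + 11.8 * spw - 15.59
    return ease, grade


print(readability("The dura mater is the outermost of the three meninges. "
                  "It surrounds the brain and the spinal cord."))
```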

  20. TOPS On-Line: Automating the Construction and Maintenance of HTML Pages

    Science.gov (United States)

    Jones, Kennie H.

    1994-01-01

    After the Technology Opportunities Showcase (TOPS), in October, 1993, Langley Research Center's (LaRC) Information Systems Division (ISD) accepted the challenge to preserve the investment in information assembled in the TOPS exhibits by establishing a data base. Following the lead of several people at LaRC and others around the world, the HyperText Transport Protocol (HTTP) server and Mosaic were the obvious tools of choice for implementation. Initially, some TOPS exhibitors began the conventional approach of constructing HyperText Markup Language (HTML) pages of their exhibits as input to Mosaic. Considering the number of pages to construct, a better approach was conceived that would automate the construction of pages. This approach allowed completion of the data base construction in a shorter period of time using fewer resources than would have been possible with the conventional approach. It also provided flexibility for the maintenance and enhancement of the data base. Since that time, this approach has been used to automate construction of other HTML data bases. Through these experiences, it is concluded that the most effective use of the HTTP/Mosaic technology will require better tools and techniques for creating, maintaining and managing the HTML pages. The development and use of these tools and techniques are the subject of this document.
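
    The tooling described predates today's static-site generators, but the underlying idea, generating one HTML page per database record from a template rather than hand-writing each page, can be sketched as follows; the record fields, template and file names are hypothetical.

```python
from pathlib import Path

PAGE_TEMPLATE = """<html>
<head><title>{title}</title></head>
<body>
  <h1>{title}</h1>
  <p>{summary}</p>
  <p>Contact: {contact}</p>
</body>
</html>
"""

def build_pages(records, out_dir="exhibit_pages"):
    """Write one HTML page per record plus a simple index page."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    links = []
    for i, rec in enumerate(records):
        name = f"exhibit_{i:03d}.html"
        (out / name).write_text(PAGE_TEMPLATE.format(**rec), encoding="utf-8")
        links.append(f'<li><a href="{name}">{rec["title"]}</a></li>')
    index = "<html><body><h1>Exhibits</h1><ul>{}</ul></body></html>".format("".join(links))
    (out / "index.html").write_text(index, encoding="utf-8")


# Two hypothetical exhibit records
build_pages([
    {"title": "Composite Materials", "summary": "Lightweight structures.", "contact": "J. Doe"},
    {"title": "Wind Tunnel Data", "summary": "Archived test results.", "contact": "A. Smith"},
])
```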

  1. Review Pages: Cities, Energy and Climate Change

    Directory of Open Access Journals (Sweden)

    Gennaro Angiello

    2015-04-01

    Full Text Available Starting from the relationship between urban planning and mobility management, TeMA has gradually expanded the range of topics it covers, while remaining grounded in rigorous scientific in-depth analysis. During the last two years particular attention has been paid to the Smart Cities theme and to the different meanings that come with it. The last section of the journal is formed by the Review Pages. They have different aims: to inform on problems, trends and evolutionary processes; to investigate paths by highlighting the advanced relationships among apparently distant disciplinary fields; to explore areas of interaction, experiences and potential applications; to underline interactions and disciplinary developments but also, if present, defeats and setbacks. Inside the journal the Review Pages have the task of stimulating as much as possible the circulation of ideas and the discovery of new points of view. For this reason the section is founded on a series of basic references, required for the identification of new and more advanced interactions. These references are the research, the planning acts, the actions and the applications, analysed and investigated both for their ability to give a systematic response to questions concerning urban and territorial planning, and for their attention to aspects such as environmental sustainability and innovation in practice. For this purpose the Review Pages are formed by five sections (Web Resources; Books; Laws; Urban Practices; News and Events), each of which examines a specific aspect of the broader information storage of interest for TeMA.

  2. Review Pages: Cities, Energy and Built Environment

    Directory of Open Access Journals (Sweden)

    Gennaro Angiello

    2015-07-01

    Full Text Available Starting from the relationship between urban planning and mobility management, TeMA has gradually expanded the range of topics it covers, while remaining grounded in rigorous scientific in-depth analysis. During the last two years particular attention has been paid to the Smart Cities theme and to the different meanings that come with it. The last section of the journal is formed by the Review Pages. They have different aims: to inform on problems, trends and evolutionary processes; to investigate paths by highlighting the advanced relationships among apparently distant disciplinary fields; to explore areas of interaction, experiences and potential applications; to underline interactions and disciplinary developments but also, if present, defeats and setbacks. Inside the journal the Review Pages have the task of stimulating as much as possible the circulation of ideas and the discovery of new points of view. For this reason the section is founded on a series of basic references, required for the identification of new and more advanced interactions. These references are the research, the planning acts, the actions and the applications, analysed and investigated both for their ability to give a systematic response to questions concerning urban and territorial planning, and for their attention to aspects such as environmental sustainability and innovation in practice. For this purpose the Review Pages are formed by five sections (Web Resources; Books; Laws; Urban Practices; News and Events), each of which examines a specific aspect of the broader information storage of interest for TeMA.

  3. PageRank model of opinion formation on Ulam networks

    Science.gov (United States)

    Chakhmakhchyan, L.; Shepelyansky, D.

    2013-12-01

    We consider a PageRank model of opinion formation on Ulam networks, generated by the intermittency map and the typical Chirikov map. The Ulam networks generated by these maps have certain similarities with such scale-free networks as the World Wide Web (WWW), showing an algebraic decay of the PageRank probability. We find that the opinion formation process on Ulam networks has certain similarities but also distinct features comparing to the WWW. We attribute these distinctions to internal differences in network structure of the Ulam and WWW networks. We also analyze the process of opinion formation in the frame of generalized Sznajd model which protects opinion of small communities.

  4. Search Results | Page 784 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 7831 - 7840 of 8491 ... Research in Action. Water. Liquid manna? Treating urban wastewater for local gardening. Research in Action. Biodiversity Gender. Medicinal plant potential and profits in Latin America. Research in Action. HIV/AIDS Biodiversity. Recognition and respect for African traditional medicine. Pages.

  5. Search Results | Page 843 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 8421 - 8430 of 8489 ... IDRC permits reading, downloading, copying, redistributing, printing, linking and searching, for non-commercial or academic purposes, of any of its content, provided that credit and reference is given to IDRC and the original source page and, in the. Webpage.

  6. Search Results | Page 29 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Results 281 - 290 of 8491 ... Policy Analysis on Growth and Employment - PAGE II ... Displacement Caused by Development Projects in Zimbabwe ... Supporting business opportunities for rural women in east and southern Africa. Women in Zimbabwe, Kenya, and Uganda experience disadvantages and gender inequalities in ...

  7. Methodology for assessment of undiscovered oil and gas resources for the 2008 Circum-Arctic Resource Appraisal

    Science.gov (United States)

    Charpentier, Ronald R.; Moore, Thomas E.; Gautier, D.L.

    2017-11-15

    The methodological procedures used in the geologic assessments of the 2008 Circum-Arctic Resource Appraisal (CARA) were based largely on the methodology developed for the 2000 U.S. Geological Survey World Petroleum Assessment. The main variables were probability distributions for numbers and sizes of undiscovered accumulations with an associated risk of occurrence. The CARA methodology expanded on the previous methodology in providing additional tools and procedures more applicable to the many Arctic basins that have little or no exploration history. Most importantly, geologic analogs from a database constructed for this study were used in many of the assessments to constrain numbers and sizes of undiscovered oil and gas accumulations.
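
    The assessment variables described, a geologic risk of occurrence together with probability distributions for the number and sizes of undiscovered accumulations, lend themselves to Monte Carlo aggregation. The sketch below is a toy illustration of that pattern only; the distribution families and every parameter value are hypothetical and are not the CARA inputs.

```python
import numpy as np

def simulate_undiscovered_resource(n_trials=10_000, risk=0.7,
                                   num_low=1, num_mode=5, num_high=20,
                                   size_median=50.0, size_sigma=1.0, seed=0):
    """Toy Monte Carlo aggregation of undiscovered accumulations (sizes in million barrels)."""
    rng = np.random.default_rng(seed)
    totals = np.zeros(n_trials)
    for i in range(n_trials):
        if rng.random() > risk:
            continue                                   # province is dry in this trial
        n = int(round(rng.triangular(num_low, num_mode, num_high)))
        sizes = rng.lognormal(mean=np.log(size_median), sigma=size_sigma, size=n)
        totals[i] = sizes.sum()
    # fractiles of the aggregated resource and its mean
    return np.percentile(totals, [5, 50, 95]), totals.mean()


print(simulate_undiscovered_resource())
```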

  8. Exploring innovative ways to conduct coverage surveys for neglected tropical diseases in Malawi, Mali, and Uganda.

    Science.gov (United States)

    Woodhall, Dana M; Mkwanda, Square; Dembele, Massitan; Lwanga, Harriet; Drexler, Naomi; Dubray, Christine; Harris, Jennifer; Worrell, Caitlin; Mathieu, Els

    2014-04-01

    Currently, a 30-cluster survey to monitor drug coverage after mass drug administration for neglected tropical diseases is the most common methodology used by control programs. We investigated alternative survey methodologies that could potentially provide an estimation of drug coverage. Three alternative survey methods (market, village chief, and religious leader) were conducted and compared to the 30-cluster method in Malawi, Mali, and Uganda. In Malawi, drug coverage for the 30-cluster, market, village chief, and religious leader methods were 66.8% (95% CI 60.3-73.4), 74.3%, 76.3%, and 77.8%, respectively. In Mali, results for round 1 were 62.6% (95% CI 54.4-70.7), 56.1%, 74.8%, and 83.2%, and 57.2% (95% CI 49.0-65.4), 54.5%, 72.2%, and 73.3%, respectively, for round 2. Uganda survey results were 65.7% (59.4-72.0), 43.7%, 67.2%, and 77.6% respectively. Further research is needed to test different coverage survey methodologies to determine which survey methods are the most scientifically rigorous and resource efficient. Published by Elsevier B.V.

  9. CONTAMINATED SOIL VOLUME ESTIMATE TRACKING METHODOLOGY

    International Nuclear Information System (INIS)

    Durham, L.A.; Johnson, R.L.; Rieman, C.; Kenna, T.; Pilon, R.

    2003-01-01

    The U.S. Army Corps of Engineers (USACE) is conducting a cleanup of radiologically contaminated properties under the Formerly Utilized Sites Remedial Action Program (FUSRAP). The largest cost element for most of the FUSRAP sites is the transportation and disposal of contaminated soil. Project managers and engineers need an estimate of the volume of contaminated soil to determine project costs and schedule. Once excavation activities begin and additional remedial action data are collected, the actual quantity of contaminated soil often deviates from the original estimate, resulting in cost and schedule impacts to the project. The project costs and schedule need to be frequently updated by tracking the actual quantities of excavated soil and contaminated soil remaining during the life of a remedial action project. A soil volume estimate tracking methodology was developed to provide a mechanism for project managers and engineers to create better project controls of costs and schedule. For the FUSRAP Linde site, an estimate of the initial volume of in situ soil above the specified cleanup guidelines was calculated on the basis of discrete soil sample data and other relevant data using indicator geostatistical techniques combined with Bayesian analysis. During the remedial action, updated volume estimates of remaining in situ soils requiring excavation were calculated on a periodic basis. In addition to taking into account the volume of soil that had been excavated, the updated volume estimates incorporated both new gamma walkover surveys and discrete sample data collected as part of the remedial action. A civil survey company provided periodic estimates of actual in situ excavated soil volumes. By using the results from the civil survey of actual in situ volumes excavated and the updated estimate of the remaining volume of contaminated soil requiring excavation, the USACE Buffalo District was able to forecast and update project costs and schedule. The soil volume

  10. Site survey for nuclear power plants

    International Nuclear Information System (INIS)

    1984-01-01

    This Safety Guide describes the first stage of the siting process for nuclear power plants - the site survey to select one or more preferred candidate sites. Its purpose is to recommend procedures and provide information for use in implementing a part of the Code of Practice on Safety in Nuclear Power Plant Siting (IAEA Safety Series No.50-C-S). The organization, procedures, methodologies, guidance for documenting the site survey process and examples of detailed procedures on some safety-related site characteristics are given in the Guide

  11. Nurse Religiosity and Spiritual Care: An Online Survey.

    Science.gov (United States)

    Taylor, Elizabeth Johnston; Gober-Park, Carla; Schoonover-Shoffner, Kathy; Mamier, Iris; Somaiya, Chintan K; Bahjri, Khaled

    2017-08-01

    This study measured the frequency of nurse-provided spiritual care and how it is associated with various facets of nurse religiosity. Data were collected using an online survey accessed from the home page of the Journal of Christian Nursing. The survey included the Nurse Spiritual Care Therapeutics Scale, six scales quantifying facets of religiosity, and demographic and work-related items. Respondents (N = 358) indicated high religiosity yet reported neutral responses to items about sharing personal beliefs and tentativeness of belief. Findings suggested spiritual care was infrequent. Multivariate analysis showed prayer frequency, employer support of spiritual care, and non-White ethnicity were significantly associated with spiritual care frequency (adjusted R² = .10). Results not only provide an indication of spiritual care frequency but also empirical encouragement for nurse managers to provide a supportive environment for spiritual care. Findings expose the reality that nurse religiosity is directly related, albeit weakly, to spiritual care frequency.

  12. To Give or Not to Give, That Is the Question : How Methodology Is Destiny in Dutch Giving Data

    NARCIS (Netherlands)

    Bekkers, René; Wiepking, Pamala

    2006-01-01

    In research on giving, methodology is destiny. The volume of donations estimated from sample surveys strongly depends on the length of the questionnaire used to measure giving. By comparing two giving surveys from the Netherlands, the authors show that a short questionnaire on giving not only

  13. A methodology and success/failure criteria for determining emergency diesel generator reliability

    International Nuclear Information System (INIS)

    Wyckoff, H.L.

    1986-01-01

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected. Those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on an extremely conservative methodology and success/failure criteria that are specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984 and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and the start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)

  14. A methodology and success/failure criteria for determining emergency diesel generator reliability

    Energy Technology Data Exchange (ETDEWEB)

    Wyckoff, H. L. [Electric Power Research Institute, Palo Alto, California (United States)

    1986-02-15

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected. Those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on an extremely conservative methodology and success/failure criteria that are specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984 and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and the start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)

  15. Search Results | Page 80 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-01-01

    Results 791 - 800 of 8491 ... January 1, 2010. Studies. -. Digital and other poverties : exploring the connection in four East African countries. Published date. January 1, 2010. Studies. -. Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia. Published date. January 1, 2009. Pages.

  16. SI_pages 771-785.pdf

    Indian Academy of Sciences (India)

    Administrator

    Fig. S1. Powder X-ray diffraction pattern (Cu Kα) for AlPO-5: (a) as prepared, (b) calcined and (c) simulated [Ref. 27]. Fig. S2. Powder X-ray diffraction pattern (Cu Kα) for MeAlPO-5: (a) as prepared ZnAlPO-5, (b) calcined ZnAlPO-5, (c) as prepared CoAlPO-5, (d) calcined ...

  17. A survey of selected Internet pharmacies in the United States.

    Science.gov (United States)

    Peterson, A M

    2001-01-01

    To determine whether differences in the provision of pharmacy services exist among different types of Internet pharmacies. Survey of selected pharmacies with a presence on the Internet. Data were abstracted onto a data collection form for further analysis. Data collection was limited to 3 weeks. U.S.-based Internet pharmacies that allow patients to purchase prescription medications online. Pharmacies were identified using a metasearch engine with the search terms "Internet pharmacy" and "Internet pharmacist." Survey. Comparisons of availability of 10 commonly used products representing a variety of product categories, prescription verification methods, and privacy issues; and determinations of site navigability, drug information and provider access, and payment methods. Sites were categorized as "chain pharmacy extensions," "mail order pharmacies," "independent pharmacy extensions," and "online pharmacies." Thirty-three sites were reviewed. There was significant variation among the four types of pharmacies selling prescriptions over the Internet. Most pharmacies provided all of the drugs in the survey. Patients were required to provide their own prescription at 88% of the sites, and 75% of sites used mail or fax to verify prescription integrity. More than 50% of sites had privacy policies posted, and 64% used cookies. Chain pharmacy extensions required completion of an average of 10.2 pages to order drugs versus 2.4 to 4 pages for all other site types. Drug information was written at an eighth-grade reading level at 36% of the sites. More than two-thirds of the sites provided a toll-free telephone for a health care professional. Nearly 80% of the sites accepted health insurance, and 95% accepted credit cards; however, only 40% used a secure transmission mechanism for patient or payment information. Internet pharmacies provide varying levels of service. Policies regarding the use of the Internet for obtaining medications should focus on improving the privacy of

  18. 2016 Survey of Non-Starch Alcohol and Renewable Hydrocarbon Biofuels Producers

    Energy Technology Data Exchange (ETDEWEB)

    Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Schwab, Amy [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bacovsky, Dina [Bioenergy 2020+ GmbH (Germany)

    2017-02-01

    In order to understand the anticipated status of the industry for non-starch ethanol and renewable hydrocarbon biofuels as of the end of calendar year 2015, the National Renewable Energy Laboratory (NREL) updated its annual survey of U.S. non-starch ethanol and renewable hydrocarbon biofuels producers. This report presents the results of this survey update, describes the survey methodology, and documents important changes since the 2015 survey published at the end of 2015 (Schwab et al. 2015).

  19. Understanding Sample Surveys: Selective Learning about Social Science Research Methods

    Science.gov (United States)

    Currin-Percival, Mary; Johnson, Martin

    2010-01-01

    We investigate differences in what students learn about survey methodology in a class on public opinion presented in two critically different ways: with the inclusion or exclusion of an original research project using a random-digit-dial telephone survey. Using a quasi-experimental design and data obtained from pretests and posttests in two public…

  20. Didactic content and teaching methodologies on required allopathic US family medicine clerkships.

    Science.gov (United States)

    Schwiebert, L P; Aspy, C B

    1999-02-01

    Despite the increased prominence of family medicine clerkships in required third- and fourth-year clinical rotations in US allopathic medical schools, the content of these clerkships varies markedly among institutions, and there is little in the literature concerning the current or desired content of family medicine clerkships. This study explores the didactic content of a national sample of required family medicine clerkships to assess what and how this important aspect of clerkship curriculum is taught. Using an original survey instrument, we surveyed US medical schools through mailings and follow-up phone contacts. We categorized free-form responses using a coding dictionary specific to this study and computed descriptive statistics. Of 127 medical schools contacted, 105 (83%) responded. Among respondents, 86 (82%) had a required family medicine clerkship, 80% of them in the third year. Mean clerkship length was 5.3 weeks (median = 4 weeks), and the mean number of didactic sessions was about 2 per week. Almost 80% of clerkships had sessions in the broad area of family medicine, and prevention was the most frequent individual topic, taught in 32 (37%) of clerkships. Seventy-one percent of sessions used methodologies other than lectures. The mean time devoted to teaching 24 of the top 26 topics identified in the survey was between 1.2 and 3.1 hours/rotation, although case presentations and common problems each averaged more than 7 hours on clerkships teaching these topics. This survey provided more detailed information than previously available about the didactic content of required US allopathic family medicine clerkships. The survey also documented the lack of agreement among these clerkships on didactic content. Most didactic sessions used interactive rather than lecture format. The information from this first detailed survey provides family medicine clerkship directors with national comparisons of didactic content and methodology as a foundation for further

  1. Methodological Capacity within the Field of "Educational Technology" Research: An Initial Investigation

    Science.gov (United States)

    Bulfin, Scott; Henderson, Michael; Johnson, Nicola F.; Selwyn, Neil

    2014-01-01

    The academic study of educational technology is often characterised by critics as methodologically limited. In order to test this assumption, the present paper reports on data collected from a survey of 462 "research active" academic researchers working in the broad areas of educational technology and educational media. The paper…

  2. Thomas Jefferson, Page Design, and Desktop Publishing.

    Science.gov (United States)

    Hartley, James

    1991-01-01

    Discussion of page design for desktop publishing focuses on the importance of functional issues as opposed to aesthetic issues, and criticizes a previous article that stressed aesthetic issues. Topics discussed include balance, consistency in text structure, and how differences in layout affect the clarity of "The Declaration of…

  3. Search Results | Page 82 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-01-01

    Results 811 - 820 of 8491 ... IPv6 deployment. Published date. January 1, 2010. Studies. -. Welcome address by Prof. Z.D. Kadzamira, Chairman, UA, at the UbuntuNet Connect 2010. Published date. January 1, 2010. Studies. -. CHAIN project and prospects for Sub Saharan Africa. Published date. January 1, 2010. Pages.

  4. Recruitment of mental health survey participants using Internet advertising: content, characteristics and cost effectiveness.

    Science.gov (United States)

    Batterham, Philip J

    2014-06-01

    Postal and telephone survey research is threatened by declining response rates and high cost. Online recruitment is becoming more popular, although there is little empirical evidence about its cost-effectiveness or the representativeness of online samples. There is also limited research on optimal strategies for developing advertising content for online recruitment. The present study aimed to assess these aspects of online recruitment. Two mental health surveys used advertisements within a social network website (Facebook) to recruit adult Australian participants. The initial survey used advertisements linking directly to an external survey website, and recruited 1283 participants at $9.82 per completed survey. A subsequent survey used advertisements linking to a Facebook page that featured links to the external survey, recruiting 610 participants at $1.51 per completion. Both surveys were more cost-effective than similar postal surveys conducted previously, which averaged $19.10 per completion. Online and postal surveys both had somewhat unrepresentative samples. However, online surveys tended to be more successful in recruiting hard-to-reach populations. Advertising using "problem" terminology was more effective than "positive" terminology, while there was no significant effect of altruistic versus self-gain terminology. Online recruitment is efficient, flexible and cost-effective, suggesting that online recruitment has considerable potential for specific research designs. Copyright © 2014 John Wiley & Sons, Ltd.
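
    The channel comparison above reduces to a cost-per-completion calculation. The short Python sketch below reproduces that arithmetic; the total-spend figures are back-calculated from the reported per-completion averages and the postal volume is assumed, so the numbers are illustrative rather than taken from the study.

```python
# Cost per completed survey, the metric used above to compare recruitment
# channels. Total-spend figures are back-calculated from the reported
# per-completion averages and are therefore only illustrative.

def cost_per_completion(total_cost: float, completions: int) -> float:
    """Average recruitment cost of one completed survey."""
    return total_cost / completions

channels = {
    "Facebook ads -> external survey site": (9.82 * 1283, 1283),
    "Facebook ads -> Facebook page":        (1.51 * 610, 610),
    "Comparable postal survey":             (19.10 * 1000, 1000),  # volume assumed
}

for name, (spend, completions) in channels.items():
    print(f"{name}: ${cost_per_completion(spend, completions):.2f} per completion")
```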

  5. Customize your Facebook fan page to promote your business or product on Facebook

    OpenAIRE

    Dhwanil

    2012-01-01

    What’s new on Facebook? You can post all sorts of content, photographs or video, but the basic layout and design of your fan pages is the same one everyone else uses. You can, however, customize your Facebook Fan Page with a new, professional look and feel to promote your business on Facebook.

  6. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  7. Michigan residential heating oil and propane price survey: 1995-1996 heating season. Final report

    International Nuclear Information System (INIS)

    Moriarty, C.

    1996-05-01

    This report summarizes the results of a survey of residential No. 2 distillate fuel (home heating oil) and liquefied petroleum gas (propane) prices over the 1995-1996 heating season in Michigan. The Michigan Public Service Commission (MPSC) conducted the survey under a cooperative agreement with the US Department of Energy's (DOE) Energy Information Administration (EIA). This survey was funded in part by a grant from the DOE. From October 1995 through March 1996, the MPSC surveyed participating distributors by telephone for current residential retail home heating oil and propane prices. The MPSC transmitted the data via a computer modem to the EIA using the Petroleum Electronic Data Reporting Option (PEDRO). Survey results were published in aggregate on the MPSC World Wide Web site at http://ermisweb.state.mi.us/shopp. The page was updated with both residential and wholesale prices immediately following the transmission of the data to the EIA. The EIA constructed the survey using a sample of Michigan home heating oil and propane retailers. The sample accounts for different sales volumes, geographic location, and sources of primary supply.

  8. Piloting social engagement on a federal agency-administered Facebook page.

    Science.gov (United States)

    Chiu, Kimberly; Wagner, Lindsay; Choe, Lena; Chew, Catherine; Kremzner, Mary

    2016-01-01

    The objective was to evaluate the impact of a Federal drug information center initiating engagement with stakeholders on a Facebook page administered by a Federal agency; the setting was the U.S. Food and Drug Administration (FDA) Facebook page from July 21, 2014, to October 18, 2014. FDA's Division of Drug Information (DDI) in the Center for Drug Evaluation and Research (CDER) Office of Communications serves as a federal drug information center providing timely, accurate, and useful information on CDER initiatives and CDER-regulated products. We report a 90-day (July 21 to October 18, 2014) pilot during which DDI pharmacists monitored and moderated comments received on FDA's Facebook page to identify those warranting a reply. Once identified, DDI pharmacists replied within 2 business days. Impact was measured by comparing the average number of Likes, Shares, and Reach for Facebook posts before and after the pilot. Additional metrics collected include the number of DDI replies provided to stakeholders' comments and the number of DDI replies provided on time (within 2 business days). During the pilot, DDI contributed 14 posts. On average, each post reached 23,582 more individuals (an increase of 187% compared with pre-pilot posts). On average, each post also received 463 more Likes (450% increase) and 130 more Shares (271% increase). DDI pharmacists replied to 3% (121/3994) and hid 0.58% (23/3994) of Facebook comments received during the 90-day period. All actions were taken within 2 business days. Initiating social engagement had a positive impact on FDA's Facebook page. Published by Elsevier Inc.

  9. Using Intelligent Techniques in Construction Project Cost Estimation: 10-Year Survey

    Directory of Open Access Journals (Sweden)

    Abdelrahman Osman Elfaki

    2014-01-01

    Cost estimation is the most important preliminary process in any construction project. Therefore, construction cost estimation has the lion’s share of the research effort in construction management. In this paper, we have analysed and studied proposals for construction cost estimation from the last 10 years. To implement this survey, we have proposed and applied a methodology that consists of two parts. The first part concerns data collection, for which we have chosen specialist journals as sources for the surveyed proposals. The second part concerns the analysis of the proposals, in which each proposal is examined against four questions: Which intelligent technique is used? How have the data been collected? How are the results validated? And which construction cost estimation factors have been used? From the results of this survey, two main contributions have been produced. The first is the definition of the research gap in this area, which has not been fully covered by previous proposals for construction cost estimation. The second is the proposal and highlighting of future directions for forthcoming proposals, aimed ultimately at finding the optimal approach to construction cost estimation. Moreover, we consider the second part of our methodology to be one of the contributions of this paper, since it has been proposed as a standard benchmark for construction cost estimation proposals.

  10. The Behavior of Online Museum Visitors on Facebook Fan Page of the Museum in Indonesia

    OpenAIRE

    Arta Moro Sundjaja; Ford Lumban Gaol; Sri Bramantoro Abdinagoro; Bahtiar S. Abbas

    2017-01-01

    The objective of this research was to discover the behavior of museum visitors on Facebook fan page in Indonesia based on the user motivation, user expectation, online community involvement, and Facebook fan page of the museum. This research used a quantitative approach to descriptive analysis. The population was the Facebook users who had followed the Facebook fan page of the museum in Indonesia. The samples used were 270 respondents. The researchers distributed the questionnaire to a Facebo...

  11. Page | 138 JUDICIAL REVIEW OF OUSTER CLAUSE PROVISIONS ...

    African Journals Online (AJOL)

    Fr. Ikenga

    ouster of jurisdiction of the courts on pre-election matters and impeachment of the executive do constitute an ... NAUJILJ 9 (1) 2018. Page | .... Ouster clauses are general provisions, which preclude an organ of government from exercising its.

  12. Adaptive Intervention Methodology for Reduction of Respondent Contact Burden in the American Community Survey

    Directory of Open Access Journals (Sweden)

    Ashmead Robert

    2017-12-01

    The notion of respondent contact burden in sample surveys is defined, and a multi-stage process to develop policies for curtailing nonresponse follow-up is described, with the goal of reducing this burden on prospective survey respondents. The method depends on contact history paradata containing information about contact attempts both for respondents and for sampled nonrespondents. By analysis of past data, policies to stop case follow-up based on control variables measured in paradata can be developed by calculating propensities to respond for paradata-defined subgroups of sampled cases. Competing policies can be assessed by comparing outcomes (lost interviews, numbers of contacts, patterns of reluctant participation, or refusal to participate) as if these stopping policies had been followed in past data. Finally, embedded survey experiments may be used to assess contact-burden reduction policies when these are implemented in the field. The multi-stage method described here abstracts the stages followed in a series of research studies aimed at reducing contact burden in the Computer Assisted Telephone Interview (CATI) and Computer Assisted Personal Interview (CAPI) modes of the American Community Survey (ACS), which culminated in implementation of policy changes in the ACS.
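
    The stopping-policy idea described above can be sketched in a few lines: estimate response propensities for paradata-defined subgroups from past contact-history data, then curtail follow-up once the propensity falls below a cutoff. The Python sketch below is a minimal illustration under invented field names, subgroups and cutoff, not the policy actually implemented in the ACS.

```python
# Minimal sketch of a contact-burden stopping rule estimated from paradata.
# Field names, subgroup definitions and the cutoff are illustrative assumptions,
# not the American Community Survey's actual policy.

# Past contact-history records: (subgroup, attempts_so_far, responded_later)
history = [
    ("urban_weekday", 3, True), ("urban_weekday", 5, False),
    ("urban_weekday", 6, False), ("rural_evening", 4, True),
    ("rural_evening", 7, True), ("rural_evening", 8, False),
]

def response_propensity(records, subgroup, min_attempts):
    """Share of past cases in `subgroup` that eventually responded after
    already having received at least `min_attempts` contact attempts."""
    relevant = [responded for (group, attempts, responded) in records
                if group == subgroup and attempts >= min_attempts]
    return sum(relevant) / len(relevant) if relevant else 0.0

def stop_followup(subgroup, attempts_made, cutoff=0.25):
    """Hypothetical policy: stop contacting a case once the estimated
    propensity to respond for its subgroup drops below the cutoff."""
    return response_propensity(history, subgroup, attempts_made) < cutoff

print(stop_followup("urban_weekday", 5))   # True  -> curtail follow-up
print(stop_followup("rural_evening", 5))   # False -> keep trying
```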

  13. Continental-Scale Temperature Reconstructions from the PAGES 2k Network

    Science.gov (United States)

    Kaufman, D. S.

    2012-12-01

    We present a major new synthesis of seven regional temperature reconstructions to elucidate the global pattern of variations and their association with climate-forcing mechanisms over the past two millennia. To coordinate the integration of new and existing data of all proxy types, the Past Global Changes (PAGES) project developed the 2k Network. It comprises nine working groups representing eight continental-scale regions and the oceans. The PAGES 2k Consortium, authoring this paper, presently includes 79 representatives from 25 countries. For this synthesis, each of the PAGES 2k working groups identified the proxy climate records for reconstructing past temperature and associated uncertainty using the data and methodologies that they deemed most appropriate for their region. The datasets are from 973 sites where tree rings, pollen, corals, lake and marine sediment, glacier ice, speleothems, and historical documents record changes in biologically and physically mediated processes that are sensitive to temperature change, among other climatic factors. The proxy records used for this synthesis are available through the NOAA World Data Center for Paleoclimatology. On long time scales, the temperature reconstructions display similarities among regions, and a large part of this common behavior can be explained by known climate forcings. Reconstructed temperatures in all regions show an overall long-term cooling trend until around 1900 C.E., followed by strong warming during the 20th century. On the multi-decadal time scale, we assessed the variability among the temperature reconstructions using principal component (PC) analysis of the standardized decadal mean temperatures over the period of overlap among the reconstructions (1200 to 1980 C.E.). PC1 explains 35% of the total variability and is strongly correlated with temperature reconstructions from the four Northern Hemisphere regions, and with the sum of external forcings including solar, volcanic, and greenhouse
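
    The multi-decadal analysis step described above (standardizing decadal mean series and extracting principal components) can be illustrated with a small synthetic example. The Python sketch below shows only the generic PCA computation; it uses made-up data, not the PAGES 2k records or code.

```python
# Illustrative PCA on standardized decadal-mean series, mimicking the analysis
# step described above. The data here are synthetic, not the PAGES 2k records.
import numpy as np

rng = np.random.default_rng(0)
n_decades, n_regions = 79, 7           # decadal means, roughly 1200-1980 C.E.
common = np.cumsum(rng.normal(size=n_decades))           # shared low-frequency signal
series = np.column_stack([common + rng.normal(scale=2.0, size=n_decades)
                          for _ in range(n_regions)])

# Standardize each regional series, then take principal components.
z = (series - series.mean(axis=0)) / series.std(axis=0)
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()

pc1 = z @ eigvecs[:, order[0]]          # PC1 time series across decades
print(f"PC1 explains {explained[0]:.0%} of the total variability")
```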

  14. survey research in practical theology and congregational studies

    African Journals Online (AJOL)

    such as the social, political and economic environment that influence society also affect the ... correlation research is part of a quantitative research methodology and could contribute to ... Another type is qualitative survey that focuses on the ...

  15. E-survey with researchers, members of ethics committees and sponsors of clinical research in Brazil: an emerging methodology for scientific research.

    Science.gov (United States)

    Dainesi, Sonia Mansoldo; Goldbaum, Moisés

    2012-12-01

    The growth of Internet users enables epidemiological studies to be conducted electronically, representing a promising methodology for data collection. Members of Ethics Committees, Clinical Researchers and Sponsors were interviewed using questionnaires sent over the Internet. Along with the questionnaire, participants received a message explaining the survey and also the informed consent. Returning the questionnaire meant the consent of the participant was given. No incentive was offered; two reminders were sent. The response rate was 21% (124/599), 20% (58/290) and 45% (24/53) respectively for Ethics Committees, Researchers and Sponsors. The percentage of return before the two reminders was about 62%. Reasons for non-response: participant not found, refusal to participate, lack of experience in clinical research or in the therapeutic field. Characteristics of participants: 45% of Ethics Committee participants, 64% of Researchers and 63% of Sponsors were male; mean age (range), respectively: 47 (28-74), 53 (24-72) and 40 (29-65) years. Among Researchers and Sponsors, all respondents had at least a university degree and, in the Ethics Committees group, only two (1.7%) did not have one. Most of the questionnaires in all groups came from the Southeast Region of Brazil, probably reflecting the highest number of clinical trials and research professionals in this region. Despite the potential limitations of a survey done through the Internet, this study led to a response rate similar to what has been observed with other models, efficiency in obtaining responses (speed and quality), convenience for respondents and low cost.

  16. Refining dermatology journal impact factors using PageRank.

    Science.gov (United States)

    Dellavalle, Robert P; Schilling, Lisa M; Rodriguez, Marko A; Van de Sompel, Herbert; Bollen, Johan

    2007-07-01

    Thomson Institute for Scientific Information's journal impact factor, the most common measure of journal status, is based on crude citation counts that do not account for the quality of the journals where the citations originate. This study examines how accounting for citation origin affects the impact factor ranking of dermatology journals. The 2003 impact factors of dermatology journals were adjusted by a weighted PageRank algorithm that assigned greater weight to citations originating in more frequently cited journals. Adjusting for citation origin moved the rank of the Journal of the American Academy of Dermatology higher than that of the Archives of Dermatology (third to second) but did not affect the ranking of the highest impact dermatology journal, the Journal of Investigative Dermatology. The dermatology journals most positively affected by adjusting for citation origin were Contact Dermatitis (moving from 22nd to 7th in rankings) and Burns (21st to 10th). Dermatology journals most negatively affected were Seminars in Cutaneous Medicine and Surgery (5th to 14th), the Journal of Cutaneous Medicine and Surgery (19th to 27th), and the Journal of Investigative Dermatology Symposium Proceedings (26th to 34th). Current measures of dermatology journal status do not incorporate survey data from dermatologists regarding which journals dermatologists esteem most. Adjusting for citation origin provides a more refined measure of journal status and changes relative dermatology journal rankings.
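
    The citation-origin adjustment is, at heart, a PageRank computation on the journal citation graph. The Python sketch below implements a generic weighted PageRank by power iteration on an invented citation matrix; the journal names and counts are made up, and the study's exact weighting scheme may differ.

```python
# Generic weighted PageRank over a journal citation graph, sketching the
# "adjust for citation origin" idea. Journal names and citation counts are
# invented; the study's exact weighting scheme may differ.
import numpy as np

journals = ["J Invest Dermatol", "J Am Acad Dermatol", "Arch Dermatol", "Contact Dermatitis"]
# citations[i, j] = citations that journal i gives to journal j
citations = np.array([
    [0, 40, 30, 10],
    [50, 0, 25,  5],
    [45, 30, 0,  5],
    [20, 10, 10, 0],
], dtype=float)

def weighted_pagerank(C, damping=0.85, tol=1e-10):
    """Power iteration; citations arriving from highly ranked journals count more."""
    n = C.shape[0]
    P = C / C.sum(axis=1, keepdims=True)   # row-stochastic (assumes no dangling rows)
    rank = np.full(n, 1.0 / n)
    while True:
        new_rank = (1 - damping) / n + damping * rank @ P
        if np.abs(new_rank - rank).sum() < tol:
            return new_rank
        rank = new_rank

for name, score in sorted(zip(journals, weighted_pagerank(citations)), key=lambda x: -x[1]):
    print(f"{name:20s} {score:.3f}")
```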

  17. Methodology for Selecting Best Management Practices Integrating Multiple Stakeholders and Criteria. Part 1: Methodology

    Directory of Open Access Journals (Sweden)

    Mauricio Carvallo Aceves

    2016-02-01

    The implementation of stormwater Best Management Practices (BMPs) could help re-establish the natural hydrological cycle of watersheds after urbanization, with each BMP presenting a different performance across a range of criteria (flood prevention, pollutant removal, etc.). Additionally, conflicting views from the relevant stakeholders may arise, resulting in a complex selection process. This paper proposes a methodology for BMP selection based on the application of multi-criteria decision aid (MCDA) methods, integrating multiple stakeholder priorities and BMP combinations. First, in the problem definition, the MCDA methods, relevant criteria and design guidelines are selected. Next, information from the preliminary analysis of the watershed is used to obtain a list of relevant BMPs. The third step comprises the watershed modeling and analysis of the BMP alternatives to obtain performance values across purely objective criteria. Afterwards, a stakeholder analysis based on survey applications is carried out to obtain social performance values and criteria priorities. Then, the MCDA methods are applied to obtain the final BMP rankings. The last step considers the sensitivity analysis and rank comparisons in order to draw the final conclusions and recommendations. Future improvements to the methodology could explore the inclusion of multiple-objective analysis and alternative means for obtaining social performance values.
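
    As a minimal illustration of the aggregation step in such an MCDA workflow, the Python sketch below ranks invented BMP alternatives by a weighted sum of normalized criterion scores; the BMPs, criteria, scores and stakeholder weights are all assumptions, and the paper's chosen MCDA methods may aggregate differently (e.g., via outranking).

```python
# Weighted-sum aggregation as a minimal stand-in for the MCDA step described
# above. BMP alternatives, criteria, scores and weights are illustrative only.
import numpy as np

bmps = ["Green roof", "Bioretention cell", "Permeable pavement"]
criteria = ["flood prevention", "pollutant removal", "cost", "social acceptance"]

# Raw performance values (rows = BMPs); higher is better except for cost.
scores = np.array([
    [0.5, 0.7, 120.0, 0.8],
    [0.8, 0.9,  60.0, 0.6],
    [0.7, 0.6,  90.0, 0.7],
])
benefit = np.array([True, True, False, True])    # cost is a "lower is better" criterion
weights = np.array([0.35, 0.30, 0.20, 0.15])     # stakeholder-derived priorities

# Min-max normalization, flipping the cost criterion so higher is always better.
lo, hi = scores.min(axis=0), scores.max(axis=0)
norm = (scores - lo) / (hi - lo)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

ranking = sorted(zip(bmps, norm @ weights), key=lambda x: -x[1])
for name, value in ranking:
    print(f"{name:20s} {value:.2f}")
```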

  18. DuPage County, Ill., makes federal case to extend road through national lab.

    CERN Multimedia

    Meyer, H G

    2003-01-01

    DuPage County officials have submitted plans to extend a highway through the Fermi National Accelerator Laboratory. Citing a need for more north-south options along the county's western edge, the officials proposed a four-lane roadway including a new 5.5-mile segment curving through the national particle physics research laboratory (1 page).

  19. Chemical ionisation mass spectrometry: a survey of instrument technology

    International Nuclear Information System (INIS)

    Mather, R.E.; Todd, J.F.J.

    1979-01-01

    The purpose of this review is to survey the innovations and improvements which have been made in both instrumentation and methodology in chemical ionization mass spectrometry in the past ten years. (Auth.)

  20. Generalized PageRank on Directed Configuration Networks

    NARCIS (Netherlands)

    Chen, Ningyuan; Litvak, Nelli; Olvera-Cravioto, Mariana

    2017-01-01

    This paper studies the distribution of a family of rankings, which includes Google’s PageRank, on a directed configuration model. In particular, it is shown that the distribution of the rank of a randomly chosen node in the graph converges in distribution to
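
    The limit in this line of work is typically characterized as the solution of a distributional fixed-point equation; the LaTeX below is a hedged reconstruction of that standard form and may differ in detail from the paper's exact statement.

```latex
% Hedged reconstruction of the standard stochastic fixed-point equation behind
% generalized PageRank on random graphs; the paper's exact statement may differ.
\[
  R \;\stackrel{d}{=}\; \sum_{i=1}^{\mathcal{N}} C_i \, R_i + Q ,
\]
% where \mathcal{N} is the in-degree of a randomly chosen node, the R_i are
% i.i.d. copies of R independent of (Q, \mathcal{N}, C_1, C_2, \dots), and the
% weights C_i and Q encode the damping factor and personalization values.
```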

  1. Ecosystem Food Web Lift-The-Flap Pages

    Science.gov (United States)

    Atwood-Blaine, Dana; Rule, Audrey C.; Morgan, Hannah

    2016-01-01

    In the lesson on which this practical article is based, third grade students constructed a "lift-the-flap" page to explore food webs on the prairie. The moveable papercraft focused student attention on prairie animals' external structures and how the inferred functions of those structures could support further inferences about the…

  2. Search Results | Page 60 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2009-08-19

    Results 591 - 600 of 8490 ... ... the case of DrumNet in Kenya; paper presented at IAAE eARN Africa Symposium, Beijing, August 19, 2009. Published date. January 1, 2009. Studies. Conservation and sustainable use of biodiversity : the case of medicinal plants and traditional medicine. Published date. January 1, 2006. Pages.

  3. A study on safety assessment methodology for a vitrification plant

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Y. C.; Lee, G. S.; Choi, Y. C.; Kim, G. H. [Yonsei Univ., Seoul (Korea, Republic of)

    2002-03-15

    In this study, the technical and regulatory status of radioactive waste vitrification technologies in foreign and domestic plants is investigated and analyzed, and significant factors are then suggested which must be contained in the final technical guideline or standard for the safety assessment of vitrification plants. Methods to estimate the stability of vitrified waste forms, together with analyses of their properties, are also suggested. The contents and scope of the study are summarized as follows: survey of the status of radioactive waste vitrification technologies in foreign and domestic plants, survey of the characterization methodology for radioactive waste forms, analysis of the stability of vitrified waste forms, survey and analysis of the related technical standards and regulations in foreign and domestic plants, suggestion of significant factors for the safety assessment of vitrification plants, and submission of a regulatory technical standard on radioactive waste vitrification plants.

  4. Safety Gear Decontamination Practices Among Florida Firefighters: Analysis of a Text-Based Survey Methodology.

    Science.gov (United States)

    Moore, Kevin J; Koru-Sengul, Tulay; Alvarez, Armando; Schaefer-Solle, Natasha; Harrison, Tyler R; Kobetz, Erin N; Caban-Martinez, Alberto J

    2018-02-01

    Despite the National Fire Protection Association (NFPA) 1851 Personal Protective Equipment Care and Maintenance guidelines, little is known about the routine cleaning of firefighter bunker gear. In collaboration with a large Florida firefighter union, a mobile phone text survey was administered, which included eight questions in an item logic format. In total, 250 firefighters participated in the survey, of whom 65% reported cleaning their bunker gear in the past 12 months. Approximately 32% (n = 52) indicated that they had above-average confidence in gear cleaning procedures. Arriving at a fire incident response was a significant predictor of gear cleaning in the 12 months preceding survey administration. Using mobile phone-based texting for periodic queries on adherence to NFPA cleaning guidelines and safety message distribution may assist firefighters to increase decontamination procedure frequency.

  5. Flood-inundation maps for the DuPage River from Plainfield to Shorewood, Illinois, 2013

    Science.gov (United States)

    Murphy, Elizabeth A.; Sharpe, Jennifer B.

    2013-01-01

    Digital flood-inundation maps for a 15.5-mi reach of the DuPage River from Plainfield to Shorewood, Illinois, were created by the U.S. Geological Survey (USGS) in cooperation with the Will County Stormwater Management Planning Committee. The inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent of flooding corresponding to selected water levels (gage heights or stages) at the USGS streamgage at DuPage River at Shorewood, Illinois (sta. no. 05540500). Current conditions at the USGS streamgage may be obtained on the Internet at http://waterdata.usgs.gov/usa/nwis/uv?05540500. In addition, the information has been provided to the National Weather Service (NWS) for incorporation into their Advanced Hydrologic Prediction Service (AHPS) flood warning system (http://water.weather.gov/ahps/). The NWS forecasts flood hydrographs at many places that are often colocated with USGS streamgages. The NWS-forecasted peak-stage information, also shown on the DuPage River at Shorewood inundation Web site, may be used in conjunction with the maps developed in this study to show predicted areas of flood inundation. In this study, flood profiles were computed for the stream reach by means of a one-dimensional step-backwater model. The hydraulic model was then used to determine nine water-surface profiles for flood stages at 1-ft intervals referenced to the streamgage datum and ranging from the NWS Action stage of 6 ft to the historic crest of 14.0 ft. The simulated water-surface profiles were then combined with a Digital Elevation Model (DEM) (derived from Light Detection And Ranging (LiDAR) data) by using a Geographic Information System (GIS) in order to delineate the area flooded at each water level. These maps, along with information on the Internet regarding current gage height from USGS streamgages and forecasted stream stages from the NWS, provide emergency
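
    The delineation step described above reduces, cell by cell, to comparing a simulated water-surface elevation against ground elevation from the DEM. The Python sketch below shows that comparison on a tiny synthetic grid; it is not the USGS workflow, and all elevations are invented.

```python
# Sketch of the inundation-delineation step: a cell is flooded where the
# simulated water-surface elevation exceeds the DEM ground elevation.
# The grid and elevations are synthetic, not the DuPage River data.
import numpy as np

dem = np.array([                 # ground elevation (ft) on a toy 4x5 grid
    [512.0, 511.5, 511.0, 511.8, 513.0],
    [511.2, 510.4, 510.1, 510.9, 512.2],
    [511.0, 509.8, 509.5, 510.3, 511.9],
    [512.5, 511.1, 510.8, 511.6, 513.4],
])
water_surface = 510.9            # one simulated profile elevation (ft) for this area

inundated = water_surface > dem              # boolean flood-extent mask
depth = np.where(inundated, water_surface - dem, 0.0)

print("flooded cells:", int(inundated.sum()))
print("max depth (ft):", round(float(depth.max()), 2))
```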

  6. Assessment of mental health and illness by telephone survey: experience with an Alberta mental health survey.

    Science.gov (United States)

    Patten, Scott B; Adair, Carol E; Williams, Jeanne Va; Brant, Rollin; Wang, Jian Li; Casebeer, Ann; Beauséjour, Pierre

    2006-01-01

    Mental health is an emerging priority for health surveillance. It has not been determined that the existing data sources can adequately meet surveillance needs. The objective of this project was to explore the use of telephone surveys as a means of collecting supplementary surveillance information. A computer-assisted telephone interview was administered to 5,400 subjects in Alberta. The interview included a set of brief, validated measures for evaluating mental disorder prevalence and related variables. The individual subject response rate was 78 percent, but a substantial number of refusals occurred at the initial household contact. The age and sex distribution of the study sample differed from that of the provincial population prior to weighting. Prevalence proportions did not vary substantially across administrative health regions. There is a potential role for telephone data collection in mental health surveillance, but these results highlight some associated methodological challenges. They also call into question the importance of regional variation in mental disorder prevalence, which might otherwise have been a key advantage of telephone survey methodologies.
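
    The weighting alluded to above is commonly done by post-stratification, scaling each age-sex cell by the ratio of its population share to its sample share. The Python sketch below illustrates the calculation with invented cell shares; it is not the weighting scheme used in the Alberta survey.

```python
# Post-stratification weights for a telephone survey whose age/sex mix differs
# from the population. Cell shares are invented for illustration.
population_share = {("F", "18-39"): 0.24, ("F", "40+"): 0.27,
                    ("M", "18-39"): 0.23, ("M", "40+"): 0.26}
sample_share     = {("F", "18-39"): 0.18, ("F", "40+"): 0.33,
                    ("M", "18-39"): 0.16, ("M", "40+"): 0.33}

# Weight > 1 up-weights cells under-represented in the sample, and vice versa.
weights = {cell: population_share[cell] / sample_share[cell] for cell in population_share}
for cell, w in sorted(weights.items()):
    print(cell, round(w, 2))
```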

  7. An End-to-End DNA Taxonomy Methodology for Benthic Biodiversity Survey in the Clarion-Clipperton Zone, Central Pacific Abyss

    Directory of Open Access Journals (Sweden)

    Adrian G. Glover

    2015-12-01

    Recent years have seen increased survey and sampling expeditions to the Clarion-Clipperton Zone (CCZ), central Pacific Ocean abyss, driven by commercial interests from contractors in the potential extraction of polymetallic nodules in the region. Part of the International Seabed Authority (ISA) regulatory requirements are that these contractors undertake environmental research expeditions to their CCZ exploration claims following guidelines approved by the ISA Legal and Technical Commission (ISA, 2010). Section 9(e) of these guidelines instructs contractors to “…collect data on the sea floor communities specifically relating to megafauna, macrofauna, meiofauna, microfauna, nodule fauna and demersal scavengers”. There are a number of methodological challenges to this, including the water depth (4000–5000 m), extremely warm surface waters (~28 °C) compared to bottom water (~1.5 °C), and great distances to ports, requiring a large and long seagoing expedition with only a limited number of scientists. Both scientists and regulators have recently realized that a major gap in our knowledge of the region is the fundamental taxonomy of the animals that live there; this is essential to inform our knowledge of the biogeography, natural history and ultimately our stewardship of the region. Recognising this, the ISA is currently sponsoring a series of taxonomic workshops on the CCZ fauna and, to assist in this process, we present here a series of methodological pipelines for DNA taxonomy (incorporating both molecular and morphological data) of the macrofauna and megafauna from the CCZ benthic habitat in the recent ABYSSLINE cruise program to the UK-1 exploration claim. A major problem on recent CCZ cruises has been the collection of high-quality samples suitable for both morphology and DNA taxonomy, coupled with a workflow that ensures these data are made available. The DNA sequencing techniques themselves are relatively standard, once good samples have been

  8. Search Results | Page 110 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    In Conversation: David Brooks on Water Scarcity and Local-level Management. Research in Action. Access to ICT Networking Gender. Telecentres: From Idea to Reality in Mozambique ... Commentary: From the Rockies to the Andes — How to Manage Scarcer Water Supplies. Pages. « first · ‹ previous … 104 · 105 · 106 ...

  9. Coal resources available for development; a methodology and pilot study

    Science.gov (United States)

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated

  10. 3D Integrated Methodologies for the Documentation and the Virtual Reconstruction of an Archaeological Site

    Science.gov (United States)

    Balletti, C.; Guerra, F.; Scocca, V.; Gottardi, C.

    2015-02-01

    Highly accurate documentation and 3D reconstructions are fundamental for analyses and further interpretations in archaeology. In recent years the integrated digital survey (ground-based survey methods and UAV photogrammetry) has confirmed its central role in the documentation and comprehension of excavation contexts, thanks to instrumental and methodological developments in on-site data acquisition. The specific aim of the project reported in this paper, realized by the Laboratory of Photogrammetry of the IUAV University of Venice, is to test different acquisition systems and their effectiveness, considering each methodology individually or in combination. This research rests on the awareness that the integration of different survey methodologies can in fact increase the representative efficacy of the final representations, which are based on a wider and verified set of georeferenced metric data. In particular, the integration of methods reduces or neutralizes issues related to the survey of composite and complex objects, since the most appropriate tools and techniques can be chosen for the characteristics of each part of an archaeological site (i.e., urban structures, architectural monuments, small findings). This paper describes the experience in several sites of the municipality of Sepino (Molise, Italy), where the 3D digital acquisition of urban structures and monuments, sometimes hard to reach, was realized using active and passive techniques (range-based and image-based methods). The acquisition was planned in order to obtain not only the basic support for interpretative analysis, but also models of the actual state of conservation of the site on which reconstructive hypotheses can be based. Laser scanning data were merged with Structure from Motion point clouds into the same reference system, given by a topographical and GPS survey. These 3D models are not only the final results of the metric
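
    Merging laser scanning and Structure from Motion clouds into one reference system ultimately comes down to estimating a rigid transformation from corresponding control points. The Python sketch below shows a least-squares rigid fit (Kabsch algorithm) on synthetic targets; it stands in for, but is not, the authors' georeferencing workflow.

```python
# Sketch of registering a point cloud to surveyed control points with a
# rigid (rotation + translation) least-squares fit (Kabsch algorithm).
# Coordinates are synthetic; real workflows use surveyed GPS/total-station targets.
import numpy as np

def rigid_align(source, target):
    """Return R, t minimizing ||R @ source_i + t - target_i|| over corresponding points."""
    src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

rng = np.random.default_rng(1)
control_local = rng.uniform(0, 10, size=(5, 3))      # targets in the scan's local frame
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
control_global = control_local @ true_R.T + np.array([100.0, 200.0, 50.0])

R, t = rigid_align(control_local, control_global)
residual = np.abs(control_local @ R.T + t - control_global).max()
print("max residual (m):", round(float(residual), 6))   # ~0 for noise-free targets
```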

  11. Indexing contamination surveys

    International Nuclear Information System (INIS)

    Brown, R.L.

    1998-01-01

    The responsibility for safely managing the Tank Farms at Hanford belongs to Lockheed Martin Hanford Corporation, which is part of the six-company Project Hanford Management Team led by Fluor Daniel Hanford, Inc. These Tank Farm facilities contain numerous outdoor contamination areas, which are surveyed at a periodicity consistent with the potential radiological conditions, occupancy, and risk of changes in radiological conditions. This document describes the survey documentation and data tracking method devised to track the results of contamination surveys; this process is referred to as indexing. The indexing process takes a representative data set as an indicator of the contamination status of the facility. The data are further manipulated into a single value that can be tracked and trended using standard statistical methodology. To report meaningful data, the routine contamination surveys must be performed in a manner that allows the survey method and the data collection process to be recreated. Three key criteria are necessary to accomplish this goal: accurate maps, consistent documentation, and consistent consolidation of data. Meeting these criteria provides data of sufficient quality to be tracked. Tracking of survey data is accomplished by converting the individual survey results into a weighted value, corrected for the actual number of survey points. This information can be compared over time using standard statistical analysis to identify trends. At the Tank Farms, the need to track and trend the facility's radiological status presents unique challenges. Many of these Tank Farm facilities date back to the Second World War. They are exposed to weather extremes, plant and animal intrusion, as well as all of the normal challenges associated with handling radiological waste streams. Routine radiological surveys did not provide a radiological status adequate for continuing comparisons
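
    The indexing calculation described above (a weighted value per survey, corrected for the number of survey points, then trended statistically) can be sketched as follows; the severity weights and the two-sigma control limit are illustrative assumptions, not the Tank Farms procedure.

```python
# Sketch of an "index" for a contamination survey: weight each measurement by
# severity, correct for the number of survey points, then trend over time.
# Thresholds, weights and control limits are illustrative assumptions only.
from statistics import mean, stdev

def survey_index(measurements_dpm, weights=((1000, 1.0), (10000, 3.0))):
    """Weighted value per survey point: clean points count 0, points above the
    first threshold count 1.0, points above the second count 3.0."""
    total = 0.0
    for value in measurements_dpm:
        for threshold, weight in reversed(weights):
            if value >= threshold:
                total += weight
                break
    return total / len(measurements_dpm)      # corrected for number of points

# One index per routine survey, oldest first.
indices = [survey_index(s) for s in (
    [200, 400, 1500, 300], [250, 900, 12000, 400, 600], [180, 220, 260],
)]
baseline, spread = mean(indices[:-1]), stdev(indices[:-1])
if indices[-1] > baseline + 2 * spread:       # simple two-sigma trend check
    print("latest survey is above the control limit; investigate")
else:
    print("latest survey is within the expected range")
```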

  12. The Impact of Laws on Metric Conversion: A Survey of Selected Large U.S. Corporations.

    Science.gov (United States)

    1982-02-01

    resulting from increasing metric usage (Section 6-8). Conduct research, including appropriate surveys; publish results of such research; and recommend to... Products Case Study: There has been very little conversion of the products manufactured and marketed by this firm. According to the firm representative... sizes for marketing reasons. The industry action resulted in this firm...

  13. Hydro-Quebec's survey on outage cost in industries

    International Nuclear Information System (INIS)

    Naggar, R.

    1990-01-01

    In 1989 Hydro-Quebec completed a survey on the cost of power interruptions to its industrial customers. A total of 11,000 firms formed the base of the survey, which was reduced to 1,647 for analysis purposes. The questionnaire was designed around the concept of representation of knowledge. The costs of various situations were inferred for every enterprise on the basis of knowledge obtained through the surveys. The results of the survey describe the variation in costs of interruption as a function of time of occurrence, duration and advance notice. These costs are expressed in terms of a reference case by the equivalent hourly cost. The magnitude of the cost of the reference interruption is designated the reference cost of undelivered energy. This paper describes the methodology of the survey but does not include survey results. 4 refs., 2 tabs

  14. The Impact of the Rating Agencies' Through-the-cycle Methodology on Rating Dynamics

    NARCIS (Netherlands)

    Altman, E.I.; Rijken, H.A.

    2005-01-01

    Surveys on the use of agency credit ratings reveal that some investors believe that credit-rating agencies are relatively slow in adjusting their ratings. A well-accepted explanation for this perception on rating timeliness is the through-the-cycle methodology that agencies use. Through-the-cycle

  15. Platformed antagonism: Racist discourses on fake Muslim Facebook pages

    DEFF Research Database (Denmark)

    Farkas, Johan; Schou, Jannick; Neumayer, Christina

    2018-01-01

    This research examines how fake identities on social media create and sustain antagonistic and racist discourses. It does so by analysing 11 Danish Facebook pages, disguised as Muslim extremists living in Denmark, conspiring to kill and rape Danish citizens. It explores how anonymous content producers utilize Facebook's socio-technical characteristics to construct, what we propose to term as, platformed antagonism. This term refers to socio-technical and discursive practices that produce new modes of antagonistic relations on social media platforms. Through a discourse-theoretical analysis of posts, images, 'about' sections and user comments on the studied Facebook pages, the article highlights how antagonism between ethno-cultural identities is produced on social media through fictitious social media accounts, prompting thousands of user reactions. These findings enhance our current

  16. Scientists Admitting to Plagiarism: A Meta-analysis of Surveys.

    Science.gov (United States)

    Pupovac, Vanja; Fanelli, Daniele

    2015-10-01

    We conducted a systematic review and meta-analysis of anonymous surveys asking scientists whether they ever committed various forms of plagiarism. From May to December 2011 we searched 35 bibliographic databases, five grey literature databases and hand searched nine journals for potentially relevant studies. We included surveys that asked scientists if, in a given recall period, they had committed or knew of a colleague who committed plagiarism, and from each survey extracted the proportion of those who reported at least one case. Studies that focused on academic (i.e. student) plagiarism were excluded. Literature searches returned 12,460 titles from which 17 relevant survey studies were identified. Meta-analysis of studies reporting committed (N = 7) and witnessed (N = 11) plagiarism yielded a pooled estimate of, respectively, 1.7% (95% CI 1.2-2.4) and 30% (95% CI 17-46). Basic methodological factors, including sample size, year of survey, delivery method, and whether survey questions were explicit rather than indirect, made a significant difference to survey results. Even after controlling for these methodological factors, between-study differences in admission rates were significantly above those expected by sampling error alone and remained largely unexplained. Despite several limitations of the data and of this meta-analysis, we draw three robust conclusions: (1) The rate at which scientists report knowing a colleague who committed plagiarism is higher than for data fabrication and falsification; (2) The rate at which scientists report knowing a colleague who committed plagiarism is correlated to that of fabrication and falsification; (3) The rate at which scientists admit having committed either form of misconduct (i.e. fabrication, falsification and plagiarism) in surveys has declined over time.
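
    The pooled rates and confidence intervals quoted above come from a meta-analysis of proportions. The Python sketch below shows one common way to pool survey proportions (inverse-variance weighting on the logit scale, then back-transforming); the input counts are invented and the authors' model, such as a random-effects model, may differ.

```python
# Fixed-effect pooling of survey proportions on the logit scale, as a sketch of
# how admission rates from several surveys can be combined. Inputs are invented;
# the study's own model (e.g. random effects) may differ.
import math

surveys = [(4, 230), (2, 180), (7, 460), (3, 155)]   # (admitted plagiarism, sample size)

weights, logits = [], []
for events, n in surveys:
    p = (events + 0.5) / (n + 1.0)                   # continuity correction
    logits.append(math.log(p / (1 - p)))
    var = 1.0 / (events + 0.5) + 1.0 / (n - events + 0.5)
    weights.append(1.0 / var)

pooled_logit = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
to_prop = lambda x: 1.0 / (1.0 + math.exp(-x))       # inverse logit

print(f"pooled rate: {to_prop(pooled_logit):.1%} "
      f"(95% CI {to_prop(pooled_logit - 1.96 * se):.1%}-"
      f"{to_prop(pooled_logit + 1.96 * se):.1%})")
```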

  17. Supply chain simulation tools and techniques: a survey

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    The main contribution of this paper is twofold: it surveys different types of simulation for supply chain management; it discusses several methodological issues. These different types of simulation are spreadsheet simulation, system dynamics, discrete-event simulation and business games. Which

  18. Web-based surveys as an alternative to traditional mail methods.

    Science.gov (United States)

    Fleming, Christopher M; Bowden, Mark

    2009-01-01

    Environmental economists have long used surveys to gather information about people's preferences. A recent innovation in survey methodology has been the advent of web-based surveys. While the Internet appears to offer a promising alternative to conventional survey administration modes, concerns exist over potential sampling biases associated with web-based surveys and the effect these may have on valuation estimates. This paper compares results obtained from a travel cost questionnaire of visitors to Fraser Island, Australia, that was conducted using two alternate survey administration modes; conventional mail and web-based. It is found that response rates and the socio-demographic make-up of respondents to the two survey modes are not statistically different. Moreover, both modes yield similar consumer surplus estimates.

  19. Web-page Prediction for Domain Specific Web-search using Boolean Bit Mask

    OpenAIRE

    Sinha, Sukanta; Duttagupta, Rana; Mukhopadhyay, Debajyoti

    2012-01-01

    A search engine is a Web-page retrieval tool, and Web searchers save time by using an efficient one. To improve the performance of the search engine, we are introducing a unique mechanism which will give Web searchers more prominent search results. In this paper, we are going to discuss a domain-specific Web search prototype which will generate the predicted Web-page list for a user-given search string using a Boolean bit mask.
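
    The Boolean bit mask idea can be read as assigning each domain term one bit, encoding pages and queries as integers, and filtering candidates with bitwise operations. The Python sketch below illustrates that reading with an invented vocabulary and page set; it is not the authors' prototype.

```python
# Generic illustration of Boolean bit-mask filtering for a domain-specific
# search: each domain term owns one bit, pages and queries become bit masks,
# and candidate pages are those whose mask covers the query mask.
# Vocabulary and pages are invented; this is not the authors' prototype.
domain_terms = ["survey", "methodology", "sampling", "questionnaire"]
BIT = {term: 1 << i for i, term in enumerate(domain_terms)}

def mask(words):
    """Fold the recognized domain terms of a text into a single bit mask."""
    m = 0
    for w in words:
        m |= BIT.get(w.lower(), 0)
    return m

pages = {
    "pageA.html": "survey sampling design and questionnaire layout",
    "pageB.html": "web survey methodology overview",
    "pageC.html": "history of the printing press",
}
page_masks = {url: mask(text.split()) for url, text in pages.items()}

query_mask = mask("survey methodology".split())
predicted = [url for url, m in page_masks.items() if m & query_mask == query_mask]
print(predicted)                      # ['pageB.html']
```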

  20. LOLing at tragedy: Facebook trolls, memorial pages and resistance to grief online

    OpenAIRE

    Phillips, Whitney

    2011-01-01

    This paper examines the emergence of organized trolling behaviors on Facebook, specifically in relation to memorial groups and fan pages. In addition to mapping the development of RIP trolling — in which online instigators post abusive comments and images onto pages created for and dedicated to the deceased — the paper also examines the highly contentious and ultimately parasitic relationship(s) between memorial trolls, Facebook’s social networking platform and mainstream media.