WorldWideScience

Sample records for survey methodology page

  1. Deriving Dynamics of Web Pages: A Survey

    OpenAIRE

    Oita, Marilena; Senellart, Pierre

    2011-01-01

    International audience; The World Wide Web is dynamic by nature: content is continuously added, deleted, or changed, which makes it challenging for Web crawlers to keep up-to-date with the current version of a Web page, all the more so since not all apparent changes are significant ones. We review major approaches to change detection in Web pages and extraction of temporal properties (especially, timestamps) of Web pages. We focus our attention on techniques and systems that have been propose...

  2. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

NASA regulations require that a formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple methods to complex network-based simulations, were surveyed. A Program Risk Analysis Handbook was prepared to provide both analysts and managers with a guide for selecting the most appropriate technique.

  3. Survey Page Length and Progress Indicators: What Are Their Relationships to Item Nonresponse?

    Science.gov (United States)

    Bowman, Nicholas A.; Herzog, Serge; Sarraf, Shimon; Tukibayeva, Malika

    2014-01-01

    The popularity of online student surveys has been associated with greater item nonresponse. This chapter presents research aimed at exploring what factors might help minimize item nonresponse, such as altering online survey page length and using progress indicators.

  4. Sampling survey methodology issues of SBS- survey

    OpenAIRE

    Liljana Boci; Elona Berberi

    2015-01-01

This paper aims to provide insight into what is required to build efficient, high-quality business statistics from sample survey procedures, and into the effective and appropriate use of survey data in analysis. It gives a general overview of what is required to obtain a good survey estimate. It shows in practice how to estimate characteristics of the population in SBS, considering weighting, non-response adjustments, post-stratification, estimation of population totals, the ...

  5. Sampling survey methodology issues of SBS- survey

    Directory of Open Access Journals (Sweden)

    Liljana Boci

    2015-07-01

Full Text Available This paper aims to provide insight into what is required to build efficient, high-quality business statistics from sample survey procedures, and into the effective and appropriate use of survey data in analysis. It gives a general overview of what is required to obtain a good survey estimate. It shows in practice how to estimate characteristics of the population in SBS, considering weighting, non-response adjustments, post-stratification, estimation of population totals, the identification and treatment of outliers, and analysis of the coefficient of variation. It identifies sources of error and gives recommendations on how to reduce them through sample survey techniques.
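    The estimation steps named in this abstract (design weighting, population totals, coefficient of variation) can be sketched in a few lines. The strata, sizes, and revenue figures below are entirely hypothetical, purely to illustrate the mechanics:

    ```python
    import math

    # Hypothetical stratified business sample: per stratum, population size N,
    # sample size n, and sampled revenue values.  Design weight = N / n.
    strata = {
        "small": {"N": 900, "n": 3, "values": [10.0, 12.0, 8.0]},
        "large": {"N": 100, "n": 2, "values": [200.0, 240.0]},
    }

    # Horvitz-Thompson-style estimate of the population total
    total = sum(s["N"] / s["n"] * v for s in strata.values() for v in s["values"])

    # Stratified variance estimate (ignoring the finite population correction
    # for brevity): sum over strata of N^2 * s^2 / n
    var = 0.0
    for s in strata.values():
        m = sum(s["values"]) / s["n"]
        s2 = sum((v - m) ** 2 for v in s["values"]) / (s["n"] - 1)
        var += s["N"] ** 2 * s2 / s["n"]

    cv = math.sqrt(var) / total  # coefficient of variation of the estimated total
    ```

    Non-response adjustment and post-stratification would further rescale these design weights so that weighted sample counts match known population margins.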

  6. The German SAVE survey: documentation and methodology

    OpenAIRE

    Schunk, Daniel

    2007-01-01

    The purpose of this document is to describe methodological details of the German SAVE survey and to provide users of SAVE with all necessary information for working with the publicly available SAVE dataset.

  7. Heuristic evaluation of paper-based Web pages: a simplified inspection usability methodology.

    Science.gov (United States)

    Allen, Mureen; Currie, Leanne M; Bakken, Suzanne; Patel, Vimla L; Cimino, James J

    2006-08-01

    Online medical information, when presented to clinicians, must be well-organized and intuitive to use, so that the clinicians can conduct their daily work efficiently and without error. It is essential to actively seek to produce good user interfaces that are acceptable to the user. This paper describes the methodology used to develop a simplified heuristic evaluation (HE) suitable for the evaluation of screen shots of Web pages, the development of an HE instrument used to conduct the evaluation, and the results of the evaluation of the aforementioned screen shots. In addition, this paper presents examples of the process of categorizing problems identified by the HE and the technological solutions identified to resolve these problems. Four usability experts reviewed 18 paper-based screen shots and made a total of 108 comments. Each expert completed the task in about an hour. We were able to implement solutions to approximately 70% of the violations. Our study found that a heuristic evaluation using paper-based screen shots of a user interface was expeditious, inexpensive, and straightforward to implement.

  8. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of)]; Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)]

    2015-05-15

Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method that can take advantage of these technologies is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are great, it seems less interesting from an industrial and regulatory viewpoint. The authors expect this work can contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. An overview of dynamic PSA was conducted; most methodologies share similar concepts. Among them, the DDET seems to be a backbone for most methodologies, since it can be applied to large problems. The common characteristics shared with the DDET concept are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  9. A Survey on Speech Enhancement Methodologies

    Directory of Open Access Journals (Sweden)

    Ravi Kumar. K

    2016-12-01

Full Text Available Speech enhancement is a technique that processes noisy speech signals. The aim of speech enhancement is to improve the perceived quality of speech and/or to improve its intelligibility. Due to its vast applications in mobile telephony, VoIP, hearing aids, Skype and speaker recognition, the challenges in speech enhancement have grown over the years. It is especially challenging to suppress background noise that affects human communication in noisy environments such as airports, road works, traffic, and cars. The objective of this survey paper is to outline the single-channel speech enhancement methodologies used for enhancing a speech signal corrupted with additive background noise, and also to discuss the challenges and opportunities of single-channel speech enhancement. This paper mainly focuses on transform-domain techniques and supervised (NMF, HMM) speech enhancement techniques, and gives a framework for developments in speech enhancement methodologies.
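    As a concrete instance of the transform-domain techniques this survey covers, a minimal spectral-subtraction sketch might look like the following. The naive DFT, the synthetic tone standing in for speech, and the idealized noise estimate (taken from the same noise realization) are all illustrative assumptions, not the survey's method:

    ```python
    import cmath, math, random

    def dft(x):
        # Naive O(N^2) discrete Fourier transform, for illustration only
        N = len(x)
        return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
                for k in range(N)]

    def idft(X):
        N = len(X)
        return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)).real / N
                for n in range(N)]

    random.seed(0)
    N = 64
    clean = [math.sin(2 * math.pi * 5 * n / N) for n in range(N)]  # tonal "speech"
    noise = [random.gauss(0, 0.3) for _ in range(N)]
    noisy = [c + w for c, w in zip(clean, noise)]

    # Noise magnitude estimate, as if measured during a speech pause
    noise_mag = [abs(X) for X in dft(noise)]

    # Spectral subtraction: subtract the noise magnitude, keep the noisy phase,
    # and floor negative magnitudes at zero (half-wave rectification)
    noisy_spec = dft(noisy)
    enhanced_spec = [max(abs(X) - M, 0.0) * cmath.exp(1j * cmath.phase(X))
                     for X, M in zip(noisy_spec, noise_mag)]
    enhanced = idft(enhanced_spec)

    err_noisy = sum((a - b) ** 2 for a, b in zip(noisy, clean))
    err_enh = sum((a - b) ** 2 for a, b in zip(enhanced, clean))
    ```

    Real systems work frame by frame with an FFT, smooth the noise estimate over time, and use an oversubtraction factor and spectral floor to control musical noise.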

  10. Challenges in dental statistics: survey methodology topics

    Directory of Open Access Journals (Sweden)

    Giuseppe Pizzo

    2013-12-01

Full Text Available This paper gathers some contributions concerning survey methodology in dental research, as discussed during the first Workshop of the SISMEC STATDENT working group on statistical methods and applications in dentistry, held in Ancona on 28 September 2011. The first contribution deals with the European Global Oral Health Indicators Development (EGOHID) Project, which proposed a comprehensive and standardized system of epidemiological tools (questionnaires and clinical forms) for national data collection on oral health in Europe. The second contribution regards the design and conduct of trials to evaluate the clinical efficacy and safety of toothbrushes and mouthrinses. Finally, a flexible and effective tool used to trace dental age reference charts tailored to Italian children is presented.

  11. SNE's methodological basis - web-based software in entrepreneurial surveys

    DEFF Research Database (Denmark)

    Madsen, Henning

This overhead-based paper gives an introduction to the research methodology applied in the surveys carried out in the SNE project.

  12. Basic Project Management Methodologies for Survey Researchers.

    Science.gov (United States)

    Beach, Robert H.

To be effective, project management requires heavy dependence on the documentation, list, and computational capabilities of a computerized environment. Now that microcomputers are readily available, only the rediscovery of classic project management methodology is required for improved resource allocation in small research projects. This paper provides…

  13. Physician and patient survey of allergic rhinitis: methodology.

    Science.gov (United States)

    Higgins, V; Kay, S; Small, M

    2007-01-01

    Methodology for Disease Specific Programme (DSP) surveys designed by Adelphi Group Products is used each year to survey patients and physicians on their perceptions of treatment effectiveness, symptoms and impact of diseases. These point-in-time surveys, conducted in the USA and Europe (France, Germany, Italy, Spain and UK), provide useful information on the real-world management and treatment of diseases. This paper describes the methodology for the DSP survey in allergic rhinitis, detailing the preparation of materials, recruitment of physicians, data collection and data management.

  14. Teaching Labor Market Survey Methodology in Rehabilitation Counseling

    Science.gov (United States)

    Barros-Bailey, Mary

    2012-01-01

    Labor Market Survey (LMS) and labor market analysis knowledge and methodologies are minimum competencies expected of rehabilitation counselors through credentialing and accreditation boards. However, LMS knowledge and methodology is an example of a contemporary oral tradition that is universally recognized in rehabilitation and disability services…

  15. A survey of decision tree classifier methodology

    Science.gov (United States)

    Safavian, S. R.; Landgrebe, David

    1991-01-01

Decision tree classifiers (DTCs) are used successfully in many diverse areas such as radar signal classification, character recognition, remote sensing, medical diagnosis, expert systems, and speech recognition. Perhaps the most important feature of DTCs is their capability to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution that is often easier to interpret. A survey is presented of current methods for DTC design and of various outstanding issues. After considering potential advantages of DTCs over single-stage classifiers, the subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed.
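    The core idea, breaking a complex decision into a sequence of simple threshold tests, can be illustrated with a toy split search. The dataset and the Gini criterion below are illustrative assumptions, not taken from the survey:

    ```python
    # Toy one-feature dataset: (feature_value, class_label)
    data = [(1.0, 0), (1.5, 0), (2.0, 0), (3.5, 1), (4.0, 1), (4.5, 1)]

    def gini(labels):
        # Gini impurity for binary labels: 2 * p * (1 - p)
        if not labels:
            return 0.0
        p = sum(labels) / len(labels)
        return 2 * p * (1 - p)

    def best_split(data):
        # Try a threshold between each adjacent pair of sorted feature values
        # and keep the one with the lowest weighted Gini impurity.
        data = sorted(data)
        best_score, best_t = float("inf"), None
        for i in range(len(data) - 1):
            t = (data[i][0] + data[i + 1][0]) / 2
            left = [y for x, y in data if x <= t]
            right = [y for x, y in data if x > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(data)
            if score < best_score:
                best_score, best_t = score, t
        return best_t

    threshold = best_split(data)  # each internal tree node repeats this search
    ```

    A full DTC recursively applies this node-level search to the left and right partitions until a stopping criterion (purity, depth, or minimum node size) is met.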

  16. TESTS AND METHODOLOGIES FOR THE SURVEY OF NARROW SPACES

    Directory of Open Access Journals (Sweden)

    L. Perfetti

    2017-02-01

Full Text Available The research illustrated in this article aimed at identifying a good standard methodology for surveying very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today’s era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural and archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in field of view compared with rectilinear lenses. This advantage alone can be crucial to reduce the total number of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral staircase located in the Duomo di Milano, with a total height of 25 metres and characterized by a narrow walkable space about 70 centimetres wide.
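    The paper's general GSD formulation is not reproduced in this abstract. As a hedged illustration only, the following sketch compares per-pixel ground sampling under an assumed equidistant fisheye model against a rectilinear lens; the camera parameters are invented, and the footprint is taken on a surface perpendicular to the line of sight:

    ```python
    import math

    # Assumed camera (hypothetical): 3.9 um pixel pitch, 8 mm fisheye lens
    pixel_pitch = 3.9e-6  # metres
    focal = 8e-3          # metres

    def gsd_equidistant(distance, theta):
        # Equidistant fisheye projection r = f * theta: the angular size of
        # one pixel is constant (p / f), so GSD grows only with distance.
        return distance * pixel_pitch / focal

    def gsd_rectilinear(distance, theta):
        # Rectilinear projection r = f * tan(theta): one pixel spans an angle
        # of roughly (p / f) * cos(theta)**2, shrinking off-axis.
        return distance * (pixel_pitch / focal) * math.cos(theta) ** 2

    d = 0.7  # metres: roughly the walkable width quoted for the staircase
    g = gsd_equidistant(d, math.radians(60))  # ~0.34 mm per pixel
    ```

    The point of such a formulation is that, once GSD can be predicted for a given projection, the camera-to-surface distances and image overlap of a narrow-space survey can be planned in advance rather than by trial and error.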

  17. 3-D Survey Applied to Industrial Archaeology by Tls Methodology

    Science.gov (United States)

    Monego, M.; Fabris, M.; Menin, A.; Achilli, V.

    2017-05-01

This work describes the three-dimensional survey of the "Ex Stazione Frigorifera Specializzata": initially used for agricultural storage, over the years it was allocated to different uses until complete neglect. The historical relevance and the architectural heritage that this building represents prompted the start of a recent renovation and functional restoration project. In this regard, a global 3-D survey was necessary, based on the application and integration of different geomatic methodologies (mainly terrestrial laser scanning, classical topography, and GNSS). The acquisition of point clouds was performed using different laser scanners, with time-of-flight (TOF) and phase-shift technologies for the distance measurements. The topographic reference network, needed to align the scans in the same system, was measured with a total station. For the complete survey of the building, 122 scans were acquired and 346 targets were measured from 79 vertices of the reference network. Moreover, 3 vertices were measured with GNSS methodology in order to georeference the network. For the detailed survey of the machine room, 14 scans with 23 targets were executed. The global 3-D model of the building has less than one centimetre of alignment error (for the machine room the alignment error is no greater than 6 mm) and was used to extract products such as longitudinal and transversal sections, plans, architectural perspectives, and virtual scans. A complete spatial knowledge of the building is obtained from the processed data, providing basic information for the restoration project, structural analysis, and the valorization of industrial and architectural heritage.

  18. 3-D SURVEY APPLIED TO INDUSTRIAL ARCHAEOLOGY BY TLS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    M. Monego

    2017-05-01

Full Text Available This work describes the three-dimensional survey of the "Ex Stazione Frigorifera Specializzata": initially used for agricultural storage, over the years it was allocated to different uses until complete neglect. The historical relevance and the architectural heritage that this building represents prompted the start of a recent renovation and functional restoration project. In this regard, a global 3-D survey was necessary, based on the application and integration of different geomatic methodologies (mainly terrestrial laser scanning, classical topography, and GNSS). The acquisition of point clouds was performed using different laser scanners, with time-of-flight (TOF) and phase-shift technologies for the distance measurements. The topographic reference network, needed to align the scans in the same system, was measured with a total station. For the complete survey of the building, 122 scans were acquired and 346 targets were measured from 79 vertices of the reference network. Moreover, 3 vertices were measured with GNSS methodology in order to georeference the network. For the detailed survey of the machine room, 14 scans with 23 targets were executed. The global 3-D model of the building has less than one centimetre of alignment error (for the machine room the alignment error is no greater than 6 mm) and was used to extract products such as longitudinal and transversal sections, plans, architectural perspectives, and virtual scans. A complete spatial knowledge of the building is obtained from the processed data, providing basic information for the restoration project, structural analysis, and the valorization of industrial and architectural heritage.

  19. Methodology of Global Adult Tobacco Survey (GATS), Malaysia, 2011.

    Science.gov (United States)

    Omar, Azahadi; Yusoff, Muhammad Fadhli Mohd; Hiong, Tee Guat; Aris, Tahir; Morton, Jeremy; Pujari, Sameer

Malaysia participated in the second phase of the Global Adult Tobacco Survey (GATS) in 2011. GATS, a new component of the Global Tobacco Surveillance System, is a nationally representative household survey of adults 15 years old or above. The objectives of GATS Malaysia were to (i) systematically monitor tobacco use among adults and track key indicators of tobacco control and (ii) track the implementation of some of the Framework Convention on Tobacco Control (FCTC)-recommended demand-related policies. GATS Malaysia 2011 was a nationwide cross-sectional survey using multistage stratified sampling to select 5112 nationally representative households. One individual aged 15 years or older was randomly chosen from each selected household and interviewed using a handheld device. The GATS Core Questionnaire with optional questions was pre-tested and uploaded onto handheld devices after repeated quality-control processes. Data collectors were trained through centralized training. Manuals and a picture book were prepared to aid in the training of data collectors and during data collection. Field-level data were aggregated on a daily basis and analysed twice a week. Quality controls were instituted to ensure collection of high-quality data. Sample weighting and analysis were conducted with the assistance of researchers from the Centers for Disease Control and Prevention, Atlanta, USA. GATS Malaysia achieved a total response rate of 85.3% from the 5112 adults surveyed. The majority of respondents were 25-44 years old and Malay. The robust methodology used in GATS Malaysia provides national estimates of tobacco use classified by socio-demographic characteristics, and reliable data on various dimensions of tobacco control.

  20. Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches

    Science.gov (United States)

    Farassat, Fereidoun; Casper, Jay H.

    2006-01-01

In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that, for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need to develop further for this method to become useful. Nonetheless, the authors propose that the method based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.

  1. Methodological design of the National Health and Nutrition Survey 2016

    Directory of Open Access Journals (Sweden)

    Martín Romero-Martínez

    2017-05-01

Full Text Available Objective. To describe the design methodology of the halfway National Health and Nutrition Survey (Ensanut-MC) 2016. Materials and methods. The Ensanut-MC is a national probabilistic survey whose target population is the inhabitants of private households in Mexico. The sample size was determined to allow inferences on urban and rural areas in four regions. We describe the main design elements: target population, topics of study, sampling procedure, measurement procedure and logistics organization. Results. A final sample of 9 479 completed household interviews and a sample of 16 591 individual interviews were obtained. The response rate for households was 77.9%, and the response rate for individuals was 91.9%. Conclusions. The Ensanut-MC probabilistic design allows valid statistical inferences about parameters of interest for Mexico's public health and nutrition, specifically on overweight, obesity and diabetes mellitus. The updated information also supports the monitoring, updating and formulation of new policies and priority programs.

  2. Introduction pages

    OpenAIRE

    2015-01-01

Introduction pages and table of contents for the issue. Research articles include: "Insulin Requirements in Relation to Insulin Pump Indications in Type 1 Diabetes" (Gabriela GHIMPEŢEANU, Silvia Ş. IANCU, Gabriela ROMAN, Anca M. ALIONESCU, pp. 259-263); "Comparative Antibacterial Efficacy of Vitellaria paradoxa (Shea Butter Tree) Extracts Against Some Clinical Bacterial Isolates" (Kamoldeen Abiodun AJIJOLAKEWU, Fola Jose AWARUN, pp. 264-268); "A Murine Effort Model for Studying the Influence of Trichinella on Muscular Activity of Mice" ...

  3. SurveyWiz and factorWiz: JavaScript Web pages that make HTML forms for research on the Internet.

    Science.gov (United States)

    Birnbaum, M H

    2000-05-01

    SurveyWiz and factorWiz are Web pages that act as wizards to create HTML forms that enable one to collect data via the Web. SurveyWiz allows the user to enter survey questions or personality test items with a mixture of text boxes and scales of radio buttons. One can add demographic questions of age, sex, education, and nationality with the push of a button. FactorWiz creates the HTML for within-subjects, two-factor designs as large as 9 x 9, or higher order factorial designs up to 81 cells. The user enters levels of the row and column factors, which can be text, images, or other multimedia. FactorWiz generates the stimulus combinations, randomizes their order, and creates the page. In both programs HTML is displayed in a window, and the user copies it to a text editor to save it. When uploaded to a Web server and supported by a CGI script, the created Web pages allow data to be collected, coded, and saved on the server. These programs are intended to assist researchers and students in quickly creating studies that can be administered via the Web.
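    A rough sketch of what such a wizard generates, a radio-button question plus a randomized two-factor stimulus list, might look like the following. The function names and markup are illustrative, not SurveyWiz's or factorWiz's actual output:

    ```python
    import random

    def radio_question(name, prompt, options):
        """Emit an HTML radio-button group in the spirit of SurveyWiz output."""
        inputs = "\n".join(
            f'<label><input type="radio" name="{name}" value="{i}"> {opt}</label>'
            for i, opt in enumerate(options, start=1)
        )
        return f"<p>{prompt}</p>\n{inputs}"

    def factorial_stimuli(rows, cols, seed=None):
        """Cross two factor-level lists and randomize presentation order,
        as factorWiz does for within-subjects two-factor designs."""
        cells = [(r, c) for r in rows for c in cols]
        random.Random(seed).shuffle(cells)
        return cells

    html = radio_question("q1", "How easy was the page to use?",
                          ["Very easy", "Easy", "Hard", "Very hard"])
    cells = factorial_stimuli(["low", "high"], ["A", "B", "C"], seed=1)
    ```

    As the abstract notes, the generated markup only becomes a working study once it is uploaded to a web server with a CGI script that receives and stores the form submissions.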

  4. Impact of methodological "shortcuts" in conducting public health surveys: Results from a vaccination coverage survey

    Directory of Open Access Journals (Sweden)

    Luman Elizabeth T

    2008-03-01

Full Text Available Abstract. Background: Lack of methodological rigor can cause survey error, leading to biased results and a suboptimal public health response. This study focused on the potential impact of 3 methodological "shortcuts" pertaining to field surveys: relying on a single source for critical data, failing to repeatedly visit households to improve response rates, and excluding remote areas. Methods: In a vaccination coverage survey of young children conducted in the Commonwealth of the Northern Mariana Islands in July 2005, 3 sources of vaccination information were used, multiple follow-up visits were made, and all inhabited areas were included in the sampling frame. Results are calculated with and without these strategies. Results: Most children had at least 2 sources of data; vaccination coverage estimated from any single source was substantially lower than from all sources combined. Eligibility was ascertained for 79% of households after the initial visit and for 94% of households after follow-up visits; vaccination coverage rates were similar with and without follow-up. Coverage among children on remote islands differed substantially from that of their counterparts on the main island, indicating a programmatic need for locality-specific information; excluding remote islands from the survey would have had little effect on overall estimates due to their small populations and divergent results. Conclusion: Strategies to reduce sources of survey error should be maximized in public health surveys. The impact of the 3 strategies illustrated here will vary depending on the primary outcomes of interest and local situations. Survey limitations such as the potential for error should be well documented, and the likely direction and magnitude of bias should be considered.
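    The any-source rule described in the results, counting a child as vaccinated if any record shows a vaccination, reduces to a set union across sources. The tiny records below are invented purely to illustrate why a single source understates coverage:

    ```python
    # Hypothetical records: which sampled children each source shows as vaccinated
    registry = {"c1", "c2", "c5"}
    card     = {"c1", "c3", "c5", "c7"}
    provider = {"c2", "c3", "c6"}
    surveyed = {f"c{i}" for i in range(1, 11)}  # 10 children in the sample

    def coverage(vaccinated):
        """Share of surveyed children the given record set shows as vaccinated."""
        return len(vaccinated & surveyed) / len(surveyed)

    # Best single-source estimate vs the combined any-source estimate
    single_source = max(coverage(s) for s in (registry, card, provider))
    combined = coverage(registry | card | provider)  # union = any-source rule
    ```

    Because each source misses some children the others capture, the union estimate is always at least as high as any single-source estimate, mirroring the study's finding.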

  5. The XMM Cluster Survey: X-ray analysis methodology

    CERN Document Server

    Lloyd-Davies, E J; Hosmer, Mark; Mehrtens, Nicola; Davidson, Michael; Sabirli, Kivanc; Mann, Robert G; Hilton, Matt; Liddle, Andrew R; Viana, Pedro T P; Campbell, Heather C; Collins, Chris A; Dubois, E Naomi; Freeman, Peter; Hoyle, Ben; Kay, Scott T; Kuwertz, Emma; Miller, Christopher J; Nichol, Robert C; Sahlen, Martin; Stanford, S Adam; Stott, John P

    2010-01-01

The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we describe the data processing methodology applied to the 5776 XMM observations used to construct the current XCS source catalogue. A total of 3669 >4σ cluster candidates with >50 background-subtracted X-ray counts are extracted from a total non-overlapping area suitable for cluster searching of 410 deg². Of these, 1022 candidates are detected with >300 X-ray counts, and we demonstrate that robust temperature measurements can be obtained down to this count limit. We describe in detail the automated pipelines used to perform the spectral and surface brightness fitting for these sources, as well as to estimate redshifts from the X-ray data alone. A total of 517 (126) X-ray temperatures to a typical accuracy of <40 (<10) per cent have ...

  6. Reflections on International Comparative Education Survey Methodology: A Case Study of the European Survey on Language Competences

    Science.gov (United States)

    Ashton, Karen

    2016-01-01

    This paper reflects on the methodology used in international comparative education surveys by conducting a systematic review of the European Survey on Language Competences (ESLC). The ESLC was administered from February to March 2011, with final results released in June 2012. The survey tested approximately 55,000 students across 14 European…

  8. Survey of Transmission Cost Allocation Methodologies for Regional Transmission Organizations

    Energy Technology Data Exchange (ETDEWEB)

    Fink, S.; Porter, K.; Mudd, C.; Rogers, J.

    2011-02-01

    The report presents transmission cost allocation methodologies for reliability transmission projects, generation interconnection, and economic transmission projects for all Regional Transmission Organizations.

  9. Refinement of Research Surveying in Software Methodologies by Analogy: finding your patch

    Directory of Open Access Journals (Sweden)

    Eugene Doroshenko

    1999-05-01

Full Text Available To enhance research surveying in software methodologies, a model is introduced that can indicate field maturity based on vocabulary and relevant literature. This model is developed by drawing analogies with software methodologies; two analogies are used: software models and software life cycles or processes. How this model can reduce research-surveying problems for researchers is described using extracts from application results as examples. Although the model supports research surveying activities, it cannot choose the subject for the researcher.

  10. Training Activity Summary Page (TASP) Campus

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Training Activity Summary Page (formerly the Training Exit Survey Cover Page) dataset contains data about each training event. This dataset includes information...

  11. The Measure of a Nation: The USDA and the Rise of Survey Methodology

    Science.gov (United States)

    Mahoney, Kevin T.; Baker, David B.

    2007-01-01

    Survey research has played a major role in American social science. An outgrowth of efforts by the United States Department of Agriculture in the 1930s, the Division of Program Surveys (DPS) played an important role in the development of survey methodology. The DPS was headed by the ambitious and entrepreneurial Rensis Likert, populated by young…

  12. Introduction pages

    Directory of Open Access Journals (Sweden)

    Radu E. Sestras

    2015-09-01

Full Text Available Introduction pages and table of contents for the issue. Research articles: "Insulin Requirements in Relation to Insulin Pump Indications in Type 1 Diabetes" (Gabriela GHIMPEŢEANU, Silvia Ş. IANCU, Gabriela ROMAN, Anca M. ALIONESCU, pp. 259-263); "Comparative Antibacterial Efficacy of Vitellaria paradoxa (Shea Butter Tree) Extracts Against Some Clinical Bacterial Isolates" (Kamoldeen Abiodun AJIJOLAKEWU, Fola Jose AWARUN, pp. 264-268); "A Murine Effort Model for Studying the Influence of Trichinella on Muscular Activity of Mice" (Ionut MARIAN, Călin Mircea GHERMAN, Andrei Daniel MIHALCA, pp. 269-271); "Prevalence and Antibiogram of Generic Extended-Spectrum β-Lactam-Resistant Enterobacteria in Healthy Pigs" (Ifeoma Chinyere UGWU, Madubuike Umunna ANYANWU, Chidozie Clifford UGWU, Ogbonna Wilfred UGWUANYI, pp. 272-280); "Index of Relative Importance of the Dietary Proportions of Sloth Bear (Melursus ursinus) in Semi-Arid Region" (Tana P. MEWADA, pp. 281-288); "Bioaccumulation Potentials of Momordica charantia L. Medicinal Plant Grown in Lead Polluted Soil under Organic Fertilizer Amendment" (Ojo Michael OSENI, Omotola Esther DADA, Adekunle Ajayi ADELUSI, pp. 289-294); "Induced Chitinase and Chitosanase Activities in Turmeric Plants by Application of β-D-Glucan Nanoparticles" (Sathiyanarayanan ANUSUYA, Muthukrishnan SATHIYABAMA, pp. 295-298); "Present or Absent? About a Threatened Fern, Asplenium adulterinum Milde, in South-Eastern Carpathians (Romania)" (Attila BARTÓK, Irina IRIMIA, pp. 299-307); "Comparative Root and Stem Anatomy of Four Rare Onobrychis Mill. (Fabaceae) Taxa Endemic in Turkey" (Mehmet TEKİN, Gülden YILMAZ, pp. 308-312); "Propagation of Threatened Nepenthes khasiana: Methods and Precautions" (Jibankumar S. KHURAIJAM, Rup K. ROY, pp. 313-315); "Alleviate Seed Ageing Effects in Silybum marianum by Application of Hormone Seed Priming" (Seyed Ata SIADAT, Seyed Amir MOOSAVI, Mehran SHARAFIZADEH, pp. 316-321); "The Effect of Halopriming and Salicylic Acid on the Germination of Fenugreek (Trigonella foenum-graecum) under Different Cadmium

  13. Web Page Design.

    Science.gov (United States)

    Lindsay, Lorin

    Designing a web home page involves many decisions that affect how the page will look, the kind of technology required to use the page, the links the page will provide, and the kinds of patrons who can use the page. The theme of information literacy needs to be built into every web page; users need to be taught the skills of sorting and applying…

  14. A survey of load methodologies for shuttle orbiter payloads

    Science.gov (United States)

    Chen, J. C.; Garba, J. A.; Salama, M.; Trubert, M.

    1981-01-01

    Load analysis methods currently used to design planetary spacecraft to be launched on the shuttle orbiter are summarized. Experience gained from expendable launch vehicle payloads is used to develop methodologies for space shuttle orbiter payloads. The objectives for the development of a new methodology for shuttle payloads are to reduce the cost and schedule of the payload load analysis by decoupling the payload analysis from the launch vehicle to the maximum extent possible. Methods are described for payload member load estimation (obtaining upper bounds for dynamic loads) as well as for load prediction (calculating actual transient member load time histories).

  15. 2016 Workplace and Gender Relations Survey of Active Duty Members: Statistical Methodology Report

    Science.gov (United States)

    2017-03-01

    MEMBERS: STATISTICAL METHODOLOGY REPORT. Office of People Analytics (OPA), Defense Research, Surveys, and Statistics Center, 4800 Mark Center Drive... Introduction: The Defense Research, Surveys, and Statistics Center, Office of People Analytics (OPA), conducts both web-based and paper-and-pen surveys to... Research, Surveys, and Statistics Center (RSSC) resided within the Defense Manpower Data Center (DMDC). In 2016, the Defense Human Resource Activity

  16. Methodology of Layout (DIV+CSS) Applied in the Website Page Design of the China Geological Survey Bureau

    Institute of Scientific and Technical Information of China (English)

    漆海霞

    2011-01-01

    By introducing the DIV+CSS technique, this paper points out the significance of adopting it in the page design of the China Geological Survey website. It presents the design methods and layout structure applied to the website pages, and notes several issues that require attention when using DIV+CSS.

  17. Methodology of the Global Adult Tobacco Survey - 2008-2010.

    Science.gov (United States)

    Palipudi, Krishna Mohan; Morton, Jeremy; Hsia, Jason; Andes, Linda; Asma, Samira; Talley, Brandon; Caixeta, Roberta D; Fouad, Heba; Khoury, Rula N; Ramanandraibe, Nivo; Rarick, James; Sinha, Dhirendra N; Pujari, Sameer; Tursan d'Espaignet, Edouard

    2016-06-01

    In 2008, the Centers for Disease Control and Prevention (CDC) and the World Health Organization developed the Global Adult Tobacco Survey (GATS), an instrument to monitor global tobacco use and measure indicators of tobacco control. GATS, a nationally representative household survey of persons aged 15 years or older, was conducted for the first time during 2008-2010 in 14 low- and middle-income countries. In each country, GATS used a standard core questionnaire, sample design, and procedures for data collection and management and, as needed, added country-specific questions that were reviewed and approved by international experts. The core questionnaire included questions about various characteristics of the respondents, their tobacco use (smoking and smokeless), and a wide range of tobacco-related topics (cessation; secondhand smoke; economics; media; and knowledge, attitudes, and perceptions). In each country, a multistage cluster sample design was used, with households selected with probability proportionate to the size of the population. Households were chosen randomly within a primary or secondary sampling unit, and one respondent was selected at random from each household to participate in the survey. Interviewers administered the survey in the country's local language(s) using handheld electronic data collection devices. Interviews were conducted privately, and same-sex interviewers were used in countries where mixed-sex interviews would be culturally inappropriate. All 14 countries completed the survey during 2008-2010. In each country, the ministry of health was the lead coordinating agency for GATS, and the survey was implemented by national statistical organizations or surveillance institutes. This article describes the background and rationale for GATS and includes a comprehensive description of the survey methods and protocol.
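The two design features described above, probability-proportional-to-size (PPS) selection of clusters and random selection of one respondent per household, can be sketched as follows. The frame data and PSU names are hypothetical, and the systematic PPS routine is a generic textbook method, not the exact GATS implementation.

```python
import random

def pps_systematic(units, sizes, n):
    """Select n units with probability proportional to size (PPS),
    using systematic sampling along the cumulative size totals."""
    total = sum(sizes)
    step = total / n
    start = random.uniform(0, step)
    picks, cum, i = [], 0.0, 0
    for target in (start + k * step for k in range(n)):
        # Walk the cumulative totals until the target falls inside unit i.
        while cum + sizes[i] <= target:
            cum += sizes[i]
            i += 1
        picks.append(units[i])
    return picks

# Hypothetical sampling frame: (PSU name, number of households).
frame = [("psu_a", 1200), ("psu_b", 300), ("psu_c", 2500), ("psu_d", 900)]
psus = pps_systematic([u for u, _ in frame], [s for _, s in frame], 2)

# Within each selected household, one eligible member is chosen at random,
# mirroring the one-respondent-per-household rule described above.
household = ["member_1", "member_2", "member_3"]
respondent = random.choice(household)
```

Note that a PSU whose size exceeds the sampling step can be hit more than once; real designs usually treat such units as certainty selections.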

  18. Long-range oil and gas forecasting methodologies: literature survey

    Energy Technology Data Exchange (ETDEWEB)

    Cherniavsky, E.A.

    1980-08-01

    Performance of long-range energy system analyses requires the capability to project conventional domestic oil and gas supplies in the long term. The objective of the Long-range Forecasting Methodology project is to formulate an approach to this problem which will be compatible with the principal tool employed by the Energy Information Administration of the Department of Energy for long-range energy system analyses, the Long-term Energy Analysis Package (LEAP). This paper reports on projection methodologies that have appeared in the literature, evaluates them in terms of their applicability to the LEAP framework, and discusses the principal determinants of conventional domestic oil and gas supply in the long run.

  19. Telephone surveying for drug abuse: methodological issues and an application.

    Science.gov (United States)

    Frank, B

    1985-01-01

    In light of New York State's experience, it is probable that future household drug use surveys will use telephone administration. Drug use questions are not as sensitive as had been thought, and are easily administered by telephone. In addition, the lower costs, the computer-assisted capabilities, and the saving in time are some of the advantages in comparison to face-to-face surveying. In order to address the nontelephone segments of the household population--despite their declining proportion--and to improve response rates, mixed-mode interviewing may have to be considered. Given a better understanding of telephone-associated behavior and the increasing popularity of technological advances, such as the portability and mobility of phones, telephone surveying may become even more attractive in the future.

  20. Methodology of the National School-based Health Survey in Malaysia, 2012.

    Science.gov (United States)

    Yusoff, Fadhli; Saari, Riyanti; Naidu, Balkish M; Ahmad, Noor Ani; Omar, Azahadi; Aris, Tahir

    2014-09-01

    The National School-Based Health Survey 2012 was a nationwide school health survey of students in Standard 4 to Form 5 (10-17 years of age), who were schooling in government schools in Malaysia during the period of data collection. The survey comprised 3 subsurveys: the Global School Health Survey (GSHS), the Mental Health Survey, and the National School-Based Nutrition Survey. The aim of the survey was to provide data on the health status of adolescents in Malaysia toward strengthening the adolescent health program in the country. The design of the survey was created to fulfill the requirements of the 3 subsurveys. A 2-stage stratified sampling method was adopted in the sampling. The methods for data collection were via questionnaire and physical examination. The National School-Based Health Survey 2012 adopted an appropriate methodology for a school-based survey to ensure valid and reliable findings.
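A 2-stage stratified selection of the kind described, first schools within strata and then students within schools, can be sketched roughly as follows; the strata, school names, and take sizes are invented for illustration, since the survey's actual stratification details are not given here.

```python
import random

def two_stage_stratified(strata, schools_per_stratum, pupils_per_school, rng=random):
    """Stage 1: simple random sample of schools within each stratum.
    Stage 2: simple random sample of pupils within each selected school.
    `strata` maps a stratum label to {school name: list of pupil ids}."""
    sample = {}
    for stratum, schools in strata.items():
        chosen = rng.sample(sorted(schools), min(schools_per_stratum, len(schools)))
        for school in chosen:
            pupils = schools[school]
            sample[(stratum, school)] = rng.sample(pupils, min(pupils_per_school, len(pupils)))
    return sample

# Invented frame: in a real design the strata could be state x urban/rural.
frame = {
    "urban": {"school_1": list(range(40)), "school_2": list(range(35))},
    "rural": {"school_3": list(range(20)), "school_4": list(range(25))},
}
selected = two_stage_stratified(frame, schools_per_stratum=1, pupils_per_school=10)
```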

  1. Methodology of the Global Adult Tobacco Survey in China, 2010

    Institute of Scientific and Technical Information of China (English)

    Jason HSIA; Gong-Huan YANG; Qiang LI; Lin XIAO; Yan YANG; Yan-Wei WU; Samira ASMA

    2010-01-01

    @@ INTRODUCTION The Global Adult Tobacco Survey (GATS) is a component of Global Tobacco Surveillance System (GTSS) under auspices of the Bloomberg philanthropy and the Bill and Melinda Gates Foundation. GATS is a household survey with a standard protocol and its goals are to measure tobacco use, to assess changes due to policy and to facilitate cross country comparison. China is the largest consumer and producer of tobacco in the world. China was selected as one of 14 countries of high burden of tobacco use, large population, and mostly low income, to conduct the GATS.

  2. 2010 National Beneficiary Survey: Methodology and Descriptive Statistics.

    OpenAIRE

    Debra Wright; Gina Livermore; Denise Hoffman; Eric Grau; Maura Bardos

    2012-01-01

    This report presents the sampling design and data collection activities for round 4 (2010) of the Social Security Administration’s National Beneficiary Survey (NBS). It also provides descriptive statistics on working-age individuals receiving Supplemental Security Income and Social Security Disability Insurance benefits, based on the nationally representative sample from the 2010 NBS.

  3. Adaptation of the methodology of sample surveys for marketing researches

    Directory of Open Access Journals (Sweden)

    Kataev Andrey

    2015-08-01

    Full Text Available The article presents the adaptation of sample survey theory to the purposes of marketing, which makes it possible to answer the fundamental question of any marketing research: how many objects should be studied in order to draw adequate conclusions.
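The "how many objects" question is conventionally answered with the sample-size formula for estimating a proportion. A minimal sketch, assuming simple random sampling with an optional finite population correction; this is the standard textbook formula, not a method taken from the article itself.

```python
import math

def sample_size(margin, confidence_z=1.96, p=0.5, population=None):
    """Minimum sample size for estimating a proportion within +/- `margin`
    at the given confidence level; p = 0.5 is the conservative choice.
    If a finite `population` is given, apply the finite population correction."""
    n0 = confidence_z ** 2 * p * (1 - p) / margin ** 2
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

n = sample_size(0.05)  # → 385 respondents for +/- 5 points at 95% confidence
```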

  4. Qualitative response models: A survey of methodology and illustrative applications

    Directory of Open Access Journals (Sweden)

    Nojković Aleksandra

    2007-01-01

    Full Text Available This paper introduces econometric modeling with discrete (categorical) dependent variables. Such models, commonly referred to as qualitative response (QR) models, have become a standard tool of microeconometric analysis. Microeconometric research represents empirical analysis of microdata, i.e. economic information about individuals, households and firms. Microeconometrics has been most widely adopted in various fields, such as labour economics, consumer behavior, or the economics of transport. The latest research shows that this methodology can also be successfully transferred to the macroeconomic context and applied to time series and panel data analysis in a wider scope.
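A binary logit, the simplest QR model, is fitted by maximum likelihood. Below is a stdlib-only sketch using gradient ascent on synthetic micro-data; the data, learning rate, and step count are illustrative assumptions, and production work would use a dedicated econometrics package.

```python
import math
import random

def fit_logit(xs, ys, lr=0.1, steps=2000):
    """Fit P(y=1|x) = 1 / (1 + exp(-(a + b*x))) by maximising the
    log-likelihood with plain averaged gradient ascent."""
    a = b = 0.0
    n = len(xs)
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += y - p        # d log-likelihood / d a
            gb += (y - p) * x  # d log-likelihood / d b
        a += lr * ga / n
        b += lr * gb / n
    return a, b

# Hypothetical micro-data: a binary choice whose probability rises with x.
random.seed(1)
xs = [random.uniform(0.0, 4.0) for _ in range(200)]
ys = [1 if random.random() < 1.0 / (1.0 + math.exp(-(x - 2.0))) else 0 for x in xs]
a, b = fit_logit(xs, ys)  # b should come out positive
```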

  5. 2014 Service Academy Gender Relations Survey: Statistical Methodology Report

    Science.gov (United States)

    2014-12-12

    Survey (2014 SAGR) is designed to track sexual assault and sexual harassment issues at the Service Academies. U.S. Code 10, as amended by Section 532... requested that they be included, beginning in 2008, in order to evaluate and improve their programs addressing sexual assault and sexual harassment... questions on stalking, sexual harassment and its component behaviors, sexist behavior, and prior experiences of unwanted sexual contact. In 2014

  6. The use of social surveys in translation studies: methodological characteristics

    OpenAIRE

    Kuznik, Anna; Hurtado Albir, Amparo; Espinal Berenguer, Anna; Andrews, Mark

    2010-01-01

    Translation is an activity carried out by professionals – in some cases after a period of formal training – who are employed or self-employed, and whose work is destined for translation users. Translators, translator trainees, employers of translators, and translation users are four clearly defined social groups within the translation industry that may be the subject of study using one of the methods most frequently used within the field of social sciences: the social survey. This paper prese...

  7. Sleep Apnea Information Page

    Science.gov (United States)

    ... Institutes of Health (NIH) conduct research related to sleep apnea in laboratories at the NIH, and also ...

  8. A methodological approach based on indirect sampling to survey the homeless people

    OpenAIRE

    Claudia De Vitiis; Stefano Falorsi; Francesca Inglese; Alessandra Masi; Nicoletta Pannuzi; Monica Russo

    2014-01-01

    The Italian National Institute of Statistics carried out the first survey on the homeless population. The survey aims at estimating the unknown size and some demographic and social characteristics of this population. The methodological strategy used to investigate the homeless population could not follow the standard approaches of official statistics, which are usually based on the use of population lists. The sample strategy for the homeless survey refers to the theory of indirect sampling, based on the use of...
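Indirect sampling typically estimates target-population weights with a weight-share rule: a person inherits weight from the sampled frame units (for example, services such as shelters and soup kitchens) they are linked to. A minimal sketch of the basic generalised weight-share idea, with hypothetical services and link data; the article's exact estimator is not reproduced here.

```python
def weight_share(frame_weights, links):
    """Basic generalised weight-share estimator for indirect sampling:
    each target unit receives the sum of the design weights of the
    sampled frame units it is linked to, divided by its total link count.
    `frame_weights`: design weight of each *sampled* frame unit.
    `links`: target unit -> list of all frame units it is linked to."""
    weights = {}
    for target, linked in links.items():
        # Unsampled frame units contribute weight 0 but still count as links.
        weights[target] = sum(frame_weights.get(u, 0.0) for u in linked) / len(linked)
    return weights

# Hypothetical data: two sampled services with known design weights;
# each contacted person is linked to every service he or she uses.
sampled = {"shelter_a": 2.0, "kitchen_b": 4.0}
links = {
    "person_1": ["shelter_a"],
    "person_2": ["shelter_a", "kitchen_b"],
    "person_3": ["kitchen_b", "kitchen_c"],  # kitchen_c was not sampled
}
weights = weight_share(sampled, links)  # person_2 -> (2 + 4) / 2 = 3.0
```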

  9. Design of Educational Web Pages

    Science.gov (United States)

    Galan, Jose Gomez; Blanco, Soledad Mateos

    2004-01-01

    The methodological characteristics of teaching in primary and secondary education make it necessary to revise the pedagogical and instructive lines with which to introduce the new Information and Communication Technologies into the school context. The construction of Web pages that can be used to improve student learning is, therefore, fundamental…

  10. Literature Survey of previous research work in Models and Methodologies in Project Management

    OpenAIRE

    Ravinder Singh; Dr. Kevin Lano

    2014-01-01

    This paper provides a survey of the existing literature and research carried out in the area of project management using different models, methodologies, and frameworks. Project Management (PM) broadly means programme management, portfolio management, practice management, project management office, etc. A project management system has a set of processes, procedures, framework, methods, tools, methodologies, techniques, resources, etc. which are used to manage the full life cycle of projects. ...

  11. La création de pages Internet dans le cadre de parcours individualisé : la place de la méthodologie dans l'apprentissage des langues. Creating Internet Pages for L2 Self-learning : Where Methodology Fits In

    Directory of Open Access Journals (Sweden)

    Joline Boulon

    1998-12-01

    Full Text Available Online language projects suggest worthwhile steps to take for overall comprehension work, which is why we have chosen to use projects to develop a specific methodology for the tasks that we have our learners do. However, comprehensible input is insufficient for learning all of the language skills. Work that never demands an analysis of language structure or form cannot make learners sufficiently aware of the language for their interlanguage to develop. It seems to us that projects should include a very specific methodology that allows learners to work on precision as well as overall comprehension. We discuss an example of the pages we are in the process of creating on the Internet.

  12. 2015 Workplace and Gender Relations Survey of Reserve Component Members: Statistical Methodology Report

    Science.gov (United States)

    2016-03-17

    Component Members (2015 WGRR). Mr. Tim Markham, mathematical statistician within the Statistical Methods Branch, used the DMDC Sampling Tool to... and the relationship (covariance) between response propensities and the estimated statistics (i.e., sexual assault rate), and takes the following... 2015 Workplace and Gender Relations Survey of Reserve Component Members Statistical Methodology Report. Additional copies of this report

  13. The National Aviation Operational Monitoring Service (NAOMS): A Documentation of the Development of a Survey Methodology

    Science.gov (United States)

    Connors, Mary M.; Mauro, Robert; Statler, Irving C.

    2012-01-01

    The National Aviation Operational Monitoring Service (NAOMS) was a research project under NASA's Aviation Safety Program during the years 2000 to 2005. The purpose of this project was to develop a methodology for gaining reliable information on changes over time in the rates-of-occurrence of safety-related events, as a means of assessing the safety of the national airspace. The approach was a scientifically designed survey of the operators of the aviation system concerning their safety-related experiences. This report presents the results of the methodology developed and a demonstration of the NAOMS concept through a survey of nearly 20,000 randomly selected air-carrier pilots. Results give evidence that the NAOMS methodology can provide a statistically sound basis for evaluating trends in incidents that could compromise safety. The approach and results are summarized in the report; supporting documentation and complete analyses of results are presented in 14 appendices.

  14. Musculoskeletal impairment survey in Rwanda: Design of survey tool, survey methodology, and results of the pilot study (a cross-sectional survey)

    Directory of Open Access Journals (Sweden)

    Simms Victoria

    2007-03-01

    Full Text Available Abstract. Background: Musculoskeletal impairment (MSI) is an important cause of morbidity and mortality worldwide, especially in developing countries. Prevalence studies for MSI in the developing world have used varying methodologies and are seldom directly comparable. This study aimed to develop a new tool to screen for and diagnose MSI and to pilot test the methodology for a national survey in Rwanda. Methods: A 7-question screening tool to identify cases of MSI was developed through literature review and discussions with healthcare professionals. To validate the tool, trained rehabilitation technicians screened 93 previously identified gold-standard 'cases' and 86 'non-cases'. Sensitivity, specificity and positive predictive value were calculated. A standardised examination protocol was developed to determine the aetiology and diagnosis of MSI for those who fail the screening test. For the national survey in Rwanda, multistage cluster random sampling, with probability-proportional-to-size procedures, will be used for the selection of a cross-sectional, nationally representative sample of the population. Households to be surveyed will be chosen through compact segment sampling, and all individuals within chosen households will be screened. A pilot survey of 680 individuals was conducted using the protocol. Results: The screening tool demonstrated 99% sensitivity and 97% specificity for MSI, and a positive predictive value of 98%. During the pilot study, 468 out of 680 eligible subjects (69%) were screened. 45 diagnoses were identified in 38 persons who were cases of MSI. The subjects were grouped into categories based on diagnostic subgroups: congenital (1), traumatic (17), infective (2), neurological (6) and other acquired (19). They were also separated into mild (42.1%), moderate (42.1%) and severe (15.8%) cases, using an operational definition derived from the World Health Organisation's International Classification of Functioning, Disability and Health
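The validation statistics above come from a standard 2x2 comparison against the gold standard. A small sketch; the cell counts below are hypothetical, chosen only to be consistent with 93 cases and 86 non-cases, and are not taken from the study.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value (PPV) of a
    screening tool validated against a gold standard."""
    return {
        "sensitivity": tp / (tp + fn),  # share of true cases flagged
        "specificity": tn / (tn + fp),  # share of non-cases passed
        "ppv": tp / (tp + fp),          # share of flagged subjects who are true cases
    }

# Hypothetical counts: 92 + 1 = 93 gold-standard cases, 2 + 84 = 86 non-cases.
m = screening_metrics(tp=92, fp=2, fn=1, tn=84)
```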

  15. Survey of Web Page Reconstruction Technology for Mobile Terminals

    Institute of Scientific and Technical Information of China (English)

    史晶; 吴庆波; 杨沙洲

    2011-01-01

    Browsing traditional Web pages on mobile terminals suffers from problems such as unreasonable page layout, poor screen adaptation, and abundant noise information, which seriously degrade the display. Web page reconstruction technology can solve these problems effectively by extracting and recombining page information, giving mobile users a rich page experience. This paper discusses Web page reconstruction technology from the perspectives of page extraction and combination, analyzes the applicability and complexity of the related techniques, and concludes with the hot issues in this field.
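One common family of page-extraction techniques scores candidate blocks by how much text they contain, so that short navigation and ad blocks are discarded before the page is recomposed for a small screen. A crude stdlib sketch of that text-density idea, not a specific method from the surveyed literature:

```python
from html.parser import HTMLParser

class BlockTextExtractor(HTMLParser):
    """Collect the text inside each block-level element and keep the
    blocks, so the caller can pick the one with the most text."""
    BLOCKS = {"div", "p", "article", "section"}

    def __init__(self):
        super().__init__()
        self.blocks, self._buf, self._depth = [], [], 0

    def handle_starttag(self, tag, attrs):
        if tag in self.BLOCKS:
            self._depth += 1
            self._buf.append([])

    def handle_endtag(self, tag):
        if tag in self.BLOCKS and self._depth:
            self._depth -= 1
            text = " ".join(self._buf.pop()).strip()
            if text:
                self.blocks.append(text)

    def handle_data(self, data):
        if self._depth and data.strip():
            self._buf[-1].append(data.strip())

def main_content(html):
    """Return the text of the densest block: a crude main-content guess."""
    parser = BlockTextExtractor()
    parser.feed(html)
    return max(parser.blocks, key=len, default="")

page = ("<div>Home | About | Login</div>"
        "<article>The actual article body, long enough to dominate.</article>"
        "<div>Ad</div>")
body = main_content(page)
```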

  16. Literature Survey of previous research work in Models and Methodologies in Project Management

    Directory of Open Access Journals (Sweden)

    Ravinder Singh

    2014-09-01

    Full Text Available This paper provides a survey of the existing literature and research carried out in the area of project management using different models, methodologies, and frameworks. Project Management (PM) broadly means programme management, portfolio management, practice management, project management office, etc. A project management system has a set of processes, procedures, frameworks, methods, tools, methodologies, techniques, resources, etc., which are used to manage the full life cycle of projects. This also means creating risk, quality, performance, and other management plans to monitor and manage projects efficiently and effectively.

  17. Development of a methodology for selecting office building upgrading solutions based on a test survey in European buildings

    Energy Technology Data Exchange (ETDEWEB)

    Wittchen, K.B.; Brandt, E. [Danish Building and Urban Research, Horsholm (Denmark)

    2001-07-01

    The potential for using the TOBUS methodology to select office building upgrading solutions has been investigated during field tests in 15 European office buildings in 5 European countries. The 15 office buildings represent a variety of building traditions, architectural designs, construction periods, and energy and indoor performance levels. The buildings were audited following the TOBUS methodology developed within the project. The results of the test surveys were used primarily to improve the TOBUS methodology and secondly to suggest general upgrading solutions and energy retrofit measures for the surveyed buildings. This paper describes the development of the TOBUS methodology based on the 15 test surveys. (author)

  18. [Customer satisfaction in home care: methodological issues based on a survey carried out in Lazio].

    Science.gov (United States)

    Pasquarella, A; Marceca, M; Casagrande, S; Gentile, D; Zeppilli, D; Buonaiuto, N; Cozzolino, M; Guasticchi, G

    2007-01-01

    Home care customer satisfaction has, until now, rarely been evaluated. After illustrating the main Italian regional surveys on this issue, the article presents a customer satisfaction survey carried out in the district of Civitavecchia (Local Health Unit 'Rome F'), Lazio, regarding 30 home care beneficiaries. The methodological aspects emerging from the survey focus on: the advantages and disadvantages of quantitative and qualitative approaches (possibly combined with each other); the main criteria of eligibility for the people selected for interviewing, whether patients or caregivers; and the conditions that maximise the reliability of answers, including the training of interviewers. The authors highlight the value of using this kind of survey, integrated with other tools, within a systemic vision, to promote management changes in response to the problems identified, aimed at total quality management.

  19. Electronic Surveys: Methodological Implications for Using the World Wide Web to Collect Survey Data.

    Science.gov (United States)

    Bertot, John Carlo; McClure, Charles R.

    1996-01-01

    Focuses on a Web-based version of a national survey for the National Commission on Libraries and Information Science (NCLIS) of public library involvement with the Internet. Describes the procedures used to develop the Web-based questionnaire, identifies issues concerning Web-based research, and presents recommendations for the future…

  20. Binational Arsenic Exposure Survey: Methodology and Estimated Arsenic Intake from Drinking Water and Urinary Arsenic Concentrations

    OpenAIRE

    Harris, Robin B.; Burgess, Jefferey L; Maria Mercedes Meza-Montenegro; Luis Enrique Gutiérrez-Millán; Mary Kay O’Rourke; Jason Roberge

    2012-01-01

    The Binational Arsenic Exposure Survey (BAsES) was designed to evaluate probable arsenic exposures in selected areas of southern Arizona and northern Mexico, two regions with known elevated levels of arsenic in groundwater reserves. This paper describes the methodology of BAsES and the relationship between estimated arsenic intake from beverages and arsenic output in urine. Households from eight communities were selected for their varying groundwater arsenic concentrations in Arizona, USA and...

  1. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    Science.gov (United States)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in field of view compared with rectilinear lenses. This advantage alone can be crucial to reduce the total number of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano, with a total height of 25 meters, characterized by a narrow walkable space about 70 centimetres wide.
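The GSD reasoning can be illustrated for the common equidistant fisheye model, where r = f·θ makes the angle subtended by one pixel constant across the frame. The formulas and the numeric values below are generic assumptions for illustration, not the article's own formulation or data.

```python
import math

def gsd_equidistant(distance_m, focal_mm, pixel_pitch_um):
    """GSD under the equidistant fisheye model r = f * theta: one pixel
    subtends the constant angle pixel_pitch / f, so its footprint on a
    surface perpendicular to the ray at `distance_m` is about
    distance * pixel_pitch / f, at any field angle."""
    angle_per_pixel = (pixel_pitch_um * 1e-6) / (focal_mm * 1e-3)  # radians
    return distance_m * angle_per_pixel  # metres per pixel

def gsd_rectilinear(distance_m, focal_mm, pixel_pitch_um, theta_rad=0.0):
    """For a rectilinear lens r = f * tan(theta), dr/dtheta = f / cos^2(theta),
    so the angle per pixel shrinks off-axis by cos^2(theta); on the optical
    axis (theta = 0) this reduces to the familiar distance * pitch / f."""
    return distance_m * (pixel_pitch_um * 1e-6) / (focal_mm * 1e-3) * math.cos(theta_rad) ** 2

# Illustrative narrow-space case: a wall 0.35 m away, an 8 mm fisheye,
# 4.5 um pixels (values invented, not taken from the Minguzzi survey).
gsd = gsd_equidistant(0.35, 8.0, 4.5)  # ~0.0002 m, i.e. about 0.2 mm per pixel
```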

  2. Creating Web Pages Simplified

    CERN Document Server

    Wooldridge, Mike

    2011-01-01

    The easiest way to learn how to create a Web page for your family or organization Do you want to share photos and family lore with relatives far away? Have you been put in charge of communication for your neighborhood group or nonprofit organization? A Web page is the way to get the word out, and Creating Web Pages Simplified offers an easy, visual way to learn how to build one. Full-color illustrations and concise instructions take you through all phases of Web publishing, from laying out and formatting text to enlivening pages with graphics and animation. This easy-to-follow visual guide sho

  3. Full page insight

    DEFF Research Database (Denmark)

    Cortsen, Rikke Platz

    2014-01-01

    Alan Moore and his collaborating artists often manipulate time and space by drawing upon the formal elements of comics and making alternative constellations. This article looks at an element that is used frequently in comics of all kinds – the full page – and discusses how it helps shape spatio......, something that it shares with the full page in comics. Through an analysis of several full pages from Moore titles like Swamp Thing, From Hell, Watchmen and Promethea, it is made clear why the full page provides an apt vehicle for an apocalypse in comics....


  5. Training Activity Summary Page (TASP) State and Tribe

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Training Activity Summary Page (formerly the Training Exit Survey Cover Page) dataset contains data about each training event. This dataset includes information...

  6. Large population survey: strengths and limits. Methodology of the EDIFICE survey.

    Science.gov (United States)

    Roussel, Claire; Touboul, Chantal

    2011-01-01

    In France, mass screening for breast and colon cancer is supported by the French National Cancer Institute (INCa). In these nationwide screening campaigns, individuals aged between 50 and 74 years receive a personalized letter inviting them for a screening examination every 2 years. Prostate cancer screening is, however, still controversial and has not been included in the INCa recommendations so far. Research organizations are particularly interested in screening and indeed, several studies have been conducted in France and other countries to examine the different aspects of the subject. To provide actual benefits, screening should be undertaken on a regular scheduled basis. Therefore, several studies have assessed the factors influencing the participation rate of women in breast cancer screening in France. The Institut National de Prévention et d'Education pour la Santé conducted one of these in 2005: the Baromètre Cancer (including 4046 individuals aged 15 years or older, interviewed by telephone) analysed beliefs and perceptions about cancer screening and studied attendance rates for breast, colon and prostate cancer (including scheduled screening). No previous survey has ever been conducted simultaneously among the general population and physicians with regard to individual and scheduled screening for breast cancer and colorectal cancer (CRC) or individual screening for prostate cancer. EDIFICE is thus the first large-scale survey to assess screening practices in France by analysing the targeted population on the one hand and the clinical practice of French general practitioners (GPs) on the other hand, using the 'mirror study' method to compare results. Two national surveys were conducted in 2005 and 2008. In 2005, only 22 geographical regions were included in the screening programme for CRC.

  7. An Overview of Ophthalmologic Survey Methodology in the 2008-2015 Korean National Health and Nutrition Examination Surveys.

    Science.gov (United States)

    Yoon, Kyung Chul; Choi, Won; Lee, Hyo Seok; Kim, Sang-Duck; Kim, Seung-Hyun; Kim, Chan Yun; Park, Ki Ho; Park, Young Jeung; Baek, Seung-Hee; Song, Su Jeong; Shin, Jae Pil; Yang, Suk-Woo; Yu, Seung-Young; Lee, Jong Soo; Lim, Key Hwan; Oh, Kyung Won; Kang, Se Woong

    2015-12-01

    The Korea National Health and Nutrition Examination Survey (KNHANES) is a national program designed to assess the health and nutritional status of the noninstitutionalized population of South Korea. The KNHANES was initiated in 1998 and has been conducted annually since 2007. Starting in the latter half of 2008, ophthalmologic examinations were included in the survey in order to investigate the prevalence and risk factors of common eye diseases such as visual impairment, refractive errors, strabismus, blepharoptosis, cataract, pterygium, diabetic retinopathy, age-related macular degeneration, glaucoma, dry eye disease, and color vision deficiency. The measurements included in the ophthalmic questionnaire and examination methods were modified in the KNHANES IV, V, and VI. In this article, we provide detailed information about the methodology of the ophthalmic examinations in KNHANES in order to aid in further investigations related to major eye diseases in South Korea.

  8. Dietary survey methodology of FINDIET 2007 with a risk assessment perspective.

    Science.gov (United States)

    Reinivuo, Heli; Hirvonen, Tero; Ovaskainen, Marja-Leena; Korhonen, Tommi; Valsta, Liisa M

    2010-06-01

    A cross-sectional survey, FINDIET 2007, was carried out in Finland. Food intake data were collected by a 48 h recall interview. Additional food intake data were collected by a repeated 3 d food diary, a barcode-based product diary, a food frequency questionnaire and a supplementary questionnaire on rarely eaten foods. The purpose of the present paper is to describe the methodology of the national dietary survey and to discuss the particular implications for applications of food consumption data in risk assessment. The food consumption data of the FINDIET 2007 survey can be used in food risk assessment thanks to flexible processing of individual food consumption data, and a risk assessment point of view was taken into account in the survey design. However, international standardisation projects are needed in order to estimate comparable food intakes.

  9. Alcohol mixed with energy drinks: methodology and design of the Utrecht Student Survey

    Science.gov (United States)

    de Haan, Lydia; de Haan, Hein A; Olivier, Berend; Verster, Joris C

    2012-01-01

    This paper describes the methodology of the Utrecht Student Survey, an online survey completed in June 2011 by 6002 students living in Utrecht, The Netherlands. The aim of the survey was to determine the potential impact of mixing alcoholic beverages with energy drinks on overall alcohol consumption and alcohol-related consequences. In contrast to most previous surveys on this topic, the current survey used a more appropriate within-subject design: for individuals who consume alcohol mixed with energy drinks on some occasions, the occasions on which they consume this mixture were compared with the occasions on which they consume alcohol alone. In addition to energy drinks, the consumption of other non-alcoholic mixers combined with alcoholic beverages was also assessed. Furthermore, the reasons for consuming energy drinks alone or in combination with alcohol were investigated and compared with the reasons for mixing alcohol with other non-alcoholic beverages. Finally, personality characteristics and the level of risk-taking behavior of the individuals were assessed to explore their relationship with alcohol consumption. The Utrecht Student Survey will be replicated in the USA, Australia, and the UK. Results will be pooled, but also examined for possible cross-cultural differences. PMID:23118547

  10. A Study to Examine Differences Between In Person and Online Survey Data Collection Methodologies.

    Directory of Open Access Journals (Sweden)

    Robert Case

    2009-01-01

    The purpose of this study was to examine differences between the results of an in-person or face-to-face direct spending survey and a post-event online direct spending survey. Participants in a large annual marathon held in the Mid-Atlantic Region of the United States served as subjects for the study. The research methodology included an in-person survey instrument administered to out-of-town marathon participants prior to the start of the event, during the race-number and race-timing-chip pick-up period. The same survey instrument was administered online four days after the conclusion of the marathon to the same group of out-of-town marathon participants who had not previously responded to the in-person survey. Analysis of the data revealed that average direct spending reported by the online respondents was consistently and significantly higher than that reported by the in-person respondents on the direct spending questions. Spending on lodging for the two groups showed no significant differences. It was recommended that online survey methods be considered when conducting direct spending studies for participant-oriented sporting events, when adequate e-mail addresses are available and the potential respondents have a certain level of computer literacy.

  11. Research of Web Pages Categorization

    Institute of Scientific and Technical Information of China (English)

    Zhongda Lin; Kun Deng; Yanfen Hong

    2006-01-01

    In this paper, we discuss several issues related to automated classification of web pages, especially text classification of web pages. We analyze features selection and categorization algorithms of web pages and give some suggestions for web pages categorization.

  12. Larry Page on Google

    Institute of Scientific and Technical Information of China (English)

    Miguel Helft

    2012-01-01

    Last month, Larry Page sat down with Fortune Senior Writer Miguel Helft for a lengthy interview for a forthcoming Fortune magazine article. It was only Page's second wide-ranging conversation with a print publication since becoming CEO of Google in April 2011.

  13. Cuckoo Hashing with Pages

    CERN Document Server

    Dietzfelbinger, Martin; Rink, Michael

    2011-01-01

    Although cuckoo hashing has significant applications in both theoretical and practical settings, a relevant downside is that it requires lookups to multiple locations. In many settings, where lookups are expensive, cuckoo hashing becomes a less compelling alternative. One such standard setting is when memory is arranged in large pages, and a major cost is the number of page accesses. We propose the study of cuckoo hashing with pages, advocating approaches where each key has several possible locations, or cells, on a single page, and additional choices on a second backup page. We show experimentally that with k cell choices on one page and a single backup cell choice, one can achieve nearly the same loads as when each key has k+1 random cells to choose from, with most lookups requiring just one page access, even when keys are placed online using a simple algorithm. While our results are currently experimental, they suggest several interesting new open theoretical questions for cuckoo hashing with pages.
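The scheme this abstract describes can be sketched as a toy implementation. Everything below is an illustrative assumption (page count, cells per page, hash mixing, k = 3 primary-cell choices plus one backup cell), not the authors' code: each key first tries its k cells on a primary page, then a single cell on a backup page, evicting a random occupant and retrying when all candidates are full.

```python
import random

class PagedCuckooTable:
    """Toy cuckoo hash table where each key has k candidate cells on a
    primary page plus one backup cell on a second page."""

    def __init__(self, pages=64, cells_per_page=8, k=3, max_evictions=200):
        self.pages = pages
        self.cells = cells_per_page
        self.k = k
        self.max_evictions = max_evictions
        self.table = [[None] * cells_per_page for _ in range(pages)]

    def _slots(self, key):
        # Derive the candidate (page, cell) slots deterministically from the key.
        h = hash(key)
        primary = h % self.pages
        backup = (h // self.pages) % self.pages
        prim_cells = [(primary, (h + 31 * i) % self.cells) for i in range(self.k)]
        return prim_cells + [(backup, (h * 7 + 13) % self.cells)]

    def insert(self, key):
        for _ in range(self.max_evictions):
            for page, cell in self._slots(key):
                if self.table[page][cell] is None:
                    self.table[page][cell] = key
                    return True
            # All candidate cells occupied: evict a random occupant and retry
            # with the evicted key (the usual random-walk cuckoo step).
            page, cell = random.choice(self._slots(key))
            key, self.table[page][cell] = self.table[page][cell], key
        return False  # table too loaded; a real table would resize

    def lookup(self, key):
        # Most lookups touch only the primary page; the backup is the rare case.
        return any(self.table[p][c] == key for p, c in self._slots(key))
```

The point of the paper's arrangement shows up in `lookup`: the k primary cells share one page, so a successful lookup usually costs a single page access.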

  14. On Page Rank

    NARCIS (Netherlands)

    Hoede, C.

    2008-01-01

    In this paper the concept of page rank for the world wide web is discussed. The possibility of describing the distribution of page rank by an exponential law is considered. It is shown that the concept is essentially equal to that of status score, a centrality measure discussed already in 1953 by Katz.
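As a reminder of the concept the abstract analyses, here is a textbook power-iteration sketch of page rank (this is the generic algorithm, not the paper's exponential-law analysis or the 1953 status-score formulation):

```python
def pagerank(links, damping=0.85, iters=100):
    """Power-iteration page rank over an adjacency dict {node: [outlinks]}."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        # Every node keeps the (1 - d)/n teleportation mass...
        new = {u: (1.0 - damping) / n for u in nodes}
        for u, outs in links.items():
            if outs:
                # ...and passes a damped share of its rank to each outlink.
                share = damping * rank[u] / len(outs)
                for v in outs:
                    new[v] += share
            else:
                # Dangling node: redistribute its rank uniformly.
                for v in nodes:
                    new[v] += damping * rank[u] / n
        rank = new
    return rank
```

On the three-node graph `{"a": ["b"], "b": ["c"], "c": ["a", "b"]}`, node `b` ends up ranked highest because it is linked from both `a` and `c`.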

  15. Page Styles on steroids

    DEFF Research Database (Denmark)

    Madsen, Lars

    2008-01-01

    Designing a page style has long been a pain for novice users. Some parts are easy; others need strong LATEX knowledge. In this article we will present the memoir way of dealing with page styles, including new code added to the recent version of memoir that will reduce the pain to a mild annoyance...

  16. New WWW Pages

    CERN Multimedia

    Pommes, K

    New WWW pages have been created in order to provide easy access to the many activities and pertaining information of the ATLAS Technical Coordination. The main entry point is available on the ATLAS Collaboration page by clicking the Technical Coordination link which leads to the page shown in the following picture. Each button links to a page listing all tasks of the corresponding activity, the responsible task leaders, schedules, work-packages, and action lists, etc... The "ATLAS Documentation Center" button will present the pop-up window shown in the next figure: Besides linking to the Technical Coordination Activities, this page provides direct access to the tools for Project Progress Tracking (PPT) and Engineering Data Management (EDMS), as well as to the main topics being coordinated by the Technical Coordination.

  17. Anti-Semitism and criticism of Israel: Methodology and results of the ASCI survey

    Directory of Open Access Journals (Sweden)

    Wilhelm Kempf

    2015-04-01

    Building upon psychological conflict theory, on the one hand, and item-response models, on the other, the present paper develops an integrated methodology that aims at differentiating the various ways of criticizing Israel. An application of this methodology to the Anti-Semitism and Criticism of Israel (ASCI) survey found two ways of criticizing Israel resulting from two different and antipodal processes. (1) Anti-Semitic criticism of Israel is generally laden with prejudice and shares not only anti-Semitic, anti-Zionist and anti-Israeli, but also anti-Palestinian resentments. (2) Non-anti-Semitic criticism of Israel is motivated by pacifism and human-rights commitment and rejects any form of anti-Semitic, anti-Zionist, anti-Israeli or anti-Palestinian resentment. However, even critics of Israel who were not originally motivated by anti-Semitism are also in danger of developing anti-Semitic prejudices.

  18. U.S. Geological Survey Methodology Development for Ecological Carbon Assessment and Monitoring

    Science.gov (United States)

    Zhu, Zhi-Liang; Stackpoole, S.M.

    2009-01-01

    Ecological carbon sequestration refers to transfer and storage of atmospheric carbon in vegetation, soils, and aquatic environments to help offset the net increase from carbon emissions. Understanding capacities, associated opportunities, and risks of vegetated ecosystems to sequester carbon provides science information to support formulation of policies governing climate change mitigation, adaptation, and land-management strategies. Section 712 of the Energy Independence and Security Act (EISA) of 2007 mandates the Department of the Interior to develop a methodology and assess the capacity of our nation's ecosystems for ecological carbon sequestration and greenhouse gas (GHG) flux mitigation. The U.S. Geological Survey (USGS) LandCarbon Project is responding to the Department of Interior's request to develop a methodology that meets specific EISA requirements.

  19. Economic page turners

    OpenAIRE

    Frank, Björn

    2011-01-01

    Economic page turners like Freakonomics are well written and there is much to be learned from them - not only about economics, but also about writing techniques. Their authors know how to build up suspense, i.e., they make readers want to know what comes next. An uncountable number of pages in books and magazines are filled with advice on writing reportages or suspense novels. While many of the tips are specific to the respective genres, some carry over to economic page turners in an instructive w...

  20. Market study: Tactile paging system

    Science.gov (United States)

    1977-01-01

    A market survey was conducted regarding the commercialization potential and key market factors relevant to a tactile paging system for deaf-blind people. The purpose of the tactile paging system is to communicate with deaf-blind people in an institutional environment. The system consists of a main console and individual satellite wrist units. The console emits three signals by telemetry to the wrist com (receiving unit), which will measure approximately 2 x 4 x 3/4 inches and will be fastened to the wrist by a strap. The three vibration signals are fire alarm, time period indication, and a third signal that alerts the wearer of the wrist com that the pin on the top of the wrist unit is emitting a Morse-coded message. The Morse code message can be felt and recognized with the finger.
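The third signal described above delivers a Morse-coded message through a vibrating pin. As a hedged illustration of how such a message could be scheduled as on/off vibration intervals (the timing constants and the reduced letter table are assumptions, not the device's specification):

```python
# Reduced Morse table for the sketch; a real device needs full coverage.
MORSE = {
    "E": ".", "T": "-", "A": ".-", "I": "..", "N": "-.",
    "S": "...", "O": "---", "R": ".-.", "D": "-..",
}

def to_vibration_schedule(message, unit_ms=100):
    """Turn a message into (state, duration_ms) pairs for the vibrating pin:
    dot = 1 unit on, dash = 3 units on, 1 unit off inside a letter,
    3 units off between letters (standard Morse timing ratios)."""
    schedule = []
    for li, letter in enumerate(message.upper()):
        if li:
            schedule.append(("off", 3 * unit_ms))   # gap between letters
        for si, symbol in enumerate(MORSE[letter]):
            if si:
                schedule.append(("off", unit_ms))   # gap within a letter
            schedule.append(("on", (1 if symbol == "." else 3) * unit_ms))
    return schedule
```

For example, the message "ET" becomes a short pulse, a 3-unit pause, then a long pulse.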

  1. [Relevant methodological issues from the SBBrasil 2010 Project for national health surveys].

    Science.gov (United States)

    Roncalli, Angelo Giuseppe; Silva, Nilza Nunes da; Nascimento, Antonio Carlos; Freitas, Cláudia Helena Soares de Morais; Casotti, Elisete; Peres, Karen Glazer; Moura, Lenildo de; Peres, Marco A; Freire, Maria do Carmo Matias; Cortes, Maria Ilma de Souza; Vettore, Mario Vianna; Paludetto Júnior, Moacir; Figueiredo, Nilcema; Goes, Paulo Sávio Angeiras de; Pinto, Rafaela da Silveira; Marques, Regina Auxiliadora de Amorim; Moysés, Samuel Jorge; Reis, Sandra Cristina Guimarães Bahia; Narvai, Paulo Capel

    2012-01-01

    The SBBrasil 2010 Project (SBB10) was designed as a nationwide oral health epidemiological survey within a health surveillance strategy. This article discusses methodological aspects of the SBB10 Project that can potentially help expand and develop knowledge in the health field. This was a nationwide survey with stratified multi-stage cluster sampling. The sample domains were 27 State capitals and 150 rural municipalities (counties) from the country's five major geographic regions. The sampling units were census tracts and households for the State capitals, and municipalities, census tracts, and households for the rural areas. Thirty census tracts were selected in the State capitals and 30 municipalities in the countryside. The precision considered the demographic domains grouped by density of the overall population and the internal variability of oral health indices. The study evaluated dental caries, periodontal disease, malocclusion, fluorosis, tooth loss, and dental trauma in five age groups (5, 12, 15-19, 35-44, and 65-74 years).

  2. Does the Op-Ed Page Have a Chance to Become a Public Forum?

    Science.gov (United States)

    Ciofalo, Andrew; Traverso, Kim

    1994-01-01

    Surveys op-ed page editors, finding that fewer than half of the responding papers have op-ed pages; that professional journalists, public figures, and propagandists dominate the pages; and that editors firmly control the agenda. (SR)

  3. Methodological note: allocation of disability items in the American Community Survey.

    Science.gov (United States)

    Siordia, Carlos; Young, Rebekah

    2013-04-01

    Determining the prevalence and correlates of disability requires the use of sample surveys in data analysis. In an effort to generate complete datasets, allocation procedures (i.e., the assignment of values to missing or illogical responses) are frequently applied to missing or inconsistent responses. The goal of this investigation was to explore how six disability-related questions vary in their degree of allocation and how research results may be sensitive to this procedure. This is important because many researchers using large disability information banks are not survey methodologists and may be unaware of how the Census Bureau's editing procedures can influence research findings. We use 2010 1-year Public Use Microdata Sample files from the American Community Survey (ACS). We investigated the allocation rates of the following disability items: self-care; hearing; vision; independent living; ambulatory; and cognitive ability. We also asked how allocation rates varied by demographic characteristics and whether the allocated values could influence multivariate results. Disability item allocation in ACS data has detectable patterns: the rate of allocation is higher for mail surveys, males, older people, groups who speak English not well or not at all, US citizens, Latinos(as), and people living in or near poverty. Multivariate models may be sensitive to how these allocated values are treated. The rate of allocation varies as a function of demographic variables because of methodological procedures and survey participation behaviors. Because allocation rates may affect research and policy concerning the disabled population, more research is required. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. [Methodological Aspects of the Sampling Design for the 2015 National Mental Health Survey].

    Science.gov (United States)

    Rodríguez, Nelcy; Rodríguez, Viviana Alejandra; Ramírez, Eugenia; Cediel, Sandra; Gil, Fabián; Rondón, Martín Alonso

    2016-12-01

    The WHO has encouraged the development, implementation and evaluation of policies related to mental health all over the world. In Colombia, within this framework, promoted by the Ministry of Health and Social Protection and supported by Colciencias, the fourth National Mental Health Survey (NMHST) was conducted as an observational cross-sectional study. Following the corresponding guidelines and sampling design, a summary of the methodology used for this sampling process is presented. The fourth NMHST used the Homes Master Sample for Studies in Health from the National System of Studies and Population Surveys for Health to calculate its sample. This Master Sample was developed and implemented in 2013 by the Ministry of Social Protection. The study included the non-institutionalised civilian population, divided into four age groups: children 7-11 years, adolescents 12-17 years, adults 18-44 years, and adults 45 years or older. The sample size calculation was based on the prevalences reported in other studies for the outcomes of mental disorders, depression, suicide, associated morbidity, and alcohol use. A probabilistic, stratified, multistage cluster selection process was used. Expansion factors to the total population were calculated. A total of 15,351 completed surveys were collected, distributed according to the age groups as follows: 2727 (7-11 years), 1754 (12-17 years), 5889 (18-44 years), and 4981 (≥45 years). The surveys were distributed across five regions: Atlantic, Oriental, Bogotá, Central and Pacific. A sufficient number of surveys were collected in this study to obtain a more precise approximation of mental problems and disorders at the regional and national level. Copyright © 2016 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
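The expansion factors mentioned in this abstract are, in the usual design-based framework, the inverses of the selection probabilities accumulated over the sampling stages. A minimal sketch (the stage probabilities below are hypothetical, not taken from the NMHST design):

```python
def expansion_factor(stage_probs):
    """Expansion (weighting) factor for one respondent in a multistage
    design: the inverse of the product of stage selection probabilities."""
    weight = 1.0
    for p in stage_probs:
        weight /= p
    return weight

def estimate_total(values_and_weights):
    """Design-based (Horvitz-Thompson style) estimate of a population
    total from (value, weight) pairs."""
    return sum(v * w for v, w in values_and_weights)

# Hypothetical respondent: their cluster was drawn with probability 1/50,
# their household with 1/20, and they were one of 2 eligible adults.
w = expansion_factor([1 / 50, 1 / 20, 1 / 2])  # -> represents 2000 people
```

Summing each respondent's weighted response then expands the 15,351 interviews to population-level totals; in practice these raw weights are further adjusted for nonresponse and post-stratification.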

  5. Users page feedback

    CERN Multimedia

    2010-01-01

    In October last year the Communication Group proposed an interim redesign of the users’ web pages in order to improve the visibility of key news items, events and announcements to the CERN community. The proposed update to the users' page (right), and the current version (left, behind) This proposed redesign was seen as a small step on the way to much wider reforms of the CERN web landscape proposed in the group’s web communication plan.   The results are available here. Some of the key points: - the balance between news / events / announcements and access to links on the users’ pages was not right - many people asked to see a reversal of the order so that links appeared first, news/events/announcements last; - many people felt that we should keep the primary function of the users’ pages as an index to other CERN websites; - many people found the sections of the front page to be poorly delineated; - people do not like scrolling; - there were performance...

  6. JERHRE's New Web Pages.

    Science.gov (United States)

    2006-06-01

    JERHRE's website, www.csueastbay.edu/JERHRE/ has two new pages. One of those pages is devoted to curriculum that may be used to educate students, investigators and ethics committee members about issues in the ethics of human subjects research, and to evaluate their learning. It appears at www.csueastbay.edu/JERHRE/cur.html. The other is devoted to emailed letters from readers. Appropriate letters will be posted as soon as they are received by the editor. Letters from readers appear at www.csueastbay.edu/JERHRE/let.html.

  7. Typographic Landscapes of Pelotas: initial survey of the collection and some methodological definitions

    Directory of Open Access Journals (Sweden)

    Daniela Velleda Brisolara

    2016-03-01

    This paper presents part of the research “Pelotas Typographic Landscapes: An exploratory study of the typography in the urban space”. The study investigates the different occurrences of typographic landscapes in selected urban areas of the city of Pelotas/RS in order to better understand typography as historical and cultural information. The research builds on the studies of ‘typographic landscapes’ developed in recent years in several Brazilian cities, which provide methods for collecting, processing and analysing typographic data in the urban space. The methodology adopted therefore follows the investigative process already established in this field, such as the definition of routes and field-research records, with the adaptations necessary to the local context. This paper presents survey samples from two typographic categories collected in a specific region, in addition to the proposed adaptation of the cataloguing and analysis record structure.

  8. Development of residential-conservation-survey methodology for the US Air Force. Interim report. Task two

    Energy Technology Data Exchange (ETDEWEB)

    Abrams, D. W.; Hartman, T. L.; Lau, A. S.

    1981-11-13

    A US Air Force (USAF) Residential Energy Conservation Methodology was developed to compare USAF needs and available data to the procedures of the Residential Conservation Service (RCS) program as developed for general use by utility companies serving civilian customers. Attention was given to the data implications related to group housing, climatic data requirements, life-cycle cost analysis, energy-saving modifications beyond those covered by RCS, and methods for utilizing existing energy consumption data in approaching the USAF survey program. Detailed information and summaries are given on the five subtasks of the program. Energy conservation alternatives are listed and the basic analysis techniques to be used in evaluating their thermal performance are described. (MCW)

  9. Folding worlds between pages

    CERN Multimedia

    Meier, Matthias

    2010-01-01

    "We all remember pop-up books form our childhood. As fascinated as we were back then, we probably never imagined how much engineering know-how went into these books. Pop-up engineer Anton Radevsky has even managed to fold a 27-kilometre particle accelerator into a book" (4 pages)

  10. Understanding pathways of exposure using site-specific habits surveys, particularly new pathways and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Grzechnik, M.; McTaggart, K.; Clyne, F. [Centre for Environment, Fisheries and Aquaculture Science, Lowestoft (United Kingdom)

    2006-07-01

    UK policy on the control of radiation exposure via routine discharges from nuclear licensed sites has long been based on ICRP recommendations that embody the principles of justification of practices, optimisation of protection, and dose limitation. Radiological protection of the public is based on the concept of a critical group of individuals. This group is defined as those people who, as a result of the area in which they reside and their habits, receive the highest radiation dose due to the operations of a site. Therefore, if the dose to this critical group is acceptable in relation to relevant dose limits and constraints, then other members of the public will receive lower doses. Thus, the principle of critical groups provides overall protection for the public. Surveys to determine local habits involve an integrated methodology, whereby the potential radioactive exposure pathways from liquid and gaseous discharges and direct radiation from the site are investigated. Surveys to identify these habits must be undertaken rigorously for consistency, and have been known to reveal unexpected pathways of radiation exposure. Pathways typically include consumption of local foodstuffs and external exposure. Furthermore, a number of critical groups may be identified within a single survey area if the habits of one group do not adequately describe those of the other inhabitants of the area. Survey preparation involves the initial identification of high producers and consumers of local foods in a geographically defined area surrounding the nuclear facility. Pathways can be broken down into three general groups, covering exposure arising from: 1) terrestrial (gaseous) discharges, surveyed within 5 km of the site; 2) direct radiation, surveyed within 1 km of the site; 3) aquatic (liquid) discharges, surveyed within local areas affected by the discharges, including seas, rivers and sewage works.
The survey fieldwork involves interviewing members of the

  11. Methodology for adding glycemic index to the National Health and Nutrition Examination Survey nutrient database.

    Science.gov (United States)

    Lin, Chii-Shy; Kimokoti, Ruth W; Brown, Lisa S; Kaye, Elizabeth A; Nunn, Martha E; Millen, Barbara E

    2012-11-01

    Generating valid estimates of dietary glycemic index (GI) and glycemic load (GL) has been a challenge in nutritional epidemiology. The methodologic issues may have contributed to the wide variation of GI/GL associations with health outcomes observed in existing literature. We describe a standardized methodology for assigning GI values to items in the National Health and Nutrition Examination Survey (NHANES) nutrient database using the new International Tables to develop research-driven, systematic procedures and strategies to estimate dietary GI/GL exposures of a nationally representative population sample. Nutrient databases for NHANES 2003-2006 contain information on 3,155 unique foods derived from the US Department of Agriculture National Nutrient Database for Standard Reference versions 18 and 20. Assignment of GI values were made to a subset of 2,078 carbohydrate-containing foods using systematic food item matching procedures applied to 2008 international GI tables and online data sources. Matching protocols indicated that 45.4% of foods had identical matches with existing data sources, 31.9% had similar matches, 2.5% derived GI values calculated with the formula for combination foods, 13.6% were assigned a default GI value based on low carbohydrate content, and 6.7% of GI values were based on data extrapolation. Most GI values were derived from international sources; 36.1% were from North American product information. To confirm data assignments, dietary GI and GL intakes of the NHANES 2003-2006 adult participants were estimated from two 24-hour recalls and compared with published studies. Among the 3,689 men and 4,112 women studied, mean dietary GI was 56.2 (men 56.9, women 55.5), mean dietary GL was 138.1 (men 162.1, women 116.4); the distribution of dietary GI was approximately normal. Estimates of population GI and GL compare favorably with other published literature. This methodology of adding GI values to an existing population nutrient database
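The matching tiers this abstract describes (identical match, similar match, combination-food formula, low-carbohydrate default, extrapolation) can be sketched as a decision cascade. All field names, the crude substring similarity test, and the fallback GI of 50 are illustrative assumptions, not the study's protocol; the combination step uses the standard carbohydrate-weighted mean:

```python
def assign_gi(food, gi_table, carb_threshold=5.0):
    """Assign a glycemic index to one food record via a tiered cascade.
    `food` is a dict with hypothetical fields: "name", "carb_g" and, for
    combination foods, "ingredients" as (name, carb_g) pairs."""
    name = food["name"]
    if name in gi_table:                          # tier 1: identical match
        return gi_table[name], "identical"
    for ref, gi in gi_table.items():              # tier 2: similar match (crude)
        if ref in name or name in ref:
            return gi, "similar"
    if food.get("ingredients"):                   # tier 3: combination food
        total_carb = sum(c for _, c in food["ingredients"])
        # Carbohydrate-weighted mean of ingredient GIs (50 is an assumed
        # fallback for unmatched ingredients, not a value from the study).
        gi = sum(gi_table.get(n, 50) * c for n, c in food["ingredients"]) / total_carb
        return gi, "combination"
    if food["carb_g"] < carb_threshold:           # tier 4: low-carb default
        return 0.0, "default"
    return None, "unassigned"                     # tier 5: needs extrapolation
```

Logging the tier alongside each assignment is what lets a study report the breakdown of match types, as this abstract does (45.4% identical, 31.9% similar, and so on).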

  12. Socially responsible ethnobotanical surveys in the Cape Floristic Region: ethical principles, methodology and quantification of data

    Directory of Open Access Journals (Sweden)

    Ben-Erik Van Wyk

    2012-03-01

    A broad overview of published and unpublished ethnobotanical surveys in the Cape Floristic Region (the traditional home of the San and Khoi communities) shows that the data is incomplete. There is an urgent need to record the rich indigenous knowledge about plants in a systematic and socially responsible manner in order to preserve this cultural and scientific heritage for future generations. Improved methods for quantifying data are introduced, with special reference to the simplicity and benefits of the new Matrix Method. This methodology prevents or reduces the number of false negatives, and also ensures the participation of elderly people who might be immobile. It also makes it possible to compare plant uses across different local communities. This method enables the researcher to quantify the knowledge of plant use that has been preserved in a community, and to determine the relative importance of a specific plant in a more objective way. Ethical considerations for such ethnobotanical surveys are discussed through the lens of current ethical codes and international conventions. This is an accessible approach, which can also be used in the life sciences classroom.

  13. The National Eye Survey of Trinidad and Tobago (NESTT): Rationale, Objectives and Methodology.

    Science.gov (United States)

    Braithwaite, Tasanee; Verlander, Neville Q; Bartholomew, Debra; Bridgemohan, Petra; McNally, Kevin; Roach, Allana; Sharma, Subash; Singh, Deo; Pesudovs, Konrad; Teelucksingh, Surujpal; Carrington, Christine; Ramsewak, Samuel; Bourne, Rupert

    2017-04-01

    This paper describes the rationale, study design and procedures of the National Eye Survey of Trinidad and Tobago (NESTT). The main objective of this survey is to obtain prevalence estimates of vision impairment and blindness for planning and policy development. A population-based, cross-sectional survey was undertaken using random multistage cluster sampling, with probability-proportionate-to-size methods. Eligible participants aged 5 years and older were sampled from the non-institutional population in each of 120 cluster segments. Presenting distance and near visual acuity were screened in their communities. People aged 40 years and older, and selected younger people, were invited for comprehensive clinic assessment. The interview included information on potential risk factors for vision loss, associated costs and quality of life. The examination included measurement of anthropometrics, blood glucose, refraction, ocular biometry, corneal hysteresis, and detailed assessment of the anterior and posterior segments, with photography and optical coherence tomography imaging. Adult participants were invited to donate saliva samples for DNA extraction and storage. The fieldwork was conducted over 13 months in 2013-2014. A representative sample of 10,651 individuals in 3410 households within 120 cluster segments identified 9913 people who were eligible for recruitment. The study methodology was robust and adequate to provide the first population-based estimates of the prevalence and causes of visual impairment and blindness in Trinidad and Tobago. Information was also gathered on risk factors, costs and quality of life associated with vision loss, and on normal ocular parameters for the population aged 40 years and older.
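Probability-proportionate-to-size (PPS) selection, as used in the NESTT sampling design described above, is commonly implemented with the systematic cumulative-size method. A generic sketch (not the NESTT code):

```python
import random

def pps_systematic(sizes, n):
    """Select n cluster indices with probability proportional to size
    using systematic sampling along the cumulative-size scale."""
    total = float(sum(sizes))
    step = total / n                        # sampling interval
    start = random.uniform(0, step)         # random start in first interval
    targets = [start + k * step for k in range(n)]
    picks, cum, i = [], 0.0, 0
    for idx, size in enumerate(sizes):
        cum += size                         # running cumulative size
        while i < n and targets[i] <= cum:
            picks.append(idx)               # this cluster covers target i
            i += 1
    return picks
```

A cluster larger than the sampling interval can be hit more than once, and larger clusters are proportionally more likely to be selected, which is exactly the PPS property.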

  14. Haemophilia Experiences, Results and Opportunities (HERO) Study: survey methodology and population demographics.

    Science.gov (United States)

    Forsyth, A L; Gregory, M; Nugent, D; Garrido, C; Pilgaard, T; Cooper, D L; Iorio, A

    2014-01-01

    Psychosocial factors have a significant impact on the quality of life of persons with haemophilia (PWH). The Haemophilia Experiences, Results and Opportunities (HERO) initiative was developed to provide a greater understanding of the psychological components which influence the lives of PWH. This article describes the HERO methodology and the characteristics of respondents. Two online surveys were conducted (one for adult PWH ≥18 years and one for parents of children with haemophilia). The surveys included demographic and treatment characteristics, relationships, sexual intimacy, quality of life, barriers to treatment and sources of information. A total of 675 PWH [age, median (range) 36 (18-86 years)] and 561 parents [39 (23-68 years)] completed the survey. PWH/parents reported haemophilia A (74%/76%), B (13%/16%) or with inhibitors (13%/8%). Spontaneous joint bleeding was reported in 76%/52% of PWH/children with haemophilia A, 67%/47% with haemophilia B and 93%/76% with inhibitors. Median number of bleeds (interquartile range) was 7 (2-20) for PWH and 4 (2-10) for children in the past year. Most PWH and children were treated with factor concentrate. PWH reported arthritis (49%) and HIV/HCV infections (18%/43%) related to haemophilia. Most PWH and parent respondents had received formal education (85%/89%) and were employed full- or part-time (60%/72%). HERO is one of the largest multinational studies focused on psychosocial issues in haemophilia, including historical and treatment information that will allow for multivariate analyses of determinants of health in haemophilia.

  15. Siting Study Framework and Survey Methodology for Marine and Hydrokinetic Energy Project in Offshore Southeast Florida

    Energy Technology Data Exchange (ETDEWEB)

    Vinick, Charles; Riccobono, Antonino, MS; Messing, Charles G., Ph.D.; Walker, Brian K., Ph.D.; Reed, John K., Ph.D.

    2012-02-28

    Dehlsen Associates, LLC was awarded a grant by the United States Department of Energy (DOE) Golden Field Office for a project titled 'Siting Study Framework and Survey Methodology for Marine and Hydrokinetic Energy Project in Offshore Southeast Florida' (DOE Grant Award Number DE-EE0002655, under Funding Opportunity Announcement Number DE-FOA-0000069, Topic Area 2), referred to herein as 'the project.' The purpose of the project was to increase the certainty of the survey requirements and regulatory review processes in order to reduce the time, effort, and costs associated with initial siting of marine and hydrokinetic energy conversion facilities that may be proposed in the Atlantic Ocean offshore Southeast Florida. To secure early input from agencies, protocols were developed for collecting baseline geophysical information and benthic habitat data that project developers and regulators can use to make decisions early in the siting process that avoid or minimize adverse impacts to sensitive marine benthic habitat. It is presumed that such an approach will help facilitate the licensing process for hydrokinetic and other ocean renewable energy projects within the study area and will assist in clarifying the baseline environmental data requirements described in the U.S. Department of the Interior Bureau of Ocean Energy Management, Regulation and Enforcement (formerly Minerals Management Service) final regulations on offshore renewable energy (30 Code of Federal Regulations 285, published April 29, 2009). Because projects generally seek to avoid or minimize impacts to sensitive marine habitats, it was not the intent of this project to investigate areas that did not appear suitable for the siting of ocean renewable energy projects. Rather, a two-tiered approach was designed, with the first step consisting of gaining overall insight

  16. Mobile Technology Use by People Experiencing Multiple Sclerosis Fatigue: Survey Methodology

    Science.gov (United States)

    Reay, Nicholas

    2017-01-01

    Background: Fatigue is one of the most commonly reported symptoms of multiple sclerosis (MS). It has a profound impact on all spheres of life, for people with MS and their relatives. It is one of the key precipitants of early retirement. Individual, group, and Internet cognitive behavioral therapy–based approaches to supporting people with MS to manage their fatigue have been shown to be effective. Objective: The aim of this project was to (1) survey the types of mobile devices and level of Internet access people with MS use or would consider using for a health intervention and (2) characterize the levels of fatigue severity and their impact experienced by the people in our sample to provide an estimate of fatigue severity of people with MS in New Zealand. The ultimate goal of this work was to support the future development of a mobile intervention for the management of fatigue for people with MS. Methods: Survey methodology using an online questionnaire was used to assess people with MS. A total of 51 people with MS participated. The average age was 48.5 years, and the large majority of the sample (77%) was female. Results: Participants reported significant levels of fatigue as measured with the summary score of the Neurological Fatigue Index (mean 31.4 [SD 5.3]). Most (84%) respondents scored on average more than 3 on the fatigue severity questions, reflecting significant fatigue. Mobile phone usage was high, with 86% of respondents reporting having a mobile phone; apps were used by 75% of respondents. Most participants (92%) accessed the Internet from home. Conclusions: New Zealand respondents with MS experienced high levels of both fatigue severity and fatigue impact. The majority of participants have a mobile device and access to the Internet. These findings, along with limited access to face-to-face cognitive behavioral therapy–based interventions, create an opportunity to develop a mobile technology platform for delivering a cognitive behavioral therapy

  17. Evaluating Multilingual Gisting of Web Pages

    CERN Document Server

    Resnik, P

    1997-01-01

    We describe a prototype system for multilingual gisting of Web pages, and present an evaluation methodology based on the notion of gisting as decision support. This evaluation paradigm is straightforward, rigorous, permits fair comparison of alternative approaches, and should easily generalize to evaluation in other situations where the user is faced with decision-making on the basis of information in restricted or alternative form.

  18. Binational Arsenic Exposure Survey: Methodology and Estimated Arsenic Intake from Drinking Water and Urinary Arsenic Concentrations

    Directory of Open Access Journals (Sweden)

    Robin B. Harris

    2012-03-01

    The Binational Arsenic Exposure Survey (BAsES) was designed to evaluate probable arsenic exposures in selected areas of southern Arizona and northern Mexico, two regions with known elevated levels of arsenic in groundwater reserves. This paper describes the methodology of BAsES and the relationship between estimated arsenic intake from beverages and arsenic output in urine. Households from eight communities were selected for their varying groundwater arsenic concentrations in Arizona, USA and Sonora, Mexico. Adults responded to questionnaires and provided dietary information. A first morning urine void and water from all household drinking sources were collected. Associations between urinary arsenic concentration (total, organic, inorganic) and estimated level of arsenic consumed from water and other beverages were evaluated through crude associations and by random effects models. Median estimated total arsenic intake from beverages among participants from Arizona communities ranged from 1.7 to 14.1 µg/day compared to 0.6 to 3.4 µg/day among those from Mexico communities. In contrast, median urinary inorganic arsenic concentrations were greatest among participants from Hermosillo, Mexico (6.2 µg/L) whereas a high of 2.0 µg/L was found among participants from Ajo, Arizona. Estimated arsenic intake from drinking water was associated with urinary total arsenic concentration (p < 0.001), urinary inorganic arsenic concentration (p < 0.001), and urinary sum of species (p < 0.001). Urinary arsenic concentrations increased between 7% and 12% for each one percent increase in arsenic consumed from drinking water. Variability in arsenic intake from beverages and urinary arsenic output yielded counter intuitive results. Estimated intake of arsenic from all beverages was greatest among Arizonans yet participants in Mexico had higher urinary total and inorganic arsenic concentrations. Other contributors to urinary arsenic concentrations should be evaluated.

  19. Binational arsenic exposure survey: methodology and estimated arsenic intake from drinking water and urinary arsenic concentrations.

    Science.gov (United States)

    Roberge, Jason; O'Rourke, Mary Kay; Meza-Montenegro, Maria Mercedes; Gutiérrez-Millán, Luis Enrique; Burgess, Jefferey L; Harris, Robin B

    2012-04-01

    The Binational Arsenic Exposure Survey (BAsES) was designed to evaluate probable arsenic exposures in selected areas of southern Arizona and northern Mexico, two regions with known elevated levels of arsenic in groundwater reserves. This paper describes the methodology of BAsES and the relationship between estimated arsenic intake from beverages and arsenic output in urine. Households from eight communities were selected for their varying groundwater arsenic concentrations in Arizona, USA and Sonora, Mexico. Adults responded to questionnaires and provided dietary information. A first morning urine void and water from all household drinking sources were collected. Associations between urinary arsenic concentration (total, organic, inorganic) and estimated level of arsenic consumed from water and other beverages were evaluated through crude associations and by random effects models. Median estimated total arsenic intake from beverages among participants from Arizona communities ranged from 1.7 to 14.1 µg/day compared to 0.6 to 3.4 µg/day among those from Mexico communities. In contrast, median urinary inorganic arsenic concentrations were greatest among participants from Hermosillo, Mexico (6.2 µg/L) whereas a high of 2.0 µg/L was found among participants from Ajo, Arizona. Estimated arsenic intake from drinking water was associated with urinary total arsenic concentration (p < 0.001), urinary inorganic arsenic concentration (p < 0.001), and urinary sum of species (p < 0.001). Urinary arsenic concentrations increased between 7% and 12% for each one percent increase in arsenic consumed from drinking water. Variability in arsenic intake from beverages and urinary arsenic output yielded counter intuitive results. Estimated intake of arsenic from all beverages was greatest among Arizonans yet participants in Mexico had higher urinary total and inorganic arsenic concentrations. Other contributors to urinary arsenic concentrations should be evaluated.
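The intake/urine association described above comes from models on log-transformed data, where the slope is an elasticity: the percent change in urinary arsenic per percent change in intake. A minimal log-log least-squares sketch on synthetic data, standing in for the paper's random-effects models (every value here is invented):

```python
import math
import random

# Simulate a hypothetical intake -> urinary-arsenic relationship, then
# recover the elasticity with a log-log least-squares fit.
random.seed(1)
true_elasticity = 0.10   # assumed for illustration: 0.10% urine change per 1% intake change
xs = [random.uniform(0.6, 14.1) for _ in range(500)]   # µg/day, spanning the reported range
ys = [math.exp(0.5 + true_elasticity * math.log(x) + random.gauss(0, 0.05)) for x in xs]

lx = [math.log(x) for x in xs]
ly = [math.log(y) for y in ys]
n = len(lx)
mx, my = sum(lx) / n, sum(ly) / n
slope = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
         / sum((a - mx) ** 2 for a in lx))   # estimated elasticity
```

With enough data the fitted slope recovers the simulated elasticity; the study's random-effects models additionally account for clustering of participants within communities.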

  20. Influenza knowledge, attitude, and behavior survey for grade school students: design and novel assessment methodology.

    Science.gov (United States)

    Koep, Tyler H; Huskins, W Charles; Clemens, Christal; Jenkins, Sarah; Pierret, Chris; Ekker, Stephen C; Enders, Felicity T

    2014-12-01

    Despite the fact that infectious diseases can spread readily in grade schools, few studies have explored prevention in this setting. Additionally, we lack valid tools for students to self-report knowledge, attitudes, and behaviors. As part of an ongoing study of a curriculum intervention to promote healthy behaviors, we developed and evaluated age-appropriate surveys to determine students' understanding of influenza prevention. Surveys were adapted from adolescent and adult influenza surveys and administered to students in grades 2-5 (ages 7-11) at two Rochester public schools. We assessed student understanding by analyzing percent repeatability of 20 survey questions and compared percent "don't know" (DK) responses across grades, gender, and race. Questions thought to be ambiguous after early survey administration were investigated in student focus groups, modified as appropriate, and reassessed. The response rate across all surveys was >87%. Survey questions were well understood; 16 of 20 questions demonstrated strong pre/post repeatability (>70%). Only 1 question showed an increase in DK response in higher grades. Revising the ambiguous survey questions improved measures of understanding in the final survey administration. Grade-school students' knowledge, attitudes and behavior toward influenza prevention can be assessed using surveys. Quantitative and qualitative analysis may be used to assess participant understanding and refine survey development for pediatric survey instruments. These methods may be used to assess the repeatability and validity of surveys to assess the impact of health education interventions in young children.
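Percent repeatability, as used above, is simply the share of students whose answer to a question is unchanged between two administrations. A minimal, hypothetical sketch (the answer lists are invented):

```python
def percent_repeatability(pre, post):
    """Share of answers (in %) that are identical across two
    administrations of the same question; pre/post hold one
    student's answer per position."""
    pairs = list(zip(pre, post))
    return 100.0 * sum(a == b for a, b in pairs) / len(pairs)

# Invented answers from ten students ("dk" = don't know).
pre  = ["yes", "no", "dk", "yes", "yes", "no", "yes", "no", "yes", "yes"]
post = ["yes", "no", "no", "yes", "yes", "no", "yes", "yes", "yes", "yes"]
score = percent_repeatability(pre, post)   # 8 of 10 answers unchanged -> 80.0
```

A question scoring above the study's 70% threshold would count as demonstrating strong pre/post repeatability.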

  1. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution.

    Science.gov (United States)

    Tregidgo, Daniel J; West, Sarah E; Ashmore, Mike R

    2013-11-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However, methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood.

  2. Multiplex PageRank

    CERN Document Server

    Halu, Arda; Panzarasa, Pietro; Bianconi, Ginestra

    2013-01-01

    Many complex systems can be described as multiplex networks in which the same nodes can interact with one another in different layers, thus forming a set of interacting and co-evolving networks. Examples of such multiplex systems are social networks where people are involved in different types of relationships and interact through various forms of communication media. The ranking of nodes in multiplex networks is one of the most pressing and challenging tasks that research on complex networks is currently facing. When pairs of nodes can be connected through multiple links and in multiple layers, the ranking of nodes should necessarily reflect the importance of nodes in one layer as well as their importance in other interdependent layers. In this paper, we draw on the idea of biased random walks to define the Multiplex PageRank centrality measure in which the effects of the interplay between networks on the centrality of nodes are directly taken into account. In particular, depending on the intensity of the in...
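The core idea, letting a node's centrality in one layer bias the random walk on another, can be sketched roughly as follows. This is a simplified illustration of the concept, not the paper's exact Multiplex PageRank definition, and the two toy layers are invented:

```python
def pagerank(adj, damping=0.85, bias=None, iters=100):
    """Power-iteration PageRank on a directed adjacency list.
    If `bias` is given, it reweights the teleportation (and dangling
    mass), which is one simple way to let another layer's
    centralities influence the walk."""
    n = len(adj)
    bias = bias or [1.0] * n
    s = sum(bias)
    tele = [b / s for b in bias]          # biased teleportation distribution
    rank = tele[:]
    for _ in range(iters):
        new = [0.0] * n
        for i, outs in enumerate(adj):
            if outs:
                share = rank[i] / len(outs)
                for j in outs:
                    new[j] += damping * share
            else:                          # dangling node: redistribute by tele
                for j in range(n):
                    new[j] += damping * rank[i] * tele[j]
        for j in range(n):
            new[j] += (1 - damping) * tele[j]
        rank = new
    return rank

# Two invented layers over the same four nodes.
layer_a = [[1], [2], [0], [0]]
layer_b = [[1, 2], [3], [3], [0]]
x = pagerank(layer_a)                  # importance within layer A
multiplex = pagerank(layer_b, bias=x)  # layer-B walk biased by layer-A ranks
```

Nodes that are central in layer A thus receive extra weight when ranking layer B, capturing the interdependence the abstract describes.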

  3. Turning the page

    Science.gov (United States)

    2015-01-01

    in hand to allow for a more limited production of a paper version of the Annals for RCS fellows and members who continue to elect to receive their Annals in the traditional format. Medical colleges around the world are currently undergoing similar deliberations and for some a digital version may represent the only opportunity to maintain editorial independence – unhindered by the implications of a commercial publishing partner. It is however hoped that for the vast majority of fellows and members, the new and enhanced digital platform will offer significant advantages such that the digital version becomes the de facto medium of choice. Matt Whitaker and the team at the Annals should be congratulated for their sterling efforts in making this transition. The new site, now live at http://publishing.rcseng.ac.uk, will enhance the experience of finding, accessing, reading, citing, sharing and saving articles from the Annals, Bulletin and FDJ. Sign-on will be much easier; page load times quicker and the search engine more powerful and intuitive. The new platform boasts improved functionality, full in-page article text and multi-media, citation tracking, reference generators and advanced social media integration. We are simultaneously launching a new video library where we will be hosting our technical videos. It will, I am certain, become a huge resource for our surgical fraternity. Our new platform will be followed later this year by the inevitable and ubiquitous app, which will allow readers to download issues of the Annals and read them offline and at leisure on whatever their tablet of choice might be. It is my belief that these and forthcoming changes herald the transformation of the Annals into a truly modern journal with all the digital services that authors and readers now rightly expect from their RCS publication. Tim Lane Editor-in-Chief, rcsannalseditor@gmail.com

  4. Internet snapshot survey: A novel methodology to monitor novel psychotropic substances and its need in Asia.

    Science.gov (United States)

    Mahapatra, Ananya; Sharma, Pawan

    2016-06-01

    Recently there has been an upsurge in the use of novel psychoactive substances, commonly known as legal highs. Limited data are available on the use and availability of these substances. Internet snapshot methodology has been used successfully in Europe and America to understand the rapidly adapting Internet-based drug market, but no data are available from the Asian region. Hence there is a need to apply similar methodology in Asia to explore and gauge the scale of the problem posed by these substances.

  5. Improving quality in population surveys of headache prevalence, burden and cost: key methodological considerations.

    Science.gov (United States)

    Steiner, Timothy J; Stovner, Lars Jacob; Al Jumah, Mohammed; Birbeck, Gretchen L; Gururaj, Gopalakrishna; Jensen, Rigmor; Katsarava, Zaza; Queiroz, Luiz Paulo; Scher, Ann I; Tekle-Haimanot, Redda; Wang, Shuu-Jiun; Martelletti, Paolo; Dua, Tarun; Chatterji, Somnath

    2013-01-01

    Population-based studies of headache disorders are important. They inform needs assessment and underpin service policy for a set of disorders that are a public-health priority. On the one hand, our knowledge of the global burden of headache is incomplete, with major geographical gaps; on the other, methodological differences and variable quality are notable among published studies of headache prevalence, burden and cost. The purpose here was to start the process of developing standardized and better methodology in these studies. An expert consensus group was assembled to identify the key methodological issues, and areas where studies might fail. Members had competence and practical experience in headache epidemiology or epidemiology in general, and were drawn from all WHO world regions. We reviewed the relevant literature, and supplemented the knowledge gathered from this exercise with experience gained from recent Global Campaign population-based studies, not all yet published. We extracted methodological themes and identified issues within them that were of key importance. We found wide variations in methodology. The themes within which methodological shortcomings had adverse impact on quality were the following: study design; selection and/or definition of population of interest; sampling and bias avoidance; sample size estimation; access to selected subjects (managing and reporting non-participation); case definition (including diagnosis and timeframe); case ascertainment (including diagnostic validation of questionnaires); burden estimation; reporting (methods and results). These are discussed.
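One theme in the list above, sample size estimation, has a standard closed form for prevalence surveys: n = deff · z²p(1−p)/e², inflated for expected non-participation. A sketch with illustrative defaults (the design effect and participation rate are assumptions for the example, not values from the paper):

```python
import math

def survey_sample_size(p, margin, deff=2.0, z=1.96, participation=0.9):
    """Respondents needed to estimate prevalence p with absolute
    precision `margin` at the confidence level implied by z,
    inflated by a design effect (cluster sampling) and expected
    participation. Default deff and participation are illustrative."""
    n_srs = z ** 2 * p * (1 - p) / margin ** 2   # simple random sampling size
    return math.ceil(n_srs * deff / participation)

# e.g. an expected headache prevalence of 50% estimated to within +/-3%:
n = survey_sample_size(p=0.5, margin=0.03)
```

Taking p = 0.5 maximizes p(1−p) and so gives a conservative sample size when the true prevalence is unknown.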

  6. Methodology of Correcting Nonresponse Bias: Introducing Another Bias? The Case of the Swiss Innovation Survey 2002

    OpenAIRE

    Sydow, Nora

    2006-01-01

    Non-response in a survey can lead to severe bias. To manage this problem, it is usual to conduct a second survey of a sample of non-respondents. This allows us to test whether there is a significant difference in the key variables of the survey between respondents and non-respondents and, if so, to take it into account. However, there is a considerable risk of introducing another bias depending on the mode (mail vs phone) of the survey. The KOF industrial economics group has been exploring for many years the innova...
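The check described above, whether respondents and a follow-up sample of non-respondents differ on a key variable, can be done with a standard two-proportion z-test. A minimal sketch; all counts are invented for illustration:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled z statistic for the difference between two proportions,
    e.g. a key survey rate among respondents vs. a follow-up sample
    of initial non-respondents."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Invented counts: 42% innovators among respondents, 30% among non-respondents.
z = two_proportion_z(x1=420, n1=1000, x2=60, n2=200)
```

If |z| exceeds the critical value (1.96 at the 5% level), the groups differ on that variable and the estimates should be adjusted, with the caveat the abstract raises: a follow-up conducted in a different mode may itself introduce bias.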

  7. Comparing econometric and survey-based methodologies in measuring offshoring: The Danish experience

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    such as the national or regional level. Most macro analyses are based on proxies and trade statistics with limitations. Drawing on unique Danish survey data, this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of employment...

  8. INTEGRATED SURVEY FOR ARCHITECTURAL RESTORATION: A METHODOLOGICAL COMPARISON OF TWO CASE STUDIES

    Directory of Open Access Journals (Sweden)

    G. Bianchi

    2016-06-01

    A preliminary survey campaign is essential in projects of restoration, urban renewal, rebuilding or promotion of architectural heritage. Today several survey techniques allow full 3D object restitution and modelling that provides a richer description than simple 2D representations. However, the amount of data to collect increases dramatically and a trade-off between efficiency and productivity from one side and assuring accuracy and completeness of the results on the other must be found. Depending on the extent and the complexity of the task, a single technique or a combination of several ones might be employed. Especially when documentation at different scales and with different levels of detail are foreseen, the latter will likely be necessary. The paper describes two architectural surveys in Italy: the old village of Navelli (AQ), affected by the earthquake in 2009, and the two most relevant remains in Codiponte (MS), damaged by the earthquake in 2013, both in the context of a project of restoration and conservation. In both sites, a 3D survey was necessary to represent effectively the objects. An integrated survey campaign was performed in both cases, which consists of a GPS network as support for georeferencing, an aerial survey and a field survey made by laser scanner and close range photogrammetry. The two case studies, thanks to their peculiarities, can be taken as exemplar to wonder if the integration of different surveying techniques is today still mandatory or, considering the technical advances of each technology, it is in fact just optional.

  9. Integrated Survey for Architectural Restoration: a Methodological Comparison of Two Case Studies

    Science.gov (United States)

    Bianchi, G.; Bruno, N.; Dall'Asta, E.; Forlani, G.; Re, C.; Roncella, R.; Santise, M.; Vernizzi, C.; Zerbi, A.

    2016-06-01

    A preliminary survey campaign is essential in projects of restoration, urban renewal, rebuilding or promotion of architectural heritage. Today several survey techniques allow full 3D object restitution and modelling that provides a richer description than simple 2D representations. However, the amount of data to collect increases dramatically and a trade-off between efficiency and productivity from one side and assuring accuracy and completeness of the results on the other must be found. Depending on the extent and the complexity of the task, a single technique or a combination of several ones might be employed. Especially when documentation at different scales and with different levels of detail are foreseen, the latter will likely be necessary. The paper describes two architectural surveys in Italy: the old village of Navelli (AQ), affected by the earthquake in 2009, and the two most relevant remains in Codiponte (MS), damaged by the earthquake in 2013, both in the context of a project of restoration and conservation. In both sites, a 3D survey was necessary to represent effectively the objects. An integrated survey campaign was performed in both cases, which consists of a GPS network as support for georeferencing, an aerial survey and a field survey made by laser scanner and close range photogrammetry. The two case studies, thanks to their peculiarities, can be taken as exemplar to wonder if the integration of different surveying techniques is today still mandatory or, considering the technical advances of each technology, it is in fact just optional.

  10. SURVEY OF ABANDONED INDUSTRIAL SITES IN THE PROVINCE OF CARINTHIA/AUSTRIA - METHODOLOGY AND RESULTS

    Directory of Open Access Journals (Sweden)

    WOLFGANG FISCHER

    2008-12-01

    The paper addresses abandoned industrial and commercial sites. Its central content is the results of a survey of abandoned sites in the province of Carinthia and the method of investigation used, including experiences and recommendations. The survey started with more than 10,000 businesses. After the different steps of the survey (research phase, exploration phase, clarification phase and evaluation phase), 444 sites were attributed an increased potential for danger. These steps were followed by an evaluation of priority with regard to the securing and/or remediation of the abandoned sites. On the basis of the values gathered, future surveys of abandoned industrial and commercial sites can be estimated fairly exactly.

  11. #NoMorePage3

    DEFF Research Database (Denmark)

    Glozer, Sarah; McCarthy, Lauren; Whelan, Glen

    2015-01-01

    Fourth wave feminists are currently seeking to bring an end to The Sun’s Page 3, a British institution infamous for featuring a topless female model daily. This paper investigates the No More Page 3 (NMP3) campaign through which feminist activists have sought to disrupt the institutionalized obje...

  12. Web Page Design (Part Three).

    Science.gov (United States)

    Descy, Don E.

    1997-01-01

    Discusses fonts as well as design considerations that should be reviewed when designing World Wide Web pages and sites to make them easier for clients to use and easier to maintain. Also discusses the simplicity of names; organization of pages, folders, and files; and sites to help build Web sites. (LRW)

  13. Environmental and production survey methodology to estimate severity and extent of aquaculture impact in three areas of the Philippines

    Directory of Open Access Journals (Sweden)

    Rune Palerud

    2008-12-01

    The project "Environmental Monitoring and Modelling of Aquaculture in the Philippines", known as EMMA, was undertaken by the National Integrated Fisheries Technology Development Centre (NIFTDC) of the Bureau of Fisheries and Aquatic Resources (BFAR) and Akvaplan-niva AS of Tromsø, Norway. The project was funded by the Norwegian Agency for Development Cooperation (NORAD). This project tested survey equipment for the monitoring of aquaculture impact on the water column and sediment. Baseline surveys were undertaken, as the goal of the study was to develop suitable aquaculture monitoring techniques and adapt predictive models to assist in identifying risk areas for aquaculture and allow planned development of sustainable aquaculture. Three different locations were chosen as case studies: Bolinao, Pangasinan (marine site), Dagupan (brackish water site), and Taal Lake (freshwater site). Production surveys were also undertaken to estimate production and nutrient outputs to the water bodies in order to link aquaculture production with the severity and extent of impacts. Different methodologies for the estimation of production were tested to find a cost-effective and accurate methodology.

  14. The Common Topoi of STEM Discourse: An Apologia and Methodological Proposal, with Pilot Survey

    Science.gov (United States)

    Walsh, Lynda

    2010-01-01

    In this article, the author proposes a methodology for the rhetorical analysis of scientific, technical, mathematical, and engineering (STEM) discourse based on the common topics (topoi) of this discourse. Beginning with work by Miller, Prelli, and other rhetoricians of STEM discourse--but factoring in related studies in cognitive linguistics--she…

  15. A review of methodology and analysis of nutrition and mortality surveys conducted in humanitarian emergencies from October 1993 to April 2004

    Directory of Open Access Journals (Sweden)

    Spiegel Paul B

    2007-06-01

    Background: Malnutrition prevalence and mortality rates are increasingly used as essential indicators to assess the severity of a crisis, to follow trends, and to guide decision-making, including allocation of funds. Although consensus has slowly developed on the methodology to accurately measure these indicators, errors in the application of the survey methodology and analysis have persisted. The aim of this study was to identify common methodological weaknesses in nutrition and mortality surveys and to provide practical recommendations for improvement. Methods: Nutrition (N = 368) and crude mortality rate (CMR; N = 158) surveys conducted by 33 non-governmental organisations and United Nations agencies in 17 countries from October 1993 to April 2004 were analysed for sampling validity, precision, quality of measurement and calculation according to several criteria. Results: One hundred and thirty (35.3%) nutrition surveys and 5 (3.2%) CMR surveys met the criteria for quality. Quality of surveys varied significantly depending on the agency. The proportion of nutrition surveys that met criteria for quality rose significantly from 1993 to 2004; there was no improvement for mortality surveys during this period. Conclusion: Significant errors and imprecision in the methodology and reporting of nutrition and mortality surveys were identified. While there was an improvement in the quality of nutrition surveys over the years, the quality of mortality surveys remained poor. Recent initiatives aimed at standardising nutrition and mortality survey quality should be strengthened. There are still a number of methodological issues in nutrition and mortality surveys in humanitarian emergencies that need further study.

  16. Doctoral Training in Statistics, Measurement, and Methodology in Psychology: Replication and Extension of Aiken, West, Sechrest, and Reno's (1990) Survey of PhD Programs in North America

    Science.gov (United States)

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…

  18. Survey on methodologies in the risk assessment of chemical exposures in emergency response situations in Europe

    DEFF Research Database (Denmark)

    Heinälä, Milla; Gundert-Remy, Ursula; Wood, Maureen Heraty

    2013-01-01

    A scientifically sound assessment of the risk to human health resulting from acute chemical releases is the cornerstone of chemical incident prevention, preparedness and response. Although the general methodology to identify acute toxicity of chemicals has not substantially changed in the last ... /corrosive chemicals will remain serious risks also in the future, the development of plausible scenarios for potential emerging risks is also needed. This includes risks from new mixtures and chemicals (e.g. nanoparticles).

  19. The burden of headache disorders in India: methodology and questionnaire validation for a community-based survey in Karnataka State.

    Science.gov (United States)

    Rao, Girish N; Kulkarni, Girish B; Gururaj, Gopalkrishna; Rajesh, Kavita; Subbakrishna, D Kumaraswamy; Steiner, Timothy J; Stovner, Lars J

    2012-10-01

    Primary headache disorders are a major public-health problem globally and, possibly more so, in low- and middle-income countries. No methodologically sound studies of prevalence and burden of headache in the adult Indian population have been published previously. The present study was a door-to-door cold-calling survey in urban and rural areas in and around Bangalore, Karnataka State. From 2,714 households contacted, 2,514 biologically unrelated individuals were eligible for the survey and 2,329 (92.9 %) participated (1,103 [48 %] rural; 1,226 [52 %] urban; 1,141 [49 %] male; 1,188 [51 %] female; mean age 38.0 years). The focus was on primary headache (migraine and tension-type headache [TTH]) and medication-overuse headache. A structured questionnaire administered by trained lay interviewers was the instrument both for diagnosis (algorithmically determined from responses) and burden estimation. The screening question enquired into headache in the last year. The validation study compared questionnaire-based diagnoses with those obtained soon after through personal interview by a neurologist in a random sub-sample of participants (n = 381; 16 %). It showed high values (> 80 %) for sensitivity, specificity and predictive values for any headache, and for specificity and negative predictive value for migraine and TTH. Kappa values for diagnostic agreement were good for any headache (0.69 [95 % CI 0.61-0.76]), moderate (0.46 [0.35-0.56]) for migraine and fair (0.39 [0.29-0.49]) for TTH. The survey methodology, including identification of and access to participants, proved feasible. The questionnaire proved effective in the survey population. The study will give reliable estimates of the prevalence and burden of headache, and of migraine and TTH specifically, in urban and rural Karnataka.
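The kappa values quoted above measure agreement between the questionnaire algorithm and the neurologist beyond what chance alone would produce. Cohen's kappa can be computed directly from the paired diagnoses; the pairs below are constructed to give a known value and are not taken from the study:

```python
def cohens_kappa(pairs):
    """Cohen's kappa for paired categorical ratings, e.g.
    (questionnaire diagnosis, neurologist diagnosis) per participant."""
    n = len(pairs)
    cats = sorted({c for pair in pairs for c in pair})
    po = sum(a == b for a, b in pairs) / n            # observed agreement
    pe = sum((sum(a == c for a, _ in pairs) / n)      # chance agreement from
             * (sum(b == c for _, b in pairs) / n)    # the two raters' marginals
             for c in cats)
    return (po - pe) / (1 - pe)

# Invented 2x2 agreement table: 40 yes/yes, 10 yes/no, 10 no/yes, 40 no/no.
pairs = ([("y", "y")] * 40 + [("y", "n")] * 10
         + [("n", "y")] * 10 + [("n", "n")] * 40)
k = cohens_kappa(pairs)   # observed 0.8 vs chance 0.5 -> kappa 0.6
```

On the usual interpretation scale, a kappa around 0.6 (as here) is "good", matching the study's agreement for any headache, while values of 0.4-0.6 and 0.2-0.4 are "moderate" and "fair".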

  20. The Carnegie-Spitzer-IMACS Redshift Survey of Galaxy Evolution since z=1.5: I. Description and Methodology

    CERN Document Server

    Kelson, Daniel D; Dressler, Alan; McCarthy, Patrick J; Shectman, Stephen A; Mulchaey, John S; Villanueva, Edward V; Crane, Jeffrey D; Quadri, Ryan F

    2012-01-01

    We describe the Carnegie-Spitzer-IMACS (CSI) Survey, a wide-field, near-IR selected spectrophotometric redshift survey with the Inamori Magellan Areal Camera and Spectrograph (IMACS) on Magellan-Baade. By defining a flux-limited sample of galaxies in Spitzer 3.6micron imaging of SWIRE fields, the CSI Survey efficiently traces the stellar mass of average galaxies to z~1.5. This first paper provides an overview of the survey selection, observations, processing of the photometry and spectrophotometry. We also describe the processing of the data: new methods of fitting synthetic templates of spectral energy distributions are used to derive redshifts, stellar masses, emission line luminosities, and coarse information on recent star-formation. Our unique methodology for analyzing low-dispersion spectra taken with multilayer prisms in IMACS, combined with panchromatic photometry from the ultraviolet to the IR, has yielded 37,000 high quality redshifts in our first 5.3 sq.degs of the SWIRE XMM-LSS field. We use three...

  1. Application of hazard analysis and critical control point methodology and risk-based grading to consumer food safety surveys.

    Science.gov (United States)

    Røssvoll, Elin Halbach; Ueland, Øydis; Hagtvedt, Therese; Jacobsen, Eivind; Lavik, Randi; Langsrud, Solveig

    2012-09-01

    Traditionally, consumer food safety survey responses have been classified as either "right" or "wrong" and food handling practices that are associated with high risk of infection have been treated in the same way as practices with lower risks. In this study, a risk-based method for consumer food safety surveys has been developed, and HACCP (hazard analysis and critical control point) methodology was used for selecting relevant questions. We conducted a nationally representative Web-based survey (n = 2,008), and to fit the self-reported answers we adjusted a risk-based grading system originally developed for observational studies. The results of the survey were analyzed both with the traditional "right" and "wrong" classification and with the risk-based grading system. The results using the two methods were very different. Only 5 of the 10 most frequent food handling violations were among the 10 practices associated with the highest risk. These 10 practices dealt with different aspects of heat treatment (lacking or insufficient), whereas the majority of the most frequent violations involved storing food at room temperature for too long. Use of the risk-based grading system for survey responses gave a more realistic picture of risks associated with domestic food handling practices. The method highlighted important violations and minor errors, which are performed by most people and are not associated with significant risk. Surveys built on a HACCP-based approach with risk-based grading will contribute to a better understanding of domestic food handling practices and will be of great value for targeted information and educational activities.

  2. Distributed paging for general networks

    Energy Technology Data Exchange (ETDEWEB)

    Awerbuch, B.; Bartal, Y.; Fiat, A. [Tel-Aviv Univ. (Israel)

    1996-12-31

Distributed paging deals with the dynamic allocation of copies of files in a distributed network so as to minimize the total communication cost over a sequence of read and write requests. Most previous work deals with the file allocation problem, where infinite nodal memory capacity is assumed. In contrast, the distributed paging problem makes the more realistic assumption that nodal memory capacity is limited. Former work on distributed paging deals with the problem only in the case of a uniform network topology. This paper gives the first distributed paging algorithm for general networks. The algorithm is competitive in storage and communication. The competitive ratios are poly-logarithmic in the total number of network nodes and the diameter of the network.

  3. PageRank of integers

    CERN Document Server

    Frahm, K M; Shepelyansky, D L

    2012-01-01

We build up a directed network tracing links from a given integer to its divisors and analyze the properties of the Google matrix of this network. The PageRank vector of this matrix is computed numerically, and it is shown that its probability is inversely proportional to the PageRank index, thus being similar to Zipf's law and to the dependence established for the World Wide Web. The spectrum of the Google matrix of integers is characterized by a large gap and a relatively small number of nonzero eigenvalues. A simple semi-analytical expression for the PageRank of integers is derived that allows one to find this vector for matrices of size in the billions. This network provides a new PageRank order of integers.
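The divisor network described above can be reproduced at small scale; a minimal sketch, assuming links run from each integer to its proper divisors and using standard power iteration with damping factor 0.85 (a conventional choice, not stated in the abstract):

```python
def pagerank_of_integers(N, alpha=0.85, iters=200):
    """PageRank of the divisor network on {1..N}: node n links to each
    proper divisor d of n (d < n). The dangling node 1 is spread uniformly."""
    # adjacency: out-links of each integer
    out = {n: [d for d in range(1, n) if n % d == 0] for n in range(1, N + 1)}
    p = [1.0 / N] * N  # probability vector, index i holds integer i+1
    for _ in range(iters):
        new = [(1 - alpha) / N] * N
        for n in range(1, N + 1):
            links = out[n]
            if links:
                share = alpha * p[n - 1] / len(links)
                for d in links:
                    new[d - 1] += share
            else:  # dangling node: redistribute its mass uniformly
                for i in range(N):
                    new[i] += alpha * p[n - 1] / N
        p = new
    return p

p = pagerank_of_integers(100)
# The integer 1 divides everything, so it tops the PageRank order.
top = max(range(100), key=lambda i: p[i]) + 1
print(top)  # → 1
```

Sorting `p` in decreasing order gives the "PageRank order of integers" the abstract refers to; the semi-analytical expression in the paper is what makes this feasible far beyond the brute-force scale shown here.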

  4. Code AI Personal Web Pages

    Science.gov (United States)

    Garcia, Joseph A.; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The document consists of a publicly available web site (george.arc.nasa.gov) for Joseph A. Garcia's personal web pages in the AI division. Only general information will be posted and no technical material. All the information is unclassified.

  5. Methodology for conduct of epidemiologic surveys and randomized controlled trials of diabetic polyneuropathy.

    Science.gov (United States)

    Dyck, Peter James

    2014-01-01

    This chapter outlines: (1) the reasons why epidemiologic surveys and randomized controlled clinical trials (RCTs) of diabetic polyneuropathy (DPN) are difficult and expensive, and often poorly done, (2) primary and secondary neuropathy end points, (3) single versus composite neuropathic end points, (4) adequate reference values from study of population representative cohorts, and (5) the issue of clinical proficiency.

  6. Split-Half Administration of the 2015 School Crime Supplement to the National Crime Victimization Survey. Methodology Report. NCES 2017-004

    Science.gov (United States)

    Lessne, Deborah; Cidade, Melissa

    2016-01-01

    This report outlines the development, methodology, and results of the split-half administration of the 2015 School Crime Supplement (SCS) to the National Crime Victimization Survey (NCVS). The National Crime Victimization Survey (NCVS) is sponsored by the U.S. Department of Justice, Bureau of Justice Statistics (BJS). The National Center for…

  7. Split-Half Administration of the 2015 School Crime Supplement to the National Crime Victimization Survey: Methodology Report. NCES 2017-004

    Science.gov (United States)

    Lessne, Deborah; Cidade, Melissa

    2016-01-01

    This report outlines the development, methodology, and results of the split-half administration of the 2015 School Crime Supplement (SCS) to the National Crime Victimization Survey (NCVS). The National Crime Victimization Survey (NCVS) is sponsored by the U.S. Department of Justice, Bureau of Justice Statistics (BJS). The U.S. Census Bureau…

  8. Energy efficiency and energy savings in Japanese residential buildings - research methodology and surveyed results

    Energy Technology Data Exchange (ETDEWEB)

    Lopes, L.; Hokoi, S.; Miura, H. [Kyoto University (Japan). Faculty of Engineering, Department of Architecture and Environmental Design; Shuhei, K. [Kansai Electric Power Company Inc., Amagasaki (Japan). Energy Use R and D Center

    2005-07-01

Worldwide energy consumption has risen 30% in the last 25 years. Fossil fuel exploitation is causing depletion of resources and serious environmental problems. Energy efficiency improvements and energy savings are important targets to be achieved in every society as a whole, and in residential buildings in particular. In this article, the results of a survey and questionnaire on energy consumption and the thermal environment, held in the Kansai area, Japan, are reported. The energy savings potential was analyzed for the 13 surveyed houses, focusing on certain electrical appliances, e.g. the TV, rice cooker and refrigerator. Residents' environmental awareness of energy consumption was clarified through the questionnaire. An energy information session for residents was held, and the resulting changes in lifestyle and their implications for energy consumption were evaluated. (author)

  9. Data Processing Procedures and Methodology for Estimating Trip Distances for the 1995 American Travel Survey (ATS)

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, H.-L.; Rollow, J.

    2000-05-01

    The 1995 American Travel Survey (ATS) collected information from approximately 80,000 U.S. households about their long distance travel (one-way trips of 100 miles or more) during the year of 1995. It is the most comprehensive survey of where, why, and how U.S. residents travel since 1977. ATS is a joint effort by the U.S. Department of Transportation (DOT) Bureau of Transportation Statistics (BTS) and the U.S. Department of Commerce Bureau of Census (Census); BTS provided the funding and supervision of the project, and Census selected the samples, conducted interviews, and processed the data. This report documents the technical support for the ATS provided by the Center for Transportation Analysis (CTA) in Oak Ridge National Laboratory (ORNL), which included the estimation of trip distances as well as data quality editing and checking of variables required for the distance calculations.

  10. The XMM Cluster Survey: Optical analysis methodology and the first data release

    CERN Document Server

    Mehrtens, Nicola; Lloyd-Davies, E J; Hilton, Matt; Miller, Christopher J; Stanford, S A; Hosmer, Mark; Hoyle, Ben; Collins, Chris A; Liddle, Andrew R; Viana, Pedro T P; Nichol, Robert C; Stott, John P; Dubois, E Naomi; Kay, Scott T; Sahlen, Martin; Young, Owain; Short, C J; Christodoulou, L; Watson, William A; Davidson, Michael; Harrison, Craig D; Baruah, Leon; Smith, Mathew; Burke, Claire; Deadman, Paul-James; Rooney, Philip J; Edmondson, Edward M; West, Michael; Campbell, Heather C; Edge, Alastair C; Mann, Robert G; Wake, David; Benoist, Christophe; da Costa, Luiz; Maia, Marcio A G; Ogando, Ricardo

    2011-01-01

The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we present the first data release from the XMM Cluster Survey (XCS-DR1). This consists of 503 optically confirmed, serendipitously detected, X-ray clusters. Of these clusters, 255 are new to the literature and 356 are new X-ray discoveries. We present 464 clusters with a redshift estimate (0.06 1.0, including a new spectroscopically-confirmed cluster at z = 1.01); (ii) 67 clusters with high Tx (> 5 keV); (iii) 131 clusters/groups with low Tx (< 2 keV); (iv) 27 clusters with measured Tx values in the SDSS `Stripe 82' co-add region; (v) 78 clusters with measured Tx values in the Dark Energy Survey region; (vi) 40 clusters detected with sufficient counts to permit mass measurements (under the assumption of hydrostatic equilibrium); (vii) 105 cluste...

  11. The 2014 Survey on Living with Chronic Diseases in Canada on Mood and Anxiety Disorders: a methodological overview

    Directory of Open Access Journals (Sweden)

    S. O’Donnell

    2016-12-01

Introduction: There is a paucity of information about the impact of mood and anxiety disorders on Canadians and the approaches used to manage them. To address this gap, the 2014 Survey on Living with Chronic Diseases in Canada–Mood and Anxiety Disorders Component (SLCDC-MA) was developed. The purpose of this paper is to describe the methodology of the 2014 SLCDC-MA and examine the sociodemographic characteristics of the final sample. Methods: The 2014 SLCDC-MA is a cross-sectional follow-up survey that includes Canadians from the 10 provinces, aged 18 years and older, with mood and/or anxiety disorders diagnosed by a health professional that are expected to last, or have already lasted, six months or more. The survey was developed by the Public Health Agency of Canada (PHAC) through an iterative, consultative process with Statistics Canada and external experts. Statistics Canada performed content testing, designed the sampling frame and strategies, and collected and processed the data. PHAC used descriptive analyses to describe the respondents' sociodemographic characteristics, produced nationally representative estimates using survey weights provided by Statistics Canada, and generated variance estimates using bootstrap methodology. Results: The final 2014 SLCDC-MA sample consists of a total of 3361 respondents (68.9% response rate). Among Canadian adults with mood and/or anxiety disorders, close to two-thirds (64%) were female, over half (56%) were married/in a common-law relationship and 60% obtained a post-secondary education. Most were young or middle-aged (85%), Canadian born (88%), of non-Aboriginal status (95%), and resided in an urban setting (82%). Household income was fairly evenly distributed between the adequacy quintiles; however, individuals were more likely to report a household income adequacy within the lowest (23%) versus highest (17%) quintile. Forty-five percent reported having a mood disorder only, 24% an anxiety disorder only and 31
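The bootstrap variance estimation mentioned above can be sketched for a simple unweighted mean; the actual survey used design weights and Statistics Canada's replicate weights, so this shows only the basic idea, with invented data:

```python
import random

def bootstrap_variance(data, stat, reps=1000, seed=42):
    """Bootstrap variance of a statistic: resample with replacement,
    recompute the statistic, take the variance of the replicates."""
    rng = random.Random(seed)
    n = len(data)
    replicates = []
    for _ in range(reps):
        sample = [data[rng.randrange(n)] for _ in range(n)]
        replicates.append(stat(sample))
    m = sum(replicates) / reps
    return sum((v - m) ** 2 for v in replicates) / (reps - 1)

mean = lambda xs: sum(xs) / len(xs)
data = [1, 2, 2, 3, 3, 3, 4, 5, 5, 6]  # toy responses
var = bootstrap_variance(data, mean)
print(var)  # close to the textbook variance of the mean, s^2/n
```

In a real survey setting the resampling respects the sample design (strata, clusters) rather than drawing individuals independently, which is why agencies publish pre-built replicate weights.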

  12. [The Health, Well-Being, and Aging ("SABE") survey: methodology applied and profile of the study population].

    Science.gov (United States)

    Albala, Cecilia; Lebrão, Maria Lúcia; León Díaz, Esther María; Ham-Chande, Roberto; Hennis, Anselm J; Palloni, Alberto; Peláez, Martha; Pratts, Omar

    2005-01-01

    This document outlines the methodology of the Salud, Bienestar y Envejecimiento (Health, Well-Being, and Aging) survey (known as the "SABE survey"), and it also summarizes the challenges that the rapid aging of the population in Latin America and the Caribbean imposes on society in general and especially on health services. The populations of the countries of Latin America and the Caribbean are aging at a rate that has not been seen in the developed world. The evaluation of health problems and disability among older adults in those countries indicates that those persons are aging with more functional limitations and worse health than is true for their counterparts in developed nations. In addition, family networks in Latin America and the Caribbean are changing rapidly and have less capacity to make up for the lack of protections provided by social institutions. The multicenter SABE study was developed with the objective of evaluating the state of health of older adults in seven cities of Latin America and the Caribbean: Bridgetown, Barbados; Buenos Aires, Argentina; Havana, Cuba; Mexico City, Mexico; Montevideo, Uruguay; Santiago, Chile; and São Paulo, Brazil. The SABE survey has established the starting point for systematic research on aging in urban areas of Latin America and the Caribbean. Comparative studies of these characteristics and with this comparative nature should be extended to other countries, areas, and regions of the world in order to expand the knowledge available on older adults.

  13. [Methodological discussion about prevalence of the dental fluorosis on dental health surveys].

    Science.gov (United States)

    Freitas, Cláudia Helena Soares de Morais; Sampaio, Fábio Correia; Roncalli, Angelo Giuseppe; Moysés, Samuel Jorge

    2013-12-01

To analyze the limitations of studying dental fluorosis in cross-sectional studies. Data from the Oral Health of the Brazilian Population (SBBrasil 2003) and the Brazilian Oral Health Survey (SBBrasil 2010) were used. Epidemiological trends for fluorosis in 12-year-old Brazilians, aspects of the reliability of the data, and the accuracy of the estimates are assessed for these two studies. The distribution of the prevalence of fluorosis was broken down according to the domains of the study (state capitals and regions) and the year in which the study took place. The confidence intervals (95%CI) are also shown for simple prevalence (without taking level of severity into account). The prevalence of dental fluorosis showed considerable variation: between 0% and 61% in 2003 and between 0% and 59% in 2010. Inconsistencies were observed in the data in individual terms (by year and by domain) and in the behavior of the trend. Considering the expected prevalence and the data available in the two studies, the minimum sample size should be 1,500 individuals in order to obtain 3.4% and 6.6% confidence limits, assuming a minimum coefficient of variation of 15%. Given the subjectivity of its classification, examinations for dental fluorosis may show more variation than those for other oral health conditions. The power to establish differences between the domains of the study with the sample of the SBBrasil 2010 is quite limited. Based on the 2003 and 2010 studies, it was not possible to analyze patterns of dental fluorosis in Brazil; these data are merely exploratory indicators of its prevalence. Comparisons were impossible because different analysis models were used in the two surveys. Investigating dental fluorosis in population-based surveys is not an economically viable approach; localized epidemiological studies with a proper sampling plan would be more suitable.
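The link between sample size and confidence-interval width discussed above can be illustrated with the normal approximation for a proportion; the 5% prevalence used here, and the resulting bounds, are illustrative rather than the survey's figures:

```python
import math

def prevalence_ci(p, n, z=1.96):
    """Normal-approximation 95% CI for a prevalence p estimated from n subjects."""
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# Hypothetical 5% prevalence measured on n = 1500 subjects
lo, hi = prevalence_ci(0.05, 1500)
print(round(lo * 100, 1), round(hi * 100, 1))  # → 3.9 6.1
```

Halving the interval width requires roughly four times the sample, which is why low-prevalence conditions such as severe fluorosis quickly push sample-size requirements into the thousands.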

  14. A SURVEY AND ANALYSIS ON CHINESE TEACHERSAND LEARNERS' NEEDS CONFLICT OF METHODOLOGY

    Institute of Scientific and Technical Information of China (English)

    YuXia

    2004-01-01

Needs analysis, a commonly used technique for identifying learners' needs in language teaching, has been adopted by many College English teachers. However, the point that needs to be made is that learners' needs may conflict with the teacher's, a conflict confirmed by the survey study conducted by the author. Based on Nunan's negotiation model, this paper explores means of balancing the conflict within a learner-centered system. It is proposed that the needs conflict can be eased by negotiating learning activities, developing learners' learning strategies, establishing rapport with students, and improving the ability to organize the class.

  15. Recent advancements in information extraction methodology and hardware for earth resources survey systems

    Science.gov (United States)

    Erickson, J. D.; Thomson, F. J.

    1974-01-01

    The present work discusses some recent developments in preprocessing and extractive processing techniques and hardware and in user applications model development for earth resources survey systems. The Multivariate Interactive Digital Analysis System (MIDAS) is currently being developed, and is an attempt to solve the problem of real time multispectral data processing in an operational system. The main features and design philosophy of this system are described. Examples of wetlands mapping and land resource inventory are presented. A user model developed for predicting the yearly production of mallard ducks from remote sensing and ancillary data is described.

  16. The Canadian survey of health, lifestyle and ageing with multiple sclerosis: methodology and initial results

    Science.gov (United States)

    Ploughman, Michelle; Beaulieu, Serge; Harris, Chelsea; Hogan, Stephen; Manning, Olivia J; Alderdice, Penelope W; Fisk, John D; Sadovnick, A Dessa; O'Connor, Paul; Morrow, Sarah A; Metz, Luanne M; Smyth, Penelope; Mayo, Nancy; Marrie, Ruth Ann; Knox, Katherine B; Stefanelli, Mark; Godwin, Marshall

    2014-01-01

    Objective People with multiple sclerosis (MS) are living longer so strategies to enhance long-term health are garnering more interest. We aimed to create a profile of ageing with MS in Canada by recruiting 1250 (5% of the Canadian population above 55 years with MS) participants and focusing data collection on health and lifestyle factors, disability, participation and quality of life to determine factors associated with healthy ageing. Design National multicentre postal survey. Setting Recruitment from Canadian MS clinics, MS Society of Canada chapters and newspaper advertisements. Participants People aged 55 years or older with MS symptoms more than 20 years. Outcome measures Validated outcome measures and custom-designed questions examining MS disease characteristics, living situation, disability, comorbid conditions, fatigue, health behaviours, mental health, social support, impact of MS and others. Results Of the 921 surveys, 743 were returned (80.7% response rate). Participants (mean age 64.6±6.2 years) reported living with MS symptoms for an average of 32.9±9.5 years and 28.6% were either wheelchair users or bedridden. There was only 5.4% missing data and 709 respondents provided optional qualitative information. According to data derived from the 2012 Canadian Community Health Survey of Canadians above 55 years of age, older people with MS from this survey sample are about eight times less likely to be employed full-time. Older people with MS were less likely to engage in regular physical activity (26.7%) compared with typical older Canadians (45.2%). However, they were more likely to abstain from alcohol and smoking. Conclusions Despite barriers to participation, we were able to recruit and gather detailed responses (with good data quality) from a large proportion of older Canadians with MS. The data suggest that this sample of older people with MS is less likely to be employed, are less active and more disabled than other older Canadians

  17. Discontinuation of the Bulletin's menu page

    CERN Multimedia

    Publications Section

    2005-01-01

    The menus of the various CERN restaurants will no longer be published in the Bulletin as of Monday 4 April (issue No. 14/2005). The menu pages are being discontinued both as a savings measure and due to the low level of interest in this section of the Bulletin. The most recent survey of Bulletin readers showed that only 13% of the people questioned regularly read the menu section, compared to between 40% and 85% in the case of the other sections. Publications Section SG/CO Tel. 79971

  18. Discontinuation of the Bulletin's menu page

    CERN Multimedia

    Publications Section

    2005-01-01

    The menus of the various CERN restaurants will no longer be published in the Bulletin as of Monday 4 April (issue No. 14/2005). The menu pages are being discontinued both as a savings measure and due to the low level of interest in this section of the Bulletin. The most recent survey of Bulletin readers showed that only 13% of the people questioned regularly read the menu section, compared to between 40% and 85% in the case of the other sections. Publications Section DSU-CO Tel. 79971

  19. A personalized web page content filtering model based on segmentation

    CERN Document Server

    Kuppusamy, K S; 10.5121/ijist.2012.2104

    2012-01-01

In view of the massive content explosion on the World Wide Web from diverse sources, content filtering tools have become mandatory. Filtering the contents of web pages holds particular significance when pages are accessed by minors. Traditional web page blocking systems follow a Boolean methodology: either display the full page or block it completely. With the increased dynamism of web pages, it has become a common phenomenon that different portions of a web page hold different types of content at different times. This paper proposes a model to block content at a fine-grained level, i.e., instead of completely blocking the page, it is more efficient to block only those segments which hold the content to be blocked. The advantages of this method over traditional methods are the fine-grained level of blocking and the automatic identification of the portions of the page to be blocked. The experiments conducted on the proposed model indicate 88% accuracy in filter...

  20. The Faculty Web Page: Contrivance or Continuation?

    Science.gov (United States)

    Lennex, Lesia

    2007-01-01

    In an age of Internet education, what does it mean for a tenure/tenure-track faculty to have a web page? How many professors have web pages? If they have a page, what does it look like? Do they really need a web page at all? Many universities have faculty web pages. What do those collective pages look like? In what way do they represent the…

  1. Uranium resource assessment by the Geological Survey; methodology and plan to update the national resource base

    Science.gov (United States)

    Finch, Warren Irvin; McCammon, Richard B.

    1987-01-01

Based on the Memorandum of Understanding (MOU) of September 20, 1984, between the U.S. Geological Survey of the U.S. Department of the Interior and the Energy Information Administration (EIA) of the U.S. Department of Energy (DOE), the U.S. Geological Survey began to make estimates of the undiscovered uranium endowment of selected areas of the United States in 1985. A modified NURE (National Uranium Resource Evaluation) method will be used in place of the standard NURE method of the DOE that was used for the national assessment reported in October 1980. The modified method, here named the 'deposit-size-frequency' (DSF) method, is presented for the first time, and calculations by the two methods are compared using an illustrative example based on preliminary estimates for the first area to be evaluated under the MOU. The results demonstrate that the estimate of the endowment using the DSF method is significantly larger and more uncertain than the estimate obtained by the NURE method. We believe that the DSF method produces a more realistic estimate because the principal factor estimated in the endowment equation is disaggregated into more parts and is more closely tied to specific geologic knowledge than by the NURE method. The DSF method consists of modifying the standard NURE estimation equation, U = A x F x T x G, by replacing the factors F x T by a single factor that represents the tonnage for the total number of deposits in all size classes. Use of the DSF method requires that the size frequency of deposits in a known or control area has been established and that the relation of the size-frequency distribution of deposits to probable controlling geologic factors has been determined. Using these relations, the principal scientist (PS) first estimates the number and range of size classes and then, for each size class, estimates the lower limit, most likely value, and upper limit of the numbers of deposits in the favorable area. Once these probable estimates have been refined
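The two estimation equations above can be contrasted with a toy calculation; all inputs (grade, size classes, deposit counts) are invented for illustration, and treating the (low, most likely, high) counts as a triangular distribution is an assumption of this sketch, not something the abstract specifies:

```python
def nure_endowment(A, F, T, G):
    """Standard NURE estimate: U = A x F x T x G
    (area, fraction favorable, tons of ore per unit area, grade)."""
    return A * F * T * G

def dsf_endowment(A_factor, size_classes, G):
    """DSF variant: replace F x T by total tonnage summed over deposit
    size classes, each with (low, most-likely, high) deposit counts."""
    total_tons = 0.0
    for tons_per_deposit, (low, mode, high) in size_classes:
        # mean of a triangular distribution over the deposit count
        expected_n = (low + mode + high) / 3.0
        total_tons += expected_n * tons_per_deposit
    return A_factor * total_tons * G

# Hypothetical inputs, for illustration only
grade = 0.002                   # ore grade (fraction)
classes = [(1e5, (1, 3, 6)),    # small deposits: tons each, count estimates
           (1e6, (0, 1, 3))]    # large deposits
print(round(dsf_endowment(1.0, classes, grade)))  # → 3333
```

Disaggregating into size classes, as here, is what lets the principal scientist tie each count estimate to geologic knowledge about how many deposits of each size the favorable area could plausibly hold.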

  2. Methodological considerations concerning the development of oral dental erosion indexes: literature survey, validity and reliability

    DEFF Research Database (Denmark)

    Berg-Beckhoff, Gabriele; Kutschmann, Marcus; Bardehle, Doris

    2008-01-01

Within the context of preventing non-communicable diseases, the World Health Report (2002) and the WHO Global Oral Health Program (2003) put forward a new strategy of disease prevention and health promotion. Greater emphasis is placed on developing global policies in oral health promotion and oral disease prevention. The Decayed, Missing, Filled Teeth (DMFT) index does not meet new challenges in the field of oral health. Dental erosion seems to be a growing problem, and in some countries, an increase in erosion of teeth is associated with an increase in the consumption of beverages containing acids. Therefore, within a revision of the WHO Oral Health Surveys Basic Methods, new oral disease patterns, e.g. dental erosion, have to be taken into account. Within the last 20 years, many studies on dental erosion have been carried out and published. There has been a rapid growth in the number of indexes...

  3. Instant PageSpeed optimization

    CERN Document Server

    Jaiswal, Sanjeev

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. Instant PageSpeed Optimization is a hands-on guide that provides a number of clear, step-by-step exercises for optimizing your websites for better performance and improving their efficiency.Instant PageSpeed Optimization is aimed at website developers and administrators who wish to make their websites load faster without any errors and consume less bandwidth. It's assumed that you will have some experience in basic web technologies like HTML, CSS3, JavaScript, and the basics of netw

  4. What snippets say about pages

    NARCIS (Netherlands)

    Demeester, Thomas; Nguyen, Dong; Trieschnigg, Dolf; Develder, Chris; Hiemstra, Djoerd

    2013-01-01

What is the likelihood that a Web page is considered relevant to a query, given the relevance assessment of the corresponding snippet? Using a new Federated Web Search test collection that contains search results from over a hundred search engines on the internet, we are able to investigate such rese

  5. Database-Based Web Page

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

A database-based web page using IIS 4.0 + ASP + ADO + SQL 7.0 is briefly introduced. It has been successfully used in e-commerce, bulletin board systems, chat rooms, and so on at the web site of the Computer Center, Hudong Campus, Tongji University.

  6. Web Page Design (Part One).

    Science.gov (United States)

    Descy, Don E.

    1997-01-01

    Discusses rules for Web page design: consider audiences' Internet skills and equipment; know your content; outline the material; map or sketch the site; be consistent; regulate size of graphics to control download time; place eye catching material in the first 300 pixels; moderate use of color to control file size and bandwidth; include a…

  7. Learning through Web Page Design.

    Science.gov (United States)

    Peel, Deborah

    2001-01-01

    Describes and evaluates the use of Web page design in an undergraduate course in the United Kingdom on town planning. Highlights include incorporating information and communication technologies into higher education; and a theoretical framework for the use of educational technology. (LRW)

  8. Large Area Scene Selection Interface (LASSI). Methodology of Selecting Landsat Imagery for the Global Land Survey 2005

    Science.gov (United States)

    Franks, Shannon; Masek, Jeffrey G.; Headley, Rachel M.; Gasch, John; Arvidson, Terry

    2009-01-01

    The Global Land Survey (GLS) 2005 is a cloud-free, orthorectified collection of Landsat imagery acquired during the 2004-2007 epoch intended to support global land-cover and ecological monitoring. Due to the numerous complexities in selecting imagery for the GLS2005, NASA and the U.S. Geological Survey (USGS) sponsored the development of an automated scene selection tool, the Large Area Scene Selection Interface (LASSI), to aid in the selection of imagery for this data set. This innovative approach to scene selection applied a user-defined weighting system to various scene parameters: image cloud cover, image vegetation greenness, choice of sensor, and the ability of the Landsat 7 Scan Line Corrector (SLC)-off pair to completely fill image gaps, among others. The parameters considered in scene selection were weighted according to their relative importance to the data set, along with the algorithm's sensitivity to that weight. This paper describes the methodology and analysis that established the parameter weighting strategy, as well as the post-screening processes used in selecting the optimal data set for GLS2005.
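A user-defined weighting system of the kind described above can be sketched as a simple weighted sum over scene parameters; the attribute names, weights, and candidate scenes below are hypothetical, not the actual LASSI parameters:

```python
def score_scene(scene, weights):
    """Weighted-sum score for a candidate Landsat scene: lower cloud cover
    and higher vegetation greenness score better. Attribute names and
    weights are illustrative, not the real LASSI configuration."""
    return (weights["cloud"] * (1.0 - scene["cloud_cover"])
            + weights["green"] * scene["greenness"]
            + weights["sensor"] * scene["sensor_pref"])

# Hypothetical relative importance of each parameter
weights = {"cloud": 0.5, "green": 0.3, "sensor": 0.2}
candidates = [
    {"id": "A", "cloud_cover": 0.10, "greenness": 0.80, "sensor_pref": 1.0},
    {"id": "B", "cloud_cover": 0.02, "greenness": 0.60, "sensor_pref": 0.5},
]
best = max(candidates, key=lambda s: score_scene(s, weights))
print(best["id"])  # → A
```

The sensitivity analysis the paper mentions amounts to perturbing these weights and checking how often the selected scene changes, which this toy scorer makes easy to replicate.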

  9. Surveys on the Prevalence of Pediatric Bronchial Asthma in Japan: A Comparison between the 1982, 1992, and 2002 Surveys Conducted in the Same Region Using the Same Methodology

    Directory of Open Access Journals (Sweden)

    Sankei Nishima

    2009-01-01

    Conclusions: BA prevalence in the third survey was 2.1 times that of the first survey and 1.4 times that of the second, indicating an upward trend across all regions and age groups surveyed.

  10. Facebook's personal page modelling and simulation

    Science.gov (United States)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we seek to define the utility of Facebook's Personal Page marketing method. This tool, provided by Facebook, is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system-dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them, and follows the design science research methodology for the proof of concept of the models and modelling processes. The model was developed for a social media marketing agent/company, is oriented to the Facebook platform, and was tested in real circumstances. It was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are confirmed by the management of the company. Facebook's Personal Page method can be adjusted, depending on the situation, to maximize the company's total profit: bringing in new customers, keeping the interest of existing customers, and delivering traffic to its website.

  11. Innovating Web Page Classification Through Reducing Noise

    Institute of Scientific and Technical Information of China (English)

    LI Xiaoli (李晓黎); SHI Zhongzhi(史忠植)

    2002-01-01

    This paper presents a new method that eliminates noise in Web page classification. It first describes the representation of a Web page based on HTML tags. Then, through a novel distance formula, it eliminates noise in the similarity measure. After carefully analyzing Web pages, we design an algorithm that can distinguish related hyperlinks from noisy ones. We can utilize the non-noisy hyperlinks to improve the performance of Web page classification (the CAWN algorithm): any page can be classified using its own text together with the text and categories of the related neighbor pages. The experimental results show that our approach improves classification accuracy.
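
    The neighbor-aided idea in this record can be illustrated with a small sketch. The anchor-text overlap test, the keyword sets, and the weighting constant `alpha` are invented stand-ins for the paper's distance formula and CAWN specifics, not the authors' method:

    ```python
    # Sketch: a page's own keyword score is combined with votes from linked
    # pages, but only links judged "related" (here, by a crude anchor-text
    # overlap test) are allowed to vote.  All data are invented.

    def keyword_score(text, keywords):
        words = set(text.lower().split())
        return len(words & keywords) / max(len(keywords), 1)

    def classify(page_text, links, keywords_by_class, alpha=0.6):
        """links: list of (anchor_text, neighbor_class) pairs."""
        page_words = set(page_text.lower().split())
        best, best_score = None, -1.0
        for cls, keywords in keywords_by_class.items():
            own = keyword_score(page_text, keywords)
            # A link votes for its neighbor's class only if its anchor text
            # shares words with the page (stand-in for "non-noisy" links).
            votes = [1.0 for anchor, ncls in links
                     if ncls == cls and set(anchor.lower().split()) & page_words]
            neighbor = sum(votes) / max(len(links), 1)
            score = alpha * own + (1 - alpha) * neighbor
            if score > best_score:
                best, best_score = cls, score
        return best

    keywords_by_class = {"sports": {"match", "team", "score"},
                         "finance": {"stock", "market", "bank"}}
    links = [("final match report", "sports"), ("ad banner", "finance")]
    print(classify("the team won the match", links, keywords_by_class))  # → sports
    ```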

  12. Exploring the use of a Facebook page in anatomy education.

    Science.gov (United States)

    Jaffar, Akram Abood

    2014-01-01

    Facebook is the most popular social media site visited by university students on a daily basis. Consequently, Facebook is the logical place to start with for integrating social media technologies into education. This study explores how a faculty-administered Facebook Page can be used to supplement anatomy education beyond the traditional classroom. Observations were made on students' perceptions and effectiveness of using the Page, potential benefits and challenges of such use, and which Insights metrics best reflect user's engagement. The Human Anatomy Education Page was launched on Facebook and incorporated into anatomy resources for 157 medical students during two academic years. Students' use of Facebook and their perceptions of the Page were surveyed. Facebook's "Insights" tool was also used to evaluate Page performance during a period of 600 days. The majority of in-class students had a Facebook account which they adopted in education. Most students perceived Human Anatomy Education Page as effective in contributing to learning and favored "self-assessment" posts. The majority of students agreed that Facebook could be a suitable learning environment. The "Insights" tool revealed globally distributed fans with considerable Page interactions. The use of a faculty-administered Facebook Page provided a venue to enhance classroom teaching without intruding into students' social life. A wider educational use of Facebook should be adopted not only because students are embracing its use, but for its inherent potentials in boosting learning. The "Insights" metrics analyzed in this study might be helpful when establishing and evaluating the performance of education-oriented Facebook Pages.

  13. Assessing the quantified impact of a hybrid POGIL methodology on student averages in a forensic science survey course

    Science.gov (United States)

    Meeks, Tyna L.

    A causal-comparative/quasi-experimental study examined the effect of incorporating a hybrid teaching methodology that blended lecture with Process Oriented Guided Inquiry Lessons (POGILs) on the overall academic achievement of a diverse student body in a large lecture setting. Additional considerations included student gender, ethnicity, declared major (STEM or non-STEM), and SAT scores, along with an evaluation of the effect these characteristics had on student achievement given the differing importance placed on the use of POGILs as a learning tool. This study used data obtained from a longitudinal examination of eight years of student data from an introductory forensic science survey course offered at an R1 northeastern university. It addressed the effectiveness of applying a prescribed active-learning methodology, one proposed as effective in collegiate education, to a new environment, forensic science. The methodology employed combined fourteen POGILs, created specifically for the chosen course, with didactic lecture over the entire semester of a forensic science survey course. This quasi-experimental design examined the effect of manipulating the independent variable, the use of a hybrid lecture instead of exclusively traditional didactic lectures, on students' academic achievement on exams given during the course. Participants in this study (N=1436) were undergraduate students enrolled in the single-semester introductory science course. A longitudinal study that incorporated eight years of data was completed: four years pre-intervention (2007-2010) and four years post-intervention (2011-2014). The forensic science survey course, taught by only one professor during the eight-year period, was a science discipline that had yet to integrate an active-learning educational model. Findings indicate four variables significantly contributed to explaining nearly 28% of the variation seen in the student class averages earned during the eight-year period: the…

  14. Functional Multiplex PageRank

    CERN Document Server

    Iacovacci, Jacopo; Arenas, Alex; Bianconi, Ginestra

    2016-01-01

    Recently it has been recognized that many complex social, technological and biological networks have a multilayer nature and can be described by multiplex networks. Multiplex networks are formed by a set of nodes connected by links having different connotations forming the different layers of the multiplex. Characterizing the centrality of the nodes in a multiplex network is a challenging task since the centrality of the node naturally depends on the importance associated to links of a certain type. Here we propose to assign to each node of a multiplex network a centrality called Functional Multiplex PageRank that is a function of the weights given to every different pattern of connections (multilinks) existent in the multiplex network between any two nodes. Since multilinks distinguish all the possible ways in which the links in different layers can overlap, the Functional Multiplex PageRank can describe important non-linear effects when large relevance or small relevance is assigned to multilinks with overlap…

  15. The Carnegie-Spitzer-IMACS redshift survey of galaxy evolution since z = 1.5. I. Description and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kelson, Daniel D.; Williams, Rik J.; Dressler, Alan; McCarthy, Patrick J.; Shectman, Stephen A.; Mulchaey, John S.; Villanueva, Edward V.; Crane, Jeffrey D.; Quadri, Ryan F. [The Observatories of the Carnegie Institution for Science, 813 Santa Barbara Street, Pasadena, CA 91101 (United States)

    2014-03-10

    We describe the Carnegie-Spitzer-IMACS (CSI) Survey, a wide-field, near-IR selected spectrophotometric redshift survey with the Inamori Magellan Areal Camera and Spectrograph (IMACS) on Magellan-Baade. By defining a flux-limited sample of galaxies in Spitzer Infrared Array Camera 3.6 μm imaging of SWIRE fields, the CSI Survey efficiently traces the stellar mass of average galaxies to z ∼ 1.5. This first paper provides an overview of the survey selection, observations, processing of the photometry and spectrophotometry. We also describe the processing of the data: new methods of fitting synthetic templates of spectral energy distributions are used to derive redshifts, stellar masses, emission line luminosities, and coarse information on recent star formation. Our unique methodology for analyzing low-dispersion spectra taken with multilayer prisms in IMACS, combined with panchromatic photometry from the ultraviolet to the IR, has yielded high-quality redshifts for 43,347 galaxies in our first 5.3 deg² of the SWIRE XMM-LSS field. We use three different approaches to estimate our redshift errors and find robust agreement. Over the full range of 3.6 μm fluxes of our selection, we find typical redshift uncertainties of σ_z/(1 + z) ≲ 0.015. In comparisons with previously published spectroscopic redshifts we find scatters of σ_z/(1 + z) = 0.011 for galaxies at 0.7 ≤ z ≤ 0.9, and σ_z/(1 + z) = 0.014 for galaxies at 0.9 ≤ z ≤ 1.2. For galaxies brighter and fainter than i = 23 mag, we find σ_z/(1 + z) = 0.008 and σ_z/(1 + z) = 0.022, respectively. Notably, our low-dispersion spectroscopy and analysis yields comparable redshift uncertainties and success rates for both red and blue galaxies, largely eliminating color-based systematics that can seriously bias observed dependencies of galaxy evolution on environment.

  16. The methodology of population surveys of headache prevalence, burden and cost: principles and recommendations from the Global Campaign against Headache.

    Science.gov (United States)

    Stovner, Lars Jacob; Al Jumah, Mohammed; Birbeck, Gretchen L; Gururaj, Gopalakrishna; Jensen, Rigmor; Katsarava, Zaza; Queiroz, Luiz Paulo; Scher, Ann I; Tekle-Haimanot, Redda; Wang, Shuu-Jiun; Steiner, Timothy J

    2014-01-27

    The global burden of headache is very large, but knowledge of it is far from complete and needs still to be gathered. Published population-based studies have used variable methodology, which has influenced findings and made comparisons difficult. Among the initiatives of the Global Campaign against Headache to improve and standardize methods in use for cross-sectional studies, the most important is the production of consensus-based methodological guidelines. This report describes the development of detailed principles and recommendations. For this purpose we brought together an expert consensus group to include experience and competence in headache epidemiology and/or epidemiology in general and drawn from all six WHO world regions. The recommendations presented are for anyone, of whatever background, with interests in designing, performing, understanding or assessing studies that measure or describe the burden of headache in populations. While aimed principally at researchers whose main interests are in the field of headache, they should also be useful, at least in parts, to those who are expert in public health or epidemiology and wish to extend their interest into the field of headache disorders. Most of all, these recommendations seek to encourage collaborations between specialists in headache disorders and epidemiologists. The focus is on migraine, tension-type headache and medication-overuse headache, but they are not intended to be exclusive to these. The burdens arising from secondary headaches are, in the majority of cases, more correctly attributed to the underlying disorders. Nevertheless, the principles outlined here are relevant for epidemiological studies on secondary headaches, provided that adequate definitions can be not only given but also applied in questionnaires or other survey instruments.

  17. Functional Multiplex PageRank

    Science.gov (United States)

    Iacovacci, Jacopo; Rahmede, Christoph; Arenas, Alex; Bianconi, Ginestra

    2016-10-01

    Recently it has been recognized that many complex social, technological and biological networks have a multilayer nature and can be described by multiplex networks. Multiplex networks are formed by a set of nodes connected by links having different connotations forming the different layers of the multiplex. Characterizing the centrality of the nodes in a multiplex network is a challenging task since the centrality of the node naturally depends on the importance associated to links of a certain type. Here we propose to assign to each node of a multiplex network a centrality called Functional Multiplex PageRank that is a function of the weights given to every different pattern of connections (multilinks) existent in the multiplex network between any two nodes. Since multilinks distinguish all the possible ways in which the links in different layers can overlap, the Functional Multiplex PageRank can describe important non-linear effects when large relevance or small relevance is assigned to multilinks with overlap. Here we apply the Functional Multiplex PageRank to multiplex airport networks, to the neuronal network of the nematode C. elegans, and to social collaboration and citation networks between scientists. This analysis reveals important differences existing between the most central nodes of these networks, and the correlations between their so-called "patterns to success".
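
    A much-simplified sketch of the idea (not the paper's exact formulation): give each multilink pattern a weight, collapse the multiplex into a single weighted graph, and run PageRank on it. The two-layer toy network and the pattern weights are invented for illustration:

    ```python
    # Simplified sketch of the idea behind Functional Multiplex PageRank:
    # every node pair has a "multilink" pattern (one bit per layer), each
    # pattern gets a weight, and centrality is PageRank on the resulting
    # weighted graph.  Links are treated as undirected here.

    def multiplex_pagerank(layers, pattern_weight, d=0.85, iters=100):
        nodes = sorted({u for layer in layers for edge in layer for u in edge})
        n = len(nodes)
        # Effective weight between i and j = weight of their multilink pattern.
        w = {}
        for i in nodes:
            for j in nodes:
                if i == j:
                    continue
                pattern = tuple(int((i, j) in layer or (j, i) in layer)
                                for layer in layers)
                if any(pattern):
                    w[(i, j)] = pattern_weight(pattern)
        rank = {u: 1.0 / n for u in nodes}
        for _ in range(iters):
            new = {u: (1 - d) / n for u in nodes}
            for u in nodes:
                out = sum(wt for (a, _b), wt in w.items() if a == u)
                if out:
                    for (a, b), wt in w.items():
                        if a == u:
                            new[b] += d * rank[u] * wt / out
                else:  # dangling node: spread its rank uniformly
                    for v in nodes:
                        new[v] += d * rank[u] / n
            rank = new
        return rank

    # Two layers over nodes 1..3; reward overlapping links (pattern (1,1)).
    layer1 = {(1, 2), (2, 3)}
    layer2 = {(1, 2)}
    ranks = multiplex_pagerank([layer1, layer2],
                               lambda p: 4.0 if all(p) else 1.0)
    ```

    Changing the pattern-weight function reorders the centralities, which is exactly the "functional" aspect the record describes.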

  18. Methodological proposal for a systematic archaeological survey: the case of the Guadiana Menor Valley (Jaén, Spain

    Directory of Open Access Journals (Sweden)

    Chapa Brunet, Teresa

    2003-06-01

    Full Text Available The aim of this paper is to present a methodological approach to the development of a systematic archaeological survey. Inspired by the principles of Landscape Archaeology, the archaeological record includes the material remains as well as geographical data. We discuss the data selection and its integration into a database and a GIS, and explain the mechanisms generated to carry out the sampling and its statistical basis.

    The object of this article is to make a methodological proposal for the development of a systematic archaeological survey. Inspired by the principles of Landscape Archaeology, the documentation integrates both the archaeological material remains and the variables of the geographical context. The selection of variables and their integration into a database and a GIS are discussed, specifying the mechanisms generated for designing the sampling and its statistical justification.

  19. 47 CFR 65.104 - Page limitations for rate of return submissions.

    Science.gov (United States)

    2010-10-01

    47 CFR, Telecommunication, Interstate Rate of Return Prescription Procedures and Methodologies, § 65.104 Page limitations for rate of return submissions. Rate of return submissions, including…

  20. Evaluating the usability of web pages: a case study

    NARCIS (Netherlands)

    Lautenbach, M.A.E.; Schegget, I.E. ter; Schoute, A.E.; Witteman, C.L.M.

    2008-01-01

    An evaluation of the Utrecht University website was carried out with 240 students. New criteria were drawn from the literature and operationalized for the study: surveyability and findability. Web pages can be said to satisfy a usability criterion if their efficiency and effectiveness…

  1. Interstellar Initiative Web Page Design

    Science.gov (United States)

    Mehta, Alkesh

    1999-01-01

    This summer at NASA/MSFC, I have contributed to two projects: Interstellar Initiative Web Page Design and Lenz's Law Relative Motion Demonstration. In the Web Design Project, I worked on an Outline. The Web Design Outline was developed to provide a foundation for a Hierarchy Tree Structure. The Outline would help design a Website information base for future and near-term missions. The Website would give in-depth information on Propulsion Systems and Interstellar Travel. The Lenz's Law Relative Motion Demonstrator is discussed in this volume by Russell Lee.

  2. The effect of methodological differences in two surveys' estimates of the percentage of employers sponsoring health insurance.

    Science.gov (United States)

    Hing, E; Poe, G; Euller, R

    1999-01-01

    Two large surveys on employer-sponsored health insurance produced different estimates of the percentage of employers offering insurance to their employees in 1993. These differences occurred despite major similarities in the surveys' purpose and design. In this paper, five survey design factors are assessed. Estimates from the second survey were recomputed to eliminate cases not included in the first survey. Survey estimates were no longer significantly different when cases were removed because establishments had moved, were single-employee establishments on the sample frame, were classified as completed only in the second survey, or when poststratification adjustments in the weighting used only in the second survey were eliminated. Based on a comparison of 449 cases that responded in both surveys, changes in the wording of questions also probably contributed to the difference in survey estimates. These results indicate that estimates from these types of surveys are very sensitive to differing designs.

  3. Doctoral training in statistics, measurement, and methodology in psychology: replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America.

    Science.gov (United States)

    Aiken, Leona S; West, Stephen G; Millsap, Roger E

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory and not field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the modally 1-year introductory statistics course, leaving little room for advanced study. Curricular enhancements were noted in statistics and to a minor degree in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology.

  4. Evaluation of the Importance of Web Pages

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Google's PageRank algorithm is analyzed in detail. Some disadvantages of this algorithm are presented, for instance, preferring old pages, ignoring special sites, and inaccurately judging the hyperlinks pointing out from a page. Furthermore, the authors' improved algorithm is described. Experiments show that the authors' approach to evaluating the importance of pages improves on the original algorithm. Based on this improved algorithm, a topic-specific search system has been developed.
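
    For reference, the baseline PageRank that such improvements build on can be sketched as a short power iteration. The three-page link graph and the damping factor d = 0.85 are the usual illustrative choices, not taken from this record:

    ```python
    # Minimal PageRank sketch: power iteration with damping.
    # links: dict mapping each page to the list of pages it links to.

    def pagerank(links, d=0.85, iters=50):
        pages = sorted(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iters):
            new = {p: (1 - d) / n for p in pages}
            for p, outs in links.items():
                if outs:
                    share = rank[p] / len(outs)
                    for q in outs:
                        new[q] += d * share
                else:  # dangling page: spread its rank over all pages
                    for q in pages:
                        new[q] += d * rank[p] / n
            rank = new
        return rank

    links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
    ranks = pagerank(links)
    print(max(ranks, key=ranks.get))  # → C (it collects links from A and B)
    ```

    The "old pages" bias mentioned above arises because established pages accumulate in-links over time; topic-specific variants re-weight the link contributions accordingly.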

  5. Web Page Segmentation for Small Screen Devices Using Tag Path Clustering Approach

    Directory of Open Access Journals (Sweden)

    Ms. S.Aruljothi

    2013-07-01

    Full Text Available The web pages in existence today are developed to be displayed on desktop PCs, so viewing them in mobile web browsers is extremely difficult. Since mobile devices have restricted resources, users of small-screen devices need to scroll down and across complicated sites persistently. To address the resource limitations of small-screen devices, a unique methodology of web page segmentation with tag path clustering is proposed, which reduces the memory demand on small hand-held devices. For segmenting web pages, both a recurring key-pattern detection technique and page layout information are used to provide better segmentation accuracy.
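
    The tag-path idea can be illustrated with a minimal sketch: key every text node by its root-to-node tag path, so recurring paths expose repeated segments (menus, item lists). The HTML snippet is invented, and this is a toy illustration rather than the proposed system's actual algorithm:

    ```python
    # Collect text nodes keyed by their root-to-node tag path; paths that
    # recur mark repeated page segments suitable for grouping.
    from collections import defaultdict
    from html.parser import HTMLParser

    class TagPathCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.stack = []                 # open tags from root to here
            self.paths = defaultdict(list)  # tag path -> text nodes under it

        def handle_starttag(self, tag, attrs):
            self.stack.append(tag)

        def handle_endtag(self, tag):
            while self.stack and self.stack.pop() != tag:
                pass  # tolerate unclosed tags by popping back to the match

        def handle_data(self, data):
            text = data.strip()
            if text:
                self.paths["/".join(self.stack)].append(text)

    html = """<html><body>
    <ul><li>Home</li><li>News</li><li>Contact</li></ul>
    <div><p>Main article text.</p></div>
    </body></html>"""
    collector = TagPathCollector()
    collector.feed(html)
    for path, texts in collector.paths.items():
        print(path, texts)
    ```

    Here the three `html/body/ul/li` nodes share one path and would be clustered into a single segment (a menu), separate from the article text.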

  6. Reporting of financial and non-financial conflicts of interest by authors of systematic reviews: a methodological survey

    Science.gov (United States)

    Anouti, Sirine; Al-Gibbawi, Mounir; Abou-Jaoude, Elias A; Hasbani, Divina Justina; Guyatt, Gordon; Akl, Elie A

    2016-01-01

    Background Conflicts of interest may bias the findings of systematic reviews. The objective of this methodological survey was to assess the frequency and different types of conflicts of interest that authors of Cochrane and non-Cochrane systematic reviews report. Methods We searched for systematic reviews using the Cochrane Database of Systematic Reviews and Ovid MEDLINE (limited to the 119 Core Clinical Journals and the year 2015). We defined a conflict of interest disclosure as the reporting of whether a conflict of interest exists or not, and used a framework to classify conflicts of interest into individual (financial, professional and intellectual) and institutional (financial and advocatory) conflicts of interest. We conducted descriptive and regression analyses. Results Of the 200 systematic reviews, 194 (97%) reported authors' conflicts of interest disclosures, typically in the main document, and in a few cases either online (2%) or on request (5%). Of the 194 Cochrane and non-Cochrane reviews, 49% and 33%, respectively, had at least one author reporting any type of conflict of interest (p=0.023). Institutional conflicts of interest were less frequently reported than individual conflicts of interest, and Cochrane reviews were more likely to report individual intellectual conflicts of interest compared with non-Cochrane reviews (19% and 5%, respectively, p=0.004). Regression analyses showed a positive association between reporting of conflicts of interest (at least one type of conflict of interest, individual financial conflict of interest, institutional financial conflict of interest) and journal impact factor and between reporting individual financial conflicts of interest and pharmacological versus non-pharmacological intervention. Conclusions Although close to half of the published systematic reviews report that authors (typically many) have conflicts of interest, more than half report that they do not. Authors reported individual conflicts of interest…

  7. Web Page Design and Network Analysis.

    Science.gov (United States)

    Wan, Hakman A.; Chung, Chi-wai

    1998-01-01

    Examines problems in Web-site design from the perspective of network analysis. In view of the similarity between the hypertext structure of Web pages and a generic network, network analysis presents concepts and theories that provide insight for Web-site design. Describes the problem of home-page location and control of the number of Web pages and…

  8. Classifying web pages with visual features

    NARCIS (Netherlands)

    de Boer, V.; van Someren, M.; Lupascu, T.; Filipe, J.; Cordeiro, J.

    2010-01-01

    To automatically classify and process web pages, current systems use the textual content of those pages, including both the displayed content and the underlying (HTML) code. However, a very important feature of a web page is its visual appearance. In this paper, we show that using generic visual features…

  9. Museum: Multidimensional web page segment evaluation model

    CERN Document Server

    Kuppusamy, K S

    2012-01-01

    The evaluation of a web page with respect to a query is a vital task in the web information retrieval domain. This paper proposes the evaluation of a web page as a bottom-up process from the segment level to the page level. A model for evaluating relevancy is proposed, incorporating six different dimensions, and an algorithm for evaluating the segments of a web page using these six dimensions is given. The benefits of fine-graining the evaluation process to the segment level instead of the page level are explored. The proposed model can be incorporated into various tasks like web page personalization, result re-ranking, mobile device page rendering, etc.

  10. Web Page Watermarking for Tamper-Proof

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This paper proposes a watermarking algorithm for tamper-proofing web pages. For a web page, it generates a watermark consisting of a sequence of Spaces and Tabs, which is then embedded into the web page after each word and each line. When a watermarked web page is tampered with, the extracted watermark can detect and locate the modifications to the web page. In addition, the framework of a watermarked Web Server system is given. Compared with traditional digital signature methods, this watermarking method is more transparent in that there is no need to detach the watermark before displaying web pages. The experimental results show that the proposed scheme is an effective tool for tamper-proofing web pages.
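
    A toy sketch of the whitespace-watermark idea, simplified to one bit per line rather than per word (an assumption made to keep the example short): a trailing Space encodes 0 and a Tab encodes 1, so a tampered line flips its bit and localizes the edit:

    ```python
    # Embed a binary watermark as trailing whitespace on successive lines;
    # extraction reads the trailing character back.  The watermark string
    # and the page lines are invented.

    def embed(lines, bits):
        out = []
        for i, line in enumerate(lines):
            mark = "\t" if i < len(bits) and bits[i] == "1" else " "
            out.append(line + mark)
        return out

    def extract(lines, nbits):
        return "".join("1" if line.endswith("\t") else "0"
                       for line in lines[:nbits])

    page = ["<html>", "<body>", "<p>hello</p>", "</body>", "</html>"]
    marked = embed(page, "1011")
    assert extract(marked, 4) == "1011"
    marked[2] = "<p>HACKED</p>"   # tamper with line 2
    print(extract(marked, 4))     # → "1001": bit 2 flips, locating the edit
    ```

    Because trailing whitespace does not render, the marked page displays identically to the original, which is the transparency advantage the abstract claims over detached signatures.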

  11. Hidden Page WebCrawler Model for Secure Web Pages

    Directory of Open Access Journals (Sweden)

    K. F. Bharati

    2013-03-01

    Full Text Available The traditional search engines available over the internet are dynamic in searching relevant content over the web. A search engine has constraints, such as retrieving the requested data from varied sources where data relevancy is exceptional. Traditional web crawlers are designed only to move along specific paths of the web and are restricted from moving along other paths, as those are secured or at times restricted due to the apprehension of threats. It is possible to design a web crawler capable of penetrating paths of the web not reachable by traditional web crawlers, in order to get a better result in terms of data, time and relevancy for a given search query. The paper makes use of a newer parser and indexer to arrive at a novel web crawler and a framework to support it. The proposed web crawler is designed to attend to Hyper Text Transfer Protocol Secure (HTTPS) based websites and web pages that need authentication to view and index. The user fills in a search form, and his/her credentials are used by the web crawler to authenticate to the secure web server. Once indexed, the secure web server is inside the web crawler's accessible zone.

  12. Methodology for estimating dietary data from the semi-quantitative food frequency questionnaire of the Mexican National Health and Nutrition Survey 2012

    Directory of Open Access Journals (Sweden)

    Ivonne Ramírez-Silva

    2016-12-01

    Full Text Available Objective. To describe the methodology used to clean up and estimate dietary intake (DI) data from the Semi-Quantitative Food Frequency Questionnaire (SFFQ) of the Mexican National Health and Nutrition Survey 2012. Materials and methods. DI was collected through a short-term SFFQ regarding 140 foods (from October 2011 to May 2012). Energy and nutrient intake was calculated according to a nutrient database constructed specifically for the SFFQ. Results. A total of 133 nutrients, including energy and fiber, were generated from SFFQ data. Between 4.8 and 9.6% of the survey sample was excluded as a result of the cleaning process. Valid DI data were obtained regarding energy and nutrients consumed by 1 212 pre-school children, 1 323 school children, 1 961 adolescents, 2 027 adults and 526 older adults. Conclusions. We documented the methodology used to clean up and estimate DI from the SFFQ used in national dietary assessments in Mexico.

  13. Mimicked Web Page Detection over Internet

    Directory of Open Access Journals (Sweden)

    Y. Narasimha Rao

    2014-01-01

    Full Text Available Phishing is the process of stealing valuable information, such as ATM PINs and credit card details, over the internet, where the attacker creates mimicked web pages from legitimate web pages to fool users. In this paper, we propose an effective anti-phishing solution based on image-based visual similarity to detect plagiarized web pages. Our detection mechanism uses the Speeded Up Robust Features (SURF) algorithm to generate a signature from stable key points extracted from a screenshot of the web page. When a legitimate web page is registered with our system, this algorithm is applied to that web page to generate signatures, which are stored in the database of our trained system. When there is a suspected web page, the algorithm is applied to generate the signatures of the suspected page, which are verified against our database of corresponding legitimate web pages. Our results verify that the proposed system is very effective in detecting mimicked web pages with minimal false positives.
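
    Because SURF implementations are not universally available, this sketch swaps in a much simpler perceptual signature (an average hash over a downscaled grayscale screenshot) to illustrate the register-then-compare workflow. The "screenshots" are toy 2D arrays, and the hash is a stand-in, not the paper's SURF-based signature:

    ```python
    # Register a signature for a legitimate page, then compare a suspect
    # page's signature against it; a small Hamming distance flags a mimic.

    def average_hash(img, size=4):
        h, w = len(img), len(img[0])
        # Sample a size x size grid and threshold each cell at the mean.
        cells = [img[r * h // size][c * w // size]
                 for r in range(size) for c in range(size)]
        mean = sum(cells) / len(cells)
        return "".join("1" if v >= mean else "0" for v in cells)

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    legit = [[10 * (r + c) % 256 for c in range(16)] for r in range(16)]
    registered = average_hash(legit)      # stored at registration time
    suspect = [row[:] for row in legit]   # a near-identical mimicked copy
    suspect[1][1] = 255                   # small change off the sampling grid
    assert hamming(average_hash(suspect), registered) == 0  # still matches
    ```

    A real deployment would use a keypoint-based signature such as SURF, as the paper does, precisely because keypoints tolerate cropping and rescaling that grid hashes do not.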

  14. Wetlands of Argonne National Laboratory-East DuPage County, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    Van Lonkhuyzen, R.A.; LaGory, K.E.

    1994-03-01

    Jurisdictional wetlands of the Argonne National Laboratory-East (ANL-E) site in DuPage County, Illinois, were delineated in the summer and autumn of 1993 in accordance with the 1987 US Army Corps of Engineers methodology. Potential wetland sites with an area greater than 500 m² (0.05 ha [0.124 acre]) were identified for delineation on the basis of aerial photographs, the DuPage County soil survey, and reconnaissance-level field studies. To qualify as a jurisdictional wetland, an area had to support a predominance of hydrophytic vegetation as well as have hydric soil and wetland hydrology. Thirty-five individual jurisdictional wetlands were delineated at ANL-E, totaling 180,604 m² (18.1 ha [44.6 acres]). These wetlands were digitized onto the ANL-E site map for use in project planning. Characteristics of each wetland are presented, including size, dominant plant species and their indicator status, hydrologic characteristics (including water source), and soil characteristics.

  15. Identifying Information Senders of Web Pages

    Science.gov (United States)

    Kato, Yoshikiyo; Kawahara, Daisuke; Inui, Kentaro; Kurohashi, Sadao; Shibata, Tomohide

    The source of information is one of the crucial elements when judging the credibility of the information. On the current Web, however, the information about the source is not readily available to the users. In this paper, we formulate the problem of identifying the information source as the problem of identifying the information sender configuration (ISC) of a Web page. An information sender of a Web page is an entity which is involved in the publication of the information on the page. An information sender configuration of a Web page describes the information senders of the page and the relationship among them. Information sender identification is a sub-problem of identifying ISC, and we present a method for extracting information senders from Web pages, along with its evaluation. ISC provides a basis for deeper analysis of information on the Web.

  16. Web Page Categorization Using Artificial Neural Networks

    CERN Document Server

    Kamruzzaman, S M

    2010-01-01

    Web page categorization is one of the challenging tasks in the world of ever-increasing web technologies. There are many ways of categorizing web pages based on different approaches and features. This paper proposes a new dimension in the categorization of web pages using an artificial neural network (ANN) that extracts the features automatically. Eight major categories of web pages have been selected for categorization: business & economy, education, government, entertainment, sports, news & media, job search, and science. The whole process of the proposed system is done in three successive stages. In the first stage, the features are automatically extracted by analyzing the source of the web pages. The second stage fixes the input values of the neural network; all the values remain between 0 and 1, and variations in those values affect the output. Finally, the third stage determines the class of a certain web page out of eight predefined classes. This stage i...

  17. Realistic page-turning of electronic books

    Science.gov (United States)

    Fan, Chaoran; Li, Haisheng; Bai, Yannan

    2014-01-01

    The booming electronic books (e-books), as an extension of the paper book, are popular with readers. Recently, much effort has been put into realistic page-turning simulation of e-books to improve the reading experience. This paper presents a new 3D page-turning simulation approach, which employs piecewise time-dependent cylindrical surfaces to describe the turning page and constructs a smooth transition method between the time-dependent cylinders. The page-turning animation is produced by sequentially mapping the turning page onto cylinders with different radii and positions. Compared to previous approaches, our method is able to imitate various effects efficiently and produces a more natural animation of the turning page.

  18. Optimization of web pages for search engines

    OpenAIRE

    Harej, Anže

    2011-01-01

    The thesis describes the most important elements of a Web Page and outside factors that affect Search Engine Optimization. The basic structure of a Web page, structure and functionality of a modern Search Engine is described at the beginning. The first section deals with the start of Search Engine Optimization, including planning, analysis of web space and the selection of the most important keywords for which the site will be optimized. The next section Web Page Optimization describes...

  19. Library links on medical school home pages.

    Science.gov (United States)

    Thomas, Sheila L

    2011-01-01

    The purpose of this study was to assess the websites of American Association of Medical Colleges (AAMC)-member medical schools for the presence of library links. Sixty-one percent (n = 92) of the home pages of the 150 member schools of the AAMC contain library links. Of the 58 home pages not offering such links, 50 provided a pathway of two or three clicks to a library link. The absence of library links on 39% of AAMC medical school home pages indicates that the designers of those pages did not consider the library to be a primary destination for their visitors.

  20. Estimation of Nationwide Vaccination Coverage and Comparison of Interview and Telephone Survey Methodology for Estimating Vaccination Status

    Science.gov (United States)

    Park, Boyoung; Lee, Yeon-Kyeng; Cho, Lisa Y.; Go, Un Yeong; Yang, Jae Jeong; Ma, Seung Hyun; Choi, Bo-Youl; Lee, Moo-Sik; Lee, Jin-Seok; Choi, Eun Hwa; Lee, Hoan Jong

    2011-01-01

    This study compared interview and telephone surveys to select the better method for regularly estimating nationwide vaccination coverage rates in Korea. Interview surveys using multi-stage cluster sampling and telephone surveys using stratified random sampling were conducted. Nationwide coverage rates were estimated in subjects with vaccination cards in the interview survey. The interview survey, relative to the telephone survey, showed a higher response rate, a lower missing rate, higher validity, and a smaller difference in vaccination coverage rates between card owners and non-owners. The primary vaccination coverage rate was greater than 90% except for the fourth dose of DTaP (diphtheria/tetanus/pertussis), the third dose of polio, and the third dose of Japanese B encephalitis (JBE). The DTaP4:Polio3:MMR1 full vaccination rate was 62.0%, and BCG1:HepB3:DTaP4:Polio3:MMR1 was 59.5%. For age-appropriate vaccination, the coverage rate was 50%-80%. We concluded that the interview survey was better than the telephone survey. These results can be applied to countries with incomplete registries and decreasing landline telephone coverage due to increased cell phone usage. Among mandatory vaccines, efforts should be made in Korea to increase the vaccination rate for the fourth dose of DTaP, the third dose of polio, and JBE, and to encourage vaccination at the recommended ages. PMID:21655054

  1. A Survey of Internship Programs for Management Undergraduates in AACSB-Accredited Institutions

    Science.gov (United States)

    Kim, Eyong B.; Kim, Kijoo; Bzullak, Michael

    2012-01-01

    Purpose: The purpose of this paper is to survey the current status of internship programs for Management undergraduate students and to introduce a well-established internship program. Design/methodology/approach: A web page analysis was conducted on 473 institutions that have AACSB (the Association to Advance Collegiate Schools of Business)…

  2. A methodology to address mixed AGN and starlight contributions in emission line galaxies found in the RESOLVE survey and ECO catalog

    Science.gov (United States)

    Richardson, Chris T.; Kannappan, Sheila; Bittner, Ashley; Isaac, Rohan; RESOLVE

    2017-01-01

    We present a novel methodology for modeling emission line galaxy samples that span the entire BPT diagram. Our methodology has several advantages over current modeling schemes: the free variables in the model are identical for both AGN and SF galaxies; these free variables are more closely linked to observable galaxy properties; and the ionizing spectra including an AGN and starlight are handled self-consistently rather than empirically. We show that our methodology is capable of fitting the vast majority of SDSS galaxies that fall within the traditional regions of galaxy classification on the BPT diagram. We also present current results for relaxing classification boundaries and extending our galaxies into the dwarf regime, using the REsolved Spectroscopy of a Local VolumE (RESOLVE) survey and the Environmental COntext (ECO) catalog, with special attention to compact blue E/S0s. We compare this methodology to PCA decomposition of the spectra. This work is supported by National Science Foundation awards AST-0955368 and CISE/ACI-1156614.

  3. 40 CFR 1502.7 - Page limits.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Page limits. 1502.7 Section 1502.7 Protection of Environment COUNCIL ON ENVIRONMENTAL QUALITY ENVIRONMENTAL IMPACT STATEMENT § 1502.7 Page limits. The text of final environmental impact statements (e.g., paragraphs (d) through (g) of §...

  4. Textual Article Clustering in Newspaper Pages

    NARCIS (Netherlands)

    Aiello, Marco; Pegoretti, Andrea

    2006-01-01

    In the analysis of a newspaper page an important step is the clustering of various text blocks into logical units, i.e., into articles. We propose three algorithms based on text processing techniques to cluster articles in newspaper pages. Based on the complexity of the three algorithms and

  5. Textual Article Clustering in Newspaper Pages

    NARCIS (Netherlands)

    Aiello, Marco; Pegoretti, Andrea

    2004-01-01

    In the analysis of a newspaper page an important step is the clustering of various text blocks into logical units, i.e., into articles. We propose three algorithms based on text processing techniques to cluster articles in newspaper pages. Based on the complexity of the three algorithms and

  6. Customisation of Indico pages - Layout and Menus

    CERN Document Server

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial you are going to learn how to customize the layout of your Indico pages (for example you can change the color of the background images or change the logo) and the menus on your Indico pages  (for example you can add or hide certain blocks, or change their name and order).  

  7. Textual Article Clustering in Newspaper Pages

    NARCIS (Netherlands)

    Aiello, Marco; Pegoretti, Andrea

    2006-01-01

    In the analysis of a newspaper page an important step is the clustering of various text blocks into logical units, i.e., into articles. We propose three algorithms based on text processing techniques to cluster articles in newspaper pages. Based on the complexity of the three algorithms and experime

  8. Textual Article Clustering in Newspaper Pages

    NARCIS (Netherlands)

    Aiello, Marco; Pegoretti, Andrea

    2004-01-01

    In the analysis of a newspaper page an important step is the clustering of various text blocks into logical units, i.e., into articles. We propose three algorithms based on text processing techniques to cluster articles in newspaper pages. Based on the complexity of the three algorithms and experime

  9. Minimal Guidelines for Authors of Web Pages.

    Science.gov (United States)

    ADE Bulletin, 2002

    2002-01-01

    Presents guidelines that recommend the minimal reference information that should be provided on Web pages intended for use by students, teachers, and scholars in the modern languages. Suggests the inclusion of information about responsible parties, copyright declaration, privacy statements, and site information. Makes a note on Web page style. (SG)

  10. Web page classification on child suitability

    NARCIS (Netherlands)

    Eickhoff, C.; Serdyukov, P.; Vries, A.P. de

    2010-01-01

    Children spend significant amounts of time on the Internet. Recent studies showed, that during these periods they are often not under adult supervision. This work presents an automatic approach to identifying suitable web pages for children based on topical and non-topical web page aspects. We discu

  11. Veracity in in vitro fertilization Web pages.

    Science.gov (United States)

    Cowan, Bryan D

    2005-03-01

    Huang et al. described compliance of IVF websites against the American Medical Association online health information guidelines and reported that IVF websites scored poorly. We describe a protocol for IVF websites that would inform readers about truthfulness of the page, develop standards for page construction, and establish a review process.

  12. CERN Web Pages Receive a Makeover

    CERN Multimedia

    2001-01-01

    A sudden allergic reaction to the colour turquoise? Never fear, from Monday 2 April you'll be able to click in the pink box at the top of the CERN users' welcome page to go to the all-new welcome page, which is simpler and better organized. CERN's new-look intranet is the first step in a complete Web-makeover being applied by the Web Public Education (WPE) group of ETT Division. The transition will be progressive, to allow users to familiarize themselves with the new pages. Until 17 April, CERN users will still get the familiar turquoise welcome page by default, with the new pages operating in parallel. From then on, the default will switch to the new pages, with the old ones being finally switched off on 25 May. Some 400 pages have received the makeover treatment. For more information about the changes to your Web, take a look at: http://www.cern.ch/CERN/NewUserPages/ Happy surfing!

  13. Comparing classical and quantum PageRanks

    CERN Document Server

    Loke, T; Rodriguez, J; Small, M; Wang, J B

    2015-01-01

    Following recent developments in quantum PageRanking, we present a comparative analysis of discrete-time and continuous-time quantum-walk-based PageRank algorithms. For the discrete-time case, we introduce an alternative PageRank measure based on the maximum probabilities achieved by the walker on the nodes. We demonstrate that the required time of evolution does not scale significantly with increasing network size. We affirm that all three quantum PageRank measures considered here distinguish clearly between outerplanar hierarchical, scale-free, and Erdős-Rényi network types. Relative to classical PageRank and to different extents, the quantum measures better highlight secondary hubs and resolve ranking degeneracy among peripheral nodes for the networks we studied in this paper.

  14. PageRank Pipeline Benchmark: Proposal for a Holistic System Benchmark for Big-Data Platforms

    CERN Document Server

    Dreher, Patrick; Hill, Chris; Gadepally, Vijay; Kuszmaul, Bradley; Kepner, Jeremy

    2016-01-01

    The rise of big data systems has created a need for benchmarks to measure and compare the capabilities of these systems. Big data benchmarks present unique scalability challenges. The supercomputing community has wrestled with these challenges for decades and developed methodologies for creating rigorous scalable benchmarks (e.g., HPC Challenge). The proposed PageRank pipeline benchmark employs supercomputing benchmarking methodologies to create a scalable benchmark that is reflective of many real-world big data processing systems. The PageRank pipeline benchmark builds on existing prior scalable benchmarks (Graph500, Sort, and PageRank) to create a holistic benchmark with multiple integrated kernels that can be run together or independently. Each kernel is well defined mathematically and can be implemented in any programming environment. The linear algebraic nature of PageRank makes it well suited to being implemented using the GraphBLAS standard. The computations are simple enough that performance predictio...

  15. An EAACI “European Survey on Adverse Systemic Reactions in Allergen Immunotherapy (EASSI)”: the methodology

    DEFF Research Database (Denmark)

    Calderón, Moises A; Rodríguez Del Río, Pablo; Vidal, Carmen;

    2014-01-01

    UNLABELLED: At present, there is no European report on clinically relevant systemic reactions due to the regular use of allergen immunotherapy (AIT), administered either subcutaneously or sublingually (SCIT and SLIT, respectively) outside clinical trials. Using an electronic survey and a "harmoni...

  16. Analyzing the Impact of Visitors on Page Views with Google Analytics

    Directory of Open Access Journals (Sweden)

    MOHAMMAD AMIN OMIDVAR

    2011-02-01

    Full Text Available This paper develops a flexible methodology to analyze the effectiveness of different variables on various dependent variables, all of which are time series, and in particular shows how to use a time-series regression on one of the most important primary indices (page views per visit) in Google Analytics; in conjunction, it shows how to select the most suitable data to obtain a more accurate result. Search engine visitors have a variety of impacts on page views that cannot be described by a single regression. On one hand, referral visitors are well fitted by a linear regression with low impact. On the other hand, direct visitors make a huge impact on page views. A higher connection speed does not simply imply a higher impact on page views, and the content of the web page and the territory of visitors can help connection speed describe user behavior. Returning visitors have some similarities with direct visitors.

  17. Analyzing the Impact of Visitors on Page Views with Google Analytics

    CERN Document Server

    Omidvar, Mohammad Amin; Shokry, Najes; 10.5121/ijwest.2011.2102

    2011-01-01

    This paper develops a flexible methodology to analyze the effectiveness of different variables on various dependent variables, all of which are time series, and in particular shows how to use a time-series regression on one of the most important primary indices (page views per visit) in Google Analytics; in conjunction, it shows how to select the most suitable data to obtain a more accurate result. Search engine visitors have a variety of impacts on page views that cannot be described by a single regression. On one hand, referral visitors are well fitted by a linear regression with low impact. On the other hand, direct visitors make a huge impact on page views. A higher connection speed does not simply imply a higher impact on page views, and the content of the web page and the territory of visitors can help connection speed describe user behavior. Returning visitors have some similarities with direct visitors.

  18. The effect of new links on Google PageRank

    NARCIS (Netherlands)

    Avrachenkov, Konstatin; Litvak, Nelly

    2004-01-01

    PageRank is one of the principle criteria according to which Google ranks Web pages. PageRank can be interpreted as a frequency of visiting a Web page by a random surfer and thus it reflects the popularity of a Web page. We study the effect of newly created links on Google PageRank. We discuss to wh

  19. The changing pages of comics : Page layouts across eight decades of American superhero comics

    NARCIS (Netherlands)

    Pederson, Kaitlin; Cohn, Neil

    2016-01-01

    Page layouts are one of the most overt features of comics’ structure. We hypothesized that American superhero comics have changed in their page layout over eight decades, and investigated this using a corpus analysis of 40 comics from 1940 through 2014. On the whole, we found that comics pages decre

  20. The changing pages of comics : Page layouts across eight decades of American superhero comics

    NARCIS (Netherlands)

    Pederson, Kaitlin; Cohn, Neil

    2016-01-01

    Page layouts are one of the most overt features of comics’ structure. We hypothesized that American superhero comics have changed in their page layout over eight decades, and investigated this using a corpus analysis of 40 comics from 1940 through 2014. On the whole, we found that comics pages

  1. Web Page Segmentation for Small Screen Devices Using Tag Path Clustering Approach

    OpenAIRE

    Ms. S.Aruljothi; Mrs. S. Sivaranjani; Dr.S.Sivakumari

    2013-01-01

    The web pages existing these days are developed to be displayed on desktop PCs, so viewing them in mobile web browsers is extremely difficult. Since mobile devices have restricted resources, small-screen device users need to scroll down and across complicated sites persistently. To address the problem of resource limitation on small-screen devices, a novel methodology of web page segmentation with tag path clustering is proposed, which reduces the memory space demand of the small hand-h...

  2. Bibliographic survey on methodologies for development of health database of the population in case of cancer occurrences; Levantamento bibliografico sobre metodologias para elaboracao de um banco de dados da saude da populacao em casos de ocorrencias de cancer

    Energy Technology Data Exchange (ETDEWEB)

    Cavinato, Christianne C.; Andrade, Delvonei A. de; Sabundjian, Gaiane, E-mail: christiannecobellocavinato@gmail.com, E-mail: delvonei@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Diz, Maria Del Pilar E., E-mail: maria.pilar@icesp.org.br [Instituto do Cancer do Estado de Sao Paulo (ICESP), Sao Paulo, SP (Brazil)

    2014-07-01

    The objective is to survey existing methodologies for the development of a public health database, focusing on the health (fatal and nonfatal cancer cases) of the population surrounding a nuclear facility, for the purpose of calculating its environmental cost. From the methodologies found for developing this type of database, a methodology will be developed and applied to the internal public of IPEN/CNEN-SP, Brazil, as a pre-test for the acquisition of the desired health information.

  3. Case and Relation (CARE) based Page Rank Algorithm for Semantic Web Search Engines

    Directory of Open Access Journals (Sweden)

    N. Preethi

    2012-05-01

    Full Text Available Web information retrieval deals with techniques for finding relevant web pages for any given query from a collection of documents. Search engines have become the most helpful tool for obtaining useful information from the Internet. The next-generation Web architecture, represented by the Semantic Web, provides a layered architecture that allows data to be reused across applications. The proposed architecture uses a hybrid methodology named Case and Relation (CARE) based Page Rank, which uses past problem-solving experience maintained in the case base to form best-matching relations and then uses them to generate graphs and spanning forests that assign a relevance score to the pages.

  4. An Improved PageRank Algorithm

    Institute of Scientific and Technical Information of China (English)

    王钟斐

    2011-01-01

    Aiming at the problems of topic drift and the emphasis on old web pages in the PageRank algorithm, this article combines anchor text similarity with a timing feedback factor to present an improved algorithm, STPR, and analyzes STPR experimentally. First, the article compares the traditional PageRank algorithm with a PageRank algorithm that adds anchor text similarity; results show that adding anchor text similarity helps reduce the occurrence of topic drift. Second, the article compares the anchor-text-similarity PageRank algorithm with STPR; results show that STPR not only reduces topic drift but also compensates the PageRank values of new web pages.
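
    The STPR idea described in the abstract can be sketched as a PageRank variant whose link-following is weighted by anchor-text similarity, with a timing feedback factor applied afterwards so new pages are not under-ranked. The similarity values, freshness factors, and toy graph below are illustrative assumptions, not the paper's formulas.

```python
def stpr(links, anchor_sim, freshness, damping=0.85, iterations=100):
    """links: page -> outlinks; anchor_sim: (page, target) -> similarity in (0, 1];
    freshness: page -> timing feedback factor (values > 1 boost newer pages)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            total = sum(anchor_sim[(page, t)] for t in outlinks)
            if total == 0:
                continue
            for t in outlinks:
                # follow links with topically similar anchor text more often
                new_rank[t] += damping * rank[page] * anchor_sim[(page, t)] / total
        rank = new_rank
    scored = {p: rank[p] * freshness[p] for p in pages}  # timing feedback
    norm = sum(scored.values())
    return {p: s / norm for p, s in scored.items()}

# Toy graph: A links to B with highly similar anchor text, to C with dissimilar text.
scores = stpr(
    {"A": ["B", "C"], "B": ["A"], "C": ["A"]},
    {("A", "B"): 0.9, ("A", "C"): 0.1, ("B", "A"): 1.0, ("C", "A"): 1.0},
    {"A": 1.0, "B": 1.0, "C": 1.0},
)
```

    With equal freshness, the topically similar target B ends up ranked above C.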

  5. The matrix method to calculate page rank

    Directory of Open Access Journals (Sweden)

    H. Barboucha, M. Nasri

    2014-06-01

    Full Text Available Choosing the right keywords is relatively easy, whereas getting a high PageRank is more complicated. The PageRank index is what defines the position in the result pages of search engines (for Google of course, but the other engines now use more or less the same kind of algorithm). It is therefore very important to understand how this type of algorithm works in order to hope to appear on the first page of results (the only page read in 95% of cases), or at least among the first. We propose in this paper to clarify the operation of this algorithm using a matrix method and a JavaScript program that makes it possible to experiment with this type of analysis. It is of course a simplified version, but it can add value to a website, achieve a high ranking in the search results, and reach a larger customer base. The interest is to disclose an algorithm that calculates the relevance of each page. It is in fact a mathematical algorithm based on a web graph: the graph is formed of all the web pages, modeled as nodes, and the hyperlinks between them, modeled as arcs.
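
    The web-graph computation described in the abstract can be sketched as the standard PageRank power iteration: each page repeatedly redistributes its score along its outgoing links, damped by a teleport probability. The toy graph and the damping factor 0.85 below are assumptions for the sketch, not taken from the paper.

```python
def pagerank(links, damping=0.85, iterations=100):
    """links: dict mapping each page (node) to the list of pages it links to (arcs)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # teleport term: (1 - damping) spread uniformly over all pages
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # dangling page: spread its rank over all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

    In this toy graph, C is linked to by both A and B and therefore receives the highest score.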

  6. Flash-Aware Page Replacement Algorithm

    Directory of Open Access Journals (Sweden)

    Guangxia Xu

    2014-01-01

    Full Text Available Due to the limited main memory resource of consumer electronics equipped with NAND flash memory as storage device, an efficient page replacement algorithm called FAPRA is proposed for NAND flash memory in the light of its inherent characteristics. FAPRA introduces an efficient victim page selection scheme taking into account the benefit-to-cost ratio for evicting each victim page candidate and the combined recency and frequency value, as well as the erase count of the block to which each page belongs. Since the dirty victim page often contains clean data that exist in both the main memory and the NAND flash memory based storage device, FAPRA only writes the dirty data within the victim page back to the NAND flash memory based storage device in order to reduce the redundant write operations. We conduct a series of trace-driven simulations and experimental results show that our proposed FAPRA algorithm outperforms the state-of-the-art algorithms in terms of page hit ratio, the number of write operations, runtime, and the degree of wear leveling.
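
    The victim-selection criteria named in the abstract (benefit-to-cost ratio of eviction, combined recency and frequency value, and the erase count of the page's block) can be illustrated with a toy scoring function. The particular weighting below is an assumption for illustration only, not FAPRA's actual formula.

```python
def victim_score(page):
    # Lower score = better candidate for eviction.
    eviction_cost = 2.0 if page["dirty"] else 1.0    # dirty pages must be written back to flash
    hotness = page["recency"] + page["frequency"]    # combined recency/frequency value
    wear = 1.0 + page["block_erase_count"] / 100.0   # avoid evicting from heavily erased blocks
    return hotness * eviction_cost * wear

def select_victim(pages):
    """Pick the buffered page whose eviction is cheapest under the toy score."""
    return min(pages, key=victim_score)

pages = [
    {"id": 1, "dirty": True,  "recency": 5, "frequency": 3, "block_erase_count": 10},
    {"id": 2, "dirty": False, "recency": 1, "frequency": 1, "block_erase_count": 5},
]
victim = select_victim(pages)
```

    Here the clean, cold page (id 2) is chosen over the dirty, hot one, avoiding a redundant flash write.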

  7. Design and methodology of a mixed methods follow-up study to the 2014 Ghana Demographic and Health Survey

    Science.gov (United States)

    Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys

    2017-01-01

    ABSTRACT Background: The intended meaning behind responses to standard questions posed in large-scale health surveys are not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. Objectives: We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. Methods: The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS). The study was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Results: Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected sub-sample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. Conclusions: We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, filtering, successful use of tablets, and share lessons learned for future such follow-up surveys. PMID:28145817

  8. Web Page Recommendation Using Web Mining

    Directory of Open Access Journals (Sweden)

    Modraj Bhavsar

    2014-07-01

    Full Text Available On the World Wide Web, various kinds of content are generated in huge amounts, so web recommendation has become an important part of web applications for giving relevant results to users. Different kinds of web recommendations are made available to users every day, including images, video, audio, query suggestions, and web pages. In this paper we aim at providing a framework for web page recommendation. (1) We first describe the basics of web mining and the types of web mining. (2) We detail each web mining technique. (3) We propose the architecture for personalized web page recommendation.

  9. Ensemble Enabled Weighted PageRank

    CERN Document Server

    Luo, Dongsheng; Hu, Renjun; Duan, Liang; Ma, Shuai

    2016-01-01

    This paper describes our solution for WSDM Cup 2016. Ranking the query independent importance of scholarly articles is a critical and challenging task, due to the heterogeneity and dynamism of entities involved. Our approach is called Ensemble enabled Weighted PageRank (EWPR). To do this, we first propose Time-Weighted PageRank that extends PageRank by introducing a time decaying factor. We then develop an ensemble method to assemble the authorities of the heterogeneous entities involved in scholarly articles. We finally propose to use external data sources to further improve the ranking accuracy. Our experimental study shows that our EWPR is a good choice for ranking scholarly articles.
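
    The Time-Weighted PageRank component described above can be sketched as a PageRank variant in which each citing article's contribution is down-weighted by its age. The exponential decay form, the decay rate, and the toy data below are illustrative assumptions, not the paper's exact formulation.

```python
import math

def time_weighted_pagerank(citations, ages, decay=0.1, damping=0.85, iterations=100):
    """citations: paper -> list of papers it cites; ages: paper -> age in years."""
    papers = list(citations)
    n = len(papers)
    rank = {p: 1.0 / n for p in papers}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in papers}
        for paper, cited in citations.items():
            if not cited:
                continue
            weight = math.exp(-decay * ages[paper])  # older citing papers count less
            share = damping * weight * rank[paper] / len(cited)
            for target in cited:
                new_rank[target] += share
        rank = new_rank
    total = sum(rank.values())  # renormalize: the decay factor leaks rank mass
    return {p: r / total for p, r in rank.items()}

scores = time_weighted_pagerank(
    {"old_survey": ["classic"], "new_paper": ["classic"], "classic": []},
    {"old_survey": 20, "new_paper": 1, "classic": 30},
)
```

    The much-cited "classic" paper ranks highest, with the recent citation from "new_paper" contributing more than the older one.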

  10. UAV-Based Photogrammetry and Integrated Technologies for Architectural Applications—Methodological Strategies for the After-Quake Survey of Vertical Structures in Mantua (Italy)

    Directory of Open Access Journals (Sweden)

    Cristiana Achille

    2015-06-01

    Full Text Available This paper examines the survey of tall buildings in an emergency context like in the case of post-seismic events. The after-earthquake survey has to guarantee time-savings, high precision and security during the operational stages. The main goal is to optimize the application of methodologies based on acquisition and automatic elaborations of photogrammetric data even with the use of Unmanned Aerial Vehicle (UAV) systems in order to provide fast and low cost operations. The suggested methods integrate new technologies with commonly used technologies like TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case, based on the comparison of acquisition, calibration and 3D modeling results in case of use of a laser scanner, metric camera and amateur reflex camera. The test would help us to demonstrate the efficiency of image based methods in the acquisition of complex architecture. The case study is Santa Barbara Bell tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of the photogrammetry using UAV for the survey of vertical structures, complex buildings and difficult accessible architectural parts, providing high precision results.

  11. UAV-Based Photogrammetry and Integrated Technologies for Architectural Applications--Methodological Strategies for the After-Quake Survey of Vertical Structures in Mantua (Italy).

    Science.gov (United States)

    Achille, Cristiana; Adami, Andrea; Chiarini, Silvia; Cremonesi, Stefano; Fassi, Francesco; Fregonese, Luigi; Taffurelli, Laura

    2015-06-30

    This paper examines the survey of tall buildings in an emergency context like in the case of post-seismic events. The after-earthquake survey has to guarantee time-savings, high precision and security during the operational stages. The main goal is to optimize the application of methodologies based on acquisition and automatic elaborations of photogrammetric data even with the use of Unmanned Aerial Vehicle (UAV) systems in order to provide fast and low cost operations. The suggested methods integrate new technologies with commonly used technologies like TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case, based on the comparison of acquisition, calibration and 3D modeling results in case of use of a laser scanner, metric camera and amateur reflex camera. The test would help us to demonstrate the efficiency of image based methods in the acquisition of complex architecture. The case study is Santa Barbara Bell tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of the photogrammetry using UAV for the survey of vertical structures, complex buildings and difficult accessible architectural parts, providing high precision results.

  13. Objectives and methodology of Romanian SEPHAR II Survey. Project for comparing the prevalence and control of cardiovascular risk factors in two East-European countries: Romania and Poland

    Science.gov (United States)

    Dorobantu, Maria; Tautu, Oana-Florentina; Ghiorghe, Silviu; Badila, Elisabeta; Dana, Minca; Dobreanu, Minodora; Baila, Ilarie; Rutkowski, Marcin; Zdrojewski, Tomasz

    2015-01-01

    Introduction Comparing results of representative surveys conducted in different East-European countries could contribute to a better understanding and management of cardiovascular risk factors, offering grounds for the development of health policies addressing the special needs of this high cardiovascular risk region of Europe. The aim of this paper was to describe the methodology on which the comparison between the Romanian survey SEPHAR II and the Polish survey NATPOL 2011 results is based. Material and methods SEPHAR II, like NATPOL 2011, is a cross-sectional survey conducted on a representative sample of the adult Romanian population (18 to 80 years) and encompasses two visits with the following components: completing the study questionnaire, blood pressure and anthropometric measurements, and collection of blood and urine samples. Results From a total of 2223 subjects found at 2860 visited addresses, 2044 subjects gave written consent but only 1975 subjects had eligible data for the analysis, accounting for a response rate of 69.06%. Additionally we excluded 11 subjects who were 80 years of age (NATPOL 2011 included adult subjects up to 79 years). Therefore, the sample size included in the statistical analysis is 1964. It has similar age groups and gender structure as the Romanian population aged 18–79 years from the last census available at the moment of conducting the survey (weight adjustments for epidemiological analyses range from 0.48 to 8.7). Conclusions Sharing many similarities, the results of SEPHAR II and NATPOL 2011 surveys can be compared by a proper statistical method offering crucial information regarding cardiovascular risk factors in a high-cardiovascular risk European region. PMID:26322082

  14. Predictors of Smoking and Smokeless Tobacco Use in College Students: A Preliminary Study Using Web-Based Survey Methodology

    Science.gov (United States)

    Morrell, Holly E. R.; Cohen, Lee M.; Bacchi, Donna; West, Joel

    2005-01-01

    Cigarette smoking and smokeless tobacco (SLT) use are associated with numerous health hazards and economic costs, and rates of tobacco use have recently increased among young adults. In this study, the authors compared predictors of smoking and SLT use among college students (N = 21,410) from 13 Texas universities using a Web-based survey. Results…

  15. The Emerging Infections Network electronic mail conference and web page.

    Science.gov (United States)

    Strausbaugh, L J; Liedtke, L A

    2001-01-15

    In February 1997, the Emerging Infections Network (EIN) established an electronic mail conference to facilitate discussions about emerging infectious diseases and related topics among its members and public health officials. Later that year, the EIN opened its section of the Infectious Diseases Society of America's home page. The EIN Web page was developed to give its members an alternative route for responding to EIN surveys and to facilitate rapid dispersal of EIN reports. The unrestricted portion of the site allows visitors access to information about the EIN and to published EIN reports on specific topics. For the most part, these are brief summaries or abstracts. In the restricted, password-protected portion of the EIN site, members can access the detailed, original reports from EIN queries and the comprehensive listings of member observations. Search functions in both portions of the EIN site enhance the retrieval of reports and observations on specific topics.

  16. Expectations for the methodology and translation of animal research: a survey of the general public, medical students and animal researchers in North America.

    Science.gov (United States)

    Joffe, Ari R; Bara, Meredith; Anton, Natalie; Nobis, Nathan

    2016-09-01

    To determine what are considered acceptable standards for animal research (AR) methodology and translation rate to humans, a validated survey was sent to: a) a sample of the general public, via Sampling Survey International (SSI; Canada), Amazon Mechanical Turk (AMT; USA), a Canadian city festival (CF) and a Canadian children's hospital (CH); b) a sample of medical students (two first-year classes); and c) a sample of scientists (corresponding authors and academic paediatricians). There were 1379 responses from the general public sample (SSI, n = 557; AMT, n = 590; CF, n = 195; CH, n = 102), 205/330 (62%) medical student responses, and 23/323 (7%, too few to report) scientist responses. Asked about methodological quality, most of the general public and medical student respondents expect that: AR is of high quality (e.g. anaesthesia and analgesia are monitored, even overnight, and 'humane' euthanasia, optimal statistical design, comprehensive literature review, randomisation and blinding, are performed), and costs and difficulty are not acceptable justifications for lower quality (e.g. costs of expert consultation, or more laboratory staff). Asked about their expectations of translation to humans (of toxicity, carcinogenicity, teratogenicity and treatment findings), most expect translation more than 60% of the time. If translation occurred less than 20% of the time, a minority disagreed that this would "significantly reduce your support for AR". Medical students were more supportive of AR, even if translation occurred less than 20% of the time. Expectations for AR are much higher than empirical data show to have been achieved.

  17. Generating Best Features for Web Page Classification

    Directory of Open Access Journals (Sweden)

    K. Selvakuberan

    2008-03-01

    As the Internet provides millions of web pages for each and every search term, getting interesting and required results quickly from the Web becomes very difficult. Automatic classification of web pages into relevant categories is a current research topic which helps the search engine to get relevant results. As web pages contain many irrelevant, infrequent and stop words that reduce the performance of the classifier, extracting or selecting representative features from the web page is an essential pre-processing step. The goal of this paper is to find a minimum number of highly qualitative features by integrating feature selection techniques. We conducted experiments with various numbers of features selected by different feature selection algorithms on a well-defined initial set of features and show that the CfsSubset evaluator combined with the term-frequency method gives minimal qualitative features sufficient to attain considerable classification accuracy.
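
The pre-processing step described above (drop stop words, then keep only the most informative terms) can be sketched with a minimal term-frequency-based selector. The stop-word list, the documents and the choice of k are illustrative assumptions, not the paper's CfsSubset configuration.

```python
# Minimal term-frequency feature selection for page classification:
# remove stop words, count remaining terms, keep the top-k as features.
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "of", "to", "is", "in"}

def select_features(documents, k=5):
    """Return the k most frequent non-stop-word terms across documents."""
    counts = Counter()
    for doc in documents:
        counts.update(t for t in doc.lower().split() if t not in STOP_WORDS)
    return [term for term, _ in counts.most_common(k)]

docs = [
    "the web page of the search engine",
    "search results and web page ranking",
    "ranking the page in a search engine",
]
features = select_features(docs, k=3)
```

In a real classifier this list would define the feature vector passed to the learner; the paper combines such frequency ranking with a correlation-based subset evaluator.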

  18. Universal Emergence of PageRank

    CERN Document Server

    Frahm, K M; Shepelyansky, D L

    2011-01-01

    The PageRank algorithm enables us to rank the nodes of a network through a specific eigenvector of the Google matrix, using a damping parameter $\alpha \in ]0,1[$. Using extensive numerical simulations of large web networks, we determine numerically and analytically the universal features of the PageRank vector at its emergence when $\alpha \rightarrow 1$. The whole network can be divided into a core part and a group of invariant subspaces. For $\alpha \rightarrow 1$ the PageRank converges to a universal power-law distribution on the invariant subspaces, whose size distribution also follows a universal power law. The convergence of PageRank at $\alpha \rightarrow 1$ is controlled by eigenvalues of the core part of the Google matrix which are exponentially close to unity, leading to large relaxation times as, for example, in spin glasses.

  19. Universal emergence of PageRank

    Energy Technology Data Exchange (ETDEWEB)

    Frahm, K M; Georgeot, B; Shepelyansky, D L, E-mail: frahm@irsamc.ups-tlse.fr, E-mail: georgeot@irsamc.ups-tlse.fr, E-mail: dima@irsamc.ups-tlse.fr [Laboratoire de Physique Theorique du CNRS, IRSAMC, Universite de Toulouse, UPS, 31062 Toulouse (France)

    2011-11-18

    The PageRank algorithm enables us to rank the nodes of a network through a specific eigenvector of the Google matrix, using a damping parameter α ∈ ]0, 1[. Using extensive numerical simulations of large web networks, with a special accent on British University networks, we determine numerically and analytically the universal features of the PageRank vector at its emergence when α → 1. The whole network can be divided into a core part and a group of invariant subspaces. For α → 1, PageRank converges to a universal power-law distribution on the invariant subspaces whose size distribution also follows a universal power law. The convergence of PageRank at α → 1 is controlled by eigenvalues of the core part of the Google matrix, which are extremely close to unity, leading to large relaxation times as, for example, in spin glasses.
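
The damped eigenvector computation the two records above study can be sketched by plain power iteration on a toy directed graph. The graph, tolerance and iteration cap are illustrative choices; the records' analysis concerns the limit α → 1 on far larger networks.

```python
# Power-iteration PageRank with damping parameter alpha.
# links: dict mapping each node to its list of out-neighbours.
def pagerank(links, alpha=0.85, tol=1e-10, max_iter=1000):
    nodes = sorted(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(max_iter):
        new = {v: (1.0 - alpha) / n for v in nodes}  # teleportation term
        for v in nodes:
            out = links[v]
            if out:  # distribute rank along out-links
                share = alpha * rank[v] / len(out)
                for w in out:
                    new[w] += share
            else:    # dangling node: spread its rank uniformly
                for w in nodes:
                    new[w] += alpha * rank[v] / n
        if sum(abs(new[v] - rank[v]) for v in nodes) < tol:
            return new
        rank = new
    return rank

# 'c' only links to itself, so it behaves like a small invariant subspace
# and soaks up rank as alpha grows.
ranks = pagerank({'a': ['b'], 'b': ['a', 'c'], 'c': ['c']})
```

Raising `alpha` toward 1 in this toy example concentrates ever more rank on the self-looping node, a miniature of the invariant-subspace behaviour the records describe.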

  20. Best Practices for Searchable Collection Pages

    Science.gov (United States)

    Searchable Collection pages are stand-alone documents that do not have any web area navigation. They should not recreate existing content on other sites and should be tagged with quality metadata and taxonomy terms.

  1. Intention to continue using Facebook fan pages from the perspective of social capital theory.

    Science.gov (United States)

    Lin, Kuan-Yu; Lu, Hsi-Peng

    2011-10-01

    Social network sites enable users to express themselves, establish ties, and develop and maintain social relationships. Recently, many companies have begun using social media identity (e.g., Facebook fan pages) to enhance brand attractiveness, and social network sites have evolved into social utility networks, thereby creating a number of promising business opportunities. To this end, the operators of fan pages need to be aware of the factors motivating users to continue their patronization of such pages. This study set out to identify these motivating factors from the point of view of social capital. This study employed structural equation modeling to investigate a research model based on a survey of 327 fan pages users. This study discovered that ties related to social interaction (structural dimension), shared values (cognitive dimension), and trust (relational dimension) play important roles in users' continued intention to use Facebook fan pages. Finally, this study discusses the implications of these findings and offers directions for future research.

  2. A Web Page Summarization for Mobile Phones

    Science.gov (United States)

    Hasegawa, Takaaki; Nishikawa, Hitoshi; Imamura, Kenji; Kikui, Gen'ichiro; Okumura, Manabu

    Recently, web pages for mobile devices are widely spread on the Internet and a lot of people can access web pages through search engines by mobile devices as well as personal computers. A summary of a retrieved web page is important because the people judge whether or not the page would be relevant to their information need according to the summary. In particular, the summary must be not only compact but also grammatical and meaningful when the users retrieve information using a mobile phone with a small screen. Most search engines seem to produce a snippet based on the keyword-in-context (KWIC) method. However, this simple method could not generate a refined summary suitable for mobile phones because of low grammaticality and content overlap with the page title. We propose a more suitable method to generate a snippet for mobile devices using sentence extraction and sentence compression methods. First, sentences are biased based on whether they include the query terms from the users or words that are relevant to the queries, as well as whether they do not overlap with the page title based on maximal marginal relevance (MMR). Second, the selected sentences are compressed based on their phrase coverage, which is measured by the scores of words, and their phrase connection probability measured based on the language model, according to the dependency structure converted from the sentence. The experimental results reveal the proposed method outperformed the KWIC method in terms of relevance judgment, grammaticality, non-redundancy and content coverage.
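
The first stage described above, biasing sentence selection toward the query while penalising overlap with the title, is the maximal-marginal-relevance (MMR) idea. A toy selector is sketched below; the Jaccard word-overlap similarity and the weight `lam` are illustrative assumptions, not the authors' scoring model.

```python
# Toy MMR sentence selection: reward query relevance, penalise redundancy
# with the page title and with sentences already selected.
def overlap(a, b):
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(1, len(wa | wb))  # Jaccard similarity

def mmr_select(sentences, query, title, lam=0.7, k=2):
    selected = []
    candidates = list(sentences)
    while candidates and len(selected) < k:
        def score(s):
            redundancy = max([overlap(s, title)] + [overlap(s, t) for t in selected])
            return lam * overlap(s, query) - (1 - lam) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

summary = mmr_select(
    ["mobile phones show compact summaries",
     "the weather was fine yesterday",
     "search engines rank mobile web pages"],
    query="mobile summaries",
    title="mobile web",
    k=1,
)
```

The third sentence matches the query term "mobile" but overlaps heavily with the title, so MMR prefers the first sentence; the paper then compresses the selected sentences in a second stage.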

  3. Banner Pages on the New Printing Infrastructure

    CERN Multimedia

    2006-01-01

    Changes to the printing service were announced in CERN Bulletin No. 37-38/2006. In the new infrastructure, the printing of the banner page has been disabled in order to reduce paper consumption. Statistics show that the average print job size is small and the paper savings by not printing the banner page could be up to 20 %. When each printer is moved onto the new infrastructure banner page printing will be disabled. In the case of corridor printers which are shared by several users, the Helpdesk can re-enable banner page printing upon request. We hope ultimately to arrive at a situation where banner page printing is enabled on fewer than 10% of printers registered on the network. You can still print banner pages on printers where it has been centrally disabled by using Linux. Simply add it to your print job on the client side by adding the -o job-sheets option to your lpr command. Detailed documentation is available on each SLC3/4 under the following link: http://localhost:631/sum.html#4_2 Please bea...

  4. The Carnegie-Spitzer-IMACS Redshift Survey of Galaxy Evolution since z=1.5: I. Description and Methodology

    CERN Document Server

    Kelson, Daniel D; Dressler, Alan; McCarthy, Patrick J; Shectman, Stephen A; Mulchaey, John S; Villanueva, Edward V; Crane, Jeffrey D; Quadri, Ryan F

    2014-01-01

    We describe the Carnegie-Spitzer-IMACS (CSI) Survey, a wide-field, near-IR selected spectrophotometric redshift survey with IMACS on Magellan-Baade. CSI uses a flux-limited sample of galaxies in Spitzer IRAC 3.6micron imaging of SWIRE fields to efficiently trace the stellar mass of average galaxies to z~1.5. This paper provides an overview of the survey selection, observations, and processing of the photometry and spectrophotometry. We also describe the analysis of the data: new methods of fitting synthetic SEDs are used to derive redshifts, stellar masses, emission line luminosities, and coarse information on recent star-formation. Our unique methodology for analyzing low-dispersion spectra taken with multilayer prisms in IMACS, combined with panchromatic photometry from the ultraviolet to the IR, has yielded high quality redshifts for 43,347 galaxies in our first 5.3 sq. degs of the SWIRE XMM-LSS field. A new approach to assessing data quality is also described, and three different approaches are used to es...

  5. Methodological Characteristics of "Chinese Novel History Survey" (《中国小说史略》)

    Institute of Scientific and Technical Information of China (English)

    张应怀

    2014-01-01

    Lu Xun's "Chinese Novel History Survey" (《中国小说史略》) is a monograph on the history of the Chinese novel, distinguished by its incisive exposition and its distinctive method. By examining how both Chinese and Western research methods are applied in the work, this paper argues that traditional Chinese philology and modern methodology together had an important influence on the formation of the work's academic value.

  6. The network adjustment aimed for the campaigned gravity survey using a Bayesian approach: methodology and model test

    Science.gov (United States)

    Chen, Shi; Liao, Xu; Ma, Hongsheng; Zhou, Longquan; Wang, Xingzhou; Zhuang, Jiancang

    2017-04-01

    The relative gravimeter, which generally uses a zero-length spring as the gravity sensor, is still the first choice for terrestrial gravity measurement because of its efficiency and low cost. Because the drift rate of the instrument changes with time and from meter to meter, estimating it requires returning to a base station or another station of known gravity value for repeated measurements at regular intervals of a few hours during a practical survey. For a campaigned gravity survey over a large region, however, where stations are several to tens of kilometres apart, such frequent returns greatly reduce survey efficiency and are extremely time-consuming. In this paper, we propose a new gravity data adjustment method that estimates the meter drift by means of Bayesian statistical inference. In our approach, we assume that the drift rate is a smooth function of elapsed time. Trade-off parameters are used to control the fitting residuals, and we employ Akaike's Bayesian Information Criterion (ABIC) to estimate these trade-off parameters. Comparison and analysis of simulated data between the classical and Bayesian adjustments show that our method is robust and adapts to irregular, non-linear meter drift. Finally, we used this novel approach to process real campaigned gravity data from North China. Our adjustment method is able to recover the time-varying drift-rate function of each meter and to detect abnormal meter drift during the survey. We also define an alternative error estimate for the inverted gravity value at each station on the basis of marginal distribution theory. Acknowledgment: This research is supported by Science Foundation Institute of Geophysics, CEA from the Ministry of Science and Technology of China (Nos. DQJB16A05; DQJB16B07), China National Special Fund for Earthquake
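
The classical adjustment that the record's Bayesian method improves upon fits a constant drift rate to repeated base-station readings and subtracts it. A minimal least-squares sketch of that baseline is below; the readings are synthetic, and the paper itself replaces the linear assumption with a smooth time-varying drift function selected via ABIC.

```python
# Classical linear drift correction for a relative gravimeter:
# repeated readings at one base station should be constant, so the
# least-squares slope of reading vs. time estimates the drift rate.
def linear_drift_rate(times, readings):
    """Least-squares slope of readings vs. times (units per hour)."""
    n = len(times)
    mt = sum(times) / n
    mr = sum(readings) / n
    num = sum((t - mt) * (r - mr) for t, r in zip(times, readings))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# Synthetic base-station readings (mGal) drifting ~0.02 mGal/hour.
hours = [0.0, 2.0, 4.0, 6.0, 8.0]
readings = [980.000, 980.041, 980.080, 980.121, 980.160]
rate = linear_drift_rate(hours, readings)
corrected = [r - rate * t for t, r in zip(hours, readings)]
```

After correction the base readings should be nearly flat; when the true drift is irregular and non-linear, this linear model fails, which is the motivation for the Bayesian smooth-drift approach.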

  7. Personal and Public Start Pages in a library setting

    NARCIS (Netherlands)

    Kieft-Wondergem, Dorine

    2009-01-01

    Personal and Public Start Pages are web-based resources. With these kinds of tools it is possible to make your own free start page. A Start Page allows you to put all your web resources into one page, including blogs, email, podcasts and RSS feeds. It is possible to share the content of the page with others.

  9. Design and Validation of an Attention Model of Web Page Users

    OpenAIRE

    Ananya Jana; Samit Bhattacharya

    2015-01-01

    In this paper, we propose a model to predict the locations of the most attended pictorial information on a web page and the attention sequence of the information. We propose to divide the content of a web page into conceptually coherent units or objects, based on a survey of more than 100 web pages. The proposed model takes into account three characteristics of an image object: chromatic contrast, size, and position and computes a numerical value, the attention factor. We can predict from the...
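
The three image-object characteristics named above can be combined into a single numerical attention factor. The weights and the centre-bias formula in the sketch below are assumptions for illustration, not the authors' fitted model.

```python
# Illustrative attention factor for an image object on a web page,
# combining chromatic contrast, size and position (centre bias).
def attention_factor(contrast, area, x, y, page_w=1024, page_h=768,
                     w_contrast=0.4, w_size=0.4, w_position=0.2):
    size_score = area / (page_w * page_h)          # fraction of page covered
    # Centre bias: 1 at the page centre, falling toward 0 at the corners.
    dx = abs(x - page_w / 2) / (page_w / 2)
    dy = abs(y - page_h / 2) / (page_h / 2)
    position_score = 1 - (dx + dy) / 2
    return (w_contrast * contrast
            + w_size * size_score
            + w_position * position_score)

# A large, high-contrast, centred banner vs. a small, dull footer image.
banner = attention_factor(contrast=0.9, area=300 * 250, x=512, y=384)
footer = attention_factor(contrast=0.3, area=100 * 40, x=512, y=760)
```

Ranking objects by such a factor yields the predicted attention sequence; the paper validates its version of the model against eye-tracking-style survey data.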

  10. How Compliant are Dental Practice Facebook Pages with Australian Healthcare Advertising Regulations? A Netnographic Review.

    Science.gov (United States)

    Holden, Alexander C L; Spallek, Heiko

    2017-09-22

    The National Law that regulates the dental and other healthcare professions in Australia sets out regulations that dictate how dental practices are to advertise. This study examines the extent to which the profession complies with these regulations and the potential impact that advertising may have upon professionalism. A Facebook search of 38 Local Government Areas in Sydney, New South Wales was carried out to identify dental practices that had pages on this social media site. A framework for assessment of compliance was developed using the regulatory guidelines and was used to conduct a netnographic review. 266 practice pages were identified from across the 38 regions. 71.05% of pages were in breach of the National Law in their use of testimonials, 5.26% of pages displayed misleading or false information, 4.14% of pages displayed offers that had no clear terms and conditions or had inexact pricing, 19.55% of pages had pictures or text that was likely to create unrealistic expectations of treatment benefit and 16.92% of pages encouraged the indiscriminate and unnecessary utilisation of health services. This study found that compliance with the National Law by the Facebook pages surveyed was poor. This article is protected by copyright. All rights reserved.

  11. Joint analysis of galaxy-galaxy lensing and galaxy clustering: Methodology and forecasts for Dark Energy Survey

    Energy Technology Data Exchange (ETDEWEB)

    Park, Y.; Krause, E.; Dodelson, S.; Jain, B.; Amara, A.; Becker, M. R.; Bridle, S. L.; Clampitt, J.; Crocce, M.; Fosalba, P.; Gaztanaga, E.; Honscheid, K.; Rozo, E.; Sobreira, F.; Sánchez, C.; Wechsler, R. H.; Abbott, T.; Abdalla, F. B.; Allam, S.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Dietrich, J. P.; Doel, P.; Eifler, T. F.; Fausti Neto, A.; Fernandez, E.; Finley, D. A.; Flaugher, B.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; James, D. J.; Kent, S.; Kuehn, K.; Kuropatkin, N.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Melchior, P.; Miller, C. J.; Miquel, R.; Nichol, R. C.; Ogando, R.; Plazas, A. A.; Roe, N.; Romer, A. K.; Rykoff, E. S.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Soares-Santos, M.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Vikram, V.; Walker, A. R.; Weller, J.; Zuntz, J.

    2016-09-30

    The joint analysis of galaxy-galaxy lensing and galaxy clustering is a promising method for inferring the growth function of large scale structure. Our analysis will be carried out on data from the Dark Energy Survey (DES), with its measurements of both the distribution of galaxies and the tangential shears of background galaxies induced by these foreground lenses. We develop a practical approach to modeling the assumptions and systematic effects affecting small scale lensing, which provides halo masses, and large scale galaxy clustering. Introducing parameters that characterize the halo occupation distribution (HOD), photometric redshift uncertainties, and shear measurement errors, we study how external priors on different subsets of these parameters affect our growth constraints. Degeneracies within the HOD model, as well as between the HOD and the growth function, are identified as the dominant source of complication, with other systematic effects sub-dominant. The impact of HOD parameters and their degeneracies necessitate the detailed joint modeling of the galaxy sample that we employ. Finally, we conclude that DES data will provide powerful constraints on the evolution of structure growth in the universe, conservatively/optimistically constraining the growth function to 7.9%/4.8% with its first-year data that covered over 1000 square degrees, and to 3.9%/2.3% with its full five-year data that will survey 5000 square degrees, including both statistical and systematic uncertainties.

  12. Joint analysis of galaxy-galaxy lensing and galaxy clustering: Methodology and forecasts for Dark Energy Survey

    Science.gov (United States)

    Park, Y.; Krause, E.; Dodelson, S.; Jain, B.; Amara, A.; Becker, M. R.; Bridle, S. L.; Clampitt, J.; Crocce, M.; Fosalba, P.; Gaztanaga, E.; Honscheid, K.; Rozo, E.; Sobreira, F.; Sánchez, C.; Wechsler, R. H.; Abbott, T.; Abdalla, F. B.; Allam, S.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Dietrich, J. P.; Doel, P.; Eifler, T. F.; Fausti Neto, A.; Fernandez, E.; Finley, D. A.; Flaugher, B.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; James, D. J.; Kent, S.; Kuehn, K.; Kuropatkin, N.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Melchior, P.; Miller, C. J.; Miquel, R.; Nichol, R. C.; Ogando, R.; Plazas, A. A.; Roe, N.; Romer, A. K.; Rykoff, E. S.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Soares-Santos, M.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Vikram, V.; Walker, A. R.; Weller, J.; Zuntz, J.; DES Collaboration

    2016-09-01

    The joint analysis of galaxy-galaxy lensing and galaxy clustering is a promising method for inferring the growth function of large-scale structure. Anticipating a near future application of this analysis to Dark Energy Survey (DES) measurements of galaxy positions and shapes, we develop a practical approach to modeling the assumptions and systematic effects affecting the joint analysis of small-scale galaxy-galaxy lensing and large-scale galaxy clustering. Introducing parameters that characterize the halo occupation distribution (HOD), photometric redshift uncertainties, and shear measurement errors, we study how external priors on different subsets of these parameters affect our growth constraints. Degeneracies within the HOD model, as well as between the HOD and the growth function, are identified as the dominant source of complication, with other systematic effects being subdominant. The impact of HOD parameters and their degeneracies necessitate the detailed joint modeling of the galaxy sample that we employ. We conclude that DES data will provide powerful constraints on the evolution of structure growth in the Universe, conservatively/optimistically constraining the growth function to 7.9%/4.8% with its first-year data that cover over 1000 square degrees, and to 3.9%/2.3% with its full five-year data that will survey 5000 square degrees, including both statistical and systematic uncertainties.

  13. Design of a Web Page as a complement of educative innovation through MOODLE

    Science.gov (United States)

    Mendiola Ubillos, M. A.; Aguado Cortijo, Pedro L.

    2010-05-01

    In the context of using information technology to impart knowledge, and to establish the MOODLE system as a support and complementary tool to on-site teaching (b-learning), a web page was designed for the subject Agronomic and Food Industry Crops (Plantas de interés Agroalimentario) during the 2006-07 course. The page was hosted on the computer system of the Technical University of Madrid (Universidad Politécnica de Madrid) to give students a first contact with the contents of the subject. It presents the objectives and methodology, personal work planning, the subject programme and the activities; another section contains the evaluation criteria and recommended bibliography. The objective of the web page has been to make the information needed in the learning process more transparent and accessible, and to present it in a more attractive frame. The page has been updated and modified in each academic course offered since its first implementation, in some cases adding new specific links to increase its usefulness. At the end of each course, the students taking the subject are asked which elements they would like to modify, delete or add to the web page. In this way, the direct users give their point of view and help to improve the web page each course.

  14. Uniform Page Migration Problem in Euclidean Space

    Directory of Open Access Journals (Sweden)

    Amanj Khorramian

    2016-08-01

    The page migration problem in Euclidean space is revisited. In this problem, online requests occur at any location to access a single page located at a server. Every request must be served, and the server has the choice to migrate from its current location to a new location in space. Each service costs the Euclidean distance between the server and request. A migration costs the distance between the former and the new server location, multiplied by the page size. We study the problem in the uniform model, in which the page has size D = 1. All request locations are not known in advance; however, they are sequentially presented in an online fashion. We design a 2.75-competitive online algorithm that improves the current best upper bound for the problem with the unit page size. We also provide a lower bound of 2.732 for our algorithm. It was already known that 2.5 is a lower bound for this problem.
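
The cost model above (service cost = distance to the request; migration cost = distance moved, times D = 1) can be made concrete with a small 1-D simulation. The "move a fixed fraction toward each request" policy below is only an illustrative online strategy, not the paper's 2.75-competitive algorithm.

```python
# Uniform page migration cost model in one dimension (D = 1).
# Policy: serve the request, then migrate a fraction `step` toward it.
def online_cost(requests, start=0.0, step=0.5):
    server, total = start, 0.0
    for r in requests:
        total += abs(r - server)   # service cost: distance to the request
        move = step * (r - server)
        total += abs(move)         # migration cost: distance moved * D, D = 1
        server += move
    return total

# Four repeated requests at x = 10 while the server starts at x = 0.
cost = online_cost([10.0, 10.0, 10.0, 10.0])
```

For this input the halfway policy pays 28.125, whereas migrating fully on the first request would pay 10 + 10 = 20; competitive analysis bounds such gaps against the optimal offline schedule over all request sequences.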

  15. SRAM-Based FPGA Systems for Safety-Critical Applications:A Survey on Design Standards and Proposed Methodologies

    Institute of Scientific and Technical Information of China (English)

    Cinzia Bernardeschi; Luca Cassano; Andrea Domenici

    2015-01-01

    As the ASIC design cost becomes affordable only for very large-scale productions, the FPGA technology is currently becoming the leading technology for those applications that require a small-scale production. FPGAs can be considered as a technology crossing between hardware and software. Only a small number of standards for the design of safety-critical systems give guidelines and recommendations that take the peculiarities of the FPGA technology into consideration. The main contribution of this paper is an overview of the existing design standards that regulate the design and verification of FPGA-based systems in safety-critical application fields. Moreover, the paper proposes a survey of significant published research proposals and existing industrial guidelines about the topic, and collects and reports some lessons learned from industrial and research projects involving the use of FPGA devices.

  16. Reporting, handling and assessing the risk of bias associated with missing participant data in systematic reviews: a methodological survey.

    Science.gov (United States)

    Akl, Elie A; Carrasco-Labra, Alonso; Brignardello-Petersen, Romina; Neumann, Ignacio; Johnston, Bradley C; Sun, Xin; Briel, Matthias; Busse, Jason W; Ebrahim, Shanil; Granados, Carlos E; Iorio, Alfonso; Irfan, Affan; Martínez García, Laura; Mustafa, Reem A; Ramírez-Morera, Anggie; Selva, Anna; Solà, Ivan; Sanabria, Andrea Juliana; Tikkinen, Kari A O; Vandvik, Per O; Vernooij, Robin W M; Zazueta, Oscar E; Zhou, Qi; Guyatt, Gordon H; Alonso-Coello, Pablo

    2015-09-30

    To describe how systematic reviewers are reporting missing data for dichotomous outcomes, handling them in the analysis and assessing the risk of associated bias. We searched MEDLINE and the Cochrane Database of Systematic Reviews for systematic reviews of randomised trials published in 2010, and reporting a meta-analysis of a dichotomous outcome. We randomly selected 98 Cochrane and 104 non-Cochrane systematic reviews. Teams of 2 reviewers selected eligible studies and abstracted data independently and in duplicate using standardised, piloted forms with accompanying instructions. We conducted regression analyses to explore factors associated with using complete case analysis and with judging the risk of bias associated with missing participant data. Of Cochrane and non-Cochrane reviews, 47% and 7% (previews) and assuming no participants with missing data had the event (4%). The use of complete case analysis was associated only with Cochrane reviews (relative to non-Cochrane: OR=7.25; 95% CI 1.58 to 33.3, p=0.01). 65% of reviews assessed risk of bias associated with missing data; this was associated with Cochrane reviews (relative to non-Cochrane: OR=6.63; 95% CI 2.50 to 17.57, p=0.0001), and the use of the Grading of Recommendations Assessment, Development and Evaluation (GRADE) methodology (OR=5.02; 95% CI 1.02 to 24.75, p=0.047). Though Cochrane reviews are somewhat less problematic, most Cochrane and non-Cochrane systematic reviews fail to adequately report and handle missing data, potentially resulting in misleading judgements regarding risk of bias. Published by the BMJ Publishing Group Limited.
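The odds ratios with 95% CIs reported above (e.g. OR=7.25; 95% CI 1.58 to 33.3) come from standard 2×2-table computations. A minimal sketch of the odds ratio with a Wald confidence interval, using made-up counts rather than data from the survey:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    [[a, b], [c, d]] = [[group-1 events, group-1 non-events],
                        [group-2 events, group-2 non-events]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only.
print(tuple(round(x, 2) for x in odds_ratio_ci(20, 80, 10, 90)))  # → (2.25, 0.99, 5.09)
```

Note that the Wald interval is only one of several methods; exact or profile-likelihood intervals behave better with small cell counts.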

  17. Machine Learning Feature Selection for Tuning Memory Page Swapping

    Science.gov (United States)

    2013-09-01

    erroneous and generally results in useful pages being paged out too early, only to be paged back in shortly thereafter. [1] The first-in/first-out (FIFO) ... the tail of the queue are selected. This algorithm has been shown to have significant shortcomings. When using a FIFO PRA, it is possible to encounter a ... page which was just paged out. FIFO is therefore a sub-optimal page replacement algorithm. Least recently used (LRU) is incredibly simple in concept
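The FIFO and LRU policies contrasted in this record can be sketched as simple fault counters (an illustration of the two policies, not the report's implementation). On Belady's classic reference string with three frames, FIFO happens to outperform LRU:

```python
from collections import OrderedDict, deque

def fifo_faults(refs, frames):
    """Count page faults under FIFO replacement: evict the page resident longest."""
    mem, order, faults = set(), deque(), 0
    for p in refs:
        if p not in mem:
            faults += 1
            if len(mem) == frames:
                mem.discard(order.popleft())
            mem.add(p)
            order.append(p)
    return faults

def lru_faults(refs, frames):
    """Count page faults under LRU replacement: evict the least recently used page."""
    mem, faults = OrderedDict(), 0
    for p in refs:
        if p in mem:
            mem.move_to_end(p)  # mark as most recently used
        else:
            faults += 1
            if len(mem) == frames:
                mem.popitem(last=False)  # evict least recently used
            mem[p] = True
    return faults

refs = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]  # classic Belady reference string
print(fifo_faults(refs, 3), lru_faults(refs, 3))  # → 9 10
```

Neither policy matches Belady's optimal (evict the page whose next use is furthest in the future), which is why tuning and feature selection, as in the record above, are of interest.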

  18. Model for Predicting End User Web Page Response Time

    OpenAIRE

    Nagarajan, Sathya Narayanan; Ravikumar, Srijith

    2012-01-01

    Perceived responsiveness of a web page is one of the most important and least understood metrics of web page design, and is critical for attracting and maintaining a large audience. Web pages can be designed to meet performance SLAs early in the product lifecycle if there is a way to predict the apparent responsiveness of a particular page layout. Response time of a web page is largely influenced by page layout and various network characteristics. Since the network characteristics vary widely...

  19. A CROSS NATIONAL SURVEY OF THE LEGAL ...

    African Journals Online (AJOL)

    Fr. Ikenga

    Key words: Casual Work Arrangement, Labour, Protection, Legal Framework. 1. ... and outside such relationships (e.g. informal work, commercial contract holders such as .... This provision is the bedrock for the natural principle of right to.

  20. Survey of Information Acquisition and Dissemination to ...

    African Journals Online (AJOL)

    distance learning students: a case study of National Open University, Ibadan study centre. ... expiration between the learner and the learning institution. .... Majority of respondents 76(35.8%) were self employed, 25 (11.8%) were civil servant,.

  1. Web Page Recommendation Models Theory and Algorithms

    CERN Document Server

    Gündüz-Ögüdücü, Sule

    2010-01-01

    One of the application areas of data mining is the World Wide Web (WWW or Web), which serves as a huge, widely distributed, global information service for every kind of information such as news, advertisements, consumer information, financial management, education, government, e-commerce, health services, and many other information services. The Web also contains a rich and dynamic collection of hyperlink information, Web page access and usage information, providing sources for data mining. The amount of information on the Web is growing rapidly, as well as the number of Web sites and Web page

  2. Survey of Fire Detection Technologies and System Evaluation/Certification Methodologies and Their Suitability for Aircraft Cargo Compartments

    Science.gov (United States)

    Cleary, T.; Grosshandler, W.

    1999-01-01

    As part of the National Aeronautics and Space Administration (NASA) initiated program on global civil aviation, NIST is assisting the Federal Aviation Administration in its research to improve fire detection in aircraft cargo compartments. Aircraft cargo compartment detection certification methods have been reviewed. The Fire Emulator-Detector Evaluator (FE/DE) has been designed to evaluate fire detection technologies such as new sensors, multi-element detectors, and detectors that employ complex algorithms. The FE/DE is a flow tunnel that can reproduce the velocity, temperature, smoke, and combustion gas levels to which a detector might be exposed during a fire. A scientific literature survey and patent search have been conducted relating to existing and emerging fire detection technologies, and the potential use of new fire detection strategies in cargo compartment areas has been assessed. In the near term, improved detector signal processing and multi-sensor detectors based on combinations of smoke measurements, combustion gases and temperature are envisioned as significantly impacting detector system performance.

  3. Page sample size in web accessibility testing: how many pages is enough?

    NARCIS (Netherlands)

    Velleman, Eric; Geest, van der Thea

    2013-01-01

    Various countries and organizations use a different sampling approach and sample size of web pages in accessibility conformance tests. We are conducting a systematic analysis to determine how many pages is enough for testing whether a website is compliant with standard accessibility guidelines. This

  4. Agile Methodology - Past and Future

    Science.gov (United States)

    2011-05-01

    Agile Methodology – Past and Future. Warren W. Tignor, SAIC. [The remainder of this record is residue of a report documentation form (Form Approved OMB No. 0704-0188) and of presentation slides; the recoverable content references the rugby/Scrum metaphor of Takeuchi & Nonaka (HBR 1986, p. 139), waterfall versus agile, the Agile Manifesto (2001), and a Scrum graphic adapted from Schwaber (2007).]

  5. Terrorism and Politics Predominate on the Front Pages of the Basque Press. Content and Area Analysis of the Front Pages of the Regional Newspapers

    Directory of Open Access Journals (Sweden)

    Dr. Jesús A. Pérez Dasilva

    2010-01-01

    Full Text Available This paper offers the results of research project 08/20 of the University of the Basque Country on the news published on the front pages of the Basque press during the years 1996, 2001 and 2006. The researchers analyse the front pages of the Basque press to determine whether their content matches the demands and interests of their readers. The study shows which topics are most relevant for these newspapers. The research involved a detailed analysis of 2,448 front pages of the five main Basque newspapers, with a total of 19,156 news items. A specific methodology was developed for this work, enabling both a quantitative and a qualitative analysis of the news stories. The data shown in this paper are a summary of the more detailed results that emerged in the different fields of the research.

  6. DESIGNING AN ENGLISH LEARNING WEB PAGE

    Institute of Scientific and Technical Information of China (English)

    Wu; Xiaozhen

    1999-01-01

    This paper reviews developments in CALL research and the currently acknowledged guidelines for CALL design. Following these guidelines, the author designed an English learning Web page of her own. Target learners, rationale, designing aids, as well as the lesson plan using the Web page, are included.

  7. Upgrade of CERN OP Webtools IRRAD Page

    CERN Document Server

    Vik, Magnus Bjerke

    2017-01-01

    CERN Beams Department maintains a website with various tools for the Operations Group, one of them specific to the Proton Irradiation Facility (IRRAD). The IRRAD team use the tool to follow up and optimize the operation of the facility. The original version of the tool was difficult to maintain, and adding new features to the page was challenging. This summer-student project therefore aimed to upgrade the web page by rewriting it with maintainability and flexibility in mind. The new application uses a server-client architecture with a REST API on the back end, which the front end uses to request data for visualization. PHP is used on the back end to implement the APIs, and Swagger is used to document them. Vue, Semantic UI, Webpack, Node and ECMAScript 5 are used on the front end to visualize and administrate the data. The result is a new IRRAD operations web application with extended functionality, improved structure and an improved user interface. It includes a new Status Panel page th...

  8. 16 CFR 436.3 - Cover page.

    Science.gov (United States)

    2010-01-01

    ... with a cover page, in the order and form as follows: (a) The title “FRANCHISE DISCLOSURE DOCUMENT” in... begin operation of a franchise is . This includes that must be paid to the franchisor or affiliate. (2) This disclosure document summarizes certain provisions of your franchise agreement and...

  9. Modeling clicks beyond the first result page

    NARCIS (Netherlands)

    Chuklin, A.; Serdyukov, P.; de Rijke, M.

    2013-01-01

    Most modern web search engines yield a list of documents of a fixed length (usually 10) in response to a user query. The next ten search results are usually available in one click. These documents either replace the current result page or are appended to the end. Hence, in order to examine more

  10. Web Mining Using PageRank Algorithm

    Directory of Open Access Journals (Sweden)

    Vignesh. V

    2013-11-01

    Full Text Available Web mining is the automatic discovery and extraction of Web-based information. It is one of the most universal and dominant applications on the Internet, and as the Web grows in size, search tools that combine the results of multiple search engines become more valuable. Almost none of these studies, however, deals with the genetic relation algorithm (GRA), an evolutionary method with a graph structure. GRA was designed both to increase the effectiveness of search engines and to improve their efficiency. GRA treats the correlation coefficient between stock brands as strength, which indicates the relation between nodes in each individual of GRA. The reduced set of hyperlinks provided by GRA in the final generation consists of only the hyperlinks most similar to the query. Even so, end users are not fully satisfied. To improve user satisfaction, the PageRank algorithm is used to measure the importance of a page and to prioritize pages returned from GRA, reducing the user's searching time. The PageRank algorithm allocates rank to the filtered links based on the number of keyword occurrences in the content.
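The PageRank step mentioned above can be illustrated with a minimal power-iteration implementation (a generic sketch of the standard algorithm, not the paper's GRA-integrated system; the three-page graph is a made-up example):

```python
def pagerank(links, d=0.85, iters=50):
    """Power-iteration PageRank over an adjacency dict {page: [outlinks]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}  # teleportation mass
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)
                for q in outs:
                    new[q] += d * share
            else:  # dangling page: spread its rank uniformly
                for q in pages:
                    new[q] += d * rank[p] / n
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # → C
```

Page C ends up highest because it receives links from both A and B; the ranks form a probability distribution summing to 1.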

  11. Reconfigurable Full-Page Braille Displays

    Science.gov (United States)

    Garner, H. Douglas

    1994-01-01

    Electrically actuated braille display cells of proposed type arrayed together to form full-page braille displays. Like other braille display cells, these provide changeable patterns of bumps driven by digitally recorded text stored on magnetic tapes or in solid-state electronic memories. Proposed cells contain electrorheological fluid. Viscosity of such fluid increases in strong electrostatic field.

  13. Referencing web pages and e-journals.

    Science.gov (United States)

    Bryson, David

    2013-12-01

    One of the areas that can confuse students and authors alike is how to reference web pages and electronic journals (e-journals). The aim of this professional development article is to go back to first principles for referencing and to show, with examples, how these sources should be referenced.

  14. Accounting Programs' Home Pages: What's Happening.

    Science.gov (United States)

    Peek, Lucia E.; Roxas, Maria L.

    2002-01-01

    Content analysis of 62 accounting programs' websites indicated the following: 53% include mission statements; 62.9% list accreditation; many faculty biographies and personal pages used inconsistent formats; provision of information on financial aid, student organizations, career services, and certified public accountant requirements varied. Many…

  15. Efficient Web Change Monitoring with Page Digest

    Energy Technology Data Exchange (ETDEWEB)

    Buttler, D J; Rocco, D; Liu, L

    2004-02-20

    The Internet and the World Wide Web have enabled a publishing explosion of useful online information, which has produced the unfortunate side effect of information overload: it is increasingly difficult for individuals to keep abreast of fresh information. In this paper we describe an approach for building a system for efficiently monitoring changes to Web documents. This paper has three main contributions. First, we present a coherent framework that captures different characteristics of Web documents. The system uses the Page Digest encoding to provide a comprehensive monitoring system for content, structure, and other interesting properties of Web documents. Second, the Page Digest encoding enables improved performance for individual page monitors through mechanisms such as short-circuit evaluation, linear time algorithms for document and structure similarity, and data size reduction. Finally, we develop a collection of sentinel grouping techniques based on the Page Digest encoding to reduce redundant processing in large-scale monitoring systems by grouping similar monitoring requests together. We examine how effective these techniques are over a wide range of parameters and have seen an order of magnitude speed up over existing Web-based information monitoring systems.
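The Page Digest encoding itself captures document structure; as a simplified stand-in for the short-circuit evaluation idea, the sketch below compares a cheap length check before a content hash when deciding whether a monitored page has changed (a hypothetical Monitor class for illustration, not the paper's system):

```python
import hashlib

def digest(content: bytes) -> tuple:
    """Cheap-to-expensive signature: length first, then a content hash,
    so tuple comparison can short-circuit on a length mismatch."""
    return (len(content), hashlib.sha256(content).hexdigest())

class Monitor:
    """Track page snapshots and report changes between fetches."""
    def __init__(self):
        self.seen = {}

    def changed(self, url: str, content: bytes) -> bool:
        """Return True when the content differs from the last snapshot."""
        sig = digest(content)
        if self.seen.get(url) == sig:
            return False
        self.seen[url] = sig
        return True

m = Monitor()
print(m.changed("http://example.org", b"<html>v1</html>"))  # first sight: True
print(m.changed("http://example.org", b"<html>v1</html>"))  # unchanged: False
print(m.changed("http://example.org", b"<html>v2</html>"))  # changed: True
```

Grouping monitoring requests that share a digest, as the paper does with sentinels, avoids recomputing the same comparison for every subscriber.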

  16. The Platino project: methodology of a multicenter prevalence survey of chronic obstructive pulmonary disease in major Latin American cities

    Directory of Open Access Journals (Sweden)

    Perez-Padilla Rogelio

    2004-06-01

    Full Text Available Abstract Background The prevalence of Chronic Obstructive Pulmonary Disease (COPD) in many developed countries appears to be increasing. There is some evidence from Latin America that COPD is a growing cause of death, but information on prevalence is scant. It is possible that, due to the high frequency of smoking in these countries, this disease may represent a major public health problem that has not yet been recognized as such. The PLATINO study is aimed at measuring COPD prevalence in major cities in Latin America. Methods/Design A multi-country survey is being carried out in major cities in Latin America. In each metropolitan area, a population-based sample of approximately 1,000 individuals aged 40 years or older is being interviewed using standardized questionnaires. Eligible subjects are submitted to pre- and post-bronchodilator spirometry, and classified according to several criteria for COPD. Anthropometric examinations are also performed. Several risk factors are being studied, including smoking, socioeconomic factors, exposure to domestic biomass pollution, occupational exposure to dust and hospital admissions due to respiratory conditions during childhood. Whether or not subjects affected by COPD are aware of their disease, and if so how it is being managed by health services, is also being investigated, as are the consequences of this condition on quality of life and work performance. Results At the present time, the study has been completed in São Paulo, Mexico City and Montevideo; Chile started the study in March 2004 and will be followed by Venezuela; two other metropolitan areas could still join the PLATINO project. Similar sampling procedures, with stratification for socio-economic status, are being used in all sites. Strict coordination, training and standardization procedures have been used to ensure comparability of results across sites. Overall 92% of the pre-bronchodilator spirometry tests fulfilled ATS criteria of

  17. The Importance of Prior Probabilities for Entry Page Search

    NARCIS (Netherlands)

    Kraaij, W.; Westerveld, T.H.W.; Hiemstra, D.

    2002-01-01

    An important class of searches on the world-wide-web has the goal to find an entry page (homepage) of an organisation. Entry page search is quite different from Ad Hoc search. Indeed a plain Ad Hoc system performs disappointingly. We explored three non-content features of web pages: page length, num

  18. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  19. SURVEY

    DEFF Research Database (Denmark)

    Surveys are a widespread method used in, among other fields, social science, the humanities, psychology and health research. Outside the research world, too, many organizations, such as consultancy firms, public institutions and the marketing departments of private companies, work ... with surveys. This book covers every phase of survey work and gives a practical introduction to: • designing the study and selecting samples, • formulating questionnaires and collecting and coding data, • methods for analysing the results

  20. An End-to-End DNA Taxonomy Methodology for Benthic Biodiversity Survey in the Clarion-Clipperton Zone, Central Pacific Abyss

    Directory of Open Access Journals (Sweden)

    Adrian G. Glover

    2015-12-01

    Full Text Available Recent years have seen increased survey and sampling expeditions to the Clarion-Clipperton Zone (CCZ), central Pacific Ocean abyss, driven by commercial interests from contractors in the potential extraction of polymetallic nodules in the region. Part of the International Seabed Authority (ISA) regulatory requirements are that these contractors undertake environmental research expeditions to their CCZ exploration claims following guidelines approved by the ISA Legal and Technical Commission (ISA, 2010). Section 9(e) of these guidelines instructs contractors to “…collect data on the sea floor communities specifically relating to megafauna, macrofauna, meiofauna, microfauna, nodule fauna and demersal scavengers”. There are a number of methodological challenges to this, including the water depth (4000–5000 m), extremely warm surface waters (~28 °C) compared to bottom water (~1.5 °C), and great distances to ports, requiring a large and long seagoing expedition with only a limited number of scientists. Both scientists and regulators have recently realized that a major gap in our knowledge of the region is the fundamental taxonomy of the animals that live there; this is essential to inform our knowledge of the biogeography, natural history and ultimately our stewardship of the region. Recognising this, the ISA is currently sponsoring a series of taxonomic workshops on the CCZ fauna, and to assist in this process we present here a series of methodological pipelines for DNA taxonomy (incorporating both molecular and morphological data) of the macrofauna and megafauna from the CCZ benthic habitat in the recent ABYSSLINE cruise program to the UK-1 exploration claim. A major problem on recent CCZ cruises has been the collection of high-quality samples suitable for both morphology and DNA taxonomy, coupled with a workflow that ensures these data are made available. The DNA sequencing techniques themselves are relatively standard, once good samples have been

  1. Implementing database system for LHCb publications page

    CERN Document Server

    Abdullayev, Fakhriddin

    2017-01-01

    The LHCb is one of the main detectors at the Large Hadron Collider, where physicists and scientists work together on high-precision measurements of matter-antimatter asymmetries and searches for rare and forbidden decays, with the aim of discovering new and unexpected forces. The work consists not only of analyzing data collected from experiments but also of publishing the results of those analyses. The LHCb publications are gathered on the LHCb publications page to maximize their availability both to LHCb members and to the high energy community. In this project a new database system was implemented for the LHCb publications page. This will help improve access to research papers for scientists and provide better integration with the current CERN library website and others.

  2. Personal home pages as an information resource

    Directory of Open Access Journals (Sweden)

    Shant Narsesian

    2004-12-01

    Full Text Available Nowadays, for many people, the World Wide Web (WWW) is the first place to go to look something up, to find that bit of information. However, even though people have their favourite sites and their favourite search engines, they often seem to miss that bit of information. This could very well be because it is hiding on a small, unpopular, enthusiast's Personal Home Page. The author believes that there is more information on the Web than that which one will find on the major, "commercial-style" sites. Hence, this paper looks at the possibility of using Personal Home Pages (PHP) as an information resource, not only for the academic, but for the web-surfing world in general.

  3. Developing a web page: bringing clinics online.

    Science.gov (United States)

    Peterson, Ronnie; Berns, Susan

    2004-01-01

    Introducing clinical staff education, along with new policies and procedures, to over 50 different clinical sites can be a challenge. As any staff educator will confess, getting people to attend an educational inservice session can be difficult. Clinical staff request training, but no one has time to attend training sessions. Putting the training along with the policies and other information into "neat" concise packages via the computer and over the company's intranet was the way to go. However, how do you bring the clinics online when some of the clinical staff may still be reluctant to turn on their computers for anything other than to gather laboratory results? Developing an easy, fun, and accessible Web page was the answer. This article outlines the development of the first training Web page at the University of Wisconsin Medical Foundation, Madison, WI.

  4. Perspectives on the consecutive pages problem

    Science.gov (United States)

    Srinivasan, V. K.

    2011-04-01

    This article presents different approaches to a problem, dubbed by the author as 'the consecutive pages problem'. The aim of this teaching-oriented article is to promote the teaching of abstract concepts in mathematics, by selecting a challenging amusement problem and then presenting various solutions in such a way that it can engage the attention of a fourth-grade student, a high school senior student, an average college student and scholars.

  5. A Mobile Agent-based Web Page Constructing Framework MiPage

    Science.gov (United States)

    Fukuta, Naoki; Ozono, Tadachika; Shintani, Toramatsu

    In this paper, we present a programming framework, `MiPage', for realizing intelligent WWW applications based on mobile agent technology. In the framework, an agent is programmed using the hypertext markup language and a logic programming language. To realize the framework, we designed a new logic programming environment, `MiLog', and an agent program compiler, `MiPage Compiler'. The framework enables us to enhance both the richness of the services and the manageability of the application.

  6. Coming to Life: A Review of Movie Comics: Page to Screen/Screen to Page

    Directory of Open Access Journals (Sweden)

    Nicolas Labarre

    2017-03-01

    Full Text Available This book review provides an overview of 'Movie Comics: Page to Screen/Screen to Page' by Blair Davis (Rutgers University Press, 2017), a book which examines the reciprocal adaptations of film into comics and comics into films from 1930 to 1960. This review argues that 'Movie Comics' provides a useful and finely-textured cultural history of that phenomenon, which helps contextualize scholarly studies of contemporary adaptations and transmedia constructions.

  7. Improving Web Page Readability by Plain Language

    CERN Document Server

    Hussain, Walayat; Ali, Arif

    2011-01-01

    In today's world, the first choice for anybody who wants to access information is the web, because it is the only source providing easy and instant access to information. However, web readers face many hurdles, including the load of web pages, text size, finding related information, and spelling and grammar. Understanding web pages written in English creates great problems for non-native readers who have only a basic knowledge of English. In this paper, we propose a plain language for a local language (Urdu) using English alphabets for web pages in Pakistan. For this purpose we developed two websites, one with normal English fonts and the other in a local-language text scheme using English alphabets. We also conducted a questionnaire with 40 different users with different levels of English language fluency in Pakistan to gain evidence of the practicality of our approach. The result shows that the proposed plain language text scheme using English alphabets improved the reading com...

  8. Improving Web Page Readability by Plain Language

    Directory of Open Access Journals (Sweden)

    Walayat Hussain

    2011-05-01

    Full Text Available In today's world, the first choice for anybody who wants to access information is the web, because it is the only source providing easy and instant access to information. However, web readers face many hurdles, including the load of web pages, text size, finding related information, and spelling and grammar. Understanding web pages written in English creates great problems for non-native readers who have only a basic knowledge of English. In this paper, we propose a plain language for a local language (Urdu) using English alphabets for web pages in Pakistan. For this purpose we developed two websites, one with normal English fonts and the other in a local-language text scheme using English alphabets. We also conducted a questionnaire with 40 different users with different levels of English language fluency in Pakistan to gain evidence of the practicality of our approach. The result shows that the proposed plain language text scheme using English alphabets improved the reading comprehension for non-native English speakers in Pakistan.

  9. New PageRank Optimization Algorithm

    Institute of Scientific and Technical Information of China (English)

    蒋永辉; 吴洪丽

    2012-01-01

    Search engines repeatedly return currently popular pages at the top of search results; popular pages tend to become even more popular, while unpopular pages are ignored by the average user. To escape this problem, an improved ranking function and an effective Web user model are employed, and a New PageRank Optimization (NPRO) algorithm is proposed. Experimental data show that the proposed algorithm attains unbiased Web ranking.

  10. Insights into Facebook Pages: an early adolescent health research study page targeted at parents.

    Science.gov (United States)

    Amon, Krestina L; Paxton, Karen; Klineberg, Emily; Riley, Lisa; Hawke, Catherine; Steinbeck, Katharine

    2016-02-01

    Facebook has been used in health research, but there is a lack of literature regarding how Facebook may be used to recruit younger adolescents. A Facebook Page was created for an adolescent cohort study on the effects of puberty hormones on well-being and behaviour in early adolescence. Used as a communication tool with existing participants, it also aimed to alert potential participants to the study. The purpose of this paper is to provide a detailed description of the development of the study Facebook Page and present the fan response to the types of posts made on the Page using the Facebook-generated Insights data. Two types of posts were made on the study Facebook Page. The first type was study-related update posts and events. The second was relevant adolescent and family research and current news posts. Observations on the use of and response to the Page were made over 1 year across three phases (phase 1, very low Facebook use; phase 2, high Facebook use; phase 3, low Facebook use). Most Page fans were female (88.6%), with the largest group of fans aged between 35 and 44 years. Study-related update posts with photographs were the most popular. This paper provides a model on which other researchers could base Facebook communication and potential recruitment in the absence of established guidelines.

  11. Effect of eprosartan-based antihypertensive therapy on coronary heart disease risk assessed by Framingham methodology in Canadian patients with diabetes: results of the POWER survey

    Directory of Open Access Journals (Sweden)

    Petrella RJ

    2015-03-01

    Full Text Available Robert J Petrella, Dawn P Gill, Jean-Pascal Berrou, on behalf of the POWER survey Study Group (Departments of Family Medicine, Medicine (Cardiology) and Kinesiology, University of Western Ontario, London, ON, Canada; Aging, Rehabilitation and Geriatric Care Research Centre, Lawson Health Research Institute, London, ON, Canada; Department of Family Medicine and School of Health Studies, University of Western Ontario, London, ON, Canada; Abbott Products Operations AG, Allschwil, Switzerland). Objective: As part of the Physicians' Observational Work on Patient Education According to their Vascular Risk (POWER) survey, we used Framingham methodology to examine the effect of an eprosartan-based regimen on total coronary heart disease (CHD) risk in diabetic patients recruited in Canada. Methods: Patients with new or uncontrolled hypertension (sitting systolic blood pressure [SBP] >140 mmHg with diastolic blood pressure <110 mmHg) were identified at 335 Canadian primary care practices. Initial treatment consisted of eprosartan 600 mg/day, which was later supplemented with other antihypertensives as required. Outcomes included change in SBP at 6 months (primary objective) and absolute change in the Framingham 10-year CHD risk score (secondary objective). Results: We identified an intention-to-treat diabetes population of 195 patients. Most diabetic patients were prescribed two or more antihypertensive drugs throughout the survey. Mean reductions in SBP and diastolic blood pressure were 20.8±14.8 mmHg and 9.5±10.7 mmHg, respectively. The overall absolute mean 10-year CHD risk, calculated using Framingham formulae, declined by 2.9±3.5 points (n=49). Average baseline risk was higher in men than women (14.8±8.6 versus 5.6±1.8 points); men also had a larger average risk reduction (4.2±4.3 versus 1.5±1.3 points). The extent of absolute risk reduction also increased with increasing age (trend not statistically significant). Conclusion: Eprosartan

  12. Methodology and early findings of the fifth survey of childhood and adolescence surveillance and prevention of adult noncommunicable disease: The caspian-v study

    Directory of Open Access Journals (Sweden)

    Mohammad Esmaeil Motlagh

    2017-01-01

    Full Text Available Background: This paper presents the methodology and early findings of the fifth survey of a school-based surveillance program in Iran. Methods: This nationwide study was conducted in 2015 as the fifth survey of a surveillance program entitled "Childhood and Adolescence Surveillance and PreventIon of Adult Non-communicable disease" (CASPIAN-V) study. The protocol was mainly based on the World Health Organization-Global School student Health Survey. We studied 14400 students, aged 7-18 years, and their parents living in 30 provinces in Iran. Fasting blood was obtained from a sub-sample of 4200 randomly selected students. Results: The participation rates for the whole study and for blood sampling were 99% and 91.5%, respectively. The mean (SD) age of participants was 12.3 (3.2) years, consisting of 49.4% girls and 71.4% urban residents. Overall, 16.1% were underweight (17.4% of boys and 14.8% of girls), and 20.8% had excess weight, consisting of 9.4% (8.7% of boys and 10.2% of girls) of overweight and 11.4% (12.5% of boys and 10.3% of girls) of obesity. Abdominal obesity was documented in 21.1% of students (21.6% of boys and 20.5% of girls). Low HDL-C was the most prevalent abnormality of the lipid profile (29.5%), followed by high serum triglycerides (27.7%). Of the students, 59.9% consumed whole wheat bread, and 57% reported that they never or rarely added salt at the table. The reported daily consumption of fresh fruits, vegetables, and milk was about 60%, 32% and 40%, respectively. 13.7% of participants had at least 30 min of daily leisure-time physical activity. Conclusions: The current findings provide an overview of the current health status and lifestyle habits of children and adolescents. This surveillance program would help planning preventive programs at individual and community levels.

  13. Methodology and Early Findings of the Fifth Survey of Childhood and Adolescence Surveillance and Prevention of Adult Noncommunicable Disease: The CASPIAN-V Study

    Science.gov (United States)

    Motlagh, Mohammad Esmaeil; Ziaodini, Hasan; Qorbani, Mostafa; Taheri, Majzoubeh; Aminaei, Tahereh; Goodarzi, Azam; Ataie-Jafari, Asal; Rezaei, Fatemeh; Ahadi, Zeinab; Shafiee, Gita; Shahsavari, Ali; Heshmat, Ramin; Kelishadi, Roya

    2017-01-01

    Background: This paper presents the methodology and early findings of the fifth survey of a school-based surveillance program in Iran. Methods: This nationwide study was conducted in 2015 as the fifth survey of a surveillance program entitled “Childhood and Adolescence Surveillance and PreventIon of Adult Non- communicable disease” (CASPIAN-V) study. The protocol was mainly based on the World Health Organization-Global School student Health Survey. We studied 14400 students, aged 7-18 years, and their parents living in 30 provinces in Iran. Fasting blood was obtained from a sub-sample of 4200 randomly selected students. Results: The participation rate for the whole study and for blood sampling were 99% and 91.5%, respectively. The mean (SD) age of participants was 12.3 (3.2) years, consisting of 49.4% girls and 71.4% urban residents. Overall, 16.1% were underweight (17.4% of boys and 14.8% of girls), and 20.8% had excess weight consisting of 9.4% (8.7% of boys and 10.2% of girls) of overweight and 11.4% (12.5% of boys and 10.3% of girls) of obesity. Abdominal obesity was documented in 21.1% of students (21.6% of boys and 20.5% of girls). Low HDL-C was the most prevalent abnormality of the lipid profile (29.5%) followed by high serum triglycerides (27.7%). Of students, 59.9% consumed whole wheat bread; and 57% reported that they never or rarely added salt to table. The reported daily consumption of fresh fruits, vegetables, and milk was about 60%, 32% and 40%, respectively. 13.7% of participants had at least 30-min daily leisure-time physical activity. Conclusions: The current findings provide an overview of the current health status and lifestyle habits of children and adolescents. This surveillance program would help planning preventive programs at individual and community levels. PMID:28217266

  14. Survey of Vulnerability Methodological Needs

    Science.gov (United States)

    1991-11-01


  15. Model for Predicting End User Web Page Response Time

    CERN Document Server

    Nagarajan, Sathya Narayanan

    2012-01-01

    Perceived responsiveness of a web page is one of the most important and least understood metrics of web page design, and is critical for attracting and maintaining a large audience. Web pages can be designed to meet performance SLAs early in the product lifecycle if there is a way to predict the apparent responsiveness of a particular page layout. Response time of a web page is largely influenced by page layout and various network characteristics. Since the network characteristics vary widely from country to country, accurately modeling and predicting the perceived responsiveness of a web page from the end user's perspective has traditionally proven very difficult. We propose a model for predicting end user web page response time based on web page, network, browser download and browser rendering characteristics. We start by understanding the key parameters that affect perceived response time. We then model each of these parameters individually using experimental tests and statistical techniques. Finally, we d...

  16. Weighted Page Content Rank for Ordering Web Search Result

    Directory of Open Access Journals (Sweden)

    POOJA SHARMA,

    2010-12-01

    Full Text Available With the explosive growth of information sources available on the World Wide Web, it has become increasingly necessary for users to utilize automated tools in order to find, extract, filter and evaluate the desired information and resources. Web structure mining and content mining play an effective role in this approach. There are two ranking algorithms, PageRank and Weighted PageRank. PageRank is a commonly used algorithm in Web structure mining. Weighted PageRank also takes the importance of the in-links and out-links of the pages into account, but the rank score is not distributed equally among all links; i.e., an unequal distribution is performed. In this paper we propose a new algorithm, Weighted Page Content Rank (WPCR), based on web content mining and structure mining, which determines the relevancy of pages to a given query better than the existing PageRank and Weighted PageRank algorithms.
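The unequal rank distribution the abstract describes can be sketched with one common weighted-PageRank formulation, in which each out-link's share of rank is scaled by the in-degree and out-degree of its target page. The toy graph, damping factor, and iteration count below are assumptions for illustration, not taken from the paper:

```python
# Hedged sketch of a Weighted PageRank iteration: rank flowing from page v
# to page u is scaled by u's popularity (in/out degree) relative to the
# other pages v links to, instead of being split evenly.

def weighted_pagerank(links, d=0.85, iters=50):
    """links: dict mapping each page to the list of pages it points to."""
    pages = set(links) | {q for tgts in links.values() for q in tgts}
    in_deg = {p: 0 for p in pages}
    out_deg = {p: len(links.get(p, [])) for p in pages}
    for tgts in links.values():
        for q in tgts:
            in_deg[q] += 1
    pr = {p: 1.0 for p in pages}
    for _ in range(iters):
        new = {}
        for u in pages:
            s = 0.0
            for v in pages:
                if u in links.get(v, []):
                    tgts = links[v]
                    # weight of the v->u link by u's share of in/out degree
                    w_in = in_deg[u] / max(1, sum(in_deg[q] for q in tgts))
                    w_out = out_deg[u] / max(1, sum(out_deg[q] for q in tgts))
                    s += pr[v] * w_in * w_out
            new[u] = (1 - d) + d * s
        pr = new
    return pr

ranks = weighted_pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

Here page C, with two in-links, accumulates a larger weighted share than B, which has one.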

  17. Exploiting link structure for web page genre identification

    KAUST Repository

    Zhu, Jia

    2015-07-07

    As the World Wide Web develops at an unprecedented pace, identifying web page genre has recently attracted increasing attention because of its importance in web search. A common approach for identifying genre is to use textual features that can be extracted directly from a web page, that is, On-Page features. The extracted features are subsequently inputted into a machine learning algorithm that will perform classification. However, these approaches may be ineffective when the web page contains limited textual information (e.g., the page is full of images). In this study, we address genre identification of web pages under the aforementioned situation. We propose a framework that uses On-Page features while simultaneously considering information in neighboring pages, that is, the pages that are connected to the original page by backward and forward links. We first introduce a graph-based model called GenreSim, which selects an appropriate set of neighboring pages. We then construct a multiple classifier combination module that utilizes information from the selected neighboring pages and On-Page features to improve performance in genre identification. Experiments are conducted on well-known corpora, and favorable results indicate that our proposed framework is effective, particularly in identifying web pages with limited textual information. © 2015 The Author(s)

  18. OnlineMin: A Fast Strongly Competitive Randomized Paging Algorithm

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Moruz, Gabriel; Negoescu, Andrei

    2012-01-01

    In the field of online algorithms, paging is one of the most studied problems. For randomized paging algorithms a tight bound of H k on the competitive ratio has been known for decades, yet existing algorithms matching this bound have high running times. We present the first randomized paging...... approach that both has optimal competitiveness and selects victim pages in subquadratic time. In fact, if k pages fit in internal memory, the best previous solution required O(k²) time per request and O(k) space, whereas our approach also takes O(k) space, but only O(log k) time in the worst case per page...
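For context on the cost model behind these bounds, a minimal deterministic sketch of the paging problem: a cache holds at most k pages, and the algorithm pays one unit per fault. LRU stands in here purely to illustrate fault counting; it is not OnlineMin, whose victim selection is randomized:

```python
# Illustrative paging simulator: count faults of LRU on a request sequence.
# The request sequence and cache size are invented for the example.
from collections import OrderedDict

def lru_faults(requests, k):
    cache = OrderedDict()   # insertion order tracks recency of use
    faults = 0
    for p in requests:
        if p in cache:
            cache.move_to_end(p)        # hit: mark as most recently used
            continue
        faults += 1                     # miss: page fault
        if len(cache) == k:
            cache.popitem(last=False)   # evict the least recently used page
        cache[p] = True
    return faults

n = lru_faults([1, 2, 3, 1, 4, 1, 2], k=3)
```

Competitive analysis compares such an online fault count against the offline optimum on the same sequence; OnlineMin achieves the optimal H_k ratio while spending only O(log k) time choosing each victim.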

  19. THE NEW PURCHASING SERVICE PAGE NOW ON THE WEB!

    CERN Multimedia

    SPL Division

    2000-01-01

    Users of CERN's Purchasing Service are encouraged to visit the new Purchasing Service web page, accessible from the CERN homepage or directly at: http://spl-purchasing.web.cern.ch/spl-purchasing/ There, you will find answers to questions such as: Who are the buyers? What do I need to know before creating a DAI? How many offers do I need? Where shall I send the offer I received? I know the amount of my future requirement, how do I proceed? How are contracts adjudicated at CERN? Which exhibitions and visits of Member State companies are foreseen in the future? A company I know is interested in making a presentation at CERN, who should they contact? Additionally, you will find information concerning: The Purchasing procedures Market Surveys and Invitations to Tender The Industrial Liaison Officers appointed in each Member State The Purchasing Broker at CERN

  20. Review Pages: Cities, Energy and Climate Change

    Directory of Open Access Journals (Sweden)

    Gennaro Angiello

    2015-04-01

    Full Text Available Starting from the relationship between urban planning and mobility management, TeMA has gradually expanded the view of the covered topics, always remaining in the groove of rigorous scientific in-depth analysis. During the last two years particular attention has been paid to the Smart Cities theme and to the different meanings that come with it. The last section of the journal is formed by the Review Pages. They have different aims: to inform on the problems, trends and evolutionary processes; to investigate the paths by highlighting the advanced relationships among apparently distant disciplinary fields; to explore the interaction areas, experiences and potential applications; to underline interactions and disciplinary developments but also, if present, defeats and setbacks. Inside the journal the Review Pages have the task of stimulating as much as possible the circulation of ideas and the discovery of new points of view. For this reason the section is founded on a series of basic references, required for the identification of new and more advanced interactions. These references are the research, the planning acts, the actions and the applications, analysed and investigated both for their ability to give a systematic response to questions concerning urban and territorial planning, and for their attention to aspects such as environmental sustainability and innovation in practices. For this purpose the Review Pages are formed by five sections (Web Resources; Books; Laws; Urban Practices; News and Events), each of which examines a specific aspect of the broader information storage of interest for TeMA.

  1. Review Pages: Cities, Energy and Built Environment

    Directory of Open Access Journals (Sweden)

    Gennaro Angiello

    2015-07-01

    Full Text Available Starting from the relationship between urban planning and mobility management, TeMA has gradually expanded the view of the covered topics, always remaining in the groove of rigorous scientific in-depth analysis. During the last two years particular attention has been paid to the Smart Cities theme and to the different meanings that come with it. The last section of the journal is formed by the Review Pages. They have different aims: to inform on the problems, trends and evolutionary processes; to investigate the paths by highlighting the advanced relationships among apparently distant disciplinary fields; to explore the interaction areas, experiences and potential applications; to underline interactions and disciplinary developments but also, if present, defeats and setbacks. Inside the journal the Review Pages have the task of stimulating as much as possible the circulation of ideas and the discovery of new points of view. For this reason the section is founded on a series of basic references, required for the identification of new and more advanced interactions. These references are the research, the planning acts, the actions and the applications, analysed and investigated both for their ability to give a systematic response to questions concerning urban and territorial planning, and for their attention to aspects such as environmental sustainability and innovation in practices. For this purpose the Review Pages are formed by five sections (Web Resources; Books; Laws; Urban Practices; News and Events), each of which examines a specific aspect of the broader information storage of interest for TeMA.

  2. Review Pages: Cities, Energy and Mobility

    Directory of Open Access Journals (Sweden)

    Gennaro Angiello

    2015-12-01

    Full Text Available Starting from the relationship between urban planning and mobility management, TeMA has gradually expanded the view of the covered topics, always remaining in the groove of rigorous scientific in-depth analysis. During the last two years particular attention has been paid to the Smart Cities theme and to the different meanings that come with it. The last section of the journal is formed by the Review Pages. They have different aims: to inform on the problems, trends and evolutionary processes; to investigate the paths by highlighting the advanced relationships among apparently distant disciplinary fields; to explore the interaction areas, experiences and potential applications; to underline interactions and disciplinary developments but also, if present, defeats and setbacks. Inside the journal the Review Pages have the task of stimulating as much as possible the circulation of ideas and the discovery of new points of view. For this reason the section is founded on a series of basic references, required for the identification of new and more advanced interactions. These references are the research, the planning acts, the actions and the applications, analysed and investigated both for their ability to give a systematic response to questions concerning urban and territorial planning, and for their attention to aspects such as environmental sustainability and innovation in practices. For this purpose the Review Pages are formed by five sections (Web Resources; Books; Laws; Urban Practices; News and Events), each of which examines a specific aspect of the broader information storage of interest for TeMA.

  3. Machine Learning Algorithms in Web Page Classification

    Directory of Open Access Journals (Sweden)

    W.A.AWAD

    2012-11-01

    Full Text Available In this paper we use machine learning algorithms such as SVM, KNN and GIS to perform a behavior comparison on the web page classification problem. From the experiments we see that SVM with a small number of negative documents to build the centroids has the smallest storage requirement and the least online test computation cost. But almost all GIS variants with different numbers of nearest neighbors have an even higher storage requirement and online test computation cost than KNN. This suggests that some future work should be done to try to reduce the storage requirement and online test cost of GIS.

  4. Page 1 NIGERIAN JOURNAL OF OPHTHALMIOLOGY LEPROSY ...

    African Journals Online (AJOL)

    LEPROSY AND THE EYE – A REVIEW OF. EPIDEMIOLOGY ... with leprosy. Methodology: Current literature on various aspects of .... involvement of the facial and trigeminal nerves leads to .... loss of facial expression, which is usually bilateral.

  5. Effect of eprosartan-based antihypertensive therapy on coronary heart disease risk assessed by Framingham methodology in Canadian patients: results of the POWER survey

    Directory of Open Access Journals (Sweden)

    Petrella RJ

    2014-01-01

    Full Text Available Robert J Petrella,1 Guy Tremblay,2 Guy De Backer,3 Dawn P Gill,4,5,6 On behalf of the POWER survey Study Group 1Department of Family Medicine and Cardiology, Lawson Health Research Institute, University of Western Ontario, ON, Canada; 2Centre hospitalier universitaire de Québec, Hôpital du Saint-Sacrement, Sainte-Foy, Québec, QC, Canada; 3Department of Public Health, Ghent University, Ghent, Belgium; 4Aging, Rehabilitation and Geriatric Care Research Centre, Lawson Health Research Institute, London, ON, Canada; 5School of Health Studies, Western University, London, ON, Canada; 6Department of Epidemiology, University of Washington, Seattle, WA, USA Purpose/introduction: The Canadian Hypertension Education Program (CHEP) has identified blood pressure (BP) control as a key target for an overall reduction in cardiovascular disease risk. The POWER survey (Physicians’ Observational Work on Patient Education According to their Vascular Risk) used Framingham methodology to investigate the impact of an angiotensin-receptor-blocker-based regimen on arterial BP and total coronary heart disease (CHD) risk in a subset of patients recruited in Canada. Methods: 309 Canadian practices screened for patients with either newly diagnosed or uncontrolled mild/moderate hypertension (sitting systolic blood pressure [SBP] >140 mmHg with diastolic blood pressure [DBP] <110 mmHg). Treatment comprised eprosartan 600 mg/day with add-on antihypertensive therapy after 1 month if required. The primary efficacy variable was change in SBP at 6 months; the secondary variable was the absolute change in the Framingham 10-year CHD risk score. Results: 1,385 patients were identified, of whom 1,114 were included in the intention-to-treat (ITT) cohort. Thirty-eight point four percent of ITT patients were managed with monotherapy at 6 months, versus 35.2% and 13.7% with two-drug or multiple-drug therapy, respectively. SBP in the ITT cohort declined 22.4 (standard deviation [SD] 14.8 mm

  6. Estimation and Short-Term Prediction of the Course of the HIV Epidemic Using Demographic and Health Survey Methodology-Like Data.

    Directory of Open Access Journals (Sweden)

    Stéphanie Blaizot

    Full Text Available Mathematical models have played important roles in the understanding of epidemics and in the study of the impacts of various behavioral or medical measures. However, accurately modeling the future spread of an epidemic requires context-specific parameters that are difficult to estimate because of lack of data. Our objective is to propose a methodology to estimate context-specific parameters using Demographic and Health Survey (DHS)-like data that can be used in mathematical modeling of short-term HIV spreading. The model splits the population according to sex, age, HIV status, and antiretroviral treatment status. To estimate context-specific parameters, we used individuals' histories included in DHS-like data and a statistical analysis based on a decomposition of the Poisson likelihood. To predict the course of the HIV epidemic, sex- and age-specific differential equations were used. This approach was applied to recent data from Kenya. The approach allowed the estimation of several key epidemiological parameters. Women had a higher infection rate than men and the highest infection rates in the youngest age groups (15-24 and 25-34 years), whereas men had the highest infection rate in the 25-34 year age group. The immunosuppression rates were similar between age groups. The treatment rate was the highest in the 35-59 year age group in both sexes. The results showed that, within the 15-24 year age group, increasing male circumcision coverage and antiretroviral therapy coverage at CD4 ≤ 350/mm3 over the current 70% could have short-term impacts. The study succeeded in estimating the model parameters using DHS-like data rather than literature data. The analysis provides a framework for using the same data for estimation and prediction, which can improve the validity of context-specific predictions and help design HIV prevention campaigns.
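The short-term projection step can be illustrated with a deliberately simplified compartmental sketch: once context-specific rates have been estimated from survey data, the susceptible/infected compartments are integrated forward in time. The single-compartment model, Euler integration, and all rate values below are assumptions for illustration; the study's actual model is sex- and age-structured with treatment status:

```python
# Hedged sketch: forward Euler integration of a minimal SI model, run
# separately per group with group-specific infection rates (assumed values).

def project(s0, i0, rate, years, dt=0.01):
    """Integrate dS/dt = -r*S*I/N, dI/dt = +r*S*I/N with Euler steps."""
    s, i = s0, i0
    for _ in range(int(years / dt)):
        new_inf = rate * s * i / (s + i) * dt
        s -= new_inf
        i += new_inf
    return s, i

# Hypothetical groups: women start with a higher per-year infection rate,
# mirroring the direction of the study's finding (numbers are invented).
s_w, i_w = project(s0=1000.0, i0=50.0, rate=0.08, years=5)
s_m, i_m = project(s0=1000.0, i0=50.0, rate=0.05, years=5)
```

The group with the higher estimated rate accumulates more infections over the projection horizon, which is the kind of short-term comparison the estimated parameters enable.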

  7. A Syntactic Classification based Web Page Ranking Algorithm

    CERN Document Server

    Mukhopadhyay, Debajyoti; Kim, Young-Chon

    2011-01-01

    The existing search engines sometimes give unsatisfactory search results for lack of any categorization of the results. If there were some means to know the user's preference about the search results and rank pages according to that preference, the results would be more useful and accurate to the user. In the present paper a web page ranking algorithm is proposed based on syntactic classification of web pages. Syntactic classification does not bother about the meaning of the content of a web page. The proposed approach mainly consists of three steps: select some properties of web pages based on the user's demand, measure them, and give different weightage to each property during ranking for different types of pages. The existence of syntactic classification is supported by running the fuzzy c-means algorithm and neural network classification on a set of web pages. The change in ranking for different types of pages but the same query string is also demonstrated.
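The three steps above (select properties, measure them, weight them per page type) can be sketched as a weighted score. The property names, measured values, and per-type weights below are invented for illustration:

```python
# Hedged sketch: the same measured page properties produce different rank
# scores under different page-type weightings. All values are hypothetical.

def rank_score(props, weights):
    """Weighted sum of measured page properties."""
    return sum(weights.get(name, 0.0) * value for name, value in props.items())

page = {"image_ratio": 0.7, "text_length": 0.2, "link_density": 0.5}

# Hypothetical weightings for two syntactic page types:
weights_gallery = {"image_ratio": 0.6, "link_density": 0.3, "text_length": 0.1}
weights_article = {"text_length": 0.7, "image_ratio": 0.1, "link_density": 0.2}

score_gallery = rank_score(page, weights_gallery)
score_article = rank_score(page, weights_article)
```

An image-heavy page scores higher under the gallery weighting than under the article weighting, which is the per-type ranking change the paper demonstrates.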

  8. A cross-sectional survey of 5-year-old children with non-syndromic unilateral cleft lip and palate: the Cleft Care UK study. Part 1: background and methodology

    Science.gov (United States)

    Persson, M; Sandy, J R; Waylen, A; Wills, A K; Al-Ghatam, R; Ireland, A J; Hall, A J; Hollingworth, W; Jones, T; Peters, T J; Preston, R; Sell, D; Smallridge, J; Worthington, H; Ness, A R

    2015-01-01

    Structured Abstract Objectives We describe the methodology for a major study investigating the impact of reconfigured cleft care in the United Kingdom (UK) 15 years after an initial survey, detailed in the Clinical Standards Advisory Group (CSAG) report in 1998, had informed government recommendations on centralization. Setting and Sample Population This is a UK multicentre cross-sectional study of 5-year-olds born with non-syndromic unilateral cleft lip and palate. Children born between 1 April 2005 and 31 March 2007 were seen in cleft centre audit clinics. Materials and Methods Consent was obtained for the collection of routine clinical measures (speech recordings, hearing, photographs, models, oral health, psychosocial factors) and anthropometric measures (height, weight, head circumference). The methodology for each clinical measure followed those of the earlier survey as closely as possible. Results We identified 359 eligible children and recruited 268 (74.7%) to the study. Eleven separate records for each child were collected at the audit clinics. In total, 2666 (90.4%) were collected from a potential 2948 records. The response rates for the self-reported questionnaires, completed at home, were 52.6% for the Health and Lifestyle Questionnaire and 52.2% for the Satisfaction with Service Questionnaire. Conclusions Response rates and measures were similar to those achieved in the previous survey. There are practical, administrative and methodological challenges in repeating cross-sectional surveys 15 years apart and producing comparable data. PMID:26567851

  9. Digital Ethnography: Library Web Page Redesign among Digital Natives

    Science.gov (United States)

    Klare, Diane; Hobbs, Kendall

    2011-01-01

    Presented with an opportunity to improve Wesleyan University's dated library home page, a team of librarians employed ethnographic techniques to explore how its users interacted with Wesleyan's current library home page and web pages in general. Based on the data that emerged, a group of library staff and members of the campus' information…

  10. Required Discussion Web Pages in Psychology Courses and Student Outcomes

    Science.gov (United States)

    Pettijohn, Terry F., II; Pettijohn, Terry F.

    2007-01-01

    We conducted 2 studies that investigated student outcomes when using discussion Web pages in psychology classes. In Study 1, we assigned 213 students enrolled in Introduction to Psychology courses to either a mandatory or an optional Web page discussion condition. Students used the discussion Web page significantly more often and performed…

  11. Young Children's Interpretations of Page Breaks in Contemporary Picture Storybooks

    Science.gov (United States)

    Sipe, Lawrence R.; Brightman, Anne E.

    2009-01-01

    This article reports on a study of the responses of a second-grade class to the page breaks in contemporary picturebooks. In a picturebook, the text and accompanying illustrations are divided into a series of facing pages called openings, and the divisions between the openings are called page breaks or turns. Unlike a novel, in which the page…

  12. MedlinePlus Survey Results 2015

    Science.gov (United States)

    ... page: https://medlineplus.gov/survey/index.html MedlinePlus Survey Results 2015 To use the sharing features on ... government sites in the "Information/News" category. Other survey question responses: What best describes your role in ...

  13. Migrating Multi-page Web Applications to Single-page AJAX Interfaces

    NARCIS (Netherlands)

    Mesbah, A.; Van Deursen, A.

    2006-01-01

    Recently, a new web development technique for creating interactive web applications, dubbed AJAX, has emerged. In this new model, the single-page web interface is composed of individual components which can be updated/replaced independently. With the rise of AJAX web applications classical multi-pag

  14. An Efficient PageRank Approach for Urban Traffic Optimization

    Directory of Open Access Journals (Sweden)

    Florin Pop

    2012-01-01

    to determine optimal decisions for each traffic light, based on the solution given by Larry Page for page ranking in the Web environment (Page et al., 1999). Our approach is similar to the work presented by Sheng-Chung et al. (2009) and Yousef et al. (2010). We consider that the traffic lights are controlled by servers, and a score for each road is computed based on an efficient PageRank approach and is used in a cost function to determine optimal decisions. We demonstrate that the cumulative contribution of each car in the traffic respects the main constraint of the PageRank approach, preserving all the properties of the matrix considered in our model.
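The Page et al. (1999) iteration the abstract builds on can be sketched with roads as nodes and permitted turns as directed edges. The toy road graph, damping factor, and iteration count are illustrative assumptions:

```python
# Hedged sketch of the basic PageRank iteration applied to a toy road graph:
# each road's score is (1-d)/n plus the damped share of score flowing in
# from roads that feed into it.

def pagerank(adj, d=0.85, iters=100):
    n = len(adj)
    pr = {u: 1.0 / n for u in adj}
    for _ in range(iters):
        new = {u: (1 - d) / n for u in adj}
        for u, outs in adj.items():
            if not outs:
                continue  # dangling node: its mass is dropped in this sketch
            share = pr[u] / len(outs)
            for v in outs:
                new[v] += d * share
        pr = new
    return pr

roads = {"north": ["east", "west"], "east": ["west"],
         "west": ["north"], "south": ["north"]}
scores = pagerank(roads)
```

In such a scheme a road fed by many others accumulates a high score, which a controller could then feed into the cost function for its traffic light.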

  15. A Model for Web Page Usage Mining Based on Segmentation

    CERN Document Server

    Kuppusamy, K S

    2012-01-01

    The web page usage mining plays a vital role in enriching the page's content and structure based on the feedback received from users' interactions with the page. This paper proposes a model for micro-managing the tracking activities by fine-tuning the mining from the page level to the segment level. The proposed model enables the web-master to identify the segments which receive more focus from users compared with others. The segment-level analytics of user actions provides an important metric to analyse the factors which facilitate the increase in traffic for the page. The empirical validation of the model is performed through a prototype implementation.

  16. Facebook Pages and the Effects of Reputation Management

    Directory of Open Access Journals (Sweden)

    Nicoleta Ciacu

    2013-07-01

    Full Text Available This paper aims to identify the categories which own the largest number of Facebook pages as well as the Romanian Facebook pages which have the largest number of fans. Another aim of this study is to analyse the categories generating the largest number of fans on their official Facebook pages, the increase of the number of fans in the last month and the Facebook pages generating the biggest feedback among their fans. The data was collected by using Social Media analysis sites, such as www.socialbakers.com and www.facebrands.ro. The study sample comprises 20 Facebook pages with the largest number of fans.

  17. A New Page Ranking Algorithm Based On WPRVOL Algorithm

    Directory of Open Access Journals (Sweden)

    Roja Javadian Kootenae

    2013-03-01

    Full Text Available The amount of information on the web is always growing, thus powerful search tools are needed to search such a large collection. Search engines help users find their desired information among the massive volume of information in an easier way. But what is important in search engines, and causes a distinction between them, is the page ranking algorithm they use. In this paper a new page ranking algorithm based on the "Weighted Page Ranking based on Visits of Links (WPRVOL) Algorithm" for search engines is proposed, called WPR'VOL for short. The proposed algorithm considers the number of visits of first- and second-level in-links. The original WPRVOL algorithm takes into account the number of visits of the first-level in-links of the pages and distributes rank scores based on the popularity of the pages, whereas the proposed algorithm considers both the in-links of a page (first-level in-links) and the in-links of the pages that point to it (second-level in-links) in the calculation of the rank of the page, hence more related pages are displayed at the top of the search result list. In summary, the proposed algorithm assigns higher rank to pages that are themselves important and that are pointed to by important pages.

  18. PageRank算法的研究与改进

    Institute of Scientific and Technical Information of China (English)

    李青淋; 邵家玉

    2016-01-01

    Page ranking algorithms are at the core of search engines. Analysis of the traditional PageRank algorithm shows that, because it relies mainly on the link relations between pages, it suffers from deficiencies such as favoring old web pages and topic drift. In order to improve the accuracy of page ranking, an improved PageRank algorithm is proposed that combines a page-content relevance factor and a page update-time factor. Experimental results show that the improved PageRank algorithm increases the recall and precision of search and improves the quality of page ranking.

  19. PageRank optimization applied to spam detection

    CERN Document Server

    Fercoq, Olivier

    2012-01-01

    We give a new link spam detection and PageRank demotion algorithm called MaxRank. Like TrustRank and AntiTrustRank, it starts with a seed of hand-picked trusted and spam pages. We define the MaxRank of a page as the frequency of visit of this page by a random surfer minimizing an average cost per time unit. On a given page, the random surfer selects a set of hyperlinks and clicks with uniform probability on any of these hyperlinks. The cost function penalizes spam pages and hyperlink removals. The goal is to determine a hyperlink deletion policy that minimizes this score. The MaxRank is interpreted as a modified PageRank vector, used to sort web pages instead of the usual PageRank vector. The bias vector of this ergodic control problem, which is unique up to an additive constant, is a measure of the "spamicity" of each page, used to detect spam pages. We give a scalable algorithm for MaxRank computation that allowed us to perform experimental results on the WEBSPAM-UK2007 dataset. We show that our algorithm o...

  20. The Web Application Test Based on Page Coverage Criteria

    Institute of Scientific and Technical Information of China (English)

    CAI Li-zhi; TONG Wei-qin; YANG Gen-xing

    2008-01-01

    Software testing coverage criteria play an important role in the whole testing process. Current coverage criteria for web applications are based on the program or on URLs; they are not suitable for black-box testing or intuitive to use. This paper defines test criteria based on page coverage sequences navigated only by the web application, including Page_Single, Page_Post, Page_Pre, Page_Seq2, and Page_SeqK. Test criteria based on page coverage sequences arising from interactions between the web application and the browser are then considered. To avoid the ambiguity of natural language, these coverage criteria are specified in the Z formal language. The empirical results show that the criteria complement traditional coverage and fault-detection-capability criteria.
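
As a rough illustration of what a length-2 page-sequence criterion such as Page_Seq2 could measure (the data shapes here are assumptions, not the paper's Z definitions), one can compare the feasible page-to-page transitions of a navigation model against the transitions actually exercised by test runs:

```python
def page_seq2_coverage(nav_model, test_logs):
    """Assumed sketch of a Page_Seq2-style metric.
    nav_model: dict page -> set of pages reachable in one navigation step;
    test_logs: list of page sequences visited during test runs.
    Returns the fraction of feasible length-2 transitions exercised."""
    required = {(p, q) for p, succs in nav_model.items() for q in succs}
    covered = set()
    for log in test_logs:
        # every adjacent pair in a test run is a covered transition
        covered.update(zip(log, log[1:]))
    return len(required & covered) / len(required) if required else 1.0
```

A test suite reaches 100% Page_Seq2 coverage only when every feasible transition of the navigation model appears somewhere in a run.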

  1. Calibrating page sized Gafchromic EBT3 films

    Energy Technology Data Exchange (ETDEWEB)

    Crijns, W.; Maes, F.; Heide, U. A. van der; Van den Heuvel, F. [Department of Radiation Oncology, University Hospitals Leuven, Herestraat 49, 3000 Leuven (Belgium); Department ESAT/PSI-Medical Image Computing, Medical Imaging Research Center, KU Leuven, Herestraat 49, 3000 Leuven (Belgium); Department of Radiation Oncology, The Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital, Plesmanlaan 121, 1066 CX Amsterdam (Netherlands); Department of Radiation Oncology, University Hospitals Leuven, Herestraat 49, 3000 Leuven (Belgium)

    2013-01-15

    Purpose: The purpose is the development of a novel calibration method for dosimetry with Gafchromic EBT3 films. The method should be applicable for pretreatment verification of volumetric modulated arc, and intensity modulated radiotherapy. Because the exposed area on film can be large for such treatments, lateral scan errors must be taken into account. The correction for the lateral scan effect is obtained from the calibration data itself. Methods: In this work, the film measurements were modeled using their relative scan values (Transmittance, T). Inside the transmittance domain a linear combination and a parabolic lateral scan correction described the observed transmittance values. The linear combination model, combined a monomer transmittance state (T{sub 0}) and a polymer transmittance state (T{sub {infinity}}) of the film. The dose domain was associated with the observed effects in the transmittance domain through a rational calibration function. On the calibration film only simple static fields were applied and page sized films were used for calibration and measurements (treatment verification). Four different calibration setups were considered and compared with respect to dose estimation accuracy. The first (I) used a calibration table from 32 regions of interest (ROIs) spread on 4 calibration films, the second (II) used 16 ROIs spread on 2 calibration films, the third (III), and fourth (IV) used 8 ROIs spread on a single calibration film. The calibration tables of the setups I, II, and IV contained eight dose levels delivered to different positions on the films, while for setup III only four dose levels were applied. Validation was performed by irradiating film strips with known doses at two different time points over the course of a week. Accuracy of the dose response and the lateral effect correction was estimated using the dose difference and the root mean squared error (RMSE), respectively. Results: A calibration based on two films was the optimal

  2. Improvement of PageRank Algorithm for Search Engine

    Institute of Scientific and Technical Information of China (English)

    杨劲松; 凌培亮

    2009-01-01

    To address the information retrieval problems enterprises face when making rapid decisions, this paper proposes an improved PageRank algorithm. It takes the creation time of a web page into account and, based on a similarity analysis between anchor text and the web page's text, distributes different PageRank values to the forward links in proportion to their weights. The resulting PageRank values better fit the requirements of topic-specific search engines while keeping the algorithm simple. Experimental results show that the improved algorithm effectively reduces topic drift and appropriately raises the PageRank values of new web pages.

  3. Hierarchical Web Page Classification Based on a Topic Model and Neighboring Pages Integration

    OpenAIRE

    Sriurai, Wongkot; Meesad, Phayung; Haruechaiyasak, Choochart

    2010-01-01

    Most Web page classification models typically apply the bag of words (BOW) model to represent the feature space. The original BOW representation, however, is unable to recognize semantic relationships between terms. One possible solution is to apply the topic model approach based on the Latent Dirichlet Allocation algorithm to cluster the term features into a set of latent topics. Terms assigned into the same topic are semantically related. In this paper, we propose a novel hierarchical class...

  4. Block Models and Personalized PageRank

    CERN Document Server

    Kloumann, Isabel; Kleinberg, Jon

    2016-01-01

    Methods for ranking the importance of nodes in a network have a rich history in machine learning and across domains that analyze structured data. Recent work has evaluated these methods though the seed set expansion problem: given a subset $S$ of nodes from a community of interest in an underlying graph, can we reliably identify the rest of the community? We start from the observation that the most widely used techniques for this problem, personalized PageRank and heat kernel methods, operate in the space of landing probabilities of a random walk rooted at the seed set, ranking nodes according to weighted sums of landing probabilities of different length walks. Both schemes, however, lack an a priori relationship to the seed set objective. In this work we develop a principled framework for evaluating ranking methods by studying seed set expansion applied to the stochastic block model. We derive the optimal gradient for separating the landing probabilities of two classes in a stochastic block model, and find, ...
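
The "landing probabilities of a random walk rooted at the seed set" that the abstract refers to can be computed with a standard personalized-PageRank power iteration. This is a generic sketch (parameter names and the dense-matrix setup are my assumptions, not the paper's implementation):

```python
import numpy as np

def personalized_pagerank(adj, seeds, alpha=0.85, iters=200):
    """adj[i, j] = 1 for an edge i -> j (every node is assumed to have
    at least one out-edge); seeds: indices the walk restarts from.
    Returns the stationary landing probabilities of a walk that, with
    probability 1 - alpha, teleports back to the seed set."""
    adj = np.asarray(adj, dtype=float)
    n = adj.shape[0]
    deg = adj.sum(axis=1, keepdims=True)
    # row-stochastic transition matrix (zero rows stay zero)
    P = np.divide(adj, deg, out=np.zeros_like(adj), where=deg > 0)
    restart = np.zeros(n)
    restart[seeds] = 1.0 / len(seeds)
    r = restart.copy()
    for _ in range(iters):
        r = alpha * (r @ P) + (1 - alpha) * restart
    return r
```

Ranking all nodes by `r` and taking the top-k is the usual seed set expansion heuristic the paper evaluates.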

  5. Striations in PageRank-Ordered Matrices

    CERN Document Server

    Pennycuff, Corey

    2016-01-01

    Patterns often appear in a variety of large, real-world networks, and interesting physical phenomena are often explained by network topology as in the case of the bow-tie structure of the World Wide Web, or the small world phenomenon in social networks. The discovery and modelling of such regular patterns has a wide application from disease propagation to financial markets. In this work we describe a newly discovered regularly occurring striation pattern found in the PageRank ordering of adjacency matrices that encode real-world networks. We demonstrate that these striations are the result of well-known graph generation processes resulting in regularities that are manifest in the typical neighborhood distribution. The spectral view explored in this paper encodes a tremendous amount about the explicit and implicit topology of a given network, so we also discuss the interesting network properties, outliers and anomalies that a viewer can determine from a brief look at the re-ordered matrix.

  6. SDS -PAGE and Western Blotting Techniques.

    Science.gov (United States)

    Blancher, C; Jones, A

    2001-01-01

    The goal of Western blotting, or more correctly, immunoblotting, is to identify with a specific antibody a particular antigen within a complex mixture of proteins that has been fractionated in a polyacrylamide gel and immobilized onto a membrane. Immunoblotting can be used to determine a number of important characteristics of protein antigens: the presence and quantity of an antigen, the relative molecular weight of the polypeptide chain, and the efficiency of extraction of the antigen. Immunoblotting occurs in six stages: (1) extraction and quantification of protein samples; (2) resolution of the protein sample in sodium dodecyl sulfate-polyacrylamide denaturing gel electrophoresis (SDS-PAGE); (3) transfer of the separated polypeptides to a membrane support; (4) blocking nonspecific binding sites on the membrane; (5) addition of antibodies; and (6) detection. Sample preparation is important for obtaining accurate separation of the proteins on the basis of molecular weight. Depending on whether an antigen is primarily extracellular, cytoplasmic, or membrane-associated, different procedures might be required to prepare the sample initially. Although there are exceptions, many soluble nuclear and cytoplasmic proteins can be solubilized by lysis buffers that contain the nonionic detergent Nonidet P-40 (NP-40) and either no salt at all or relatively high concentrations of salt (e.g., 0.5 M NaCl). However, the efficiency of extraction is often greatly affected by the pH of the buffer and the presence or absence of chelating agents such as EDTA.

  7. Extraction of Flat and Nested Data Records from Web Pages

    CERN Document Server

    Hiremath, P S

    2010-01-01

    This paper studies the problem of identifying and extracting flat and nested data records from a given web page. With the explosive growth of information sources available on the World Wide Web, it has become increasingly difficult to identify the relevant pieces of information, since web pages are often cluttered with irrelevant content like advertisements, navigation panels, and copyright notices surrounding the main content of the page. Hence, it is useful to mine such data regions and data records in order to extract information from web pages and provide value-added services. Currently available automatic techniques to mine data regions and data records from web pages remain unsatisfactory because of their poor performance. In this paper a novel method to automatically identify and extract flat and nested data records from web pages is proposed. It comprises two steps: (1) Identification and extraction of the data regions based on visual-clue information. (2) Identificatio...

  8. Some notes on taxonomic methodology

    NARCIS (Netherlands)

    Hammen, van der L.

    1986-01-01

    The present paper constitutes an introduction to taxonomic methodology. After an analysis of taxonomic practice, and a brief survey of kinds of attributes, the paper deals with observation, description, comparison, arrangement and classification, hypothesis construction, deduction, model, experiment

  9. An Efficient PageRank Approach for Urban Traffic Optimization

    OpenAIRE

    2012-01-01

    Cities are not static environments; they change constantly. When we talk about traffic in the city, the evolution of traffic lights is a journey from mindless automation to increasingly intelligent, fluid traffic management. In the approach presented in this paper, a reinforcement-learning mechanism based on a cost function is introduced to determine optimal decisions for each traffic light, based on the solution given by Larry Page for page ranking in the Web environment (Page et al. (1999))...

  10. One-Page Multimedia Interactive Map

    Directory of Open Access Journals (Sweden)

    Nicola Maiellaro

    2017-01-01

    Full Text Available The relevance of local knowledge in cultural heritage is by now acknowledged. It helps to determine many community-based projects by identifying the material to be digitally maintained in multimedia collections provided by communities of volunteers, rather than for-profit businesses or government entities. Considering that searching and browsing texts, images, video, and 3D models related to places is more essential than using a simple text-based search, an interactive multimedia map was implemented in this study. The map, which is loaded on a single HyperText Markup Language (HTML) page using AJAX (Asynchronous JavaScript and XML), with a client-side control mechanism utilising jQuery components that are both freely available and ad-hoc developed, is updated according to user interaction. To simplify the publication of geo-referenced information, the application stores all the data in a Geographic JavaScript Object Notation (GeoJSON) file rather than in a database. The multimedia contents associated with the selected Points of Interest (PoIs) can be selected through text search and list browsing, as well as by viewing their previews one by one, in a sequence, or all together in a scrolling window (respectively, the "Table", "Folder", and "Tile" functions). PoIs, visualised on the map with multi-shape markers using a set of unambiguous colours, can be filtered by their categories and types, accessibility status, and timeline, thus improving the system's usability. The map functions are illustrated using data collected in a Comenius project. Notes on the application software and architecture are also presented in this paper.

  11. A thorough spring-clean for CERN's Web pages

    CERN Multimedia

    2001-01-01

    This coming Tuesday will see the unveiling of CERN's new user pages on the Web. Their simplified layout and design will make everybody's lives a whole lot easier. Stand by for Tuesday 17 April when, as announced in the Weekly Bulletin of 2 April (n°14/2001), the newly-designed users' welcome page will be hitting our screens as the default CERN home page. But don't worry, if you've got the blues for the good old blue-green home page it's still in service and, to ensure a smooth transition, will be maintained in parallel until 25 May. But in all likelihood you'll be quickly won over by the new-look pages, which are so much simpler to use. Welcome to the new Web! The aim of this revamp, led by the WPE (Web Public Education) group, is to simplify and introduce a more logical hierarchy into the menus and welcome pages on CERN's Intranet. In a second stage, the 'General Public' pages will get a similar makeover. The fact is that the number of links on the user pages, and in particular the welcome page...

  12. DISTRIBUTED APPROACH to WEB PAGE CATEGORIZATION USING MAPREDUCE PROGRAMMING MODEL

    Directory of Open Access Journals (Sweden)

    P.Malarvizhi

    2011-12-01

    Full Text Available The web is a large repository of information and to facilitate the search and retrieval of pages from it,categorization of web documents is essential. An effective means to handle the complexity of information retrieval from the internet is through automatic classification of web pages. Although lots of automatic classification algorithms and systems have been presented, most of the existing approaches are computationally challenging. In order to overcome this challenge, we have proposed a parallel algorithm, known as MapReduce programming model to automatically categorize the web pages. This approach incorporates three concepts. They are web crawler, MapReduce programming model and the proposed web page categorization approach. Initially, we have utilized web crawler to mine the World Wide Web and the crawled web pages are then directly given as input to the MapReduce programming model. Here the MapReduce programming model adapted to our proposed web page categorization approach finds the appropriate category of the web page according to its content. The experimental results show that our proposed parallel web page categorization approach achieves satisfactory results in finding the right category for any given web page.
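
A toy, single-process simulation of the map and reduce steps described above may make the flow concrete. The keyword-overlap classifier and category names below are stand-in assumptions, not the paper's content-based categorizer:

```python
from collections import defaultdict

# Assumed stand-in classifier: pick the category whose keyword set
# overlaps most with the page's tokens.
KEYWORDS = {'sports': {'match', 'score'}, 'tech': {'cpu', 'code'}}

def map_phase(crawled):
    """crawled: list of (url, tokens) produced by the crawler.
    Emits (category, url) key-value pairs, as a mapper would."""
    for url, tokens in crawled:
        best = max(KEYWORDS, key=lambda c: len(KEYWORDS[c] & set(tokens)))
        yield best, url

def reduce_phase(pairs):
    """Groups the mapper's pairs by category key."""
    groups = defaultdict(list)
    for cat, url in pairs:
        groups[cat].append(url)
    return dict(groups)
```

In a real MapReduce deployment the pairs would be shuffled across workers by key; here the two phases are simply chained: `reduce_phase(map_phase(crawled))`.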

  13. A Heuristic Algorithm for optimizing Page Selection Instructions

    CERN Document Server

    Li, Qing'an; Chen, Yong; Wu, Wei; Xu, Wenwen

    2010-01-01

    Page switching is a technique that increases the memory available to microcontrollers without extending the address buses. This technique is widely used in the design of 8-bit MCUs. In this paper, we present an algorithm to reduce the overhead of page switching. To pursue small code size, we place the emphasis on allocating functions to suitable pages with a heuristic algorithm, which enables cost-effective placement of page selection instructions. Our experimental results show that the optimization achieved a code-size reduction of 13.2 percent.
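
One simple greedy heuristic in this spirit (an assumption for illustration, not the paper's exact algorithm) packs hot caller/callee pairs onto the same page, so the page-selection instruction between them disappears:

```python
def allocate(sizes, calls, page_size):
    """Greedy sketch: sizes maps function -> code bytes, calls maps
    (caller, callee) -> call count. Hot pairs are co-located first;
    leftover functions are placed first-fit.
    Returns function -> page index."""
    used = []                               # bytes consumed per page
    page_of = {}

    def first_fit(f):
        for pg, u in enumerate(used):
            if u + sizes[f] <= page_size:
                return pg
        used.append(0)                      # open a fresh page
        return len(used) - 1

    # consider the most frequent caller/callee pairs first
    for (a, b), _ in sorted(calls.items(), key=lambda kv: -kv[1]):
        for f, partner in ((a, b), (b, a)):
            if f in page_of:
                continue
            if partner in page_of and used[page_of[partner]] + sizes[f] <= page_size:
                pg = page_of[partner]       # share the partner's page
            else:
                pg = first_fit(f)
            used[pg] += sizes[f]
            page_of[f] = pg
    for f in sizes:                         # functions never mentioned in calls
        if f not in page_of:
            pg = first_fit(f)
            used[pg] += sizes[f]
            page_of[f] = pg
    return page_of
```

The quality of such a placement is judged by how many high-count call edges end up crossing a page boundary, since each crossing needs a page-selection instruction.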

  14. Recognition of pornographic web pages by classifying texts and images.

    Science.gov (United States)

    Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve

    2007-06-01

    With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, the Bayes theory is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those by either of the individual classifiers, and our framework can be adapted to different categories of Web pages.
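
The naive Bayes step described for discrete texts can be sketched as follows. The class labels, tokenization, and Laplace smoothing below are my assumptions; the paper's feature set is richer:

```python
from collections import Counter
import math

def train_nb(docs):
    """docs: list of (tokens, label), e.g. label in {'porn', 'benign'}.
    Returns class priors, per-class word counts, and the vocabulary."""
    priors, counts, vocab = Counter(), {}, set()
    for tokens, label in docs:
        priors[label] += 1
        counts.setdefault(label, Counter()).update(tokens)
        vocab.update(tokens)
    return priors, counts, vocab

def log_posterior(tokens, label, priors, counts, vocab):
    """Log P(label) + sum of log P(token | label), Laplace-smoothed."""
    total = sum(counts[label].values())
    logp = math.log(priors[label] / sum(priors.values()))
    for t in tokens:
        logp += math.log((counts[label][t] + 1) / (total + len(vocab)))
    return logp
```

A page is assigned the label with the larger log posterior; in the paper's framework this score would then be fused with the image classifier's output via Bayes' theorem.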

  15. Fuzzy Clustering Method for Web User Based on Pages Classification

    Institute of Scientific and Technical Information of China (English)

    ZHAN Li-qiang; LIU Da-xin

    2004-01-01

    This article proposes a new method for fuzzy clustering of Web users based on an analysis of user interest characteristics. The method first defines fuzzy page categories according to the links on the site's index page, then computes fuzzy cross-page degrees by aggregating Web log data. After that, using the fuzzy comprehensive evaluation method, it constructs user interest vectors from page viewing times and hit frequencies, and derives a fuzzy similarity matrix for the Web users from the interest vectors. Finally, it obtains the clustering result through fuzzy clustering. The experimental results show the effectiveness of the method.
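
The pipeline of "interest vectors, then a similarity matrix, then clusters" can be sketched as below. The cosine similarity and the lambda-cut grouping are assumptions standing in for the article's fuzzy similarity and clustering steps:

```python
import numpy as np

def fuzzy_user_clusters(interest, lam=0.8):
    """interest: users x page-categories matrix of interest degrees
    (e.g. derived from viewing times and hit frequencies; rows are
    assumed non-zero). lam: lambda-cut threshold on similarity.
    Groups users whose pairwise similarity is at least lam."""
    X = interest / np.linalg.norm(interest, axis=1, keepdims=True)
    sim = X @ X.T            # cosine similarity, in [0, 1] for non-negative rows
    clusters, assigned = [], set()
    for i in range(len(X)):
        if i in assigned:
            continue
        group = {j for j in range(len(X)) if sim[i, j] >= lam}
        assigned |= group
        clusters.append(sorted(group))
    return clusters
```

Lowering `lam` merges users with weaker overlap into the same cluster; raising it splits them apart, which is the usual way a lambda-cut trades cluster granularity against cohesion.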

  16. A Model for Web Page Usage Mining Based on Segmentation

    OpenAIRE

    Kuppusamy, K. S.; Aghila, G.

    2012-01-01

    The web page usage mining plays a vital role in enriching the page's content and structure based on the feedbacks received from the user's interactions with the page. This paper proposes a model for micro-managing the tracking activities by fine-tuning the mining from the page level to the segment level. The proposed model enables the web-master to identify the segments which receives more focus from users comparing with others. The segment level analytics of user actions provides an importan...

  17. Discovering author impact: A PageRank perspective

    CERN Document Server

    Yan, Erjia

    2010-01-01

    This article provides an alternative perspective for measuring author impact by applying PageRank algorithm to a coauthorship network. A weighted PageRank algorithm considering citation and coauthorship network topology is proposed. We test this algorithm under different damping factors by evaluating author impact in the informetrics research community. In addition, we also compare this weighted PageRank with the h-index, citation, and program committee (PC) membership of the International Society for Scientometrics and Informetrics (ISSI) conferences. Findings show that this weighted PageRank algorithm provides reliable results in measuring author impact.

  18. Tourism Methodologies

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different...... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt...

  19. Formatting a Paper-based Survey Questionnaire: Best Practices

    Directory of Open Access Journals (Sweden)

    Elizabeth Fanning

    2005-08-01

    Full Text Available This paper summarizes best practices with regard to paper-based survey questionnaire design. Initial design considerations, the cover and cover page, directions, ordering of questions, navigational path (branching), and page design are discussed.

  20. Nuclear proteasomes carry a constitutive posttranslational modification which derails SDS-PAGE (but not CTAB-PAGE).

    Science.gov (United States)

    Pitcher, David S; de Mattos-Shipley, Kate; Wang, Ziming; Tzortzis, Konstantinos; Goudevenou, Katerina; Flynn, Helen; Bohn, Georg; Rahemtulla, Amin; Roberts, Irene; Snijders, Ambrosius P; Karadimitris, Anastasios; Kleijnen, Maurits F

    2014-12-01

    We report that subunits of human nuclear proteasomes carry a previously unrecognised, constitutive posttranslational modification. Subunits with this modification are not visualised by SDS-PAGE, which is used in almost all denaturing protein gel electrophoresis. In contrast, CTAB-PAGE readily visualises such modified subunits. Thus, under most experimental conditions, with identical samples, SDS-PAGE yielded gel electrophoresis patterns for subunits of nuclear proteasomes which were misleading and strikingly different from those obtained with CTAB-PAGE. Initial analysis indicates a novel modification of a high negative charge with some similarity to polyADP-ribose, possibly explaining compatibility with (positively-charged) CTAB-PAGE but not (negatively-charged) SDS-PAGE and providing a mechanism for how nuclear proteasomes may interact with chromatin, DNA and other nuclear components.

  1. Application of sample inventorying methodology in surveying occupant density at public spaces

    Institute of Scientific and Technical Information of China (English)

    甘廷霞; 谢晓刚; 胡忠日

    2012-01-01

    This paper introduces the sample inventorying/counting methodology and applies it to survey occupant density at the Xidan Department Store in Chengdu. By comparing the results with those obtained through other survey methodologies, the practicability of the sample inventorying approach and the venues where it applies are analyzed. The method offers guidance and a reference for surveying basic data such as occupant density in public spaces.

  2. Paired Comparison Survey Analyses Utilizing Rasch Methodology of the Relative Difficulty and Estimated Work Relative Value Units of CPT® Code 27279

    Science.gov (United States)

    Lorio, Morgan; Ferrara, Lisa

    2016-01-01

    Background Minimally invasive sacroiliac joint arthrodesis (“MI SIJ fusion”) received a Category I CPT® code (27279) effective January 1, 2015 and was assigned a work relative value unit (“RVU”) of 9.03. The International Society for the Advancement of Spine Surgery (“ISASS”) conducted a study consisting of a Rasch analysis of two separate surveys of surgeons to assess the accuracy of the assigned work RVU. Methods A survey was developed and sent to ninety-three ISASS surgeon committee members. Respondents were asked to compare CPT® 27279 to ten other comparator CPT® codes reflective of common spine surgeries. The survey presented each comparator CPT® code with its code descriptor as well as the description of CPT® 27279 and asked respondents to indicate whether CPT® 27279 was greater, equal, or less in terms of work effort than the comparator code. A second survey was sent to 557 U.S.-based spine surgeon members of ISASS and 241 spine surgeon members of the Society for Minimally Invasive Spine Surgery (“SMISS”). The design of the second survey mirrored that of the first survey except for the use of a broader set of comparator CPT® codes (27 vs. 10). Using the work RVUs of the comparator codes, a Rasch analysis was performed to estimate the relative difficulty of CPT® 27279, after which the work RVU of CPT® 27279 was estimated by regression analysis. Results Twenty surgeons responded to the first survey and thirty-four surgeons responded to the second survey. The results of the regression analysis of the first survey indicate a work RVU for CPT® 27279 of 14.36 and the results of the regression analysis of the second survey indicate a work RVU for CPT® 27279 of 14.1. Conclusion The Rasch analysis indicates that the current work RVU assigned to CPT® 27279 is undervalued at 9.03. Averaging the results of the regression analyses of the two surveys indicates a work RVU for CPT® 27279 of 14.23.

  3. JavaScript: Convenient Interactivity for the Class Web Page.

    Science.gov (United States)

    Gray, Patricia

    This paper shows how JavaScript can be used within HTML pages to add interactive review sessions and quizzes incorporating graphics and sound files. JavaScript has the advantage of providing basic interactive functions without the use of separate software applications and players. Because it can be part of a standard HTML page, it is…

  4. Toward a User-Centered Academic Library Home Page

    Science.gov (United States)

    McHale, Nina

    2008-01-01

    In the past decade, academic libraries have struggled with the design of an effective library home page. Since librarians' mental models of information architecture differ from those of their patrons, usability assessments are necessary in designing a user-centered home page. This study details a usability sequence of card sort and paper and…

  5. Dynamic Web Pages: Performance Impact on Web Servers.

    Science.gov (United States)

    Kothari, Bhupesh; Claypool, Mark

    2001-01-01

    Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)

  6. ONTOPARK: ONTOLOGY BASED PAGE RANKING FRAMEWORK USING RESOURCE DESCRIPTION FRAMEWORK

    Directory of Open Access Journals (Sweden)

    S. Yasodha

    2014-01-01

    Full Text Available Traditional search engines like Google and Yahoo fail to rank the relevant information for a user's query. This is because such search engines rely on keywords for searching and fail to consider the semantics of the query. More sophisticated methods that provide the relevant information for a query are needed. The Semantic Web, which stores metadata as ontology, could be used to solve this problem. The major drawback of Google's PageRank algorithm is that ranking is based not only on the page ranks produced but also on the number of hits to the Web page. This paved the way for illegitimate means of boosting page ranks. As a result, Web pages whose page rank is zero are also ranked in top order. This drawback of the PageRank algorithm motivated us to contribute to the Web community by providing semantic search results. We therefore propose ONTOPARK, an ontology-based framework for ranking Web pages. The proposed framework combines the Vector Space Model of information retrieval with ontology. The framework constructs semantically annotated Resource Description Framework (RDF) files, which form the RDF knowledgebase for each query. The proposed framework has been evaluated by two measures, precision and recall. It improves the precision of both single-word and multi-word queries, which suggests that replacing the Web database with a semantic knowledgebase will improve the quality of search. The surfing time of surfers will also be minimized.

  7. Evaluating Information Quality: Hidden Biases on the Children's Web Pages

    Science.gov (United States)

    Kurubacak, Gulsun

    2006-01-01

    As global digital communication continues to flourish, the Children's Web pages become more critical for children to realize not only the surface but also breadth and deeper meanings in presenting these milieus. These pages not only are very diverse and complex but also enable intense communication across social, cultural and political…

  8. An Analysis of Academic Library Web Pages for Faculty

    Science.gov (United States)

    Gardner, Susan J.; Juricek, John Eric; Xu, F. Grace

    2008-01-01

    Web sites are increasingly used by academic libraries to promote key services and collections to teaching faculty. This study analyzes the content, location, language, and technological features of fifty-four academic library Web pages designed especially for faculty to expose patterns in the development of these pages.

  9. Project Management - Development of course materiale as WEB pages

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe; Bjergø, Søren

    1997-01-01

    Development of Internet pages with lesson plans, slideshows, links, a conference system, and an interactive student section for communication among students and with the teacher as well.

  10. Resource selection for an interdisciplinary field: a methodology.

    Science.gov (United States)

    Jacoby, Beth E; Murray, Jane; Alterman, Ina; Welbourne, Penny

    2002-10-01

    The Health Sciences and Human Services Library of the University of Maryland developed and implemented a methodology to evaluate print and digital resources for social work. Although this methodology was devised for the interdisciplinary field of social work, the authors believe it may lend itself to resource selection in other interdisciplinary fields. The methodology was developed in response to the results of two separate surveys conducted in late 1999, which indicated improvement was needed in the library's graduate-level social work collections. Library liaisons evaluated the print collection by identifying forty-five locally relevant Library of Congress subject headings and then using these subjects or synonymous terms to compare the library's titles to collections of peer institutions, publisher catalogs, and Amazon.com. The collection also was compared to social work association bibliographies, ISI Journal Citation Reports, and major social work citation databases. An approval plan for social work books was set up to assist in identifying newly published titles. The library acquired new print and digital social work resources as a result of the evaluation, thus improving both print and digital collections for its social work constituents. Visibility of digital resources was increased by cataloging individual titles in aggregated electronic journal packages and listing each title on the library Web page.

  11. PageRank model of opinion formation on social networks

    Science.gov (United States)

    Kandiah, Vivek; Shepelyansky, Dima L.

    2012-11-01

    We propose the PageRank model of opinion formation and investigate its rich properties on real directed networks of the Universities of Cambridge and Oxford, LiveJournal, and Twitter. In this model, the opinion formation of linked electors is weighted with their PageRank probability. Such a probability is used by the Google search engine for ranking of web pages. We find that the society elite, corresponding to the top PageRank nodes, can impose its opinion on a significant fraction of the society. However, for a homogeneous distribution of two opinions, there exists a bistability range of opinions which depends on a conformist parameter characterizing the opinion formation. We find that the LiveJournal and Twitter networks have a stronger tendency to a totalitarian opinion formation than the university networks. We also analyze the Sznajd model generalized for scale-free networks with the weighted PageRank vote of electors.
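The record's central mechanism, PageRank combined with a weighted opinion vote, can be sketched in a few lines. Below is a minimal, self-contained Python illustration, not the paper's exact update rule: PageRank is computed by plain power iteration over a `{node: out-neighbors}` dict, and one synchronous voting step lets each elector adopt the opinion with the larger PageRank-weighted support among itself and the nodes linking to it.

```python
def pagerank(adj, d=0.85, tol=1e-10):
    """Plain power-iteration PageRank; adj maps node -> list of out-neighbors."""
    nodes = list(adj)
    n = len(nodes)
    ranks = {v: 1.0 / n for v in nodes}
    while True:
        new = {v: (1.0 - d) / n for v in nodes}
        for v in nodes:
            out = adj[v]
            if out:                      # spread rank along out-links
                share = d * ranks[v] / len(out)
                for w in out:
                    new[w] += share
            else:                        # dangling node: spread uniformly
                for w in nodes:
                    new[w] += d * ranks[v] / n
        if sum(abs(new[v] - ranks[v]) for v in nodes) < tol:
            return new
        ranks = new

def vote_step(adj, ranks, opinions):
    """One synchronous opinion update: each node adopts the opinion with the
    larger PageRank-weighted support among itself and its in-neighbors
    (an illustrative simplification of the paper's rule)."""
    incoming = {v: [] for v in adj}
    for v, outs in adj.items():
        for w in outs:
            incoming[w].append(v)
    new = {}
    for v in adj:
        support = {}
        for u in incoming[v] + [v]:
            support[opinions[u]] = support.get(opinions[u], 0.0) + ranks[u]
        new[v] = max(support, key=support.get)
    return new
```

In a star graph where every page links to a hub, the hub collects most of the PageRank, so its opinion carries disproportionate weight, which is the intuition behind the record's observation that the top-PageRank elite can impose its opinion.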

  12. The ICAP (Interactive Course Assignment Pages) Publishing System

    Directory of Open Access Journals (Sweden)

    Kim Griggs

    2008-03-01

    Full Text Available The ICAP publishing system is an open source custom content management system that enables librarians to easily and quickly create and manage library help pages for course assignments (ICAPs, without requiring knowledge of HTML or other web technologies. The system's unique features include an emphasis on collaboration and content reuse and an easy-to-use interface that includes in-line help, simple forms and drag and drop functionality. The system generates dynamic, attractive course assignment pages that blend Web 2.0 features with traditional library resources, and makes the pages easier to find by providing a central web page for the course assignment pages. As of December 2007, the code is available as free, open-source software under the GNU General Public License.

  13. Collective Behaviour Learning: A Concept for Filtering Web Pages

    Directory of Open Access Journals (Sweden)

    G. Mercy Bai

    2014-03-01

    Full Text Available The rapid growth of the WWW poses unprecedented challenges for general-purpose crawlers and search engines. An earlier technique for crawling forum pages was FOCUS (Forum Crawler Under Supervision). This project presents a collective behavior learning algorithm for web crawling. The algorithm crawls web pages based on a particular keyword, and discriminative learning extracts only the URLs related to that keyword through filtering. The goal of this project is to crawl relevant forum content from the web with minimal overhead. Unwanted URLs are removed from the web pages, and the crawling workload is reduced by using collective behavior learning. Web pages are extracted based on learning techniques that can also be used to identify unwanted URLs.

  14. An Improved Approach to the PageRank Problems

    Directory of Open Access Journals (Sweden)

    Yue Xie

    2013-01-01

    Full Text Available We introduce a partition of the web pages particularly suited to the PageRank problems in which the web link graph has a nested block structure. Based on the partition of the web pages, dangling nodes, common nodes, and general nodes, the hyperlink matrix can be reordered to be a more simple block structure. Then based on the parallel computation method, we propose an algorithm for the PageRank problems. In this algorithm, the dimension of the linear system becomes smaller, and the vector for general nodes in each block can be calculated separately in every iteration. Numerical experiments show that this approach speeds up the computation of PageRank.

  15. Monte Carlo methods in PageRank computation: When one iteration is sufficient

    NARCIS (Netherlands)

    Avrachenkov, K.; Litvak, N.; Nemirovsky, D.; Osipova, N.

    2007-01-01

    PageRank is one of the principal criteria according to which Google ranks Web pages. PageRank can be interpreted as a frequency of visiting a Web page by a random surfer, and thus it reflects the popularity of a Web page. Google computes the PageRank using the power iteration method, which requires …
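The Monte Carlo alternative this record analyzes can be illustrated with an end-point estimator: run a handful of short random walks from every page and count where they terminate. This is a sketch of the general idea under the standard random-surfer model, not the authors' exact estimator.

```python
import random

def mc_pagerank(adj, d=0.85, walks_per_node=200, rng=None):
    """Monte Carlo end-point estimate of PageRank: start walks at every node,
    continue each with probability d, and count where walks terminate."""
    rng = rng or random.Random(0)
    nodes = list(adj)
    ends = {v: 0 for v in nodes}
    total = 0
    for start in nodes:
        for _ in range(walks_per_node):
            v = start
            while rng.random() < d:
                out = adj[v]
                # dangling node: jump to a uniformly random page
                v = rng.choice(out) if out else rng.choice(nodes)
            ends[v] += 1
            total += 1
    return {v: ends[v] / total for v in nodes}
```

Because each walk is short (expected length 1/(1-d) steps), a single pass of walks already separates high-rank pages from low-rank ones, which is the sense in which "one iteration is sufficient".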

  16. Monte Carlo methods in PageRank computation: When one iteration is sufficient

    NARCIS (Netherlands)

    Avrachenkov, K.; Litvak, N.; Nemirovsky, D.; Osipova, N.

    2005-01-01

    PageRank is one of the principal criteria according to which Google ranks Web pages. PageRank can be interpreted as a frequency of visiting a Web page by a random surfer and thus it reflects the popularity of a Web page. Google computes the PageRank using the power iteration method, which requires …

  17. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  18. Methodology For The System Integration Of Adaptive Resilience In Armor

    Science.gov (United States)

    2016-09-01

    achieved in an externally reconfigurable fashion; however, this would not make sense because it would create a vulnerability in the armor protection that... Keywords: complex operating environment, systems engineering, system integration, engineering resilience, resilience theory. ...each step of the methodology. This methodology makes possible many new applications for integrating adaptive resilience technological systems. These

  19. Computing personalized PageRank in weighted networks

    Institute of Scientific and Technical Information of China (English)

    彭茂; 张媛

    2016-01-01

    PageRank assigns authority weights to each web page based on the web hyperlink structure, and personalized PageRank is a generalized version of ordinary PageRank. The computation of personalized PageRank vectors in unweighted webs has been well studied in past decades, but little is known for the case of weighted webs. In this paper, we analyze algorithms for PageRank computation in static as well as dynamic weighted networks. The algorithms are based on matrix transformation or Monte Carlo methods, and their computational performance is analyzed theoretically. Experiments show that the proposed localized algorithm outperforms power iteration and a referenced Monte Carlo method.
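To make the record's setting concrete, here is a minimal power-iteration sketch of personalized PageRank on a weighted digraph: transition probabilities are proportional to edge weights, and the teleport always returns to the seed node. This is a generic textbook formulation, not the paper's matrix-transformation or Monte Carlo algorithms.

```python
def personalized_pagerank(wadj, seed, d=0.85, tol=1e-12):
    """Power iteration for personalized PageRank on a weighted digraph.
    wadj: {node: {neighbor: weight}}; restarts always return to `seed`."""
    nodes = list(wadj)
    ranks = {v: 0.0 for v in nodes}
    ranks[seed] = 1.0
    while True:
        new = {v: 0.0 for v in nodes}
        new[seed] = 1.0 - d
        for v in nodes:
            out = wadj[v]
            if out:
                total = sum(out.values())       # normalize weights to probabilities
                for w, wt in out.items():
                    new[w] += d * ranks[v] * wt / total
            else:
                new[seed] += d * ranks[v]       # dangling mass returns to the seed
        if sum(abs(new[v] - ranks[v]) for v in nodes) < tol:
            return new
        ranks = new
```

A node reached over a heavier edge accumulates proportionally more rank, which is the essential difference from the unweighted case.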

  20. Ranking pages and the topology of the web

    CERN Document Server

    Arratia, Argimiro

    2011-01-01

    This paper presents our studies on the rearrangement of links from the structure of websites for the purpose of improving the valuation of a page or group of pages as established by a ranking function as Google's PageRank. We build our topological taxonomy starting from unidirectional and bidirectional rooted trees, and up to more complex hierarchical structures as cyclical rooted trees (obtained by closing cycles on bidirectional trees) and PR--digraph rooted trees (digraphs whose condensation digraph is a rooted tree that behave like cyclical rooted trees). We give different modifications on the structure of these trees and its effect on the valuation given by the PageRank function. We derive closed formulas for the PageRank of the root of various types of trees, and establish a hierarchy of these topologies in terms of PageRank. We show that the PageRank of the root of cyclical and PR--digraph trees basically depends on the number of vertices per level and the number of cycles of distinct lengths among lev...

  1. Fishing over the sides or over the stern: does it matter : comparison of two fishing methodologies in the Wadden Sea Demersal Fish Survey

    NARCIS (Netherlands)

    Chen, C.; Bolle, L.J.; de Boois, I.J.

    2016-01-01

    Since 1972, the Demersal Fish Survey (DFS) in the Wadden Sea has been carried out with the RV Stern. Within a few years this vessel will be replaced by another vessel as a result of the current ship replacement policy of Rijkswaterstaat Rijksrederij. It is not yet clear which vessel will replace RV

  2. An Efficient Web Page Ranking for Semantic Web

    Science.gov (United States)

    Chahal, P.; Singh, M.; Kumar, S.

    2014-01-01

    With the enormous amount of information presented on the web, the retrieval of relevant information has become a serious problem and has been a topic of research for the last few years. The most common tools to retrieve information from the web are search engines like Google. Search engines are usually based on keyword searching and indexing of web pages. This approach is not very efficient, as the result set of web pages obtained includes many irrelevant pages; sometimes even the entire result set may contain a lot of irrelevant pages for the user. The next generation of search engines must address this problem. Recently, many semantic web search engines have been developed, like Ontolook and Swoogle, which help in searching meaningful documents presented on the semantic web. In this process the ranking of the retrieved web pages is crucial. Some attempts have been made at ranking semantic web pages, but the ranking of these semantic web documents is neither satisfactory nor up to users' expectations. In this paper we propose a semantic-web-based document ranking scheme that relies not only on the keywords but also on the conceptual instances present between the keywords. As a result, only relevant pages will be at the top of the result set of searched web pages. We explore all relevant relations between the keywords, exploring the user's intention, and then calculate the fraction of these relations on each web page to determine their relevance. We have found that this ranking technique gives better results than the prevailing methods.

  3. Methodological studies of national survey and verification of mineral rights

    Institute of Scientific and Technical Information of China (English)

    杨建锋; 林燕; 孙炳旭; 王永志

    2011-01-01

    The national survey and verification of mineral rights is a complicated systems engineering project; scientific, practical and advanced technical methods are a precondition for completing it on schedule and with quality. Following the principles of scientific practicality, ease of operation, efficiency, and potential for applying the results, the paper puts forward a directive technical procedure for mineral rights survey and verification and determines key technical parameters such as surveying coordinate systems, the number and precision of surveying control points, and the format of attribute and spatial data. Through experimental work in four typical counties, the paper proposes a work pattern for mineral rights survey and verification to guide and standardize this work across regions. Practice shows that the proposed technical guidelines achieved the expected results and supported the smooth advancement of the national survey.

  4. Optimizing TLB entries for mixed page size storage in contiguous memory

    Science.gov (United States)

    Chen, Dong; Gara, Alan; Giampapa, Mark E.; Heidelberger, Philip; Kriegel, Jon K.; Ohmacht, Martin; Steinmacher-Burow, Burkhard

    2013-04-30

    A system and method for accessing memory are provided. The system comprises a lookup buffer for storing one or more page table entries, wherein each of the one or more page table entries comprises at least a virtual page number and a physical page number; a logic circuit for receiving a virtual address from said processor, said logic circuit for matching the virtual address to the virtual page number in one of the page table entries to select the physical page number in the same page table entry, said page table entry having one or more bits set to exclude a memory range from a page.
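The lookup described here, matching a virtual page number per entry while entries of different page sizes coexist in one buffer, can be modeled in a few lines. Everything below is an invented software sketch; the patent's range-exclusion bits and hardware details are omitted.

```python
# Hypothetical software model of a TLB holding mixed page sizes.
PAGE_SIZES = (4096, 2 * 1024 * 1024)  # 4 KiB and 2 MiB entries in one buffer

class TLB:
    def __init__(self):
        self.entries = []  # (virtual_page_number, physical_page_number, page_size)

    def insert(self, vaddr, paddr, size):
        assert size in PAGE_SIZES and vaddr % size == 0 and paddr % size == 0
        self.entries.append((vaddr // size, paddr // size, size))

    def translate(self, vaddr):
        """Match the virtual page number at each entry's own page size, so one
        large-page entry can cover a wide contiguous range of addresses."""
        for vpn, ppn, size in self.entries:
            if vaddr // size == vpn:
                return ppn * size + vaddr % size
        raise LookupError(f"TLB miss for {vaddr:#x}")
```

A single 2 MiB entry here replaces 512 separate 4 KiB entries, which is exactly the motivation for mixing page sizes in one buffer.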

  5. Web Pages for Your Classroom The EASY Way!

    CERN Document Server

    Mccorkle, Sandra

    2003-01-01

    A practical how-to guide, this book provides the classroom teacher or librarian with all of the tools necessary for creating Web pages for student use. Useful templates-a CD ROM is included for easy use-and clear, logical instructions guide you in the creation of pages that students can later use for research or other types of projects that familiarize students with the power and usefulness of the Web. Gaining this skill allows you the flexibility of tailoring Web pages to students' specific needs and being sure of the quality of resources students are accessing. This book is indispensable for

  6. A New Page Ranking Algorithm Based On WPRVOL Algorithm

    OpenAIRE

    Roja Javadian Kootenae; Seyyed Mohsen Hashemi; Mehdi Afzali

    2013-01-01

    The amount of information on the web is always growing, so powerful search tools are needed to search such a large collection. Search engines help users find the information they want more easily among this massive volume. But what matters in search engines, and distinguishes them from one another, is the page ranking algorithm they use. In this paper a new page ranking algorithm based on "Weighted Page Ranking based on Visits of ...

  7. Research on PageRank Algorithm Based on Web Page Segmentation Model

    Institute of Scientific and Technical Information of China (English)

    白似雪; 刘华斌

    2008-01-01

    An improved PageRank algorithm based on a page-block importance model is proposed. The algorithm takes into account that out-links belonging to different blocks of the same page have different importance, and accordingly assigns different weights to the out-links of each block, so that a page's PageRank value is computed more reasonably, fairly and effectively. Compared with the original PageRank algorithm and its previous improvements, this algorithm, built around a vision-based page segmentation algorithm, better reflects the characteristics of web pages, matches users' browsing habits, and achieves good results.

  8. The ATLAS Public Web Pages: Online Management of HEP External Communication Content

    CERN Document Server

    Goldfarb, Steven; Phoboo, Abha Eli; Shaw, Kate

    2015-01-01

    The ATLAS Education and Outreach Group is in the process of migrating its public online content to a professionally designed set of web pages built on the Drupal content management system. Development of the front-end design passed through several key stages, including audience surveys, stakeholder interviews, usage analytics, and a series of fast design iterations, called sprints. Implementation of the web site involves application of the html design using Drupal templates, refined development iterations, and the overall population of the site with content. We present the design and development processes and share the lessons learned along the way, including the results of the data-driven discovery studies. We also demonstrate the advantages of selecting a back-end supported by content management, with a focus on workflow. Finally, we discuss usage of the new public web pages to implement outreach strategy through implementation of clearly presented themes, consistent audience targeting and messaging, and th...

  9. Heap/stack guard pages using a wakeup unit

    Energy Technology Data Exchange (ETDEWEB)

    Gooding, Thomas M; Satterfield, David L; Steinmacher-Burow, Burkhard

    2014-04-29

    A method and system for providing a memory access check on a processor including the steps of detecting accesses to a memory device including level-1 cache using a wakeup unit. The method includes invalidating level-1 cache ranges corresponding to a guard page, and configuring a plurality of wakeup address compare (WAC) registers to allow access to selected WAC registers. The method selects one of the plurality of WAC registers, and sets up a WAC register related to the guard page. The method configures the wakeup unit to interrupt on access of the selected WAC register. The method detects access of the memory device using the wakeup unit when a guard page is violated. The method generates an interrupt to the core using the wakeup unit, and determines the source of the interrupt. The method detects the activated WAC registers assigned to the violated guard page, and initiates a response.
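A software analogue of the guard-page check is easy to sketch: register the address range of each guard page and flag any access that falls inside one. All names below are invented; the patent implements this with hardware wakeup address compare (WAC) registers and interrupts rather than exceptions.

```python
# Illustrative model of guard-page checking with watched address ranges.
PAGE = 4096

class GuardChecker:
    def __init__(self):
        self.wac = []  # watched (start, end) ranges, one per guard page

    def guard(self, page_addr):
        """Arm a guard page covering the page containing page_addr."""
        start = page_addr - page_addr % PAGE
        self.wac.append((start, start + PAGE))

    def access(self, addr):
        """Raise on any access that lands inside an armed guard page."""
        for start, end in self.wac:
            if start <= addr < end:
                raise MemoryError(f"guard page violated at {addr:#x}")
        return True
```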

  10. An Optimization Model for Product Placement on Product Listing Pages

    Directory of Open Access Journals (Sweden)

    Yan-Kwang Chen

    2014-01-01

    Full Text Available The design of product listing pages is a key component of Website design because it has significant influence on the sales volume on a Website. This study focuses on product placement in designing product listing pages. Product placement concerns how vendors of online stores place their products on product listing pages to maximize profit. This problem is very similar to the offline shelf management problem. Since product information sources on a Web page are typically communicated through text and images, visual stimuli such as color, shape, size, and spatial arrangement often have an effect on the visual attention of online shoppers and, in turn, influence their eventual purchase decisions. In view of the above, this study synthesizes the visual attention literature and the theory of shelf-space allocation to develop a mathematical programming model with genetic algorithms for finding optimal solutions to the focused issue. The validity of the model is illustrated with example problems.

  11. An Efficient Paging Algorithm for Multi-Carrier CDMA System

    CERN Document Server

    Mostafa, Sheikh Shanawaz; Rashid, Gazi Maniur; Moinuddin, Muhammad; Amin, Md Ziaul; Nahid, Abdullah Al

    2011-01-01

    To cope with the increasing demand for wireless communication services, multi-carrier systems are being used. Radio resources are very limited, and efficient usage of these resources is essential for optimum system performance. The paging channel is a low-bandwidth channel and one of the most important channels on which system performance depends significantly; it is therefore vulnerable to even moderate overloads. In this paper, an efficient paging algorithm, Concurrent Search, is proposed for efficient use of the paging channel in a multi-carrier CDMA system instead of the existing sequential search algorithm. Simulation shows that the paging performance of the proposed algorithm is far better than that of the existing system.

  12. PageRank model of opinion formation on social networks

    CERN Document Server

    Kandiah, Vivek

    2012-01-01

    We propose the PageRank model of opinion formation and investigate its rich properties on real directed networks of the Universities of Cambridge and Oxford, LiveJournal, and Twitter. In this model the opinion formation of linked electors is weighted with their PageRank probability. We find that the society elite, corresponding to the top PageRank nodes, can impose its opinion on a significant fraction of the society. However, for a homogeneous distribution of two opinions there exists a bistability range of opinions which depends on a conformist parameter characterizing the opinion formation. We find that the LiveJournal and Twitter networks have a stronger tendency to a totalitarian opinion formation. We also analyze the Sznajd model generalized for scale-free networks with the weighted PageRank vote of electors.

  13. Book Holder And Page Turner For The Elderly And Handicapped

    Science.gov (United States)

    Kerley, James; Eklund, Wayne

    1993-01-01

    Device holds reading matter and facilitates page turning for person not having use of arms and hands. Accommodates variety of publication formats, whether book, magazine, or newspaper. Holder sits on hospital-bed table and adjusted to convenient viewing angle. Includes flat upright back support for reading matter, hinged base, and main bracket with bent-wire page holders. Top support on back extended for such large items as newspapers. Wings on back support extended for oversize materials. Reader turns page by gripping special rod via mouthpiece, applying friction cup at its tip to page, and manipulating rod. Mouthpiece wide and tapered so user grips with teeth and uses jaws to move it, rather than using tongue or lips. Helpful to older people, whose facial and mouth muscles weak.

  14. Does Aesthetics of Web Page Interface Matters to Mandarin Learning?

    CERN Document Server

    Zain, Jasni Mohamad; Goh, Yingsoon

    2011-01-01

    Aesthetics of a web page refers to how attractive the page is, in that it catches the user's attention to read through the information. The visual appearance is important in getting users' attention, and screens perceived as aesthetically pleasing have been found to have better usability. Usability might be a strong basis relating to applicability for learning, in this study Mandarin learning. It was also found that aesthetically pleasing layouts of web pages would motivate students in Mandarin learning. The Mandarin learning web pages were manipulated according to the desired aesthetic measurements, using a GUI aesthetic measuring method. The Aesthetics-Measurement Application (AMA), accomplished with six aesthetic measures, was developed and used. On top of it, questionnaires were distributed to the users to gather information on the students' perceptions of the aesthetic aspects and learning aspects. Respondent...

  15. Treelicious: a System for Semantically Navigating Tagged Web Pages

    CERN Document Server

    Mullins, Matt; 10.1109/WI-IAT.2010.289

    2011-01-01

    Collaborative tagging has emerged as a popular and effective method for organizing and describing pages on the Web. We present Treelicious, a system that allows hierarchical navigation of tagged web pages. Our system enriches the navigational capabilities of standard tagging systems, which typically exploit only popularity and co-occurrence data. We describe a prototype that leverages the Wikipedia category structure to allow a user to semantically navigate pages from the Delicious social bookmarking service. In our system a user can perform an ordinary keyword search and browse relevant pages but is also given the ability to broaden the search to more general topics and narrow it to more specific topics. We show that Treelicious indeed provides an intuitive framework that allows for improved and effective discovery of knowledge.

  16. Species status assessment report for the Page Springsnail

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Page springsnail is a small hydrobiid snail that is currently found in a complex of springs along Oak Creek and Spring Creek in Yavapai County, central Arizona....

  17. The Information Manager Vol.8 (2)2008 Page 22

    African Journals Online (AJOL)

    Gbaje E.S

    Page 22. Rating of Information Sources / Channels of Social Work Lecturers and Their Students: A Case. Study of ... Social Work discipline exist in Nnamdi Azikiwe library, University of Nigeria, Nsukka. ..... sciences information system. Social ...

  18. A Dynamic Web Page Prediction Model Based on Access Patterns to Offer Better User Latency

    CERN Document Server

    Mukhopadhyay, Debajyoti; Saha, Dwaipayan; Kim, Young-Chon

    2011-01-01

    The growth of the World Wide Web has emphasized the need for improvement in user latency. One of the techniques used for improving user latency is caching; another is web prefetching. Approaches that bank solely on caching offer limited performance improvement because it is difficult for caching to handle the large number of increasingly diverse files. Studies have been conducted on prefetching models based on decision trees, Markov chains, and path analysis. However, increased use of dynamic pages and frequent changes in site structure and user access patterns have limited the efficacy of these static techniques. In this paper, we propose a methodology to cluster related pages into different categories based on access patterns. Additionally, we use page ranking to build up our prediction model at the initial stages, when users haven't yet started sending requests. This way we have tried to overcome the problems of maintaining huge databases, which is needed in the case of log-based techn...
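The access-pattern idea can be grounded with a tiny first-order Markov predictor over page request sequences; the clustering and PageRank-based cold-start parts of the proposal are not modeled here.

```python
from collections import Counter, defaultdict

def build_model(sessions):
    """First-order Markov model of page accesses: for each page, count which
    page users requested next (a simple stand-in for an access-pattern-based
    prefetching model)."""
    nxt = defaultdict(Counter)
    for session in sessions:
        for a, b in zip(session, session[1:]):
            nxt[a][b] += 1
    return nxt

def predict(model, page):
    """Most likely next page, for prefetching; None if the page is unseen."""
    return model[page].most_common(1)[0][0] if model[page] else None
```

A prefetcher would fetch `predict(model, current_page)` into the cache while the user reads the current page.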

  19. HandiVIH—A population-based survey to understand the vulnerability of people with disabilities to HIV and other sexual and reproductive health problems in Cameroon: protocol and methodological considerations

    Science.gov (United States)

    De Beaudrap, Pierre; Pasquier, Estelle; Tchoumkeu, Alice; Touko, Adonis; Essomba, Frida; Brus, Aude; Desgrées du Loû, Annabel; Aderemi, Toyin Janet; Hanass-Hancock, Jill; Eide, Arne Henning; Mont, Daniel; Mac-Seing, Muriel; Beninguisse, Gervais

    2016-01-01

    Introduction In resource-limited countries, people with disabilities seem to be particularly vulnerable to HIV infection due to barriers to accessing information and services, frequent exposure to sexual violence and social exclusion. However, they have often been left behind in the HIV response, probably because of the lack of reliable epidemiological data measuring this vulnerability. Multiple challenges in conducting good quality epidemiological surveys on people with disabilities require innovative methods to better understand the link between disability and HIV. This paper describes how the design and methods of the HandiVIH study were adapted to document the vulnerability of people with disabilities to HIV, and to compare their situation with that of people without disabilities. Methods and analysis The HandiVIH project aims to combine quantitative and qualitative data. The quantitative component is a cross-sectional survey with a control group conducted in Yaoundé (Cameroon). A two-phase random sampling is used (1) to screen people with disabilities from the general population using the Washington Group questionnaire and, (2) to create a matched control group. An HIV test is proposed to each study participant. Additionally, a questionnaire including a life-event interview is used to collect data on respondents’ life-course history of social isolation, employment, sexual partnership, HIV risk factors and fertility. Before the cross-sectional survey, a qualitative exploratory study was implemented to identify challenges in conducting the survey and possible solutions. Information on people with disabilities begging in the streets and members of disabled people's organisations is collected separately. Ethics and dissemination This study has been approved by the two ethical committees. Special attention has been paid on how to adapt the consenting process to persons with intellectual disabilities. The methodological considerations discussed in this paper may

  20. LINK PREDICTION MODEL FOR PAGE RANKING OF BLOGS

    Directory of Open Access Journals (Sweden)

    S.Geetha

    2012-11-01

    Full Text Available Social network analysis is the mapping and measuring of relationships and flows of information between people, organizations, computers, or other information- or knowledge-processing entities. Social media systems such as blogs, LinkedIn, and YouTube allow users to share content. A blog is a social network notepad service focused on user interactions. In this paper we study link prediction and page ranking of blog websites using the MozRank algorithm. It finds out how websites on the internet link to each other using the largest link intelligence database. As link data is a component of search engine ranking, understanding a site's link profile helps explain its search engine positioning. The MozRank algorithm uses backlinks from blog websites and the quality of the linking websites: good websites with many backlinks linking to the corresponding web page receive a high MozRank value. A page's MozRank can be improved by getting many links from semi-popular pages or a few links from very popular pages. The algorithm for page ranking must work differently, and MozRank is more comprehensive and accurate than Google's PageRank. Another tool is Open Site Explorer, which can compare five URLs against each other; its Compare Link Metrics option measures both page-level and domain-level metrics. The results can be presented as a chart of the important metrics for these pages, which makes it clear and easy to compare the data between the five URLs.

  1. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  2. Relevant Pages in semantic Web Search Engines using Ontology

    Directory of Open Access Journals (Sweden)

    Jemimah Simon

    2012-03-01

    Full Text Available In general, search engines are the most popular means of searching for any kind of information on the Internet. Generally, keywords are given to the search engine and the web database returns the documents containing the specified keywords. In many situations, irrelevant results are returned for the user query, since different keywords are used in different forms in various documents. The development of the next-generation web, the Semantic Web, will change this situation. This paper proposes a prototype of a relation-based search engine which ranks pages according to the user query and annotated results. A page subgraph is computed for each annotated page in the result set by generating all possible combinations for the relations in the subgraph. A relevance score is computed for each annotated page using a probability measure. A relation-based ranking model is used which displays the pages in the final result set according to their relevance score. This ranking is provided by considering keyword-concept associations. Thus, the final result set contains pages in the order of their constrained relevance scores.

  3. Metadata Schema Used in OCLC Sampled Web Pages

    Directory of Open Access Journals (Sweden)

    Fei Yu

    2005-12-01

    Full Text Available The tremendous growth of Web resources has made information organization and retrieval more and more difficult. As one approach to this problem, metadata schemas have been developed to characterize Web resources. However, many questions have been raised about the use of metadata schemas such as which metadata schemas have been used on the Web? How did they describe Web accessible information? What is the distribution of these metadata schemas among Web pages? Do certain schemas dominate the others? To address these issues, this study analyzed 16,383 Web pages with meta tags extracted from 200,000 OCLC sampled Web pages in 2000. It found that only 8.19% Web pages used meta tags; description tags, keyword tags, and Dublin Core tags were the only three schemas used in the Web pages. This article revealed the use of meta tags in terms of their function distribution, syntax characteristics, granularity of the Web pages, and the length distribution and word number distribution of both description and keywords tags.
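A scaled-down version of the study's measurement, tallying which metadata schemas appear in `<meta>` tags, can be written with the standard-library HTML parser. The classification below mirrors the three schemas the study found (description, keywords, Dublin Core) but is an assumption about how one might code it, not the study's actual tooling.

```python
from html.parser import HTMLParser
from collections import Counter

class MetaTally(HTMLParser):
    """Count <meta> schemas: description, keywords, and Dublin Core
    (names starting with 'dc.')."""
    def __init__(self):
        super().__init__()
        self.schemas = Counter()

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        # HTMLParser lowercases attribute names; values keep their case.
        name = dict(attrs).get("name", "").lower()
        if name.startswith("dc."):
            self.schemas["dublin core"] += 1
        elif name in ("description", "keywords"):
            self.schemas[name] += 1

def tally(html):
    parser = MetaTally()
    parser.feed(html)
    return parser.schemas
```

Running this over a crawl sample and dividing by the number of pages gives distribution figures like the 8.19% meta-tag usage the study reports.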

  4. Public Preferences Related to Radioactive Waste Management in the United States: Methodology and Response Reference Report for the 2016 Energy and Environment Survey.

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins-Smith, Hank C. [Univ. of Oklahoma, Norman, OK (United States); Silva, Carol L. [Univ. of Oklahoma, Norman, OK (United States); Gupta, Kuhika [Univ. of Oklahoma, Norman, OK (United States); Rechard, Robert P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    This report presents the questions and responses from a nationwide survey taken in June 2016 to track the preferences of US residents concerning the environment, energy, and radioactive waste management. A focus of the 2016 survey is public perception of different options for managing spent nuclear fuel, including on-site storage, interim storage, deep boreholes, general-purpose geologic repositories, and geologic repositories for defense-related waste only. Highlights of the survey results include the following: (1) public attention to the 2011 accident and subsequent cleanup at the Fukushima nuclear facility continues to influence the perceived balance of risk and benefit for nuclear energy; (2) the incident at the Waste Isolation Pilot Plant in 2014 could influence future public support for nuclear waste management; (3) public knowledge about US nuclear waste management policies has remained higher than seen prior to the Fukushima nuclear accident and submittal of the Yucca Mountain application; (4) support for a mined disposal facility is higher than for deep borehole disposal, building one or more interim storage facilities, or continued on-site storage of spent nuclear fuel; (5) support for a repository that comingles commercial and defense-related waste is higher than for a repository for defense-related waste only; (6) the levels of trust accorded to the National Academies, university scientists, and local emergency responders are the highest, and the levels accorded to advocacy organizations, public utilities, and the local/national press are the lowest; and (7) the public is willing to serve on citizens' panels but, in general, will engage only modestly in issues related to radioactive waste management.

  5. Basic Farmland Mapping Methodology in Rural Land Survey

    Institute of Scientific and Technical Information of China (English)

    钱小龙

    2014-01-01

    Land is the material basis of human existence and social development, and cultivated land is the most fundamental and irreplaceable means of agricultural production. The basic farmland survey is the most important part of the rural land survey, and basic farmland mapping is of great significance for government departments in formulating land policy, optimizing land structure, protecting cultivated land, and improving its quality. This paper describes the basic farmland mapping method used in the rural land survey in terms of the concept of basic farmland, the technical workflow of basic farmland mapping, the mapping of basic farmland protection blocks, map compilation, and data aggregation.

  6. A survey of attitudes and factors associated with successful cardiopulmonary resuscitation (CPR) knowledge transfer in an older population most likely to witness cardiac arrest: design and methodology

    Directory of Open Access Journals (Sweden)

    Brehaut Jamie C

    2008-11-01

    Full Text Available Abstract Background Overall survival rates for out-of-hospital cardiac arrest rarely exceed 5%. While bystander cardiopulmonary resuscitation (CPR) can increase survival for cardiac arrest victims by up to four times, bystander CPR rates remain low in Canada (15%). Most cardiac arrest victims are men in their sixties; they usually collapse in their own home (85%) and the event is witnessed 50% of the time. These statistics would appear to support a strategy of targeted CPR training for an older population that is most likely to witness a cardiac arrest event. However, interest in CPR training appears to decrease with advancing age. Behaviour surrounding CPR training and performance has never been studied using well-validated behavioural theories. Methods/Design The overall goal of this study is to conduct a survey to better understand the behavioural factors influencing CPR training and performance in men and women 55 years of age and older. The study will proceed in three phases. In phase one, semi-structured qualitative interviews will be conducted and recorded to identify common categories and themes regarding seeking CPR training and providing CPR to a cardiac arrest victim. The themes identified in the first phase will be used in phase two to develop, pilot-test, and refine a survey instrument based upon the Theory of Planned Behaviour. In the third phase of the project, the final survey will be administered to a sample of the study population over the telephone. Analyses will include measures of sampling bias, reliability of the measures, construct validity, as well as multiple regression analyses to identify constructs and beliefs most salient to seniors' decisions about whether to attend CPR classes or perform CPR on a cardiac arrest victim. Discussion The results of this survey will provide valuable insight into factors influencing the interest in CPR training and performance among a targeted group of individuals most susceptible to

  7. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  8. Multifractal methodology

    CERN Document Server

    Salat, Hadrien; Arcaute, Elsa

    2016-01-01

    Various methods have been developed independently to study the multifractality of measures in many different contexts. Although they all convey the same intuitive idea of giving a "dimension" to sets where a quantity scales similarly within a space, they are not necessarily equivalent on a more rigorous level. This review article aims at unifying the multifractal methodology by presenting the multifractal theoretical framework and principal practical methods, namely the moment method, the histogram method, multifractal detrended fluctuation analysis (MDFA) and modulus maxima wavelet transform (MMWT), with a comparative and interpretative eye.
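
    The moment method named above can be sketched numerically: partition a normalised measure into boxes of size ε, form the partition function Z(q, ε) = Σ μ_i(ε)^q, fit log Z against log ε to get the mass exponent τ(q), and obtain the generalized dimension D_q = τ(q)/(q − 1). A minimal sketch follows; the function name and the binomial-cascade test measure are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def moment_method(measure, box_sizes, qs):
        """Moment (box-counting) method on a 1-D normalised measure:
        for each box size s, sum box masses to the power q, then fit
        log Z(q, eps) ~ tau(q) * log eps to estimate the mass exponents."""
        n = len(measure)
        taus = []
        for q in qs:
            logZ, logeps = [], []
            for s in box_sizes:
                boxes = measure[: n - n % s].reshape(-1, s).sum(axis=1)
                boxes = boxes[boxes > 0]           # empty boxes carry no mass
                logZ.append(np.log((boxes ** q).sum()))
                logeps.append(np.log(s / n))
            taus.append(np.polyfit(logeps, logZ, 1)[0])
        return np.array(taus)

    # binomial cascade with p = 0.7: a standard multifractal test measure
    m = np.array([1.0])
    for _ in range(10):
        m = np.concatenate([0.7 * m, 0.3 * m])
    m /= m.sum()
    qs = np.array([2.0])
    tau2 = moment_method(m, [2, 4, 8, 16, 32], qs)[0]
    D2 = tau2 / (qs[0] - 1)                        # correlation dimension D_2
    ```

    For this cascade the exact exponent is τ(2) = −log₂(p² + (1 − p)²) ≈ 0.79, so the fitted D₂ should land close to that value.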

  9. [National survey of transfusion practices in the neonatal period for the development of recommendations based on the "Haute Autorité de Santé methodology"].

    Science.gov (United States)

    Wibaut, B; Saliba, E; Rakza, T; Lassale, B; Hubert, H; Wiel, E

    2012-11-01

    Although transfusion practices have changed these last years, the neonatal period remains one period when the transfusion of blood components (in particular in red blood cells concentrates) is frequent, particularly for low birth weight premature babies. It is thus important to know well the pathophysiological characteristics specific to this age of life in order to reduce the risks of transfusion and to allow an optimal effectiveness of this treatment. Various studies on neonatal transfusion show that transfusion practices during the neonatal period are very heterogeneous from a team to another, and even within the same team. Therefore, we wanted to know the practices in France, by addressing a questionnaire to neonatology centres, in collaboration with the French Society Vigilance and Transfusion Therapy and the French Society of Neonatology (SFN). The results obtained confirm the heterogeneity of practices. To follow up on this study, we constituted a working group, in partnership with the SFN, the SFVTT and the EFS, with an aim of proposing good practice recommendations according to the methodology of the French "High Authority for Health", in order to homogenize at the national level transfusion practices of the new-born baby.

  10. The Hubble Space Telescope Medium Deep Survey with the Wide Field and Planetary Camera. 1: Methodology and results on the field near 3C 273

    Science.gov (United States)

    Griffiths, R. E.; Ratnatunga, K. U.; Neuschaefer, L. W.; Casertano, S.; Im, M.; Wyckoff, E. W.; Ellis, R. S.; Gilmore, G. F.; Elson, R. A. W.; Glazebrook, K.

    1994-01-01

    We present results from the Medium Deep Survey (MDS), a Key Project using the Hubble Space Telescope (HST). Wide Field Camera (WFC) images of random fields have been taken in 'parallel mode' with an effective resolution of 0.2 sec full width at half maximum (FWHM) in the V(F555W) and I(F785LP) filters. The exposures presented here were targeted on a field away from 3C 273, and resulted in approximately 5 hr integration time in each filter. Detailed morphological structure is seen in galaxy images with total integrated magnitudes down to V ≈ 22.5 and I ≈ 21.5. Parameters are estimated that best fit the observed galaxy images, and 143 objects are identified (including 23 stars) in the field to a fainter limiting magnitude of I ≈ 23.5. We outline the extragalactic goals of the HST Medium Deep Survey, summarize our basic data reduction procedures, and present number (magnitude) counts, a color-magnitude diagram for the field, surface brightness profiles for the brighter galaxies, and best-fit half-light radii for the fainter galaxies as a function of apparent magnitude. A median galaxy half-light radius of 0.4 sec is measured, and the distribution of galaxy sizes versus magnitude is presented. We observe an apparent deficit of galaxies with half-light radii between approximately 0.6 sec and 1.5 sec, with respect to standard no-evolution or mild evolution cosmological models. An apparent excess of compact objects (half-light radii ≈ 0.1 sec) is also observed with respect to those models. Finally, we find a small excess in the number of faint galaxy pairs and groups with respect to a random low-redshift field sample.

  11. Jean-Pierre Pages: construction of a constructivist paradigm of risk perceptions

    Energy Technology Data Exchange (ETDEWEB)

    Pages, J.P. [Association pour l'Etude des Structures de l'Opinion, 75 - Paris (France); Poumadere, M. [GRID-Ecole Normale Superieure de Cachan, 94 (France)

    1998-07-01

    This paper is the transcription of an interview following the granting of the Society for Risk Analysis-Europe Distinguished Scientist Award to Jean-Pierre Pages. This scientist has set a very high standard for opinion research within the nuclear establishment in France. The interview closed the conference on risk analysis, which took place in Paris on 11-14 October 1999. It addresses the goals and contributions of a questionnaire survey concerning the ''acceptable risks of nuclear energy''. Cost-benefit analysis, public opinion and information, and the impact of risk assessments on decision makers are discussed. (A.L.B.)

  12. The uses and gratifications of online care pages: a study of CaringBridge.

    Science.gov (United States)

    Anderson, Isolde K

    2011-09-01

    This study investigated how online care pages help people connect with others and gain social support during a health care event. It reports the results of a survey of 1035 CaringBridge authors who set up personalized web pages because of hospitalization, serious illness, or other reasons, regarding the uses and gratifications obtained from their sites. Four primary benefits were found to be important to all authors of CaringBridge sites: providing information, receiving encouragement from messages, convenience, and psychological support. Hierarchical multiple regression revealed significant effects for six demographic and health-related variables: gender, age, religiosity, Internet usage, the purpose for which the site was set up, and sufficiency of information received from health care providers. Support was obtained for the perspective that online care pages provide new media gratifications for authors, and that health-related antecedents of media use may affect media selection and gratifications. The implications of this study for communication researchers and support services like CaringBridge are also discussed.

  13. Give your feedback on the new Users’ page

    CERN Multimedia

    CERN Bulletin

    If you haven't already done so, visit the new Users’ page and provide the Communications group with your feedback. You can do this quickly and easily via an online form. A dedicated web steering group will design the future page on the basis of your comments. As a first step towards reforming the CERN website, the Communications group is proposing a ‘beta’ version of the Users’ pages. The primary aim of this version is to improve the visibility of key news items, events and announcements to the CERN community. The beta version is very much work in progress: your input is needed to make sure that the final site meets the needs of CERN’s wide and mixed community. The Communications group will read all your comments and suggestions, and will establish a web steering group that will make sure that the future CERN web pages match the needs of the community. More information on this process, including the gradual 'retirement' of the grey Users' pages we are a...

  14. Automatic comic page image understanding based on edge segment analysis

    Science.gov (United States)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.

  15. Network and User-Perceived Performance of Web Page Retrievals

    Science.gov (United States)

    Kruse, Hans; Allman, Mark; Mallasch, Paul

    1998-01-01

    The development of the HTTP protocol has been driven by the need to improve its network performance by allowing the efficient retrieval of multiple parts of a web page without the need for multiple simultaneous TCP connections between a client and a server. We suggest that the retrieval of multiple page elements sequentially over a single TCP connection may result in a degradation of the performance perceived by the user. We attempt to quantify this perceived degradation through the use of a model which combines a web retrieval simulation and an analytical model of TCP operation. Starting with the current HTTP/1.1 specification, we first suggest a client-side heuristic to improve the perceived transfer performance. We show that the perceived speed of the page retrieval can be increased without sacrificing data transfer efficiency. We then propose a new client/server extension to the HTTP/1.1 protocol to allow for the interleaving of page element retrievals. We finally address the issue of the display of advertisements on web pages, and in particular suggest a number of mechanisms which can make efficient use of IP multicast to send advertisements to a number of clients within the same network.

  16. Relating Web pages to enable information-gathering tasks

    CERN Document Server

    Bagchi, Amitabha

    2008-01-01

    We argue that relationships between Web pages are functions of the user's intent. We identify a class of Web tasks - information-gathering - that can be facilitated by a search engine that provides links to pages which are related to the page the user is currently viewing. We define three kinds of intentional relationships that correspond to whether the user is a) seeking sources of information, b) reading pages which provide information, or c) surfing through pages as part of an extended information-gathering process. We show that these three relationships can be productively mined using a combination of textual and link information and provide three scoring mechanisms that correspond to them: SeekRel, FactRel and SurfRel. These scoring mechanisms incorporate both textual and link information. We build a set of capacitated subnetworks - each corresponding to a particular keyword - that mirror the interconnection structure of the World Wide Web. The scores are computed by computing flows on ...

  17. National Health Survey 2000: design and methodology

    Directory of Open Access Journals (Sweden)

    Jaime Sepúlveda

    2007-01-01

    Full Text Available OBJECTIVE: As part of the System of National Health Surveys, the Mexican National Health Survey (NHS 2000) was conducted during the last months of 1999 and the first three months of 2000. It studied the accessibility, quality, utilization, and coverage of health services; in addition, serological markers were updated for vaccine-preventable infectious diseases, sexually transmitted infections, and hepatitis. MATERIAL AND METHODS: For the NHS 2000, three age groups and users of health services were selected. Information was gathered through direct interviews, and biological samples were taken for clinical tests and for measuring biological and somatometric parameters. The sampling design of the NHS 2000 was probabilistic, multistage, stratified, and clustered. The sample size was 1,470 households per state, for a national total of 47,040 households; the expansion factors were adjusted for non-response and post-stratification. Field staff were trained and standardized to maintain a high response rate, especially for the blood samples. RESULTS: In total, 83,157 blood samples were obtained out of the 94,000 expected (an 88% response rate); they were kept refrigerated until delivered to the laboratory of the Instituto Nacional de Salud Pública, where four aliquots were prepared and frozen at -150 °C until analysis.

  18. Methodology improvements in China mental disorder epidemiological surveys from 1950 to 2008

    Institute of Scientific and Technical Information of China (English)

    钟宝亮; 张建芳; 何民富; 黄悦勤; 陈红辉

    2010-01-01

    Objective To assess the advances of methodology in China mental disorder epidemiological surveys from 1950 to 2008. Methods Using a bibliometric approach, English databases (PubMed, EMBASE) and Chinese databases (Chinese National Knowledge Infrastructure, Chinese Biomedical Database Disc and Wanfang Database), related reports, and the reference lists of literature reviews were comprehensively searched for the period January 1, 1950 to December 22, 2008; the Chinese Journal of Neurology & Psychiatry (1955 to 1996) was also searched by hand, and large-scale regional mental disorder epidemiological surveys were collected. Taking time as the organizing thread, the methodological characteristics of study design, sample size calculation, sampling method, diagnostic evaluation tools, field work quality control, etc., were analyzed. Results In total 39 surveys were identified, including one epidemiological survey of mental disorders among children and adolescents. All surveys from 1958 to 1981 were censuses without diagnostic evaluation tools. Comparing the surveys from 1982 to 2000 with those from 2001 to 2008, 0% (0/16) versus 25.0% (4/16) of the studies reported the sample size estimation method, 0% (0/20) versus 12.5% (2/16) reported the implementation of informed consent, and 5.0% (1/20) versus 31.2% (5/16) reported the adoption of field work quality control measures. Conclusions The methodological quality has been improving in recent years, but the overall methodological level remains uneven. More attention must be paid to methodological design, and there is a pressing need to conduct mental disorder epidemiological surveys among children and adolescents.

  19. A Methodological Approach to Assessing the Health Impact of Environmental Chemical Mixtures: PCBs and Hypertension in the National Health and Nutrition Examination Survey

    Directory of Open Access Journals (Sweden)

    Paul White

    2011-11-01

    Full Text Available We describe an approach to examine the association between exposure to chemical mixtures and a health outcome, using as our case study polychlorinated biphenyls (PCBs) and hypertension. The association between serum PCB and hypertension among participants in the 1999-2004 National Health and Nutrition Examination Survey was examined. First, unconditional multivariate logistic regression was used to estimate odds ratios and associated 95% confidence intervals. Next, correlation and multicollinearity among PCB congeners were evaluated, and clustering analyses were performed to determine groups of related congeners. Finally, a weighted sum was constructed to represent the relative importance of each congener in relation to hypertension risk. PCB serum concentrations varied by demographic characteristics, and were on average higher among those with hypertension. Logistic regression results showed mixed findings by congener and class. Further analyses identified groupings of correlated PCBs. Using a weighted sum approach to equalize different ranges and potencies, PCBs 66, 101, 118, 128 and 187 were significantly associated with increased risk of hypertension. Epidemiologic data were used to demonstrate an approach to evaluating the association between a complex environmental exposure and a health outcome. The complexity of analyzing a large number of related exposures, where each may have different potency and range, is addressed in the context of the association between hypertension risk and exposure to PCBs.
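
    The weighted-sum construction described above can be sketched generically: rescale each exposure to a common range so that different potencies and ranges are comparable, then weight each by the magnitude of its single-exposure effect estimate. This is only an illustration of the general technique, not the authors' exact procedure; the function name, synthetic data, and coefficients are all hypothetical.

    ```python
    import numpy as np

    def weighted_exposure_sum(X, beta):
        """Combine correlated exposures (columns of X, e.g. congener
        concentrations) into one index: min-max rescale each column to
        [0, 1], then weight by the magnitude of its single-exposure
        regression coefficient (weights normalised to sum to 1)."""
        Xs = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
        w = np.abs(beta) / np.abs(beta).sum()
        return Xs @ w

    rng = np.random.default_rng(0)
    X = rng.lognormal(size=(100, 5))               # synthetic congener levels
    beta = np.array([0.4, 0.1, 0.3, 0.05, 0.15])   # hypothetical coefficients
    idx = weighted_exposure_sum(X, beta)           # one summary exposure per subject
    ```

    Because the weights sum to 1 and each rescaled column lies in [0, 1], the resulting index is itself bounded in [0, 1], which makes it usable as a single covariate in a subsequent regression on the health outcome.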

  20. Integrated Methodologies for the 3D Survey and the Structural Monitoring of Industrial Archaeology: The Case of the Casalecchio di Reno Sluice, Italy

    Directory of Open Access Journals (Sweden)

    Gabriele Bitelli

    2011-01-01

    Full Text Available The paper presents an example of integrated surveying and monitoring activities for the control of an ancient structure, the Casalecchio di Reno sluice, located near Bologna, Italy. Several geomatic techniques were applied (classical topography, high-precision spirit levelling, terrestrial laser scanning, digital close-range photogrammetry, and thermal imagery. All these measurements were put together in a unique reference system and used in order to study the stability and the movements of the structure over the period of time observed. Moreover, the metrical investigations allowed the creation of a 3D model of the structure, and the comparison between two situations, before and after the serious damages suffered by the sluice during the winter season 2008-2009. Along with the detailed investigations performed on individual portions of the structure, an analysis of the whole sluice, carried out at a regional scale, was done via the use of aerial photogrammetry, using both recently acquired images and historical photogrammetric coverage. The measurements were carried out as part of a major consolidation and restoration activity, carried out by the “Consorzio della Chiusa di Casalecchio e del Canale di Reno”.

  1. Twelve-months prevalence of mental disorders in the German Health Interview and Examination Survey for Adults - Mental Health Module (DEGS1-MH): a methodological addendum and correction.

    Science.gov (United States)

    Jacobi, Frank; Höfler, Michael; Strehle, Jens; Mack, Simon; Gerschler, Anja; Scholl, Lucie; Busch, Markus A; Hapke, Ulfert; Maske, Ulrike; Seiffert, Ingeburg; Gaebel, Wolfgang; Maier, Wolfgang; Wagner, Michael; Zielasek, Jürgen; Wittchen, Hans-Ulrich

    2015-12-01

    We recently published findings in this journal on the prevalence of mental disorders from the German Health Interview and Examination Survey for Adults Mental Health Module (DEGS1-MH). The DEGS1-MH paper was also meant to be the major reference publication for this large-scale German study program, allowing future users of the data set to understand how the study was conducted and analyzed. Thus, toward this goal, the highest standards regarding transparency, consistency and reproducibility should be applied. After publication, unfortunately, the need for an addendum and corrigendum became apparent due to changes in the eligible reference sample and corresponding corrections of the imputed data. As a consequence the sample description, sample size and some prevalence data needed amendments. Additionally, we identified a coding error in the algorithm for major depression that had a significant effect on the prevalence estimates of depression and associated conditions. This addendum and corrigendum highlights all changes and presents the corrected prevalence tables. Copyright © 2015 John Wiley & Sons, Ltd.

  2. A methodological approach to assessing the health impact of environmental chemical mixtures: PCBs and hypertension in the National Health and Nutrition Examination Survey.

    Science.gov (United States)

    Yorita Christensen, Krista L; White, Paul

    2011-11-01

    We describe an approach to examine the association between exposure to chemical mixtures and a health outcome, using as our case study polychlorinated biphenyls (PCBs) and hypertension. The association between serum PCB and hypertension among participants in the 1999-2004 National Health and Nutrition Examination Survey was examined. First, unconditional multivariate logistic regression was used to estimate odds ratios and associated 95% confidence intervals. Next, correlation and multicollinearity among PCB congeners were evaluated, and clustering analyses were performed to determine groups of related congeners. Finally, a weighted sum was constructed to represent the relative importance of each congener in relation to hypertension risk. PCB serum concentrations varied by demographic characteristics, and were on average higher among those with hypertension. Logistic regression results showed mixed findings by congener and class. Further analyses identified groupings of correlated PCBs. Using a weighted sum approach to equalize different ranges and potencies, PCBs 66, 101, 118, 128 and 187 were significantly associated with increased risk of hypertension. Epidemiologic data were used to demonstrate an approach to evaluating the association between a complex environmental exposure and a health outcome. The complexity of analyzing a large number of related exposures, where each may have different potency and range, is addressed in the context of the association between hypertension risk and exposure to PCBs.

  3. Research Methodology

    CERN Document Server

    Rajasekar, S; Philomination, P

    2006-01-01

    In this manuscript, various components of research are listed and briefly discussed. The topics considered cover a part of the research methodology syllabus of Master of Philosophy (M.Phil.) and Doctor of Philosophy (Ph.D.) courses. The manuscript is intended for students and research scholars of science subjects such as mathematics, physics, chemistry, statistics, biology and computer science. Various stages of research are discussed in detail. Special care has been taken to motivate young researchers to take up challenging problems. Ten assignments are given. For the benefit of young researchers, a short interview with three eminent scientists is included at the end of the manuscript.

  4. A teen's guide to creating web pages and blogs

    CERN Document Server

    Selfridge, Peter; Osburn, Jennifer

    2008-01-01

    Whether using a social networking site like MySpace or Facebook or building a Web page from scratch, millions of teens are actively creating a vibrant part of the Internet. This is the definitive teen's guide to publishing exciting web pages and blogs on the Web. This easy-to-follow guide shows teenagers how to: Create great MySpace and Facebook pages Build their own unique, personalized Web site Share the latest news with exciting blogging ideas Protect themselves online with cyber-safety tips Written by a teenager for other teens, this book leads readers step-by-step through the basics of web and blog design. In this book, teens learn to go beyond clicking through web sites to learning winning strategies for web design and great ideas for writing blogs that attract attention and readership.

  5. Applying weighted PageRank to author citation networks

    CERN Document Server

    Ding, Ying

    2011-01-01

    This paper aims to identify whether different weighted PageRank algorithms can be applied to author citation networks to measure the popularity and prestige of a scholar from a citation perspective. Information Retrieval (IR) was selected as a test field, and data from 1956-2008 were collected from the Web of Science (WOS). Weighted PageRank, with citation and publication counts as weight vectors, was calculated on author citation networks. The results indicate that both popularity rank and prestige rank were highly correlated with weighted PageRank. Principal Component Analysis (PCA) was conducted to detect relationships among these different measures. For capturing prize winners within the IR field, prestige rank outperformed all the other measures.
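
    A weighted PageRank of the kind described can be sketched as a standard power iteration in which the teleportation vector is biased by an external weight such as citation or publication counts. This is a generic sketch of the technique, not the paper's exact algorithm; the toy network and counts below are hypothetical.

    ```python
    import numpy as np

    def weighted_pagerank(adj, weights, d=0.85, tol=1e-10, max_iter=200):
        """Power iteration for PageRank where teleportation mass is
        distributed in proportion to an external weight vector
        (e.g. publication counts) instead of uniformly."""
        n = adj.shape[0]
        w = weights / weights.sum()        # normalised teleport distribution
        out = adj.sum(axis=1)
        M = np.zeros((n, n))               # column-stochastic transition matrix
        for j in range(n):
            # dangling nodes (no out-links) jump according to the weights
            M[:, j] = adj[j] / out[j] if out[j] > 0 else w
        r = np.full(n, 1.0 / n)
        for _ in range(max_iter):
            r_new = (1 - d) * w + d * M @ r
            if np.abs(r_new - r).sum() < tol:
                break
            r = r_new
        return r_new

    # toy author-citation network: 0 cites 1 and 2, 1 cites 2, 2 cites 0
    A = np.array([[0, 1, 1],
                  [0, 0, 1],
                  [1, 0, 0]], dtype=float)
    pubs = np.array([3.0, 1.0, 2.0])       # hypothetical publication counts
    print(weighted_pagerank(A, pubs))
    ```

    Because the transition matrix is column-stochastic and the teleport vector sums to one, the returned scores remain a probability distribution over authors.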

  6. A Novel Approach for Web Page Set Mining

    CERN Document Server

    Geeta, R B; Totad, Shasikumar G; D, Prasad Reddy P V G

    2011-01-01

    One of the most time-consuming steps in association rule mining is computing the frequency of occurrence of itemsets in the database. The hash table index approach converts a transaction database to a hash index tree by scanning the transaction database only once. Whenever a user requests any Uniform Resource Locator (URL), the request entry is stored in the server's Log File. This paper presents the hash index table structure, a general and dense structure which provides web page set extraction from the server's Log File. This hash table provides information about the original database. Web Page set mining (WPs-Mine) provides a complete representation of the original database. This approach works well for both sparse and dense data distributions. Web page set mining supported by a hash table index shows performance always comparable with, and often better than, algorithms accessing data on flat files. Incremental update is feasible without reaccessing the original transactional databa...
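The single-scan frequency counting that the abstract identifies as the bottleneck can be sketched with an ordinary hash table. This is a toy illustration in the spirit of, but not identical to, the paper's hash index tree; the transaction log and the fixed itemset size are invented.

```python
from itertools import combinations
from collections import Counter

def itemset_counts(transactions, size=2):
    """Count occurrences of all itemsets of a given size in one scan."""
    counts = Counter()
    for t in transactions:                       # single pass over the log
        for combo in combinations(sorted(set(t)), size):
            counts[combo] += 1                   # hash-table frequency update
    return counts

log = [["a", "b", "c"], ["a", "c"], ["b", "c"]]
print(itemset_counts(log)[("a", "c")])  # prints 2
```

Sorting each transaction's items gives a canonical key, so `{a, c}` and `{c, a}` hash to the same counter entry.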

  7. Intelligent Paging Strategy for Multi-Carrier CDMA System

    CERN Document Server

    Mostafa, Sheikh Shanawaz; Amin, Md Ziaul; Ahmad, Mohiuddin

    2011-01-01

    Subscriber satisfaction and maximum radio resource utilization are the pivotal criteria in communication system design. In a multi-carrier CDMA system, different paging algorithms are used to locate a user within the shortest possible time and with the best possible utilization of radio resources. Different paging algorithms emphasize different techniques for different purposes. However, the low servicing time of sequential search and the better radio resource utilization of concurrent search can be exploited simultaneously by swapping between the algorithms. In this paper, an intelligent mechanism has been developed for dynamic algorithm assignment based on time-varying traffic demand, which is predicted by a radial basis neural network; its performance has been analyzed based on the prediction efficiency for different types of data. High prediction efficiency is observed, with a good correlation coefficient (0.99), and subsequently better performance is achieved by dynamic paging algorithm assignment. This claim is sub...

  8. Facebook pages as ’demo versions’ of issue publics

    DEFF Research Database (Denmark)

    Birkbak, Andreas

    In this paper, I examine the use of Facebook pages in a recent controversy that resulted in the non-actualization of the so-called ’payment ring’ to be built around Copenhagen to curb congestion. I argue that if we swap the distinction between public and private interest on which the institutions...... of representative democracy are founded with a distinction between direct and indirect consequences of action (Dewey 1927), Facebook can be understood as an experimental issue public-generating device. In the payment ring controversy, several Facebook pages became spaces of ’demonstration’ in three senses...... of the word (Barry 2001). First, Facebook pages became used as a device for opposing the government’s plans to actualize the payment ring object. For this purpose, classic features of protests were adopted, including slogans denouncing the decision-makers, the mobilization of peers, and the attempt to show...

  9. Experimental Results on Statistical Approaches to Page Replacement Policies

    Energy Technology Data Exchange (ETDEWEB)

    LEUNG,VITUS J.; IRANI,SANDY

    2000-12-08

    This paper investigates the question of what statistical information about a memory request sequence is useful in making page replacement decisions. Our starting point is the Markov Request Model for page request sequences. Although the utility of modeling page request sequences by the Markov model has recently been put into doubt, we find that two previously suggested algorithms (Maximum Hitting Time and Dominating Distribution) which are based on the Markov model work well on the trace data used in this study. Interestingly, both of these algorithms perform equally well despite the fact that the theoretical results for these two algorithms differ dramatically. We then develop succinct characteristics of memory access patterns in an attempt to approximate the simpler of the two algorithms. Finally, we investigate how to collect these characteristics in an online manner in order to have a purely online algorithm.
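As a baseline for page replacement policies like those studied here, counting the faults that plain LRU incurs on a request trace can be sketched as follows. This is a toy illustration, not the paper's Maximum Hitting Time or Dominating Distribution algorithms; the trace and cache size are invented.

```python
from collections import OrderedDict

def lru_faults(trace, cache_size):
    """Count page faults for LRU on a sequence of page requests."""
    cache = OrderedDict()                 # keys kept in recency order, oldest first
    faults = 0
    for page in trace:
        if page in cache:
            cache.move_to_end(page)       # hit: refresh recency
        else:
            faults += 1                   # miss: fetch the page
            if len(cache) >= cache_size:
                cache.popitem(last=False) # evict the least recently used page
            cache[page] = None
    return faults

print(lru_faults([1, 2, 3, 1, 4, 1, 2], cache_size=3))  # prints 5
```

A fault counter like this is the usual yardstick for comparing traces against statistically informed policies such as the two Markov-based algorithms above.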

  11. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  12. Methodological advances

    Directory of Open Access Journals (Sweden)

    Lebreton, J.-D.

    2004-06-01

    Full Text Available The study of population dynamics has long depended on methodological progress. Among many striking examples, continuous time models for populations structured in age (Sharpe & Lotka, 1911) were made possible by progress in the mathematics of integral equations. Therefore the relationship between population ecology and mathematical and statistical modelling in the broad sense raises a challenge in interdisciplinary research. After the impetus given in particular by Seber (1982), the regular biennial EURING conferences became a major vehicle to achieve this goal. It is thus not surprising that EURING 2003 included a session entitled “Methodological advances”. Even if at risk of heterogeneity in the topics covered and of overlap with other sessions, such a session was a logical way of ensuring that recent and exciting new developments were made available for discussion, further development by biometricians, and use by population biologists. The topics covered included several to which full sessions were devoted at EURING 2000 (Anderson, 2001), such as individual covariates, Bayesian methods, and multi-state models. Some other topics (heterogeneity models, exploited populations and integrated modelling) had been addressed by contributed talks or posters. Their presence among “methodological advances”, as well as in other sessions of EURING 2003, was intended as a response to their rapid development and potential relevance to biological questions. We briefly review all talks here, including those not published in the proceedings. In the plenary talk, Pradel et al. (in prep.) developed GOF tests for multi-state models. Until recently, the only goodness-of-fit procedures for multistate models were ad hoc and non-optimal, involving use of standard tests for single state models (Lebreton & Pradel, 2002). Pradel et al. (2003) proposed a general approach based in particular on mixtures of multinomial distributions. Pradel et al. (in prep.) showed

  13. PageRank model of opinion formation on Ulam networks

    CERN Document Server

    Chakhmakhchyan, L

    2013-01-01

    We consider a PageRank model of opinion formation on Ulam networks, generated by the intermittency map and the typical Chirikov map. The Ulam networks generated by these maps have certain similarities with scale-free networks such as the World Wide Web (WWW), showing an algebraic decay of the PageRank probability. We find that the opinion formation process on Ulam networks has certain similarities to, but also distinct features from, that on the WWW. We attribute these distinctions to internal differences in the network structure of the Ulam and WWW networks. We also analyze the process of opinion formation in the frame of the generalized Sznajd model, which protects the opinion of small communities.

  14. Key-phrase based classification of public health web pages.

    Science.gov (United States)

    Dolamic, Ljiljana; Boyer, Célia

    2013-01-01

    This paper describes and evaluates a public health web page classification model based on key phrase extraction and matching. Easily extendable both to new classes and to new languages, this method proves to be a good solution for text classification in the face of a total lack of training data. To evaluate the proposed solution we used a small collection of public health related web pages created by double-blind manual classification. Our experiments have shown that by choosing an adequate threshold value, the desired value for either precision or recall can be achieved.
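Key-phrase matching with a tunable threshold, as described above, might look roughly like the following sketch. The classes, phrases, and scoring rule are invented for illustration and are not the paper's model; raising the threshold trades recall for precision, as the abstract suggests.

```python
def classify(text, class_phrases, threshold=0.2):
    """Assign the class whose key phrases match the text best, or None."""
    text = text.lower()
    scores = {}
    for cls, phrases in class_phrases.items():
        hits = sum(1 for p in phrases if p in text)   # phrase matching
        scores[cls] = hits / len(phrases)             # fraction of phrases found
    best, score = max(scores.items(), key=lambda kv: kv[1])
    return best if score >= threshold else None       # threshold gate

classes = {
    "nutrition": ["healthy diet", "vitamin", "calorie"],
    "vaccination": ["vaccine", "immunization", "booster dose"],
}
print(classify("Getting a vaccine and booster dose is recommended.", classes))
```

Because no training data is required, adding a class is just adding a new phrase list, which mirrors the extensibility claim in the abstract.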

  15. Enriching the trustworthiness of health-related web pages.

    Science.gov (United States)

    Gaudinat, Arnaud; Cruchet, Sarah; Boyer, Celia; Chrawdhry, Pravir

    2011-06-01

    We present an experimental mechanism for enriching web content with quality metadata. This mechanism is based on a simple and well-known initiative in the field of the health-related web, the HONcode. The Resource Description Framework (RDF) format and the Dublin Core Metadata Element Set were used to formalize these metadata. The model of trust proposed is based on a quality model for health-related web pages that has been tested in practice over a period of thirteen years. Our model has been explored in the context of a project to develop a research tool that automatically detects the occurrence of quality criteria in health-related web pages.

  16. Recurrence of Acute Page Kidney in a Renal Transplant Allograft

    Directory of Open Access Journals (Sweden)

    Rajan Kapoor

    2016-01-01

    Full Text Available Acute Page Kidney (APK) phenomenon is a rare cause of secondary hypertension, mediated by activation of the renin-angiotensin-aldosterone system (RAAS). Timely intervention is of great importance to prevent any end organ damage from hypertension. We present a unique case of three episodes of APK in the same renal transplant allograft.

  17. Recurrence of Acute Page Kidney in a Renal Transplant Allograft.

    Science.gov (United States)

    Kapoor, Rajan; Zayas, Carlos; Mulloy, Laura; Jagadeesan, Muralidharan

    2016-01-01

    Acute Page Kidney (APK) phenomenon is a rare cause of secondary hypertension, mediated by activation of renin-angiotensin-aldosterone system (RAAS). Timely intervention is of great importance to prevent any end organ damage from hypertension. We present a unique case of three episodes of APK in the same renal transplant allograft.

  18. 47 CFR 22.503 - Paging geographic area authorizations.

    Science.gov (United States)

    2010-10-01

    ... in its sole discretion, the FCC determines that the public interest would be served by such replacement. (d) Filing windows. The FCC accepts applications for paging geographic area authorizations only during filing windows. The FCC issues Public Notices announcing in advance the dates of the filing...

  19. The Inquiry Page: Bringing Digital Libraries to Learners.

    Science.gov (United States)

    Bruce, Bertram C.; Bishop, Ann Peterson; Heidorn, P. Bryan; Lunsford, Karen J.; Poulakos, Steven; Won, Mihye

    2003-01-01

    Discusses digital library development, particularly a national science digital library, and describes the Inquiry Page which focuses on building a constructivist environment using Web resources, collaborative processes, and knowledge that bridges digital libraries with users in K-12 schools, museums, community groups, or other organizations. (LRW)

  20. The importance of prior probabilities for entry page search

    NARCIS (Netherlands)

    Kraaij, W.; Westerveld, T.; Hiemstra, D.

    2002-01-01

    An important class of internet search queries aims at finding the start page or 'entry page' of an organization. Searching for an entry page differs strongly from general or 'ad hoc' search. The results of a simple general-purpose search system are disappointing. In the report, …

  1. Recurrence of Acute Page Kidney in a Renal Transplant Allograft

    Science.gov (United States)

    Zayas, Carlos; Mulloy, Laura; Jagadeesan, Muralidharan

    2016-01-01

    Acute Page Kidney (APK) phenomenon is a rare cause of secondary hypertension, mediated by activation of renin-angiotensin-aldosterone system (RAAS). Timely intervention is of great importance to prevent any end organ damage from hypertension. We present a unique case of three episodes of APK in the same renal transplant allograft. PMID:27725836

  2. What Snippets Say About Pages in Federated Web Search

    NARCIS (Netherlands)

    Demeester, Thomas; Nguyen, Dong-Phuong; Trieschnigg, Dolf; Develder, Chris; Hiemstra, Djoerd; Hou, Yuexian; Nie, Jian-Yun; Sun, Le; Wang, Bo; Zhang, Peng

    2012-01-01

    What is the likelihood that a Web page is considered relevant to a query, given the relevance assessment of the corresponding snippet? Using a new federated IR test collection that contains search results from over a hundred search engines on the internet, we are able to investigate such research qu

  3. 48 CFR 804.1102 - Vendor Information Pages (VIP) Database.

    Science.gov (United States)

    2010-10-01

    ... (VIP) Database. 804.1102 Section 804.1102 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.1102 Vendor Information Pages (VIP) Database. Prior to January 1, 2012, all VOSBs and SDVOSBs must be listed in the VIP database, available at...

  4. Automatic Caption Localization for Photographs on World Wide Web Pages.

    Science.gov (United States)

    Rowe, Neil C.; Frew, Brian

    1998-01-01

    Explores the indirect method of locating for indexing the likely explicit and implicit captions of photographs, using multimodal clues including the specific words used, syntax, surrounding layout of the Web page, and general appearance of the associated image. The MARIE-3 system thus avoids full image processing and full natural-language…

  6. Exploring the Use of a Facebook Page in Anatomy Education

    Science.gov (United States)

    Jaffar, Akram Abood

    2014-01-01

    Facebook is the most popular social media site visited by university students on a daily basis. Consequently, Facebook is the logical place to start with for integrating social media technologies into education. This study explores how a faculty-administered Facebook Page can be used to supplement anatomy education beyond the traditional…

  7. RDFa Primer, Embedding Structured Data in Web Pages

    NARCIS (Netherlands)

    W3C, institution; Birbeck, M.; et al, not CWI

    2007-01-01

    Current Web pages, written in XHTML, contain inherent structured data: calendar events, contact information, photo captions, song titles, copyright licensing information, etc. When authors and publishers can express this data precisely, and when tools can read it robustly, a new world of user functi

  8. Building interactive simulations in a Web page design program.

    Science.gov (United States)

    Kootsey, J Mailen; Siriphongs, Daniel; McAuley, Grant

    2004-01-01

    A new Web software architecture, NumberLinX (NLX), has been integrated into a commercial Web design program to produce a drag-and-drop environment for building interactive simulations. NLX is a library of reusable objects written in Java, including input, output, calculation, and control objects. The NLX objects were added to the palette of available objects in the Web design program to be selected and dropped on a page. Inserting an object in a Web page is accomplished by adding a template block of HTML code to the page file. HTML parameters in the block must be set to user-supplied values, so the HTML code is generated dynamically, based on user entries in a popup form. Implementing the object inspector for each object permits the user to edit object attributes in a form window. Except for model definition, the combination of the NLX architecture and the Web design program permits construction of interactive simulation pages without writing or inspecting code.

  9. What Should Be On A School Library Web Page?

    Science.gov (United States)

    Baumbach, Donna; Brewer, Sally; Renfroe, Matt

    2004-01-01

    As varied as the schools and the communities they serve, so too are the Web pages for the library media programs that serve them. This article provides guidelines for effective web design and suggests information that might be included: reference resources, reference assistance, curriculum support, literacy advocacy, and dynamic material. An…

  10. A reverse engineering approach for automatic annotation of Web pages

    NARCIS (Netherlands)

    R. de Virgilio (Roberto); F. Frasincar (Flavius); W. Hop (Wim); S. Lachner (Stephan)

    2013-01-01

    The Semantic Web is gaining increasing interest to fulfill the need of sharing, retrieving, and reusing information. Since Web pages are designed to be read by people, not machines, searching and reusing information on the Web is a difficult task without human participation. To this aim

  11. The 'Don'ts' of Web Page Design.

    Science.gov (United States)

    Balas, Janet L.

    1999-01-01

    Discusses online resources that focus on what not to do in Web page design. "Don'ts" include: making any of the top 10 mistakes identified by Nielsen, qualifying for a "muddie" award for bad Web sites, forgetting to listen to users, and forgetting accessibility. A sidebar lists the Web site addresses for the nine resources…

  12. A Quantitative Comparison of Semantic Web Page Segmentation Approaches

    NARCIS (Netherlands)

    Kreuzer, Robert; Hage, J.; Feelders, A.J.

    2015-01-01

    We compare three known semantic web page segmentation algorithms, each serving as an example of a particular approach to the problem, and one self-developed algorithm, WebTerrain, that combines two of the approaches. We compare the performance of the four algorithms for a large benchmark of modern w

  15. Google's Web Page Ranking Applied to Different Topological Web Graph Structures.

    Science.gov (United States)

    Meghabghab, George

    2001-01-01

    This research, part of the ongoing study to better understand Web page ranking on the Web, looks at a Web page as a graph structure or Web graph, and classifies different Web graphs in the new coordinate space (out-degree, in-degree). Google's Web ranking algorithm (Brin & Page, 1998) on ranking Web pages is applied in this new coordinate…

  16. The "Pathological Gambling and Epidemiology" (PAGE) study program: design and fieldwork.

    Science.gov (United States)

    Meyer, Christian; Bischof, Anja; Westram, Anja; Jeske, Christine; de Brito, Susanna; Glorius, Sonja; Schön, Daniela; Porz, Sarah; Gürtler, Diana; Kastirke, Nadin; Hayer, Tobias; Jacobi, Frank; Lucht, Michael; Premper, Volker; Gilberg, Reiner; Hess, Doris; Bischof, Gallus; John, Ulrich; Rumpf, Hans-Jürgen

    2015-03-01

    The German federal states initiated the "Pathological Gambling and Epidemiology" (PAGE) program to evaluate the public health relevance of pathological gambling. The aim of PAGE was to estimate the prevalence of pathological gambling and cover its heterogeneous presentation in the population with respect to comorbid substance use and mental disorders, risk and protective factors, course aspects, treatment utilization, triggering and maintenance factors of remission, and biological markers. This paper describes the methodological details of the study and reports basic prevalence data. Two sampling frames (landline and mobile telephone numbers) were used to generate a random sample from the general population consisting of 15,023 individuals (ages 14 to 64) completing a telephone interview. Additionally, high-risk populations were approached in gambling locations, via media announcements, outpatient addiction services, debt counselors, probation assistants, self-help groups and specialized inpatient treatment facilities. The assessment included two steps: (1) a diagnostic interview comprising the gambling section of the Composite International Diagnostic Interview (CIDI) for case finding; (2) an in-depth clinical interview with participants reporting gambling problems. The in-depth clinical interview was completed by 594 participants, who were recruited from the general or high-risk populations. The program provides a rich epidemiological database which is available as a scientific use file.

  17. 2009 Survey of Gulf of Mexico Dockside Seafood Dealers

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This survey employed a two page, self-administered mail survey structured to collect economic and financial information from dockside seafood dealers who operated...

  18. XWRAPComposer: A Multi-Page Data Extraction Service for Bio-Computing Applications

    Energy Technology Data Exchange (ETDEWEB)

    Liu, L; Zhang, J; Han, W; Pu, C; Caverlee, J; Park, S; Critchlow, T; Coleman, M; Buttler, D

    2005-02-16

    This paper presents a service-oriented framework for the development of wrapper code generators, including the methodology of designing an effective wrapper program construction facility and a concrete implementation, called XWRAPComposer. Three unique features distinguish XWRAPComposer from existing wrapper development approaches. First, XWRAPComposer is designed to enable multi-stage and multi-page data extraction. Second, XWRAPComposer is the only wrapper generation system that promotes the distinction of information extraction logic from query-answer control logic, allowing a higher level of robustness against changes in the service provider's web site design or infrastructure. Third, XWRAPComposer provides a user-friendly plug-and-play interface, allowing seamless incorporation of external services and continuously changing service interfaces and data formats.

  19. Modified Weighted PageRank Algorithm using Time Spent on Links

    Directory of Open Access Journals (Sweden)

    Priyanka Bauddha

    2014-09-01

    Full Text Available With dynamic growth and increasing data on the web, it is very difficult for a user to find relevant information. Large numbers of pages are returned by a search engine in response to a user's query. Ranking algorithms have been developed to prioritize search results so that more relevant pages are displayed at the top. Various ranking algorithms based on web structure mining and web usage mining, such as PageRank, Weighted PageRank, PageRank with VOL and Weighted PageRank with VOL, have been developed, but they do not account for the time spent by the user on a particular web page. If a user spends more time on a web page, this signifies that the page is more relevant to the user. The proposed algorithm consolidates time spent on links with Weighted PageRank using Visit of Links.

  20. An Improved Approach to perform Crawling and avoid Duplicate Web Pages

    Directory of Open Access Journals (Sweden)

    Dhiraj Khurana

    2012-06-01

    Full Text Available When a web search is performed, it includes many duplicate web pages or websites, meaning a number of similar pages may be retrieved from different web servers. We propose a web crawling approach to detect and avoid duplicate or near-duplicate web pages. In this work we present a keyword-prioritization-based approach to identify web pages on the web. As such pages are identified, the web search is optimized.

  1. Validation of a Web Application by Using a Limited Number of Web Pages

    OpenAIRE

    Doru Anastasiu Popescu; Maria Catrinel Dănăuţă

    2012-01-01

    In this paper, we introduce a method of selecting some web pages from a web application, to be verified using different validating mechanisms. The number of selected web pages cannot be higher than a previously established constant. The method of selecting these web pages must ensure the highest possible quality of verification of the entire application. Error detection in these web pages will automatically lead to error detection in other pages. Thi...

  2. HTML Tags as Extraction Cues for Web Page Description Construction

    Directory of Open Access Journals (Sweden)

    Timothy C. Craven

    2003-01-01

    Full Text Available Using four previously identified samples of Web pages containing meta-tagged descriptions, the value of meta-tagged keywords, the first 200 characters of the body, and text marked with common HTML tags as extracts helpful for writing summaries was estimated by applying two measures: density of description words and density of two-word description phrases. Generally, titles and keywords showed the highest densities. Parts of the body showed densities not much different from the body as a whole: somewhat higher for the first 200 characters and for text tagged with "center" and "font"; somewhat lower for text tagged with "a"; not significantly different for "table" and "div". Evidence of non-random clumping of description words in the body of some pages nevertheless suggests that further pursuit of automatic passage extraction methods from the body may be worthwhile. Implications of the findings for aids to summarization, and specifically the TexNet32 package, are discussed.
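The density measure used above (description words per word of a candidate extract) might be computed along these lines. The tokenization and the sample strings are illustrative assumptions, not the study's exact procedure.

```python
def description_word_density(extract, description):
    """Fraction of the extract's words that also occur in the description."""
    desc_words = set(description.lower().split())
    words = extract.lower().split()
    return sum(w in desc_words for w in words) / len(words) if words else 0.0

print(description_word_density(
    "free recipes and cooking tips",
    "a site with recipes and cooking advice"))  # prints 0.6
```

Comparing this density for the title, keywords, first 200 characters, and each tagged region is what lets the study rank HTML tags as extraction cues.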

  3. Overhaul of CERN's top-level web pages

    CERN Multimedia

    2004-01-01

    The pages for CERN users and for the general public have been given a face-lift before they become operational on the central web servers later this month. You may already now inspect the new versions in their "waiting places" at: http://intranet.cern.ch/User/ and http://intranet.cern.ch/Public/ We hope you will like these improved versions and you can report errors and omissions in the usual way ("comments and change requests" link at the bottom of the pages). The new versions will replace the existing ones at the end of the month, so you do not need to change your bookmarks or start-up URL. ETT/EC/EX

  4. Digital libraries and World Wide Web sites and page persistence.

    Directory of Open Access Journals (Sweden)

    Wallace Koehler

    1999-01-01

    Full Text Available Web pages and Web sites, some argue, can either be collected as elements of digital or hybrid libraries, or, as others would have it, the WWW is itself a library. We begin with the assumption that Web pages and Web sites can be collected and categorized. The paper explores the proposition that the WWW constitutes a library. We conclude that the Web is not a digital library. However, its component parts can be aggregated and included as parts of digital library collections. These, in turn, can be incorporated into "hybrid libraries." These are libraries with both traditional and digital collections. Material on the Web can be organized and managed. Native documents can be collected in situ, disseminated, distributed, catalogued, indexed, controlled, in traditional library fashion. The Web therefore is not a library, but material for library collections is selected from the Web. That said, the Web and its component parts are dynamic. Web documents undergo two kinds of change. The first type, the type addressed in this paper, is "persistence" or the existence or disappearance of Web pages and sites, or in a word the lifecycle of Web documents. "Intermittence" is a variant of persistence, and is defined as the disappearance but reappearance of Web documents. At any given time, about five percent of Web pages are intermittent, which is to say they are gone but will return. Over time a Web collection erodes. Based on a 120-week longitudinal study of a sample of Web documents, it appears that the half-life of a Web page is somewhat less than two years and the half-life of a Web site is somewhat more than two years. That is to say, an unweeded Web document collection created two years ago would contain the same number of URLs, but only half of those URLs point to content. The second type of change Web documents experience is change in Web page or Web site content. Again based on the Web document samples, very nearly all Web pages and sites undergo some

  5. The relative worst order ratio applied to paging

    DEFF Research Database (Denmark)

    Boyar, Joan; Favrholdt, Lene Monrad; Larsen, Kim Skak

    2007-01-01

    The relative worst order ratio, a new measure for the quality of on-line algorithms, was recently defined and applied to two bin packing problems. Here, we apply it to the paging problem and obtain the following results: We devise a new deterministic paging algorithm, Retrospective-LRU, and show...... that it performs better than LRU. This is supported by experimental results, but contrasts with the competitive ratio. All deterministic marking algorithms have the same competitive ratio, but here we find that LRU is better than FWF. According to the relative worst order ratio, no deterministic marking algorithm...... can be significantly better than LRU, but the randomized algorithm MARK is better than LRU. Finally, look-ahead is shown to be a significant advantage, in contrast to the competitive ratio, which does not reflect that look-ahead can be helpful....
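The LRU-versus-FWF contrast mentioned above can be illustrated by counting faults on a single trace. This is only a toy sketch: the relative worst order ratio compares algorithms over worst-case orderings of request sequences, which a single-trace comparison does not capture; the trace and cache size are invented.

```python
def count_faults(trace, k, policy):
    """Page faults for a cache of size k under LRU or FWF (Flush-When-Full)."""
    cache, faults = [], 0          # list front = least recently used
    for p in trace:
        if p in cache:
            if policy == "LRU":
                cache.remove(p)
                cache.append(p)    # hit: refresh recency
        else:
            faults += 1
            if len(cache) >= k:
                if policy == "FWF":
                    cache.clear()  # flush the whole cache on overflow
                else:
                    cache.pop(0)   # LRU: evict the least recently used
            cache.append(p)
    return faults

trace = [1, 2, 3, 4, 3, 2, 1]
print(count_faults(trace, 3, "LRU"), count_faults(trace, 3, "FWF"))  # prints 5 7
```

On this trace FWF faults more often than LRU, consistent with the relative worst order ratio's verdict, even though both share the same competitive ratio.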

  6. Science Letters: Efficient page layout analysis on small devices

    Institute of Scientific and Technical Information of China (English)

    Eun-jung HAN; Chee-onn WONG; Kee-chul JUNG; Kyung-ho LEE; Eun-yi KIM

    2009-01-01

    Previously we have designed and implemented new image browsing facilities to support effective offline image contents on mobile devices with limited capabilities: low bandwidth, small display, and slow processing. In this letter, we fulfill the automatic production of cartoon contents fitting small-screen display, and introduce a clustering method useful for various types of cartoon images as a prerequisite stage for preserving semantic meaning. Neural networks are used to properly segment the various forms of pages. Texture information, which is useful for grayscale image segmentation, gives us a good clue for page layout analysis using the multilayer perceptron (MLP) based x-y recursive algorithm. We also automatically frame the MLP segments using agglomerative segmentation. Our experimental results show that the combined approaches yield good segmentation results for several cartoons.

  7. Simple detection of phosphoproteins in SDS-PAGE by quercetin

    Directory of Open Access Journals (Sweden)

    Xi Wang

    2014-09-01

    Full Text Available A novel fluorescence-based staining method was developed for phosphoprotein analysis in sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE. Similar to the mechanism of immobilized metal ion affinity chromatography (IMAC, the method employed quercetin–aluminum (III-appended complex as a fluoroprobe to selectively visualize phosphorylated proteins among total proteins. According to the results, as low as 16–32 ng of phosphoproteins (α-casein, β-casein and phosvitin could be selectively detected in 90 min with a wide linear dynamic range. In addition, the specificity of this novel stain for phosphoproteins was confirmed by 1-D and 2-D SDS-PAGE, dephosphorylation, western blot and liquid chromatography–mass spectrometry analysis (LC–MS/MS, respectively.

  8. User Modeling Combining Access Logs, Page Content and Semantics

    CERN Document Server

    Fortuna, Blaz; Grobelnik, Marko

    2011-01-01

    The paper proposes an approach to modeling users of large Web sites based on combining different data sources: access logs and content of the accessed pages are combined with semantic information about the Web pages, the users and the accesses of the users to the Web site. The assumption is that we are dealing with a large Web site providing content to a large number of users accessing the site. The proposed approach represents each user by a set of features derived from the different data sources, where some feature values may be missing for some users. It further enables user modeling based on the provided characteristics of the targeted user subset. The approach is evaluated on real-world data where we compare performance of the automatic assignment of a user to a predefined user segment when different data sources are used to represent the users.

  9. Building Interactive Simulations in Web Pages without Programming.

    Science.gov (United States)

    Mailen Kootsey, J; McAuley, Grant; Bernal, Julie

    2005-01-01

    A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications are possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.

  10. The Next Page Access Prediction Using Markov Model

    Directory of Open Access Journals (Sweden)

    Deepti Razdan

    2011-09-01

    Full Text Available Predicting the next page to be accessed by Web users has attracted a large amount of research. In this paper, a new web usage mining approach is proposed to predict the next page access. It is proposed to identify similar access patterns from the web log using K-means clustering, and then a Markov model is used for prediction of next page accesses. The tightness of clusters is improved by setting a similarity threshold while forming clusters. In traditional recommendation models, clustering by non-sequential data decreases recommendation accuracy. This paper incorporates clustering with a low-order Markov model, which can improve the prediction accuracy. The main area of research in this paper is preprocessing and identification of useful patterns from web data using mining techniques with the help of open source software.
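The first-order Markov prediction step described here can be sketched as transition counting over sessions (a minimal illustration; the session data and page names are invented, and the K-means clustering stage the paper adds is omitted):

```python
from collections import Counter, defaultdict

def train_markov(sessions):
    """Count page-to-page transitions across user sessions (first-order Markov model)."""
    transitions = defaultdict(Counter)
    for session in sessions:
        for current, nxt in zip(session, session[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, page):
    """Predict the page that most frequently follows `page`, or None if unseen."""
    if page not in transitions:
        return None
    return transitions[page].most_common(1)[0][0]

sessions = [
    ["home", "products", "cart"],
    ["home", "products", "reviews"],
    ["home", "products", "cart", "checkout"],
]
print(predict_next(train_markov(sessions), "products"))  # → cart
```

In the paper's setup, one such model would be trained per cluster of similar access patterns rather than over the whole log.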

  11. Consumer’s Participation on Brand Pages on Facebook

    Directory of Open Access Journals (Sweden)

    Bianca MITU

    2014-06-01

    Full Text Available The focus of this study is to analyze consumers' participation and communication in online brand communities on Facebook. This type of brand community represents a subgroup of virtual communities known as communities of consumption or fan clubs (Kozinets 1999, Szmigin et al. 2005). Understanding consumer relationships in such communities is important for the success of both the brand and the community. The aim of our study is to investigate how and in what sense consumers participate and communicate with one another via online brand communities, so as to explore the nature of the consumer's participation on brand pages on Facebook. Also, we aim to investigate the importance of the Facebook fan page as a tool for a company's business strategy. In order to investigate all these different aspects, a quantitative audience research was conducted, using a structured questionnaire.

  12. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  13. Metadata Schema Used in OCLC Sampled Web Pages

    OpenAIRE

    Fei Yu

    2005-01-01

    The tremendous growth of Web resources has made information organization and retrieval more and more difficult. As one approach to this problem, metadata schemas have been developed to characterize Web resources. However, many questions have been raised about the use of metadata schemas such as which metadata schemas have been used on the Web? How did they describe Web accessible information? What is the distribution of these metadata schemas among Web pages? Do certain schemas dominate the o...

  14. Paged GIRS (Graph Information Retrieval System) Users Manual.

    Science.gov (United States)

    1981-05-01

    ... existing overlay structure. During the course of execution, if LVHAPG(1) exceeds LVHREQ, continuant zero of a new page is created and LVHAPG(1) is incremented by one. A page may also be written out to disk. The page-replacement value formula is: value = A * order + B * usage + C * space + D * write, where A, B, C, and D are weighting factors.

  15. Physics Letters B, Volume 716, Issue 1- Cover Page

    CERN Multimedia

    CERN

    2012-01-01

    The cover page of the Physics Letters B Journal, Volume 716, Issue 1, dedicated to the observation of a new particle in the search for the Standard Model Higgs boson. To celebrate this historical discovery, Elsevier reprinted the ATLAS and the CMS articles together with a foreword by Peter Higgs and the other scientists that predicted the existence of the so-called Higgs boson and published this in a separate booklet.

  16. Page Smith: Founding Cowell College and UCSC, 1964-1973

    OpenAIRE

    Smith, Page; Jarrell, Randall; Regional History Project, UCSC Library

    1996-01-01

    This oral history chronicles the late Page Smith's experiences as founding provost of the campus's first college and his major contributions in shaping the college system here. His narration includes chapters on student culture in the 1960s and 1970s, the pass/fail grading system, his educational philosophy, town/gown relations, campus architecture, the History of Consciousness Program, his relationship with founding Chancellor Dean E. McHenry, arts on the campus and the role of his wife, Elo...

  17. On the convergence of Le Page series in Skorokhod space

    CERN Document Server

    Davydov, Youri

    2011-01-01

    We consider the problem of the convergence of the so-called Le Page series in the Skorokhod space $\mathbb{D}^d = \mathbb{D}([0,1], \mathbb{R}^d)$ and provide a simple criterion based on the moments of the increments of the random process involved in the series. This provides a simple sufficient condition for the existence of an $\alpha$-stable distribution on $\mathbb{D}^d$ with a given spectral measure.

  18. TRANSPORTATION IN Lhasa CITY TURNS TO A NEW PAGE

    Institute of Scientific and Technical Information of China (English)

    SONAM; TSERING

    2007-01-01

    Road construction in Lhasa City has turned a new page over recent decades thanks to selfless support from provincial and municipal governments in the hinterland, in particular under the auspices of the governments of Beijing Municipality and Jiangsu Province. Remarkable changes to the roads in Lhasa City not only improve the city's transportation infrastructure, but bring direct benefits to local residents as well.

  19. Program for Culture and Conflict Studies, web page capture

    OpenAIRE

    Naval Postgraduate School (U.S.)

    2014-01-01

    web page capture from the NPS website The Program for Culture and Conflict Studies (CCS) is premised on the belief that the United States must understand the cultures and societies of the world to effectively interact with local people. It is dedicated to the study of anthropological, ethnographic, social, political, and economic data to inform U.S. policies at both the strategic and operational levels.

  20. Cardiology Still a Man's Field, Survey Finds

    Science.gov (United States)

    Snippet from https://medlineplus.gov/news/fullstory_162700.html: "Cardiology Still a Man's Field, Survey Finds. Women less ..." Dr. Claire Duvernoy, chair of the Women in Cardiology Council at the American College of Cardiology (ACC). ...

  1. Predicting HCAHPS scores from hospitals' social media pages: A sentiment analysis.

    Science.gov (United States)

    Huppertz, John W; Otto, Peter

    2017-02-22

    Social media is an important communication channel that can help hospitals and consumers obtain feedback about quality of care. However, despite the potential value of insight from consumers who post comments about hospital care on social media, there has been little empirical research on the relationship between patients' anecdotal feedback and formal measures of patient experience. The aim of the study was to test the association between informal feedback posted in the Reviews section of hospitals' Facebook pages and scores on two global items from the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey, Overall Hospital Rating and Willingness to Recommend the Hospital. We retrieved star ratings and anecdotal comments posted in Reviews sections of 131 hospitals' Facebook pages. Using a machine learning algorithm, we analyzed 57,985 comments to measure consumers' sentiment about the hospitals. We used regression analysis to determine whether consumers' quantitative and qualitative postings would predict global measures from the HCAHPS survey. Both number of stars and the number of positive comments posted on hospitals' Facebook Reviews sections were associated with higher overall ratings and willingness to recommend the hospital. The findings suggest that patients' informal comments help predict a hospital's formal measures of patient experience. Consistent with crowd wisdom, ordinary consumers may have valid insights that can help others to assess patient experience at a hospital. Given that some people will judge hospital quality based on opinions voiced in social media, further research should continue to explore associations between anecdotal commentary and a variety of quality indicators. Administrators can tap into the wealth of commentary on social media as the forum continues to expand its influence in health care. Comments on social media may also serve as an early snapshot of patient-reported experiences, alerting

  2. How to Understand Cities?—Methodology of Urban Space Survey in Contemporary Urban Design Practices

    Institute of Scientific and Technical Information of China (English)

    刘堃; 金广君

    2011-01-01

    Urban design practices increasingly require an understanding of urban spaces, and professional urban space surveys need the guidance of a clear and definite methodology. Following the classic route from epistemology to methodology, and drawing on current urban space research, the paper treats urban space as a "text" and introduces a graduated text-reading approach to studying it: description of spatial form → evaluation of urban life impacts → formulation of future visions. This reading sequence serves as the methodological basis guiding urban space surveys.

  3. VEGAS-SSS. A VST survey of elliptical galaxies in the southern hemisphere: analysis of small stellar systems. Testing the methodology on the globular cluster system in NGC3115

    CERN Document Server

    Cantiello, Michele; Napolitano, Nicola; Grado, Aniello; Limatola, Luca; Paolillo, Maurizio; Iodice, Enrica; Romanowsky, Aaron J; Forbes, Duncan A; Raimondo, Gabriella; Spavone, Marilena; La Barbera, Francesco; Puzia, Thomas H; Schipani, Pietro

    2014-01-01

    We present a study of globular clusters (GCs) and other small stellar systems (SSSs) in the field of NGC3115, observed as part of the VEGAS imaging survey, carried out with the VST telescope. We use deep g and i data of NGC3115, a well-studied lenticular galaxy with excellent scientific literature. This is fundamental to test the methodologies, verify the results, and probe the capabilities of the VEGAS-SSS. Leveraging the large field of view of the VST allows us to accurately study the distribution and properties of SSSs as a function of galactocentric distance Rgc, well beyond ~20 galaxy effective radii, in a way not often possible. Our analysis of colors, magnitudes and sizes of SSS candidates confirms the results from existing studies, some of which were carried out with 8-10m class telescopes, and further extends them to unreached Rgc distances, with similar accuracy. We find a color bimodality for the GC population and an r^1/4 profile for the surface density of GCs, as for the galaxy light profile. The radia...

  4. Efficient Computational Research Protocol to Survey Free Energy Surface for Solution Chemical Reaction in the QM/MM Framework: The FEG-ER Methodology and Its Application to Isomerization Reaction of Glycine in Aqueous Solution.

    Science.gov (United States)

    Takenaka, Norio; Kitamura, Yukichi; Nagaoka, Masataka

    2016-03-03

    In solution chemical reactions, we often need to consider a multidimensional free energy (FE) surface (FES), which is analogous to a Born-Oppenheimer potential energy surface. To survey the FES, an efficient computational research protocol is proposed within the QM/MM framework: (i) we first obtain the stable states (or transition states) involved, by optimizing their structures on the FES in a stepwise fashion, finally using the free energy gradient (FEG) method; then (ii) we directly obtain the FE differences among any arbitrary states on the FES, efficiently by employing the QM/MM method with energy representation (ER), i.e., the QM/MM-ER method. To validate the calculation accuracy and efficiency, we applied the above FEG-ER methodology to a typical isomerization reaction of glycine in aqueous solution, and reproduced quite satisfactorily the experimental value of the reaction FE. Further, it was found that the structural relaxation of the solute in the QM/MM force field is not negligible for correctly estimating the FES. We believe that the present research protocol should become prevailing as one computational strategy and will play promising and important roles in solution chemistry toward solution reaction ergodography.

  5. Importance of intrinsic and non-network contribution in PageRank centrality and its effect on PageRank localization

    CERN Document Server

    Deyasi, Krishanu

    2016-01-01

    PageRank centrality is used by Google for ranking web pages to present search results for a user query. Here, we have shown that the PageRank value of a vertex also depends on its intrinsic, non-network contribution. If the intrinsic, non-network contributions of the vertices are proportional to their degrees, or are zero, then their PageRank centralities become proportional to their degrees. Some simulations and empirical data are used to support our study. In addition, we have shown that localization of PageRank centrality depends upon the same intrinsic, non-network contribution.
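The role of an intrinsic, non-network contribution can be illustrated by a power-iteration sketch in which the uniform teleport term is replaced by a per-vertex vector (a toy example: the graph, damping factor, and intrinsic weights are invented, and this is a generic PageRank sketch rather than the paper's derivation):

```python
def pagerank(adj, intrinsic, d=0.85, iters=100):
    """Power iteration with a per-vertex teleport (intrinsic) vector.

    adj[i] lists the vertices that vertex i links to; intrinsic[i] is the
    non-network contribution of vertex i (normalized internally).
    """
    n = len(adj)
    total = sum(intrinsic)
    v = [w / total for w in intrinsic]       # normalized intrinsic contribution
    pr = [1.0 / n] * n
    for _ in range(iters):
        new = [(1 - d) * v[i] for i in range(n)]
        for i, outlinks in enumerate(adj):
            if outlinks:
                share = d * pr[i] / len(outlinks)
                for j in outlinks:
                    new[j] += share
            else:                            # dangling vertex: spread by the intrinsic vector
                for j in range(n):
                    new[j] += d * pr[i] * v[j]
        pr = new
    return pr

# 3-vertex chain 0 -> 1 -> 2, vertex 2 dangling; intrinsic mass biased toward vertex 0
scores = pagerank([[1], [2], []], intrinsic=[3, 1, 1])
print([round(s, 3) for s in scores])
```

Setting `intrinsic` proportional to the degrees reproduces the degree-proportional behavior the abstract describes.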

  6. A Novel Approach for Web Page Set Mining

    Directory of Open Access Journals (Sweden)

    R.B.Geeta

    2011-11-01

    Full Text Available One of the most time-consuming steps in association rule mining is computing the frequency of occurrence of itemsets in the database. The hash table index approach converts a transaction database to a hash index tree by scanning the transaction database only once. Whenever a user requests any Uniform Resource Locator (URL), the request entry is stored in the log file of the server. This paper presents the hash index table structure, a general and dense structure which provides web page set extraction from the server's log file. This hash table provides information about the original database. Web page set mining (WPs-Mine) provides a complete representation of the original database. This approach works well for both sparse and dense data distributions. Web page set mining supported by a hash table index shows performance always comparable with, and often better than, algorithms accessing data on flat files. Incremental update is feasible without re-accessing the original transactional database.
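The core operation such a hash-index structure accelerates, counting how often page sets co-occur across log transactions in a single scan, can be sketched with an ordinary hash map (a toy illustration; the log data are invented and this is not the WPs-Mine structure itself):

```python
from collections import Counter
from itertools import combinations

def pair_support(transactions):
    """One scan over the log: count how often each pair of pages co-occurs."""
    counts = Counter()
    for pages in transactions:
        # sorted, deduplicated pages give each pair a canonical key
        for pair in combinations(sorted(set(pages)), 2):
            counts[pair] += 1
    return counts

log = [
    ["index.html", "news.html", "sports.html"],
    ["index.html", "news.html"],
    ["index.html", "sports.html", "news.html"],
]
print(pair_support(log)[("index.html", "news.html")])  # → 3
```

A real hash index tree stores these counts hierarchically so that larger page sets can be extracted without re-scanning the log.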

  7. Document representations for classification of short web-page descriptions

    Directory of Open Access Journals (Sweden)

    Radovanović Miloš

    2008-01-01

    Full Text Available Motivated by applying text categorization to the classification of Web search results, this paper describes an extensive experimental study of the impact of bag-of-words document representations on the performance of five major classifiers: Naïve Bayes, SVM, Voted Perceptron, kNN and C4.5. The texts, representing short Web-page descriptions sorted into a large hierarchy of topics, are taken from the dmoz Open Directory Web-page ontology, and classifiers are trained to automatically determine the topics which may be relevant to a previously unseen Web page. Different transformations of input data (stemming, normalization, logtf and idf), together with dimensionality reduction, are found to have a statistically significant improving or degrading effect on classification performance measured by classical metrics: accuracy, precision, recall, F1 and F2. The emphasis of the study is not on determining the best document representation for each classifier, but rather on describing the effects of every individual transformation on classification, together with their mutual relationships.

  8. Arabic web pages clustering and annotation using semantic class features

    Directory of Open Access Journals (Sweden)

    Hanan M. Alghamdi

    2014-12-01

    Full Text Available Effectively managing the great amount of data on Arabic web pages and enabling the classification of relevant information are very important research problems. Studies on sentiment text mining have been very limited in the Arabic language because they require deep semantic processing. Therefore, in this paper, we aim to retrieve machine-understandable data with the help of a Web content mining technique to detect covert knowledge within these data. We propose an approach to achieve clustering with semantic similarities. This approach comprises integrating k-means document clustering with semantic feature extraction and document vectorization to group Arabic web pages according to semantic similarities and then show the semantic annotation. The document vectorization helps to transform text documents into a semantic class probability distribution or semantic class density. To reach semantic similarities, the approach extracts the semantic class features and integrates them into the similarity weighting schema. The quality of the clustering result was evaluated using the purity and the mean intra-cluster distance (MICD) evaluation measures. We evaluated the proposed approach on a set of common Arabic news web pages and acquired favorable clustering results that are effective in minimizing the MICD, expanding the purity and lowering the runtime.
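The purity measure used for evaluation here can be computed directly from cluster assignments and gold labels (a minimal sketch; the assignments and labels below are made up for illustration):

```python
from collections import Counter

def purity(clusters, labels):
    """Purity: fraction of documents that carry the majority label of their cluster."""
    by_cluster = {}
    for c, l in zip(clusters, labels):
        by_cluster.setdefault(c, []).append(l)
    # sum the size of the majority label in each cluster
    majority_total = sum(Counter(ls).most_common(1)[0][1] for ls in by_cluster.values())
    return majority_total / len(labels)

clusters = [0, 0, 0, 1, 1, 1]
labels = ["sports", "sports", "politics", "economy", "economy", "economy"]
print(purity(clusters, labels))  # → 0.8333...
```

Higher purity means clusters align more closely with the gold topic labels; the paper pairs it with the mean intra-cluster distance (MICD) to also capture cluster tightness.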

  9. Do-It-Yourself: A Special Library's Approach to Creating Dynamic Web Pages Using Commercial Off-The-Shelf Applications

    Science.gov (United States)

    Steeman, Gerald; Connell, Christopher

    2000-01-01

    Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and, committing resources from managing time to populate the database and training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals, lessons learned in the Web-to-database process (including setting up Database Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Standard Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.

  10. 2002 Navy Quality of Life Survey: Methodology

    Science.gov (United States)

    2007-11-02

    OCR fragment of the survey codebook listing duty-station response categories: aviation squadron; carrier-based aviation squadron/detachment; aircraft carrier; cruiser; destroyer types (includes frigates); minecraft; submarine; tender/repair ship; reserve unit; service force ship; amphibious ship.

  11. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). 
The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon capture processes. As such, we will develop, as needed and beyond existing capabilities, a suite of robust and efficient computational tools for UQ to be integrated into a CCSI UQ software framework.

  12. Research and Application of PageRank Algorithm Combined with VSM Technique

    Institute of Scientific and Technical Information of China (English)

    李卫东; 陆玲

    2011-01-01

    To solve the "topic drift" problem of the PageRank algorithm, this paper proposes an improved method combined with the VSM (vector space model) technique. First, it computes the PageRank value from the hyperlink structure of the Web page; then it builds a vector space model of the Web page content and computes the topic content similarity; finally, it fuses these two values with certain weight coefficients to produce a new PageRank value. Contrast experiments show that the improved PageRank algorithm reduces the number of irrelevant Web pages and provides better ranking results for search engines.
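The weighted fusion of a link-based PageRank score with a VSM topic similarity can be sketched as follows (a toy illustration; the weight α, the vectors, and the scores are invented, and this is a generic linear combination rather than the paper's exact formula):

```python
import math

def cosine(a, b):
    """Cosine similarity between two term-weight vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def combined_score(pagerank, page_vec, query_vec, alpha=0.5):
    """Fuse link-based PageRank with VSM topic similarity by a weight coefficient."""
    return alpha * pagerank + (1 - alpha) * cosine(page_vec, query_vec)

# A page with modest PageRank but on-topic content outranks an off-topic hub
q = [1.0, 0.0, 1.0]
on_topic = combined_score(0.02, [2.0, 0.0, 2.0], q)
off_topic = combined_score(0.08, [0.0, 3.0, 0.0], q)
print(on_topic > off_topic)
```

The similarity term is what counteracts topic drift: a high-PageRank page that is off-topic for the query is demoted relative to on-topic pages.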

  13. The ATLAS Public Web Pages: Online Management of HEP External Communication Content

    Science.gov (United States)

    Goldfarb, S.; Marcelloni, C.; Eli Phoboo, A.; Shaw, K.

    2015-12-01

    The ATLAS Education and Outreach Group is in the process of migrating its public online content to a professionally designed set of web pages built on the Drupal [1] content management system. Development of the front-end design passed through several key stages, including audience surveys, stakeholder interviews, usage analytics, and a series of fast design iterations, called sprints. Implementation of the web site involves application of the html design using Drupal templates, refined development iterations, and the overall population of the site with content. We present the design and development processes and share the lessons learned along the way, including the results of the data-driven discovery studies. We also demonstrate the advantages of selecting a back-end supported by content management, with a focus on workflow. Finally, we discuss usage of the new public web pages to implement outreach strategy through implementation of clearly presented themes, consistent audience targeting and messaging, and the enforcement of a well-defined visual identity.

  14. Conceptual and Methodological Basis of the National Health Survey II, Mexico, 1994

    Directory of Open Access Journals (Sweden)

    1998-01-01

    Full Text Available The conceptual and methodological bases of the National Health Survey II (NHS-II) are described, and recent advances in multidisciplinary public health research in Mexico, both conceptual and methodological, are synthesized. The design of the NHS-II concentrated on the study of access, quality of care, and health care expenses in ambulatory and hospital services. Details on the conceptual framework related to the analysis and processing of data are also included. Five geographic regions were covered; 12,615 households were visited at the national level, and information on 61,524 individuals was gathered. The overall response rate was 96.7%, both for households and for identified health service users. The general conclusion emphasizes the need to incorporate the population perspective into the planning and allocation of health resources.

  15. Research on Web Page Automatic Classification Based on Internet News Corpus

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Web pages contain richer content than pure text, such as hyperlinks, HTML tags and metadata, so Web page categorization differs from pure-text categorization. For Internet Chinese news pages, a practical algorithm for extracting subject concepts from a web page without a thesaurus was proposed; when these category-subject concepts were incorporated into a knowledge base, Web pages were classified by a hybrid algorithm, with an experimental corpus extracted from the Xinhua net. Experimental results show that categorization performance is improved using Web page features.

  16. Identify Web-page Content meaning using Knowledge based System for Dual Meaning Words

    OpenAIRE

    Sinha, Sukanta; Dattagupta, Rana; Mukhopadhyay, Debajyoti

    2012-01-01

    The meaning of Web-page content plays a big role in producing a search result from a search engine. In most cases the Web-page meaning is stored in the title or meta-tag area, but those meanings do not always match the Web-page content. To overcome this situation we need to go through the Web-page content to identify the Web-page meaning. In cases where the Web-page content holds dual-meaning words, it is really difficult to identify the meaning of the Web page. In this paper, we are introdu...

  17. Proposta metodológica para o módulo de consumo alimentar pessoal na pesquisa brasileira de orçamentos familiares Methodological proposal for the individual food intake module of the Brazilian household budget survey

    Directory of Open Access Journals (Sweden)

    Edna Massae Yokoo

    2008-12-01

    Expenditure Study done from 1974 to 1975. Although useful, national food consumption studies are expensive and only a few countries can conduct them regularly. Nonetheless, household budget surveys are important sources of data on the availability of food at home determined by records of foods purchased. Recent changes in consumption habits, particularly eating out, limit the use of data from household budget surveys to estimate food intake. Thus, the Brazilian government suggested that the next household budget survey to be done in 2008-2009 include a module on individual food consumption. Information on individual food intake will be used to supplement the data regarding food purchases. The objective of this study is to report the development of the methodology to be used in the module of individual food consumption of the household budget survey of 2008-2009. Budget data will be combined with intake data to estimate the usual individual food consumption.

  18. Jimmy Page "Number Two" Les-Paul 问世

    Institute of Scientific and Technical Information of China (English)

    黄伟

    2010-01-01

    December 31, 2009, Nashville, Tennessee: Gibson Custom has released the Gibson Custom Jimmy Page "Number Two" Les Paul limited-edition guitar, refined by the legendary player Jimmy Page himself. Every musician knows that the Sunburst Les Paul Standards of the late 1950s are hard to improve upon today. Could a classic '59 Burst be refined to perfect its playability and maximize its tone? Unheard of... unless you are Jimmy Page. Only the guitarist of the legendary Led Zeppelin, perhaps the world's most famous Les Paul player, who has long performed on his own '59 Les Paul Standard, could make it happen. Thanks to the tireless work of the Gibson Custom Shop in close collaboration with Jimmy Page, the "Number Two" Les Paul that any artist would treasure was finally born: the Custom Shop Jimmy Page "Number Two" Les Paul.

  19. Diseño metodológico de la Encuesta Nacional sobre Violencia contra las Mujeres en México Methodological Design for the National Survey Violence Against Women in Mexico

    Directory of Open Access Journals (Sweden)

    Gustavo Olaiz

    2006-01-01

    Full Text Available OBJECTIVE: To describe the methodology used in the 2003 National Survey Violence Against Women in Mexico (ENVIM 2003), including the research design, estimation and sample selection, variable definitions, data collection instruments, the operative design for implementation, and the analytical procedures. MATERIAL AND METHODS: For the quantitative component, a two-stage cross-sectional design was used; for the qualitative component, in-depth interviews and participant observation were carried out in health care units. RESULTS: The quantitative study yielded 26 240 interviews with female users of health services and 2 636 questionnaires from health care providers; the survey is representative of the 32 Mexican states. For the qualitative study, 26 in-depth interviews were conducted with female users and 60 with health care providers in the states of Quintana Roo, Coahuila, and the Federal District.

  20. Aspectos metodológicos do Projeto SBBrasil 2010 de interesse para inquéritos nacionais de saúde Relevant methodological issues from the SBBrasil 2010 Project for national health surveys

    Directory of Open Access Journals (Sweden)

    Angelo Giuseppe Roncalli

    2012-01-01

    Full Text Available The SBBrasil 2010 Project (SBB10) was designed as a nationwide oral health epidemiological survey within a health surveillance strategy. This article discusses methodological aspects of the SBB10 Project that can potentially help expand and develop knowledge in the health field. This was a nationwide survey with stratified multi-stage cluster sampling. The sample domains were the 27 State capitals and 150 municipalities in the interior of the country's five major geographic regions. The sampling units were census tracts and households for the State capitals, and municipalities, census tracts, and households for the interior. Thirty census tracts were selected in each State capital and 30 municipalities in the interior of each region. The precision considered the domains grouped by population density and the internal variability of the indices. Dental caries, periodontal disease, malocclusion, fluorosis, dental trauma, and edentulism were assessed in five age groups (5, 12, 15-19, 35-44, and 65-74 years).

  1. Business Systems Branch Abilities, Capabilities, and Services Web Page

    Science.gov (United States)

    Cortes-Pena, Aida Yoguely

    2009-01-01

    During the INSPIRE summer internship I acted as the Business Systems Branch Capability Owner for the Kennedy Web-based Initiative for Communicating Capabilities System (KWICC), with the responsibility of creating a portal that describes the services provided by this Branch. This project will help others achieve a clear view of the services that the Business Systems Branch provides to NASA and the Kennedy Space Center. After collecting data through interviews with subject matter experts and the literature in Business World and other web sites, I identified discrepancies, made the necessary corrections to the sites, and placed the information from the report into the KWICC web page.

  2. Categorization of web pages - Performance enhancement to search engine

    Digital Repository Service at National Institute of Oceanography (India)

    Lakshminarayana, S.

    are the major areas of research in IR and strive to improve the effectiveness of interactive IR, and can be used as a performance-evaluation tool. The classification studies at early stages involved stronger human interaction than machine learning. The term... and the location of the link. In the absence of such works, the spider/worm either moves to the next page available in the least time or by network selection. This classification serves in judging the traversal of a web spider/worm and minimizing it. Such processes...

  3. Children's recognition of advertisements on television and on Web pages.

    Science.gov (United States)

    Blades, Mark; Oates, Caroline; Li, Shiying

    2013-03-01

    In this paper we consider the issue of advertising to children. Advertising to children raises a number of concerns, in particular the effects of food advertising on children's eating habits. We point out that virtually all the research into children's understanding of advertising has focused on traditional television advertisements, but much marketing aimed at children is now via the Internet and little is known about children's awareness of advertising on the Web. One important component of understanding advertisements is the ability to distinguish advertisements from other messages, and we suggest that young children's ability to recognise advertisements on a Web page is far behind their ability to recognise advertisements on television.

  4. Modeling and predicting page-view dynamics on Wikipedia

    CERN Document Server

    Thij, Marijn ten; Laniado, David; Kaltenbrunner, Andreas

    2012-01-01

    The simplicity of producing and consuming online content makes it difficult to estimate how much attention Internet users will devote to any given content. This work presents a general overview of temporal patterns in the access to content on a huge collaborative platform. We propose a model for predicting the popularity of promoted content, inspired by the analysis of page-view dynamics on Wikipedia. Compared to previous studies, the observed popularity patterns are more complex; however, our model uses just a few parameters to describe them fully. The model is validated through empirical measurements.

  5. Parameterized analysis of paging and list update algorithms

    DEFF Research Database (Denmark)

    Dorrigiv, Reza; Ehmsen, Martin R.; López-Ortiz, Alejandro

    2009-01-01

    It is well-established that input sequences for paging and list update have locality of reference. In this paper we analyze the performance of algorithms for these problems in terms of the amount of locality in the input sequence. We define a measure for locality that is based on Denning's working...... to a better performance. We obtain a similar separation for list update algorithms. Lastly, we show that, surprisingly, certain randomized algorithms which are superior to MTF in the classical model are not so in the parameterized case, which matches experimental results.

  6. Parameterized Analysis of Paging and List Update Algorithms

    DEFF Research Database (Denmark)

    Dorrigiv, Reza; Ehmsen, Martin R.; López-Ortiz, Alejandro

    2015-01-01

    It is well-established that input sequences for paging and list update have locality of reference. In this paper we analyze the performance of algorithms for these problems in terms of the amount of locality in the input sequence. We define a measure for locality that is based on Denning’s working...... that a larger cache leads to a better performance. We also apply the parameterized analysis framework to list update and show that certain randomized algorithms which are superior to MTF in the classical model are not so in the parameterized case, which matches experimental results....
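The move-to-front (MTF) rule discussed in these two records can be sketched directly. The cost model below (serving an item costs its 1-based position, then the item moves to the front) is the standard list-update model; the initial list and request sequence are illustrative assumptions, not data from the papers:

```python
# Move-To-Front (MTF) list update: serving an item costs its 1-based
# position in the list; MTF then moves the accessed item to the front,
# so sequences with high locality of reference become cheap to serve.

def mtf_cost(initial, requests):
    """Total access cost of serving `requests` from `initial` under MTF."""
    lst = list(initial)
    total = 0
    for x in requests:
        i = lst.index(x)
        total += i + 1              # access cost = 1-based position
        lst.insert(0, lst.pop(i))   # move accessed item to the front
    return total

# High locality: repeated accesses to 'c' cost 1 after the first hit.
cost = mtf_cost("abc", "ccccba")
```

Sequences dominated by recently requested items are exactly where MTF beats static orderings, which is the locality effect the parameterized analysis quantifies.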

  7. Combined Viterbi Detector for a Balanced Code in Page Memories

    Institute of Scientific and Technical Information of China (English)

    Chen Duan-rong; Xie Chang-sheng; Pei Xian-deng

    2004-01-01

    Based on the fact that the two path metrics are equal at a merged node in the trellis used to describe a Viterbi detector for data encoded with a rate 6:8 balanced binary code in page-oriented optical memories, a combined Viterbi detector scheme is proposed to improve the raw bit-error-rate performance by mitigating two-bit reversing error events in estimated codewords of the balanced code. The effectiveness of the detection scheme is verified for different data quantizations using Monte Carlo simulations.
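A quick count shows why a rate 6:8 balanced binary code exists at all: an 8-bit word with four 1s and four 0s is balanced, and there are C(8,4) = 70 such words, enough to encode every 6-bit input distinctly. The enumeration below is an illustrative sketch, not the paper's actual codebook or detector:

```python
# Enumerate all balanced n-bit words (exactly n//2 ones) and check that
# there are enough of them to support a rate 6:8 code: C(8,4) = 70 >= 2**6.
from itertools import combinations

def balanced_words(n):
    """All n-bit words (as ints) with exactly n//2 ones; n must be even."""
    words = []
    for ones in combinations(range(n), n // 2):
        words.append(sum(1 << i for i in ones))
    return words

codebook = balanced_words(8)   # 70 candidate codewords for 64 inputs
```

The constant-weight property is what the detector exploits: any valid codeword has exactly four 1s, so a two-bit reversing error (a 1 and a 0 swapped) yields another balanced word, which is why that error event needs special mitigation.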

  8. PageRank for low frequency earthquake detection

    Science.gov (United States)

    Aguiar, A. C.; Beroza, G. C.

    2013-12-01

    We have analyzed Hi-Net seismic waveform data from the April 2006 tremor episode in the Nankai Trough in SW Japan using the autocorrelation approach of Brown et al. (2008), which detects low frequency earthquakes (LFEs) based on pair-wise waveform matching. We have generalized this to exploit the fact that waveforms may repeat multiple times, on more than just a pair-wise basis. We are working towards developing a sound statistical basis for event detection, but that is complicated by two factors. First, the statistical behavior of the autocorrelations varies between stations; analyzing one station at a time ensures that the detection threshold depends only on the station being analyzed. Second, the positive detections do not satisfy "closure": if window A correlates with window B, and window B correlates with window C, then window A and window C do not necessarily correlate with one another. We want to evaluate whether or not a linked set of windows is correlated due to chance. To do this, we map our problem onto one that has previously been solved for web search, and apply Google's PageRank algorithm. PageRank is the probability that a 'random surfer' visits a particular web page; it assigns a ranking to a page based on the number of links associated with that page. For windows of seismic data instead of web pages, the windows with high probabilities suggest likely LFE signals. Once identified, we stack the matched windows to improve the SNR and use these stacks as template signals to find other LFEs within continuous data. We compare the results among stations and declare a detection if it is found at a statistically significant number of stations, based on multinomial statistics. We compare our detections using the single-station method to detections found by Shelly et al. (2007) for the April 2006 tremor sequence in Shikoku, Japan. We find strong similarity between the results, as well as many new detections that were not found using
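The mapping from correlated waveform windows to PageRank described above can be sketched with a plain power iteration. The graph, window names, and damping factor below are illustrative assumptions, not the authors' Hi-Net data:

```python
# Minimal PageRank via power iteration over a toy graph whose nodes are
# waveform windows and whose edges mark correlated window pairs.

def pagerank(links, damping=0.85, iters=100):
    """links: dict node -> list of nodes it points to."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v in nodes:
            out = links[v]
            if not out:                       # dangling node: spread evenly
                for u in nodes:
                    new[u] += damping * rank[v] / n
            else:
                for u in out:
                    new[u] += damping * rank[v] / len(out)
        rank = new
    return rank

# Windows A..D; an edge means the two windows correlate above threshold.
corr = {"A": ["B"], "B": ["A", "C"], "C": ["B"], "D": []}
ranks = pagerank(corr)
best = max(ranks, key=ranks.get)   # most-repeated window: a candidate LFE
```

Windows that correlate with many other well-connected windows accumulate probability mass even without pair-wise closure, which is the property the detection scheme exploits.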

  9. Rich-club and page-club coefficients for directed graphs

    Science.gov (United States)

    Smilkov, Daniel; Kocarev, Ljupco

    2010-06-01

    Rich-club and page-club coefficients and their null models are introduced for directed graphs. Null models allow for a quantitative discussion of the rich-club and page-club phenomena. These coefficients are computed for four directed real-world networks: Arxiv High Energy Physics paper citation network, Web network (released from Google), Citation network among US Patents, and email network from a EU research institution. The results show a high correlation between rich-club and page-club ordering. For journal paper citation network, we identify both rich-club and page-club ordering, showing that “elite” papers are cited by other “elite” papers. Google web network shows partial rich-club and page-club ordering up to some point and then a narrow declining of the corresponding normalized coefficients, indicating the lack of rich-club ordering and the lack of page-club ordering, i.e. high in-degree (PageRank) pages purposely avoid sharing links with other high in-degree (PageRank) pages. For UC patents citation network, we identify page-club and rich-club ordering providing a conclusion that “elite” patents are cited by other “elite” patents. Finally, for email communication network we show lack of both rich-club and page-club ordering. We construct an example of synthetic network showing page-club ordering and the lack of rich-club ordering.
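As a rough illustration of the rich-club coefficient for directed graphs described above: among the nodes whose total degree exceeds k, it is the fraction of possible directed links actually present. The toy edge list is an assumption for illustration, and the paper's null-model normalization is omitted:

```python
# Unnormalized directed rich-club coefficient: among nodes with total
# degree > k, what fraction of the possible directed links exists?

def rich_club(edges, k):
    """edges: iterable of (u, v) directed pairs."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1   # out-degree contribution
        deg[v] = deg.get(v, 0) + 1   # in-degree contribution
    club = {n for n, d in deg.items() if d > k}
    if len(club) < 2:
        return 0.0
    inside = sum(1 for u, v in edges if u in club and v in club)
    return inside / (len(club) * (len(club) - 1))

edges = [(1, 2), (2, 1), (2, 3), (3, 1), (3, 4), (4, 5)]
phi = rich_club(edges, 2)   # club = {1, 2, 3}: 4 of 6 possible links
```

Comparing this raw coefficient against its value on a degree-preserving null model is what lets the paper speak of rich-club "ordering" rather than a mere density artifact of high-degree nodes.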

  10. PREFACE: PAGES 1st Young Scientists Meeting (YSM) - 'Retrospective views on our planet's future'

    Science.gov (United States)

    Margrethe Basse, Ellen

    2010-03-01

    'Retrospective views on our planet's future' - This was the theme of a tandem of meetings held by Past Global Changes (PAGES; http://www.pages-igbp.org), a project of the International Geosphere-Biosphere Programme (IGBP). It reflects the philosophy of PAGES and its community of scientists that the past holds the key to better projections of the future. Climatic and environmental evidence from the past can be used to sharpen future projections of global change, thereby informing political and societal decisions on mitigation and adaptation. Young scientists are critical to the future of this endeavour, which we call 'paleoscience'. Their scientific knowledge, interdisciplinarity, international collaboration, and leadership skills will be required if this field is to continue to thrive. Meanwhile, it is also important to remember that science develops not only by applying new strategies and new tools to make new observations, but also by building upon existing knowledge. Modern research in paleoscience began around fifty years ago, and one could say that the third generation of researchers is now emerging. It is a wise investment to ensure that existing skills and knowledge are transferred to this generation. This will enable them to lead the science towards new accomplishments, and to make important contributions towards the wider field of global change science. Motivated by such considerations, PAGES organized its first Young Scientists Meeting (YSM), held in Corvallis (Oregon, USA) in July 2009 (http://www.pages-osm.org/ysm/index.html). The meeting took place immediately before the much larger 3rd PAGES Open Science Meeting (OSM; http://www.pages-osm.org/osm/index.html). The YSM brought together 91 early-career scientists from 21 different nations. During the two-day meeting, PhD students, postdoctoral researchers, and new faculty met to present their work and build networks across geographical and disciplinary borders. Several experienced and well

  11. The Leggett-Garg inequality and Page-Wootters mechanism

    CERN Document Server

    Gangopadhyay, D

    2016-01-01

    Violation of the Leggett-Garg inequality (LGI) implies quantum phenomena. In this light we establish that the Moreva et al. experiment demonstrating the Page-Wootters mechanism falls in the quantum domain. An observer outside a 2-photon world does not detect any change in the 2-photon state, i.e. there is no time parameter for the outside observer. But an observer attached to one of the photons sees the other photon evolving, which means there is an "internal" time. The LGI is violated for the clock photon, whose state evolves with the internal time as measured by the system photon. Conditional probabilities in this 2-photon system are computed for both sharp and unsharp measurements. The conditional probability increases for entangled states, as obtained by Page and Wootters, for both ideal and unsharp measurements. We discuss how the conditional probabilities can be used to distinguish between massless and massive gravitons. This is important in the context of...

  12. Clustering of Deep WebPages: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Muhunthaadithya C

    2015-10-01

    Full Text Available The internet has a massive amount of information, stored in the form of zillions of web pages. The information that can be retrieved by search engines is huge, and constitutes the ‘surface web’. But the remaining information, which is not indexed by search engines – the ‘deep web’ – is much bigger than the surface web and remains unexploited. Several machine learning techniques have been commonly employed to access deep web content. Under machine learning, topic models provide a simple way to analyze large volumes of unlabeled text. A ‘topic’ is a cluster of words that frequently occur together; topic models can connect words with similar meanings and distinguish between words with multiple meanings. In this paper, we cluster deep web databases employing several methods and then perform a comparative study. In the first method, we apply Latent Semantic Analysis (LSA) over the dataset. In the second method, we use a generative probabilistic model called Latent Dirichlet Allocation (LDA) for modeling content representative of deep web databases. Both techniques are implemented after preprocessing the set of web pages to extract page contents and form contents. Further, we propose another version of Latent Dirichlet Allocation (LDA) applied to the dataset. Experimental results show that the proposed method outperforms the existing clustering methods.

  13. Testing protein permeability of dialysis membranes using SDS-PAGE.

    Science.gov (United States)

    Mann, H; Melzer, H; Al-Bashir, A; Xu, X Q; Stiller, S

    2002-05-01

    Permeability of dialysis membranes for high-molecular-weight compounds should be similar to that of the glomerular membrane, in order to remove uremic toxins as the human kidney does. To evaluate the permeability of high-flux dialysis membranes, SDS-PAGE is applied to the filtrate of dialysers during routine dialysis with different membranes. SDS-PAGE analysis is performed with the silver-staining method according to the modification of Melzer (5) and subsequent laser densitometry. The protein pattern of filtrate from dialysis membranes is similar to that of the glomerular membrane, containing IgG, transferrin, albumin, alpha-1-microglobulin, retinol-binding protein and beta-2-microglobulin. Comparing different membranes, there are considerable differences depending on the cut-off, charge and adsorption capacity of the particular membrane. In all membranes tested, protein permeability decreases during one treatment session. The protein permeability of high-flux dialysis membranes is thus similar to that of the glomerular membrane, but modified according to pore size, surface charge, adsorption and time on dialysis. In contrast to the glomerular membrane, in each of the investigated membranes protein permeability decreases during function.

  14. Problems of long-term preservation of web pages

    Directory of Open Access Journals (Sweden)

    Mitja Dečman

    2011-01-01

    Full Text Available The World Wide Web is a distributed collection of web sites available on the Internet anywhere in the world. Its content is constantly changing: old data are continually replaced, causing the loss of huge amounts of information and, consequently, of scientific, cultural and other heritage; often, even legal certainty is unnoticeably called into question. How data on the web can be stored and preserved for the long term is a great challenge. Even though some good practices have been developed, the question of a final solution at the national level remains open. The paper presents the problems of long-term preservation of web pages from a technical and organizational point of view. It covers phases such as capturing and preserving web pages, focusing on good solutions, world practices, and the strategies different countries have developed in this area. The paper suggests some conceptual steps that should be defined in Slovenia to serve as a framework for all document creators in the web environment, thereby raising awareness in this field and mitigating the problems of everyone dealing with these issues today and in the future.

  15. Young children's ability to recognize advertisements in web page designs.

    Science.gov (United States)

    Ali, Moondore; Blades, Mark; Oates, Caroline; Blumberg, Fran

    2009-03-01

    Identifying what is, and what is not an advertisement is the first step in realizing that an advertisement is a marketing message. Children can distinguish television advertisements from programmes by about 5 years of age. Although previous researchers have investigated television advertising, little attention has been given to advertisements in other media, even though other media, especially the Internet, have become important channels of marketing to children. We showed children printed copies of invented web pages that included advertisements, half of which had price information, and asked the children to point to whatever they thought was an advertisement. In two experiments we tested a total of 401 children, aged 6, 8, 10 and 12 years of age, from the United Kingdom and Indonesia. Six-year-olds recognized a quarter of the advertisements, 8-year-olds recognized half the advertisements, and the 10- and 12-year-olds recognized about three-quarters. Only the 10- and 12-year-olds were more likely to identify an advertisement when it included a price. We contrast our findings with previous results about the identification of television advertising, and discuss why children were poorer at recognizing web page advertisements. The performance of the children has implications for theories about how children develop an understanding of advertising.

  16. Lifting Events in RDF from Interactions with Annotated Web Pages

    Science.gov (United States)

    Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad

    In this paper we present a method and an implementation for creating and processing semantic events from interaction with Web pages which opens possibilities to build event-driven applications for the (Semantic) Web. Events, simple or complex, are models for things that happen e.g., when a user interacts with a Web page. Events are consumed in some meaningful way e.g., for monitoring reasons or to trigger actions such as responses. In order for receiving parties to understand events e.g., comprehend what has led to an event, we propose a general event schema using RDFS. In this schema we cover the composition of complex events and event-to-event relationships. These events can then be used to route semantic information about an occurrence to different recipients helping in making the Semantic Web active. Additionally, we present an architecture for detecting and composing events in Web clients. For the contents of events we show a way of how they are enriched with semantic information about the context in which they occurred. The paper is presented in conjunction with the use case of Semantic Advertising, which extends traditional clickstream analysis by introducing semantic short-term profiling, enabling discovery of the current interest of a Web user and therefore supporting advertisement providers in responding with more relevant advertisements.

  17. Appraisals of Salient Visual Elements in Web Page Design

    Directory of Open Access Journals (Sweden)

    Johanna M. Silvennoinen

    2016-01-01

    Full Text Available Visual elements in user interfaces elicit emotions in users and are, therefore, essential to users interacting with different software. Although there is research on the relationship between emotional experience and visual user interface design, the focus has been on the overall visual impression and not on individual visual elements. Additionally, in a software development process, programming and general usability guidelines are often considered the most important parts of the process. Therefore, knowledge of programmers' appraisals of visual elements can be utilized to understand the web page designs we interact with. In this study, appraisal theory of emotion is utilized to elaborate the relationship of emotional experience and visual elements from the programmers' perspective. Participants (N=50) used 3E-templates to express their visual and emotional experiences of web page designs. Content analysis of the textual data illustrates how emotional experiences are elicited by salient visual elements. Eight hierarchical visual element categories were found and connected to various emotions, such as frustration, boredom, and calmness, via relational emotion themes. The emotional emphasis was on centered, symmetrical, and balanced composition, which was experienced as pleasant and calming. The results benefit user-centered visual interface design and researchers of visual aesthetics in human-computer interaction.

  18. MedlinePlus FAQ: Can you tell me how to cite MedlinePlus pages?

    Science.gov (United States)

    URL of this page: https://medlineplus.gov/faq/citation.html. Question: Can you tell me how to cite MedlinePlus pages? ... [updated 2005 Aug 12; cited 2005 Aug 11]. Available from: https://medlineplus.gov/. Health Topic page: begin by citing ...

  19. Measuring the Utilization of On-Page Search Engine Optimization in Selected Domain

    National Research Council Canada - National Science Library

    Goran Matošević

    2015-01-01

    Search engine optimization (SEO) techniques involve "on-page" and "off-page" actions taken by web developers and SEO specialists with the aim of increasing the ranking of web pages in search engine results pages (SERP...

  20. Proteomic study of muscle sarcoplasmic proteins using AUT-PAGE/SDS-PAGE as two-dimensional gel electrophoresis.

    Science.gov (United States)

    Picariello, Gianluca; De Martino, Alessandra; Mamone, Gianfranco; Ferranti, Pasquale; Addeo, Francesco; Faccia, Michele; Spagnamusso, Salvatore; Di Luccia, Aldo

    2006-03-20

    In the present study, an alternative procedure for two-dimensional (2D) electrophoretic analysis is suggested for proteomic investigation of the most represented basic, water-soluble muscle proteins. Our method consists of Acetic acid-Urea-Triton polyacrylamide gel (AUT-PAGE) analysis in the first dimension and standard sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE) in the second dimension. Although standard two-dimensional Immobilized pH Gradient-Sodium Dodecyl Sulphate (2D IPG-SDS) gel electrophoresis has been successfully used to study these proteins, most of the water-soluble proteins are spread over the alkaline part of the 2D map and are poorly focused; furthermore, the similarity in their molecular weights impairs the resolution of the classical approach. The addition of Triton X-100, a non-ionic detergent, to the gel induces a differential electrophoretic mobility of proteins as a result of the formation of mixed micelles between the detergent and the hydrophobic moieties of the polypeptides, separating basic proteins with a criterion similar to reversed-phase chromatography, based on their hydrophobicity. The acid pH induces positive net charges, increasing with the isoelectric point of the proteins, thus allowing enhanced resolution of the separation. By using the 2D AUT-PAGE/SDS electrophoresis approach to separate water-soluble proteins from fresh pork and from dry-cured products, we could spread proteins over a greater area, achieving greater resolution than that obtained by IPG in the pH ranges 3-10 and 6-11. Sarcoplasmic proteins undergoing proteolysis during the ripening of products were identified by Matrix-Assisted Laser Desorption/Ionization-Time of Flight (MALDI-ToF) mass spectrometry peptide mass fingerprinting in an easier and more effective way. Two-dimensional AUT-PAGE/SDS electrophoresis has simplified the separation of sarcoplasmic protein mixtures, making this technique suitable for defining the quality of dry-cured pork products by immediate

  1. An evaluation on the Web page navigation tools in university library Web sites In Turkey

    OpenAIRE

    Çakmak, Tolga

    2010-01-01

    Web technologies and web pages are primary tools for the dissemination of information all over the world today. Libraries are also using and adopting these technologies to reach their audiences. Effective use of these technologies is possible with user-centered design: web pages with user-centered design help users find information without getting lost in the page. As part of a web page, the navigation system has a vital role in this context. Effective usage of navigation s...

  2. Which Methodology Works Better? English Language Teachers' Awareness of the Innovative Language Learning Methodologies

    Science.gov (United States)

    Kurt, Mustafa

    2015-01-01

    The present study investigated whether English language teachers were aware of the innovative language learning methodologies in language learning, how they made use of these methodologies and the learners' reactions to them. The descriptive survey method was employed to disclose the frequencies and percentages of 175 English language teachers'…

  4. Using Facebook Page Insights Data to Determine Posting Best Practices in an Academic Health Sciences Library

    Science.gov (United States)

    Houk, Kathryn M.; Thornhill, Kate

    2013-01-01

    Tufts University Hirsh Health Sciences Library created a Facebook page and a corresponding managing committee in March 2010. Facebook Page Insights data collected from the library's Facebook page were statistically analyzed to investigate patterns of user engagement. The committee hoped to improve posting practices and increase user engagement…

  6. Investigation of Seed Storage Proteins in some Wild Wheat Progenitors Using SDS-PAGE and ACID-PAGE

    Directory of Open Access Journals (Sweden)

    Omid SOFALIAN

    2009-06-01

    Full Text Available Wheat storage proteins account for up to 60% of the total grain proteins. They form gluten proteins, which make up a visco-elastic network that enables dough to be processed into bread, pasta and other products. In order to study the genetic variation of wild wheat relatives, electrophoretic patterns of seed storage proteins, the high-molecular-weight glutenins and gliadins, from about 12 wild species and some improved check cultivars were fractionated by SDS-PAGE and Acid-PAGE. The results showed a close relationship between T. urartu, T. dicoccum and bread wheat in the case of glutenin and gliadin. It was therefore strongly speculated that T. urartu could be the progenitor of the A genome of cultivated wheat. A high level of polymorphism was detected in the glutenin and gliadin subunits of the wild wheat relatives, showing some similarities with cultivated bread wheat and offering useful breeding perspectives. Electrophoresis proved to be a suitable method to discriminate between wheat varieties and species. The results of this study also confirmed that the genetic variation among the seed storage proteins of the wild relatives is considerable. The wild progenitors are important genetic resources, and the observed genetic variability could therefore be used in selection strategies.

  7. A Survey on Semantic Focused Crawler For Mining Service Information

    OpenAIRE

    Thakor, Aneri; Singh, Dheeraj Kumar

    2015-01-01

    Focused crawlers play a very important role in the field of web mining, extracting and indexing the web pages that are most relevant to a predefined topic. However, heterogeneity, ubiquity and ambiguity are major issues in these web pages. Thus, various semantic focused crawlers are used to extract and annotate the retrieved web pages according to semantic web technology, in order to overcome these three issues. This paper is intended as a survey of semantic focused crawlers.

  8. Distributed Collections of Web Pages in the Wild

    CERN Document Server

    Bogen, Paul Logasa; Furuta, Richard

    2011-01-01

    As the Distributed Collection Manager's work on building tools to support users maintaining collections of changing web-based resources has progressed, questions about the characteristics of people's collections of web pages have arisen. Simultaneously, work in the areas of social bookmarking, social news, and subscription-based technologies has taken the existence, usage, and utility of this data for granted, with no investigation into either what people are doing with their collections or how they are trying to maintain them. In order to address these concerns, we performed an online user study of 125 individuals from a variety of online and offline communities, such as the reddit social news user community and the graduate student body in our department. From this study we were able to examine a user's needs for a system to manage their web-based distributed collections, how their current tools affect their ability to maintain their collections, and what the characteristics of their current practices ...

  9. Credibility judgments in web page design - a brief review.

    Science.gov (United States)

    Selejan, O; Muresanu, D F; Popa, L; Muresanu-Oloeriu, I; Iudean, D; Buzoianu, A; Suciu, S

    2016-01-01

    Today, more than ever, it is accepted that the analysis of interface appearance is a crucial point in the field of human-computer interaction. As nowadays virtually anyone can publish information on the web, the role of credibility has grown increasingly important in relation to web-based content. Areas like trust, credibility, and behavior, doubled by overall impression and user expectation, are today in the spotlight of research, compared to earlier periods when more pragmatic areas such as usability and utility were considered. Credibility has been discussed as a theoretical construct in the field of communication in past decades, and this research revealed that people tend to evaluate the credibility of communication primarily by the communicator's expertise. Other factors involved in the content communication process are trustworthiness and dynamism, as well as various other criteria, but to a lower extent. In this brief review, factors like web page aesthetics, browsing experience and user experience are considered.

  10. Group Recommendation Based on the PageRank

    Directory of Open Access Journals (Sweden)

    Jing Wang

    2012-12-01

    Full Text Available Social networks have greatly improved social recommendation applications, especially the study of group recommendation. Group recommendation analyzes the social factors of the group, such as the social and trust relationships between users, as factors for the prediction model. In this paper, the PageRank algorithm is introduced into the recommendation method to calculate each member's importance in the group and to amend the aggregation function of individual preferences. The aggregation function considers the relationships between the users in the group and is optimized according to each user's different influence on the group, which better reflects the social characteristics of the group. In short, the study of group recommendation models and algorithms can proactively discover users' needs. Extensive experiments demonstrate the effectiveness and efficiency of the methods, which improve the prediction accuracy of group recommendation algorithms.
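
    The abstract names the ingredients but not the formulas. A minimal sketch of the general idea, weighting each member's ratings by a PageRank score computed on the group's trust graph, might look like the following; the trust graph, ratings, and parameter values are illustrative, not the paper's:

```python
def pagerank(adj, damping=0.85, iters=100):
    """Power-iteration PageRank over an adjacency dict {node: [out-neighbors]}."""
    nodes = list(adj)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1 - damping) / n for u in nodes}
        for u, outs in adj.items():
            if outs:
                share = damping * rank[u] / len(outs)
                for v in outs:
                    new[v] += share
            else:  # dangling node: spread its rank uniformly
                for v in nodes:
                    new[v] += damping * rank[u] / n
        rank = new
    return rank

def group_prediction(ratings, adj):
    """Aggregate member ratings per item, weighted by each member's PageRank."""
    weights = pagerank(adj)
    preds = {}
    for item in {i for r in ratings.values() for i in r}:
        raters = [u for u in ratings if item in ratings[u]]
        total = sum(weights[u] for u in raters)
        preds[item] = sum(weights[u] * ratings[u][item] for u in raters) / total
    return preds

# Toy group: "ann" is trusted by both others, so her preferences weigh more.
trust = {"ann": ["bob"], "bob": ["ann"], "cai": ["ann"]}
ratings = {"ann": {"m1": 5, "m2": 2},
           "bob": {"m1": 3, "m2": 4},
           "cai": {"m1": 1, "m2": 5}}
print(group_prediction(ratings, trust))
```

    Because "ann" holds the highest PageRank, the group's predicted rating for "m1" is pulled above the plain average of the three members' scores.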

  11. Semantic Web Techniques for Yellow Page Service Providers

    Directory of Open Access Journals (Sweden)

    Raghu Anantharangachar

    2012-08-01

    Full Text Available Applications providing “yellow pages information” for use over the web should ideally be based on structured information. Use of web pages providing unstructured information poses variety of problems to the user, such as use of arbitrary formats, unsuitability for machine processing and likely incompleteness of information. Structured data alleviates these problems but we require more. Capturing the semantics of a domain in the form of an ontology is necessary to ensure that unforeseen application can easily be created at a later date. Very often yellow page systems are implemented using a centralized database. In some cases, human intermediaries accessible over the phone network examine a centralized database and use their reasoning ability to deal with the user’s need for information. Centralized operation and considerable central administration make these systems expensive to operate. Scaling up such systems is difficult. They behave like isolated systems and it is common for such systems to be highly domain specific, for instance systems dealing with accommodation and travel. This paper explores an alternative – a highly distributed system design meeting a variety of needs – considerably reducing efforts required at a central organization, enabling large numbers of vendors to enter information about their own products and services, enablingend-users to contribute information such as their own ratings, using an ontology to describe each domain of application in a flexible manner for uses foreseen and unforeseen, enabling distributed search and mashups, use of vendor independent standards, using reasoning to find the best matches to a given query, geospatial reasoning and a simple, interactive, mobile application/interface. We view this design as one in which vendors and end-users do the bulk of the work in building large distributed collections of information in a Web 2.0 style. We give importance to geo-spatial information and

  12. Hawking–Page phase transition in new massive gravity

    Directory of Open Access Journals (Sweden)

    Shao-Jun Zhang

    2015-07-01

    Full Text Available We consider Hawking–Page phase transition between the BTZ black hole with M≥0 and the thermal soliton with M=−1 in new massive gravity. By comparing the on-shell free energies, we can see that there exists a critical temperature. The thermal soliton is more probable than the black hole below the critical temperature while the black hole is more probable than the thermal soliton above the critical temperature. By consistently constructing the off-shell free energies taking into account the conical singularity, we show that there exist infinite non-equilibrium states connecting the BTZ black hole and the thermal soliton, so that they provide a picture of continuous evolution of the phase transition.

  13. Improving PageRank for Local Community Detection

    CERN Document Server

    Hollocou, Alexandre; Bonald, Thomas

    2016-01-01

    Community detection is a classical problem in the field of graph mining. While most algorithms work on the entire graph, it is often interesting in practice to recover only the community containing some given set of seed nodes. In this paper, we propose a novel approach to this problem, using some low-dimensional embedding of the graph based on random walks starting from the seed nodes. From this embedding, we propose some simple yet efficient versions of the PageRank algorithm as well as a novel algorithm, called WalkSCAN, that is able to detect multiple communities, possibly overlapping. We provide insights into the performance of these algorithms through the theoretical analysis of a toy network and show that WalkSCAN outperforms existing algorithms on real networks.
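
    WalkSCAN itself is not specified in this abstract, but the seeded-PageRank ingredient it builds on can be sketched briefly: a personalized PageRank whose teleportation is restricted to the seed nodes, with the highest-scoring nodes taken as the local community. The graph and parameter values below are illustrative:

```python
def personalized_pagerank(adj, seeds, damping=0.85, iters=100):
    """Power-iteration PageRank whose teleport vector is concentrated on the seeds."""
    nodes = list(adj)
    teleport = {u: (1.0 / len(seeds) if u in seeds else 0.0) for u in nodes}
    rank = dict(teleport)
    for _ in range(iters):
        new = {u: (1 - damping) * teleport[u] for u in nodes}
        for u, outs in adj.items():
            if outs:
                share = damping * rank[u] / len(outs)
                for v in outs:
                    new[v] += share
            else:
                new[u] += damping * rank[u]  # dangling node keeps its mass
        rank = new
    return rank

def local_community(adj, seeds, size=3):
    """Recover the community around the seeds: the `size` highest-scoring nodes."""
    scores = personalized_pagerank(adj, set(seeds))
    return sorted(scores, key=scores.get, reverse=True)[:size]

# Two triangles joined by one edge; seeding at "a" should recover
# the left triangle {a, b, c}.
g = {
    "a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"],
    "d": ["c", "e", "f"], "e": ["d", "f"], "f": ["d", "e"],
}
print(local_community(g, ["a"]))
```

    Thresholding the scores (rather than taking a fixed `size`) is an equally common variant; the paper's actual detection step may differ.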

  14. Collecting responses through Web page drag and drop.

    Science.gov (United States)

    Britt, M Anne; Gabrys, Gareth

    2004-02-01

    This article describes how to collect responses from experimental participants using drag and drop on a Web page. In particular, we describe how drag and drop can be used in a text search task in which participants read a text and then locate and categorize certain elements of the text (e.g., to identify the main claim of a persuasive paragraph). Using this technique, participants respond by clicking on a text segment and dragging it to a screen field or icon. We have successfully used this technique in both the argument element identification experiment that we describe here and a tutoring system that we created to teach students to identify source characteristics while reading historical texts (Britt, Perfetti, Van Dyke, & Gabrys, 2000). The implementation described here exploits the capability of recent versions of Microsoft's Internet Explorer Web browser to handle embedded XML documents and drag and drop events.

  15. Learning Hierarchical User Interest Models from Web Pages

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    We propose an algorithm for learning hierarchical user interest models from the Web pages users have browsed. In this algorithm, a user's interests are represented as a tree, called a user interest tree, whose content and structure can both change to adapt to changes in the user's interests. This representation expresses a user's specific and general interests as a continuum. In some sense, specific interests correspond to short-term interests, while general interests correspond to long-term interests, so this representation more realistically reflects a user's interests. The algorithm can automatically model a user's multiple interest domains, dynamically generate the interest models, and prune a user interest tree when the number of nodes in it exceeds a given value. Finally, we show experimental results from a Chinese Web site.
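
    The operations the abstract describes (weighted nodes, growth from browsed pages, and pruning at a node limit) can be sketched in a few lines. Everything below — the node fields, the general-to-specific topic paths, and the prune rule — is an illustrative guess at one possible realization, not the paper's algorithm:

```python
from dataclasses import dataclass, field

@dataclass
class InterestNode:
    """One interest; children refine it into more specific sub-interests."""
    topic: str
    weight: float = 0.0
    children: dict = field(default_factory=dict)

    def observe(self, path, gain=1.0):
        """Record a browsed page labelled with a general-to-specific topic path."""
        self.weight += gain
        if path:
            head, *rest = path
            child = self.children.setdefault(head, InterestNode(head))
            child.observe(rest, gain)

    def count(self):
        return 1 + sum(c.count() for c in self.children.values())

    def leaves(self):
        """Yield (parent, leaf) pairs for every leaf below this node."""
        for c in self.children.values():
            if c.children:
                yield from c.leaves()
            else:
                yield self, c

    def prune(self, max_nodes):
        """Drop the lightest (least-visited) leaves until the tree fits."""
        while self.count() > max_nodes and self.children:
            parent, leaf = min(self.leaves(), key=lambda pl: pl[1].weight)
            del parent.children[leaf.topic]

root = InterestNode("interests")
for path in [["sports", "tennis"], ["sports", "tennis"], ["music", "jazz"]]:
    root.observe(path)
root.prune(max_nodes=4)  # forces out the weakest leaf ("jazz")
```

    Deep nodes stand in for short-term, specific interests and shallow nodes for long-term, general ones, matching the continuum the abstract describes.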

  16. Standards opportunities around data-bearing Web pages.

    Science.gov (United States)

    Karger, David

    2013-03-28

    The evolving Web has seen ever-growing use of structured data, thanks to the way it enhances information authoring, querying, visualization and sharing. To date, however, most structured data authoring and management tools have been oriented towards programmers and Web developers. End users have been left behind, unable to leverage structured data for information management and communication as well as professionals. In this paper, I will argue that many of the benefits of structured data management can be provided to end users as well. I will describe an approach and tools that allow end users to define their own schemas (without knowing what a schema is), manage data and author (not program) interactive Web visualizations of that data using the Web tools with which they are already familiar, such as plain Web pages, blogs, wikis and WYSIWYG document editors. I will describe our experience deploying these tools and some lessons relevant to their future evolution.

  17. Pages from the Past: Part I [In Bulgarian

    Directory of Open Access Journals (Sweden)

    F. Stefanova Strigacheva

    2016-07-01

    Full Text Available Pages from the Past are recollections written in the 1970s by Ferdinanda - Venka Stefanova Strigacheva (1897-1976). A large period of time after 1910 is covered. She was born in the village of Targovishte, Belogradchik region. She graduated from the Secondary School for Girls in Vidin and then became a teacher. Later she studied agronomy in Vienna and Berlin. Ferdinanda Strigacheva was an active and influential communist. However, when the communists came to power in Bulgaria in the autumn of 1944, she withdrew from politics, disappointed by the actions of the Bulgarian Communist Party; nevertheless, Strigacheva kept her loyalty to the Party and her belief in communism. Strigacheva's son Dr. Atanas Strigachev and his son Dr. Anton Strigachev prepared the text for publication, mainly by modernizing the spelling.

  18. Learning Layouts for Single-Page Graphic Designs.

    Science.gov (United States)

    O'Donovan, Peter; Agarwala, Aseem; Hertzmann, Aaron

    2014-08-01

    This paper presents an approach for automatically creating graphic design layouts using a new energy-based model derived from design principles. The model includes several new algorithms for analyzing graphic designs, including the prediction of perceived importance, alignment detection, and hierarchical segmentation. Given the model, we use optimization to synthesize new layouts for a variety of single-page graphic designs. Model parameters are learned with Nonlinear Inverse Optimization (NIO) from a small number of example layouts. To demonstrate our approach, we show results for applications including generating design layouts in various styles, retargeting designs to new sizes, and improving existing designs. We also compare our automatic results with designs created using crowdsourcing and show that our approach performs slightly better than novice designers.
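
    The abstract does not give the paper's energy terms or the NIO learning step. A toy sketch of the energy-minimization idea — scoring a layout with hypothetical alignment and overlap terms and improving it by greedy random search — might look like this; the terms, weights, and search strategy are illustrative only:

```python
import random

def energy(boxes):
    """Lower is better. A box is (x, y, w, h); the two penalty terms here
    (left-edge misalignment and pairwise overlap) are illustrative, not the paper's."""
    e = 0.0
    for i, (x1, y1, w1, h1) in enumerate(boxes):
        for x2, y2, w2, h2 in boxes[i + 1:]:
            e += abs(x1 - x2)  # reward left-edge alignment
            ox = max(0.0, min(x1 + w1, x2 + w2) - max(x1, x2))
            oy = max(0.0, min(y1 + h1, y2 + h2) - max(y1, y2))
            e += 10 * ox * oy  # strongly penalize overlap area
    return e

def improve(boxes, steps=2000, rng=random.Random(0)):
    """Greedy random search: nudge a random box, keep the move if energy drops."""
    best = list(boxes)
    for _ in range(steps):
        cand = list(best)
        i = rng.randrange(len(cand))
        x, y, w, h = cand[i]
        cand[i] = (x + rng.uniform(-5, 5), y + rng.uniform(-5, 5), w, h)
        if energy(cand) < energy(best):
            best = cand
    return best

layout = [(0, 0, 50, 20), (30, 10, 50, 20)]  # overlapping and misaligned
print(energy(improve(layout)) < energy(layout))
```

    In the paper the relative weights of such terms are what NIO learns from example layouts; here they are fixed by hand.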

  19. PAGES-Powell North America 2k database

    Science.gov (United States)

    McKay, N.

    2014-12-01

    Syntheses of paleoclimate data in North America are essential for understanding long-term spatiotemporal variability in climate and for properly assessing risk on decadal and longer timescales. Existing reconstructions of the past 2,000 years rely almost exclusively on tree-ring records, which can underestimate low-frequency variability and rarely extend beyond the last millennium. Meanwhile, many records from the full spectrum of paleoclimate archives are available and hold the potential of enhancing our understanding of past climate across North America over the past 2000 years. The second phase of the Past Global Changes (PAGES) North America 2k project began in 2014, with a primary goal of assembling these disparate paleoclimate records into a unified database. This effort is currently supported by the USGS Powell Center together with PAGES. Its success requires grassroots support from the community of researchers developing and interpreting paleoclimatic evidence relevant to the past 2000 years. Most likely, fewer than half of the published records appropriate for this database are publicly archived, and far fewer include the data needed to quantify geochronologic uncertainty, or to concisely describe how best to interpret the data in the context of a large-scale paleoclimatic synthesis. The current version of the database includes records that (1) have been published in a peer-reviewed journal (including evidence of the record's relationship to climate) and (2) cover a substantial portion of the past 2000 yr (>300 yr for annual records, >500 yr for lower-frequency records) at relatively high resolution. An ongoing effort is to identify and assimilate relevant records that have not yet been included. The goal is to develop a comprehensive and open-access resource that will serve the diverse community interested in the climate of the Common Era in North America.

  20. CT paging arteriography with a multidetector-row CT. Advantages in splanchnic arterial imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, Seiji [Keio Univ., Tokyo (Japan). School of Medicine

    1999-11-01

    The purpose of this study is to assess the utility of CT paging arteriography with a multidetector-row CT as a replacement for conventional angiography in the evaluation of splanchnic arterial anomalies. Sixty-three patients underwent CT paging arteriography with a multidetector-row CT. In the 56 patients with conventional angiographic correlation, there was only one minor disagreement with CT paging arteriography. In the 7 patients who underwent IVDSA (intravenous digital subtraction angiography), CT paging arteriography defined four hepatic arterial anomalies which could not be depicted by IVDSA. In conclusion, CT paging arteriography provides a noninvasive means of identifying splanchnic arterial anomalies. (author)